
Belief revision and the EM algorithm. (English) Zbl 1455.68204

Carvalho, Joao Paulo (ed.) et al., Information processing and management of uncertainty in knowledge-based systems. 16th international conference, IPMU 2016, Eindhoven, The Netherlands, June 20–24, 2016. Proceedings. Part II. Cham: Springer. Commun. Comput. Inf. Sci. 611, 279-290 (2016).
Summary: This paper provides a natural interpretation of the EM algorithm as a succession of revision steps that try to find a probability distribution, within a parametric family of models, in agreement with frequentist observations over a partition of a domain. Each step of the algorithm corresponds to a revision operation that respects a form of minimal change. In particular, the so-called expectation step actually applies Jeffrey’s revision rule to the current best parametric model so as to respect the frequencies in the available data. We also indicate that, in the presence of incomplete data, one must be careful in the definition of the likelihood function in the maximization step, which may differ according to whether one is interested in a precise modeling of the underlying random phenomenon together with the imperfect observation process, or in modeling the underlying random phenomenon alone, despite the imprecision of the data.
For the entire collection see [Zbl 1385.68004].
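
As an illustration of the interpretation described above, the following Python sketch runs EM on a toy problem: the domain is the number of heads in two tosses of a coin with unknown bias, and the data only report frequencies over the coarse partition {0 heads} / {at least one head}. The expectation step is written explicitly as Jeffrey's revision rule, p'(x) = f(A_i) p_theta(x) / p_theta(A_i) for x in A_i, and the maximization step refits the parameter against the revised distribution. The toy model, the observed frequencies, and the names (binom2, e_step, m_step, FREQ) are assumptions chosen for illustration, not material taken from the paper.

# Toy EM cycle read as a Jeffrey revision step followed by a maximization step.
# Domain X = {0, 1, 2}: number of heads in two tosses of a coin with bias theta.
# Observations only report frequencies over the partition A1 = {0}, A2 = {1, 2}.

def binom2(theta):
    # Parametric family p_theta over the domain {0, 1, 2}.
    return [(1 - theta) ** 2, 2 * theta * (1 - theta), theta ** 2]

PARTITION = [[0], [1, 2]]   # partition of the domain (illustrative)
FREQ = [0.4, 0.6]           # observed frequencies over the partition (illustrative)

def e_step(theta):
    # Jeffrey's rule: revise p_theta so that the revised distribution reproduces
    # the observed frequencies on each partition cell while keeping the conditional
    # probabilities inside each cell unchanged (minimal change).
    p = binom2(theta)
    revised = [0.0] * 3
    for cell, freq in zip(PARTITION, FREQ):
        mass = sum(p[x] for x in cell)
        for x in cell:
            revised[x] = freq * p[x] / mass
    return revised

def m_step(revised):
    # Maximum likelihood for Binomial(2, theta) with fractional counts:
    # theta = expected number of heads / 2 under the revised distribution.
    return sum(x * revised[x] for x in range(3)) / 2.0

theta = 0.5
for _ in range(50):
    theta = m_step(e_step(theta))
print(round(theta, 4))   # converges to 1 - sqrt(0.4), the MLE of the coarsened data

Each pass through e_step followed by m_step is one EM iteration; the fixed point reached here maximizes the likelihood of the coarse (partition-level) observations under the parametric family.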

MSC:

68T35 Theory of languages and software systems (knowledge-based systems, expert systems, etc.) for artificial intelligence
68T37 Reasoning under uncertainty in the context of artificial intelligence

References:

[1] Domotor, Z.: Probability kinematics - conditional and entropy principles. Synthese 63, 74-115 (1985) · doi:10.1007/BF00485956
[2] Couso, I., Dubois, D.: Statistical reasoning with set-valued information: ontic vs. epistemic views. Int. J. Approximate Reasoning 55(7), 1502-1518 (2014) · Zbl 1407.62032 · doi:10.1016/j.ijar.2013.07.002
[3] Dempster, A.P.: Upper and lower probabilities induced by a multivalued mapping. Ann. Math. Stat. 38, 325-339 (1967) · Zbl 0168.17501 · doi:10.1214/aoms/1177698950
[4] Dempster, A.P., Laird, N.M., Rubin, D.B.: Maximum likelihood from incomplete data via the EM algorithm (with discussion). J. Roy. Stat. Soc. B 39, 1-38 (1977) · Zbl 0364.62022
[5] Do, C.B., Batzoglou, S.: What is the expectation maximization algorithm? Nat. Biotechnol. 26(8), 897-899 (2009) · doi:10.1038/nbt1406
[6] Edwards, A.W.F.: Likelihood. Cambridge University Press, Cambridge (1972) · Zbl 0578.62001
[7] Guillaume, R., Dubois, D.: Robust parameter estimation of density functions under fuzzy interval observations. In: 9th ISIPTA Symposium, Pescara, Italy, pp. 147-156 (2015)
[8] Haas, S.E.: The Expectation-Maximization and Alternating Minimization Algorithms. MIT, Cambridge (2002)
[9] Hüllermeier, E.: Learning from imprecise and fuzzy observations: data disambiguation through generalized loss minimization. Int. J. Approximate Reasoning 55(7), 1519-1534 (2014) · Zbl 1407.68396 · doi:10.1016/j.ijar.2013.09.003
[10] Hüllermeier, E., Cheng, W.: Superset learning based on generalized loss minimization. In: Appice, A., Rodrigues, P.P., Santos Costa, V., Gama, J., Jorge, A., Soares, C. (eds.) ECML PKDD 2015. LNCS, vol. 9285, pp. 260-275. Springer, Heidelberg (2015) · doi:10.1007/978-3-319-23525-7_16
[11] Jeffrey, R.C.: The Logic of Decision. McGraw-Hill, New York (1965); 2nd edn., University of Chicago Press, Chicago (1983); paperback edition (1990)
[12] McLachlan, G., Krishnan, T.: The EM Algorithm and Extensions. Wiley, New York (2007) · Zbl 1165.62019
[13] Russell, S.: The EM algorithm. Lecture notes, CS 281, Computer Science Department, University of California, Berkeley (1998)
[14] Schafer, J.L.: Multiple imputation: a primer. Stat. Methods Med. Res. 8, 3-15 (1999) · doi:10.1191/096228099671525676
[15] Ting, A., Lin, H.: Comparison of multiple imputation with EM algorithm and MCMC method for quality of life missing data. Qual. Quant. 44, 277-287 (2010) · doi:10.1007/s11135-008-9196-5