
Bayesian network models for incomplete and dynamic data. (English) Zbl 1541.62070

Summary: Bayesian networks are a versatile and powerful tool to model complex phenomena and the interplay of their components in a probabilistically principled way. Moving beyond the comparatively simple case of completely observed, static data, which has received the most attention in the literature, in this paper we review how Bayesian networks can model dynamic data and data with incomplete observations. Such data are the norm at the forefront of research and in practical applications, and Bayesian networks are uniquely positioned to model them due to their explainability and interpretability.
{© 2020 The Authors. Statistica Neerlandica © 2020 VVS.}
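As one concrete illustration of the incomplete-data setting discussed in the summary, the following is a minimal R sketch (not code from the paper) of learning a discrete Bayesian network from data with missing values via the Structural EM approach, as implemented by structural.em() in the bnlearn package listed under Software below. The learning.test data set ships with bnlearn; the missingness pattern, the random seed, and the choice of variable A are illustrative assumptions.

library(bnlearn)

data(learning.test)                       # small discrete data set shipped with bnlearn
incomplete <- learning.test
set.seed(42)
holes <- sample(nrow(incomplete), 1000)   # knock out 20% of column A completely at random (illustrative)
incomplete[holes, "A"] <- NA

# Structural EM alternates between imputing the missing entries under the
# current network and re-learning structure and parameters from the completed data.
dag <- structural.em(incomplete)
modelstring(dag)                          # inspect the learned structure

A complete-data baseline such as hc(learning.test) can be learned alongside it to gauge how much the missing values distort the recovered structure.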

MSC:

62F15 Bayesian inference
68T35 Theory of languages and software systems (knowledge-based systems, expert systems, etc.) for artificial intelligence

Software:

bnstruct; GitHub; bnlearn
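For the dynamic-data setting, the sketch below shows one common way of casting a first-order dynamic Bayesian network as a two-time-slice network that a static structure learner such as bnlearn's hc() can handle: each variable appears as a lagged (.t0) and a current (.t1) copy, and arcs pointing backwards in time are ruled out with a tier-based blacklist. The simulated AR(1)-driven series, the coefficients, and the variable names are illustrative assumptions rather than material from the paper.

library(bnlearn)

set.seed(1)
steps <- 501
x <- as.numeric(arima.sim(list(ar = 0.7), n = steps))   # autoregressive driver
y <- numeric(steps)
y[1] <- rnorm(1)
for (t in 2:steps)                                      # y depends on the past of x and y
  y[t] <- 0.5 * x[t - 1] + 0.3 * y[t - 1] + rnorm(1)

# Stack consecutive observations as (t0, t1) pairs, one row per transition.
slices <- data.frame(x.t0 = x[-steps], y.t0 = y[-steps],
                     x.t1 = x[-1],     y.t1 = y[-1])

# Blacklist all arcs from the current slice back to the lagged slice, so that
# learned arcs can only point forward in time (or stay within a slice).
bl <- tiers2blacklist(list(c("x.t0", "y.t0"), c("x.t1", "y.t1")))
dbn <- hc(slices, blacklist = bl)
arcs(dbn)                                               # expected arcs include x.t0 -> x.t1, x.t0 -> y.t1, y.t0 -> y.t1

The same construction extends to higher-order dependence by adding further lagged tiers, which is how dynamic Bayesian networks with a fixed transition structure are often learned with static structure-learning software.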
