
Likelihood-free inference in state-space models with unknown dynamics. (English) Zbl 1529.62010

Summary: Likelihood-free inference (LFI) has been successfully applied to state-space models, where the likelihood of observations is unavailable but synthetic observations generated by a black-box simulator can be used for inference instead. However, much of the research to date has been restricted to cases in which a model of the state transition dynamics can be formulated in advance and the simulation budget is unrestricted. These methods fail to address the problem of state inference when simulations are computationally expensive and the Markovian state transition dynamics are undefined. The approach proposed in this manuscript enables LFI of states with a limited number of simulations by estimating the transition dynamics and using state predictions as proposals for simulations. In experiments with non-stationary user models, the proposed method, which uses a multi-output Gaussian process for LFI of states and a Bayesian neural network as a surrogate model of the transition dynamics, demonstrates significant improvements in accuracy for both state inference and prediction.

MSC:

62-08 Computational methods for problems pertaining to statistics

References:

[1] Alpaydin, E.; Kaynak, C., Cascading classifiers, Kybernetika, 34, 4, 369-374 (1998) · Zbl 1274.68284
[2] Alsing, J.; Wandelt, B.; Feeney, S., Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology, Mon. Not. R. Astron. Soc., 477, 3, 2874-2885 (2018) · doi:10.1093/mnras/sty819
[3] Alvarez, MA; Lawrence, ND, Computationally efficient convolved multiple output Gaussian processes, J. Mach. Learn. Res., 12, 1459-1500 (2011) · Zbl 1280.68153
[4] Anderson, B.D., Moore, J.B.: Optimal filtering. Courier Corporation (2012)
[5] Andrei, N., Scaled conjugate gradient algorithms for unconstrained optimization, Comput. Optim. Appl., 38, 3, 401-416 (2007) · Zbl 1168.90608 · doi:10.1007/s10589-007-9055-7
[6] Aushev, A., Pesonen, H., Heinonen, M., Corander, J., Kaski, S.: Likelihood-free inference with deep Gaussian processes. arXiv:2006.10571 (2020)
[7] Balandat, M., Karrer, B., Jiang, D. R., Daulton, S., Letham, B., Wilson, A. G., Bakshy, E.: BoTorch: a framework for efficient Monte-Carlo Bayesian optimization. Adv. Neural Inf. Process. Syst. 33 (2020)
[8] Barndorff-Nielsen, OE; Shephard, N., Econometric analysis of realized volatility and its use in estimating stochastic volatility models, J. R. Stat. Soc. Ser. B, 64, 2, 253-280 (2002) · Zbl 1059.62107 · doi:10.1111/1467-9868.00336
[9] Barthelmé, S.; Chopin, N., Expectation propagation for likelihood-free inference, J. Am. Stat. Assoc., 109, 505, 315-333 (2014) · Zbl 1367.62063 · doi:10.1080/01621459.2013.864178
[10] Beaumont, MA, Approximate Bayesian computation in evolution and ecology, Annu. Rev. Ecol. Evol. Syst., 41, 379-406 (2010) · doi:10.1146/annurev-ecolsys-102209-144621
[12] Beaumont, MA; Zhang, W.; Balding, DJ, Approximate Bayesian computation in population genetics, Genetics, 162, 4, 2025-2035 (2002) · doi:10.1093/genetics/162.4.2025
[13] Bertorelle, G.; Benazzo, A.; Mona, S., ABC as a flexible framework to estimate demography over space and time: some cons, many pros, Mol. Ecol., 19, 13, 2609-2625 (2010) · doi:10.1111/j.1365-294X.2010.04690.x
[14] Blundell, C., Cornebise, J., Kavukcuoglu, K., Wierstra, D.: Weight uncertainty in neural network. In: International Conference on Machine Learning, pp. 1613-1622. PMLR (2015)
[15] Bonatti, C.; Mohr, D., One for all: Universal material model based on minimal state-space neural networks, Sci. Adv., 7, 26, eabf3658 (2021) · doi:10.1126/sciadv.abf3658
[16] Boser, B.E., Guyon, I.M., Vapnik, V.N.: A training algorithm for optimal margin classifiers. In: Proceedings of the Fifth Annual Workshop on Computational Learning Theory, pp. 144-152 (1992)
[17] Brochu, E., Cora, V.M., De Freitas, N.: A tutorial on Bayesian optimization of expensive cost functions, with application to active user modeling and hierarchical reinforcement learning. arXiv:1012.2599 (2010)
[18] Brockman, G., Cheung, V., Pettersson, L., Schneider, J., Schulman, J., Tang, J., Zaremba, W.: OpenAI Gym (2016)
[19] Brockwell, PJ; Davis, RA, Time Series: Theory and Methods (2009), New York: Springer Science & Business Media, New York
[20] Byrd, RH; Lu, P.; Nocedal, J.; Zhu, C., A limited memory algorithm for bound constrained optimization, SIAM J. Sci. Comput., 16, 5, 1190-1208 (1995) · Zbl 0836.65080 · doi:10.1137/0916069
[21] Caflisch, RE, Monte Carlo and quasi-Monte Carlo methods, Acta Numer., 7, 1-49 (1998) · Zbl 0949.65003
[22] Calvet, LE; Czellar, V., Accurate methods for approximate Bayesian computation filtering, J. Financ. Economet., 13, 4, 798-838 (2015) · doi:10.1093/jjfinec/nbu019
[23] Campello, R.J., Moulavi, D., Sander, J.: Density-based clustering based on hierarchical density estimates. In: Pacific-Asia Conference on Knowledge Discovery and Data Mining, pp. 160-172. Springer (2013)
[24] Chen, X., Acharya, A., Oulasvirta, A.: An adaptive model of gaze-based selection. In: CHI Conference on Human Factors in Computing Systems (CHI’21). Association for Computing Machinery (2021)
[25] Cortes, C.; Vapnik, V., Support-vector networks, Mach. Learn., 20, 3, 273-297 (1995) · Zbl 0831.68098 · doi:10.1007/BF00994018
[26] Cranmer, K.; Brehmer, J.; Louppe, G., The frontier of simulation-based inference, Proc. Natl. Acad. Sci., 117, 48, 30055-30062 (2020) · Zbl 1485.62004 · doi:10.1073/pnas.1912789117
[27] Csilléry, K.; Blum, MG; Gaggiotti, OE; François, O., Approximate bayesian computation (abc) in practice, Trends Ecol. Evolut., 25, 7, 410-418 (2010) · doi:10.1016/j.tree.2010.04.001
[28] Daulton, S., Balandat, M., Bakshy, E.: Differentiable expected hypervolume improvement for parallel multi-objective Bayesian optimization. arXiv:2006.05078 (2020)
[29] Dean, TA; Singh, SS; Jasra, A.; Peters, GW, Parameter estimation for hidden Markov models with intractable likelihoods, Scand. J. Stat., 41, 4, 970-987 (2014) · Zbl 1305.62303 · doi:10.1111/sjos.12077
[30] Doerr, A., Daniel, C., Schiegg, M., Nguyen-Tuong, D., Schaal, S., Toussaint, M., Trimpe, S.: Probabilistic recurrent state-space models. arXiv:1801.10395 (2018)
[31] Doucet, A.; De Freitas, N.; Gordon, NJ, Sequential Monte Carlo Methods in Practice (2001), New York: Springer, New York · Zbl 0967.00022 · doi:10.1007/978-1-4757-3437-9
[32] Durkan, C., Murray, I., Papamakarios, G.: On contrastive learning for likelihood-free inference. In: International Conference on Machine Learning, pp. 2771-2781. PMLR (2020)
[33] Errico, RM; Yang, R.; Privé, NC; Tai, K-S; Todling, R.; Sienkiewicz, ME; Guo, J., Development and validation of observing-system simulation experiments at NASA’s global modeling and assimilation office, Q. J. R. Meteorol. Soc., 139, 674, 1162-1178 (2013) · doi:10.1002/qj.2027
[34] Esposito, P.: BLiTZ - Bayesian layers in Torch zoo (a Bayesian deep learning library for torch). https://github.com/piEsposito/blitz-bayesian-deep-learning/ (2020)
[35] Fanshawe, TR; Diggle, PJ, Bivariate geostatistical modelling: a review and an application to spatial variation in radon concentrations, Environ. Ecol. Stat., 19, 2, 139-160 (2012) · doi:10.1007/s10651-011-0179-7
[36] Fengler, A.; Govindarajan, LN; Chen, T.; Frank, MJ, Likelihood approximation networks (LANs) for fast inference of simulation models in cognitive neuroscience, eLife, 10 (2021) · doi:10.7554/eLife.65074
[37] Fiske, S.T., Taylor, S.E.: Social Cognition: From Brains to Culture. Sage (2013)
[38] Frigola, R.; Chen, Y.; Rasmussen, CE, Variational Gaussian process state-space models, Adv. Neural. Inf. Process. Syst., 27, 3680-3688 (2014)
[39] Futrell, R.; Gibson, E.; Levy, RP, Lossy-context surprisal: An information-theoretic model of memory effects in sentence processing, Cogn. Sci., 44, 3 (2020) · doi:10.1111/cogs.12814
[40] Georgiou, T.; Demiris, Y., Adaptive user modelling in car racing games using behavioural and physiological data, User Model. User-Adap. Inter., 27, 2, 267-311 (2017) · doi:10.1007/s11257-017-9192-3
[41] Ghassemi, M.; Wu, M.; Hughes, MC; Szolovits, P.; Doshi-Velez, F., Predicting intervention onset in the ICU with switching state-space models, AMIA Summits Transl. Sci. Proc., 2017, 82 (2017)
[42] Gimenez, O.; Rossi, V.; Choquet, R.; Dehais, C.; Doris, B.; Varella, H.; Vila, J-P; Pradel, R., State-space modelling of data on marked individuals, Ecol. Model., 206, 3-4, 431-438 (2007) · doi:10.1016/j.ecolmodel.2007.03.040
[43] Goncalves, P., Lueckmann, J.-M., Bassetto, G., Oecal, K., Nonnenmacher, M., Macke, J. H.: Flexible statistical inference for mechanistic models of neural dynamics. In: Bonn Brain 3 Conference 2018, Bonn, Germany (2018)
[44] GPy: A Gaussian process framework in Python. http://github.com/SheffieldML/GPy (2012)
[45] Greenberg, D., Nonnenmacher, M., Macke, J.: Automatic posterior transformation for likelihood-free inference. In: International Conference on Machine Learning, pp. 2404-2414. PMLR (2019)
[46] Gutmann, MU; Corander, J., Bayesian optimization for likelihood-free inference of simulator-based statistical models, J. Mach. Learn. Res., 17, 1, 4256-4302 (2016) · Zbl 1392.62072
[47] He, K., Zhang, X., Ren, S., Sun, J.: Deep residual learning for image recognition. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770-778 (2016)
[48] He, S.; Li, Y.; Feng, Y.; Ho, S.; Ravanbakhsh, S.; Chen, W.; Póczos, B., Learning to predict the cosmological structure formation, Proc. Natl. Acad. Sci., 116, 28, 13825-13832 (2019) · Zbl 1431.83191 · doi:10.1073/pnas.1821458116
[49] Hermans, J., Begy, V., Louppe, G.: Likelihood-free MCMC with amortized approximate ratio estimators. In: International Conference on Machine Learning, pp. 4239-4248. PMLR (2020)
[50] Hoffman, MD; Gelman, A., The No-U-Turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo, J. Mach. Learn. Res., 15, 1, 1593-1623 (2014) · Zbl 1319.60150
[51] Hornik, K.; Stinchcombe, M.; White, H., Multilayer feedforward networks are universal approximators, Neural Netw., 2, 5, 359-366 (1989) · Zbl 1383.92015 · doi:10.1016/0893-6080(89)90020-8
[52] Ialongo, A. D., Van Der Wilk, M., Hensman, J., Rasmussen, C.E.: Overcoming mean-field approximations in recurrent Gaussian process models. arXiv:1906.05828 (2019)
[53] Izenman, A.J.: An introduction to Kalman filtering with applications (1988)
[54] Jasra, A.; Singh, SS; Martin, JS; McCoy, E., Filtering via approximate Bayesian computation, Stat. Comput., 22, 6, 1223-1237 (2012) · Zbl 1252.62093 · doi:10.1007/s11222-010-9185-0
[55] Jeffrey, N.; Alsing, J.; Lanusse, F., Likelihood-free inference with neural compression of DES SV weak lensing map statistics, Mon. Not. R. Astron. Soc., 501, 1, 954-969 (2021) · doi:10.1093/mnras/staa3594
[56] Kahneman, D., Tversky, A.: Prospect theory: An analysis of decision under risk. In: Handbook of the fundamentals of financial decision making: Part I, pp. 99-127. World Scientific (2013)
[57] Kalman, RE, Contributions to the theory of optimal control, Boletín de la Sociedad Matemática Mexicana, 5, 2, 102-119 (1960) · Zbl 0112.06303
[58] Kalnay, E., Atmospheric Modeling, Data Assimilation and Predictability (2003), Cambridge: Cambridge University Press, Cambridge
[59] Karl, M., Soelch, M., Bayer, J., Van der Smagt, P.: Deep variational Bayes filters: Unsupervised learning of state space models from raw data. arXiv:1605.06432 (2016)
[60] Kitagawa, G., Monte Carlo filter and smoother for non-Gaussian nonlinear state space models, J. Comput. Graph. Stat., 5, 1, 1-25 (1996)
[61] Koller, D.; Friedman, N., Probabilistic Graphical Models: Principles and Techniques (2009), Cambridge: MIT Press, Cambridge · Zbl 1183.68483
[62] Kononenko, I., Bayesian neural networks, Biol. Cybern., 61, 5, 361-370 (1989) · doi:10.1007/BF00200801
[63] Lange, JU; van den Bosch, FC; Zentner, AR; Wang, K.; Hearin, AP; Guo, H., Cosmological evidence modelling: a new simulation-based approach to constrain cosmology on non-linear scales, Mon. Not. R. Astron. Soc., 490, 2, 1870-1878 (2019) · doi:10.1093/mnras/stz2664
[64] Lichtenstein, S.; Slovic, P., The Construction of Preference (2006), Cambridge: Cambridge University Press, Cambridge · doi:10.1017/CBO9780511618031
[65] Lintusaari, J.; Vuollekoski, H.; Kangasrääsiö, A.; Skytén, K.; Järvenpää, M.; Marttinen, P.; Gutmann, MU; Vehtari, A.; Corander, J.; Kaski, S., ELFI: Engine for likelihood-free inference, J. Mach. Learn. Res., 19, 16, 1-7 (2018)
[66] Martin, GM; McCabe, BP; Frazier, DT; Maneesoonthorn, W.; Robert, CP, Auxiliary likelihood-based approximate Bayesian computation in state space models, J. Comput. Graph. Stat., 28, 3, 508-522 (2019) · Zbl 07499073 · doi:10.1080/10618600.2018.1552154
[67] Martin, JS; Jasra, A.; Singh, SS; Whiteley, N.; Del Moral, P.; McCoy, E., Approximate Bayesian computation for smoothing, Stoch. Anal. Appl., 32, 3, 397-420 (2014) · Zbl 1429.62368 · doi:10.1080/07362994.2013.879262
[68] McInnes, L.; Healy, J.; Astels, S., hdbscan: Hierarchical density based clustering, J. Open Source Softw., 2, 11, 205 (2017) · doi:10.21105/joss.00205
[69] McInnes, L., Healy, J., Melville, J.: UMAP: Uniform manifold approximation and projection for dimension reduction. arXiv:1802.03426 (2018a)
[70] McInnes, L.; Healy, J.; Saul, N.; Grossberger, L., UMAP: Uniform manifold approximation and projection, J. Open Source Softw., 3, 29, 861 (2018) · doi:10.21105/joss.00861
[71] Melchior, S., Curi, S., Berkenkamp, F., Krause, A.: Structured variational inference in unstable Gaussian process state space models. arXiv:1907.07035 (2019)
[72] Moulavi, D., Jaskowiak, P.A., Campello, R.J., Zimek, A., Sander, J.: Density-based clustering validation. In: Proceedings of the 2014 SIAM International Conference on Data Mining, pp. 839-847. SIAM (2014)
[73] Myung, IJ, Tutorial on maximum likelihood estimation, J. Math. Psychol., 47, 1, 90-100 (2003) · Zbl 1023.62112 · doi:10.1016/S0022-2496(02)00028-7
[74] Ong, VM-H; Nott, DJ; Tran, M-N; Sisson, SA; Drovandi, CC, Likelihood-free inference in high dimensions with synthetic likelihood, Comput. Stat. Data Anal., 128, 271-291 (2018) · Zbl 1469.62123 · doi:10.1016/j.csda.2018.07.008
[75] Owen, AB, Scrambling Sobol’and Niederreiter-Xing points, J. Complex., 14, 4, 466-489 (1998) · Zbl 0916.65017 · doi:10.1006/jcom.1998.0487
[76] Papamakarios, G., Murray, I.: Fast \(\varepsilon \)-free inference of simulation models with Bayesian conditional density estimation. In: Advances in Neural Information Processing Systems, pp. 1028-1036, (2016)
[77] Papamakarios, G., Sterratt, D., Murray, I.: Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows. In: The 22nd International Conference on Artificial Intelligence and Statistics, pp. 837-848. PMLR (2019)
[78] Parzen, E.: On spectral analysis with missing observations and amplitude modulation. Sankhyā Indian J. Stat. Ser. A pp. 383-392 (1963) · Zbl 0136.40701
[79] Paszke, A., Gross, S., Massa, F., Lerer, A., Bradbury, J., Chanan, G., Killeen, T., Lin, Z., Gimelshein, N., Antiga, L., et al.: PyTorch: an imperative style, high-performance deep learning library. arXiv:1912.01703 (2019)
[80] Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; Vanderplas, J.; Passos, A.; Cournapeau, D.; Brucher, M.; Perrot, M.; Duchesnay, E., Scikit-learn: Machine learning in Python, J. Mach. Learn. Res., 12, 2825-2830 (2011) · Zbl 1280.68189
[81] Peters, GW; Sisson, SA; Fan, Y., Likelihood-free Bayesian inference for \(\alpha \)-stable models, Comput. Stat. Data Anal., 56, 11, 3743-3756 (2012) · Zbl 1255.62071 · doi:10.1016/j.csda.2010.10.004
[82] Pothos, EM; Chater, N., A simplicity principle in unsupervised human categorization, Cogn. Sci., 26, 3, 303-343 (2002) · doi:10.1207/s15516709cog2603_6
[83] Pritchard, JK; Seielstad, MT; Perez-Lezaun, A.; Feldman, MW, Population growth of human y chromosomes: a study of y chromosome microsatellites, Mol. Biol. Evol., 16, 12, 1791-1798 (1999) · doi:10.1093/oxfordjournals.molbev.a026091
[84] Raffin, A., Hill, A., Ernestus, M., Gleave, A., Kanervisto, A., Dormann, N.: Stable baselines3. https://github.com/DLR-RM/stable-baselines3 (2019)
[85] Reynolds, AM; Rhodes, CJ, The Lévy flight paradigm: random search patterns and mechanisms, Ecology, 90, 4, 877-887 (2009) · doi:10.1890/08-0153.1
[86] Rivals, I., Personnaz, L.: Black-box modeling with state-space neural networks. In: Neural Adaptive Control Technology, pp. 237-264. World Scientific (1996)
[87] Rubin, D.B.: Bayesianly justifiable and relevant frequency calculations for the applied statistician. Ann. Stat. 12(4), 1151-1172 (1984) · Zbl 0555.62010
[88] Salvatier, J.; Wiecki, TV; Fonnesbeck, C., Probabilistic programming in Python using PyMC3, PeerJ Comput. Sci., 2 (2016) · doi:10.7717/peerj-cs.55
[89] Schafer, C. M., Freeman, P. E.: Likelihood-free inference in cosmology: Potential for the estimation of luminosity functions. In: Statistical Challenges in Modern Astronomy V, pp. 3-19. Springer (2012)
[90] Schall, JD, Accumulators, neurons, and response time, Trends Neurosci., 42, 12, 848-860 (2019) · doi:10.1016/j.tins.2019.10.001
[91] Schuetz, I., Murdison, T. S., MacKenzie, K. J., Zannoli, M.: An explanation of Fitts’ law-like performance in gaze-based selection tasks using a psychophysics approach. In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pp. 1-13 (2019)
[92] Schulman, J., Wolski, F., Dhariwal, P., Radford, A., Klimov, O.: Proximal policy optimization algorithms. arXiv:1707.06347 (2017)
[93] Scott, DW, Multivariate Density Estimation: Theory, Practice, and Visualization (2015), New York: Wiley, New York · Zbl 1311.62004 · doi:10.1002/9781118575574
[94] Septier, F., Peters, G.W., Nevat, I.: Bayesian filtering with intractable likelihood using sequential MCMC. In: 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, pp. 6313-6317. IEEE (2013)
[95] Shafi, K.; Latif, N.; Shad, SA; Idrees, Z.; Gulzar, S., Estimating option greeks under the stochastic volatility using simulation, Phys. A, 503, 1288-1296 (2018) · Zbl 1494.91163 · doi:10.1016/j.physa.2018.08.032
[96] Shephard, N., Statistical aspects of ARCH and stochastic volatility, Monograph. Stat. Appl. Probab., 65, 1-68 (1996)
[97] Sisson, SA; Fan, Y.; Beaumont, M., Handbook of Approximate Bayesian Computation (2018), CRC Press · Zbl 1416.62005 · doi:10.1201/9781315117195
[98] Slovic, P.; Finucane, M.; Peters, E.; MacGregor, DG, Rational actors or rational fools: Implications of the affect heuristic for behavioral economics, J. Socio-Econ., 31, 4, 329-342 (2002) · doi:10.1016/S1053-5357(02)00174-9
[99] Smidl, V.; Quinn, A., Variational Bayesian filtering, IEEE Trans. Signal Process., 56, 10, 5020-5030 (2008) · Zbl 1390.94061 · doi:10.1109/TSP.2008.928969
[100] Smith, A., Sequential Monte Carlo Methods in Practice (2013), New York: Springer Science & Business Media, New York
[101] Srinivas, N., Krause, A., Kakade, S.M., Seeger, M.: Gaussian process optimization in the bandit setting: No regret and experimental design. arXiv:0912.3995 (2009)
[102] Sunnåker, M., Busetto, A.G., Numminen, E., Corander, J., Foll, M., Dessimoz, C.: Approximate Bayesian computation. PLoS Comput. Biol. 9(1), e1002803 (2013)
[103] Sutskever, I., Martens, J., Dahl, G., Hinton, G.: On the importance of initialization and momentum in deep learning. In: International Conference on Machine Learning, pp. 1139-1147. PMLR (2013)
[104] Tavaré, S.; Balding, DJ; Griffiths, RC; Donnelly, P., Inferring coalescence times from DNA sequence data, Genetics, 145, 2, 505-518 (1997) · doi:10.1093/genetics/145.2.505
[105] Taylor, SJ, Modeling stochastic volatility: A review and comparative study, Math. Financ., 4, 2, 183-204 (1994) · Zbl 0884.90054 · doi:10.1111/j.1467-9965.1994.tb00057.x
[106] Tejero-Cantero, A.; Boelts, J.; Deistler, M.; Lueckmann, J-M; Durkan, C.; Gonçalves, PJ; Greenberg, DS; Macke, JH, SBI: a toolkit for simulation-based inference, J. Open Source Softw., 5, 52, 2505 (2020) · doi:10.21105/joss.02505
[107] Toni, T.; Welch, D.; Strelkowa, N.; Ipsen, A.; Stumpf, MP, Approximate Bayesian computation scheme for parameter inference and model selection in dynamical systems, J. R. Soc. Interface, 6, 31, 187-202 (2009) · doi:10.1098/rsif.2008.0172
[108] Wieschen, EM; Voss, A.; Radev, S., Jumping to conclusion? A Lévy flight model of decision making, Quant. Methods Psychol., 16, 2, 120-132 (2020) · doi:10.20982/tqmp.16.2.p120
[109] Wilson, J.T., Hutter, F., Deisenroth, M.P.: Maximizing acquisition functions for Bayesian optimization. arXiv:1805.10196 (2018)
[110] Zeng, X.; Atlas, R.; Birk, RJ; Carr, FH; Carrier, MJ; Cucurull, L.; Hooke, WH; Kalnay, E.; Murtugudde, R.; Posselt, DJ, Use of observing system simulation experiments in the United States, Bull. Am. Meteor. Soc., 101, 8, E1427-E1438 (2020) · doi:10.1175/BAMS-D-19-0155.1
[111] Zerdali, E.; Barut, M., The comparisons of optimized extended Kalman filters for speed-sensorless control of induction motors, IEEE Trans. Ind. Electron., 64, 6, 4340-4351 (2017) · doi:10.1109/TIE.2017.2674579
[112] Zhang, C.; Bütepage, J.; Kjellström, H.; Mandt, S., Advances in variational inference, IEEE Trans. Pattern Anal. Mach. Intell., 41, 8, 2008-2026 (2018) · doi:10.1109/TPAMI.2018.2889774
[113] Zhang, X., Ren, X., Zha, H.: Modeling dwell-based eye pointing target acquisition. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2083-2092 (2010)