
On predictive density estimation under \(\alpha\)-divergence loss. (English) Zbl 1426.62018

Summary: Based on \(X \sim N_d(\theta, \sigma_X^2I_d)\), we study the efficiency of predictive densities under \(\alpha\)-divergence loss \(L_{\alpha}\) for estimating the density of \(Y \sim N_d(\theta, \sigma_Y^2I_d)\). We identify a large number of cases where improvement on a plug-in density is obtainable by expanding the variance, thus extending earlier findings applicable to Kullback-Leibler loss. The results and proofs are unified with respect to the dimension \(d\), the variances \(\sigma_X^2\) and \(\sigma_Y^2\), and the choice of loss \(L_{\alpha}\), with \(\alpha\in (-1,1)\). The findings apply to a large class of plug-in densities, as well as to restricted parameter spaces with \(\theta\in\Theta\subset\mathbb{R}^d\). The theoretical findings are accompanied by various observations, illustrations, and implications, dealing for instance with robustness with respect to the model variances and with simultaneous dominance with respect to the loss.
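The variance-expansion phenomenon described above can be checked by hand in the Kullback-Leibler limiting case (the loss that the paper's \(\alpha\in(-1,1)\) results extend). For a predictive density \(N_d(X, cI_d)\), the standard Gaussian KL formula gives the frequentist risk \(\tfrac{d}{2}\big[\log(c/\sigma_Y^2) + (\sigma_Y^2+\sigma_X^2)/c - 1\big]\), which is free of \(\theta\) and is minimized at the expanded variance \(c = \sigma_X^2+\sigma_Y^2\), dominating the plug-in choice \(c = \sigma_Y^2\). The sketch below is an illustrative computation under these assumptions, not code from the paper; the function name `kl_risk` is ours.

```python
import math

def kl_risk(c, d, sx2, sy2):
    # KL risk of the predictive density N_d(X, c I_d) for the density of
    # Y ~ N_d(theta, sy2 I_d), averaged over X ~ N_d(theta, sx2 I_d).
    # Using KL(N(m1,s1) || N(m2,s2)) summed over coordinates and
    # E||theta - X||^2 = d*sx2, the risk is theta-free:
    #   (d/2) * [log(c/sy2) + (sy2 + sx2)/c - 1]
    return 0.5 * d * (math.log(c / sy2) + (sy2 + sx2) / c - 1.0)

d, sx2, sy2 = 3, 1.0, 1.0
plug_in  = kl_risk(sy2, d, sx2, sy2)        # plug-in variance sigma_Y^2
expanded = kl_risk(sx2 + sy2, d, sx2, sy2)  # expanded variance sigma_X^2 + sigma_Y^2
print(plug_in, expanded)
```

Here the expanded-variance density has strictly smaller risk (here \(\tfrac{3}{2}\log 2 \approx 1.04\) versus \(1.5\)), consistent with the dominance results for plug-in densities summarized above.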

MSC:

62C20 Minimax procedures in statistical decision theory
62F10 Point estimation
62F15 Bayesian inference
62F30 Parametric inference under constraints
62H10 Multivariate distribution of statistics
62M20 Inference from stochastic processes and prediction

References:

[1] J. Aitchison, “Goodness of Prediction Fit”, Biometrika 62, 547-554 (1975). · Zbl 0339.62018 · doi:10.1093/biomet/62.3.547
[2] J. Aitchison and I. R. Dunsmore, Statistical Prediction Analysis (Cambridge Univ. Press, Cambridge, 1975). · Zbl 0327.62043 · doi:10.1017/CBO9780511569647
[3] A. J. Baranchik, “A family of Minimax Estimators of the Mean of a Multivariate Normal Distribution”, Ann. Math. Statist. 41, 642-645 (1970). · Zbl 0204.52504 · doi:10.1214/aoms/1177697104
[4] J. O. Berger, “Minimax Estimation of a Multivariate Normal Mean under Polynomial Loss”, J. Multivariate Anal. 8, 173-180 (1978). · Zbl 0376.62008 · doi:10.1016/0047-259X(78)90070-2
[5] L. D. Brown, E. I. George, and X. Xu, “Admissible Predictive Density Estimation”, Ann. Statist. 36, 1156-1170 (2008). · Zbl 1216.62012 · doi:10.1214/07-AOS506
[6] J. M. Corcuera and F. Giummolè, “A Generalized Bayes Rule for Prediction”, Scand. J. Statist. 26, 265-279 (1999A). · Zbl 0934.62027 · doi:10.1111/1467-9469.00149
[7] J. M. Corcuera and F. Giummolè, “On the Relationship between α-Connections and the Asymptotic Properties of Predictive Distributions”, Bernoulli 5, 163-176 (1999B). · Zbl 0916.62014 · doi:10.2307/3318617
[8] I. Csiszár, “Information-Type Measures of Difference of Probability Distributions and Indirect Observations”, Studia Sci. Math. Hungar. 2, 299-318 (1967). · Zbl 0157.25802
[9] D. Fourdrinier, É. Marchand, A. Righi, and W. E. Strawderman, “On Improved Predictive Density Estimation with Parametric Constraints”, Electron. J. Statist. 5, 172-191 (2011). · Zbl 1274.62079 · doi:10.1214/11-EJS603
[10] D. Fourdrinier, I. Ouassou, and W. E. Strawderman, “Estimation of a Mean Vector under Quartic Loss”, J. Statist. Plann. Inference 138, 3841-3857 (2008). · Zbl 1147.62049 · doi:10.1016/j.jspi.2008.02.009
[11] E. I. George, F. Liang, and X. Xu, “Improved Minimax Predictive Densities under Kullback-Leibler Loss”, Ann. Statist. 34, 78-91 (2006). · Zbl 1091.62003 · doi:10.1214/009053606000000155
[12] M. Ghosh, V. Mergel, and G. S. Datta, “Estimation, Prediction and the Stein Phenomenon under Divergence Loss”, J. Multivariate Anal. 99, 1941-1961 (2008). · Zbl 1274.62080 · doi:10.1016/j.jmva.2008.02.002
[13] T. Kubokawa, É. Marchand, and W. E. Strawderman, “On Predictive Density Estimation for Location Families under Integrated Absolute Value Loss”, Bernoulli 23, 3197-3212 (2017). · Zbl 1382.62011 · doi:10.3150/16-BEJ842
[14] T. Kubokawa, É. Marchand, and W. E. Strawderman, “On Predictive Density Estimation for Location Families under Integrated Squared Error Loss”, J. Multivariate Anal. 142, 57-74 (2015A). · Zbl 1327.62054 · doi:10.1016/j.jmva.2015.07.013
[15] T. Kubokawa, É. Marchand, and W. E. Strawderman, “On Improved Shrinkage Estimators under Concave Loss”, Statist. Probab. Lett. 96, 241-246 (2015B). · Zbl 1308.62014 · doi:10.1016/j.spl.2014.09.024
[16] A. L’Moudden, É. Marchand, O. Kortbi, and W. E. Strawderman, “On Predictive Density Estimation for Gamma Models with Parametric Constraints”, J. Statist. Plann. Inference 185, 56-68 (2017). · Zbl 1356.62044 · doi:10.1016/j.jspi.2017.01.003
[17] É. Marchand, F. Perron, and I. Yadegari, “On Estimating a Bounded Normal Mean with Applications to Predictive Density Estimation”, Electron. J. Statist. 11, 2002-2025 (2017). · Zbl 1362.62018 · doi:10.1214/17-EJS1279
[18] É. Marchand and N. Sadeghkhani, “On Predictive Density Estimation with Additional Information”, Electron. J. Statist. (in press) (2017). · Zbl 1409.62116
[19] É. Marchand and W. E. Strawderman, “A Unified Minimax Result for Restricted Parameter Spaces”, Bernoulli 18, 635-643 (2012). · Zbl 1251.49024 · doi:10.3150/10-BEJ336
[20] Y. Maruyama and T. Ohnishi, “Harmonic Bayesian Prediction under α-Divergence”, arXiv:1605.05899v4 (2017). · Zbl 1432.62057
[21] Y. Maruyama and W. E. Strawderman, “Bayesian Predictive Densities for Linear Regression Models under α-Divergence Loss: Some Results and Open Problems”, in Contemporary Developments in Bayesian Analysis and Statistical Decision Theory: A Festschrift for William E. Strawderman, IMS Collections (2012), Vol. 8, pp. 42-56. · Zbl 1326.62021
[22] T. Yanagimoto and T. Ohnishi, “Bayesian Prediction of a Density Function in Terms of e-Mixture”, J. Statist. Plann. Inference 139, 3064-3075 (2009). · Zbl 1168.62024 · doi:10.1016/j.jspi.2009.02.005