
Dynamic generalized information measures. (English) Zbl 1058.62006

Summary: In many reliability and survival analysis problems, the current age of the item under study must be taken into account by information measures of the lifetime distribution. Kullback-Leibler information and Shannon entropy have been considered in this context, leading to information measures that depend on time and are therefore dynamic. This paper develops dynamic information divergence and entropy of order \(\alpha\), also known as Rényi information and entropy, which for \(\alpha =1\) reduce to the Kullback-Leibler information and Shannon entropy, respectively.
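As a quick numeric illustration of the \(\alpha \to 1\) limit mentioned above (a sketch for a discrete distribution, not code from the paper), the Rényi entropy \(H_\alpha(p) = \frac{1}{1-\alpha}\log\sum_i p_i^\alpha\) approaches the Shannon entropy as \(\alpha \to 1\):

```python
import math

def renyi_entropy(p, alpha):
    # Rényi entropy of order alpha (alpha > 0, alpha != 1) of a discrete
    # distribution p: H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), in nats.
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def shannon_entropy(p):
    # Shannon entropy in nats, the alpha -> 1 limit of the Rényi entropy.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
# Rényi entropy is nonincreasing in alpha, with Shannon entropy in between:
print(renyi_entropy(p, 0.5), shannon_entropy(p), renyi_entropy(p, 2.0))
# Taking alpha very close to 1 recovers the Shannon value numerically:
print(renyi_entropy(p, 1.0 + 1e-6))
```

The same limiting behavior holds for the divergence: Rényi information of order \(\alpha\) between two distributions tends to the Kullback-Leibler information as \(\alpha \to 1\).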
We give characterizations of the proportional hazards model, the exponential distribution, and generalized Pareto distributions in terms of dynamic Rényi information and entropy. It is also shown that dynamic Rényi entropy uniquely determines distributions that have monotone densities. A result relating the dynamic Rényi entropy ordering to the hazard rate ordering is given. This result leads to a maximum dynamic entropy of order \(\alpha\) formulation and to characterizations of some well-known lifetime models. A dynamic entropy-hazard rate inequality is developed as an analog of the well-known entropy-moment inequality.
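The exponential characterization can be checked numerically. The dynamic (residual) Rényi entropy at age \(t\) is \(H_\alpha(f;t) = \frac{1}{1-\alpha}\log\int_t^\infty \bigl(f(x)/S(t)\bigr)^\alpha \,dx\), where \(S\) is the survival function; by memorylessness it is constant in \(t\) for the exponential distribution. The sketch below (a hypothetical helper using trapezoidal quadrature, not the paper's code) verifies this for \(\mathrm{Exp}(\lambda)\), where the closed form is \(-\log\lambda + \frac{\log\alpha}{\alpha-1}\):

```python
import math

def renyi_residual_entropy(pdf, sf, t, alpha, width=40.0, n=100000):
    # Trapezoidal approximation of
    #   H_alpha(f; t) = log( integral_t^inf (f(x)/S(t))^alpha dx ) / (1 - alpha)
    # truncated to [t, t + width]; assumes the tail beyond is negligible.
    s = sf(t)
    h = width / n
    xs = [t + i * h for i in range(n + 1)]
    ys = [(pdf(x) / s) ** alpha for x in xs]
    integral = h * (0.5 * ys[0] + sum(ys[1:-1]) + 0.5 * ys[-1])
    return math.log(integral) / (1.0 - alpha)

lam = 2.0
exp_pdf = lambda x: lam * math.exp(-lam * x)
exp_sf = lambda x: math.exp(-lam * x)

# Memorylessness: the residual Rényi entropy of Exp(lam) does not depend on t.
alpha = 2.0
closed = -math.log(lam) + math.log(alpha) / (alpha - 1.0)
for t in (0.0, 0.5, 2.0):
    print(t, renyi_residual_entropy(exp_pdf, exp_sf, t, alpha), closed)
```

For distributions that are not memoryless (e.g. Weibull with shape \(\neq 1\)), the same function returns values that vary with \(t\), which is the sense in which these measures are dynamic.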

MSC:

62B10 Statistical aspects of information-theoretic topics
62N05 Reliability and life testing
62E10 Characterization and structure theory of statistical distributions

References:

[1] Asadi, M.; Ebrahimi, N., Residual entropy and its characterizations in terms of hazard function and mean residual life function, Statist. Probab. Lett., 49, 263-269 (2000) · Zbl 1118.62306
[2] Asadi, M.; Ebrahimi, N.; Hamedani, G. G.; Soofi, E. S., Maximum dynamic entropy models, J. Appl. Probab., 41, 379-390 (2004) · Zbl 1063.94015
[3] Belzunce, F.; Navarro, J.; Ruiz, J. M., Some results on residual entropy functions, Metrika, 59, 147-161 (2004) · Zbl 1079.62008
[4] Di Crescenzo, A.; Longobardi, M., Entropy-based measure of uncertainty in past lifetime distributions, J. Appl. Probab., 39, 434-440 (2002) · Zbl 1003.62087
[5] Di Crescenzo, A.; Longobardi, M., A measure of discrimination between past life-time distributions, Statist. Probab. Lett., 67, 173-182 (2004) · Zbl 1058.62088
[6] Ebrahimi, N., How to measure uncertainty in the residual lifetime distributions, Sankhya A, 58, 48-57 (1996) · Zbl 0893.62098
[7] Ebrahimi, N.; Kirmani, S. N. U. A., A characterization of the proportional hazards model through a measure of discrimination between two residual life distributions, Biometrika, 83, 233-235 (1996) · Zbl 0865.62075
[8] Ebrahimi, N.; Kirmani, S. N. U. A., A measure of discrimination between two residual lifetime distributions and its applications, Ann. Inst. Statist. Math., 48, 257-265 (1996) · Zbl 0861.62063
[9] Ebrahimi, N.; Kirmani, S. N. U. A., Some results on ordering of survival functions through uncertainty, Statist. Probab. Lett., 29, 167-176 (1996) · Zbl 1007.62527
[10] Golan, A.; Perloff, J. M., Comparison of maximum entropy and higher-order entropy estimators, J. Econometrics, 107, 195-211 (2002) · Zbl 1043.62001
[11] Hamedani, G. G., 2005. Characterizations of univariate continuous distributions based on hazard functions. J. Appl. Statist. Sci., to appear. · Zbl 1102.62010
[12] Hardy, G. H.; Littlewood, J. E.; Pólya, G., Inequalities (1934), Cambridge University Press, Cambridge · Zbl 0010.10703
[13] Jaynes, E. T., Information theory and statistical mechanics, Phys. Rev., 106, 620-630 (1957) · Zbl 0084.43701
[14] Kullback, S., Information Theory and Statistics (1959), Wiley, New York (reprinted 1968 by Dover) · Zbl 0149.37901
[15] Nadarajah, S.; Zografos, K., Formulas for Rényi information and related measures for univariate distributions, Inform. Sci., 155, 119-138 (2003) · Zbl 1053.94005
[16] Rényi, A., 1961. On measures of entropy and information. Proceedings of the Fourth Berkeley Symposium, vol. 1, UC Press, Berkeley, pp. 547-561. · Zbl 0106.33001
[17] Shannon, C. E., A mathematical theory of communication, Bell System Tech. J., 27, 379-423 (1948) · Zbl 1154.94303
[18] Wyner, A. D.; Ziv, J., On communication of analog data from bounded source space, Bell System Tech. J., 48, 3139-3172 (1969) · Zbl 0187.41801
[19] Zografos, K., Nadarajah, S., 2005. Expressions for Rényi and Shannon entropies for multivariate distributions. Statist. Probab. Lett., in press. · Zbl 1058.62008