
Goodness of fit test using Lin-Wong divergence based on type-I censored data. (English) Zbl 07552810

Summary: Goodness of fit tests are carried out with two different methods in the present paper in order to recognize normal and exponential distributions within Type-I censored data. Here, we use the Lin-Wong divergence as our proposed measure of distance between distributions and compare its performance with that of other well-known distance measures based on the empirical distribution function and on the entropy concept of [C. E. Shannon, Bell Syst. Tech. J. 27, 379–423, 623–656 (1948; Zbl 1154.94303)]. Furthermore, we present two real data analyses using the proposed methods.
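As an illustration of the kind of distance measure discussed here, the following sketch computes a discrete divergence of the form studied by Lin [13], D(p, q) = Σ p_i log(2 p_i / (p_i + q_i)), between two probability vectors. This particular form is an assumption for illustration; the exact censored-data version of the Lin-Wong statistic used in the paper is more involved, and the function name `lin_wong_divergence` is hypothetical.

```python
import numpy as np

def lin_wong_divergence(p, q):
    """Divergence of the Lin [13] type between two discrete distributions.

    Assumed form for illustration: D(p, q) = sum_i p_i * log(2 p_i / (p_i + q_i)).
    Terms with p_i == 0 contribute zero and are skipped to avoid log(0).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # 0 * log(0) is taken as 0 by convention
    return float(np.sum(p[mask] * np.log(2.0 * p[mask] / (p[mask] + q[mask]))))

# Identical distributions give divergence 0; disjoint supports give the
# maximum value log(2), so the measure is bounded, unlike Kullback-Leibler.
print(lin_wong_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(lin_wong_divergence([1.0, 0.0], [0.0, 1.0]))  # log(2) ≈ 0.6931
```

In a goodness-of-fit setting, `p` would be a discretized empirical density estimate from the (censored) sample and `q` the hypothesized normal or exponential density evaluated on the same grid, with large divergence values leading to rejection.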

MSC:

62-XX Statistics

Citations:

Zbl 1154.94303
Full Text: DOI

References:

[1] Abbasnejad, M.; Arghami, N. R.; Tavakoli, M., A goodness of fit test for exponentiality based on Lin Wong information, Journal of the Iranian Statistical Society, 11, 2, 191-202 (2012) · Zbl 1278.62058
[2] Alizadeh Noughabi, H., A new estimator of entropy and its application in testing normality, Journal of Statistical Computation and Simulation, 80, 10, 1151-62 (2010) · Zbl 1270.62021
[3] Bitaraf, M.; Rezaei, M.; Yousefzadeh, F., Test for normality based on two new estimators of entropy, Journal of Statistical Computation and Simulation, 87, 2, 280-94 (2017) · Zbl 07191938
[4] Cha, S. H., Comprehensive survey on distance/similarity measures between probability density functions, International Journal of Mathematical Models and Methods in Applied Sciences, 1, 4, 300-7 (2007)
[5] Correa, J. C., A new estimator of entropy, Communications in Statistics-Theory and Methods, 24, 10, 2439-49 (1995) · Zbl 0875.62030
[6] Deza, M. M.; Deza, E., Encyclopedia of distances (2009), Heidelberg, Germany: Springer Berlin Heidelberg, Heidelberg, Germany · Zbl 1167.51001
[7] Ebrahimi, N.; Pflughoeft, K.; Soofi, E. S., Two measures of sample entropy, Statistics & Probability Letters, 20, 3, 225-34 (1994) · Zbl 0805.62009
[8] Esteban, M.; Castellanos, M.; Morales, D.; Vajda, I., Monte Carlo comparison of four normality tests using different entropy estimates, Communications in Statistics-Simulation and Computation, 30, 4, 761-85 (2001) · Zbl 1008.62505
[9] Joarder, A.; Krishna, H.; Kundu, D., Inferences on weibull parameters with conventional Type-I censoring, Computational Statistics and Data Analysis, 55, 1, 1-11 (2011) · Zbl 1247.62061
[10] Koziol, J. A.; Byar, D. P., Percentage points of the asymptotic distributions of one and two sample KS statistics for truncated or censored data, Technometrics, 17, 4, 507-10 (1975) · Zbl 0326.62032
[11] Kullback, S.; Leibler, R. A., On information and sufficiency, The Annals of Mathematical Statistics, 22, 1, 79-86 (1951) · Zbl 0042.38403
[12] Lin, J.; Wong, S. K. M., A new directed divergence measure and its characterization, International Journal of General System, 17, 1, 73-81 (1990) · Zbl 0703.94003
[13] Lin, J., Divergence measures based on the Shannon entropy, IEEE Transactions on Information Theory, 37, 1, 145-51 (1991) · Zbl 0712.94004
[14] Nelson, W., Applied life data analysis (1982), New York: John Wiley and Sons, New York · Zbl 0579.62089
[15] Pakyari, R.; Balakrishnan, N., Testing exponentiality based on Type-I censored data, Journal of Statistical Computation and Simulation, 83, 12, 2369-78 (2013) · Zbl 1453.62372
[16] Pakyari, R.; Nia, K. R., Testing goodness-of-fit for some lifetime distributions with conventional Type-I censoring, Communications in Statistics-Simulation and Computation, 46, 4, 2998-3009 (2017) · Zbl 1373.62501
[17] Park, S.; Shin, M., Kullback-Leibler information of a censored variable and its applications, Statistics, 48, 4, 756-65 (2014) · Zbl 1326.62013
[18] Parzen, E., Nonparametric statistical data modeling, Journal of the American Statistical Association, 74, 365, 105-21 (1979) · Zbl 0407.62001
[19] Persson, T.; Rootzen, H., Simple and highly efficient estimators for a Type-I censored normal sample, Biometrika, 64, 1, 123-8 (1977) · Zbl 0352.62025
[20] Pettitt, A. N.; Stephens, M. A., Modified Cramér-von Mises statistics for censored data, Biometrika, 63, 2, 291-8 (1976) · Zbl 0329.62013
[21] Shannon, C. E., A mathematical theory of communication, Bell System Technical Journal, 27, 379-423, 623-56 (1948) · Zbl 1154.94303
[22] Shioya, H.; Da-Te, T., A generalization of lin divergence and the derivation of a new information divergence, Electronics and Communications in Japan (Part III: Fundamental Electronic Science), 78, 7, 34-40 (1995)
[23] Silverman, B., Density estimation (1986), London, UK: Chapman & Hall/CRC, London, UK · Zbl 0617.62042
[24] Topsøe, F., Some inequalities for information divergence and related measures of discrimination, Research Report Collection, 2, 1, 73-83 (1999)
[25] Vasicek, O., A test for normality based on sample entropy, Journal of the Royal Statistical Society, Series B, 38, 1, 54-9 (1976) · Zbl 0331.62031
[26] Wasserman, L., All of nonparametric statistics (2010), New York: Springer Texts in Statistics, New York
[27] Wieczorkowski, R.; Grzegorzewski, P., Entropy estimators - Improvements and comparisons, Communications in Statistics-Simulation and Computation, 28, 2, 541-67 (1999) · Zbl 0932.62007
[28] Zamanzade, E.; Arghami, N. R., Goodness of fit test based on correcting moments of modified entropy estimator, Journal of Statistical Computation and Simulation, 81, 12, 2077-93 (2011) · Zbl 1431.62026
[29] Zamanzade, E.; Arghami, N. R., Testing normality based on new entropy estimators, Journal of Statistical Computation and Simulation, 82, 11, 1701-13 (2012) · Zbl 1431.62202
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.