
Study of a measure of efficiency as a tool for applying the principle of least effort to the derivation of the Zipf and the Pareto laws. (English) Zbl 1536.62032

Summary: The principle of least effort (PLE) is believed to be a universal rule for living systems. Its application to the derivation of the power-law probability distributions of living systems has long been challenging. Recently, a measure of efficiency was proposed as a tool for deriving Zipf’s and Pareto’s laws directly from the PLE. This work is a further investigation of this efficiency measure from a mathematical point of view. The aim is to gain further insight into its properties and its usefulness as a metric of performance. We address some key mathematical properties of this efficiency, such as its sign, uniqueness and robustness. We also examine the relationship between this measure and other properties of the system of interest, such as inequality and uncertainty, by introducing a new method for calculating a nonnegative continuous entropy.
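The summary refers to Zipf’s rank-frequency law (probabilities decaying as a power of rank) and to entropy as a measure of uncertainty. As a minimal illustration of these two ingredients, not of the paper’s specific efficiency measure, the following sketch builds a normalized Zipf distribution p_k ∝ 1/k^a and computes its Shannon entropy; the function names and the exponent a = 1 are assumptions chosen for the example.

```python
import math

def zipf_distribution(n, a=1.0):
    """Normalized Zipf probabilities p_k proportional to 1/k^a over ranks 1..n."""
    weights = [k ** (-a) for k in range(1, n + 1)]
    z = sum(weights)  # normalization constant (generalized harmonic number)
    return [w / z for w in weights]

def shannon_entropy(p):
    """Shannon entropy in nats, a standard measure of probabilistic uncertainty."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

if __name__ == "__main__":
    p = zipf_distribution(100, a=1.0)
    # For a = 1, successive probabilities satisfy p_k / p_{k+1} = (k + 1) / k.
    print(f"p_1 = {p[0]:.4f}, p_100 = {p[-1]:.6f}")
    # Entropy lies strictly between 0 and log(n), the uniform-distribution maximum.
    print(f"entropy = {shannon_entropy(p):.4f} nats")
```

The entropy of such a power-law distribution is strictly below the log(n) attained by the uniform distribution, which is the kind of uncertainty comparison the summary alludes to.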

MSC:

62B10 Statistical aspects of information-theoretic topics
94A17 Measures of information, entropy

References:

[1] Ferrero, G., L’inertie mentale et la loi du moindre effort, Philos. Rev. 3 (1894) 362; Rev. Philos. France L’Étranger 37 (1894) 169.
[2] Zipf, G. K., Selected Studies of the Principle of Relative Frequency in Language (Harvard University Press, Cambridge, MA, 1932).
[3] Zipf, G. K., Human Behavior and the Principle of Least Effort (Addison-Wesley, Cambridge, MA, 1949).
[4] Zhu, Y. Y., Wang, Q. A., Li, W. and Cai, X., The principle of least effort and Zipf distribution, J. Phys.: Conf. Ser. 1113 (2018) 012007.
[5] Mandelbrot, B., An informational theory of the statistical structure of language, Commun. Theory 84 (1953) 486-502.
[6] Ferrer i Cancho, R. and Solé, R. V., Least effort and the origins of scaling in human language, PNAS 100 (2003) 788. · Zbl 1071.68096
[7] Wang, Q. A., Principle of least effort vs. maximum efficiency: Deriving Zipf-Pareto’s laws, Chaos Solitons Fractals 153 (2021) 111489. arXiv:2003.02376
[8] https://en.wikipedia.org/wiki/Economic_growth.
[9] https://en.wikipedia.org/wiki/Gini_coefficient and references therein.
[10] Ou, C. J., El Kaabouchi, A. and Nivanen, L., Maximizable informational entropy as a measure of probabilistic uncertainty, Int. J. Mod. Phys. B 24 (2010) 3461-3468. arXiv:0803.3110. · Zbl 1203.82004
[11] Jiang, J., Metz, F., Beck, C., Lefevre, S., Chen, J. C., Pezeril, M. and Wang, Q. A., Double power law degree distribution and informational entropy in urban road networks, Int. J. Mod. Phys. C 22 (2011) 33.
[12] Abe, S., Generalized entropy optimized by a given arbitrary distribution, J. Phys. A: Math. Gen. 36 (2003) 8733. · Zbl 1161.82320
[13] https://en.wikipedia.org/wiki/Principle_of_good_enough.
[14] Tsallis, C., Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479-487. · Zbl 1082.82501
[15] Tsallis, C., Introduction to Nonextensive Statistical Mechanics (Springer Science + Business Media, 2009). · Zbl 1172.82004
[16] Naudts, J., Generalised Statistics (Springer-Verlag, London, 2011). · Zbl 1231.82001
[17] Beck, C., Generalized information and entropy measures in physics, Contemp. Phys. 50 (2009) 495.
[18] Kaniadakis, G., Statistical mechanics in the context of special relativity, Phys. Rev. E 66 (2002) 056125. · Zbl 0994.81054
[19] Beck, C. and Cohen, E. G. D., Superstatistics, Physica A 322 (2003) 267-275. · Zbl 1038.82049
[20] Jaynes, E. T., Information theory and statistical mechanics, Phys. Rev. (Ser. II) 106 (1957) 620-630. · Zbl 0084.43701
[21] Pareto, V., Cours d’économie politique, J. Polit. Econ. 6 (1898) 549-552.
[22] Newitz, A., A mysterious law that predicts the size of the world’s biggest cities, https://io9.gizmodo.com/the-mysterious-law-that-governs-the-size-of-your-city-1479244159.
[23] Albert, R. and Barabási, A.-L., Statistical mechanics of complex networks, Rev. Mod. Phys. 74 (2002) 47-97. · Zbl 1205.82086
[24] Machu, F. X., Chen, J. L., Wang, R., El Kaabouchi, A. and Wang, Q. A., The necessity of the dynamics of preferential attachment due to the principle of least effort, to be submitted (2022).
[25] Lesche, B., Instabilities of Rényi entropies, J. Stat. Phys. 27 (1982) 419.
[26] El Kaabouchi, A., Wang, Q. A., Ou, C. J., Chen, J. C., Su, G. Z. and Le Méhauté, A., A counterexample against the Lesche stability of a generic entropy functional, J. Math. Phys. 52 (2011) 063302. · Zbl 1317.94039
[27] Berg, A. and Ostry, J., Inequality and unsustainable growth: Two sides of the same coin, IMF Econ. Rev. 65 (2017) 792-815.
[28] Dollar, D., Kleineberg, D. T. and Kraay, A., Growth, inequality and social welfare: Cross-country evidence, Econ. Policy 30 (2015) 335-377.
[29] Nivanen, L., Le Méhauté, A. and Wang, Q. A., Generalized algebra within a nonextensive statistics, Rep. Math. Phys. 52 (2003) 437. · Zbl 1125.82300
[30] Wang, Q. A., Probability distribution and entropy as a measure of uncertainty, J. Phys. A: Math. Theor. 41 (2008) 065004. · Zbl 1133.81011
[31] Jaynes, E. T., Information theory and statistical mechanics, Brandeis University Summer Institute, Lect. Theor. Phys. 3 (1963) 181-218.
[32] Jaynes, E. T., Probability Theory: The Logic of Science, Bretthorst, G. L., ed. (Cambridge University Press, UK, 2003). · Zbl 1045.62001
[33] Rényi, A., Probability Theory (Dover, New York, 1998).
[34] Sanov, I. N., On the probability of large deviations of random variables, Mat. Sbornik 42 (1957) 11 (in Russian). · Zbl 0112.10106
[35] https://en.wikipedia.org/wiki/Differential_entropy.
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases, these data have been complemented/enhanced by data from zbMATH Open. The list attempts to reflect the references in the original paper as accurately as possible without claiming completeness or a perfect matching.