
Information and dimensionality of anisotropic random geometric graphs. (English) Zbl 1453.05118

Klartag, Bo’az (ed.) et al., Geometric aspects of functional analysis. Israel seminar (GAFA) 2017–2019. Volume 1. Cham: Springer. Lect. Notes Math. 2256, 273-324 (2020).
The authors investigate the problem of detecting geometric structure in large random graphs. In the present article, they continue a line of investigation started in [S. Bubeck et al., Random Struct. Algorithms 49, No. 3, 503–532 (2016; Zbl 1349.05315)], where it was shown that \[ d_{TV}\left(G(n,p),G(n,p,d)\right)\to\begin{cases}1, &\text{ if }d/n^3\to 0,\\ 0, &\text{ if }d/n^3\to\infty.\end{cases}\tag{1} \] Here, \(d_{TV}\) denotes the total variation distance, \(G(n,p)\) is the law of a dense Erdős-Rényi random graph on \(n\) vertices with \(p\in(0,1)\), and \(G(n,p,d)\) is the law of the isotropic random geometric graph obtained by drawing a sample \(X_1,\dots,X_n\) of \(n\) i.i.d. points from the standard normal distribution in \(\mathbb{R}^d\) and connecting \(X_i\) and \(X_j\) by an edge if their scalar product exceeds the threshold value \(t_p\) determined by the edge density condition \(\mathbb{P}(\langle X_i, X_j\rangle\geq t_p)=p.\)
Since the coordinates of a data point tend to represent different features, which may influence the network formation in different ways, it is natural to replace the model \(G(n,p,d)\) by the anisotropic model \(G(n,p,\alpha)\), in which the vertex positions follow a normal law with a more general covariance matrix \(D_\alpha\). In particular, the authors investigate the case in which \(D_\alpha\) is a diagonal matrix with non-negative diagonal entries given by \(\alpha=(\alpha_1,\dots,\alpha_d)\in \mathbb{R}_{\geq 0}^d.\) The main result of the paper is that \[ d_{TV}\left(G(n,p),G(n,p,\alpha)\right)\to\begin{cases}1, &\text{ if }\left(\frac{||\alpha||_2}{||\alpha||_3}\right)^6/n^3\to 0,\\ 0, &\text{ if }\left(\frac{||\alpha||_2}{||\alpha||_4}\right)^4/n^3\to\infty,\end{cases}\tag{2} \] which generalises (1).
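The sampling mechanism of \(G(n,p,\alpha)\) described above can be sketched in a short simulation. This is an illustrative reconstruction, not the authors' code: the function name `sample_anisotropic_rgg` is chosen here, and the threshold \(t_p\) is calibrated empirically from fresh point pairs rather than computed in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_anisotropic_rgg(n, p, alpha, rng, n_calib=50_000):
    """Sample an adjacency matrix from the anisotropic model G(n, p, alpha):
    vertices are i.i.d. N(0, D_alpha) points in R^d, and two vertices are
    joined when the scalar product of their positions exceeds t_p, where
    t_p is chosen so that the edge density is p.  (Illustrative sketch;
    t_p is estimated empirically rather than derived in closed form.)"""
    d = len(alpha)
    scale = np.sqrt(np.asarray(alpha, dtype=float))  # N(0, D_alpha) = scale * N(0, I)
    # Calibrate t_p: the (1-p)-quantile of <U, V> over independent point pairs.
    U = rng.standard_normal((n_calib, d)) * scale
    V = rng.standard_normal((n_calib, d)) * scale
    t_p = np.quantile(np.einsum("ij,ij->i", U, V), 1 - p)
    # Sample the n vertex positions and threshold the Gram matrix.
    X = rng.standard_normal((n, d)) * scale
    A = (X @ X.T >= t_p).astype(int)
    np.fill_diagonal(A, 0)
    return A

# Isotropic special case alpha = (1, ..., 1) recovers G(n, p, d).
A = sample_anisotropic_rgg(50, 0.3, np.ones(100), rng)
```

Setting all entries of `alpha` equal recovers the isotropic model \(G(n,p,d)\); unequal entries weight the coordinates (features) differently, as in the anisotropic model.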
However, unlike in the isotropic case, the dichotomy in (2) is not exhaustive, and the authors conjecture that in fact \[ d_{TV}\left(G(n,p),G(n,p,\alpha)\right)\to 0, \text{ whenever }\left(\frac{||\alpha||_2}{||\alpha||_3}\right)^6/n^3\to \infty.\tag{3} \] The main tool employed to differentiate \(G(n,p)\) from \(G(n,p,\alpha)\) is the count of triangles: \(G(n,p)\) has \(p^3 \binom{n}{3}\) triangles in expectation. In the isotropic geometric model \(G(n,p,d)\), there are in expectation about \(1+1/d^{1/2}\) times as many triangles as in \(G(n,p)\), and this observation, together with variance estimates, can be leveraged to obtain bounds on the total variation distance between the models. In fact, it turned out in [V. Bentkus, Theory Probab. Appl. 49, No. 2, 311–323 (2004; Zbl 1090.60019); translation from Teor. Veroyatn. Primen. 49, No. 2, 400–410 (2004)] that, to obtain sharp variance estimates, the correct quantity to consider is a count of signed triangles. Analysing the variance of signed triangles in \(G(n,p,\alpha)\) yields \(\left(\frac{||\alpha||_2}{||\alpha||_3}\right)^6/n^3\) as the crucial quantity in distinguishing \(G(n,p,\alpha)\) from \(G(n,p)\), and this forms the basis of the conjecture (3).
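The signed triangle statistic mentioned above, \(\tau(G)=\sum_{i<j<k}(A_{ij}-p)(A_{ik}-p)(A_{jk}-p)\), can be sketched as follows. This is a hedged illustration, not the authors' implementation; it uses the standard trace identity \(\tau=\operatorname{tr}(B^3)/6\) for the centred adjacency matrix \(B=A-p\) with zeroed diagonal.

```python
import numpy as np

def signed_triangles(A, p):
    """Signed triangle count tau = sum over i<j<k of
    (A_ij - p)(A_ik - p)(A_jk - p), computed as trace(B^3)/6 where
    B = A - p with the diagonal zeroed.  Each unordered triple appears
    six times in the trace sum, hence the division by 6."""
    B = A - p
    np.fill_diagonal(B, 0.0)
    return np.trace(B @ B @ B) / 6.0

# Under G(n, p) the statistic fluctuates around 0, since the centred
# edge indicators of the three sides of a triangle are independent.
rng = np.random.default_rng(0)
n, p = 200, 0.3
A = np.triu((rng.random((n, n)) < p).astype(float), 1)
A = A + A.T
tau = signed_triangles(A, p)
```

Centring kills the dominant fluctuation of the raw triangle count, which is why the signed version yields the sharper variance estimates that underpin the conjecture (3).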
For the entire collection see [Zbl 1446.00029].

MSC:

05C80 Random graphs (graph-theoretic aspects)
60F05 Central limit and other weak theorems
60E15 Inequalities; stochastic orderings

References:

[1] V. Bentkus, A Lyapunov-type bound in R^d. Theory Probab. Its Appl. 49(2), 311-323 (2005) · Zbl 1090.60019 · doi:10.1137/S0040585X97981123
[2] S. Bubeck, S. Ganguly, Entropic CLT and phase transition in high-dimensional Wishart matrices. Int. Math. Res. Not. 2018(2), 588-606 (2016) · Zbl 1407.82024
[3] S. Bubeck, J. Ding, R. Eldan, M.Z. Rácz, Testing for high-dimensional geometry in random graphs. Random Struct. Algoritm. 49(3), 503-532 (2016) · Zbl 1349.05315 · doi:10.1002/rsa.20633
[4] T.M. Cover, J.A. Thomas, Elements of Information Theory (John Wiley & Sons, Hoboken, 2012) · Zbl 1140.94001
[5] J. Duchi, Derivations for Linear Algebra and Optimization (Berkeley, 2007). http://www.cs.berkeley.edu/~jduchi/projects/generalnotes.pdf
[6] R. Durrett, Probability: Theory and Examples (Cambridge University Press, Cambridge, 2010) · Zbl 1202.60001 · doi:10.1017/CBO9780511779398
[7] M.L. Eaton, Chapter 8: The Wishart distribution, in Multivariate Statistics, Lecture Notes-Monograph Series, vol. 53 (Institute of Mathematical Statistics, Beachwood, 2007), pp. 302-333
[8] P. Erdős, A. Rényi, On the evolution of random graphs. Publ. Math. Inst. Hungar. Acad. Sci. 5, 17-61 (1960) · Zbl 0103.16301
[9] L. Hörmander, The Analysis of Linear Partial Differential Operators I: Distribution Theory and Fourier Analysis (Springer, Berlin, 2015) · Zbl 1028.35001
[10] R. Latala, P. Mankiewicz, K. Oleszkiewicz, N. Tomczak-Jaegermann, Banach-Mazur distances and projections on random subgaussian polytopes. Discret. Comput. Geom. 38(1), 29-50 (2007) · Zbl 1134.52016 · doi:10.1007/s00454-007-1326-7
[11] I. Nourdin, G. Peccati, Normal Approximations with Malliavin Calculus: from Stein’s Method to Universality, vol. 192 (Cambridge University Press, Cambridge, 2012) · Zbl 1266.60001 · doi:10.1017/CBO9781139084659
[12] V.V. Petrov, Limit Theorems of Probability Theory (Oxford University Press, Oxford, 1995) · Zbl 0826.60001
[13] N.G. Shephard, From characteristic function to distribution function: a simple framework for the theory. Economet. Theor. 7(4), 519-529 (1991) · doi:10.1017/S0266466600004746
[14] T. Tao, Topics in Random Matrix Theory, vol. 132 (American Mathematical Society, Providence, 2012) · Zbl 1256.15020 · doi:10.1090/gsm/132
[15] R.