The Annals of Statistics

Think globally, fit locally under the manifold setup: Asymptotic analysis of locally linear embedding

Hau-Tieng Wu and Nan Wu

Abstract

Since its introduction in 2000, Locally Linear Embedding (LLE) has been widely applied in data science. We provide an asymptotic analysis of LLE under the manifold setup. We show that, for a general manifold, LLE may not asymptotically recover the Laplace–Beltrami operator, and the limit may depend on the nonuniform sampling density unless a correct regularization is chosen. We also derive the corresponding kernel function, which indicates that LLE is not a Markov process. A comparison with other commonly applied nonlinear dimension-reduction algorithms, particularly the diffusion map, is provided, and the relationship between LLE and locally linear regression is also discussed.
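To make the role of the regularization concrete, below is a minimal numpy sketch of plain LLE (Roweis and Saul [22]): barycentric weights are obtained by solving a ridge-regularized local Gram system, and the embedding comes from the bottom nontrivial eigenvectors of (I - W)^T (I - W). The function name lle and the parameter reg (the ridge scale whose choice, per the analysis above, determines the asymptotic limit) are illustrative, not the authors' code.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.linalg import eigh

def lle(X, n_neighbors=10, n_components=2, reg=1e-3):
    """Plain LLE sketch. `reg` scales the ridge term added to each local
    Gram matrix; the paper's analysis shows that the asymptotic behavior
    of LLE hinges on how this regularization is chosen."""
    n = X.shape[0]
    # k nearest neighbors of every point (drop the point itself).
    _, idx = cKDTree(X).query(X, k=n_neighbors + 1)
    idx = idx[:, 1:]
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[idx[i]] - X[i]                  # neighbors centered at x_i
        G = Z @ Z.T                           # local Gram matrix, k x k
        G += reg * np.trace(G) * np.eye(n_neighbors)  # ridge regularization
        w = np.linalg.solve(G, np.ones(n_neighbors))  # barycentric weights
        W[i, idx[i]] = w / w.sum()            # normalize to sum to one
    # Embedding: bottom nontrivial eigenvectors of (I - W)^T (I - W).
    I = np.eye(n)
    M = (I - W).T @ (I - W)
    _, vecs = eigh(M)                         # eigenvalues in ascending order
    return vecs[:, 1:n_components + 1]        # skip the constant eigenvector

# Toy usage: nonuniform samples from the unit circle in R^2, embedded in 1-D.
rng = np.random.default_rng(0)
theta = 2 * np.pi * rng.beta(2, 5, size=500)  # nonuniform sampling density
X = np.column_stack([np.cos(theta), np.sin(theta)])
Y = lle(X, n_neighbors=12, n_components=1)
```

Note that the nonuniform draw of theta in the toy example mirrors the sampling issue raised in the abstract: with a poorly scaled reg, the recovered one-dimensional coordinate reflects the sampling density as well as the geometry of the circle.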

Article information

Source
Ann. Statist., Volume 46, Number 6B (2018), 3805–3837.

Dates
Received: March 2017
Revised: December 2017
First available in Project Euclid: 11 September 2018

Permanent link to this document
https://projecteuclid.org/euclid.aos/1536631291

Digital Object Identifier
doi:10.1214/17-AOS1676

Mathematical Reviews number (MathSciNet)
MR3852669

Zentralblatt MATH identifier
1405.62058

Subjects
Primary: 60K35: Interacting random processes; statistical mechanics type models; percolation theory [See also 82B43, 82C43]

Keywords
Locally linear embedding; diffusion maps; dimension reduction; locally linear regression; measurement error

Citation

Wu, Hau-Tieng; Wu, Nan. Think globally, fit locally under the manifold setup: Asymptotic analysis of locally linear embedding. Ann. Statist. 46 (2018), no. 6B, 3805–3837. doi:10.1214/17-AOS1676. https://projecteuclid.org/euclid.aos/1536631291


References

  • [1] Belkin, M. and Niyogi, P. (2003). Laplacian eigenmaps for dimensionality reduction and data representation. Neural Comput. 15 1373–1396.
  • [2] Belkin, M. and Niyogi, P. (2005). Towards a theoretical foundation for Laplacian-based manifold methods. In Learning Theory. Lecture Notes in Computer Science 3559 486–500. Springer, Berlin.
  • [3] Belkin, M. and Niyogi, P. (2007). Convergence of Laplacian eigenmaps. In Advances in Neural Information Processing Systems 19 (NIPS 2006) 129–136. MIT Press, Cambridge, MA.
  • [4] Bérard, P., Besson, G. and Gallot, S. (1994). Embedding Riemannian manifolds by their heat kernel. Geom. Funct. Anal. 4 373–398.
  • [5] Bérard, P. H. (1986). Spectral Geometry: Direct and Inverse Problems. Lecture Notes in Math. 1207. Springer, Berlin.
  • [6] Cheeger, J., Gromov, M. and Taylor, M. (1982). Finite propagation speed, kernel estimates for functions of the Laplace operator, and the geometry of complete Riemannian manifolds. J. Differential Geom. 17 15–53.
  • [7] Cheng, M.-Y. and Wu, H.-T. (2013). Local linear regression on manifolds and its geometric interpretation. J. Amer. Statist. Assoc. 108 1421–1434.
  • [8] Coifman, R. R. and Lafon, S. (2006). Diffusion maps. Appl. Comput. Harmon. Anal. 21 5–30.
  • [9] Devroye, L. P. and Wagner, T. J. (1977). The strong uniform consistency of nearest neighbor density estimates. Ann. Statist. 5 536–540.
  • [10] do Carmo, M. P. and Flaherty, F. (1992). Riemannian Geometry. Birkhäuser, Boston, MA.
  • [11] Donoho, D. L., Gavish, M. and Johnstone, I. M. (2013). Optimal shrinkage of eigenvalues in the spiked covariance model. Available at arXiv:1311.0851.
  • [12] Donoho, D. L. and Grimes, C. (2003). Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data. Proc. Natl. Acad. Sci. USA 100 5591–5596.
  • [13] El Karoui, N. (2010). On information plus noise kernel random matrices. Ann. Statist. 38 3191–3216.
  • [14] El Karoui, N. and Wu, H.-T. (2016). Graph connection Laplacian methods can be made robust to noise. Ann. Statist. 44 346–372.
  • [15] Fan, J. and Gijbels, I. (1996). Local Polynomial Modelling and Its Applications. Chapman & Hall/CRC, Boca Raton, FL.
  • [16] Gao, T. (2016). The diffusion geometry of fibre bundles. Available at arXiv:1602.02330.
  • [17] García Trillos, N. and Slepčev, D. (2018). A variational approach to the consistency of spectral clustering. Appl. Comput. Harmon. Anal. To appear.
  • [18] Giné, E. and Koltchinskii, V. (2006). Empirical graph Laplacian approximation of Laplace–Beltrami operators: Large sample results. In High Dimensional Probability. Institute of Mathematical Statistics Lecture Notes—Monograph Series 51 238–259. IMS, Beachwood, OH.
  • [19] Hein, M., Audibert, J.-Y. and von Luxburg, U. (2005). From graphs to manifolds—Weak and strong pointwise consistency of graph Laplacians. In Learning Theory. Lecture Notes in Computer Science 3559 470–485. Springer, Berlin.
  • [20] Johnstone, I. M. (2006). High dimensional statistical inference and random matrices. Available at arXiv:math/0611589v1.
  • [21] Moore, D. S. and Yackel, J. W. (1977). Consistency properties of nearest neighbor density function estimators. Ann. Statist. 5 143–154.
  • [22] Roweis, S. T. and Saul, L. K. (2000). Nonlinear dimensionality reduction by locally linear embedding. Science 290 2323–2326.
  • [23] Singer, A. (2006). From graph to manifold Laplacian: The convergence rate. Appl. Comput. Harmon. Anal. 21 128–134.
  • [24] Singer, A. and Wu, H.-T. (2012). Vector diffusion maps and the connection Laplacian. Comm. Pure Appl. Math. 65 1067–1144.
  • [25] Singer, A. and Wu, H.-T. (2013). 2-D tomography from noisy projections taken at unknown random directions. SIAM J. Imaging Sci. 6 136–175.
  • [26] Singer, A. and Wu, H.-T. (2017). Spectral convergence of the connection Laplacian from random samples. Inf. Inference 6 58–123.
  • [27] Smolyanov, O., Weizsäcker, H. v. and Wittich, O. (2007). Chernoff’s theorem and discrete time approximations of Brownian motion on manifolds. Potential Anal. 26 1–29.
  • [28] Stein, E. M. and Weiss, G. (2016). Introduction to Fourier Analysis on Euclidean Spaces. PMS 32. Princeton Univ. Press, Princeton, NJ.
  • [29] Tenenbaum, J. B., de Silva, V. and Langford, J. C. (2000). A global geometric framework for nonlinear dimensionality reduction. Science 290 2319–2323.
  • [30] van der Maaten, L. and Hinton, G. (2008). Visualizing data using t-SNE. J. Mach. Learn. Res. 9 2579–2605.
  • [31] von Luxburg, U., Belkin, M. and Bousquet, O. (2008). Consistency of spectral clustering. Ann. Statist. 36 555–586.
  • [32] Wang, X. (2015). Spectral convergence rate of graph Laplacian. Available at arXiv:1510.08110.
  • [33] Weinberger, K. Q. and Saul, L. K. (2006). An introduction to nonlinear dimensionality reduction by maximum variance unfolding. In AAAI 1683–1686.
  • [34] Wu, H.-T. and Wu, N. (2018). Supplement to “Think globally, fit locally under the manifold setup: Asymptotic analysis of locally linear embedding.” DOI:10.1214/17-AOS1676SUPP.
  • [35] Zhang, Z. and Wang, J. (2006). MLLE: Modified locally linear embedding using multiple weights. In Advances in Neural Information Processing Systems 19 (NIPS 2006) 1593–1600. MIT Press, Cambridge, MA.
  • [36] Zhang, Z. and Zha, H. (2004). Principal manifolds and nonlinear dimensionality reduction via tangent space alignment. SIAM J. Sci. Comput. 26 313–338.

Supplemental materials

  • Supplement to “Think globally, fit locally under the manifold setup: Asymptotic analysis of locally linear embedding”. Proofs of the main theorems and technical details.