Involve: A Journal of Mathematics

Volume 12, Number 2 (2019), 301-319.

On the minimum of the mean-squared error in 2-means clustering

Bernhard G. Bodmann and Craig J. George


Abstract

We study the minimum mean-squared error for 2-means clustering when the outcomes of the vector-valued random variable to be clustered lie on two spheres, that is, on the surfaces of two touching balls of unit radius in n-dimensional Euclidean space, and the underlying probability distribution is the normalized surface measure. For simplicity, we consider only the asymptotics of large sample sizes and replace empirical samples by the probability measure. The concrete question addressed here is whether a minimizer of the mean-squared error identifies the two individual spheres as clusters. Indeed, in dimensions n ≥ 3, the minimum of the mean-squared error is achieved by a partition obtained from a separating hyperplane tangent to both spheres at the point where they touch. In dimension n = 2, however, the minimizer fails to identify the individual spheres; an optimal partition is associated with a hyperplane that does not contain the intersection of the two spheres.
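The dichotomy described in the abstract can be explored numerically. The sketch below is a hypothetical illustration, not the authors' code: it samples points uniformly from two unit spheres in R^3 touching at the origin (centers at (±1, 0, 0) are an assumed concrete placement) and runs a plain Lloyd iteration for k = 2. In dimension 3 the result above predicts that the optimal partition is cut by the common tangent hyperplane, so the recovered clusters should agree with the sphere labels.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_two_spheres(n_dim, n_pts):
    # Uniform points on the unit sphere: normalized Gaussian vectors.
    g = rng.standard_normal((n_pts, n_dim))
    pts = g / np.linalg.norm(g, axis=1, keepdims=True)
    labels = rng.integers(0, 2, size=n_pts)          # which of the two spheres
    pts[:, 0] += np.where(labels == 0, -1.0, 1.0)    # centers (-1,0,...,0) and (1,0,...,0); spheres touch at 0
    return pts, labels

def lloyd_2means(pts, iters=50):
    # Farthest-point initialization, then plain Lloyd updates.
    far = np.argmax(np.linalg.norm(pts - pts[0], axis=1))
    centers = np.stack([pts[0], pts[far]])
    for _ in range(iters):
        d = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
        assign = d.argmin(axis=1)                    # nearest center
        centers = np.stack([pts[assign == k].mean(axis=0) for k in range(2)])
    return assign, centers

pts, sphere = sample_two_spheres(3, 4000)
assign, centers = lloyd_2means(pts)
# Agreement with the sphere labels, up to swapping the two cluster names.
agree = max(np.mean(assign == sphere), np.mean(assign != sphere))
print(f"cluster/sphere agreement: {agree:.3f}")
```

Note that Lloyd's algorithm only finds local minima in general; here the symmetric geometry and farthest-point initialization make it settle on the tangent-hyperplane partition, where the two cluster centers sit on opposite sides of the hyperplane x₁ = 0.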

Article information

Source
Involve, Volume 12, Number 2 (2019), 301-319.

Dates
Received: 6 November 2017
Revised: 9 February 2018
Accepted: 7 March 2018
First available in Project Euclid: 25 October 2018

Permanent link to this document
https://projecteuclid.org/euclid.involve/1540432919

Digital Object Identifier
doi:10.2140/involve.2019.12.301

Mathematical Reviews number (MathSciNet)
MR3864219

Zentralblatt MATH identifier
06980503

Subjects
Primary: 62H30: Classification and discrimination; cluster analysis [See also 68T10, 91C20]

Keywords
$k$-means clustering; performance guarantees; mean-squared error

Citation

Bodmann, Bernhard G.; George, Craig J. On the minimum of the mean-squared error in 2-means clustering. Involve 12 (2019), no. 2, 301--319. doi:10.2140/involve.2019.12.301. https://projecteuclid.org/euclid.involve/1540432919

