Journal of the Mathematical Society of Japan

Entropy and its many Avatars

Srinivasa R. S. VARADHAN

Abstract

Entropy was first introduced in 1865 by Rudolf Clausius in his study of the connection between work and heat. A mathematical definition was given by Boltzmann as the logarithm of the number of microstates that correspond to a given macrostate. Entropy plays important roles in statistical mechanics, in the theory of large deviations in probability, as an invariant in ergodic theory, and as a useful tool in communication theory. This article explores some of the connections between these different contexts.
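
For readers unfamiliar with the phrasing, Boltzmann's definition described in the abstract is conventionally written as the following formula; the symbols S, W, and k_B are the standard textbook conventions, not notation taken from this article.

  % S: entropy of a macrostate; W: number of microstates realizing that macrostate;
  % k_B: Boltzmann's constant.
  \[ S = k_B \log W \]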

Article information

Source
J. Math. Soc. Japan, Volume 67, Number 4 (2015), 1845-1857.

Dates
First available in Project Euclid: 27 October 2015

Permanent link to this document
https://projecteuclid.org/euclid.jmsj/1445951169

Digital Object Identifier
doi:10.2969/jmsj/06741845

Mathematical Reviews number (MathSciNet)
MR3417516

Zentralblatt MATH identifier
1357.94050

Subjects
Primary: 60-02: Research exposition (monographs, survey articles)
Secondary: 37A35: Entropy and other invariants, isomorphism, classification 94A17: Measures of information, entropy

Keywords
entropy, information, large deviations

Citation

VARADHAN, Srinivasa R. S. Entropy and its many Avatars. J. Math. Soc. Japan 67 (2015), no. 4, 1845--1857. doi:10.2969/jmsj/06741845. https://projecteuclid.org/euclid.jmsj/1445951169

References

  • J. Aczél and Z. Daróczy, On Measures of Information and Their Characterizations, Academic Press, New York, 1975.
  • L. Boltzmann, Über die Mechanische Bedeutung des Zweiten Hauptsatzes der Wärmetheorie, Wiener Berichte, 53 (1866), 195–220.
  • R. Clausius, Théorie mécanique de la chaleur, 1ère partie, Paris: Lacroix, 1868.
  • H. Cramér, On a new limit theorem in the theory of probability, Colloquium on the Theory of Probability, Hermann, Paris, 1937.
  • J. D. Deuschel and D. W. Stroock, Large deviations, Pure and Appl. Math., 137, Academic Press, Inc., Boston, MA, 1989, xiv+307 pp.
  • M. D. Donsker and S. R. S. Varadhan, Asymptotic evaluation of certain Markov process expectations for large time, IV, Comm. Pure Appl. Math., 36 (1983), 183–212.
  • A. Feinstein, A new basic theorem of information theory, IRE Trans. Information Theory PGIT-4 (1954), 2–22.
  • L. Gross, Logarithmic Sobolev inequalities, Amer. J. Math., 97 (1975), 1061–1083.
  • M. Z. Guo, G. C. Papanicolaou and S. R. S. Varadhan, Nonlinear diffusion limit for a system with nearest neighbor interactions, Comm. Math. Phys., 118 (1988), 31–59.
  • A. I. Khinchin, On the fundamental theorems of information theory, Translated by Morris D. Friedman, 572 California St., Newtonville MA 02460, 1956, 84 pp.
  • A. N. Kolmogorov, A new metric invariant of transitive dynamical systems and automorphisms of Lebesgue spaces, (Russian) Topology, ordinary differential equations, dynamical systems, Trudy Mat. Inst. Steklov, 169 (1985), 94–98, 254.
  • O. Lanford, Entropy and equilibrium states in classical statistical mechanics, Statistical Mechanics and Mathematical Problems, Lecture Notes in Physics, 20, Springer-Verlag, Berlin and New York, 1971, 1–113.
  • D. S. Ornstein, Ergodic theory, randomness, and dynamical systems, James K. Whittemore Lectures in Mathematics given at Yale University, Yale Mathematical Monographs, No. 5, Yale University Press, New Haven, Conn.-London, 1974, vii+141 pp.
  • I. N. Sanov, On the probability of large deviations of random magnitudes, (Russian) Mat. Sb. (N. S.), 42 (84) (1957), 11–44.
  • C. E. Shannon, A mathematical theory of communication, Bell System Tech. J., 27 (1948), 379–423, 623–656.
  • Y. G. Sinai, On a weak isomorphism of transformations with invariant measure, (Russian) Mat. Sb. (N.S.), 63 (105) (1964), 23–42.
  • H. T. Yau, Relative entropy and hydrodynamics of Ginzburg-Landau models, Lett. Math. Phys., 22 (1991), 63–80.