Entropy. How little we (still) know.

Entropy, and its complex and subtle relations with information, has been an interest of mine for a long while; I have a paper on the subject, updated by posts on this blog. Since the last post, a number of interesting ideas have been put forward.

The first is not really new, having been made available in 2018, but has only recently come to my notice. An arXiv paper by Ted Jacobson, a theoretical physicist at the University of Maryland, Entropy from Carnot to Bekenstein, retells the story of the developing concept of entropy from a physicist’s perspective. An unusual mix of anecdote (part of the paper is developed from an after-dinner speech) and technical detail, it is a readable account with rich historical colour, and gives some insights missing from other overviews of the topic. The link between the physical and informational concepts of entropy is brought out strikingly, connecting Shannon’s informational entropy (associated with the probability of a message) with Boltzmann’s thermodynamic entropy (associated with the number of microstates of a system within a given macrostate). “The connection to thermodynamic entropy”, says Jacobson, “is that the microstate of a physical system can be viewed as a message whose information content is greater when the entropy of the corresponding macrostate is greater”. Obvious in a sense, but I have not seen it expressed in this way before.
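Jacobson’s point can be made concrete with a small sketch (my own illustration, not from his paper): a macrostate with W equally likely microstates behaves like a source of W equiprobable messages, so identifying the microstate carries log2(W) bits of Shannon information, the same logarithm of multiplicity that appears in Boltzmann’s S = k_B ln W.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A macrostate with W equally likely microstates: learning which
# microstate the system occupies is like receiving one of W
# equiprobable messages, so the information gained is log2(W) bits.
for W in (2, 8, 1024):
    H = shannon_entropy([1.0 / W] * W)
    print(f"W = {W:5d}  H = {H} bits")  # H equals log2(W)
```

More microstates per macrostate (higher thermodynamic entropy) means more information in the “message” that singles one out, which is exactly the connection Jacobson describes.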

Jacobson concludes his paper with the idea of the universe beginning in a state of zero entropy: Lemaître’s “primeval atom” or, as would be said today, a pure quantum state. From that point, entropy has inexorably increased, following the second law of thermodynamics, and giving rise, according to the most popular interpretation, to the “arrow of time”: the irreversible distinction between past and future, whereas other physical laws are time reversible. Although commonly accepted, this viewpoint is not without problems, including how to fully understand the apparent increase in structure and complexity in the universe alongside the increase in entropy. An alternative view is put forward in a recent open access paper in Frontiers in Physics by the well-known science commentator Jim Al‑Khalili and his colleague Eddy Chen. They suggest that the arrow of time is given by an increase in decoherence, by which the systems of the universe become increasingly quantum entangled, rather than by an increase in thermodynamic entropy. It is not clear whether this entanglement entropy, which increases with time, necessarily has the same relation to information as does thermodynamic entropy. If this viewpoint is accepted it may have interesting implications for information physics and information theory. This paper is rather technical, but worth the effort.
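For readers unfamiliar with entanglement entropy, a minimal sketch may help (my own illustration of the standard textbook quantity, not a calculation from the Al‑Khalili and Chen paper): for a maximally entangled pair of qubits, tracing out one qubit leaves the other in a maximally mixed state, whose von Neumann entropy is one bit.

```python
import numpy as np

# Entanglement (von Neumann) entropy of one qubit of a Bell pair.
# |phi+> = (|00> + |11>) / sqrt(2)
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)   # amplitudes for |00>,|01>,|10>,|11>
rho = np.outer(psi, psi.conj())             # full two-qubit density matrix

# Partial trace over the second qubit: reshape to (a, b, a', b')
# and sum over b = b', leaving the reduced state of qubit A.
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

# Von Neumann entropy S = -sum(p * log2(p)) over the eigenvalues.
evals = np.linalg.eigvalsh(rho_A)
S = -sum(p * np.log2(p) for p in evals if p > 1e-12)
print(S)  # 1 bit: the reduced state is maximally mixed
```

The pure global state has zero entropy, yet each subsystem alone looks maximally uncertain; it is this subsystem entropy, growing as decoherence spreads entanglement, that the alternative arrow-of-time proposal tracks.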

Finally, a short popular article in Quanta Magazine on how entropy is a measure of how little we know reflects on the meaning of entropy at very small scales. What could be the meaning of the entropy of a single electron? These issues are now becoming of practical importance with the increasing ability to manipulate matter at the smallest scale, and the advent of so-called “information engines”; going, in a sense, full circle to where entropy was first conceived, with Sadi Carnot’s steam engines.
