This post gives an update on the development of the idea of information as a constituent of the physical world, and is a companion piece to earlier posts on information in the biological domain, on the conservation of information, on quantum information, and on the theory of relativity and its informational component. All stem from an interest in the idea that there may be significant links between conceptions of information in different contexts.
The physical nature of information, and its interest and possible significance for the library/information sciences, has been examined by Lyn Robinson and myself in papers in JASIST and in Information Research. This post gives an update on these, with the caveat that it is a highly selective one. Information physics is a burgeoning area, and hundreds of papers, most of them highly technical, have appeared. In this post, I choose a few of these to exemplify some of the more accessible and relevant contributions.
The increasing acceptance of information as a physical quantity is indicated by the importance given to it in a 2015 article on physics in 100 years, written by Nobel-winning theoretical physicist Frank Wilczek.
Wilczek gives an intriguing prediction that “Fundamental action principles, and thus the laws of physics, will be interpreted as statements about information and its transformation” (p.17). His argument for this is based on the relation between information and entropy: “Information is [a] dimensionless quantity that plays a large and increasing role in our description of the world. Many of the terms that arise naturally in discussions of information have a distinctly physical character. For example, we commonly speak of density of information and flow of information. Going deeper, we find far-reaching analogies between information and (negative) entropy, as noted already in Shannon’s original work. Nowadays many discussions of the microphysical origin of entropy, and of foundations of statistical mechanics in general, start from discussions of information and ignorance. I think it is fair to say that there has been a unification fusing the physical quantity (negative) entropy and the conceptual quantity information” [Wilczek’s emphasis] (p.17). He argues for a “strong formal connection” between entropy and action, through the path-integral mathematical technique developed by Richard Feynman. [Action is one of the most fundamental quantities of physics. Roughly speaking, it is energy multiplied by time, and it measures the activity of all physical entities in space-time.] This unification of action and information is one of a number of unifications of seemingly disparate entities which Wilczek sees as fundamental to the development of physics over the coming decades.
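To make the unification Wilczek describes concrete, it is worth setting the two entropies side by side. This is the standard textbook correspondence, not Wilczek’s own notation: Shannon’s information entropy and the Gibbs entropy of statistical mechanics have the same mathematical form, differing only by Boltzmann’s constant and the base of the logarithm.

```latex
H = -\sum_i p_i \log_2 p_i \quad \text{(Shannon entropy, in bits)}
\qquad
S = -k_B \sum_i p_i \ln p_i \quad \text{(Gibbs entropy, in J/K)}
\qquad
\Rightarrow \quad S = (k_B \ln 2)\, H
```

The same probabilities describe both our uncertainty about a message and our uncertainty about a system’s microstate, which is what makes the fusion of the two quantities more than wordplay.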
Wilczek’s mention of action suggests an obvious analogy with the world of meaningful information. The principle of least action states that the physical universe evolves so as to minimise physical action as a whole. Zipf’s principle of least effort suggests that, among other things, people conduct themselves so as to minimise the effort they expend in dealing with information. Is this more than a superficial analogy? Hopefully, this will be the subject of a future post.
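For readers unfamiliar with the physics side of the analogy, the action principle can be stated compactly; this is the standard formulation, not tied to any particular source discussed here. A system’s trajectory between two instants is the one that makes the action S, the time integral of the Lagrangian L (kinetic minus potential energy), stationary:

```latex
S = \int_{t_1}^{t_2} L(q, \dot{q}, t)\, dt, \qquad \delta S = 0
```

The parallel with Zipf is that, in both cases, the observed behaviour is the one that extremises a single global quantity, rather than following local rules.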
The information-entropy relationship emphasised by Wilczek remains at the heart of discussions of information physics. In an influential 2015 review article in Nature Physics, which has been cited more than 400 times, Juan Parrondo and colleagues analysed the introduction of information into thermodynamics, pointing out that this theoretical interest has now become relevant in practical situations where information is manipulated at small scales, as in molecular and cell biology, nano-devices, and quantum computing. To reconcile the information/entropy relation with classical thermodynamics, which makes no mention of information, requires two things: restating thermodynamic laws to incorporate information explicitly, and clarifying the physical nature of information, so that it enters thermodynamic laws as a physical entity, not an abstraction. New models involve mutual information, ordered memories, and information reservoirs (equivalent to other thermodynamic reservoirs, such as thermal or chemical baths). But, as the authors pointed out, “we are still several steps away from a complete understanding of the physical nature of information”; in particular, we need a general physical theory of information to explain how the macroscopic world, and the subjectivity of entropy, emerge from statistical mechanics.
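As one illustration of what “restating thermodynamic laws to incorporate information” looks like in practice, such reviews discuss feedback-controlled processes of the Maxwell’s-demon type, where the familiar second-law bound on extractable work acquires an extra term for the mutual information I gained by measurement (I give the standard form here; the notation is mine, not the authors’):

```latex
\langle W_{\text{ext}} \rangle \;\le\; -\Delta F + k_B T\, I
```

Here ΔF is the change in free energy and T the temperature of the surrounding bath. With no measurement (I = 0) the inequality reduces to the classical bound; with information in hand, a demon can extract more work, paying for it later when its memory is erased.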
In an article in Quanta magazine from 2017, Natalie Wolchover updates the story in an accessible way, showing how the relation between information and entropy is linked to Rolf Landauer’s conception that “information is physical”, how it may be better explained by invoking concepts from quantum theory, and how we must be aware that information behaves very differently from other physical quantities. Wolchover does give a voice to scientists who object to the whole idea, on the grounds that thermodynamic entropy and information-theoretic entropy are quite different things, and should each be applied only in their own domain; however, she suggests that such views are now generally regarded as old-fashioned. At the same time, she quotes scholars who caution against a simplistic view that “everything is information”. In the nineteenth century, at the height of the steam age, there were those who argued that the universe was nothing but an enormous steam engine; perhaps it is not surprising that, in our modern information age, there are those who think it is nothing but an enormous computer. (I suspect that Luciano Floridi’s approach of Levels of Abstraction might resolve this problem, but that’s for another day.)
Wolchover also reminds those who worry that linking information with entropy makes an objective physical quantity disturbingly dependent on our state of knowledge that this concern was recognised by James Clerk Maxwell (of Maxwell’s Demon fame), who wrote in the nineteenth century that “The idea of dissipation of energy depends on the extent of our knowledge”. (Maxwell’s musings on his ‘demon’ have been a continuing stimulus to thoughts about information and entropy; see the 2016 review by Hemmo and Shenker.)
Another point identified by Wolchover is the link between the information-entropy relationship and the idea of the ‘conservation of information’ in a physical sense. I have written about this informally in a previous post; for a reasonably accessible recent technical discussion, see the paper on thermodynamics as a consequence of information conservation, by Bera and colleagues.
Finally, as regards the information-entropy relationship, the discussion continues as to whether the increase in entropy in the universe as a whole, as dictated by the Second Law of Thermodynamics, accounts for our perception of the passage of time (another idea commonly attributed to Richard Feynman). Paul Halpern, in his popular science book The Quantum Labyrinth, argues that this can best be understood in terms of the relation between entropy and our information/knowledge: “As the universe progresses over time, more and more of the past becomes known. That growing knowledge represents a measure of order that matches low entropy. In contrast, the future remains unknown; it is comparatively disorganised and therefore has higher entropy. The discrepancy between the low-entropy past (of burgeoning information) and the high entropy future (of events unknown) results in a natural arrow of time” (p.211).
The information-entropy-time relation is, however, still very much undecided; see a Forbes article by Ethan Siegel and a responding Medium article by Domino Valdano for a flavour of the debate.
Moving away from issues around entropy, another way in which information concepts are entering mainstream physics is in what is termed the ‘reconstruction’ of quantum mechanics. It is notorious that the principles of quantum mechanics are almost entirely ad hoc, as derived by the pioneers in the early twentieth century. While the mathematical machinery is very effective for calculating correct answers to physical problems, there is no agreed interpretation of what the theory means, or why it is as it is. Attempts at reconstruction begin from scratch, trying to derive the quantum mechanical formalism from basic ideas. Most of these basic ideas have some informational flavour, and some are wholly information-centric. The literature of this topic is highly technical, but an accessible introduction is given in an article in Quanta magazine by Philip Ball. Those who want to look at more technical examples might try a paper on an informational derivation of quantum theory, one on a unification of force and matter by quantum information, or a recent review of the topic.
Staying with quantum mechanics, we may note that one of its interpretations, due to David Bohm, the American theoretical physicist who spent much of his career in London, has always been informational in nature. Bohm held that a crucial ingredient of the quantum world is so-called ‘active information’, a form of semantic information which guides the elements of the physical universe. This smacks to some of panpsychism, and has an intriguing resonance with the idea of integrated information as the basis for consciousness, discussed in a previous post. Bohmian views of quantum mechanics have been somewhat out of favour, but may perhaps regain popularity with the general informational turn; see the article by William Seager for an accessible and sympathetic review.
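To give a flavour of where active information lives in the formalism: writing the wavefunction in polar form, as Bohm did, yields a ‘quantum potential’ Q which depends on the form of the wavefunction rather than its amplitude. It is this form-dependence that Bohm and Hiley read as information ‘in-forming’ the motion of particles (the standard presentation, for a single particle of mass m):

```latex
\psi = R\, e^{iS/\hbar}
\quad \Rightarrow \quad
Q = -\frac{\hbar^2}{2m}\, \frac{\nabla^2 R}{R}
```

Since Q is unchanged if R is multiplied by a constant, the quantum potential acts more like a guiding signal than a classical force, which is what gives Bohm’s picture its informational character.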
Rolf Landauer’s fundamental insight that information is physical has also been continually investigated and extended. His prediction that heat is dissipated, and entropy increased, when a bit of information is erased has been experimentally confirmed. New, and more sophisticated, conceptions of physical information have been derived on the foundations established by Landauer; see, for example, Anderson’s idea of observer-local referential information. For a review, see the article by Ciliberto and Lutz, and for an accessible summary see the shorter version by the same authors.
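To put a number on Landauer’s insight: erasing one bit of information at temperature T dissipates at least k_B T ln 2 of heat. A minimal sketch of the calculation in Python (an illustrative back-of-envelope computation; the function name is my own):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, by SI definition)

def landauer_bound_joules(temperature_kelvin: float) -> float:
    """Minimum heat dissipated, in joules, by erasing one bit at the given temperature."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature the bound is around 2.9e-21 J per bit -- so small that
# the direct experimental confirmation mentioned above was a considerable feat.
print(f"Landauer bound at 300 K: {landauer_bound_joules(300.0):.2e} J per bit")
```

Real devices dissipate many orders of magnitude more than this, but the bound matters in principle: it marks the point at which information processing and thermodynamics cannot be separated.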
Interesting enough, you may say, but does this have anything to do with the library/information sciences, and their concept of meaningful, recorded information? Is there any substantive link between physical and social information? Frank Wilczek, contemplating the unification of mind and matter, thinks there may be: “And if physics evolves to describe matter in terms of information … a circle of ideas will have closed. Mind will have become more matter-like and matter will have become more mind-like” (p.19). And I’m sure Byrhtferth would have agreed. It would be well to keep an open mind.