Can information be conserved, and why would it matter?

The idea that information might be conserved may strike many of us interested in recorded human information as faintly ridiculous. By ‘conserved’, we mean that there is a fixed amount of information in the universe, and that, while it may be changed, it can neither be created nor destroyed. This does not seem to accord in any way with the commonly held view that there is much more information than there used to be, and that information is being created at an ever-increasing rate; nor indeed that information may be lost, by the destruction of libraries and archives, and the decay and obsolescence of storage formats.

And yet the idea of conservation of information is taken as a given within the physical sciences. If information is a part of the physical universe, as it is taken to be in an increasingly common viewpoint, then it is reasonable to assume that it will be conserved, like other fundamental physical quantities such as energy or momentum. This contradiction is an intriguing one, and sheds light on the differences, and possible relationship, between the concept of information in the physical and social worlds. This post gives a very informal view of my understanding of the issue.

One caveat should be entered first. The phrase “conservation of information” is sometimes used by advocates of creation science to justify some of their views; that is not at all what I am discussing here.

The physicist Sean Carroll has written very clearly and cogently about conservation of physical information for a non-expert readership. This quotation from his 2016 book The Big Picture gives a very clear statement:
“Since the days of Laplace, every serious attempt at understanding the behaviour of the universe at a deep level has included the feature that the past and future are determined by the present state of the system. … This principle goes by a simple, if potentially misleading, name: conservation of information. Just as conservation of momentum implies that the universe can just keep on moving, without any unmoved mover behind the scenes, conservation of information implies that each moment contains precisely the right amount of information to determine every other moment. The term “information” here requires caution because scientists use the same word to mean different things in different contexts. Sometimes “information” refers to the knowledge you actually have about a state of affairs. Other times, it means the information that is readily accessible, embodied in what the system macroscopically looks like (whether you are looking at it and have the information or not). We are using a third possible definition, what we might call the “microscopic” information: the complete specification of the state of the system, everything you could possibly know about it. When speaking of the conservation of information, we mean literally all of it… the future and the past seem not only like different directions but also like completely different kinds of things. The past is fixed, our intuition assures us; it has already happened, while the future is still unformed and up for grabs. The present moment, the now, is what actually exists. And then along came Laplace to tell us differently. Information about the precise state of the universe is conserved over time; there is no fundamental difference between the past and future. … From the Laplacian point of view, where information is present in each moment and conserved over time, a memory isn’t some kind of direct access to events in the past. It must be a feature of the present state, since the present state is all we presently have.”

Leonard Susskind, in his admirable Theoretical Minimum text on basic physical principles, puts it more succinctly. Given that any change in the physical world can be represented in abstract terms by two states (before and after), with an arrow showing that the second state evolved directly from the first, then we have: “the most fundamental of all physical laws – the conservation of information. The conservation of information is simply the rule that every state has one arrow in and one arrow out. It ensures that you never lose track of where you started. The conservation of information is not a conventional conservation law.” This last point, that conservation of information is not the same as conservation of other physical quantities, is certainly worth bearing in mind.
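Susskind’s rule lends itself to a small illustration. The sketch below is my own, not his: it simply models a discrete dynamical law as a map from each state to its successor, and checks whether that map ever merges two states into one, which is exactly what ‘one arrow in and one arrow out’ forbids.

```python
# A toy illustration (my own, not Susskind's) of "one arrow in and one arrow out".
# A discrete dynamical law is modelled as a dict mapping each state to its
# successor. The law conserves information exactly when no two states share a
# successor, i.e. when every state also has exactly one arrow coming in.

def conserves_information(law):
    """True if the map state -> successor never merges two states into one."""
    successors = list(law.values())
    return len(set(successors)) == len(successors)

# Reversible law: a simple cycle A -> B -> C -> A. You can always trace back.
reversible = {"A": "B", "B": "C", "C": "A"}

# Irreversible law: B and C both evolve to A, so from A alone you can no longer
# tell where the system started; information about the past has been lost.
irreversible = {"A": "B", "B": "A", "C": "A"}

print(conserves_information(reversible))    # True
print(conserves_information(irreversible))  # False
```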

This idea is at the basis of one of the great controversies of modern physics, Stephen Hawking’s ‘black hole information paradox’. Once anything drops into a black hole, it disappears from our universe; the black hole increases in mass, but that tells us nothing about the nature of what has gone in; it, and the information it carries, appear to have disappeared from the universe. Hawking showed that black holes emit a form of radiation, which will eventually cause the black hole to dissipate, returning its mass/energy to the universe. But, according to Hawking, the nature of this radiation is the same regardless of what has gone into the black hole; information is not returned to the universe, and is therefore not conserved. This is disputed by some physicists, and the issue is far from resolved, despite Hawking’s claims over the years to have solved the problem. This is an illustration of how the concept of information, though in a different guise from that familiar in library/information science, is at the heart of physical questions.

Another puzzle relating to the conservation of information is the complex and disputed relationship between information and entropy. That there is such a relation is not doubted, but there is a conflict between those who follow Claude Shannon in seeing information and entropy as essentially synonymous, and those who, like Norbert Wiener, see them as opposites. Whichever view is taken, it poses problems for the idea of conservation of information, since the second law of thermodynamics, which no-one disputes, states that the entropy of a system as a whole always increases, although it may decrease locally. In no way is entropy conserved, and therefore, if information is closely related to entropy in any way, it is difficult to see how information can be conserved either.
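For readers who have not met Shannon’s measure, the short sketch below (my own illustration, not part of the Shannon–Wiener dispute itself) shows the quantity usually meant by ‘information as entropy’, and why nothing in its definition keeps it constant.

```python
import math

# A minimal sketch (my own example) of Shannon's measure,
# H = -sum(p * log2(p)), the quantity usually meant by "information as entropy".

def shannon_entropy(probs):
    """Entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A sharply peaked distribution has low entropy...
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))   # about 0.24 bits

# ...while the uniform distribution over the same outcomes has the maximum.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits

# Processes that spread probability out (mixing, diffusion) push H towards its
# maximum; nothing in the definition keeps H constant, which is the nub of the
# tension with any conservation law for information.
```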

There are various solutions to this conundrum. One is to follow Sean Carroll, quoted above, in arguing that there are different meanings of information, even in the physical sciences; the information which is conserved is not the information which is related to entropy. Informational entropy has to do with the information embedded in a system, of which we may or may not have knowledge (Carroll’s second form of information); and this is not conserved.

Another solution is to argue that information, though certainly related to entropy, is neither the same as entropy nor its inverse. What is conserved is some combination of information and entropy, one increasing as the other diminishes. A third solution is to agree with Susskind that conservation of information is not the same as other conservation laws, although it is phrased in the same way, and simply not to worry about the issue.

None of these seems wholly satisfactory for those who, like me, would like to see a seamless account of the relations between concepts of information in different domains. Nor are they as distinct as they might seem. Hawking’s black hole paradox, clearly concerning information of the conserved kind, began with, and continues to be analysed in terms of, the entropy of the black hole.

When we come to think of examples more directly related to the concerns of LIS, things do not necessarily get easier.

Suppose we have two printed books, of exactly the same size, weight, quality of paper, nature of ink, and so on; identical as physical objects. Suppose that they deal with different topics; their information content, in the sense of knowledge and meaning, is not at all similar. Suppose finally that these are the only copies of the books in existence, that their content is not stored digitally anywhere, and that no-one knows their contents.

Then suppose that both books are burned in a hot fire, so that all that remains is smoke and fine ash. To a first approximation, because the books were physically identical, the remnants, in terms of smoke and ash, will be identical. In both the physical and intellectual meanings of the word, their information has been destroyed. However, if we accept conservation of information, then, in principle if not in practice, the trajectories of the smoke and ash particles could be plotted and reversed, and the books recreated. [Those who dislike “in principle if not in practice” arguments might like to reflect on how many things thought impossible a generation ago are now almost routine: ‘seeing’ atoms, and detecting planets around other stars, for example.] Since recorded information must have a physical carrier, this suggests that, in principle, meaningful recorded information is also, in this sense, conserved, and potentially always recoverable.
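The ‘in principle if not in practice’ claim can be made concrete with a deliberately crude toy model; it is entirely my own, and no more than an analogy. Here ‘burning’ is treated as an invertible operation on a book’s bytes, so the text is always recoverable by running the operation backwards, and it is contrasted with a genuinely many-to-one process, after which nothing distinguishes the two books.

```python
# A deliberately crude toy analogy (mine, not the post's): model "burning" as a
# reversible transformation of a book's bytes, so that running it backwards
# recovers the text exactly; then contrast it with a genuinely lossy process.

KEY = 0b10110101  # an arbitrary fixed byte used for the reversible scrambling

def burn(text):
    """'Burn' the book reversibly: XOR every byte with a fixed key."""
    return bytes(b ^ KEY for b in text.encode("utf-8"))

def unburn(ash):
    """Run the dynamics backwards: XOR with the same key is its own inverse."""
    return bytes(b ^ KEY for b in ash).decode("utf-8")

book_a = "A treatise on conservation laws."
book_b = "A history of the Library of Alexandria."

ash_a, ash_b = burn(book_a), burn(book_b)
assert unburn(ash_a) == book_a and unburn(ash_b) == book_b  # fully recoverable

def truly_lossy_burn(text):
    """A many-to-one process: the remnant is the same whatever went in."""
    return "smoke and fine ash"

print(truly_lossy_burn(book_a) == truly_lossy_burn(book_b))  # True: difference gone
```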

Sean Carroll deals with this problem by again appealing to the idea of different kinds of information: “We tend to use the word “information” in multiple, often incomparable, ways… what we might call the “microscopic information” refers to a complete specification of the exact state of a physical system, and is neither created nor destroyed. But often we think of a higher-level macroscopic concept of information, one that can indeed come and go: if a book is burned, the information contained in it is lost to us, even if not to the universe.”

Examining what conservation of information may mean in various contexts may be a useful tool in elucidating relations between the idea of information in different domains. An example of this approach in the context of Mark Burgin’s General Theory of Information looks at ways of understanding conservation of structural (physical) and symbolic (social) information. (There is a similarity between the forms of information in this theory and those put forward earlier by authors such as Marcia Bates and Tom Stonier.) The example given by Burgin and Rainer Feistel is of scientific research, whereby structural information (essentially regularities in the physical world) is extracted and converted into symbolic information, in the form of articles, books, and so on. They argue that, although we do not yet possess any theory for formulating precise information conservation laws applicable to this context, we feel intuitively that the amount of symbolic information produced cannot exceed the amount of structural information in the part of the physical world being studied.

This linkage between physical and social information, in the form of a qualitative approach to a conservation law which essentially limits the production of meaningful information to the available physical information, seems attractive, but it has been challenged by the introduction of other information-related entities. The physicist/philosopher David Deutsch, for instance, argues that although information may be limited (and hence potentially conserved), knowledge is infinite (and hence certainly not conserved).

It seems clear that the question of whether information is conserved is still an open one. But any answer must inevitably begin with the caveat that, first and foremost, it depends what we mean by information. This need not be a sterile argument about the meaning of words, but rather a means of exploring different concepts of information, and – crucially – the ways in which information of different kinds may interact, and provide linkages between the physical, biological and social worlds.
