Still waiting for Carnot: information and complexity

Back in 2015, Lyn Robinson and I published an article in the Journal of the Association for Information Science and Technology [1], which gave an analysis of the relation between information and complexity, showing that ideas of complexity, organization, and ‘interesting order’ are intertwined with concepts of information and of entropy. In particular, we noted that Claude Shannon’s idea of information is associated with uncertainty, unpredictability, randomness, and disorder, whereas Norbert Wiener’s conception of information, calculated in the same way as Shannon’s but with a minus sign, is associated with the opposite: with pattern and order. Neither fully captures the idea of complex, interesting order. Complexity itself, although it seems an intuitively obvious concept, is actually difficult to define, and still more difficult to measure; a considerable part of the paper was devoted to discussing the many complexity measures which have been proposed, many of which involve information in one of its guises.
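
To make the sign difference concrete, here is a minimal Python sketch (my own illustration, not taken from the paper): Shannon’s measure is maximal for a uniform, maximally unpredictable distribution, while the Wiener-style quantity, being its negation, is highest for a sharply peaked, ordered one.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits; high for disorder."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def wiener_information(probs):
    """The same quantity with the sign flipped, in Wiener's spirit;
    high (less negative) for ordered, predictable distributions."""
    return -shannon_entropy(probs)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximal uncertainty
peaked = [0.97, 0.01, 0.01, 0.01]    # nearly certain, highly ordered

print(shannon_entropy(uniform), wiener_information(uniform))  # 2.0, -2.0
print(shannon_entropy(peaked), wiener_information(peaked))    # ~0.24, ~-0.24
```

Notice that neither number can tell ‘interesting’ structure apart from mere uniformity or mere noise; that gap is precisely what the complexity measures discussed below try to fill.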

Nicolas Sadi Carnot, for whom we are waiting

(The paper title ‘Waiting for Carnot’ was taken from what Melanie Mitchell reports as an in-joke among complexity scholars: that they are waiting for someone to clarify the key concepts of complexity theory, in the same way that Sadi Carnot clarified the concepts of thermodynamics in the nineteenth century. It also plays on Beckett’s Waiting for Godot; I’m sure you got it.)

The paper argues that there are profound and subtle relationships between information, entropy, order and chaos, simplicity and complexity, and that these relationships suggest that there may be underlying ‘information laws’, applicable across many domains of enquiry and levels of reality.

In this post, I give an update on material relevant to the information-complexity issue which has appeared since the paper was published; it is necessarily highly selective, since complexity theory is an area of very active research. This is the latest in a series of posts about how the concept of information appears in different domains; the previous post dealt with information as a constituent of the physical world.

The paper created a fair amount of interest, and, rather unusually, several people wrote to us about it, with comments and critiques.

Ted Pavlic, at Arizona State University, pointed out an important article which we wish we had cited, a primer on complexity, self-organization, and emergence in terms of information theory [2]. It does not change the conclusions of our paper, but provides clear, and quite rigorously presented, support for them. Ted also drew our attention to the classic work of E.T. Jaynes on the link between information theory and statistical mechanics [3]; although this was referenced in several of the papers we cited, it is certainly worth mentioning explicitly as establishing the groundwork for the study of complexity in informational terms. It is interesting to note how this paper, one of the first to apply Shannon’s information theory to physical problems, pointed out some problems which are still current: particularly whether probability is to be seen as a feature of the universe (objective) or a feature of our knowledge (subjective), and, following this, what is the best way to think of the relation between informational and thermodynamic entropies. A 2017 paper gives a (fairly technical) comparison of three entropies in the context of complex systems: thermodynamic entropy; Shannon’s information-theoretic entropy; and Jaynes’ statistical entropy [4].
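
Jaynes’ prescription can be stated compactly: adopt the probability distribution that maximizes Shannon entropy subject to whatever constraints our knowledge imposes. As a hedged sketch of the idea (the numbers echo Jaynes’ well-known loaded-die example; the code itself is my own, not anything from [3] or [4]): given only that a die’s mean roll is 4.5, the maximum-entropy distribution takes the exponential, Boltzmann-like form p(i) ∝ exp(-λi), with λ fixed by the constraint.

```python
import math

faces = [1, 2, 3, 4, 5, 6]
target_mean = 4.5  # the only thing we claim to know about the die

def mean_for(lam):
    """Mean roll under the maximum-entropy distribution p(i) ∝ exp(-lam*i)."""
    weights = [math.exp(-lam * i) for i in faces]
    return sum(i * w for i, w in zip(faces, weights)) / sum(weights)

# mean_for() decreases as lam increases, so solve for lam by bisection.
lo, hi = -5.0, 5.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) > target_mean:
        lo = mid
    else:
        hi = mid

lam = (lo + hi) / 2
weights = [math.exp(-lam * i) for i in faces]
z = sum(weights)
print(lam)                       # negative: high faces must be favoured
print([w / z for w in weights])  # the least-biased distribution with mean 4.5
```

Maximizing entropy in this way makes the fewest assumptions compatible with what is actually known, which is why Jaynes could present statistical mechanics as a special case of inference.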

Peter Grassberger pointed out three of his papers [5,6,7], which addressed at an early date the concept of ‘stochastic complexity’, or ‘forecasting complexity’ as Grassberger prefers. He also suggested that ‘effective complexity’ is in many cases identical to Koppel and Atlan’s ‘sophistication’, and to his own ‘effective measure complexity’; a good example of how, well, complex the inter-relations between complexity measures can become. His 2012 pre-print on randomness, information and complexity [5] is a useful review of the issues discussed in our paper, from a slightly different perspective.
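
A rough flavour of Grassberger’s ‘effective measure complexity’ (the information a sequence’s past carries about its future) can be had from block entropies. The estimator below is my own drastic simplification, for illustration only: a fair coin has maximal entropy but essentially zero effective measure complexity, while a period-four sequence has zero entropy rate yet about two bits of it, reflecting the memory needed to forecast the sequence.

```python
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of the distribution of length-L blocks."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return -sum(c / n * math.log2(c / n) for c in Counter(blocks).values())

def effective_measure_complexity(seq, max_L=8):
    """Crude estimate of EMC = lim H(L) - L*h, with the entropy rate h
    itself estimated from the two largest block lengths used."""
    H = [block_entropy(seq, L) for L in range(1, max_L + 1)]
    h = H[-1] - H[-2]  # entropy-rate estimate
    return H[-1] - max_L * h

random.seed(1)
coin = [random.randint(0, 1) for _ in range(20000)]  # memoryless
period4 = [0, 0, 1, 1] * 5000                        # perfectly ordered

print(effective_measure_complexity(coin))     # close to 0
print(effective_measure_complexity(period4))  # close to 2 bits
```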

Finally, Ping Ao, of Shanghai Jiao Tong University, drew our attention to two of his papers [8, 9] on complexity theory in biology, an interesting illustration of how ideas of information and complexity infuse the biological sciences as well as the physical. The ways in which physical and biological complexity interact, and the extent to which physical complexity, and information, can reasonably be regarded as a ‘foundation’ for their biological equivalents, is an area of current debate. A 2018 paper by Wolf, Katsnelson and Koonin [10] considers the hierarchical complexity in biological systems that has no counterpart outside the domain of living things, and suggests that it may be due to self-organization, facilitated by the long-term ‘memory’ provided by information transmission through DNA. Similarly, studies seeking to use information theory to explain the emergence of complex life are numerous; an example is the 2018 paper of Seoane and Solé [11].

Several of the issues which we discussed in our JASIST paper have been the subject of subsequent discussions in the literature.

Holovatch, Kenna and Thurner [12] argue that the physics of complex systems may be applied to aspects of the world that are not normally regarded as within the purview of physics, including social systems, evolving texts, and networks of all kinds; situations typically characterised by power law distributions, such as those of Zipf and Pareto. They describe this as ‘physics beyond physics’; a good example of the way in which complexity, and its associated information-related concepts, readily jump the boundaries of disciplines and domains.
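
As a concrete instance of the Zipf case (a minimal sketch of my own; the Project Gutenberg URL is only an example corpus and may need replacing): the frequency of the r-th most common word in natural-language text falls off roughly as 1/r, so a regression of log-frequency on log-rank should give a slope near -1.

```python
import math
from collections import Counter
from urllib.request import urlopen

# Example corpus only: any long plain text will do.
url = "https://www.gutenberg.org/files/2701/2701-0.txt"  # Moby-Dick
text = urlopen(url).read().decode("utf-8", errors="ignore").lower()

words = "".join(c if c.isalpha() else " " for c in text).split()
counts = Counter(words).most_common(500)  # top 500 ranks

# Least-squares slope of log(frequency) against log(rank).
xs = [math.log(r) for r in range(1, len(counts) + 1)]
ys = [math.log(c) for _, c in counts]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(slope)  # typically close to -1, the Zipf exponent
```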

The plethora of definitions and measures for complexity shows no sign of being narrowed down, and debates over their merits and applicability continue. Freiberger, for example, gives an accessible account of the ‘sophistication’ measure [13]. Where the widely quoted ‘algorithmic complexity’ is essentially just a measure of ‘patternlessness’, sophistication measures the extent of structure, disregarding the random elements. However, it is not clear that this measure will be useful in any practical circumstances. Haken and Portugali consider the role of information in self-organization within complex systems, specifically looking for universal principles that may apply in the inanimate and animate worlds, the latter including consciousness [14]. They conclude that both Shannon information (meaning-free) and semantic information must be involved in any such principles. Schuster identifies these two forms of information as involved in biological evolution towards more complex forms of life [15]. Raimbault compares three approaches to understanding complexity – emergent, informational, and computational – concluding that a multi-faceted approach is necessary [16].
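
Freiberger’s contrast can be made concrete with a compression-based sketch (compressed length is a standard computable stand-in for uncomputable algorithmic complexity; the example is mine, not Freiberger’s). Pure repetition compresses to almost nothing while pure noise hardly compresses at all, so algorithmic complexity awards its highest score to structureless randomness; sophistication is designed to score both strings low, reserving high values for genuine non-random structure.

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a crude, computable proxy
    for (uncomputable) algorithmic complexity."""
    return len(zlib.compress(data, 9))

random.seed(0)
n = 10_000
repetitive = b"01" * (n // 2)                           # pure pattern
noise = bytes(random.randrange(256) for _ in range(n))  # pure randomness

print(compressed_size(repetitive))  # tiny: highly compressible
print(compressed_size(noise))       # close to n: incompressible
```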

It’s complicated
(image from http://www.kierandkelly.com/what-is-evolution/complexity-dynamics/)

One issue that continues to generate discussion is the possible relation between the growth of complexity and the perception of “time’s arrow”. It is a notorious problem that the laws of physics do not specify a time direction, yet the existence of such a direction is evident. Attempts have been made to relate the passage of time to the one physical quantity which does have a temporal direction: physical entropy, which increases inexorably with time, by the second law of thermodynamics. However, it is not clear that this is the solution, since when entropy decreases locally, there is no time reversal; we do not get younger when we tidy our desks (to take a very inexact analogy).

An alternative is to consider that perhaps complexity, which tends to increase along with entropy, has some association with time’s arrow. This idea was discussed from various perspectives by the contributors to a book back in 2013 [17], and has been analysed by others since. One problem is that the passage of time may be defined by a steady increase in all of entropy, information and complexity, as is argued by Mikhailovsky and Levich, who use algorithmic complexity as their measure [18]. As physicist Sean Carroll writes in his popular science book The Big Picture, it is difficult to find a simple relation between entropy/information and complexity, whether in the stirring of cream into a cup of coffee or in the evolution of the universe, from Big Bang to heat death: “Is it possible that there is a new law of nature yet to be found, analogous to the second law of thermodynamics, that summarises the evolution of complexity over time? The short answer is ‘we don’t know’. The somewhat longer answer is ‘we don’t know, but maybe: and if so, there’s good reason to believe it will be – appropriately enough – complicated’” [19, p. 231]. This seemingly paradoxical simultaneous increase of entropy (often equated to disorder) and complexity is succinctly discussed in a Medium blog post by Mark Traphagen.
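
The cream-and-coffee example can be made vivid with a toy calculation. The sketch below is my own stylized version, not Carroll’s: it evolves a one-dimensional diffusion profile, takes mixing entropy as the entropy measure, and uses the compressed size of the coarse-grained state as a stand-in for ‘apparent complexity’. Entropy climbs monotonically, while the compressed description is short for both the separated and the fully mixed states, and longest for the partly mixed state in between.

```python
import math
import zlib

N = 256   # coarse-grained cells across the cup
D = 1.0   # diffusion constant, arbitrary units

def profile(t):
    """Cream concentration per cell: a sharp step at t = 0 that diffuses
    toward the uniform value 0.5 (1-D diffusion-equation solution)."""
    if t == 0:
        return [1.0 if x < N // 2 else 0.0 for x in range(N)]
    return [0.5 * math.erfc((x - N / 2 + 0.5) / math.sqrt(4 * D * t))
            for x in range(N)]

def entropy_bits(prof):
    """Mixing entropy: 0 when fully separated, N bits when fully mixed."""
    h = 0.0
    for c in prof:
        for p in (c, 1 - c):
            if 0 < p < 1:
                h -= p * math.log2(p)
    return h

def apparent_complexity(prof):
    """Compressed size of the quantized concentration profile."""
    return len(zlib.compress(bytes(int(c * 255) for c in prof), 9))

for t in [0, 100, 1000, 10000, 100000, 1000000, 10000000]:
    p = profile(t)
    print(t, round(entropy_bits(p), 1), apparent_complexity(p))
# Entropy increases steadily toward its maximum; apparent complexity
# rises while the partly mixed region grows, then falls as the cup
# turns uniform.
```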

So, lots of continuing interest, but not much in the way of clear understanding about complexity, and how it relates to information. We are clearly still waiting for Carnot.

References
1. D. Bawden and L. Robinson. “Waiting for Carnot”: information and complexity. Journal of the Association for Information Science and Technology, 2015, 66(11), 2177-2186. Open access version available at http://openaccess.city.ac.uk/6432.

2. M. Prokopenko, F. Boschetti, and A.J. Ryan. An information-theoretic primer on complexity, self-organization, and emergence. Complexity, 2009, 15(1), 11-28.

3. E.T. Jaynes. Information theory and statistical mechanics. Physical Review, 1957, 106(4), 620-630.

4. S. Thurner, B. Corominas-Murtra and R. Hanel. Three faces of entropy for complex systems: information, thermodynamics, and the maximum entropy principle. Physical Review E, 2017, 96(3), 032124, available at https://arxiv.org/abs/1705.07714.

5. P. Grassberger. Randomness, information and complexity. 2012, available at https://arxiv.org/abs/1208.3459.

6. P. Grassberger. Some comments on computational mechanics, complexity measures, and all that. 2017, available at https://arxiv.org/abs/1708.03197.

7. P. Grassberger. Comment on “Inferring statistical complexity”. 2017, available at https://arxiv.org/abs/1708.04190.

8. Ping Ao. Darwinian dynamics implies developmental ascendency. Biological Theory, 2007, 2(1), 113-115, available at https://arxiv.org/abs/0708.0987.

9. Ping Ao. Equivalent formulations of “the equation of life”. Chinese Physics B, 2014, 23(7), article id 070513.

10. Y.I. Wolf, M.I. Katsnelson and E.V. Koonin. Physical foundations of biological complexity. Proceedings of the National Academy of Sciences, 2018, 115(37), E8678-E8687, available at https://www.pnas.org/content/pnas/115/37/E8678.full.pdf.

11. L.F. Seoane and R.V. Solé. Information theory, predictability, and the emergence of complex life. Royal Society Open Science, 2018, 5(2), paper 172221, available at https://royalsocietypublishing.org/doi/pdf/10.1098/rsos.172221.

12. Y. Holovatch, R. Kenna and S. Thurner. Complex systems: physics beyond physics. European Journal of Physics, 2017, 38(2), 023002, available at https://arxiv.org/pdf/1610.01002.pdf.

13. M. Freiberger. Information is sophistication. Plus magazine [online] 24 March 2015, available at https://plus.maths.org/content/information-sophistication.

14. H. Haken and J. Portugali. Information and self-organization: a unifying approach and applications. Entropy, 2016, 18(6), paper 197, available at https://www.mdpi.com/1099-4300/18/6/197.

15. P. Schuster. Increase in complexity and information through molecular evolution. Entropy, 2016, 18(11), paper 397, available at https://www.mdpi.com/1099-4300/18/11/397.

16. J. Raimbault. Relating complexities for the reflexive study of complex systems. In Theories and models of urbanization. Springer lecture notes in morphogenesis [forthcoming]. 2018, available at https://arxiv.org/pdf/1811.04270v1.pdf.

17. C.H. Lineweaver, P.C.W. Davies and M. Ruse (eds.). Complexity and the arrow of time. Cambridge: Cambridge University Press, 2013.

18. G.E. Mikhailovsky and A.P. Levich. Entropy, information and complexity or which aims the arrow of time? Entropy, 2015, 17(7), 4863-4890, available at https://www.mdpi.com/1099-4300/17/7/4863.

19. S. Carroll. The big picture: on the origins of life, meaning and the universe itself, London: Oneworld, 2016.
