Amid the churn and hype of the London restaurant scene, there are some places that qualify as Old London for their longevity as the same kind of restaurant on the same site. Among these, on the same spot since 1828, is Simpson’s-in-the-Strand (yes, for the pedants among you, it’s ‘in’, not ‘on’, the Strand).

Aiming at ‘British classics’ and ‘tradition upheld’, its list of regulars includes Conan Doyle and Churchill, Dickens and Disraeli, van Gogh and Bernard Shaw. Sherlock Holmes himself “found something nutritious” there regularly (again for the pedants, see The Dying Detective and The Illustrious Client for verification). And, since its earliest days as a coffee house, it has been closely associated with the game of chess.

Simpson’s could easily have degenerated into a mediocre mock-heritage tourist trap, but has largely avoided that fate. Revamped a year ago by its owners, the Savoy Hotel, it does what it does to a very good standard, and in very nice surroundings. The ‘bill of fare’ (nothing so 20th century as a ‘menu’ here) is inevitably meat- and fish-heavy, with roast beef from the carving trolley their best-known signature dish.

However, for something a little different, and vegetarian-friendly, this must surely be the only place to offer Lord Woolton’s pie: a root vegetable pie, devised by the chef of the Savoy in the face of wartime rationing, and promoted by the eponymous lord, then the Minister of Food. Tasty as well as thoroughly traditional.

Lord Woolton’s pie

Supporting truth and promoting understanding: knowledge organization and the curation of the infosphere

This is an updated text of a keynote address given at the Fifteenth International ISKO Conference, Porto, 9th July 2018. A brief account of the conference is given in an earlier blog post.

Supporting truth and promoting understanding: knowledge organization and the curation of the infosphere

David Bawden and Lyn Robinson

This paper considers the response of knowledge organisation (KO) to a variety of problems and pathologies associated with the post-factual, or post-truth, society. It argues that there are no quick fixes, but that KO has several roles to play in mitigating these problems, particularly in the promotion of understanding, as well as the communication of information and the sharing of knowledge. Borrowing from Floridi’s Philosophy of Information, it argues that KO, and more broadly library and information science (LIS), should address these problems as part of our role as ‘curators of the infosphere’.

This paper addresses two of the main themes of this conference, by considering a new foundational direction and purpose for knowledge organisation, as a response to certain societal challenges to the effective communication of information and knowledge. The new direction involves a realignment of purpose; from knowledge organisation being applied in the cause of the effective provision of information and documents to its application for the explicit purpose of promoting understanding. The societal challenges which this may address are the much-discussed problems of the post-factual or post-truth society, with its accompanying phenomena of fake news, the death of expertise, and the rest. This paper builds on a session at the ISKO UK 2017 annual meeting, devoted to these issues, and goes beyond it to consider how the promotion of understanding may in itself contribute to a solution. It uses Luciano Floridi’s Philosophy of Information as a theoretical backdrop throughout.

Societal problems
The linked collection of social problems which have been described by such terms as ‘fake news’, ‘alternate facts’, ‘post-truth society’, ‘post-factual society’, ‘death of expertise’, ‘filter bubbles’, and ‘social media echo chambers’ are well known. Indeed, ‘post-truth’ was Oxford Dictionaries’ Word of the Year for 2016, and ‘fake news’ was Collins Dictionary’s equivalent for 2017, with ‘echo-chamber’ on its shortlist. Together, they describe a situation where objective factual truth is denied, expert, informed opinion is derided, and exposure to novel and challenging ideas is actively avoided. This situation is in many respects the antithesis of what library and information science (LIS) has sought to promote, and causes soul-searching within the theory and practice of LIS (see, for example, Bawden 2017 and Cooke 2017). It has not arisen de novo: fake news has a long history (Cooper 2017), while the philosopher Bertrand Russell observed more than seventy years ago that “.. most people go through life with a whole world of beliefs that have no sort of rational justification… People’s opinions are mainly designed to make them feel comfortable: truth, for most people is a secondary consideration.” (Russell 1942). Our present concerns are the culmination of a series of changes in the social and informational environment, brought into stark relief by political issues in Europe and North America from 2016, accompanied, and to an extent brought about, by a torrent of misinformation and disinformation (Clark 2017, Corner 2017, Wardle and Derakhshan 2017). A Pew Internet study carried out in early 2017 found a panel of experts almost evenly divided as to whether the problems could be ameliorated or would become worse over the next decade (Anderson 2017).

Naturally, many well-intentioned proposals have been advanced to remedy the situation. Some have addressed deep-seated issues, in educational systems, in economic and social policy, in use and mis-use of algorithms, in regulation of the media (including social media), and in political structures. Others, including several advanced from within the LIS community, have recommended more immediate and specific remedies; see Clark (2017) for an overview. Some have focused on the development of IT solutions, particularly with the algorithms used to filter news in social media and with automated fact checking and comparison (see, for example, Cooper 2017, Madrigal 2017 and Tomchak 2017). Others have advocated the enhancement of information and digital literacies (see, for example, Cooke 2017, Polizzi 2017 and Poole 2017), of restoring the importance of expert objective fact checking and ‘kite marks’ (see, for example, Cooper 2017, Jirotka and Webb 2017 and O’Leary 2017), and of improving education and training for information providers (see, for example, UNESCO’s new curriculum for ‘Journalism: Fake news and disinformation’).
Our view is that, although these sorts of initiatives may well have value, taken alone they will have relatively little impact. The problems and issues are deep-rooted, ‘systemic’ as Beckett (2017) puts it, and are not amenable to any ‘quick fix’. As the philosopher of information Luciano Floridi emphasises, the more important the problem, the more it needs a long period of reflection to find the best solution. And as we have suggested more specifically, LIS has no quick fix for these issues, and we should not pretend that we have; Beckett (2017) suggests the same for journalism in respect of fake news. We believe that LIS has a very considerable contribution to make, but it must be at a deeper level than a tweak to an algorithm, a guide to information evaluation, or a reliance on the manipulation of big data (Robinson 2016, Bawden 2017, Poole 2017); as Floridi (2016) puts it, solving the problems of fake news and the rest requires a reshaping of the infosphere, our whole information environment and our interactions within it.

It seems to us that part of LIS’s contribution to a longer-term approach to these issues will certainly lie in knowledge organisation. This has already been noted by others. ISKO UK devoted a session at their 2017 annual conference to ‘False narratives: developing a KO community response to post-truth issues’ (ISKO 2017), and a plenary discussion of the Dublin Core Metadata Initiative in October 2017 debated ‘A metadata community response to the post truth information age’ (DCMI 2017). We will include points made at these sessions, and consider some other possibilities, later. First, we will make a slight detour, and think about the nature of understanding.

Promoting understanding
One way of expressing the problems of a post-factual, expertise-less society is to say that it lacks a full and clear understanding of the issues facing it (Robinson 2016, Bawden 2017). We have argued that LIS should take a new stance of focusing on the promotion of understanding as much as on the provision of information and the sharing of knowledge in an era when, for most people for the most part, information is provided through search engines, particularly Google, through a few encyclopedic websites, particularly Wikipedia, and through social media (Bawden and Robinson 2016A, 2016B). In this environment, we contend, the promotion of understanding falls, arguably uniquely, within the remit of LIS; this seems to be a novel suggestion, although it has been supported by Gorichanaz (2016, 2017), and fits within the Floridi-derived idea of LIS professions as ‘curators of the infosphere’ (Bawden and Robinson 2018).

There is, we may say, little understanding of understanding, in as much as it is defined very differently by various authors. We may note that Ackoff (1989), in his original formulation of the well-known data-information-knowledge-wisdom model, included understanding, which he characterised as ‘an appreciation of why’, as a high-level concept between knowledge and wisdom. In the widely-used educational taxonomy due to Bloom, on the other hand, it comes as a rather low-level concept, above remembering, but below applying, analysing, etc. (Anderson, Krathwohl and Bloom 2001).

On the basis of an analysis of various conceptions of understanding (for details of which, see Bawden and Robinson 2016A, 2016B), we propose a definition of understanding, relevant to the purposes of LIS, following Ackoff, and situated within Floridi’s Philosophy of Information:

Information is taken to be well-formed, meaningful, truthful data. Knowledge is taken to be information organised in a network of account-giving inter-relations. Understanding occurs when a conscious entity, supported as necessary by information systems, appreciates the totality of a body of knowledge, including its interconnections. The extent to which the knowledge is incomplete, contradictory or false determines the degree to which understanding is less than complete.

Developing understanding in this sense would seem to be a worthy aim for LIS, and one which may go some way towards helping mitigate the societal problems noted above. However, we need to note that people may have an understanding of a topic, in this sense, based on misinformation or disinformation, and may be impervious to contradictory information (see, for example, Requarth 2017), and this may be reinforced by emotional attachments to certain viewpoints (Beckett 2017, Poole 2017). We should therefore add a rider, to the effect that the extent to which someone is open to changing their views on the basis of new information, in effect the extent of their curiosity, is also a measure of the completeness of their understanding. This could amount to a commitment (individual or societal) to accepting, indeed actively seeking, new knowledge even if it be potentially disruptive of current understanding (Bawden and Robinson 2016B). For LIS, this fits in well with the suggestions of Beckett (2017) that news media should provide content that is stimulating and challenging as well as relevant, and of Finch (2017) that libraries should be as much a safe place to indulge curiosity as a trusted dispenser of facts and information, or a repository of the truth.

Expressed in this way, we can see that the development of understanding may in itself be a powerful force for counteracting the problems discussed above; Bradley (2017) explicitly notes that helping people to understand and use items in their information environment is a role for libraries in countering the fake. It is important to note that developing understanding, in the sense meant here, is a broad and general approach, rather than a specific tool or technique, and goes far beyond didactic approaches to evaluation of information. Information systems are beginning to be developed to support understanding, from the relative conceptual simplicity of Google’s Knowledge Graph, which integrates information from a variety of sources with the aim of giving a quick overview, to systems explicitly aiming at developing understanding from a corpus of sources. As an early example of the latter, see Dunne et al. (2012).

We now turn to consider the ways in which knowledge organisation may contribute to these linked aims: the promotion of understanding, and the mitigation of the problems of the post-factual society.

KO’s contribution
As suggested above, a variety of contributions to the amelioration of these problems have been suggested. At the risk of over-simplification, we can consider them under five headings.

First, we might notice the suggestion that an ontology, taxonomy, terminology, or glossary of the post-truth society, its pathologies, and potential solutions, may be of value in itself, as a way of clarifying the concepts and their inter-relations, and as a guide to action (Bradley 2017, Clark 2017, Poole 2017, Wardle and Derakhshan 2017). For the most detailed example yet extant, see Synaptica’s Post-Truth Forum Knowledge Base.

Second, there is what we might think of as the classic, if limited, response of KO: adaptation of methods of resource description, and revision of existing descriptions. One popular example of the former is the idea of ‘credibility metadata’, the addition of terms aimed at reducing misinformation and disinformation by establishing veracity of resources (Wardle and Derakhshan 2017). This may involve markers of location, time, etc. on items, or metadata to note ‘quality factors’, such as that a source has a corrections policy, or that the author of an item has written on the topic before (Cuellar 2017). Conversely, indexing may directly address the ‘fake’ nature of an item, as in the idea of adding a term for ‘satirical article’ (Quinn 2016, Cuellar 2017). The latter, by which discredited materials may be identified as such, is well exemplified by the controversy over the reclassification of Holocaust denial literature as ‘historiography’ rather than ‘history’ (SimonXIX 2017).
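To make the idea of credibility metadata a little more concrete, the sketch below shows what such a record might look like in practice; the field names are invented for illustration and are not drawn from any published schema:

```python
# A hypothetical credibility-metadata record for a single news item.
# All field names here are illustrative, not from any published standard.
credibility_metadata = {
    "title": "Example news item",
    "capture_location": "London, UK",         # marker of location
    "capture_time": "2018-07-09T10:30:00Z",   # marker of time
    "quality_factors": {
        "source_has_corrections_policy": True,
        "author_has_prior_coverage_of_topic": True,
    },
    "genre": "satirical article",  # indexing that flags the nature of the item
}

def quality_score(record):
    """Count how many 'quality factors' a record satisfies."""
    return sum(record["quality_factors"].values())

print(quality_score(credibility_metadata))  # 2
```

Such a record could be generated intellectually or automatically; the point is simply that veracity-related properties become explicit, machine-readable parts of the resource description.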

These responses are generally implemented by traditional intellectual metadata construction. The third category is the use of automated classification and indexing to attempt to identify and categorise fake news and other pathologies of the post-factual society; of the many developments of this kind, a good, albeit simple, example of a classifier to distinguish genuine news items from fake is given by McIntire (2017). More complex examples, based on more sophisticated machine learning and classification techniques, are likely to play an increasing role (Cooper 2017). Both Facebook and YouTube announced innovations to this end in mid-2018.
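By way of illustration only, the principle behind such classifiers can be sketched in a few lines of Python; this toy version simply counts word co-occurrences over a tiny invented training set, and is of course far cruder than the machine-learning systems referred to above:

```python
from collections import Counter

# Tiny invented training set; real classifiers such as McIntire's are
# trained on thousands of labelled items with proper text features.
training = [
    ("government publishes official report on economy", "genuine"),
    ("minister answers questions in parliament today", "genuine"),
    ("shocking secret they do not want you to know", "fake"),
    ("miracle cure doctors hate revealed by insider", "fake"),
]

# Count word frequencies per class label.
word_counts = {"genuine": Counter(), "fake": Counter()}
for text, label in training:
    word_counts[label].update(text.split())

def classify(text):
    """Label a headline by which class its words appear in more often."""
    scores = {label: sum(counts[w] for w in text.split())
              for label, counts in word_counts.items()}
    return max(scores, key=scores.get)

print(classify("secret miracle cure revealed"))      # fake
print(classify("parliament report on economy"))      # genuine
```

Even this caricature makes clear both the promise and the limitation of the automated approach: it categorises by surface features of the text, not by the truth of what is asserted.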

Fourthly, there are those KO techniques which directly support curiosity which, as noted above, is a powerful force for finding alternative perspectives, breaking filter bubbles, and building understanding. In this respect, another long-established aspect of KO, classification techniques with their ability to show both hierarchical and associative relations, may be of particular importance.

It is worth considering, from the perspective of the issues discussed here, whether any particular form, or theory, of classification may be most appropriate. In particular, pragmatic or critical classification (Hjørland 2017) appears to be something of a two-edged sword. On the one hand, it may support, or reflect, an understanding of the world helpful to an individual or a group in developing understanding, and coincide with their emotional responses to issues; on the other hand, such an organisation may simply reinforce filter bubbles. It may be that systems could be developed to allow a ready comparison of alternative classifications, assisting curiosity-driven explorations of different perspectives. We may also need to consider the status of classical classifications, based on a single agreed picture of the world, and approximating as closely to truth as may be possible. Are these sustainable, at a time when alternative facts seem as viable as any other, and when expertise is said to be dead? They appear to be the antithesis of this negative viewpoint. The status and role of classification in the post-truth era seems to be an area in need of thoughtful research.

Fifthly, and finally, it seems to us that to deal adequately with current problems, KO must fully recognise the deep and irreversible changes in the information environment brought about by the shift to what Floridi categorises as ‘infosphere’ and ‘onlife’ in which our digital and physical lives merge, and information, contextual and mobile, is central to our society, and indeed to our humanity (Floridi 2014, Bawden and Robinson 2018). This is a long-term and far-reaching challenge, encompassing and going beyond the challenges of the post-truth society, and one in which KO should have a unique position.

There are no quick fixes to the problems set out above. KO cannot solve these problems alone, any more than can the wider LIS discipline; more far-reaching and structural changes – educational, technical, legislative, regulatory, and political – are needed for that. However, KO can play a significant role in improving the situation, using a combination of classic KO concepts and familiar KO practice, integrated with newer technological and organisational environments. In this way, by opposing misinformation and disinformation and promoting understanding, we may justify a claim to be curators of the infosphere.

We are grateful to Nick Poole and to Vanda Broughton for helpful comments and advice.

Anderson, L., Krathwohl, D. and Bloom, B. (2001). A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives. New York: Longman.

Anderson, J. and Rainie, L. (2017). The future of truth and misinformation online. Pew Research Center, Internet and Technology, October 19 2017. Available at, accessed 2 January 2018.

Bawden, D. (2017), Why LIS doesn’t have a quick fix for the post-factual society, and why that’s OK [blog post]. Available at, accessed 2 January 2018.

Bawden, D. and Robinson, L. (2018). Curating the infosphere: Luciano Floridi’s Philosophy of Information as the foundation for library and information science. Journal of Documentation, 74(1), 2-17.

Bawden, D. and Robinson, L. (2016A). Information and the gaining of understanding. Journal of Information Science, 42(3), 294-299.

Bawden, D. and Robinson, L. (2016B). “A different kind of knowing”: speculations on understanding in light of the Philosophy of Information. Paper presented at the 9th CoLIS (Conceptions of Library and Information Science) conference, Uppsala, June 25 2016. Open access version in the Humanities Commons at

Beckett, C. (2017). Dealing with the disinformation dilemma: a new agenda for news media [blog post]. Available at, accessed 4 January 2018.

Bradley, F. (2017). Valuing truth in an age of fake [blog post]. Available at, accessed 4 January 2018.

Clark, D. (2017). Scoping out post-truth issues and how might KO help. Paper presented at the ISKO UK 2017 annual meeting, presentation available at, accessed 3 January 2018.

Cooke, N.A. (2017). Facts: information behaviour and critical information consumption for a new age. Library Quarterly, 87(3), 211-221

Cooper, G. (2017). False news – a journalist’s perspective. Paper presented at the ISKO UK 2017 annual meeting, presentation available at, accessed 3 January 2018.

Corner, J. (2017). Fake news, post-truth and media-political change. Media, Culture and Society, 39(7), 1100-1107.

Cuellar, A. (2017). Combating fake news with metadata [blog post]. Available at, accessed 3 January 2018.

DCMI (2017). Dublin Core Metadata Initiative: responding to the post-truth phenomenon. Available at, accessed 3 January 2018.

Dunne, C. et al. (2012). Rapid understanding of scientific paper collections: integrating statistics, text analytics, and visualization. Journal of the American Society for Information Science and Technology, 63(12), 2351-2369.

Finch, M. (2017). Curiosity vs the post-truth world [blog post]. Available at, accessed 2 January 2018.

Floridi, L. (2014). The fourth revolution: how the infosphere is shaping human reality. Oxford: Oxford University Press.

Floridi, L. (2016). Fake news and a 400-year-old problem: we need to resolve the ‘post-truth’ crisis. The Guardian, 29 November 2016, available at, accessed 2 January 2018.

Gorichanaz, T. (2016). There’s no shortcut: building understanding from information in ultrarunning. Journal of Information Science, 43(5), 713-722

Gorichanaz, T. (2017). Applied epistemology and understanding in information studies. Information Research, 22(4), paper 776, available at, accessed 2 January 2018

Hjørland, B. (2017). Classification. In ISKO Encyclopedia of Knowledge Organization, available at, accessed 3 January 2018.

ISKO (2017). Session 5: False narratives – developing a KO community response to post-truth issues. UK Chapter of the International Society for Knowledge Organization, available at, accessed 3 January 2018.

Jirotka, M. and Webb, H. (2017). Spotting the fake [blog post]. Available at, accessed 2 January 2018

Madrigal, A.C. (2017). Google and Facebook failed us. The Atlantic, 2 October 2017, available at, access 2 January 2018.

McIntire, G. (2017). On building a fake news classification model [blog post]. Available at, accessed 3 January 2018.

O’Leary, M. (2017). Fact-checkers resist alternative facts. Information Today, October 2017, pp. 18-19.

Polizzi, G. (2017). Critical digital literacy: ten key readings for our distrustful media age [blog post]. Available at, accessed 2 January 2018.

Poole, N. (2017). Why ‘facts matter’ – evidence, trust and literacy in a post-truth world. Paper presented at the ISKO UK 2017 annual meeting.

Quinn, B. (2016). Helping to fix the fake news problem with metadata. Available at, accessed 3 January 2018.

Requarth, T. (2017), Scientists, stop thinking explaining science will fix things. Slate, 19 April 2017, available at, accessed 2 January 2018.

Robinson, L. (2016). Documentation in the post-factual society: or what LIS did next, after Brexit [blog post]. Available at, accessed 2 January 2018.

Russell, B. (1942). The art of philosophizing and other essays. Girard KS: Haldeman-Julius, p. 5

SimonXIX (2017). An open letter to the Sunday Times. Available at, accessed 3 January 2018.

Tomchak, A-M. (2017). Algorithms are screwing us over with fake news but could also fix the problem. Mashable, 5 October 2017, available at, accessed 2 January 2018.

Wardle, C. and Derakhshan, H. (2017). Information disorder: toward an interdisciplinary framework for research and policy making. Council of Europe report DGI(2017)09, available at, accessed 4 January 2018.

Can information be conserved, and why would it matter?

The idea that information may be conserved may strike many of us interested in recorded human information as faintly ridiculous. By ‘conserved’, we mean that there is a fixed amount of information in the universe, and that, while it may be changed, it can neither be created nor destroyed. This does not seem to accord in any way with the commonly held view that there is much more information than there used to be, and that information is being created at an ever-increasing rate; nor indeed that information may be lost, by the destruction of libraries and archives, and the decay and obsolescence of storage formats.

And yet the idea of conservation of information is taken as a given within the physical sciences. If information is a part of the physical universe, as it is taken to be in an increasingly commonly held viewpoint, then it is reasonable to assume that it will be conserved, like other fundamental physical quantities, such as energy or momentum. This contradiction is an intriguing one, and sheds light on the differences, and possible relationship, between the concept of information in the physical and social worlds. This post gives a very informal view of my understanding of the issue.

One caveat should be entered first. The phrase “conservation of information” is sometimes used by advocates of creation science to justify some of their views; that is not at all what I am discussing here.

The physicist Sean Carroll has written very clearly and cogently about conservation of physical information for a non-expert readership. This quotation from his 2016 book The Big Picture gives a very clear statement:
“Since the days of Laplace, every serious attempt at understanding the behaviour of the universe at a deep level has included the feature that the past and future are determined by the present state of the system. … This principle goes by a simple, if potentially misleading, name: conservation of information. Just as conservation of momentum implies that the universe can just keep on moving, without any unmoved mover behind the scenes, conservation of information implies that each moment contains precisely the right amount of information to determine every other moment. The term “information” here requires caution because scientists use the same word to mean different things in different contexts. Sometimes “information” refers to the knowledge you actually have about a state of affairs. Other times, it means the information that is readily accessible, embodied in what the system macroscopically looks like (whether you are looking at it and have the information or not). We are using a third possible definition, what we might call the “microscopic” information: the complete specification of the state of the system, everything you could possibly know about it. When speaking of the conservation of information, we mean literally all of it… the future and the past seem not only like different directions but also like completely different kinds of things. The past is fixed, our intuition assures us; it has already happened, while the future is still unformed and up for grabs. The present moment, the now, is what actually exists. And then along came Laplace to tell us differently. Information about the precise state of the universe is conserved over time; there is no fundamental difference between the past and future. ….. From the Laplacian point of view, where information is present in each moment and conserved over time, a memory isn’t some kind of direct access to events in the past. It must be a feature of the present state, since the present state is all we presently have.”

Leonard Susskind, in his admirable Theoretical Minimum text on basic physical principles, puts it more succinctly. Given that any change in the physical world can be represented in abstract terms by two states (before and after) with an arrow showing that the second state evolved directly from the first, then we have: “the most fundamental of all physical laws – the conservation of information. The conservation of information is simply the rule that every state has one arrow in and one arrow out. It ensures that you never lose track of where you started. The conservation of information is not a conventional conservation law.” This last point, that conservation of information is not the same as conservation of other physical quantities, is certainly worth bearing in mind.
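Susskind’s ‘one arrow in, one arrow out’ rule can be made concrete with a toy state space: a discrete dynamics conserves information exactly when its update map is a bijection, so that every state can be traced backwards as well as forwards. The sketch below, using an invented four-state system, checks that property:

```python
# Toy discrete dynamics on four states. A law conserves information
# (one arrow in and one arrow out per state) iff the update map is
# a bijection, i.e. a permutation of the state space.
states = ["A", "B", "C", "D"]

reversible = {"A": "B", "B": "C", "C": "D", "D": "A"}  # a permutation
lossy      = {"A": "B", "B": "B", "C": "D", "D": "A"}  # two arrows into B

def conserves_information(law):
    """One arrow out is guaranteed by the dict; check one arrow in:
    the law's outputs must hit every state exactly once."""
    return sorted(law.values()) == sorted(states)

print(conserves_information(reversible))  # True
print(conserves_information(lossy))       # False
```

In the lossy case, once the system reaches state B we can no longer say whether it arrived from A or from B itself: tracking backwards fails, which is precisely the sense in which information has been destroyed.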

This idea is at the basis of one of the great controversies of modern physics, Stephen Hawking’s ‘black hole paradox’. Once anything drops into a black hole, it disappears from our universe; the black hole increases in mass, but that tells us nothing about the nature of what has gone in; it, and the information it carries, appear to have disappeared from the universe. Hawking showed that black holes emit a form of radiation, which will eventually cause the black hole to dissipate, returning its mass/energy to the universe. But, according to Hawking, the nature of this radiation is the same regardless of what has gone into the black hole; information is not returned to the universe, and is therefore not conserved. This is disputed by some physicists, and the issue is far from resolved, despite Hawking’s claims over the years to have solved the problem. This is an illustration of how the concept of information, though in a different guise from that familiar to library/information science, is at the heart of physical questions.

Another puzzle relating to the conservation of information is the complex and disputed relationship between information and entropy. That there is such a relation is not doubted, but there is a conflict between those who follow Claude Shannon in seeing information and entropy as essentially synonymous, and those who, like Norbert Wiener, see them as opposites. Whichever view is taken, it poses problems for the idea of conservation of information, since the second law of thermodynamics, which no-one disputes, states that the entropy of a system as a whole always increases, although it may decrease locally. In no way is entropy conserved, and therefore, if information is closely related to entropy in any way, it is difficult to see how information can be conserved either.
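For readers unfamiliar with the Shannon measure invoked here, it is simply H = −Σ p·log₂(p) taken over a probability distribution, measured in bits; a minimal worked example (the function name is mine, for illustration):

```python
from math import log2

def shannon_entropy(probabilities):
    """Shannon's H = -sum(p * log2(p)), in bits; zero-probability
    outcomes contribute nothing."""
    return sum(-p * log2(p) for p in probabilities if p > 0)

# A fair coin toss carries one bit of entropy; a certain outcome, none.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([1.0]))       # 0.0
```

On the Shannon reading, this quantity measures information; on the Wiener reading, information is what you gain when such uncertainty is removed, hence the ‘opposites’ view mentioned above.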

There are various solutions to this conundrum. One is to follow Sean Carroll, quoted above, in arguing that there are different meanings of information, even in the physical sciences; the information which is conserved is not the information which is related to entropy. Informational entropy is to do with the information embedded in a system, of which we may or may not have knowledge (Carroll’s second form of information); and this is not conserved.

Another solution is to argue that information, though certainly related to entropy, is neither the same as entropy, nor its inverse. What is conserved is some combination of information and entropy, one increasing as the other diminishes. A third solution is to agree with Susskind that conservation of information is not the same as other conservation laws, although it is phrased in the same way, and not to worry about the issue.

None of these seems wholly satisfactory, for those who, like me, would like to see a seamless account of the relations between concepts of information in different domains. Nor are they as distinct as they might seem. Hawking’s black hole paradox, clearly concerning information of the conserved kind, began with, and continues to be analysed in terms of, the entropy of the black hole.

When we come to think of examples more directly related to the concerns of LIS, things do not necessarily get easier.

Suppose we have two printed books, of exactly the same size, weight, quality of paper, nature of ink, etc.; identical as physical objects. Suppose that they deal with different topics; their information content, in the sense of knowledge and meaning, is not at all similar. Suppose finally that these are the only copies of the books in existence, their content is not stored digitally anywhere, and no-one knows their contents.

Then suppose that both books are burned in a hot fire, so that all that remains is smoke and fine ash. At a first approximation, because the books were physically identical, the remnants, in terms of smoke and ash, will be identical. In both the physical and intellectual meanings of the word, their information has been destroyed. However, if we accept conservation of information, then, in principle if not in practice, the trajectories of the smoke and ash particles could be plotted and reversed, and the books recreated. [Those who dislike “in principle if not in practice” arguments might like to reflect on how many things thought impossible a generation ago are now almost routine: ‘seeing’ atoms and planets around other stars, for example.] Since recorded information must have a physical carrier, this suggests that, in principle, meaningful recorded information is also, in this sense, conserved, and potentially always recoverable.

Sean Carroll deals with this problem by again appealing to the idea of different kinds of information: “We tend to use the word “information” in multiple, often incomparable, ways… what we might call the “microscopic information” refers to a complete specification of the exact state of a physical system, and is neither created nor destroyed. But often we think of a higher-level macroscopic concept of information, one that can indeed come and go: if a book is burned, the information contained in it is lost to us, even if not to the universe.”

Examining what conservation of information may mean in various contexts may be a useful tool in elucidating relations between the idea of information in different domains. An example of this approach in the context of Mark Burgin’s General Theory of Information looks at ways of understanding conservation of structural (physical) and symbolic (social) information. (There is a similarity between the forms of information in this theory, and those put forward earlier by authors such as Marcia Bates and Tom Stonier.) The example given by Burgin and Rainer Feistel is of scientific research, whereby structural information (essentially regularities in the physical world) is extracted and converted into symbolic information, in the form of articles, books, etc. They argue that, although we do not possess any theory for the formulation of precise information conservation laws applicable to this context, we feel intuitively that the amount of symbolic information produced cannot exceed the amount of structural information in the part of the physical world being studied.

This linkage between physical and social information, in the form of a qualitative conservation law which essentially limits the production of meaningful information to the available physical information, seems attractive, but has been challenged by the introduction of different information-related entities. The physicist/philosopher David Deutsch, for instance, argues that although information may be limited (and hence potentially conserved), knowledge is infinite (and hence certainly not conserved).

It seems clear that the answer to the question as to whether information is conserved is still an open one. But any answer must inevitably begin with the caveat that, first and foremost, it depends what we mean by information. This need not be a sterile argument about the meaning of words, but rather a means of exploring different concepts of information, and – crucially – the ways in which information of different kinds may interact, and provide linkages between the physical, biological and social worlds.

Information generations; the end of the Millennials?

The idea of a ‘generation’ is a widely understood one, and we often take it for granted that people of a certain age will have similar experiences, expectations, and values. Terms like ‘Baby Boomers’, ‘Gen X’, and ‘Millennials’ are in common use, and it seems to be generally accepted that they have some value as a category of people with much in common.

Differences between generations are not, in the opinion of most writers, ‘just an age thing’; they reflect a supposedly distinctive life style, based on a common environment (social, cultural, economic, technical) during formative early years, and on experiencing different formative world events. Growing up in a different social and cultural environment, and with different technologies, changes skill sets and expectations. To take an obvious example: the web arrived when Boomers were typically around 40. Although many have enthusiastically adopted it, their approach is different from that of Gen X, who encountered it in young adulthood, of Millennials, many of whom first experienced it at school, and of the Google generation, for whom it has always been an integral part of life.

In the library/information sciences also, the idea of generations has had some acceptance; the assumption that those of one generation will deal differently with information, and will prefer certain kinds of information systems and services to those of others. This is something I have been interested in for some time, and have even presented about it, in meetings in Prague and in Vilnius.

Of course, we all know that, like many such groupings, this idea of the information generation should not be pushed too far. The generations overlap by 7 or 8 years, and their dates are somewhat arbitrary; it seems that every report has slightly different years for each generation. There are always exceptional individuals, who act ‘out of their generation’, many people identify with two generations, and some commentators treat generations together, e.g. X and Y together as ‘Me-Gen’ or ‘Next Gen’, or Y and Google as ‘Digital Natives’.

In information terms also, people are individuals, and the individual differences in information behaviour should not be forgotten. We cannot assume that people of a certain age will behave in similar ways with respect to information, any more than we would assume identical behaviour in other respects. In particular, the idea that younger people can be helpfully characterised as ‘digital natives’ has been severely criticised. Although Millennials may take the lead, older generations can also be keen technology adopters.

This has not, however, prevented the idea of generations being applied in planning and management of library/information services. Recent examples include information literacy tutorials for millennials and mentoring across generations in law librarianship.

Now a new report from the Pew Research Centre has declared the end of the Millennials. Anyone born from 1997 onwards will be part of a new generation, yet to be given a name. Given that, as the report notes, they have grown up in an ‘always on’ technical environment, perhaps they will be the Always-Ons, or perhaps, in a nod to Luciano Floridi, the ‘Infosphere Generation’. They will, no doubt, develop their own information behaviours, and make their own demands on information systems and services. But we would do well to remember that they are, first and foremost, individuals.

In praise of speculative (or even science) fiction

I have always liked science fiction. This is not something that serious people usually want to admit to, though the perception that the genre is fit only for nerdy adolescents has diminished over recent years. There has been a growing, if somewhat reluctant, acceptance that the more thoughtful end of science fiction can be valuable as a kind of futurology; I’ve even used the idea myself in considering Arthur C. Clarke’s ‘Odyssey vision’ and prediction in the information sciences.

I now feel largely vindicated, however, by the increasing interest in science fiction within the philosophical community. Several journal issues and books along these lines have appeared recently, and two in particular caught my eye.

A book providing a collection of articles on Philosophy and science fiction, produced as a supplement to the Journal of Social Philosophy, addresses the nature and value of science fiction. I was particularly taken with Ben Blumson’s argument that all fiction, and science fiction in particular, provides modal knowledge; it helps us to learn what may possibly be the case, and what must necessarily be the case, even in circumstances far different from those which we observe in our own life / planet / branch of the multiverse. A slightly different explanation is given by Brian Keeley: speculative fiction, a term which he prefers to science fiction, takes what we believe to be true, and imaginatively explores what might be the consequences if the circumstances were different. In both cases, the genre is seen as a kind of thought experiment instantiated in fiction.

This idea of science-fiction-as-thought-experiment is explicitly developed in a volume on Science fiction and philosophy, which uses a series of articles on concepts from science fiction to explore a wide variety of current issues. These range from virtual reality and free will, to time travel and the singularity, to artificial intelligence and transhumanism. As Eric Schwitzgebel puts it, introducing a selection of science fiction recommended by philosophers, speculative fiction “engages imaginative and emotive cognition about possibilities outside the ordinary run of human experience”.

So, speculative (and even science) fiction is a serious matter now. If the philosophers think it’s worthwhile, I shall not feel queasy about appreciating it, and neither should you.

Chemistry and its (information) history

It has often been said that chemistry was, and to an extent may still be, the most information-intensive of the sciences; see, for example, the article by Lyn Robinson and myself on chemical information literacy. This status is now challenged by molecular biology, with its ‘Central Dogma’ stating that information flows from DNA to RNA to proteins, and its reliance on an array of informatics systems, as noted in Divan and Royds’ recent short introduction. However, particularly to someone like myself who studied chemistry, it is interesting to reflect on the extent to which information representation and communication has gone hand-in-hand with the development of concepts and theories in chemistry, so that it is difficult to tell where the one ends and the other begins.
I was particularly struck by this, when looking through William Brock’s short introduction to the history of chemistry. Although this is stated to be about how chemical explanations were found, and how chemical phenomena were discovered, studied, and exploited, the book is, to a remarkable extent, also a history of information representation and communication over many years.
Listed below are just some of the examples described by Brock.

Williamson’s representation of ether
Alexander Williamson, professor of chemistry at University College London, derived in 1850 a new way of explaining chemical ‘types’, such as ethers and alcohols, which required typographers to invent new methods of displaying them diagrammatically on the printed page.
Graphical notations to represent the structures of organic compounds were developed by the Scottish chemists Couper and Crum Brown in the 1850s and 1860s, greatly assisting analysis of chemical phenomena by giving the organising principles of structure and bonding, and providing essentially the form in which organic structures are represented today in chemical information systems.
Couper’s structure representation
This was later developed, through ideas of atomic and molecular structure, into a modern understanding of the reaction mechanisms through which substances are transformed. This in turn was captured by the ‘curly arrow’ symbolism for electron movements, devised by the British chemist Robert Robinson in the 1920s.
Crum Brown’s structure representation
Robinson’s curly arrow notation

For inorganic chemistry, the equivalent organising principle was Mendeleev’s periodic table of the elements. It is interesting that, as Brock notes, Mendeleev was led to construct his table by his search for a better way to organise the material of a chemistry textbook; the table has been used for information organisation and retrieval ever since.

Periodic Table (source: ScienceNotes)

Chemistry library at Smith College Massachusetts 1915
Chemistry was in the lead in the development of disciplinary specialisation generally, with its associated science communication system, in the latter part of the nineteenth century, with initiatives including learned societies, specialist libraries, conferences and their proceedings, journals, monographs, abstracts, reviews, and guides to the literature. Notably, chemistry was the domain in which encyclopaedic compilations of information and data were first developed: particularly well-known examples noted by Brock, which continue to the present day as digital databases, are those of Gmelin and of Beilstein, for inorganic and organic chemistry respectively, and Kaye and Laby for physical properties of substances.


Chemistry has always been a science involving a mass of observational and experimental facts and data, such that the renowned physicist Ernest Rutherford could dismiss it as ‘stamp collecting’, somewhat unfairly in view of the organising principles already developed at that time, and much further developed later. It is interesting to reflect that both the main aspects of the subject, data compilation, and organising concepts and principles, have been reflected in the information conventions and systems of chemistry.
The naming of chemical substances has always been an issue, even in the days when only a relatively small number were known (in 2015 the number of distinct substances recognised by the Chemical Abstracts Registry System reached 100 million). Brock draws attention to the significance of a conference held in Geneva in 1892, at which a group of chemists and editors of chemical journals agreed to use structural theory as the basis for giving names to substances, this form of information representation greatly improving the ability to find chemical information in handbooks and indexes. It led to what is today known as the IUPAC nomenclature. This, however, came at the cost of public understanding of the science, as the new substance names were lengthy and difficult to understand; for example, ‘citric acid’ in the new nomenclature became ‘2-hydroxypropane-1,2,3-tricarboxylic acid’. Hence the continuation of ‘common names’ for substances familiar to the general public.
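The retrieval benefit that structure-based naming brought can be sketched in a few lines of Python. The substance list below is invented for illustration (real chemical information systems use registry numbers and structure representations, not name strings), but it shows why a systematic name, which encodes structural features, supports searching in a way that common names cannot:

```python
# Toy illustration: IUPAC-style systematic names encode structure,
# so a simple fragment search can retrieve substances by structural
# feature. Common names like 'citric acid' carry no such information.
# The data here are illustrative, not a real chemical database.

substances = {
    "citric acid": "2-hydroxypropane-1,2,3-tricarboxylic acid",
    "acetic acid": "ethanoic acid",
    "lactic acid": "2-hydroxypropanoic acid",
}

def find_by_fragment(fragment):
    """Return common names whose systematic name contains the fragment."""
    return sorted(common for common, systematic in substances.items()
                  if fragment in systematic)

# Retrieve all hydroxy acids in the (toy) collection:
print(find_by_fragment("hydroxy"))  # ['citric acid', 'lactic acid']
```

A handbook indexed by common names could answer no such query; this, in miniature, is the gain the Geneva chemists bought at the price of public intelligibility.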
Brock’s book, a condensation and also an updating of his much more extensive 1992 History of Chemistry text, is certainly a reliable and readable introduction to the history of the subject. But it also gives a clear insight as to the extent to which information representation, and information communication, both reflect and determine a subject domain.

Bawden, D. (2015), Storing the wisdom: chemical concepts and chemoinformatics, Informatics, 2(4), 50-67, available at

Bawden, D. and Robinson, L. (2017), “An intensity around information”: the changing face of chemical information literacy, Journal of Information Science, 43(1), 17-24, open access version at

Brock, W.H. (2016), The history of chemistry – a very short introduction, Oxford: Oxford University Press

Brock, W.H. (1992), The Fontana history of chemistry (2 vols.), London: Fontana

Chemical Abstracts Service (2015), CAS assigns the 100 millionth CAS Registry Number to a substance designed to treat myeloid leukemia, [media release] available at

Divan, A. and Royds, J.A. (2016), Molecular biology – a very short introduction, Oxford: Oxford University Press

Hepler-Smith, E. (2015), “Just as the structural formula does”: names, diagrams, and the structure of organic chemistry at the 1892 Geneva Nomenclature Congress, Ambix, 62(1), 1-28

Rzepa, H. (2012), The first ever curly arrows, [blog post] available at

Wikipedia (2017), IUPAC nomenclature of organic chemistry, available at

“The summary of the universe”: thoughts on Venice in the words of Peter Ackroyd

I visited Venice for the first time recently, and wanted to set down some impressions: partly on the nature of the city itself, partly on its history of collections, archives, printing, and recording knowledge. However, I found that these ideas were expressed more evocatively than I could ever manage by Peter Ackroyd in his ‘Venice: Pure City’. So here are Ackroyd’s words in italics. The photographs are mine, except where noted.

Liminality and time, night and silence, maps and labyrinths
There are many legends and superstitions of the sea within popular Venetian lore. It is a shifting city, between land and sea, and thus it becomes the home for liminal fantasies of death and rebirth… Venice as a city of transit, where you might easily be lost among the press, a city on the frontier between different worlds, where those who did not ‘fit in’ to their native habitat were graciously accepted… Venice was always a frontier. It was called ‘the hinge of Europe’. It has the essence of a boundary – a liminal space – in all its dealings. It is a perpetual threshold. It is half land and half sea. It is the middle place between the ancient imperial cities of Rome and Byzantium … Goethe described it as ‘the market place of the Morning and the Evening lands’ by which he meant that the city, poised between west and east, is the median point of the rising and the setting sun… It was a frontier, too, between the sacred and the profane. The public spaces of the city were liminal areas between piety and patriotism. The boundaries between past and present were ill-defined.

Our colleague Ian Rodwell has noted other views of Venice as a liminal space in a post on his Liminal Narratives blog.

Yet time seems to shift in the city. The tokens of various periods appear together, and various times modify one another. In Venice there is no true chronological time; it has been overtaken by other forces. There are occasions, indeed, when time seems to be suspended; if you enter a certain courtyard, in a shaft of sunlight, the past rises all around you… The city is so old, and so encrusted with habit and tradition, that the people can be said to fit within its existing rhythms… It has often been said that Venice cannot be modernised. More pertinently, it will not be modernised. It resists any such attempt with every fibre of its being…. It can hardly be doubted then, that Venice still exerts some strange power over the human imagination. To walk around the city is to enter a kind of reverie. Water instils memories of the past, made all the more real by the survival of the ancient brick and stone.

Photo by Lyn Robinson

The night and silence of Venice are profound. Moonlight can flood Saint Mark’s Square. Venice is most characteristic at night. It has a quality of stillness that suits the mood of time preserved. Then it is haunted by what it most loves – itself. The doorways seem darker than in any other city, lapped as they are by the black water.

The secret city takes the shape of a labyrinth. It is a maze that can elicit anxiety and even fear from the unwary traveller. It lends an element of intrigue to the simplest journey. It is a city of dead-ends, and of circuitous alleys: there are twisting calli, and hidden turnings; there are low archways and blank courtyards, where the silence is suspended like a mist. There are narrow courts that terminate in water. The natives do not lose their way, but the traveller always gets lost. It is impossible not to get lost. But then suddenly, as if by some miracle of revelation, you find that for which you have been searching …. But, then, it is unlikely that you will ever find that place again.

The bureaucracy of Venice was one of the wonders of the western world. Everything was committed to writing, as the overflowing archives of modern Venice will testify. At a time when other cities, or other nations, had only the most rudimentary internal organisation Venice was already a model of administrative expertise.

The Venetians were obsessed with their history. They produced the largest body of chronicles in the Italian world. Extant from the fourteenth century are more than a thousand such texts.

It is wholly to be expected, therefore, that the Venetian archives are the second largest in the world. Only the archives of the Vatican are more extensive. Yet none are more rich or more detailed than the Venetian papers. Some date from the ninth century. Everything was written down, in the hope that older decisions and provisions might still be useful… The Archivio di Stato, just one of the many official archives, contains 160 km of files and documents. When the German historian Leopold von Ranke first came upon them in the 1820s he was, like Cortez on a peak in Darien, staring at an ocean; from his encounter with the papers sprang the first exercise in what was known as ‘scientific history’. They are still an infinite resource for contemporary historians and sociologists….

A very ambitious digital humanities project is currently beginning, using digitization and machine learning to text mine these archives, revealing information on many aspects of Venetian society through the centuries.

Photo by Lyn Robinson
One resident of Venice has been celebrated, if that is the right word, as the first of all journalists. Pietro Aretino came to Venice from Rome in 1527 [and] wrote pasquinades or flysheets that were distributed everywhere in the city, and he refurbished the form of the giudizio or almanac … It is not perhaps surprising that the first newspaper in the world, the Gazzetta, emerged in Venice at the beginning of the seventeenth century. At various times in the following century one of the first modern journalists, Gasparo Gozzi, published L’Osservatore Veneto and La Gazzetta Veneta.

There was a passion for collecting in Venice; anything, from Roman coins to freaks of nature, could be taken up and placed in cabinets and cupboards… The first known collections were Venetian, dating from the fourteenth century. But the obsession with studioli or curiosity shops just grew and grew. [A Venetian collector] Federigo Contarini aspired to possess a specimen of every thing or being ever created… During the course of the seventeenth century possession became more specific and specialised. … The whole world could be purchased and displayed … There was a market for antiques and a market for landscape paintings; there was a market in natural marvels, such as the many-headed hydra valued at six thousand ducats, and a market in ancient musical instruments…. The last great Venetian collector, Conte Vittorio Cini, died in 1977.

The Venetians have never been known for their commitment to scholarship, or to learning for its own sake; they are not inclined to abstract inquiry, or to the adumbration of theory… There was no concern for dogma or theory. There was no real interest in pure or systematic knowledge as such; empirical knowledge was for the Venetians the key to truth… There was no university in the city itself. The absence might seem a singular omission for any city-state; but there was, of course, no university in London either, that other centre of trade and business… There may have been no great poetry in the city, but there were important texts on hydrostatics and geography, on hydraulics and astronomy. The Venetians also possessed a practical inventiveness, in pursuits as different as glass- and instrument-making.

The real intellectual success of Venice, however, came in the practical manufacture of books. The first licence to print was issued in 1469. Just eighteen or nineteen years after the invention of moveable type printing by Johannes Gutenberg, the Venetian senate announced that ‘this peculiar invention of our time, altogether unknown to former ages, is in every way to be fostered and advanced’. In this, the senators were five years ahead of William Caxton… The Venetian authorities had sensed a commercial opportunity, and the city soon became the centre of European printing. They created the privilege of copyright for certain printed works in 1486; it was the first legislation for copyright in the world… It was only right and natural that Venice should become the pioneer of that trade. Venice, in 1474, was said to be ‘stuffed with books’ … At the beginning of the sixteenth century there were almost two hundred print shops, producing a sixth of all the books published in Europe… The printers of Venice also became masters of musical printing, map printing and medical printing, spreading information around Europe. Books on the human anatomy, and on military fortifications, were published. Works of popular piety, light literature in the vernacular, chapbooks, all issued from the city of the lagoon.

Cadore, BAC gallery, rio S. Polo
In 1605 Venice was described as ‘the summary of the universe’, because all that the world contained could be found somewhere within it: if the world were a ring, then Venice was its jewel…. Peggy Guggenheim once said that when Venice is flooded it is even more truly beloved… Venice has always been in peril, its existence most fragile. It is a man-made structure relying on the vicissitudes of the natural world. Yet it has endured.

Tweet, tweet … analysing a library conference backchannel with Hawksey’s TAGS

Twitter has gained a reputation as a social media tool which is very popular within the LIS community, and most libraries and archives, LIS schools, and library/information conferences, as well as many individuals in the discipline and profession, make serious use of it for information exchange. Being able to easily get an analysis of the tweets around a topic is therefore very useful for LIS folk, as well as giving an insight into the increasingly important area of social media data analysis.

In this post, I give an account of how one tool, Hawksey’s TAGS, can be used in this way. As someone who had never used this, or any similar system, before, I was particularly impressed that I was able to get useful analysis within 20 minutes from starting cold.

Devised as a “hobby” by Martin Hawksey, TAGS is a free Google Sheet template which gives simple automated collection and display of Twitter search results. It uses the Twitter API to identify tweets according to specified criteria, creates an archive of tweets, and uses Google’s visualisation tools to display them. It is very easy to use, but if need be support and help is provided through online forums.

To use TAGS in a simple way – and there are more complicated and clever things that can be done with it, but that’s for another day – all that is necessary is to (1) get a Google account, if you do not already have one, (2) download TAGS (6.1 is the current version) into your Google Drive, and (3) get a Twitter authorisation to collect Twitter data (the TAGS software prompts you to do this, and it should only be necessary to do it once). Then for each analysis you wish to do, you run TAGS, specifying the collection criteria (essentially entering terms into what is effectively a search box), and wait for the script to complete. Basic statistics on the number of tweets included, and the time period over which they were sent, are presented. You can then make the archive usable, by clicking on the ‘share’ button, and visualise it by selecting TAGSExplorer. (You can also make the archive searchable for detailed analysis, but that is also for another day.)

We can exemplify this by looking at tweets about CILIP’s annual conference held in July 2017, which used the hashtag ‘CILIPConf17’. The screenshot below shows TAGS set up for this search; it is started by clicking ‘TAGS’ and ‘Run Now’.

The simplest way of using TAGS is to find all tweets with a particular characteristic, for example including the CILIP conference hashtag. A partial display of these is shown below, from an analysis done by my CityLIS colleague Lyn Robinson. The somewhat indigestible display does give a good impression of the extent of Twitter activity. The linking lines show Twitter users replying to one another; the isolated usernames are those who use the hashtag, but do not engage. The size of the usernames indicates the frequency of replies; there is, however, no distinction between the isolated users based on the number of tweets. This display gives an indication of who is using the hashtag, but not to what extent, or how much conversation is taking place. At the left-hand side is shown a list of the most common hashtags in this tweet archive: not surprisingly, ‘#CILIPConf17’ (which had to be in all tweets) is at the top, followed by two more CILIP-related tags, ‘#factsmatter’ and ‘#CILIPethics’; next in line is the ‘#CityLIS’ tag, showing the engagement of CityLIS tweeters with the conference.

The display below shows tweets about the CILIP conference which also mention the CityLIS library school; it is created simply by entering ‘#CILIPConf17’ and ‘#citylis’ as the search term.

The same can be done using a twitter username, as in the display below, created by entering ‘@floridi’ and ‘#CILIPconf17’ to find tweets mentioning Luciano Floridi who gave a keynote talk. Again the lines join those who interact, the size of the name showing the extent of interaction, with the isolated usernames being those who simply mention Floridi.

More extensive Boolean logic can be used. For example, the display below shows tweets using ‘#CILIPConf17’ with either ‘#citylis’, ‘@ucldis’ or ‘@infoschoolsheff’, to find tweets from, or mentioning, any of three leading UK LIS departments. Note that there are limitations to the complexity of the Boolean logic that can be employed (because of the way Google interrogates Twitter, rather than limitations in the TAGS software), but simple combinations of ANDs and ORs work fine.
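The kind of AND/OR combination involved can be sketched offline in a few lines of Python. The tweets below are invented for illustration, and the function is only a toy stand-in for the matching that the Twitter search service actually performs on TAGS’s behalf:

```python
# Sketch of Boolean tweet filtering, of the kind a TAGS query expresses.
# The tweet texts are invented; a real archive would come from the
# Twitter API via the TAGS spreadsheet.

tweets = [
    "Great keynote this morning #CILIPConf17 #citylis",
    "Enjoying the exhibition at #CILIPConf17",
    "Proud of our students presenting #CILIPConf17 @ucldis",
    "Unrelated chatter about libraries",
]

def matches(tweet, all_of=(), any_of=()):
    """AND across all_of; at least one of any_of, if any_of is given."""
    text = tweet.lower()
    return (all(term.lower() in text for term in all_of) and
            (not any_of or any(term.lower() in text for term in any_of)))

# '#CILIPConf17' AND ('#citylis' OR '@ucldis' OR '@infoschoolsheff')
hits = [t for t in tweets
        if matches(t, all_of=["#CILIPConf17"],
                   any_of=["#citylis", "@ucldis", "@infoschoolsheff"])]
print(len(hits))  # 2
```

Here the first and third tweets match, the second fails the OR clause, and the fourth lacks the conference hashtag altogether.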

It is important to remember that TAGS is intended as a simple, quick and free tool, and therefore has some limitations. Some issues in the way the visualisation works have been noted. There are also limitations in the collection of tweets, using the Twitter API. As Twitter themselves say, “it’s important to know that the Search API is focused on relevance and not completeness. This means some Tweets and users may be missing from search results.”

Also, TAGS only accesses tweets from roughly the previous week (between 6 and 9 days), because of the limitations on the length of time that tweets remain available; there are ways round this, but they are more complicated than the simple use of TAGS. There are also limitations on the number of tweets that TAGS will handle, though this will only affect analyses of topics with very large volumes.

So TAGS is best used to get a quick and simple picture of recent Twitter activity on a topic of interest. If completeness and precision are required, then it will be necessary to do extra work in identifying a full set of tweets, and then checking and cleaning the data. This might involve collecting tweets over a long period, or exporting the data to a more sophisticated analysis and visualisation program such as NodeXL. For an example of the use of NodeXL for detailed analysis, see the paper by Lee et al., which analyses a dataset of tweets from three annual conferences of the Association of Internet Researchers, showing the nature of the networks formed, the most influential tweeters, and the topics mentioned. This paper also gives a good literature review of examples of Twitter analysis.

Bear in mind that Twitter has rules about collecting Twitter data sets and making them public, and that these rules change to match new uses; see the current Twitter terms; however, it is unlikely that they would be infringed by the kind of small scale analysis and display exemplified by this CILIP conference example.

The TAGS software allows simple but effective Twitter analysis to be done with very little effort or resources, and is often all that is needed. It is something that anyone from an LIS background interested in how Twitter is being used should get familiar with.

Still awaiting the quantum turn

Two years ago a paper by myself and my colleagues Lyn Robinson and Tyabba Siddiqui was published in JASIST, introducing and explaining the idea of an emerging ‘quantum information science’. We argued that this could be seen in five respects: use of loose analogies and metaphors between concepts in quantum physics and library/information science; use of quantum concepts and formalisms in information retrieval; use of quantum concepts and formalisms in studying meaning and concepts; development of quantum social science, in areas adjacent to information science; and qualitative application of quantum concepts in the information disciplines themselves. This post discusses some developments since that paper was written.

Interest in the links between quantum theory and information continues. In the physics arena, an intriguing attempt is being made to construct the whole formalism of quantum mechanics on information-theoretic principles, as set out by D’Ariano, Chiribella and Perinotti in their new Quantum theory from first principles: an informational approach. A similar attempt is being made by the proponents of ‘QBism’ (Quantum Bayesianism), or ‘participatory realism’, according to which the result of any quantum measurement will depend on the information possessed by the observer. Quantum computers are getting near the stage of demonstrating their practical utility, as shown by the stated intention of Google’s quantum computer team to produce, by the end of 2017, a small quantum device able to deal with problems previously the preserve of supercomputers.

In the application of quantum formalisms to information retrieval, a book by Massimo Melucci, several of whose papers were discussed in our JASIST paper, summarises the state of the art. He states particularly clearly the way in which the quantum ideas are applied: “The idea behind the quantum-like approach to disciplines other than physics is that, although the quantum properties exhibited by particles such as photons cannot be exhibited by macroscopic objects, some phenomena can be described by the language or have some characteristics of the phenomena (e.g. superposition or entanglement) described by the quantum mechanical framework in physics … This book is not about quantum phenomena in IR: in contrast, it aims to propose the use of the mathematical language of the quantum mechanical framework for describing the mode of action of a retrieval system” (pp viii and xi).
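The mathematical language Melucci refers to can be illustrated with a toy sketch (not taken from his book): queries and documents are represented as unit vectors in a term space, and a ‘relevance probability’ is computed as the squared projection of one state onto the other, by analogy with the Born rule. The two-dimensional term space and the weights below are invented purely for illustration.

```python
import math

def normalize(v):
    """Scale a vector to unit length, as required for a quantum 'state'."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def quantum_prob(query, doc):
    """Squared inner product of the two unit vectors: the Born-rule
    analogue of the probability that the document 'matches' the query."""
    q, d = normalize(query), normalize(doc)
    return sum(a * b for a, b in zip(q, d)) ** 2

# Two term dimensions, say ('retrieval', 'quantum'); weights illustrative
q = [1.0, 1.0]    # query mentions both terms equally
d1 = [1.0, 0.0]   # document only about the first term
d2 = [1.0, 1.0]   # document about both terms

print(quantum_prob(q, d1))  # ≈ 0.5: half the query 'state' is matched
print(quantum_prob(q, d2))  # ≈ 1.0: states coincide
```

The point, as Melucci stresses, is not that any quantum physics is happening in the retrieval system; only that this vector-and-projection language gives a compact way of describing how such a system behaves.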

At a more general level, the idea of “quantum informational structural realism” (QISR) has attracted some interest since it was introduced by Terrell Ward Bynum. An extension of “Informational Structural Realism”, first proposed by Luciano Floridi, this provides a full ontological account of the universe in which there is an observer-independent reality, whose ultimate nature is neither physical nor mental, but informational, and defined by the interactions between informational entities. QISR insists that these entities have quantum properties. Betsy Van der Veer Martens was kind enough to note that this “links intriguingly” with the idea of a quantum turn in information studies identified in our JASIST paper.

In the area of ‘quantum social science’, there has been one major contribution since the JASIST paper appeared. Alexander Wendt, in his book Quantum mind and social science: unifying physical and social ontology, starts from the idea of consciousness as a quantum phenomenon on the macro-scale, and uses it to argue that language, social interaction, and culture should also be regarded as quantum in nature, and hence that a quantum approach is of direct relevance to social science. Wave functions are real, and operate at the social level. However, the arguments seem, like some of those reviewed in our JASIST paper, to be essentially metaphorical. In an interview, Wendt, noting that he was influenced to think about the topic by Zohar and Marshall’s popular book, The Quantum Society, gives an example of what he considers quantum effects in social science. He considers a Vietnamese tourist in Denmark going into a shop. The tourist speaks no Danish, and the shopkeeper no Vietnamese; but if they discover that they have English as a common language, then their minds will, Wendt suggests, become “entangled” in a quantum sense. One has to say that this is not the sense of entanglement which would be understood by a physicist. Nonetheless, this book is symptomatic of a potential quantum turn in social science generally, which has clear relevance to the information sciences.

We may conclude that quantum concepts still intrigue and influence the social sciences, including the information sciences, but that no new paradigm has been accepted. The information retrieval applications of the mathematical formalisms of quantum mechanics seem most firmly grounded; claims of true quantum phenomena in these settings are as yet un-evidenced, and the metaphorical use of terminology, though increasingly popular, has yet to show real benefit. Perhaps we need to wait for a new formulation of quantum mechanics in informational terms to emerge from physics and be fully accepted, before the quantum turn in information science can be realised; it may be that QISR is the first indicator of this.

Bawden, D., Robinson, L. and Siddiqui, T. (2015), “Potentialities or possibilities”: Towards quantum information science?, Journal of the Association for Information Science and Technology, 66(3), 437-449, open access version in the Humanities Commons at

Becker, C. (2015), Q and A: Alexander Wendt on ‘Quantum mind and social science’, Mershon Center for International Security Studies [online], available at

Courtland, R. (2017), Google plans to demonstrate the supremacy of quantum computing, IEEE Spectrum [online], available at

D’Ariano, G.M., Chiribella, G. and Perinotti, P. (2017), Quantum theory from first principles: an informational approach, Cambridge: Cambridge University Press

Melucci, M. (2015), Introduction to information retrieval and quantum mechanics, Berlin: Springer

Van der Veer Martens, B. (2015), An illustrated introduction to the infosphere, Library Trends, 63(3), 317-361

Waldrop, M.M. (2017) Painting a QBist picture of reality, FQXI Community [online], available at

Ward Bynum, T. (2013), On the possibility of quantum informational structural realism, Minds and Machines, 24(1), 123-139

Ward Bynum, T. (2016), Informational metaphysics, in Floridi, L. (ed.), The Routledge Handbook of the Philosophy of Information, London: Routledge, pp. 203-218

Wendt, A. (2015), Quantum mind and social science: unifying physical and social ontology, Cambridge: Cambridge University Press