“The summary of the universe”: thoughts on Venice in the words of Peter Ackroyd


I visited Venice for the first time recently, and wanted to set down some impressions: partly on the nature of the city itself, partly on its history of collections, archives, printing, and recording knowledge. However, I found that these ideas were expressed more evocatively than I could ever manage by Peter Ackroyd in his ‘Venice: Pure City’. So here are Ackroyd’s words in italics. The photographs are mine, except where noted.

Liminality and time, night and silence, maps and labyrinths
There are many legends and superstitions of the sea within popular Venetian lore. It is a shifting city, between land and sea, and thus it becomes the home for liminal fantasies of death and rebirth… Venice as a city of transit, where you might easily be lost among the press, a city on the frontier between different worlds, where those who did not ‘fit in’ to their native habitat were graciously accepted… Venice was always a frontier. It was called ‘the hinge of Europe’. It has the essence of a boundary – a liminal space – in all its dealings. It is a perpetual threshold. It is half land and half sea. It is the middle place between the ancient imperial cities of Rome and Byzantium … Goethe described it as ‘the market place of the Morning and the Evening lands’ by which he meant that the city, poised between west and east, is the median point of the rising and the setting sun… It was a frontier, too, between the sacred and the profane. The public spaces of the city were liminal areas between piety and patriotism. The boundaries between past and present were ill-defined.

Our colleague Ian Rodwell has noted other views of Venice as a liminal space in a post on his Liminal Narratives blog.

Yet time seems to shift in the city. The tokens of various periods appear together, and various times modify one another. In Venice there is no true chronological time; it has been overtaken by other forces. There are occasions, indeed, when time seems to be suspended; if you enter a certain courtyard, in a shaft of sunlight, the past rises all around you… The city is so old, and so encrusted with habit and tradition, that the people can be said to fit within its existing rhythms… It has often been said that Venice cannot be modernised. More pertinently, it will not be modernised. It resists any such attempt with every fibre of its being…. It can hardly be doubted then, that Venice still exerts some strange power over the human imagination. To walk around the city is to enter a kind of reverie. Water instils memories of the past, made all the more real by the survival of the ancient brick and stone.

Photo by Lyn Robinson

The night and silence of Venice are profound. Moonlight can flood Saint Mark’s Square. Venice is most characteristic at night. It has a quality of stillness that suits the mood of time preserved. Then it is haunted by what it most loves – itself. The doorways seem darker than in any other city, lapped as they are by the black water.

The secret city takes the shape of a labyrinth. It is a maze that can elicit anxiety and even fear from the unwary traveller. It lends an element of intrigue to the simplest journey. It is a city of dead-ends, and of circuitous alleys: there are twisting calli, and hidden turnings; there are low archways and blank courtyards, where the silence is suspended like a mist. There are narrow courts that terminate in water. The natives do not lose their way, but the traveller always gets lost. It is impossible not to get lost. But then suddenly, as if by some miracle of revelation, you find that for which you have been searching …. But, then, it is unlikely that you will ever find that place again.

The bureaucracy of Venice was one of the wonders of the western world. Everything was committed to writing, as the overflowing archives of modern Venice will testify. At a time when other cities, or other nations, had only the most rudimentary internal organisation Venice was already a model of administrative expertise.

The Venetians were obsessed with their history. They produced the largest body of chronicles in the Italian world. Extant from the fourteenth century are more than a thousand such texts.

It is wholly to be expected, therefore, that the Venetian archives are the second largest in the world. Only the archives of the Vatican are more extensive. Yet none are more rich or more detailed than the Venetian papers. Some date from the ninth century. Everything was written down, in the hope that older decisions and provisions might still be useful… The Archivio di Stato, just one of the many official archives, contains 160 km of files and documents. When the German historian Leopold von Ranke first came upon them in the 1820s he was, like Cortez on a peak in Darien, staring at an ocean; from his encounter with the papers sprang the first exercise in what was known as ‘scientific history’. They are still an infinite resource for contemporary historians and sociologists….

A very ambitious digital humanities project is currently beginning, using digitization and machine learning to text mine these archives, revealing information on many aspects of Venetian society through the centuries.

Photo by Lyn Robinson
One resident of Venice has been celebrated, if that is the right word, as the first of all journalists. Pietro Aretino came to Venice from Rome in 1527 [and] wrote pasquinades or flysheets that were distributed everywhere in the city, and he refurbished the form of the giudizio or almanac … It is not perhaps surprising that the first newspaper in the world, the Gazzetta, emerged in Venice at the beginning of the seventeenth century. At various times in the following century one of the first modern journalists, Gasparo Gozzi, published L’Osservatore Veneto and La Gazzetta Veneta.

There was a passion for collecting in Venice; anything, from Roman coins to freaks of nature, could be taken up and placed in cabinets and cupboards… The first known collections were Venetian, dating from the fourteenth century. But the obsession with studioli or curiosity shops just grew and grew. [A Venetian collector] Federigo Contarini aspired to possess a specimen of every thing or being ever created… During the course of the seventeenth century possession became more specific and specialised… The whole world could be purchased and displayed… There was a market for antiques and a market for landscape paintings; there was a market in natural marvels, such as the many-headed hydra valued at six thousand ducats, and a market in ancient musical instruments… The last great Venetian collector, Conte Vittorio Cini, died in 1977.

The Venetians have never been known for their commitment to scholarship, or to learning for its own sake; they are not inclined to abstract inquiry, or to the adumbration of theory… There was no concern for dogma or theory. There was no real interest in pure or systematic knowledge as such; empirical knowledge was for the Venetians the key to truth… There was no university in the city itself. The absence might seem a singular omission for any city-state; but there was of course no university in London either, that other centre of trade and business… There may have been no great poetry in the city, but there were important texts on hydrostatics and geography, on hydraulics and astronomy. The Venetians also possessed a practical inventiveness, in pursuits as different as glass- and instrument-making.
 


The real intellectual success of Venice, however, came in the practical manufacture of books. The first licence to print was issued in 1469. Just eighteen or nineteen years after the invention of moveable type printing by Johannes Gutenberg, the Venetian senate announced that ‘this peculiar invention of our time, altogether unknown to former ages, is in every way to be fostered and advanced’. In this, the senators were five years ahead of William Caxton… The Venetian authorities had sensed a commercial opportunity, and the city soon became the centre of European printing. They created the privilege of copyright for certain printed works in 1486; it was the first legislation for copyright in the world… It was only right and natural that Venice should become the pioneer of that trade. Venice, in 1474, was said to be ‘stuffed with books’… At the beginning of the sixteenth century there were almost two hundred print shops, producing a sixth of all the books published in Europe… The printers of Venice also became masters of musical printing, map printing and medical printing, spreading information around Europe. Books on the human anatomy, and on military fortifications, were published. Works of popular piety, light literature in the vernacular, chapbooks, all issued from the city of the lagoon.

Image by image
Cadore, BAC gallery, rio S. Polo
In 1605 Venice was described as ‘the summary of the universe’, because all that the world contained could be found somewhere within it: if the world were a ring, then Venice was its jewel…. Peggy Guggenheim once said that when Venice is flooded it is even more truly beloved… Venice has always been in peril, its existence most fragile. It is a man-made structure relying on the vicissitudes of the natural world. Yet it has endured.

Tweet, tweet … analysing a library conference backchannel with Hawksey’s TAGS

Twitter has gained a reputation as a social media tool which is very popular within the LIS community, and most libraries and archives, LIS schools, and library/information conferences, as well as many individuals in the discipline and profession, make serious use of it for information exchange. Being able to easily get an analysis of the tweets around a topic is therefore very useful for LIS folk, as well as giving an insight into the increasingly important area of social media data analysis.

In this post, I give an account of how one tool, Hawksey’s TAGS, can be used in this way. As someone who had never used this, or any similar system, before, I was particularly impressed that I was able to get useful analysis within 20 minutes of starting cold.

Devised as a “hobby” by Martin Hawksey, TAGS is a free Google Sheet template which gives simple automated collection and display of Twitter search results. It uses the Twitter API to identify tweets according to specified criteria, creates an archive of tweets, and uses Google’s visualisation tools to display them. It is very easy to use, but if need be support and help is provided through online forums.

To use TAGS in a simple way – and there are more complicated and clever things that can be done with it, but that’s for another day – all that is necessary is to (1) get a Google account, if you do not already have one, (2) download TAGS (6.1 is the current version) into your Google Drive, and (3) get a Twitter authorisation to collect Twitter data (the TAGS software prompts you to do this, and it should only be necessary to do it once). Then for each analysis you wish to do, you run TAGS specifying the collection criteria (essentially entering terms into what is effectively a search box), and wait for the script to complete. Basic statistics on the number of tweets included, and the time period over which they were sent, are presented. You can then make the archive usable, by clicking on the ‘share’ button, and visualise it by selecting TAGSExplorer. (You can also make the archive searchable for detailed analysis, but that is also for another day.)

We can exemplify this by looking at tweets about CILIP’s annual conference held in July 2017, which used the hashtag ‘CILIPConf17’. The screenshot below shows TAGS set up for this search; it is started by clicking ‘TAGS’ and ‘Run Now’.

 
The simplest way of using TAGS is to find all tweets with a particular characteristic, for example those including the CILIP conference hashtag. A partial display of these is shown below, from an analysis done by my CityLIS colleague Lyn Robinson. The somewhat indigestible display does give a good impression of the extent of Twitter activity. The linking lines show Twitter users replying to one another; the isolated usernames are those who use the hashtags but do not engage. The size of the usernames indicates the frequency of replies; there is, however, no distinction between the isolated users based on the number of tweets. This display gives an indication of who is using the hashtags and of how much conversation is taking place, but not of the extent of each user’s activity. At the left-hand side is shown a list of the most common hashtags in this tweet archive: not surprisingly, ‘#CILIPConf17’ (which had to be in all tweets) is at the top, followed by two more CILIP-related tags, ‘#factsmatter’ and ‘#CILIPethics’; next in line is the ‘#CityLIS’ tag, showing the engagement of CityLIS tweeters with the conference.
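The hashtag list in such a display is essentially a frequency count over the archive. TAGS does this inside Google Sheets, but the underlying idea can be sketched in a few lines of Python; the sample tweets below are invented for illustration and do not come from the real archive:

```python
import re
from collections import Counter

# An invented sample standing in for rows of a TAGS tweet archive
tweets = [
    {"from_user": "alice", "text": "Great keynote #CILIPConf17 #factsmatter"},
    {"from_user": "bob",   "text": "@alice agreed! #CILIPConf17"},
    {"from_user": "carol", "text": "Off to a workshop #CILIPConf17 #citylis"},
]

def hashtag_counts(archive):
    """Count hashtag occurrences across the archive, case-insensitively."""
    counts = Counter()
    for tweet in archive:
        counts.update(tag.lower() for tag in re.findall(r"#\w+", tweet["text"]))
    return counts

print(hashtag_counts(tweets).most_common())
# [('#cilipconf17', 3), ('#factsmatter', 1), ('#citylis', 1)]
```

As in the TAGSExplorer display, the hashtag that was required in every tweet naturally comes out on top.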

 
The display below shows tweets about the CILIP conference which also mention the CityLIS library school; it is created simply by entering ‘#CILIPConf17’ and ‘#citylis’ as the search terms.

The same can be done using a Twitter username, as in the display below, created by entering ‘@floridi’ and ‘#CILIPconf17’ to find tweets mentioning Luciano Floridi, who gave a keynote talk. Again the lines join those who interact, the size of the name showing the extent of interaction, with the isolated usernames being those who simply mention Floridi.
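The linking lines in these displays are built from the @-mentions within each tweet. As a rough sketch of how such interaction edges can be extracted from an archive (the sample data is invented, and TAGSExplorer's actual logic may well differ):

```python
import re

# Invented sample rows from a tweet archive
tweets = [
    {"from_user": "alice", "text": "Inspiring keynote by @floridi #CILIPconf17"},
    {"from_user": "bob",   "text": "@alice couldn't agree more #CILIPconf17"},
    {"from_user": "carol", "text": "Lunch queue thoughts #CILIPconf17"},
]

def mention_edges(archive):
    """Return (tweeter, mentioned-user) pairs. Tweets with no @-mention
    (like carol's) contribute no edges, so their authors would appear as
    isolated nodes in a network display."""
    return [(tweet["from_user"], user.lower())
            for tweet in archive
            for user in re.findall(r"@(\w+)", tweet["text"])]

print(mention_edges(tweets))
# [('alice', 'floridi'), ('bob', 'alice')]
```

Counting how often each name appears in these pairs would give the relative sizes of the usernames in the visualisation.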

 
More extensive Boolean logic can be used. For example, the display below shows tweets using ‘#CILIPConf17’ together with either ‘#citylis’, ‘@ucldis’ or ‘@infoschoolsheff’, to find tweets from, or mentioning, any of three leading UK LIS departments. Note that there are limitations to the complexity of the Boolean logic that can be employed (because of the way Google interrogates Twitter, rather than limitations in the TAGS software), but simple combinations of ANDs and ORs work fine.
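In Twitter's search syntax, terms separated by spaces are ANDed, while the keyword OR between terms gives alternatives. A tiny helper function (my own illustration, not part of TAGS) makes the structure of such combined queries explicit:

```python
def twitter_query(all_of=(), any_of=()):
    """Build a Twitter search query string: 'all_of' terms are ANDed
    (space-separated); 'any_of' terms are ORed inside parentheses."""
    parts = list(all_of)
    if any_of:
        parts.append("(" + " OR ".join(any_of) + ")")
    return " ".join(parts)

# A query combining a conference hashtag with any of three department accounts
print(twitter_query(all_of=["#CILIPConf17"],
                    any_of=["#citylis", "@ucldis", "@infoschoolsheff"]))
# #CILIPConf17 (#citylis OR @ucldis OR @infoschoolsheff)
```

The resulting string is what would be typed into the TAGS search box for the analysis described above.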

It is important to remember that TAGS is intended as a simple, quick and free tool, and therefore has some limitations. Some issues in the way the visualisation works have been noted. There are also limitations in the collection of tweets, using the Twitter API. As Twitter themselves say, “it’s important to know that the Search API is focused on relevance and not completeness. This means some Tweets and users may be missing from search results.”

Also, TAGS only accesses tweets from roughly the previous week (between 6 and 9 days), because of the limitations on the length of time that tweets remain available; there are ways round this, but they are more complicated than the simple use of TAGS. There are also limitations on the number of tweets that TAGS will handle, though this will only affect analyses of topics with very large volumes.

So TAGS is best used to get a quick and simple picture of recent Twitter activity on a topic of interest. If completeness and precision are required, extra work will be needed to identify a full set of tweets, and then to check and clean the data. This might involve collecting tweets over a long period, or exporting the data to a more sophisticated analysis and visualisation program such as NodeXL. For an example of the detailed analysis possible with NodeXL, see the paper by Lee et al., which analyses a dataset of tweets from three annual conferences of the Association of Internet Researchers, showing the nature of the networks formed, the most influential tweeters, and the topics mentioned. This paper also gives a good literature review of examples of Twitter analysis.

Bear in mind that Twitter has rules about collecting Twitter data sets and making them public, and that these rules change to match new uses; see the current Twitter terms; however, it is unlikely that they would be infringed by the kind of small scale analysis and display exemplified by this CILIP conference example.

The TAGS software allows simple but effective Twitter analysis to be done with very little effort or resources, and is often all that is needed. It is something that anyone from an LIS background interested in how Twitter is being used should get familiar with.

Still awaiting the quantum turn

Two years ago a paper by myself and my colleagues Lyn Robinson and Tyabba Siddiqui was published in JASIST, introducing and explaining the idea of an emerging ‘quantum information science’. We argued that this could be seen in five respects: use of loose analogies and metaphors between concepts in quantum physics and library/information science; use of quantum concepts and formalisms in information retrieval; use of quantum concepts and formalisms in studying meaning and concepts; development of quantum social science, in areas adjacent to information science; and qualitative application of quantum concepts in the information disciplines themselves. This post discusses some developments since that paper was written.

Interest in the links between quantum theory and information continues. In the physics arena, an intriguing attempt is being made to construct the whole formalism of quantum mechanics on information-theoretic principles, as set out by D’Ariano, Chiribella and Perinotti in their new Quantum theory from first principles: an informational approach. A similar attempt is being made by the proponents of ‘QBism’ (Quantum Bayesianism), or ‘participatory realism’, according to which the result of any quantum measurement will depend on the information possessed by the observer. Quantum computers are getting near the stage of demonstrating their practical utility, as shown by the stated intention of Google’s quantum computer team to produce, by the end of 2017, a small quantum device able to deal with problems previously the preserve of supercomputers.

In the application of quantum formalisms to information retrieval, a book by Massimo Melucci, several of whose papers were discussed in our JASIST paper, summarises the state of the art. He states particularly clearly the way in which the quantum ideas are applied: “The idea behind the quantum-like approach to disciplines other than physics is that, although the quantum properties exhibited by particles such as photons cannot be exhibited by macroscopic objects, some phenomena can be described by the language or have some characteristics of the phenomena (e.g. superposition or entanglement) described by the quantum mechanical framework in physics … This book is not about quantum phenomena in IR: in contrast, it aims to propose the use of the mathematical language of the quantum mechanical framework for describing the mode of action of a retrieval system” (pp viii and xi).

At a more general level, the idea of “quantum informational structural realism” (QISR) has attracted some interest since it was introduced by Terrell Ward Bynum. An extension of “Informational Structural Realism”, first proposed by Luciano Floridi, this provides a full ontological account of the universe in which there is an observer-independent reality, whose ultimate nature is neither physical nor mental, but informational, and defined by the interactions between informational entities. QISR insists that these entities have quantum properties. Betsy Van der Veer Martens was kind enough to note that this “links intriguingly” with the idea of a quantum turn in information studies identified in our JASIST paper.

In the area of ‘quantum social science’, there has been one major contribution since the JASIST paper appeared. Alexander Wendt in his book Quantum mind and social science: unifying physical and social ontology starts from the idea of consciousness as a quantum phenomenon on the macro-scale, and uses it to argue that language, social interaction, and culture should also be regarded as quantum in nature, and hence that a quantum approach is of direct relevance to social science. Wave functions are real, and operate at the social level. However, the arguments seem, like some of those reviewed in our JASIST paper, to be essentially metaphorical. In an interview, Wendt, noting that he was influenced to think about the topic by Zohar and Marshall’s popular book, The Quantum Society, gives an example of what he considers quantum effects in social science. He considers a Vietnamese tourist in Denmark going into a shop. The tourist speaks no Danish, and the shopkeeper no Vietnamese; but if they discover that they have English as a common language, then their minds will, Wendt suggests, become “entangled” in a quantum sense. One has to say that this is not the sense of entanglement which would be understood by a physicist. Nonetheless, this book is symptomatic of a potential quantum turn in social science generally, which has clear relevance to the information sciences.

We may conclude that quantum concepts still intrigue and influence the social sciences, including the information sciences, but that no new paradigm has been accepted. The information retrieval applications of the mathematical formalisms of quantum mechanics seem most firmly grounded; claims of true quantum phenomena in such settings are as yet un-evidenced, and the metaphorical use of terminology, though increasingly popular, has yet to show real benefit. Perhaps we need to wait for a new formulation of quantum mechanics in informational terms to emerge from physics and be fully accepted, before the quantum turn in information science can be realised; it may be that QISR is the first indicator of this.

References
Bawden, D., Robinson, L. and Siddiqui, T. (2015), “Potentialities or possibilities”: Towards quantum information science? Journal of the American Society for Information Science and Technology, 66(3), 437-449, open access version in the Humanities Commons at https://hcommons.org/deposits/item/hc:14697

Becker, C. (2015), Q and A: Alexander Wendt on ‘Quantum mind and social science’, Mershon Center for International Security Studies [online], available at https://mershoncenter.osu.edu/news/mershon-news/q-and-a-alexander-wendt-on-quantum-mind-and-social-science.html

Courtland, R. (2017), Google plans to demonstrate the supremacy of quantum computing, IEEE Spectrum [online], available at http://spectrum.ieee.org/computing/hardware/google-plans-to-demonstrate-the-supremacy-of-quantum-computing

D’Ariano, G.M., Chiribella, G. and Perinotti, P. (2017), Quantum theory from first principles: an informational approach, Cambridge: Cambridge University Press

Melucci, M. (2015), Introduction to information retrieval and quantum mechanics, Berlin: Springer

Van der Veer Martens, B. (2015), An illustrated introduction to the infosphere, Library Trends, 63(3), 317-361

Waldrop, M.M. (2017) Painting a QBist picture of reality, FQXI Community [online], available at http://fqxi.org/community/articles/display/218

Ward Bynum, T. (2013), On the possibility of quantum informational structural realism, Minds and Machines, 24(1), 123-139

Ward Bynum, T. (2016), Informational metaphysics, in Floridi, L. (ed.), The Routledge Handbook of the Philosophy of Information, London: Routledge, pp. 203-218

Wendt, A. (2015), Quantum mind and social science: unifying physical and social ontology, Cambridge: Cambridge University Press

Rocking documentation

William Dyce. Pegwell Bay, Kent – a recollection of October 5th 1858 (Tate Britain)
There rolls the deep where grew the tree.
O earth what changes hast thou seen!
There where the long street roars hath been
The stillness of the central sea.

The hills are shadows and they flow
From form to form and nothing stands;
They melt like mist, the solid lands,
Like clouds they shape themselves and go.

(Alfred Lord Tennyson, In Memoriam, 1850)

Tennyson’s verse, and Dyce’s painting of the rock strata in the cliffs, remind us of the realisation in mid-Victorian times that the study of rocks and minerals could give a great deal of information and insight into the evolution of the earth; that they could, in fact, act as documents.

I was reminded of this by reading Jan Zalasiewicz’s Rocks, one of the books in the Oxford University Press Very Short Introduction series.

This excellent short book speaks of rocks and minerals using the explicit language of information, document, and archive: “rocks contain our sense of planetary history: indeed, in a very literal sense, they are the evidence from which earth history, as encapsulated in the geological time scale, is constructed … [sedimentary rocks] can include invaluable information on the chemistry and biology of long-vanished oceans … [the ice strata of Antarctica are] a marvellous archive of information which can be interrogated to give an eloquent picture of climate change”.

The link between the study of rocks and minerals and documentation has always been an important one. Geology and mineralogy have been among the natural sciences which have relied from their origins on collecting, naming and classifying, and virtually all museums with a natural history component have had a classified collection. Though such collections are now regarded as somewhat old-fashioned for public display, it is hard to doubt their documentary nature and value. Information on rocks and minerals is organised by nomenclature, officially maintained by the International Mineralogical Association, and by a variety of classifications, of which the most important at present are those of Dana and of Strunz.

Mineral collection in the National Museum, Prague (photograph by DB)

They have also featured extensively in discussions of the nature of documents and documentation, from the early writings of Suzanne Briet, who noted that, while a pebble in a river bed was not a document, the stones in a museum of geology certainly were. Bringing the story up to date, Betsy Van der Veer Martens gives an interesting and detailed discussion of rocks as documents and information objects, in a recent article in Education for Information, while Luciano Floridi, whose information ethics also assigns an intrinsic moral worth to all items on the basis of their information nature, specifies rocks as among these informational entities.

At a time when the concept of documentation is being expanded to include whole landscapes, it is worth remembering that humble rocks and pebbles are one of the most fundamental forms of document, and the way in which they have been dealt with illustrates important principles of documentation.

Library and information science in an age of messages: Rafael Capurro’s comments

In a previous post, I gave a slightly modified version of a chapter written by Lyn Robinson and myself for a Festschrift in honour of Rafael Capurro.


Capurro subsequently wrote an insightful and generous commentary on all of the book’s chapters. Below, I reproduce a shortened version of his perceptive comments on our chapter:

Thanks for your clear, concise and comprehensive analysis of thoughts on the nature of information science and its foundations.

You write “There is a good deal to be said about the relation between information and entropy, complexity and similar physical concepts, but it is not yet evident that this is best expressed in terms of messages and messengers.” You are right. What is missing in my analysis is no more and no less than the concept of time. Three-dimensional time plays a key role also in quantum mechanics as Carl Friedrich von Weizsäcker and others have shown.

With regard to “Capurro’s Trilemma”, you write: “There are, it seems, two kinds of gaps: those between the concepts; and those between scholars who think it worthwhile to bridge such gaps and those who do not.” The “gaps between concepts” are in fact gaps between contexts. Aristotle is a master in presenting commonalities and differences between the use of concepts in different contexts. Take, for instance, his analysis of the concept of middle (meson, mesotes) in physics, logic, epistemology, and ethics.

With regard to the importance of the concept of document … you write “This might be seen as an endorsement of a focus on documentation as a central concern within LIS, although Capurro does not seem to have made this link explicitly”. This is not quite the case. I pointed to it in my PhD [thesis of 1978] where I defined information as documented knowledge made available or “useful” – ready-to-hand or “Zuhanden” in Heideggerian terms – within a network of institutions, media, instruments for classification and retrieval and the like. This definition is not only not in contradiction to Popperian World 3, but includes also Worlds 1 and 2. Popper’s criticism of “pure facts” and his insistence that any observation is “theory-laden” are not dissimilar to the hermeneutic concept of “pre-understanding”.

Floridi’s “Philosophy of Information” is pretty near to my early research on the Latin root informatio and the Greek concepts of eidos, idea, morphe and typos. In the course of time I took a self-critical distance from it, becoming less metaphysical and more existential. Some clarity in these matters might come from a thorough analysis of what ontology means in different schools of thought and, as in my case, in Heideggerian phenomenology. An analysis of the question “what is a document?” should reflect the epochal changes of this concept in such a way that the word “is” in the definition should be always hermeneutically understood as an ‘as’.

… I am more curious than ever on how information science will find its place within this interdisciplinary framework (one that appears to me more like a labyrinth than having one sort of rationale based on a common language and related to the whole of reality). But you are right when you ask: “What is real and what does ‘real’ actually mean?” These are fundamental questions that need to be asked again and again because the meaning of being changes epochally, as in the case of the Heideggerian interpretation of being (as three-dimensional time). Thinking of the nature of the real from this perspective means to be able to look at the changing essence of what appears within a field of possibilities and not the other way round as metaphysics tends to do.

LIS can embrace both traditions, the metaphysical and the phenomenological, as it has to do with the reification of human knowledge as well as with its use. The use perspective is the practical and original horizon in which users are embedded. Information science takes the objectivizing “present-at-hand” perspective. In the preface of their seminal book Understanding computers and cognition (1986), Terry Winograd and Fernando Flores wrote “All technologies develop within the background of a tacit understanding of human nature and human work. The use of technology in turn leads to fundamental changes in what we do, and ultimately in what it is to be human. We encounter the deep questions of design when we recognise that in designing tools we are designing new ways of being.” This was and is still a key insight for my LIS research as well as for my view of information ethics from an intercultural perspective.

Unveiling of nature or social creativity: classification and discovery in astronomy


It has always interested me to see how the development of ideas of classification and categorisation in the information sciences has been intertwined with analogous developments in the natural sciences. This is most obviously the case for botany, where Linnaeus’s stipulation that “classification and name-giving will be the foundation of our science” could equally well apply to the information disciplines. But it is also true elsewhere, not least in astronomy, as is shown by Steven J. Dick’s magisterial work Discovery and Classification in Astronomy. Dick, an astronomer, astrobiologist and historian of science, is perhaps uniquely qualified to write such a book.

Steven J. Dick

Dick argues that classification is fundamentally enmeshed with the process of discovery in astronomy, and many (if not all) major astronomical discoveries amount to the recognition of a new class of objects. Therefore, what is counted as a ‘class’, what criteria are used for classification, and by what means and processes new classes and classification systems become accepted, are vital issues. The wide-ranging historical and conceptual discourse in this book, which includes a novel classification of astronomical objects into 82 classes in three ‘kingdoms’ (planets, stars, and galaxies), gives the fullest account of these debates that we have yet had. Remarkably, despite the many classifications of particular kinds of astronomical objects presented in this book, this is the first modern comprehensive classification of the astronomy domain.

He proposes a new idea of astronomical discovery, typically drawn out over a long time period, and comprising phases of detection, interpretation, and multiple stages of understanding. This is complicated by periods of ‘pre-discovery’: it has often been the case that astronomical objects have been seen and noted, but without their true nature being realised until much later.

William Herschel’s illustration of his nebulae classification

The origins of systematic classification in astronomy are dated by Dick to William Herschel’s classification of nebulae, which first raised the possibility that some were dense groups of stars, which might be resolved with a powerful enough telescope, while others were different; we know today that the latter may be gaseous objects within the Milky Way galaxy, or remote galaxies in their own right. He used eight main classes of nebulae based on appearance: bright nebulae, faint nebulae, very faint nebulae, planetary nebulae, very large nebulae, very condensed and rich clusters of stars, compressed clusters of small and large stars, and coarsely scattered clusters of stars. Dick mentions the intriguing possibility that Herschel may have been inspired to produce this first astronomical taxonomy under the influence of his brother Dietrich, a keen natural historian and butterfly collector. Herschel himself recognised an analogy with library classification and book arrangement, modestly describing his classification as “little more than an arrangement of the objects for the convenience of the observer compared to the disposition of the books in a library, where the different sizes of the volumes is often more considered than their contents”.

The continuing relation between classification in astronomy and in natural history is emphasised by a quotation from the historian David DeVorkin about twentieth century astronomy: “Akin to the naturalist, the typical American professional astronomer was collector and classifier. Instead of museum shelves and cases, astronomers stored their systematic observations in plate vaults and letterpress log books, and displayed them in catalogues sponsored by universities and observatories”.

Hubble’s classification of nebulae

Classifications of nebulae were revised as astrophotography and spectroscopy revealed more of their detailed nature. There was a continuing debate as to whether such classifications were true reflections of reality, or, as Edwin Hubble said of Max Wolf’s classification of 1909, merely “temporary filing systems”. The debates continued into the second half of the twentieth century, with the recognition of new classes such as quasars and blazars, now known to be types of active galaxies. New discoveries have raised the question as to what counts as an ‘object’ to be classified. Is a black hole an object? A galactic filament, a huge and extended structure? A galactic void, which is defined by the absence of any objects?

Typical modern stellar classification

Dick goes on to consider the classification of stars, initially by colour, later shown to be related to temperature and composition; an illustration of the way in which an increasingly sophisticated understanding is aided by, and in turn influences, classifications.
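The colour–temperature relationship underlying modern stellar classification can be sketched as a simple ordered lookup. Here is a minimal illustration in Python, using approximate Morgan–Keenan (MK) spectral classes; the boundary temperatures are rough textbook values, and the exact figures vary between sources:

```python
# Approximate Morgan-Keenan spectral classes, ordered hot to cool.
# Lower-bound temperatures are in kelvin; values are illustrative
# and differ slightly between sources.
MK_CLASSES = [
    ("O", 30000),   # blue
    ("B", 10000),   # blue-white
    ("A", 7500),    # white
    ("F", 6000),    # yellow-white
    ("G", 5200),    # yellow (the Sun, ~5800 K, is a G star)
    ("K", 3700),    # orange
    ("M", 2400),    # red
]

def spectral_class(temperature_k: float) -> str:
    """Assign a star to an MK class by effective temperature."""
    for letter, lower_bound in MK_CLASSES:
        if temperature_k >= lower_bound:
            return letter
    return "cooler than M (e.g. brown dwarf)"

print(spectral_class(5800))   # the Sun -> "G"
print(spectral_class(25000))  # a hot blue-white star -> "B"
```

Even this toy version shows Dick’s point in miniature: the class boundaries are human stipulations, but what they carve up (a continuum of temperatures) is given by nature.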
Messier 11 (Canada-France-Hawaii telescope): nebula, open star cluster, or multiple star system?

Classifications of stellar systems led to debates as to whether ‘double stars’ were an example of ‘multiple stars’ or were a separate category, and at what arbitrary point multiple star systems end and sparse open clusters begin. The same question may be asked of galaxies: at what point does a group of galaxies become a cluster, and a cluster a super-cluster, if indeed there is any real difference between the concepts?

Naturally, Dick gives a lot of attention to the case of Pluto, which has been mentioned on this blog. Classed as one of the nine planets of the solar system on its discovery in 1930, its status was changed in 2006 to that of a dwarf planet, alongside many other such objects with orbits outside that of Neptune. He notes the oddity that dwarf planets are not considered a sub-set of planets, but a parallel category, criticising this as incorporating neither consistent principles nor precise language, and suggesting that astronomy has much to learn from biological taxonomy in this respect.

Pluto (NASA image)

The Pluto debate raised a number of issues familiar in bibliographic classification. Was dual classification allowed, i.e. could Pluto be both a planet and a trans-Neptunian object? Was the classification to be made on purely scientific grounds? Dick quotes David Levy, the biographer of Pluto’s discoverer, Clyde Tombaugh, as arguing that “science wasn’t just for scientists or taxonomists, but for people, and normal people considered Pluto a planet”. Indeed, Pluto’s classification became a political issue, and not just in the sense of academic politics; the House of Representatives of New Mexico, where Tombaugh was a long-time resident, passed a resolution to the effect that when Pluto was visible in the New Mexico night sky it was a planet. Should the classification be dependent on its users? Planetary scientists consider Pluto a planet, as it has an atmosphere, geological changes, and weather, while astronomers, more interested in its size, shape, and orbit, do not.
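The dual-classification question can be put in information-organisation terms: may an object hold membership of more than one class at once? A small, purely illustrative sketch in Python (the class labels and memberships here are mine, chosen for illustration, not the IAU’s formal taxonomy):

```python
# A toy polyhierarchy: each object may belong to several classes at
# once, sidestepping the forced single-class choice that caused the
# Pluto controversy. Labels are illustrative, not IAU definitions.
memberships = {
    "Pluto":   {"dwarf planet", "trans-Neptunian object"},
    "Eris":    {"dwarf planet", "trans-Neptunian object"},
    "Neptune": {"planet"},
}

def classes_of(obj: str) -> set:
    """All classes an object is assigned to (empty set if unknown)."""
    return memberships.get(obj, set())

# Dual classification: Pluto is both a dwarf planet and a TNO.
print(sorted(classes_of("Pluto")))

# The inverse query any classification must also support:
# which objects are members of a given class?
tnos = {o for o, cls in memberships.items() if "trans-Neptunian object" in cls}
print(sorted(tnos))  # ['Eris', 'Pluto']
```

The design choice is the interesting part: a set-valued assignment quietly permits dual classification, whereas a single-valued one forces exactly the either/or decision the IAU had to make.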

The Pluto case clearly shows up the realist/constructivist dichotomy: Pluto’s existence is a reality, but what class it is assigned to is socially determined. Dick suggests that the less that is known conceptually about an object, the more central a role social construction plays. Had Pluto’s true size been known at the time of its discovery, together with the fact that there were many trans-Neptunian objects of similar size, then it would not have been declared a planet in the first place. The discovery of astronomical objects is an unveiling of nature, while the creation of classification systems is a wholly human invention; creation of classes within a system, and the assignment of objects to them, falls somewhere in the middle.

“In the end”, Dick writes, “despite being grounded in nature, the declaration of a new class of astronomical object is a socially determined exercise, and the construction of any classification scheme doubly so, as class is piled upon class in the attempt to order nature”. This viewpoint, fully justified by the detailed analyses in the book, is a useful corrective to claims of ‘ontological realism’ on the part of those who claim that scientific ontologies, or taxonomies for that matter, must necessarily be ‘true’. The same is true of biological ontologies, as shown by the studies of Charlie Mayor and Lyn Robinson on the Gene Ontology.

One general conclusion that Dick draws from this lengthy and detailed story is that astronomy classification schemes must have a “Goldilocks” quality – not too simple, and not too complex – if they are to be useful to the practitioner. This is likely to be true for all forms of classification. Another is that classification schemes can evolve over time, as new information and understanding is attained, but they must not evolve too much, or they will lose their original usefulness. These points are true for bibliographic classifications, and other systematic vocabularies in the information sciences. Scientific taxonomy and document classification still have much in common, and much to learn from each other.

Two encyclopaedias: both alike in dignity?

A recent article, originally appearing on an Australian radio website and widely republished, celebrating the 16th anniversary of Wikipedia, suggested that traditional encyclopaedias were now worthless, as Wikipedia had completed the process of organising knowledge begun by the Romans.

Well, up to a point. Wikipedia, however widely used it may be, is not the only online encyclopaedia in town. And it is interesting to compare Wikipedia with one of the others, which might be considered its polar opposite: the Stanford Encyclopaedia of Philosophy (SEP). What is intriguing about this is that the two sources are different in many respects, and united only in their status as free-to-use online sources, each very popular with its user group.

First and foremost, of course, there is their scope. Wikipedia is general, SEP is subject-specific. I think it would be good if those of us who are old enough to remember the world before Wikipedia took a moment from time to time to reflect on the fact that we now have an immediately accessible online source that gives us information on pretty much anything. I can recall that when I first gave a course on information resources I set the students a set of reference query exercises which varied from the ‘hard’ to the ‘almost impossible unless you know the answer in the first place’. Today, with Wikipedia, most, though not all, of them would be trivial. This sea-change in access to information across the spectrum, and its implications, is not always given enough credit. But, precisely because of its universal scope, Wikipedia encounters problems which are much less likely to be faced by a resource of more limited scope, as we see later. In particular, when the scope is closely defined, it is easier to control consistency and accuracy.

Having said that, the SEP covers a pretty wide variety of topics. I would not have immediately associated translating and interpreting, pacifism, republicanism, multiculturalism, friendship, marriage, molecular biology, sounds, or voting methods with philosophy, but they all have entries in the SEP.

As for the funding which allows free access to the sources, the SEP has an array of rather traditional forms of support for an academic endeavour: support from an institution, grants from philanthropic institutions, and an open access model providing added-value features for a fee. Funding for Wikipedia, and its supporting Wikimedia Foundation, comes largely from small donations from individual users – appeals for funding appear annually on Wikipedia – with some support from philanthropic donors.

Perhaps the most striking difference between the two encyclopaedias is the way in which each is created and edited. In the SEP, all articles are attributed to named individuals, and control is exercised by a board of named editors, according to the traditional model of the academic encyclopaedia. By contrast, Wikipedia articles are anonymous, and may be created and edited by any user. Control is exercised by a body of volunteer editors, who are typically identified only by an uninformative user name, and overall policy decisions are taken by a smaller group of elite editors. Concerns have been expressed about the make-up of the Wikipedia controlling group: 94% of Wikipedia’s active editors are male, with limited geographic diversity, Africa and Central Asia in particular being very underrepresented.

Wikipedia entries may be amended frequently; such amendments, their dates, and any reasons and discussion, can be found only by looking on talk pages, ‘behind’ the entry itself. SEP entries are amended only infrequently; the dates of the original entry and of subsequent revisions are shown on the main page. Whereas Wikipedia notes that it is a continually improving ‘work in progress’, with entries at different stages of development, SEP entries are all of a consistent standard, and unfinished material does not appear. This difference in consistency also applies to what topics are covered: in the SEP, a balanced coverage is promoted by the editorial board, while in Wikipedia coverage of topics depends solely on the interests of the contributors. And also to how the topics are covered: nothing would be published in the SEP if it were not broadly in line with the source’s standards of scope and depth, while in Wikipedia great variation can be seen, even among articles on very similar topics.

Finally, the approach to supporting articles with citations differs considerably. All SEP entries are fully referenced. Wikipedia asks for supporting citations, but many entries lack these, and are noted accordingly by the Wikipedia editors. This is one aspect in which librarians have been very active in trying to improve Wikipedia, through initiatives such as #1Lib1Ref, and it is generally agreed that this aspect of Wikipedia is improving, as I was reminded in a recent Twitter exchange.

So, in both cases it may be argued that quality is ensured by the encyclopaedia community; but the way this works out in practice is very different, the identifiable individual taking a much more prominent role in the SEP. And because of the process of creation and maintenance, consistency is much higher in the SEP.

The nature of the expertise of the creators is also very different in the two resources. The SEP relies on the justified expertise of its contributors and editors, whose academic status is publicly stated. Wikipedia relies rather on the communal expertise of its user groups, and its editors in particular. This is not to say that there is not considerable expertise involved in Wikipedia. On the contrary, one hears anecdotally that particular entries and sections are created and updated by expert individuals and institutions; but such expertise is not identified and foregrounded in the way that it is in the SEP. Indeed, Wikipedia explicitly states that “what is contributed is more important than the expertise or qualifications of the contributor”.

This leads directly to issues of validity of information, of misinformation and disinformation, which have nagged at Wikipedia since its inception. Apart from the fear of errors being inadvertently introduced by inexpert editors, Wikipedia was being accused of harbouring deliberately false information long before ‘alternative facts’ became a thing. This was largely a matter of individuals or organisations improving their own story, or degrading that of opponents.

Then there are the hoaxes and April Fool pranks, all of which Wikipedia commendably tries to identify and list. [Some of them seem too good to be hoaxes: I think that the BBC really should have made a series or two of Olimar the Wonder Cat, just to keep the record straight.] SEP does not have hoaxes; or, if it does, no one has found them yet.

This is not to say that Wikipedia is careless of misinformation and disinformation. On the contrary, its editors are assiduous, if somewhat inconsistent, in rooting it out. It caused some ironic amusement to see the Daily Mail newspaper voted a deprecated source by Wikipedia’s editors because of its poor fact-checking and sometimes outright falsification; both accusations which have been levelled at Wikipedia in the past, because of its assumption that whatever erroneous material might appear would be corrected by the community. Numerous studies of the accuracy of Wikipedia over the years have led to little more of a conclusion than that it all depends which bits you look at, and that it is generally improving. Fine for general interest purposes, but for something really important maybe we’d prefer something like the SEP.

Of course, it would be wrong to set these two sources up in some sort of opposition. Those who use the SEP for philosophical topics, no doubt use Wikipedia for other things. And there is no harm, for someone interested in philosophical topics, in having a look in Wikipedia to see if there’s an alternative viewpoint to SEP on your subject of interest; though personally I’d be cautious in reading too much into it if there is. And, for all its faults, the Wikipedia approach, with its inconsistencies and continual ‘work in progress’ status, is no doubt the only way of achieving such a comprehensive collection of information.

A recent article in Quartz online magazine claimed that the SEP has “achieved what Wikipedia can only dream of”, in providing rigorously accurate free online information. The comparison may not be a fair one, but it will be interesting to see whether Wikipedia, with the help of enthusiasts in the library community, moves in the SEP direction. Or whether, as Jutta Haider and Olof Sundin suggested in a thoughtful article seven years ago, Wikipedia is firmly within the Enlightenment encyclopaedic tradition, updated for the network age; in which case, perhaps it will be the model for the future.

Gutta percha: forgotten material of the communication revolution

Few other materials have had such a revolutionary impact on the world. And few others have been forgotten so quickly.
(Ben Wilson, Heyday: Britain and the birth of the modern world, Weidenfeld and Nicholson, London, 2016, p. xxiii)

Palaquim fruit tree (Rimbun Daham Arts centre)

Describing gutta percha as “the vanished material that made the telecommunication revolution possible”, Ben Wilson gives it centre-stage in his history of Britain in the 1850s. A form of rubber, gutta percha is derived from the sap of the palaquium fruit tree, native to what is now Malaysia; if drawn off and exposed to air, it solidifies. If heated, it becomes a pliable latex, and can be formed and hardened into whatever shape is needed. The local people had used it for centuries for items such as utensil handles and vases. Although British travellers had noted its strange properties in the sixteenth century, it was not until 1832 that a government doctor, William Montgomery, realised their significance.
Domestic items made from gutta percha (Atlantic Cable website)

Within a decade a new industry had developed. The hard, pliable latex was washed, folded into blocks, and brought by ship from Singapore to London, and to the new, and by the standards of the time, very high-tech, factory of the Gutta Percha Company, at Wharf Road, Islington. A report of a visit to the factory by a journalist in the early 1850s appeared in The Illustrated Exhibitor and Magazine of Art. He noted that “We enter a modest-looking doorway between a pair of folding gates, on which the words ‘Gutta Percha Company’ are printed, and we become speedily aware that a branch of manufacture of which we hitherto knew next to nothing is being carried on within”. In the manufacturing process, the gutta-percha was boiled, shaved by a cutting machine, boiled again, then kneaded at high temperature, cooled in another machine and rolled into sheets, to be sold to manufacturers around the world. Gutta-percha became ubiquitous in mid-Victorian daily life, used in tents, clothing, shoes, jewellery, domestic appliances and furniture. Waterproof, and resistant to acids, salt water, and chemicals, it was invaluable to industry for many purposes.
Atlantic cable, encased in gutta percha (Atlantic Cable website)

Its significance for information history is that, being not only strong and waterproof but also resistant to deterioration when submerged for long periods in salt water, it proved the perfect insulator for the electric wiring of undersea telegraph cables. The first cross-Channel cable, laid by HMS Blazer in 1851, was made from 100 miles of copper telegraph wire encased in a tube of gutta-percha, provided by the Gutta-Percha Company; four lengths were twined together with hemp, and encased in galvanised iron wiring as protection. In 1858 it was used to insulate the first (unsuccessful) trans-Atlantic cable.
Subsequently, the Gutta-Percha Company amalgamated with Glass, Elliot and Co., a maker and layer of cables, to form the Telegraph Construction and Maintenance Co. Ltd. (Telcon), who made the successful Atlantic cable laid in 1866. Its use in supporting the infrastructure of the communications revolution continued until after the end of the nineteenth century.
HMS Agamemnon laying the first Atlantic cable in 1858 (National Maritime Museum)

Gutta percha fell from widespread use during the twentieth century, replaced for most purposes by synthetic materials, particularly polyethylene. It is still, however, widely used in dentistry; a worthwhile use certainly, but perhaps a come-down for a material which played a major part in the instantiation of the modern information age.

Why LIS doesn’t have a quick fix for the post-factual society … and why that’s OK

The irony is that by now it was supposed to be perfect. For most of my working life in the library/information area, first as a practitioner and then as an academic, the emphasis was on providing access to information. Most of the time, whatever the topic, there was never enough information, and accessing what there was could be difficult. Then came the web, Google, Wikipedia, social media, mobile information, open access, and the rest. So that now, we should be living in an information nirvana, where we have ready access to all the information we could need, for any purpose. And indeed, to an extent, that is what we have.

But we also have, as Luciano Floridi has pointed out, a situation where we can carry a device in our pocket which gives us access to the accumulated knowledge of humanity; and we mostly use it to send each other pictures of cats, and to have arguments with people we don’t know. [Not that I object at all to pictures of cats, but you get the idea.] More seriously, we have fake news, alternative facts, a post-truth and post-factual society, filter bubbles, and all the other accompaniments of what seems to be a deliberate retreat from the rational, knowledge-based world that many of us believed we were naturally headed for.

What to do? Specifically, what can be the particular response of the LIS discipline and profession, as distinct from the political and social ideas to which we might individually or collectively subscribe? Some early thoughts on this were presented by my colleague Lyn Robinson, in the immediate aftermath of the Brexit vote. Debate has, of course, intensified since then. What I hear are, for the most part, calls for more education, more information access, and more information literacy. Now, as someone who has been employed in education for nearly three decades, much of it spent researching, teaching and promoting issues of information access and information literacy, I am certainly not going to argue that we don’t need more of all three. But I have to say, I don’t think they are enough.

On education, I believe (and obviously have a strong personal interest in believing) that the more the better. But one cannot ignore the evidence that some highly educated people are rather stupid, and seem to delight in promoting extra stupidity; nor that many relatively uneducated people are rather wise. So I think that education, while an unarguable public good, isn’t of itself enough.

Information access? Yes, that’s good, on the whole. But, alas, we can see that misinformation and disinformation proliferate just as well as valid information. I was particularly struck by seeing this unfold in the Twitter responses to the Quebec shootings of 29th January, with false information repeated and embellished to meet the need for facts to suit pre-determined conclusions. Over fifteen years ago, Lyn Robinson and I argued, in a paper on libraries and open society, that providing a ‘free flow of information’, with access to all available and relevant sources, is necessary but not sufficient to support an open and democratic society. And only this month, a paper in The Information Society argued cogently that providing access to the ‘right’ information does not in itself bring about desirable policy outcomes. It’s also worth remembering that one of the few things that have been established beyond doubt by several decades of information behaviour research is that everyone – academics and politicians included – deals with information through satisficing and the principle of least effort. And what could involve less effort – physical, technical and mental – than getting from Twitter and Facebook a selection of news and information recommended to fit in with your prior beliefs? Full and unimpeded information access is a necessary precondition for improving things, but let’s not imagine that more of it will be any kind of solution.

Information literacy, then? (Or digital literacy, or metaliteracy, or media literacy, or whatever; it doesn’t matter.) Isn’t more information literacy the answer, getting people to choose good sources, and to reject misinformation and disinformation? Well, up to a point. But, as many people more qualified than me on this topic have pointed out, it can’t be the full answer. If we take a rather simple understanding of information literacy – the ability to find, access and use appropriate information – then many of those who appear to embody the post-truth era are highly information literate; it’s just that they choose to regard as appropriate only those sources which support their own world view. Perhaps then we just need to focus more on the ‘critical appraisal’ aspect of IL? Not so; our post-factual friends will say, rightly, that they are highly critical of any information that doesn’t support what they know to be right. And anyway, as Lane Wilkinson has pointed out, the most widely-used information literacy frameworks and standards have almost nothing to say about ‘truth’ or ‘facts’. Asking “did media literacy backfire?”, danah boyd has given a thoughtful analysis of the failure of information literacy to enable people to cope with a “very complicated – and in many ways overwhelming – information landscape”. While information literacy is certainly needed more than ever, it may need to change its nature if it is to make the impact it should.

Lyn Robinson and I have argued for the promotion of understanding, as much as the provision of information, as a remit for the library/information disciplines and professions. I think that this might go some way to overcoming the ‘alternate facts’ issue, but in itself it is not a panacea. We take understanding to be a coherently arranged body of truthful information, with the individual items linked by logical and meaningful relations. However, such understanding, if we remove the ‘truthful’ criterion, may be possessed by anyone who has given some thought to an issue, including conspiracy theorists of all kinds, and deniers of everything from climate change and the moon landings to the Holocaust. And these people will all be able to point to a set of coherent information sources to back up their understanding. Perhaps what is needed is the promotion of some form of ‘open understanding’; you don’t really understand something until you have genuinely considered the alternative viewpoints. I think Karl Popper, who argued that we should subject our ideas to the most severe criticism we can muster, would approve of that approach. It is hard work, though, for any of us, and probably unappealing to the conspiracy theorists and deniers.

So, what should we, the LIS profession and discipline, do now? I have no convincing answer; at least not one that offers an immediate quick-fix. Certainly, it will involve a combination of carrying on with our information access/literacy work, plus activities on a wider canvas, with a specific ethical commitment to opposing the post-fact/post-truth nexus. As Georgina Cronin says, in her wide-ranging analysis of what librarians can do in a post-truth society: “Educate. Vote. Protest. Whatever it is, do it”.

Perhaps one specific, and important, role for LIS, actually quite a traditional one, is to keep the information environment, or that part of it which may be to a degree under our influence if not control, in a clean, tidy and welcoming state. This will involve a variety of activities, from reporting abuse on social media, to helping to remove fake news, to adding citations to Wikipedia, and much more. In the new information environment this process, which Floridi has dubbed the moral duty to both clean and to restore the infosphere to a proper ethical status, is both very difficult and very important.

But I am not too disconcerted by the lack of any immediate quick-fix. If Luciano Floridi is right, then we are living through an ICT-led ‘fourth revolution’ (following those associated with Copernicus, Darwin and Freud), leading to a radically new ‘hyper-historical’ information-dominated environment; certainly that fits with the experience of those of us who have been around the information world for some time. And if that is so, then we should expect to have to take a while to work out our response: as danah boyd says, “the path forward is hazy … no simple band aid will work”. And Floridi has commented more than once that, precisely because the changes we are encountering are so great, so is the need to step back and think calmly, philosophically, and at length, about how to react, rather than seek a rapid response.

So, if we in LIS have no immediate answer to the problems of the new information environment, maybe that’s a good thing. Maybe, in being troubled how best to proceed we are reacting in the right way, and our ultimate contribution will be all the more effective. Immediate action certainly, but please let’s back it up by long term reflection.

Invented and discovered: mathematics in Popper’s World 3

I have always had an interest in mathematics. This is despite, or perhaps because of, never being very good at the subject at school, and avoiding it to the maximum extent compatible with getting a science degree at university. Not that I have any fondness for what is called ‘recreational mathematics’, which has always seemed to me to be a contradiction in terms, nor any desire to do substantive academic work involving mathematics, for which I am ill-equipped. My interest rather has been in the qualitative ‘meta’ of the subject: what is mathematics, and how does it relate to the real world? In particular, I always wondered about two points. Why are the mathematical concepts developed by humanity so useful in accounting for the behaviour of the physical universe – what the physicist Eugene Wigner called the “unreasonable effectiveness” of mathematics? And whether mathematics was something independent of people, waiting somewhere to be discovered, or whether it was a human creation.
Quite a long while ago, I was given, for a Christmas present, a copy of John Barrow’s ‘Pi in the Sky‘, an account of the history and nature of mathematics, addressing exactly these issues. It also introduced me to the three main philosophical approaches to mathematics (of which I give only the simplest summary here). The Platonist approach holds that mathematical entities and relationships have an eternal existence, independent of humanity, in some kind of world of abstract ideas. The formalist approach believes that maths is created by humans, and is simply the manipulation of symbols according to arbitrary rules, without any associated meaning. And then there are a number of constructivist approaches, including inventionism and intuitionism, which hold that mathematics is created by humans under the influence of psychological and cultural factors. Barrow carefully pointed out the flaws in each of these, which mean that there is no consensus among mathematicians as to the very basis of what they are doing, and how, and why. Although Barrow was writing over twenty years ago, as I understand it the same is true today.

As I read Barrow, I was struck by what seemed, and still seems, to be a particular contradiction. When mathematicians describe what they do in creating new mathematics, they almost always do so in terms of discovery rather than invention. They think they are finding out things that exist independently of them, rather than creating them; and indeed the many instances when the same new mathematics is derived by several people in ignorance of one another seems to confirm this. And yet it seems self-evident that mathematics is a human creation, since we observe mathematicians creating it.

It occurred to me that one solution to this conundrum might be given by Karl Popper’s idea of World 3 of objective, communicable information. If we can agree that mathematical entities are inhabitants of World 3, as Popper said they were, then we can see how mathematics might be both invented and discovered. World 3 objects are created by humanity, but once created they take on, in a sense, a life of their own, with implications and consequences not necessarily understood by their creators. Mathematics is then indeed discovered, and mathematicians’ sense of what they are doing is not false, but it is discovered in the human-created World 3.

This struck me as a particularly nice idea, and I was surprised that no-one else seemed to have thought of it.

Reuben Hersh
Well, of course, someone had, though I only found out years later. Back in 1981, more than ten years before Barrow’s book appeared, the American mathematician and philosopher Reuben Hersh had presented the idea in a book co-authored with Philip J Davis, ‘The mathematical experience‘.

“Start from two facts”, Hersh wrote. “(1) mathematics is a human creation, about ideas in human minds; (2) mathematics is an objective reality, in the sense that mathematical objects have definite properties, which we may or may not be able to discover. Platonism is incompatible with Fact 1, since it asserts that maths is independent of humans; constructionism is incompatible with Fact 2, since there are no properties until they are proved constructively; formalism with both, since it denies the existence of mathematical objects… Mathematics is an objective reality that is neither subjective nor physical. It is an ideal (ie non-physical) reality that is objective (external to the consciousness of any one person). In fact, the example of mathematics is the strongest, most convincing proof of such an ideal reality.

“This is our conclusion, not to truncate mathematics to fit a philosophy too small to accommodate it – rather, to demand that the philosophical categories be enlarged to accept the reality of our mathematical experience…. The recent work of Karl Popper provides a context in which mathematical experience fits without distortion. He has introduced the terms World 1, 2, and 3, to distinguish three major levels of distinct reality. World 1 is the physical world, the world of mass and energy, of stars and rocks, blood and bone. The world of consciousness emerges from the material world in the course of biological evolution. Thoughts, emotions, awareness are nonphysical realities. Their existence is inseparable from that of the living organism, but they are different in kind from the phenomena of physiology and anatomy; they have to be understood on a different level. They belong to World 2.

“In the further course of evolution, there appear social consciousness, traditions, language, theories, social institutions, all the nonmaterial culture of mankind. Their existence is inseparable from the individual consciousness of the members of the society. But they are different in kind from the phenomena of individual consciousness. They have to be understood on a different level. They belong to World 3. Of course, this is the world where mathematics is located.”
“Mathematical statements”, Hersh concluded, “are meaningful, and their meaning is found in the shared understanding of human beings, not in an external nonhuman reality. So mathematics deals with human meanings in the context of culture, it is one of the humanities; but it has a science-like quality, since its findings are conclusive, not matters of opinion…. As mathematicians, we know that we invent ideal objects, and then try to discover the facts about them.”

Hersh had stated the idea that I had, formulated more clearly than I could have done, fifteen years beforehand. Ah well.

Hersh has gone on to develop his ideas, though his more recent writings don’t reference Popper, describing mathematics instead as being discovered in a human-created socio-conceptual world (which still sounds pretty much like Popper’s World 3 to me). His recent thoughts are nicely presented in a 2014 book, ‘Experiencing mathematics: what do we do, when we do mathematics?‘, in which he presents the socio-conceptual idea as an explicit alternative to the formalist, Platonist, and intuitionist approaches.

So, although I didn’t think of it first, it seems to me that the mathematical World 3 is a more realistic basis for describing what mathematics is, and what mathematicians do, than any of the alternatives. It is also another vindication of Popper’s World 3: though the idea has fallen out of fashion to an extent, it remains a fruitful concept for any discipline which deals with abstract concepts at its heart.