Supporting truth and promoting understanding: knowledge organization and the curation of the infosphere

This is an updated text of a keynote address given at the Fifteenth International ISKO Conference, Porto, 9th July 2018. A brief account of the conference is given in an earlier blog post.


David Bawden and Lyn Robinson

Abstract
This paper considers the response of knowledge organisation (KO) to a variety of problems and pathologies associated with the post-factual, or post-truth, society. It argues that there are no quick fixes, but that KO has several roles to play in mitigating these problems, particularly in the promotion of understanding, as well as the communication of information and the sharing of knowledge. Borrowing from Floridi’s Philosophy of Information, it argues that KO, and more broadly library and information science (LIS), should address these problems as part of our role as ‘curators of the infosphere’.

Introduction
This paper addresses two of the main themes of this conference, by considering a new foundational direction and purpose for knowledge organisation, as a response to certain societal challenges to the effective communication of information and knowledge. The new direction involves a realignment of purpose: from knowledge organisation being applied in the cause of the effective provision of information and documents to its application for the explicit purpose of promoting understanding. The societal challenges which this may address are the much-discussed problems of the post-factual or post-truth society, with its accompanying phenomena of fake news, the death of expertise, and the rest. This paper builds on a session at the ISKO UK 2017 annual meeting, devoted to these issues, and goes beyond it to consider how the promotion of understanding may in itself contribute to a solution. It uses Luciano Floridi’s Philosophy of Information as a theoretical backdrop throughout.

Societal problems
The linked collection of social problems described by such terms as ‘fake news’, ‘alternative facts’, ‘post-truth society’, ‘post-factual society’, ‘death of expertise’, ‘filter bubbles’, and ‘social media echo chambers’ is well known. Indeed, ‘post-truth’ was Oxford Dictionaries’ Word of the Year for 2016, and ‘fake news’ was Collins Dictionaries’ equivalent for 2017, with ‘echo-chamber’ on its shortlist. Together, they describe a situation where objective factual truth is denied, expert, informed opinion is derided, and exposure to novel and challenging ideas is actively avoided. This situation is in many respects the antithesis of what library and information science (LIS) has sought to promote, and has caused soul-searching within the theory and practice of LIS (see, for example, Bawden 2017 and Cooke 2017). It has not arisen de novo: fake news has a long history (Cooper 2017), while the philosopher Bertrand Russell observed more than seventy years ago that “.. most people go through life with a whole world of beliefs that have no sort of rational justification… People’s opinions are mainly designed to make them feel comfortable: truth, for most people is a secondary consideration.” (Russell 1942). Our present concerns are the culmination of a series of changes in the social and informational environment, brought into stark relief by political issues in Europe and North America from 2016, accompanied, and to an extent brought about, by a torrent of misinformation and disinformation (Clark 2017, Corner 2017, Wardle and Derakhshan 2017). A Pew Internet study carried out in early 2017 found a panel of experts almost evenly divided as to whether the problems could be ameliorated or would become worse over the next decade (Anderson and Rainie 2017).

Naturally, many well-intentioned proposals have been advanced to remedy the situation. Some have addressed deep-seated issues: in educational systems, in economic and social policy, in the use and misuse of algorithms, in regulation of the media (including social media), and in political structures. Others, including several advanced from within the LIS community, have recommended more immediate and specific remedies; see Clark (2017) for an overview. Some have focused on the development of IT solutions, particularly with the algorithms used to filter news in social media and with automated fact checking and comparison (see, for example, Cooper 2017, Madrigal 2017 and Tomchak 2017). Others have advocated the enhancement of information and digital literacies (see, for example, Cooke 2017, Polizzi 2017 and Poole 2017), the restoration of the importance of expert objective fact checking and ‘kite marks’ (see, for example, Cooper 2017, Jirotka and Webb 2017 and O’Leary 2017), and the improvement of education and training for information providers (see, for example, UNESCO’s new curriculum for ‘Journalism: fake news and disinformation’).
Our view is that, although these sorts of initiatives may well have value, taken alone they will have relatively little impact. The problems and issues are deep-rooted, ‘systemic’ as Beckett (2017) puts it, and are not amenable to any ‘quick fix’. As the philosopher of information Luciano Floridi emphasises, the more important the problem, the more it needs a long period of reflection to find the best solution. And as we have suggested more specifically, LIS has no quick fix for these issues, and we should not pretend that we have; Beckett (2017) suggests the same for journalism in respect of fake news. We believe that LIS has a very considerable contribution to make, but it must be at a deeper level than a tweak to an algorithm, a guide to information evaluation, or a reliance on the manipulation of big data (Robinson 2016, Bawden 2017, Poole 2017); as Floridi (2016) puts it, solving the problems of fake news and the rest requires a reshaping of the infosphere, our whole information environment and our interactions within it.

It seems to us that part of LIS’s contribution to a longer-term approach to these issues will certainly lie in knowledge organisation. This has already been noted by others. ISKO UK devoted a session at their 2017 annual conference to ‘False narratives: developing a KO community response to post-truth issues’ (ISKO 2017), and a plenary discussion of the Dublin Core Metadata Initiative in October 2017 debated ‘A metadata community response to the post truth information age’ (DCMI 2017). We will include points made at these sessions, and consider some other possibilities, later. First, we will make a slight detour, and think about the nature of understanding.

Promoting understanding
One way of expressing the problems of post-factual, expertise-less society is to say that it lacks a full and clear understanding of the issues facing it (Robinson 2016, Bawden 2017). We have argued that LIS should take a new stance of focusing on the promotion of understanding as much as on the provision of information and the sharing of knowledge in an era when, for most people for the most part, information is provided through search engines, particularly Google, through a few encyclopedic websites, particularly Wikipedia, and through social media (Bawden and Robinson 2016A, 2016B). In this environment, we contend, the promotion of understanding falls, arguably uniquely, within the remit of LIS; this seems to be a novel suggestion, although it has been supported by Gorichanaz (2016, 2017), and fits within the Floridi-derived idea of LIS professions as ‘curators of the infosphere’ (Bawden and Robinson 2018).

There is, we may say, little understanding of understanding, inasmuch as it is defined very differently by various authors. We may note that Ackoff (1989), in his original formulation of the well-known data-information-knowledge-wisdom model, included understanding, which he characterised as ‘an appreciation of why’, as a high-level concept between knowledge and wisdom. In the widely-used educational taxonomy due to Bloom, on the other hand, it appears as a rather low-level concept, above remembering, but below applying, analysing, and so on (Anderson, Krathwohl and Bloom 2001).

On the basis of an analysis of various conceptions of understanding (for details of which, see Bawden and Robinson 2016A, 2016B), we propose a definition of understanding, relevant to the purposes of LIS, following Ackoff, and situated within Floridi’s Philosophy of Information:

Information is taken to be well-formed, meaningful, truthful data. Knowledge is taken to be information organised in a network of account-giving inter-relations. Understanding occurs when a conscious entity, supported as necessary by information systems, appreciates the totality of a body of knowledge, including its interconnections. The extent to which the knowledge is incomplete, contradictory or false determines the degree to which understanding is less than complete.

Developing understanding in this sense would seem to be a worthy aim for LIS, and one which may go some way towards helping to mitigate the societal problems noted above. However, we need to note that people may have an understanding of a topic, in this sense, based on misinformation or disinformation, and may be impervious to contradictory information (see, for example, Requarth 2017); this may be reinforced by emotional attachments to certain viewpoints (Beckett 2017, Poole 2017). We should therefore add a rider, to the effect that the extent to which someone is open to changing their views on the basis of new information, in effect the extent of their curiosity, is also a measure of the completeness of their understanding. This could amount to a commitment (individual or societal) to accepting, indeed actively seeking, new knowledge, even if it be potentially disruptive of current understanding (Bawden and Robinson 2016B). For LIS, this fits in well with the suggestions of Beckett (2017) that news media should provide content that is stimulating and challenging as well as relevant, and of Finch (2017) that libraries should be a safe place to indulge curiosity as much as a trusted dispenser of facts and information, or a repository of the truth.

Expressed in this way, we can see that the development of understanding may in itself be a powerful force for counteracting the problems discussed above; Bradley (2017) explicitly notes that helping people to understand and use items in their information environment is a role for libraries in countering the fake. It is important to note that developing understanding, in the sense meant here, is a broad and general approach, rather than a specific tool or technique, and goes far beyond didactic approaches to the evaluation of information. Information systems are beginning to be developed to support understanding, from the relative conceptual simplicity of Google’s Knowledge Graph, which integrates information from a variety of sources with the aim of giving a quick overview, to systems explicitly aiming at developing understanding from a corpus of sources. For an early example of the latter, see Donne et al. (2012).

We now turn to consider the ways in which knowledge organisation may contribute to these linked aims: the promotion of understanding, and the mitigation of the problems of the post-factual society.

KO’s contribution
As suggested above, a variety of contributions to the amelioration of these problems have been suggested. At the risk of over-simplification, we can consider them under five headings.

First, we might notice the suggestion that an ontology, taxonomy, terminology, or glossary of the post-truth society, its pathologies, and potential solutions, may be of value in itself, as a way of clarifying the concepts and their inter-relations, and as a guide to action (Bradley 2017, Clark 2017, Poole 2017, Wardle and Derakhshan 2017). For the most detailed example yet extant, see Synaptica’s Post-Truth Forum Knowledge Base.
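To make the idea concrete, such a vocabulary can be represented, at its simplest, as a set of concepts carrying broader, narrower, and related-term relations, in the manner of a thesaurus. The following Python fragment is a toy sketch of our own; the terms and relations are illustrative only, and do not reproduce Synaptica’s Post-Truth Forum Knowledge Base or any published vocabulary.

```python
# Toy fragment of a post-truth vocabulary: each concept maps to its broader
# term and its related terms. The term choices are illustrative assumptions,
# not a published scheme.
taxonomy = {
    "information disorder": {"broader": None, "related": []},
    "misinformation":  {"broader": "information disorder",
                        "related": ["disinformation"]},
    "disinformation":  {"broader": "information disorder",
                        "related": ["misinformation", "fake news"]},
    "fake news":       {"broader": "disinformation",
                        "related": ["satirical article"]},
    "filter bubble":   {"broader": "information disorder",
                        "related": ["echo chamber"]},
    "echo chamber":    {"broader": "information disorder",
                        "related": ["filter bubble"]},
}

def narrower(term):
    """Return the concepts whose broader term is `term`."""
    return [t for t, rels in taxonomy.items() if rels["broader"] == term]

print(narrower("disinformation"))  # ['fake news']
```

Even so small a structure makes the inter-relations explicit: disinformation and misinformation are distinguished as siblings, while ‘fake news’ is placed under disinformation, reflecting its deliberate character.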

Second, there is what we might think of as the classic, if limited, response of KO: adaptation of methods of resource description, and revision of existing descriptions. One popular example of the former is the idea of ‘credibility metadata’: the addition of terms aimed at reducing misinformation and disinformation by establishing the veracity of resources (Wardle and Derakhshan 2017). This may involve markers of location, time, etc. on items, or metadata noting ‘quality factors’, such as that a source has a corrections policy, or that the author of an item has written on the topic before (Cuellar 2017). Conversely, indexing may directly address the ‘fake’ nature of an item, as in the idea of adding a term for ‘satirical article’ (Quinn 2016, Cuellar 2017). The latter response, by which discredited materials may be identified as such, is well exemplified by the controversy over the reclassification of Holocaust denial literature as ‘historiography’ rather than ‘history’ (SimonXIX 2017).
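As an illustration, a credibility metadata record of this kind might be sketched as follows. The field names and the crude scoring are hypothetical choices of our own, loosely following the quality factors mentioned by Cuellar (2017); they do not correspond to any published metadata standard.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical 'credibility metadata' record: field names are our own
# illustrative assumptions, not a published standard.
@dataclass
class CredibilityMetadata:
    location: Optional[str] = None       # where the item was produced
    timestamp: Optional[str] = None      # when it was produced
    corrections_policy: bool = False     # source publishes corrections
    author_prior_coverage: bool = False  # author has written on topic before
    genre_flags: List[str] = field(default_factory=list)  # e.g. 'satirical article'

    def quality_score(self) -> int:
        """Crude illustrative score: count the positive quality factors."""
        return sum([self.corrections_policy,
                    self.author_prior_coverage,
                    self.location is not None,
                    self.timestamp is not None])

item = CredibilityMetadata(location="London", corrections_policy=True,
                           genre_flags=["satirical article"])
print(item.quality_score())  # 2
```

Note that the record combines the two approaches mentioned above: veracity markers (location, time, quality factors) and direct flagging of an item’s nature (the ‘satirical article’ genre term).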

These responses are generally implemented by traditional intellectual metadata construction. The third category is the use of automated classification and indexing to attempt to identify and categorise fake news and other pathologies of the post-factual society. Of the many developments of this kind, a good, albeit simple, example of a classifier to distinguish genuine news items from fake is given by McIntire (2017). More complex examples, based on more sophisticated machine learning and classification techniques, are likely to play an increasing role (Cooper 2017); both Facebook and YouTube announced innovations to this end in mid-2018.
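The flavour of such classifiers can be conveyed with a deliberately tiny bag-of-words Naive Bayes sketch. This is not McIntire’s actual model; the training snippets below are invented for illustration, and no real-world accuracy is implied.

```python
import math
from collections import Counter

# Invented miniature training set: (text, label) pairs. Real systems train on
# thousands of labelled articles; this merely illustrates the mechanics.
train = [
    ("shocking secret cure doctors hate", "fake"),
    ("you won't believe this miracle trick", "fake"),
    ("aliens secretly control the election", "fake"),
    ("parliament debates new budget measures", "genuine"),
    ("study published in peer reviewed journal", "genuine"),
    ("minister announces infrastructure funding plan", "genuine"),
]

# Count word occurrences per class.
counts = {"fake": Counter(), "genuine": Counter()}
for text, label in train:
    counts[label].update(text.split())

def classify(text):
    """Pick the label maximising the add-one-smoothed log-likelihood."""
    vocab = set(counts["fake"]) | set(counts["genuine"])
    best, best_score = None, -math.inf
    for label, c in counts.items():
        total = sum(c.values())
        score = sum(math.log((c[w] + 1) / (total + len(vocab)))
                    for w in text.split())
        if score > best_score:
            best, best_score = label, score
    return best

print(classify("shocking miracle cure announced"))  # fake
```

The classifier simply asks which class makes the item’s words most probable; production systems replace the word counts with far richer features, but the categorisation logic is the same in kind.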

Fourthly, there are those KO techniques which directly support curiosity which, as noted above, is a powerful force for finding alternative perspectives, breaking filter bubbles, and building understanding. In this respect, another long-established aspect of KO, classification techniques with their ability to show both hierarchical and associative relations, may be of particular importance.

It is worth considering, from the perspective of the issues discussed here, whether any particular form, or theory, of classification may be most appropriate. In particular, pragmatic or critical classification (Hjørland 2017) appears to be something of a two-edged sword. On the one hand, it may support, or reflect, an understanding of the world helpful to an individual or a group in developing understanding, and coincide with their emotional responses to issues; on the other hand, such an organisation may simply reinforce filter bubbles. It may be that systems could be developed to allow a ready comparison of alternative classifications, assisting curiosity-driven exploration of different perspectives. We may also need to consider the status of classical classifications, based on a single agreed picture of the world, and approximating as closely to truth as may be possible. Are these sustainable, at a time when alternative facts seem as viable as any others, and when expertise is said to be dead? They appear to be the antithesis of this negative viewpoint. The status and role of classification in the post-truth era seems to be an area in need of thoughtful research.

Fifthly, and finally, it seems to us that to deal adequately with current problems, KO must fully recognise the deep and irreversible changes in the information environment brought about by the shift to what Floridi categorises as ‘infosphere’ and ‘onlife’ in which our digital and physical lives merge, and information, contextual and mobile, is central to our society, and indeed to our humanity (Floridi 2014, Bawden and Robinson 2018). This is a long-term and far-reaching challenge, encompassing and going beyond the challenges of the post-truth society, and one in which KO should have a unique position.

Conclusions
There are no quick fixes to the problems set out above. KO cannot solve these problems alone, any more than can the wider LIS discipline; more far-reaching and structural changes – educational, technical, legislative, regulatory, and political – are needed for that. However, KO can play a significant role in improving the situation, using a combination of classic KO concepts and familiar KO practice, integrated with newer technological and organisational environments. In this way, by opposing misinformation and disinformation and promoting understanding, we may justify a claim to be curators of the infosphere.

Acknowledgements
We are grateful to Nick Poole and to Vanda Broughton for helpful comments and advice.

References
Anderson, L., Krathwohl, D. and Bloom, B. (2001). A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives. New York: Longman.

Anderson, J. and Rainie, L. (2017). The future of truth and misinformation online. Pew Research Center, Internet and Technology, October 19 2017. Available at http://www.pewinternet.org/2017/10/19/the-future-of-truth-and-misinformation-online, accessed 2 January 2018.

Bawden, D. (2017), Why LIS doesn’t have a quick fix for the post-factual society, and why that’s OK [blog post]. Available at https://theoccasionalinformationist.com/2017/02/02/why-lis-doesnt-have-a-quick-fix-for-the-post-factual-society-and-why-thats-ok/, accessed 2 January 2018.

Bawden, D. and Robinson, L. (2018). Curating the infosphere: Luciano Floridi’s Philosophy of Information as the foundation for library and information science. Journal of Documentation, 74(1), 2-17.

Bawden, D. and Robinson, L. (2016A). Information and the gaining of understanding. Journal of Information Science, 42(3), 294-299.

Bawden, D. and Robinson, L. (2016B). “A different kind of knowing”: speculations on understanding in light of the Philosophy of Information. Paper presented at the 9th CoLIS (Conceptions of Library and Information Science) conference, Uppsala, June 25 2016. Open access version in the Humanities Commons at http://dx.doi.org/10.17613/M65046

Beckett, C. (2017). Dealing with the disinformation dilemma: a new agenda for news media [blog post]. Available at http://blogs.lse.ac.uk/mediapolicyproject/2017/12/22/dealing-with-the-disinformation-dilemma-a-new-agenda-for-news-media/, accessed 4 January 2018.

Bradley, F. (2017). Valuing truth in an age of fake [blog post]. Available at http://www.rluk.ac.uk/about-us/blog/valuing-truth-in-the-age-of-fake, accessed 4 January 2018.

Clark, D. (2017). Scoping out post-truth issues and how might KO help. Paper presented at the ISKO UK 2017 annual meeting, presentation available at https://davidclarke.blog/wp/wp-content/uploads/Clarke_ISKO_TRUTH_20170912.pdf, accessed 3 January 2018.

Cooke, N.A. (2017). Facts: information behaviour and critical information consumption for a new age. Library Quarterly, 87(3), 211-221.

Cooper, G. (2017). False news – a journalist’s perspective. Paper presented at the ISKO UK 2017 annual meeting, presentation available at http://www.iskouk.org/sites/default/files/CooperSlidesISKO-UK2017-09-12.pptx, accessed 3 January 2018.

Corner, J. (2017). Fake news, post-truth and media-political change. Media, Culture and Society, 39(7), 1100-1107.

Cuellar, A. (2017). Combating fake news with metadata [blog post]. Available at http://blog.quark.com/2017/12/combating-fake-news-metadata, accessed 3 January 2018.

DCMI (2017). Dublin Core Metadata Initiative: responding to the post-truth phenomenon. Available at http://www.dublincore.org/news/2017/2017-10-02_responding_to_post_truth_phenomenon, accessed 3 January 2018.

Donne, S. et al. (2012). Rapid understanding of scientific paper collections: integrating statistics, text analytics, and visualization. Journal of the American Society for Information Science and Technology, 63(2), 2351-2369.

Finch, M. (2017). Curiosity vs the post-truth world [blog post]. Available at https://mechanicaldolphin.com/2017/03/13/curiosity-vs-the-post-truth-world/, accessed 2 January 2018.

Floridi, L. (2014). The fourth revolution: how the infosphere is shaping human reality. Oxford: Oxford University Press.

Floridi, L. (2016). Fake news and a 400-year-old problem: we need to resolve the ‘post-truth’ crisis. The Guardian, 29 November 2016, available at https://www.theguardian.com/technology/2016/nov/29/fake-news-echo-chamber-ethics-infosphere-internet-digital, accessed 2 January 2018.

Gorichanaz, T. (2016). There’s no shortcut: building understanding from information in ultrarunning. Journal of Information Science, 43(5), 713-722.

Gorichanaz, T. (2017). Applied epistemology and understanding in information studies. Information Research, 22(4), paper 776, available at http://www.informationr.net/ir/22-4/paper776.html, accessed 2 January 2018.

Hjørland, B. (2017). Classification. In ISKO Encyclopedia of Knowledge Organization, available at http://www.isko.org/cyclo/classification, accessed 3 January 2018.

ISKO (2017). Session 5: False narratives – developing a KO community response to post-truth issues. UK Chapter of the International Society for Knowledge Organization, available at http://www.iskouk.org/content/sessions/session-5-false-narratives-developing-ko-community-response-post-truth-issues, accessed 3 January 2018.

Jirotka, M. and Webb, H. (2017). Spotting the fake [blog post]. Available at https://blog.esrc.ac.uk/2017/07/28/spotting-the-fake, accessed 2 January 2018.

Madrigal, A.C. (2017). Google and Facebook failed us. The Atlantic, 2 October 2017, available at https://www.theatlantic.com/technology/archive/2017/10/google-and-facebook-have-failed-us/541794, accessed 2 January 2018.

McIntire, G. (2017). On building a fake news classification model [blog post]. Available at https://opendatascience.com/blog/how-to-build-a-fake-news-classification-model, accessed 3 January 2018.

O’Leary, M. (2017). Fact-checkers resist alternative facts. Information Today, October 2017, pp. 18-19.

Polizzi, G. (2017). Critical digital literacy: ten key readings for our distrustful media age [blog post]. Available at http://blogs.lse.ac.uk/mediapolicyproject/2017/12/15/critical-digital-literacy-ten-key-readings-for-our-distrustful-media-age, accessed 2 January 2018.

Poole, N. (2017). Why ‘facts matter’ – evidence, trust and literacy in a post-truth world. Paper presented at the ISKO UK 2017 annual meeting.

Quinn, B. (2016). Helping to fix the fake news problem with metadata. Available at https://medium.com/@brendanquinn/weighing-into-the-fake-news-problem-with-metadata-ebe067fac0e7, accessed 3 January 2018.

Requarth, T. (2017), Scientists, stop thinking explaining science will fix things. Slate, 19 April 2017, available at http://www.slate.com/articles/health_and_science/science/2017/04/explaining_science_won_t_fix_information_illiteracy.html, accessed 2 January 2018.

Robinson, L. (2016). Documentation in the post-factual society: or what LIS did next, after Brexit [blog post]. Available at https://thelynxiblog.com/2016/07/17/documentation-in-the-post-factual-society, accessed 2 January 2018.

Russell, B. (1942). The art of philosophizing and other essays. Girard, KS: Haldeman-Julius, p. 5.

SimonXIX (2017). An open letter to the Sunday Times. Available at https://medium.com/@SimonXIX/an-open-letter-to-the-sunday-times-bf815f0c01d0, accessed 3 January 2018.

Tomchak, A-M. (2017). Algorithms are screwing us over with fake news but could also fix the problem. Mashable, 5 October 2017, available at http://mashable.com/2017/10/05/artificial-intelligence-algorithm-neva-labs, accessed 2 January 2018.

Wardle, C. and Derakhshan, H. (2017). Information disorder: toward an interdisciplinary framework for research and policy making. Council of Europe report DGI(2017)09, available at https://firstdraftnews.com/wp-content/uploads/2017/11/PREMS-162317-GBR-2018-Report-désinformation-1.pdf?x29719, accessed 4 January 2018.
