A Post-Truth World?

“How do we know?” is a perennial epistemological question. The answer is becoming more difficult with the advent of social media, which helps to create misinformation, disinformation, and epistemic bubbles and echo chambers.

We hear sometimes that we are living in a post-truth world. If so, we would be in serious trouble – but then in many ways we are in serious trouble, with so-called ‘post-truth’ both a cause and an effect of that condition. One might despairingly accept this way of speaking, but I object to it, because in any world many statements would still be true: there are mountains, oceans, trees, plants, and animals, including human beings. Five and seven would still be twelve. As distinct from post-truth, what I sometimes fear is lapsing into a post-knowledge world, a world in which too many people are too willing to believe on the basis of selective and unreliable evidence, or no evidence at all.

In exploring the phenomenon of spreading irresponsible belief, theorists have developed some intriguing concepts. There is misinformation, which distorts and misleads; there is also disinformation, which is false and communicated with an intent to deceive. There are epistemic bubbles: you are in one of these when your views come only from those who agree with you, so that you are protected in the bubble from evidence with implications contrary to what you think and would like to think. Then there is the related phenomenon of an epistemic echo chamber: you are in one of these if your own views come back to you again and again, so that you repeatedly encounter your own reflections. (Dictators experience this, often to their peril, because people are afraid to disagree with them.) Commenting on social media and increasing polarization, philosopher Regina Rini uses the term ‘partisan epistemology.’ In this epistemology, beliefs are shaped largely by affiliation and ideology based on political party, religion, or identity. These three influences are, of course, frequently intertwined.

On social media, ill-founded claims and conspiracy theories can spread easily. It is estimated that false claims spread far more rapidly than true ones. An oft-cited 2018 MIT study by Soroush Vosoughi, Deb Roy, and Sinan Aral calculated, on the basis of some 126,000 items shared on Twitter, that false news spreads six times more quickly than the truth. The cause was human retweeting, not computer algorithms; underlying that human cause seemed to be novelty and heightened emotion. I reflect at this point on my own teaching experience. If I made a mistake, students seemed to remember it indefinitely, whereas that was not the case when I was delivering facts, sound judgments, and good arguments.

‘Do you have reason to think this is true?’ or ‘Would you simply like it to be true?’ The distinction should be made, though it often is not. Relevant to it is the distinction between reasons and motives. Reasons bear on the support for a claim: its plausibility given logic, conditions in the world, and consistency with established facts and knowledge. We seek reasons in action and thought – when deciding what to do and in seeking truth. In our quest we may find bad reasons, but still, there is an underlying search for truth – and normally a practical need for it. Motives are something else, related to what we want rather than to what is the case. If I want to be taller, I have a motive for measuring my height wearing shoes; then I get an extra inch or even two. My beliefs are more to my liking when I think and act on this motive, yet the reality of my height is unaffected. Perhaps weight is a better example than height: I ask myself ‘Why are all my clothes shrinking?’ or ‘What’s going on? My scales seem to be off.’ In an epistemic bubble, no one will question me; in an epistemic echo chamber (harder to achieve for these cases), doctors and public statistics would always function to confirm my self-concept. These personal matters are unlikely to be of general importance. If, however, I do not want to believe that climate change is happening and I attribute wildfires to arson as a way of distracting myself from evidence of climate peril, the matter is of great general importance. It affects safety, the environment, policy, and our human future. A motivated unscientific interpretation can spread, and spread quickly, influencing public debate and taking over the beliefs of some political leaders. It is an understatement, at best, to say that in contexts of public concern social media have created a situation of unstable norms in epistemology.

People may be motivated to accept faulty claims by a sense of alienation from experts, loyalties to associates, and their sense of identity. Being and identifying as anti-vax, or anti-mask – or, for that matter, progressive or trans – may come to dominate beliefs in various ways, including interpretation and the accessing of sources. You can put yourself in a bubble to protect the beliefs you like and you can build your own echo chamber to spread them around. You can choose what to acknowledge; you can choose what to ignore. You can construct your picture of the world, the world as you like it. The justifiability and truth of what you believe play no part in such processes. We can call this active ignorance, or self-deception.

Spreading your unreliable ideas around, whether on social media or by other means, you are not necessarily lying. Lying means knowingly communicating what you believe to be false, intending that others should believe it. Your constructions may be your real beliefs, a phenomenon illustrated when people act on them, often to their peril. And yet some who perpetuate false beliefs on social media really are lying, a plausible example being that of Alex Jones regarding the Sandy Hook killings. Jones falsely claimed that the December 14, 2012 shootings in Newtown, Connecticut were a hoax, saying that the event was staged as a pretext for confiscating guns. He widely disseminated claims that the massacre was faked and that grieving parents were actors participating in the fabrication. His and other conspiracy theories about the shootings led to hateful messages, threats, and intensified suffering for victims’ families. When, in 2022, Jones was found liable for circulating falsehoods, an American court ordered him to pay nearly one billion dollars to the families of eight victims. In court he said ‘sorry’ and admitted that the shootings were real. He may have known all along. But this is not always the case: some genuinely believe false – even outrageously false – claims, cultivated by their own active ignorance.

A ‘two sides’ fallacy can happen at this point. The dynamic is this. There is a claim, X, such as ‘dry conditions due to climate change are increasing the number of wildfires’. But wait: here are these others who deny that claim; not everyone thinks X. There are a lot of people who claim arson is a cause of these fires. Shouldn’t we consider their view too? X, but also not-X – or so it is said. In fairness, should we not attend to both views? Yes, acknowledge the division, regardless of who is denying and why – and so the coverage goes. X, but then also not-X. So the matter of fires and climate change is not settled; perhaps we should suspend judgment and hold back from action. The technique of sponsoring an opposed view was used by tobacco companies regarding smoking and copied by the oil industry concerning emissions from fossil fuels. The underlying argument is: ‘There is a debate about this; these dangers you speak of are, after all, not known to occur.’ Such a ‘balanced’ view can be dangerous, as these cases illustrate.

Claims on social media can be regarded as testimony, in the broad sense in which testimony is claims communicated to us by others. We learn from others; they tell us things and teach us things: the meanings of words, our birth and location, where we are in the world, and modes of investigation and reflection. Some philosophers maintain that induction and deduction are our two sources of knowledge and understand our trust in testimony as supported by induction, reasoning from past experience. On this account, we have trusted others as sources of knowledge and have, most of the time, found their accounts to be reliable. So we have learned from experience that we can usually trust the claims of others. The philosopher C.A.J. Coady put forward a detailed argument against this view in his 1994 book Testimony. I agree with Coady’s claim that testimony is too basic to be founded on induction. Why? To learn from experience and do inductive reasoning, we need to have accepted much on testimony: language learning is an obvious case in this context. Knowledge based on testimony is needed for induction – and for deduction too, for that matter. We cannot get started without it, and it is for this reason fundamental. To reason we need to start thinking, and to start thinking we need concepts and background given to us by parents, teachers, and others. Trusting them is the default stance: we believe what we are told unless there is reason to doubt it.

All this is not to say that we always put faith in whatever others tell us. Although early in development trust is nearly always full and unquestioning, later that is not the case. In the end, we may still believe what others tell us, but that will be a more reflective belief, coming after consideration of doubts. There are two sorts of reasons for doubt. One is the content of the claim. If someone tells us something incompatible with what we already know or believe, we have reason to doubt their claim. Suppose, for instance, that someone tells me that Canada has a president. I have read and heard so many times, and have been taught in school, that we have a prime minister and a governor general, so, given my background knowledge, I question the claim of a Canadian presidency and reject it. The other main basis for doubt concerns credibility. We may question the credibility of the person telling us something and doubt their claim, not wanting to trust them as a claimant because for some reason we deem them unreliable. If someone has lied to me in the past, I will not be willing to take their word for something. There are bad reasons for questioning credibility: stereotypes based on appearance, race, ethnicity, gender, or sexual orientation. But there are good ones as well: perceptual defects, lack of cognitive skills, dishonesty, a record of unreliability, or absence of required academic training.

This sort of analysis – default trust, or more reflective trust after doubt about credibility or content – is a standard account of forming beliefs based on testimony. If social media offer us testimony – claims put forward by others, put forward for us to accept – then the standard approach should be applicable to social media as well. And yet the fit is poor: there, gullibility knows few bounds. What is the difference? What is distinctive about social media? I can’t claim knowledge on this point, but there are plausible features. One is anonymity and the absence of those personal encounters that may offer relevant information about sources. The supposed information is easy to access on your phone, and ownership of one of these is almost mandatory of late. You needn’t subscribe to a newspaper or magazine; your ‘information’ will be there at no apparent cost. Another factor is fast spread; the believability of a claim may increase if you are told that 1.5 million people ‘like’ it, producing a vastly enhanced bandwagon effect. You may think ‘This is what they are all accepting; this is the latest; OK.’ There is a self-reinforcing effect too. The acceptance of implausible (even outlandish) claims in bubbles or echo chambers makes consensus less likely, the outlandish more respectable, and caution less common. Conspiracy theorists may see themselves as ‘in the know’, seeing through establishment experts while cultivating their own explanations and predictions.

As for evidence and the verbal or grammatical cues that something may be wrong, attention is needed but may not be given. A cure is said to be ‘a miracle’; signing up for more news is ‘urgent’; the company is a reputable ‘business’. These should be alerts to scams – but you need to pay attention to notice them.

I think we have a collective epistemic malady here. Is there a cure? Many will recommend critical thinking, which is always a good idea. But there are pitfalls. In coming to doubt, and then in appraising content and credibility, we rely on background knowledge, which is a matter of past beliefs. When consensus in an area diminishes, the merit of assessments will come under dispute. The sad truth may be that conspiracy theorists and the like do practice critical thinking; their tools may be formally just like ours. The difference is material. If one’s background belief is that the U.S. government faked the moon landing and successfully fooled millions of people, then a claim that it fakes massacres, lights wildfires, or bombed the Twin Towers will seem more plausible. In his 1974 book Trust and Power, Niklas Luhmann argued that truth, derived from trusting testimony, is an essential simplifier in life: we do not have to consider every possibility. For this to work, people have to seek truth and believe well-founded claims when they find them. Exaggerated claims incompatible with science and common sense should be received with scepticism, not indulged with self-deception and active ignorance.

I recall a story my mother-in-law told, from her youth long ago in Indonesia. On a street were three tailors; I don’t know whether this street was fictional or real. As the story goes, one tailor advertised that he was the best in the world. A second claimed to be the best tailor in the country. The third said he was the best tailor on the street. Which should be believed? Base your decision not on novelty or excitement, but on plausibility – and decide.