Rational discourse and critical thinking as the antidote for fake news, fake claims of fake news, and other information extremism
Warning: the post below is at times rather dull and riddled with links to long articles to support various points.
Alas! This is what it takes to construct an adequate argument!
But don’t take my word for it. I explain the reasons why I think so below. I hope you will be convinced, or at least think that I’ve made points worthy of consideration and refutation (in principle, even if no-one is so motivated in practice).
If you want the short version, just read the title.
Though the term “fake news” has only recently come to prominence, misinformation is not remotely new. Even misinformation in news sources is not new. For instance, Jacob Soll has provided a quick tour of the history of fake news in an article in Politico Magazine; the bottom line is that fake news has been effectively misleading people for as long as news has existed.
Soll ends on a pessimistic note, observing the decline in the prevalence and reputation of journalists: “popular role models and societal links are gone, and with them, a trusted filter within civil society — the sort of filter that can say with authority to fellow local citizens that fake news is not only fake, it is also potentially deadly.”
There are surely reasons for concern. Misinformation, unconstrained by reality, can have whichever blend of emotional saliency and novelty that best fosters propagation. Truth, we might fear, is at a competitive disadvantage. At least on Twitter, this seems to be the case, according to an extensive study by Vosoughi, Roy, and Aral (there is a nice summary in The Atlantic).
While our instinct might be to try to return to a situation where trusted journalists deliver accurate, unbiased information to a public unswayed by other news sources, this is unlikely to happen for three reasons. First, the technological advancements that allow us to share news are so widespread and general-purpose that eliminating them is all but impossible: people will continue to share what feels meaningful to them on Twitter and Facebook and other platforms, and this will continue to include news. Second, public trust in many institutions has been declining in recent years — at least in the United States (here is an example, though note also that survey results are highly sensitive to wording choices) — and it is difficult to regain trust when there is no mechanism to verify trustworthiness. Third, the historical view is rose-tinted at best; reporting on the Vietnam War, for instance, was for a long time heavily swayed by what the government wanted reported (see this example).
The situation seems grim.
At this point we should step back and ask: how do we know that any supposed fact is true? How can we decide upon any position? Is it all just a matter of opinion and perspective?
The Greeks, and the Islamic philosophers of 800–1000 A.D., and the philosophers and scientists of the Enlightenment thought not. (It is instructive to review the history of critical thinking in the West or investigate one of the most prominent Islamic philosopher-scientists.)
The necessary ingredients for critical thinking, and those components that go into a rationally persuasive argument, are still common topics in philosophy departments and even high school instruction, and are covered in popular books and on websites. And, while there aren’t nearly as many studies on the subject as I’d like, those that have been reported show that critical thinking yields (or is at least correlated with) better life outcomes and better decision-making.
So it doesn’t seem that everything is just opinion. When deciding what to believe, one should have factual material that is relevant to the topic at hand and sufficient to support the belief. When deciding whether to change one’s belief, the same criteria apply.
This isn’t always easy, but it’s not (usually) rocket science. We base lots of our beliefs on sound arguments and sufficient evidence. If my hair is wet and you ask me why, I might say I was rained on. If you were skeptical of my claim, my retort of “Well, YOU have food on your shirt!” (irrelevant), “It rains sometimes!” (insufficient), or “It’s raining in the hallway!” (false) would all have serious flaws as justifications.
So the problem isn’t that we are unaware of how to reliably determine the state of affairs (at least not always), or of how to convey adequate reasons to others. And anyway, if this were the problem, we’d just practice, get better, and be able to resolve issues by having rational discussions.
The root of the problem lies elsewhere.
Humans are social creatures. Being social provides tremendous benefits: we can keep better watch for danger, specialize in many areas, coordinate to fight off threats, and so on. But being social is difficult. In particular, there are two crucial problems to solve: how do you coordinate within your group, and how do you organize your group when faced with a large threat (like another group)?
Within social groups, many of the coordination problems are solved by having a social hierarchy. You follow the alpha wolf; if the alpha is killed, you follow the next highest-ranking wolf, and so on. This is a core part of our nature; we don’t understand how it is implemented as well as we ought to, given how important it is to us, but there is at least research indicating that it matters a great deal (here is an example). Within a hierarchy, maintaining good standing becomes very important, as it determines all kinds of benefits that you might receive.
Unfortunately, we seem wired to interpret challenges to our beliefs as attacks on our hierarchical status. And it’s important to get the status levels right (here is one paper suggesting — granted, the evidence is far from conclusive — that disagreement about hierarchy within a team is a net drain on the team’s effectiveness). Whether for this or other reasons, people find it hard to admit that they are wrong. Objectively, though, since on any given question there are many mutually contradictory beliefs one could hold but at most one correct one, we should expect it to be really, really hard to be right about a lot of things, and so almost everyone should be wrong about a lot of stuff. If so, wouldn’t it make sense to readily and gladly accept adequate information to revise one’s position? I think it would. But, alas, this does not seem to come naturally to us humans.
Matters get even worse when it comes to inter-group competition. Now not only do you have to maintain your within-group status, you also have to constantly put your loyalty to your side on display: defection could seriously harm your group, so everyone in the group has to be on guard for it. This manifests itself in various ways, including that political bias is now stronger than racial bias. Andrew Sullivan has a piece in New York Magazine that discusses many problems arising from tribalism. I don’t wish to reiterate all these points here; the bottom line is that having your own group has historically been, and continues to be, of great advantage; that it also causes great problems and conflicts; and that when group identity is defined in part by beliefs (and why not use beliefs as a way to delimit tribe boundaries?), questioning those beliefs is interpreted as a lack of loyalty.
This, I think, is the root of the problem: it is socially difficult to accept correction, and even more difficult to examine a defining belief of one’s own tribe.
This is not looking good.
But wait. We don’t have many literal tribes any more.
Why not?
Answering this question in full would require a long reading of history, but in brief: we found other ways to organize ourselves that suited our predispositions well enough.
In a modern democratic society, especially in a large city, it feels kind of like we’re one of the high-ranking members in a tribe. We don’t get told what to do very often (outside of work, but that’s just work). We have a lot of freedom to make our own choices. We can say and think what we want, to a large extent. Instead of having top-down leadership from an alpha individual, our behaviors are mostly constrained by laws, and we give up that alpha spot to the laws and even make our nominal leaders subject to them (to some extent).
We have already tamed our tribal instincts once. It’s not a complete taming — consider, for instance, the many instances of aggressive nationalism that we have seen throughout history. But it’s a far cry from actual individual tribes.
Can we tame our instincts again?
Before exploring a solution, we should take a moment to reject some bad ideas that occasionally come up.
One idea is that we should just outlaw “fake news”. If nobody were making up fake news, nobody would believe it, right?
The problem here is threefold. First, “I have the power of the government behind me to imprison you” is not in our set of guidelines on how to come to a reliable understanding of the world. Second, people who are inclined to believe the outlawed stories will feel like a highly threatened tribe, making it much harder for them to be reasonable and open to evidence. Third, there are plenty of issues where the evidence isn’t so clear, so it’s an inadequate solution.
The way to deal with misinformation is not to outlaw it.
Another idea I’ve heard is that people who are right should use every propaganda and cognitive trick they can in order to get everyone else agreeing with them. I’m not going to cite any sources, but you may have come across this idea already. I hope not. It’s a terrible one. That you can employ propaganda says no more about your correctness than that you can employ a big stick. “I will whack you if you don’t believe this” is no more reliable a method of reaching an accurate understanding of the world than is, “I’m going to make simple statements, repeat them ad nauseam, and engage in character attacks on anyone who says anything to the contrary.”
It might make people change their minds (or say they have), but it’s unrelated to the truth content.
That’s not what we want. In very many cases, the truth matters. The consequences matter. We want to know what they are, individually, and as a society.
It is time to embrace rationality again.
Not in the manner of Enlightenment philosophers, who were all too prone to view humans as purely rational in all ways that mattered. As Dan Ariely explains in Predictably Irrational, there are many ways in which we are not. (Edit: since I first wrote this, Ariely himself has come under scrutiny for at-best-carelessness with data, but even if some of his own results are suspect, the overall conclusion that we’re rather non-rational has been confirmed many times.)
As Tom Stafford explains, rational arguments can and do work to change people’s minds. Not always, not easily, but it’s not uniformly ineffectual. And if we adhere to the guidelines about how to construct a strong argument, the change will be largely in the direction of making everyone more correct, because it is rare to be able to formulate a strong argument in favor of something badly wrong.
If you see someone littering, your first instinct is probably not to think, “Is that a Republican or a Democrat?”. It’s probably something more like, “What a pig! Littering and making this place uglier for everyone just because they’re lazy!” We have a social sense that usually trumps our remaining tribal affinities. I see no reason why this same social sense cannot extend to reasoned argument.
Now I admit it will take a lot of work, because a huge fraction of our society is geared around advertising, which these days is pretty much synonymous with making you want things for bad reasons (what valuable informational advertising remains is hard to discern through all the emotional pull). But even so, there are plenty of contexts where arguments are routinely used to great effect (science and law), and the people engaged in those fields are responsive to arguments in large part because they are expected to be.
Why shouldn’t everyone expect me to be responsive to good arguments? Why shouldn’t they expect it of you, too?
If I had data that this worked, I would present it; if I had a really compelling argument with relevant and sufficient points, I would present that. I don’t. I’m speculating. But I’m hopeful that it is possible for the reasons I’ve stated: we’ve done it before with the transition from literal tribes to modern liberal democracy; and we have strong evidence that reason can persuade people in the right contexts and that the context can extend to almost everything one does if that is what the profession calls for.
If you would like to try to undertake a modern approach to rationality and implement it on social media, here is what I suggest. (Perhaps you already do. Perhaps you do it better than I suggest below. Nonetheless, here are some suggestions.)
- Accept that some, perhaps many, of your own views may be incorrect, and resolve to fix at least some of them when you come upon good evidence indicating that you were wrong.
- Accept that you’re not going to be able to construct an airtight argument for everything you feel strongly about, even if you are right. (Too bad; you won’t be able to convince people who listen to rational arguments.)
- Envision describing to someone else how to construct a good argument. If it seems difficult to do this, consider taking an online course or reading a book on critical thinking. You want this to be really solid in your mind, so solid that you can teach it on demand. You need all your brainpower for actually evaluating the evidence for things, not trying to figure out what makes for a good argument and what makes for a weak or fallacious one.
- Seek out people/blog posts/whatever that agree with you and provide good evidence for their positions. Read them and evaluate how strong the reasons are. Now perhaps you better know why you believe what you believe.
- Seek out people/blog posts/whatever that disagree with you and provide good evidence for their positions. Read them and evaluate how strong the reasons are, but don’t start trying to debunk them just yet. You’re understanding them, first. Ignore the non-reasons as emotional clutter; you are trying to pick out the best that they have. (“If this were true, it would prove their point. Maybe I don’t think it’s true, but if they think it is, they would think this is a good argument. So this at least has the shape of a good argument.”)
- Find the weakest point in both the agreeing and disagreeing arguments (either logic or facts). If the venue allows, comment how much you like the thoughtful approach (especially when it’s the other side!), but mention that this particular argument seems really weak. Maybe they’ll drop it, maybe they’ll strengthen it, maybe nothing will happen save that both sides will feel a bit more that it’s important to have really good arguments, and you will be free from being overly swayed by two inadequate arguments.
- Repeat a few times. Does the evidence for your side seem drastically better than that for the other side? If yes, maybe you need to look harder for reasoned arguments on the other side, or maybe you’ve discovered that you’re right. Does the evidence for the other side seem drastically better? Maybe you’ve discovered that you should change your view, or that your side really needs to get its act together in writing down good arguments. Does it seem like neither side has dramatically better arguments? Maybe it isn’t something that is clear, or maybe it is something that boils down to a matter of taste upon which nominally reasonable people can disagree.
- If people on either side of this issue litter the intellectual sidewalk with falsehoods, inflammatory statements, or whatever, call them on it. At least in your mind. (“Oh, wow, I love so-and-so but I can’t believe they dumped all that intellectual garbage out of their keyboard.”)
If everyone approached arguments something like this, the internet, and society in general, would be a far more informative, correct, and productive place.
And I think this is a perfectly reasonable way to start an overall push towards rational discourse and critical thinking throughout society.
Or so I suspect. I don’t actually have much evidence. How about we collect some?