Tactics for increasing reason-responsiveness

Rex Kerr
7 min read · Aug 1, 2024


Being reason-responsive means just what it says: you respond to reasons by considering them and, if the reasons are good enough, potentially changing your mind.

It is an important concept, as opposed to mere openmindedness, because the premise is that there are reasons to believe or act differently. It’s not merely a stance of being willing to hear different perspectives (as openmindedness is). The reasons might not be sufficient, but the idea is that if they are, you will (or at least are likely to) respond in some way.

Unfortunately, this can be hard, especially if the reasons come from a source that you view as an enemy — someone with different politics, a country that has opposed yours, or just someone who hogs the limelight and puts down others along the way.

It can also be hard simply because your narratives have been built assuming something is true, and it’s daunting to even consider re-examining them and rebuilding the ones that no longer work. Far easier if the pesky reasons just go away.

And the worst part is that if your brain is like most people’s, it’s very very good at coming up with quick rationalizations for why the reasons can go away.

If you read various pieces of advice about successful rational argumentation, you can find common-sense tips about being humble, staying openminded and curious, steelmanning your opponent’s position (i.e. putting the best spin on it you can), establishing emotional security with the person you disagree with, stuff like that. It’s all pretty easy and/or positive.

I’m not going to give you advice like that.

Instead, I’m going to give you three difficult and/or uncomfortable pieces of advice that you can use to make a major difference in your ability to think clearly and use reason to alter your outlook.

Ask not true or false, but to what extent and with what likelihood.

Sometimes beliefs are incorrect because they’re based on flat-out lies about easily-verified matters. For example, Chevron isn’t the largest company on the planet, and it’s easy to check this. Consequently, few people are likely to have wrong beliefs that result from thinking that Chevron is, in fact, the largest company. Furthermore, if you’re one of them, the evidence to the contrary will be incredibly clear, so if you have any interest in being responsive to reason and evidence, you’ll find it easy to switch.

Most of the time, it isn’t so easy. Either the facts of the matter are difficult to check (what did actually happen, in full, between Amber Heard and Johnny Depp?), or the matter is more complex than can be adequately forced into a binary distinction (do Republicans value freedom?), or both.

If a statement is forced to be either true or false, even a single counterexample is enough to flip the verdict. This makes it extremely tempting, if you think something is true, to find any one instance where it is (basically) true and rationalize away any contrary evidence suggesting it’s false; and if you think it’s false, you’re liable to find one counterexample in someone else’s claim, thus “proving them wrong”, and then rationalize away the cases where it seems more right.

Better is to simply refuse to play that game. Instead of viewing a statement like “Liberals are destroying American society” as a two-choice alternative, True or False, view it as a claim that some features of a Liberal outlook have a negative impact on important aspects of American society, and furthermore that the extent and likelihood of that impact are unacceptably high.

That is, once you take all the absolute, binary true/false aspects of the statement and place them on a continuum of degrees wherever anything is remotely arguable (including, as necessary, going the exact opposite way, e.g. “the Liberal outlook has a strongly positive impact on American society”, and including “well golly, at least some of this seems really hard to know”), you have suddenly set yourself a task where reason and evidence can easily come to bear.

Turning the dial from “Liberalism is 99% good for 95% of American society” to “Liberalism is 97% good for 92% of American society” — not literal numbers, of course, just the vague feeling — is way easier than flipping from “YAAAAY Liberalism :) :) :)” to “BOOOO Liberalism >:( >:(”, and probably a lot more justified, too, on the basis of the kinds of evidence and reasoning you’re likely to get. Framing it as degrees of certainty, degrees of agreement gives you room to refine your position gradually.
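(If you want to make the dial metaphor concrete, Bayes’ rule is one standard recipe for exactly this kind of gradual adjustment. Below is a minimal Python sketch; the claim, the probabilities, and the evidence are all invented purely for illustration, not a model of any real debate.)

    # A tiny sketch of "turning the dial" with Bayes' rule.
    # Every number here is invented for illustration, not a claim about the world.

    def update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
        """Return the revised credence in a claim after seeing one piece of evidence."""
        numerator = p_evidence_if_true * prior
        denominator = numerator + p_evidence_if_false * (1.0 - prior)
        return numerator / denominator

    # Start out quite confident in some claim...
    credence = 0.95
    # ...then encounter evidence that is twice as likely if the claim is false as if it is true.
    credence = update(credence, p_evidence_if_true=0.2, p_evidence_if_false=0.4)
    print(round(credence, 3))  # ~0.905: the dial turns a little; nothing flips to "false"

The arithmetic isn’t the point; the point is that a credence has somewhere to go besides all the way to zero or one.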

It’s also very useful when you talk with others, because they might benefit from exploring the pros and cons, not just yelling about the bottom line. But we’re only considering here the benefit for you.

The benefit for you is that you get to make as large or small of a course correction as warranted. Not facing a giant yes/no flip as the only way to correct course does wonders for actually allowing you to think about the reasoning.

If you were wrong, hypothetically, what kinds of evidence would reveal that to you, and what would have to be the case for that evidence to exist?

This is prepping yourself in advance for being responsive to reason.

We’re setting up a completely hypothetical question here. It’s explicitly not real reality. You can make up whatever alternate reality is necessary to make you wrong.

But here’s the trick: you’re not doing an existence proof where you show that there exists a conceivable state of affairs where you’re wrong. You want to make the smallest changes possible to reality such that you’re wrong, and then you want to check: if I get swapped into that reality, would I be looking at what I need to in order to catch my mistake?

Other people don’t need to give you reasons for their beliefs in order for you to hypothesize in this fashion. And note, this is not an exercise where you recite why you’re right — it’s good to know what your evidence is for believing something, but the point here is to engage in a thought-experiment where you are, in fact, wrong.

Then an actual correction to your beliefs (or perhaps the certainty of your beliefs) can happen by noticing that the real world is more like your hypothetical belief-changing world than you’d initially envisioned.

Choose to speak with the wise, not with the fools

Of course people aren’t actually “the wise” and “fools”. We’re more complex and varied than that. But some people sure have a habit, in certain contexts at least, of acting pretty foolishly when it comes to debating things.

It might be fun, at times, to battle them with inanities and put-downs. If you feel like it, well, you do you. But don’t mistake this for there being no reasons for beliefs on “their side”.

Sometimes, everyone holding a position really does have only the stupidest, most laughable reasons to believe something. It’s a social phenomenon, not a rational one. But a lot of times, at least some somewhat-convincing rationalizations have been invented, and by engaging in the flame-fest you’re not exposing yourself to the reasons such as they stand. Being reason-responsive in theory doesn’t count if you habitually avoid exposing yourself to reasons for opposing viewpoints.

Volume doesn’t equal thoughtfulness.

So speak to the wise. “Wise.” People who have reasons for what they believe and who will share them. People who will at least consider and respond thoughtfully to challenges. (You will likely have to conduct yourself the same way, or at least with polite humility, in order for them not to use the same rule as a reason to not talk to you.) If you can get a loudmouthed troll to give reasons and respond thoughtfully, great, talk to them after all! This isn’t about their nature, it’s about their behavior.

Now, there’s a danger here. You might not be the world’s greatest expert in your perspective, either. If you expose yourself to the most carefully crafted rationale in support of something wrong, might you be swayed to switch from thinking correct things to thinking wrong things because you don’t possess the knowledge or sophistication to notice the errors in their reasoning?

Yes. You might! That’s why this can be uncomfortable. You have to be both open and on guard at the same time. Tricky! You have to remember that good arguments aren’t necessarily conclusive, and that you might want to suspend judgment while you seek out better justifications for those things you believe but don’t have the absolute best justification for at your fingertips.

The alternative is strictly worse, however: you end up believing whatever you were randomly assigned by the fate of whom you happened to associate with and whichever proclivities your genes and upbringing pushed you toward. Far better to thoughtfully risk being misled while seeking greater correctness than to stick stubbornly in wrongness.

Conclusion

If you want to be a savvy buyer in the marketplace of ideas, your beliefs need to be susceptible to good arguments. Reason needs to find some purchase in you.

If you already check all the easy-mode boxes for using reason, maybe it’s time to level up.

  • Think in degrees and probabilities, not true/false.
  • Construct hypothetical worlds where you’re mistaken and figure out how you’d discover it.
  • Talk to those who can present good reasons for beliefs that are different from your own.

Good luck! May your purchases of ideas be wise.
