Rex Kerr
Jan 24, 2025


You set this up so well, but you kind of never circled back to it.

Of course, ultimately, everyone is going to have to decide for themselves, after all the intervening decisions are made.

If it is the fact-checkers getting to decide what gets seen, who are the fact-checkers? What measures are in place (can there even be any such measures?) to ensure their quality? When something seems wrong-ish, how is the boundary set between no intervention and some sort of intervention?

If fact-checking can be done decently well, I love it, because I love facts.

However, if "this fact-checking is being done well" does not itself rise to the level of a fact that can be demonstrated on demand (no "it's great, trust me" or "well I mean look how would you even show that, it's just good, any sensible person can tell that"), then how can anyone who loves facts judge whether they should love this particular instance of fact checking, and think it should be kept rather than discarded?

Of course, Facebook, TikTok, Twitter, and the rest all profoundly shape the experiences of their users via algorithm. Online reality is heavily curated all the time, and this too is a type of getting to decide. I know that strawberry ice cream is the best ice cream because Facebook fills my feed with posts from strawberry ice cream lovers. If a "fact-checker" comes in and stops me from seeing some of the "omg strawberry beats every other flavor by so much it is not even funny!" posts, what am I going to decide? That the social proof of "strawberry is the best" is wrong, or that Big Fact is suppressing the truth of the best ice cream flavor?

Social media is set up to generate the perfect conditions for demonstrating the apparent reality that fact-checkers are biased: the "facts" arrive with very little visible evidential support (typically nothing more than lowered distribution), while the social proof arrives in a flood.

So I'm none too keen on fact-checking as typically construed. I would like to see compelling evidence that it works in the big picture, as opposed to driving us harder and faster towards a post-truth society. When the one who decides is Facebook (by curating someone's feed), and then Facebook again in the opposite direction (by removing fact-checked content from a feed whose curation implicitly denies that fact), the result can be even worse than Facebook just doing its usual echo-chamber thing. At least, I'd like to know before passing judgment.

That doesn't mean that nothing is needed. It doesn't mean that we should ignore facts. We need to embrace them far more than we do, as a society.

But ultimately, people decide for themselves. And if we want to know what the common errors will be, we need to attend to their whole experience, not just a piece of fact-checking here and there.
