Pluralistic: Conspiratorialism and the epistemological crisis (25 Mar 2024)

Originally published at:

Today's links

A grand paneled hearing room, seen from the back of the room, looking at a dais over the heads of an audience of smartly turned out, attentive people. On the dais itself is a gargantuan, badly damaged cardboard box bearing a FRAGILE sticker. The saturation of the audience has been tuned down, while the saturation of the box has been cranked up.

Conspiratorialism and the epistemological crisis (permalink)

Earlier this month, Ed Pierson was supposed to fly from Seattle to New Jersey on Alaska Airlines. He boarded his flight, but then he had an urgent discussion with the flight attendant, explaining that as a former senior Boeing engineer, he'd specifically requested that flight because the aircraft wasn't a 737 Max:

But for operational reasons, Boeing had switched out the equipment on the flight and there he was on a 737 Max, about to travel cross-continent, and he didn't feel safe doing so. He demanded to be let off the flight. His bags were offloaded and he walked back up the jetbridge after telling the spooked flight attendant, "I can’t go into detail right now, but I wasn’t planning on flying the Max, and I want to get off the plane."

Boeing, of course, is a flying disaster that was years in the making. Its planes have been falling out of the sky since 2019. Floods of whistleblowers have come forward to say its aircraft are unsafe. Pierson's not the only Boeing employee to state – both on and off the record – that he wouldn't fly on a specific model of Boeing aircraft, or, in some cases any recent Boeing aircraft:

And yet, for years, Boeing's regulators have allowed the company to keep turning out planes that keep turning out lemons. This is a pretty frightening situation, to say the least. I'm not an aerospace engineer, I'm not an aircraft safety inspector, but every time I book a flight, I have to make a decision about whether to trust Boeing's assurances that I can safely board one of its planes without dying.

In an ideal world, I wouldn't even have to think about this. I'd be able to trust that publicly accountable regulators were on the job, making sure that airplanes were airworthy. "Caveat emptor" is no way to run a civilian aviation system.

But even though I don't have the specialized expertise needed to assess the airworthiness of Boeing planes, I do have the much more general expertise needed to assess the trustworthiness of Boeing's regulator. The FAA has spent years deferring to Boeing, allowing it to self-certify that its aircraft were safe. Even when these assurances led to the death of hundreds of people, the FAA continued to allow Boeing to mark its own homework:

What's more, the FAA boss who presided over those hundreds of deaths was an ex-Boeing lobbyist, whom Trump subsequently appointed to run Boeing's oversight. He's not the only ex-insider who ended up a regulator, and there's plenty of ex-regulators now on Boeing's payroll:

You don't have to be an aviation expert to understand that companies have conflicts of interest when it comes to certifying their own products. "Market forces" aren't going to keep Boeing from shipping defective products, because the company's top brass are more worried about cashing out with this quarter's massive stock buybacks than they are about their successors' ability to manage the PR storm or Congressional hearings after their greed kills hundreds and hundreds of people.

You also don't have to be an aviation expert to understand that these conflicts persist even when a Boeing insider leaves the company to work for its regulators, or vice-versa. A regulator who anticipates a giant signing bonus from Boeing after their term in office, or an ex-Boeing exec who holds millions in Boeing stock, has an irreconcilable conflict of interest that will make it very hard – perhaps impossible – for them to hold the company to account when it trades safety for profit.

It's not just Boeing customers who feel justifiably anxious about trusting a system with such obvious conflicts of interest: Boeing's own executives, lobbyists and lawyers also refuse to participate in similarly flawed systems of oversight and conflict resolution. If Boeing was sued by its shareholders and the judge was also a pissed off Boeing shareholder, they would demand a recusal. If Boeing was looking for outside counsel to represent it in a liability suit brought by the family of one of its murder victims, they wouldn't hire the firm that was suing them – not even if that firm promised to be fair. If a Boeing executive's spouse sued for divorce, that exec wouldn't use the same lawyer as their soon-to-be-ex.

Sure, it takes specialized knowledge and training to be a lawyer, a judge, or an aircraft safety inspector. But anyone can look at the system those experts work in and spot its glaring defects. In other words, while acquiring expertise is hard, it's much easier to spot weaknesses in the process by which that expertise affects the world around us.

And therein lies the problem: aviation isn't the only technically complex, potentially lethal, and utterly, obviously untrustworthy system we all have to navigate. How about the building safety codes that governed the structure you're in right now? Plenty of people have blithely assumed that structural engineers carefully designed those standards, and that these standards were diligently upheld, only to discover in tragic, ghastly ways that this was wrong:

There are dozens – hundreds! – of life-or-death, highly technical questions you have to resolve every day just to survive. Should you trust the antilock braking firmware in your car? How about the food hygiene rules in the factories that produced the food in your shopping cart? Or the kitchen that made the pizza that was just delivered? Is your kid's school teaching them well, or will they grow up to be ignoramuses and thus economic roadkill?

Hell, even if I never get into another Boeing aircraft, I live in the approach path for Burbank airport, where Southwest lands 50+ Boeing flights every day. How can I be sure that the next Boeing 737 Max that falls out of the sky won't land on my roof?

This is the epistemological crisis we're living through today. Epistemology is the process by which we know things. The whole point of a transparent, democratically accountable process for expert technical deliberation is to resolve the epistemological challenge of making good choices about all of these life-or-death questions. Even the smartest person among us can't learn to evaluate all those questions, but we can all look at the process by which these questions are answered and draw conclusions about its soundness.

Is the process public? Are the people in charge of it forthright? Do they have conflicts of interest, and, if so, do they sit out any decision that gives even the appearance of impropriety? If new evidence comes to light – like, say, a horrific disaster – is there a way to re-open the process and change the rules?

The actual technical details might be a black box for us, opaque and indecipherable. But the box itself can be easily observed: is it made of sturdy material? Does it have sharp corners and clean lines? Or is it flimsy, irregular and torn? We don't have to know anything about the box's contents to conclude that we don't trust the box.

For example: we may not be experts in chemical engineering or water safety, but we can tell when a regulator is on the ball on these issues. Back in 2019, the West Virginia Department of Environmental Protection sought comment on its water safety regs. Dow Chemical – the largest corporation in the state's largest industry – filed comments arguing that WV should have lower standards for chemical contamination in its drinking water.

Now, I'm perfectly prepared to believe that there are safe levels of chemical runoff in the water supply. There's a lot of water in the water supply, after all, and "the dose makes the poison." What's more, I use the products whose manufacture results in that chemical waste. I want them to be made safely, but I do want them to be made – for one thing, the next time I have surgery, I want the anesthesiologist to start an IV with fresh, sterile plastic tubing.

And I'm not a chemist, let alone a water chemist. Neither am I a toxicologist. There are aspects of this debate I am totally unqualified to assess. Nevertheless, I think the WV process was a bad one, and here's why:

That's Dow's comment to the regulator (as proffered by its mouthpiece, the WV Manufacturers' Association, which it dominates). In that comment, Dow argues that West Virginians can safely absorb more poison than other Americans, because the people of West Virginia are fatter than other Americans, and so they have more tissue and thus a better ratio of poison to person than the typical American. But they don't stop there! They also say that West Virginians don't drink as much water as their out-of-state cousins, preferring to drink beer instead, so even if their water is more toxic, they'll be drinking less of it:

Even without any expertise in toxicology or water chemistry, I can tell that these are bullshit answers. The fact that the WV regulator accepted these comments tells me that they're not a good regulator. I was in WV last year to give a talk, and I didn't drink the tap water.

It's totally reasonable for non-experts to reject the conclusions of experts when the process by which those experts resolve their disagreements is obviously corrupt and irredeemably flawed. But some refusals carry higher costs – both for the refuseniks and the people around them – than my switching to bottled water when I was in Charleston.

Take vaccine denial (or "hesitancy"). Many people greeted the advent of an extremely rapid, high-tech covid vaccine with dread and mistrust. They argued that the pharma industry was dominated by corrupt, greedy corporations that routinely put their profits ahead of the public's safety, and that regulators, in Big Pharma's pocket, let them get away with mass murder.

The thing is, all that is true. Look, I've had five covid vaccinations, but not because I trust the pharma industry. I've had direct experience of how pharma sacrifices safety on greed's altar, and narrowly avoided harm myself. I have had chronic pain problems my whole life, and they've gotten worse every year. When my daughter was on the way, I decided this was going to get in the way of my ability to parent – I wanted to be able to carry her for long stretches! – and so I started aggressively pursuing the pain treatments I'd given up on many years before.

My journey led me to many specialists – physios, dieticians, rehab specialists, neurologists, surgeons – and I tried many, many therapies. Luckily, my wife had private insurance – we were in the UK then – and I could go to just about any doctor that seemed promising. That's how I found myself in the offices of a Harley Street quack, a prominent pain specialist, who had great news for me: it turned out that opioids were way safer than had previously been thought, and I could just take opioids every day and night for the rest of my life without any serious risk of addiction. It would be fine.

This sounded wrong to me. I'd lost several friends to overdoses, and watched others spiral into miserable lives as they struggled with addiction. So I "did my own research." Despite not having a background in chemistry, biology, neurology or pharmacology, I struggled through papers and read commentary and came to the conclusion that opioids weren't safe at all. Rather, corrupt billionaire pharma owners like the Sackler family had colluded with their regulators to risk the lives of millions by pushing falsified research that was finding publication in some of the most respected, peer-reviewed journals in the world.

I became an opioid denier, in other words.

I decided, based on my own research, that the experts were wrong, and that they were wrong for corrupt reasons, and that I couldn't trust their advice.

When anti-vaxxers decried the covid vaccines, they said things that were – in form at least – indistinguishable from the things I'd been saying 15 years earlier, when I decided to ignore my doctor's advice and throw away my medication on the grounds that it would probably harm me.

For me, faith in vaccines didn't come from a broad, newfound trust in the pharmaceutical system: rather, I judged that there was so much scrutiny on these new medications that it would overwhelm even pharma's ability to corruptly continue to sell a medication that they secretly knew to be harmful, as they'd done so many times before:

But many of my peers had a different take on anti-vaxxers: for these friends and colleagues, anti-vaxxers were being foolish. Surprisingly, these people I'd long felt myself in broad agreement with began to defend the pharmaceutical system and its regulators. Once they saw that anti-vaxx was a wedge issue championed by right-wing culture war shitheads, they became not just pro-vaccine, but pro-pharma.

There's a name for this phenomenon: "schismogenesis." That's when you decide how you feel about an issue based on who supports it. Think of self-described "progressives" who became cheerleaders for America's cruel, ruthless and lawless "intelligence community" when it seemed that US spooks were bent on Trump's ouster:

The fact that the FBI didn't like Trump didn't make them allies of progressive causes. This was and is the same entity that (among other things) tried to blackmail Martin Luther King, Jr. into killing himself:

But schismogenesis isn't merely a reactionary way of flip-flopping on issues based on reflexive enmity. It's actually a reasonable epistemological tactic: in a world where there are more issues you need to be clear on than you can possibly inform yourself about, you need some shortcuts. One shortcut – a shortcut that's failing – is to say, "Well, I'll provisionally believe whatever the expert system tells me is true." Another shortcut is, "I will provisionally disbelieve in whatever the people I know to act in bad faith are saying is true." That is, "schismogenesis."

Schismogenesis isn't a great tactic. It would be far better if we had a set of institutions we could all largely trust – if the black boxes where expert debate took place were sturdy, rectilinear and sharp-cornered.

But they're not. They're just not. Our regulatory process sucks. Corporate concentration makes it trivial for cartels to capture their regulators and steer them to conclusions that benefit corporate shareholders even if that means visiting enormous harm – even mass death – on the public:

No one hates Big Tech more than I do, but many of my co-belligerents in the war on Big Tech believe that the rise of conspiratorialism can be laid at tech platforms' feet. They say that Big Tech boasts of how good it is at algorithmically manipulating our beliefs, and attribute Qanon, flat earthers, and other outlandish conspiratorial cults to the misuse of those algorithms.

"We built a Big Data mind-control ray" is one of those extraordinary claims that requires extraordinary evidence. But the evidence for Big Tech's persuasion machines is very poor: mostly, it consists of tech platforms' own boasts to potential investors and customers for their advertising products. "We can change peoples' minds" has long been the boast of advertising companies, and it's clear that they can change the minds of customers for advertising.

Think of department store mogul John Wanamaker, who famously said "Half the money I spend on advertising is wasted; the trouble is I don't know which half." Today – thanks to commercial surveillance – we know that the true proportion of wasted advertising spending is more like 99.9%. Advertising agencies may be really good at convincing John Wanamaker and his successors, through prolonged, personal, intense selling – but that doesn't mean they're able to sell so efficiently to the rest of us with mass banner ads or spambots:

In other words, the fact that Facebook claims it is really good at persuasion doesn't mean that it's true. Just like the AI companies who claim their chatbots can do your job: they are much better at convincing your boss (who is insatiably horny for firing workers) than they are at actually producing an algorithm that can replace you. What's more, their profitability relies far more on convincing a rich, credulous business executive that their product works than it does on actually delivering a working product.

Now, I do think that Facebook and other tech giants play an important role in the rise of conspiratorial beliefs. However, that role isn't using algorithms to persuade people to mistrust our institutions. Rather, Big Tech – like other corporate cartels – has so corrupted our regulatory system that it makes trusting our institutions irrational.

Think of federal privacy law. The last time the US got a new federal consumer privacy law was in 1988, when Congress passed the Video Privacy Protection Act, a law that prohibits video store clerks from leaking your VHS rental history:

It's been a minute. There are very obvious privacy concerns haunting Americans, related to those tech giants, and yet the closest Congress can come to doing something about it is to attempt the forced sale of the sole Chinese tech giant with a US footprint to a US company, to ensure that its rampant privacy violations are conducted by our fellow Americans, and to force Chinese spies to buy their surveillance data on millions of Americans in the lawless, reckless swamp of US data-brokerages:

For millions of Americans – especially younger Americans – the failure to pass (or even introduce!) a federal privacy law proves that our institutions can't be trusted. They're right:

Occam's Razor cautions us to seek the simplest explanation for the phenomena we see in the world around us. There's a much simpler explanation for why people believe conspiracy theories they encounter online than the idea that the one time Facebook is telling the truth is when they're boasting about how well their products work – especially given the undeniable fact that everyone else who ever claimed to have perfected mind-control was a fantasist or a liar, from Rasputin to MK-ULTRA to pick-up artists.

Maybe people believe in conspiracy theories because they have hundreds of life-or-death decisions to make every day, and the institutions that are supposed to make that possible keep proving that they can't be trusted. Nevertheless, those decisions have to be made, and so something needs to fill the epistemological void left by the manifest unsoundness of the black box where the decisions get made.

For many people – millions – the thing that fills the black box is conspiracy fantasies. It's true that tech makes finding these conspiracy fantasies easier than ever, and it's true that tech makes forming communities of conspiratorial belief easier, too. But the vulnerability to conspiratorialism that algorithms identify and target people based on isn't a function of Big Data. It's a function of corruption – of life in a world in which real conspiracies (to steal your wages, or let rich people escape the consequences of their crimes, or sacrifice your safety to protect large firms' profits) are everywhere.

Progressives – which is to say, the coalition of liberals and leftists, in which liberals are the senior partners and spokespeople who control the Overton Window – used to identify and decry these conspiracies. But as right wing "populists" declared their opposition to these conspiracies – when Trump damned free trade and the mainstream media as tools of the ruling class – progressives leaned into schismogenesis and declared their vocal support for these old enemies of progress.

This is the crux of Naomi Klein's brilliant 2023 book Doppelganger: that as the progressive coalition started supporting these unworthy and broken institutions, the right spun up "mirror world" versions of their critique, distorted versions that focus on scapegoating vulnerable groups rather than fighting unworthy institutions:

This is a long tradition in politics: hundreds of years ago, some leftists branded antisemitism "the socialism of fools." Rather than condemning the system's embrace of the finance sector and its wealthy beneficiaries, anti-semites blame a disfavored group of people – people who are just as likely as anyone to suffer under the system:

It's an ugly, shallow, cartoon version of socialism's measured and comprehensive analysis of how the class system actually works and why it's so harmful to everyone except a tiny elite. Literally cartoonish: the shadow-world version of socialism co-opts and simplifies the iconography of class struggle. And schismogenesis – "if the right likes this, I don't" – sends "progressive" scolds after anyone who dares to criticize finance as the crux of our world's problems as popularizing "antisemitic dog-whistles."

This is the problem with "horseshoe theory" – the idea that the far right and the far left bend all the way around to meet each other:

When the right criticizes pharma companies, they tell us to "do our own research" (e.g. ignore the systemic problems of people being forced to work under dangerous conditions during a pandemic while individually assessing conflicting claims about vaccine safety, ideally landing on buying "supplements" from a grifter). When the left criticizes pharma, it's to argue for universal access to medicine and vigorous public oversight of pharma companies. These aren't the same thing:

Long before opportunistic right wing politicians realized they could get mileage out of pointing at the terrifying epistemological crisis of trying to make good choices in an age of institutions that can't be trusted, the left was sounding the alarm. Conspiratorialism – the fracturing of our shared reality – is a serious problem, weakening our ability to respond effectively to endless disasters of the polycrisis.

But by blaming the problem of conspiratorialism on the credulity of believers (rather than the deserved disrepute of the institutions they have lost faith in) we adopt the logic of the right: "conspiratorialism is a problem of individuals believing wrong things," rather than "a system that makes wrong explanations credible – and a schismogenic insistence that these institutions are sound and trustworthy."

(Image: Nuclear Regulatory Commission, CC BY 2.0)

Hey look at this (permalink)

A Wayback Machine banner.

This day in history (permalink)

#20yrsago I just finished another novel!

#20yrsago Lessig’s Free Culture, free online, under a Creative Commons license

#15yrsago London cops reach new heights of anti-terror poster stupidity

#15yrsago Bruce Sterling on “generative art”

#15yrsago Basil Wolverton’s Culture Corner — HOWTOs for modern living from the past

#15yrsago domain-owner’s house raided over publication of secret government censorship lists

#15yrsago Are kidnapped children tax-deductible?

#15yrsago New Zealand’s stupid copyright law dies

#10yrsago NSA hacked Huawei, totally penetrated its networks and systems, stole its sourcecode

#10yrsago Business Software Alliance accused of pirating the photo they used in their snitch-on-pirates ad

#10yrsago Turkey orders block of Twitter’s IP addresses

#10yrsago Wil Wheaton reads chapter one of Homeland

#10yrsago UK Tories ban sending books to prisoners

#10yrsago Unless companies pay, their Facebook updates reach 6 percent of followers


#5yrsago Man stole $122m from Facebook and Google by sending them random bills, which the companies dutifully paid

#5yrsago Video from the Radicalized launch with Julia Angwin at The Strand

#5yrsago More than 100,000 Europeans march against #Article13

#5yrsago Procedurally generated infinite CVS receipt

#5yrsago British schoolchildren receive chemical burns from “toxic ash” on Ash Wednesday

#5yrsago DCCC introduces No-More-AOCs rule

#1yrago The "small nonprofit school" saved in the SVB bailout charges more than Harvard

Upcoming appearances (permalink)

A photo of me onstage, giving a speech, holding a mic.

A screenshot of me at my desk, doing a livecast.

A grid of my books with Will Stahle covers.

Latest books (permalink)

A cardboard book box with the Macmillan logo.

Upcoming books (permalink)

  • Picks and Shovels: a sequel to "Red Team Blues," about the heroic era of the PC, Tor Books, February 2025
  • Unauthorized Bread: a graphic novel adapted from my novella about refugees, toasters and DRM, FirstSecond, 2025

Colophon (permalink)

Today's top sources:

Currently writing:

  • A Little Brother short story about DIY insulin PLANNING
  • Picks and Shovels, a Martin Hench noir thriller about the heroic era of the PC. FORTHCOMING TOR BOOKS JAN 2025
  • Vigilant, Little Brother short story about remote invigilation. FORTHCOMING ON TOR.COM
  • Spill, a Little Brother short story about pipeline protests. FORTHCOMING ON TOR.COM

Latest podcast: The Majority of Censorship is Self-Censorship

This work – excluding any serialized fiction – is licensed under a Creative Commons Attribution 4.0 license. That means you can use it any way you like, including commercially, provided that you attribute it to me, Cory Doctorow, and include a link to

Quotations and images are not included in this license; they are included either under a limitation or exception to copyright, or on the basis of a separate license. Please exercise caution.

How to get Pluralistic:

Blog (no ads, tracking, or data-collection):

Newsletter (no ads, tracking, or data-collection):

Mastodon (no ads, tracking, or data-collection):

Medium (no ads, paywalled):

Twitter (mass-scale, unrestricted, third-party surveillance and advertising):

Tumblr (mass-scale, unrestricted, third-party surveillance and advertising):

"When life gives you SARS, you make sarsaparilla" -Joey "Accordion Guy" DeVilla
