Beliefs I currently hold:
- In the past we have been surprised by the capacity of blatantly false information to persuade large groups of people.
- In the future we will continue to be surprised by the ability of blatantly false information to persuade large groups of people.
From the point of view of classical economic theory, this is almost a two-way paradox. First, why aren’t people rationally updating their beliefs to be more skeptical of the information presented to them by state and private media with fairly transparent agendas? And if we accept the premise of that first question, it invites a second: why is anyone surprised by the efficacy of propaganda and the credulity of large swaths of the public? Shouldn’t we, the meta-observers, also be updating our beliefs?
My preferred explanations, as they stand, are:
- Preference falsification, i.e. people are not fooled; they simply believe that, at the moment, it is safer and more rewarding to appear as though they believe the lies.
- Social coordination, i.e. narrowly held false beliefs make for better coordination mechanisms for solving collective action problems than broadly held truths do.
The first is a classic theory that originated with Timur Kuran’s seminal work. Whenever the median voter, or median would-be voter in an autocracy or failed democracy, seems to be held in sway by particularly transparent propaganda, I usually start from the default assumption of preference falsification. These people know the media regime they live within is a menagerie of lies that exists solely to flatter leadership and disrupt any opposition, but they also know that their short-run futures remain more secure if they not only publicly accept the lies but actively parrot them.
For now, at least. Preference falsification is an inherently fragile equilibrium. As effective and impenetrable as propaganda can appear at a given moment, public support for those lies can collapse in the blink of an eye, a dynamic only intensified by modern communication technology.
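To make that fragility concrete, here is a minimal threshold-cascade sketch in the spirit of Granovetter and Kuran (a toy model of my own with made-up numbers, not anything drawn from Kuran’s work): each agent publicly dissents once the visible share of dissenters crosses a private threshold. The same handful of early dissenters leaves one population frozen and tips another into near-total public defection.

```python
import random

def simulate_cascade(thresholds, seed_dissenters):
    """Run to a fixed point: an agent voices dissent once the visible
    share of dissenters meets or exceeds their private threshold."""
    n = len(thresholds)
    dissenting = [i < seed_dissenters for i in range(n)]
    changed = True
    while changed:
        changed = False
        share = sum(dissenting) / n
        for i, t in enumerate(thresholds):
            if not dissenting[i] and share >= t:
                dissenting[i] = True
                changed = True
    return sum(dissenting) / n

random.seed(0)
n = 1_000

# Population A: private thresholds spread widely -- ten early dissenters
# move no one else, and the public face of the lie holds.
spread = [random.uniform(0.05, 0.95) for _ in range(n)]

# Population B: same ten seeds, but thresholds bunched just above zero --
# the equilibrium unravels into near-total public defection.
bunched = [random.uniform(0.005, 0.10) for _ in range(n)]

print("spread thresholds :", simulate_cascade(spread, seed_dissenters=10))
print("bunched thresholds:", simulate_cascade(bunched, seed_dissenters=10))
```

The point of the sketch is that the thresholds are private: the two populations look identical from the outside right up until one of them tips.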
The ability of false beliefs to solve coordination problems is more subtle, but no less salient to the propaganda premium puzzle, particularly when a regime depends on a small subset of society to hold on to power (a “selectorate”) or on the support of a political “base” whose members would otherwise have difficulty signaling their identity to one another. The reality is that obvious or widely shared truths have almost no value for signaling mutual affinity and trustworthiness among individuals trying to solve collective action problems. Patently ridiculous beliefs, on the other hand, work precisely because the only people who would publicly commit to holding them are those who are committed to the collectively produced club good.
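A back-of-the-envelope sketch of that signaling logic, with hypothetical numbers of my own (nothing here comes from the sources above): professing a belief is worth it only if the club good it buys access to outweighs the cost of professing it, so cheap truths are professed by everyone and reveal nothing, while costly absurd beliefs are professed only by the committed.

```python
def professes(cost_of_professing, club_good_value):
    """An agent publicly professes a belief only if membership in the
    club the belief signals is worth the price of professing it."""
    return club_good_value > cost_of_professing

# Hypothetical costs: a banal, widely shared truth is nearly free to state;
# a patently ridiculous belief carries real reputational and cognitive costs.
beliefs = {"banal truth": 0.1, "absurd belief": 5.0}

# Hypothetical agents: an insider who prizes the club good, an outsider who does not.
agents = {"insider": 10.0, "outsider": 1.0}

for belief, cost in beliefs.items():
    for agent, value in agents.items():
        print(f"{belief:13s} | {agent:8s} professes: {professes(cost, value)}")

# The banal truth is professed by both agents, so observing it tells the group
# nothing; the absurd belief is professed only by the insider, which is exactly
# what makes it useful as a membership signal.
```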
So why does propaganda continue to work better than we think it should? Because we’re using the wrong metrics. Or, more precisely, because the right metrics aren’t available to us. We can ask people what they believe, but we can’t make them tell us the truth. And even if we could make them tell us the truth, we can’t measure the benefits motivating their reasoning, the value of the club goods they gain access to because they’ve performed the mental gymnastics necessary to hold those beliefs. Sure, it was cognitively costly to convince yourself the earth is likely flat, but those costs are trivial in the face of falling out of the tightest network of friends you’ve ever been a part of.
All of this armchair theorizing is really just a long-winded way of suggesting that fighting propaganda is decidedly not about curing people of their false beliefs. If you want to unravel preference falsification, people don’t need the truth. They already know the truth. What they need are safe channels to express it to one another. If subgroups are forming around false beliefs, the answer is not to shame them for their beliefs. That will only strengthen their group and members’ commitment to one another. Rather, the answer is to provide superior substitutes for the club goods they are currently receiving. When in doubt, if you want to break a social equilibrium, you’re better off giving people what they need rather than demeaning what they have.
Come to think of it, that’s probably pretty good life advice in general.
This is a hugely important topic (sadly). I agree with the action steps: “If subgroups are forming around false beliefs, the answer is not to shame them for their beliefs. That will only strengthen their group and members’ commitment to one another. Rather, the answer is to provide superior substitutes for the club goods they are currently receiving. When in doubt, if you want to break a social equilibrium, you’re better off giving people what they need rather than demeaning what they have…”
But I am not so sure that the masses currently under the spell of the propaganda really “know the truth”. For instance, in today’s Russia, the urban elites all know that the official news is bogus, but from all I have read, the masses (something like 85% of the people) have bought into the official stories (Ukraine/the West is the aggressor, all those dead bodies in Bucha were planted there by Ukrainians after the nice Russian troops withdrew, etc.). It is so much easier just to go along with what you are being told, especially if it meshes with decades of prior propaganda about how the West is out to get Russia….