Apr 8, 2023·edited Apr 8, 2023

I don't think people use Bayesian inference at all; I am sure you are familiar with the work of Daniel Kahneman and Amos Tversky, who tested exactly this and found that people's answers are not at all consistent with Bayesian reasoning, and are instead consistent with the use of simple heuristics. Even the critics of their findings within psychology (e.g., Gigerenzer) do not believe we use Bayesian reasoning; they just point to other heuristics. Maybe you have already considered this, and think they are dumb.
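For concreteness, here is the kind of calculation Kahneman and Tversky tested people against. A minimal sketch of their well-known cab problem, using the standard numbers from the literature (85% of cabs are Green, 15% Blue, and a witness who identifies cab color correctly 80% of the time):

```python
# Bayes' rule applied to the classic cab problem (Tversky & Kahneman).
# A witness says the cab was Blue; what is the probability it actually was?
prior_blue = 0.15              # base rate of Blue cabs
prior_green = 0.85             # base rate of Green cabs
p_say_blue_given_blue = 0.80   # witness reliability
p_say_blue_given_green = 0.20  # witness error rate

# P(Blue | witness says Blue) = P(says Blue | Blue) * P(Blue) / P(says Blue)
numerator = p_say_blue_given_blue * prior_blue
evidence = numerator + p_say_blue_given_green * prior_green
posterior_blue = numerator / evidence

print(round(posterior_blue, 3))  # 0.414
```

Most subjects answer something near 0.80, tracking the witness's reliability and ignoring the base rate entirely, whereas the Bayesian posterior is only about 0.41.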


"Whether or not they reject updates of priors based on the projected costs of such an update is an interesting question."

I do think this is the case, and there are both cognitive and emotional costs to updating priors. The cognitive cost is the cost of updating your world model given the new information, which can potentially result in cascading changes when the new information makes the model internally inconsistent. For instance, if you used to believe that women are equal to men, and in light of new evidence you now believe that women are substantially different from men, you then need to reassess your beliefs on the wage gap, which may then lead to reconsidering the role of women in the workforce, etc. Given how the mainstream world model is basically inverted, one small change in beliefs can necessitate a huge cascade of changes to keep the model consistent and reduce cognitive dissonance. This makes the “activation energy” for accepting non-mainstream beliefs very high for most people.

It’s also possible that the new data suggests a new possible world model, but doesn’t provide enough evidence to justify accepting it over the existing model. Then you would need to keep both models in mind when evaluating evidence, eventually accepting the new one or keeping the old one once it is clear which one is better. This is time and energy intensive, and requires a high IQ to begin with. This is why Kuhnian revolutions are generally the product of a single person who has taken the time to come up with a new model. Everyone who comes after simply accepts the new model and works within it (of course this is a simplification). See Max Planck’s quote about science advancing one funeral at a time.

Then you have the emotional cost. I think Peterson is basically right when he says that people live within a narrative that they tell about themselves, composed of their objective world model, their value hierarchy, and their accomplished past and projected future. Changes to this narrative can be very stressful, and if the change is large enough it can completely destabilize someone’s identity. It’s fundamentally the same reason that people ignore evidence that their partner is cheating on them, even when it’s obvious to other people. If you are a standard issue Leftist, then accepting evidence that points to the Black-White achievement gap being due to genetics can blow up your entire identity. Unless you have a fundamental commitment to truth (which I agree the vast majority do not have), you’re not very likely to pay that price.

“Knowing truth, whether unconsciously or not, should be maximally adaptive. If there is a wall in front of you, and it becomes politically expedient to lie about there being a wall, you want to be an organism that both lies about the wall and knows there is a wall there.”

Crafting a lie to fit a given ideology is cognitively expensive, and most of the population is probably not capable of doing that on a regular basis. Someone who is speaking honestly will simply generate an utterance based on their genuinely held worldview, but someone who is lying has to simulate an ideologically compliant distribution and then sample from that. This may be why individuals and populations that specialize in lying skew towards higher verbal IQs. Repeatedly lying can also have the effect of causing you to believe your own bullshit, which I think is an adaptation to reduce the cognitive cost of lying. Consequently, I think this is only practicable for people with >115 verbal IQ and a particular temperament (probably high in dark tetrad traits).
