Lest we forget, our species is called Homo sapiens, the “wise man”. We are supposed to be better than apes because, so the theory goes, we can “get into somebody’s head”, meaning that we know what other people know. We can attribute mental states such as intentions, goals, and knowledge to others.
So you would think that a person of average intelligence would have no problem watching Sean Spicer or Kellyanne Conway lying through their teeth and figuring out that they themselves don’t really believe the increasingly bizarre “alternative facts” they are spinning. Yet apparently about 30% of people believe every word they utter. How could that be? Understanding false belief is supposed to be one of the things that define us as H. sapiens.
The capacity to know the false beliefs of others
Here is a classic experiment that demonstrates how our minds work:
It starts by showing children a video of a doll named Sally hiding an item. Then they see Sally leave the room, and another doll comes in, takes the item, and hides it in a different place. When you ask the children where Sally will look for the item upon her return, children younger than about four will pick the new hiding place, where they themselves know the item to be.
However, children older than about four understand that Sally doesn’t know what they know: that the item was moved from its original hiding place. They answer that Sally will look for the item where she originally left it. This experiment demonstrates that even young children have the capacity to “get into somebody’s mind”.
In psychology, this capacity to know what’s going on in somebody’s mind, or not just to know but also to feel what another person feels (the essence of empathy), is called “theory of mind”. Initially it was thought to be a uniquely human characteristic, but now we know better.
In a series of sophisticated experiments, using eye-tracking technology, scientists repeated the doll experiment with apes (except that the doll was now King Kong instead of Sally). They showed that apes are just as smart as the older children when it came to figuring out a false belief compared to their own knowledge of the facts.
So, assuming that the 30% of people who believe Spicer and Conway’s post-fact “facts” are more sophisticated than children under four and at least as sophisticated as the apes, what explains their inability to separate fact from fiction?
And lest I come across as a rank partisan, I include among alternative-fact believers the leftists and liberals who believe that vaccination causes autism despite overwhelming evidence to the contrary, including the fact that the original publication claiming evidence for it was found to be fraudulent, plain and simple. I also include those who hold, almost as a religious belief, that GMOs will kill you or that dairy foods are toxic.
Not only does science not support these unfounded beliefs, the evidence to the contrary is in plain sight. Witness the millions of people who eat and drink those ‘harmful’ things and are still alive and kicking, in excellent health. This includes those fanatic practitioners who unwittingly consume GMOs, which are in almost every food we eat nowadays, regardless of claims to the contrary.
An intriguing explanation
So how do we know what we know? Just think for a moment: suppose you were totally isolated from other human beings and all you knew came from personal observation of your immediate environment. Obviously, your fund of knowledge would be pretty limited.
The main point that two cognitive scientists make in an article in the NYT is that all human knowledge is shared. For example, we know that the earth is round, but this knowledge came not from our own observation but from scientists and teachers who shared it. The evolutionary imperative of sharing knowledge is obvious. Can you see a lone Stone Age hunter surviving very long? It is the band, with shared knowledge and strategy, that could corner fleet-footed prey or overcome a ferocious predator.
The issue at hand is not that the people who believe non-facts are stupid; they are not. They are simply sharing, knowingly or unknowingly, non-factual knowledge.
In the words of the cognitive scientists mentioned above:
“It is remarkable that large groups of people can coalesce around a common belief when few of them individually possess the requisite knowledge to support it.”
Their solution to the problem? Insist on “expertise and nuanced analysis from our leaders.”
Ha! They just got through telling us that, individually, we are pretty much ignoramuses, naturally prone to believe whatever falsehoods “leaders” feed us. It takes a great leap of faith to believe that our “leaders” would be honest enough, thoughtful enough, and empathic enough to think of the common good rather than their own.
I am a bit more cynical. I believe that people act, and vote, according to their instincts, prejudices, and beliefs embedded in their psyche over a lifetime, not necessarily through rational analysis. They will vote for people who, on a gut level, seem like them: people who hold the same prejudices, use the same language, or come from a similar social background. None of these factors is quantifiable. Gut feeling reigns supreme. This is basically Kahneman’s System 1, the instinctual one, vs. System 2, the analytical one.
Once we form our view of the world, it is hard to change it. As I said above, our mind perceives a challenge to it as existential. And if we grew up in the Bible Belt, we “cling to our Bible and guns”, as someone said.
We are tribal by nature. And the common belief in alternative facts is not an accident; it makes the worldview of the true believers internally consistent, and hence gives it its strength.
Can this be changed?
I am afraid not by much. Our minds are inherently lazy; System 1 rules supreme in the brain. It takes time and persistent counter-experiences to convince us that this system is failing us or, in plain language, “to change our minds”, which is another way of saying that System 2 (analytical) will eventually embed our experiences into System 1 (instinctual).
It will take major outbreaks of polio to finally convince the mothers of Marin County and Manhattan to accept childhood vaccination. It took widespread unemployment and major outbreaks of crack addiction in their own communities for people to realize that “their tribe” is not immune to the “other tribe’s” problems.
How do you break down the walls that tribalism erects? Not easily, unfortunately; tribalism is deeply embedded in us.