
Saturday, April 14, 2018

Breaking the Cycle of Anti-nuclear Indoctrination--Tribalism

In a nutshell, indoctrination is all about creating an "us versus them" mindset, about creating imaginary ingroups and outgroups. Facts are irrelevant. The following podcast made it clear to me how and why indoctrination works.

Go here for the transcript of a podcast titled Tribal Psychology on the You are Not so Smart (a celebration of self-delusion) website. I suspect the grammatical errors in the transcript are the result of its having been transcribed by a computer program instead of a person. A bunch of money quotes:
In the 1970s, a psychologist named Henri Tajfel developed something called social identity theory, which basically said that when we define ourselves, we do so in large part by asserting our loyalty to the groups to which we belong. Tajfel developed this theory when, in his research, he discovered it didn't take very much for humans to organize themselves into groups, and once they did, they immediately began to act like assholes to people who were in groups that they were not. Tajfel's experiments showed that humans can enter into us-versus-them thinking in seconds, and they will do so over just about anything.

After many experiments built on these foundations, social psychologists found that there is simply no salient, shared quality around which opposing groups will not form. And once they do form, people in those groups immediately begin exhibiting tribal favoritism, tribal signaling, tribal bias, and so on. And that's why this is called the minimal group paradigm. Humans not only instinctively form groups, they will form them over anything, no matter how arbitrary or minimal or meaningless.

One of the amazing takeaways from this work is that the origin of opposing tribes and groups and parties and any clustering that leads to partisanship is often something random and out of the individual member's control: where they were born, the religions they inherited, the schools they attended, the food they eat or don't eat. And these starting conditions are like specks of dirt around which cultural pearls form.

...we have a psychological need for both inclusion and exclusion. So, we need to feel that we are part of something, but we also need to feel that not anybody can be part of it in order for us to feel important ourselves. We need to feel like we're included in some group, and that there are outsiders. There are some people that don't get in. So there's something that makes you special, and the underlying idea behind social identity is really self-esteem based. So we need to be part of groups to enhance our own self-esteem, to feel good about ourselves as individuals. We have to feel that we're being accepted by the group, and that the group has status. And as long as that group has status, we feel good about ourselves. When the status of that group is threatened, we start feeling bad, and then we have to do things to improve the status of the group. That's why, in a conflict between groups, we start fighting a lot harder, because the conflict between groups is really a fight for your own self-esteem and sense of worth.

And the research in both psychology and neuroscience says that because our identities have so much to do with group loyalty, thinking in terms of us versus them is an essential property of the human brain, and if activated, if stimulated, we can't help but think in a tribal way.

He looked across many different examples of conflict and genocide, and he thought the differences that people claimed as the source of their hatred seemed arbitrary, rooted in meaningless categorical differences, not world-defining, ideological differences.

...if you made people into nothing more than a Group A and a Group B, or even strip that out and make people just Group Red and Group Blue. And then add one small, meaningless difference at a time, like one group wears hats and the other doesn't. At what point would people start showing animosity for the other, for them? When would they show favoritism for their group and bias toward their outgroup? If he could find that point, he thought it would establish a baseline for prejudice and discrimination. But what he discovered was that there is no baseline. Any noticeable difference of any kind will reliably stimulate the behavior that flows from tribal psychology.

This was a shocking finding for Tajfel, and he repeated this experiment in a number of different ways, and he found the same thing every time.

...and still knowing they were randomly assigned, people exhibited favoritism toward their imaginary ingroup and bias toward their imaginary outgroup.

The power of modern media and modern social media has allowed humans to signal their tribal loyalties on a scale that has never ever been possible, and this one thing might just be what is driving polarization.

Since we tend to form tribes very easily, and often around differences that are arbitrary, and since we usually are more motivated by tribal psychology than anything else, what is happening is that more and more issues are simply leaving the realm of compromise and debate, of evidence and rational analysis, and becoming mutated by politicization, by tribal signaling, and once an issue becomes politicized, it just leaves the realm of facts and figures — it just becomes another way to tell us from them.

There is just this natural thing that’s driving people to discriminate against their outgroup, regardless of content — even when it is a meaningless group. Which means that technically it’s possible for people to not disagree about anything, and still discriminate against the people that they are in competition with. That’s what I started with when I started my research — was that assumption — it should be possible for people to agree on stuff and still not like each other just based on identity alone, and I actually did find evidence for that.

...the desire to be correct becomes far less important than the desire to be a good member of your tribe.

Basically, to think like a Bayesian is to imagine your beliefs as a percentage of confidence instead of simply true or false. So instead of saying, "I believe my hamster is alive and well," you would say, "I am 70 percent sure that my hamster is alive and well based on the evidence available to me at this time." If we were motivated by the pursuit of accuracy above all else, Bayesian reasoning would be how we updated all of our beliefs, but we aren't, and it isn't.

...if individuals are members of groups who have become polarized about a particular issue, and that polarization puts the group’s opinions at odds with scientific consensus, people will almost always go with what their group believes over what the preponderance of the evidence suggests.

...the evidence is clear that humans value being a good member of their tribe much more than they value being correct.

We are unaware of how unaware we are, yet we proceed with confidence in the false assumption that we are fully aware of our motivations and the sources of our thoughts, feelings, and emotions. In fact, much of the time, if not most of the time, the true source of those things, the true motivations behind our behaviors, is often invisible or unknowable, or, in the case of tribal psychology, something we'd rather not believe about ourselves. None of us wants to think that we are simply parroting the perspectives of elites or going along with the attitudes of our tribes, but the work of Dan Kahan and Lilliana Mason and many others suggests that for many issues that is exactly what is happening.

...literally any evidence-based issue can become politicized.

Nobody was arguing about that one, even though it came just a couple of years before. The difference is that people learned about the HBV vaccine from their doctors. It wasn't politicized. The HPV vaccine, however, they probably learned about by watching MSNBC and Fox News, where the message was us versus them again. That occurred because the manufacturer took a very unorthodox route to try to introduce the vaccine.

Now people who are gladly allowing their children to get the HBV vaccine are opposed, completely opposed, to the nearly identically administered HPV vaccine. Now, it seems nonsensical, but again, being a good member of your tribe is more important than holding correct views, and Kahan says that the very same thing can happen to anything. Dark matter, volcanoes, Net Neutrality, self-driving cars. So, it’s in our best interest to keep every single scientific concept as neutral and bipartisan as possible, because once evidence is polluted by tribal loyalty, people can’t help but be wrong and stay wrong, even if 98 percent of scientists are telling them they should change their minds.

...there was a moment during the debates, the presidential debates, when Hillary Clinton said something about ingroup bias. And I think it was Mike Pence who said something like, "How dare you accuse us of being biased," and it just blew my mind. Obviously she wasn't saying that Republicans are biased; what she was saying is that every single human being has this in them. It's not offensive to say it, it shouldn't be offensive to say; it's just natural psychology.

A research study that we did recently showed that people who are high in science curiosity aren't as polarized on these issues, and they don't display this really kind of perverse effect of becoming more and more consistent with their group's position...

If your group has it right on what scientific consensus is, then just count yourself lucky, because you don't understand what the scientist does in his or her own terms. It just happens to be that your intermediary groups managed to get you the right answer despite the assault that they're under, and all groups have embarrassing instances where the message they've got from their intermediaries is false. And I think a little humility in recognizing that might well be something that can help us to try to solve these problems.

...you really need some empathy here because people can’t change their minds when they are trapped in tribes that believe one way or the other. They can’t accept the evidence even when they want to, even when they know in their hearts that they are incorrect.

The podcast suggests two ways out of this evolutionary trap:

Bayesian thinking
Ranking high in science curiosity
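
To make the first of those concrete, here is a minimal sketch in Python of Bayesian updating applied to the podcast's hamster example. The 70 percent prior comes from the quote above; the "empty food bowl" evidence and the likelihood numbers are purely illustrative assumptions of mine, not figures from the podcast.

# Minimal Bayesian update: treat a belief as a probability rather than a true/false flag.
# The 70 percent prior is from the quote above; the evidence and likelihoods are made up for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# "I am 70 percent sure that my hamster is alive and well."
belief = 0.70

# New evidence: the food bowl is empty this morning.
# Assume an alive hamster empties it 90 percent of the time, a dead or missing one 20 percent.
belief = bayes_update(belief, p_evidence_if_true=0.9, p_evidence_if_false=0.2)

print(f"Updated confidence: {belief:.0%}")  # roughly 91 percent

The particular numbers don't matter; the point is that a belief held as a percentage moves a little with each new piece of evidence instead of snapping between true and false.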

Assuming any given individual has the cognitive firepower to do those things, the motivation to do them is still largely lacking. Few people in the comment sections of websites like GTM and CleanTechnica are searching for truth. Most are simply reinforcing tribal ideology and helping their perceived tribe oust anyone they see as a member of another tribe who has infiltrated their tribal boundary.

All the same, it's important to keep putting facts out there for those still looking for them.

This article will be added to the list found at Breaking the Anti-nuclear Indoctrination Cycle.