Friday, February 21, 2014

How to make a Social Justice Warrior

This is the almost-final draft of a section from a book about social justice warriors, identitarianism, and mobbing. For more information and links to other chapters, see How to Make a Social Justice Warrior.

How to make a Social Justice Warrior
• The cult symptoms of Social Justice Warriors
• Ten Points About Conformity

How to make a Social Justice Warrior

“Decent people participate in horrific acts not because they become passive, mindless functionaries who do not know what they are doing, but rather because they come to believe—typically under the influence of those in authority—that what they are doing is right.” —Alexander Haslam

“It has been said that man is a rational animal. All my life I have been searching for evidence which could support this.” —Bertrand Russell

“Man is a rational animal who always loses his temper when called upon to act in accordance with the dictates of reason.” —Oscar Wilde

“When we remember that we are all mad, the mysteries disappear and life stands explained.” —Mark Twain

The recipe for making a Social Justice Warrior:

1. Call attention to an injustice.

2. Blame the injustice on a group the injustice appears to favor.

Social justice warriors are the crusaders of identitarianism. Where history’s crusaders cried “God wills it!” to excuse their deeds, today’s crusaders cite social justice. Just as many Christians reject the Crusaders’ approach, many identitarians reject the warriors’ approach—but just as few Christians spoke out against the Crusaders, few identitarians speak out against social justice warriors. “They mean well,” their enablers say without remembering that history is filled with bad things done with the best intentions.

Though social justice warriors sometimes act offline—delivering death threats, calling employers, etc.—they are primarily an internet phenomenon. This doesn’t mean they’re a product of the internet. Their predecessors posted flyers, shared pamphlets, and scrawled their outrage on walls.

To social justice warriors, the most privileged group in the US is white men. They fail to see that rulers may have nothing to do with the ruled. According to the US Census Bureau in 2004, white people were 82% of all US households, 75% of the lowest economic quintile (the bottom 20%), and 88% of the top 5%. If you believe the races should be evenly distributed, white representation in the bottom quintile was about 91.5% of proportional, and in the top 5% it was about 107%. But that approach means Asian Americans are far more privileged. Though Asian Americans were 3.65% of the population in 2004, they were only 2.76% of the bottom quintile and 6.46% of the top tier; their representation in the bottom quintile was about 76% of proportional, and in the top tier about 177%. By identitarian logic, Asian Americans are nearly twice as privileged as white Americans.
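The percentages above reduce to a single ratio: a group's share of an income tier divided by its share of the overall population. A minimal sketch in Python, using only the 2004 Census figures quoted in the paragraph above (the function name is illustrative):

```python
# Representation ratio: a group's share of an income tier divided by its
# share of the population overall, as a percentage. 100% means exactly
# proportional; below 100% means underrepresented, above means over.
def representation(tier_share, population_share):
    return tier_share / population_share * 100

# White households: 82% of all US households (2004 Census figures above).
print(round(representation(75, 82), 1))   # bottom quintile: 91.5
print(round(representation(88, 82)))      # top 5%: 107

# Asian American households: 3.65% of the population.
print(round(representation(2.76, 3.65)))  # bottom quintile: 76
print(round(representation(6.46, 3.65)))  # top tier: 177
```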

If you prefer to look at median household income when thinking about privilege, the United States Census Bureau’s 2006-2010 American Community Survey found the median household income was $54,857 for white Americans and $68,089 for Asian Americans. By that analysis, white Americans were only 80% as privileged as Asian Americans.

If you’d rather look at religion and privilege, Pew found in 2009 that 43% of Hindus and 46% of Jews make more than $100,000 a year, while only 21% of Protestants, 16% of Mormons, and 19% of Catholics are in that same top bracket. By identitarian logic, Jewish and Hindu Americans should be more than twice as privileged as Christian Americans.
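The same ratio arithmetic covers the last two comparisons; a quick check in Python, using the ACS and Pew figures quoted above:

```python
# Median household figures, 2006-2010 American Community Survey (as quoted).
white_income = 54_857
asian_income = 68_089
print(round(white_income / asian_income * 100, 1))  # 80.6 (the text rounds to 80%)

# Pew 2009: share of each group earning over $100,000 a year (as quoted).
jewish, hindu, protestant = 46, 43, 21
print(round(jewish / protestant, 2))  # 2.19, "more than twice"
print(round(hindu / protestant, 2))   # 2.05
```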

But that calls for identitarian logic to be consistent. It’s not. Subjectivity doesn’t call for consistency.

• The cult symptoms of Social Justice Warriors

Irving L. Janis could have been describing social justice warriors when he wrote in Victims of Groupthink: “The member’s firm belief in the inherent morality of their group and their use of undifferentiated negative stereotypes of opponents enable them to minimize decision conflicts between ethical values and expediency, especially when they are inclined to resort to violence. The shared belief that ‘we are a wise and good group’ inclines them to use group concurrence as a major criterion to judge the morality as well as the efficacy of any policy under discussion. ‘Since our group’s objectives are good,’ the members feel, ‘any means we decide to use must be good.’ This shared assumption helps the members avoid feelings of shame or guilt about decisions that may violate their personal code of ethical behavior. Negative stereotypes of the enemy enhance their sense of moral righteousness as well as their pride in the lofty mission of the in-group.”

Janis listed eight symptoms of groupthink:

1. an illusion of invulnerability, shared by most or all the members, which creates excessive optimism and encourages taking extreme risks; 

2. collective efforts to rationalize in order to discount warnings which might lead the members to reconsider their assumptions before they recommit themselves to their past policy decisions; 

3. an unquestioned belief in the group’s inherent morality, inclining the members to ignore the ethical or moral consequences of their decisions; 

4. stereotyped views of enemy leaders as too evil to warrant genuine attempts to negotiate, or as too weak and stupid to counter whatever risky attempts are made to defeat their purposes; 

5. direct pressure on any member who expresses strong arguments against any of the group’s stereotypes, illusions, or commitments, making clear that this type of dissent is contrary to what is expected of all loyal members; 

6. self-censorship of deviations from the apparent group consensus, reflecting each member’s inclination to minimize to himself the importance of his doubts and counterarguments; 

7. a shared illusion of unanimity concerning judgments conforming to the majority view (partly resulting from self-censorship of deviations, augmented by the false assumption that silence means consent); 

8. the emergence of self-appointed mindguards—members who protect the group from adverse information that might shatter their shared complacency about the effectiveness and morality of their decisions.

Cults keep members from questioning their assumptions by creating a self-contained environment, complete with unique terminology. Robert Jay Lifton wrote in Thought Reform and the Psychology of Totalism, “The language of the totalist environment is characterized by the thought-terminating cliché. The most far-reaching and complex of human problems are compressed into brief, highly reductive, definitive-sounding phrases, easily memorized and easily expressed. These become the start and finish of any ideological analysis.”

Whether the anonymous author of “The Culture of Cults” offers an accurate description of cults in general, I don't know, but much of the writer’s description fits Social Justice Warriors. It identifies:

Their kind of cult: “therapy cults promote a secular type of belief system, based on quasi-scientific or quasi-psychological principles.”

Their approach: “Actions which, to an outsider, might seem devious or immoral, may, in the mind of a believer, seem perfectly just and ethical.”

And their pursuit of ideological perfection: “‘The Demand for Purity: The creation of a guilt and shame milieu by holding up standards of perfection that no human being can accomplish. People are punished and learn to punish themselves for not living up to the group's ideals.’”

Its list of cult traits that fit social justice warriors:

• Independent and non-accountable—believers follow their own self-justifying moral codes: e.g. a Moonie may, in their own mind, justify deceptive recruiting as 'deceiving evil into goodness'.

• Aspirational—they appeal to ambitious, idealistic people. The assumption that only weak, gullible people join cults is not necessarily true.

• Personal and experiential—it is not possible to exercise informed free choice in advance, about whether the belief system is valid or not, or about the benefits of following the study and training opportunities offered by the group. The benefits, if any, of group involvement can only be evaluated after a suitable period of time spent with the group. How long a suitable period of time might be, depends on the individual, and cannot be determined in advance.

• Hierarchical and dualistic—cult belief systems revolve around ideas about higher and lower levels of understanding. There is a hierarchy of awareness, and a path from lower to higher levels. Believers tend to divide the world into the saved and the fallen, the awakened and the deluded, etc.

• Bi-polar—believers experience alternating episodes of faith and doubt, confidence and anxiety, self-righteousness and guilt, depending how well or how badly they feel they are progressing along the path.

• Addictive—believers may become intoxicated with the ideals of the belief system, and feel a vicarious pride in being associated with these ideals. Cults tend to be cliquey and elitist, and believers can become dependent on the approval of the group's elite to maintain their own self-esteem...

• Non-falsifiable—a cult belief system can never be shown to be invalid or wrong. This is partly why critics have low credibility, and why it can be difficult to warn people of the dangers of a cult.

• Ten Points About Conformity

1. Choose your group carefully—you will conform

“Every generation laughs at the old fashions, but follows religiously the new.” —Henry David Thoreau

“We are not talking about mere instinctive conformity — it is, after all, a perennial failing of mankind. What we are talking about is a rationalized conformity — an open, articulate philosophy which holds that group values are not only expedient but right and good as well.” —William H. Whyte on groupthink

A 2011 Harvard study led by Jamil Zaki found that the process of changing your mind to conform may involve literally changing your mind. Men were asked to rate women in terms of attractiveness, then told their ratings differed from most other men’s. The men then rated the women again, and their new choices were closer to what they thought the majority believed. Changes in the parts of their brains linked to subjective value suggested the men were not lying to conform. They were rewriting their opinions.

2. Your community may change your memories

“Social manipulation can alter memory and extend the known functions of the amygdala to encompass socially mediated memory distortions.” —Micah Edelson, Tali Sharot, Raymond J. Dolan, Yadin Dudai, “Following the Crowd: Brain Substrates of Long-term Memory Conformity”

3. Your community may make you more extreme

"Groups consisting of individuals with extremist tendencies are more likely to shift, and likely to shift more (a point that bears on the wellsprings of violence and terrorism; the same is true for groups with some kind of salient shared identity like Republicans, Democrats, and lawyers, but unlike jurors and experimental subjects). When like-minded people are participating in 'iterated polarization games'—when they meet regularly, without sustained exposure to competing views—extreme movements are all the more likely." —Cass R. Sunstein, "The Law of Group Polarization"

4. If your friends jump off a bridge, you’ll probably jump too

Referring to people who choose not to vaccinate their children as “nonconformers” and calling their communities “people networks”, Nancy Walsh noted in “Social Network Sways Vaccine Compliance” that “the most striking difference between the conformers’ and nonconformers’ people networks was that 72% of the nonconformers’ network members also were in favor of nonconformity, while only 13% of conformers’ network members held that view.”

5. Creating a believer is all about timing

Kim of the "Den of the Biting Beaver" wrote: “I suppose it all came together for me when I bought my first Andrea Dworkin book, Pornography: Men Possessing Women, right about the time I kicked my ex-husband out. In it I found names for the infrastructures I had already recognized in my mental meanderings. Andrea Dworkin gave me a center from which to work, and I connected with her words on a very basic level. From there I read everything I could get, and sometime around then I also began blogging about the threads I was so excited to be seeing.”

If she had read a book by a Scientologist or a Jehovah's Witness or Ayn Rand, she would’ve found different "names for the infrastructures" that gave her a “center from which to work”. There are times when we know something is wrong with the world and we need a model to understand it. Then we accept the first one that seems to work. So long as we can interpret the world in that model’s terms, we have no reason to seek another.

6. Trolls can make you doubt me: the nasty effect

In one study, 1,183 participants read a fake blog post about new technology. Half saw polite comments on the post; half saw rude ones. Except for the tone, the comments were similar in content and length. One of the study’s co-authors, Dominique Brossard, said, “Basically what we saw is people that were exposed to the polite comments didn’t change their views really about the issue covering the story, versus the people that did see the rude comments became polarized — they became more against the technology that was covered in the story.”

When people with agendas act offensively, they can be acting effectively. When warriors object to “tone policing” and “concern trolling”, they’re not just being abusive, they’re furthering their cause.

The “nasty effect” makes sense if you think humans are pretentious monkeys. When one group is flinging feces, the others fear there’s a good reason.

7. You will trust me if you agree with me: confirmation bias

“Men readily believe what they want to believe.” —Julius Caesar

"In one sense [Stephen Jay] Gould has been proved right, though not in the way he would have wanted. His distortion of Morton’s data reveals how strongly held ideological beliefs – in this case not racism but anti-racism – can persuade one to see what one wants to see among the thicket of facts." —Kenan Malik, “The Science of Seeing What You Want To See”

Confirmation bias is the tendency to trust information that supports our beliefs and reject information that doesn’t. Humans may be susceptible to confirmation bias because it’s easiest to be a complacent member of a tribe if it’s hard to lose faith in the tribe. Confirmation bias explains why most people don’t change their opinions until the circumstances of their lives change.

Confirmation bias pays the bills in journalism. Pundits provide fuel for their fans’ beliefs. We love the ones whose views are like ours and hate those who differ. Whether they tell the truth is irrelevant. We listen for confirmation, not information.

8. You may vilify people and lose any ability to question your beliefs

Clay Shirky wrote in “A Group Is Its Own Worst Enemy”: “The second basic pattern that Bion detailed: The identification and vilification of external enemies. ...even if someone isn't really your enemy, identifying them as an enemy can cause a pleasant sense of group cohesion. And groups often gravitate towards members who are the most paranoid and make them leaders, because those are the people who are best at identifying external enemies. The third pattern Bion identified: Religious veneration. The nomination and worship of a religious icon or a set of religious tenets. The religious pattern is, essentially, we have nominated something that's beyond critique.”

9. You may come to believe in altruistic punishment

From Douglas Preston’s “Amanda Knox: She was acquitted of the Meredith Kercher murder. Why do people still hate her so much?”: “Experiments show that when some people punish others, the reward part of their brain lights up like a Christmas tree. It turns out we humans avidly engage in something anthropologists call “altruistic punishment.” What is altruistic punishment? It is when a person punishes someone who has done nothing against them personally but has violated what they perceive to be the norms of society. Why “altruistic”? Because the punisher is doing something that benefits society at large, with no immediate personal gain. Altruistic punishment is normally a good thing. Our entire criminal justice system is based on it….”

10. You will dance to the Amygdala Hijack, or the Hulk Effect

Humans react, then think. The reason? Our emotions are controlled by the brain’s limbic system and, more specifically, by the amygdala. Wikipedia defines the amygdala hijack as “…the term to describe emotional responses from people which are immediate and overwhelming, and out of measure with the actual stimulus because it has triggered a much more significant emotional threat. ...not all limbic hijackings are distressing. When a joke strikes someone as so uproarious that their laughter is almost explosive, that, too, is a limbic response. It is at work also in moments of intense joy.”

The hijack could be called the Hulk Effect: the angrier we get, the stupider we get. From “Neuroscience Fundamentals—The limbic System”: “Amygdala hijack is known to be an evolutionary response to the environment where there is no time for rational thinking. Actions must be done to protect yourself from harm immediately resulting in “unthinkingly” or impulsive behaviours. Hadley (2010) proposed that up to 75% of the conscious reasoning is lost during the hijack. This conclusion was backed up by another paper by Peters (2011) who claimed that the energy sent to prefrontal cortex is greatly reduced during the hijack. Moreover, only 5% of the brain is devoted to the “present” situation whereas the rest is occupied with the past or future hassles.”

The hijack especially affects teenagers. From Kelly Pfeiffer’s “The Adolescent Brain and Decision Making Skills”: “In 2008, B. J. Casey’s research team at Weill Medical College of Cornell University and California Institute of Technology published an article suggesting that the limbic system is the major contributor of poor judgment and impulsivity in the teen brain. The team proposed that although the frontal lobe of the cortex can weigh options to make a decision with a safe outcome, a teen’s “on guard” limbic system often wins out over the reasoning of the prefrontal cortex resulting in more high risking taking decisions. The theory is that the impulsive ever vigilant limbic system keeps a teen’s brain focused on primal tasks such as finding a mate, elevating one’s status with peers and seeking pleasure activities such as eating, sex and novelty.”

But adults shouldn’t get smug. When we feel a threat—including a threat to our belief systems—we all dance to the Limbic Hijack.