It only takes one dissenting voice
DURING the decades leading up to World War II, fascists around the world were very publicly supported by highly influential sections of the international ruling elite, in political, cultural and commercial terms.
And this was happening at the same time as many working people were speaking out against those same fascists, arguing that the far right was being used as a counter-force against socialist movements in the regions where the elite were most threatened.
Italy, Germany and Spain were perhaps the three countries in the northern hemisphere most likely at that time to develop into revolutions. It is no surprise, then, that in those countries during the pre-war years, alongside the traditions of patriarchy and anti-semitism, the fascists also embraced the capitalist tradition of anti-trade unionism and the ruling elite’s tradition of anti-equality.
The war changed everything. As journalists started to report back from the liberated concentration camps, millions of very angry people began asking the sorts of questions that threaten the structures of power.
While many of the pre-war fascist sympathisers within the ruling elite fell silent, the wider political class recognised the urgency and necessity for a clear explanation of how this had come about. And it wasn’t long before almost an entire field of psychology began focusing on the questions that many were asking.
The first reasoned and informed response to these questions came from Theodor Adorno et al, who argued that an authoritarian personality type existed.
Adorno’s team suggested that certain people were simply more susceptible to succumbing to authoritarian ideologies and demagogues.
They even created a scale of traits that they believed could be used to test for it.
Even though this work has been largely consigned to the academic texts, there is one very interesting conclusion that comes out of it.
The Adorno study suggested that a direct relationship may exist between the way an individual thinks and behaves and the groups they see themselves as belonging to and those they see themselves as outside of.
The next major study was in 1951, when a research team led by Solomon Asch conducted a series of experiments looking at how a majority of people within a larger group can influence a minority in that same group.
In the experiment, the groups were shown two cards, one with a single line on it, and the second with three lines of different lengths.
They were then asked, in turn, to say out loud which of the three lines was the same length as the line on the single line card.
Only one of the group was an actual participant in the study, while the rest were confederates of the study team.
The experiment was run multiple times with multiple groups, with the confederates sometimes answering correctly and sometimes incorrectly.
The study was testing how often the participant would repeat the majority opinion when it was wrong. Although the average rate of conforming to an incorrect majority opinion was only around 37 per cent, 75 per cent of all the participants conformed to an incorrect opinion on at least one occasion and 5 per cent of them conformed on every occasion.
Evidence was mounting that, in certain circumstances, large sections of society appeared willing to express an opinion they knew to be wrong in order to be part of the majority in a group.
Afterwards the participants gave three explanations for why they had conformed. The first was that they honestly believed that they were giving the right answer.
The second was that they doubted their own judgement based on the view of the majority. And the last explanation was that they had acted in a way they believed would avoid them being judged badly by the other members of the group.
Variations on this study continue to this day. More recent findings have suggested that the level of conformity increases if the group is pre-established, as in the case of a local political or campaigning group, or if members of the group are thought to be of high social status, such as highly educated or wealthy.
It also appears that this majority-influence effect only needs as few as three people for it to take place. The variables that show conformity decreasing are also very interesting.
It appears that it takes only one dissenting voice, regardless of whether they agreed with the participant or not, to decrease the level of conformity by as much as a quarter.
It took almost a decade for the next major study to report. This time, a team at Yale University led by Stanley Milgram had been running a series of studies to research the conflict between a person’s conscience and their obedience to authority.
In their model, the participant, referred to as the “teacher,” was put in front of a machine they were told would deliver increasing levels of electric shock to another person, referred to as the “learner.”
The “teacher” was told that they were testing to see the effect of the shocks on the “learner’s” ability to learn a series of paired words and that they would need to administer an increasing level of shock for each incorrect answer.
The “learner,” in fact a confederate of the study team, was in the next room, so that the “teacher” could hear the “learner’s” responses to the shocks.
A second confederate, playing the role of a researcher, told the “teacher” they had to continue whenever they expressed a desire to stop administering the shocks.
The first study found 65 per cent of participants obeyed every order to continue, even when it appeared to threaten the life of the “learner.”
Milgram’s team and many others have since replicated this study on numerous occasions, with almost countless variations, to better understand its terrifying conclusions.
There have been studies comparing male and female participants, different proximities between the “learner” and the “teacher,” different types of building for the experiment to take place in, participants of different educational levels and even one comparing military participants with non-military participants.
In one variation the “teacher” was paired with two other “teachers,” who were in fact confederates and who defied the order to continue.
In this study the proportion of participants who obeyed to the point of administering shocks at the highest level dropped to just 10 per cent.
It was the second major study to demonstrate that it is possible that a significant number of us, given the right circumstances, have a propensity to allow others to dictate our behaviour, even when it directly conflicts with our own judgement.
But that wasn’t the only similarity. The Milgram study also suggested that dissent can significantly reduce the level of obedience of others.
The third major study built on Asch’s work on majority influence. It was led by Serge Moscovici and examined the strength of influence a minority opinion can have within a larger group.
The study consisted of 32 groups of six people, each made up of four participants and two confederates. Each group was shown 36 blue slides, and had to report on the colours afterwards.
In the first experiment the confederates consistently said that the slides were green. In the second experiment the answers were written down. And in the final experiment the confederates answered green 24 times, and blue 12 times.
Overall, the participants agreed with the confederates just under 8.5 per cent of the time, with 32 per cent of participants agreeing at least once.
The team concluded that the order of the confederates’ answers had no significant effect but that, when the confederates were inconsistent, agreement dropped to just 1.25 per cent of the time.
Moscovici’s team had demonstrated that, in laboratory conditions, a minority opinion can influence a majority opinion and, furthermore, that the more consistent the minority opinion is, the more influence it has on the majority.
The fourth major study took place in 1971 at Stanford University, led by Haney, Banks and Zimbardo. It became known as the Stanford Prison Experiment.
Building on the findings of Adorno, Asch, Milgram and Moscovici, the purpose was to understand whether it is the situation that a person is in or the person themselves that dictates how they will behave towards others.
And, yet again, what happened shocked everyone involved. The participants who had been allocated the roles of “guards” became oppressive, abusive and sadistic, while the participants who had been allocated the “prisoner” roles became submissive, compliant and secretive.
The experiment was originally planned to last two weeks, but by the second day it was already getting out of control. Five of the “prisoners” had to be released as they were displaying symptoms of extreme depression and acute anxiety.
Shortly after that, only six days in, the entire experiment had to be shut down. The research team had realised that even they were not immune to taking on roles in the prison narrative that they had created.
Dr Zimbardo has since gone to astonishing lengths to warn others of the virulent nature of situational influences.
He has admitted how it took people coming in from outside the experiment to make him realise that he had unwittingly embraced the role of a prison administrator, putting the participants whose safety he was responsible for at risk.
In bringing the Stanford Prison Experiment to a close and presenting their conclusions, Haney et al suggested that the participants’ change in behaviour was in response to a normative social influence and, rather than internalising a new understanding of themselves, they were in fact complying with a perceived set of traits of a preconceived group.
To put it simply, the “prisoners” allowed themselves to become prisoners, while the “guards” allowed themselves to become guards, and all that they believed those two roles entailed.
Haney et al went on to argue that the unexpected submission of the “prisoners” came about through an accumulation of three processes.
The “prisoners” had lost a sense of being individuals by accepting the numbers allocated to them, they had lost any sense of being able to predict the “guards’” behaviour and they had become dependent on the “guards,” who controlled the food, toilets, sleep times and access to the outside world.
Decades later Dr Zimbardo would recognise the pattern he saw in 1971 again, as images of the torture at Abu Ghraib flashed across television screens around the world. It appeared that we had learned nothing.
Moscovici’s study argues that a very small persistent minority can, over time, drive ever increasing numbers of people to internalise an opinion they know to be wrong.
Like a social cancer eating away at our communities and activist groups, if it is not taken seriously, brought out into the open and confronted, it can grow into a majority opinion.
In the Asch study, at least one of the reasons the participants gave for why they conformed with a clearly incorrect majority opinion was that their need to establish and become a member of a group among strangers was greater than their need to demonstrate that they were able to differentiate right from wrong.
Without dissenting voices, a great many people will accept an opinion, an attitude or a behaviour that they know to be wrong.
Haney’s study builds upon this by demonstrating how, by institutionalising these sorts of ideas and behaviours into group roles, a great many of us can easily find ourselves playing out roles we had previously thought ourselves incapable of.
And for those of us who imagine that we would not “just follow orders,” the obedience to authority studies argue that, given the right conditions, an overwhelming majority of us could do exactly that.
These arguments go a long way to explain how hatred and prejudice can quickly spiral out of control in online social networks and how intelligence agents can infiltrate and effectively immobilise activist groups with such apparent ease and speed.
By feeding the group’s internal discourse, even from a minority position and if left unquestioned, hatred can slowly come to define the beliefs and behaviours of its individual members and the group itself.
However, these were not the only findings from these studies. In the Asch study it was shown that it takes only one dissenting voice to undermine the power of the majority.
Every time one of us speaks out, we cut through a thousand other voices. In the Moscovici study it was demonstrated that, when a minority, however small, can maintain a persistent and consistent position, it has the power to influence the wider society.
Clearly the Establishment recognises this, as it appears to spend an overwhelming amount of time and energy on creating divisions among the left. And as bad as the Stanford experiment was, it is worth remembering that it only took a couple of people to point out that the situation was spiralling out of control.
Once again, a measured, informed, and non-judgemental voice of dissent was enough to give those caught in the situation the catalyst that they needed to stop and collect their thoughts.
Out of all the studies mentioned here, the one many find most worrying is Milgram’s “obedience to authority.” However, even here there was one story that stood out as a positive reminder.
In one of the later studies a participant refused to obey the confederate’s command and then, when she realised what was expected of her, refused to take part in the study altogether.
She had grown up in Germany under fascism and knew exactly where unquestioning obedience to authority leads. Brutality and cruelty can begin as an idea repeated over and over again by a small minority of people.
If left uncontested it can grow into a social movement. If people don’t stand up for what they know is right, it can roll across an entire society destroying and killing everyone that stands in its way.
It takes courage to dissent when everyone is telling you that you’re wrong. It takes strength to persist when hardly anyone else appears to be listening.
But we all know where this road leads if we do nothing. In our societies, communities, activist networks and even among our friends and families, there will often be people trying to spread division and distrust by promoting hatred and fear, but equally it only takes one of us speaking out to stop it.
For more of Nicolas Lalaguna’s writing visit www.nicolaslalaguna.com.