To explain the Sokolov effect, and why it has been largely forgotten, we first have to understand the life of the man who gave it its name. Vasily Sokolov was born in 1884 in Minsk (the current capital of Belarus), the son of Andrei Ilya Sokolov, of bourgeois descent, and Maria Orlov, youngest daughter of an influential nobleman. He had a rather sheltered childhood by the standards of the time, loved to play chess with his father, brother and sister, and developed a devotion to reading from a very young age. Young Sokolov did suffer a devastating loss when, at the age of 7, his beloved brother Mika (13) died of tuberculosis.
Story by: Néstor Vázquez Bernat
Sokolov studied at the International Diplomatic School of Minsk until the age of 17, where he learned English, German and a little Latin and French. His grades placed him among the top of his class, and he also excelled in numerous extracurricular activities such as chess and the debate team. Unfortunately for his parents, during his school years he began showing interest in various liberal philosophers, and specifically in Karl Marx and his critique of capitalism. He took a special interest in The Communist Manifesto and in the idea that the person who works the land must own its yield, which was gaining great popularity in Russia at the time.
In 1901 his parents sent him to study law at University College London, partly because of the institution’s good reputation, partly to get him away from the socialist movement then surging in Russia. In an extremely ironic turn of events, it was there that Sokolov ended up meeting members of the Russian Social Democratic Labour Party (RSDLP), who were trying to incite the revolution from exile with the subversive newspaper Iskra (“Spark”). Despite his enthusiasm and his eagerness to participate in the newspaper, he was initially assigned small roles as an errand boy, which seemed to disappoint him, judging by the letters he wrote to his sister in Minsk.
Sokolov’s luck turned in 1902, when a recently self-renamed Leon Trotsky joined Iskra after escaping his forced exile in Siberia. Trotsky became involved with Iskra at an accelerated rate and soon became one of its most prolific writers and an adviser to the editorial board, a maneuver orchestrated by Lenin to dampen the control of the so-called “old guard” over the board. Sokolov quickly became a sort of apprentice to Trotsky, who seemed to fit the role of the older brother Vasily had needed since the untimely death of his own. Leon also grew fond of Sokolov and helped him become ever more relevant at the newspaper, to the point where he managed to get a couple of his articles published by the end of 1903.
Sokolov and Trotsky were inseparable from those days on, and he followed him through all the changes in the party, from the Mensheviks to the Bolsheviks and all the way to the concept of permanent revolution. Trotsky and Sokolov had some rather chaotic years until the Soviet revolution finally took place in October 1917: their path took them back to Russia, to Finland, into exile in Obdorsk, and on to Austria and Hungary. They did not always reside in the same place, but when apart they kept in constant communication through letters. When Lenin rose to power and Trotsky became a minister, although for a short period, Sokolov was his deputy.
Both were strong believers in internationalizing the revolution and kept pushing the party in that direction, which ultimately caused their falling-out with Stalin when he rose to power and their final exile to Cuba. It was in 1937, with Sokolov away in London, that Stalin’s assassin ended the life of Leon Trotsky, which shocked a large part of the communist party and dealt a final blow to Sokolov’s belief in the Soviet project.
Sokolov spent the rest of his life, safely in the UK and later in America, writing against the Communist government in several publications and researching mass control and how the public could be completely fooled if the media was not free. His research on the brain’s incapacity to accept information that contradicts one’s core beliefs (named the Sokolov effect) earned him recognition in the American Psychological Society and won him the Nobel Prize in 1952. Sokolov finally died at home in Berkeley in 1957 from a massive stroke; his wife said that he was reading his old correspondence with Trotsky when it happened, with a smile on his face.
We could have finished the tale here, and it would have been a somewhat mediocre scientific story, but probably an entertaining one nonetheless. Fortunately, it is not the goal of Medicor to spread falsehoods (quite the opposite), so I am obliged to acknowledge that Mr. Vasily Sokolov is not real, nor is the Sokolov effect a real phenomenon (that we know of). He is a purely fictional character, conjured by my considerably wild imagination. But despite being pure fiction, this character illustrates a real effect; in fact, two complementary ones: confirmation bias and the backfire effect.
What would have happened if I had said that Vasily Sokolov, a Russian, invented capitalism? What if he had invented the snorkel mask? Or Scotch whisky? Depending on your core beliefs, you might have found any of those much harder to believe than the original; probably the one about capitalism, am I right? What if I told you that Napoleon was not short for his time (1.71 m), or that buying an eco-friendly car pollutes more than keeping your old one for another 100,000 km? I would guess the first one is much easier to accept than the second. This is called the backfire effect; although largely ignored, it was beautifully illustrated in a comic by The Oatmeal, and it is where our amygdala enters the game.
It has been shown that when we hear information that contradicts our core beliefs, our amygdala fires, exactly as it does in response to an attack. This somewhat primitive reaction is not fully understood, but it is thought to protect our moral code and help us keep a clear mind under distress. This effect lies behind many of the challenges of scientific communication in our time, such as homeopathy or the anti-vaccine movement. Even with the astonishing amount of information available to prove that homeopathy is merely a placebo effect and that vaccines are safe, the non-believers will find information online (Facebook, blogs, pseudo-scientific sites…) to calm their amygdalae and maintain their claims. Could it be that access to massive amounts of “easy” information is making us less informed, because we cherry-pick the news that agrees with what we already believe (confirmation bias)?
The amygdala does not just make you aggressive when you hear contradictory information; it has also been shown that this reaction further solidifies your beliefs. Professor David Redlawsk of the University of Delaware has performed studies suggesting that at least 30% of the information you receive must be contrary to your beliefs before you start being able to take in the contradictory data. But how can you reach this 30% if you only read information that agrees with whatever you already think? “We are living in a world that we are not evolutionarily adapted to,” Professor Redlawsk pointed out in an interview on the You Are Not So Smart podcast by David McRaney.
John Steinbeck (winner of the 1962 Nobel Prize in Literature) wrote: “Sometimes a man wants to be stupid if it lets him do a thing his cleverness forbids.” Personally, the most frightening aspect of the backfire effect is that one does not know when it happens and nobody is immune, since it masquerades as a mere strong urge to correct perceived misinformation. If at this point you are telling yourself that you are smart and very rational and this kind of thing does not affect you, maybe you should think twice; Stephan Lewandowsky, author of The Debunking Handbook, says that there seems to be no correlation with educational level. Take climate change as an example: people with more education appear to be more polarized on the matter. He declared to David McRaney that “smart people are better able to generate counter arguments to evidence they don’t like,” and that not much can be done (in the climate change example) without changing the elites who spread the misinformation.
Harder does not mean impossible, and Dr. Lewandowsky’s book outlines a series of tools to bypass this effect and convince somebody of the truth. First and most important, one cannot convince a firm believer who actively searches for information that supports his claims; logically, you will never manage to supply more than that 30% of contradictory information. Targeting people who doubt or are skeptical will be much easier and, in time, might isolate the radicals and force them to change their beliefs.

Then one must ease the blow: never start with “this is not true” or “I will prove you wrong,” as that will only trigger your interlocutor’s amygdala and render your arguments futile. It is much easier to get someone’s attention if you start by acknowledging their belief while gently pointing out the argument’s flaws. Never mention the myth you are trying to fight, since it will only put the receiver on guard; instead, focus on the evidence. Keep your statements short and avoid overwhelming the interlocutor, or they will feel more anxious. He also mentions that focusing on behaviors instead of beliefs might be easier, since our minds always adapt to our behavior. This last point seems to agree with the opinion of those who believe that, for example, vaccines should be mandatory, which would not only increase vaccine coverage but might also slowly convince people of their safety.

Finally, if one aims to change someone’s core belief, it is not enough to destroy the myths; one must also be able to fill the gaps those myths occupied in the individual’s logic. Failing to do so will most likely result in them trusting their myths despite the logical flaws. “People prefer an incorrect model over an incomplete model; in the absence of a better explanation they opt for a wrong explanation,” Lewandowsky said.
To sum up, although the backfire effect seems to prevent people from believing facts and allows corporations, politicians or plain conspiracy theorists to spread misinformation, there are mechanisms to counteract it and make the truth prevail. Those mechanisms might fall short at times in this era of alternative facts and real “fake news,” when anyone can go online and swim in a sea of information that confirms whatever they might believe in. This is why critical thinking and skepticism are more important than ever, something the scientific method should already have prepared us for: we must definitely not believe everything we read.