Sometimes, false messages can sound true because they’re delivered by authority figures. A series of famous experiments in the 1960s revealed the persuasive power of authority, showing how far people will go against their convictions at an authority figure’s request, and reminding us that our highest authority is God.
Ever notice how even false messages can sound true when they come from an authoritative source?
The power of authority has long been a staple of propaganda techniques, which try to promote messages based on factors other than logic. Even pioneer propagandist Edward Bernays, who famously applied war-time propaganda techniques to the marketing and advertising world, harnessed authority figures’ sway. As this article about the history of propaganda explains, Bernays managed to popularize bacon and eggs in 20th-century America by recruiting doctors to promote the “health benefits” of eating bacon for breakfast. Today, you can spot the same tactic in advertisements starring people in lab coats, carrying clipboards, or speaking in intelligent-sounding ways.1
As I’ve mentioned in earlier posts, such advertisements persuade by tapping into a decision-making shortcut known as the authority heuristic—the assumption that experts usually know what they’re talking about, so it’s reasonable to believe what they say. This mental “rule of thumb” comes in handy much of the time, but even experts are fallible. So, the authority heuristic spawns the authority bias: our tendency to be influenced by authority figures regardless of their messages’ content. Relatedly, believing a message only because an expert shared it is a logical error called the appeal to authority fallacy.
The Milgram Experiments
To glimpse how far the tidal force of authority can sweep people, let’s backtrack to the 1960s and step into the lab of the famous—and highly controversial—researcher, Stanley Milgram. His classic study2 went like this:
Imagine you see an ad saying that a researcher will pay you to take part in “a scientific study of memory and learning.” You show up at the lab, where you meet two individuals: a well-dressed experimenter and an unassuming, grandfatherly civilian. (Spoiler alert: while you think that Mr. Unassuming is a research subject just like you, he’s really an actor working for Milgram.)
The experimenter passes you and Mr. Unassuming an upside-down hat with two slips of paper inside, asking you each to select a paper to determine your role in the study. Drawing one paper, you read the word teacher written on it. Unbeknownst to you, both papers say teacher, so that would have been your role either way. Mr. Unassuming, meanwhile, adopts the role of “learner.”
The experimenter leads you both into a room, where you watch as Mr. Unassuming is strapped into an electric shock generator. You’re then taken to a separate area where you can hear—but not see—the “learner.” There, the experimenter sits you in front of the control panel for the shock device, hands you a list of memory questions to ask Mr. Unassuming, and orders you to deliver increasingly strong shocks every time he answers incorrectly. (In reality, the shock device doesn’t generate electricity. But you don’t realize that.)
Mr. Unassuming protests the shocks. But if you say that you want to stop the study, the experimenter responds with a sequence of up to four spoken prods, in this order: “Please continue; The experiment requires that you continue; It is absolutely essential that you continue; and You have no other choice, you must go on.”
If you’re like most people, you’d probably feel uncomfortable being commanded to hurt another person. But according to Milgram, nearly two-thirds of participants in this study not only went ahead with it, but also proceeded all the way to delivering the final “450-volt” shock at the researcher’s demand.
This study understandably raised major ethical concerns, and various researchers have questioned other aspects of Milgram’s methods and interpretations as well. For instance, some participants likely realized the experiment was a hoax, and later review suggested that Milgram’s experimenter actually gave far more verbal commands than the four spoken prods alone.3 Even so, replications of these experiments over time have shown similar results, revealing that well over half of participants will apparently act against their consciences because an authority figure says to.4,5,6
Authority, Persuasion, and Scripture
Now, let’s think about these findings in the context of today’s culture, where authority figures (like university professors) may require Christians to behave or believe in ways that go against God’s Word. For example, while my professors and textbook authors never asked me to shock anyone, they did expect me to believe ideas that I knew weren’t right based on Scripture, like human evolution.7 The fact that these messages came from authority figures made them seem true. But a message’s source does not logically affect its truth—unless, of course, that source is 100% infallible.
God alone matches that description. He is the ultimate authority (Colossians 1:15–18), and we can trust his Word completely. Proverbs 30:5 states that every word of God is true; both Numbers 23:19 and Titus 1:2 assure us that God does not lie; and Romans 3:4 (ESV) declares, “let God be true, though every one were a liar.” Correspondingly, Acts 5:29 records that when authoritative human mandates conflicted with God’s Word, the apostles resolved, “We must obey God rather than men.”
In the end, while the power of authority helps us understand why unbiblical messages can sound persuasive, the truth reminds us that our highest authority is our Creator.
- Propaganda can be used to promote good causes and true messages. The trouble comes when the main persuasive power of a message lies in factors besides the message’s content. (This is especially prone to happen in messages which have limited logical basis, so must be promoted primarily by propaganda.)
- Stanley Milgram, “Behavioral Study of Obedience,” The Journal of Abnormal and Social Psychology 67, no. 4 (1963): 371.
- Gina Perry (2013), as cited in Richard A. Griggs and George I. Whitehead III, “Coverage of Recent Criticisms of Milgram’s Obedience Experiments in Introductory Social Psychology Textbooks,” Theory & Psychology 25, no. 5 (2015): 564–580.
- Jerry Burger, “Replicating Milgram: Would People Still Obey Today?” American Psychologist 64, no. 1 (2009): 1.
- Dariusz Doliński, Tomasz Grzyb, Michał Folwarczny, Patrycja Grzybała, Karolina Krzyszycha, Karolina Martynowska, and Jakub Trojanowski, “Would You Deliver an Electric Shock in 2015? Obedience in the Experimental Paradigm Developed by Stanley Milgram in the 50 Years Following the Original Studies,” Social Psychological and Personality Science 8, no. 8 (2017): 927–933. Note: this study found that 90% of people went ahead with delivering a 150-volt shock—a similar obedience rate to the one Milgram observed at 150 volts.
- Interestingly, one modernized replication of Milgram’s study used an EEG to measure the electrical activity in people’s brains as they were deciding to hurt another person at an experimenter’s command. Results revealed that people do not think as deeply about what they’re doing when they’re “merely following orders.” As the researchers put it, “Coercive instructions appear to induce a passive mode of processing in the brain compared to free choice between alternatives.” (Emilie A. Caspar, Julia F. Christensen, Axel Cleeremans, and Patrick Haggard, “Coercion Changes the Sense of Agency in the Human Brain,” Current Biology 26, no. 5 (2016): 585–592.)
- There are clear differences between being a research subject ordered to hurt another person and being a student told that unbiblical messages are true. So, this is not meant to be a perfect analogy but rather to illustrate the psychological power of authority, and to serve as a reminder that our Creator must be our ultimate authority in any case when the words of human authorities conflict with the Word of God.