The Crowd vs. The Expert: Seven Centuries of Getting It Right, Getting It Wrong, and Never Really Knowing Which
There is a version of this article that ends with a tidy lesson — either a defense of expertise or a rousing endorsement of the wisdom of crowds. You will not find that article here. What history actually offers on this subject is considerably messier and, for that reason, considerably more useful.
Human beings have been arguing about whether to trust institutional authority since the first institution had the audacity to tell someone what to do. What follows are seven moments when that tension produced consequences — some vindicating, some devastating, most maddeningly ambiguous.
1. Medieval Peasants and the Black Death Doctors (1340s) — The Crowd Was Partly Right
When the bubonic plague arrived in Europe in 1347, the medical establishment had an explanation: bad air, or miasma, emanating from rotting matter and corrupted atmospheric conditions. The prescribed treatments included bloodletting, aromatic herbs, and, in some cases, sitting near large fires to purify the surrounding air.
Peasants in affected regions frequently ignored these recommendations. Some fled cities entirely — which, it turns out, was the single most effective thing a person could do. Others avoided contact with the sick, practiced a rough form of quarantine, and refused to follow medical advice that required them to remain in place.
The physicians were not stupid. They were working from the best theoretical framework available to them. But the framework was wrong, and the ordinary people who trusted their instinct that proximity to the dying was dangerous — rather than trusting the miasma theory — survived at higher rates.
The lesson here is not that experts are useless. It is that expertise is only as good as the underlying model, and that models can be catastrophically incorrect while still being held with complete institutional confidence.
2. Colonial Americans and British Mercantile Theory (1760s–1770s) — The Crowd Was Right, But for Complicated Reasons
British mercantile economic theory held that colonies existed to serve the mother country's trade interests — providing raw materials, purchasing finished goods, and generally functioning as a captive market. Educated British economists and parliamentarians considered this framework settled science.
American colonists, many of them without formal economic training, rejected it. Their objection was initially moral and political — no taxation without representation — but it carried an embedded economic argument: that a market economy functions better when participants have agency over their own commercial decisions.
They were, as it happened, largely correct, in ways that Adam Smith would articulate more formally in The Wealth of Nations in 1776 — the same year the Declaration of Independence was signed, a coincidence that history appears to have arranged for dramatic effect.
The crowd's instinct outran the institution's theory. But it is worth noting that the crowd was also operating from self-interest, which is not the same as operating from wisdom. They were right, but not necessarily because they reasoned more clearly.
3. Anti-Vaccination Sentiment in 19th-Century England — The Crowd Was Wrong, Expensively
When Edward Jenner's smallpox vaccine became subject to mandatory public health laws in Victorian England, a robust anti-vaccination movement emerged almost immediately. Objectors cited government overreach, concerns about introducing animal matter into human bodies, and a general distrust of medical authority that was, given the state of 19th-century medicine, not entirely irrational.
Parliament eventually passed the Vaccination Act of 1853, making infant vaccination compulsory. The resistance was fierce enough that enforcement became a public controversy for decades.
Smallpox killed an estimated 400,000 Europeans annually before widespread vaccination. The disease was eventually eradicated entirely. The crowd's skepticism, in this instance, had no vindication waiting at the end of the story — only the statistical weight of preventable deaths.
The instinct that drove the resistance — distrust of institutional power, concern about bodily autonomy, suspicion of novel interventions — was psychologically understandable. The outcome was not.
4. The South Sea Bubble and the Wisdom of Sophisticated Investors (1720) — The Experts Were Wrong Together
This entry inverts the usual framing. The South Sea Company's catastrophic collapse in 1720 was not driven by ignorant peasants ignoring expert advice. It was driven by some of the most financially sophisticated people in England — including Isaac Newton, who lost the equivalent of millions of dollars — collectively abandoning rational valuation in favor of speculative mania.
Newton reportedly said, after his losses, that he could calculate the motions of heavenly bodies but not the madness of people.
The experts were not defeated by the crowd here. The experts were the crowd. Credentialed, experienced, intelligent people are entirely capable of collectively losing their minds when social proof, momentum, and the fear of missing out align properly. This is not a modern discovery. This is a documented feature of human cognition that has been producing financial crises since markets existed.
5. Ignaz Semmelweis and the Doctors Who Refused to Wash Their Hands (1840s–1860s) — The Institution Was Catastrophically Wrong
Ignaz Semmelweis, a Hungarian physician working in Vienna, observed in 1847 that mortality rates in maternity wards dropped dramatically when doctors washed their hands before delivering babies. He published his findings. The medical establishment largely rejected them, in part because germ theory had not yet been established and his explanation lacked a satisfying theoretical framework.
Women continued dying of childbed fever at preventable rates for decades. Semmelweis died in an asylum in 1865, his findings still rejected by the institution he had tried to correct.
This case is frequently cited as proof that experts resist uncomfortable truths. It is also proof that being correct is not sufficient — that institutional adoption of new knowledge has its own timeline, driven by sociology as much as evidence.
6. Thalidomide and the FDA Bureaucrat Who Said No (1950s–1960s) — The Institution Was Right, One Person at a Time
In the late 1950s, thalidomide was prescribed widely in Europe as a treatment for morning sickness. It caused severe birth defects in thousands of children. It was kept off the American market almost entirely because of one FDA reviewer, Frances Oldham Kelsey, who refused to approve it despite significant industry pressure, citing insufficient safety data.
The institution, in this case, worked — but only because one individual within it exercised independent judgment against considerable pressure. The lesson here is neither pro-institution nor anti-institution. It is that systems produce outcomes based on the people inside them, and that bureaucratic caution, often derided, occasionally saves thousands of lives.
7. The Lysenko Affair and State-Sponsored Scientific Consensus (1930s–1960s) — When the Institution Is the Danger
Trofim Lysenko, a Soviet agronomist with no formal genetics training, convinced Stalin that mainstream genetics — including Mendelian inheritance — was bourgeois pseudoscience. Soviet biology was reorganized around Lysenko's theories. Scientists who disagreed were imprisoned or executed. The result was catastrophic agricultural policy that contributed to famines.
This is the darkest entry on the list, and the most important. It demonstrates that institutional consensus is not inherently protective — that when political power colonizes epistemic authority, the results can be worse than any crowd skepticism. The scientists who privately doubted Lysenko and said nothing were not serving truth. They were serving survival.
The Pattern Without a Moral
Across seven centuries and seven cases, one thing holds: there is no reliable heuristic. Crowds are sometimes right and sometimes catastrophically wrong. Experts are sometimes indispensable and sometimes the most dangerous people in the room. Institutions protect us until they don't.
What history actually suggests is not that you should trust experts or distrust them, defer to crowds or fear them. It suggests that the question itself — whom should I believe? — is one that human beings have been getting wrong with impressive consistency across every culture and era, and that the failure mode is not stupidity. It is the very normal human desire for certainty in conditions that do not support it.
The experiments on that subject are still running.