The Wisdom and Stupidity of Crowds

How the wisdom of crowds can sometimes allow groups of non-experts to reach an expert conclusion.

Skeptoid Podcast #741
Filed under General Science


by Brian Dunning
August 18, 2020

There's an old saying that "A person is smart; people are stupid." And while we frequently see examples that seem to support it, it turns out that under certain conditions the reverse can be true. Surprisingly, a crowd of non-experts with no tools at their disposal other than guesswork can come up with a better answer to certain questions than even an expert in that field. Today we're going to look at how and why this works, and also how and when you should — and shouldn't — rely on it.

The actual term "wisdom of crowds" comes from the title of the 2004 book The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations, by journalist James Surowiecki. He opened the book with a famous anecdote, one of the best-known stories about the British polymath Francis Galton. In 1906, Galton was at a livestock fair where he encountered a very popular type of contest being run: guess the weight of this ox, and you could win a cash prize. Some 800 people submitted their guesses, and Galton noted that the group was quite diverse. People who might reasonably know what an ox weighed — such as butchers and farmers — were certainly there, but also present were plenty of people who would have no idea and who made guesses based on their own assorted experiences of what things weighed. The ox turned out to weigh 1,198 pounds, and the person with the closest guess was awarded the prize. Galton obtained the list of guesses and got to work, expecting to find that the average person had no idea what an ox weighed; but when he took the median guess, he found that it was 1,208 pounds, wrong by less than 1% (in fact it was later discovered that Galton had made an error, and the actual median was 1,207, even closer to the actual weight). But the real surprise came when he later took the average of all the guesses: 1,197 pounds, only one pound off. The diverse crowd was smarter than even the very best expert guess.
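If you'd like to see this aggregation effect for yourself, here is a minimal toy simulation in Python. It is not anything Galton actually did; the guesses are simply made-up numbers scattered around an assumed true weight, but it shows how the median and mean of many noisy, independent guesses land far closer to the truth than a typical individual guess does.

```python
import random
import statistics

# A toy re-creation of Galton's ox contest with made-up numbers: 800
# independent guessers, each individually noisy and unreliable, but
# unbiased as a group around an assumed true weight of 1,198 pounds.
TRUE_WEIGHT = 1198
random.seed(1)

guesses = [random.gauss(TRUE_WEIGHT, 150) for _ in range(800)]

print(f"median guess: {statistics.median(guesses):.0f} lbs")
print(f"mean guess:   {statistics.mean(guesses):.0f} lbs")
# Individual guesses miss by roughly 150 pounds on average, yet the
# aggregate typically lands within a few pounds of the true weight.
```

The key is the averaging: each guesser's error is large, but as long as the errors are independent and not systematically biased in one direction, they tend to cancel out in the aggregate.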

Galton published this account the following year in the journal Nature under the title "Vox Populi", Latin for "the voice of the people". And with that 1907 article, the wisdom of crowds was born.

This basic pattern of Galton's Vox Populi turns out to be surprisingly repeatable. One application where we see the wisdom of crowds borne out time and time again is in prediction markets. These are just like stock markets but instead of trading stocks, you trade probabilities of future events. You may remember the short-lived so-called "terrorism futures market" set up by the US government in an attempt to learn what kind of defense-related events were likely to take place in the world, but it was politically unpalatable and was shut down (its actual name was the Policy Analysis Market). But many other prediction markets are around now, and the "stock prices" (or the probabilities being traded) turn out to be eerily accurate at predicting world events like election outcomes, currency trends, government policy decisions, and much more. The ability to make actual money trading these probabilities keeps people coming, and they are as diverse as Galton's ox weight diviners: some may have actual knowledge of these events, most have somewhat informed guesses, and some are purely speculating based on general life experience. Although individually they may buy and sell all over the map, in the aggregate these crowds turn out to be wise too.
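As a rough sketch of how such a market encodes probabilities, consider the common binary contract that pays $1 if the event occurs and nothing otherwise (an assumption for illustration, not the rules of any particular market); the trading price can then be read directly as the crowd's implied probability. The helper name below is hypothetical.

```python
# A binary prediction-market contract pays $1 if the event happens, $0 if not.
# Its trading price can then be read as the probability the crowd assigns.
def implied_probability(price_cents: float, payout_cents: float = 100.0) -> float:
    """Convert a contract's trading price to the crowd's implied probability."""
    return price_cents / payout_cents

# A contract trading at 62 cents implies the crowd puts the odds at about 62%.
print(implied_probability(62.0))  # 0.62
```

Every trade nudges that price up or down, so the market's current price is effectively a continuously updated, money-weighted aggregate of everyone's guesses.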

Why? Surowiecki proposed a number of characteristics of a wise crowd that separate it from an unwise crowd. We're going to focus on two of the most important. First, diversity is key: diversity of background, experience, and opinion, including having people with little knowledge of the question being explored. Second, the crowd members must remain independent. They have to come up with their own estimates, and not copy from those around them.

This independence is crucial. If someone at the ox contest had shouted out a guess, some of the people who had no idea whatsoever would have gone with it — as human nature compels us to presume (rightly or wrongly) that such a person probably knows better than we do. Had that happened, it would have dramatically skewed the guesses to one side or the other of the actual weight, and resulted in an average that was more wrong than the one Galton actually got. This particular type of breakdown in the process is what economists call an information cascade. We discussed these in Skeptoid #676 on glyphosate, the active ingredient in Roundup. Glyphosate is one of the safest and most effective herbicides ever developed, and yet today a huge number of people believe that it is poisonous or carcinogenic, claims that are at odds with all the scientific data. The process by which this misinformation spread was a great example of an information cascade. This is when one person, faced with a decision that's beyond their ability to make based on their own knowledge, instead simply adopts the position they've observed someone else take. The average person lacks the expertise to make an informed decision about whether Roundup is safe, so they simply go with whatever their friend says — exactly as the man at the ox contest who shouted out a weight would have triggered many others in the crowd to adopt his guess. Information cascades are like dominoes toppling over in a great big expanding fan.

When an information cascade takes place, both the diversity of the crowd and the independence of its members are thrown out the window. People are simply copying from each other; they're no longer independent, and whatever value their diversity may have brought is lost. So this crowd turns out to be a stupid one, following along with whatever random choice one member happened to make.
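Here is the same toy simulation as before, modified to show a cascade (again, the numbers are made up for illustration): some fraction of the crowd simply copies a loud early guess instead of estimating independently, and the aggregate drifts toward the shouted number and away from the true weight.

```python
import random
import statistics

# Toy illustration of an information cascade: the same 800-person crowd,
# except that some fraction of guessers copy a loud early guess of
# 1,500 pounds instead of estimating independently.
TRUE_WEIGHT = 1198
SHOUTED_GUESS = 1500
random.seed(1)

def crowd_mean(copy_fraction):
    guesses = [
        SHOUTED_GUESS if random.random() < copy_fraction
        else random.gauss(TRUE_WEIGHT, 150)
        for _ in range(800)
    ]
    return statistics.mean(guesses)

for frac in (0.0, 0.25, 0.5):
    print(f"{frac:.0%} copying -> mean guess {crowd_mean(frac):.0f} lbs")
# The larger the copying fraction, the further the crowd's average
# drifts from the true weight toward the shouted guess.
```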

These pitfalls multiply in prediction markets. Take the example of people betting on the outcome of a political election. All else being equal, the bettors generally have excellent diversity of opinion, and that works in the market's favor. But with an election that's often in the news — polling results may be released, candidates may perform well or badly in primaries or in debates — this news can skew opinion. News sources and popular pundits can also spread biased information or make influential predictions of varying quality, and these can reduce both the diversity of opinion among the bettors and their independence from one another.

In the best of circumstances — again, using Galton's ox contest as an example — a crowd consisting largely of non-experts can often make a prediction that is more accurate than any of the actual experts in the group can come up with. I have seen this fact spun and misrepresented by certain groups and individuals as evidence supporting their belief that a lack of expertise is just as valid as actual expertise, if not more so. This gets irrationally applied to issues like vaccination and climate change. So, fair warning: that the wisdom of crowds gives accurate predictions in certain specific circumstances is not to be misconstrued as meaning that laypeople are smarter than experts in their own field. In a wise crowd, a non-expert's own prediction is almost always very wrong; it is only an aggregate of many such wrong predictions — from not only non-experts but actual experts as well — that can hope to match or beat the accuracy of expert opinion. So please don't misunderstand this phenomenon to mean that your own preferred stance on vaccines or climate change is wiser than what the experts in those fields have concluded.

There is one more case I'd like to cover. If you saw the title of this episode and have read or listened to anything on the wisdom of crowds before, you were almost certainly expecting me to cover the 1968 search for the lost nuclear submarine USS Scorpion. This ocean drama is one of the most popularly cited examples of a real-world problem being solved by the wisdom of crowds. The way the story's usually told, Dr. John Craven, the engineer put in charge of finding the wreck, took the few facts they had and asked a diverse group of people to guess the sub's location on a bathymetric map. He then averaged the locations and miraculously discovered the wreck only 220 yards from the predicted spot. If that makes the wisdom of crowds sound a little too magical and a little too good to be true, well, you're onto something, because it turns out that this story is usually reported terribly inaccurately.

The reality of Craven's search had little to do with the wisdom of crowds. Craven had pounded the pavement to collect as much sonar data as he could, from military and private sources, and discovered that the sub had made a strange 180° turn before imploding. Experienced submariners advised him this was a standard procedure to disarm an accidentally armed torpedo — an explanation the Navy doubted. Craven then assembled a small team of submarine and salvage experts to bet on the likelihood of this scenario, using bottles of Chivas Regal whisky, as per their tradition. Since most of the group bet on the accidentally armed torpedo scenario, Craven adopted it as the working assumption; for round two they started from there and bet on the speed at which the sub struck the bottom. They took the most popularly bet speed and made that the second assumption. Then for round three they bet on the slope of the sub's glide path downward. With these figures in hand, Craven hired Daniel H. Wagner Associates, a firm of mathematicians, to help him devise probabilities for various points on the ocean floor — it was the same firm he'd used two years earlier to help find a lost hydrogen bomb. When the search ship went out, Craven and his mathematicians used Bayesian search theory to constantly refine their probabilities — and when the sub was ultimately found, it was a mere 220 yards from the spot the model said was most likely. The bottles of scotch were duly awarded to the winning experts.
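For a sense of what Bayesian search theory actually involves, here is a minimal sketch of the core update in Python. This is the generic textbook form, not Wagner Associates' actual model, and the function name and probabilities below are made up for illustration: the sea floor is divided into cells, each with a prior probability of containing the wreck and a probability of detecting it if you search the right cell; after each unsuccessful search, every cell's probability is re-weighted.

```python
# Minimal Bayesian search update (generic textbook form, illustrative only).
# prior[i]  = probability the wreck is in cell i
# detect[i] = probability a search of cell i finds the wreck if it is there
def update_after_miss(prior, detect, searched):
    """Re-weight cell probabilities after an unsuccessful search of one cell."""
    miss_prob = 1.0 - prior[searched] * detect[searched]
    posterior = []
    for i, p in enumerate(prior):
        if i == searched:
            posterior.append(p * (1.0 - detect[i]) / miss_prob)
        else:
            posterior.append(p / miss_prob)
    return posterior

# Example: three cells; search the most promising one and come up empty.
prior = [0.5, 0.3, 0.2]
detect = [0.8, 0.8, 0.8]
print(update_after_miss(prior, detect, searched=0))
# -> roughly [0.17, 0.50, 0.33]: the probability mass shifts toward the
#    cells that haven't been searched yet, and the search moves on.
```

Repeating this update after every fruitless pass is how a search team's "most probable" location keeps improving as the ship works the area.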

Why was this not a very good case of the wisdom of crowds? Mainly because Craven's scotch-betting experts were well informed rather than diverse; because they were not independent of one another's guesses; and because the final most-probable crash site was the product of Bayesian statistics refined by actual searching, not of guesses by members of a large crowd. In fact, a few years later, one of the mathematicians Craven had hired, Lawrence Stone, would write a book about their methodology, titled Theory of Optimal Search. It's heavy on math, light on Vox Populi.

But for cases that do conform to Surowiecki's criteria, the wisdom of crowds turns out to be a real and fascinating phenomenon — if not a universally applicable one. And if you've got a problem with an unknown but simple and quantifiable solution, you could do a lot worse than collecting a bunch of randos off the street and extracting a guess from each one.

This episode is dedicated to the memory of Skeptoid family member Alison Hudson, 1975-2020. Alison wrote and guest hosted several episodes of Skeptoid back in 2015; she was a prolific author on the Skeptoid blog; and, as a career educator, she also wrote the outstanding free lesson plans provided with our film Principles of Curiosity. Her talent, her humor, and her friendship will be sorely missed.



Cite this article:
Dunning, B. (2020, August 18) The Wisdom and Stupidity of Crowds. Skeptoid Media. https://skeptoid.com/episodes/4741

 

References & Further Reading

De Vany, A., Lee, C. Information Cascades in Multi-Agent Models. Irvine: University of California, Irvine, 2012.

Galton, F. "Vox Populi." Nature. 7 Mar. 1907, Volume 75: 450-451.

Jalili, M., Perc, M. "Information cascades in complex networks." Journal of Complex Networks. 6 Jul. 2017, Volume 5, Issue 5: 665–693.

Munafo, M., Pfeiffer, T., Altmejd, A., Heikensten, E., Almenberg, J., Bird, A., Chen, Y., Wilson, B., Johannesson, M., Dreber, A. "Using prediction markets to forecast research evaluations." Royal Society Open Science. 28 Oct. 2015, Volume 2, Number 10.

Rosenberg, L. "Human Swarms, a real-time method for collective intelligence." Proceedings of the European Conference on Artificial Life 2015. 21 Jun. 2018, Issue 2015.

Sontag, S., Drew, C. Blind Man's Bluff: The Untold Story of American Submarine Espionage. New York: Public Affairs, 1998. 96-131.

Surowiecki, J. The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations. New York: Doubleday, 2004.

 
