Why the World Feels Worse Than Ever, Even Though It’s Not
A phenomenon known as prevalence induced concept change is throwing off your big-picture view.
In the past year or so, the world has increasingly felt like a pile of rubbish set on fire. The headlines are filled with election meddling, religious extremism, rape, corruption, murder and mayhem. None of it logically jibes with the fact — and it is a fact — that by virtually every metric, the world is the most peaceful and prosperous it has ever been. So why does reality feel so different from what it actually is?
Four little words — “prevalence induced concept change” — might explain it. It’s a concept that’s been at the center of a series of new studies by Daniel Gilbert, a professor of psychology at Harvard University, who says it boils down to this: the more we solve a problem, the more we see it.
“Our studies show that people judge each new instance of a concept in the context of the previous instances,” Gilbert said. “So as we reduce the prevalence of a problem, such as discrimination for example, we judge each new behavior in the improved context that we have created.”
As the prevalence of a problem is reduced, society redefines what it means to be problematic. It’s why certain segments of society think women can’t stop complaining about inequality and just appreciate how far they’ve come. It’s why the safer we make our homes, the more we hover over our children.
“Another way to say this is that solving problems causes us to expand our definitions of them,” he said. “When problems become rare, we count more things as problems. Our studies suggest that when the world gets better, we become harsher critics of it, and this can cause us to mistakenly conclude that it hasn’t actually gotten better at all. Progress, it seems, tends to mask itself.”
Gilbert, post-doctoral student David Levari, and other colleagues conducted three experiments to arrive at this conclusion. The first one seems almost innocuous.
“We had volunteers look at thousands of dots on a computer screen one at a time and decide if each was or was not blue,” Gilbert said. “When we lowered the prevalence of blue dots, our participants began to classify as blue dots that they had previously classified as purple.”
Even when the researchers warned participants about prevalence induced concept change, and offered them incentives to monitor their actions against it, participants continued to spot blue where they should’ve seen purple.
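The dot experiment can be loosely simulated in code. The sketch below is purely illustrative and is not the authors’ model: it assumes an observer who judges each dot relative to the average of the dots seen recently, so when blue dots become rare, the comparison point drifts downward and hues that were once “purple” start getting called “blue.”

```python
import random

random.seed(42)

def make_dots(n, blue_prevalence):
    """Generate n dot hues from 0 (purple) to 1 (blue)."""
    dots = []
    for _ in range(n):
        if random.random() < blue_prevalence:
            dots.append(random.uniform(0.6, 1.0))   # objectively blue
        else:
            dots.append(random.uniform(0.0, 0.4))   # objectively purple
    return dots

def judge(dots, window=50):
    """Label each dot 'blue' if its hue exceeds the mean of recent hues —
    i.e., each instance is judged in the context of previous instances."""
    labels, recent = [], []
    for hue in dots:
        threshold = sum(recent) / len(recent) if recent else 0.5
        labels.append(hue > threshold)
        recent.append(hue)
        if len(recent) > window:
            recent.pop(0)
    return labels

def purple_called_blue(dots, labels):
    """Fraction of objectively purple dots (hue < 0.4) labeled blue."""
    purple = [lab for hue, lab in zip(dots, labels) if hue < 0.4]
    return sum(purple) / len(purple)

# Phase 1: blue dots are common; Phase 2: blue dots become rare.
phase1 = make_dots(500, blue_prevalence=0.5)
phase2 = make_dots(500, blue_prevalence=0.05)
labels = judge(phase1 + phase2)

print(purple_called_blue(phase1, labels[:500]))   # small
print(purple_called_blue(phase2, labels[500:]))   # noticeably larger
```

Nothing here depends on the specific numbers; any observer whose standard adapts to recent context will expand the “blue” category as blue grows scarce, which is the core of the effect the studies describe.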
Their second experiment used human faces; when the number of threatening faces in a group was reduced, participants started identifying neutral faces as threatening.
Finally, Gilbert and team asked participants to act as an ethical review board for experimental research.
“We asked participants to review proposals for studies that varied from highly ethical to highly unethical,” he said. “Over time, we lowered the prevalence of unethical studies, and sure enough, when we did that, our participants started to identify innocuous studies as unethical.”
In other words, when unethical activity declines overall, each new instance of it takes on a heftier, more problematic meaning.
If you’re still struggling to get it — it’s a weighty concept, after all — Gilbert uses a casualty ward as the ultimate example.
“If the ER is full of gunshot victims and someone comes in with a broken arm, the doctor will tell that person to wait,” he said. “But imagine one Sunday where there are no gunshot victims. Should that doctor hold her definition of ‘needing immediate attention’ constant and tell the guy with the broken arm to wait anyway? Of course not! She should change her definition based on this new context.”
In the ER example, prevalence induced concept change makes sense; priorities and definitions have to shift for progress to be made. But sometimes, an unconscious recalibration of the problematic can be, well, problematic.
“Nobody thinks a radiologist should change his definition of what constitutes a tumor and continue to find them even when they’re gone,” Gilbert said. “That’s a case in which you really must be able to know when your work is done. You should be able to see that the prevalence of tumors has gone to zero and call it a day. Our studies simply suggest that this isn’t an easy thing to do. Our definitions of concepts seem to expand whether we want them to or not.”
In the real world, prevalence induced concept change may be the root of much political clashing, Gilbert adds.
“Expanding one’s definition of a problem may be seen by some as evidence of political correctness run amuck,” Gilbert said. “They will argue that reducing the prevalence of discrimination, for example, will simply cause us to start calling more behaviors discriminatory. Others will see the expansion of concepts as an increase in social sensitivity, as we become aware of problems that we previously failed to recognize.”
What’s the way around this? Or rather, the way to middle ground? If we find more and more once-gray behavior problematic as we solve our problems, are we just creating new ones where they don’t exist — or simply redefining the problem with more nuance, which casts a wider net? It’s a question that has dogged the evolution of the #MeToo movement, notably since the much-disputed assault allegations against comedian Aziz Ansari in January.
Gilbert and the rest of the researchers aren’t sure and take no stance.
“Anyone whose job involves reducing the prevalence of something should know that it isn’t always easy to tell when their work is done,” he said. “On the other hand, our studies suggest that simply being aware of this problem is not sufficient to prevent it. What can prevent it? No one yet knows. That’s what the phrase ‘more research is needed’ was invented for.”
Until we get it, it may be wise to at least bear in mind this old adage — if only to choose when to discard it: When you go looking for trouble, you’re likely to find it.
Liesl Goecker is The Swaddle's managing editor.