In 2017 Kris De Meyer, a neuroscientist who directs the Climate Action Unit at University College London, ran the opening session of a conference on decision-making under uncertainty for an audience of scientists, finance professionals and policy makers. He divided them into groups of six and gave them questions and activities centered on their personal and professional experiences of risk. After a while, some hands went up. “They said, ‘We just realized we cannot agree on the definitions of risk and uncertainty,’” De Meyer says. “Even within those small groups, they ran into irreconcilable differences.”
De Meyer works to improve communication about climate change, and it quickly struck him that a major problem was how often the professionals involved simply misunderstood one another. People differ in the concepts they hold even for basic terms, he says, so what someone thinks they are saying is often not what others understand. This, he argues, explains why climate scientists struggle to get their messages across and why big financial organizations underestimate the threats of climate change. Recent psychology research shows that conceptual differences of this sort turn up everywhere and that people are usually oblivious to the disparities. Neuroscience studies demonstrate that they are underpinned by differences in how the brain represents concepts, a process influenced by politics, emotion and character. Differences in thinking shaped by lifetimes of experience, practice or belief can be almost impossible to shift. But two steps offer a way forward: making people aware of their differences and encouraging them to adopt new language that is free of conceptual baggage.
The very term “concept” is difficult to define. Roughly, concepts are all the properties, examples and associations we think of when we use, hear or read a word. For instance, the concept of “birds” might include the following: they have wings and can fly; blackbirds are a good example of them; and we associate them with nests and animals in general, among other things. Concepts are different from dictionary definitions, which are rigorously determined and specific (and usually need to be learned). When we use language in everyday life, however, our concepts are central to what we actually mean.
Psychologists, linguists and philosophers have long strived to pin down what a concept is. James Hampton, a psychologist at City, University of London, favors a “prototype” theory of the term. Prototypes are sets of features that determine how typical specific examples are in terms of fitting into a broader category. Baseball is a more typical sport than poker because it has more of the important features that characterize sports as a concept. Hampton and his colleagues have studied differences in how people rate the importance of features such as competition, skill and athleticism for defining concepts such as sports.
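Prototype theory lends itself to a simple computational reading: an example's typicality is the weighted share of a category's characteristic features it possesses. The sketch below is a minimal illustration of that idea with invented feature weights, not Hampton's actual model.

```python
# Importance weights for features of a "sport" prototype (hypothetical values,
# chosen only to illustrate prototype-style typicality scoring).
SPORT_PROTOTYPE = {"competition": 0.9, "skill": 0.8, "athleticism": 0.7, "rules": 0.6}

def typicality(features):
    """Score an example by the summed weight of prototype features it has,
    normalized so a perfect match scores 1.0."""
    total = sum(SPORT_PROTOTYPE.values())
    matched = sum(w for f, w in SPORT_PROTOTYPE.items() if f in features)
    return matched / total

# Baseball has every listed feature; poker lacks athleticism, so it scores lower.
baseball = {"competition", "skill", "athleticism", "rules"}
poker = {"competition", "skill", "rules"}
```

On this toy scoring, baseball comes out as a more typical sport than poker, mirroring Hampton's point that typicality tracks how many characteristic features an example shares with the prototype.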
The way people judge how typical an example is shows that such concepts are vague, and there are always borderline cases people cannot agree on. Whether tomatoes are a fruit is a famous example. And while many might agree that marathon running is a sport, people are unlikely to reach consensus on rifle shooting or bridge. “You need things to be vague and fuzzy, so when you create a new sport of Frisbee, you know it doesn’t need to have everything other sports have, like commentators,” Hampton says. This allows concepts to be flexible and dynamic as categories evolve, new examples arise and the importance of features changes. Consider how our concept of “telephone” has evolved over the past 50 years, from landlines to smartphones. This “fuzziness” makes concepts less distinct (consider the overlap between “sports” and “games”), which helps us generalize what we learn about one thing to other related things. It also introduces ambiguity and uncertainty, however.
Researchers have struggled to quantify how often concepts differ from person to person. Yet in a recent study, cognitive psychologist Celeste Kidd of the University of California, Berkeley, and her colleagues showed that such differences are the norm: they occur not just for kinds of things, such as the whole category of birds, but also for specific exemplars of that category, such as a penguin. Kidd and her colleagues asked participants to give both “feature” judgments (whether a penguin is noisy) and “similarity” judgments (whether a penguin is more similar to a chicken or a whale). By analyzing how participants’ responses clustered into groups, the researchers estimated that “at least ten to thirty quantifiably different variants of word meanings exist for even common nouns,” according to their study. They also showed that people are usually unaware that others do not agree on a meaning. “People generally overestimate the degree to which other people will share the same concept as them when they’re speaking,” Kidd says, which helps explain why people talk past each other so much.
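The clustering step can be pictured with a toy version of the analysis: if each participant's answers to a fixed set of feature questions form a response pattern, the number of distinct patterns gives a crude count of concept variants. This is only a schematic sketch in the spirit of the method, with made-up answers, not the study's actual statistical model.

```python
from collections import Counter

def count_variants(responses):
    """Count distinct response patterns; each pattern stands in
    (very roughly) for one variant of the concept."""
    return len(Counter(tuple(r) for r in responses))

# Hypothetical yes/no answers from four participants to three questions
# about penguins: (is it noisy?, can it swim?, does it fly?)
answers = [
    (True, True, False),
    (True, True, False),   # same pattern as the first participant
    (False, True, False),
    (True, True, True),
]
```

Here the four participants collapse into three distinct patterns, i.e., three variants of “penguin” in this miniature sample; the real study used far richer judgments and probabilistic clustering to reach its ten-to-thirty estimate.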
Disagreeing about penguins hardly matters. The issue gets worse for more abstract words, however, such as “fairness” or “freedom.” Not only do the consequences of disagreement increase, but differences also become harder to resolve. “With concrete words, we can point at something and say, ‘You said that penguin was black, but isn’t it more white?’ So it becomes resolvable,” De Meyer says. “The more abstract the word, the harder that becomes.”
Neuroscientists have also found that emotion is involved in shaping how the brain represents abstract concepts. “You can place your own emotional flavor onto these words because of their ambiguity,” says Oriel FeldmanHall, a psychologist at Brown University. As a consequence, they can become resistant to change. “Because they’re abstract, you fill them with your own meaning, so they become tied to your identity,” De Meyer says. Psychologists have found that once a concept becomes part of someone’s identity, it becomes difficult to shift. “Kidd’s study says nothing about how easily we can bridge these differences,” De Meyer says. “On words like penguin, we can easily [do so]. But where the meaning becomes associated with your identity, we can’t.”
Conceptual differences arise from several sources, mostly to do with life experiences. A zoo visitor and a zoology student are likely to have quite different conceptions of penguin, for instance. Training is a particularly strong influence on the meaning of concepts, and it is where the problems De Meyer is tackling mostly stem from. He runs workshops designed to show economists and climate scientists that they think very differently about words such as risk and uncertainty. “They have these different concepts of terms they think they all understand but actually don’t,” De Meyer says. “They don’t understand each other because they have very different practices, which leads to different semantic representations in the brain.”
Most scientists spend much of their time quantifying the uncertainty represented by a spread of data values in order to distinguish meaningful findings from mere noise, so to them, uncertainty is quantifiable. To an economist, uncertainty is more akin to doubt. “It’s that colloquial meaning of not knowing,” De Meyer says. Risk is almost the opposite. Economists spend their careers calculating “risk distributions,” which are estimates of the probabilities of potential outcomes multiplied by the associated losses. So to them, risk is quantifiable by definition. To a climate scientist, risk means the negative consequences of climate change. “Some outcomes, like flood risks, are quantifiable, but what a heat wave in Africa does to migration across the Mediterranean, how that affects the political climate in Italy, and so on, aren’t,” De Meyer says. The system is just too complex.
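The economists' notion of quantified risk described above — probabilities of outcomes multiplied by associated losses — is just an expected-loss calculation. The sketch below runs that arithmetic on invented flood-scenario numbers, purely to make the definition concrete.

```python
def expected_loss(outcomes):
    """outcomes: list of (probability, loss) pairs summing over scenarios.
    Returns the probability-weighted loss, the economists' quantified risk."""
    return sum(p * loss for p, loss in outcomes)

# Hypothetical flood scenarios: (probability, monetary loss) — invented numbers.
flood_scenarios = [
    (0.70, 0.0),          # no flood
    (0.25, 1_000_000.0),  # minor flood
    (0.05, 10_000_000.0), # major flood
]
```

For a climate scientist, De Meyer's point is that no such table exists for cascading outcomes like migration or political instability: the probabilities and losses cannot be written down, so the risk is not quantifiable in this sense.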
To visualize the differences, De Meyer asked participants to rate on a scale of one to five whether risk is quantifiable, as well as whether “conservative risk estimate” means the worst-case scenario or the “side of least drama.” This is about the kind of mistakes one is trying to avoid. Physical scientists are usually most concerned with avoiding false-positive results from their work. For instance, if an astronomer wrongly claims to have identified a new planet, it could ruin their career. In other disciplines, such as medicine, avoiding false negatives is more important. If an oncologist fails to diagnose malignancies, lives are at stake. These different priorities give being “conservative” different meanings. For climate scientists, it means erring on the side of least drama, whereas economists are more concerned with the worst possible outcome.
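The false-positive/false-negative trade-off can be made concrete as a decision threshold that shifts toward whichever error is costlier. The toy function below is a hypothetical illustration with invented costs and an arbitrary scaling, not anything from De Meyer's workshops.

```python
def conservative_threshold(cost_fp, cost_fn, base=0.5):
    """Shift a detection threshold toward avoiding the costlier error.
    Costlier false positives -> demand more evidence (higher threshold);
    costlier false negatives -> flag cases more readily (lower threshold)."""
    return base + 0.4 * (cost_fp - cost_fn) / (cost_fp + cost_fn)

# Invented error costs: the astronomer dreads a false claim, so their
# threshold rises; the oncologist dreads a missed tumor, so theirs falls.
astronomer = conservative_threshold(cost_fp=10, cost_fn=1)
oncologist = conservative_threshold(cost_fp=1, cost_fn=10)
```

Both professionals are being “conservative,” yet they land on opposite sides of the neutral threshold, which is exactly the conceptual split the workshop ratings were designed to expose.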
These differences highlight a deep disconnect between two professional communities engaged in a dialogue about responding to climate change. “The climate risk information coming from the science community is simply indigestible by the financial markets,” De Meyer says, “which means they continually underestimate the dangers of climate change.” These concepts are related to practices routinely employed in separate professions, so they are tied up with people’s identities as professionals. De Meyer has experienced how deeply ingrained these differences are. He once gave workshop participants two colored lenses and told them the purple lens represented scientists’ concept of risk and the yellow lens stood for economists’ concept of risk. He instructed them to keep in mind which lens applied when someone spoke. “It worked for about half an hour, then they started to drift back to baseline,” he says. “Two hours later, they had completely forgotten about the different meanings.”
Beliefs and values also contribute to disparate conceptions. De Meyer gives the term “1.5 degrees” as an example. “For some people, it’s the boundary of doom,” he says. “For finance professionals, it’s often seen as a negotiable number.” Two recent studies by FeldmanHall’s group have provided a window onto how such differences manifest in the brain. The first showed that people with different political ideologies show more dissimilar patterns of neural activity in response to politically charged words, such as “abortion” or “immigration.” The second showed that how uncomfortable people are with uncertainty influences how their brain represents concepts, adding a facet of people’s character to the reasons concepts differ.
In the second study, FeldmanHall and her colleagues assessed participants’ intolerance of uncertainty using a questionnaire, then imaged their brains while they read a list of words. The researchers found that neural activity patterns in response to two related words were more dissimilar in the brains of participants who were averse to uncertainty than they were in the brains of people tolerant of it. People intolerant of uncertainty “have this separation in semantic representations at a neural level,” FeldmanHall says, “which helps you disambiguate concepts, thereby reducing some of the uncertainty that’s all around us.” The researchers call this “semantic expansion.” They also found that participants who were averse to uncertainty were better at distinguishing between terms but worse at generalizing across them, such as when extrapolating a key press they had learned to associate with an image of a wrench to that of a screwdriver. In contrast, people who are comfortable with uncertainty can embrace ambiguity, so they more easily navigate situations where a single term has multiple meanings depending on context.
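The link between semantic expansion and generalization can be pictured with toy word vectors: if related concepts are pushed further apart in representational space, a learned response transfers less readily between them. This is a schematic sketch with made-up two-dimensional vectors and an arbitrary similarity threshold, not the study's neuroimaging analysis.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors (1.0 = identical direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def generalizes(u, v, threshold=0.9):
    """A learned response transfers only if representations are close enough."""
    return cosine(u, v) >= threshold

# Invented representations: the uncertainty-tolerant "brain" keeps wrench and
# screwdriver nearby; the intolerant one has expanded them apart.
tolerant = {"wrench": (1.0, 0.2), "screwdriver": (0.9, 0.3)}
intolerant = {"wrench": (1.0, 0.0), "screwdriver": (0.0, 1.0)}
```

In this caricature, the tolerant representation lets the wrench-trained key press carry over to the screwdriver, while the expanded representation keeps the two tools distinct, trading generalization for disambiguation, as in the study's behavioral result.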
De Meyer thinks he has seen these differences play out in workshops that he runs. “Some people say, ‘Gosh, that’s interesting,’ assimilate the two meanings and then can talk about it,” he says. “Then there are people who stay stuck in their framing and even stop being open to discussing it further.” A strong aversion to uncertainty could explain some people’s reluctance to shine a spotlight on conceptual differences. “It may relate to how interested they are in testing their concepts or allowing them to be challenged, as opposed to holding them very fixed,” Hampton says.
FeldmanHall and her colleagues’ research may help explain why people respond differently to efforts to mitigate misunderstanding, but progress is possible. De Meyer’s first step is always just to make people aware their concepts differ. “If people are aware it’s there, that will make a big difference in how they’re able to communicate,” Kidd says.
The next step is to give it a name. For this, De Meyer turns to a Far Side cartoon in which an owner berates his pet dog, who hears nothing but “Blah blah” and its name, “Ginger.” “We call it the ‘Ginger the Dog’ effect, and when we introduce it, people often start using it: ‘Hey, we just experienced a Ginger moment,’” De Meyer says. “The difference doesn’t disappear, but if you can name it, you can bypass it.”
He uses Ginger to label the issue rather than a term like “miscommunication” so as not to import any conceptual baggage that the latter might have and in order to build a fresh concept with shared understanding from the start. “You have to use language not already ‘sullied’ by two sides,” FeldmanHall says, “language that hasn’t traditionally been used within the domain of climate change, so people can come together and conceptually adopt it in the same way.”
This seems like a promising approach. De Meyer has tried using “threats” instead of the word “risk” in the climate change conversation. “That seems to overcome some of the challenges, though not all,” he says. “It’s about making your own evocative language to bypass problems of misunderstanding.”