Changing Minds Part 6: Misinformation & Mistrust

Michael speaks with the University of Oxford’s Sacha Altay about how misinformation and mistrust affect our ability to take action on the climate crisis.

(00:01): 

Well, I'm in over my head. No one told me I'm trying to keep my footprints small, harder than I thought it could be. I'm in over my head. What do I really need? Trying to save the planet oh will someone please save me. Trying to save the planet oh will someone please save me? 

(00:25): 

Welcome to In Over My Head. I'm Michael Bartz. My guest today is Sacha Altay. Sacha is a postdoctoral research fellow working on misinformation and mistrust at the University of Oxford's Reuters Institute for the Study of Journalism. Sacha completed a PhD in experimental psychology, testing novel communication techniques to correct people's misperceptions and fight misinformation. He draws inspiration from theories in social psychology, media studies, and evolutionary psychology. Sacha is a dedicated science communicator, regularly giving talks to the public and writing articles. His research has been published in academic journals such as Nature Human Behaviour, New Media & Society, Digital Journalism, Evolution and Human Behavior, and the Journal of Experimental Psychology: Applied. Welcome to In Over My Head, Sacha. 

(01:07): 

Thanks for having me. It's a pleasure. 

(01:09): 

So in thinking about changing minds, I'm interested in the role of communication, and especially misinformation. This has been the focus of much of your research, and I'm looking forward to discussing your work. Perhaps we can start with some basics. In your article titled The Cognitive Foundations of Misinformation on Science, you break down psychologically how communication works and how this might contribute to misinformation. So tell me a bit about this paper. 

(01:31): 

Thanks. So I'll start very broadly with communication, and one interesting thing about communication is how difficult it is to share ideas with others. Of course, it appears very easy: we just talk and think, oh, I've shared my idea. But actually, the mental representation I have, the idea in my head, is very hard to share. I first have to put it into words, then I have to say it, then you have to understand it, and then you have to encode it and memorize it. And in this tedious process of communication, the information is not replicated identically; it is modified as we communicate. And when we're talking one-on-one, it's fine: if you misunderstand something, you can ask me and we can clarify it. 

(02:19): 

It works, basically, when we talk. But imagine now that we're communicating with a lot of people, mass communication. With mass communication, you have a lot of opportunities for misunderstanding and for information to be changed. That's one way misinformation can arise: people talk but misunderstand each other, and as they memorize and transmit information further, it gets distorted. There's a lot of work in cultural evolution, for instance, doing these experiments (you've probably seen some on social media) where one person says something to another, then another, then another, and at the end of the chain the message is completely different. That's a result of the common biases that govern human communication. 

(03:06): 

Yeah, and specifically with that paper, you're talking about conveying scientific information, right? So there's an extra barrier there. I'm not just saying that I like this restaurant, something I could communicate easily. As a layperson, I'm having to communicate complex ideas, which I maybe read one article about, to someone else. And I think it's really interesting that those could also get lost in translation, right? 

(03:27): 

Yeah, totally. And one thing about scientific information is that a lot of it is very counterintuitive. Take plate tectonics, for instance: it's not at all intuitive and we cannot really grasp it, so it's very difficult to communicate. And one challenge for science communication is that because these findings are counterintuitive, they are accepted based on trust. They're not accepted based on people being able to evaluate the content of the argument. I cannot evaluate how plausible the theory of plate tectonics is; I don't have the expertise, so I have to accept it based on trust. And that's where things can go wrong, in the sense that some people do not trust scientists, so they have no reason to accept these arguments. I think that's why we have some science denialism, people who basically don't believe in scientific truths. It's not because they're stupid; it's just because they don't trust scientists. 

(04:21): 

Yeah. And you talked a bit about some of those biases too. Depending on your worldview, and maybe the group that you're part of, would that be part of it as well, in terms of how receptive you are to information, or how likely you are to spread misinformation? 

(04:32): 

Yeah, totally. You can use scientific information to fulfill political goals. For instance, say you're against gay people, you're homophobic, and there's a new study showing that homosexuality is genetically determined. You can use that to say that they're bad or whatever, but you can also use it to say that it's good and that people don't choose. So you can use scientific information to fulfill basically any goal, and the same goes for information in general, based on people's goals, political or otherwise. And of course, then you will share information selectively: you will not share a representative sample of the scientific information, you will share the information that fits your preconceptions, and you will also interpret that information in a way that fits your preconceptions. 

(05:16): 

Yeah. And speaking of sharing information, I think about one of your papers, about being green or being nice, and the types of information that we share. Do you want to talk a bit about that? 

(05:24): 

Yeah, so in this paper, basically what we look at is why so few people talk about and share information about climate change, or at least information that is accusatory, such as, for instance, that the richest 1% in the world are responsible for most of the greenhouse gas emissions, or that most citizens in countries like the US consume too much energy. Those are accusatory statements, but they're true. And most people are reluctant to share this kind of information despite the fact that we know it can have an influence: it can affect the perceived consensus, the norms, and it can also convince people. If everyone were saying, look, in the US we consume too much energy, it could help change things. So in this paper we hypothesized that people are reluctant to share accusatory statements about climate change because they're scared of appearing unfriendly. We conducted two experiments and found that people who shared accusatory statements, as opposed to non-accusatory statements, were deemed more unfriendly. And what's worse is that it's not even compensated by appearing more committed to pro-environmental causes. So basically you have very few benefits in doing so. 

(06:36): 

I'm curious about where some of the root causes are. People are taking in media, reading news articles, seeing headlines. So there's an incentive for news organizations, or even, as in that first paper I referenced, educational institutions, to write sensational headlines, or to twist results to make them sound more sensational, right? But then there's also the personal side, where maybe you're spreading misinformation unintentionally. Do you think the two are equal, or does one have a bigger impact on the spread of misinformation? 

(07:09): 

I think clickbait headlines, headlines that are a bit too sensationalist, are of course a bigger problem than individual people. The problem is the way the media is funded today: news outlets don't get any money for the news they put on social media, they only get money if people click on the headlines. So they have strong incentives to write, if not clickbait headlines, at least sexier titles that incentivize people to click. Especially when you consider that they're competing against extremely attractive content that isn't worried much about accuracy, cat videos or whatever. So it's very hard for the news media to compete today on social media, and to some extent they have to manipulate the headlines a little bit. Not manipulate in a very bad sense; they just want people to click. 

(08:01): 

And of course it's bad, because then people are going to trust the news media a bit less. But unless we change the incentive structures, for instance with a tax on social media whose proceeds go to the news media, or whatever, we cannot change that; it's very hard. Or you have public broadcasting services: the BBC, for instance, doesn't need to do that because it doesn't earn money this way. So I'd say it's the bigger problem, but it's a very difficult problem to solve. I think we need a systemic solution that changes the incentives of the game. 

(08:29): 

Something that really interests me when I think about misinformation is perhaps changing even our own opinions as environmentalists about what's effective and what's not. One thing that comes to mind is nuclear energy, which you've done some research on. Can you tell me a bit about that? 

(08:44): 

Yeah, nuclear energy is a topic close to my heart. I think the climate crisis is very important, even though I don't really work in this area. But I've realized that many environmentalists are against nuclear energy, even though it's one of the most effective and cleanest sources of energy that we have today, especially among non-intermittent ones. So I've been really passionate about that. With my PhD supervisor Hugo Mercier, we've tried to understand why people are opposed to it and how we can change their minds about it. We did a first study where we looked at whether people who are worried about nuclear energy are higher in disgust sensitivity. The idea behind disgust is that we have evolutionary adaptations for avoiding certain types of food and things like that. But pathogens, we cannot see them, and we can hardly see their effects. So we have all these mechanisms, what some people call the behavioral immune system, that allow us to avoid threats that are invisible, whose effects are invisible, and which contaminate: basically, if someone is infected, you should not approach them, et cetera. And I don't know if you've seen the TV show Chernobyl, but... 

(10:06): 

I've heard of it, but I have not yet seen it. 

(10:07): 

Because there are many misconceptions about nuclear energy in this TV show, some of which are that when someone has been contaminated by radiation, they can contaminate other people, that there's this thing that is infectious. It really taps into people's intuitions, into the cognitive systems that deal with pathogens. So what we did is look, across many samples and in many ways, at whether people who are more easily disgusted are more opposed to nuclear energy. And that's what we found; we found a correlation between the two. So that was the first step. The other thing we wanted to look at: many surveys around the world have shown that people think nuclear energy emits a lot of CO2, and that's wrong, it does not. But that's also, you know, because you can see the steam coming out of some nuclear power plants, and that's not CO2. 

(10:53): 

And I think some people think that. So we looked at participants' misconceptions about nuclear energy. We replicated the finding that participants in France and the UK think nuclear energy emits more CO2 than it does. And we hypothesized that people have general attitudes and intuitions about nuclear energy, and that these intuitions and attitudes drive the acquisition of misconceptions about it. To test that, we invented a substance that we said energy sources were emitting. We called it calison; it doesn't exist. We told participants that calison is harmful and asked how much of it nuclear energy emits, and we asked the same about wind, solar, and many other sources. And what we found is that participants were more likely to develop these new negative misconceptions about calison for nuclear energy than for renewables, or even for some fossil fuels such as natural gas. 

(11:52): 

And that's very interesting, because it shows that people have this negative preconception and that it drives their acquisition of specific misconceptions, such as that nuclear energy emits CO2, et cetera. In the second step, we asked: what happens if we correct the misconception that nuclear energy emits a lot of CO2? Will it change people's general attitudes about nuclear energy? So we presented participants with arguments about the facts about nuclear energy. We had many conditions, and in one condition it was about the fact that nuclear energy emits very, very low amounts of CO2. Then we measured participants' attitudes toward climate change and, for instance, how much they think nuclear energy is a possible solution against climate change. And what we found is that exposing people to just a paragraph or two about why and how nuclear energy emits very low amounts of CO2... 

(12:48): 

...changed people's minds about that specific fact, but it also changed people's broader attitudes about nuclear energy. Both in France and the UK, they were more likely to say that nuclear energy should basically be among the solutions against climate change. I think that's very interesting, because it shows that even though we think specific misconceptions are in large part the product of these broader attitudes people have about nuclear energy, when you correct specific misconceptions you can actually change people's attitudes, and they can become more in favor of using nuclear energy to fight climate change. So I think it's quite an optimistic result about nuclear energy and, yeah, about the misconceptions. 

(13:25): 

That's super interesting, and I'm really glad to hear that you could change people's perspectives, even on a small scale, because it seems like it's really entrenched in our psyche that nuclear energy is bad or has all of those effects. So I'm glad that you're doing that work. 

(13:43): 

Yeah, a lot of work in psychology has shown that overall, people are responsive to evidence. Even if you have some motivated reasoning, in the sense that people interpret information based on their political goals, et cetera, overall people are responsive to evidence and they change their minds when they're exposed to it. The problem is just that they need to trust the source, the arguments need to be good, and people of course need to understand the arguments. And you need to address people's counterarguments, because you might give me a good argument, but if I have a good counterargument, I don't have any reason to change my mind. When you do all that, people do change their minds; people are pretty rational. It's just that on a large scale, changing the mind of a whole country is very difficult, because unidirectional messaging is not very effective. It's very difficult to change people's minds through the news media, for instance. We know that the mass media have very little effect on what people think, on people's attitudes; mass communication has minimal effects on people. But people do change their minds when they chat with others, or when they're exposed to good information in an interactive format, like what we're doing now. 

(14:52): 

When you talk about people changing their minds based on evidence, to me that seems counterintuitive, just given the problem of misinformation. If people were so rational, if they could just look at something and go, oh, that's a trusted source, I'm going to change my opinion, I feel like we wouldn't be in the climate crisis, because people would just say, all right, if that's what we need to do, that's what we'll do. But there's a lot of resistance. I think about it on a more micro scale, and I don't know if this is a good comparison, I don't always love talking about it, but you know, the pandemic, right? If we can't come together to respond to one virus, how can we come together and respond to climate change, which is so much more opaque and distant and not in our faces, right? 

(15:37): 

Maybe I'm an optimist, but I wouldn't share your conclusions about COVID-19. I think in most countries people came together, despite the fact that governments and scientists often communicated in very bad and not very straightforward ways. There were all these very weird and conflicting forms of communication, and still people got vaccinated, people wore masks, people stayed at home. In most countries it worked very well, even though the vaccines were very new, even though we knew very little, even though people were told they should do one thing and then that they shouldn't, and were given very few reasons. So my take would be that during COVID-19, governments were quite bad at communicating uncertainty, for instance. And I think one problem is that many governments assumed that people are basically stupid and would panic if you told them the truth. 

(16:34): 

Like, if you tell them: we don't know, there is this new virus, it's actually very scary, we're trying to understand how it spreads, but we don't know yet, so maybe wear a mask just in case it works. I think governments are very bad at doing that, and they should improve, because they're losing people's trust when they don't communicate uncertainty. But to respond to your broader point about how you can have misinformation if people are rational: I think one answer is that some people's goal is not to be accurate. Some people's goal is to promote a political cause, to promote a political party. And so of course they don't care about accuracy. Does that make them irrational? I don't think so; their goal is just not accuracy. And some people have very good reasons not to trust institutions, and not to trust scientists, who are part of these institutions. 

(17:23): 

Like, if you're part of a minority, then maybe all the interactions you've had with institutions have been very bad, and you have reasons not to really trust them. So I don't think we should pathologize people who believe in and share misinformation, or say that they're irrational and very different from us. I think they are different from people who don't believe in misinformation, but in subtle ways. Some people feel left out; they feel like the institutions did not do enough for them. It's very complicated. And the terms rational and irrational should be used carefully, because we don't want to say that some people are irrational when we don't necessarily understand their goals or their background. I'm not being relativistic here; of course there are scientific truths, in the sense that some things are more likely to be true than others, and of course people should trust science. But still, some people have good reasons not to trust it, and we should address those reasons. 

(18:15): 

Yeah, and that's a good point about incentives, right? Even someone who realizes that climate change is a problem, maybe they work for an oil and gas company or a certain institution that has to prioritize its economic interests, maybe even the short term, over the greater good of saving the planet, as it were. So I get that. 


(18:37): 

Yeah, or they're poor and they live in the countryside, and they need to use their car and cannot buy a new electric car or whatever. So of course they're upset when you want to have a carbon tax, because they will be the ones paying, since they depend on their car. And even though a carbon tax would, I think, be amazing and solve a lot of our problems, if it mainly affects poor people, they are right to be mad at the government. That's what happened in France with the yellow vests: mostly poor people living in the countryside who need a car, et cetera. Of course they were upset by the carbon tax. It doesn't make them irrational; it's just that for them it's difficult. 

(19:12): 

You know, you said that people are rational and intelligent, and I agree with that. One thing I'm still curious about is the sharing and spreading of misinformation: is it actually happening at the scale that people think it is? Or is it maybe less of a problem than we think? 

(19:27): 

I think it's definitely less of a problem than many people think. It depends how you define it, but look at how much news from untrustworthy outlets people consume. By untrustworthy I mean news outlets that use very deceptive headlines, that share false information, or that do not regularly correct errors when they do make them, et cetera: basically, news outlets with weak standards of journalism. When you look at these, people consume very little of that, and that also includes blog posts that talk about what's going on in the world. Overall, very, very few people consume it. To give you an idea, less than 5% of all the news that people consume comes from these websites with very weak standards of journalism. And when you look at what people actually consume, it comes mostly from reputable news outlets like the New York Times in the US or the BBC in the UK. 

(20:19): 

And of course that doesn't capture all the misinformation, et cetera, but we need to keep in mind that people in general consume very little news; news consumption is not that big. You have some people who are very interested in news and politics and consume a lot of it, and they're very visible, especially on social media, so we tend to think they represent the general population. But not at all: the average internet user consumes less than five minutes of news per day. It's very small. And of course, I'm not saying this holds in all countries in the world; where the government controls the media, of course you have misinformation, propaganda, et cetera. But I'm talking about democracies, where it's not that bad. And I'm worried that by saying, you know, we have misinformation everywhere, people consume a lot of misinformation, people are not reasonable, people are a bit stupid, et cetera, we risk eroding democracy, because democracy rests on the premise that we think other people in our country are reasonable. 

(21:14): 

Because if they're not, why would we think the new president is legitimate? He has been elected by people who have been brainwashed. And that's something we've seen with Trump: oh, Trump is not legitimate, the people who voted for him don't know what they're doing, they're stupid, they've been brainwashed or whatever. I think that's very wrong, and it doesn't help democracy to think that Trump voters are just stupid and don't know what they're doing, because you're missing the reasons why they voted that way. You're missing that, I don't know, they feel left out; you have racial resentment; you have all these very deep and complex problems that explain why they voted for Trump. Reducing them to stupid people who have been brainwashed will not help us understand or solve the problem. It'll even make us focus on the wrong things. 

(21:53): 

It'll be like, oh, it must be an information problem, and that misses all the context in which people voted for Trump. I think we should be careful about alarmist narratives about misinformation. They should be put in context: people don't consume that much misinformation, and overall people are quite reasonable. And when they're not, we should be asking: why are people voting for Trump? What's going on? Why are populist parties more successful in Europe? What's going on? In some ways, on the internet, and on social media in particular, we have a very biased perspective on reality, because we see the people who interact the most and who basically have the most extreme opinions. In the US, for instance, it's not ideological polarization that is increasing; it's affective polarization. It's not the attitudes of Democrats and Republicans that are moving further apart. 

(22:40): 

It's how much they hate each other, and how much they love, basically, their in-group. Because when you look at attitudes on gun control, for instance, like background checks, most Republicans agree that we should have background checks. Even on the most controversial issues, you actually have a lot of agreement on many attitudes, and the divide between Republicans and Democrats has not increased much. It's really about how people feel about the outgroup, and how they represent the outgroup and their in-group. Most voters are just moderates; most people are moderate, have very moderate opinions, and can change their minds. 

(23:12): 

So is there a difference between having a conversation, let's say between a Republican and a Democrat, in person versus online? 

(23:20): 

Well, it's difficult; there are very few studies comparing that. We know in the US, for instance, that many people just avoid political discussions with their colleagues. People do avoid heated political discussions in many countries; in many countries people are quite uncomfortable with disagreement. But I think one emerging consensus is that people are exposed to more contradicting views online than offline, which is exactly the opposite of what you would expect from the echo chamber or filter bubble narrative, the idea that online you're only exposed to things you agree with. Increasingly, the consensus is that no, actually, online people are exposed to more contradicting views. And when you think about it, even when you watch, I don't know, politicians you agree with on YouTube, in the comment section you see people who disagree with them, and the same on Twitter. 

(24:10): 

You have a lot of disagreement there, and it's not something that happens a lot in real life, offline. Even though I'm an intellectual, an academic, my colleagues and I rarely discuss politics, because it often comes down to values and things like that, and you often don't really want to argue about that. But online, many people do. So we are exposed to it, and sometimes it's quite uncomfortable, and that's why you have the narrative that online, people are mean. But I want to say that this narrative is not necessarily true. We are exposed to more contradicting views online, but it doesn't mean that being online makes people bad or angry or whatever. There is at least some evidence now showing that the people who are angry at stuff online are also angry offline. It's just that we see them more online, because online they can express themselves a lot, they can be very active, and they're overrepresented. Whereas offline, they're just the friend that we don't want to invite to the party, basically. 

(25:07): 

Yeah, that's an interesting point about being exposed to different ideas online, but that not necessarily making people angrier or more conflicted. My general perception has been that social media feeds off misinformation, people fighting, and those bad incentives. But maybe that's not as much of a problem as we might think it is. 

(25:31): 

On some specific platforms, like, I don't know, Twitter, of course it feeds on some political disagreement, but that's a very unrepresentative sample of people again. When you talk about social media in general, I think it feeds on entertainment way more than political disagreement. Political disagreement can be a form of entertainment, but for most people it's not, because most people don't care about politics and are not really interested in it, so they just avoid it. When you look at TikTok and so on, of course you have some political debate, some political content, but that's not really what drives most engagement. And when you look at Facebook, it's not political disagreement that drives engagement; it's cat videos, animal videos, people dancing, electronic postcards people send that say, like, have a good weekend, or whatever. People use social media to connect with others. And you have a small minority of people who are very politically interested and want to engage in political debates, and these people will drive some part of the algorithm and have their bubble where they discuss their things and disagree, but I don't think it represents the experience of the average social media user. 

(26:36): 

One idea I came across when reading your papers, Sacha, that I found interesting and applicable to this conversation was the idea of beliefs and how the beliefs we hold affect our actions. You talked about intuitive versus reflective beliefs. Maybe explain that to me. 

(26:50): 

Yeah, so it's a philosophical distinction based on cognitive science, first introduced by Dan Sperber. The idea is that some of the ideas and representations we have in our heads are intuitive, in the sense that they will be treated by all our cognitive mechanisms as true information. We have intuitions about naive physics: when I drop an object, I have the intuition that it will fall. I have many inference systems; even babies have them. There are many studies in experimental developmental psychology showing that babies have intuitions about how objects move in space: that an object will stop when it meets another object, that objects fall to the ground, et cetera. So we have all these intuitions. For instance, my belief that there is a table here is intuitive, in the sense that all my actions will be constrained by it and I cannot do anything about it. 

(27:46): 

My actions will be constrained by it; I cannot disbelieve it, and I cannot not act on it. It's very difficult not to act on the belief that there is a table: even if I try to pass my hand through the table, some inference system in my head will tell me it's not possible and will stop me. So it's a very, very deep intuitive belief. And there are other beliefs that are reflective, in the sense that they are held, as Sperber puts it, in parentheses. For instance, imagine you're a kid, a five-year-old, and your teacher tells you that cucumbers are 90% water. So you're being told that a cucumber is mostly water, but you're like, okay, but I also have these intuitive systems that tell me I can grab the cucumber, and the cucumber acts as a solid. 

(28:29): 

So you have this intuitive belief that is, yeah, cucumbers act as a solid, they're solid. But then you have your teacher, whom you trust very much, who told you that it's made out of water, and water doesn't act that way. So you hold this belief in parentheses, and it's a reflective belief in the sense that it'll not influence most of your behaviors and it'll not be treated by the other parts of your brain as information, in the sense that if I throw you a cucumber, you will not have the intuition, oh, it'll splash me and it'll be wet. No, you have the intuition that you'll have to grab it. So that's the difference between the two. And many beliefs that we hold, like scientific beliefs, like the tectonic plates that I mentioned, or climate change, are reflective in the sense that we don't grasp them, and most of our brains don't grasp them. 

(29:12): 

We just have the idea that, oh yeah, climate change is happening, but what exactly is climate change? How does our brain represent that? And the same for the 1.5 degrees or three degrees or whatever, how does our brain represent that? It's very, very high level, and of course we can act on it. We're like, oh, climate change is a very big problem, I need to give money to associations fighting climate change, so I'll do that. But it's a very reflective way to think about it, and it won't affect the way I walk in the street or whatever. And for instance, the people who believe the earth to be flat, all they do is exactly the same as people who believe it's round. They take the plane, they do everything like everyone else. They just have this belief that the earth is flat. 

(29:50): 

And so you have this distinction between the two. Intuitive beliefs have very strong behavioral implications, whereas reflective beliefs often have very weak and indirect behavioral implications. And unfortunately, yeah, most of our beliefs about climate change are reflective, and so to act on them, we need to really focus and force ourselves, basically, to act on them. And generally, unfortunately, the correlation between beliefs and actions is very small because you have the structure in which you make decisions. For instance, if you are poor, even though you believe climate change is a problem, maybe you don't have the choice and you need to take your car. So that's one reason why you have a discrepancy between beliefs and actions. But you also have many other explanations, such as sometimes people think that they need to do something about climate change, but they also think nuclear energy is bad, so they're fighting against nuclear energy. Or people think they need to do something about climate change, so they're recycling, but they're taking the plane or they still eat meat. You also have people who have misconceptions about what is effective or not at fighting climate change. So that also explains why you sometimes have a gap between people's values and attitudes regarding climate change and their actions. 

(31:03): 

Thank you for breaking that down. I love the cucumber example. When I read that, it made a lot of sense to me as to how you could say, oh, climate change is a problem, but then not do anything about it. So it makes a lot of sense to me that it's reflective. But if it was more intuitive, if it was happening directly to you, then perhaps people would be more likely to act, right? 

(31:21): 

Yeah, the problem is that climate change is most intuitive for people who live in poor countries and, more generally, the global south. Whereas in many rich countries, which historically have contributed most of the pollution in the world, people often don't feel it because they're rich. Even when it's very hot, they can just turn on the AC and they're fine. So they don't feel it. So yeah, that's the sad part. 

(31:44): 

So Sacha, we've been talking about misinformation and media and how we communicate with other people and of course this show is about empowering citizens to take action on the climate crisis. So what can people do today to have an impact on our communication and misinformation? 

(31:59): 

So that's a very difficult question, so I will just give you my opinion; it's not necessarily based on hard evidence. But I think one thing that regular people can do is just amplify reliable information. So amplify the sources that you trust, especially if you are not very interested in politics or the news and you know that people around you aren't either. I think it's good to put some news in their social media feeds and to amplify the BBC, the New York Times, and Science or Nature, these good sources of information. I think it's good to amplify them, especially on social media, where it's very hard for them to compete with misinformation. And yeah, I think it's important for people who are reasonable and not very interested in politics, et cetera, to express themselves on social media. 

(32:44): 

I'm not asking them to take strong positions, but even just from time to time sharing a news article about climate change, just to show their network that they care about climate change, that they want things to happen. So I think it can be good for reasonable people to be more vocal about the things they have at heart, and even to sometimes show that they have very moderate views, even on gun control or whatever, and to say, yeah, I'm open-minded. It would be good if we had a more representative view of people when looking at social media. And for scientists, I have a few more opinions, because I'm a scientist. I think it would be good for scientists to be nicer and appear nicer, basically, because there is a lot of evidence that scientists are deemed very competent. 

(33:26): 

But some people think that scientists don't always have people's interests at heart. And I think that's a problem. We need to show people that we are just like them, that we have their interests at heart. And even though sometimes we accept money from some companies, it doesn't mean we are evil and that we don't have people's interests at heart. And one way to do that is to be on social media, for instance, and interact with people, but also just talk with people to show that, yeah, I'm a scientist, and look, I'm nice. More broadly, I think it's important to try to frame the information that we communicate in an intuitive manner and try to avoid common misunderstandings. We know, for instance, that when we use the word natural, like natural gas, people will think it's good. And we also need to be clear about effect sizes. 

(34:08): 

Because when we say, for instance, that you have differences between men and women in terms of, I don't know, the brain or whatever, people will think intuitively that when we say differences, it means big differences, like it's a real thing. But when you look at effect sizes in science, often they are very small. So we need to be clear about that, because there is actually one study showing that lay people are unimpressed by most effect sizes in psychology; they think that even medium or moderate effects are quite small. And of course we need to communicate better about uncertainty. Science is not the theory of relativity and Darwin's theory of evolution. Science is messy, science is humans disagreeing, and there is often not very strong certainty unless there is a scientific consensus, but it takes time to reach it. And so we need to be clear about that and help policymakers, I think, communicate better about scientific uncertainty. 

(35:00): 

When you communicate uncertainty in an intuitive manner, people are quite fine with it. And of course they're going to trust the results a bit less if it's very uncertain, but it makes sense and it'll help in the long run. Because I think we've seen that during COVID-19. When you say masks may be useless, but the uncertainty is very high and actually we don't know, it's better to say that than to say masks don't work and then a week later say masks work, because people are going to say, what? You change your mind all the time. Whereas if you communicate uncertainty, it's easier to change your mind and provide conflicting evidence in the future. And it's better for trust, because otherwise it'll erode trust, and people will trust politicians and scientists less and less if we keep communicating this way. 

(35:39): 

Well yeah. Sacha, this has been a really interesting conversation, so thanks so much for coming on the show. 

(35:44): 

Thanks Michael. It was very nice talking to you. 

(35:47): 

Well, that was my chat with Sacha. I think the biggest thing that stood out for me was the intuitive versus reflective beliefs, how we can know something is true but not act on it. And that made total sense to me. Well, that's all for me. I'm Michael Bartz. Here's to feeling a little less in over our heads when it comes to saving the planet. We'll see you again soon. In Over My Head was produced and hosted by Michael Bartz. Original theme song by Gabriel Thaine. If you would like to get in touch with us, email info@inovermyheadpodcast.com. Special thanks to Telus STORYHIVE for making this show possible. 

(36:23): 

I'm tryin’ to save the planet, oh will someone please save me. 
