May 26, 2021
We know that not all Americans trust science. But why, exactly, should they, when scientists are fallible humans, just like the rest of us? Acclaimed science historian and author Naomi Oreskes is on the show this week to answer exactly that. Check out her new book, Why Trust Science?, at factuallypod.com/books.
106 — Why Trust Science? with Naomi Oreskes
Speaker 1 [00:00:02] Hello, welcome to Factually, I’m Adam Conover and this week; let’s talk about science. We hear all the time in the media that a lot of Americans don’t trust science and we’ve seen that during the pandemic; public health advice from some of the most experienced, knowledgeable scientists in the country was treated by many like a partisan football, with arguments erupting everywhere over whether those scientists are really trustworthy. The pandemic in many ways highlighted what seems to be a divide in how Americans view scientific expertise. And so a lot of people (especially in the media) started asking, ‘What do we do about that? How do we make people trust science?’ Well, you know what? I think that is the wrong question to ask. I don’t think we can treat science as just some authority figure that we must trust and we must bully other people into trusting. Instead, we need to ask a different question and we need to make sure that question is answered to the satisfaction of our fellow citizens. And that question is ‘Why exactly should we trust science?’ I mean, seriously, ask yourself this. It’s a harder question to answer than you might think. The truth is that contrary to what some people assert, scientists are not infallible angels strumming truth harps on fact clouds. We shouldn’t just trust them because they are scientists. They are, in fact, fallible people working for fallible institutions funded by fallible sources like governments and businesses. Like anyone else, scientists are subject to biases and distortions and mistakes. Science is as messy as every other fucking human enterprise on this goddamn planet, and when it makes mistakes they can be big ones. For instance, in the 19th and 20th centuries, scientific racism in the form of studies like phrenology and eugenics was the toast of science town. Many, many scientists asserted and believed these racist propositions that we now know are false, with devastating results. 
So given that fallibility and that history, why exactly should we believe that scientists are more trustworthy than anyone else or that science is a more trustworthy enterprise? Well, here’s one argument that you might advance. Unlike many other human enterprises over the last several centuries, science has made indisputable progress in our understanding of the universe. We absolutely, indisputably know more about the world and how it functions this year than we did last year and last year we knew more than the year before that. And this process has been going on in some form for hundreds, if not thousands of years. It’s not just little bits and pieces of info we’re compiling. We are breaking off big old chunks of reality from the darkness and bringing them into the light of human understanding. Let’s say you’re a baby who was born in 1900, but you’re a really smart baby, so you’re able to read all the scientific knowledge of that time. OK, well, even though you were very, very smart, you would not know anything about the structure of DNA or RNA. You wouldn’t know about the variety of weirdo subatomic particles that make up matter. You wouldn’t know that antibiotics could stop infections and you wouldn’t know that black holes are real. Without question, science has delivered the goods in terms of our understanding of the universe. But we still have to ask the question, ‘What is it about science that makes it so successful? Why should we trust science and scientists when they speak?’ And honestly, what is science to begin with? There is a lot that we need to know here before we know where to put our intellectual trust. Well, to answer those questions, I am so honored to say that we have one of the great scholars of science and the history of science on the show today; Naomi Oreskes. 
She’s a professor of history at Harvard and the author of the landmark book ‘Merchants of Doubt,’ as well as her new book, fittingly titled ‘Why Trust Science?’ which you can get at our special bookstore at factuallypod.com/books. Please welcome Naomi Oreskes. Naomi, thank you so much for being here.
Speaker 2 [00:04:03] Thank you for having me.
Speaker 1 [00:04:06] So your newest book, let’s just jump right into it, is called ‘Why Trust Science?’ Why trust science? I’m sorry to make you summarize the whole book in my first question, but I bet you have a good answer to it. And why is it an important question to ask?
Speaker 2 [00:04:23] Well, it’s obviously an important question to ask because we’ve seen what’s happened this past year when people don’t trust science. When people (whether it’s politicians, governors, the President of the United States or ordinary citizens) refuse to accept scientific advice, refuse to trust what scientists have learned about the world, then people get hurt, people get sick, people die. Science is our best way of understanding the world around us and when we understand the world around us, we can use that knowledge to protect ourselves against disease and to do things we want to do, like go to the moon or invent better technologies. But when we ignore that evidence, we do so at our peril. And so it’s an incredibly important issue for all of us. And that’s why I wrote the book, even before covid-19. The answer to the question is twofold. Scientists are the people who have the job of trying to understand the natural world. That’s what they do for a living. Every day they wake up and they go to work, and their work is the work of understanding the natural world. So if we have questions about the natural world, then scientists are the experts to whom we turn. So on one level, the answer to my question is very simple. If you have a toothache, you go to the dentist. If your car breaks down, you go to a car mechanic. And if you have a question about nature, then you go to a scientist. So that’s the simple answer. And then I can get into more detail if you’re interested.
Speaker 1 [00:05:44] Yes please, that’s why we’re here, is to go into detail. So that’s a really good heuristic to use throughout our lives. We’ve talked about this on the show before: when dealing with misinformation, we too often say ‘Go do your own research,’ when what you should really do is go to an expert and find out what that expert thinks. But I don’t think you can just reduce that to an argument from authority. Right? That, ‘Hey, this is the person who knows. Therefore, everything they say must be unimpeachable.’ Just to take the devil’s advocate position, there have been plenty of things that scientists have been wrong about. Certainly innumerable statements, some of which were very harmful. So how do we integrate that into an argument for trusting science?
Speaker 2 [00:06:32] Well, integrate is the operative word because there are conspicuous examples where in hindsight we would say scientists were wrong, and sometimes in damaging ways. And the second chapter of my book is all about that. So I look at some specific examples where scientists did go wrong and ask the question, ‘Well, why did they go wrong and what can we learn from that?’ But if we put those examples in context, what we find is that they’re actually not that common. That if we look at the track record of contemporary and modern science, and I think as a historian I would say – I think we can legitimately say that science as an institution, science as we know it, has been around since about the 17th century. So that’s a long time. So we have a long historical record and that historical record is very well documented. It’s very well studied. And what we see is an amazing track record of success. So many things about the world that we understand now, we understand because of the work that scientists did, and so much of that work we’ve been able to use in good ways: to cure disease, to create safe and effective medicines and vaccinations, to build safer houses, to predict where and when earthquakes may occur, to understand volcanic hazards, to build effective technologies. There are so many things we’ve done and so many things that we take for granted in our daily lives. And I think one of the problems we face is that when everything’s going right, we tend not to notice. It’s like we take our car for granted when it doesn’t break down. We take Zoom for granted when it works right. We tend to notice when things go wrong. So I think there’s been a bit of undue emphasis on some of the mistakes or errors of science against a larger background of a tremendous track record of success.
Speaker 1 [00:08:13] Yeah, science seems really remarkable in that there are so many fields of human endeavor that we’ve been doing as long as science that have not made progress in the same way. I’ve had philosophers on the show and we’ve talked about how philosophy hasn’t advanced in the same way (and that’s really the wrong way to look at it). But it’s not like we’re like, ‘Oh, we’ve learned so much about philosophy since Socrates’ day.’ We still read Socrates and think he has some good points and recent philosophers have no better claim to truth than he does in many ways. Architecture, right? We’ve been doing it for centuries. But it’s not as though architecture is much better than it was. It’s better in some ways but I’m here in Miami today and I’m running around. Architecture is not great here in Miami, Florida. But science, by contrast, has made progress throughout the last few hundred years. And I suppose you could include the proto-scientific endeavors of millennia past. We do actually understand the world in a concrete way, much more fully than we did a century ago. We have made undeniable progress and that’s rare in humanity.
Speaker 2 [00:09:28] Well, exactly, and historians never want to use the word ‘undeniable’ because you can always find someone to deny it. I think what is fair to say is, if you think about the very word progress, it’s a little bit of a vexed word because what’s progress for one group of people might actually be regress for another. Progress is a very value-laden concept, so different people have different conceptions of what constitutes legitimate progress. But nevertheless, it is true that the very word progress is often associated in many of our minds with scientific progress and with technological progress. So why is that? Why do we have this close coupling in our minds between science and technology and the notion of progress? And I think the answer is exactly embedded in your question, because science is a process of learning. It’s about learning about the natural world. And we do have a notion of progress being tied up with learning. We could talk about children progressing in school. We could talk about making progress in learning to play the violin or progress in learning to speak Chinese. When we’re thinking about learning, we have a strong sense of progress because we can track our own learning process and say, ‘Oh yeah, I play the piano better today than I did a year ago.’ Or ‘My French has become more fluent.’ Or ‘We know more about the natural world.’ There are questions about the ocean, the atmosphere, about life on earth, about the structure of the atom that we can answer now in powerful and convincing ways that we couldn’t 10 years ago, 100 years ago, 200 years ago. So I think it’s that notion of learning that makes us feel that, yes, there is a sense of progress in science. And it’s not that art isn’t good, but if you compare contemporary art to 18th century landscape painting, it might be better. It might not be. It’s more a matter of taste.
Speaker 1 [00:11:18] Yeah, you’re right, it’s really embedded in there. I think a lot about how – my sister is a physics reporter for Science News. She writes about particle physics. She’s telling me about the most recent advancements and she’s been telling me about how particle physics specifically has made such enormous advances in the last 50, 70, 100 years; constant discoveries about the fundamental nature of matter, of existence. Almost metaphysical questions are being answered, except that in recent years they’ve sort of run out of frontier. There’s been this problem where the physicists are like, ‘Wait, we’ve got questions that we’re trying to answer but nothing new is coming up. We’re not yet finding the avenues for exploration.’ And when an experiment does turn up, say, an anomalous result that contradicts what they know, everyone gets excited because they’re like, ‘Oh my God, there might be new physics that we can do.’ And if that doesn’t happen, they’re frustrated because they’re like, ‘Well, we can’t know everything. There must be more to learn, but what is it?’ And they can’t even figure out what they don’t know. And I think that’s such an interesting dynamic that is so buried in our expectation about what science is, because an artist would never do that. An artist would never say – actually maybe an artist would say that.
Speaker 2 [00:12:41] I think in literature you can find people who have said things like ‘After the novel, what’s next?’ So maybe that’s not so unique. But what I would point out is that this seems to be something a little peculiar to physics. I’m not sure what it is about physicists that they sometimes feel as if they’ve run out of questions, because I’ve never known a biologist or a geologist or a chemist to ever speak in that kind of language. But physicists sometimes do, and one thing any historian would point out is they were talking like that in 1904. There was a very widespread sense in physics at the start of the 20th century that they were done, and we look back now and we think, ‘Wow, that’s pretty funny that they thought they were done when they were actually right on the eve of relativity and quantum mechanics.’ So I’m confident the physicists will find more new and good things to do, even if they don’t see it right now.
Speaker 1 [00:13:31] Do you think that emphasis on progress, is that a mistake at all or is that part of what science is?
Speaker 2 [00:13:41] Yeah, that’s a really great question. I think it’s tricky because as I said, as a historian, I’m mindful of the idea that progress is a very value-laden concept. And the notion that there is progress in some absolute sense is one that almost all historians would challenge, both intellectually and politically. But nevertheless, as a historian of science, I’m mindful of the idea that in science there is this sense of progress. It’s what a historian would call an actors’ category. That is to say, scientists believe that their enterprise progresses and they believe it in a way, as you said, that philosophers generally don’t. So the fact that scientists experience what they do as progress, and we as people who use science also experience it as progress, tells me that it’s not a category that we just want to throw away, it’s a category we want to understand. And as I said, the way I understand it is to think of it as progress in learning; that we learn more things, we learn new things, we discover things in the world that we didn’t know were there. And that’s a kind of progress.
Speaker 1 [00:14:40] Yeah, so what is it about science that has caused this remarkable success, as you put it? Because there’s certainly plenty of fields of human endeavor that are trying to progress. There are certainly plenty of philosophers who thought that they were progressing the field. There’s many, many – there’s alchemists who felt they were understanding the nature of reality but were failing to do so. Yet science has actually been successful at it. So what is it about science that causes it to work where other fields do not?
Speaker 2 [00:15:17] Well, I think there’s two things, maybe three. One is the sustained engagement. Artists might not have a sense of progress in art, we might not say that cubism is progress over representational art. But art continues. There is a sustained project that we call art. So in that sense, we can say science is not that different from other activities. Scientists have hung with the project. They’ve sustained the project of investigating and understanding the natural world. And so, some of the success is simply that it has been sustained, that people have stuck with it, they haven’t quit. And especially when the going got tough, science can be really hard. Many of the questions that the people I study have tried to answer were really, really hard questions. I have another new book about the history of oceanography in which I look at the question of deep ocean circulation. This is something that scientists argued about for 300 years before they finally got to a point where they said, ‘OK, yes, we think we know what’s going on.’ So some of it is patience and persistence that science as an enterprise has illustrated. The second thing is money. Part of the reason scientists have been able to sustain this project is because people have been willing to fund it (and particularly in the 20th century, governments) because there’s been a perception that science is useful, and particularly in the 20th century I’m sorry to say, useful for warfare. The book I wrote on oceanography is called ‘Science on a Mission,’ and it’s all about military funding of oceanography because of its relevance for submarine and anti-submarine warfare. So there’s been a very steady stream of funding into science, particularly in the last 100 to 150 years, which has made it possible for scientists to have the kind of sustained engagement that they do have. Other activities could have maybe been successful if they’d had the same kind of institutional and financial support. 
But science, we know, has had it. The third thing that I focus on in this book (in ‘Why Trust Science?’) is what I describe as the ‘critical vetting of claims,’ that it’s not enough in science to do the work, to do the investigations, to come up with ideas, to think that you personally have an answer to an important question. You have to expose that conclusion to the critical scrutiny of your colleagues, your peers, your fellow experts in the field, and that critical scrutiny is really, really tough. The philosopher of science, Helen Longino, refers to it as ‘transformative interrogation’ and I really like that term because I think it hits the nail on the head of what’s going on. It’s interrogation because it’s tough; scientists see their job (and it’s part of the cultural values of science) as challenging their colleagues. So if you, Adam, come to a conference and you say, ‘Hey, I have this brilliant new theory that I think is going to explain everything. And here’s my argument.’ I’m going to say, ‘OK, that’s great, but what’s your evidence?’ And then you say, ‘Oh, I have evidence.’ And then someone else is going to say, ‘Well, yeah, that’s great. But is that really enough evidence? Is that evidence really robust? Does it really hold up?’ And so critical scrutiny means that what we call a scientific fact is never the opinion of one person (no matter how smart or clever that person is) it’s something that has been accepted by a group of peers who have looked at it in a tough way from many, many angles. And then one more thing. So, as I said, Helen Longino calls this ‘transformative interrogation’ because typically what happens is that the initial claim doesn’t entirely hold up to scrutiny. So you come and you say, ‘I have a theory of everything,’ and I say, ‘Hold on, wait a minute.’ And then we have a back and forth and we discuss it in conferences. You submit it to a peer review journal, the reviewers criticize it, then it gets published. 
Maybe then there’s more feedback and commentary. And it might be that you say, ‘OK, well, yeah, my theory of everything turns out not actually to work for everything, but it does work really well for these two things.’ And so it becomes a theory of those two things, or you modify the theory to make it work better in response to the critiques and comments of your colleagues. And so the process is transformative. By the time we finish and we agree on something that we think is true, it’s typically not the same claim as what we started with initially. And that starting point might have been a year ago, it might have been 10 years ago it might have been a hundred years ago.
Speaker 1 [00:19:45] That’s really interesting. I’m really struck by the fact that, I feel like if I asked a lot of scientists (or people who think about science) that same question, they would have answered me in a more epistemological sense. They would have said, ‘Oh, well, science, here’s how it works. You advance a claim and you test the claim and you throw out what isn’t true and even though you might have mistakes over time that winnows out the bad ideas, et cetera,’ and ‘The scientific method, et cetera, et cetera.’ Everything you just described to me, though, was about the cultural context of science; its position in society, the way that we fund it, the way that the community of scientists operates as a group of people, which is often left out of that discussion. I think that’s a really interesting emphasis that you have. I wonder if you could tell me a little bit about that.
Speaker 2 [00:20:37] Well, the odd thing about scientists is that they’re often extremely unscientific about their own enterprise. So many scientists have ideas, they have ideologies, they have ideals of what they think science is or should be, and often these are ideals that were passed down to them by their teachers or mentors. But they’re actually not data based. They’re not based on the evidence. And so this is why my field exists. This is why there are professional historians and philosophers, sociologists and anthropologists of science; because we actually study science as it is. I don’t think of my project as exactly a science, but it’s an empirical project. We look at evidence, we look at data. We say, ‘Well, what are these people really doing?’ Set aside the theories and the ideals and the ideology. What are they actually doing? And when we actually look at what they’re doing, that’s when we get this more complicated picture, a picture in which the social institutions of science (institutions like conferences, scientific societies, peer reviewed journals) are crucial to the vetting of claims. And if you actually look at what scientists do, you find they spend huge amounts of time in conferences. They spend huge amounts of time preparing papers for peer reviewed journals, revising those papers when they get them back (when they get the criticism of the reviews back), and resubmitting them; sometimes the revision and resubmission can take two, three, four or five rounds. So these are the activities of scientists. It doesn’t mean that they don’t also sometimes do experiments or test a hypothesis. Of course they do that, too, but it’s just a part of the process. Well, I’ll just put it that way. It’s a part of the process. It’s certainly important in many contexts, but it’s by no means the whole picture.
Speaker 1 [00:22:21] I love that you as a historian are doing science (or you’re at least doing investigation) on the scientists themselves who – a physicist is not thinking scientifically about their work. They have not been trained or have any interest in studying the dynamics of human groups. But that’s what you do. I think that is so often left out of our discussion of science. So many times when there’s a debate within scientific circles about inequity, gender or race disparities, or any other sort of cultural issues in science that might be happening, there’ll be a faction that says ‘No, science is just about the scientific method. That’s all it is,’ which is that ideological version of it. And you’re really pointing out, no, where science happens – That’s such a startling observation, that science is happening in those conference rooms. All those people are – they’re at the Large Hadron Collider. Sure, they’re spinning the particles around real fast. But a big portion of science is happening when they go out to the bar afterwards and talk about it and then when they go to the conference and they go to the bar – Everything happens in bars, is what I’m saying. Everyone’s drunk when science is happening.
Speaker 2 [00:23:32] I won’t go quite that far. But, you have the general picture. Yes.
Speaker 1 [00:23:36] Well, so my question about that then is, if the scientists themselves aren’t really thinking along that dimension, that is so much a part of what keeps what they do a healthy, successful enterprise. It’s not crossing their minds because they’re not really necessarily trained to think that way. That suddenly makes science feel like a much more precarious enterprise than I thought. For instance, when we’re booking the show, we try to find folks who – We want to make sure that our guests, when they’re scientists, are part of the community, that mainstream community in their field. But there is such a thing as brilliant people who are outside of that, who are not able to hang, who are seen by that group as being outside of it. And there’s also such a thing as groupthink and cultural groups that make mistakes like that. So being that you have that focus on the cultural organization of science, how is it successful even though there’s all these built in biases in human groups?
Speaker 2 [00:24:36] Well, that’s a great question. I’ll try to unpack that a little bit. So, first of all, I think the vision of science that I put forward in my book is not more precarious. It’s more human. I think it’s important for us to recognize the human dimension, because if we put science up on a pedestal and make it seem as if scientists are some kind of gods and then we discover that they have feet of clay, then we’re all disappointed and we’re crushed. And we saw that. I was just talking in another interview where the issue of the so-called Climategate scandal came up, when some climate scientists’ emails were stolen. And what really came out of that? What did we really learn? We learned that the climate scientists were people. We learned that they are people who sometimes get frustrated and angry. We learned that they didn’t really know how to cope with organized disinformation that they were facing in their work. And in their frustration, they considered the possibility of some things that probably were not a great idea. Now, did they actually do those things? No. There was no evidence that any of the scientists involved had actually done anything wrong. But there was the suggestion that they had maybe possibly considered some things that weren’t entirely great. So this is splashed all over the front page of newspapers. And there were even people who said, ‘My faith in science has been shattered.’ And honestly I wanted to – Can I say this? Is this a family show?
Speaker 1 [00:25:53] No, go right ahead.
Speaker 2 [00:25:53] I wanted to hit these people. What do you expect? These are human beings, people make mistakes. The good news of that story is that they didn’t actually do the bad things they thought about, but they did think about it. Well, come on guys, have you never thought about doing something wrong in your life? Have you never been tempted to – Well, I don’t want to say what I’ve been tempted to do, but have you never been tempted to do something inappropriate in a moment of frustration? Yeah, of course you have. We all have. But here’s the crucial thing: that makes science less precarious, because the reality is, what do we really want from these scientists? We don’t need them to be perfect human beings. We don’t need them to be saints. What we need is for the scientific claims they make (in this case about the climate system) to be accurate, to be reliable and, as far as we know, to be true. And how is that achieved? That’s achieved in that social dimension that I was talking about, because it’s not just about the one climate scientist. He’s got to take those claims into the court of scientific opinion and they’ve got to be vetted by other colleagues who are going to ask hard questions and who are going to say, ‘Well, what is the evidence for this claim?’ And so that human dimension is actually what makes science strong, because it’s not just the opinion of one person who might be fallible, but it’s the collective wisdom of a whole group of very smart people who have looked hard at these issues. Now, the other thing I’d like to add about that is one of the interesting things: my work draws heavily on work in the sociology of science, work that was being done in the 70s and 80s (particularly in the 1980s when I went to graduate school). And in the 1990s a lot of scientists took offense at this work, and they considered work that looked at the social dimensions of science to be somehow a kind of debunking. 
I think those scientists got it completely backwards, that actually the social dimension of science is part of its strength. And when we really look closely at how these institutional structures operate we realize, ‘Oh, well it makes sense this is the strength because you’re not relying on an individual.’ That would be a slim reed to hang your hat on. I just mixed my metaphors but you know what I mean.
Speaker 1 [00:28:00] Yes.
Speaker 2 [00:28:00] If you were only relying on one person and that person turned out to be problematic, then you would be in trouble. But if you’re actually relying on the wisdom and the knowledge of a collective group of people who are all engaged in trying to find the right answer to these questions, that’s a much stronger position. So I often say, I work for years on these projects and I read unbelievable amounts of stuff, a lot of it difficult and boring. And then at the end, I come down to a saying that could be summarized by a cliche. So we all know the cliche that two heads are better than one. But in this case, it’s not just two heads. It’s two hundred or two thousand.
Speaker 1 [00:28:36] Yeah, it’s that group gestalt of all those minds working at it together that comes up with those accurate, reproducible results (in most cases) and that in most cases corrects itself when there’s been a mistake.
Speaker 2 [00:28:51] Sometimes scientists like to say that science is self-correcting. And I don’t know what that means because that kind of makes it sound as if science is alive. It’s an entity that fixes itself. No, science is not a robot that knows how to repair itself, but the process of science is involved in identifying and correcting errors. So even once we accept a claim and we say, ‘OK, the climate system has warmed,’ for example, ‘and we are pretty sure it’s warmed just about one degree centigrade.’ But the scientific process is always open to the possibility of revision, that even things we think we know very well can be revisited if (and of course, this is a big if) someone has evidence. This also helps us distinguish between what I would call legitimate revision and legitimate dissent versus disinformation, misinformation, fake news and all the other bogus stuff that we’ve been trying to deal with in our lives in recent years. So in science, we do revisit questions and my early work was about one example of that. In the early 20th century, the idea was raised that our continents were not fixed; that the earth was in motion, that big pieces of the earth were moving around and that’s what caused earthquakes and volcanoes. That idea was debated, there was an open discussion about it. The people who proposed it were not neglected geniuses. They had their day in court, but particularly here in the United States, American geologists in general rejected the idea. There was some acceptance, but not a lot; mostly rejected. Europe was a little bit more mixed, but even there it was not taken as established to be true. About 30 years later, the debate was reopened and there was a whole body of new evidence available that made it possible to look at those questions again in a different way. And with those new bodies of evidence, scientists said, ‘Actually, yeah, Alfred Wegener was right. 
Continents really do move.’ Now you could look at that as a failure, that scientists got it wrong the first time around, that Alfred Wegener didn’t get the credit he deserved. Or you could look at it as a success story that, yes, it took 30 years and that is too bad (it would have been nicer if it happened sooner) but actually, scientists came back to the question and they sorted it out. And now we’re pretty sure that continents really do move and we’re not likely to be revisiting that any time soon (although, again it’s always possible). A good scientist keeps open the possibility that we may be revisiting these claims.
Speaker 1 [00:31:18] What we’re talking about is plate tectonics, which is something that I learned about in (I don’t know) sixth, seventh grade. It was a very bedrock piece of my education in geology, to the extent that I had one.
Speaker 2 [00:31:31] Bedrock education, I like the pun there.
Speaker 1 [00:31:35] I didn’t intend it, but I’ll take it. Yes. So the idea that it was actually rejected by scientists in the United States for 30 years?
Speaker 2 [00:31:47] Yes, now some of that had to do with World War Two. So in my book, ‘Science on a Mission,’ I revisit this story and one of the things I point out is that actually – scientists mostly rejected it, and the debate mostly took place in the 1920’s. But in the 1930’s, there was a group of scientists, a group of Americans and Europeans, who were working on a model to explain how continents might move and who were very, very close to what we would now say is true (it wasn’t exact but it was awfully close), and they actually presented that work at a set of conferences in 1933 and 1936. But then World War Two broke out and the key scientists involved ended up doing military work that was classified secret, and so they couldn’t talk about their work. And after the war a lot of the key data became classified. So in the new book, I argue that even though military funding of science made a lot of things possible – made it possible for scientists to do a lot of new work that had not been possible before – it also led to conditions of secrecy that limited conversations on certain questions, and one of those was plate tectonics. And so it’s not until the 1950’s, when a group of British scientists start looking at the question again, that the debate gets reopened. Not by Americans, but by British scientists. But once they reopen it, then the Americans say, ‘Oh, wait, wait, wait. Yeah, we want to be part of this too.’
Speaker 1 [00:33:08] Wow, this is fascinating. You raise the question of military and government involvement in science, I have a question for you about that but we got to take a really quick break. We’ll be right back with more Naomi Oreskes. OK, we’re back with Naomi Oreskes. So you were talking about (right before the break) World War Two and plate tectonics, a realization that I’ve had in the last 10 years is: I was brought up by scientists. I come from a family of scientists and very much came from that idea of scientists really care about scientific truth. That is what they’re into it for. That is why they’re doing it. And I believe that’s why they’re doing it. But I read that book ‘Sapiens’ a couple of years ago, and I think a lot of that book is a little glib, but it has lots of little insights that stuck with me. And one was that throughout human history, as much as we want to believe that’s true of scientists (and it is true of the scientists on the ground), science is always funded by someone else who has another interest, for the most part. Agricultural science is funded by the USDA, which is trying to increase crop yields, et cetera. The Manhattan Project was funded by the government to create an atom bomb. And that there are these pressures upon science in that way, that what science gets done and what science has money poured into it is determined by social factors, not by some group of scientists deciding, ‘Oh, what’s the best science that we could do right now?’ How does that affect science, in your view?
Speaker 2 [00:34:47] Well, it affects it a lot, and this is certainly a really important question that I think both ordinary people and the scientific community should be talking about more, because there are really big questions about what we want science to do for us. What do we want our scientists to be studying and I think all of us should be involved in that conversation and yet that conversation almost never happens. So in ‘Science on a Mission,’ the subtitle of my book is ‘How Military Funding Shaped What We Do and Don’t Know About the Oceans.’ The argument I make is that funding structures very, very strongly determine what gets studied and what gets ignored or neglected. And so this becomes an argument for diversity in funding sources. The U.S. Navy was a good patron of science. It funded a lot of outstanding research. We learned a lot of things about the ocean thanks to Navy support. But there are also a lot of things we didn’t learn. There were a lot of things that were neglected because it wasn’t something that mattered to the Navy. And that’s not a criticism of the Navy. The Navy was doing its job. In fact, I think you could argue that it would have been a misuse of federal funds if the Navy had given money for things that were irrelevant to the military mission. But we need to recognize that if we rely too much on the Defense Department to fund science or too much on the Department of Agriculture to fund plant research, we will end up with a lopsided vision of the world. And it’s not a criticism of patrons. Patrons, of course, have something they’re interested in. Like you said, Department of Agriculture wants better corn yield, and that’s fine. There’s nothing wrong with better corn yield. But we need to recognize that there might be other kinds of questions about plants that might be – For many, many years the Department of Agriculture funded what was known as ‘applied entomology.’ So the applied study of insects. 
But what was applied entomology, really? It was actually mostly how to kill insects. It was mostly about pesticides. There are some contexts in which killing insects might be what you want to do. But there are an awful lot of other things about insects, lots of good insects like bees that we need, or insects that play important roles in ecosystems in terms of pollinating flowers or insects that are food for birds (and lots of people love birds). So if you only study insects to kill them, you will definitely miss out on a lot of important things that you might actually benefit from knowing about insects.
Speaker 1 [00:37:12] How do we get funding for the things we actually want to know about insects? And in your view, do we have enough of it?
Speaker 2 [00:37:17] That’s a really good question. I think this is something that I’d love to see more people talking about. Rush Holt, who is a former congressman (one of the few members of Congress to ever have a Ph.D. in physics) has recently written a new preface to ‘Science: The Endless Frontier,’ which was the report that was written by Vannevar Bush at the end of World War Two. That’s generally viewed as the blueprint for science, as we’ve known it in America since the end of World War Two. And in his preface to it he says, ‘We really need a new social contract for science. We need to rethink what the relationship is between the American people and the scientific enterprise.’ And I think he’s right about that. And so how do we do that? I don’t know. I think we all need to think about it but maybe conversations like this, maybe having journalists who are interested in science beginning to think through ‘We’ve done it this way for, what? Something like 70 years now and it’s worked pretty well in a lot of ways but we can also identify some areas in which it hasn’t worked so well and maybe we might want to make some adjustments.’
Speaker 1 [00:38:19] Yeah, but in your view that’s not a reason to trust science less; the fact that it can have this lopsided funding structure?
Speaker 2 [00:38:29] Correct. It just means that we need to know that the knowledge we have is going to be incomplete. And so we might want to ask ourselves questions. For example, here’s a good one. I’ve given a lot of talks this year about covid-19 and one of the things I always say is: the scientists have done a great job. They’ve done exactly what we needed them to do. They identified the virus, took a little while but they figured out how it was transmitted. They sequenced the genome. And with that information, they created these amazing messenger RNA vaccines that many of us here in America have now gotten, and that are proving astonishingly successful. Ninety-five percent effectiveness in the clinical trials and seemingly something like 90 to 98 percent effective in real life, which is unusual. Usually it goes the other way round. And apparently virtually 99 to 100 percent effective in preventing death. This is an astonishing success. It shows us scientists have delivered exactly what we wanted of them. So that’s the good news. But here’s the bad news. We have about 20 percent of our fellow Americans who have said they don’t intend to get vaccinated. Not now, not ever. And why is that? Is it because the vaccine doesn’t work? No. Is it because the vaccine is unsafe? No. So what is it? Well, we have indications that it’s cultural. It has to do with personal identity. It’s ideological. It has to do with how you feel about the government. It’s partly tribal. People in certain states are much less likely to get vaccinated than others. So these are not questions that will be resolved by better sequencing of the genome. Right? These are social scientific questions that might be resolved if we had a better understanding of people’s relationship to science and people’s relationship to their governments. That can be answered with social scientific research. But as most of us know, social scientists have been massively underfunded in this country. 
So this is a shameless plug for what I do. I think we’ve learned in the covid crisis that we need social science as much as we need physical science.
Speaker 1 [00:40:34] Yeah, wow, that’s an amazing point, because the vaccines are exactly what you’re talking about. They’re proof that the science we funded has had enormous success. mRNA vaccines are one of the coolest scientific innovations; it’s literally hacking our cells’ protein-making machinery in a way that allows us to fight a novel disease within months of it appearing. That, combined with the spike protein and all these other discoveries, all coming together into this enormous thing. But I’ve talked to plenty of folks on this show about how half of the battle with public health (or maybe even more than half the battle) is communication; is knowing how we get people to understand how to protect themselves, how to wash their hands – all over the world, this is the case. So much of our progress in getting rid of communicable diseases has come from teaching people to use toilets in places throughout the world. The government, through the NIH, funds medical science to this enormous degree; billions upon billions of dollars flowing through the NIH (one of the largest research organizations of any kind in the world) devoted to coming up with things like new vaccines, but precious little devoted to understanding the people who are going to take the vaccines, who we need to convince to take the vaccines. We need to understand why they might not take it. And yet we don’t put – Why isn’t the government massively funding that program to understand its own people better and why they might be hesitant?
Speaker 2 [00:42:11] Exactly right. The why to that has to do with the legacies of the Cold War and the relationship between science and the military. That’s one reason why I became interested in the science-military relationship, because our whole scientific infrastructure was built (to a very great extent) because of the perception of how science could help in national defense, and that wasn’t necessarily wrong. I mean, you may take objection to it or not (depending upon your point of view) but it wasn’t necessarily wrong. But it’s incomplete and the covid-19 crisis really illustrates that, and we can even take it a step further. So, as you said, public health is about the public. And so if you want public health to work you have to understand the public, but then you can even take it one step further. So we know from opinion polls we have about 20 percent of Americans who don’t intend to get the vaccine if they can avoid it. But we also know there’s another percentage (and I’m not sure of the numbers here): the people who would like to get the vaccine but they don’t get paid sick leave, they can’t take a day off from work to get the vaccine without sacrificing salary. I know where I got vaccinated – well, I got the vaccine in Utah – there were no vaccines available on Sunday. But I was fortunate that I could go on a Tuesday. But then there are also people who, in addition to not being able to take a day off or an afternoon off to get the vaccine, are worried that they’ll get sick, because we know that many people do have some degree of adverse reaction, particularly to the second shot. Many of us who have had the second shot have been pretty sick for a day or two afterwards. But they don’t have sick leave, so they feel like, ‘Now I’m looking at maybe three days of lost work and I can’t afford that.’ Now, the NIH is not going to intervene in that issue because that would be considered politics. 
And it’s not the job of NIH to figure out how to fix the fact that millions of Americans don’t have sick leave. And yet that political piece is crucial to the public health part. How do we sort that out?
Speaker 1 [00:44:09] That also just goes to your point of how successful science has been as a field of human enterprise that we have advanced to the point that we have these vaccines but our political culture, our political systems, are clearly far less advanced. If we were as successful in organizing our society as we are in organizing science specifically, people would have that sick leave and people would not feel those pressures, people would have more trust in their government. A lot of people not wanting to take the vaccine is based on the legacy of the Tuskegee experiment and all these sorts of things.
Speaker 2 [00:44:45] Well hold on, I need to just jump in there because that’s actually not true.
Speaker 1 [00:44:48] Oh okay, please correct me.
Speaker 2 [00:44:49] That’s a good example of, again, where we actually need to study something. So a lot of people have been saying that and suggesting that African-Americans have a legitimate reason to be nervous about the public health system because of abuses like Tuskegee. Well, there’s actually very little evidence to support that. And the poll data we have (we don’t have a lot because it’s a situation that’s very fluid, it’s changing by the day) has shown that most African-Americans and people of color in this country do want to get the vaccine and if they’re not getting it, it’s because of these issues of access that I was just referring to. They can’t get time off from work, they don’t have sick leave. There’s no vaccination center close to their homes. Now, the good news on that is when you realize that’s what’s going on, then you can say, ‘OK, well, maybe we can bring the vaccines to the workplace and people can get their vaccine at their lunch break, or maybe we can bring the vaccines into their communities.’ In Los Angeles, there was a very effective intervention a couple of months ago where doctors did that and they spread the word in the community through churches and thousands of people showed up. So this is why it’s so important to have social science research, to know what is really the obstacle for folks, because if it were the legacy of Tuskegee, that would be a different kind of problem with a different solution than if we just have to send a truck with vaccines into your workplace, right?
Speaker 1 [00:46:15] Yeah, fair point. I mean, that legacy is real but we should understand what the problem that we are trying to address is or we’re going to come up with the wrong solution. I want to make sure that we touch on denialism, by which I don’t mean – I think when we talk about denialism, we often think of the stereotype of a person who’s just ‘I hate science, blah, blah, blah.’ I’m talking more about the concerted effort to disrupt the scientific process by the fossil fuel industry, by other bad actors that are specifically, almost sometimes harnessing the language of science in order to disrupt the work of science. Tell us a little bit about it.
Speaker 2 [00:47:02] Well, thank you for asking because this obviously is a very important part of this landscape. So I’ll start with good news. The good news is that public opinion polls do show that the majority of Americans do trust science broadly and they do believe that in general, science makes our lives better and that in general, scientists do have the interests of the American people at heart. But we also know that there is suspicion and rejection of science in particular areas. And one of those areas is climate change, which is the area I’ve studied most closely, as you said. And here, this is where the role of deliberate disinformation is an important part of this landscape. So we have documented that for 30 years now, the fossil fuel industry and their allies in conservative ideological circles (like libertarian think tanks) have promoted the idea that we don’t really know if climate change is true. The scientists have exaggerated it, that scientists are not to be trusted, that they’re just in it for the money, that they don’t actually care about the truth. And they have deliberately promoted distrust in science in order to promote distrust in climate science and prevent action that would hurt their bottom line. And so it’s really important for us to understand this, to know that this is (in my opinion) malfeasance on a grand scale and that it needs to be called out and that the corporations that do this should be named and shamed, because not only are they preventing us from acting on climate change (which is an incredibly important issue in our lives) but they’ve also now done damage which we’ve seen express itself in distrust of science more broadly which then contributes to people not wanting to wear a mask or not wanting to become vaccinated.
Speaker 1 [00:48:40] And so what are the things that they do to wage this campaign, if you could give us a couple of examples?
Speaker 2 [00:48:46] Oh, well, we only have a few minutes left and there are so many examples. But in ‘Merchants of Doubt,’ we did document the coordinated campaigns that involve things like television advertisements, advertisements in print media, paid lecturers who would go on a lecture circuit, workshops, libertarian think tanks like the Cato Institute (for example, just to name one), attacks on scientists and coordinating with right wing members of Congress to investigate scientists. In the previous administration we saw quite a bit of that, all of which has the same general modus operandi, which is to cast doubt on the reliability of science and to cast doubt on the integrity of scientists. So the American people will say, ‘Yeah, I don’t really know. And since I don’t know, I don’t want to wear a mask.’ Or ‘Since I don’t know. I don’t want to spend money. I don’t want to have to pay a carbon tax. And after all, why should I spend money on a carbon tax? Why should I accept a higher price for gas if we don’t even really know if the science is right anyway?’
Speaker 1 [00:49:51] Sometimes (I’ve covered this in my past work) there are cases in which groups like this, whether the fossil fuel industry or another motivated group, have even done their own science or their own science-looking work. Even fucking Gatorade has their ‘sports hydration institute’ or whatever, where they fund studies on how great Gatorade will make you at football or whatever it is. How much is that an issue? There have been cases where, speaking of the right wing movement, they’ve funded chairs at universities that have a particular point of view in economics, and that’s influenced our discussion about economics in this country. How much do you worry about that as a distorting influence on not just public perception of science, but the actual work of science itself?
Speaker 2 [00:50:41] It’s a big problem. It’s a big problem on two levels, first of all because there’s a lot of it. And also because, as you suggest, to the degree that real science is compromised by conflicts of interest and distorting influences, then the American people begin to have a legitimate reason to distrust at least some science. And then it really, really muddies the water. And we’ve seen this now in a number of areas; the fossil fuel industry were the people who first developed this into a fine art. I’d say it plays out on two kinds of levels. One is deliberately funding academics in order to influence their findings, and we have good evidence of that from the tobacco industry. One of the really scary things about the tobacco industry was that they funded research in practically every major university in America (including most major medical schools) and very, very few schools turned them down. So this led to a huge distorting effect, because if you’re doing research that’s funded by the tobacco industry and you know very well the tobacco industry does not want you to find that their product is bad, this is likely to influence your work at least a little bit. And we have studies that show that’s the case, that tobacco-funded research was less likely to find that tobacco caused disease than independent research. And this went on for the better part of 30 years. It still continues in some quarters and I think it definitely contributed to delaying action to control tobacco. It contributed to confusing the American people and confusing journalists so that even in the 1970’s (long after it had been proven that tobacco tar caused cancer) The New York Times was still quoting tobacco executives saying, ‘Oh, well, we have this study that says we’re not really sure. We don’t really know.’ And we’ve seen similar things in fossil fuels and plastics and food. There’s a lot of distorting industry research in the food domain. 
Now, in addition to that though, there’s also what I would call the ‘funding of distracting research.’ In this case, the research is not necessarily influenced, but it’s used to distract attention from the issue at stake, and again, the tobacco industry really raised this to a fine art. So here’s one of the things the tobacco industry did. We know that smoking causes lung cancer, but it’s also the case that there are other causes of lung cancer. For example, asbestos causes lung cancer. Radon causes lung cancer. So what did the tobacco industry do? They abundantly funded researchers who worked on asbestos and radon. And then if somebody would raise the link between tobacco and cancer they would say, ‘Well, we have these studies that show cancer is caused by radon and asbestos.’ And of course, it wasn’t a lie. It was true, cancer is caused by radon and asbestos, but now the listener is confused. ‘Oh, well, so maybe tobacco’s not that bad. Maybe my Aunt Dayna’s cancer wasn’t because she smoked two packs a day. Maybe somewhere in the past she was exposed to radon.’ And now, again, you’ve muddied the waters and you’ve made it harder to have effective public policy, because you’ve made the scientific landscape seem more confusing than, in fact, it actually was.
Speaker 1 [00:53:49] And you’re also shaping what gets studied, like you say; when our oceanography is mostly funded by the Navy, we have a certain gap in our understanding. If these companies are pouring money into researching a certain food – food companies provide a lot of the funding for food science, and therefore even when it’s not distorting the findings, it means we’re getting a certain sort of finding and we’re maybe not researching other topics that we might want to know about food. So given those pressures, is that something that you think the scientific process is still able to correct for?
Speaker 2 [00:54:33] Well, I think it’s difficult. I think we have seen areas in which the scientific waters have definitely been muddied. So I think universities need to do a better job of vetting funders. I think scientists need to take this more seriously. I’ve certainly seen cases where scientists have taken money directly from industry to fund specific studies but they say, ‘Oh, no, no, no, I’m objective. It’s fine.’ Well yeah, maybe. And of course, if my argument is correct, then in many cases it actually will be fine, because the scientists may present those results at a conference and other scientists will push back and say, ‘Well, we’re not sure that’s true.’ In theory the peer review process should weed out the bad or the biased science. But in practice editors are busy, reviewers are busy. Peer review ends up being a pretty low bar. So anything that can’t pass peer review is probably not high quality, but a lot of things do get through peer review that are not that great. So I do think it’s a big issue and I think we need to do more work, that the scientific and academic communities need to take this on board as part of their own housekeeping to say, ‘Yeah, this actually is a problem and we need to figure out how to do more to clean house so that the public can maintain their trust in our activity, because we’re doing our own homework to keep our own house in order.’
Speaker 1 [00:55:52] Well, let me end with this. When we, the public, are trying to evaluate science that we see come across our transom, across our social media feed – I saw an interview where you mentioned an article that got a lot of press in Outside magazine about why we shouldn’t wear sunscreen. And another article that said ‘Why flossing isn’t really good for you.’ And I remember seeing both of these and going, ‘Well, I’m not so sure about this. But I mean, if scientists say so, I don’t know.’ I didn’t quite swallow it but I definitely thought about it a little bit. What should we as laypeople be doing when we receive information like that (which we’re bombarded with)? The media is constantly saying ‘New study about this, new study about that.’ How do we, as laypeople, evaluate it?
Speaker 2 [00:56:41] Well, I think the first question to ask is, ‘Who’s saying this?’ Outside magazine is not a scientific organization, and when they start publishing something about science, it’s legitimate to say, ‘Well, OK, maybe.’ It doesn’t mean that you necessarily reject it out of hand, but if it’s something you’re really seriously interested in, you need to do a little bit more work, and don’t throw away the sunscreen yet. And I think in most cases – you often hear this phrase now (I think you even mentioned it) ‘I’ll go do my own research,’ right? There’s nothing wrong with people being well informed, but it means you have to actually go to legitimate sources. So if you’re interested in covid-19, then you need to go to NIH or CDC. And if you’re interested in climate change, NASA, for example, has a great website about the evidence (the scientific basis) for our understanding of climate change. The information is there but you can’t just Google climate change. You have to go to NASA and then see what they have to say. Start with the source, not the topic, because if you just start with a topic, you will find yourself in a labyrinth of disinformation and fake news. The other thing I would say is that I think this responsibility cuts both ways. A lot of scientific organizations don’t do a good job of making information available to people in plain language. When I’ve tried to learn about – like I tried to learn about the sunscreen thing, for example. I mean, NASA is great. I love NASA. I think they’ve done a fantastic job with their public outreach, partly because NASA has always had a public-facing element. I think the CDC could do a lot better in some of its communication.
Speaker 1 [00:58:17] Yeah, clearly.
Speaker 2 [00:58:17] Journalists play a role here, too; journalists have to stop running with one-source stories or with something sensational. We saw that with the dental floss case; it was very sensational to say, ‘Flossing does you no good.’ And all the newspapers ran with it and there were all these headlines filled with schadenfreude. But hold on, guys. This came out of the AP, which is not a scientific organization, and who were they quoting and where was the data coming from? What did the American Dental Association have to say about this? The reality was that the dentists were all saying, ‘No, dental floss is good. We have lots of clinical evidence.’ But the problem is you can’t do a double-blind trial of flossing, because people know whether they’re flossing or not. So this gets into another issue that I talked about in the book, which is what I call ‘methodological fetishism.’ Some people become very obsessed with the idea that if you don’t have a double-blind, randomized clinical trial, you have nothing. And that’s just ridiculous, because there are many forms of evidence that are useful and valuable besides double-blind clinical trials. And for some problems like dental floss (like most areas of nutrition) you can’t do a double-blind trial. People know what they eat. People know if they’re brushing their teeth. So you have to look at other forms of evidence. It means it’s not always easy, it means you can’t be lazy. You have to do some work. But if you’re a journalist, you’re getting paid, you have a job. You could do the work.
Speaker 1 [00:59:49] There’s also – I mean, yeah, you’re right, this sort of methodological fetishism, and I’ve fallen into this trap before myself, where I say, ‘Oh, there’s no study that conclusively shows it, so there’s no evidence for this.’ But with flossing – because I think after I read that article, I did cut back on flossing for a while, because I don’t like it. I don’t think I was completely convinced by it. But I was like, ‘Let me see what happens if I don’t floss for a little bit.’ And when I didn’t floss for a while, my gums started bleeding and I went to the dentist and they were like, ‘Your gums are really bleeding. If you floss, your gums won’t bleed so much.’ I was like, ‘OK, I will.’ And then I started flossing more and my gums stopped bleeding, and, like, that’s evidence, too. And being empirical is saying, ‘Here’s a physical fact.’ It’s not just the abstract fetishization of an ideological attachment to the scientific method. It’s ‘I experienced this.’ It happened in the real world.
Speaker 2 [01:00:45] Exactly. And that’s what all the dentists said. They said, ‘Look, talk to any dentist. The dentist can tell whether you’ve been flossing or not. It’s obvious in the state of your teeth.’ And in fact, we did have good evidence that people who didn’t floss were more likely to develop gum disease and in some cases even serious illness where bacteria in the gum could spread to the heart. Evidence comes in lots of sizes, shapes and colors. Some evidence is more robust than others. But if we can’t get the most robust form for whatever reason then we can turn to other forms of information and in some cases, that does involve our own personal experience. As you said, nobody likes flossing, that’s partly why that article got so much attention. So many of us wanted it to be true. We wanted them to be able to say, ‘Oh, yeah, just throw away the dental floss.’ But we know from our own experience, when you don’t floss your gums bleed and bleeding is bad. So, hello, that’s evidence, right?
Speaker 1 [01:01:36] You really seem like an optimist about science, despite all the issues we’ve been discussing – funding distorting what science gets done, denialism, et cetera – all the challenges and headwinds. You still seem to be a real optimist about it.
Speaker 2 [01:01:55] Well, I think the thing is this: I was trained as a scientist, originally. I still keep nice rocks on my desk. This is a very beautiful piece of fluorite from Australia.
Speaker 1 [01:02:05] Gorgeous.
Speaker 2 [01:02:06] The scientific enterprise is one of the great endeavors of people; understanding the world we live in. The world is so complex and so beautiful and so amazing. When you learn how things work, when you understand bumblebees or the ways in which flowers – It’s what Darwin said: ‘There is grandeur in this view of life.’ Science is this very cool and exciting enterprise, and when you do work and you feel like you actually understand something about the natural world, it’s deeply, deeply gratifying. And I think science popularizers (you think about someone like Carl Sagan) that’s partly what they managed to convey, that sense of excitement and wonder in the natural world. So I see science as a great enterprise, and I think it’s done a lot of good in the world. It’s not all good. And I think it’s something that’s definitely worth preserving and protecting, particularly against organized disinformation by people with vested economic or ideological interests. Because if we lose science, we will lose a lot in our lives. And so, yes, I’m optimistic about something that I think is really valuable and wonderful and that I think has made almost all of our lives better overall.
Speaker 1 [01:03:14] Maybe spreading that a little bit more, that view of wonder, is key to fighting denialism and helping people trust science. The vaccine is presented as, ‘Oh, scientists came up with this. You should take it. We made it real fast.’ But if people really could understand what mRNA vaccines do and why they’re so incredible, and we were better at telling that story, maybe we’d be better at fighting the disease and helping people build that trust in the process.
Speaker 2 [01:03:45] Yeah, I think so. I mean, obviously, the whole vaccine thing is complicated. I don’t think we’ll solve it entirely with just the wonder of nature, that would be naive. I do think there’s a way in which – nobody likes to feel disempowered and nobody likes to feel condescended to. If you just say, ‘Here, take this vaccine,’ there will always be people who will recoil from that. But if you can say, ‘Look, let me explain how this works, and it is kind of cool.’ It’s kind of amazing, right? That in this short amount of time we could send instructions to our cells so that they could fight this disease on their own, which is what this is doing. It’s pretty amazing, actually. And I do think that the more scientists can take the time to do that kind of work, that kind of explanation, and also to really listen. I hear a lot of scientists talk about how they need to talk to people more. And I say, ‘Yeah, talking’s good, but listening is better.’ To actually listen to people and ask them, ‘What is your concern? Is it about Tuskegee, or is it because you can’t get a day off from work?’ And it might be both. For some people it’s one, for some people it’s the other. For some people, it might be some combination, but there’s no way to find out what it is if you don’t listen. I think the scientific community (one way or the other) has to find better ways to really hear the American people and find out more about what people’s concerns really are, because I think in most cases those concerns can be addressed. We have taken steps to ensure that Tuskegee doesn’t happen again. There are still problems with health care delivery in minority communities. But I don’t think we’re going to have another Tuskegee. These are things that can be addressed, but they don’t get addressed most of the time, and so we have to figure out better ways of addressing them.
Speaker 1 [01:05:27] Well, thank you so much for coming on the show to talk to us. This has been a fascinating conversation and I can’t thank you enough. The book is called ‘Why Trust Science?’ And people should really pick it up. Thank you so much for being on the show.
Speaker 2 [01:05:39] You’re welcome. It’s been fun.
Speaker 1 [01:05:43] Well, thank you again to Naomi Oreskes for coming on the show. If you want to get a copy of ‘Why Trust Science?’ that URL once again is factuallypod.com/books, and because it’s through Bookshop.org, you will be supporting not just the show but your local bookstore when you purchase there. But of course, if you have a local bookshop in your area, please go to them first and support those fine people who sell books to you. That is it for our show this week. I want to thank our producers Chelsea Jacobson and Sam Roudman, Andrew Carson (our engineer), Andrew W.K. for the use of our theme song, and the fine folks at Falcon Northwest for building me the incredible custom gaming PC that I’m recording this very episode on. You can find me online at AdamConover.net or @AdamConover wherever you get your social media. Thank you so much for listening. We’ll see you next week on Factually.