Science Denial, Pseudoskepticism, and Philosophical Deficits Undermining Public Understanding of Science: A Response to Sharon E. Mason, Lawrence Torcello

Article Citation:

Torcello, Lawrence. 2020. “Science Denial, Pseudoskepticism, and Philosophical Deficits Undermining Public Understanding of Science: A Response to Sharon E. Mason.” Social Epistemology Review and Reply Collective 9 (9): 1-9.

🔹 The PDF of the article gives specific page numbers.

This article replies to:

❧ Mason, Sharon. 2020. “Climate Science Denial as Willful Hermeneutical Ignorance.” Social Epistemology 34 (5): 469-477.

In this reply, I examine the relationship between science denialism and education. Specifically, I want to address how the mode of science denial I identify as pseudoskepticism relates to information deficits of a philosophical nature. Sharon Mason is interested, as I am, in identifying the deficits in knowledge that contribute to climate science denialism and undermine scientific literacy. As we struggle through a global pandemic made worse by the influence of a U.S. President and a right-wing political party openly dismissive of science, the topic of science denialism has taken on amplified importance. Mason’s article, “Climate Science Denial as Willful Hermeneutical Ignorance” (2020), is a valuable addition to the academic literature on such topics, and a unique one insofar as it identifies specific conceptual resources needed for the public’s understanding of science.

Those who deny the urgency of climate change are often capable of scoring well on measures of scientific literacy (Kahan et al. 2012). The best predictor of whether someone disregards scientific findings on global warming, and now COVID-19, is political affiliation (McCright and Dunlap 2011; Hornsey et al. 2016). Social scientists’ confirmation of this fact can lead to decreased emphasis on educational measures to counter scientific illiteracy and disinformation, in favor of a focus on the cultural and social influences on belief formation. It is tempting to dismiss the influence of education on science denial in deference to the cultural cognition model, but this is a mistake.

The cultural cognition model holds that our interpretation of science (and of information in general) is shaped and filtered by the customs embedded in our social affiliations and consequent sense of identity. This concept is nothing new to philosophers, who have always sought to recognize and interrogate underlying social and cultural assumptions influencing our cognition.

In the 19th century, when the association between democracy and liberal protections was still relatively new, thinkers such as Ralph Waldo Emerson, John Stuart Mill, and Alexis de Tocqueville decried the stultifying influence that ideological conformity has on society. As usual, Emerson is worth quoting at length. He writes in his 1841 essay “Self-Reliance:”

If I know your sect, I anticipate your argument. I hear a preacher announce for his text and topic the expediency of one of the institutions of his church. Do I not know beforehand that not possibly can he say a new and spontaneous word? Do I not know that, with all this ostentation of examining the grounds of the institution, he will do no such thing? Do I not know that he is pledged to himself not to look but at one side — the permitted side, not as a man, but as a parish minister? He is a retained attorney, and these airs of the bench are the emptiest affectation. Well, most men have bound their eyes with one or another handkerchief, and attached themselves to some one of these communities of opinion […] nature is not slow to equip us in the prison-uniform of the party to which we adhere (Emerson 2014).

I make no argument against the concept of cultural cognition—I take it as practically self-evident. And I am firmly on the side of those reformers, writers, and philosophers who have taken education to be a necessary inoculant against the small-mindedness, parochialism, and abasement to superstition that plagued humanity long before the term cultural cognition was coined. Additionally, I recognize the evidence that effective science communication involves clear, concise, and repeated messaging of scientific consensus (Lewandowsky, Gignac, and Vaughan 2013; van der Linden, Clarke, and Maibach 2015). The evidence for what has been called consensus messaging is sometimes interpreted as contradicting the evidence for the persistent influence of cultural cognition. I have shown elsewhere that both sets of data are compatible, via the mechanism of the mere exposure effect, or what is sometimes called the familiarity principle (Torcello 2016). Simply put, cultural cognition can be overcome with patience and repetition.

Here I want to talk about individuals who are resistant to consensus messaging and how their resistance might relate to broader education deficits. The effectiveness of clear messaging depends on the educational underpinnings that enable one to understand what is conveyed by a scientific consensus.

Willful Hermeneutical Ignorance

Sharon Mason’s examination of science denialism drives directly at why deficits in education are vital for understanding the species of science denialism that I call pseudoskepticism. Mason portrays climate science denial as stemming from a species of active ignorance she identifies as willful hermeneutical ignorance—the latter term borrowed from Gaile Pohlhaus. Active ignorance, as described by philosophers Miranda Fricker and José Medina, is self-perpetuating ignorance driven by motivated reasoning and grounded in ideological commitments (Fricker 2007; Medina 2013). The concept pairs well with that of cultural cognition. Mason describes willful hermeneutical ignorance as follows:

The concept of willful hermeneutical ignorance employed here comes from Gaile Pohlhaus’s work on epistemic injustice, where Pohlhaus describes a type of epistemic harm that results from a subject’s prejudicial failure to develop the interpretative resources that would make certain kinds of experiences intelligible to that subject (Mason 2020).

Willful hermeneutical ignorance results from conceptual gaps in a person’s knowledge: gaps perpetuated by biases that undermine the person’s ability to cognize particular types of information. Mason calls these gaps hermeneutical lacunas.

Mason identifies examples of such hermeneutical lacunas; first from Dale Jamieson’s work, and then from mine (Jamieson 2014; Torcello 2011, 2016). Jamieson’s examples illustrate common misunderstandings concerning the meaning of uncertainty in a scientific context. He observes that science is an epistemologically cautious undertaking. Findings responsibly communicated by scientists are presented with their appropriate provisos and caveats. Indeed, this epistemic caution is a necessary element of what makes the scientific process dependable. The acknowledgment of scientific uncertainty should signal that a particular source is trustworthy, but to a layperson, acknowledged margins of uncertainty can sound like confessions of ignorance.

Secondly, and relatedly, a misunderstanding of how scientists treat uncertainty can lead to poor risk analysis by nonscientists. As Mason quotes Jamieson:

When a scientist says that she is uncertain that a particular effect will occur, she leaves open the possibility that an even more extreme effect will occur. However, in ordinary language we typically attach the uncertainty to the most extreme effect that we can reasonably envision. Thus, in everyday discourse saying that we are uncertain about whether a given atmospheric concentration of carbon dioxide will produce a 2°C warming conversationally implies that the warming will be no more than 2°C, while in scientific discourse there is no such implication. The denial industry has exploited these confusions on an industrial scale (2014, 87).

Mason’s next example draws from my work. This example concerns how the word consensus is used in the context of science. In science, as noted, uncertainty is a given. In order to minimize uncertainty, science is organized methodologically into a process meant to rule out confounding variables. It is this methodological application of skepticism that informs scientific consensus. When a scientific consensus is achieved, it means that all attempts to skeptically rule out a hypothesis have instead served to strengthen the probability of that hypothesis. Mason quotes me as follows:

[T]he scientific skepticism of modern science is applied by a community of inquirers; it is not enough that one researcher endorse a particular conclusion, for scientific findings must be confirmed repeatedly by other scientists (if they are to gain traction in the broader scientific community). Part of what scientists do is to test conclusions, examining and when warranted rejecting them, before they are accepted as established … when a consensus view does exist, it signals the consensus view functions reliably, under a wide range of skeptically informed empirical applications, to predict and explain observational results (2016, 22).

In rejecting scientific consensus, climate science deniers offer a pretext of skepticism against a conclusion that is itself the result of a methodologically skeptical process. If non-experts rejecting a scientific consensus are not purposefully equivocating, then they must misunderstand what the term consensus denotes in its scientific context. In science, consensus does not signal political agreement or compromise; it signifies a skeptically endorsed finding that has stood the gauntlet of repeated challenge. It is inappropriate to call the obstinate and ill-informed rejection of scientific consensus by non-experts skeptical. That’s why I use the term pseudoskepticism to describe the form of science denialism that rejects an established scientific consensus. Pseudoskepticism signals ignorance and cynicism in the guise of skepticism. For Mason (and I agree), ignorance of what consensus means in the scientific context represents a hermeneutical gap. It is a gap in one’s understanding that allows the conflation of a scientific consensus with an appeal to popular opinion.

Science Denialism and Pseudoskepticism

I wanted to clarify my usage of pseudoskepticism, because I do not believe that every form of science denialism results from informational gaps and a dubious cynicism that issues in the hollow pretense of skepticism. Some forms of science denialism stem from expressions of faith. For example, some Christian sects express belief in transubstantiation during the ritual of communion. Such beliefs imply a rejection, if only a ritual rejection, of the ability of physical and logical laws to constrain the miraculous. Such faith may be a form of active ignorance, but it isn’t necessarily predicated on information deficits, nor is it necessarily harmful.

Others may reject scientific consensus for philosophical reasons. An example is the epistemological anarchism of the philosopher Paul Feyerabend. This species of science denialism is typically limited to academic contexts—and such science denial may even be beneficial as a speculative foil—but it does not necessarily imply an information deficit. Alternatively, one may pretend to reject scientific findings in an effort to deceive the public, as evidence shows fossil fuel corporations have done (Oreskes and Conway 2010). Again, such science denialism does not necessarily involve an information deficit, even if we want to charge a civic or an ethical deficit (Torcello 2018). By contrast, what I call pseudoskepticism involves an information deficit by definition:

When scientific consensus is rejected by non-experts who naively consider themselves more scientifically astute than the collective scientific community, it is appropriately labeled pseudoskepticism. Pseudoskepticism […] may be influenced by, though perhaps not exclusively, the following two factors: (a) ignorance of the scientific process and (b) ideologically motivated reasoning (as opposed to the exercise of faith) (Torcello 2016).

Pseudoskeptics may be aware that something called a scientific consensus exists. They might, as I mentioned at the outset, score well on tests meant to measure knowledge of scientific information. But it does not follow that pseudoskeptics are competent interpreters of science. It is on this latter point that Mason’s identification of hermeneutical gaps proves especially insightful—not only into the phenomenon of pseudoskepticism, but into the Dunning-Kruger Effect more widely.

The Dunning-Kruger Effect entails overconfidence in one’s own knowledge or skill set that is inversely related to how much one actually knows (Kruger and Dunning 1999). Though Mason does not discuss the Dunning-Kruger Effect, her analysis of hermeneutical gaps dovetails with it to provide a tidy explanation of how informational deficits can undermine the metacognitive ability to make relevant self-judgments. The Dunning-Kruger Effect, via hermeneutical lacunas, helps explain why pseudoskeptics are so vulnerable to corporately curated disinformation: they fail to perform the initial, truly skeptical cognitive task of critical self-assessment.

The Denial Industry

The influence of fossil fuel corporations on the American political system is the central reason we have made no meaningful institutional attempt to decrease fossil fuel emissions, and the central reason that the topic has become politically divisive. The work of science historians Naomi Oreskes and Erik M. Conway and sociologist Robert J. Brulle has laid this fact bare (Oreskes and Conway 2010; Brulle 2014). In the 1970s, scientists at Exxon and other fossil fuel companies warned their employers about anthropogenic global warming. In turn, those corporations endeavored to preserve their business models by sowing disinformation in the public sphere, seeking to delay regulatory pressures (Supran and Oreskes 2017; Cook et al. 2019).

Philosophical Deficits

The finding that some science deniers score well on tests meant to measure scientific literacy might be mistaken as evidence that information deficits are less relevant than cultural cognition. Yet, just as reading ability is measured by one’s facility in deciphering new words rather than by memorization, scientific literacy ought to be measured by one’s ability to think critically about scientifically relevant claims in order to distinguish those with scientific merit from those veering toward pseudoscience—including the purposefully disorienting products of the “denial industry.” Such critical judgments are only possible with sufficient epistemic resources. The ability to understand scientific uncertainty and consensus is itself a product of epistemic literacy and of authentic scientific literacy. More broadly, the ability to think critically involves a range of epistemic resources which underwrite the effectiveness of the scientific process in studying the natural world.

So why are so many in the Anglosphere prone to manipulation at the hands of corporate and political peddlers of disinformation? One common assumption is a failure in STEM education. However, the rules of inference and principles of argumentation are instilled foremost through philosophy, not STEM, and becoming adept at applying them is the specialty of the humanities. Moreover, social scientists are finding that the type of instruction routinely offered in philosophy courses, including but not limited to critical thinking and logic, does act as an intellectual inoculant against misinformation (Cook et al. 2017). To quote cognitive scientist John Cook, writing about these findings in the context of climate change disinformation:

Typically, inoculations convey resistance by providing people with information that counters misinformation. In contrast, we propose inoculating against misinformation by explaining the fallacious reasoning within misleading denialist claims. We offer a strategy based on critical thinking methods to analyse and detect poor reasoning within denialist claims. This strategy includes detailing argument structure, determining the truth of the premises, and checking for validity, hidden premises, or ambiguous language. Focusing on argument structure also facilitates the identification of reasoning fallacies by locating them in the reasoning process. Because this reason-based form of inoculation is based on general critical thinking methods, it offers the distinct advantage of being accessible to those who lack expertise in climate science (Cook et al. 2018).

Again, it’s not news to philosophers that forearming students with the tools of critical analysis better prepares them to identify and resist disinformation. I am also confident that students who learn the guiding principles of liberal democracy and ethics, and who carefully study the lessons of history, are better inoculated against manipulative politicians and the politics of social and racial division. The problems we face, now more than ever, are less intellectual conundrums than matters of political will and social organization.

Concluding Remarks

There is no quick fix for the societal scale of science denial and disinformation which is now weakening democratic politics. One does not pass on critical thinking skills, a sense of civic and ethical accountability, or an informed understanding of liberal political principles, through osmosis. If we are to inoculate society against the current plague of disinformation, our herd immunity will, after all, derive from appropriate education in science and the humanities. It is a laborious, protracted process, but we already know that, and we know how to do it well.

Instead of getting on with that broader education as a national project in the United States, and aside from the undeniable importance of science, technology, engineering, and mathematics, we have been asked to settle for “STEM” as a slogan and a panacea. Two reasons that some are willing to settle seem apparent: first, students in the United States lag behind those of many other nations in science and math scores—yet the same is true of reading scores (Camera 2019); second, and more tellingly, people believe that STEM fields are uniquely vital as the basis for a “knowledge-based” economy. Both of these notions are captured by former United States Secretary of Education Arne Duncan in this 2013 statement:

In a knowledge-based, global economy, where education is more important than ever before, both to individual success and collective prosperity, our students are basically losing ground […] We’re running in place, as other high performing countries start to lap us (Bidwell 2013).

Nearly a decade after Duncan’s statement, the United States remains the largest economy in the world (Bajpai 2020). Our economy persists, despite the fact that we have politicians who routinely question the reality of evolution, climate science, and now epidemiological recommendations during a global pandemic.

Graduate students in US university STEM programs, programs that for now are still ranked among the most prestigious in the world, are predominantly international students (Wingfield 2017). A third of American workers in STEM-related fields have not completed a baccalaureate degree, while another third have baccalaureate degrees but no graduate degrees (Graf et al. 2018). I raise these points not against the importance of STEM education, but to decouple student STEM scores from a nation’s economic strength. In the end, education must be something more than economic production and job preparation.

The political mishandling of climate change and now COVID-19 may continue to reveal the nation’s inadequacy in STEM disciplines, but it also exposes the artificiality of emphasizing STEM, while neglecting our collective need for broader education in the subjects that foster critical thinking, self-awareness, and the intellectual confidence to seek out and learn from one’s mistakes. As a nation, the US demonstrates a deficit of philosophical tools necessary to understand the value of science and to follow its consensus recommendations. But we can yet emerge from these dark times with a realization of how history, reasoned skepticism, and civic respect allow us to move forward from past mistakes. Confronting willful hermeneutical ignorance is an arduous educational task for any nation, but no more arduous than neglecting it has become in the United States.

Author Information:

Lawrence Torcello, Rochester Institute of Technology.


Bajpai, Prableen. 2020. “The 5 Largest Economies in the World and Their Growth in 2020.” NASDAQ World Markets, January 22. Accessed August 18, 2020.

Brulle, Robert J. 2014. “Institutionalizing Delay: Foundation Funding and the Creation of U.S. Climate Change Counter-Movement Organizations.” Climatic Change 122 (4): 681-694.

Camera, Lauren. 2019. “U.S. Students Show No Improvement in Math, Reading, Science on International Exam.” US News, December 3. Accessed August 18, 2020.

Cook, John, Peter Ellerton, and David Kinkead. 2018. “Deconstructing Climate Misinformation to Identify Reasoning Errors.” Environmental Research Letters 13 (2).

Cook, John, Stephan Lewandowsky, and Ullrich K. H. Ecker. 2017. “Neutralizing Misinformation Through Inoculation: Exposing Misleading Argumentation Techniques Reduces Their Influence.” PLoS ONE May 5.

Cook, John, Geoffrey Supran, Stephan Lewandowsky, Naomi Oreskes, and Ed Maibach. 2019. “America Misled: How the Fossil Fuel Industry Deliberately Misled Americans About Climate Change.” Fairfax, VA: George Mason University Center for Climate Change Communication.

Emerson, Ralph W. 2014 [1841]. “Self-Reliance.” In The Portable Emerson, edited by Jeffrey S. Cramer. New York: Penguin Books.

Fricker, Miranda. 2007. Epistemic Injustice: Power and the Ethics of Knowing. New York: Oxford University Press.

Graf, Nikki, Richard Fry, and Cary Funk. 2018. “7 Facts About the STEM Workforce.” Pew Research Center, January 9. Accessed August 18, 2020.

Hornsey, Matthew J., Emily A. Harris, Paul G. Bain, and Kelly S. Fielding. 2016. “Meta-Analyses of the Determinants and Outcomes of Belief in Climate Change.” Nature Climate Change 6: 622–626.

Jamieson, Dale. 2014. Reason in a Dark Time: Why the Struggle Against Climate Change Failed–and What It Means for Our Future. New York: Oxford University Press.

Kahan, Dan M., Ellen Peters, Maggie Wittlin, Paul Slovic, Lisa Larrimore Ouellette, Donald Braman, and Gregory Mandel. 2012. “The Polarizing Impact of Science Literacy and Numeracy on Perceived Climate Change Risks.” Nature Climate Change 2: 732–735.

Kruger, Justin, and David Dunning. 1999. “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.” Journal of Personality and Social Psychology 77 (6): 1121–1134.

Lewandowsky, Stephan, Gilles E. Gignac, and Samuel Vaughan. 2013. “The Pivotal Role of Perceived Scientific Consensus in Acceptance of Science.” Nature Climate Change 3: 399–404.

Mason, Sharon. 2020. “Climate Science Denial as Willful Hermeneutical Ignorance.” Social Epistemology 34 (5): 469-477.

McCright, Aaron M., and Riley E. Dunlap. 2011. “The Politicization of Climate Change and Polarization in the American Public’s Views of Global Warming.” The Sociological Quarterly 52 (2): 155-194.

Medina, José. 2013. The Epistemology of Resistance: Gender and Racial Oppression, Epistemic Injustice, and Resistant Imaginations. New York: Oxford University Press.

Oreskes, Naomi, and Erik M. Conway. 2010. Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. New York: Bloomsbury Press.

Supran, Geoffrey, and Naomi Oreskes. 2017. “Assessing ExxonMobil’s Climate Change Communications (1977–2014).” Environmental Research Letters 12 (8).

Torcello, Lawrence. 2018. “The Acceleration of Global Warming as Crime Against Humanity: A Moral Case for Fossil Fuel Divestment.” In The Palgrave Handbook of Philosophy and Public Policy, edited by David Boonin, 779-793. Cham, Switzerland: Springer Verlag.

Torcello, Lawrence. 2016. “The Ethics of Belief, Cognition, and Climate Change Pseudoskepticism: Implications for Public Discourse.” Topics in Cognitive Science 8 (1): 19-48.

Torcello, Lawrence. 2011. “The Ethics of Inquiry, Scientific Belief, and Public Discourse.” Public Affairs Quarterly 25 (3): 197–215.

van der Linden, Sander L., Chris E. Clarke, and Edward W. Maibach. 2015. “Highlighting Consensus Among Medical Scientists Increases Public Support for Vaccines: Evidence from a Randomized Experiment.” BMC Public Health 15: 1207.

Wingfield, Nick. 2017. “The Disappearing American Grad Student.” The New York Times, November 3. Accessed August 18, 2020.

Categories: Critical Replies
