Closing the Hermeneutical Gap in STEM Education: A Reply to Lawrence Torcello

Sharon E. Mason

Education has a plurality of aims, one of which is to increase knowledge and decrease ignorance. The relation between knowledge and ignorance, however, turns out to be surprisingly complicated. Sometimes ignorance is actively held in place by various forces that resist correction, such as political or religious ideology, motivational biases, and systematic prejudices. Another source of active ignorance is the absence (or suppression) of interpretative resources necessary for rendering evidence intelligible.

Image credit: woodleywonderworks via Flickr / Creative Commons

Article Citation:

Mason, Sharon E. 2020. “Closing the Hermeneutical Gap in STEM Education: A Reply to Lawrence Torcello.” Social Epistemology Review and Reply Collective 9 (11): 53-58. https://wp.me/p1Bfg0-5wM.

🔹 The PDF of the article gives specific page numbers.

This article replies to:

❧ Torcello, Lawrence. 2020. “Science Denial, Pseudoskepticism, and Philosophical Deficits Undermining Public Understanding of Science: A Response to Sharon E. Mason.” Social Epistemology Review and Reply Collective 9 (9): 1-9.

Articles in this dialogue:

❦ Mason, Sharon. 2020. “Climate Science Denial as Willful Hermeneutical Ignorance.” Social Epistemology 34 (5): 469-477.

In this reply to Lawrence Torcello’s “Science Denial, Pseudoskepticism, and Philosophical Deficits Undermining Public Understanding of Science: A Response to Sharon E. Mason,” I explore in further detail how hermeneutical gaps support science denial, with a particular focus on the broader questions about STEM education that Torcello raises in his response (2020). Torcello extends my work on hermeneutical lacunas in science denial (Mason 2020) in three ways. First, he articulates a defense of science education in response to the worry that cultural cognition models of science denial erode confidence in the efficacy of education as a response to pseudoscientific beliefs. Second, he examines in further detail how hermeneutical lacunas relate to his analysis of pseudoskepticism, a concept that is useful for distinguishing virtuous skeptical attitudes from a more insidious science denialism. Third, whereas I focused on specific hermeneutical lacunas related to climate science denial in my article, Torcello considers challenges in education more generally. Of special note is his discussion of the “artificiality of emphasizing STEM” (2020) and the central role of philosophy and the humanities in education that prepares individuals for full participation in democratic society. I will address each in turn.

Education and Cultural Cognition

Torcello begins by arguing that there is no conflict between holding that ‘our interpretation of science (and of information in general) is filtered and shaped by the customs embedded in our social affiliations and consequent sense of identity’ and, at the same time, that science education is an essential strategy for combating climate science denial. My work assumes this, but it may be helpful to say a bit more about how I view the relation between cultural cognition and education. Cultural cognition can exert a significant influence on belief, and it is only a rather flat-footed view of education that would ignore this influence. Education is not an activity that occurs outside a particular social context: beliefs are formed by people who have motivational biases, who make credibility assessments that are sensitive to social identities, and who bring to the table a repertoire of interpretative concepts that directly and indirectly influence their comprehension.

On the one hand, the fact that cultural cognition affects belief formation is not new. Torcello identifies Ralph Waldo Emerson, John Stuart Mill, and Alexis de Tocqueville as historical examples of astute observers of the influence of cultural cognition on belief. To that list we might add numerous others. Francis Bacon, for instance, famously identified four “idols” or main sources of error that prevent progress toward scientific knowledge: foibles of human nature (idols of the tribe), peculiarities of a person’s upbringing (idols of the cave), ambiguity in language (idols of the marketplace), and the distortion of culturally accepted dogmas (idols of the theater) (Bacon 1620/2019). On the other hand, these complaints from centuries past bear a striking similarity to present-day laments about widespread ignorance, especially in relation to scientific knowledge. The persistence of this problem throughout centuries is itself evidence of the difficulty of developing and sustaining successful educational interventions.

Torcello responds to this worry by pointing out that there is good evidence that ‘cultural cognition can be overcome with patience and repetition’ (2020, 2016). This is an excellent communication strategy that is well supported by evidence, but I want to be precise about what, in my view, can and cannot be overcome. The influence of cultural cognition on particular beliefs can be overcome: while it may be difficult, it is nevertheless possible for people to recognize the influence of ideology on their beliefs and to change their views. Sometimes beliefs change as a result of patience and repetition; beliefs may also change in response to relationships of trust that crack open previously closed epistemic communities (Nguyen 2020), or as a result of other interventions. However, cultural cognition itself can never be eradicated from thought and belief. It is an aspect of the human epistemic condition from which no one is exempt.

Instead of viewing it primarily as an obstacle to education, cultural cognition is better thought of as partially constituting any educational context. Effective education will take the various influences of cultural cognition into account, including the epistemic resources already available within a particular context, as well as those that have not yet been developed. One essential strategy in the effort to overcome active ignorance is to analyze a particular educational context by identifying the various forces of resistance that hold ignorance in place. Beliefs about climate science have been shown to be resistant to evidence in a variety of ways (Hennes et al. 2016; Jacquet, Dietrich, and Jost 2014; Hart and Nisbet 2012). Resistance also comes from a variety of sources, including political, religious, and economic sources of resistance, as well as the denial industry. In drawing attention to philosophical deficits that reinforce climate science denial, I aim to add an additional dimension to analyses of the sources of resistance, one that I believe is particularly relevant to effective educational interventions.

Hermeneutical lacunas are a source of active ignorance that has not received adequate attention in the effort to understand climate science denial. A conceptual gap is a hermeneutical lacuna in virtue of the role that it plays in a person’s conceptual architecture: it is a concept (or a cluster of related concepts) that is important for how other concepts are understood and, crucially, how evidence is interpreted. In some cases, climate science denial exhibits the absence of one or more basic concepts in the epistemology of science. Without these basic concepts, it is difficult to accurately interpret scientific evidence, or even to recognize it as evidence. My work assumes that education happens in particular cultural contexts, where a person’s epistemic resources are directly relevant to how that person processes the information they receive. It is through acknowledging the specifics of these contexts and responding to them that education can be transformative, both to individual persons and as a catalyst for broader cultural change.

The Hermeneutical Gaps of Pseudoskepticism

A second focus of Torcello’s response is a detailed examination of how my account of willful hermeneutical ignorance relates to his account of pseudoskepticism. Torcello uses the term ‘pseudoskepticism’ to mark a distinction between a virtuous skeptical attitude that aids critical inquiry and a problematic feigned skepticism that rejects the consensus of the scientific community. As Torcello (2020) explains, “It is inappropriate to call the obstinate and ill-informed rejection of scientific consensus by non-experts skeptical. That’s why I use the term pseudoskepticism to describe the form of science denialism that rejects an established scientific consensus. Pseudoskepticism signals ignorance and cynicism in the guise of skepticism.” Before proceeding, I want to point out that the term ‘pseudoskepticism’ is itself a useful addition to the epistemic resources available to understand one form of climate science denial. The distinction between skepticism and pseudoskepticism is helpful for articulating exactly what is wrong with a kind of science denialism that (wrongly) presents itself as holding up the values of skeptical inquiry against what the pseudoskeptic mistakenly perceives to be the hegemony of the scientific community.

When Torcello describes the pseudoskeptic’s lack of understanding of the concept of scientific consensus, he is describing a hermeneutical lacuna. To put it one way, science is a critical community: one organized around already-skeptical methods of inquiry, such as repeatability, falsification, and peer review (Koertge 2013). These critical practices are informed by the goal of discovering new scientific knowledge, and group identity consists in a commitment to the already-skeptical methods of scientific inquiry. While it is hardly infallible, consensus that arises from within a critical community is radically different from consensus that arises from within what Koertge calls a group of “belief buddies,” a group whose membership depends on adhering to the core beliefs of the group. Within a group of belief buddies, consensus is a prerequisite for group membership, but it indicates nothing about the likelihood of truth. In a critical community, by contrast, consensus is the result of an often painstaking process of critical revision, and consensus within that community is strong evidence for truth. A failure to understand scientific consensus, then, is a hermeneutical lacuna that props up pseudoskepticism.

Torcello also suggests that hermeneutical lacunas are one mechanism that produces the Dunning-Kruger Effect, an observed pattern in which people misjudge their knowledge or ability in some domain: the self-estimation of one’s own expertise peaks when one is a novice, falls dramatically as more knowledge and experience are gained, and returns to a moderate level as genuine expertise is achieved. The effect is a description of a correlation that may result from a variety of causal mechanisms, and hermeneutical lacunas look like the primary culprit in pseudoskepticism: “The Dunning-Kruger Effect, via hermeneutical lacunas, helps explain why pseudoskeptics are so vulnerable to corporately curated disinformation, because they do not perform the initial, truly skeptical cognitive task of critical self-assessment” (2020). This illuminating description of pseudoskepticism is a helpful addition to my original analysis.

Considering the relevance of hermeneutical lacunas to the Dunning-Kruger Effect also highlights the fact that the pseudoskeptic sits at a nexus of several different types of ignorance: 1) insofar as the person is a novice, he is ignorant about the subject matter; 2) insofar as the novice overestimates his abilities, he displays a lack of self-awareness; 3) given that he also has a hermeneutical gap in his understanding of scientific consensus, he is ignorant about his own social reality; and 4) his ignorance may also be actively reinforced by pseudo-knowledge: his mistaken conviction that he accurately understands skeptical inquiry. This analysis of pseudoskepticism as a type of active ignorance reveals the complexity of the pseudoskeptic’s unfortunate situation, but it also suggests several possibilities for responding to his pseudoskepticism. Filling in the hermeneutical gap in his misunderstanding of the character of scientific consensus may be a start toward getting the other dominoes of ignorance to fall.

STEM Education within the Knowledge-Ignorance Paradox: Teaching the Epistemology of Science

Third, and finally, Torcello considers the application of my work to broader discussions about STEM Education and the role of the humanities in an education that prepares people for democratic citizenship. I share Torcello’s view that the humanities are an essential complement to STEM education insofar as they are the home of inquiry into central questions in the human experience and democratic life. Philosophers in particular receive extensive training in rational inquiry and critical analysis, and the critical thinking skills taught in a standard philosophy curriculum are a form of applied epistemological training, insofar as students are taught how to evaluate evidence and arguments, spot reasoning errors, develop and respond to objections, charitably interpret opponents, and communicate ideas clearly. These skills are aimed at the production and evaluation of knowledge in general, and thus they can be an effective inoculation against pseudoscience.

My work identifying hermeneutical lacunas that support climate science denial is also applied epistemology of science, and it supports an educational approach that integrates both STEM and humanities education. But I am also proposing a more radical revision in the recognized aims of general education in STEM. The proposal is that, at least in terms of general science education in an effort to resist the proliferation of pseudoscience, the epistemology of scientific inquiry is essential, while a detailed analysis of the scientific evidence is optional, depending on a person’s interest in pursuing the subject further.

To see why, consider first that our scientific knowledge is vast, far beyond what any single person could possibly master. It is also increasing at a rapid pace. One consequence of successful scientific discovery is that along with an increase in what is known (by someone), there will be a corresponding decrease in what the non-expert knows in proportion to the continually expanding whole. The sociologist Sheldon Ungar describes this as the Knowledge-Ignorance Paradox (KIP), a term first coined by Bauer (1996) and explained by Ungar as an inverse relation in which “the growth of specialized knowledges implies a simultaneous increase in (general) ignorance” (2008). As scientific knowledge continues to expand, general knowledge of science will represent an increasingly smaller fraction of the whole, while at the same time the possibility of obtaining genuine expertise in some area of science will be increasingly out of reach without a commitment to extended study (graduate school, for example).

But for those who have been inclined to treat STEM education as a “slogan and a panacea”, to borrow Torcello’s apt phrase, it may be helpful to point out that widespread general ignorance of science need not entail the rejection of scientific expertise, any more than widespread general ignorance of how to do surgery entails the rejection of medical expertise, or widespread general ignorance about tort law entails the rejection of legal expertise. In the latter two cases, people know very well that they are not experts, and they are generally willing to defer to experts when specialized knowledge is required. More STEM education and better STEM education are worthy goals, as is education that prioritizes the humanities and prepares people for living in democratic society. But pseudoscience is not primarily a problem among those who have scientific expertise. It is not the scientists who deny climate science, but non-experts who are ordinary citizens, policy makers, and voters, especially those who have learned just enough about science that they overestimate their own competencies (cue the Dunning-Kruger Effect).

Climate science denial is not an inevitable result of a lack of general scientific knowledge; it is a problem that arises from a lack of trust in the expertise of the scientific community. One way to (re)build this trust is through emphasizing education in the epistemology of science. Fortunately, it is also much easier to achieve competency in understanding how scientific inquiry works than it is to develop competency in, say, climate science. The fundamentals of the epistemology of science can be taught in STEM education and in critical thinking courses. It may include instruction in the identification of fallacious reasoning that Cook articulates (Cook et al. 2018) and that others have also suggested (Coady and Corry 2013), but it would also teach students about psychological biases, information literacy, and how to assess expertise. Crucially, it is an education that would give attention to the development of foundational concepts in the epistemology of science, in an effort to keep hermeneutical ignorance from being an active force in the rejection of scientific expertise.

Author Information:

Sharon E. Mason, smason@uca.edu, is an Assistant Professor of Philosophy at the University of Central Arkansas. Her primary area of research is in contemporary epistemology and focuses on perspectives, ignorance, virtue epistemology, and epistemic agency.

References

Bacon, Francis. 1620/2019. Novum Organum. In Modern Philosophy: An Anthology of Primary Sources edited by Roger Ariew and Eric Watkins. Indianapolis: Hackett Publishing Company.

Bauer, Martin. 1996. “Socio-Demographic Correlates of DK-Responses in Knowledge Surveys: Self-Attributed Ignorance of Science.” Social Science Information 35 (1): 39-68.

Coady, David, Richard Corry. 2013. The Climate Change Debate: An Epistemic and Ethical Enquiry. New York: Palgrave Macmillan.

Cook, John, Peter Ellerton, David Kinkead. 2018. “Deconstructing Climate Misinformation to Identify Reasoning Errors.” Environmental Research Letters 13 (2): https://iopscience.iop.org/journal/1748-9326.

Hart, P. Sol, Erik C. Nisbet. 2012. “Boomerang Effects in Science Communication: How Motivated Reasoning and Identity Cues Amplify Opinion Polarization About Climate Mitigation Policies.” Communication Research 39 (6): 701-723.

Hennes, Erin P., Irina Feygina, Benjamin C. Ruisch, Christopher A. Monteiro, John T. Jost. 2016. “Motivated Recall in the Service of the Economic System: The Case of Anthropogenic Climate Change.” Journal of Experimental Psychology: General 145 (6): 755-771.

Jacquet, Jennifer, Monica Dietrich, John T. Jost. 2014. “The Ideological Divide and Climate Change Opinion: ‘Top-Down’ and ‘Bottom-Up’ Approaches.” Frontiers in Psychology 5: 1458.

Koertge, Noretta. 2013. “Belief Buddies versus Critical Communities: The Social Organization of Pseudoscience.” In Philosophy of Pseudoscience: Reconsidering the Demarcation Problem edited by Massimo Pigliucci and Maarten Boudry. Chicago: University of Chicago Press.

Kruger, Justin, David Dunning. 1999. “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.” Journal of Personality and Social Psychology 77 (6): 1121–1134.

Mason, Sharon. 2020. “Climate Science Denial as Willful Hermeneutical Ignorance.” Social Epistemology 34 (5): 469-477.

Nguyen, C. Thi. 2020. “Echo Chambers and Epistemic Bubbles.” Episteme 17 (2): 141-161.

Nichols, Tom. 2018. The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters. New York: Oxford University Press.

Torcello, Lawrence. 2020. “Science Denial, Pseudoskepticism, and Philosophical Deficits Undermining Public Understanding of Science: A Response to Sharon E. Mason.” Social Epistemology Review and Reply Collective 9 (9): 1-9.

Torcello, Lawrence. 2018. “The Acceleration of Global Warming as Crime Against Humanity: A Moral Case for Fossil Fuel Divestment.” In The Palgrave Handbook of Philosophy and Public Policy edited by David Boonin, 779-793. Cham, Switzerland: Springer Verlag.

Torcello, Lawrence. 2016. “The Ethics of Belief, Cognition, and Climate Change Pseudoskepticism: Implications for Public Discourse.” Topics in Cognitive Science 8 (1): 19-48.

Torcello, Lawrence. 2011. “The Ethics of Inquiry, Scientific Belief, and Public Discourse.” Public Affairs Quarterly 25 (3): 197–215.

Ungar, Sheldon. 2008. “Ignorance as an Under-Identified Social Problem.” The British Journal of Sociology 59 (2): 301-326.



Categories: Critical Replies

1 reply

  1. I agree with Sharon Mason that cultural cognition cannot ultimately be overcome. Cultural cognition is indeed a permanent feature of the human condition. My modest view, to be clear, is that the negative influence of cultural cognition can be overcome on particular topics with education. It might have been better to say, for clarity’s sake, that cultural cognition can be adjusted advantageously through educational experience. Recognizing the ongoing influence of cultural cognition as a permanent epistemic challenge is, in my view, germane to cultivating appropriate epistemic humility.

    I also want to emphasize my agreement that education in the epistemology of science can help inoculate against science denialism. There is good reason to believe that science denialism is often more a problem of epistemic illiteracy than of scientific illiteracy.

    Thanks to Professor Mason for this constructive exchange and to SERRC for facilitating our dialogue.
