Strongly Held Belief and Open-Mindedness: A Response to Vavova’s “Open-Mindedness, Rational Confidence, and Belief Change”, Jeremy Fantl

In “Fake News vs. Echo Chambers” (2021) I argue that some kinds of attitudes that make it easier to resist fake news also make it more difficult to exit echo chambers. Closed-mindedness makes it easier to resist fake news. But open-mindedness makes it easier to exit echo chambers. The greater your degree of open-mindedness, the more likely you’ll be convinced by fake news. The greater your degree of closed-mindedness, the less likely you’ll be lured out of echo chambers.

Image credit: justin lincoln via Flickr / Creative Commons

Article Citation:

Fantl, Jeremy. 2023. “Strongly Held Belief and Open-Mindedness: A Response to Vavova’s ‘Open-Mindedness, Rational Confidence, and Belief Change.’” Social Epistemology Review and Reply Collective 12 (4): 4–9. https://wp.me/p1Bfg0-7J1.

🔹 The PDF of the article gives specific page numbers.

This article replies to:

❧ Vavova, Katia. 2023. “Open-Mindedness, Rational Confidence, and Belief Change.” Social Epistemology Review and Reply Collective 12 (2): 33–44.

Articles in this dialogue:

❦ Fantl, Jeremy. 2021. “Fake News vs. Echo Chambers.” Social Epistemology 35 (6): 645–659.

I also associate open-mindedness and closed-mindedness with strongly held belief. If you have weakly held belief, you’ll tend to be more open-minded. If you have strongly held belief, you’ll tend to be more closed-minded. Katia Vavova (2023) rightly raises concerns about this association. Open-mindedness and closed-mindedness are (in my view) attitudes determined by your willingness to change confidence in response to counterevidence. You can be willing to significantly change confidence in response to counterevidence—and so be open-minded—even if you have very high confidence: strong belief. And you can be unwilling to significantly change confidence in response to counterevidence—and so be closed-minded—even if you have very low confidence: weak belief.

All of this is correct, and Vavova is right to press the point. One nice feature of Vavova’s position is that it undermines some common advice you might hear for avoiding various cognitive biases and other perils of our digital age: “We all need to be a lot less sure of ourselves.” As Vavova makes clear, being less sure of yourself does not ensure that you will be responsive to evidence. Being more sure of yourself does not ensure you will be less responsive to evidence.

On that, Vavova and I agree. We should reject that common advice (or, at least, not accept it only for reasons having to do with the need for open-mindedness). But Vavova draws other conclusions as well:

(1) “Being open-minded thus needn’t mean being an epistemic pushover, with easy to dislodge beliefs” (37).

(2) “Strength of belief won’t protect us from fake news” (37) (and, thus, I am incorrect to say that, “You should sometimes have strongly held beliefs (to resist non-obviously fake news)” (35)).

(3) I am incorrect to say that, “If you should always be open-minded toward counterarguments, then you shouldn’t ever have strongly held beliefs” (35).

I’ll spend the rest of this response-piece clarifying and defending my position on these three points.

Open-Mindedness and Confidence Revision

First things first: as stated, I agree with (1); being open-minded needn’t mean being an epistemic pushover. But I do think that being open-minded means having a willingness to reduce confidence in response to counterevidence. The willingness is conditional on certain things happening: after investigation, you find compelling all the steps in an argument from the counterevidence to the denial of your position, and you’re unable to figure out where the argument goes wrong. For example, suppose you’re presented with some argument that vaccines are ineffective. If, after investigation, all the steps seem compelling and you can’t figure out which step goes wrong, then you’re open-minded to the extent that you’re willing to thereby reduce your confidence that vaccines are effective. If, prior to exposure to the argument, you’re unwilling to reduce confidence on the condition that, after exposure, you find each step compelling and can’t figure out where it goes wrong, then you are to that extent closed-minded.

The reason is that, if you promise someone you will be open-minded toward their argument, you immediately renege on that promise if, in the next breath, you say that even if you find each step in their argument compelling and can’t expose a flaw, you won’t reduce confidence in response to their argument: “Yes, I promise to be completely open-minded to your argument. Of course, even if I can’t figure out where your argument goes wrong and all the steps seem compelling to me, I’m going to remain fully confident I’m right. But don’t worry; I’m completely open-minded to what you’re about to say.” This promise isn’t worth much.

So, while open-mindedness doesn’t entail being a pushover, it does require some at least conditional willingness to reduce confidence in response to counterarguments. This is why open-mindedness is so important for escaping echo chambers. Therefore, even if I’m wrong about the relationships between open-mindedness/closed-mindedness and strongly held belief, there is still a tension here. Closed-mindedness—evidence-resistance—seems like it is conducive to resisting fake news. Open-mindedness seems like it is conducive to exiting echo chambers. The more you’re dismissive of evidence that runs counter to your beliefs, the less impact fake news will have, but the more stuck you’ll be in your echo chamber. The less dismissive you are of evidence that runs counter to your beliefs, the more impact fake news will have, but the easier time you’ll have escaping echo chambers.

In many cases you shouldn’t be closed-minded. But, as I argue in the original paper and elsewhere, in many cases you should be. You should be closed-minded when you know you’re right and, as I argue in the original paper, when you should strongly hold belief. So let me turn to Vavova’s second two conclusions and the association between open-mindedness and strongly held belief.

Why We Want Strongly Held Belief

I want to resist fake news. What does this mean? It doesn’t merely mean that I don’t want my confidence revised by exposure to fake news. It’s too easy to satisfy that goal. For one thing, if I already outright believe what some fake news story tells me, then my confidence won’t be revised at all (or not significantly) by exposure to the fake news story. I’ll already believe what the story says, so at best the story will only confirm me in my already strong belief. But even if I only weakly believe what the story denies, I won’t necessarily be shaken too much out of that weak belief by the story. If I am generally skeptical, my merely weak confidence that p won’t be significantly shaken by exposure to a news story that provides evidence that not-p. If I only weakly believe that vaccines are effective (because, say, I’m just generally distrustful of all information sources), exposure to some fake news story of vaccine ineffectiveness need not force dramatic confidence change.

If ‘resisting’ fake news were only a matter of confidence change, then Vavova would be right in her conclusion (2); you wouldn’t need strongly held belief to resist fake news, because you could resist fake news—that is, fail to significantly change your confidence in response to fake news—even with low confidence. But when I say I want to resist fake news I’m not merely saying that I want weak confidence in the truth to survive exposure to fake news. I’m saying that I want strong belief in the truth to survive exposure to fake news. I want to retain high confidence—and outright belief—in the face of fake news.

One way to do this is to have high confidence even while finding counterevidence sufficiently unsurprising: to have not just a strong belief, but a strongly held belief. As I define it in the original paper, a belief is strongly held when your high confidence in the belief is conjoined with a disposition to find evidence against the belief unsurprising (651).

Note that this doesn’t simply define strongly held belief as resilient high confidence. Vavova seems to interpret the definition this way when she says, immediately after citing the definition, that “To be strongly held, then, is for a belief to be difficult to sway (stable, resilient)” (38) and adds that the definition

merely redescribes the case. Yes, this belief is strong and resilient, but why? What is the connection between strong and strongly held belief? Without one, Fantl’s case collapses (38).

The answer is that strongly held belief is strong belief that is conjoined with a disposition to find evidence against the belief unsurprising. It’s a substantive claim that finding some counterevidence unsurprising renders your confidence resilient. That your confidence is resilient isn’t constituted by your finding some counterevidence unsurprising. The definition doesn’t merely redescribe the case.

That you hold some belief strongly prior to encountering some fake news can explain why, after encountering fake news, you continue to hold that belief strongly. You fail to revise your confidence level significantly because you find the counterevidence unsurprising. And your resultant confidence remains high because your starting confidence is high. With only one component, the explanation is lost. Starting high confidence while finding the counterevidence surprising doesn’t explain why the resulting confidence remains at its starting level. Starting low confidence while finding the counterevidence unsurprising doesn’t explain why the resulting confidence ends up high. But with both—starting high confidence while finding the counterevidence unsurprising—we have an explanation for why the resulting confidence level is high.
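The structure here can be given a rough Bayesian gloss. This is only an illustrative model, not something my argument requires: nothing in the original paper assumes that confidence behaves like subjective probability. But suppose, for illustration, that it does. Let C be your confidence function, p the proposition you believe, and E the counterevidence. By Bayes’ theorem,

\[ C(p \mid E) \;=\; \frac{C(E \mid p)}{C(E)} \, C(p). \]

To find E unsurprising is, roughly, for C(E) to already sit close to C(E \mid p), so that the update factor C(E \mid p)/C(E) is near 1 and your confidence after exposure stays near your confidence before. On this gloss, the two components of strongly held belief do separate explanatory work: the near-1 update factor explains why your confidence doesn’t move, and the high starting value of C(p) explains why the unmoved confidence is high. Drop either component and, as above, the explanation is lost.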

Again, this is what we want in resistance to fake news. We don’t want stable, low confidence. We want stable, high confidence before, during, and after exposure to fake news. Strongly held belief gets us that. Sometimes we need strongly held belief in this sense to resist fake news. But strongly held belief, in this sense, makes it harder to escape from echo chambers.

Of course, we don’t want to irrationally continue to strongly believe after exposure to fake news. We want to strongly believe after exposure to fake news when it’s rational to do so. But because we do want to strongly believe after exposure to fake news, we also want it to sometimes be rational to do so. We want techniques that allow us to be rational in strongly believing p after exposure to fake news to the contrary.

Just as strongly held belief makes it easier to resist fake news (in the sense of retaining strong belief after exposure to fake news), so too does rational strongly held belief make it rational to resist fake news. Suppose your evidence makes it the case that you should have strong belief: your confidence should be very high; your rational confidence is high. If you should also be unsurprised by some counterevidence, then you should have strongly held belief. If you are now exposed to some relevant counterevidence in the form of a fake news story, what happens to your rational confidence? Since you should be unsurprised by the story, your rational confidence won’t budge much. And since it started out high, when it fails to budge much, it remains high.

Again, we not only want it to be the case that our actual confidence remains high through exposure to fake news, but that, when our actual confidence remains high, we’re not thereby irrational: we want to be doing as we should when our confidence remains high in response to fake news. One way this can happen is if we should have strongly held belief: if we both should believe that p and should be unsurprised by counterevidence. For, again, if you should be surprised by the counterevidence, then your confidence should shift significantly. And if your starting confidence should be low, then if your confidence shouldn’t change much, your resulting confidence should still be low.

With this in mind, let’s return to the latter two of Vavova’s three conclusions I mentioned at the outset of this reply:

(2) “Strength of belief won’t protect us from fake news” (and, thus, I am incorrect when I say that “You should sometimes have strongly held beliefs (to resist non-obviously fake news)”).

(3) I am incorrect when I say that, “If you should always be open-minded toward counterarguments, then you shouldn’t ever have strongly held beliefs.”

Regarding (2), Vavova is correct that strength of belief won’t protect us from fake news because, as she rightly notes, you can have strong belief that is highly responsive to counterevidence. But strong belief conjoined with a disposition to find counterevidence unsurprising—that is, strongly held belief—does protect us from fake news, because you won’t significantly budge from your strong belief if you encounter counterevidence you already find unsurprising. What is more, if your strongly held belief is rational, your failure to budge will itself be rational. Contra Vavova’s denial in (2), sometimes being such that you should strongly hold a belief will help it be the case that you should resist fake news.

I also continue to maintain the claim Vavova denies in (3). What’s not true is that, if you should always be open-minded toward counterarguments, then you shouldn’t ever have high confidence. Again, as Vavova rightly notes, rightful open-mindedness is compatible with having high confidence, because you can rationally have high confidence while being rationally willing to significantly reduce confidence in response to counterevidence. What you can’t rationally have is strongly held belief while being rationally willing to significantly reduce confidence in response to counterevidence. The reason is that if you rationally strongly hold belief that p, then you are rationally unsurprised by the counterevidence while rationally strongly believing p. And if you are rationally unsurprised by the counterevidence, then exposure to the counterevidence shouldn’t prompt significant confidence revision.

Why and When We Sometimes Should Be Closed-Minded

Rightly having strongly held belief is not the only way to be resistant to fake news and misleading counterevidence. Sometimes you can continue to know that p or be justified in outright believing p even while finding some counterevidence surprising. (If you know that you committed the murder, sometimes this knowledge can survive the surprising discovery of someone else’s fingerprints on the murder weapon.) And, of course, sometimes rationally strongly held belief constitutes knowledge or justified outright belief that p.

In those cases in which your rationally strongly held belief constitutes knowledge or justified outright belief, it’s even clearer that exposure to counterevidence shouldn’t prompt you to reduce confidence in response to that counterevidence. To demonstrate this, I’ll close with some remarks about Vavova’s suggestion for reacting appropriately to potentially misleading counterevidence. Her suggestion involves training:

[W]e could say, then, that in training our students to think critically, we arm them with bullshit detectors, which better position them to defend against misleading evidence, whether it comes from the media, the news, their teachers (me included) or themselves (42).

I take this point to be correct, and in keeping with some work in regulative epistemology, including that of Robert Roberts and Jay Wood (2007), whose practical advice for how to operate intellectually emphasizes the inculcation of intellectual virtues.

It’s one question how to equip students and ourselves generally to respond well to potentially misleading counterevidence. It’s another question what to do here and now when confronted with some piece of potentially misleading counterevidence. My claim is that how you should respond here and now to some piece of potentially misleading counterevidence depends on what your epistemic status is with respect to various relevant propositions. If you are confronted with some evidence that not-p, then how you should respond to that evidence depends on whether you, despite the counterevidence, know that p (or are justified in outright believing that p).

If you know that p, despite the counterevidence, then you should be closed-minded toward the counterevidence. What this means is that you shouldn’t be willing to reduce your confidence in response to the counterevidence. After all, in knowing that p, you know that the counterevidence is misleading.[1] And if you know it’s misleading, you should do what that fact is a decisive reason for doing.[2] In standard situations—situations in which you don’t prefer to proportion your belief to misleading evidence—that some evidence is misleading is a decisive reason for failing to reduce your confidence in response to that evidence.[3] Therefore, when you know that p, you should fail to reduce your confidence in response to the counterevidence: you should be closed-minded toward the counterevidence.[4]

It’s not only the case that, when you know that p, you should do what that fact is a decisive reason for doing. It’s also the case that, when you should outright believe that p, you should—on at least one important, perspectival sense of ‘should’—do what that fact is a decisive reason for doing. (The argument is similar to the one just rehearsed for knowledge.) Therefore, as with knowledge, when you should outright believe that p, you shouldn’t be willing to reduce your confidence in response to counterevidence. So, in those cases in which your rationally strongly held belief that p is a justified outright belief that p, when you should strongly hold belief that p you should also be closed-minded toward counterevidence. Therefore, especially in those cases in which your rationally strongly held belief is knowledge or justified outright belief, I continue to maintain what Vavova denies in her conclusion (3).

This is why I think it’s not enough to train students in the various intellectual virtues. Or perhaps a better way to put it is that we need to do more than instill the virtue of bullshit detection. That’s because virtuous people will respond differently to counterevidence—bullshit and otherwise—depending on what beliefs they rightly strongly hold, what they already know, and what they justifiedly outright believe. Therefore, virtuous people need ways to identify when they are in a position to know that p, when they are such that they should outright believe that p, and when they are such that they should strongly hold belief that p. And, again, that’s because how we—the virtuous and otherwise—should respond to potentially misleading counterevidence to p—open-mindedly or closed-mindedly—can depend on whether p is a proposition we know or should outright believe.

Author Information:

Jeremy Fantl, jfantl@ucalgary.ca, received his PhD from Brown University in 2000 and took up his current position at the University of Calgary in 2006. He works primarily in epistemology, especially on knowledge-action links and their consequences, and is the author of Knowledge in an Uncertain World (with Matthew McGrath) and The Limitations of the Open Mind.

References

Conee, Earl. 2001. “Heeding Misleading Evidence.” Philosophical Studies 103: 99–120.

Fantl, Jeremy. 2022. “Entitlement and Misleading Evidence.” Philosophy and Phenomenological Research. doi: 10.1111/phpr.12945.

Fantl, Jeremy. 2021. “Fake News vs. Echo Chambers.” Social Epistemology 35 (6): 645–659.

Fantl, Jeremy and Matthew McGrath. 2009. Knowledge in an Uncertain World. Oxford: Oxford University Press.

Hawthorne, John and Jason Stanley. 2008. “Knowledge and Action.” Journal of Philosophy 105 (10): 571–590.

Roberts, Robert and Jay Wood. 2007. Intellectual Virtues: An Essay in Regulative Epistemology. Oxford: Oxford University Press.

Vavova, Katia. 2023. “Open-Mindedness, Rational Confidence, and Belief Change.” Social Epistemology Review and Reply Collective 12 (2): 33–44.


[1] For defense of this premise see, for example, Conee (2001) as well as other commentators on the so-called “dogmatism puzzle.”

[2] This is the lesson of the various arguments for knowledge-action links of the sort defended in, for example, Hawthorne and Stanley (2008) and Fantl and McGrath (2009).

[3] I argue for this claim at more length in Fantl (2022).

[4] This assumes that if you should fail to reduce your confidence you shouldn’t be willing to reduce your confidence in the relevant sense.


