Apocalypse Now?  Yes, but Keep Calm and Carry On, Steve Fuller

Editor’s Note: Beginning in 2012, I asked Steve Fuller to provide a Christmas greeting, or end-of-year reflection. As the SERRC grew, I invited contributions from our members. In this tradition, and at this time of resolutions, Steve asks us to consider revolutions and the “practical interventions in hastening or avoiding ‘end times’.”

My great thanks to the SERRC contributors and readers over the last year. We have accomplished much by bucking academic convention. In so doing, we realize knowledge together.

Image credit: Francisco Guerrero via Flickr / Creative Commons

Article Citation:

Fuller, Steve. 2022. “Apocalypse Now? Yes, but Keep Calm and Carry On.” Social Epistemology Review and Reply Collective 11 (12): 61-67. https://wp.me/p1Bfg0-7sF.

The PDF of the article gives specific page numbers.

Articles in this dialogue:

Please refer to posts from 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, and 2021.

There’s Violence … and There’s Violence

A couple of weeks ago I completed the sixth iteration of a popular course I teach called ‘The Sociology of End Times’, which is cross-listed for both undergraduates and graduate students. It attracts matriculants from many departments, which is unusual for an optional module in the UK. The course is anchored in considerations of the shape and meaning of history, ranging across religious, philosophical and scientific interpretations. The word ‘end’ in the course’s title is meant in the spirit of constructive ambiguity: It can mean purpose and/or terminus. Students are assessed by writing an essay in response to the question, Should the world begin anew? Needless to say, I point out that answers can move in many different normative and empirical directions, which the students fully exploit. Notably, some students argue that the world should not begin anew, not because everything is as good as it can be or because things are likely to get better in the future. Rather, these students simply believe that the world should end altogether. At that point, I immediately prescribe a dose of Schopenhauer and hope for the best.

An important theme in the course is the significance of practical interventions in hastening or avoiding ‘end times’. I argue that if these interventions are to work at all, they must involve some sort of violence, which need not be experienced as such at the time it is happening. We do a profound injustice to the true potency of violence when we presume that it must be felt violently. (Steven Pinker is guilty of this in spades.) I take this to be the intuition behind Johan Galtung’s idea of ‘structural violence’, which is only fully realized over the long term and/or at a large scale. A clear example is a normalized system of oppression that goes virtually unnoticed by those in its grip, thereby blinding them to the radical change it is making to their life-chances. That sense of ‘violence’ is more functional than substantive. Versions of the distinction appear in Max Weber’s and Karl Mannheim’s discussions of rationality in modern society. A substantive approach to violence regards everything about an act of violence as essential to its violent nature, including its manner of execution. In contrast, a functional approach to violence abstracts from particular acts of violence to derive a persistent pattern—call it the ‘topology of violence’, if you will—that might be subject to multiple realizations.

In my ‘End Times’ class, I discuss the topology of violence as involving three features:

(1) a presumed normal path of development;

(2) an interruption to that path, which may come from outside or inside the system, and which may be intentional or unintentional;

(3) a new path of development that would not have happened without the violence.

However, the spatio-temporal dimensions of this violence, including when those caught up in its path become aware of it, remain open questions. In this respect, ‘structural violence’ can be easily understood as a pejorative spin on Joseph Schumpeter’s idea of ‘creative destruction’, which he took to be the lifeblood of entrepreneurship and capitalism more generally—especially when one recalls that his paradigm case of an entrepreneur was Henry Ford.

But since we remain some emotional and political distance from treating the automobile’s systematic transformation of the human and natural world as structural violence, let’s turn to something more legally tractable, such as genocide. The term acquired legal significance with the Holocaust, and since then has been applied both retrospectively and prospectively to Holocaust-like events, whereby over a relatively short period an entire ethnic group has been targeted for extermination. Evidence relevant to identifying a situation as genocide normally includes official orders, footage of killings and mass graves. Thus, alleged cases of genocide are prosecuted under a substantive conception of violence—that is, as violence done violently. One legal advantage of working with such a conception is that perpetrators of such ‘crimes against humanity’ are relatively easy to identify, if not so easy to bring to justice.

But topologically speaking, can’t one achieve comparable effects of such mass violence, albeit over a longer period, by restricting the life-chances of the target population? Here a policy of compulsory sterilization may come to mind. But that too might be regarded as substantively violent by virtue of its invasiveness. Instead, one might think about background socio-economic conditions that disincentivize or diminish the salience of the target group. In effect, they are starved of the resources they need to flourish. This is how the systemic consequences of poverty in the Global South tend to be discussed by development theorists. A still subtler version of structural violence falls under the rubric of ‘cultural genocide’, a burgeoning area of international law that is concerned with the often indirect but long-term elimination of minority languages and cultural practices, as larger social, economic and political forces work to discourage their promotion and reproduction.

To be sure, all such functionalized forms of violence make it harder to place the blame on specific individuals and groups, especially once many years have passed and the alleged victims (or their descendants) have accommodated to—if not become complicit in—the perpetuation of the alleged ‘violence’. At that point, what had previously been presented as ‘violent’ has arguably become the new normal. The law routinely recognizes this point in the ‘statute of limitations’, which serves to prevent most claims to reparations for ‘historic damages’ from being effectively prosecuted.

The Elusiveness of Revolutions

Nearly forty years ago, a founder and fixture of the Harvard History of Science Department, I.B. Cohen (1985), published a compendious history of revolutions. An important conclusion he drew was that successful revolutions tend to be identified in retrospect, not while they’re happening. It is more likely that we are living through a revolution than that a planned revolution will succeed. Indeed, Antoine Lavoisier’s self-declared ‘Chemical Revolution’ of the late eighteenth century has been one of the few to succeed on more-or-less its own terms. Even the ‘French Revolution’, on which Lavoisier’s own coinage was based, had been seen as a failure within a decade of its occurrence, given Napoleon’s reinstatement of the monarchy with a vengeance. Indeed, the French Revolution only began to be seen as successful with the establishment of the Third Republic in 1870, which ended monarchy once and for all in France. Moreover, it wasn’t until the 1917 Bolshevik Revolution in Russia that the French Revolution’s substantively violent aspects started to be justified as necessary for any radical change.

Cohen’s conclusion applies to historical periodization more generally. Kant coined ‘Enlightenment’ just as it was ending, and Nietzsche’s mentor Jacob Burckhardt coined ‘Renaissance’ 300+ years after the fact to provide an origin myth for secular Europe, which he dubbed ‘modernity’. Nowadays we easily refer to the ‘Industrial Revolution’, yet the phrase only acquired currency towards the end of the nineteenth century, more than a century after it supposedly began. One way to explain that delay is that before the 1880s it wasn’t so obvious that the wholesale reappropriation of land for factory work was positive and irreversible. However, the apparent sustainability of capitalist imperialism eventually changed minds. (Of course, the twentieth century proved that judgement to be somewhat illusory.)

Perhaps most interesting of all is the ‘Scientific Revolution’, which began to be used in the 1930s to refer to the previous 300+ years of European history. Herbert Butterfield and Alexandre Koyré are normally credited independently with its coinage, though its popularity dates only from the 1960s, with Thomas Kuhn’s (1970) The Structure of Scientific Revolutions, which suggested to many of its readers (mistakenly) that scientific revolutions could be mass produced. On the contrary, a hidden agenda shared by Butterfield and Koyré was to elevate the rise of modern science to a unique status comparable to—and compatible with—the rise of Christianity. Moreover, considering the altered significance of figures like Francis Bacon and Galileo, the invocation of a ‘Scientific Revolution’ might also be reasonably seen as a turn against the modern revival of paganism that had come to be attached to the label ‘Renaissance’, notwithstanding Burckhardt’s own ambivalence on the matter.

I raise the question of revolutions in this way to stress their in medias res character. More to the point, we may be currently undergoing a revolution without realizing its true nature, which will only be defined later, once the revolution has completed the bulk of its work. Not only is this true to our general understanding of how revolutions have occurred in history, but the post-truth condition also lends itself to such an interpretation. It involves envisaging history as neither cyclical nor linear, but quantum (Fuller 2021). By that I mean that at any given moment, several competing narratives coexist that purport to capture the same reality (aka ‘quantum coherence’), but then after the proverbial ‘collapse of the wave function’—which admittedly is hard to identify as a specific event outside a laboratory setting—one of these narratives becomes clearly dominant (aka ‘quantum decoherence’).

The cross-cutting character of discourses about ‘industrialization’ in the nineteenth century is a case in point. Many shared terms carried ambivalent meanings. For example, until the end of the century, ‘innovation’ was just as likely to connote a monstrous deformation of nature as, in the portentous words of H.G. Wells, ‘the shape of things to come’. However, once there was general acceptance that starting around 1750, Britain had triggered an ‘Industrial Revolution’ that has been spreading across the world, much of this ambivalence disappeared. ‘Innovation’ acquired an unequivocally positive spin, and discussion shifted to how one might expedite industrialization and safeguard against its worst effects.

In a similar vein, the most sophisticated post-Kuhnian theories of scientific progress (e.g., Laudan 1977) tried to capture these quantum ‘tipping point’ moments as ‘pre-analytic intuitions’ about the history of science that could be repurposed as data to construct and test competing philosophies of science. Thus, great store was set by the widespread belief that by 1720 Newton had seen off his Aristotelian and Cartesian rivals and that by 1920 Einstein had overturned Newton. Grammatically speaking, this is history as told from the standpoint of the future perfect. It’s oriented to a prospective moment of resolution in a currently indeterminate situation. It’s the state of mind one has when speculating on the outcome of bets. Much of my early work was about why this project failed on its own terms (e.g., Fuller 1993). Nevertheless, it was getting at something that is worth retaining—namely, the sense of ‘fact’ as a marker, if only that—in a world of flux. This was Karl Popper’s sense of ‘fact’ as a ‘convention’ or ‘waystation’. It’s a bit like the final score in a game that the parties agree to have been playing. The problem is that this ‘game’ is also typically only known in retrospect. Yet, true to Francis Bacon’s original vision, Popper thought that these games could be made for purpose and the relevant parties brought together to play them, with a binding outcome, at least until the next match is played. This is the logic behind what both called a crucial experiment.

Revolutions on Demand?

In Bacon’s seventeenth-century parlance, a ‘crucial experiment’ meant something like ‘an experience that makes a difference to action’, the sort of thing that we now associate with William James’ pragmatic theory of meaning. But whereas James realized that such ‘experience’ may come unexpectedly (hence his openness to ‘religious experience’), Bacon thought that the world could be turned into a laboratory, effectively an arena for matters of fact to be decided on a regular basis. Thus, knowledge might be ‘manufactured’ on a production schedule, even without knowing quite what the products will look like. The Royal Society of London is normally seen as the legacy of Bacon’s vision, but a modern research institute might be closer to the mark. In any case, Bacon is rightly seen as a utopian thinker, comparable to Thomas More, his predecessor as Lord Chancellor from the previous century.

However, writing 300 years after Bacon, Popper presumed the feasibility of crucial experiments. The very fact that increasingly enfranchised democracies were abiding by the results of regularly conducted elections was a hopeful sign. It led him to make the reversibility of outcomes central to his thinking about the ‘open society’. More to the point, the young Popper was impressed by recent international agreements on the value of time-independent physical ‘constants’, such as the speed of light, which suggested that some universally binding conventions, if not outright ‘rules of the game’, were emerging in the conduct of science. (A useful, albeit debunking, source is Mirowski 2004, Part III.)

One way to think about the signature ‘constructivist’ turn in science and technology studies (STS) is as a ‘cynical’ take on what Popper—qua Bacon’s self-appointed heir—was getting at. Here I mean ‘cynical’ in the original sense of the Greek philosophical school that insisted that people walk their talk, even if it meant spending their time removed from humans and in the company of dogs (the Greek kynikos means ‘dog-like’). Thus, in the manner of an investigative journalist (a profession that, in the person of Robert Park, introduced the ethnographic method to sociology), STS researchers went behind the scenes of science’s ‘gentlemen’s agreements’ to reveal messier, more agonistic situations that swept up much of the tumult in society at large.

By the end of the 1980s, it became commonplace to believe that the outcomes of crucial experiments settled the knowledge-power nexus in society indefinitely (Shapin and Schaffer 1985, Latour 1988). What made this proposition controversial at the time was that STS researchers highlighted the contingent character of the nexus—it could (should?) have been settled differently. That suggestion (and typically it was no more than that) defied any easy linear, even dialectical conception of scientific and social progress. Yet, once the deal was sealed—once Boyle bested Hobbes, Pasteur pummeled Pouchet, and so on—the nexus appeared to acquire a totalizing character. To be sure, that was just as the young Popper had hoped, at least until the next crucial experiment was staged—and, of course, STS researchers would also be waiting in the wings to demystify that outcome.

But the quantum character of revolutions should make us skeptical of any such neat conclusion. Consider the title of the late Bruno Latour’s breakthrough public intellectual book, We Have Never Been Modern (Latour 1993). Here Latour raised to the level of philosophical anthropology the most obvious lesson of two decades of ‘postmodern’ critique in the academy, which by the early 1990s had seeped into the larger culture—namely, that the world (including us as part of it) has never lived up to the expectations of the theories of modernity. To an investigative journalist, the point would be obvious. But that’s not the real point. Pace Latour, we became modern as soon as we shifted to a new normal, regardless of whether most people conformed to the new norms, even now. In effect, a new polestar began to direct our moral and epistemic compass (Fuller 2022).

In earlier work, I’ve described this as a shift in the burden of proof (Fuller 1988, chap. 4; Fuller and Collier 2004, chap. 10). To be sure, there are always Galileos who can see both sides of the Gestalt switch that such radical yet subtle shifts in perspective entail (Hanson 1958). However, most people simply experience the shift as confusion, with which they cope in whatever way they can, which often includes blaming themselves and/or others. That blame game can continue indefinitely, thereby seeding future shifts in perspective, the proverbial ‘return of the repressed’. But contra Latour, none of this denies the occurrence of the shift.

Moreover, none of this implies that quantum revolutions are mysterious. Think about the resonant phrase permanent revolution, coined by the early nineteenth-century liberal publicist Charles Comte (no relation to Auguste) and adopted by thinkers ranging from Trotsky to Popper. Charles Comte wanted his contemporaries to think of the French Revolution as having been a failed experiment that should be tried again and again until its objective is achieved (Voegelin 1974). Of course, any subsequent revolution would require a reorganization of resources and a reorientation of strategy. Nevertheless, Comte was clear that this was the modern empirical way of doing politics. It requires advance preparation so that people are primed to respond appropriately when the time comes, whenever that turns out to be (i.e., the world is not a laboratory). It invariably involves trial balloons, dress rehearsals and false starts. In this context, ‘learning from error’ is most efficiently done when others commit the mistakes on your behalf. The staging of the Bolshevik Revolution was a textbook case in point. However, as is to be expected of a ‘quantum coherent’ world, the nature of the outcome was not apparent, and so the revolution carried on even after Lenin had seized power. (Arguably, the revolution continued without end, which resulted in the downfall of the Soviet Union.)

Here it is worth recalling that Kuhn adopted the term ‘crisis’ for the period that precipitates a scientific revolution from Harvard’s then-resident intellectual historian Crane Brinton (1938; Fuller 2000: chap. 3). Brinton was alluding to the role that Hippocrates and Galen had assigned to fever as a medical sign that the patient’s illness had reached a make-or-break point: they would either recover or die. If we think about this ‘fever’ in epidemiological terms, then some will die, some will fall ill temporarily and then develop immunity, and some will carry on stronger than ever. Such is the look of ‘Apocalypse Now’.

Author Information:

Steve Fuller, S.W.Fuller@warwick.ac.uk, Auguste Comte Chair in Social Epistemology, Department of Sociology, University of Warwick.


Brinton, Crane. 1938. The Anatomy of Revolution. New York: Random House.

Cohen, I. Bernard. 1985. Revolutions in Science. Cambridge MA: Harvard University Press.

Fuller, Steve. 2022. “The Epistemological Compass and the (Post)Truth about Objectivity.” Social Epistemology 1-6. doi: 10.1080/02691728.2022.2150988.

Fuller, Steve. 2021. “Permanent Revolution in Science: A Quantum Epistemology.” Philosophy of the Social Sciences 51: 48-57. [Translated into Russian as the Preface for the Russian edition of Kuhn vs Popper].

Fuller, Steve, and James H. Collier. 2004. Philosophy, Rhetoric and the End of Knowledge. 2nd ed. (1st edition by Fuller, 1993). Hillsdale NJ: Lawrence Erlbaum Associates.

Fuller, Steve. 2000. Thomas Kuhn: A Philosophical History for Our Times. Chicago: University of Chicago Press.

Fuller, Steve. 1993. Philosophy of Science and Its Discontents. (Orig. 1989). New York: Guilford Press.

Fuller, Steve. 1988. Social Epistemology. Bloomington IN: Indiana University Press.

Hanson, Norwood R. 1958. Patterns of Discovery. Cambridge UK: Cambridge University Press.

Kuhn, Thomas S. 1970. The Structure of Scientific Revolutions. (Orig. 1962). Chicago: University of Chicago Press.

Latour, Bruno. 1993. We Have Never Been Modern. (Orig. 1991). Cambridge MA: Harvard University Press.

Latour, Bruno. 1988. The Pasteurization of France. Cambridge MA: Harvard University Press.

Laudan, Larry. 1977. Progress and Its Problems. Berkeley: University of California Press.

Mirowski, Philip. 2004. The Effortless Economy of Science? Durham NC: Duke University Press.

Shapin, Steven, and Simon Schaffer. 1985. Leviathan and the Air-Pump. Princeton: Princeton University Press.

Voegelin, Eric. 1974. “Liberalism and Its History.” Review of Politics 35 (4): 504-520.
