Salience Machines: A Review of Florian Jaton’s The Constitution of Algorithms: Ground-Truthing, Programming, Formulating, Emma Stamm

Florian Jaton’s The Constitution of Algorithms: Ground-Truthing, Programming, Formulating takes up a simple question: where do algorithms come from? Although a great deal of sociotechnical research implies problems of this kind, they are rarely posed so directly. There may be good reason for this. To treat the algorithm as genealogically distinct from similar objects is to approach intellectual terrain which, if not techno-utopian, at least fails to register the zeitgeist. At present, a growing number of scholars claim that digital processes reinforce patterns of social bias and exclusion. The literature on algorithmic racism, for example, portrays machine learning as a dyadic operation which obscures as much information as it expresses.[1]


Article Citation:

Stamm, Emma. 2021. “Salience Machines: A Review of Florian Jaton’s The Constitution of Algorithms: Ground-Truthing, Programming, Formulating.” Social Epistemology Review and Reply Collective 10 (9): 1-6.

🔹 The PDF of the article gives specific page numbers.

The Constitution of Algorithms: Ground-Truthing, Programming, Formulating
Florian Jaton
The MIT Press, 2021
289 pp.

This literature suggests that the endless segmenting of digital objects into novel categories—e.g. “apps”; “the cloud”; “algorithms”—belies their shared role in advancing corporate interests, sometimes at great cost to society. Among businesspeople, the introduction of new labels for established technologies is salutary. But critics know that an “app” is the same thing as software; “the cloud” comprises nothing more than web servers; and an “algorithm” is merely a set of computational rules. In truth, terms like “algorithm” do not indicate self-enclosed functions. Like all digital objects, algorithms are not only constituted by other media, but by social and epistemic forces which predate and supersede them.

Thus I was concerned that The Constitution of Algorithms would reduce its subject in the interest of narrative cohesion. But Jaton is quick to address this potential confound, declaring his project partial, situated, and epistemically normative in the book’s introduction (16). Admitting the impossibility of finishing the task before him, he claims he has “no other choice” than to ask the reader to follow along nevertheless (17). The book is peppered with such reflexive remarks, and while they might have distracted in a different context, here they underscore a central point. Books and algorithms both produce salience—that is, they privilege specific ontologies, epistemologies, and facts over others. This principle characterizes all forms of positive knowledge production and has been scrutinized by no small number of philosophers and social theorists. Jaton’s commitments, however, are not theoretical.

As he argues, the many controversies which surround algorithms today may be at least partially resolved through an empirical investigation into their origins (9). Towards this end, he examines the “unrelated entities (e.g., documents, people, desires)” which come into contact throughout the gestation period of algorithmic code (9). The contemporary discourse on digital ethics and politics is both setting and raison d’être. Although the book’s task may be unfinished, it still meaningfully intervenes in these debates.

A sociologist working in Science and Technology Studies (STS), Jaton draws his research methods from social theory, critical historiography, and the sociology of scientific knowledge, among other areas. The book is based on his doctoral dissertation, which included three years of ethnographic fieldwork in a digital image processing laboratory (21). Between 2013 and 2016, Jaton joined a team of PhD students developing a program to detect the most salient features in an image file (34). He learned how to program along the way, eventually contributing to the codebase himself (18). As he writes, his years of fieldwork provided him with insights that could not be gleaned from scientific publications, most of which fail to report “the practical activities” that led to their results (18). Jaton is concerned with nothing if not “practical activities”: the everyday minutiae familiar to programmers but rarely foregrounded in academic and popular representations of computer science. For this reason, The Constitution of Algorithms belongs to the burgeoning fields of critical data studies and critical infrastructure studies, as both illuminate the largely hidden processes compacted in digital structures.

From the outset, Jaton suggests that today’s technological imaginaries suffer from a historical and conceptual myopia. What we know today as computing was not an inevitable outcome of technological progress, he argues, but was shaped by a series of chance political, scientific, and industrial developments. To make this point, Jaton interweaves three ethnographic chapters with three chapters surveying the historical and epistemological predicates of today’s algorithmic ecosystem. Although his aims are descriptive, his facility with theoretical techniques in STS and the social sciences is clear. Bruno Latour’s work on the inscription of knowledge is a touchstone, as is Antonio Negri and Michael Hardt’s work on constituent power (13, 161). Jaton invokes the latter to claim that algorithms are constituted processually—they are not static or finite—and that, as constitutions, they endorse certain social rights (17). Hence his choice to use the word “constitute” rather than “construct” to describe algorithmic development processes. His algorithmic constitutions are texts which both govern and are governed by human decisions (12).

Conceiving Algorithms, Digital Image Processing, and John von Neumann

The book’s six chapters are divided across three sections. The first section, “Ground-Truthing,” explores the power of referent databases to orient machine learning outcomes. Jaton defines “ground truths” as the data used to build and test machine learning code (54). Among programmers, these data are known as the “training set” and the “test set”; together, they make up the “ground” or epistemic foundation for machine learning applications.
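The training/test division Jaton describes can be sketched in a few lines of Python. This is a generic illustration of the practice, not code from the book or from Jaton’s laboratory; the function name and the toy data are my own inventions.

```python
import random

def split_ground_truth(labeled_examples, test_fraction=0.2, seed=42):
    """Partition a ground-truth dataset into a training set and a test set.

    `labeled_examples` is a list of (input, target) pairs -- the kind of
    referent data Jaton calls a "ground truth". The split is random but
    reproducible via `seed`.
    """
    examples = list(labeled_examples)
    random.Random(seed).shuffle(examples)
    n_test = int(len(examples) * test_fraction)
    test_set = examples[:n_test]
    training_set = examples[n_test:]
    return training_set, test_set

# Toy ground truth: ten (image_id, saliency_label) pairs.
ground_truth = [(f"img_{i}.png", i % 2) for i in range(10)]
train, test = split_ground_truth(ground_truth, test_fraction=0.2)
print(len(train), len(test))  # 8 2
```

The point Jaton presses is that everything downstream of this split, including the reported performance of the finished algorithm, inherits whatever the ground truth already contains.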

Chapter One confronts what Jaton calls the “standard conception” of algorithms, or their common definition as “interfaces between inputs and outputs” (49). He then suggests a conceptual expansion of “algorithm” to include not only code, but the act of defining the problems targeted by algorithmic calculation (16). As he writes, most programmers accept that these problems already exist as manifest objects in the world. If this is true, he continues, the study of algorithms would be exclusively devoted to optimizing and assessing their efficacy (49). He objects to this framing, declaring that problems do not exist prior to their articulation and translation into programmable formats. These activities comprise what he calls the “problematization” stage of building algorithms, where programmers first define a problem and then rework it to be soluble in code (49). The problematization stage informs the construction of the ground truth, and both exert influence over the algorithm long after its release to users.

Chapter Two, which comprises the first ethnographic case study, depicts problematization in situ. Jaton begins by reflecting on his initial months at the unnamed European university where he joined a digital image processing laboratory (34). His recollections focus on his attempts to learn at least enough about programming to understand the work of his teammates, which included a varying number of PhD students and one postdoctoral scholar. Throughout this section, he shares his newfound knowledge with the reader: chapters one and two both survey foundational concepts in machine learning, and chapter two includes a sophisticated discussion of the function of salience detection. This explanation is necessary to understand all three case studies; here, it accounts for his teammates’ reconceptualization of “saliency” in their subfield (67). As Jaton explains, the conventional meaning of saliency curbed their efforts to reimagine face detection in machine vision programs, despite evident need for such a problematization (75).
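For readers unfamiliar with the task, salience detection takes an image as input and returns a map scoring how much each region stands out. The sketch below is a deliberately crude stand-in of my own devising, not the program Jaton’s teammates built: it scores each pixel by its deviation from the image’s mean intensity, which is just enough to show the input-to-saliency-map shape of the problem.

```python
def naive_saliency_map(image):
    """Crude per-pixel saliency: absolute deviation from the mean intensity.

    `image` is a 2D list of grayscale values. Real salience-detection
    models are far more sophisticated; this toy version only illustrates
    that "saliency" must first be operationalized as a computable score.
    """
    pixels = [value for row in image for value in row]
    mean = sum(pixels) / len(pixels)
    return [[abs(value - mean) for value in row] for row in image]

# A mostly dark image with one bright region: the bright pixels score highest.
image = [
    [10, 10, 10, 10],
    [10, 10, 200, 210],
    [10, 10, 205, 200],
    [10, 10, 10, 10],
]
saliency = naive_saliency_map(image)
```

Even this trivial version makes Jaton’s point visible: the definition baked into the function decides in advance what will count as “salient,” which is exactly the kind of choice his teammates had to renegotiate.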

Readers with no prior exposure to computer science might find the second chapter unwieldy. But in addition to contextualizing the book’s ethnographic sections, these technical passages faithfully reflect the creativity and guesswork involved in coding. Although Jaton’s teammates are equipped with highly specialized knowledge, their labors are speculative and uncertain. Programming thus appears as a field whose criteria for success are not always objective and often appear as moving targets. In one memorable passage, Jaton recounts his teammates’ disappointment at having a paper rejected from a major technical conference (79). Interestingly, a second paper with “rigorously the same objects” as the first—the same ground truth and computational model—won “Best Short Paper Award” at a different conference the following year (81). Whereas the first paper emphasized the program’s performance, the second emphasized hypothetical applications for its ground truth (81). The outcome of this difference in framing speaks to the subjective nature of computer science.

Chapter Three, titled “Von Neumann’s Draft, Electronic Brains, and Cognition,” presents the book’s most compelling argument. Put simply, Jaton claims that social research on the digital is constrained by the metaphor which likens brains to computers. He arrives at this conclusion through a critical historiography of World War Two and postwar-era computing which centers the career of mathematician John von Neumann. Acknowledging von Neumann’s uncontested status in the history of information technology, Jaton argues that his work rendered the practical labors of programming all but invisible in the official record. Because of his high standing in the global research community, von Neumann “attended meetings—the famous ‘Meetings with von Neumann’—and read reports and letters … but was not part of the mundane tedious practices” which were (and are) essential in computer programming (100). Put differently, von Neumann’s vision of computing was informed by “clean results of laborious material processes” which did not accurately represent the daily labors of computing. From this privileged outlook, the mathematician began to essentialize computers as seamless input-output machines. By contrast, Jaton writes, computing has always been a function of unprogrammed and unpredictable variables carried out by masses of largely invisible workers. If we’ve come to see it as “seamless,” “pure,” or “fully autonomous,” it only means that these workers cleaned up after themselves (99).

Von Neumann, failing to consider the embodied labor of programming, came to imagine computers as disembodied brains—and, ultimately, to imagine brains as biological computers. Although the premises of this metaphor were flawed, Jaton describes its profound intellectual influence. He examines its specific role in shaping cognitivism, a tradition in cognitive science which maintains that cognition functions as a computational input-output machine. Cognitivists believe that brains take in perceptual data from the world at large, render these data as representations, and produce outputs in the form of actions (125). Within this school of thought, Jaton says, “matter thrones in the realm of ‘extended things’” and “mind thrones in the realm of ‘thinking things’,” or the allegedly virtual domain of computing (125). Jaton rejects this Cartesianism on philosophical grounds, but more crucially for his project, because it restricts our practical understanding of both thinking and computing (123).

As an alternative to cognitivism, he endorses a view of cognition known as enactivism. For enactivists, thinking is not a calculative function lodged between inputs (perceptions) and outputs (actions). It is instead directly constitutive of and responsive to the environments it inhabits, with no intermediary in the form of symbolizing or representative mental operations (126). In Jaton’s words, “enactivism considers cognition as adaptive interactions with the environment” whose features “are offered to and modified through the actions of the cognizer.” This school of thought does not equate brains with computers, but nevertheless has important implications for computing practices, which Jaton presents at the end of the chapter.

Enactivism, Wetware, and Programming

Chapter Four, the book’s second ethnographic case study, explores the work of programming through an enactivist lens. It sets out from a procedural problem Jaton encounters in the laboratory: how was he to document the fast-paced, cryptic, and focus-intensive work of his teammates (136)? He overcomes this obstacle by developing his own image-processing project with the group’s assistance. In what follows, he draws on his personal experiences to develop something of an autoethnographic enactivist account of coding. Consistent with enactivist principles, he frames code as inscriptions which function less as symbols and more as fluctuating reconfigurations of events. He insists that code does not represent a priori referents, but constitutes new worlds—just as enactivist cognition directly constitutes rather than symbolizes its given environment (148). This sets up an argument which he develops in chapter five: that the production of scientific and mathematical knowledge functions likewise as a matter of creative constitution rather than reference to unchanging and objective predicates.

Chapter Four completes the book’s second section, “Programming.” The third section is titled “Formulating,” Jaton’s term for the processes by which scientific realities take on mathematical (and therefore computable) forms (232). As a case study in formulating, Jaton looks at the work of neuroscientist Michael Lynch. In the 1970s, Lynch was studying the brain’s capacity to generate new neural connections (229). Lynch’s research fused “wet” organic material—thin slices of rat brains—with the mathematical abstractions required to systematize their properties. Noting that brain tissue does not come with its own quantitative correlates, Jaton writes that Lynch’s research entailed the translation of “wet” matter into the “flat and dry ecology” of numbers. This common medium allows for the production of both mathematical and scientific facts, but Jaton maintains that it homogenizes complex ontologies.

In a book rich with arcane information, no detail felt superfluous until this point. Jaton’s appraisal of “formulating” rehearses arguments long made by media theorists about the reduction of ontically and epistemically dynamic phenomena to logico-formal codes. His emphasis on the flattening of “wet” and “bulky” biological realities evokes Eugene Thacker’s The Global Genome: Biotechnology, Politics, and Culture and Richard Doyle’s Wetwares: Experiments in Postvital Living.[2], [3] Both books dissect the contingent social factors which inform mediation processes between digital and biological material, making essentially the same critique as Jaton does. Instead of engaging the abundant literature which theorizes media translations, Jaton presents a byzantine historical tour through neuroplasticity research. At a minimum, an acknowledgment of Marshall McLuhan’s foundational work in this area would have been appropriate.[4] With this said, The Constitution of Algorithms already engages multiple disciplines. To ask Jaton to incorporate frameworks from media theory alongside STS, sociology, computer science, and history might be unfair.

The sixth chapter presents the third ethnography, which recalls an episode of formulating: the translation of images depicting human faces (the object of his team’s salience-detection algorithm) into logarithmic values (250). Jaton revisits the concept of ground-truthing to indicate that formulating practices inherit the properties of the ground truth. He also makes an observation about programming algorithms that is especially crucial for readers with no coding experience. While the formulating stage follows from the ground-truthing stage, Jaton writes, the ground truth is also determined by expectations about future formulating activities (247). Thus programming is an exercise in recursive imagination, and programmers must be conscious of the self-fulfilling tautologies they inscribe in code. Here, it becomes clear that the book’s schematic of “three gerund sections,” as Jaton puts it—the eponymous ground-truthing, programming, and formulating—is not isomorphic with the processes it explores (281). This schema gives necessary if provisional coherence to activities which are nonlinear and constituted by multiple agents. Jaton implies as much in the introduction, and provides a useful reminder towards the conclusion.

Curiosity and Constitution

The book does not furnish the reader with one overarching argument or novel conceptual framework. It is most useful for mapping territory which surrounds us but eludes rigorous understanding. Jaton’s eclectic approach to methodology serves this purpose well, as his ethnographic reports mirror and confirm the ideas which run through the more conceptual chapters. Marshalling so many concepts, stories, and disciplinary approaches, the book risks collapsing under its own weight. Instead, the effect is orchestral. The chapters are all markedly different from one another; together, they amount to a chronicle of contemporary sociotechnical research across a number of disciplines.

From end to end, the book is buoyed by Jaton’s glowing curiosity about computer science. Its mood is passionate and reverent—a far cry from that of social research which dismisses digital media as little more than reified politics. Here, we find the sort of ardor which produces “around one thousand pages of handwritten notes; two thousand .txt files; a dozen modulable Python scripts; and hundreds of audio, image, and movie recordings as well as numerous half-finished analytical propositions,” as Jaton did throughout his doctoral research (45). Perhaps this enthusiasm is key to repairing algorithms’ poor reputation in the social sciences. The Constitution of Algorithms asserts that their forms are never fully determined, and that their capacity for positive change depends on our openness to them.

Author Information:

Emma Stamm, Villanova University.


Benjamin, Ruha. 2019. Race after Technology: Abolitionist Tools for the New Jim Code. Cambridge, UK: Polity.

Doyle, Richard. 2003. Wetwares: Experiments in Postvital Living. Minneapolis, MN: University of Minnesota Press.

Jaton, Florian. 2021. The Constitution of Algorithms: Ground-Truthing, Programming, Formulating. Cambridge, MA: The MIT Press.

McLuhan, Marshall. 2004. Understanding Media: The Extensions of Man. Cambridge, MA: The MIT Press.

Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York, NY: New York University Press.

Thacker, Eugene. 2006. The Global Genome: Biotechnology, Politics, and Culture. Cambridge, MA: The MIT Press.

[1] E.g., Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (New York, NY: New York University Press, 2018); Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (Cambridge, UK: Polity, 2019).

[2] Eugene Thacker, The Global Genome: Biotechnology, Politics, and Culture (Cambridge, MA: The MIT Press, 2006).

[3] Richard Doyle, Wetwares: Experiments in Postvital Living (Minneapolis, MN: University of Minnesota Press, 2003).

[4] E.g., Marshall McLuhan, Understanding Media: The Extensions of Man (Cambridge, MA: The MIT Press, 2004).
