There are many ways that technologies influence our thinking, behaviors, and perceptions. A lock on a door may prevent us from entering. A pot holder or oven mitt enables us to grab hold of a hot baking dish. The empty gas tank prohibits us from starting the car. Chemical dependence on a drug may make it difficult to resist taking more. The alarm clock makes it easier to drift off to sleep without worrying we won’t wake up in time for an important event the next morning. These forces—restricting or affording particular courses of action—are just some of the ways that technologies have an influence on a user as they are used, somehow changing a user’s sensations, abilities, lines of thought, decision-making, and options … [please read below the rest of the article].
Rosenberger, Robert. 2021. “The Politics of the Passive Subject.” Social Epistemology Review and Reply Collective 10 (9): 29-35. https://wp.me/p1Bfg0-692.
🔹 The PDF of the article gives specific page numbers.
❧ Aagaard, Jesper. 2021. “The Passive Subject: A Phenomenological Contribution to STS.” Social Epistemology Review and Reply Collective 10 (5): 14-19.
❦ Rosenberger, Robert. 2020a. “Backing Up Into Advocacy: The Case of Smartphone Driver Distraction.” The Journal of Sociotechnical Critique 1 (1): 1-16. https://digitalcommons.odu.edu/
The Influences of Everyday Technology
Through our work on this topic, Jesper Aagaard and I have been exploring another form of influence exerted by our devices. We’ve been considering the ways that the everydayness of technology itself can constitute a kind of influence. How does the way that technology usage becomes normal itself have effects on what we do? How does the fact that we embody our devices and come to use them in a way deeply saturated by learned bodily-perceptual habituation itself become part of the way that technologies direct our actions, thinking, and perceptions? In this short piece in particular, I’d like to point out some of the political dimensions of this form of technological influence.
Here on the SERRC, Aagaard suggests that we approach this form of influence through the figure of the “passive subject” (2021c). It is easy to fall into a trap of assuming that our actions involving technologies are the result of an explicit choice on the part of the user (e.g., the baker uses the pot holder in order to remove the hot cookie sheet from the oven), the explicit coercion of our behavior by the device (e.g., the locked door prevents me from entering), or perhaps some combination of the two accounted for in terms of a kind of networked, cyborg, or co-constituted action. Of course all these kinds of dynamics do occur. However, the figure of the passive subject highlights a further way our relationships with technology come to determine our experience: our bodily-perceptual habits.
The sedimentation of our past experiences with a technology primes us to encounter this same technology again in the same terms. As Don Ihde puts it, “For phenomenology, intuitions are constituted, not given. Only already constituted intuitions are ‘given’ within an already sedimented context” (1998, 121). Aagaard’s gathering of these ideas under the figure of the passive subject is a useful move. He expertly highlights the ways that sedimented bodily relations represent a distinct, often important, and often overlooked factor in the dynamics of human-technology relations.
We must develop tools for understanding the kind of experiential momentum that is generated within our relationships with our devices. That is, it is important to find ways to articulate how human-technology relations become set within bodily-perceptual habituation. This is because at least some aspects of our relationships with technology can sometimes take on a level of automaticity. And yet this automaticity is not like that of robotic programming or muscle reflex, but instead something built on our past experiences. Our sedimented relations prompt us to encounter our current experience with technology in terms of an already meaningful pre-perceptual context. This automaticity is often useful and even at times essential to the proficient usage of a device. For example, driving a car requires not just knowledge stored in your head, but bodily and perceptual training. Safe driving requires that some aspects of our relationship with the car’s interface (e.g., the steering wheel, pedals, etc.) have become at least somewhat automatic. Turning the wheel must flow naturally from your intention to turn the car, all as you focus attention on the road ahead. However, this automaticity can also take the form of “bad habits.” Many of today’s technology-related problems—from issues of online discourse, to environmental disregard, to the spread of misinformation—can be understood to have habitual components that could be usefully articulated and criticized through notions of the passive subject and related ideas.
Aagaard and I have developed these ideas through the exploration of a number of concrete topics, such as the classroom distraction of laptop computers, the driver distraction of smartphones, in-person conversation etiquette, phantom vibration syndrome, the nature of e-reading, and technology addiction (e.g., Rosenberger 2012; Aagaard 2015; Rosenberger 2015; Rosenberger 2017b; Aagaard 2020; Rosenberger 2020a; Aagaard 2021a; Aagaard 2021b). Others have applied these ideas regarding technology and perceptual sedimentation to everything from prostheses, to care organizing technologies, to smartphone communication, to virtual reality (e.g., Yaron et al. 2017; Kerruish 2019; Susser 2019; Lewis 2020; Shaw et al. 2020; Roholt 2021).
An important point to emphasize here is that, while this figure of analysis has been given the name “the passive subject” by Aagaard in his SERRC paper, these dynamics should not be understood as somehow docile or unassertive. As Maurice Merleau-Ponty puts it, “this contracted knowledge is not an inert mass at the foundation of our consciousness” (1962, 131). The kind of automatic contextualizing activity of our sedimented perceptual habituation upon our current experience highlighted by the figure of the passive subject is active, outreaching, and sometimes powerfully so.
What is at issue, then, in labeling these dynamics as “passive”? The language of passivity, as Aagaard has laid it out, is useful for the way it intersects with the discussion occurring within the fields of STS, psychology, and the philosophy of technology. Such debates often hover over how we should understand technological “agency” in comparison to human agency, or how we should understand technological “intentionality.” These discussions often explore how our devices incline or afford particular actions, or limit or shape our choices, how they introduce tradeoffs, or how—often in somewhat mechanistic terms—our relations to technology are determined by factors such as addiction or cognitive resource limitations. This is all well and good. But by labeling the action of learned bodily-perceptual habituation as “passive,” Aagaard effectively underlines the way that these dynamics are somehow different from those others under debate. Although our sedimented pre-perceptions can at times be forceful, this forceful action is not merely the result of a conscious volition. As Aagaard explains, this account understands habits to thus be “neither choices nor reflexes, they are both self-initiated and involuntary” (2021c, 17). This kind of habituated action is thus “passive” in the sense that it is happening with a level of automaticity coming from within the user, as the result of the user’s own past experiences, and yet without an explicit act of will on the user’s part.
All of this is important to keep in mind as we turn to consider how these issues of the passive subject have a role to play in the politics of technology. My suggestion is that these ideas have a distinctive contribution to make regarding the automaticity with which some kinds of political patterns can become a part of human embodied perception.
Of course our technological systems are not above political critique. The ways in which technologies are designed, implemented by communities, and taken up by users can all play roles in injustice. These roles can be intentional and/or unintentional, individual and/or structural, complex and fraught or straightforwardly evil, all at various levels. I suggest that the notion of the passive subject and its related ideas can be helpful for identifying and criticizing a specific space where these injustices reside and perpetuate: user habituation. As Gail Weiss writes:
The project of challenging oppressive perceptual norms that perpetuate negative judgements… necessitates a discussion of habit, for perceptual norms themselves arise directly out of habitual “syntheses” of space and time, the “miraculous” yet mundane intercorporeal activities through which my body “lends itself to the world” and the world lends itself to my body (2017, 213-214).
I argue that this is especially relevant to user relationships with technology. Insofar as injustice and discrimination can be built into our technological systems, their perpetuation will occur not only through material affordance and restriction, and not only through conscious and intentional implementation by users, but through the ways these usages become sedimented within users’ bodily-perceptual habituation. That is, one way such injustices can become instantiated is in the manner by which our technological arrangements become normal and expected by everyday users, even unwitting ones.
It is exactly because our relationships with technology can become normal and expected that they can be difficult to examine. Put another way, it is important to develop phenomenological tools for exposing the details of our relationships with technology exactly because they can become hidden within bodily-perceptual habituation. As Aagaard puts it, “Through practice, our bodies become so familiar with performing certain actions that this performance eventually happens outside of conscious awareness” (2021c, 15). And since aspects of human-technology relations can become occluded within their everydayness, unjust elements of these relations can correspondingly be obscured by the normal and lived-in character of our typical engagement with our world.
This can be especially true for those who are not the direct target of the injustice. It is an aspect of privilege to not be targeted for discrimination. But it is an additional aspect of privilege to be able to go through the world unaware of the discrimination faced by others. And this lack of awareness can become set within habits of perception and action. As Linda Martín Alcoff writes, “Rendering our habits visible makes them accessible for reflection and evaluation. This is a possible route for change” (2015, 85). I am suggesting that a project for postphenomenology and other perspectives in STS and the philosophy of technology is to develop conceptual resources for drawing out, articulating, and criticizing these experiential components of injustice.
The User Experience
One central example of how injustice can perpetuate through the habits of human-technology relations is in terms of what users notice, and more, what they fail to notice. Phenomenologists describe human perception as occurring in the form of a figure and a ground, with something emerging as present against a background of the other things experienced. The postphenomenological perspective has put a lot of work into accounting for human-technology relations in these terms, with notions of embodiment and background relations to technology, and technological transparency, among other ideas. But there is more to it.
As Sara Ahmed observes, there is a kind of activity involved in what becomes set as the background of our perception, what she calls an “act of relegation.” She writes, “Perception involves such acts of relegation that are forgotten in the very preoccupation with what is faced” (Ahmed 2006, 31). The act of failing to notice something is a non-innocent one, not a mere omission, but a kind of work. And as it becomes over time set within perceptual habit, this work can itself become automatic and unnoticed. In this way, the failure to recognize injustice can be perpetuated not just by privilege, but by privilege sedimented within bodily-perceptual habituation.
I’d like to consider these ideas in terms of my own work on the politics of public spaces, and in particular the problem of homelessness. If the set of ideas in the discussion between Aagaard and me here on the passive subject first emerged through my evolving work on the phenomenology of smartphone-induced driving impairment, then the related ideas on the political valence of this figure are emerging through this critique of anti-homeless design.
Anti-homeless designs can at once become pervasive and effective within public spaces, and yet at the same time can go largely unnoticed by those not targeted by them. Such designs take many forms, but some of the more common ones include benches built in a way to deter sleeping (such as through the addition of armrests or seat dividers), spikes built into ledges to deter sitting, and trashcan lids designed to deter picking. These can be added to many other public-space designs that have the effect of discriminating against the unhoused, from fences that block off underpasses, to sound machines that produce a loud racket in parks at night, to sprinkler systems that rain down on empty sidewalks (e.g., Rosenberger 2017a; Rosenberger 2020c). An academic discussion over these examples of “hostile architecture” or “hostile design” is beginning to emerge, one which identifies and studies anti-homeless and other forms of public-space design that have an effect of controlling already vulnerable groups. I have argued that anti-homeless design in particular should be understood in terms of its role in a larger anti-homeless agenda, one that also centrally includes anti-homeless law, differently situated within different cities all over the world, and which has an overall unjust effect of pushing unhoused people out of shared public spaces.
Here, I want to focus on one aspect of all this, the experience of someone not targeted by anti-homeless design as they move through public space. It is possible that this person will not take notice of these designs, and, more, that this failure to notice could become sedimented within their normal approach to these spaces. The public space around them—as this person moves through it in a typical manner on a typical day—may gestalt as normal, unproblematic, with everything in its place, and largely backgrounded. This person may, for example, sit on the bench and barely think about the bench itself, much less the role the design of the device may play in a larger anti-homeless agenda. That is, they may use the space in a way unaware of its material role as part of a campaign of discrimination against the homeless population. And this unawareness may have become built into their perceptual habituation with regard to the technology of this space. Not only are the unhoused mistreated, but this systematic mistreatment itself is hidden, and this hidden status becomes ingrained within the pre-perceptual habituation of the users of these devices and spaces. In this way, one important element of the unjust treatment of the unhoused population in this scenario is that which has become sedimented within the minds of those users of these spaces who are not targeted by anti-homeless design, and who functionally pass through these spaces oblivious to their politics.
An important political project for STS scholars, philosophers, and anyone thinking about technology is to identify, criticize, and resist the ways in which our devices contribute to injustice in the world. Of course often these technology-related injustices take place in a straightforward and unhidden manner, and it is crucial to predict and identify the ways that our devices may be used by some people to actively oppress and discriminate against others. But of course as well our devices have the capacity to “automate” discrimination and inequity; oppression can be offloaded to technological systems. Here, with the figure of the passive subject, we identify another element of the automaticity of technological systems that can contribute to oppression and discrimination: the integration of these biases into bodily-perceptual sedimentation. As users become accustomed to the usage of their devices (and to their technological situation and environment more generally), bias can become built into this very process of becoming accustomed. We cannot allow this site of the potential perpetuation of injustice to go unchecked.
Robert Rosenberger, email@example.com, Georgia Institute of Technology.
Aagaard, Jesper. 2021a. “Beyond the Rhetoric of Tech Addiction: Why We Should Be Discussing Tech Habits Instead (And How).” Phenomenology and the Cognitive Sciences 20: 559-572.
Aagaard, Jesper. 2021b. “‘From a Small Click to an Entire Action’: Exploring Students’ Anti-Distraction Strategies.” Learning, Media and Technology 46 (3): 355-365.
Aagaard, Jesper. 2021c. “The Passive Subject: A Phenomenological Contribution to STS.” Social Epistemology Review and Reply Collective 10 (5): 14-19.
Aagaard, Jesper. 2020. “Digital Akrasia: A Qualitative study of Phubbing.” AI & Society 35: 237-244.
Aagaard, Jesper. 2015. “Drawn to Distraction: A Qualitative Study of Off-Task Use of Educational Technology.” Computers & Education 87: 90-97.
Ahmed, Sara. 2006. Queer Phenomenology. Durham: Duke University Press.
Alcoff, Linda Martín. 2015. The Future of Whiteness. Cambridge: Polity.
Ihde, Don. 2009. Postphenomenology and Technoscience: The Peking University Lectures. Albany: SUNY Press.
Ihde, Don. 1998. Expanding Hermeneutics: Visualism in Science. Evanston: Northwestern University Press.
Kerruish, Erika. 2019. “Arranging Sensations: Smell and Taste in Augmented and Virtual Reality.” Senses and Society 14 (1): 31–45.
Kudina, Olya. 2021. “‘Alexa, Who Am I?’: Voice Assistants and Hermeneutic Lemniscate as the Technologically Mediated Sense Making.” Human Studies 44: 233-253.
Lewis, Richard S. 2020. “Technological Gaze: Understanding How Technologies Transform Perception.” In Perception and the Inhuman Gaze edited by Anya Daly, Fred Cummins, James Jardine, and Dermot Moran, 128-142. London: Routledge.
Merleau-Ponty, Maurice. 1962. Phenomenology of Perception. Translated by Donald A. Landes. London: Routledge.
Roholt, Tiger. 2021. “Being-With Smartphones.” Techné 25 (2): 284-307.
Rosenberger, Robert. forthcoming. “Technological Multistability and the Trouble with the Things Themselves.” In The Oxford Handbook of Philosophy of Technology edited by Shannon Vallor. Oxford University Press. https://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780190851187.001.0001/oxfordhb-9780190851187-e-42.
Rosenberger, Robert. 2020a. “Backing Up Into Advocacy: The Case of Smartphone Driver Distraction.” The Journal of Sociotechnical Critique 1 (1): 1-16. https://digitalcommons.odu.edu/sociotechnicalcritique/vol1/iss1/3/.
Rosenberger, Robert. 2020b. “‘But That’s Not Phenomenology!’: A Phenomenology of Discriminatory Technologies.” Techné 24 (1/2): 83-113.
Rosenberger, Robert. 2020c. “On Hostile Design: Theoretical and Empirical Prospects.” Urban Studies 57 (4): 883-893.
Rosenberger, Robert. 2017a. Callous Objects: Designs Against the Homeless. Minneapolis: University of Minnesota Press.
Rosenberger, Robert. 2017b. “On the Immersion of E-Reading (Or Lack Thereof).” In Postphenomenology and Media edited by Yoni Van Den Eede, Stacey O’Neal Irwin, and Galit Wellner, 145-163. Lexington Books.
Rosenberger, Robert. 2015. “An Experiential Account of Phantom Vibration Syndrome.” Computers in Human Behavior 52: 124-131.
Rosenberger, Robert. 2012. “Embodied Technology and the Dangers of Using the Phone While Driving.” Phenomenology and the Cognitive Sciences 11: 79-94.
Shaw, Sara E., Gemma Hughes, Sue Hinder, Stephany Carolan, and Trisha Greenhalgh. 2020. “Care Organizing Technology and the Post-Phenomenology of Care: An Ethnographic Case Study.” Social Science & Medicine 255: 112984.
Susser, Daniel. 2019. “Invisible Influence: Artificial Intelligence and the Ethics of Adaptive Choice Architectures.” AIES ‘19: Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society, 403–408.
Verbeek, Peter-Paul. 2020. “Politicizing Postphenomenology.” In Reimagining Philosophy and Technology, Reinventing Ihde edited by Glen Miller and Ashley Shew, 141-155. Cham: Springer.
Verbeek, Peter-Paul. 2011. Moralizing Technology. Chicago: University of Chicago Press.
Weiss, Gail. 2017. “The ‘Normal Abnormalities’ of Disability and Aging.” In Feminist Phenomenology Futures edited by Helen A. Fielding and Dorothea E. Olkowski, 203-217. Bloomington: Indiana University Press.
Yaron, Gili, Guy Widdershoven, and Jenny Slatman. 2017. “Recovering a ‘Disfigured’ Face: Cosmesis in the Everyday Use of Facial Prostheses.” Techné 21 (1): 1-23.
I suggest that these factors have the potential to make important contributions to the philosophical perspective through which they are arising: postphenomenology. This perspective has a philosophical commitment to a “relational” ontology, understanding humans and technologies and the world to all become what they are through technological mediation (e.g., Ihde 2009; Verbeek 2011; Kudina 2021; Rosenberger forthcoming). An under-examined aspect of these dynamics is precisely how these issues of bodily-perceptual habituation play into the “co-constitution” at work in postphenomenology’s account of relational ontology. An additional reason this could be important is that it could enable a distinctive phenomenological contribution to the larger discussion over relational ontology occurring within the kin group of perspectives to which postphenomenology belongs, including actor-network theory and feminist new materialism.
It is also the case that these reflections on the politics of technological habituation can help to demonstrate the potential of postphenomenological thought to contribute to political work on technology. Phenomenology in general, and postphenomenology in particular, are sometimes criticized for a lack of political engagement, both in terms of the work of their adherents, and in terms of the structural potential (or alleged lack thereof) of these ideas. These ideas about the political relevance of learned habituation regarding technologies can join the other ideas coming out of postphenomenology regarding its political relevance (e.g., Rosenberger 2017a; Rosenberger 2020b; Verbeek 2020).
 For online collections of examples of hostile architecture and design, see, e.g.:
• Interboro’s Arsenal of Inclusion and Exclusion: https://www.interboropartners.com/projects/the-arsenal-of-exclusion-inclusion.
• Cara Chellew’s Defensive Urbanism Research Network: https://www.defensiveto.com/.
• Dan Lockton’s Architectures of Control: https://architectures.danlockton.co.uk/architectures-of-control-in-the-built-environment/.
• Nils Norman’s Dismal Garden Collection: http://www.dismalgarden.com/archives/defensive_
 For more on these ideas, including some initial thoughts on their connections to feminist standpoint epistemology, see: Rosenberger 2017a, chapter 5.
Categories: Critical Replies