Using Institutional Ethnography to Examine the Social Organization of Absence, Jaime McCauley

Author Information: Jaime McCauley, Northern Kentucky University, mccauleyj1@nku.edu

McCauley, Jaime. “Using Institutional Ethnography to Examine the Social Organization of Absence.” Social Epistemology Review and Reply Collective 3, no. 8 (2014): 22-27.

The PDF of the article gives specific page numbers. Shortlink: http://wp.me/p1Bfg0-1zk

Introduction

The study of absences of knowledge is inherently bound up with questions of power and justice. Asking “What is known, and to whom?” implies that some of us have the power to conceal or reveal what is known to others. Indeed, power and justice are a recurring theme in the Social Epistemology special issue on “absence”: Scott Frickel describes absence as “bound up in the moral economies of societies” (86), Jennifer Croissant recognizes absence as “overdetermined by power relations” (11), and justice is central to Dimitri Papadopoulos’ “politics of matter” (77). In this critical reply, I take up these assertions about the relationships between power and justice, and between knowing and not knowing. I seek to complement the arguments made in these papers by illustrating the potential contribution of institutional ethnography, a sociological approach, to examining both the contours of absence and the power relations behind what is known and unknown. To do so, I apply institutional ethnography to my research on volunteer water quality monitoring.

Institutional Ethnography

Institutional ethnography (IE) starts from the premise that all knowledge is socially organized and structured by “ruling relations” external to the individual. Developed as a new approach to sociology by Dorothy Smith, IE centers what Smith calls the “everyday/everynight” experiential knowledge of individuals engaged in the activity under study. From this standpoint, IE maps the “local sites of people’s experience, making visible how we are connected into the extended social relations of ruling and economy and their intersections” (Smith 2005, 629). Institutional ethnographers use data from interviews and/or participant observation to explore the ruling relations behind socially organized knowledge, and then identify a “problematic” that “sets out a project of research and discovery that organizes the direction of the investigation” (Smith 2005, 4206). IE emphasizes that knowledge and action are textually mediated and “makes visible just how activities in local settings are coordinated and managed extralocally” (DeVault 2006, 295). Mapping the extralocal coordination of local activity reveals the dimensions of power and justice at play in determining which knowledges are available to whom.

I began my current research by exploring the standpoint of participants in volunteer water quality monitoring programs. I was interested in, among other things, why individuals take part and what goals they have for the data they produce. Between June 2013 and June 2014 I completed 19 interviews with people involved in five volunteer water quality monitoring organizations in the Greater Cincinnati/Northern Kentucky region, and I observed various meetings, events, forums, and lab activities. Taking the standpoint of these informants revealed that they are quite adept at identifying absences of knowledge; each had questions about the quality of some local waterway for which there was no answer. The avid kayaker wanted to know whether the water was safe for recreation, the fisherman whether his catch was safe to eat, the grandparents whether the creek by their home was safe for the grandchildren to play in, the farmer whether his fertilizers affected the local waterway, and so on.

From the standpoint of these participants I learned that making knowledge present where it had been absent is a primary motivation for participation in water quality monitoring programs. To this end, my informants teamed up with scientific experts in volunteer water quality monitoring programs (in each organization, lab analysis is overseen by a team of scientific experts who hold advanced degrees in biology and/or environmental science, and only these experts perform the scientific analysis of lab results). Through these partnerships, data are produced that make knowledge about the safety and quality of local waters available to the public. The problematic, from the standpoint of my informants, is, “Are we making a difference?” Specifically, how can our data be used to gain the attention of the regulatory agencies that protect the water?

Visibility and Potential Action

Whether action is taken based on data produced by citizen water quality monitoring forces us to confront issues of power and justice lurking beneath the surface of claims that citizen science is an inherently democratic enterprise. As Puig de la Bellacasa (2014, 38) poignantly notes in this issue, “Making something visible is never a neutral affair.” Creating knowledge about water quality and making it not just visible, but actionable, is precisely the goal of most volunteer water quality monitors. It is this, perhaps more than the monitoring itself, that entangles citizen water quality monitoring organizations with what institutional ethnographers call ruling relations: “translocal forms of social organization and social relations mediated by texts” (Smith 2005, 4213). These texts then become part of an institutional discourse that coordinates the work of individuals. Producing actionable data requires volunteer water quality organizations to interact with local, regional, and state regulatory agencies. The ruling relations associated with these agencies, in turn, coordinate the work of water quality monitoring programs.

One of the goals of IE is to make ruling relations visible by creating “maps” of the institutional complexes and associated texts that coordinate people’s activities. By mapping these relations, the institutional ethnographer makes ruling relations visible so that those involved with an institution have a more complete view of how to work within the institution, or to change it. For citizen water quality monitors, successfully working with government regulatory agencies or environmental institutions often makes the difference in whether the data they produce are fully actualized: whether the absence of knowledge they have worked to address becomes fully present.

To illustrate the ways in which extralocal ruling relations, institutions, and texts coordinate the local activities of volunteer water quality monitoring organizations, I examine the impact of the Ohio Environmental Protection Agency’s (OEPA) Credible Data Program. Through this program the state specifies standards that limit which data may be used by the agency. This program is part of an “institutional discourse [that] selects those aspects of what people do that are accountable within it … what is not discursively recognized will not appear” (Smith 2005, 2962). Accordingly, data not produced according to Credible Data Program specifications are officially invisible to the state. Additionally, the OEPA exercises control over the timing and cost of the training sessions necessary to become a certified data collector. For these reasons, the program plays a key role in concealing or revealing water quality data produced by volunteer water quality monitoring programs.

The Credible Data Program emerged as a “problematic” area of study during interviews with informants in Ohio. (Kentucky does not have a credible data law like Ohio’s; organizations there therefore have a different relationship with government agencies.) Many informants emphasized the importance of taking action should data reveal a problem with the water, but organizational leaders know that getting the attention of regulatory agencies is difficult if the organization does not meet Credible Data Program standards. This is how extralocal ruling relations associated with the Credible Data Program shape the local activities of water quality monitoring programs. Organizational leaders take seriously their responsibility to produce scientifically valid and reliable data. Even programs not officially certified under the Credible Data Program follow the procedures and protocols it specifies. Unfortunately, fulfilling the requirements places significant burdens on citizen monitoring programs, especially small, non-profit, all-volunteer organizations. These organizations remain limited in the extent to which their data become fully known, present, or actionable.

IE, then, requires mapping the ruling relations in a way that exposes the coordinating mechanisms between institutional discourse and the actualities of the work in which people are engaged. While my institutional ethnographic research on this topic is ongoing, several findings already expose the role of the state in helping or hindering the efforts of citizen water quality monitoring programs to address absences of knowledge about their local waterways. Here I use interview and participant observation data combined with online texts from the OEPA website and a sample of documents shared with me by informants. Findings indicate that: 1) there are significant temporal and financial burdens associated with meeting Credible Data Program standards that limit the ability of volunteer-based organizations to attain certification; and 2) the OEPA itself is caught up in external ruling relations whereby its funding is limited, which, in turn, limits the availability of the training opportunities monitoring organizations need to meet Credible Data Program requirements.

Credible Data

On its website, the OEPA Credible Data Program sets out three categories of data. Level one data may be used for public education only, and training is optional; level two data may be used to evaluate trends in water quality and to assess the effectiveness of pollution abatement projects after implementation; and level three data may be used for regulatory purposes. Becoming certified in level two or level three data collection requires an investment in training and equipment on the part of the organization, as well as the submission of a series of quality assurance protocols. Of the four monitoring programs in my study, only the two wealthiest and longest-running organizations had obtained certification (one is jointly supported by a county-level government agency and a local university; the other is part of a private foundation dedicated to environmental education). Both organizations have at least one paid staff member. Even for these organizations, maintaining level two certification is described as “cumbersome.” For example, re-certification documents must be submitted annually, and respondents estimate that completing these documents requires eighty staff hours. Both groups have decided against level three certification because the process is so difficult, thereby rendering their data absent from regulatory use.

For smaller programs, meeting the costs of attending days-long trainings and purchasing the necessary equipment is a barrier to participation in the Credible Data Program. The costs of travel to out-of-town training sessions are significant for programs on a small budget, and it is difficult for individual volunteers to absorb these costs if they are not reimbursed by their organization. In addition, most volunteers would be taking time off from paid employment to attend. Regardless, these organizations strive to comply with the program unofficially so that they may become certified in the future. Much attention is given to writing quality assurance protocols and training manuals, entering data, and performing other tasks in compliance with Credible Data standards. One respondent indicated that their lab meets OEPA standards and that the quality assurance protocols are drafted, yet the group lacks official certification because no one from it has attended a level two training session. The extralocal ruling relations coordinating the local activities of water quality monitoring programs thus limit the extent to which knowledge produced by these organizations becomes fully known.

Complicating this picture are ruling relations extending beyond the Ohio EPA. The OEPA must contend with austerity measures and government cutbacks that have slashed budgets for environmental agencies. This creates a contradictory effect: the OEPA has increased its reliance on volunteer water quality monitoring programs because such programs are cost-effective, yet it has fewer resources to certify data from these programs according to its own standards. The OEPA website indicates that level two and level three training sessions are not currently offered due to budget constraints. Occasionally, similar training sessions are available through private firms. The most recent of these took place in the state capital over the course of five days at a cost of $3,975.00 (plus associated travel costs). It is unclear to my informants whether these training sessions fulfill the requirements necessary to advance their certification, although they were listed on the OEPA website.

So few volunteer programs are able to meet the credible data standards that the OEPA website states, “We have de-emphasized the name Volunteer Monitoring Program on these Web pages (in favor of the name Credible Data Program). While it is absolutely our intent to encourage volunteer monitoring, the large majority of program participants are not volunteers.” Indeed, the list of qualified data collectors (QDCs) published on the OEPA website consists mostly of employees of private firms or governmental agencies. These results echo Gwen Ottinger’s (2009) study of Louisiana air quality monitoring, in which standards served in some cases as a bridge to inclusion for citizen monitors and in others as a gatekeeper excluding citizen data. Rather than making the knowledge production process more inclusive of data collected by citizen scientists, Ohio’s Credible Data Program has instead rendered data produced by these programs largely invisible to the state.

No part of this critique is meant to diminish the importance of quality data. However, failure to meet the requirements of the Credible Data Program is not, in and of itself, an indication of poor data or methods on the part of volunteer monitoring programs. Each organization has scientific experts who oversee data collection, testing, and analysis. Each group has sampling protocols and volunteer training procedures produced by scientific experts. Each group follows the standards set out by the OEPA, whether the group is officially certified or not. The only difference is that two of these groups have completed the OEPA training, certification, and re-certification processes and two have not (yet) done so. The groups without certification are hindered in part by a lack of organizational resources, but also by a lack of available training sessions. For these reasons the knowledge produced by these organizations remains, if not entirely absent, obscured.

Conclusion

The sketch I present of IE’s potential contribution to uncovering absences of knowledge from the standpoint of citizen scientists is just the beginning. A key part of IE is “exploring the work knowledges of those situated differently in the institutional division of labor” (Smith 2005, 3004). From here my analysis will go on to include, for example, interviews with members of the OEPA involved with the Credible Data Program, as well as with local and county regulatory agencies and planning commissions that use citizen-produced water quality data. In addition, continued analysis of the texts that regulate and structure the work of people at all points of the problematic under study is necessary.

Institutional ethnography is just one of many ways to answer Scott Frickel’s call for increased empiricism in the study of absence. With the illustration of IE presented here, I hope to have demonstrated the usefulness of this approach to sociology as one way of exploring absences of knowledge. Absences are often “motivated by a multiplicity of factors linked together by considerations of social power” (Croissant 2014, 11). Because IE aims to “take ethnography further into contemporary forms of organization we call power” (Smith 2005, 3640), it is a method primed to unravel the mystery of which knowledges are available to whom, and why. By identifying absences and making visible the power relations behind them, IE can not only bring absences to light but also explore new possibilities for creating knowledge where there was none before, or for exposing knowledge that had been hidden. Dimitri Papadopoulos (2014, 75) refers to absence “as a process: the making and unmaking of absences.” IE can help us uncover this process.

References

Croissant, Jennifer L. “Agnotology: Ignorance and Absence or Towards a Sociology of Things That Aren’t There.” Social Epistemology 28, no. 1 (2014): 4-25.

DeVault, Marjorie L. “Introduction: What is Institutional Ethnography?” Social Problems 53, no. 3 (2006): 294-298.

Frickel, Scott. “Absences: Methodological Note about Nothing, in Particular.” Social Epistemology 28, no. 1 (2014): 86-95.

Ohio Environmental Protection Agency. “Credible Data: Home.” Accessed July 17, 2014. http://www.epa.ohio.gov/dsw/credibledata/index.aspx

Ohio Environmental Protection Agency. “Current QDCs.” Accessed July 17, 2014. http://www.epa.ohio.gov/dsw/credibledata/current_QDCs.aspx

Ohio Environmental Protection Agency. “Levels of Credible Data.” Accessed July 17, 2014. http://www.epa.ohio.gov/dsw/credibledata/levels_of_data.aspx

Ohio Environmental Protection Agency. “Training and Testing.” Accessed June 23, 2014. http://www.epa.ohio.gov/dsw/credibledata/training_testing.aspx#126441-level-2-and-level-3-training-availability

Ottinger, Gwen. “Buckets of Resistance: Standards and the Effectiveness of Citizen Science.” Science, Technology, and Human Values 32, no. 2 (2009): 244-270.

Papadopoulos, Dimitri. “Politics of Matter: Justice and Organisation in Technoscience.” Social Epistemology 28, no. 1 (2014): 70-83.

Puig de la Bellacasa, Maria. “Encountering Bioinfrastructure: Ecological Struggles and the Sciences of Soil.” Social Epistemology 28, no. 1 (2014): 26-40.

Smith, Dorothy. Institutional Ethnography: A Sociology for People. AltaMira Press, 2005. Kindle edition.


