Liam Kofi Bright and Remco Heesen (2023) seek to identify a more pragmatic and fruitful criterion for demarcating academic from commercial research. Having described the limitations of criteria based on epistemic success, they propose adopting the Mertonian social norm of communism. In this comment, I argue that communism is insufficiently defined—by Merton, by the present authors, and by the academic research that informs their work. What’s more, I argue that much of current academic research fails to be scientific, even under a fairly broad definition of communism. Though their proposal is a step forward, more work remains to be done to employ communism as a criterion and, more generally, to resolve the demarcation problem. I close with some brief comments on communism and its relation to the burgeoning open science movement.

Image credit: GovernmentZA via Flickr / Creative Commons
Dunleavy, Daniel J. 2023. “Is Academic Research Sufficiently Communist? A Comment on Bright and Heesen.” Social Epistemology Review and Reply Collective 12 (8): 42–47. https://wp.me/p1Bfg0-81J.
🔹 The PDF of the article gives specific page numbers.
❧ Bright, Liam Kofi, and Remco Heesen. 2023. “To Be Scientific Is to Be Communist.” Social Epistemology 37 (3): 249–258.
❦ Brown, Matthew J. 2023. “Good Science is Communist: A Reply to Bright and Heesen.” Social Epistemology Review and Reply Collective 12 (4): 22–26.
Communism: An Imprecise Term
What does it mean for a researcher, a line of inquiry, or a research field to be communist? To my knowledge, Merton never published a sustained exposition of the four CUDOS norms (communism, universalism, disinterestedness, and organized skepticism). Early in his career (Merton 1942/1973), he provides a somewhat implicit definition, referring to communism “… in the nontechnical and extended sense of common ownership of goods …” (274) and noting that its “antithesis” is secrecy (275). He elaborates that a researcher’s claim to intellectual property is limited to whatever “recognition and esteem” (274) is conferred upon them by social and professional institutions. Decades later, Zuckerman and Merton (1971) tie communism to the imperative that researchers openly communicate research findings (69). This definition of communism is affirmed by that presented in Cournand and Zuckerman (1970), who describe the norm as a prescription to share knowledge with the broader scientific community, whose members “…may not limit access to their products” (942).
In their paper, Bright and Heesen state that “Communism says that one must make one’s work available to others for free, not try to maintain proprietary rights to it, and treat it as always properly open to the evaluation of the scientific community” (254). They go on to say that “[A claim] is scientific…if it is made appropriately available to the scientific community and proprietary rights are not claimed in any way that interferes with fellow researchers accessing, using, or evaluating it” (254). Finally, they clarify that communism is a continuous trait—something can be more or less communist (255) and thus more or less scientific.
To help justify their choice of communism, Bright and Heesen cite four papers that survey researchers’ general approval and endorsement of the CUDOS norms (254). Upon inspection, however, these papers define communism with varying degrees of precision.
In a survey by Macfarlane and Cheng (2008), we find that “Communism…refers to the common ownership of intellectual property” (71). They do not provide the full set of survey questions, but within the text they offer two statements which (the authors claim) represent confirming and disconfirming perspectives on communism: “Question 6: I am in favour of sharing my teaching materials with my peers.” “Question 13: I feel it is important to protect my individual property rights” (70, emphasis added).
The article by Kim and Kim (2018) states that “communalism [i.e., communism] refers to the … full and open communication of scientific findings” (3). They asked participants only two questions about communism, and the language used is even less clear than in Macfarlane and Cheng. They note that “The second norm of communalism was tapped by two survey questions asking about the ownership of research data and output, that is, the practice of delaying publication for intellectual property right and that of sharing research data” (Kim and Kim 2018, 8, emphasis added).
In Anderson et al. (2010), the authors state, “The first Mertonian norm is communality [sic]…the common ownership of scientific results and methods and the consequent imperative to share both freely” (3, emphasis added). They also note that, for their survey, “[I]tems constructed for use in the Acadia study (Anderson 2000, 447–448) are as follows: Communality norm: Scientists openly share new findings with colleagues. Secrecy counternorm: Scientists protect their newest findings to ensure priority in publishing, patenting, or applications” (7, emphasis added).
The final study, by Louis, Jones, and Campbell (2002), is more difficult to evaluate. The article appeared in a bimonthly column of the popular science magazine American Scientist. It describes a survey conducted by the authors, but the details are rather opaque. We know from the authors that the study assesses openness in sharing results (304); however, there is little detail on specific survey questions and minimal mention of Merton himself.
From the above, we see that some usages of communism seem to refer to the sharing of any knowledge or findings gained from research and scholarship (e.g., Anderson et al. 2010; Cournand and Zuckerman 1970; Zuckerman and Merton 1971). Other usages seem to point to the goods or products of one’s work (e.g., Macfarlane and Cheng 2008; Merton 1942/1973; Bright and Heesen 2023)—such as the sharing of published articles, technical reports, books, conference presentations, and teaching materials.
Ultimately, it is not clear to me where the boundaries of this norm lie. Is it satisfied by simply sharing a study’s results (i.e., the key primary and secondary findings), or does it compel researchers and scholars to share the entire manuscript and other published output? Or, still further, does it include the sharing of study data and materials (e.g., Kim and Kim 2018), the underlying code/syntax, and any other information necessary for computational reproduction and subsequent replication? What’s more, in what manner does such sharing take place (e.g., by complying after “reasonable request” or by openly sharing via a public repository)?
Without trying to be disingenuous, could this even mean sharing the idiosyncratic details of one’s day-to-day work—an often undocumented but essential part of the scientific process?[1] It is these questions that leave me wondering how “communism” may be precisified and made more fruitful (Carnap 1950)—whether ultimately employed as a demarcation criterion or not.
Much of Current Academic Research is “Unscientific”
Commercial (and industry-sponsored) research certainly has well-documented shortcomings (e.g., Goldacre 2012; Hopkins, Rowland, and Sorich 2018; Lundh et al. 2017; Turner et al. 2008), and its impact on the quality of academic research is worrisome (e.g., Bodenheimer 2000). However, the magnitude of these problems does not entail that academic research is, in and of itself, scientific. Putting aside the definitional difficulties described above—and taking for granted that communism is a widely supported virtue—we find that researchers simply do not widely put it into practice. The sharing of study results, data, and materials is highly variable within and between fields. A handful of case examples are illustrative.
In my home field of social work, Pendell (2018) took a random sample of articles (n = 638) published in 25 leading social work journals in 2014. Each of these journals permitted authors to upload a pre- or post-print to an institutional or disciplinary repository—with 20 journals also offering a paid open access option. More than half of the articles (52%) had no full-text access (1049). This figure may even be an underestimate: some of the accessible articles may have been uploaded in violation of publisher copyright agreements (e.g., via commercial platforms like ResearchGate) and thus may not remain openly available long-term.
Piwowar and colleagues (2018) took three samples of DOI-assigned articles (100,000 each, randomly drawn from three different scholarly resources) from across the physical and social sciences and the humanities. Estimates of the proportion of articles with closed access (i.e., no open access of any kind or “color”) ranged from 53.0%, 95% CI [52.7–53.3], for Unpaywall DOIs at the low end to 72.0%, 95% CI [71.8–72.4], for Crossref DOIs at the high end.
Similarly, academic researchers do not have a good historical track record when it comes to sharing the data underlying their work. Wicherts et al. (2006) requested data (via email) from the authors of 141 empirical articles published in four leading psychology journals. Of the 249 studies included in the sample, data were received for 64 (25.7%)—provided by a mere 38 authors (27%).[2] These findings were echoed in a follow-up study by Vanpaemel et al. (2015). When they requested data from articles (again in four leading psychology journals), 41% of authors (n = 161) did not reply. Another 18% (n = 69) refused or were unable to share data. Still another 4% did not share data, despite promising to do so.
When asked to share study data from trials published in two leading medical journals (BMJ and PLOS Medicine—both of which strongly encourage the practice as part of their publishing policies and guidelines), Naudet et al. (2018) received data for only 17 (46%) of 37 eligible studies.
Finally, an interesting example comes from Miyakawa (2020). As Editor-in-Chief of the journal Molecular Brain (which maintains a policy that authors will include all relevant raw data), Miyakawa has handled hundreds of submissions. Between early 2017 and September 2019, he made 41 decisions asking authors to revise a submission—by including raw data—before it would be sent out for review. Of these, 21 (51%) were withdrawn without the data being provided, and 19 (46%) were rejected due to inadequate data or data that did not match the reported study results. Only one of the 41 submissions provided adequate data and was eventually accepted for publication, after review.
These are only preliminary investigations (though see also Watson 2022), but they point toward a research culture that, while becoming more open, remains very much closed and opaque—even among researchers themselves.
Musings On Communism, Open Science, and Technology
Bright and Heesen (2023) close their paper with some favorable comments about the still-nascent Open Science movement and its relation to science broadly and to the communist norm (“Open Science initiatives embody the definitive feature of science…” 256). This is an assessment to which I am quite sympathetic (Dunleavy 2020, 2021). It recalls some comments by the late paleontologist and open science advocate Jon Tennant (Tennant and Breznau 2022), who once said:
[On what open science is] It’s a tautology. Science was always open. This is where we want to get to in 10 years. …[T]his is going to be the period when we woke up and realized that what we were doing before wasn’t really science. … Science without open is just anecdote, open science is just good science … (19).
But all of these sentiments give me some pause. The open sharing of data and materials, and the ease with which researchers and the public can access journal articles and other research products, is a recent phenomenon (at least at this scale). This suggests that how communist (or scientific) a study is depends partly on one’s access to—and the development of—technological resources[3] and on the capability to share more generally. These things may be outside the control of individual researchers and scholarly communities, regardless of their personal commitment to sharing, due to historical and sociocultural factors that may have no direct bearing on the conduct of the research itself (see generally Bezuidenhout et al. 2017 and Serwadda et al. 2018).
This is anticipated, to some extent, by Zuckerman and Merton (1971), who note that the printing press “…thus provided a technological basis for the emergence of … “communism”: the norm that prescribes the open communication of findings to other scientists and correlatively proscribes secrecy” (69), and who acknowledge the importance of “ancillary institutional inventions” (70) in shifting entrenched norms and attitudes.
This makes me wonder whether communism—a social norm deployed as a demarcation criterion—has enough elasticity to accommodate technological advances and institutional pressures, while still clearly separating commercial from academic research, in both contemporary and historical contexts.
Daniel Dunleavy, dunldj@gmail.com, completed his PhD in social work at Florida State University (Florida, USA) in 2020. His published work has focused primarily on issues in behavioral healthcare and research practice. He has interests in the philosophy and sociology of science, meta-research, and scholarly publishing (especially issues in peer review and open science).
References
Anderson, Melissa S., Emily A. Ronning, Raymond De Vries, and Brian C. Martinson. 2010. “Extending the Mertonian Norms: Scientists’ Subscription to Norms of Research.” The Journal of Higher Education 81 (3): 366–393. https://doi.org/10.1353/jhe.0.0095.
Bezuidenhout, Louise M., Sabina Leonelli, Ann H. Kelly, and Brian Rappert. 2017. “Beyond the Digital Divide: Towards a Situated Approach to Open Data.” Science and Public Policy 44 (4): 464–75. https://doi.org/10.1093/scipol/scw036.
Bodenheimer, Thomas. 2000. “Uneasy Alliance — Clinical Investigators and the Pharmaceutical Industry.” New England Journal of Medicine 342 (20): 1539–1544. https://doi.org/10.1056/NEJM200005183422024.
Bradley, Jean-Claude, Kevin Owens, and Antony Williams. 2008. “Chemistry Crowdsourcing and Open Notebook Science.” Nature Precedings: 1–8. https://doi.org/10.1038/npre.2008.1505.1.
Bright, Liam Kofi, and Remco Heesen. 2023. “To Be Scientific Is to Be Communist.” Social Epistemology 37 (3): 249–258. https://doi.org/10.1080/02691728.2022.2156308.
Carnap, Rudolf. 1950/1962. Logical Foundations of Probability. 2nd ed. Chicago: University of Chicago Press.
Clinio, Anne, and Sarita Albagli. 2017. “Open Notebook Science as an Emerging Epistemic Culture within the Open Science Movement.” Revue Française Des Sciences de l’information et de La Communication 11: 1–19. https://doi.org/10.4000/rfsic.3186.
Cournand, André F., and Harriet Zuckerman. 1970. “The Code of Science. Analysis and Some Reflections on Its Future.” Studium Generale 23 (10): 941–962. https://pubmed.ncbi.nlm.nih.gov/5483232/.
Dunleavy, Daniel J. 2020. “Coronavirus As Impetus for a Lasting Change in Research Culture.” SocArXiv. https://doi.org/10.31235/osf.io/2ryt3.
Dunleavy, Daniel J. 2021. “The Cultivation of Social Work Knowledge: Toward a More Robust System of Peer Review.” Families in Society 102 (4): 556–568. https://doi.org/10.1177/10443894211012243.
Goldacre, Ben. 2012. Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients. New York City: Fourth Estate.
Harding, Rachel J. 2018. “2 Years of Open Notebooking: Lessons Learnt from Labscribbles.” Zenodo. https://doi.org/10.5281/zenodo.1155846.
Harding, Rachel J. 2019. “Open Notebook Science Can Maximize Impact for Rare Disease Projects.” PLOS Biology 17 (1): e3000120. https://doi.org/10.1371/journal.pbio.3000120.
Hopkins, Ashley M., Andrew Rowland, and Michael J. Sorich. 2018. “Data Sharing from Pharmaceutical Industry Sponsored Clinical Studies: Audit of Data Availability.” BMC Medicine 16 (1): 165. https://doi.org/10.1186/s12916-018-1154-z.
Kim, So Young, and Yoonhoo Kim. 2018. “The Ethos of Science and Its Correlates: An Empirical Analysis of Scientists’ Endorsement of Mertonian Norms.” Science, Technology and Society 23 (1): 1–24. https://doi.org/10.1177/0971721817744438.
Louis, Karen Seashore, Lisa M. Jones, and Eric G. Campbell. 2002. “Macroscope: Sharing in Science.” American Scientist 90 (4): 304–307. https://www.jstor.org/stable/27857685.
Lundh, Andreas, Joel Lexchin, Barbara Mintzes, Jeppe B. Schroll, and Lisa Bero. 2017. “Industry Sponsorship and Research Outcome.” Cochrane Database of Systematic Reviews 2: MR000033. https://doi.org/10.1002/14651858.MR000033.pub3.
Macfarlane, Bruce, and Ming Cheng. 2008. “Communism, Universalism and Disinterestedness: Re-Examining Contemporary Support among Academics for Merton’s Scientific Norms.” Journal of Academic Ethics 6 (1): 67–78. https://doi.org/10.1007/s10805-008-9055-y.
Merton, Robert K. 1942/1973. “The Normative Structure of Science.” In The Sociology of Science: Theoretical and Empirical Investigations edited by Norman W. Storer, 267–278. Chicago: University of Chicago Press.
Miyakawa, Tsuyoshi. 2020. “No Raw Data, No Science: Another Possible Source of the Reproducibility Crisis.” Molecular Brain 13 (1): 24. https://doi.org/10.1186/s13041-020-0552-2.
Naudet, Florian, Charlotte Sakarovitch, Perrine Janiaud, Ioana Cristea, Daniele Fanelli, David Moher, and John P. A. Ioannidis. 2018. “Data Sharing and Reanalysis of Randomized Controlled Trials in Leading Biomedical Journals with a Full Data Sharing Policy: Survey of Studies Published in The BMJ and PLOS Medicine.” BMJ 360: k400. https://doi.org/10.1136/bmj.k400.
Pendell, Kimberly. 2018. “Behind the Wall: An Exploration of Public Access to Research Articles in Social Work Journals.” Advances in Social Work 18 (4): 1041–1052. https://doi.org/10.18060/22180.
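Piwowar, Heather, Jason Priem, Vincent Larivière, Juan Pablo Alperin, Lisa Matthias, Bree Norlander, Ashley Farley, Jevin West, and Stefanie Haustein. 2018. “The State of OA: A Large-Scale Analysis of the Prevalence and Impact of Open Access Articles.” PeerJ 6: e4375. https://doi.org/10.7717/peerj.4375.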
Serwadda, David, Paul Ndebele, M. Kate Grabowski, Francis Bajunirwe, and Rhoda K. Wanyenze. 2018. “Open Data Sharing and the Global South—Who Benefits?” Science 359 (6376): 642–643. https://doi.org/10.1126/science.aap8395.
Tennant, Jonathan, and Nate Breznau. 2022. “Legacy of Jon Tennant, ‘Open Science Is Just Good Science.’” SocArXiv. https://doi.org/10.31235/osf.io/hfns2.
Turner, Erick H., Annette M. Matthews, Eftihia Linardatos, Robert A. Tell, and Robert Rosenthal. 2008. “Selective Publication of Antidepressant Trials and Its Influence on Apparent Efficacy.” New England Journal of Medicine 358 (3): 252–260. https://doi.org/10.1056/NEJMsa065779.
Vanpaemel, Wolf, Maarten Vermorgen, Leen Deriemaecker, and Gert Storms. 2015. “Are We Wasting a Good Crisis? The Availability of Psychological Research Data after the Storm.” Collabra 1 (1): 1–5. https://doi.org/10.1525/collabra.13.
Watson, Clare. 2022. “Many Researchers Say They’ll Share Data — But Don’t.” Nature 606 (7916): 853. https://doi.org/10.1038/d41586-022-01692-1.
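Wicherts, Jelte M., Denny Borsboom, Judith Kats, and Dylan Molenaar. 2006. “The Poor Availability of Psychological Research Data for Reanalysis.” American Psychologist 61 (7): 726–728. https://doi.org/10.1037/0003-066X.61.7.726.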
Zuckerman, Harriet, and Robert K. Merton. 1971. “Patterns of Evaluation in Science: Institutionalisation, Structure and Functions of the Referee System.” Minerva 9 (1): 66–100. https://doi.org/10.1007/BF01553188.
[1] Bradley et al. (2008) and Harding (2018, 2019) demonstrate the value and importance of publicly documenting the day-to-day decisions, processes, and otherwise mundane aspects of a study that might not appear in a published paper. Sharing such information has benefits for interpretability and reproducibility, while increasing transparency and trust and promoting collaboration. For discussion in the context of the open science movement, see Clinio and Albagli (2017).
[2] This is to say nothing of the usability of such data, once received.
[3] The infrastructure supporting open science practices (i.e., data repositories and file-sharing platforms, high-speed internet, advanced statistical computing software, cloud and personal storage space, etc.) is a recent development in the broader history of science.