Author Information: Brian Martin, University of Wollongong, firstname.lastname@example.org.
Martin, Brian. “Bad Social Science.” Social Epistemology Review and Reply Collective 8, no. 3 (2019): 6–16.
People untrained in social science frameworks and methods often make assumptions, observations or conclusions about the social world. For example, they might say, “President Trump is a psychopath,” thereby making a judgement about Trump’s mental state. The point here is not whether this judgement is right or wrong, but whether it is based on a careful study of Trump’s thoughts and behaviour drawing on relevant expertise.
In most cases, the claim “President Trump is a psychopath” is bad psychology, in the sense that it is a conclusion reached without the application of skills in psychological diagnosis expected among professional psychologists and psychiatrists. Even a non-psychologist can recognise cruder forms of bad psychology: they lack the application of standard tools in the field, such as comparison of criteria for psychopathy with Trump’s thought and behaviour.
“Bad social science” here refers to claims about society and social relationships that fall very far short of what social scientists consider good scholarship. This might be due to using false or misleading evidence, making faulty arguments, drawing unsupported conclusions or various other severe methodological, empirical or theoretical deficiencies.
In all sorts of public commentary and private conversations, examples of bad social science are legion. Instances are so common that it may seem pointless to take note of problems with ill-informed claims. However, there is value in a more systematic examination of different sorts of everyday bad social science. Such an examination can point to what is important in doing good social science and to weaknesses in assumptions, evidence and argumentation. It can also provide insights into how to defend and promote high-quality social analysis.
Here, I illustrate several facets of bad social science found in a specific public scientific controversy: the Australian vaccination debate. It is a public debate in which many partisans make claims about social dynamics, so there is ample material for analysis. In addition, because the debate is highly polarised, involves strong emotions and is extremely rancorous, it is to be expected that many deviations from calm, rational, polite discourse would be on display.
Another reason for selecting this topic is that I have been studying the debate for many years and have been drawn into it as a “captive of controversy.” Several of the types of bad social science are found on both sides of the debate. Here, I focus mainly on pro-vaccination campaigners, for reasons that will become clear.
In the following sections, I address several facets of bad social science: ad hominem attacks, not defining terms, use of limited and dubious evidence, misrepresentation, lack of reference to alternative viewpoints, lack of quality control, and drawing of unjustified conclusions. In each case, I provide examples from the Australian public vaccination debate, drawing on my experience. In a sense, selecting these topics represents an informal application of grounded theory: each of the shortcomings became evident to me through encountering numerous instances. After this, I note that there is a greater risk of deficient argumentation when defending orthodoxy.
With this background, I outline how studying bad social science can be of benefit in three ways: as a pointer to particular areas in which it is important to maintain high standards, as a toolkit for responding to attacks on social science, and as a reminder of the need to improve public understanding of social science approaches.
Ad Hominem Attacks

In the Australian vaccination debate, many partisans make adverse comments about opponents as a means of discrediting them. Social scientists recognise that ad hominem argumentation, namely attacking the person rather than dealing with what they say, is illegitimate for the purposes of making a case.
In the mid 1990s, Meryl Dorey founded the Australian Vaccination Network (AVN), which became the leading citizens’ group critical of government vaccination policy. In 2009, a pro-vaccination citizens’ group called Stop the Australian Vaccination Network (SAVN) was set up with the stated aim of discrediting and shutting down the AVN. SAVNers referred to Dorey with a wide range of epithets, for example “cunt.”
What is interesting here is that some ad hominem attacks contain an implicit social analysis. One of them is “liar.” SAVNer Ken McLeod accused Dorey of being a liar, giving various examples. However, some of these examples show only that Dorey persisted in making claims that SAVNers believed had been refuted. This does not necessarily constitute lying, if lying is defined, as it often is by researchers in the area, as consciously intending to deceive. To the extent that McLeod failed to relate his claims to research in the field, his application of the label “liar” constitutes bad social science.
Another term applied to vaccine critics is “babykiller.” In the Australian context, this word contains an implied social analysis, based on these premises: public questioning of vaccination policy causes some parents not to have their children vaccinated, leading to reduced vaccination rates and thence to more children dying of infectious diseases.
“Babykiller” also contains a moral judgement, namely that public critics of vaccination are culpable for the deaths of children from vaccination-preventable diseases. Few of those applying the term “babykiller” provide evidence to back up the implicit social analysis and judgement, so the label in these instances represents bad social science.
There are numerous other examples of ad hominem in the vaccination debate, on both sides. Some, such as “cunt,” are primarily abuse. Others, though, contain an associated or implied social analysis, so to judge the quality of such an attack it is necessary to assess whether that analysis conforms to conventions within social science.
Not Defining Terms

In social science, it is normal to define key concepts, either by explicit definitions or descriptive accounts. The point is to provide clarity when the concept is used.
One of the terms used by vaccination supporters in the Australian debate is “anti-vaxxer.” Despite the ubiquity of this term in social and mass media, I have never seen it defined. This is significant because of the considerable ambiguity involved. “Anti-vaxxer” might refer to parents who refuse all vaccines for their children and themselves, parents who have their children receive some but not all recommended vaccines, parents who express reservations about vaccination, and/or campaigners who criticise vaccination policy.
The way “anti-vaxxer” is applied in practice tends to conflate these different meanings, with the implication that any criticism of vaccination puts you in the camp of those who refuse all vaccines. The label “anti-vaxxer” has been applied to me even though I do not have a strong view about vaccination.
Because of the lack of a definition or clear meaning, the term “anti-vaxxer” is a form of ad hominem and also represents bad social science. Tellingly, few social scientists studying the vaccination issue use the term descriptively.
In their publications, social scientists may not define every term they use, because many meanings are commonly accepted in the field. Nearly always, though, any widely used concept receives close attention from some researchers. When such a concept remains ill-defined, this may be a sign of bad social science, especially when it is used as a pejorative label.
Limited and Dubious Evidence
Social scientists normally seek to provide strong evidence for their claims and restrict their claims to what the evidence can support. In public debates, this caution is often disregarded.
After SAVN was formed in 2009, one of its initial claims was that the AVN believed in a global conspiracy to implant mind-control chips via vaccinations. The key piece of evidence SAVNers provided to support this claim was that Meryl Dorey had given a link to the website of David Icke, who was known to have some weird beliefs, such as that the world is ruled by shape-shifting reptilian humanoids.
The weakness of this evidence should be apparent. Just because Icke has some weird beliefs does not mean every document on his website involves adherence to weird beliefs, and just because Dorey provided a link to a document does not prove she believes in everything in the document, much less subscribes to the beliefs of the owner of the website. Furthermore, Dorey denied believing in a mind-control global conspiracy.
Finally, even if Dorey had believed in this conspiracy, this does not mean other members of the AVN, or the AVN as an organisation, believed in the conspiracy. Although the evidence was exceedingly weak, several SAVNers, after I confronted them on the matter, initially refused to back down from their claims.
Misrepresentation

When studying an issue, scholars assume that evidence, sources and other material should be represented fairly. For example, a quotation from an author should fairly present the author’s views, and not be used out of context to show something different than what the author intended.
Quite a few campaigners in the Australian vaccination debate take a different approach, which might be called “gotcha.” Quotes are used to expose writers as incompetent, misguided or deluded, and authors’ views are misrepresented as a means of discrediting and dismissing them.
Judy Wilyman did her PhD under my supervision and was the target of attacks for years before she graduated. On 13 January 2016, just two days after her thesis was posted online, it was the subject of a front-page story in the daily newspaper The Australian. The journalist, despite having been pointed to a convenient summary of the thesis, did not mention any of its key ideas, instead claiming that it involved a conspiracy theory. Quotes from the thesis, taken out of context, were paraded as evidence of inadequacy.
This journalistic misrepresentation of Judy’s thesis was remarkably influential. It led to a cascade of hostile commentary, with hundreds of online comments on the numerous stories in The Australian, an online petition signed by thousands of people, and calls by scientists for Judy’s PhD to be revoked. In all the furore, not a single critic of her thesis posted a fair-minded summary of its contents.
Lack of Reference to Alternative Viewpoints

In high-quality social science, it is common to defend a viewpoint, but it is also considered appropriate to examine other perspectives. Indeed, when presenting a critique, it is usual to begin with a summary of the work to be criticised.
In the Australian vaccination debate, partisans do not even attempt to present the opposing side’s viewpoint. I have never seen any campaigner provide a summary of the evidence and arguments supporting the opposition’s viewpoint. Vaccination critics present evidence and arguments that cast doubt on the government’s vaccination policy, and never try to summarise the evidence and arguments supporting it. Likewise, backers of the government’s policy never try to summarise the case against it.
There are also some intermediate viewpoints, divergent from the entrenched positions in the public debate. For example, there are some commentators who support some vaccines but not all the government-recommended ones, or who support single vaccines rather than multiple vaccines. These non-standard positions are hardly ever discussed in public by pro-vaccination campaigners. More commonly, they are implicitly subsumed by the label “anti-vaxxer.”
To find summaries of the arguments and evidence on both sides, it is necessary to turn to the work of social scientists, and even then only to the few who study the debate without arguing for one side or the other.
Lack of Quality Control

When making a claim, it makes sense to check it. Social scientists commonly do this by checking sources and/or by relying on peer review. For contemporary issues, it is often possible to check with the person concerned.
In the Australian vaccination debate, there seems to be little attempt to check claims, especially when they are derogatory claims about opponents. I can speak from personal experience. Quite a number of SAVNers have made comments about my work, for example in blogs. On not a single occasion has any one of them checked with me in advance of publication.
After SAVN was formed and I started writing about free speech in the Australian vaccination debate, I sent drafts of some of my papers to SAVNers for comment. Rather than using this opportunity to send me corrections and comments, the response was to attack me, including by making complaints to my university. Interestingly, the only SAVNer to have been helpful in commenting on drafts is another academic.
Another example concerns Andrew Wakefield, a gastroenterologist who was lead author of a paper in The Lancet suggesting that a possible link between the MMR triple vaccine (measles, mumps and rubella) and autism should be investigated. The paper led to a storm of media attention.
Australian pro-vaccination campaigns, and quite a few media reports, refer to Wakefield’s alleged wrongdoings, treating them as discrediting any criticism of vaccination. Incorrect statements about Wakefield are commonplace, for example that he lost his medical licence due to scientific fraud. It is a simple matter to check the facts, but apparently few do this. Even fewer take the trouble to look into the claims and counterclaims about Wakefield and qualify their statements accordingly.
Unjustified Conclusions

Social scientists are trained to be cautious in drawing conclusions, ensuring that they do not go beyond what can be justified from data and arguments. In addition, it is standard to include a discussion of limitations. This sort of caution is often absent in public debates.
SAVNers have claimed great success in their campaign against the AVN, giving evidence that, for example, their efforts have prevented AVN talks from being held and reduced media coverage of vaccine critics. However, although AVN operations have undoubtedly been hampered, this does not necessarily show that vaccination rates have increased or, more importantly, that public health has benefited.
Defending Orthodoxy

Many social scientists undertake research in controversial areas. Some support the dominant views, some support an unorthodox position and quite a few try not to take a stand. There is no inherent problem in supporting the orthodox position, but doing so brings greater risks to the quality of research.
Many SAVNers assume that vaccination is a scientific issue and that only people with scientific credentials, for example degrees or publications in virology or epidemiology, have any credibility. This was apparent in an article by philosopher Patrick Stokes entitled “No, you’re not entitled to your opinion” that received high praise from SAVNers. It was also apparent in the attack on Judy Wilyman, whose PhD was criticised because it was not in a scientific field, and because she analysed scientific claims without being a scientist. The claim that only scientists can validly criticise vaccination is easily countered. The problem for SAVNers is that they are less likely to question assumptions precisely because they support the dominant viewpoint.
There is a fascinating aspect to campaigners supporting orthodoxy: they themselves frequently make claims about vaccination although they are not scientists with relevant qualifications. They do not apply their own strictures about necessary expertise to themselves. This can be explained as deriving from “honour by association,” a process parallel to guilt by association but less noticed because it is so common. In honour by association, a person gains or assumes greater credibility by being associated with a prestigious person, group or view.
Someone without special expertise who asserts a claim that supports orthodoxy implicitly takes on the mantle of the experts on the side of orthodoxy. It is only those who challenge orthodoxy who are expected to have relevant credentials. There is nothing inherently wrong with supporting the orthodox view, but it does mean there is less pressure to examine assumptions.
My initial example of bad social science was calling Donald Trump a psychopath. Suppose instead you said Trump has narcissistic personality disorder. This might not seem to be bad social science, because it accords with the views of many psychologists. However, agreeing with orthodoxy without an accompanying deployment of expertise does not constitute good social science any more than disagreeing with orthodoxy does.
It is all too easy to identify examples of bad social science in popular commentary. They are commonplace in political campaigning and in everyday conversations.
Being attuned to common violations of good practice has three potential benefits: as a useful reminder to maintain high standards; as a toolkit for responding to attacks on social science; and as a guide to encouraging greater public awareness of social scientific thinking and methods.
Bad Social Science as a Reminder to Maintain High Standards
Most of the kinds of bad social science prevalent in the Australian vaccination debate seldom receive extended attention in the social science literature. For example, the widely used and cited textbook Social Research Methods does not even mention ad hominem, presumably because avoiding it is so basic that it need not be discussed.
The book describes five common errors in everyday thinking that social scientists should avoid: overgeneralisation, selective observation, premature closure, the halo effect and false consensus. Some of these overlap with the shortcomings I have observed in the Australian vaccination debate. For example, the halo effect, in which prestigious sources are given more credibility, has affinities with honour by association.
The textbook The Craft of Research likewise does not mention ad hominem. In a final brief section on the ethics of research, there are a couple of points that can be applied to the vaccination debate. For example, ethical researchers “do not caricature or distort opposing views.” Another recommendation: “When you acknowledge your readers’ alternative views, including their strongest objections and reservations,” you move towards more reliable knowledge and honour readers’ dignity. Compared with the careful exposition of research methods in this and other texts, the shortcomings seen in public debates seem so basic and obvious as not to warrant extended discussion.
No doubt many social scientists could point to the work of others in the field — or even their own — as failing to meet the highest standards. Looking at examples of bad social science can provide a reminder of what to avoid. For example, being aware of ad hominem argumentation can help in avoiding subtle denigration of authors and instead focusing entirely on their evidence and arguments. Being reminded of confirmation bias can encourage exploration of a greater diversity of viewpoints.
Malcolm Wright and Scott Armstrong examined 50 articles that cited a method in survey-based research that Armstrong had developed years earlier. They discovered that only one of the 50 studies had reported the method correctly. They recommend that researchers send drafts of their work to authors of cited studies — especially those on which the research depends most heavily — to ensure accuracy. This is not a common practice in any field of scholarship but is worth considering in the interests of improving quality.
Bad Social Science as a Toolkit for Responding to Attacks
Alan Sokal wrote an intentionally incoherent article that was published in 1996 in the cultural studies journal Social Text. Numerous commentators lauded Sokal for carrying out an audacious prank that revealed the truth about cultural studies, namely that it was bunk. These commentators had not carried out relevant studies themselves, nor were most of them familiar with the field of cultural studies, including its frameworks, objects of study, methods of analysis, conclusions and exemplary pieces of scholarship.
To the extent that these commentators were uninformed about cultural studies yet willing to praise Sokal for his hoax, they were involved in a sort of bad social science. Perhaps they supported Sokal’s hoax because it agreed with their preconceived ideas, though investigation would be needed to assess this hypothesis.
Most responses to the hoax took a defensive line, for example arguing that Sokal’s conclusions were not justified. Only a few argued that interpreting the hoax as showing the vacuity of cultural studies was itself poor social science. Sokal himself said it was inappropriate to draw general conclusions about cultural studies from the hoax, so ironically it would have been possible to respond to attackers by quoting Sokal.
When social scientists come under attack, it can be useful to examine the evidence and methods used or cited by the attackers, and to point out, as is often the case, that they fail to measure up to standards in the field.
Encouraging Greater Public Awareness of Social Science Thinking and Methods
It is easy to communicate with like-minded scholars and commiserate about the ignorance of those who misunderstand or wilfully misrepresent social science. More challenging is to pay close attention to the characteristic ways in which people make assumptions and reason about the social world and how these ways often fall far short of the standards expected in scholarly circles.
By identifying common forms of bad social science, it may be possible to better design interventions into public discourse to encourage more rigorous thinking about evidence and argument, especially to counter spurious and ill-founded claims by partisans in public debates.
Conclusion

Social scientists, in looking at research contributions, usually focus on what is high quality: the deepest insights, the tightest arguments, the most comprehensive data, the most sophisticated analysis and the most elegant writing. This makes sense: top quality contributions offer worthwhile models to learn from and emulate.
Nevertheless, there is also a role for learning from poor quality contributions. It is instructive to look at public debates involving social issues in which people make judgements about the same sorts of matters that are investigated by social scientists, everything from criminal justice to social mores. Contributions to public debates can starkly show flaws in reasoning and the use of evidence. These flaws provide a useful reminder of things to avoid.
Observation of the Australian vaccination debate reveals several types of bad social science, including ad hominem attacks, failing to define terms, relying on dubious sources, failing to provide context, and not checking claims. The risk of succumbing to these shortcomings seems to be magnified when the orthodox viewpoint is being supported, because it is assumed to be correct and there is less likelihood of being held accountable by opponents.
There is something additional that social scientists can learn by studying contributions to public debates that have serious empirical and theoretical shortcomings. There are likely to be characteristic failures that occur repeatedly. These offer supplementary guidance for what to avoid. They also provide insight into what sort of training, for aspiring social scientists, is useful for moving from unreflective arguments to careful research.
There is also a challenge that few scholars have tackled. Given the prevalence of bad social science in many public debates, is it possible to intervene in these debates in a way that fosters greater appreciation for what is involved in good quality scholarship, and encourages campaigners to aspire to make sounder contributions?
Contact details: email@example.com
References

Blume, Stuart. Immunization: How Vaccines Became Controversial. London: Reaktion Books, 2017.

Booth, Wayne C., Gregory G. Colomb, Joseph M. Williams, Joseph Bizup and William T. FitzGerald. The Craft of Research, fourth edition. Chicago: University of Chicago Press, 2016.

Collier, David, Fernando Daniel Hidalgo and Andra Olivia Maciuceanu. “Essentially Contested Concepts: Debates and Applications.” Journal of Political Ideologies 11, no. 3 (2006): 211–246.

Ekman, Paul. Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage. New York: Norton, 1985.

Hilgartner, Stephen. “The Sokal Affair in Context.” Science, Technology, & Human Values 22, no. 4 (1997): 506–522.

Lee, Bandy X. The Dangerous Case of Donald Trump: 27 Psychiatrists and Mental Health Experts Assess a President. New York: St. Martin’s Press, 2017.

Martin, Brian, and Florencia Peña Saint Martin. “El mobbing en la esfera pública: el fenómeno y sus características” [Public mobbing: a phenomenon and its features]. In Organización social del trabajo en la posmodernidad: salud mental, ambientes laborales y vida cotidiana, edited by Norma González González, 91–114. Guadalajara, Jalisco, México: Prometeo Editores, 2014.

Martin, Brian. “Debating Vaccination: Understanding the Attack on the Australian Vaccination Network.” Living Wisdom, no. 8 (2011): 14–40.

Martin, Brian. “On the Suppression of Vaccination Dissent.” Science & Engineering Ethics 21, no. 1 (2015): 143–157.

Martin, Brian. “Defending University Integrity.” International Journal for Educational Integrity 13, no. 1 (2017): 1–14.

Martin, Brian. “Evidence-Based Campaigning.” Archives of Public Health 76, no. 54 (2018). https://doi.org/10.1186/s13690-018-0302-4.

Martin, Brian. “Persistent Bias on Wikipedia: Methods and Responses.” Social Science Computer Review 36, no. 3 (2018): 379–388.

Martin, Brian. Vaccination Panic in Australia. Sparsnäs, Sweden: Irene Publishing, 2018.

McLeod, Ken. “Meryl Dorey’s Trouble with the Truth, Part 1: How Meryl Dorey Lies, Obfuscates, Prevaricates, Exaggerates, Confabulates and Confuses in Promoting Her Anti-Vaccination Agenda.” 2010. http://www.scribd.com/doc/47704677/Meryl-Doreys-Trouble-With-the-Truth-Part-1.

Neuman, W. Lawrence. Social Research Methods: Qualitative and Quantitative Approaches, seventh edition. Boston, MA: Pearson, 2011.

Scott, Pam, Evelleen Richards and Brian Martin. “Captives of Controversy: The Myth of the Neutral Social Researcher in Contemporary Scientific Controversies.” Science, Technology, & Human Values 15, no. 4 (1990): 474–494.

Sokal, Alan D. “What the Social Text Affair Does and Does Not Prove.” In A House Built on Sand: Exposing Postmodernist Myths about Science, edited by Noretta Koertge, 9–22. New York: Oxford University Press, 1998.

Stokes, Patrick. “No, You’re Not Entitled to Your Opinion.” The Conversation, 5 October 2012. https://theconversation.com/no-youre-not-entitled-to-your-opinion-9978.

Wright, Malcolm, and J. Scott Armstrong. “The Ombudsman: Verification of Citations: Fawlty Towers of Knowledge?” Interfaces 38, no. 2 (2008): 125–132.
Notes

Thanks to Meryl Dorey, Stephen Hilgartner, Larry Neuman, Alan Sokal and Malcolm Wright for valuable feedback on drafts.
 For informed commentary on these issues, see Bandy X. Lee, The Dangerous Case of Donald Trump: 27 Psychiatrists and Mental Health Experts Assess a President (New York: St. Martin’s Press, 2017).
 Pam Scott, Evelleen Richards and Brian Martin, “Captives of controversy: the myth of the neutral social researcher in contemporary scientific controversies,” Science, Technology, & Human Values, Vol. 15, No. 4, Fall 1990, pp. 474–494.
 The AVN, forced to change its name in 2014, became the Australian Vaccination-skeptics Network. In 2018 it voluntarily changed its name to the Australian Vaccination-risks Network.
 In 2014, SAVN changed its name to Stop the Australian (Anti-)Vaccination Network.
 Brian Martin and Florencia Peña Saint Martin. El mobbing en la esfera pública: el fenómeno y sus características [Public mobbing: a phenomenon and its features]. In Norma González González (Coordinadora), Organización social del trabajo en la posmodernidad: salud mental, ambientes laborales y vida cotidiana (Guadalajara, Jalisco, México: Prometeo Editores, 2014), pp. 91-114.
 Ken McLeod, “Meryl Dorey’s trouble with the truth, part 1: how Meryl Dorey lies, obfuscates, prevaricates, exaggerates, confabulates and confuses in promoting her anti-vaccination agenda,” 2010, http://www.scribd.com/doc/47704677/Meryl-Doreys-Trouble-With-the-Truth-Part-1.
 Brian Martin, “Debating vaccination: understanding the attack on the Australian Vaccination Network,” Living Wisdom, no. 8, 2011, pp. 14–40, at pp. 28–30.
 E.g., Paul Ekman, Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage (New York: Norton, 1985).
 On Wikipedia I am categorised as an “anti-vaccination activist,” a term that is not defined on the entry listing those in the category. See Brian Martin, “Persistent bias on Wikipedia: methods and responses,” Social Science Computer Review, Vol. 36, No. 3, June 2018, pp. 379–388.
 See for example David Collier, Fernando Daniel Hidalgo and Andra Olivia Maciuceanu, “Essentially contested concepts: debates and applications,” Journal of Political Ideologies, 11(3), October 2006, pp. 211–246.
 The only possible exception to this statement is Michael Brull, “Anti-vaccination cranks versus academic freedom,” New Matilda, 7 February 2016, who reproduced my own summary of the key points in the thesis relevant to Australian government vaccination policy. For my responses to the attack, see http://www.bmartin.cc/pubs/controversy.html – Wilyman, for example “Defending university integrity,” International Journal for Educational Integrity, Vol. 13, No. 1, 2017, pp. 1–14.
 Brian Martin, Vaccination Panic in Australia (Sparsnäs, Sweden: Irene Publishing, 2018), pp. 15–24.
 E.g., Stuart Blume, Immunization: How Vaccines Became Controversial (London: Reaktion Books, 2017).
 For my own commentary on Wakefield, see “On the suppression of vaccination dissent,” Science & Engineering Ethics, Vol. 21, No. 1, 2015, pp. 143–157.
 Patrick Stokes, “No, you’re not entitled to your opinion,” The Conversation, 5 October 2012, https://theconversation.com/no-youre-not-entitled-to-your-opinion-9978.
 Martin, Vaccination Panic in Australia, 292–304.
 W. Lawrence Neuman, Social Research Methods: Qualitative and Quantitative Approaches, seventh edition (Boston, MA: Pearson, 2011), 3–5.
 Wayne C. Booth, Gregory G. Colomb, Joseph M. Williams, Joseph Bizup and William T. FitzGerald, The Craft of Research, fourth edition (Chicago: University of Chicago Press, 2016), 272–273.
 Malcolm Wright and J. Scott Armstrong, “The ombudsman: verification of citations: fawlty towers of knowledge?” Interfaces, 38 (2), March-April 2008, 125–132.
 For a detailed articulation of this approach, see Stephen Hilgartner, “The Sokal affair in context,” Science, Technology, & Human Values, 22(4), Autumn 1997, pp. 506–522. Hilgartner gives numerous citations to expansive interpretations of the significance of the hoax.
 See for example Alan D. Sokal, “What the Social Text affair does and does not prove,” in Noretta Koertge (ed.), A House Built on Sand: Exposing Postmodernist Myths about Science (New York: Oxford University Press, 1998), pp. 9–22, at p. 11: “From the mere fact of publication of my parody, I think that not much can be deduced. It doesn’t prove that the whole field of cultural studies, or the cultural studies of science — much less the sociology of science — is nonsense.”