
Author Information: Raphael Sassower, University of Colorado, Colorado Springs, rsassowe@uccs.edu.

Sassower, Raphael. “Post-Truths and Inconvenient Facts.” Social Epistemology Review and Reply Collective 7, no. 8 (2018): 47-60.

The pdf of the article gives specific page references. Shortlink: https://wp.me/p1Bfg0-40g

Can one truly refuse to believe facts?
Image by Oxfam International via Flickr / Creative Commons


If nothing else, Steve Fuller has his finger on the pulse of popular culture and the academics who engage in its twists and turns. Starting with Brexit and continuing into the Trump-era abyss, “post-truth” was dubbed word of the year for 2016 by Oxford Dictionaries. Fuller has mustered his collected publications to recast the debate over post-truth and frame it within STS in general and his own contributions to social epistemology in particular.

This could have been a public mea culpa of sorts: we, the community of sociologists (and some straggling philosophers and anthropologists and perhaps some poststructuralists) may seem to someone who isn’t reading our critiques carefully to be partially responsible for legitimating the dismissal of empirical data, evidence-based statements, and the means by which scientific claims can be deemed not only credible but true. Instead, we are dazzled by a range of topics (historically anchored) that explain how we got to Brexit and Trump—yet Fuller’s analyses of them don’t ring alarm bells. There is almost a hidden glee that indeed the privileged scientific establishment, insular scientific discourse, and some of its experts who pontificate authoritative consensus claims are all bound to be undone by the rebellion of mavericks and iconoclasts that include intelligent design promoters and neoliberal freedom fighters.

In what follows, I do not intend to summarize the book, as it is short and entertaining enough for anyone to read on their own. Instead, I wish to outline three interrelated points that one might argue need not be argued but, apparently, do: 1) certain critiques of science have contributed to the Trumpist mindset; 2) the politics of Trumpism is too dangerous to be sanguine about; 3) the post-truth condition is troublesome and insidious. Though Fuller deals with some of these issues, I hope to add some constructive clarification to them.

Part One: Critiques of Science

As Theodor Adorno reminds us, critique is essential not only for philosophy, but also for democracy. He is aware that the “critic becomes a divisive influence, with a totalitarian phrase, a subversive” (1998/1963, 283) insofar as the status quo is being challenged and sacred political institutions might have to change. The price of critique, then, can be high, and therefore critique should be managed carefully and only cautiously deployed. Should we refrain from critique, then? Not at all, continues Adorno.

But if you think that a broad, useful distinction can be offered among different critiques, think again: “[In] the division between responsible critique, namely, that practiced by those who bear public responsibility, and irresponsible critique, namely, that practiced by those who cannot be held accountable for the consequences, critique is already neutralized.” (Ibid. 285) Adorno’s worry is not only that one forgets that “the truth content of critique alone should be that authority [that decides if it’s responsible],” but that when such a criterion is “unilaterally invoked,” critique itself can lose its power and be at the service “of those who oppose the critical spirit of a democratic society.” (Ibid)

In a political setting, the charge of irresponsible critique shuts the conversation down and ensures political hegemony without disruptions. Modifying Adorno’s distinction between (politically) responsible and irresponsible critiques, responsible scientific critiques are constructive insofar as they attempt to improve methods of inquiry, data collection and analysis, and contribute to the accumulated knowledge of a community; irresponsible scientific critiques are those whose goal is to undermine the very quest for objective knowledge and the means by which such knowledge can be ascertained. Questions about the legitimacy of scientific authority are related to but not of exclusive importance for these critiques.

Have those of us committed to the critique of science missed the mark of the distinction between responsible and irresponsible critiques? Have we become so subversive and perhaps self-righteous that science itself has been threatened? Though Fuller is primarily concerned with the hegemony of the sociology of science studies and the movement he has championed under the banner of “social epistemology” since the 1980s, he does acknowledge the Popperians and their critique of scientific progress and even admires the Popperian contribution to the scientific enterprise.

But he is reluctant to recognize the contributions of Marxists, poststructuralists, and postmodernists who have been critically engaging the power of science since the 19th century. Among them, we find Jean-François Lyotard who, in The Postmodern Condition (1984/1979), follows Marxists and neo-Marxists who have regularly lumped science and scientific discourse with capitalism and power. This critical trajectory has been well rehearsed, so suffice it here to say, SSK, SE, and the Edinburgh “Strong Programme” are part of a long and rich critical tradition (whose origins are Marxist). Adorno’s Frankfurt School is part of this tradition, and as we think about science, which had come to dominate Western culture by the 20th century (in the place of religion, whose power had by then waned as the arbiter of truth), it was its privileged power and interlocking financial benefits that drew the ire of critics.

Were these critics “responsible” in Adorno’s political sense? Can they be held accountable for offering (scientific and not political) critiques that improve the scientific process of adjudication between criteria of empirical validity and logical consistency? Not always. Did they realize that their success could throw the baby out with the bathwater? Not always. While Fuller grants Karl Popper the upper hand (as compared to Thomas Kuhn) when indirectly addressing such questions, we must keep an eye on Fuller’s “baby.” It’s easy to overlook the slippage from the political to the scientific and vice versa: Popper’s claim that we never know the Truth doesn’t mean that his (and our) quest for discovering the Truth as such is given up; it is only made more difficult, since whatever is scientifically apprehended as truth remains putative.

Limits to Skepticism

What is precious about the baby—science in general, and scientific discourse and its community in more particular ways—is that it offered safeguards against frivolous skepticism. Robert Merton (1973/1942) famously outlined the four features of the scientific ethos, principles that characterized the ideal workings of the scientific community: universalism, communism (communalism, as per the Cold War terror), disinterestedness, and organized skepticism. It is the last principle that is relevant here, since it unequivocally demands an institutionalized mindset of putative acceptance of any hypothesis or theory that is articulated by any community member.

One detects the slippery slope that would move one from being on guard when engaged with any proposal to being so skeptical as to never accept any proposal no matter how well documented or empirically supported. Al Gore, in his An Inconvenient Truth (2006), sounded the alarm about climate change. A dozen years later we are still plagued by climate-change deniers who refuse to look at the evidence, suggesting instead that the standards of science themselves—from the collection of data at the North Pole to computer simulations—have not been sufficiently met (“questions remain”) to accept human responsibility for the increase of the earth’s temperature. Incidentally, here is Fuller’s explanation of his own apparent doubt about climate change:

Consider someone like myself who was born in the midst of the Cold War. In my lifetime, scientific predictions surrounding global climate change has [sic.] veered from a deep frozen to an overheated version of the apocalypse, based on a combination of improved data, models and, not least, a geopolitical paradigm shift that has come to downplay the likelihood of a total nuclear war. Why, then, should I not expect a significant, if not comparable, alteration of collective scientific judgement in the rest of my lifetime? (86)

Expecting changes in the model does not entail a) that no improved model can be offered; b) that methodological changes in themselves are a bad thing (they might be, rather, improvements); or c) that one should not take action at all based on the current model because in the future the model might change.

The Royal Society of London (1660) set the benchmark of scientific credibility low when it accepted as scientific evidence any report by two independent witnesses. As the years went by, testability (“confirmation,” for the Vienna Circle, “falsification,” for Popper) and repeatability were added as requirements for a report to be considered scientific, and by now, various other conditions have been proposed. Skepticism, organized or personal, remains at the very heart of the scientific march towards certainty (or at least high probability), but when used perniciously, it has derailed reasonable attempts to use science as a means by which to protect, for example, public health.

Both Michael Bowker (2003) and Robert Proctor (1995) chronicle cases where asbestos and cigarette lobbyists and lawyers alike were able to sow enough doubt in the name of attenuated scientific data collection to ward off regulators, legislators, and the courts for decades. Instead of finding sufficient empirical evidence to attribute the failing health (and deaths) of workers and consumers to asbestos and nicotine, “organized skepticism” was weaponized to fight the sick and protect the interests of large corporations and their insurers.

Instead of buttressing scientific claims (that have passed the tests—in refereed professional conferences and publications, for example—of most institutional scientific skeptics), organized skepticism has been manipulated to ensure that no claim is ever scientific enough or carries the legitimacy of the scientific community. In other words, what should have remained the reasonable cautionary tale of a disinterested and communal activity (that could then be deemed universally credible) has turned into a circus of fire-blowing clowns ready to burn down the tent. The public remains confused, not realizing that just because the stakes have risen over the decades does not mean there are no standards that ever can be met. Despite lobbyists’ and lawyers’ best efforts at derailment, courts eventually found cigarette companies and asbestos manufacturers guilty of exposing workers and consumers to deadly hazards.

Limits to Belief

If we add to this logic of doubt, which has been responsible for discrediting science and the conditions for proposing credible claims, a bit of U.S. cultural history, we may enjoy a more comprehensive picture of the unintended consequences of certain critiques of science. Citing Kurt Andersen (2017), Robert Darnton suggests that the Enlightenment’s “rational individualism interacted with the older Puritan faith in the individual’s inner knowledge of the ways of Providence, and the result was a peculiarly American conviction about everyone’s unmediated access to reality, whether in the natural world or the spiritual world. If we believe it, it must be true.” (2018, 68)

This way of thinking—unmediated experiences and beliefs, unconfirmed observations, and disregard of others’ experiences and beliefs—continues what Richard Hofstadter (1962) dubbed “anti-intellectualism.” For Americans, this predates the republic and is characterized by a hostility towards the life of the mind (admittedly, at the time, religious texts), critical thinking (self-reflection and the rules of logic), and even literacy. The heart (our emotions) can more honestly lead us to the Promised Land, whether it is heaven on earth in the Americas or the Christian afterlife; any textual interference or reflective pondering is necessarily an impediment, one to be suspicious of and avoided.

This lethal combination of the life of the heart and righteous individualism brings about general ignorance and what psychologists call “confirmation bias” (the tendency to endorse what we already believe to be true regardless of countervailing evidence). The critique of science, along this trajectory, can be but one of many so-called critiques of anything said or proven by anyone whose ideology we do not endorse. But is this even critique?

Adorno would find this a charade, a pretense that poses as a critique but in reality is a simple dismissal without intellectual engagement, a dogmatic refusal to listen and observe. He definitely would be horrified by Stephen Colbert’s oft-quoted quip on “truthiness” as “the conviction that what you feel to be true must be true.” Even those who resurrect Daniel Patrick Moynihan’s phrase, “You are entitled to your own opinion, but not to your own facts,” quietly admit that his admonishment is ignored by media more popular than informed.

On Responsible Critique

But surely there is merit to responsible critiques of science. Weren’t many of these critiques meant to dethrone the unparalleled authority claimed in the name of science, as Fuller admits all along? Wasn’t Lyotard (and Marx before him), for example, correct in pointing out the conflation of power and money in the scientific vortex that could legitimate whatever profit-maximizers desire? In other words, should scientific discourse be put on par with other discourses? Whose credibility ought to be challenged, and whose truth claims deserve scrutiny? Can we privilege or distinguish science if it is true, as Monya Baker has reported, that “[m]ore than 70% of researchers have tried and failed to reproduce another scientist’s experiments, and more than half have failed to reproduce their own experiments” (2016, 1)?

Fuller remains silent about these important and responsible questions about the problematics (methodologically and financially) of reproducing scientific experiments. Baker’s report cites Nature’s survey of 1,576 researchers and reveals “sometimes-contradictory attitudes towards reproducibility. Although 52% of those surveyed agree that there is a significant ‘crisis’ of reproducibility, less than 31% think that failure to reproduce published results means that the result is probably wrong, and most say that they still trust the published literature.” (Ibid.) So, if science relies on reproducibility as a cornerstone of its legitimacy (and superiority over other discourses), and if the results are so dismal, should it not be discredited?

One answer, given by Hans E. Plesser, suggests that there is a confusion between the notions of repeatability (“same team, same experimental setup”), replicability (“different team, same experimental setup”), and reproducibility (“different team, different experimental setup”). If understood in these terms, it stands to reason that one may not get the same results all the time and that this fact alone does not discredit the scientific enterprise as a whole. Nuanced distinctions take us down a scientific rabbit-hole most post-truth advocates refuse to follow. These nuances are lost on a public that demands to know the “bottom line” in brief sound bites: Is science scientific enough, or is it bunk? When can we trust it?

Trump excels at this kind of rhetorical device: repeat a falsehood often enough and people will believe it; and because individual critical faculties are not a prerequisite for citizenship, post-truth means no truth, or whatever the president says is true. Adorno’s distinction between responsible and irresponsible political critics comes into play here; but he innocently failed to anticipate the Trumpian move to conflate the political and the scientific and to pretend as if there is no distinction—methodologically and institutionally—between political and scientific discourses.

Against this cultural backdrop, many critiques of science have undermined its authority and thereby lent credence to any dismissal of science (legitimately by insiders and perhaps illegitimately at times by outsiders). Sociologists and postmodernists alike forgot to put warning signs on their academic and intellectual texts: Beware of hasty generalizations! Watch out for wolves in sheep’s clothing! Don’t throw the baby out with the bathwater!

One would think such advisories unnecessary. Yet without such safeguards, internal disputes and critical investigations appear to have unintentionally discredited the entire scientific enterprise in the eyes of post-truth promoters, the Trumpists whose neoliberal spectacles filter in dollar signs and filter out pollution on the horizon. The discrediting of science has become a welcome distraction that opens the way to radical free-market mentality, spanning from the exploitation of free speech to resource extraction to the debasement of political institutions, from courts of law to unfettered globalization. In this sense, internal (responsible) critiques of the scientific community and its internal politics, for example, unfortunately license external (irresponsible) critiques of science, the kind that obscure the original intent of responsible critiques. Post-truth claims at the behest of corporate interests sanction a free-for-all where the concentrated power of the few silences the concerns of the many.

Indigenous-allied protestors block the entrance to an oil facility related to the Kinder-Morgan oil pipeline in Alberta.
Image by Peg Hunter via Flickr / Creative Commons


Part Two: The Politics of Post-Truth

Fuller begins his book about the post-truth condition that permeates the British and American landscapes with a look at our ancient Greek predecessors. According to him, “Philosophers claim to be seekers of the truth but the matter is not quite so straightforward. Another way to see philosophers is as the ultimate experts in a post-truth world” (19). This means that those historically entrusted to be the guardians of truth in fact “see ‘truth’ for what it is: the name of a brand ever in need of a product which everyone is compelled to buy. This helps to explain why philosophers are most confident appealing to ‘The Truth’ when they are trying to persuade non-philosophers, be they in courtrooms or classrooms.” (Ibid.)

Instead of being the seekers of the truth, thinkers who care not about what but how we think, philosophers are ridiculed by Fuller (himself a philosopher turned sociologist turned popularizer and public relations expert) as marketing hacks in a public relations company that promotes brands. Their serious dedication to finding the criteria by which truth is ascertained is used against them: “[I]t is not simply that philosophers disagree on which propositions are ‘true’ or ‘false’ but more importantly they disagree on what it means to say that something is ‘true’ or ‘false’.” (Ibid.)

Some would argue that the criteria by which propositions are judged to be true or false are worthy of debate, rather than the cavalier dismissal of Trumpists. With criteria in place (even if only by convention), at least we know what we are arguing about, as these criteria (even if contested) offer a starting point for critical scrutiny. And this, I maintain, is a task worth performing, especially in the age of pluralism when multiple perspectives constitute our public stage.

In addition to debasing philosophers, it seems that Fuller reserves a special place in purgatory for Socrates (and Plato) for labeling the rhetorical expertise of the sophists—“the local post-truth merchants in fourth century BC Athens”—negatively. (21) It becomes obvious that Fuller is “on their side” and that the presumed debate over truth and its practices is in fact nothing but “whether its access should be free or restricted.” (Ibid.) In this neoliberal reading, it is all about money: are sophists evil because they charge for their expertise? Is Socrates a martyr and saint because he refused payment for his teaching?

Fuller admits, “Indeed, I would have us see both Plato and the Sophists as post-truth merchants, concerned more with the mix of chance and skill in the construction of truth than with the truth as such.” (Ibid.) One wonders not only if Plato receives fair treatment (reminiscent of Popper’s denigration of Plato as supporting totalitarian regimes, while sparing Socrates as a promoter of democracy), but whether calling all parties to a dispute “post-truth merchants” obliterates relevant differences. In other words, have we indeed lost the desire to find the truth, even if it can never be the whole truth and nothing but the truth?

Political Indifference to Truth

One wonders how far this goes: political discourse without any claim to truth conditions would become nothing but a marketing campaign where money and power dictate the acceptance of the message. Perhaps the intended message here is that contemporary cynicism towards political discourse has its roots in ancient Greece. Regardless, one should worry that such cynicism indirectly sanctions fascism.

Can the poor and marginalized in our society afford this kind of cynicism? For them, unlike their privileged counterparts in the political arena, claims about discrimination and exploitation, about unfair treatment and barriers to voting are true and evidence based; they are not rhetorical flourishes by clever interlocutors.

Yet Fuller would have none of this. For him, political disputes are games:

[B]oth the Sophists and Plato saw politics as a game, which is to say, a field of play involving some measure of both chance and skill. However, the Sophists saw politics primarily as a game of chance whereas Plato saw it as a game of skill. Thus, the sophistically trained client deploys skill in [the] aid of maximizing chance occurrences, which may then be converted into opportunities, while the philosopher-king uses much the same skills to minimize or counteract the workings of chance. (23)

Fuller could be channeling twentieth-century game theory here and its application in the political arena, or the notion offered by Lyotard when describing the minimal contribution we can make to scientific knowledge (where we cannot change the rules of the game but perhaps find a novel “move” to make). Indeed, if politics is deemed a game of chance, then anything goes, and it really should not matter if an incompetent candidate like Trump ends up winning the American presidency.

But is it really a question of skill and chance? Or, as some political philosophers would argue, is it not a question of the best means by which to bring to fruition the best results for the general wellbeing of a community? The point of suggesting the figure of a philosopher-king, to be sure, was not his rhetorical skills in this connection, but instead the deep commitment to rule justly, to think critically about policies, and to treat constituents with respect and fairness. Plato’s Republic, however criticized, was supposed to be about justice, not about expediency; it is an exploration of the rule of law and wisdom, not a manual about manipulation. If the recent presidential election in the US taught us anything, it’s that we should be wary of political gamesmanship and focus on experience and knowledge, vision and wisdom.

Out-Gaming Expertise Itself

Fuller would have none of this, either. It seems that there is virtue in being a “post-truther,” someone who can easily switch between knowledge games, unlike the “truther” whose aim is to “strengthen the distinction by making it harder to switch between knowledge games.” (34) In the post-truth realm, then, knowledge claims are lumped into games that can be played at will, that can be substituted when convenient, without a hint of the danger such capricious game-switching might engender.

It’s one thing to challenge a scientific hypothesis about astronomy because the evidence is still unclear (as Stephen Hawking has done in regard to Black Holes) and quite another to compare it to astrology (and give equal hearings to horoscope and Tarot card readers as to physicists). Though we are far from the Demarcation Problem (between science and pseudo-science) of the last century, this does not mean that there is no difference at all between different discourses and their empirical bases (or that the problem itself isn’t worthy of reconsideration in the age of Fuller and Trump).

On the contrary, it’s because we assume difference between discourses (gray as they may be) that we can move on to figure out on what basis our claims can and should rest. The danger, as we see in the political logic of the Trump administration, is that friends become foes (European Union) and foes are admired (North Korea and Russia). Game-switching in this context can lead to a nuclear war.

In Fuller’s hands, though, something else is at work. Speaking of contemporary political circumstances in the UK and the US, he says: “After all, the people who tend to be demonized as ‘post-truth’ – from Brexiteers to Trumpists – have largely managed to outflank the experts at their own game, even if they have yet to succeed in dominating the entire field of play.” (39) Fuller’s celebratory tone here may carry either a slight warning, in the use of “yet” before the success “in dominating the entire field of play,” or a prediction that this is indeed what is about to happen soon enough.

The neoliberal bottom-line surfaces in this assessment: he who wins must be right, the rich must be smart, and more perniciously, the appeal to truth is beside the point. More specifically, Fuller continues:

My own way of dividing the ‘truthers’ and the ‘post-truthers’ is in terms of whether one plays by the rules of the current knowledge game or one tries to change the rules of the game to one’s advantage. Unlike the truthers, who play by the current rules, the post-truthers want to change the rules. They believe that what passes for truth is relative to the knowledge game one is playing, which means that depending on the game being played, certain parties are advantaged over others. Post-truth in this sense is a recognisably social constructivist position, and many of the arguments deployed to advance ‘alternative facts’ and ‘alternative science’ nowadays betray those origins. They are talking about worlds that could have been and still could be—the stuff of modal power. (Ibid.)

By now one should be terrified. This is a strong endorsement of lying as a matter of course, as a way to distract from the details (and empirical bases) of one “knowledge game” that may not be to one’s ideological liking, in favor of another that might be deemed more suitable (for financial or other purposes).

The political stakes here are too high to ignore, especially because there are good reasons why “certain parties are advantaged over others” (say, climate scientists “relative to” climate deniers who have no scientific background or expertise). One wonders what it means to talk about “alternative facts” and “alternative science” in this context: is it a means of obfuscation? Is it yet another license granted by the “social constructivist position” not to acknowledge the legal liability of cigarette companies for the addictive power of nicotine? Or the pollution of water sources in Flint, Michigan?

What Is the Mark of an Open Society?

If we corral the broader political logic at hand to the governance of the scientific community, as Fuller wishes us to do, then we hear the following:

In the past, under the inspiration of Karl Popper, I have argued that fundamental to the governance of science as an ‘open society’ is the right to be wrong (Fuller 2000a: chap. 1). This is an extension of the classical republican ideal that one is truly free to speak their mind only if they can speak with impunity. In the Athenian and the Roman republics, this was made possible by the speakers–that is, the citizens–possessing independent means which allowed them to continue with their private lives even if they are voted down in a public meeting. The underlying intuition of this social arrangement, which is the epistemological basis of Mill’s On Liberty, is that people who are free to speak their minds as individuals are most likely to reach the truth collectively. The entangled histories of politics, economics and knowledge reveal the difficulties in trying to implement this ideal. Nevertheless, in a post-truth world, this general line of thought is not merely endorsed but intensified. (109)

To be clear, Fuller not only asks for the “right to be wrong,” but also for the legitimacy of the claim that “people who are free to speak their minds as individuals are most likely to reach the truth collectively.” The first plea is reasonable enough, as humans are fallible (yes, Popper here), and the history of ideas has proven that killing heretics is counterproductive (and immoral). If the Brexit/Trump post-truth age would only usher in greater encouragement of speculation or conjecture (Popper again), then Fuller’s book would be well-placed in the pantheon of intellectual pluralism; but if this endorsement obliterates the distinction between silly and informed conjectures, then we are in trouble and the ensuing cacophony will turn us all deaf.

The second claim is at best supported by the likes of James Surowiecki (2004) who has argued that no matter how uninformed a crowd of people is, collectively it can guess the correct weight of a cow on stage (his TED talk). As folk wisdom, this is charming; as public policy, this is dangerous. Would you like a random group of people deciding how to store nuclear waste, and where? Would you subject yourself to the judgment of just any collection of people to decide on taking out your appendix or performing triple-bypass surgery?

When we turn to Trump, his supporters certainly like that he speaks his mind, just as Fuller says individuals should be granted the right to speak their minds (even if in error). But speaking one’s mind can also be a proxy for saying whatever, without filters, without critical thinking, or without thinking at all (let alone consulting experts whose very existence seems to upset Fuller). Since when did “speaking your mind” turn into scientific discourse? It’s one thing to encourage dissent and offer reasoned doubt and explore second opinions (as health care professionals and insurers expect), but it’s quite another to share your feelings and demand that they count as scientific authority.

Finally, even if we endorse the view that we “collectively” reach the truth, should we not ask: by what criteria? according to what procedure? under what guidelines? Herd mentality, as Nietzsche already warned us, is problematic at best and immoral at worst. Trump rallies harken back to the fascist ones we recall from Europe prior to and during WWII. Few today would entrust the collective judgment of those enthusiasts of the Thirties to carry the day.

Unlike Fuller’s sanguine posture, I shudder at the possibility that “in a post-truth world, this general line of thought is not merely endorsed but intensified.” This is neither because I worship experts and scorn folk knowledge nor because I have low regard for individuals and their (potentially informative) opinions. Just as we warn our students that simply having an opinion is not enough, that they need to substantiate it, offer data or logical evidence for it, and even know its origins and who promoted it before they made it their own, so I worry about uninformed (even if well-meaning) individuals (and presidents) whose gut will dictate public policy.

This way of unreasonably empowering individuals is dangerous for their own well-being (no paternalism here, just common sense) as well as for the community at large (too many untrained cooks will definitely spoil the broth). For those who doubt my concern, Trump offers ample evidence: trade wars with allies and foes that cost domestic jobs (while promising to bring jobs home), nuclear-war threats that resemble a game of chicken (as if no president before him ever faced such an option), and public policy procedures thrown completely into disarray, from immigration regulations to the relaxation of emission controls (ignoring the history of these policies and their failures).

Drought and suffering in Arbajahan, Kenya in 2006.
Photo by Brendan Cox and Oxfam International via Flickr / Creative Commons


Part Three: Post-Truth Revisited

There is something appealing, even seductive, in the provocation to doubt the truth as rendered by the (scientific) establishment, even as we worry about sowing the seeds of falsehood in the political domain. The history of science is the story of authoritative theories debunked, cherished ideas proven wrong, and claims of certainty falsified. Why not, then, jump on the “post-truth” wagon? Would we not unleash the collective imagination to improve our knowledge and the future of humanity?

One of the lessons of postmodernism (at least as told by Lyotard) is that “post-” does not mean “after,” but rather, “concurrently,” as another way of thinking all along: just because something is labeled “post-”, as in the case of postsecularism, it doesn’t mean that one way of thinking or practicing has replaced another; it has only displaced it, and both alternatives are still there in broad daylight. Under the rubric of postsecularism, for example, we find religious practices thriving (80% of Americans believe in God, according to a 2018 Pew Research survey), while the number of unaffiliated, atheists, and agnostics is on the rise. Religionists and secularists live side by side, as they always have, more or less agonistically.

In the case of “post-truth,” it seems that one must choose one orientation or the other, at least for Fuller, who claims to prefer the “post-truth world” to the allegedly hierarchical and submissive world of “truth,” where the dominant establishment shoves its truths down the throats of ignorant and repressed individuals. If post-truth meant, like postsecularism, the realization that truth and provisional or putative truth coexist and are continuously being re-examined, then no conflict would be at play. If Trump’s claims were juxtaposed to those of experts in their respective domains, we would have a lively, and hopefully intelligent, debate. False claims would be debunked, reasonable doubts could be raised, and legitimate concerns might be addressed. But Trump doesn’t consult anyone except his (post-truth) gut, and that is troublesome.

A Problematic Science and Technology Studies

Fuller admits that “STS can be fairly credited with having both routinized in its own research practice and set loose on the general public—if not outright invented—at least four common post-truth tropes”:

  1. Science is what results once a scientific paper is published, not what made it possible for the paper to be published, since the actual conduct of research is always open to multiple countervailing interpretations.
  2. What passes for the ‘truth’ in science is an institutionalised contingency, which if scientists are doing their job will be eventually overturned and replaced, not least because that may be the only way they can get ahead in their fields.
  3. Consensus is not a natural state in science but one that requires manufacture and maintenance, the work of which is easily underestimated because most of it occurs offstage in the peer review process.
  4. Key normative categories of science such as ‘competence’ and ‘expertise’ are moveable feasts, the terms of which are determined by the power dynamics that obtain between specific alignments of interested parties. (43)

In that sense, then, Fuller agrees that the positive lessons STS wished for the practice of the scientific community may have inadvertently found their way into a post-truth world that may abuse or exploit them in unintended ways. That is, STS challenges something like “consensus” because of how the scientific community pretends to reach it, knowing as it does that no such thing can ever fully be reached, and that when it is reached, it may have been reached for the wrong reasons (leadership pressure, pharmaceutical funding of conferences and journals). But this can also go too far.

Just because consensus is difficult to reach (and does not mean unanimity) and is susceptible to corruption or bias doesn’t mean that anything goes. Some experimental results are more acceptable than others and some data are more informative than others, and the struggle for agreement may take its political toll on the scientific community, but this need not result in silly ideas about cigarettes being good for our health or that obesity should be encouraged from early childhood.

It seems important to focus on Fuller’s conclusion because it encapsulates my concern with his version of post-truth, a condition he endorses not only as the epistemological plight of humanity but as an elixir with which to cure humanity’s ills:

While some have decried recent post-truth campaigns that resulted in victory for Brexit and Trump as ‘anti-intellectual’ populism, they are better seen as the growth pains of a maturing democratic intelligence, to which the experts will need to adjust over time. Emphasis in this book has been given to the prospect that the lines of intellectual descent that have characterised disciplinary knowledge formation in the academy might come to be seen as the last stand of a political economy based on rent-seeking. (130)

Here, we are not only afforded a moralizing sermon about (and, it must be said, from) the privileged academic position, from whose heights all other positions are dismissed as anti-intellectual populism, but we are also entreated to consider the rantings of the know-nothings of the post-truth world as the “growth pains of a maturing democratic intelligence.” Only an apologist would characterize the Trump administration as mature, democratic, or intelligent. Where’s the evidence? What would possibly warrant such generosity?

It’s one thing to challenge “disciplinary knowledge formation” within the academy, and there are no doubt cases deserving reconsideration as to the conditions under which experts should be paid and by whom (“rent-seeking”); but how can these questions about higher education and the troubled relations between the university system and the state (and with the military-industrial complex) give cover to the Trump administration? Here is Fuller’s justification:

One need not pronounce on the specific fates of, say, Brexit or Trump to see that the post-truth condition is here to stay. The post-truth disrespect for established authority is ultimately offset by its conceptual openness to previously ignored people and their ideas. They are encouraged to come to the fore and prove themselves on this expanded field of play. (Ibid.)

This, too, is a logical stretch: is disrespect for the authority of the establishment the same as, or does it logically lead to, the “conceptual” openness to previously “ignored people and their ideas”? This is not a claim on behalf of the disenfranchised. Perhaps their ideas were simply bad or outright racist or misogynist (as we see with Trump). Perhaps they were ignored because there was hope that they would change for the better, become more enlightened, not act on their white supremacist prejudices. Should we have “encouraged” explicit anti-Semitism while we were at it?

Limits to Tolerance

We tolerate ignorance because we believe in education and hope to overcome some of it; we tolerate falsehood in the name of eventual correction. But we should never tolerate offensive ideas and beliefs that are harmful to others. Once again, it is one thing to argue about black holes, and quite another to argue about whether black lives matter. It seems reasonable, as Fuller concludes, to say that “In a post-truth utopia, both truth and error are democratised.” It is also reasonable to say that “You will neither be allowed to rest on your laurels nor rest in peace. You will always be forced to have another chance.”

But the conclusion that “Perhaps this is why some people still prefer to play the game of truth, no matter who sets the rules” (130) does not follow. Those who “play the game of truth” are always vigilant about falsehoods and post-truth claims, and to say that they are simply dupes of those in power is both incorrect and dismissive. On the contrary: Socrates was searching for the truth and fought with the sophists, as Popper fought with the logical positivists and the Kuhnians, and as scientists today are searching for the truth and continue to fight superstitions and debunked pseudoscience about vaccination causing autism in young kids.

If post-truth is like postsecularism, scientific and political discourses can inform each other. When power-plays by ignoramus leaders like Trump are obvious, they could shed light on less obvious cases of big pharma leaders or those in charge of the EPA today. In these contexts, inconvenient facts and truths should prevail and the gamesmanship of post-truthers should be exposed for what motivates it.

Contact details: rsassowe@uccs.edu

* Special thanks to Dr. Denise Davis of Brown University, whose contribution to my critical thinking about this topic has been profound.

References

Theodor W. Adorno (1998/1963), Critical Models: Interventions and Catchwords. Translated by Henry W. Pickford. New York: Columbia University Press.

Kurt Andersen (2017), Fantasyland: How America Went Haywire: A 500-Year History. New York: Random House.

Monya Baker (2016), “1,500 scientists lift the lid on reproducibility,” Nature Vol. 533, Issue 7604, May 26, 2016 (corrected July 28, 2016).

Michael Bowker (2003), Fatal Deception: The Untold Story of Asbestos. New York: Rodale.

Robert Darnton (2018), “The Greatest Show on Earth,” New York Review of Books Vol. LXV, No. 11, June 28, 2018, pp. 68-72.

Al Gore (2006), An Inconvenient Truth: The Planetary Emergency of Global Warming and What Can Be Done About It. New York: Rodale.

Richard Hofstadter (1962), Anti-Intellectualism in American Life. New York: Vintage Books.

Jean-François Lyotard (1984/1979), The Postmodern Condition: A Report on Knowledge. Translated by Geoff Bennington and Brian Massumi. Minneapolis: University of Minnesota Press.

Robert K. Merton (1973/1942), “The Normative Structure of Science,” The Sociology of Science: Theoretical and Empirical Investigations. Chicago and London: The University of Chicago Press, pp. 267-278.

Hans E. Plesser (2017), “Reproducibility vs. Replicability: A Brief History of Confused Terminology,” Frontiers in Neuroinformatics 11: 76; published online January 18, 2018.

Robert N. Proctor (1995), Cancer Wars: How Politics Shapes What We Know and Don’t Know About Cancer. New York: Basic Books.

James Surowiecki (2004), The Wisdom of Crowds. New York: Anchor Books.

Author Information: Steve Fuller, University of Warwick, UK, S.W.Fuller@warwick.ac.uk

Fuller, Steve. “Against Virtue and For Modernity: Rebooting the Modern Left.” Social Epistemology Review and Reply Collective 6, no. 12 (2017): 51-53.

The pdf of the article gives specific page references. Shortlink: https://wp.me/p1Bfg0-3S9

Toby Ziegler’s “The Liberals: 3rd Version.” Photo by Matt via Flickr / Creative Commons


My holiday message for the coming year is a call to re-boot the modern left. When I was completing my doctoral studies, just as the Cold War was beginning to wind down, the main threat to the modern left was seen as coming largely from within. ‘Postmodernism’ was the name normally given to that threat, and it fuelled various culture, canon and science wars in the 1980s and 1990s.

Indeed, even I was – and, in some circles, continue to be – seen as just such an ‘enemy of reason’, to recall the name of Richard Dawkins’ television show in which I figured as one of the accused. However, in retrospect, postmodernism was at most a harbinger for a more serious threat, which today comes from both the ‘populist’ supporters of Trump, Brexit et al. and their equally self-righteous academic critics.

Academic commentators on Trump, Brexit and the other populist turns around the world seem unable to avoid passing moral judgement on the voters who brought about these uniformly unexpected outcomes, the vast majority of which the commentators have found unwelcome. In this context, an unholy alliance of virtue theorists and evolutionary psychologists has thrived as diagnosticians of our predicament. I say ‘unholy’ because Aristotle and Darwin suddenly find themselves on the same side of an argument, now pitched against the minds of ‘ordinary’ people. This anti-democratic place is not one in which any self-respecting modern leftist wishes to be.

To be sure, virtue theorists and evolutionary psychologists come to the matter from rather different premises – the one metaphysical if not religious and the other naturalistic if not atheistic. Nevertheless, they both regard humanity’s prospects as fundamentally constrained by our mental makeup. This makeup reflects our collective past and may even be rooted in our animal nature. Under the circumstances, so they believe, the best we can hope for is to become self-conscious of our biases and limitations in processing information so that we don’t fall prey to the base political appeals that have resulted in the current wave of populism.

These diagnosticians conspicuously offer little of the positive vision or ambition that characterised ‘progressive’ politics of both liberal and socialist persuasions in the nineteenth and twentieth centuries. But truth be told, these learned pessimists already have form. They are best seen as the culmination of a current of thought that has been percolating since the end of the Cold War effectively brought to a halt Marxism as a world-historic project of human emancipation.

In this context, the relatively upbeat message advanced by Francis Fukuyama in The End of History and the Last Man that captivated much of the 1990s was premature. Fukuyama was cautiously celebrating the triumph of liberalism over socialism in the progressivist sweepstakes. But others were plotting a different course, one in which the very terms on which the Cold War had been fought would be superseded altogether. Gone would be the days when liberals and socialists vied over who could design a political economy that would benefit the most people worldwide. In its place would be a much more precarious sense of the world order, in which overweening ambition itself turned out to be humanity’s Achilles Heel, if not Original Sin.

Here the trail of books published by Alasdair MacIntyre and his philosophical and theological admirers in the wake of After Virtue ploughed a parallel field to such avowedly secular and scientifically minded works as Peter Singer’s A Darwinian Left and Steven Pinker’s The Blank Slate. These two intellectual streams, both pointing to our species’ inveterate shortcomings, gained increasing plausibility in light of 9/11’s blindsiding of the post-Cold War neo-liberal consensus.

9/11 tore up the Cold War playbook once and for all, side-lining both the liberals and the socialists who had depended on it. Gone was the state-based politics, the strategy of mutual containment, the agreed fields of play epitomized in such phrases as ‘arms race’ and ‘space race’. In short, gone was the game-theoretic rationality of managed global conflict. Thus began the ongoing war on ‘Islamic terror’. Against this backdrop, the Iraq War proved to be colossally ill-judged, though no surprise given that its mastermind was one of the Cold War’s keenest understudies, Donald Rumsfeld.

For the virtue theorists and evolutionary psychologists, the Cold War represented how far human rationality could go in pushing back and channelling our default irrationality, albeit in the hope of lifting humanity to a ‘higher’ level of being. Indeed, once the USSR lost the Cold War to the US on largely financial grounds, the victorious Americans had to contend with the ‘blowback’ from third parties who suffered ‘collateral damage’ at many different levels during the Cold War. After all, the Cold War, for all its success in averting nuclear confrontation, nevertheless turned the world into a playing field for elite powers. ‘First world’, ‘second world’ and ‘third world’ were basically the names of the various teams in contention on the Cold War’s global playing field.

So today we see an ideological struggle whose main players are those resentful (i.e. the ‘populists’) and those regretful (i.e. the ‘anti-populists’) of the entire Cold War dynamic. The only thing that these antagonists appear to agree on is the folly of ‘progressivist’ politics, the calling card of both modern liberalism and socialism. Indeed, both the populists and their critics are fairly characterised as somehow wanting to turn back the clock to a time when we were in closer contact with the proverbial ‘ground of being’, which of course the two sides define in rather different terms. But make no mistake of the underlying metaphysical premise: We are ultimately where we came from.

Notwithstanding the errors of thought and deed committed in their names, liberalism and socialism rightly denied this premise, which placed both of them in the vanguard – and eventually made them world-historic rivals – in modernist politics. Modernity raised humanity’s self-regard and expectations to levels that motivated people to build a literal Heaven on Earth, in which technology would replace theology as the master science of our being. David Noble cast a characteristically informed but jaundiced eye at this proposition in his 1997 book, The Religion of Technology: The Divinity of Man and the Spirit of Invention. Interestingly, John Passmore had covered much the same terrain just as eruditely but with greater equanimity in his 1970 book, The Perfectibility of Man. That the one was written after and the other during the Cold War is probably no accident.

I am mainly interested in resurrecting the modernist project in its spirit, not its letter. Many of modernity’s original terms of engagement are clearly no longer tenable. But I do believe that Silicon Valley is comparable to Manchester two centuries ago, namely, a crucible of a radical liberal sensibility – call it ‘Liberalism 2.0’ or simply ‘Alt-Liberalism’ – that tries to use the ascendant technological wave to leverage a new conception of the human being.

However one judges Marx’s critique of liberalism’s scientific expression (aka classical political economy), the bottom line is that his arguments for socialism would never have got off the ground had liberalism not laid the groundwork for him. As we enter 2018 and seek guidance for launching a new progressivism, we would do well to keep this historical precedent in mind.

Contact details: S.W.Fuller@warwick.ac.uk

Author Information: Robyn Toler, University of Dallas

Toler, Robyn. “The Progress and Technology of City Life.” Social Epistemology Review and Reply Collective 6, no. 2 (2017): 78-85.

The PDF of the article gives specific page numbers. Shortlink: http://wp.me/p1Bfg0-3t9


Image credit: Simon & His Camera, via flickr

The Progress and Technology of City Life

The products of today’s technology deserve scrutiny. The mass media and pop culture exert a powerful influence on Americans, young and old alike. Opportunities for quiet reflection are few and far between despite our much-trumpeted age of convenience. Scientific advancement is not necessarily accompanied by wisdom. In fact, in embracing the new and shiny it is easy to cast aside the tested and proven; sometimes we tear down fences before finding out why they were built. The alluring products of technology, likewise, deserve scrutiny before being accepted, and the cool, rational consideration this requires is all the harder to engage in because of the onslaught of the sensational. Still, prudence suggests that the best course is to take a step back and ponder the choices before us.

Under the influence of the “culture industry,” as described by Theodor Adorno and others, perpetual distraction and artificial consensus crowd out of people’s lives the solitude and individuality required for cultivating critical, independent thought, and the courage to follow their own reasoned convictions. Sensitization to the mechanisms used by the culture industry can help audiences more effectively resist them, and preserve or regain an authentic experience and view of life. Some view technological advancements as unqualified goods by virtue of their nature as modern and scientific; however, the gains produced by these technologies bring their own attendant complications, such as compromised privacy, continuous availability to the workplace, and the stress of an externally imposed life rhythm over a natural, personal ebb and flow of work and leisure.

This article challenges the argument that technological advances have made work easier, created more time for leisure, decreased stress, increased satisfaction in relationships, simplified tasks, and made jobs less time consuming, resulting in a net benefit to lived experience.

While people in rural as well as urban locations can easily be involved with technology in many parts of the world, including being connected to the internet, city life has some clear contrasts to country life. Population density is higher in the city. The environment is noisier. Traffic, construction equipment, and the many people in close proximity all contribute to the volume. Limited green spaces reduce exposure to a variety of natural features like plants, birds, and bodies of water. The city is also filled with opportunities to interface with technology. Subway tickets, toll tags, video displays, elevators and escalators, point of sale terminals, smart phones, identification badges for areas with controlled access, passports, and games are just a few of the high-tech items an average person deals with in a normal day in the city.

Adorno’s writings examine the Western philosophical tradition, especially from Kant onward, and cover a wide range of topics including music and literary criticism, aesthetics, mass culture, Hegel, existentialism, sociology, epistemology, and metaphysics; however, his work on what he termed “The Culture Industry” is especially pertinent to understanding the dynamics of life in the city. In his article “How to Look at Television,” from The Culture Industry, Adorno reveals his thoughts on the influence of popular culture. He warned in 1957, soon after the advent of television, that it can produce intellectual passivity and gullibility (166). Current consumers of internet entertainment should heed his admonition, guard their powers of reason, and be wary of technology’s ability to hypnotize and immobilize (cf. Reider 2015).

Boredom, unlike the hypnotic effect Adorno warned against, is not only an unavoidable part of life; it is the wellspring of creativity. Overscheduling, avoiding monotony at all costs, robs potential artists, poets, scientists, and inventors of their motivation to generate plans and projects. Boredom has developed a bad reputation as a companion to depression and vice and a precursor to mischief; but “research suggests that falling into a numbed trance allows the brain to recast the outside world in ways that can be productive and creative at least as often as they are disruptive” (Carey 2008). As another researcher indicated,

When children have nothing to do now, they immediately switch on the TV, the computer, the phone or some kind of screen. The time they spend on these things has increased. But children need to have stand-and-stare time, time imagining and pursuing their own [emphasis added] thinking processes or assimilating their experiences through play or just observing the world around them. [It is this sort of thing that stimulates the imagination while the screen] tends to short circuit that process and the development of creative capacity (Richardson 2013, 1013).

Some would argue that “switching on the TV” is “doing something;” however, Richardson asserts that imagining, observing, and mentally processing experiences are more valuable.

Leisure and the Workweek

Max Gunther’s The Weekenders takes an amusing yet probing look at the leisure time of Americans. It is particularly interesting to note that this book was published in 1964. While some of the pastimes available have changed, human beings are mostly the same. One of the most distinctive features of city life is scheduling. Buses run on a schedule. School bells, business meetings, and garden clubs stay on schedule so their participants can meet their next obligations. The use of leisure time and how it is incorporated into schedules is especially revealing. Insights can be gained by studying the movement from an organic life rhythm to an arbitrarily imposed “five days on, two off” schedule, the perceived pressure to be productive during hours away from one’s paid employment, and the tendency to be connected continuously to one’s work through the technological mediation of devices such as smart phones (cf. Drain and Strong 2015).

People in the pre-internet years tended to regard their leisure time, primarily the weekend, as wholly separate from the workweek. There were always workaholics, of course, but as a national trend the weekend seemed different: people wore different clothing and participated in different activities, all with a different attitude. Sixty-two hours were partitioned off from "work" to be spent in "leisure." Divisions between professions blurred on the weekend, and everyone, except that unfortunate segment whose businesses hummed on through the weekend, took up similar pursuits. "[City dwellers] can no longer work and play according to the rhythms of personal mood or need but are all bound to the same gigantic rhythm: five days on, two off" (Gunther 10-12; cf. Ellul 1964; Kok 2015). While one would expect all this "leisure time," provided by the efficiency of industrialization, to lead to a slower pace conducive to relaxation, quite the contrary seems to be the case.

Families plunged into furious activity on those days ostensibly set aside for leisure. The weekend was by and for the middle class. Ads were aimed almost exclusively at them. Students were weekenders in training. Sports, play, eating and drinking, cultural arts, church, and civic volunteering all took their share of available time. Although these activities sound pleasant, the real result was a vague insecurity and bewildering Monday fatigue. As Gunther appropriately pondered, it is not clear whether the fatigue was generated by the energy expended in reaching goals or by pent-up, unrelieved tension (Gunther 13-15).

Travel also occupied the weekenders of the 1960s. Weekend trips, day trips, outings to events and places of interest, and visits with friends vied for attention. This may have been genuine curiosity about the world and fellowship with neighbors and loved ones, or something else. All that travel and dining out was expensive, even then. Aggressive driving increased on the weekends, too. It is unclear what drove this restlessness, what inner devil goaded those mid-century weekenders, what they were so desperately seeking. Yet it is clear that expectations for leisure time ran high, and despite all the recreation, those two days off frequently disappointed (Gunther 16, 21). Technological advances have continued, but the expectations and restlessness do not seem to have abated.

Close and So Far Away

Even though residents of the city live in close proximity to one another, the trend toward social media and away from direct personal interaction has grown. Relationships in the city are heavily influenced by technological mediation. Perhaps limited access to natural settings pushes city dwellers indoors, and into virtual spaces. Sites designed to facilitate dating, networking, creative pursuits, and games, among other activities, have sprung up. Internet "surfing" never grows stale because someone else is constantly adding new, tantalizing information; it is the epitome of content "crowdsourcing." Social networking sites provide crowds of people who create content, usually out of their own experiences, for the entertainment of others as they browse. Potential romantic interests, job openings, and decorating ideas are perpetually at the ready, with new ones popping up moment by moment. This makes it difficult to break away; suspense and expectation create enticement. Every genre of social media has its niche and its devotees, but perhaps the most pervasive and invasive of them all is Facebook, with its plethora of "friends." It is ironic that in cities, with their high population density, online, virtual "meetings" are so popular.

Begun as a forum for college students, this social media giant has grown to include anyone who wants to join, with a non-stop, real-time feed of "Status Updates." Founded at Harvard by Mark Zuckerberg and some classmates, the site had over 1,200 registrants within 24 hours of its launch. Private investors became involved and the company expanded. Facebook acquired a feed aggregator and then the photo-sharing site Instagram. The company made its initial public offering (IPO) in 2012 at a valuation of $104 billion. A new search feature was rolled out in 2013, and changes continue, including an opt-out approach that makes it the user's responsibility to raise security settings from their lower, default positions. Today one in seven people is a member (Zeevi 2013). The desire to know instantly about the next update to appear in the feed, whether a great picture, word of something earthshaking in a "friend's" life, a joke, or a political rallying cry, can be addictive. In cities large and small, people often observe each other online in addition to, or instead of, from their front porches.

The moment-to-moment observation of others' activities through their posts is not the only thing that makes social media enticing, though. The ability to stay connected with all the people you have ever known, provided they are on Facebook, is a big draw. Consider the evolution of the address book. Years ago, a small booklet next to the telephone held the names, addresses, and phone numbers of the people one most frequently called or corresponded with. As social circles expanded and families became more mobile, address books expanded as well. The inconvenience of constantly crossing out and erasing the information of friends and relatives who moved led to loose-leaf notebooks and index card files. The Rolodex system, with its easily interchangeable cards, was born, facilitating an ever-growing collection of constantly changing contact information.

Now leap ahead to the electronic version of the address book, the Palm Pilot: a utilitarian miracle and a status symbol in one. Then, just as carrying an address book gadget plus a cellular phone became tiresome, the technologies merged to produce one convenient device that did both jobs: the smart phone. Cloud data storage debuted to protect data from hardware failures and to make information accessible anywhere with cellular or Wi-Fi connectivity. Mail underwent a similar metamorphosis, from postal mail ("snail mail," referencing its comparatively slow delivery time) to electronically delivered "email" to web-based systems like Gmail. Now networking platforms like Facebook, and LinkedIn for professionals, are widening the messaging options further. Contacts are accumulated over time, survive any number of physical moves by their owners, and are stored remotely for ubiquitous access. For better or worse, the days of hunting for a scrap of paper with someone's number on it are over.

Even though city dwellers have all those connections with all those people, and could be interacting face to face with those nearby, they all too often choose online forums over personal meetings. A large segment of their connectivity is online instead of in person, and that has a negative side. Virtual personalities allow a spectrum of falsity, ranging from simply curating one's image to advantage to manufacturing a fully fake identity. The self-absorbed use Facebook to promote themselves, not to connect with others. Furthermore, instead of enhancing the ability to read social cues and body language, excessive time online erodes these crucial social skills (Kiesbye 55, 58-9). Facebook actually interferes with friendships rather than strengthening them. Social needs would seem to be more effectively met by simply arranging to meet in person, in a city environment with its physical proximity and variety of venues, than by retreating behind a computerized mediator.

Some cite city crime statistics as a reason to retreat from malls, parks, and other public places, but new categories of crime and vice have arisen or proliferated on the internet. Somewhat, though not altogether, unlike face-to-face encounters on sidewalks and in elevators, it is difficult to know with whom you are dealing on social media. Despite assurances by site administrators, malevolent users can easily misrepresent themselves, luring the young and naïve into dangerous, sometimes fatal, encounters. Teens' desire for premature autonomy, and their willingness to lie to their parents in order to sneak off and meet someone surreptitiously, complete the potentially tragic scenario (Luna 196-8). "Sexting" over cell phones, and now "sextortion," have entered the picture. Teenagers are particularly vulnerable to this kind of deception: they are notoriously "easy to intimidate, and embarrassed to tell their parents" when their judgment proves poor and plans go awry (Luna 196-7). A compromising photo that a young person carelessly snaps of himself (or that a companion captures digitally) can be parlayed by an online predator into a whole file of self-incriminating pornography.

Privacy and anonymity can be viewed in two ways in the city. There can be anonymity in a crowd, yet we are captured on camera throughout the day at businesses, traffic lights, and elsewhere; with the exception of satellite surveillance, that type of tracking is rare outside the city. It is difficult to estimate how much we modify our behavior because of this "watching." City life also presents opportunities for deception and abuse through privacy breaches. Privacy issues in public, in private, and on social networking sites concern politicians and culture critics alike. The sheer quantity of pictures posted on Facebook is a valuable source of data for anyone trying to match faces with identities.

In a study led by Alessandro Acquisti of Carnegie Mellon University, information from social media sites including Facebook was easily combined with cloud computing and facial recognition software to identify students on a campus (Luna 121). Whether or not the students object to this, their parents may find it disconcerting that the children they have just released into the next phase of their growing independence can be surveilled in this way. Citizens who value their privacy will have a difficult time maintaining it in the age of Facebook, whether or not they are or ever have been subscribers. Friend lists yield copious amounts of information, and a trail leads to anyone mentioned or pictured. Even non-subscribers can gain access to this information through search engines (Luna 204-5).
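To see how low the technical barrier has become, consider the following minimal sketch of the kind of face-to-identity matching the Acquisti study describes. It is written in Python using the open-source face_recognition library; the file names, labels, and folder layout are hypothetical, and this illustrates the general technique rather than the study's actual pipeline.

# A minimal sketch of matching a candid photo against labeled profile
# photos, in the spirit of the Acquisti study described above.
# Assumes the open-source Python "face_recognition" library; all file
# names and labels below are hypothetical.
import face_recognition

# Profile photos gathered from a public social networking site,
# each already labeled with the account holder's name.
labeled_photos = {
    "Alice Example": "profiles/alice.jpg",
    "Bob Example": "profiles/bob.jpg",
}

# Compute a 128-dimensional encoding for each labeled face.
names, encodings = [], []
for name, path in labeled_photos.items():
    image = face_recognition.load_image_file(path)
    faces = face_recognition.face_encodings(image)
    if faces:  # skip photos with no detectable face
        names.append(name)
        encodings.append(faces[0])

# A candid photo taken on campus, subject unknown.
unknown = face_recognition.load_image_file("campus/candid.jpg")
for face in face_recognition.face_encodings(unknown):
    distances = face_recognition.face_distance(encodings, face)
    best = distances.argmin()
    if distances[best] < 0.6:  # the library's conventional match threshold
        print(f"Probable identity: {names[best]}")

The 0.6 cutoff is merely the library's customary default; the point is that a few dozen lines of freely available code, fed with publicly posted photos, suffice to turn a crowd of strangers into a list of names.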

Those who think they are too old or too cautious to become crime victims should consider how their online personas could still have negative repercussions. Potential employers and college admissions personnel routinely check applicants' presence on social networking sites. Students are careless about their passwords, allowing "friends" to make embarrassing posts in their names. Employers and administrators do not know or care who created the posts; when they see information that makes a user look bad, they are likely to move on to more appealing candidates for their available positions (Luna 199). Facebook does not cause people to lose opportunities, but it guarantees that if you make a mistake, many people will see it.

If predators, lowered productivity, narcissism, and shortened attention spans are not enough incentive to reconsider one's entanglement with social media, here is a puzzle to ponder: anyone actively attempting to conceal his identity or whereabouts will have a difficult time in the age of social media. This coin has two sides. While it seems appealing for local law enforcement and federal Homeland Security to be able to track and locate a suspect, honest citizens who simply want to remain anonymous may rightly feel violated knowing that their every traffic decision, subway stop, casual comment, and convenience store errand is at least captured, and possibly monitored in real time. Movies and television shows like Fox's popular series 24 demonstrate the use of this technology and promote its acceptance, even demand for it. Before capitulating to the easy solution of simply watching everybody all the time, we should ask whether that kind of scrutiny is really desirable or acceptable.

A Perfect Day

The following poem depicts a way of life before computers, or even electricity, had reached much of rural America, in contrast to today's city life full of electronic gadgets and software. Neither electronic entertainment nor boredom would have intruded on the grandmother it portrays. While physically busy, she would have had more opportunity for contemplation than most modern city dwellers.

Perfect Day

Grandmother, on a winter’s day,
Milked the cows and fed them hay;
Slopped the hogs, saddled the mule,
And got the children off to school.
Did a washing, mopped the floors,
Washed the windows and did some chores,
Cooked a dish of home-dried fruit,
Pressed her husband’s Sunday suit.
Swept the parlor, made the bed,
Baked a dozen loaves of bread,
Split some firewood and lugged it in
Enough to fill the kitchen bin.
Cleaned the lamps and put in oil,
Stewed some apples she thought might spoil,
Churned the butter, baked a cake,
Then exclaimed, “For mercy sake,
The calves have got out of the pen!”
Went out, and chased them in again.
Gathered the eggs and locked the stable,
Back to the house and set the table,
Cooked a supper that was delicious,
And afterward washed all the dishes.
Fed the cat, and sprinkled the clothes
Mended a basket full of hose,
Then opened the organ and began to play:
“When you come to the end of a perfect day!”—Author Unknown (Kaetler 54-5).

Cooking, cleaning, farm chores, organization, time management, nurturing behaviors, and aesthetics are all on display in this narration. Facebook would have been a shallow substitute for the creative work accomplished on this day; it would have left the industrious grandmother with the same vague dissatisfaction as Gunther's "weekenders," mentioned earlier.

Technological progress is here to stay, with or without a given individual's active participation, but users can take steps to stay in control of their data and their minds. The advantages of dialing back technology are delightfully and creatively narrated in the book Better Off, in which author Eric Brende chronicles the lifestyle journey he and his wife made in search of the minimum of electronics and machinery needed to optimize their life. After an extended stay in a rural community that rejected almost all labor-saving devices, they concluded that they were happier and "better off" without most of the expensive, encumbering accouterments of 21st-century life that most of us take for granted. He ends his book by saying:

… in all cases [technology] must serve our needs, not the reverse, and we must determine these needs before considering the needs for technology. The willingness and the wisdom to do so may be the hardest ingredients to come by in this frenetic age. Perhaps what is needed most of all, then, are conditions favorable to them: quiet around us, quiet inside us, quiet born of sustained meditation and introspection. We must set aside time for it, in our churches, in our studies, in our hearts. Only when we have met this last requisite, I suspect, will technology yield its power and become a helpful handservant (Brende 232-3).

Brende and his wife found the life balance that suited them away from the city before rejoining it. His focus on quiet and control is apt.

Adorno stated that modern mass culture has been transformed "into a medium of undreamed of psychological control. The repetitiveness, the selfsameness, and the ubiquity of modern mass culture tend to make for automatized reactions and to weaken the forces of individual resistance" (Adorno 2006, 160; cf. Guizzo 2015; Scalambrino 2015). Preserving solitude, concentration, independent thought, and courage is worth the effort it takes to resist popular culture. The culture industry will continue to usurp the territory of life wherever it is allowed to, within or outside the city; but vigilance can give it boundaries.

References

Adorno, Theodor W. The Culture Industry: Selected Essays on Mass Culture. London: Routledge, 2006.

Brende, Eric. Better Off: Flipping the Switch on Technology. New York: HarperCollins Publishers, 2004.

Carey, Benedict. “You’re Bored But Your Brain is Tuned In.” New York Times, August 5, 2008. http://www.nytimes.com/2008/08/05/health/research/05mind.html?_r=0 (accessed May 13, 2014).

Drain, Chris, and Richard Charles Strong. “Situated Mediation and Technological Reflexivity: Smartphones, Extended Memory, and Limits of Cognitive Enhancement.” In Social Epistemology and Technology: Toward Public Self-Awareness Regarding Technological Mediation, edited by Frank Scalambrino, 187-197. London: Rowman & Littlefield International, 2015.

Ellul, Jacques. The Technological Society. Translated by J. Wilkinson. New York: Vintage Books, 1964.

Firestone, Lisa. “Are You Present for Your Children?” Psychology Today, Sussex Publishers, LLC. May 5, 2014. http://www.psychologytoday.com/blog/compassion-matters/201405/are-you-present-your-children (accessed May 10, 2014).

Guizzo, Danielle. “The Biopolitics of the Female: Constituting Gendered Subjects through Technology.” In Social Epistemology and Technology: Toward Public Self-Awareness Regarding Technological Mediation, edited by Frank Scalambrino, 145-155. London: Rowman & Littlefield International, 2015.

Gunther, Max. The Weekenders. Philadelphia: J. B. Lippincott Company, 1964.

Kiesbye, Stefan, editor. Are Social Networking Sites Harmful? Detroit: Greenhaven Press, 2011.

Kok, Arthur. “Labor and Technology: Kant, Marx, and the Critique of Instrumental Reason.” In Social Epistemology and Technology: Toward Public Self-Awareness Regarding Technological Mediation, edited by Frank Scalambrino, 137-144. London: Rowman & Littlefield International, 2015.

Luna, J. J. How to Be Invisible. New York: Thomas Dunne Books, 2012.

Reider, Patrick. “The Internet and Existentialism: Kierkegaardian and Hegelian Insights.” In Social Epistemology and Technology: Toward Public Self-Awareness Regarding Technological Mediation, edited by Frank Scalambrino, 59-69. London: Rowman & Littlefield International, 2015.

Richardson, Hannah. “Children Should Be Allowed to Get Bored, Expert Says.” BBC. March 22, 2013. http://www.bbc.com/news/education-21895704 (accessed May 14, 2014).

Scalambrino, Frank. “What Control? Life at the Limits of Power Expression.” In Social Epistemology and Technology: Toward Public Self-Awareness Regarding Technological Mediation, edited by Frank Scalambrino, 101-111. London: Rowman & Littlefield International, 2015.

Snopes.com. “Erma Bombeck’s Regrets: A dying Erma Bombeck penned a list of misprioritizations she’d come to regret?” September 29, 2009. http://www.snopes.com/glurge/bombeck.asp. (accessed May 13, 2014).

Snopes.com. “Grandma’s Wash Day: Description of how laundry was done in bygone days?” August 23, 2008. http://www.snopes.com/glurge/washday.asp (accessed May 13, 2014).

Unknown. “Perfect Day.” Grandparents.net. Australian Media Pty. Ltd., 2000. http://www.grandparents.net/perfectday.htm (accessed May 13, 2014).

Zeevi, Daniel. “The Ultimate History of Facebook [INFOGRAPHIC].” SocialMediaToday. February 21, 2013. http://socialmediatoday.com/daniel-zeevi/1251026/ultimate-history-facebook-infographic (accessed May 13, 2014).

Zendaya, with Sheryl Berk. Between U and Me. New York: Disney-Hyperion Books, 2013.

Zuidervaart, Lambert. “Theodor W. Adorno.” The Stanford Encyclopedia of Philosophy (Winter 2015 Edition), edited by Edward N. Zalta. https://plato.stanford.edu/archives/win2015/entries/adorno/.