
Author Information: Luca Tateo, Aalborg University & Federal University of Bahia, luca@hum.aau.dk.

Tateo, Luca. “Ethics, Cogenetic Logic, and the Foundation of Meaning.” Social Epistemology Review and Reply Collective 7, no. 12 (2018): 1-8.

The pdf of the article gives specific page references. Shortlink: https://wp.me/p1Bfg0-44i

Mural entitled “Paseo de Humanidad” on the Mexican side of the US border wall in the city of Heroica Nogales, in Sonora. Art by Alberto Morackis, Alfred Quiróz and Guadalupe Serrano.
Image by Jonathan McIntosh, via Flickr / Creative Commons

 

This essay is in reply to: Miika Vähämaa (2018) Challenges to Groups as Epistemic Communities: Liminality of Common Sense and Increasing Variability of Word Meanings, Social Epistemology, 32:3, 164-174, DOI: 10.1080/02691728.2018.1458352

In his interesting essay, Vähämaa (2018) discusses two issues that I find particularly relevant. The first concerns the foundation of meaning in language, which becomes problematic in the era of connectivism (Siemens, 2005) and post-truth (Keyes, 2004). The second is the appreciation of epistemic virtues in a collective context: how can the group enhance the epistemic skills of the individual?

I will try to explain why these problems are relevant and why it is worth developing Vähämaa’s (2018) reflection in the specific direction of group and person as complementary epistemic and ethical agents (Fricker, 2007). First, I will discuss the foundations of meaning in different theories of language. Then, I will discuss the problems related to the stability and liminality of meaning in the society of “popularity”. Finally, I will propose the idea that the range of contemporary epistemic virtues should be supplemented by an ethical grounding and a cogenetic foundation of meaning.

The Foundation of Meaning in Language

The theories about the origins of human language can be grouped into four main categories, based on the elements they take to characterize ontogenesis and glottogenesis.

Sociogenesis Hypothesis (SH): the idea that language is a conventional product that historically originates from coordinated social activities and is ontogenetically internalized through individual participation in social interactions. The characteristic authors of SH are Wundt, Wittgenstein and Vygotsky (2012).

Praxogenesis Hypothesis (PH): the idea that language historically originates from praxis and coordinated actions. Ontogenetically, language emerges from sensorimotor coordination (e.g. gaze coordination). This is, for instance, the position of Mead, the idea of linguistic primes in Smedslund (Vähämaa, 2018) and Austin’s (1975) theory of language as action.

Phylogenesis Hypothesis (PhH): the idea that humans have been provided by evolution with an innate “language device”, emerging from the evolutionary preference for forming social groups of hunters and for collective, long-duration offspring care (Bouchard, 2013). Ontogenetically, the predisposition for language is wired in the brain and develops through maturation in social groups. This position is represented by evolutionary psychology and by innatist approaches such as Chomsky’s linguistics.

Structure Hypothesis (StH): the idea that human language is a more or less logical system, in which the elements are determined by reciprocal systemic relationships, partly conventional and partly ontic (Thao, 2012). This hypothesis is not really concerned with ontogenesis, but rather with the formal features of symbolic systems of distinctions. It is, for instance, the classical idea of Saussure and of structuralists like Derrida.

According to Vähämaa (2018), every theory of meaning today has to deal with the problem of a dramatic change in the way common sense knowledge is produced, circulated and modified in collective activities. Meaning needs some stability in order to be of collective utility. Moreover, meaning needs some validation to become stable.

The PhH solves this problem with a simple idea: if humans have survived and evolved, their evolutionary strategy for meaning has been successful. In a natural, “hostile” environment, our ancestors must have found a way to communicate such that a danger would be understood in the same way by all group members and under different conditions, including when the danger is not actually present, as in bonfire tales or myths.

The PhH becomes problematic when we consider the post-truth era. What would be the evolutionary advantage of deconstructing the environmental foundations of meaning, even in a virtual environment? For instance, what would be the evolutionary advantage of the common sense belief that global warming is not real, considering that this false belief could bring humankind to extinction?

StH leads to the view of meaning as a configuration of formal conditions. Thus, stability is guaranteed by the structural relations of the linguistic system, rather than by the contribution of groups or individuals as epistemic agents. StH cannot account for the rapidity and liminality of meaning that Vähämaa (2018) attributes to common sense nowadays. SH and PH share the idea that meaning emerges from what people do together, and that stability is both the condition and the product of the fact that we establish contexts of meaningful action, habitual ways of doing things.

The problem today is that our accelerated Western capitalistic societies have multiplied the ways of doing and the number of groups in society, decoupling the habitual from the common sense meaning. New habits, new words, personal actions and meanings are built, disseminated and destroyed in a short time. So, if “Our lives, with regard to language and knowledge, are fundamentally bound to social groups” (Vähämaa, 2018, p. 169), what happens to language and to knowledge when social groups multiply, segregate and disappear in a short time?

From Common Sense to the Bubble

The grounding of meaning in the group as epistemic agent has received a serious blow in the era of connectivism and post-truth. The idea of connectivism is that knowledge is distributed among the different agents of a collective network (Siemens, 2005). Knowledge does not reside in the “mind” or in a “memory”; rather, it is produced in bits and pieces that the epistemic agent is required to search for and assemble through the collective effort of the group’s members.

Thus, depending on the configuration of the network, different information will be connected, and different pictures of the world will emerge. The meaning of words will differ if, for instance, the network of information is aggregated by different groups in combination with specific algorithms. The configuration of groups, mediated by social media, as in the case of contemporary politics (Lewandowsky, Ecker & Cook, 2017), leads to the reproduction of “bubbles” of people who share the very same views and are exposed to the very same opinions, selected by an algorithm that shows only content compliant with their previous content preferences.

The result is that the group loses a great deal of its epistemic capability, which Vähämaa (2018) suggests as a foundation of meaning. The meanings of words that will be preferred in this kind of epistemic bubble are the result of two operations of selection based on popularity. First, meaning will be aggregated by consensual agents, rather than dialectic ones. Meaning will always be convergent rather than controversial.

Second, between alternative meanings, the most “popular” will be chosen, rather than the most reliable. The epistemic bubble of connectivism originates from a misunderstanding. The idea is that a collectivity has more epistemic force than the individual alone, to the extent that any belief is scrutinized democratically and that, if every agent contributes its own bit, knowledge will be more reliable, because it is the result of constant and massive peer review. Unfortunately, events show us a different picture.

Post-truth is actually a massive act of epistemic injustice (Fricker, 2007), to the extent that the reliability of the other as epistemic agent is based on criteria of similarity, rather than on dialectic. One is reliable as long as one is located within my own bubble. Everything outside is “fake news”. The algorithmic selection of information contributes to reinforcing the polarization. Thus, no hybridization becomes possible, and the common sense (Vähämaa, 2018) is reduced to the common bubble. How can the epistemic community still be a source of meaning in the connectivist era?

Meaning and Common Sense

SH and PH about language point to a very important historical source: the philosopher Giambattista Vico (Danesi, 1993; Tateo, 2015). Vico can be considered the scholar of common sense and imagination (Tateo, 2015). Knowledge is built as a product of human experience and crystallized into the language of a given civilization. Civilization is the set of interpretations and solutions that different groups have found to respond to common existential events, such as birth, death, mating, natural phenomena, etc.

According to Vico, all human beings share a fate of mortal existence and rely on each other to get along. This is the notion of common sense: the profound sense of humanity that we all share and that constitutes the ground for human ethical choices, wisdom and collective living. Humans rely on imagination, before reason, to project themselves into others and into the world, in order to understand both. Imagination is the first step towards the understanding of Otherness.

When humans lose contact with this sensus communis, the shared sense of humanity, and start building their meaning on egoism or on pure rationality, civilizations slip into barbarism. Imagination thus gives access to intersubjectivity, the capability of feeling the other, while common sense constitutes the wisdom of developing ethical beliefs that will not harm the other. Vico’s ideas are echoed and made present by critical theory:

“We have no doubt (…) that freedom in society is inseparable from enlightenment thinking. We believe we have perceived with equal clarity, however, that the very concept of that thinking (…) already contains the germ of the regression which is taking place everywhere today. If enlightenment does not [engage in] reflection on this regressive moment, it seals its own fate (…) In the mysterious willingness of the technologically educated masses to fall under the spell of any despotism, in its self-destructive affinity to nationalist paranoia (…) the weakness of contemporary theoretical understanding is evident.” (Horkheimer & Adorno, 2002, xvi)

Common sense is the basis for the wisdom that allows us to question the foundational nature of the bubble. It is the basis for understanding that every meaning is not only defined in a positive way, but is also defined by its complementary opposite (Tateo, 2016).

When one uses the semantic prime “we” (Vähämaa, 2018), one immediately produces a system of meaning that implies the existence of a “non-we”; one is producing otherness. In return, the meaning of “we” can only be clearly defined through the clarification of who is “non-we”. Meaning is always cogenetic (Tateo, 2015). Without the capability to understand that by saying “we” people construct a cogenetic complex of meaning, the group is reduced to a self-confirming, self-reinforcing collective, in which the sense of being a valid epistemic agent is actually faked, because it is nothing but an act of epistemic arrogance.

How can we solve the problem of the epistemic bubble and give the relationship between group and person a real epistemic value? How can we overcome the dangerous overlap between the sense of being functional in the group and false beliefs based on popularity?

Complementarity Between Meaning and Sense

My idea is that we must look into that complex space between “meaning”, understood as a collectively shared complex of socially constructed significations, and “sense”, understood as the very personal elaboration of meaning based on the person’s uniqueness (Vygotsky, 2012; Wertsch, 2000). Meaning and sense feed into each other, like common sense and imagination. Imagination is the psychic function that enables the person to feel into the other, and thus to establish the ethical and affective ground for common sense wisdom. It is the empathic movement for which Kant would later seek a logical foundation.

“Act in such a way that you treat humanity, whether in your own person or in the person of any other, never merely as a means to an end, but always at the same time as an end.” (Kant 1993, p. 36. 4:429)

I would further claim that maybe they feed into each other: the logical foundation is made possible by the synthetic power of empathic imagination. Meaning and sense likewise feed into each other. On the one hand, the collective is the origin of internalized psychic activities (SH), and thus the basis for the sense elaborated about one’s own unique life experience. On the other hand, personal sense constitutes the basis for the externalization of meaning into the arena of collective activities, constantly innovating the meaning of words.

So, personal sense can be a strong antidote to the prevailing force of the meaning produced, for instance, in the epistemic bubble. My sense of what is “ought”, “empathic”, “human” and “ethical”, in other words my wisdom, can help me to develop a critical stance towards meanings that are built in a self-feeding, uncritical way.

Can the dialectic, complementary and cogenetic relationship between sense and meaning become the ground for a better epistemic performance, and for an appreciation of the liminal meaning produced in contemporary societies? In the last section, I will try to provide arguments in favor of this idea.

Ethical Grounding of Meaning

If connectivist and post-truth societies produce meanings based on popularity checks, rather than on epistemic appreciation, we risk a situation in which any belief is the contingent result of a collective epistemic agent that replicates its patterns into bubbles. One will just listen to messages that confirm one’s own preferences and beliefs and reject different ones as unreliable. Inside the bubble there is no way to check the meaning, because the meaning is not cogenetic; it is consensual.

For instance, if I read and share a post on social media claiming that migrants are the main criminal population, then, whatever my initial position toward the news, there is the possibility that within my group I will start to see only posts confirming the initial claim. The claim can be proven wrong, for instance by the press, but the belief will be hard to change, as the meaning of “migrant” in my bubble is likely to remain that of “criminal”. The collectivity will share an epistemically unjust position, to the extent that it will attribute a lessened epistemic capability to those who are not part of the group itself. How can one avoid the group scaffolding “bad” epistemic skills rather than empowering the individual (Vähämaa, 2018)?

The solution I propose is to develop an epistemic virtue based on two main principles: the ethical grounding of meaning and cogenetic logic. The ethical grounding of meaning is directly related to the articulation between common sense and wisdom in Vico’s sense (Tateo, 2015). In a post-truth world in which we cannot appreciate the epistemic foundation of meaning, we must rely on a different epistemic virtue in order to become critical toward messages. Ethical grounding, based on the personal sense of humanity, is of course not an epistemic test of reliability, but it is an alarm bell that makes us legitimately suspicious toward meanings. The second element of the new epistemic virtue is cogenetic logic (Tateo, 2016).

Meaning is grounded in the building of every belief as a complementary system between “A” and “non-A”. This implies that any meaning is constructed through the relationship with its complementary opposite. The truth emerges in a double dialectic movement (Silva Filho, 2014): through Socratic dialogue and through cogenetic logic. In conclusion, let me try to provide a practical example of this epistemic virtue.

The way to begin discriminating potentially fake news or tendentious interpretations of facts would be essentially based on an ethical foundation. As in Vico’s wisdom of common sense, I would base my epistemic scrutiny on the imaginative work that allows me to access the other, and on the cogenetic logic that assumes every meaning is defined by its relationship with its opposite.

Let’s imagine that we are exposed to a post on social media, in which someone states that a caravan of migrants, which is travelling from Honduras across Central America toward the USA border, is actually made of criminals sent by hostile foreign governments to destabilize the country right before elections. The same post claims that it is a conspiracy and that all the press coverage is fake news.

Finally, the post presents some “debunking” pictures showing athletic young Latino men with their faces covered by scarves, to demonstrate that the caravan is not made up of families with children, but of “soldiers” in good shape who do not look as poor and desperate as the “mainstream” media claim. I do not know whether such a post has ever actually been made; I have simply assembled elements of very common discourses circulating in social media.

The task is now to assess the nature of this message, its meaning and its reliability. I could rely on the group as a ground for assessing statements, to scrutinize their truth and justification. However, due to the “bubble” effect, I may fall into a simple tautological confirmation, owing to the configuration of the network of my relations. I would probably find only posts confirming the statements and delegitimizing the opposite positions. In this case, the fact that the group will bolster my epistemic confidence is a very dangerous element.

I could then search for alternative positions in order to establish a dialogue. However, I might not be able, alone, to find information that helps me assess the statement with respect to its degree of bias. How can I exert my skepticism in a context of post-truth? I propose some initial epistemic moves, based on a common sense approach to meaning-making.

1) I must be skeptical of every message that uses violent, aggressive or discriminatory language, and treat such messages as “fake” by default.

2) I must be skeptical of every message that criminalizes or targets whole social groups, even on the basis of real but isolated events, because this interpretation is biased by default.

3) I must be skeptical of every message that attacks or targets persons for their characteristics rather than discussing ideas or behaviors.

Assessing the hypothetical post about the caravan by the three rules mentioned above, one will immediately see that it violates all of them. Thus, no matter what information my epistemic bubble collects, I have justified reasons to be skeptical towards it. The foundation of the meaning of the message will lie neither in the group nor in the person. It will be based on the ethical position of common sense wisdom.

Contact details: luca@hum.aau.dk

References

Austin, J. L. (1975). How to do things with words. Oxford: Oxford University Press.

Bouchard, D. (2013). The nature and origin of language. Oxford: Oxford University Press.

Danesi, M. (1993). Vico, metaphor, and the origin of language. Bloomington: Indiana University Press.

Fricker, M. (2007). Epistemic injustice: Power and the ethics of knowing. Oxford University Press.

Horkheimer, M., & Adorno, T. W. (2002). Dialectic of Enlightenment. Trans. Edmund Jephcott. Stanford: Stanford University Press.

Kant, I. (1993) [1785]. Grounding for the Metaphysics of Morals. Translated by Ellington, James W. (3rd ed.). Indianapolis and Cambridge: Hackett.

Keyes, R. (2004). The Post-Truth Era: Dishonesty and Deception in Contemporary Life. New York: St. Martin’s.

Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369.

Siemens, G. (2005). Connectivism: A learning theory for the digital age. International Journal of Instructional Technology and Distance Learning, 2(1) http://www.itdl.org/Journal/Jan_05/article01.htm

Silva Filho, W. J. (2014). Davidson: Dialog, dialectic, interpretation. Utopía y praxis latinoamericana, 7(19).

Tateo, L. (2015). Giambattista Vico and the psychological imagination. Culture & Psychology, 21(2), 145-161.

Tateo, L. (2016). Toward a cogenetic cultural psychology. Culture & Psychology, 22(3), 433-447.

Thao, T. D. (2012). Investigations into the origin of language and consciousness. New York: Springer.

Vähämaa, M. (2018). Challenges to Groups as Epistemic Communities: Liminality of Common Sense and Increasing Variability of Word Meanings, Social Epistemology, 32:3, 164-174, DOI: 10.1080/02691728.2018.1458352

Vygotsky, L. S. (2012). Thought and language. Cambridge, MA: MIT Press.

Wertsch, J. V. (2000). Vygotsky’s Two Minds on the Nature of Meaning. In C. D. Lee & P. Smagorinsky (eds), Vygotskian perspectives on literacy research: Constructing meaning through collaborative inquiry (pp. 19-30). Cambridge: Cambridge University Press.

Author Information: James A. Marcum, Baylor University, james_marcum@baylor.edu

Marcum, James A. “A Role for Taxonomic Incommensurability in Evolutionary Philosophy of Science.” Social Epistemology Review and Reply Collective 7, no. 7 (2018): 9-14.

The pdf of the article gives specific page references. Shortlink: https://wp.me/p1Bfg0-3YP


Image by Sanofi Pasteur via Flickr / Creative Commons

 

In a review of my chapter (Marcum 2018), Amanda Bryant (2018) charges me with failing to discuss the explanatory role taxonomic incommensurability (TI) plays in my revision of Kuhn’s evolutionary philosophy of science. To quote Bryant at length,

One of Marcum’s central aims is to show that incommensurability plays a key explanatory role in a refined version of Kuhn’s evolutionary image of science. The role of incommensurability on this view is to account for scientific speciation. However, Marcum shows only that we can characterize scientific speciation in terms of incommensurability, without clearly establishing the explanatory payoff of so doing. He does not succeed in showing that incommensurability has a particularly enriching explanatory role, much less that incommensurability is “critical for conceptual evolution within the sciences” or “an essential component of…the growth of science” (168).

Bryant is right. I failed to discuss the explanatory role of TI for the three historical case studies, as listed in Table 8.1, in section 5, “Revising Kuhn’s Evolutionary Image of Science and Incommensurability,” of my chapter. Obviously, my aim in this response, then, is to amend that failure by discussing TI’s role in the case studies and by revising the chapter’s Table to include TI.

Before discussing the role of TI in the historical case studies, I first develop the notion of TI in terms of Kuhn’s revision of the original incommensurability thesis. Kuhn (1983) responded to critics of the original thesis in a symposium paper delivered at the 1982 biennial meeting of the Philosophy of Science Association.

In the paper, Kuhn admitted that his primary intention for incommensurability was more “modest” than what critics had charged him with. Rather than radical or universal changes in terms and concepts—what is often called “global” incommensurability (Hoyningen-Huene 2005, Marcum 2015, Simmons 1994)—Kuhn claimed that only a handful of terms and concepts are incommensurable after a paradigm shift. He called this thesis “local” incommensurability.

More Common Than Incommensurable

Kuhn’s revision of the original incommensurability thesis has important implications for the TI thesis. To that end, I propose three types of TI. The first is comparable to Kuhn’s local incommensurability, in which only a small number of terms and concepts are incommensurable between the lexicons of two scientific specialties. The second is akin to global incommensurability, in which two lexicons are radically and universally incommensurable with one another—sharing only a few commensurable terms and concepts.

An example of this type of incommensurability is the construction of a drastically new lexicon accompanying the evolution of a specialty. Local and global TI represent, then, the two poles of a continuum. For the type of TI falling between these poles, I propose the notion of regional TI—in keeping with the geographical metaphor.

Unfortunately, a sharper delineation among the three types of TI, in terms of the quantity and quality of incommensurable and commensurable terms and concepts composing taxonomically incommensurable lexicons, cannot currently be made, other than that local TI comprises one end of the continuum, global TI the other, and regional TI an intermediate position between them. Notwithstanding this imprecise delineation, the three types of TI are apt for explaining the evolution of the microbiological specialties of bacteriology, virology, and retrovirology, especially with respect to their tempos and modes.

Revised Table. Types of tempo, mode, and taxonomic incommensurability for the evolution of microbiological specialties of bacteriology, virology, and retrovirology (see text for details).

Scientific Specialty    Tempo        Mode         Taxonomic Incommensurability
Bacteriology            Bradytelic   Phyletic     Global
Virology                Tachytelic   Quantal      Regional
Retrovirology           Horotelic    Speciation   Local

Examples Bacterial and Viral

As depicted in the Revised Table, the evolution of bacteriology, with its bradytelic tempo and phyletic mode, is best accounted for through global TI. A large number of novel incommensurable terms and concepts appeared with the evolution of bacteriology and the germ theory of disease, and global TI afforded the bacteriology lexicon the conceptual space to evolve fully and independently by isolating that lexicon from both botany and zoology lexicons, as well as from other specialty lexicons in microbiology.

For example, in terms of microbiology as a specialty separate from botany and zoology, bacteria are prokaryotes compared to other microorganisms such as algae, fungi, and protozoa, which are eukaryotes. Eukaryotes have a nucleus surrounded by a nuclear membrane that separates the chromosomes from the cytoplasm, while prokaryotes do not. Rather, prokaryotes like bacteria have a single circular chromosome located in the nucleoid region of the cell.

However, the bacteriology lexicon does share a few commensurable terms and concepts with the lexicons of other microbiologic specialties and with the cell biology lexicons of botany and zoology. For example, both prokaryotic and eukaryotic cells contain a plasma membrane that separates the cell’s interior from the external environment. Many other incommensurable (and a few commensurable) terms and concepts make up the lexicons of these specialties, but these examples suffice to show how global TI provided the bacteriology lexicon a cognitive environment in which it could evolve as a distinct specialty.

Also, as depicted in the Revised Table, the evolution of virology, with its tachytelic tempo and quantal mode, is best accounted for through regional TI. A relatively smaller number of new incommensurable terms and concepts appeared with the evolution of virology compared to the evolution of bacteriology, and regional TI afforded the virology lexicon the conceptual space to evolve freely and self-sufficiently by isolating that lexicon from the bacteriology lexicon, as well as from other biology lexicons.

For example, the genome of the virus is surrounded by a capsid or protein shell, which distinguishes it from both prokaryotes and eukaryotes—neither of which have such a structure. Moreover, viruses do not have a constitutive plasma membrane, although some viruses acquire a plasma membrane from the host cell when exiting it. However, the function of the viral plasma membrane is different from that of both prokaryotes and eukaryotes.

Interestingly, the term plasma membrane for the virology lexicon is both commensurable and incommensurable, when compared to other biology lexicons. The viral plasma membrane is commensurable in that it is comparable in structure to the plasma membrane of prokaryotes and eukaryotes but it is incommensurable in that it functions differently. Finally, some viral genomes are composed of DNA similar to prokaryotic and eukaryotic genomes while others are composed of RNA; and, it is this RNA genome that led to the evolution of the retrovirology specialty.

Image by AJC1 via Flickr / Creative Commons

And As Seen in the Retrovirological

As depicted lastly in the Revised Table, the evolution of retrovirology, with its horotelic tempo and speciation mode, is best accounted for through local TI. An even smaller number of novel incommensurable terms and concepts accompanied the evolution of retrovirology as compared to the number of novel incommensurable terms and concepts involved in the evolution of the virology lexicon vis-à-vis the bacteriology lexicon.

And, as true for the role of TI in the evolution of bacteriology and virology, local TI afforded the retrovirology lexicon the conceptual space to evolve rather autonomously by isolating that lexicon from the virology and bacteriology lexicons. For example, retroviruses, as noted previously, contain only an RNA genome but the replication of the retrovirus and its genome does not involve replication of the RNA genome from the RNA directly, as for other RNA viruses.

Rather, retrovirus replication involves the formation of a DNA provirus through the enzyme reverse transcriptase. The DNA provirus is subsequently incorporated into the host’s genome, where it remains dormant until replication of the retrovirus is triggered.

The incommensurability associated with retrovirology evolution is local since only a few incommensurable terms and concepts separate the virology and retrovirology lexicons. But that incommensurability was critical for the evolution of the retrovirology specialty (although given how few incommensurable terms and concepts exist between the virology and retrovirology lexicons, a case could be made for retrovirology representing a subspecialty of virology).

Where the Payoff Lies

In her review, Bryant makes a distinction, as quoted above, between characterizing the evolution of the microbiological specialties via TI and explaining their evolution via TI. In terms of the first, characterization, TI is the product of the evolution of a specialty and its lexicon. In other words, when reconstructing historically the evolution of a specialty, the evolutionary outcome is a new specialty and its lexicon—which is incommensurable locally, regionally, or globally with respect to other specialty lexicons.

For example, the retrovirology lexicon—when compared to the virology lexicon—has few incommensurable terms, such as DNA provirus and reverse transcriptase. The second, explanation, involves the process or mechanism by which the evolution of the specialty’s lexicon takes place vis-à-vis TI. In other words, TI plays a critical role in the evolutionary process of a specialty and its lexicon.

Keeping with the retrovirology example, the experimental result that actinomycin D inhibits Rous sarcoma virus was an important anomaly with respect to the virology lexicon, which could only explain the replication of RNA viruses in terms of the Central Dogma’s flow of genetic information. TI, then, represents the mechanism—by providing the conceptual space—for the evolution of a new specialty with respect to incommensurable terms and concepts.

In conclusion, the “explanatory payoff” for TI with respect to the revised Kuhnian evolutionary philosophy of science is that such incommensurability provides isolation for a scientific specialty and its lexicon so that it can evolve from a parental stock. For, without the conceptual isolation to develop its lexicon, a specialty cannot evolve.

Just as biological species like Darwin’s Galápagos finches, for instance, required physical isolation from one another to evolve (Lack 1983), so the evolving microbiological specialties also required conceptual isolation from one another and from other biology specialties and their lexicons. TI accounts for or explains the evolution of science and its specialties in terms of providing the necessary conceptual opportunity for the specialties to emerge and then to evolve.

Moreover, it is of interest to note that an apparent relationship exists between the various tempos and modes and the different types of TI. For example, the retrovirology case study suggests that local TI is commonly associated with a horotelic tempo and speciation mode—which to some extent makes sense intuitively. In other words, speciation requires far fewer lexical changes than phyletic evolution, which requires many more lexical changes or an almost completely new lexicon—as the evolution of bacteriology illustrates.

The proposed evolutionary philosophy of science, then, accounts for the emergence of bacteriology in terms of a specific tempo and mode, as well as a particular type of TI; it thereby provides a rich explanation for its emergence. Furthermore, the quantity and quality of taxonomically incommensurable terms and concepts involved in the evolution of the microbiology specialties suggest the following relative frequency for the different types of TI: local TI > regional TI > global TI.

The Potential of Evolutionary Paradigms

Finally, I proposed in my chapter that Kuhn’s revised evolutionary philosophy of science is a good candidate for a general philosophy of science, even in light of philosophy of science’s current pluralistic or perspectival stance. Interestingly, regardless of the increasing specialization within the natural sciences (Wray 2005), these sciences are moving towards integration in order to tackle complex natural phenomena. For example, cancer is simply too complex a disease to succumb to a single specialty (Williams 2015).

The revised Kuhnian evolutionary philosophy of science helps to appreciate and account for the drive and need for integration of different scientific specialties to investigate complex natural phenomena, such as cancer. Specifically, one of the important reasons for the integration is that no single scientist can master the necessary lexicons, whether biochemistry, bioinformatics, cell biology, genomic biology, immunology, molecular biology, physiology, etc., needed to investigate and eventually to cure the disease. A scientist might be bilingual or even trilingual with respect to specialties but certainly not multilingual.

The conceptual and methodological approach that integrates these various specialties stands a better chance of discovering the pathological mechanisms involved in carcinogenesis and thereby of developing effective therapies. Integrated science, then, requires a systems or network approach, since no one scientist can master the various specialties needed to investigate a complex natural phenomenon.

In the end, TI helps to make sense of why integrated science is important for the future evolution of science and of how an evolutionary philosophy of science can function as a general philosophy of science.

Contact details: james_marcum@baylor.edu

References

Bryant, Amanda. “Each Kuhn Mutually Incommensurable”, Social Epistemology Review and Reply Collective 7, no. 6 (2018): 1-7.

Hoyningen-Huene, Paul. “Three Biographies: Kuhn, Feyerabend, and Incommensurability”, In Rhetoric and Incommensurability. Randy A. Harris (ed.), West Lafayette, IN: Parlor Press, (2005): 150-175.

Kuhn, Thomas S. “Commensurability, Comparability, Communicability”, PSA: 1982, no. 2 (1983): 669-688.

Lack, David. Darwin’s Finches. Cambridge: Cambridge University Press, (1983).

Marcum, James A. Thomas Kuhn’s Revolutions: A Historical and an Evolutionary Philosophy of Science. London: Bloomsbury, (2015).

Marcum, James A. “Revolution or Evolution in Science?: A Role for the Incommensurability Thesis?”, In The Kuhnian Image of Science: Time for a Decisive Transformation? Moti Mizrahi (ed.), Lanham, MD: Rowman & Littlefield, (2018): 155-173.

Simmons, Lance. “Three Kinds of Incommensurability Thesis”, American Philosophical Quarterly 31, no. 2 (1994): 119-131.

Williams, Sarah C.P. “News Feature: Capturing Cancer’s Complexity”, Proceedings of the National Academy of Sciences, 112, no. 15 (2015): 4509-4511.

Wray, K. Brad. “Rethinking Scientific Specialization”, Social Studies of Science 35. no. 1 (2005): 151-164.

Author Information: Adam Riggio, SERRC Digital Editor, serrc.digital@gmail.com

Riggio, Adam. “Action in Harmony with a Global World.” Social Epistemology Review and Reply Collective 7, no. 3 (2018): 20-26.

The pdf of the article gives specific page references. Shortlink: https://wp.me/p1Bfg0-3Vp

Image by cornie via Flickr / Creative Commons

 

Bryan Van Norden has become about as notorious as an academic philosopher can be while remaining a virtuous person. His notoriety came with a column in the New York Times that took the still-ethnocentric approach of many North American and European university philosophy departments to task. The condescending and insulting dismissal of great works of thought from cultures and civilizations beyond Europe and European-descended North America should scandalize us. That it does not is to the detriment of academic philosophy’s culture.

Anyone who cares about the future of philosophy as a tradition should read Taking Back Philosophy and take its lessons to heart, if one does not agree already with its purpose. The discipline of philosophy, as practiced in North American and European universities, must incorporate all the philosophical traditions of humanity into its curriculum and its subject matter. It is simple realism.

A Globalized World With No Absolute Hierarchies

I am not going to argue for this decision, because I consider it obvious that this must be done. Taking Back Philosophy is a quick read, an introduction to a political task that philosophers, no matter their institutional homes, must support if the tradition is going to survive beyond the walls of universities increasingly co-opted by destructive economic, management, and human resources policies.

Philosophy as a creative tradition cannot survive in an education economy built on the back of student debt, where institutions’ priorities are set by a management class yoked to capital investors and corporate partners, which prioritizes the proliferation of countless administrative-only positions while highly educated teachers and researchers compete ruthlessly for poverty wages.

With this larger context in mind, Van Norden’s call for the enlargement of departments’ curriculums to cover all traditions is one essential pillar of the vision to liberate philosophy from the institutions that are destroying it as a viable creative process. In total, those four pillars are 1) universal accessibility, economically and physically; 2) community guidance of a university’s priorities; 3) restoring power over the institution to creative and research professionals; and 4) globalizing the scope of education’s content.

Taking Back Philosophy is a substantial brick through the window of the struggle to rebuild our higher education institutions along these democratic and liberating lines. Van Norden regularly publishes work of comparative philosophy that examines many problems of ethics and ontology using texts, arguments, and concepts from Western, Chinese, and Indian philosophy. But if you come to Taking Back Philosophy expecting more than a brick through those windows, you’ll be disappointed. One chapter walks through a number of problems as examples, but the sustained conceptual engagement of a creative philosophical work is absent. Only the call to action remains.

What a slyly provocative call it is – the book’s last sentence, “Let’s discuss it . . .”

Unifying a Tradition of Traditions

I find it difficult to write a conventional review of Taking Back Philosophy, because so much of Van Norden’s polemic is common sense to me. Of course, philosophy departments must be open to primary material from all the traditions of the human world, not just the Western. I am incapable of understanding why anyone would argue against this, given how globalized human civilization is today. For the context of this discussion, I will consider a historical and a technological aspect of contemporary globalization. Respectively, these are the fall of the European military empires, and the incredible intensity with which contemporary communications and travel technology integrates people all over Earth.

We no longer live in a world dominated by European military colonial empires, so re-emerging centres of culture and economics must be taken on their own terms. The Orientalist presumption, which Edward Said spent a career mapping, that there is no serious difference among Japanese, Malay, Chinese, Hindu, Turkic, Turkish, Persian, Arab, Levantine, or Maghreb cultures is not only wrong, but outright stupid. Orientalism as an academic discipline thrived for the centuries it did only because European weaponry intentionally and persistently kept those cultures from asserting themselves.

Indigenous peoples – throughout the Americas, Australia, the Pacific, and Africa – who have been the targets of cultural and eradicative genocides for centuries now claim and agitate for their human rights, as well as inclusion in the broader human community and species. I believe most people of conscience are appalled and depressed that these claims are controversial at all, and even seen by some as a sign of civilizational decline.

The impact of contemporary technology I consider an even more important factor than the end of imperialist colonialism in the imperative to globalize the philosophical tradition. Despite the popular rhetoric of contemporary globalization, the human world has been globalized for millennia. Virtually since urban life first developed, long-distance international trade and communication began as well.

Here are some examples. Some of the first major cities of ancient Babylon achieved their greatest economic prosperity through trade with cities on the south of the Arabian Peninsula, and as far east along the Indian Ocean coast as Balochistan. From 4000 to 1000 years ago, Egyptian, Roman, Greek, Persian, Arab, Chinese, Mongol, Indian, Bantu, Malian, Inca, and Anishinaabeg peoples, among others, built trade networks and institutions stretching across continents.

Contemporary globalization is different in the speed and quantity of commerce, and diversity of goods. It is now possible to reach the opposite side of the planet in a day’s travel, a journey so ordinary that tens of millions of people take these flights each year. Real-time communication is now possible between anywhere on Earth with broadband internet connections thanks to satellite networks and undersea fibre-optic cables. In 2015, the total material value of all goods and commercial services traded internationally was US$21-trillion. That’s a drop from the previous year’s all-time (literally) high of US$24-trillion.[1]

Travel, communication, and productivity have never been so massive or intense in all of human history. The major control hubs of the global economy are no longer centralized in a small set of colonial powers, but spread among a variety of economic centres throughout the world, depending on industry: from Beijing, Moscow, Mumbai, Lagos, and Berlin to Tokyo and Washington, the oil fields of Kansas, the Dakotas, Alberta, and Iraq, and the coltan, titanium, and tantalum mines of Congo, Kazakhstan, and China.

All these proliferating lists express a simple truth – all cultures of the world now legitimately claim recognition as equals, as human communities sharing our Earth as we hollow it out. Philosophical traditions from all over the world are components of those claims to equal recognition.

The Tradition of Process Thought

So that is the situation forcing a recalcitrant and reactionary academy to widen its curricular horizons – Do so, or face irrelevancy in a global civilization with multiple centres all standing as civic equals in the human community. This is where Van Norden himself leaves us. Thankfully, he understands that a polemic ending with a precise program immediately becomes empty dogma, a conclusion which taints the plausibility of an argument. His point is simple – that the academic discipline must expand its arms. He leaves open the more complex questions of how the philosophical tradition itself can develop as a genuinely global community.

Process philosophy is a relatively new philosophical tradition, which can adopt the classics of Daoist philosophy as broad frameworks and guides. By process philosophy, I mean the research community that has grown around Gilles Deleuze and Félix Guattari as primary innovators of their model of thought – a process philosophy that converges with an ecological post-humanism. The following are some essential aspects of this new school of process thinking, each principle in accord with the core concepts of the foundational texts of Daoism, Dao De Jing and Zhuang Zi.

Ecological post-humanist process philosophy is a thorough materialism, but it is an anti-reductive materialism. All that exists is bodies of matter and fields of force, whose potentials include everything for which Western philosophers have often felt obligated to postulate a separate substance over and above matter, whether calling it mind, spirit, or soul.

As process philosophy, the emphasis in any ontological analysis is on movement, change, and relationships instead of the more traditional Western focus on identity and sufficiency. If I can refer to examples from the beginning of Western philosophy in Greece, process thought is an underground movement with the voice of Heraclitus critiquing a mainstream with the voice of Parmenides. Becoming, not being, is the primary focus of ontological analysis.

Process thinking therefore is primarily concerned with potential and capacity. Knowledge, in process philosophy, as a result becomes inextricably bound with action. This unites a philosophical school identified as “Continental” in common-sense categories of academic disciplines with the concerns of pragmatist philosophy. Analytic philosophy took up many concepts from early 20th century pragmatism in the decades following the death of John Dewey. These inheritors, however, remained unable to overcome the paradoxes stymieing traditional pragmatist approaches, particularly how to reconcile truth as correspondence with knowledge having a purpose in action and achievement.

A solution to this problem of knowledge and action was developed in the works of Barry Allen during the 2000s. Allen built an account of perception that was rooted in contemporary research in animal behaviour, human neurology, and the theoretical interpretations of evolution in the works of Stephen Jay Gould and Richard Lewontin.

His first analysis, focussed as it was on the dynamics of how human knowledge spurs technological and civilizational development, remains humanistic. Arguing from discoveries of how profoundly the plastic human brain is shaped in childhood by environmental interaction, Allen concludes that successful or productive worldly action itself constitutes the correspondence of our knowledge and the world. Knowledge does not consist of a private reserve of information that mirrors worldly states of affairs, but of the physical and mental interaction of a person with surrounding processes and bodies to constitute those states of affairs. The plasticity of the human brain and our powers of social coordination are responsible for the peculiarly human mode of civilizational technology, but the same power to constitute states of affairs through activity is common to all processes and bodies.[2]

“Water is fluid, soft, and yielding. But water will wear away rock, which is rigid and cannot yield. Whatever is soft, fluid, and yielding will overcome whatever is rigid and hard.” – Lao Zi
The Burney Falls in Shasta County, Northern California. Image by melfoody via Flickr / Creative Commons

 

Action in Phase With All Processes: Wu Wei

Movement of interaction constitutes the world. This is the core principle of pragmatist process philosophy, and as such brings this school of thought into accord with the Daoist tradition. Ontological analysis in the Dao De Jing is entirely focussed on vectors of becoming – understanding the world in terms of its changes, movements, and flows, as each of these processes integrate in the complexity of states of affairs.

Not only is the Dao De Jing a foundational text in what is primarily a process tradition of philosophy, but it is also primarily pragmatist. Its author Lao Zi frames ontological arguments in practical concerns, as when he writes, “The most supple things in the world ride roughshod over the most rigid” (Dao De Jing §43). This is a practical and ethical argument against a Parmenidean conception of identity requiring stability as a necessary condition.

What cannot change cannot continue to exist, as the turbulence of existence will overcome and erase what can exist only by never adapting to the pressures of overwhelming external forces. What can only exist by being what it now is, will eventually cease to be. That which exists in metamorphosis and transformation has a remarkable resilience, because it is able to gain power from the world’s changes. This Daoist principle, articulated in such abstract terms, is in Deleuze and Guattari’s work the interplay of the varieties of territorializations.

Knowledge in the Chinese tradition, as a concept, is determined by an ideal of achieving harmonious interaction with an actor’s environment. Knowing facts of states of affairs – including their relationships and tendencies to spontaneous and proliferating change – is an important element of comprehensive knowledge. Nonetheless, Lao Zi describes such catalogue-friendly factual knowledge as, “Those who know are not full of knowledge. Those full of knowledge do not know” (Dao De Jing 81). Knowing the facts alone is profoundly inadequate to knowing how those facts constrict and open potentials for action. Perfectly harmonious action is the model of the Daoist concept of Wu Wei – knowledge of the causal connections among all the bodies and processes constituting the world’s territories understood profoundly enough that self-conscious thought about them becomes unnecessary.[3]

Factual knowledge is only a condition of achieving the purpose of knowledge: perfectly adapting your actions to the changes of the world. All organisms’ actions change their environments, creating physically distinctive territories: places that, were it not for my action, would be different. In contrast to the dualistic Western concept of nature, the world in Daoist thought is a complex field of overlapping territories whose tensions and conflicts shape the character of places. Fulfilled knowledge in this ontological context is knowledge that directly conditions your own actions and the character of your territory to harmonize most productively with the actions and territories that are always flowing around your own.

Politics of the Harmonious Life

The Western tradition, especially in its current sub-disciplinary divisions of concepts and discourses, has treated problems of knowledge as a domain separate from ethics, morality, politics, and fundamental ontology. Social epistemology is one field of the transdisciplinary humanities that unites knowledge with political concerns, but its approaches remain controversial in much of the conservative mainstream academy. The Chinese tradition has fundamentally united knowledge, moral philosophy, and all fields of politics, especially political economy, since the popular eruption of Daoist thought in the Warring States period 2300 years ago. Philosophical writing throughout eastern Asia since then has operated in this field of thought.

As such, Dao-influenced philosophy has much to offer contemporary progressive political thought, especially the new communitarianism of contemporary social movements with their roots in Indigenous decolonization, advocacy for racial, sexual, and gender liberation, and 21st century socialist advocacy against radical economic inequality. In terms of philosophical tools and concepts for understanding and action, these movements have dense forebears, but a recent tradition.

The movement for economic equality and a just globalization draws on Antonio Gramsci’s introduction of radical historical contingency to the Marxist tradition. While its phenomenological and testimonial principles and concepts are extremely powerful and viscerally rooted in the lived experience of subordinated – what Deleuze and Guattari called minoritarian – people as groups and individuals, the explicit resources of contemporary feminism are likewise a century-old storehouse of discourse. Indigenous liberation traditions draw from a variety of philosophical traditions lasting millennia, but the ongoing systematic and systematizing revival is almost entirely a 21st century practice.

Antonio Negri, Rosi Braidotti, and Isabelle Stengers’ masterworks unite an analysis of humanity’s destructive technological and ecological transformation of Earth and ourselves to develop a solution to those problems rooted in communitarian moralities and politics of seeking harmony while optimizing personal and social freedom. Daoism offers literally thousands of years of work in the most abstract metaphysics on the nature of freedom in harmony and flexibility in adaptation to contingency. Such conceptual resources are of immense value to these and related philosophical currents that are only just beginning to form explicitly in notable size in the Western tradition.

Van Norden has written a book that is, for philosophy as a university discipline, a wake-up call to this obstinate branch of the Western academy. The world around you is changing, and if you hold so fast to the contingent borders of your tradition, your territory will be overwritten, trampled, torn to bits. Live and act harmoniously with the changes that are coming. Change yourself.

It isn’t so hard to read some Lao Zi for a start.

Contact details: serrc.digital@gmail.com

References

Allen, Barry. Knowledge and Civilization. Boulder, Colorado: Westview Press, 2004.

Allen, Barry. Striking Beauty: A Philosophical Look at the Asian Martial Arts. New York: Columbia University Press, 2015.

Allen, Barry. Vanishing Into Things: Knowledge in Chinese Tradition. Cambridge: Harvard University Press, 2015.

Bennett, Jane. Vibrant Matter: A Political Ecology of Things. Durham: Duke University Press, 2010.

Betasamosake Simpson, Leanne. As We Have Always Done: Indigenous Freedom Through Radical Resistance. Minneapolis: University of Minnesota Press, 2017.

Bogost, Ian. Alien Phenomenology, Or What It’s Like to Be a Thing. Minneapolis: Minnesota University Press, 2012.

Braidotti, Rosi. The Posthuman. Cambridge: Polity Press, 2013.

Deleuze, Gilles. Bergsonism. Translated by Hugh Tomlinson and Barbara Habberjam. New York: Zone Books, 1988.

Chew, Sing C. World Ecological Degradation: Accumulation, Urbanization, and Deforestation, 3000 B.C. – A.D. 2000. Walnut Creek: Altamira Press, 2001.

Negri, Antonio, and Michael Hardt. Assembly. New York: Oxford University Press, 2017.

Parikka, Jussi. A Geology of Media. Minneapolis: University of Minnesota Press, 2015.

Riggio, Adam. Ecology, Ethics, and the Future of Humanity. New York: Palgrave MacMillan, 2015.

Stengers, Isabelle. Cosmopolitics I. Translated by Robert Bononno. Minneapolis: Minnesota University Press, 2010.

Stengers, Isabelle. Cosmopolitics II. Translated by Robert Bononno. Minneapolis: Minnesota University Press, 2011.

Van Norden, Bryan. Taking Back Philosophy: A Multicultural Manifesto. New York: Columbia University Press, 2017.

World Trade Organization. World Trade Statistical Review 2016. Retrieved from https://www.wto.org/english/res_e/statis_e/wts2016_e/wts2016_e.pdf

[1] That US$3-trillion drop in trade was largely the proliferating effect of the sudden price drop of human civilization’s most essential good, crude oil, to just less than half of its 2014 value.

[2] A student of Allen’s arrived at this conclusion in combining his scientific pragmatism with the French process ontology of Deleuze and Guattari in the context of ecological problems and eco-philosophical thinking.

[3] This concept of knowledge as perfectly harmonious but non-self-conscious action also conforms to Henri Bergson’s concept of intuition, the highest (so far) form of knowledge that unites the perfect harmony in action of brute animal instinct with the self-reflective and systematizing power of human understanding. This is a productive way for another creative contemporary philosophical path – the union of vitalist and materialist ideas in the work of thinkers like Jane Bennett – to connect with Asian philosophical traditions for centuries of philosophical resources on which to draw. But that’s a matter for another essay.

Author Information: Peter Taylor, University of Massachusetts Boston, peter.taylor@umb.edu

Taylor, Peter. “Not Throwing Up My ‘hand in defeat … or reduc[ing] everything to contextual complexity’: A Short Response to Lynch’s Counter-Criticisms.” Social Epistemology Review and Reply Collective 5, no. 4 (2016): 65-66.

The PDF of the article gives specific page numbers. Shortlink: http://wp.me/p1Bfg0-2SL


Loch Tay

Image credit: spodzone, via flickr

Bill Lynch is clearly more accepting of using the explanatory form of natural selection for evolution than I am.[1] He wants, moreover, to discourage readers from exploring my account of Darwin, from which my criticisms flow, by claiming that these criticisms play into the hands of intelligent design exponents,[2] fall into the STS temptation of blowing up all explanations,[3] and “conflate … the issue of whether a proposed explanation can explain a particular phenomenon and whether we can know it to be true given the limited tools at our disposal.”[4]