Part IV: Prognosis, Academic Agonies and How to Avoid Them, Joseph Agassi






1. The Academic and the University Administration

The traditional university was administered by its senate and faculties, with the rector running the senate and deans running their faculties. Senates included all and almost only the university’s professors; faculties had all academic staff as their members except for the assistants. Rectors, deans, and department heads were professors elected for limited periods. They had administrative staff and graduate assistants to help them. When necessary, some junior members helped, lecturers or assistant professors and associate or extraordinary professors, especially in running some committees. In old times, a department often had one professor and two assistants competing for the position of his heir and hating each other accordingly.

The English system of Oxbridge has its college system, well described in C. P. Snow’s 1951 novel The Masters. I will not go into it except to say it describes a small college through a narration about its fellows, hardly noticing students and non-academic staff. Colleges and universities must employ maintenance crews, of course. They usually also employ an academic secretary and the staff necessary for the performance of the secretarial duties. The most important and heaviest administrative task today, that of keeping records of students’ achievements, was traditionally left to students: every student kept his own record booklet. When I was a student, I had such a booklet. For a time I also worked as a clerk in the office of the academic secretary of my Alma Mater. The whole administrative staff of the university was a handful. I have not mentioned here the university laboratories personnel, the legal department, the department of extramural studies, the academic press, the public-relations office and other specialized departments. It does not matter overmuch; the point is that the whole academic administration was very small and today it is very big. They say that today the normal ratio is one administrator to two academics. The suspicion is that the proportion is one to one.[1] I was shocked to find bouncers at the entrance to the hall where I delivered introduction-to-philosophy lectures, who stopped elderly people from entering unless they had proof of having paid to listen to me. I learned this by accident: a bouncer did not recognize me as the lecturer and stopped me. I refused to cooperate, and this did not go unnoticed: the faculty office made an unfriendly comment.

Two innovations in the university administration since my student days are most important as threats to academic freedom. One is the general adoption of the system that previously existed only in the United States of America: the appointment of a president and a board of governors who are together responsible for the university’s budget, thereby gaining control over functions of the university that members of traditional universities would have found an intolerable encroachment on academic freedom. The other most important innovation is the institution of a research authority whose role is to pressure professors to apply for grants. It is an outrage.

The justifications of these innovations are economic and functional: academic faculties have much lighter teaching duties than schoolteachers do, because allegedly they have to keep up with research and to contribute to it. This repeatedly raises the question, which role is more central to Academe, teaching or research? The question is repeatedly dismissed on the ground that the two functions strongly interact: to be a good academic teacher you must do research and vice versa. This argument is refuted by the presence in Academe of both teaching positions and research professorships. This refutation may be answerable, but the discussion of these matters is too poor to raise the objection and the possible response to it. Indeed, the discussion is redundant: the administration has decided that there is a need for a research office—usually run by a dean of research elected by the senate—and the faculty can seldom question administrative decisions.

Things are changing due to the tremendous growth of the system of higher education, with the mushroom growth of teaching colleges, namely institutions that grant no degrees, like community colleges, or only lower degrees, like some open universities. Teachers there are not expected to engage in research, at least not as a rule. Some of their graduates become students in universities; they are often not up to it. Nevertheless, these teachers have my admiration, since they are not engaged in phony research and their students are more often eager to learn, as they do not have the incentive of academic degrees.

The story told here is different. It is about the power structure of the standard university. This is of little import, as it is largely fictitious—not the administrative posts, but the controls that come with them. There is a body that officially runs the daily affairs of a university; it is officially recognized; it is the collection of the holders of the powerful offices: the president, the rector, and a few others. But the governing body may be unofficial. Either way, members of senate hardly know what that body does, or even what its agenda is. Clearly, the modern university or college is, financially speaking, a substantial organization. A part of its budget comes from fees, and the portion of the budget that this part comprises is a major characteristic of the institution. For example, the question of how important the university’s football team is depends on the question of how much the university depends on fees. The dependence of the university on research grants is similar. These are matters that the administration controls as a matter of course, and that this control impinges on academic life is all too obvious. This way the faculty gradually lost its self-administration to the long-forgotten Cold War—first in the USA and soon in the whole western world. I do not know how academic life is run in former communist countries or in the Far East.

When discussing the reform of the university, it is worthwhile to begin with the administration. The reason for this may not be very obvious, but it is a good reason: the university can exist with minimal administrative staff, perhaps not very efficiently, but it can. Yet, obviously, it cannot possibly exist with no faculty or even with a very small faculty. When discussing the reform of the university, it is thus worthwhile to check the function of each of its arms: the faculty, the students, the maintenance staff and the administrative staff. Let me skip discussion of the maintenance staff, although I have much to say about its diverse functions, from libraries and restaurants to on-campus entertainment (a theatre, a museum, a zoo) and on-campus medical services. The administration has to serve mainly the needs of the faculty and of the students. The question of what these needs are is all too obviously a matter of decisions that have little or nothing to do with scholarship; these needs, anyway, are fairly clear, since researchers list their requests for what they consider useful for them. The same holds for other aspects of the academic institutions that are full board, such as the academic institutions of some religious orders and military academies with fairly rigid academic curricula. In any case, these matters are hardly studied.[2] Even the impact of these institutions on Academe at large is hardly studied. I will not go into that. Suffice it to notice that the faculties of engineering and of communication technologies are relatively newly founded and they come to serve a market—as do the faculties of medicine and of law[3] that are as old as the university. The impact of these is obvious. Traditionally, legal studies were meant to serve not the market but the community. These days some colleges aim to serve the (alleged) needs of the community, others aim to serve those of the market.
They all intend to supply as many degrees of all sorts as the market can absorb. As for the desire to study regardless of the possible use of that knowledge as a means of livelihood, it is totally ignored. Such people do exist, but planners overlook them, perhaps because they comprise an insignificant minority. As long as university training is market-oriented, market research provides the guidance for training. Where this is not the case—the paradigm being, as is well known, the faculty of arts—the question is, how free-market oriented is the nation’s economy? I do not know.

In the discussion of the role of Academe, it is very important to beware of hostility. The hostility to market research in matters intellectual is conspicuous. It is the hallmark of a respected school in contemporary philosophy, the celebrated Frankfurt School. This hostility causes much harm. It does not stop market research; it only blocks critical discussion of it, thus blocking its ability to attain high quality. Guardians of some endangered subjects—ancient Akkadian has become the rubberstamp instance of that, I do not know why—protest that market-oriented attitudes suffocate them. They may be right, I cannot judge. Only a few students will enrol for courses whose texts are largely written in cuneiform, but these students still need professors and departments and departmental secretaries and all that. Statisticians know that, and this includes the ones who conduct market research. They assign weights to marginal cases (by the method of inverse probability weighting) so that they will not drop out of sight: we do not want to face a situation in which not a single living individual is able to read cuneiform. And it is the marginal cases that will be the first victims of the hostility to market research. This is a general rule: whatever can be reached by means of hostility will be better reached otherwise. It does not look that way because in democracy we have to fight prejudices that enlightened dictatorships can overcome with ease. Except that enlightened dictatorship is a fantasy (one that the better members of the Frankfurt School spoke against) and we cannot circumvent the need for education as the only means we have to sustain democracy. And so, there is no better cure for hostility than education for the democratic aptitude.
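The weighting that statisticians use to keep marginal cases from dropping out of sight can be sketched in a few lines. This is a minimal illustrative toy, not anything drawn from an actual survey; all numbers and the scenario (interest in cuneiform studies) are invented for the example:

```python
# A minimal sketch of inverse probability weighting (IPW): each sampled
# unit is weighted by the reciprocal of its probability of being sampled,
# so rarely sampled ("marginal") cases are not swamped by the majority.

def ipw_mean(values, sampling_probs):
    """Estimate a population mean from a sample in which unit i was
    included with probability sampling_probs[i]. Rare units receive
    large weights 1/p and so remain visible in the estimate."""
    weights = [1.0 / p for p in sampling_probs]
    total = sum(w * v for w, v in zip(weights, values))
    return total / sum(weights)

# Hypothetical data: 1 = interested in cuneiform studies, 0 = not.
# The lone interested respondent came from a stratum sampled at p = 0.01;
# the uninterested respondents came from a stratum sampled at p = 0.5.
values = [1, 0, 0, 0]
probs = [0.01, 0.5, 0.5, 0.5]

naive = sum(values) / len(values)   # 0.25: the rare case nearly vanishes
weighted = ipw_mean(values, probs)  # ~0.943: the rare stratum is restored
```

The design point matches the author’s remark: the unweighted sample mean would let the marginal group shrink toward invisibility, while the inverse-probability weights restore each respondent to the share of the population it stands for.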

Advisers seldom discuss democratic aptitudes; theoreticians place them in political science and forget them because they belong to applied psychology. Psychologists ignore them, as psychologists are elitists and thus seldom democrats. The most important and least noticed of these aptitudes is a sense of proportion. It is often lost to obsession. Obsessions we should never try to treat in others. This makes my present discussion of no use to you right now—at least not until you are tenured and know how to neutralize students’ wrath. For, when treatment of an obsession threatens to be successful, the obsessed penalize their benefactors. This is a shame, because those obsessed with their studies may excel in them; it will do them no good (as Freud argued, the implementation of an obsession brings no relief, even when it is successful), but they may contribute anyway. Still, leave them alone, at least while you are a beginner. Nowadays you have no interest in the university administration, much less in its future reform. This part of the present work may say little to you. So perhaps you should leave it for later.

2. The Academic and the Economy

The public image of Academe and of academics is intentionally confused, misleading, and hyper-conservative. This is the inevitable outcome of the very presence of public-relations officers whose task is to serve authorities. Public-relations officers and public-relations systems are not at fault; the unrestrained drive for success is. We still suffer the shock that nineteenth-century western society experienced during the transition from agrarian to urban society—as described in the monumental literature from the writings of Charles Dickens to those of George Eliot and Evelyn Waugh, from Flaubert, Balzac and Stendhal to Émile Zola and Marcel Proust, not to mention Jack London and F. Scott Fitzgerald and Joseph Conrad and Pirandello and Joseph Roth and Maxim Gorky and ever so many other wonderful authors, not to mention the tremendous contribution of the realist cinema to our familiarity with this painful transition. I am not going to discuss this painful transition, except to say that its most pertinent aspect—especially here—is the contribution of education to the process. For me it is epitomized in the stories of Thomas Alva Edison and of Wilhelm Neurath, the father of Otto Neurath, the leading early twentieth-century philosopher. The story of Edison is well known, from rags to great riches, all by his own wits. He is larger than life, too fantastic to be a role model. We can ignore him here, especially since he was only once in intellectual society, where he read a terrific paper and was laughed at, never to forget the humiliation. Among his hundreds of great innovations, his greatest scientific discovery was thermionic emission (hot metals emit electrons), and he registered it as a patent (the diode) rather than write it up as a scientific paper.
As to Wilhelm Neurath, he grew up in an East-European Jewish milieu—poor and pious—receiving the narrow Jewish education traditional in that milieu, and yet he managed to end up a professor of political economy in the glittering Vienna of the end of the nineteenth century, in one of the few universities in the world that expected its faculty to perform research. He was a pioneer: he wrote about the right to work.

Uncommon as such individuals are, they served as role models for countless parents eager to see their offspring emerge out of the stultifying poverty into which they were born. This laid an intolerable burden on the offspring. Even reading the CVs of successful academics is depressing, as they parade successes and conceal failures in obligatory deception. It is bound to cause pain to a young ambitious academic like you. So you should know that a CV is not the place to show that its subject was once in your shoes, and that by sheer luck and by the tremendous good will of ever so many people, failure failed to reign supreme. For this you may look up honest biographies. You will find some, since encouragement is an essential ingredient of success.

The public image of Academe and of its inhabitants is intentionally confused, misleading, and hyper-conservative. This is not the worst. Public images are not that important. Physicians, for example, project a worse image, that of faultless life savers. Why should anyone wish to appear infallible? The law recognizes the right to make harmful mistakes not due to thoughtlessness, negligence or similarly unacceptable conduct. To assume the ability to be infallible is to forego this recognition, to relinquish, to waive, to give up voluntarily the right to such mistakes. It is to forego the right to admit error, to forego the opportunity to help others avoid repetition of the error, to foster a conservative attitude even in cases that cry aloud for reform. When Ignaz Semmelweis demanded that physicians wash their hands after they perform post-mortem examinations, it was a demand for a small and inexpensive reform with great benefit. He was punished severely for it. Fortunately, some years later, two individuals, Joseph Lister and Louis Pasteur, took up the matter; this way they started modern medicine. And still, after all this, physicians prefer to appear infallible! It leaves me speechless,[4] except that some progress did take place here and there. Nevertheless, let me report: whenever I voice some criticism of Academe, no matter how marginal, the very fact that I voice it regularly shocks my audience. This shows that they view Academe as a utopia, as an island of perfection. It is a minor, innocuous self-deception; yet for some people it costs too much. Concerned parents who make tremendous economic efforts to afford academic education for their offspring seek the best institution for them. They seek the advice of the most informed friend, relation, neighbor or passerby. These take the utopian image a jot more seriously. They advise sending the young victim to an Ivy League school with no concern for their needs and no familiarity with them.
This is just terrible.

The free market deviates from optimal operation due to the ignorance of agents. This is known in jargon as friction. The very use of this word, let me repeat, is apologetic: Galileo’s law of free fall disregards the air friction that causes a feather and a parachute to deviate from it; free-market economists see this as license to ignore refutations. Yet Galileo’s followers studied the physics of feathers and parachutes with great success; not so the defenders of free-market theory. This holds particularly for the employment market, and particularly for the market in academic jobs (like the one you hope to acquire soon). Milton Friedman waved away criticism of free-market theory by reference to its success. The economic crisis of 2008 called his bluff.

The public image of Academe and of its inhabitants is intentionally confused, misleading, and hyper-conservative. Consider the pay that academics receive. It is sheer treason to say so, but say so I will: academic jobs are better paid than academics would settle for. My friend Robert S. Cohen was the chair of the physics department at Boston University when the school administration told heads of departments to list people in their departments for dismissal. Rather than do this, Cohen managed to have his staff volunteer pay-cuts to prevent the planned dismissals. This took some doing, to be sure, and it cannot be a standard cure for all the economic ills of all academic institutions. Nevertheless, it is a story that does have a moral to it—one that academics prefer to ignore.

They have a point, though. The point being economic, appeal has to be made to economic theory. Yet economic theory, it is well known, has nothing to say about innovation. The assessment of the need for Academe has two components: the training of future technologists and the continuation of current research. Now not all research is academic. Thus, universities have been active in the market since even before the proliferation of startups. Also, research in startups is mainly private. Nevertheless, the problem remains: how much does the ability of Academe to keep up technological research depend on the engagement of the university in the study of dead ancient languages? The answer seems obvious: there is no reason to assume that the ability of the university to contribute to the nation’s ability to keep up-to-date in any way depends on its having a department for the study of ancient languages. This answer is false. It rests on a presumption about future research that is palpably absurd: its presumptions are commonsensical, yet all great breakthroughs were radical alterations of common sense. In particular, any plan to improve the efficiency of research, particularly a commonsensical one, may be the death of all research. If you are not sensitive to the ability of minor and seemingly irrelevant causes to change a whole system, you should try to immerse yourself in ecological studies for a few weeks.

The public image of Academe and of its inhabitants is intentionally confused, misleading, and hyper-conservative. This need not be disastrous. All it takes is to ignore it as much as possible. The planners of reforms of Academe should ignore people swayed by its public image and avoid arguing with them. A part of the plan to reform Academe, when there is such a plan, may be a consideration of the public image of Academe and of how to improve it.

Meanwhile we should discuss the question, what is the place of Academe in the economy, does it require alteration, and if so, what alteration? We should remember that western Academe was a system of sinecure, of priests with no communities. They had no obligations. They used their students as copyists of their texts. By now they do not need students for that: the computer does it ever so much more efficiently. Academics who write essays or books still need their peers and their students—as critics and as commentators. The academy is a terrific learning environment despite the great damage that the system does to itself by using incentives and other methods meant to force people to study. At the time of the foundation of the western-style academy, all of its members were Christians—faculty, students and all else. The great philosopher Michael Polanyi, whom I had the privilege of meeting personally, studied at Budapest University in the early twentieth century, and he enrolled in the faculty of medicine merely because he was a Jew: other faculties were closed to Jews. When he graduated, he changed to physical chemistry, his true interest. He was sufficiently creative to have earned a fan letter from Albert Einstein discussing his ideas with him. Now Polanyi was lucky to enter university at all. Traditional Jewish students had only Jewish institutes of learning to support them. Traditionally, Jews had two kinds of institutions of learning (not higher learning: there is no such thing in Jewish tradition): the institute for solitary learning, which was a place with books, and the school where one sought instructors and co-students and where one learned religious law and medicine (as one did in the early universities). The institute of solitary learning was attached to a synagogue and so had no staff. The Jewish house of learning had a chief rabbi and a few helpers to run it, but no budget and no organization.
Students were free to stay there as long as they wished, and they had their communities to support or feed them. They were ordained as rabbis or as doctors by examination committees of three that put their hands on their heads and declared them qualified. They stayed in school until they were invited to perform rabbinical duties or entered the market on their own arrangements. Christian students had better terms: the Church took care of them on condition that they take the vows. Things changed little with the rise of Protestantism, more with secularization, much more with the entry of the industrial revolution into the university, and still more due to the intervention of the Pentagon in academic life. Prior to the American and French revolutions change took place gradually. Afterwards, more than anything else, the new view of higher education as a technological training ground made for radical changes. No room since then for ancient Akkadian. That it survives is due partly to fashions sustained by exciting public images of glorious antiquity, partly to the good will of donors, but mainly to tenacity.

The public image of Academe and of its inhabitants is intentionally confused, misleading, and hyper-conservative. The first thing to do is to kill the myth that Academe is profitable for the nation: it is, but only when its profitability is treated as a by-product; designing it to be profitable will kill it. This is so obvious that the father of the scientific revolution, Francis Bacon, said it of scientific research in general.[5] In addition, we have to consider academics parasites of the system. This raises the problem, how are we to elect people to positions that need not yield any fruit? This problem will lessen as the need for employment declines. Hopefully, this is around the corner: we may soon shed our harmful, inhuman work-ethic and see to it that citizens will not be forced to work—possibly forced, but not by the threat of starvation. Even if we fully succeed in this assignment, it will not solve the general problem of induction into Academe. We will need to develop traditions, somewhat less silly than today’s induction methods, to help us decide such matters, and we may hope that alternative traditions will develop that we will be able to compare and help improve through such comparisons and critical debates about them.

The public image of Academe and of its inhabitants is intentionally confused, misleading, and hyper-conservative. We want to expose its inanity by making public the methods of induction current today, discussing them critically and proposing some improvements. The worst of the current situation is the method of advertising. When I was a graduate in England I was offered a position in my own school, the London School of Economics and Political Science. I enjoyed it, since after my graduation I studied different subjects there and even read papers in seminars in different departments. But for personal reasons I could not stay. I desperately wanted to join a faculty in my own country, but I had no support there and so no chance of getting a position. This changed later on: over a decade later I was invited to Tel Aviv, where I have stayed to this day. In the meanwhile I applied for jobs in England—in York, Oxford and Cambridge. In all three, philosophy jobs were advertised, and in all three the advertisement was pro forma: the jobs were allotted to others, and I had no chance of changing the decisions in interviews that were performed only because the law required it, to no effect. This need not be so. When I was offered a job at York University, Toronto, the job was advertised, but with a clear explanation of the situation: people were invited to apply if they thought they could beat me at the specifications of the job that had made the university consider appointing me as the default option. No one did.

You may wonder what this has to do with the economics of appointments. The answer is, it is the most general idea of the market economy: openness and competition raise efficiency most. Academe can barely have it.

3. The New Renaissance of Learning

“Plan for the best and be prepared for the worst.” Since life is a set of lost opportunities, as any side-glance into history will show, it is hard to judge which is nearer to the mark, optimism or pessimism. These days we face the choice between the real hell of a nuclear winter and a utopia that surpasses anything envisaged by any of our dreaming ancestors.[6]

The future that I envisage here is scarcely a rosy utopia, as it rests on two reasonable components. One is the rapid alteration of the current employment policy by totally relinquishing our current work-ethic and ceasing to force people to work. How exactly this will happen does not matter overmuch. The other radical change concerns study: we should totally relinquish our current study-ethic and abolish the forcing of studies of any sort for any purpose.

Tradition divides life into study, work and entertainment. Our current philosophy recommends that we allow the spending of time on entertainment only as a means for continuing to work, and on study only as preparation for work. This philosophy is both disastrous and stupid, seeing that the very idea of a challenge is missing from it, even though common knowledge tells us that we need challenge in all three areas and that we can mix them with profit. Education pioneer Maria Montessori built her successful educational system on mixing learning with play. This goes in the right direction, but we must go much further in that direction.

I am speaking of the love of learning. The most popular learning theory is still that of the great seventeenth-century philosopher John Locke; it is a variant of the learning theory of Aristotle of the fourth century BC. It has no room for incentives and no room for the love of learning—the love that both Aristotle and Locke excelled in. What we witness here is not only indifference to the love of learning. We witness hostility to learning. This finds expression in a regrettable contempt for impractical learning, in the attitude known as tough-and-no-nonsense. This is an irrationalist pro-science philosophy, one that tends towards illiberalism and anti-democracy: it supports the rule of experts. Its best expression is in one of the very best science-fiction novels, Fred Hoyle’s The Black Cloud, 1957. There is much to say for the tough-and-no-nonsense attitude. When the GI Bill was implemented in the USA, its successful outcome led to the popularity of academic education and to the tremendous expansion of Academe, first in the USA and soon all over the civilized world. That success was conditioned on absorbing into the system good students whose education was wanting as education was then understood, and the flexibility required for their absorption was facilitated by the tough-and-no-nonsense attitude. Still, this attitude remained and crystallized into what may be called anti-poetry.

We have then two versions of the hostility to the love of learning: we may choose between the pro-poetry version and the anti-poetry version. Now, poetry-addict though I am, I will not advocate the love of poetry. We must be tolerant. Some people are colour-blind, others are poetry-blind, and still others are science-blind; they are all handicapped, and they all have a place under the sun. My recommendation is (not that you appreciate this or that item but) that you devote yourself to what you enjoy. Of course, we should see to it that the haters of poetry, and even those indifferent to it, are not appointed teachers of poetry, regardless of whether the schools that are their prospective employers make their poetry courses obligatory or not. But this is a different matter. All I want to suggest to you is that teachers of x who hate x appear unlikely, and that appearances mislead. Look at it this way: many people have chosen as their favorite field of study the field of their favorite teacher. These teachers excel: they enjoy teaching what they teach. This would be impossible were favorite teachers not so rare.

I once received an invitation to speak in a museum. I do not know how this came about: most invitations I receive are to institutions of learning. My custom is to speak on the subject-matter that interests my hosts. I did not know what to speak of then. It occurred to me that in museums you face the intriguing fact that great art often comes in waves. These are known as golden ages. There are golden ages of the plastic arts, of poetry, of music. They are always local: Spanish poetry, Italian-Renaissance plastic art, Elizabethan poetry, Elizabethan music, French impressionism, German expressionism. I decided to read a paper on the question, what makes for a golden age? As is my custom, I looked for literature on the question. To my surprise I found none. After I delivered my paper, I tried to publish it in a learned journal and failed. It is published in a collection of essays of mine.[7] If you are interested in the question, let me save you the trouble of reading it by giving you the spoiler: what makes for a golden age is encouragement.

Today the rule is discouragement: who do you think you are? This remark is never acceptable. Learn to be deaf to it, and ignore the question, who addresses it to whom.

The idea, come to think of it, is not my discovery. An example is Bernard Shaw’s Pygmalion. In that play Professor Higgins meets an undistinguished flower girl, Eliza Doolittle, and uses her to prove that teaching her the pronunciation of the upper class and adding some social information that upper-class women acquire with no effort make her pass with ease as an upper-class individual. To make it sharper, Pygmalion in the Classroom, 1968, reports that an expert’s whispering into a teacher’s ear that a certain pupil is exceptional suffices to improve that student’s rate of scholarly achievement. That shows the power of encouragement in teaching.[8]

Notice this: the difference between Pygmalion in the Classroom and the ambitious parents is obvious: the student who is encouraged will improve, and gladly so; the student who is pushed to seek approval is under pressure, and pressure tends to create resistance. The reason traditional Jewish education was successful in its ventures is that teachers were supposed to seek characteristics of students that they could encourage.[9] Only when the aim of Jewish education was agreeable were the results agreeable. But that is a different story. What was common to almost all Jewish life was the value of education and the love of learning. Perhaps it began with the commandment to study, but that commandment I do not like. The love of learning is in no need of boosting. Suffice it that it is not suppressed. The suppression of the love of learning is, paradoxically, due to the liberalism and democracy that it may bring about through the encouragement of diversity and of criticism: teachers may encourage learning only until it leads to possible heresy. Teachers whose love of learning is unqualified will possibly fight heresy, but they will not discourage the learning of heretical material or of taboo material or of any other allegedly dangerous material. It is easy to convey to students disapproval of some learning: a tone of voice or a gesture may suffice. And the outcome is very damaging. Look at the general outcome of education: some people you know are allergic to mathematics, others are allergic to poetry or music or whatever their teachers tried to tell them they must study even if they dislike it. Dislike it they soon do. The decision that no learning is to be curbed, no matter what, leads, however, to a true Renaissance of learning. Remember: opposition to science or to art is not inborn. Yet it is a widespread disease, an epidemic indeed.

The Renaissance is famous for its Renaissance people, people who excelled in all branches of human culture. The concept of the Renaissance person has two versions, that of the Age of Reason and that of the Romantic metaphysicians. The Age of Reason took it for granted that every individual is autonomous. Of course, all individuals have the right to use their own judgment; the thinkers of the Age of Reason assumed that all individuals use this right to the full. They assumed that whatever question one adjudicates, one does so out of the use of the right to autonomy, namely, out of the ability to judge for oneself. So popular books about all sorts of scientific and philosophical and social matters appeared to help people judge for themselves; yet, clearly, these books showed that the view in question is highly exaggerated. Of this genre I like in particular Newtonian Cosmology for Ladies: Dialogues on Light and Colours (1737). The Romantic philosophers said: those who can live up to the expectation of the Enlightenment Movement and be autonomous are geniuses. In other words, do not aspire to be autonomous unless you consider yourself a genius. But then, who do you think you are? Both doctrines opted for extreme positions. A sense of proportion should dismiss both.

Whatever the truth is, what matters is first and foremost that the extremist doctrines put tremendous pressure on their enthusiasts and this pressure we should neutralize. To begin with, you should choose to study what interests you and ignore all your obligations and in particular your alleged obligation to receive good grades and your alleged obligation to see to it that you will have reasonable employment when you grow up.

My interest here is not only to help you enjoy the life of a scholar that you should learn to cultivate on your way to your chosen academic career; it is to revive or resuscitate the culture of the Renaissance style of learning. But let me stay with your own career, if you want me to do so. (Otherwise skip to the next section.) First, the popular idea that good grades are essential for a good career is downright silly. At most, receiving sufficiently good grades to be able to graduate is essential for professionals who need a license to practice—like physicians, lawyers and accountants—and even then getting grades has little to do with learning and more to do with learning the system. A friend of mine who was a retired teacher registered for a degree in a university and was once surprised to receive a poor grade on an exam, contrary to her expectations. She consulted the professor about the matter. He looked at her exam paper, and in embarrassment changed her grade. She chafed: she had not appealed for an improved grade but asked for an explanation. He was dismayed: she had expected the graduate student who graded her exam to see that she understood the material even though she had not used the keyword that helped him do his job. She admitted her error and left.

All academics have to grade, except for the very few research professors who have no teaching duties and no supervision. If you have to grade, it is reasonable to grade students for some homework, preferably essays. If you do advise your students to write essays, demand that they concern a controverted question. If this is too hard for them, let them write a review of a controversial book of their choice. And if you help them write and improve, you need have no fear of fraud. If you have too many students for that, then perhaps the system will force you to examine the students. My advice then is to prepare the exam questions in class at its last meeting. This may serve as self-examination: if you have conducted your class reasonably well, the students will participate intelligently—non-defensively—in the preparation. And, of course, all exams should be open-book.

If you do well, then the average grade of your class will be above the general average. Administrations resent grade sheets that show excellence, and so they harass the academics who issue them. If you are one of these academics, do not yield: administrators are supposed to know when it is good sense to yield, and they will soon learn to leave you alone. I used to revise grades long after the deadline, causing complaints from the records office. They soon learned to leave me alone. They even befriended me.

For a good job it helps to know how to play the game; but this is insufficient: to succeed you must also be lucky. But I know you do not want the job that many covet from afar, not knowing how taxing it is; you want a job that is reasonably well-paid and reasonably respectable, yet that does not take so much out of you as to make you hate work. After all, this is why you aspire to an academic job. The advantage of an academic job is that it allows for independence. To destroy this advantage, your well-wishers tell you that you have to suffer before you can enjoy. They have suffered and lost their capacity to enjoy the performance of their jobs. Before taking their proposals seriously, ask them: are they pleased with their situation?

I see that you question the logic of my advice. I hear you ask: why should I listen to you telling me not to listen to them? Well, of course, you decide whom to listen to. My standard advice on advice is: keep consulting as many people as you can. This way you are likely to obtain conflicting suggestions, and then you will have to decide for yourself. The decision may be right or not; either way, taking your fate in your own hands will enhance your autonomy. Bertrand Russell opens his 1929 Marriage and Morals with the question: what is better, to be autonomous and make mistakes, or to listen to the right authority and thus be always right? He went for autonomy. We need not judge, since, if the right authority exists, we do not know which it is. So we had better make our own mistakes rather than those of our priest or fearless leader. The autonomous will err too, but their decisions are varied and open to correction: hopefully, reform is better than stagnation. Dictators often promise democracy; they deliver only at gunpoint.

It is easy to see why people often fail to use their autonomy. Erich Fromm called it the fear of freedom. There is more to it: autonomous people have to withstand shaming and appeals to their sense of guilt. My best advice to you is: free yourself totally of the sense of shame and of guilt. Totally. Otherwise you simply invite bigger doses of shaming and more insistent appeals to your sense of guilt. They say this is impossible. Do prove them in error.

4. Specialization and Proliferation

The advocacy of proliferation is fraught with confusions. Consider the dual confusion—of proliferation with relativism and of relativism with toleration. Its expression is in the idioms “my truth” and “his truth”. The truth belongs to nobody. Of many competing opinions, at most one is true; the rest are erroneous. Toleration is not of other people’s truth but of their opinions, of their errors; their opinions may be as legitimate as ours—even if they are not as good as ours.[10] Now the attempt to merge this anti-relativist advocacy of pluralism with deference to expertise leads to the opposite version of it: specialization. Let the people best familiar with a given field of study or action take control over it, and let others defer to their expertise and refrain from meddling in it. Now, to be tolerant, we may be willing to allow—quite reluctantly, but nevertheless allow—such meddling with no expertise, but only on the understanding—ours, that is, of those with the better grasp of the situation, namely you and I—that such meddling has no value.

This understanding implies another: you and I are experts, possibly in different fields. In what field? The field in which you and I happen to be experts. This is toleration at the cost of isolation. The toleration of error allows respectful cross-field communication—if and when we share interests despite our differing fields of expertise. This is a totally different view of expertise, one that I have found in the publications of arch-philosopher Bertrand Russell: specialization is a necessary limitation. The question that this view raises, then, is: how are we to mitigate its worst aspects?

To answer this question, notice this. No one can be an expert in all fields: universal expertise is impossible; one cannot master all the arts and sciences, not even the whole story of one art-form. Hence, to be an expert, one has to specialize in a narrow field. One can know more and more about less and less, and less and less about more and more; or one can know more and more about less and less while knowing about the rest just what average educated people do. The second option is obviously superior to the first, especially as long as the fund of knowledge that average educated people are familiar with is on the ascent. This is under dispute. The dispute becomes more reasonable when we specify the more and the less in different manners. Obviously, two ideologies compete here, one being part of the traditional ideal of the educated individual and the other being part of the tough-and-no-nonsense ideology of post-World War II culture. If you want some study of the process by which the second got the upper hand over the first, you may read Aldous Huxley’s 1939 novel, After Many a Summer Dies the Swan, which contrasts a cultured researcher with his crass assistant, both characteristic of the period of transition.

Huxley objected to insensitivity or crassness; we need not share his objection—any more than we object to color blindness. Objection is anyway a waste of effort. Yet the defense of crassness is as regrettable as the defense of blindness, and it is worse than a waste of effort: it is the advocacy of self-mutilation. This advocacy may be in fashion, the fashion being the support of unreason, the support of the advice to avoid using one’s brains. That is irrationalism (to use the nosological name for this disease). There are arguments against rationalism, and they deserve study for various reasons, all of them rationalist, not as serious support for the advice to relinquish reason:[11] you need no arguments to relinquish reason, just as you need no arguments to avoid using your right arm; at times it is indeed advisable to avoid using your right arm, but it is downright silly to avoid using it on principle. This should suffice to make irrationalism impractical—on principle. Finally, the tough-and-no-nonsense attitude is in principle worthless—as a form of irrationalism.

Putting it like that is cutting corners: even the irrationalist version of this idea has some merit; the deliberation on metaphysical ideas may be just an excuse for inaction, just an excuse for the inability to take a stand, and so on. I admit that. More generally, the lack of autonomy often appears in disguise as scientific doubt. This is easy to spot: when you face a practical question and offer a proposal for action, people may ask you, as they often do: how do you think action A will bring about result R? What mechanism is at work here? For, quite possibly, you are in error: far from relieving us of our trouble, action A may boost it. Now at times action A is costly and time-consuming, and it behooves those who are ready to try it out to take some time, to sleep on it, and then take the necessary measures to implement it. At times, however, the best way to test the claim that action A leads to result R is to implement it right away on a small scale. The best practical pieces of advice are indeed such: they should either have immediate positive results or else prove inadequate with little, reversible damage. This is the demand for high refutability. Refutations that come late, after series of confirmations, may indeed be too costly.

The paradigm case here is the de Havilland Comet airliner. It was a plane tested by military authorities and by civil companies and found very satisfactory. It broke up in flight—with fatal results—due to metal fatigue, a phenomenon poorly understood until then.

All this is irrelevant. Those who hesitate to decide about anything to do with flying are seldom informed about aviation and the physics that goes into it, and they seldom ask technical questions about it. Only when offered practical solutions to problems that are social in character do hesitant people mask their hesitation as technical questions. Arguing with them will often prove a waste of time. Advice to avoid waste of time is usually a waste of time. Now the waste of time is a part of the idle way of life.[12] Yet it may frustrate. Awareness of this helps reduce frustration. The tough-and-no-nonsense attitude that reduces frustration by advocating useful actions does the job often enough, yet it may also cause tremendous frustration. People who are active all their lives may die happy in the feeling that they have spent their lives productively, but things may go wrong and make the dying feel that they have wasted their lives. This cannot be avoided for sure: one may waste one’s life with or without the help of the tough-and-no-nonsense attitude.[13]

My presentation is off the point. I am discussing the possibility of life spent satisfactorily and happily, or wasted in deep regret—with the aid of this or that philosophy. Meditation on this may be very frustrating. Finding this to be the case leads to the philosophy of Ludwig Wittgenstein. He said: what a philosopher ought to do is destroy the very interest in such discussions. Now liberal philosophy says, everyone has the right to decide what lifestyle to choose, and this raises the problem: what lifestyle should I choose? Some people can ignore this question; others cannot. Wittgenstein said: this question is upsetting, and so it is better to destroy interest in it.[14] In my adolescence, I considered seriously the question whether I should be a law-abiding citizen or not. I then envied Yehudi Menuhin—not because he was a musician, and not because he had such tremendous success, but because from the age of three he knew what his lifestyle would be: he was a violinist. Little did I know: he broke down when he was at his peak. He emerged out of his personal crisis by retraining professionally, still as a performer: he moved to yoga, to Indian music and to jazz, to involvement with children’s literature, to partaking in world politics, to running a music school and to teaching music. This story portrays an extreme case. The opposite extreme is Edison, who was an inventor from the very start and was very happy with this choice. He was extremely versatile. Other inventors were known to operate within a very narrow compass and to be very happy with their choices. This shows that there are many ways to evade the problem. Why some are hit so hard by it while others do not notice it, I have no idea.

So back to you. If you have a specialty and you are at peace with it, do stay with it till you are reasonably established. And then perhaps you had better take stock; that will be up to you. But first get established, and do not mind if you are branded a specialist in the wrong specialty.

I do not know why people mind this. Consider Thomas S. Kuhn, perhaps the most famous philosopher of science of his day (his The Structure of Scientific Revolutions, 1962, appeared in many languages; more than a million copies of it were sold). His success was often declared due to his work being a combination of the history of science and the philosophy of science. He denied it: “I am never a philosopher and a historian at the same time.” He was a protégé of Harvard President James Bryant Conant. He never knew whether he wanted to be a philosopher of science or a historian of science, as his main idea was this: what makes a field scientific is the unanimity of the academics who inhabit the department devoted to that field. It never occurred to him to ask how this looked in a world devoid of universities, and he even declared that science without professional periodicals is impossible—as if Archimedes wrote scientific papers. He had hoped to launch a new science. To that end he declared that he agreed with every individual whose opinions he cited—even when he cited them in criticism. To his regret he failed to unify either the philosophy of science or the history of science. So he died with no decision on his allegiance. The question remained: what generates unanimity? His answer was: a shared paradigm of an admissible theory within the field. This raised the question: what is a paradigm? Early in his career he admitted that this is very much what Michael Polanyi had called personal knowledge, namely knowledge that is not given to articulation. What the master can transmit to a disciple is knowledge, or else it could not be transmitted, Polanyi observed; yet this does not mean that the master can articulate it: famously, masters are unable to say what they transmit to their disciples. The concept of the paradigm is a bit more specific: a master offers an example (δεῖγμα, deigma) set beside (παρά, para) the disciples for them to emulate.
Kuhn said, the structure of scientific revolutions is this: when a paradigm becomes too problematic, the masters of its field invent a new paradigm and declare a paradigm-shift. To this he added: those academics who refuse to follow a paradigm-shift lose their jobs. This represents his teacher Conant more than it does Polanyi. The tenure system prevents this from happening to most academics. Kuhn ignored the historical information on the scientific revolution as a movement of amateur researchers organized in voluntary societies. This shows that his output is more propaganda than history.[15] In particular, it takes professionalization, and the specialization that accompanies it, as given, so that it ignores their rationale. Now, clearly, specialization is a result of the process of division of labor that is ancient, but that proliferated beyond recognition with the industrial revolution. It is time to do something about it. This is not for you now. What you can do in that direction is try your hand at popular science.

5. Science and Technology

The application of science is as old as science. Many historians of science see the origins of science in technology. They are biased: they advocate a corollary to their view of science as inductively evolved out of information: they take it for granted that ancient technology is prescientific and so devoid of theoretical basis. Anthropologists deny this. Some historians of science view science as having evolved out of metaphysics. Their bias is different: they consider science a deductive system. They have the support of Aristotle’s Metaphysics, although we are ignorant of his view on the question of how, historically, science came into being. Bacon unjustly decried Aristotle’s philosophy as deductive and declared the need to be rid of it and to replace it with a theory based inductively on a “just history of nature”. The first step in this direction, he suggested, is to draw from the vast extant literature all information before disposing of it. No one has ever tried this, but the idea did serve some historians of the physical sciences who mentioned, however meagerly, some empirical knowledge embedded in technology. The most obvious examples should come from the most ancient technologies of hunters and gatherers, or from the agricultural revolution, or perhaps from the ancient kitchen—such as the benefits of roasting or of cooking. Somehow, most historians of the physical sciences prefer cases that are not obvious and that statics reinforces, like the columns in Egyptian temples that are thickest in the right place: not in the middle but one-third of the way up. This indicates that ancient designers knew some statics even though they learned it by trial-and-error, as they allegedly lacked any theory that could incorporate it. This supports Bacon’s view that information precedes scientific theory. As it stands, this idea sounds quite commonsensical. Bacon was not the first to articulate it; nor was Aristotle.
Taking this idea seriously, within methodology or within psychology, most current students of perception deny the possibility of (Baconian) pure perception. The discovery that all observations are theory-laden belongs to Galileo and Bacon. It is theory, Galileo observed, that prevents us from seeing the moon jump from one rooftop to another like a cat as we stroll in the street on a moonlit night. Hence, said Galileo, we must be cautious when we argue from facts: we need to use the right theory; for this we must learn to think critically. Bacon did not agree. He said: critical thinking does not help, as the experience of scholasticism shows; the scholastics were critically minded and still in error—due to their prejudices. Hence, said Bacon, we must first give up all preconceived opinions. We will then see facts as they are, he promised. He even promised—but he never made good on his promise—that he would describe a technique for overcoming optical illusions.

Bacon insisted that the avoidance of all theory is possible: all you need is the good will to give up all beliefs voluntarily and stay vigilant. This rendered inductivism irrefutable: any scientific error proved its initiator prejudiced and thus unqualified for scientific research. After Newton’s mechanics was superseded, things changed. Bertrand Russell said then, you may remember, that the view that one is free of all prejudice is humbug. Yet many famous philosophers clung to the view that not all observation-reports are theory-laden. Thus, the famous twentieth-century empiricist philosopher Rudolf Carnap split all statements of the language of science, be they true or false, into three parts: logical, observational and theoretical; he called the language of pure observation Lo and took it for granted that it is not empty. His most famous student, Carl G. Hempel, argued that possibly all theories are mere restatements of observation-reports, so that the language of theories LT is empty. If not, then the mix of the two languages, the set of theory-laden observation-reports, is problematic. Hempel was convinced that the problem is soluble since he advocated Wittgenstein’s theory that science has its own language. Hempel showed that this theory—of the language of science—is problematic too, but he never gave it up. Discussions involving this absurd theory abound. They are less than worthless. They claim to pertain not to psychology, nor to methodology, but only to epistemology: the participants want to show (in vain) that, given this odd division of language, some observations will justify the choice of theories (to believe in? to apply?).

This view was an essential part of Bacon’s vision of the new science. He was a utopist, and he declared science the vehicle of the great future, as it will cure the world of famine, poverty and disease, perhaps even of death, and thus also of the motivation for war. He suggested that people who wish to contribute to the rosy future of humanity should not waste their free time on politics but invest it in empirical research. The Royal Society of London that carried out this vision of Bacon (he was its patron saint) showed interest in technology. This was a novelty: earlier, scholars were aristocrats of one sort or another, and as such they were haughtily indifferent to technology.[16] Among the early Fellows of the Society were individuals interested in forestry, in shipbuilding and more. In retrospect, the most remarkable thing they did was to encourage the use of a steam engine of the technician Thomas Newcomen. Its use was to pump water out of coalmines. A century later, a model of it held by the University of Glasgow drew the attention of James Watt, an instrument-maker turned researcher, hired to attend to the instruments that its observatory had inherited. It seems to me this was a way to support an independent researcher who was not sufficiently well off to fend for himself. In any case, he felt obliged to justify his job. He improved the Newcomen engine and, with the support of the researchers around him, he and an engineer with some scientific pretense tried to market the new Watt engine. They went bankrupt. With the protection of a special patent, the venture had better success; it made him rich. The invention of the metal railway at the end of the eighteenth century and the construction of the train system soon after launched the industrial revolution.
In 1830 there was a revolution in the Royal Society of London, when professional technologists who were also scientific researchers began excluding the amateurs from the Society, making it an organization of professionals run by professionals for professionals. The current website of the Royal Society denies that it ever was an amateur society, claiming that it allowed rich amateurs to join as a means of milking them to support poor scientific researchers. Memory is short, especially when the deletion of a memory aims to cover up embarrassment. Obviously, those who wrote the Society’s web page found the idea of amateur science embarrassing.

It is amazing how quick the change was. The transition was total only after the end of World War II, with the total victory of professional science and its proof of ability to kill millions (the Manhattan Project). The reform of the universities after the French Revolution secularized the system of higher education there; the 1830 reform of the Prussian universities and the foundation of the University of London followed, soon joined by the foundation of technical universities, which turned much of scientific academic education into technological training. Commentators repeatedly observed that in the long run even the most practical technology depends on pure research. To no avail.

The advancement of technology is taken very seriously as the peak of progress. The paradigm case is the NASA project of sending astronauts to the moon and back. Some critics observed that the cost of that project could make a difference were it used to feed starving children or for some similar purpose. This, incidentally, is false: the call was to abort the project, and it would have cost more to do that than to go on with it. Even had the protest been made early enough, its wisdom would have been doubtful. The project was taken as a military venture, and so cancelling it would have benefited not starving children but fatted generals. Were it possible to guarantee that large-scale funds would go to save starving children, then after the moon landing this should have happened. It did not. We do not quite know why.[17] For all we know, a world that sends astronauts to the moon is better fed than one without it. Nevertheless, though in error, opponents of the moon landing had their priorities right: technology is a tool, mainly for improving the quality of human lives, and so feeding starving children should have top priority.

This is to endorse the idea of the welfare state: its main object is the improvement of the quality of human lives. Canada and Sweden, according to UN records, are the top welfare states today. Brian Mulroney, the Canadian prime minister from 1984 to 1993, said once: the market loses every year a given number of engineers, and the education system has to train and send to the market at least the same number of engineers every year; for that it needs to receive as students at least as many freshmen every year—which it does not. His purpose in saying this was to increase the budget for education. Academics responded with unqualified approval. Sad. He could have said: we are a rich country and we can afford to improve the quality of life of our citizens, which is part of our official duty, and one major means for this is the improvement of our education system—in the arts and in the sciences alike. Canadian academics would have agreed with that, but they deemed it too daring. When a similar debate took place in the Congress of the United States of America, critics raged at useless investments in research. An example mentioned in that discussion was the investment of a pretty penny in the study of the migration of some birds. By luck, the supporters of that project could show its beneficial economic consequences, and they mentioned them triumphantly, as if all successful scientific research projects were also technologically successful. This is preposterous, yet an addition to current American officialese, the term “R&D” (research and development), serves to keep confusing the issue: during the Cold War, the military used physicists and chemists, as well as mathematicians. They were called to help increase the already too huge American arsenal of weapons of mass destruction that should now urgently be on its way to dismantling.

The rich part of the world is the technologically active one and the liberal one. Look at those poor countries that provide the rich ones with oil and with other natural resources. They squander their money on luxuries for the rich instead of feeding and educating the poor. This shows the error in Bacon’s idea that science will make the world rich: it holds today only for the civilized part of the world. Ernest Gellner has said: with no civil society, neither science nor technology could flourish as they do.[18] Despite the tremendous success of liberalism—the rich countries are liberal and the liberal countries are rich—most scholars speak of liberalism as bankrupt.[19] This requires caution. The term “liberalism” is not used in the same sense everywhere. Moreover, much of the criticism of liberalism is just, and the criticism of its utopian version is even deadly. We should take liberalism as the direction for desirable improvements, and among these the uncontroversial ones should gain top priority. This should suffice for putting much of social science in the service of social technology.

6. Natural and Social Science

Gellner’s claim—only in civil society can science and technology flourish—takes us to social philosophy and to social science. That the academic members of the faculties of arts and of social science must fight to prevent their closure should turn a red light on. The damage that the obscurantist tough-and-no-nonsense attitude can do here is the worst kind: not immediate and not reversible. While the natural sciences and technologies may still flourish, social conditions may become increasingly intolerant—at first allowing natural science and natural technology their traditional freedom while implementing a limitation on civil liberties; that limitation will eventually affect all. Under such conditions, the citizenry may allow the rulers to limit democracy, since their material conditions may improve due to permissible technological advances. People engaged in research in natural science and technology, and even in the production of the products of the more advanced knowledge, will suffer less from the curbing of the freedoms of other citizens. When they find that the conduct of their own affairs is becoming impossible, they may change their minds and fight for the reinstatement of democracy. By then it may be too late.

Things may start with the ill effects of the implementation of new technologies. This is conspicuous in every case in which the replacement of human labor with some robotic process leads to unemployment and thus to suffering. This is gratuitous: robots should relieve burdens, not add misery. Now unemployment is a modern concept, born of the industrial revolution that totally altered employment traditions. It is these that have brought about unemployment. Marx deemed unemployment inevitable: the free market forces workers to accept employment for minimum wages, thereby limiting their purchasing power. He knew of course of unionization and of collective bargaining; he proved that the struggle for the improvement of workers’ conditions would fail: the rise of workers’ wages would cause inflation that would keep real wages minimal. It is like getting bigger spoons instead of more soup, he said.[20] This Marx deemed the cause of a new economic situation, of poverty amid riches: economic crises make people hungry while warehouses overflow. This makes the abolition of the free market and the inauguration of socialism imperative. Marx never thought of legislation for the improvement of workers’ lots, since his philosophy said, intervention of politics in the economy is impossible.

Even Bernard Shaw ridiculed Marx, although he oddly declared himself a Marxist: while Marx was sitting in the British Museum reading books, laws regulating child labor were implemented. This goes well with Gellner’s observation: only in civil society can science or technology flourish.

What would refute Gellner’s observation? Obviously, the development of modern technology in a country with no civil society—the Soviet Union, for example. This did not make Gellner withdraw his observation. It is easy to see why: Soviet technology depended on Western technology. Moreover, it was imposed on its public. Gellner meant to correct our perception: picturing inventors, we look at their scientific environments; he has invited us to pay attention to their social environments. Following his invitation, we learn to see his observation; it then soon looks trivial. His invitation was inspired, and it made a difference in our view of technological society. Do not let the tough-and-no-nonsense destroy this improvement.

7. Science and the Arts

Apart from the sciences, the natural and the social, we have the humanities. The boundary between the natural sciences, the social sciences, and the arts is not clear: they overlap. Engineering seems to be mostly within the natural sciences: the parts of it that comprise mainly applied mathematics and applied natural science take up practically the whole of the teaching effort that goes into the making of accomplished engineers. The most significant part of engineering nevertheless concerns social goals and circumstances. This is why engineers have to consult all sorts of non-engineers regarding their projects, from legal matters to insurance. These are technical too; still, there is much that is non-technical in engineering. All this is clearer than the boundary between the sciences and the arts: how much of the study of language is science and how much of it is art? Linguistics is a science and philology is an art; but the distinction between them is unclear. The only sharp distinction is between the universal and the particular: only the universal is scientific. Except that this distinction itself is not clear.[21] We need not go into that. Suffice it that some fields are decidedly arts and thus not science: the study of cultures, the history of art and of politics, and their likes.

As science is triumphant, the arts are defensive. This makes science useless: few people can enjoy the fruits of technology with no knowledge of how to enjoy life. We should know what kind of leisure we enjoy, what kind of art we enjoy, and so on. The revolutionary aspect of the Keynesian revolution in economics is the idea that encouraging spending is the best way to solve some major difficulties that the economy may meet, problems that require solutions from the arts. All opposition to Keynesian economics notwithstanding, his proposal works, and the objection of the Chicago school of economics to it concerns only the way the government spends its money, nothing else. (It was economist Don Patinkin, former student of Chicago superstar Milton Friedman, who said, Friedman was a clandestine Keynesian.[22])

As science is triumphant, the arts are defensive. This is so because members of the Arts faculty feel that they are fighting for their livelihood, and that for that they compete with the sciences; they take it for granted that they will lose, but they cannot afford to lose all the way without losing their means of livelihood. This is a regrettable error: the idea that the academy can do without attention to culture deprives it of its traditional status as a cultural institution, an institution that should preserve culture and help advance it. It deprives science of its status as a major component of Western culture. Already Galileo found this view—instrumentalism—intolerable. (He said this with concern for science, which was at the time on the defensive, regardless of the academy, for which he had little sympathy.) In his great 1632 Dialogue, he declared that instrumentalism belittles science and impedes its research.

Science is triumphant and the arts are defensive merely because the ignorant, and more so the obscurantists, advocate the view that science is devoid of intellectual value. (This way the advocates of science-sans-arts hope to reconcile their ignorance of the arts with their claim to be of cultural standing. Indeed, members of the science faculty who are not artistically ignorant speak differently.) This denial of cultural value to science glorifies it as practical; as the source of weapons for mass destruction and as a source of instruments that increase comfort. Also, some obscurantists look askance at the comfort of modern life. Of course, their view is appalling even after one concedes to them all that the faculty of arts is able to concede: the major task of technology is not to increase comfort but to combat hunger and disease and infant mortality!

C. P. Snow’s celebrated and highly influential 1959 The Two Cultures (namely, the arts and the sciences) suggested that we encourage the development of subjects that bridge the two cultures. (He intentionally ignored the social sciences and lumped technology within science. This deprived his contribution of durable value.) At best, his proposal is for the sugar coating of a bitter pill (the sugar coating that he recommended was photography and the history of science).

The two cultures are fictional. As Michael Polanyi has noted,[23] the split of culture into two is an early nineteenth-century artefact. It reflects the average educated people’s loss of interest in the natural sciences due to the rise of professional scientific technology and the institution of compulsory education. Paradoxically, compulsory education split the population into the mathematically illiterate and the artistically illiterate. Compulsory education was a blessing, but it came at a needlessly high cost; we can improve upon the early implementers of the idea of compulsory education. The compulsory-education law should force the young to go to school, not to study; to stay in school, not in the classroom. This will lead to the abolition of some classes—not all. And the abolished classes are indeed ones too boring to merit survival, unless we can revive them by restoring their passion. This will be a challenge for some enterprising young teachers. Such a simple and easy-to-implement reform will reduce the gulf between art and science.

This looks unproblematic. Do not worry: its implementation will raise problems. There will be advocates of closing departments that have merit, for example; already now we hear repeatedly the proposal to close the faculty of arts. It is admittedly just too stupid for people not proficient in the acquisition of languages to use this deficiency of theirs as a major reason for advocating the closing of departments of foreign languages and cultures. Academics with no shred of familiarity with the arts already propose to reduce the faculty of arts as much as possible. It is possible to close the faculty of arts, of course, and even at once. This will destroy Academe as we know it. They may then regret it, but by then it will be barely reversible.

As science is triumphant, the arts are defensive. This threatens the future of the life of the spirit. The threat is real. The rest is marginal. It was in 1959 that leading German philosopher Karl Jaspers demanded that the university should limit its concern to science and science alone. In 1964 Harvard president James Bryant Conant cited him and added, “If there are no general principles, then the subject is not scientific; if it is not scientific, it does not belong in a university.” Dismissing “campus warfare”, he added that he took science in a broad sense of the word, while rejecting the demand for certainty and while using common sense. This is supposedly broadminded even though it excludes as unscientific, say, Marxism. Such attitudes led to the warning that Bertrand Russell issued at the time: the academic taboo on Marxist literature, he said, will turn the tide and make Marxism popular in the USA. This happened, but without correcting the damage that Conant had caused. It is time to try to think about that. The first step is to view science as a high point of western culture akin to high points of the western arts (Popper).

8. The Public, the Educated Lay People, and the Intellectual Elite

Usually, fringe-academics are—understandably but erroneously—too servile towards Academe, due to their uncritical acceptance of the diverse fashions in it. I have warned you against following them uncritically. This concerns you as a young person on the way to an academic career. It behooves you to take a detached view of them and see their enormously valuable contribution to society.

The societies of western countries are democratic; their citizens take this for granted, although democracy needs a watchful eye to sound the alarm when it is at risk. It often is, since it is inherently unstable: it is given to easy transition from democracy to populism and from populism to dictatorship. A populist demagogue can win elections and then erect a dictatorship. The threat of dictatorship is less serious than that of populism, since it is attractive only in times of stress, and then there is a strong disposition to unite and defend the right of common people to voice their opinions. Populism is dangerous because populism is the idea that average people can and should run their country; yet they cannot; even elected experts seldom can. This is why the threat of populism lessens with the rise of the level of education of the population: the better educated the public is, the nearer democracy and populism are. What saves a democracy from populism is the respect that common people have for the better-educated members of their own society.

Politically, it is the fringe academics who sustain the respect for intellectuals that traditionally enabled them to keep populism at bay. They lost this ability when during the Vietnam War they semi-officially replaced truthfulness and credibility with populist techniques as means for furthering the cause of peace. We now pay the price for this by experiencing a wave of populism sweeping the western world.

This is where fringe-academics enter the picture. They have better access to the public arena than their peers, the better-educated inhabitants of the ivory tower.

“Democracy” means the rule of the people (the demos). Abraham Lincoln concluded his Gettysburg address with a refusal to declare victory, so “that these dead shall not have died in vain—that this nation, under God, shall have a new birth of freedom—and that government of the people, by the people, for the people, shall not perish from the earth”. Not so, said Bernard Shaw: the people cannot govern; officials do. Popper did better. He said, although democracy cannot decide who governs or how, it can and does decide who should not govern: in a democracy the people can overthrow the government with no violence. Now since antiquity it has been known that great ease in overthrowing governments is detrimental, since rulers must be given the chance to try out their plans before these are to be viewed as defunct: it should be possible but not too easy to overthrow a government.

Moreover, there are conditions for democracy. Among these, some level of education is essential, perhaps also of literacy. Elections in illiterate populations were tried in the twentieth century—with little success. The more literate the people, the more aware they are of the intricacy of their own democracy, and this knowledge is the best tool against the danger of populism, the best reduction of the gap between democracy and populism. This is where the fringe-academics should enter the picture; they often do. More generally, this is where the educated people enter the political picture: the better educated than the average may influence the less educated and thereby check populism. Hopefully: populism is a constant threat and at times the better educated cannot convince the less educated that their familiarity with politics is badly wanting, especially when populist leaders have their public’s ear when they denounce intellectuals as fake and intellectual warnings as fake news.

The reason is that the anti-populism mechanism works on no intellectual blockage, and there is no rule as to how much it needs in order to succeed: it is a tradition, the tradition of respect for learning and thereby for the learned. This tradition is at least as dangerous as populism: the learned tend to support Plato’s idea of the philosopher king. It is hard to assess the risk of the success of philosopher kings. Would they contribute to the welfare of the people, as Plato had hoped? We do not know: few dictators were nearly as learned as Plato wanted them to be, and none was as benevolent. Usually, efforts to appoint a philosopher to a dictatorial position led to crass tyranny. Still, we should note that politically speaking the intellectual elite is not an organized group and membership in it is by self-selection. The assumption is therefore reasonable that politically oriented intellectuals are usually much less politically knowledgeable than they claim.

This is somewhat surprising. Among intellectuals Platonism is not popular; even liberalism fares better there. Even simple liberal ideas—such as that hatred has little to do with politics—being counter-intuitive, are the fruit of intellectual disquisition. Traditional responses to liberalism dismissed it as agreeable but too unrealistic to work. Yet the success of liberalism is stupendous. Nevertheless, most political writers—academics or not—these days oppose liberalism to this or that extent. It makes no sense to me, but I have to point this out to you. I do not recommend that you learn anti-liberal slogans to further your career, but I owe it to you to tell you that your being a liberal—as I hope you are—is a positive obstacle to your acquiring an academic career. Of course, I fervently hope that you will succeed despite your liberalism; if you are lucky, your liberalism may even be rewarded.

I am discussing liberal intellectuals and their role in keeping democracy away from populism. When intellectuals are illiberal, they do the opposite, as we can learn from the sad recent history of democracy and populism in the United States of America.

You may find this section bewildering. You may think that I have not thought the message of this section through, that I am in two minds or more. How perceptive of you. There is almost no literature on the subject, and it is very subtle. This is why fringe-academics may be more influential—positively—on the stability of democracy. Many people are too constrained to consult proper academics even on academic affairs, not to mention broader intellectual matters. So they consult the people in their circles who are not academics but who have some knowledge of matters academic: the fringe-academics. And so, the very fluidity and lack of organization of the intellectual elite, the very ambivalence of non-intellectuals towards intellectuals, and so on, all this makes it hard to say what happens and when, much less what will happen, and least of all why. Yet some cases stand out. Not only did one headline written by a French novelist and intellectual—Émile Zola—make political history; the eighteenth-century Enlightenment Movement, the French Encyclopédistes, to be precise, contributed to the making of the modern world. To see how this works, look at your uncle and ask whom he considered wiser and more educated than himself, and whether that person attenuated your uncle’s political expressions. This should do, but you may want to go further and ask what influences of what past intellectuals worked in that circle of ideas and what ill effects some democratic intellectuals had on politics.

I have said enough for you to get the gist of the discussion. Since it is tangential and hardly helpful to you in your search for an academic career, I will stop here and zero in on more practical matters, since this book is coming to its close.

9. The Individual and the Academic Community

The social sciences are impeded by the traditional philosophical dispute between individualists and collectivists, now outdated. Individualists deem a society—any society—a conglomerate of individuals, and hence all social science psychology at heart. Jean-Paul Sartre, once a celebrated philosopher, said, there are only two sciences, physics and psychology (because, as Descartes has taught, there are only two substances, body and soul). The alternative view, collectivism, is that individuals are members of society, so that at heart all social science is political science. The latter view, represented by the much praised and much maligned Georg Wilhelm Friedrich Hegel, seems to be the more sophisticated; yet, when you notice that the representation of this view in common parlance is the idea that we belong to national types, so that typecasting is not only inevitable but also the right thing to do, then you realize how primitive collectivism is. Indeed, collectivism is the older and the still prevalent view; individualism is the invention of the Greek sophists that was revived in Europe during the Renaissance and the scientific revolution. In the eighteenth century, only economics and political theories were popular, as they are the easier to study from the individualist viewpoint. The achievements of the social sciences of the eighteenth century, economics and politics, are marvelous (even though grossly outdated). The contributions of the collectivists to the social sciences appeared in the nineteenth century, as a reaction to the French revolution. Some of them are intriguing; most are downright disgraceful.

The arguments in defense of both views are obvious and most powerful: individualists say, no individual, no society; collectivists say, no society, no individuals. (Collectivism looks more sophisticated than individualism since the collectivist argument is the slightly less obvious.) What these arguments prove is that both parties are in error: we must assume that both individual and society exist for social studies to proceed.

Benedict Spinoza complained that almost all social studies are utopian and thus unrealistic. He saw only Machiavelli as an exception, as his message was, do not trust princes (Psalm 146:3). To be more realistic, said Georg Simmel, we must see that individual and society are in constant and irreconcilable conflict. Obviously, this is true. Practically all teachers take it for granted. Worse: they take it for granted that for the good of their charges they must break them—the way you break a horse: you have to subdue their insistence on freedom. Think of your own teachers. Unless you were fortunate, you have undergone the normal process of education, and so you remember favorably one or two among the many teachers you have encountered. You will notice that they were the exception, the individuals who did not see breaking you as their chief task.

This pertains to you and to your wish to integrate into Academe. Why integrate at all? Into any framework? True, some individuals are not integrated; they are all sorts of social outcasts, from hermits to the mentally ill. Since collectivism sees individuals as members of society, it deems all the unintegrated mentally ill; or, in the jargon of the early nineteenth century, collectivists see every individual as either integrated in some society in some way, or else alienated.[24] Thus, Romantic thinkers declared mad all deviants, including the revolutionary and the genius; as societies change and endorse the ideas of some revolutionaries, they rehabilitate them retrospectively. Hence, truth is relative.

All this is highfalutin: things are much simpler. Remember this: you do not have to integrate. There are many ways to avoid integration. The simplest is to become a bum. The welfare state is great for many reasons, chief among them that in it you can be a bum comfortably: you need not be homeless for that. The hardest way to become a bum is to lose your mind. It is too painful and so not recommended under any circumstances.[25] The most comfortable way to avoid integration is to join Academe. It does not look that way, because conspicuous professors fight jealously for every option of public exposure while true academics are not visible: they prefer to spend their time in libraries and in laboratories. All stable societies accommodate deviants; in Europe as in many other parts of the world, it was the monasteries; in Western Europe, it was the universities, which were a bit less esoteric than the monasteries. In any case, deviation is unavoidable; ignoring it creates all sorts of witch-hunts, and they all destabilize, as they all have a common base. Children accept all that adults tell them with no filter. They also accept what they see and the way they understand it: this is how we inherit all faulty habits from our parents. It is this childlike naiveté that some obscurantist thinkers envy and wish to revive. (The paradigms here are Blaise Pascal, Leo Tolstoy and Ludwig Wittgenstein.) Adolescents usually lose their naiveté and adopt what is these days called cynicism: there are no values. It is impossible to argue against cynicism. It expresses your being educated for a task that you reject; you need an alternative to it and you find none. Finding one may help you overcome cynicism. So, if you want to be an academic, choose the right reason: do not stay in Academe out of fear of the outside world; do not choose this profession in order to win respect or make money or anything else other than the leisure to study that Academe offers you. Otherwise, you can seek better alternatives elsewhere.

The conflict between individual and society is a matter of degree, with a sense of full integration at one extreme and the life of a hermit at the other. The life of an academic is more comfortable; that of a financially independent scholar is the most comfortable. Hence, do not try to resolve all conflicts in your (academic) environment. Suffice it that you decide for yourself what ill of society—at large or in your immediate vicinity—you will fight to improve. Try to choose it judiciously if you can. And no, do not expect me to advise you on this. To think that I can help you here and there is already a great exaggeration; and I am sufficiently aware of my limitations and of my inability to offer you advice on personal choices, especially as long as we have not met each other.

Generally, it is with graduation from the two extremes, the naïve overall acceptance of social norms and their cynical overall rejection, that the moral life of the adult individual begins. On this I cannot offer generalities. The one generality that rings in my ear is Talmudic: let not your youth put you to shame in your old age. Translation: join no academic clique and partake in no intrigue. Members of a clique may do something right, of course, and then they may merit your support. You can then join their activities as long as you approve of them; for that you need not join them. You should act openly and swear no allegiance: be loyal to yourself only. You can do it. You will meet with threats. Ignore them.

10. The Open Future

That is it. This volume will end with a final chapter, from which one may expect nothing but hot air. So here is my last chance to let you know what I think of your chances and how I think you may try to improve them in a reasonable manner.

This, then, is my main advice to you. There is a conveyor-belt leading to the position that you covet, and on it are more people than can stay on it all the way: most of those who covet academic positions end up in disappointment; Academe invites most of them to blame themselves—it is always comfortable for insiders to feel that they deserve their privileges and that those who did not make it have only themselves to blame. What is most unfair in this process is that some of those who are on the conveyor-belt that leads to academic careers have no chance to reach the end point: they have no chance to succeed, for the sufficient reason that most of those who play the game play it dishonestly. Among those who have no chance to succeed, some enjoy the ride; and then it is fine. Most of them suffer agonies, chiefly humiliations and bewilderments. You need not be among them; this is a matter of personal decision. Decide in accord with your personal dispositions, but without losing your capacity for self-criticism. Remember: plenty of evidence shows that Academe is no utopia; not all insiders have merit and not all who deserve academic positions succeed in acquiring them. You cannot do much about it now. If you are successful, then remember that you should try to help improve the system. But not now.

When I say that some of my peers are semi-literate, it sounds as if I exaggerate. I cannot help it. I still recommend that you do not resent this fact. And do not jump to the silly conclusion that a semi-literate individual who has just become a new member of the department you wish to join has robbed you of the position that you have a claim to. If it is any comfort to you, let me draw your attention to the fact that there is no law of nature as to the number of members of any institution and that semi-literate academics often have useful merits that many a learned academic lacks.

A final piece of advice regarding the conveyor-belt: find out whether you enjoy your place on it; if yes, stay there for the ride, not for the hope of reaching its end-point. Otherwise, decide whether you are willing to pay the price of staying on it. Always pay the price of a decision of yours willingly and with no haggling. If you do not fit the conveyor-belt, try going for the academic job unconventionally. (I did.) There is always room for unconventional competitors, nowhere more than in Academe. There are myriad arguments against this advice. The ones I know merit no response.

A word about the price. Academics strive for tenure as a matter of course; in many systems the faculty members belong to academic unions, and these prevent the option of academic employment with no tenure except for probation periods. Liberalism suggests that a dual system is preferable, and then the non-tenured should be better paid than their equivalent tenured peers. This is not going to happen in the near future, though. And so you need a tenure-track position. As you approach the moment of decision about your possible tenure, the price for your deviations will rise with increasing speed. Afterwards, tenure will be a great protection; by then your peers will expect you to become tame. Otherwise, they may try to punish you. Penalties for tenured academics are two: ignoring you and your work, and shaming you. Shaming is the greatest and most effective penalty in Academe. You are immune to it, I hope. Still, whatever the price is, pay it willingly. A friend of mine was designated heir-apparent of a school of thought. He was allowed to seek a temporary job elsewhere with the understanding that he would be called back at the right moment to fill the position he was expected to fill. He then had heretic thoughts. He was not foolish enough to be sure he was right, yet he thought others might find his thoughts interesting, and so he published his ideas with no hesitation. He promptly lost his promised very high position. He never regretted it; not for a moment. You may think this is a special case, and of course in a sense it is; but I have a few such friends, and I learn from gossip that there are many more. The requirement for such a story is that there be schools of thought, of course. That these exist is not generally acknowledged.[26] Even when their presence is obvious and well known, the disposition is to ignore them. The paradigm case is that of economic theory, with the free-market or Chicago school against the Keynesian school.
Yet for decades experts deprecated this by the standard technique of naming: the theory of the one school is micro-economics and that of the other is macro-economics. A joke about a department of economics in a new university in Texas is telling: its head said, we have established micro-economics in our department, and now it is time to start macro-economics. Yet it took the courageous Don Patinkin to say, as long as the department in Chicago teaches micro and the department in Yale teaches macro, economics is no science: they both should teach both.

Back to you. Do not swear allegiance to any school, not even to the one you belong to: openness is the highest and wisest imperative. And do not wait for schools to recognize each other: recognize them all—unless you have a compelling reason not to (such as the Fascist blemishes on the theories of the famous historian of religions Mircea Eliade or the utter futility of the output of Nazi philosopher Martin Heidegger). Keep your research above petty conflicts and pay attention to serious intellectual ones: pick the question of your choice from among those under dispute and write down all the competing answers to it. Sift the intelligent ones from the silly ones and discuss the former. Clarifying a dispute this way is always a worthy intellectual activity.[27] This may take courage: it is difficult for a novice to dismiss a silly view that some bigwigs support. Disregard this. The penalty for deviation here is amply compensated by the respect that it gets you—even when no one will do you the courtesy of telling you about it.

Like every large institution, Academe is entangled in its own red tape. This makes the bush telegraph popular, although it is an unreliable source of information. Students usually learn about the academic red tape from it. Even academics do. Bureaucrats often use it as an excuse for ignoring your rights. You should know that every large institution has officials whose task is to provide you with the information you need about your rights. You do need it, and you need it updated; request it. This is not to say that you should trust what officials tell you: rules often enough have exceptions; if need be, you can create one. In addition, note this: the university is an autonomous institution: it can change at the drop of a hat any rule it has that is not prescribed by the law of the land, and it does not need to justify its conduct to anyone. If an academic rule seriously impedes you, think of the option of asking the university to waive it. The university does this. It has to. And at times to everybody’s benefit.

Look at the history of Academe. It began as the class of priests with no community, of people who were pensioners for life. It had two faculties: medicine and canon law. It developed slowly, the faculties of engineering and of mass media being the last to appear. It offered lectures and degrees, and for the degrees it required only a few exams. The American system finally won: exams and grades for every single course a student takes, and for combinations of courses, with or without a written essay, dissertation, whatnot. This is a burden on both student and professor, except that powerful professors get assistants to grade their students’ papers. In addition to teaching, the tasks of professors are administrative and research. The teaching burden is extremely light if you want your lectures to help your students rather than impress them, since you can always benefit from feedback; if you cannot help them, you can go home and do some minimal homework, and if this does not suffice, tell them how things stand. The administrative tasks are largely voluntary, and you can dodge most of them. In the whole of my career, I bumped into only one case of a professor whose utter refusal to undertake any administrative task led to his early retirement. (He gladly took it.) As to research, only one piece of advice becomes it: undertake only research that interests you. This may change, and so do fashions. And fashions are made by the stubborn and by those who can do only one sort of thing.

The future of Academe largely depends on the future of society. Humanity may face total destruction; in that case, our current plans make no difference, so it is advisable to ignore this possibility except that we should try to help prevent it. The future of Academe depends also on the future attitudes of our society to the life of the intellect. This is a part of current modern democracy. It behooves us, as intellectuals and as teachers, to raise the interest of the public in democracy and in the life of the intellect. For reasons that I will not specify here, widespread populist anti-intellectualism comes repeatedly with proposals to limit certain intellectual activities. It can come in Parent-Teacher Association meetings from individuals who complain against the inclusion of poetry in the school’s curriculum, and it can appear as a revivalist religious movement. In any case, it impinges not only on your job but also, and more importantly, on your task as a member of your society. It does not take the great Abraham Maslow to tell us that there is a hierarchy of needs and that we do have intellectual needs that are hard to overlook. Science fiction writers dwell on it.[28] Except that education often makes us suppress awareness of these needs. It is the task of those who know better—you and I—to try to alter this situation before it does too much damage. And you need not be a pessimist: tough as the situation is, the deep-seated need for food for the intellect is strong enough to make its way back.

That should do for now.





[2] The natural items to look up are autobiographies and biographies. Next come official reports.

[3] The traditional law faculty was of canon law; the secularization of the university changed this, although canon law is still a legitimate topic, particularly in schools that come to serve religious communities.

[4] Invited to speak to heads of departments of a reputed university hospital, I began by asking that any one of them volunteer to report one mistake. There was no response and I left.

[5] Francis Bacon, Novum Organum, 1620, Bk. I, §117.

[6] William Beveridge, Foreword to Lionel Curtis, World War: Its Cause and Cure, 1945, vii.

[7] “What Makes a Scientific Golden Age?” in my Science and Society: Studies in the Sociology of Science, 1981.

[8] Robert Rosenthal and Lenore Jacobson, Pygmalion in the Classroom: Teacher Expectation and Pupils’ Intellectual Development, 1968.

[9] Albert Einstein, “Why Do They Hate the Jews?” (1938); Out of My Later Years, 1956.

[10] Thus, also, even while appreciating local art, we may admit that few parts of the world have produced art equal to that of the West. Some non-Western art is exquisite: there is no glass like Chinese glass, and Chinese and Japanese painting are also exceptional, yet Renaissance paintings are superior to them.

[11] William Warren Bartley III, The Retreat to Commitment, 1964, Ch. 1.

[12] Adam Smith, The Wealth of Nations, Chapter 3.

[13] Aldous Huxley, Point Counter Point, 1928.

[14] Ludwig Wittgenstein, Philosophical Investigations, 1953, §118.

[15] See my The Very Idea of Modern Science, 2013, Preface.

[16] Archimedes, the greatest applied physicist of all time, they say, was most reluctant to apply his ideas: he did that only as a last resort, as a means to save his hometown Syracuse from defeat.

[17] Raising funds for a wise cause is not easy. Former US president Bill Clinton is in charge of funds for fighting AIDS. He noted that the USA should invest in health in Africa to reduce the rate of AIDS in the USA. He failed to convince people to do that, he reported, and so this did not take place. See Bill Clinton’s HIV/AIDS Record of Shame.

[18] Ernest Gellner, Contemporary Thought and Politics, 1974, 34; Conditions of Liberty: Civil Society and its Rivals, 1994, 56, 196.

[19] Patrick J. Deneen, Why Liberalism Failed, 2018.

[20] Karl Marx, “Value, Price and Profit”, 1865. §2.

[21] Karl Popper, The Logic of Scientific Discovery, §14: “it depends upon whether we wish to speak of a race of animals living on our planet (an individual concept), or of a kind of physical bodies with properties which can be described in universal terms”; a biologist will not decide whether biology is limited to this earth or not.

[22] Don Patinkin, “Friedman on the Quantity Theory and Keynesian economics”, Journal of Political Economy, 1972, 80: 883-905.

[23] Michael Polanyi, Knowing and Being, 1969, Ch. 3.

[24] Philippe Pinel, a physician who defended the rights of mental patients, invented the word “alienation”. For a time, the politically correct word for psychiatrists was “alienists”.

[25] Y. Fried and J. Agassi, Paranoia, a Study in Diagnosis, 1976, Ch. 7, n. 2.

[26] See my “Scientific Schools and Their Success” in my Science and Society: Studies in the Sociology of Science, Boston Studies in the Philosophy of Science, Vol. 65, 1981. That essay appears in that collection of essays since editors of refereed journals rejected it too many times.

[27] R. G. Collingwood, Autobiography, 1939.

[28] For example, Ray Bradbury, Fahrenheit 451, 1953.

Categories: Books and Book Reviews
