Part III: Prescriptions, Academic Agonies and How to Avoid Them, Joseph Agassi



 

The rule that guides me here is that of Bernard Shaw. He wrote his plays as political propaganda, but he needed an overall rule to guide him. He developed a new attitude towards the standard book of rules of the theatre. In almost every play he wrote, he planned to violate a rule; he did so while taking care of its rationale. Thus, for a conspicuous example, his early play You Never Can Tell (1896), a mix of drama and farce, builds its plot on coincidences. The rule says, avoid coincidences: this ploy is too artificial and too easy, so it does not work.[1] In that play, Shaw piles up coincidences, and so his audience soon expects more of them. As Maugham observed, any event, however improbable, is fine as long as readers are ready to consider it credible, no matter why and no matter by what ploy narrators succeed in convincing their readers. Another example is Shaw’s only tragedy, St. Joan. The most sacred rule of the theatre is that a tragedy ends with a catharsis. In that play, the cathartic scene is most memorable: in it, the priest responsible for her burning at the stake sees it happen (off stage) and expresses sincere, great regret. (This illustrates a major thesis of Shaw’s: evil deeds done out of malice are as boring as pickpocketing; it is evil done with good intentions that deserves attention.) After that cathartic scene, Shaw refuses to let the curtain fall. Following Shakespeare’s Romeo and Juliet, he opens a discussion, to show that people refuse to learn the lesson that the tragedy teaches. It works—because, like Shakespeare, he delivers his lesson with a punch that expresses great concern.

Allow me a mini-digression on this digression. I enjoy the theatre and all other forms of the verbal arts, but they are not for me. There are many stories, for example, of parting with traditional ways and their replacement with better ones. Some of these end with reconciliation and adjustment to the new, some end with return to the old (and there are many other options). The authors of the stories that end with return to the old advocate tradition; the others advocate modernity. Examples go either way, of course, and so the choice of a case is an expression of the author’s predilection. Thus, intellectually speaking, art offers no argument. It is thus sheer propaganda. This is no criticism: it is the nature of the beast. The search for examples of the beauty of critical thinking leads us to different examples. True, the wording of a scientific text can be clumsy or beautiful; this may suffice to justify the repeated rewriting of scientific textbooks. Similarly, artists can dramatize some old great debates, political and scientific. Indeed, Plato’s Apology of Socrates, which many consider the philosophers’ bible, has undergone repeated dramatization. Thus, even in the theatre the better advocates of an option recognize the merit of its alternative. The best dramatic example of that is the end of Inherit the Wind, the 1955 Jerome Lawrence and Robert E. Lee play on the famous monkey trial: Clarence Darrow has won the case, and he now stands alone on the empty stage, holding two books in his hands, evidently the Bible and The Origin of Species, silently weighing them against each other. Nevertheless, this play, terrific though it is, cannot serve as a substitute for debates about the situation it describes so masterfully. That is where my heart lingers.

Back from the glittering theatre to drab Academe, its day-to-day life, and its hoped-for reforms. Here are sound rules that we should try to apply to all reform. First, examine the rationale behind the current system: there is a good reason behind every current practice—including even the stupid and malicious intrigues that notoriously obsess academics of all faculties and all departments. To implement a reform successfully, reformers must try to see to it that it preserves the positive function of every institution and custom that it comes to eradicate. They must advertise this when they implement their reforms. They also must see to it that the reform includes means that compensate all those who will suffer from it, and to institute, reasonably successfully, amnesty for all who habitually behaved in the manner that the reform comes to outlaw. The literature on reforms occupies a major part of the literature on the law and on politics. Its most important aspect is this: minimize the damage a reform causes and compensate its victims as best possible.

For example, before we can abolish the lecture course, we must secure the tenure of the professors and assure them of two things: they do not have to replace their lectures with anything, and they are free to work for the implementation of any alternative they like. Above all, we should never abolish the lecture course itself, only the obligation to attend it. This is possible only if reformers take care to avoid instituting extraneous incentives to attend a lecture course: there is only one legitimate incentive for students to register in a lecture course: their recognition that attending it is to their benefit—be the benefit intellectual or practical or any other.
 

1. How to Avoid Professional Mystique Without Loss of Self-Esteem

Professional mystique finds expression in ordinary language: the words “professional” and “expert” are used as synonyms, although some experts are amateurs and some professionals are inexpert. Yet even ordinary language distinguishes between the two: praise for expertise is for efficiency and competence, for the ability to solve problems, whereas praise for professionalism is for the insistence on correctness, on etiquette. And it is etiquette that expresses mystique. Industrial psychologists accept it as a truism that professional mystique serves to boost and support needed self-esteem, that on occasion it might obstruct the development of professional efficiency and competence, and that in such cases the task of the industrial psychologist is not merely to remove the mystique but also to replace it.

I think you dislike this, and for two reasons. First, these industrial psychologists are narrow-minded professionals with the sense of superiority—the mystique, indeed—of members of a super-profession. On this count, you may be right. Second, there is an implicit comparison here between industry and Academe. You resent this. Here you are in error: you suffer from a sense of superiority. Academe has many functions, and it is comparable to other institutions in regard to any of these. As an ivory tower, it is possibly superior to any other ivory tower, from the Buddhist monastery to the lord’s manor. As an employer, it is possibly superior to any other employer; likewise as a center for the administration of rituals or of justice, or even as a center of learning or as a means for the advancement of learning. Comparisons and contrasts of diverse sorts are always trite, but they may help dispel mystique: they may improve one’s sense of orientation.

Mystique comes to bestow harmony. In Academe, harmony is absent as a rule: internally, then, academic mystique is a failure; its target is the marketplace. The reason for its internal failure is that the tasks of academics are not suitably definable. Comparing academics for excellence is hard, except by popularity—among students, peers, or administrators, but mainly among the lay public. The diversity of criteria of excellence makes any opinion on them contestable.[2] All this invites jealousy and mudslinging aplenty.
 

1.1

Let me advise you about the best place in the game that you can occupy: stay out of it. Instead, do your own thing: do what you enjoy doing. It is best for your soul and for your career. Do not volunteer. If they assign you a role, play it well, but stick to the book: do nothing more, except to help students in need. Frequent the departmental seminar only if it interests you. Avoid bumping into peers. And remember: honesty is the best policy. Peers may resent your poor valuation of their performances; they may hate your dissent. This sounds as if a little flattery may help. Admittedly, it can go a long way; a slight overlooking of some awkward items is better. There is no harm in that. Nevertheless, I say, avoid it all. A reputation for straightforward, honest conduct is much more rewarding and much more durable. It also becomes a good habit. It also generally disqualifies you from entering an academic clique. The penalty for honesty may at times be excessive, especially for non-tenured faculty. The world is often unjust, and those who say that academic jobs are occupied by the people who best deserve them are utter fools. Yet injustice hits the honest and the dishonest alike; observation shows clearly that dishonesty, too, does not always deliver the goods. Above all, honesty does lead to the company of those who prefer honest criticism to flattery (in accord with the advice of Proverbs 27:10).

All this is obvious. People are often shocked when they hear of injustices in Academe. For them academic mystique is a great success.

Academic mystique is surprisingly similar to mystique elsewhere, especially in our own secular age, when academics sound ridiculous every time they use too overtly ecclesiastical means of mystification. In a section on mystique (on what it calls “injelititis”), the brilliant Parkinson’s Law (1957) describes the mystique exercised in poor business establishments, run-down hotels, and universities. The cheap spy novel The Looking Glass War by John le Carré (1965), too, shows how strikingly universal mystique is. Its anti-hero is the incompetent intelligence department of the War Office. The story is that the competent intelligence department of the Foreign Office sets a trap for the competition, designed to force the government to close it. The author presents mystique as the language of incompetence and disorientation, the means with which to avoid acknowledging the existence of these qualities. Le Carré reacted against the popular, romantic, glamorous trend of spy novels, describing the spying profession as gray and callous in equal measure. Somehow, readers and critics alike refuse to get the message. They take his grayness to be merely a technique for creating atmosphere (with justice: his techniques come from the magnificent, romantic, pseudo-realist trend of the detective novel of Dashiell Hammett and Raymond Chandler). His message has passed unnoticed. That in that novel a man goes on a useless mission to die there, all critics have discussed, and they have often found it too unusual to be credible. As to the incompetence itself, they barely noticed it as a theme in the book, much less its universal nature. If you disbelieve me and if you care about it, you can make an experiment to test it. Read the book and make a list of statements made by the incompetent intelligence officials there that both express and mask their incompetence. Go to the faculty club or to the departmental office, where incompetent academics thrive. Try to find a difference in style. If I did not know that le Carré used to mix professionally with intelligence officials and had spent little time in a university, I would have suspected him of having moved dull faculty-club parlance to the drab offices of MI27.

I do not place le Carré in the same category as Chekhov. I discuss here one who uses contemporary idiom; I do recommend that you train your ear to hear it. When you hear professors saying, never mind just now why what we are doing is important; when you hear their expressions of inarticulate contempt towards those who do things differently and their pooh-poohing of those people’s reputations; when you hear stories from some good old days just when you feel that strong reasons are required; when talk of duty and of glory pops up in serious conversation—you may then feel unease and blame yourself for your shortcomings rather than pity those who speak that way. We should correct this error pronto. Keep away from contempt and from bitterness—in sheer self-defense, said charming Spinoza.

Let me then tell you a true story of the ill effect of the mystique on one of the most admirable and least naïve people I ever had the fortune to meet. Let me begin with the background information. She was my mother-in-law, Margarete Buber-Neumann, reputedly the first to testify from personal experience about concentration camps in both Nazi Germany and the Soviet Union. She did that in court testimonies, in books, including her bestselling Under Two Dictators, and in series of public lectures. I got her invited—with her consent, of course—to speak to the celebrated Boston Colloquium for the Philosophy of Science. This was no small matter, since it involved trans-Atlantic travel and since she was not fluent in English. Yet it seemed—it still seems—to me important, since the Boston Center for the Philosophy of Science was the intellectual center of the American New Left, a significant political force at the time. She agreed to compare the attitudes towards rationality of the Old Left, as she had experienced it in her youth, and of the New Left that dominated the scene then. This she did, with great charm and great success. Unfortunately, the editors of the Boston Studies series did not have the courage to publish her lecture. (It appeared in German in her collected lectures.[3])

To my story, then. A week or so before the lecture, she rang me from Germany to Boston. She had gotten cold feet. She wanted to cancel the lecture and suggested that I tell the organizers she was sick. This puzzled me, since she was an experienced lecturer and an unusually brave one: she was used to hostile audiences, for she lectured chiefly in Germany, to audiences that included many individuals who had spent their youth in the Nazi youth movement, so that they needed more than a dreadful military defeat to shed the convictions that she scorned. Yet speaking to an academic audience was different. This surprised me. To repeat, she did come to lecture, and did it very well indeed, but only because she had promised to come and had to keep her promise despite the feared fiasco. This expectation of fiasco she owed to the academic mystique of which she was a victim, her amazing life experience notwithstanding.

This story should show the ubiquity of the academic mystique. The default assumption is that it is present, and that invites caution: dodging it requires constant effort.
 

1.2

The one remedy to mystique that is possible to find in Academe today is proficiency, efficiency, competence. Here both Parkinson and le Carré would agree, wholeheartedly. I hope you do not need my story to know that this is not exactly my recipe, although I do agree that incompetence and mystique are Ugly Sisters. Professor Parkinson, a historian and a student of the differences between East and West, between democracy and other regimes, saw in efficiency something important on the scale of human affairs in general, whether in Western social and political life or in the Western academic world. I shall not discuss here his detailed studies, interesting though they are. Rather, let me develop my own view concerning academic efficiency.

Taking competence and efficiency seriously generates a scholarly attitude and a thoroughness mentality that may easily amount to a mystique of sorts. It is much simpler to leave competence alone: acquire techniques and proficiencies only when you need them and to the extent that you need them. And no more. It is true that incompetence calls for mystique, but so does any exaggerated stress on competence. Indeed, over-stressing competence makes us feel incompetent and seek a mystique to cure it. Moreover, competence alone does not suffice for ousting mystique. In particular, the need is not so much for a high degree of competence as for a sense of orientation, a bird’s-eye view of one’s place in the world, a sense of how much one can reasonably expect to achieve regardless of one’s competence—under what circumstances and at what expense. In brief, the feeling that a certain amount of waste is inevitable is more important than respect for competence and efficiency. Since all this sounds rather highfalutin, let me offer an elementary example.

Take the very first serious task of scholarship: learning to read. It is the mother of all the exceptions to the rule. How we can explain to youngsters what reading is, I have no idea. Galileo considered script—especially alphabetical script—one of the greatest miracles of all time. How can you explain to tiny tots what Galileo could not understand? How can you explain that teachers only pretend to understand the mystery because they suffer from a malady called professional mystique? It is all so hopeless that it is not children’s misery but rather their survival that is inexplicable.

In desperation we shower our little wretches with all sorts of aids: educational toys, games with printed instructions, crossword games, glossy illustrated fairytales—anything we can afford, and at times even slightly more than we can afford. It stands to reason that it is all a matter of hit or miss, and that if even one experiment in the whole group proves helpful, even partially, the venture was worth it. The sense of waste is oppressive, especially in hard-up families with bright problem children. And then enormous emotional and financial efforts are put into purchasing the newest, most expensive, dazzling piece of equipment; this flatters the harassed child and raises a momentary vague hope of a miraculous salvation—soon leading to an indescribably deep and painful sense of disappointment and frustration and despair, all of which goes promptly into a savage revenge: of the child on the new equipment, and of the parents on the bewildered and already miserable child.

All reactions of bewildered children to such situations are understandable. Yet parents feel they must make their bright children feel that they are in the wrong. This assuredly creates reading blocks. And yes, bright kids are more prone to suffer this malady, unless they are so bright that they learn to read almost effortlessly. And reading blocks lead to stubbornness. Rather than be inventive and resourceful, the stubborn learn to avoid work, to wait for the one and only way out of the problem; and when they try a way that fails, they neurotically wait for the wrath of God to fall on their heads. This increases stubbornness. The parents of the stubborn are then so frustrated that they cannot possibly do the only right thing—namely, comfort their child and offer all sorts of reassurance without explanation: no thunder is going to strike anyone; somehow, tomorrow will be a better day. Instead, parents get increasingly tense; they easily lose their temper and thus serve as instruments of the divine punishment that was bound to come.

How can we avoid the catastrophes of first-graders? The degree of competence required to prevent trouble is minimal. So is the amount of required general orientation. Yet the problem is very serious, and almost insurmountable, the constant presence of all the teachers and headmasters and child psychologists and uncles and family friends notwithstanding. They are all anxious to play their required role in the tragedy.

To make it easier on me, let me move from schoolkids to graduate students.

Like reading, proficiency in a foreign language and in mathematics is worth acquiring. You need not be an expert; suffice it that you can read useful texts. You can acquire enough of this early in your career as a student. Find the classics in your chosen field that you enjoy reading, and study them closely. This is the best investment for a scholar. It is obvious to good teachers in some fields, chiefly in the humanities, but even psychologists and economists ignore it, not to speak of mathematicians and natural scientists. You can enjoy reading good old texts: all it takes is looking for them; the internet is here a great facility but also a trap. Keep vigilant against the mystique.[4] Aim at reasonable progress that helps you enjoy reading texts—in a foreign language and in mathematics.
 

1.3

As the requirement that doctoral dissertations be original is the source of academic mystique, the requirements for proficiency and competence and expertise boost it. Note this: competence is terrific in some specific contexts and deadly otherwise. This is how these requirements ruin the writing of doctoral dissertations: the supervisor who has no idea what is required of a dissertation can always ask the graduate student for more work. How much? Doctoral dissertations are supposed to contain contributions to the stock of human knowledge, no less. Ask an academic what this is and you have invited trouble and may have acquired an enemy. This ensures the prevalence of academic mystique. The demand for expertise clinches the disaster. Keep your distance from it!

Some decades ago, the Princeton University senate witnessed a funny scene. The head of the music school reported to the senate that the school had decided to grant doctorates for creative works and requested senate approval of that move. The head of the math department responded that, oddly, the math department had decided on an innovation in the opposite direction—to grant doctorates for non-creative work. This raised laughter, and the senate passed the required approval with no discussion.

This story merits analysis. Artworks were traditionally not in the jurisdiction of universities:[5] to join a guild, an artist had to produce a masterpiece. When the academy recently developed its current position as the home of all things intellectual, it began with theories and histories of art, and soon it laid its hands on art as such—by granting art schools academic status, by adopting art schools, by making art schools and art departments proper parts of the academy. A part of this process was the granting of doctoral degrees for good artworks. In mathematics, standards are the highest: every doctoral dissertation had to prove a new theorem.[6] It used to be pretty hard to find a dissertation that merely proved a new wording of an already proven theorem. To make it easier to receive a doctorate in mathematics, it was necessary to find merit in a study that does not offer a new proof. The most obvious candidates are a survey, a history, and an educational thesis—but other options are available too. They are all known as non-creative. You can see how different the meaning of the word “creative” is in art and in math. Most likely, the members of the Princeton senate were aware of this difference; the exchange granted them an opportunity, however, to dodge the need to discuss the difficult question of the meaning of an academic higher degree. And they embraced this opportunity.

Doctorates were initially licenses to teach (licentia docendi), meant to control the teaching of dogma. Medical doctorates were different: licenses to practice. The rationale of granting licenses is that in many a society people employ strangers and wish to reduce the danger of employing incompetents and charlatans. Prior to the industrial revolution, academies, as church institutions, required that their members belong to the clergy; otherwise, members were free (for life) of any task: sinecure means without care (of souls), without communities. Lecturing was secondary to scholarship and served as a means towards publication. This led to the granting of degrees that served as intellectual licenses from the Church. These were coveted, and so they were means for attracting students, which was a matter of reputation (and not of levying fees); though open to all, universities served only a small literate elite. Universities went on granting degrees that, in the current secularized academic system, are of questionable value other than carrying the reputation that inevitably contributes to academic mystique. The only clear function of doctorates these days is that they all serve as professional licenses for entry into the academy: since World War II, hardly any academic institution will hire people with no higher degree; likewise, none will grant a doctorate to one with no master’s degree. Still, I have had friends with doctorates but no lower degree.

Professors supervise doctoral dissertations atrociously, since they have no idea what is required of them,[7] and they will not admit it. They have no idea as to when a student qualifies for a Ph.D., and they fear divine punishment, namely, their colleagues’ ridicule: the suggestion that they approved a doctoral dissertation a trifle prematurely. To avoid this, they let their students hang about aimlessly for years; they allow the force of circumstances and incidental events and outside pressures to determine the moment of relief; they cause a folklore of horror stories to develop among their junior colleagues and senior students concerning their bizarre methods of supervision and cruel ways of handling their graduate students. I shall not report any of these stories; you can hear them from others. My advice to you is, avoid a supervisor with many lingering graduate students.
 

1.4

Before you register as a graduate student, it would be wise of you to shop around. This may be wasteful, but you should expect waste: better to lose a semester and move to another college than to stay with an adviser who will make you waste an extra few years on a finished dissertation just in order to reduce the fear of having released you prematurely.

We can expect the very competent advisers to teach their students the techniques they know well. So, unless you have a good reason for wanting to acquire the given techniques, a reason that you have closely examined with the help of severe critics, do not join the team of a super-efficient professor. Having made a reputation, this professor may very well be using students to sustain it. Any technique may become outdated any day, and the day after, you may find yourself in a marginal job, bitter, and boring your colleagues and students about the good old days. Your colleagues will cease to invite you to their parties and will avoid your company—all of them except the incompetent and the lost. Your students will be a dull, unwilling, aimless bunch. You will be talking the language of mystique to such colleagues and students as you will have. Your competence will still be there; what you will be missing is orientation and resourcefulness, and independence of spirit. You will be not incompetent but lost. Competence is not enough to dispel mystique; orientation is necessary for it too.

Orientation is a general feature. It applies to a field of study and to social life, and to your place in the department, in the university, in the community, in the profession. Mystique may develop without any relation to your professional work, yet it will become a professional mystique incidental to your profession. This is the basic optical illusion of industrial psychology: because mystique is seen as professional, the fact that its roots lie elsewhere is sadly overlooked.

Every society, social group, organization, or other institution adopts procedures, a book of etiquette. Superficially, these are simple and accessible. Those familiar with them usually abide by them: they live by the book. Those unfamiliar with them are more fascinating. Some of them do not know and do not care: they are the happy-go-lucky whom everyone envies and admires and adores—rightly, but too often out of proportion. It is not only God who protects fools (Psalm 116:6); everyone does. The funny fact about them is that they are not such fools as they make you think they are (Suspicion; Gone with the Wind; Lucky Jim; Happy-Go-Lucky; The Man Who Came to Dinner). They do not live by your book, but they live by some book: they listen to another drummer (Henry David Thoreau). Their book is unusual in one way: through thick and thin, they absolutely must remain amiable and bubbling and delightful to the last. Those who do not act by any book, who have no special book to live by, get penalized, often also broken—they either learn their lesson the hard way or move helplessly from one penalty to another, from prisons and mental homes to beachcombing and tedious temporary jobs. The most thought-provoking type, however, is rather common in modern society, especially in the middle classes: the type of those who know that there is a book and that they must abide by it. They have no idea how to do so, but they do. Common as they are, they awaited discovery; Franz Kafka discovered them (introspectively, by the way). He will be recognized as a psychologist who used literature as his medium of expression: he is the first and leading psychologist of the mystique.[8]

Kafka’s hero does not know the book because of a deep-seated mystique, a sense of guilt, and so on. Psychoanalysis will never help him adjust: analysts share the most serious flaw in Freud’s psychology, the erroneous view that we all know the book. Freud did not allow blaming the super-ego for the ambivalence that people suffer from. His father must have been a tremendously imposing individual.

Something hilariously ironical goes on here. Trauma initiates in childhood and takes root in adolescence; the book of rules that a child or an adolescent can know is primitive as compared to that of a young adult. Freud rightly considered learning the adult book a part of the process of adjustment. Adjustment to what? To one’s environment; and is not the book a part of the environment? Can the super-ego grow? Freud (and the Freudians) described adjustment in sufficient detail for a critic to show that they do not include in this adjustment any growth of the super-ego. This may have practical consequences. For, however traumatized you are, perhaps you can bring yourself to learn some sets of rules in the abstract, say, those of an alien society. If you do, you may find it easier to adjust, say, by joining that alien society. Rather than undergo analysis, perhaps you can try to read the sex manuals of alien cultures. Perhaps not; but analysts all too often advise patients to try all sorts of experiments rather than to read such a manual. This is empirical information about the current manuals of psychoanalysis. Freud advised his followers never to advise their patients. Critics said this was barely possible: Freud could not possibly follow his own advice (Alfred Adler).

Back to Kafka. He would probably have refused to study any book of rules; he was neurotic enough to prefer to abide by an unknown book, constantly in fear that he was breaking it: the fear was painful, but it also gave him a thrill. I have a friend who knows only the rules of research; whenever he is in trouble and does not know what to do, he goes to his researches, about which he does know what to do, and hopes and trusts that other matters will take care of themselves. They often do, more or less. Most academics live like Kafka. They need the mystique as an excuse for their ignorance of the book, ignorance that the mystique in its turn perpetuates. They bother a lot about the book; whenever they have to make a decision, they worry a lot and go about it inefficiently; the more technical the problem on which they decide, the less technically they go about it. They lack worldly wisdom—they will not study the book.

Every manual of every university is such a book. Every letter of academic appointment refers to it. Administrators can and do bully academics and play all sorts of Kafkaesque games with them. My advice to you is, invest a little time now and then in studying the book that they expect you to follow. It is a good antidote to the academic mystique. It will prevent them from forcing you to ask for permission to do what it is your right to do just because you fancy it. In my experience, this piece of advice signifies.
 

1.5

Two stories by Kafka illustrate the malady and pave the way to a cure; both occur in his The Trial. A man stands by the gate trying to bribe the sentry to let him in; the sentry says he is not authorized to grant permissions. The man goes on trying—for the rest of his life. With his last breath, he hears the sentry volunteer the information that no entry permit was necessary. The second story is from the very last page of that book. K. has presumably been tried and found guilty (this is unclear: the book is unfinished), for his executioners come and take him to an empty lot in a suburb to kill him there. To the very end he is ignorant of his crime, the judges, the court, and its legitimacy. On the way to the execution, the victim and his executioners cross a bridge; a police officer stands there idly. All that K. has to do in order to find out whether the business is legitimate is to ask the police officer for help. He does not. Kafka offers his opinion only in the last sentence of the book, which describes the last moment of the victim: “‘Like a dog!’ he said. It was as if the shame of it should outlive him.”

Before you ask for any entry-permit, inquire whether it is necessary and who is officially designated to answer the question, and examine the answer carefully. Regardless of your feelings, if someone jumps at your throat, do shout for police assistance. That is all you can learn from Kafka. Bureaucrats may demand entry-permits even with no authorization to do so; some of them do so despite explicit rules that forbid it. Ignore them and move on. There is nothing more to it.

Or is there? Manuals are full of hot air; people you consult talk endlessly and never to the point. Where are we? In a Kafkaesque world. Well, you have to find the right paragraph in the manual; and if you have no time for more, seek the right adviser. How? I do not know. Just keep trying.

When I was a perplexed adolescent, I went to The Guide for the Perplexed; in it, Maimonides advised me to consult people reputed for their wisdom. It broke my heart: if I knew who was wise, I mused, or whose reputation for wisdom to trust, I would not have been perplexed. Yet nothing is easier than to thumb through a manual in search of the proper paragraph or to seek the person who might and would interpret it for you. You can consult various reputed persons and make up your mind whom you wish to retain as an adviser; even phonies may offer sound advice, however reluctantly, when you press your request with reasonable honesty. Maimonides was thus on the right track when he sent the perplexed to the reputed, but he could not help me by directing me to the wise. The worldly wise are whom we need: the settled; those who may be phony but who can also quietly and amiably take reasonable care of their friends and relations; those not easily thrown off balance by the unexpected. Maimonides spoke to the perplexed in search of God, not to the perplexed in search of their own places in the world. My mistake. I am afraid I did not make myself clear enough. Let me try again.
 

1.6

An essay by the leading Canadian neuropsychologist D. O. Hebb concerns the mystique of inductive confirmation and its effect on his graduate students:[9] it made them nervous. It is the mystique of academic competence and success; it is your key enemy. Very few are successful all the way, as inductivism promises. Most people fail regularly; some of them, not discouraged by failure, try repeatedly—usually with slight variations. Stories of mere success, in Who’s Who, on the dust jackets of famous books, or in the periodically printed mystique of the college, stories of the pure milk of success, are depressing no end.[10] Hebb ignored all this. He reported that he had told his graduate students not to take things seriously. This good advice scarcely helps, as these things matter to careers.

You may have read some biographies; at least sometimes, they may have depressed you by showing you an ideal that is quite unattainable. When Mozart was my age, says Tom Lehrer, he was dead. This causes self-contempt, self-mistrust, self-doubt; intensifying these beyond reason is his proposed cure for them. The facts are simple: we all fail; even Einstein managed to fail, although he was a success before he was thirty, and so did Beethoven. Nevertheless, we may learn to be resourceful, cut our losses—viewing them as tuition fees—and start afresh. We may achieve nothing, yet live happily trying and trying again and approaching the grave as cheerfully as anyone has the right to, not only the Einsteins and the Beethovens. For this, only hygiene is at your service: when anyone says, who do you think you are, you know that you are no friend of this anyone, no matter what a somebody that anyone happens to be. Keeping a distance from anyone of this ilk incurs no loss. So do watch it. Only the lonely cannot keep a distance, and loneliness comes from the inability to help. So do learn to help.

More generally, it is good to remember that losses are a part of the process, that every development must be wasteful to some extent. Alas, first-graders cannot understand this. Parents can comfort kids and make them trust that all is well even when they do not understand how or why; but too often the parents do not understand this either. Do understand and remember this if you do not want to suffer from mystique and from self-degradation of any sort. When others in your vicinity suffer likewise, you should comfort them and remind them that some failure is inevitable.

To recapitulate, Maimonides advises you to go to the reputedly wise on the assumption that they are wise. This assumption is questionable; in modern Academe, it is often false and then dangerous. Academic mystique exaggerates the success and wisdom of established academics, and so they are often your worst advisers; to be of use to you they have to be worldly wise rather than great lights. Otherwise, they will feel phony, because they tend to view their reputations as exaggerated and fear that when advising you they may expose themselves. You may nonetheless get good pieces of advice from them, but only after making clear to them that their mystique will not do, that you sincerely want real advice and not a pep talk.
 

1.7

Sociologists have long concerned themselves with the penalty or sanction applied to those who break the rules. The penalty is all too often administered by what is known as public opinion, epitomized and symbolized in literature, theatre, and innumerable movies by the woman (often a spinster or a widow) who serves semi-officially as the village gossip; at times the task is allotted to a frustrated wife or even a bachelor (Sitting Pretty). When a couple in the village live in sin, the village gossip is supposed to ostracize them. Yet she dotes on them; she wants to know all about them first-hand, since she is in constant need of refurbishing her stock of gossip; she is thus a peeping Tom; and to her they are adorable, being people who habitually break the rules and get away with it.

Kafka knew that the threat from the village gossip is pure projection. His The Castle carefully describes a once-popular but now ostracized family. Their disgrace would have been forgiven, they admit, but for the fact that they cannot forgive themselves. Kafka was perceptive; he ignored a significant fact, though: the perceptiveness of the family that he described, and its honesty with itself, make it most unusual. You should not forget this.

In Academe, (undeclared) ostracism is as much the rule as officially declared expulsion is in the free professions, and it is much cheaper to execute. (I will discuss this later.) It is hard to know what kind of people are ostracized and why, much less to teach you how to reasonably minimize the risk that you may be ostracized. Nor can I tell you how to exorcise the spell of ostracism. The myth says, almost all cranks and only cranks suffer ostracism, unless and until they prove that they have repented and relented and reformed. What generates ostracism is never public opinion, though the mystique would have you believe it is. You can easily avoid being the object of the ostracizers’ wrath by not starting at the top, thus not coming to their unkind attention until you are well established. Being established, incidentally, is also a matter shrouded in mystique; but in small doses it can be broken down to simple criteria: by the time you have a good position in an established school and a few publications in established periodicals, you are moderately established. Or even if a few leading people merely say that you are just great.

Remember this. No matter what you do, you are in the wrong just because you are a troublemaker and there is no need to prove that you are a troublemaker. If you concentrate on one thing, you are a narrow specialist: with all due respect, most of us may totally ignore you. Otherwise, you are too superficial, aiming to draw scholarly attention to yourself. Hence, if you must be popular, the right thing for you to do is to strike the golden mean. You may think the golden mean is between narrow specialism and thin spread; you are mistaken again: the golden-mean-kid is one we can be proud of, naturally; and so it follows with cosmic necessity that naturally the golden-mean-kid is not such a rebel and a nuisance and a pest as you are. So if you want the golden mean, then get civilized. Stop being a pest.

Get off our backs to begin with. Period.

Dear friend, if I had no hope that you would remain uncivilized for the rest of your life, and a pest to the establishment, I would not be writing all this junk for you. It has but one purpose: to keep your spirit from breaking too early in the game and to help you sharpen your weapons and have a little fight now and then with the fellows who are so much in the middle of the road that they block all traffic. So keep your spirits high. That is much more important than all sorts of remedies. In case you happen to be an ostracized colleague who wishes to know right now how to break the spell of ostracism, do proceed on the assumption that public opinion has no terror-striking tools and instruments of sanction and punishment. Why then did Franz Kafka live in constant fear that he might violate some rules? Why did Hebb’s graduate students?

Kafka was permeated with a sense of the importance of the book of rules, of the fear of violating it, and of the irredeemable guilt due to violation. Why did he think it important? What violation could make him feel so irredeemably guilty? When St. Augustine confesses his adolescent theft of a few measly pears, you feel he must be a saint to make such a fuss over so small a sin. When Darwin describes his own youth, saying with an incredible measure of detachment, I think I was a naughty boy, and explains why, he wins his reader’s heart for the rest of his autobiography, although some of his contemporaries were more amiable and easygoing than he was. Does Kafka resemble St. Augustine or Darwin? He had a very good sense of proportion. In his Amerika, the story begins with a hero who at sixteen had fathered an illegitimate child, but neither Kafka nor anyone else except the hero’s pompous, cowardly parents makes much of it. The hero, however, like all of Kafka’s heroes, suffers from an unknown and inexplicable sense of guilt, a sense of proximity to a most important thing and of failing to appreciate it. What had happened is simply that Kafka swallowed the mystique and suffered from it all the more because his strong sense of proportion forbade him from projecting it: his betters and elders could not explain matters to him and he knew that; yet he remained persuaded that somewhere an explanation must exist. He lost his faith, but he followed the Kabbalah: we must believe every day in the mysterious unexplained, so that we may be graced with the explanation (The Castle, 1926). Kafka’s greatest weakness is not merely his taking the Kabbalah seriously; it is his readiness to consider sublime the Kabbalist search in the book of rules for the maximum, for heaven-on-earth, in full awareness of its being impossible. This is not serious.

Consider your quest seriously but not too intensely. It may be specific: how to find God, how to contribute to human knowledge, how to be a decent member of a community, how to explain the anomaly of a semi-conductor. The starting-point is the same: there may be no answer; there may be some hint of an answer, a half-baked answer, or perhaps even a fully satisfactory answer. Look around; find some general guidance from well-oriented people, even if they are not distinguished; experiment with their ideas; keep seeking. I do not know how, and I can do nothing about it except to encourage you to be resourceful. Think nothing of the waste as long as it is reasonable and as long as you enjoy the game; and think nothing of the sanctions if these are reasonable: consider them low tuition fees.

How far does this go? Certainly not all the way. Students must know that if they drop out of courses and perhaps consequently graduate one or two semesters later than expected, life is not over, nor are their careers. True: certain losses are regrettably irreplaceable, certain problems are painfully insoluble, certain lives have been wasted irretrievably. Yet never forget: nothing ensures success, not even learning the book of rules—whether acting on it or deviating from it. Nonetheless, I do recommend a reasonable familiarity with the book of rules.

How do we learn the book? How do we find the sanction for a deviation before we try it out? This is easy for students, since the major rules of the university that apply to them are written and binding. Not so with faculty. Some rules are not stated, and the authorities are likely to deny their existence. Thus, most of the coveted jobs are secretly filled before the obligatory advertisements for them appear in the marketplace. This is illegal, yet it is regular practice. Worse, candidates who have received their doctorates after the age of 40 will not be serious candidates unless some external force imposes them (parachutes them, as the jargon goes). Rules that do not officially exist are necessarily rigid: it is impossible to fight ghosts.

Even the less arbitrary rules are problematic. The book of rules says, for instance, do not rig experimental results to make them confirm your hypothesis. As I have told you already, Hebb reports that this sometimes leads graduate students to nervous breakdowns. How do others escape it? The book is silent about it. Of course, we have to alter the book and commend an experiment regardless of its empirical results. Alas, we cannot achieve this right now: the cult of success needs a serious onslaught.

When an instructor in a physics laboratory once showed me how to rig results, I was utterly dumbfounded. He would honestly and sincerely deny that he ever did such a thing. He simply did not know that he was fitting a curve so expertly that he could even teach a student how to do so, because he honestly believed that rigging is very unscientific. It is: squeezing results in is pointless. My friend did commit a folly. He could have told me that probably something had gone wrong with the instrument and that he was too busy to investigate the matter. He did not. He was suffering from the mystique, and he was transmitting it to me.

The students who suffered nervous breakdowns where Hebb observed them were not well oriented. They swallowed the mystique in the book of rules and never asked, what did the other fellows, who were not much brighter, do to get their Ph.D. degrees? They often rigged results. Or they reported the refutations of their own hypotheses. Or they rigged their hypotheses. More likely, they rigged their doctorates and reported their results honestly in later publications.
 

1.8

John Chadwick reports in his breathtaking The Decipherment of Linear B (1958) that Michael Ventris and he had reported their discoveries in the learned press not in the dialectical style that would have fit the chronology of the discoveries more closely but in the inductive style. Some opponents had felt that something there was not quite kosher, and to this extent Chadwick confessed guilt. This is one instance of my view: there are two books of rules for research. The real book of rules is not inductive but dialectical. Induction is the professional mystique; you will be more at peace with yourself after you shed it. Therefore, my advice to you is, learn the rules of both books. The most important rule of the real book is this: learn to admit error honestly; doing so puts one on the right track. More I cannot say, as I have no recipe for all eventualities. All I can tell you from experience is that threats of penalties on this matter abound and that they are almost all bogus. The only serious disadvantage of doing the right thing is that it yields many more rejection slips. You must learn to receive them with equanimity. Other than that, doing the right thing is not always rewarded and doing the wrong thing is not always punished. I still advise you to do whatever you think is the right thing; remember: it may be erroneous, but as it brings you closer to yourself, I recommend it.

Scholars read and write. Both reading and writing demand training. I will discuss writing later. The art of reading is simpler. Choose a question. Find as many books as you can that may be relevant to that question. If you find too few of these, consider changing your question. Glance at every book for one minute or so. Stop. Ask yourself, do I want to glance into it more? Put the ones with an affirmative answer on one side. Return the others to the shelf. Glance at the remaining ones a few minutes more and repeat the exercise. When you have only a handful of books you want to read, do so while taking notes; write reviews of them. Publish them or put them aside for a rainy day. The most general, most often violated rule here is this: low-quality works are better ignored. Otherwise, add an explanation of the need to criticize them. (Mario Bunge’s criteria for this invite further discussion. I will ignore them here.[11]) And yes, book reviews should tell prospective readers what they want to know about the books and why they deserve attention. As a beginner, try to publish book reviews. For this, you need editors to commission you to write reviews of books that they send you copies of. Return a book not worthy of attention. Ask editors for detailed instructions. The more detailed the reply, the easier it is to comply.
 

2. How to Avoid Academic Chores Without Being Inefficient

The mystique around the advocacy of hard work is so successful that a number of attempts to cut it down to size have ended in utter failure. We cannot do better right now. I hope you are open-minded enough to reconsider and informed enough to know that the current cult of hard work, though it has biblical roots (Genesis 3:19: “In the sweat of thy face shalt thou eat bread”), is modern.
 

2.1

Dostoevsky’s Idiot is the story of an epileptic: he joins a party in an opulent palace full of art treasures, especially an invaluable Chinese vase; the host warns him not to break it. The reader knows that the hero is going to have an epileptic fit, fall on the vase, and break it. The author quickens the pace in a tremendous accelerando. The Chinese vase broken, the author starts a new chapter in a kind of moderato: he appears there as a journalist who reports having passed through a village, where he heard gossip about some dramatic goings-on at the party that took place in the nearby palace on the previous night. It hits you that the report is about the events you have just read in the previous pages. Whatever you think of Dostoevsky as an artist, surely his control of his craft was admirable. His stories move on two levels—the commonsense and the metaphysical—and the abyss between them is vast. On the metaphysical level, ignorance reigns supreme and problems and difficulties abound; Dostoevsky’s hero is possibly Christ incarnate. On the commonsense level, the story is dull, and the author stresses this superbly. This became standard technique. Ask a Kafka if he knows anybody; he would look at you disconcerted; he would wonder how on earth anyone can ever accomplish such a task as acquiring any knowledge of, even the faintest familiarity with, anyone whatsoever. He would frankly tell you that some mornings he wakes up with the definite conviction that he barely knows who he might be (Metamorphosis). On the commonsense level, every normal person knows well friends and relations and work-mates. To avoid metaphysical pitfalls, we create conventions of acquaintanceship and familiarity; standards that are very superficial, so that they work as best we know; we reform them whenever we have to and have an idea how to.[12] Look at letters of recommendation and their like: unless they follow a formula, they seldom signify. The formula is simple and intentionally superficial: I have known the candidate for so-and-so many years; I met him in my capacity as this and did with him this and that; we went on a tour to this and that place. It is practically useless to ask a good friend who does not know you well by these standards for a letter of recommendation. Often, a lukewarm recommendation from a person who knows you better by common standards will help you more.

Writing letters of recommendation will soon become routine for you—and unless you handle it on the commonsense level, it will absorb a tremendous amount of your time and energy; for many of my colleagues it is a really heavy chore. Newly acquired good friends may ask you to recommend them, and you may feel terribly embarrassed because you will not know what to write in case you like them but do not know them well enough by the accepted standards. This embarrassment may lie heavily on you for weeks; it may deepen and cause endless chains of complications. It is avoidable very simply—merely by telling such friends that one has to state in such letters how well and how long one has known the candidate and in what capacity—and letting them decide whether they still want you to write that letter or not: one way or another, this prevents embarrassment.

You do not like all this. Nor do I. Acquaintanceship comes in different standards for different purposes; some of these might suit your temperament better than others; on the commonsense level or on the metaphysical—not on both.

Dostoevsky did for acquaintanceship what Kafka did for the book of rules. The inability to know people metaphysically upset him no end; he thirsted for real human contact. He thought that the only real salvation can come from real human contact; love, hate, respect, contempt—they were all the same to him as long as they were real and tightly close; and so his heroes get exhausted and frustrated and desperate from failed attempts to reach contact—often to the accompaniment of epileptic fits. Do not believe him![13] On the contrary: had we accomplished life’s end, then we, or at least our children, would die of boredom. It is no accident that his Idiot fails to bring human interaction and salvation, and it is psychologically correct of him to cause his hero a breakdown at the end of the story. Better be worldly wise about human relations and follow no metaphysical recipe. Perhaps this is fortunate, considering that humans, and even other simians, must have tasks to perform or else they die of boredom.
 

2.2

Corollary: there is no general recipe for human contacts. Since I fear that this is what you may be after—quick instructions, in steps: step one, step two, and so forth—I have lost my tempo (if I ever had it). I can only hope that somehow you get a recipe to suit your own taste and circumstances, and that you enjoy the labor of concocting your own recipe as much as applying it. This is essential for the ability to sustain the happy human contacts that are essential for doing your academic chores with a good sense of proportion. I hope this volume amuses you. I hope it teaches you to avoid pitfalls, such as the confusion of the commonsense with the metaphysical, which may make great literature but not a very productive life in any other respect. I can report some experiences and perhaps offer some hints. When I was a miserable student, I would have welcomed such a book as this one, even though its tempo is not half as good as that of Dostoevsky. When I was a young academic, I could help my department head because he took his academic chores very seriously and they put a great burden on him; so much so that I had to run things alone, unauthorized. Still, if you do not like this volume, do throw it at the wall. Go ahead, by all means.
 

2.3

You are lazy. You are lazy because you naturally hate work and prefer to do nothing at all (la dolce far niente: the sweetness of doing nothing) if not to play silly games—which is the same—and even harmful ones—which is worse. You are selfish, and so if we let you do what you want you will do nothing, and then the Devil will tempt you to do evil—as a penalty for your idleness. Therefore, it is our job, the job of your friends and colleagues, to goad you to work. If you do not suffer, you are useless dead wood, socially and individually.

The worst thing about the rubbish that the previous paragraph displays is that it undermines the distinction between the useful and the useless, thereby destroying all sense of proportion. This distinction is the heart of the present section: learn to ask of any task, is it necessary? If yes, is there a better—easier—way to perform it? Most chores are unnecessary; you hardly ever need bother about them (“Consider the lilies of the field, how they grow; they toil not, neither do they spin”; Matthew 6:28). I suppose that you have loads and loads of chores. You can avoid most of these chores yet be a useful member of a department and a good friend to your colleagues (speaking on the commonsense level). My advice to you is, leave everything for a while and go out; go to the theatre, to the concert hall, or to the bowling alley. Go! You will return to your desk with your sense of proportion retrieved.

If I show you that you hardly have to do any chore, and if we take for granted that you accept—voluntarily or grudgingly, I do not care—certain chores and even execute some, then this amounts to your having swallowed the rubbish in the silly paragraph above. Yes, go back and reread it. I shall wait for you right here; and damn the tempo of these pages.

Well, then. I have discussed this point with many people. Some think I am crazy. Granted. Others think I am exaggerating, though I have a good point to make. Someone must wash the dishes, they say, or clean the latrines, or correct the compositions of freshmen. What is a chore, I ask them. And when I show them that by a reasonable definition chores are much less common than they seem, they grant me a victory, but only a verbal one. They know how much I hate verbal victories, but they hate being told not to do their chores too—so we are quits.

Even the greatest enemies of communism, if they know anything about it, admire the willingness and readiness and self-sacrifice-with-a-smile with which communists will execute any chore imposed on them. Now the truth is that some communists enjoy chores, and they soon find themselves in the position known in their party as “old horses”.[14] Whether a chore done lovingly yet as a chore is still a chore is as tough a problem as the famous one: is a masochistic pain really pain? Ask students of the philosophy of Jean-Paul Sartre—they enjoy such questions, so let them handle this one. Ignoring the masochists, you will find that communists graduate fast from one chore to another—going through the whole menu and ending up either as party functionaries—doing nothing, bored, and in search of some intrigue—or as ex-communists: it is the boredom that makes them start their criticism and move away from communism, and then further away (unless McCarthyism sends them back to the arms of the Party).

What characterizes the young Communist who does things so willingly is faith, hope, optimism. Old Communists are different: they have been around long enough to know that salvation is not around the corner. Yet they do their chores regardless—not from hope. They seldom know why. Perhaps in order to avoid thinking about politics. All too often chores turn out to be redundant and even harmful—such as the Communist’s chores, such as most of the five-finger exercises and drawing courses and bookish classes and braided lace and high tea and pomp and circumstance.

Boredom, like any other disease, does not distinguish between an old Communist activist and a house cleaner: it hurts them all, since it is painful. Leibniz made this observation in the seventeenth century. They say boredom is the outcome of laziness and laziness is an induced neurosis: an inadequate mode of protest that turns into a mode of escape from excitement. For many people washing dishes is not a chore; it is a chore for the traditional homemaker, for whom it symbolizes her unjust frustration. Yet Agatha Christie considered it a form of recreation. After a dull party, it may very well be a chore. When a party is zestful, however, there is always the helping hand of a charming guest who wants a chance for a private conversation with the hostess.

Only boredom characterizes a chore. What characterizes boredom is less its being routine and more its futility. What is additionally painful about chores is frustration or oppression. Whenever possible, we should leave a routine job to a robot to perform, or find a person who likes it, or one not yet very well drilled in it. College kids who want to earn pin money may be less efficient in many tasks that they perform in their spare time, but they perform them with grace and so with no pain. The idea that one must do chores to perfection is one of the deepest causes of their pain. This is particularly true when a hard-working, snowed-under person is in charge of the chores. Such people are snowed under because they do everything and delegate nothing—and they then have to create jobs for underlings, jobs that are thus essentially futile: the real chores. The snowed-under will not give you a proper job until they are convinced that you will perform it to perfection.

Gossipy Plutarch says in his Life of Pericles that Pericles was such a snob that he would perform no civic task whatsoever unless he was fully convinced that no one else in Athens could do it at all—not even badly. It was Pericles, then, who discovered the principle of delegation of power. Many people think that they know the principle and that they believe in it. They all say: do you not think I would be happy to delegate all these chores if I could only find a really (watch the metaphysics!) trustworthy and competent fellow who could do them well enough? However, you cannot really trust anybody these days: if you want a job done really well, you have to do it yourself. I can instruct and supervise the execution of the chore, but to do this really well is much more of a chore than doing it myself. There is no better evidence that such a speaker is ignorant of the principle of delegation of power. For the principle says: to delegate, you must give up perfection. This is not so obvious: Plutarch reported it unawares.

People who perform their chores because they want them done well have no time to do other things that may be much more important. They refuse to ask whether it is not more useful to let others do the chores not so well, freeing themselves to do the other things—also not so well. Doing things well, incidentally, can best be achieved by stagnation; if you want progress, you must allow people to do new things, and they do not do new things as well as the things they already do well. Moreover, one should neither instruct nor supervise delegated jobs; keeping some superficial check should suffice. If the situation is very bad, then it may need the brief intervention of an instructing hand. Correct then only the worst errors. A sense of proportion beats any effort to correct all errors, any day.

What you should know now is that bosses who undertake all the chores do not solve any problem for you; Parkinson’s Law operates here: pedantic bosses create sillier jobs for you. Refuse to perform them! To begin with, if worse comes to worst, you can confess incompetence. This will send your pedantic boss into long deliberations that may yield new suggestions. Meanwhile you can make a more reasonable suggestion. If it is reasonable, the boss will take more time to contemplate it. Meanwhile you can act, and do what you can to provide reasonable help—particularly for learning.

When you are a new member of a department, you will have a reasonable teaching load (6 to 9 hours, one week out of two). Your peers will expect you to do some research, about which no one will have any discussion with you. (They may ask you for copies of your publications or invite you to talk to the local colloquium.) They will shower on you hundreds of other small tasks—they will specify them day to day: registering students, advising them, grading their papers; membership on all sorts of departmental and faculty committees or sub-committees; organizing all sorts of occasional and standing activities, from a publication and a library to a party and a students’ club. The general rule is: do not jump at a chore; do it if you like it—either because you see it yielding fruit or because it affords you contact with charming people; it does not matter why. If you enjoy doing it for a while, do it only for the length of time that you enjoy it.

Someone is sure to admonish you: we all must do our share. Especially in modern Academe, where life is largely that of a whole community within the ivory tower. In some cases even the second-hand car dealers on lots adjacent to the college and the housing agents to whom the college sends newcomers are eternal students and faculty spouses; they raise the number of chores on campus. Be friendly; always ask about the use of the chore at hand; always show willingness to do some other job for which you may be better fitted: and do try the job that may challenge you.

Grading papers is tough because graders do not know how to grade: they have no criteria or, worse, they employ criteria that they disapprove of. If they are perfectionists, then they demand the impossible. It is important to know: there are no criteria. There can be none, since the task is senseless. Proof: exams with obvious functions are unproblematic. So grade a paper or an exam as well and as honestly as you can, with no waste of time: begin on the tentative assumption that you should grant it an average grade (a B or a C, depending on local conditions). Try to refute your hypothesis. If you read a sentence that catches your eye, change the grade as you see fit. If the paper has reached the highest grade, just skim through the rest of it to try to refute the latest grade. If you fail, then the highest grade should be the final one. You may think that this is it. Not so. It may turn out that you grant too many high grades—too many by the rule of averages. Hence, your class is above average. If you follow my advice, your classes will often be above average. Trouble: the administration assumes that all classes are average. Their assumption is demonstrably absurd, but they are adamant. You will have to quarrel with them. Do. If your class excels, it deserves recognition. Alternatively, the class may be below average. Fight for that too, although this time, more likely, it is your department chair that you will have to fight. Do.
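If it helps to see this procedure compressed, here is a playful sketch in Python—my own illustration, not part of any grading regulation; the grade scale and the eye-catching test (judge) are hypothetical stand-ins for your own judgment:

    # A sketch of grading as conjecture and refutation: start from the
    # tentative average grade and let the paper's evidence revise it.
    GRADES = ["F", "D", "C", "B", "A"]  # hypothetical local scale

    def grade(paper, judge, start="B"):
        """`paper` is a list of sentences; `judge(sentence)` returns
        +1, -1, or 0 -- a stand-in for a sentence catching your eye
        for better or for worse; `start` is the tentative average."""
        g = GRADES.index(start)                  # hypothesis: average
        for sentence in paper:
            g += judge(sentence)                 # revise on evidence
            g = max(0, min(g, len(GRADES) - 1))  # stay on the scale
        return GRADES[g]                         # the least-refuted grade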

Most papers and exams that you grade you will naturally find not much above or below average. This is why administrations go for the average. Here is a thought experiment that I do not recommend: grade your class randomly, with the exception of not failing those who do not deserve to fail. Most likely, no one will complain. Why? Because most grades are given with no good reason, so that they are almost random. To repeat, do not try this: it is unjust. Oh, yes, injustice is rampant; yet that is no reason for you to be sloppy.

Here is the most useful information about grades: after graduation, everyone forgets them for good, except the traumatized. Hence, the most important thing to know about poor grades is that you should take great care to prevent them from causing harm. The harm can be due to the rule that disqualifies those who fail from taking the next step in their career. It can be due to the violation of a student’s self-respect.

I am nearly finished—this section is not going to be long after all. There are chores that nobody is required to attend to and that you should not attend to either, except as a young academic. There is a special type of student on whom you may keep an informal eye and with whom you may try to keep friendly, regular contact. There is also the excellent colleague who suffers terribly from a writing block and who would be most grateful for your proposal to write a joint paper. Someone else will be happy to show you a manuscript and expect honest critical comments on it. There is the supervisor who is desperate about one graduate student whom you may unofficially instruct and launch in one semester, to everyone’s immense relief. There is no point in continuing the examples of such possible chores. You will have to be a pal—to be alert and find the cases peculiar to your specific environment and personal taste, and you will have to discharge them tactfully. Such tasks may frustrate, but when successful they are pleasant and useful and cost surprisingly little. If you do such things—if, following Pericles, you are available, however seldom, when no one else is—then you will not be negligent, nor will your peers consider you negligent. Those whom you help will see to that.

This should suffice for you in your early stages; in later stages, rather than do it yourself, you can also instruct young colleagues in the art of doing it. You will find out soon enough that the more you shun chores, the more disposed your colleagues will be to offer you significant and useful and interesting and challenging and delightful tasks. Serious people from Plato to Kipling say that you must learn to accept orders—and wash dishes—before you can become a responsible leader. This is nasty and only partly true. Besides, you should become a responsible citizen, not a leader. You can perform your tasks as long as you are learning and hence enjoying it. The moment it becomes a chore and a bore, drop it. Someone else will do it, or else it does not matter that no one will.

Is this always so? Of necessity? I hesitate. I am trying to stay on the commonsense level. Metaphysically, arguments go either way: possibly life will be happier with no dishwashing; possibly the contrary is the case and doing chores builds excellent characters. Metaphysics aside, the question is different. For a million dollars, many people will wash dishes religiously, ecstatically, enthusiastically, lovingly, and throw in a sonnet to boot. Just look at those college kids who are paid well in resorts for dishwashing; pearl diving is the technical term for it—appropriately so.

Corollary: giving you a chore is making you a dupe. It appeals to the mystique and to a fuzzy sense of duty. Better make any task sufficiently worth your while: enjoyable and useful.

Last word: a word of caution. Some offer a counter-mystique to help avoid standard chores and their mystique. There is no need for that, as one can always say that one works too hard. This wins high appreciation, becomes a trademark, and provides a little local color. It may succeed in the short term—at the cost of acquiescence in the mystique; thus one joins the crowd (Nietzsche). Do not join them; do not follow in their footsteps; nip all mystique in the bud.
 

3. How to Avoid Memory Work Without Being Ignorant

Metaphysically, we are all ignorant. By common standards, some of us are less ignorant than others, and all of us are likewise less ignorant in one field than in another. Metaphysics and commonsense thus differ. The confusion between them made many a commentator on Plato’s early dialogues call Socrates a liar, although he was the epitome of intellectual honesty, just because he confessed total ignorance. The common standard of knowledge is having honestly won it and possessing official certificates that say so. You earn them by receiving good grades. Here then is my advice to you: do not seek a certificate unless you must. And, indeed, since World War II an academic career requires a doctorate. Also, do not aim for excellence, only for passing the exams required for that certificate. And study the art of passing exams. And, most importantly on this topic, if you study anything that interests you because it interests you, then you will pass the exams on it with no preparation (as initially intended when exams were first instituted). Do not memorize except as the very last resort.

Descartes was the first to consider individual memory purely physical, namely, mechanical: memory = a physical record. He concluded that from his view of animals as machines and of whatever humans share with animals as animal. In the computer age, it is hard to imagine the boldness of this idea and its immense consequences: speaking of the information stored in computers as their memory is Cartesian. Hence, we need not store memory in the brain: libraries and computers and ancient monuments store information, and as such they are extensions of our brains, at least in one very distinct, specific, and functional way (the way both the sword and the pen are extensions of the hand). Consequently, we need not memorize the contents of the books we possess. We need the ability to find any item of information we seek—an inscription in an archeological site, a book in a library, an item in a book: we need to retrieve the items of information that we want to use. The excuse for memorization is that the information in a book is not as easily available as our remembering of it. Now the difference between the two is not of storage but of recall: hence, what we try to overcome by memorizing is not retention but recall—the use not of our stored information but of our retrieval mechanism. How do we store information in the brain and how do we recall it? Why, in particular, do my students prefer recall from the brain to recall from a library? Why did I fail so regularly to help them alter this preference and acquire the art of information retrieval instead?[15]
 

3.1

Memory = storage plus the ability to recall it. The uselessness of a book misplaced in a library shows that memory is not mere presence in store. Tradition identified memory with storage alone. Now that we have so much interesting discussion of recall, we may recall that as children we found recall puzzling. Upon perceiving a face, a child will exclaim the name and then find the memory of it puzzling. We often find it difficult to recall some information that we have in store. Why? Plato said, we always possess all knowledge; learning any item is recognizing it. Why, he did not know. Freud found recall natural. He said, it fails when information that we possess is painful: we repress it to avoid the pain. This explains only a small portion of our poor retrieval abilities. Some modern brain researchers report that we remember better—are better able to retrieve—what we experience while pumped with adrenaline (for any reason). Our retrieval mechanism is more complex than that.

The prevalent theory of recall is amazingly primitive. It is associationism, common to Aristotle, Locke, Hume, Pavlov, Freud, and Skinner. Two items of information have entered storage together; repetition reinforces the association (Locke, Hume, Pavlov) unless the process is disturbed (Freud, Pavlov). This leaves much about memory unexplained. Yet most commentators take the theory for granted. Oswald Külpe refuted it around 1900. To no avail.

When in your childhood your teachers forced you to read a poem repeatedly, your boredom and protest notwithstanding, they were acting on the primitive theory of memory as storage. So did your university professors when they examined you to ensure that you read your lecture notes at home, allegedly to make durable your memory—storage—of the information that they had imparted to you. Einstein said exams made him lose the taste for science for a year or two. He said he disliked memorizing what he could look up. Your professors ignore him. You should not. Remember: grades are external incentives, and these should but do not facilitate the carving of the information onto the hard rock that your brain supposedly is. That you could find the information useful or interesting was considered an additional incentive—an internal incentive—but usually one that may fail to work because you were too immature. At least by now, internal incentives are the best means you have of facilitating your memory.

The best aid to the memory of items is the understanding of their import. We do not remember telephone numbers precisely because they are arbitrary. We remember dates of important historical events if we have an idea of their flow, if we see them in historical context. To remember and to understand are two completely different mental qualities; they interact strongly. This distinction is on display in any recital of a geometrical proof from memory, with no knowledge of geometry, by an individual with total recall. A Chinese colleague of mine who had a traditional Chinese education—which is now extinct—had a science degree from a famous western university; he knew an astounding number of textbooks and monographs in his field by heart, cover to cover. He could not converse about it, and he finished his academic career (as an assistant professor) without having performed any research: all his vast and detailed knowledge led to little understanding of his chosen subject, although he was an eminently reasonable and delightful conversationalist on politics and on the diverse arts.

How much of the understanding of a subject, or of its mode of reasoning, is a matter of memory? Evidently, some understanding depends on memory: we can easily find people who once understood an argument or a proof but have now forgotten it. Strangely, most teachers, educationists, and learning theorists overlook this fact—the ability to forget an argument—even though it accords with the received theory of learning. The only person who has discussed this fact is Sir Karl Popper, and even he did so only in his lectures, not in any published work of his that I know of. The fact is nevertheless of great interest and of practical value—for the following reason.

We have an intuitive view that contrasts memory and understanding a bit too sharply, with the result that high-school students get confused and desperate about their mathematics and other fields of close reasoning. We all feel that when you understand some point you need not memorize it, since you can reconstruct it from your understanding: if you need the product of a specific multiplication but do not understand what multiplication in general is, you can hardly use a multiplication table to find it out. If you know that multiplication is a reiterated sum and you know your addition, then you can add and find the product that you need. Here understanding renders memory superfluous. J. J. Gibson and his disciples have shown empirically that most items we think we remember we reconstruct from the rudiments of the item that we do remember.
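A toy illustration of such reconstruction, in Python (my own sketch, not the author’s): someone who has forgotten a multiplication-table entry, but remembers that multiplication is a reiterated sum, can still recover any product.

    def product(a, b):
        """Compute a * b for a non-negative integer b, using addition only.
        This mimics reconstructing a forgotten table entry from the
        understanding that multiplication is a reiterated sum."""
        total = 0
        for _ in range(b):
            total += a  # add a, b times over
        return total

    assert product(7, 8) == 56  # no table entry memorized, yet recovered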

Two errors play an important role here, one minor and one major; and having corrected the minor error, we often assume that we have put everything in order, overlooking the major error. This causes most of the subsequent memory troubles—especially for students.

The minor error is simple. We have said that if you understand what a product of two numbers is, you need not remember what the product of two given numbers is, since you can calculate it. We should have said: when you remember your understanding of what a product is. For you can forget not only the product of two given numbers; you can also forget what the product of two numbers is, namely, how to multiply. The name for the method of calculating a product is an algorism or an algorithm. We may learn an algorism and forget it. We may learn by heart a multiplication table and we may learn by heart an algorism—and we may forget either. With this correction, everything said thus far falls so beautifully into place that one need not wonder how the traditional theory of learning was so successful in captivating so many great thinkers for so long. It is a simple theory—I shall call it the drill theory. It reduces both learning and training to one act, the act of reinforcing associations by repetition. All one has to add is the corollary of levels of understanding or drill and of the transfer of the results of learning. Let us go into that now.

The Homeric problem is: who was Homer? It is hard to know, since the early versions of the Homeric texts are preliterate: they were memorized. Milman Parry has thrown an exciting new light on the Homeric problem by locating in Homeric texts traces of mnemonic techniques that he reconstructed. These are kinds of algorisms, more or less; illiterate bards had to remember exceedingly long poems, so they memorized not only poems but also which lines, even sections of poetry, they could reiterate on given kinds of occasions: they even knew what standard lines they could insert to give themselves time to refresh a failing memory. This is very similar to any other higher-level memorizing or training that may be useful, such as learning to drive not just one car, or to play not just one musical instrument. Having learned to drive one kind of car, you learn quickly how to drive another kind; having learned to play one kind of instrument, you learn quickly how to play another kind; this is so partly because the two have much in common. It is possible to learn some general features common to all cars and their varied implementations in different cars. Driving the second car is easier than driving the first, due to a transfer of knowledge; garage-hands go one step further, as they apply a general algorism of sorts. In both cases, drill is the source of the dexterity, and the novelty of the application is not quite a novelty—what is novel in a partly new item, one has to acquire by further drill. What applies to mathematics, poetry, playing a musical instrument, and driving a car applies generally.

The drill theory of learning is very attractive, and for various reasons. It goes very well with the inductivist theory of learning, the theory of learning as inductive; this is the received theory of scientific research. Locke worded it thus: scientific theories are generalizations that rest on experience. Memorize information, then, drill yourself in certain associations, and then you can generalize these associations into scientific theories. This is Locke’s theory of science. (The problem of induction as Hume posed it is the question: is it possible to validate this theory? He proved the negative answer to it. For those who reject Locke’s theory of learning, the problem appears by reference to the newer theory of learning that they advocate—except for those, like Einstein and Popper, who deny that such a theory is possible.) Similarly, the drill theory goes very well with Locke’s theory of perception; I will ignore it here, as it is utterly passé and as it would take too big a digression. Finally, a factor that makes the drill theory popular today: it is operational. The operational theory of learning, operationalism, is the (false) theory that identifies knowledge with familiarity with some rules, and that familiarity with the ability to act in accord with them. Operationalism thus reduces knowledge to skill—to a variety of verbal skills: the ability to calculate or recite or answer a given question or anything else. Any item will do, as long as the claim to have it is easily testable.[16] Before coming to the major error of the drill theory, let us see how it raised difficulties, and how any act of recall became a subject of interest, even though the drill theory includes the tacit premise that only storage may be problematic, not recall.

Freud revolutionized the theory of memory. This renders his work comparable in historical significance to those of Descartes and of Locke. I find the gap between Locke and Freud hard to comprehend. Locke’s theory was thoroughly inductivist. Inductivism was subject to the severe criticism of Hume and of Kant. The theory of knowledge that Kant developed to overcome Hume’s criticism influenced generations of psychologists, from Johannes Müller and Helmholtz to Külpe and his followers, Bühler, Kohler, Koffka, and Popper.[17] Kant had said, first, that theories are modes of observing the world, pigeonholes or rubrics, and second, that they are inborn. At least the Gestalt psychologists endorsed his opinion that modes of observation are inborn, though they concerned themselves with modes more primitive than those that Kant spoke of: whereas he referred to Euclidean geometry and Newtonian dynamics, they referred to everyday experiences like observing-a-person-sitting-in-a-chair: they declared it one indivisible unit rather than a (Locke-style) combination of sense data. Kohler expressed strong hesitance on the question of how far the results of his study of the mentality of apes—especially his observation of how apes acquire new information—are applicable to the mentality of scientific investigators. (He was an inductivist, and his view of induction he had inherited from Locke. This inconsistency was—and still is—a ceaseless source of confusion.) The new psychologists could not answer the simplest questions concerning learning and memory; their results were not useful for grade-school teaching. There was only one, poorly planned, effort to use Gestalt psychology to facilitate learning to read—by teaching reading prior to teaching the alphabet. The results of this experiment were officially declared unsatisfactory. What exactly happened there is still not clear, at least not to the experts whose published works I have consulted.

Kant and the Gestalt psychologists failed to answer a simple question concerning memory proper: why do our inbuilt modes or theories have to await discovery? How does discovery happen? How is it at all possible? Popper asked these questions in the context of Kant’s theory of knowledge, although William Whewell had already tried to answer them a century earlier. This is strange, since already Plato (or perhaps Socrates) advocated the idea that all knowledge is inbuilt, and he already asked this question in general: how then is learning at all possible? His answer was—you will be surprised—that learning is the process of overcoming confusion, since confusion is the stumbling block for recall! The world had to wait for Freud before these obstacles to recall re-entered the study of memory.

Freud was never interested in memory as such, whether storage or recall; even his most detailed studies of memory are devoted to something totally different, namely the techniques of facilitating recall by overcoming one obstacle to it; his concern was the nature of that particular obstacle. He took it for granted that recall is by association (following Locke); he developed his theory of the suppression and repression of memory (repression is the single act of transferring a given item of information from the front-conscious to the sub-conscious, and suppression is the repeated act of preventing it from coming up each time it tends to emerge)—as means of avoiding the unpleasantness associated with it. (Pierre Janet and) Freud had the idea that free association may elicit repressed items. He added that the semblance of accidents (in dreams or in slips of the tongue and of the pen) might also release a suppressed association. This is, indeed, his most popular idea.

In this discussion, I have skipped mention of the particular item that Freud said we suppress, since it is irrelevant. Just in case you wonder what it is, let me mention it: we try to repress one item, he said: our having wished to sleep with our mothers.[18]

The most striking premise implicit in all of Freud’s writings on memory, with no exception, is that memory is perfect: healthy brains store all of their input with no loss; all the troubles that people have with remembering concern retrieval, he suggested; hence, in principle it is possible to recall absolutely anything we have experienced, and in practice this depends on free associations. This, to repeat, is only tacit in Freud. It is of the utmost significance, and we owe it to Freud: it diverted attention, at long last, from problems of retention to problems of recall. It is a problem, Freud added, not only for the neurotic, but for us all: it is the Psychopathology of Everyday Life (1901). This and computer science led to a revolution in library science: “The organization of knowledge for efficient retrieval of relevant information is also a major research goal of library science”, as Google words it. We should thank Freud, together with the computer engineers who followed him. (Information retrieval is a sub-discipline of computer engineering.) In libraries, recall was always a problem, as librarians knew that a misplaced book is lost. This is why librarians request readers not to return books to the shelves and why librarians regularly look for misplaced books. A library, said librarian-poet Jorge Luis Borges, is the best place to lose a book. This is how even a manuscript by Bach or by Mozart could turn up not so long ago in a very famous library, and a book by Archimedes or by Menander.[19] Nevertheless, until recently librarians did not discuss the problems of recall. (Their early discussions related to the recall of books on loan, not on shelves.) They still center on retention, and the new modes of retention, such as microfilms and digital electronic memory cells, create new problems of recall (e.g., how do you skim through a microfilm?) that have seldom been studied. At least the ice is broken.

The potential of Freud’s theory has not yet been tapped. He meant to apply it to the memories of events significant for the individual in question. Does it apply to the memorization of poetry or of telephone numbers? Does it explain the use of the drill method for memorizing a poem? Why are most people able to recognize a poem upon hearing it a second time but not to recite it after a first reading? Do people with total recall avoid repression? It is not clear whether Freud’s theory applies to all this. If yes, then it opens new vistas: the possibility of endowing everybody with total recall. This is too exciting a possibility to overlook, however slim it may be, and despite the observation that total recall may be a curse rather than a blessing. This depends on the answer to the question: how does drilling help memorize? How does the ancient idea of mnemonic tricks work? If recall is the problem, then it may be easier to recall information from a book, and so open-book exams should replace all school memorizing! Those who understand a book and know how to use it need not memorize it! The standard answer—inasmuch as there is an answer at all—is that memory facilitates both the use of the book and the understanding of it.[20] How? What is understanding and how does it differ from memory?

It is time to discuss the major error of the received theory. The minor error, you remember, was that we failed to notice that we can memorize algorisms and equally that we can forget them. The drill theory identifies understanding with the ability to apply algorisms—logical, mathematical, and any other. Is it correct to identify understanding with the ability to apply an algorism? A mathematical proof is logical—there must be no mystery about it. Hence, it allegedly uses an algorism; so says the theory implicit in the drill theory. It is false. The drill theory makes learning as obvious a procedure as possible; allegedly this makes it rational; hence the popular folly of identifying the operations of computers with rationality. Hence, too, the justice in the strong plea of the once very famous existentialist philosopher William Barrett (Irrational Man, 1958) to flee from the Laputa that science permeates. He does not equate rationality with algorism on his own; he says, if the rationality of science is as algoristic as everybody claims it is, then let us search for a more human kind of rationality.[21]

Is Barrett’s view of the general opinion correct? Do people identify understanding with the ability to calculate? Do experts? You always face this kind of question when you criticize a popular doctrine. You may quote the highest authority, Professor Jerome S. Bruner Himself, to say it is popular among his peers, among educationists of all sorts, and people will say, this proves nothing, at least not that the doctrine you are attacking is popular. Therefore, the claim that a doctrine is popular is always hypothetical, even when limited to experts. You do not like it. Forget it, then. Let us replace it with a much more interesting one: the doctrine at hand (understanding = the ability to calculate, to manipulate an algorism) is popular in high schools. This is Popper’s hypothesis.

Practically all those high-school kids who develop mental blocks against mathematics do so because they look very hard for the algorisms that are not there. Here is an empirical fact: armed with this observation of Popper’s you can remove—with patience and with a gentle touch, but with no difficulty—any such mental block if it has not aged and caused so much havoc that the student is totally disoriented.

Before going further, let me mention a useful moral from what I have told you thus far. If you start a lecture by asserting its thesis in one sentence and explaining its importance,[22] if you repeat that sentence—verbatim—somewhere in the middle, and if you end the lecture with that sentence, you improve its chances of being remembered. If you prepare a lecture course properly, you begin by writing an abstract of it. Make the abstract comprise thirteen sentences (for a one-semester course), with the first being an introduction and the last a conclusion. Your students will remember the abstract. Do not make it original: make it signify to them. Some original thinkers are so frustrated by their peers’ persistent oversight of their important innovations that they make these the center of their teaching activity. This is obviously unfair to their students, as they then obtain a lopsided image of the whole field to which their professor’s research belongs.

Back to my story about the mental block that we impart to our students by teaching them logic or math unexplained. Rational argument is deductive. Any given deductive process is largely arbitrary and depends on intuition: the deductive character of an argument seldom rests on an algorism; it rests on its obedience to the rules of logic, which are not obligations but permissions.[23] Often a proof-procedure starts with an assumption; indirect proofs always start with assumptions, and even with false ones.[24] This perplexes students, since the assumption is unfounded and we are often told to avoid unfounded assumptions. This blocks the ability of readers to reproduce proofs; understanding, you remember, is on the drill theory the remembering not of the proof but of the algorism that reproduces it. So poor students are caught in a net quite unawares; they struggle against something most elusive. We teach them by the drill theory on the silly assumption that we drill into them truths that we prove. Students conclude from this that teachers have an algorism by which they arrive at their assumptions; students also assume that teachers do not bother to explain the algorism in use because it is too obvious, that hence students must find it, and that anyone who does not is behind.[25] Eventually, they volunteer to become blocked, since it is the only thing they can do to save the logic of the situation—like the accused African native who admits the accusation that he had become the lion that killed his neighbor, so as to be punished accordingly.[26]

Were proofs dependent on algorisms, we would owe no gratitude to our great predecessors for having conceived them. Euclid used no algorism to prove that there are infinitely many prime numbers. We can see that every step in his proof is logical, and we can see that it arrives at the intended theorem. Yet each step in the proof has infinitely many alternatives to choose from, and almost all of them fail to lead to the desired target. By luck and ingenuity,[27] as well as by who knows how many unsuccessful or partially successful previous trials, one arrives at a proof. To expect a student to reinvent Euclid’s proof is monstrous, yet confused students often think this is what they should be able to do. Some confused teachers, we remember, have invented the discovery method that demands exactly that of them. Alas! No matter how well you once understood Euclid’s proof, if you forgot it clean you are unlikely to rediscover it.
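For the record, here is a minimal sketch of Euclid’s argument in modern notation (the wording is mine; Euclid’s own is geometric). Notice that the crucial step—the construction of N—is dictated by no algorism; it is the stroke of ingenuity.

    \textbf{Theorem (Euclid).} There are infinitely many primes.
    \textbf{Proof.} Suppose, for contradiction, that $p_1, p_2, \dots, p_n$
    are all the primes. Put
    \[ N = p_1 p_2 \cdots p_n + 1. \]
    Dividing $N$ by any $p_i$ leaves remainder $1$, so no $p_i$ divides $N$.
    But $N > 1$, so some prime $q$ divides $N$ (possibly $q = N$ itself).
    Then $q \notin \{p_1, \dots, p_n\}$, contradicting the assumption that
    the list was complete. $\blacksquare$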

The error is so deep-seated that even high-level mathematics students suffer from it, especially when taught differential equations. There is no theory of differential equations to speak of, and no algorism in any way akin to the algorism by which we solve algebraic equations of the first or second degree in high school. Some groups of differential equations have been solved, and others can be approximated by various algorisms, but one never knows in advance which algorism, if any, helps to come close to solving a given differential equation: one can only try the various techniques and hope for the best, or pray for inspiration, or give up after long, futile efforts. This information is almost never available to students who enter differential-equations courses, although it is implicit in almost any textbook. Differential equations are often more useful than interesting—especially in physics. Hence, students in courses on differential equations are more likely to be physics students than students of mathematics. Oddly, they suffer less from the said confusion.
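To make the contrast concrete, here is a worked example of my own choosing, not the author’s. High-school algebra has a genuine algorism; differential equations, in general, do not:

    % An algorism exists for quadratics: every $ax^2 + bx + c = 0$ with
    % $a \neq 0$ yields mechanically to
    \[ x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}. \]
    % No comparable algorism covers differential equations. The equation
    % $y' = y$ happens to separate, giving $y = Ce^{x}$; yet the
    % innocent-looking
    \[ y' = x^2 + y^2 \]
    % is a standard example with no solution in elementary functions:
    % one can only try techniques one by one and hope.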

To sum up an essential point without further discussion and examples: Popper agrees with Freud that there are psychological difficulties concerning studies, but the ones he centers on are ones that (as a Baconian) Freud could not see: their direct source is not moral; they are associated not with childhood experiences but with the inability to master the curriculum, much less to criticize it—often due to a high degree of native talent and sensitivity to the subject. These qualities should enormously facilitate study; under better conditions, they would. (The sight of previously poor students who have got rid of their blocks is often most delightful, because they blossom tremendously—just because often their very talents had led to the blockage.)

Some memory work—whether memorizing or being able to look up the literature—is unavoidable even when the material studied is largely deductive: no memory is secure, not of an algorism and not of an ingenious trick. It is possible to facilitate memory, however, by facilitating understanding—in two very closely related yet distinct ways. First, sheer economy: we can pinpoint the part that we have no choice but to memorize. Second, by fitting the remembered item very well within the general understanding of the problem-situation, the problem at hand, and its solution. (General familiarity with the field facilitates looking things up.) Moreover, by developing a critical attitude in class, not only the standard causes of distress but also other difficulties that students may sense will become not deterrents to study but further stimuli.

All teachers, including inexperienced ones, know that most students will forget much of what they study. Query: why do they teach what students will forget? Were it fun to go over such material, the problem would be academic; but it is agony to all concerned. Were it a matter of experiment, it would be highly justifiable; but the curriculum is notoriously conservative—occasionally even teachers complain about its stagnation. So why do we teach what will be forgotten? The few who have faced the question have also dismissed it with ease—in one or two standard fashions. The first answer is, there is a transfer from the forgotten material to other affairs, so that not all is lost. This allegation is apologetic: not a single empirical study supports it. The transfer theory itself, being part of the drill theory, is in a shaky state, as learning theorists may admit under pressure. There is not a single design of a sane experiment to test the hypothesis that any material learned in school and forgotten is transferred. To my knowledge, the only empirical study of what is memorable in any course whatsoever is the immensely popular 1066 And All That (1930) by W. C. Sellar and R. J. Yeatman. This hilariously funny book claims to contain all that middle-class English high-school graduates remember from high-school history classes. It is not to be taken seriously. Still, in truth we do forget much.

The second answer is, we do not know what is memorable, and we cannot expect complete effectiveness. Concerning effectiveness, some research on it has led to attempts to cut the material down to the bones, especially in auxiliary courses, whether of geometry in vocational schools or of biochemistry in medical schools. The trend started, I think, with traditional navigation for naval officers, whose instructors assumed that they were too dumb for fully fledged Newtonian mechanics. The trend then developed with statistics for students of the social sciences, since they are often utterly incapable of absorbing the standard statistics course and yet professors consider statistics essential for proper social studies. Thus far, the experiment is in no proportion to the need, and it already meets an insurmountable difficulty: some experimental courses cut materials down to the bones; as experts view these courses with contempt, as a rule they delegate them to wretched teachers.[28] Even cutting courses down to the bones results only in minimizing agonies, not in presenting the material as enjoyable, as required. This is a pity, because there is no need to cut: most of the regular standard curriculum is highly interesting; it is the education system that makes it boring. All that is necessary to render school interesting is students’ readiness to cooperate, and this is easy to elicit by presenting the interest of the curriculum material and its significance. For this, however, teachers must be frank and present the shortcomings of that material too.

The right structure of a course renders merely marginal the difference between the full course and the one cut to the bones. To be well designed, a course must have a structure. If you wish to interest your students in it, you must describe and explain that structure, its significance, its chief problems, its general background ideas. The more successful or high-level a course, the richer it may become—much depending on the occasion, the students, the atmosphere, and their various causes and factors. The natural starting point, however, is the structure. I am speaking from experience.

If you provide students with a preview of a course and air the problems that they may have, most present-day problems of instruction, in high school as in college, become manageable and memory-work is minimized: most of what students have to remember they will remember with ease, and they will be able to look up with ease the information that they forget. I am no utopian; I know that your teachers and colleagues will not follow my suggestions—although I do hope that my suggestions, and better ones, will be commonplace by the time you retire. Meanwhile, there is a lot you can do for yourself in that direction. There are top-notch easy introductions to various subjects, such as Charles Darwin, Autobiography; A. H. Fraenkel, Abstract Set Theory; Courant and Robbins, What Is Mathematics?; Otto Toeplitz, The Calculus; Galileo, Dialogue on the Two World Systems; Einstein and Infeld, The Evolution of Physics; Schrödinger, What Is Life?; David Hume, Essays, Moral, Political, and Literary; Adam Smith, Wealth of Nations; Léon Walras, Elements of Pure Economics; Paul Samuelson, Foundations of Economic Analysis; Edward Evans-Pritchard, Lectures on Anthropology; Plato, Symposium; Bertrand Russell, Problems of Philosophy; Karl Popper, The Open Society and Its Enemies; Ernst Gombrich, The Story of Art; E. M. Forster, Aspects of the Novel; Raymond Chandler, The Simple Art of Murder; Charles Ives, Essays Before a Sonata; and many other delightful and profound books that are great eye-openers on any level of scholarship and grand works of art to boot. You can also read survey material wherever you can find it: prefaces, introductions, and conclusions of certain monographs, book reviews and review articles and essays, and encyclopedia articles—use the eleventh edition of the Encyclopedia Britannica (of the nineteen-tens) whenever you can. On the whole, try to tackle your material in the not-so-up-to-date better literature, whenever you can find it, whether in professional works written a generation ago or in encyclopedias for teenagers: whenever the encyclopedia for adults and the one for kids are at all comparable, you can be sure the latter will win hands down: it is written for a purely voluntary audience not yet versed enough in the mystiques of scholarship and the rest. Once you are somewhat oriented, you will be surprised how relatively easy it becomes to update your information—perhaps with the aid of the old pros and the more talented friends and colleagues, but much more cheaply by the use of the internet.

Yes, I do mean it: doing very well on the introductory level, however elementary (indeed, the more elementary the better), will enable you to ignore much dead wood and redundancy, as well as to remember with ease the details you have to. Somewhere, sometime, you must have heard the saying about the wood and the trees. I guess it got lost among the many other sayings you memorized in high school. I suggest you pay special attention to it every time your professor takes you for a walk in the Petrified Forest. For further details, go on reading.

Yes, I do mean all this, and not only for students and for young colleagues; nobody has to get lost among the dead wood. I confess I have given up hope of helping my established colleagues; I intend these pages for young people with glittering eyes on their way to academic careers. You may be a student and worry about exams: how, you say, can one have time for all the extracurricular reading here recommended when there are so many exams on so many details? In the following section, I shall show you how you can try to fit the details into simple dialectical schemes and so make them highly memorable without having to memorize them. I confess it is not so easy and sometimes simply impossible. Yet this I say to you right now: if you develop with ease a good grasp of the structure of a given course, and remember only the details that stick without effort, your teachers may dislike it no end and even condemn your conduct; they may bar you from receiving the highest grades; but they will not be able to flunk you, and they will consider you promising. Being able to pass a course cheaply without fear of flunking, even with the chance of doing very well—for some of your professors are nobody’s fools, even if you overlook this fact—seems to me to be quite a bargain. The real bargain, however, will be that you will enjoy study and develop intellectually much better than your hard-working classmates: teachers’ pets do not show half as much exceptional talent when they go into the world—especially into the ivory tower—as their grades promise. Indeed, as I keep telling you, once you graduate, your grades lose all significance. Or are you a new teacher? I recommend then that you discuss in class the structure of your course—in the first meeting, in the last meeting, and on any other occasion. It is much better for you—especially in your first few years—to cut your course down to the bones and repeat it, and to encourage students to raise all the difficulties they may have and air all their blocks. This way you, too, will benefit. You will have more time to plan your courses and your lectures, since you will not need so much time for preparation by way of studying up-to-date pointless material and stuffing your course with pointless details. Also, announce in your first class, and repeat it endlessly, that you will give an open-book exam, or still better a take-home exam in essay form (if the authorities permit it), to ensure that your students will not spend time memorizing. Remember the professors from whom you learned most, and notice that you learned from them least in detail and that they always had time to labor the obvious in a free and easy manner at the request of the dullest and slowest student in class. Note that it is easy to imitate them, now that you know how.

To conclude: the more meaningful an item is for you (“cept” is students’ jargon for it), the less necessary it is for you to memorize it—for exams, and more so on other occasions, when you can look things up—and the more you will be able to use it intelligently. The dull items in the standard textbook are often easy to liven up. I shall soon tell you how.

Before going to the next section, I feel an urge to tidy up a point concerning memory and its shortcomings. The drill theorists have tacitly assumed that recall is unproblematic; they centered on storage. Freud said the exact opposite—also tacitly. What is missing in both theories is a reasonable view of the understanding and its role in memory. There is such a thing as overall understanding, as well as the understanding of the place of a given step in a general discussion or of a given item in a general picture. The understanding of problems invites study too. The Gestalt psychologists should have noticed that, since Kant already discovered it; some of them indeed have, to this or that extent. Inasmuch as they have offered a theory of intellectual Gestalts, it is Kantian and static. An improved version of it is possible: it requires a modification of Kant’s theory—a recognition of our ability to alter our Gestalts as we face new criticisms, new problems, new aspirations. This modification of Kant’s theory is Einstein’s or Popper’s. As a theory of memory, this is a part of the research of the great J. J. Gibson and his followers; my favorite among them is my former student Edward S. Reed. I still grieve over his early demise.

Memory may distort, yet we cannot do without it: we need it for a sense of proportion. Take for example the standard distortion due to the view of criticism as harmful (even though it is obviously beneficial). The memory of some terrific criticism that took you off an embarrassing path is the best cure for the misconception. However, notice this: criticism is not only for personal benefit: it is also publicly very useful. This is the use of public memory: history. Think of criticism of ideas of dead thinkers. Only some of these are remembered—possibly together with the criticism that has rendered them obsolete. The value of the criticism depends on the value of the ideas criticized and is indifferent to the question, is its originator dead or alive? Although the knowledge of the value of criticism for learning is ancient, Popper has observed, leading learning theories systematically overlook it.
 

4. Training in Dialectics, the Supreme Art of the True Scholar

Dialectics, the supreme art of scholarship, is open to diverse techniques. This section concerns these techniques. You can test my recommendations about them by trying to apply them. If you find them useless, give them up or try to improve upon them. The role of dialectics is to criticize: dialectic is the attempt to disprove a given approach, view, idea, theory, or observation. The essential prerequisite here is to choose an item worthy of criticism. To this end, it is important to know quite a few things. What is the question/problem/issue at stake? Is it important? Why? What is the specialty and strength of the way of tackling it? All this comprises one point: you must learn to be appreciative of the object of your criticism. I advise you most emphatically to express your appreciation clearly, and in the very opening of your critical discussion. It is advisable to open with a description of a problem-situation, to conclude it with a statement of the problem to be studied, followed by a statement of the extant solutions, followed by naming the ones that deserve criticism and explaining why (Collingwood, Autobiography, 1939).

Before starting, here is a terrific piece of advice. It concerns a common error that you should avoid: when your opinion is under criticism, you may encounter a piece of criticism that you cannot answer; that ends the debate. I have witnessed many violations of a simple rule here—to my shame, some of them by myself: do not simply let the critical debate die naturally. This should never happen to you. Never. As the object of criticism, you must always, always have the last word; the default option is, “Thanks”.
 

4.1

Dialectic presentation renders disagreement somewhat abstract—more objective than otherwise. I never attack people, said Friedrich Nietzsche, only the opinions that they represent.

If you wish to play safe, dialectic is not for you. In any case, safe or not safe, this is not to the point: dialectic debates are seldom about you or about your convictions. If you choose to discuss an opinion that you advocate and strongly believe in, that is fine; if it is an opinion that you do not advocate, that is equally fine—as long as you value the opinions you are discussing. If you do not value an opinion, do not put it to a dialectic debate. Some people oppose this and say, some opinions are valueless but need discussion because they are popular. This opposition merits attention, but in the present context it is irrelevant, since we are discussing here your benefit, not your (political) duty to your society. Nevertheless, let me take this point up, as briefly as I can.

There is all the difference in the world between playing and training. When a coach and a trainee play, the coach is teaching and the trainee is learning. They do not play. When you argue with anti-Semites, you try to teach them; you are not engaged in a dialectic debate with them: a dialectic debate may surprise, whereas arguing against antisemitism is a real bore. Your argument with the anti-Semites may also not qualify as teaching, since your opponents did not ask you to teach them. Your activity is political, and the name for it is propaganda. Do not confuse dialectic debate with propaganda: they are both honorable and valuable, but they follow different sets of rules.

Nor are dialectic and propaganda the only kinds of conversation. When you wish to convince, say, your heart’s desire to marry you, you are engaged in a still different activity, and I wish you luck. Let me add, if I may, that this activity is not dialectic (although it may involve some).

The supreme rationale for playing the game of dialectic is the idea that the exposure of error is progress. This makes commendable the relinquishing of (seemingly) defeated positions. Hence, the rationale of dialectics in general is the recognition of ignorance. More than this, dialectic cannot contribute. Arch-chemist Lavoisier said once, let those who disagree with me attempt to criticize my doctrine—they are most heartily welcome to it. Since my theory is true, their criticism will have to fail. Their very failure will convince them that I am right, he said. As it happens, Humphry Davy criticized his doctrine effectively (by claiming that the halogens that oxidize are elements, contrary to Lavoisier’s claim that only oxygen oxidizes and that therefore chlorine is a compound).

The position of Lavoisier is questionable, yet he was a prominent dialectician, a master of the first order. If I were concerned enough about the present study and the impression that it makes on you, I might scrap the last paragraph altogether (and no matter what an author tells you, to be any good, authors must learn to scrap a lot). I will not scrap a paragraph that exposes a weakness in my view.

Here is my view of Lavoisier for what it is worth. He was a magnificent critic of all his predecessors. Regrettably, he was contemptuous of them—following seriously Bacon’s condemnation of the publication of erroneous ideas. He consequently allowed his wife to burn ceremoniously the leading book of his arch target, his great predecessor, the phlogistonist Georg Ernst Stahl. A leading Lavoisier opponent, Richard Kirwan, wrote a major work against him, his Essay on Phlogiston and the Constitution of Acids (1787). In response, Lavoisier had the work translated and each chapter published together with a detailed critical commentary written by a leading French thinker (himself included). Even the mathematician Laplace had a hand in the work, but this was more of a political than of a dialectical move. Yet this political move too showed how seriously Lavoisier took Kirwan. The translated and expanded book then appeared in English retranslation. Kirwan capitulated—very honorably and equally honored by his contemporaries, as all available evidence indicates. Lavoisier died (he was beheaded during the Revolution) before Priestley and Davy criticized his doctrine sufficiently effectively. As a private letter (written a generation later to Davy’s pupil Faraday) testifies, there were serious attempts to suppress the French publication of Davy’s criticism of Lavoisier—even threats to use police force; though there was no difficulty for any historian of science to lay hands on that letter and publish it, it was an amateur historian of science who published it—myself.[29] (No worry: some professional historians of science dismissed my paper with the aid of many sophisticated arguments.)[30] Lavoisier would not have suppressed criticism of his views. It is to the credit of his followers that their resistance to Davy, though unpleasant, crumbled within a few years and totally disappeared: the incident the letter relates took place in 1811; during 1814, in spite of the war, Davy was a guest of honor of the French scientific community. Scientific dogmatism is like this. In all of its variants, it comes with some effort to do justice to opponents. Hence, said Popper, open-mindedness in the scientific community is a social phenomenon, not an individual one: the scientific tradition neglects dogmatic texts (except as items to explain).[31] Most of the rank-and-file have little occasion to change their minds. They defend Newton one day, Einstein the next, and Bohr afterwards, but these occasions are rare, and so they can safely follow the same idea: parrot the leaders; the less you comprehend them, the more you should conceal that by vociferous expressions of loyalty. Just so. Thomas S. Kuhn has placed them on the map by calling them normal;[32] their defense of dogma normally collapses fast.

All this does not render the publications of the rank-and-file useless; research-work is only mostly useless (especially in these days of academic empire building and publication pressure). Inasmuch as new research is useful, it is so because the leadership accepts new ideas, together with the new frameworks within which new research results appear. Research cannot always stagnate as much as the leadership would want it to. (Leading philosopher and historian of science Thomas S. Kuhn said, the leaders of a science are always right when they decide to declare a paradigm shift. This should have given him some peace of mind. It did not.) Once Einstein’s relativity won acceptance, there was a lot of work to do: it was necessary to translate much from the old Newtonian framework to the new, some of it requiring the work of giants like Paul Dirac, some offering smaller but still important challenges that rank-and-file researchers could handle. The same holds for Malinowski’s work, and that of Morgan and of other trailblazers. (Already Laplace noticed this as he humbly viewed all of his trailblazing work in physics as mere implementation of Newton’s achievements; and this despite his tremendous, amply justifiable pride!) Alas! Some leaders in science are dogmatists of frightening dimensions. Yet even they bend before powerful criticism—or else they simply lose their leadership (Polanyi). Hence, things need not be as bad as they sound: some criticism may cause a change of framework that permits further change.

This explains why Einstein’s severe criticism of the views of scientists about what they do is no warning sign that science may be dying. The popular expressions of contempt for the critical attitude are costly, but they do not endanger science: look at their critical attitude in action! The reason I am fussing about it rests on my wish to save you the cost of this contempt. I do not know how much my academic success allows me to put myself forward as a model; for what it is worth, however, let me report this: without my dialectic writing abilities, I do not think I could have had such an enormous and successful output. I hope with my aid you will do better.
 

4.2

For what it is worth, in my opinion you can be a good dialectician without knowing it and even while ambivalent about it, but you can learn the tricks of the trade faster and become more efficient if you shed your ambivalence and approach matters aware and self-critical. After all, most researchers have learned a formula for research—inductivism—and their practice of research seemingly shows that the inductive formula works. You can easily learn the formula by reading summaries of works by Bacon, Locke, and others. I suppose it is just as easy for you to learn the modern variant of inductivism. As a student or as a young colleague, you will find little antagonism to either your attempt to follow that formula or your declaration of faith in it. I suppose that this is what your professors and senior colleagues expect of you. I suppose you are frustrated and bewildered; well, try the dialectic formula and see if it does not fit your work better. It may be just what you want, and it may be furthest from your thought. If you do not try it, you may never know.

That is all for the time being. Later on, someone may say of you that you were only thinking you were applying the rules of dialectic, and then propose to you better rules. If we are to be scientific, why not improve our views of science just as we improve our views of electrons and nucleic acids and democracy? There is one difference, however, between the natural and the social sciences, and it is simple: electrons follow the same laws regardless of our views of them and of our experiments on them, but those who experiment in order to create and follow better rules may become better at democracy or at scholarship and research.

To establish the importance of theory for chemistry, Lavoisier tried to show first how much it explains—especially that his theory explains better all that his predecessors had explained. This he did magnificently; but then he sank into apologetics, though not for long, as he died young. Phlogistonism was the theory that combustibles contain phlogiston (= the matter of fire) that they exude; this exuding is the process of burning. This explains why coal ashes are lighter than the initial coal. Yet the ashes of metals (rusts) are heavier than the initial metals. Phlogistonists took coal for granted and wondered about metals. Lavoisier turned this around: he took metals for granted and tried to explain the loss of weight of coal: he trapped the gas that the burning coal emitted and measured its excess weight. At the time, researchers deemed phlogistonism a tremendous success, as it included formulas that describe processes. These still dominate introductory chemistry books. Yet in the initial formulas phlogiston moved away from the combustible; Lavoisier replaced this loss of phlogiston with the gain of caloric. In the path-breaking theory of heat-machines of Sadi Carnot of 1824, heat-transfer was the transition of caloric as a fluid from high places to low ones (= from hot bodies to cold ones).

Carnot too was a master dialectician. He considered the strongest criticism of the theory of caloric to be the observation of Benjamin Thompson (Count Rumford) that friction is an inexhaustible source of heat (1798). He even showed that in the same process (it happened to be the boring of cannon barrels to make them extra-smooth) more friction causes more heat. He said, the calorist assumption that heat is matter means that creating or destroying it is impossible. (Physicists took this for granted; they gave it up only when Einstein’s equation of mass with energy replaced it.) Carnot said, friction raises temperature because it is a means for moving caloric around; like other fluids, it flows only from high places to low ones. Although his study was very successful, he was sufficiently self-critical to worry about Davy’s (thought) experiment: friction in isolation in vacuum still raises temperature. The situation was baffling because Thompson and Davy held a baffling competing theory: the theory of heat as motion. Historians of physics declare them in the right, thereby perpetuating the confusion. When Carnot’s successful theory underwent successful translation from the caloric theory to an extended version of the motion theory (one that obeys the law of conservation of energy), the result was the theory of heat (not as motion but) as the concentration of motion. It is still the received opinion despite its problems (the thermodynamics paradoxes).

If you happen to be a physicist, then I assume you see that presenting the gross outline of the history of classical chemistry and of classical thermodynamics as an ongoing dialectical process facilitates presentation tremendously. It is also easy to fill in details here as raw material for either a text for students or a historical paper for some scholarly publication in the history of science. If you are a physicist and you do not like this, I wonder how you have arrived at this place in my book to read these paragraphs.
 

4.3

Criticism is not enough for progress; for that, one wants a fresh supply of ever better theories to keep dialectics from stagnation. We have no theory about the generation of theories. We have no theory about the frame of mind conducive to the production of good theories—except that one needs problems to solve with their help, and that criticisms of past theories furnish such problems. Some people might refuse to become researchers if they thought their ideas might be superseded one day: Newton himself seems to have been such a person; and Freud too. This led them both to an enormous waste of mental energy. They had enough of it to waste; you may have to be a bit more economical.

You can start your lessons in dialectical economy straight away. We have been discussing the piling up of facts in the standard textbook.—Sheer inductivism; why not scrap it all?—Not so fast; try to translate it into dialectics if and when you can. It is not so easy, and sometimes it requires more historical knowledge and training than you have; but sometimes you can. Indeed, sometimes you do—and meet with penalties as a result. Let us start there, where you have already arrived: with an achievement of yours for which you have already received your penalty.

Have you ever argued with your elders and betters? Do you remember the incident? You were beaten in the debate, I dare to assume, even by fair means—certainly not always, but let us concentrate on the better instance. The defeat had a moral for you: do not argue—not yet, at least—and before you master your trade, keep silent; eat humble pie; chew dust; do as you are told. Let us take a more specific case. Suppose you have advanced a theory, perhaps one you heard one way or another during your early days. Suppose now your elders and betters remind you of fact x on page y of your sacred textbook and use it to disprove your pet theory. You are humbled; with agonies, you try to put away the silly theory—forget it, even. Memorize the textbook instead.

What? Has it never occurred to you? Or did their proposal that you forget it work so well? One way or another, perhaps this book is not for you. Perhaps you never dared pester your elders and betters; perhaps you never had any idea with which to pester them. I recommend that you stop reading this book for a while and remedy this defect. You can come back to this book later if you must. First, have it out with your elders and betters.

Meanwhile we continue with those who have pestered their elders and betters and were told to forget. Well, I say, do not. Do remember the theory you have heard or invented or reconstructed; it comes together with the fact that they used to criticize it. You are therefore free of the need to memorize that fact. When you become a teacher, your list of dry facts will be reduced by one: you will be able to present the theory and refute it with the help of that same fact—with no trauma and so with no scar. This is very useful.

You can make a method of it. Start with your high-school material if you study in college what you already studied in high school. Otherwise, start with some of what they call folk science. Express each of the scientifically rejected theories as clearly as you can. Find in the textbook the facts that refute it. Very easy. Very instructive. Nothing to forget. Little to memorize. A real bargain as far as it goes.

Take a subject not taught in high school, such as politics (civics) or economics. Everyone is a politician of sorts and an economist of sorts. Before you came to college, you followed the value theory: you only bought a thing if the price was reasonable in the sense that you got value for your money. In your elementary economics textbook, they start with preferences and with indifference-curves and budget-curves and supply and demand curves; and they give you no idea as to how these relate to your purchase of your jalopy last fall. Now try to word your pre-college value-theory and find in your textbook criticisms of it. The theory of supply and demand that comes to replace it will then make much better sense to you, as will the theory of consumer preference.

It is not at all an easy exercise. You may find it helpful to consult your encyclopedia—under value-theory or under Ricardo, David—or you may wish to consult the text of your history-of-economic-thought course (try the terrific Robert L. Heilbroner, The Worldly Philosophers, 1953). Ask your instructor for help—there is always one around willing to help a bright fellow, and following the kind of suggestions made here is the bright fellow’s way.

The same goes for politics. Take the latest political dispute you have engaged in. Ask yourself, what problem were you and your peers debating? If you do not know the answer, just try out any conjecture. You may change your question later on. In the meantime, stick to the question for a while, list as many answers to it as you can, and try to defend each of them. This should raise the level of the debate sufficiently above the standard level to make it more interesting than usual. If you continue this way, you will very soon find that you need some simple items of information. Stop the discussion for a while—notice this: it is always legitimate to ask for time out—and look it up. When you want to look up some broader questions, you will find it particularly hard to find the right text to consult. Take it slowly.

The method is easy in some cases and very difficult in others. Try to ask your teacher in learning-theory how he explains repression. Nothing to it: it is negative reinforcement. How exactly? Shut up; we are busy now with some other work. All in good time. In other words, it is too tough even for your teachers in learning-theory. It is useless to pester. Let us leave them and the task at hand for a while. You can come back to it later.
 

4.4

The standard dialectic discussion concerns the possible disagreement between theory and observation or perception of facts. This is no small matter, since it is seldom obvious whether theory and observation are in agreement. Do not despair if what I am going to describe has never happened to you: I do not take you for a genius. Perhaps you tried to measure the sum of the angles of a triangle to see if it really equals two right angles. Maybe not. Maybe you never bothered about your high-school geometry. Maybe they taught you the new mathematics and no trace of Euclid. It does not matter. All that matters is that something or other bothered you about your high-school geometry textbook, about its content, form, illustration, print, or color of cover. Or about something else that they taught you in high school. I assume you have not found out much about it and that you have forgotten it until now. They say growing up is growing out of such uneasy feelings. How sad. Growing up is not growing out of any unease, but learning to articulate unease—this may be and indeed often is quite precarious, and so it should read, learning to try to articulate unease in alternative manners—and learning to tackle it dialectically: you may try to solve the problem articulated and then try to criticize your own solution, or to generalize the problem first, or to replace it by a more interesting and significant one. Experts call repression the growing out of an unease or the forgetting of it, whereas they call adjustment the learning to articulate it as a problem and perhaps to solve it. There is a methodological defect in Freud (who was an ardent inductivist) that shows up as a defect in his theory, since articulating the unease as a problem and solving it is an adjustment in a sense that his theory overlooks.

Thomas Mann’s 1903 Tonio Kröger is a literary application of Freud’s theory. It is a schematic story in three movements—a problematic infancy, its development in manhood into a form of neurosis, and its resolution through the hero’s return to his childhood scene, noting that his plight had been that of a half-Italian living in a parochial German community. Poor as it is, we can learn from it. When you come to college, your mind is full of ideas and full of misgivings. If you express all these as clearly as you can, you will find greater peace of mind, and much of the textbook will prove helpful and thus make better sense; perhaps you will have to use not only the textbook but also some auxiliary material, such as historical sketches and encyclopedia articles. In any case, this process is cheaper and more fruitful than agonizing or memorizing.

Textbooks contain much material confirming theories, old or new, not only criticisms. Confirmations, they say, are accords between facts and theories. This is very vague and somewhat dangerous for a beginner dialectician, for two distinct reasons. First, if the logic of confirmation is not clear, it is harder to understand; second, it is harder to see the problems that might nonetheless cause unease. Popper’s criticism of the popular view of confirmation is simple: it renders painkillers the best cure for cancer, since most of those who use them are free of cancer. This displays the poverty of the inductivist theory of confirmation currently most popular among philosophers of science. To see what confirms what and why, you need to see what a new theory explains and how it changed the scene: what was problematic before, and how much of it is now better explained. Notice that we have good explanations and poor ones, and that only good ones are testable and thus confirmable. This is Popper’s view. Apply it to Thomas Mann’s novel. You will look for cases that look un-Freudian, that may fail to fit Freud’s theory. If you describe such a case in detail, and then explain how it fits Freud’s theory nonetheless, then you have a tour de force, an achievement.

Take a simple example. Physicists calculated from their theories the strength of metallic crystals; the result did not tally with known facts: by the calculated results, pig iron should be much stronger than even steel happens to be. However, inspection shows that pig iron is not quite a crystal: it is a heap of crystals. So with some effort physicists succeeded in manufacturing some sort of metallic crystals (whiskers, they are called) more like the theoretical crystals and less broken or distorted than the usual. Their strength accords with the theory much better.

Many physicists take this as a part of the success story of science and leave it at that. I hope they are happy. Good dialecticians they are not. For, although the strength of certain unusually constructed metallic crystals is successfully explained, the strength of usual pieces of metal is still not. For this, further studies are necessary, and some indeed were undertaken. These studies are much more advanced and still highly incomplete. They are essential, say, for aircraft engineers, who need results that are both confirmed and applicable. They are legally forced to limit applications to ideas confirmed in accord with highly specified procedures. The confirming cases may be special and leave much not understood. Nobody suspected that the brilliant dust dancing in a sunbeam offered a criticism of Newton’s theory of light until almost a century after the demise of this theory—when Lord Rayleigh showed that only a wave theory of light explains it. Maxwell’s theory of electromagnetism (1865) explained the wave theory of light (1818); he was convinced that the radiation of hot things is explicable by his theory; Rutherford disproved him (1904). To overcome this difficulty, his student Niels Bohr developed quantum theory (1919).

Take a different example: Mendelism (1866). When you cross-breed a trait AA with a trait BB you get a mixed breed AB—a heterozygote—and when you breed these you get one quarter AA, one quarter BB, and one half of the mixed AB. The mixed breed may look in-between—say, gray if A and B are white and black—or it may look like the one that is called dominant; the other is called recessive. When a mixed breed of certain mice showed one third and two thirds, it looked like a criticism of Mendelism; balance was restored when the missing quarter was found as fetuses that die in the womb early in pregnancy. This is confirmation indeed, but it raises hosts of new problems, not all of which are solved. Also, there is the failure of Mendelism to apply to human skin-color. (Almost all so-called African Americans are of mixed skin-color.) Powerful researchers tried to apply Mendelism to the breeding of cows with high milk yields and of racehorses. Thus far, few of these efforts are successful. Yet quite a few historians of science say, as Mendelism is obvious, all of Mendel’s predecessors and contemporaries were prejudiced.
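For readers who want the arithmetic spelled out, here is a minimal restatement in symbols of the ratios just cited, on the sole assumption that each heterozygote parent passes on A or B with equal probability:

\[
\left(\tfrac{1}{2}A + \tfrac{1}{2}B\right)^{2} \;=\; \tfrac{1}{4}\,AA \;+\; \tfrac{1}{2}\,AB \;+\; \tfrac{1}{4}\,BB .
\]

The mouse anomaly above then amounts to seeing only two of the three classes, in the proportion of one third to two thirds; the missing quarter reappears once the dead fetuses are counted.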

There are many such cases. To take a conspicuous example, historians of science allege that early nineteenth-century chemistry proved Dalton’s atomism. His proof was the law of multiple proportions. (Thus the ratio between the weight of a molecule of oxygen, O2, and of ozone, O3, is 2/3.) They ignore all the evidence that conflicted with it (such as solutions and amalgams) and that explained why many researchers disagreed with Dalton. But just look at the latest table of elements and see how many facts do not fit simply and invite special theories, some of which are truly impressive, but without the ability to close the issue.
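A quick check of the parenthetical ratio just given, taking the weight of one oxygen atom as 16 units (the standard textbook figure, not a claim about Dalton’s own numbers):

\[
\frac{m(\mathrm{O_2})}{m(\mathrm{O_3})} \;=\; \frac{2 \times 16}{3 \times 16} \;=\; \frac{2}{3}.
\]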

Still another example. The theory of the cone of colors is an impressive success. I looked in as many books on it as I could for the color brown. They do not mention it. Although it is a tremendous achievement, the theory lacks much. Consider the different shades of white, especially silver, and specific colors such as oxblood, brown, and khaki, as well as burgundy and beige. Studies of colors that display the cone do not mention this. That conflicts with the rules of dialectics.

The social sciences are murkier. Popper showed that many of the reasons for Marx or Freud having considered their views scientific are poor. The views are untestable; also, they do not explain what they are alleged to explain.

Ask those contemptuous of Freud, how do they explain Freudian repression? Easy: negative conditioning. Ask for details. In the best instance, they will give you a list of publications in which to look for an answer. You need not bother about such cases. They are all run-of-the-mill. Articulate them, use them as best you can in your present work, or lay them aside for a rainy day. They do bother you because you do not know what exactly to expect from your professor or your textbook (they systematically refrain from telling you) and consequently you expect them to give you more than they can. If you are as submissive as required, then this comes off nicely; if you are clever, you expect to receive what you think they can deliver; when they do not, you feel troubled. To release this sense of trouble, you may increase your critical aptitude. You need not exert yourself; try to get more than the average student at your stage gets. The fruits of novelty may come later, with perseverance and with luck. Right now, we ignore your need to coordinate and to achieve some peace of mind.

I am wary of your becoming hypercritical when you discover weaknesses in your syllabus, textbook, professor, or senior colleagues—especially when they obviously mystify a point to conceal weakness or ignorance. Hypercriticism is an inverted uncritical attitude, a variant of cynicism and bitterness, itself a variant of broken but not destroyed naiveté. Also, it is the mark of the crank—sometimes rightly so. For being hypercritical, being scandalized at the existence of established and respected defective items masquerading as perfect, amounts to faith in perfection, to the hope that one day the hypercritics will right all wrongs.

I do not know how Lavoisier made his first great discovery, his real breakthrough. It was a fact and a conjecture. The fact was that not only metals but also other matter might exhibit the same feature: Lavoisier found that, like the dregs of metals, the dregs of phosphorus are heavier than their origins. The conjecture was bold: perhaps metals display the rule and charcoal is the exception! He worked feverishly trying to prove this conjecture. He was stuck. He then heard Priestley on his discovery of dephlogisticated air (that Lavoisier later called oxygen), and it came to him in a flash: the dregs of a combustion are partly ashes, partly gases, and hence their combined weight must equal the combined weight of the combustible and the oxygen it consumed. He was so excited he thought it only just to view himself as a co-discoverer of oxygen. Posterity decided to view his conduct as plagiarism. This, however, is an unimportant exaggeration. Posterity is more likely to be just, because the recognition that plagiarists usually seek is immediate, although some great thinkers too fall for this pathetic weakness.
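In symbols, the mass balance that came to Lavoisier in a flash, restating the sentence above with weights written as masses:

\[
m_{\text{combustible}} + m_{\text{oxygen}} \;=\; m_{\text{ashes}} + m_{\text{gases}} .
\]

It covers both puzzles at once: the dregs of metals gain weight because they retain the oxygen, while coal seems to lose weight because most of its product escapes as gas.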

Our perplexity about priority is evident every time a civil court is asked to decide whether some text is plagiarized. (The same holds for the fine arts, but there it is inherent, as there a sharp criterion for plagiarism is impossible.) The paper mills—the organizations on the margins of Academe that write for students what they need, from semester papers to doctoral dissertations—are expert in plagiarizing with impunity. Boston University sued such an organization and won. Had the judge asked me for my opinion, I would have said, a professor who cannot distinguish the real from the fake deserves the fake. If you ever suspect a text to be fake, do not fuss; invite its alleged writer to discuss it with you. If such a discussion leads to the improvement of the text then the improved version is an achievement.

So back to the real thing, to genuine research, to our example of it from Lavoisier. He never solved all the problems and difficulties he faced, and he frankly and boldly said so—though with the naïve inductivist conviction that one day research would surmount them all. Hence, Lavoisier displayed ambivalence toward dialectics. And so, much as he played at being scandalized at his predecessors’ errors, he did so only when he was assured of victory, when he was playing to the gallery. To this day, it is all right for an established person to be scandalized—especially at the paucity of experimental work, of a genuine spirit of research, and at other maladies. Before you are established, do not practice indignation! When you are established and indignant, you are avowedly performing your duty; when you are not established and indignant, everyone suspects you of trying to establish yourself the easy way: without doing your honest share. Cheap.

No; indignation is never to be recommended, especially not in Academe. It is a poor substitute for compassion.
 

4.5

Approach your studies critically, but make no fuss about your criticism. As a student, use your critical abilities and the results of your dialectical activities not for professors and not for fights but chiefly for yourself and possibly for your close friends—as therapeutic techniques to avoid unarticulated displeasure at what you study and as means of absorbing your curriculum intelligently, with ease and with pleasure.

Writing a paper or a report or a dissertation—for any purpose—you may be able to use the dialectic technique explicitly. Your professors may force you to avoid it. I had a friend who could not write his doctoral dissertation. He had to pile up the empirical evidence from which his theories should have emerged, and he got himself lost in the details. He was desperate because he had a deadline. He asked me to help, and I trained him to write dialectically. As a normal intelligent citizen who had taken part more than once in political and other debates, he was no novice to the game. Still, he had a real horror of expressing in writing a hypothesis he disowned. In desperation, he yielded to my suggestion and wrote down a false hypothesis, hoping to erase it after parting company with me. I then advised him to write that the hypothesis was false and to bring the evidence against it. The spell was broken. He went home and wrote in this fashion a sufficiently large part of his dissertation—and quite a dissertation it was—in a surprisingly short time. His professor was impressed with the content but dismayed at the style. My friend was not disheartened. He deleted much of the explicit part of the argument, suppressed most of what was left of it, multiplied his information, added a dash of irrelevant data, and got his Ph.D. It became his style for a lifetime, he told me decades later, to write dialectically and then retouch the outcome inductively.

Popper told me about his early days, including the story of his becoming a friend of John Eccles, then a young academic but later a Nobel Laureate in physiology. Eccles felt grateful to Popper, who had encouraged him in his efforts to refute his own hypotheses. Popper advised him to submit to the leading British scientific periodical a paper in the dialectic style. The editors accepted the paper on the condition that it be recast in the inductive style. This is impressive. Later on, I too received such letters. I never yielded on this, although my policy was always to follow editors’ detailed suggestions as much as possible. One famous scholar-cum-editor helped me rewrite a paper again and again before he had it published. It remained dialectical, although rather softly so.

The day will come when it will not be necessary to retouch a paper, when it will be possible to publish a closely reasoned piece—I suppose it will be a survey. It is still possible to do so today, before you become an established academic. Colleagues, especially in your department, may be slow to realize this fact; it should not worry you: do whatever you like and enjoy your robust activity; write as dialectically as you can, book reviews or surveys, and achieve recognition as a published academic. This will facilitate further writing, in accord with what Robert Merton called the Matthew effect: the published have the better chance to publish more. Editors are cowards; they fear punishment for the publication of the wrong stuff more than they covet praise for the publication of the right stuff.
 

4.6

Book reviews are usually commissioned. Find someone who will recommend you to an editor for one. (This is not easy: editors may expect reviewers to express a semi-official view of the book’s author, and you should stay ignorant of this.) When you obtain a book to review, examine it to see if it deserves a favorable review. If not, return it to the editor pronto: negative reviews are harder to write than positive ones. Next, see to it that you have some familiarity with the field to which the book belongs. Do that before you study the book, but do not spend too much time and effort on it. Then write the review. Try to read as little of the book as you can, while guessing its contents as best you can. Then check your guesses. When you fail, try to explain your failure to yourself. Never hesitate to throw away a draft and write another. This will teach you to write fast: writing many versions is faster than writing the first draft as final. A version is final when you decide that it deserves careful checking of details. Do that as much as your patience allows. If the book appears in a printed version, get a digital version too, as this helps refine your checking.

A book review should include a summary of the book’s contents, the praise that explains its right to a review, and a critical discussion of its contents. You can also add your impressions, a report on your experience of reading the book, and so on. And write it with a specific graduate student in mind—as if it were a personal letter.

A survey of a literature is a multiple review. It requires a good controversial question, a discussion of it, a list of available answers to it—good, bad, and indifferent—and critical discussions of them. A survey can be of answers, and it can be of books that present these answers. Here the room for variations is as wide as you wish. If surveys of the question you have chosen exist, this is a cause for celebration: a survey of surveys is plainly exhilarating.

Let us then move to teaching and lecturing. If you are the lecturer, then the simplest course is to deliver book reviews and book surveys, with some background material as needed. Things are different if you are a listener. Sitting in a classroom and hearing familiar material, you may find it boring no end. A simple trick will change that—for a while, until it wears off. Do not sit there like a student, but like a colleague; a supervisor, even: do not study the materials delivered but the people who deliver them, their choices of material, their modes of delivery. Why does the lecturer teach this, do that? How well does the lecture catch the listeners’ attention? In particular, how much dialectics does the lecturer employ, and how much of it profitably? Could I learn from all this? If a lecture is too dull even for that, you may spend your time more usefully leaving the lecture-hall or, if this is impossible, thinking to yourself about your own work. (Students often use their laptops during dull lectures.) If we dare not whistle or throw tomatoes at a lecturer, there still is the right—to say nothing of the duty—of audiences to express themselves.

The duty to deliver a dull lecture is worse than the duty to listen to one. First thing, tell your audience that you have to deliver it, and explain yourself properly and honorably as an adult in adult company. Some students may giggle—ignore them. Then discuss the reasons that make those who think you should deliver the lecture consider it important. Your dull lecture may thus become at least entertaining, and perhaps also interesting and instructive. The rule is simple. Try to avoid dullness at all cost. Do so primarily by presenting other people’s views, by questioning the claim that the activity is dull, important, both, neither—whatever is on your mind. The way to become a dialectician is by playing the game. Playfully.
 

4.7

You may dislike all this. You may think it is too much of a concession to the Establishment. If so, then possibly you may consider the very acceptance of an academic job too submissive. In that case this book is not for you: you are better off not on your way to an academic career.

I think this view is an exaggeration: in a modern civil society, doing your job as best you know how need not be submissive. Take the task that seems to me the least moral: passing your grade-sheet to the administration. If you want as much justice as possible, you check your grading against given criteria and check your results. You may want to consult colleagues, perhaps. What many professors do is not clear even to themselves. They cannot tell you why they grade this way or that. Professors can easily put off students who complain about grades. This may be frivolous, but it is hardly avoidable: grading one class properly and critically may easily expand into a lifetime task! So professors do their bit in order to avoid feeling guilty. They read, they worry, they reread, they survey the list, compare grades, regrade, reread—until the record office cannot take it any longer and tells some poor secretary to get the grades from the professor on pain of some penalty. The professor feels it is more unjust to the secretary to go on working on the exams than it is to the students to stop; so the process comes to an end, and thank God for having postponed the inevitable breakdown for another semester. The professor rushes off on a holiday, or on what is left of it.

The moral of this true story is that professors read exams and essays and other assignments not in order to be just but in order to avoid excessive feelings of guilt. Yet the method that leads nearer to justice requires less reading than they do, not more. For some purposes, multiple-choice exams are better than written ones. Otherwise, the justice of an exam depends on its objective. A well-structured essay on a familiar topic one can read at a glance—unless it is very new material (for the reader); this one can spot easily and read more slowly and with pleasure. If the lecturer takes care to describe clearly the objective of a course, and students know what they should do in order to achieve it, then the examiner should be able to tell at a glance the quality of the exam papers. Spending more time on reading them can only confuse.

A student once came to my office and stood at the door clumsily. I think he was a footballer, as he filled the door’s space. I looked at him inquiringly and he said apologetically, I want you to read my essay. That is not true, I said. The fellow boiled inside; his shoulders and back straightened and you could almost hear the door’s frame crack. Relax, I said, and take a chair. I shall convince you with one sentence, I added, that you are mistaken. He looked as if I had hit him on the head; he slumped into a chair, his eyes fixed on me as if we were on the sports field. I said, you want me to comment on your paper, not to read it; but you think I cannot comment on it unless I read it. He was utterly bewildered. I showed him that I could comment on his paper without glancing at it. I could do so by questions and answers, of course. You can learn much about a paper by learning about any point in it that you choose: choice of topic, preface, intended audience, structure, dialectical give-and-take. You can compare the answers about the paper to your set standard answer. What thrilled that student was that he was actively engaged in intellectual activity. Evidently, the paper was not a case of such an activity. I discussed it with him and we agreed that he should rewrite it and come for another discussion. He did not. Instead, his coach came to tell me he had dropped the course. College footballers have to play, not to study. Pity.

Since I am retired now, I can sum up the success rate of my performances as a lecturer. I did have some success, but I had more failures than successes. Failures come for diverse reasons. The commonest in my experience were intrigues: colleagues sabotaged my work systematically and in many ways, some quite creative, partly in ignorance of what I was doing and partly in baseless fear of honest competition. I say this, however, just to satisfy your curiosity. It has nothing to do with my purpose, which is to offer you my best advice to help you plan your academic career. The best way to face any intrigue, I keep telling you, is to ignore it. This has its opportunity cost, of course, as every choice does. It still seems to me the best policy. I know I could have increased my success rate by offering straight lectures rather than arguing with my students. It was my choice to act as I did. Looking back on my career, I do not regret it, much as I lament the cost of the intrigues that some of my students had to pay.

One final point. Training in dialectics, like any training, requires constant feedback. So students do need grades. It is the school that does not need them. This not only accounts for inflated administration; it is downright an insult to students, as it rests on the assumption that they need prodding in order to study.
 

5. Personal Adjustments on the Way to Becoming a Dialectician

Il Sodoma was Leonardo’s major disciple. Judging by his paintings in the Florence Pitti Gallery, he outdid Leonardo in the techniques for which Leonardo is so famous, yet he lacked Leonardo’s subtlety and delicacy. He is an almost forgotten artist of past renown, and even in the Pitti Gallery, which exhibits many second-rate works, his pictures are nowadays located in relatively obscure places. I dare say most visitors there have more than enough first-rate masterpieces to view rather than notice him.

The San Bartolomeo a Monte Oliveto Monastery, which lies in the serene Tuscan hills outside Florence, is not a particularly famous place. It shelters an obscure fresco by Il Sodoma, The Last Supper; it is in very poor condition. Being a fresco, it does not suffer from the excesses that reduce the value of his oils. The fresco’s composition is daring: Judas sits with his back to the spectator, looking, however, not at Jesus but away from Him, so that his portrait is central. St. John “the beloved disciple” sits sweetly behind, resting his saintly head on Jesus’s shoulder; the contrast between these two faces is striking: the one is pure sweetness, the other a most impressive mixture of a villain and a saint—the only convincing Judas I know. I suppose the whole composition emerges around him. If I am not mistaken, it is a self-portrait.[33] Anyway, it is intriguing and very moving.

The record of Judas Iscariot is intriguing: he was the arch-traitor and thus the vilest character imaginable, yet we know about him details that matter so much less. These speak to us, perhaps more than his betrayal of the Son of God, in what looks like an utter loss of all sense of proportion. St. Matthew uses these to render him human: he praises him as an efficient treasurer, and reports that before he hanged himself he returned the thirty pieces of silver that were the reward he had received for his treason. St. John differs:[34] even as a treasurer, he reports, Judas showed his vile character: he helped himself to the petty cash. I suppose St. John found irritating what the fresco of Il Sodoma came to illustrate: we find it easier to empathize with Judas than with the other Disciples; perfection is too remote from us to have a direct, live meaning for us (Bernard Shaw, Preface to his St. Joan, 1923). This section, about personal adjustments on the way to becoming a dialectician, deals with one’s readiness to devote one’s life to research that fails to dwarf personal quirks. To portray Judas as vile, St. John ascribes to him a petty, all-too-human failing, not mere treason alone.

What role do personal characteristics play in the life of Academe? Some professors will claim eccentricity at all costs (The Man Who Came to Dinner; Mr. Belvedere); others would rather die than be considered eccentric. There is something deep-seated here, something as deep-seated as that which makes Il Sodoma’s fresco so moving. We could go on discussing what exactly all this is as long as it sounds fascinating, but useful it will not become—not in any way that I can see.
 

5.1

How much of our personal characteristics can we ignore in our life in Academe? How important to us are they? Freud showed how much childhood experiences, almost accidental events, might contribute to what we consider our real selves. Yet without notice—a strange and exciting intellectual oversight if there ever was one—he declared that character solidifies (not in infancy but) in the transition from adolescence to adult life. At that stage, he noted, some of us outgrow childhood traumas, some of us sublimate them, and some never attend to them and become neurotics or worse. This is thought-provoking, yet Freud, who stated it more than once, never worried much about it—perhaps because he had enough work on his hands even while ignoring it. Yet it was an error. For one thing, it implies that there is no child neurotic—at least not in the same sense (as Melanie Klein noticed). For another, it raises the question, how much do the past and the present each determine whether a given adolescent will become neurotic or adjusted: it tells us that the neurotics are those who have missed the opportunity to shake off their childhood traumas. In adolescence, we are most obviously unadjusted—sensitive to other people’s pain. Freud discovered the pains of adolescence just as much as he discovered the pains of infancy (and, alas, declared both unavoidable). We are sensitive to our smallest failings then, as if we wished to be saints; but only as-if: our reaction-pattern is scrambled. If we remain as sensitive and as vulnerable, we remain unadjusted. If we lose sensitivity, we become coarse—vulnerable or invulnerable, but coarse. We may, however, remain sensitive but reduce our vulnerability—especially in congenial environments. The simplest way to pull off such a feat successfully is to show concern for others, especially for their sensitivity and vulnerability.

Inadequate adjustment and the problems it raises have caught up with respectable psychology—less with students of abnormal psychology than with students of growth psychology, education psychology, or learning theory and the study of transfer of skills, adequate and inadequate.

An example. One of the deepest reasons for the almost universal inability to imitate native pronunciation well enough is transfer, or an inadequate mode of adjustment, as E. H. Gombrich has observed (Art and Illusion, 1960). The reason for this is the reluctance to improve modes of adjustment. This is partly clinging to the familiar, shyness, or fear of change. Partly it is the fear of sounding phony. (Drama schools may set their students free. Yet some artists imitate natives well only on stage.) Partly it is due to the lack of readiness to prattle as children do when learning to speak.[35] These are all due to inadequate modes of teaching, especially of foreign languages, rooted in a strong inhibition against imitating children. (Try it in company and see what tremendous displeasure it elicits.) All children are painters; child-psychologists rightly worry about the mental development of children who do not paint. Most adults are inhibited even from doodling. This, too, is a success of our education. If you can bring yourself to doodle again, I greatly recommend it. It will improve you in many unexpected ways. I am speaking here of the rules of intellectual conduct that regularly find their way into Academe and cause inhibitions.

A simple example: “do not interrupt!” Why should one speaker avoid interrupting another? Obviously, it is at times very useful and conducive to the proper development of intellectual discourse to suppress the urge to interrupt—for the give-and-take between teacher and student or between scholars on an equal footing. Equally obviously, the same rule is at times useful for the opposite purpose: do not interrupt me! Say, for bores who go on endlessly making sounds without having anything to say, feeling immune to your impatience. Bores should not worry us: we meet them, discover soon enough that they are bores, and use all the evasion techniques in the book to prevent their assaults on our precious time. It is tragic, though, to see interesting scholars interested in each other’s work being too polite to make the most of their encounter, simply because of the inadequacy of the rules of polite conversation that prevent interruptions. The most useful interruptions I know are these. Sorry, I have asked the wrong question; allow me to withdraw it and ask again. Do skip this item.

Sensitive people are able to interpret all sorts of silent interruptions—such as facial expressions. A scholar’s face can easily convey vital interruptions, even without notice: go on more quickly, says a thoughtful open face on a strongly nodding head; go more slowly here, since now I can barely follow you, says a concentrated face, with a hand embarrassedly rubbing a chin; I must put here an obvious objection that you overlook as if I knew your rejoinder to it, but I do not, says a grimace, perhaps with a hand rubbing hair for energetic emphasis. And so on. This may sound unusual to you. It is common. Consider the difference between a lecture delivered once to the microphone in an empty studio and once to an energetic audience. A good lecturer is open to constant feedback while lecturing. This is why lecturing to a public is more tiring than talking to microphones.

Study silent cues—from interlocutors and from a listening public alike—and learn to respond to them fast. However interesting a monologue may be, it is easily capable of boring audiences unable to help speakers adjust. However important such modes of tacit audience interference with the speaker’s activity are, they are very limited in scope. A listener indicates being lost because of an obvious objection; the speaker obviously thinks the listener knows how to answer it, but the listener does not. Perhaps the listener has no idea how to answer the objection; perhaps the listener knows of a few lines of defense but is unable to guess which one the speaker favors; perhaps none of the existing lines of defense seems favorable and the listener must know how the speaker copes with the situation. There is always the option of breaking the thread of the conversation for a while so as to handle the objection, or of asking for the listener’s credit and postponing the objection to a more appropriate part of the discourse; and no a priori rule can tell which procedure is more advantageous, since that depends on many circumstances and many peripheral factors.

How can a sheer grimace express so much? It cannot. Therefore, very often the speaker who observes such a grimace has to make a snap decision: whether to ignore it, to divine what the objection is and take it in stride, or something else. Speakers may easily goof. A pity; for the simplest, easiest, and sanest procedure is to invite interruption. The rules of polite conversation thus waived, the discussion can go on peacefully. Once you realize that, you learn which interruptions are useful, which are obstructive, and which are redundant. When in doubt, consult the speaker. Moreover, you can always request an interruption before embarking on it.

It is my habit to begin my public lectures with a request for interruptions. I always try to welcome interruptions by stopping talking, no matter what I am saying and how important it is. This, admittedly, may cause the loss of an important item, even the complete loss of the thread. This last event may happen even without interruption. When you talk, privately or publicly, and you feel that you have lost your thread, or that you are going to lose it, or anything like that, do not hesitate: start the discussion or lecture all over again; at once. The right opener is, we were discussing the question, what do gremlins like for breakfast? Remember, when you are going to lose the thread of your discourse, your audience has probably lost it already.

An old hand I knew was a first-rate expert in divining reactions. For years he was unfailingly a most delightful dialectician, whom one could easily guide with sheer facial expression, and he would go fast, slow down, pick up objections—all in perfect accord with his listener’s tacit wishes. Then his audiences changed with the change in the climate of opinion in his field. All of a sudden, he could become a bore; he became handicapped and limited in his repertoire; he did not know why; he became irritable; he was on the decline. A similarly talented old hand I knew spoke to an audience of listeners educated in the Chinese tradition. They were unable to comprehend anything he said. They misled him all the way. He said he enjoyed talking to an audience that was so sympathetic. When this was about to happen to a friend of mine, I warned him. To test my claim, he injected a joke into his lecture with no hint at it; no one laughed. Without missing a beat, he started all over again and won my admiration.

Occasionally, one may see a small group of scholars engaged in full-speed, high-powered conversation. A younger fellow stands on the periphery, gets naturally excited, and joins the activity. Naturally, the novice is prone to goof. A remark or a question in violation of the rules of the game or touching a point familiar to all the rest may irritate. It is a moment of minor crisis: to stop and debate what to do is out of the question, since it would utterly ruin the atmosphere. One member of the team obviously has to make a decision—either to lower the level of the discussion and introduce a tone of patience, or to disregard the youngster—by a momentary facial expression. Whichever way the dismissal happens, the youngster is bound to feel slighted; and often enough this will breed severe inhibition. A pity. You should take such a goof in your stride, and take it up again the next time you meet the individual who has slighted you.

In my classes, lectures, or seminars, I always encourage discussion and interruption, much of it in order to be able to select the better and more pertinent remarks and pursue the set discussion. I do not take all interruptions on an equal footing. I am particularly anxious to explain to a student whose remark is not fully taken account of that this should not make anyone feel let down; it is better to think of an improvement of a poor remark and repeat it on the next occasion. At times, this is helpful; at times, however, it reinforces the inhibition. This causes me great trouble, because in every class there are a few uninhibited fellows—not always the better students—and they soon become dominant in class and reinforce the inhibition of their inhibited classmates. This is a real nuisance: I do not want to inhibit the dominant members, but their dominance has to be curbed, even when they do happen to be the brighter fellows in class, since they reinforce the inhibitions that I do not know how to overcome. True, no matter how shy and inhibited, when in the company of a willing teacher or colleague, students eager enough to learn will sooner or later come forth with questions, objections, and ruminations. Yet the opportunity does not always present itself. This is a pity, since any victory over an inhibition may yield spectacular results. There is no rule here that I know of, and no guarantee. Still, some guidelines may be useful on occasion. For example, the dominant students may not notice that they inhibit their shy peers; it is possible to draw their attention to their being bullies. At times, the bullies instruct the shy ones. You should stop everything and discuss their instructions with them openly: this may yield spectacular results.
 

5.2

The inhibition of developing one's interests comes in a variety of ways; not least important among them is the inhibition of free and enjoyable reading. The inhibiting rule is: read a book thoroughly, cover to cover; never skip; especially, do not skip a passage that requires hard, concentrated work—it is good for your soul and it is good for your intellect! Only God knows how many students suffer so deeply and so unjustly from their inability to finish reading the textbook. The reason for this very common malady is obvious. Conditioned, the student aims at completing a job; like most conditioning, this rests on threats of penalties for failure, especially through overgrown senses of guilt, incompetence, and frustration. Yet the job is impossible. Textbook writers are usually at least as neurotic as their readers are; they, too, undergo conditioning to do the impossible. Writing a textbook these days may mean giving up all hope of making a name as a researcher, as a fully-fledged, able-bodied academic. Only top dogs can write textbooks and retain their academic status in their colleagues' eyes. Textbooks must then reflect their authors' ability to keep investing mental energy in research while writing a top-notch textbook. They try then to show that they know everything worth knowing: they are up-to-date in two (interconnected) respects. If they wish to be didactic on top of this, that is their added task. What a big task this is, you have no idea. I have met a few textbook writers who said frankly that had they known what a headache the venture was, they would not have started it. A compromise is the obvious way out; and the simplest and easiest compromise is having the textbook begin as teaching material and end as results of research. The last part of the book thus aims not at the common reader but at colleagues—to show them that the author has not lost touch while wasting time on a mere textbook. Fortunately, this is not always so. Some authors are educators: they consider it good for their readers to see how far behind they are and how hard they have to work in order to become experts. Intended or not, the result is torture.

Dependence on teachers is a heartbreaking phenomenon and a spreading disease. Allegedly, the opposite of dependence is independence. The great idea of interdependence is thus out of consideration. Not learning the art of cooperation, you find it frustrating and humiliating and depressing to be dependent, and you become increasingly dependent the harder you try to follow the accepted prescriptions. The independence they teach you is a romantic ideal. You need superhuman resourcefulness, it says. You try hardest. You then bump against widespread taboos. Breaking them isolates you. This is a trap to dodge: be resourceful in simple things: in the choice of courses, teachers, colleagues, and reading material. When choosing reading material, glance at diverse items, invent some techniques of your own, and examine classic texts in your field. And, above all, find partners in study.

Romanticism is strict conservatism with a modification: rarely, we need revolutions, and utterly independent geniuses to conduct them. Beware of this idea of (utter) independence. It is dangerous, since its intended thrust is to discourage almost all. The exceptions aspire to be independent at all cost: to that end, they pass ordeals: they stay in the desert for forty days and forty nights with no food and no water, and, worse, with no company. To become independent one must be a Moses, an Elijah, a Christ: geniuses avoid their own societies; they forge the rules of new ones. By the rules of the old society, a genius is alienated (crazy), to find vindication by the rules of the new society. (Romantic philosophy is relativist.) Nobody is that independent: not even a Moses. Independent people have independent company to support them, especially in a tough moment, or after a big failure.
 

5.3

Romanticism divides us into the submissive and the original. Your trouble is obvious: a budding academic, you do not want to be submissive and you do not know how to be original. This is a dilemma to dodge: choose what to study and enjoy your studies.

For a budding academic, the most important choice is of a teacher—of a mentor, really. It is both personal and intellectual. Of the personal aspect of the choice, I say nothing. You either like the people you associate with or not. If not, you need to change your lifestyle. As for the intellectual aspect of the choice, you need an interest—a dominant one—and you need to skim the easily available current literature for a potential teacher. If you have no such interest, my advice to you is, get out of Academe as fast as you can. If you have one, begin by looking for current texts on it that you like.

How can one choose a text? If one knows what it says, one need not read it; and if not, one chooses blindly! The generalization of this is Socrates' paradox of learning (Menon):[36] if you know what you wish to learn, you do not wish to learn it, and if not, how will you recognize it when it comes your way? Even if you feel satisfied, how do you know that you have met your original quest rather than another one? Some philosophers think this problem is not serious, since we often know what we want to learn—electrical engineering, say—and achieve it. These philosophers could just as well say there is no problem of how sea turtles and salmon know their way to their breeding grounds or the way back to the open sea, since we know that they do arrive. Indeed, just as most biologists think sea turtles and salmon have inborn knowledge, so did Plato (or was it Socrates?) think that all human knowledge is inborn or a priori (namely, prior to experience, namely, logically independent of it).

Philosophers rightly dislike Plato's solution. Yet they have no other. So they tend to pretend that the problem is not real. Without much ado, I shall give you the two other extant solutions to the problem of how we know what we want to learn.

Bacon's solution is ingenious. We do not want to know anything in particular—we just want to know the truth, the whole truth, and nothing but. We collect facts most indiscriminately and let them guide us wherever they will. Unfortunately, however, he was fantasizing: one cannot collect facts indiscriminately, and facts do not guide. Bacon's solution is historically most important. He was aware of the problem he was solving; he used his solution of it as a major argument in favor of his view of learning as utterly passive: Nature does not deceive; the fault for our errors is ours. There is a pragmatic aspect here: faith in passivity—at least yours—is rampant. You are clobbered into passivity, yet you must be creative: just acquire some simple information, and then you can be off on your own quest.

Popper's alternative exhausts the list of solutions to Socrates' paradox of learning. As long as learning is fun, it does not matter that what we learn is possibly not exactly what we had initially wanted. We have a question. We want the answer to it. Perhaps we will never know the true answer, but we do recognize answers to our questions. Once we find an answer, we put a new question: is this answer true? If it is false, how can we show it to be false? After showing a given answer false, we may look for an alternative to it. This is a possibly endless job, and it is fun; hopefully, it is progress, but never mind that now: this section is personal, remember.

An example. You take a dull course with an unreadable reading list. What should you do about it? Do not read dull material. It is harmful. There are ways of replacing dull reading with exciting reading and ways of trying to liven up material. And do not prepare for an exam unless its outcome determines your fate.

Exams—yes, we must digress a bit—are not ends but means: means for professors to find out whether you have done your homework. The sanest exams I know of are driving tests. These have a clear end: to get dangerous drivers off the road, and they have empirically proved useful this way. Testers wish to know that candidates operate safely under normal conditions. For that, their presence should have a minimal effect. So candidates should ignore their testers altogether, except for listening to their instructions to turn right here, left there, reverse into a parking position. If scholarly exams are to be sane, they must be of the same ilk. Students are told to write their exam papers for their instructors. This is as insane as it is abnormal: normally, students cannot instruct their instructors. Supposing you can instruct your instructor, you should conceal this fact, since examiners show no gratitude for such an affront. Such cases are rare: in my whole career, I met two undergraduates who unwittingly instructed their instructors. They did not do well. This is a terrible injustice, but it is relatively easy to avoid. In a sane exam, your instructor peers over your shoulder while you operate normally. What, then, is the normal operation? Explaining the material to your peers or to your juniors as best you can. If you have studied well, and if you are versed in writing, you need not worry about exams. This is how things should be and can be. They are not, largely because preparation for exams is usually confused. There are exceptions. In practical exams (laboratory work, singing, occupational therapy) and in rigorous or computational exams (mathematics, social statistics, mathematical physics), students produce in the exams samples of what they were doing during the courses. This may also apply to other kinds of cases, perhaps creative writing or musical analysis. Most courses on Campus share one most incredible incongruity: students are supposed to read for written exams.

This incongruity usually requires memorizing. This is the chief reason that so many students try so very hard to remember as nearly by heart as possible: the end of learning by heart is the ability to repeat; and if you know a text by heart, it is equally easy to repeat it by reciting it or by writing it down. That memorizing should so control studies merely because of the incongruity in our examination system is monstrous. Professors do complain that students repeat to them what they have said in class. This never ceases to amaze me. They force students to memorize, so it is not surprising that students wish first to remember and then to forget. By now, students know that exams are futile. When a student contests the grade of an exam, nothing is simpler than repeating it. Yet both parties reject this solution. Increasingly, students support their complaints with the aid of lawyers! There will be no improvement of the system until the purpose of exams is clear and operative. This is most unlikely, because in practical matters, to repeat, the problem scarcely arises, and in other matters the purpose of exams is not clear; it is merely to justify inequality, to keep some clubs exclusive (Steve Fuller).

This concludes this digression into the traditional system of exams. I have placed it here for a purpose: my recommendation to you regarding preparation for exams is the same as my recommendation to you regarding reading during the course: take care to avoid boring yourself. Do not prepare for exams: failing them is less costly than harming yourself by depriving yourself of the rewards of the love of learning.

A few more technical points. Reading and writing should intertwine. It is all right to underline a good passage in a book, especially, as students do in the U.S.A., with magic markers. However, if you leave it at that, it is pointless. You can always build a short essay around the passage: say, write down the question it comes to answer, praise the question, and then praise the answer: why is it cogent? One way or another, you must write daily: like letters to your Aunt, two or three clear and friendly pages. Of course, she is interested in what you do, so it is easier to write for her than for a learned journal whose readers are too busy to read and too learned to be instructed. You can start with a sympathetic audience; the techniques for capturing hostile audiences are not necessary for a beginner; you can develop them later on if you must.

Introductions to textbooks usually address teachers; students habitually skip them. That is regrettable. Read the introduction carefully. Look the author up in Who’s Who or in the Dictionary of American Scholars or in the Dictionary of National Biography or in the Britannica or the Americana or Larousse. (They are all on the Net.) Write to your Aunt, if you can, or to your roommate if you prefer; explain why this or that scholar has invested years in the project of making just the textbook that your professor has prescribed for you. Try to empathize with that scholar and with your prof. Try to find the chapter, the page, that the author very much wanted to write and why it appears in this textbook. Choose between two options: to study this page or to ignore the book.

Make writing a good habit. You can get exam questions from previous years—the departmental secretary will have some, or your instructor—at the very beginning of the course; choose the questions that strike your fancy and answer them at once. Of course, you do not know the answers, but write what you can. Use your books freely to check what you have written, and rewrite it. Of course, at first you cannot write well and clearly. Use an audience, and ask your volunteer readers to tell you (1) what they did not understand, (2) what bored them, and (3) where they felt they had an objection that you should have handled but did not. Correct your essays accordingly. This is an easy kind of exercise—except that you may be inhibited, in which case breaking the inhibition is of supreme importance for you—and it is profitable for you as a student and more so as an academic. It is the best way to launch a joyful academic career.

Well then: why have you chosen this course? Why does the system impose this course on you? Try to write an essay on this. Look the subject up in a general introduction, in an encyclopedia, or in the biography of its originator. If it is an old subject, glance at an old introductory text or an old encyclopedia: Britannica or Americana or Larousse. Write up what you have found or what has occurred to you while reading. And try to write questions, answers, and criticisms regularly.

Look up various texts; to begin with, do not read any of them; glance into them, appraise them, write about the ones that appeal to you, and then check what you have written by further perusal. Then read carefully what you have chosen to read—and do not forget to stop well before it bores you. As usual with problems of choice, a standard optical illusion makes them look more difficult than they are: take care to notice the options to avoid. After weeding out the unacceptable, you can choose any way you like and change your mind with ease.

Weeding out is often next to impossible, however. Only one in one hundred books is worth reading carefully, and of those only one in one hundred suits you right now. Hence, you must make a job of the search for it. Go to the undergrad library. You will find there a score or two of candidates; give each one five minutes at most; this means an hour or two. Give two to four of them another fifteen minutes each, reading a page here, a paragraph there, consulting the table of contents and the index, absorbing the preface or the conclusion; this means another hour or so. After a morning's work, you come up with the one book out of the dozen or two that you like best. Write an essay on your experience. Do not worry: you can always throw into the wastebasket any essay of yours that you dislike. (It is an important empirical observation that the writing inhibition comes together with the inability to throw away a written page. This is why the computer is very useful for the inhibited: the delete button is a great relief.) Read only the books that appeal to you. You can also pick your essay out of the wastebasket and read it carefully, with a red pencil. Do not blush! We are all stupid! You can pretend you are an instructor reading an essay by some unknown student. It is great fun. You may find it profitable to rewrite the essay, or to argue with it, or to burn it quietly. Do as you wish. It is good for you. It builds character.
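If it amuses you to see the time budget of this two-pass triage spelled out, here is a minimal sketch in Python (the function and its defaults are mine, purely illustrative; all figures are assumptions taken loosely from the paragraph above, not a prescription):

    # A toy time budget for the two-pass library triage sketched above.
    # All numbers are illustrative assumptions, not rules.

    def triage_minutes(shelf: int, shortlist: int,
                       first_pass: int = 5, second_pass: int = 15) -> int:
        """Minutes spent skimming `shelf` books, then rereading `shortlist` of them."""
        return shelf * first_pass + shortlist * second_pass

    # A score of candidates and three finalists: 20*5 + 3*15 = 145 minutes,
    # that is, roughly a morning's work, as the text says.
    print(triage_minutes(shelf=20, shortlist=3))  # -> 145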

If the undergrad library does not have a single text that appeals to you, do not worry. The task was worth it nonetheless. Go to the main library and do the same. Perhaps there is no good text on the subject; try an anthology; try the learned press. Struggle as long as you find it fun to do so.

No, there is no guarantee of success. Try as I might, I have never succeeded with the exercise I am suggesting to you when I applied it to the most popular topics in contemporary philosophy. I have executed exercises of this kind, and I have published some, in a dozen different fields. Although they led me to little, I found them all enjoyable and fruitful, whether in education, in the social sciences, in the history of science and medicine and art and culture, or in branches of traditional philosophy; some of these are published in leading periodicals and have been cited over decades. I have little training in physics, and I have almost never written a survey of up-to-date material in any science proper, but I am going to: I have written on one topic in physics that excites me, and as soon as I am done with you I hope to return to it and write a survey of it; if I see any success, I may try to publish my results. That will please me no end.

In principle, there is no difference between the work of the novice and that of the established, except that it is harder for novices than for the established, since they are still acquiring scholarly techniques. Novices have much more fun: they discover new and tremendously exciting things almost every week, whereas the established progress much more slowly. The best scholars I knew did not develop half as fast as some of my students did. It is consoling to observe them grow fast. Why should a novice not play the same game as the scholar? Why is it permissible in chess and in tennis but not in learning? In music training, much time was spent on five-finger exercises; a growing consensus now joins Debussy and Ives and Bartók in finding these more harmful than useful.[37] Not so in scholarship, which still awaits its Debussys and Bartóks to write exercises that are not boring. Why? To this, the answers are ideological and psychological. A novice tennis player learns how to lose a game; a novice scholar learns that every lost game is a disgrace. This is why. So forget all about disgrace and study for fun.
 

5.4

Scholarship, more than any other activity, is a matter of trial and error. You have goofed, you do goof, you will goof. So do not worry about it; just try again. Find your interests and pursue them regardless of anything you can possibly disregard. Use books, letters, conversations—anything—if it may advance your interests. Be resourceful and accept your failures gracefully. The way to do it is by never concealing past errors. Also, the simplest way to avoid finding yourself stuck is to stop whenever you fear that you may find yourself in that situation. If you suspect that you may be getting into a quagmire, stop. Stop well before your work gets dull and oppressive; try different approaches, try some changes, consult anyone around; random changes are better than being stuck. Better stop too early than too late! This is sheer hygiene. Elsewhere, stopping too early may mean losing an opportunity; not so in matters scholarly: you may hope to return to an interesting study later.

The standard response to this advice is that following it gets you into total inaction. People who say so suggest that this is so obviously true that there is no need to explain it. I have one difficulty with this answer: you can follow my advice for a day or two, even a week or two, and then forget it, and no harm done. Yet these people often refuse to try this advice even for a short while. Do they fear that it will work?

Personal this section ought to be, but I do not know you personally, and so I am constrained. Perhaps you are eager enough and pliable enough that my advice thus far will suffice for you for a while. Perhaps not. Perhaps you have already fallen into some disruptive habits. Perhaps you are a pedant. Perhaps you are ambivalent about your work. Perhaps you cannot work except in utter silence, but some noise accompanies you wherever you go. Perhaps you cannot get exams out of your head yet do badly in exams out of the very fear of them. There are a thousand and one idiosyncratic agonies, and I cannot even list the more frequent and important among them. How then shall I help you?

My apologies. I never thought I could advise everybody, and I discussed this point with you extensively in my introductory part, as surely you remember. If you have skipped it, then perhaps you would care to glance at it now; perhaps I should have placed it here instead of in my introduction. Sorry. I can give you only some general idea here, and let you hope for the best.

One of the worst symptoms of bad habits is the strong propensity to play with fire. I should say, the propensity to test one's ability by putting oneself in the fire. This too is not quite the right wording. We are ambivalent about our ambivalences, Freud has taught us. Hence, we do not even know whether we want our idiosyncrasies altered or not; so we do not quite plunge into the fire. In our ambivalence about exams, we do not just take too many; we go where too many of them are likely to be imposed on us. If you must have silence for studies, you are drawn away from both noisy places and utterly quiet ones; you find yourself in places that might be quiet but are not, and you get irritated with the world for its noise and with yourself. This is the pattern of neuroses, said Freud.
 

5.5

Getting rid of our neuroses may be too costly. We can learn to live with them. Mental hygiene may be sufficiently effective for that. Mental hygiene is important both as a preventive and as a start for a cure: never allow yourself to be under pressure, and see for yourself! The reason you waver is ambivalence. We justify our ambivalence by a false theory of the will. It is this. You can will to have a strong will. Once you have the will to have a strong will, you decide to exercise your will until it strengthens. The essence of the exercise is to provide yourself with ever-harder tasks and master them. So much for the false theory. It is as vile as it is silly, yet it is most popular and widespread. Ignatius Loyola, I suppose, was its best advocate; the excuse for him is that he lived before Freud. Moreover, he meant it; rather than play with fire, he plunged into it; and he created institutions that force you to plunge into fire—on the assumption that the fire will harden you or melt you. That he was successful is still evident in his organization; how dearly it cost we will probably never know: there are no statistics of dropouts from Jesuit colleges (the training takes a full ten years) or of breakdowns within the ranks of the order. Theoretically, you can say, Loyola excluded the middle ground: when you work so hard and contemplate hell every day for a stretch of half an hour at least, you are hard—perhaps until you collapse.

Loyola was consistent, systematic, and pre-Freudian. Most people are none of these. The better schoolteachers often display an approach that is more effective than that of most western people. They do not expect their pupils to have strong wills, powers of concentration, or any preoccupation with these. They often try to create circumstances and incentives leading children to voluntary actions that help them develop. It should be nice and easy—indeed, if it is not, then you have to change your approach. This is not the Loyola way, but it is as near to it as a soft approach can be.

Approach yourself as if you were your own schoolteacher. If you think it childish to be a teacher and a schoolchild simultaneously, then you are mistaken: it is detachment, and detachment is the very symptom of maturity. This holds also for the ability to handle your problem student (be that yourself or someone else) and for the ability to consult others about it. The difference between a child proper and a college kid is that one and the same problem signifies differently for the two: the school-kid has an undeveloped response-pattern, whereas the college-kid has an inadequate one, one that handicaps. Disregarding inadequate response-patterns is of no use. A response-pattern is built in for a purpose, and it is cleverer than the conscience that tries to suppress it or fight it—with a strong will or otherwise.

Now I have done it: when I spoke of you as both a teacher and a student, I did not quite recommend splitting your personality—no more than the split between planning actively and reacting spontaneously. When I speak of your conscious battle with your subconscious, I did (follow Freud and) split your personality—claiming that you (your ego) have one aim and your subconscious another. Freud also compared the conduct of the subconscious to that of a pupil thrown out of class, knocking on the classroom's door and disturbing the peace. Some people did not like this analogy; they frowned at Freud and scolded him. He took it lightly and said, this criticism is not serious: having two conflicting ends, we suppress one, thus, figuratively speaking, becoming two distinct persons. What of it? I dare say, in this case he was right to take a criticism lightly.

I take his metaphor even more literally: neuroses are internal police officers. Now, always do the opposite of their instructions; this will force them to commit suicide. Freud distinguished three parts of the self: the super-ego, the moral convictions as accepted (uncritically!) from the environment; the ego, the coordinator; and the id (or the it), the (animal) motive force. This, however, is a different story. It does not accord with his distinction between aims and co-ordinations that are conscious and those that are subconscious. To allow for the two, we have to note that the conscious and the subconscious each partake in all three levels: super-ego, ego, and id. In particular, when we have a subconscious propensity to punish ourselves, its motive is a morality, not an animal instinct. This, incidentally, is the criticism of Freud that his disciple Alfred Adler voiced. Freud passionately resented him and all of his output. A shame.

The subconscious that prevents you from being a good scholar rests on old-fashioned theories of morality, inculcated in you throughout childhood and adolescence by parents and teachers and relations and family friends. They all recommend drills and traumas. The "real you" is your internalized version of your oppressors. They humble you because a scholar should be humble; they force you to learn long, boring tracts by heart because scholars do that. By Freud, your subconscious is cunning and effective but stupid. You have to beat it on its own ground. When your subconscious forces you to be humble, kill it by bragging excessively. If it interferes with your studies in any way peculiar to your specific case, just lay everything aside and go read a novel or watch a movie or something. Correction: do these things five minutes before the assault of your subconscious (upon its preparation, to use fencers' jargon). To this end, you need some familiarity with the behavior pattern of your unconscious self.

Freud or no Freud, the pressure system in high schools and universities makes no sense. Forcing a kid to study and instituting penalties for watching a movie instead is an admission that movies are more enjoyable than studies. The exclusive choice between entertainment and study is similar to the exclusive choice between junk food and health food. Old-fashioned educators recommend flogging kids who want to live on sweets alone; most contemporary ones will say, let them eat sweets until they come out of their ears.

In principle, they say, you are right, and it is our job to make you right in fact; but it takes doing. Kids, they say, naturally prefer comics and movies to books and discussions and we must make them develop good taste. Left alone, they say, kids will never choose serious studies.

This argument is not serious. Reporting evidence that refutes it will be of no use. Even statistics to that effect will cut no ice. Arguing against them is tiresome and useless. There is a kernel of truth in what they say, and they will cling to it. Our elders and betters will see in that kernel a justification of their insistence on cruel medicine. Their picture is incredibly exaggerated, but this is not easy to show. They justify their exaggeration by blaming our animal part, the one that Freud called the id. To fight the id, they destroy the sense of fun, including the pleasure of doing good and of studies. Alas, scholars share the blame, since they sigh about the burden of their work even as they enjoy it and escape to it from all the unpleasantness of daily life, its problems and headaches and chores. I never cease marveling at the constant complaints of my peers about the burden of work that they claim to suffer. They are not strictly hypocrites; they are just muddled about themselves and not very brave, at least not when it comes to small matters. The sighs that they release with the regularity of public-relations bulletins do spoil the fun, but this need not interest you. Your interest is to prevent these sighs from deceiving you. The damage they cause is the destruction or masking of the pleasure of study. No matter what, remember: the only good reason for joining Academe is that study is fun. If it is not, do yourself a favor and seek a different means of livelihood.

Some schools avoid pressure and thereby disprove the standard argument for it. We may overlook this, but not without debate. We should have public debates about the need to force students to work; we must see to it that whatever we administer to them serves a very important purpose and that they cannot as yet learn to do it voluntarily, without pressure. Much of what we impose on students for reasonable ends we should instead suggest to them, with arguments that make the work not a chore but a pleasure. Remember: pressure may cause harm even when justifiable. When force is inevitable, explaining this to students helps a lot. Moreover, no matter how right we are in administering a necessary evil, no matter how necessary the evil we administer is, it still is evil. Never conceal this fact. Finally, students allowed to do things without pressure and without proper training may fail; this fact the establishment repeatedly offers as justification for imposing boring training under pressure; yet failure renders training meaningful and thus pleasant and easy and better understood. They say the curriculum is heavy, so that there is no time for failure. This has met with ample empirical refutation: trial and error is most efficient and its results are the best. Avoiding dull work is the quickest way to high degrees of efficiency!
 

5.6

What is the training for independence? I said: discussing available options with others. Yet the paradox remains: one needs independence to develop independence. Fortunately, the little independence that normal members of western society have suffices to begin with. Examples: Pretend that you are writing for publication, but do not try to publish until later. Pretend that you are a concert pianist or a composer for the Philharmonic, but do not try to come out into the wide open. Take this not as playing out ambition, but as playing a game. Children learn to play chess almost exactly as if they were champions. Fencing masters insist that from the very start novices should follow the same rules of the game as those accepted in international tournaments and in the Olympics. That is the general idea. Students who play at being colleagues are up against their own subconscious, their friends, their instructors, and the university system of rules and regulations. Ignore all this. Play scholarship now—like chess, fencing, tennis, music.

This would be good advice, you say, were the system not opposed to it. What can one do if one simply cannot avoid sitting for a multiple-choice exam? What can one do if one must study Aristotelian logic or obsolete computer languages?

You are right. My advice is no magic wand. To answer your objection, then. One may improve one's score in multiple-choice exams by sitting for them repeatedly—as many times as regulations allow—and by following simple, obvious rules, such as: answer a question randomly unless error is penalized. Yet this is no solution to the problem. I suggest that you seek the available tools: the system has many hidden latitudes that you will not find unless you ask; a lot of people, from secretaries to department chairs, not to mention official advisers, will help you break various rules, find options you did not know existed, and create new ones experimentally, but only if you ask. What you have to learn is to empathize with those who made the rules you hate and those who are in charge of imposing them on you. If you empathize with those who have set the rules and see their reasoning, this will give you enough of a start—indeed so much of a start that the vague general knowledge of the subject that you already have should suffice to get you a good C. Generally, what blocks you is the stupidity of the system; its saving grace is the same stupidity.
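The arithmetic behind the guessing rule is simple expected value; here is a minimal sketch in Python (purely illustrative; the scoring scheme and the function name are my assumptions, not anything an examination board prescribes):

    # Expected marks from a random guess on a k-option question,
    # with `penalty` marks deducted for a wrong answer (assumed scheme).

    def expected_guess_score(k: int, penalty: float = 0.0) -> float:
        p_right = 1.0 / k
        return p_right - (1.0 - p_right) * penalty

    # With no penalty, a guess gains on average, so guess; with the
    # classic penalty of 1/(k-1), the expected value of a guess is zero.
    for k in (2, 4, 5):
        print(k, expected_guess_score(k), expected_guess_score(k, 1 / (k - 1)))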

I have seen too many potential academic careers disappear due to the good will of rigid instructors. It has pained me to see how seldom students try to avoid registering for lecture courses that are not to their taste, how many students do not know of the option of taking reading courses instead, and how many students give up studies over obstacles that are easy to overcome, perhaps with the help of a faculty member with a pinch of good will. Doing poorly in exams is an example. As an instructor, you should prescribe writing papers instead of exams. Otherwise, make exams open-book. Prepare them in class. And start the semester by discussing grades in detail.

Why students do not refuse to take surprise exams I do not know. As a graduate student, I had to take a qualifying exam, since my second degree was in science and I planned my third to be in the arts. I refused. The academic registrar of the great University of London wrote to me, disqualifying me from any degree in his university. I wonder if he remembered this when he wrote to me granting me my Ph.D.—in science, not in the arts. This weighty change made not the slightest difference to my career.
 

6. Social Adjustments on the Way to Becoming a Dialectician

Educationists do not trust their pupils to understand what the system expects of them. So they drill them and appeal to their fear of the very thought of deviation (Kafka’s Letter to his Father, 1952). The success of this kind of education amounts to the loss of autonomy (Kafka’s The Judgment, 1913). The received education system is successful: it sacrifices autonomy on the altar of social adjustment.

Common as social adjustment and maladjustment are, central to education as they are, central to the diverse social sciences as they are, they have evaded the eyes of philosophers, even of the commonsense variety. They fill the philosophy of life of many populations. A philosophy of life teaches how to overcome weaknesses so as to improve integration. Its best expressions are proverbs and bons mots of all sorts, often incorporated in fables and stories and plays and novels. Highly intelligent as they are, they are often too conservative. It is therefore hardly possible to comment on them without reference to some social criticism. Socialists have a traditional expression of contempt for people who are critical of the society within which they integrate well: champagne socialists. This displays a serious dilemma: you do not want to be a conservative and you do not want to be alienated. The solution to this dilemma is democracy: abide by the law while making efforts to improve it. How is one to adjust to democracy? How does everyday life in Academe express its democratic character?
 

6.1
Academe is an aristocracy—a true aristocracy, an honest aristocracy, a tolerant aristocracy; it is wonderful. Cicero said, senators are good people; the Senate itself is an evil beast. It is tempting to say the opposite about Academe: the university is wonderful; some professors are not. That would make your efforts to adjust easier in principle and harder in practice.

The parvenu tries to imitate aristocrats, to adjust to their roles. The more successful the effort, the more it earns ridicule. For this, the parvenu blames the aristocrats' closed shop. There is truth in this. Academe is still the most open aristocracy. Parvenus always miss the point, whether they try to be as sober as a judge or as drunk as a lord, as gentle as a gentlewoman or as brash as a gentleman. The point they miss is this. An aristocrat does not give a damn about what the parvenu cultivates: impressing others, seeming to be this or that to the hoi polloi, explaining oneself, one's actions, one's motivations, one's good intentions, the reasonableness of one's errors. In brief, take the parvenus, teach them not to give a damn, and you have aristocrats. Even if they learn not to care much, the moment they show they care they show themselves failures (Maugham, The Razor's Edge, 1944).

Throughout this volume, I have told you repeatedly that Academe offers terrific opportunities for a good life and that too many academics miss these opportunities for fear of ridicule. So this is my point in this section, about social adjustments on the way to becoming a dialectician: ignore the ridicule. My personal disposition is to speak with new acquaintances about the fields of their expertise. This exposes me to their ridicule, and yet their corrections of my worst errors help me familiarize myself fastest with their fields of expertise. So all I want to tell you in this section is that you should not care a damn. All sorts of small pressures will be put on you for your own good; your teachers will be deeply disappointed in you; your colleagues will view you as ambitious and insolent; your students will be miserable and bewildered; your administrators will find you unmanageable—all just as soon as you make the first attempt at becoming a dialectician. Yes, dialectic is the root of most academic hostility. I should know. Think of the upside. If you are a dialectician, you will be left alone with the very minimal duties, subject to your kind approval and consent and correction; your teachers will be proud of you; your colleagues will be happy to enjoy your eccentricities; your students will be euphoric at being allowed to register in your courses; and the administration will blow the horn every time you go to town. This is so obvious, how can one write an interesting section on it?

There is only one small point to add; it concerns the transition. When you are a trained and acknowledged dialectician, you will have your problems and tackle them dialectically—more successfully or less, but in your own way, while learning from your mistakes and consulting friends and colleagues critically. Today, however, not only are you poorly trained, not only do you not know how to pose as an aristocrat—a dialectician—but you have a plebeian reputation to shake off. Not that you care, but you do want the proper treatment. For that, you should never care about fashion. Do not ever take it seriously; in particular, try hard to avoid arguing against it. And see to it that peers who draw your attention to the latest fashion know that they are wasting their time on you. This is not easy, especially since it leads to the unwanted reaction of all the friends and relations of the plebeian turning aristocrat: they try to return defectors to the fold—for their own good, need I say. It is very important for defectors to avoid yielding an inch to pressure while showing understanding and empathy toward those who exert it.

A change of location may help; even travel may; but it is too costly. Remember how many times you hoped to move to a new place and turn a new leaf, to no avail. Yet it is true: if you know how to turn a new leaf, then going to a new place may be nice. It is hardly ever necessary, however: once you succeed in turning a new leaf, they will all let go of you. Kafka said, people often refuse this friendly treatment (The Castle).
 

6.2

In the meanwhile, here are some simple suggestions.

First, cut out as many old commitments as you can without being irresponsible. Begin with those that are for your own good—like that departmental seminar on a useless item. If your professors can and will argue with you, ask them why you should attend it. If not, just tell them that you will pass. For this, it helps to know the rule on talking to plebeians: talk to them in their own language. Talk to all people in their own languages. A dialectician can talk several languages and often uses them simultaneously, making every significant statement twice—once in your own language and once in that of the opponent. (Joseph Priestley's writings are delightful: he talked phlogiston and anti-phlogiston equally fluently and did not care about terminology. As a teacher, he reports, he taught his students to argue for the views that they opposed, as honestly and as forcefully as they could.) Dialecticians may prefer to ignore the poor in spirit; yet when speaking to them—for any reason—dialecticians invariably use their language.

Your guides and mentors will be deeply hurt when you cease letting them help you. Make it as easy for them as you can. One way of minimizing the pain of separation is to shorten it by resolute action and by being impervious to all bribes and all pressure. They will first think you are raising your stake, and they will raise their bids accordingly. You should be sweet but adamant. They will try to catch you where you are vulnerable. When you goof as a result, admit it but do not yield. They may threaten and even apply sanctions. Ignore it all. Above pettiness, true aristocrats ignore penalties.

This leaves many problems open. You will soon handle them like a dialectician: with ease. A dialectician often meets uncritical peers—those who cannot be interrupted, who cannot change a subject even when you plead with them, who cannot let you go when it looks as if you hold the upper hand in a debate even though you tell them that the hour is late and people are waiting for you somewhere else, who cannot answer a simple question simply, who repeatedly answer with yes-and-no, who plead with you to be nicer in a debate and not be out to kill, who use too many ifs and buts and who waver otherwise, who complain about your misunderstanding of what they meant. Do not enter a debate if you do not enjoy it; if by mistake you have entered one, apologize nicely and stop as soon as possible. How to do it is a matter for your private style. Cut them short. Plead headache. Plead ignorance. Ask them to tell you about their hobbyhorses (Disraeli's gambit was "and how is the old complaint?"). Tell them repeatedly that things are not simple. Tell them you are much slower than they are. (I hope this is true.) Tell them it is too late. Invent your own style. They will be peeved all the same; but if they are willing to let you instruct them, then it is your sacred duty to do so.

Intellectual honesty is not synonymous with honesty. Poor wretches who do not know how to play by the book are not wicked rascals. Be gentle with them if you can, but do not try too hard or you will waste your life on it. When you argue, whether with a true dialectician or a prospective one, never allow yourself to be defensive and never care about impressions. Do care about helping the other fellow. If on occasion you goof on such matters, apologize and hope that your interlocutor forgives and forgets.
 

6.3

Here are three errors about misunderstandings and wrong impressions.

First, concerning fair or neutral terminology. It is customary to require neutral terminology. However, to expect neutral terminology from opponents in debates is to expect miracles.

Things are harder to manage in a public debate, where rhetoric may sway audiences. A clever demagogue may start a debate with the use of neutral terminology and sway nuance and overtone at a crucial juncture of the debate. The particularly clever will do so at a point where your protest against it will sound particularly piddling and unconvincing and defensive and apologetic. Do not protest. Do not protest under any circumstances. If your opponent uses words in a way that may matter, there is a very simple way of handling the situation, and it is just and humorous. Let your opponents have their say; be patient about it. Do not disturb them in any way. Smile at the audience knowingly. When the opponent has finished, wait for them to look at you triumphantly, then gleefully at the audience, then mockingly back at you. Start slowly, weakly. Restate the point they have just made as best you can, either in neutral terminology or in yours. Explain the last barrage of your opponent as sympathetically as you can, exhibiting an apologetic feeling towards their demagoguery but not commenting on it. Before you even start commenting on what you have just reported, you have won the victory of an honest-speaker-bulldozed-by-a-demagogue-to-no-avail. Do not believe me; try it for yourself, or watch others who can do it, and see how honest and how winning this is. I do not know if honesty is always the best policy, but here it is.

Second misunderstanding: leading questions. A paradigm of a supposed leading question is, "have you stopped beating your wife?" If you say yes or if you say no, your answer suggests that you are a wife-beater. This is in poor dialectical taste. In proper debate, you answer clearly and ignore the suggestion. The opponent may articulate the suggestion: "so you admit you have beaten your wife," and that is the place for your denial of it. Otherwise, your opponent is a bad dialectician. Do not argue with bad dialecticians. Withdrawing from a debate may suggest that you have lost it. This should not bother you: whatever you do and however you act, some fool or another is bound to have a bad impression of you. Perhaps so much the better: they will leave you to enjoy the company of true dialecticians.

Formal situations are different. In a law court, "have you beaten your wife?" is not a leading question: a leading question is not one that feeds you an answer with a certain overtone but one that feeds you certain information and leads you to answer the question the way the asker wants you to answer. When the representative of the party that has invited you to testify asks you a question, do not hesitate. In court, the representative of the other side tends to ask you questions worded to give a bad impression. This should not bother you: give your answer straight. Then your representative has the right to question you too (this is a sacred rule). The question may be, to follow our rubber-stamp example, "why have you not stopped beating your wife?" "Because I have not yet started." "Do you intend to?" "No." End of story.

(Your attorney may be a poor dialectician and neglect to ask you the question that dispels the poor impression. You can do nothing about it: lose the case or replace your attorney.)

We give poor impressions all the time, no matter what we do. Otherwise, we are saints, and that gives a terrible impression in a different way. If you worry about the impressions you give, you give the (correct) impression that you are pathetic. You should not care. Some questions are more interesting and more useful than this one. Do find them.

My third and last point concerning misunderstandings and false impressions is more technical. Perhaps I should postpone it to another section, but I am nearly there already, and it is a pity to waste an occasion to make such a central point just because it is less dramatic and more technical. It concerns talking at cross-purposes and genuine misunderstandings in honest argument. Much of the distaste for argument rests on the true claim that many debates are at cross-purposes and consequently futile. Hence, popular opinion goes, it is better to avoid argument whenever possible, and otherwise to confine arguments to experts, who clarify and define terms. Not so. If you do not argue you do not learn, and if you do not learn you do not become an expert; and clarification is an endless process, especially by the use of tedious, dry-as-dust formal definition.[38]
 

6.4

Most arguments fail because people regularly break the rules of debate. They are often unable to agree about procedure in advance. One argues against a thesis without having first asked one's opponents whether they would give up their approval of it upon meeting a counter-example. This is a very widespread folly. Experts in dialectics can do better with little knowledge than muddlers can with much. Bertrand Russell observed this when noticing that at times—not always—scientists can make better sense of politics than politicians can.[39] Most people do not concentrate on the question at hand, out of ignorance of what it is, of what debate they are conducting.

This leads to the question of how to clarify poor debates. When you attempt to put a messy debate in order, you are bound to hit upon essential obscurities. You may be angry with those whose arguments are obscure; you will then be unjust. They cannot clarify every point, and they do not know which point needs clarification, or which clarification might turn out to be essential. The rule, however, is simple: when a clarification is missing that turns out to be essential, explain this point to your interlocutor and go back in the debate to fill any necessary gap. When interlocutors do not cooperate, gently but firmly close the debate. When you do not know what step to take, try different alternatives. The general rule is always the same: treat an argument as if it is plain and clear unless reasons to the contrary turn up, in which case it is advisable to take time out for clarification. When an argument turns out to be less clear than one assumed, one may go back a step or two or three and try to clarify what seems obscure. The worst pest is the one who does the opposite, who systematically leaves room for further complication. There are many such techniques, but I shall mention only one—my pet aversion. "Have you stopped beating your wife?" you ask. "It is not the case that all beatings of one's wife are culpable," answers the pest. After half an hour or so, feeling a need for a retreat, the pest reminds you that your question was never answered and that you have only surmised an answer—wrongly, of course. My sincere advice to you is to avoid arguing with such a pest—or, if you must, insist on an explicit answer to every question of yours; and for that you must choose your questions carefully: avoid irrelevancies. If you do ask a wrong question, take it back as soon as you learn this. You may say, I surmise your answer is such-and-such; am I correct in my surmise? The pest may evade answering such a question and give a speech instead. After the evasive speech of your interlocutor is over, say again, I still surmise your answer is such-and-such; am I correct in my surmise? Do not lose your temper with a pest; it is much healthier all round if pests lose theirs. Repeat your question until the pest terminates the debate. Still better, terminate it yourself and pay the price. Best of all, do not start a debate with a pest unless you are obliged to. Otherwise, apologize and terminate the debate nicely.[40]

The previous paragraph rests on observations within a limited culture. Better and livelier descriptions you may find in Plato's early masterpieces: Protagoras, Gorgias, Euthyphro, Symposium. These are top-notch, unbeatable masterpieces from every possible angle, including the literary and the philosophical. If you drop this volume right now and go for these books instead, then you have got my message. After a while, you can come back to the next section. It will wait for you; meanwhile, enjoy reading Plato.

In all cultures, all arguments—all cases of give-and-take, whether in commerce or in friendship—rest on agreement. As debates usually display disagreement, they sound as if they are unfriendly. This has led to the response that only extreme cases of disagreement are unfriendly. It has led to the adage, you cannot argue against principles. (In American slang this is, you can't argue with a commie.) Not so: the readiness to argue is agreement enough. When you refuse an offer to argue, then you refuse to cooperate. Even this you can do in a friendly manner: respectfully. Even when you are respectful, however, some may consider your refusal unfriendly. It may be a rejection in Freud's sense; this is sad on many accounts; it also confuses good will with respect. Your cooperation or its absence has nothing in itself to do with respect. Respecting people while refusing their friendship—justly or unjustly—may sound humiliating. Our society has not rendered the teachings of the open society as explicit as it should have. We still view indifference with censure, and the termination of a partnership or a friendship as treason. Almost no one has yet sung the praise of the great virtue of indifference—towards anything but evil. Be respectful to all and cooperate with chosen friends or partners. The only cooperation required of you is in fighting evil. (Professors who ignore injustice to students are culpable.) When the cooperation is dialectical, you can and should be as choosy as an aristocrat; and, as a true aristocrat, intend no disrespect in any refusal of cooperation—unless you deem the refused evil—and you need not explain.

Recognition and public acclaim do not make one an aristocrat. Aristocrats determine their positions inwardly. Others may recognize them and make them powerful rulers or leave them as eccentrics. People may persecute aristocrats; failing to understand the indifferent manner with which aristocrats try to shake off hostility, the hostile may get infuriated and become vicious (Joseph Conrad, “The Duel”, 1908). To no avail: true aristocrats just cannot be bothered with pettiness, including recognition and acceptance and what people may say; they do not mind.

Academe, the home of eccentrics, of the unrecognized aristocrats, has recently gained spectacular recognition. It has rendered some true eccentrics a fine aristocracy: they partake in politics in their spare time, somewhat indifferently but honestly and a bit intelligently. Recognition also pulled in the hoi polloi, and these created its cream, its middle classes and its upper middle classes. Their puritan quest is for competence and for efficiency. They want to teach their students as much as possible and train them as widely as possible in order to enrich their reaction-patterns, so that they are not stuck in some predicament or another. There is nothing against competence and efficiency.[41] Rather, they are not enough: you had better aim at being resourceful—like a lord or like a tramp; train yourself to be resourceful rather than prepare yourself for an eventuality. The first step is to avoid consulting your intuition about how you should react and rather try some different reaction-patterns and let all hell break loose. Moreover, do not waste the day for the morrow that may never come. Do not prepare too much for an eventuality or there will be none. Enjoy your studies now and leave the future eventuality to your future resourcefulness; or leave it untended.[42]

Striking a balance between enthusiasm and cool-headedness is hard: the cool tend to be indifferent; enthusiasts tend to exaggerate. Plato indicates this in his early dialogues. It is no excuse for me, I admit: I should not be carried away. If they want competence and efficiency, let them. If you do not want to enjoy life now, why should I care? I hope that at least the last paragraph can serve you in some way in case you are struggling. For, it is honest struggle that wins general enthusiastic appreciation. Look around and find struggling people everywhere; help them: do not argue about it, especially not with pests who replace dialectics with competence. Argue with those who will not be peeved when you stop the debate for a while to examine whether it follows the rules of the game proper: they are the (potential) aristocracy. Like you, I hope.

Now do go and read Plato’s early works.
 

6.5

I have promised a discussion about ostracism. I do not think you need it, since it will take you years and years before you may meet the risk of ostracism and since I have already given you sufficient advice to immunize you against this risk. Still, there is here an interesting point, impractical though it might be. For, if my view of Academe as an aristocracy is true, then ostracism in it is impossible in a definite sense. Therefore, I seem inconsistent when I propose the aristocratic view of Academe while endorsing Daniel Greenberg’s sensitivity to ostracism.

It is the expansion of Academe and the rise of its technocracy and meritocracy, you remember, that has led to the invasion of the hoi polloi, of the parvenu, of the middle and upper-middle classes, with their petty award systems and sanctions and neuroses and demands for the up-to-date and for excess competence. For them ostracism is real; it is the two-edged sword that rotates over the gate to the Garden of Eden (Genesis 3:24). The two edges are the fear of ostracism and the ability to join those who ostracize. If you think you may face ostracism, then you are probably right. First, however, you must be a somebody. If you think you can help ostracize, it all depends: ostracism works by ignoring colleagues and thus penalizing them. Doing this to a true aristocrat is futile.

Consider ostracism as a phenomenon. Its source is an authoritative verdict that everybody who is anybody (what an ugly expression!) knows. The name-dropping parvenu says, I have heard Feynman myself, assuring me that there is absolutely nothing in the charges of Alfred Landé against the physics Establishment. Landé, a physicist of nearly half a century of peak reputation, met with ostracism. What did this mean in practice?

Breaking the sacred rule—avoid public expression of disagreement with colleagues—risks ostracism. You may welcome ostracism, as it saves you the effort of avoiding, without quarrel or insult, the company of colleagues whom you should avoid. As soon as it becomes clear that you are incorrigible, they will give you up: they will make you an exception to the rule. If you have aristocratic tendencies, you should know: this is the moment you are recognized. Official verbal recognition you will receive only when you are an old fogy, ripe for serving as a national treasure. The recognition you will receive, unlike the kind ordinary academics aspire to, is the stonewall of silence.[43] Knowing this prevents a lot of heartache.

Ostracism means that a dear father figure has fallen from grace. They have to look away when they see a Landé walk along the corridor, but they smile in a friendly and appreciative way when they see a Feynman. A sociologist with a flair could sum up all this with a long list of status symbols and status manifestations and status ascriptions and status hiatuses. Colleagues assure you that these items matter a great deal; I do not know why and what for and in what respect. They say, if you have no tenure, keep your nose clean. Perhaps. Granting tenure is a complex quasi-democratic process; few people have studied it.[44] I do not know whether my own tenure refutes the received idea of it or not: we need some statistics for it and we have none. Rumor has it that the departments whose members care about their scholarly reputations will offer tenured posts to scholars even if they are not famous for savoir-faire. If so, then you should brush up your scholarship more than your manners. For my part, I doubt it.

My explanation for the fear of ostracism is this. The drive for recognition is the obverse of the fear of rejection. This is a part of the sociological theory of the mentality that fosters and inculcates neuroses that drive people to work harder for emotional remunerations than for financial ones. Hence, ostracism cannot touch an autonomous person like Landé: he suffered his high academic status with ease and was slightly relieved when he lost it.

What ostracism does curtail is the broadcasting of one’s ideas. (Economically, academics are well off one way or another.) Broadcasting, however, has two aspects, petty and adult: that I want to be heard is petty; that I would like to exchange ideas with people sharing common interests, talk to people who enjoy my jokes, may be adult. Overlooking its petty aspect, you can see that the worthy ostracized would find it hard to have worthy colleagues and students, since true scholars are rare in any field of interest. The perennial problem of the ostracized is how to communicate with peers when the prattle of the hoi polloi clutters the communications channels. This is a serious problem. I shall return to it in a later section. For the time being let me say, the clutter is such that the added burden of ostracism is marginal.

Once you start lecturing, some channel is all yours. No matter on what level, you become an educator to your captive audience. This is a heavy burden. Beware of passing your defects on to your charges. They do not need them: they will find their own defects. Remember the supreme liberal rule: do not try to help until you meet with an explicit request. This is very hard but terribly important. You may offer help, but giving it unasked is usually immoral. It is easy to hear a request for help implied. Do not. Under no circumstances. Violating this advice carries untold risks to both sides and no benefit to either. I will not repeat this. Incidentally, in writing you may provide advice: all handbooks do that. Yet the written page never imposes itself on readers the way people do in personal encounters.

Communication channels are so cluttered that we would have lost all sense of proportion regarding them—many of us have already lost it—but for the fact that there are preferential treatments: some channels are given priority and some individuals have privileged access to them. I shall not digress to this important sociological point right now. Whatever the reason for which the Establishment offers a person privileged access, it may withdraw that access; this is how ostracism starts.

The problem of communication blockage is particularly bothersome to the ostracized who has a message for the public—like Landé, who battled the mystique of the standard quantum theory as it appeared in the university courses of the time, and as to an extent it still does. The problem is universal. Nor is it solved by privileged access. Einstein, the most ostracized big chief, never lost his privileged access; yet top-notch physicists did not study some of his latest publications. These turned out to be failures, or else the Establishment would have lifted the ostracism posthumously.

Bernard Shaw faced the problem of communication channels squarely. He decided, relatively late in life, to become a dramatist in order to have privileged access to the theatre as a largely unused channel (You Never Can Tell, 1897). His audiences enjoyed his jokes but failed to receive the messages (Devil’s Disciple, 1897 Preface) except on rare occasions (Androcles and the Lion, 1912 Preface). My way of joking is to tell the truth, he declared. It is the funniest joke in the world (John Bull’s Other Island, 1904). The audiences merely found in this cri de cœur an additional witticism. He gave up his ambition as too high (Major Barbara, Preface, 1907) and went on writing from habit and for fun (Too True to be Good, Preface, 1932). He collected high fees to ensure audience appreciation (The Millionairess, Preface, 1936). He poked sarcastic fun at the crowds, chuckling all the way to the bank—a glorious, delightful failure.[45]

You can do likewise. If and when you have a message, just broadcast it, and as appealingly as you can, but do not evangelize. (The evangelist, a frightfully middle-class sort of a fellow, is the one most prone to ostracism; Shaw, Parents and Children.) If you want to implement a reform, go about it intelligently. The best way to implement a proposal is not by evangelizing but by creating a small circle of sincere and brave and intellectually alert and agile people committed to it (Margaret Mead). When you have such a circle, you are immune to ostracism—even when the members of your circle are intellectually not quite top-notch but supplement this defect with devotion.[46]

This story is uncommon only because it can occur among uncommon people—the true dialecticians who have old friends wherever they go. Among those, the story is so common you would expect it to leak out. Once people are ready for such an eventuality, they approach old friends with just a little bit of caution—not because of estrangement but because the time gap may harbor all sorts of small misunderstandings that should not be allowed to swell—and then any problem that might arise is under reasonable control.

Now what I have just said is not common knowledge, as the communication channels are full of noise. You will have to learn to adjust to the noise too—by designing sufficiently good filters. Go and read Plato first: you must know how to communicate with close associates before you can tackle large tasks. You cannot be a large-scale storyteller without being able to hold an audience around the campfire or around the table; much less can you communicate with the learned world before you can communicate with your next-door colleague. For that you need the young Plato; on this he is still the unbeaten champion.

In my long academic career, I have experienced many odd things. Most of them mean very little. The following odd story, however, happened to me three times: liking my style, a beginning editor wrote to me, requesting that I submit a paper. As I find the search for a publication channel an unpleasant hassle, I immediately sent a paper in accord with the periodical’s agenda, only to experience yet another rejection, again because of my style: the paper looked too offhand, as if I had spoken into a recorder and sent the recording to a typist to type. In other words, the editors who liked my printed page did not like my manuscripts. It is odd that they did not know that to sound conversational takes great effort. What seems to me the reason for these rejections is that the editors could not read a typescript as if it were a printed paper—for want of familiarity with the art of writing.

The appearance of text written offhand is deceptive. Famously, it takes much training and much sweat to achieve it. Moreover, you need a lot of feedback too. Feedback is hard to come by, as readers respond to a paper in identification, not in empathy; consequently, their recommendations render a text more awkward. So assess the comments of peers before deciding to accept or reject them. Beware of peers with writing blocks or publication blocks: they are infectious. Remember this when you look for a mentor. Avoid such mentors, especially those who make a virtue of their blocks. You need a mentor with a sense of proportion, with a sense of humor.
 

7. Learning to Write Surveys and to Articulate Views: Sticking Out One’s Neck

Love what you do and do what you love. Don’t listen to anyone else who tells you not to do it. You do what you want, what you love. Imagination should be the center of your life. (Ray Bradbury)[47]

The reasons publication pressure became prominent are no longer relevant; they all have to do with the terrific expansion of Academe, especially in the United States of America, and that expansion came to an end long ago. First, this rapid and welcome expansion required easy rules for hiring people; the successful publication record of learned candidates was a simple criterion that answered this need. Second, James Bryant Conant showed resolute eagerness to establish Harvard as the top academic institution in his country; to that end, he demanded that all of its academic members should have doctorates. Third, official American organizations offered financial support for publication pressure. All this matters little. What matters is that now the pressure serves no end, least of all the declared end of encouraging research. John Ziman, physicist turned sociologist of science, declared that any bunch of publications selected by any criterion other than excellence will show that most of them make no contribution to human knowledge. He went further: he said he could show with ease that the greatest majority of research projects that academics undertake today are worthless.[48] Research is still the most productive industry. What is significant for you here is the way this affects you: publication pressure deprives academics of their peace of mind, as they wish to avoid futility; they wish to contribute but they often surmise that they cannot. They rightly hate the false pretense that publication pressure imposes on them. They feel trapped. To avoid this trap you may decide to resist the pressure, to yield to it with no struggle, or to learn how to avoid both of these options with ease. Keep reading.
 

7.1

The reform of Academe is particularly problematic. Conservatives oppose reforms since they can cause more damage than progress: as long as the current system works, we should defend it and reform it with steps as small as possible. This may be true of Academe, but not of publication pressure, since it is an innovation. Even important contributions were seldom academic: during the scientific revolution, the leading lights were not academics but courtiers: Copernicus, Kepler, Galileo, Gilbert, Harvey, they were all courtiers. Before World War II, publications hardly contributed to the livelihood of intellectuals. The Royal Society of London prescribed a convention about publication (inspired by ideas of Francis Bacon and Robert Boyle) that granted priority to new experiments. This convention is still operative—now as a part of academic systems that thrive on experiments in their expensive laboratories. Researchers now are usually trained experimenters; their training includes keeping records of experiments and writing progress reports. These can go straight to the printer, unless journal editors interfere. The proliferation of periodicals does not facilitate the publication of empirical papers, since publications today gain acceptance by passing peer review. This institution is new and under debate; it maintains the traditional style of scientific paper that Boyle had invented when he instituted the scientific periodical.[49] As experimentalists learn to write scientific papers in the inductive style as a part of their routine training, they suffer less than other academics from writing blocks and publication blocks. This does not hold for their doctorates. To overcome the hurdle of the sacred requirement of a doctoral degree for an academic post, an increasing number of universities these days accept three or five published papers of a graduate student as a doctoral dissertation. This strengthens the popularity of inductivism.

Boyle raised questions relevant to his criterion of a paper’s acceptability for publication: what information is empirical, what empirical information is scientific, and what scientific empirical information is new? Boyle offered judgment on the first two: information is empirical if courts allow it as eyewitness testimony (rather than as expert testimony), and empirical information is scientific if it is repeatable. Now in the early modern era, the West witnessed witch-hunts; they disappeared only in the middle of the eighteenth century.[50] Admittedly, it was the scientific revolution that finally put an end to them; yet it may also have contributed to them, since it targeted female healers, barring women from the medical profession (for three centuries). The scientific literature included no criticism of this—or of any other gender discrimination; Boyle wrote as if law-courts rejected testimonies about witchcraft. John Locke, his erstwhile scientific assistant, later offered a rule for which he claimed infallibility—thus dispensing with the need to refer to law-courts. The result is a mess that still reigns. Willard Van Orman Quine, the most careful twentieth-century philosopher, still agreed with Locke on this.[51]

What then makes an eyewitness testimony scientific? This is the only methodological rule that the scientific tradition has never allowed anyone to contest; it is Boyle’s most valuable contribution to the scientific tradition:[52] all and every observation is scientific if and only if it is reported by two independent sources and is declared repeatable. What renders two sources independent of each other? This question is not serious and it is seldom raised. For, those who question a report can always try the experiment again. What happens, then, when an experiment is refuted? To this, the answer is Newton’s rule: a refuted observation invites its properly qualified restatement.

This leaves matters in reasonably good condition. Not so the following question: what scientific information is new? It is very tough, since in one sense every event is new and in another sense every event is repeatable. This looks too sophisticated to take seriously. So let me observe that historians of science struggle with it as they discuss a famous trouble: multiple discovery. The paradigm case here is the law of conservation of energy. Even though in the late eighteenth century mathematicians proved that (contrary to Newton’s view) Newtonian systems conserve energy,[53] historians of science name a few nineteenth-century thinkers as its discoverers. They do not say who worded it rightly. Our question is, when are two versions distinct? There is a simple partial answer to this: a possible experiment that refutes the one but not the other makes the two distinct.[54]
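To render this partial answer schematically—a minimal formalization of my own, offered only as shorthand, with $e$ ranging over possible experimental results and $\vdash$ read as “entails”:

\[
\exists e\;\big(\, e \vdash \neg T_1 \;\text{ and }\; e \nvdash \neg T_2 \,\big) \;\Longrightarrow\; T_1 \not\equiv T_2 .
\]

That is, if some possible experimental result would refute the one version but not the other, the two versions are distinct; the converse question is what remains open.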

When all this is put together, we have a good idea of how Boyle’s rule decides which new items of scientific empirical information deserve publication in the learned press. As empirical questions that empirical researchers can answer abound, what Kuhn has called normal science should never cease. (Ziman considered all of its products useful.) Its publications should be unproblematic, as they are eminently publishable. They are not.
 

7.2

Whatever we say of normal science, it obviously reduces writing blocks and publication blocks. Still, the problem is broader: it is as old as scientific publications. In Plato’s dialogue Parmenides, young Socrates converses with Parmenides and pokes fun at Zeno, who sits there reading and interpreting his own text to some youths. Zeno responds angrily, adding that he had not intended his book for publication but that friends pulled it out of his hand and got it published. In his Autobiography, Collingwood, a prolific writer, said he was a perfectionist and that he obtained much comfort from realizing that no work is ever finished, and that friends pull a work out of the hand of its creator and have it displayed in public too early. Both Zeno and Collingwood used some internalized criteria that they were scarcely aware of. It is better to ask prospective intended readers for comment. In doing so, beware of those who respond as prospective coauthors rather than as prospective readers. Regrettably, editors, who represent prospective intended readers, are too often ignorant, biased, and frightened. Publishing surveys minimizes the damage that this causes. Indeed, Collingwood published some first-rate surveys. Their advantage is that they offer a broad outlook on a field, thus serving both novices and old hands. Take for example a survey of detective novels.[55] Anyone new to the field who wishes to get acquainted with it will do well to consult Raymond Chandler, The Simple Art of Murder (1950), William Somerset Maugham, “The Decline and Fall of the Detective Story” (1952) or some other, later survey. The scene is ripe for a survey of such surveys. Surveys of scientific items are easier to write, since they do not involve taste. The easiest example, and one not easy to contribute to, since it has been studied in enormous detail and with tremendous success, is the survey of theories of gravity; it includes a very small set of significant items—Aristotle, Archimedes, Galileo, Newton and Einstein. You can try your hand at it only if you are hyper-ambitious; if you can survey all the extant surveys and if you can add to the list a set of commentators who did not have significant theories of gravity but made significant comments on them, such as Faraday, Maxwell, Mach, and Eddington. My apologies for dropping names like this. I hope this is no big deal, since you can always ignore the details of examples.
 

7.3

Before that, my recommendation to you is this. Even if you are very ambitious, it is useful for you to learn to write easy surveys and to articulate simple views. For, hyper-ambitious writing often expresses a publication block: ambivalence. (Freud has observed that the famous shepherd who desires a princess simply wishes to avoid the normal, towards which he is ambivalent.) High ambition is often a way to avoid doubt about the possible insignificance of one’s possible output. To resolve ambivalence one may examine it slowly and carefully and aim at output that has much less value.

What contribution is important? That depends on aims. What end do academics wish their publications to serve? Most academic publications do not serve their apparent aim: they do not serve any intellectual purpose; the rational cause of much writing block and publication block is this intellectual uselessness of most academic publications. There is literature on how to read a scientific paper, which is very welcome, of course. Only a literature on how to write a scientific paper will make that literature redundant. One sentence suffices to sum up the literature on writing a scientific paper: start by reporting what your aim in writing it is, what kind of reader you want to read it, and why you advise your reader to read it. More briefly, start with a question and with the level of knowledge you expect your reader to possess. It is amazing how seldom this happens in the scientific literature, whereas the mathematical literature does it regularly. Studies of the readership of scientific papers took place as soon as publication pressure was operative in full swing; they have occurred regularly since.[56] They show systematically that most publications are simply not read at all.[57] This makes it advisable to have them on the net, printed on demand.[58]

Institutions have no aims (Popper), not even the society for the prevention of cruelty to animals. Now publication is a complex institution that serves diverse aims. It looks as if all concerned serve one aim: the advancement of learning, the benefit of the commonwealth of learning, the greater glory of God. True, yet there are subsidiary ends that matter too: editors of periodicals have their own aims, among them the aim of avoiding being subject to peer ridicule. They try to play it safe; they prefer missing a hit to publishing a miss: when they hesitate, they prefer to reject a paper. Then you foolishly feel rejected—by people who know nothing about you and who wish to know less.

The ivory tower requires of its rank-and-file that they develop thick skins. They never told you this, did they? No, this they left for me to do. Therefore, I am telling you. On average, at least nine out of every ten papers meet with rejection. So do not feel uneasy if you have your paper rejected ten times. It is not that the phenomenon is specific to Academe: it is much worse for authors of children’s literature or for poets, not to mention scriptwriters, composers of operas, and their like. Yet the image of Academe as a Utopia that academic public-relations offices reinforce is a serious contributor to publication pressure and to publication blocks and to academic agonies.

I was sidetracked. The question before us now is, what ends does academic publication serve? Whatever the answer is, it also has to refer to the laws of supply and demand: publications are distributions of information and there should be some demand for the information distributed. Does this seem to you a trite, hackneyed truth? Well, I regret to inform you that publication pressure has rendered it false: the laws of supply and demand apply to free markets. If the market in ideas ever was free, publication pressure has rendered it subject to the laws of grants and financial support, laws that nobody knows. Free markets do not prevent writing blocks and publication blocks, as is evident from the market in dime novels. What the free market does is draw away from Academe many talents that can start up small, brainy enterprises.

The reason publication pressure worked to begin with is that it had tremendous financial backing. A grant application needed a host academic institution, and every grant offered to an academic was supplemented with at least half as much again, added to bribe its host institution. That made all USA art schools and conservatories parts of the American academic system. This source of money dried up long ago. Now academics often have to pay to have their output published. This ensures that the demand for their information does not count. Surveys show this repeatedly. This is why, as I have mentioned, most papers in the learned press since the beginning of the Cold War go unread. This is an incentive for the rise of electronic periodicals that are fully financed by authors and serve only to fill their publication lists. This is inflation. Keep out of it: you can do better. With ease.

What you need is a better idea of the service to the commonwealth of learning: how is the stock of knowledge piled up?

Robert Boyle, the arch architect of the discussion to which these pages belong, recognized the traditional division of contributions into theoretical—metaphysical or scientific—and factual—informative. Before discussing what this division misses, let me discuss it. The hostility to metaphysics that originates with Bacon rests on his (and Galileo’s) great discovery that observations are theory-laden, plus the false theory that avoiding this is possible—by suspending all judgment. The right response to this is Russell’s: assuming that one is free of all prejudice is humbug.[59] Correcting this error of Bacon renders pointless much of the philosophical literature, including, say, the whole of the posthumous output of Ludwig Wittgenstein that is still so popular.[60] This makes you despair of so many academics for their inability to move with the studies to which they wish to contribute.

Boyle did not share Bacon’s hostility to metaphysics: he expressed admiration of some metaphysical systems of his time. Nevertheless, he suggested that it is all too easy to develop a new metaphysics and that it is better to invest one’s energy in empirical research. Let us leave this now.

The value of a scientific theory is its explanatory power: any explanation of any theory or information is laudable. William Whewell said, an explanation is an advancement if it covers more than was covered hitherto. This leaves open the case of two theories with the same explanatory power; even that case is already progress, as it poses the challenge of finding a crucial experiment between them. Most science textbooks add that a new theory has to meet with empirical confirmation. This condemns the important 1924 theory of Bohr, Kramers and Slater, according to which the conservation of energy is statistical: it was refuted at once. The history of biology is full of such examples, but let me leave them now. Einstein and Popper stressed that testability suffices, that the view of a theory as a success should not rest on the demand that its initiator be a clairvoyant who can foresee the outcome of a test.

The question whether a theory is new or not is thus answered. When researchers questioned whether Hermann Weyl’s unified field theory was new or a mere conjunction of two older field theories, settling the matter required devising a crucial test. No one could judge this matter then. Not that it is essential: soon Einstein constructed his unified field theory in a way that made it obviously novel, yet, regrettably, ad hoc. There was no crucial test for it.

The novelty of a theory is thus much more obvious than that of information. Bacon had said, a new fact does not follow from an established theory, since one that does follow from a theory is not new; it is then either scientific or it should be dismissed as prejudiced. Whewell disagreed: a fact can depend on a theory and be new: the theory may be a new hypothesis. Bacon had opposed all new hypotheses as prejudices. Whewell disagreed: he said, rigorous tests will ensure that a hypothesis does not become a prejudice, and having new hypotheses is essential for scientific progress, since we observe a new fact only when we have some expectation of it. This idea of Whewell is a strong hypothesis (of the theory of perception) and its confirmations refute Bacon’s hypothesis. So does the more common phenomenon of counter-expectation. Oddly, however, although the history of science is full of all sorts of counter-expectations, this class of events won attention only after theoretician Karl Popper described it. I have two amusing examples of it; I hope you will indulge me in telling them.

My first example is the debt that Darwin owed to Malthus. He stressed this fact but could not explain it as fully as he would have liked. The reason is simple: to explain it is to show that Darwin’s theory of natural selection is a correction of the population theory of Malthus. But this only shows that a discoverer is in debt to the inventor of the theory that the discovery refutes. Malthus was criticizing the economic theory of Adam Smith. He found out that Smith assumed that all natural resources are unlimited. Against this he said, they are exhaustible. To show this he argued that human populations grow geometrically whereas food grows arithmetically. On this Darwin commented, observing that all populations, human or animal or plant, tend to grow geometrically, but natural selection checks their growth. For “all organic beings in the world” it holds that, allowed to grow naturally, they will grow geometrically, he said (The Origins of Species, 1859, Introduction)—in clear disagreement with Malthus, yet with no mention of him.
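A minimal arithmetical sketch of Malthus’s contrast may help here; the symbols are mine, chosen only for illustration. Let population grow geometrically and the food supply arithmetically:

\[
P(t) = P_0\,r^{t} \quad (r > 1), \qquad F(t) = F_0 + c\,t ;
\]

since no linear function can keep pace with an exponential one, the ratio $P(t)/F(t)$ grows without bound, so on Malthus’s assumptions scarcity must sooner or later check the growth of the population—the check that Darwin generalized into natural selection.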

My second example is more intriguing. Admirable Dr. Joseph Priestley was an ardent Baconian. It made him prejudiced in favor of the theory of phlogiston—to whose demise he had contributed most—because he adhered to Bacon’s view that there can be no revolutions within science. (This is why Einstein put an end to Bacon’s philosophy. Its popularity shows how backward public opinion is.) His contribution that ended the scientific career of the theory of phlogiston was his discovery of the gas that Lavoisier later called oxygen. He insisted that his discovery was accidental. He discovered the gas—dephlogisticated air, he called it—by putting a candle into the container that was full of it and watching the light brighten. That was it. To prove that the discovery was accidental he added, the candle was there by accident. Had it not been there he would have missed the discovery. To prove that the candle was there by accident, he added the information that he had not expected the candle to burn better. Indeed, he added, he had expected it to go out.

This is Priestley’s report on his discovery. It leaves open the question, where did the expectation that the candle would go out come from? Why did he not report it? The answers to these questions are easy to reconstruct: he was testing Joseph Black’s theory of fixed air, which historians of science (mis)identify these days as carbon dioxide. The reason he did not say so is that Boyle had decreed it impolite to state explicitly that a certain theory is refuted. Embarrassing researchers is a disincentive, and since they were amateurs, they might drop out. This argument does not work for professionals. It reinforces itself, however, as one professor finds a personal insult in the criticism of his ideas by another professor; this turns dialogues into personal mudslinging. This custom reinforces itself as editors still take amiss and suppress all friendly open criticism. The way they still do it is simple: one praises the targets of one’s criticism and asserts failure to agree with them all the way. This is sufficient when the logic of the controversy is easy. When it becomes somewhat sophisticated, the rule to hide criticism hinders research. What we gain from the disregard for the rule—you have to be gentle as you break a received rule—is that the novelty of facts becomes transparent when you criticize a received idea.

I do not expect you to be able to do that. I do not assume that you are a genius and that you can make a big name for yourself soon. I suggest that you write a critical survey of the literature around a given, fairly famous problem. Possibly, though not very likely, there is already such a survey, or even more than one. This discourages some researchers. Wrongly. There is no need to ignore it, as it is much better to comment on it. Whatever the problem is, if it is a serious challenge, the need to have a general view of the situation is all the more urgent.

Here then are some practical lessons for you. Writing blocks and publication blocks differ. To write with ease, be explicitly critical. I greatly recommend this. Take a popular theory and try to refute it, or take a popular question and survey the answers to it. To publish with ease you should subdue your criticism. I do not recommend this. I recommend that you struggle in your efforts to publish in your own way. You should be ready to accept criticism of what you write and to change and rewrite as often as needed in order to reach a reasonable standard of presentation. But do not yield to fashion.
 

7.4

Boyle ignored the role of surveys, even though he wrote important ones, especially his magnum opus, his 1661 The Sceptical Chymist. Similarly, as Popper has observed, although Hume’s 1748 criticism of methodology and of theology was of extreme importance, it contained no discussion of the importance of criticism.[61] Popper suggested degrees of criticism, the highest of which is the empirical refutation that a theory has to be prone to if it is to earn the lofty status of science.

Let me mention two defects in reporters of research that are optical illusions of sorts and that inhibit work and cause blockages. The first is the reluctance to acknowledge debt to others, that is, the eagerness to achieve recognition as original. It is childish.

Acknowledgement may help one reach recognition, not the other way around. The second defect is the fear of sticking one’s neck out. We admire those who do so and we fear being caught doing so. This makes poor sense. Inhibited writers often publish works that do not say what their intended message is. Reading such works is like chewing dust. If you want recognition without the use of power-struggles and intrigue and all that, then you must be reader-friendly. For this you must state the aim of your publication. For this you must be ready to stick out your neck.

Detour upon a detour: I am a bit unfair to those who report the fruit of their research the traditional way. Their reluctance to acknowledge is often rooted in ignorance of the role of the very institution of acknowledgement. It is most likely a protest against the diffusion of acknowledgements that movie star Hilary Swank expressed when she delivered a long (three-minute) acceptance speech at the 2000 Academy Awards ceremony, making one acknowledgement after another. Now the cinema does not have established rules on that; the commonwealth of learning does, and reporters of research should know them: required general acknowledgements are of the priority of publication of new observations or ideas; the required specific acknowledgements are to the people who have helped the author in writing the specific work in which the acknowledgement appears (including the names of those who have read it in draft). The rest is optional and the default option is to omit it. End of detour. Back to our topic: the supreme demand to be reader-friendly, the demand for clarity and openness.

There are many ways to be obscure. Young Ludwig Wittgenstein demanded absolute clarity while referring to one sort of clarity: the meaning of words and of sentences. What cannot be said clearly, he pontificated, one must not try to say; one should remain silent about it. The book in which he declared this, his Tractatus Logico-Philosophicus, is famously obscure. Indeed, the book becomes clear when its aim is made clear. The trouble then is that this step—making the aim of the book clear—makes it clear that the book misses its aim. In conversations Wittgenstein admitted this when he declared war on metaphysicians, calling them slum landlords.[62] Now, when uncritical readers notice a failure of a book, they tend to assume that they have misread it. This is a serious error, especially since the criticism of any text may be answerable: you cannot discuss this as long as you worry about whether you have understood the book correctly. If you have misread a book, the best way to correct your misreading is to express your criticism of it and be shown that it is invalid. If the book is important, then this exercise is important for you. If it is also important for others, then publish it.

I need not discuss all this here, since it is a detailed philosophical issue. It is a particularly controversial philosophical issue whether Wittgenstein’s first book is a success or a dud.[63] The answer is that we can learn from it that Wittgenstein was mistaken about clarity: nothing contributes to clarity more than knowledge of what the speaker’s background information and concern were. Whenever you write, try to be clear on this.
 

7.5

To see the importance of background information for clarity, it is best to glance at some contemporary philosophical texts. Mentioning some relevant background information, they take it for granted that you know it. This is referential opacity: they are opaque on purpose: they want to avoid sticking out their necks. When you criticize them effectively, you can bet that they will deny that you have attacked their views, since your surmise about some of their background references was erroneous. Proof: you have shown them wrong when they are right.

Your mistake, it seems, was to criticize them in the first place. If you must do that, offer alternative readings of the texts that they refer to and refute their assertion on each alternative. If the exercise is too complex, then just give it up. The moral of the story is simple: you have a choice: write obscurely, or write clearly and risk being shown in error—we are all fallible. If you want to count, you must either write clearly or partake in power-struggles that make you important enough to have your obscure writings count. In any case, there is no guarantee that what you write will make commentators discuss your work, or even pay attention to it.

When the aims of writers are not clear or when the background information for comprehending them is not known, then their output is likely to be useless. The first prerequisite of clear writing is the choice of readership, characterized by two qualities: their interest and their background knowledge. And background knowledge comprises the degree and the kind of knowledge. Hence, the less informed the intended readers, the harder it is to write for them. When Einstein writes for experts, he can assume that they are familiar with the tensor calculus or he can refer the reader to a standard text on it. When he writes for students, he describes the essentials of that calculus.[64] It is his tremendous ability that makes his text for students no novelty in any sense yet a tremendous pleasure to read. (The book served experts too, but that is sheer bonus.) When Einstein wrote for the inexpert, who would not acquire even the basics of the tensor calculus, the challenge he faced was different.

Consider popular scientific writings and textbooks. Consider the best of them, those of Einstein and of Russell. As they are not meant to be innovative, their having no references is reasonable. Since most academic fields have standard textbooks (Kuhn), such standard texts, popular or educational, are very similar to each other. They compete for clarity and ease of reading. This is fine, but hard to assess: the only clear assessment is the free market: at times bestsellers are the best—except for books that somehow fail to reach the university bookstore for one reason or another. Some bestsellers are awful bores.

It is much harder for a beginning academic to publish in the trade press than in the learned press: referees and advisers for the trade press keep its gates open only to the big chiefs. If you manage to write a good textbook or a good exposition of familiar important ideas, or even a history of interesting ideas, a publisher may agree to publish it on the condition that you add to the title page the name of some bigwig as its senior author. I hope you never receive such an offer; if you do, refuse with no hesitation. For, chances are, accepting will cost you the readiness to allow that senior academic to mess your text up: such people do not lend their names without making some contribution to improve the text that bears their names. Also, it is better to seek endorsements; a few of them may appeal to a publisher more than fake co-authorship.

Publishers of academic texts yield to senior academics who forbid them to accept some texts. For, they cannot combat academic denouncements. Or so they think. Some decades ago, a famous publishing house accepted for publication some rubbishy text and incurred the wrath of a Harvard professor; they swallowed their pride and cancelled the contract. Another publisher accepted it; the scandal that ensued contributed to the book’s sales. Still, the trade press for popular science and for textbooks is the bonus and the fringe benefit of senior academics. Do not compete with them. That you can perform a task better than they can is, alas, irrelevant. What you can do is write a survey and get it published; improve it—hopefully in the light of criticism—and increase its scope, and get it republished. The negative side of this is that if you succeed you become known as the expert on one small section of your field. This is no big deal: you can try later to correct misimpressions. The positive side of it is that your seniors will allow you to write a popular scientific text. One of the most prolific popular-science authors was Isaac Asimov. As a renowned science-fiction author, he was allowed to publish popular science. If you have any distinction, you may get a dispensation and write a science textbook or popular science. Otherwise, I recommend writing normal scientific reports and surveys.

Two more obvious options are editorial work and writing articles for encyclopaedias of all sorts. Some publishers may approach you, or some editors of marginal items such as anthologies or suspect periodicals. Generally, if you do not like the idea, forget it pronto. If you do, accept it with no haggling and do not worry about destroying your reputation. It is very hard to destroy one’s reputation; I should know.
 

8. Learning to Write and Speak with Ease

This section should precede the previous one, but it is easier to discuss ends first and means afterwards; they say, never put the cart before the horse. I say, always do. So now let me begin with two or three observations that seem to me very pertinent to learning to write and speak with ease. They all concern spontaneity, and they may help you become more spontaneous. I hope so. The standard advice is, collect documented data, read, write. My advice is, do the opposite. You need not succeed. Just try. You will not regret it.

8.1

In speech, a sentence often begins without knowledge of how it will end. You can see that with those who are the exception: they pause between sentences, saying the next one silently before repeating it aloud. Concerning them, then, this observation holds for the silent part of their speech, not for its voiced part. In writing, most people know the end of the sentence before they write it. The reason for this is obvious: it saves energy. You can see this with writing on a rock; before writing a text on a rock one writes it on paper. (Traditionally, they did this on sand: they moved about holding small sand boxes.) My main aim here is to help you learn to write the way you speak: begin to write a sentence before you know how to finish it. It is common to begin to write an essay before knowing what it will say; to make your writing more spontaneous, my advice is to write an abstract of it first. For writing a book spontaneously, one also needs a table of contents, together with estimated numbers of pages for each chapter and sub-chapter. All plans are changeable, and when you deviate from a plan, you will benefit from rewriting the plan first. It takes some time, but it saves more.

Most activities are spontaneous. They say, routine ones are more so than deliberate ones. Not so: we stop being spontaneous when we decide to act after deliberation. The common idea that spontaneity is the exception is proof of the success of our education system: it is intentionally conservative and it comes to impede changes that come naturally. (Its aim is to impede all change, but this is a general point that will take me to a digression. I would be glad to go for it, but as here your concern comes first, I will suppress for a while my disposition to digress.) They say, natural conduct is more disposed to be spontaneous than artificial conduct, and writing is artificial. This is outright silly. We do not know what action is natural. Is wiping one’s nose natural or artificial? Is speech? We all hope that we drive a car spontaneously, sing in key and play instruments spontaneously, but we refuse to write spontaneously. Why? Because tradition discourages us: read; do not write; if you must write, write commentaries in the margins of valuable texts; to ensure this, we will train you in a manner that leaves you unable to write more than a page or two of connected text. Most of your professors are unable to write books. Their lecture courses are often old texts, at times updated. Most of your professors write papers—they have to—and it costs them tremendous effort and even humiliation. The reason is that they have the wrong training in writing. (You remember that those who are trained in writing laboratory progress reports may write nothing else and even get their Ph.D.s this way.) For example, when they write abstracts of papers they use descriptive phrases rather than sentences (“the mill on the floss” rather than “it was the Dorlcote Mill on the River Floss”). The right way to write is to write a draft fast and correct it afterwards. This saves a lot of time, since first drafts are no good unless you are a genius. And it is much better since it is spontaneous, and the sense of its freedom may linger from one draft to another.

Our education opposes this: you had better think carefully rather than write a text twice, it says, since words have indestructible powers. By the biblical text (Numbers 5:11), words can test a wife: take a piece of parchment; write some curses on it with some magic ink (made of the ashes of a sacrificed bird) and wash it with water for her to drink. Supposedly, this will make her body swell only if she is unfaithful (sotah). Well, I do not know how to break the magic spell of words. It is up to you to do it.

The best way to get rid of the inhibition to write is to try to write the way you speak: write whatever pops into your mind and then read it and then throw it away. If this procedure is too slow, write every sentence a few times. It will feel odd, even silly. Humor me and do it anyway. The experience can be delightful. If it is not, you have wasted a few minutes of your life. It is a fair wager. More than fair. If you simply cannot try this experiment, then this book is not for you. So drop it; this will save you more time than avoiding the experiment I recommend here.

If you continue, then the first thing to do is to learn to throw away some of what you have written. This is crucial: learn to throw away what you have written: it is not that precious. I apologize for repeating this piece of advice. I have some reason for it. Quite a few people whom I tried to advise, mostly graduate students but not exclusively so, wasted much of their time and of mine by efforts to compromise in their acceptance of this advice without telling me. Usually they dumped their manuscripts into a wastebasket and placed the wastebasket in a corner of their cellars or attics; you would be surprised at the variety of ways to circumvent the proposal to throw things away when you have to part with them but cannot do so.

No, it does not make sense to me either. I do not know why it is necessary to destroy a manuscript in order to break its magic hold on us. I do not know what the mechanism of the writing block is. Nor am I interested: contrary to Freud’s advice, mental blocks need no analysis and no dismantling; suffice it to learn that fear causes them and that circumventing it is possible (Joseph Wolpe). It is clear that writing block is a part of publication block. Indeed, overcoming writing block reveals it, and having written a paper or a book is thus no guarantee that its author will send it to an editor or be glad to see it published. People who suffer from serious writing blocks and who learn to overcome them often end up with interesting, publishable manuscripts that are lost after their demise. I have myself helped such people write such manuscripts, and though they are deceased friends of mine, my memory of them is not quite friendly: they deceived me and wasted my time. I should have known: there is a simple sign of publication blocks: people who suffer from them can write whole books with no author’s name. The first thing to write on your way to healing yourself is always your name, on the page you are using or in the computer file that you open.

We should discuss writing blocks now and publication blocks later; I mention the latter now in order to show that the fear behind the former is the same as that behind the latter. My point now is that to overcome the former you need to learn to throw away some of what you have written, and that, to my astonishment, hiding manuscripts in the attic does not break the magic spell that manuscripts cast on their authors. The fear of publication is the fear of exposure: the psychoanalysis of that fear invariably shows that it is a version of the fear of undressing in public. Rejecting the psychoanalytic cure, I recommend overcoming the fear of exposure without going into the details of its source.

The fear of exposure appears in Academe not only in writing blocks but also in lecturing blocks. The fear of exposure of a lecturer, though, is much smaller and easier to control than the fear of exposure in print. A lecturer may be a ham who enjoys exposure without minding what people will say after the performance as long as it was a success. This is unusual. Many lecturers have confessed to me that they dread exposure during a lecture even after decades of practice. Many will not answer a question thrown at them during a lecture and instead promise to answer in a later session, after due preparation. These lecturers may succeed in reducing the amount of error within their lectures, but the cost of this may be too high: they may be dull lecturers; usually they are.
 

8.2

There are diverse ways to overcome writing blocks. It would not have occurred to me that dictation is a tool for that. I learned it from the life of the leading philosopher Ludwig Wittgenstein. He dictated his first publication (it was a book review) to a friend; he dictated to his professor the essay he had to write in order to win his Cambridge University bachelor’s degree. The professor wrote to him later saying that the regulations require adding to his essay what the professional jargon calls a scholarly apparatus. Wittgenstein answered him with a rude letter and thus managed to receive his degree without it. He later published a book on the supposition that it is the greatest text in modern philosophy. All his life he tried to reinterpret that text in light of newer ideas he had, says Jaakko Hintikka, one of the leading experts on his output. Hintikka relied here on the huge mass of Wittgenstein’s literary remains: for decades he dictated much and wrote manuscripts that he left for his literary heirs to publish. He thus suffered from both writing blocks and publication blocks, but they differed. This, then, is a glaring case of the contrast between publication blocks and writing blocks.

Wittgenstein’s way of overcoming his writing block is not the only one. I will spare you stories of other, more complicated ways of overcoming it. The way I recommend is the most elegant, provided you do not cheat and do write and destroy manuscripts. This is very healthy: it teaches you that what you write is not scripture. You may say that you know this and yet have a writing block. I will deny this. The crucial experiment to decide between your view of your writing block and mine is simple: write something—anything; it may be something you work hard on and are proud of or something that you consider outright silly; it does not matter—and then throw it away. You will then look at yourself and know which view of your writing is right, yours or mine.

I mean it. Interesting as this piece of writing of mine may be, I recommend that you cease reading it, yes, just now, and go and write something or find something that you have written and then throw it away. Really: destroy it irretrievably. And then you may throw away this piece of writing of mine too, or come back to read it, now or later as you please. To repeat, I accept your verdict, hoping for your sake that you are in error about your writing block, but I do not insist: I have never met you in person. Still, before declaring yourself right, allow me to suggest that you take time out. Stop reading this book and do something else. After a reasonable pause, you can decide. We will then consider your decision correct.
 

8.3

So, you are back to this piece of writing of mine. Now let me refine my advice: when you decide next time to throw away a piece of writing, glance at it first and decide which part of it appeals to you more and which less. Later on, you may decide to keep parts of it as you plan a rewrite. Whether what you write is on paper or digital, the process is different but the idea is the same. Indeed, some people I know learned to write on paper and taught themselves to write on the computer, with a result that surprised them: their writing block did not follow them as they moved from paper to screen.

Another proposal. Choose a controversial problem and write down all the extant answers to it, silly and clever alike. Now I will stop this line of instruction. My paper “Dissertation without Tears” is on the web and it belongs here, except that rather than reproduce it I will advise you to look it up on the net. It is available free of charge. If you want my advice from someone else’s pen, let me recommend the 2005 Memoirs of a Failed Diplomat by Dan Vittorio Segre, which reports how he learned to write anew although he was already an established writer: it served as a means of inducting him into the academic world.

Allow me to give you the gist of my proposal, though. You may need no advice in order to be able to write and speak with ease: for all I know you are already an accomplished writer or speaker. Or perhaps you are able to read and write with ease without being an accomplished writer or speaker. In this case, I hope you will agree with me that it is nice to be a good writer and speaker even if one does not need it. Admittedly, there are advantages to being a poor writer and speaker, especially for people in certain positions. Here we can ignore them.

It is surprisingly easy to be interesting. All you need is to be ready to make an ass of yourself. I do not know why this is so difficult; I do not know why people shudder at the thought. Bernard Shaw said,[65] we want to be perfect in order to win approval; yet look at the perfect people in our tradition, Socrates, Jesus, Joan of Arc: they were killed. Now, whatever the situation is exactly, clearly, if you do not wish to serve the people you address, you had better avoid addressing them in the first place. If you do wish to serve the people you address, then what they think of you is less important than that you help them. For that, all you have to do is address whatever question you think matters to them. You need not be an expert to be helpful; you need only report to them the opinions on it and the criticisms of these opinions. And you need not be original: reporting items that are in the field will do. Having done that, you have brought your audience up to date. For most audiences this is great progress.

For audiences that are already up to date on a given question, you need not address that question unless you have something special to say. Your addition may then be open to criticism, and your audience may offer you criticism of what you say. This sounds like a failure; it is a major success. If you still do not know that this is a major thesis of this piece of writing of mine, then you are wasting your time reading it.

In brief, the major moral point of your teachers, assuming that they are of the usual sort, is this: speak only on what you are expert in. If you want more discussion in this vein, you are reading the wrong book. My advice to you is better: when you meet a person and you two are able to spin a conversation, take the initiative and discuss what you think the other party knows better than you do. If the other party is any good, you will soon have exposed some of your worst errors in the material under discussion. If you are more interested in giving the impression of a great scholar, perhaps my advice is amiss; if your wish to learn is strong, you will be grateful for the corrections. Then my advice is that you say so loud and clear. If you do so, you will be surprised to find how easy it is to speak to a public with ease. All you have to do then, to be able to write with ease, is to record your speeches as best you can and ask friends to correct them.

It seems easy to find friends to correct your manuscript. It is not. You have to request your friends to tell you where you are not understood and where you cease to be interesting. Friends who do so are priceless. If you try my advice, you will find that friends will do other things: they will tell you how to rewrite the paper, what to add to it, and other things that you should be wary of taking seriously, since they serve one purpose: to give you excuses for postponing publication. You do not need excuses: publishing is not obligatory, and so avoiding it invites no excuse. When publication is imperative, when your department chair tells you that your academic position will not be renewed unless you publish, then you do not need a friend: you need a friendly editor.
 

8.4

Allow me to brag again. I have lived long and published over six hundred papers in refereed learned journals, in addition to many books. My papers are often published in highly respectable periodicals in diverse fields of study. Some of them are half a century old and the learned press still mentions them. I wrote them with ease. Moreover, they are written so that readers hear them. If you happen to have studied what the academy calls creative writing, then you may know that some teachers of creative writing lay great stress on the need of writers to write in a manner that sounds like a recording of real conversation. They may also mention Jane Austen, Ernest Hemingway, and W. Somerset Maugham as masters of this art. They recommend that students record real conversations and put them on paper to show that this does not work: ever so often, artists learn to create artifacts that look real. I also told you that three editors who liked my style asked me for a paper and then rejected what I sent them as too facile—because they could not see it published as it stood. This testifies to their having publication blocks. The reason they publish anyway is that publication pressure is tremendous. That editors are afraid to publish interesting stuff explains why published papers, even in leading periodicals, are so often so very tepid. Steve Fuller has noted that tepid papers are easier to publish than exciting ones.[66]

Most papers submitted to the learned press meet with rejection; nine out of ten, they say. Nevertheless, researchers usually take rejection amiss. When possible, I ask a friend in that predicament: do you think the paper is outside the area of the periodical you sent it to, or do you think it is substandard? I have written in a learned paper that most of my well-cited papers were rejected a few times each.[67] F. Scott Fitzgerald, one of the greatest American writers, especially of short stories, suffered over one hundred rejections before he saw a short story of his in print. He obviously advertised this information as a piece of encouragement to young beginners. This means you.
 

9. Middle-Range Prescriptions: How to Reform Academe

My advice to you is: stay out of all academic politics, be it short-term, middle-term, or long-term, and partake in administration as little as you can. There is a limit to this: it behooves responsible citizens to have some idea of the politics of their country and of their group of sociopolitical identification within it. This means, chiefly, their long-term aspirations. That is to say, we all need to identify with our people. For this we need to develop for ourselves some images of our national societies; for this we need some familiarity and affinity with our nations’ mid-term plans. This—some mid-term aspirations—is what democratic governments often neglect to pay attention to; and this is what Academe too lacks most. All of us, except for professional politicians, have to struggle with the problem of how much attention to pay to national politics, to our aspirations, to our wishes to partake in efforts to improve the democracy and peacefulness of our countries. The more elastic our time-table is, the harder the problem. And Academe is as elastic as any employer can ever be.

What holds for national politics holds no less for professional politics. To plan a reform of Academe is harder than to plan a reform of democracy, since the latter is the framework for the former. In brief, the task of writing this section is simply impossible: I am out of my depth.
 

9.1

The problem I have now is with my sense of proportion. It is an unfortunate custom in our society that your friends tend to tell you only good things about you. As Shaw has observed, it is unwise for married people to praise the institution of marriage and for bachelors to speak against it: they should do the opposite and act as devil’s advocates. I do not know how right this is. Roman Catholic priests are unmarried, yet they often praise marriage. Not always, to be sure, and not always without qualification. After all, St. Paul upheld the institution of marriage only as second best: “But if they cannot control themselves, they should marry, for it is better to marry than to burn with passion” (1 Cor. 7:9). Admittedly, we tend to defend what we like and attack what we do not. Admittedly, this is the established custom in courts, where one party speaks with great bias for one side and the other for the other side. Yet there a judge and a jury are supposed to strike a balance. Moreover, even in courts one side may concede to the other. In politics, in particular, no sense of balance is expected and often none obtains. When people argue with no sense of proportion, they do so on the supposition that they speak to a judge. They thus appoint their interlocutors as judges without saying so and without obtaining assent. This is improper.

The trouble with the custom of defending only your own opinion and attacking only the one you reject is that it brings about the loss of the sense of proportion. Here I spend much time attacking Academe. Yet I deem Academe the very best institution around. I see two major reasons for this. First, academic freedom: Academe is remarkably free. It is easier for academics to speak their minds in public than it is for politicians (especially about religion). Second, the academy offers leisure. It also offers, they say, a learning environment. Even to the extent that this is true, Academe is not needed for it. We can create learning environments elsewhere, except that we need some financial arrangements to make them stable. Every institution that offers such arrangements—mostly patronage and research institutions—has already joined Academe: during the Cold War.

Throughout my adult life, with very few exceptions, I made a living by belonging to one academic institution or another. The task of an academic is to teach two or three courses or seminars—four to six hours a week, one week out of two (Sabbaticals aside)—with ample time for academic intrigues and for the gossip that is essential for keeping up with the Joneses. There are exceptions. Half a century ago, the theory of elementary particles became respectable despite the prevalent ignorance of some abstract algebra that blocked the ability to comprehend it. To meet this contingency, the United States administration instituted a summer course to which all physics professors were invited. The need for such a drastic measure arises seldom, even in the best of fields, as mathematics and physics are. Something similar happened to certified accountants: when computers entered the market system, they needed some computer savvy but had none. They soon overcame this defect without learning to operate computers, though with the aid of some user-friendly programs.

These are wild exceptions. Usually, academics can stay au courant with little effort; normally, listening to gossip suffices for it; in many fields, even this is not required. Very little is. Apart from the need to come to the lecture-hall more-or-less regularly, to hand in grades on time at the end of each teaching semester, and to fulfill some light administrative chores, academics are pretty much left to their own devices. This is just wonderful. Not surprisingly, the demand for academic jobs far exceeds the supply. This situation is stable as long as the demand for studentships in Academe is on the rise—since academic administrations can control the size of their student bodies and thus the size of their faculties. What administrations want from the faculty is to keep the system running smoothly and, if possible, to raise its reputation.

The defect in this rosy picture is the matter of hiring. When I studied physics, the sociology of Academe that my peers took for granted amazed me no end. They acted on the supposition that their intellectual progress guaranteed their smooth absorption into Academe and their unproblematic achievement of a reasonable livelihood there. Since they were physics students and since this was the early period of the Cold War, they were not disappointed. Around the corner, in the faculty of arts, things looked very different. The seed of the Battle of the Faculties had by then taken root.
 

9.2

The university of my student days is gone for good. It had almost no administration and very few regulations. If I had to register today for the doctoral degree that I earned in the college in which I received it (the London School of Economics and Political Science, the University of London), I would not be able to fulfill all the requirements for it. In addition to all the formal differences between the requirements then and now, the contents are also very different. I will skip the details. The original medieval university had no fees, no requirements, and minimal rules as to the acquisition of academic degrees. Its professors were ignorant yet sufficiently learned by received standards; the idea that they must keep up with the progress in their fields of expertise could not possibly occur to them. They were all members of the clergy and thus celibate and with no financial worry. In nineteenth-century Protestant England, celibacy was still the rule; academics who wished to marry had to resign their positions.

Academe was secularized with the rise of the normal liberal democratic nation state—the United States of America and the French Republic—but it was still largely outside the sociopolitical system; economically, it was left to its own devices despite regular outside financial support. In 1830, Prussia revised its academic system and the University of London was founded; after it, many other universities followed (the Red Bricks, they are called). The nineteenth century witnessed the rise of technical universities. When the Nazi regime showed its barbarism, Martin Buber has observed, the German academic system had sufficient power to bring it down, yet it was sufficiently chauvinist to grant the Nazis the benefit of the doubt; very soon it was willing to oppose the Nazis, but by then it was too late: within three to five years it lost all its influence.

After World War II, the main role of German universities was in the hands of their departments of philosophy: they had the task of training a cadre of non-Nazi schoolteachers—under the supervision of the occupation forces. As for the professors in charge of this program, almost all of whom had been complicit with the regime to one degree or another, they were largely confused, since the Nazi ideology condemned failure and the failure of the Nazi regime was uncontested. The post-World War II German and Austrian philosophy professors were almost all Nazi to this or that degree. Some Austrian academics had been anti-Nazi activists; this made the Austrian academic system refuse to employ them; they sought a livelihood by establishing a summer school in the Austrian Alps to teach democracy. People like Schrödinger, Hayek, Popper, and Koestler lent it sufficient kudos for its founders to make a meager living. It was a success. I went there as Popper’s assistant and met there, for the first time in my life, Austrian philosophers whom I consider Nazis.

Many European academics could not hide their disgraceful conduct during the war; to relieve themselves of the shame, they reformed their academic systems. These reforms are of no interest to us here, especially since the reforms of the academic system in the United States of America, dictated by the Cold War, soon eroded much of the European academic traditions. The Cold War is over; the grant system by which the American administration infiltrated and controlled the American academic system is long gone; but the Americanization of the world academic system is an accomplished fact—for better and for worse. I will ignore this here, since this is no survey of the history of the world’s institutions of higher learning and scientific research but, hopefully, a mere manual of sorts—to help as much as possible some young people on their way to academic careers. It may help these youths to know that Academe has had a checkered past, if they wish to struggle against excessive conservatism.
 

9.3

The Cold War mobilized all intellectuals—from the academy to the entertainment industry—and they have not fully regained their freedom from the rest of the national system. No matter how admirable the American academic system is, its collaboration with forces of darkness during the early days of the Cold War is a stigma, much smaller than the stigma of Nazism of many European universities, but still one that does not go away and so is in need of healing. At the time, some academics tried to reduce the damage of their complicity in the atrocities. The idea was to mobilize the new western academic movement—the New Left—to the struggle against imperialism, namely, to support the Soviet Union in the Cold War. The most significant ideologists of the New Left were the Americans Norman Birnbaum, Noam Chomsky, and Howard Zinn. Although they were individuals with no charisma and no credibility,[68] they led a successful campaign against the Western participation in the Vietnam War. This was a success, perhaps the only success the New Left ever had, largely because the western involvement in Vietnam was a stupid failure, doomed to go nowhere ever since the French were defeated there (in the battle of Dien Bien Phu of 1954). A contribution to it was the stupidity of the Ohio National Guard: in 1970, they shot students on the campus of Kent State University, raising the ghost of the Civil War. It was the beginning of the end of that stupid, tragic war.

The conclusion of the Vietnam War should have counted as a success of American Academe. It did not. It counted as a success of the American New Left. It then only destroyed the traditional view of Academe as apolitical, a view generally received for ages: no public debate took place aiming at a re-examination of the position of Academe regarding politics, because the academics who ran the New Left thought nothing of lying more than politicians then dared to.[69] This led to a loss of credibility all round,[70] and to political lying of an unimagined degree.[71]

These digressions of mine illustrate an obvious point: the very idea of a middle-range reform of any institution should rest on a middle-range projection of democracy, which should hopefully be a middle-range projection of modern society in general. Planners tacitly assume that the world will be increasingly liberal and democratic, even though we do not quite know what this means. I do endorse this, but I think it needs public discussion. For, in my view, without liberal democracy we are doomed, yet liberal democracy is still powerful. When liberalism appeared, it sounded too utopian to be realistic. Nowadays its success thus far is too obvious for words: by and large, the rich countries in the world are liberal and the liberal ones are rich. Of course, this is not as striking as it looks, since the criterion for richness used in this context is liberal: we rightly consider Saudi Arabia poor and Finland rich, even though concerning natural resources the comparison goes clearly the other way.

Nevertheless, the facts are scarcely contestable, yet most philosophers around are anti-liberal, as if the facts obviously supported their attitude. If we expect the future of the modern world to be liberal-democratic, then, clearly, in the near future the welfare aspect of the modern state will be strong. The popular Chicago school of economics advocates a totally free market economy and opposes all welfare-state measures. The economists of that school know that their suggestion is just out of the question: they have tacitly replaced it with another proposal that is much more reasonable: monetary investments in the implementation of welfare measures are usually preferable to fiscal ones. No matter what these measures are, suffice it to say that both differ from the official Chicago demand to let the market take care of the economy unaided.

The Chicago school of economics also takes current employment patterns for granted and opposes trade unions totally. These patterns follow the industrial work-pattern that the industrial revolution produced. Some sectors of the modern economy are older: agriculture is still largely traditional self-employment. And Academe rejects the equality of employment that the free market is supposed to impose, with women and minority groups still underemployed, poorly promoted, and underpaid. All this is peanuts; it is obvious that the labor market cannot stay in its present form: it is intolerable that every discovery of a new technique for automation, or any other means of work-reduction, raises new fears of unemployment. Clearly, every such advance should lead to finding ways to raise the standard of living and to reduce working-time.[72] What should draw our attention is the new, post-industrial employment pattern of the information age. Of what it is going to be we have only some inklings. As far as we can see, it is to have all citizens become pensioners of the state. The first result of this rule will be that financial constraints will cease to force people to work. The second is that employment discrimination will thus become very different from what it is at present. The change of employment pattern will significantly reduce the pressure that forces some people to escape from reality into drug abuse and addiction. Even if not, the decriminalization of drug abuse is just a matter of time. What the law of the land should declare criminal, by the liberal canon, is not drug abuse but doing under the influence of drugs what requires full attention—like driving motorcars. Laws against drug abuse testify to the weakness of liberal democracy: they have many undesired consequences that threaten democracy (since smugglers use drug-smuggling routes for smuggling weapons and money too), and their persistence rests on the poverty of the democratic education system. Even though education is obviously the best investment, budget cuts always hit it first. This makes no sense. It will have to change sooner or later, and then the place of Academe in civil society will also change.

And so I have arrived at the topic of this section, which is, though I may have forgotten it for a while, prescriptions for your activity towards the middle-range reform of Academe. I do not know what we need to do to establish a public forum for the discussion of the future of Academe; I do not know how we should establish a steering committee for such a body; and I do not know what criteria such a committee should adopt. But I do know that the future of Academe looks more auspicious than ever before. I also know enough about obstruction to know that the future is never assured:[73] repeating silly and defunct arguments is a very efficient way to block progress towards any reform. I hope that my proposal to be deaf to them will serve as a better means of defeating them than arguing against them.

Of course, not all ignorance is willful; many ignorant individuals are eager to learn. And then, of course, it is your civic task to help them as best you can, though without spending too much effort on this task. Moreover, they may have some criticism of what you think is a silly objection to your proposals, and then it is very much in your interest to listen to them carefully. Nevertheless, I say, do not argue with them unless they show you that they share your concern with the future of Academe and that they argue not in order to win but in search of the truth. This paragraph is problematic, and I do not know how to list the problems that it raises, let alone their solutions. I still say, it is very important to avoid discussions with people who sabotage the project while looking concerned; the difference between them and the serious objectors is that the serious ones do not repeat silly popular objections, like the observation that there is no money for your proposed reform, whatever it is. For, indeed, there is never money for urgent tasks. If this were a valid argument, we would not have lived to see this day. The strongest source of optimism is that despite the atrocities that humanity has proved itself capable of, we have seen so much exciting progress—in science and in technology, but also in social engineering. The future is still open.
 

10. Intellectual Courage and Integrity

Novelty in philosophy is often a matter of daring rather than invention. — Harry A. Wolfson, Spinoza, ii, 331

Why did the rise of modern science require courage? Harry A. Wolfson has explained: modernity was science and magic intertwined. He viewed the three monotheistic religions as mere variants of one religion, and he divided this religion into rationalism and mysticism, or science and magic. The rationalists lived in Academe. The mystics had no social base. They were hermits or itinerants; they appeared seldom, and then as courtiers. As courtiers they practiced the secret arts that we view alternatively as magic, alchemy, astrology, or Kabbalah; they viewed them as one kind of activity: the effort to bring salvation to the world. The Kabbalah is usually deemed Jewish in origin; it was Greek. The reason for this error may be partly that medieval physicians were often Jews and kabbalists. Cooperation with Jews was essential for the most famous Christian Kabbalist ever, Giovanni Pico della Mirandola, author of the celebrated Oration on the Dignity of Man (1487) that moved Christian culture from the Talmudic view of us as dust and ashes, as worms and maggots, to the Talmudic view of us as the crowning glory of Creation. That view became the Renaissance idea known as humanism. The label “kabbalist” was changed into the label “Pythagorean” in the writings of the kabbalist Johann Reuchlin—due to his hostility to Jews.[74]

The Inquisitors in charge of the case of Galileo addressed him formally as Galileo Galilei, Pythagorean. In this they followed his Dialogue, which was at the focus of the case: on its first page, the Aristotelian complains that the Copernicans accuse the Aristotelians of lack of clarity while being worse offenders themselves. The Copernican admits the charge and promises to behave better. So Galileo viewed himself as a Pythagorean and undertook to cleanse his tradition of its mystic fog. His contemporary Francis Bacon was more authoritative on this matter: he demanded that the scholastic tradition be ignored and replaced with a just history of nature, namely, with factual reports unexplained and unsupported by any theory. Still, he was a Kabbalist: he advocated alchemy and magical medicine.

Under the influence of Galileo, and more so of Bacon, we still condemn medieval thought as confused. We forget that it had its own idea of clarity, one central to it. That idea rested on a philosophy: ancient thought was perfect, and learning should therefore comprise efforts to comprehend it. This, the study of the clarification of ancient ideas, is Hermeneutics, exegesis, or interpretation, three terms that we may take as sufficiently close to count as synonyms.

Hermeneutics began in antiquity, with the writings of Philo Judaeus, who declared that there is no disagreement between Moses the lawgiver and Plato. Porphyry of Tyre wrote a (lost) book called The Opinions of Plato and Aristotle are the Same. The greatest medieval Muslim philosopher, Al-Farabi, wrote The Reconciliation of the Two Sages, Plato the Divine and Aristotle the First Master. His most celebrated follower was Maimonides, who interpreted as metaphorical every text that he found unacceptable as it stood. The popularity of this technique grew. The most important example of it is the reconciliation that St. Thomas offered of the texts of Aristotle and Archimedes on gravity.

The first to break with this tradition was the Pythagorean Nicolaus Copernicus. He preferred the heliocentric system to the geocentric system for Pythagorean reasons. What he found objectionable is more important: the reliance on ancient authors as authorities. In his introduction (his letter of dedication of his book to the Pope of the time) he said: since the ancients disagreed with each other, they cannot serve as authorities. That was his greatest contribution to scientific thinking. Next came a Pisan thinker by the name of Francesco Buonamici. He declared that Aristotle and Archimedes disagreed and concluded that Archimedes was in error. His student and admirer Galileo reversed the judgment and deemed Aristotle in error. He found in Archimedes an argument for the heliocentric system (as he hinted in his 1597 letter to Kepler). His disproof of Aristotle’s theory of gravity is a milestone in the history of science.

It took great courage to declare Aristotle in error. It provoked serious objections that drew Galileo into a lifelong battle in defense of his reputation. He was an academic turned courtier (an Aristotelian turned Pythagorean). The thinkers of the next generation were upper-class amateurs. They needed courage much less than their predecessors did, yet they did not possess it. Robert Boyle, the legislator for the Royal Society of London, whose etiquette became general throughout the Age of Reason, decreed that refuting an idea should not be made explicit: the author of the refuted idea would understand; so there was no need to shame a researcher who had erred and who, being an amateur, could easily cease publishing research results.

Some researchers of the Age of Reason were academics; research was not a part of their job description. Thus, Newton left Academe as soon as he could but continued his researches. The leading physicist Luigi Galvani, the discoverer of animal electricity, who was an adjunct academic even though he was married, lost a debate and as a result left his work and went to the Holy Land on a pilgrimage. The fate of Sir Humphry Davy was worse: he advised the Royal Navy, telling them how to improve the speed of boats; his advice was erroneous and he went into exile, never to return to England. Yet it is the Age of Reason’s official dislike of criticism that is most incredible. Today it looks bizarre, since Einstein’s success in replacing Newton’s theory could not possibly hurt Newton’s reputation. Nor did Einstein intend it to; he considered Newton the greatest contributor to science of all time. Nevertheless, in his scientific autobiography he wrote, “Newton, forgive me!”

The physicist Freeman Dyson of the Institute for Advanced Study in Princeton, let me repeat, reports in his autobiography that when, during World War II, he found that the escape hatch of bombers should be enlarged to reduce the death rate of their crews, some commanders opposed his proposal out of personal pride. This is barely conceivable.

There are three cases where effective complaint is hardly to be expected. We can scarcely expect a complaint of a prisoner against a jailer to be effective, no matter how just it is. This is no news, since it is a major aspect of the 1951 bestselling novel From Here to Eternity by James Jones, made into an Oscar-winning 1953 film directed by the legendary Fred Zinnemann and selected for preservation in the National Film Registry by the Library of Congress as being “culturally, historically, or aesthetically significant”. Another famous case of this kind is that of a patient in a mental home who has a grievance against a physician. Notoriously, such grievances are often right, yet impossible to defend. The same holds—for very different reasons—for a just grievance of a graduate student against an adviser. The just grievance may be against a hostile adviser, or it may be against a committee hostile to the adviser of the graduating student.[75]

This should be advertised, since it means that a graduate student not on good terms with an adviser is better off cutting losses and changing advisers. I know of very few students who graduated successfully despite their adviser’s recommendation to the contrary. Overriding an unjust graduation committee is much harder, let me report. Still, since obviously a good dissertation is not expected to fail, let me testify that these things do happen, however rarely. Two of the very best dissertations that came my way had trouble passing, and one failed, whereas dozens of dissertations that I would not have passed as seminar papers passed smoothly.

The trouble has to do with the coupling of ignorance with cowardice. The ignorance is as to whether a dissertation will pass or fail. It is more reasonable to expect the ability to decide whether a dissertation deserves to pass. Many an academic is afraid to declare a dissertation deserving of a doctoral degree, since other academics may deny it. Indeed, every department has the right to overlook an expert report on a dissertation if it finds the report incompetent. This too requires more courage than is usually available, let me testify.

To expect courage is wrong. By definition, it is not courage if it is not supererogatory, and it is not supererogatory if it is to be expected. But integrity is to be expected; yet it is not as common as it should be. And so, when anyone lacks the courage to do what is required to avert flagrant injustice, then one is not obliged to display courage, but one is obliged to show decency, and this may require recusing oneself. Alas, in order to recuse oneself, all too often one needs more courage than most can amass. This holds for run-of-the-mill tasks. When the task is necessary but not possible, it is time to search for institutional reform. This requires even more courage, since it is more outlandish.

This, then, is an impasse. The many successes of reform movements are a miracle of sorts. So we must grant the Good Lord the benefit of the doubt.

FRONT MATTER | PREFACE | PROLOGUE | PART I: DIAGNOSIS | PART II: ETIOLOGY |

| PART III: PRESCRIPTIONS | PART IV: PROGNOSIS | EPILOGUE |

TABLE OF CONTENTS
 


[1] There are exceptions, though. Wilhelm Hauff had used coincidences freely; still, it is a weakness of his art, tolerated since his stories are otherwise terrific. Similarly, readers forgave Boris Pasternak his having his Dr. Zhivago bump into his sweetheart in the vast Siberian steppes.

[2] Some academics are influential in the academy or outside it. They may receive invitations to international academic gatherings or organize them. Others are editors or consultants. And so it goes.

[3] Margarete Buber-Neumann, Plädoyer für Freiheit und Menschlichkeit: Vorträge aus 35 Jahren, 1999.

[4] The trouble with old mathematical texts is old terminology. Avoid it. There is enough stuff without it. Some ancient and Renaissance texts hardly use formulas. Early twentieth-century and later literature is easier to read. Some texts stand out: Maxwell’s texts are enjoyable, as his equations are easy to translate into Heaviside’s terminology or even simply to skip.

[5] The reason for this is historical: mediaeval and Renaissance artisans (including artists) who wished to be masters submitted masterpieces to their guilds. The university began in Salerno as a separate (medical) guild as patients and physicians flocked to the dispensary of the monastery there; it ceased in 1812.

[6] Not that criteria for novelty of mathematical theorems prevail. It is not very clear what theorem is new, what is a mere lemma, what is a variant of a known theorem, and what is a mere rewording of one. My image of math is thus somewhat idealized.

[7] Where the requirements are clear, as in some chemistry and biology labs, a doctorate is a simpler matter. The problem there, however, is that the requirement is often for positive results, which makes doctorates gambles. Popper’s methodology should alter this soon.

[8] The same holds for novelist Jack London: leading animal psychologist Konrad Lorenz has recognized him as the father of modern animal psychology.

[9] Donald O. Hebb, Biological and Biochemical Bases of Behavior, 1958, “Alice in Wonderland or Psychology Among the Biological Sciences”, 451-67.

[10] There is logic to all that. Newton was supposedly infallible until 1818, when his optics was superseded. Physicists then felt the urge to prove that his theory of gravity is better founded.

[11] For this challenge, see my “Changing Our Background-Knowledge”, Synthese, 19 (1968-69) 453-464.

[12] This is what Ludwig Wittgenstein spoke of: rule following ousts metaphysics. See my Ludwig Wittgenstein’s Philosophical Investigations: An Attempt at a Critical Rationalist Appraisal, 2019, 229.

[13] In a sense, the writings of Dostoevsky comprise religious propaganda. His greatness was his rising above it as his magnificent “The Grand Inquisitor” illustrates.

[14] For “old horse” see Orwell’s Animal Farm, 1945; it is a standard term in the communist inner jargon.

[15] The objectivity of information is difficult to recognize. To recognize the transfer of a sign from one list to another in a computer is to see the refutation of the description of a sign (as opposed to a symbol) as a physical object―a description that Rudolf Carnap made famous, that Hilary Putnam expanded, and that is still popular. Frege and Peirce had declared information a third domain next to the body and the mind. Russell denied it, but he finally admitted inability to make do without it.
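To make the footnote’s point concrete, here is a toy sketch (my illustration, in Python; the list names are invented for the example): “transferring” a sign from one list to another changes bookkeeping entries, while no physical token travels anywhere.

```python
# Toy sketch: the "transfer" of a sign between two lists in a computer.
# Nothing physical moves; only references (bookkeeping entries) change.
inbox = ["the sign"]
archive = []

sign = inbox.pop()    # the reference leaves one list...
archive.append(sign)  # ...and enters the other

print(inbox)               # [] -- the sign has "left"
print(archive)             # ['the sign'] -- the sign has "arrived"
print(sign is archive[0])  # True: the very same object, merely re-listed
```

If a sign were a physical object, its transfer would be a physical motion; here it is only a change in two lists.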

[16] The testability of claims for skill is notorious. Its expression is the ancient adage: Hic Rhodus, hic Salta. (We can challenge one who brags about having performed a difficult jump: jump here and now!) The identification of knowledge with skill and skill with the ability to follow a rule here-and-now is the most basic idea in the later philosophy of Ludwig Wittgenstein; it is a major reason for its popularity as well as for its barrenness.

[17] See John R. Wettersten, The Roots of Critical Rationalism, 1992, for enlightening details.

[18] Heinrich Gomperz reports in his autobiography that he had told Freud he always remembered what Freud was insisting that he had forgotten. Freud ignored this. This is an amazing story: it speaks of humans as males, with only their mothers as females. Remember that most of Freud’s early patients were female. He discusses the sex act only once—in his The Ego and the Id—and only as a male experience! The generality of Freud’s explanation aroused the hostility of the anthropologist Malinowski. See his Sex and Repression in Savage Society, 1927.

[19] Some of Menander’s works were found in the papier-mache of a mummy case!

[20] Frances A. Yates, The Art of Memory, 1966, refers to many mnemonic ideas. She was concerned not with mnemonics but with its role in the hermetic literature. The mnemonic network is memorized in accord with the drill theory, but its idiosyncratic character makes it special. The drill theory cannot explain idiosyncrasy (least of all the idiosyncrasy of past associations: these are negligible as compared with the drill of the—arbitrary—network itself).

[21] Paul Feyerabend too sounds more irrationalist than he is: he ends his 1987 Farewell to Reason with: if this is reason, then farewell to reason.

[22] Instead of a summary sentence, people often offer a descriptive phrase—because unlike a descriptive phrase a statement can be false: they wish to appear infallible, no less.

[23] This is far from obvious. It is the subject matter of the marvellous book of Jacques Hadamard, The Psychology of Invention in the Mathematical Field, 1945.

[24] This is far from obvious. It is the subject matter of the marvellous book of George Polya, How to Solve It: A New Aspect of Mathematical Method, 1945.

[25] This is far from obvious. In the marvellous Proofs and Refutations of 1976, Imre Lakatos complained that math professors expect students to absorb the missing items that in truth even they lack.

[26] Michael Polanyi, Personal Knowledge, 1958, 1973, 290.

[27] Bacon already insisted that induction does away with the need for both luck and talent. Inductivists leave no room for talent, at least, in their theories of discovery. Talent came into the picture with Einstein. See my “Genius in Science”, Philosophy of the Social Sciences, 5, 1975, 145-61, reprinted in my Science and Society, 1981.

[28] I say “as a rule” because a good teacher may undertake it out of real interest, or a good teacher may be penalized or underestimated and so forced to do it. I have witnessed a course in mathematics for biologists that an underestimated young instructor taught; some of the best mathematics students around slipped into his class and I followed them. He soon won a fellowship and disappeared to a leading university. Let me add to this that many private teachers for matriculation are famous magicians, experts in cutting courses to the bone. Their teachings are remembered, though as a sheer bonus.

[29] See my “An Unpublished Paper by the Young Faraday”, Isis, 52, 1961, 87-90.

[30] See Maurice Crosland, “Humphry Davy—An Alleged Case of Suppressed Publication”, The British Journal for the History of Science, 6.3 (1973): 304-310.

[31] Karl R. Popper, The Logic of Scientific Discovery, 1959, §4.

[32] Thomas S. Kuhn, The Structure of Scientific Revolutions, 1962; what he called normal science exists only since World War II. This is a systematic defect of his philosophy: although his examples are historical, his ideas are not: his description of science is distinctly contemporary.

[33] There is a well-known self-portrait of Sodoma. It is an early fresco in the Benedictine Abbey and Monastery of Monte Oliveto Maggiore, south of Siena. It is not central and not striking, and he is surrounded there with animals―for the sake of identification, in accord with tradition, for which see E. H. Gombrich, The Uses of Images, 1999.

[34] The experts say, the Gospel of St. John is the latest among the four Gospels.

[35] See my “Can Adults Become Genuinely Bilingual?”, in A. Kasher, ed., Language in Focus, 1976, 473-84; reprinted in my The Gentle Art of Philosophical Polemics: Selected Reviews and Comments, 1988.

[36] See my “The Logic of Scientific Inquiry”, Synthese, 26, 1974, 498-514, republished in my Science and Society: Studies in the Sociology of Science, 1981.

[37] See my “Piano Pedagogy as a Test Case”, review of Lia Laor, Paradigm War, Lessons Learned from 19th Century Piano Pedagogy, Asia-Pacific Journal for Arts Education, 19, 2020, 1-19.

[38] For more see JA and Abraham Meidan, Beg to Differ: The Logic of Disputes and Argumentation, 2016.

[39] Portraits from Memory, 1956. Unfortunately, Russell says there, the source of their ability is in their detached terminology. In a letter to the London Times defending Ernest Gellner’s attack on Oxford philosophy, he advocates a better view: biased terminology and malice—alleged or real—are of no importance; the concern of a critical debate is with the justice or injustice of a given item of criticism.

[40] For more on this see John Rajchman and Cornel West, eds. Post-Analytic Philosophy, 1985. See also my 2018 Ludwig Wittgenstein’s Philosophical Investigations: An Attempt at a Critical Rationalist Appraisal, for Wittgenstein’s resolute decision to eradicate metaphysics.

[41] Some say, aristocrats are inefficient in principle, so that a true blue law student should aim at a “gentleman’s C”. This is sheer snobbery.

[42] Commentators often wonder why democracies are never sufficiently prepared for war. It is that sufficient preparedness endangers democracy as it favors military dictatorship.

[43] Karl Popper asserted in a radio program that Oxford philosophers were ostracizing him, and they vehemently denied it. When later on Imre Lakatos published mock-criticism of Popper, the same Oxford people made a song-and-dance about it, thus admitting openly that he was a worthy target for criticism.

[44] See however Constance E. McIntosh, Cynthia M. Thomas, David Eugene McIntosh, A Nurse’s Step-by-Step Guide to Academic Promotion and Tenure, 2018.

[45] His failure was to the good: his political philosophy was anti-liberal. As he explained in his Major Barbara, 1907, in his view liberalism is the mere hypocrisy of the rich. No: it is much more than that.

[46] This seems to me the way Popper overcame the ostracism against him. His crowd was called his three knights because, indeed, among his openly-confessed fans were three knighted professors.

[47] https://youtu.be/EzD0YtbViCs?t=412

[48] John Ziman, Real Science: What It is, and What It Means, 2000, 10.5.

[49] Michel de Montaigne invented the essay (in the late sixteenth century). Francis Bacon followed him; he also invented the scientific, empirical essay (in the early seventeenth century). Robert Boyle invented the periodical as a means for publishing new empirical information. Peer review evolved unintentionally. See Ann C. Weller, Editorial Peer Review: Its Strengths and Weaknesses, 2001.

[50] Barbara Ehrenreich and Deirdre English, Witches, Midwives, & Nurses: A History of Women Healers, 2010; Julian Goodare, The European Witch-Hunt, 2016.

[51] Willard van Orman Quine, “A Comment on Agassi’s Remarks”, Journal for General Philosophy of Science, 19.1, 1988: 117-118.

[52] Writing during the Restoration, Boyle intended to separate religious from scientific testimony and to render scientific testimony fallible yet not under dispute. Achieving this was magnificent. Most philosophers of science ignore this altogether. This assures that their work is scientifically irrelevant.

[53] A theorem for experts: the conservation of energy is a first integral of any system of central forces.
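For the curious, a minimal sketch of the one-particle case (my addition; the many-particle central-force case runs along the same lines). With Newton’s second law $m\ddot{\mathbf{x}} = \mathbf{F}$ and a conservative (e.g., central) force $\mathbf{F} = -\nabla V$:

$$\frac{d}{dt}\Big(\tfrac{1}{2}m\,\dot{\mathbf{x}}^{2} + V(\mathbf{x})\Big) = m\,\dot{\mathbf{x}}\cdot\ddot{\mathbf{x}} + \nabla V\cdot\dot{\mathbf{x}} = \dot{\mathbf{x}}\cdot\big(m\,\ddot{\mathbf{x}} - \mathbf{F}\big) = 0,$$

so the energy $E = \tfrac{1}{2}m\,\dot{\mathbf{x}}^{2} + V(\mathbf{x})$ is constant along every trajectory; that is, it is a first integral of the equations of motion.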

[54] Refutation is an unproblematic logical concept: an observation report refutes a theory if they contradict each other. Hence, the distinctness of observations is context-dependent. This makes case studies historical.

[55] Edmund Wilson dismissed Raymond Chandler, saying his kind of art is not serious. Sadly, Chandler took offense; he should have been haughty enough to stay above Wilson’s snobbery.

[56] Carol Tenopir, Rachel Volentine, and Donald W. King, “Social Media and Scholarly Reading”, Online Information Review, 37.2 (2013): 193-216. Abstract:

There have been hundreds, perhaps thousands, of studies of journal reading by professionals in such fields as science, engineering, medicine, law, social science and the humanities. These studies have many reasons, including the wish to better understand professional communication patterns and the role this plays in their work. Some studies also focus on providing specific information to journal system participants such as publishers, librarians, other intermediaries and their funders. In this article, we present a description of a little used but powerful method of observing reading by scientists. This method is designed to measure the amount of reading of specific journal articles and entire journals to complement exclusive observations of electronic journal hits and downloads, transaction logs, limited counts of citations to journals or articles and rough estimates of total amount of reading by professionals compared with total number of articles published.

[57] The leading example for this is what Robert K. Merton has labelled the Matthew Effect: editors prefer established authors: they are hardly interested in their readers because they hardly know what their interest is. Now the choice of authors is an indirect choice of readers. Science readers read out of a sense of duty; popular science still caters for the curious. This makes it excellent.

[58] See my “In Defense of Standardized On Demand Publication”, in Miriam Balaban, ed., Scientific Information Transfer: The Editor’s Role, 1978, 133-9. (Republished as “Storage and Communication of Knowledge” in my Science and Society, 1981.)

[59] Bertrand Russell, Portraits from Memory, 1956, 77.

[60] See my Ludwig Wittgenstein’s Philosophical Investigations: An Attempt at a Critical Rationalist Appraisal, 2018.

[61] Hume’s path-breaking 1742 essay “Of the Rise and Progress of the Arts and Sciences” is amazingly refreshing: in it, he speaks of criticism in a most favorable mode. Why did he never take it up again?

[62] See my Ludwig Wittgenstein’s Philosophical Investigations: An Attempt at a Critical Rationalist Appraisal, 2018, pp. 5, 6, 10, 21, 39, 185, 187, 202, 216, 225, 242, 243, 263, 265.

[63] Briefly, Wittgenstein used their modern logic to present a compact version of neutral monism. This is an important lesson from his book. See the previous note on Wittgenstein.

[64] A. Einstein, The Meaning of Relativity, 1922, is a classic textbook. Arthur S. Eddington, The Nature of the Physical World, 1927, is terrific popular science, but it is hard to read despite its author’s abilities. That Jules Verne managed to include popular science in his fiction is awe-inspiring.

[65] Bernard Shaw, Saint Joan, 1923, Preface: Joan and Socrates.

[66] Steve Fuller, The Sociology of Intellectual Life: The Career of the Mind in and Around the Academy, 2009, Ch. 3.

[67] “Peer Review: A Personal Report”, Methodology and Science, 2, 1990, 171-180, available on the internet for free.

[68] The big push to the movement came in West Germany, where Birnbaum, an American Jew, fashioned the disgusting slogan, the Vietnam War is worse than the Holocaust.

[69] The myth that universities are apolitical has historical roots: mediaeval universities were a separate class. Local police had no authority on their territories, even in central-European universities, up to the nineteenth century, when they served as shelters for nationalist rebels. Later, the weighty Max Weber said, it is beneath the dignity of a professor to use the podium as a propaganda tool for a political party.

[70] During the Vietnam War, many leftist professors would not fail students since dropping out of college meant induction to the military. This had a tremendous effect on the way Academe integrated in society: no more ivory tower. See my “The Ivory Tower and the Seats of Power”, Methodology and Science, 24, 1991, 64-78.

[71] Sissela Bok, Lying: Moral Choice in Public and Private Life, 1978. She discussed there the decline of standards of truthfulness in American politics—and thus of rationality there. If this goes on, then farewell to American democracy. This is possible, but beyond my imagination, I confess.

[72] Judith Buber Agassi and Stephen Heycock, eds., The Redesign of Working Time: Promise or Threat? 1989.

[73] Condorcet said the future is certainly rosy since people will emulate success and desist from failure. He ignored human folly. See his Sketch for a Historical Picture of the Progress of the Human Mind, Tenth Epoch. Future Progress of Mankind, 1795.

[74] Moshe Idel, Representing God, 2014, 124.

[75] This happened at Yale University to Dr. Joan Bailey. See her “Dungeons and Dragons: A Lesson in Etiquette from Professor Agassi”, in Nimrod Bar-Am and Stefano Gattei, eds., Encouraging Openness: Essays for Joseph Agassi on the Occasion of His 90th Birthday, 2017.


