Epistemological Consequences of Bibliometrics: Insights from the Scientific Community, Tommaso Castellani, Emanuele Pontecorvo, and Adriana Valente

Author Information: Tommaso Castellani, Institute for Research on Population and Social Policies, t.castellani@irpps.cnr.it; Emanuele Pontecorvo, Physics Department, Sapienza University of Rome; Adriana Valente, Institute for Research on Population and Social Policies, National Research Council of Italy, adriana.valente@cnr.it

Castellani, Tommaso, Emanuele Pontecorvo and Adriana Valente. “Epistemological Consequences of Bibliometrics: Insights from the Scientific Community.” Social Epistemology Review and Reply Collective 3, no. 11 (2014): 1-20.

This research has been supported by the ScienceOnTheNet project of the Italian Ministry for Education, University and Research.

The PDF of the article gives specific page numbers. Shortlink: http://wp.me/p1Bfg0-1GH

Image Credit: SamahR, via flickr

Abstract

The aim of this paper is to investigate the consequences of the bibliometrics-based system of evaluation of scientific production on the contents and methods of the sciences. The research was conducted by means of in-depth interviews with a multi-disciplinary panel of Italian researchers. We discuss the implications of bibliometrics for the choice of research topics, for experimental practices and for publication habits. We observe that the validation of bibliometric practices relies on their acceptance and diffusion within the scientific community, and that these practices are self-sustained through their wide application. We also discuss possible future scenarios, considering the recent development of digital archives.

Introduction

In a first approach to the demarcation problem, the process of obtaining a scientific result and the process of communicating it were considered two separate activities. Although ancient Greek rhetoric may have played a significant role in the development of the mathematical proof method, the claimed objectivity of science relies on the independence of the result from the rhetorical aspects of its presentation. Yet already Galileo wondered whether the ‘readiness of saying’ (‘prontezza del dire’) of an individual might steer a scientific dispute towards a certain solution (Galilei 2002). Incidentally, Galileo himself was a master of rhetoric, and his writing style is considered a high point of Italian literature: Galileo’s writing ability is undoubtedly one of the main reasons for his success, even with regard to his wrong claims (Feyerabend 1975).

But beyond the effects of communication practices on the dissemination of a result, an even more interesting question concerns their influence on the process which leads to the result. In the framework of the Laboratory Studies (Knorr Cetina 1995; Latour and Woolgar 1979), much emphasis has been placed on this rhetorical aspect of the construction of the scientific fact. These studies laid the ground for further reflection on the impossibility of separating communication practices from research activity proper, overcoming the distinction between the context of discovery and the context of justification drawn by the classic epistemological tradition (Reichenbach 1973; Hanson 1958). Latour unified under the name of eloquence the two inseparable activities of rhetoric and scientific demonstration (Latour 1987; Latour 2010). According to this framework, the writing of a scientific paper deeply influences how a piece of research is set up and conducted.

Communication within the scientific community was initially oriented towards the sharing of information (Valente 2002a). Early communication between scientists took place mostly by letter, until in the mid-seventeenth century the main academies began to publish their proceedings, which later became scientific journals. Specialised journals appeared between the nineteenth and twentieth centuries, initially only in the field of medicine, and in extremely limited numbers: in mathematics and physics, for instance, no more than a dozen journals were published on a regular basis in the mid-nineteenth century. After the Second World War, there was an explosion in the number of scientific journals. Journals managed by research institutes were gradually replaced by the system of commercial scientific publishing, which limited access to publications, formally introducing an opposition to the sharing of knowledge. At the same time, scientific journals became the almost exclusive and official communication instrument in many fields of science, leading de Solla Price to claim that science is what is in scientific journals, and that scientists are those people who have published at least one article in a scientific journal (de Solla Price 1965).

After 1945, with ‘big science,’ the growth of scientific production became exponential (de Solla Price 1963; Ziman 1980; Vickery 1990). The debate on the effects of such growth is still going on (Boutry 1970; Bracey 1987; Hamilton 1990; Valente 2002a; Valente 2002b). One of the causes of this exponential growth has been identified in the evaluation system based on bibliometric indicators, which may encourage quantity rather than quality (Klein 1985; Buchholz 1995; Vinkler 2000; Lawrence 2003). Researchers are divided on the weight of the positive and negative effects of quantitative bibliometric criteria for evaluating science, but at the moment these criteria are generally accepted, at least in the hard sciences, as the most reasonable way to solve problems ranging from individual career progression to the allocation of funding at every level, up to the social accountability of research institutions (Jefferson et al. 2002; Anon. 2009; Vinkler 2010; Cronin 2011; Callaham and McCulloch 2011). Although a general consensus is indeed a prerequisite in contemporary paradigms of public management, several criticisms have come from scientists (Godlee, Gale, and Martyn 1998; Rothwell and Martyn 2000; Gura 2002; Ioannidis 2006; N. S. Young, Ioannidis, and Al-Ubaydli 2008; Power 2008; Ségalat 2009; Smith 2010; Steinhauser et al. 2012; Pinto 2012; Bohannon 2013) and even from editors (Waters 2004), recently reaching the mainstream media (Economist 2013; Schekman 2013). The debate is thus growing, and it is increasingly acknowledged that bibliometrics must be used cautiously (UNESCO 2005).

The effects of bibliometric evaluation have been investigated both from a quantitative point of view, e.g. by analysing the structure of citations (Redner 1998; Lehmann, Lautrup, and Jackson 2003; Radicchi, Fortunato, and Castellano 2008; Ren, Shen, and Cheng 2012), and from a qualitative point of view, e.g. by administering questionnaires to researchers on the effectiveness of peer review or on its relationship with scientific fraud (Gidez 1991; Fanelli 2009).

To our knowledge, much less has been done to investigate specifically the consequences of bibliometrics-based evaluation on the contents and methods of science, studying the epistemological implications of an evaluation system that mainly encourages the production of a large number of articles, each aiming to collect the largest possible number of citations, all within a commercial publishing business. Beyond a Goodhart’s-law-type argument – when a measure becomes a target, it ceases to be a good measure (Goodhart 1984) – one can argue that the simple act of writing influences the construction of scientific ‘facts.’ Such an act is deeply transformed by the practices which the ‘publish or perish’ imperative brings daily into the labs at the very moment scientists use their ‘inscription devices’ (Latour and Woolgar 1979).

As a starting approach to such a complex investigation, we interviewed a panel of Italian researchers from different disciplines (from the hard sciences to the social sciences), eliciting their views on these evaluative practices along a uniform set of questions. From this point of view, the Italian scene represents a very interesting field of study, as the scientometrics-based evaluation system has grown in importance in recent years, moving from the hard sciences to the humanities and acquiring enormous weight in a relatively short period of time. This rapid growth of the impact of scientometrics in Italy may highlight trends which remain hidden in countries where scientometrics is a long-standing habit.

All the interviews reveal the existence of a ‘before’ and an ‘after’ the moment quantitative indicators started to influence daily scientific activity. Everyone in our panel developed their career in a system which did not attach as much importance to bibliometric indicators (with differences depending on discipline and age). Nevertheless, all of them – independently of their criticisms – would strongly encourage their students to take care of bibliometrics in their work. This process breaks the usual scheme in which the older generations transmit their practices to the newer generations, whose duty is criticism and change.

The set of questions which we discuss in the following sections is very general, but it allowed us to point out some main issues concerning the pillars of ‘common sense’ about science and about what ‘scientific’ means, even among scientists.

Methodology

Our work is based on interviews with a multi-disciplinary panel of scientists.

The use of in-depth interviews to understand the working mechanisms of the scientific community, as well as its attitudes and values, is a long-standing methodology, used in many classics of the sociology of science (Merton and Kendall 1946; Latour 1987), and reflection on the power and limits of such an approach has continued for years (Potter and Mulkay 1985; Riesch and Potter 2013). In a recent Italian book, interviews with a multi-disciplinary panel of researchers were used to gather scientists’ opinions on the relationships among the different disciplines (Gagliasso, Memoli, and Pontecorvo 2011).

Figure 1. Panel composition.

Our multi-disciplinary panel was composed of 12 people, roughly uniformly distributed across an age range of 40 to 65 in order to include only experienced scientists, at different stages of their careers but with solid curricula. Among the 12 researchers there were 6 men and 6 women, working in the following disciplines: physics, biology, chemistry, medicine, neuroscience, economics, cognitive science, engineering, sociology and philosophy. We included both scientists doing strictly discipline-focused research and scientists working at the boundaries between disciplines.

To all panellists we administered a questionnaire of 22 questions, constructed as follows. First, we carried out a literature review identifying the main issues concerning the epistemological effects of bibliometric evaluation (Gura 2002; Lawrence 2003; Ioannidis 2006; N. S. Young, Ioannidis, and Al-Ubaydli 2008; Ségalat 2009). Then we classified them into two main categories: ‘epistemological issues,’ strictly regarding the way research is done, and ‘communication issues,’ more linked to the practices of information exchange. We wrote at least one question for each issue on our list, distinguishing between questions about actual behaviours, motivations and opinions. The interviews were performed from April 2013 to March 2014.

Results and Discussion

1.1  ‘Publish or perish’

“Nowadays I tend to publish everything, in the past it was not so,” says a cognitive scientist on our panel, “…and I do not give in to the temptation of working on subjects which I know I will not publish.” Such a sentence can hardly be heard nowadays in scientific communities – especially in the hard sciences – and it may sound surprising precisely because it states what has become obvious. Nevertheless, it reveals an ongoing change of attitude in research, a change which is by now complete elsewhere in the world. Another interviewee, a computer engineer, says: “compared to the first years of my research, some strategic decisions have changed. At the beginning I also focused heavily on conferences, and I would send a paper to a journal only when it was very mature, with verifiable results; now I have changed my strategy a lot.” These statements seem to unveil an underlying tendency: ‘laboratory life’ is segmented into many smaller events, each of which must lead to an ‘inscription’ (Latour and Woolgar 1979). Every act must, if possible, be transformed into a communication to the community.

This reinforces the idea that the moment of communicating the results enters into the scientific fact. This idea is well represented in our interviews: one chemist affirms that one should start to think about the paper as soon as she/he gets the main result, since at that moment “you have to think of the audience which will read the paper in order to choose the ‘decorations’ targeted for that audience,” meaning all the related measurements and characterizations which may be relevant for the community one wants to reach. In the same vein, an economist thinks that “it is a good methodological criterion for young people to think where to publish and then start to write,” adding that “it is not always the principle I used.” “Writing is also a way of building a theory,” says a cognitive scientist. As an extreme case, a neuroscientist recounts: “my boss used to write the article before you did the experiment, leaving white spaces to be filled depending on the results of the experiments. Sometimes, he even wrote two different conclusive paragraphs, one opposite to the other, to be used depending on the actual results of the experiment.” A biologist says: “generally, when you start to notice that a bunch of experiments tells you a story, you start to think about the paper. It is quite important, as while thinking about what to write to convince other people you realize which experimental evidence you are missing to complete the work.” “We always try to make a coherent story in itself, good to be sold,” says the same biologist, “but if we can split it in a couple it is better.” This last sentence of course raises the issue of ‘salami publishing’ – which we will address below – but here we aim to underline an attitude which not only impacts the structure of the scientific literature but deeply rules the processes of scientific production and the way every single researcher looks at her/his actions. This activity is devoted to an audience sometimes even before researchers have built their own understanding and overcome their own skepticism.

This attitude also seems to be supported by two factors: the ease of getting published and the difficulty of emerging from the noise and promoting a research outcome. According to our interviewees, “to publish is not difficult, it is difficult to publish in magazines that are read” and “it is not a problem of publishing or not publishing. Everything is published in the end. If an article is not published, it is really poor. The problem is the ranking.” In fact, despite the explosion of scientific publications and the proliferation of scientific journals, apart from supporting a huge commercial publishing business, it seems that contemporary science could not escape the ‘Matthew effect’ (Merton 1973). Gatekeeping strategies seem to have a preferential playground in the noise of the enormous amount of scientific information available today. The editorial policies of the main journals have been strongly criticized on this point, and such criticism may also be found among our interviewees. A biologist suggests: “The journals consider above all where you come from. In most cases, what happens is that they do not even send the work to the referees. The editors are taking on much greater power than just ten years ago. Today it is basically the editor who decides whether or not your work is interesting for the journal, and it is interesting mainly depending on the group from which it comes. The content is not even evaluated, since the work is not sent to the referees.” A neuroscientist confirms: “Once a scientist has published in a major journal, from that moment on he will easily publish,” and reports her/his experience: “when I wrote with the director of the institute, and he was obviously the first name, it was easier to publish the article.” And a physicist says: “Editors have an agenda. […] In commercial journals, it is their right. But then it should better be made explicit.” And then: “The figure of the editor should be better analysed. The editor also chooses the referees, knowing very well whom to choose in order to kill an article.”

The ‘induced scarcity’ of papers published in the most influential journals has been identified as one of the strongest distortions in the game of scientific production (Ségalat 2009), but it relies on rules and habits which scientists have deeply accepted in their practices. A physicist says on this point: “Do the indicators (impact factors, H-index of the authors…) reflect the quality of the papers? In my field, my answer is that a weak correlation is there, say 10^-2 but not 10^-7. This correlation depends on the fact that if I have a really good result, and I’m not a senior, for me it is worth investing in order to publish the paper in a very good journal. It’s in our own interest to make our best papers more visible. So this correlation does not depend on whether the journal ranking is valid or not, but on what we think of this ranking.” It is important to remark that physics was the first area in Italy to be influenced by bibliometrics, and has been for a long time. This quotation stresses that the validation of bibliometric practices relies on their acceptance and diffusion within the scientific community, and that these practices are self-sustained through their wide application. In other words, we could say that the discipline of scientometrics has a strongly performative character, in the sense that it creates the phenomena it describes, in the same way economics is often charged with doing.
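For readers less familiar with the indicators this physicist mentions, the H-index can be stated compactly: an author has index h if h of her/his papers have at least h citations each. The following is a minimal sketch in Python, using hypothetical citation counts (the data and the function name are ours, purely for illustration):

    def h_index(citations):
        # Largest h such that h papers have at least h citations each.
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical citation counts for one author's papers.
    print(h_index([42, 17, 9, 5, 3, 1]))  # prints 4: four papers with at least 4 citations each

The sketch makes visible what the interviewees’ doubts are about: the indicator compresses an entire publication record into a single number, discarding any information about the content or quality of the individual papers.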

1.2  Choosing the Theme

The question of how researchers choose the object of their studies has been widely discussed in the philosophy of science. Popper noted that the idea that science moves from the ‘observation’ of the world lacks substance: he used to invite his audience to ‘just observe’ to underline the naivety of such a claim (Popper 1959). Polanyi considered the ‘intrinsic interest’ of a fact as one of the attributes which make it enter into science: “to form part of science, a statement of fact must be […] interesting, and, more particularly, interesting to science” (M. Polanyi 1967). According to Polanyi, the features of this ‘interest’ are part of the ‘tacit knowledge’ of scientists, as is the ability to choose ‘interesting’ facts. The question of the ‘choice of the facts’ has also been addressed by Poincaré, according to whom it is this choice that, more than anything else, characterizes the scientific method. In his mathematician’s vision, interesting facts are the ones which are going to repeat: the ‘simple and regular’ facts that allow one to make predictions about a multitude of different future events (H. Poincaré 1958). Indeed, the value of simplicity can be found in many scientists, starting from Galileo, and even in pre-scientific thinkers – e.g. in the principle of Occam’s razor – and only in the 20th century has it been accompanied – but not superseded – by the paradigm of complexity.

In contemporary science, the ‘choice of the facts’ is more a social than an individual matter, and it is basically linked to the environment in which researchers work, both their institution and the ‘invisible college’ they belong to. Our interviews show that bibliometrics is one of the elements which play a relevant role in orienting researchers’ choice of specific research subjects.

The fact that everything is published “sooner or later,” discussed in the previous paragraph, can be complementarily seen as the fact that scientists work only on what they consider publishable. While this may most often be an unconscious choice, for some it constitutes a conscious renunciation of studying other things, as is evident in the quotation presented in the previous paragraph: “I do not give in to the temptation of working on subjects which I know I will not publish.” The use of the word ‘temptation’ suggests that the non-publishable subjects this interviewee avoids would nevertheless be interesting and stimulating for her/him.

Many of the interviewees state that there is a tendency to work on topics which are more suitable for producing good publications. According to our interviewees, this can take a variety of forms:

  • The choice of a fashionable theme, namely a theme that for some reason is considered particularly interesting at the moment. This seems to be relevant for all disciplines, including the humanities. A physicist declares: “Once something is fashionable, you can make an article about it with low scientific value. Only if you already have a big name can you write on a rare topic.” A philosopher declares: “I think these phenomena also exist in philosophy, at least in my area. For example, the whole realism-constructivism debate is very strong at the moment. If you enter this debate in any way, to say anything, you’re already ahead.” The same physicist considers the presence of a large number of low-quality papers and seminars on fashionable themes as a confirmation of this trend: “The bell curve of the works is not symmetric; there is a wide tail of low quality. Even at conferences, it is more common to listen to outrageous talks on fashionable topics.” In our panel, only an economist does not agree that writing on a fashionable theme is a good strategy, declaring that “In general it is almost the opposite, the fashionable field involves many people, and the literature is also extensive. […] To be noticed in a fashionable field you must really have a strong argument.” Among our interviewees this point of view is an exception, and the opposite vision prevails, although others consider a new theme an opportunity, not devoid of risks: “It’s like going to America in 1492; there are no states yet and no areas of influence. It is obvious that wars break out. Then schools of thought set up.” A doctor stresses how the economic and social consequences of scientific applications may be a pushing factor: “since the invention of Viagra […] in andrology they do not talk of anything but erectile dysfunction.”
  • Placing the article in the tail of an important discovery. When a topical paper has been published, many researchers publish minor results on the same topic, drawing high attention. As one interviewee declared, “once a discovery has come out, many people jump on the bandwagon.”
  • Focusing the article on some trendy keywords. A biologist declares that many researchers “follow a keyword”: “In our field, if you just say cancer, genetics, or other keywords, even if the research is not excellent, it has great appeal.”
  • Choosing short empirical papers rather than books or long essays on theoretical and argumentative topics. This may particularly affect the socio-economic area. A sociologist declares: “If you spend so much time working on a complex model with a large sample, how can you manage to report it in a short paper of ten pages? No, it is better to work on small cases, a chitchat on network models for example; it is cheap and there is not much graphics, the journal does not complain, you do a small ethnographic work […] and you publish your small research. It took you a little, it cost a little. […] Everything that is related to argumentation has no place in the ISI-Thomson criteria. If I am an economist and I want to write an essay on universal income, 40 pages are not enough. […] But there is no journal that will publish more than 12 pages of an article, and they also give you the structure of the paper: introduction, discussion, conclusions, etc. […] The esprit de système and a broad viewpoint are useless.” And a cognitive scientist states: “it has become difficult to publish theoretical work in good journals. To publish in high-quality journals, you need to arrive as quickly as possible at empirical research.” From these quotations, it is evident that the choice of publishing short articles is not only the choice of a communication strategy, but also affects the contents of the research. Different instruments, such as books and articles, have always had different aims (Clemens et al. 1995). A biologist declares: “the kind of research I did, very much based on qualitative analysis […], often long-term, requires much work, and it is difficult to write it up in articles of a length acceptable to a journal.”
  • Placing your paper within a clear school of thought. This seems to be particularly relevant for the social sciences and humanities. An interviewee states: “In philosophy, bibliometrics pushes you onto certain sides: for instance, you cannot alternate between an ‘analytic’ and a ‘continental’ approach. You simply cannot. […] From a conceptual point of view it would be better to use the analytical tools provided by the diverse approaches when you need them, whereas nowadays you have to choose, and then you’ll find the journals for the ones and for the others. If you place yourself at the intersection, the referees tell you that you have to be more careful with the languages.” According to a neuroscientist, it may happen that particularly ‘powerful’ schools of thought try to hinder the articles of a competing school: “if a ‘big name’ follows a certain theory and you, unknown, follow the opposite one, the members of that school, as referees, will not accept your article.”

A negative consequence of this approach is the risk of uniformity. To describe this risk, a philosopher on our panel draws a comparison with the Middle Ages: “The bibliometric approach reminds me in some ways of medieval research, in which they said: ‘on what do you have to work?’ ‘You have to work on the writings of the post-Aristotelians.’ The transition from ‘on what do you have to work’ to ‘what is licit to think’ is very thin, a hair.” A chemist talks about topics which are “left behind,” noting that this happens in the countries with a stronger scientific tradition: “there are fields that are left behind […] actually, these fields are being carried forward by the ‘scientifically emerging’ nations, where researchers take measurements that European scientists no longer take – e.g. the viscosity of mixed solutions, with more than one solvent, is a subject that we studied 30 years ago in Italy, and we published a lot; today it is done in India, because it is low-budget but extremely important for applied chemistry. The African school is also collecting many data that we left behind.”

Another negative consequence of publishing-oriented research is haste. This affects not only the moment in which the research is conducted, but also the initial stage of focusing the topic and identifying the relevant research questions. An interviewee (a philosopher) declares: “It must be said clearly that one of the aspects of bibliometrics is speed. I have to get out before my competitors who are working on the same things. From the point of view of the functioning of the mind, it takes a long time to bring out the right question, which is not only what the journals tell me, the smart theme of the moment.”

A further consequence is that, according to our interviewees, interdisciplinary topics are hindered, despite the growing acknowledgement by the scientific community of the importance of inter- and trans-disciplinary research for achieving relevant scientific advancements (Benard and de Cock-Buning 2014; Mulkay 1979; Keller 1989; Castellani and Parisi 2014; Gagliasso, Memoli, and Pontecorvo 2011). A physicist states that “interdisciplinarity does not pay. We all make great proclamations on interdisciplinarity, but it definitely does not pay.” In particular, interdisciplinary papers are more difficult to publish since, as another physicist declares, “the referees are never interdisciplinary,” and interdisciplinary journals, except for some almost inaccessible ones (like Nature), are usually not the ones with the higher impact factors. As explained by another interviewee, a cognitive scientist, “a problem is the trade-off between the impact factor of the journal and the interdisciplinary approach. The journals with high impact factor are single-discipline mainstream ones.” It has been observed that hindering interdisciplinary approaches may contribute to broadening the gender gap in science, since interdisciplinary topics are more often chosen by women (Naldi et al. 2005).

Finally, an interviewee (an engineer) observes that bibliometric systems encourage researchers not to change topic during their career: “If one keeps the same research topics over the years, it’s easier to make a good result visible to the community; it remains for a long time in the same circumscribed community. If one changes, it’s much more difficult.” Like interdisciplinarity, the possibility of varying one’s research topic has been seen as one of the most important sources of creativity in science (Loeb 2010).

Of course, the interviewees observe that bibliometric criteria are not the only factor influencing the choice of theme. Among the others, one that is strictly linked to bibliometrics is the possibility of raising funds, which is driven by mechanisms substantially different from those of curiosity-driven research. An engineer declares that in the choice of topic “the prevailing element was the economic one, that is, to find funding. […] I tried to understand on which emerging issues it was easier to raise funds, trying to anticipate the times, to work on the issues before there was big competition.” A chemist says that “what you do is try to adjust your skills and topics to the calls for funding.”

1.3  Replicating the Results

Galileo was one of the first to theorize the importance for a scientist of replicating an observation or an experiment made by others. In Galileo’s view, the possibility of replicating results served to break down the principle of authority that for centuries had made Aristotelian science immutable. Descartes went further: in his famous Discours de la méthode, published in 1637, he theorized a sort of self-sufficient scientist. In Descartes’ view, the scientist should re-obtain all previous results by herself/himself, since it is the only way to really understand and master them: “one can never conceive a thing so well, and make it one’s own, when one learns it from someone else, as when one invents it oneself” (‘on ne saurait si bien concevoir une chose, et la rendre sienne, lorsqu’on l’apprend de quelque autre, que lorsqu’on l’invente soi-même’) (Descartes 1966). The experiences communicated by other scientists cannot be considered reliable, since they are probably ‘badly explained, or even false’ (‘mal expliquées, ou même si fausses’). The role of the community in building collective knowledge was nevertheless soon recognised. In the 20th century, Popper reminded us that scientific objectivity is not an individual matter but a social matter among scientists, emerging from their reciprocal collaboration and contrasts (Popper 1972). Even if in different ways, both the individualistic and the social views consider the repetition of experiments and calculations done by others a fundamental step in producing scientific knowledge.

Scholars from different disciplines have long acknowledged that the actual repeatability of experiments can be called into question from many points of view (Lakatos and Musgrave 1970; Collins 1974; Feyerabend 1975; Hacking 1983; Latour 1987). An exhaustive treatment of this issue goes beyond the aims of this paper, but we can underline some aspects regarding bibliometrics.

Our interviewees generally understand and accept the controversial nature of the repeatability of experiments, as in the case of a physicist who says: “I believe that our experiments cannot be replicated by anyone; you need to develop a technology which takes a long time. In the end we are the only ones who do it this way.” At the same time, they underline that replicating an experiment is strongly discouraged by journals’ publication criteria, according to which only new results are interesting. Simple repetitions of previous experiments are not accepted for an article. The same physicist declares that “if there is not a criterion of novelty in your result, you will not publish it. Normally, repeating what others did is a starting test.” This seems to concern not only hard science. A sociologist declares that “in social sciences there is a disappointing pull towards the new. The evaluation of the differences between two studies on the same object would be important. […] The scientific community has a journalistic approach: according to them, originality consists in studying a new object.” If the repetition is a “starting test,” even when it is performed it may not be very visible. A biologist states that the repetition of an old experiment reported at the beginning of a paper about a new experiment makes the former invisible, since it is “lost in a sea of other things.”

Many of our interviewees declare that it is possible to publish repetitions of old experiments only if the result is different. But this possibility seems to depend on the reputation of the scientist. A biologist declares that “it is perhaps easier to publish the fact that you have not replicated the result. But to be able to publish such a thing you must have a weight comparable to the author of the first work, otherwise they tell you that you are wrong and the other is right.” A cognitive scientist states that “it is especially the replications which yield refutations that are published.” The possible bias towards ‘positive’ results in published articles – in the sense that they demonstrate a new phenomenon rather than simply corroborating an old one – has been widely discussed in the literature, and not only in the natural sciences (Epstein 1990). In any case, the physicists on our panel do not think that false results can survive for a long time in their discipline. One of them states that “there is a mechanism whereby when you say something important, everyone tries to pick at you. […] The unknown Japanese who repeats your calculation is always there.”

Repeatability is a complex issue. Following Hacking’s suggestion, in a strict sense an emerging regularity is necessary to claim a phenomenon, and experimenting is just “to create, produce, refine and stabilize phenomena” (Hacking 1983); speaking about repeatability is therefore almost tautological. But considered from a collective point of view, in a weaker sense no experiment is actually repeatable, since according to Hacking ‘serious repetitions of an experiment are attempts to do the same thing better’ with different equipment. According to our interviewees, the bibliometrics-based system weakens both these perspectives. The scarce marketability of repetitions of experiments makes it difficult to seek improvements of existing results. Among the consequences of this approach, characterized by speed and competition, we may include the large number of retracted articles and the many cases of fraud (Bornmann, Nast, and Daniel 2008; Fanelli 2009; Economist 2013; Bohannon 2013).

1.4  Lost in the Ocean of Scientific Information

One of the features of the deep transformation of the scientific community in the second half of the 20th century is the explosive growth of the number of researchers and the consequent explosive growth of the number of published articles (de Solla Price 1963; Michael Polanyi and Grene 1969; Merton 1973; Latour 1987; Ziman 2002). Burke observes that the metaphor of an explosion of publications combines the idea of expansion with the idea of fragmentation (Burke 2012).

The possible risks associated with an overload of scientific information were envisaged as early as two centuries ago, when growth of this size was totally unexpected. The well-known British scientist Thomas Young wrote in 1807: “When we contemplate the astonishing magnitude to which a collection of books in any department of science may even at present be extended […] there is the greatest reason to apprehend, that from the continual multiplication of new essays, which are merely repetitions of others that have been forgotten, the sciences will shortly be overwhelmed by their own unwieldy bulk, that the pile will begin to totter under its own weight” (T. Young 1845). In 1908 Poincaré was concerned about the same question, remarking that in theory mathematics may develop ‘in all directions,’ but ‘luckily it will be only partly true.’ According to Poincaré, if it were completely true, it would be cause for alarm, since “our richness would soon become a hindrance, and by dint of accumulation it would form a jumble as impenetrable as the truth was before being discovered” (M. H. Poincaré 1908).

Many proposals were soon formulated to improve the efficiency of scientific documentation. Bradford observed that the most significant articles were contained in a small number of journals, which he called core journals. The identification of the core journals of all disciplines represented, in Bradford’s opinion, a solution to the growth of scientific information (Bradford 1948). Among the other possible solutions for improving the efficiency of information retrieval, Bernal proposed replacing scientific journals with highly centralized information centres in charge of acquiring and indexing scientific results (Bernal 1960). Despite its radicalism, this proposal has somehow been recovered – without the centralization – in the era of digital archives (Valente 2002a). We will return to this topic in the last section, while in this paragraph we investigate our panel’s perception of the problem.
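Bradford’s observation is usually summarized as his ‘law of scattering.’ In its common textbook formulation (our gloss, not Bradford’s original wording): if the journals of a field are ranked by decreasing number of articles on a given subject and divided into a nucleus and successive zones, each contributing the same number of relevant articles, then the numbers of journals in the zones grow roughly geometrically,

\[ N_0 : N_1 : N_2 \approx 1 : n : n^2, \qquad n > 1, \]

where $N_0$ is the number of journals in the nucleus and $n$ is the Bradford multiplier of the field. Each further batch of equally many relevant articles must thus be retrieved from an ever larger set of journals, which is why identifying the ‘core journals’ of each discipline looked like a solution to the documentation problem.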

When asked about their publication practices, most of the interviewees answer that ‘salami publishing’ is a common habit, practised at different levels, not all of which are deplorable. We already quoted the following sentence by a biologist: “We always try to make a coherent story in itself, good to be sold, but if we can split it in a couple it is better.” But, apart from the splitting of articles, the proliferation of scientific papers is also unavoidable in such a large scientific community. The same biologist states that “typically, in the swarm that follows a discovery, […] everyone says: ‘I saw the same on cell X, I saw the same on cell Y, I saw the same on cell Z’.” This mechanism is confirmed by scientists from other disciplines, as in the following statement by a cognitive scientist: “When you think you’ve found a powerful tool, you try to be the first to use it on another question not yet illuminated by the new instrument.”

In other cases, interviewees declare that they have published the same article more than once. When asked, an economist answers: “I did it. The reason is that when you go to meetings they always ask you for an article for the conference proceedings, for a book, or for other things. I have neither the desire nor the time to write an endless number of articles and books, so in the end I have to recycle the same things. It happens to me because there is a pressure to which I cannot respond.” The same economist tries to formulate some criteria for ‘tolerating’ multiple publishing: “One: I believe that publishing the same paper in different languages is ok, the audiences are different, you only have to be transparent and say it. Two: I do not think there is big competition when an article is published as a working paper or conference proceeding and then in a journal; I think they are different degrees of the same publication. I am also quite tolerant when articles published in journals are also published in books. For me the problem is when the same article, or two very similar articles, are published in two similar competitor journals. In that case I strongly tend to discourage it.”

What are the consequences of this proliferation of articles? A first issue is to understand to what extent existing research results are actually found by present-day researchers, and whether the power of the new digital media compensates for the dilution of relevant information in a huge background noise. A sociologist declares that the main critical issues in information retrieval may depend on the transition from a non-bibliometric to a fully bibliometric research evaluation system, which in Italy is still not complete: “We may be losing contents now, but this will last about five years. At the end of the mutation, when we’ll all work on certain standards, you will find everything.” This sociologist continues by highlighting the importance of respecting the richness and the differences of the many publication practices: “The only problem is that these standards must not be imposed by a hegemony. I agree with giving standards for improving the documentation, but respecting local habits.” A doctor identifies the main limitation of information retrieval in the lack of free access to journals: “The problem is that many accesses are restricted. You have to pay and the universities do not have the funds. I would make it a rule that every student should have free access to all the libraries of the world.”

But how to cope with this abundance of literature? Interestingly enough, the same features which are criticized become the ruling factors. The biologist who complained that papers are accepted based on the name of the author (see Section 1.1) then declares about scientific articles: “It is impossible to read them all, so you have to choose, and you have a mental bias. Unfortunately, in the end you also make the selection based on who wrote the article.”

We asked specific questions about attitudes regarding the citation of articles, since citations are the key feature of the bibliometric system. Citing is a practice that appeared at a certain point in history. Ong (1982) observed that in the ancient oral cultures the oral proof was considered the reliable one, whereas the opposite happens in present-day society based on literacy. According to Ong, it is with the diffusion of writing that the concepts of ‘plagiarism’ and ‘copyright’ appear and, together with them, the concepts of quotation and citation. Interestingly enough, in the first scientific journals the articles were all anonymous, since it was claimed that the scientific idea should not be connected to its author (Valente 2002b). Merton considered citations a sort of symbolic currency for exchanging credit within the scientific community (Merton 1973), while other authors combined this approach with the rhetorical role of citations (Cozzens 1989). The attitude of scientists towards citations has been discussed and analysed in the literature (Erikson and Erlandson 2014); here we point out some aspects related to bibliometrics which emerged from our interviews.

We first observe that, despite their criticisms, our interviewees consider the number of citations of an article an important factor. A physicist declares that “if an article is much cited, I tend to give it a look. Even if it may not be good, it is interesting to understand why it is so cited.” Citing is thus seen as a highly social, self-sustaining mechanism. An engineer speaks about a “mechanism of citation” which needs to be started, adding that “not all important results are cited, perhaps because the citation mechanism for that result did not start.” A biologist admits that “everyone cites the works cited by the others,” while the same engineer says: “I think there are a lot of works cited very often simply because you do a copy-paste of references made by others; it’s a common practice.” Another physicist, working on the boundaries between physics and biology, observes that citations are strictly disciplinary: “Our articles were the first to include a bibliography containing articles from various disciplines. Before, an article written by a biologist cited biologists, one written by a physicist cited physicists, one written by engineers cited engineers.” A philosopher, summing up the change in publication practices, focuses on the tendency to cite only very recent literature: “Today I say to my young researchers: one, write in English. Two, look at what’s going on around you. Three, look at the recent literature. But this is also a perversion, it’s a circle biting its own tail. The most recent author is not necessarily better than the one who wrote 20 years ago. Yet, if I put in a citation of a 1995 paper, the referees turn up their noses.” Some interviewees declare that they have been asked by referees to add citations to their papers. Some of them consider this inappropriate, while others do not feel it is a problem, as an economist who declares: “For me citations are like a soup, adding citations is not a problem.”

Conclusions and Perspectives

“A long time ago, a major journal asked me to write an article. I wrote it and I sent it. They asked me to make some corrections that I did not like, so I answered that I did not want to change it; I preferred not to publish the paper. Today I feel I was crazy for having done it.” As discussed, the bibliometrics-based evaluation system of scientific research was only recently introduced in Italy, and quite quickly acquired a growing relevance. A drastic change in researchers’ attitudes due to the introduction of bibliometrics clearly emerged from most of the interviews. Summing up, bibliometrics-based evaluation has an extremely strong normative function on scientific practices, which deeply impacts the epistemological status of the disciplines: it artificially produces a framework based on speed and competition which is at once far from exempt from gatekeeping strategies and often a source of further imbalances in access to resources. This approach has consequences for major nodes in the production of knowledge: setting the questions, organizing the dissemination practices, replicating the results and fragmenting the heritage. It also emerged that the validation of bibliometric practices relies on their wide acceptance and diffusion within the scientific community, so that bibliometrics is substantially self-sustained through its broad application. It is precisely in this mechanistic application that the instrument becomes a target, shadowing its limitations and losing its possible benefits and informative potential. Bibliometrics must be handled carefully, as one of the tools of science policy attentive to the social accountability of science, but neither the only one nor the main one. Our claim is that, in order to be useful, effective and harmless, a regulatory instrument should preserve the widest variety of the individual and collective behaviours which have characterized the production of scientific knowledge. According to the conclusions of our interviewees, bibliometric instruments must be accompanied by an awareness of their power and limits on the part of the scientists who use them. This paper aims to be a step towards the development of a discussion which should go beyond the strictly evaluative aspects and also involve the epistemological ones.

The future of bibliometric practices seems to be strictly linked to the transformation of contemporary science, which is evolving towards new models known by various names, such as ‘post-normal’ science (Funtowicz and Ravetz 1993), ‘post-academic’ science (Ziman 2002), ‘Mode 2 knowledge production’ (Gibbons 1994) and others. Such a transformation affects in particular the context in which science is developed, which has shifted much more towards application, and the growing complexity of the factors entering into the finding and validation of scientific results. These factors involve the network of relationships, including communication internal and external to the scientific community and with stakeholders, among which are industry and policy makers (Bucchi and Trench 2008; Jasanoff 2014; Valente et al. in press). In the last few years we have witnessed the emergence of new modes of knowledge production, often moving towards more bottom-up models such as so-called ‘citizen science’ or ‘crowd-sourced science,’ which involve individuals outside the traditional scientific community. With respect to 20th-century ‘big science,’ the amateur seems today to be regaining a role, and de Solla Price’s definition of a scientist as someone who has published at least one paper in a scientific journal (de Solla Price 1965) seems to be rapidly becoming out of date; the social significance of the act of publishing is also changing (Casati 2010).

Digital information and communication technologies are undoubtedly a key factor that triggered most of these changes. The recent development of digital archives is going to further modify researchers’ attitudes towards publishing: free electronic publications – more rapid, in general easier to access, and often released without a peer-review process – are becoming an instrument more and more used for the exchange of information among scientists (Koepsell 2010; Fitzpatrick 2011; Nature 2012; Ohlsson 2012). In the field of physics, one of the first to make massive use of digital archives, it has been shown that already in 2006 articles that first appeared as preprints in public online archives received on average twice as many citations as those appearing directly in traditional journals (Metcalfe 2006). Of course, both positive and negative effects of the use of ICT in scientific publishing have been highlighted and discussed (Evans 2008; Mauthner and Parry 2013).

It has been noted, however, that although the role of the scientific journal in the exchange of scientific knowledge is progressively decreasing in favour of informal electronic publications (Lievrouw 2010), bibliometric indicators based on formal journal publications are still the main criteria used for evaluating researchers’ careers and the quality of research centres. Therefore, despite the effectiveness of the new electronic possibilities, traditional papers remain the most frequently chosen alternative: “The traditional models of publishing, such as peer review, journal-length papers and reputation metrics, are still so central in e-Science” (Origgi and Simon 2010). A recent study showed that “individual imperatives for career self-interest, advancing the field and receiving credit are often more powerful motivators in publishing decisions than the technological affordances of new media” (Harley 2013).

ICT has also paved the way for the development of new forms of peer review, participatory and open to the communities (Fitzpatrick 2010). In a manner similar to popular websites like eBay, in which users’ feedback is used as evaluation, different forms of open peer review have been experimented with. For instance, it has recently been proposed to let the users of PubMed participate as peer reviewers, commenting on published abstracts (Reardon 2013). New and old instruments may be successfully combined and applied, provided the scientific community does not lose control of them.

References

Anon. “Peer Review Survey 2009.” Sense about Science. 2009. http://www.senseaboutscience.org/pages/peer-review-survey-2009.html.

Benard, Marianne and Tjard de Cock Buning. “Moving from Monodisciplinarity Towards Transdisciplinarity: Insights into the Barriers and Facilitators That Scientists Faced.” Science and Public Policy. 17 February 2014.

Bernal, John D. “Scientific Information and Its Users.” Aslib Proceedings 12 (1960): 432–438.

Bohannon, John. “Who’s Afraid of Peer Review?” Science 342, no. 6154 (2013): 60–65.

Bornmann, Lutz, Irina Nast, and Hans-Dieter Daniel. “Do Editors and Referees Look for Signs of Scientific Misconduct When Reviewing Manuscripts? A Quantitative Content Analysis of Studies That Examined Review Criteria and Reasons for Accepting and Rejecting Manuscripts for Publication.” Scientometrics 77, no. 3 (2008): 415–432.

Boutry, Georges A. “Quantity versus Quality in Scientific Research (II): The Paper Explosion.” Impact of Science on Society 20, no. 3 (1970): 195–206.

Bracey, Gerald W. “The Time Has Come to Abolish Research Journals: Too Many Are Writing Too Much about Too Little.” Chronicle of Higher Education 30, no. 25 (1987): 44–45.

Bradford, Samuel C. Documentation. London, UK: Crosby Lockwood, 1948.

Bucchi, Massimiano and Brian Trench. Handbook of Public Communication of Science and Technology. London: Routledge, 2008.

Buchholz, Klaus. “Criteria for the Analysis of Scientific Quality.” Scientometrics 32, no. 2 (1995): 195–218.

Burke, Peter. A Social History of Knowledge II: From the Encyclopaedia to Wikipedia. Vol. 2. Polity, 2012.

Callaham, Michael and Charles McCulloch. “Longitudinal Trends in the Performance of Scientific Peer Reviewers.” Annals of Emergency Medicine 57, no. 2 (2011): 141–148.

Casati, Roberto. “On Publishing.” Social Epistemology 24, no. 3 (2010): 191–200.

Castellani, Tommaso and Giorgio Parisi. “‘È Facile. Forse Anche Possibile.’ Intervista a Giorgio Parisi.” Sapere 80, no. 1 (2014).

Collins, Harry M. “The TEA Set: Tacit Knowledge and Scientific Networks.” Social Studies of Science 4, no. 2 (1974): 165–185.

Cozzens, Susan E. “What Do Citations Count? The Rhetoric-First Model.” Scientometrics 15, no. 5 (1989): 437–447.

Cronin, Blaise. “Peer Review.” Journal of the American Society for Information Science and Technology 62, no. 7 (2011): 1215–1215.

de Solla Price, Derek J. Little Science, Big Science…and Beyond. New York, NY: Columbia University Press, 1963.

de Solla Price, Derek J. “Is Technology Historically Independent of Science? A Study in Statistical Historiography.” Technology and Culture 6, no. 4 (1965): 553-568.

Descartes, Rene. Discours de La Méthode. Paris: Garnier-Flammarion, 1966.

The Economist. “Unreliable Research: Trouble at the Lab.” The Economist. October 19, 2013. http://www.economist.com/news/briefing/21588057-scientists-think-science-self-correcting-alarming-degree-it-not-trouble.

Epstein, William M. “Confirmational Response Bias Among Social Work Journals.” Science, Technology & Human Values 15, no. 1 (1990): 9–38.

Erikson, Martin G., and Peter Erlandson. “A Taxonomy of Motives to Cite.” Social Studies of Science 44, no. 4 (2014): 625-637.

Evans, James A. “Electronic Publication and the Narrowing of Science and Scholarship.” Science 321, no. 5887 (2008): 395–399.

Fanelli, Daniele. “How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data.” PLoS ONE 4, no. 5 (2009).

Feyerabend, Paul. Against Method: Outline of an Anarchistic Theory of Knowledge. Atlantic Highlands, NJ: Humanities Press, 1975.

Fitzpatrick, Kathleen. “Peer‐to‐peer Review and the Future of Scholarly Authority.” Social Epistemology 24, no. 3 (2010): 161–179.

Fitzpatrick, Kathleen. Planned Obsolescence: Publishing, Technology, and the Future of the Academy. NYU Press, 2011.

Funtowicz, Silvio O. and Jerome R. Ravetz. “Science for the Post-Normal Age.” Futures 25, no. 7 (1993): 739–755.

Gagliasso, Elena, Rosanna Memoli, and Maria Elena Pontecorvo. Scienza E Scienziati: Colloqui Interdisciplinari. Franco Angeli, 2011.

Galilei, Galileo. Dialogo Sopra i Due Massimi Sistemi del Mondo. 2002.

Gibbons, Michael. “Transfer Sciences: Management of Distributed Knowledge Production.” Empirica 21, no. 3 (1994): 259–270.

Gidez, Lewis I. “The Peer Review Process: Strengths and Weaknesses – A Survey of Attitudes, Perceptions, and Expectations.” The Serials Librarian 19, no. 3-4 (1991): 75–85.

Godlee, Fiona, Catharine R. Gale, and Christopher N. Martyn. “Effect on the Quality of Peer Review of Blinding Reviewers and Asking Them to Sign Their Reports.” JAMA: The Journal of the American Medical Association 280, no. 3 (1998): 237–240.

Goodhart, C.A.E. Monetary Theory and Practice: The UK Experience. Macmillan Publishers Limited, 1984.

Gura, Trisha. “Scientific Publishing: Peer Review, Unmasked.” Nature 416, no. 6878 (2002): 258–260.

Hacking, Ian. Representing and Intervening: Introductory Topics in the Philosophy of Natural Science. Cambridge: Cambridge University Press, 1983.

Hamilton, David P. “Publishing By—and For?—the Numbers.” Science 250, no. 4986 (1990): 1331–1332.

Hanson, Norwood R. Patterns of Discovery: An Inquiry into the Conceptual Foundations of Science. Cambridge: Cambridge University Press, 1958.

Harley, Diane. “Scientific Communication: Cultural Contexts, Evolving Models.” 2013. http://vcro-vm-i004-dev06.berkeley.edu/sites/default/files/shared/publications/docs/Harley_Science_10-2013_scientific-communication.pdf.

Ioannidis, John P. “Evolution and Translation of Research Findings: From Bench to Where?” PLoS Clinical Trials 1, no. 7 (2007).

Jasanoff, Sheila. “A Mirror for Science.” Public Understanding of Science 23, no. 1 (2014): 21–26.

Jefferson, Tom, Philip Alderson, Elizabeth Wager, and Frank Davidoff. “Effects of Editorial Peer Review.” JAMA: The Journal of the American Medical Association 287, no. 21 (2002): 2784–2786.

Keller, Evelyn F. Three Cultures: Fifteen Lectures on the Confrontation of Academic Cultures. Rotterdam: Universitaire Pers, 1989.

Klein, Jan. “Hegemony of Mediocrity in Contemporary Science, Particularly in Immunology.” Lymphology 18 (1985): 122–131.

Knorr Cetina, Karin. “Laboratory Studies: The Cultural Approach to the Study of Science.” In Handbook of Science and Technology Studies. Thousand Oaks, CA: Sage, 1995.

Koepsell, David. “Back to Basics: How Technology and the Open Source Movement Can Save Science.” Social Epistemology 24, no. 3 (2010): 181–190.

Lakatos, Imre and Alan Musgrave, eds. Criticism and the Growth of Knowledge. Cambridge: Cambridge University Press, 1970.

Latour, Bruno. Science in Action: How to Follow Scientists and Engineers through Society. Cambridge: Harvard University Press, 1987.

Latour, Bruno. Cogitamus: Six Lettres Sur Les Humanités Scientifiques. Paris: La Découverte, 2010.

Latour, Bruno and Steve Woolgar. Laboratory Life: The Social Construction of Scientific Facts. Princeton, NJ: Princeton University Press, 1979.

Lawrence, Peter A. “The Politics of Publication.” Nature 422, no. 6929 (2003): 259–261.

Lehmann, Sune, Benny Lautrup, and Andrew Jackson. “Citation Networks in High Energy Physics.” Physical Review E 68, no. 2 (2003). http://link.aps.org/doi/10.1103/PhysRevE.68.026113.

Lievrouw, Leah A. “Social Media and the Production of Knowledge: A Return to Little Science?” Social Epistemology 24, no. 3 (2010): 219–237.

Loeb, Abraham. “Taking ‘The Road Not Taken’: On the Benefits of Diversifying Your Academic Portfolio.” arXiv Preprint 1008.1586, 2010.

Mauthner, Natasha S. and Odette Parry. “Open Access Digital Data Sharing: Principles, Policies and Practices.” Social Epistemology 27, no. 1 (2013): 47–67.

Merton, Robert K. The Sociology of Science: Theoretical and Empirical Investigations. Chicago: University of Chicago Press, 1973.

Merton, Robert K. and Patricia L. Kendall. “The Focused Interview.” American Journal of Sociology 51, no. 6 (1946): 541–557.

Metcalfe, Travis S. “The Citation Impact of Digital Preprint Archives for Solar Physics Papers.” Solar Physics 239, no. 1-2 (2006): 549–553.

Mulkay, Mike J. Science and the Sociology of Knowledge. London: Allen & Unwin, 1973.

Naldi, Fulvio, Daniela Luzi, Adriana Valente, and Ilaria Vannini Parenti. “Scientific and Technological Performance by Gender.” In Handbook of Quantitative Science and Technology Research, 299–314. Springer, 2005.

Nature. “Openness Costs.” Nature 486, no. 7404 (2012): 439.

Ohlsson, Tommy. “Preprint Servers: Follow arXiv’s Lead.” Nature 489, no. 7416 (2012): 367.

Ong, Walter J. Orality and Literacy: The Technologizing of the Word. Routledge, 1982.

Origgi, Gloria and Judith Simon. “Scientific Publications 2.0. The End of the Scientific Paper?” Social Epistemology 24, no. 3 (2010): 145–148.

Pinto, Valeria. Valutare E Punire. Una Critica Della Cultura Della Valutazione. Napoli: Cronopio, 2012.

Poincaré, Henri. The Value of Science. New York: Dover, 1958.

Poincaré, Henri. “L’avenir Des Mathématiques.” Rendiconti Del Circolo Matematico Di Palermo (1884-1940) 26, no. 1 (1908): 152–168.

Polanyi, Michael. The Tacit Dimension. London: Routledge and Kegan Paul, 1967.

Polanyi, Michael and Marjorie Grene. Knowing and Being: Essays. Chicago: University of Chicago Press, 1969.

Popper, Karl. The Logic of Scientific Discovery. London: Routledge Press, 1959.

Popper, Karl. Objective Knowledge: An Evolutionary Approach. Oxford: Clarendon Press, 1972.

Potter, Jonathan and Michael Mulkay. “Scientists’ Interview Talk: Interviews as a Technique for Revealing Participants’ Interpretative Practices.” The Research Interview: Uses and Approaches (1985): 247–271.

Power, Michael. “Research Evaluation in the Audit Society.” In Wissenschaft Unter Beobachtung, ed. Hildegard Matthies and Dagmar Simon, 15–24. Wiesbaden: VS Verlag für Sozialwissenschaften, 2008. http://dx.doi.org/10.1007/978-3-531-90863-2_2.

Radicchi, Filippo, Santo Fortunato, and Claudio Castellano. “Universality of Citation Distributions: Toward an Objective Measure of Scientific Impact.” Proceedings of the National Academy of Sciences 105, no. 45 (2008): 17268–17272.

Reardon, Sara. “PubMed Opens for Comment.” Nature (October 24 2013). http://www.nature.com/news/pubmed-opens-for-comment-1.14023.

Redner, Sidney. “How Popular Is Your Paper? An Empirical Study of the Citation Distribution.” The European Physical Journal B-Condensed Matter and Complex Systems 4, no. 2 (1998): 131–134.

Reichenbach, Hans. The Rise of Scientific Philosophy. Berkeley: University of California Press, 1973.

Ren, Fu-Xin, Hua-Wei Shen, and Xue-Qi Cheng. “Modeling the Clustering in Citation Networks.” Physica A: Statistical Mechanics and Its Applications 391, no. 12 (2012): 3533–3539.

Riesch, Hauke and Clive Potter. “Citizen Science as Seen by Scientists: Methodological, Epistemological and Ethical Dimensions.” Public Understanding of Science 23, no. 1 (2014): 107–120.

Rothwell, Peter M. and Christopher N. Martyn. “Reproducibility of Peer Review in Clinical Neuroscience: Is Agreement between Reviewers Any Greater than Would Be Expected by Chance Alone?” Brain 123, no. 9 (2000): 1964–1969.

Schekman, Randy. “How Journals like Nature, Cell and Science Are Damaging Science.” The Guardian, December 9, 2013.

Ségalat, Laurent. La Science À Bout de Souffle? Vol. 110. Paris: Seuil, 2009.

Smith, Richard. “Classical Peer Review: An Empty Gun.” Breast Cancer Res 12, no. Suppl 4 (2010): S13.

Steinhauser, Georg, et al. “Peer Review versus Editorial Review and Their Role in Innovative Science.” Theoretical Medicine and Bioethics 33, no. 5 (2012): 359–376.

UNESCO. “What Do Bibliometric Indicators Tell Us about World Scientific Output?” UIS Bulletin on Science and Technology Statistics, 2005. unesdoc.unesco.org/images/0021/…/217111e.pdf.

Valente, Adriana. “Trasmissione Ed Accesso Alle Pubblicazioni Scientifiche: Evoluzione Storica Di Teorie E Pratiche.” In Trasmissione D’élite O Accesso Alle Conoscenze?, edited by Adriana Valente. Milano: Franco Angeli, 2002a.

Valente, Adriana. “Gli Indici Di Citazione Nel Circuito Di Organizzazione, Selezione E Comunicazione Della Conoscenza Scientifica.” In Trasmissione D’élite O Accesso Alle Conoscenze?, edited by Adriana Valente. Milano: Franco Angeli, 2002b.

Valente, Adriana, Tommaso Castellani, Maja Larsen, and Arja R. Aro. “Models and Visions of Science-Policy Interaction: Remarks from a Delphi Study in Italy.” Science and Public Policy, in press.

Vickery, Brian C. “The Growth of Scientific Literature, 1660-1970.” In The Information Environment: A World View: Studies in Honour of Professor A. I. Mikhailov, edited by D.J. Foskett, 101–109. Amsterdam: Elsevier, 1990.

Vinkler, Peter. “Publication Velocity, Publication Growth and Impact Factor: An Empirical Model.” In The Web of Knowledge: A Festschrift in Honor of Eugene Garfield. Medford, NJ: Information Today, Inc., 2000.

Vinkler, Peter. The Evaluation of Research by Scientometric Indicators. Oxford, UK: Chandos Publishing, 2010.

Waters, Lindsay. Enemies of Promise: Publishing, Perishing and the Eclipse of Scholarship. Vol. 15. Chicago: Prickly Paradigm, 2004.

Young, Neal S., John P. Ioannidis, and Omar Al-Ubaydli. “Why Current Publication Practices May Distort Science.” PLoS Medicine 5, no. 10 (2008): e201.

Young, Thomas. A Course of Lectures on Natural Philosophy and the Mechanical Arts: Pt. I. Mechanics. Pt. II. Hydrodynamics. Pt. III. Physics. Vol. 1. London: Taylor and Walton, 1845.

Ziman, John. “The Proliferation of Scientific Literature: A Natural Process.” Science 208, no. 4442 (1980): 369–371.

Ziman, John. Real Science: What It Is and What It Means. Cambridge: Cambridge University Press, 2002.




4 replies

  1. Well-written, lucid presentation of a complex and important subject.

    You wrote:
    “With respect to 20th century ‘big science,’ the role of amateur seems today to regain a role, and the definition of scientist given by de Solla Price, for which a scientist is who published at least a paper on a scientific journal (de Solla Price 1965), seems to be rapidly becoming out-of-date; also the social significance of the act of publishing is changing”

    As just such an amateur in the social sciences, I find this encouraging, and important, since so-called “soft” science is the most directly accessible and has the most immediate effect on public policy. A popularization of a study by a writer such as the New York Times’ David Brooks has more immediate societal impact than a slew of the most prestigious academic journals.

    There are dynamics that your article did not cover that I would like to add to the conversation. One example is illustrated by a truism from the hardest of sciences, theoretical physics. There, according to Lee Smolin’s book “The Trouble with Physics,” the study of string theory has become a reification of Goodhart’s Law: it is the thousands of articles in respected journals that validate it, even though the theory is unfalsifiable and therefore fails the bedrock requirement defining science. It belongs in the realm of Scientology E-meters, alongside other respected concepts, such as psychoanalytically based therapy, that have never been objectively validated as effective.

    The interface between the academic-science oeuvre and public policy is a subject raised by your presentation. I have written on the popularization of the idea that we can, at least theoretically, go back to the past. Logically, this would allow reversing life-altering mistakes and being reunited with lost loved ones. This follows from the Nova presentation, seen by millions, funded by the government, and written by theoretical physicist Brian Greene. The academic work that underlies it is so arcane as to be unintelligible to all but a handful of the population. Most of these scientists simply do not see what sort of risky game they are playing with reality: reality as something independent of what is experimentally established.

    This last sentence is not my conclusion but that of Albert Einstein, writing to Schrödinger:
    http://consilienceforum.blogspot.com/2011/12/notes-on-pop-cosmology-and-its.html

    The illustration of the Nova popularization is challenged in my article, which includes a link to the video: http://alrodbell.blogspot.com/2013/06/humanism-cosmology-and-danger-of-awe.html

Trackbacks

  1. On Epistemic Effects: A Reply to Castellani, Pontecorvo and Valente, Arie Rip « Social Epistemology Review and Reply Collective
  2. Unspoken Complicity: Further Comments on Castellani, Pontecorvo and Valente and Rip, Tereza Stöckelová « Social Epistemology Review and Reply Collective
  3. Epistemic Consequences of Bibliometric Evaluation: A Reply to Rip and Stöckelová, Tommaso Castellani, Emanuele Pontecorvo and Adriana Valente « Social Epistemology Review and Reply Collective
