Until a few years ago, most Italian scientists had hardly heard of notions such as a journal’s impact factor or a researcher’s h-index; nowadays these seem to be an everyday concern. What are they? They are among the parameters introduced to quantitatively assess the outcome of scientific research, giving rise to a new discipline called scientometrics, and in particular bibliometrics where scientific publications are concerned.
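As a concrete illustration of the two indicators just mentioned (an aside of ours, not part of the original study), a minimal sketch of their standard definitions: the h-index is the largest h such that a researcher has h papers each cited at least h times, and a journal’s impact factor for year Y is the number of citations received in Y by items published in the two preceding years, divided by the number of citable items published in those years.

```python
def h_index(citations):
    """h-index: the largest h such that the researcher has at least
    h papers, each of which has been cited at least h times."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still has at least `rank` citations
        else:
            break
    return h

def impact_factor(citations_to_prev_two_years, items_prev_two_years):
    """Impact factor for year Y: citations received in Y to items
    published in Y-1 and Y-2, divided by the citable items of Y-1 and Y-2."""
    return citations_to_prev_two_years / items_prev_two_years

# A researcher with papers cited [10, 8, 5, 4, 3] times has h-index 4:
# four papers have at least 4 citations each, but not five with 5.
print(h_index([10, 8, 5, 4, 3]))   # -> 4
print(impact_factor(200, 80))      # -> 2.5
```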
Bibliometric parameters increasingly influence the career progression of researchers and the allocation of funds to researchers and research centres. The growing importance of the bibliometrics-based evaluative system appears to have induced severe changes in the habits of Italian researchers. Unlike in other countries, where bibliometrics-based evaluation is a well-established practice, in Italy it was introduced only recently and acquired enormous weight in a relatively short period of time, albeit with some remarkable differences between the hard sciences and the humanities.
But what are the results of such bibliometrics-based evaluation? Has it contributed to more effective and efficient science? Has it had unexpected negative consequences? The debate on this topic is ongoing in many countries, including Italy. We decided to investigate a specific issue within this broad debate. Since the scientific publication is the main instrument through which scientists communicate their results, bibliometrics may seem to concern only the communicational aspects of scientific activity. Nevertheless, communicating results and achieving them are strongly connected: even Galileo wondered whether an individual’s ‘readiness of saying’ (‘prontezza del dire’) might steer a scientific dispute towards a certain solution. Beyond the effects of communication practices on the dissemination of scientific results, an even more interesting question is their influence on the entire process that leads to those results, and thus on the ‘epistemic’ status of the disciplines. As Italian researchers entered such an evaluative system only recently, they may offer a field for observing ongoing changes that have already settled in other countries.
We interviewed a multi-disciplinary panel of 12 Italian scientists, roughly uniformly distributed in age between 40 and 65 so as to include only experienced scientists, at different stages of their careers but with solid curricula. The panel comprised 6 men and 6 women, working in the following disciplines: physics, biology, chemistry, medicine, neuroscience, economics, cognitive science, engineering, sociology and philosophy. We included both scientists doing strictly discipline-focused research and scientists working at the boundaries between disciplines.
As we show in the following, a drastic change in researchers’ attitudes due to the introduction of bibliometrics-based evaluation clearly emerged from most of the interviews. This change not only affects the structure of the scientific literature, but also deeply shapes the processes of scientific production and the way every single researcher looks at her/his actions. In other words, the moment of communicating the results fully enters into the scientific fact. In the following, we summarize some of the most interesting results that emerged from our research.
Firstly, many of the interviewees stated that the bibliometrics-based evaluation criteria have changed the way scientists choose their research topics. Namely, there is a tendency to work on topics that are more likely to produce good publications. According to our interviewees, this can take a variety of forms, among which:
– choosing a fashionable theme, namely a theme that for some reason is considered particularly interesting at the moment;
– placing the article in the tail of an important discovery: once a topical paper has been published, many researchers publish minor results on the same topic, attracting considerable attention. As one interviewee declared, “once a discovery has come out, many people jump on the bandwagon”;
– favouring short empirical papers over books or long essays on theoretical and argumentative subjects. This may particularly affect disciplines in the socio-economic area.
Another negative consequence of publishing-oriented research is haste. The interviewees stated that this affects not only the conduct of the research itself, but also the initial stage of framing the topic and identifying the relevant research questions.
A further consequence, according to our interviewees, is that interdisciplinary topics are hindered, despite the scientific community’s growing acknowledgement of the importance of inter- and trans-disciplinary research for achieving relevant scientific advancements. Relatedly, the interviewees also noted that bibliometric evaluative systems encourage researchers not to change topic during their careers, despite the fact that the interplay of different research interests, like interdisciplinarity, has been seen as one of the most important sources of creativity in science.
Another interesting epistemic consequence of the bibliometrics-based evaluation system is that the repetition of experiments, considered a pillar of the scientific method, is not encouraged. Our interviewees observe that replicating an experiment is strongly discouraged by journals’ editorial criteria, according to which only new results are interesting: simple repetitions of previous experiments are not accepted for publication.
As a general remark, all the interviews reveal the existence of a ‘before’ and an ‘after’ the time when quantitative indicators started to influence daily scientific activity. All of the people in our panel developed their careers in a system that did not give as much importance to bibliometric indicators (with differences depending on the discipline and on the age).
Summing up, we can argue that bibliometrics-based evaluation has an extremely strong normative function on scientific practices, which deeply impacts the epistemic status of the disciplines. It artificially enhances the tendency towards a framework based on speed and competition, one that is far from being exempt from gatekeeping strategies and is often a source of further imbalances in access to resources. This approach has consequences for major nodes in the production of knowledge, such as setting the questions, organizing the dissemination practices, replicating the results and fragmenting the heritage. It also emerged that the validation of bibliometrics-based evaluative practices relies on their wide acceptance and diffusion within the scientific community, so that bibliometrics-based evaluation is substantially self-sustained through its broad application. It is precisely in this mechanistic application that the instrument becomes a target, obscuring its limitations and losing its possible benefits and informative potential. Bibliometrics must be handled carefully as one of the tools of science policy, with attention to the social accountability of science, but it should be neither the only tool nor the main one.
Our claim is that, in order to be useful, effective and harmless, a regulatory instrument should preserve the widest variety of individual and collective behaviours that have characterized the development of scientific knowledge. According to the conclusions of our interviewees, bibliometric instruments must be accompanied by an awareness, on the part of the scientists who use them, of their power and limits. Our work aims to be a step towards a discussion that goes beyond the strictly evaluative aspects and frames these instruments in a more general epistemological debate.
T. Castellani, E. Pontecorvo, A. Valente, ‘Epistemological Consequences of Bibliometrics: Insights from the Scientific Community’, Social Epistemology Review and Reply Collective, vol. 3, no. 11, 2014.
This research has been supported by the ScienceOnTheNet project of the Italian Ministry for Education, University and Research.