Section 05 of the Comité National
Centre National de la Recherche Scientifique


Joint Committee on Quantitative Assessment of Research

Report of the IMU/IMS/ICIAM Joint Committee on Quantitative Assessment of Research
11 June 2008

PDF - 576.6 KB

In 2007 the International Mathematical Union (IMU), the Institute of Mathematical Statistics (IMS), and the International Council for Industrial and Applied Mathematics (ICIAM) set up a joint Committee on "Quantitative Assessment of Research", which was asked to investigate various aspects of impact factors and similar numerical measures.

The Committee consisted of:
-  John Ewing (Providence, USA), appointed by the IMU;
-  Robert Adler (Haifa, Israel), appointed by the IMS;
-  Peter Taylor (Melbourne, Australia), appointed by ICIAM.

Citation-based statistics, such as the impact factor, are often used to assess scientific research, but are they the best measures of research quality?

Three international mathematics organizations have today released a report, Citation Statistics, on the use of citations in assessing research quality - a topic that is of increasing interest throughout the world’s scientific community. The report is written from a mathematical perspective and strongly cautions against the over-reliance on citation statistics such as the impact factor and h-index. These are often promoted because of the belief in their accuracy, objectivity, and simplicity, but these beliefs are unfounded.

Among the report's key findings:
-  Statistics are not more accurate when they are improperly used; statistics can mislead when they are misused or misunderstood.
-  The objectivity of citations is illusory because the meaning of citations is not well understood. A citation's meaning can be very far from "impact".
-  While having a single number to judge quality is indeed simple, it can lead to a shallow understanding of something as complicated as research. Numbers are not inherently superior to sound judgments.

The report promotes the sensible use of citation statistics in evaluating research and points out several common misuses. It is written by mathematical scientists about a widespread application of mathematics.

While the authors of the report recognize that assessment must be practical and that easily-derived citation statistics will be part of the process, they caution that citations provide only a limited and incomplete view of research quality. Research is too important, they say, to measure its value with only a single coarse tool.

The report was commissioned by the International Mathematical Union (IMU) in cooperation with the International Council for Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS). It draws upon a broad literature on the use of citation data to evaluate research, including articles on the impact factor (the most common citation-based statistic) and the h-index along with its many variants. The work was also based on practices reported by mathematicians and other scientists from around the world.
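To make the two statistics discussed above concrete, here is a minimal Python sketch of their standard definitions: the two-year journal impact factor (citations received in a given year to items published in the two preceding years, divided by the number of citable items published in those years) and the h-index (the largest h such that an author has h papers with at least h citations each). This is an illustration of the standard formulas only, not code from the report; the function names are made up for this example.

```python
def h_index(citations):
    """Largest h such that h of the papers have at least h citations each.

    `citations` is a list of per-paper citation counts.
    """
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:  # the paper at this rank still has enough citations
            h = rank
        else:
            break
    return h


def impact_factor(cites_to_prev_two_years, items_prev_two_years):
    """Two-year impact factor for year Y: citations in Y to items
    published in Y-1 and Y-2, divided by citable items in Y-1 and Y-2."""
    return cites_to_prev_two_years / items_prev_two_years


# Example: five papers cited 10, 8, 5, 4 and 3 times give h = 4
# (four papers with at least 4 citations, but not five with at least 5).
print(h_index([10, 8, 5, 4, 3]))
# Example: 250 citations to 100 citable items gives an impact factor of 2.5.
print(impact_factor(250, 100))
```

As the report's findings emphasize, both numbers compress a complicated citation record into a single coarse value, which is precisely why the Committee cautions against relying on them alone.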


See also the article by F. Laloé.