Scientific quality is hard to define, and numbers are easy to look at. But bibliometrics are warping science by encouraging quantity over quality. Leaders at two research institutions describe how they do it differently. (Nature)
We decided to ask applicants to tell us what they considered to be their three most important publications and why, and to submit a copy of each. We asked simple, direct questions: what have you discovered? Why is it important? What have you done about your discovery?
I believe most committee members actually read the papers submitted, unlike in other evaluations, where panellists have time only to scan exhaustive lists of publications. This approach may not have changed committee decisions, but it did change the incentives of both the candidates and the panellists.
But committee members often felt uncomfortable; they thought their selection was subjective, and they felt more secure with the numbers. When I moved on from my position as dean, the system reverted to its conventional form.
The committee felt uncomfortable? Is this a case of imposter syndrome?
It takes only a day or two of programming together to understand most of the dimensions along which programmers and programming tasks vary. But a first-level manager would be much more comfortable with a progress indicator that is more "quantitative".