Leeds statisticians propose a “nu” citation approach to measure researcher impact
Researchers at the University of Leeds have proposed a new citation index that could offer a more balanced measure of academic impact than existing metrics.
The study, “A citation index bridging Hirsch’s h and Egghe’s g” published in PNAS Nexus, addresses long-standing concerns about how research productivity and influence are quantified. It was led by Leonid Bogachev, Professor of Probability and Statistics in the School of Mathematics, supported by Dr Jochen Voss and Dr Rukia Nuermaimaiti, who is now at Imperial College London.
Citation indices are widely used across academia, particularly in hiring, promotion and funding decisions, yet each comes with recognised limitations. The most established of these measures, the h-index, was introduced in 2005 by physicist Jorge E Hirsch.
The h-index is the largest number h such that h of a researcher’s publications have each received at least h citations. While easy to calculate and interpret, it does not distinguish between moderately cited papers and those that attract very high attention.
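The definition above translates directly into a short computation. The sketch below, a minimal illustration rather than any official implementation, sorts a hypothetical citation record in descending order and finds the largest rank h at which the h-th paper still has at least h citations:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank  # the top `rank` papers each have >= rank citations
        else:
            break
    return h

# Hypothetical record: five papers cited 10, 8, 5, 4 and 3 times.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Here the top four papers each have at least four citations, but the fifth has only three, so h = 4.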
In 2006, information scientist Leo Egghe proposed the g-index, which gives greater weight to highly cited work. However, this approach can underrepresent researchers whose work is cited more evenly across multiple publications.
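Egghe’s definition can be sketched in the same style. In its basic form (capped at the number of publications), g is the largest rank at which the top g papers have collected at least g² citations between them; the citation record below is the same hypothetical example, not data from the study:

```python
def g_index(citations):
    """Largest g such that the top g papers have at least g**2 citations in total."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank  # cumulative citations still cover rank**2
        else:
            break
    return g

# Same hypothetical record as before: the top 5 papers total 30 >= 25 citations.
print(g_index([10, 8, 5, 4, 3]))  # -> 5
```

On this record g = 5 while h = 4, illustrating how the cumulative sum lets a few highly cited papers lift g above h; the paper’s result places ν between these two values.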
The new index, named ν, was developed by Professor Bogachev and his team, building on ideas initiated in Dr Nuermaimaiti’s PhD thesis. According to the authors, ν incorporates information about both the number of publications and the distribution of citations across them. The paper provides a formal proof that ν always falls between the values of the two existing indices, h and g, for any given citation record.
Professor Bogachev said, “Picking up on Rukia’s comparison of the h and g indices carried out in her PhD thesis, we observed that both definitions can be unified under the same framework, which has naturally led us to a new index ν bridging the two classical ones. We believe that ν provides a better balance between impact and productivity, so hopefully it would be useful in scientometric practice and research.”
Alongside the mathematical results, the authors emphasise that citation metrics should be used with care. Since the turn of the century, scholars have documented how citation indicators can be stretched far beyond their original intent, shaping hiring, promotion and funding decisions.
These concerns have led to initiatives such as the San Francisco Declaration on Research Assessment and the Leiden Manifesto, which encourage restraint and transparency. The team also notes that the rapid adoption of AI tools introduces additional risk, making numerical shortcuts even more tempting.
The study concludes that while new indices such as ν can refine quantitative analyses, they should complement, rather than replace, expert evaluation of research quality.