CAN UNIVERSITY PERFORMANCE BE MEASURED BY NUMBERS?

2023.02.06.
At the joint event of the ELTE PPK Social Communication Research Group and HVG, the invited experts and heads of institutions drew attention to the possibilities and limitations of higher education rankings.

Which is the best university? This question is often asked when higher education rankings are published, although the answer is rarely straightforward. The team behind the UnivPress Ranking, the best-known Hungarian university ranking, published in the HVG Diploma supplement for the past fifteen years, has launched a series of conferences to help readers interpret the rankings and the underlying data. Held at the headquarters of the OTP Fáy András Foundation, the conference also reflected on changes in national and global rankings.

The conference was chaired by Helga Dorner, Director of the Institute for Adult Education Research and Knowledge Management, ELTE PPK. In her opening speech, Anikó Zsolnai, Dean of ELTE PPK, spoke about the development of higher education pedagogy as a priority project at the faculty, while Zoltán Rónay, Deputy Dean, drew attention to the links between institutional autonomy and university performance.

György Fábri, head of the Social Communication Research Group at ELTE PPK, the initiator and leading expert of the UnivPress Ranking, examined in his presentation how and with what validity university performance can be measured with numbers. As he explained, university rankings embody the prevailing forms of quantification, i.e. measurement by quantity, which are also characteristic of the academic world and are subject to constant and serious professional criticism.

Many aspects of university life - such as the quality of teaching or student well-being - cannot be captured in numbers, and even the evaluation of scientific work itself is distorted when publication or citation metrics are treated as absolute. In response, the speaker advocated validating qualitative criteria, an approach increasingly emphasized in current international trends within the scientific community.

Sándor Soós, research fellow at the Institute of Adult Education Research and Knowledge Management, discussed the relationship between indicators of scientific performance and science assessment. He explored in detail how publication indicators that were originally easy to trace disappear into the "black hole" of rankings, losing their real information value through rescaling and weighting. Examining the impact of the citation indicator, he pointed out that outliers appear repeatedly in these figures and that this is a general phenomenon. In his view, such anomalies could be avoided by applying the indicators systematically, as the Leiden Ranking does.

In the roundtable discussion following the presentations, participants representing several types of Hungarian higher education institutions reflected on the issues raised and shared their experiences and strategies in relation to the rankings. Participants included Marcell Eszterhai, President of HÖOK; Péter Szakál, Director of Education (SZTE); Péter Sziklai, Deputy Rector (ELTE); Péter Szluka, Library Director (Semmelweis University); László Vass, Rector Emeritus (METU); and Gergely Attila Zaránd, Professor (BME). The discussion was moderated by Fruzsina Szabó, Editor of HVG Diploma and Editor-in-Chief of eduline.hu.

During the discussion, all participants acknowledged that rankings are inescapable, and better positions were seen not as an end in themselves but as a means of feedback or access to funding. It was noted, however, that if the funding body expects progress in publication output, it should also provide the financial conditions for it. Several universities of applied sciences already use incentives such as rewarding accepted publications, encouraging research with publication potential, and supporting conference participation or organisation. It is also important to recognise that not all universities can be judged on scientific publications alone: for practice-oriented training, feedback from employers matters far more to both students and the institution. The information offered by rankings is likewise limited compared to the complexity of student concerns, since neither labour market validation nor student services are reflected in them. It is therefore not enough for institutions to build their recruitment strategy on rankings; direct outreach remains important in any campaign, the participants pointed out.

In her summary, Fruzsina Szabó closed the event by indicating that, taking all these considerations into account, they intend to further improve the higher education rankings published annually in the HVG Diploma supplement.

The presentations of the conference are available at the following link.