Vysshee Obrazovanie v Rossii = Higher Education in Russia

The Role of Scientometric Thresholds for the Evaluation of Grant Applications

https://doi.org/10.31992/0869-3617-2023-32-10-57-75

Abstract

The present study focuses on data from the Russian Science Foundation (RSF). The authors analyze the effect of using quantitative indicators in grant allocation, exploiting a natural experiment: the increase in the publication threshold for principal investigators between two waves of grant competitions, in 2014 and 2017. The relatively new RSF was selected as the case study because of its policy of setting a publication threshold for grants’ principal investigators. This policy change provides the opportunity to study whether reliance on bibliometric indicators brings better results in the project evaluation process. The analysis covers two groups of researchers: 1) physicists and 2) social sciences and humanities scholars. Bibliographic data were collected from Scopus, while data on the funded projects were verified on the foundation’s website. The following questions are explored in detail: whether the policy redirected funds toward researchers with stronger publication records, which strategies individual researchers used to increase their publication counts, and whether the policy’s effects differed between disciplines. The authors found that the selection among physicists was already effective in the first wave: the grant recipients were prolific authors who had published many highly cited papers before 2014. The results also indicate that the group of research leaders in physics did not change significantly between the two selected waves of competitions (from 2014 to 2017). Although social scientists demonstrated a relatively weak ability to publish internationally, the increase in scientometric expectations improved their publication records in terms of both the quantity and the quality of publications.

About the Authors

K. S. Guba
Center for Institutional Analysis of Science and Education of the European University at Saint Petersburg
Russian Federation

Katerina S. Guba – Cand. Sci. (Sociology), Director of the Center.

6/1, A, Gagarinskaya str., 191187, Saint Petersburg.



A. M. Zheleznov
Center for Institutional Analysis of Science and Education of the European University at Saint Petersburg
Russian Federation

Alexey M. Zheleznov – Researcher of the Center.

6/1, A, Gagarinskaya str., 191187, Saint Petersburg.



E. A. Chechik
Center for Institutional Analysis of Science and Education of the European University at Saint Petersburg
Russian Federation

Elena A. Chechik – Junior Researcher of the Center.

6/1, A, Gagarinskaya str., 191187, Saint Petersburg.



References

1. Merton, R.K. (1968). The Matthew Effect in Science. Science. Vol. 159, no. 3810, pp. 56-63, doi: 10.1126/science.159.3810.56

2. Roshani, S., Bagherylooieh, M.-R., Mosleh, M., Coccia, M. (2021). What Is the Relationship Between Research Funding and Citation-Based Performance? A Comparative Analysis Between Critical Disciplines. Scientometrics. Vol. 126, no. 9, pp. 7859-7874, doi: 10.1007/s11192-021-04077-9

3. Bol, T., Vaan, M. de, van de Rijt, A. (2018). The Matthew Effect in Science Funding. PNAS. Vol. 115, no. 19, pp. 4887-4890, doi: 10.1073/pnas.1719557115

4. Mindeli, L.E., Libkind, A.H., Markusova, V.A. (2014). The Influence of the Grant Financing on the Efficiency of Scientific Research at the Higher School. Herald of the Russian Academy of Sciences. Vol. 84, no. 12, pp. 1080-1089, doi: 10.7868/S0869587314120111 (In Russ., abstract in Eng.).

5. Mirskaya E.Z. (2006). State Grants as a Tool for the Modernization of Russian Academic Science. Bulletin of the Russian Humanitarian Scientific Foundation. Vol. 44, no. 3, pp. 115-134. Available at: https://elibrary.ru/download/elibrary_37198121_74092538.pdf (accessed 10.06.2023). (In Russ.)

6. Dezhina, I.G., Simachev, Y.V. (2013). Matching Grants for Stimulating Partnerships between Companies and Universities in Innovation Area: Initial Effects in Russia. Journal of the New Economic Association. Vol. 19, no. 3, pp. 99-122. Available at: http://www.econorus.org/repec/journl/2013-19-99-122r.pdf (accessed 08.08.2023). (In Russ., abstract in Eng.).

7. Dushina, S.A. (2017). Research Transfer: Once Again on Mobility, Mega-grants and the First Academics. Sociology of Science and Technology. Vol. 8, no. 2, pp. 87-103. Available at: https://elibrary.ru/download/elibrary_29384063_57014291.pdf (accessed 08.08.2023). (In Russ., abstract in Eng.).

8. Saygitov, R.T. (2014). The Impact of Funding through the RF President’s Grants for Young Scientists (the field – Medicine) on Research Productivity: A Quasi-Experimental Study and a Brief Systematic Review. PLOS ONE. Vol. 9, no. 1, e86969, doi: 10.1371/journal.pone.0086969

9. Fedderke, J.W., Goldschmidt, M. (2015). Does Massive Funding Support of Researchers Work?: Evaluating the Impact of the South African Research Chair Funding Initiative. Research Policy. Vol. 44, no. 2, pp. 467-482, doi: 10.1016/j.respol.2014.09.009

10. Sandström, U., Hällsten, M. (2008). Persistent Nepotism in Peer-Review. Scientometrics. Vol. 74, no. 2, pp. 175-189, doi: 10.1007/s11192-008-0211-3

11. van den Besselaar, P., Leydesdorff, L. (2009). Past Performance, Peer Review and Project Selection: A Case Study in the Social and Behavioral Sciences. Research Evaluation. Vol. 18, no. 4, pp. 273-288, doi: 10.48550/arXiv.0911.1306

12. Azoulay, P., Li, D. (2021). Scientific Grant Funding. Innovation and Public Policy. University of Chicago Press, pp. 117-150, doi: 10.7208/chicago/9780226805597-008

13. Maisano, D.A., Mastrogiacomo, L., Franceschini, F. (2020). Short-Term Effects of Non-Competitive Funding to Single Academic Researchers. Scientometrics. Vol. 123, no. 3, pp. 1261-1280, doi: 10.1007/s11192-020-03449-x

14. Gush, J., Jaffe, A., Larsen, V., Laws, A. (2018). The Effect of Public Funding on Research Output: The New Zealand Marsden Fund. New Zealand Economic Papers. Vol. 52, no. 2, pp. 227-248, doi: 10.1080/00779954.2017.1325921

15. Tonta, Y. (2018). Does Monetary Support Increase the Number of Scientific Papers? An Interrupted Time Series Analysis. Journal of Data and Information Science. Vol. 3, no. 1, pp. 19-39, doi: 10.2478/jdis-2018-0002

16. Hornbostel, S., Böhmer, S., Klingsporn, B., Neufeld, J., von Ins, M. (2009). Funding of Young Scientist and Scientific Excellence. Scientometrics. Vol. 79, no. 1, pp. 171-190, doi: 10.1007/s11192-009-0411-5

17. Gralka, S., Wohlrabe, K., Bornmann, L. (2019). How to Measure Research Efficiency in Higher Education? Research Grants vs. Publication Output. Journal of Higher Education Policy and Management. Vol. 41, no. 3, pp. 322-341, doi: 10.1080/1360080X.2019.1588492

18. Langfeldt, L., Benner, M., Sivertsen, G., Kristiansen, E.H., Aksnes, D.W., Borlaug, S.B., Hansen, H.F., Kallerud, E., Pelkonen, A. (2015). Excellence and Growth Dynamics: A Comparative Study of the Matthew Effect. Science and Public Policy. Vol. 42, no. 5, pp. 661-675, doi: 10.1093/scipol/scu083

19. Dezhina, I.G. (2020). Scientific “Centers of Excellence” in Russian Universities: Changing Models. ECO. Vol. 50, no. 4, pp. 87-109, doi: 10.30680/ECO0131-7652-2020-4-87-109 (In Russ.).

20. Morillo, F. (2019). Collaboration and Impact of Research in Different Disciplines with International Funding (from the EU and Other Foreign Sources). Scientometrics. Vol. 120, no. 3, pp. 807-823, doi: 10.1007/s11192-019-03150-8

21. Wang, J., Shapira, P. (2015). Is There a Relationship between Research Sponsorship and Publication Impact? An Analysis of Funding Acknowledgments in Nanotechnology Papers. PLOS ONE. Vol. 10, no. 2, e0117727, doi: 10.1371/journal.pone.0117727

22. Wang, L., Wang, X., Piro, F.N., Philipsen, N. (2020). The Effect of Competitive Public Funding on Scientific Output. Research Evaluation. Vol. 29, no. 4, pp. 418-429, doi: 10.1093/reseval/rvaa023

23. Grimpe, C. (2012). Extramural Research Grants and Scientists’ Funding Strategies: Beggars Cannot Be Choosers? Research Policy. Vol. 41, no. 8, pp. 1448-1460, doi: 10.1016/j.respol.2012.03.004

24. Aagaard, K., Kladakis, A., Nielsen, M.W. (2019). Concentration or Dispersal of Research Funding? Quantitative Science Studies. Vol. 1, no. 1, pp. 1-33, doi: 10.1162/qss_a_00002

25. Glänzel, W., Schoepflin, U. (1999). A Bibliometric Study of Reference Literature in the Sciences and Social Sciences. Information Processing & Management. Vol. 35, no. 1, pp. 31-44, doi: 10.1016/S0306-4573(98)00028-4

26. Akoev, M.A., Markusova, V.A., Moskaleva, O.V., Pislyakov, V.V. (2021). Handbook on Scientometrics: Science and Technology Development Indicators, Second edition. Yekaterinburg: IPC UrFU, 358 p., doi: 10.15826/B978-5-7996-3154-3 (In Russ.).

27. Aksnes, D.W. (2003). Characteristics of Highly Cited Papers. Research Evaluation. Vol. 12, no. 3, pp. 159-170, doi: 10.3152/147154403781776645

28. Simachev, Yu.V., Zasimova, L.S., Kurbanov, T.R. (2017). Basic Research Support by the Russian Science Foundation: What Can We Learn from the First Grant Competition? Foresight and STI. Vol. 11, no. 4, pp. 74-83, doi: 10.17323/2500-2597.2017.4.74.83

29. Neufeld, J., von Ins, M. (2011). Informed Peer Review and Uninformed Bibliometrics? Research Evaluation. Vol. 20, no. 1, pp. 31-46, doi: 10.3152/095820211X12941371876382

30. Jacobs, J.A. (2016). Journal Rankings in Sociology: Using the H Index with Google Scholar. American Sociologist. Vol. 47, no. 2-3, pp. 192-224, doi: 10.1007/s12108-015-9292-7

31. van den Besselaar, P., Sandström, U. (2020). Bibliometrically Disciplined Peer Review: on Using Indicators in Research Evaluation. Scholarly Assessment Reports. Vol. 2, no. 5, pp. 1-13, doi: 10.29024/sar.16

32. Roy, R. (1985). Funding Science: The Real Defects of Peer Review and an Alternative To It. Science, Technology & Human Values. Vol. 10, no. 3, pp. 73-81, doi: 10.1177/016224398501000309

33. Alper, S., Yelbuz, B.E., Akkurt, S.B., Yilmaz, O. (2023). The Positive Association of Education with the Trust in Science and Scientists Is Weaker in Highly Corrupt Countries. Public Understanding of Science. Online First, doi: 10.1177/09636625231176935

34. Savina, T.F., Sterligov, I.A. (2016). Potentially Predatory Journals in Scopus: Descriptive Statistics and Country-Level Dynamics. Nordic Workshop on Bibliometrics and Research Policy 2016 Proceedings. Vol. 20, doi: 10.6084/m9.figshare.4249394.v1


This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 0869-3617 (Print)
ISSN 2072-0459 (Online)