[1] Abramo, G., Cicero, T., & D’Angelo, C.A. (2013). National peer-review research assessment exercises for the hard sciences can be a complete waste of money: The Italian case. Scientometrics, 95, 311-324.
[2] Abramo, G., & D’Angelo, C.A. (2023). The impact of Italian performance-based research funding systems on the intensity of international research collaboration. Research Evaluation, 32(1), 47-57.
[3] Adair, J.G. (1984). The Hawthorne effect: A reconsideration of the methodological artifact. Journal of Applied Psychology, 69(2), 334-345.
[4] Akbaritabar, A., Bravo, G., & Squazzoni, F. (2021). The impact of a national research assessment on the publications of sociologists in Italy. Science and Public Policy, 48(5), 662-678.
[5] Aksnes, D.W. (2006). Citation rates and perceptions of scientific contribution. Journal of the American Society for Information Science and Technology, 57(2), 169-185.
[6] Aksnes, D.W., & Rip, A. (2009). Researchers’ perceptions of citations. Research Policy, 38(6), 895-905.
[7] Anfossi, A., Ciolfi, A., Costa, F., Parisi, G., & Benedetto, S. (2016). Large-scale assessment of research outputs through a weighted combination of bibliometric indicators. Scientometrics, 107(2), 671-683.
[8] Biagioli, M. (2016). Watch out for cheats in citation game. Nature, 535(7611), 201.
[9] Biagioli, M., & Lippman, A. (Eds.). (2020). Gaming the metrics: Misconduct and manipulation in academic research. MIT Press.
[10] Bonaccorsi, A. (2020a). Two decades of experience in research assessment in Italy. Scholarly Assessment Reports, 2(1), 16.
[11] Bonaccorsi, A. (2020b). Two decades of research assessment in Italy. Addressing the criticisms. Scholarly Assessment Reports, 2(1), 17.
[12] Bornmann, L. (2014). Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. Journal of Informetrics, 8(4), 895-903.
[13] Bornmann, L., & Wohlrabe, K. (2019). Normalization of citation impact in economics. Scientometrics, 120(2), 841-884.
[14] Brooks, C., & Schopohl, L. (2018). Topics and trends in finance research: What is published, who publishes it and what gets cited? The British Accounting Review, 50(6), 615-637.
[15] Broucker, B., & De Wit, K. (2015). New public management in higher education. In The Palgrave international handbook of higher education policy and governance (pp. 57-75). London: Palgrave Macmillan UK.
[16] Buela-Casal, G., & Zych, I. (2012). What do the scientists think about the impact factor? Scientometrics, 92(2), 281-292.
[17] Checchi, D., Mazzotta, I., Momigliano, S., & Olivanti, F. (2020). Convergence or polarisation? The impact of research assessment exercises in the Italian case. Scientometrics, 124, 1439-1455.
[18] Chen, C. M.-L., & Lin, W.-Y. C. (2018). What indicators matter? The analysis of perception towards research assessment indicators and Leiden Manifesto: The case study of Taiwan. In R. Costas, T. Franssen, & A. Yegros-Yegros (Eds.), Proceedings of the 23rd International Conference on Science and Technology Indicators (STI 2018) (pp. 688-698). Leiden, Netherlands: Centre for Science and Technology Studies (CWTS). https://openaccess.leidenuniv.nl/bitstream/handle/1887/65192/STI2018_paper_121.pdf?sequence=1
[19] Cheung, W.W. (2008). The economics of post-doc publishing. Ethics in Science and Environmental Politics, 8(1), 41-44.
[20] Corsi, M., D’Ippoliti, C., & Zacchia, G. (2019). On the evolution of the glass ceiling in Italian academia: the case of economics. Science in Context, 32(4), 411-430.
[21] Curry, S., de Rijcke, S., Hatch, A., Pillay, D.G., van der Weijden, I., & Wilsdon, J. (2020). The changing role of funders in responsible research assessment: progress, obstacles and the way ahead. Research on Research Institute Working Paper, No. 3. https://doi.org/10.6084/m9.figshare.13227914.v1
[22] Deci, E.L., & Ryan, R.M. (2013). Intrinsic motivation and self-determination in human behavior. NY, USA: Springer Science & Business Media.
[23] Derrick, G.E., & Gillespie, J. (2013). “A number you just can’t get away from”: Characteristics of adoption and the social construction of metrics use by researchers. In S. Hinze & A. Lottman (Eds.), Proceedings of the 18th International Conference on Science and Technology Indicators (pp. 104-116).
[24] DORA. (2012). San Francisco Declaration on Research Assessment. Retrieved April 20, 2023, from https://sfdora.org/read
[25] Dorsch, I., Jeffrey, A., Ebrahimzadeh, S., Maggio, L.A., & Haustein, S. (2021). Metrics literacies: On the state of the art of multimedia scholarly metrics education. In Proceedings of the 18th International Conference on Scientometrics and Informetrics (pp. 1465-1466). Leuven, Belgium: Zenodo. https://doi.org/10.5281/ZENODO.5101306
[26] European Commission. (2021). Towards a reform of the research assessment system: scoping report. Luxembourg: Publications Office of the European Union.
[27] Fan, W., & Yan, Z. (2010). Factors affecting response rates of the web survey: A systematic review. Computers in Human Behavior, 26(2), 132-139.
[28] Ferguson, C., Marcus, A., & Oransky, I. (2014). The peer-review scam. Nature, 515(7528), 480.
[29] Franceschini, F., Maisano, D., & Mastrogiacomo, L. (2016). Empirical analysis and classification of database errors in Scopus and Web of Science. Journal of Informetrics, 10(4), 933-953.
[30] Gingras, Y. (2016). Bibliometrics and research evaluation: Uses and abuses. MIT Press.
[31] Goodhart, C.A.E. (1975). Problems of monetary management: The UK experience. In C.A.E. Goodhart (Ed.), Monetary theory and practice: The UK experience. Papers in monetary economics (Vol. 1, pp. 91-121). Sydney, Australia: Reserve Bank of Australia.
[32] Guba, K. (2024). Why do sociologists on academic periphery willingly support bibliometric indicators? Scientometrics, 129(1), 497-518.
[33] Guba, K., Zheleznov, A., & Chechik, E. (2023). Evaluating grant proposals: Lessons from using metrics as screening device. Journal of Data and Information Science, 8(2), 66-92.
[34] Haddow, G., & Hammarfelt, B. (2019). Quality, impact, and quantification: Indicators and metrics use by social scientists. Journal of the Association for Information Science and Technology, 70(1), 16-26.
[35] Hammarfelt, B., & Haddow, G. (2018). Conflicting measures and values: How humanities scholars in Australia and Sweden use and react to bibliometric indicators. Journal of the Association for Information Science and Technology, 69(7), 924-935.
[36] Hammarfelt, B., & Rushforth, A.D. (2017). Indicators as judgment devices: An empirical study of citizen bibliometrics in research evaluation. Research Evaluation, 26(3), 169-180.
[37] Haustein, S. (2016). Grand challenges in altmetrics: Heterogeneity, data quality and dependencies. Scientometrics, 108, 413-423.
[38] Haustein, S., & Larivière, V. (2014). The use of bibliometrics for assessing research: Possibilities, limitations and adverse effects. In I.M. Welpe, J. Wollersheim, S. Ringelhan, & M. Osterloh (Eds.), Incentives and performance: Governance of research organizations (pp. 121-139). Cham: Springer International Publishing.
[39] Hicks, D. (2004). The four literatures of social science. In H.F. Moed, W. Glänzel, & U. Schmoch (Eds.), Handbook of quantitative science and technology research: The use of publication and patent statistics in studies of S&T systems (pp. 473-496). Dordrecht, Netherlands: Springer.
[40] Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). The Leiden Manifesto for research metrics. Nature, 520(7548), 429-431. https://doi.org/10.1038/520429a
[41] Kamrani, P., Dorsch, I., & Stock, W.G. (2021). Do researchers know what the h-index is? And how do they estimate its importance? Scientometrics, 126(7), 5489-5508.
[42] Kulczycki, E., Engels, T.C., Pölönen, J., Bruun, K., Dušková, M., Guns, R., ... & Zuccala, A. (2018). Publication patterns in the social sciences and humanities: Evidence from eight European countries. Scientometrics, 116, 463-486.
[43] Lemke, S., Mehrazar, M., Mazarakis, A., & Peters, I. (2019). “When you use social media you are not working”: Barriers for the use of metrics in Social Sciences. Frontiers in Research Metrics and Analytics, 3, 39.
[44] Lin, J., & Fenner, M. (2013). Altmetrics in evolution: Defining and re-defining the ontology of article-level metrics. Information Standards Quarterly, 25(2), 19-26.
[45] Ma, L., & Ladisch, M. (2019). Evaluation complacency or evaluation inertia? A study of evaluative metrics and research practices in Irish universities. Research Evaluation, 28(3), 209-217.
[46] Maggio, L.A., Jeffrey, A., Haustein, S., & Samuel, A. (2022). Becoming metrics literate: An analysis of brief videos that teach about the h-index. PLoS ONE, 17(5), e0268110.
[47] Mason, S., Merga, M.K., Canche, M.S.G., & Roni, S.M. (2021). The internationality of published higher education scholarship: How do the ‘top’ journals compare? Journal of Informetrics, 15(2), 101155.
[48] Millman, J., Bishop, C.H., & Ebel, R. (1965). An analysis of test-wiseness. Educational and Psychological Measurement, 25(3), 707-726.
[49] Moed, H.F. (2006). Citation analysis in research evaluation. Springer Science & Business Media.
[50] Moed, H.F. (2020). Appropriate use of metrics in research assessment of autonomous academic institutions. Scholarly Assessment Reports, 2(1), 1. http://doi.org/10.29024/sar.8
[51] Moher, D., Bouter, L., Kleinert, S., Glasziou, P., Sham, M.H., Barbour, V., ... & Dirnagl, U. (2020). The Hong Kong Principles for assessing researchers: Fostering research integrity. PLoS Biology, 18(7), e3000737. https://doi.org/10.1371/journal.pbio.3000737
[52] Necker, S. (2014). Scientific misbehavior in economics. Research Policy, 43(10), 1747-1759. https://doi.org/10.1016/j.respol.2014.05.002
[53] Olaya Escobar, E.S., Berbegal-Mirabent, J., Alegre, I., & Duarte Velasco, O.G. (2017). Researchers’ willingness to engage in knowledge and technology transfer activities: An exploration of the underlying motivations. R&D Management, 47(5), 715-726.
[54] Penny, D. (2016). What matters where? Cultural and geographical factors in science. Slides presented at the 3rd Altmetrics Conference, Bucharest, Romania. Retrieved from https://figshare.com/articles/What_matters_where_Cultural_and_geographical_factors_in_science/3969012
[55] Rafols, I., Leydesdorff, L., O’Hare, A., Nightingale, P., & Stirling, A. (2012). How journal rankings can suppress interdisciplinary research: A comparison between Innovation Studies and Business and Management. Research Policy, 41(7), 1262-1282.
[56] Rousseau, R., Egghe, L. & Guns, R. (2018). Becoming metric-wise. A bibliometric guide for researchers. Kidlington: Chandos (Elsevier).
[57] Rousseau, S., Catalano, G., & Daraio, C. (2021). Can we estimate a monetary value of scientific publications? Research Policy, 50(1), 104116.
[58] Rousseau, S., & Rousseau, R. (2015). Metric-wiseness. Journal of the Association for Information Science and Technology, 66(11), 2389.
[59] Rousseau, S., & Rousseau, R. (2017). Being metric-wise: Heterogeneity in bibliometric knowledge. El Profesional de la Información, 26(3), 480-487.
[60] Rousseau, S., & Rousseau, R. (2021). Bibliometric techniques and their use in business and economics research. Journal of Economic Surveys, 35(5), 1428-1451.
[61] Schubert, T. (2009). Empirical observations on new public management to increase efficiency in public research—Boon or bane? Research Policy, 38(8), 1225-1234.
[62] Söderlind, J., & Geschwind, L. (2020). Disciplinary differences in academics’ perceptions of performance measurement at Nordic universities. Higher Education Governance and Policy, 1(1), 18-31.
[63] Thelwall, M., & Kousha, K. (2021). Researchers’ attitudes towards the h-index on Twitter 2007-2020: Criticism and acceptance. Scientometrics, 126(6), 5361-5368.
[64] Tourish, D., & Willmott, H. (2015). In defiance of folly: Journal rankings, mindless measures and the ABS Guide. Critical Perspectives on Accounting, 26, 37-46.
[65] van Dalen, H.P., & Henkens, K. (2012). Intended and unintended consequences of a publish-or-perish culture: A worldwide survey. Journal of the American Society for Information Science and Technology, 63(7), 1282-1293.
[66] van Raan, A.F. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62, 133-143.
[67] Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., Jones, R., Kain, R., Kerridge, S., Thelwall, M., Tinkler, J., Viney, I., Wouters, P., Hill, J., & Johnson, B. (2015). The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. https://doi.org/10.13140/RG.2.1.4929.1363