Research Paper

A Local Adaptation in an Output-Based Research Support Scheme (OBRSS) at University College Dublin

  • Liam Cleere,
  • Lai Ma
  • University College Dublin, Belfield, Dublin 4, Ireland
Corresponding author: Lai Ma (E-mail: ).

Received date: 2018-08-22

  Revised date: 2018-10-20

  Accepted date: 2018-10-25

  Online published: 2019-01-08

Open Access

Abstract

University College Dublin (UCD) has implemented the Output-Based Research Support Scheme (OBRSS) since 2016. Adapted from the Norwegian model, the OBRSS awards individual academic staff using a points system based on the number of publications and doctoral students. This article describes the design and implementation processes of the OBRSS, including the creation of the ranked publication list and points system and infrastructure requirements. Some results of the OBRSS will be presented, focusing on the coverage of publications reported in the OBRSS ranked publication list and Scopus, as well as information about spending patterns. Challenges such as the evaluation of the OBRSS in terms of fairness, transparency, and effectiveness will also be discussed.

Cite this article

Liam Cleere, Lai Ma. A Local Adaptation in an Output-Based Research Support Scheme (OBRSS) at University College Dublin[J]. Journal of Data and Information Science, 2018, 3(4): 74-84. DOI: 10.2478/jdis-2018-0022

1 Introduction

University College Dublin (UCD), Ireland’s largest university, has operated the Output-Based Research Support Scheme (hereafter “OBRSS”) since 2016. Adapted from the Norwegian model, the OBRSS rewards individual academic staff using a points system based on the number of publications and doctoral students. A major difference between the Norwegian model and the OBRSS is therefore that the Norwegian model is designed to allocate block grants to universities (Schneider, 2009; Sivertsen, 2016), whereas the OBRSS rewards academic staff individually. It should also be noted that the OBRSS is implemented as a university initiative rather than as a component of a national performance-based funding system as described in Hicks (2012) and Zacharewicz, Lepori, Reale, and Jonkers (2018).
In this article, we first describe the design and implementation processes of the OBRSS, including the creation of the ranked publication list and points system, as well as the infrastructure requirements. We then present some results of the OBRSS, focusing on the coverage of publications reported in the OBRSS ranked publication list and in Scopus, along with some data about spending. Finally, we discuss challenges such as the evaluation of the OBRSS in terms of fairness, transparency, and effectiveness.

2 Design and Implementation Processes

The overarching objective of the OBRSS is to incentivise academic staff to publish research output in higher quality outlets. The principles of the OBRSS, defined at the outset, are as follows:
●Fair—Academic staff should be actively involved in its creation and in defining its methodology.
●Transparent—Metrics and data used in the scheme are based on accessible and reproducible data.
●Easy to understand and implement—Academic staff can play a part in performance improvement.
●Underpinned by the strategic objectives—The scheme reinforces the objectives of the University Strategy 2015-2020.
●Rewards excellence—The scheme is designed to encourage research excellence.
The design involved the construction of a ranked publication list and a points system. The ranked publication list includes journals, conferences, and monographs, and the ranking is based on a basket of indicators including the Norwegian Register for Scientific Journals, Series and Publishers (NSD—National Centre for Research Data, 2018), the Danish BFI (Ministry of Higher Education and Science, 2018), the Finnish Publication Forum (Federation of Finnish Learned Societies, n.d.), SNIP (Source Normalized Impact per Paper), and CiteScore. Academic staff from across the university are consulted in finalising the ranked publication list each year; they are also requested to update their publication records in the Current Research Information System (CRIS) for points to be calculated. Only publications with a status of 'Published' in the CRIS are included in the OBRSS. PhD supervision records are maintained in the institutional Student Information System (SIS).
The OBRSS uses the ranked publication list—one section for Publishers and another for Series (Journals, Book Series, and Conference Series)—as a reference for the calculation of points. Each publication channel is assigned one of two levels: Level 1 (Normal) or Level 2 (Prestigious). Weighted scores are then applied to each publication. Similar to the Norwegian model, points are allocated for the different types of publication as summarised in Table 1:
Table 1 Points allocation per publication type.
Publication types Points Level 1 ‘normal’ Points Level 2 ‘prestigious’
Book 5 8
Journal Article 1 3
Book Chapter 1 3
Conference Publication 0.5 2
Edited Book 1 3
Other Publication 0.5 2
Published Report 1 3
There is a consultation process to ensure that inputs from the academic staff are considered in finalising the ranked publication list. During the consultation period, academic staff can make recommendations to add/remove publications to/from the ranked publication list at the two levels. The suggestions and recommendations are reviewed by the Office of Research Administration. Considering the objectives and scope of the OBRSS, external panels are not used to review the ranked publication list.
Publication points are calculated for each academic staff member’s publications in the CRIS over a three-year period (for example, 2015-2017) using the following formula (Table 2):
Table 2 Calculation of publication output point.
Publication output points = B × C × F × N, where
B = base points, allocated according to the type of publication and whether it appears in a ‘normal’ or ‘prestigious’ channel (Table 1)
C = collaboration factor: 1.25 if there is at least one international author on the paper, otherwise 1
F = UCD author factor: 0.7 if there are two UCD academic staff on the paper, 0.6 if there are three, 0.5 if there are four or more, otherwise 1
N = large-collaboration factor: 0.1 if the total number of authors on the paper exceeds 100, otherwise 1
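For illustration, the calculation in Table 2 can be expressed as a short function. This is a minimal sketch rather than the actual OBRSS implementation; the base points follow Table 1, the factors follow Table 2, and the function and parameter names are our own.

```python
# Minimal sketch of the per-publication formula in Table 2 (illustrative only).

BASE_POINTS = {
    # publication type: (Level 1 'normal', Level 2 'prestigious') points from Table 1
    "Book": (5, 8),
    "Journal Article": (1, 3),
    "Book Chapter": (1, 3),
    "Conference Publication": (0.5, 2),
    "Edited Book": (1, 3),
    "Other Publication": (0.5, 2),
    "Published Report": (1, 3),
}

def publication_points(pub_type, prestigious, international, ucd_authors, total_authors):
    """Return B x C x F x N for a single publication."""
    b = BASE_POINTS[pub_type][1 if prestigious else 0]  # B: base points
    c = 1.25 if international else 1.0                  # C: collaboration factor
    if ucd_authors >= 4:                                # F: UCD author factor
        f = 0.5
    elif ucd_authors == 3:
        f = 0.6
    elif ucd_authors == 2:
        f = 0.7
    else:
        f = 1.0
    n = 0.1 if total_authors > 100 else 1.0             # N: very large collaborations
    return b * c * f * n

# Example: a journal article in a prestigious channel, with an international
# co-author and two UCD authors: 3 x 1.25 x 0.7 x 1 = 2.625 points.
print(publication_points("Journal Article", True, True, 2, 5))
```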
The total publication points for an individual are equal to the sum of the points for each of their publications over the three-year period. PhD supervision points are calculated by counting the number of PhD students supervised by each academic staff member in the current academic year; two points are awarded per student for acting as either a primary or a secondary supervisor. PhD supervision points are capped at 20. Each publication or PhD supervision point is worth €35.
All academic staff are automatically entered into the OBRSS each year. The total points that an academic staff member has accumulated are communicated in a personalised points statement. Final points statements are issued to academic staff receiving an award in October each year. The minimum threshold for a research award is €200. There is no maximum research award, but in the first two years of operation, the maximum award based on the maximum points for an individual author was between €10,000 and €15,000.
Awards may be used by academic staff for research support, such as to cover travel expenses, office supplies, equipment, and laboratory supplies. Overall, approximately 1% of the university’s total annual research budget is allocated to the OBRSS.
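Putting the pieces together, the conversion of points into a euro award might look like the following sketch. The €35 point value, the 20-point supervision cap, and the €200 minimum threshold are taken from the text above; the function name and inputs are illustrative assumptions.

```python
# Sketch of converting OBRSS points into a research award (illustrative only).

POINT_VALUE_EUR = 35   # each publication or PhD supervision point is worth EUR 35
SUPERVISION_CAP = 20   # PhD supervision points are capped at 20
MIN_AWARD_EUR = 200    # awards below this threshold are not paid out

def obrss_award(publication_points_total, phd_students_supervised):
    """Return the award in euro for a given year, or 0 if below the threshold."""
    supervision_points = min(2 * phd_students_supervised, SUPERVISION_CAP)
    award = (publication_points_total + supervision_points) * POINT_VALUE_EUR
    return award if award >= MIN_AWARD_EUR else 0.0

# Example: 25.5 publication points and 6 PhD students
# -> (25.5 + 12) x 35 = EUR 1,312.50
print(obrss_award(25.5, 6))
```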

3 Results from the First Two Years

The first two years of the OBRSS have provided valuable data for understanding research activities in the university. On the one hand, the more complete record of publications is helpful for deliberating research strategies; on the other, the spending pattern gives insight into the types of activities and resources that academic staff consider important for supporting their research. The following sections present a comparison of the coverage of publications in Scopus and in the OBRSS ranked publication list, changes in the publications reported and in the ranked publication list over the first two years, as well as some data about spending to date.

3.1 Coverage of Scopus and OBRSS

One of the most significant outcomes of the implementation of the OBRSS is a more complete picture of publication records at University College Dublin. The number of academic staff updating their research profiles in the CRIS has increased each year. In the first year the OBRSS was implemented, 85% of academic staff updated their profiles, compared with 75% over the previous three years.
Using the publication records in the CRIS, we can compare the coverage of research outputs per School and College in Scopus and in the OBRSS ranked publication list. Although many international university ranking organisations use either Scopus or Web of Science as a data source to evaluate the research performance of an institution, these sources only work well for STEM (Science, Technology, Engineering & Mathematics) disciplines, where coverage can be as high as 94% (Physics). For Arts & Humanities disciplines, coverage of outputs varies between 2% (Irish, Celtic Studies and Folklore) and 18% (English, Drama & Film). Coverage here refers to the number of outputs indexed in Scopus expressed as a percentage of all outputs recorded in the CRIS (Table 3).
Table 3 Comparison of publications for academic staff only from 2013 to 2017 inclusive; Scopus data from SciVal 25 May 2018; CRIS data from UCD RMS Profiles 22 June 2018.
UCD School Name (Discipline) Scopus Total 2013-2017 CRIS Total 2013-2017 % Coverage in Scopus
Agriculture & Food Science 868 1,085 80.0%
Archaeology 73 202 36.1%
Architecture, Planning and Environmental Policy 127 580 21.9%
Art History & Cultural Policy 12 141 8.5%
Biology & Environmental Science 448 572 78.3%
Biomolecular & Biomedical Science 468 522 89.7%
Biosystems and Food Engineering 470 665 70.7%
Business 459 1,047 43.8%
Chemical & Bioprocess Engineering 250 266 94.0%
Chemistry 391 452 86.5%
Civil Engineering 205 470 43.6%
Classics 5 53 9.4%
Computer Science 767 916 83.7%
Earth Sciences 149 390 38.2%
Economics 147 186 79.0%
Education 83 178 46.6%
Electrical & Electronic Engineering 730 860 84.9%
English, Drama & Film 78 415 18.8%
Geography 86 324 26.5%
History 40 291 13.7%
Information & Communication Studies 61 149 40.9%
Irish, Celtic Studies and Folklore 3 145 2.1%
Languages, Cultures and Linguistics 57 367 15.5%
Law 52 495 10.5%
Mathematics & Statistics 460 617 74.6%
Mechanical & Materials Engineering 478 919 52.0%
Medicine 1,867 2,451 76.2%
Music 7 126 5.6%
Nursing, Midwifery & Health Systems 241 534 45.1%
Philosophy 98 246 39.8%
Physics 1,325 1,403 94.4%
Politics & International Relations 124 321 38.6%
Psychology 277 622 44.5%
Public Health, Physiotherapy and Sports Science 747 977 76.5%
Social Policy, Social Work and Social Justice 128 437 29.3%
Sociology 46 267 17.2%
Veterinary Medicine 626 1,094 57.2%
Grand Total 12,453 20,785 59.9%
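As a simple check, the coverage figures in Tables 3 and 4 are the Scopus counts expressed as a percentage of the corresponding CRIS counts; a brief sketch, using two rows from Table 3:

```python
# Coverage in Scopus as a percentage of CRIS-recorded outputs (Tables 3 and 4).
def scopus_coverage(scopus_count, cris_count):
    return 100 * scopus_count / cris_count

print(round(scopus_coverage(1325, 1403), 1))  # Physics: 94.4
print(round(scopus_coverage(3, 145), 1))      # Irish, Celtic Studies and Folklore: 2.1
```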
In total, the UCD CRIS records approximately 4,000 publication records per year for academic staff, compared with approximately 2,500 per year in Scopus; see Table 4 below:
Table 4 Comparison of publications for academic staff only from 2013 to 2017 inclusive; Scopus data from SciVal 25 May 2018; CRIS data from UCD RMS Profiles 22 June 2018.
UCD College Name Scopus Total 2013-2017 CRIS Total 2013-2017 % Coverage in Scopus
College of Arts & Humanities 228 1,556 14.7%
College of Business 468 1,046 44.7%
College of Engineering & Architecture 2,384 3,691 64.6%
College of Health and Agricultural Sciences 4,017 6,021 66.7%
College of Science 3,925 5,075 77.3%
College of Social Sciences & Law 1,227 3,487 35.2%
Grand Total 12,249 20,876 58.7%
Figure 2. Shares in percent of total output per College in the OBRSS categories: Prestigious Channel - Level 2; Normal Channel - Level 1; Not recognised in OBRSS publication list.
The distribution of publication output in the OBRSS shows that the coverage is consistent across Colleges, apart from the College of Science. The anomaly (Figure 2) is due to the large volume of papers produced in the School of Physics through international collaborations. A single paper in Physics can have more than 5,000 authors (Castelvecchi, 2015), leading to a small number of academic staff in the School of Physics producing a large volume of publications in both Prestigious Level 2 and Normal Level 1 channels.

3.2 Trends in Publications and Research Activities

It is expected that the ranked publication list created for the OBRSS would provide some guidance on the choice of publication outlets. Whilst it is understood that any increase can be due to many factors and would require careful examination, the initial results from the first two years of the OBRSS show some evidence that academic staff are choosing to publish in higher ranked publication outlets (Table 5).
Table 5 Number of publications per OBRSS category, per scheme year.
OBRSS categories 2016 Scheme (Publications from 2013 to 2015) 2017 Scheme (Publications from 2014 to 2016) Difference % Difference
Prestigious Channel - Level 2 4,230 4,444 214 5.1%
Normal Channel - Level 1 4,267 6,323 2,056 48.2%
Not recognised in OBRSS publication list 4,515 3,202 -1,313 -29.1%
Grand Total 13,012 13,969 957 7.4%
In the second year of implementation, there was a small increase (5%) in the reported number of publications in Prestigious Level 2 publication channels, and a much larger increase (48%) in Normal Level 1 publication channels. At the same time, there appeared to be less publishing activity (-29%) in channels that are not recognised by the OBRSS. While these figures are indicative, a trend cannot be established: first, the OBRSS has only been in operation for two years; second, it is likely that academic staff altered the publications reported in the CRIS in the second year based on the outcome of the first.
Comparability is also affected by the changes in the Prestigious Level 2 and Normal Level 1 channels from 2016 to 2017. As can be seen in Tables 6 and 7 below, the number of Prestigious Level 2 channels was reduced while the overall number of ranked publication channels remained approximately stable.
Table 6 Number of ranked journals, conferences and book series channels per OBRSS category, per scheme year.
Journal List 2016 2017 Difference % Difference
Prestigious Channel - Level 2 4,485 3,958 -527 -11.80%
Normal Channel - Level 1 38,544 39,128 584 1.50%
Grand Total 45,045 45,103 58 0.10%
Table 7 Number of ranked publisher channels per OBRSS category, per scheme year.
Publisher list 2016 2017 Difference % Difference
Prestigious Channel - Level 2 265 257 -8 -3.00%
Normal Channel - Level 1 2,190 2,200 10 0.50%
Grand Total 2,455 2,457 2 0.10%

3.3 Research Funding and Spending

Since the implementation in 2016, over €1.3m in new research funding has been allocated to academic staff to support their research activities. Both the number of recipients and the average award value increased from 2016 to 2017.
Figure 3. Number of award recipients per college and average award value.
Interestingly, over 50% of the awardees have not spent any of their research support funds. Of those who have used their funds, expenses have been claimed for travel, office supplies, equipment, and laboratory supplies:
●41% of the funds spent were used to cover travel inside and outside the EU, including accommodation, transport, and subsistence expenses;
●25% of the funds spent were used to buy office supplies. Examples of the items purchased include books, journal subscriptions, staff training courses, website design, and copy-printing;
●11% of the funds spent were used to buy equipment, such as PCs, laptops, peripherals, and laboratory or office furniture;
●8% of the funds spent were used to purchase laboratory supplies, such as chemicals, parts, disposables, glassware and plastics, and other general supplies.

4 Challenges

Implementing the OBRSS requires sound infrastructure, including the Current Research Information System (CRIS), which supports reporting of publication coverage and trends, as well as spending patterns. Substantial resources were also needed to create and maintain the ranked publication list. Indeed, the construction of the ranked publication list and the points system is the basic step in making the OBRSS work. When compiling the publication list, suggestions and recommendations from academic staff are essential for gauging the completeness of the list as well as the appropriateness of the ranks assigned. However, disciplinary differences are sometimes difficult to reconcile. For example, a publisher may be considered prestigious in one discipline but normal in another. Some suggestions also fall into specialised areas where the publication is not indexed in Scopus and is not included in the Danish, Finnish, or Norwegian lists. The decision about inclusion or exclusion of such a publication can be taxing when balancing the credibility and fairness of the list. Nevertheless, as the ranked publication list is updated every year, it is expected that its scope and rankings will be adjusted to reflect quality, impact, and disciplinary norms.
Another challenge is evaluating the effectiveness of the OBRSS against its objective of increasing publications in high quality outlets. The main reason is that many factors can contribute to changes in publication patterns, for example, the research areas of new staff members, national and EU funding priorities, and so on. An assessment of the effectiveness of the OBRSS would therefore be inconclusive, even though the scheme is likely a contributing factor in steering research outputs. There is also a risk that the OBRSS could result in a higher number of publications at the Normal Level 1 than at the Prestigious Level 2, as Butler (2003, 2004) suggests in her studies of Australia. However, there is evidence from Norway that Level 2 publishing may increase as well (Schneider et al., 2015), which is just as likely here since the value of the award at UCD is rather small. The fact that less than 50% of the awardees have spent the funds is an indication that the award does not provide a strong enough incentive to game the system. It is expected that the ranked publication list would be seen as a guide to high quality publication channels and would alter preferences for outlets accordingly, whilst whether academic staff would be extrinsically motivated needs further investigation.
The OBRSS can also be used in ways unintended by the objectives of the scheme. At present, the heads of school are provided with the points statements of academic staff in their unit. It is not clear whether and how the incentives might trickle down (Aagaard, 2015) or how the scheme might influence perceptions of university management and policy (Liefner, 2003; Woelert & Yates, 2015). There have been reports that the points have been used for self-evaluation as well as for comparison by academic staff and heads of school. The “constitutive effects” (Dahler-Larsen, 2014) will also demand further investigation in the future.

5 Conclusion

This paper summarises the first two years of implementation of the OBRSS and presents some data about the coverage of publications and spending patterns. Whilst there have been questions concerning the fairness, transparency, effectiveness, and efficiency of the scheme, the OBRSS has also received encouraging and positive feedback. Since most funding schemes, both within the university and those offered by national and EU funding agencies, are competitive, faculty appreciate the award of discretionary funds. The less positive responses to the scheme largely reflect dissatisfaction or disagreement with the ranked publication list. With the consultation process in place, it is hoped that the list will be updated to reflect quality, impact, and disciplinary norms. Research outputs and publications will be analysed regularly, with the understanding that the contribution of the OBRSS cannot be established conclusively.
It should also be noted that the points system of the OBRSS is not intended to be used as a tool, let alone the sole criterion, for research assessment. Since the OBRSS points system does not represent the impact and quality of all kinds of research output, it does not necessarily reflect individuals’ research performance, particularly for those whose research outputs are more tailored and useful to local audiences such as policy makers and businesses. An analysis of research outputs not included in the OBRSS ranked publication list would provide some insight into notions of impact other than publications.
Nevertheless, the OBRSS has set an example of an output-based research support scheme in Ireland. Two other universities are currently considering implementing similar schemes. It is not yet known, however, whether they will adapt the Norwegian model when constructing their publication lists and points systems.
Based on the experience of adapting the Norwegian model (Aagaard, Bloch, & Schneider, 2015), it will be a few years before publication trends and the effects of the OBRSS can be identified and analysed. Nevertheless, the implementation has already provided insight into current publication patterns and preferences.
This is an open access article licensed under the Creative Commons Attribution-NonCommercial-NoDerivs License (http://creativecommons.org/licenses/by-nc-nd/4.0/).

The authors have declared that no competing interests exist.

[1]
Aagaard, K. (2015). How incentives trickle down: Local use of a national bibliometric indicator system. Science and Public Policy, 42, 725-737.


[2]
Aagaard, K., Bloch, C., & Schneider, J. W. (2015). Impacts of performance-based research funding systems: The case of the Norwegian Publication Indicator. Research Evaluation, 24(2), 106-117.

[3]
Butler, L. (2003). Modifying publication practices in response to funding formulas. Research Evaluation, 12(1), 39-46.

[4]
Butler, L. (2004). What happens when funding is linked to publication counts? In Handbook of Quantitative Science and Technology Research (pp. 389-405). Dordrecht: Kluwer Academic Publishers.

[5]
Castelvecchi, D. (2015). Physics paper sets record with more than 5,000 authors. Nature, May 15.

[6]
Dahler-Larsen, P. (2014). Constitutive effects of performance indicators: Getting beyond unintended consequences. Public Management Review, 16(7), 969-986.

[7]
Hicks, D. (2012). Performance-based university research funding systems. Research Policy, 41(2), 251-261.

[8]
Federation of Finnish Learned Societies. (n.d.). Publication Forum.

[9]
Liefner, I. (2003). Funding, resource allocation, and performance in higher education systems. Higher Education, 46(4), 469-489.

[10]
Ministry of Higher Education and Science. (2018). The BFI lists. Retrieved from https://ufm.dk/en/research-and-innovation/statistics-and-analyses/bibliometric-research-indicator/bfi-lists.

[11]
NSD-National Centre for Research Data. (2018). Scientific journals, series and publishers. Retrieved from https://dbh.nsd.uib.no/publiseringskanaler/Forside?request_locale=en.

[12]
Schneider, J. W. (2009). An outline of the bibliometric indicator used for performance-based funding of research institutions in Norway. European Political Science, 8(3), 364-378.

[13]
Schneider, J. W., Aagaard, K., & Bloch, C. W. (2015). What happens when national research funding is linked to differentiated publication counts? A comparison of the Australian and Norwegian publication-based funding models. Research Evaluation, 25(3), 244-256.


[14]
Sivertsen, G. (2016). Publication-based funding: The Norwegian model. In Research Assessment in the Humanities (pp. 79-90). Cham: Springer International Publishing.

[15]
Woelert, P., & Yates, L. (2015). Too little and too much trust: Performance measurement in Australian higher education. Critical Studies in Education, 56(2), 175-189.

[16]
Zacharewicz, T., Lepori, B., Reale, E., & Jonkers, K. (2018). Performance-based research funding in EU Member States—a comparative assessment. Science and Public Policy.
