
Ranking academic institutions based on the productivity, impact, and quality of institutional scholars

  • Amir Faghri¹
  • Theodore L. Bergman²
  • ¹Department of Mechanical and Aerospace Engineering, University of California, Los Angeles, Los Angeles, CA 90095, USA
  • ²Department of Mechanical Engineering, University of Kansas, Lawrence, KS 66045, USA

Corresponding author: Theodore L. Bergman.

Received date: 2024-03-22

Revised date: 2024-06-08

Accepted date: 2024-06-13

Online published: 2024-07-11

Abstract

Purpose The quantitative rankings of over 55,000 institutions and their institutional programs are based on the individual rankings of approximately 30 million scholars determined by their productivity, impact, and quality.

Design/methodology/approach The institutional ranking process developed here considers all institutions in all countries and regions, thereby including those that are established, as well as those that are emerging in scholarly prowess. Rankings of individual scholars worldwide are first generated using the recently introduced, fully indexed ScholarGPS database. The rankings of individual scholars are extended here to determine the lifetime and last-five-year Top 20 rankings of academic institutions over all Fields of scholarly endeavor, in 14 individual Fields, in 177 Disciplines, and in approximately 350,000 unique Specialties. Rankings associated with five specific Fields (Medicine, Engineering & Computer Science, Life Sciences, Physical Sciences & Mathematics, and Social Sciences), and in two Disciplines (Chemistry, and Electrical & Computer Engineering) are presented as examples, and changes in the rankings over time are discussed.

Findings For the Fields considered here, the Top 20 institutional rankings in Medicine have undergone the least change (lifetime versus last five years), while the rankings in Engineering & Computer Science have exhibited significant change. The evolution of institutional rankings over time is largely attributed to the recent emergence of Chinese academic institutions, although this emergence is shown to be highly Field- and Discipline-dependent.

Research limitations The ScholarGPS database used here ranks institutions in four categories: (i) all Fields, (ii) 14 individual Fields, (iii) 177 Disciplines, and (iv) approximately 350,000 unique Specialties. A comprehensive investigation covering all categories is not practical.

Practical implications Existing rankings of academic institutions have: (i) often been restricted to pre-selected institutions, clouding the potential discovery of scholarly activity in emerging institutions and countries; (ii) considered only broad areas of research, limiting the ability of university leadership to act on the assessments in a concrete manner; or, in contrast, (iii) considered only a narrow area of research for comparison, diminishing the broader applicability and impact of the assessment. In general, existing institutional rankings depend on which institutions are included in the ranking process, which areas of research are considered, the breadth (or granularity) of the research areas of interest, and the methodologies used to define and quantify research performance. In contrast, the methods presented here provide data over a broad range of granularity, allowing responsible individuals to gauge the performance of any institution from the Overall (all Fields) level down to the level of the Specialty. The methods may also assist in identifying the root causes of shifts in institution rankings, and how these shifts vary across the Fields, Disciplines, and hundreds of thousands of Specialties of scholarly endeavor.

Originality/value This study provides the first ranking of all academic institutions worldwide over Fields, Disciplines, and Specialties based on a unique methodology that quantifies the productivity, impact, and quality of individual scholars.

Cite this article

Amir Faghri, Theodore L. Bergman. Ranking academic institutions based on the productivity, impact, and quality of institutional scholars[J]. Journal of Data and Information Science, 2024, 9(3): 116-154. DOI: 10.2478/jdis-2024-0017

1 Introduction

As is well known, both global and national university rankings are generated by various organizations (Rauhvargers, 2014) and used by a variety of interested parties ranging from government agencies to prospective students. The existing rankings can be based on a broad range of quantitative inputs such as but not limited to numbers of publications or citations, or on a multitude of qualitative inputs including but not limited to reputation surveys of individuals in academia or industry. Nearly all rankings are based on the cumulative scholarly output of the institution or its programs rather than, for example, on the number of highly productive and unusually influential individual scholars associated with the institution and its programs. Rankings are available for the academic institutions overall (including all areas of scholarship), and for academic institutions in specific subject areas such as engineering or medicine. Global (world) rankings are available, as are rankings of universities in specific countries/regions. Country/regional rankings tend to incorporate more inputs than do global rankings since considerations such as “student quality” can vary significantly from country-to-country or region-to-region (Cakur et al., 2015).
Some organizations conduct their rankings based on information restricted to specific calendar years (e.g. citations of works that were published during a recent five-year period), while others base at least part of their assessments on input associated with unrestricted time periods (e.g. citations of works regardless of when the works were published). Some organizations request academic institutions to provide them with additional information, while others do not. The cumulative time spent by faculty, staff, and researchers (i) responding to reputation surveys or (ii) providing information to the ranking organizations has not apparently been quantified, but it is well known within academia that responding to reputation surveys diminishes the productivity of the very institutions that will be ranked based, in part, on their productivity. Controversy as to the efficacy of rankings has surfaced due to questions about (i) the relevance of data provided in the form of reputational surveys (Bastedo & Bowman, 2010; Bowman & Bastedo, 2011; Shehatta & Mahmood, 2016), (ii) the reliability of data provided by the institutions being ranked (Daskivich & Gewertz, 2023; Pikos, 2022; Sinson et al., 2023), and (iii) the pre-selection of which universities will be considered in the rankings based on various qualifying criteria (Rauhvargers, 2014). As such, an individual institution can receive widely varying rankings by the various ranking organizations.

1.1 Comparison of ranking organization methodologies

A high-level overview of the attributes of various ranking organizations (including those of ScholarGPS™, which will be presented in detail in Section 2), based on information gleaned from the organizations’ websites, is provided in Table 1. As is evident, the number of ranked global academic institutions varies from just 100 (Reuter’s World Top 100 Innovative Universities) to nearly 12,000 (Webometrics). Excluding these outliers, along with the SCImago Institutions Ranking (8,433 institutions), the average number of institutions ranked by the remaining organizations is approximately 1,700, with individual counts ranging from 2,995 (ScholarGPS™) to 1,000 (Academic Ranking of World Universities).
Table 1. Attributes of various organizations and their ranking of academic institutions and programs.
| Ranking Organization | Number of Academic Institutions Ranked (Overall) | Input | Quantitative/Qualitative Input | Institution (Overall) | Institution Subject Areas | Institution Subject Sub-Areas | Institution Specialties | Period of Scholarly Activity Used for Annual Rankings | Transparency of Ranking Methodology |
|---|---|---|---|---|---|---|---|---|---|
| ScholarGPS | 2,995 | Number and quality of top institutional scholars¹ | Quant. only | X | 14 | 177 | ~350,000 | Lifetime and five years⁵ | High |
| University Ranking by Academic Performance | 3,000 | Scholarly publications only² | Quant. only | X | 78 | — | — | Five years⁵ | Medium/High |
| NTU Rankings | 1,050 | Scholarly publications only² | Quant. only | X | 6 | 27 | — | Eleven & two years | Medium/Low |
| Leiden Ranking | 1,411 | Scholarly publications only² | Quant. only | X | 5 | — | — | Four years | Medium/High |
| Reuter’s World’s Top 100 Innovative Universities | 100 | Scholarly publications only²,⁶ | Quant. only | X | — | — | — | Five years | Low |
| Academic Influence | unknown | Publications and other information³ | Quant. only | X | 10 | 166 | — | Unknown | Low |
| Center for World University Rankings | 2,000 | Publications and other information³ | Quant. only | X | 23 | 227 | — | Nine years⁵ | Medium |
| SCImago Institutions Ranking | 8,433 | Publications and other information³ | Quant. only | X | 19 | 57 | — | Five years | Medium |
| Webometrics | 11,997 | Publications and other information³ | Quant. only | X | — | — | — | Unknown | Low |
| MosIUR | 2,000 | Publications and other information³ | Quant. only | X | — | — | — | Three years⁵ | Medium |
| QS World University Rankings | 1,500 | Publications and other information³ | Quant. and survey⁴ | X | 5 | 54 | — | Unknown | Low |
| Academic Ranking of World Universities | 1,000 | Publications and other information³ | Quant. only⁷ | X | — | 54 | — | Variable | Medium/High |
| US News & World Report | 2,000 | Publications and other information³ | Quant. and survey⁴ | X | 46 | — | — | Five years⁸ | Medium |
| Times Higher Education World University Rankings | 1,799 | Publications and other information³ | Quant. and survey⁴ | X | 31 | — | — | Five years⁵ | Medium |
| Round University Ranking | 1,217 | Publications and other information³ | Quant. and survey⁴ | X | 6 | — | — | Multiple⁵ | Medium |
| Times Higher Education World Reputation Rankings | 1,799 | Reputation survey⁴ | Survey⁴ only | X | 31 | — | — | Not applicable | Medium |

Notes: (1) Based on top 0.5% of scholars in all institutions who are affiliated with the institution being ranked. Scholar rankings are based on their productivity, impact, and quality. See the Methodology Section. (2) Publications (journal papers, books, book chapters, conference papers, and patents) are defined differently by the various ranking organizations (e.g. some exclude books, others include books). (3) Examples include Awards, Research Funding, Size of Endowment, Alumni Success, Selective Admissions, International Student Enrollment. (4) Reputational surveys of academic faculty, or employers, or alumni. (5) Inputs related to scholarly work outside of the time period (e.g. citations in the time period, but to work published prior to the time period) are excluded. (6) Emphasis on patents. (7) Quantitative analysis of journal publications and awards with specific journals and awards identified by survey. (8) Citations in the time period, but to work published prior to the time period are included.

As evident in the three “Input” columns of Table 1, five of the ranking organizations base their analyses solely on information associated with scholarly publications (as a whole, or on a per capita basis) emanating from each university that is considered for ranking. This information might include the number of publications, the number of citations of those publications, or the number of publications appearing in specific, highly regarded journals. Also evident in the “Input” columns, most of the ranking organizations include other information as input to their ranking methodologies such as levels of endowment, the number of major awards to scholars in the university, the level of industry engagement, and other factors such as those listed in Note 3 of Table 1. The ScholarGPS rankings are the focus of this study and are based on the number and quality of top institutional scholars, as will be defined in Section 2.2. The Times Higher Education World Reputation Rankings are based only on reputation surveys. In general, therefore, the input used in the various methodologies can be purely quantitative, purely qualitative (survey only), or some combination of quantitative and qualitative input as noted in the “Quantitative/Qualitative Input” columns of the table.
From the “Entities Ranked” columns of Table 1 it is evident that every ranking organization provides ordered rankings of individual academic institutions without specification of a particular subject area (Overall). Most of the organizations also provide institution rankings relative to specific subject areas (such as engineering or medicine), ranging from 5 subject areas (Leiden Ranking) to 78 subject areas (University Ranking by Academic Performance). However, the number of subject areas covered is not a complete indicator of the granularity of the rankings, since 7 of the rankings go beyond subject areas and include subject sub-areas (such as mechanical engineering or internal medicine), ranging from 27 (NTU Rankings) to 227 (Center for World University Rankings). To our knowledge, ScholarGPS is the only ranking service that includes a third level of granularity with inclusion of nearly 350,000 Specialties (such as fluid mechanics or myocardial perfusion imaging). Ultimately, the number of entities ranked by each organization (including subject areas, sub-areas, and, in the case of ScholarGPS, Specialties) ranges from 5 (Leiden Ranking) to approximately 350,000 (ScholarGPS).
The “Period of Scholarly Activity” used in the rankings (which usually occur annually) also varies among the ranking organizations. To our knowledge, ScholarGPS is the only organization that contemporaneously considers two different time periods (lifetime and last five years) to generate two distinct annual university rankings using otherwise identical input and methodology. As will become evident in Section 3, this dual-ranking approach permits insight into recent trends in university rankings. The last “Transparency” column of Table 1 is our qualitative assessment of the clarity with which the various ranking organizations describe their methodologies on their websites.

1.2 Comparison of top 10 overall global rankings of academic institutions

Because each organization uses its own methodology to rank global universities on a common Overall basis, it is useful to consider the most recent rankings of global universities (as of October 25, 2023) to discern the degree of consistency or inconsistency among the various rankings. The recent rankings of the global Top 10 universities by various organizations are reported in Table 2. The rankings illustrate similarities and differences, consistent with those observed in multiple previous investigations (Buela-Casal et al., 2007; Chen & Liao, 2012; Moskovkin et al., 2022; Shehatta & Mahmood, 2016).
Table 2. Most recent Top 10 global university rankings according to various ranking organizations.
| Ranking Organization (Ranking Date) | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| ScholarGPS: Lifetime (2022) | Harvard | Stanford | Michigan | UCLA | U. Washington | Columbia | Johns Hopkins | U. Pennsylvania | U. Toronto | UC Berkeley |
| ScholarGPS: Last Five Years (2022) | Harvard | Stanford | Oxford | Michigan | U. Pennsylvania | U. Toronto | Johns Hopkins | Tsinghua | U.C. London | Cambridge |
| University Ranking by Academic Performance (2022-2023) | Harvard | U. Toronto | U.C. London | U. de Paris | Oxford | Stanford | Johns Hopkins | Shanghai Jiao Tong | Tsinghua | Zhejiang |
| NTU Rankings (2023) | Harvard | Stanford | U.C. London | Oxford | U. Toronto | Johns Hopkins | U. Washington | MIT | Cambridge | U. Michigan |
| Leiden Ranking (2023) | Harvard | Zhejiang | Shanghai Jiao Tong | Sichuan | U. Toronto | Huazhong | Central South | Tsinghua | Sun Yat-Sen | Xi’an Jiaotong |
| Reuter’s World’s Top 100 Innovative Universities (2019) | Stanford | MIT | Harvard | U. Pennsylvania | U. Washington | U. North Carolina | KU Leuven | U. Southern Cal. | Cornell | Imperial C. London |
| Academic Influence (2022) | Harvard | U.C. Berkeley | Columbia | U. Chicago | Stanford | Yale | Princeton | MIT | U. Michigan | U. Pennsylvania |
| Center for World University Rankings (2023) | Harvard | MIT | Stanford | Cambridge | Oxford | Princeton | U. Chicago | Columbia | U. Pennsylvania | Yale |
| SCImago Institutions Ranking (2023) | Harvard | U. Chinese Academy Sciences | Tsinghua | Harvard Medical School | Zhejiang | Stanford | Shanghai Jiao Tong | Peking University | Oxford | MIT |
| Webometrics (2023) | Harvard | Stanford | MIT | Oxford | U.C. Berkeley | Michigan | Cornell | U. Washington | Columbia | U. Pennsylvania |
| MosIUR (2023) | Harvard | MIT | Oxford | Cambridge | U.C. London | Stanford | Columbia | ETH Zurich | Imperial C. London | U. Chicago |
| QS World University Rankings (2024) | MIT | Cambridge | Oxford | Harvard | Stanford | Imperial C. London | ETH Zurich | N. University Singapore | U.C. London | U.C. Berkeley |
| Academic Ranking of World Universities (2023) | Harvard | Stanford | MIT | Cambridge | U.C. Berkeley | Princeton | Oxford | Columbia | Caltech | U. Chicago |
| US News & World Report (2022-2023) | Harvard | MIT | Stanford | U.C. Berkeley | Oxford | U. Washington | Columbia | Cambridge | Caltech | Johns Hopkins |
| Times Higher Education World University Rankings (2023) | Oxford | Harvard | Cambridge¹ | Stanford¹ | MIT | Caltech | Princeton | U.C. Berkeley | Yale | Imperial C. London |
| Round University Ranking (2023) | Harvard | Caltech | Stanford | MIT | Imperial C. London | U. Pennsylvania | Peking University | Yale | Oxford | U. Chicago |
| Times Higher Education World Reputation Rankings (2022) | Harvard | MIT | Stanford | Oxford | Cambridge | U.C. Berkeley | Princeton | Yale | Tsinghua | U. Tokyo |

Note: (1) Tied.

Perhaps the most striking feature of Table 2 is the identification of Harvard University as the Number 1 academic institution in the world by all of the ranking organizations with the exceptions of Stanford University in Reuter’s World’s Top 100 Innovative Universities (Harvard is Number 3), MIT in the QS World University Rankings (Harvard is Number 4), and Oxford University in the Times Higher Education World University Rankings (Harvard is Number 2). Despite the relative consistency in the high rankings of Harvard University, closer consideration of Table 2 illustrates moderate to large inconsistencies in the various Top 10 rankings as follows.
Institutions in the U.S., the U.K., and Canada occupy all Top 10 positions in the rankings of ScholarGPS: Lifetime, NTU Rankings, Academic Influence, the Center for World University Rankings, Webometrics, the Academic Ranking of World Universities, US News & World Report, and the Times Higher Education World University Rankings. Universities in the U.S., the U.K., and the European Union occupy all Top 10 positions in Reuter’s World’s Top 100 Innovative Universities and in the MosIUR rankings. The National University of Singapore joins institutions in the U.S., the U.K., and the E.U. in the QS World University Rankings.
One Chinese institution penetrates the Top 10 rankings in the ScholarGPS: last-five-year list (Tsinghua University), as well as in the Round University Ranking list (Peking University), and in the Times Higher Education World Reputation Rankings (Tsinghua). Three Chinese universities appear in the University Ranking by Academic Performance Top 10, while five Chinese universities are included in the SCImago Institutions Ranking Top 10. Chinese institutions dominate the Leiden Ranking Top 10 with eight table entries. Note that the Leiden Ranking uses only research articles and reviews published in journals included in the Web of Science, and therefore excludes books, book chapters, conference papers, and patents as input and as such emphasizes scholarly work in the physical sciences and mathematics, life sciences, and engineering and computer science (Leydesdorff, Wagner, & Zhang, 2021).
Most, but not all, of the Top 10 institutions in each individual ranking also appear among the Top 10 in at least one other ranking. Exceptions include the University Ranking by Academic Performance, which uniquely ranks the Université de Paris in the Top 10 at Number 4; the SCImago Institutions Rankings, which includes the University of the Chinese Academy of Sciences at Number 2; the QS World University Rankings, which ranks the National University of Singapore at Number 8; and the Times Higher Education World Reputation Rankings, which uniquely lists the University of Tokyo at Number 10. Reuter’s World’s Top 100 Innovative Universities uniquely includes the University of North Carolina (Number 6), KU Leuven (Number 7), and the University of Southern California (Number 8), likely because of this ranking organization’s unique focus on innovation, patents, and patent utilization.
Remarkably, the Leiden Ranking includes five Top 10 institutions, Sichuan University (Number 4), Huazhong University (Number 6), Central South University (Number 7), Sun Yat-sen University (Number 9), and Xi’an Jiaotong University (Number 10), that do not appear in the Top 10 of any of the other rankings. Other interesting features are evident, such as the treatment of Harvard University (Number 1) and the Harvard Medical School (Number 4) as separate institutions in the SCImago Institutions Top 10 rankings. The ScholarGPS: Lifetime and ScholarGPS: Last Five Years Top 10 rankings of Table 2 are of special interest in this study since (i) they are based on identical and purely quantitative inputs, (ii) they employ the same ranking methodology, and (iii) they nevertheless produce different Top 10 lists of world universities overall.
Motivated by the preceding comparisons of methodologies and the Top 10 ranking results generated by the various ranking organizations, the main objectives of this study are as follows:
• Review the key attributes of the ScholarGPS methodology used to rank approximately 30 million individual scholars affiliated with approximately 55,000 institutions (both academic and non-academic) in over 200 countries/regions, as reported in detail by Faghri and Bergman (2024). Rankings of individuals are provided Overall (across all Fields), in 14 unique Fields, in 177 unique Disciplines, and in approximately 350,000 unique Specialties.
• Document, for the first time, how the ScholarGPS rankings of individual scholars are extended to rank approximately 15,000 academic institutions worldwide in over 200 countries/regions, with no intentional or arbitrary pre-selection of the institutions or countries to be included in or excluded from the rankings. As for individual scholars, rankings of institutions are generated on an Overall (across all Fields) basis, in 14 individual Fields, in 177 Disciplines, and in approximately 350,000 unique Specialties.
• Exercise the ScholarGPS institution ranking methodology to present detailed ScholarGPS (lifetime) and ScholarGPS (last-five-year) rankings of the Top 20 world institutions Overall, in several key Fields, and in several important Disciplines.
• Compare ScholarGPS institution rankings on both (i) lifetime and (ii) last-five-year bases to reveal trends in the world university rankings, and to show how the temporal change in the rankings varies from modest to significant, depending on the Field or Discipline considered.

2 The ScholarGPS institution ranking methodology

ScholarGPS rankings of academic institutions are based on the number and quality of the individuals who are primarily responsible for the scholarly reputation of an institution. As will become evident in Section 2.2, the institutional ranking accounts for the number of active scholars (scholars who have published at least once in the last five years) who are remarkably productive (as measured by their number of publications) and have generated outstanding work of high impact (as measured by the number of citations of their work) and excellent quality (as measured by the h-index). The methodology used to rank individual scholars is first reviewed, followed by elaboration of how the rankings of individual scholars are translated to the rankings of academic institutions.

2.1 Individual scholar ranking

The terminology and methodology used to rank individual scholars, recently developed by ScholarGPS, are described in detail by Faghri and Bergman (2024) and are summarized as follows.
Rankings of individual scholars are relative to all scholars worldwide in each of four ranking categories: Overall (over all Fields), in a particular Field, in a particular Discipline, and in Specialties. The structural organization of the Overall, Field, Discipline, and Specialty categories is shown in Figure 1. The broad, Overall category includes all scholarly work associated with approximately 30 million scholars, and over 140 million archival publications as compiled by ScholarGPS.
Figure 1. Categorization of Fields, Disciplines, and Specialties by ScholarGPS.
Based on their content, each archival publication (book, book chapter, conference paper, journal article, or patent) is assigned to one of the 14 Fields of Figure 1, and one of the 177 Disciplines. Each ScholarGPS Discipline is a subset of one and only one Field (e.g., Urology is a subset of only Medicine, whereas Planetary Sciences is a subset of only the Physical Sciences and Mathematics). In addition, each archival publication is assigned to one of over 350,000 unique Specialties that typically span across multiple Disciplines and are therefore not tied to specific Disciplines or specific Fields. Based on the content of their own publications, each individual scholar is similarly assigned to one unique Field and one unique Discipline associated with that Field, and to multiple Specialties.
As presented in detail by Faghri and Bergman (2024), the ScholarGPS ranking methodology begins by constructing a unique profile for each scholar using the general process shown in the upper portion of Figure 2. Each profile includes the scholar’s publication, citation, and h-index information, as well as their affiliation history which is determined from the scholar’s publications. Each scholar’s ranking is based on either the totality of their work (lifetime as of December 30, 2021), or only work published in the last five years (January 17, 2017 through December 30, 2021).
Figure 2. ScholarGPS scholar and institutional ranking logic diagram.
The raw metadata associated with the publications are gleaned from multiple sources including Crossref, PubMed, Microsoft Academic Graph, and Unpaywall. Pre-processing algorithms are applied to this raw data to improve its quality through (i) elimination of publications from further consideration that are not archival or are duplicates, and (ii) management of publisher errors such as formatting mistakes, structural errors, and other inconsistencies. Once processed, the improved metadata are subsequently indexed for further analysis.
Only archival publications (publications that have a DOI, an ISBN/ISSN, or a patent number, and that have undergone peer review) are included in the ranking methodology. However, publications associated with an In Memoriam, Commentary, Celebration, or other minor matters are excluded from further consideration, as are open-access repository publications that are not peer-reviewed, such as those in arXiv. Moreover, publications having many (> 30) authors are not included in the ranking process because credit cannot be accurately allocated among the authors of such works.
Any individual who has authored at least one admissible archival publication is considered a scholar. The names of scholars are disambiguated as accurately as possible, but to allow further improvement in the quality of the data individuals can claim their scholar profiles and make corrections or merge multiple profiles that belong to them. Note that scholars who have authored an excessive number of retractions, have published the same material in multiple venues, have demonstrated excessive plagiarism, or have published fraudulent data are not included in the rankings.
As reported by Faghri and Bergman (2024), evaluation of over 5,000 scholar profiles from various Fields and Disciplines was used to estimate the precision (98.5%) and recall (96%) of the assignment of publications to authors. It was similarly found that Disciplines and Fields were correctly assigned to each publication with an accuracy of 95%.
To reiterate, scholar rankings are conducted in four categories: (i) Overall (all Fields), (ii) by Field, (iii) by Discipline, and (iv) by Specialty. Four metrics are calculated across each of the categories to rank individual scholars: (i) their productivity (archival publication count), (ii) the impact of their work (citation count), (iii) the quality (h-index) of their publications, and (iv) the ScholarGPS Rank, which is the geometric mean of the productivity, impact, and quality rankings. Self-citations are excluded from the ranking process as recommended by Viiu (2016), and author publication and citation counts are weighted by the number of authors of each publication (van Hooydonk, 1997). For example, if a publication has four authors, each is credited with 0.25 publications and one-quarter of the citations to that publication; the scholar’s fractional h-index (Koltun & Hafner, 2021) is also calculated based on these weighted citation counts.
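The author-weighted counting described above can be sketched as follows. This is a minimal illustration, not the ScholarGPS implementation; the input format of (author count, citation count) pairs per scholar is a hypothetical convenience for the example.

```python
def fractional_credit(publications):
    """Author-weighted productivity and impact, as described above.

    `publications` is a list of (n_authors, n_citations) tuples for one
    scholar (hypothetical input format, for illustration only).
    """
    pubs = sum(1.0 / n_authors for n_authors, _ in publications)
    cites = sum(n_cites / n_authors for n_authors, n_cites in publications)
    return pubs, cites

def fractional_h_index(publications):
    """h-index computed on author-weighted citation counts: the largest h
    such that at least h publications each have >= h weighted citations."""
    weighted = sorted((c / n for n, c in publications), reverse=True)
    h = 0
    for i, w in enumerate(weighted, start=1):
        if w >= i:
            h = i
        else:
            break
    return h

# A scholar with three papers: solo (40 citations), two-author (30 citations),
# and four-author (20 citations).
papers = [(1, 40), (2, 30), (4, 20)]
print(fractional_credit(papers))   # (1.75, 60.0)
print(fractional_h_index(papers))  # weighted citations 40.0, 15.0, 5.0 -> h = 3
```

Note that the four-author paper contributes only 0.25 publications and 5 weighted citations, so heavily co-authored work carries proportionally less credit in all three metrics.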
The quantitative ranking of all individual scholars involves the calculation of the competition rank, CR, for each scholar relative to their (i) productivity, as well as the (ii) impact and (iii) quality of their publications. The CR values are then used to calculate the top percentage rank (TPR) of each scholar within any of the four categories (Overall, Field, Discipline, and Specialty) on either a lifetime or last-five-year basis. Note that each scholar’s ranking in the Overall category is based on their ranking in their Discipline to accommodate wide variations in the publishing and citation traditions that exist from Discipline-to-Discipline. Each scholar’s TPR in any Field, Discipline, or Specialty is determined from the total number of scholars assigned to a category, N, the standard competition rank of each scholar in the category, CR, and the number of scholars, F, who might share the same competition rank, CR. Specifically, top percentage ranks by productivity (publication count) TPRp, by impact (citation count) TPRc, and by quality (h-index) TPRh are each determined using:
$TPR=100-\left[\frac{(N-CR+1)-(0.5\times F)}{N}\times 100\right]$
Once TPRp, TPRc, and TPRh are calculated for each scholar, the ScholarGPS Rank, S, for each scholar is determined by:
$S=\sqrt[3]{TPR_{p}\times TPR_{c}\times TPR_{h}}$
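The two equations above can be computed directly, as in the following minimal sketch (the numerical values are hypothetical, chosen only to show the scale of the quantities):

```python
def top_percentage_rank(N, CR, F):
    """TPR per the first equation: N scholars in the category, standard
    competition rank CR, and F scholars sharing that rank (F = 1 if no tie).
    Smaller TPR is better; the very best scholars approach TPR -> 0."""
    return 100 - ((N - CR + 1) - 0.5 * F) / N * 100

def scholargps_rank(tpr_p, tpr_c, tpr_h):
    """ScholarGPS Rank S per the second equation: the geometric mean of the
    productivity, impact, and quality top percentage ranks."""
    return (tpr_p * tpr_c * tpr_h) ** (1.0 / 3.0)

# Hypothetical scholar ranked 10th (no ties) among 1,000 in their Discipline:
tpr = top_percentage_rank(N=1000, CR=10, F=1)
print(round(tpr, 4))                             # 0.95
print(round(scholargps_rank(tpr, tpr, tpr), 4))  # 0.95 (geometric mean of equal values)
```

The geometric mean prevents one extreme metric from dominating: a scholar must rank well on productivity, impact, and quality simultaneously to obtain a small S.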
Further details of the scholar ranking methodology are provided in Faghri and Bergman (2024). Profiles of individual scholars and their rankings in the Overall, Field, Discipline, and Specialty categories are available at www.scholargps.com.

2.2 Ranking of academic institutions and programs

The preceding methodology used to rank individual scholars is extended to rank academic institutions as follows. The institutional rankings Overall, in a particular Field, in a particular Discipline, or in a particular Specialty are determined by the number of scholars of high stature (having small values of S) in each institution using the general process shown in the bottom portion of Figure 2. These admissible, top institutional scholars are individuals who are active (have published at least once in the last five years) and have a ScholarGPS Rank of S ≤ 0.5% in the Overall category, in their Field, in their Discipline, or in their Specialty. Note that ScholarGPS provides institutional rankings for (i) all institutions, (ii) academic institutions only, and (iii) non-academic institutions only. Academic institutions are the focus of this study.
An institutional rank score, $R_{s,d,f}$ (evaluated for a particular Specialty, Discipline, or Field, respectively), is calculated for each institution as
$R_{s,d,f}=\sum_{a\in A}\left(100-S_{a}\right)$
where A is the set of admissible scholars in the institution in the individual category of interest, and Sa are the ScholarGPS Ranks of the admissible, top institutional scholars in the category.
Through extensive testing and as noted above, it was found that an institution’s Overall rank, $R_o$, is best represented by its rankings relative to all Disciplines in the institution:
$R_{o}=\sum_{b\in B}\left(100-S_{b}\right)$
where B is the set of admissible scholars in the institution across all Disciplines. The parameter $S_b$ represents the ScholarGPS Ranks of the admissible, top institutional scholars in any Discipline. Although distinct scholar rankings are reported by ScholarGPS based on user-selected options to (i) include or exclude self-citations, and (ii) weight or not weight the numbers of publications and citations by the number of authors on each publication, for institutional ranking purposes self-citations are excluded, and both author publication and citation counts are weighted by the number of authors of each publication, as noted previously. Also note that the subscripts s, d, f, and o will be dropped from the values of R reported in Section 3 for purposes of brevity and clarity.
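Under the definitions above, each rank score reduces to summing (100 − S) over an institution’s admissible scholars. A minimal sketch with hypothetical scholar ranks follows; for illustration, the 0.5% admissibility threshold is applied inside the function rather than during profile selection:

```python
def institution_rank_score(scholar_ranks, threshold=0.5):
    """Rank score R for one institution in one category (Specialty,
    Discipline, Field, or, summing over all Disciplines, Overall):
    the sum of (100 - S) over admissible scholars, i.e. active scholars
    with a ScholarGPS Rank S <= 0.5% in the category. Larger R ranks higher."""
    return sum(100 - s for s in scholar_ranks if s <= threshold)

# Hypothetical institutions: A has three admissible scholars, B has two
# slightly stronger ones; the number of admissible scholars dominates R.
inst_a = [0.10, 0.30, 0.45, 2.00]   # the S = 2.00 scholar is not admissible
inst_b = [0.05, 0.20]
print(round(institution_rank_score(inst_a), 2))  # 299.15
print(round(institution_rank_score(inst_b), 2))  # 199.75
```

Because every admissible scholar contributes close to 100 points (S ≤ 0.5), the score is driven primarily by how many top scholars an institution has, with their individual ranks providing a secondary refinement, consistent with the ranking philosophy described above.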
As mentioned above, the preceding quantitative methodology for academic institutional ranking ultimately relates each academic institution’s Overall ranking to the number of active scholars in the institution who, as individuals, excel in their Disciplines. Use of disciplinary excellence is in recognition of (i) publication and citation traditions that can vary from Discipline to Discipline, even for Disciplines in the same Field (Rauhvargers, 2014), and (ii) the stability of the 177 Disciplines over time (in contrast to Specialties, which can vary from year to year as new areas of scholarly endeavor evolve). Also, with its focus on disciplinary excellence, the methodology does not, for example, (i) reward institutions whose mission is oriented toward engineering or medicine or (ii) punish institutions that emphasize the liberal arts or humanities. The institutional ranking methodology purposely avoids per capita (per scholar, or per faculty member) performance, circumventing abnormalities such as (i) low rankings assigned to large institutions that may have hundreds of top institutional scholars contemporaneous with (ii) high rankings assigned to small institutions that might have only a few top institutional scholars.
Accurate per capita calculations also rely on institutions to provide the ranking organization with quantitative information regarding total faculty or staff size, including the numbers of faculty or staff members who do not publish or are no longer publishing their work. Other features of the ScholarGPS institutional ranking approach are noted in Table 3.
Table 3. Distinguishing features of the ScholarGPS institutional ranking methodology.
SCHOLARGPS FEATURE ELABORATION
Academic and Non-Academic Institutions are Ranked Rankings are available for academic institutions only, for non-academic institutions only, and for academic and non-academic institutions combined.
Unique Ranking Philosophy Institutions are ranked based on the number of outstanding scholars who, by their productivity, impact, and quality, are primarily responsible for the global scholarly reputation of the institution.
Purely Quantitative Assessment Rankings are based solely on the factual publication records of the scholars within the institution compared to the publication records of scholars at all institutions.
No Self-Citations and Weighted Authorship Self-citations are excluded. Numbers of publications, citations, and the h-index are weighted by the number of authors.
Fine Granularity of Subject Matter Rankings Rankings of academic institutions are available in four categories: (i) Overall (including all Fields), (ii) in each of 14 distinct Fields, (iii) in each of 177 distinct Disciplines, and (iv) in each of over 350,000 Specialties. Specialties are updated regularly to accommodate emerging subject matter areas.
Productivity, Quality, and Impact of Active, Outstanding Scholars Institutional rankings are based on the productivity (publication count), impact (citation count), and quality (h-index) of active (published at least once in the last five years) scholars whose ScholarGPS Ranks place them in the top 0.5% of scholars in their Disciplines (S ≤ 0.5%).
Fair Comparison across Fields, Disciplines, and Specialties Institutional Field, Discipline, and Specialty rankings are based on the productivity, quality, and impact of scholars within the Field, Discipline, and Specialty. For example, scholars in Discipline A are compared only to scholars in Discipline A for ranking purposes, and not compared to scholars in Discipline B.
Lifetime and Last-Five-Year Rankings are Conducted The same ranking methodology is applied to both (i) lifetime publications of active scholars in the institution and (ii) only publications appearing in the last five years permitting both Lifetime and Last-Five-Year institutional rankings and investigation of historical ranking trends.
Open Access to Scholars Scholars are permitted to curate their publication lists and correct the Discipline with which they are associated, thus ensuring the veracity of their own ranking and, in turn, the ranking of their institution.
No Institutional Input or Curation is Required Institutions do not need to provide any information to ScholarGPS.
Because the ScholarGPS rankings of individual institutions (and scholars) are inclusive of those in over 200 countries/regions, no attempt is made to adjust the rankings based on country-, region-, or culture-specific considerations that have been suggested in the literature, such as (but not limited to): the operating budget of the institution, availability of research funding, size of endowment, selectivity of the student admissions process, student quality, international student enrollment levels, faculty teaching loads, alumni success, alumni salaries, levels of collaboration among individual researchers, the extent of national or international institutional collaboration, the extent of collaboration among researchers of different gender or age, market adaptation of patents generated, level of entrepreneurial activity, quality of the other scholars who cite a scholar’s work, numbers of web page views, or correction for language bias, as described in hundreds of sources including (Aksnes et al., 2017; Auranen & Nieminen, 2010; Beveridge & Bak, 2011; Bozeman & Corley, 2004; Bozeman et al., 2013; Bozeman et al., 2016; Cakur et al., 2015; Coccia, 2008; Coccia & Bozeman, 2016; Fairclough & Thelwell, 2015; Guba & Tsivinskaya, 2023; Jasco, 2009; Leydesdorff & Zhou, 2005; Leydesdorff & Wagner, 2009; Massucci & Docampo, 2019; Ramírez-Castañeda, 2020; Rodriguez-Navarro, 2016; Van Leeuwen et al., 2001; Van Raan, van Leeuwen, & Visser, 2011).

3 Results and discussion

ScholarGPS provides detailed profiles of approximately 30 million scholars, including their rankings in various Fields, Disciplines, and Specialties on a lifetime basis or over the last five years. The number of scholars included in the ranking of academic institutions in each of the 14 Fields is reported in Table 4 on both lifetime (N) and last-five-year (N5) bases. As evident, the number of included scholars varies significantly from Field to Field, with (i) Medicine as well as Engineering & Computer Science being the two largest Fields and (ii) Law being the smallest.
Table 4. Number of scholars included in the ScholarGPS rankings of academic institutions. Fields are listed in order of decreasing N.
Field Number of Scholars included in Academic Institution Rankings, Lifetime (N) Number of Scholars included in Academic Institution Rankings, Last Five Years (N5)
Overall (All Fields) 81,611 55,186
Medicine 17,124 11,113
Engineering & Computer Science 16,146 11,853
Physical Sciences & Mathematics 12,655 8,429
Life Sciences 10,369 6,884
Social Sciences 8,210 5,166
Agriculture & Natural Resources 3,497 2,666
Public Health 2,677 1,774
Pharmacy & Pharmaceutical Sciences 2,557 1,794
Allied Health 2,529 1,665
Arts & Humanities 1,844 1,082
Business & Management 1,670 1,155
Education 1,043 729
Dentistry 774 562
Law 516 314

Data Source: ScholarGPS.com

Whereas Medicine is the largest Field on a lifetime basis (N = 17,124, N5 = 11,113), Engineering & Computer Science surpasses Medicine with the most included scholars based on data for the last five years (N = 16,146, N5 = 11,853). Public Health (N = 2,677, N5 = 1,774) and Pharmacy & Pharmaceutical Sciences (N = 2,557, N5 = 1,794) have also traded places in the last five years in terms of the numbers of scholars included in the rankings, as have Arts & Humanities (N = 1,844, N5 = 1,082) and Business & Management (N = 1,670, N5 = 1,155). The five Fields with the most included scholars, from Medicine and Engineering & Computer Science through Social Sciences (N = 8,210, N5 = 5,166), comprise 78.7 percent of the total number of scholars included in the worldwide academic institution rankings over the last five years and are therefore the primary focus of this study.
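The 78.7 percent figure can be reproduced directly from the last-five-year counts in Table 4; a quick arithmetic check in Python, using the N5 values from the table:

```python
# Share of the last-five-year scholar pool (Table 4 N5 values)
# accounted for by the five largest Fields.
n5 = {
    "Medicine": 11113,
    "Engineering & Computer Science": 11853,
    "Physical Sciences & Mathematics": 8429,
    "Life Sciences": 6884,
    "Social Sciences": 5166,
}
total_n5 = 55186  # Overall (all Fields), last five years, from Table 4

share = sum(n5.values()) / total_n5
print(f"{share:.1%}")  # 78.7%
```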

3.1 Overall (all Fields) institutional rankings

The Top 20 institutions over all Fields on the lifetime as well as the last-five-year basis are identified in Figure 3. As was reported in Table 2, Harvard University [R= R5 = 1] and Stanford University [R= R5 = 2] occupy the top two positions. Most of the institutions included in the figure are ranked among the Top 20 in both (lifetime and last-five-year) lists. As evident, five U.S. institutions have dropped from the Top 20 lifetime rankings, with last-five-year rankings ranging from that of the University of California, San Diego [R= 15, R5 = 21] to the University of Wisconsin-Madison [R= 16, R5 = 48]. These five institutions have been replaced in the Top 20 last-five-year list by three institutions that are relatively highly ranked on a lifetime basis (University of British Columbia [R= 21, R5 = 15], Imperial College London [R= 31, R5 = 14], and University of Melbourne [R= 42, R5 = 16]), along with two Chinese institutions that have increased their last-five-year rankings dramatically relative to their lifetime standing among academic institutions (Tsinghua University [R= 128, R5 = 8] and Zhejiang University [R= 190, R5 = 19]). The relatively recent increase in the global rankings of Chinese universities has been attributed to, for example, the funding targeted to select universities in that country to foster the emergence of world-class universities (Allen, 2017). All of the universities identified in Figure 3 are members of national/regional organizations comprising the most prolific producers of research: (i) the Association of American Universities, (ii) the U15 Group of Canadian Universities, (iii) the U.K. Russell Group of universities, and (iv) the Chinese Double First Class University Plan.
Figure 3. Top 20 institutions over all Fields and their lifetime and last-five-year ScholarGPS rankings.
The number of academic institutions in each country/region included in the Overall rankings is listed in Table 5 for those institutions included in the Top 20, Top 100, Top 250, and Top 500 categories. As evident, the U.S. and the U.K. have the largest numbers of ranked institutions in any of the preceding four categories when lifetime data are considered. The U.S. also occupies the top ranking in any of the four categories when data for the last five years are considered, but China has surpassed the U.K. for the number of institutions in the Top 100, Top 250, and Top 500 categories. The U.K. is home to the second largest number of ranked institutions in the Top 20 category (N = 4), with China and Canada tied with two ranked institutions each.
Table 5. Number of academic institutions by country/region included in the lifetime and last-five-year ScholarGPS rankings for all Fields (Overall).
Overall Top 20 Lifetime United States (16), United Kingdom (3), Canada (1)
Last Five Years United States (11), United Kingdom (4), Canada (2), China (2), Australia (1)
Overall Top 100 Lifetime United States (58), United Kingdom (9), Australia (6), Canada (6), Netherlands (5), Japan (3), Belgium (2), Israel (2), Sweden (2), Switzerland (2), Denmark (1), Finland (1), France (1), Germany (1), Singapore (1)
Last Five Years United States (39), China (18), United Kingdom (8), Australia (6), Netherlands (6), Canada (5), Hong Kong, China (4), Switzerland (3), Belgium (2), Denmark (2), Italy (2), Singapore (2), Germany (1), Japan (1), South Korea (1)
Overall Top 250 Lifetime United States (112), United Kingdom (25), Germany (17), Canada (15), Netherlands (12), Australia (10), Japan (8), Switzerland (7), Hong Kong, China (5), Sweden (5), France (4), Israel (4), Italy (4), Belgium (3), China (3), Denmark (3), Austria (2), New Zealand (2), Norway (2), Singapore (2), Brazil (1), Finland (1), South Korea (1), Spain (1), Taiwan, China (1)
Last Five Years United States (68), China (40), United Kingdom (20), Australia (18), Germany (16), Netherlands (11), Canada (10), Italy (8), Switzerland (7), Hong Kong, China (5), Denmark (4), France (4), Japan (4), South Korea (4), Sweden (4), Belgium (3), Iran (3), Saudi Arabia (3), Austria (2), New Zealand (2), Portugal (2), Singapore (2), Spain (2), Brazil (1), Finland (1), Greece (1), India (1), Israel (1), Norway (1), Qatar (1), South Africa (1)
Overall Top 500 Lifetime United States (165), United Kingdom (46), Germany (44), Australia (25), Canada (23), China (23), Italy (19), Japan (18), France (16), Netherlands (14), Sweden (10), Spain (9), Belgium (7), Finland (7), Israel(7), Switzerland (7), South Korea (6), Austria (5), Denmark (5), Hong Kong, China (5), Greece (4), India (4), Ireland (4), New Zealand (4), South Africa (4), Taiwan, China (4), Norway (3), Portugal (3), Brazil (2), Singapore (2), Czech Republic (1), Hungary (1), Iran (1), Saudi Arabia (1), Slovenia (1)
Last Five Years United States (123), China (77), United Kingdom (37), Germany (33), Australia (25), Italy (25), Canada (20), Netherlands (13), India (11), South Korea (11), Iran (10), Japan (9), Spain (9), Sweden (9), France(8), Belgium (7), Switzerland (7), South Africa (6), Denmark (5), Finland (5), Hong Kong, China (5), Malaysia (5), Brazil (4), Israel (4), Portugal (4), Saudi Arabia (4), Ireland (3), Norway (3), Austria (2), Greece (2), New Zealand (2), Pakistan (2), Singapore (2), United Arab Emirates (2), Czech Republic (1), Macau (1), Qatar (1), Slovenia (1), Taiwan, China (1), Vietnam (1)

Data Source: ScholarGPS.com

3.2 Field rankings

Rankings associated with the Fields with the most included scholars (Medicine, Engineering & Computer Science, Physical Sciences & Mathematics, Life Sciences, and Social Sciences), are reported in Figures 4 - 8. Note that the Disciplines comprising each of these five Fields are identified in Table 6; the Disciplines of Chemistry (in Physical Sciences & Mathematics) and Electrical & Computer Engineering (in Engineering & Computer Science) will be considered in more detail in Section 3.3 as they are the two Disciplines with the most included scholars among the five Fields considered.
Table 6. The Disciplines comprising the Fields of Engineering & Computer Science, Life Sciences, Medicine, Physical Sciences & Mathematics, and Social Sciences by ScholarGPS.
FIELD DISCIPLINES of the FIELD
Engineering & Computer Science Aerospace and Aeronautical Engineering, Automotive Engineering, Biological and Biomolecular Engineering, Biomedical Engineering, Chemical Engineering, Civil and Environmental Engineering, Computer Science, Electrical and Computer Engineering, Industrial Engineering and Operations Research, Materials Science and Engineering, Mechanical Engineering, Mining Engineering, Naval Engineering, Nuclear Engineering, Petroleum Engineering
Life Sciences Anatomy, Biochemistry, Biology and Biological Sciences, Biomedical Sciences, Ecology and Evolutionary Biology, Environmental Sciences, Genetics, Marine Sciences, Microbiology, Molecular and Cell Biology, Neurosciences, Paleontology, Parasitology, Phycology, Physiology, Virology, Zoology
Medicine Anesthesiology, Cardiology, Dermatology, Emergency Medicine, Endocrinology, Family Medicine, Gastroenterology, Geriatrics, Hematology, Immunology, Internal Medicine, Nephrology, Neurology, Neurosurgery, Nuclear Medicine, Obstetrics and Gynecology, Oncology, Ophthalmology, Orthopaedic Surgery, Otolaryngology, Pathology, Pediatrics, Physical Medicine and Rehabilitation, Psychiatry, Pulmonology, Radiology, Rheumatology, Sports Medicine, Surgery, Urology
Physical Sciences & Mathematics Astronomy, Atmospheric Sciences, Chemistry, Earth and Planetary Sciences, Mathematics, Oceanography and Limnology, Physics, Statistics
Social Sciences Anthropology, Archaeology, Brain and Cognitive Sciences, Criminology and Criminal Justice, Economics, Geography, Human Development and Family Studies, Information Sciences, Journalism, Linguistics, Political Science, Psychology, Public Policy, Recreation and Leisure, Social Work, Sociology, Urban Studies

Data Source: ScholarGPS.com

As already noted, Medicine has the largest number of included scholars on a lifetime basis. Academic institutions ranked in the Top 20 in this Field on the lifetime and last-five-year bases are identified in Figure 4. Most of the institutions shown have changed positions within the two Top 20 lists, with the exceptions of (i) Johns Hopkins University [R= R5 = 2] and New York University [R= R5 = 20], which hold the same position in both lists, (ii) King’s College London [R= 26, R5 = 17], which has entered the Top 20 last-five-year list, and (iii) the University of California, Los Angeles [R= 5, R5 = 25], which has dropped out of the Top 20 last-five-year list. The institutions shown are primarily based in the U.S., along with representation from the U.K. (University College London [R= 13, R5 = 10] and King’s College London) and Canada (University of Toronto [R= 11, R5 = 7]). The predominance of highly ranked U.S. institutions in Medicine is consistent with the conclusions drawn by Faghri and Bergman (2024) regarding the sustained U.S. leadership in this Field.
Figure 4. Top 20 institutions in Medicine and their lifetime and last-five-year ScholarGPS rankings.
As was evident from Table 4, Engineering & Computer Science has the largest number of included scholars of any Field over the last five years, and the associated Top 20 rankings of this Field are reported in Figure 5. It is apparent that the rankings in Engineering & Computer Science have undergone profound change, especially considering the relative consistency of institution rankings associated with Medicine as seen in Figure 4. Only 6 institutions appear in both the Top 20 lifetime and Top 20 last-five-year lists (Massachusetts Institute of Technology [R= 1, R5 = 2], Stanford University [R= 2, R5 = 12], Georgia Institute of Technology [R= 3, R5 = 11], National University of Singapore [R= 9, R5 = 5], Imperial College London [R= 11, R5 = 17], and Nanyang Technological University [R= 14, R5 = 3]).
Figure 5. Top 20 institutions in Engineering & Computer Science and their lifetime and last-five-year ScholarGPS rankings.
Fourteen institutions including 12 from the U.S. and one each from Canada and Israel have dropped out of the Top 20 lifetime list and have last-five-year rankings ranging from that of the University of California, Berkeley [R= 4, R5 = 22] to the Technion Israel Institute of Technology [R= 19, R5 = 193] which has dropped 174 places between the two ranking lists.
Cornell University [R= 17, R5 = 124] has also lost over 100 places in these rankings. In contrast, the 14 institutions that have joined the Top 20 last-five-year list consist of 13 from China and Hong Kong, China, along with the University of New South Wales [R= 47, R5 = 18]. Seven institutions from China have risen more than 100 places in the last-five-year rankings relative to the lifetime rankings, including Xi’an Jiaotong University [R= 118, R5 = 7], Huazhong University of Science and Technology [R= 135, R5 = 8], Beihang University [R= 150, R5 = 20], South China University of Technology [R= 162, R5 = 19], Tianjin University [R= 169, R5 = 10], University of Electronic Science and Technology of China [R= 229, R5 = 15], and Central South University [R= 318, R5 = 13]. The dramatic shift of the Top 20 rankings evident in Figure 5 is consistent with the conclusions drawn by Faghri and Bergman (2024) regarding the recent increase and decrease of the scholarly influence in Engineering & Computer Science of China and the U.S., respectively. Factors influencing the profound shift in rankings in Fields such as Engineering & Computer Science have been discussed by Faghri and Bergman (2024).
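The place changes quoted here are simple differences between the lifetime rank R and the last-five-year rank R5; as a check, a few of the figures cited in this discussion (values taken from Figure 5) can be recomputed directly:

```python
# Rank changes (lifetime R minus last-five-year R5) for a few of the
# institutions discussed; (R, R5) pairs are taken from Figure 5.
ranks = {
    "Xi'an Jiaotong University": (118, 7),
    "Huazhong University of Science and Technology": (135, 8),
    "Central South University": (318, 13),
    "Technion Israel Institute of Technology": (19, 193),
}
for name, (r, r5) in ranks.items():
    change = r - r5  # positive = rise, negative = drop
    print(f"{name}: {change:+d} places")
```

The Technion entry reproduces the 174-place drop noted in the text, while the three Chinese institutions each show rises well in excess of 100 places.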
The topical linkage between Life Sciences and Medicine is apparent, as is the relationship between the Physical Sciences & Mathematics and Engineering & Computer Science. The Top 20 rankings of institutions in the Life Sciences on both lifetime and last-five-year bases are shown in Figure 6. As reported in the figure, Harvard University is the top-ranked institution for both time periods [R= 1, R5 = 1], and 11 other institutions are ranked among the Top 20 institutions in both the lifetime and last-five-year lists. Eight institutions, including six from the U.S. and two from Canada, have dropped out of the Top 20 lifetime list and have last-five-year rankings ranging from that of the University of California, Los Angeles [R= 5, R5 = 22] to the University of California, Davis [R= 15, R5 = 68]. Conversely, the eight institutions that have joined the Top 20 last-five-year list include three from Europe, three from Australia, and one each from China and Canada, having lifetime rankings ranging from that of the University of British Columbia [R= 22, R5 = 10] to Hunan University [R= 826, R5 = 8]. The 818-place increase in rankings (last five years versus lifetime) associated with Hunan University is followed by a more modest, yet still impressive, 74-place increase by Monash University [R= 89, R5 = 15]. The shifts in the Top 20 rankings evident in Figure 6 are again consistent with the conclusions drawn by Faghri and Bergman (2024) regarding changes in the scholarly performance of nations in this Field.
Figure 6. Top 20 institutions in Life Sciences and their lifetime and last-five-year ScholarGPS rankings.
Top 20 rankings of institutions in the Physical Sciences & Mathematics on both lifetime and last-five-year bases are reported in Figure 7. Ten institutions are ranked in the Top 20 in both lists, with seven having R > R5 and three having R < R5. Ten institutions, including eight from the U.S. and one each from the U.K. and Japan, have fallen from the Top 20 lifetime list and have last-five-year rankings ranging from that of Columbia University [R= 14, R5 = 22] to the University of Washington [R= 12, R5 = 74]. The 10 institutions that have joined the Top 20 last-five-year list include one from Europe and nine from China, having lifetime rankings ranging from that of the Technical University of Munich [R= 29, R5 = 19] to Huazhong University of Science and Technology [R= 429, R5 = 12]. The 417-place increase in rankings (last five years versus lifetime) associated with the latter is accompanied by increases in excess of 100 places for the Institute of Chemistry, Chinese Academy of Sciences [R= 155, R5 = 15], Nanjing University [R= 158, R5 = 6], Wuhan University [R= 202, R5 = 10], Nankai University [R= 211, R5 = 18], Sun Yat-sen University [R= 213, R5 = 20], and Jilin University [R= 230, R5 = 13]. The shift in rankings and the emergence of Chinese institutions in the Physical Sciences & Mathematics is perhaps as expected, considering (i) the trends in Engineering & Computer Science noted previously and (ii) the conclusions drawn by Faghri and Bergman (2024) regarding changes in the scholarly profiles of nations, including China and the U.S., in the Physical Sciences & Mathematics.
Figure 7. Top 20 institutions in Physical Sciences & Mathematics and their lifetime and last-five-year ScholarGPS rankings.
The last Field presented in detail is Social Sciences which, from consideration of its Disciplines listed in Table 6, can be considered topically distinct from the four Fields associated with Figures 4 - 7. Top 20 lifetime and Top 20 last-five-year rankings for Social Sciences are provided in Figure 8. Fourteen institutions, mainly from Europe and the U.S., are ranked in the Top 20 in both lists, with eight having R > R5 and six having R < R5. Six institutions, all from the U.S., have fallen from the Top 20 lifetime list and have last-five-year rankings ranging from that of Pennsylvania State University [R= 18, R5 = 22] to Cornell University [R= 20, R5 = 38]. The six institutions that have joined the Top 20 last-five-year list include three from Europe and one each from Australia, China, and the U.S. These institutions have lifetime rankings ranging from that of University College London [R= 21, R5 = 6] to Tsinghua University [R= 127, R5 = 10].
Figure 8. Top 20 institutions in Social Sciences and their lifetime and last-five-year ScholarGPS rankings.
Tsinghua University is the only non-European, non-U.S. institution included in the figure and is also the only institution in the figure that has changed rankings by more than 100 places based on last-five-year data. Overall, the change in rankings from the lifetime basis to the last five years is not as significant as that observed for the preceding Fields, with the exception of Medicine, whose rankings also changed little.
Comprehensive information regarding the number of academic institutions included in the rankings of the Fields considered here is provided in Table 7 for Medicine through Table 11 for Social Sciences. As evident in Table 7, the U.S. is firmly ensconced in the top position in terms of the number of ranked institutions included in the Top 20, Top 100, Top 250, and Top 500 categories on either the lifetime or last-five-year basis. Regardless of the ranking category, the gap between the U.S. and the second-ranked country/region is significant. Other countries with relatively large numbers of ranked institutions are Canada, Australia, and a variety of European countries. The dominance of the U.S. in terms of the number of ranked academic institutions in Medicine is consistent with the conclusions drawn by Faghri and Bergman (2024) regarding the number of highly ranked scholars in various countries/regions.
Table 7. Number of academic institutions by country/region included in the lifetime and last-five-year ScholarGPS rankings for Medicine.
Medicine Top 20 Lifetime United States (18), Canada (1), United Kingdom (1)
Last Five Years United States (17), United Kingdom (2), Canada (1)
Medicine Top 100 Lifetime United States (58), Netherlands (8), Canada (6), United Kingdom (5), Australia (3), Germany (3), Sweden (3), Switzerland (3), Denmark (2), France (2), Austria (1), Belgium (1), Finland (1), Hong Kong, China (1), Italy (1), Norway (1), Spain (1)
Last Five Years United States (43), Netherlands (8), Canada (6), Germany (6), Italy (6), United Kingdom (6), Australia (4), France (3), Belgium (2), China (2), Denmark (2), Hong Kong, China (2), India (2), South Korea (2), Switzerland (2), Austria (1), Greece (1), Spain (1), Sweden (1)
Medicine Top 250 Lifetime United States (101), Germany (29), United Kingdom (20), Italy (14), Canada (13), Japan (9), Netherlands (9), Australia (7), Belgium (6), France (6), Sweden (5), Switzerland (5), Finland (4), Austria (3), Israel (3), Denmark (2), Hong Kong, China (2), New Zealand (2), Norway (2), South Africa (2), Brazil (1), Greece (1), Singapore (1), South Korea (1), Spain (1), Taiwan, China (1)
Last Five Years United States (78), Germany (25), Italy (24), United Kingdom (17), China (15), Canada (13), Australia (8), Netherlands (8), France (7), Japan (7), Belgium (5), South Korea (5), Switzerland (5), Sweden (4), Austria (3), Spain (3), Denmark (2), Greece (2), Hong Kong, China (2), India (2), Iran (2), Ireland (2), Israel (2), New Zealand (2), Brazil (1), Finland (1), Norway (1), Portugal (1), Singapore (1), South Africa (1), Taiwan, China (1)
Medicine Top 500 Lifetime United States (143), Japan (62), United Kingdom (39), Germany (37), Italy (33), France (21), Australia (19), Canada (15), Austria (10), Netherlands (9), Switzerland (9), Spain (8), Sweden (8), Belgium (7), China (7), Finland (7), Denmark (5), Greece (5), Ireland (5), Israel (5), South Korea (5), Brazil (4), Norway (4), South Africa (4), Hong Kong, China (3), India (3), Egypt (2), Hungary (2), New Zealand (2), Poland (2), Singapore (2), Taiwan, China (2), Chile (1), Croatia (1), Czech Republic (1), Iceland (1), Lebanon (1), Pakistan (1), Paraguay (1), Portugal (1), Puerto Rico (1), Saudi Arabia (1), Turkey (1)
Last Five Years United States (115), Italy (39), United Kingdom (38), China (36), Germany (36), Japan (33), Australia (20), France (16), Canada (14), South Korea (14), Netherlands (10), Iran (9), Belgium (8), Spain (8), Switzerland (8), Greece (6), India (6), Poland (6), Sweden (6), Turkey (6), Austria (5), Denmark (5), Ireland (5), Brazil (4), Finland (4), Israel (4), Portugal (4), South Africa (4), Hong Kong, China (3), Norway (3), Singapore (3), Taiwan, China (3), Egypt (2), New Zealand (2), Chile (1), Croatia (1), Czech Republic (1), Hungary (1), Indonesia (1), Lebanon (1), Macau (1), Malta (1), Pakistan (1), Paraguay (1), Russia (1), Saudi Arabia (1), Serbia (1), Thailand (1), United Arab Emirates (1)

Data Source: ScholarGPS.com

The countries/regions with academic institutions ranked in the Top 20, Top 100, Top 250, and Top 500 categories for Engineering & Computer Science on both the lifetime and last-five-year bases are reported in Table 8. As for the Overall (Table 5) and Medicine (Table 7) rankings, the U.S. is home to most of the ranked institutions in any of the four preceding categories when lifetime data are considered. However, China has the largest number of ranked academic institutions in Engineering & Computer Science in any of the preceding categories for the last five years, with the U.S. standing second in each of the categories. This switch in leadership, from the U.S. to China, is in sharp contrast to the trends noted Overall and for Medicine, but is consistent with the findings of Faghri and Bergman (2024) regarding the recent increase in the numbers of highly ranked scholars in countries such as China, India, and Iran.
Table 8. Number of academic institutions by country/region included in the lifetime and last-five-year ScholarGPS rankings for Engineering & Computer Science.
Engineering & Computer Science Top 20 Lifetime United States (15), Singapore (2), Canada (1), Israel (1), United Kingdom (1)
Last Five Years China (11), United States (3), Hong Kong, China (2), Singapore (2), Australia (1), United Kingdom (1)
Engineering & Computer Science Top 100 Lifetime United States (47), United Kingdom (7), Australia (6), Canada (6), Japan (5), China (4), Hong Kong, China (4), Netherlands (3), Germany (2), Israel (2), Singapore (2), South Korea (2), Switzerland (2), Taiwan, China (2), Belgium (1), Denmark (1), Greece (1), Norway (1), Saudi Arabia (1), Sweden (1)
Last Five Years China (37), United States (22), Australia (11), Canada (5), Hong Kong, China (5), United Kingdom (3), Germany (2), Iran (2), Netherlands (2), Singapore (2), South Korea (2), Switzerland (2), Denmark (1), India (1), Qatar (1), Saudi Arabia (1), United Arab Emirates (1)
Engineering & Computer Science Top 250 Lifetime United States (93), China (21), United Kingdom (20), Canada (17), Australia (15), Italy (9), Germany (8), Japan (7), South Korea (6), Hong Kong, China (5), India (5), Israel (5), Taiwan, China (4), Belgium (3), Finland (3), Greece (3), Iran (3), Netherlands (3), Sweden (3), Denmark (2), Portugal (2), Saudi Arabia (2), Singapore (2), Switzerland (2), Austria (1), France (1), Malaysia (1), New Zealand (1), Norway (1), Spain (1), United Arab Emirates (1)
Last Five Years China (66), United States (55), United Kingdom (15), Australia (14), Canada (12), South Korea (12), India (11), Germany (8), Iran (7), Italy (6), Hong Kong, China (5), Malaysia (5), Saudi Arabia (4), Finland (3), Belgium (2), Denmark (2), Israel (2), Japan (2), Netherlands (2), Portugal (2), Singapore (2), Switzerland (2), United Arab Emirates (2), Greece (1), Macau (1), New Zealand (1), Norway (1), Qatar (1), South Africa (1), Spain (1), Sweden (1), Vietnam (1)
Engineering & Computer Science Top 500 Lifetime United States (138), China (51), United Kingdom (33), Canada (26), Japan (25), Germany (22), Italy (22), Australia (19), France (16), South Korea (15), Spain (11), Taiwan, China (10), India (9), Netherlands (9), Iran (7), Belgium (6), Israel (6), Sweden (6), Greece (5), Hong Kong, China (5), Austria (4), Denmark (4), Finland (4), Malaysia (4), New Zealand (4), Portugal (4), Saudi Arabia (4), Singapore (4), Ireland (3), Switzerland (3), Turkey (3), United Arab Emirates (3), Brazil (2), Norway (2), Poland (2), Thailand (2), Cyprus (1), Czech Republic (1), Luxembourg (1), Macau (1), Qatar (1), Slovenia (1), South Africa (1)
Last Five Years China (107), United States (100), United Kingdom (31), Canada (28), South Korea (26), India (25), Australia (20), Iran (20), Italy (16), Germany (12), Spain (9), Taiwan, China (8), Hong Kong, China (6), Japan (6), Malaysia (6), Netherlands (5), Saudi Arabia (5), Sweden (5), Belgium (4), Finland (4), France (4), Singapore (4), Turkey (4), Austria (3), Denmark (3), Greece (3), Portugal (3), Switzerland (3), United Arab Emirates (3), Brazil (2), Israel (2), Macau (2), New Zealand (2), Pakistan (2), Poland (2), Qatar (2), South Africa (2), Vietnam (2), Cyprus (1), Czech Republic (1), Egypt (1), Ireland (1), Jordan (1), Luxembourg (1), Mexico (1), Norway (1), Tunisia (1)

Data Source: ScholarGPS.com

Top 20, Top 100, Top 250, and Top 500 rankings by country/region for the Life Sciences are shown in Table 9. As in Medicine, the U.S. is home to most of the ranked academic institutions on both the lifetime and last-five-year bases. The U.K. also ranks highly, especially in the Top 20, Top 100, and Top 250 categories. China has performed well in the last five years, especially in the Top 250 last-five-year and Top 500 categories. The rankings for the Physical Sciences & Mathematics, reported in Table 10, share similarities with those for Engineering & Computer Science: the U.S. holds the top position in all four categories on the lifetime basis, while China holds the top position in all four categories for the last five years. In general, institutions in Europe, Australia, and Canada are also well-represented in the various categories.
Table 9. Number of academic institutions by country/region included in the lifetime and last-five-year ScholarGPS rankings for Life Sciences.
Life Sciences Top 20 Lifetime United States (12), United Kingdom (4), Canada (2), Denmark (1), Sweden (1)
Last Five Years United States (6), United Kingdom (4), Australia (3), Belgium (2), Canada (1), China (1), Denmark (1), Sweden (1), Switzerland (1)
Life Sciences Top 100 Lifetime United States (53), United Kingdom (14), Australia (6), Canada (5), Netherlands (5), Sweden (5), Switzerland (4), Belgium (2), Israel (2), Japan (2), Denmark (1), Finland (1)
Last Five Years United States (33), United Kingdom (10), Australia (9), China (8), Netherlands (6), Canada (5), Switzerland (5), Sweden (4), Germany (3), Italy (3), Belgium (2), Denmark (2), Saudi Arabia (2), Brazil (1), Finland (1), Hong Kong, China (1), Ireland (1), Japan (1), New Zealand (1), Portugal (1), Singapore (1)
Life Sciences Top 250 Lifetime United States (103), United Kingdom (30), Germany (21), Canada (16), Australia (13), Netherlands (10), Sweden (7), Switzerland (6), Belgium (5), France (5), Japan (5), Italy (4), China (3), Denmark (3), Finland (3), Ireland (3), Israel (3), Norway (3), New Zealand (2), Singapore (2), Austria (1), Brazil (1), Spain (1)
Last Five Years United States (63), China (30), United Kingdom (21), Germany (17), Australia (16), Italy (13), Canada (12), Netherlands (9), Switzerland (7), Sweden (6), Iran (5), Belgium (4), France (4), South Africa (4), Spain (4), Denmark (3), Hong Kong, China (3), Ireland (3), Israel (3), Saudi Arabia (3), Austria (2), India (2), Japan (2), New Zealand (2), Norway (2), Portugal (2), Singapore (2), South Korea (2), Brazil (1), Czech Republic (1), Finland (1), Malaysia (1)
Life Sciences Top 500 Lifetime United States (171), United Kingdom (51), Germany (45), Canada (28), Japan (25), Australia (24), Italy (17), France (15), China (14), Netherlands (11), Switzerland (9), Austria (8), Belgium (8), Spain (8), Sweden (8), Israel (7), New Zealand (7), Finland (5), Ireland (5), Norway (5), South Africa (5), Denmark (4), Hong Kong, China (4), Brazil (2), Portugal (2), Singapore (2), South Korea (2), Chile (1), Costa Rica (1), Czech Republic (1), Hungary (1), India (1), Mexico (1), Russia (1), Saudi Arabia (1)
Last Five Years United States (112), China (59), Germany (37), United Kingdom (32), Italy (24), Australia (23), Canada (20), India (20), Iran (14), France (10), Netherlands (10), South Korea (10), Switzerland (10), Japan (9), Spain (9), Austria (8), Sweden (7), Brazil (6), Portugal (6), South Africa (6), Belgium (5), Hong Kong, China (5), Ireland (5), Denmark (4), Finland (4), Israel (4), Norway (4), Pakistan (4), Egypt (3), Malaysia (3), New Zealand (3), Poland (3), Saudi Arabia (3), Thailand (3), Qatar (2), Singapore (2), Costa Rica (1), Czech Republic (1), Greece (1), Hungary (1), Mexico (1), Nigeria (1), Oman (1), Russia (1), Slovenia (1), Tunisia (1), Turkey (1)

Data Source: ScholarGPS.com

Table 10. Number of academic institutions by country/region included in the ScholarGPS lifetime and last-five-year rankings for Physical Sciences & Mathematics.
Physical Sciences & Math. Top 20 Lifetime United States (14), United Kingdom (3), Japan (2), Switzerland (1)
Last Five Years China (9), United States (6), United Kingdom (2), Germany (1), Japan (1), Switzerland (1)
Physical Sciences & Math. Top 100 Lifetime United States (49), United Kingdom (11), Germany (6), Japan (6), France (5), Canada (4), Australia (3), Israel (3), Switzerland (3), China (2), Netherlands (2), Sweden (2), Austria (1), Belgium (1), Finland (1), India (1)
Last Five Years China (38), United States (25), United Kingdom (8), Germany (7), Australia (3), Canada (2), Japan (2), Saudi Arabia (2), Singapore (2), Switzerland (2), Austria (1), Belgium (1), Denmark (1), France (1), India (1), Israel (1), Netherlands (1), Portugal (1), South Africa (1)
Physical Sciences & Math. Top 250 Lifetime United States (82), Germany (33), United Kingdom (20), China (15), Canada (13), France (11), Australia (10), Netherlands (10), Japan (8), Italy (7), Switzerland (6), Israel (5), Sweden (5), Belgium (4), Austria (3), Denmark (3), Hong Kong, China (3), Singapore (2), Spain (2), Finland (1), India (1), New Zealand (1), Norway (1), Portugal (1), Russia (1), Saudi Arabia (1), Taiwan, China (1)
Last Five Years China (71), United States (50), Germany (21), United Kingdom (15), Australia (10), Japan (8), France (7), Netherlands (6), Canada (5), Hong Kong, China (5), Saudi Arabia (5), Sweden (5), Switzerland (5), Denmark (3), Russia (3), Spain (3), Austria (2), Belgium (2), India (2), Iran (2), Israel (2), Italy (2), Malaysia (2), Pakistan (2), Portugal (2), Singapore (2), South Africa (2), South Korea (2), Finland (1), Greece (1), Poland (1), Taiwan, China (1)
Physical Sciences & Math. Top 500 Lifetime United States (137), Germany (50), China (41), United Kingdom (40), Canada (25), France (25), Japan (22), Australia (16), Italy (16), Spain (13), Netherlands (10), India (8), Sweden (8), Austria (7), Belgium (7), Switzerland (7), Greece (6), Hong Kong, China (6), Israel (6), Russia (6), Finland (5), South Africa (5), Denmark (4), New Zealand (3), Norway (3), Saudi Arabia (3), South Korea (3), Taiwan, China (3), Hungary (2), Iran (2), Ireland (2), Portugal (2), Singapore (2), Brazil (1), Czech Republic (1), Poland (1), Slovenia (1), Turkey (1)
Last Five Years China (113), United States (81), Germany (36), United Kingdom (29), India (24), Australia (18), Iran (16), Italy (14), France (13), Canada (12), Japan (12), Spain (12), Netherlands (9), Switzerland (8), Russia (7), Belgium (6), Hong Kong, China (6), Israel (6), Pakistan (6), Saudi Arabia (6), South Africa (6), South Korea (6), Sweden (6), Malaysia (5), Austria (4), Denmark (4), Egypt (4), Finland (3), Greece (3), Norway (3), Poland (3), Brazil (2), Czech Republic (2), Portugal (2), Singapore (2), Estonia (1), Ireland (1), Jordan (1), Macau (1), New Zealand (1), Qatar (1), Slovenia (1), Taiwan, China (1), Turkey (1), United Arab Emirates (1), Vietnam (1)

Data Source: ScholarGPS.com

The Social Sciences (Table 11) are led by the U.S. in both the lifetime and last-five-year bases over all Top 20, Top 100, Top 250, and Top 500 categories. The U.K., several European countries, and institutions in Australia and Canada are also well-represented. China has ascended significantly when the last five years are considered, especially in the Top 100, Top 250, and Top 500 categories.
Table 11. Number of academic institutions by country/region included in the ScholarGPS lifetime and last-five-year rankings for Social Sciences.
Social Sciences Top 20 Lifetime United States (17), United Kingdom (3)
Last Five Years United States (12), United Kingdom (4), Netherlands (2), Australia (1), China (1)
Social Sciences Top 100 Lifetime United States (55), United Kingdom (20), Netherlands (7), Australia (6), Canada (4), Belgium (2), Israel (2), Hungary (1), Italy (1), Japan (1), Switzerland (1)
Last Five Years United States (37), United Kingdom (16), Australia (10), China (7), Netherlands (7), Canada (4), Germany (4), Belgium (2), Denmark (2), Hong Kong, China (2), Switzerland (2), Austria (1), Hungary (1), Italy (1), New Zealand (1), Norway (1), Singapore (1), Sweden (1)
Social Sciences Top 250 Lifetime United States (106), United Kingdom (44), Australia (14), Canada (14), Germany (13), Netherlands (10), Switzerland (6), Israel (5), Sweden (5), Belgium (4), Japan (4), China (3), Denmark (3), Italy (3), New Zealand (3), Austria (2), France (2), Hong Kong, China (2), Norway (2), Finland (1), Hungary (1), Singapore (1), South Africa (1), Spain (1)
Last Five Years United States (70), China (32), United Kingdom (30), Australia (22), Germany (20), Canada (13), Netherlands (10), Italy (7), Belgium (5), Sweden (5), Switzerland (5), Portugal (4), Denmark (3), Hong Kong, China (3), Israel (3), Norway (3), South Africa (3), Spain (3), Japan (2), New Zealand (2), Austria (1), Finland (1), France (1), Hungary (1), Singapore (1)
Social Sciences Top 500 Lifetime United States (181), United Kingdom (65), Germany (41), Canada (27), Australia (25), China (16), Italy (16), France (15), Japan (12), Netherlands (12), Sweden (9), Belgium (8), Denmark (7), Norway (7), Switzerland (7), Austria (6), Israel (6), New Zealand (5), Spain (5), Finland (4), South Africa (4), Hong Kong, China (3), Greece (2), Hungary (2), Ireland (2), Portugal (2), Singapore (2), Chile (1), Czech Republic (1), Estonia (1), Iceland (1), Luxembourg (1), Pakistan (1), Poland (1), Russia (1), Slovenia (1)
Last Five Years United States (126), China (59), United Kingdom (50), Germany (43), Australia (29), Canada (21), Italy (19), Spain (16), Netherlands (12), South Africa (9), France (8), Norway (8), Sweden (8), Switzerland (7), Austria (6), Belgium (6), Denmark (6), New Zealand (6), Portugal (6), Hong Kong, China (5), Ireland (5), Israel (5), Japan (5), Finland (4), Malaysia (4), Czech Republic (3), Greece (3), Turkey (3), Russia (2), Singapore (2), Croatia (1), Cyprus (1), Hungary (1), India (1), Lithuania (1), Nigeria (1), Oman (1), Poland (1), Qatar (1), Romania (1), Slovakia (1), Slovenia (1), South Korea (1), Tunisia (1)

Data Source: ScholarGPS.com
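The country/region tallies presented in the tables above can be reproduced from any ranked institution list by a simple count within each Top N cutoff. A minimal sketch follows; the institution entries are hypothetical placeholders, not actual ScholarGPS data:

```python
from collections import Counter

# Hypothetical excerpt of a ranked institution list: (rank, institution, country/region).
# A real analysis would draw these tuples from the ScholarGPS rankings.
ranked = [
    (1, "Institution A", "United States"),
    (2, "Institution B", "United Kingdom"),
    (3, "Institution C", "United States"),
    (4, "Institution D", "China"),
    (5, "Institution E", "United States"),
]

def country_counts(ranked, top_n):
    """Tally the number of ranked institutions per country/region within the Top N."""
    return Counter(country for rank, _inst, country in ranked if rank <= top_n)

print(country_counts(ranked, 3))
# → Counter({'United States': 2, 'United Kingdom': 1})
```

Applying `country_counts` with cutoffs of 20, 100, 250, and 500 to a Field- or Discipline-specific ranked list yields rows of the form shown in Tables 9 through 11.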

3.3 Discipline rankings

Standings associated with the two Disciplines with the most ranked institutions, Chemistry (631 and 506 academic institutions ranked on the lifetime and last-five-year bases, respectively) and Electrical & Computer Engineering (473 ranked academic institutions on both the lifetime and last-five-year bases), are reported in Figures 9 and 10, respectively. Note that the Disciplines with the largest numbers of ranked academic institutions in the other three Fields analyzed in the preceding section are (i) Biology in the Life Sciences (380 and 346 ranked institutions on the lifetime and last-five-year bases, respectively), (ii) Economics in the Social Sciences (359 and 340 institutions ranked), and (iii) Surgery in Medicine (234 and 194 institutions ranked).
Figure 9. Top 20 institutions in Chemistry and their lifetime and last-five-year ScholarGPS rankings.
Figure 10. Top 20 institutions in Electrical & Computer Engineering and their lifetime and last-five-year ScholarGPS rankings.
As evident in Figure 9, six academic institutions are ranked in the Top 20 on both the lifetime and last-five-year bases for the Discipline of Chemistry. Kyoto University [R = 1, R5 = 20] has been replaced by Tsinghua University [R = 13, R5 = 1] as the top-ranked institution. Fourteen new institutions, all located in China, have been added to the Top 20 last-five-year list, with significant advances made by Jiangsu University [R = 625, R5 = 18], Sun Yat-sen University [R = 496, R5 = 8], and Soochow University [R = 413, R5 = 11]. Hence, 15 of the Top 20 ranked institutions, based on the data for the last five years, are in China. The 14 institutions that have dropped from the Top 20 include nine from the U.S., three from Japan, and two from Israel. These institutions have last-five-year rankings ranging from the relatively modest reductions of Northwestern University [R = 12, R5 = 21], the University of California, Berkeley [R = 6, R5 = 27], Harvard University [R = 17, R5 = 35], and the Massachusetts Institute of Technology [R = 19, R5 = 36], to the larger reductions of the Hebrew University of Jerusalem [R = 10, R5 = 111] and the Tokyo Institute of Technology [R = 16, R5 = 131]. These findings are consistent with the conclusion that China has experienced the largest increase in the number of highly ranked scholars (last-five-year data versus lifetime data), with the U.S. having the largest reduction (Faghri & Bergman, 2024).
For the Discipline of Electrical & Computer Engineering, only five academic institutions are in the Top 20 on both the lifetime and last-five-year lists, as evident in Figure 10. The Georgia Institute of Technology [R = 1, R5 = 16] has been replaced by Tsinghua University [R = 25, R5 = 1] as the top-ranked institution. Fifteen new institutions, 12 of which are in China along with one each in Australia, Denmark, and Saudi Arabia, have joined the Top 20 based on information for the last five years. The most significant advances have been made by Tianjin University [R = 385, R5 = 15], South China University of Technology [R = 200, R5 = 11], and Harbin Institute of Technology [R = 197, R5 = 13]. As a result of this shift in rankings, 12 of the Top 20 last-five-year ranked institutions are in China. The 15 institutions that have dropped from the Top 20 include 10 from the U.S., two from Canada, and one each from the U.K., Belgium, and Israel.
Decreases in standing range from the relatively modest reductions associated with Imperial College London [R = 12, R5 = 23], the Massachusetts Institute of Technology [R = 3, R5 = 25], the University of California, Berkeley [R = 4, R5 = 26], and the University of Southern California [R = 10, R5 = 28], to the large reductions associated with the University of Illinois at Urbana-Champaign [R = 5, R5 = 91] and Tel Aviv University [R = 16, R5 = 246].
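The Top 20 turnover described above (institutions retained, added, and dropped between the lifetime and last-five-year lists) amounts to simple set operations on the two lists. A minimal sketch with hypothetical placeholder institutions (not the actual ScholarGPS rankings):

```python
# Hypothetical lifetime and last-five-year Top 20 membership (placeholder names).
lifetime_top20 = {"Inst A", "Inst B", "Inst C", "Inst D", "Inst E"}
last5_top20 = {"Inst C", "Inst D", "Inst E", "Inst F", "Inst G"}

retained = lifetime_top20 & last5_top20    # appear on both lists
newcomers = last5_top20 - lifetime_top20   # joined the Top 20 in the last five years
dropped = lifetime_top20 - last5_top20     # fell out of the Top 20

print(sorted(retained), sorted(newcomers), sorted(dropped))
```

For Chemistry, for instance, `retained` would contain the six institutions on both lists, `newcomers` the 14 additions, and `dropped` the 14 departures noted above.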

4 Summary and conclusions

To the best of our knowledge, existing rankings of academic institutions have (i) often been restricted to pre-selected institutions, clouding the potential discovery of scholarly activity at emerging institutions; (ii) considered only broad areas of research, limiting the ability of responsible individuals to act on the assessments in a concrete manner; or, in contrast, (iii) considered only a narrow area of research, diminishing the broader applicability of the assessment. In general, the disparities in rankings depend on which institutions are included in the ranking process, which areas of research are considered, the breadth (or granularity) of the research areas of interest, and the methodologies used to define and quantify research performance.
To provide rankings of academic institutions including all institutions worldwide in topical areas ranging from Overall (inclusive of all Fields) to each of approximately 350,000 Specialties, the methodology used to rank individual scholars previously reported (Faghri & Bergman, 2024) has been extended here to determine both the lifetime and last-five-year rankings of academic institutions. Other unique features of the ranking methodology have been discussed, and sample Top 20 rankings have been presented and analyzed over all Fields, in five specific Fields (Medicine, Engineering & Computer Science, Life Sciences, Physical Sciences & Mathematics, Social Sciences), and in two Disciplines (Chemistry, and Electrical & Computer Engineering).
A comparison of the Top 20 institutions over all Fields (lifetime versus last-five-year basis) revealed modest changes, with 15 institutions ranked on both bases. Perhaps surprisingly, changes in the Top 20 institutions of the five Fields considered here range from almost non-existent (Medicine, with 19 institutions appearing in both the lifetime and last-five-year Top 20 rankings) to extreme (Engineering & Computer Science, with just six institutions appearing in both). Changes in the Top 20 rankings for the two Disciplines analyzed were also significant, with only six (five) institutions appearing on both the lifetime and last-five-year lists for Chemistry (Electrical & Computer Engineering).
When viewed cumulatively over just the Fields and Disciplines considered in this study, Chinese institutions have in general exhibited increases in the Top 20 rankings (last-five-year versus lifetime), while U.S., U.K., and E.U. institutions have shown decreasing or relatively stagnant Top 20 rankings. This is especially the case for (i) the Field of Engineering & Computer Science as evident in Figure 5, (ii) the Field of Physical Sciences & Mathematics as shown in Figure 7, (iii) the Discipline of Chemistry as revealed in Figure 9, and (iv) the Discipline of Electrical & Computer Engineering as reported in Figure 10. Institutions dropping from the Top 20 rankings in the preceding Fields and Disciplines are dominated by those located in the U.S. Other institutions that have dropped out of the Top 20 in the last-five-year rankings are in Israel (in Engineering & Computer Science, Chemistry, and Electrical & Computer Engineering), Japan (in Physical Sciences & Mathematics and Chemistry), Canada (in Engineering & Computer Science and Electrical & Computer Engineering), the U.K. (in Engineering & Computer Science and Electrical & Computer Engineering), and the E.U. (in Electrical & Computer Engineering).
The preceding general trends associated with the last-five-year rise of Chinese institutions in the Top 20 rankings are, however, not observed for all Fields and Disciplines. For example, no Chinese institutions are ranked in the Top 20 in Medicine on either the lifetime or last-five-year basis, and only one Chinese institution is new to the Top 20 list in Life Sciences, along with seven other newly ranked institutions in Australia (three institutions), the E.U. (three institutions), and Canada (one institution); two Canadian institutions dropped out of the Top 20 in the last five years. Similarly, of the six newly ranked institutions in the Social Sciences, two are from the E.U. and one each from the U.K., the U.S., China, and Australia; however, all of the institutions dropping from the Top 20 are located in the U.S.
Similar ranking trends are revealed by consideration of (i) the lists of Top 100, 250, and 500 institutions presented in Tables 5-11, and (ii) the calculated scholarly influence of countries/regions reported in Faghri and Bergman (2024). As alluded to previously (Faghri & Bergman, 2024), many factors might contribute to the geographical redistribution of top-ranked academic institutions, including but not limited to: (i) national initiatives to attract preeminent scholars from abroad in key areas of strategic national importance, (ii) establishment of special initiatives to improve the research stature of a country's universities, and (iii) easier access to publications and other scholarly information than in the past. The methods introduced in this study might assist future identification and assessment of the root causes leading to changes in the rankings of academic institutions worldwide, and help reveal how the changes in the rankings differ among the specific Fields, Disciplines, and Specialties used by ScholarGPS.

Author contributions

Amir Faghri (amir.faghri@gmail.com): Conceptualization, Data Curation, Formal Analysis, Investigation, Methodology, Software, Validation, Writing - original draft, Writing - review and editing. Theodore L. Bergman (tlbergman@ku.edu): Conceptualization, Data Curation, Formal Analysis, Investigation, Methodology, Software, Validation, Writing - original draft, Writing - review and editing.

Competing interests

Amir Faghri is the Founder and Chief Executive Officer of ScholarGPS™ and Theodore L. Bergman is Senior Consultant to ScholarGPS™.

Data availability

Data analyzed for this manuscript include: (i) numbers of Highly Ranked Scholars™ in the Fields, Disciplines, and Specialties and (ii) their country/region affiliations. These data, as well as data for all Fields, Disciplines, and Specialties, are available at www.scholargps.com.
[1]
Aksnes D.W., Sivertsen G., van Leeuwen T.N., & Wendt K.K. (2017). Measuring the productivity of national R&D systems: Challenges in cross-national comparisons of R&D input and publication output indicators. Science and Public Policy, 44(2), 246-258. https://doi.org/10.1093/scipol/scw058

[2]
Allen R.M. (2017). A comparison of China’s “Ivy League” to other peer groupings through global university rankings. Journal of Studies in International Education, 21(11), 395-411. https://doi.org/10.1177/102831531769753

[3]
Auranen O., & Nieminen M. (2010). University research funding and publication performance - An international comparison. Research Policy, 39(6), 822-834. https://doi.org/10.1016/j.respol.2010.03.003

[4]
Bastedo M.N., & Bowman N.A. (2010). U.S. News & World Report college rankings: Modeling institutional effects on organizational reputation. American Journal of Education, 116(2), 163-183. https://doi.org/10.1086/649437

[5]
Beveridge M.E.L., & Bak T.H. (2011). The languages of aphasia research: Bias and diversity. Aphasiology, 25(12), 1451-1468. https://doi.org/10.1080/02687038.2011.624165

[6]
Bowman N.A., & Bastedo M.N. (2011). Anchoring effects in world university rankings: exploring biases in reputation scores. Higher Education, 61(4), 431-444. https://doi.org/10.1007/s10734-010-9339-1

[7]
Bozeman B., & Corley E. (2004). Scientists' collaboration strategies: implications for scientific and technical human capital. Research Policy, 33(4), 599-616. https://doi.org/10.1016/j.respol.2004.01.008

[8]
Bozeman B., Fay D., & Slade C.P. (2013). Research collaboration in universities and academic entrepreneurship: the state-of-the-art. Journal of Technology Transfer, 38(1), 1-67. https://doi.org/10.1007/s10961-012-9281-8

[9]
Bozeman B., Gaughan M., Youtie J., Slade C.P., & Rimes H. (2016). Research experiences, good and bad: Dispatches from the front lines. Science and Public Policy, 43(2), 226-244. https://doi.org/10.1093/scipol/scv035

[10]
Buela-Casal G., Gutierrez-Martinez O., Bermudez-Sanchez M.P., & Vadillo-Munoz O. (2007). Comparative study of international academic rankings of universities. Scientometrics, 71(3), 349-365. https://doi.org/10.1007/s11192-007-1653-8

[11]
Çakır M.P., Acartürk C., Alaşehir O., & Çilingir C. (2015). A comparative analysis of global and national university ranking systems. Scientometrics, 103(3), 813-848. https://doi.org/10.1007/s11192-015-1586-6

[12]
Chen K.-H., & Liao P.-Y. (2012). A comparative study on world university rankings: a bibliometric survey. Scientometrics, 92(1), 89-103. https://doi.org/10.1007/s11192-012-0724-7

[13]
Coccia M. (2008). Measuring scientific performance of public research units for strategic change. Journal of Informetrics, 2(3), 183-194. https://doi.org/10.1016/j.joi.2008.04.001

[14]
Coccia M., & Bozeman B. (2016). Allometric models to measure and analyze the evolution of international research collaboration. Scientometrics, 108(3), 1065-1084. https://doi.org/10.1007/s11192-016-2017-x

[15]
Daskivich T.J., & Gewertz B.L. (2023). Campaign reform for US News and World Report rankings. JAMA Surgery, 158(2), 114-115. https://doi.org/10.1001/jamasurg.2022.4511

[16]
Fairclough R., & Thelwall M. (2015). More precise methods for national research citation impact comparisons. Journal of Informetrics, 9(4), 895-906. https://doi.org/10.1016/j.joi.2015.09.005

[17]
Faghri A. & Bergman T.L. (2024). Highly ranked scholars and the influence of countries/regions in research fields, disciplines, and specialties. Quantitative Science Studies, 5(2), 464-483. https://doi.org/10.1162/qss_a_00291

[18]
Guba K., & Tsivinskaya A. (2023). Expert judgements versus publication-based metrics: Do the two methods produce identical results in measuring academic reputation? Journal of Documentation, 79(1), 127-143. https://doi.org/10.1108/JD-02-2022-0039

[19]
Jacsó P. (2009). The h-index for countries in Web of Science and Scopus. Online Information Review, 33(4), 831-837. https://doi.org/10.1108/14684520910985756

[20]
Koltun V., & Hafner D. (2021). The h-index is no longer an effective correlate of scientific reputation. PLOS ONE, 16(6), e0253397. https://doi.org/10.1371/journal.pone.0253397

[21]
Leydesdorff L., & Wagner C. (2009). Macro-level indicators of the relations between research funding and research output. Journal of Informetrics, 3(4), 353-362. https://doi.org/10.1016/j.joi.2009.05.005

[22]
Leydesdorff L., Wagner C.S., & Zhang L. (2021). Are university rankings statistically significant? A comparison among Chinese universities and with the USA. Journal of Data and Information Science, 6(2), 67-95. https://doi.org/10.2478/jdis-2021-0014

[23]
Leydesdorff L., & Zhou P. (2005). Are the contributions of China and Korea upsetting the world system of science? Scientometrics, 63(3), 617-630. https://doi.org/10.1007/s11192-005-0231-1

[24]
Massucci F.A., & Docampo D. (2019). Measuring the academic reputation through citation networks via PageRank. Journal of Informetrics, 13(1), 185-201. https://doi.org/10.1016/j.joi.2018.12.001

[25]
Moskovkin V.M., Zhang H., Sadovski M.V., & Serkina O.V. (2022). Comprehensive quantitative analysis of the TOP-100s of ARWU, QS and THE World University Rankings for 2014-2018. Education for Information, 38(2), 133-169. https://doi.org/10.3233/EFI-211539

[26]
Pikos A.M. (2022). Restoring trust in an organization after a business school rankings scandal. Polish Sociological Review, issue 217, 93-113. https://doi.org/10.26412/psr217.06

[27]
Ramírez-Castañeda V. (2020). Disadvantages in preparing and publishing scientific papers caused by the dominance of the English language in science: The case of Colombian researchers in biological sciences. PLOS ONE, 15(9), e0238372. https://doi.org/10.1371/journal.pone.0238372

[28]
Rauhvargers A. (2014). Where are the global rankings leading us? An analysis of recent methodological changes and new developments. European Journal of Education, 49(1), 29-44. https://doi.org/10.1111/ejed.12066

[29]
Rodriguez-Navarro A. (2016). Research assessment based on infrequent achievements: A comparison of the United States and Europe in terms of highly cited papers and Nobel Prizes. Journal for the Association of Information Science and Technology, 67(3), 731-740. https://doi.org/10.1002/asi.23412

[30]
Shehatta I., & Mahmood K. (2016). Correlation among top 100 universities in the major six global rankings: policy implications. Scientometrics, 109(2), 1231-1254. https://doi.org/10.1007/s11192-016-2065-4

[31]
Sinson G., Kolinski J., Alme C., & Siddhartha S. (2023). Is it time to follow the lawyers: Should hospitals extract themselves from US News & World Report rankings? American Journal of Medical Quality, 38(3), 160-161. https://doi.org/10.1097/JMQ.0000000000000116

[32]
Van Hooydonk G. (1997). Fractional counting of multi-authored publications: Consequences for the impact of authors. Journal of the American Society for Information Science, 48(10), 944-945. https://doi.org/10.1002/(SICI)1097-4571(199710)48:10%3C944::AID-ASI8%3E3.0.CO;2-1

[33]
Van Leeuwen T.N., Moed H.F., Tijssen R.J.W., Visser M.S., & van Raan A.J.F. (2001). Language biases in the coverage of the Science Citation Index and its consequences for international comparisons of research performance. Scientometrics, 51(1), 335-346. https://doi.org/10.1023/A:1010549719484

[34]
Van Raan A.F.J., van Leeuwen T.N., & Visser M.S. (2011). Severe language effect in university rankings: Particularly Germany and France are wronged in citation-based rankings. Scientometrics, 88(2), 495-498. https://doi.org/10.1007/s11192-011-0382-1

[35]
Viiu G.-A. (2016). A theoretical evaluation of Hirsch-type bibliometric indicators confronted with extreme self-citation. Journal of Informetrics, 10(2), 552-566. https://doi.org/10.1016/j.joi.2016.04.010

Copyright © 2023 All rights reserved Journal of Data and Information Science
