This page aggregates publicly available data from QS, Times Higher Education, and ARWU for comparison purposes only. Composite bands are calculated by Find The Norm and are not endorsed by any ranking organisation. Rankings measure institutional characteristics like research output, reputation, and resources, not the quality of education an individual student will receive. Subject-level rankings often differ significantly from overall rankings. This page is for informational purposes only and should not be the sole factor in choosing a university.

EDUCATION

Is your university actually well ranked?

"Is this a good school?" sounds like a question with one answer. It is not. The three major global ranking systems use fundamentally different methodologies, and the same institution can sit dozens or even hundreds of places apart depending on which ranking you read. This page compiles the data on how rankings are built, why they diverge, and how to read them sensibly.

Sources: QS World University Rankings 2025 · Times Higher Education 2025 · ARWU Shanghai Ranking 2024 · IPEDS and HESA supplementary data
3 systems Major global ranking organisations producing annual world tables (QS, THE, ARWU)
~1,500 Universities ranked in QS each year, out of roughly 25,000 worldwide
45% Share of the QS score that comes from reputation surveys alone

Look up a university across all three systems

Pick a university and (optionally) a second to compare. Filter by ranking system or subject area to see where the differences land.

Why universities rank differently across systems

The three major ranking systems use fundamentally different methodologies. QS places heavy weight on reputation surveys, with 45% of its score coming from academic and employer reputation, which favours well-known brands and institutions with large alumni networks. THE (Times Higher Education) weights research quality heavily at 30% and includes teaching proxies for another 29.5%. ARWU focuses almost entirely on research output and prestige indicators like Nobel Prize winners (30%) and highly cited researchers (20%), and ignores teaching and reputation surveys entirely.


The result is that a teaching-focused university with strong employer connections might rank highly in QS but poorly in ARWU. A research powerhouse with few Nobel laureates and low international visibility might rank well in THE but poorly in QS. A composite band, the average across systems, smooths these differences, but understanding why they diverge is more useful than picking a single number.

Factor QS weight THE weight ARWU weight
Research output / citations 20% 30% 40%
Teaching quality proxy 10% (faculty-student ratio) 29.5% 0%
Reputation surveys 45% 0% 0%
Nobel/Fields prizes 0% 0% 30%
International outlook 10% 7.5% 0%
Employer / industry 20% 4% 0%

Source: QS Quacquarelli Symonds 2025; Times Higher Education 2025; ShanghaiRanking 2024. Weights are grouped into broad factors for comparison: QS employer reputation (15%) is counted under both reputation surveys and employer/industry, THE's survey components sit inside its teaching and research pillars, and minor indicators (such as QS sustainability) are omitted, so columns do not sum to exactly 100%.
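
To see how much the weights drive the outcome, here is a minimal sketch in Python. The indicator scores and the two institutions ("Brandford" and "Labville") are invented for illustration, and real ranking systems normalise raw data before weighting; the point is only that the same inputs produce opposite orderings under QS-style and ARWU-style weights.

# A minimal sketch, assuming invented 0-100 indicator scores; real ranking
# systems normalise raw data before applying weights like those in the table.
WEIGHTS = {
    "QS-style":   {"research": 0.20, "teaching": 0.10, "reputation": 0.45,
                   "prizes": 0.00, "international": 0.10, "employer": 0.15},
    "ARWU-style": {"research": 0.40, "teaching": 0.00, "reputation": 0.00,
                   "prizes": 0.30, "international": 0.00, "employer": 0.00},
}

institutions = {
    # Hypothetical teaching-focused brand with strong employer links:
    "Brandford": {"research": 55, "teaching": 80, "reputation": 90,
                  "prizes": 5, "international": 85, "employer": 88},
    # Hypothetical research powerhouse with low public visibility:
    "Labville": {"research": 92, "teaching": 40, "reputation": 50,
                 "prizes": 70, "international": 30, "employer": 45},
}

for system, weights in WEIGHTS.items():
    scores = {name: sum(w * vals[k] for k, w in weights.items())
              for name, vals in institutions.items()}
    order = sorted(scores, key=scores.get, reverse=True)
    print(system, [(name, round(scores[name], 1)) for name in order])

# QS-style   -> Brandford (81.2) ahead of Labville (54.7)
# ARWU-style -> Labville (57.8) ahead of Brandford (23.5)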

Top 10 universities across all three systems

Looking at the top of every ranking, the same handful of institutions appear repeatedly. MIT, Stanford, Harvard, Cambridge, and Oxford trade the top five places between systems, and Caltech and Imperial sit in or near the top ten in most systems. The composite band is a broad categorisation we assign based on average rank across the systems where the university appears, smoothing out the noise between methodologies.

University QS 2025 THE 2025 ARWU 2024 Composite band
MIT 1 3 3 Top 5
Stanford 2 2 2 Top 5
Harvard 4 4 1 Top 5
University of Cambridge 5 5 4 Top 5
University of Oxford 3 1 7 Top 5
Caltech 10 7 9 Top 10
Imperial College London 6 8 23 Top 15
ETH Zurich 7 11 20 Top 15
UCL 9 22 17 Top 20
University of Chicago 21 13 10 Top 15

Source: QS World University Rankings 2025; Times Higher Education 2025; ARWU 2024.

How does your grade compare?

Knowing your university's rank tells you about the institution. Knowing your percentile within it tells you about you. The unified grade percentile calculator translates GPAs, classifications, and scores between countries.

Open the calculator

When rankings disagree: divergence examples

Outside the top 10 the disagreement between systems becomes substantial. NUS Singapore, for example, sits at 8 in QS but 71 in ARWU, a gap of more than 60 places driven almost entirely by methodology. ARWU rewards Nobel laureates and Nature/Science publications, where NUS has fewer. QS rewards international reputation and employer connections, where NUS is exceptional. The University of Malaya shows the most extreme divergence in this sample: 60 in QS and 301 to 400 in ARWU.

Knowing where the divergence is large is itself useful information. It tells you which dimensions of "quality" the institution is strong in, and which it is not.

University QS 2025 THE 2025 ARWU 2024 Divergence
NUS Singapore 8 19 71 High
University of Melbourne 14 44 35 Moderate
Tsinghua University 25 12 22 Low
University of Edinburgh 27 30 38 Low
Monash University 42 65 77 Moderate
University of Malaya 60 188 301-400 Very high

Source: QS 2025; THE 2025; ARWU 2024. Divergence is a Find The Norm classification based on the spread of ranks.
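
As a rough illustration of how a spread-based label like this can be computed, here is a short Python sketch. The thresholds are our own assumptions for illustration, not Find The Norm's published cut-offs, and banded ARWU ranks such as "301-400" are reduced to their midpoints.

def midpoint(rank):
    # Accept an exact rank (e.g. 8) or a banded range (e.g. "301-400").
    if isinstance(rank, str) and "-" in rank:
        low, high = (int(part) for part in rank.split("-"))
        return (low + high) / 2
    return float(rank)

def divergence(ranks):
    # Spread between the best and worst rank across systems.
    values = [midpoint(r) for r in ranks if r is not None]
    spread = max(values) - min(values)
    if spread < 20:
        return "Low"
    if spread < 50:
        return "Moderate"
    if spread < 150:
        return "High"
    return "Very high"

print(divergence([8, 19, 71]))           # NUS Singapore        -> High
print(divergence([60, 188, "301-400"]))  # University of Malaya -> Very high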

Global university tier bands

For users looking at universities outside the top 50, broader tier bands are more useful than precise positions. A university ranked 480 in one system and 540 in another is, for most practical purposes, the same institution. The tiers below are a Find The Norm classification, not an official ranking, and are designed to answer the most common search intent: "is this a strong university or not?"

Tier Global rank range Examples
Elite 1-25 MIT, Stanford, Oxford, Cambridge, Harvard
World-class 26-100 Melbourne, Toronto, Edinburgh, NUS
Highly regarded 101-300 Birmingham, Macquarie, Arizona State
Solid 301-600 Many regional flagships
Recognised 601-1000 Emerging universities, newer institutions

Source: Find The Norm composite tier bands derived from QS, THE, and ARWU 2024 to 2025 averages.

Why subject rankings matter more than overall rankings

A university ranked 200th overall might sit in the top 30 for a specific subject. A university with a world-class engineering school and weaker humanities will rank much higher in engineering-specific tables than in the overall world rankings. QS and THE both publish subject-specific rankings, often with very different conclusions from the overall tables.

For prospective students, the subject ranking for the intended field is usually more relevant than the overall institutional rank. If you are choosing between universities for a specific programme, the subject table should drive the shortlist, not the overall composite. Use the education level calculator for context on how your degree level compares to the wider population once you graduate.

How rankings shape university behaviour

Rankings create a powerful feedback loop. Universities that rank highly attract more international students, who often pay higher fees, more research funding, and more philanthropic donations. That additional revenue allows them to invest in better facilities and recruit stronger faculty, which in turn improves their ranking. The cycle is self-reinforcing.

One consequence is that ranking systems have real-world power over university strategy, sometimes pushing institutions to prioritise metrics that improve their position rather than changes that better serve students. Some universities have been caught manipulating data submitted to ranking organisations. A university's ranking position reflects its optimised presentation to ranking systems as much as its intrinsic quality.

Rankings are one input into a university choice, not the input. Programme fit, location, cost, teaching quality, and personal circumstances usually matter more for individual outcomes than the difference between, say, rank 50 and rank 150.

How the ranking landscape has shifted

The most significant change over the past decade has been the rise of Asian universities, particularly from China, Singapore, South Korea, and Hong Kong. Tsinghua and Peking University have moved from outside the top 50 to firmly within the top 25 across most systems. NUS Singapore is now regularly inside the QS top 10. Some traditional European universities have slipped as research funding has shifted eastward.

The methodologies themselves have also evolved. QS added sustainability and employment metrics, and THE revised its research quality indicators. These changes mean that year-on-year comparisons need caution: a university might move up or down partly because the scoring formula changed, not because the institution itself changed. We update our aggregated dataset within 14 days of each new ranking release, and run a full audit every October once all three systems have published.

How university rankings are actually calculated

The three major global university ranking systems — QS, THE (Times Higher Education), and ARWU (Shanghai Ranking) — each measure a different set of institutional characteristics and weight them differently. Understanding what each system actually measures is more useful than accepting any single rank at face value. QS weights academic reputation surveys at 30% and employer reputation at 15%, meaning 45% of the QS score is based on what other academics and employers say about an institution rather than any directly measured output. Citations per faculty (20%) and faculty-student ratio (10%) make up most of the remaining QS score. This methodology favours well-established, internationally visible institutions whose reputations precede their current performance.

THE (Times Higher Education) takes a different approach: research environment accounts for 29% and research quality (citations) for 30%, making THE primarily a research ranking. Teaching (29.5%) is assessed through proxies (staff-to-student ratio, doctorates awarded, institutional income) rather than direct teaching quality measurement. Industry income (4%) and international outlook (7.5%) complete the THE score. THE does use reputation surveys, but folds them into its teaching and research pillars rather than weighting them as standalone indicators, which is its most visible methodological departure from QS. ARWU (the Shanghai Ranking) is the most research-centric of all: alumni who have won Nobel Prizes or Fields Medals (10%), staff who have won Nobel Prizes or Fields Medals (20%), highly cited researchers (20%), papers in Nature and Science (20%), and papers indexed in the major citation indexes (20%) account for 90% of the score, with the remaining 10% adjusting those results for institutional size. ARWU ignores teaching quality, student experience, and reputation entirely, making it essentially a ranking of elite research output concentrated in a handful of institutions.

The methodology comparison table on this page shows these weights side by side. The practical implication: a university can rank very differently across the three systems depending on its particular strengths. A teaching-focused university with strong employer partnerships but modest research output will score well in QS but poorly in ARWU. A research powerhouse publishing in Nature and Science but with lower international visibility will score well in ARWU but modestly in QS. Neither ranking is "wrong" — they are measuring different things. The question is which measured dimension is most relevant to your decision.

QS vs THE vs ARWU: which ranking should you trust?

No single ranking is definitively more reliable; each has documented strengths and weaknesses. QS is the most widely cited by institutions and applicants globally, partly because it ranks around 1,500 institutions, giving it far broader coverage than ARWU, and partly because it includes employer reputation data that has practical value for students entering the workforce. Its weakness is the 45% survey-based component: reputation surveys measure perception and prestige, which can persist long after an institution's actual performance has changed. A university that was excellent in 1980 may still score well in QS surveys in 2025 simply because its historical reputation is embedded in academic memory. QS is also the ranking system most susceptible to gaming, as universities can influence their survey exposure through marketing and engagement with the survey population.

THE is generally considered more methodologically rigorous than QS because reputation surveys play a smaller, embedded role in its score and more of its weight sits on measurable outputs. Its research quality metrics are based on normalised citation counts that adjust for field-specific citation patterns, a more sophisticated approach than raw citation counts. THE's weakness is that its teaching proxies (staff-to-student ratio, doctorates awarded, institutional income) measure resources rather than actual teaching effectiveness. A well-funded institution can score well on teaching metrics while still delivering a mediocre educational experience. ARWU is the most objective of the three: Nobel Prizes and highly cited researchers are verifiable and hard to game, although recruiting highly cited researchers for their affiliations is a known tactic. Its complete focus on elite research concentration means it is essentially useless for evaluating teaching universities, liberal arts colleges, or institutions specialising in applied fields. For prospective students, ARWU is the least practically useful of the three systems.

The most defensible approach is to use all three together, which is exactly what the composite band on this page provides. When all three systems agree (Oxford and Cambridge consistently appear in the top 10 of all three), the consensus is meaningful. When they disagree significantly (University of Malaya ranks 60th in QS but 301-400th in ARWU), the divergence tells you something important: the institution has strong regional employer reputation and international visibility but limited elite research output by global standards. Understanding why universities rank differently across systems gives you far more information than a single position in a single table.

Is X a good university? What rankings actually measure

The question "is [university] a good school?" is one of the most commonly searched questions in education, and the honest answer is that global rankings only partially address it. What QS, THE, and ARWU measure are institutional characteristics — research output, staff prestige, citation impact, employer recognition, international reach — that correlate with but do not directly determine the quality of education any individual student will receive. A university ranked 200th globally may have the best programme in its country for your specific field. A university ranked 15th globally may have a mediocre department in your subject area while its overall rank is lifted by unrelated faculties. For prospective students, subject-specific rankings are typically more relevant than overall institutional rankings, and subject rankings often tell a very different story.

The rank divergence examples in the data table above illustrate this clearly. NUS Singapore ranks 8th in QS but 71st in ARWU — because QS heavily rewards employer reputation and international visibility, both of which NUS excels at, while ARWU focuses on Nobel laureate density, where NUS (a relatively young institution by global standards) has fewer historical prize winners. University of Malaya's QS rank of 60 versus its ARWU rank of 301-400 reflects even stronger divergence: strong regional employer recognition elevates it in QS, while its research output at the global elite level is more modest. Neither rank is "wrong" — they reflect genuinely different dimensions of institutional quality. The composite band on this page (calculated as a broad average across available systems) smooths these differences to give a general sense of an institution's global standing.

For practical decision-making, rankings should sit alongside several other information sources: programme-specific outcomes data (what percentage of graduates from this specific course are employed in their field within 6 months?), student satisfaction surveys (NSS in the UK, NSSE in the US), independent graduate earnings data (College Scorecard in the US, HESA Graduate Outcomes in the UK), and your personal priorities around location, cost, campus culture, and career goals. Research by economists Stacy Dale and Alan Krueger has shown that for most careers, the characteristics of students who attend elite institutions (motivation, academic ability) explain their subsequent success better than the institution itself — though elite prestige still carries measurable weight in certain competitive recruitment contexts. Use rankings as one input, not the only input.


Frequently asked questions

Why can a university rank high in QS but low in ARWU?

QS places 45% of its score on reputation surveys (academic and employer), which measures perception and brand prestige. ARWU places 30% of its score on Nobel Prize and Fields Medal winners among alumni and current staff, and 40% on research output in high-impact journals. An institution with strong employer relationships and international brand recognition but limited Nobel laureate history will score far higher in QS than ARWU. Conversely, a research university with a distinguished faculty of prize-winning scientists but lower commercial profile will score far higher in ARWU. NUS Singapore (QS: 8th, ARWU: 71st) is the clearest current example — a young, ambitious institution with exceptional employer reputation and international reach that simply has fewer historical prize winners than the century-old institutions that dominate ARWU. Neither rank is more "correct" — they answer different questions.

Which universities rank in the top 10 across all three systems?

The five institutions that consistently rank in the top 10 across all three major systems are MIT, Stanford, Harvard, Oxford, and Cambridge. MIT ranks 1st in QS 2025, 3rd in THE 2025, and 3rd in ARWU 2024. Stanford ranks 2nd in QS, 2nd in THE, and 2nd in ARWU. Oxford ranks 3rd in QS, 1st in THE, and 7th in ARWU. Harvard ranks 4th in QS, 4th in THE, and 1st in ARWU. Cambridge ranks 5th in QS, 5th in THE, and 4th in ARWU. These five institutions dominate because they combine all the factors that each ranking system values: research output, employer reputation, faculty prestige, international reach, and Nobel-calibre research histories. Beyond these five, rankings diverge significantly depending on the system and subject area, making a definitive "best university" answer impossible outside this top tier.

How well do rankings predict individual outcomes?

Rankings are a useful starting point but a poor basis for a final decision. They measure institutional characteristics (research output, prestige, resources) rather than what an individual student will experience in a specific programme. The correlation between overall institutional rank and career outcomes is positive but moderate for most careers — and the correlation largely disappears when student characteristics (motivation, academic ability) are controlled for. Rankings are more reliable for identifying strong research environments (relevant if you plan a research career or PhD) and for institutions where employer prestige recognition matters in recruitment (investment banking, consulting, elite law firms). For most undergraduate programmes, subject-specific rankings, graduate employment rates by course, student satisfaction data, and personal fit are more predictive of a good outcome than the overall institutional rank. Use the composite band on this page to shortlist broadly, then dig into programme-level data before deciding.

How often do rankings change, and when are they published?

QS World Rankings are published annually each June. THE World Rankings are published annually in September-October. ARWU (Shanghai Ranking) is published annually in August. Most institutions' positions change modestly year-on-year — movements of 5-20 places are normal, driven by changes in citation counts, faculty composition, and survey responses. Large movements (50+ places in a single year) typically reflect a methodology change by the ranking organisation rather than a dramatic shift in institutional quality, which is why year-on-year comparisons require caution. This hub updates its data within 14 days of each new ranking release. Subject rankings are published separately: QS subject rankings typically appear in March-April, THE subject rankings in the same window. The annual update schedule means data shown here may be up to 11 months old at the point immediately before a new release.


Why do universities rank differently in QS, THE, and ARWU?

The three major ranking systems use fundamentally different methodologies. QS places heavy weight on reputation surveys (45% of its score comes from academic and employer reputation surveys), which favours well-known brands and institutions with large alumni networks. THE (Times Higher Education) weights research quality heavily (30%) and includes teaching proxies. ARWU focuses almost entirely on research output and prestige indicators like Nobel Prize winners (30%) and highly cited researchers (20%), completely ignoring teaching quality and reputation surveys. The result is that a teaching-focused university with strong employer connections might rank highly in QS but poorly in ARWU. Similarly, a research powerhouse with few Nobel laureates and low international visibility might rank well in THE but poorly in QS. The composite band on this page smooths these differences, but understanding why they diverge is more useful than picking a single number.

Are university rankings reliable?

Rankings measure what they measure, and they measure it reasonably well. The problem is that what they measure may not be what matters to you. If you care about research excellence, ARWU is a reasonable proxy. If you care about employer recognition, QS captures that. If you care about the classroom teaching quality you will personally experience, none of the rankings measure it directly. All three systems have been criticised for methodological choices: QS for its reliance on subjective reputation surveys, THE for opaque data collection, and ARWU for ignoring everything except research. A better approach is to use rankings as one input alongside programme-specific information, campus visits, student satisfaction surveys, and graduate employment data. This hub shows you the range across systems specifically to discourage over-reliance on any single number.

What is a composite band and how is it calculated?

The composite band is a broad categorisation we assign based on a university's average rank across the three systems where it appears. We take the arithmetic mean of its QS, THE, and ARWU ranks (excluding systems where it is unranked) and assign a band: Top 5, Top 10, Top 15, Top 20, Top 50, Top 100, Top 200, Top 500, or 500+. For universities that appear in only one or two systems, the band is based on available data with a note about limited coverage. The composite band is not endorsed by any ranking organisation and is our own calculation for comparison purposes. It smooths out the noise between systems but should not be treated as a definitive ranking. It is most useful for answering the broad question "is this a strong university?" rather than "is it the 47th or 53rd best university in the world."
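
A minimal Python sketch of that banding logic, under the assumptions stated in the answer above (arithmetic mean of available ranks, then the first band the mean fits under; the exact threshold handling is ours):

BANDS = [(5, "Top 5"), (10, "Top 10"), (15, "Top 15"), (20, "Top 20"),
         (50, "Top 50"), (100, "Top 100"), (200, "Top 200"), (500, "Top 500")]

def composite_band(qs=None, the=None, arwu=None):
    # Systems where the university is unranked are excluded from the mean.
    ranks = [r for r in (qs, the, arwu) if r is not None]
    mean = sum(ranks) / len(ranks)
    for limit, label in BANDS:
        if mean <= limit:
            return label
    return "500+"

print(composite_band(qs=9, the=22, arwu=17))  # UCL: mean 16.0      -> Top 20
print(composite_band(qs=6, the=8, arwu=23))   # Imperial: mean 12.3 -> Top 15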

How do subject rankings differ from overall rankings?

Significantly. A university ranked 200th overall might be in the top 30 for a specific subject. For example, a university with a world-class engineering school but weaker humanities and social sciences will rank much higher in engineering-specific rankings than in the overall table. QS and THE both publish subject-specific rankings that often tell a very different story from the overall rankings. For prospective students, the subject ranking for your intended field of study is usually more relevant than the overall institutional rank. This hub includes subject filtering where data is available. If you are choosing between universities for a specific programme, filter by subject rather than relying on the overall composite band.

Do rankings affect university funding and admissions?

Yes, rankings create a powerful feedback loop. Universities that rank highly attract more international students (who often pay higher fees), more research funding, and more philanthropic donations. This additional revenue allows them to invest in better facilities and recruit stronger faculty, which in turn improves their ranking. This self-reinforcing cycle means that ranking systems have real-world power over university strategy, sometimes driving institutions to prioritise metrics that improve their ranking position over changes that might better serve students. Some universities have been caught manipulating data submitted to ranking organisations. Prospective students should be aware that a university's ranking position reflects its optimised presentation to ranking systems, not just its intrinsic quality.

Why is my university not in any ranking?

The three major systems rank between 1,000 and 1,900 institutions, out of approximately 25,000 degree-granting institutions worldwide. To be ranked, a university typically needs to meet minimum thresholds for research output, size, and international visibility. Many excellent teaching-focused institutions, smaller liberal arts colleges, and specialised institutions (art schools, conservatories, professional schools) are not ranked because they do not produce enough research publications or are too small for the methodology. Not being ranked does not mean a university is bad; it means it falls outside the scope of these particular measurement systems. Checking country-specific rankings (such as NIRF for India or the TEF for the UK) may provide data for institutions not covered by global rankings.

How has the ranking landscape changed over the past decade?

The most significant shift has been the rise of Asian universities, particularly from China, Singapore, South Korea, and Hong Kong. Tsinghua and Peking University have moved from outside the top 50 to firmly within the top 25 in most systems. NUS Singapore is now regularly in the top 10 in QS. Meanwhile, some traditional European universities have slipped as research funding has shifted eastward. The US and UK still dominate the top 20, but the concentration is less extreme than a decade ago. The ranking methodologies themselves have also evolved: QS added sustainability and employment metrics, and THE revised its research quality indicators. These methodological changes mean that year-on-year comparisons require caution, as a university might move up or down partly because the scoring formula changed, not because the institution itself changed.

Should I choose a university based on its ranking?

Rankings should be one factor among many, not the deciding factor. Research consistently shows that what you study and how you engage with your programme matters more for career outcomes than the name of the institution. A motivated student at a rank-200 university who takes advantage of internships, research opportunities, and networking will typically outperform a disengaged student at a rank-20 university. That said, ranking does matter for certain career paths: elite consulting firms, investment banks, and some graduate programmes recruit disproportionately from top-ranked institutions. If you are targeting those specific paths, institutional prestige carries real weight. For most careers, the difference between a top-50 and top-200 university is marginal compared to other factors like location, programme fit, cost, and personal circumstances.

What free data can I access from ranking organisations?

QS publishes free downloadable datasets (CSV/Excel) for overall and subject rankings via topuniversities.com. THE publishes free data tables and allows sorting and filtering on its website. ARWU publishes free rankings tables on shanghairanking.com. All three allow non-commercial use with attribution. This hub aggregates data from all three free sources into a single interface. For more detailed data (individual indicator scores, historical trends, peer group analysis), QS and THE offer premium subscriptions, but the headline rank and key indicators are freely available. We update our aggregated dataset within 14 days of each new ranking release.
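
For readers who want to rebuild a comparison like this hub's from the free downloads, here is a minimal sketch using pandas. The file names and column labels are hypothetical (check the headers in each actual download), and reconciling institution name variants across the three sources is the hard part in practice.

import pandas as pd

# Hypothetical file names; each system's free download uses its own headers.
qs = pd.read_csv("qs_2025.csv")[["institution", "rank"]].rename(columns={"rank": "qs"})
the = pd.read_csv("the_2025.csv")[["institution", "rank"]].rename(columns={"rank": "the"})
arwu = pd.read_csv("arwu_2024.csv")[["institution", "rank"]].rename(columns={"rank": "arwu"})

merged = qs.merge(the, on="institution", how="outer").merge(arwu, on="institution", how="outer")

# Banded ranks like "301-400" are not plain numbers; coerce them to NaN here.
for col in ("qs", "the", "arwu"):
    merged[col] = pd.to_numeric(merged[col], errors="coerce")

merged["composite_mean"] = merged[["qs", "the", "arwu"]].mean(axis=1)  # NaN values are skipped
print(merged.sort_values("composite_mean").head(20))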

How does this hub differ from going to QS or THE directly?

The primary value is cross-system comparison in a single interface. If you go to topuniversities.com, you see only QS data. If you go to timeshighereducation.com, you see only THE data. Neither shows you how the same university ranks in the other systems, and neither explains why the ranks diverge. This hub places all three systems side by side, calculates a composite band, and highlights rank divergences with explanations tied to methodological differences. It also connects to the rest of the Find The Norm education section: from any university, you can explore country-specific rankings (NIRF, UK rankings), grade comparisons (Unified Grade Percentile), and institution-level GPA data (Average GPA by College). The goal is to make rankings useful rather than opaque.

Data sources

QS Quacquarelli Symonds (2025). QS World University Rankings 2025. topuniversities.com. Free with attribution.

Times Higher Education (2025). THE World University Rankings 2025. timeshighereducation.com. Free for non-commercial use with attribution.

ShanghaiRanking Consultancy (2024). Academic Ranking of World Universities 2024. shanghairanking.com. Free with attribution.

IPEDS College Navigator. US Department of Education, National Center for Education Statistics. Public domain.

HESA (Higher Education Statistics Agency, UK). Student outcomes and research quality data. CC BY 4.0.

Composite bands and tier classifications calculated by Find The Norm and not endorsed by any ranking organisation.

Reviewed by Find The Norm Research Team · Last updated April 2026 · Methodology