Breaking Away from Rankings

The Growing Movement to Reform Research Assessment and Rankings

By Dean Hoke, September 22, 2025: For the past fifteen years, I have been closely observing what can only be described as a worldwide fascination—if not obsession—with university rankings, whether produced by Times Higher Education, QS, or U.S. News & World Report. In countless conversations with university officials, a recurring theme emerges: while most acknowledge that rankings are often overused by students, parents, and even funders when making critical decisions, few deny their influence. Nearly everyone agrees that rankings are a “necessary evil”—flawed, yet unavoidable—and many institutions still direct significant marketing resources toward leveraging rankings as part of their recruitment strategies.

It is against this backdrop of reliance and ambivalence that recent developments, such as Sorbonne University’s decision to withdraw from THE rankings, deserve closer attention.

In a move that signals a potential paradigm shift in how universities position themselves globally, Sorbonne University recently announced it will withdraw from the Times Higher Education (THE) World University Rankings starting in 2026. This decision isn’t an isolated act of defiance—Utrecht University had already left THE in 2023, and the Coalition for Advancing Research Assessment (CoARA), founded in 2022, has grown to 767 members by September 2025. Together, these milestones reflect a growing international movement that questions the very foundations of how we evaluate academic excellence.

The Sorbonne Statement: Quality Over Competition

Sorbonne’s withdrawal from THE rankings isn’t merely about rejecting a single ranking system. It appears to be a philosophical statement about what universities should stand for in the 21st century. The institution has made it clear that it refuses to be defined by its position in what it sees as commercial ranking matrices that reduce complex academic institutions to simple numerical scores.

Understanding CoARA: The Quiet Revolution

The Coalition for Advancing Research Assessment represents one of the most significant challenges to traditional academic evaluation methods in decades. Established in 2022, CoARA has grown rapidly to include 767 member organizations as of September 2025. This isn’t just a European phenomenon, though European institutions have been early and enthusiastic adopters.

The Four Pillars of Reform

CoARA’s approach centers on four key commitments that directly challenge the status quo:

1. Abandoning Inappropriate Metrics: The agreement explicitly calls for abandoning “inappropriate uses of journal- and publication-based metrics, in particular inappropriate uses of Journal Impact Factor (JIF) and h-index.” This represents a direct assault on the quantitative measures that have dominated academic assessment for decades.

2. Avoiding Institutional Rankings: Perhaps most relevant to the Sorbonne’s decision, CoARA commits signatories to “avoid the use of rankings of research organisations in research assessment.” This doesn’t explicitly require withdrawal from ranking systems, but it does commit institutions to not using these rankings in their own evaluation processes.

3. Emphasizing Qualitative Assessment: The coalition promotes qualitative assessment methods, including peer review and expert judgment, over purely quantitative metrics. This represents a return to more traditional forms of academic evaluation, albeit updated for modern needs.

4. Responsible Use of Indicators: Rather than eliminating all quantitative measures, CoARA advocates for the responsible use of indicators that truly reflect research quality and impact, rather than simply output volume or citation counts.

European Leadership

Chart: Top 10 Countries by CoARA Membership

The geographic distribution of CoARA members tells a compelling story about where resistance to traditional ranking systems is concentrated. As the chart shows, European countries dominate participation, led by Spain and Italy, with strong engagement also from Poland, France, and several Nordic countries. This European dominance isn’t accidental—the region’s research ecosystem has long been concerned about the Anglo-American dominance of global university rankings and the way these systems can distort institutional priorities.


Prestigious European universities like ETH Zurich, the University of Zurich, Politecnico di Milano, and the University of Manchester are among the members, lending credibility to the movement. However, the data reveals that the majority of CoARA members (84.4%) are not ranked in major global systems like QS, which adds weight to critics’ arguments about institutional motivations.

Chart: CoARA Members Ranked vs Not Ranked in QS

The Regional Divide: Participation Patterns Across the Globe

What’s particularly striking about the CoARA movement is the relative absence of U.S. institutions. While European universities have flocked to join the coalition, American participation remains limited. This disparity reflects fundamental differences in how higher education systems operate across regions.

American Participation: The clearest data we have on institutional cooperation with ranking systems comes from the United States. Despite some opposition to rankings, 78.1% of the nearly 1,500 ranked institutions returned their statistical information to U.S. News in 2024, showing that the vast majority of American institutions continue to cooperate with these systems. There have, however, been some notable American defections. Columbia University is among the latest institutions to withdraw from the U.S. News & World Report college rankings, joining a small but growing list of American institutions questioning these systems. Yet these remain exceptions rather than the rule.

European Engagement: While we don’t have equivalent participation rate statistics for European institutions, we can observe their engagement patterns differently. 688 universities appear in the QS Europe ranking for 2024, and 162 institutions from Northern Europe alone appear in the QS World University Rankings: Europe 2025. However, European institutions have simultaneously embraced the CoARA movement in large numbers, suggesting a more complex relationship with ranking systems—continued participation alongside philosophical opposition.

Global Participation Challenges: For other regions, comprehensive participation data is harder to come by. The Arab region has 115 entries across five broad areas of study in QS rankings, but these numbers reflect institutional inclusion rather than active cooperation rates. It’s important to note that some ranking systems use publicly available data regardless of whether institutions actively participate or cooperate with the ranking organizations.

This data limitation itself is significant—the fact that we have detailed participation statistics for American institutions but not for other regions may reflect the more formalized and transparent nature of ranking participation in the U.S. system versus other global regions.

American universities, particularly those in the top tiers, have largely benefited from existing ranking systems. The global prestige and financial advantages that come with high rankings create powerful incentives to maintain the status quo. For many American institutions, rankings aren’t just about prestige—they’re about attracting international students, faculty, and research partnerships that are crucial to their business models.

Beyond Sorbonne: Other Institutional Departures

Sorbonne isn’t alone in taking action. Utrecht University withdrew from THE rankings earlier, citing concerns about the emphasis on scoring and competition. These moves suggest that some institutions are willing to sacrifice prestige benefits to align with their values. Interestingly, the Sorbonne has embraced alternative ranking systems such as the Leiden Open Rankings, which highlight its impact.

The Skeptics’ View: Sour Grapes or Principled Stand?

Not everyone sees moves like Sorbonne’s withdrawal as a principled stand. Critics argue that institutions often raise philosophical objections only after slipping in the rankings. As one university administrator put it: “If the Sorbonne were doing well in the rankings, they wouldn’t want to leave. We all know why self-assessment is preferred. ‘Stop the world, we want to get off’ is petulance, not policy.”

This critique resonates because many CoARA members are not major players in global rankings, which fuels suspicion that reform may be as much about strategic positioning as about values. For skeptics, the call for qualitative peer review and expert judgment risks becoming little more than institutions grading themselves or turning to sympathetic peers.

The Stakes: Prestige vs. Principle

At the heart of this debate is a fundamental tension: Should universities prioritize visibility and prestige in global markets, or focus on measures of excellence that reflect their mission and impact? For institutions like the Sorbonne, stepping away from THE rankings is a bet that long-term reputation will rest more on substance than on league table positions. But in a globalized higher education market, the risk is real—rankings remain influential signals to students, faculty, and research partners.

Rankings also exert practical influence in ways that reformers cannot ignore. Governments frequently use global league tables as benchmarks for research funding allocations or as part of national excellence initiatives. International students, particularly those traveling across continents, often rely on rankings to identify credible destinations, and faculty recruitment decisions are shaped by institutional prestige. In short, rankings remain a form of currency in the global higher education market.

This is why the decision to step away from them carries risk. Institutions like the Sorbonne and Utrecht may gain credibility among reform-minded peers, but they could also face disadvantages in attracting international talent or demonstrating competitiveness to funders. Whether the gamble pays off will depend on whether alternative approaches, such as CoARA-aligned assessment or ROI-based rankings, achieve sufficient recognition to guide these critical decisions.

The Future of Academic Assessment

The CoARA movement and actions like Sorbonne’s withdrawal represent more than dissatisfaction with current ranking systems; they highlight deeper questions about what higher education values in the 21st century.

Yet rankings are unlikely to disappear. For students, employers, and funders, they remain a convenient—if imperfect—way to compare institutions across borders. The practical reality is that rankings will continue to coexist with newer approaches, even as reform efforts reshape how universities evaluate themselves internally.

Alternative Rankings: The Rise of Outcome-Based Assessment

While CoARA challenges traditional rankings, a parallel trend focuses on outcome-based measures such as return on investment (ROI) and career impact. Georgetown University’s Center on Education and the Workforce, for example, ranks more than 4,000 colleges on the long-term earnings of their graduates. Its findings tell a very different story than research-heavy rankings—Harvey Mudd College, which rarely appears at the top of global research lists, leads ROI tables with graduates projected to earn $4.5 million over 40 years.

Other outcome-oriented systems, such as The Princeton Review’s “Best Value” rankings, emphasize affordability, employment, and post-graduation success, combining measures of financial aid, academic rigor, and career outcomes to highlight institutions that deliver strong returns for students relative to their costs. Public universities often rise in these rankings, as do specialized colleges that may not feature prominently in global research tables.

Together, these approaches represent a pragmatic counterbalance to CoARA’s reform agenda, showing that students and employers increasingly want measures of institutional value that go beyond research metrics alone.

Institutions like the Albany College of Pharmacy and Health Sciences illustrate this point. Although the college is virtually invisible in global rankings, its graduates report median salaries of $124,700 just ten years after graduation, placing it among the best in the nation on ROI measures. For students and families making education decisions, data like this often carries more weight than a university’s position in QS or THE.

Together with Georgetown’s ROI rankings and the example of Harvey Mudd College, these cases suggest that outcome-based rankings are not marginal alternatives—they are becoming essential tools for understanding institutional value in ways that matter directly to students and employers.

Rankings as Necessary Evil: The Practical Reality

Even so, the practical reality remains: for most institutions, rankings are still the “necessary evil” described at the outset, too influential with students, funders, and governments to ignore. What reform efforts like CoARA can change is how universities use and respond to them.

If the movement gains momentum, we could see:

Diversification of evaluation methods, with different regions and institution types developing assessment approaches that align with their specific values and goals

Reduced emphasis on competition between institutions in favor of collaboration and shared improvement

Greater focus on societal impact rather than purely academic metrics

More transparent and open assessment processes that allow for a better understanding of institutional strengths and contributions

Conclusion: Evolution, Not Revolution

The Coalition for Advancing Research Assessment and decisions like Sorbonne’s withdrawal from THE rankings represent important challenges to how we evaluate universities, but they signal evolution rather than revolution. Instead of the end of rankings, we are witnessing their diversification. ROI-based rankings, outcome-focused measures, and reform initiatives like CoARA now coexist alongside traditional global league tables, each serving different audiences.

Skeptics may dismiss reform as “sour grapes,” yet the concerns CoARA raises about distorted incentives and narrow metrics are legitimate. At the same time, American resistance reflects both philosophical differences and the pragmatic advantages U.S. institutions enjoy under current systems.

The most likely future is a pluralistic landscape: research universities adopting CoARA principles internally while maintaining a presence in global rankings for visibility; career-focused institutions highlighting ROI and student outcomes; and students, faculty, and employers learning to navigate multiple sources of information rather than relying on a single hierarchy.

In an era when universities must demonstrate their value to society, conversations about how we measure excellence are timely and necessary. Whether change comes gradually or accelerates, the one-size-fits-all approach is fading. A more complex mix of measures is emerging—and that may ultimately serve students, institutions, and society better than the systems we are leaving behind. In the end, what many once described to me as a “necessary evil” may persist—but in a more balanced landscape where rankings are just one measure among many, rather than the single obsession that has dominated higher education for so long.


Dean Hoke is Managing Partner of Edu Alliance Group, a higher education consultancy, and formerly served as President/CEO of the American Association of University Administrators (AAUA). He has worked with higher education institutions worldwide, and with decades of experience in higher education leadership, consulting, and institutional strategy, he brings a wealth of knowledge about the challenges and opportunities colleges face. Dean is also Executive Producer and co-host of the podcast series Small College America.

Higher Education Leadership in Times of Crisis Part II

By Dr. Barry Ryan, September 15, 2025 – In my August 11 article, “Higher Education Leadership in Times of Crisis,” we established that higher education leadership today cannot be solitary work and that effective crisis response requires both internal and external counsel. Now that you’ve assembled (or at least thought through) your cast of trusted advisors and recognized the unique leadership demands of your situation, the next critical step is understanding what you’re actually facing and how to navigate it successfully. Once you recognize that your organization may be entering such a time, there are three key initial questions to ask:

  1. How long can a crisis be expected to last?
  2. What are the effects of crisis on my institution, on my team, on my loved ones, and on me?
  3. What are some healthy and effective ways I can lead during crisis?

First, how long should I expect a “typical” crisis to last?

At first blush, it might seem a little silly to ask how long a crisis lasts. After all, isn’t that inherently unpredictable?

The answer is “yes” and “no.” It may seem a little flippant to say, but the reality is that the length of a crisis depends to a certain degree on how you and those in leadership alongside you respond to it. Your approach and actions may make it longer or shorter than it would have been. Here’s what I mean.

Ignoring a crisis and hoping that it blows over is actually a potential strategy—although not one that I would recommend in most circumstances. But there are some built-in roadblocks in a university’s life cycle, which is divided largely into annual, semester, or quarter segments. These can act, on their own, as speed bumps or detours that might diminish or change the course of a crisis.  

For example, a crisis that is being instigated or aggravated by certain individuals might be relieved to some degree on its own by their departure through retirement, transfer, and so on. Or a financial crisis might be eased when certain types of debt are finally paid off, or when anticipated grants or gifts come within sight. But these are, unfortunately, uncommon scenarios, and the timing may be unpredictable.

On a global scale, one might think of Winston Churchill trying to imagine how long World War II might last. As futile as such a task might have been, he did, indeed, play out various scenarios and their likely duration. Churchill probably never actually said, “When you’re going through hell, keep going,” although the line makes for a great quote and captures an important aspect of his thinking. Either way, it is a good reminder for anyone in crisis.

To grossly generalize, I have found that most institutional crises last between six months and two years. Why is that? The more acute ones require quicker action, and the result is either a solution that addresses the issues promptly and efficiently, in, say, six months, so that you can move on to other things, or a failure to find a speedy solution, which may end with you moving on. (And I don’t mean this lightly, but the reality is that moving on is not the end of the world.)

Why the two-year time frame, on the other end? Because I’ve found that to be about the maximum time frame that a board, or an accreditor, or a creditor, or even a faculty can endure before a solution is reached. Again, the conclusion of the crisis will either leave you in a happier and stronger position in your institution or leave you seeking happiness and a better position somewhere else. But somewhere between six months and two years is what I have found to be the rough lifespan of an intense crisis. (This is barring, of course, a truly existential crisis as a result of which the institution ceases to exist in its current form. But even that drastic of an outcome can easily take two years or more to unfold.)

Second, what are some of the common effects, and how do you survive them?

For the sake of argument, let’s say you become aware that you are entering a crisis period, whether or not it eventually proves to be an existential one. How do you survive in the intervening six months to two years?

Let’s begin with the effects of a continuing crisis on a leader. The crisis can easily become an enormous distraction for someone who already has too much on their plate. The stress that comes with leadership increases in crisis times, with mental, emotional, and even physical effects. Exhaustion can become a daily (and nightly) companion.  Self-doubt creeps in and steals even more of the leader’s resources.

It sounds trite, but when this happens, don’t forget to take a few deep breaths – physically and metaphorically. 

Draw up a “non-crisis” item list, i.e., things that still need to be done but aren’t necessarily at the crisis point. Then start divvying them up among your fellow leaders and, when possible, their direct reports. This could be an opportune time to help them grow and develop, as well as to ease your load.

Along with that, begin to excuse yourself from meetings at which your presence is not absolutely necessary. Only you really know which ones are and which aren’t. You may still need to attend some that aren’t technically necessary but may prove helpful in crisis-related activities. Again, having trusted substitutes sit in for you for a while can be a growth opportunity for them, and it also demonstrates that you trust and empower those with whom you work. When it comes to meetings, which can drain you even more, perhaps adopt a practice of making only limited strategic appearances: participate just enough, and for just long enough, to establish your presence and help you – and your colleagues – feel like you’re staying in touch.

Don’t forget to take some days off, or even vacations. Sad but true: don’t make them too long, too far away, or somewhere too difficult for you to be reached. You’re probably not going to relax completely anyway, but you should at least benefit from a change of place and perspective. Frankly, you would do well to consider the health and happiness of the loved ones who’ve been going through this with you; they need a break, perhaps even more than you do. After all, you are able to face the crisis, and possible adversaries, directly, while your loved ones have to suffer vicariously and without the same ability to engage.

Third, what are some healthy and effective ways to lead during a crisis?

There is no question that crises have deleterious effects on you and your friends and family, but also on your colleagues. You undoubtedly have support and supporters (even though they may seem distant), so don’t neglect them. Their fidelity to the institution and its mission – and to you – deserves appreciation and acknowledgement, even if only expressed privately. They’re worried about the institution, but also about their livelihoods and their colleagues.

When they see you, try not to be the deer in the headlights (a situation that doesn’t usually end well in the wild). Appearing indecisive is uninspiring. But so is being overbearing or angry.

Try to be yourself as you were before the crisis. Remember to smile, relax the muscles of your face and neck, and ask them about their loved ones, their teaching, or their research. Be human. The thoughtful ones have an idea about what you’re feeling and going through, so it’s okay for them to see you as a human. You don’t have to adopt a fake effervescence, but you should avoid moping.

Seek impartial counsel. That may, or may not, include colleagues. A small group of confidants is necessary. External friends who have the courage to be honest with you, and also to keep complete confidence, can be your best resource for gaining and keeping perspective. They may have higher ed experience, but not necessarily. I have always found that the best counsel comes from people who have faced real challenges, suffered real losses, survived real attacks, and still kept their heads about them. Advisors who seem “too perfect” are probably not what you need at this point.


While you need to seek and obtain trustworthy counsel, try at the same time to avoid seeking too much of it. The bottom line is that you’re a leader, and you’re going to have to make difficult decisions. Accept counsel, but too much can be confusing and even overwhelming.

Look, you’re in a tough position and no matter what you do, some people (possibly including some people you respect and care about) are not going to be thrilled. Sad but true. And some of those feelings may change over time, as they come to a fuller perspective as well.

My advice to leaders in crisis situations always includes two elements:

Can you make a decision that allows you to look at yourself in the mirror? 

Then do what you believe is right and let the chips fall where they may. Period.

While you are a leader in a profession you may (or may not any longer) dearly love, there IS an “after.” That may mean continuing in your position at the same institution once the crisis has passed, or it may mean more significant changes for you. If so, take what you’ve learned along to whatever comes next. Partings are rarely enjoyable, but I recall a very thoughtful young person we had to let go. His response was remarkable: “I want to learn from this experience and become better as a result.” When I saw him at another institution a year later, he came up to me and said that’s exactly what had transpired and that he was grateful.

Your life, and your legacy, are much more than just this current time of crisis within this current institution. Be grateful to those who have earned that gratitude, and remember who you are.


Dr. Barry Ryan is a seasoned higher education executive, legal scholar, and former president of five universities. He is a senior consultant for the Edu Alliance Group. With more than 25 years of leadership experience, Dr. Ryan has served in numerous roles, including faculty member, department chair, dean, vice president, provost, and chief of staff at state, non-profit, and for-profit universities and law schools. His extensive accreditation experience includes two terms on the WASC Senior College and University Commission (WSCUC), where he served the maximum of six years. He is widely recognized for his expertise in governance, accreditation, crisis management, and institutional renewal.

In addition to his academic career, Dr. Ryan served as the Supreme Court Fellow in the chambers of Chief Justice William H. Rehnquist and is a member of numerous federal and state bars. He has contributed extensively to charitable organizations and is experienced in board leadership and large-scale fundraising. He remains a trusted advisor to universities and boards seeking strategic alignment and transformation.

He earned his Ph.D. from the University of California, Santa Barbara, his J.D. from the University of California, Berkeley, and his Dipl.GB in international business from the University of Oxford.


Edu Alliance Group, Inc. (EAG), founded in 2014, is an education consulting firm located in Bloomington, Indiana, and Abu Dhabi, United Arab Emirates. We assist higher education institutions worldwide on a variety of mission-critical projects. Our consultants are accomplished leaders who use their experience to diagnose and solve challenges.

EAG has provided consulting and executive search services for over 40 higher education institutions in Australia, Egypt, Georgia, India, Kazakhstan, Morocco, Nigeria, Uganda, the United Arab Emirates, and the United States.