A Big Week in the Measurement of Expertise: How the UK Research Excellence Framework (REF) Results Will Impact Universities

May 9, 2022

4 min

Justin Shaw

How should we measure faculty expertise? This week the UK provides its answer to that question via its highly significant, government-directed assessment of academic research - which grades academic teams on a scale of 1* to 4* on their ability to deliver, share and create impact from globally outstanding research.


This process is known as the REF (the Research Excellence Framework) - and the results will be publicly released this Thursday (12th May), with universities themselves finding out today (Monday 9th May) how they have performed. The exercise was last carried out eight years ago and has been delayed by a year due to the pandemic.



Why is the Research Excellence Framework (REF) Significant?


The Research Excellence Framework steers the level of UK public funds - allocated via research councils - that will be invested in research for each academic department (or so-called “Unit of Assessment”) for the next few years. It is also a way of comparing performance against other universities that are offering similar research expertise, and of strengthening (or weakening) global research reputations.


During the next three days, UK universities will be digging into the detail of their REF gradings and the accompanying feedback. There will be some very nervous university leaders and research heads delving into why this peer-assessed review of their research has not gone as well as they expected, why their percentages in each of the four grade bands have dropped - or why some work has even received the career-damaging “unclassified” stamp.


How are the REF Scores for Universities Determined?


The measurement process is based on three aspects:


  1. Quality of outputs (such as publications, performances, and exhibitions)
  2. Impact beyond academia
  3. The environment that supports research


The preparation, participation, and assessment process takes a massive amount of time, attention and energy. Last time (2014) there were 1,911 submissions to review. Research teams, designated REF leaders and senior staff will have spent long hours across many months preparing their submissions, making sure they present hard evidence and the best possible case against the above criteria. There are 34 subject areas covered in the latest REF, and three tiers of expert panels (some with 20 or more senior academics, international subject leaders and research users) will have reviewed each submission and compared notes to reach their decisions.


How do these Key Categories within the REF Contribute to the Rating for a University?


The Research Excellence Framework is an intensive and highly consequential exercise in expert assessment. These are the key factors and their definitions, with the weighting each carries in steering the final grades (a simple worked sketch of how the weightings combine follows the list):


  1. Outputs (60%): the quality of submitted research outputs in terms of their ‘originality, significance and rigour’, with reference to international research quality standards.
  2. Impact (25%): the ‘reach and significance’ of impacts on the economy, society, culture, public policy or services, health, the environment or quality of life that were underpinned by excellent research conducted in the submitted unit.
  3. Environment (15%): the research environment in terms of its ‘vitality and sustainability’, including the approach to enabling impact from its research, and its contribution to the vitality and sustainability of the wider discipline or research base.
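To make the arithmetic concrete, here is a minimal sketch of how a weighted overall quality profile can be combined from the three sub-profiles. The sub-profile percentages below are invented purely for illustration (they are not real REF results), and the definitive rules for combining and rounding profiles are set out in the official REF guidance at www.ref.ac.uk.

```python
# Illustrative sketch only: combining hypothetical REF sub-profiles into an
# overall quality profile using the published weightings (60/25/15).

WEIGHTS = {"outputs": 0.60, "impact": 0.25, "environment": 0.15}

# Each sub-profile records the percentage of activity judged at each starred
# level (4* down to 1*, plus unclassified). These numbers are made up.
sub_profiles = {
    "outputs":     {"4*": 30.0, "3*": 45.0, "2*": 20.0, "1*": 5.0, "u/c": 0.0},
    "impact":      {"4*": 50.0, "3*": 37.5, "2*": 12.5, "1*": 0.0, "u/c": 0.0},
    "environment": {"4*": 25.0, "3*": 50.0, "2*": 25.0, "1*": 0.0, "u/c": 0.0},
}

def overall_profile(profiles, weights):
    """Weight each sub-profile by its share of the overall grade and sum per level."""
    levels = ["4*", "3*", "2*", "1*", "u/c"]
    return {
        level: round(sum(weights[part] * profiles[part][level] for part in weights), 1)
        for level in levels
    }

print(overall_profile(sub_profiles, WEIGHTS))
# roughly: {'4*': 34.2, '3*': 43.9, '2*': 18.9, '1*': 3.0, 'u/c': 0.0}
```

In broad terms, it is this weighted profile, alongside the volume of staff submitted, that then feeds into funding decisions - which is why even small shifts in the weightings matter so much to institutions.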


Taking a Closer Look at the Categories - Are We Focusing Enough on Research Impact?


Following the 2014 exercise, a formal review was carried out to improve and evolve the REF process, and it made a number of recommendations. Most notably, the weighting for “impact” was increased by five percentage points, with “outputs” reduced by the same amount. This is certainly a recognition that the difference research makes beyond academia matters more - but is it enough? Should there be greater emphasis on the return on investment from the perspective of beneficiaries and users?


Many argue that academic research should retain a strong element of “blue sky” experimentation - where clear evidence of impact may take years, even decades, to emerge and so cannot demonstrate such immediate value.


One notable consequence of the pandemic’s effect on REF deadlines is an extended assessment period for ‘proof of impact’, which now runs from 1 August 2013 to 31 December 2020 rather than ending on 31 July 2020 as originally planned. The extension was put in place to enable case studies affected by, or focusing on the response to, COVID-19 to be assessed in REF 2021.


Going back to the original question: how should we measure faculty expertise? It will be interesting to monitor the views and responses of university leaders and faculty members at the end of this week as to whether - standing back from it all - they feel this UK-centric method of measurement is the best that can be done, a neat compromise, or not really what is needed.


For more information on the Research Excellence Framework visit www.ref.ac.uk/


Justin Shaw

Justin is UK and Ireland Development Director for ExpertFile and Chief Higher Education Consultant at Communications Management. An authority on University strategy and communications, he has worked in and with leadership teams at UK universities for over 30 years. In his role he has advised universities on how to promote their expertise and on communications strategies related to the REF.


