The Many Faces of Expertise: There's more than one way to define an expert!

Dec 10, 2019

3 min

Peter Evans



As the lead researcher at ExpertFile, I spend a lot of my time thinking about the nature of expertise. I often get asked about the criteria we use to identify experts, and it’s actually a very difficult concept to pin down. One of my favourite services we provide to our clients is sitting down with them and helping them articulate what exactly “expertise” means to them. Sometimes this is an easy question; sometimes it is very, very difficult. The fact remains that there is no single definition of expertise that perfectly encapsulates every organization’s requirements.


How do you begin to select the experts you want to showcase? Well, there are many different ways of exhibiting expertise. A lot of our clients are higher education institutions seeking to promote their academics. The evidence demonstrating their unique experience in their respective fields is easily encapsulated in their academic credentials, awards, grants, publications and professional affiliations. Academics are often at the cutting edge of their fields, and are very experienced at telling meaningful stories about them. If a reporter is writing a story about the mountains of Pluto, then an astronomer who has dedicated their life to the study of trans-Neptunian objects is a perfect choice.


By contrast, expertise in the corporate realm is based more on personal experience than on academic credentials. This significantly differentiates corporate expert marketing programs from those in higher education. Although education and credentials remain important, they are generally weighed alongside direct personal experience with the topic at hand. For example, a senior engineer at Google (with extensive media experience) would obviously be a tier-A source for explaining the science behind contemporary web search engines, despite potentially having no university education in computer science! (Note: some fields in computer science, such as machine learning, have a much more academic basis.)


Both the astronomer and the computer scientist are clearly experts, but the means we used to evaluate their expertise are very different. I’m sure you can see that both academics and practitioners have meaningful comments to offer news reporters. At ExpertFile, we think that the most important skill an expert can have is the ability to distill their domain-specific knowledge into a compelling, relevant and meaningful story. This ability to tell stories transcends experience or credentials. A Harvard professor may be a Nobel Prize winner, but unless he can effectively communicate his research to the media and the public, he will not be perceived as an expert. In contrast, a professor at a local community college may have less august credentials, but if they can tell a good story, they will engage with the public and media far more effectively. In fact, in some cases “lowly” adjuncts bring vibrant life experiences and a direct understanding of the real-world applications of academic theory, and may therefore eclipse the atrophied viewpoints of some tenured faculty isolated for decades in the academy.


So at ExpertFile, we feel that representing both academics and practitioners at every level is crucial to creating a balanced marketplace of experts that is a valuable resource for all our users. Although our platform contains thousands of academics, some from top-flight institutions, we also list thousands of incredibly credible, engaging experts with real-world, contemporary experience that can often eclipse that of tenured faculty. We are very proud of our network, and we curate an active and vibrant community with a diversity of voices and perspectives on topics of breaking news interest. We don’t open our network to just anyone: the ability to distill meaningful and engaging content for others is crucial, so those who are only shilling a product or self-promoting are not welcome.


Some people think that only academics belong on an expert network, but at ExpertFile we believe that would exclude many people with extensive expertise. Similarly, it would be the height of elitism for us to restrict our database to tenured academics from top-tier universities. That would do our users a great disservice by removing those voices with perspectives gained from practical experience.





Peter Evans

Co-Founder & CEO

Recognized speaker on expertise marketing, technology and innovation

Media Trends, Thought Leadership, Marketing, Technology, Innovation
