

Researchers warn of rise in AI-created non-consensual explicit images

A team of researchers, including Kevin Butler, Ph.D., a professor in the Department of Computer and Information Science and Engineering at the University of Florida, is sounding the alarm on a disturbing trend in artificial intelligence: the rapid rise of AI-generated sexually explicit images created without the subject’s consent.

With funding from the National Science Foundation, Butler and colleagues from UF, Georgetown University and the University of Washington investigated a growing class of tools that allow users to generate realistic nude images from uploaded photos — tools that require little skill, cost virtually nothing and are largely unregulated.

“Anybody can do this,” said Butler, director of the Florida Institute for Cybersecurity Research. “It’s done on the web, often anonymously, and there’s no meaningful enforcement of age or consent.”

The team has coined the term SNEACI, short for synthetic non-consensual explicit AI-created imagery, to define this new category of abuse. The acronym, pronounced “sneaky,” highlights the secretive and deceptive nature of the practice. “SNEACI really typifies the fact that a lot of these are made without the knowledge of the potential victim and often in very sneaky ways,” said Patrick Traynor, a professor and associate chair of research in UF’s Department of Computer and Information Science and Engineering and co-author of the paper.

In their study, which will be presented at the upcoming USENIX Security Symposium this summer, the researchers conducted a systematic analysis of 20 AI “nudification” websites. These platforms allow users to upload an image, manipulate clothing, body shape and pose, and generate a sexually explicit photo — usually in seconds. Unlike traditional tools like Photoshop, these AI services remove nearly all barriers to entry, Butler said. “Photoshop requires skill, time and money,” he said. “These AI application websites are fast, cheap — from free to as little as six cents per image — and don’t require any expertise.”

According to the team’s review, women are disproportionately targeted, but the technology can be used on anyone, including children. While the researchers did not test tools with images of minors due to legal and ethical constraints, they found “no technical safeguards preventing someone from doing so.” Only seven of the 20 sites they examined included terms of service requiring image subjects to be over 18, and even fewer enforced any kind of user age verification. “Even when sites asked users to confirm they were over 18, there was no real validation,” Butler said. “It’s an unregulated environment.”

The platforms operate with little transparency, using cryptocurrency for payments and hosting on mainstream cloud providers. Seven of the sites studied used Amazon Web Services, and 12 were supported by Cloudflare — legitimate services that inadvertently support these operations. “There’s a misconception that this kind of content lives on the dark web,” Butler said. “In reality, many of these tools are hosted on reputable platforms.”

Butler’s team also found little to no information about how the sites store or use the generated images. “We couldn’t find out what the generators are doing with the images once they’re created,” he said. “It doesn’t appear that any of this information is deleted.”

High-profile cases have already brought attention to the issue. Celebrities such as Taylor Swift and Melania Trump have reportedly been victims of AI-generated non-consensual explicit images.
Earlier this year, Melania Trump voiced support for the Take It Down Act, which targets these types of abuses and was signed into law this week by President Donald Trump. But the impact extends beyond the famous. Butler cited a case in South Florida where a city councilwoman stepped down after fake explicit images of her — created using AI — were circulated online. “These images aren’t just created for amusement,” Butler said. “They’re used to embarrass, humiliate and even extort victims. The mental health toll can be devastating.”

The researchers emphasized that the technology enabling these abuses was originally developed for beneficial purposes — such as enhancing computer vision or supporting academic research — and is often shared openly in the AI community. “There’s an emerging conversation in the machine learning community about whether some of these tools should be restricted,” Butler said. “We need to rethink how open-source technologies are shared and used.”

Butler said the published paper — authored by student Cassidy Gibson, who was advised by Butler and Traynor and received her doctorate this month — is just the first step in a deeper investigation into the world of AI-powered nudification tools and an extension of the work they are doing at the Center for Privacy and Security for Marginalized Populations, or PRISM, an NSF-funded center housed at the UF Herbert Wertheim College of Engineering.

Butler and Gibson recently met with U.S. Congresswoman Kat Cammack for a roundtable discussion on the growing spread of non-consensual imagery online. In a newsletter to constituents, Cammack, who serves on the House Energy and Commerce Committee, called the issue a major priority. She emphasized the need to understand how these images are created and their impact on the mental health of children, teens and adults, calling it “paramount to putting an end to this dangerous trend.”

“As lawmakers take a closer look at these technologies, we want to give them technical insights that can help shape smarter regulation and push for more accountability from those involved,” said Butler. “Our goal is to use our skills as cybersecurity researchers to address real-world problems and help people.”


UF professor to expand proven disease-prediction dashboard to monitor Gulf threats

After deploying life-saving cholera-prediction systems in Africa and Asia, a University of Florida researcher is turning his attention to the pathogen-plagued waters off Florida’s Gulf Coast.

In the fight to end cholera deaths by 2030 – a goal set by the World Health Organization – UF researcher and professor Antar Jutla, Ph.D., has deployed his Cholera Risk Dashboard in about 20 countries, most recently in Kenya. The dashboard, an interactive web interface built on NASA and NOAA satellite imagery and artificial intelligence algorithms, pinpoints areas ripe for thriving cholera bacteria. It can predict cholera risk four weeks out, allowing early and proactive humanitarian efforts, medical preparation and health warnings.

Cholera is a bacterial disease spread through contaminated food and water; it causes severe intestinal issues and can be fatal if untreated. The U.S. Centers for Disease Control and Prevention reports between 21,000 and 143,000 cholera deaths each year globally. Make no mistake, the Cholera Risk Dashboard saves lives, existing users contend. Jutla’s team now wants to set up a similar pathogen-monitoring and disease-prediction system for pathogenic bacteria in the warm, pathogen-fertile waters of the Gulf of America.

Closer to home

Jutla is seeking funding to develop a pathogen-prediction model to identify dangerous bacteria in the Gulf and warn people – particularly rescue workers – to use protective gear or avoid contaminated areas. He envisions post-hurricane systems for the Gulf that will help the U.S. Navy, Coast Guard and other rescue workers make informed health decisions before entering the water. And he wants UF to be at the forefront of this technology. “If we have enough resources, I think within a year we should have a prototype ready for the Gulf,” said Jutla, an associate professor in UF’s Engineering School of Sustainable Infrastructure and Environment. “We want to build that expertise here at UF for the entire Gulf of America.”

Jutla and his co-investigators have applied for a five-year, $4 million NOAA RESTORE grant to study pathogens known as vibrios off Florida’s West Coast and develop the Vibrio Warning System. These vibrios in the Gulf can cause diarrhea, stomach cramps, nausea, vomiting, fever and chills. One alarming example is Vibrio vulnificus, commonly known as flesh-eating bacteria, a bacterium that often leads to amputations or death. The Centers for Disease Control and Prevention (CDC) has reported increases in vibrio infections in the Gulf region, particularly from 2000 to 2018. The warm and ecologically sensitive Gulf waters provide a thriving habitat for harmful pathogens.

“The grant builds directly on the success of our cholera-prediction system,” Jutla noted. “By integrating AI technologies into public health decision-making, we would not only lead the nation but also become self-reliant in understanding the movement of environmentally sensitive pathogens, positioning ourselves as global leaders.”

Learning from preparing early

Jutla’s dashboards are critical tools for global health and humanitarian officials, said Linet Kwamboka Nyang’au, a senior program manager for the Global Partnership for Sustainable Development Data.
“Its timeliness, its predictiveness and its ease of access to the right data is a game changer in responding to outbreaks and preventing potentially catastrophic occurrences,” Kwamboka Nyang’au said.

Over the last few years, Jutla and several health and government leaders have been working to deploy the cholera-predictive dashboard. “Our partnership with UF, the government of Kenya and others on the cholera dashboard is a life-saving mission for high-risk, extremely vulnerable populations in Africa. By predicting potential cholera outbreaks and coordinating multi-stakeholder interventions, we are enabling swift action and empowering local governments and communities to prevent crises before they unfold,” said Davis Adieno, senior director of programs for the Global Partnership for Sustainable Development Data.

The early warnings for waterborne pathogens also allow the United Nations time to issue early assistance to residents in the outbreak’s path, said Juan Chaves-Gonzalez, a program advisor with the United Nations’ Office for the Coordination of Humanitarian Affairs. “There are several things we do with the money ahead of time. We provide hygiene kits. We repair and protect water sources. We start chlorination, we set up hand-washing stations, train and deploy rapid-response teams. At the community level, we try to inject funding to procure rapid-diagnostic tests,” he said. “We identify those very, very specific barriers and put money in organizations’ hands in advance to remove those barriers.”

Eyes on the Gulf

In the United States, hurricanes stir up vibrios in the Gulf, posing a high risk of infection for humans in the water. There has been a nearly 200% increase in these cases over the last 20 years in the U.S., according to the CDC. “After Hurricane Ian, we saw a very heavy presence of these vibrios in Sarasota Bay and the Charlotte Bay region. Not only that, but they were showing signs of antibiotic resistance. Last year, we had one of the largest numbers of cases of vibriosis in the history of Florida,” Jutla said.

Samples from 2024 Hurricanes Helene and Milton are being analyzed with AI and complex bioinformatics algorithms. “If there is a risky operation by rescue personnel, not using personal protective equipment, then we would want them to know there is a significant concentration of these bacteria in the water,” Jutla said. “As an example, Navy divers operating in contaminated waters are at risk of infections from vibrios and other enteric pathogens, which can cause severe gastrointestinal and wound infections.”

Safety and economics

“Exposure to vibrios and other enteric pathogens,” Jutla added, “can disrupt economic activities, particularly in coastal regions that are dependent on tourism and fishing. And vibrios may be considered potential bioterrorism agents due to their ability to cause widespread illness and panic.”

In developing the Vibrio Warning System, Jutla noted, he and his team want to significantly enhance public health safety and preparedness along the Gulf Coast. By leveraging advanced AI technologies, satellite datasets and predictive modeling, they plan to mitigate the risks posed by environmentally sensitive pathogenic bacteria, ensuring timely interventions and safeguarding human health and economic activities.
“Hospital systems and healthcare providers in the Gulf region will have a tool for anticipatory decision making on where and when to anticipate illness from these environmentally sensitive vibrios, and issue a potential warning to the general public,” he said. “With the potential to become a leader in environmental pathogen prediction, UF stands at the forefront of this critical research, poised to make a lasting impact on local, regional, national and global health and safety.”
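The article does not spell out Jutla’s model, but the general shape of an environmental early-warning system can be sketched in a few lines: combine satellite-derived variables into a risk score for each grid cell and flag cells that cross a threshold. The sketch below is illustrative only; the variables, weights and cutoff are invented for the example and are not the dashboard’s actual algorithm.

```python
# Illustrative only: a toy risk score over satellite-derived grid cells.
# Warm, brackish coastal water is the kind of habitat the article says
# favors vibrios; the weights and cutoffs here are assumptions.
cells = [
    {"id": "sarasota_bay", "sst_c": 31.2, "salinity_psu": 18.0},
    {"id": "offshore_gulf", "sst_c": 27.5, "salinity_psu": 35.0},
]

def risk_score(sst_c: float, salinity_psu: float) -> float:
    warm = max(0.0, (sst_c - 20.0) / 15.0)             # warmer -> riskier
    brackish = max(0.0, (30.0 - salinity_psu) / 30.0)  # fresher -> riskier
    return min(1.0, 0.6 * warm + 0.4 * brackish)

for cell in cells:
    score = risk_score(cell["sst_c"], cell["salinity_psu"])
    print(f'{cell["id"]}: risk={score:.2f} {"WARN" if score > 0.5 else "ok"}')
```

A real system would feed many more variables (chlorophyll, precipitation, storm surge) into trained models rather than fixed weights, but the flag-ahead-of-time pattern is what allows warnings to go out before an outbreak.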


LSU Experts Break Down Artificial Intelligence Boom Behind Holiday Shopping Trends

Consumers are increasingly turning to artificial intelligence tools for holiday shopping—especially Gen Z shoppers, who are using platforms like ChatGPT and social media not only for gift inspiration but also to find the best prices. Andrew Schwarz, professor in the LSU Stephenson Department of Entrepreneurship & Information Systems, and Dan Rice, associate professor and director of the E. J. Ourso College of Business Behavioral Research Lab, share their insights on this emerging trend.

AI is the new front door for search

Schwarz: We’re seeing a fundamental change in how consumers find information. Instead of browsing multiple pages of results, users—especially Gen Z—are skipping to conversational AI for curated answers. That dramatically shortens the shopping journey. For years, companies optimized for SEO to appear on the first page of Google; now they’ll have to think about how their products surface in AI-generated recommendations. This may lead to a new form of “AIO”—AI Information Optimization—where retailers tailor product descriptions, metadata, and partnerships specifically for AI visibility. The companies that adapt early will have a distinct advantage in capturing consumer attention.

Rice: People being satisfied with the AI results (like a summary at the top of the Google results) and then not clicking on any of the paid or organic links leads to a huge increase in what we call “zero-click search” (for obvious reasons). For some providers, this is leading to significant drops in web traffic from search results, which can be disconcerting due to the potential loss of leads. However, to Andrew’s point about shortening the journey, it means that the consumers who do come through are much more likely to buy (quickly) because they are “better” leads. This translates to seemingly paradoxical situations for providers: they see drops in click-through rates and visitors/leads, yet revenue increases because the visitors are “better.”

There is a rise in personalized shopping journeys

Schwarz: AI essentially acts as a personal shopper—one that can instantly analyze preferences, budget, personality traits, or past behavior to produce tailored gift lists. This shifts power toward “delegated decision-making,” in which consumers allow AI to narrow their choices. Younger consumers are already comfortable outsourcing this cognitive load. However, as ads enter the picture, these personalized journeys could be shaped by incentives that aren’t always transparent. That creates a new responsibility for platforms to disclose when suggestions are sponsored and for users to develop a more critical lens when interacting with AI-driven recommendations.

Rice: This is also a great point. The “tools” marketers use to attract customers are constantly evolving, but this seems in many ways to be the next iteration of the Amazon.com suggestions you find at the bottom of a product page (“buy all x for $” or “consumers also looked at…,” etc.), based on past histories of search and purchase. One of the main differences is that you can now create virtually limitless ways to compare products, making comparisons less taxing (reducing cognitive load and stress), which may, in some cases, increase the likelihood of purchase. These idiosyncratic comparisons and prompts lead to the truly unique journeys Andrew is discussing. You no longer have to be beholden to a retailer-specified price range.
You could choose your own, or instead ask an AI to list the products representing the best “value” based on consumer reviews, perhaps by asking it to list the top ten products by cost per star rating (see the sketch at the end of this article).

Advertising is becoming more subtle and conversational

Schwarz: With ads woven directly into AI responses, the traditional boundary between content and advertising blurs. Instead of banner ads, pop-ups, or clearly labeled sponsored posts, recommendations in a conversational thread may feel more like advice than marketing. This has enormous implications for consumer trust. Retailers will likely see higher engagement through these context-aware ad placements, but regulatory scrutiny may also increase as policymakers evaluate how clearly sponsored content is identified. The risk is that advertising becomes invisible—something both platform designers and regulators will need to monitor carefully.

Rice: This is definitely true. I was recently exploring an AI-based tool for choosing downhill skis, but the tool was subtly provided by a single ski brand. I’m not sure the distribution of ski brands covered was truly delivering the “best overall fit” for a potential buyer, rather than the best possible ski in that brand. At least in that case, it was somewhat disclosed. It does, however, become an issue if consumers feel misled, but they’d have to notice it first. Still, the advantages are big for retailers, and the numbers don’t lie. According to some preliminary Black Friday data, shoppers using an AI assistant were 60% more likely to make a purchase.

Schwarz: This shift is going to reshape multiple layers of the retail ecosystem. Retailers will need to rethink how they show up in AI-driven environments; traditional SEO, ad bids, and social media strategies won’t be enough, and partnerships with AI platforms may become as important as being carried by major retailers today. Because AI tools can instantly compare prices across dozens of retailers, consumers will become more price-sensitive, and retailers may face increasing pressure to offer competitive pricing or unique value propositions as AI reduces friction in comparison shopping. Retailers who integrate AI into their own websites—chat-based shopping assistants, personalized gift advisors, automated bundling—will gain an edge; consumers increasingly expect conversational interfaces, and companies that delay will quickly feel outdated. And as AI tools influence purchasing decisions, consumers and regulators alike will demand clarity around how recommendations are generated. Retailers will need to navigate this carefully to maintain trust.

What I think we are going to see accelerate as we move forward:

AI-powered concierge shopping will become mainstream. Within a couple of years, using AI to generate shopping lists, compare prices, and find deals will be as common as using Amazon today.

Retailers will create AI-specific marketing strategies. Instead of optimizing for keywords, they’ll optimize for prompts: how consumers might ask for products and how an AI system interprets those requests.

More platforms will introduce advertising into AI models. ChatGPT is simply the first mover. Once the revenue potential becomes clear, others will follow with their own ad integrations.

Greater scrutiny from policymakers. As conversational advertising grows, transparency rules and labeling requirements will almost certainly follow.
A new era of “conversational commerce.” Buying directly through AI—“ChatGPT, order this for me”—will become increasingly common, merging search, recommendation, and transaction into a single seamless experience.

I can speak to this on a personal level. My college-aged son is interested in college football, and I wanted to get him a streaming subscription to watch the games. However, the football landscape is fragmented across multiple, expensive platforms. I asked ChatGPT to generate a series of options. Hulu is $100/month for Live TV, but ChatGPT recommended a combination of ESPN+, Peacock, and Paramount+ for $400/year and identified which conferences would not be covered. What would have taken me hours only took a few minutes!

Rice: On the other hand, AI isn’t infallible, and it can lead to sub-optimal results, hallucinations, and questionable recommendations. From my recent ski shopping experience, I encountered several pitfalls. First, for very specific questions about a specific model, I sometimes received answers for a different ski model in the same brand, or for a different ski altogether, or specs I knew were just plain wrong, none of which was particularly helpful. Secondly, regarding Andrew’s point about the conversational tone, I asked questions intended to push the limits of what could be considered reliable. For example, I asked the AI to describe the difference in “feel” of the ski for the skier among several models and brands. While the AI gave very detailed and plausible comparisons that were very much like an in-store discussion with a salesperson or area expert, I’m not sure I fully trust it when an AI tells me that you can really feel the power of a ski push you out of a turn, that this ski has great edge hold, and so on. It sounds great, but where is the AI sourcing this information? I’m not convinced it’s fully accurate.

It also seems we’re starting to see Google shift toward a more AI-centric approach (e.g., AI summaries and full AI Mode). At the same time, we’re also starting to see AI migrate closer to Google as people use it for product-related chats, and companies like Amazon and Walmart have developed their own AI that is specifically focused on the consumer experience. I can’t imagine it will be long before companies like OpenAI and their competitors start “selling influence” in AI discussions to monetize the influence their engines will have.
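Rice’s “cost per star rating” comparison is easy to picture as a computation. Here is a minimal sketch with a made-up catalog; only the ranking metric comes from his comment, and the products, prices and ratings are invented.

```python
# Hypothetical catalog: rank products by dollars per star of average rating,
# the kind of idiosyncratic comparison an AI assistant can run on request.
products = [
    {"name": "Ski A", "price": 699.00, "rating": 4.8},
    {"name": "Ski B", "price": 449.00, "rating": 4.2},
    {"name": "Ski C", "price": 549.00, "rating": 4.7},
]

for p in sorted(products, key=lambda p: p["price"] / p["rating"]):
    print(f'{p["name"]}: ${p["price"] / p["rating"]:.2f} per star')
```

The point is not the formula itself but that shoppers can now define any metric conversationally and have the assistant apply it across an entire catalog.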


LSU’s Jill Trepanier Educating K-12 Louisiana Students About the Environment

What began in 2018 as a single rooftop weather station on LSU’s campus, a tool to help freshmen connect to the science happening around them, has grown into an educational network in the southern part of the state, connecting K-12 students with the sky through real-time data, interactive technology, and hands-on learning.

Jill Trepanier, a professor and department chair in LSU’s Department of Geography & Anthropology, leads a project that now includes 10 weather stations installed at or near K-12 schools from Lake Charles to Grand Isle. “The environment is harsh in Louisiana. Beautiful, but harsh,” Trepanier said. “The more students know about it, the better they can protect themselves and their families. For me, that’s what it is all about.”

The project started as a way to help college students in Trepanier’s meteorology and physical geography classes connect more deeply with the material by using weather data collected from the air around them. “These were 400 freshmen every semester who were not geography majors, so they didn’t really love the science of the atmosphere. But they were able to connect with the information because they could see the data on an app on their phone as they were living in it.”

Installed across South Louisiana, each weather station is solar-powered and connected to a console that uploads data to an online web platform and displays it on a dashboard. An app then shows local students the current conditions and records for the day. “When we look at data from the community, it might be many miles from where you are. And most people live within a few miles or less of their schools. It allows them a close-up view of what is happening, instead of relying on something miles away,” she said.

Teachers can use the data with certain lessons or during a passing storm. But the available data also educates them on things like solar radiation. “It also helps aid things like seasonality and our relationship with the sun. It extends well beyond just rain,” Trepanier said.

The material is also aligned with the Louisiana Student Science Standards for environmental and Earth sciences. “By allowing students to compare real data across space and time, it helps them to understand how systems are connected. And most of these science standards have them focusing on system theory, in one way or another,” Trepanier said.

Read the full story here.


LSU AgCenter Research Enables Better Flood Protection for Homes

The American Society of Civil Engineers (ASCE) recently released its new standard for flood-resistant design and construction, ASCE/SEI 24-24, which provides new minimum requirements that can be adopted for all structures subject to building codes and floodplain management regulations. The new elevation standard was directly supported by LSU research and should help reduce flood risk and make flood insurance more affordable.

“Without the research by the LSU AgCenter, the advancements made to the elevation requirements would not have been possible,” said Manny Perotin, co-chair of the Association of State Floodplain Managers’ Nonstructural Floodproofing Committee, who helped update the standard. “Dr. Carol Friedland’s research shows there are better ways to protect communities from flooding than adding one foot of additional freeboard.”

The research team, led by Friedland, an engineer, professor, and director of the LSU AgCenter’s LaHouse, showed how previous standards were failing to protect some homeowners. They mapped the impact of moving from a standard based on a fixed freeboard amount to one based on real risk in every census tract in the U.S. In response to these findings, they developed a free online tool to help builders, planners, managers, and engineers calculate the elevation required under the new standards.

“Many on the committee said it would be too hard to do these complex calculations,” said Adam Reeder, principal at the engineering and construction firm CDM Smith, who helped lead the elevation working group for the new ASCE 24 elevation standards. “But the LSU AgCenter’s years of research in this area and the development of the tool makes calculations and implementation simple. This allowed the new elevation standard to get passed.”

Flooding, the biggest risk to homes in Louisiana, continues to threaten investments and opportunities to build generational wealth. On top of flood losses, residents see insurance premiums increase without resources to help them make informed decisions and potentially lower costs. In response to this problem, Friedland is working on developing a whole suite of tools together with more than 130 partners as part of a statewide Disaster Resilience Initiative.

When presenting to policymakers and various organizations, Friedland often starts by asking what percentage of buildings they want to flood in their community in the next 50 years. “Of course, we all want this number to be zero,” Friedland said. “But we have been building and designing so 40% will flood. People have a hard time believing this, but it’s the reality of how past standards did not adequately address flood risk.”

Designing to the 100-year elevation means a building has a 0.99 probability of not flooding in any given year. But compounding that probability over 50 years (0.99 x 0.99 x 0.99… 50 times, or 0.99^50) yields only a 60.5% chance of not flooding in 50 years. That means a 39.5% chance of flooding at least once. “We’ve been building to the 100-year elevation while wanting the protection of building to the 500-year elevation, which is a 10% chance of flooding in 50 years,” Friedland said. “Now, with the higher ASCE standard, we can finally get to 10% instead of 40%.”

As the AgCenter’s research led to guidelines, then to this new standard, Friedland has also been providing testimony to the International Code Council to turn the stronger standard into code.
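As a quick check of the 100-year arithmetic above, a few lines of Python reproduce both figures. This is a minimal sketch; the 1% and 0.2% annual chances are the standard definitions of the 100-year and 500-year elevations, and the 50-year horizon comes from the article.

```python
# Chance of flooding at least once over a horizon of years, given the
# annual exceedance probability (1% for the "100-year" elevation,
# 0.2% for the "500-year" elevation).
def chance_of_flooding(annual_probability: float, years: int = 50) -> float:
    return 1 - (1 - annual_probability) ** years

print(f"100-year elevation: {chance_of_flooding(0.01):.1%}")   # 39.5%
print(f"500-year elevation: {chance_of_flooding(0.002):.1%}")  # ~9.5%, the "10%" cited above
```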
In May, Friedland helped lead a workshop at the Association of State Floodplain Managers’ national conference, held in New Orleans. There, she educated floodplain managers about the new standard and demonstrated LSU’s web-based calculation tool, which was designed for professionals; her team is also developing personalized decision-making tools, such as Flood Safe Home, for residents. At the conference, Friedland received the 2025 John R. Sheaffer Award for Excellence in Floodproofing.

More than two-thirds of the cost of natural hazards in Louisiana comes from flooding, according to LSU AgCenter research conducted in partnership with the Governor’s Office of Homeland Security and Emergency Preparedness for the State Hazard Mitigation Plan. That cost was recently estimated to rise to $3.6 billion by 2050.

“Historically, we have lived with almost a 40% chance of flooding over 50 years, which in most people’s opinion is too high—and the number could be even higher,” Reeder said. “Most building owners don’t understand the risk they are living with, and it only becomes apparent after a flood. The work done by the LSU AgCenter is critical in improving resilience in communities that can’t afford to be devastated by flooding.”

“This may be the most significant upgrade in the nation’s flood loss reduction standards since the creation of the National Flood Insurance Program minimums in 1973, and it could not come at a better time as annual flood losses in the country now average more than $45 billion per year,” said Chad Berginnis, executive director of the Association of State Floodplain Managers.

In addition to LaHouse’s work to prevent flooding, Friedland’s team is also working to increase energy efficiency in homes to help residents save money on utility bills. Their HEROES program, an acronym for home energy resilience outreach, education, and support, is funded by the U.S. Department of Agriculture and has already reached 140,000 people in Louisiana.

Article originally posted here.


The Impact of Counterfeit Goods in Global Commerce

Introduction

Counterfeiting has been described as “the world’s second oldest profession.” In 2018, worldwide counterfeiting was estimated to cost the global economy between USD 1.7 trillion and USD 4.5 trillion annually, as well as causing more than 70 deaths and 350,000 serious injuries each year. It is estimated that more than a quarter of US consumers have purchased a counterfeit product. The counterfeiting problem is expected to be exacerbated by the unprecedented shift in tariff policy. Tariffs, designed as an import tax or duty on an imported product, are often a percentage of the price and can have different values for different products. Tariffs drive up the cost of imported brand name products but may not impact the cost of counterfeit goods, or may do so only to a lesser extent. In this article, we examine the extent of the global counterfeit dilemma, the role experts play in tracking and mitigating the problem, the use of anti-counterfeiting measures, and the potential impact that tariffs may have on the flow of counterfeit goods.

Brand goods have always been a target of counterfeiters due to their high price and associated prestige. These are often luxury goods and clothing, but can also be pharmaceuticals, cosmetics, and electronics. The brand name is an indication of quality materials, workmanship, and technology. People will pay more for the “real thing,” or decide to buy something cheaper that looks “just as good.” In many cases, “just as good” is a counterfeit of the brand name product.

A tariff is an import tax or duty that is typically paid by the importer and can drive up the cost of imported brand name products. For example, a Yale study has shown that shoe prices may increase by 87% and apparel prices by 65% due to tariffs. Counterfeit products, on the other hand, don’t play by the rules and can often avoid paying tariffs, as in the case of many smaller online transactions shipped individually. Therefore, we expect to see an increase in counterfeit products as well as a need to increase efforts to reduce the economic losses of counterfeiting.

The Scale of the Counterfeit Problem

In their 2025 report, the Organisation for Economic Co-operation and Development (OECD) and the European Union Intellectual Property Office (EUIPO) estimated that in 2021, “global trade in counterfeit goods was valued at approximately USD 467 billion, or 2.3% of total global imports. This absolute value represents an increase from 2019, when counterfeit trade was estimated at USD 464 billion, although its relative share decreased compared to 2019 when it accounted for 2.5% of world trade. For imports into the European Union, the value of counterfeit goods was estimated at USD 117 billion, or 4.7% of total EU imports.” In a 2020 report, the US Patent and Trademark Office (USPTO) estimated the size of the international counterfeit market as having a “range from a low of USD 200 billion in 2008 to a high of USD 509 billion in 2019.”

According to the OECD/EUIPO General Trade-Related Index of Counterfeiting for economies (GTRIC-e), China continues to be the primary source of counterfeit goods, followed by Bangladesh, Lebanon, the Syrian Arab Republic, and Türkiye. Based on customs seizures in 2020-21, the most common items are clothing (21.6%), footwear (21.4%), and handbags, followed by electronics and watches. Based on the value of goods seized, watches (23%) and footwear (15%) had the highest value.
However, it should be noted that items that are easier to detect and seize are likely to be overrepresented in the data. Although the share of watches declined while electronics, toys, and games increased, it remains unclear whether this represents a long-term trend or just a short-term fluctuation. In general, high-value products in high demand continue to be counterfeited.

Data from the US Library of Congress indicates that 60-80% of counterfeit products are purchased by Americans. The US accounts for approximately 5% of the world’s consumers; however, it represents greater than 20% of the world’s purchasing power. Though it is still possible to find counterfeit products at local markets, a large number of counterfeit goods are obtained through online retailers and shipped directly to consumers as small parcels classified as de minimis trade, which allows for the duty-free import of products up to USD 800 in value. Counterfeit items may be knowingly or unknowingly purchased this way and arrive duty-free through postal services. Approximately 79% of packages seized contained fewer than 10 items. Given the size and volume of the packages arriving daily, many or most will evade scrutiny by customs officials. This means of import is increasing over time: in 2017-19 it accounted for 61% of seizures; by 2020-21, it was 79%.

Economic Impact of Counterfeiting

The scale of the counterfeiting problem has significant impacts on the US economy, US business interests, and US innovation, in lost sales and lost jobs. Moreover, counterfeit products are often made quickly and cheaply, using materials that may be toxic. The companies producing these goods may not dispose of waste properly and may dump it into waterways, causing significant environmental consequences. Counterfeit products, from electrical equipment and life jackets to batteries and smoke alarms, may be made without adhering to safety standards or being properly tested. These products may fail to function when needed and may lead to fire, electric shock, poisoning, and other accidents that can seriously injure and even kill consumers. Counterfeit cosmetics and pharmaceuticals can also lead to injuries, either by including unsafe ingredients or by failing to provide the benefits of the real product.

The Tariff Counterfeit Connection

Tariffs may be seen as a tax on consumers, and they raise the price of imported products that are already the target of counterfeiters, such as luxury leather products and apparel. Raising prices on genuine products tends to drive up the demand for counterfeit goods: consumers will have less disposable income, and the brand goods they desire will cost more. Although recent changes removing the USD 800 tax exemption on de minimis shipments from China and Hong Kong will make it more expensive for counterfeiters to ship their goods internationally, tariffs are typically applied as a percentage of the cost of an item. This causes the price of more expensive legitimate goods to increase by more, in absolute terms, than that of cheaper counterfeit goods, likely making counterfeit products even more attractive economically. Therefore, we expect to see an increase in counterfeit products as well as an increase in efforts to reduce the economic losses of counterfeiting.
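A small worked example makes the widening-gap argument concrete. The prices and tariff rate below are hypothetical, chosen only to illustrate the mechanism described above.

```python
# Hypothetical numbers: a percentage tariff paid on the genuine import
# but evaded by a counterfeit shipped de minimis widens the price gap.
genuine_price = 200.00     # landed cost of the brand-name product (assumed)
counterfeit_price = 60.00  # counterfeit that evades the duty (assumed)
tariff_rate = 0.25         # 25% tariff, illustrative only

genuine_after = genuine_price * (1 + tariff_rate)
print(f"Gap before tariff: ${genuine_price - counterfeit_price:.2f}")  # $140.00
print(f"Gap after tariff:  ${genuine_after - counterfeit_price:.2f}")  # $190.00
```

The same percentage applied to a higher base raises the genuine product’s price by more dollars, so the counterfeit’s relative discount grows even though the counterfeit’s own price never changes.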
The Role of Technical Experts in Counterfeit Detection

Technical experts play an important role in both the prevention and detection of counterfeits and in helping to identify counterfeiting entities. Whether the target is counterfeit money, clothing, shoes, electronics, cosmetics, or pharmaceuticals, the first step in fighting counterfeits is detecting them. In some cases, the counterfeit product is obvious: a leather product may not be leather, a logo may be wrong, packaging may have a spelling mistake, or a holographic label may be missing. These products may be seized by customs. However, some counterfeit products are very difficult to detect. In the case of a counterfeit memory card with less than the stated capacity, or a pharmaceutical that contains the wrong active ingredient, technical analysis may be needed to identify the parts. Technical analysis may also be used to try to identify the source of the counterfeit goods.

As a preventive measure, manufacturers may embed radio frequency identification (RFID) or Near Field Communication (NFC) tags within their products. RFID tags are tiny semiconductor chips attached to a metallic printed antenna. The tag itself may be flexible and easy to incorporate into packaging or into the product itself. A passive RFID tag requires no internal power source and has sufficient storage for information such as product name, stock keeping unit (SKU), place of manufacture, and date of manufacture, as well as some sort of cryptographic information to attest to the authenticity of the tag. A simple scanner powers the tag using an electromagnetic field and reads it. If manufacturers include RFID tags in products, an X-ray to identify a product in a de minimis shipment (perhaps using artificial intelligence technology) and an RFID scanner to verify the authenticity of the product can be used to efficiently screen a large number of packages.

Many products may also be marked with photo-luminescent dyes with unique properties that can be read by special scanners, allowing authorities to identify legitimate products. Similarly, doped hybrid oxide particles with distinctive photo-responsive features may be printed on products; when exposed to laser light, these particles experience a rapid, quickly detectable increase in temperature. In either case, the ability to identify legitimate products, or – due to the absence of marking – to track counterfeit products, allows authorities to map the flow of counterfeit goods through the supply chain as they are manufactured, shipped, exported, and imported.

For many years, electronic memory cards such as SD cards and USB sticks have been counterfeited. In many cases, the fake card will have a capacity much smaller than listed; for example, a 32GB memory card for a camera may only hold 1GB. Sometimes these products can be identified by analyzing the packaging for discrepancies from the brand name products. In other cases, software must be used to verify the capacity and performance of each one, which is time-consuming when analyzing a large number of products.

Forensic investigators, including forensic accountants and forensic technologists, are heavily involved in efforts to combat this illicit trade. By analyzing financial records, supply-chain data, and transaction histories, they trace the origins and pathways of counterfeit products. Their work often involves identifying suspicious procurement patterns, shell companies, and irregular inventory flows that signal counterfeit activity.
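The cryptographic attestation mentioned in the RFID discussion above can be as simple as a keyed hash over the tag’s stored fields, checked by the scanner. The sketch below is a minimal illustration of that idea, not a description of any deployed RFID scheme; the field names and key handling are assumptions for the example.

```python
import hashlib
import hmac

# Hypothetical fields a manufacturer might write to an RFID/NFC tag.
tag_fields = {
    "product": "Designer Handbag",
    "sku": "DH-4411",
    "made_in": "IT",
    "made_on": "2025-03-14",
}

SECRET_KEY = b"manufacturer-signing-key"  # in practice, closely protected

def sign(fields: dict) -> str:
    message = "|".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return hmac.new(SECRET_KEY, message.encode(), hashlib.sha256).hexdigest()

def verify(fields: dict, mac: str) -> bool:
    return hmac.compare_digest(sign(fields), mac)

mac = sign(tag_fields)          # written to the tag at manufacture
print(verify(tag_fields, mac))  # True: untampered tag authenticates
tag_fields["sku"] = "DH-9999"
print(verify(tag_fields, mac))  # False: altered or cloned payload fails
```

Real deployments would use per-tag keys or challenge-response so a copied tag cannot simply be replayed, but the verify-at-the-scanner pattern is the same.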
Forensic investigators often begin by mapping the counterfeit supply chain, an intricate web that often spans continents. Using data analytics, transaction tracing, and inventory audits, they identify anomalies in procurement, distribution, and sales records. These methodologies help pinpoint the origin of counterfeit goods, the intermediaries involved, and the final points of sale. By reconstructing the flow of goods and money, forensic investigators can begin to unmask illicit activities.

Cross-border partnerships are essential for tracking assets, sharing insights, and coordinating with financial regulators. Public-private partnerships further enhance the effectiveness of anti-counterfeiting efforts. Forensic investigators often serve as bridges between government agencies, brand owners, and financial institutions, facilitating the exchange of key information. These partnerships increase information-sharing, streamline investigations, and amplify the impact of enforcement actions. A promising development in this space is the World Customs Organization’s Smart Customs Project, which integrates artificial intelligence to detect and intercept counterfeit goods. Forensic investigators can leverage this initiative by analyzing AI-generated alerts and incorporating them into broader financial investigations, allowing for faster and more accurate identification of illicit networks.

Jurisdictional complexity is a major hurdle in anti-counterfeiting efforts. Forensic investigators work closely with legal teams to navigate these challenges, ensuring that investigations comply with local laws and that evidence is admissible and can withstand scrutiny in court, especially when dealing with offshore accounts and international money laundering schemes. Forensic investigators follow the money, tracing illicit profits through bank accounts, shell companies, and cryptocurrency transactions. Their findings not only help recover stolen assets but also support disputes by providing expert testimony that quantifies financial losses and identifies the bad actors.

Conclusion

Imitations of brand name products have become more convincing and harder to detect, and the sources of counterfeit goods have become more difficult to identify. While counterfeiting clearly has evolved because of technological advancements, e-commerce, and the growing sophistication of bad actors, the problem has now been complicated even further by the unpredictable tariff and trade policies that are affecting businesses worldwide. Consequently, companies need to take a multi-faceted approach to the new challenges that tariffs introduce into the counterfeiting of products. By engaging high-tech product authentication measures, utilizing technology-based alerts about counterfeits, and retaining the specialized skills of forensic investigators and other experts, companies will be able to navigate the risks posed by the complex and changing relationship between tariffs and counterfeit goods.

5 Reasons "Expertise Marketing" Programs Fail. featured image

5 Reasons "Expertise Marketing" Programs Fail.

As a company dedicated to “Expertise Marketing,” we work with some of the largest organizations, from higher education and healthcare to top global corporate brands. What these organizations have in common are smart, educated professionals…and a lot of them. These are the types of individuals who would be valuable ambassadors, true thought leaders, helping you deliver on your organization’s reputational and revenue goals. Instinctively, marketing and communications teams recognize the intrinsic value of this human capital and have created a variety of “Thought Leadership” and “Expert Marketing and Directory” initiatives. The overriding objective is to connect their experts to audiences that matter, seeking opportunities ranging from acting as media sources to event speakers to providing a valuable entry point for research and business collaboration, even lead generation.

One of the most effective approaches, and the starting point for any expertise marketing program, is better profiling your experts and their related insights on your website. Building out and leveraging this expert content is at the core of most expertise marketing efforts. Despite the promise these web initiatives offer, most programs don’t deliver the results organizations were hoping for. Success most often has nothing to do with how smart your people are. Some of the largest organizations with deep rosters of expertise fail where smaller organizations consistently punch above their weight. When creating an expertise presence on your website, there are important areas to consider. The following are the top 5 reasons many expertise marketing programs fail, and how to maximize your success.

Reason #1: You’re missing critical team members

There is no “going it alone” when starting a program like this. Having the following individuals onboard at the start is crucial. Don’t worry, these aren’t all full-time resources by any means. As your program progresses, these individuals may come in and out in terms of importance, but having access to them over the lifetime of your program will positively impact your success. At the core, you need access to the following individuals.

Program Champion - Having a senior leader as a champion is pretty much table stakes for any successful company-wide initiative such as this. Someone who can articulate to others, both up and down the organization, how this initiative fits into the broader long-term goals of the organization is imperative. Failure to establish this individual upfront puts your program’s future at the whim of shifting priorities and budget cuts.

Marketing/Communications - You need someone with ongoing responsibility for maintaining and promoting your roster of experts and their content. This ensures your most relevant experts are showcased at the right time to meet the changing demands of your audiences and the news cycle.

Digital/Web - You need someone with the keys to the website/CMS. Ensure you have connections to people who control not only your small area of the website, such as a newsroom or department-level webpages, but also those who have access to the layouts and navigation of the broader website. The latter is important as it helps prevent your expert content from becoming isolated and disconnected from the rest of your website.

IT - The level of involvement of IT is highly dependent on how you’re looking to implement your expert content on your website.
By leveraging a variety of content implementation tools, from simple “cut and paste” embeds to WordPress plugins, you can severely limit the necessity to involve IT. However, depending on your budget and goals, IT can leverage a platform’s API, accessing advanced layouts and functionality, including integrating with other systems your organization may already be using.

Engaged Experts - Last but not least, having your experts on board is critical. By properly communicating upfront and on an ongoing basis with your experts about the goals of the program, you help ensure your content best represents the talent that lies within.

We realize it is often difficult and sometimes cost-prohibitive to assemble such a team. If you don’t have access to all these members in-house, it is important that you access them through an external partner’s professional services offerings. This could include assistance with building out content such as profiles and posts, or technical assistance in integrating this content into your website.

Reason #2: You’re relying too much on IT for implementation or updating

To be successful long term, it is important that key owners of the expertise marketing program feel empowered to take control of their expert content. From creation to ongoing management, those in marketing communications roles and others closest to their organization’s expertise need the flexibility to update content in real time to remain relevant and up-to-date. Being able to quickly log into an external platform that syncs content with your website is key. It eliminates the need for special access to your CMS or the possible requirement for IT to be in control of your updates. It also allows for a mix of individual expert and administrator access, providing the highest level of flexibility.

Often left out of IT-focused builds is how you will effectively handle inquiries. Simply showing emails and phone numbers is a recipe for missed opportunities (and spam), as these experts are some of the most time-constrained individuals in your organization. Ensuring you have access to a customizable workflow feature is essential to making sure your organization doesn’t miss potential time-sensitive inquiries.

When working with IT to implement an expertise marketing program on your website, you will often be presented with a “we’ll build it for you” option vs. using a purpose-built platform. Understanding the tradeoffs of this approach is critical. One of the greatest benefits of using a SaaS platform, besides cost, is that you constantly have the most up-to-date software, with the latest features and functionality to best showcase your expertise. To learn more, download the “True Costs of DIY” to better understand the tradeoffs and functional requirements needed for success.

Reason #3: Your expert content is siloed, one-dimensional, and rarely updated

This is by far one of the biggest reasons programs fail. Well, it’s actually a number of reasons, but they all relate back to how your content will be perceived and, ultimately, how well it drives connections with interested audiences. By addressing the following, you’ll present not only better but more easily discoverable expert content that drives inquiries.

You have boring, unengaging profiles for your experts - Before people feel comfortable reaching out, they need a good sense of the person. Profiles that lack media assets such as video, publications, and even podcasts are one-dimensional.
Furthermore, showcasing past media and event appearances provides an enhanced level of credibility.

Focused solely on a directory & profiles - Your expertise is showcased through more than just a profile found in a directory. Adding long-form posts where experts can share their insights, and even expert-focused Q&A (download the report on “The Power of Q&A”), provides audiences additional ways to connect with your experts. Ensuring all these additional assets link back to your profiles provides more insight into the person behind the expertise.

No main website navigation - Despite adding menu navigation on a specific web page, such as a newsroom or About Us page, most organizations neglect to add navigation to their main website’s menu structure. You can never assume visitors will know where this content resides. We recommend multiple links in both headers and footers to your expert content. Names such as “Find Experts,” “Media Sources” or “Research Experts” are some of the most common, accessible from overall menu items like “About Us,” “News” or “Research.”

Expert content stuck in one small area of your website - If you restrict your expert content to just one area, you’re making discovery that much harder and limiting exposure of the breadth of expertise you have in-house. Highlight your experts and expertise on your homepage or in key sections of your website. Refine the experts and insights found in posts or Q&A by tagging them based on specific topics, and showcase just those experts in various areas of your website. Using a dedicated SaaS platform means that when you update content, it updates everywhere, making changes quick and easy.

Expert content never gets updated - This is a big issue for organizations that build in-house or through their CMS. Visitors can quickly tell that the content isn’t fresh, and it reflects poorly on the individual and the organization as a whole. The key to ensuring content is maintained is to provide multiple access capabilities, where admins (internal or external) and the experts themselves can maintain the content.

Failure to respond in a timely manner to inquiries - Displaying content that exposes phone numbers and emails of your experts is not the best approach, both from a privacy and a timely-communications standpoint. Without an advanced inquiry workflow that alerts multiple members of your team, you risk missing out on time-sensitive requests such as those from journalists.

Reason #4: You haven’t considered everything needed to win the SEO game

Building out content on the web without a plan for how external and internal search engines will interact with your pages is a big mistake. Organic search can play a big role in discovery, leading to valuable opportunities. Before you consider your new expert content pages ready, ensure you’ve taken into account the following.

Proper Meta Data - Do your expert profile pages have dynamically created titles, descriptions and keywords that automatically adjust to changes in areas such as an individual’s expertise?

Schema Data - Do you have proper schema tags that indicate to Google and other search engines the type of content displayed, as well as the credibility of both the individual and the organization behind it?

Sitemaps - Have you ensured all your pages have been added to your sitemap? Is it automatically updated when new experts or pieces of expert content are added?

Google Search Console - Are you pushing pages directly to Google by requesting that important new content be updated in the search index?
For more info on better SEO, read my Spotlight “Why Expertise Ranks Higher.”

Reason #5: You’re not doing enough to actively promote your expertise… a “they’ll just find us” approach usually fails

It’s like owning a Porsche and leaving it in the garage…pretty to look at, but you’re not realizing its full potential. Simply putting your expert content on a web page is only the start. Successful organizations actively distribute these assets, sharing links to profiles and other content elements like news posts or Q&A in a variety of ways.

Social Media Channels - They start by promoting these assets on their social media channels, from their Twitter feeds to Facebook and LinkedIn posts.

Media Distribution Software - Whether you use systems like Cision or Meltwater, including links to expert profiles and related content when reaching out to journalists adds a layer of depth to your pitches.

Press Releases - Every time you reference your organization’s expertise, include links to additional content and individual experts for more insights and pathways to connect with real people.

It sounds like a lot, but with a bit of planning and some ongoing maintenance, a properly constructed expertise marketing program can deliver incredible results for many years. Being successful takes more than just firing up a few new web pages. However, with the advent of specialized platforms specifically designed for these programs, and a bit of guidance, it is easier than ever to create an expert content footprint on your website and deliver valuable connections for your organization.

Robert Carter

Why Your Experts Might Not Show Up in Google AI Overviews — And How to Fix It

The way we find expert information online is changing fast. With the rise of Google's AI-generated overviews (formerly called Search Generative Experience), the top spot on the search page no longer goes to the highest-ranking blue link. Instead, AI now summarizes answers using a blend of machine learning, structured data, and trust signals—pulling directly from a variety of select sources across the web. If institutions—whether academic, healthcare, corporate or others—aren't aligning their expert content with these new rules of discovery, their experts may be left out of the conversation altogether. Don't miss out on being featured in media stories, invited to speak at events, or approached for business and collaboration opportunities. This is the moment to double down on structured data and transparent authorship—because AI-first search is rewarding expert clarity, not just content volume. The following provides a quick breakdown of how AI search, Google's EEAT principles, and Schema.org structured data work together—and what you can do to ensure your expert content, and your experts, get surfaced, cited, and trusted.

What Is EEAT and Why It Matters in AI Search

EEAT stands for Experience, Expertise, Authoritativeness, and Trustworthiness—the core framework Google uses to evaluate whether content is reliable and deserves to rank, especially in high-stakes areas like health, education, and finance. In AI-powered summaries, Google doesn't just look at keywords—it looks for:

Real people with demonstrable credentials
Clear affiliations with reputable institutions
Consistent authorship and transparency
Trust signals like citations, bios, and professional history

EEAT in Action: Why Schema Markup Is Your AI SEO Power Tool

EEAT signals work best when they're machine-readable—that's where Schema.org structured data comes in. It acts as a translator between your content and Google's AI. Schema tags are pieces of structured data that help search engines understand the content and context of your web pages. They translate human-readable information—like author names, job titles, and article types—into machine-readable signals that boost visibility in AI overviews and search results. Implementing schema helps ensure your expert content is eligible for inclusion in AI overviews. Key schema types include (a worked example appears below):

Person – for expert bios
ScholarlyArticle, Article, FAQ – for authored content
Organization, MedicalOrganization, EducationalOrganization – to establish credibility
sameAs – to reinforce expertise by connecting external profiles (LinkedIn, ORCID, Google Scholar)

Schema in Action: AI Overviews Favor Structured, Credible Expert Content

Google's AI overviews are designed to synthesize trustworthy sources—not just surface-level blog posts or SEO-churned pages. That means expert content that is:

Authored by named individuals with clear credentials
Structured for readability and machine parsing
Linked to institutional authority and trusted domains

If your experts don't meet these criteria—or if Google's crawlers can't understand the relationships between person, organization, and content—your insights may never reach the surface of the AI summary box.
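To make the schema types above concrete, here is a minimal sketch of a Person record with sameAs links, built in Python and emitted as JSON-LD. Every name, URL, and affiliation in it is a hypothetical placeholder, not real data.

```python
import json

# A minimal Person record using the schema types named above.
# All names, URLs, and identifiers below are hypothetical placeholders.
person = {
    "@context": "https://schema.org",
    "@type": "Person",
    "name": "Dr. Jane Doe",
    "jobTitle": "Professor of Health Policy",
    "affiliation": {
        "@type": "EducationalOrganization",
        "name": "Example University",
        "url": "https://www.example.edu",
    },
    # sameAs ties the profile to external trust signals like ORCID,
    # LinkedIn, and Google Scholar.
    "sameAs": [
        "https://orcid.org/0000-0000-0000-0000",
        "https://www.linkedin.com/in/janedoe",
        "https://scholar.google.com/citations?user=EXAMPLE",
    ],
}

# Embed the result in a page as a JSON-LD script block.
print('<script type="application/ld+json">')
print(json.dumps(person, indent=2))
print("</script>")
```

The resulting script block would sit in the page's HTML alongside the human-readable bio, giving crawlers an unambiguous statement of who the person is, where they work, and which external profiles corroborate their credentials.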
How ExpertFile Optimizes for AI-Driven Search

AI search is no longer just about keywords—it's about credibility, structure, and clarity. Institutions that invest in properly structured expert content will not only rank better—they'll become the source quoted in the next generation of search. ExpertFile is purpose-built to maximize visibility and trust in this new era of AI search. Here's how:

Structured Expert Profiles: Every expert has a dedicated page with rich Person schema, bios, credentials, affiliations, and publication history.
Schema-Tagged Content: Articles, media spotlights, and FAQs are marked up using Schema.org types like ScholarlyArticle, FAQPage, and Article.
Institutional Credibility: Profiles are embedded within .edu, .org, or corporate domains—reinforcing trust with Google's algorithms.
Cross-Linked Authority: Integration with Google Scholar, LinkedIn, and ORCID ensures a 360° trust profile across the web.
Mobile-Ready & Indexed: ExpertFile content is fully indexable and distributed across web and mobile platforms—supporting discoverability everywhere AI pulls from.

With ExpertFile, your experts are not just listed—they're positioned, structured, and ready for the AI spotlight. Learn more about how ExpertFile helps organizations benefit in the new era of AI.

Robert Carter

Google's New AI Overviews Isn’t Just Another Search Update

Google's recent rollout of AI Overviews (previously called "Search Generative Experience") at its annual developer conference is being hailed as the biggest transformation in search since the company was founded. This isn't a side project for Google — it fundamentally alters how content gets discovered, consumed, and valued online. If you're in marketing, PR, or content strategy, or you run a business that depends on online visibility, this requires a fundamental shift in your thinking.

What Is AI Overviews?

Instead of showing users a familiar list of blue links and snippets, Google now uses artificial intelligence to generate a summary answer at the very top of many search results pages. This AI-generated box pulls together content from across the web and tries to answer the user's question instantly—without requiring them to click through to individual websites. Here's what that looks like: You type in a question like "What are the best strategies for handling a media crisis?" Instead of just links, you see a big AI-generated paragraph with summarized strategies, possibly quoting or linking to three to five sources—some of which might not even be visible unless you scroll or expand the summary. Welcome to the new digital gatekeeper.

Elizabeth Reid, VP of Search at Google, states: "Our new Gemini model customized for Google Search brings together Gemini's advanced capabilities — including multi-step reasoning, planning and multimodality — with our best-in-class Search systems." Let's break down this technobabble. Think of Gemini as the brain behind Google's search engine that's now:

Even More Focused on User Intent

For years, SEO strategies were built around guessing and gaming the right keywords: "What exact phrase are people typing into Google?" That approach led to over-optimized content — pages stuffed with phrases like "best expert speaker Boston cleantech" — written more for algorithms than actual humans. But with Google Gemini and other AI models now interpreting search queries like a smart research assistant, the game has changed entirely. Google is no longer just matching phrases — it's interpreting what the user wants to do and why they're asking.

Here's what that looks like: Let's say someone searches, "How do I find a reputable expert on fusion energy who can speak at our cleantech summit?" In the old system, pages that mentioned "renewable energy," "expert," and "speaker" might rank — regardless of whether they actually helped the user solve their problem. Now Google more intuitively understands:
• The user wants to evaluate credibility
• The user is planning an event
• The user needs someone available to speak
• The context is likely professional or academic
If your page simply has the right keywords but doesn't send the right signals, you're invisible.

Able to Plan Ahead

Google and AI search platforms now go beyond just grabbing facts. They string together pieces of information to answer more complex, multi-step queries. In traditional search, users ask one simple question at a time. But with multi-step queries, users increasingly expect one search to handle a series of related questions or tasks all at once — and now Google can actually follow along and reason through those steps. So imagine you're planning a conference.
A traditional search might look like: "Best conference venues in Boston." But a multi-step query might be: "Find a conference venue in Boston with breakout rooms, check availability in October, and suggest nearby hotels with group rates." This used to require three or four different searches, and you'd piece the results together yourself. Now Google can handle that entire chain of related tasks, plan the steps behind the scenes, and return a highly curated answer — often pulling from multiple sources of structured and unstructured data.

Even Better at Understanding Context

Google now gets the difference between "a speaker at a conference" and "a Bluetooth speaker" — because it understands what you mean, not just what you type. In the past, Google would match keywords literally. If your page had the word "speaker," it might rank for anything from event keynotes to audio gear. That's why so many search results felt off or required extra digging. Now Google reads between the lines. It understands that "conference speaker" likely refers to a person who gives talks, possibly with credentials, experience, and a bio, and that "Bluetooth speaker" is a product someone might want to compare or buy.

Why this matters for marketers: If you're relying on vague or generic content — or just keyword-stuffing — your pages will fall flat. Google is no longer fooled by superficial matches. It wants depth, clarity, and specificity.

Reads More Than Just Text

Google now processes images, videos, charts, infographics, and even audio — and uses that multimedia information to answer search queries more completely. This means your content isn't just being read like a document — it's being watched, listened to, and interpreted like a human would. For example:
• A chart showing rising enrollment in nursing programs might get picked up as supporting evidence for a story about healthcare education trends.
• A YouTube video of your CEO speaking at a conference might be indexed as proof of thought leadership.
• An infographic explaining how your service works could surface in an AI-generated summary — even if the keyword isn't mentioned directly in the text.
Ignoring multimedia formats? Then your competitors' visual storytelling could be outperforming your plain content, because you're not giving Google the kind of layered, helpful content that Gemini is now designed to highlight.

Why This Matters

There's a big risk here. Marketers who ignore these developments are in danger of becoming invisible in search. Your old SEO tricks won't work. Your content won't appear in AI summaries. Your organization won't be discovered by journalists, customers, or partners who now rely on smarter search results to make decisions faster. If you're in communications, PR, media relations, or digital marketing, here's the key message: you are no longer just fighting for links. You need to fight to be included in the Google AI summary itself at the top of search results — that's the new #1 goal. Why? Journalists can now find their answers before ever clicking on your beautifully written news page. Prospective students, donors, and customers will often just see the AI's version of your content. Your brand's visibility now hinges on being seen as "AI-quotable." If your organization isn't optimized for this new AI-driven landscape, you risk becoming invisible at the very moment people are searching for what you offer.

How You Can Take Action (and Why Your Role Is More Important Than Ever)

This isn't just an IT or SEO problem.
It's a communications strategy opportunity—and you are central to the solution.

What You Can Do Now to Prepare for AI Overviews

1. Get Familiar with How AI "Reads" Your Content
AI Overviews pull content from websites that are structured clearly, written credibly, and explain things in simple language.
Action Items: Review your existing content. Is it jargon-heavy? Outdated? Lacking expert quotes or explanations? Then it's time to clean house.

2. Collaborate with Your SEO and Web Teams
Communicators and content creators now need to work hand-in-hand with technical teams.
Action Items: Check your pages to see if you are using proper schema markup (a minimal audit sketch follows at the end of this post). Are you creating topic pages that explain complex ideas in simple, scannable formats?

3. Showcase Human Expertise
AI values content backed by real people—especially experts with credentials.
Action Items: Make sure your expert profiles are up to date. Continue to enhance them with posts, links to media coverage, short videos, images and infographics that highlight the voices behind your brand and make you stand out in search.

4. Don't Just Publish—Package
AI favors content it can easily digest and display, such as summary paragraphs, FAQs, and bold headers that provide structure for search engines. This also makes your content more scannable and engaging for humans.
Action Items: Repurpose your best content into AI-friendly formats: think structured lists, how-tos, and definitions.

5. Monitor Your Presence in AI Overviews
Regularly search key topics related to your organization and see what shows up.
Action Items: Is your content featured? If not, whose is? Identify what they are doing differently.

A New Role for Communications: From Media Pitches to Machine-Readable Influence

This isn't the end of communications as we know it—it's an evolution. Your role now includes helping your organization communicate clearly to machines as well as to people. Think of it as "PR for the algorithm." You're not just managing narratives for the public—you're shaping what AI systems say about you and your brand. That means:
• Ensuring your best ideas and experts are front and center online.
• Making complex information simple and quotable.
• Collaborating cross-functionally like never before.

Final Thought: AI Search Rewards the Prepared

Google's new AI Overviews are here. They're not a beta test. This is the future of search, and it's already rolling out. If your institution, company, or nonprofit wants to be discovered, trusted, and quoted, you can no longer afford to ignore how AI interprets your online presence. Communications and media professionals are now at the front lines of discoverability. And the best way to lead is to act now, work collaboratively, and elevate your role in this new era of search.

Want to see how leading organizations are getting ahead in the age of AI search? Discover how ExpertFile is helping corporations, universities, healthcare institutions and industry associations transform their knowledge into AI-optimized assets — boosting visibility, credibility, and media reach. Get your free download of our app at www.expertfile.com
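As promised in action item 2 above, here is a minimal audit sketch that lists which Schema.org @type values a page declares in its JSON-LD. It assumes the third-party requests and beautifulsoup4 packages, and the profile URL is a placeholder to swap for one of your own pages; a production audit would also need to handle @graph containers and crawl more than one page.

```python
import json

import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

def jsonld_types(url: str) -> list[str]:
    """Fetch a page and report which Schema.org @type values its JSON-LD declares."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    types: list[str] = []
    for tag in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(tag.string or "")
        except json.JSONDecodeError:
            continue  # skip malformed blocks rather than failing the whole audit
        items = data if isinstance(data, list) else [data]
        for item in items:
            if not isinstance(item, dict):
                continue
            t = item.get("@type")
            if t:
                types.extend(t if isinstance(t, list) else [t])
    return types

# Placeholder URL: point this at one of your own expert profile pages.
print(jsonld_types("https://www.example.edu/experts/jane-doe"))
```

An empty result on an expert profile page is the signal to act on: it means Google's crawlers are seeing prose with no machine-readable statement of who the expert is or why they are credible.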

Peter Evans

How to create an engaging online presence for your experts at scale.

Tasked with creating or expanding how you promote your organization's experts? Delivering an engaging online presence is vital, yet scaling from a handful of experts to hundreds takes planning. While interesting content, modern layouts, and intuitive navigation are essential, the real test lies in managing and presenting the extensive knowledge each expert brings. What works for a few can become a complex, time-consuming, and costly endeavor as you attempt to scale to dozens or hundreds of experts and their content, leading to an underwhelming user experience and missed opportunities. These challenges are magnified as small marketing and digital teams face greater demands such as:

How do I create and maintain up-to-date content for all my experts?
How can I efficiently roll out this content across my website, beyond just the About Us/Team or Newsroom sections?
How can I best facilitate audience interaction with my content, leading to valuable opportunities for both my experts and my organization?
And perhaps most importantly… How can I minimize the use of marketing and digital resources, as well as costs, in building and maintaining all this content?

Addressing these challenges requires a plan. The following highlights four areas to focus on when scaling your expert content to ensure an engaging user experience for your audience.

1. Create versatile content that's engaging, timely, and relevant.

The foundational importance of the quality and versatility of your expert content in designing an optimal user experience cannot be overstated. According to a recent report from the Content Marketing Institute, a majority (52%) of B2B marketers plan to increase their 2025 marketing spend on "thought leadership content". This underscores the necessity of making relevant, high-quality expert content the backbone of engaging and intuitive interactions with your visitors. Without this focus on content, it doesn't matter how visually appealing your layouts are or how well structured your navigation is; it won't meet the needs of your audience. Expertly crafted content builds trust and credibility, as users perceive well-organized, comprehensive and authoritative information as a sign of a reputable organization with interesting and credible experts.

"Content precedes design. Design in the absence of content is not design, it's decoration." Jeffrey Zeldman - Renowned designer, author & speaker on web design

Ultimately, the integration of high-quality, versatile content into UX design is essential for creating meaningful and effective digital experiences that meet and exceed your visitors' expectations.

Keys to Scaling

Share the Workload: Make content creation and management easier by using a purpose-built system that streamlines content creation and updates, versus custom-designed pages or having to grant access to the core CMS. Ensure the system allows multiple team members, including the experts themselves, to easily access and manage their content, making the process quick and efficient.

Repurpose Existing Content Assets: According to the Content Marketing Institute, the failure to utilize pre-existing content is a significant challenge (cited by 37%) impacting marketers' ability to scale. Leverage existing content assets, such as blog copy, imagery, and previously created videos, to enhance your expert content. This approach allows you to enrich your content without the need for expensive production, making the most of the resources you already have.
Leverage Your Content Elements Together or Individually: Each content element should enable visitors to explore deeper insights from experts. Linking profiles to embedded videos and insightful posts, or showcasing other experts within your organization, can offer new and diverse perspectives. This approach enhances user engagement and provides a richer, more interactive experience.

2. Start with a home base, then grow your footprint.

Creating a home base for your expert content, such as an "Expert Center" or "Speakers Bureau," within your website's newsroom or media section, or enhancing your "About Us" pages, can significantly elevate your organization's profile and improve user experience. This hub could also be tailored to highlight specific areas like "Research Expertise," depending on your primary audience—be it media, event organizers, or prospective clients. Establishing this destination for your expertise using a flexible integration option not only provides a focal point for your owned content but also lays the groundwork for expanding your reach across your website. By categorizing and featuring your expert content strategically, you can engage a broader audience across various sections of your site.

"Your website's content should act as a doorway. Land new visitors with compelling stories, then expand their engagement by guiding them to explore more relevant content tailored to their interests." Ann Handley, Chief Content Officer, MarketingProfs

Keys to Scaling

Establish & State Clear Objectives: Ensure you prominently state the goals of your initiative—whether it's combating fake news, serving the community, or showcasing your organization's breadth of expertise. Clearly outline the types of inquiries you're seeking to attract. This transparency not only sets expectations but also aligns visitors with your mission, fostering trust and engagement.

Invest in Fresh Content: To keep your expert content hub dynamic and engaging, continually invest in new content. Regularly feature new experts and insights to encourage visitors to bookmark and frequently visit your site. Implementing a centralized, multi-access platform for content updates will streamline this process, making it scalable and sustainable.

Link to Related Content: Use your expert content hub as a gateway to other areas of your website. Create links to related content, such as research initiatives, to help visitors explore and engage with your broader expertise. This not only enhances the user experience but also maximizes the value and reach of your content across your site.

3. Always be thinking about discoverability.

Creating expert content—from compelling profiles to thought leadership—is only valuable if it's easy to find. If visitors can't quickly locate the expertise they need, frustration sets in and the user experience suffers. To make expert content truly effective, it must be optimized for search engines, clearly organized, and internally linked. This is especially important for audiences like media, event organizers, and potential clients who rely on quick access to credible information. Prominently featuring and properly tagging expert content boosts visibility, builds authority, and drives meaningful engagement.

Keys to Scaling

Homepage/Top-Level Navigation: Don't rely solely on a menu option or link buried in a subsection like your Newsroom. Featuring menu items, graphics, and other call-outs on your homepage and main section pages will increase interaction and inquiries.
Leverage Distribution Networks: Drive traffic to your expert content by promoting your experts and their insights on platforms like LinkedIn and Twitter, on expert-specific search engines like expertfile.com, and via mobile expert directory apps.

Add Free-Form, Google-Like Search: Provide visitors with a free-form search experience that encompasses all elements of your expert content, rather than just a series of tags, titles, and names.

Ensure All Metadata Is Available: Let Google do the heavy lifting by ensuring you have properly structured metadata and schema data for each piece of expert content. While most digital teams remember standard title and description metadata, the powerful schema data that helps Google understand the context and authority of the content is often overlooked.

Add Links to Common Recurring Communications: Leverage all your communication channels, including adding links to your About Us section in press releases and in individual experts' email footers.

4. Plan for your success.

It is important to plan for the success of your expertise marketing program. A successful program will not only deliver valuable opportunities, helping drive reputation and revenue, but can also place increased demands on your marketing and digital teams, as well as on your experts themselves. Your success will likely inspire interest from other experts or departments in joining your program, necessitating tools and defined processes for efficiently onboarding new experts and integrating them across your website. Equally important is managing the influx of inquiries from key audiences such as the media, event organizers, and prospective clients in a way that provides a seamless user experience and encourages repeat engagement. Addressing these challenges with a strategic approach will lay a solid foundation for a robust and scalable expertise marketing initiative.

Keys to Scaling

Inquiry Workflow: Putting up a general email address or phone number as a contact does not scale. Implementing an efficient inquiry workflow is essential for the success of your expertise marketing. This process starts by ensuring that inquiries from key audiences—such as media, event organizers, and potential clients—are promptly and accurately directed to the appropriate experts within your organization. An automated system can streamline this process by categorizing inquiries based on specific topics and routing them to the relevant experts, even filtering out unnecessary or harmful inquiries. This approach not only saves time but also ensures swift and professional responses, enhancing your organization's reputation and effectiveness.

Capture and Act on Analytics: Continually monitoring your analytics is crucial for refining your content strategy. By analyzing which types of content and which experts resonate most with your audience, you can better plan future content creation and decide whom to feature prominently. This data-driven approach allows you to tailor your expertise marketing efforts more effectively, ensuring that you consistently engage your audience and meet their needs.

Share Your Success: By sharing your experts' achievements both within and outside your organization, you create a culture of recognition and aspiration. This not only encourages additional departments and experts to join your program but also enhances the overall value of your expert center.
Expanding your program to include more experts and additional expert content transforms your website into a valuable destination for key audiences such as media, event organizers, and potential clients. Effective dissemination of success stories amplifies your reach, reinforces your organization's credibility, and drives sustained engagement and growth.

Successfully scaling your expertise marketing program while maintaining an optimal user experience presents unique challenges. It requires producing versatile, high-quality content that is consistently engaging and relevant. Establishing a centralized home base for this content, such as an "Expert Center," helps streamline navigation and enhance user interaction. Improving discoverability through effective SEO and internal linking ensures that your expert content is easily accessible to key audiences like media and event organizers. Finally, meticulous planning for content updates and inquiry workflows is essential to manage resources efficiently and sustain growth. By addressing these areas strategically, you can build a robust and scalable expertise marketing initiative that drives engagement and reinforces your organization's reputation.

About ExpertFile

ExpertFile is changing the way organizations tap into the power of their experts to drive valuable inquiries, accelerate revenue growth, and enhance their brand reputation. Used by leading corporate, higher education and healthcare clients worldwide, our award-winning platform helps teams structure, manage and promote their expert content, while our search engine features experts on more than 50,000 topics. Download our "Guide to Expertise Marketing", book a demo and more here.
