

AI gives rise to the cut and paste employee

Although AI tools can improve productivity, recent studies show that they too often intensify workloads instead of reducing them, in many cases even leading to cognitive overload and burnout. The University of Delaware's Saleem Mistry says this is creating employees who work harder, not smarter. Mistry, an associate professor of management in UD's Lerner College of Business & Economics, says his research confirms the findings of this Feb. 9, 2026 article in the Harvard Business Review.

Driven by the misconception that AI is an accurate search engine rather than a predictive text tool, these "cut and paste" employees are using the applications to pump out deliverables in seconds just to keep up with increasing workloads. Mistry notes that this prioritization of speed over accuracy is happening at every level of the organization:

• Junior staff: Blast out polished-looking but unverified drafts.
• Managers: Outsource their ability to deeply learn and critically think in order to summarize data, letting their analytical skills atrophy.
• Power users: Build hidden, unapproved systems that bypass company oversight.

A management problem, not a tech problem

"When discussing this issue, I often hear leaders blame the technology. However, I believe that blaming the tech is missing the point; I see it as a failure of leadership," Mistry said. "When already overburdened employees who are constantly having to do more with less are handed vague mandates to just use AI without any training, they use it to look busy and produce volume-based work. Because many companies still reward the volume of work produced rather than the actual impact, employees naturally use these tools to generate slick but empty deliverables."
The real costs to organizations and incoming employees

Mistry outlines three risks organizations face if they don't intervene:

1. The workslop epidemic

"These programs allow people to generate massive amounts of workslop, which is low-effort fluff that looks good but lacks substance. It takes seconds to create, but hours for someone else to decipher, fact-check, and fix," Mistry notes. "This drains money (up to $9 million annually for large companies) and destroys morale. As an educator, researcher, and a person brought into organizations to help fix problems, I for one do not want to be on the receiving end of a thoughtless, automated data dump, especially on tasks that require real skill and deep thinking."

2. Legal disaster

He also states, "When the cut and paste mentality makes its way into professional submissions, the risks to the organization are real and oftentimes catastrophic. Courts have made it perfectly clear: ignorance is no excuse. If your name is on the document, you own the liability. Recently, attorneys have faced severe sanctions, hefty fines, and case dismissals for blindly submitting fake legal citations made up by computers."

3. A warning for incoming talent

For new graduates entering this environment, Mistry offers a warning: do not rely on AI to do your deep thinking. "If you simply use AI to blast out polished but unverified drafts, you become a replaceable 'cut and paste' employee," he says. "To truly stand out, new grads must prove they have the discernment to review, tweak, and challenge what the computer writes. The hiring edge is no longer just saying, 'I can do this task,' but 'I know how to leverage and correct AI to help me perform it.'"

Four ideas to fix it

To survive and indeed thrive with these new tools, and to avoid the unintended consequences of untrained staff, organizations should:

1. Reinforce the importance of fact-checking and editing: Adopt frameworks that teach employees how to show their work and log how they verified computer-generated facts.
2. Change the incentives: Stop rewarding busy work, useless reports, and massive slide decks. Evaluate employees on accuracy and results.
3. Eradicate superficial work: Don't use automation to speed up ineffective legacy processes. Instead, use it to identify and eliminate them entirely.
4. Make time for editing: Give yourself and your employees the breathing room to actually review, tweak, and challenge what the computer writes instead of accepting the first draft.

Mistry is available to discuss:

• Why AI is causing an epidemic of corporate "workslop" (and how to spot it).
• The leadership failure behind the "cut and paste" employee.
• How to rewrite corporate incentives to measure impact instead of volume in the AI era.
• Strategies for implementing safe, effective AI policies at work.
• How new college graduates can avoid the "workslop" trap in their first jobs.

To reach Mistry directly and arrange an interview, visit his profile and click on the "contact" button. Interested reporters can also send an email to MediaRelations@udel.edu.


VCU College of Engineering receives $600,000 for AI-driven cybersecurity research

To advance AI-enabled cybersecurity research, the National Science Foundation (NSF) presented Kemal Akkaya, Ph.D., professor and chair of the Department of Computer Science, with a $600,000 grant through the organization's Cybersecurity Innovation for Cyberinfrastructure program. Akkaya's three-year project will explore how large language models (LLMs) can automate packet labeling for intrusion detection systems.

"From transportation and healthcare to finance, improving the accuracy of machine learning algorithms used to defend the networks that underpin these sectors' cyberinfrastructure is critical for protecting them from cyberattacks. Strengthening these defenses helps ensure the reliability and security of the essential services people rely on every day," said Akkaya.

Intrusion detection systems monitor network traffic to identify suspicious or malicious activity. These systems rely on machine learning models trained on large volumes of accurately labeled data. Producing those datasets, however, is time intensive and often requires expert cybersecurity knowledge. As digital systems increasingly power transportation, health care, finance and communication, the volume and sophistication of cyberattacks continue to grow. At the same time, artificial intelligence is reshaping how both attackers and defenders operate. Improving how quickly and accurately security systems can be trained is critical to protecting the infrastructure that supports daily life.

Akkaya's project will investigate how generative AI can help address this challenge. The team will fine-tune open-source large language models using network data, threat signatures and expert annotations. Model accuracy will be strengthened through retrieval-augmented refinement, ensemble modeling and human-in-the-loop verification. Labeled datasets will be released in stages to support the development and evaluation of cybersecurity models.
Using data from AmLight, an international research and education network operated by Florida International University (FIU), the project includes collaboration with researchers from FIU. The award strengthens VCU’s growing leadership in AI-enabled cybersecurity research and provides hands-on research training for graduate students. Resulting datasets from this work will support machine learning education for undergraduate students.
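As a rough illustration of the ensemble-modeling and human-in-the-loop verification steps described above (the function, labels and agreement threshold below are hypothetical, not details of Akkaya's actual pipeline), one common pattern is to accept a packet label only when enough models agree, and to route disagreements to a human analyst:

```python
from collections import Counter

def ensemble_label(votes, agreement=0.75):
    """Majority-vote candidate labels for one packet from several models.

    Returns (label, needs_review): if the winning label's share of the
    votes falls below the agreement threshold, the packet is flagged
    for human-in-the-loop verification instead of being auto-labeled.
    """
    counts = Counter(votes)
    label, n = counts.most_common(1)[0]
    return label, n / len(votes) < agreement

# Three hypothetical models agree -> auto-label; a three-way split -> review
print(ensemble_label(["benign", "benign", "benign"]))
print(ensemble_label(["benign", "dos", "port-scan"]))
```

In a real labeling pipeline the votes would come from the fine-tuned LLMs and the flagged packets would feed an analyst's review queue; the threshold trades labeling throughput against accuracy.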


A Century and a Half of Connectivity: Professor Mojtaba Vaezi Reflects on the Evolution and Future of Communication Technology

On March 10, 1876, Alexander Graham Bell spoke the first words ever transmitted over the telephone: "Mr. Watson, come here; I want you." This simple request to Bell's assistant, Thomas Watson, marked a significant milestone in direct person-to-person communication. Now, 150 years later, this message has paved the way for advanced cellular technology in the form of satellites, wireless networks and the personal devices we carry everywhere.

For Mojtaba Vaezi, PhD, associate professor of electrical and computer engineering at Villanova University and director of the Wireless Networking Laboratory, Bell's few words spoken over the telephone marked the beginning of an ongoing technological revolution. "One hundred fifty years ago when telephone communication first started, there was essentially a wired line and a transmitting voice," said Dr. Vaezi. "That simple, basic transmission has transformed the field of communication technology in unimaginable ways."

According to Dr. Vaezi, five shifts have defined the past century and a half of communication technology: wired devices to wireless, analog to digital, voice to data, fixed landlines to mobile phones, and human-to-human communication giving way to an increasing focus on machines and artificial intelligence. Early wireless networks were built around one device per person. Today's networks must support multiple devices per person, plus the technology behind innovations such as smart homes, driverless cars and even remote surgery. "Applications are much more diverse now, so communication has to follow," said Dr. Vaezi. "A big portion of communication now, in terms of number of connections to the network, is from machine to machine—not human to human or even human to machine."

The growing number of connections can cause a host of issues for users. When multiple users share the same wireless spectrum simultaneously, their signals interfere with one another—a problem that is becoming more acute as the number of connected devices increases exponentially. Dr. Vaezi's research at Villanova focuses on developing techniques that allow multiple users to transmit messages on the same frequency at the same time and still be understood.

Another vibrant research area of Dr. Vaezi's involves Integrated Sensing and Communication (ISAC). This field of study focuses on integrating wireless communications and radar so they can function within the same spectrum. "Historically, radar and wireless communication work in different bandwidths or spectrums and use separate devices. Although they are related, they happen in different fields," said Dr. Vaezi. "Almost every communication scheme that has been developed has focused on this: How can we better utilize the spectrum?"

ISAC is increasingly important as new innovations like driverless cars become fixtures in everyday life. These vehicles rely on radar to continuously scan for hazards, and when a hazard is detected, a signal must be sent to trigger safety mechanisms. Currently, the radar and communications systems operate on separate bandwidths using separate hardware. Dr. Vaezi's research explores how both functions could be housed in a single device running on one shared spectrum. Areas of study like Dr. Vaezi's that focus on machine-to-machine communication are becoming increasingly relevant as communication technology evolves and moves away from simple person-to-person messaging.

As for the next big milestone in communications, Dr. Vaezi is looking ahead to the implementation of 6G by 2030, though he tempers expectations. For most users, the change will feel modest, amounting to slightly faster device speeds. The biggest shift with 6G will be the amount of added coverage in areas that previously did not have network accessibility. "Say you order a package and it's coming from somewhere abroad," explained Dr. Vaezi. "6G will add network coverage over oceans, so you'll be able to track your package in real time using that satellite technology."

The sixth generation of cellular technology will continue to connect our world and optimize current communications to accommodate the additional users and devices that need network access each day. It is far different from Alexander Graham Bell's historic phone call 150 years ago. That brief exchange over a single wired line laid the groundwork for a communications ecosystem that now supports billions of devices, complex data networks and emerging technologies yet to be seen. It also serves as a reminder that despite how far communication technology has come, and how complex it has gotten, it all shares a common, simple goal: to transmit information from one point to another.


Director Gennady Miloshevsky, Ph.D., shares his vision for the nuclear program at the VCU College of Engineering

Recently named the nuclear program director at the Virginia Commonwealth University (VCU) College of Engineering, Gennady Miloshevsky, Ph.D., associate professor in the Department of Mechanical & Nuclear Engineering, answers some questions about the direction of VCU Engineering's nuclear program and what he hopes it can accomplish.

What are your top priorities for the nuclear program at the VCU College of Engineering?

I want to focus on student development, innovative research and our rankings in best-program lists, but that is not everything. Strategy is important. We need to align ourselves with the country's national energy needs. There are many new developments in the energy sector, like small modular reactors or fusion energy systems, and having the right faculty to engage with these advancements is important.

Providing students with a well-rounded education and good opportunities for gaining experience benefits the College of Engineering's public and private sector partners. Nuclear subject matter is complex, so higher education is very important for workforce development. We want to build partnerships, like the one we have with Dominion Energy, that support this goal. A priority for me is continuing to establish relationships with Commonwealth Fusion Systems, which seeks to build and operate the first commercial grid-scale fusion plant in Chesterfield County, Virginia. Our workforce partners will benefit from VCU's well-trained nuclear engineering graduates joining the workforce.

So, aligning our strategy with national energy needs, hiring the right faculty to support our programs and building industry partnerships that benefit our students' education and career opportunities are important things for VCU Engineering's nuclear program.

Where would you like to see the College of Engineering's nuclear program 10 years from now?

I would like to see growth in the nuclear program. For example, some new graduate courses on topics like nuclear materials or fusion energy. In 2024, I developed a general course on fusion energy, so building out a curriculum that goes more in-depth would be good. When you look at small modular reactors and microreactors, current energy policy does not allow private companies to build their own. However, as energy demands increase, policy could change to where you see these compact devices installed in places like data centers. A more in-depth curriculum allows VCU Engineering students to step into industry roles that lead growth of the energy industry while also ensuring students are capable of adapting to the changing field and taking advantage of new developments.

What sort of cross-disciplinary opportunities are there for the College of Engineering's nuclear program?

Nuclear engineering and nuclear science are very interdisciplinary fields. You have physics that covers the nuclear reaction and the radiation it generates, then chemistry is needed when talking about nuclear fuel cycles and nuclear waste. You also need materials science, because good materials capable of withstanding radiation and high temperatures are needed in nuclear fission and fusion energy systems. This science then connects to engineering: building the reactors and the energy distribution systems, like a power grid. It is a small sample of the overall work, but you see how mechanical and electrical engineering are key to this part. All these disciplines come together to solve the same problem. One researcher might be figuring out how to confine plasma and make it stable, while another researcher is looking at how plasma can disrupt the containment wall and how to make materials to protect the wall.
Within our department, we are making connections between mechanical-focused faculty working on high-temperature ceramics or additive manufacturing techniques and those of us researching nuclear energy systems in order to make joint proposals.

We are also collaborating outside VCU. As an example, I am involved with an alliance founded by the Defense Threat Reduction Agency (DTRA) comprising 17 universities, research labs and military centers. Coordinated through DTRA, we work together on many of the same problems. Through this partnership, my Ph.D. students do summer research rotations with national labs like Lawrence Livermore National Laboratory in California and Pacific Northwest National Laboratory. We also bring cadets and midshipmen into VCU from other institutions, like the DTRA Nuclear Science and Engineering Research Center, the United States Military Academy at West Point and the Virginia Military Institute, whose students have been part of Research Experience for Undergraduates programs in the summer.

How is artificial intelligence impacting the field of nuclear engineering?

So, the United States is sponsoring the Genesis Mission, which seeks to transform science innovation through the power of AI. One area of the Genesis Mission is nuclear fission and fusion energy. I see this playing out with the Department of Energy encouraging national labs, universities and industry to work together on applying these AI advancements to solve the research problems of nuclear energy. It is a great opportunity for students, who we can involve in this work to give them real-world experience with topics they will see after graduation.

Last semester I taught a course at VCU on the practical applications of AI to nuclear engineering problems. It is not something like ChatGPT or anything like that. What we did is take Google's TensorFlow platform, which is a library of AI models and neural networks. Using Python scripting, students learn how to apply these AI resources to about 30 problems in mechanical and nuclear engineering. They create scripts, use data sets and run analytics. We have a nuclear reactor simulator, and I have some ideas to create AI-based software we can pair with the simulator, then give the software a data set and let it control the operation of the simulator in a safe way.

Tell us about your background. What brought you to VCU and the Department of Mechanical and Nuclear Engineering?

Actually, I am not a mechanical or a nuclear engineer. My background is in physics. I graduated from the Belarusian State University in 1990 and continued to a Ph.D. in physics from the Heat and Mass Transfer Institute of the National Academy of Sciences of Belarus, working on topics related to fusion plasmas and nuclear weapon effects. In space, nuclear weapons produce shockwaves and radiation. In my research, I computationally model these effects to determine how something like a nuclear warhead detonation in orbit will impact the materials a satellite is made of. My research also crosses over into nuclear fusion, specifically thermodynamic and optical plasma properties, fusion plasma disruptions, and melt motion and splashing from plasma-facing components.

Accelerating Next-Generation Extreme Ultraviolet (EUV) Lithography (ANGEL) is my most recent collaborative project, supported by the Department of Energy's (DOE) Office of Science, Fusion Energy Sciences. It involves two national laboratories, three universities and a private-sector company focusing on advancement of future microelectronic chips, EUV photon sources, mitigation of material degradation and plasma chemistry. Prior to joining the VCU College of Engineering, I worked at Purdue University at a DOE-funded center investigating nuclear fusion and the effects of plasma on materials.
Around 2019, I wanted to develop my own lab, so I came to VCU with startup funds from the Nuclear Regulatory Commission and DTRA. My first priority after joining the VCU College of Engineering was continuing my fusion research; the second was collaborating with an alliance of universities focused on work for DTRA and DOE.
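The idea Miloshevsky describes of pairing AI-based software with the reactor simulator while keeping its operation safe can be sketched, very loosely, as a controller whose proposed actions are always clamped to a safe band before the simulator sees them. The toy power model, gains and limits below are invented for illustration and are not the VCU simulator:

```python
def step_power(power, rod_position):
    # Toy simulator (hypothetical): normalized power rises when the
    # control rod is withdrawn past its neutral position of 0.5
    return power * (1.0 + 0.05 * (rod_position - 0.5))

def safe_controller(power, target, lo=0.0, hi=1.0):
    # Proportional controller; the clamp guarantees the proposed rod
    # position never leaves the simulator's safe band [lo, hi]
    proposed = 0.5 + 0.8 * (target - power)
    return max(lo, min(hi, proposed))

power, target = 0.6, 1.0
for _ in range(50):
    power = step_power(power, safe_controller(power, target))
print(round(power, 3))  # power climbs toward the target without overshoot
```

A learned policy (for example, a small TensorFlow model trained on simulator data sets, as in the course) would replace the proportional rule, but the safety clamp would stay outside the model so that no learned action can push the simulator out of its allowed operating band.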


Assisted by sniffer dogs and DNA sequencing, researchers discover three new truffle species

University of Florida biologists studying fungal evolution and ecology have discovered three new truffle species, including one capable of commanding hundreds of dollars per pound within culinary circles.

The researchers describe their discoveries in the journal Persoonia. Their work shakes up the Morchellaceae truffle family tree, with key insights related to perhaps the most commercially valuable truffle in North America, the Oregon black truffle. Gourmet chefs, who sometimes grate the odoriferous truffle over dishes or infuse butter with it, have been known to pay as much as $800 per pound for the delicacy.

For decades, the Oregon black truffle has been known scientifically as Leucangium carthusianum. It was originally found in Europe and later found in the Pacific Northwest, from California to British Columbia. However, recent genetic testing and field analysis by researchers from UF's Institute of Food and Agricultural Sciences (UF/IFAS) revealed the North American variety is a distinct species. Scientists are giving this newly recognized species a name honoring the Cascadia region in which it is found: Leucangium cascadiense.

"Our paper confirms what a lot of people had suspected for a long time, which is that the North American truffle species is genetically very distinct from its European relatives," said study co-author Benjamin Lemmond, a former UF student. Lemmond, now a postdoctoral associate at the University of California, Berkeley, began his research into the truffles as a first-year doctoral student studying under professor Matthew Smith of the UF/IFAS plant pathology department.
During the COVID-19 pandemic, Lemmond couldn't access the campus greenhouse where he was conducting an experiment, so Smith secured hundreds of dried truffle specimens from Oregon State University for him to study. The stash included slivers of the Oregon black truffle, a dark-colored, potato-shaped species with tiny, pyramid-shaped warts.

When pandemic restrictions relaxed, Lemmond and Smith conducted genetic testing of the Oregon State specimens and others borrowed from Polish, Greek, Italian, French and Japanese collections. Their tests indicated Oregon black truffles from North America had at one point diverged from their European counterparts on the Morchellaceae evolutionary tree, according to the study. They also established the existence of another distinct and very rare species, Imaia kuwohiensis, a pale-colored truffle with dark warts, which is native to threatened spruce-fir habitats in the southern Appalachian Mountains. Their name for the truffle comes from the Cherokee word for the Great Smoky Mountains' highest peak, Kuwohi.

Field tests followed. The researchers wanted to understand the origin of the Oregon black truffle's energy. "Understanding the fundamental, basic biology and life cycle of this truffle is really important," Lemmond said. "It's a very valuable commodity, and this knowledge might help us to cultivate the truffle in the future. It also supports long-term conservation and management."

Most gourmet truffles are mycorrhizal, meaning they obtain energy from trees, Lemmond said. It had long been suspected that Oregon black truffles obtain energy through a symbiotic relationship with young Douglas fir trees, but no one had conclusively proven it. Lemmond traveled to the Pacific Northwest and worked with specially trained sniffer dogs capable of detecting truffles buried as deep as 10 inches beneath soil and leaf litter. With the dogs' help, he unearthed Oregon black truffles nestled among Douglas fir stands. He used a fluorescent stain that bonded with the fungal tissue, coloring it green to show where the truffle fungus grew between the cells of the tree root tissue.

"The truffle fungi surround the whole root, but the fungus is healthy, and the plant is healthy," Smith said. "The two trade nutrients back and forth." DNA sequencing of the roots subsequently proved the truffles rely on the trees as their main source of carbon, according to the study.

As the researchers conducted genome sequencing of the Oregon black truffle, they learned of a peculiar find reported by a citizen scientist on iNaturalist, an online science data network: a Leucangium truffle growing among Eastern hemlock trees in Oneida County, New York. It was the first time anyone had ever reported a Leucangium species in the United States east of the Rocky Mountains, Lemmond said. Lemmond contacted Purdue University, which was preserving the specimen, and requested a sample. The truffle's physical characteristics, including its dense external hairs and lack of warts, distinguished it from other Leucangium species. DNA analysis confirmed significant variation, too. The researchers named the new truffle species Leucangium oneidaense to recognize the county where it was unearthed.

A few years later, just before the researchers submitted their study for publication, someone found a second Leucangium oneidaense specimen growing in Massachusetts, Lemmond said. "It was great timing, and it suggests to me that there are still a lot of undiscovered truffles out there, waiting to be found," he said.


War in Iran: Impact on Oil Prices

As global markets respond to escalating tensions in Iran, energy prices are once again at the center of international concern. For insight into what this conflict could mean for oil markets, consumers and the broader economy, media can turn to Greg Upton, executive director and associate research professor at the LSU Center for Energy Studies.

An expert at the intersection of energy and environmental economics, Upton studies how geopolitical disruptions, supply constraints and policy decisions influence oil prices and downstream economic impacts. As instability in the Middle East threatens global supply chains, he can provide context on potential price volatility, implications for Louisiana's energy sector and what higher crude prices may mean for gasoline costs and inflation in the United States.

Upton has contributed to more than 40 academic publications and has presented his research to over 200 industry, government and academic audiences. He has testified before committees in both chambers of the Louisiana Legislature and a subcommittee of the U.S. House of Representatives. A frequent voice in national and local media, Upton has been quoted or cited more than 250 times, including by The Wall Street Journal, The New York Times, USA Today and NPR.

In addition to his research, Upton teaches in LSU's MBA program and in the Department of Economics and Environmental Sciences, helping prepare the next generation of leaders to navigate complex energy and environmental challenges. For timely, data-driven analysis on the impact of oil price fluctuations amid the ongoing conflict in Iran, Dr. Greg Upton is available for interviews and expert commentary.


Surgery past 65? Brain health screening can aid recovery

Before surgery, your doctor will order evaluations to identify any health problems that may need to be addressed before the procedure. This typically includes medical histories, laboratory tests and checking blood pressure, heart rate and temperature. There's one vital sign that is often not on the list, but is crucial for older adults: screening for mental and cognitive health.

"There is an overwhelming amount of evidence that presurgical brain health predicts complications after surgery," said Catherine Price, Ph.D., a professor in the University of Florida College of Public Health and Health Professions Department of Clinical and Health Psychology and the UF College of Medicine Department of Anesthesiology. "For example, individuals with weaknesses in memory and attention and people with neurodegenerative diseases, such as Parkinson's, have higher rates of confusion and memory complications that affect their recovery from surgery."

Research by Price and others has shown that a patient's cognitive, memory and mental health status before surgery is an excellent indicator of whether they will experience cognitive problems such as delirium, a common complication in older adults after surgery. Delirium, characterized by confusion, disorientation and impaired awareness, can lead to longer recovery times, increased dementia risk, and higher mortality rates and health care costs.

Price founded and directs the University of Florida Perioperative Cognitive Anesthesia Network, or PeCAN, a first-of-its-kind, multidisciplinary program that seeks to identify older adults who may be at risk of developing cognitive problems after surgery so that clinicians can intervene. In recent findings published in the journal Anesthesia and Analgesia, Price and her colleagues report on two years of PeCAN patient data.
Of the thousands of patients over age 65 who received presurgical screening, 23% were found to have issues with their cognitive performance, yet only 2% of the patients screened had a previous note in their medical charts indicating they had a cognitive impairment.

"It's so important to know when an individual has cognitive complications because that changes their care path," Price said. "From medication to monitoring, the patient's care is more complex for the perioperative team and family."

For PeCAN patients identified as being at risk for postsurgery cognitive problems, Price and her team will share tailored recommendations with the patient's care team before, during and after surgery. These may include more monitoring during anesthesia and medication adjustments, such as using medications for nausea and pain management that are less likely to contribute to delirium. The PeCAN team also might offer the surgical care team specific communication strategies. For example, health care providers should repeat information several times for patients who have trouble remembering new material and ask them to write it down. Recently published research by Price and colleagues found PeCAN patients reported the focus on brain health improved confidence in their surgical team and care plan.

Health care systems are only starting to incorporate preoperative brain health teams like PeCAN. Until they are offered more frequently, Price offers a few steps anyone can take to help protect brain health, including a focus on reducing inflammation in the body prior to surgery. To help achieve this:

• Optimize nutrition. Reduce your intake of added sugars and refined carbohydrates, like white bread.
• Get good sleep. Improve sleep hygiene so you are well-rested. "Sleep is essential for the brain for a number of reasons," Price said.
• Reduce alcohol intake to limit inflammation and dehydration.
• Pay attention to your medications. Follow your care team's instructions. Enlist a family member or caregiver to help you keep tabs on what you're taking, how much and how often.
• Practice techniques to limit anxiety, such as visualization and deep breathing. The box breathing method is an easy one to remember: breathe in slowly for four seconds, hold your breath for four seconds, slowly exhale for four seconds, then wait four seconds before inhaling again.


Florida renters struggle with housing costs, new statewide report finds

Nearly 905,000 low-income renter households in Florida are struggling to afford their housing costs, according to the 2025 Statewide Rental Market Study, released by the University of Florida’s Shimberg Center for Housing Studies. Prepared for Florida Housing Finance Corporation, the report provides a comprehensive look at the state’s rental housing conditions and is used to guide funding decisions for Florida Housing’s multifamily programs, including the State Apartment Incentive Loan (SAIL) program.

“Florida’s strong population growth has collided with limited housing supply, pushing rents beyond what many families can afford,” said Anne Ray, manager of the Florida Housing Data Clearinghouse at the Shimberg Center. “This report helps policymakers and housing providers target resources where the need is most acute — including communities that are experiencing the fastest growth and the greatest affordability gaps.”

Key findings from the 2025 study include:

• A growing affordability gap: An estimated 904,635 renter households earning below 60% of their area median income (AMI) are cost burdened, paying more than 40% of their income toward rent. These households are spread across the state, with 64% in Florida's nine most populous counties, 33% in mid-sized counties and 3% in small, rural counties.

• Surging population and higher rent and housing costs: Between 2019 and 2023, Florida added more than 1 million households — nearly 195,000 of them renters — driven by in-migration from states like New York, Illinois and California. Despite the addition of more than 240,000 multifamily units, median rent soared nearly $500 per month, from $1,238 to $1,719.

• After years of growth, Florida's older renter population is holding steady: Renters age 55 and older represent 39% of cost-burdened households, up from 29% in 2010 but similar to 2022 numbers.

• Most renters are working: 79% of renter households include at least one employed adult, compared to 67% of owner households. Most non-working renters are seniors or people with disabilities.

• Homelessness is on the rise: The report estimates 29,848 individuals and 44,234 families are without stable housing, up from 2022, as hurricanes and tight markets contribute to displacement.

• Assisted housing provides an alternative to high-cost private market rentals: Developments funded by Florida Housing, HUD, USDA and local housing finance authorities provide over 314,000 affordable rental units statewide.

• Future risks to affordable housing stock: More than 33,000 publicly assisted units may lose affordability protections by 2034 unless renewed.

Evaluating affordable housing in Florida

“State- and federally-assisted rental housing developments are essential to providing stable, affordable homes for Florida’s workforce, seniors, and people with special needs,” Ray said. “Florida Housing Finance Corporation’s programs make up a significant portion of this housing, and our study helps ensure those resources are directed where they’re needed most. Preserving these developments — and expanding them — is critical to keeping pace with Florida’s growing population and maintaining affordability.”

Since 2001, the Shimberg Center has produced the Rental Market Study every three years to inform strategic investments in affordable housing across Florida. The study evaluates needs across regions and among key populations including seniors, people with disabilities, farmworkers and others. The Rental Market Study and the Florida Housing Data Clearinghouse are part of a 25-year partnership between the Shimberg Center and Florida Housing Finance Corporation to support data-driven housing policy and planning.


Beyond the field: New research highlights how NIL is reshaping college athlete identity

In an era of name, image and likeness, or NIL, many college athletes are thinking differently about who they are — seeing themselves not just as competitors or students, but also as influencers with distinct voices and causes, according to a new study from the University of Florida.

Molly Harry, Ph.D., an assistant professor in the Department of Sport Management at the UF College of Health and Human Performance, surveyed 200 athletes from 21 Power Four universities to better understand how NIL, which refers to the rights of college athletes to earn money through endorsements, sponsorships, social media promotions and other commercial opportunities, has impacted the way athletes perceive their roles and identities.

“Historically, we’ve viewed them (college athletes) through the lens of athletics or academics, but they’re daughters, brothers, role models, and increasingly, they’re now cultivating public personas and marketing skills.” —Molly Harry, Ph.D., an assistant professor in the Department of Sport Management

The findings, published Friday in the Sociology of Sport Journal, reveal a growing recognition among athletes that they are more than the two-dimensional “student-athlete” model that is traditionally used in research and policy.

“With the shift in NIL policies, athletes are starting to develop roles and identities related to that of the influencer,” Harry said. “Historically, we’ve viewed them through the lens of athletics or academics, but they’re daughters, brothers, role models, and increasingly, they’re now cultivating public personas and marketing skills.”

Through survey responses across seven major sports — football, baseball, men’s and women’s basketball, gymnastics, volleyball and softball — Harry and UF doctoral student Hannah Kloetzer examined athletes’ engagement with NIL opportunities, as well as the personal sacrifices they made to pursue them.

They found that many athletes now view NIL as a platform to promote causes they care about, build connections with their communities and explore career pathways after college. One softball player described the value of NIL in a way that highlights the broader impact: “It’s been great to feel seen and have your hard work in a sport help in other parts of life. It’s really nice to use NIL on a resume as marketing experience.”

Athletes surveyed said they found deals not just with big-name brands, but more often with local businesses like restaurants, boutiques and community partners. This entrepreneurial approach often required initiative and personal outreach, something many athletes had to learn on their own.

“Some athletes told us they felt lost when trying to navigate NIL,” Harry said. “Others shared how they reached out to local businesses or organized their own camps.”

One particularly striking finding, Harry said, was that some athletes were making athletic sacrifices — like spending less time training — to pursue NIL work, a shift that underscores the importance of these opportunities. Harry stressed that while no one reported skipping practices, athletes did acknowledge shifting their priorities to make room for NIL-related endeavors.

“If you’re willing to give up something in your athletic routine, that speaks volumes about how central NIL — and influencer identities — could become for some athletes,” she said.

Another key insight: football players of color from low socioeconomic backgrounds were most likely to self-identify as influencers. This emerging pattern stands in contrast to perceived broader trends in the social media world.

“That was one of the most fascinating takeaways,” Harry said. “We have this unique subset of influencers — college football athletes — that are starting to enter this space.”

Harry’s research builds on a growing conversation in the academic community about the evolving identity of college athletes. A few conceptual pieces have previously proposed the idea of a “student-athlete-influencer,” but Harry’s team is one of the first to gather empirical data to back it up. This new perspective has broad implications for how universities and organizations like the NCAA support college athletes, both during their playing years and as they prepare for life after sport.

“As fans, we often see athletes as commodities on the field,” Harry said. “But they’re humans first, and they’re starting to recognize their own value and tap into their potential beyond the playing field.”

In addition to academic and athletic support, Harry believes universities should invest in more targeted resources tailored to influencer pressures, like mentorship opportunities and training that goes beyond basic social media etiquette.

“Athletes who take on influencer roles may deal with unique stressors, whether it’s comparing engagement numbers or coping with public scrutiny,” she said. “It would be valuable to provide opportunities where athlete-influencers can support each other, share strategies and protect their mental health.”

A football player who participated in the study summed up the broader potential of NIL: “I’m very appreciative of NIL opportunities and the ability to continue to grow my camp and greater brand outside of my football program.”

Looking ahead, Harry plans to explore this evolving identity through more qualitative research, with a focus on what it truly means to be an “influencer” in the context of college athletics.

“Athletes are more than football players. They are more than swimmers,” she said. “They are people who we walk with on our college campuses, and they are people who bring value to our society in a host of ways.”


Young magmas on the moon came from much shallower depths than previously thought, new study finds

New research on the rocks collected by China's Chang'e 5 mission is rewriting our understanding of how the moon cooled. Stephen Elardo, Ph.D., an assistant professor of geological sciences at the University of Florida, has found that lava on the near side of the moon likely came from a much shallower depth than previously thought, contradicting earlier theories about how the moon produced lavas through time.

These samples of basalt, an igneous rock formed from rapidly cooled lava, were collected from the near side of the moon by the Chang’e 5 mission and are the youngest samples collected on any lunar mission, making them an invaluable resource for those studying the geological history of the moon.

To estimate how deep within the moon the Chang’e 5 lava came from, the team conducted high-pressure and high-temperature experiments on a synthetic lava with an identical composition. Previous work by Chinese scientists determined that the lava erupted about 2 billion years ago, and remote sensing from orbit has shown it erupted in an area with very high abundances of potassium, thorium and uranium on the surface, all of which are radioactive and produce heat. Scientists believe that, in large amounts, these elements generate enough heat to keep the moon hot near the surface, slowing the cooling process over time.

“Using our experimental results and thermal evolution calculations, we put together a simple model showing that an enrichment in radioactive elements would have kept the Moon's upper mantle hundreds of degrees hotter than it would have been otherwise, even at 2 billion years ago,” explained Elardo.

These findings contradict the previous theory that the temperature of the moon’s outer portions was too low to support melting of the shallow interior by that time and may challenge the hypothesis about how the moon cooled. Prior to this study, the generally accepted theory was that the moon cooled from the top down. It was presumed that the mantle closer to the surface cooled first as the surface of the moon gradually lost heat to space, and that younger lavas like the one collected by Chang’e 5 must have come from the deep mantle, where the moon would still be hot. This theory was backed by data from seismometers placed during the Apollo moon landings, but the new findings suggest that there were still pockets of shallow mantle hot enough to partially melt even late into the moon’s cooling process.

“Lunar magmatism, which is the record of volcanic activity on the moon, gives us a direct window into the composition of the Moon's mantle, which is where magmas ultimately come from,” said Elardo. “We don't have any direct samples of the Moon's mantle like we do for Earth, so our window into the composition of the mantle comes indirectly from its lavas.”

Establishing a detailed timeline of the moon’s evolution represents a critical step toward understanding how other celestial bodies form and grow. Processes like cooling and geological layer formation are key steps in the “life cycles” of other moons and small planets. As our closest neighbor in the solar system, the moon offers us our best chance of learning about these processes.

“My hope is that this study will lead to more work in lunar geodynamics, which is a field that uses complex computer simulations to model how planetary interiors move, flow, and cool through time,” said Elardo. “This is an area, at least for the moon, where there's a lot of uncertainty, and my hope is that this study helps to give that community another important data point for future models.”
