The AI Journal: UF and other research universities will fuel AI. Here’s why

Alina Zare Ph.D.

Feb 2, 2026



In the global AI race, one that pits small competitors against major ones, established companies against new players, and ubiquitous uses against niche ones, the next giant leap isn’t about faster chips or improved algorithms. Now that AI agents have already vacuumed up so much of the information on the internet, the great uncertainty is where they’ll find the next trove of big data.


The answer is not in Silicon Valley. It’s all across the nation at our major research universities, which are key to maintaining global competitiveness against China.


Teaching an AI system to “think” requires drawing on massive amounts of data to build models. At a recent conference, Ilya Sutskever, the former chief scientist at OpenAI — the creator of ChatGPT — called data the “fossil fuel of AI.” Just as we will eventually use up fossil fuels because they are not renewable, he said, we are running out of new data to mine to keep fueling the gains in AI.


However, so much of this thinking assumes AI was created by private Silicon Valley start-ups and the like. AI’s history is actually deeply rooted in U.S. universities dating back to the 1940s, when early research laid the groundwork for the algorithms and tools used today. While the computing power to use those tools was created only recently, the foundation was laid after World War II, not in the private sector but at our universities.


Rather than facing a “fossil fuel problem,” I believe AI has its own renewable fuel source: the data and expertise generated by our comprehensive public academic institutions. In fact, at the major AI conferences driving the field, most papers come from academic institutions. Our AI systems learn about our world only from the data we offer them.


Current AI models like ChatGPT already scrape information from some academic journal articles in open-access repositories, but there are enormous troves of untapped academic data that could make all of these models more meaningful. One way past data scarcity is to develop new AI methods that leverage all of our knowledge, in all of its forms. Our research institutions have the varied expertise, spanning every aspect of our society, to do this.


Here’s just one example: We are creating the next generation of “digital twin” technology. Digital twins are virtual recreations of places or systems in our world. Using AI, we can develop digital twins that gather all of our data and knowledge about a system — whether a city, a community or even a person — in one place and allow users to ask “what if” questions.
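
To make the idea concrete, here is a minimal sketch in Python of one way a digital twin can be organized: a set of named data layers plus simulation models that answer “what if” questions. The class, layer and model names here are hypothetical illustrations, not the architecture of any actual UF system.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict


@dataclass
class DigitalTwin:
    """Toy digital twin: named data layers plus simulation models.

    Layers hold what we know about the real system (elevation grids,
    building footprints, sensor feeds); models turn a proposed change
    into a predicted outcome, which is what a "what if" question asks.
    """
    layers: Dict[str, Any] = field(default_factory=dict)
    models: Dict[str, Callable[[Dict[str, Any], Dict[str, Any]], Any]] = field(default_factory=dict)

    def add_layer(self, name: str, data: Any) -> None:
        # New data sources get layered in as they become available.
        self.layers[name] = data

    def what_if(self, model_name: str, scenario: Dict[str, Any]) -> Any:
        # Answer a "what if" question by running one simulation model
        # against the current layers with a hypothetical change applied.
        return self.models[model_name](self.layers, scenario)
```

A production twin would swap these in-memory dictionaries for geospatial databases, streaming sensor feeds and physics-based or learned simulation models, but the layered structure is the same idea.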


The University of Florida, for example, is building a digital twin for the city of Jacksonville, which contains the profile of each building, elevation data throughout the city and even septic tank locations. The twin also embeds detailed state-of-the-art waterflow models. In that virtual world, we can test all sorts of ideas for improving Jacksonville’s hurricane evacuation planning and water quality before implementing them in the actual city.


As we continue to layer more data into the twin — real-time traffic information, scans of road conditions and more — our ability to deploy city resources will be more informed and driven by real-time actionable data and modeling. Using an AI system backed by this digital twin, city leaders could ask, “How would a new road in downtown Jacksonville impact evacuation times? How would the added road modify water runoff?” and so on.
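
Continuing the hypothetical sketch above, a planner’s question could be expressed as a scenario handed to one of the twin’s models. The layer name, model and numbers below are made up for illustration and are not drawn from Jacksonville’s actual twin.

```python
# Illustrative use of the toy DigitalTwin sketched earlier; every name
# and number here is hypothetical.
def toy_evacuation_model(layers, scenario):
    # Pretend each added road trims 5% off the baseline evacuation time.
    baseline_hours = layers["baseline_evacuation_hours"]
    return baseline_hours * (0.95 ** scenario.get("new_roads", 0))


twin = DigitalTwin()
twin.add_layer("baseline_evacuation_hours", 14.0)  # made-up baseline
twin.models["evacuation_time"] = toy_evacuation_model

# "How would a new road in downtown Jacksonville impact evacuation times?"
print(twin.what_if("evacuation_time", {"new_roads": 1}))  # roughly 13.3 hours
```

The real questions would, of course, be routed through the twin’s detailed waterflow and traffic models rather than a one-line formula, and an AI interface could translate the planner’s natural-language question into the scenario.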


The possibilities for this emerging area of AI are endless. We could create digital twins of humans to layer human biology knowledge with personalized medical histories and imaging scans to understand how individuals may respond to particular treatments.


Universities are also acquiring increasingly powerful supercomputers that are supercharging their innovations. The University of Florida’s HiPerGator, recently acquired from NVIDIA, is being used for problems across all disciplines, while Oregon State University and the University of Missouri, for example, are using their own supercomputers to advance marine science discoveries and improve elder care.


In short, to see the next big leap in AI, don’t immediately look to Silicon Valley. Start scanning the horizon for the research universities that have the computing horsepower and the unique ability to continually renew the data and knowledge that will power the next big thing in AI.





Alina Zare

Professor

Alina Zare's research focuses on developing new machine learning and artificial intelligence algorithms to process data and imagery.

AI for Defense, AI for Agriculture, Automated Sensor Understanding, Remote Sensing, Machine Learning
