Natural defenses: UF researchers use living infrastructure to protect Florida’s shores

Jan 23, 2026

4 min

Andrew Altieri



Armed with a $7 million grant from the Army Corps of Engineers, University of Florida researchers are working to bolster shoreline resilience and restore troubled wetlands in St. Augustine through nature-based solutions.


“The idea of nature-based solutions is to build what we sometimes refer to as green infrastructure, to use living, natural components as the building blocks,” said Andrew Altieri, Ph.D., an assistant professor with the Engineering School of Sustainable Infrastructure & Environment and interim director of the Center for Coastal Solutions, also known as CCS.


Instead of building man-made structures to protect wetlands, for example, restoration crews can move dredged natural sediment otherwise destined for costly disposal to increase wetlands’ size and elevation, restoring their ability to protect shorelines from storm surge, keep pace with sea-level change, filter toxins, store carbon and provide habitats for wildlife. 


The project supports the Army Corps of Engineers’ goal of beneficially reusing at least 70% of dredged sediment by 2030, repurposing it in natural areas to benefit habitats and restoration.


“It is critical to understand, test and model how natural processes can be harnessed and strategically implemented to sustainably meet the challenge of rapidly intensifying coastal hazards while also providing environmental, economic and social benefits,” Altieri wrote in the project’s technical summary.


Overall, the multi-disciplinary project closely examines patterns and processes of change in coastal landscapes. That includes wetlands — marshes and mangroves — and beach/dune systems.



The project comes as these coastal areas face both natural and human-caused threats. These areas are essential to wildlife, air quality, native vegetation, storm protection and the overall health of the ecosystem. A 2008 study by the U.S. Fish and Wildlife Service reported a net loss of about 361,000 acres of wetlands in the coastal watersheds of the eastern United States between 1998 and 2004 — an average net decrease of 59,000 acres each year, with experts citing sea-level rise as one of the primary factors.


“We're trying to understand the patterns of that loss and what's leading to it,” Altieri said. “These systems are essentially the first and sometimes last line of defense against coastal hazards, risks that include storm surges and coastal flooding. They are forming a buffer, this kind of protective layer on our coast. But they're changing, generally for the worse and are in danger of being lost.”



With this project, the CCS-led research team plans to advance the science, technology and engineering principles of nature-based solutions.


With marshes, the primary concern is elevation loss, which can drown the vegetation critical to the ecosystem. They are sinking, eroding and succumbing to sea-level changes, Altieri said.


“The plants are really important for trapping sediment and holding sediment,” he said. “You lose some of the plants, then you get more erosional loss and a lack of the accumulation of sediment.”


Sediment is the natural material, such as sand, silt and organic muck, that settles on the bottom of water bodies.


“If we can add sufficient sediment to increase the elevation to a level where the plants thrive, then they will retain that sediment that's been added to hopefully trap more sediment and accumulate more biomass through their growth,” Altieri said. “It’s something that may need to be done periodically. You may stop that decline, but you may even reverse the process of loss and change the trajectory.”


As a bonus, this process saves the cost of disposing of dredged sediment, which is usually piped offshore or to a materials-management area.
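The dynamic Altieri describes, elevation gained from trapped sediment versus elevation lost to sinking and rising seas, can be sketched as a simple annual budget. The rates below are illustrative assumptions for the sake of the sketch, not measurements from this project.

```python
# Toy marsh elevation budget: each year the marsh gains elevation from
# sediment accretion and loses ground to subsidence and sea-level rise.
# All rates are illustrative assumptions in millimeters per year.

SEA_LEVEL_RISE = 5.0      # mm/yr, assumed local rate of sea-level rise
SUBSIDENCE = 2.0          # mm/yr, assumed sinking of the marsh platform
NATURAL_ACCRETION = 4.0   # mm/yr, sediment trapped by healthy vegetation

def elevation_after(years, placement_mm=0.0, placement_interval=None):
    """Marsh elevation relative to sea level (mm) after `years`,
    optionally adding `placement_mm` of dredged sediment every
    `placement_interval` years."""
    elevation = 0.0
    for year in range(1, years + 1):
        elevation += NATURAL_ACCRETION - SUBSIDENCE - SEA_LEVEL_RISE
        if placement_interval and year % placement_interval == 0:
            elevation += placement_mm
    return elevation

# Without intervention this marsh falls behind by 3 mm each year...
print(elevation_after(30))  # -90.0
# ...while periodic sediment placement can reverse the trajectory,
# as Altieri suggests.
print(elevation_after(30, placement_mm=30.0, placement_interval=5))  # 90.0
```

Under these assumed rates, adding 30 mm of dredged sediment every five years turns a 90 mm deficit over 30 years into a 90 mm surplus, which is the kind of trajectory change the quote describes.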



This project is the next step for CCS-led coastal resilience efforts in St. Augustine. In 2024, CCS and WSP Environment & Infrastructure Inc. launched a coastal wetlands-restoration project to keep pace with sea-level change and erosion. The 2025 work is a standalone project with separate funding, Altieri said.


The current project also has more research disciplines and project partners, including UF researchers from Landscape Architecture, Geological Sciences and the School of Forest, Fisheries, and Geomatics Sciences.


“Storm surges, wave energy, coastal flooding – all of that can be slowed or reduced because of wetlands,” Altieri said. “They are basically like shock absorbers. These wetlands, beaches and dunes can be lost or eroded to some degree, but the upland area behind them is essentially protected.”


Researching the resilience of dunes comes with a different set of dynamics. Here, the team is looking at the plants that support the dunes – sea oats and panic grass, for example. That vegetation also provides a habitat for animals such as beach mice, turtles and birds.


On the beach, the team also is looking at water energy and how grain size affects the stability of dunes.


“It’s understanding water movement, water energy. How is that interacting with depositing sediment, moving sediment around, sorting sediment? With water, you tend to carry finer particles further than coarser materials,” he said.
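One textbook way to see why water carries finer particles farther is Stokes’ law, which says a small grain’s settling velocity in still water grows with the square of its diameter. The sketch below is a generic illustration with assumed values for quartz grains in water, not the team’s model, and Stokes’ law holds only for fine grains such as clay and silt.

```python
# Stokes' law: terminal settling velocity of a small sphere in still water.
# Valid only for fine grains (particle Reynolds number well below 1).

G = 9.81                # gravity, m/s^2
RHO_SEDIMENT = 2650.0   # assumed quartz grain density, kg/m^3
RHO_WATER = 1000.0      # water density, kg/m^3
MU = 1.0e-3             # dynamic viscosity of water, Pa*s

def settling_velocity(diameter_m):
    """Stokes settling velocity (m/s) for a grain of the given diameter."""
    return (RHO_SEDIMENT - RHO_WATER) * G * diameter_m**2 / (18 * MU)

clay = settling_velocity(2e-6)    # ~2 micron clay particle
silt = settling_velocity(50e-6)   # ~50 micron coarse silt grain

# Velocity scales with diameter squared, so the silt grain settles about
# (50/2)^2 = 625 times faster. The fine clay stays suspended far longer
# and is carried much farther before it drops out -- which is why moving
# water sorts sediment by grain size.
print(round(silt / clay))  # 625
```

The same squared relationship is why beaches and dunes end up dominated by coarser sand while the finest material settles out in calmer water.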


What does success look like after the award’s five years end?


“We'll have an understanding of what's changing on our coasts and why,” Altieri said. “We'll have an understanding of how we can work within this system to modify the natural components and utilize the natural processes. And we will hopefully be working with partners through additional funding mechanisms to actually apply that towards implementation of solutions to increase coastal resilience.”


The team also includes Peter Adams, Department of Geological Sciences; Julie Bruck, Department of Landscape Architecture, School of Landscape Architecture and Planning; Maitane Olabarrieta, ESSIE; Alex Sheremet, ESSIE; Nina Stark, ESSIE; Ben Wilkinson, Geomatics Program, School of Forest, Fisheries, and Geomatics Sciences; and Xiao Yu, ESSIE.

