
Delaware is positioning itself as a “living lab” where academia, health systems and government
collaborate to shape the future of artificial-intelligence-enabled health care.
The latest issue of the Delaware Journal of Public Health, guest edited by University of Delaware computer scientists Weisong Shi and Yixiang Deng, brings together 16 articles from researchers, clinicians, policymakers and industry leaders examining how AI and big data are reshaping health care.
The issue, debuting this month, balances Delaware-specific topics with broader perspectives, highlighting three levels of impact: what Delaware can expect in the coming years, what other states can learn from Delaware’s approach and how UD research is advancing AI for health through collaborations.
“At UD, we don’t work in isolation. We’re working closely with health care systems so that innovation happens together from the beginning,” says Shi, Alumni Distinguished Professor and Chair of UD’s Department of Computer and Information Sciences.
Highlights from the issue include:
• The nation’s first nursing fellowship in robotics – ChristianaCare, Delaware’s largest health system, created an eight-month fellowship to train bedside nurses to conduct applied robotics research. Nurses who completed the program reported higher job satisfaction, improved well-being and greater professional confidence, suggesting programs like this may help retain the bedside workforce and reduce nationwide staffing shortages.
• Wheelchairs that navigate hospitals on their own – UD researchers developed a prototype autonomous wheelchair that combines onboard sensors and computing with software that interprets spoken directions from users, a step toward moving beyond systems that only work in controlled environments. To operate effectively in health care settings, the researchers say, wheelchairs must be able to navigate crowded hallways, interact with doors and elevators and recover safely when sensors or navigation systems fail.
• Smarter insulin dosing for type 1 diabetes – Researchers are developing computer models to predict blood sugar (glucose) trends and guide insulin delivery, but must address issues such as noisy data, reliable real-time prediction and the computational limits of wearable devices. A review by UD researchers and colleagues emphasizes the importance of interdisciplinary collaboration, standardized datasets, advances in computational infrastructure and clinical validation to turn these models into practical tools that improve patient care.
To interview Shi about AI in health care and the new DJPH issue, click his profile or email MediaRelations@udel.edu.
ABOUT WEISONG SHI
Weisong Shi is an Alumni Distinguished Professor and Chair of the Department of Computer and Information Sciences at the University of Delaware. He leads the Connected and Autonomous Research Laboratory. He is an internationally renowned expert in edge computing, autonomous driving and connected health.
His pioneering paper, “Edge Computing: Vision and Challenges,” has been cited over 10,000 times.

"Right now, storytelling is critical. Language learning is highly personal, and it’s the person-to-person relationships that grease the wheels," says Cheryl Ernst, director of the English Language Institute at the University of Delaware. She recently published English Language Programs as Facilitators of Soft Diplomacy in Innovations in Star Scholars Press. Here's how she's discussing this important topic.
Q: What is the focus of this research, and why is it important? Ernst: ELI and other English language programs provide the ideal space for communication development, cross-cultural appreciation, gaining life skills, and raising awareness about people beyond the media.
Post-pandemic, we’re hearing across campus how individuals feel less connected, and in English language classrooms, connection is critical. Language is only learned through production and practice since it’s a skill that needs to be honed. In language, there is no such thing as perfect. In our classrooms, English is the common goal, and everyone comes to that space at their own level and overflowing with imperfection. Our students learn to use their vulnerability as a tool. They learn the value of a growth mindset while living in a culture that is different from their own, and with that comes an appreciation for difference, respect for others, trust, and human-to-human communication.
Q: What inspired this research? Ernst: More than 30 years of observation, conversations, experiences, and personal relationships. There was no term for what English language programs do beyond grammar (what’s perceived, anyway). Terms like personal diplomacy, person-to-person diplomacy, civic diplomacy, and the like are used all the time and oversimplify what we do. In my readings, I started to see overlaps between soft power and diplomacy, which led to the concept of Soft Diplomacy. What distinguishes Soft Diplomacy from other more common monikers is the variety of skills that develop organically in our classrooms, skills we rarely acknowledge and students may not recognize.
Q: What are some key findings or developments? Ernst: Institutionally: ELPs can do a better job of highlighting the skills beyond English that we teach, whether organically or deliberately.
Q: How could this work potentially impact the field or the wider public? Ernst: Respecting ELPs for the space they provide and the skills they offer. It’s not “just English”; rather, it is learning to communicate in a common language and with people from around the globe.
I’d like people to realize that relationships are foundational, that there are common values across nations and that differences are not bad. Which version of English is “correct”: British or American? And if American, which one (New York? Wisconsin? Alabama? Iowa?)?
Q: What are the next steps or upcoming milestones in your research? Ernst: A former student and I have launched a podcast series called Soft Diplomacy in Action that focuses on personal stories from those who work in international education. We’ve interviewed an ELI associate professor from Morocco, the UD coordinator of the Mandela Fellows program, a professor who sees (and lives) the diplomatic value of sports, and a retired English language professional. We’re looking forward to continuing these conversations with individuals from a variety of disciplines that also work in this space but through different lenses.
ABOUT CHERYL ERNST Cheryl Ernst is the director of the English Language Institute at the University of Delaware where she and her colleagues and students practice Soft Diplomacy every day. Her professional areas of interest include program administration and international marketing, teacher training and working with international teaching assistants, curriculum design, and advanced level academic English (graduate levels).
To speak with Ernst about her work and the importance of Soft Diplomacy, reach out to MediaRelations@udel.edu.

Although AI tools can improve productivity, recent studies show that they too often intensify workloads instead of reducing them, in many cases even leading to cognitive overload and burnout. The University of Delaware's Saleem Mistry says this is creating employees who work harder, not smarter.
Mistry, an associate professor of management in UD's Lerner College of Business & Economics, says his research confirms the findings of this Feb. 9, 2026 article in the Harvard Business Review. Driven by the misconception that AI is an accurate search engine rather than a predictive text tool, these "cut and paste" employees are using the applications to pump out deliverables in seconds just to keep up with increasing workloads. Mistry notes that this prioritization of speed over accuracy is happening at every level of the organization:
• Junior staff: Blast out polished-looking but unverified drafts.
• Managers: Outsource the deep learning and critical thinking involved in summarizing data, letting their analytical skills atrophy.
• Power users: Build hidden, unapproved systems that bypass company oversight.
A management problem, not a tech problem
"When discussing this issue, I often hear leaders blame the technology. However, I believe that blaming the tech is missing the point; I see it as a failure of leadership," Mistry said. "When already overburdened employees who are constantly having to do more with less are handed vague mandates to just use AI without any training, they use it to look busy and produce volume-based work. Because many companies still reward the volume of work produced rather than the actual impact, employees naturally use these tools to generate slick but empty deliverables."
The real costs to organizations and incoming employees
Mistry outlines three risks organizations face if they don’t intervene:
1. The workslop epidemic
"These programs allow people to generate massive amounts of workslop, which is low-effort fluff that looks good but lacks substance. It takes seconds to create, but hours for someone else to decipher, fact-check, and fix," Mistry notes. "This drains money (up to $9 million annually for large companies) and destroys morale. As an educator, researcher, and a person brought into organizations to help fix problems, I for one do not want to be on the receiving end of a thoughtless, automated data dump, especially on tasks that require real skill and deep thinking."
2. Legal disaster
He also states, "When the cut and paste mentality makes its way into professional submissions, the risks to the organization are real and oftentimes catastrophic. Courts have made it perfectly clear: ignorance is no excuse. If your name is on the document, you own the liability. Recently, attorneys have faced severe sanctions, hefty fines, and case dismissals for blindly submitting fake legal citations made up by computers."
3. A warning for incoming talent
For new graduates entering this environment, Mistry offers a warning: Do not rely on AI to do your deep thinking. "If you simply use AI to blast out polished but unverified drafts, you become a replaceable 'cut and paste' employee," he says. "To truly stand out, new grads must prove they have the discernment to review, tweak, and challenge what the computer writes. The hiring edge is no longer just saying, 'I can do this task,' but 'I know how to leverage and correct AI to help me perform it.'"
Four ideas to fix it
To thrive with these new tools and avoid the unintended consequences of untrained staff, organizations should:
1. Reinforce the importance of fact-checking and editing: Adopt frameworks that teach employees how to show their work and log how they verified computer-generated facts.
2. Change the incentives: Stop rewarding busy work, useless reports, and massive slide decks. Evaluate employees on accuracy and results.
3. Eradicate superficial work: Don’t use automation to speed up ineffective legacy processes. Instead, use it to identify and eliminate them entirely.
4. Make time for editing: Give yourself and your employees the breathing room to actually review, tweak, and challenge what the computer writes instead of accepting the first draft.
Mistry is available to discuss:
• Why AI is causing an epidemic of corporate "workslop" (and how to spot it).
• The leadership failure behind the "cut and paste" employee.
• How to rewrite corporate incentives to measure impact instead of volume in the AI era.
• Strategies for implementing safe, effective AI policies at work.
• How new college graduates can avoid the "workslop" trap in their first jobs.
To reach Mistry directly and arrange an interview, visit his profile and click on the "contact" button. Interested reporters can also send an email to MediaRelations@udel.edu.