
Virtual reality training tool helps nurses learn patient-centered care

University of Delaware computer science students have developed a digital interface, a two-way system that helps nurse trainees build their communication skills and learn to provide patient-centered care across a variety of situations. The virtual reality training tool lets users rehearse their bedside manner with expectant mothers before ever encountering a pregnant patient in person. The platform was created by students in Assistant Professor Leila Barmaki’s Human-Computer Interaction Laboratory, including senior Rana Tuncer, a computer science major, and sophomore Gael Lucero-Palacios. Lucero-Palacios said the training helps aspiring nurses practice the more difficult and sensitive conversations they might have with patients. “Our tool is targeted to midwifery patients,” Lucero-Palacios said. “Learners can practice these conversations in a safe environment. It’s multilingual, too. We currently offer English or Turkish, and we’re working on a Spanish demo.” Because an avatar’s language can be changed, this judgment-free rehearsal environment also has the potential to remove language barriers to care. The idea is that the “practitioner” could speak in one language on one interface, while on the other interface the patient would hear it in their native language. The patient avatar can also be customized to resemble different health stages and populations, giving learners a varied experience. Last December, Tuncer took the project on the road, piloting the virtual reality training program for faculty members in the Department of Midwifery at Ankara University in Ankara, Turkey. With technical support provided by Lucero-Palacios back in the United States, she ran a demo with the Ankara team, showcasing the capabilities of the UD-developed system’s interactive rehearsal environment.
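The article does not describe how the lab's cross-language relay is implemented, but the idea it sketches (practitioner speaks in one language, the patient avatar "hears" another) can be illustrated in a few lines. This is a minimal sketch only: the function names, the stub speech recognizer, and the toy phrase table below are all assumptions standing in for real speech-recognition and translation services.

```python
# Toy phrase table standing in for a real machine-translation service.
PHRASES = {
    ("en", "tr"): {
        "How are you feeling today?": "Bugün kendinizi nasıl hissediyorsunuz?",
    },
}

def speech_to_text(audio: bytes) -> str:
    """Stub for a speech-recognition call; here the 'audio' is already text."""
    return audio.decode("utf-8")

def translate(text: str, src: str, dst: str) -> str:
    """Look up the utterance in the toy phrase table; fall back to the original."""
    return PHRASES.get((src, dst), {}).get(text, text)

def relay_utterance(audio: bytes, practitioner_lang: str, patient_lang: str) -> str:
    """Practitioner speaks on one interface; the patient avatar hears another language."""
    text = speech_to_text(audio)
    return translate(text, practitioner_lang, patient_lang)

print(relay_utterance(b"How are you feeling today?", "en", "tr"))
# → Bugün kendinizi nasıl hissediyorsunuz?
```

In a real system each stub would be an asynchronous call to a recognition, translation, and text-to-speech service, which is presumably where the "server connections" hurdles mentioned below come in.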
Last winter, University of Delaware senior Rana Tuncer (left), a computer science major, piloted the virtual reality training program for Neslihan Yilmaz Sezer (right), associate professor in the Department of Midwifery at Ankara University in Ankara, Turkey. Meanwhile, for Tuncer, Lucero-Palacios and the other students involved in the Human-Computer Interaction Laboratory, developing the VR training tool offered the opportunity to sharpen their computer science, data science and artificial intelligence skills outside the classroom. “There were lots of interesting hurdles to overcome, like figuring out a lip-sync tool to match the words to the avatar’s mouth movements and figuring out server connections and how to get the languages to switch and translate properly,” Tuncer said. Lucero-Palacios was fascinated by developing text-to-speech capabilities and by the ability of technology to impact patient care. “If a nurse is well-equipped to answer difficult questions, then that helps the patient,” said Lucero-Palacios. The project is an ongoing research effort in the Barmaki lab that has involved many students. Significant developments occurred during the summer of 2024, when undergraduate researchers Tuncer and Lucero-Palacios contributed to the project with funding support from the National Science Foundation (NSF), but work began before and continued well beyond that summer. UD senior Gavin Caulfield provided foundational support in developing the program’s virtual environment and contributed to the development of its text-to-speech/speech-to-text capabilities. CIS doctoral students Fahim Abrar and Behdokht Kiafar, along with Pinar Kullu, a postdoctoral fellow in the lab, used multimodal data collection and analytics to quantify the participant experience. “Interestingly, we found that participants showed more positive emotions in response to patient vulnerabilities and concerns,” said Kiafar.
The work builds on previous research Barmaki, an assistant professor of computer and information sciences and resident faculty member in the Data Science Institute, completed with colleagues at the New Jersey Institute of Technology and the University of Central Florida in an NSF-funded project focused on empathy training for healthcare professionals using a virtual elderly patient. In that project, Barmaki employed machine learning tools to analyze a nursing trainee’s body language, gaze, and verbal and nonverbal interactions to capture micro-expressions (facial expressions) and the presence or absence of empathy. “There is a huge gap in communication when it comes to caregivers working in geriatric care and maternal-fetal medicine,” said Barmaki. “Both disciplines have high turnover and challenges with lack of caregiver attention to delicate situations.” UD senior Rana Tuncer (center) met with faculty members Neslihan Yilmaz Sezer (left) and Menekse Nazli Aker (right) of Ankara University in Ankara, Turkey, to educate them about the virtual reality training tool she and her student colleagues have developed to enhance patient-centered care skills for health care professionals. When these human-human interactions go wrong, for whatever reason, the effects can extend beyond a single patient visit. For instance, a pregnant woman who has a negative health care experience might decide not to continue routine pregnancy care. Beyond the project’s potential to improve the field readiness of health care professionals, Barmaki was keen to note the benefits of real-world workforce development for her students. “Perceptions still exist that computer scientists work in isolation with their computers and rarely interact, but this is not true,” Barmaki said, pointing to the multi-faceted team involved in this project. “Teamwork is very important. 
We have a nice culture in our lab where people feel comfortable asking their peers or more established students for help.” Barmaki also pointed to the potential application of these types of training environments, enabled by virtual reality, artificial intelligence and natural language processing, beyond health care. With the framework in place, she said, the idea could be adapted for other types of training involving human-human interaction, say in education, cybersecurity, even in emerging technology such as artificial intelligence (AI). Keeping people at the center of any design or application of this work is critical, particularly as uses for AI continue to expand. “As data scientists, we see things as spreadsheets and numbers in our work, but it’s important to remember that the data is coming from humans,” Barmaki said. While this project leverages computer vision and AI as a teaching tool for nursing assistants, Barmaki explained this type of system can also be used to train AI and to enable more responsible technologies down the road. She gave the example of using AI to study empathic interactions between humans and to recognize empathy. “This is the most important area where I’m trying to close the loop, in terms of responsible AI or more empathy-enabled AI,” Barmaki said. “There is a whole area of research exploring ways to make AI more natural, but we can’t work in a vacuum; we must consider the human interactions to design a good AI system.” Asked whether she has concerns about the future of artificial intelligence, Barmaki was positive. “I believe AI holds great promise for the future, and, right now, its benefits outweigh the risks,” she said.

5 min. read

Nurse Scientist Susan Smith Birkhoff Makes Two Research ‘Firsts’ in Delaware

Susan Smith Birkhoff, Ph.D., RN, is making nursing history in the First State through the Delaware IDeA Network of Biomedical Research Excellence (INBRE). She is the first nurse scientist to be named an INBRE site principal investigator and the first nurse to receive the Seema S. Sonnad Mentor of the Year Award from INBRE’s Junior Investigator Network. INBRE is a collaborative network of Delaware academic, health care and research institutions, composed of ChristianaCare, Delaware State University, Delaware Technical Community College, Nemours Children’s Health and University of Delaware.

First nurse scientist to lead INBRE site

As the INBRE site principal investigator at ChristianaCare, Smith Birkhoff will expand on the research network’s success at a large academic health center. In collaboration with the INBRE partners and the program manager, Kellie Patterson, BSN, RN, CCRP, she will leverage centers of excellence across ChristianaCare to host an exceptional student program, increase the health system's contributions to the pilot program pool and grow the visibility of INBRE across the enterprise. “Susan brings a terrific combination of skills to this role,” said Omar Khan, M.D., MHS, FAAFP, chief scientific officer for ChristianaCare and institutional representative on the INBRE steering committee. “She is a mentor, scientist and teacher, and her experience with INBRE and the state’s other premier research programs will ensure that we deliver the highest value for the Delaware community.” Smith Birkhoff leads and supports interprofessional research education, systemwide technology evaluation, and grantsmanship. She spearheads a diverse research program, encompassing areas such as robotics in health care, virtual reality in medicine and burnout in the nursing workforce.
As program director of Technology Research & Education at ChristianaCare, she collaborates across the health system’s academic research enterprise to achieve both clinician- and patient-oriented research outcomes. “Susan is a wonderful colleague and she is a true researcher-educator,” said Neil Jasani, M.D., MBA, FACEP, chief academic officer for ChristianaCare. “She is a great fit for the work of Delaware INBRE as we advance ChristianaCare’s contribution to this essential research network.” She co-leads an innovative program to study one of the first deployments of increasingly autonomous robots in a U.S. health care setting and directs the first Nursing Research Fellowship in Robotics and Innovation, housed at ChristianaCare.

First nurse named Mentor of the Year

Smith Birkhoff received the 2025 Seema S. Sonnad Mentor of the Year Award from INBRE’s Junior Investigator Network, nominated for her exceptional mentorship by ChristianaCare colleagues whom she mentored. Her nominators were Kaci Rainey, MSN, RN, CEN, TCRN, an evidence-based practice specialist at ChristianaCare, and Briana Abernathy, BSN, RN, CEN, a nurse in utilization management at ChristianaCare and an inaugural nurse fellow in the Nursing Research Fellowship in Robotics and Innovation. “They say that if you are not at the table, you are on the menu. We are profoundly grateful that Dr. Smith Birkhoff selflessly provided us with a seat at the table and an overflowing feast of knowledge,” said Abernathy in presenting the award. “This knowledge has quenched our thirst for change and fueled our hunger for research and innovation, setting the stage for the rest of our careers.”

Susan Birkhoff, Ph.D., RN
Neil Jasani, M.D., MBA, FACEP
Omar A. Khan, M.D., MHS, FAAFP
3 min. read

Dangers of the Metaverse and VR for U.S. Youth Revealed in New Study

The metaverse, a space where the lines between physical and digital realities blur, is growing in popularity among younger populations. As of March, 33% of teens own a virtual reality (VR) device and 13% use it weekly. With the metaverse offering richer emotional experiences, youth may be particularly vulnerable to significant harm in these immersive spaces, underscoring the need to explore potential risks. Unfortunately, research on online victimization in the metaverse is sorely lacking. A new study by Florida Atlantic University, in collaboration with the University of Wisconsin-Eau Claire, is one of the first to examine experiences of harm in the metaverse among youth in the United States. Using a nationally representative sample of 5,005 13- to 17-year-olds in the U.S., researchers focused on their experiences with VR devices, including 12 specific types of harm experienced, protective strategies employed, and differences in experiences between boys and girls. Results of the study, published in the journal New Media & Society, found that a significant percentage of youth reported experiencing various forms of harm in these spaces, including hate speech, bullying, harassment, sexual harassment, grooming behaviors (predators building trust with minors), and unwanted exposure to violent or sexual content. The study also revealed notable gender differences in experiences. Among the study findings:

- 32.6% of youth own a VR headset (41% of boys vs. 25.1% of girls)
- More than 44% received hate speech/slurs (8.9% many times); 37.6% experienced bullying; and 35% faced harassment
- Almost 19% experienced sexual harassment; 43.3% dealt with trolling; 31.6% were maliciously obstructed; and 29.5% experienced threats
- More than 18% were doxed (having personal information publicly revealed without consent); and 22.8% were catfished (deceived by a false online identity, typically for romantic purposes)
- Nearly 21% faced unwanted violent or sexual content; 18.1% experienced grooming or predatory behavior; and 30% were targeted for factors like weight, sexual preference, sexual orientation or political affiliation

Boys and girls experienced similar patterns of mistreatment, but girls experienced sexual harassment and grooming/predatory behavior more frequently than boys. Boys and girls were equally likely to be targeted because of their voice, avatar, race, religion or disability. “Certain populations of youth are disproportionately susceptible to harm such as grooming, especially those who suffer from emotional distress or mental health problems, low self-esteem, poor parental relationships and weak family cohesion,” said Sameer Hinduja, Ph.D., first author, a professor in the School of Criminology and Criminal Justice within FAU’s College of Social Work and Criminal Justice, co-director of the Cyberbullying Research Center, and a faculty associate at the Berkman Klein Center at Harvard University. “Due to the unique characteristics of metaverse environments, young people may need extra attention and support. 
The immersive nature of these spaces can amplify experiences and emotions, highlighting the importance of tailored resources to ensure their safety and well-being.” Findings also reveal that girls employed in-platform safety measures such as “Space Bubble,” “Personal Boundary” and “Safe Zone” significantly more than boys. “We found that girls are more likely to select avatars designed to reduce the risk of harassment and to use in-platform tools to maintain a safe distance from others. Additionally, both boys and girls feel comfortable leaving metaverse rooms or channels, like switching servers, in response to potential or actual victimization, although overall, youth tend to use these safety features infrequently,” said Hinduja. The researchers’ recommendations include:

- Using platform-provided safety features to restrict unwanted interactions and infringements upon personal space. It is also essential that youth understand and take advantage of the safety features available within metaverse experiences, including blocking, muting, and reporting functionalities.
- Continued research and development to determine how to meet the needs of users in potential or actual victimization contexts.
- Streamlining platform reporting mechanisms to ensure swift action is taken against perpetrators.
- Age-gating mechanisms for metaverse environments where mature content and interactions proliferate.
- Encouraging parents and guardians to familiarize themselves with available parental control features on VR devices and metaverse platforms to set boundaries, monitor activities, and restrict certain features as needed. An active mediation approach is ideal, where they engage in open and supportive dialogue with children about their metaverse experiences.
- Integrating updated, relevant, and accessible digital citizenship and media literacy modules into school curricula to provide youth with the knowledge and skills to navigate VR and other emerging technologies safely and responsibly.
- Consideration by content creators of the ethical implications of their metaverse creations, ensuring that they promote inclusivity and respect and discourage any form of harassment. They should strive to make their virtual experiences accessible to users from diverse backgrounds, languages, cultures and abilities.

“VR concerns of parents and guardians generally reflect and align with their historical anxieties about video games, excessive device use, its sedentary nature, cognitive development, and stranger danger,” said Hinduja. “There remains so much promise with these new technologies, but vigilance is required when it comes to the unique challenges they present, as well as the unique vulnerabilities that certain youth users may have. As such, it’s ‘all hands on deck’ to build a safer and more inclusive metaverse as it continues to evolve.” If you're looking to know more, let us help. Sameer Hinduja, Ph.D., is a professor in the School of Criminology and Criminal Justice at Florida Atlantic University and co-director of the Cyberbullying Research Center. He is recognized internationally for his groundbreaking work on the subjects of cyberbullying and safe social media use, concerns that have paralleled the exponential growth in online communication by young people. He has written seven books, and his interdisciplinary research is widely published and has been cited more than 18,000 times. Simply click on Sameer's icon now to set up an interview today.

Sameer Hinduja, Ph.D.
4 min. read

VR Simulation to Demonstrate the Danger of Snow Squalls

Since 2017, Dr. Jase Bernhardt, Hofstra associate professor of geology, environment, and sustainability, has been using virtual reality technology to teach the public about the dangers of rip currents, hurricanes, and flash flooding. His most recent award, a $100,000 Road to Zero Community Traffic Safety Grant from the National Safety Council, is enabling him to tackle another seasonal weather worry: driving in snow squalls. The National Safety Council received funding for this grant from the National Highway Traffic Safety Administration. Bernhardt’s project aims to share information about the onset of snow squalls, the importance of heeding emergency weather advisories, and what drivers should do if they are on the road when a snow squall occurs. Although squalls are infrequent, they are extreme and frightening winter weather events that can result in a rapid onset of heavy snow, low visibility, icy roadways, and frigid temperatures. “Snow squalls are a very specific type of weather phenomenon. They often occur on a clear, calm day, with no warning of precipitation,” Bernhardt said. “Seemingly out of nowhere, you’ll see clouds, followed by a quick burst of very heavy snow. For a short time, perhaps only 10, 15 minutes, there are whiteout conditions where drivers can barely see the road ahead of them.” According to the U.S. Department of Transportation website, 24% of weather-related vehicle crashes occur on snowy, slushy, or icy pavement, and 15% happen during snowfall or sleet. More than 1,300 people are killed and nearly 117,000 people are injured in vehicle crashes on snowy, slushy, or icy pavement annually. Because winters in the New York metropolitan area have been mild for the last few years, Bernhardt worries that people have been lulled into a false sense of security about driving during winter storms or squalls. “We’re not used to being in that kind of severe weather anymore,” he said. 
“Snow squalls can be deadly in terms of massive collisions and multi-vehicle chain collisions. The key thing to remember is that they come in rapidly, catching people by surprise.” Bernhardt is collaborating on the software for the snow squall VR simulation with Frank Martin ’22, ’23, who earned both a BS and an MS in Computer Science from Hofstra University. Users will wear a headset and hold a device – like a video game controller – in each hand to replicate the movements of a steering wheel. In this way, users will experience what it is like to drive from clear, pleasant conditions into a brutal wall of snow. Bernhardt said that if a warning for a snow squall is issued via emergency broadcast, he hopes people who have used the simulation will understand the urgency of getting off the road or pulling onto the shoulder and remaining in their vehicle. “We want people to have an experience that is as close to reality as possible. The idea is to simulate how sudden and terrifying snow squalls can be and give people an opportunity to learn what they should and should not do if they are caught in one,” Bernhardt said. In conjunction with the VR simulation, Bernhardt is developing a survey to determine people’s reactions to emergency messaging and how effective it is. He will work with the National Weather Service to have the simulation and surveys available by fall 2025 for use at training and outreach events throughout the Northeast. As with Bernhardt’s rip current project, there are plans to have a version of the snow squall simulation and corresponding literature available in Spanish. Dr. Sasha Pesci, Hofstra assistant professor of geology, environment, and sustainability, is co-principal investigator on the grant and is helping with the translation of materials. “More and more, the National Weather Service, state and federal governments, and other agencies recognize the importance of having this information available in other languages,” Bernhardt said. 
“There are a lot of drivers whose primary language is Spanish, and they include cab and Uber drivers, and truckers.” Jase Bernhardt is available to speak with media about this topic - simply click on his icon now to arrange an interview today.

Jase Bernhardt
3 min. read

One of the crowd or one of a kind? New artificial intelligence research indicates we're a bit of both

- Evidence that behaviour follows a two-step process when we’re in a crowd
- We are likely to imitate the crowd first and think independently second
- Findings will increase understanding of how humans make decisions based on others’ actions

An Aston University computer scientist has used artificial intelligence (AI) to show that we are not as individual as we may like to think. In the late 1960s, the famous psychologist Stanley Milgram demonstrated that if a person sees a crowd looking in one direction, they’re likely to follow its gaze. Now, Dr Ulysses Bernardet in the Computer Science Research Group at Aston University, collaborating with experts from Belgium and Germany, has found evidence that our actions follow a two-step process when we’re in a crowd. Their results, published in iScience as Evidence for a two-step model of social group influence, show that we go through a two-stage process, where we’re more likely to imitate a crowd first and think independently second. The researchers believe their findings will increase the understanding of how humans make decisions based on what others are doing. To test this idea, the academics created an immersive virtual reality (VR) experiment set in a simulated city street. Each of the 160 participants was observed individually as they watched a movie within the virtual reality environment created for the experiment. As they watched the movie, 10 computer-generated ‘spectators’ within the simulated street were operated by AI in an attempt to influence the direction of the individual participants’ gaze. During the experiment, three different sounds, such as an explosion, were played coming from either the left or right of the virtual street. At the same time, a number of the ‘spectators’ looked in a specific direction, not always in the direction of the virtual blast or the other two sounds. The academics calculated a direct and an indirect measure of gaze-following. 
The direct measure was the proportion of trials in which participants followed the gaze of the crowd. The indirect measure took into account the reaction speed of participants depending on whether they were instructed to look in the same or the opposite direction as the audience. The experiment’s results support the understanding that the influence of a crowd is best explained by a two-step model. Dr Bernardet said: “Humans demonstrate an initial tendency to follow others – a reflexive, imitative process. But this is followed by a more deliberate, strategic process when a person decides whether or not to copy others around them. “One way in which groups affect individuals is by steering their gaze. “This influence is not only felt in the form of social norms but also impacts immediate actions and lies at the heart of group behaviours such as rioting and mass panic. “Our model is not only consistent with evidence gained using brain imaging, but also with recent evidence that gaze following is the manifestation of a complex interplay between basic attentional and advanced social processes.” The researchers believe their experiments will pave the way for increased use of VR and AI in the behavioural sciences.
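The two measures described above are straightforward to compute from trial data. The exact formulas from the iScience paper are not given in the article, so the trial format and the congruency-effect definition in this sketch are assumptions; it simply illustrates the idea of a proportion-based direct measure and a reaction-time-based indirect measure.

```python
def direct_measure(followed_flags):
    """Proportion of free-viewing trials in which the participant
    looked where the crowd looked."""
    return sum(followed_flags) / len(followed_flags)

def indirect_measure(trials):
    """Congruency effect: mean reaction time (RT) when instructed to look
    opposite the crowd, minus mean RT when instructed to look the same way.
    A positive value means the crowd's gaze slowed 'opposite' responses."""
    same = [rt for rt, congruent in trials if congruent]
    opposite = [rt for rt, congruent in trials if not congruent]
    return sum(opposite) / len(opposite) - sum(same) / len(same)

# One hypothetical participant: followed the crowd in 3 of 4 free-viewing trials...
print(direct_measure([True, True, False, True]))  # 0.75
# ...and was slower when told to look away from the crowd (RTs in seconds).
print(indirect_measure([(0.42, True), (0.40, True), (0.55, False), (0.53, False)]))
```

A positive indirect measure for the hypothetical participant above (about 0.13 s) would indicate a reflexive pull toward the crowd's gaze even when the instruction was to look away, consistent with the first, imitative step of the model.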

3 min. read

Looking for a 'real' expert to explain the latest advances in virtual reality? We're here to help!

Virtual reality is quickly becoming an actual necessity in all facets of technology, education, entertainment and the workplace. It's a popular topic, and Augusta University's Lynsey Steinberg sat down to answer a few questions about how far VR has come and where it's going.

VR is changing everyday life for many. What are the biggest advances you've seen in VR use?

Virtual reality is rapidly evolving and expanding. It wasn’t too long ago that we were excited by the idea of a wireless headset. Now there are companies such as Virtuix creating 360-degree treadmills to interact with your experience in VR, and Hypnos VR, a product that releases scents into the air based on the experience in VR. There have been advances in adaptive and stress-response simulations based on pupillometry measurements, and even the integration of physiological sensors for behavioral research. The biggest advancements are solutions that were unimaginable before and are now entirely possible.

It seems the medical field has been a big beneficiary of VR. Is this giving students a better way to "learn" about anatomy and other aspects of the field?

I believe all experience is valuable to learning. VR is unique in allowing an individual to view as if from their own perspective for virtual experiential learning. We often hear the phrase, “If you could imagine walking a mile in someone else’s shoes,” and now we can provide that perspective, allowing another person to view the world as someone with a particular disease or to simulate training in a low-risk environment. One example, Fire in the OR, is a VR simulation that allows medical professionals to train safely on how to remove fire danger in the operating room. I believe simulations like these are remarkable examples of how valuable VR can be in education, removing elements of danger from everyday life. Their research showed a 250% improvement rate in fire safety in the OR. 
A huge industry leader in surgical simulations is Osso VR, which creates surgical training procedures for surgeons and has hired some of our Augusta University medical illustration graduates.

How is this being applied at Augusta University?

The Center for Instructional Innovation created modules with the Medical College of Georgia on handwashing hygiene and end-of-life care scenarios with the College of Nursing. We encourage faculty to develop multiple methods of interactive modules for the benefit of all learning styles. VR certainly provides engaging and enriching materials for a low-risk environment in instruction. The Center for Instructional Innovation is currently working with the Academic Student Success Center to implement Oculus Quest headsets so that anatomy and physiology students can benefit from application use in VR.

Augusta University student Henry Oh and his 3D-printed pottery from a VR sculpture.

How else has VR and its use changed the way we go about our daily lives?

VR headsets are known in robotics, manufacturing, therapeutic modalities, gaming capabilities, and technology in research and education. Any scene you can film in 360 degrees you can now watch in a headset and be fully immersed in the scene (i.e., a theater production, a museum tour, an art exhibit, a historically preserved temple, etc.). We have gone from telling a story to being immersed in a story. We have been able to utilize VR technology integration and innovation on campus to create enriching learning experiences. We collaborated with our Ceramics department (with Brian McGrath and Raoul Pachecho) to support students in virtual clay sculpting with Adobe Medium. Students 3D printed their works of art after exporting the files from the VR simulation.

Where do you see the future of VR?

The future developments integrating systems for haptic feedback will be remarkable. 
The continued development of behavioral research and the integration of gamification are exciting opportunities in VR, as is the continued development of protocols and appropriate safety procedures. The cross-platform and cross-disciplinary possibilities will allow creativity to blossom into new solutions. It is clear that the need for a technical workforce to create and support VR and other high-impact technology is rapidly growing. VR is a fascinating topic, and if you're a journalist looking to know more by speaking with Steinberg, let us help. Steinberg is one of 300 board-certified medical illustrators, with experience in hands-on surgery in the operating room, utilizing developments in virtual reality, 3D printing, animation, gamification and graphic design while working directly with students, faculty and physicians. Steinberg is available to speak with media -- simply click on her icon now to arrange an interview today.

Lynsey Steinberg
4 min. read

Advertising has evolved. Let our expert explain the power of innovative marketing

Late last month, Nike once again shook up the marketing landscape with a jaw-dropping new 3D billboard in Japan. It's a concept that has turned the traditional billboard into something almost out of this world. The move has once again elevated Nike as a leader in marketing. Augusta University's Dr. Christopher McKinney, associate vice president for innovation commercialization, answers a few questions for anyone looking to know more about this marketing innovation.

Are you surprised how fast the landscape is changing when it comes to emerging media?

Not at all. The increases in both computing power and graphics processing unit power are transforming what we can do. Now that the door of opportunity is more widely open, we’re seeing great new ways to use that power in marketing.

The new Nike 3D billboard seems to be a game changer when it comes to advertising. Do you see this gaining traction as a trend that will continue? Do you see them going even further with this, and how so?

I do see this as a trend that will continue and even accelerate. Nike has thrown down the gauntlet; others will be trying to “one-up” Nike. Beyond the graphic elements, we’ll see ever more clever uses of 3D sound in some marketing applications. In more traditional billboard settings, the brightness, clarity and realism will improve to the point where it will be increasingly difficult to differentiate animation from what looks like a real-life image.

The advent of CG in general is always changing. What's next in that realm?

With the growth of virtual reality and augmented reality, we’ll see increasing applications in the home and workplace that take advantage of VR/AR technologies. This will be especially important in areas such as education, accommodation of disabilities and entertainment.

What does the future hold?

In the next decade, we will likely see more tailored marketing using these CG-mediated technologies. 
Marketing is an ever-evolving tool used by businesses of every size and in every community. If you are a reporter looking to learn more about what's next in marketing, then let us help. McKinney specializes in marketing and commercializing innovative new technologies. He is available to speak with media; simply click on his icon to arrange an interview today.

Christopher McKinney
2 min. read

Virtual Reality-Based Surgical Simulations Could Make Patients Safer

Suvranu De, the director of the Center for Modeling, Simulation, and Imaging in Medicine at Rensselaer, has dedicated more than a decade of research to making surgery safer by developing virtual reality-based surgical training simulations that closely mimic the optics and haptics a surgeon may encounter in the operating room. A new $2.3 million grant from the National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health will further his research in this space by supporting the development of a collaborative virtual reality-based surgical simulation environment that allows medical professionals to practice technical, cognitive, and interpersonal skills as a team. “People will be wearing head-mounted displays, and they will be immersed in a virtual operating room working on a virtual patient as a team,” De said. “We want to have an expert team in the operating room focused on the treatment of a patient, and not just a team of experts.” Conceptually, this approach is similar to crew resource management practiced by aviation pilots, which has led to a significant reduction in aircraft accidents. The Virtual Operating Room Team Experience (VORTeX) simulation system will provide realistic distractions, interruptions, and other stressors that medical professionals may encounter in an operating room. Traditionally, this type of simulation training has required mannequins, instructors, and a dedicated space, as well as significant coordination and resources. In contrast, the VORTeX system will be both distributed and asynchronous – allowing participants to join the simulation from different locations, and instructors to review the simulation and provide feedback at their convenience. Machine learning algorithms will be used to analyze the data and provide feedback to participants, who will be able to return to the virtual environment to review their performance.
De is available to discuss how this type of virtual training is developed and implemented.

2 min. read

‘Alexa for chemistry’: National Science Foundation puts VCU and partners on fast track to build open network

D. Tyler McQuade, Ph.D., professor in the Department of Chemical and Life Science Engineering at Virginia Commonwealth University College of Engineering, is principal investigator of a multi-university project seeking to use artificial intelligence to help scientists come up with the perfect molecule for everything from a better shampoo to coatings on advanced microchips. The project is one of the first in the U.S. to be selected for $994,433 in funding as part of a new pilot project of the National Science Foundation (NSF) called the Convergence Accelerator (C-Accel). McQuade and his collaborators will pitch their prototype in March 2020 in a bid for additional funding of up to $5 million over five years. Adam Luxon, a Ph.D. student in the Department of Chemical and Life Science Engineering who has been involved from the beginning, explained it this way: “We want to essentially make the Alexa of chemistry.” Just as Amazon, Google and Netflix use data algorithms to suggest customized predictions, the team plans to build a platform and open knowledge network that can combine and help users make sense of molecular sciences data pulled from a wide range of sources including academia, industry and government. The idea is right in line with the goal of the NSF program: to speed up the transition of convergence research into practice in nationally critical areas such as “Harnessing the Data Revolution.” The team itself reflects expertise across several specialties. Working with McQuade are James K. Ferri, Ph.D., professor in the Department of Chemical and Life Science Engineering; Carol A. Parish, Ph.D., professor of chemistry and the Floyd D. and Elisabeth S. Gottwald Chair in the Department of Chemistry at the University of Richmond; and Adrian E. Roitberg, Ph.D., professor in the Department of Chemistry at University of Florida. 
Two companies are also involved with the project: Two Six Labs, based in Arlington, Virginia, and Fathom Information Design, based in Boston, Massachusetts. Currently, there is no shared network or central portal where molecular scientists and engineers can harness artificial intelligence and data science tools to build models to support their needs. What’s more, while scientists have been able to depict what elements make up a molecule, how the atoms are arranged in space and what the properties of that molecule are (such as its melting point), there is no standard way to represent — or predict — molecular performance. The team aims to fill these gaps by advancing the concept of a “molecular imprint.” The collaborators will create a new system that represents molecules by combining line-drawing, geometry and quantum chemical calculations into a single, machine-learnable format. They will develop a central platform for collecting data, creating these molecular imprints and developing algorithms for mining the data, and will build machine learning tools to create performance prediction models. Parish said, “The ability to compute molecular properties using computational techniques, and to dovetail that data with experimental measurements, will generate databases that will produce the most comprehensive results in the molecular sciences. “There are many laboratories around the world working in this space; however, there are few organizational structures available that encourage open sharing of these data for the benefit of the community and the common good. We seek to collaborate with others to provide this structure; an open knowledge network or repository where scientists can deposit their molecular-level experimental and computational data in exchange for user-friendly tools to help manage and query the data.” The initial response from potential partners has been strong.
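As a very rough illustration of the “molecular imprint” idea described above (combining a line drawing, geometry and quantum chemical calculations into a single, machine-learnable format), one could imagine concatenating those pieces into one feature vector. Everything below, including the class name, field choices and descriptor values, is a hypothetical sketch for illustration, not the team’s actual design:

```python
# Hypothetical sketch of a "molecular imprint" combining three views of a
# molecule into one machine-learnable vector. All names and values invented.
from dataclasses import dataclass
from typing import List


@dataclass
class MolecularImprint:
    smiles: str                     # line-drawing representation (SMILES string)
    geometry: List[float]           # flattened 3D atomic coordinates
    quantum_features: List[float]   # e.g., computed orbital energy, dipole moment

    def to_feature_vector(self) -> List[float]:
        """Concatenate a simple numeric encoding of the SMILES string with the
        geometry and quantum-derived values into a single flat vector."""
        smiles_encoding = [float(ord(c)) for c in self.smiles]
        return smiles_encoding + self.geometry + self.quantum_features


# Example: water, with placeholder geometry and quantum descriptor values
water = MolecularImprint(
    smiles="O",
    geometry=[0.0, 0.0, 0.117, 0.0, 0.757, -0.469, 0.0, -0.757, -0.469],
    quantum_features=[-0.51, 1.85],
)
vec = water.to_feature_vector()
```

In practice the team’s format would presumably use far richer encodings than character codes, but the sketch shows the core idea: heterogeneous chemical representations flattened into one input a prediction model can consume.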
Ferri and the others have already collected more than a dozen letters from major corporations such as Dow and Merck expressing interest in participating. Also on board are Idaho National Laboratory and Argonne National Laboratory, as well as national chemical engineering and chemistry organizations. McQuade said that chemical engineers in major industries, including consumer products and oil and gas, expend a lot of effort running experiments to determine the molecule they want to use, such as finding the best shampoo additive that doesn’t make babies cry. “The ability to design the properties you want is still more art than science,” he said. The team also plans to develop a toolkit for processing and visualizing the data. Roitberg, whose research focuses include advanced visualization, said this could take the form of a virtual reality realm in which a user could find materials that are soluble in water but not oil, for instance, and then be able to browse for similar materials nearby. “We envision a very interactive platform where the user can explore relations between data and desired material properties,” he said.

4 min. read

Oculus Go is coming – the market is about to evolve

Oculus Go is an all-in-one virtual reality (VR) headset that will offer thousands of games and 360-degree experiences without wires or even a PC to attach to. It’s coming in early 2018, according to Facebook, but Canadian availability remains something of a mystery. The marketing, however, has already begun. There is a lot of hype around virtual reality, but will this technology be the game changer that shifts the market toward wireless VR experiences and away from screenless viewers (headsets that require users to insert their smartphone)? Or will it take longer for VR to become mainstream, altering how we all consume games, media and virtually every experience that requires or incorporates technology? Is Oculus set to be the next Apple or Amazon? What will the future look like, and who will benefit most from VR? Which VR experiences are of most interest to Canadian consumers? These are early days; what will VR look like in a decade? As the market evolves, it seems VR’s market potential has been diminished by the emergence of mobile AR as a rival platform. Even as the market pushes toward standalone headsets, premium VR may not accelerate until second-generation standalone VR headsets break out over the next few years. There are a lot of questions about virtual reality, and that’s where the experts from IDC Canada can help. Emily Taylor is a senior research analyst covering consumer services and technology markets. She can also provide unique and intelligent insight into the new landscape of virtual reality and augmented reality technologies for both consumers and businesses in Canada. Watch her video for more information on the VR/AR market in Canada, then simply click on her icon to arrange an interview.

2 min. read