Virtual Reality-Based Surgical Simulations Could Make Patients Safer

June 30, 2020 · 2 min read
Featuring:

Suvranu De, the director of the Center for Modeling, Simulation, and Imaging in Medicine at Rensselaer, has dedicated more than a decade of research to making surgery safer by developing virtual reality-based surgical training simulations that closely mimic the optics and haptics a surgeon may encounter in the operating room.


A new $2.3 million grant from the National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health will further his research in this space, by supporting the development of a collaborative virtual reality-based surgical simulation environment that allows medical professionals to practice technical, cognitive, and interpersonal skills as a team.


“People will be wearing head-mounted displays, and they will be immersed in a virtual operating room working on a virtual patient as a team,” De said. “We want to have an expert team in the operating room focused on the treatment of a patient, and not just a team of experts.”


Conceptually, this approach is similar to crew resource management practiced by aviation pilots, which has led to a significant reduction in aircraft accidents. The Virtual Operating Room Team Experience (VORTeX) simulation system will provide realistic distractions, interruptions, and other stressors that medical professionals may encounter in an operating room.


Traditionally, this type of simulation training has required mannequins, instructors, and a dedicated space, as well as significant coordination and resources. In contrast, the VORTeX system will be both distributed and asynchronous – allowing participants to join the simulation from different locations, and instructors to review the simulation and provide feedback at their convenience. Machine learning algorithms will analyze performance data and provide feedback to participants, who will be able to return to the virtual environment to review their performance.


De is available to discuss how this type of virtual training is developed and implemented.



Connect with:
  • Suvranu De
    Department Head, Mechanical, Aerospace, and Nuclear Engineering (MANE); Director, Center for Modeling, Simulation, and Imaging in Medicine (CeMSIM); and J. Erik Jonsson ’22 Distinguished Professor of Engineering

    Researches novel computational technology to solve challenging and high-impact problems in engineering, medicine, and biology
