Improving Outcomes with AI-Powered Virtual Surgical Simulations

AI-generated image of a doctor in an operating room wearing VR glasses (Generated with AI by ZoomTeam for AdobeStock)

Training medical professionals in virtual environments has become more realistic with artificial intelligence (AI) that provides instant feedback and supports team interaction during mock operations. Researchers at the FAMU-FSU College of Engineering are making it “real” in a new study funded by the National Institutes of Health (NIH).

Led by Suvranu De, a mechanical engineering professor and dean of the FAMU-FSU College of Engineering, a group of engineering researchers recently developed technology that uses natural language processing, machine learning and data analytics to enable medical professionals to train together from any location over a network connection.

Suvranu De, professor of mechanical engineering and dean of the FAMU-FSU College of Engineering (M Wallheiser/FAMU-FSU Engineering)

“There are already virtual reality-based surgical systems,” De said. “Our proposed system combines a multi-learner virtual reality (VR) environment with AI agents to provide targeted feedback to each learner and the team.”

For the new study, De is partnering with Cullen D. Jackson, Ph.D., of Beth Israel Deaconess Medical Center in Boston and Daniel B. Jones, M.D., of Rutgers New Jersey Medical School in Newark as part of the multidisciplinary research.

The research, published in the Journal of the American Medical Association (JAMA), investigates the practicality of a multiuser virtual operating room for learning the nontechnical skills that are critical in avoiding devastating surgical events. The team believes the technology has advanced enough to offer valuable feedback during surgical procedures.

De emphasizes, “Enhancing nontechnical skills for operating room teams reduces errors and increases patient safety, which are priorities for hospitals, professional societies and malpractice carriers.”

So how does it work?

Participants, including surgeons, nurses, and anesthesiologists, enter an intelligent immersive operating room as an avatar surgical team and work on virtual patients with simulated physiology. The system uses advanced computational algorithms to mimic physiological changes based on patient characteristics and events that happen during the operation. AI observers review each participant’s performance, compare it to expert “gold standards” and provide real-time feedback and guidance.
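
The article does not spell out how that comparison works, but the general idea of scoring a trainee’s logged actions against an expert-defined checklist and returning targeted feedback can be sketched in a few lines. Everything in the hypothetical Python example below (the step names, the ordering check, the scoring rule) is an assumption for illustration, not the researchers’ actual algorithm.

```python
# Hypothetical sketch: score a trainee's logged actions against an expert
# "gold standard" checklist, as the article describes conceptually.
# Step names and scoring rules are illustrative assumptions, not the
# researchers' actual algorithm.

from dataclasses import dataclass

GOLD_STANDARD = [            # expert-defined order of key steps (assumed)
    "time_out_briefing",
    "confirm_patient_identity",
    "announce_incision",
    "monitor_vitals_callout",
    "closing_count",
]

@dataclass
class Feedback:
    score: float
    missed_steps: list
    out_of_order: list

def evaluate(trainee_actions: list[str]) -> Feedback:
    """Compare a trainee's action log with the gold-standard sequence."""
    missed = [s for s in GOLD_STANDARD if s not in trainee_actions]
    # Check relative ordering of the steps the trainee did perform.
    performed = [s for s in GOLD_STANDARD if s in trainee_actions]
    positions = [trainee_actions.index(s) for s in performed]
    out_of_order = [
        s for s, pos, prev in zip(performed[1:], positions[1:], positions[:-1])
        if pos < prev
    ]
    score = 1.0 - (len(missed) + len(out_of_order)) / len(GOLD_STANDARD)
    return Feedback(max(score, 0.0), missed, out_of_order)

if __name__ == "__main__":
    log = ["confirm_patient_identity", "time_out_briefing",
           "announce_incision", "closing_count"]
    fb = evaluate(log)
    print(f"score={fb.score:.2f} missed={fb.missed_steps} "
          f"out_of_order={fb.out_of_order}")
```

In a real system, the gold standard would come from expert demonstrations and the guidance would be delivered inside the VR environment; this sketch simply flags missed and out-of-order steps.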

“Combining virtual reality and AI facilitates learning and can improve decision-making, teamwork and communication skills,” adds De. “Practitioners can benefit from exposure to a broader range of variations in virtual patients and team compositions. They adapt to anatomical differences in patients and get a more realistic experience than traditional practice provides using mannequins, without the need for maintaining expensive facilities in teaching hospitals.”

Is this the future?

While the new technology shows promise, the researchers acknowledge specific issues that need improvement, such as network responsiveness and speech recognition capability.

“People can connect to a virtual operating room from anywhere in the world, so sometimes network issues can impact the quality of service,” De explains. “One of our researchers and his doctoral student are developing a new network protocol based on machine learning to overcome some of these fundamental limitations.” 
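
The article does not describe that protocol, but the underlying idea of adapting a networked simulation to changing conditions can be illustrated with a toy example: predict near-term latency from recent measurements and throttle the client’s update rate accordingly. The smoothing factor, thresholds and rates below are illustrative assumptions, not part of the researchers’ machine-learning-based protocol.

```python
# Toy illustration only: adapt a virtual-OR client's update rate to measured
# network latency using a simple exponentially weighted moving average (EWMA)
# predictor. None of the constants or logic here come from the actual project.

class AdaptivePacer:
    def __init__(self, base_rate_hz: float = 60.0, alpha: float = 0.2):
        self.base_rate_hz = base_rate_hz
        self.alpha = alpha                     # EWMA smoothing factor (assumed)
        self.predicted_latency_ms = 0.0

    def observe(self, latency_ms: float) -> None:
        """Fold a new round-trip latency sample into the prediction."""
        self.predicted_latency_ms = (
            self.alpha * latency_ms
            + (1 - self.alpha) * self.predicted_latency_ms
        )

    def update_rate_hz(self) -> float:
        """Lower the avatar/physiology update rate as predicted latency grows."""
        if self.predicted_latency_ms < 50:     # thresholds are illustrative
            return self.base_rate_hz
        if self.predicted_latency_ms < 150:
            return self.base_rate_hz / 2
        return self.base_rate_hz / 4

pacer = AdaptivePacer()
for sample in [20, 35, 120, 180, 200, 90]:     # simulated latency samples (ms)
    pacer.observe(sample)
print(f"predicted latency: {pacer.predicted_latency_ms:.1f} ms, "
      f"update rate: {pacer.update_rate_hz():.0f} Hz")
```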

Eventually, the team wants the system to adapt to multiple people speaking in the operating room environment. They hope to develop AI algorithms that recognize the discussions between participants and summarize the context to provide feedback. 
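
As a rough sketch of one signal such algorithms might extract once speech has been transcribed and attributed to speakers, the hypothetical example below scans a diarized transcript for call-outs that never receive an acknowledgment from another team member, the kind of communication breakdown a feedback agent could summarize. The transcript format, roles and keywords are all assumptions for illustration.

```python
# Hypothetical sketch: given a diarized transcript of (speaker, utterance)
# pairs, flag call-outs that were never acknowledged by another team member.
# The roles, keywords, and transcript structure are illustrative assumptions
# about one signal an AI feedback agent might summarize.

CALLOUT_KEYWORDS = ("pressure dropping", "need suction", "count incorrect")
ACK_KEYWORDS = ("acknowledged", "got it", "on it", "confirmed")

def unacknowledged_callouts(transcript: list[tuple[str, str]]) -> list[str]:
    """Return call-outs with no acknowledgment from a different speaker."""
    flagged = []
    for i, (speaker, text) in enumerate(transcript):
        if not any(k in text.lower() for k in CALLOUT_KEYWORDS):
            continue
        followups = transcript[i + 1 : i + 4]   # look a few utterances ahead
        acked = any(
            other != speaker and any(a in reply.lower() for a in ACK_KEYWORDS)
            for other, reply in followups
        )
        if not acked:
            flagged.append(f"{speaker}: '{text}' was not acknowledged")
    return flagged

transcript = [
    ("anesthesiologist", "Blood pressure dropping."),
    ("surgeon", "Retractor, please."),
    ("nurse", "Passing retractor."),
    ("surgeon", "Need suction here."),
    ("nurse", "On it."),
]
for issue in unacknowledged_callouts(transcript):
    print(issue)
```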

“Breakdown in communication is a major problem associated with immersive cognitive environments,” notes De. “AI has trouble with distinguishing accented speech, adapting to different speakers, and recognizing clinical context. It’s an important parameter that we are trying to understand and automate.”

More than 185 surgical teams in the Harvard medical system participated in a study using the AI-guided simulation technology. Researchers plan to continue improving the technology with the data collected from multidisciplinary operating room teams. 

“We anticipate the AI will improve enough to provide an effective tool for team training in the operating room,” De said. “Our study clearly shows that in the first trial, without AI, only 4% of trained participants passed the training. By the third trial, when the AI was turned on, the pass rate increased significantly, by 31%, showing the effectiveness of AI in training and learning.”

The research is partially funded by the National Institute of Biomedical Imaging and Bioengineering at the National Institutes of Health. It is conducted in partnership with researchers at the FAMU-FSU College of Engineering; the Department of Electrical and Systems Engineering at Rensselaer Polytechnic Institute; the Department of Anesthesia, Critical Care and Pain Medicine at Beth Israel Deaconess Medical Center in Boston, Massachusetts; the Department of Surgery at Rutgers New Jersey Medical School in Newark, New Jersey; and the Department of Computer Science at Florida Polytechnic University.
