Physically Realistic Virtual Surgery

  • De, Suvranu (PI)
  • Cao, Caroline G.L. (CoPI)
  • Jackson, Cullen (CoPI)
  • Jones, Daniel (CoPI)

Project Details

Description

Abstract

While virtual reality (VR)-based surgical simulation technology is being developed to improve laparoscopic surgical training outside the operating room (OR), existing simulators focus mostly on the technical skills (TS) of hand-eye coordination for isolated tasks and seldom on the non-technical skills (NTS) associated with both the cognitive skills of decision making and the interpersonal skills of communication, teamwork, and conflict resolution. To enable VR-based surgical simulators to also train cognitive skills, in the previous grant period we successfully developed the next generation (Gen2) of laparoscopic surgical simulators, which immerse the trainee in a virtual OR using a head-mounted display (HMD) system and introduce distractions, interruptions, and other stressors to capture the high-stress environment of the real OR.

However, to the best of our knowledge, there exists no VR-based simulator for training the interpersonal skills needed for the multidisciplinary integration of OR teams, which consist of surgeons, anesthesiologists, and perioperative nurses. Following the significant reduction of adverse events in other disciplines, such as aviation, after the introduction of mandatory simulation-based team training (e.g., crew resource management), the National Surgical Skills Curriculum developed by the American College of Surgeons (ACS) and the Association of Program Directors in Surgery (APDS) prescribes ten team-based training modules to be performed in a simulation facility (e.g., an OR endosuite) with scenario-based training on high-fidelity manikin simulators. Such facility-based team training, however, is extremely expensive and cumbersome, requires dedicated facility and faculty time, and entails significant planning and schedule coordination among trainees, technicians, and faculty.

To overcome the challenges of facility-based OR team training, the goal of this project is to extend the immersive VR technology (Gen2) developed for a single user under our prior grant to the entire OR team, and to harness recent advances in cloud computing, mobile device-based VR, and artificial intelligence and machine learning to design, develop, and evaluate a Virtual Operating Room Team Experience (VORTeX) simulation system. VORTeX will allow the OR team to train together in a distributed fashion (i.e., not co-located in the same room or simulation facility) while wearing mobile device-based HMD systems, further developing their NTS on computer-generated simulation scenarios that replace the physical ones. Evaluation of the simulation scenarios will be performed asynchronously by a team of experts based on post-action replays. We will implement VORTeX for a laparoscopic cholecystectomy crisis scenario, developed and validated by our Co-I Dr. Dan Jones at BIDMC and adopted as one of the team training modules of the ACS/APDS National Surgical Skills Curriculum. We hypothesize that VORTeX will be as good as or better than traditional facility-based simulation in providing non-technical skills training to OR teams.
Status: Finished
Effective start/end date: 6/1/06 - 1/31/25

Funding

  • National Institute of Biomedical Imaging and Bioengineering: $687,669.00
  • National Institute of Biomedical Imaging and Bioengineering: $619,444.00
  • National Institute of Biomedical Imaging and Bioengineering: $631,626.00
  • National Institute of Biomedical Imaging and Bioengineering: $650,837.00
