Mixed Simulation Using a Virtual Environment to Train Medical Practitioners

Provides Training for Blind Procedures Where the Practitioner's Direct Line of Sight Is Not Available

This mixed simulation teaching tool uses augmented reality to provide training in blind and guided medical procedures. In the United States, over 250,000 deaths per year are due to medical error, making it the third leading cause of death. Many of these deaths are attributable to substandard care and would be preventable if healthcare professionals received proper training. Many medical procedures require placing an instrument such as a needle inside a target while avoiding accidental contact with or puncture of surrounding organs or tissues. For procedures such as these, verbal and written instruction, while necessary and worthwhile, cannot take the place of hands-on training. Researchers at the University of Florida have developed a mixed simulation system that allows medical training instructors to visualize and consistently score trainee performance. The simulation integrates sensors and augmented reality principles with physical and virtual medical image models, enabling students to rehearse important medical procedures and to self-debrief without endangering human lives.

 

Application

Mixed simulators with anatomically correct physical and virtual components that combine real-time 3D visualization with tracked instruments, recording and playback, and automated and consistent scoring algorithms to facilitate training of clinicians in procedural skills

 

Advantages

  • Allows trainees to practice psychomotor and cognitive skills and to learn from mistakes in a controlled environment, sparing actual patients the discomfort and risk from having novices practice on them
  • Replicates any individual human serving as the model, resulting in completely authentic physical and virtual anatomy
  • Replicates any kind of anatomy, training students for even the most uncommon situations
  • Displays in real time a 3D color visualization of the procedure and generates tactile feedback from palpation and/or contact of medical instruments with physical structures like bones, providing a realistic experience for medical trainees
  • Reduces/eliminates the need for expensive disposables, cutting costs in the training process
  • Provides practical, experiential training for medical students, decreasing the likelihood of mistakes and potentially ensuing malpractice lawsuits

Technology

This mixed simulation technology collocates anatomically authentic virtual and physical 3D objects that represent the part of the human body of interest. The simulation has already been applied successfully to three procedures: central venous access (upper torso and neck), regional anesthesia (spine), and ventriculostomy (brain). In all three applications, a six-degree-of-freedom sensor smaller than a grain of rice is secured inside the needle bore near the tip, so that as the trainee manipulates and steers the needle, the tip position is tracked relative to both the physical and virtual components representing the human body. Real-time 3D visualization allows trainees and instructors to observe and critique technique and strategy. Needle-tip tracking also yields metrics that were previously unavailable, enabling automated and consistent scoring algorithms; these algorithms make self-debriefing possible when experts are not available to provide feedback. CT and MRI scans of individual humans, along with 3D files of discrete objects representing different organs and tissues, form the basis of the physical and virtual models. The 3D files, fed into a 3D printer (rapid prototyping machine), produce the physical parts of the mixed simulation. The system integrates readily available commercial off-the-shelf components into turnkey simulation systems (setup time of about seven minutes) that are compact and lightweight, meeting airline checked-luggage requirements.
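For illustration only, the sketch below shows one way the ideas described above could be computed: mapping a tracked six-degree-of-freedom sensor pose to a needle-tip position in the simulator's common coordinate frame, and deriving a simple trajectory-based score (target reached, hazard contacts, closest approach). The function names, sensor-to-tip offset, coordinate conventions, and thresholds are hypothetical assumptions for demonstration and are not taken from the University of Florida system.

# Hypothetical sketch; all names, offsets, and thresholds are assumptions.
import numpy as np

def tip_position(sensor_pos, sensor_rot, tip_offset):
    # Map a 6-DOF sensor pose (position vector + 3x3 rotation matrix) to the
    # needle-tip position, given the fixed offset from sensor to tip.
    return sensor_pos + sensor_rot @ tip_offset

def score_attempt(tip_path, target_center, target_radius, hazards):
    # Toy scoring metric: did the tip reach the target sphere, how many times
    # did it enter any hazard sphere, and how close did it get to the target?
    dists_to_target = np.linalg.norm(tip_path - target_center, axis=1)
    reached = bool(np.any(dists_to_target <= target_radius))
    violations = 0
    for center, radius in hazards:
        inside = np.linalg.norm(tip_path - center, axis=1) <= radius
        # Count discrete entries into the hazard region, not samples inside it.
        violations += int(np.count_nonzero(np.diff(inside.astype(int)) == 1)
                          + (1 if inside[0] else 0))
    return {"reached_target": reached,
            "hazard_entries": violations,
            "closest_approach_mm": float(dists_to_target.min())}

if __name__ == "__main__":
    # Example with made-up numbers (millimetres): a straight insertion along z.
    R = np.eye(3)                        # identity orientation for simplicity
    offset = np.array([0.0, 0.0, 4.0])   # assume sensor sits 4 mm behind the tip
    path = np.vstack([tip_position(np.array([0.0, 0.0, z]), R, offset)
                      for z in np.linspace(0.0, 40.0, 50)])
    result = score_attempt(path,
                           target_center=np.array([0.0, 0.0, 42.0]),
                           target_radius=3.0,
                           hazards=[(np.array([0.0, 5.0, 25.0]), 2.0)])
    print(result)

A recorded tip path of this kind is also what makes the playback and self-debriefing capability described above possible: the same trajectory can be replayed against the virtual anatomy after the attempt.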

Patent Information: