Accurate and real-time bimanual control for surgical robots

A technology that detects and classifies coordinated two-hand movements in robotic surgery by analyzing movement patterns and symmetries, delivering accurate real-time control despite noisy motion data and improving surgeon-robot interaction.

Background

In robotic-assisted surgery, precise bimanual coordination is crucial for enhancing surgical outcomes and minimizing patient recovery times. Robotic systems offer significant advantages over traditional surgical methods by providing greater dexterity, stability, and control, enabling surgeons to perform complex procedures with enhanced accuracy. Integrating advanced coordination-recognition technologies is essential to fully leverage these robotic capabilities, ensuring synchronized movements of both hands to improve maneuverability and reduce the likelihood of errors during surgical tasks. Given the increasing sophistication of these platforms, the demand for robust systems that can accurately interpret and respond to the nuanced movements of surgeons continues to grow.

Current approaches to recognizing and classifying bimanual coordination in robotic systems face several challenges that hinder their effectiveness. Many existing methods struggle with real-time data processing, leading to delays that can disrupt surgical workflows. These systems also often lack robustness against noise in human movement data, resulting in inaccurate classifications of movement patterns. Moreover, current frameworks frequently require manual threshold tuning and are limited in their ability to accommodate diverse coordination modes, restricting their applicability across different surgical tasks. These limitations highlight the need for more sophisticated and adaptive solutions to enhance the reliability and precision of human-robot interactions in medical settings.

Technology description

The technology enables the recognition and classification of bimanual coordination patterns in robotic systems, specifically for robot-assisted surgery. It analyzes the geometric relationships between left- and right-hand movements by representing them as discrete trajectories in Cartesian space, which are then normalized through interpolation to create equally spaced matrices for each hand. Direction classification is achieved by assessing changes in Euclidean distance between corresponding points, categorizing movements as together, away, or parallel. Symmetry classification utilizes transformation analysis to identify mirror, point, visual, or incongruent symmetry.
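As a rough illustration of the direction step, the sketch below resamples two hand trajectories to equally spaced points and labels the pair from the trend of the inter-hand Euclidean distance. The function names, resampling count, and tolerance are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def resample(traj, n=50):
    """Resample a (k, d) trajectory to n points equally spaced in arc length."""
    traj = np.asarray(traj, dtype=float)
    seg = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])   # cumulative arc length
    s_new = np.linspace(0.0, s[-1], n)
    return np.column_stack(
        [np.interp(s_new, s, traj[:, j]) for j in range(traj.shape[1])]
    )

def classify_direction(left, right, tol=1e-3):
    """Label paired motion 'together', 'away', or 'parallel' from the net
    change in Euclidean distance between corresponding trajectory points."""
    d = np.linalg.norm(left - right, axis=1)      # inter-hand distance per sample
    delta = d[-1] - d[0]                          # net change in separation
    if abs(delta) <= tol * max(d.max(), 1.0):     # no meaningful trend
        return "parallel"
    return "together" if delta < 0 else "away"
```

For example, two hands moving toward each other along a line classify as "together", while hands tracing parallel paths at constant separation classify as "parallel".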

To address real-world noise in human movement data, the system incorporates a hysteresis band for direction classification, modified Procrustes analysis for symmetry determination, and applies specific metrics with thresholds in a sequential filtering approach. Validation using both 2D and 3D datasets demonstrated high accuracy in direction (92.3%) and symmetry (86.0%) classifications, highlighting its suitability for real-time integration into robotic control systems and applications such as haptic feedback in surgical robots.

This framework stands out due to its innovative approach to real-time recognition of bimanual coordination modes in human-robot interactions, particularly within teleoperated surgical environments. By simultaneously addressing both direction and symmetry of hand motions in Cartesian space and employing advanced techniques like modified Procrustes analysis and hierarchical decision flows, it achieves robust performance even in the presence of noise.
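To see how a hysteresis band suppresses noise-induced label flicker, consider this streaming sketch (the band width and function name are illustrative assumptions): the direction label switches only once the inter-hand distance has drifted beyond the band since the last switch point, so small jitter never changes the state.

```python
def direction_with_hysteresis(distances, band=0.02):
    """Stream inter-hand distances and emit a direction label per sample,
    switching only when drift since the last switch exceeds the band."""
    labels = []
    ref = distances[0]        # distance at the last state change
    state = "parallel"
    for d in distances:
        if d - ref > band:
            state, ref = "away", d
        elif ref - d > band:
            state, ref = "together", d
        # within the band: keep the previous state, treating jitter as noise
        labels.append(state)
    return labels
```

A noisy but flat distance series keeps the label "parallel" throughout, while a steady upward drift eventually flips the label to "away".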

The implementation of statistical thresholds derived from a normally distributed noise model, together with the ability to predict coordination modes within 250 milliseconds, further differentiates it from existing solutions. Additionally, its successful validation across comprehensive 2D and 3D studies, along with the capability to handle complex surgical gestures, showcases its superior accuracy and efficiency. These unique features enhance human-robot interaction, providing precise and reliable movement recognition essential for high-stakes applications in surgical, medical, and other teleoperated systems.

Benefits

  • Achieves high accuracy in direction (92.3%) and symmetry (86.0%) classification of bimanual movements.
  • Reduces prediction times to under 250 milliseconds, enabling swift decision-making in surgical tasks.
  • Enables real-time integration into robotic control systems for responsive robot-assisted surgery.
  • Enhances human-robot interaction by accurately recognizing and classifying coordination patterns.
  • Robustly handles real-world noise through advanced filtering and analysis techniques.
  • Supports applications such as haptic feedback, improving surgical precision and control.
  • Facilitates intuitive identification of coordination modes for diverse surgical gestures.

Commercial applications

  • Robot-assisted surgery
  • Teleoperated surgical systems
  • Haptic feedback in robotics
  • Real-time robotic control
  • Human-robot interaction

Patent link

https://patents.google.com/patent/US20240016561A1/en?oq=+18%2f348%2c329


Patent Information:
  Title: Online Recognition Of Bimanual Coordination To Provide Context For Movement Data In Bimanual Teleoperated Robots
  App Type: Utility (Conversion)
  Country: United States
  Serial No.: 18/348,329
  File Date: 7/6/2023