A technology that detects and classifies coordinated two-hand movements in robot-assisted surgery by analyzing the direction and symmetry of motion patterns, enabling accurate real-time control despite noisy movement data and improving surgeon-robot interaction.
In robot-assisted surgery, precise bimanual coordination is crucial for enhancing surgical outcomes and minimizing patient recovery times. Robotic systems offer significant advantages over traditional surgical methods, providing greater dexterity, stability, and control and enabling surgeons to perform complex procedures with enhanced accuracy. Integrating advanced coordination-recognition technologies is essential to fully leverage these capabilities, ensuring that the movements of both hands remain synchronized to improve maneuverability and reduce the likelihood of errors during surgical tasks. As these platforms grow more sophisticated, so does the demand for robust systems that can accurately interpret and respond to surgeons' nuanced movements.
Current approaches to recognizing and classifying bimanual coordination in robotic systems face several challenges that hinder their effectiveness. Many existing methods struggle with real-time data processing, leading to delays that can disrupt surgical workflows. These systems also often lack robustness against noise in human movement data, resulting in inaccurate classifications of movement patterns. Moreover, current frameworks frequently require manual threshold tuning and are limited in their ability to accommodate diverse coordination modes, restricting their applicability across different surgical tasks. These limitations highlight the need for more sophisticated and adaptive solutions to enhance the reliability and precision of human-robot interactions in medical settings.
The technology enables the recognition and classification of bimanual coordination patterns in robotic systems, specifically for robot-assisted surgery. It analyzes the geometric relationship between left- and right-hand movements by representing them as discrete trajectories in Cartesian space, which are normalized through interpolation so that each hand's path becomes a matrix of equally spaced points. Direction is classified by assessing the change in Euclidean distance between corresponding points, categorizing movements as together, away, or parallel. Symmetry is classified through transformation analysis that identifies mirror, point, visual, or incongruent symmetry.
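As a rough sketch of these two steps, the Python snippet below resamples each hand's recorded path to equally spaced points and labels the motion by the change in inter-hand Euclidean distance. All function names, the sample count, and the tolerance are illustrative assumptions rather than values taken from the patent.

```python
import numpy as np

def resample_trajectory(points: np.ndarray, n_samples: int = 50) -> np.ndarray:
    """Resample a (k, d) trajectory to n_samples points spaced equally
    along its arc length, using per-coordinate linear interpolation."""
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])   # cumulative arc length
    s_new = np.linspace(0.0, s[-1], n_samples)
    return np.column_stack(
        [np.interp(s_new, s, points[:, i]) for i in range(points.shape[1])]
    )

def classify_direction(left: np.ndarray, right: np.ndarray,
                       tol: float = 1e-3) -> str:
    """Label a pair of resampled trajectories 'together', 'away', or
    'parallel' from the net change in distance between corresponding points."""
    dist = np.linalg.norm(left - right, axis=1)   # inter-hand distance per sample
    delta = dist[-1] - dist[0]
    if abs(delta) < tol:
        return "parallel"
    return "together" if delta < 0 else "away"
```

The endpoint difference is used here only to keep the sketch short; a point-by-point comparison over the full matrices, combined with the hysteresis band described below, would be closer to a real-time implementation.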
To address real-world noise in human movement data, the system incorporates a hysteresis band for direction classification and a modified Procrustes analysis for symmetry determination, applying specific metrics against thresholds in a sequential filtering approach. Validation on both 2D and 3D datasets demonstrated high accuracy in direction (92.3%) and symmetry (86.0%) classification, highlighting the system's suitability for real-time integration into robotic control loops and applications such as haptic feedback in surgical robots.
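A minimal sketch of these noise-handling ideas follows, assuming a simple symmetric hysteresis band and an ordinary Procrustes residual restricted to proper rotations; the patent's modified analysis, exact metrics, and threshold values are not reproduced, so every constant below is a placeholder.

```python
import numpy as np

def hysteresis_direction(prev: str, delta: float,
                         enter: float = 0.01, leave: float = 0.005) -> str:
    """Switch away from 'parallel' only when |delta| clears the wider
    'enter' band; fall back to 'parallel' only inside the narrower
    'leave' band, so noise near zero cannot cause label chatter."""
    if prev == "parallel":
        if delta <= -enter:
            return "together"
        if delta >= enter:
            return "away"
        return "parallel"
    if abs(delta) < leave:
        return "parallel"
    return "together" if delta < 0 else "away"

def procrustes_disparity(a: np.ndarray, b: np.ndarray) -> float:
    """Residual after centering, unit-scaling, and the best proper
    rotation (reflections excluded, so mirror pairs stay distinct)."""
    a = a - a.mean(axis=0); a = a / np.linalg.norm(a)
    b = b - b.mean(axis=0); b = b / np.linalg.norm(b)
    u, s, vt = np.linalg.svd(a.T @ b)
    s[-1] *= np.sign(np.linalg.det(u) * np.linalg.det(vt))
    return max(0.0, 1.0 - s.sum() ** 2)  # 0 means a perfect match

def classify_symmetry(left: np.ndarray, right: np.ndarray,
                      thresh: float = 0.05) -> str:
    """Sequentially test candidate transforms of the right-hand path,
    accepting the first whose residual falls below the threshold."""
    mirror = right * np.array([-1] + [1] * (right.shape[1] - 1))  # reflect one axis
    for label, cand in (("visual", right), ("mirror", mirror), ("point", -right)):
        if procrustes_disparity(left, cand) < thresh:
            return label
    return "incongruent"
```

The transform set is deliberately simplified: in 2D a point reflection is itself a rotation, so the 'point' and 'visual' candidates can coincide there, and a stricter test would be needed to separate them.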
This framework stands out due to its innovative approach to real-time recognition of bimanual coordination modes in human-robot interactions, particularly within teleoperated surgical environments. By simultaneously addressing both direction and symmetry of hand motions in Cartesian space and employing advanced techniques like modified Procrustes analysis and hierarchical decision flows, it achieves robust performance even in the presence of noise.
The implementation of statistical thresholds based on normally distributed noise and the ability to rapidly predict coordination modes within 250 milliseconds further differentiate it from existing solutions. Additionally, its successful validation across comprehensive 2D and 3D studies, along with the capability to handle complex surgical gestures, showcases its superior accuracy and efficiency. These unique features enhance human-robot interaction, providing precise and reliable movement recognition essential for high-stakes applications in surgical, medical, and other teleoperated systems.
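The source does not spell out how those statistical thresholds are derived. One conventional scheme consistent with normally distributed noise, shown here purely as an assumption, calibrates each threshold from a recording taken while the hands are nominally still:

```python
import numpy as np

def noise_calibrated_threshold(noise_metric: np.ndarray, k: float = 3.0) -> float:
    """Set the decision threshold k standard deviations above the mean of
    the metric measured during a quiet-hands recording; with k = 3 and
    roughly Gaussian noise, ~99.7% of noise-only readings stay below it."""
    return float(noise_metric.mean() + k * noise_metric.std(ddof=1))
```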
https://patents.google.com/patent/US20240016561A1/en?oq=+18%2f348%2c329