In-Bed Pose Estimation using Under Cover Imaging via Thermal Diffusion

INV-19039
 
Background
Human in-bed pose estimation has significant practical value in medical and healthcare applications, yet it still relies mainly on expensive pressure mapping (PM) solutions that cost thousands of dollars. Methods using other sensing modalities, such as depth sensing (e.g., Microsoft Kinect) or camera-based approaches (e.g., RGB video), have been proposed, but they either provide limited pose detection granularity or cannot operate in the full-darkness or covered conditions common in sleeping contexts.
 
Technology Overview
Northeastern University researchers have developed a novel physics-inspired, vision-based approach that addresses the challenging issues associated with in-bed pose estimation, including monitoring a fully covered person in complete darkness. The researchers reformulated the problem as Under Cover Imaging via Thermal Diffusion (UCITD), capturing high-resolution pose information of the human body, even when it is fully covered, by combining a long-wavelength IR (LWIR) imaging technique with a novel deep-learning-powered pose estimation algorithm. In a study built on the first-ever in-bed pose dataset, this approach achieved over 90% pose estimation accuracy at a granularity of 14 body joints, while being a fraction of the size and cost of state-of-the-art solutions with similar (or lesser) capabilities.
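For readers who want a concrete picture of the pipeline sketched above, the snippet below is a minimal illustrative sketch, not the UCITD implementation: it assumes a PyTorch setting, a made-up ToyLWIRPoseNet architecture, and heatmap-argmax decoding, none of which are specified in this summary. It only shows the general shape of feeding a single-channel LWIR frame through a deep network that localizes 14 body joints.

```python
import torch
import torch.nn as nn

NUM_JOINTS = 14  # joint granularity reported in the study above


class ToyLWIRPoseNet(nn.Module):
    """Hypothetical stand-in for the UCITD pose-estimation network.

    Takes a single-channel long-wavelength IR (LWIR) frame and regresses
    one heatmap per body joint; the real architecture is not disclosed here.
    """

    def __init__(self, num_joints: int = NUM_JOINTS):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # One output channel per joint heatmap.
        self.head = nn.Conv2d(64, num_joints, kernel_size=1)

    def forward(self, lwir_frame: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(lwir_frame))


def heatmaps_to_joints(heatmaps: torch.Tensor) -> torch.Tensor:
    """Decode (B, J, H, W) heatmaps into (B, J, 2) pixel coordinates (x, y)."""
    b, j, h, w = heatmaps.shape
    flat_idx = heatmaps.view(b, j, -1).argmax(dim=-1)  # peak per heatmap
    ys, xs = flat_idx // w, flat_idx % w
    return torch.stack([xs, ys], dim=-1)


if __name__ == "__main__":
    model = ToyLWIRPoseNet()
    frame = torch.randn(1, 1, 120, 160)  # e.g., a low-resolution thermal frame
    joints = heatmaps_to_joints(model(frame))
    print(joints.shape)  # torch.Size([1, 14, 2])
```

Heatmap regression followed by per-channel argmax is one common decoding strategy in pose estimation; the actual UCITD network and its training on the in-bed pose dataset would differ.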
 
Benefits
Compared to pressure mapping for pose estimation, this approach:
- Costs 1/60th as much
- Is 1/300th the size
- Offers higher pose recognition granularity and accuracy
- Is safer, requiring minimal intervention to the bed or the user
- Can monitor poses when the person is covered with a sheet or blanket, or when the room is in full darkness
 
Applications
- Sleeping behavior studies
- Patient activity monitoring in hospitals
- Pressure ulcer studies: early detection and prevention 
- Any healthcare research that requires in-bed human pose information over time
 
Opportunity
- License
- Partnering
- Research collaboration
 
Patent Information: