Robot Helps Feed Patients with Mobility Issues
The Cornell system pairs real-time mouth tracking with a dynamic response mechanism, allowing the robot to understand and react to patients’ movements as they happen
A team of researchers from Cornell has developed a robotic feeding system that uses computer vision, machine learning and multimodal sensing to feed people with severe mobility issues.
According to the team, the new system could be used to assist patients suffering from cerebral palsy, multiple sclerosis and spinal cord injuries.
Developing a robot to help people with these conditions is challenging: the system must place food directly and precisely in the user’s mouth without asking the patient to make any adjustments.
The system must also account for spasms, which can cause a patient’s mouth to move suddenly.
“Feeding individuals with severe mobility limitations with a robot is difficult, as many cannot lean forward and require food to be placed directly inside their mouths,” said Tapomayukh Bhattacharjee, senior developer behind the system. “The challenge intensifies when feeding individuals with additional complex medical conditions.”
To address these challenges, Bhattacharjee said his team fitted the system with two ‘essential’ features: real-time mouth tracking, which allows the robot to adjust to users’ movements, and a dynamic response mechanism, which helps the robot detect the nature of physical movements as they occur and react accordingly.
The mouth-tracking method was trained on thousands of images of patients’ head poses and facial expressions, and it gathers data from two cameras positioned above and below the utensil.
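The article does not describe how the two camera views are combined, but the general idea can be sketched in a few lines of Python. The snippet below is a hypothetical illustration, not the Cornell team’s code: it assumes each camera produces a noisy 3D estimate of the mouth position plus a confidence score, weights the more confident view more heavily and smooths the result over time so the robot always has a current target.

```python
import numpy as np


class MouthTracker:
    """Illustrative fusion of mouth-position estimates from two cameras.

    A hypothetical sketch, not the published implementation: each camera is
    assumed to return a noisy 3D mouth-centre estimate and a confidence value,
    and the two are fused and smoothed frame by frame.
    """

    def __init__(self, smoothing: float = 0.6):
        self.smoothing = smoothing  # weight given to the previous estimate
        self.state = None           # last fused mouth position (x, y, z)

    def update(self, est_above, conf_above, est_below, conf_below):
        # Confidence weighting: trust the more confident camera more.
        w_above = conf_above / (conf_above + conf_below + 1e-9)
        fused = w_above * np.asarray(est_above) + (1 - w_above) * np.asarray(est_below)

        # Exponential smoothing keeps the target stable between frames.
        if self.state is None:
            self.state = fused
        else:
            self.state = self.smoothing * self.state + (1 - self.smoothing) * fused
        return self.state


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    tracker = MouthTracker()
    true_mouth = np.array([0.40, 0.05, 1.20])  # metres, in an assumed robot frame
    for _ in range(30):
        cam_above = true_mouth + rng.normal(0, 0.01, 3)  # camera above the utensil
        cam_below = true_mouth + rng.normal(0, 0.02, 3)  # camera below the utensil
        estimate = tracker.update(cam_above, 0.9, cam_below, 0.6)
    print("fused mouth estimate:", np.round(estimate, 3))
```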
These two features allow the system to distinguish between spasms, intentional bites and users moving to get the food into their mouths, the researchers said.
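Again as an illustrative assumption rather than the paper’s method, the distinction between a spasm, an intentional bite and a user leaning in could be approximated by simple thresholds on the speed and direction of the tracked mouth motion; the function name and threshold values below are invented for this sketch only.

```python
import numpy as np


def classify_motion(positions, utensil_pos, dt=0.033,
                    spasm_speed=0.25, approach_speed=0.05):
    """Toy heuristic for labelling tracked mouth motion (hypothetical).

    A fast, sudden displacement is treated as a spasm, steady movement
    toward the utensil as the user leaning in for a bite, and near-zero
    motion as holding still.
    """
    positions = np.asarray(positions, dtype=float)
    velocity = (positions[-1] - positions[-2]) / dt  # m/s over the last frame
    speed = np.linalg.norm(velocity)

    to_utensil = np.asarray(utensil_pos) - positions[-1]
    to_utensil = to_utensil / (np.linalg.norm(to_utensil) + 1e-9)
    approach = float(velocity @ to_utensil)  # speed component toward the utensil

    if speed > spasm_speed:
        return "spasm: pause and retract the utensil"
    if approach > approach_speed:
        return "leaning in: hold position for an intentional bite"
    return "stable: proceed with placement"


if __name__ == "__main__":
    utensil = np.array([0.42, 0.05, 1.18])
    steady = [[0.400, 0.05, 1.200], [0.403, 0.05, 1.198]]  # slow drift toward utensil
    jerk = [[0.400, 0.05, 1.200], [0.380, 0.08, 1.220]]    # large sudden displacement
    print(classify_motion(steady, utensil))
    print(classify_motion(jerk, utensil))
```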
In tests, the system successfully fed 13 individuals with a range of medical conditions affecting their mobility, the team said. These tests took place at the EmPRISE Lab on the Cornell Ithaca campus, a medical center in New York City and a care recipient’s home in Connecticut.
“This is one of the most extensive real-world evaluations of any autonomous robot-assisted feeding system with end-users,” Bhattacharjee said.
The team said it will continue working on the system to improve its long-term usability.
The researchers presented a paper on the system at the Human-Robot Interaction conference, held March 11-14 in Boulder, Colorado.