Robotics researchers are developing exoskeletons and prosthetic legs capable of thinking and making control decisions on their own using sophisticated artificial intelligence (AI) technology.
The system combines computer vision and deep-learning AI to mimic how able-bodied people walk by seeing their surroundings and adjusting their movements.
“We’re giving robotic exoskeletons vision so they can control themselves,” said Brokoslaw Laschowski, a Ph.D. candidate in systems design engineering who leads a University of Waterloo research project called ExoNet.
Exoskeleton legs operated by motors already exist, but users must manually control them via smartphone applications or joysticks.
“That can be inconvenient and cognitively demanding,” said Laschowski, also a student member of the Waterloo Artificial Intelligence Institute (Waterloo.ai). “Every time you want to perform a new locomotor activity, you have to stop, take out your smartphone and select the desired mode.”
To address that limitation, the researchers fitted exoskeleton users with wearable cameras and are now optimizing the AI software that processes the video feed so it can accurately recognize stairs, doors and other features of the surrounding environment.
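Conceptually, this environment-recognition step works like a standard image-classification pipeline: each camera frame is fed to a deep neural network that labels the upcoming terrain. The sketch below is a minimal illustration of that idea only; the backbone model, image size and class labels are assumptions for the example, not ExoNet's actual network or label set.

```python
# Illustrative sketch: labelling wearable-camera frames with a walking-environment class.
# Model choice and class names are assumptions, not the ExoNet project's actual design.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

ENV_CLASSES = ["level_ground", "incline_stairs", "decline_stairs", "door", "obstacle"]  # assumed labels

# Lightweight backbone suited to on-board inference; final layer resized to our classes.
model = models.mobilenet_v3_small(weights=models.MobileNet_V3_Small_Weights.DEFAULT)
model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, len(ENV_CLASSES))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify_frame(frame: Image.Image) -> str:
    """Return the predicted environment label for one camera frame."""
    with torch.no_grad():
        logits = model(preprocess(frame).unsqueeze(0))
    return ENV_CLASSES[int(logits.argmax(dim=1))]
```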
The next phase of the ExoNet research project will involve sending instructions to motors so that robotic exoskeletons can climb stairs, avoid obstacles or take other appropriate actions based on analysis of the user’s current movement and the upcoming terrain.
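In principle, that decision step maps the recognized terrain and the user's current activity to a locomotion mode for the motors. The sketch below is a hypothetical illustration of such a mapping; the mode names and the motor interface mentioned in the comments are assumptions, not the project's actual control software.

```python
# Hypothetical sketch of the decision step: choosing a locomotion mode from the
# recognized terrain and the user's current activity. Mode names and the motor
# interface are illustrative assumptions.

LOCOMOTION_MODES = {
    ("walking", "incline_stairs"): "stair_ascent",
    ("walking", "decline_stairs"): "stair_descent",
    ("walking", "obstacle"): "obstacle_avoidance",
    ("walking", "level_ground"): "level_walking",
}

def select_mode(current_activity: str, predicted_env: str) -> str:
    """Pick the next locomotion mode; fall back to the current activity if no rule matches."""
    return LOCOMOTION_MODES.get((current_activity, predicted_env), current_activity)

# Example: the user is walking and the camera sees stairs ahead.
next_mode = select_mode("walking", "incline_stairs")
print(next_mode)  # -> "stair_ascent"
# An on-board controller would then send this mode to the exoskeleton motors,
# e.g. exoskeleton.set_mode(next_mode)  (hypothetical interface).
```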
“Our control approach wouldn’t necessarily require human thought,” said Laschowski, who is supervised by engineering professor John McPhee, the Canada Research Chair in Biomechatronic System Dynamics. “Similar to autonomous cars that drive themselves, we’re designing autonomous exoskeletons and prosthetic legs that walk for themselves.”
The researchers are also working to improve the energy efficiency of motors for robotic exoskeletons and prostheses by using human motion to self-charge the batteries.