
New AI system gives robotic prosthetic legs more control

Robotic prosthetics are improving all the time, but one of the challenges robotic legs face is adapting to the specific terrain they are walking on.

Lower limb robotic prosthetics need to adjust their behaviour depending on the conditions underfoot, which could be smooth, bumpy, slippery, or have other relevant properties.

Now, a team of researchers have developed an AI algorithm that they say helps the system to better gauge different terrains, allowing the relevant adjustments to be made.

This involved not just the terrain the user was currently walking on, but also nearby terrain they were about to transition onto.

Edgar Lobaton, an associate professor of electrical and computer engineering at North Carolina State University and co-author of a paper on the work, said: ‘The framework we’ve created allows the AI in robotic prostheses to predict the type of terrain users will be stepping on, quantify the uncertainties associated with that prediction, and then incorporate that uncertainty into its decision-making.’

The researchers built a camera system linked to an artificial intelligence algorithm capable of identifying four different terrain types.

These were tile, brick, grass and concrete, with the system also able to differentiate between ‘upstairs’ and ‘downstairs’.
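As a rough illustration of how output from such a classifier might be handled (this is not the team’s actual model or code), the network can be thought of as producing a probability distribution over these six labels for each camera frame. The logits, helper names and example values below are assumptions made purely for the sketch.

```python
import numpy as np

# Terrain labels named in the article; everything else here is illustrative.
TERRAIN_CLASSES = ["tile", "brick", "grass", "concrete", "upstairs", "downstairs"]

def softmax(logits):
    """Convert raw network outputs into a probability distribution."""
    shifted = logits - np.max(logits)          # numerical stability
    exp = np.exp(shifted)
    return exp / exp.sum()

def classify_frame(logits):
    """Return the most likely terrain label and the full distribution."""
    probs = softmax(np.asarray(logits, dtype=float))
    return TERRAIN_CLASSES[int(np.argmax(probs))], probs

# Example: logits favouring 'grass', with some mass on 'concrete'.
label, probs = classify_frame([0.2, 0.1, 2.5, 1.4, -1.0, -1.2])
print(label, probs.round(3))
```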

System combines cameras and control software

For the research trials, the team used cameras worn in the form of eyeglasses, along with others fitted to the prosthetic leg itself.

The AI was able to analyse the feed from these cameras in real time, using the data received to make decisions on the type of terrain coming up.

Lead author Boxuan Zhong, a recent Ph.D. graduate from NC State, said that the AI was not forced to make a decision if the level of uncertainty was too high.

Instead, it could default to a ‘safe mode’ or potentially alert the user that it was not sure of the terrain type ahead.
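A minimal sketch of that kind of fallback logic is shown below, assuming predictive entropy as the uncertainty measure and an arbitrary threshold; the threshold value, label list and ‘safe mode’ action are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def predictive_entropy(probs):
    """Entropy of the class distribution: higher means less certain."""
    probs = np.clip(probs, 1e-12, 1.0)
    return float(-(probs * np.log(probs)).sum())

def choose_action(probs, labels, entropy_threshold=1.0):
    """Pick a terrain-specific gait mode, or fall back to a conservative
    'safe mode' when the prediction is too uncertain (threshold is illustrative)."""
    if predictive_entropy(probs) > entropy_threshold:
        return "safe_mode"
    return f"gait_for_{labels[int(np.argmax(probs))]}"

labels = ["tile", "brick", "grass", "concrete", "upstairs", "downstairs"]
confident = np.array([0.02, 0.02, 0.90, 0.04, 0.01, 0.01])
uncertain = np.array([0.20, 0.18, 0.17, 0.17, 0.14, 0.14])
print(choose_action(confident, labels))   # gait_for_grass
print(choose_action(uncertain, labels))   # safe_mode
```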

Helen Huang, another co-author of the paper, described the combination of computer vision and control software as ‘an exciting new area of research’ in the field of robotic prosthetics.

The team also said that the algorithm developed could have an impact on AI development in general.

Lobaton said: ‘We came up with a better way to teach deep-learning systems how to evaluate and quantify uncertainty in a way that allows the system to incorporate uncertainty into its decision making.

‘This is certainly relevant for robotic prosthetics, but our work here could be applied to any type of deep-learning system.’

