Dr Alessia Noccaro is studying how we can learn to control an extra robotic arm © Susie Woods, University of Newcastle

A sense of touch for extra robotic arms

When you could use a hand, Dr Alessia Noccaro might be able to give you an extra one. A roboticist at the University of Newcastle, Noccaro is studying extra robotic limbs, the best ways to control them without taxing our brains too much, and how we can ‘feel’ what they feel with haptics.

An extra arm could help us out in scenarios from the mundane to the exceptional, whether it’s hanging a picture on the wall or allowing a surgeon to apply pressure while operating with both hands. In rescue operations, firefighters could move debris while extracting injured people from underneath.

But it’s early days for the body augmentation field. One of the biggest hurdles is controlling an extra body part when it’s not already programmed into your brain.

So far, some of the best results have come from repurposing movements from another body part. At the Royal Society’s 2024 Summer Science Exhibition, researchers showcased a third thumb controlled by wiggling the toes. Around 98% of the 600-odd members of the public who tried it out learned how to use it in under a minute.
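This style of control can be pictured as a simple mapping from a repurposed body signal to an actuator command. The sketch below is illustrative only: the sensor, thresholds and calibration are assumptions, not details from the exhibit.

```python
def toe_to_thumb_command(toe_pressure: float,
                         dead_zone: float = 0.05,
                         max_pressure: float = 1.0) -> float:
    """Map a normalised toe-pressure reading (0..1) to a thumb
    flexion command (0 = open, 1 = fully flexed).

    All values here are illustrative guesses; the real device's
    sensing and calibration are not described in the article.
    """
    if toe_pressure < dead_zone:
        # ignore small involuntary wiggles below the dead zone
        return 0.0
    # linear mapping from the usable pressure range to flexion
    span = max_pressure - dead_zone
    return min((toe_pressure - dead_zone) / span, 1.0)
```

A dead zone like this is one common way to stop resting muscle noise from twitching the extra digit.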

But there’s still a missing piece: feedback. If a person swings their arm and hits a wall, they’ll feel it. But if they’re wearing an extra arm and it collides with something, they won’t feel that force. We also know where our bodies are in space through proprioception, which is often thought of as a sixth sense and is crucial for coordinating movement.

Both touch and proprioception provide critical two-way feedback when we learn new movements, and they’ll be vital if we’re to operate extra limbs (and digits) naturally and deftly. Noccaro is developing a platform that will allow users to control the robot and feel what it feels at the same time.

Alessia Noccaro, a Royal Academy of Engineering UK Intelligence Community postdoctoral researcher at the University of Newcastle, has a small collection of commercial robotic arms in her lab.

Reconstructing our senses of touch and space

Noccaro, a Royal Academy of Engineering UK Intelligence Community postdoctoral researcher, has a small collection of commercial robotic arms in her lab, including a lightweight arm that can be worn like a backpack. As with the third thumb, wearers can move the arm by wiggling their foot. The setup she’s developing also has two ways to provide feedback on the robot’s position in space.

One does this through small currents produced by an electrical mesh worn on the thigh, a device provided by European R&D company Tecnalia. “The sensation is between tickling and touch, so it’s not painful,” she says. “Wherever the robot moves, even if you're not looking at the robot, you can feel the robot on your leg.”

This means that even with your eyes closed, you would know more or less where the arm is – whether it’s travelling left or right, for example.
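One plausible way to picture this position feedback is as a binning problem: divide the arm’s left–right workspace into zones, and energise the corresponding patch of the electrode mesh. The workspace bounds and electrode count below are assumptions for illustration, not Tecnalia’s actual design.

```python
def position_to_electrode(x: float,
                          workspace: tuple = (-0.5, 0.5),
                          n_columns: int = 8) -> int:
    """Pick which electrode column on a thigh-worn mesh to energise,
    given the arm's left-right position x (in metres) within its
    workspace. Geometry and electrode count are illustrative."""
    lo, hi = workspace
    # clamp the position into the workspace, then bin it
    frac = (min(max(x, lo), hi) - lo) / (hi - lo)
    return min(int(frac * n_columns), n_columns - 1)
```

As the arm sweeps across its workspace, the active column walks across the wearer’s thigh, giving a rough spatial map of where the robot is without looking.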

Another feedback device, worn across the thigh and calf, has six small motors that vibrate differently according to the forces felt by the robot. If the robotic arm pushes against a wall, you can feel it, as data from the sensor is transmitted to your leg. And it’s not just collisions – if the robot moves rapidly into free space, you can even feel air resistance. Heavy and lightweight robots produce different sensations, too.
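A minimal sketch of this kind of force-to-vibration mapping might scale the sensed force and recruit more motors as it grows, so a brush of air resistance feels different from a hard collision. The force range and recruitment rule here are guesses, not the device’s actual scheme.

```python
def force_to_vibration(force_n: float,
                       max_force_n: float = 20.0,
                       n_motors: int = 6) -> list[float]:
    """Spread a contact force measured at the robot (in newtons)
    across six leg-worn vibration motors: larger forces recruit
    more motors at higher intensity. Scaling is illustrative."""
    level = min(abs(force_n) / max_force_n, 1.0)   # normalise to 0..1
    active = round(level * n_motors)               # motors to switch on
    return [level if i < active else 0.0 for i in range(n_motors)]
```

Recruiting motors one by one gives a coarse magnitude cue even when individual vibration intensities are hard to tell apart.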

Another route to understanding the brain

So far, she’s been studying simple movements, such as reaching for, picking up and moving objects. Much of her foundational work was in virtual reality, which is easier to study in the lab without the complicating factors of “mass and inertia that can also hit you”, explains Noccaro. Having moved on to hybrid experiments, she’s gearing up to publish work involving entirely real robots. The next step will be more complex (but still everyday) tasks: hanging a picture on the wall or mixing dough while adding flour.

One of the things that excites her most about her work, however, is the prospect of untangling how the brain learns. Noccaro’s experiments have shown that people can learn how to control the robot in under five minutes. "We know the brain is highly plastic – it can change a lot,” she says. “We know it can control a third arm."

She likens the way the brain adapts to another limb to learning to play piano: the motor cortex, which is responsible for movement, changes and its representation of the fingers grows. “I imagine something similar would happen with an additional limb, but we don’t know yet.”

Her work could feed into this fundamental neuroscience research, too. “We have the technology, but we don’t have the means yet, because we don’t know [how] people could use it proficiently,” she says. “Understanding how the brain controls robotic limbs is the most fascinating and empowering part of the research.”

***

Contributors

Florence Downs

Author
