In the last two decades, brain-computer interfaces (BCIs) have made remarkable strides in translating neural signals into motor commands. From robotic arms to mind-controlled quadcopters and thought-driven speech devices, these innovations offer exciting new possibilities for people living with paralysis, amputation, locked-in syndrome, and neurodegenerative disease. But for much of this journey, something vital has been missing. Feedback. The ability to feel the bionic limb that a brain is controlling. So why does this matter?
Video: Example of bionic touch (not just bionic movement)
The Missing Loop: Why Sensation Matters
Neuroscientists have long known the importance of sensation for coordinated action. We don’t just move our limbs. We simultaneously sense and move, with moment-to-moment signals travelling back and forth in loops governed by sensory-motor circuits.
Think about the last time you picked up a glass of water and took a sip. You didn’t just move your hand and do it. You sensed the glass: felt its shape, its temperature, whether it was slippery. You estimated the weight of the glass in order to calculate the force needed to lift it to your mouth. To take a sip successfully, your brain used sensory-motor loops to adjust the motion path from moment to moment. Disrupt even one element of this sensory feedback (like a mug that looks heavy but is actually light), and you’ll likely misjudge the force needed and spill the contents, or clink the glass against your teeth.
We know from clinical neuroscience that when the sensory-motor circuit is disrupted, smooth movement is nearly impossible. In some neurological conditions, the motor systems that control movement remain intact, but because the sensory-feedback loops are disrupted, standing upright or walking becomes nearly impossible. One sense, called proprioception, is particularly important for maintaining balance, stability, and coordination.
Proprioception is the sense that allows our body to know (and predict) where it is in space without looking.
💡You can test your own proprioception by standing up, with your feet together, and closing your eyes. You’ll start to notice micro-adjustments as your system works to maintain balance and stability. See how long you can stand still before you lose your balance and take a step. If that’s too easy, try lifting one leg, or moving your head side to side. Another popular test of proprioception is to close your eyes, lift up one hand, and touch the tip of your nose with your finger. How did you know where your nose was so accurately? Sensors in your muscles and joints, wired into your sensory-motor system, guide the movement.
Now imagine picking up a cup of coffee with a bionic limb. Your motor circuits tell the limb where to reach (based on sight), but once you clasp the cup, the limb gives you no sense of touch, weight, temperature, or proprioception as you send movement signals to guide it back to your mouth for a sip. You have no feedback.
Bionics of the Early Era
Many of today’s bionic limbs rely on machine-driven self-correction, like torque sensors and pre-programmed grip strength, to complete the circuit. Devices built this way have been limited in function. For example, suppose the bionic limb was programmed to expect a generic glass cup, but you’re out at a BBQ and try to grasp a disposable plastic cup. The limb would likely crush the cup and spill water all over you, and you or the engineers would need to reprogram the grip strength. This made bionics of this era clunky, error-prone, and limited in their applicability. Even with impressive precision in motor decoding, like when using neural implants placed directly on the brain beneath the skull, the absence of real-time sensory information leads to stiffness, overcorrection, and a sense of detachment from the prosthetic system.
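The difference between a pre-programmed grip and a grip guided by feedback can be sketched in a few lines of toy code. This is purely illustrative (it is not any real prosthetic control API, and the forces, stiffnesses, and thresholds are made-up numbers): an open-loop grip applies a fixed force no matter what it holds, while a closed-loop grip ramps its force up only until a simulated sensor says the object has deformed enough to be held securely.

```python
# Toy sketch: open-loop vs. closed-loop grip control.
# All names and numbers here are invented for illustration only.

def open_loop_grip(force: float, crush_threshold: float) -> str:
    """Pre-programmed grip: applies one fixed force regardless of the object."""
    return "crushed" if force > crush_threshold else "held"

def closed_loop_grip(target_deform: float, stiffness: float,
                     crush_threshold: float, steps: int = 50) -> str:
    """Feedback grip: ramp force until a 'sensor' reports enough deformation."""
    force = 0.0
    for _ in range(steps):
        deformation = force / stiffness     # simulated sensor reading
        if deformation >= target_deform:    # object is gripped firmly enough
            break
        force += 0.5                        # small increment per control tick
        if force > crush_threshold:
            return "crushed"
    return "held"

# A grip force tuned for a sturdy glass (crushes above 40 units) destroys
# a flimsy plastic cup (crushes above 5 units):
print(open_loop_grip(force=20.0, crush_threshold=40.0))  # glass: held
print(open_loop_grip(force=20.0, crush_threshold=5.0))   # plastic: crushed
# The closed-loop grip stops squeezing as soon as it feels the cup give:
print(closed_loop_grip(target_deform=0.5, stiffness=2.0, crush_threshold=5.0))
```

The closed-loop version never needs to be reprogrammed for a new cup, because the sensor reading, not a stored assumption, decides when to stop squeezing. That is exactly the role sensory feedback plays in our own grip.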
Let’s say that robotics can solve this problem and develop flexible, intuitive machine-driven self-correction. Think of the Boston Dynamics robo-dogs that can adjust to different terrain. Why not just skip the feedback loops to the brain?
Sensory feedback gives our actions not only precision and nuance, but also meaning, and meaning translates to learning.
Plasticity and the Case for Sensory-Driven Bionics
The brain is highly organized. It has a section for movement, called the primary motor cortex, a section for feeling, called the somatosensory cortex, and various control centers for refined actions, like the basal ganglia and cerebellum. These regions, and countless others, combine into neural circuits that create smooth, functional, trainable actions. Sensory feedback is not merely supportive; it is instructive. It tells the brain what matters.
💡Fun experiment: Try playing ping-pong in a room with a flashing strobe light. You’ll still be able to see the ball, but this simple change to your visual input will mess with your ability to predict the ball’s path. Can you hit the ball? How well?
Without feedback, bionics mimic function. With feedback, bionics reclaim embodiment. And that has a profound impact on the utility of BCIs for therapeutics and for helping people.
Some of my favourite work demonstrating this notion comes from Miguel Nicolelis and his team at Duke University. Known for pioneering large-scale brain-machine interfaces, Nicolelis has long emphasized that BCIs must be more than control systems; they must engage the sensorimotor loop.
In his Walk Again Project, a paraplegic man delivered the opening kick of the 2014 FIFA World Cup using a brain-controlled exoskeleton. In this setup, sensory feedback wasn’t just an add-on; it was a therapeutic tool. The exoskeleton delivered tactile signals to the user’s forearm, mapped from pressure sensors in the robotic feet. If the bionic exoskeleton sensed a rock on the ground, this was translated into signals on the user’s arm. Over time, some participants in the trial recovered voluntary motor function and regained sensation. This would not have been possible without the sensory feedback components.
“Plasticity is a two-way street. If you send signals back to the brain, especially in a meaningful, real-time way, the brain can reorganize, rebuild, and re-engage.”
— Miguel Nicolelis, 2015 interview on brain plasticity and BCI
Meaning is Magic: The Rubber Hand Illusion
The experiments above highlight another important notion: sensory feedback doesn’t need to be natural; it needs to be meaningful.
Recall that the exoskeleton’s feet were wired to sensors on the users’ forearms, because the participants had lost feeling in their legs due to paralysis. There is, however, a long history of this kind of re-wiring occurring rapidly and functionally in human and animal brains, sometimes within a few minutes: cue the Rubber Hand Illusion.
The rubber hand illusion is a classic psychology and neuroscience experiment. If you’ve taken my courses, or a first-year sensory neuroscience class, you may remember being part of a demo like the one in the video below.
It’s relatively easy to set up. You need:
- a fake hand (plastic or rubber)
- a board or screen (to block the real hand)
- a feather or pencil
- a willing participant (they will not be harmed)
The person’s real hand is hidden by a screen, while a fake rubber hand is placed in view. At first, the person won’t perceive the rubber hand as their own. But a connection is quickly established when both the real and fake hands are stroked in synchrony using the feather or pencil. By seeing and sensing the action matched in time and space, participants begin to feel as if the rubber hand is their own. The coup de grâce comes when participants flinch as the fake hand is threatened, say with a hammer.
What’s remarkable is that the brain accepts this synthetic limb not because it looks like a hand, but because the timing and spatial congruence of visual and tactile cues align in a way that feels real. The illusion shows how easily the brain can remap ownership – even to non-biological objects – when feedback is coherent and temporally precise.
These insights dovetail with the theories of György Buzsáki, a prominent neuroscientist known for his work on brain rhythms, self-organization, and neural plasticity. Buzsáki emphasizes the developmental necessity of sensory feedback in shaping brain architecture.
“The interaction between movement-triggered sensory feedback signals and self-organized spindle oscillations shapes the formation of cortical connections.”
— György Buzsáki, Rhythms of the Brain, 2006
Synthetic Touch: Reawakening the Brain’s Map
Fast forward to today, and movement-triggered devices are everywhere: from BCIs in medical tech to recreational mind-controlled toy cars, motion-triggered lighting, and gesture-controlled AR computing.
Having dabbled with some of these devices, I am usually frustrated that they aren’t where I would expect them to be. The companies working on virtual and augmented reality devices and games seem to prioritize flashy graphical interfaces and sleek-looking hardware at the cost of meaningful function. In my opinion, with a bit more development of highly precise timing and spatial congruence, the tech could more quickly and effectively create a truly immersive experience.
That’s why I’m especially excited by a recent development from the team led by Giacomo Valle at the University of Chicago, which was honored with the 2024 BCI Award. The team delivered synthetic touch to users controlling a bionic limb via micro-stimulation of the somatosensory cortex.
📽️ Watch the video to see how users describe feeling the sensations of the bionic limb. It’s a remarkable glimpse into a future where prosthetics don’t just move with our thoughts, but feel the world with us.
By restoring the feedback loop between body and brain, this work doesn’t just enhance prosthetic control or accuracy; it reintroduces a sense of ownership and embodiment. It adds meaning back into the movement. It opens new possibilities for neurorehabilitation, particularly for those recovering from stroke or spinal cord injury, where regaining sensation can be as important as regaining motion.
Together, the research on bionics and sensory feedback argues for a fundamental shift in how we design and deploy BCIs: not as one-way decoders, but as two-way interfaces – capable not just of interpreting intention, but of shaping perception, identity, and recovery.