It’s been decades since Robert “Buz” Chmielewski could move his arms properly. A surfing accident robbed him of that ability as a teenager, leaving him paralyzed from the neck down. But now, more than 30 years after the accident, Chmielewski has been able to cut food and feed himself thanks to a pair of thought-controlled robotic arms.

Back in January 2019, doctors implanted two sets of electrodes in each hemisphere of his brain. The hope was that the electrodes could help Chmielewski control two robotic prosthetic arms designed by researchers from Johns Hopkins Medicine (JHM) and the Johns Hopkins Applied Physics Laboratory (APL). Previous research on mind-controlled prosthetics, which rely on a brain-computer interface, has focused on a single arm controlled by one hemisphere of the brain, which is why the use of electrodes on both sides of the brain is particularly exciting.

As reported by Freethink, a portion of the robotic control is automated with artificial intelligence (AI): signals from the brain-computer interface are combined with AI assistance to drive the robotic hands. Two years after Chmielewski first started working with the researchers, he has finally mastered a new skill: cutting food and feeding himself, with each arm completing a different task at the same time.
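For readers curious what “combining brain signals with AI” can mean in practice, below is a minimal, purely illustrative sketch of one common shared-control idea: a movement command decoded from the brain-computer interface is blended with a command proposed by an autonomous controller. The function name, parameters, and numbers are our own assumptions for illustration; the article does not describe the actual Johns Hopkins implementation.

```python
import numpy as np

# Illustrative shared-control blend (NOT the actual JHM/APL system):
# a decoded intent vector from the brain-computer interface is mixed
# with an "assist" vector proposed by an autonomous AI controller.

def blend_commands(bci_velocity, ai_velocity, assist_level=0.5):
    """Blend user intent with machine assistance.

    bci_velocity : velocity decoded from neural signals (assumed input)
    ai_velocity  : velocity suggested by the autonomous planner (assumed input)
    assist_level : float in [0, 1]; 0 = pure user control, 1 = pure automation
    """
    return (1.0 - assist_level) * bci_velocity + assist_level * ai_velocity

# Hypothetical example: one arm steadies the food with heavier AI assistance,
# while the other guides the utensil with more direct user control.
left_cmd = blend_commands(np.array([0.02, 0.00, -0.01]),
                          np.array([0.00, 0.00, -0.02]),
                          assist_level=0.7)
right_cmd = blend_commands(np.array([0.05, 0.01, 0.00]),
                           np.array([0.04, 0.00, 0.00]),
                           assist_level=0.3)
print(left_cmd, right_cmd)
```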

“Being able to control two robotic arms performing a basic activity of daily living — in this case, cutting a pastry and bringing it to the mouth using signals detected from both sides of the brain via implanted electrodes — is a clear step forward to achieve more complex task control directly fed from the brain,” said Pablo Celnik, director of physical medicine and rehabilitation at Johns Hopkins.

For Chmielewski and the researchers, a quadriplegic man’s ability to cut food and feed himself represents a massive step forward for mind-controlled prosthetics. The researchers’ next goal is to add sensory feedback, which would let the user “feel” whether they are completing a task correctly rather than simply observing it. Should they achieve this, we’ll be sure to give you an update!
