
Podcast: Neuroprosthetics: The Next Step Towards Limb Reconstruction


In this episode, we discuss a breakthrough from ETH Zurich that mimics natural foot signals to the brain using state-of-the-art neuroprosthetics.



EPISODE NOTES

(0:50) - Bio-inspired Neuroprosthetics

Link to episode 126: https://pod.link/wevolver/episode/13acdcea436f95a7b6b4977991808c60

Become a founding reader of our newsletter: thenextbyte.com/#read


Transcript

What's up folks, in today's episode we're talking about some scientists from ETH Zurich who have cracked the code of the human nervous system's language and found a way to replicate it in prosthetics. It's really interesting, so let's jump into it.

I'm Daniel, and I'm Farbod. And this is the NextByte Podcast. Every week, we explore interesting and impactful tech and engineering content from Wevolver.com and deliver it to you in bite sized episodes that are easy to understand, regardless of your background. 

Daniel: What's up peeps? Like we said, today we're talking all about a team from ETH Zurich that has cracked the code on the human nervous system, and they're able to reprogram the conversation between our real human nervous system and artificial or prosthetic limbs. But before we jump into that, I actually wanna take a second for a little passion plug and to advertise, and our new sponsor is us. Folks, we're making a newsletter, the NextByte newsletter. Throughout the process of building this podcast, we've focused on becoming experts in communicating interesting and impactful technology in a way that's quick and easy to understand. We're gonna take the same fire and secret sauce that we've been bringing to you in the podcast and deliver it to you in a written format. So, we just started the newsletter. We're gonna make it great. We're gonna do everything we can to make it the best newsletter you've ever read. And I think you should join now and be one of our founding readers, one of the first folks to sign on to the newsletter. We're gonna link it in the show notes. You can also go to thenextbyte.com and click the Read button. It takes two to ten seconds, depending on how fast you can type your email, to sign up.

Farbod: Yeah, and look, we understand. We would love everyone to listen to every episode, but sometimes you just don't have the time. Heck, sometimes I don't have the time. So, we've made a very easy to read, easy to digest newsletter where you can still get the juicy bits of the sauce and walk away satisfied.

Daniel: And it's actually the tagline, it's at the top of the newsletter, but what we say there is that the newsletter is designed to be skimmed. We're no nonsense. We're not going to fill it with a bunch of fluff. We just want to get you, like Farbod said, the juicy bits of the secret sauce, give you the parts, the points that you need to know. The cliff notes from our podcast episode, if you will. So, you should check it out. We're excited about it. You should be excited about it too.

Farbod: We'll link it in the show notes too. Make it easier.

Daniel: Yeah. One click from the show notes. Boom. Right there. You'll be right there.

Farbod: 2-10 seconds, guaranteed, unless you are a really slow typer, then it might be more like 15.

Farbod: Then we don't guarantee it, we take it back.

Daniel: We don't guarantee it. All right, but let's get into the awesome sauce for today, which is this revolution in neuroprosthetics. Kind of some background, right? People have been making prosthetic legs, prosthetic arms, robotic limbs, and artificial limbs to try and replace the missing limb of someone who may have had an amputation. Take legs for example, right? People have been making robotic legs for a while to try and help folks with prosthetics walk, and it's really, really challenging for someone to learn how to walk on these legs, because right now the legs don't really talk to the brain at all. But even the ones that we've tried to connect into the nervous system, to provide some sort of electrical feedback into the brain that allows the person wearing the prosthetic leg to feel some sort of sensory input, haven't been working that well. It often makes these, they call them neuroprosthetics, makes these neuroprosthetics feel weird or even very uncomfortable. I forget the exact term, but I think it's paresthesia, where you get sensory issues with the nervous system because it's getting sensory overload from these neuroprosthetics. Basically, what I was going to say is it's like they tried to reverse engineer the nervous system.

Farbod: Correct.

Daniel: Didn't do that good of a job with it, and you're just getting a lot of noise versus a lot of signal. I imagine a really old, staticky TV before you adjusted the bunny-ear antennas: a lot of static, a lot of awful noise, a lot of bad pictures. That's what the nervous system is receiving right now from neuroprosthetics, because we were trying to use computer-based signals. Think about the electrical pulses that are being sent around inside your computer right now to communicate. We tried to use those signals and connect them to the brain in a way that they thought the brain would be able to understand. The brain's not great at it. And so, if I'm someone who had an amputation and I get a neuroprosthetic right now, the user experience for me, even though the state-of-the-art technology is really cool and yes, I can feel some sort of sensation in my missing limb using this technology, it might be painful. It might be really challenging to understand what the signals are. And honestly, the user experience probably just downright sucks versus using a normal prosthetic limb without any of this sensory feedback.

Farbod: Yeah. Yeah, you're right. And from what I gather, they made a point of saying the current state of the art when it comes to these neuroprosthetics is they take the input, for example, I just stepped on a rock or something. They take the input, they translate it into a signal, but that signal is not mapped to anything in the real world. It's just, pressure is felt, so let's put out something to the sciatic nerve. And the something that they put out is a time-constant electric pulse. So, like you said, it's kind of like a square wave, where it's on for 50 milliseconds, off for 50 milliseconds, for the entire duration that there is activation. And again, we don't know if that's how the body operates, but what we do know is that the users who have these implants are saying, yeah, I'm feeling something, but it doesn't feel right. So that's kind of where we're at. Yeah. Where are we going?
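[To make that contrast concrete, here's a small hypothetical Python sketch of the two stimulation strategies. Only the 50 ms on/off timing comes from the episode; the sample rate, pressure curve, and the amplitude-modulation scheme are invented purely for illustration and are not from the study.]

```python
import numpy as np

# Hypothetical illustration (not code from the study): contrast the constant
# on/off pulse train described above with a pressure-modulated alternative.
# The 50 ms on / 50 ms off timing is from the episode; the sample rate,
# pressure profile, and modulation scheme are invented for illustration.

fs = 1000                       # samples per second (1 ms resolution)
t = np.arange(0, 1.0, 1 / fs)   # one second of a stance phase

# Current approach: a fixed on/off pulse train, the same whenever pressure is felt.
square_wave = (np.floor(t / 0.05) % 2 == 0).astype(float)

# Bio-inspired idea (conceptually): let the stimulation follow the sensed
# pressure profile so the nerve receives something shaped like natural firing.
pressure = np.sin(np.pi * t)              # toy heel-strike-to-toe-off pressure curve
biomimetic = pressure * square_wave       # amplitude tracks the stimulus

print(f"constant pulse train energy: {square_wave.sum():.0f}")
print(f"biomimetic train energy:     {biomimetic.sum():.0f}")
```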

Daniel: Well, I think this team from ETH Zurich, they wanted to continue down this path of neuroprosthetics, right? Build a new type of robotic leg that talks to the brain, that uses signals we can collect with sensors on the artificial limb, but try and make them mimic natural ones. So instead of using, like I said, computer-based waves, sending them into the brain, and making the brain figure out what that means, can we try and understand what the actual nervous system signals look like and then do our best, using the technology we have, to replicate how those signals look? Bio-inspiring our electrical signals, basing them off the biosignals that are transmitted inside the body right now.

Farbod: Correct.

Daniel: How do we make that experience much closer to real leg sensations? How do we make it a lot easier for the body to understand? And really what they say here is they're trying to understand the language of the body. They're trying to understand the language of the nervous system. Once they do that, they can translate our computer signals into neural signals all day and all night; they can do it forward, backward, inside out, and upside down. But they've got to first understand this translation bit, understand how to translate the electrical signals we're getting from sensors into, you know, the language that the body can understand, basically the neural signals. What is the nervous system doing?

Farbod: Right, and I think for the first time since episode 69, we're about to get very foot centric.

Daniel: Yeah, they built a computer model called FootSim. Which may not be what you think it's used for. What they basically did is they looked at and instrumented real feet, real nerves in feet.

Farbod: Of volunteers.

Daniel: Yeah. Volunteers. Yeah. And they saw how these nerves in the feet sent signals and communicated to the brain.

Farbod: In response to stimuli.

Daniel: Different stimuli. I think there were three of them, right? Touch, pressure, and movement. So, they tried different combinations of touch, different amounts of pressure, and different amounts of movement, and then kind of tried to map how the signals go from the foot to the brain when those stimuli are presented to a real human foot. And the goal here was to try and understand how that works, right? And then translate that language into something that we can try and replicate with our computers, with our electronics, with our state-of-the-art technology, and kind of allow our neuroprosthetics, right, these replacement limbs, to speak in a language that, as closely as it can, mimics the real-life sensory feedback that's actually happening in a body with a healthy human foot during walking or running.

Farbod: So just to recap, the purpose of this FootSim volunteer experiment was that they would be able to see that if I, for example, tickled Farbod's big toe, it produced this signal. If I did this to his little toe, it did this. And as they gathered this data, they could have this FootSim model tell them, well, what would happen if Farbod did a heel strike? What kind of a signal would it generate to the rest of his leg? The idea there being that if it goes to someone who has lost their foot, they can mimic that signal because their prosthetic, their neuroprosthetic, just did a heel strike. They can try to map it as closely as possible.
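[As a rough illustration of that mapping idea, here's a toy Python sketch in the spirit of FootSim. To be clear, this is not the researchers' model: the foot regions, gains, and time constants below are all invented for illustration, whereas the real model is fit to nerve recordings from volunteers.]

```python
from dataclasses import dataclass
import numpy as np

# Toy sketch in the spirit of FootSim (not the researchers' model): map a
# stimulus on the sole of the foot to a predicted nerve firing pattern.
# The regions, gains, and time constants below are invented for illustration.

@dataclass
class Stimulus:
    region: str        # e.g. "heel", "big_toe", "little_toe"
    pressure: float    # normalized 0..1
    movement: float    # normalized 0..1 (e.g. skin stretch or slip)

REGION_GAIN = {"heel": 1.0, "big_toe": 0.8, "little_toe": 0.5}  # illustrative only

def predicted_firing_rate(stim: Stimulus, t: np.ndarray) -> np.ndarray:
    """Return a toy firing-rate profile (spikes/s) over time t for one stimulus."""
    gain = REGION_GAIN.get(stim.region, 0.6)
    onset = 1 - np.exp(-t / 0.02)              # fast rise at contact
    adaptation = np.exp(-t / 0.3)              # slow decay as receptors adapt
    rate = 100 * gain * stim.pressure * onset * adaptation
    rate += 40 * gain * stim.movement * onset  # movement adds a sustained component
    return rate

t = np.linspace(0, 0.5, 500)
heel_strike = predicted_firing_rate(Stimulus("heel", pressure=0.9, movement=0.2), t)
print(f"peak predicted firing rate: {heel_strike.max():.1f} spikes/s")
```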

Daniel: Yeah, nailed it. And I think the next step here, which didn't feel super ethical to me, but it's interesting: they tested on cats to try and validate their model. I don't know that cats can volunteer for something like this, but I imagine it wasn't painful for the cats.

Farbod: As a proud cat parent, a double cat parent, this was difficult for me to digest. They said they followed all the ethical guidelines and whatnot, and I'm sure, or at least I like to think, that they did, but it doesn't feel great. However, it's for the good of science. They mentioned, if I'm not mistaken, that the nervous system of cats is the one most similar to humans', which is why they're doing this experiment with cats. But yeah, they implanted an electrode in their spinal cord and one in their legs. Do you want me to get into it, or did I hijack this bit for you?

Daniel: No, no, no, I just think the goal here was to try and validate their FootSim model and do a very similar thing in cats, right? Where they collect the natural signals and then they try and replicate the natural signals, and kind of do A/B testing versus the computer-based signals that they use today, and see which ones the cat's brain was more likely to process. And basically, long story short, they did these experiments, and they found that the cat brain and the cat nervous system process these bio-mimicked signals, these natural signals, much better than the old, simpler ones that are generated from computers.

Farbod: Right. And I think the big takeaway, at least for me, was that they were able to take the cat's paws and apply a little bit of pressure so it looked like they were walking. And they were able to verify on the electrode on the spinal cord that they got some signal. And then they let the cat rest and used the electrode on their leg to send the impulse that the FootSim model was telling them they should be sending if a cat was walking. And that signal, when received by the spinal cord, matched that of the actual physical paw stimulation, meaning that the real world was matching their model's behavior. So, there was a one-to-one parity.
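[Here's a tiny hypothetical Python sketch of that validation logic: compare the response recorded during real paw pressure with the response evoked by model-driven stimulation, and score how closely they match. The signals below are synthetic stand-ins; in the study the comparison was between actual electrode recordings.]

```python
import numpy as np

# Hypothetical sketch of the validation logic (synthetic data, not the study's
# recordings): compare the spinal response recorded during real paw pressure
# with the response evoked by model-driven stimulation, and score the match.

rng = np.random.default_rng(0)
t = np.linspace(0, 0.5, 500)

# Stand-in for the spinal-cord recording during a real paw press.
recorded_response = np.exp(-t / 0.1) * np.sin(2 * np.pi * 8 * t)

# Stand-in for the response evoked by stimulation shaped by the FootSim-style model.
model_evoked = recorded_response + 0.1 * rng.standard_normal(t.size)

# Pearson correlation as a simple similarity score (1.0 would be perfect parity).
similarity = np.corrcoef(recorded_response, model_evoked)[0, 1]
print(f"similarity between recorded and model-evoked responses: {similarity:.2f}")
```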

Daniel: But the part that I was not super thrilled about is when they were testing the old signals, the current-state signals, they said that the cat's nervous system experienced the same information overload, unpleasant sensations, and paresthesia reported by some users of neuroprosthetics. So, I guess it's no worse than what we're already prescribing to our own neuroprosthetic patients, which sucks to some extent, but I'm like, maybe we didn't have to jack up the cats' nervous systems.

Farbod: But I think, again, the value add there might have been that they can actually monitor how the body is behaving between a good signal and a bad signal. And I think that's what led them to the takeaway that because the brain is more used to these natural signals, it can process them better, versus these artificial signals that it just cannot, I guess, comprehend and process effectively. Because, and I'm gonna go into the "so what" now, when they tested this new neuroprosthetic approach with actual human patients, they got reports of them being able to walk and climb and run much better, but they could also do multiple tasks, for example, talking while walking, much more effectively and without flaws, because the computational load on their brain to deal with the info coming from their legs was now much lighter. Right? So maybe, I mean, again, I love my cats and I don't want to think that the cats were tortured by any means, but maybe that little tidbit was actually pretty insightful.

Daniel: No, I agree. And I think one of the things this also unlocks, let's say for the future, is they mentioned that this shows a lot of promise for other devices that help people and are already connected to the nervous system. They mentioned spinal implants and brain electrodes. These are devices that already exist and are in a separate realm from neuroprosthetics, but they're saying maybe the pattern we saw here in the prosthetics field applies there too: instead of forcing the brain to adapt to our arbitrary signal, the one that we wanna make. And the brain actually is very, very adaptable; it has shown the ability to adapt to and understand different types of signals. I think of cochlear implants, right? The brain learns to understand the electrical signal from the cochlear implant as though you were actually stimulating the auditory nerve, as though you actually had an ear. So that being said, they're saying, for all these things that already exist and already work pretty well as is, can we take the same approach and try and mimic the actual nervous system response and see if the brain is even better at understanding it? If it's even easier for the brain to understand? I'm wondering if, for things like cochlear implants, or for people who have electrodes in their brain to help stimulate the brain for certain types of conditions, studying and mimicking the way that the nervous system communicates, instead of forcing the brain to compensate for the differences between the nervous system and our arbitrary signals, could unlock some extra level of effectiveness in all sorts of devices that connect to the nervous system.

Farbod: I totally agree with you, man. And I think that's the potential that the researchers saw as well, because they made a note of it. They were like, we don't want to stop here. We want this to grow and be applied to more use cases beyond just prosthetics.

Daniel: Yeah, I agree.

Farbod: Yeah, super exciting. Wanna do a quick recap?

Daniel: Would you mind wrapping this up?

Farbod: Of course. So, the current state of neuroprosthetics, which are prosthetics that can actually send signals to your nervous system, is not very great. A lot of users are reporting discomfort; they're talking about how their body feels overwhelmed by the signals that they're getting. Well, these folks have actually realized that the patients are pretty spot on. ETH Zurich researchers have been doing some studies and they noticed that these signals are unnatural, which is not what your body is expecting. In order to understand what your body should be expecting, they've created a new machine learning model called FootSim. Now, FootSim is trained with data that they've gathered from people who have, you know, fully functioning bodies. They stimulate the legs and they look at what the nervous system is receiving in terms of signals, and then they try to map it so that for someone who is missing a limb, for example, they can simulate that exact same signal. To verify that this is working as expected, they've tested this system on cats, who have a nervous system very similar to human beings', and they verified that the model is generating the signal in pretty much one-to-one parity with what the cats are feeling, which means the model is working as expected. To extend that, human trials have shown that this new generation of neuroprosthetics is allowing people to walk, climb, and run much better, much more conveniently, with no more weird feeling in their legs. And on top of that, because their body's no longer getting this artificial signal that it can't understand, they're able to multitask much better and without fail.

Daniel: Awesome, nailed it. I think there's a couple things that I wanna highlight before we close out for the day.

Farbod: Let's do it.

Daniel: One of them being, I'm gonna dunk on myself here.

Farbod: Okay.

Daniel: In the past, when we talked about episode 126, it was another similar episode where we were allowing folks with prosthetics to feel certain sensations. In that case, they were feeling warmth. And I'd mentioned in that episode that they were kind of taking an outside-the-box approach. They weren't trying to reverse engineer the nervous system. They were just directly applying temperature to the residual limb, the part of the limb that's left, and then allowing the brain to kind of infer the difference and say, oh, maybe that's where I'm feeling the temperature, in the rest of my missing limb. And it worked pretty well. But I did want to dunk on myself, because I took a pretty hard stance when we read that article and recorded that episode, saying, oh, I know there are going to be some neuroscientists out there who try to reverse engineer the nervous system, and they aren't going to be able to do it. Well, it seems like this team from ETH Zurich maybe heard my challenge and did exactly what I said they couldn't do. So again, a little hat tip to the folks at ETH Zurich. And then also, if you're interested in the alternate approach, you can check out episode 126. We're gonna link that in the show notes.

Farbod: Awesome. Do we have any thank yous we gotta give out?

Daniel: I think we've got two thank yous.

Farbod: Okay.

Daniel: I'm gonna do a thank you to Hong Kong, because I have already learned this in Mandarin and I'm cheap, but xièxie nǐmen, which means thank you, everyone. Thank you everyone for listening. You helped us trend in the top 200 podcasts in Hong Kong. I think we've got another one as well for Portugal.

Farbod: We do and I'm going to take that on because I visited your beautiful country and I am a forever lover of Lisboa. So, obrigado por avó.

Daniel: Also, Top 200 Podcasts in Portugal. So, thank you to our friends in Hong Kong and Portugal.

Farbod: I love your sardines. I'm just gonna put that out there.

Daniel: Well, that's what I was going to say. We've never followed through on this promise, but at some point we've got to actually sit down and record evidence, proof of us eating foods from these countries that we've trended in. I think we've got probably four or five of them now.

Farbod: I got Portuguese sardines upstairs.

Daniel: Four or five that we've queued up since we made this promise and haven't yet delivered on. So maybe that'll be our next social media series: us eating food from all these countries that we've trended in.

Farbod: Maybe, maybe. Let's see about that. All right, everyone, thank you so much for listening. As always, we'll catch you in the next one.

Daniel: Peace.


As always, you can find these and other interesting & impactful engineering articles on Wevolver.com.

To learn more about this show, please visit our shows page. By following the page, you will get automatic updates by email when a new show is published. Be sure to give us a follow and review on Apple Podcasts, Spotify, and your other favorite podcast platforms!

--

The Next Byte: We're two engineers on a mission to simplify complex science & technology, making it easy to understand. In each episode of our show, we dive into world-changing tech (such as AI, robotics, 3D printing, IoT, & much more), all while keeping it entertaining & engaging along the way.


The Next Byte Newsletter

Fuel your tech-savvy curiosity with “byte” sized digests of tech breakthroughs.
