
Podcast: Why Robotic Limbs Are Critical To The Human Race



In this episode, we delve into the cognitive strategies researchers at EPFL employed to augment the human body with an additional robotic arm, learn about the profound impact of cognitive factors on the integration of advanced robotic limbs, and come away with a better understanding of how the human mind works.


This podcast is sponsored by Mouser Electronics


EPISODE NOTES

(3:29) - Cognitive strategies to augment the body with an extra robotic arm - EPFL

This episode was brought to you by Mouser, our favorite place to get electronics parts for any project, whether it be a hobby at home or a prototype for work. Click HERE to learn more about the existing technologies that can enable superhero-like “powers”!


Transcript

What's going on, folks? Happy New Year. Welcome back to the Next Byte podcast. And this one, we're talking all about how you can get a third robotic arm. That's right. Ever think about the fact that two arms might not be enough, maybe when you're cooking? Well, these folks at EPFL have got you hooked up. So, if that's got you excited, buckle up and let's get into it.

I'm Daniel, and I'm Farbod. And this is the NextByte Podcast. Every week, we explore interesting and impactful tech and engineering content from Wevolver.com and deliver it to you in bite-sized episodes that are easy to understand, regardless of your background.

Farbod: All right, people, as you heard, we're gonna be talking about the connection between human beings and robots for our first episode of 2024. But before we get started, let's quickly talk about today's sponsor, Mouser Electronics. Now, you know we love Mouser. We love working with Mouser because being one of the world's biggest electronics suppliers means that they've got great connections to what's happening in industry and the latest in academia, and they actually get to share this information with the rest of us via their technical resources. The one we're talking about today is about how all these superhero movies we've seen, think Iron Man, think to some extent RoboCop, are kind of now being made possible by additive manufacturing and artificial intelligence. They even made the point that your phone has kind of become a part of you now, so if someone asks what's the most popular food in the world, you can quickly look it up and say pizza. And with the emergence of technologies like Neuralink, this tech is getting closer and closer to us and getting integrated into us.

Daniel: Well, I would say this whole genre of entertainment is called science fiction, let's say. But they mentioned that with the advent of additional technology, it's becoming less and less science fiction and more and more like science fact, or quasi science fact, right? That's part of why, to me at least, all this stuff is so entertaining: it doesn't feel like it's completely out of reach. It almost feels like a mix between science fiction and reality, because we've got technology that makes it possible to do body modification, things that will massively improve your health; human-machine integration, like you're saying, having a computer connected to your brain; and extended reality, using virtual and augmented reality for new experiences. It reminds me of Iron Man's heads-up display. When I first saw that, I was like, wow, that'd be so cool. And now we've got so much technology being demonstrated to show that you can overlay what you see with the rest of the world. I don't know. It's really interesting. And as a pretty big sci-fi fan, seeing this stuff come to life as an engineer, both of those parts of my brain, they're tickled right now.

Farbod: Yeah. I'm with you, man. The heads-up display is a good one. Cause I too remember watching Iron Man for the first time, being like, that is crazy. And then you hop into a 2024 Civic and realize that technology is already integrated. It's pretty exciting, and it's kind of a testament to how fast technology is growing. And with that said, let's jump into today's article, which is coming from EPFL. This one was interesting because when I first got my hands on it and was going through it, I was like, this doesn't sound so novel. When we were in college, there were projects where students would mount a sensor to a muscle, and as they moved it around, it would make a robotic arm open or close, something along those lines. But then you get into the meat of the article and realize the goal here is actually multi-fold: to better understand how the human brain works, and to figure out if there's a way for us to extend our capabilities by adding another appendage. Now, I feel like I've teased it at a high enough level. So, let's talk about what's happening here. These folks at EPFL wanna tackle this idea of: can we control an extra appendage, an arm of some sort, without impeding our normal operation? Right?

Daniel: And, like you said, there's a lot of work that's been done saying, I'm working with someone who's an amputee. Can I help them by replacing their amputated arm with a robotic arm? What are the control strategies for that? What this team from EPFL is doing is taking a lot of the research that's been done in that space and combining it with additional research in the cognition space to understand how we can control a limb with the least amount of cognitive load possible, right? And now they're saying, forget just giving someone who used to have two arms one more to total them back up to two. That's a great and noble cause, but they're saying, if I've got two arms, what stops me from getting a third arm that makes me more efficient at the jobs I'm doing? I'm sure you've felt, when you're cooking in the kitchen or moving boxes around the house, like, oh man, I just need an extra hand to get this done. This team from EPFL is trying to find a way to make that "oh, I just need a third hand" a reality using technology. And I think it really links well with what we talked about at the beginning of the episode about sci-fi becoming real. The first thing that popped into my head when I read this article was Dr. Octopus.

Farbod: Yeah.

Daniel: So obviously he ended up with like four extra arms, if I remember right. But this is about scientists building a special third robotic arm that you can control, I think actually with your breathing, to limit the amount of cognitive load that's required. For someone like you and me, fortunate enough to be able-bodied, both with two arms that function properly, why not find a way to get a third arm, so that when we're doing manual work or trying to untangle these wires for recording the episode, when you need an extra hand, why not use technology to make that possible too? To increase productivity, to increase safety, et cetera.

Farbod: For sure. And the way they approached the proof of concept and implemented it is pretty interesting. As I mentioned earlier, a big requirement was that whatever the means of controlling this extra arm is gonna be, it shouldn't interfere with normal functions. So, going back to my earlier example, if the sensor is based on what my left bicep is doing, that means even when I don't want to control that extra arm and just want to move my left arm, the extra arm is going to move by accident.

Daniel: Or if you just want to flex your left bicep because you actually want to move your left arm, the robotic arm's going to pick that up as a signal and move as well.

Farbod: Correct.

Daniel: So, I agree with you. The most challenging part of this, I think, from what I could determine, was for these EPFL scientists to find a way to enhance the human body, to leverage the extra cognitive power that we have in our brains and translate that into physical power to move things in the real world, but do that without interfering with existing body control. So, this arm can't be clumsy, and it can't be tied to signals you already use to control the two arms that you do have. When I start thinking about it that way, I'm like, man, there really isn't a great way that I'm naturally aware of to control a third arm without it being confused with signals from my left or my right arm. And I want them all to work in a beautiful symphony together without crashing into each other or hurting me. I can't think of a way off the top of my head that I would try to control that robotic arm, except maybe by talking, right? Cause you and I are talking with one another, and I feel like I can talk and move my arms and those don't interfere with each other. But you know, I need my eyes to blink, and I truly need my facial expressions to be able to communicate with you. So other than speaking, that's pretty much the only way I could have fathomed trying to control a robotic arm.

Farbod: You see, I was in the same pickle as you. I was like, how would you approach this? Everything that came to my mind, besides directly giving voice commands or something like a Neuralink that can directly interface with the device via your thoughts, seemed like not a great option. But then, that's kind of where we start to get into the sauce. The approach they used leverages the diaphragm: a sensor is strapped to your stomach, and very specific motions of your diaphragm command this extra appendage to do whatever you want it to do. And the way they tested this out at the beginning was in a virtual environment, where the user controls an exoskeleton that mimics the inputs of a left arm and a right arm to make sure it's actually working. Then they would expand and contract their stomach, which the diaphragm sensor would pick up, and that would move this extra third arm in the virtual environment. What this allowed them to do was monitor how it works while the person was just breathing normally or talking, to make sure you're not accidentally putting in an input when you don't want to move that arm, kind of the scenario we were talking about earlier with the bicep sensor. And it worked as expected. That's pretty impressive. I would have assumed anything that comes from the breathing process would be prone to error, but I guess they've put in a bit of processing that ensures only this very distinct expand-and-contract motion gets picked up as an input.
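To make that last point a bit more concrete, here's a minimal sketch of how a simple amplitude-plus-rate threshold could separate a deliberate diaphragm push from resting breathing. This is purely our illustration, under assumed signal names and threshold values, not the EPFL team's actual processing pipeline.

```python
# Hypothetical sketch: rejecting resting breathing with amplitude and
# rate thresholds. Names, units, and thresholds are illustrative
# assumptions, not the EPFL team's design.

def diaphragm_commands(samples, baseline, amp_thresh=0.5,
                       rate_thresh=2.0, dt=0.01):
    """Turn belly-expansion readings into per-sample arm commands.

    samples:     expansion sensor readings (arbitrary units)
    baseline:    resting level, calibrated during quiet breathing
    amp_thresh:  how far above resting breathing a push must reach
    rate_thresh: how fast the expansion must change (units/second)
    dt:          sample period in seconds
    """
    commands = []
    prev = samples[0]
    for x in samples:
        rate = (x - prev) / dt
        # A deliberate push is both larger and faster than quiet
        # breathing, so normal respiration never clears both gates.
        if x - baseline > amp_thresh and abs(rate) > rate_thresh:
            commands.append(x - baseline)  # proportional arm command
        else:
            commands.append(0.0)           # treat as normal breathing
        prev = x
    return commands

# Quiet breathing (small, slow) vs. a sharp intentional push.
quiet = [0.10, 0.15, 0.20, 0.15, 0.10]
push = [0.10, 0.90, 1.40, 1.30, 0.20]
print(diaphragm_commands(quiet, baseline=0.1))  # all zeros
print(diaphragm_commands(push, baseline=0.1))   # nonzero mid-push
```

The design idea is simply that an intentional gesture is both bigger and faster than tidal breathing, so ordinary respiration never clears both gates at once.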

Daniel: One of the things that first made me aware of the extreme level of control we have over our diaphragm, was actually when I was in elementary school band.

Farbod: Okay.

Daniel: The teacher was telling us, I guess the word is embouchure, that you're supposed to have the correct posture and flex your diaphragm the correct way so you're able to get the most out of your lung capacity when you're playing a long note. But I had no idea that the way I sit and flex my diaphragm, and I'm doing it right now, flexing my diaphragm and moving my stomach around, is pretty much unrelated to my ability to speak. Like, I'm doing it right now while I'm talking. It's pretty much unrelated to my ability to move my arms around. It seems like this pickle that we were both in mentally, this team from EPFL…

Farbod: Pinpointed.

Daniel: Yeah, pinpointed a way that they can do really, really sensitive movement. And again, they tested this over 150 sessions in this virtual environment with 61 different healthy subjects. And I'm sure they tried to get people of different sexes, sizes, shapes, etc., to try and make sure their hypothesis was true here, that you could use diaphragm movements. And of the 61 healthy subjects, every single one could, after a couple minutes of trying, start to intuitively control this third arm. It didn't hinder their ability to use their natural limbs. It didn't hinder their ability to speak, to breathe, or even to move their gaze, to train their focus around the room. None of those interfered with their ability to control the robotic third arm, which I think is really interesting.

Farbod: And that was surprising to me, because I would have expected the learning curve to be kind of steep, right? This is a foreign appendage; you're not born with a third arm. I would have figured it would take a long time to figure out how to make it work, and then make your body behave the way that would make it work, but no, it apparently worked pretty smoothly. And just for reference, they also tested this outside of a virtual environment, actually building a robotic arm that actuated in a linear fashion, to make sure this could work. And in addition to that, here's what they were trying to do in the virtual environment. You know how your left hand and your right hand collaborate when you juggle, with your mind acting as the main interface? They were trying to see if they could get the third arm to work collaboratively with the other arms as well, and it looks like they also made progress on that front, which again is kind of mind-blowing to me, given that it's so foreign to your body. But it kind of proved their hypothesis, right, that our bodies are able to adapt to an extra appendage, which is weird, but kind of, I don't know, exciting. And at the same time, one thing the lead researcher mentioned early on in this article was that by understanding how our brains adapt to a new appendage, and the limitations of that, we actually gain insight into how our brain works. So, by undertaking this research and understanding what those roadblocks are, we get a better understanding of how our brain works, which then feeds back, like a feedback loop, into brain research and its various applications.

Daniel: Yeah, I'm gonna show myself here as a little bit of a Huberman bro. Okay. But I love that Huberman Lab podcast, and a lot of what he talks about is how the physical outcomes you want to achieve with your body in the real world all boil back down to neuroscience. He's like, for the human body, a lot of the first principles are neuroscience. Similar to the way, I would say, in the engineering realm, if you're trying to push the boundaries of what's possible, physics holds the first principles of what's possible. And to me that analogy rings true here, where we're saying, we're not pushing the laws of physics here. They actually used a pretty simplified robotic arm when they did real-life scenario testing. It wasn't anything groundbreaking in terms of degrees of freedom, or the degree of control someone has over that arm. That physical arm wasn't where they were pushing the boundaries of what's possible. Where they're pushing the boundaries, and where they're learning, is with human cognition. They're trying to understand the brain's capacity to not just handle two arms in symphony, plus two legs, our voice to be able to speak, and our eyes to be able to pan around. They're saying humans can do all of that and still have the mental capacity to accurately control another appendage, which to me is pretty awesome, right? It shows just how powerful our brains are. What they've really done is demonstrate a way for humans to adapt to and control an extra limb without compromising the control of the rest of their body. It's not only showing us what's possible in the physical realm in terms of human augmentation. I think of people like my brother, who's a firefighter: does this give him the ability to control some extra piece of life-saving equipment for him or for someone else? In addition to that, I think it's providing insights into how the brain works, how much cognitive power the brain actually has, and how neuroplastic it is, how adaptive it is, that after a couple minutes of trying, it feels natural for people to do this. Very similar to the way amputees who end up with a robotic arm can use a different muscle group to control it. They might use whatever muscle mass is left of the deltoid in their shoulder to control what would have been the rest of their arm, and after a little while, the brain just starts to treat the control of that shoulder muscle as though it's the control of the physical arm. It'd be really, really interesting for them to test this over long periods of time with these subjects and see if the brain truly starts to treat this third appendage as something natural.

Farbod: Yeah, I agree. And what I was going to say, extending on what you were saying about how quickly the brain adapts to inputs from different muscle groups: a to-do for this group is to actually start exploring different control schemes. For example, right now they've been testing with the diaphragm, which has proved to be very fruitful. Now they're talking about, what if we took the movement of your ear? Some people can easily move their ears, but moving your ear is not something you do for pretty much any other interaction. So, what if you could utilize that to control another appendage? Which I think is pretty interesting, and kind of a stepping stone toward integrating these tools into people's daily lives and then gauging how well they do over an extended period of time.

Daniel: It is funny you say wiggling your ear, because that was one of the other things that came to mind for me, in addition to just verbally giving commands to a robot, which, now that I think about it, wouldn't be a great solution, because then we wouldn't be able to verbally converse with one another. I thought of wiggling my ears as something that I can do independently of whether I'm speaking, trying to use my arms, or going for a run. I feel like I can usually just wiggle my ears regardless. And it feels like there are these, I wouldn't say vestigial structures, but muscle groups in the body that we don't really have an apparent use for.

Farbod: It's more like a fun trick for your friends. Like, whoa, I can wiggle my ears.

Daniel: It would be really, really interesting for these folks to really start to push the boundaries here, right? Show us how we can use some of these fine little muscle movements that we can do. Say you're able to monitor a few signals: me wiggling my ear, my diaphragm, and maybe something else, like a muscle in my shin that I can flex without impacting the rest of the way my body works. If we can monitor a couple of those different signals, and the brain's adaptable enough to understand that, hey, when I wiggle my ear this way, it controls the arm this way, and when I flex my diaphragm this way, it moves it that way, you start to get a couple of different channels for control. In the physical realm they tested with a simple linear robot; think about controlling your paddle in Pong. If they get a couple more of these signals, it'd be really interesting to me to see if we can get to using a robotic arm to pick up a ball on a table and move it from one spot to another. Again, relatively simple in the realm of what robotics can do, but groundbreaking in terms of understanding what the human brain can do to control that in conjunction with the rest of the body.
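As a back-of-the-envelope sketch of that multi-channel idea, mapping a few independent body signals onto separate degrees of freedom of a robot arm could look something like the snippet below. Again, this is our illustration only: the signal sources, the deadband value, and the degree-of-freedom layout are assumptions, not anything from the EPFL study.

```python
# Hypothetical sketch: several independent body signals, each driving
# its own degree of freedom. Channel assignments are made up for
# illustration.

from dataclasses import dataclass

@dataclass
class ArmCommand:
    x: float     # horizontal travel, driven by the diaphragm
    z: float     # vertical travel, driven by an ear-wiggle signal
    grip: float  # gripper open/close, driven by a shin-muscle signal

def map_signals(diaphragm, ear, shin, deadband=0.1):
    """Combine three normalized signals (-1..1) into one arm command.

    The deadband zeroes out small involuntary twitches so quiet
    breathing or an accidental ear flick doesn't move the arm.
    """
    def gate(s):
        return s if abs(s) > deadband else 0.0
    return ArmCommand(x=gate(diaphragm), z=gate(ear), grip=gate(shin))

# A deliberate diaphragm push with idle ear and shin signals moves the
# arm horizontally and nothing else.
print(map_signals(diaphragm=0.8, ear=0.04, shin=-0.02))
# ArmCommand(x=0.8, z=0.0, grip=0.0)
```

The point of the per-channel deadband is that each signal can be ignored independently, which is what would let the channels stay uncoupled from normal body movement.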

Farbod: Can you imagine, you know that meme of like the dominoes where it's like small and it gets bigger, can you imagine that the smallest one is like humans being able to wiggle their ear and the biggest one is like cyborg revolution or whatever?

Daniel: Honestly, I wouldn't be surprised. Like, you can tell from the energy I have about this, I'm raving about it. The robotics are there. All we need to do is figure out the cognition part, the control part, and that's what they're starting to crack the egg on here. So, I agree, it would be funny if wiggling your ears is the thing that starts this cyborg revolution. But honestly, again, thinking in the context of my brother, who has to run into a burning building to try and rescue people: if he's got extra equipment that he can control with his diaphragm, or by wiggling his ears, et cetera, that'd be really, really interesting for me to see. I don't know, maybe clenching your jaw. I'm trying to think of other signals, right?

Farbod: I'm with you, man. Yeah.

Daniel: That could help make him more effective at his job, help make his job safer for him. I also think of, you know, I work in the automotive industry, so there's a lot of stuff where I see technology augmenting what's possible for humans as operators on a manufacturing line. This seems like another interesting signal we could use to control machinery, again, to make it safer for folks doing their jobs, or make them more productive, in a way that doesn't really hinder the rest of the way their body works and moves and is intended to operate, which is really exciting.

Farbod: I totally agree. I'm pretty excited about it. I feel like a lot of people listening, their minds are probably gonna go to like a Black Mirror space, like, oh no, it's gonna be Deus Ex. Have you ever played the video game or seen the show? Where human beings now have artificial eyes, and they have to take these pills, and there's this big corporation that's taking over everything, and you become a slave to the system and your, I don't know, robotic components. I feel like that's not the case.

Daniel: You're saying other people? It seems like your brain has wandered down this path before.

Farbod: I mean, I'm just saying, I'm a lover of sci-fi, right? I've consumed a lot of content about that stuff, because I'm curious. But I feel like what we're seeing here, at least what's proposed so far, is more on the collaborative side: what if you just had an extra appendage that you could control, one that doesn't interfere with your life but makes it a little bit better? I don't know, I'm optimistic about this. I think applications, especially in manufacturing, would be super helpful, and I didn't even think about the firefighter side, but yeah, I bet Caleb would love it. And with that said, good time to do a little summary, a little TLDR.

Daniel: Yeah, let's wrap it up, man.

Farbod: All right, so folks at EPFL have started to tackle this idea of controlling an extra appendage, right? What if you could have a third robotic arm that you could control without impeding any of your normal functions? Well, they've kind of started to crack this. They've tested it in a virtual environment where a user controls a third robotic arm with a diaphragm sensor. And unlike what you might think, or unlike what I thought, it's actually not that difficult to do, and users got up to speed very quickly. It was kind of like second nature to them. They were also able to test the hypothesis of the third arm collaborating with your other appendages. And in addition to that, by doing the work to understand what it would take for us to control another arm, they've been unlocking secrets about how the brain itself actually works, which is a feedback loop that tells them how to do this approach, but better, in the future. And speaking of the future, what they want to tackle next is how we can start looking at different control schemes, like wiggling your ear or using different muscle groups, to control more robotic appendages like this, and then potentially do more research on what extended usage of these appendages looks like for humans down the road.

Daniel: Nailed it dude.

Farbod: I try, I try. I feel like that wraps up our first episode of the year.

Daniel: Yeah. I will say, we've got to say a quick thank you to our friends in the UAE. We're in the top 200 podcasts there. So again, we keep making these fickle promises to people that we will try a food from your country if you get us to trend in the top 200 podcasts there. I think we've got a pretty solid backlog here. We need to do Spain. I think we need to do Brazil.

Farbod: We do. Here's the thing, to our fans in the UAE and Spain and everywhere in the world: we need recommendations. We can do our best to Yelp and Google stuff, but if your country's trending, we'll shout it out, and then do us a favor and DM us about what kind of foods are best from there.

Daniel: That's true.

Farbod: What we need to try. Because we're foodies.

Daniel: Yeah, it's a lot easier to say, like, what's the best paella than to say, what's the best Spanish food in the world.

Farbod: Exactly. And then we will hold up our end and come back the next episode and tell you how it was.

Daniel: We might even post some evidence to prove it to you on social media.

Farbod: That would be even better. The Next Byte goes eating.

Daniel: Biting.

Farbod: Oh, biting, oh my God. That's great, that's great.

Daniel: All right, with that awful dad joke, I think it's time to wrap up the episode, man.

Farbod: Yeah, everyone, thank you so much for listening. And as always, we'll catch you in the next one.

Daniel: Peace.


As always, you can find these and other interesting & impactful engineering articles on Wevolver.com.

To learn more about this show, please visit our shows page. By following the page, you will get automatic updates by email when a new show is published. Be sure to give us a follow and review on Apple Podcasts, Spotify, or your favorite podcast platform!

--

The Next Byte: We're two engineers on a mission to simplify complex science & technology, making it easy to understand. In each episode of our show, we dive into world-changing tech (such as AI, robotics, 3D printing, IoT, & much more), all while keeping it entertaining & engaging along the way.


The Next Byte Newsletter

Fuel your tech-savvy curiosity with “byte” sized digests of tech breakthroughs.