Podcast: The 3D Printer That Teaches Itself To Print

In this episode, we discuss an additive manufacturing breakthrough from MIT that enables inkjet printers to print soft materials that were deemed unsuitable before, print more accurately, and print ~660 times faster than comparable printer technologies!

This podcast is sponsored by Mouser Electronics


(3:23) - This 3D printer can watch itself fabricate objects

This episode was brought to you by Mouser, our favorite place to get electronics parts for any project, whether it be a hobby at home or a prototype for work. Click HERE to learn more about the underlying technology enabling nanoscale 3D printing, a printing environment where dimensions are 1/100,000th the width of a human hair!


Hey folks, have you ever tried to use a 3D printer to produce a part and it didn't turn out the way you wanted? Well, I have, and it's really frustrating. Something I've learned is that these older 3D printers have trouble producing complex designs and trouble using certain materials, especially those that don't harden quickly. So, what we're talking about today is a team from MIT, from ETH Zurich, and a startup that are working to give 3D printers eyes and a brain so they can understand what they're printing and make course corrections if they need to. That will unlock their ability to use better materials and print a lot faster than current printers do. I think it's really interesting, so let's jump right on into it.

I'm Daniel, and I'm Farbod. And this is the NextByte Podcast. Every week, we explore interesting and impactful tech and engineering content from Wevolver.com and deliver it to you in bite sized episodes that are easy to understand, regardless of your background. 

Daniel: What's up peeps? Like we said, today we're talking all about an awesome 3D printer from MIT. But before we jump into that, I want to mention a technical resource from our sponsor, Mouser Electronics. If you've listened to this podcast at all, you know that we're huge Mouser fans. We love that they're one of the world's biggest electronics suppliers, and as a result of that, they've got their fingers on the pulse of cutting-edge technology. They've got awesome technical resources to help us stay up to speed with it. And in this case, we've linked a technical resource talking about 3D nanoscale printing, which honestly sounds pretty dang sweet. So, we know what 3D printing is. We know what working with stuff at the nanoscale is, which is like working at geometries that are at or below one nanometer. So, we're talking-

Farbod: One 100,000th the thickness of your hair.

Daniel: So, so small. It's one hundred-thousandth the thickness of a human hair, way thinner than the human hair. Being able to manipulate geometries at that level already kind of boggles my mind. I've got a sweet spot in my heart for it, because that's where Farbod and I met, doing nanoscale-level research together. But they're taking that and combining it with 3D printing, which allows us to basically additively create geometry, and doing it at such a small scale using lasers that are only two photons wide. It sounds pretty incredible. So, it's the same principle as resin printing, which is what Formlabs and many Stratasys printers use, but they're talking about the theory of being able to do this at the nanoscale, building up things one atom at a time, having full control over the structure, and using it to complement other types of nanotechnology, something like CVD, which is very commonly used to create superconductors and other awesome materials like graphene.

Farbod: Yeah. Yeah. I was going to say, one of the things I liked about the article was that it gives you that primer on what the existing technology looks like, so you have that basis when you're learning about the new thing, to know what's actually really cool about it versus whatever the state of the art currently is. Like other Mouser resources, it's very well put together, very easy to digest even if you're not familiar with the field, and I really enjoyed reading it.

Daniel: And it's super relevant to what we're talking about today, which is using vision, using a closed-loop system, to help push the boundaries of what's possible with 3D printing. Basically, what this team from MIT has done is make a super smart 3D printer that can see what it's doing. Which is revolutionary, because not many 3D printers to date use a visual tracking or visual monitoring control system. Old 3D printers have had trouble with different types of materials, especially those that don't harden quickly, and specifically they talk about materials that can smear during the printing process. If you've got no feedback loop to understand the geometry you're printing, you don't wanna use a material that tends to be a little bit more volatile or a little bit more dynamic while you're printing. You're gonna be limited to materials that are really, really rigid, really, really stable. And that's worked pretty well to date. But this team from MIT is saying, let's use a vision-controlled system, rely on innovations like this really, really sweet, super smart inkjet 3D printing system that's been developed at MIT, and combine these to overcome those material constraints. It allows us to closely monitor, closely adjust the printing, but also make things with cool new materials, make cool new geometries that we've never been able to make before, and make them a lot faster than we've ever been able to make anything before.

Farbod: Yeah, so the underlying technology they're talking about is nothing crazy, right? Like, you have inkjet printing, which has been adopted by a lot of big manufacturers. What is it, Smile Direct Club? The one that uses HP's inkjet printers for coming up with the Invisalign-style night guards or treatments for orthodontics or whatever. So that's not new, but like you mentioned, they talk about how the type of materials they can use for inkjet printing has typically been limited, because certain materials, even though they give us desirable material properties at the end, don't do well with the printing process because they smear or whatever. So, what these folks have added, and it's a collaboration between MIT, MIT spinoff Inkbit, and then ETH Zurich, is a closed feedback loop system, right? Like, what if after every iteration of us printing a layer, we check: is it looking right or no? And if it's not looking right, how do we fix it? Like, the secret sauce, again, seems straightforward, but the beauty is in the genius of its simplicity. Because every time they print, they take a scan, they process it, and all of that takes about a second, and then you tell each nozzle within the inkjet printer, and there's like 16,000 of them, what to adjust.

Daniel: Yeah, 16,000 different nozzles.

Farbod: Crazy, but that's the level of control they have, so they can be like, nozzle number 13,582, drop 80% of what you're dropping because you're causing smears. And being able to finely tune that printing system allows them to have a lot more flexibility with the type of materials that they wanna use.

Daniel: Well, what I wanna mention is the analogy that came into my head, because we just decorated gingerbread houses at home for Nellie's birthday, which was awesome by the way. No invite for me, but that's cool, we're just gonna gloss over that. If you're decorating a gingerbread house or decorating a cookie, right, trying to use some frosting, let's say, and I give you very, very specific instructions: I want you to make a smiley face on the top of this cookie or on the top of this gingerbread house. And then I put a blindfold on you, hand you a bag of frosting, and say, go do it. You're not gonna do it very well. And that's basically what we've experienced with 3D printers to date. We give a very, very specific set of controls, a very, very specific set of instructions. We upload it to a 3D printer. It knows what it's supposed to be doing, but it doesn't have any feedback on how it's actually going. It doesn't have a visual feedback system to close the loop and understand what's going on. So, imagine I was relying on you to make this cookie with a smiley face on it the perfect way every single time. I'm gonna severely limit the different types of materials you can use. I'm gonna severely limit the different types of shapes you can use, to make sure that I get a reliable product. That's basically what we've been doing with 3D printing. Because there's no visual feedback loop to help the printer understand and course correct along the way, we've had to severely limit what types of materials 3D printers can use. We've had to limit what types of geometries they can use. Or the most frustrating part, and you and I have both experienced this personally, is you may ask your 3D printer to print something that's toward the boundary condition, at the edge of what's possible. It could run for 24, 36, 48 hours, and then it fails in hour 47. It doesn't know. It keeps printing.
You show up the next morning to work, or you wake up after your 3D print ran overnight, and it just looks like a disaster happened.

Farbod: Spaghetti monster, yeah.

Daniel: And the 3D printer didn't know, because there was no visual control system to help close the loop. So, fundamentally speaking, this is really, really compelling to me, because we've got visual control systems in a bunch of different parts of manufacturing. It seems about right that this has made its way to 3D printing. But for it to combine forces with something like this really complex 3D printing from Inkbit, that's the one with the inkjet printer, right, with 16,000 different nozzles, it's like two superpowers colliding here: using a vision-controlled system to close the loop, and doing it on a really, really sweet, really, really awesome 3D printer with a lot of firepower to help push the boundaries and tell us, these are the awesome materials we can use now, these are the cool geometries we can use now. And they're doing things like printing soft, functional, accurate geometries of robot hands and heart models. And all of this without human intervention, doing it very reliably, and doing it using the visual control system. I think it's pretty sweet.

Farbod: I agree with you, man. And one thing I was gonna say is, it really reminds me of Machina, for our listeners that might not remember. Machina is the startup based out of California that's using robots to do sheet metal forming. And one thing that was really interesting is that these robots are AI-powered to know how hard to press and how long to deform a certain area before backing off, things like that. A big portion of their secret sauce was the ability to scan and understand the formed part, to inform how well it matches with the CAD file, which is exactly what this solution is doing. But then it kept iterating and getting better and better over time, because it would realize, even before it made the mistake, how to do it better the first time around.

Daniel: Not just course correcting mid-printing, but then also understanding some of the fundamental rules and principles and incorporating that into the design for the next time.

Farbod: So, it's interesting seeing that theme in, what is it, the next generation of manufacturing: this closed feedback loop system is becoming more and more critical. But another thing I was gonna say is, now that they've kind of cracked this printing of materials that are typically sought after but weren't possible before, the “so what” that we're gonna start getting into is that they can experiment with certain soft materials they didn't have access to before and do multi-material printing, where you can have rigid bodies with soft components, like a robotic hand that has tendons you would want to make out of the soft material, while the skeleton is made out of the hard material. On top of that, you're not sacrificing your precision either. If anything, you're making it better, because you're constantly, layer after layer, comparing it with your CAD file, allowing you to have what they refer to as airtight-like interfaces between the hard components and soft components.
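That layer-by-layer comparison against the CAD file can be sketched in a few lines. This is a hedged illustration, not the team's actual pipeline: the grid representation, the tolerance value, and the function name are all our own assumptions. The idea is simply to diff each scanned layer against the corresponding CAD slice, producing an error map the printer could use to compensate on the next pass.

```python
def layer_error_map(scanned, cad_slice, tolerance=0.05):
    """Compare a scanned layer against its CAD slice, cell by cell.

    scanned, cad_slice: 2D grids (lists of lists) of layer heights in mm
    tolerance: deviation in mm below which a cell counts as in spec
    Returns (error grid, count of out-of-spec cells).
    """
    errors = []
    out_of_spec = 0
    for scan_row, cad_row in zip(scanned, cad_slice):
        row = []
        for s, c in zip(scan_row, cad_row):
            e = s - c  # positive means excess material, negative means a void
            if abs(e) > tolerance:
                out_of_spec += 1
            row.append(e)
        errors.append(row)
    return errors, out_of_spec


# One cell printed 0.1 mm too tall gets flagged; the next layer's deposition
# over that cell could then be reduced to pull the surface back into spec.
errors, bad_cells = layer_error_map([[1.0, 1.1]], [[1.0, 1.0]])
```

Excess material at a hard/soft boundary would show up here as positive error on one side of the interface, which is one way to picture how the system keeps those interfaces tight.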

Daniel: And on top of all that, right, which is obviously a plus here: anyone that's dealt with 3D printing knows it takes a long time. And part of that is, again, if you're trying to make sure the part turns out to the design you want the vast majority of the time, let's say 99.9% of the time the desired geometry is produced reliably on that system, you're going to do things like control the speed. You're going to make it really, really slow so that it can move really precisely. Instead, when you've got this vision-controlled system, you can start to understand where we can basically turn up the speed dial and still not sacrifice the quality of the part, because we're monitoring it visually during the printing process. This team from Inkbit said that they were able to produce the same parts with the visual control system about 660 times faster than other state-of-the-art 3D inkjet printers. So, this is not only revolutionizing the materials we can use, it's not only revolutionizing the geometries we can print. It's also revolutionizing the production speed for 3D printing. All three of those have been touted as potential pain points with 3D printing for different applications. So, I think that's really interesting. And it really goes back to the analogy I came up with in my head to try and understand what's going on here. Again, let's go back to our analogy. You're sitting in the kitchen for a moment and I'm asking you to decorate gingerbread cookies. If I tell you to make a specific design, and you can take your blindfold off now, and you've got a picture of what the design's supposed to look like, and you're watching yourself make your gingerbread cookie right next to it, you're gonna do a really good job. You're gonna work a lot faster than when you're blindfolded. The result's gonna be better. You might be able to use more experimental materials, et cetera, to achieve the desired design.
In the same way, we're kind of giving 3D printers their own eyes and their own brain to understand the progress of producing that design, what it's supposed to look like in the end state, and make any course corrections on the way to get there.

Farbod: Absolutely. Well, besides knowing that you don't have faith in my gingerbread making abilities, I would say that that is a great analogy.

Daniel: Thanks, man.

Farbod: But I think it's time to wrap up the episode. What do you think?

Daniel: Yeah, I agree. I'll give us a quick wrap-up here, man. Please do. All right, so think about 3D printing right now. It's like trying to decorate cookies without eyes and without a brain. You give a really, really specific set of instructions to your 3D printer, but it's got no feedback loop system to understand whether the design it's making is close to the design that you want, and whether there are any course corrections that need to be made along the way. Right now, 3D printers don't have that ability. So, this team of scientists from MIT, a startup called Inkbit, and then also ETH Zurich are working together to add a vision-controlled system to help 3D printers see what they're doing. And that'll allow them to use many new materials to make many new complex things like robot hands and heart models. And not only that, they'll be able to do it with these better materials over 600 times faster than current printers do.

Farbod: Money, money.

Daniel: Thanks, my dawg.

Farbod: I mean, you don't invite me, but you know, that's ending was money.

Daniel: We'll resolve that off air.

Farbod: Ah, okay. All right. All right. Well, everyone, thank you so much for listening. And as always, we'll catch the next one.

Daniel: Peace.

As always, you can find these and other interesting & impactful engineering articles on Wevolver.com.

To learn more about this show, please visit our shows page. By following the page, you will get automatic updates by email when a new show is published. Be sure to give us a follow and review on Apple Podcasts, Spotify, and your favorite podcast platform!


The Next Byte: We're two engineers on a mission to simplify complex science & technology, making it easy to understand. In each episode of our show, we dive into world-changing tech (such as AI, robotics, 3D printing, IoT, & much more), all while keeping it entertaining & engaging along the way.


The Next Byte Newsletter

Fuel your tech-savvy curiosity with “byte” sized digests of tech breakthroughs.

More by The Next Byte

The Next Byte Podcast is hosted by two young engineers - Daniel and Farbod - who select the most interesting tech/engineering content on Wevolver.com and deliver it in bite-sized episodes that are easy to understand regardless of your background. If you'd like to stay up to date with our latest ep...