
Podcast: Key To Unlock Immersive Holograms


In this episode, we discuss research from Harvard that tackles the blurriness and lack of disruption recovery in most holograms, which prevent users from having a truly immersive experience.



EPISODE NOTES

(0:50) - New light sheet holography overcomes the depth perception challenge in 3D holograms


Transcript

Hey folks, today we talk about interesting research from Harvard. They've kind of cracked the code on making 3D holograms viewable from different angles without any loss of detail, so it feels just like looking at a real object. And Farbod and I also shed some light on our favorite sci-fi movies. I mentioned Star Wars, he mentioned Iron Man, and when you get Star Wars and Iron Man in the same room, it's never a bad time. So, let's jump right into it.

I'm Daniel, and I'm Farbod. And this is the NextByte Podcast. Every week, we explore interesting and impactful tech and engineering content from Wevolver.com and deliver it to you in bite-sized episodes that are easy to understand, regardless of your background.

Daniel: What's up folks, like we said, today we're talking about some interesting research from Harvard University that's going to help revolutionize holograms and kind of this mixed reality tech space that we're talking about. When I hear holograms, the first thing I think of is how in Star Wars, whenever they had a video message or a video call with someone, they were able to talk to the 3D version of that person, like a pocket-sized version of them sitting on their desk. I think that's a really interesting reality, if we were ever able to achieve it, and we would all look backward at, especially, the Star Wars from the 70s and be like, wow, George Lucas, how did you understand this future? But technology today has not really gotten us to a point where that's a reality any of us can trust, right? Researchers have worked for a long time to try and make 3D holograms that can be viewed from any angle and feel similar to viewing a real object. Honestly, the end result of all this work has been a little bit underwhelming. I don't know if you've ever seen one of these things. If not, you should probably go check it out on YouTube. You can tell, even in a 2D video of the 3D object, that it's a little bit clunky, right?

Farbod: Yeah, I mean, I feel like on the consumer side, we've seen a bunch of different products come out. You see it very commonly on, like, mid- to higher-level cars right now, you get the heads-up display, that's a hologram displayed to you. And it's okay, but like you're saying, that's just a 2D image, and even that is kind of underwhelming at times to look at. So then you think about the 3D version of it, when you're supposed to use, let's say, augmented reality. I know Google was doing the Google glasses for a while, and Snap does the same thing with their, what was it called? Snap something? I don't know. Spectacles, Spectacles, that's what it is. And it all just kind of falls short of this idea that we were promised. And like you were talking about Star Wars, dude, Iron Man 1, which is probably one of the main reasons I subconsciously became an engineer, when he's doing the CAD on, like, the Mark 1 and pulls it up. And I'm like, oh my God, yes, that's what I want to do with my life. Where is it? Where is the Iron Man 1 future that we were promised back in 2008?

Daniel: Where's this future we can live in where you can create, like, a virtual projection of an object, and it looks close enough to being real that your brain believes it's a real 3D object you're able to interact with? That's a future that I think a lot of researchers have been working towards. But generally, like we said, there's still a depth perception issue when you're interacting with these products, these VR, AR, XR products; virtual reality, augmented reality, mixed reality. And the holography technology field in general has a problem with depth perception, especially if you're trying to view things from different angles. So maybe they're able to replicate the 3D effect from a very specific angle, but if you move to a different angle, or, you know, adjust your viewing angle, it doesn't look 3D anymore. Or if you're looking at it from super far away, it doesn't look super detailed. Or if you're using something like your Snapchat Spectacles, I'm not sure if there's a specific challenge with those, I didn't use that product specifically, but I've heard that you may be able to replicate the 3D effect in the middle of the object you're looking at, but as you start to look towards the edges, it looks kind of foggy and blurry. All that to say, it's not visually, truly emulating the 3D experience. It's still very easy to tell that what you're looking at isn't a real object. It's a virtual object, and these devices aren't doing a great job at portraying it.

Farbod: Yeah, and the underlying problem, as explained in this article, kind of makes sense, right? Even if you don't understand optics well enough, just think about semi-transparent sheets that are stacked together in front of your eyes to resemble a ball, right? You would have a very small circle, then slightly bigger, slightly bigger, and on the other side slightly smaller, slightly smaller. And then when you kind of look at it, it looks like a sphere. Generally speaking, those last one or two layers are gonna get more and more blurred, because you have limited access and visibility. And that's kind of what happens when they render this stuff to you as a hologram, right? You get the layers projected to you, coplanar to your eyes. And so, when you try to change the angle that you're looking at, those last few layers start looking more and more blurry, which messes up your depth perception.

Daniel: Yep. I think one of the challenging things about holograms in general, too, and especially about trying to interact with them the way you talked about Tony Stark doing it in Iron Man, is also the fact that the way we've been projecting light largely has trouble when an obstacle gets in the way, right? So, if I want to interact with this virtual object in 3D space, my hand will start to block the light, and then I won't be able to see the rest of the projection, the rest of the holographic image. So that's another pain point, again, that just reduces the immersiveness, reduces the, you know, reality in the mixed reality or in the augmented reality, right? So, we can talk a little bit about this later, but I think this matters specifically in the VR, AR, XR space that's really buzzy right now, especially following the release of Apple's Vision Pro.

Farbod: Perfect timing for this podcast episode.

Daniel: Exactly, right. But I think that something feeling immersive, something feeling real, and something feeling engaging enough that people want to actually adopt this technology is really, really important to the future of it even existing. So, in the current state that we live in, without this research from Harvard, there isn't really, really good 3D depth perception with holography, and that's a detriment to these technologies that want to capitalize on it, because it doesn't feel real.

Farbod: I'm gonna take a step back here, like, out of this whole optics realm, right? When we talk about new technology, it is imperative that whatever you come up with feels as authentic as possible to the end user. That's how you assure adoption, right? So, when I think about what we've talked about, like 3D printed food before, right, if it has the wrong texture, or the flavor isn't right, people are not gonna want it. Then you had the founder of Impossible, who was like, instead of trying to get people to enjoy veggies and stuff, I'm going to mimic meat flavor, meat texture, meat juices coming out of a burger as well as I can. And in doing so, he created a very successful company and product, to the point that, honestly, sometimes I choose an Impossible burger over the normal patties served at Burger King or wherever, because I just think they're that good, right? So that's a winning product. And now, when you're thinking about augmented reality, at least my two cents here, by the way, I don't want to see emojis popping up here and there, right? I want to see things that look as real as possible. Just like Tony Stark was looking at it, just like in Star Wars, when it looked like you were having a meeting and you were rendering those people perfectly. That's what I'm going for. And I feel like a lot of other people are in the same boat. So even though something as small as "oh, it's kind of blurry," or the depth perception isn't right, or, as the light is projecting, it kind of bends and takes me out of my immersion, might not seem like a big deal, it's those little details that can make or break a product. Right?

Daniel: Yeah. And so, let's talk about how this team from Harvard is using their new technology breakthrough to try and combat these challenges, right? It's not that they're completely reinventing the 3D hologram method. Their secret sauce here, what they've actually done differently, is the unique type of light beam being used to generate this 3D hologram. It's called a Bessel beam, I hope I'm saying that right. This special type of beam is kind of like throwing a stone into a pond, and the ripples in the water spread out in concentric circles around it. A Bessel beam is just like that, but with light instead of water. So, when you use a series of Bessel beams together, a bunch of them together, to create this holographic image, two very special things happen. The first part is the core, the center of the beam, stays the same strength no matter how far that beam travels. So, unlike something like a flashlight, where you point it up into the night sky and the light gets dimmer the further it shines, the Bessel beam, because of this concentric-circle nature, keeps its center, its core, just as bright no matter how far the light is traveling. The second part is that, because of these ripples, these concentric circles of light emanating around that core, if something gets in the way of the light beam, it can heal itself. So that means even if an object is blocking part of the beam, like you're sticking your hand inside the hologram, once the light has passed that object, it fills back in the missing part. It's like you have a magic flashlight that can shine around corners. So, Bessel beams start to attack these pain points we're experiencing with 3D holograms, where maybe, depending on the angle you're looking at, or whether you're looking at the middle or the edge of the image, the intensity of the light isn't correct, so the image doesn't render properly, doesn't feel real, doesn't feel immersive, doesn't feel 3D. And then also, when you try to interact with it, when you put your hand in it, it casts a shadow instead of the rest of the object appearing the way it was before, which detracts from the immersiveness. Bessel beams seem to be like the silver bullet to help attack both of these pain points.
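
For listeners who want a slightly more concrete picture of the "stone in a pond" analogy, here's a minimal sketch in Python of the cross-section of an ideal zeroth-order Bessel beam. This isn't the Harvard team's method, and the wavelength and cone angle are made-up example values; it just illustrates the bright central core and the concentric rings Daniel describes, whose shape follows the Bessel function J0.

```python
# A rough, illustrative sketch (not from the Harvard paper) of an ideal
# zeroth-order Bessel beam profile: a bright central core surrounded by
# concentric rings, shaped by the Bessel function J0. In the ideal case this
# transverse profile does not change as the beam propagates, which is why the
# core stays "just as bright" with distance.
import numpy as np
from scipy.special import j0  # zeroth-order Bessel function of the first kind

# Example numbers only (assumed, not from the paper): a red laser and a
# 1-degree cone half-angle such as an axicon lens might produce.
wavelength = 633e-9                   # meters
cone_half_angle = np.deg2rad(1.0)     # radians
k = 2 * np.pi / wavelength            # wavenumber
k_r = k * np.sin(cone_half_angle)     # radial wavevector, sets the core size

r = np.linspace(0.0, 50e-6, 1000)     # radial distance from the beam axis, meters
intensity = j0(k_r * r) ** 2          # transverse intensity, normalized to 1 on axis

core_radius = 2.405 / k_r             # first zero of J0 marks the edge of the core
first_ring = intensity[r > core_radius].max()

print(f"Central core radius: {core_radius * 1e6:.1f} microns")
print(f"Brightest side ring relative to the core: {first_ring:.2f}")
```

Those outer rings carry the energy that flows back toward the axis, which is the usual explanation for the self-healing behavior discussed above: block the core, and the surrounding rings rebuild it a short distance downstream.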

Farbod: Yeah, I mean, you hit the nail on the head there, right? There are two problems we talked about, and these Bessel beams address both of them. I had kind of talked about how the rendering happens with the traditional approach, you know, you have coplanar layer after layer after layer, and that distorts the last layers. But with these Bessel beams, the image is actually rendered perpendicular to your eyes, right? So, it gets rid of that problem as a whole. And then lastly, you have what you talked about, which is, if you're interacting with it with your hand, it doesn't destroy the immersion, because the light beam recovers. But a little nuance to that: with a traditional beam, if you were shining a light and you put your hand in, right, the light wouldn't completely stop, it kind of bent and then stopped. So it looked like there was still, like, a shimmer on your hand, which was weird, like that object isn't real. But these Bessel beams don't have that. I forgot the terminology they used, but it's more clear-cut. Not only does it recover, but in terms of disruption, it handles it much, much more delicately.

Daniel: And I will say, optics stuff in general, and this one specifically, is way over my head. This is me, after hours of research, trying to understand exactly what a Bessel beam does and how it works, and I wasn't able to completely get there. But what I was able to get were those two key takeaways, and I think that's all we really need to take away from this: these Bessel beams are really interesting because they're able to maintain intensity no matter how far the light travels, and, you know, if an object gets in the way, instead of the light bending around it, it's able to kind of magically heal itself. And both of those are really, really compelling.

Farbod: I think it's worth noting the value of this. You touched on it, but the Apple Vision Pro was just announced, right? So how is this gonna translate into these products that we care about? I mean, I care about it personally, I think it's amazing, I'm super excited for it. But what does it mean for us, right? Apple showed that, if you're a creative person, you can now leverage this new product they've made with augmented reality to do your design in 3D space. We've talked about this in the past, I think in one of our earlier episodes, where a doctor was using an augmented reality goggle to shape the implant they were going to use on a patient, right? So, in those cases, accuracy really matters, and you don't want any sort of distortion. This kind of technology can make sure that the augmented reality solutions, or even the virtual reality or mixed reality solutions, that we get are as precise as possible, especially for applications where that's a necessity.

Daniel: And I'll just say, there's a graveyard of technology solutions that have tried to enter the AR, VR, XR space but flopped. Google Glass from Google, Virtual Boy from Nintendo, Snapchat Spectacles, Magic Leap 1, Jaunt VR, that's just a handful of the ones that, you know, didn't make it to market like they hoped to, or did launch and didn't garner the user adoption they hoped for. And then, I think, even further, there are devices that work really, really well, like the Oculus for the virtual reality experience, and even that hasn't been adopted quite as well as, let's say, Meta might have hoped, right, when they're talking about making a metaverse. That being said, here's my personal take, my opinion, right? I think the core of all these issues is the fact that, you know, maybe these devices don't look great on your head, or maybe they're a little bit uncomfortable, or maybe it's weird to look like you're wearing ski goggles in public. I think those are all minor inefficiencies, minor friction points, let's say, for the user. But that's not something that can't be overcome if the technology is solid enough, if it's immersive enough, if it adds enough value to someone's life. For me, to be honest, it's because the technology hasn't gotten to a point yet, hasn't gotten mature enough yet, to where it feels real. When you're in a mixed reality experience, you know, you still have to suspend a little bit of disbelief. And I think with technology like this tech from Harvard, and maybe Apple's gotten somewhere close to there as well with their holography, once you create an experience that's truly immersive, users will start to flock to it because of the experience, because it feels real, regardless of the fact that they might look ridiculous wearing ski goggles everywhere in public. And then, because of that, because you start to get a critical mass of users doing that, it starts to become normal to wear the ski goggles in public. I remember at the beginning, when Apple released AirPods, everyone was making jokes about how they looked just like earbuds with the wires cut off. That's so ridiculous, no one will ever use them, no one will ever wear those in public. All it needed was a really strong user experience to get a bunch of early adopters in there. That was enough of a critical mass that everyone saw people wearing AirPods and thought, you know what, maybe I'll jump on board as well. And I did the same thing, right? At first, I was making fun of it, and a couple years later, I bought my first pair of AirPods, and I'd say they've changed my audio experience. So, that's a long rant, and I'll get off my soapbox here. But I think this research that this team from Harvard is doing to make holography, you know, feel more real could be an overall contribution to the AR-VR trend and the other applications in various fields that things like Apple's Vision Pro are also trying to break into, which is making virtual technology a part of our real world as well.

Farbod: It's funny that you brought up an Apple product when coming up with an example of something that, you know, caused a shift in people's perception of a product, and then the Apple Vision Pro is the big thing that we're talking about today. The example I was thinking of was the Apple iPhone. Like, if you remember the iPhone 4, when it came out, it was technically the fourth iPhone, but it was still pretty new for most people. If you held it the wrong way, you would short the antenna connection and your calls would drop, right? Which is just, like, maybe the first thing you try to get right when it comes to a product that's a phone meant for calling, right? But this product was so well made in terms of its experience, that's kind of what you were getting at, its ecosystem, its experience, that people were like, yeah, no, not a big deal. Look, I can play around with the apps, I've got so much music on it, who cares? It will get better. And it did, right? But the first version had its flaws. It was clunky, it wasn't good looking, but it really delivered on the user experience. So, if the integration of the work these folks did at Harvard can move us forward just a little bit, to get closer to that real immersion scenario, I think these products can actually become pretty successful. And history has shown Apple has been the trailblazer when it comes to new products, so maybe they'll knock this out of the park.

Daniel: Yeah, I agree. And I would be interested, honestly, if this team from Harvard could get their hands on a Vision Pro, test it out, and compare it to their own holography. You know, get their unfiltered opinion on how good the Apple Vision Pro is before I go drop 3,500 bucks. I'm getting it, right?

Farbod: Yeah, yeah, for sure. Now, I was gonna say this is probably a good spot to end the episode, but before we do that, you should do a recap of everything we just talked about.

Daniel: I'm with you, man. So, I'll kind of go all the way back to the beginning. Holograms: we've been working on them for a little while, and we've seen them in sci-fi movies. The goal is that you can see 3D holograms from any angle. It's, you know, a digital rendering of a virtual object in the physical world, and it feels and looks real. That reality hasn't always come to fruition, and this team from Harvard developed a new method for creating 3D holograms to try and improve that experience. They used a special kind of light called a Bessel beam, and they stack these beams together, like slices of cheese, to create a sandwich of light. These light sheets are then used to build a 3D projection when you look at it with your eyes. Because of this unique Bessel beam that they used, it feels more real: the Bessel beam gives higher control over intensity, as well as self-healing properties, so the light doesn't bend when it hits an obstacle. All that comes together to make a more real-looking 3D projection. That's what this team from Harvard has done. They think we can use it to improve virtual and augmented reality, and they also plan on using it in biological imaging and optogenetics. And one thing that I think is interesting, we didn't mention it yet, but Harvard's Office of Technology Development is already protecting this idea with a patent. So, they know it's going to be big, and they're looking forward to commercializing it and making some money off of it as well.

Farbod: Love to hear it. I hope it makes it out and gets into some products, hopefully ones that you and I are going to use in the not-so-distant future. But on that note, thank you guys so much for listening, and as always, we'll catch you in the next one.

Daniel: Peace.

-------

That's all for today. The NextByte Podcast is produced by Wevolver, and to learn more about the topics we discussed today, visit Wevolver.com.

If you enjoyed this episode, please review and subscribe via Apple Podcasts, Spotify, or one of your favorite platforms. I'm Farbod and I'm Daniel. Thank you for listening, and we'll see you in the next episode.


As always, you can find these and other interesting & impactful engineering articles on Wevolver.com.

To learn more about this show, please visit our shows page. By following the page, you will get automatic updates by email when a new show is published. Be sure to give us a follow and review on Apple Podcasts, Spotify, and most of your favorite podcast platforms!


The Next Byte Newsletter

Fuel your tech-savvy curiosity with “byte” sized digests of tech breakthroughs.