
EXCLUSIVE: Hands-on CREAL lightfield XR headsets

I have had the pleasure of being invited by Swiss company CREAL to try its revolutionary lightfield technology, which is able to make the immersive reality offered by AR and VR headsets more believable. Here is my experience with it!

Here is the video version of the article… I caught a bad cold when traveling to Lausanne, so it is not the most energetic video I have ever made 😐

CREAL

CREAL (pronounced as "see real") is a company located inside the EPFL campus (the École Polytechnique Fédérale de Lausanne, one of the most important tech universities in the world) that has been working on lightfield technologies for many years. Founded by Tomas Sluka in 2017, the startup has now grown to around 20 highly specialized employees, some of them coming from other companies in the field like Intel and Magic Leap. CREAL's purpose has always been to create highly efficient lightfield displays that can let you see virtual elements as if they were real.

The company entered my radar when my reviewer hero Ben Lang tried an old prototype from CREAL at CES 2019 and talked about it enthusiastically. Ben is a very objective journalist and is always very attentive in understanding whether something works well or not: if he says that a product is interesting, it means it is really something to keep an eye on. He even described CREAL as the best lightfield implementation he had tried up to that moment, and this was an enormous endorsement. Ben underlined how impressive it was that he was able to switch his focus between the various virtual elements shown to his eyes, and that the visuals behaved as if they were really in front of him.

GIF from that CES demo, where CREAL shows how you can switch your focus between foreground and background elements

Since then, I have followed the evolution of the startup, which last year raised more than $7M in funding and this year at CES promised to shrink the form factor of its lightfield unit from the big black box of 2019, which had to sit on a table, to a small module able to fit inside slim AR/VR glasses by the end of 2022. Notice that CREAL doesn't want to manufacture a headset itself, but just to create a reference design of its display system for other companies to use.

The timeline of the evolution of CREAL glasses. It is impressive how they plan to miniaturize their technology (Image by CREAL from Road To VR)

When I got the invitation to visit the company, I was as happy as a little kid going to the amusement park. I had never tried lightfields in my life, and I had the chance to try them at what I thought (and still think) is one of the most important XR startups! I really want to thank Tomas Sluka, Alexander Kvasov, and Giovanni Landi for not only inviting me there, but also for taking care of me and making sure I had a great time with them.

I think there is a very pleasant atmosphere inside CREAL; it seems like a good place to work. The people there are also very smart, and it is impressive how this smart, united team is obtaining the results it is obtaining while being very efficient in headcount and money spent. $7M of funding may seem like a lot, but actually it is the bare minimum to pursue such an ambitious goal… I'm sure that in the USA they would have raised at least 5 times that amount. This efficiency is what is driving them forward very well.

During my visit, I was able to try the two prototypes shown in the second column of the above picture. But before telling you about my experience, let me clarify a bit what the lightfield displays that CREAL is working on actually are.

CREAL’s lightfield displays

CREAL has made a video to showcase what lightfield displays are, why it is working on them, and how it is contributing to making them better. You can watch it below:

I have to admit that I got lost in some technical details, but I grasped the most important info from it. Basically, we all know how current stereoscopy works: in all AR and VR headsets, we just provide the two eyes with two images rendered from two slightly different points of view, so that the brain can reconstruct a 3D scene from them. Anyway, we know that this solution, which is centuries old, doesn't completely reproduce how we see reality: it is a cheap hack to give 3D depth to a virtual scene.
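To make the idea concrete, here is a minimal sketch of what "two slightly different points of view" means in practice. This is a generic illustration of stereo rendering, not any specific headset's renderer; the 63 mm IPD and the helper name are my own assumptions:

```python
import numpy as np

# Generic sketch of standard stereoscopy: the scene is rendered twice per frame,
# from two camera positions separated horizontally by the user's IPD.

IPD_M = 0.063  # assumed interpupillary distance of 63 mm

def stereo_eye_positions(head_position, head_right_vector, ipd=IPD_M):
    """Return the left and right eye camera positions for one frame."""
    right = head_right_vector / np.linalg.norm(head_right_vector)
    left_eye = head_position - right * (ipd / 2)
    right_eye = head_position + right * (ipd / 2)
    return left_eye, right_eye

left, right = stereo_eye_positions(np.array([0.0, 1.6, 0.0]), np.array([1.0, 0.0, 0.0]))
# Each eye renders the scene with its own projection and the brain fuses the two
# images into depth, but the light still physically comes from a single display plane.
```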

There are some known problems with this approach. Some of the most famous ones are:

  • Impossibility to focus on objects: in real life, I can focus on one particular object and see it completely crisp while all the rest becomes blurred, but in XR this is currently not possible
  • Vergence-accommodation conflict: to see the virtual content in focus, our eyes must focus on the actual display, which is at a fixed depth, while the content depicted on it contains virtual objects at different depths. This means that when you look at the different elements in your virtual scene, your eyes correctly converge to put them at the center of attention, as would happen if those elements were real, but the focus of the eye lenses always stays the same, which is incoherent with the other visual cues in the scene. This can cause eye strain and dizziness (see the small worked example after this list)
Vergence-accommodation conflict. The eyes rotate to point at the virtual element, but the eye lenses cannot focus at its depicted depth: they must focus at the depth of the headset's screen (Image from Kroeker, 2010)
  • Impossibility in AR to focus at the same time on virtual and real elements that are in the same position: the virtual elements all lie on a fixed focal plane, so to see them in focus your eyes must focus on that plane (usually around 1.5-2 m). This means that if you have your real hand with a virtual bird on it, with both of them theoretically at the same distance from your eyes, you can't focus on both: if the eyes focus at the distance of the real hand, they lose focus on the virtual bird, which actually lies on the distant focal plane;
  • Inability to properly see elements that are too close to your eyes: they usually appear blurred or out of focus.
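To put a rough number on the vergence-accommodation conflict mentioned above, here is a tiny worked example. The distances are illustrative assumptions, not CREAL's or any specific headset's specs:

```python
# Toy numbers to quantify the vergence-accommodation conflict.
focal_plane_m = 1.5      # assumed fixed focal plane of the headset's optics
virtual_object_m = 0.4   # assumed distance at which a virtual object is rendered

accommodation_diopters = 1.0 / focal_plane_m    # eye lens must focus here (~0.67 D)
vergence_diopters = 1.0 / virtual_object_m      # eyes converge here (2.5 D)

conflict = abs(vergence_diopters - accommodation_diopters)
print(f"vergence-accommodation mismatch: {conflict:.2f} diopters")
# Mismatches of a few tenths of a diopter are generally considered tolerable;
# here we get ~1.8 D, which helps explain why near-field content is so fatiguing
# on fixed-focus headsets.
```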

The root of all evil is that we render all the virtual objects on a display that sits at a fixed distance from our eyes. Our eyes see through light rays that enter them, get detected by the eyes' "sensors", and are then analyzed by the brain. If all the content comes from a display, the light rays reaching the eyes all come from the display, so all from its position, and this is of course different from how the light rays would have arrived at the eyes if the objects depicted on the display were real. In that case, the light rays would have arrived at the eyes from different depths and with different intensities and angles.

It would be cool to have a way to cast towards the eyes light rays that are exactly the same as they would be in the real world if the virtual elements were real. Luckily, there are already two technologies that allow for this: holograms and lightfields.

CREAL works exactly with lightfields: lightfield displays are optical units that try to reconstruct the light rays of the virtual elements they depict, so that they can be seen from one or more points of view in a realistic way. There are various ways to build them, and one of the most popular is to use a display with a microlens array on top of it that bends the light emitted by the pixels, so that every pixel becomes a ray cast in a specific direction towards the eyes of the user. All the pixels together become a set of rays that, in the end, form the representation of reality that the eye of the viewer can see.

Lightfields are usually generated this way: the virtual elements get rendered in a special way to a display, and then an array of microlenses transforms the pixels into directional rays (Image by CREAL)
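To make the microlens idea a bit more concrete, here is a minimal sketch of the pixel-to-ray mapping under a simple thin-lens/pinhole approximation. All the parameters and names here are illustrative assumptions of mine, not CREAL's actual optics:

```python
import numpy as np

# Toy model of a microlens-array lightfield display: each lenslet sits one focal
# length above a small patch of pixels, and each pixel under it emits a ray whose
# direction depends on the pixel's offset from the lenslet's optical axis.

PIXEL_PITCH_MM = 0.008   # assumed distance between adjacent pixels
LENSLET_FOCAL_MM = 2.0   # assumed lenslet focal length
PIXELS_PER_LENSLET = 9   # assumed 9x9 pixels behind each lenslet

def pixel_to_ray(lenslet_center_xy, px, py):
    """Return (origin, direction) of the ray emitted by pixel (px, py)
    behind a lenslet centered at lenslet_center_xy (all in mm, display at z=0)."""
    half = (PIXELS_PER_LENSLET - 1) / 2
    dx = (px - half) * PIXEL_PITCH_MM   # pixel offset from the optical axis
    dy = (py - half) * PIXEL_PITCH_MM
    # In the thin-lens approximation, a pixel offset (dx, dy) one focal length
    # behind the lenslet collimates into a ray tilted by about -dx/f and -dy/f.
    direction = np.array([-dx / LENSLET_FOCAL_MM, -dy / LENSLET_FOCAL_MM, 1.0])
    direction /= np.linalg.norm(direction)
    origin = np.array([lenslet_center_xy[0], lenslet_center_xy[1], 0.0])
    return origin, direction

origin, direction = pixel_to_ray((1.5, -0.7), px=0, py=8)
print("ray origin:", origin, "direction:", direction)
```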

CREAL has innovated lightfield technology by trying to make it more efficient for head-mounted displays. It exploits the fact that the eyes are in a predefined position inside the glasses, and that they can see sharply only in the fovea (while all the rest of the image appears blurred), to minimize the number of rays that must be calculated for every single frame.

The technology works as follows. An array of positions around the eye has been defined, for which the lightfields have to be calculated: CREAL states that its innovation lies in not calculating low-resolution lightfields for many positions, but a few high-resolution lightfields for a few selected positions around the eye of the user. Through an array of pinlights, a light modulator (a module that is able to "bend" the light rays at will), and some optics magic, the CREAL engine basically calculates how the light rays of the virtual elements would arrive at each one of the preset points of view, and emits exactly those rays, with the exact angle and intensity. For every point of view, we thus have the reconstruction of the light rays that would be visible from that position. Together, all the light rays computed from all these positions create around the eye a collection (a field) of rays that is a good approximation of the ones that would exist around the eye of the user in real life if the virtual elements depicted were real.

https://gfycat.com/flimsyquestionablebackswimmer

The eyes can focus on objects because the rays are the same as there would be if that scene were real, and they can also rotate freely to explore the scene, because the computation has not been made from a single point of view only, but for many positions around the eye. Even without eye tracking, it is thus possible to have a reconstruction of the scene that feels realistic.
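As I understood it (and heavily simplified), the general idea can be sketched like this. This is only my toy illustration of rendering a few viewpoints spread across the eyebox, not CREAL's actual engine; the eyebox size, the grid density, and the `render_view` callback are all hypothetical:

```python
import numpy as np

# Toy sketch: instead of one render per eye, the scene is rendered from a small
# set of viewpoints spread over the "eyebox" (the region where the pupil can be).
# The display optics then deliver the right rays to the right pupil position,
# so focus cues emerge without needing eye tracking.

EYEBOX_SIZE_MM = 10.0      # assumed eyebox width/height
VIEWPOINTS_PER_AXIS = 3    # assumed 3x3 grid of candidate pupil positions

def eyebox_viewpoints(eye_center_mm):
    """Return a grid of candidate pupil positions around the nominal eye center."""
    offsets = np.linspace(-EYEBOX_SIZE_MM / 2, EYEBOX_SIZE_MM / 2, VIEWPOINTS_PER_AXIS)
    return [eye_center_mm + np.array([dx, dy, 0.0]) for dy in offsets for dx in offsets]

def render_lightfield_frame(scene, eye_center_mm, render_view):
    """Render one full-quality sub-view per eyebox position (render_view is a
    hypothetical renderer callback); the optics multiplex these to the pupil."""
    return {tuple(vp): render_view(scene, camera_position=vp)
            for vp in eyebox_viewpoints(eye_center_mm)}
```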

https://gfycat.com/orangepoliticalhog

All of this on paper, of course. Creating such a system is not an easy feat, and it is even more complicated if the whole system must fit inside a pair of glasses… during a worldwide chip shortage. Tomas, the CEO of the company, told me that miniaturization required some compromises: when you have to reduce the form factor, you have to resort to approximations and optimizations that don't let you exploit the full power of lightfields.

Let me tell you about the experience I had with this technology.

CREAL AR headset prototype hands-on

The first headset I was able to try was the AR one. I tried to look serious and professional, but inside my brain a little monkey was jumping with happiness. I was finally going to try AR lightfields, and I imagined that the moment I saw them I would have an epiphany, a moment of bright vision like when people see saints in the sky; I imagined having a brain explosion and feeling like the chosen one.

Me with the CREAL AR headset prototype on

I was given this headset, codenamed Sue, made of a blue and white plastic crown (clearly 3D-printed) with some visible circuits inside and two lenses in front of it. It was connected to a gaming PC. I put it on my head and then used some knobs to make it fit. I was shown the CREAL logo in front of me, and at this point I used two other knobs to regulate the IPD and the distance of the lenses from the eyes (an eye-relief knob). When the CREAL logo was clear and centered, they showed me the first 3D AR element.

The lesson that I learned that day is that when you try lightfields, you don't feel like you've received illumination from God, and you don't become the "chosen one" either. I had super high expectations, and when I saw the augmentations, my first reaction was "isn't this more or less like a HoloLens?"

What I saw was a colored 3D element in front of me, which appeared semi-transparent and was cut by a field of view similar to that of HoloLens (CREAL confirmed to me that it is 50-60° diagonal, depending on the shape of the viewer's head). I could move around it thanks to positional tracking supplied for the prototype by an Intel RealSense module. There was also a bird that, if I put my palm up, looked for my hand and sat on it, thanks to hand tracking from Ultraleap. I played a bit with it, but for some seconds I was quite confused about what I was supposed to be looking at, since it really looked like what you can already see on HoloLens or Magic Leap.

So… what is the magic that these lightfields add? Why have I spent a whole paragraph explaining boring optics stuff to you if we could already use what is available on the market?

The fact is that my brain was tricking me. The problem with immersive realities is that you often notice the things that don't work, and find "normal" the things that do work. For the brain it is pretty obvious to see virtual objects the way you see real ones, so why should it care? It is nothing special for it… it does that every moment of your life. But actually, this feature is amazing for AR.

Visuals inside this lightfield AR headset appeared more realistic than with other AR glasses

I concentrated a bit more on the 3D element and noticed that it actually appeared to have more depth. I don't know how to explain it: I reported it as "the 3D looks more 3D". It is like the objects feel more alive, like you can better perceive all the nuances of the depth differences between their various parts. It's a bit like when you go from a 2D movie to a 3D movie: both of them are similar and enjoyable, but the 3D one feels more realistic because it stresses the depth of the elements in the scene more. Here you have a similar effect: it is like the elements have a more nuanced shape, like you can actually see all the ups and downs of their surfaces. It is like lightfield XR has an extra dimension that makes the virtual elements feel more believable, while in standard XR they look flatter.

Even better, I brought the little bird close to my face, and I noticed that I could actually still focus on it and see it very well, without blurring. I put it really close to my face, making my eyes start to cross, and I could still keep it in focus. Again, it seemed like nothing special at the beginning, because I can do something similar every day by putting a finger close to my nose… but have you ever managed to do that in AR?

https://gfycat.com/snivelingdismalkiwi
You can focus on the dragon close to you and see the background becoming blurred… or vice versa

I then tried to switch focus from the closer to the more distant elements and vice versa, and I could actually change my focus however I wanted… with the non-focused object appearing blurred and/or doubled. I started moving forward and backward at room scale while keeping my eyes fixed on the virtual elements, and things worked very naturally. I also tried to bring into focus at the same time virtual and real elements at a similar depth, and it somewhat worked. Again, it's weird how natural it feels: the special thing about this solution is that it doesn't look special at all, it just works normally. This reminds me of a popular quote from Futurama:

The sentence that “God” tells Bender (Image credits to FOX)

So in the end, my brain and my eyes were completely tricked into thinking those light rays were real and my way of seeing virtual elements was natural.

This was confirmed by another demo that CREAL made in collaboration with Imverse, where I could see in front of me, in AR, a volumetric avatar of Alex, the CTO, reconstructed live by an Azure Kinect. I had already seen some volumetric avatars on HoloLens 2, and at first they seemed similar, but then Alex moved his hand forward, fully extending his arm, pretending to hand me an object. At that moment, I clearly perceived the difference in depth between that hand and the rest of the body: it was like those 3D horror movies at the cinema where the hand of the villain holding a knife comes out of the screen. I saw the hand as if it were breaking an invisible wall, and this was just fantastic. His hand and his body were clearly at different focal depths, and I had never seen something like that in AR before.

This one with the solar system is another cool video shot by CREAL

The system is very interesting, but it is still a prototype and of course has some issues, for instance:

  • The FOV of 50-60° still feels limited
  • The headset is bulky and rather uncomfortable to wear
  • The virtual objects are maybe too transparent, more so than on HoloLens (I guess this is a problem of the combiner, not of the lightfield)
  • The virtual objects look like they are affected by visual noise, and if you get closer, you see that they seem to be made of pulsating dots
  • Focusing on virtual objects gave me more eye strain than I usually have when focusing on real objects
  • The setup was not straightforward.

The CREAL team told me that it is aware of all these issues and is working on them. Of course, it is an ongoing, complicated R&D process that requires time and money. Refining a prototype until it becomes a product is one of the most difficult processes ever.

CREAL VR headset prototype hands-on

I was then handed a big VR headset codenamed Zorya, which was so big and heavy that I gave it the nickname "The Brick". The black and yellow brick immediately won the award for the heaviest headset I have ever tried.

The best brick I have ever tried in my life lol

Again, I had to put it on and adjust the IPD and other settings to make it fit. Then the application started, and I found myself inside the cockpit of a spaceship. In front of me, I could see some blackish regions, and I was asked to adjust the fit until those regions became consistent with the rest of the image.

It turned out that the Brick works more or less like a Varjo device: in the center of your vision you have a 30° region composed of lightfields, and the rest comes from a standard Vive Pro display. The lightfield region should cover your fovea, while the standard display is just there for your peripheral vision.
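Just to illustrate this inset architecture, here is a toy sketch of deciding which of the two displays serves a given view direction. The 30° figure comes from what I was told; the peripheral FOV value and the function itself are my own assumptions, not how the Brick is actually driven:

```python
import numpy as np

# Toy sketch of a foveal lightfield inset composited with a peripheral display:
# the central ~30° cone is covered by the lightfield engine (with focus cues),
# everything else by a conventional stereoscopic display.

LIGHTFIELD_FOV_DEG = 30.0    # central lightfield region (from the article)
PERIPHERAL_FOV_DEG = 110.0   # assumed total FOV of the conventional display

def region_for_direction(view_dir, forward=np.array([0.0, 0.0, 1.0])):
    """Return which part of the composite display a given view direction hits."""
    view_dir = view_dir / np.linalg.norm(view_dir)
    angle = np.degrees(np.arccos(np.clip(np.dot(view_dir, forward), -1.0, 1.0)))
    if angle <= LIGHTFIELD_FOV_DEG / 2:
        return "lightfield inset (focus cues available)"
    if angle <= PERIPHERAL_FOV_DEG / 2:
        return "peripheral display (standard stereoscopy)"
    return "outside the headset's FOV"

print(region_for_direction(np.array([0.1, 0.0, 1.0])))  # ~5.7° off-axis -> inset
print(region_for_direction(np.array([0.7, 0.0, 1.0])))  # ~35° off-axis -> periphery
```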

Long story short, I ran the same tests that I did in AR, and I had similar results: the objects that fell inside the lightfield region appeared more realistic than anything I had ever tried in VR in my life; the virtual elements felt so alive, crisp, and nuanced. I could change focus, and in particular I could also focus on some small text written in the world, seeing it clearly and in focus, and reading it very naturally. I usually have difficulty reading text in VR, but in the lightfield region of this headset, the high resolution and the ability to focus on it made reading incredibly easy.

https://gfycat.com/infinitedizzyambushbug
In this video created by CREAL, you can see the objects entering the lightfield region becoming crisp and in focus. It conveys the idea, but actually trying it is much more impressive

This VR test was very interesting because the Vive Pro display showed objects with standard stereoscopy, while in the central region I had the new stereoscopy offered by lightfields, so I could make a direct comparison between the two. There was a droid flying outside the cockpit, and when it moved from the peripheral region of my vision to the central one, I could clearly see it gain a lot of resolution, and especially a lot more nuanced depth. This confirmed to me the sensation of "the 3D becoming more 3D" that this technology offers. Notice that the Vive Pro display is not bad at all, but its resolution felt ridiculous compared to that of the lightfield region. The droid in the lightfield region was incredibly realistic.

Me wearing the CREAL VR headset prototype. The demo was as impressive as the AR one

Again, this demo too didn't come without issues:

  • The lightfield region had a very limited FOV of 30°
  • The separation between the two display types was abrupt, and needed some kind of blurring
  • For my head shape, I was never able to make the black color disappear completely from the central region… it always remained a bit darker
  • The virtual objects looked like they were affected by visual noise
  • Focusing on virtual objects gave me more eye strain than I usually have when focusing on real objects
  • The headset is heavy as a brick and totally uncomfortable
  • The setup was not straightforward.

CREAL is also working on solving these problems.

Final impressions

The brick is love, the brick is life

My hands-on with CREAL's visual technology confirmed that it is one of the most interesting startups in the XR landscape, one that all headset manufacturers should speak with. I still think it will be bought by some major company (e.g. Apple) in the next few years.

Its technology is very interesting because it makes virtual elements feel more realistic, more vivid, and alive. It also solves many visual issues of current augmentations, like the vergence-accommodation conflict, which causes eye strain and can cause eye problems in the long run.

I was impressed by what they showed me. Of course, the product is not ready; it is still at a prototype stage, so it needs many improvements. But what makes me hope for the best is that this company has never overpromised (it has no video with jumping whales) and is more or less respecting the roadmap it defined some years ago. Also, it has very talented people working there, a good working atmosphere, and an efficient way of spending money. I really can't wait for 2023 to see how the lightweight lightfield glasses promised in the timeline will turn out.


The article is over, but I'm not done with CREAL: I have recorded a nice interview with CREAL CEO Tomas Sluka, and I will publish it in the coming weeks (subscribe to my newsletter so you don't miss it!). In the meantime, don't forget to share this post to give more visibility to this fantastic company… and if you have any comments or questions, feel free to write them here below!

PS: Congratulations to Ricardo Pereira for winning the contest by guessing that I was going to talk about CREAL just from the photo of the Brick that I published on Twitter!

(Disclaimer: my trip to visit CREAL HQ was paid for, but the company never asked me for anything in exchange)


Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I'll be very happy because I'll earn a small commission on your purchase. You can find my boring full disclosure here.
