
Full-Color 3D Holographic AR Displays Using Metasurface Waveguides

Augmented reality (AR) systems are transforming how we interact with digital content in the physical world. Traditional AR glasses, however, have struggled with bulkiness and a lack of true 3D depth cues, limiting their widespread adoption. In a groundbreaking study published in Nature, researchers from Stanford University, in collaboration with other institutions, have developed a novel holographic AR system that addresses these challenges using advanced metasurface waveguides and AI-driven holography algorithms. This innovative approach promises to revolutionize AR experiences by delivering vibrant, full-color, 3D visuals in a compact and wearable form factor.

Inverse-Designed Metasurface Waveguide

The core of this new AR display system is an inverse-designed metasurface waveguide. Unlike conventional AR glasses, which rely on amplitude spatial light modulators (SLMs) and bulky projection optics, this system uses a phase-only SLM mounted close to the in-coupling grating. This configuration minimizes the device form factor and enables the presentation of true 3D depth cues. The waveguide is made from high-index glass with inverse-designed metasurface couplers that jointly optimize compactness, dispersion correction, transmission efficiency, and angular uniformity.
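A quick way to see why the high-index glass matters is the total-internal-reflection (TIR) condition that traps light inside the waveguide. The relations below are a generic waveguide-optics sketch in standard notation, not formulas quoted from the paper:

```latex
% Guided-mode condition for a glass slab of index n in air: a field
% component with in-plane wavevector k_x is trapped by total internal
% reflection only if it lies inside the band
\[
  k_0 < |k_x| < n\,k_0 , \qquad k_0 = \frac{2\pi}{\lambda} .
\]
% The band has width (n - 1) k_0, so the span of in-air angles that a
% coupler can fold into the waveguide obeys
\[
  \sin\theta_{\max} - \sin\theta_{\min} \;\le\; n - 1 ,
\]
% which is why a high-index substrate directly enlarges the field of
% view that a single thin waveguide can carry at all three design
% wavelengths.
```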

[Fig. 1: Illustration of the optical principle of waveguide-based AR displays.]

a, Conventional AR glasses use amplitude SLMs, such as organic light-emitting diodes or micro light-emitting diodes, which require a projector-based light engine that is typically at least as thick as the focal length f of the projection lens. b, The design of our holographic AR glasses uses a phase-only SLM that can be mounted very close to the in-coupling grating, thereby minimizing the device form factor. Additionally, unlike conventional AR glasses, our holographic design can provide full 3D depth cues for virtual content, as illustrated by the bunny (adapted from the Stanford Computer Graphics Laboratory). c, Compact 3D-printed prototype illustrating the components of our holographic AR glasses in a wearable form factor.
Key Features:
  1. Compact Design: Eliminating the need for collimation optics makes the holographic light engine significantly more compact than traditional projector-based setups.
  2. High Uniformity and See-Through Efficiency: The metasurface waveguides achieve high diffraction efficiency and spectral selectivity, providing clear, vibrant images without obstructing the user’s view of the real world.
  3. Advanced Couplers: The in- and out-couplers are designed for precise k-vector matching, ensuring minimal chromatic dispersion and high-quality image relay across the entire visible spectrum (see the k-vector sketch following this list).
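To make the k-vector matching in item 3 concrete: in generic grating notation (a sketch, not the paper's symbols), first-order diffraction at a coupler of period Λ shifts the in-plane wavevector by the grating vector 2π/Λ:

```latex
% In-coupling adds the grating vector; out-coupling, with the same
% period, applies the equal-and-opposite shift:
\[
  k_x^{\text{guided}} = k_x^{\text{in}} + \frac{2\pi}{\Lambda} ,
  \qquad
  k_x^{\text{out}} = k_x^{\text{guided}} - \frac{2\pi}{\Lambda}
                   = k_x^{\text{in}} .
\]
% The wavelength-dependent angular offset picked up at the in-coupler is
% cancelled exactly at the out-coupler: each colour exits at the angle
% it entered, so the relayed full-colour image stays free of chromatic
% dispersion.
```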

[Fig. 2: Design and evaluation of our inverse-designed metasurfaces.]

a, Visualization of the waveguide geometry for full-colour operation. b, Electric field maps at red (638 nm), green (521 nm) and blue (445 nm) wavelengths for light passing through the metasurface out-coupler towards the user’s eye. The black arrows illustrate the wave vectors of the incident and diffracted light. c, Visualization of the inverse-designed metasurfaces optimized for waveguide couplers. The period (Λ) and height (H) of the metasurfaces are 384 nm and 220 nm, respectively. d, Scanning electron microscope images of the fabricated metasurfaces. e, The simulated and experimentally measured transmittance spectra of unpolarized light for the inverse-designed metasurfaces in the visible range, corresponding to see-through efficiency for real-world scenes. f, The simulated (dashed lines) transfer functions along the x axis for conventional single-line gratings and the simulated (solid lines) and experimentally measured (circles) transfer functions for our inverse-designed metasurfaces. The colours of the plots correspond to the red, green and blue wavelengths. The designed metasurfaces are much more efficient than conventional gratings in green and blue; the very large diffraction angle of the red channel, however, makes further efficiency gains there more difficult. g, Uniformities of the transfer functions for the conventional gratings without optimization and the inverse-designed metasurfaces with optimization. Scale bars, 400 nm (b), 2 μm (d, left), 200 nm (d, right). E, electromagnetic field.

Waveguide Propagation Model

To accurately simulate light propagation through the waveguide, the researchers developed a waveguide propagation model that combines a physically accurate description of wave propagation with learnable components calibrated using camera feedback. Convolutional neural networks (CNNs) correct for the residual discrepancies between the idealized physical model and the measured behaviour of the fabricated waveguide.
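The structure of such a hybrid model is easy to sketch. The following PyTorch-style model is a minimal illustration of the idea described above and in Fig. 3; all class and parameter names (WaveguideModel, coupler_in, and so on), shapes, and layer choices are hypothetical, not the authors' implementation:

```python
import torch
import torch.nn as nn

class WaveguideModel(nn.Module):
    """Minimal sketch of a hybrid physical/learned propagation model.

    Physical parts: phase-only modulation and transfer-function propagation.
    Learned parts: complex coupler efficiencies and small CNNs, calibrated
    from camera feedback (the calibration loop itself is not shown).
    """

    def __init__(self, res=(512, 512)):
        super().__init__()
        h, w = res
        # Learned per-pixel complex coupler efficiencies, stored as
        # (real, imag) pairs so a standard optimizer can update them.
        self.coupler_in = nn.Parameter(torch.stack([torch.ones(h, w),
                                                    torch.zeros(h, w)], -1))
        self.coupler_out = nn.Parameter(torch.stack([torch.ones(h, w),
                                                     torch.zeros(h, w)], -1))
        # Small CNNs refining the field at the in-coupler plane and turning
        # the complex field into intensity at the target plane
        # (2 channels = real and imaginary parts of the field).
        self.cnn_in = nn.Conv2d(2, 2, kernel_size=3, padding=1)
        self.cnn_target = nn.Conv2d(2, 1, kernel_size=3, padding=1)

    @staticmethod
    def _propagate(field, transfer_fn):
        # Physically motivated step: multiply the angular spectrum of the
        # field by a precomputed transfer function.
        return torch.fft.ifft2(torch.fft.fft2(field) * transfer_fn)

    @staticmethod
    def _apply_cnn(cnn, field):
        # Run a complex field of shape (H, W) through a CNN as a
        # 2-channel real image.
        x = torch.view_as_real(field).permute(2, 0, 1).unsqueeze(0)
        return cnn(x)[0]

    def forward(self, phase, illum, tf_waveguide, tf_to_target):
        field = illum * torch.exp(1j * phase)          # phase-only SLM
        field = field * torch.view_as_complex(self.coupler_in)
        y = self._apply_cnn(self.cnn_in, field)        # CNN at in-coupler
        field = torch.view_as_complex(y.permute(1, 2, 0).contiguous())
        field = self._propagate(field, tf_waveguide)   # inside waveguide
        field = field * torch.view_as_complex(self.coupler_out)
        field = self._propagate(field, tf_to_target)   # to target depth
        out = self._apply_cnn(self.cnn_target, field)  # field -> intensity
        return out[0].clamp(min=0.0)
```

Because every stage is differentiable, gradients flow from the predicted intensity all the way back to the SLM phase pattern; the hologram-computation loop sketched after the "Model Components" list below exploits exactly that.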

[Fig. 3: Illustration of the proposed wave propagation model.]

We combine physical aspects of the waveguide (highlighted in green) with artificial-intelligence components that are learned from camera feedback (highlighted in orange). In our model, the input phase pattern (left) applies a per-pixel phase delay, from 0 to 2π, to the converging illumination before the wavefront is modulated by the learned in-coupler efficiency. This wavefront is then sent through a CNN at the in-coupler plane and propagated through the waveguide, using its physically motivated transfer function, before an additional learned out-coupler efficiency is used to determine the out-coupled wavefront (centre). The latter is propagated to the target scene at various distances from the user where a CNN is applied, converting the complex-valued field into observed intensities (right). When trained on a captured dataset, the learned parameters of the CNNs, the coupler efficiencies and the waveguide propagation enable this model to accurately predict the output of our holographic AR glasses. The model is fully differentiable, enabling simple gradient descent computer-generated holography (CGH) algorithms to compute the phase pattern for a target scene at runtime. The bunny scene is from Big Buck Bunny, © 2008 Blender Foundation/www.bigbuckbunny.org, under a Creative Commons licence CC BY 3.0.
Model Components:
  1. Physical Elements: Includes the phase pattern, in-coupled and out-coupled wavefronts, and the waveguide’s transfer function.
  2. Learned Components: CNNs at the in-coupler and target planes, together with learned coupler efficiencies, are calibrated from camera feedback to close the gap between the idealized physics and the behaviour of the real hardware.
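Because the model is differentiable end to end, computing a hologram at runtime reduces to gradient descent on the phase pattern. A minimal sketch using the hypothetical WaveguideModel above (the loss, step count, and learning rate are illustrative, not the paper's settings):

```python
import torch

def compute_phase(model, illum, tf_wg, tf_tgt, target, steps=500, lr=0.05):
    # Optimize the SLM phase so the model-predicted intensity matches the
    # target image; `model` is the hypothetical WaveguideModel sketched above.
    phase = torch.zeros(target.shape, requires_grad=True)  # initial guess
    opt = torch.optim.Adam([phase], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        pred = model(phase, illum, tf_wg, tf_tgt)          # simulated output
        loss = torch.nn.functional.mse_loss(pred, target)
        loss.backward()                                    # autograd through
        opt.step()                                         # the whole model
    return phase.detach() % (2 * torch.pi)                 # wrap to [0, 2*pi)
```

For 3D scenes, the same loop would sum the loss over several target distances, one per focal plane, so that a single phase pattern reproduces the correct focus and defocus behaviour shown in Fig. 4b.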

Experimental Results

The prototype AR display system integrates the metasurface waveguide with a HOLOEYE LETO-3 phase-only SLM and a FISBA READYBeam light source. Experiments demonstrated significant improvements in image quality and depth rendering over baseline wave propagation models (free-space and purely physically motivated variants). The system produced high-quality, full-color 3D holographic images, providing the natural focus cues that mitigate the visual discomfort associated with the vergence-accommodation conflict.

[Fig. 4: Experimental results captured through our compact holographic display prototype.]

a, Comparison of 2D holograms synthesized using several different wave propagation models, including free-space propagation, a physically motivated model and our proposed model combining physics and learnable parameters that are calibrated using camera feedback. b, Comparison of two 3D holograms. Zoomed-in crops show the scene with the camera focused at different depths. Blue boxes highlight content that the camera is focused on while white boxes emphasize camera defocus. c, Comparison of a 3D hologram captured in an optical-see-through AR mode. The bird, fish and butterfly are digitally superimposed objects, and the elephant and letters are part of the physical environment. In all examples, the proposed wave propagation model represents the physical optics much more accurately, resulting in significant image quality improvements over alternative models. In a, the squirrel scene is from Big Buck Bunny, © 2008 Blender Foundation/www.bigbuckbunny.org, under a Creative Commons licence CC BY 3.0. In b, the couch and market target scenes are, respectively, from the High Spatio-Angular Light Field dataset and the Durian Open Movie project (© copyright Blender Foundation/durian.blender.org) under a Creative Commons licence CC BY 3.0.
Highlights:
  1. 2D and 3D Image Quality: The AI-enhanced wave propagation model significantly outperformed free-space propagation and other physically motivated models, achieving a 3–5 dB higher peak signal-to-noise ratio (PSNR); see the short PSNR sketch after this list.
  2. Real-World Integration: The system successfully overlaid digital content onto physical environments, maintaining high image quality across various focus settings.
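For context, the peak signal-to-noise ratio used in that comparison is a simple function of mean squared error. A minimal NumPy sketch (the generic definition, not the authors' evaluation code):

```python
import numpy as np

def psnr(captured: np.ndarray, target: np.ndarray, peak: float = 1.0) -> float:
    """PSNR in dB for images with values scaled to [0, peak]."""
    mse = np.mean((captured.astype(np.float64) - target.astype(np.float64)) ** 2)
    return float("inf") if mse == 0.0 else 10.0 * np.log10(peak ** 2 / mse)
```

On this scale the reported 3–5 dB gain is substantial: +3 dB corresponds to halving the mean squared error, and +5 dB to cutting it by roughly a factor of three.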

Conclusion and Future Directions

The innovative co-design of metasurface waveguides and AI-driven holography algorithms marks a significant step towards practical and high-quality 3D AR glasses. Future work aims to expand the field of view, further reduce waveguide thickness, and optimize the computational efficiency of hologram generation for real-time applications. This research opens new avenues for the development of compact, high-performance AR systems capable of delivering immersive and realistic experiences.

For more details, refer to the full research article in Nature: Full-colour 3D holographic augmented-reality displays with metasurface waveguides.

Stay updated on the latest AR, VR, and MR trends at Knoxlabs News.

References:

Gopakumar, M., Lee, G.-Y., Choi, S., Chao, B., Peng, Y., Kim, J., & Wetzstein, G. (2024). Full-colour 3D holographic augmented-reality displays with metasurface waveguides. Nature. https://doi.org/10.1038/s41586-024-07386-0
