Types of Lenses Used in AR, MR, and XR Headsets: Fresnel vs Pancake vs Aspheric vs Waveguides


A practical guide to XR lenses: Fresnel, pancake, prisms, waveguides, metasurfaces, and what each trades off.

Summary

  • XR lenses are not just magnifiers; they define field of view, eyebox, brightness, and comfort through hard physics tradeoffs.
  • VR often uses refractive, Fresnel, or pancake eyepieces; pancake optics reduce thickness but can lose efficiency and create ghosts.
  • Optical see-through AR relies on combiners like birdbath systems, freeform prisms, or waveguides that must stay transparent and efficient.
  • Waveguides (diffractive or reflective) enable thin AR glasses but struggle with efficiency, color uniformity, and artifact control.
  • Advanced approaches like metasurface lenses and varifocal/tunable optics target thinner form factors and better depth cues, but manufacturing and bandwidth remain key hurdles.



In MR, AR, and XR headsets, the lenses are not “just optics.” They are the physics that decides whether the product feels magical or exhausting.


Most user complaints that sound like “software issues” are actually lens issues: blurry edges, a tiny “sweet spot,” glare streaks on high-contrast text, rainbow fringes, dim overlays in daylight, or a headset that is simply too bulky to wear for an hour. The lens stack is also where manufacturability gets brutal: micron-level tolerances, coatings that must behave under sweat and skin oils, and optical artifacts that only show up when a human eye rotates through the eyebox.


So when someone asks “what lenses do XR devices use?”, the real answer is: a whole zoo of optical architectures, each one a trade between field of view, weight, brightness, eyebox size, cost, and how much pain you can tolerate in calibration.


A good starting point is to split the world into two jobs:

  • VR-style eyepieces: magnify a microdisplay so it looks like a big screen far away.
  • AR-style combiners: add synthetic light into your view while staying transparent to the real world.


A lot of MR devices combine both ideas, because they are doing video passthrough plus virtual imagery. A broad overview of these architectures and their tradeoffs is summarized well in review papers like Xiong et al. (Nature Photonics) and Kress and colleagues’ work on head-mounted display optics.


Explained in seconds

Think of an XR headset as a tiny high-resolution screen placed very close to your eye. Without lenses, you cannot focus on it.


So headsets add optics that do three simple things:

  • Make the screen look far away (so your eye can relax).
  • Make the image look big (wide field of view).
  • Make it work as your eye moves (a usable eyebox, not a pinhole you must align perfectly).
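The first two jobs are plain thin-lens math. Here is a minimal sketch (plain Python, with illustrative numbers rather than any real headset's spec) of how an eyepiece pushes a nearby display out to a comfortable virtual distance and sets the apparent field of view:

```python
import math

def virtual_image_distance_mm(f_mm: float, d_obj_mm: float) -> float:
    """Thin-lens equation, 1/di = 1/f - 1/do.
    With the display just inside the focal length (do < f), the result is
    negative: a virtual image far behind the lens, which is exactly what
    an eyepiece wants."""
    return 1.0 / (1.0 / f_mm - 1.0 / d_obj_mm)

def angular_fov_deg(display_width_mm: float, f_mm: float) -> float:
    """With the display at the focal plane, the image sits at infinity and
    its apparent angular size is set by display width over focal length."""
    return 2.0 * math.degrees(math.atan(display_width_mm / (2.0 * f_mm)))

# Illustrative numbers only (not a product spec):
di = virtual_image_distance_mm(f_mm=40.0, d_obj_mm=38.0)
fov = angular_fov_deg(display_width_mm=50.0, f_mm=40.0)
print(f"virtual image at {di:.0f} mm (negative = virtual), FOV ≈ {fov:.0f}°")
```

Note how sensitive the virtual-image distance is: moving the display 2 mm inside a 40 mm focal length already pushes the image out beyond half a meter. That sensitivity is why headsets need tight mechanical tolerances around the display-to-lens gap.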


In VR, the lens is mostly a magnifier.


In AR, the lens is more like a “smart window”: it must pass real-world light through, but also redirect display light into your eye. That “smart window” is usually a prism combiner, a curved mirror system, or a waveguide embedded in glass or plastic.


The optical constraints that drive everything

Before lens types, it helps to name the constraints that keep designers awake:

  • Field of view (FOV): how wide the virtual image appears.
  • Eyebox (exit pupil volume): the region where your pupil can move and still see the image.
  • Optical efficiency: how much display light actually reaches the eye (critical for battery and outdoor AR).
  • Aberrations: blur, distortion, chromatic fringing, and astigmatism that worsen off-axis.
  • Stray light and ghosts: internal reflections that create duplicate images or glare.
  • Pupil swim: perceived distortion changes as the eye rotates within the eyebox (a big comfort issue in practice).
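Several of these constraints are coupled by étendue conservation: lossless optics can redistribute the product of area and solid angle, but never shrink it. A back-of-envelope sketch (assuming circular emission cones and lossless optics; all numbers illustrative) shows why a big eyebox and a wide FOV together demand a big or very bright display:

```python
import math

def etendue_mm2_sr(area_mm2: float, half_angle_deg: float) -> float:
    """Étendue of a circular emission cone: G = pi * A * sin^2(theta)."""
    return math.pi * area_mm2 * math.sin(math.radians(half_angle_deg)) ** 2

# Eye side: a 10 mm x 10 mm eyebox that must carry a 50° half-angle FOV.
eye_side = etendue_mm2_sr(area_mm2=100.0, half_angle_deg=50.0)

# Display side: if the display emits usefully into a ±30° cone, conservation
# sets a minimum display area. Bigger eyebox x wider FOV = bigger display.
min_display_area = eye_side / (math.pi * math.sin(math.radians(30.0)) ** 2)
print(f"eye-side étendue ≈ {eye_side:.0f} mm²·sr, "
      f"min display area ≈ {min_display_area:.0f} mm²")
```

This is one reason pupil-replicating waveguides exist: by copying the exit pupil many times, they trade optical efficiency for an eyebox larger than a single-pupil system of the same étendue could offer.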


With that in mind, here are the major lens families you will see in MR, AR, and XR.


Classic refractive lenses (aspheric and multi-element)

What they are: “Normal” lenses (usually plastic in consumer headsets) shaped to reduce aberrations, often using aspheric surfaces (not a simple sphere).
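"Not a simple sphere" has a precise meaning. Lens-design tools describe such surfaces with the standard even-asphere sag equation; here is a sketch of it in plain Python:

```python
import math

def asphere_sag(r_mm: float, c_per_mm: float, k: float, coeffs=()) -> float:
    """Even-asphere surface height (sag):
      z(r) = c*r^2 / (1 + sqrt(1 - (1+k)*c^2*r^2)) + A4*r^4 + A6*r^6 + ...
    c = 1/R is the vertex curvature, k the conic constant
    (k = 0 sphere, k = -1 paraboloid), coeffs = (A4, A6, ...)."""
    base = c_per_mm * r_mm ** 2 / (
        1.0 + math.sqrt(1.0 - (1.0 + k) * c_per_mm ** 2 * r_mm ** 2))
    return base + sum(a * r_mm ** (4 + 2 * i) for i, a in enumerate(coeffs))
```

With k = 0 and no extra coefficients this reduces exactly to a sphere; the extra freedom in k and the A4, A6, ... terms is what lets one molded plastic surface cancel aberrations that would otherwise require stacking more elements.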


Where they show up: VR headsets and some MR prototypes, especially when designers want fewer diffraction artifacts than Fresnel.


Why they exist: They can deliver high image quality and good contrast with fewer ring-related artifacts.


What breaks: For wide FOV, they tend to get thick and heavy. Thickness also pushes the display farther from the eye, making the headset bulkier.


Classic refractive designs are still important as a baseline, and many “new” XR lenses are really folded or hybrid versions of refractive optics.


Fresnel lenses (the “thin magnifier with rings”)

What they are: A Fresnel lens approximates a thick lens using concentric grooves, reducing thickness and weight.


Where they show up: Many VR headsets historically, because they are cheap to mold and enable large apparent screens without a thick chunk of plastic.


Why they exist: Lightweight, compact, and manufacturable at consumer scale.


What breaks: The grooves scatter light. This can create glare streaks (“god rays”) and reduce contrast on high-contrast scenes, and edge quality can suffer. Fresnel designs also interact with pupil swim and distortion constraints in ways that require aggressive software calibration.


Fresnel is the classic example of the XR lens trade: small and affordable, but visually unforgiving if your content has bright UI elements on dark backgrounds.


Pancake optics (folded lenses using polarization)

What they are: A “pancake” lens is a folded optical path that uses polarization control and partial reflections to make the lens module much thinner than a comparable refractive eyepiece.


A typical pancake stack uses components like a reflective polarizer, a quarter-wave plate, and a semi-reflective mirror to bounce light back and forth while controlling polarization states.


Where they show up: Many modern compact VR and MR headsets.


Why they exist: You can reduce front-to-back thickness substantially. That improves wearability and can help weight distribution.


What breaks (and why it matters):

  • Efficiency loss: polarization-based folding can throw away a lot of light, which pressures display brightness and battery life. Research on improving pancake efficiency is an active area.
  • Ghost images: multiple internal reflections and imperfect polarization control can generate visible ghosts that are content-dependent and annoying.
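The efficiency ceiling has a simple origin. In an idealized fold (a sketch that ignores coating, polarizer, and retarder imperfections), display light must transmit through the half-mirror once and reflect off it once, so that element alone caps throughput at R(1-R), which peaks at 25% for a 50/50 mirror:

```python
def pancake_throughput(half_mirror_r: float = 0.5,
                       polarizer_t: float = 1.0,
                       qwp_t: float = 1.0) -> float:
    """Idealized pancake light budget. The half-mirror is traversed twice
    (transmitted once, reflected once), contributing R*(1-R) <= 0.25.
    polarizer_t / qwp_t model further (here idealized) element losses."""
    t_mirror = half_mirror_r * (1.0 - half_mirror_r)
    return t_mirror * polarizer_t ** 2 * qwp_t ** 2
```

The light that takes the "wrong" path through the fold does not vanish; unless the polarization control absorbs or redirects it, it emerges as the ghost images described above. That is why efficiency and ghost suppression are two faces of the same engineering problem.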


Pancake optics are a good demonstration of “systems thinking” in XR: the lens choice forces decisions in display type, brightness, thermal design, and rendering calibration.


Catadioptric and reflective eyepieces (mirror plus lens hybrids)

What they are: Optics that combine reflection (mirrors) and refraction (lenses) to fold the path and control aberrations.


Where they show up: Compact VR designs, research prototypes, and some specialty headsets.


Why they exist: Mirrors can fold paths efficiently and avoid some chromatic aberration, but they introduce alignment sensitivity and stray light challenges.


This family overlaps conceptually with pancake optics, but not every folded reflective design is polarization-based.


Free-space AR combiners (beam splitters and curved mirrors)

When you move from VR to optical see-through AR, the lens has a new job: it must be transparent.


Birdbath combiners

What they are: A curved mirror plus a beam splitter arrangement that reflects display light into the eye while passing the real world through.


Where they show up: Many early AR headsets and a lot of developer hardware, because the architecture is comparatively straightforward and can support wider FOV than many waveguides.


What breaks: Bulk, visible optics, and potential reflections from the beam splitter. This category is commonly discussed in AR/VR display architecture reviews.


Curved mirror and relay-based see-through systems

Some systems use relay optics and partially reflective surfaces to manage eye relief, eyebox, and packaging. These can be high quality, but are rarely “glasses-like.”


Freeform prisms and total internal reflection combiners

What they are: Prism-based combiners that use total internal reflection (TIR) and freeform (non-spherical) surfaces to redirect light into the eye while staying see-through.
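TIR itself is simple physics: light inside a denser medium reflects totally once its internal angle of incidence exceeds the critical angle, arcsin(n_outside / n_inside). A quick sketch with illustrative indices:

```python
import math

def critical_angle_deg(n_inside: float, n_outside: float = 1.0) -> float:
    """Angle beyond which total internal reflection occurs at the interface."""
    return math.degrees(math.asin(n_outside / n_inside))

# e.g. a PMMA prism (n ≈ 1.49) against air: TIR beyond roughly 42°.
print(f"critical angle ≈ {critical_angle_deg(1.49):.1f}°")
```

The freeform surfaces then do the hard part: keeping every ray of a wide field of view above that angle on the guiding bounces, while still extracting it cleanly toward the eye.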


Where they show up: Many AR designs that target better image quality and larger FOV than early waveguides, but can accept some thickness.


Why they exist: Freeform surfaces give designers more degrees of freedom to correct aberrations and shape the eyebox.


Credible anchor point: A well-cited Applied Optics paper by Cheng et al. demonstrates an optical see-through head-mounted display design using a wedge-shaped freeform prism and discusses the difficulty of achieving wide FOV and fast optics in a compact form factor.


What breaks: These designs can be hard to manufacture and align, and they can be visually noticeable as optics in front of the eye, which impacts consumer aesthetics.


Waveguides (the “light piped through glass” approach)

Waveguides are the dominant “glasses-like AR” direction: a thin transparent slab guides display light via total internal reflection, then extracts it toward the eye.


There are two big subfamilies:


Diffractive and holographic waveguides

What they are: Use diffraction gratings or holographic elements to couple light in, replicate the pupil (exit pupil expansion), and couple light out.
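In-coupling is governed by the grating equation: the grating pitch must bend incoming light steeply enough that it exceeds the waveguide's critical angle and gets trapped by TIR. A sketch (illustrative pitch and index, assuming incidence from air):

```python
import math

def diffracted_angle_deg(wavelength_nm: float, pitch_nm: float,
                         n_waveguide: float, order: int = 1,
                         inc_deg: float = 0.0) -> float:
    """Grating equation n_wg*sin(theta_d) = sin(theta_i) + m*lambda/pitch.
    Returns the diffraction angle inside the waveguide."""
    s = (math.sin(math.radians(inc_deg))
         + order * wavelength_nm / pitch_nm) / n_waveguide
    if abs(s) > 1.0:
        raise ValueError("evanescent: this order does not propagate")
    return math.degrees(math.asin(s))

# Green light, a ~380 nm pitch in-coupler, n = 1.8 glass:
theta = diffracted_angle_deg(wavelength_nm=532, pitch_nm=380, n_waveguide=1.8)
tir_limit = math.degrees(math.asin(1.0 / 1.8))  # must stay above ~33.7°
print(f"diffracted at {theta:.1f}°, TIR limit {tir_limit:.1f}°")
```

Because the diffraction angle depends on wavelength, red, green, and blue bounce along the slab at different angles for the same grating. That wavelength dependence is one root of the color non-uniformity and rainbow artifacts that make diffractive waveguides hard.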


Where they show up: Many AR glasses concepts because they can be very thin.


What breaks: Color non-uniformity, “rainbow” artifacts, eyebox limits, and efficiency challenges. Exit pupil expansion designs are a deep topic on their own, with active research on polarization volume gratings and other grating structures.


Holographic and diffractive combiners have been surveyed for years, including by Kress and colleagues, because they sit at the intersection of optical design and manufacturable nano-structured surfaces.


Reflective (geometric) waveguides

What they are: Use partially reflective surfaces or embedded mirrors inside the waveguide to redirect and extract light.


Why they exist: Potentially fewer diffraction rainbow artifacts compared to purely diffractive approaches, but with their own fabrication and uniformity challenges.


Recent work discusses geometric versus diffractive waveguide coupling strategies in the context of AR waveguide combiners.


Holographic optical elements and volume holograms

What they are: Optical elements recorded in a volume material that can act like a lens, mirror, or combiner, often with angular and wavelength selectivity.


Where they show up: AR combiners, waveguide coupling, and some holographic display concepts.


Why they exist: They can be thin and function-rich.


What breaks: Sensitivity to wavelength and angle, which is both a feature and a headache.


Metasurface lenses (metalenses) and “flat optics”

What they are: Nanostructured surfaces that locally manipulate phase, polarization, and dispersion, enabling lens-like behavior in extremely thin form factors.


Where they show up: Primarily research today, with aggressive interest because they promise thinner AR optics and new aberration control.


A landmark demonstration showed a metasurface eyepiece for augmented reality in Nature Communications, and more recent reviews summarize metasurfaces for near-eye displays and their practical limits (efficiency, bandwidth, and manufacturing).


What breaks: Achieving broadband (full-color) performance, high efficiency, and low stray light at consumer scale is still hard, even though progress is real.


Metasurfaces are a good example of “new lens type” that is also a new manufacturing problem.


Focus-tunable and varifocal lenses (to reduce vergence-accommodation conflict)

Most XR headsets present imagery at one fixed focal distance. Your eyes converge to different depths in stereo, but your eyes' focus (accommodation) stays locked to that single focal plane. That mismatch is the vergence-accommodation conflict (VAC), and it is linked to discomfort for some users.
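The mismatch is easy to quantify in diopters (inverse meters), which is how the vision-science literature measures it. A sketch, with an illustrative 2 m focal plane (actual focal planes vary by headset):

```python
def vac_mismatch_diopters(fixation_m: float, focal_plane_m: float = 2.0) -> float:
    """Vergence demand tracks the rendered object (1/distance); accommodation
    is pinned to the headset's fixed focal plane. VAC is the difference."""
    return abs(1.0 / fixation_m - 1.0 / focal_plane_m)

# An object rendered 0.4 m away on a 2 m focal plane:
# vergence demand 2.5 D, accommodation demand 0.5 D -> 2.0 D of conflict.
print(f"{vac_mismatch_diopters(0.4):.1f} D")
```

The diopter framing explains why close-up UI (reading, hand interaction) is where VAC bites hardest: distances under arm's length rack up conflict quickly, while everything beyond a few meters stays near zero.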


Varifocal and multifocal approaches often add an additional lens element that can change optical power:

  • Alvarez lenses: two complementary freeform plates that shift laterally to change focus continuously.
  • Liquid lenses (including electrowetting and other tunable-fluid approaches): compact focus tuning without moving large glass elements.
  • Liquid crystal lenses: electrically controlled focusing elements, promising for thin stacks (with their own optical quality constraints).
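The Alvarez idea in particular is compact enough to sketch. Two plates with opposite cubic thickness profiles t(x, y) = ±A·(x³/3 + x·y²), slid apart by ±δ along x, combine into a quadratic (lens-like) profile, giving optical power that grows linearly with the shift. A sketch under thin-element assumptions, with an illustrative coefficient:

```python
def alvarez_power_diopters(a_per_mm2: float, shift_mm: float,
                           n: float = 1.5) -> float:
    """Combined power of an Alvarez pair: P = 4*(n-1)*A*delta.
    A is the cubic coefficient (1/mm^2), delta the lateral shift (mm);
    the factor 1000 converts 1/mm to diopters (1/m)."""
    return 4.0 * (n - 1.0) * a_per_mm2 * shift_mm * 1000.0

# With A = 5e-4 mm^-2 and n = 1.5, each millimeter of shift adds ~1 D,
# which is the continuous-focus behavior the bullet above describes.
print(f"{alvarez_power_diopters(5e-4, 1.0):.2f} D per mm of shift")
```

The attraction for XR is that a few millimeters of lateral travel, rather than a large axial lens motion, spans the whole accommodation range a varifocal display needs.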


These are not “main eyepiece replacements” yet in most mass-market headsets, but they matter because they change what “a good lens” means: not only sharpness, but correct depth cues.


Which lens family fits which product?

  • If you need wide FOV and lower cost: Fresnel or simpler refractive optics can win, but you pay in glare and edge artifacts.
  • If you need a thinner VR/MR headset: pancake optics are a leading choice, but efficiency and ghost control become core engineering tasks, not polish work.
  • If you need glasses-like optical see-through AR: waveguides dominate, but you fight efficiency, eyebox expansion, and color uniformity.
  • If you need higher image quality AR and can accept more visible optics: freeform prisms and free-space combiners can deliver, but industrial design and manufacturability are harder.
  • If you are betting on the future: metasurfaces and advanced holographic elements are promising, but they are still transitioning from lab demonstrations to robust consumer supply chains.


The honest reason there are “so many lenses”

XR is not one product. It is a set of products that all happen to put imagery near your eyes, and the physics refuses to give a single best lens.


Every lens architecture is a declaration of priorities:

  • Prioritize thinness, and you probably accept efficiency loss and fight ghosts.
  • Prioritize outdoor AR visibility, and you obsess over optical throughput and stray light.
  • Prioritize wide FOV, and you wrestle with distortion and eyebox size.
  • Prioritize consumer aesthetics, and you end up in waveguides and advanced combiners.


That is why “lenses in MR/AR/XR” is not a list; it is a map of tradeoffs. The good news is that the field is moving fast, not by magic, but through real progress in materials, gratings, polarization optics, and computational calibration pipelines.