Car HUDs Are Bad. Jaguar Land Rover Is Testing Tech to Change That

With $5.3 million in funding and Meta’s former AR boss on its team, AllFocal Optics is looking to revolutionize the screens in car heads-up displays—and it's coming for smart glasses too.

New technology doesn’t arrive fully formed, I remind myself as I strap on a pirate’s eye patch, then place a heavily modified bicycle helmet onto my head. It isn’t the most glamorous getup. But here, in a small office on an unassuming business park on the outskirts of Cambridge, England, it could be the foundation of something quite remarkable.

I'm here to meet AllFocal Optics, a startup that has patented a new type of nanophotonic lens with the power to transform everything from virtual and augmented reality headsets to night-vision goggles, binoculars, cameras, and heads-up displays (HUDs). It’s the latter that piqued my interest, after hearing Jaguar Land Rover has embarked on a research project to discover whether the lens can improve car HUDs and, with them, road safety.

Its makers also claim—and prove, with the aforementioned helmet, as well as a butchered Meta Quest 3—that the lens can produce a digital image with perfect clarity, even if you have poor vision, without the need for glasses.

Founded in 2022 as Lark but since renamed AllFocal Optics, the company is headed by former Royal Academy of Engineering enterprise fellow Dr Pawan Shrestha. Dr Ash Saulsbury, former technology VP at Microsoft and former Meta AR boss, joined late last year as chair, around the same time that the startup secured a $5.3 million funding round.

AllFocal Optics says the lens it has created offers two technological breakthroughs. Firstly, when the lens is used in an AR or VR headset like the Apple Vision Pro or Meta Quest 3, the company claims it provides crystal-clear vision to the wearer, even if they need glasses but aren’t wearing them.

Even if you require a significant prescription, or suffer from astigmatism, its makers say the lens beams a clear picture directly to your retina, bypassing the need for glasses entirely. In theory, two people could share the same AR or VR headset, even if one has 20/20 vision and the other needs very strong corrective lenses.

I tried several prototypes of the lens and, yes, it works. I don’t wear glasses, so the first demonstration—viewing digital text beamed from a laptop to an augmented reality headset—didn’t seem overly impressive. But then I repeated the test while wearing glasses so strong I couldn’t see my own hands in front of my face, and yet the digital text was still pin-sharp. It’s the sort of tech demo that takes a moment to truly appreciate, but when your brain finally connects the dots it feels like magic.

Not only does the lens sidestep vision impairment, it also produces an image far sharper than the likes of Microsoft’s HoloLens (may it rest in peace), despite the input being the same 720p resolution. With the prototype lens it was easy—and this is where the eye patch comes in—to read a chunk of text where half was augmented and half was printed on a sheet of paper, but with the HoloLens this was impossible. The augmented half of each line was a blurred mess.

Looking Sharp

What’s more, the text remains sharp regardless of what your eyes are actually focused on. I experimented with focusing first on my hand just a few inches from my face, then on the other side of the room, yet the augmented text stayed in focus. Interestingly, its size adjusts depending on where you’re looking; I could make it appear as a tiny font on my finger, or as poster-sized writing on the opposite wall. It’s also possible to look through it, focusing on the middle distance as you might while driving, but no matter what your eyes and their lenses do, the digital text remains sharp and legible.
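If you want a feel for why the same projection can read as fingertip-sized or poster-sized, the geometry is simple: an image beamed onto the retina subtends a roughly constant visual angle, so its apparent linear size scales with whatever distance you happen to be focused at. Here is a minimal Python sketch of that relationship; the 10-degree angular width is an assumed figure for illustration, not a spec from AllFocal Optics.

```python
import math

ANGULAR_WIDTH_DEG = 10.0  # assumed visual angle of the projected text (illustrative)

def apparent_width_m(focus_distance_m: float) -> float:
    """Linear width the image appears to have when the eye is focused at this distance."""
    half_angle = math.radians(ANGULAR_WIDTH_DEG / 2)
    return 2 * focus_distance_m * math.tan(half_angle)

for d in (0.1, 0.5, 3.0):  # fingertip, arm's length, far wall
    print(f"focused at {d} m -> image spans ~{apparent_width_m(d) * 100:.0f} cm")

# focused at 0.1 m -> image spans ~2 cm   (tiny font on a finger)
# focused at 0.5 m -> image spans ~9 cm
# focused at 3.0 m -> image spans ~52 cm  (poster-sized on the far wall)
```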

If that's hard to picture, then this video should help to demonstrate. The projected image remains in focus through a camera, even when the focus distance is adjusted from 12 inches (30 cm) to 78 inches (200 cm) and the rest of the camera’s view becomes blurry. As well as products that sit close to the eye, AllFocal Optics says its technology could be used for other applications too, like the rear-view screens used by some cars in place of mirrors, which can appear blurred to glasses wearers.

Shrestha explains why humans sometimes struggle with existing VR and AR technology. “The way we evolved over thousands of years is that when we see a 3D object in a 3D space, our eyes rotate and we have lenses in our eyes, ocular lenses, that focus at a fixed depth. We have evolved so that our two cues—rotating, which we call vergence, and focusing, which we call accommodation—work in harmony. But in existing AR and VR devices these cues are in conflict.”

This creates what is rather appropriately known as the vergence-accommodation conflict, which can cause our eyes to constantly refocus and confuse our brains as to where the projected object really is. Ultimately, it leads to nausea and headaches. Shrestha explains why his technology is different. “We have no fixed or virtual screen at all, so our image is always in focus. We create a projected image in the retina … similar to retinal projection technology. So now the vergence and accommodation link is decoupled.”
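To put rough numbers on that conflict, here is a back-of-the-envelope Python sketch using two standard optics relations: vergence follows from the geometry of the two eyes, and accommodation demand is one over the focus distance in meters (diopters). The 2-meter focal plane is an assumed figure for a conventional headset, not a measured spec of any particular device.

```python
import math

IPD_M = 0.063         # typical interpupillary distance, about 63 mm
FIXED_SCREEN_M = 2.0  # assumed fixed focal plane of a conventional headset

def vergence_deg(distance_m: float) -> float:
    """Angle the eyes rotate inward to fixate an object at this distance."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

for virtual_m in (0.5, 1.0, 4.0):  # where the rendered object appears to be
    verg = vergence_deg(virtual_m)                      # eyes converge on the virtual depth...
    mismatch = abs(1 / virtual_m - 1 / FIXED_SCREEN_M)  # ...while focus stays pinned to the screen
    print(f"object at {virtual_m} m: vergence {verg:.1f} deg, mismatch {mismatch:.2f} diopters")
```

Run it and the mismatch grows as virtual objects get close (1.50 diopters at half a meter), which is exactly where headsets tend to feel most uncomfortable; retinal projection removes the fixed-screen term from that equation entirely.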

Although Shrestha admits his company didn’t create this decoupling technique (it stems from the nineteenth-century principle of the Maxwellian view), he says AllFocal Optics can make this type of lens commercially viable for the first time.

Staying Focused

Anyone who has experienced nausea in virtual reality will find it easy to imagine how the technology could improve AR and VR headsets, and the company highlights how it could remove the need for corrective lenses in products like the Apple Vision Pro. “Our technology is always in focus,” Shrestha insists. “Whether you have long sightedness, short sightedness, astigmatism or anything, you can see clearly because it’s bypassing any defect or some shortcomings with your ocular lens.”

With a “slightly customized” design, the technology will work for car HUDs too. Instead of re-focusing between the projected interface and the road, “all you need to do is switch your attention, and that takes almost zero reaction time,” Shrestha says. “You can just switch between contexts without having to mechanically shift the ocular lens. That’s the huge value add.”

Part of the demo for AllFocal Optics involves this modified bike helmet.

Courtesy of AllFocal Optics

AllFocal Optics hopes its technology will be adopted by car manufacturers and suppliers of heads-up displays, since the nanophotonic lens removes the need for a driver to switch their focus between the road and the HUD. Current systems are installed into the dashboard and project onto the windshield, creating a digital display roughly in the driver’s eyeline and saving them the need to look down at the dashboard. Despite being in the right place, the projection is not the same distance from the driver’s eyes as the road, and although refocusing between the two happens quickly, AllFocal Optics says the time it takes increases noticeably with age.

With the new lens, the information projected from a heads-up display—things like speed, direction and, more importantly, a potential collision warning—would always be in focus, so it could be read, processed, and acted on that bit quicker. While AllFocal Optics says a driver in their 20s can refocus from the windshield to the road 65 feet (20 meters) away in 0.73 seconds, a driver in their 60s takes 2.51 seconds. At 70 mph, roughly 103 feet per second, that extra couple of seconds is the difference between covering about 75 feet and about 258 feet before reacting.
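The distance figures are just speed multiplied by time; a quick sanity check in Python, using the refocus times AllFocal Optics quotes:

```python
MPH_TO_FPS = 5280 / 3600  # feet per second in 1 mph

def distance_covered_ft(speed_mph: float, refocus_s: float) -> float:
    """Feet traveled while the driver is still refocusing on the road."""
    return speed_mph * MPH_TO_FPS * refocus_s

# Refocus times quoted by AllFocal Optics for the two age groups.
for label, t in (("driver in their 20s", 0.73), ("driver in their 60s", 2.51)):
    print(f"{label}: {t} s at 70 mph -> {distance_covered_ft(70, t):.0f} ft")

# driver in their 20s: 0.73 s at 70 mph -> 75 ft
# driver in their 60s: 2.51 s at 70 mph -> 258 ft
```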

Car manufacturers see the potential, with JLR (formerly Jaguar Land Rover) set to begin a trial this year. Valerian Meijering, JLR’s subject matter expert for extended reality, told WIRED: “Through this research project with AllFocal Optics, we are exploring new ways to present information via heads-up displays in a way that makes it even simpler to read. By further reducing the amount of vision strain and focus that would typically be needed, we could improve cognitive processing time, especially for those with vision impairments, and continue to improve comfort and safety for our clients.”

Rapidly Evolving

HUD systems find themselves in a strange position on the adoption curve. The first cars to feature heads-up displays arrived back in the late 1980s, but even now, almost four decades later, many cars still go without. Tesla doesn’t offer a HUD on any of its vehicles, and even the newest systems offer little more than those from a decade ago. But change is on the horizon.

Meijering added: “Visual display technology is evolving rapidly. Our clients love the benefits of heads-up displays, they are increasingly important to their luxury in-vehicle experience and safety.”

That's likely why AllFocal Optics isn’t the only company with a better HUD in its sights. Porsche’s new electric Macan features a HUD system with augmented reality, where virtual hazard signs attach themselves to whatever danger they are warning the driver about—such as the vehicle you’re following too closely. Audi’s latest HUD places augmented arrows on the road to help with navigation, while BMW first spoke about the potential for augmented heads-up displays in 2011.

Hyundai Mobis, a South Korean parts supplier to the Hyundai, Kia and Genesis car companies, showed off a technology called Holographic HUD at the CES tech show in January 2025. Instead of projecting onto a small portion of the windshield, Hyundai Mobis’ technology can place an image anywhere on the glass.

Photograph: Hyundai

To do this, it enlists a special film embedded with an optical component (called a “Holographic Optical Element”), which uses diffraction to deliver an interface to the eyes of the driver and passenger. Developed alongside German optical company and lens manufacturer Zeiss, the holographic HUD is expected to complete pre-development by the first half of 2026, before heading for mass production “as early as 2027,” according to Hyundai Mobis.

Envisics, another UK startup with brains from Cambridge and backing from JLR, plus General Motors, Hyundai and Stellantis, is also working on its “Dynamic Holography Platform”. It claims this will transform HUDs, with the ability to produce larger, three-dimensional images with greater depth—and the potential for an interface to span three lanes of highway—from a product that is 40 percent smaller and 50 percent more energy efficient.

While the promise of rich, high-resolution augmented reality is what’s needed to wow consumers, the less glamorous benefit of compact packaging is vital if automakers want to bring HUD systems, which tend to be bulky, to smaller, cheaper cars—and energy-saving is key for electric vehicles. Envisics’ first HUD with augmented reality is due to appear in the 2026 Cadillac Lyriq-V, an electric SUV coming later this year.

But the road from holographic dream to (augmented) reality is not always a smooth one. In 2018, Hyundai announced a strategic investment into WayRay, a Swiss startup developing similar technology. The carmaker hoped WayRay’s tech would land in new vehicles as early as 2020, with Dr Youngcho Chi, Hyundai’s chief innovation officer, saying the collaboration would help them "establish a brand new ecosystem that harnesses AR technology to enhance not only navigation systems, but also establish an AR platform for smart city and smart mobility.”

WayRay also attracted investment from Porsche and e-commerce giant Alibaba, and a funding round in 2018 topped $80 million. Advanced AR tech later appeared in WayRay’s own 2021 autonomous taxi concept, the Holograktor, in a bid to “highlight the maturing of the breakthrough holographic technology.” WayRay founder Vitaly Ponomarev said he hoped to homologate the car and release it in 2025.

Two years later, in September 2023, WayRay declared bankruptcy. Although headquartered in Switzerland since 2014, the company was actually founded in Moscow two years earlier, and failed to fully resolve what board director Philippe D. Monnier later described as “problematic ‘Russian angles.’” Despite attempts to distance itself from its origin story in the wake of Russia’s 2022 invasion of Ukraine, including share buybacks and top managers changing their citizenship, a pending $100 million funding round could not be completed.

Back at AllFocal Optics, I have removed the bicycle helmet and pirate’s eye patch. My eyes take a moment to adjust back to the real world, but I'm free of any signs of nausea.

Shrestha believes the lenses could be ready for a small batch of specialist equipment, like night-vision scopes, within a few months, and that an automotive application could be realized in around two years. Tantalizingly, he says the technology can be made to work with the video screens of rear-view cameras too, coming to the aid of drivers whose glasses bring the road and conventional side mirrors into focus but leave digital screens blurred. With cars like the Polestar 4 replacing the rear window with a camera, it’s easy to see why such a technology would be popular.

Improving the reaction times of older drivers and helping glasses wearers see the reversing camera might not be as exciting as autonomous driving. But the crude prototypes of startups like AllFocal Optics—and the messy, unfortunate downfall of others—give a warts-and-all view of where tomorrow’s technology was born, years before it lands at your local car dealer.