Revolutionizing Car HUDs and AR/VR: AllFocal Optics' Lens Tech Tackles Vision Problems and Road Safety
New technology doesn’t arrive fully formed. Sometimes, the foundations of a significant leap forward can be found in unassuming places, demonstrated with prototypes that look more like science experiments than polished consumer products. That was the case when I explored the work of AllFocal Optics, a startup based on the outskirts of Cambridge, England, whose patented nanophotonic lens technology holds the potential to transform everything from augmented and virtual reality headsets to night vision goggles, binoculars, cameras, and, crucially, heads-up displays (HUDs) in cars.
The journey to understanding this potential began with strapping on a pirate’s eye patch and placing a heavily modified bicycle helmet onto my head. It wasn't the most glamorous setup, but it was the gateway to experiencing a technology that promises crystal-clear digital images, regardless of the viewer's natural vision or where their eyes are focused. This capability is particularly significant for car HUDs, where Jaguar Land Rover (JLR) has embarked on a research project with AllFocal Optics to investigate how this lens can improve the technology and, with it, road safety.
AllFocal Optics, founded in 2022 (originally named Lark), is led by Dr. Pawan Shrestha, a former Royal Academy of Engineering enterprise fellow. The company recently secured a $5.3 million funding round, attracting talent like Dr. Ash Saulsbury, former technology VP at Microsoft and former Meta AR boss, who joined as chair. This combination of patented technology, significant funding, and experienced leadership positions AllFocal Optics to address long-standing challenges in display technology.
The Fundamental Problem with Existing Displays: The Vergence-Accommodation Conflict
To understand the significance of AllFocal Optics' breakthrough, it's essential to grasp a fundamental challenge inherent in many modern display technologies, particularly augmented and virtual reality, and even current car HUDs: the vergence-accommodation conflict.
Our visual system evolved over millions of years to perceive a three-dimensional world. When we look at an object, our eyes perform two primary actions in harmony:
- **Vergence:** Our eyes rotate inward (converge) when looking at nearby objects and straighten (diverge) when looking at distant objects. This helps us perceive depth through stereopsis (the slightly different view each eye gets).
- **Accommodation:** The lens inside our eye changes shape (accommodates) to focus light from the object onto the retina, ensuring a sharp image. The lens becomes more convex for near objects and flatter for far objects.
In the natural world, vergence and accommodation are tightly coupled. When you look at something close, your eyes converge *and* your lenses accommodate for a near focal distance. When you look far away, your eyes diverge *and* your lenses accommodate for a far focal distance. This natural link is crucial for comfortable and clear vision in 3D space.
However, many existing AR and VR devices, and even car HUDs, present digital images at a fixed focal distance from the viewer's eyes. A VR headset's screens sit just inches from your face, but its lenses place the image at a fixed optical distance, typically a meter or two away; a car HUD projects onto the windshield, creating a virtual image that appears several feet or even meters in front of the driver.

The conflict arises because the digital content often depicts objects that appear to be at varying depths within the virtual or augmented scene. Your brain tells your eyes to converge based on the perceived depth of the virtual object (e.g., converging as if looking at something 10 feet away), but your accommodation mechanism focuses on the display's fixed focal distance, whatever depth the content appears to have. This mismatch, known as the vergence-accommodation conflict, forces your eyes and brain into an unnatural state.
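To put rough numbers on that mismatch, here is a minimal Python sketch of the two demands. The 63 mm interpupillary distance, the 2 m focal plane, and the 0.5 m virtual object depth are illustrative assumptions, not figures from AllFocal Optics:

```python
import math

IPD_M = 0.063  # typical interpupillary distance (~63 mm); illustrative assumption

def accommodation_diopters(distance_m: float) -> float:
    """Focusing demand on the eye's lens: 1 / distance, in diopters."""
    return 1.0 / distance_m

def vergence_degrees(distance_m: float) -> float:
    """Total inward rotation of the two eyes toward a target at this distance."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

focal_plane_m = 2.0     # fixed optical distance of the display (illustrative)
virtual_object_m = 0.5  # depth at which the rendered object appears to sit

# Vergence follows the rendered depth; accommodation stays locked to the focal plane.
print(f"vergence demand:      {vergence_degrees(virtual_object_m):.2f} deg (object at {virtual_object_m} m)")
print(f"accommodation demand: {accommodation_diopters(focal_plane_m):.2f} D   (screen at {focal_plane_m} m)")
mismatch = accommodation_diopters(virtual_object_m) - accommodation_diopters(focal_plane_m)
print(f"focal mismatch:       {mismatch:.2f} D")
```

Any display that pins accommodation to one focal plane while vergence roams with the content produces a nonzero mismatch like this, which is the root of the discomfort described above.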
The consequences of this conflict can range from eye strain and fatigue to headaches and even nausea, commonly experienced by users of VR headsets. For car HUDs, the effect might be less pronounced given the simpler nature of the displayed information (speed, navigation prompts, warnings), but the driver's eyes still need to refocus between the virtual image projected by the HUD and the actual road ahead. That refocusing may be quick, but it still takes a measurable amount of time and cognitive effort.
Furthermore, existing HUDs project a virtual image that is not at the same focal distance as the road. The driver's eyes must constantly switch focus between the road (effectively at optical infinity) and the HUD's image (a fixed, closer distance). While young eyes can make this switch quickly, the ability to rapidly change focus diminishes significantly with age as the eye's lens stiffens, a condition known as presbyopia. This age-related decline means older drivers take noticeably longer to refocus between the HUD and the road, potentially impacting reaction times in critical situations.
AllFocal Optics' Breakthrough: Decoupling Vergence and Accommodation
AllFocal Optics claims its nanophotonic lens technology offers a solution to these problems by effectively decoupling the vergence and accommodation link. Dr. Shrestha explains, “We have no fixed or virtual screen at all, so our image is always in focus. We create a projected image in the retina … similar to retinal projection technology. So now the vergence and accommodation link is decoupled.”
This approach is based on the principle of the Maxwellian view, a concept dating back to the nineteenth century. In a Maxwellian view system, light from a display is focused directly through a small spot at the pupil of the eye, creating an image on the retina that is always in focus, regardless of the eye's accommodation state. Because the light enters through such a narrow region of the pupil, the eye behaves much like a pinhole camera: depth of focus becomes effectively unlimited, and the state of the eye's own lens no longer determines image sharpness.
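A back-of-envelope geometric-optics sketch illustrates why the narrow beam matters. The 4 mm pupil, 0.5 mm beam width, and 2-diopter focus error below are illustrative assumptions, not AllFocal Optics' actual figures:

```python
import math

def angular_blur_arcmin(beam_width_m: float, defocus_diopters: float) -> float:
    """Approximate angular blur for a defocused eye.

    In simple geometric optics, the blur angle (radians) scales with the
    width of the light bundle at the pupil times the focus error in diopters.
    """
    return math.degrees(beam_width_m * abs(defocus_diopters)) * 60

DEFOCUS_D = 2.0  # diopters of focus error, e.g. an uncorrected prescription

for label, beam_m in [("full 4 mm pupil", 0.004), ("0.5 mm Maxwellian beam", 0.0005)]:
    print(f"{label}: ~{angular_blur_arcmin(beam_m, DEFOCUS_D):.1f} arcmin of blur")

# The blur shrinks in direct proportion to the beam width (8x here), which is
# why a Maxwellian image can stay legible even when the eye is badly out of focus.
```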
While the Maxwellian view principle isn't new, AllFocal Optics claims to have developed a nanophotonic lens that makes this type of display commercially viable for the first time. The technology embedded within the lens directs light into the eye so that the image appears in focus at all distances simultaneously, or, more precisely, so that the eye never needs to accommodate to any specific distance at all.
The demonstration, using the modified bicycle helmet and a butchered Meta Quest 3 headset, provided compelling evidence of this capability. Digital text beamed through the prototype lens appeared perfectly sharp. The true test came when I repeated the demonstration wearing glasses with a prescription so strong that the real world became an unreadable blur. Yet the digital text viewed through the prototype lens remained pin-sharp. This suggests the technology can indeed bypass the need for corrective lenses, delivering crystal-clear digital content even to people with significant vision impairments such as long-sightedness, short-sightedness, or astigmatism.
Furthermore, the demonstration highlighted another key benefit: the image remains sharp regardless of where the viewer's eyes are focused in the real world. Whether focusing on a hand just inches away or a wall across the room, the augmented text stayed in focus. This is a direct consequence of the vergence-accommodation decoupling. The size of the perceived digital image could also be adjusted, appearing small on a finger or large on a distant wall, adding another layer of flexibility to the display.
A video demonstration further illustrates the point, showing a projected image that stays in focus through a camera even as the camera's focus is shifted dramatically, blurring the background. Nor is the capability limited to displays viewed close to the eye; AllFocal Optics believes its technology could also improve the rear-view screens used in some cars, which can appear blurred to drivers whose glasses are focused for distance vision.
Impact on Automotive Heads-Up Displays and Road Safety
The application of AllFocal Optics' technology to car HUDs is particularly exciting due to its potential impact on road safety. Current HUD systems project information like speed, navigation, and warnings onto the windshield, aiming to keep the driver's eyes closer to the road ahead compared to looking down at the dashboard. However, as discussed, the need to refocus between the HUD projection and the actual road introduces a delay.
With AllFocal Optics' lens, the information projected from the HUD would always be in focus for the driver, regardless of whether they are focused on the road 20 meters away or a hazard just ahead. Dr. Shrestha emphasizes that “all you need to do is switch your attention, and that takes almost zero reaction time. You can just switch between contexts without having to mechanically shift the ocular lens. That’s the huge value add.”
This reduction in refocusing time is not trivial, especially for older drivers. AllFocal Optics highlights that while a driver in their 20s might take around 0.73 seconds to refocus from the windshield to the road 65 feet away, a driver in their 60s could take up to 2.51 seconds. At highway speeds of 70 mph, that difference of nearly two seconds translates to covering an additional 184 feet (over 56 meters) before the driver's eyes are fully focused on the road again. In an emergency braking situation or when reacting to a sudden hazard, this difference in reaction time could be critical.
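A quick back-of-envelope check of those numbers, using only the figures quoted above plus the standard mph-to-feet-per-second conversion:

```python
MPH_TO_FT_PER_S = 5280 / 3600  # 1 mph ≈ 1.47 ft/s

refocus_20s_s = 0.73  # refocusing time, driver in their 20s (quoted figure)
refocus_60s_s = 2.51  # refocusing time, driver in their 60s (quoted figure)
speed_mph = 70

extra_time_s = refocus_60s_s - refocus_20s_s
extra_distance_ft = extra_time_s * speed_mph * MPH_TO_FT_PER_S

print(f"extra refocusing time: {extra_time_s:.2f} s")
print(f"extra distance covered at {speed_mph} mph: {extra_distance_ft:.0f} ft "
      f"({extra_distance_ft * 0.3048:.0f} m)")
# ≈ 183 ft (≈ 56 m), consistent with the "additional 184 feet" cited above.
```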
Recognizing this potential, Jaguar Land Rover (JLR) is set to begin a trial of AllFocal Optics' technology this year. Valerian Meijering, JLR’s subject matter expert for extended reality, stated, “Through this research project with AllFocal Optics, we are exploring new ways to present information via heads-up displays in a way that makes it even simpler to read. By further reducing the amount of vision strain and focus that would typically be needed, we could improve cognitive processing time, especially for those with vision impairments, and continue to improve comfort and safety for our clients.” This indicates a clear interest from a major automotive player in leveraging this technology to enhance both the user experience and the safety credentials of their vehicles.
Beyond traditional HUD information, the ability to provide a consistently focused image is crucial for the next generation of augmented reality (AR) HUDs. These systems aim to overlay digital information directly onto the real world, such as navigation arrows that appear to sit on the road or warnings that highlight specific objects like pedestrians or other vehicles. For these AR elements to be truly effective and non-distracting, they must appear stable and in focus relative to the real-world objects they are augmenting. AllFocal Optics' technology could be key to achieving this seamless integration, ensuring that AR overlays are always sharp and legible, regardless of the driver's focus point.
The Evolving Landscape of Automotive Display Technology
Heads-up displays have been around for decades, first appearing in cars in the late 1980s. Yet, their adoption has been relatively slow, and the technology in many current systems offers only incremental improvements over earlier versions. However, the automotive industry is now seeing a rapid evolution in display technology, driven by advancements in AR, VR, and holographic techniques, alongside increasing consumer demand for more sophisticated in-car interfaces.
JLR's interest in AllFocal Optics is part of a broader trend. Valerian Meijering notes, “Visual display technology is evolving rapidly. Our clients love the benefits of heads-up displays, they are increasingly important to their luxury in-vehicle experience and safety.” This suggests that advanced HUDs are becoming a key differentiator for automakers, particularly in the luxury segment.
Several other companies and manufacturers are also actively developing next-generation HUD and AR display technologies:
- **Porsche:** The new electric Macan features an AR HUD that can place virtual hazard signs directly onto the real-world objects they relate to, such as highlighting a vehicle being followed too closely.
- **Audi:** Their latest HUD systems use augmented arrows overlaid on the road surface to provide intuitive navigation guidance.
- **BMW:** Has been exploring the potential of augmented heads-up displays since at least 2011, demonstrating a long-term interest in the technology.
- **Hyundai Mobis:** A major South Korean automotive supplier, showcased a Holographic HUD at CES in January 2025. Developed with German optical company Zeiss, this technology uses a special film with a Holographic Optical Element (HOE) embedded in the windshield. This allows the display to be placed anywhere on the windshield, not just a limited area, and uses diffraction to deliver images to the driver and passenger. Hyundai Mobis expects this holographic HUD to be ready for mass production as early as 2027.
- **Envisics:** Another UK-based startup, backed by JLR, General Motors, Hyundai, and Stellantis, is developing a “Dynamic Holography Platform.” It claims its technology can produce larger, three-dimensional images with greater depth, potentially spanning multiple lanes of a highway, while being 40 percent smaller and 50 percent more energy-efficient than current systems. The first vehicle expected to feature an Envisics AR HUD is the 2026 Cadillac Lyriq-V.
The focus on compact packaging and energy efficiency, highlighted by Envisics, is particularly important. Current HUD systems can be bulky, making them difficult to integrate into smaller or less expensive vehicles. For electric vehicles, minimizing the power consumption of all systems, including displays, is also crucial for maximizing range. Innovations that reduce size and power draw will be key to the widespread adoption of advanced HUDs.
Challenges and the Cautionary Tale of WayRay
While the technological promise of companies like AllFocal Optics, Hyundai Mobis, and Envisics is significant, the path from cutting-edge prototype to mass-produced automotive component is often long and fraught with challenges. The story of WayRay, a Swiss startup also developing holographic AR technology for cars, serves as a stark reminder of these difficulties.
In 2018, WayRay attracted significant investment, including from Hyundai, Porsche, and Alibaba, raising over $80 million in a funding round. Hyundai's chief innovation officer at the time, Dr. Youngcho Chi, spoke ambitiously about establishing a “brand new ecosystem that harnesses AR technology to enhance not only navigation systems, but also establish an AR platform for smart city and smart mobility.” WayRay even developed its own autonomous taxi concept, the Holograktor, featuring its advanced AR tech, with plans for homologation and release by 2025.
However, despite the promising technology and significant investment, WayRay declared bankruptcy in September 2023. According to board director Philippe D. Monnier, the company failed to fully resolve “problematic ‘Russian angles’.” Although headquartered in Switzerland since 2014, WayRay was founded in Moscow in 2012. In the wake of Russia's 2022 invasion of Ukraine, attempts to distance the company from its Russian origins, including share buybacks and changes to the citizenship of its management, proved insufficient to secure a pending $100 million funding round, leading to its downfall.
The WayRay story underscores that technological innovation alone is not enough. Factors like geopolitical risks, the ability to navigate complex funding landscapes, and the practicalities of automotive integration and supply chains are equally critical for success in bringing new technologies to market. Automakers operate on long development cycles, and suppliers must demonstrate not only groundbreaking technology but also reliability, scalability, and financial stability.
The Road Ahead for AllFocal Optics
Back in Cambridge, after removing the prototype helmet and eyepatch, my eyes adjusted back to the real world without any lingering nausea – a small but telling detail supporting the claims about decoupling vergence and accommodation. Dr. Shrestha is optimistic about the timeline for AllFocal Optics' technology.
He believes the lenses could be ready for integration into small batches of specialist equipment, such as high-end night vision scopes, within a few months. An automotive application, which requires more rigorous testing and integration into complex vehicle systems, is estimated to be around two years away. This timeline aligns with the typical development cycles seen in the automotive industry for incorporating new core technologies.
Beyond the primary HUD application, the potential to improve the clarity of digital rear-view screens is another compelling use case. As cars like the Polestar 4 replace traditional rear windshields and mirrors with camera-based systems displayed on screens, ensuring these displays are clear and easy to read for all drivers, including those who wear glasses, becomes increasingly important for safety and usability. AllFocal Optics' technology could directly address the issue where glasses optimized for distance vision make dashboard-mounted screens appear blurred.
While the headlines in automotive technology often focus on autonomous driving or electric vehicle range, incremental improvements in fundamental areas like driver displays can have a significant and immediate impact on safety and the driving experience. Technologies that reduce cognitive load, improve reaction times, and enhance clarity for drivers with varying vision capabilities are vital steps towards making our roads safer for everyone.
The crude prototypes demonstrated by startups like AllFocal Optics, and the challenging journey faced by others like WayRay, offer a realistic glimpse into the often messy and difficult process of innovation. Years before a new technology appears seamlessly integrated into a production vehicle on a dealership lot, it begins with fundamental research, laboratory demonstrations, and partnerships aimed at proving its real-world viability. AllFocal Optics' nanophotonic lens technology represents one such promising development, tackling a core visual challenge that could pave the way for clearer, safer, and more comfortable display experiences in our cars and beyond.
The successful implementation of this technology in automotive HUDs could mark a significant step forward, moving beyond simply projecting information onto the windshield to creating truly integrated, always-in-focus displays that enhance situational awareness and reduce driver distraction. As the automotive industry continues its rapid technological transformation, innovations like the one being pursued by AllFocal Optics will play a crucial role in shaping the future of the in-car experience and contributing to enhanced road safety for drivers of all ages and vision capabilities.