Bringing Back Eye Contact: How New Tech is Revolutionizing Remote and Hybrid Meetings

5:41 PM   |   14 June 2025

In the annals of human interaction, few non-verbal cues hold as much power and significance as eye contact. From the earliest face-to-face exchanges in bustling marketplaces to the formal boardrooms of the 20th century, direct gaze has been a cornerstone of communication. It signals attention, fosters trust, conveys sincerity, and deepens connection. When two people share eye contact, their brains exhibit synchronized activity, a phenomenon linked to enhanced bonding, improved focus, and better memory retention during conversation. In the business world, this translates directly into more effective meetings, stronger relationships, and a greater ability to read intentions and emotions – critical elements for successful collaboration and negotiation.

Neuroscience underscores the importance of this seemingly simple act. Brain imaging studies reveal that mutual gaze activates specific regions associated with social cognition and understanding others' mental states, including the fusiform gyrus, medial prefrontal cortex, and amygdala. These areas are vital for processing faces, interpreting social cues, and building the foundation of trust necessary for productive work relationships.

For millennia, all business interactions inherently involved this direct, face-to-face connection. Then, technology began to mediate our communication. The telephone removed the visual element entirely. Email further abstracted communication into text. While video calls, pioneered by technologies like Skype and popularized by platforms like Zoom, promised to bring back the visual, they introduced a new, subtle, yet profound problem: the impossibility of natural eye contact.

The fundamental issue lies in the camera's placement. Typically situated above or beside the screen, the webcam forces a dilemma upon the user: either look at the faces of the people you're talking to on the screen (appearing to look away from them on their screen), or look directly into the camera lens (appearing to make eye contact on their screen, but being unable to see their reactions). This creates a persistent disconnect, a subtle but constant reminder that the interaction is mediated and artificial. While a phone call lacks visual information altogether, a standard video call provides visual information that *incorrectly* signals a lack of attention, even when participants are fully engaged.

Over the past few decades, as remote and distributed workforces have grown, we have collectively experienced a gradual erosion of the benefits that natural eye contact provides in professional settings – the enhanced trust, the deeper bonding, the improved focus, and the clearer understanding of subtle cues. The COVID-19 pandemic accelerated this shift dramatically, pushing video calls from a convenient option to the primary mode of business communication for many. The awkwardness and lack of connection became palpable for millions.

However, the same force that arguably diminished face-to-face connection – technology – is now poised to restore it. A new wave of innovation is directly addressing the eye contact problem, promising to make remote and hybrid meetings feel more natural, engaging, and effective than ever before.

High-Fidelity Telepresence: The HP Dimension and Google Beam

One of the most significant recent advancements in bringing realism, including eye contact, back to remote interactions comes from a collaboration between HP and Google. Unveiled at InfoComm 2025, the new HP Dimension system integrates Google's advanced Beam technology, formerly known as Project Starline, making this high-fidelity telepresence solution available to businesses for the first time.

Project Starline, when first demonstrated by Google, captured the imagination with its promise of 'magical' realism in video calls. It uses a combination of cutting-edge hardware and sophisticated AI. The HP Dimension system incorporates six cameras, a spatial audio setup, and adaptive lighting to capture a user's likeness and environment in detail. This data is then processed by Google Beam's AI video model, which constructs a life-sized, 3D representation of the person on the other end. Displayed on a large, 65-inch light field display, the result is an image with striking depth and realism, tracking head movements at 60 frames per second. This allows participants to naturally make eye contact and perceive subtle non-verbal cues like micro-expressions, creating a sense of shared presence that standard video calls cannot replicate. Users have described the experience as feeling so real that it seems possible to reach out and touch the person or objects displayed.

Beyond the visual fidelity, Google is also developing features like real-time speech translation for Beam, aiming to break down language barriers while preserving the speaker's natural voice and tone. The technology is already gaining traction, with major companies like Deloitte, Salesforce, Citadel, NEC, Hackensack Meridian Health, Duolingo, and Recruit reportedly planning deployments, often facilitated by integration partners like Diversified and AVI-SPL.

The HP Dimension system comes with a significant price tag of $25,000, plus a separate, yet-to-be-announced fee for the Google Beam service. At first glance, this cost might seem prohibitive for many organizations. However, the value proposition becomes clear when considering the potential for replacing expensive business travel.

Business travel represents a substantial expenditure for many companies. Costs include flights, accommodation, meals, ground transportation, and lost productivity during transit. While average costs vary widely depending on destination and duration, even a conservative estimate highlights the potential savings. If we assume an average domestic trip costs around $350 and an average international trip costs $2,600, a mix of just 17 such trips, weighted toward international travel, could exceed the $25,000 price of an HP Dimension unit. For companies with employees who travel frequently for meetings, the technology could pay for itself in a matter of months.
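
As a rough illustration, that break-even arithmetic can be written out explicitly. The sketch below uses only the averages assumed above; the specific trip mix is hypothetical, not actual travel data.

```python
# Break-even sketch using the article's assumed averages:
# $350 per domestic trip, $2,600 per international trip,
# against the HP Dimension's $25,000 hardware price.
DOMESTIC_TRIP = 350
INTERNATIONAL_TRIP = 2_600
HP_DIMENSION_PRICE = 25_000

def trips_cost(domestic: int, international: int) -> int:
    """Estimated travel spend for a given mix of trips."""
    return domestic * DOMESTIC_TRIP + international * INTERNATIONAL_TRIP

# A hypothetical mix of 17 trips, weighted toward international travel:
total = trips_cost(domestic=8, international=9)
print(total, total > HP_DIMENSION_PRICE)  # 26200 True
```

On these figures, the hardware pays for itself once roughly ten international trips (about $26,000) are avoided, which is why the economics hinge so heavily on how often a team travels internationally.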

The calculation becomes even more compelling when considering the cumulative cost of travel across multiple employees or departments. Investing in high-fidelity telepresence could allow businesses to significantly reduce travel budgets while maintaining or even improving the quality of interpersonal communication during remote meetings. The primary condition, of course, is that both parties participating in the meeting must have access to compatible technology to achieve the full effect of mutual eye contact and shared presence.

Augmented Reality and Spatial Computing: Apple Vision Pro

While high-end telepresence systems like HP Dimension offer a dedicated room-based solution, another approach leverages personal spatial computing devices. Apple's Vision Pro headset, initially launched with features for immersive computing and entertainment, is increasingly positioning itself as a tool for collaboration and communication, particularly with the introduction of enhanced features in its visionOS operating system.

At its developers conference, Apple announced visionOS 26, which includes significant upgrades to its Spatial Personas feature. Spatial Personas are 3D avatars designed to represent users during FaceTime calls and virtual meetings within the Vision Pro environment. The goal is to create a more immersive and naturalistic meeting experience where participants feel like they are together in a shared virtual space.

Early iterations of Personas faced criticism for their somewhat uncanny and artificial appearance, with issues ranging from unnatural movements to visual glitches. However, Apple claims that the latest updates, powered by "industry-leading volumetric rendering and machine learning technology," dramatically improve the realism and expressiveness of these avatars. Enhanced facial tracking and rendering mean that subtle expressions like smiles and laughs are translated more naturally onto the Persona, capturing nuances that were previously lost.

The process for creating a Persona still involves a 3D scan of the user's face and upper body using the Vision Pro's external cameras. While the capture method remains similar, the rendering and animation pipeline has been refined to produce a more lifelike representation. According to Apple, these improved Personas are specifically designed to facilitate more meaningful connections during remote interactions.

Crucially, for the purpose of restoring eye contact, Spatial Personas enable users wearing the Vision Pro headset to appear as if they are making direct eye contact with other participants in the virtual meeting space. The system maps the user's real-world gaze and translates it to the avatar's eyes, overcoming the camera placement issue inherent in traditional video calls. While Spatial Persona meetings can accommodate up to five participants interacting in a shared spatial environment, users who are not using a Vision Pro or have not enabled the feature will still appear as traditional 2D video tiles.
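
As a purely conceptual illustration of that gaze mapping (this is not Apple's rendering pipeline or API; the names and coordinates below are hypothetical), the core idea is to aim the avatar's eyes at the position of whichever participant the wearer is actually looking at in the shared scene, rather than at wherever a physical camera happens to sit:

```python
import numpy as np

def avatar_gaze_direction(avatar_eye_pos, looked_at_pos):
    """Unit vector the renderer would aim the avatar's pupils along,
    pointing from the avatar's eyes toward the participant the wearer
    is actually looking at in the shared virtual space."""
    d = np.asarray(looked_at_pos, float) - np.asarray(avatar_eye_pos, float)
    return d / np.linalg.norm(d)

# Hypothetical shared-space coordinates in metres:
my_avatar_eyes = [0.0, 1.6, 0.0]        # where my avatar's eyes sit
participant_b_head = [0.8, 1.6, 2.0]    # participant the headset says I'm looking at

print(avatar_gaze_direction(my_avatar_eyes, participant_b_head))
```

Because the eye direction is derived from the tracked gaze target rather than from camera placement, the on-screen mismatch of conventional video calls simply does not arise.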

The Apple Vision Pro headset carries a price tag of $3,500. While significantly less expensive than the HP Dimension system, it still represents a substantial investment for an individual or a company equipping its employees. However, applying the same logic as with the HP Dimension, the cost can be offset by reducing business travel. Replacing one average international trip and a few domestic trips (roughly $3,650 by the earlier estimates) would cover the purchase price of a Vision Pro headset for a frequent traveler.

The Vision Pro approach represents a different paradigm – a personal, portable device that creates a shared virtual environment, as opposed to a fixed, room-based telepresence system. It offers flexibility for individuals working from various locations but requires each participant to own and wear the headset for the full immersive and eye-contact-enabled experience.

More Accessible Solutions: Dedicated Camera Hardware

Beyond high-end systems and spatial computing headsets, more accessible hardware solutions are emerging specifically to address the eye contact problem in standard video calls. These devices aim to reposition the webcam or manipulate the image to create the illusion or reality of direct gaze without requiring specialized displays or headsets.

One such solution is the iContact Camera Pro. This 4K webcam features a unique retractable arm that allows the camera lens to be positioned directly in the user's line of sight, typically just below or between the eyes displayed on the screen. By looking at the screen to see the other person, the user is simultaneously looking directly into the camera lens, achieving natural eye contact. The adjustable arm allows for fine-tuning the camera's position. The camera offers real-time adjustments for video and audio settings, connects via USB-C, and is designed to be compact and foldable for portability. It is compatible with major video conferencing platforms like Zoom, Microsoft Teams, and Google Meet.

Another similar concept is the Center Cam. This small webcam is designed to hang in the middle of your screen, utilizing a small aperture or transparent mounting system to place the camera lens directly at the focal point of your gaze when looking at the screen. This "eye-to-eye" camera technology aims to make video calls feel more like face-to-face conversations by ensuring your gaze appears directed at the other person. It offers HD video quality and works with standard setups.

The PlexiCam offers a different, more universal approach. Instead of being a webcam itself, it's a clear, adjustable mount designed to hold *any* standard webcam. The mount attaches to your monitor and positions the webcam directly in the center of your screen, allowing you to maintain eye contact while looking at the video feed of the other participant. The transparency of the mount minimizes obstruction of the screen. It works with various cameras and monitors and can be easily repositioned. The latest version, PlexiCam Mag2, uses a magnetic base for increased flexibility in placement.

These hardware solutions offer a significantly lower barrier to entry compared to HP Dimension or Apple Vision Pro. They are relatively inexpensive peripherals that can be added to an existing computer setup. While they may not offer the same level of immersive realism as the higher-end options, they effectively solve the core problem of misdirected gaze in traditional video calls, making interactions feel more direct and personal.

Software-Based Eye Gaze Correction

Finally, technology is also tackling the eye contact issue through software alone, using artificial intelligence to digitally adjust the user's gaze in real-time video feeds. Casablanca AI is an example of this approach.

Casablanca AI is software that runs on compatible computers (Windows and Macs with Apple Silicon). It uses AI and Generative Adversarial Networks (GANs) to analyze the user's video feed and subtly alter the direction of their eyes and head angle. The goal is to make it appear as though the user is looking directly into the camera, even if they are looking slightly off-center at the screen. The software aims to maintain natural facial expressions and gestures while performing this gaze correction.

A key feature highlighted by Casablanca AI is its ability to handle instances where the user looks significantly away from the screen. In such cases, the software reportedly does not attempt to force a fake, fixed gaze but allows the eyes to shift naturally, mimicking real-world behavior. This prevents the uncanny valley effect that can occur with overly aggressive digital manipulation.
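
To make that behaviour concrete, the sketch below shows what a per-frame correction loop of this general kind might look like. It is a hypothetical illustration, not Casablanca AI's implementation: the two helper functions are stubs standing in for the landmark-tracking and generative (GAN) models a real product would use, and the threshold value is invented.

```python
# Conceptual per-frame gaze-correction loop (illustrative only).
import cv2
import numpy as np

MAX_CORRECTABLE_OFFSET_DEG = 15.0  # hypothetical cutoff: beyond this, leave the gaze untouched

def estimate_gaze_offset(frame: np.ndarray) -> float:
    """Hypothetical stub: return the angle (degrees) between the user's gaze
    and the camera axis, e.g. derived from facial landmarks."""
    return 0.0

def synthesize_corrected_eyes(frame: np.ndarray, offset_deg: float) -> np.ndarray:
    """Hypothetical stub: re-render the eye region so the gaze points at the
    lens (a generative model would do this in a real product)."""
    return frame

def process_stream(camera_index: int = 0) -> None:
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            offset = estimate_gaze_offset(frame)
            # Only correct small offsets (the user looking at the screen); if the
            # user genuinely looks away, pass the frame through unchanged to
            # avoid an unnatural, fixed stare.
            if abs(offset) <= MAX_CORRECTABLE_OFFSET_DEG:
                frame = synthesize_corrected_eyes(frame, offset)
            cv2.imshow("corrected feed", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    process_stream()
```

The pass-through branch mirrors the behaviour described above: small offsets are corrected toward the lens, while larger ones are left alone so that genuinely looking away still reads as looking away.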

Software solutions like Casablanca AI offer the lowest cost of entry, often available via subscription ($7/month, $20/year) or a lifetime license ($200). They require no additional hardware beyond the user's existing webcam and computer. However, the effectiveness and naturalness of the gaze correction can vary depending on the quality of the AI, the user's setup, and lighting conditions.

As with some hardware solutions and the Apple Vision Pro, the full benefit of mutual eye contact in a software-corrected call is realized when both participants are using compatible technology. If only one person uses the software, their gaze will appear corrected to the other person, but the other person's gaze will still appear misdirected from the perspective of the person using the software.

The Impact and Future Outlook

The re-introduction of authentic or simulated eye contact into remote communication is more than just a technical novelty; it has significant implications for the future of work. As hybrid and remote models become permanent fixtures for many organizations, maintaining strong interpersonal connections and effective communication is paramount. The ability to make eye contact can foster a greater sense of presence, increase engagement during meetings, improve understanding, and build stronger rapport among colleagues and with external partners.

For sales calls, negotiations, job interviews, and sensitive team discussions, the non-verbal cues conveyed through eye contact are invaluable. Restoring this element can lead to more successful outcomes and deeper professional relationships. Furthermore, by making remote interactions feel more natural and effective, these technologies strengthen the case for reducing non-essential business travel, leading to significant cost savings and environmental benefits.

The range of solutions available, from high-fidelity telepresence costing tens of thousands of dollars to affordable webcam peripherals and software subscriptions, means that organizations and individuals can choose the technology that best fits their needs and budget. While the most immersive experiences currently come with the highest price tags, the increasing focus on this problem suggests that eye contact correction features may become more common and integrated into standard video conferencing platforms and devices over time.

The journey from face-to-face meetings to mediated digital interactions led to an unintended consequence: the loss of natural eye contact. Now, a new generation of technology is specifically designed to reclaim this vital element of human connection. Whether through sophisticated 3D rendering, augmented reality avatars, clever hardware design, or intelligent software, the future of remote communication is looking directly at you. And as these technologies continue to improve and become more widespread, they promise not only to enhance our virtual interactions but also to reshape how businesses operate, potentially making expensive business travel a relic of the past for many routine meetings.

The ability of technology to both create and solve problems is on full display here. What was lost in the transition to digital communication is now being meticulously engineered back in. The future of remote and hybrid work looks set to be one where authentic connection, facilitated by technologies that understand and replicate fundamental human interaction cues like eye contact, is not just possible, but commonplace.