Innovation

With AI, talking to one’s future self may soon be normal

October 17, 2024 | By Chris Mullen

About In Tech

In Tech is our regular feature highlighting what people are talking about in the world of technology — everything from crypto and NFTs to smart cities and cybersecurity. 

In a time when technology can feel detached, researchers are working to make our digital interactions more personal. Imagine an AI that allows you to glimpse your future self. Everyday eyeglasses integrated with augmented reality — no bulky goggles needed. Gamers transcending the screen by exchanging biosignals with online teammates or opponents to enhance the social connection and experience. Behind every screen, there is still the potential for improving the human experience.

‘Mirror, mirror on the wall, who will I become after all?’

Unlike the unsettling tales of "Black Mirror," one innovation is turning that narrative on its head. Researchers from MIT and other institutions have developed a system called "Future You" that lets users hold a text-based conversation with an AI-generated simulation of their future selves. The aim is to strengthen users' sense of self-continuity, which in turn can positively influence long-term decision-making.

A generative AI-produced age progression of a young woman to an elderly woman.

The system uses a large language model to create a relatable virtual version of the user at age 60, generating "future self-memories" based on the user's current life and goals. This makes the simulation more realistic and interactive, helping reduce anxiety and strengthen the connection with one’s future self. Consequently, users may make better choices, such as saving more or seeking new educational opportunities.
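The researchers haven't published a drop-in recipe here, but the core pattern of conditioning a large language model on a user's profile and a synthesized "future self-memory" is easy to sketch. The Python snippet below is an illustration only, not the MIT team's implementation; the profile fields, prompt wording, model name, and use of the OpenAI chat client are all assumptions.

```python
from openai import OpenAI  # assumes the openai package and an OPENAI_API_KEY env var

# Hypothetical user profile and goal, as gathered from an intake questionnaire.
profile = {
    "name": "Alex",
    "age": 24,
    "occupation": "nursing student",
    "goal": "open a community health clinic",
}

# A synthesized "future self-memory": a plausible recollection the 60-year-old
# persona can draw on, derived from the user's stated goal.
future_memory = (
    "You remember the nervous first year after opening your clinic, "
    "and how saving steadily in your twenties made it possible."
)

system_prompt = (
    f"You are {profile['name']} at age 60, looking back on your life. "
    f"You were once a {profile['age']}-year-old {profile['occupation']} "
    f"whose goal was to {profile['goal']}. "
    f"Speak warmly in the first person and draw on this memory: {future_memory}"
)

client = OpenAI()
reply = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Was it worth putting money aside every month?"},
    ],
)
print(reply.choices[0].message.content)
```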

An initial study with 344 participants showed that those who interacted with "Future You" reported a stronger connection with their future selves and less anxiety about the future. Researchers are now exploring specific applications, such as career exploration and understanding the impact of everyday choices.

Connecting gamers heart to heart

In the realm of online gaming, where avatars in virtual worlds often replace face-to-face interactions, the quest for genuine social connection can feel elusive. But what if the heartbeat of your gaming partner could bridge that gap?

In earlier gaming eras, playing video games side by side on the couch created a social experience that online gaming has struggled to replicate. The study described below demonstrates that online communication tools often fall short of meeting the human need for fulfilling social interactions.

Researchers at Japan’s University of Tsukuba have discovered a simple yet groundbreaking method to enhance the sense of social presence in online interactions by sharing biosignals. Imagine playing a game and not just seeing your opponent's avatar, but also feeling their excitement or calm through their heart rate. This innovative approach transforms online gaming into a more meaningful and fulfilling social experience, bringing players closer together in a way that transcends the screen.

The study involved 20 gamers playing soccer video game matches under various conditions, including viewing live video and heart rate information of their opponents. The combination of biometric information and face video was found to closely replicate the social presence experienced when playing together in the same room, making online interactions more meaningful. Now it’s up to developers to implement these mechanics in innovative and enriching ways for gamers.
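The researchers report the effect rather than a protocol, but the plumbing for sharing a biosignal is straightforward: sample the player's heart rate and send it to the other player alongside the video or game-state stream. The Python sketch below is purely illustrative; the sensor read, network address, and message format are hypothetical.

```python
import json
import socket
import time

OPPONENT_ADDR = ("203.0.113.7", 9999)  # hypothetical opponent address and port


def read_heart_rate() -> int:
    """Placeholder for a real sensor read (e.g., a chest strap or wristband)."""
    return 72


sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# Send one heart-rate sample per second alongside the usual game traffic.
for _ in range(10):
    sample = {"player": "p1", "bpm": read_heart_rate(), "ts": time.time()}
    sock.sendto(json.dumps(sample).encode(), OPPONENT_ADDR)
    time.sleep(1.0)
```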

Everyday AR … a sight to behold

Augmented reality is no longer just a futuristic concept for gamers: It's poised to revolutionize everything from surgery to self-driving cars. Imagine slipping on a pair of sleek eyeglasses that overlay digital images onto your real-world view, enhancing your perception of and interaction with the environment. Researchers have reported in ACS Photonics a breakthrough in AR technology that combines two optical technologies into a single, high-resolution AR display, making it easier to integrate into everyday devices.

The challenge has always been miniaturization. Traditional AR systems, like bulky goggles, require multiple lenses. These researchers combined a metasurface and a refractive lens with a microLED screen, creating an AR system the size of eyeglasses without compromising quality.

Warning: Physics ahead. Their AR display uses an ultrathin silicon nitride film with a light-shaping pattern. This pattern focuses light from green microLEDs onto a refractive lens made of synthetic polymer, which sharpens the image and reduces distortions. The final image is projected onto an object or screen. To boost image quality, the team used computer algorithms to fix minor imperfections before the light leaves the microLED.
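The article doesn't detail that preprocessing step, but the underlying idea of knowing how the optics will degrade a frame and pre-correcting the image sent to the microLED can be shown with a toy example. The NumPy sketch below models the optics as a simple Gaussian blur and applies a Wiener-style inverse filter before "display"; the real system corrects aberrations specific to its metasurface and lens, so treat this only as an analogy.

```python
import numpy as np


def gaussian_psf(shape, sigma):
    """Centered Gaussian point-spread function, normalized to sum to 1."""
    h, w = shape
    y, x = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    psf = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return psf / psf.sum()


def precompensate(image, psf, snr=100.0):
    """Wiener-style pre-sharpening: boost the frequencies the optics will blur."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    wiener = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    out = np.real(np.fft.ifft2(np.fft.fft2(image) * wiener))
    return np.clip(out, 0.0, 1.0)


# Toy example: a bright square that the optics would otherwise soften.
img = np.zeros((128, 128))
img[48:80, 48:80] = 1.0
psf = gaussian_psf(img.shape, sigma=1.5)      # stand-in for the measured optical blur
display_frame = precompensate(img, psf)        # frame that would be sent to the microLED
```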

The prototype showed less than 2% distortion across a 30-degree field of view, matching the quality of commercial AR platforms that use multiple lenses. The researchers also confirmed that their preprocessing algorithm improved image quality, raising the similarity to the original image to 74.3%. With further development, this technology could support full-color AR and lead to mainstream AR glasses.

As computing and AI-driven software continue to advance, the boundaries between digital and physical experiences keep blurring. These advancements not only enhance our interactions and perceptions but also pave the way for more immersive and connected experiences. Whether feeling the heartbeat of a teammate, seeing the world through AR-enhanced lenses, or leaning on AI for complex problem-solving and decision-making, the future promises richer human connection and technological marvels that bring us closer together in ways previously unimaginable.

Age progression made with generative AI by Adobe Stock.

Chris Mullen, contributor