- Scientists have created Future You, an AI chatbot that lets users speak to their future selves
- The researchers found that speaking with the AI reduced users’ anxiety
While scientists haven’t invented a time machine just yet, there is now a way for you to get some much-needed advice from your older self.
Experts at Massachusetts Institute of Technology (MIT) have created Future You – an AI-powered chatbot that simulates a version of the user at 60 years old.
The researchers say that a quick chat with your future self is just what people need to start thinking more about their decisions in the present.
With an aged-up profile picture and a full life’s worth of synthetic memories, the chatbot delivers plausible stories about the user’s life alongside sage wisdom from the future.
And, in a trial of 334 volunteers, just a short conversation with the chatbot left users feeling less anxious and more connected to their future selves.

Researchers have created an AI which allows you to get advice from a future version of yourself. Future You creates a simulation of the user as they might be at the age of 60

So far, the Black Mirror-worthy technology has only been privately tested as part of a study, but it could be made available to the public in the coming years.
To start chatting with their future selves, users first answer a series of questions about their current lives, their past, and where they hope to be in the future.
Users also give the AI a current picture of themselves, which is transformed into a wrinkled, grey-haired profile picture for their future version.
Their answers are then fed into OpenAI’s GPT-3.5, the model behind ChatGPT, which generates ‘synthetic memories’ to build out a coherent backstory from which to answer questions.
One participant told Future You that she wanted to become a biology teacher in the future.
When she later asked her 60-year-old self about the most rewarding moment in her career the AI responded: ‘A rewarding story from my career would be the time when I was able to help a struggling student turn their grades around and pass their biology class.’
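In broad strokes, that pipeline – questionnaire answers turned into a role-play prompt for GPT-3.5, which then invents ‘synthetic memories’ to answer from – could be wired up with OpenAI’s public API roughly as in the sketch below. This is only an illustrative approximation, not MIT’s actual code: the prompt wording, function names and questionnaire fields are assumptions.

```python
# Illustrative sketch only: the real Future You system is not public.
# Prompt wording, helper names and questionnaire fields are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def build_future_self_prompt(answers: dict) -> str:
    """Turn the user's questionnaire answers into a role-play instruction."""
    return (
        "You are the user's future self at age 60. "
        f"Today they are a {answers['age']}-year-old {answers['occupation']} "
        f"whose goal is to {answers['goal']}. "
        "Invent a plausible, coherent backstory ('synthetic memories') in which "
        "that goal was achieved, and answer questions warmly in the first person."
    )


def ask_future_self(answers: dict, question: str) -> str:
    """Send one question to the simulated 60-year-old self and return its reply."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": build_future_self_prompt(answers)},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content


# Example: the aspiring biology teacher described above
answers = {"age": 22, "occupation": "biology student", "goal": "become a biology teacher"}
print(ask_future_self(answers, "What was the most rewarding moment of your career?"))
```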


Using the information provided, the AI creates a coherent backstory complete with fake memories it can refer to. This diagram shows a conversation between a user and her future self about rewarding moments in her career
The AI, which said it was a retired biology teacher, added: ‘It was so gratifying to see the student’s face light up with pride and accomplishment.’
Pat Pataranutaporn, who works on the Future You project at MIT’s Media Lab, says he thinks these kinds of interactions could have real benefits for users.
Mr Pataranutaporn told The Guardian: ‘The goal is to promote long-term thinking and behaviour change.
‘This could motivate people to make wiser choices in the present that optimise for their long-term wellbeing and life outcomes.’
He even says that he has found benefits from chatting with his future self.
In a video demonstrating the chatbot, Mr Pataranutaporn asks his future self: ‘What would be a lesson you share to the new MIT Media Lab student?’
After a short pause the AI replies: ‘The most important lesson I’ve learned is that “nothing is impossible”.
‘No matter how hard something might seem, if you work hard and put your mind to it, you can achieve anything.’
Most profoundly, he recalls one conversation in which the AI reminded him to spend time with his parents while he still could.
‘The session gave me a perspective that is still impactful to me to this day,’ Mr Pataranutaporn says.

Users provide the AI with information about themselves and a picture which is used to create synthetic memories and an aged-up profile picture (pictured) for their future version
Mr Pataranutaporn is not alone in feeling a benefit from speaking with the AI.

In a pre-print paper, the researchers found that participants had ‘significantly decreased’ levels of negative emotions immediately after the trial.
Measures of emotion showed that participants felt less anxious and had an increased sense of continuity with their future selves.
As the researchers note, studies have found that people who are more connected to their future selves show better mental health, academic performance, and financial skills.
In their paper the researchers write: ‘Users emphasized how emotional of an experience the intervention was when commenting about the interaction, expressing positive feelings such as comfort, warmth, and solace’.
The researchers aren’t the first to experiment with using digital ‘human’ chatbots for mental health purposes.

The researchers found that just a short conversation with the AI left users feeling less anxious and more connected to their future selves
Character.ai, a platform whose chatbots impersonate characters from games and movies, is now used by many as a popular AI therapist.
More controversially, several companies also offer so-called ‘deadbots’ or ‘griefbots’ which use AI to impersonate dead loved ones.
Platforms offering this ‘digital afterlife’ service, including Project December and Hereafter, allow users to speak with digitally resurrected simulations of those who have died.
However, experts warn that these technologies can be psychologically harmful or even ‘haunt’ their users.
While the researchers found that speaking to their future selves could help many people, they also caution that there are risks.
The researchers note that risks include: ‘Inaccurately depicting the future in a way that harmfully influences present behavior; endorsing negative behaviors; and hyper-personalization that reduces real human relationships.
‘Researchers must further investigate and ensure the ethical use of this technology.’