Making actors look older or younger has been a perennial challenge for movie studios. This used to be achieved through rather cumbersome and not always convincing prosthetics and make-up effects. It was then largely superseded by time-consuming digital VFX techniques, but it looks like Disney has come up with a game changer.
While publicly accessible AI image generators are making an impact in the creative fields, Disney has been working on a studio-grade AI model that can age (and de-age) actors in a way that looks so realistic it’s scary (for more on the use of AI in other creative fields, see how to use DALL-E 2).
Making actors look older or younger is nothing new. Make-up artists have done incredible work, transforming David Bowie in The Hunger (1983) and Brad Pitt in The Curious Case of Benjamin Button (2008) – the latter had 56 people working in hair and make-up. More recently, VFX upped the ante, and in Captain Marvel (2019) we saw Samuel L. Jackson de-aged by about 25 years.
Traditional VFX can take a huge amount of time, with artists manually painting frame by frame. But Disney Research Studios has just revealed a solution it’s been working on that looks like it could achieve a realistic effect in real time. It describes its Face Re-Aging Network (FRAN) as a “practical, fully-automatic and production-ready” tool that can automatically re-age and de-age faces.
The underlying technique is not new. The tool uses a neural network, not unlike deepfake software, but until now the technology has been too unreliable for use in movies: detail, or even whole facial features, could be lost when the subject moves. Disney’s model, according to its research paper, adapts to moving images with an astonishing degree of accuracy, even when the subject is not looking at the camera.
To train FRAN, the research team collected thousands of AI-generated faces and studied how machine learning would handle them. Rather than creating entirely new headshots, FRAN identifies the parts of the face that are likely to be affected by age – for example, smile lines and wrinkles around the eyes. It superimposes these features over the subject’s face, creating a stable and terrifyingly realistic effect that preserves the original facial features across different lighting conditions and camera angles.
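Broadly, the approach described above amounts to predicting a localised "ageing layer" and adding it to the original frame, rather than regenerating the whole face. Here is a minimal NumPy sketch of that compositing idea – not Disney's actual code; the `apply_age_delta` function, the toy arrays and the values are our own illustration, and in FRAN the delta and mask would come from a trained neural network:

```python
import numpy as np

def apply_age_delta(frame, delta, mask):
    """Composite a predicted per-pixel ageing delta onto a frame.

    frame: uint8 image, shape (H, W, 3)
    delta: float adjustments (e.g. deepened wrinkles), shape (H, W, 3)
    mask:  soft mask in [0, 1] marking age-affected regions, shape (H, W)

    Because only the masked delta is added, the original facial
    features outside those regions pass through untouched.
    """
    out = frame.astype(np.float32) + mask[..., None] * delta
    return np.clip(out, 0, 255).astype(np.uint8)

# Toy example: a flat grey 4x4 "frame", aged by darkening a small region.
frame = np.full((4, 4, 3), 128, dtype=np.uint8)
delta = np.full((4, 4, 3), -30.0, dtype=np.float32)  # deepen shadows
mask = np.zeros((4, 4), dtype=np.float32)
mask[1:3, 1:3] = 1.0  # pretend this marks the area around the eyes

aged = apply_age_delta(frame, delta, mask)
# Masked pixels are darkened (128 - 30 = 98); the rest stay at 128.
```

The key design point this illustrates is stability: since unmasked pixels are returned exactly as they were, identity and lighting are preserved by construction, which is harder to guarantee when a network synthesises the entire face from scratch.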
The results are amazing, but also a little scary. Deepfakes still have some tell-tale signs that tend to give them away, the most obvious being their weakness at showing a subject in profile. We’ll be watching to see how Disney uses the technology. Its latest animated release, Strange World, was a bit of a flop (many blame Disney’s own lack of marketing), but on Disney+, Disenchanted went down a storm.