CGI DEMONSTRATES EERIE PHOTOREALISM IN REAL-TIME
Earlier this week at the Game Developers Conference in San Francisco, Epic Games unveiled a remarkable demonstration of its real-time rendering system, Unreal Engine. The demonstration showed a completely digital copy of actor Andy Serkis reciting lines from Macbeth, illustrating the incredibly rapid evolution of photorealistic digital effects.
Back in 2016, Rogue One: A Star Wars Story delivered a completely photorealistic, computer-generated Peter Cushing, reprising his role from the original 1977 Star Wars. Since that milestone moment, we have seen a variety of photorealistic digital characters invade our live-action films, from a computer-generated young Kurt Russell in Guardians of the Galaxy Vol. 2 to a resurrected 1980s Sean Young in Blade Runner 2049.
Fast forward to 2018, and the latest state-of-the-art demonstration of where we are with photorealistic digital characters comes through a combination of new technology from several different companies, blending volumetric capture advances with a new real-time rendering system. This allows an actor's face and body to be captured in real time and transferred immediately onto the visage of a digital avatar.
The result is full uncanny valley stuff, replicating the micro-muscular contractions on the actor's face in startlingly lifelike detail. The second demonstration is just as impressive, translating the data captured in real time from the actor's performance onto a vastly different digital character, a monstrous creature called Osiris Black.
The company boldly states, "… someday photorealistic digital humans will be used in interactive entertainment, simulations, research, non-verbal communication as an interface with the machines, artificial intelligence and mixed reality applications…"