At a glance, many computer-generated humans in films appear convincingly real, or at least not obviously artificial. But something about them can make people feel increasingly uncomfortable the longer they look.
Dr. Patrick Pennefather, a professor of theatre and film at the University of British Columbia, says that overuse of artificially created humans can lead to oversaturation and ultimately to a phenomenon called the "uncanny valley."
The uncanny valley describes the experience people have when they observe a humanoid robot or an artificially generated human: it may look strikingly close to the real thing, but the subtle differences create a deep sense of unease or even revulsion.
"Deepfakes" have made headlines around the world, with many people forecasting a "deep fake apocalypse" where artificial humans saturate the internet and are completely undetectable from the real thing.
Artificially generated "humans" also threaten to remove jobs from the film industry, particularly when they may be used at a lower cost. While screenwriters who belong to the Writers Guild of America (WGA) are busy back at work, film and TV actors from SAG-AFTRA remain on picket lines, with the longest strike in their history hitting the 100-day mark on Oct. 21 after talks broke off with studios.
But creating lifelike AI-generated actors, particularly ones for emotionally complex roles, is a costly, time-consuming endeavor.
It might not be cheaper to use AI in all circumstances
"Technology doesn’t necessarily make it cheaper to replace a human actor, extra or background actor, nor is it a good idea," Pennefather told V.I.A.
"The technology to de-age or age or radically transform an actor is dominantly applied to established celebrity actors and creative teams need to be mindful of not overusing those methods."
British Columbia, known as Hollywood North, is North America's third-largest film and TV production hub, behind Los Angeles and New York City. The industry generates billions in revenue annually, employing thousands across the province.
Technological advancements threaten many jobs in the film industry, but not all of them are suitable for AI replacement. While some background actors may be at risk, it depends on the scene and how they support the story.
"Humans will always be needed because of the variety of genres out there," the professor noted. "I would say that we are not there yet with creating realistic humans that are close in appearance to a human actor unless they are a creature, animal, or otherworldly."
Pennefather notes that technology for creating digital humans is being developed, but a great deal of work remains to improve it, citing Epic Games' Unreal Engine as an example.
"Unreal Engine has a MetaHuman platform that’s getting close [to creating realistic-looking people] but in order to not have the characters appear uncanny a lot of work will have to be done to make them more real, with real personalities, movement, [and] emotion," he said.
Filmmakers are also using real-time 3D creation tools like the Unreal game engine to create backgrounds for actors. This technology was used in The Mandalorian, with the environments displayed on an LED wall behind the actors. The scenes still have to be edited in post-production, but the technology is evolving to capture actors within a 360-degree virtual world.
As long as actors are trained to work in these environments, directors aren't likely to opt for an AI substitute. Instead, the technology will "increase support for the more repetitive tasks," allowing filmmakers to focus on the visual story.
How generative AI threatens film jobs: Writers
AI systems will affect every facet of film production, including scriptwriting -- and this hasn't been well-received by the writing community. Writers and voice actors have been fighting against the unbridled use of generative AI in film. The Writers Guild of Canada (WGC) penned its concerns in a letter to the federal government, arguing that AI-created content is "purely profit-driven" whereas "artists create works to inspire, challenge, provoke, educate, inform."
South of the border, the WGA was on strike from May 2 until Sept. 27. Striking writers demanded that a variety of conditions be met, including long-term pay, consistency of employment, and control over the use of artificial intelligence.
Dr. Bryce Traister is the Dean of Creative and Critical Studies at UBC's Okanagan campus. He's hopeful that writers in various industries can work with AI systems to produce content but said the integration of these systems will significantly alter their process.
"We are going to need a new class of creative industry workers who can thrive in the new context of AI," he explained, adding that there has been a vocal outcry from screenwriters worldwide. There are typically several writers involved in large productions, including writers in the "Writers Room" who pitch ideas and technical writers who fine-tune the script once the first draft has been created.
Generative AI systems "learn" in the same way that writers do. They produce an output based on available information and then a human editor makes changes to the copy, which teaches the software how to improve it. But these systems can also be used to generate ideas and can use previous writing styles to inform new output.
"I could tell it to re-write [a script] in the style of Julia Roberts in Pretty Woman and then it will give me an output. Then I will say 'not like this' and then I'll say 'use more colour words,'" Traister described.
Typically, professional screenwriters will do a script rewrite of a first draft. "Now you call in ChatGPT," he added.
Both Traister and Pennefather say the "human touch" will always be required to guide the process -- film and TV storytellers may just need to rethink their approach.
"Generative AI is an instrument, not the piano composition itself."
While the software can create an idea or a prototype, people must then "examine, extend, refine, throw away, critically assess, and decide whether or not [they] want to use that content," Pennefather noted, adding that there is also a risk of humans becoming "too dependent" on the software to generate ideas.
"Generative AI is an instrument, not the piano composition itself. The human is always in the picture guiding that instrument towards a greater version," said Pennefather.
"If you approach AI like an artist might, then whatever it gives you can always be made better by you."
Traister shared this sentiment, underscoring that there may be somewhat of an overreaction to ChatGPT and other AI systems meant to generate content typically produced by creatives and artists. He says advising students to steer clear of creative studies and focus solely on computer science will result in a void of creative, passionate people.
There's also the question of intellectual property. Once a writer has submitted a script, their style and ideas can be reworked through the software countless times. For voice actors, the risks are even higher, since recorded content could be reused in a variety of contexts in perpetuity.
"You can train a text-speech AI clone within 30 minutes. It’s ok. You can increase that training to eight hours and you have a pretty solid voice you can use for whatever media you want. Voice over, podcast, audiobook, radio commercial, and (warning to some readers), phishing scams," Pennefather cautioned.
While it is still too early to tell how deeply the impact of AI will be felt across Hollywood North, the market was profoundly reshaped by the rise of computer-generated imagery (CGI). Vancouver quickly became a hub for visual effects, work that doesn't depend on filming in any particular location.
Pennefather underscores that any AI developments will require human supervision and editing and there will always be "exceptions, outliers, and the need for good stories and writing."
With files from The Associated Press.