Eurogamer took to the red carpet at the BAFTA awards to check in with the voice actors behind Baldur's Gate 3 and some of the other big games of 2023, and despite the buoyant mood at the show, they all had concerns and harsh criticism for AI voiceover tools—especially unauthorized reproductions of their own voices.
BG3 narrator Amelia Tyler described coming across multiple instances of her already-iconic performance being replicated through AI, including for upsetting and unseemly purposes: “I went on to this stream because somebody gave me a heads up, and I went on and heard my own voice reading rape porn.
“That's the level of stuff we've had to deal with since this game came out and it's been horrible, honestly.”
Tyler said that such use of her voice “is stealing not just my job but my identity,” and that while she loves videogame mod scenes, “to actually take my voice and use it to train something without my permission, I think that should be illegal.”
Raphael actor and best supporting performance winner Andrew Wincott quipped to Eurogamer that, while he appreciates the potential of doing 10 hours of work for 40 hours of output, he still expects to be paid for the full 40. Both Wincott and Samantha Béart, who played Karlach in BG3, expressed trepidation about potential contractual abuses of AI performance cloning, even in the face of SAG-AFTRA's recent contract.
Actors from other games, like Alan Wake 2's David Harewood and Final Fantasy 16's Ben Starr and Ralph Ineson, are similarly wary of the tech. Neil Newbon, who voiced Astarion in BG3, seemed more optimistic due to the sheer gulf in quality between AI copies and real actors' performances, hoping that good taste will win out: “I know a lot of people in the games industry that would like to work with an actor because of what we bring, the craft we bring.
“I don't think you can program craft. It's something beyond zeroes and ones, beyond the formula. It's quite magical.”
It's pretty clear that voice actors don't want AI tools mucking up their profession, both from an artistic standpoint and in the interest of their own livelihoods, but Amelia Tyler did bring up a particularly interesting edge case: modding and small projects.
I agree wholeheartedly with her and longtime Elder Scrolls/Fallout actor Wes Johnson, who tweeted that “anyone trying to create a mod using an actor's voice via AI *without consent* knows they are wrong.” AI voice cloning is an abuse of trust, and I can only imagine the surreal horror of hearing your own voice saying things you never said, let alone cruel or objectionable speech.
I'm also of a mind with my coworker, PCG news writer Joshua Wolens, who last month wrote an ode to crappy mod scene voice acting, which seems particularly imperiled by the proliferation of cheap AI voice tools. Bad voice acting by real humans at least has soul, and it opens you up to “happy accidents” as Neil Newbon called them in his Eurogamer interview.
Still, I find myself vexed by one of my favorite games of last year: South Scrimshaw Part One, a free visual novel presenting a Planet Earth-style documentary about alien whales on another planet. The art and writing are all real, the creation of solo developer N.O. Marsh, who made South Scrimshaw while holding down a day job at a restaurant. However, Marsh used a generic AI voice generator similar to what you'd find on TikTok to produce the game's narration.
Would South Scrimshaw benefit from a real, human performance instead? Absolutely, but the even, emotionless delivery of the generated voice works surprisingly well in a documentary context, the usage of synthesized voice tools is disclosed in the credits, and it all strikes me as a reasonable compromise for a solo artist with a day job making a free game. South Scrimshaw is, uncomfortably, an example of AI generation being deployed well to augment an artist's limited resources.