I think it's important to remember how this used to happen.
AT&T paid voice actors to record phoneme groups in the '90s and 2000s and has been using those recordings to train voice models for decades now. There are about a dozen AT&T voices we're all super familiar with, because they're on all those IVR/PBX replacement systems we talk to instead of humans now.
The AT&T voice actors were paid for their time and weren't offered royalties, but they were told that their voices would be used to generate synthetic computer voices.
This was a consensual exchange of work. Not a great deal long term, since there were no royalties and it was really just a "work for hire" that turned into a product, but that aside -- the people involved all agreed to what they were doing and what their work would be used for.
The problem at the root of all the generative tools is ultimately one of consent. We don't permit the arbitrary copying of things that are perceived to be owned by people, nor do we think it's appropriate to use someone's image, likeness, voice, or written works without their consent.
Artists tell politicians to stop using their music all the time, etc. But until we actually get a ruling on what constitutes a "derivative" work, nothing will happen. An AI model is effectively a derivative work of all the content that makes up the vectors representing it, so it seems like a no-brainer, but because it's "radio on the internet" we're apparently not supposed to be mad at Napster for building its whole business on breaking the law.
Studios basically want to own the personas of their actors so they can decouple the actual human from them and just use their images. There have already been a lot of weird issues with this in video games with body capture and voice acting: contracts aren't read through properly, the wording is vague, and not all agents know about this stuff yet. It's very dystopian to think your whole appearance and persona can be taken from you and commodified. I remember when Tupac's hologram performed at Coachella in 2012 and thinking how fucked up that was. You have these huge studios and event promoters appropriating his image to make money, and an audience effectively watching a performance of technological necromancy in which a dead person is re-animated.
Among those warning about the technology’s potential to cause harm is British actor and author Stephen Fry, who told an audience at the CogX Festival in London on Thursday about his personal experience of having his identity digitally cloned without his permission.
Speaking at a news conference as the strike was announced, union president Fran Drescher said AI “poses an existential threat” to creative industries, and said actors needed protection from having “their identity and talent exploited without consent and pay.”
As AI technology has advanced, doctored footage of celebrities and world leaders—known as deepfakes—has been circulating with increasing frequency, prompting warnings from experts about artificial intelligence risks.
At a U.K. rally held in support of the SAG-AFTRA strike over the summer, Emmy-winning Succession star Brian Cox shared an anecdote about a friend in the industry who had been told “in no uncertain terms” that a studio would keep his image and do what they liked with it.
Oscar winner Matthew McConaughey told Salesforce CEO Marc Benioff during a panel event at this year’s Dreamforce conference that he had concerns about the rise of AI in Hollywood.
A spokesperson for the Alliance of Motion Picture and Television Producers (AMPTP), the entertainment industry’s official collective bargaining representative, was not available for comment when contacted by Fortune.
The original article contains 911 words, the summary contains 213 words. Saved 77%. I'm a bot and I'm open source!
See, I'm pulling the smartest move right now: AI can't take your job if you use AI to take your own job first.
Besides, I think Hollywood is pretty behind on tech overall. The current state-of-the-art voice generators are still pretty bad; it'll be a very long time before they can replace actors in quality (if ever). If you train the AI voice on audiobooks, the generated voice is going to sound like someone narrating an audiobook, which really doesn't sound natural for dialogue at all.
I think the key point, then, isn't to ban generative transformer-based AI: once the tech is out of its box, you can't exactly put it back in again. (heh) The real question to ask is: who should own this technology so that it does good and helps people in the world, instead of being used to take away people's livelihoods?
This is from a guy who advocates for Linux because it's open source!
The only violation here would be if someone else used that voice and claimed it was Fry. That would be fraud. Otherwise there is no issue.
Since it is paywalled, I can only guess from the title.
I don't understand the problem. He was paid for reading books, and now we all have his voice. What did he expect?
Is there an AI imitating his voice and making money? Is it being represented with his name? If not, what would be the difference from some person imitating his voice? Would that be stealing too?
Basically, I don't see any problem with me buying those books, training a local model, and giving it other books to read. That can't be illegal, right?
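(For what it's worth, the "train a local model" part barely even needs training anymore. A minimal sketch of what I mean, assuming the open-source Coqui TTS library and its XTTS v2 model, which does zero-shot cloning from a short reference clip; the file names here are just placeholders:)

```python
# Minimal sketch: zero-shot voice cloning with the open-source Coqui TTS library.
# Assumes a short reference clip extracted from an audiobook you purchased;
# all file paths are placeholders.
from TTS.api import TTS

# Load the multilingual XTTS v2 model (weights are downloaded on first run).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize arbitrary new text in the cloned voice and save it as a wav file.
tts.tts_to_file(
    text="Chapter one. It was a bright cold day in April, and the clocks were striking thirteen.",
    speaker_wav="reference_clip.wav",  # placeholder: a few seconds of the narrator's voice
    language="en",
    file_path="cloned_narration.wav",
)
```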
Giving it to other people while mentioning his name would definitely be fraud. But stealing? I don't know.
Selling it to other people under another name... I don't see a problem.
But then we come to AI-generated images, and I do start thinking of it that way. Though if they can find someone who looks like him and another person who sounds like him... are they all good?