OpenAI just announced Sora, a tool for creating video from text, and the results are really fucking good (especially compared to state-of-the-art AI video generation tools), and this has me thinking about job security again.
A study surveying 300 leaders across the entertainment industry reports that three-fourths of respondents indicated that AI tools supported the elimination, reduction or consolidation of jobs at their companies. Over the next three years, it estimates that nearly 204,000 positions will be adversely affected.
The Concept Art Assn. and the Animation Guild commissioned the report, which was conducted from Nov. 17 to Dec. 22 by consulting firm CVL Economics, amid concerns from members over the impact of AI on their work. Among the issues is that concept artists are increasingly being asked to “clean up” AI-generated works by studios, lowering their billed hours and the pool of available jobs, says Nicole Hendrix, founder of the advocacy group.
“We’re seeing a lot of role consolidation and reduction,” Hendrix says. “A lot of people are out of work right now.”
According to the report, nearly 77 percent of respondents use AI image generators, enabling, for example, individuals to upload landscape photos to virtual production screens or to speed up rotoscoping in postproduction. They have applications in 3D modeling, storyboarding, animation and concept art, among other things.
Generative AI displacing workers isn't some future hypothetical, it's something that's already happening right now, and as someone working in a field which is vulnerable to automation by AI tools, I'm really worried that OpenAI (or some other company) is going to create a new tool that just completely puts me out of a job.
Is anyone else worried for their job? Is there anything that can be done?
No, it's largely a tech bro grift similar to Cybertrucks and crypto. Yes, the bourgeoisie will fall for it and try to implement it (see Amazon's embarrassing AI product descriptions), but capitalism is already crashing, and the job losses, combined with replacing everything with a terrible facsimile, will just crash it harder. There will be an AI crash as it infects everything and makes actually making accurate predictions of reality impossible. Scientific research, for example, will be fucked by inaccurate AI models. This will include engineering and tech.
No one actually likes this shit. Something will give eventually. It's not even actual AI, just an algorithm.
I mean, "AI will crash eventually" is little relief when animators are getting put out of jobs right now. Everyone can talk all they want about how generative AI can't produce real art, but it's clear that AI is already good enough and getting way better each year. I'm probably not going to get put out of a job in 2024, but what about 2025 or 2026?
Oh, totally. I didn't mean it wouldn't suck before it gets better. It's horrible.
Like a part of me dies knowing video games that are lazy asset flips with AI-assisted character designs are being eaten up by consumers.
Capitalism has spent decades convincing people not to value artists, writers and scientists, and it's really fucking sad to see some people's attitudes being "Well, writers have been making lazy shit lately, so I don't care if they get replaced". Motherfucker, those writers aren't lazy, they're working in terrible conditions and have to have their work butchered by marketing teams.
People are going to notice, though. People already hate Marvel movies, but with AI shit the bourgeoisie will have even more control over art, and the quality will be even worse than what people hate now. It's popular for now because it's new, but people are already kind of starting to dislike it for its repetitive nature and soullessness.
How long can you really say this is the case? LLMs and diffusion models in 2024 are so much better in terms of capability than conventional algorithms that it's just a truism to say "it's just an algorithm".
Like, technically, lions and tigers are just biochemical reactions and atoms, but if I'm getting chased by one, knowing that isn't going to bring me any solace.
It only took a few years from shitty dall-e mini memes to photorealistic images. Even if AI isn't "good enough" (a standard that keeps on getting raised every time a new development comes out), what about in 2025? What about in 2026?
Like, "AI can't draw hands" used to be a meme, but AI can draw hands now. AI used to not be able to form coherent backgrounds, but AI can form coherent and consistent 3D environments now. I think people have a right to be worried!
The most depressing thing about this is that in a communist society, your job getting automated away would not be met with dread and anxiety that you'll be out of a job but rather with joy that you won't need to work anymore.
I worry for the future. You need to have a job to live under capitalism, but what then once there are no more jobs? Do we just starve?
I'm not worried for my job. It's likely to be one of the ones that cannot be automated away. You can't get robots or artificial intelligence to climb under a house to chase water leaks. You aren't going to get a robot to develop a feel for operating sewer cleaning equipment.
I do fear for some jobs. Things dealing with language - proofreading, writing. Things dealing with maths. Things more related to having an actual brain, which I do not possess.
But then again, I'm just a dumb guy and have shown time and time again that I don't know shit.
Part of my work is in writing, thankfully not much of it. But I can tell you they are going to massively lay off writers in the next couple years and then hire them back five years later.
They're going to try to rely on AI writing, but AI writing doesn't fucking say anything, it just riffs forever and avoids committing to real statements. Companies are going to get sued for lying by accident, and they're going to realize that people don't trust what they say anymore.
Among the issues is that concept artists are increasingly being asked to “clean up” AI-generated works by studios, lowering their billed hours and the pool of available jobs, says Nicole Hendrix, founder of the advocacy group.
Yeah, this is how this stuff tends to go down. New tech doesn't have to replace an industry wholesale, it just has to make it so that one person is more productive and capable of taking on more work that might've gone to another employee needing to be hired. Technology is a Force Multiplier. Many of the companies doing layoffs and hiring freezes right now will, in the future, find it more profitable to spend money on software and cheaper, less skilled labor than to hire back anywhere near the number of people who have been let go. This happens pretty regularly, but at an extremely large scale during recessions. The future is gonna be pretty rough.
As leftists, we really should be trying to leverage AI's massive Force Multiplier potential to help us in our organizing and overcome the vast gulf in our numbers, but it's become much more popular to shrug it off than to learn about it. A lot of people took the ExplainItLikeI'm5 version of this stuff and decided they're experts on it, and that's that. I've been studying this shit in my spare time for almost 20 years. My intro book, currently sitting on my bookshelf, is so old that it talks about the potential of AT&T's landline video phone, which I got to demo as a kid. Believe me when I tell you the current LLMs barely scratch the surface of what neural networks are capable of. And you can bet Capital is gonna use it to squeeze every last drop of productivity out of us until we're no longer necessary.
New tech doesn't have to replace an industry wholesale, it just has to make it so that one person is more productive and capable of taking on more work that might've gone to another employee needing to be hired.
I agree. AI doesn't need to automate 100% of people's jobs, it just needs to automate 90% of people's jobs and allow one person to pick up the work of ten.
I worry more about how AI is going to be used in vile ways by companies to save some combination of their time, money, and effort. Whenever an ordinary person needs a professional - they could get fucked over by AI.
Somebody goes to the hospital. In this example - they have tests like MRIs. They meet with a doctor, but nearly all the radiology analysis was done by AI. The doctor spends more time worrying about his golf game.
Somebody online goes to a "mental health" firm that promises people can talk to a qualified therapist at very low cost. The firm turns out to have an "anonymized" AI backend. Not only that, the "private" meetings are data-mined.
Somebody goes to a law firm and they meet with an actual living person. But their case is mostly done by AI.
Somebody goes to an accountant...
Somebody goes to a financial planner...
Somebody goes to an engineer...
Of course - the rich avoid those problems.
I assume any legislation to stop this sort of stuff will be garbage. And less nefarious examples...
Somebody hires an artist but they have zero artistic skills. So - it's AI time!
I saw the comments about OpenAI making a vid of up to 60 seconds from a prompt.
Still some visible flaws, but as I've said before, I think it's time to acknowledge that this tech is developing at an incredible and dispiriting pace.
You can easily see how a film/tv production can integrate this stuff. "We'll film the actors, but any fancy tracking shots we can feed into a generator and skip all the complicated set up."
Exactly -- even just supplemental usage is going to transform the industry.
I entirely agree. Here's the OpenAI post plus the vid.
Introducing Sora, our text-to-video model.
Sora can create videos of up to 60 seconds featuring highly detailed scenes, complex camera motion, and multiple characters with vibrant emotions.
Prompt: "Beautiful, snowy Tokyo city is bustling. The camera moves through the bustling city street, following several people enjoying the beautiful snowy weather and shopping at nearby stalls. Gorgeous sakura petals are flying through the wind along with snowflakes."
Yeah, I think I'm done for in a few years' time - I'm not a creative, but for the type of office work I do, I could easily see at least a significant chunk of it being replaced by AI. Maybe not a perfect replication, but it's pretty close already from my own experimentation with ChatGPT and the like. Once they integrate AI into your everyday office products - e.g. word processors, spreadsheets, Outlook, etc. - I fear a lot of office workers are going to find themselves in some trouble. It's only going to be important decision making and checking over for accuracy and mistakes that will require human input. People will still be needed, sure, but there will be reductions in the workforce as companies look to cut costs and maximise profits, even if the AI isn't perfect yet. Just hoping I can develop some specialisation in other areas in time to futureproof myself.
No, under crapital humans will always be cheaper long term. We don't need stable electricity, internet, or reasonably temperature/humidity-controlled environments to function like sensitive electronics do. It's a similar concept to why the ancients preferred slavery to steam power. My main job is a cashier, and even self-checkout is getting phased out in most places, not because of theft but because the machines are expensive to repair, and with all of the above lacking in every retail environment I've seen, repairs get ever more frequent. Even when people ask me (usually they don't, they assume I'm too [insert ableist slur here]), I scoff at the soggy wet paper tiger that is modern AI while they shake in fear at it. My secondary gig is actually curating AI, and then I just laugh, since the economic-social base is just not there for it.