I worked IT for a multi-million-dollar startup. The board was made up of a couple of other CEOs from larger companies, and let me tell you, they are all idiots. They read books from other "successful" people and just parrot whatever they saw on FOX Business the night before... "FOX Business said to fire all your full-time, well-trained and educated staff for part-timers so you can save on overhead."
I shit you not! Before I quit, they fired 40 of our customer service representatives and hired 120 part-time reps, scheduled them only 10 to 15 hours a week, and turned a huge profit the next quarter. Customer satisfaction was in the tank, but simply comping our service for disgruntled customers was still cheaper than the 40 full-time employees who kept customers happy. Capitalism, baby!
Most of them are already in a position where they could retire tomorrow and maintain a level of comfort and luxury beyond what most people experience for even a day, for the rest of their lives and the rest of their children's lives.
If I was in that situation, I wouldn't give a shit if I lost my job either.
But it doesn't end there, of course. They don't just have millions of dollars sitting in a bank account; they've also got millions of dollars tied up in the success of companies.
So if an AI takes their job and can fuck every possible person out of every possible penny with a psychopathic efficiency beyond anything we've ever imagined, they still win.
And boy, the horrors could be endless. If an AI CEO runs multiple companies, who do you charge with price fixing and insider trading?
What do we do when an AI spying on our phones and social media knows exactly how much money we have and can adjust prices in real time?
What about when it's able to personally astro-turf your social media and content aggregation sites to convince you to buy more and complain less?
What about when it can arrange kickbacks for politicians that are perfectly calibrated to be the lowest bid needed to crush out any resistance or accountability?
Your own personal, digital Elon Musk, following you around every second of the day, basing all his statements and decisions on every action you take, pushing his financial and political agenda, from the safety of an impenetrable mask.
It's heartbreaking to watch the death of human-created art, but it's a papercut compared to the torture to come when AI replaces the greedy psychopaths instead.
I’ll bet $5 we’ll start seeing more CEOs become CVOs, or Chief Visionary Officers, and the current pay for CEOs will switch over to CVOs because AI can’t be a visionary or some bullshit.
When AI makes your job redundant, you’ll be unemployed. Fuck you.
But when AI makes my job redundant, I’ll keep my position and just take more naps and vacations whilst my computer works for me. I may also get a pay rise for my efficiency. Also fuck you.
I honestly only skimmed the article because I kept getting more and more angry. Is that a fair summary, or am I selling these cunts short?
Not to say that CEOs are valuable, but they also believe LLMs can replace writers.
There's zero evidence they can (their output is maybe good enough for The Big Bang Theory?), and yet CEOs are convinced otherwise.
CEOs think that LLMs can replace programmers. Any programmer who has used Copilot knows the fucking thing hallucinates all the time and is barely good enough to be documentation autocomplete.
CEOs are gullible as fuck, and 50% of them thinking they can be replaced is just more evidence of their gullibility.
CEOs aren't that important but I guaranfuckingtee that an LLM with full autonomy would run a company straight into the ground.
CEOs, above everything else, manage risk. They hedge their bets against the market and against their own workforce.
While "AI" might be better at calculating risk with known factors, the joys of hallucination and often having a poor amount of true knowledge of what's actually going on means that a LLM would run a company into the ground hilariously fast.
It's why I wet myself laughing whenever someone suggests replacing HR or recruiters with AI. Their job is to protect the company, and all it takes is one poor AI decision, and suddenly there's a multi-million-dollar lawsuit or a bad actor has been hired and fucked the company up from the inside.
AI tools are powerful, but that's all they are, and all they will be for a long time. If anything, we're likely to see regressions in performance from ChatGPT as they fight legal battles over their use of protected data and over hallucinations, and slightly improved performance from Google/Amazon/Apple on their own initiatives.
It's speculation, but I'd imagine a significant number of those answering this survey don't actually understand what AI is, or how poorly suited LLMs would be to the job.