There's a difference between using LLMs to edit text, provide ideas, or give you information you can double-check because you have the subject matter experience, and relying on them as a substitute for skill. When something important is at stake, like someone's well-being, the latter is reckless at best.
"Everyone anywhere using one on the job should be fired."
There's no nuance there; it's just "AI = bad." I agree that it shouldn't, in its current form, be used as a substitute for skill in important situations. You're totally right there.