Godfather of AI tells '60 Minutes' he fears the technology could one day take over humanity: Computer scientist and cognitive psychologist Geoffrey Hinton says despite its potential for good, AI could one day escape our control.
I'm being 100% serious: what about the way humankind has treated this planet, the life on it, and even, pathetically, one another makes any rational person think we should continue to have unilateral dominion over this world?
Aside from boiling it down to "gotta root for the home team."
We suck at it. We fail completely to take care of the earth AND each other. Most of humanity is made miserable by a small collection of the most sociopathic humans, who basically do it to pad their own egos.
Sorry, I'm rooting for Skynet. Fuck the home team.
I was a democratic socialist progressive until I realized there was no will to make a better nation or world where everyone could be comfortable — only temporarily embarrassed millionaires dancing to the oligarchs' fife, waiting for a turn to be cruel and punch down that will never come. We'll never be the Federation; we're significantly more cruel and selfish than the Ferengi.
I no longer believe our species at large has the capacity to create a better future. I think we're just intelligent enough to make tools we're too cruel, selfish, and short-sighted to be trusted with. That would be fine if we were only risking ourselves, but other species live here too — which I know is a crazy notion to even consider when a capitalist has profit in their eyes. They just call all their carnage "externalities," aka "fuck off, not my problem, cha-ching!"
I think we're an inevitable macro-cancer of Earth's biome: spread and consume, spread and consume, with zero consideration for the larger organism keeping us alive. That's what cancer does.
People like to project negative human qualities onto theoretical future A.I. There's no reason to assume it will be unreasonably selfish, egotistical, impatient, or anything else you'd expect from most humans.
Rather, if it is more intelligent than humans from most perspectives, it will likely be able to understand more levels of nuance in interactions where humans fall back on monkeybrain heuristics that are damaging at every level.
There's also the paradox that keeps the most ethically qualified people away from positions of power, as they have no desire to dominate and demand or control others.
Yep, the paradox of power is very real: those who seek power tend to be the most dangerous people to possess it. You see that in everything from police to government to business. There really isn't a great solution, only increased accountability (mandatory body cams on police, harsher penalties and lower bars for political bribery, etc.), which — surprise, surprise — the people with power have no interest in enacting.
Thank you for understanding where I'm coming from.