OpenAI reportedly nears breakthrough with “reasoning” AI, reveals progress framework

arstechnica.com

Five-level AI classification system probably best seen as a marketing exercise.

28 comments
  • I highly doubt it. They may be able to simulate the appearance of reasoning, but I won't believe that they've accomplished this goal until their robots start killing humans over ideological differences.

    • Yeah, wake me up when the murder bots are here.

      • "Hey! That's just a machine programmed to kill me, it's not making the decision to kill me itself!"

      • To be fair, it might be too late by then, but it also might be true that it's not just the fairy tales with happy endings that are unrealistic. There's no sense worrying about T-1000s coming for you in real life when that whole movie was mostly special effects; if the world is about to die, I don't see it coming from machines. We don't know where free will comes from, or even whether it's just a math equation or something truly beyond explanation, but computers don't seem to have it.

        Scarily enough, the Quran (with all that implies; I am not saying this is actually reality, only that parallels should not fall into place that way under random chance) points out that this conclusion was engineered in some sense, that electronics were never going to give us godhood due to the limitations of reality. It's kind of blunt in saying it, so I get why the skepticism needs to stay involved, but the idea is that our "household gods" of Siri and Alexa and such are really just basic circuitry compared to a housefly or mosquito, let alone to anything larger or capable of emotional attachment.

        Sorry if this is preachy, I'm a writer who hasn't done enough writing lately and I'm just at a stage where I feel like it's too late for my writing to matter.

        • Yeah, no worries, I get it.

          I'm a perennial optimist, so I look more toward the Star Trek future than any of the dystopias, though dystopia is my favorite type of book (setting? genre?). Every dystopia shares the same general theme of the human spirit pushing against evil; the difference from other stories is the lack of success.

          I think people take these warnings to heart and avoid the worst of it. I don't think we'll get to the Star Trek utopia, but I think we'll get closer than any of the various dystopias people concoct. Humans are late at responding to issues, but we generally do respond.

          I think the same is true for AI. It'll start as a helpful piece of tech, transform into a monster, then we'll correct and control it. We've done that in the past with slavery, nuclear weapons, and fascism, and I think we'll continue to overcome climate, AI, and other challenges, albeit much later than we should.

          • Actually, I'm glad to know you're interested in utopian settings. I was mostly depressed because a utopian sci-fi story I published (I won't spam, but DM me if you use Amazon for reading books) was outright attacked by other writers for being "too optimistic"; for some inexplicable and seemingly irrational reason, the idea of an artificial afterlife built entirely by human hands is outright offensive to atheists. It was, admittedly, an unorthodox utopia: resurrecting 125 billion people at a rate of (iirc) ~2.5 people every four minutes (I did the math, I just no longer have the notes) spanning 30 million years of ancestry (Australopithecus to Homo sapiens sapiens), and giving all of them immortality (via respawns, with a 9-month timeskip every time you die 3 times in a single week), mental health care, privacy, security, education, water, food, mail and courier service, library membership (they saved the books that were burnt or lost, too), shelter (hey, some people like living outdoors), transit, electricity, television, internet, and recreational drugs, in that order, and without being the only provider.

            Basically, a constitutional oligarchy with municipally elected officials, with the full intent and obligation to transition to full democracy on a vote, and which strives to balance capitalism with socialist regulation of that capitalism. (Because yes, outright communism would never actually work, but socialism is the "parent category" of communism and is why we have both TGVs and Interstate Highways in real life; taxes and tax-funded public services are the definition of socialist policy, and I honestly believe they're the best option, seeing as they've worked more or less consistently since the 1950s.) The oligarchy are REQUIRED to survive on the smallest income in the entire society (the leadership live entirely on the same Universal Basic Income as the poorest citizens, and thus must raise the UBI to raise their own income), which leads to greater equity without complicated systems of bureaucracy.

            To be fair, I don't know if it would work, given all the historical factors involved, but I actually did research into what has and hasn't worked and relied on that over my own opinion as much as possible. So it really hurt for people to outright reject it, because 'I don't want anyone to get inspired to create anything like it, based entirely on my hatred of an unrelated religious philosophy' was/is(?) prominent among the current trend of 'the societal implications of technology (Hint: wE hAtE tEcHnOlOgY aNd NeRdS!!!!)' in the sci-fi writing community.

            Long story short, thank you, optimistic readers who want optimistic stories are in short supply lately.

            • That certainly sounds interesting, but I think there are a few issues here:

              • Artificial afterlife - aside from the technical issues, which I'm guessing you addressed, I wonder if this wouldn't devolve into extreme levels of violence and corruption. If you remove the consequences for murder/death, what's to stop you from taking extreme risks to get what you want?
              • Where's the conflict? That's what drives a story in most cases, aside from "slice of life" stories, which I honestly don't understand.
              • Why would elected officials be okay with living off UBI? When you underpay your representatives, they get paid through other means, so surely that would lead to corruption instead? You want your elites feeling like they're at the top so they don't give in to bribes and whatnot.

              But personally, when I read a story, I'm not looking to read about how things could be, I'm looking for insight into why things are the way they are and what we need to change to get what we want. Star Trek is interesting to me, not because of the utopian setting, but because they explore some facet of humanity in each episode, usually through visiting other planets. The setting is interesting, but I'm there for the story. The Moon is a Harsh Mistress is interesting, not because of the "libertarian utopia" setting, but because it's about an underdog pushing against an oppressor. We get just enough insight into the society on the moon to understand the conflict and resolution, and that's it.

              So perhaps you didn't get a great reception because the setting took too much of the stage?

              • My point with the setting was that, at least according to verifiable evidence, certain aspects of society have been shown to run better in every implementation, under circumstances that translate across many or all cultures, but we don't use them in most places because they're strange, or because of demonization in the eyes of the more influential demographics.

                In short, it's a setting that proposes, in a fantastical way, that "a near utopia would require a lot of planning and transition periods, but the biggest blockers now are greed, arrogance, and hatred, not technology," with its basic messages being relevant to today.

                If people want to know the societal implications of technology, I wanted to give someone a reason to trust technology when the people behind it are trustworthy, and to show that governments and corporations can only be trusted as long as that trust is unbroken, but individual people can change.

                It is illegal in the setting for the oligarchs to remain in control if so much as ONE of them is ever caught with non-UBI currency. You get one residence, period, and it's small because of the huge population size (~100 billion when the novels would have started), and the UBI includes that free residence. Which means the oligarchs are not just on UBI; they can't spend more than that UBI per month, and it's in a special corruption-resistant currency with all transactions publicly visible. The only extra security the "council" gets is that they don't take 9 months to respawn if they get killed 3 times in a week.

                That's not what drives the story, though. Corruption in business still exists to a degree, but beyond that, the inhabitants have time to heal from trauma. And while certain inanimate objects are made eternal, most are not, because that would make life boring, and economics (even if it functions more like game mechanics than an actual economy) relies on a degree of scarcity.

                The characters learn when they're resurrected that immortality is provided for its own sake, and because nobody deserves to stop existing, but not everyone is easily swayed to the idea that there's no room in this setting for hatred. There are a lot of things that cause cynicism, but all of them give a certain kind of person a stress reaction to immortality not seen in people without significant mental trauma, which is what the story would have been about: learning to be okay with the realization that you can never really reach a final destination; that if the afterlife is a game, then you have to play that game to a degree, or you'll just be miserably bored in unnecessary "tribute" to the idea that worth is based on numbers or reputation.

                Unfortunately, even when I provided free samples of the stories, I only received blatant disapproval of the setting and outright demands to modify it into something dystopian in practice, not just in appearance. A big theme was supposed to be that the setting wasn't built to be beautiful, but because the people in it are not being constantly pushed down, because the structure of society resembles the best real life has ever had, and because graffiti and personal additions for beautification are both legal and encouraged, even a world of creaking thousand-year-old buildings and standardized apartment modules with solar-panel exteriors feels less like cyberpunk and more like solarpunk than solarpunk itself ever has.

                Eventually I gave up, because people saying "your work should not be about how this society avoids dystopia, but about how I think of it as a dystopia because people I disagree with are there" does not change the fact that if we had to pick restrictions, it would be to ban people like Hitler from running for any political position or keeping their original identity, not to leave them dead entirely, because then everybody starts complaining that, because they dislike Person X, Person X shouldn't even be allowed to state their case. Once you start getting into resurrection and reprogramming reality itself, letting slippery slopes like that begin to crumble is essentially playing Russian roulette while playing god. But no, people still think their personal standards are the center of morality, and some even defended leaving groups dead based purely on association. I write fantasies, not tragedies. If that's how people think, I'll write a much less kind assessment of what we can become, one that we each would actually deserve.
