Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 7 July 2024

Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh facts of Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

> The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
>
> Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

We regret to inform you that Ray Kurzweil is back on his bullshit
  • First, learn the difference between scorn or disdain and hate.

    Second, read the comments in the thread that was already made about those "'sort of' correct" predictions.

  • Honest Government Ad | AI

    Bumping this up from the comments.

    We regret to inform you that Ray Kurzweil is back on his bullshit
  • "Computers will be really good at chess" was already a trope in 1960s science fiction. HAL 9000 is canonically so good that he was instructed to throw the game half the time so that his human opponents don't get bored. The Enterprise computer is so good that Spock being able to beat it — Spock — is a major plot point.

  • Mother Jones sues OpenAI and Microsoft - best pic of Sam Altman-Fried
  • You know what? "Go fuck yourself" is way out of line for a conversation about cameras. Bye now.

  • We regret to inform you that Ray Kurzweil is back on his bullshit
  • Some of Kurzweil's predictions in 1999 about 2019:

    > A $1,000 computing device is now approximately equal to the computational ability of the human brain. Computers are now largely invisible and are embedded everywhere. Three-dimensional virtual-reality displays, embedded in glasses and contact lenses, provide the primary interface for communication with other persons, the Web, and virtual reality. Most interaction with computing is through gestures and two-way natural-language spoken communication. Realistic all-encompassing visual, auditory, and tactile environments enable people to do virtually anything with anybody regardless of physical proximity. People are beginning to have relationships with automated personalities as companions, teachers, caretakers, and lovers.

    Also:

    > Three-dimensional nanotube lattices are now a prevalent form of computing circuitry.

    And:

    > Autonomous nanoengineered machines can control their own mobility and include significant computational engines.

    And:

    > “Phone” calls routinely include high-resolution three-dimensional images projected through the direct-eye displays and auditory lenses. Three-dimensional holography displays have also emerged. In either case, users feel as if they are physically near the other person. The resolution equals or exceeds optimal human visual acuity. Thus a person can be fooled as to whether or not another person is physically present or is being projected through electronic communication.

    And:

    > The all-enveloping tactile environment is now widely available and fully convincing. Its resolution equals or exceeds that of human touch and can simulate (and stimulate) all of the facets of the tactile sense, including the sensing of pressure, temperature, textures, and moistness. Although the visual and auditory aspects of virtual reality involve only devices you have on or in your body (the direct-eye lenses and auditory lenses), the “total touch” haptic environment requires entering a virtual reality booth. These technologies are popular for medical examinations, as well as sensual and sexual interactions with other human partners or simulated partners. In fact, it is often the preferred mode of interaction, even when a human partner is nearby, due to its ability to enhance both experience and safety.

    And:

    > Automated driving systems have been found to be highly reliable and have now been installed in nearly all roads.

    And:

    > The type of artistic and entertainment product in greatest demand (as measured by revenue generated) continues to be virtual-experience software, which ranges from simulations of “real” experiences to abstract environments with little or no corollary in the physical world.

    And:

    > The expected life span, which, as a result of the first Industrial Revolution (1780 through 1900) and the first phase of the second (the twentieth century), almost doubled from less than forty, has now substantially increased again, to over one hundred.

  • We regret to inform you that Ray Kurzweil is back on his bullshit
  • "Humans are generally far removed from the scene of battle" (if you don't count the people that the drones are blowing up)

  • We regret to inform you that Ray Kurzweil is back on his bullshit
  • Some of Kurzweil's predictions in 1999 about 2009:

    • “Unused computes on the Internet are harvested, creating … human brain hardware capacity.”
    • “The online chat rooms of the late 1990s have been replaced with virtual environments…with full visual realism.”
    • “Interactive brain-generated music … is another popular genre.”
    • “the underclass is politically neutralized through public assistance and the generally high level of affluence”
    • “Diagnosis almost always involves collaboration between a human physician and a … expert system.”
    • “Humans are generally far removed from the scene of battle.”
    • “Despite occasional corrections, the ten years leading up to 2009 have seen continuous economic expansion”
    • “Cables are disappearing.”
    • “grammar checkers are now actually useful”
    • “Intelligent roads are in use, primarily for long-distance travel.”
    • “The majority of text is created using continuous speech recognition (CSR) software”
    • “Autonomous nanoengineered machines … have been demonstrated and include their own computational controls.”
  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 30 June 2024
  • And then there's this asshole:

    "AI is using all of our energy and water" is just an attempt to translate an aesthetic revulsion into a common unit of measure. It's "I don't like this" expressed in watt-gallons.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 30 June 2024
  • > To avoid confirmation bias and subjective interpretation, we decided to leverage language models for a more objective analysis of the data. By providing the models with the complete set of notes, we aimed to uncover patterns and trends without our pre-existing notions and biases.

    ... the Hell?

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 30 June 2024
  • The smug presumption that any brand of spicy autocomplete is a viable tool "to summarize information, simplify language, or test their knowledge" is so fucking galling.

  • Neither the devil you know nor the devil you don’t
  • More broadly (i.e., not just in relation to Cory Doctorow), I’ve seen the take floating around that’s like “hey, what the heck, artists who were opposed to ridiculous IP rights restrictions when it was the music industry doing it are now in favor of those restrictions when it’s AI, what gives with this hypocrisy?” which I think kind of… misses the point?

    I've noticed that too, on occasion. I think the "hey whoa, artists are copyright maximalists now?!" takes tend to miss how artists are coming from concerns about what is morally right and how they can make a living, not copyright as a principle. The latter is, at most, a tool to achieve the former.

    With that in mind, a lot of the artist outrage over AI feels much more in line with artists getting mad about, say, watermark-removal tools, or people reposting art without credit, than it does with the copyright battles of the 00s.

    This says it better than I was going to.

  • EA is becoming a cult? It must be wokeism fault
  • In the first Foundation story, there's a weird mention of applying symbolic logic to human language that comes from nowhere and goes nowhere. Campbell insisted upon it because

    > he felt in our discussions that symbolic logic, further developed, would so clear up the mysteries of the human mind as to leave human actions predictable. The reason human beings are so unpredictable was we didn't really know what they were saying and thinking because language is generally used obscurely. So what we needed was something that would unobscure the language and leave everything clear.

    Clear being a fortuitous choice of wording on Asimov's part there, given, well.

    TESCREAL and Scientology don't just share methodology; they both descend directly from "Golden Age" science fiction. In this essay I will

  • EA is becoming a cult? It must be wokeism fault
  • "And a waifu is only a waifu, but a good cigar is a smoke."

  • EA is becoming a cult? It must be wokeism fault
  • Mastodon has Reply Guys. Lemmy has Cater To Me Whilst I Am Literally, Not Figuratively, Taking a Shit Guys.

  • EA is becoming a cult? It must be wokeism fault
  • banned for obnoxious not-pology

  • EA is becoming a cult? It must be wokeism fault
  • If we trace one ancestry path back to science-fiction fandom, well, there's John W. Campbell.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 16 June 2024

    Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 9 June 2024

    Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Neil Gaiman on spicy autocomplete
    www.tumblr.com Neil Gaiman

    I apologize if you’ve been asked this question before I’m sure you have, but how do you feel about AI in writing? One of my teachers was “writing” stories using ChatGPT then was bragging about how go…

    > Many magazines have closed their submission portals because people thought they could send in AI-written stories.
    >
    > For years I would tell people who wanted to be writers that the only way to be a writer was to write your own stories because elves would not come in the night and do it for you.
    >
    > With AI, drunk plagiaristic elves who cannot actually write and would not know an idea or a sentence if it bit their little elvish arses will actually turn up and write something unpublishable for you. This is not a good thing.

    Cybertruck owners allege pedal problem as Tesla suspends deliveries
    arstechnica.com Cybertruck owners allege pedal problem as Tesla suspends deliveries

    Owners will have to wait until April 20 for deliveries to resume.

    > Tesla's troubled Cybertruck appears to have hit yet another speed bump. Over the weekend, dozens of waiting customers reported that their impending deliveries had been canceled due to "an unexpected delay regarding the preparation of your vehicle."
    >
    > Tesla has not announced an official stop sale or recall, and as of now, the reason for the suspended deliveries is unknown. But it's possible the electric pickup truck has a problem with its accelerator. [...] Yesterday, a Cybertruck owner on TikTok posted a video showing how the metal cover of his accelerator pedal allegedly worked itself partially loose and became jammed underneath part of the dash. The driver was able to stop the car with the brakes and put it in park. At the beginning of the month, another Cybertruck owner claimed to have crashed into a light pole due to an unintended acceleration problem.

    Meanwhile, layoffs!

    Google Books Is Indexing AI-Generated Garbage
    www.404media.co Google Books Is Indexing AI-Generated Garbage

    Google said it will continue to evaluate its approach “as the world of book publishing evolves.”

    > Google Books is indexing low quality, AI-generated books that will turn up in search results, and could possibly impact Google Ngram viewer, an important tool used by researchers to track language use throughout history.

    Elon Musk’s Tunnel Reportedly Oozing With Skin-Burning Chemical Sludge
    futurism.com Elon Musk’s Tunnel Reportedly Oozing With Skin-Burning Chemical Sludge

    Elon Musk's Boring Company has only built a few miles of tunnel underneath Vegas — but those tunnels have taken a toxic toll.

    [Eupalinos of Megara appears out of a time portal from ancient Ionia] Wow, you guys must be really good at digging tunnels by now, right?

    New York taxpayers are paying for spicy autocomplete to tell landlords they can discriminate
    themarkup.org NYC’s AI Chatbot Tells Businesses to Break the Law – The Markup

    The Microsoft-powered bot says bosses can take workers’ tips and that landlords can discriminate based on source of income

    > In October, New York City announced a plan to harness the power of artificial intelligence to improve the business of government. The announcement included a surprising centerpiece: an AI-powered chatbot that would provide New Yorkers with information on starting and operating a business in the city.
    >
    > The problem, however, is that the city’s chatbot is telling businesses to break the law.

    Chris Langan and the "Cognitive Theoretic Model of the Universe"? Oh boy!

    a lesswrong: 47-minute read extolling the ambition and insights of Christopher Langan's "CTMU"

    a science blogger back in the day: not so impressed

    > [I]t’s sort of like saying “I’m going to fix the sink in my bathroom by replacing the leaky washer with the color blue”, or “I’m going to fly to the moon by correctly spelling my left leg.”

    Langan, incidentally, is a 9/11 truther, a believer in the "white genocide" conspiracy theory and much more besides.

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 31 March 2024

    Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post — there’s no quota here and the bar really isn't that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Elsevier keeps publishing articles written by spicy autocomplete

    If you've been around, you may know Elsevier for surveillance publishing. Old hands will recall their running arms fairs. To this storied history we can add "automated bullshit pipeline".

    In Surfaces and Interfaces, online 17 February 2024:

    > Certainly, here is a possible introduction for your topic:Lithium-metal batteries are promising candidates for high-energy-density rechargeable batteries due to their low electrode potentials and high theoretical capacities [1], [2].

    In Radiology Case Reports, online 8 March 2024:

    > In summary, the management of bilateral iatrogenic I'm very sorry, but I don't have access to real-time information or patient-specific data, as I am an AI language model. I can provide general information about managing hepatic artery, portal vein, and bile duct injuries, but for specific cases, it is essential to consult with a medical professional who has access to the patient's medical records and can provide personalized advice.

    Edit to add this erratum:

    > The authors apologize for including the AI language model statement on page 4 of the above-named article, below Table 3, and for failing to include the Declaration of Generative AI and AI-assisted Technologies in Scientific Writing, as required by the journal’s policies and recommended by reviewers during revision.

    Edit again to add this article in Urban Climate:

    > The World Health Organization (WHO) defines HW as “Sustained periods of uncharacteristically high temperatures that increase morbidity and mortality”. Certainly, here are a few examples of evidence supporting the WHO definition of heatwaves as periods of uncharacteristically high temperatures that increase morbidity and mortality

    And this one in Energy:

    > Certainly, here are some potential areas for future research that could be explored.

    Can't forget this one in TrAC Trends in Analytical Chemistry:

    > Certainly, here are some key research gaps in the current field of MNPs research

    Or this one in Trends in Food Science & Technology:

    > Certainly, here are some areas for future research regarding eggplant peel anthocyanins,

    And we mustn't ignore this item in Waste Management Bulletin:

    > When all the information is combined, this report will assist us in making more informed decisions for a more sustainable and brighter future. Certainly, here are some matters of potential concern to consider.

    The authors of this article in Journal of Energy Storage seem to have used GlurgeBot as a replacement for basic formatting:

    > Certainly, here's the text without bullet points:

    SneerClub Classic: Big Yud's Mad Men Cosplay


    Yudkowsky writes,

    > How can Effective Altruism solve the meta-level problem where almost all of the talented executives and ops people were in 1950 and now they're dead and there's fewer and fewer surviving descendants of their heritage every year and no blog post I can figure out how to write could even come close to making more people being good executives?

    Because what EA was really missing is collusion to hide the health effects of tobacco smoking.

    Billionaires push AI apocalypse risk through college student groups

    > Last summer, he announced the Stanford AI Alignment group (SAIA) in a blog post with a diagram of a tree representing his plan. He’d recruit a broad group of students (the soil) and then “funnel” the most promising candidates (the roots) up through the pipeline (the trunk).

    See, it's like marketing the idea, in a multilevel way

    Talking about a ‘schism’ is ahistorical
    medium.com Talking about a ‘schism’ is ahistorical

    In two recent conversations with very thoughtful journalists, I was asked about the apparent ‘schism’ between those making a lot of noise…

    Emily M. Bender on the difference between academic research and bad fanfiction
