Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 18 August 2024
  • might involve some amount of hubris you say...

    This really opened my eyes to some historical context I never thought of before.

    My initial gut reaction was judgmental about the way billionaires spend their money, thinking it might involve some amount of hubris.

    Then I realized I have no idea how sculptures that are now shown in museums as treasured historical art pieces were judged at the time they were created. Today we treasure them. But what did the general population think of them? I have no idea.

    I imagine that at the time of their commissioning they were also paid for by affluent people who could afford such luxuries. People who probably mirror today’s billionaires in influence and access. So what’s different about these?

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 11 August 2024
  • my local community radio station is getting in on the act with a quality sneer in their annual magazine:

    What if the Silicon Valley creeps who control huge swathes of our existence decided that they didn't want this to be their legacy? Well, one solution would be to guarantee the survival of the species by uploading our brains into computers and rocketing them into space. If a few people cark it in the climate catastrophe, it'll be fine as long as there's a big cyber noggin down the track... just google TESCREAL. We didn't make this up.

  • OpenAI launches SearchGPT — and fails its own demo
  • Sam and the truly talented team at OpenAI innately understand that for AI-powered search to be effective, it must be founded on the highest-quality, most reliable information furnished by trusted sources...

    Robert Thomson, Chief Executive, News Corp

    Mmm yes, I too turn to News Corp for the highest quality, most reliable information.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 21 July 2024
  • Proton kept popping up as massively recommended, while occasional critical mentions from folks in anarchist circles, etc., made me a bit 🤨 and want to dig in more,

    No surprise that folks in anarchist circles are skeptical of Proton ha. That said, I do know quite a few people in the email "industry" who are broadly skeptical of Proton's general philosophy/approach to email security, and the way they market their service/offerings.

    Others I poked into are Fastmail and Tuta - both seem a fair bit better. Might be worth a look

    Fastmail has a great interface and user experience imo, significantly better than any other web client I've tried. That said, they're not end-to-end encrypted, so they're not really trying to fill the same niche as Proton/Tuta.

    From their website:

    Fastmail customers looking for end-to-end encryption can use PGP or S/MIME in many popular 3rd party apps. We don’t offer end-to-end encryption in our own apps, as we don’t believe it provides a meaningful increase in security for most users...

    If you don’t trust the server, you can’t trust it to load uncompromised code, so you should be using a third party app to do end-to-end encryption, which we fully support. And if you really need end-to-end encryption, we highly recommend you don’t use email at all and use Signal, which was designed for this kind of use case.

    I honestly don't know enough to separate the wheat from the chaff here (I can barely write functional python scripts lol, so please chime in if I'm completely off base), but this comes across to me as an understandable (and fairly honest) compromise that's probably adequate for some threat models? (rough sketch of the "PGP in a third-party app" route below, for the curious)

    Last time I used Tuta the user experience was pretty clunky, but afaik it is E2EE, so it's probably a better direct alternative to Proton.
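
    To make the "PGP in a third-party app" option a bit more concrete, here's a rough sketch of what client-side encryption on top of a non-E2EE provider can look like in Python. Assumptions: python-gnupg is installed, the recipient's public key is already imported into your local keyring, and the SMTP host, addresses and password below are placeholders rather than anything provider-specific - an illustration, not gospel.

    ```python
    # Rough sketch: encrypt the body locally with GnuPG before it ever touches
    # the mail provider's servers, then send it over ordinary SMTP.
    # Assumes `pip install python-gnupg`, gpg available on the system, and the
    # recipient's public key already imported into the local keyring.
    import smtplib
    from email.message import EmailMessage

    import gnupg

    gpg = gnupg.GPG()  # uses the default ~/.gnupg keyring

    plaintext = "Meet at the usual place at 7."
    encrypted = gpg.encrypt(plaintext, recipients=["friend@example.org"])
    if not encrypted.ok:
        raise RuntimeError(f"encryption failed: {encrypted.status}")

    msg = EmailMessage()
    msg["From"] = "me@example.org"
    msg["To"] = "friend@example.org"
    msg["Subject"] = "hello"          # subject and headers stay unencrypted
    msg.set_content(str(encrypted))   # body is now an ASCII-armoured PGP blob

    # Placeholder host and credentials - swap in whatever your provider documents.
    with smtplib.SMTP_SSL("smtp.example.org", 465) as server:
        server.login("me@example.org", "app-password")
        server.send_message(msg)
    ```

    The obvious catch (and presumably what the Fastmail quote is gesturing at): the subject, sender, recipients and other metadata stay visible to the provider no matter what you do to the body, which is roughly why they point people at Signal for the genuinely high-stakes cases.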

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 21 July 2024
  • In the land down under, the ABC continues to feed us with golden tech takes: Australia might be snoozing through the AI 'gold rush'

    "This is the largest gold rush in the history of capitalism and Australia is missing out," said Artificial Intelligence professor Toby Walsh, from the University of New South Wales.

    It's even bigger than the actual gold rush! Buy your pans now folks!

    One option Professor Van Den Hengel suggests is building our own Large Language Model like OpenAI's ChatGPT from the ground up, rather than being content to import the tech for decades to come.

    lol, but also please god no

    "The only way to have a say in what happens globally in this critical space is to be an active participant," he said.

    mate, I think that ship might have already sailed

  • Devs and the Culture of Tech
  • Hello, and welcome!

    I also desperately need a place where people know what a neoreactionary is so I can more easily complain about them so I’d like to hang around longer term too.

    Sounds like you're in the right place. Please complain as much as you need, so we can all scream, sigh and sneer into the void in unison.

    for my first project I use the Alex Garland TV show Devs

    I haven't read your piece yet, because I'd like to watch Devs unspoiled at some point, but I've bookmarked it to come back to later :)

  • Why I'm leaving EA
  • lmao this person writes a personal goodbye message, detailing their experience and motivations in what reads as quite an important decision for them, and receives "15 disagrees" for their trouble, and this comment:

    I gave this post a strong downvote because it merely restates some commonly held conclusions without speaking directly to the evidence or experience that supports those conclusions.

    This is EA at its "open to criticism" peak.

  • EA is becoming a cult? It must be wokeism fault
  • these people can't stop telling on themselves lmao

    There’s currently a loud minority of EAs saying that EA should ostracize people if they associate with people who disagree with them. That we should try to protect EAs from ideas that are not held by the majority of EAs.

    how fucking far are their heads up their own collective arses to not understand that you can't have a productive, healthy discourse without drawing a line in the sand?

    they spend fucking hundreds of collective hours going around in circles on the EA forum debating^[where "debating" here is continually claiming to be "open to criticism" while, at the same time, trashing anyone who does provide any form of legitimate criticism, so much so that it seems to be a "norm" for internal criticism to be anonymous for fear of retribution] this shit, instead of actually doing anything useful

    how do they, in good conscience, deny any responsibility for the real harms ideas cause, when they continue to lend them legitimacy by entertaining them over and over and over again?

    I swear these fuckers have never actually had to fight for or defend something that is actually important, or directly affects the day-to-day lived experience or material conditions of themselves or anyone they care about

    I hope we protect EA’s incredible epistemic norms

    lol, the norms that make it a-okay to spew batshit stuff like this? fuck off

    Also, it’s obvious that this isn’t actually EA cultiness really, but just woke ideology trying to take over EA

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 23 June 2024
  • NYT opinion piece title: Effective Altruism Is Flawed. But What’s the Alternative? (archive.org)

    lmao, what alternatives could possibly exist? have you thought about it, like, at all? no? oh...

    (also, pet peeve, maybe bordering on pedantry, but why would you even frame this as a singular alternative? The alternative doesn't exist, but there are actually many alternatives that have fewer flaws.)

    You don’t hear so much about effective altruism now that one of its most famous exponents, Sam Bankman-Fried, was found guilty of stealing $8 billion from customers of his cryptocurrency exchange.

    Lucky souls haven't found sneerclub yet.

    But if you read this newsletter, you might be the kind of person who can’t help but be intrigued by effective altruism. (I am!) Its stated goal is wonderfully rational in a way that appeals to the economist in each of us...

    rational_economist.webp

    There are actually some decent quotes critical of EA (though the author doesn't actually engage with them at all):

    The problem is that “E.A. grew up in an environment that doesn’t have much feedback from reality,” Wenar told me.

    Wenar referred me to Kate Barron-Alicante, another skeptic, who runs Capital J Collective, a consultancy on social-change financial strategies, and used to work for Oxfam, the anti-poverty charity, and also has a background in wealth management. She said effective altruism strikes her as “neo-colonial” in the sense that it puts the donors squarely in charge, with recipients required to report to them frequently on the metrics they demand. She said E.A. donors don’t reflect on how the way they made their fortunes in the first place might contribute to the problems they observe.

    jax @awful.systems