  • Comments on Trump lawyer Alina Habba's use of "gaming laptop"
  • I once worked for a company that had an accountant who used a gaming laptop. They didn't play games; it was just the only decent laptop they could get with a number pad.

  • Humble Choice October 2023 Review - A spooky month
  • I really enjoyed The Quarry, although I killed pretty much everyone.

  • Doctor Who confirms spin-off show return
  • Ooo. I haven't listened to most of those, might give them a go. I did enjoy The Doctor's Daughter though. Jago & Litefoot is probably still my favourite BF spin-off.

  • The EU says X is the worst platform for disinformation | Just as it removes a way to report election misinformation
  • I haven't used Twitter much in years really, but after switching to Lemmy during the reddit API debacle I thought I'd give Mastodon a go, and I'm really enjoying it. I've set up a ton of filters to block out stuff I don't want to see, and joined a couple of instances for two different personas.

    I'm not using the official mobile client. On Android I use Tusky and Megalodon. Tusky is my daily driver and feels like how I remember the Twitter app from 7 or 8 years ago. Megalodon is nice for cross-instance discovery, but has a couple of UI quirks that prevent me from using it fully. My SO uses Ice Cubes on iOS and that looks pretty sweet.

    Personally I found the switch comparable to Lemmy: it took me a month or two to build up a good number of active people to follow and get to the stage of having an interesting feed. It also seems to have got a lot more active in the last week.

    When I have dropped into Twitter it's a dumpster fire on top of a cesspit. I don't think I could go back. I'd absolutely recommend giving Mastodon a go.

  • LLMs are surprisingly great at compressing images and audio, DeepMind researchers find
  • I imagine the compression is tied to the specific model, so if you update or retrain it you might lose access to the compressed data. A toy sketch of that idea follows below.
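
    A toy sketch of that dependence (my own illustration in Python, not anything from the paper): stand in for the LLM with a simple frequency ranking, encode each symbol as its rank under the model, and note that a "retrained" model decodes the same ranks to garbage.

    ```python
    # Toy stand-in for an LLM: a fixed symbol ranking learned from a corpus.
    # Encoding maps each character to its rank; decoding maps ranks back.
    # The round trip only works with the *same* model that did the encoding.
    from collections import Counter

    def train(corpus: str) -> list[str]:
        """Rank symbols from most to least frequent in the corpus."""
        freq = Counter(corpus)
        return sorted(freq, key=lambda s: (-freq[s], s))

    def encode(text: str, ranking: list[str]) -> list[int]:
        return [ranking.index(ch) for ch in text]

    def decode(ranks: list[int], ranking: list[str]) -> str:
        return "".join(ranking[r] for r in ranks)

    model_v1 = train("hello world, hello compression")
    model_v2 = train("completely different training data!")  # the "retrained" model

    code = encode("hello", model_v1)
    print(decode(code, model_v1))  # "hello" -- same model round-trips cleanly
    print(decode(code, model_v2))  # gibberish -- the ranks no longer line up
    ```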

  • LLMs are surprisingly great at compressing images and audio, DeepMind researchers find
  • The research specifically looked at lossless compression, hence the comparison with gzip and LZMA2:

    "For example, the 70-billion parameter Chinchilla model impressively compressed data to 8.3% of its original size, significantly outperforming gzip and LZMA2, which managed 32.3% and 23% respectively."

    However, they do say it's not especially practical at the moment, given that gzip is a tiny executable compared to the many gigabytes of the LLM itself.
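
    As a rough sketch of why a strong predictor compresses well (my own toy in Python, with a bigram character model standing in for the LLM): an ideal arithmetic coder spends about -log2(p) bits per symbol under the model's predicted probabilities, so sharper predictions mean fewer bits.

    ```python
    # Ideal lossless code length under a predictive model vs. gzip.
    # An arithmetic coder driven by a model costs about -log2 p(symbol | context)
    # bits per symbol, so we can tally the theoretical compressed size directly.
    import gzip
    import math
    from collections import defaultdict

    text = "the quick brown fox jumps over the lazy dog. " * 50

    # "Train" the stand-in model: conditional character counts for P(next | prev).
    counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(text, text[1:]):
        counts[prev][nxt] += 1

    def prob(prev: str, nxt: str) -> float:
        return counts[prev][nxt] / sum(counts[prev].values())

    # Sum of -log2 p over the text: the size an ideal arithmetic coder achieves.
    model_bits = sum(-math.log2(prob(p, n)) for p, n in zip(text, text[1:]))
    gzip_bits = len(gzip.compress(text.encode())) * 8

    print(f"bigram model (ideal coding): {model_bits / 8:.0f} bytes")
    print(f"gzip:                        {gzip_bits / 8:.0f} bytes")
    # On repetitive text like this the toy model may well lose to gzip; the far
    # sharper predictions of a real LLM are what let Chinchilla reach 8.3%.
    # And as noted above, decoding needs the identical many-gigabyte model.
    ```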