Generating (often non-con) porn is the new crypto mining
  • The main issue here is user knowledge and consent. Otherwise this isn't a whole lot different from services like vast.ai, which offer on-demand GPU rentals, or the KoboldAI Horde. Based on the incentives offered, it's clear that they're targeting younger or less savvy users, which is a problem.

  • Microsoft’s VASA-1 can deepfake a person with one photo and one audio track
  • The "why would they make this" people don't understand how important this type of research is. It's important to show what's possible so that we can be ready for it. There are many bad actors already pursuing similar tools, if they don't have them already. The worst case is being blindsided by something we've never seen before.

  • Dock GPU to Laptop or to small SOC?
  • Your best bet would probably be to get a used office PC to put the card in. You'll likely have to replace the power supply and maybe swap the storage, but with how much proper external enclosures go for, the price might not be too different. Some frameworks don't support loading directly to the GPU, so make sure that you have more RAM than VRAM.

    An ARM SoC won't work in most cases due to a lack of bandwidth and software support. The only board I know of that can do it is the RPi 5, and that's still mostly a proof of concept.

    In general I wouldn't recommend a Titan X unless you already have one, because it's been deprecated in CUDA, so getting modern libraries to work will be a pain.
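    The RAM-vs-VRAM rule of thumb above can be sketched as a quick check. This is illustrative only: the helper name and example sizes are my own, not from any particular framework.

    ```python
    def can_load_model(model_size_bytes: int, ram_bytes: int, vram_bytes: int) -> bool:
        """Frameworks that stage weights in system memory before copying
        them to the GPU need the model to fit in RAM as well as VRAM."""
        return model_size_bytes <= ram_bytes and model_size_bytes <= vram_bytes

    GB = 1024 ** 3

    # A 12 GB model on a 12 GB card: fine with 16 GB of RAM...
    print(can_load_model(12 * GB, ram_bytes=16 * GB, vram_bytes=12 * GB))  # True

    # ...but a machine with only 8 GB of RAM can't stage it, even though
    # the card itself has enough VRAM. Hence "more RAM than VRAM".
    print(can_load_model(12 * GB, ram_bytes=8 * GB, vram_bytes=12 * GB))   # False
    ```

    In practice you'd also want headroom for the OS and the framework's own buffers, so treating this as a strict "RAM > VRAM" floor is the safer reading.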

  • Microsoft’s first AI PCs are the Surface Pro 10 and Surface Laptop 6 for businesses
  • The "AI PC" specification requires a minimum of 40 TOPS of AI compute, which is over double the 18 TOPS of the current M3's Neural Engine. A direct comparison doesn't really work, though.

    What really matters is how it's made available for development. The Neural Engine is basically a black box: it can't be incorporated into any low-level projects because it's only exposed through a high-level Swift API. Intel, by comparison, seems to be targeting PyTorch acceleration with their libraries.
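    The "over double" claim above checks out arithmetically, using the figures from the comment:

    ```python
    # TOPS figures as quoted in the comment above.
    AI_PC_MIN_TOPS = 40   # Microsoft's "AI PC" minimum NPU compute
    M3_NEURAL_TOPS = 18   # current M3 Neural Engine

    ratio = AI_PC_MIN_TOPS / M3_NEURAL_TOPS
    print(f"{ratio:.2f}x")  # 2.22x, i.e. over double
    ```

    Of course, raw TOPS counts hide precision (INT8 vs FP16) and memory-bandwidth differences, which is part of why the direct comparison doesn't really work.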

  • Generative AI will eventually poison itself
  • This article grossly overstates the findings of the paper. It's true that bad generated data hurts model performance, but that's true of bad human data as well. The paper used OPT-125M as its generator model, a very small research model with fairly low-quality and often incoherent outputs. The higher-quality generated data that makes up the majority of the generated text online is far less of an issue. Using generated data to improve output consistency is a common practice for both text and image models.

  • OpenAI's GPT Trademark Request Has Been Denied

    > First, applicant argues that the mark is not merely descriptive because consumers will not immediately understand what the underlying wording "generative pre-trained transformer" means. The trademark examining attorney is not convinced. The previously and presently attached Internet evidence demonstrates the extensive and pervasive use in applicant's software industry of the acronym "GPT" in connection with software that features similar AI technology with ask and answer functions based on pre-trained data sets; the fact that consumers may not know the underlying words of the acronym does not alter the fact that relevant purchasers are adapted to recognizing that the term "GPT" is commonly used in connection with software to identify a particular type of software that features this AI ask and answer technology. Accordingly, this argument is not persuasive.

    BetaDoggo_ @lemmy.world
    Posts 1
    Comments 256