
How easy is it to run Stable Diffusion on the Framework's AMD graphics card?

I don’t know much about graphics cards, but the Framework Laptop seems to offer an “AMD Radeon™ RX 7700S”, and Stable Diffusion on AMD requires ROCm on Linux.

It’s not completely clear whether ROCm runs on the AMD Radeon™ RX 7700S, so I was wondering if anyone has experience setting it up on a Framework.
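For what it’s worth, here is a minimal sketch of a commonly reported workaround, with heavy caveats: the RX 7700S is an RDNA3 part (reportedly `gfx1102`) that is not on ROCm’s official support list, and many users report that spoofing the GFX version as the officially supported `gfx1100` (i.e. `11.0.0`) lets ROCm builds of PyTorch run anyway. The specific value and whether it works on this exact card are assumptions, not guarantees.

```python
import os

# Reported workaround for RDNA3 GPUs missing from ROCm's official support
# list: override the GFX version ROCm sees to a supported one (11.0.0).
# This must be set BEFORE importing torch or any other ROCm-backed library.
os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "11.0.0")

print(os.environ["HSA_OVERRIDE_GFX_VERSION"])
```

After setting this, a ROCm build of PyTorch should report the GPU via `torch.cuda.is_available()` if the workaround applies to your card.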

  • It won't run. Don't try to damage your laptop. This should be done on a desktop or a server in the first place.

    • I don’t have the space for a desktop/server yet, hence I’m considering a laptop with a discrete graphics card.
      What do you mean by “It won’t run. Don’t try to damage your laptop.”? What roadblock prevents it from running? Why would it cause damage?

      • I would like to retract the first sentence. It is misleading: the model can run without any hiccups under ROCm, and the RX 7700S is probably a decent GPU for this job. However, I will provide more context on the second sentence.

        I tried training and running models (GAN, LLM) on my Lenovo IdeaPad S540 (i5-8265U, 8 GB RAM, MX250 2 GB laptop). The result of that heavy computation: the battery is toast and holds essentially no charge, the barrel power port is severely damaged where it joins the board, the metal panels have stress marks from uneven heating, and the soldered RAM is a goner. Remember those old TVs you had to hit to make them work? That's the state of my device right now.

        The cooling system in a laptop isn't the best, so it would be better to run the model on a desktop or a cloud GPU. Incidentally, older Stable Diffusion models reportedly work on something as old as an RX 580 with ROCm.

        • I see. Thank you.

          Would keeping it plugged in or removing the battery help with the battery issue?

          Edit: Also, is there any way to force the GPU to throttle earlier to reduce damage?

          • I think that would still be a bad idea - running billions of parameters on a laptop - because it's not just the battery; you also need to think about how much heat the heatsink can dissipate. However, there's nothing wrong with running the model a few times. A desktop GPU would handle a load that heavy easily.

            Might I suggest you try KOALA? It has far fewer parameters than a typical large-scale diffusion model, so it will be forgiving on your device. Best of luck with your attempt.
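On the earlier question about forcing the GPU to throttle sooner: a minimal sketch, assuming `rocm-smi` (shipped with ROCm) is installed and the card honors power capping. The 60 W figure is purely illustrative, not a recommendation for this specific GPU.

```shell
# Cap the GPU's board power (in watts) so it throttles earlier and runs
# cooler; requires root. 60 W here is an illustrative value.
sudo rocm-smi --setpoweroverdrive 60

# Show current power draw to verify the cap is being respected.
rocm-smi --showpower
```

The cap resets on reboot, so it would need to be reapplied (or scripted) before each heavy run.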
