If you have a supported GPU you could try Ollama (with Open WebUI); it works like a charm.
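If you want to poke at it outside the web UI: Open WebUI is just a front end for Ollama's local REST API, so a few lines of Python make a decent sanity check. A minimal sketch, assuming a default install listening on port 11434 and a model you've already pulled (llama3 here is a placeholder, substitute whatever you have):

```python
import json
import urllib.request

# Ollama exposes a REST API on localhost:11434 by default;
# Open WebUI talks to this same endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask(model: str, prompt: str) -> str:
    """Send one non-streaming prompt and return the full reply."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask("llama3", "Why is the sky blue?"))
```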
You don't even need a supported GPU; I run Ollama on my RX 6700 XT.
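For context, the RX 6700 XT isn't on ROCm's official support list, which is why it "shouldn't" work. The usual community workaround is to override the GPU target ROCm detects before starting the server; a sketch, assuming the commonly cited 10.3.0 override for RDNA2 cards (not an official AMD recommendation):

```python
import os
import subprocess

# The RX 6700 XT reports gfx1031, which ROCm's official list omits;
# overriding to the supported gfx1030 target is the common workaround.
env = dict(os.environ, HSA_OVERRIDE_GFX_VERSION="10.3.0")

# Launch the Ollama server with the override applied
# (assumes `ollama` is on PATH).
subprocess.run(["ollama", "serve"], env=env)
```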
You don't even need a GPU; I can run Ollama + Open WebUI on my CPU with an 8B model, fast af.
I tried it on my CPU (with Llama 3 7B), but unfortunately it ran really slow (I've got a Ryzen 5700X).
I ran it on my dual-core Celeron and... just kidding. Try the mini Llama 1B; I'm in the same boat with a Ryzen 5000-something CPU.
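On CPU-only boxes, model size matters more than anything else; a small model like the 1B mentioned above is the difference between usable and painful. If you want numbers instead of vibes, Ollama reports token counts and generation time alongside each response, so you can measure throughput; a sketch against the same local REST API (llama3.2:1b is the 1B model currently in the Ollama library, assuming you've pulled it):

```python
import json
import urllib.request

def tokens_per_second(model: str, prompt: str) -> float:
    """Rough generation throughput against a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        stats = json.loads(resp.read())
    # Ollama reports generated-token count (eval_count) and generation
    # time in nanoseconds (eval_duration) in the final response object.
    return stats["eval_count"] / (stats["eval_duration"] / 1e9)

print(f"{tokens_per_second('llama3.2:1b', 'Explain RAID 5 briefly.'):.1f} tok/s")
```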
I have the same GPU, my friend. I was trying to say that you won't be able to run ROCm on some Radeon HD xy from 2008 :D
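If you're not sure what ROCm makes of your card, rocminfo (shipped with the ROCm install) lists every agent the runtime can see along with its gfx target name; a quick sketch for pulling those out, assuming the ROCm tools are installed:

```python
import re
import subprocess

# rocminfo lists every agent the ROCm runtime can see; discrete GPUs
# show up with a gfxNNNN target name (e.g. gfx1031 for an RX 6700 XT).
out = subprocess.run(["rocminfo"], capture_output=True, text=True).stdout
targets = sorted(set(re.findall(r"gfx\d+\w*", out)))
print("GPU targets ROCm can see:", targets or "none (CPU only?)")
```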