What would be the cheapest and most cost-efficient way of self-hosting LLMs?
I have a mini PC with an AMD 5700U where I host some services, including Ollama and Open WebUI.
Unfortunately, ROCm support isn't quite there yet, and support for mobile GPUs even less so.
Surprisingly, prompts do work when Ollama is configured to use the CPU, but the speed is just... well, not good.
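For reference, this is how I gauge the speed: Ollama's `--verbose` flag prints timing stats, including the eval rate in tokens per second, after each reply. Something like this (the model tag is just an example, use whatever you have pulled):

```sh
# Prints prompt eval and eval rate (tokens/s) after the response finishes.
ollama run llama3:8b --verbose "Summarize RAID levels in two sentences."
```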
So, what would be a cheap and energy-efficient setup to run some kind of LLM for personal use while still getting decent speed?
I was thinking about getting an eGPU enclosure, but I'm not sure how solid that would turn out to be.
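For context, the current CPU-only stack is just two containers, roughly like this; the ports, container names, and volume names are the usual defaults, so adjust as needed:

```sh
# Ollama serving on its default port, with models persisted in a named volume.
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Open WebUI pointed at the Ollama instance running on the host.
docker run -d --name open-webui \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -p 3000:8080 \
  ghcr.io/open-webui/open-webui:main
```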
Not sure about the setup, but I believe the 7600 XT would be the best buy; it has 16 GB of VRAM and it's supported by ROCm now.
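If it's supported, getting Ollama onto it should just be a matter of using the ROCm image and passing the GPU devices through. A rough sketch (the device paths are the standard AMD ones, not tested on this particular card):

```sh
# Ollama's ROCm build, with the AMD kernel driver devices passed into the container.
docker run -d --name ollama \
  --device /dev/kfd --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama:rocm
```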