
Any of you have a self-hosted AI "hub"? (e.g. for LLM, stable-diffusion, ...)

I've been looking into self-hosting LLMs and stable-diffusion models using something like LocalAI and/or Ollama, with LibreChat as a front end.
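
For anyone curious what a minimal setup looks like, here is a sketch of a Docker Compose file for Ollama (the official `ollama/ollama` image serves its API on port 11434; the GPU reservation assumes an NVIDIA card with the NVIDIA Container Toolkit installed — adjust or drop it for CPU-only use):

```yaml
# Minimal sketch, assuming Docker + NVIDIA Container Toolkit.
services:
  ollama:
    image: ollama/ollama           # official Ollama image
    ports:
      - "11434:11434"              # Ollama's default API port
    volumes:
      - ollama-data:/root/.ollama  # persist downloaded model weights
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
volumes:
  ollama-data:
```

After `docker compose up -d`, you can pull and chat with a model via `docker exec -it <container> ollama run <model-name>`, or point a front end like LibreChat at the API endpoint.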

Some questions to get a nice discussion going:

  • Any of you have experience with this?
  • What are your motivations?
  • What are you using in terms of hardware?
  • What are your considerations regarding energy efficiency and running costs?
  • Have you considered renting a GPU instead, and what are the privacy implications?