
www.home-assistant.io: Home Assistant 2024.6: Dipping our toes in the world of AI using LLMs 🤖

Control your home with an AI powered Assist, conditional sections and cards for your dashboards, amazing new media player commands, and so much more! 🚀



15 comments
  • Great, but it's restrictive, only letting you use OpenAI and Google. I'm already hosting oobabooga's text-generation-webui; let me use that.

    • I believe that's because those two APIs support function calling; open-source support is still catching up.
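For context, "function calling" here means the client sends tool schemas alongside the chat messages and the model replies with a structured call instead of free text. A minimal Python sketch of that shape, assuming the OpenAI-style request/response format; the `turn_on_light` tool and its parameters are made up for illustration, not Home Assistant's actual schema:

```python
import json

# Tool schema the client would send with the chat request.
# The tool name and parameters are illustrative assumptions.
tools = [{
    "type": "function",
    "function": {
        "name": "turn_on_light",
        "description": "Turn on a light in the home",
        "parameters": {
            "type": "object",
            "properties": {
                "entity_id": {"type": "string"},
                "brightness": {"type": "integer", "minimum": 0, "maximum": 255},
            },
            "required": ["entity_id"],
        },
    },
}]

# A function-calling model answers with a structured tool call rather than
# prose, so the client can dispatch it without writing a custom parser.
assistant_message = {
    "role": "assistant",
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {
            "name": "turn_on_light",
            "arguments": json.dumps({"entity_id": "light.kitchen", "brightness": 128}),
        },
    }],
}

# Client side: read the call name and decode its JSON arguments.
call = assistant_message["tool_calls"][0]["function"]
args = json.loads(call["arguments"])
print(call["name"], args["entity_id"])
```

The point is that the parsing burden sits in the API contract, not in each integration; that's what a backend has to offer before it can plug in here.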

      • Ah, that makes sense. That's when I'd start using it myself: self-hosted models and audio.

      • Mistral Instruct v0.3 added function calling, but I don't know whether its implementation is the same or compatible. It's also fairly new and wasn't released all that long ago. Hopefully we'll get there soon. :)

        • I saw a few others, but the ones I looked at were basically instruct layers where you'd need to add your own parser. I didn't find anything (in my 3 minutes of searching) that offers an OpenAI chat-completions endpoint, which is probably the main blocker.
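To illustrate what "add your own parser" means in practice: without an OpenAI-compatible endpoint doing the structuring for you, you end up writing glue like the naive sketch below, which fishes a JSON tool call out of an instruct model's free-text reply. The format it assumes (inline JSON with `name` and `arguments` keys) is hypothetical; every model family would need its own variant.

```python
import json
import re

def extract_tool_call(text: str):
    """Naive, assumed parser for a model that only emits tool calls as
    inline JSON inside its chat reply. Returns the decoded call, or None
    if the reply is plain conversation."""
    match = re.search(r"\{.*\}", text, re.DOTALL)  # grab the outermost {...}
    if match is None:
        return None
    try:
        return json.loads(match.group(0))
    except json.JSONDecodeError:
        return None  # model produced malformed JSON; treat as plain text

# Hypothetical raw model output mixing prose with a tool call.
raw = 'Sure, turning it on. {"name": "turn_on_light", "arguments": {"entity_id": "light.kitchen"}}'
result = extract_tool_call(raw)
print(result)
```

This kind of regex scraping is brittle (nested braces, malformed JSON, calls split across tokens), which is presumably why the integration sticks to APIs that return tool calls as structured fields.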

          • Looking at the documentation, it looks like it relies on Mistral's Python tooling to work. I'm fairly dumb, so I don't know whether the tool suggestion coming from Mistral is produced by some kind of separate neural net or arrives as some kind of special response you have to parse (or that their client parses for you?).
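As far as I understand it, there's no separate neural net: the tool call is just part of the model's generated text, flagged by a special token and followed by JSON, and the client library's job is only to detect and parse it. A sketch of what that parsing might look like, assuming a Mistral-style `[TOOL_CALLS]` marker (the exact marker and layout are an assumption here, not confirmed from Mistral's docs):

```python
import json

# Hypothetical raw generation from a Mistral-style model: a special marker
# token followed by a JSON list of tool calls. Both the marker and the
# payload shape are assumptions for illustration.
raw_output = '[TOOL_CALLS] [{"name": "get_temperature", "arguments": {"room": "bedroom"}}]'

MARKER = "[TOOL_CALLS]"
if raw_output.startswith(MARKER):
    # Everything after the marker is ordinary JSON the client decodes.
    calls = json.loads(raw_output[len(MARKER):])
else:
    calls = []  # no marker means a plain chat reply

print(calls[0]["name"] if calls else "no tool call")
```

So the "special response you have to parse" reading is the right one; the vendor client just hides this step from you.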
