Can we have smaller, more domain-specific models that shouldn't require more than casual hardware? Like a small model for coding, one for medicine, one for history, and so on?
Check out Hugging Face! Honestly, fine-tuned models for specific domains seem very popular (if for nothing else, because training smaller models is just easier!).
Unfortunately, the roleplaying-chatbot type models are typically fairly sizeable and demanding. I'm curious how this will develop with more AI-specific hardware, though, like expansion cards with primarily tensor cores plus their own RAM, so that you don't have to use your GPU for that. If we can drive down the price of such hardware, then locally run models could become much more viable and mainstream.
Not wanting to be mean, I just find the thought of people talking to robots a bit strange, and I use them as tools only. I'm not sure what "roleplay" means here; if it's some "fantasy D&D generator," you could still argue this may be better done by humans, to keep that grey matter running.
Not so much for the latter, but I'm pretty specifically talking about my personal use case here. lol
"Roleplaying" in this scenario isn't really referring to actual tabletop-type RPGs, btw. It's the LLM roleplaying specific characters or personas that you then chat with in specific (or not so specific) scenarios. Although that same tech is also being experimented with for use in video games for NPCs.
But who knows. A specifically trained model could potentially make a half decent dungeon master too.
There's also a huge amount of training, medical and otherwise, that's done through role-playing. I could definitely see medical students getting use out of learning telemedicine with LLMs that were ultimately adapted from TTRPG character-generator schemas.