Not an issue, but an idea that has been in the back of my mind as a web dev. It would be nice if we could use a local LLM like LM Studio (which allows local API calls) to feed the bots' chatting capability. LLMs can be trained to specialize in a specific subject, so one could be fed World of Warcraft lore; an alternative is to feed it multiple PDFs (LM Studio allows that) so it can hold meaningful lore-related conversations. The API address could be passed in via the playerbot config file. I guess the main issue would be performance, but that could be tuned. It would bring a serious level of immersion :) If I can get a local environment up and running I might give it a go and see what is feasible. @celguar any thoughts on that? A rough sketch of what I have in mind is below.
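To make the idea concrete, here is a minimal sketch of routing a bot's chat reply through a locally hosted model. It assumes LM Studio's local server is running with its OpenAI-compatible API (default base URL http://localhost:1234/v1); the URL, prompt, and `bot_chat_reply` helper are illustrative only, and in practice the address would come from the playerbot config file rather than being hard-coded.

```python
# Sketch: ask a local LLM (e.g. LM Studio's OpenAI-compatible server) for an
# in-character bot reply. URL and prompt are assumptions, not project code.
import requests

LLM_API_URL = "http://localhost:1234/v1/chat/completions"  # assumed LM Studio default

def bot_chat_reply(bot_name: str, player_message: str) -> str:
    """Return an in-character reply to a player's chat message."""
    payload = {
        "model": "local-model",  # LM Studio serves whichever model is loaded
        "messages": [
            {
                "role": "system",
                "content": f"You are {bot_name}, a World of Warcraft character. "
                           "Reply briefly and stay consistent with WoW lore.",
            },
            {"role": "user", "content": player_message},
        ],
        "max_tokens": 120,
        "temperature": 0.7,
    }
    resp = requests.post(LLM_API_URL, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"].strip()

if __name__ == "__main__":
    print(bot_chat_reply("Thrall", "What happened at the Wrathgate?"))
```

Latency of the local model would be the main constraint, so replies would probably need to be generated asynchronously rather than blocking the chat handler.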
Hokken changed the title from "bot chat / conversation local LLM" to "bot chat / conversation fed by local LLM" on Nov 28, 2024.
Check the latest bot commits and configs on GitHub: cmangos/playerbots. Active development of LLM and API calls is in progress and working. Join our Discord for more info.