Bypass Usage Limit

There's a tool called Ollama; it's a free AI app that has no usage limits. Maybe you could switch the AI to that?

If it's free, I doubt it's very good, and if it's not very good, I doubt it will be able to support itself under the weight of Rec Room's player base.

Ollama isn't an AI model itself, but rather a tool to run AI models locally. There are no rate limits because you're running the AI on your own device rather than on OpenAI's servers. If every player had the hardware capability and storage capacity to run a ChatGPT-class model on their device, that might work. But with lower-end devices like mobile and Quest, that's just not possible.
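For context on what "running locally" means here: Ollama serves models over a local HTTP API (default port 11434), so the only throttle is your own hardware. A minimal Python sketch of building a request against that endpoint (the model name is just an example; this assumes an Ollama server is running on your machine):

```python
import json
import urllib.request

def build_ollama_request(model: str, prompt: str) -> urllib.request.Request:
    # Ollama's local generate endpoint; no remote server, so no rate limits.
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = build_ollama_request("llama3", "Say hello")
# Sending it requires Ollama to actually be running locally:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The catch is exactly the one above: that server has to exist on the player's own device, which rules out most mobile and Quest hardware.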


It would be really cool if Rec Room could offer a tool that lets users contribute their own local hardware to run these LLMs and other AI, avoiding the credit costs or OpenAI usage entirely. Room creators could then use lighter or stronger AI models depending on their needs. Rec Room would definitely need the tool to run a hardware check, to be sure your computer is even capable of becoming an AI proxy for prompts to Roomie, MakerPenAI, and the other AI features they're working on, since stuff like this just won't run on a low-end computer or headset at all.

Of course, if the user hosting the LLM stops running it, the other AI features depending on it would break, so Rec Room would have to show a subtitle or warn players attempting to use it that it's unavailable.