There's also open-webui and you can run open source LLMs locally for free.


Discussion

I have run open source LLMs locally for free. The performance can't match, not even in the same ballpark (hence my mention of $20K etc.) ... or did I misunderstand you?

I could have been clearer.

You can use open-webui with an API key from an LLM aggregator. I've been using the NanoGPT and Venice APIs, which give me a wide range of models, and I also use it with locally hosted ones.
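For anyone wanting to try this: Open WebUI accepts any OpenAI-compatible endpoint through environment variables, so pointing it at an aggregator is a small config change. A minimal sketch, assuming you run it via Docker (the base URL and key below are placeholders, not real endpoints; check your aggregator's docs for the actual URL):

```shell
# Run Open WebUI in Docker, pointed at an OpenAI-compatible aggregator.
# OPENAI_API_BASE_URL and OPENAI_API_KEY are standard Open WebUI env vars;
# the URL and key here are placeholders -- substitute your aggregator's values.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL="https://your-aggregator.example/v1" \
  -e OPENAI_API_KEY="your-key-here" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

For locally hosted models on the same machine, Open WebUI can also talk to an Ollama instance (configurable via `OLLAMA_BASE_URL`), so both remote and local models show up in the same model picker.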

Privacy-wise, it's a decent trade-off.