How would you use an LLM if you could anonymously ask a question with a zap?

nostr:note1cy3cmwwffdht4wkz870eqtxfnqyc2eyvc6huad4lk05dq4pxfnpst8ehm8


Discussion

I like that idea


I probably wouldn’t. Because I already pay a lot of money for the best one in the world.

It would use the same one, but you pay as you go

I would probably cancel all my subscriptions and switch to this. Choose any model, pay as you go, anonymous. Compatible with openwebui and other frontends

If I can cancel, then sure.

What do you mean?

I mean, if it can replace my ChatGPT subscription, then I would probably switch.

The thing that sucks is losing the very nice Claude/ChatGPT frontends, but you gain the flexibility of using any model. The third-party frontends are improving every day though.

nostr:npub1nje4ghpkjsxe5thcd4gdt3agl2usxyxv3xxyx39ul3xgytl5009q87l02j how much are you paying monthly? How many queries do you send monthly? 10, 100, 1000, 10000?

The $20 a month ChatGPT one.

I would say I do about 20-30 queries a day on average.

I honestly stopped using google.

If the LLM provides an answer much better than ollama or gpt4all models, and what I need to ask is sensitive, then yes.

In practice that means no, sorry.

I would hook up our platform with a platoon of orchestrated agent functions and switch out the current LLM calls we’re using. Then slowly onboard our fiat clients to that stream. After I ask you how you make it anonymous and which LLMs we can use ^^

Anonymous is easy, it’s just a random token associated with the request. In the nostr AI assistant app I’m working on, I plan on making it the pubkey of lmzap_nsec = sha256(user_nsec + “lmzap”)

So that it’s deterministic from your nostr identity.
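A minimal sketch of that derivation, assuming the hash is taken over the UTF-8 bytes of the nsec string concatenated with the literal tag "lmzap" (the function name and input format here are illustrative; deriving the actual pubkey from the resulting secret would additionally need a secp256k1 library, which is out of scope):

```python
import hashlib

def derive_lmzap_nsec(user_nsec: str) -> str:
    """Derive a deterministic per-app secret from a nostr nsec.

    sha256(user_nsec + "lmzap"), hex-encoded. The same nsec always
    yields the same derived key, so the token is stable across
    requests without revealing the user's real identity.
    """
    return hashlib.sha256((user_nsec + "lmzap").encode("utf-8")).hexdigest()

# Deterministic: the same input always produces the same derived key,
# and different users get unlinkable, distinct tokens.
alice = derive_lmzap_nsec("nsec1alice")   # placeholder nsec, not a real key
assert alice == derive_lmzap_nsec("nsec1alice")
assert alice != derive_lmzap_nsec("nsec1bob")
```

The derived hex digest would then be interpreted as a secp256k1 secret key, and its pubkey used as the anonymous token.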

Then it would just be a matter of improving the backends, adding Claude, Gemini, etc.