How would you use a LLM if you could anonymously ask a question with a zap?
nostr:note1cy3cmwwffdht4wkz870eqtxfnqyc2eyvc6huad4lk05dq4pxfnpst8ehm8
I like that idea
I probably wouldn’t, because I already pay a lot of money for the best one in the world.
It would use the same one, but you pay as you go
I would probably cancel all my subscriptions and switch to this. Choose any model, pay as you go, anonymous. Compatible with openwebui and other frontends
If I can cancel, then sure.
nostr:npub1nje4ghpkjsxe5thcd4gdt3agl2usxyxv3xxyx39ul3xgytl5009q87l02j how much are you paying monthly? How many queries do you send monthly? 10, 100, 1000, 10000?
The $20 a month ChatGPT one.
I would say I do about 20-30 queries a day on average.
I honestly stopped using google.
If the LLM provides an answer much better than ollama or gpt4all models, and what I need to ask is sensitive then yes.
In practice that means no, sorry.
I already do via nostr:nprofile1qqsdy27dk8f9qk7qvrm94pkdtus9xtk970jpcp4w48k6cw0khfm06mspp4mhxue69uhkummn9ekx7mqpz9mhxue69uhkummnw3e82cfwvdhk6qg5waehxw309aex2mrp0yhxgctdw4eju6t09x35fm. They have a discount when paying with Lightning!
This is awesome
I would hook up our platform with a platoon of orchestrated agent functions and swap out the current LLM calls we’re using, then slowly onboard our fiat clients to that stream. After I ask you how you make it anonymous and which LLMs we can use ^^
Anonymity is easy: it’s just a random token associated with the request. In the nostr AI assistant app I’m working on, I plan on making it the pubkey of lmzap_nsec = sha256(user_nsec + “lmzap”)
So that it’s deterministic from your nostr identity.
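The derivation described above could be sketched like this (the function name and example nsec are illustrative, and this hashes the bech32 string directly as written in the message; a real implementation might hash the raw key bytes and reduce modulo the secp256k1 curve order):

```python
import hashlib

def derive_lmzap_nsec(user_nsec: str, label: str = "lmzap") -> bytes:
    """Hypothetical sketch: derive a deterministic per-app secret key
    from a nostr nsec as sha256(user_nsec + "lmzap"), so the same
    identity always yields the same anonymous token."""
    return hashlib.sha256((user_nsec + label).encode("utf-8")).digest()

# Deterministic: the same nsec always produces the same derived key,
# and different nsecs produce different keys.
k1 = derive_lmzap_nsec("nsec1exampleuserkey")
k2 = derive_lmzap_nsec("nsec1exampleuserkey")
assert k1 == k2
assert len(k1) == 32  # 32 bytes, the size of a secp256k1 secret key
```

The pubkey corresponding to this derived key would then identify the user to the service without linking back to their main nostr identity.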
Then it would just be a matter of improving the backends, adding claude, gemini, etc.