Indeed. I’m working on https://contex.st, where the end goal is local LLMs; it currently supports nostr:nprofile1qqsgha3fk023ng8c4quszdayghqwkt6l9d9ga4c3280gnqz3aqqx7ycpz3mhxue69uhhyetvv9ujuerpd46hxtnfduq35amnwvaz7tmjv4kxz7fwvajhgctvvfujucm0d5hhvvg5lm7lc, with nostr:nprofile1qqs8msutuusu385l6wpdzf2473d2zlh750yfayfseqwryr6mfazqvmgpzdmhxue69uhhqatjwpkx2urpvuhx2ue0qyjhwumn8ghj7en9v4j8xtnwdaehgu3wvfskuep0dakku62ltamx2mn5w4ex2ucys3fck coming soon, and all data is stored locally. I have 21+ GB of documents and memes on my device.

Discussion

Using this for LoRa/ham-based nostr would be epic

Please tell me more about your use case

It's not mine. But if you're sending HD pics over ham or LoRa, the AI could just replace the picture with its description.
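A rough sketch of that idea (all names here are hypothetical, and the description itself is assumed to come from a local captioning model): before queueing a note for a LoRa/ham link, swap any oversized image payload for its text caption, since LoRa frames only carry on the order of a couple hundred bytes.

```python
# Hypothetical sketch: shrink a note for a low-bandwidth (LoRa/ham) link by
# replacing a large image payload with its text description. The description
# would come from a local captioning model; here it is just a plain string.

LORA_PAYLOAD_BUDGET = 200  # bytes; typical LoRa frames are roughly this small


def shrink_for_lora(image_bytes: bytes, description: str) -> bytes:
    """Send the image as-is if it fits the budget; otherwise send the
    UTF-8 encoded description in its place."""
    if len(image_bytes) <= LORA_PAYLOAD_BUDGET:
        return image_bytes  # small enough to transmit directly
    return description.encode("utf-8")  # swap the picture for its caption


# Usage: a 2 MB "HD pic" collapses to a short caption.
fake_hd_pic = b"\x00" * 2_000_000
caption = "Sunset over the harbor, two sailboats in the foreground."
payload = shrink_for_lora(fake_hd_pic, caption)
print(len(payload))  # a few dozen bytes instead of two million
```

The threshold and the text-only fallback are assumptions; a real client would also need to tag the note so receivers know a caption replaced the original image.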

nostr:nprofile1qqszw48usckkhs9hcwt3q3np9k2z2c73s8qc0gu3uxqw66cqlq88ukcpzemhxue69uhk2er9dchxummnw3ezumrpdejz7qgnwaehxw309aex2mrp0yhxvdm69e5k7tcpy3mhxue69uhhyetvv9ujuum0wejhyetfvahx2mn8d9hx2etjd9hxwtnfduhsn9fual might run hamstr