You can privately access open source AI on your phone using Local++ by nostr:nprofile1qqsgha3fk023ng8c4quszdayghqwkt6l9d9ga4c3280gnqz3aqqx7ycpzamhxue69uhky6t5vdhkjmn9wgh8xmmrd9skctcamwllm

Here I configured Local++ to query Deepseek using my own nostr:nprofile1qqsyu43zk46umw6dthksj0jgc6xd8aeylt25w9p0psds4mp09v4qktspz4mhxue69uhhyetvv9ujuerpd46hxtnfduhsz9thwden5te0dehhxarj9ehhsarj9ejx2a30qythwumn8ghj7un9d3shjtnwdaehgu3wvfskuep0tntfql

Discussion

The flow: I send #cashu to my proxy, the proxy forwards the query upstream, and the response returns to the app (rough sketch below).
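Roughly, the proxy side could look like this. A minimal sketch, not the actual Local++ or Routstr code: the upstream URL, key, and the Flask/requests plumbing are my own assumptions, and a real node would also redeem the Cashu token at its mint and meter usage.

```python
# Minimal sketch of the "cashu -> proxy -> upstream -> app" flow above.
# Assumptions (not from the note): the proxy speaks the OpenAI-compatible
# chat API, the app sends its Cashu token as a Bearer header, and the
# upstream URL/key below are placeholders.
import requests
from flask import Flask, request, jsonify

UPSTREAM_URL = "https://api.deepseek.com/v1/chat/completions"  # placeholder
UPSTREAM_KEY = "sk-..."  # proxy operator's own upstream API key

app = Flask(__name__)

@app.post("/v1/chat/completions")
def proxy_chat():
    # 1. The app pays with a Cashu token sent as the Authorization header.
    auth = request.headers.get("Authorization", "")
    cashu_token = auth.removeprefix("Bearer ").strip()
    if not cashu_token.startswith("cashu"):
        return jsonify({"error": "expected a Cashu token"}), 402
    # (A real proxy would redeem the token against its mint here.)

    # 2. The proxy forwards the query upstream with its own credentials.
    upstream = requests.post(
        UPSTREAM_URL,
        headers={"Authorization": f"Bearer {UPSTREAM_KEY}"},
        json=request.get_json(force=True),
        timeout=120,
    )

    # 3. The response returns to the app unchanged.
    return upstream.json(), upstream.status_code

if __name__ == "__main__":
    app.run(port=8080)
```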

This is neat. I've been using Open WebUI over Tailscale, but I can only access it on my laptop. Tailscale for Android and GrapheneOS stopped playing well together.

This even lets you run some models locally on the phone. But this setup is Cashu-to-proxy, and it works with a Routstr provider by default, easy peasy. It puts Routstr on the phone.
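From the phone's side, the idea is paying per request with a Cashu token in place of an API key. A hedged sketch, assuming an OpenAI-compatible endpoint; the URL and model id below are placeholders, not confirmed Routstr values.

```python
# Client-side sketch: the app sends a Cashu token as the bearer credential.
import requests

ROUTSTR_URL = "https://api.routstr.com/v1/chat/completions"  # assumed endpoint
CASHU_TOKEN = "cashuA..."  # ecash token held by the wallet in the app

resp = requests.post(
    ROUTSTR_URL,
    headers={"Authorization": f"Bearer {CASHU_TOKEN}"},
    json={
        "model": "deepseek-chat",  # placeholder model id
        "messages": [{"role": "user", "content": "Hello from my phone"}],
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```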

Sounds like I need to read about Routstr!