Like many people, I can't run a local LLM - coding barrier, cost barrier, and it's not a solution for travellers... So I started using nostr:nprofile1qqs8msutuusu385l6wpdzf2473d2zlh750yfayfseqwryr6mfazqvmgpy4mhxue69uhkvet9v3ejumn0wd68ytnzv9hxgtm0d4hxjh6lwejkuar4wfjhxqfswaehxw309a5hgcmg0ykkwmmvv3jkuun0vskkvatjvdhkuargdacxsct8w4ejuumrv9exzc3wd9kj7qfpwaehxw309ahx7um5wgkhyetvv9ujuar90pshx6r9v3nk2tnc09az7em0qzz and nostr:nprofile1qqsdy27dk8f9qk7qvrm94pkdtus9xtk970jpcp4w48k6cw0khfm06msppemhxue69uhkummn9ekx7mp0qythwumn8ghj7un9d3shjtnswf5k6ctv9ehx2ap0qy2hwumn8ghj7mn0wd68ytn00p68ytnyv4mz7zvqagl instead. I don't see any downside in terms of convenience, and both offer a lot of models to choose from.