What's something you'd like to see done on nostr that an llm (text-generation model) could potentially do?
if you used Evernote, here's my quick cheatsheet to export everything to local markdown files
```
brew install evernote-backup evernote2md   # install both tools via Homebrew
evernote-backup init-db                    # log in and create the local sync database
evernote-backup sync                       # download all notes into that database
evernote-backup export output_dir/         # export notebooks as .enex files
evernote2md output_dir markdown            # convert the .enex files to markdown
```
Apparently they were recently bought out and most of the staff were laid off
I'll die on this hill
It was stronger than any Americano I've ever had because it's their coffee rather than just water
Wild. Just ordered a double shot and a drip pour and the barista suggested combining them 🤯
And it totally works.
gg Rekindle Coffee shop
Tried this a couple weeks ago and it didn't work, but it does now

Nice, Deskmini? iirc those are small form factor so a bit bigger.
I've replaced fans in all my NUCs, but had to watch YT videos to figure out how to get past the plate
After taxes and shipping:
Intel NUC12WSKi3 $382
64GB DDR4-3200 $108
980 EVO 2TB NVMe PCIe 4.0 $132
2 performance cores and 8 efficiency cores
The RAM is indeed overkill, but my current workload is north of 16GB and future-proofing at this price is reasonable.
PCIe 4.0 read/write speeds
At 0.5L volume, NUCs are perfect for a shelf. It's smaller than its own power adapter. I've been running these for over a decade. You can save a buck by sniping older-gen used models on eBay. ymmv
I orchestrate so much infrastructure on these little things though so this time I'm getting it new.
$600 for an investment into self-sovereign software infrastructure and compute that I'll never stop needing.
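A quick sanity check on the parts list above (prices as listed, after taxes and shipping):

```python
# Sum of the NUC build parts above, in USD.
parts = {
    "Intel NUC12WSKi3": 382,
    "64GB DDR4-3200": 108,
    "980 EVO 2TB NVMe PCIe 4.0": 132,
}
total = sum(parts.values())
print(total)  # 622, i.e. roughly the "$600" figure
```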
This old RTX 2080Ti is so humble.


Sounds like a better use case for NIP-94 too.
/duck
Sorry, I'm a little confused. Put the link where?
Happy to shill nos.social
OpenAI’s whisper transcription of audio and translation is amazing.
Here are some examples: https://openai.com/research/whisper
The cool part is they’ve got
The ideal way to run it is probably the GGML port (whisper.cpp)
Ah, it's to avoid those force closes then. Gotcha.
I've actually been trying to figure out how to enable anchor outputs in cln recently. Maybe someday it'll be the default.
nostr:npub1r0ck3tz85gl36wg730thk36c6xylc6caaafqsxy6pe5wq8y8gfhqjl0j93 Your Amboss page says "FYI: Most channels will be disabled when the mempool fees are high. They will be re-enabled once the mempool cools down."
Can you point me to more info about this behavior? I'm interested in learning what motivates this (and how to do it)
nostr:npub18l0pstx8umh6dx3e8vtw7sd3pspe9r0nh94v7ncwkqleljnr5zdq73y8he fyi I emailed support@deezy.io to request an access token, but noticed your MX records are Gmail so it probably went to spam. My mailserver is on a /24 that's in one of the Spamhaus lists, sigh
```
$ lightning-cli parsefeerate normal
{"perkw": 6120}
```
Actually I got lucky on this channel open, as that worked out to only be ~25 sat/vB. My mempool suggests that I should have used "urgent" rather than "normal" feerate to be on the safe side, but at least it got confirmed!
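For reference, converting cln's perkw (sats per 1000 weight units) into sat/vB is just a factor of 4/1000, since one vbyte is four weight units. A minimal sketch checking the feerate above:

```python
# Convert Core Lightning's "perkw" feerate (sats per 1000 weight units)
# to sat/vB. One vbyte = 4 weight units, so sat/vB = perkw * 4 / 1000.
def perkw_to_sat_per_vb(perkw: int) -> float:
    return perkw * 4 / 1000

# The "normal" feerate from parsefeerate above:
print(perkw_to_sat_per_vb(6120))  # 24.48, i.e. the ~25 sat/vB mentioned
```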
Starting another new LN wallet for LNbits and it needs inbound liquidity:
* pay 5385 sats to open a small 1MM sats channel to deezy (his minimum channel size)
* use swap.deezy.io to pay him ~20k sats to swap funds back to an onchain address so that I have inbound liquidity. actual cost depends on what I choose to pay for the chain fees. (7.5k-31k sats range)
* 25k sats is ~$6.75 right now, so that's the barrier to entry to get ~$270 equivalent liquidity on LN
Working with higher amounts actually *reduces* the fee rate, scaling the total cost up only slightly. So for example if I opened a 25MM sats channel I could swap the whole thing out for 55k sats (~$14). That's 25x liquidity for a little more than 2x the cost of the smallest direct swap. The trade-off is that there's a window where he can close the channel if it's not being used after 30 days or so. By then I'll have more liquidity providers for this wallet.
Pretty low barrier to entry to get yourself on lightning, even with chain fees as they are right now.
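A back-of-envelope sketch of the numbers above, assuming the BTC price implied by "25k sats is ~$6.75" (about $27k/BTC); the fee amounts are the ones quoted, not live quotes:

```python
# Rough cost math for the LNbits inbound-liquidity setup above.
# Assumes the implied price from "25k sats is ~$6.75" (~$27k per BTC).
SATS_PER_BTC = 100_000_000
usd_per_btc = 6.75 / 25_000 * SATS_PER_BTC  # ~27,000

def sats_to_usd(sats: int) -> float:
    return sats * usd_per_btc / SATS_PER_BTC

total_cost = 5_385 + 20_000  # channel open + swap-out fee, ~25k sats
print(round(sats_to_usd(total_cost), 2))   # ~6.85 USD barrier to entry
print(round(sats_to_usd(1_000_000), 2))    # ~270 USD of inbound liquidity
print(round(sats_to_usd(55_000), 2))       # ~14.85 USD to swap out a 25MM channel
```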
Logseq seems interesting as a FLOSS note-taking app. I'm going to try it out. Curious if it'd make sense to build a plugin for publishing and collaborating on it via nostr.
It's total overkill if you're not interested or aware of the Zettelkasten/2nd brain/mind garden stuff.
