ladies, this is what men actually want
https://video.nostr.build/9dc0928f6fdb4829a09f9c9d551dad8abb69d4df86c68db2979fba7bd8933963.mp4
what's wrong with silicone
yeah building a good training data set seems to be the challenge
Nice to see Twitter copying nostr:npub18m76awca3y37hkvuneavuw6pjj4525fw90necxmadrvjg0sdy6qsngq955 #onlyzaps mode, minus the zaps
cc nostr:npub1xtscya34g58tk0z605fvr788k263gsu6cy9x0mhnm87echrgufzsevkk5s nostr:npub1uapy44zhu5f0markfftt7m2z3gr2zwssq6h3lw8qlce0d5pjvhrs3q9pmv nostr:note1gkf0uc6zquep2rqrs9yyfzfanxw3j3r3v6tjnlcd09g8kzjw6wks2ln60j
Also swipe to reply…
Some clients support blossom file sharing
Just post a link, files aren’t stored on nostr
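Blossom works by addressing blobs by their content hash, so the note only needs to carry a URL. A minimal sketch of the idea (server name and helper are illustrative, not any client's actual API):

```python
import hashlib

def blossom_url(server: str, blob: bytes, ext: str = "mp4") -> str:
    # Blossom-style addressing: the blob's SHA-256 digest is the identifier,
    # so any server holding the same bytes serves the same path.
    digest = hashlib.sha256(blob).hexdigest()
    return f"{server}/{digest}.{ext}"

url = blossom_url("https://media.example.com", b"fake video bytes")
print(url)
```

The note itself then just contains this URL; the event stays small and relays never store the file.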
this guy never misses (a retarded take)
You know that feeling, when you give up on something because it’s not going anywhere.
You then find a new thing and get heavily involved with that community and it takes over your life.
Then your new community suddenly finds your old community and gets heavily involved in it?
You feel like maybe you should have not dumped the old thing after all.
That’s Bitcoin & LoRa 😂
This was me in 2016:
https://mikehardcastle.com/2016/05/18/gaia-ai-and-robots/ nostr:note17hlcjqj2cry2ntnazda9ny03av40ncf67qxh4uuh73rdumsqqj7qpxelj2
It's a bit awkward that #LoRA (low-rank adapters) and #LoRa (long range) clash, especially since I'm interested in both 😅
#LoRA makes me bullish on #ai hacking on consumer hardware while still leveraging large pretrained models as a base. It works by adding low-rank matrices at each layer of the transformer stack in a larger base model like llama and training only those.
The base model’s weights are frozen, but you can train these low-rank “adapters” which are much smaller and require less memory/compute.
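The mechanism above can be sketched in a few lines of numpy (dimensions and init scheme are illustrative, not any particular implementation): the frozen weight `W` is bypassed by a rank-`r` product `B @ A`, and only `A` and `B` would receive gradients.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 64, 64, 4  # layer dims; adapter rank r << d

# Frozen base weight, e.g. one projection inside a transformer layer.
W = rng.normal(size=(d, k))

# Trainable low-rank adapter. B starts at zero so the adapter
# initially contributes nothing and training starts from the base model.
A = rng.normal(size=(r, k)) * 0.01
B = np.zeros((d, r))

def forward(x):
    # Adapter output is added to the frozen path: (W + B A) x
    return x @ W.T + x @ (B @ A).T

x = rng.normal(size=(2, k))
assert np.allclose(forward(x), x @ W.T)  # identical to base before training

# Memory win: d*k frozen params vs only r*(d+k) trainable ones.
print(d * k, r * (d + k))
```

Here that's 4096 frozen parameters per layer versus 512 trainable ones, which is why this fits on consumer hardware.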
Nice thing about fine-tuning is that you are basically teaching the AI new things that it won't constantly forget. So we can give it lots of domain knowledge about nostr, nips, etc. The hardest part is setting up a good training dataset.
is this true anon
We fixed ours in the latest App Store release
So far the transformers themselves recommend flan-t5-small
yeah it's more of a learning exercise for me to see if I can find the smallest model possible to fine-tune with high-quality results
training an ai that can convert natural language to nostr queries:
"show me all the longform articles with the most zaps from alice and bob"
=>
{"kinds": [30023], "authors":["bob","alice"],"sortby":"zaps"}
if I can get this working I'll make it the default way to do advanced search in damus notedeck and damus ios.
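For this kind of fine-tune, the training set is just natural-language inputs paired with serialized filter targets. A hedged sketch of how those pairs might be prepared (the second example pair and the "sortby" field are assumptions; base filter fields follow NIP-01):

```python
import json

# Hypothetical NL -> nostr-filter training pairs for a small seq2seq model.
pairs = [
    ("show me all the longform articles with the most zaps from alice and bob",
     {"kinds": [30023], "authors": ["alice", "bob"], "sortby": "zaps"}),
    ("latest 10 notes from alice",
     {"kinds": [1], "authors": ["alice"], "limit": 10}),
]

# Serialize targets canonically (sorted keys) so the model learns one
# consistent output format, and check every target round-trips as JSON.
dataset = [{"input": nl, "target": json.dumps(f, sort_keys=True)}
           for nl, f in pairs]
for row in dataset:
    json.loads(row["target"])  # raises if a target is malformed

print(dataset[0]["target"])
```

Validating targets up front matters because a small model will happily learn whatever inconsistencies the dataset contains.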
Yakihonne is my new favourite longform client. Well done nostr:npub1xtwy7fvu8f7wdtgnpm68wyrf6uxshf49tn5kp7kyusu6872amn8qh06rus and team!
generated this image on my m2 mac with 32gb of ram in about 38 seconds:

using:
https://github.com/filipstrand/mflux + schnell model
impressed... could replace my use of midjourney.
nostr:note1pnezaa9ehx0kwt5pcxn33vrx27pa2uxsakusudvrqsj42tg7nd2szwxg7e
M4 macs are becoming an interesting (and surprisingly cheaper) option for running local LLMs. They have lots of unified memory, integrated gpus and neural cores that are pretty good for running local models.
Git issues + project management
nostr:npub18m76awca3y37hkvuneavuw6pjj4525fw90necxmadrvjg0sdy6qsngq955 nostr:npub1xtscya34g58tk0z605fvr788k263gsu6cy9x0mhnm87echrgufzsevkk5s for some reason this post didn't show up until I wrote a new post. Using the latest App Store version rn. nostr:note1mzzw7nunfg7dgl62rxhey4we45hkakqd32f4ufzckj6ma9dcr7zs820pju
May fail to send initially but it will keep retrying until you have a connection again. We just need to make this more obvious somehow
I gave this exact feedback on the PR review, with the addition of pulling followed hashtags from your contact list into a column
ai powers most of my learning these days. during my 1h walk a day i get ai to read articles and papers to me. I occasionally stop to talk to chatgpt and ask questions about certain topics. What a bizarre future we’re in.

Music discovery isn’t what it used to be. Fountain Radio makes it fun again.
Wave goodbye to algorithmic playlists and say hello to a new communal listening experience powered by Bitcoin and Nostr.
Fountain Radio is live in version 1.1.8 now. Here's how it works:
Add and upvote tracks in the queue ➕
Search for any song on Fountain and pay 100 sats to add it to the queue. Upvote any track in the queue to change its position. 1 sat equals 1 upvote, and the track with the most upvotes will play next.
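The queue rules above amount to a sats-weighted priority queue. A toy model of the mechanics (illustrative only, not Fountain's implementation):

```python
queue = {}  # track -> accumulated upvote sats

def add_track(track: str, sats: int) -> None:
    assert sats >= 100, "adding a track costs 100 sats"
    queue.setdefault(track, 0)

def upvote(track: str, sats: int) -> None:
    queue[track] = queue.get(track, 0) + sats  # 1 sat = 1 upvote

def next_track() -> str:
    # Track with the most upvotes plays next.
    return max(queue, key=queue.get)

add_track("song-a", 100)
add_track("song-b", 100)
upvote("song-a", 5)
upvote("song-b", 21)
print(next_track())  # song-b
```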
Support the artist currently playing ⚡️
Boost to send a payment with a message, or enable streaming to send a small amount for every minute you spend listening. 95% of every boost and streaming payment goes directly to the artist currently playing.
Post in the live chat 💬
Hang out with other listeners in the live chat by connecting Nostr. You can post chat messages for free. Every time a track is added or upvoted this appears in the activity feed too.
Save tracks to your library 💜
Listen to your latest discoveries again later in the app. Tap on any content card to add a song to your library or a playlist, or see more music from that artist.
Listen to Fountain Radio on other apps 🎧
Fountain Radio now has its own RSS feed so you can tune in on any podcast app that supports live streams and Nostr live streaming platforms like nostr:npub1eaz6dwsnvwkha5sn5puwwyxjgy26uusundrm684lg3vw4ma5c2jsqarcgz.
Artist takeovers 🔒
Artists can now take control of the music and host a listening party. During a takeover, only the host can add tracks to the queue and upvotes are disabled.
The first artist takeover will be with nostr:npub19r9qrxmckj2vyk5a5ttyt966s5qu06vmzyczuh97wj8wtyluktxqymeukr on Wednesday 27th November at 12:00pm EST. Want to host a takeover? Get in touch and we will get you scheduled in.
You can also find Fountain Radio on our website here:
👀
people are so obsessed over the nips repo like it has any real power. the power comes from implementors choosing to implement things or not. the best ideas will have the highest number of implementations. just because a nip is merged doesn't mean it's a good idea
Yeah for some reason my notifications are much more accurate in Damus. Idk why but nostr:npub12vkcxr0luzwp8e673v29eqjhrr7p9vqq8asav85swaepclllj09sylpugg is always missing notifications in comparison.
It's pretty simple why: they depend on a centralized server for all their stats and feeds, we do not. Damus will always work, so people can use us when Primal goes down. We will reach the same level of UX with a more robust client architecture; it will just take a bit more time because it is much harder to do.
I also have this for note blocks, which are the parsed sections of a note; these are also stored in the database. This way, each frame I can simply walk and interact with these structures immediately after they come out of the local database subscription and enter the timeline.
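The idea is parse-once, walk-many: blocks are computed when the note enters the database and the render loop only iterates the cached list. A hedged Python sketch of the shape of it (types and the word-level parser are illustrative, not notedeck's actual structures):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Block:
    kind: str  # "text", "url", "hashtag", ... (mention kinds omitted)
    text: str

def parse_blocks(content: str) -> list[Block]:
    # Naive word-level parse; real parsing handles nostr: refs, punctuation, etc.
    blocks = []
    for word in content.split():
        if word.startswith("http"):
            blocks.append(Block("url", word))
        elif word.startswith("#"):
            blocks.append(Block("hashtag", word))
        else:
            blocks.append(Block("text", word))
    return blocks

cache: dict[str, list[Block]] = {}

def blocks_for(note_id: str, content: str) -> list[Block]:
    # Per-frame render path: hit the cache, never re-parse.
    if note_id not in cache:
        cache[note_id] = parse_blocks(content)
    return cache[note_id]

bs = blocks_for("abc", "check #nostr at https://damus.io today")
print([b.kind for b in bs])  # ['text', 'hashtag', 'text', 'url', 'text']
```

Storing the parsed blocks next to the note means the per-frame cost is a pointer walk rather than a string parse.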
You'll like this video. This guy uses lots of optimization techniques to get a PHT (perfect hash table) that runs more than twice as fast as the gperf one (the best one he found that isn't specifically constructed) https://www.youtube.com/watch?v=DMQ_HcNSOAI
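The core trick behind a perfect hash table is that for a *fixed* key set you can search for a hash parameterization with zero collisions, giving guaranteed single-probe lookups. A toy seed-search version of the idea (the video's techniques are far more sophisticated; key set and table size here are made up):

```python
def find_perfect_seed(keys: list[str], table_size: int) -> int:
    # Try seeds until the seeded hash is injective on this key set.
    for seed in range(1_000_000):
        slots = {hash((seed, k)) % table_size for k in keys}
        if len(slots) == len(keys):  # no collisions: perfect for these keys
            return seed
    raise RuntimeError("no seed found; grow table_size")

keys = ["kind", "authors", "limit", "since", "until"]
size = 8
seed = find_perfect_seed(keys, size)

# Build the table: every key gets its own slot, lookup is one probe.
table = [None] * size
for k in keys:
    table[hash((seed, k)) % size] = k

assert all(table[hash((seed, k)) % size] == k for k in keys)
```

This is why tools like gperf can beat general-purpose hash maps: the collision handling is eliminated at build time instead of paid per lookup.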
great video
Right! Its cool to see what people are reading
still in it for the tech
It’s probably in the apple docs somewhere but im too lazy to find it
I'll probably do it. it's pretty simple to implement
nostr will have the best tech and the most devs, people won’t be able to ignore that for long. other clients on other protocols will run into heavy handed moderation issues due to centralization of power, government pressure, and lack of user keys. This is the main risk I see for mastodon, bluesky and big tech, regardless of current user numbers.
As long as people care about freedom and building a sovereign presence in cyberspace, nostr is inevitable.
It’s possible most people don’t actually care about these things, but at least we’ll have the best solution for people who do. Those people are the coolest to hang out with anyways.
You don’t need testflight to zap notes, you only need to run https://zap.army
nostr:npub1zafcms4xya5ap9zr7xxr0jlrtrattwlesytn2s42030lzu0dwlzqpd26k5 what happened to Friday? That was too soon?
alpha launch is end of month