Michael Henke
e8e7be38685b18f0f91323465a9cb815e5414a62908a8378d57618b1bf85537b
₿itcoin Enthusiast

Today is the day #nostr blew up my purple bird app 🤙🏻

Will the stats update include how many npubs opted out?

PV keep #nostr purple 🤙🏻

#[0]​ should be the #lightning conductor and @zappr should be the thunderclap.

#thunderclapping 👏🏻 ⚡️

#[0]​ What exactly happens in #Damus when I long-press a post and select Broadcast?

Does it push the same note to all connected relays?

#pv #nostr #nostrica #nostrich #nostrgram
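For what it's worth, "Broadcast" in Nostr clients generally just re-publishes the already-signed event, same id, pubkey, and signature, to every relay the client is connected to. A rough sketch of that relay-side message flow, assuming NIP-01 style EVENT messages (the relay list and the pre-signed event are placeholders, not anything Damus-specific):

```python
import asyncio
import json

import websockets  # pip install websockets

# Example relay list; any NIP-01 relay URLs behave the same way.
RELAYS = [
    "wss://relay.damus.io",
    "wss://nos.lol",
]

async def broadcast(event: dict) -> None:
    """Re-publish an already-signed Nostr event to every relay.

    The event is sent verbatim: same id, pubkey, sig, and content,
    which is roughly what a client-side "Broadcast" amounts to.
    """
    message = json.dumps(["EVENT", event])
    for relay in RELAYS:
        try:
            async with websockets.connect(relay) as ws:
                await ws.send(message)
                # Relays answer with ["OK", <event id>, <true|false>, <message>].
                print(relay, await ws.recv())
        except Exception as exc:
            print(f"{relay}: failed ({exc})")

# asyncio.run(broadcast(signed_event))  # signed_event: a full NIP-01 event dict
```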

In machine learning, a Stable Diffusion model is a latent diffusion network trained for text-to-image generation, built from large components such as a U-Net denoiser, a VAE, and a text encoder. Pruning is a technique for shrinking a neural network by removing unimportant connections or neurons while maintaining, or sometimes even improving, its performance.

Applied to a Stable Diffusion model, pruning reduces computational complexity and memory footprint, making the model more practical to run on devices with limited resources, such as mobile phones or embedded systems. When a model is pruned, connections or neurons identified as unimportant are removed, and the remaining weights are fine-tuned to compensate for the removed elements. The resulting pruned model has fewer parameters and a lower computational cost while ideally keeping accuracy and output quality close to the original.

Overall, pruning is a useful way to optimize a Stable Diffusion model for deployment in resource-constrained environments without sacrificing much of its quality.
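As a minimal sketch of what magnitude pruning looks like in practice, here is PyTorch's torch.nn.utils.prune applied to a stand-in linear layer; the layer size and the 30% sparsity target are illustrative assumptions, not anything specific to Stable Diffusion:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Stand-in layer; in a real diffusion model you would target the large
# Linear/Conv2d modules inside the U-Net instead.
layer = nn.Linear(768, 768)

# L1 unstructured pruning: zero out the 30% of weights with the smallest
# absolute value (the 30% figure is an arbitrary example).
prune.l1_unstructured(layer, name="weight", amount=0.3)

# Pruning is applied via a mask at forward time; prune.remove() folds the
# mask into the weight tensor permanently so the layer can be saved and
# fine-tuned as usual.
prune.remove(layer, "weight")

sparsity = (layer.weight == 0).float().mean().item()
print(f"fraction of zeroed weights: {sparsity:.2f}")
```

Note that the zeroed weights still occupy memory unless the model is converted to a sparse format or pruned structurally, which is part of why fine-tuning usually follows, as the post above notes.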

PV #nostr 🤙🏻

I just noticed #[0]​. It made me reminisce about ₿itcoin faucets from way back in the day…

#[0]​ should #[1]​ work with LNURL hosted on Tor? Apparently my lightning address was no good as I just got notified of a failed zap.
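For context on why a zap to a Tor-hosted address can fail: with a LUD-16 lightning address, the zapping client (or its zap service) has to fetch an HTTPS endpoint derived from the address, and a host that is only reachable over Tor won't resolve for anyone not routing traffic through Tor. A rough sketch of that resolution step, assuming standard LUD-16 behavior (the example address is made up):

```python
import requests  # pip install requests

def resolve_lightning_address(address: str) -> dict:
    """Resolve a LUD-16 lightning address to its LNURL-pay metadata.

    "name@example.com" maps to https://example.com/.well-known/lnurlp/name.
    If the domain is a Tor .onion host, this request fails for any client
    that is not routing through Tor, which surfaces as a failed zap.
    """
    name, domain = address.split("@", 1)
    url = f"https://{domain}/.well-known/lnurlp/{name}"
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return response.json()  # contains callback, minSendable, maxSendable, ...

# resolve_lightning_address("someone@example.onion")  # placeholder address
```

So the failed zap is most likely the sender being unable to reach the Tor-hosted endpoint rather than anything wrong with the address itself.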