You’re looking at solutions for a different problem here (restricting notes or media so they’re only served to a group). We already have the base mechanisms to solve this in both Nostr and Blossom (auth & encryption). On the end-user side, Nielson is currently working on Communikeys. And a large proportion of Nostr devs are working on NIP-29, Signal-like MLS groups, or other alternatives. I’m confident that one or more of these efforts will succeed.

The underlying ask here isn’t “don’t show my media to random folks.” On the contrary, people want to post publicly. They want kind 1, and they want it to be seen. But at the same time, they also want more control over how notes and media propagate.

I know, that’s a contradictory position from the start. But it’s the position of some users who see Blossom mirroring as the villain. And, for better or worse, that’s where “solutions” like NIP-70 and what I’m discussing here come from. The goal is to let users “restrict” (or at least signal their intent to restrict) mirroring, not access.
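To make the “signal intent” part concrete, here is a rough sketch of what a NIP-70 protected note looks like, as I understand the spec: the only addition is a bare “-” tag, and whether relays honor it is entirely up to them. All field values below are placeholders.

```typescript
// Sketch of a NIP-70 "protected" kind 1 note. The only thing NIP-70 adds is
// the bare "-" tag; relays that implement it are expected to accept the event
// only from its authenticated author. It signals intent, it does not enforce
// access control. All values are placeholders.
type NostrEvent = {
  id: string;
  pubkey: string;
  created_at: number;
  kind: number;
  tags: string[][];
  content: string;
  sig: string;
};

const protectedNote: NostrEvent = {
  id: "<event id>",          // normally the sha256 hash of the serialized event
  pubkey: "<author pubkey>",
  created_at: Math.floor(Date.now() / 1000),
  kind: 1,
  tags: [
    ["-"],                   // NIP-70 marker: only the author should be able to publish this
  ],
  content: "a public note whose author asks relays not to accept rebroadcasts from third parties",
  sig: "<signature over the event id>",
};
```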

Discussion

I’ll have to confess, I made assumptions.

“They want kind 1, and they want it to be seen.”

By whom?

“…they also want more control over how notes and media propagate”

Why?

So we’re in a position where some relay provides a mirror “for free/out of goodwill” and somehow that produces an undesired outcome for some users.

If this were a Jira ticket, I would have asked for a user story or to talk to the user, because I’m not sure either of us understands what is really hurting them (a fancy way of saying customers don’t always know what they want).

So while “to be seen” and “to control the distribution” may look like a paradox at first, it may actually be a case of “communication is hard”.

To be less abstract, my reasoning for untangling the “paradox” was this: if the user doesn’t want some notes to propagate freely through the network, then there must be either a target audience or some “undesirable audience”. So the solution would need to happen at the npub layer: maybe adding cohort tags to kind 1 notes, close-friends lists, or bounded outboxes (i.e. I could add language and subject tags to my notes so that they’re compartmented into multiple virtual outboxes that people could selectively follow, idk. Sounds like a completely unrelated idea, but I think I like it).
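To make that half-baked idea slightly more concrete, here is a sketch of what such a note could look like; none of the tag names below (beyond the standard “t” topic tag) exist in any NIP, they’re purely hypothetical:

```typescript
// Hypothetical "virtual outbox" note: a normal kind 1 carrying self-declared
// audience/language tags that clients could filter on. Only the "t" topic tag
// is standard; "lang" and "audience" are made up for illustration.
const compartmentedNote = {
  kind: 1,
  created_at: Math.floor(Date.now() / 1000),
  tags: [
    ["t", "gardening"],            // standard topic tag
    ["lang", "pt-BR"],             // hypothetical language tag
    ["audience", "close-friends"], // hypothetical cohort label; nothing enforces it
  ],
  content: "a note intended for one of the author's virtual outboxes",
};

// Relays generally only index single-letter tags, so selecting one "outbox"
// would likely happen client-side (or the design would use a single-letter tag):
const closeFriendsOnly = [compartmentedNote].filter((note) =>
  note.tags.some(([name, value]) => name === "audience" && value === "close-friends"),
);
```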

Another possibility is that the user understands that deleting content from the internet gets harder the more it is replicated, since deletions are in essence a gentleman’s agreement. But if that’s the case, then the proposed solution of adding yet another optional request not to redistribute the content wouldn’t address the issue, risking resiliency for little to no tangible return.
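For reference, this is roughly what that gentleman’s agreement already looks like on the wire (a NIP-09 deletion request); a “please don’t mirror” request would have exactly the same enforcement properties, i.e. none. Values are placeholders.

```typescript
// A NIP-09 deletion request (kind 5). It only *asks* relays and clients to
// drop the referenced event; nothing forces a relay, mirror, or client that
// already holds a copy to comply. Values are placeholders.
const deletionRequest = {
  kind: 5,
  created_at: Math.floor(Date.now() / 1000),
  tags: [
    ["e", "<id of the event the author wants removed>"], // event being retracted
    ["k", "1"],                                          // kind of the referenced event
  ],
  content: "posted by mistake", // optional human-readable reason
};
```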