I think the real mechanism regarding “privacy” and siloing community media is AUTH. (I’m using quotes here because I honestly don’t believe any kind of real privacy exists on social media, even with encryption, but that’s a hot take of mine.)
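For concreteness, this is roughly what the AUTH side looks like today: a Blossom server can gate GETs behind a signed kind 24242 authorization event. A minimal sketch per BUD-01, using nostr-tools; the exact tags a given server demands can vary:

```ts
// Minimal sketch of a Blossom "get" authorization (kind 24242, BUD-01).
// Assumes nostr-tools v2; <sha256-of-blob> is a placeholder.
import { finalizeEvent, generateSecretKey } from 'nostr-tools/pure'

const sk = generateSecretKey() // in practice: the user's own key

const auth = finalizeEvent({
  kind: 24242,
  created_at: Math.floor(Date.now() / 1000),
  content: 'Get blob', // human-readable statement of intent
  tags: [
    ['t', 'get'],              // the verb this authorization covers
    ['x', '<sha256-of-blob>'], // the blob being requested
    ['expiration', String(Math.floor(Date.now() / 1000) + 60)],
  ],
}, sk)

// The signed event is base64-encoded into the Authorization header:
const res = await fetch('https://blossom.example.com/<sha256-of-blob>', {
  headers: { Authorization: `Nostr ${btoa(JSON.stringify(auth))}` },
})
```

That's real access control, unlike the intent-signalling discussed below: the server simply refuses to serve the blob without a valid event.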
nostr:npub149p5act9a5qm9p47elp8w8h3wpwn2d7s2xecw2ygnrxqp4wgsklq9g722q has torn my idea apart in a way I probably couldn’t have done myself (unfortunately, the discussion thread has fragmented because poll support on Nostr is terrible):
He’s right. Just like NIP-70’s “protected” events, this is really just asking people to respect the original uploader’s intent. In other words, marking media as protected is merely a statement of intention; people can easily ignore or bypass it. Still, it can be useful for well-behaved Blossom implementations run by well-intentioned folks, i.e. those mirroring to genuinely help. Marking a blob as protected is simply a request to other responsible Blossom users, nothing more.
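For reference, this is all NIP-70 amounts to on the wire: a bare “-” tag that relays may choose to honor. A minimal sketch; nothing enforces it:

```ts
// NIP-70 "protected" event: the marker is a bare "-" tag. Relays that
// honor it only accept the event from its authenticated author; relays
// that don't simply ignore the tag. It's a request, not a mechanism.
import { finalizeEvent, generateSecretKey } from 'nostr-tools/pure'

const protectedNote = finalizeEvent({
  kind: 1,
  created_at: Math.floor(Date.now() / 1000),
  content: 'Please do not rebroadcast this.',
  tags: [
    ['-'], // NIP-70 protected marker
  ],
}, generateSecretKey())
```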
Oops, didn't realize how dear the idea might have been to you. Might have been a bit too straightforward / brutally honest.
That said, there's a big "Why not?" for adding a protected tag anyway. It can only help. Just can't rely on it.
Nah mate, your take is not only valid but very welcome. I wouldn’t have asked if I wasn’t looking for an honest answer. And yours is well-grounded, coming from someone who not only understands but is actually building with Blossom.
I think I’m ultimately trying to solve a people problem (behaviour and expectations) with tech, which is always a bit silly.
The core of the problem is:

Nevertheless, if there’s enough interest (and nostr:npub1ye5ptcxfyyxl5vjvdjar2ua3f0hynkjzpx552mu5snj3qmx5pzjscpknpr doesn’t dislike the idea), I might still try to write a small optional BUD. But I’m not sure it would help. What we really need is a better way to explain and educate both devs and users about Blossom. I’m trying to do something on this front as well.
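If that BUD ever materialised, I imagine it would be tiny: something like a “protected” marker on the upload authorization that well-behaved servers surface in the blob descriptor. To be clear, everything below is hypothetical; none of these tags or fields exist in any current BUD:

```ts
// Purely hypothetical sketch of what an optional "protected" BUD could add.
import { finalizeEvent, generateSecretKey } from 'nostr-tools/pure'

// A BUD-02-style upload authorization with a hypothetical "protected" tag:
const uploadAuth = finalizeEvent({
  kind: 24242,
  created_at: Math.floor(Date.now() / 1000),
  content: 'Upload blob (please do not mirror)',
  tags: [
    ['t', 'upload'],
    ['x', '<sha256-of-blob>'],
    ['protected'], // hypothetical: "please don't mirror this blob"
    ['expiration', String(Math.floor(Date.now() / 1000) + 60)],
  ],
}, generateSecretKey())

// A well-behaved server could then echo the flag in its blob descriptor,
// so mirroring servers can check it before replicating:
interface BlobDescriptor {
  url: string
  sha256: string
  size: number
  type?: string
  uploaded: number
  protected?: boolean // hypothetical field, not in BUD-02
}
```

Again: a badly-behaved mirror just ignores all of this, which is exactly the critique above.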
Feels like we need some sort of community groups/forums thing.
And/or snapstr/nostories with proper encryption (at least for a “close friends” mode etc).
That way you keep relays decoupled from the distribution policies themselves (I mean, there’s no incentive for a relay to host something none of its users can reach).
You’re looking at solutions for a different problem here (restricting notes or media so they’re only served to a group). We already have the base mechanisms to solve this in both Nostr and Blossom (auth & encryption). On the end-user side, Nielson is currently working on Communikeys. And a large proportion of Nostr devs are working on NIP-29, Signal-like MLS groups, or other alternatives. I’m confident that one or more of these efforts will succeed.
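To illustrate what I mean by the base mechanisms: encryption already restricts access regardless of where blobs get mirrored. A minimal sketch, assuming Web Crypto and some encrypted channel (e.g. NIP-44 DMs or an MLS group) for the key exchange:

```ts
// Encrypt media before uploading; share the key only with the group.
// Mirrors then replicate opaque ciphertext that only members can read.
const key = await crypto.subtle.generateKey(
  { name: 'AES-GCM', length: 256 }, true, ['encrypt', 'decrypt'],
)
const iv = crypto.getRandomValues(new Uint8Array(12))

const media = new TextEncoder().encode('...raw media bytes...')
const ciphertext = await crypto.subtle.encrypt({ name: 'AES-GCM', iv }, key, media)

// Upload `ciphertext` as an ordinary Blossom blob; send the exported key
// and the iv to group members over an encrypted channel. Anyone who
// mirrors the blob is just hosting bytes they can't decrypt.
const rawKey = await crypto.subtle.exportKey('raw', key)
```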
The underlying ask here isn’t “don’t show my media to random folks.” On the contrary, people want to post publicly. They want kind 1, and they want it to be seen. But at the same time, they also want more control over how notes and media propagate.
I know, that’s a contradictory position from the start. But it’s the position of some users who see Blossom mirroring as the villain. And, for better or worse, that’s where “solutions” like NIP-70 and what I’m discussing here come from. The goal is to let users “restrict” (or at least signal their intent to restrict) mirroring, not access.
I’ll have to confess, I made assumptions.
“They want kind 1, and they want to be seen.”
By whom?
“…they also want more control over how notes and media propagate”
Why?
So we’re in a position where some relay provides a mirror “for free/out of goodwill” and somehow we get an undesired outcome for some users.
If this were a Jira ticket, I would’ve asked for a user story / a chat with the user, because I’m not sure either of us understands what’s really hurting them (a fancy way of saying customers don’t always know what they want).
So while “to be seen” and “to control the distribution” may look like a paradox at first, it may actually be a case of “communication is hard”.
To be less abstract, my reasoning for untangling the “paradox” was: if the user doesn’t want some notes to propagate freely through the network, then there must be either a target audience or some “undesirable audience”. So the solution needed to happen at the npub layer, maybe by adding cohort tags to kind 1 notes, close-friends lists, or bounded outboxes (i.e. I could add language and subject tags to my notes so they’d be compartmented into multiple virtual outboxes that people could selectively follow, idk; see the sketch below). Sounds like a completely unrelated idea, but I think I like it.
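To make the “virtual outboxes” idea slightly concrete: NIP-32-style labels could carry the language/subject facets today, and clients could filter on them. A rough sketch; the “subject” namespace is invented for illustration:

```ts
import { finalizeEvent, generateSecretKey } from 'nostr-tools/pure'

// A kind 1 note labelled with a language facet and a subject facet
// (NIP-32 label tags; the "subject" namespace is made up).
const note = finalizeEvent({
  kind: 1,
  created_at: Math.floor(Date.now() / 1000),
  content: 'Long rant about Blossom mirroring...',
  tags: [
    ['L', 'ISO-639-1'],
    ['l', 'en', 'ISO-639-1'],
    ['L', 'subject'],            // hypothetical namespace
    ['l', 'nostr-dev', 'subject'],
  ],
}, generateSecretKey())

// A follower who only wants one "virtual outbox" filters on the label.
// Relays index single-letter tags, so this already works:
const filter = {
  kinds: [1],
  authors: [note.pubkey],
  '#l': ['nostr-dev'],
}
```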
Another possibility is that the user understands that deleting stuff from the internet gets harder the more the content is replicated, since deletions are in essence a gentleman’s agreement. But if that’s the case, then the proposed solution of adding yet another optional request not to redistribute the content wouldn’t address the issue; it would risk resiliency for little to no tangible return.
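For context on why deletion is a gentleman’s agreement: a NIP-09 deletion request is itself just another event that relays and mirrors may or may not honor. A minimal sketch:

```ts
// NIP-09 deletion request: a kind 5 event referencing the target event.
// Relays that honor it stop serving the target; nothing forces them to,
// and copies that already propagated are out of the author's hands.
import { finalizeEvent, generateSecretKey } from 'nostr-tools/pure'

const deletionRequest = finalizeEvent({
  kind: 5,
  created_at: Math.floor(Date.now() / 1000),
  content: 'posted by mistake', // optional reason
  tags: [
    ['e', '<id-of-event-to-delete>'],
    ['k', '1'], // kind of the event being deleted
  ],
}, generateSecretKey())
```

The more widely a blob or note is mirrored, the more independent operators each have to individually honor that request; hence the resiliency trade-off above.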