Three options here:

- Reporting is censorship (necessarily decreases reach)

- Reporting is advertising (necessarily increases reach)

- Reporting is speech (it's complicated)

Reporting can decrease reach, depending on client/relay implementations. It can also act as an advertisement for the reported content (which can expose you to legal risk if what you're reporting is illegal). At its core, reporting is like any other speech: a comment with a value judgment attached. What software and governments do about it is another matter.

Tangentially: be careful what and how you report. Many clients sign reports with your own key, and some even send kind 7 with a warning symbol instead of kind 1984. This is terrible for users, who don't want to associate their keys with objectionable content.
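For reference, a kind 1984 report per NIP-56 is just an ordinary signed event with `p`/`e` tags naming the target and a report type. A minimal sketch (all placeholder values are hypothetical; `id` and `sig` are omitted, since they come from serialization and signing):

```python
import json
import time

def build_report(reporter_pubkey: str, offending_pubkey: str,
                 offending_event_id: str, report_type: str,
                 reason: str = "") -> dict:
    """Sketch of an unsigned NIP-56 kind 1984 report event.

    report_type is one of the NIP-56 categories, e.g. "nudity",
    "malware", "profanity", "illegal", "spam", "impersonation", "other".
    """
    return {
        "kind": 1984,
        "pubkey": reporter_pubkey,       # the report is signed by this key
        "created_at": int(time.time()),
        "tags": [
            ["p", offending_pubkey, report_type],
            ["e", offending_event_id, report_type],
        ],
        "content": reason,  # optional human-readable comment
        # "id" and "sig" are filled in when the event is serialized and signed
    }

# Hypothetical placeholder values, for illustration only
report = build_report(
    "reporter-pubkey-hex",
    "offending-pubkey-hex",
    "offending-event-id-hex",
    "spam",
    "looks like a bot flood",
)
print(json.dumps(report, indent=2))
```

Note the `pubkey` field: whoever signs the report is publicly on record as having filed it, which is exactly the association problem described above.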

nostr:nevent1qy88wumn8ghj7mn0wvhxcmmv9uq32amnwvaz7tmjv4kxz7fwv3sk6atn9e5k7tcqyztc94eyvlhm0d5w647gquzf0q0v46azrlku847djyegsa9qhxd5ucxuuy4


Discussion

free speech does not mean the right to commit crimes

should some speech be a crime?

How about:

Reporting is Reporting

Censorship is Censorship

I am neither a fan of reporting nor censorship, but I don't get why we should mix up the words 😅...

That's basically what I'm saying, yeah

curation not censorship

Because censorship isn't binary & it always starts with a slope that is slippery.

Reporting is censorship!

It results in removal of content, influence of speech, gatekeeping, subjectivity!

A report is just a note of kind 1984. It can result in censorship, but so could any other note. Censorship is censorship.

So, you agree then. The act of reporting is a tool of censorship that proves my original statement😉

So Alex Jones' kind 1s are censorship, because they result in censorship?

Reporting is free speech.

Censorship requires a power imbalance.

source everything, nice to see it being discussed

Correcto

What? Nobody can remove someone else's note/content on nostr. Reporting is definitely not censorship. On nostr, people have the free will to use kind 1984 events to filter their own feeds, but that's not censorship at all.

It is central planning, is it not?

How so?

Do you know how to vet the central reporting database?

Huh? Vet what? There is no central reporting database?

Relays that use the reporting bot to detect which notes & eventually npubs to censor will use a DB that you won't be able to vet.

Just my prediction, anything that can aid a central planner - probably will.

Nostr has a free and open market for relay selection, where users have the power to choose which relays they want to use. If one of your relays is doing something you don't agree with, simply stop using it. I don't expect all relays to agree on the content they store, so naturally some relays will filter out some stuff while others filter other things. Don't expect to be allowed to post your content on every relay in the world.

Relays should be content agnostic & filtering should be done at the client -> user level.

Otherwise, no one will vet relays & you know it.

Are we talking about filtering new events or filtering on the content returned on REQ messages?

One leads to the other
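The distinction raised above is between filtering at write time (rejecting EVENT messages so a note is never stored) and at read time (storing everything but omitting events from REQ responses). A minimal sketch of both, with purely illustrative names (a real relay's policy hooks will differ):

```python
# Hypothetical blocklist, e.g. fed by a reporting bot's output
BLOCKED_PUBKEYS = {"spammer-pubkey-hex"}

def accept_event(event: dict) -> bool:
    """Write-time filter: refuse to store events from blocked pubkeys.
    The note never enters the relay's database."""
    return event["pubkey"] not in BLOCKED_PUBKEYS

def filter_for_req(stored_events: list) -> list:
    """Read-time filter: store everything, but omit blocked pubkeys
    when answering REQ subscriptions. The note exists but isn't served."""
    return [e for e in stored_events if e["pubkey"] not in BLOCKED_PUBKEYS]
```

In practice "one leads to the other" in the sense that the same blocklist can drive both hooks; the difference is only whether the filtered note is ever persisted.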

Relays are just someone else's server and hard drive. That "someone else" will have an opinion on what they want to store there. There is no way in hell that all relays will accept anything from anyone. You just need to find the relays you agree with, or better yet host your own, where you own your own speech.

Having a way to filter out spam is good,

But if you end up targeting npubs because of a few notes, and that becomes widely adopted... I fear this.

I personally don’t think that’s going to be a problem. Basing your whitelist on a far reaching, naïve, reporting bot is definitely retarded but I don’t believe that’s going to be the norm.

This is the first iteration, so maybe we don't have to worry about that...yet.

reporting is speech (it's complicated) is where im at on it

as in: doesnt scale to me like some universal final answer exists

only ever case by case

take a fb group

admin could decide ok lets make community standards

what is that process like? is there membership input on that?

ok lets have mods

what is that process like? are they nominated by the members? can they be relieved of that position? can they be re-instated? what is that whole process like? can a tyrant mod abuse that position? ive seen it many times but not countless. is modding usually just really thankless work? yea

on a small scale this is fine

cos you can make a report

& a mod knows who you are & makes a determination

if they make it the wrong way to you

you have some kind of redress

like you can literally text sarah:

rlly sarah?

& sarah can be like:

rlly eric why dont you become mod if you love this shit so much

at a bigger scale than that?

fuck if i know & i dont really have much to contribute its beyond me

🌺 thanks for reading

Can you name the kind 7 apps? This seems like an anti-best practice, given there is kind 1984

I don't know who it is, but it is very bad practice

Reporting is not censorship without a power imbalance.

But a simple protocol will never account for the fact that human societies trend toward power imbalance. Give it time and even Nostr will feel pressure from centralizing forces.

Can we have anon reporting with blind WoT, derived from the original user?

Maybe somehow, at the very least it could be done by trusting an intermediary like nos's bot.

you think users shouldn't be signing with their own key when they report someone? I can't see how a kind:1984 signed with an ephemeral key could ever be useful 🤔

It would require additional review, but it is some signal at least. PoW or some kind of blinded WoT would be necessary to ignore spam. I like how nos is doing it: report privately, review the report, then publish under a reputable reporting-only pubkey.
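That nos-style pipeline can be sketched roughly as follows. This is an illustration of the flow described above (private report, human review, republish under a single reporting-only pubkey), not nos's actual implementation; every name here is hypothetical:

```python
import time

REPORTING_BOT_PUBKEY = "reporting-bot-pubkey-hex"  # placeholder

# Privately received reports awaiting human review
pending: list = []

def receive_private_report(offending_event_id: str, report_type: str) -> None:
    """Step 1: the user reports privately; their own key never appears."""
    pending.append({"e": offending_event_id, "type": report_type})

def review_and_publish(approve) -> list:
    """Steps 2-3: a reviewer approves or rejects each pending report;
    approved ones become kind 1984 events attributed to the bot's
    reporting-only pubkey, hiding the original reporter."""
    published = []
    for r in list(pending):
        if approve(r):
            published.append({
                "kind": 1984,
                "pubkey": REPORTING_BOT_PUBKEY,
                "created_at": int(time.time()),
                "tags": [["e", r["e"], r["type"]]],
                "content": "",
            })
        pending.remove(r)
    return published
```

The trade-off is exactly the one raised earlier in the thread: reporters stay anonymous, but everyone downstream has to trust the intermediary's review instead of a verifiable signature or WoT score.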