banjo
94f66a6138a20e120cefbe343103186804847ad9619316761e3e76a062d5fed0
"Freedom of speech is being able to tell someone else something they don't want to hear." "No matter what you say, someone is going to be offended." "Once you realize that politicians are not altruistic, and that they're in politics for themselves and their own personal gain, then everything makes sense." "No one is above the law." "Have you ever met a poor politician?"

GM Nostr!

Happy Saturday!

Nothing on the agenda today (other than my morning coffee)!

Maybe head out for some fishing...we'll see how the weather shapes up (looks like a nice day so far).

#coffeechain

Agree--it won't be easy--but if we *can* put content control in the hands of the users, we'll have taken a GIANT LEAP forward with Nostr--

People ask "why not just use Twitter, or Mastodon or..." and "when will Nostr adoption reach a critical mass?"

Well, user-controlled content is the "killer app" that will answer both questions...

Ok, is it just me, or does Kamala *always* sound like she's whining?

(Maybe it's her voice, or maybe her delivery, but it's like nails on a chalkboard for me).

Anyone else?

GM Nostr!

Happy Friday! (And a welcome Friday it is...)

Have a bit of work to do today, so hoping for a nice day (seems to be starting out that way). Looking forward to the weekend as well.

But (as always) starting out with my Raktajino...

#coffeechain

Yes, I do get it...yet I still believe (STRONGLY) that we need to develop tools that let USERS control and filter their own content--and to not rely on someone else (e.g., relay operators) to do it.

Frankly, the functionality to "focus" a user's feed is really missing from Nostr currently...developing such a framework would help to solve both problems.

Decentralization is the core tenet of Nostr--and any "filtering" should be decentralized as well.
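
To make that concrete, here's a rough sketch (in TypeScript) of what user-owned, client-side filtering could look like. The event shape follows the Nostr spec, but the `UserFilter` type and `applyUserFilters` function are purely hypothetical illustrations, not any existing client's API:

```typescript
// A minimal sketch of client-side content filtering.
// The filter lives with the USER (e.g., in local storage or a NIP-51-style
// mute list), so no relay operator decides what gets dropped.

interface NostrEvent {
  id: string;
  pubkey: string;
  kind: number;
  content: string;
  tags: string[][];
  created_at: number;
}

interface UserFilter {
  mutedPubkeys: Set<string>; // accounts the user chose to hide
  mutedWords: string[];      // keywords the user chose to hide
}

// Keep only the events the user wants to see; relays stay dumb pipes.
function applyUserFilters(events: NostrEvent[], filter: UserFilter): NostrEvent[] {
  return events.filter((ev) => {
    if (filter.mutedPubkeys.has(ev.pubkey)) return false;
    const text = ev.content.toLowerCase();
    return !filter.mutedWords.some((w) => text.includes(w.toLowerCase()));
  });
}

// Hypothetical usage:
const myFilter: UserFilter = {
  mutedPubkeys: new Set(["<pubkey-to-mute>"]),
  mutedWords: ["spam-keyword"],
};
// const myFeed = applyUserFilters(allEvents, myFilter);
```

The point of the sketch: the relay returns everything, and the "channel changing" happens entirely on the user's side.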

Keep an eye on Bring a Trailer...the pricing graphs are really helpful. You may find one that's more affordable than you think...

https://bringatrailer.com/audi/r8/

Ah, I agree--rent one (preferably on a track). You may find it's not quite what you expected...(or you may find it's exactly what you'd dreamed).

Either way, it's a win... 😃

Hmmm...that certainly furthers the conversation...

Exactly my point--it's not easy--so we (users) need better tools from the developers to enable user moderation of content (i.e., change the channel).

That's where development efforts should be focused--enabling the USER to control (and choose) their own content, in an easy, simple, expedient way.

And yes, censoring content at the relay level is much easier to implement--yet my concern is that it takes us in the wrong direction (i.e., centralization vs. decentralization).

We're better than that--devs, please take up the challenge--Nostr is AMAZING--let's build the tools that will continue to inspire and yet stay true to the core reason Nostr exists--decentralized communication and freedom of speech!

Said this in another reply, but do we really want Nostr's freedom of speech proposition to be "Hey, spin up your own relay"?

We're better than that--we need to make things easier for folks to adopt Nostr...not harder.

Yes, anyone CAN spin up their own relay, but do we really want to make that our freedom of speech proposition?

"Hey folks, Nostr is great--but you have to spin up your own relay to make it work if you don't want to be censored"

One of our greatest (current) problems is ease of use...I'd say going down the "spin up your own relay" road isn't really our best answer...

Oh, no argument--it's a VERY complex issue--and "filtering" is one solution--

Yet it means counting on relay operators to then not censor other things...and that's when it gets sticky...

It's really the exact problem Facebook, Twitter, etc. are faced with--how much censorship is "ok"? And who gets to decide?

GM Nostr!

Happy Thursday!

Neighbor's dog barking at 7:00 AM...glad I'm already up getting some coffee...

#coffeechain

How will you know what a relay operator is "filtering"?

How will you know if information you'd like to see is not being presented to you?

How will you ensure a relay operator is not shadow banning certain accounts or topics?

How will you know if a relay operator isn't doing exactly what Twitter and Facebook have been doing?
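
One rough way to spot-check this (just a sketch, assuming a browser-style WebSocket and placeholder relay URLs and pubkeys): ask several relays for the same author's notes over the standard REQ/EVENT/EOSE flow, and see whether any relay is quietly returning less than the others.

```typescript
// Sketch: compare what different relays return for the same query.
// A relay silently "filtering" an author shows up as event ids that
// other relays have but it does not.

type EventIdSet = Set<string>;

// Ask one relay for an author's recent notes and collect the event ids.
function fetchEventIds(relayUrl: string, authorPubkey: string): Promise<EventIdSet> {
  return new Promise((resolve) => {
    const ids: EventIdSet = new Set();
    const ws = new WebSocket(relayUrl);
    const subId = "filter-check";

    ws.onopen = () => {
      // Standard Nostr REQ: recent kind-1 notes from one author.
      ws.send(JSON.stringify(["REQ", subId, { kinds: [1], authors: [authorPubkey], limit: 100 }]));
    };

    ws.onmessage = (msg) => {
      const data = JSON.parse(msg.data.toString());
      if (data[0] === "EVENT" && data[1] === subId) ids.add(data[2].id);
      if (data[0] === "EOSE") { ws.close(); resolve(ids); } // end of stored events
    };

    ws.onerror = () => { ws.close(); resolve(ids); };
  });
}

// Report how many events each relay is missing relative to the union.
async function compareRelays(relays: string[], authorPubkey: string): Promise<void> {
  const results = await Promise.all(relays.map((r) => fetchEventIds(r, authorPubkey)));
  const allIds = new Set(results.flatMap((s) => [...s]));
  relays.forEach((relay, i) => {
    const missing = [...allIds].filter((id) => !results[i].has(id));
    console.log(`${relay}: ${results[i].size} events, ${missing.length} missing vs. the union`);
  });
}

// Hypothetical usage:
// compareRelays(["wss://relay-a.example", "wss://relay-b.example"], "<author-pubkey-hex>");
```

It's not proof of anything by itself (relays legitimately differ in what they store), but it shows the kind of transparency tooling users would need if relays are doing the "filtering" for us.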

So...

Let's suppose a relay operator "filters" out spam--users are happy, and they sign up for that relay.

And it prospers--in fact, it becomes one of the most utilized relays on Nostr...

And...then let's suppose that relay operator at some point in the future decides that "Hunter's Laptop" is disinformation, and for the good of his users he begins "filtering" that...but...he never tells his users.

You can (of course) think of many such examples.

And yes, in this model, what and how to "filter" becomes the choice of each relay operator...and yet (if so) it also then becomes the RESPONSIBILITY of those relay operators to act altruistically, and to not become individual arbiters of truth.

This then becomes the proverbial "slippery slope"...

And while advocates would say "I'd never censor 'Hunter's Laptop'" the unfortunate truth is (likely) that some relay operators will be tempted to inject their own biases into their relays.

How will relay users then know what's being "filtered" (censored) by the relay operators? Or will users need to blindly trust those operators to not censor something else? And isn't that exactly what happened with Facebook and Twitter (and why Nostr was "born" in the first place)?