Well, look at how Primal and Damus differ as one example. You can 100% lose posts between them if you use both.

Or, ask yourself, if Nostr were the size of X, who could afford to run a relay?

RELAYS ARE ALWAYS PERMISSIONED, THEY ARE NOT PUBLIC COMMONS

Discussion

> Well, look at how Primal and Damus differ as one example. You can 100% lose posts between them if you use both.

That depends entirely on the relays you are using and how the client you are using implements client-side filtering; that's not censorship...

> Or, ask yourself, if Nostr were the size of X, who could afford to run a relay?

The largest cost by far would be bandwidth; storage space is insignificant, because the likelihood that any single relay would have to store everything is essentially zero (and that's a good thing).
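
A quick back-of-envelope to show what I mean (every number below is an assumption for illustration, not a measurement):

```python
# Back-of-envelope: why egress, not disk, is the bill that grows with popularity.
# Every figure below is an assumption for illustration, not a measurement.

EVENT_SIZE_BYTES = 500        # assumed average size of a signed kind-1 event
EVENTS_PER_DAY = 1_000_000    # assumed events written to this relay per day
READS_PER_EVENT = 200         # assumed times each stored event gets served out

storage_per_day_gb = EVENT_SIZE_BYTES * EVENTS_PER_DAY / 1e9
egress_per_day_gb = storage_per_day_gb * READS_PER_EVENT

print(f"storage growth: ~{storage_per_day_gb:.1f} GB/day")
print(f"egress:         ~{egress_per_day_gb:.0f} GB/day")
# With these assumptions the relay adds ~0.5 GB/day of disk but ships ~100 GB/day.
```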

Thanks to nostr:nprofile1qy88wumn8ghj7mn0wvhxcmmv9uq36amnwvaz7tmwdaehgu3wvf5hgcm0d9hx2u3wwdhkx6tpdshsqgpy3kgaawurzqz25t7v5w4fl23e4mp55zcq6sacvj6kw06dyf70nutjs8cc, and in the future other p2p relay protocols, many of which I assume will be integrated directly into clients (which is something I am planning for Eve), the bandwidth cost drops as well, to the point where only small but periodic queries would have to be performed on relays.

Also, I happen to believe that the current way messages are sent and received between clients and relays is straight-up garbage, but by simply introducing binary messaging, and possibly compression, we can significantly reduce the amount of data sent and received.
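
For a rough sense of what even generic compression buys on today's JSON framing, here is a minimal sketch with a synthetic event (a proper binary encoding would do better):

```python
import json
import os
import zlib

# A synthetic kind-1 event, shaped like the JSON text clients and relays exchange
# today. The id/keys/sig are random hex stand-ins of the right length.
event = {
    "id": os.urandom(32).hex(),
    "pubkey": os.urandom(32).hex(),
    "created_at": 1700000000,
    "kind": 1,
    "tags": [["e", os.urandom(32).hex()], ["p", os.urandom(32).hex()]],
    "content": "GM nostr, measuring how much of this frame is protocol overhead.",
    "sig": os.urandom(64).hex(),
}

wire = json.dumps(["EVENT", "sub-id", event]).encode()
compressed = zlib.compress(wire, level=9)

print(f"JSON frame:      {len(wire)} bytes")
print(f"zlib compressed: {len(compressed)} bytes "
      f"({100 * len(compressed) / len(wire):.0f}% of the original)")
# Hex strings only carry 4 bits of information per byte, so even generic
# compression claws a chunk back; a binary encoding would avoid it entirely.
```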

> RELAYS ARE ALWAYS PERMISSIONED, THEY ARE NOT PUBLIC COMMONS

That's an unfalsifiable, purely definitional argument that has nothing to do with the discussion at hand. Fine, relays are permissioned by your definition; that doesn't change my argument.

> The largest cost by far would be bandwidth, storage space is insignificant, because the likelyhood that any relay would have to store everything is essentially zero (and that's a good thing).

The largest cost by far would not be bandwidth or storage but rather the cost to index these relays. By several orders of magnitude.

Nostr as it stands cannot scale to millions of users without indexing; or, put another way, as it scales, power and profit will naturally accrue to indexing parties to the degree that Nostr will lose itself. And since indexing (naturally) isn't part of the protocol, it will be done in a computationally inefficient manner, with multiple competing parties performing the same tasks and then falling off one by one due to the immense costs each party must bear alone, given the lack of cost-sharing.

This will become much more obvious after around 500k daily active users, should Nostr one day hit that mark. It's not an issue now, but the initial hints of it are here.
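
To put rough numbers on it (all assumptions, purely for illustration):

```python
# Naive cost of assembling one user's feed with no indexer.
# Every figure below is an illustrative assumption, not a measurement.

distinct_relays = 300    # relays a large follow list fans out to after de-duplication
polls_per_day = 96       # one feed refresh every 15 minutes
daily_users = 500_000    # the scale mentioned above

queries_per_user = distinct_relays * polls_per_day
queries_network = queries_per_user * daily_users

print(f"no indexer: {queries_per_user:,} relay queries/day per user")
print(f"network:    {queries_network:,} queries/day across {daily_users:,} users")
# An indexer collapses the per-user cost to one endpoint per refresh, but the
# crawling -- and the leverage that comes with it -- moves to whoever runs it.
```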

There are only two ways to avoid the more-scale-more-indexing trap. The first is to stick to use cases where indexing is irrelevant (it doesn't help with anything). The second is to accept that core use cases will be those where indexing does bring benefits at scale, but bake indexing into the protocol at the most fundamental level. Both are still roads Nostr could take.

And there is always the deus ex machina: some kind of decentralised indexing technology comes along that is simple and just works.

> The largest cost by far would not be bandwidth or storage but rather the cost to index these relays. By several orders of magnitude.

Touché, but to be fair, I was talking about the scale of an individual relay, not global scale.

That's a bridge that we will cross once we get to it.

If the majority of nostr clients stick to being this global X-like (kind 1) feed, then yeah, you are absolutely correct that this is something that needs to be solved. But I foresee nostr eventually evolving into something else.

> But I foresee nostr eventually evolving into something else.

Same here. And it feels like the first stages of that evolution are already progressing.

This is a study of nostr resilience and how notes survive even if the top 5% of relays go down:

https://arxiv.org/pdf/2402.05709.pdf

Didn't see this paper before! Nice find

Yes, Primal can censor me, and when they do, I'll just stop using Primal.

That's the thing about nostr: it's not owned by Primal and Damus.

Primal and Damus are just clients used to access the protocol.

If I really want to avoid being censored, I can even build my own relay with my own personal media hosting.

I'm struggling to see how that's the same as any centralised social media platform.

When Facebook banned me, my account was removed; you can't even find it anymore. All the information was censored and hidden.

Nobody has that kind of power over nostr

Relays are just web servers. If you run your own, no one knows it exists. If you use someone else's, it can censor you.

No one will run a scaled web server for you for free.

No government will allow a scaled web server that does not censor.

Nostr can never scale while preventing censorship.

Most people cannot imagine scale. That’s why they still think Lightning works.

Also, whatever you’re working on will be replaced by bitnames when miners activate bip300

Someday, people will stop trying to strap bullshit onto Bitcoin.

Someday, everything will be strapped to Bitcoin

Zzz

Remember when you were wrong about lightning? Lol

I don't remember when people cared about drivechains.

Doesn’t matter. The miners just need to care about money. The users will come when they have something to use.

Meouw

What do you think of this? Apparently 32 bytes on a UTXO for each "publish changes" button press; the rest is client-resolved. Bullshit on Bitcoin, or does it make some sense? I'm not sure where I stand on these pin-a-tiny-anchor things.

nostr:nevent1qvzqqqqqqypzphn7e50zja4x4ke0lf05mwq60kqjezakdx92qrw0rem2md27l4j9qyg8wumn8ghj7mn0wd68ytnddakj7qpqstqdpxn728gu52qpa55q85484ry72sz0lrpc0zgcx3amh4l5n4rq5ju0qk
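
My rough mental model of what a 32-byte anchor can do, as a hypothetical sketch (the names and structure here are illustrative guesses, not the actual scheme in that note):

```python
import hashlib
import json

# Hypothetical sketch of what a 32-byte on-chain anchor can commit to: hash the
# whole off-chain state once per "publish", and put only the digest in the UTXO.
# The payload layout below is a made-up example, not the referenced design.

published_state = {
    "profile": {"name": "alice", "about": "hello"},
    "posts": ["first post", "edited second post"],
}

serialized = json.dumps(published_state, sort_keys=True).encode()
anchor = hashlib.sha256(serialized).digest()   # exactly 32 bytes

print(f"off-chain payload: {len(serialized)} bytes")
print(f"on-chain anchor:   {len(anchor)} bytes -> {anchor.hex()}")
# Anyone who fetches the payload out-of-band can recompute the hash and check it
# against the UTXO; the chain only orders and timestamps the 32-byte commitment.
```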

Why does the server itself have to scale so massively? Why doesn't every user have their own server and scale it only according to the content and attention they receive? Mine could stay quite small; Kanye's would need to be pretty beefy. Peer-to-peer connections between personal servers only.

Your idea obviously results in fragmentation, isolation, and echo chambers, instead of censorship resistance

Dig into "Kanye's would need to be pretty beefy" and answer your own question. ;)

Scale requirements are a constant on the web.

One great thing about Pubky is that it has a key-based discovery method, so you can safely include centralized servers in the design.

If you get censored, you just switch your DNS to a new provider or to self-hosting. No loss of context.

It doesn't answer my question. The Kanyes of the world can R&D performance and scaling features that help them meet their own ruinous popularity.

For the other 99% of humanity, home-scale personal servers will be just fine, and people can form comparatively small networks between themselves with no problem. The mildly more popular can throw a few extra bucks at their setups.

This is incorrect.

If you want to have pocket networks of limited size, then maybe you can keep using nostr for hobbyists and outcasts.

But Kanye is never gonna host his own server, and millions of self-hosting users following Kanye will never be performant without massive indexing (rough numbers below).

You're just re-injecting nostr into the design for no benefit. (But nothing stopping people from using Pubky that way if they must...)

It is too inefficient to have everyone syncing everything locally, which is why Bitcoin doesn't scale well.
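
Rough numbers on the Kanye case (all assumed figures, for illustration only):

```python
# Why one self-hosted box can't serve a celebrity follower graph directly.
# Every figure is an assumption for illustration.

followers = 30_000_000     # assumed follower count
polls_per_follower = 48    # assumed feed refresh every 30 minutes
response_bytes = 2_000     # assumed bytes returned per poll

requests_per_second = followers * polls_per_follower / 86_400
egress_gb_per_day = followers * polls_per_follower * response_bytes / 1e9

print(f"~{requests_per_second:,.0f} requests/second, sustained")
print(f"~{egress_gb_per_day:,.0f} GB/day of egress")
# That is CDN territory, not a home server, which is the aggregation/indexing
# layer sneaking back into the design.
```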

I didn't even have nostr in mind when I said that; more like urbit.org or plunder.tech.

Also, how do you know Kanye won't host a server? Maybe servers are just too hard to run and maintain today... I'm sure Kanye has a smartphone and a car; those are pretty complicated and performant, but they have been made idiot-proof and simple to operate.

Who (or what) says every relay has to scale up with the growth of the ecosystem? Why?

Ah, I see that nostr:npub19ma2w9dmk3kat0nt0k5dwuqzvmg3va9ezwup0zkakhpwv0vcwvcsg8axkl was asking the same thing here 🤓

Well, they don't need to scale if no one uses them.

But if they do, you have a big problem because now a few businesses power the bulk of the network, and the users have no way to hot swap servers.

I agree that is just one scenario that could happen. But is it something that will break the protocol? I don't think so. I think this scenario will play out at some point in the future, and for now I would be OK with that. Even in that scenario, we already have a far better internet / communication layer than we have today.

Breaking the protocol is not the concern; the concern is whether the protocol solves real problems for many people.

Without that it dies.

Let me be clear: nostr is not a "better internet" layer. It is the same web we already have, except everything is signed with a key.

Everything else is the same, or worse.

What is the main difference for you between what the internet is and what the web is?

I think we have to be clear about how we see things (making the definitions clear).