I'm not looking at the IPs, no. Scrapers and such won't usually be disconnected. If they want to re-scrape months-old data over and over uselessly, or an app has just gone wild after being put out to pasture and is causing a REQ loop, they can make their pubkey known to the relay and use their allotment of REQs. Workers such as blastr are why I assume there are many IPs and consider it somewhat useless to base filtering on them.
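The idea above — an allotment of REQs tied to a pubkey rather than an IP — can be sketched as a token bucket keyed by pubkey. This is a minimal illustration, not any relay's actual code; the capacity and refill numbers are made up.

```python
import time
from collections import defaultdict

class PubkeyRateLimiter:
    """Token bucket per pubkey: each known client gets a REQ allotment
    that refills over time, independent of which IP it connects from."""

    def __init__(self, capacity=30, refill_per_sec=0.5):
        self.capacity = capacity            # max stored REQ tokens
        self.refill_per_sec = refill_per_sec
        # each pubkey starts with a full bucket
        self.buckets = defaultdict(lambda: (float(capacity), time.monotonic()))

    def allow_req(self, pubkey: str) -> bool:
        tokens, last = self.buckets[pubkey]
        now = time.monotonic()
        # refill proportionally to elapsed time, capped at capacity
        tokens = min(self.capacity, tokens + (now - last) * self.refill_per_sec)
        if tokens >= 1:
            self.buckets[pubkey] = (tokens - 1, now)
            return True
        self.buckets[pubkey] = (tokens, now)
        return False
```

A scraper replaying months-old queries simply drains its own bucket and gets throttled, while other pubkeys on the same IP are unaffected.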


Discussion

Also, just to mention: some of the relays that require auth are just asking for a key. ANY key. So you could auth with a different key, and then, if the relay is open like this, send and REQ any valid events on that connection — except DMs, since those require auth with your main key.
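For context, the AUTH exchange being gamed here is NIP-42: the relay sends a challenge, and the client answers with a signed kind-22242 event. Nothing in that event binds it to any *particular* pubkey, which is exactly the loophole. Below is a hedged sketch of building the unsigned event (the event-id hashing follows NIP-01; the schnorr signature step is omitted):

```python
import hashlib, json, time

def build_auth_event(pubkey_hex: str, relay_url: str, challenge: str) -> dict:
    """Build an unsigned NIP-42 AUTH event (kind 22242). Any keypair
    works here -- a relay that accepts ANY key gains little from AUTH."""
    created_at = int(time.time())
    tags = [["relay", relay_url], ["challenge", challenge]]
    content = ""
    # NIP-01 event id: sha256 of the canonical JSON serialization
    serialized = json.dumps(
        [0, pubkey_hex, created_at, 22242, tags, content],
        separators=(",", ":"), ensure_ascii=False,
    )
    return {
        "id": hashlib.sha256(serialized.encode()).hexdigest(),
        "pubkey": pubkey_hex,
        "created_at": created_at,
        "kind": 22242,
        "tags": tags,
        "content": content,
        # "sig": schnorr signature over id would go here
    }
```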

🤔

It is not a bad workaround. But given the workaround exists, AUTH isn't achieving anything.

Nonetheless I'll implement it simply because it is easier to change my code than to change other people's thinking.

Freedom isn't free 😅 or you're the product, etc. Reading a relay, I don't see the point in someone writing to a relay only to have everyone centralize on the biggest aggregators. Those aggregators wouldn't exist without homegrown relays, so they will need to pay for this, or relays are dead.

With all this authorization crap, we're going back to the era of the WWW, and even worse.

Once nostr embraces CORS, I'm out.

Blame WebSockets. Traditional methods of rate limiting will not work here (I'm talking HTTP).
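The point above is that a WebSocket is a single HTTP upgrade request, so per-request limiters see one hit no matter how many REQs follow; messages have to be counted inside the connection. A hypothetical sliding-window sketch (the limits are made-up numbers):

```python
import time

class ConnectionLimiter:
    """Hypothetical per-connection limiter: after the one HTTP upgrade,
    each incoming WebSocket message is counted against a sliding window."""

    def __init__(self, max_msgs=20, window=1.0):
        self.max_msgs = max_msgs
        self.window = window          # seconds
        self.timestamps = []          # arrival times of recent messages

    def allow_message(self) -> bool:
        now = time.monotonic()
        # drop timestamps that have aged out of the sliding window
        self.timestamps = [t for t in self.timestamps if now - t < self.window]
        if len(self.timestamps) >= self.max_msgs:
            return False
        self.timestamps.append(now)
        return True
```

An HTTP-level limiter in front of the relay would never fire on this traffic, which is why the limiting has to move into the WebSocket message loop.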