Browsers make many arbitrary requests to servers you didn't specify in order to fetch page resources. I like to block that with uBlock Origin. Clients *should* put the user in control with whitelists and/or blacklists. Practically though, the attack surface from websockets and nostr events is far smaller than that of a web page, so I don't think it's an important feature at this point, even for paranoid Tor users.
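
For what it's worth, such a control doesn't have to be elaborate. Here's a minimal sketch in Rust (the `RelayPolicy` type and its fields are made up for illustration, not from any existing nostr library) of the check a client could run before opening a websocket to a relay it only learned about from someone else's event:

```rust
use std::collections::HashSet;

/// Hypothetical per-user policy for relay connections.
struct RelayPolicy {
    allowlist: HashSet<String>, // relays the user explicitly trusts
    blocklist: HashSet<String>, // relays the user never wants contacted
    allow_unknown: bool,        // paranoid users would set this to false
}

impl RelayPolicy {
    /// Decide whether the client may open a websocket to `relay_url`.
    fn permits(&self, relay_url: &str) -> bool {
        if self.blocklist.contains(relay_url) {
            return false;
        }
        self.allowlist.contains(relay_url) || self.allow_unknown
    }
}

fn main() {
    let policy = RelayPolicy {
        allowlist: HashSet::from(["wss://relay.damus.io".to_string()]),
        blocklist: HashSet::from(["wss://nsa.gov".to_string()]),
        allow_unknown: false,
    };

    for relay in ["wss://relay.damus.io", "wss://nsa.gov", "wss://unknown.example"] {
        println!("{relay}: connect = {}", policy.permits(relay));
    }
}
```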

Beyond that, if you don't fetch Bob's events from where he publishes them, what else can you do when you want to follow Bob?

Discussion

Try out the Decentraleyes addon. It gives you more power to block unwanted CDN requests.

Browsers don't make arbitrary requests. They open the site you ask them to, and links followed from that site are trusted. They also put tremendous effort into making that as safe as possible. And when one of those links leads to an untrusted site, it's called XSS.

As you said, clients should put users in control. I'm not against this feature, I'm just saying it must be implemented responsibly.

If Bob suddenly changes relays, I want to know about it. And if it turns out he posts to nsa.gov from now on, I want to be able to say bye-bye, Bob.
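
Clients could make that cheap to notice. A rough sketch (plain Rust, hypothetical names, assuming the old and new relay lists, e.g. from NIP-65 kind 10002 events, are already parsed into sets) of diffing Bob's relay list and asking before following him anywhere new:

```rust
use std::collections::HashSet;

/// Return (added, removed) relays between two versions of someone's relay list.
fn relay_list_diff(old: &HashSet<String>, new: &HashSet<String>) -> (Vec<String>, Vec<String>) {
    let added = new.difference(old).cloned().collect();
    let removed = old.difference(new).cloned().collect();
    (added, removed)
}

fn main() {
    let old = HashSet::from(["wss://relay.damus.io".to_string()]);
    let new = HashSet::from([
        "wss://relay.damus.io".to_string(),
        "wss://nsa.gov".to_string(),
    ]);

    let (added, removed) = relay_list_diff(&old, &new);
    if !(added.is_empty() && removed.is_empty()) {
        // In a real client this would be a prompt, not a println.
        println!("Bob's relay list changed: added {added:?}, removed {removed:?}");
        println!("Ask the user before connecting to the added relays.");
    }
}
```

If the user says no to nsa.gov, the client just doesn't connect, and Bob's relay hint stays a hint.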

OK, you should have that control. If you trust Bob and he sent you to the NSA, maybe that trust was misplaced. He signed his relay list. Websites don't digitally sign their content, so bad links there, or XSS from other user input, are a far bigger problem than nostr relay references. Browsers put tremendous effort into keeping content safe because the web stack is massive and riddled with security bugs, including 0days right now. The nostr stack is very simple, but it is not provably secure; still, it's much easier to trust a simple stack in a hardened language like Rust (please don't attack me for being a Rust fanboy).
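
To make the "he signed his relay list" point concrete: nostr events are content-addressed, so a relay in the middle can't quietly rewrite Bob's relay list. A minimal sketch (using the sha2 and serde_json crates; the pubkey, timestamp, and relay URLs are placeholders I made up) of computing the NIP-01 event id that the schnorr signature is then checked against:

```rust
use serde_json::json;
use sha2::{Digest, Sha256};

/// NIP-01 event id: sha256 over the compact JSON serialization of
/// [0, pubkey, created_at, kind, tags, content], as lowercase hex.
fn nip01_event_id(
    pubkey_hex: &str,
    created_at: u64,
    kind: u32,
    tags: &[Vec<String>],
    content: &str,
) -> String {
    let serialized = json!([0, pubkey_hex, created_at, kind, tags, content]).to_string();
    Sha256::digest(serialized.as_bytes())
        .iter()
        .map(|b| format!("{b:02x}"))
        .collect()
}

fn main() {
    // Hypothetical kind-10002 (NIP-65) relay list from Bob.
    let tags = vec![
        vec!["r".to_string(), "wss://relay.damus.io".to_string()],
        vec!["r".to_string(), "wss://nos.lol".to_string(), "read".to_string()],
    ];
    let id = nip01_event_id("<bob's pubkey as hex>", 1_700_000_000, 10002, &tags, "");

    // A client would verify event.sig over this id with Bob's pubkey
    // (a BIP-340 schnorr check) before trusting any of the relay hints.
    println!("id the signature must cover: {id}");
}
```

Tamper with a single relay URL in the tags and the id changes, so the signature no longer verifies. A web page gives you nothing comparable.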