Blake
b2dd40097e4d04b1a56fb3b65fc1d1aaf2929ad30fd842c74d68b9908744495b
#Bitcoin #Nostr #Freedom wss://relay.nostrgraph.net

Yep. Passive transfer is also a major feature.

Requiring two actively online/communicating clients is active/active and way more complex.

It’s an option anyone could use for every note, I guess, or perhaps just when relays are rejecting them with a PoW minimum. If you’re not a member, it could be an ok way for relays to generate income - or cover costs.

In early testing you need a PoW difficulty of at least 20 to start to combat spam and automated flooding. That’s more work than a laptop or mobile can do in a second, so around there may be the starting point.

Plus, if you do the PoW once, ideally your event is now accepted by any relay with a minimum PoW that’s the same or lower.
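For anyone curious, the relay-side check is just counting leading zero bits of the event id, NIP-13 style. A minimal sketch in Rust (function names are mine, not from any particular client or relay):

```rust
// Count the leading zero bits of a hex-encoded event id (NIP-13 style difficulty).
fn pow_difficulty(event_id_hex: &str) -> u32 {
    let mut bits = 0;
    for c in event_id_hex.chars() {
        let nibble = c.to_digit(16).unwrap_or(0xF); // non-hex chars count as no work
        if nibble == 0 {
            bits += 4;
        } else {
            bits += nibble.leading_zeros() - 28; // only the low 4 bits of the u32 matter
            break;
        }
    }
    bits
}

// A relay with a minimum PoW accepts any event id at or above that difficulty.
fn meets_min_pow(event_id_hex: &str, min_pow: u32) -> bool {
    pow_difficulty(event_id_hex) >= min_pow
}
```

That’s what makes the work reusable: an id with 20+ leading zero bits clears every relay whose minimum is 20 or below.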

#[6]

Well, alternatively I’m thinking in the next 6 months relays will be forced to start dropping older or less ‘valuable’ events.

Large databases are not fun. And unless someone is paying you for storage, I’d expect the content to have an expiry anyway.
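The expiry itself is the easy part if created_at is indexed. A minimal sketch with rusqlite, assuming a hypothetical events(created_at) table; a real schema and retention policy would be more involved:

```rust
use rusqlite::{params, Connection, Result};

/// Drop everything older than `cutoff` (unix seconds).
/// Assumes a simple `events(created_at INTEGER)` table -- purely illustrative.
fn purge_expired(conn: &Connection, cutoff: i64) -> Result<usize> {
    conn.execute("DELETE FROM events WHERE created_at < ?1", params![cutoff])
}
```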

Has Musk taken control of the Twitter bots to use them against Nostr?

Yeah. It’s why I’ve built that PoW Service Provider. I’ll open source it this week. It’s not 100% complete, but perhaps good enough to whitelist pubkeys and test it.
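The core of the service is just grinding a NIP-13 nonce tag until the NIP-01 event id clears the target. A rough sketch, reusing the pow_difficulty helper from the earlier sketch; a real event keeps its existing tags, and the result still has to be signed by the key holder:

```rust
use serde_json::json;
use sha2::{Digest, Sha256};

// Grind a NIP-13 nonce tag until the event id reaches `target` leading zero bits.
fn mine(pubkey: &str, created_at: i64, kind: u32, content: &str, target: u32) -> (u64, String) {
    for nonce in 0u64.. {
        let tags = json!([["nonce", nonce.to_string(), target.to_string()]]);
        // NIP-01 canonical serialization: [0, pubkey, created_at, kind, tags, content]
        let serialized = json!([0, pubkey, created_at, kind, tags, content]).to_string();
        let id = hex::encode(Sha256::digest(serialized.as_bytes()));
        if pow_difficulty(&id) >= target {
            return (nonce, id); // hand back to the client for signing
        }
    }
    unreachable!()
}
```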

It’s funny, I’ve mostly been messing with aggregation and spam stuff to help find performance weaknesses and address them - before the network 10xes, and then 10xes again. At 1000x, I think we likely start seeing the network become islands, and relays become tiered into classes.

I’ve been writing too much SQL; all my Rust comparisons have a single equals 🫤

😄

The problem with me filtering stuff is I’m not sure what I don’t see anymore, but the network does. ha.

I do have an event rejection queue, but it’s limited in size.
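Nothing fancy - just oldest-out once it hits capacity. Something like this sketch (structure and names are illustrative):

```rust
use std::collections::VecDeque;

// Bounded rejection log: once full, the oldest entry is dropped.
struct RejectionQueue {
    cap: usize,
    q: VecDeque<(String, String)>, // (event id, rejection reason)
}

impl RejectionQueue {
    fn new(cap: usize) -> Self {
        Self { cap, q: VecDeque::with_capacity(cap) }
    }

    fn push(&mut self, event_id: String, reason: String) {
        if self.q.len() == self.cap {
            self.q.pop_front();
        }
        self.q.push_back((event_id, reason));
    }
}
```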

I hate spending the time on it; spam is just wasting my time. But I’ve built some pretty solid detection across the board, and have a few more things in the works.

Literally just purged another 800k events from my db, by backtesting against new defences.

To be fair, that’s kind of the default. It’s extra work to process and remove deleted events. The code doesn’t write itself.

However, it was never intended as any guarantee.. more like a ‘please forget, and stop including in future requests’.

I’d imagine it’s not very interesting content anyway - most client apps don’t even have a UI for it.

And a major issue is that technically any deletion event would need to live forever just to check against - meaning it’s both a spam vector and a growing database cost.

Seeing way more deletion event spikes being broadcast over the past couple of days. 48k in the past 6 hours - mostly over a 15 minute period.

I haven’t validated the deletes beyond their being valid JSON events.
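Proper validation would at least check NIP-09 structure - kind 5 with e tags - before even getting to signatures and ownership. A sketch of that structural check with serde_json (the helper name is mine):

```rust
use serde_json::Value;

// Minimal structural check for a NIP-09 deletion event: kind 5 with at
// least one `e` tag. Signature and ownership checks still needed on top.
fn deletion_targets(event: &Value) -> Option<Vec<String>> {
    if event.get("kind")?.as_u64()? != 5 {
        return None;
    }
    let targets: Vec<String> = event
        .get("tags")?
        .as_array()?
        .iter()
        .filter_map(|t| t.as_array())
        .filter(|t| t.first().and_then(Value::as_str) == Some("e"))
        .filter_map(|t| t.get(1).and_then(Value::as_str).map(String::from))
        .collect();
    if targets.is_empty() { None } else { Some(targets) }
}
```

Ownership is the expensive bit: the targeted events must share the deletion event’s pubkey, which is exactly why old deletes have to stick around.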

Maybe a new delete-everything tool, someone testing, spam.. not sure

I’m just lowering my standards.. autocorrect isn’t much better.

We decided it was best to trust politicians and government (read: legal corruption) with our life (read: singular period of existence).

They have only your personal best interest at heart, and seek only for you to live a happy and fulfilling life (under their personally prosperous rule, and at your expense).

The reason bitcoin can’t fail is oppression never lasts. It’s a shared idea now, regardless of technical implementations.

But also just yin and yang.

I’ve got a pretty large training set of 13k spam events (with some dupes). You could filter the ones labelled as spam, perhaps hash the content into a set, and then check membership?

I also have around 28k pubkeys flagged as spam I can share directly. You could review them and then delete their events.

Failing those, you could use the ML to get spam scores.. but it’s likely more computationally expensive.
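For the exact-match path, it’s just a hashed-content set plus a pubkey blocklist, gating incoming events on both. A sketch (CSV loading omitted; the struct is illustrative):

```rust
use sha2::{Digest, Sha256};
use std::collections::HashSet;

struct SpamFilter {
    content_hashes: HashSet<[u8; 32]>, // sha256 of known spam content
    banned_pubkeys: HashSet<String>,   // flagged pubkeys (hex)
}

impl SpamFilter {
    fn hash(content: &str) -> [u8; 32] {
        Sha256::digest(content.as_bytes()).into()
    }

    fn is_spam(&self, pubkey: &str, content: &str) -> bool {
        self.banned_pubkeys.contains(pubkey)
            || self.content_hashes.contains(&Self::hash(content))
    }
}
```

Exact hashing misses mutated spam, though - which is where the ML scoring earns its compute.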

I’ve just purged around 2.8MM spam events. Some I can’t detect easily yet - like bogus reactions and reposts. I see them in network traffic; I just can’t do anything automatically.

https://github.com/blakejakopovic/nostr-spam-detection/blob/master/labelled_nostr_events_20230225000.csv