I maintain a cache of kind 3, 1984, and 10000 notes that is updated continuously. I also maintain a graph database with pubkeys as nodes and each follow and mute as an edge. (Reports will be next.) Whenever a new kind 3 or 10000 note replaces an old one, the graph is updated, with individual edges added or removed as needed.
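For concreteness, here's a minimal sketch of what that edge update could look like, assuming a Neo4j backend with User nodes keyed by pubkey and FOLLOWS relationships. The labels, property names, and the helper function are illustrative assumptions, not the actual implementation:

```python
# Minimal sketch: diff a replaced kind 3 (follow list) note into the graph.
# Assumes (:User {pubkey}) nodes and [:FOLLOWS] edges; names are illustrative.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def apply_follow_list(author_pubkey: str, new_follows: set[str]) -> None:
    """Add FOLLOWS edges that appear in the new note and remove edges
    that no longer appear in it."""
    with driver.session() as session:
        # Remove follows that are no longer present in the new kind 3 note.
        session.run(
            """
            MATCH (a:User {pubkey: $author})-[r:FOLLOWS]->(b:User)
            WHERE NOT b.pubkey IN $follows
            DELETE r
            """,
            author=author_pubkey, follows=list(new_follows),
        ).consume()
        # Add (or keep) an edge for every pubkey in the new list.
        session.run(
            """
            MERGE (a:User {pubkey: $author})
            WITH a
            UNWIND $follows AS f
            MERGE (b:User {pubkey: f})
            MERGE (a)-[:FOLLOWS]->(b)
            """,
            author=author_pubkey, follows=list(new_follows),
        ).consume()
```

The same pattern would apply to kind 10000 notes with a MUTES relationship type instead of FOLLOWS.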
Discussion
There are tradeoffs to maintaining a nostr cache versus downloading events on the fly every time. The former is more performant; the latter is perhaps more “pure.”
Yeah, that's how we do it too. I don't get the 5 min. You shared previously that PageRank takes about 15 s. What am I missing?
Yup, about 15 seconds. The 5 min (probably a little less) is to run three algos:
1. Calculate DoS, the degree of separation from the reference user to every other user
2. PageRank for every user, currently takes about 15 seconds (a rough sketch of steps 1 and 2 follows this list)
3. GrapeRank for every user, currently takes about 60 seconds
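Roughly, steps 1 and 2 might look like this against Neo4j with the Graph Data Science library, assuming the same User/FOLLOWS schema as above. The 'wot' projection name, the 10-hop cap, and the function names are assumptions, not the actual code:

```python
# Rough sketch of steps 1 and 2; projection name and hop limit are assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def degrees_of_separation(ref_pubkey: str) -> dict[str, int]:
    """Step 1: hop count along FOLLOWS edges from the reference user."""
    with driver.session() as session:
        result = session.run(
            """
            MATCH p = shortestPath(
                (ref:User {pubkey: $ref})-[:FOLLOWS*..10]->(u:User))
            RETURN u.pubkey AS pubkey, length(p) AS dos
            """,
            ref=ref_pubkey,
        )
        return {row["pubkey"]: row["dos"] for row in result}

def pagerank_scores() -> dict[str, float]:
    """Step 2: PageRank over the follow graph via the GDS library."""
    with driver.session() as session:
        # One-time in-memory projection of the follow graph.
        session.run("CALL gds.graph.project('wot', 'User', 'FOLLOWS')").consume()
        result = session.run(
            """
            CALL gds.pageRank.stream('wot')
            YIELD nodeId, score
            RETURN gds.util.asNode(nodeId).pubkey AS pubkey, score
            """
        )
        return {row["pubkey"]: row["score"] for row in result}
```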
Then all scores are consolidated into a single ~20 MB JSON that can be accessed via API and put into a table for exploration (sketched further below).
So maybe more like ~3 min total for now. (GrapeRank might take longer under certain conditions, or I might get it to go faster if I implement it with the Neo4j Graph Data Science Pregel API, but I haven’t tackled that yet.)
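The consolidation step is little more than merging the per-user scores and dumping them to disk; something like the sketch below, with field names assumed:

```python
# Sketch of the consolidation step: merge per-user scores from the three
# algorithms into one JSON blob that an API can serve. Field names are assumed.
import json

def consolidate(dos: dict[str, int],
                pagerank: dict[str, float],
                graperank: dict[str, float],
                out_path: str = "scores.json") -> None:
    pubkeys = set(dos) | set(pagerank) | set(graperank)
    merged = {
        pk: {
            "dos": dos.get(pk),
            "pageRank": pagerank.get(pk),
            "grapeRank": graperank.get(pk),
        }
        for pk in pubkeys
    }
    with open(out_path, "w") as f:
        json.dump(merged, f)
```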
You can try it and tell me if it works for you! (Works for me on my laptop, but I haven’t tested it thoroughly, so I can’t promise it won’t break.)