Apologies if I am not following completely, but out of curiosity, what language are we talking about, and is the code on GitHub?
https://orly.dev is the address of the repo (it's a redirect to GitHub using a reverse proxy i modded for this, "go vanity imports"), and yes, the language is #golang
i have discovered in the last 18 months working on nostr relay dev that it's quite easy to temporarily cause a shitload of memory allocation, which gets applications killed by the kernel when they exhaust the available memory.
the solution usually just involves changing the algorithm to avoid piling up large amounts of data at once, and instead processing things in a pipeline where the memory gets freed properly before it builds into a giant slab of OOM (out of memory) death.
I am feeling you. For something as dynamic as Nostr, pipelining and strict memory hygiene seem like the only feasible way.
yeah, for this spider stuff, fetching events for whitelisted users on the relay, and for bulk import, there are some serious challenges in keeping memory from blowing up.
One thing that comes to mind is a project like TigerBeetle choosing Zig for deterministic, explicit memory control (no GC surprises), but their use case (a financial database) is much more predictable. For your relay's open-ended datasets, careful pipelining is probably the main solution, regardless of the language.
Are you experiencing real GC pain points or just the challenges of processing large data streams?