Well, without AI scrapers, almost all FOSS would have no traffic on their repos, so there's that.

How long GitHub will continue to foot the bill for microapps with three faithful users, and for "Hello world!" attempts by people destined to become vibers, is the only open question. That's a lot of storage cost for dead code.

What we can definitely see is that everything is an infrastructure and services play, not a code play. Vibe-code your website into existence... and put it... where?


Discussion

Now they're all saying, well, we'll just put it on a Blossom server or a relay. So there! We have beaten the system. Nobody needs to run infrastructure anymore.

Umm... Blossom servers and relays are also servers. Someone is running them. You seem to have missed some sort of memo.

there have to be spiders and search engines

ideally, spidering load is minimised by multiple indexes sharing data with each other

this is going to be essential, and it's also why having LLMs trained on this spidered data will be essential. It needs to be compounding: every node can take in new data, which modifies its model and updates it to the current state

there is also the possibility of making nostr hooks, so that when you push, the activity goes to relays that aggregate git activity events, and you don't even need spiders
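A minimal sketch of what such a hook could emit: an unsigned Nostr event announcing a push. The event kind (123) and the tag names ("branch", "commit") are hypothetical placeholders, not a ratified NIP; only the id computation follows NIP-01 (sha256 over the serialized `[0, pubkey, created_at, kind, tags, content]` array).

```python
# Hypothetical git post-receive hook body: build an unsigned Nostr event
# announcing a push. Kind and tag names are placeholders, not a standard.
import hashlib
import json
import time

def build_push_event(pubkey_hex, repo_url, branch, commit_id, message):
    """Build an unsigned NIP-01 event dict announcing a git push."""
    created_at = int(time.time())
    kind = 123  # hypothetical "git activity" kind
    tags = [
        ["r", repo_url],      # repository the push targets
        ["branch", branch],   # hypothetical tag names
        ["commit", commit_id],
    ]
    content = message
    # NIP-01: id = sha256 of the compact-serialized
    # [0, pubkey, created_at, kind, tags, content] array
    serialized = json.dumps(
        [0, pubkey_hex, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return {
        "id": hashlib.sha256(serialized.encode("utf-8")).hexdigest(),
        "pubkey": pubkey_hex,
        "created_at": created_at,
        "kind": kind,
        "tags": tags,
        "content": content,
        # "sig" would be added by Schnorr-signing the id with the repo key
    }

event = build_push_event(
    "ab" * 32, "https://example.com/repo.git", "main", "c0ffee", "fix: typo"
)
print(event["kind"], len(event["id"]))
```

A real hook would then sign the event and send it over a websocket as `["EVENT", event]` to whichever relays aggregate git activity; the relay side is unchanged stock infrastructure.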

that's something we can do that they can't do

Yeah, being able to find all git commits over events will be cool.