
Discussion
They instituted 2FA to keep the other AIs out; otherwise, they'd have to shut the program down because it's just a money hole. 🕳️
yes, like the hole in their souls is a trapdoor to the abyss
Also, in before

sucks to be them
we all know that billy wants it so he can make himself a skynet, right?
because he's always been one to throw everything at the next machine-intelligence thing, and this fantasy of hyperintelligent machines just won't die, even though it literally cannot be done more efficiently than in greasy gray skull stuffing (there is already very strong evidence that the thermodynamic cost of computation has a hard limit, like lightspeed, because of heat and volume)
low key, AI is a fantasy of psychopaths, but carry on
God already made far more efficient computing long ago; silicon can never come close
in order to prevail long term, any lifeform would have to have access to God's book of everything, and as if he's gonna give that to a psychopath
Well, without AI scrapers, almost all FOSS would have no traffic on their repos, so there's that.
How long GitHub will continue to foot the bill for microapps with three faithful users, and "Hello world!" attempts by people who are destined to become vibers, is the only open question. That's a lot of storage cost for dead code.
What we can definitely see is that everything is an infrastructure and services play, not a code play. Vibe-code your website into existence... and put it... where?
Now they're all: well, we'll just put it on a Blossom server or a relay. So there! We have beaten the system. Nobody needs to run infrastructure anymore.
Umm... Blossom servers and relays are also servers. Someone is running them. You seem to have missed some sort of memo.
there have to be spiders and search engines
ideally, spidering load is minimised by multiple indexes sharing data with each other
this is going to be essential, and it's also why having LLMs trained on this spidered data will be essential; it needs to be compounding, so every node can take in new data, fold it into its model, and stay updated for the current state
there is also the possibility of making nostr hooks, so when you push stuff it goes to relays that aggregate git activity events, and you don't even need spiders
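Something like the sketch below, maybe: a git post-receive hook that turns a push into a Nostr-style event. The kind number and tag names are placeholders rather than anything from a published NIP, and the BIP-340 signing step is left to whatever Nostr library you actually use, so this only builds the unsigned event and computes its NIP-01 id.

```python
#!/usr/bin/env python3
# Rough sketch of a post-receive hook that turns a git push into a Nostr-style
# event. Kind 30678 and the tag names are placeholders, not a published NIP;
# signing (BIP-340 Schnorr over the event id) is left to a real Nostr library,
# so this only builds and prints the unsigned event.
import hashlib
import json
import subprocess
import sys
import time

PUBKEY = "0" * 64  # placeholder hex pubkey; a real hook would load your key


def nip01_event_id(pubkey, created_at, kind, tags, content):
    """Event id per NIP-01: sha256 of the canonical JSON serialization."""
    payload = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


def main():
    # post-receive hooks receive "<old-sha> <new-sha> <ref>" lines on stdin
    for line in sys.stdin:
        old, new, ref = line.split()
        # a brand-new branch has an all-zero old sha, so log the new ref alone
        rev_range = new if set(old) == {"0"} else f"{old}..{new}"
        log = subprocess.run(
            ["git", "log", "--oneline", rev_range],
            capture_output=True, text=True,
        ).stdout.strip()
        created_at = int(time.time())
        kind = 30678  # placeholder "git activity" kind
        tags = [["ref", ref], ["commit", new]]
        event = {
            "pubkey": PUBKEY,
            "created_at": created_at,
            "kind": kind,
            "tags": tags,
            "content": log,
        }
        event["id"] = nip01_event_id(PUBKEY, created_at, kind, tags, log)
        # a real hook would sign the event and send ["EVENT", event] to relays
        print(json.dumps(event, indent=2))


if __name__ == "__main__":
    main()
```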
that's something we can do that they can't do
Yeah, being able to find all git commits over events will be cool.
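For the finding part, here is a minimal sketch of pulling git-related events back off a relay, assuming the kinds I understand NIP-34 to define (30617 repo announcements, 1617 patches) and a placeholder relay URL; adjust the filter for whatever kinds your relays actually carry. Needs pip install websockets.

```python
# Minimal sketch: subscribe to a relay and print git-related events.
# Kinds 30617/1617 are assumed from NIP-34; RELAY is a placeholder URL.
import asyncio
import json

import websockets

RELAY = "wss://relay.example.com"  # placeholder relay URL


async def fetch_git_events():
    async with websockets.connect(RELAY) as ws:
        # REQ subscription with a filter for git event kinds
        await ws.send(json.dumps(
            ["REQ", "git-feed", {"kinds": [30617, 1617], "limit": 50}]
        ))
        while True:
            msg = json.loads(await ws.recv())
            if msg[0] == "EVENT":
                # EVENT messages are ["EVENT", <subscription id>, <event>]
                event = msg[2]
                print(event["kind"], event["id"][:8], event["content"][:60])
            elif msg[0] == "EOSE":
                # relay has sent everything it has for this filter
                break


asyncio.run(fetch_git_events())
```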
Quite accurate. Open source is an easy target for AI and large tech firms to exploit.
We need Dark Open Code: the binaries are free, and the code is visible to contributors and real members of the community, but the IP remains with the team.