It blows my mind how local AI language models are only 5 gigabytes.
Discussion
Yea same! Crazy how much knowledge can be stored that way. It's not "exact" knowledge, but it would help me in an emergency where there is no Internet anymore. It's like a little companion who knows a little about everything.
Ha! I have thought of the same scenario.
In some strange future where I'm trusting my low-parameter model to guide me and hoping it's not hallucinating again!
that would be cool to feed one the entirety of wikipedia (and talk pages)
One might even call LLMs advanced compression algorithms ^^
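There's something to that. A rough back-of-envelope sketch of the "compression" framing, with all the corpus-size figures below being loose assumptions for illustration (not measured numbers):

```python
# Rough sketch: how "compressed" is a ~5 GB local model relative to the
# text it was trained on? All sizes are assumed ballpark figures.
wikipedia_text_gb = 22        # assumption: English Wikipedia as plain text
model_size_gb = 5             # the ~5 GB local model from the thread
training_corpus_gb = 10_000   # assumption: a web-scale training corpus

# Ratio vs. Wikipedia alone, and vs. a full web-scale corpus.
# It's lossy "compression" -- the model can't reproduce the text exactly,
# which is why it sometimes hallucinates instead of recalling.
vs_wiki = wikipedia_text_gb / model_size_gb
vs_corpus = training_corpus_gb / model_size_gb
print(f"vs. Wikipedia alone: {vs_wiki:.1f}x")
print(f"vs. a web-scale corpus: {vs_corpus:.0f}x (very lossy)")
```

Under those assumptions the model holds a lossy digest of thousands of times more text than it occupies on disk, which is roughly what "little companion who knows a little about everything" means in practice.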