It blows my mind that local AI language models can be only about 5 gigabytes.
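
For a rough sense of where a figure like 5 GB comes from: a model's on-disk size is roughly its parameter count times the bits stored per weight. Here is a back-of-envelope sketch in Python; the parameter counts and quantization levels are illustrative assumptions, not figures for any specific model.

```python
# Back-of-envelope: approximate on-disk size of a model's weights.
# size_bytes ≈ parameter_count * bits_per_weight / 8, ignoring file metadata.

def approx_size_gb(params: float, bits_per_weight: float) -> float:
    """Rough weight-file size in gigabytes (1 GB = 1e9 bytes)."""
    return params * bits_per_weight / 8 / 1e9

# Purely illustrative configurations (not any particular released model):
for params, bits in [(7e9, 16), (7e9, 4.5), (8e9, 4.5), (13e9, 4.5)]:
    print(f"{params / 1e9:.0f}B params @ {bits} bits/weight "
          f"≈ {approx_size_gb(params, bits):.1f} GB")
```

Roughly, a 7-8B parameter model quantized to around 4-5 bits per weight lands in the 4-5 GB range, which is where the "only 5 gigabytes" impression comes from.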

Discussion

Yeah, same! Crazy how much knowledge can be stored that way. It's not "exact" knowledge, but it would help me in an emergency where there is no Internet anymore. It's like a little companion who knows a little about everything.

Ha! I have thought of the same scenario.

In some strange future where I'm trusting my low-parameter model to guide me and hoping it's not hallucinating again!

It would be cool to feed one the entirety of Wikipedia (and the talk pages).

Could you elaborate on what you mean?

I am sure they would be trained on the entirety of Wikipedia.

I mean a free model based on free information, so you can search it in more abstract ways than its current structure allows.
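
If "search it in more abstract ways" means semantic lookup rather than keyword matching, one common approach is to embed passages and query by vector similarity. A minimal sketch, assuming the sentence-transformers package and a few toy passages standing in for Wikipedia text:

```python
# Minimal semantic-search sketch over toy passages (stand-ins for Wikipedia text).
# Assumes the sentence-transformers package is installed.
from sentence_transformers import SentenceTransformer, util

passages = [
    "Water can be made safe to drink by boiling it for at least one minute.",
    "Photosynthesis converts sunlight, water, and carbon dioxide into glucose.",
    "The printing press was introduced to Europe in the 15th century.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # small, runs locally
passage_vecs = model.encode(passages, convert_to_tensor=True)

query = "how do I purify water without electricity"
query_vec = model.encode(query, convert_to_tensor=True)

scores = util.cos_sim(query_vec, passage_vecs)[0]  # cosine similarity per passage
best = int(scores.argmax())
print(f"Best match ({scores[best].item():.2f}): {passages[best]}")
```

The query is phrased quite differently from the passage it should match, which is roughly the "more abstract" kind of search that a plain dump's structure doesn't give you.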

One might even call LLMs advanced compression algorithms ^^
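
The compression framing can be made loosely quantitative by comparing the size of a model's weights to the amount of text it was trained on. The numbers below are purely illustrative assumptions, not measurements of any real model or dataset.

```python
# Loose "compression ratio" intuition: weights vs. training text.
# All numbers here are illustrative assumptions, not real measurements.

weights_gb = 5.0          # hypothetical on-disk model size
training_tokens = 2e12    # hypothetical number of training tokens
bytes_per_token = 4       # rough average for English text

training_text_gb = training_tokens * bytes_per_token / 1e9
ratio = training_text_gb / weights_gb

print(f"Training text ≈ {training_text_gb:,.0f} GB")
print(f"Weights       ≈ {weights_gb:.0f} GB")
print(f"Ratio         ≈ {ratio:,.0f}:1 (very lossy, of course)")
```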