I have been chatting with my local AI for a while, and it is pretty decent. If the internet goes down, now I have a friend to chat with!! lol

Also, I have another way to get encyclopedia-type information that I can query in English if the whole internet goes away. I have reached next-level prepping, lol. Now I may not have to store a big pile of books, because this thing holds a ton of information.

It is like running a gigantic library locally, on your PC. Of course, you have to be careful, because it has been fed with everything out there. Use your own discretion.

I discovered vast.ai. You can rent GPU servers on the cheap while playing with these.

NVIDIA RTX 4090 GPUs seem to be prosumer grade with great performance. Best value for your buck.


Discussion

Usually you wanted to pile up books because they contain distilled, clean, deep, and verified information; in the majority of cases, published information stays true for decades or centuries, with just small adjustments from time to time.

It's probably incorrect to compare LLMs to books and to favor one over the other.

It could be a super cool offline-first swap for the majority of your Google queries, like "how many continents are there", "what is the time in Sydney right now", etc. Those question-answer pairs should have their own term.