I've got 64GB RAM arriving today. I don't expect it to help with performance, but it should enable me to run bigger models.
Right now, I'm running Mistral through ollama. Takes a while to get responses back. My machine is a 6-core Intel-based 2018-era Alienware gaming PC with 16GB RAM.
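For reference, hitting the local model from Python looks roughly like this. A minimal sketch, assuming the ollama Python client is installed (pip install ollama) and mistral has already been pulled:

```python
# Minimal sketch: query a locally running ollama instance from Python.
# Assumes `pip install ollama` and that `ollama pull mistral` was already run.
import ollama

response = ollama.chat(
    model="mistral",
    messages=[{"role": "user", "content": "Explain BIP39 in two sentences."}],
)
print(response["message"]["content"])
```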
Banana republic confirmed
Too soon to make recommendations. All I've used so far is llama3 (7B) and Mistral at the same size. I'm downloading the bigger llama3 now.
Yes, an airgapped computer is the way to go. But what software to run on said computer?
This is where I want Sparrow to do 24th word calculations.
Different models recommend different amounts of RAM. The ~4GB models want the machine to have ~8GB of RAM; the largest ones (~40GB) recommend 32GB.
I ordered 64GB, which should be arriving today. I figure that should be enough for now. It's also the max my 2018-era desktop can handle.
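A rough way to sanity-check whether a given model file will fit before pulling it. The headroom figure and file path here are assumptions on my part, not official guidance from any of these projects:

```python
# Rough rule of thumb (an assumption, not an official requirement): a quantized
# model needs roughly its file size in RAM, plus headroom for the OS and context.
# Requires `pip install psutil`.
import os
import psutil

def fits_in_ram(model_path: str, headroom_gb: float = 4.0) -> bool:
    model_gb = os.path.getsize(model_path) / 1e9
    avail_gb = psutil.virtual_memory().available / 1e9
    return model_gb + headroom_gb <= avail_gb

# Hypothetical path to a downloaded model file.
print(fits_in_ram(os.path.expanduser("~/models/llama3-70b.gguf")))
```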
I believe you could do this with ollama, yes. It supports saving and loading models, so you could feed it a bunch of info, then begin the bot serving from that point.
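Roughly the flow I'm picturing, as a sketch: bake the background info into a system prompt via a Modelfile, create a named model from it, then serve that. The model name, prompt text, and paths below are made up, and it assumes the ollama CLI is installed with llama3 already pulled:

```python
# Sketch: bake background info into a derived ollama model, then serve it.
# Assumes the ollama CLI is on PATH and `ollama pull llama3` was already run.
import subprocess
from pathlib import Path

background = "You are a support bot. Here is the project FAQ: ..."  # made-up content

modelfile = f'''FROM llama3
SYSTEM """{background}"""
'''
Path("Modelfile").write_text(modelfile)

# Create the derived model, then run it like any other.
subprocess.run(["ollama", "create", "faq-bot", "-f", "Modelfile"], check=True)
subprocess.run(["ollama", "run", "faq-bot", "What does the project do?"], check=True)
```

Caveat: that only carries as much info as fits in the context window, so for a big pile of material you'd be looking at embeddings/RAG or fine-tuning instead.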
Yeah, I've got llama 3 running via ollama. Downloading the bigger version now. Should be able to run it once my new RAM arrives.
But mostly, I'm thinking ahead. I plan to download and try out a whole bunch of different models, all of which are extraordinarily large data files. This seems like a perfect job for BitTorrent.
Starting to run LLMs on my own hardware. These model files are often 4-40GB in size.
Where are the torrents? #asknostr
Look man, how else am I gonna compute word 24?
Commit to a fixed amount of time, set a timer, then begin. I find this helps me get started on tasks. You don't have to finish the whole thing, just put in a short allotment, like 25 minutes, then take a break, then decide if you want to go another round.
The current rule of #nostr is that if it isn't serving a Twitter-clone threaded-feed purpose, it's buggy.
Nah. It's buggy even then.
There is no such thing as "public property".
There is private property and public tyranny.
nostr:npub1hea99yd4xt5tjx8jmjvpfz2g5v7nurdqw7ydwst0ww6vw520prnq6fg9v2 Does Sparrow yet offer a way to generate the checksum word of a BIP39 seed phrase?
In the past I've had to resort to using other software to generate the checksum. Would be nice to reduce the number of tools required.
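In the meantime, here's a minimal sketch of the calculation itself, using the python-mnemonic library (pip install mnemonic): it lists every 24th word whose checksum validates against the first 23. Only ever run something like this on the airgapped box; never type a real seed into anything online:

```python
# Sketch: given the first 23 words of a BIP39 phrase, list every 24th word
# whose checksum validates. Assumes `pip install mnemonic`.
from mnemonic import Mnemonic

mnemo = Mnemonic("english")
first_23 = input("Enter the first 23 words, space-separated: ").split()

candidates = [
    word for word in mnemo.wordlist
    if mnemo.check(" ".join(first_23 + [word]))
]
print(candidates)  # for a 24-word phrase there are 8 valid final words
```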

