PREFERRED UNCENSORED FAST LOCAL AI LLM? #AskNostr https://youtu.be/Wjrdr0NU4Sk
Discussion
Too expensive to do at home decently
I don't think so. I've just set up a £200 machine that's running multiple small local LLMs at acceptable speeds.
eBay Lenovo P500, Xeon E5-2697 v3, SSD, GTX 1650 GPU, 32GB RAM, ZimaOS, OpenWebUI
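If you want to script against it instead of using the chat UI, a minimal sketch like this should work, assuming OpenWebUI is sitting in front of an Ollama backend on its default port 11434 and a small model is already pulled (the model tag below is just an example, swap in whatever you have locally):

```python
import json
import urllib.request

# Ask the local Ollama backend (default port 11434) for a completion.
# Assumes a small model such as "llama3.2:1b" has already been pulled.
payload = {
    "model": "llama3.2:1b",  # example tag only; use any model you have locally
    "prompt": "Explain what a local LLM is in one sentence.",
    "stream": False,         # return the whole answer in a single response
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```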
Idk man you can probably use like 98 dogecoin to get a phone that runs the smallest PocketPal bundled model for at least 1 prompt before resetting
Anything Dolphin seems a decent choice, but I'm a newb to this.
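If you're on an Ollama-style setup like the one above, grabbing a Dolphin variant is a one-liner; "dolphin-mistral" is just one example tag from the public library, check there for others:

```python
import subprocess

# Pull an uncensored Dolphin variant through the Ollama CLI (example tag only).
subprocess.run(["ollama", "pull", "dolphin-mistral"], check=True)
```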
I like Danube