my macbook is old and can barely run the primal web app. so i doubt it would be good enough to run anything. 😂

Discussion

Depending on how old it is, the best thing to try now is to install ollama, try a small-ish model such as phi3-mini, and see if you like it
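If you want to give it a shot, the ollama CLI is roughly this (the `phi3:mini` tag is an assumption; check the ollama model library for the current name):

```shell
# On macOS: install via Homebrew (or download the app from ollama.com)
brew install ollama

# Pull the model and start an interactive chat in one step
ollama run phi3:mini
```

If it's too slow or the machine swaps, that's your answer about the bigger models.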

i want to use llama 3.1 with 405 billion params though 😂

actually, maybe the 70 billion param one would be fine 😂
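For scale, here's a rough back-of-envelope on weight memory alone (weights only; ignores KV cache and runtime overhead, so real usage is higher):

```python
def weight_gib(params_billion: float, bytes_per_param: float) -> float:
    """Approximate memory for model weights in GiB."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# fp16 = 2 bytes/param, 4-bit quantized ≈ 0.5 bytes/param
for params in (405, 70, 3.8):  # 3.8B is roughly phi3-mini
    for label, bpp in (("fp16", 2.0), ("4-bit", 0.5)):
        print(f"{params}B @ {label}: ~{weight_gib(params, bpp):.0f} GiB")
```

Even 4-bit quantized, 405B needs ~190 GiB just for weights, while a 4-bit 70B is ~33 GiB.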

Build a 3x 3090 machine. Keep us updated 😁