Depending on how old it is, the best thing to try now is to install Ollama, run a small-ish model such as phi3-mini, and see if you like it.
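For reference, a minimal sketch of what that first test could look like, assuming the Ollama server is running locally and you've already pulled the model (e.g. `ollama pull phi3`); the Python client (`pip install ollama`) just wraps the local HTTP API, and the prompt text here is only a placeholder:

```python
# Minimal sketch: chat with a locally pulled phi3 model via the ollama Python client.
# Assumes the Ollama server is running (default: http://localhost:11434)
# and the model was fetched beforehand with `ollama pull phi3`.
import ollama

response = ollama.chat(
    model="phi3",  # small-ish model; swap in whichever tag you've pulled
    messages=[{"role": "user", "content": "Give me a one-paragraph test answer."}],
)

# Print the assistant's reply from the response.
print(response["message"]["content"])
```

If that feels responsive enough on your hardware, you can try progressively larger models the same way.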


Discussion

I want to use Llama 3.1 with 405 billion params though 😂

Actually, maybe the 70-billion-param one would be fine 😂

Build a 3x 3090 machine. Keep us updated 😁