Ollama is an open-source project that serves as a powerful and user-friendly platform for running LLMs on your local machine.

https://ollama.com/library/llama3.1
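Once a model is pulled (e.g. `ollama pull llama3.1`), Ollama serves it through a local REST API, by default on port 11434. A minimal sketch of querying the `/api/generate` endpoint from Python, using only the standard library (the prompt text here is just an example):

```python
import json
from urllib import request

# Build a request for Ollama's local /api/generate endpoint.
# Assumes the default port (11434) and that "llama3.1" has
# already been pulled with `ollama pull llama3.1`.
payload = {
    "model": "llama3.1",
    "prompt": "Why is the sky blue?",
    "stream": False,  # return one JSON object instead of a token stream
}

req = request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the Ollama server is running locally:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

Setting `"stream": False` trades incremental output for a single, easy-to-parse response body.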

Discussion

I started using Ollama 2 days ago. Pretty easy to get up and running.

A little disappointed though at the size of the models I can run at a useful speed.

405b is completely out of my grasp.

We are still early. Hopefully it will become easier in time, but the 405B isn't the only choice; the others are decent. What could be even better is running it as a service for sats. Openagents is still running Llama 3 70B, but they are moving fast.

lotta things moving fast these days! t-y^

I don't think I could run this on my machine. It's so laggy with LM Studio, I had to give up.

Pretty happy with Venice.ai since they enabled 405B.