I started using Ollama 2 days ago. Pretty easy to get up and running.

I'm a little disappointed, though, at the size of the models I can run at a useful speed.

The 405B is completely out of my grasp.
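For a sense of why the 405B is out of reach on consumer hardware, here is a rough back-of-the-envelope sketch of the memory needed just to hold the weights at common quantization levels (it ignores KV cache and runtime overhead, which add more on top):

```python
# Rough memory estimate for model weights alone.
# params_b: parameter count in billions; bits_per_param: precision/quantization.
def weight_gb(params_b: float, bits_per_param: float) -> float:
    return params_b * 1e9 * bits_per_param / 8 / 1e9

# A 405B-parameter model at 16-bit, 8-bit, and 4-bit precision.
for bits in (16, 8, 4):
    print(f"405B @ {bits}-bit: ~{weight_gb(405, bits):.1f} GB")
```

Even aggressively quantized to 4 bits, that's roughly 200 GB of weights, which is why the 70B-class models are where most local setups top out.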

Discussion

We are still early. Hopefully it will become easier in time, but the 405B isn't the only choice; the smaller models are decent. What could be even better is running it as a service for sats. Openagents is still running Llama 3 70B, but they are moving fast.

Lotta things moving fast these days! ty