Running Llama locally is the only interaction with AI I am comfortable with. And it's a great experience.

What we really need are simpler ways to fine-tune the models on local data — a method an ordinary user can run without much technical setup.

I think a better/easier training framework is more important than a new model.


Discussion

ollama with webui is great

Got any good tutorials handy? I know I can just google them, but I would appreciate some input.

Don't know of good tutorials off-hand, but there are likely some around. You can put everything in Docker for easy deployment.
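For what it's worth, a minimal sketch of the Docker route mentioned above, based on the images Ollama and Open WebUI publish (container names, volume names, and the host port 3000 here are arbitrary choices, not requirements):

```shell
# Run the Ollama server, persisting downloaded models in a named volume
docker run -d \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama

# Pull a model into the running container (llama3 is just an example)
docker exec ollama ollama pull llama3

# Run Open WebUI and point it at the Ollama container via the host gateway;
# the UI then becomes reachable at http://localhost:3000
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Check the current Open WebUI README before copying this, since flags and image tags change over time.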