What do you self-host?


Discussion

Multiple models for text-to-image (t2i) and image-to-image (i2i), depending on the use case. Llama 3 for playing around, and then some code-specific models like Codestral.

What’s Codestral?

An LLM that specializes in coding, from the same folks who brought us Mistral. Early impressions are good, but I haven't put it through enough tests yet. CodeLlama is also good.

A couple of months ago, I was looking for a code helper LLM that I could run locally, and I couldn’t find anything. Are these new? How do you run them?

Ollama, with the Continue extension in VS Code.
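For anyone wanting to try it, a minimal sketch of that setup (model tags are examples; check the Ollama model library for what's current, and configure the Continue extension through its own settings to point at your local Ollama):

```shell
# Install Ollama first (see ollama.com), then pull a code model locally
ollama pull codellama:7b      # smaller model, runs on modest hardware
ollama pull codestral         # Mistral's code model, needs more RAM/VRAM

# Quick smoke test from the terminal
ollama run codellama:7b "Write a Python function that reverses a string."

# Ollama serves a local HTTP API on port 11434 by default; the Continue
# VS Code extension can be configured to use Ollama as its model provider.
```

Once Ollama is serving a model, Continue handles the editor side (inline completions and chat) without sending code to a remote API.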