Been using Ollama for text summaries; so far the mistral-small3.2, mistral-nemo, and gemma3 models seem to do the best job.
They can all still hallucinate stuff into the summaries, and I'm not sure how to minimize the chance of that happening.
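One thing that seems to help is lowering the sampling temperature and telling the model to stick strictly to the source text. Here's a minimal sketch against Ollama's local REST API (assuming the default endpoint on port 11434 and the mistral-nemo model; the prompt wording and settings are just my guesses, not a guaranteed fix):

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint

def summarize(text: str, model: str = "mistral-nemo") -> str:
    """Ask a local Ollama model for a summary, biased toward staying grounded in the input."""
    prompt = (
        "Summarize the following text in a few sentences. "
        "Only use information that appears in the text; if something is unclear, omit it "
        "rather than guessing.\n\n"
        f"TEXT:\n{text}"
    )
    resp = requests.post(
        OLLAMA_URL,
        json={
            "model": model,
            "prompt": prompt,
            "stream": False,
            # Low temperature makes output more deterministic and less prone to invention.
            "options": {"temperature": 0},
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(summarize("Ollama is a tool for running large language models locally."))
```

It won't eliminate hallucinations, but in my experience a strict prompt plus temperature 0 cuts down on the made-up details.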
Yeah, I really want to add other cool local AI features, but I'm not sure what would be really useful. Maybe I'll let them add a quote below each note just for giggles 😂