What are your preferred setups for running local LLMs on laptops?

Also, what are the current best methods for allowing a local LLM to fetch live data from the internet?

#asknostr #AI #llm #localAI #privacy #localLLM


Discussion

You can install Ollama and Open WebUI via Docker.
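As a rough sketch, assuming Docker is already installed, the two containers can be started like this (the ports, volume names, and model tag are illustrative choices, not the only way to do it):

```shell
# Run the Ollama server; models persist in the "ollama" named volume
docker run -d --name ollama -v ollama:/root/.ollama -p 11434:11434 ollama/ollama

# Pull a small model inside the container (pick any model you like)
docker exec ollama ollama pull llama3.2:1b

# Run Open WebUI and point it at the Ollama container
docker run -d --name open-webui -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

After that, the web UI should be reachable at http://localhost:3000 in a browser.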

Then find some small models, perhaps under 1B params.

I tried LM Studio… but it isn't fully open source. I think Ollama is the way to go 👍

Is it possible to add an online web search?

Check out Open WebUI; it has a lot of features.

https://docs.openwebui.com/
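Beyond Open WebUI's built-in web search, a common pattern is to fetch live context yourself and inline it into the prompt before calling the local model. Here is a minimal sketch against Ollama's REST API on its default port; the model tag is an assumption, and the snippet list stands in for a hypothetical search backend (SearXNG, a search API, etc.):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_prompt(question: str, snippets: list[str]) -> str:
    """Inline retrieved web snippets as context ahead of the user's question."""
    context = "\n".join(f"- {s}" for s in snippets)
    return (
        "Answer using the live context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )


def ask_ollama(prompt: str, model: str = "llama3.2:1b") -> str:
    """Send one non-streaming generate request to a local Ollama server."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # In practice these snippets would come from your search backend of choice.
    snippets = ["Example snippet fetched from a live search API"]
    print(ask_ollama(build_prompt("Summarize the latest news.", snippets)))
```

The same idea is what Open WebUI's web-search feature automates: retrieve, stuff the results into the context window, then generate.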

You may like KoboldCpp; it scales well.