Replying to SUPERMAX

Client: GPT4ALL (there are more out there now)

Been using them for a year+

For general-purpose local LLMs, I've found Falcon-7B does pretty well. Run a larger Falcon model if you have the RAM for it.
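
If you'd rather script against it than use the GUI, GPT4ALL also ships Python bindings. A rough sketch of loading a local model that way is below; the Falcon GGUF filename is just an assumption, swap in whichever model file you've actually downloaded:

from gpt4all import GPT4All

# Load a local GGUF model; GPT4ALL downloads it if it isn't already on disk.
# The filename below is an assumption -- substitute your own Falcon GGUF.
model = GPT4All("gpt4all-falcon-newbpe-q4_0.gguf")

# Run a quick prompt inside a chat session so context carries between turns.
with model.chat_session():
    print(model.generate("Summarize why local LLMs are useful.", max_tokens=200))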

I've also spent the time to fine-tune the smaller models for my use cases. It takes time, but feed them your local data and you'll thank your past self for the work you put in.
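
For anyone wondering what that fine-tuning can look like, here's a minimal sketch assuming a Hugging Face transformers + peft (LoRA) workflow, which is just one common way to do it. The base model name and the notes.txt data file are placeholders, not anything specific I used:

from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "tiiuae/falcon-7b"  # placeholder; any small causal LM you can fit works
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # Falcon has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base)

# Attach small LoRA adapters so only a few million parameters get trained.
# target_modules is Falcon-specific; other architectures need different names.
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16, target_modules=["query_key_value"], task_type="CAUSAL_LM"))

# "notes.txt" stands in for whatever local data you want the model to absorb.
data = load_dataset("text", data_files={"train": "notes.txt"})["train"]
data = data.map(lambda x: tokenizer(x["text"], truncation=True, max_length=512),
                remove_columns=["text"])

Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned", num_train_epochs=1,
                           per_device_train_batch_size=1),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()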

SUPERMAX 1y ago

JAN.AI is another great client. I've recently been testing it as well. You can load your own LLMs from Hugging Face to use there, just like with GPT4ALL.

Discussion

Javier 1y ago

This is the easiest and the fastest.
