A colleague of mine pointed me to a nice use case for locally running LLMs:

Available offline "documentation".

Yes, LLMs hallucinate a lot, but depending on the model, one can serve as a useful fallback for documentation or similar resources when you have no internet connection available at the moment.

It might be better than having nothing at all if you need to get some work done.

#development #ai #llm #local #selfhosted #backup


Discussion

Are you talking about asking it questions about how to get something done in an API when you can't look up documentation on that API?