Replying to 56c4201e...

A colleague of mine pointed me to a nice use case for locally running LLMs:

Available offline "documentation".

Yes, LLMs hallucinate a lot, but depending on the model, one can be a useful backup for documentation or similar resources when you have no internet connection available at the moment.

Might be better than having nothing at all if you need to get some work done.
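For instance, a minimal offline workflow with a local runner like Ollama might look like this (the model name is just an example; pull whatever fits your hardware while you still have connectivity):

```shell
# While online: fetch a model once so it is cached locally.
ollama pull llama3

# Later, with no internet: ask it documentation-style questions.
ollama run llama3 "What does the C function strtok do, and is it thread-safe?"
```

As with any LLM answer, treat the response as a starting point and verify it against the real docs once you are back online.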

#development #ai #llm #local #selfhosted #backup

TerrestrialOrigin 10mo ago

Are you talking about asking it questions about how to get something done in an API when you can't look up documentation on that API?


Discussion

56c4201e... 10mo ago

nostr:nprofile1qy2hwumn8ghj7un9d3shjtnddaehgu3wwp6kyqpq6ga0ftymhu7fuhz5n8sn9hf9yft42x5q2x5wg986t8m5z7ls0tdsmmeu3n For example, yes.

TerrestrialOrigin 10mo ago

Yeah, I ask it rather than search through the API docs for an hour anyway. If it tells me the wrong thing, it just won't work and I'll know right away, and then I'll look in the API docs.
