Replying to Prisoner24601

This may be a silly question, but can someone explain how a large language model "learns" from sets of data?

I'm imagining a scenario where, let's say, scientists who study a rare animal like the Sumatran rhino could feed an AI model their daily reports for years, and the AI could then be a major help in summarizing all that data together, in various ways, to distill insights.

Is this possible with a large language model?

Can large databases be searched and summarized in this example?

Could existing databases be fed to AI models?

How do I go about getting my own AI model and testing it with specialized training?

#asknostr #nostr #nostriches #plebs #plebchain #grownostr #ai

Zapgur 2y ago

One popular way to go about it is called RAG (retrieval-augmented generation): basically, you maintain a knowledge base of any type of data and chat with an LLM about it. You can try nuclia.com.
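
To make that concrete, here's a minimal sketch of the RAG loop in Python. This is just my illustration: real systems score relevance with embeddings rather than the simple keyword overlap used here, and ask_llm is a hypothetical placeholder for whatever model or API you end up using.

```python
# Minimal RAG sketch: retrieve the most relevant reports, then ask the model.
# Assumes each daily report is a plain-text string.

def retrieve(reports, question, k=3):
    """Score reports by word overlap with the question; return the top k."""
    q_words = set(question.lower().split())
    ranked = sorted(
        reports,
        key=lambda r: len(q_words & set(r.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def ask_llm(prompt):
    """Placeholder: swap in a call to whatever LLM you choose."""
    raise NotImplementedError("plug in your model or API here")

def answer(reports, question):
    """Build a prompt from the retrieved reports and hand it to the model."""
    context = "\n\n".join(retrieve(reports, question))
    prompt = (
        "Using only these field reports:\n\n"
        f"{context}\n\n"
        f"Question: {question}"
    )
    return ask_llm(prompt)
```

The shape is the point: instead of training the model on your data, you fetch the few reports most relevant to each question and pass only those to the model as context, so the knowledge base can keep growing without retraining anything.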

