Replying to Prisoner24601

This may be a silly question, but can someone explain how a large language model "learns" from sets of data?

I'm imagining a scenario where, let's say, scientists who study a rare animal like the Sumatran rhino could use an AI model and feed it daily reports for years; the AI could then be a major help in summarizing all that data together in various ways to distill insights.

Is this possible with a large language model?

Can large databases be searched and summarized in this example?

Could existing databases be fed to AI models?

How do I go about getting and testing my own AI model for specialized training?

#asknostr #nostr #nostriches #plebs #plebchain #grownostr #ai

Prisoner24601 2y ago

Could I use this to have my own trained AI?

https://bigscience.huggingface.co/blog/bloom


Discussion

b75b9a31... 2y ago

I'm not deeply invested in LLMs, but I hope this helps. Also, I think to use BLOOM you're going to need something like Paperspace if you don't have a PC that can run the model locally.

https://www.cloudflare.com/learning/ai/what-is-large-language-model/

https://www.infoworld.com/article/3705035/5-easy-ways-to-run-an-llm-locally.html

https://learn.microsoft.com/en-us/semantic-kernel/prompt-engineering/llm-models
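
If you want to try something hands-on, here's a minimal sketch of running a small BLOOM checkpoint locally with the Hugging Face transformers library and prompting it to summarize a batch of reports. Treat it as illustrative only: bigscience/bloom-560m is a real small sibling of BLOOM that fits on an ordinary PC (the full 176B model does not), but the reports/ folder and the prompt are made-up placeholders for the kind of daily field reports in the original question.

# Minimal local LLM summarization sketch.
# Assumes: pip install transformers torch
# The reports/ folder of daily .txt files is a hypothetical example.
from pathlib import Path
from transformers import pipeline

# bloom-560m is the smallest BLOOM checkpoint and runs in a few GB of RAM.
generator = pipeline("text-generation", model="bigscience/bloom-560m")

# Load a handful of reports; context windows are limited, so you
# can't feed years of data in one go.
reports = [p.read_text() for p in sorted(Path("reports").glob("*.txt"))[:5]]

# Base models like BLOOM only continue text, so the summarization task
# has to be framed as a text-completion prompt.
prompt = (
    "Field reports on Sumatran rhinos:\n"
    + "\n".join(reports)
    + "\nSummary of the key observations:"
)

result = generator(prompt, max_new_tokens=200, do_sample=False)
print(result[0]["generated_text"])

For the "search and summarize a large database" part of the question, the usual pattern is retrieval-augmented generation: index the documents, retrieve only the entries relevant to a query, and pass just those into the prompt, since no model's context window holds years of raw reports at once.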

iru@localhot $_ 2y ago

I heard BLOOM is absolutely terrible.
