Replying to Laeserin

“There’s a technique in AI called distillation, which you’re going to hear a lot about. It’s when one model learns from another model,” Sacks explained to Fox News. “Effectively, the student model asks the parent model millions of questions, mimicking the reasoning process and absorbing knowledge.”

“They can essentially extract the knowledge out of the model,” he continued. “There’s substantial evidence that what #DeepSeek did here was distill knowledge from OpenAI’s models.”

“I don’t think #OpenAI is too happy about this,” Sacks added.

https://www.zerohedge.com/ai/us-navy-bans-deepseek-over-security-concerns-substantial-evidence-emerges-chinese-ai-ripped
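For anyone curious what “the student model asks the parent model millions of questions” looks like mechanically, here is a minimal distillation sketch in PyTorch. The tiny networks, random inputs, and temperature value are illustrative assumptions for the sake of a runnable example, not a claim about how DeepSeek or OpenAI actually operate.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-ins: a larger "teacher" (parent) and a smaller "student".
teacher = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0  # temperature: softens the teacher's distribution

for step in range(1000):
    x = torch.randn(64, 32)  # random inputs, a stand-in for "millions of questions"
    with torch.no_grad():
        teacher_logits = teacher(x)   # the teacher answers; its weights never move
    student_logits = student(x)       # the student attempts the same questions
    # KL divergence between softened distributions: the student mimics the
    # teacher's full output distribution, not just its single top answer.
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The key point is that the student never sees the teacher’s weights; it only needs the teacher’s outputs, which is why query access alone can be enough to “extract the knowledge out of the model.”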

Dhay 🏴‍☠️ 11mo ago

This opens the possibility of distributed AIs, all of them learning from each other.
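A rough sketch of what that mutual learning could look like, in the spirit of deep mutual learning: two peer models take turns acting as teacher and student for each other. Everything here (the peer count, layer sizes, and synthetic data) is a hypothetical illustration, not a description of any deployed system.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Two hypothetical peers of the same size; in a distributed setting these
# could live on different machines and exchange only their predictions.
peers = [nn.Sequential(nn.Linear(32, 24), nn.ReLU(), nn.Linear(24, 10))
         for _ in range(2)]
opts = [torch.optim.Adam(p.parameters(), lr=1e-3) for p in peers]

for step in range(500):
    x = torch.randn(64, 32)                  # shared synthetic inputs
    y = torch.randint(0, 10, (64,))          # stand-in ground-truth labels
    logits = [p(x) for p in peers]
    for i in range(2):
        peer_out = logits[i]
        other = logits[1 - i].detach()       # the other peer acts as teacher
        # Each peer fits the labels while also mimicking its peer's
        # output distribution, so knowledge flows in both directions.
        loss = F.cross_entropy(peer_out, y) + F.kl_div(
            F.log_softmax(peer_out, dim=-1),
            F.softmax(other, dim=-1),
            reduction="batchmean",
        )
        opts[i].zero_grad()
        loss.backward()
        opts[i].step()
```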

