Replying to Laeserin

“There’s a technique in AI called distillation, which you’re going to hear a lot about. It’s when one model learns from another model,” Sacks explained to Fox News. “Effectively, the student model asks the parent model millions of questions, mimicking the reasoning process and absorbing knowledge.”

“They can essentially extract the knowledge out of the model,” he continued. “There’s substantial evidence that what #DeepSeek did here was distill knowledge from OpenAI’s models.” “I don’t think #OpenAI is too happy about this,” Sacks added.
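For anyone curious what that actually looks like in practice, here is a minimal sketch of the mechanism Sacks describes: a student model trained to mimic a frozen teacher's output distribution. This is a toy PyTorch illustration; the models, sizes, and random data are assumptions for the example, not DeepSeek's or OpenAI's actual setup.

```python
# Toy sketch of distillation: a student learns from a frozen teacher.
# Models, sizes, and data here are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 10))
student = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 10))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0  # temperature: softens distributions so more "knowledge" transfers

for step in range(1000):
    x = torch.randn(32, 16)          # the "millions of questions"
    with torch.no_grad():
        teacher_logits = teacher(x)  # teacher answers; its weights never move
    student_logits = student(x)
    # The student is trained to match the teacher's output distribution,
    # measured by KL divergence between the softened distributions.
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Note that the student never touches the teacher's training data, only its answers.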

https://www.zerohedge.com/ai/us-navy-bans-deepseek-over-security-concerns-substantial-evidence-emerges-chinese-ai-ripped

Laeserin 11mo ago

This is why it was so cheap: they didn't need any access to the underlying training data, just access to ChatGPT.

A copy of a copy.
