“There’s a technique in AI called distillation, which you’re going to hear a lot about. It’s when one model learns from another model,” Sacks explained to Fox News. “Effectively, the student model asks the parent model millions of questions, mimicking the reasoning process and absorbing knowledge.”
“They can essentially extract the knowledge out of the model,” he continued. “There’s substantial evidence that what DeepSeek did here was distill knowledge from OpenAI’s models.”

“I don’t think OpenAI is too happy about this,” Sacks added.
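What Sacks is describing is knowledge distillation. A minimal sketch of the classic form (the student trained to match the teacher’s softened output distribution, per Hinton et al., 2015) appears below; the toy models, dimensions, and temperature are illustrative assumptions, and distillation against a hosted API like OpenAI’s would work from sampled answers rather than raw logits.

```python
# Minimal sketch of classic knowledge distillation: the student model is
# trained to match the teacher model's softened output distribution.
# The stand-in models, sizes, and temperature below are illustrative only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions."""
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2

vocab_size = 1000
teacher = torch.nn.Linear(64, vocab_size)  # stand-in for a large "parent" model
student = torch.nn.Linear(64, vocab_size)  # stand-in for a small student model

x = torch.randn(8, 64)                     # a batch of "questions" (inputs)
with torch.no_grad():
    teacher_logits = teacher(x)            # the teacher "answers"; no gradients
loss = distillation_loss(student(x), teacher_logits)
loss.backward()                            # the student absorbs the knowledge
```

Repeated over millions of such queries, the student gradually reproduces the teacher’s behavior, which is the “extraction” Sacks refers to.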