Distillation is the practice of training smaller AI models on the outputs of more advanced ones. This allows developers to ...
Microsoft researchers have developed On-Policy Context Distillation (OPCD), a training method that permanently embeds enterprise system prompt instructions into model weights, reducing inference ...
Chinese artificial intelligence lab DeepSeek roiled markets in January, setting off a massive tech and semiconductor selloff after unveiling AI models that it said were cheaper and more efficient than ...
Anthropic is accusing three Chinese artificial intelligence companies of "industrial-scale campaigns" to "illicitly extract" ...
The original version of this story appeared in Quanta Magazine. The Chinese AI company DeepSeek released a chatbot earlier this year called R1, which drew a huge amount of attention. Most of it ...
This is a guest post for the Computer Weekly Developer Network written by Jarrod Vawdrey in his ...
Things are moving quickly in AI — and if you're not keeping up, you're falling behind. Two recent developments are reshaping the landscape for developers and enterprises alike: DeepSeek's R1 model ...

What is AI Distillation?

Distillation, also known as model or knowledge distillation, is a process where knowledge is transferred from a large, complex AI ‘teacher’ model to a smaller and more efficient ‘student’ model. Doing ...
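To make the teacher–student idea concrete, here is a minimal sketch of the classic soft-target distillation loss in PyTorch. The tiny architectures, the temperature, and the loss weighting are illustrative assumptions for this sketch only, not details of any of the systems mentioned above.

```python
# Minimal knowledge-distillation sketch (assumptions: PyTorch, toy classifiers, dummy data).
import torch
import torch.nn as nn
import torch.nn.functional as F

temperature = 2.0   # softens the teacher's output distribution
alpha = 0.5         # weight between soft-target loss and hard-label loss

# Hypothetical teacher (large) and student (small) classifiers.
teacher = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, 10)).eval()
student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

# Dummy batch standing in for real training data.
inputs = torch.randn(32, 128)
labels = torch.randint(0, 10, (32,))

with torch.no_grad():                      # teacher is frozen; it only provides targets
    teacher_logits = teacher(inputs)

student_logits = student(inputs)

# Soft-target loss: the student matches the teacher's softened distribution (KL divergence).
soft_loss = F.kl_div(
    F.log_softmax(student_logits / temperature, dim=-1),
    F.softmax(teacher_logits / temperature, dim=-1),
    reduction="batchmean",
) * (temperature ** 2)

# Hard-target loss: standard cross-entropy against the ground-truth labels.
hard_loss = F.cross_entropy(student_logits, labels)

loss = alpha * soft_loss + (1 - alpha) * hard_loss
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

In this sketch, only the student's weights are updated; the teacher's outputs act as "soft labels" that carry more information per example than the hard class labels alone, which is why a smaller student can recover much of the larger model's behaviour.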