Anthropic said it has identified large-scale campaigns by DeepSeek, Moonshot AI and MiniMax to illicitly extract capabilities from its Claude models.
The company said the three labs generated more than 16 million exchanges with Claude through roughly 24,000 fraudulent accounts, violating terms of service and regional access restrictions. Anthropic attributed the campaigns using IP correlations, metadata, infrastructure indicators and corroboration from industry partners.
According to Anthropic, the labs used “distillation,” a method that trains a smaller model on the outputs of a more capable one. While distillation is widely used by frontier labs to create lighter versions of their own systems, Anthropic said the technique was deployed here to replicate Claude’s reasoning, coding and tool-use capabilities at scale.
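For context, the sketch below shows the basic shape of sequence-level distillation: a small “student” model is fine-tuned on prompt-response pairs previously collected from a stronger “teacher.” The model name, data and hyperparameters are illustrative placeholders, not a description of any lab’s actual pipeline.

```python
# Minimal sketch of sequence-level distillation: fine-tune a small "student"
# model on responses previously harvested from a stronger "teacher" model.
# Model name, data and hyperparameters are illustrative placeholders.
import torch
from torch.utils.data import DataLoader
from transformers import AutoModelForCausalLM, AutoTokenizer

student_name = "distilgpt2"  # placeholder small model
tokenizer = AutoTokenizer.from_pretrained(student_name)
tokenizer.pad_token = tokenizer.eos_token
student = AutoModelForCausalLM.from_pretrained(student_name)

# Hypothetical teacher outputs: prompt/response pairs collected from a larger model.
teacher_data = [
    {"prompt": "Explain binary search step by step.",
     "response": "Binary search repeatedly halves a sorted range..."},
    {"prompt": "Write a Python function to reverse a string.",
     "response": "def reverse(s):\n    return s[::-1]"},
]

def collate(batch):
    # Concatenate prompt and teacher response; the student learns to imitate the response.
    texts = [ex["prompt"] + "\n" + ex["response"] for ex in batch]
    enc = tokenizer(texts, return_tensors="pt", padding=True, truncation=True, max_length=512)
    enc["labels"] = enc["input_ids"].clone()
    return enc

loader = DataLoader(teacher_data, batch_size=2, shuffle=True, collate_fn=collate)
optimizer = torch.optim.AdamW(student.parameters(), lr=5e-5)

student.train()
for batch in loader:
    loss = student(**batch).loss  # standard cross-entropy against the teacher's text
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

At sufficient scale, the same loop run over millions of harvested exchanges is what allows a student model to absorb much of the teacher’s behavior, which is why the volume figures Anthropic cites matter.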
DeepSeek reportedly ran more than 150,000 exchanges focused on reasoning tasks, eliciting detailed step-by-step explanations to generate training data. Moonshot conducted more than 3.4 million exchanges targeting agentic reasoning, coding and computer use.
MiniMax accounted for more than 13 million exchanges, with Anthropic detecting the activity while it was ongoing and observing traffic shifts following new model releases.
Anthropic warned that models built through illicit distillation may lack safety guardrails designed to prevent misuse in areas such as cyber operations or biological threats. The company argued that such activity could undermine US export controls by allowing foreign labs to replicate capabilities intended to be restricted.
To counter the campaigns, Anthropic said it has deployed new behavioral detection systems, strengthened account verification, shared intelligence with industry peers and authorities, and is developing product- and API-level safeguards to reduce the effectiveness of distillation without degrading service for legitimate users.
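Anthropic has not described how its behavioral detection works. Purely as a rough illustration, a provider might flag accounts whose traffic looks like bulk data harvesting: very high volume, highly templated prompts and machine-regular pacing. The features and thresholds below are invented for the example and do not reflect any real detection system.

```python
# Illustrative only: a toy heuristic for flagging "distillation-like" API usage.
# Feature names and thresholds are invented; real detection systems are far more involved.
from dataclasses import dataclass
from statistics import pstdev

@dataclass
class AccountStats:
    account_id: str
    request_count: int            # requests in the observation window
    unique_prompt_prefixes: int   # distinct prompt templates seen
    inter_request_seconds: list   # gaps between consecutive requests

def looks_like_bulk_harvesting(stats: AccountStats,
                               volume_threshold: int = 50_000,
                               template_ratio_max: float = 0.05,
                               timing_stdev_max: float = 2.0) -> bool:
    """Flag accounts combining very high volume, highly templated prompts,
    and machine-regular request pacing."""
    high_volume = stats.request_count > volume_threshold
    templated = (stats.unique_prompt_prefixes / max(stats.request_count, 1)) < template_ratio_max
    machine_paced = (len(stats.inter_request_seconds) > 1
                     and pstdev(stats.inter_request_seconds) < timing_stdev_max)
    return high_volume and templated and machine_paced

suspicious = AccountStats("acct_123", 80_000, 120, [1.0, 1.1, 0.9, 1.0])
print(looks_like_bulk_harvesting(suspicious))  # True under these toy thresholds
```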
The company said addressing large-scale distillation will require coordinated action across AI labs, cloud providers and policymakers.