Microsoft probes if DeepSeek-linked group improperly obtained OpenAI data

Microsoft and OpenAI are investigating whether data output from the ChatGPT maker’s technology was obtained in an unauthorized manner by a group linked to Chinese artificial intelligence (AI) startup DeepSeek, Bloomberg News reported.

Microsoft’s security researchers observed individuals they believed to be connected to DeepSeek exfiltrating a large amount of data through OpenAI’s application programming interface (API) in the fall, the report said.

OpenAI’s API is the main way software developers and business customers buy OpenAI’s services.
Microsoft, OpenAI’s largest investor, notified OpenAI of the suspicious activity, according to the Bloomberg report.

DeepSeek, a low-cost Chinese AI startup positioning itself as an alternative to US rivals, sparked a tech stock selloff on Monday after its free AI assistant overtook OpenAI’s ChatGPT on Apple’s App Store in the US.

David Sacks, the White House’s AI and crypto czar, told Fox News in an interview earlier on Tuesday that it was “possible” that DeepSeek stole intellectual property from the US.

“There’s substantial evidence that what DeepSeek did here is they distilled the knowledge out of OpenAI’s models,” Sacks said.
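Distillation, as Sacks uses the term, generally means training a smaller “student” model on the outputs of a larger “teacher” model rather than on raw data. As a rough illustration only, collecting teacher outputs through an API might look like the Python sketch below; the model name, prompts, and file path are hypothetical, and this is not a claim about what DeepSeek actually did.

```python
# Illustrative sketch of API-based "distillation" data collection:
# responses from a hosted "teacher" model are saved as training data
# for a smaller "student" model. Model name, prompts, and output path
# are hypothetical.
import json
from openai import OpenAI  # official OpenAI Python client

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompts = [
    "Explain gradient descent in two sentences.",
    "Summarize the causes of the 2008 financial crisis.",
]

# Query the teacher model through the API and save prompt/response pairs
# in a simple JSONL format that a student model could later be fine-tuned on.
with open("teacher_outputs.jsonl", "w", encoding="utf-8") as f:
    for prompt in prompts:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # hypothetical model choice
            messages=[{"role": "user", "content": prompt}],
        )
        record = {
            "prompt": prompt,
            "completion": response.choices[0].message.content,
        }
        f.write(json.dumps(record) + "\n")
```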

In a statement, an OpenAI spokesperson echoed Sacks, saying that China-based companies and others are constantly attempting to replicate the models of leading US AI companies, though the spokesperson did not name DeepSeek or any other company.

“We engage in counter-measures to protect our IP, including a careful process for which frontier capabilities to include in released models, and believe as we go forward that it is critically important that we are working closely with the US government to best protect the most capable models from efforts by adversaries and competitors to take US technology.”

OpenAI did not comment directly on the Bloomberg report.

Microsoft did not immediately respond to Reuters’ request for comment outside regular business hours, while DeepSeek could not be immediately reached for comment.