In a bid to protect its crown jewels, OpenAI is now requiring government ID verification for developers who want access to its most advanced AI models.
While the move is officially about curbing misuse, a deeper concern is emerging: that OpenAI’s own outputs are being harvested to train competing AI systems.
A new research paper from Copyleaks, a company that specializes in AI content detection, offers evidence of why OpenAI may be acting now. Using a system that…
Article Source: https://www.businessinsider.com/openai-tightens-access-evidence-ai-model-mimicry-deepseek-2025-4