By Pieter Arntz
Publication Date: 2026-01-15 13:16:00
Researchers found a method to steal data which bypasses Microsoft Copilot’s built-in safety mechanisms.
The attack flow, called Reprompt, abuses how Microsoft Copilot handled URL parameters in order to hijack a user’s existing Copilot Personal session.
Copilot is an AI assistant which connects to a personal account and is integrated into Windows, the Edge browser, and various consumer applications.
The issue was fixed in Microsoft’s January Patch Tuesday update, and there is no evidence of in‑the‑wild exploitation so far. Still, it once again shows how risky it can be to trust AI assistants at this point in time.
Reprompt hides a malicious prompt in the q parameter of an otherwise legitimate Copilot URL. When the page loads, Copilot auto‑executes that prompt, allowing an attacker to run actions in the victim’s authenticated session after just a single click on a phishing link.
In other words, attackers can hide secret instructions inside the…