AI scammer impersonates Marco Rubio to target U.S. and foreign officials

Reuters

An impostor used artificial intelligence to mimic Secretary of State Marco Rubio’s voice and writing style, contacting top U.S. and foreign officials in a bold deception campaign.

An impostor pretending to be Secretary of State Marco Rubio contacted at least five high-level officials, including three foreign ministers, a U.S. governor, and a member of Congress, according to a State Department cable obtained by The Washington Post.

The impersonation campaign began in mid-June, when the unidentified actor created a Signal account under the display name “Marco.Rubio@state.gov.” Although the name appeared official, it is not Rubio’s real email address. Using the account, the impostor left voicemails for at least two targets and sent text messages inviting others to continue the conversation on Signal, an encrypted messaging app widely used by the Trump administration for both personal and official business.

According to the cable, the campaign likely aimed to gain access to sensitive information or accounts held by the targeted officials. A senior U.S. official said the impersonator used AI-powered software to mimic Rubio’s voice and writing style convincingly enough to deceive powerful figures.

Digital forensics expert Hany Farid of the University of California, Berkeley, said such operations do not require a sophisticated actor. “You just need 15–20 seconds of audio of the person, which is easy in Marco Rubio’s case. You upload it to any number of services, click a button that says ‘I have permission to use this person’s voice,’ and then type what you want him to say,” he explained. Farid added that voicemails are particularly effective because they are not interactive, allowing AI-generated voices to remain convincing.

The State Department has launched an investigation and urged its personnel to report any impersonation attempts to the Bureau of Diplomatic Security. Non-State Department officials have been advised to alert the FBI’s Internet Crime Complaint Center.

The FBI, which in May warned that malicious actors were impersonating senior officials using AI-generated voice and text messages, declined to comment on this specific incident. The bureau’s warning cautioned that such campaigns are intended to extract sensitive information or funds from targeted individuals and institutions.

This incident follows other high-profile impersonation attempts. In May, someone breached White House Chief of Staff Susie Wiles’ phone, using her identity to place calls and send messages to senators, governors, and business executives. Although President Donald Trump dismissed the significance of that incident, saying Wiles is “an amazing woman” who “can handle it,” it spurred a joint White House and FBI investigation.

Internationally, AI-powered impersonation campaigns are on the rise. In June, Ukraine’s Security Service announced that Russian intelligence agents were impersonating their officials to recruit civilians for sabotage missions. Around the same time, Canadian authorities warned that AI-generated voice and text scams were targeting senior government officials to steal sensitive data or inject malware into networks.

These developments underscore growing concerns over AI misuse in cybersecurity. While Signal remains widely used in government circles for its reliable end-to-end encryption, experts warn that AI voice cloning makes it easier than ever to deceive targets.

The State Department cable emphasized vigilance, warning that such impersonation attempts could undermine diplomatic security. The FBI has reiterated that any message claiming to be from a senior U.S. official should never be assumed authentic without direct verification.

As AI capabilities advance rapidly, governments worldwide are grappling with the security risks posed by deepfake audio and text. For now, officials are being urged to maintain strict security protocols and to remain alert for any unexpected contact, regardless of how familiar the voice at the other end might sound.
