Italian authorities have successfully frozen nearly €1 million (£870,000) after a high-profile businessman fell victim to a sophisticated AI-driven scam.
Officials revealed on Wednesday that fraudsters used artificial intelligence to mimic the voice of Italy’s Defence Minister, Guido Crosetto. Posing as the minister, they made urgent calls seeking financial assistance for the supposed release of kidnapped Italian journalists in the Middle East.
Prosecutors in Milan stated earlier this week that some of Italy’s most influential business figures were targeted, including fashion designer Giorgio Armani and Prada co-founder Patrizio Bertelli. However, only Massimo Moratti, the former Inter Milan owner, transferred funds, believing the request was legitimate.
Authorities initially feared that recovering the stolen money would be difficult. On Wednesday, however, they traced the funds to a Dutch bank account and froze them.
“I’m very pleased that the money fraudulently taken from an entrepreneur, using my falsified voice and name, has been traced and completely frozen,” Crosetto said on X (formerly Twitter). “Excellent work by the magistrates and police forces.”
How the Scam Worked
The fraudsters posed as officials from the defence ministry, making calls that appeared to originate from government offices in Rome. The call was then handed over to an AI-generated imitation of Crosetto’s voice, which requested funds while insisting that the government could not be seen to be funding the transactions itself.
Moratti, who transferred nearly €1 million, filed a legal complaint after realising he had been deceived. Speaking to La Repubblica, he admitted: “It all seemed real. They were good. It could happen to anyone.”
The case has sparked serious concerns over the growing use of AI in financial fraud, highlighting the need for stricter security measures in digital communications.