The FBI presented to our Forensic Accounting Special Interest Group at Chartered Accountants Australia and New Zealand a few years ago, and one piece of advice from that presentation has stayed with me; it is reiterated below. Namely: don't get tricked into paying your company's money to fraudsters. This is easy to prevent: pick up the phone and call to confirm the request is legitimate!

“The best way to avoid being exploited is to verify the authenticity of requests to send money by walking into the CEO’s office or speaking to him or her directly on the phone. Don’t rely on e-mail alone,” said Martin Licciardo, special agent, FBI Washington Field Office.

https://www.cso.com.au/article/666110/voice-ai-becomes-an-accessory-ceo-fraud-fbi-advice-just-pick-up-phone/

Voice AI becomes an accessory to CEO fraud — FBI advice to just ‘pick up the phone’

By now execs should be aware of the threat of email fraudsters impersonating them to instruct subordinates to wire funds to a scammer’s account. But now they also need to watch out for phone hustlers, aided by artificial intelligence that spoofs the voice of CEOs.

Deepfake videos depicting politicians like Barack Obama saying things he never did have sparked concerns over their potential for messing with elections and causing social chaos.

But one use of the same technology is simple fraud. And in recent years the most lucrative fraud targeting businesses is carried out by spoofing the email account of a CEO (or some other trusted or authoritative person) who instructs a subordinate, such as a financial controller, to wire funds to a fraudster’s account. It’s called business email compromise or BEC fraud and the FBI estimates it cost US businesses $1.3 billion in 2018 alone.

However, execs and financial controllers should be wary of phone calls too since commercially available AI software can allow criminals to reproduce audio that sounds convincingly like a specific individual.

That’s exactly what happened to an unnamed German energy firm with operations in the UK, as reported by the Wall Street Journal last week.

The CEO of the firm’s UK subsidiary thought he was talking to his boss at the German parent company, but was in fact talking to a scammer who had manufactured the boss’s words with an AI voice generator. The account of the incident was told by the firm’s French insurer, Euler Hermes.

The UK-based CEO was duped into transferring US$240,000, believing his boss had instructed him to urgently pay a Hungarian supplier. The funds were sent to an account in Hungary and then dispersed through accounts in Mexico.

The insurer told US media that suspects had not been named and that the money had vanished.

While widely reported as the first instance of AI being used to mimic a CEO’s voice for fraud, the BBC reported in July that Symantec had received three reports of AI-generated voice fraud resulting in multi-million dollar losses for victims.

Symantec CTO Hugh Thompson told the BBC that attackers have sufficient data to build a decent model of an exec’s voice by using audio from videos, earnings calls, and media appearances, all of which are publicly accessible via sites like YouTube.

Euler Hermes said the software was able to imitate the voice, tonality, and German accent of the CEO. The insurer intends to cover the victim’s full claim.

The case serves as a reminder that AI designed to benefit humans can equally be used to harm them. Google, for example, has shown off that its Duplex AI voice service can book a table at a restaurant without the receiver being aware they’re talking to a machine.

And using the phone is exactly what the FBI has recommended for verifying email requests for wire transfers. The bureau in 2017 urged business execs to call each other before making transfers requested by email.

“The best way to avoid being exploited is to verify the authenticity of requests to send money by walking into the CEO’s office or speaking to him or her directly on the phone. Don’t rely on e-mail alone,” said Martin Licciardo, special agent, FBI Washington Field Office.
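The out-of-band verification the FBI describes can be sketched as a simple approval rule: a wire request is only actioned after a callback on a phone number taken from a trusted internal directory, never from the request itself. This is a hypothetical illustration only; the directory, numbers, and function names below are invented, not part of any real system described in the article.

```python
from typing import Optional

# Hypothetical internal directory: employee -> phone number maintained
# by the company itself, never sourced from an incoming email.
TRUSTED_DIRECTORY = {
    "ceo@example.com": "+44-20-7946-0000",
}

def approve_wire_request(requester: str,
                         callback_number: Optional[str],
                         callback_confirmed: bool) -> bool:
    """Approve a wire transfer only if the requester was called back on
    the directory number and verbally confirmed the transfer."""
    directory_number = TRUSTED_DIRECTORY.get(requester)
    if directory_number is None:
        return False  # unknown requester: reject outright
    if callback_number != directory_number:
        return False  # callback must use the trusted number, not one from the email
    return callback_confirmed

# A request "confirmed" on a number supplied in the email itself is rejected:
print(approve_wire_request("ceo@example.com", "+36-1-555-0100", True))   # False
# A callback on the directory number with verbal confirmation passes:
print(approve_wire_request("ceo@example.com", "+44-20-7946-0000", True)) # True
```

The key design point is that the callback number comes from a channel the attacker does not control; as the article goes on to show, even a convincing voice on an inbound call proves nothing, whereas dialing a known number yourself does.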

But past apparent BEC cases have illustrated that the new technique could mean more legal costs and conflict between businesses and employees. A UK firm earlier this year sued a former employee who claimed to have fallen victim to BEC fraud.

“There’s a tension in the commercial space between wanting to make the best product and considering the bad applications that product could have,” said Charlotte Stanton, the director of the Silicon Valley office of the Carnegie Endowment for International Peace.

“Researchers need to be more cautious as they release technology as powerful as voice-synthesis technology, because clearly it’s at a point where it can be misused.”