Fraudsters use AI voice manipulation to steal £200,000
Bold social engineering attack could be the first of many powered by machine learning
Cyber criminals have used artificial intelligence (AI) and voice technology to impersonate a UK business owner, resulting in the fraudulent transfer of $243,000 (£201,000).
In March this year, an unknown hacker group is believed to have exploited AI-powered software to mimic the voice of a prominent business leader in order to fool his subordinate, the CEO of a UK-based energy subsidiary, according to the Wall Street Journal (WSJ).
The hackers were then able to convince the CEO to carry out transactions under the guise of an urgent request for funds from the firm's German parent company.
It's believed that the fraudsters phoned the UK-based CEO to demand a transfer to a Hungarian supplier. They contacted him again, still impersonating the parent company's chief executive, to reassure him the transfer would be reimbursed.
The CEO was then contacted a third time, before any reimbursement funds had appeared, to request a second urgent transfer. It was at this point the CEO became suspicious and declined to make any further payments.
The funds that were transferred to Hungary, however, were soon moved on to Mexico and various other locations, with law enforcement still looking for suspects.
This social engineering attack could be a sign of things to come, according to ESET cyber security specialist Jake Moore, who expects to see a huge rise in machine-learned cyber crimes in the near future.
"We have already seen DeepFakes imitate celebrities and public figures in video format, but these have taken around 17 hours of footage to create convincingly," he said.
"Being able to fake voices takes fewer recordings to produce. As computing power increases, we are starting to see these become even easier to create, which paints a scary picture ahead."
With enterprise security practices becoming more robust over time, criminals may increasingly target staff as the most easily exploitable gap in an organisation's defences.
Social engineering has, indeed, grown to be far more sophisticated in recent years with employees faced with slicker phishing campaigns and highly targeted attempts at deception.
"To reduce risks it is imperative not only to make people aware that such imitations are possible now, but also to include verification techniques before any money is transferred," Moore added.