Deep fake losses could be major, CyberCube warns

January 2021 by Darren Thomson, Head of Cyber Security Strategy for cyber analytics leader CyberCube

The use of deep fake video and audio technologies could become a major cyber threat to businesses within the next two years, cyber analytics specialist CyberCube has predicted.

In its new report, Social Engineering: Blurring reality and fake, CyberCube says the ability to create realistic audio and video fakes using AI and machine learning has grown steadily. In addition, recent technological advances and the increased dependence of businesses on video-based communication have accelerated developments.

Because of the increasing number of video and audio samples of business people now accessible online – in part due to the pandemic – cyber criminals have a large supply of data from which to build photo-realistic simulations of individuals, which can then be used to influence and manipulate people.

In addition, ‘mouth mapping’ – a technology created by the University of Washington – can be used to mimic the movement of the human mouth during speech with extreme accuracy. This complements existing deep fake video and audio technologies.

The report’s author, CyberCube’s head of cyber security strategy Darren Thomson, said: “As the availability of personal information increases online, criminals are investing in technology to exploit this trend. New and emerging social engineering techniques like deep fake video and audio will fundamentally change the cyber threat landscape and are becoming both technically feasible and economically viable for criminal organisations of all sizes.

“Imagine a scenario in which a video of Elon Musk giving insider trading tips goes viral – only it’s not the real Elon Musk. Or a politician announces a new policy in a video clip, but once again, it’s not real. We’ve already seen these deep fake videos used in political campaigns; it’s only a matter of time before criminals apply the same technique to businesses and wealthy private individuals. It could be as simple as a faked voicemail from a senior manager instructing staff to make a fraudulent payment or move funds to an account set up by a hacker.”

The report also examines the growing use of traditional social engineering techniques – exploiting human vulnerabilities to gain access to personal information and protection systems. One facet of this is social profiling, the technique of assembling the information necessary to create a fake identity for a target individual based on information available online or from physical sources such as refuse or stolen medical records. According to the report, the blurring of domestic and business IT systems created by the pandemic combined with the growing use of online platforms is making social engineering easier for criminals. In addition, AI technology is making it possible to create social profiles at scale.

The report warns insurers that there is little they can do to combat the development of deep fake technologies but stresses that risk selection will become increasingly important for cyber underwriters.

Darren Thomson said: “There is no silver bullet that will translate into zero losses. However, underwriters should still try to understand how a given risk stacks up to information security frameworks. Training employees to be prepared for deep fake attacks will also be important.”

Insurers should also consider the potential of deep fake technology to create large losses, as it could be used in an attempt to destabilise a political system or a financial market.

In March 2019, cyber criminals used AI-based software to impersonate a chief executive’s voice to demand the fraudulent transfer of $243,000.
