Allianz Trade claims statistics: Fake president fraud scam still "en vogue"

White-collar criminals are striking more frequently and causing ever greater losses. They are also becoming more professional, not least thanks to artificial intelligence. Social engineering - scams in which the perpetrators manipulate people - is particularly popular among criminals. Most damage, however, is still caused by internal perpetrators: a company's own employees.

Well-made deepfakes are often difficult to identify. (Image: www.depositphotos.com)

"The uncomfortable truth for companies remains: the weak point is people. Their own employees continue to cause the most cases and - at least until 2023 - also the greatest losses. In 2024, this tide could turn for the first time in terms of the level of losses," says Marie-Christine Kragh, Global Head of Fidelity Insurance at Allianz Trade.

In 2023, internal perpetrators were responsible for more than half (55%) of all claims reported to Allianz Trade in Germany and for around three quarters of the reported claims volume (76%). In 2024, this trend has so far continued in terms of the number of cases: from January to August 2024, internal perpetrators committed around 60% of the reported cases. What is new in 2024, however, is that external perpetrators were ahead in terms of the amount of damage in the same period (61%). Experience shows that there may still be significant shifts for the year as a whole, both due to major losses and because criminal acts committed by internal perpetrators are usually discovered and reported much later than offenses committed by external perpetrators.

Social engineering is booming among white-collar criminals

External perpetrators also include "social engineers": in payment and ordering fraud, they divert payment and goods flows, while in fake president fraud they pose as the boss and instruct employees to transfer money to fraudulent accounts for supposed business transactions. The number of cases of these crimes rose by 17% in 2023 compared to the previous year, and the volume of losses by 19%.

"The fake president scam is still in vogue after its surprising revival two years ago," says Kragh. "The number of cases of this scam shot up again by almost a third (+31%) in 2023."

However, the losses per case for both types of fraud fell significantly in 2023: the loss volume dropped by more than half (-55%) last year. In most cases, the loss amounts were in the low to mid six-figure range.

Fake bosses still "en vogue" with major losses, AI brings new stage of evolution

"The fake bosses struck much more often in 2023, but did not steal as large sums of money," says Kragh. "However, companies should not be lulled into a false sense of security - on the contrary. This year, we expect the number of cases to remain high but stable, with an increase in major losses. We assume that the claims volume for companies is likely to more than double in 2024. This indicates that fraudsters are using AI tools to further professionalize their scams, targeting employees and companies even more precisely."

The new technology is also likely to play into the hands of white-collar criminals when it comes to payment fraud. The amount of losses due to payment fraud rose by more than half in 2023 compared to the previous year (+59%), mainly driven by major losses. In many cases, the counterfeit invoices are practically indistinguishable from the originals.

According to estimates by Allianz Trade based on claims statistics from January to August 2024, there are signs of a slight easing in major payment fraud claims for 2024 as a whole: although the number of cases is likely to remain at a high level, average losses should normalize somewhat and the overall loss volume should decline in 2024 (-25%).

Danger from deepfakes: voice cloning at the touch of a button

With the rapid development of AI tools, deepfakes are likely to pose an increasing threat to companies in the future. "A few years ago, voice cloning was still something for absolute specialists and the quality was often questionable," says Tom Alby, Chief Digital Transformation Officer at Allianz Trade in Germany, Austria and Switzerland. "Today, thanks to AI tools, it's practically available 'off the shelf' at the touch of a button. This also opens up completely new horizons for fraudsters. The hurdles are lower than ever before, and they need fewer and fewer skills for really well-crafted attacks."

For social engineers, however, technology is only a means to an end: it is intended to lend authenticity to the boss and the instruction. Manipulation through emotions and pressure plays an equally important role. "Using artificially generated voices and images to build trust is a powerful tool," says Kragh. "A well-worded email is one thing, but if the fake boss suddenly speaks with a real voice, or even looks real and can be seen sitting in 'his' office, then that is a whole new dimension that in many cases makes all doubts disappear. You cannot simply apply a security patch to employees so that everything is automatically blocked. Raising awareness is therefore more important than ever."

A race between criminals' evolving methods and protective measures

Well-made deepfakes are often difficult to identify. Employees should watch for unnatural intonation or speech melody, and check whether movements and blinking appear authentic. Poor audio or video quality, inexplicable background noises, or sudden changes in lighting and skin tone can also be important indicators, as can poor lip-syncing to what is being said. You can also simply ask the other person to touch their nose with their finger.

"However, I assume that in the coming months we will see deepfakes where none of this applies anymore," says Alby. "That's why it makes sense to think internally about how to install control mechanisms. After all, criminals don't sleep; they work day and night on the remaining deficits and are the first to take such 'detection tips' to heart. This is their input for the next stage of evolution. It's definitely going to be a game of cat and mouse."

"However, vigilance, critical thinking and a good, open corporate culture are the most important factors," says Kragh. "A single follow-up question can bring the whole house of cards crashing down and expose the perpetrators. The CEO's commitment never to instruct transfers via video call, or a dedicated approval process for certain transactions, can also serve as suitable safeguards."

In a recent case, an employee of a car company foiled a fake president fraud attempt with a simple question: which book had the CEO recommended to him the week before? The fake boss had no idea.

Source: www.allianz-trade.ch

