Are We Leaving People Hanging?

I recently discussed an escalation in scammer tactics during an episode of the Security Weekly News podcast. The quick summary: a finance employee in Hong Kong received an email that appeared to be from the company’s CFO in the UK, instructing them to transfer HK$200 million (about US$25 million) to several external bank accounts. The victim’s first reaction was that the message was malicious. They were then invited to a video conference with the CFO and several other senior officers of the company. The victim recognized the participants’ faces and voices, which convinced them to send the money. Unfortunately, everyone on the call except the victim was a deepfake video.

I find it incredible that attackers applied AI to fabricate an entire video conference in order to steal millions of dollars from the victim’s employer. Unfortunately, criminals are able to misuse the potential benefits of generative AI, and the attackers’ investment in it appears to have paid off. This new capability is something we need to address in our security awareness videos. I have sat through a lot of these trainings, and even the recent ones haven’t hinted at this type of escalation in tactics. Educating our colleagues about this kind of attack is not going to be easy. The message will likely end up being “you can’t even believe what you see and hear.” Ouch.

A Question to Think About

A question occurred to me after the podcast was over: are our security awareness videos and response procedures too focused on technical response? Training videos give a list of things to check to see if an unusual email may be malicious. They give procedures for reporting a malicious email, text message, voicemail, etc. That may mean clicking a button to report the message or contacting the security team via email or chat. But this leaves out any response needed within the targeted person’s own department. Who should they notify for help in their area of the business?

The victim of this scam worked in finance, and issuing transfers to outside bank accounts was part of their job. Let’s say they had reported the messages to security. The victim is still left wondering whether it really was the CFO who demanded the transfers be made. Will they lose their job? Will there be some other kind of punishment for failing to carry out the transfers? This person must have felt incredibly lonely. What if there were a parallel procedure for contacting someone in a position of authority in the finance department to evaluate the transfer request, and our training emphasized this parallel response? Such a procedure could have spared this person from making the decision on their own. It also would have significantly decreased the chances of the scam’s success.

Link to CNN article: Finance worker pays out $25 million after video call with deepfake ‘chief financial officer’

Jason Wood