“Ocean’s 11” Redux
When it comes to financial fraud, deepfakes are the new game in town. Think of “Ocean’s 11” coming to your bank, with you as the victim, like casino owner Terry Benedict in the movie. “Deepfake technology uses artificial intelligence (AI) software to make convincing impersonations of voices, images and videos. AI-based neural networks generate a counterfeit of a photo, audio recording or video while another tries to identify the fake version.”[1] Two publicized cases involved banks operating in Dubai and London, which lost tens of millions of dollars to deepfake schemes. Most cases go unreported.
Analyst Kelley M. Sayler, of the Congressional Research Service, put it this way: “The use of AI to generate deepfakes is causing concern because the results are increasingly realistic, rapidly created, and cheaply made with [widely available] software and the ability to rent processing power through cloud computing. Thus, even unskilled operators [can] download the requisite software tools and, using publicly available data, create increasingly convincing counterfeit content.”[2]
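The arrangement both quotes describe, one network generating counterfeits while a second tries to spot them, is what researchers call a generative adversarial network (GAN). The sketch below is a rough illustration only, written in Python with the PyTorch library; the toy layer sizes, training loop, and random stand-in data are assumptions chosen for brevity, not a description of how production deepfake tools are built.

```python
# Minimal sketch of the generator-vs-discriminator idea: one network
# produces counterfeits, the other tries to identify the fakes.
# Toy dimensions and random "real" data are assumptions for illustration.
import torch
import torch.nn as nn

latent_dim = 16   # size of the random input fed to the generator (assumed)
data_dim = 64     # size of a "real" sample, e.g. a tiny audio or image patch (assumed)

# Generator: turns random noise into a counterfeit sample.
generator = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, data_dim), nn.Tanh(),
)

# Discriminator: scores a sample as genuine (1) or fake (0).
discriminator = nn.Sequential(
    nn.Linear(data_dim, 128), nn.ReLU(),
    nn.Linear(128, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(200):
    real = torch.randn(32, data_dim)      # stand-in for genuine training data
    noise = torch.randn(32, latent_dim)
    fake = generator(noise)               # counterfeit samples

    # Discriminator learns to tell real from fake.
    d_opt.zero_grad()
    d_loss = loss_fn(discriminator(real), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(32, 1))
    d_loss.backward()
    d_opt.step()

    # Generator learns to fool the discriminator.
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_loss.backward()
    g_opt.step()
```

Real deepfake systems apply the same adversarial idea to audio waveforms and video frames, trained on the kind of “publicly available data” the CRS quote mentions, such as recordings of a target executive’s voice.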
In the Dubai case, the perpetrators used software to mimic the voice of a company executive well known to a local banker. The fake “executive” told the banker he was completing an acquisition and needed funding from the bank to close it. The bank obliged, and $35 million vanished into accounts around the world. The case became public only when Forbes magazine uncovered legal pleadings describing the loss.
Deepfake schemes are proliferating in other fields. One now circulating shows Ukrainian President Zelensky exhorting his compatriots to put down their arms and surrender to the Russians. Another image took the first-place blue ribbon and $300 cash prize in the “digital art” category of the Colorado State Fair’s juried art exhibit. Titled “Théâtre D’opéra Spatial,” it was created by a man who used artificial intelligence software and written prompts to produce what he describes as a “lavish sort of space opera scene.” One of the exhibit’s jurors said he was unaware AI had been used to create the image.
When it comes to financial fraud losses, the risks to business owners are difficult to quantify. Non-depository financial companies make up a growing share of our financial system. Newly hatched fintech companies proudly claim to be disrupting the status quo. That very disruption makes them magnets for fraud schemes, as PayPal’s founders discovered in their early, high-growth years.
Nondisclosure is the norm when fintech companies suffer fraud losses. Public securities filings frequently include bland statements that fraud is a risk of the business. Government authorities have been slow to require companies to publish aggregate data about losses and risks of loss.
Law enforcement is swamped with cases. When I ask U.S. Attorneys, the FBI and other agencies to investigate clients’ losses due to financial fraud, I am repeatedly told, “This happens every day to lots of people; we can only do so much.”
Given these limitations, I recommend the following precautions.
Be vigilant. Slow down. Ask questions. Look for little details that do not make sense in the context of what you are being asked to do.
Know that technology countermeasures can always be defeated. Think of cops and speeding drivers in their cat-and-mouse game of ever-improving radar guns and detectors. That game has been going on for 50 years, and there is always a new device or tactic being promoted as a game-changer.
Recognize that ego and emotion are as important to financial fraud as greed. In the movie, the Ocean’s 11 crew considers scuttling the plan when they learn Danny Ocean’s ex-wife is now Terry Benedict’s girlfriend. Effective crooks know they need to think clearly, unimpeded by emotion. So do their targets if they are to avoid becoming victims.
Understand that successful deceptions require a drama as their foundation. A story is created. It is plausible, presented in a convincing way, and it invites the victim to draw a false conclusion and then act on it. “Invites” is the key word. Victims are not told what to believe. They are seduced into believing what their captors want them to believe. They then become captives of their own beliefs.
AI will not replace human agency during the lifetime of anyone now in the workforce. To have AI work for us rather than against us, we must design complex adaptive systems that serve human needs, recognizing that we are bound to the machines we create and that they should be built for the betterment of all of us.
[1] https://www.shrm.org/resourcesandtools/hr-topics/technology/pages/deepfake-scams-may-be-on-the-rise.aspx
[2] Id.