Deepfake It Till You Make It: A Growing Threat to Enterprise Security
VOLUME 1 - ISSUE 17 ~ May 28, 2025
In this edition of the “CIO Two Cents” newsletter, I explore the rising threat of deepfakes against enterprise security.
— Yvette Kanouff, partner at JC2 Ventures
The JC2 Ventures team:
(John J. Chambers, Shannon Pina, John T. Chambers, me, and Pankaj Patel)
(1) In Q1 2025 alone, deepfake-driven fraud caused $200M in losses, with predictions reaching $40B by 2027.
(2) From CFO impersonations to fake job interviews and customer support calls, deepfake technology is enabling identity fraud to infiltrate companies and steal sensitive data.
(3) Advanced biometrics, AI-powered fraud analysis, and public education are critical to combat the growing risk of deepfake-enabled fraud.
As much as I’ve been talking about deepfakes for years, it is still a shock to see how much they have become part of everyday life. We can’t seem to get through a day without being exposed to AI babies on social media speaking with the voices of real adults, deepfake joke videos, and, unfortunately, deepfake fraud. Deepfake technology is rapidly advancing and poses significant challenges to enterprise security. I assume we all know deepfakes well: the technology enables real-time impersonation of individuals, voice cloning, and the creation of deceptive content, making it a tool for entertainment as well as an enabler of illegal identity theft.
Deepfake-driven financial fraud resulted in $200 million in losses in just the first quarter of 2025, with projections estimating losses will reach $40 billion by 2027 in the U.S. alone. I recall a conversation from years ago with Vijay Balasubramaniyan, CEO of Pindrop, a cybersecurity company specializing in deepfake detection. He warned me about scenarios where a CFO might fall victim to a deepfake call impersonating their CEO and authorize an urgent financial transfer. Unfortunately, this once-theoretical risk has become reality, with cases like one last year in which a finance worker was scammed out of $25 million through such a deepfake impersonation scheme. But it’s not just executive impersonation driving the deepfake fraud market. Employee identity fraud is growing at an alarming rate: AI-generated deepfake calls are being used to impersonate employees and deceive recruiters during job interviews. The reason? To infiltrate organizations and access sensitive information. As a result, stricter identity verification measures are becoming critical at many companies.
Beyond enterprise companies, voice-cloning fraud affects industries across the board, and the need for both voice identity confirmation and fraud detection is becoming critical for all of us. Pindrop offers a convincing example of how voice analysis technology identifies fraud, such as instances where the same deepfake voice calls in under multiple accounts to file insurance claims. The situation is unfortunately very real and very dire.
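For more technically minded readers, the core pattern behind this kind of detection is comparing voiceprints (speaker embeddings) across calls that supposedly belong to different people. The sketch below is my own illustration, not Pindrop’s implementation; it assumes a hypothetical upstream speaker-embedding model has already turned each call’s audio into a fixed-length vector, and it simply flags account pairs whose voiceprints look suspiciously alike.

```python
# Illustrative sketch (not any vendor's actual system): flag cases where the
# "same" voice shows up on calls tied to different accounts.
import numpy as np
from itertools import combinations

# Voiceprint vectors keyed by (account_id, call_id). In practice these would
# come from a speaker-embedding model; random vectors stand in here.
call_embeddings = {
    ("acct-001", "call-a"): np.random.rand(256),
    ("acct-002", "call-b"): np.random.rand(256),
    ("acct-003", "call-c"): np.random.rand(256),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Standard cosine similarity between two voiceprint vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def flag_shared_voices(embeddings: dict, threshold: float = 0.85) -> list:
    """Return pairs of calls from *different* accounts whose voiceprints are
    more similar than the threshold -- a signal that one voice (possibly
    cloned) is being reused across identities."""
    flagged = []
    for (key_a, emb_a), (key_b, emb_b) in combinations(embeddings.items(), 2):
        if key_a[0] == key_b[0]:
            continue  # same account; hearing the same voice again is expected
        score = cosine_similarity(emb_a, emb_b)
        if score >= threshold:
            flagged.append((key_a, key_b, round(score, 3)))
    return flagged

if __name__ == "__main__":
    for call_a, call_b, score in flag_shared_voices(call_embeddings):
        print(f"Review: {call_a} and {call_b} share a voiceprint (similarity {score})")
```

Real systems layer far more on top of this, such as liveness checks and synthetic-audio detection, but the cross-account comparison above is the basic idea behind catching one cloned voice filing claims under many names.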
CIOs should be very concerned about this across many aspects of the business. Advanced detection tools and proactive measures for authentication and fraud detection are key. Some of the obvious tools include today’s favorites, such as multifactor authentication and password best practices, but I think the future will bring more advanced biometrics and AI-driven fraud analysis of voice, video, and other metadata. Public education certainly matters, but the growing threat of deepfake-enabled fraud also requires enhanced technology measures and swift incident-response capabilities. It is an area I have been following, and one JC2 Ventures has been invested in, for some time.
Here are insights from fellow CIOs on the topic of deepfakes that I think are interesting:
One in four CIOs is concerned about the rise of deepfake technology over the next year.
Hany Farid, co-founder and chief science officer at GetReal, urges enterprises to focus on “present proofing” rather than future planning.
CFOs and IT leaders are teaming up against the rising threat of deepfake fraud. The answer? AI.
Image of the Moment
Photo: iStock