We've curated the following cybersecurity statistics about deepfakes to help you understand how this deceptive technology is being used to create realistic fake video and audio, fueling privacy threats and misinformation in 2025.
21% of Americans have low confidence in spotting deepfakes in 2025.
39% of Americans have clicked on fake celebrity or influencer endorsements in 2025.
44% of Americans have seen fake or AI-generated influencer endorsements in 2025.
72% of Americans have seen fake celebrity or influencer endorsements in 2025.
10% of Americans lost money after clicking on fake celebrity or influencer endorsements in 2025, with average losses of $525.
Only 29% of Americans feel very confident about spotting deepfakes in 2025.
Taylor Swift ranks #1 as the most impersonated and exploited celebrity.
37% of consumers worldwide identified the use of artificial intelligence in sophisticated scams, such as deepfakes, as their top concern in 2025.
61% of cybersecurity professionals in Germany identified deepfakes as the most significant identity-based threat in 2025.
93% of senior legal professionals are concerned that AI-generated fake assets could materially harm their business.
One in three businesses worldwide has been impacted by deepfakes and other impersonation attacks.
75% of companies globally report having no dedicated plan to address generative AI risks, including deepfakes and AI-driven fraud, in 2025.
87% of organizations expect deepfakes to become major attack vectors in future ransomware campaigns.
89% of healthcare organizations express concern that deepfake audio and video will become major vectors for social engineering in future ransomware attacks.
90% of C-level executives express concern that deepfake audio and video will become major vectors for social engineering in future ransomware attacks.
53% of organizations implemented AI-powered threat detection.
83% of family offices are concerned about deepfakes or other impersonation threats.
One in five mobile users has been the target of a deepfake scam.
Over 61% of organizations that lost money in a deepfake attack reported losses in excess of $100,000.
Recorded audio/voice manipulations currently account for 52% of deepfake threat vectors.