AI impersonation scams are sky-rocketing in 2025, security experts warn – here’s how to stay safe

- AI impersonation scams use voice cloning and deepfake video to convincingly mimic trusted people
- Cybercriminals target people and businesses through calls, video meetings, messages, and emails
- Experts say independently verifying identities and using multi-factor authentication are key to protecting yourself

Imagine getting a frantic call from your best friend. Their voice is shaky as they tell you they've been in an accident and urgently need money. You recognize the voice instantly; after all, you've known them for years. But what if that voice isn't actually real?

In 2025, scammers are increasingly using AI to clone voices, mimic faces, and impersonate the people you trust most. The rise in this type of scam has been staggering: according to Moonlock, AI scams have surged by 148% this year, with criminals using advanced tools that make their deception near-impossible to detect. So how can you stay safe from this growing threat? Here's everything you need to know, including what cybersecurity experts are recommending.

What are AI impersonation scams?

AI impersonation scams are a fast-growing form of fraud in which criminals use artificial intelligence to mimic a person's voice, face, or typing style with alarming accuracy. These scams often rely on voice cloning, a technology that can recreate someone's speech patterns from just a few seconds of recorded audio. The samples aren't hard to find; they turn up in voicemails, interviews, and social media videos. According to Montclair State University, even short clips from a podcast or online class can be enough to build a convincing AI impersonation of someone's voice.

Some scams take this even further, using deepfake video to simulate live calls. For instance, Forbes reports that scammers have impersonated company executives in video meetings, convincing staff to authorize large wire transfers.
(Image credit: Getty Images / Tero Vesalainen)

Experts say the rapid growth of AI impersonation scams in 2025 comes down to three factors: better technology, lower costs, and wider accessibility. With these digital forgeries at their side, attackers assume the identity of someone you trust, such as a family member, a boss, or even a government official. They then request valuable confidential information, or skip that step entirely and ask for urgent payments. These impersonated voices can be very convincing, which makes them particularly dangerous. As the US Senate Judiciary Committee recently warned, even trained professionals can be tricked.

Who is affected by AI impersonation scams?

AI impersonation scams can happen across phone calls, video calls, messaging apps, and emails, often catching victims off guard in the middle of their daily routines. Criminals use voice cloning to make so-called "vishing" calls, phone scams that sound like a trusted person. The FBI recently warned about AI-generated calls pretending to be US politicians, including Senator Marco Rubio.

Original reporting: Latest from TechRadar US in News, opinion
