AI Deepfake & Malware Threats
Artificial intelligence (AI) is rapidly changing the cybersecurity landscape. In 2026, one of the most significant emerging risks is AI-driven impersonation, in which cybercriminals use deepfake video and voice technology to pose as trusted individuals.
These attacks are increasingly taking place over platforms like Zoom, where a realistic-looking participant may request urgent actions—such as running software, sharing credentials, or approving payments. Because the interaction appears genuine, traditional warning signs are often missed.
Why is this threat growing?
AI tools are now widely accessible, allowing attackers to create convincing deepfakes quickly and at low cost. At the same time, businesses rely more than ever on digital communication, making them vulnerable to identity-based attacks rather than just technical breaches.
In this environment, seeing and hearing someone is no longer enough to be sure of who they are.
How Zoom is responding
To address this, Zoom is introducing new verification features, including verified user badges, facial verification and liveness detection, and stronger controls for unverified participants.
These measures are designed to confirm that meeting participants are real—not AI-generated impersonations.
What businesses should do
To reduce risk, organisations should:
- Verify requests through a second channel (e.g. phone or internal systems)
- Train staff to question unusual or urgent instructions
- Implement layered security, combining identity checks with existing protections
Cyber threats are no longer just about malware; they’re about manipulating trust.
Businesses that adapt quickly by strengthening verification and awareness will be best placed to stay secure. If you're in any doubt about what you can do as a business to keep safe online, then get in touch. We're here to help local businesses across the Crawley, Gatwick and West Sussex region stay protected in a world where digital identities can be faked.
Photo credit: Image designed by Freepik (www.freepik.com)