Understanding the Biometric Threat Landscape
This is not hypothetical. Every step below is happening right now, at scale, using tools anyone can access for free.
Step 1: They find your profile
An attacker visits your LinkedIn, Facebook, Twitter, or company website. Your profile photo is publicly visible — no login required.
LinkedIn alone has 1 billion profiles. Google Images indexes most of them. Your headshot is one search away.
Difficulty: low
Step 2: They download your photo
Right-click, save. Or they use automated scrapers that harvest millions of photos per day. No permission needed.
Clearview AI has scraped 60+ billion images this way. They were fined in the EU, but the database still exists and is actively sold.
Difficulty: low
Step 3: They run facial recognition
Your photo is uploaded to a reverse face search engine like PimEyes, FaceCheck ID, or FindClone. In seconds, every other photo of you on the internet is found.
These tools are public. PimEyes costs $30/month. FaceCheck ID is free. They find your dating profiles, social media, news articles, event photos — everything.
Difficulty: medium
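Under the hood, engines like these map every face to a numeric embedding and run a nearest-neighbor search over an index of scraped photos. The sketch below illustrates only that matching step; the random-projection "model", gallery size, and noise level are illustrative stand-ins (real engines use deep networks trained on millions of faces):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy embedding model: a fixed random projection standing in for a deep
# face-recognition network. Only the matching logic is realistic here.
W = rng.standard_normal((128, 1024))

def embed(pixels):
    v = W @ pixels
    return v / np.linalg.norm(v)  # unit-length embedding

# A "gallery" of 1,000 indexed faces (random stand-ins for scraped photos).
gallery = rng.uniform(0, 1, (1000, 1024))
index = np.array([embed(g) for g in gallery])

# Query: a noisy copy of face #123, simulating a different crop or
# JPEG quality of the same photo.
query = np.clip(gallery[123] + rng.normal(0, 0.02, 1024), 0, 1)
scores = index @ embed(query)      # cosine similarity to every indexed face

best = int(np.argmax(scores))      # recovers face #123 despite the noise
print(best, round(float(scores[best]), 3))
```

The point of embeddings is exactly this noise tolerance: recompression, crops, and lighting changes barely move the match score, which is why simply re-saving or lightly editing a photo does not hide you.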
Step 4: They build a profile on you
By cross-referencing all your photos and accounts, they piece together: your full name, employer, location, daily routine, family members, and social circle.
Open-source intelligence (OSINT) tools automate this. A complete dossier on someone can be built in under an hour.
Difficulty: high
Step 5: They track you in the real world
Your face is now a search query. Security cameras, public surveillance, even photos posted by strangers — if your face is in frame, they can find you.
Cities like London have 1 camera for every 11 people. Combined with facial recognition, anonymity in public spaces no longer exists.
Difficulty: high
From here, the attacks branch
Deepfake Fraud
1. Your face is used to generate a deepfake video
2. The video is used in a Zoom call to impersonate you
3. A colleague or bank employee is tricked into transferring money
4. In Q1 2025 alone, deepfake fraud caused $200M+ in losses
Identity Theft
1. Your photo is combined with leaked personal data
2. Fake IDs, passports, or bank accounts are created
3. 1 in 20 identity verifications now fail due to deepfakes
4. Cleaning up identity theft takes an average of 200+ hours
Targeted Phishing
1. Your photo and employer info are used to craft a spear phishing email
2. Coworkers receive messages that appear to come from you
3. Malware is installed or credentials are stolen
4. Executive impersonation is the fastest-growing attack vector
Stalking & Harassment
1. A stranger uses your photo to find all your online accounts
2. They piece together your location, workplace, and routine
3. Doxxing, threats, or physical stalking follow
4. This disproportionately affects women and public figures
This is not a future problem
- 60B+ photos scraped by Clearview AI (The Register, 2025)
- $200M+ in deepfake fraud losses in Q1 2025 (Veriff, 2025)
- 1 in 20 ID verifications failed due to deepfakes (Veriff, 2025)
- $0 cost to reverse-search a face (FaceCheck ID)
CloakBioGuard breaks the chain at Step 3: facial recognition
We add perturbations to your photo that are imperceptible to humans but scramble the features facial recognition algorithms rely on. When an attacker uploads your protected photo to a face search engine, they get zero results.
No matches. No cross-referencing. No dossier. No attack.
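In adversarial-perturbation terms, the idea is to nudge each pixel by a tiny, capped amount in a direction that pulls the photo's embedding away from the one search engines index. The sketch below is a generic single-step (FGSM-style) illustration against a toy linear model, not CloakBioGuard's actual algorithm; the model, decoy direction, and budget are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy linear embedding model. A real cloak computes this gradient by
# backpropagation through a deep face-recognition network.
W = rng.standard_normal((32, 1024))

def embed(pixels):
    v = W @ pixels
    return v / np.linalg.norm(v)  # unit-length embedding

photo = rng.uniform(0, 1, 1024)   # original photo, pixels in [0, 1]
anchor = embed(photo)             # embedding a search engine would index

# Pick a decoy direction orthogonal to the anchor, then take one
# FGSM-style step toward it, capped at eps per pixel.
decoy = rng.standard_normal(32)
decoy -= (decoy @ anchor) * anchor       # remove the anchor component
decoy /= np.linalg.norm(decoy)

eps = 0.05                               # per-pixel budget (keeps the edit subtle)
grad = W.T @ decoy                       # pixel direction that pulls the embedding toward the decoy
cloaked = np.clip(photo + eps * np.sign(grad), 0, 1)

sim = float(embed(cloaked) @ anchor)     # match score against the indexed embedding
print(f"match score: 1.000 -> {sim:.3f}")
```

In this linear toy a single step only dents the match score; real cloaking tools iterate the same idea against deep networks until the score falls below the engine's retrieval threshold, while the per-pixel cap is what keeps the change invisible to humans.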
Sources
- Clearview AI criminal charges — The Register, Oct 2025
- FaceCheck ID reverse face search — AI Insights News, 2026
- Real-time deepfake fraud statistics — Veriff, 2025
- Adversarial facial privacy research — Published biometric privacy literature
- AI and biometric theft — Interface Media, Feb 2025
- Deepfake statistics and trends — Keepnet Labs, 2026