What is KYC and why do banks rely on it for identity verification?

"Know Your Customer" (KYC) is the legal requirement that financial institutions verify the identity of account owners before allowing them to transact. Without KYC, banks can't comply with anti-money-laundering (AML) rules. Without AML compliance, they get clobbered by regulators.

For decades, KYC meant paperwork: photo ID, address verification, maybe a phone call. Then AI and biometrics changed the game. Now most banks and crypto exchanges use facial recognition combined with "liveness detection"—a real-time video check that confirms the face on screen matches the ID document and that the face is actually alive (not a photo, not a video, not a deepfake).

The logic is sound. A photo can be stolen. A prerecorded video can be replayed. But a live person matching their ID document in real time? That should be hard to fake.

Should be.

How do virtual cameras actually defeat facial recognition and liveness checks?

The exploit is elegant in its simplicity. Instead of letting the banking app use the phone's real camera for the liveness check, scammers install software that substitutes a "virtual camera" playing whatever video they want.

A software engineer from Cambodia sits in a money-laundering center. He opens a Vietnamese bank app on a compromised phone. When the app asks for a photo, he uploads a stolen photo of someone else. When the app requests a video liveness check, the virtual camera software kicks in. Instead of showing the live feed from the real camera, it displays a looped video of the stolen photo—or a deepfake of the person whose ID was stolen—or in some cases, just a few seconds of real video mixed with still images. The app sees motion and facial features and processes the account as verified.
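Conceptually, the substitution works because the app consumes frames through a generic camera interface and has no visibility into what produces them. A minimal Python sketch of that idea (all class and frame names here are hypothetical, for illustration only):

```python
from itertools import cycle

class RealCamera:
    """Stands in for the device's hardware camera feed."""
    def next_frame(self):
        return "live-sensor-frame"

class LoopedVideoSource:
    """A 'virtual camera': replays prerecorded frames instead of a live feed."""
    def __init__(self, recorded_frames):
        self._frames = cycle(recorded_frames)
    def next_frame(self):
        return next(self._frames)

def run_liveness_check(camera, n_frames=3):
    """The app just pulls frames; nothing tells it the source is fake."""
    return [camera.next_frame() for _ in range(n_frames)]

# The banking app's code path is identical either way:
print(run_liveness_check(RealCamera()))
print(run_liveness_check(LoopedVideoSource(["stolen-frame-1", "stolen-frame-2"])))
```

The point of the sketch: the check sees frames with motion and facial features and has no intrinsic way to distinguish a sensor from a replay.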

The scammer is now inside someone else's bank account. The real account owner has no idea.

According to MIT Technology Review's investigation, released April 15, these VCam tools are sold on Telegram in 22 different channels and groups operating in Chinese, Vietnamese, and English. Services advertise themselves as "all smooth and seamless" and "specializing in bank services—handling dirty money." Some channels have thousands of subscribers.

What makes virtual camera attacks so hard for banks to detect?

The attack works at multiple layers simultaneously, which is why traditional defenses fail.

First, scammers often start by jailbreaking the physical phone. They gain root access to the operating system itself. At this level, they can install code that intercepts camera feeds before they reach the banking app. The app thinks it's using the real camera. It's not. It's piping through malicious software.

Second, they inject code called a "hooking framework" directly into the banking app. This technique is more sophisticated and doesn't require physical phone access—it can happen remotely. The framework hijacks the app's camera calls and redirects them to the virtual camera. The app's own code never suspects anything is wrong.
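Real hooking frameworks operate at the native level, but the mechanism can be shown with a toy Python analogy: swap out the method the app calls, and every existing call site silently receives the attacker's output. All names below are illustrative:

```python
class BankingApp:
    """Toy stand-in for an app that reads the camera by name."""
    def capture_frame(self):
        return "real-camera-frame"

def hook(target_cls, method_name, replacement):
    """Replace a method on the class; callers keep using the original name."""
    setattr(target_cls, method_name, replacement)

def fake_capture(self):
    return "virtual-camera-frame"

app = BankingApp()
print(app.capture_frame())   # real-camera-frame
hook(BankingApp, "capture_frame", fake_capture)
print(app.capture_frame())   # virtual-camera-frame: same call, hijacked result
```

The app's own code is unchanged, which is why signature checks on the app binary alone don't catch a runtime hook.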

Third—and this is where deepfakes and AI enter—scammers layer stolen biometric data and AI-generated face videos on top. Instead of sending a static photo through the virtual camera, they send a deepfake video of the person whose ID was stolen. Liveness detection checks for motion, eye movement, and breathing patterns; deepfakes can now reproduce all of them. The biometric match data (iris patterns, facial geometry, voice prints) is harvested from earlier breaches or purchased on the dark web.

Layered together, these attacks are nearly impossible to stop with any single technical control.

The timeline also works against banks. Money moves fast. According to cybersecurity researchers quoted in the MIT Technology Review investigation, scammers drain accounts and convert funds to the stablecoin Tether within seconds. By the time a bank's anti-fraud system flags the transaction pattern, the money has already been converted to cryptocurrency and is effectively untraceable.

Who is behind these attacks and where does the stolen money go?

This isn't opportunistic fraud. It's organized crime infrastructure.

The primary victims are people targeted by "pig-butchering" scams—transnational romance/investment fraud schemes that routinely cost victims tens or hundreds of thousands of dollars. Scammers build fake dating profiles, seduce victims, and convince them to invest in fake crypto schemes. Over months, layers of false evidence and charts make the scheme seem legitimate. By the time victims realize they've been trapped, many have lost six figures.

At that point, the scam moves to money laundering. The victim's funds sit in bank accounts spread across multiple countries—Vietnam, Thailand, Cambodia. To move that money out before law enforcement traces it, scammers need to access those accounts without triggering KYC checks. That's where VCam comes in.

Funds flow through "water houses"—money-laundering operations run out of buildings in Cambodia and Myanmar. An employee there can use VCam bypass services to open new accounts and funnel money in. From there, the funds are converted to Tether (a stablecoin that moves between exchanges with little friction) and spiral through multiple exchange wallets before converting back to fiat currency in Hong Kong or Singapore.

The scale is staggering. Chainalysis, a blockchain analysis firm, estimates $17 billion was stolen in crypto scams and fraud in 2025—up from $13 billion in 2024. The United Nations Office on Drugs and Crime recently warned that Asian scam syndicates operating in Africa and the Pacific have "dramatically scaled up profits."

VCam bypass tools are now essential infrastructure for this entire ecosystem.

| Attack Layer | Method | Defense Difficulty | Cost to Attacker |
|---|---|---|---|
| Phone OS compromise | Jailbreak device for root access | Very high (requires device in hand) | Low ($50–$200 tool) |
| App-level injection | Hooking framework injected into banking app code | High (requires bypassing cryptographic signatures) | Medium ($500–$5,000 custom attack) |
| Biometric spoofing | Stolen iris/face data + deepfake video | High (requires real biometric data) | Medium ($1,000–$10,000 dark web purchase) |
| Timing/velocity | Drain and convert to Tether in seconds | Medium (requires account access first) | Low (automated after entry) |

Why haven't Binance, BBVA, and major exchanges patched this vulnerability?

They're aware of the problem. Binance acknowledged to MIT Technology Review that it has "observed attempts of this nature to circumvent our controls" and claimed it "successfully prevented such attacks." BBVA and Revolut, the UK fintech, declined to comment on whether their safeguards had been breached.

But here's the difficult truth: You can't patch a vulnerability that exists at the phone operating system level. If the scammer controls jailbreak access to the OS itself, the banking app is already compromised before it even launches. It doesn't matter how secure the app code is.

The second challenge is that liveness detection is an arms race. Banks install detection. Scammers find a workaround. Banks install a new detector. Scammers layer a countermeasure (like deepfakes). At each step, the attack gets more sophisticated and resource-intensive, but it doesn't become impossible—it just becomes more expensive. For organized crime networks stealing $17 billion annually, the investment in better VCam tools pays for itself thousands of times over.

The third challenge is regulatory. KYC standards were written before virtual cameras, deepfakes, and multi-layer spoofing existed. Regulators are still catching up. Thailand and Vietnam have recently tightened KYC requirements and transaction monitoring. Notably, the US Financial Crimes Enforcement Network (FinCEN) issued a specific warning against KYC deepfakes in late 2024—but warnings aren't defenses. They're admissions that the existing system is broken.

What would a more secure identity verification stack actually look like?

Single points of failure are out. Liveness detection alone isn't enough anymore. Here's what a more resilient stack would include:

**Behavioral biometrics beyond the face.** Instead of just matching a face, analyze how the person moves their phone, the angle they hold it, their eye gaze patterns, the speed of their head movements. Deepfakes and static spoofs can nail a face. They're much worse at mimicking the micro-behaviors that distinguish real people. Add voice biometrics (voice deepfakes exist but require more data and resources). Add iris recognition. Add hand geometry. The more orthogonal biometric signals you require, the exponentially harder the attack becomes.
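One way to picture the micro-behavior idea: compare the jitter in live device-motion readings against an enrolled baseline. A replayed video piped through a virtual camera comes from a phone sitting still (or no phone at all), so natural hand tremor is absent. A crude sketch, with all thresholds and sensor values invented for illustration:

```python
from statistics import mean, stdev

def motion_features(samples):
    """Summarize a stream of sensor readings (e.g., gyroscope magnitudes)."""
    return {"mean": mean(samples), "jitter": stdev(samples)}

def matches_profile(live, enrolled, tolerance=0.5):
    """Crude check: live micro-motion should resemble the enrolled baseline,
    and there must be *some* jitter at all (a static replay has none)."""
    return (abs(live["jitter"] - enrolled["jitter"]) <= tolerance
            and live["jitter"] > 0.05)

enrolled = motion_features([0.9, 1.4, 0.7, 1.2, 1.0])    # real handheld phone
live_real = motion_features([1.0, 1.3, 0.8, 1.1, 0.9])
live_spoof = motion_features([1.0, 1.0, 1.0, 1.0, 1.0])  # static replay

print(matches_profile(live_real, enrolled))   # True
print(matches_profile(live_spoof, enrolled))  # False: no natural hand tremor
```

Production systems use far richer models, but the design principle is the same: require a signal the spoof pipeline does not carry.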

**Hardware attestation.** Verify that the phone itself is real and hasn't been jailbroken. Use technologies like Android Attestation and Apple App Attest to prove the device is running genuine OS code and the banking app hasn't been tampered with. This pushes the attacker away from software hacks and toward either stealing physical phones or creating custom hardware—both much harder and more expensive.
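The shape of attestation, greatly simplified: the device submits a statement about its own integrity, signed with a key the attacker cannot forge, and the bank's server checks both the signature and the claims. Real systems (Play Integrity, App Attest) use vendor-rooted certificate chains; this sketch substitutes a shared HMAC key purely to show the verification flow:

```python
import hmac, hashlib, json

VENDOR_KEY = b"vendor-secret"  # stands in for the platform vendor's signing key

def sign_attestation(claims):
    """What the trusted platform layer would produce on-device."""
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(VENDOR_KEY, payload, hashlib.sha256).hexdigest()
    return payload, sig

def verify_attestation(payload, sig):
    """Server side: accept only if the statement is authentic AND the claims
    say the OS is unmodified and the app binary is untampered."""
    expected = hmac.new(VENDOR_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sig):
        return False
    claims = json.loads(payload)
    return bool(claims.get("bootloader_locked") and claims.get("app_integrity_ok"))

payload, sig = sign_attestation(
    {"bootloader_locked": True, "app_integrity_ok": True})
print(verify_attestation(payload, sig))           # True
print(verify_attestation(payload, "forged-sig"))  # False
```

A jailbroken phone or a hooked app fails the claims check; a forged statement fails the signature check.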

**Transaction pattern monitoring in real time.** Don't just watch whether an account opens—watch whether it behaves like the real account owner. If an account that sends $500/month to family suddenly moves $100K to an exchange, flag it. If a new account logs in from Cambodia and attempts bulk transfers to crypto exchanges, flag it. Not perfect—and privacy-invasive—but harder to scale at the volumes criminals operate at.
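Both examples in that paragraph reduce to simple rules over an account's history. A toy sketch, with all thresholds illustrative rather than drawn from any real fraud system:

```python
def flag_transaction(history, amount, destination_type):
    """Toy rule set: flag transfers far outside the account's baseline,
    or bulk moves to a crypto exchange from a brand-new account."""
    baseline = sum(history) / len(history) if history else 0
    if baseline and amount > 20 * baseline:
        return "flag: amount is >20x the account's typical transfer"
    if destination_type == "crypto_exchange" and len(history) < 5:
        return "flag: new account sending to an exchange"
    return "ok"

# Account that sends ~$500/month suddenly moving $100K:
print(flag_transaction([500, 450, 520], 100_000, "bank_transfer"))
# Freshly opened account wiring straight to an exchange:
print(flag_transaction([], 9_000, "crypto_exchange"))
```

The catch noted above still applies: rules like these generate false positives and require holding detailed behavioral history, which is the privacy trade-off.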

**Decentralized verification.** Don't rely on a single liveness check. Have the user reverify their identity via multiple channels (email, SMS, security questions) before confirming bulk transactions. Make the attack not just technically hard but operationally expensive.
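The multi-channel idea can be expressed as a step-up gate: a bulk transaction proceeds only after the user confirms through a minimum number of independent channels. Channel names below are illustrative:

```python
def approve_large_transfer(confirmed_channels, required=2):
    """Step-up verification: require confirmations on multiple independent
    channels before releasing a bulk transfer. Unknown channels are ignored."""
    independent = {"email", "sms", "security_question"} & set(confirmed_channels)
    return len(independent) >= required

print(approve_large_transfer(["email"]))          # False: one channel isn't enough
print(approve_large_transfer(["email", "sms"]))   # True
```

Each extra channel is cheap for a legitimate owner and costly for an attacker, who must compromise every channel per account rather than defeat one liveness check.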

None of these are perfect. But a combination of behavioral biometrics + hardware attestation + transaction monitoring would make VCam bypass attacks materially harder and less profitable. Right now, banks have picked the easiest solution: facial recognition liveness check. Criminals have proven it's not enough.

Why This War Will Keep Escalating

For every defense, there will be a countermeasure. Deepfakes beat static spoofs. Behavioral biometrics beat deepfakes (but require more data and are more privacy-invasive). Hardware attestation can be defeated with custom device hacks or supply chain compromises. The fundamental asymmetry is that criminals only need to win once per account. Banks need to win every single time. As AI gets better at generating convincing deepfakes and multi-layer spoofs, this dynamic will only get worse. The real solution isn't a technical fix—it's harder and slower and involves international law enforcement cooperation against organized crime. But that never makes headlines as often as "new security system."
