The Wall Street Journal reports that companies are quietly beating AI impostors with delightfully low-tech moves: ask the caller to draw a smiley face and hold it up to the camera, nudge them to pan the webcam, throw in a curveball question only a real colleague would know, or hang up and call back on a known number. Simple, a bit cheeky, and, right now, surprisingly effective.
Here's the thing I keep hearing from CISOs when I ask, "What actually works on a Tuesday afternoon?"
They say the combo move matters: blend basic human challenges with policy check-backs, and only then lean on detection tooling.
That's not shade on the tools; it's an admission that social engineering, not just silicon, is carrying these scams.
And yes, the numbers are grim: deepfake fraud losses topped $200 million in Q1 2025 alone, which helps explain why even very traditional firms are piloting call-back and passphrase protocols.
If you want a government-grade cross-check, NIST's recent guidance on face-photo morph detection offers a timely reminder: algorithms can mislead, but provenance and layered checks can close the gap. It's not about one magic detector; it's about workflow.
Zoom out for a minute, because something else shifted this week. Google is baking C2PA Content Credentials into the Pixel 10's camera and Google Photos, so images can carry cryptographic "nutrition labels" about how they were made.
That's provenance, not detection, but it changes the default: prove it, or be treated as unverified.
You might ask: "Cute doodles aside, does any of this stop the headline-grabbing heists?" Sometimes, especially when people remember to slow down.
Law enforcement, for its part, is getting faster at clawbacks: earlier this year, Italian police froze nearly €1 million from an AI-voice scam that impersonated a cabinet minister to shake down business leaders. It wasn't perfect justice, but it was concrete.
Let me be candid: I used to roll my eyes at "analog defenses" because they felt… flimsy.
Then I watched a real incident review in which a finance manager defused a suspicious video call by asking the "CFO" to angle the webcam toward the whiteboard.
The lag, the artifacts, the awkward silence: it was a tell. That exact tactic shows up in professional playbooks too: change the lighting, move the camera, hold up today's newspaper (yes, that old chestnut still works). The point is to yank the attacker off their pre-rendered rails.
There's policy momentum here, not just street smarts. Provenance schemes like C2PA will only matter if platforms display and respect them, and if organizations wire provenance checks into their intake flows.
YouTube's early steps to label camera-captured, unaltered clips via Content Credentials hint at where this could go if more ecosystems play along.
Does this mean detectors are dead? Not at all. They're just moving backstage while procedures move front-of-house.
The pragmatic read from standards bodies is: blend authentication (who or what created this), verification (did it change, and how), and evaluation (does this make sense in context?). It sounds fussy until you remember the stakes.
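To make that three-layer blend concrete, here is a minimal, fail-closed triage sketch. Everything in it (the `MediaCheck` type, the `triage` function, the sample results) is illustrative, not any particular standard's or toolkit's API; the point is only that any failed layer should drop an item to "unverified" rather than averaging away the doubt.

```python
from dataclasses import dataclass

@dataclass
class MediaCheck:
    """Result of one layer: authentication, verification, or evaluation."""
    name: str
    passed: bool
    detail: str

def triage(checks: list[MediaCheck]) -> str:
    # Fail closed: a single failed layer marks the item unverified.
    failed = [c for c in checks if not c.passed]
    if failed:
        return "unverified: " + "; ".join(c.name for c in failed)
    return "verified"

# Example: a clip with a signed credential whose pixels changed after signing.
result = triage([
    MediaCheck("authentication", True,  "C2PA manifest present and signed"),
    MediaCheck("verification",   False, "hash mismatch after signing"),
    MediaCheck("evaluation",     True,  "request matches a scheduled meeting"),
])
print(result)  # unverified: verification
```

The design choice worth copying is the fail-closed default: detectors can disagree with each other, but a missing or broken provenance layer is itself a signal, and this structure refuses to paper over it.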
One more question I keep getting from readers: "Is this overkill for small teams?"
Honestly, no. Pick two moves you can train in an hour: verbal passphrases that change weekly, and a hard rule to call back on a saved number for any money request.
Tape a reminder next to the monitor. It's not glamorous, but neither is wiring out HK$200 million on a fake Zoom. The lesson is human: when they expect you to zig, you zag, on purpose, together, every time.
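Even the weekly passphrase can be automated so nobody has to email it around. Below is a sketch, with a made-up secret and wordlist, of both sides independently deriving the week's phrase from a shared secret using HMAC; the specifics (two words, SHA-256, the ISO week as the rotation key) are one reasonable choice among many, not a prescribed scheme.

```python
import hashlib
import hmac
from datetime import date

def weekly_passphrase(shared_secret: bytes, on: date, wordlist: list[str]) -> str:
    """Derive this ISO week's verbal passphrase from a shared secret.

    Both parties compute it locally, so the phrase is never transmitted.
    """
    year, week, _ = on.isocalendar()
    digest = hmac.new(shared_secret, f"{year}-W{week:02d}".encode(),
                      hashlib.sha256).digest()
    # Map the first two digest bytes onto the wordlist deterministically.
    return " ".join(wordlist[b % len(wordlist)] for b in digest[:2])

WORDS = ["copper", "marble", "walnut", "harbor",
         "ember", "quartz", "velvet", "fable"]
print(weekly_passphrase(b"team-finance-secret", date(2025, 9, 1), WORDS))
```

Because the rotation key is the ISO week, everyone on the team gets the same phrase all week and a fresh one every Monday, with nothing to distribute except the original secret.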
Bottom line: the smiley-face test isn't a punchline; it's a pattern interrupt. Pair it with call-backs, provenance checks, and a culture that rewards "slow down" over "rush it," and you've got a fighting chance against fakes that look and sound uncomfortably real.

