
Deepfake Scams Surge as Fake BSE CEO Video Sparks Fresh Warning

By Vivek Gupta
Updated Mar 2, 2026

A new wave of AI-powered fraud has put Indian investors and global businesses on alert after a deepfake video impersonating Bombay Stock Exchange CEO Sundararaman Ramamurthy began circulating online. The incident, highlighted in recent reporting, underscores how convincingly synthetic media can now mimic trusted leaders and trigger real financial risk.

Fake Stock Tips Using the BSE Chief’s Face

The viral clip showed what appeared to be Ramamurthy confidently recommending specific stocks and promising strong returns. The problem was simple and serious: the video was entirely fake.

Ramamurthy publicly clarified that both the face and voice in the clip were AI-generated and had no connection to him or the exchange. He warned that the content was designed to mislead retail investors and could easily trick viewers who assume the endorsement is genuine.

The Bombay Stock Exchange moved quickly. Officials said complaints are filed whenever such impersonations surface, and social platforms including Instagram have been contacted to remove the content. Market advisories have also been issued urging investors to verify any investment advice through official channels before acting.

The exchange noted that it remains unclear how widely the video spread or whether any trades were executed because of it. Preventing investor losses, however, has become the immediate priority.

A Pattern Regulators Have Been Warning About

The BSE incident is not isolated. Indian regulators have repeatedly flagged deepfake misuse in financial scams.

The Reserve Bank of India previously warned the public about fabricated videos using Governor Shaktikanta Das’s likeness to promote fake investment schemes. Similarly, the National Stock Exchange had to issue alerts after deepfake clips falsely showed CEO Ashishkumar Chauhan endorsing stocks.

Data cited in recent reporting suggests the threat is accelerating. India saw a roughly 280 percent year-on-year rise in deepfake incidents in early 2024, spanning financial fraud, political manipulation, and personal extortion.

The $25 Million Wake-Up Call

Globally, one of the most alarming cases involved engineering giant Arup in Hong Kong. A finance employee joined what looked like a routine video call with the company’s CFO and several colleagues.

Every participant on that call was a deepfake.

Believing the instructions were legitimate and seeing apparent agreement from others on screen, the employee executed 15 transfers totaling about $25 million. Hong Kong police later confirmed that AI-generated video and voice cloning were used to impersonate the executives.

Security experts now cite this case as proof that deepfakes have moved beyond novelty into serious enterprise risk. The lesson now repeated across the industry is straightforward: no large financial instruction should rely solely on a video call, no matter how convincing it appears.


India’s Expanding Deepfake Fraud Playbook

Recent cases suggest criminals are experimenting aggressively with the technology.

In early 2026, a family in Indore reportedly transferred money after receiving a video call that appeared to show their son kidnapped. Investigators believe deepfake manipulation may have been used to simulate the victim’s appearance.

Other scams have used AI-cloned voices on WhatsApp to impersonate friends or relatives requesting urgent funds. The common thread is emotional pressure combined with highly believable synthetic media.

Why These Scams Work So Well

Several factors are making deepfake fraud unusually effective right now.

First, the realism threshold has crossed a psychological tipping point. Average users often cannot reliably distinguish authentic video from AI-generated content.

Second, attackers are increasingly attaching trusted authority figures to their scams. When a BSE chief, RBI governor, or corporate CFO appears to speak directly to viewers, skepticism drops quickly.

Third, distribution has become frictionless. Social platforms, messaging apps, and short-video feeds allow fraudulent clips to spread at massive scale before takedowns can catch up.

The Defensive Playbook Is Still Catching Up

Regulators and platforms are responding, but largely in reactive mode. Exchanges are issuing alerts. Central banks are publishing warnings. Tech firms are expanding watermarking and detection tools.

Even so, experts caution that technical safeguards alone will not solve the problem. Process discipline remains critical, especially inside companies handling large payments.

Common recommendations now include:

  • Independent verification for financial instructions
  • Multi-person approval workflows
  • Out-of-band confirmation for urgent transfers
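To make the interaction of these controls concrete, here is a minimal, illustrative sketch of how a payments team might encode them in software. All names (the `TransferRequest` class, the two-approver threshold, the field names) are hypothetical, not drawn from any real system; the point is only that a transfer gates on independent approvals plus an out-of-band check, never on the requesting channel alone.

```python
from dataclasses import dataclass, field

@dataclass
class TransferRequest:
    amount: float
    beneficiary: str
    requested_via: str                      # e.g. "video_call" -- never sufficient alone
    approvals: set = field(default_factory=set)
    out_of_band_confirmed: bool = False

REQUIRED_APPROVERS = 2                      # hypothetical policy threshold

def approve(req: TransferRequest, approver_id: str) -> None:
    """Record an independent approval from a named approver."""
    req.approvals.add(approver_id)

def confirm_out_of_band(req: TransferRequest) -> None:
    """Mark that the instruction was re-verified on a separate,
    pre-registered channel (e.g. a known phone number on file)."""
    req.out_of_band_confirmed = True

def may_execute(req: TransferRequest) -> bool:
    """The transfer executes only when every control is satisfied;
    a convincing video call by itself never passes."""
    return (len(req.approvals) >= REQUIRED_APPROVERS
            and req.out_of_band_confirmed)

# A request arriving over video -- like the Arup case -- starts blocked.
req = TransferRequest(amount=25_000_000, beneficiary="ACME-HK",
                      requested_via="video_call")
print(may_execute(req))     # False: the video call alone is not enough

approve(req, "finance_lead")
approve(req, "cfo_office")
confirm_out_of_band(req)
print(may_execute(req))     # True: two approvers plus an out-of-band check
```

The design choice worth noting is that `requested_via` never appears in `may_execute`: how the instruction arrived carries no weight, which is exactly the discipline experts say would have stopped the Hong Kong transfers.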

The Bottom Line

The fake BSE CEO video is not just another internet hoax. It is part of a rapidly maturing fraud ecosystem where AI can convincingly impersonate trusted voices at scale.

For investors and businesses alike, the message is becoming unavoidable: seeing is no longer believing. Verification, not visual realism, is now the only safe default.