Attorney General James Warns New Yorkers of Investment Scams Using AI-Manipulated Videos

LongIsland.com

New York Attorney General Letitia James today issued an investor alert warning New Yorkers about scams that lure potential investors with fake videos, created with artificial intelligence (AI), showing celebrities and business leaders touting fraudulent investment schemes. These AI-manipulated videos, known as deepfakes, often appear in social media feeds, digital ads, and messaging apps and are expected to fuel billions of dollars in fraud every year. After receiving complaints from New Yorkers about these videos and their associated scams, the Office of the Attorney General (OAG) is urging New Yorkers to follow tips to protect themselves and to report this misleading content to law enforcement and to the social media platforms where it appears.

“Sophisticated scammers are using AI to impersonate trusted business leaders and scam vulnerable New Yorkers out of their hard-earned money,” said Attorney General James. “Manipulated videos advertising phony investment scams are spreading like wildfire on social media, and New Yorkers should know how to avoid falling victim to these schemes. I encourage anyone who encounters these types of scams to contact my office.”

Deepfake investment scams target victims online with AI-manipulated videos that show wealthy individuals like Elon Musk, Jeff Bezos, and Warren Buffett apparently touting the scammers’ investment schemes, which often involve cryptocurrency. Scammers use AI tools to alter the voice and mouth movements in existing videos, creating seemingly genuine endorsements for their frauds. These videos are then served to social media users as advertisements or broadcast as fake livestreams on platforms like YouTube. Scammers may also pose as government officials or other people in order to gain victims’ trust. The accessibility of AI tools allows scammers to produce ads for their frauds at scale, with a variety of impersonations, to target a wide range of potential victims. It can be difficult to tell from the video alone whether it is a deepfake.

Once their victims show interest in the supposed investment, scammers may try to move the conversation off public social media and into encrypted private messaging services such as WhatsApp or Signal. After the victims have made an initial investment on the fraudulent platforms described in the ads, scammers will often create fake websites showing the investment increasing in value and encourage victims to send more money. After making more investments, sometimes totaling hundreds of thousands of dollars, victims will be unable to withdraw their funds or will be asked to pay additional withdrawal fees or “taxes.” Eventually, the scammers will cut off contact, having pocketed the victims’ money from the very first investment. Even worse, victims may later be contacted by “recovery” services claiming they can retrieve the lost funds, which only drains more of the victims’ net worth.

Attorney General James recommends that New Yorkers take the following steps to avoid becoming the victim of a deepfake investment scam:

  • When considering investing, look out for the following red flags:
    • Promises of guaranteed returns on your investment.
    • Demands to invest immediately or warnings that you will miss out on the opportunity.
    • Investment solicitations that appear to come from celebrities or other famous people.
    • Demands that you send cryptocurrency to a platform that does not hold a New York BitLicense or to a private wallet.
    • Requests to move public conversations to encrypted private messaging platforms.
  • Conduct your own due diligence by searching the internet for reviews about the salesperson and by verifying any physical address provided by the salesperson.
  • Check whether an investment professional is registered by using FINRA’s BrokerCheck.
  • Do a reverse search of the footage used by checking reliable sources for previous interviews of the speaker; fraudsters often reuse footage from existing public video or audio.
  • Be wary of livestreams or other video/audio messages you receive that are trying to persuade you to make any investments. Even if it appears to be a livestream, this could in fact still be a deepfake.
  • To prevent fraudsters from impersonating your contacts, change your profile settings to keep your friends list, photos, videos, posts, and even comments private.
  • Do not provide personal information, financial information, or login credentials to individuals you have only met online.
  • If a solicitation comes from one of your known contacts, verify that it is your actual contact by reviewing their profile and contacting them through an alternate form of communication, such as a phone call.
  • Do not click on any links in emails or direct messages unless you have confirmed the identity of the sender.
  • Be wary of unsolicited investment opportunities, especially those that come via email, social media, or phone calls. Legitimate firms do not aggressively push potential investors to make quick decisions or disclose sensitive information through these channels.
  • Enable Multi-Factor Authentication (MFA) on all your accounts, particularly those related to finance and email. MFA requires more than one method of verification, making it much harder for scammers to access your accounts—even if they use sophisticated techniques like deepfakes.
  • Please keep in mind that these transactions are almost always irreversible. If you decide to invest, keep all your paperwork and preserve your communications.

Attorney General James encourages anyone who may have been a victim of this type of scam to report it to OAG by filing a complaint online or calling 1-800-771-7755. Any identifying information provided to OAG will be protected according to law and policies on the safeguarding of identifying information.