Protecting Yourself from Fraud: How Scammers Use AI Deepfakes to Steal Millions, Including in Real Estate


Security experts are sounding the alarm about the dangers posed by artificial intelligence (AI)-generated deepfakes, audio or video files manipulated to convincingly impersonate a real person, in real estate fraud cases.

Authorities in Hong Kong reported in early 2024 that a group of scammers used deepfake technology to steal over $25 million from a multinational company by impersonating its chief financial officer on a video call with an employee and convincing the employee to transfer the funds.

As deepfakes become more realistic and convincing, experts fear fraudsters could use them to impersonate the professionals involved in real estate transactions in order to intercept payments or collect sensitive data. Even experienced real estate investors could fall prey to such schemes and suffer financial loss or identity theft as a result.

No financial transaction comes without risk, but education remains investors' best protection. Buyers can look out for misspelled email addresses and other signs of wire fraud, and call the real estate agent over the phone to confirm a transaction. Unfortunately, red flags have become harder to spot: fraudsters now have the means to spoof phone numbers or use deepfake audio to impersonate an agent's voice, making detection difficult.
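To illustrate the kind of misspelled-address check described above, here is a minimal Python sketch that flags sender domains closely resembling, but not matching, a known-good domain. The trusted-domain list and the similarity threshold are assumptions for the example, not values from any real workflow:

```python
import difflib

# Domains the buyer actually does business with (example values only).
TRUSTED_DOMAINS = {"acme-realty.com", "firsttitleco.com"}

def check_sender(email: str, threshold: float = 0.8) -> str:
    """Classify an email sender's domain as trusted, lookalike, or unknown."""
    domain = email.rsplit("@", 1)[-1].lower()
    if domain in TRUSTED_DOMAINS:
        return "trusted"
    for good in TRUSTED_DOMAINS:
        # High similarity to a trusted domain without an exact match is a
        # classic wire-fraud red flag (e.g. "acme-reality.com").
        if difflib.SequenceMatcher(None, domain, good).ratio() >= threshold:
            return f"lookalike of {good} -- verify by phone"
    return "unknown"
```

A message from `agent@acme-reality.com` (note the extra "i") would be flagged as a lookalike of `acme-realty.com`, while an unrelated domain is simply reported as unknown so the recipient can decide how to verify it.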

Jordan Peele cautioned us about the perils of artificial intelligence (AI) in 2018 when he created a deepfake video of former U.S. President Barack Obama warning Americans about online misinformation and disinformation. In the fake video, "Obama" urges viewers to "be more vigilant with what we trust from the internet", though in fact they were watching an impersonation by Peele that took 56 hours of editing and processing to produce.

Something was off about this video beyond its humorous script: it had the characteristics typical of deepfake videos at the time, such as jerky facial movements and lighting changes. With more advanced technology coming online, scammers will likely find ways to trick even sharp-eyed viewers into believing a fake message.

The National Association of Realtors notes that face-to-face communication at some point during a transaction is essential in protecting people against fraud. That means long-distance investors should, at a minimum, establish contact with local agents to obtain an authentic phone number they can call directly for verification.

AI systems allow scammers not only to manipulate audio and video recordings but also to generate falsified documents for seller impersonation and other schemes.

Fraud on the Rise, and How It Is Being Combated
With artificial intelligence (AI) becoming more accessible to everyday scammers, investment fraud is on the rise, resulting in greater financial losses for victims. According to Federal Trade Commission data, a record $4.6 billion was stolen via investment scams in 2023, while imposter scam losses totaled $2.7 billion. The median investment-scam loss was $7,768, and identity theft reports also rose significantly year over year.

A 2023 Carlson Law analysis of FBI and FTC data identified AI scams as one of the five most frequent forms of investment fraud. AI-detection software may help, but it is never 100 percent reliable; provenance analysis provides greater clarity as to whether content was created by humans or AI. Along those lines, the Content Authenticity Initiative, a group of tech companies, academics, and organizations, is working toward an open-source industry standard for verifying content authenticity.

The problem is that rapid advances in AI require lawmakers to adapt quickly, even as tech companies make AI tools accessible to ever more people.

Policymakers are working hard to catch up with AI advances. Last fall, the Biden administration issued an Executive Order designed to establish security standards, encourage the development of privacy measures, prevent civil rights violations by AI-enhanced computerized systems, capitalize on AI's potential benefits in healthcare and education, research labor market effects more extensively, and ensure government agencies use AI responsibly.

In February, the FTC finalized its Trade Regulation Rule on Impersonation of Government and Businesses, which Commission Chair Lina Khan said was needed because existing protections had become inadequate as technology evolved. The rule allows the FTC to take scammers who impersonate businesses or government agencies to federal court.

Due to an increase in complaints about impersonation fraud targeting individuals, the FTC proposed a supplemental rule that would broaden protections to cover individual victims. The Commission is also seeking public comment on whether the rule should make it unlawful for firms, such as AI platforms, to provide goods or services they know or have reason to know are being used to harm consumers through impersonation fraud.

If this latter provision were included, it would enable the FTC to hold tech companies liable for providing AI tools that facilitate scams, potentially prompting them to be more cautious about releasing new deepfake technology to their users.

Given that there are not yet reliable detection tools or enforcement measures in place, media literacy is especially essential. Investors should remain wary of anything that doesn’t feel right or seems too good to be true, check the authenticity of documents and payment instructions, and stay abreast of new technology or any potential scams that arise.
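One concrete way to check the authenticity of a document or payment instructions is to compare a cryptographic fingerprint of the file you received against one the sender reads to you over an independently verified phone line. A minimal sketch of that idea, assuming SHA-256 as the fingerprint and with file paths as placeholders:

```python
import hashlib
from pathlib import Path

def file_fingerprint(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def matches_confirmed(path: str, confirmed_hex: str) -> bool:
    """True only if the file matches the fingerprint confirmed out-of-band.

    `confirmed_hex` should come from a channel the attacker cannot control,
    e.g. read aloud by the agent over a phone number verified in person.
    """
    return file_fingerprint(path) == confirmed_hex.strip().lower()
```

If even one byte of the wiring instructions was altered in transit, the digests will not match; the check is only as strong as the out-of-band channel used to confirm the fingerprint.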

How to Protect Yourself
Artificial intelligence may offer solutions to problems of its own creation by detecting fakes using learned patterns. Intel recently unveiled a deepfake detection platform it says is 96% accurate; but because innovation will always outpace AI detection methods, investors should take appropriate precautions themselves.

The National Cybersecurity Alliance recommends:

Be cautious about what you post: To protect the privacy of your photo and video content, set your social media accounts to private or add watermarks to any publicly accessible images.
Keep an eye out for AI news: Stay abreast of recent updates to AI technology and any possible scams so you know what to look out for.
Watch for phishing attempts: Remain wary of emails or texts from unknown sources that contain payment instructions, links, or files to download, as well as any requests to share sensitive data via video call. Treat urgent demands, and any hesitation to connect directly by phone or in person, as warning signs.
Report deepfakes: If you discover deepfake content impersonating you or someone close to you, report it to the platform for removal as soon as possible, file a formal complaint with federal authorities, and seek legal assistance if necessary.

Despite the fraud risks, advances in generative AI continue to offer time-saving resources for real estate professionals. Investors and agents already use chatbots for faster communication, but the technology's full potential has yet to be unlocked: McKinsey & Company estimates generative AI could add between $110 billion and $180 billion in value to the real estate industry as a whole.

McKinsey estimates that real estate companies have seen net operating income rise by more than 10% thanks to AI's ability to streamline processes, improve customer satisfaction and tenant retention, create new revenue streams, and support faster (and smarter) investment decisions.

Today’s real estate investors must sift through multiple data sources to analyze whether a market or property will be profitable. McKinsey suggests that an advanced generative AI tool with access to key information could perform this multifaceted analysis and prioritize which listings investors should investigate. That would especially benefit newcomers without an extensive investment history to inform their decisions: an AI-powered solution might let an interested person simply ask, for instance, “Which duplexes should I invest in?”

AI tools offer investors more free time and help them make more profitable decisions.

AI will undoubtedly revolutionize the real estate industry, creating new vulnerabilities during transactions while also helping investors act with precision and communicate more easily. But its rapid progress and adoption will present policymakers and anti-fraud agencies with serious challenges when it comes to fraud prevention.

Used effectively to enhance daily work tasks, AI can have a positive impact on your business, so long as you stay informed and take measures against fraud.
