Have You Been Deepfaked? What To Do Next
By John Devendorf, Esq. | Reviewed by Tim Kelly, J.D. | Last updated on December 4, 2025

Deepfakes refer to synthetic media (images, video, or audio) produced by computer algorithms from existing data. Scammers can use deepfakes with malicious intent to cause reputational harm or for blackmail.
Some state-specific laws prohibit posting non-consensual intimate imagery (real or deepfake). These laws require social media platforms to remove offending content after reporting.
Victims of deepfake fraud have limited legal options and often find it difficult to track down scammers or hold them accountable for financial harm and emotional distress. For legal advice about deepfake fraud and what you can do, talk to a science and technology lawyer.
What Is a Deepfake?
Deepfakes are AI-generated video, images, audio, or some combination of these, synthesized from real media. Machine learning algorithms train on datasets of images, video, audio, or other multimedia, then generate new content with the same characteristics.
Essentially, deepfakes use deep learning and advanced artificial intelligence (AI) to create new media intended to impersonate another person. For example, a deepfake video could appear to show a celebrity saying something they never said to sell a product.
Deepfakes continue to evolve as generative AI, adversarial network training, and larger training datasets produce ever more convincing images, audio, and video.
Deepfake technology is often used for nefarious purposes, including committing cryptocurrency fraud, spreading misinformation, or blackmailing people with deepfake images. Deepfake scams take a few different forms: hackers may try to scam you out of your money, harvest your personal information for identity theft, or damage your reputation.
For example, scammers can clone your voice and use it to request money from your company. The company may receive a call from a spoofed number that appears to come from your phone, and an employee who recognizes what sounds like your voice may send a wire transfer. By the time anyone realizes you never made the call, it is too late.
How To Report Deepfakes and Request Content Removal
If you identify deepfake content of yourself, report it to law enforcement. You can also report it to the Federal Trade Commission (FTC), which has authority over deceptive practices and impersonation scams involving AI. Reporting the issue to law enforcement creates a police report that you can reference if you suffer financial fraud as a result. Banks, financial institutions, and insurance companies may require a police report before helping you recover your money or start a claim.
You can also report deepfake fraud to the hosting platform. Social media platforms can be slow to respond, however, and may not take down misleading content without pressure from an attorney. Your attorney can contact the service provider and, if necessary, seek a court order requiring it to take down the offending posts or face court sanctions or financial penalties.
Can I File a Civil Lawsuit After a Deepfake Scam?
You can file a civil lawsuit over a deepfake scam, but identifying the perpetrators is often difficult. Many deepfakes are posted online anonymously using fake usernames or VPNs that hide the poster's location. If the scammers behind the deepfakes are in another country, obtaining jurisdiction over them is harder still.
Internet service providers (ISPs) and social media companies can avoid liability by claiming they are simply conduits of information and not responsible for content. Section 230 of the Communications Decency Act provides ISPs and platforms broad immunity for user-generated content, which can include deepfakes.
If fraudsters used a deepfake scam to get you to send money overseas, it can be difficult to recover the money once it is gone. You may have banking and consumer protections if you used a credit card or a check for payment. However, wire transfers, debit card payments, or cash payments are more difficult to recover.
If you want to know what legal action you can take to recover damages after a deepfake scam, talk to a technology lawyer about your legal rights.
Deepfake Scam Laws
The law is slow to adapt to developing technologies, including artificial intelligence and deep-learning video and voice cloning. Millions of people could fall victim to deepfake scams before lawmakers establish strong protections against deepfake fraud.
Some states are more proactive in establishing consumer protections against deepfake fraud. A number of states have passed deepfake laws prohibiting the use of false and deceptive images in elections. Other laws address using deepfake images in producing pornographic materials.
Many legal protections for individuals and companies rest on existing laws that don't address deepfake scams directly. Common law fraud, for example, prohibits false representations of material fact, made with intent to deceive, that cause harm to someone who relies on them.
A victim of a deepfake can use consumer protection laws, wire fraud laws, or the Computer Fraud and Abuse Act (CFAA) to recover compensation for their damages.
Protections Against Non-Consensual Intimate Imagery
Posting non-consensual intimate imagery on social media platforms can harm someone’s reputation and even lead to self-harm or suicide. Some scammers use intimate images, either real or AI deepfakes, to blackmail victims for money or force them to do things against their will.
Federal law, specifically the Violence Against Women Act (VAWA) as reauthorized in 2022, provides a civil cause of action allowing victims to sue for damages. Criminal penalties, which can include prison time, are generally enforced under state laws or specific federal cyberstalking statutes, and the penalties are more severe when victims are underage. Under the federal TAKE IT DOWN Act of 2025, covered websites and social media platforms must remove reported non-consensual intimate imagery, including AI-generated deepfakes, within 48 hours of a valid request.
Protect Yourself Against Deepfake Scams
You can take standard cybersecurity steps to limit what content scammers can access without your consent. The less information you provide to potential scammers, the less likely they are to gather enough data to create a convincing deepfake. The following are some steps you can take to limit your exposure:
- Use multifactor authentication for account access
- Watermark any photos you send or post online
- Use strong privacy settings on social media accounts
- Use strong, unique passwords and change them regularly
- Put a freeze or hold on your credit report to prevent unauthorized new accounts
- Use a virtual private network (VPN) to keep your location private
- Stay up-to-date with deepfake threats and talk to family and friends about avoiding phishing and spoofing scams
How a Lawyer Can Help Deepfake Scam Victims
Victims of deepfake impersonations and scams may not know where they can turn to get help. Talking to a science and technology attorney can help you understand your legal rights and identify the next steps. An attorney can review your case and explain your legal options. Your attorney can take steps to remove deepfake content and prevent the reposting of disinformation.
If your attorney can identify the scammers involved, they can take legal action with a civil lawsuit to help you get compensation. For more information about your options, talk to a science and technology lawyer.