TAKE IT DOWN Act: US Law Against AI-Generated Intimate Imagery
By Learnia Team
In May 2025, the United States enacted the TAKE IT DOWN Act, establishing the first federal criminal law specifically targeting non-consensual intimate imagery (NCII), including AI-generated deepfakes. This bipartisan legislation represents a watershed moment in addressing one of the most harmful applications of generative AI technology.
This comprehensive guide explains the law, its protections, enforcement mechanisms, and implications for individuals and platforms.
Background: Why This Law Matters
The Scale of the Problem
Non-consensual intimate imagery has been called the most prevalent harmful use of deepfakes:
- Estimates suggest 90-95% of deepfakes are NCII
- Women and girls are disproportionately targeted
- Anyone can be victimized using just photos from social media
- Creation tools have become widely accessible
- The harm is severe: psychological trauma, career damage, extortion
Previous Legal Gaps
Before the TAKE IT DOWN Act:
| Issue | Previous Situation |
|---|---|
| Federal NCII law | None existed |
| Deepfake-specific laws | Only a handful of state laws |
| Takedown mechanism | No federal mandate |
| Minor protection | Varying state coverage |
What the Law Does
Criminal Penalties
The TAKE IT DOWN Act creates federal crimes for:
1. Publishing Non-Consensual Intimate Images
Elements:
- Knowingly publishes intimate visual depiction
- Identifiable individual depicted
- Without consent of depicted person
- Knows or should know consent wasn't given
Penalties:
- Up to 2 years imprisonment (adults)
- Up to 3 years (if minor depicted)
- Fines and restitution to victims
2. Threatening to Publish
Elements:
- Threatens to publish intimate images
- Intent to extort, harass, or coerce
Penalties:
- Same as publishing
- Additional charges may apply
Coverage of AI-Generated Content
Critically, the law explicitly covers synthetic/AI-generated content:
"Intimate visual depiction" includes any visual depiction that is created or altered by software or digital technology, whether or not such depiction is realistic, that depicts the intimate body parts of an identifiable individual or depicts an identifiable individual engaging in sexually explicit conduct.
This means:
- ✅ AI-generated deepfakes are covered
- ✅ Photoshopped/edited images are covered
- ✅ Even unrealistic depictions are covered
- ✅ Coverage applies regardless of the technology used to create the depiction
Platform Removal Requirements
The law mandates that covered platforms:
Removal Timeline:
- 48 hours to remove content after receiving a valid notice
- Applies to identical and "substantively similar" copies
Covered Platforms:
- Social media platforms
- Websites hosting user-generated content
- Hosting services where content is publicly accessible
- Search engines (for search results)
Exceptions:
- Platforms with fewer than 1 million monthly users
- News reporting (bona fide journalism)
- Educational and documentation purposes
Notice Requirements
Valid takedown notices must include:
Required Notice Elements:
1. Identification of specific content
2. Statement that requester or minor is depicted
3. Statement that content is intimate
4. Statement of non-consent
5. Contact information for requester
6. Signature (electronic acceptable)
Platforms must provide accessible notice submission mechanisms.
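The six required elements above amount to a simple completeness check. The following is a minimal Python sketch of such a check; the class, field, and function names are hypothetical, not taken from the statute or any platform's actual API.

```python
from dataclasses import dataclass, fields

@dataclass
class TakedownNotice:
    """Hypothetical record mirroring the six required notice elements."""
    content_url: str           # 1. identification of the specific content
    depicted_statement: str    # 2. requester (or a minor) is depicted
    intimate_statement: str    # 3. the content is intimate
    nonconsent_statement: str  # 4. the depiction was not consented to
    contact_info: str          # 5. how the platform can reach the requester
    signature: str             # 6. signature (electronic acceptable)

def is_complete(notice: TakedownNotice) -> bool:
    """A notice is facially complete when every required element is non-empty."""
    return all(getattr(notice, f.name).strip() for f in fields(notice))
```

A real intake system would of course do more than check for non-empty fields, but this captures the statute's baseline: a notice missing any element is not valid.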
Who Is Protected
Adults
Any adult whose intimate imagery is shared without consent.
Minors
Enhanced protections for minors:
- Parents/guardians can file notices on a minor's behalf
- Higher penalties for violators
- Intersects with existing child protection laws
- Priority treatment for removal
Note: Images of minors may also violate other federal laws with severe penalties.
How Enforcement Works
Federal Enforcement
- The Department of Justice can pursue criminal charges
- The FBI investigates violations
- Coordination with state law enforcement
Civil Remedies
While TAKE IT DOWN is criminal law, victims also have civil routes:
- DEFIANCE Act (2024): civil damages for deepfake victims
- State laws: many states have civil NCII statutes
- Tort claims: invasion of privacy, infliction of emotional distress, and similar causes of action
Practical Victim Actions
Steps for Victims:
1. DOCUMENT EVERYTHING
- Screenshots with URLs and dates
- Save evidence securely
- Keep evidence that identifies you as the person depicted
2. FILE TAKEDOWN NOTICES
- To each platform hosting content
- Follow platform's submission process
- Keep copies of all notices
3. REPORT TO LAW ENFORCEMENT
- Local police + FBI tip line
- Provide all documentation
- Note: Criminal process is separate from takedowns
4. SEEK SUPPORT
- Victim advocacy organizations
- Legal counsel if needed
- Mental health resources
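Step 1 above (documenting everything) can be as simple as keeping a structured, timestamped log of where content was found. This is an illustrative Python sketch, not legal advice or any organization's tool; the field names are hypothetical.

```python
from datetime import datetime, timezone
import json

def log_evidence(url: str, screenshot_path: str, notes: str = "") -> str:
    """Return one JSON line recording where and when content was found.

    Keep these records offline and backed up; they support both takedown
    notices and any later report to law enforcement.
    """
    record = {
        "url": url,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "screenshot": screenshot_path,
        "notes": notes,
    }
    return json.dumps(record)
```

Appending each line to a file produces a simple, chronological evidence log that pairs each screenshot with its source URL and capture time.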
Platform Obligations
For Covered Platforms
Required Actions:
- Implement a notice process for NCII reports
- Remove reported content within 48 hours of a valid notice
- Remove copies of the same content
- Provide confirmation to the requester
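The 48-hour window lends itself to straightforward deadline tracking. A minimal Python sketch (the function names are hypothetical, not drawn from the statute or any platform's system):

```python
from datetime import datetime, timedelta, timezone

# Statutory removal window after receipt of a valid notice.
REMOVAL_WINDOW = timedelta(hours=48)

def removal_deadline(notice_received: datetime) -> datetime:
    """Latest time by which the platform must remove the reported content."""
    return notice_received + REMOVAL_WINDOW

def is_overdue(notice_received: datetime, now: datetime) -> bool:
    """True once the 48-hour removal window has elapsed."""
    return now > removal_deadline(notice_received)
```

Using timezone-aware datetimes (as here, with UTC) avoids off-by-hours errors when notices and removals span time zones.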
Best Practices (Not Required):
- Proactive hash-matching to prevent re-uploads
- Victim-initiated preventive hashing (e.g., StopNCII-style submissions)
- Dedicated rapid-response teams for NCII reports
- Clear, publicly posted policies
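The hash-matching practice above can be sketched in a few lines. One assumption to flag: production systems use perceptual hashes (such as PhotoDNA or PDQ) that survive re-encoding and cropping, whereas the plain SHA-256 digest used here for simplicity only catches byte-identical copies.

```python
import hashlib

# Digests of content removed after valid notices.
known_ncii_hashes: set[str] = set()

def register_removed(image_bytes: bytes) -> None:
    """Record the digest of content removed after a valid notice."""
    known_ncii_hashes.add(hashlib.sha256(image_bytes).hexdigest())

def should_block(image_bytes: bytes) -> bool:
    """Reject uploads whose digest matches previously removed content."""
    return hashlib.sha256(image_bytes).hexdigest() in known_ncii_hashes
```

The same pattern underlies victim-initiated preventive hashing: the victim's device computes a hash locally and submits only the hash, so the image itself never leaves their possession.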
Liability Protection
Platforms that comply gain protection:
- Safe harbor for good-faith compliance
- No liability for content not yet reported via a valid notice
- Protection for reasonable over-removal errors
Penalties for Non-Compliance
Platforms failing to remove within 48 hours may face:
- FTC enforcement actions
- State attorney general actions
- Civil liability to victims
Limitations and Criticisms
Enforcement Challenges
| Challenge | Issue |
|---|---|
| Anonymity | Creators often anonymous |
| Jurisdiction | International platforms |
| Volume | Millions of potential violations |
| Detection | Finding content is difficult |
| Whack-a-mole | Content reappears elsewhere |
Privacy Concerns
Some critics note:
- Verification may require victims to re-expose the imagery
- Information in notices could be weaponized
- The definition of "intimate" may be subjective
Free Speech Considerations
Balancing concerns:
- Legitimate journalistic exceptions
- Boundaries of artistic expression
- Questions around satirical content
Relationship to Other Laws
Federal Landscape
| Law | Relationship |
|---|---|
| DEFIANCE Act | Civil companion to TAKE IT DOWN |
| Section 230 | Modified for NCII compliance |
| Child exploitation laws | Concurrent for minor imagery |
| Extortion statutes | Apply to threatening behavior |
State Laws
- Many states have their own NCII laws
- Federal law does not preempt state statutes
- Victims may pursue remedies under both
- State laws often provide civil remedies
Impact Since Enactment
Platform Responses
Major platforms have enhanced NCII policies:
- Faster removal processes
- Hash-based blocking of known content
- Preventive hash submissions for potential victims
- Dedicated reporting pathways
Enforcement Actions
While still early:
- The DOJ has established an NCII task force
- Initial prosecutions are underway
- Focus is on egregious cases and deterrence
Ongoing Challenges
- Underground distribution continues
- International hosting evades US jurisdiction
- New AI tools generate content rapidly
- Victim awareness of available rights remains low
How to Stay Protected
Prevention Tips
Reducing Risk:
□ Limit high-quality face/body photos online
□ Check privacy settings on social media
□ Be cautious about what you share privately
□ Be aware of who has access to your images
□ Consider watermarking personal photos
□ Regularly search for your images online
If You're a Victim
Immediate Steps:
1. Don't panic—help is available
2. Document before it's removed
3. File takedown notices to each platform
4. Report to IC3.gov (FBI Internet Crime)
5. Contact the Cyber Civil Rights Initiative
6. Consider contacting an attorney
7. Seek emotional support
Resources
- Cyber Civil Rights Initiative: cybercivilrights.org
- NCMEC CyberTipline: for minor victims
- StopNCII.org: preventive image hashing
- FBI IC3: ic3.gov
Key Takeaways
- The TAKE IT DOWN Act is a federal law criminalizing non-consensual intimate imagery, including AI deepfakes
- Both real and synthetic content are covered; deepfakes are explicitly included
- Platforms receiving valid notices must remove content within 48 hours
- Criminal penalties include up to 2-3 years imprisonment
- Enhanced protections for minors, with higher penalties and parent/guardian filing rights
- Platforms must implement accessible notice-and-takedown processes
- Victims have multiple paths: federal criminal law, civil lawsuits, state laws, and platform policies
Understand AI Ethics and Regulation
The TAKE IT DOWN Act is part of a broader evolution in AI regulation addressing harms from synthetic media. Understanding this landscape helps you navigate the ethical and legal dimensions of AI.
In our Module 8 — AI Ethics & Safety, you'll learn:
- The landscape of AI-related harms
- Regulatory frameworks across jurisdictions
- Ethical principles for AI development
- How to build responsible AI applications
- Deepfake detection and prevention
- Protecting against AI misuse
This knowledge is essential for anyone working with or affected by AI technology.
Module 8 — Ethics, Security & Compliance
Navigate AI risks, prompt injection, and responsible usage.