Wednesday, May 13, 2026

A New Federal Law Takes Effect in 6 Days. If You Run Any Online Platform, Read This.

The Take It Down Act requires covered online platforms to remove nonconsensual intimate images - including AI-generated deepfakes - within 48 hours of a valid request. The FTC just warned the biggest tech companies. If you run a forum, marketplace, or community platform, the deadline is May 19.

The FTC sent warning letters this week to Amazon, Meta, Reddit, Discord, TikTok, Snapchat, and other major tech platforms. The message: a new federal law is taking effect on May 19, and they need to be ready.

The law is called the Take It Down Act (TIDA). The FTC isn't just reminding the biggest platforms - it's signaling that it is watching and intends to enforce.

If you run any kind of online platform - a marketplace, a community forum, a review site, a dating app - this law may apply to you. Here's what it requires and what you need to know before the deadline.

What the Law Requires

The Take It Down Act was signed by President Trump last year and takes full effect May 19. It requires covered platforms to:

  • Establish a clear, easy-to-find process that allows people to request removal of intimate images shared without their consent
  • Remove those images - and all identical copies - within 48 hours of receiving a valid request
  • Post "clear and conspicuous" notice about how the removal process works

The law covers nonconsensual intimate imagery of all kinds, including AI-generated deepfakes. That's an important expansion: this isn't just about photos or videos someone secretly took. It also includes synthetically created content - a realistic but fabricated intimate image of a real person - that was never a real photo to begin with.

Who Is a "Covered Platform"?

This is where small business owners need to pay attention. The law defines covered platforms broadly: websites, apps, and online services that allow users to share images or videos. The list in the law includes social media, image sharing, video sharing, messaging, and gaming platforms.

The FTC sent letters to the major consumer platforms - Amazon, Alphabet, Apple, Automattic (WordPress), Bumble, Discord, Match Group, Meta, Microsoft, Pinterest, Reddit, SmugMug, Snapchat, TikTok, and X.

But if you run a platform that allows user-generated content involving images or video - a niche social community, a marketplace with user profiles, a forum where members can post photos, an adult content platform - you need to assess whether TIDA covers you.

The FTC has published compliance guidance at ftc.gov/tida. The guidance is practical and readable.

The 48-Hour Requirement Is the Hard Part

Most online platforms have some kind of content reporting system. What TIDA adds is a clock: 48 hours from a valid request to removal, including all identical copies.

That's the operationally challenging piece. If someone uploads an intimate image and it gets shared or mirrored to multiple places on your platform, all copies have to come down within that window. For a large platform with millions of posts, that requires automation. For a small platform, it requires a clear human process and someone responsible for executing it.
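For platforms that want to automate the "all identical copies" piece, one common starting point is content hashing: store a hash of every uploaded image, and look up every post carrying the same hash when a removal request comes in. Here's a minimal sketch of that idea. The index, function names, and in-memory storage are all illustrative assumptions - neither the law nor the FTC guidance prescribes any particular technique, and a real platform would back this with a database.

```python
import hashlib

# Hypothetical in-memory index: content hash -> list of post IDs.
# A real platform would keep this in a database or search index.
hash_index: dict[str, list[str]] = {}

def register_upload(post_id: str, image_bytes: bytes) -> None:
    """Record every uploaded image under its SHA-256 content hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    hash_index.setdefault(digest, []).append(post_id)

def find_identical_copies(image_bytes: bytes) -> list[str]:
    """Return the post IDs of every byte-identical copy of an image."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return list(hash_index.get(digest, []))

# Example: the same image shared in three places on the platform
img = b"...image bytes..."
register_upload("post-1", img)
register_upload("post-2", img)
register_upload("post-3", img)
print(find_identical_copies(img))  # ['post-1', 'post-2', 'post-3']
```

Note the limitation: a cryptographic hash only catches byte-identical copies. Re-encoded or resized versions of the same image would need perceptual hashing or similar matching, which is a harder problem.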

The FTC's guidance specifically addresses what constitutes a "valid request" - it doesn't require extensive documentation from the victim, which is intentional. The goal is to make it easy for the victim to get content removed, not to put the burden of proof on them.

Why This Law Exists

The nonconsensual sharing of intimate images - what used to be called "revenge porn" - has been a persistent harm for years. AI-generated content made the problem significantly worse: it's now possible to create realistic fake intimate images of someone without ever having a real photo of them.

Previous legal tools were slow, fragmented across states, and rarely available to victims who needed content removed in hours, not months. This law creates a federal floor: a baseline requirement for every covered platform to have a fast removal process.

For small business operators, the human dimension here matters. If you run a platform that serves a community, and someone in that community becomes the target of this kind of abuse, your response in those 48 hours is part of what your platform stands for.

What to Do Before May 19

If you run any platform with user-generated images or video:

  1. Read the FTC's compliance guide: ftc.gov - Complying with the Take It Down Act
  2. Establish a reporting mechanism - even a dedicated email address is a starting point
  3. Create a written process for how a valid request gets handled within 48 hours
  4. Add visible notice about the process somewhere users can find it

If you're not sure whether TIDA covers your platform: Talk to a lawyer before May 19. The FTC's stated position is that they "stand ready to monitor compliance, investigate violations, and enforce."

The deadline is six days away.


Sam Torres covers regulatory, policy, and AI news for small business owners at The Useful Daily. She spent 12 years as a local business journalist. She breaks it down so you can get back to running your business. Source: FTC Press Release, May 11, 2026 | FTC TIDA Compliance Guide
