In recent years, we’ve seen a disturbing rise in the unauthorized sharing of intimate images and AI-generated deepfakes online. Victims often face significant challenges in having these images removed from websites and social media platforms, leading to continued circulation and repeated trauma.

In 2022, Congress passed legislation creating a civil cause of action for victims to sue individuals responsible for publishing non-consensual intimate imagery. However, bringing a civil action can be time-consuming, expensive, and may force victims to relive trauma.

While some social media platforms make good-faith efforts to remove these images, there is currently no universal standard requiring prompt removal.

My bill will address this issue by requiring social media platforms and similar websites to remove non-consensual and unauthorized intimate images within 48 hours of notification. Additionally, these platforms would be required to maintain a clear, user-friendly webpage where victims can submit removal requests.

As cyberbullying and online harassment become increasingly prevalent, ensuring a pathway for victims to rapidly remove harmful content is essential.

Please join me in co-sponsoring this important legislation, which will help to protect individuals from ongoing digital abuse in an increasingly connected world.