BILL NUMBER: S4609A
SPONSOR: GOUNARDES
 
TITLE OF BILL:
An act to amend the general business law, in relation to establishing
the stop online predators act
 
PURPOSE OR GENERAL IDEA OF BILL:
To prevent the explosive growth of child predators on certain digital
platforms by defaulting to certain privacy and security settings for
child users
 
SUMMARY OF PROVISIONS:
Section one of this bill names it the Stop Online Predators Act (SOPA).
Section two of this bill creates a new Article 45-A in General Business
Law (GBL) to mandate certain privacy settings by default for child users
under the age of 18. The mandate would apply to all social media plat-
forms covered under SOPA, defined as digital platforms which host user-
generated content, allow users to construct a public or semi-public
profile, and allow users to directly message each other as a significant
part of the provision of the service. The New York State Attorney Gener-
al would be empowered to further define scope in regulations promulgated
pursuant to this act, just as they are already doing for similar stat-
utes such as the SAFE for Kids Act (Article 45 of GBL).
All social media platforms covered under SOPA would be required to
turn off
open chat functions, which allow users to instantly and privately commu-
nicate with child users whether or not they know such child or have been
previously connected. Unconnected users would also be barred from view-
ing the full profile of a child user, tagging them in a post, viewing
the minor's posted geographic location, or sending them digital curren-
cy. Minors' accounts also could not be suggested to other unknown
users unless the minor has synced their preexisting contacts with the
platform. Nothing, however, would prevent a user from searching for a
child user at any time and submitting a friend request.
Parents would be able to override these default privacy settings and
switch to a different setting if they so choose. Parents would also be
notified when a child user attempts to change their privacy settings to
something less protective than the default, and, if the minor is under
the age of 16, parental approval for the less protective setting
would be required.
These contact settings would notably not apply to connections which a
parent and/or child has previously approved by accepting a friend
request. For all child users under the age of 13, parents would have to
approve incoming friend requests and would also be able to view the list
of their child's current friends. For child users 13 and over, the child
themself can approve friend requests and the parent is not granted this
visibility.
Finally, for all users under the age of 18, parents would also be
required to approve all financial transactions on their child's account
which pertain to other users. Platforms must set up a mechanism by which
a parent can view financial transactions pertaining to other users on a
child user's account at any time.
This bill prescribes the minimum standard for privacy settings;
platforms may implement something more protective if they desire.
Platforms would be required to conduct commercially reasonable age
verification, which many social media platforms already perform under
Article 45 of GBL as well as under various laws in other countries
and states (Sakasegawa, J. (2024, August 29). The state of age
verification in social media: an overview. Persona.
https://withpersona.com/blog/age-verification-in-social-media).
Operators would be barred from deploying dark patterns, defined as any
mechanism or design on a platform which intentionally inhibits user
choice and/or autonomy, in order to prevent any user or their parent
from exercising their rights under this article. One example of a dark
pattern in the context of SOPA might be a mechanism that technically
allows parents to view their under-13 child's connected accounts and
financial transactions but is so difficult to access that it is essen-
tially useless.
Operators would not be able to induce parents to change the required
privacy settings in this bill by, for example, degrading the quality or
increasing the price of the platform. Enforcement against violations of
the bill would be vested in the New York State Attorney General, who
would be empowered to pursue damages of $5,000 per violation.
Section three of this bill is a severability clause.
Section four of this bill sets the effective date.
 
JUSTIFICATION:
Child safety experts estimate that there are approximately 500,000
online predators active on any given day. According to the FBI, over
50% of the victims of online sexual exploitation are between the ages
of 12 and 15, and an estimated 89% of sexual advances occur in
Internet chatrooms or through instant messaging (Kraut, M. E. (2024).
Children and Grooming / Online Predators. Child Crime Prevention &
Safety Center. childsafety.losangelescriminallawyer.pro).
Fifty-eight percent of parents report being concerned about online
predation, yet only seven percent of parents were aware that their
children had received inappropriate content from an adult. Forty
percent of children in grades four through eight report chatting
online with a stranger, and Internet use among three- to
four-year-olds has doubled within the last five years (Lazic, M.
(2023, May 19). How Many Predators are Online Each Day? (Online
Predators Statistics). Legaljobs.io).
Virtual platforms like Facebook, Instagram, Snapchat, TikTok, X, and
Roblox, where adult users can collect vast troves of information about
child users and lure them into private chats within minutes, have become
veritable hunting grounds for pedophiles in the modern era. Over 80% of
child sex crimes can be traced back to social media, and reports of
online child exploitation surged by a staggering 106% in the early days
of the COVID-19 lockdown, when many households moved online (Lazic,
M. (2023, May 19). How Many Predators are Online Each Day? (Online
Predators Statistics). Legaljobs.io). Many platforms have thus taken
the responsible step of creating "privacy by default" settings for
users under a certain age, meaning that the strictest possible privacy
settings are applied without manual input. Such settings limit which
types of users can message and tag underage accounts.
Despite these efforts, however, critical gaps in the online safety net
remain: platforms turn a blind eye to the millions of underage users who
lie about their age to create an account, bolstered by the 26-year-old
federal Children's Online Privacy Protection Act (COPPA), which only
holds them liable if they have "actual knowledge" that a user is under
the age of 13 - a high legal bar which is virtually impossible to clear
in court. The Federal Trade Commission (FTC) openly admits that there is
nothing in COPPA to prevent users from lying about their age, assuring
companies that they need only establish a date-of-birth portal for users
to self-report age - despite such portals' notorious unreliability
(Federal Trade Commission. (2020, July 20). Complying with COPPA:
Frequently asked questions).
Furthermore, even where a platform does know a user's age, not all have
chosen to deploy privacy by default for minors' accounts. The popular
gaming app Roblox, for example, the subject of several sweeping press
investigations and Attorney General actions in at least four states,
previously boasted an open chat function wherein a gamer of any age
could post anything they wanted in a game chat and then privately
message other users. Roblox curtailed the private chats only after a
barrage of public criticism, and child safety features remain
inconsistent across apps, with many dependent upon proactive "family
pairing" by the parent and minor - a high-friction process that data
tells us barely any parents bother to undertake (Tenbarge, Kat. "Fewer
than 1% of Parents Use Social Media Tools to Monitor Their Children's
Accounts, Tech Companies Say." NBC News, 29 Mar. 2024,
www.nbcnews.com/tech/social-media/fewer-1-parents-use-social-media-tools-monitor-childrens-accounts-tech-rcna145592).
This bill, known as the Stop Online Predators Act (SOPA), requires
social media and gaming platforms that feature user-to-user messaging to
undertake several common-sense steps to protect kids online. First,
it requires them to turn off open chat functions and public profiles by
default for any user under the age of 18. Unconnected accounts would
also not be able to tag minors in content or view their posted geograph-
ic location. While parents and children can override these default
privacy settings, a parent would be required to approve the override if
a child user is attempting to change to a setting that is less protec-
tive than the default and the child is under 16.
Other users can only message child users if their friend request has
been previously accepted - which, for users under the age of 13, will
require parental approval. Parents would also be required to approve
financial transactions connected to a minor's account that pertain to
other users, as the exchange of digital forms of currency, such as
Roblox's "Robux," has featured prominently in nearly every case of sexu-
al assault and abuse connected to the app. Parents would also be able to
view a list of recent financial transactions connected to an account
which pertain to other users and, for users under the age of 13, a list
of current friends. For parents of users between the ages of 13 and 17,
however, the parent would gain no special visibility into the child's
contacts, as these privacy settings would simply operate as an
"on/off" toggle.
This bill leverages the power of default settings to more effectively
protect minors online. When Instagram unveiled its Teen Accounts in
2024, for example, which set under-18 accounts to private by default
and limited certain direct messages and tags, 97% of teens between
the ages of 13 and 15 stayed in these built-in settings ("We're
Introducing New Built-in Restrictions for Instagram Teen Accounts,
and Expanding to Facebook and Messenger | Meta." Meta, 8 Apr. 2025,
about.fb.com/news/2025/04/introducing-new-built-in-restrictions-instagram-teen-accounts-expanding-facebook-messenger/).
Better privacy settings would not only help parents identify and
report the early stages of predatory behavior, but would also deter
future sexual exploitation by predators who know that their
interactions with their next child victim will be closely watched.
Violations of this bill would be enforced by the Office of the Attorney
General, which is well-equipped to investigate allegations of misconduct
through its Bureau of Internet and Technology.
The remedy created by SOPA is inspired by the one prescribed by the
Federal Trade Commission in its 2022 settlement against Epic Games,
Inc., the creator of the video game Fortnite, which similarly found
that Fortnite's on-by-default live text and voice communications had
put
children and teens at serious risk (United States of America v. Epic
Games, Inc. 16 Dec. 2022,
www.ftc.gov/system/files/ftc_gov/pdf/2223087EpicGamesSettlement.pdf. p.
17.).
In mandating common-sense measures to protect child safety, some of
which have already been adopted by the world's leading social platforms,
this bill would send a clear message that New York has zero tolerance
for platforms that prioritize daily active user count at the expense of
kids' safety. It is the least we can do to ensure safer digital spaces
for the most vulnerable among us.
 
PRIOR LEGISLATIVE HISTORY:
2024: Referred to Rules
2025: Referred to Internet and Technology
 
FISCAL IMPLICATIONS:
TBD
 
EFFECTIVE DATE:
This act shall take effect on the one hundred eightieth day after the
office of the attorney general shall promulgate rules and regulations
necessary to effectuate the provisions of this act.