BILL NUMBER: S4609
SPONSOR: GOUNARDES
TITLE OF BILL:
An act to amend the general business law, in relation to establishing
the New York children's online safety act
PURPOSE OR GENERAL IDEA OF BILL:
To prevent the explosive growth of child predators on certain digital
platforms by defaulting to certain privacy and security settings for
child users
SUMMARY OF PROVISIONS:
Section one of this bill names it the New York Children's Online
Safety Act (NYCOSA).
Section two of this bill creates a new Article 45-A in General Business
Law (GBL) to mandate certain privacy settings by default for child users
under the age of 18. The mandate would apply to all social media plat-
forms covered under NYCOSA, defined as digital platforms which host
user-generated content, allow users to construct a public or semi-public
profile, and allow users to directly message each other as a significant
part of the provision of such platform. The New York State Attorney
General would be empowered to further define the scope of coverage in
regulations promulgated pursuant to this act, as the office is
already doing for similar statutes such as the SAFE for Kids Act
(Article 45 of GBL).
All social media platforms under NYCOSA would be required to turn off
open chat functions, which allow adults to instantly and privately
communicate with child users whether or not they know such child or have
been previously connected. Unconnected users would also be barred from
viewing the profile of a child user, tagging them in a post, or sending
them digital currency. However, parents would be able to override
these default privacy settings and switch to a different setting if
they so choose. Parents would also be notified when a child user
attempts to change these settings on their own, at which point the
parent would be able to either approve or deny the change.
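For illustration only, the default-off behavior described above can
be modeled as a handful of settings plus an approval gate. The
following is a minimal Python sketch, not language from the bill;
every class, field, and function name in it is hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class ContactSettings:
        # Hypothetical defaults for a user under 18: open chat is off,
        # and unconnected users cannot view the profile, tag the child,
        # or send the child digital currency.
        open_chat_enabled: bool = False
        profile_visible_to_unconnected: bool = False
        taggable_by_unconnected: bool = False
        currency_receivable_from_unconnected: bool = False

    @dataclass
    class ChildAccount:
        age: int
        settings: ContactSettings = field(default_factory=ContactSettings)

    def request_setting_change(account: ChildAccount, by_parent: bool,
                               setting: str, value: bool) -> bool:
        # A parent may override a default directly; a child's attempt
        # only notifies the parent, who then approves or denies it.
        if by_parent:
            setattr(account.settings, setting, value)
            return True
        notify_parent(account, setting, value)
        return False

    def notify_parent(account: ChildAccount, setting: str,
                      value: bool) -> None:
        print(f"Parent notified: child requested {setting} -> {value}")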
These contact settings would notably not apply to connections which a
parent and/or child has previously approved by accepting a friend
request. For all child users under the age of 13, parents would have to
approve incoming friend requests and would also be able to view the list
of their child's current friends. For child users 13 and over, the
child may approve friend requests on their own, and the parent is not
granted this visibility.
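The age split described in the preceding paragraph reduces to a
simple routing rule. A rough sketch under the same caveats
(hypothetical names, Python used purely for illustration):

    def route_friend_request(child_age: int) -> str:
        # Under 13: the parent approves requests and can view the
        # friend list. Ages 13-17: the child approves requests, and
        # the parent is not granted that visibility.
        if child_age < 13:
            return "parent_approval_queue"
        return "child_approval_queue"

    assert route_friend_request(11) == "parent_approval_queue"
    assert route_friend_request(15) == "child_approval_queue"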
For all users under the age of 18, parents would also be required to
approve all financial transactions related to their child's account.
Social media platforms must set up a mechanism by which a parent can
view financial transactions of a child user's account at any time.
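A mechanism satisfying both requirements, per-transaction parental
approval and on-demand review, might look roughly like the sketch
below. Again, this is purely illustrative; the bill prescribes the
outcome, not any particular implementation.

    from datetime import datetime, timezone

    class MinorAccountLedger:
        def __init__(self) -> None:
            self.pending: list[tuple] = []  # awaiting parental approval
            self.history: list[tuple] = []  # approved transactions

        def submit(self, description: str, amount: float) -> None:
            # Every transaction on an under-18 account is held until a
            # parent approves it.
            self.pending.append(
                (datetime.now(timezone.utc), description, amount))

        def parent_approve(self, index: int) -> None:
            self.history.append(self.pending.pop(index))

        def parent_view(self) -> list:
            # Parents can view all transactions, approved or pending,
            # at any time.
            return self.history + self.pending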
Social media operators would be required to conduct commercially
reasonable age verification to determine which of their users are
minors covered by the provisions of the bill, a step many social
media platforms are already required to take under Article 45 of GBL
as well as various laws in other countries and states (Sakasegawa, J.
(2024, August 29). The state of age verification in social media: an
overview. Persona.
https://withpersona.com/blog/age-verification-in-social-media).
Operators would be barred from deploying dark patterns, defined as any
mechanism or design on a platform which intentionally inhibits user
choice and/or autonomy, in order to prevent any user or their parent
from exercising their rights under this article. One example of a dark
pattern in the context of the New York Children's Online Safety Act
might be a mechanism that technically allows parents to view their
under-13 child's connected accounts and financial transactions but is so
difficult to access that it is essentially useless.
Operators would not be able to induce parents to change the required
privacy settings in this bill by, for example, degrading the quality or
increasing the price of the platform. Enforcement against violations of
the bill would be vested in the New York State Attorney General, who
would be empowered to pursue damages of $5,000 per violation.
Section three of this bill is a severability clause.
Section four of this bill sets the effective date.
JUSTIFICATION:
Child safety experts estimate that there are approximately 500,000
online predators active on any given day. According to the FBI, over 50%
of the victims of online sexploitation are between the ages of 12 and
15, and an estimated 89% of sexual advances occur in Internet chatrooms
or through instant messaging (Kraut, M. E. (2024). Children and
Grooming / Online Predators. Child Crime Prevention & Safety Center.
Childsafety.losangelescriminallawyer.pro). Fifty-eight percent of
parents report being concerned about online predation, yet only seven
percent of parents were aware that their children had received
inappropriate content from an adult. Forty percent of children in
grades four through eight report
chatting online with a stranger, and Internet use amongst three to four-
year-olds has doubled within the last five years (Lazic, M. (2023, May
19). How Many Predators are Online Each Day? (Online Predators Statis-
tics). Legaljobs.io).
Virtual platforms like Facebook, Instagram, Snapchat, TikTok, X, and
Roblox, where adult users can collect vast troves of information about
child users and lure them into private chats within minutes, have become
veritable hunting grounds for pedophiles in the modern era. Over 80% of
child sex crimes can be traced back to social media, and reports of
online child exploitation surged by a staggering 106% in the early days
of the COVID-19 lockdown when many households moved online (Lazic, M.
(2023, May 19). How Many Predators are Online Each Day? (Online
Predators Statistics). Legaljobs.io). Many platforms have thus taken
the
responsible step of creating certain "privacy by default" settings for
users under a certain age, meaning that the strictest possible privacy
settings are applied without manual input. Such settings limit which
types of adult users can message and tag underage accounts.
Despite these efforts, however, critical gaps in the online safety net
remain: platforms turn a blind eye to the millions of underage users who
lie about their age to create an account, bolstered by the 26-year-old
federal Children's Online Privacy Protection Act (COPPA), which only
holds them liable if they have "actual knowledge" that a user is under
the age of 13 - a high legal bar which is virtually impossible to clear
in court. The Federal Trade Commission (FTC) openly admits that there is
nothing in COPPA to prevent users from lying about their age, assuring
companies that they need only establish a date-of-birth portal for users
to self-report age - despite such portals' notorious unreliability
(Federal Trade Commission. (2020, July 20). Complying with COPPA:
Frequently asked questions. Federal Trade Commission.).
Furthermore, even where a platform does know a user's age, not all have
chosen to deploy privacy by default for minors' accounts. The popular
gaming app Roblox, for example, the subject of several sweeping press
investigations, boasts an open chat function wherein a gamer of any age
can post anything they want in a game chat and privately message other
users. Highly contentious amongst child safety experts, Roblox's open
chat leaves it "to parents to activate child safety features such as
restricting what categories of people their kids can talk to, or which
games they can play. If parents don't, children can introduce themselves
to any stranger in a game, chat for hours and accept requests to
converse in private messages." (Carville, O., & D'Anastasio, C. (2024,
July 22). Roblox Is Fighting to Keep Pedophiles Away and Not Always
Winning. Bloomberg.com). In September 2023, Roblox announced it would
be launching a new feature, Roblox Connect, that enables users as
young as 13 to initiate avatar voice calls, complete with facial
motion tracking technology, with any other user, despite 13-year-olds
being the prime target demographic for online predators (Hatmaker, T.
(2023, September 8). Roblox is launching avatar-based voice calls
with facial motion tracking. TechCrunch). Roblox Connect was
immediately
panned by the National Center on Sexual Exploitation, which pointed out
that such voice chats are one of the primary methods by which online
predators groom their victims, establishing emotional connections via an
impossible-to-monitor medium with the intent of gaining their trust.
Features such as open chat and Roblox Connect have not escaped the
attention of the Internet's least savory characters: in 2023, Roblox
reported 13,316 instances of child exploitation to the National Center
for Missing & Exploited Children and responded to 1,300 requests for
information from law enforcement (Carville, O., & D'Anastasio, C. (2024,
July 22). Roblox Is Fighting to Keep Pedophiles Away and Not Always
Winning. Bloomberg.com.). With little to no barrier to entry (Roblox
allows users to sign up without emails or parental permission), child
users on the platform can find games revolving around sex and virtual
"strip clubs" within minutes (Roblox: A Mainstream Contributor to Sexual
Exploitation. (n.d.). National Center on Sexual Exploitation.
https://endsexualexploitation.org/roblox/). An October 2024 study by the
investment research firm Hindenburg Research LLC found that researchers
were unable to create a test account with the name "Jeffrey Epstein" as
it, along with more than 900 variations, was already taken. ("Roblox:
Inflated Key Metrics for Wall Street and a Pedophile Hellscape for Kids
- Hindenburg Research." Hindenburgresearch.com, 8 Oct. 2024,
hindenburgresearch.com/roblox/). Usernames were also taken for
another notorious
child abuser, Earl Brian Bradley, who was indicted on 471 charges of
molesting, raping, and exploiting 103 children, and researchers were
able to access games like "Escape to Epstein Island" and over 600 games
involving the term "Diddy" (e.g., "Run From Diddy Simulator," "Diddy
Party") within minutes, despite having registered as a child under
the
age of 13. Roblox's pedophile problem is so severe, in fact, that a
short seller in 2023 was able to drop the company's share price eight
percent simply by publishing a blog post aggregating all of the arrests
linked to the site.
This bill, known as the New York Children's Online Safety Act (NYCOSA),
requires social media and gaming platforms that feature user-to-user
messaging to undertake several common-sense steps to better protect kids
online. Firstly, it requires them to turn off open chat functions by
default for any user under the age of 18, unless a parent switches them
back on. Adult users can only message child users if their friend
request has been previously accepted - which, for users under the age of
13, will require parental approval. Parents would also be required to
approve financial transactions connected to a minor's account, as the
exchange of digital forms of currency, such as Roblox's "Robux," has
featured prominently in nearly every case of sexual assault and abuse
connected to the app. Parents would also be able to view a list of
recent financial transactions connected to an account and, for users
under the age of 13, a list of current friends. This would not only
assist parents in being able to identify and report early stages of
predatory behavior, but would also deter future sexploitation from pred-
ators who know that their interactions with their next child victim will
be closely watched. Violations of NYCOSA would be enforced by the Office
of the Attorney General, which is well-equipped to investigate allega-
tions of misconduct through its Bureau of Internet and Technology. The
remedy created by NYCOSA is the same as that prescribed by the
Federal Trade Commission in its 2022 settlement with Epic Games,
Inc., the creator of the video game Fortnite, in which the FTC
similarly found that Fortnite's live, on-by-default text and voice
communications had put children and teens at serious risk (United
States of America v. Epic Games, Inc., 16 Dec. 2022,
www.ftc.gov/system/files/ftc_gov/pdf/2223087EpicGamesSettlement.pdf,
p. 17).
In mandating common-sense measures to protect child safety, many of
which have already been adopted by the world's leading social media
platforms, this bill would send a clear message that New York has zero
tolerance for platforms that prioritize daily active user count at the
expense of kids' safety. It is the least we can do to ensure safer
digital spaces for the most vulnerable among us.
PRIOR LEGISLATIVE HISTORY:
None
FISCAL IMPLICATIONS:
TBD
EFFECTIVE DATE:
This act shall take effect on the one hundred eightieth day after the
office of the attorney general shall promulgate rules and regulations
necessary to effectuate the provisions of this act.