BILL NUMBER: S4506
SPONSOR: GOUNARDES
 
TITLE OF BILL:
An act to amend the general business law, in relation to enacting the
Stop Addictive Feeds Exploitation (SAFE) for all act
 
PURPOSE OR GENERAL IDEA OF BILL:
To create a healthier and more ethical digital landscape by allowing
users to turn off algorithmic feeds and other predatory features meant
to extend time spent on social media.
 
SUMMARY OF PROVISIONS:
Section one of this bill names it the Stop Addictive Feeds Exploitation
(SAFE) for All Act.
Section two of the bill adds a new article 45-A to General Business Law
(GBL).
Section 1509 of Article 45-A sets definitions. The SAFE for All Act
shall apply to all addictive social media platforms, defined as a
website, online service, online application, or mobile application that
is accessed by a user in this state and offers an addictive feed as a
significant part of the provision of such website, service, or applica-
tion, mirroring the scope provisions in the SAFE for Kids Act (Article
45 of General Business Law).
Section 1510 of Article 45-A provides that all addictive social media
platforms shall establish mechanisms by which users can, at their
discretion: turn off algorithmic recommendations in their feed; turn off
notifications on the platform (at a minimum, this setting shall include
an option for a user to turn off notifications altogether and an option
to set an overnight "quiet mode" between 12:00am and 6:00am); turn off
autoplay; and set screen time limits that lock a user out of a platform.
A setting that simply reminds a user of time spent on the platform
rather than allowing them to limit access shall not count as a screen
time limit.
Section 1511 of Article 45-A requires the above-described settings to be
clear and easily accessible for users. Addictive social media platforms
shall not deploy dark patterns, meaning design choices that
intentionally inhibit user choice and suppress the ability to exercise
rights under Section 1510 of the article. Common examples of dark
patterns that might apply under this bill include misleading language
that doesn't clearly describe what a given setting is intended to do
(trick wording), preselected default options that nudge users into
choosing a given setting over another, or visual interference that
obscures opt-out icons or presents them in illegibly small font (Davis,
Matt. "9 Dark Pattern Examples to Look out For." Osano, www.osano.com,
1 Dec. 2023).
Subdivision two of section 1511 provides that addictive social media
platforms shall not deploy dark patterns that make it harder for a user
to deactivate, reactivate, suspend, or cancel an account.
Section 1512 of Article 45-A provides that nothing in this act shall be
construed to override, supplant, or conflict with the provisions of the
SAFE for Kids Act in Article 45 of GBL. This means that addictive social
media companies would still be required to conduct commercially
reasonable age verification under Section 1501 of GBL to determine
whether or not a user is a minor. For all users under the age of 18, the
provisions of Article 45 shall continue to apply: platforms must switch
off addictive feeds and overnight notifications for minors by default,
in addition to offering all users the feature customizations prescribed
in the new Article 45-A.
Section 1513 of Article 45-A provides that this act shall apply to
conduct that occurs in whole or in part in New York.
Section 1514 of Article 45-A requires the New York State Attorney Gener-
al to promulgate regulations to effectuate and enforce the provisions of
this act.
Section 1515 of Article 45-A creates remedies. The Attorney General can
pursue damages of $5,000 per instance of noncompliance with the act.
Section three of this bill is a severability clause.
Section four of this bill sets the effective date.
 
JUSTIFICATION:
Much has been written about the impact of social media on teen mental
health, with the U.S. Surgeon General declaring the link between social
media and anxiety, depression, and insomnia in young people to be so
strong that he famously called it unsafe for kids in a May 2023 public
health advisory (Social Media and Youth Mental Health: The U.S. Surgeon
General's Advisory. (2023). U.S. Surgeon General.
https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf).
Yet the negative repercussions of excessive social media use do not
affect only those under the age of 18, with multiple studies showing
similar links between social media use and depression in adult users as
well (O'Donnell, E. (2022, February 7). Social Media Use and Adult
Depression | Harvard Magazine. www.harvardmagazine.com.
https://www.harvardmagazine.com/2022/02/right-now-social-media-adult-depression).
Worldwide, more than 59% of the population uses social media, with the
average individual dedicating around two and a half hours a day to
social media apps; in one 2019 study, 40% of people between the ages of
18 and 22 reported feeling "addicted" to social media (Bergman, M.
(2023, December 4). Why is Social Media Addictive? Social Media Victims
Law Center.
https://socialmediavictims.org/social-media-addiction/why-is-social-media-addictive/).
One need not travel far to find the reasons for such feelings of
addiction, as technologists and regulators alike describe a series of
deliberate design choices driven by an understanding of the human brain
and its dopamine loops (Brooks, J. (2017, June 9). 7 Specific Tactics
Social Media Companies Use to Keep You Hooked. KQED.
https://www.kqed.org/futureofyou/397018/7-specific-ways-social-media-companies-hav-you-hooked).
Notifications, for example, deploy what psychologists call a "variable
rewards system" to keep users compulsively
checking their phones, in the same way that gamblers compulsively pull
the lever of a slot machine at a casino. Features such as video autoplay
and infinite scroll are based on a study involving trick bowls of soup,
in which subjects whose bowls were imperceptibly refilled as they
ate wound up consuming 73% more soup than those who received the visual
cue of an empty bowl. Machine-driven algorithms combine all of the
above techniques with a "fear of missing out" (FOMO), collecting hoards
of personal data on users to drive personalized content to the top of
their feeds so that they keep checking the app as frequently as possi-
ble. Lacking a moral compass, these machines will frequently recommend
content related to violence, cyberbullying, self-harm, eating disorders,
and body dysmorphia if they have detected that the user may be more
susceptible to engaging with such content. In the words of one former
Google engineer, social media companies have engaged in a "race to the
bottom of the brain stem," tapping our most primitive emotions of fear,
anxiety, and loneliness in order to maximize user engagement (Cooper, A.
(2017, April 9). What is "brain hacking"? Tech insiders on why you
should care. cbsnews.com.
https://www.cbsnews.com/news/brain-hacking-tech-insiders-60-minutes/).
New York State recently took action to curtail Silicon Valley's worst
impulses with the passage of the Stop Addictive Feeds Exploitation
(SAFE) for Kids Act (Ch. 120 of 2024), which turns off social media's
user-personalization algorithms for all users under the age of 18 by
default, unless a parent switches them back on. This bill, the Stop
Addictive Feeds Exploitation (SAFE) for All Act, extends similar
protections to older users by requiring social media platforms to allow
all users to turn off their algorithm. Feeds would then revert to being
chronological, displaying content of pages and accounts that a user has
proactively decided to follow rather than suggesting new content to
which they haven't subscribed (note that while social media algorithms
perform a number of functions, ranking content based on virality, recen-
cy, originality, image or video quality, community guidelines, and
reporting by other users, the SAFE for All Act is narrowly focused on
algorithms that promote content based on information specific to a user
or their device rather than these other data points) (Gagliardi, A.
(2024, March 21). This is How The Instagram Algorithm Works in 2019.
Later Blog. https://later.com/blog/how-instagram-algorithm-works/).
The SAFE for All Act would also require all social media platforms
serving users in the state to allow users to turn off notifications
and autoplay, as well as to set screen time limits that restrict a
user's access to a platform if the user so chooses. Platforms would be
unable to hide these prescribed settings behind dark patterns, which the
Federal Trade Commission (FTC) describes as "design practices that trick
or manipulate users into making decisions they would not otherwise have
made" (Gorman, F., Chapin, B., Jacob, R., & May, J. M. (2023, August
14). FTC Targets "Dark Patterns" in Actions Against Amazon and
Publishers Clearing House. WilmerHale.
https://www.wilmerhale.com/insights/client-alerts/20230814-ftc-targets-dark-patterns-in-actions-against-amazon-and-publishers-clearing-house).
Dark patterns, the subject of several recent FTC lawsuits calling such
practices illegal under various consumer protection statutes, might
appear in the context of the SAFE for All Act as misleading language
that doesn't clearly describe what a given setting is intended to do
(trick wording), preselected default options that nudge users into
choosing a given setting over another, or visual interference that
obscures opt-out icons or presents them in illegibly small font (Davis,
Matt. "9 Dark Pattern Examples to Look out For." Osano, www.osano.com,
1 Dec. 2023). Platforms would be further prohibited from deploying dark
patterns to
prevent users from deactivating, reactivating, suspending, or canceling
an account, such as limiting account deactivations or refusing to let
users cancel one social media account without also canceling another one
owned by the same company. Enforcement would be vested in the Office of
the Attorney General, which is well-equipped to investigate allegations
of misconduct as the entity also enforcing the SAFE for Kids Act.
In curbing the most egregious features of the apps increasingly dominat-
ing our global politics, psychology, economy, and forms of socializa-
tion, the SAFE for All Act empowers users by allowing them to design
their app experience in the manner that they see fit. While not specific
to users under the age of 18, this bill also promotes child safety: by
empowering all users to customize features such as autoplay,
notifications, algorithmic feeds, and screen time limits, it fosters
digital literacy amongst adults. The more deftly adults are able to
navigate their
own social media experience and understand its impact on their mental
health and productivity, the more likely they are to have these critical
conversations with their children and to activate digital well-being
features for them as well.
In summary, the SAFE for All Act prioritizes user autonomy over time
spent on the app, sending a clear signal to some of the most profitable
companies in the world that New Yorkers are not willing to boost
corporate bottom lines at the expense of their own mental health and
well-being.
 
PRIOR LEGISLATIVE HISTORY:
None
 
FISCAL IMPLICATIONS:
TBD
 
EFFECTIVE DATE:
This act shall take effect on the one hundred eightieth day after the
office of the attorney general shall have promulgated the rules and
regulations necessary to effectuate the provisions of this act.