BILL NUMBER: S8510A
SPONSOR: FERNANDEZ
TITLE OF BILL:
An act to amend the penal law, the criminal procedure law and the
correction law, in relation to establishing offenses involving sexually
explicit digital alterations
PURPOSE OF THE BILL:
Establishes the crimes of unlawful distribution of a sexually explicit
depiction of a child in the first and second degrees and unlawful access
of a sexually explicit depiction of a child. It also amends unlawful
dissemination or publication of an intimate image (Penal Law § 245.15)
to add two intent elements not previously considered and to require
an actor's knowledge (or presumed knowledge) of the image or video's
creation. Overall, the proposal addresses the threat of digitized sexu-
ally explicit material that is created and shared to circumvent child
sex abuse material laws and is typically harmful to the depicted indi-
viduals.
SUMMARY OF PROVISIONS:
Section 1 of the bill announces the title of the act.
Section 2 adds article 246 to the penal law with the following sections:
Section 246.00 defines the terms used in the article for the crimes of
unlawful distribution of a sexually explicit depiction of a child in the
first and second degrees and unlawful access of a sexually explicit
depiction of a child. This section incorporates by reference the terms
"disseminate" and "publish" as they are used in Penal Law § 245.15
(unlawful dissemination or publication of an intimate image) and "sexual
conduct" as it is used in Penal Law Article 263 (promotion and
possession of a sexual performance of a child). The section also defines
an identifiable person, an individual, a depicted individual, digitiza-
tion, sexually explicit material, and a sexually explicit depiction.
Section 246.05 creates the crime of unlawful access of a sexually
explicit depiction of a child. A person is guilty of this crime when he
or she, knowing the character and content thereof, knowingly accesses,
with intent to view and to sexually gratify any person, any sexually
explicit material of a depicted individual and he or she knows or
reasonably should know that the depicted individual is less than seven-
teen years of age. Unlawful access of a sexually explicit depiction of a
child is a class A misdemeanor. This section is narrowly tailored with
specific mens rea requirements so that it penalizes only those
individuals who perpetuate the dissemination of sexually explicit
depictions of a child or children for sexual gratification and who know
(or should know) that the depictions are created by sexually explicit
digitization.
Section 246.10 creates the crime of unlawful distribution of a sexually
explicit depiction of a child in the second degree. A person commits
this crime when the person, knowing the character and content thereof,
disseminates or publishes sexually explicit material that includes the
depicted individual, and knows or reasonably should know that the
depicted individual is less than seventeen years of age. Unlawful
distribution of a sexually explicit depiction of a child in the second
degree is a class E felony.
Section 246.15 creates the crime of unlawful distribution of a sexually
explicit depiction of a child in the first degree. A person commits this
crime when the person, knowing the character and content thereof,
creates and either disseminates or publishes sexually explicit material
that includes the depicted individual, and the depicted individual is
less than seventeen years of age. Unlawful distribution of a sexually
explicit depiction of a child in the first degree is a class D felony.
The section imposes strict liability on the creator of a sexually
explicit depiction because that person both created and shared the
depiction, perpetuating the severe harms to children that result from
sharing such depictions with others.
Section 246.20 forecloses the use of a disclaimer to escape liability
for the creation, distribution, or publication of sexually explicit
depictions of children.
Section 246.25 reinforces the established policy that those under eigh-
teen cannot consent and forecloses the possibility of any adult consent-
ing on a child's behalf.
Section 246.30 states that nothing in the article should be construed to
limit or enlarge protections provided by 47 U.S.C. § 230 or interfere
with any rights established by Civil Rights Law § 52-c.
Section 3 amends paragraph (a) of subdivision 1 of section 245.15 of
the penal law to include other common mens rea used by those who create
sexual deepfakes.
Section 4 amends paragraph (b) of subdivision 1 of section 245.15 of the
penal law to require an actor charged with this crime to have knowledge
or presumed knowledge of a sexual deepfake because knowledge of a
deepfake's creation in and of itself can alert the person that they
should not be disseminating the sexual image or video.
Sections 5 and 6 establish the statute of limitations for the article
and amend C.P.L. § 30.10(3)(f) to include certain of the enumerated
crimes.
Sections 7, 8, and 9 amend the criminal procedure law to allow certain
of the enumerated offenses to be qualifying felony offenses that are
eligible for bail and other securing orders.
Section 10 amends the correction law to add first- and second-degree
unlawful distribution of a sexually explicit depiction of a child as
reportable sex offenses.
Section 11 provides for the severability of any provision in the event
that any other provision is invalidated.
Section 12 announces the time within which the act shall take effect.
JUSTIFICATION:
The State has a compelling interest in safeguarding the physical,
psychological, and reputational wellbeing of its minors. To that end,
Article 263 was enacted to prohibit the possession and distribution of
child sex abuse material ("CSAM"). The law was later amended to close a
loophole that allowed for the viewing, but not downloading, of CSAM on a
computer. With the advent of newer technology, pedophiles regularly
circumvent existing laws by taking photographs and videos of identifi-
able children and digitally altering those images and videos to make it
appear as if those children are naked or engaged in sexual conduct. The
identity of the depicted children is established by using identifying
features of the children, most often their faces. Their identities are
also often established through information posted along with the altered
content. Bad actors are increasingly using newer artificial intelligence
applications to create incredibly detailed images and videos that are
indistinguishable from real life. In popular parlance, these digitally
altered images and videos are known as deepfakes.
Created using artificial intelligence-assisted technology, deepfakes are
audiovisual material that is convincingly altered and manipulated to
misrepresent individuals as saying or doing something that they have not
actually said or done. Because sexual deepfakes appear genuine, they can
sully reputations, destroy educational and employment opportunities,
ruin personal relationships, and invite unwanted requests for sexual
services.
Deepfakes depicting CSAM are especially pernicious. The images and vide-
os often combine an innocent photograph or video of a child, pulled by a
pedophile from a parent's social media or blog, with a sexual or nude
picture or video of an adult, creating a realistic depiction that can,
once circulated, cause lasting harm to the child. As technology
advances, bad actors can use artificial intelligence programs to easily
eliminate clothing from a photograph or create a hyper-realistic image
or video depicting the identifiable child engaged in a wide range of
sexual conduct.
There is, moreover, no limit to the number of sexual deepfakes of a
child that a pornographer can generate. In one case (United States v.
Mecham, 950 F.3d 267 (5th Cir. 2020)), a Texas man was convicted of
possessing 31,562 images and 1,741 videos of CSAM, all of them digitally
altered to include the faces of the man's grandchildren, ranging in age
from four- to sixteen-years-old. One of the videos showed his grand-
daughter's face on an adult female having sex, and the defendant super-
imposed his face on the male in the video. Victims may understandably
become traumatized when they discover that cherished childhood memories
have been transformed into CSAM. And the problem is only growing. In
2022, the National Center for Missing & Exploited Children's
CyberTipline received approximately thirty-two million reports of
suspected
online CSAM alone. Indeed, in January of 2024, the American Academy of
Pediatrics released a paper entitled Rising Threats of AI-Driven Child
Sexual Abuse Material to warn doctors of the widespread and growing
problem of virtual CSAM.
These sexual deepfakes of children give rise to the same reputational
harm caused by conventional forms of CSAM, but the penalties for their
creation and distribution pale in comparison to those imposed by other
statutes protecting children. In 2003, Congress, recognizing the
burgeoning problem,
modified federal law to include such digitally altered images in its
definition of CSAM. In 2019, Virginia added deepfaked images to its
statute that criminalizes the unlawful dissemination or sale of images
of another person.
New York took its first steps to close this loophole when it amended the
unlawful dissemination or publication of an intimate image statute
(Penal Law § 245.15). This statute, commonly referred to as the
revenge-porn statute, was modified to prohibit the dissemination of
digitally altered images and videos. While this statute prohibits the
sharing of sexual deepfakes of all individuals, including children, it
fails to adequately protect children, a uniquely vulnerable and
defenseless segment of the population in need of greater protection
than adults. This is unsurprising considering the law's intent was the
protection of adults. In fact, the statute does not even specifically
mention children. This lack of specialized protection is at odds with
the rest of the Penal Law that routinely enhances protections for chil-
dren. See, e.g., Penal Law § 130.25(1), 130.30, 130.35, 130.75, 130.80,
130.95, 260.10, 263.15, 263.16.
Tragically, pedophiles who traffic in sexual deepfakes of children are
able to do so in the absence of appropriate laws. And a technological
solution does not seem to be imminent. In what some liken to a game of
whack-a-mole, the research community is developing improved methods of
face manipulation detection as powerful synthetic face generators
produce even more realistic images and videos. C. Rathgeb et al. (eds.),
Handbook of Digital Face Manipulation and Detection (2022). As of yet,
however, there is no tool that can reliably identify and stop the
creation and dissemination of sexual deepfakes of children. And, as our
history with the internet has proved time and again, bad actors will
likely find ways to stay ahead of any tools created.
The spread of sexual deepfakes can affect any community. In 2021, for
instance, approximately thirteen women alerted Nassau County police that
they had discovered images of themselves on a pornographic website. The
women reported that dozens of images, taken when they were in middle and
high school, had been reposted on the website from their own social
media platforms and altered to depict them as engaging in sexual
conduct. These new digitally altered images misrepresented the victims
as engaging in sexual conduct when they were minors. For many of the
women, their faces had been superimposed on separate images of others
engaged in sexual activity. The posted images were accompanied by the
women's actual names, addresses, telephone numbers, and social media
contact information. Some of the images were doctored to make it appear
as if they were voluntarily sent by the victim through social media to
the recipient. One victim was as young as twelve years old in the
original photographs that had been altered. As a result of their images
being posted, some of these women were subjected to constant harassment
and frequent unwanted requests for sexual services.
Nassau County police thereafter charged a local man with posting the
images and a grand jury indicted him for crimes including stalking,
aggravated harassment, and promoting CSAM related to his dissemination
of an unaltered sexual image of an actual child. Many of the thirteen
women had attended middle school and high school with the man accused of
creating digitally altered images, and he allegedly targeted these women
because of perceived slights he had suffered. In fact, he used his
purported friendships with these women to gain access to their other-
wise-private social media accounts. Prosecutors could not charge him
with publishing and distributing the digitally altered images of the
underage girls because New York did not penalize this conduct, notwith-
standing the fact that it subjected the victims to years of harassment,
embarrassment, and reputational harm.
The Digital Alterations Protection Act ("DAPA") offers a path forward to
protect New York's children from the scourge of sexual deepfakes. By
creating felony offenses with enhanced punishments, DAPA increases the
deterrent effect on those considering trafficking in sexual deepfakes of
children. At the same time, DAPA signals to these bad actors that the
protection of children is a fundamental interest of the State. For any
individual who is undeterred by the possibility of being convicted of a
felony offense, DAPA requires those convicted to register as a sex
offender, which often comes with increased monitoring that can aid in
preventing recidivism. Indeed, DAPA mirrors the reporting requirement
imposed on those convicted under the existing CSAM laws found in Penal
Law sections
263.15 and 263.16 (promoting a sexual performance by a child and
possessing a sexual performance by a child).
In light of the above, it is necessary to continue the important work
started by the New York Legislature this past term. Passage of DAPA will
protect the growing number of victims from the severe reputational,
emotional, and financial harms caused by the depraved and abhorrent
conduct of those creating sexual deepfakes from publicly shared media of
children and adults.
PRIOR LEGISLATIVE HISTORY:
New Bill
FISCAL IMPLICATIONS FOR STATE AND LOCAL GOVERNMENTS:
None.
EFFECTIVE DATE:
This act would take effect immediately.