Florida Senate - 2024 SB 454
By Senator Garcia
36-00574-24 2024454__
1 A bill to be entitled
2 An act relating to the protection of minors on social
3 media platforms; creating s. 501.174, F.S.; defining
4 the terms “account holder” and “social media
5 platform”; requiring social media platforms to develop
6 and implement a content moderation strategy to prevent
7 a minor from being exposed to certain materials on the
8 social media platform; providing requirements for the
9 moderation strategy; requiring social media platforms
10 to verify the ages of users creating accounts on the
11 platforms; requiring certain features and content to
12 be restricted from minors; requiring social media
13 platforms to provide parents or legal guardians with
14 parental control settings that place controls on a
15 minor child’s account; requiring algorithms and a
16 real-time monitoring system that meet certain
17 requirements; requiring social media platforms to
18 report certain activity to the appropriate local
19 authorities or child protection agencies; requiring
20 social media platforms to collaborate with certain
21 entities and experts to ensure compliance with privacy
22 laws and regulations; requiring safety alerts and
23 notifications to account holders; requiring regular
24 audits and assessments of the monitoring and reporting
25 measures; providing penalties under the Florida
26 Deceptive and Unfair Trade Practices Act; providing an
27 effective date.
28
29 Be It Enacted by the Legislature of the State of Florida:
30
31 Section 1. Section 501.174, Florida Statutes, is created to
32 read:
33 501.174 Social media platform content moderation; penalty.—
34 (1) DEFINITIONS.—As used in this section, the term:
35 (a) “Account holder” means a resident of this state who has
36 or opens an account or creates a profile in order to use or
37 access a social media platform.
38 (b) “Social media platform” has the same meaning as in s.
39 112.23.
40 (2) DATA COLLECTION AND ANALYSIS.—A social media platform
41 shall develop and implement a content moderation strategy to
42 prevent an account holder who is a minor from being exposed to
43 content that promotes, glorifies, or facilitates grooming,
44 solicitation, child pornography, or other sexual exploitation or
45 abuse.
46 (a) The content moderation strategy developed must include,
47 at a minimum, all of the following:
48 1. The use of natural language processing techniques that
49 analyze text content for patterns associated with grooming,
50 solicitation, or sexually explicit language involving minors.
51 2. The use of computer vision techniques that analyze
52 images shared on social media platforms to identify sexually
53 explicit or inappropriate images involving minors.
54 (b) The data and information collected pursuant to this
55 subsection must be compiled in a standardized format for
56 analysis.
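
The following is a minimal, illustrative sketch (not part of the act) of how a platform might approach the text-analysis requirement of paragraph (2)(a)1. and compile results in the standardized format paragraph (2)(b) calls for. It assumes a simple pattern-matching screen; a production system would use trained natural language processing and computer vision models, and every identifier, pattern, and field name below is hypothetical.

```python
# Illustrative sketch only: a trivial pattern-based text screen that emits
# flagged items in a standardized record format for later analysis.
# Real systems would use trained NLP/vision models; patterns and field
# names here are hypothetical.
import json
import re
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Hypothetical text patterns associated with solicitation or grooming language.
SUSPICIOUS_PATTERNS = [
    re.compile(r"\bdon'?t tell (your )?parents\b", re.IGNORECASE),
    re.compile(r"\bsend (me )?(a )?(pic|photo)s?\b", re.IGNORECASE),
    re.compile(r"\bhow old are you\b", re.IGNORECASE),
]

@dataclass
class ModerationRecord:
    """Standardized record compiled for analysis (field names are assumptions)."""
    account_id: str
    content_id: str
    matched_pattern: str
    detected_at: str

def screen_text(account_id: str, content_id: str, text: str) -> list[ModerationRecord]:
    """Return one standardized record per suspicious pattern found in the text."""
    now = datetime.now(timezone.utc).isoformat()
    return [
        ModerationRecord(account_id, content_id, p.pattern, now)
        for p in SUSPICIOUS_PATTERNS
        if p.search(text)
    ]

if __name__ == "__main__":
    records = screen_text("acct-123", "msg-456",
                          "Hey, how old are you? Don't tell your parents.")
    # Serialize to a uniform JSON format so records can be aggregated for analysis.
    print(json.dumps([asdict(r) for r in records], indent=2))
```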
57 (3) AGE VERIFICATION AND AGE-RESTRICTED CONTENT.—A social
58 media platform shall:
59 (a) Verify the age of users who attempt to create an
60 account with the social media platform.
61 (b) Identify features and content that are inappropriate
62 for an account holder who is a minor to access and use
63 geofencing to restrict the minor’s access to such content or
64 features.
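
A minimal sketch, assuming a simple date-of-birth check, of the age verification and feature restriction subsection (3) requires; real deployments would verify age through an identity or age-verification provider rather than a self-reported birth date, and the age threshold and feature names here are assumptions.

```python
# Illustrative sketch only: age-gating account creation and withholding
# restricted features from a minor's account. Threshold and feature names
# are hypothetical.
from datetime import date

ADULT_AGE = 18  # assumption; the section does not specify a numeric threshold

def age_on(birth_date: date, today: date) -> int:
    """Compute whole years of age as of `today`."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def is_minor(birth_date: date, today: date | None = None) -> bool:
    """True if the prospective account holder is under the adult-age threshold."""
    return age_on(birth_date, today or date.today()) < ADULT_AGE

# Hypothetical features deemed inappropriate for minors.
RESTRICTED_FEATURES = {"private_messaging_with_adults", "live_location_sharing"}

def allowed_features(all_features: set[str], birth_date: date) -> set[str]:
    """Strip age-restricted features from a minor's account."""
    if is_minor(birth_date):
        return all_features - RESTRICTED_FEATURES
    return all_features
```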
65 (4) PARENTAL CONTROLS.—A social media platform shall
66 provide parental control settings that include geofencing
67 features, allowing parents or legal guardians to set boundaries
68 on their minor child’s social media usage based on location and
69 time restrictions.
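
A minimal sketch of the location-and-time parental controls subsection (4) describes, assuming a circular geofence around a parent-set point and a daily usage window; the coordinates, radius, and hours below are hypothetical.

```python
# Illustrative sketch only: evaluating a parent-configured geofence and time
# window before allowing a minor's session. A real platform would store these
# as per-account parental-control settings.
from dataclasses import dataclass
from datetime import time
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

@dataclass
class ParentalControls:
    home_lat: float
    home_lon: float
    radius_km: float       # geofence boundary set by the parent or guardian
    allowed_from: time     # start of the permitted usage window
    allowed_until: time    # end of the permitted usage window

    def session_allowed(self, lat: float, lon: float, now: time) -> bool:
        """Permit usage only inside the geofence and inside the time window."""
        inside_fence = haversine_km(self.home_lat, self.home_lon, lat, lon) <= self.radius_km
        inside_window = self.allowed_from <= now <= self.allowed_until
        return inside_fence and inside_window

# Example: usage permitted within 5 km of home, 3:00 p.m. to 8:00 p.m.
controls = ParentalControls(27.95, -82.46, 5.0, time(15, 0), time(20, 0))
print(controls.session_allowed(27.96, -82.45, time(16, 30)))  # True: nearby, in window
print(controls.session_allowed(28.54, -81.38, time(16, 30)))  # False: outside the geofence
```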
70 (5) REAL-TIME MONITORING AND REPORTING.—A social media
71 platform shall:
72 (a) Develop and use:
73 1. Algorithms that detect suspicious patterns and flag
74 potentially inappropriate activity, including adult interactions
75 with minors, private messaging frequency, and attempts to
76 establish inappropriate relationships with minors.
77 2. A real-time monitoring system that continuously analyzes
78 social media content and identifies potentially inappropriate
79 activity involving minors, including an automated reporting
80 mechanism that promptly reports identified instances to the
81 appropriate authorities.
82 (b) Prioritize and handle reports of inappropriate activity
83 involving minors by directing reports to the appropriate local
84 authorities or child protection agencies based on the user’s
85 location.
86 (c) Collaborate with law enforcement agencies, child
87 protection agencies, and legal experts to ensure compliance with
88 privacy laws and regulations.
89 (d) Send safety alerts and notifications to an account
90 holder in a specific geographic area that has an increased risk
91 of child exploitation or grooming.
92 (e) Conduct regular audits and assessments to evaluate the
93 effectiveness of the implemented monitoring and reporting
94 measures and make necessary improvements.
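
A minimal sketch of one pattern the monitoring required by paragraph (5)(a) might flag (high-frequency adult-to-minor messaging) and of routing a report by location under paragraph (5)(b); the threshold, agency directory, and reporting stub are assumptions, not a prescribed design, and an actual system would integrate with law enforcement and child protection reporting channels.

```python
# Illustrative sketch only: flagging a suspicious adult-to-minor messaging
# pattern and directing the report to an agency keyed by the minor's location.
# Threshold, directory, and report format are hypothetical.
from collections import Counter
from dataclasses import dataclass

MESSAGE_THRESHOLD = 20  # hypothetical: messages per day from one adult to one minor

# Hypothetical directory mapping a user's county to the agency that takes reports.
AGENCY_BY_COUNTY = {"Miami-Dade": "Miami-Dade CPS intake", "Leon": "Leon County Sheriff"}

@dataclass
class Message:
    sender_id: str
    recipient_id: str
    sender_is_adult: bool
    recipient_is_minor: bool

def flag_suspicious_pairs(messages: list[Message]) -> list[tuple[str, str]]:
    """Return (adult, minor) pairs whose daily message count exceeds the threshold."""
    counts = Counter(
        (m.sender_id, m.recipient_id)
        for m in messages
        if m.sender_is_adult and m.recipient_is_minor
    )
    return [pair for pair, n in counts.items() if n > MESSAGE_THRESHOLD]

def report(pair: tuple[str, str], minor_county: str) -> str:
    """Direct the flagged pair to the appropriate local agency (stubbed here)."""
    agency = AGENCY_BY_COUNTY.get(minor_county, "statewide abuse hotline")
    return f"Reported {pair[0]} -> {pair[1]} to {agency}"
```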
95 (6) PENALTY.—A social media platform that fails to comply
96 with this section commits a deceptive and unfair trade practice
97 under part II of this chapter and is subject to the penalties
98 and remedies provided in that part.
99 Section 2. This act shall take effect July 1, 2024.