[Congressional Bills 118th Congress]
[From the U.S. Government Publishing Office]
[S. 4614 Introduced in Senate (IS)]

<DOC>

118th CONGRESS
  2d Session

                                S. 4614

To direct the Secretary of Health and Human Services and the Secretary of Education to coordinate and distribute educational materials and resources regarding artificial intelligence and social media platform impact, and for other purposes.

_______________________________________________________________________

                   IN THE SENATE OF THE UNITED STATES

                             June 20, 2024

Mr. Markey introduced the following bill; which was read twice and referred to the Committee on Health, Education, Labor, and Pensions

_______________________________________________________________________

                                 A BILL

To direct the Secretary of Health and Human Services and the Secretary of Education to coordinate and distribute educational materials and resources regarding artificial intelligence and social media platform impact, and for other purposes.

    Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled,

SECTION 1. SHORT TITLE.

    This Act may be cited as the ``Social Media and AI Resiliency Toolkits in Schools Act'' or the ``SMART in Schools Act''.

SEC. 2. DEFINITIONS.

    In this Act:
        (1) ESEA definitions.--The terms ``elementary school'', ``evidence-based'', ``local educational agency'', ``paraprofessional'', ``parent'', ``secondary school'', ``specialized instructional support personnel'', and ``State educational agency'' have the meanings given the terms in section 8101 of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 7801).
        (2) Bureau-funded school.--The term ``Bureau-funded school'' has the meaning given the term in section 1141 of the Education Amendments of 1978 (25 U.S.C. 2021).
        (3) Departments.--The term ``Departments'' means the Department of Education and the Department of Health and Human Services.
        (4) Digital citizenship.--The term ``digital citizenship'' means the ability to--
            (A) safely, responsibly, and ethically use communication technologies and digital information technology tools and platforms;
            (B) create and share media content using principles of social and civic responsibility and with awareness of the legal and ethical issues involved; and
            (C) participate in the political, economic, social, and cultural aspects of life related to technology, communications, and the digital world by consuming and creating digital content, including media.
        (5) Digital resilience.--The term ``digital resilience'' means the ability to recognize, manage, and recover from online risks.
        (6) Educator.--The term ``educator'' means an early childhood educator, teacher, or paraprofessional, serving students.
        (7) Gender identity.--The term ``gender identity'' means the gender-related identity, appearance, mannerism, or other gender-related characteristic of an individual, regardless of the designated sex at birth of the individual.
        (8) Health care provider serving pediatric patients.--The term ``health care provider serving pediatric patients'' means a health care provider who serves children, including a family medicine physician, pediatrician, child and adolescent psychiatrist, mental health provider, or behavioral health provider.
        (9) Labor organization.--The term ``labor organization'' has the meaning given the term in section 2 of the National Labor Relations Act (29 U.S.C. 152).
        (10) School or educational agency administrator.--
            (A) In general.--The term ``school or educational agency administrator'' means an individual who is a principal, other school leader, superintendent, or other employee or officer of an elementary school or secondary school, local educational agency, State educational agency, or other entity operating an elementary school or secondary school.
            (B) Exception.--The term ``school or educational agency administrator'' does not include an individual solely due to the individual's service as a member of a public board of education or other public authority legally constituted within a State for either administrative control or direction of, or to perform a service function for, public elementary schools or secondary schools.
        (11) Secretaries.--The term ``Secretaries'' means the Secretary of Health and Human Services and the Secretary of Education, acting jointly or acting jointly through their designees.
        (12) Sexual orientation.--The term ``sexual orientation'' means how a person identifies in terms of their emotional, romantic, or sexual attraction, and includes identification as straight, heterosexual, gay, lesbian, or bisexual, among other terms.
        (13) Student.--The term ``student'' means a student in any of grades kindergarten through grade 12.
        (14) Toolkit.--The term ``toolkit'' means a collection of materials and resources to inform responsible use of artificial intelligence and social media platforms.
        (15) Tribal educational agency.--The term ``Tribal educational agency'' has the meaning given the term (without regard to capitalization) in section 6132(b) of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 7452).

SEC. 3. JOINT DEVELOPMENT OF EDUCATIONAL TOOLKIT ON ARTIFICIAL INTELLIGENCE AND SOCIAL MEDIA PLATFORM IMPACT, RESPONSIBLE USES OF THESE TECHNOLOGIES, AND THE IMPACT ON YOUTH MENTAL HEALTH.
    (a) Development of Toolkits.--
        (1) In general.--Beginning not later than 1 year after the date of enactment of this Act, the Secretaries shall--
            (A) develop, and update on a biennial basis, including with reference to any existing resources, toolkits to facilitate greater awareness of, and ability to respond to, the impact of artificial intelligence and social media platforms on students, in accordance with subsections (b) through (d); and
            (B) not less frequently than once a year, disseminate such toolkits to school or educational agency administrators, educators, specialized instructional support personnel, health care providers serving pediatric patients, students, parents, guardians, and caregivers in accordance with subsection (e).
        (2) Consultation and considerations.--In developing the educational materials and resources described in paragraph (1), the Secretaries shall--
            (A) consult with--
                (i) students, parents, guardians, and caregivers;
                (ii) relevant subject-matter experts;
                (iii) labor organizations representing educators, health care providers serving pediatric patients, and specialized instructional support personnel;
                (iv) professional organizations representing educators, health care providers serving pediatric patients, and specialized instructional support personnel;
                (v) health care providers serving pediatric patients;
                (vi) specialized instructional support personnel and educators;
                (vii) youth-serving or community-based youth-oriented organizations; and
                (viii) school or educational agency administrators; and
            (B) consider evidence-based recommendations from other groups as determined necessary by the Secretaries.
    (b) Toolkits Audiences.--In order to carry out subsection (a), the Secretaries shall create different toolkits tailored for each of the following audiences:
        (1) Students.
        (2) Educators.
        (3) Specialized instructional support personnel.
        (4) Health care providers serving pediatric patients.
        (5) Parents, guardians, and caregivers.
        (6) School or educational agency administrators.
        (7) Additional audiences, as the Secretaries determine necessary.
    (c) Tenets for Educational Resources.--The information provided in the toolkits described in subsection (a) shall be--
        (1) in an easily accessible and understandable format;
        (2) evidence-based; and
        (3) culturally appropriate and in a manner that is inclusive of race, ethnicity, language spoken, disability, geographic location, gender identity, and sexual orientation.
    (d) Contents of Educational Resources.--
        (1) In general.--The toolkits described in subsection (a) shall be designed to--
            (A) strengthen digital resilience and improve the ability to recognize, manage, recover from, and avoid perpetuating online risks (such as harassment, excessive use, discrimination, and other impacts to mental health) with respect to youth mental health concerns due to artificial intelligence and social media platform use;
            (B) provide information and instruction regarding healthy and responsible use cases of artificial intelligence and social media platform technologies and examples of responsible and healthy use of such technologies; and
            (C) provide evidence-based education to the relevant audience regarding--
                (i) artificial intelligence and social media platform education, including privacy concerns;
                (ii) the mental health implications and risk factors of excessive, irresponsible, maladaptive, or otherwise unhealthy use for students; and
                (iii) methods that the audience can use to seek help for a student with respect to excessive, irresponsible, maladaptive, or otherwise unhealthy artificial intelligence or social media platform use.
        (2) Group-specific content requirements.--The toolkits described in subsection (a) for each audience described in subsection (b) shall meet the following requirements:
            (A) Students.--Such toolkits for students shall--
                (i) provide accessible explanations, differentiated for various grade levels, for how artificial intelligence and social media platforms function;
                (ii) provide skills to identify generative artificial intelligence and the use of such technologies in ``human-like'' or ``companion'' chatbots, and information on how to interact with such artificial intelligence responsibly;
                (iii) inform students of indicators that the students are interacting with artificial intelligence and algorithms while using the internet and social media platform applications, including, as age appropriate--
                    (I) information about attention-diverting and disguised algorithmic techniques like dark patterns; and
                    (II) information regarding, and examples of, the effects of bad training or incomplete datasets on perpetuating existing inequities, including incorrect and negative outputs of artificial intelligence such as hallucinations, deep fakes, and false information;
                (iv) inform students of their rights online, both on social media platform applications and with regard to artificial intelligence;
                (v) teach digital resilience;
                (vi) teach digital citizenship and the skills necessary to reduce online risks from the user end;
                (vii) teach students to recognize excessive, irresponsible, maladaptive, or otherwise unhealthy use of social media platforms and how to initiate a conversation about such use or how to seek help from an adult; and
                (viii) provide information on unique impacts for students based on race, language spoken, disability, geographic location, gender identity, and sexual orientation.
            (B) Educators.--Such materials and resources for educators shall--
                (i) define and provide an appropriate knowledge base of artificial intelligence systems and social media platforms, including information regarding contexts and instances where technologies and functions that rely on artificial intelligence are in use;
                (ii) provide additional, specific information on--
                    (I) the ways in which students are uniquely vulnerable to generative artificial intelligence and ``human-like'' or ``companion'' chatbots and other high-risk applications of artificial intelligence;
                    (II) specific risks for different age groups of students; and
                    (III) data privacy and management, including technologies that rely on artificial intelligence to--
                        (aa) surveil students;
                        (bb) track students' academic outcomes and engagement; and
                        (cc) monitor students' online activities;
                (iii) provide information on the benefits of responsible use and strategies to encourage responsible use of artificial intelligence and social media platforms, including practical examples of how to teach and engage students to understand responsible use, which may include professional development and training opportunities in addition to the information provided in the toolkit;
                (iv) provide information on the ways in which artificial intelligence and social media platform use outside of the classroom impacts student academic achievement, well-being, and mental health, and school climate;
                (v) inform how to recognize excessive, irresponsible, maladaptive, or otherwise unhealthy use of social media platforms in the educator's age group of students;
                (vi) provide information on available resources educators can inform a student of if the educator identifies--
                    (I) excessive, irresponsible, maladaptive, or otherwise unhealthy artificial intelligence and social media platform use or content; or
                    (II) the use of these technologies impacting mental health;
                (vii) provide engagement strategies with parents, guardians, and caregivers to address excessive, irresponsible, maladaptive, or otherwise unhealthy artificial intelligence and social media platform use; and
                (viii) provide information on unique impacts for students based on race, language spoken, disability, geographic location, gender identity, or sexual orientation, including providing guidance for educators on how to present this information to students.
            (C) Specialized instructional support personnel.--Such materials and resources for specialized instructional support personnel shall meet the requirements for educators under subparagraph (B) and also include--
                (i) clinically relevant information on the mental health impacts of excessive, irresponsible, maladaptive, or otherwise unhealthy artificial intelligence and social media platform use;
                (ii) more information on available in-school behavioral health or school resources that can be employed to assist in the prevention and early intervention of mental