[Congressional Bills 118th Congress]
[From the U.S. Government Publishing Office]
[S. 5152 Introduced in Senate (IS)]

<DOC>

118th CONGRESS
  2d Session
                                S. 5152

To establish protections for individual rights with respect to computational algorithms, and for other purposes.

_______________________________________________________________________

                   IN THE SENATE OF THE UNITED STATES

                           September 24, 2024

Mr. Markey (for himself and Ms. Hirono) introduced the following bill; which was read twice and referred to the Committee on Commerce, Science, and Transportation

_______________________________________________________________________

                                 A BILL

To establish protections for individual rights with respect to computational algorithms, and for other purposes.

    Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled,

SECTION 1. SHORT TITLE; TABLE OF CONTENTS.

    (a) Short Title.--This Act may be cited as the ``Artificial Intelligence Civil Rights Act of 2024''.
    (b) Table of Contents.--The table of contents for this Act is as follows:

Sec. 1. Short title; table of contents.
Sec. 2. Definitions.

                         TITLE I--CIVIL RIGHTS

Sec. 101. Discrimination.
Sec. 102. Pre-deployment evaluations and post-deployment impact assessments.

            TITLE II--COVERED ALGORITHM AND CONTRACT STANDARDS

Sec. 201. Covered algorithm standards.
Sec. 202. Relationships between developers and deployers.
Sec. 203. Human alternatives and other protections.

                        TITLE III--TRANSPARENCY

Sec. 301. Notice and disclosure.
Sec. 302. Study on explanations regarding the use of covered algorithms.
Sec. 303. Consumer awareness.

                         TITLE IV--ENFORCEMENT

Sec. 401. Enforcement by the Commission.
Sec. 402. Enforcement by States.
Sec. 403. Private right of action.
Sec. 404. Severability.
Sec. 405. Rules of construction.

                      TITLE V--FEDERAL RESOURCES

Sec. 501. Occupational series relating to algorithm auditing.
Sec. 502. United States Digital Service algorithm auditors.
Sec. 503. Additional Federal resources.

SEC. 2. DEFINITIONS.

    In this Act:
        (1) Collect; collection.--The terms ``collect'' and ``collection'', with respect to personal data, mean buying, renting, gathering, obtaining, receiving, accessing, or otherwise acquiring such data by any means.
        (2) Commission.--The term ``Commission'' means the Federal Trade Commission.
        (3) Consequential action.--The term ``consequential action'' means an act that is likely to have a material effect on, or to materially contribute to, access to, security and authentication relating to, eligibility for, cost of, terms of, or conditions related to any of the following:
            (A) Employment, including hiring, pay, independent contracting, worker management, promotion, and termination.
            (B) Education and vocational training, including assessment, proctoring, promotion of academic integrity, accreditation, certification, admissions, and provision of financial aid and scholarships.
            (C) Housing and lodging, including rental and short-term housing and lodging, home appraisals, rental subsidies, and publicly supported housing.
            (D) Essential utilities, including electricity, heat, water, municipal trash or sewage services, internet and telecommunications service, and public transportation.
            (E) Health care, including mental health care, and dental, vision, and adoption services.
            (F) Credit, banking, and other financial services.
            (G) Insurance.
            (H) Actions of the criminal justice system, law enforcement or intelligence operations, immigration enforcement, border control (vetting, screening, and inspection), child protective services, child welfare, and family services, including risk and threat assessments, situational awareness and threat detection, investigations, watchlisting, bail determinations, sentencing, administration of parole, surveillance, use of unmanned vehicles and machines, and predictive policing.
            (I) Legal services, including court-appointed counsel services and alternative dispute resolution services.
            (J) Elections, including voting, redistricting, voter eligibility and registration, support or advocacy for a candidate for Federal, State, or local office, distribution of voting information, election security, and election administration.
            (K) Government benefits and services, as well as identity verification, fraud prevention, and assignment of penalties.
            (L) A public accommodation.
            (M) Any other service, program, product, or opportunity which has a comparable legal, material, or similarly significant effect on an individual's life as determined by the Federal Trade Commission through rules promulgated pursuant to section 553 of title 5, United States Code.
        (4) Covered algorithm.--
            (A) In general.--The term ``covered algorithm'' means a computational process derived from machine learning, natural language processing, artificial intelligence techniques, or other computational processing techniques of similar or greater complexity, that, with respect to a consequential action--
                (i) creates or facilitates the creation of a product or information;
                (ii) promotes, recommends, ranks, or otherwise affects the display or delivery of information that is material to the consequential action;
                (iii) makes a decision; or
                (iv) facilitates human decision making.
            (B) Modified definition by rulemaking.--The Commission may promulgate regulations under section 553 of title 5, United States Code, to modify the definition of the term ``covered algorithm'' as the Commission considers appropriate.
        (5) Covered language.--The term ``covered language'' means the 10 languages with the most speakers in the United States, according to the most recent data collected by the United States Census Bureau.
        (6) De-identified data.--The term ``de-identified data'' means information--
            (A) that does not identify and is not linked or reasonably linkable to an individual or a device, regardless of whether the information is aggregated; and
            (B) with respect to which any developer or deployer using such information--
                (i) takes reasonable technical measures to ensure that the information cannot, at any point, be used to re-identify any individual or device that identifies or is linked or reasonably linkable to an individual;
                (ii) publicly commits in a clear and conspicuous manner--
                    (I) to process and transfer the information solely in a de-identified form without any reasonable means for re-identification; and
                    (II) to not attempt to re-identify the information with any individual or device that identifies or is linked or reasonably linkable to an individual; and
                (iii) contractually obligates any person that receives the information from the developer or deployer--
                    (I) to comply with all of the provisions of this paragraph with respect to such information; and
                    (II) to require that such contractual obligations be included in all subsequent instances for which the information may be received.
        (7) Deployer.--
            (A) In general.--The term ``deployer'' means any person, other than an individual acting in a non-commercial context, that uses a covered algorithm in or affecting interstate commerce.
            (B) Rule of construction.--The terms ``deployer'' and ``developer'' shall not be interpreted to be mutually exclusive.
        (8) Developer.--
            (A) In general.--The term ``developer'' means any person, other than an individual acting in a non-commercial context, that designs, codes, customizes, produces, or substantially modifies an algorithm that is intended or reasonably likely to be used as a covered algorithm for such person's own use, or use by a third party, in or affecting interstate commerce.
            (B) Assumption of developer responsibilities.--In the event that a deployer uses an algorithm as a covered algorithm, and no person is considered the developer of the algorithm for purposes of subparagraph (A), the deployer shall be considered the developer of the covered algorithm for the purposes of this Act.
            (C) Rule of construction.--The terms ``developer'' and ``deployer'' shall not be interpreted to be mutually exclusive.
        (9) Disparate impact.--
            (A) In general.--The term ``disparate impact'' means an unjustified differential effect on an individual or group of individuals on the basis of an actual or perceived protected characteristic.
            (B) Unjustified differential effect.--For purposes of subparagraph (A), with respect to the action, policy, or practice of a person, a differential effect is unjustified if--
                (i) the person fails to demonstrate that such action, policy, or practice causing the differential effect is necessary to achieve a substantial, legitimate, and nondiscriminatory interest; or
                (ii) in the event the person demonstrates such interest, an alternative action, policy, or practice could serve such interest with less differential effect.
            (C) Application to covered algorithms.--With respect to demonstrating that a covered algorithm causes or contributes to a differential effect, the covered algorithm is presumed to be not separable for analysis and may be analyzed holistically as a single action, policy, or practice, unless the developer or deployer proves that the covered algorithm is separable by a preponderance of the evidence.
        (10) Harm.--The term ``harm'', with respect to a consequential action, means a non-de minimis adverse effect on an individual or group of individuals--
            (A) on the basis of a protected characteristic;
            (B) that involves the use of force, coercion, harassment, intimidation, or detention; or
            (C) that involves the infringement of a right protected under the Constitution of the United States.
        (11) Independent auditor.--
            (A) In general.--The term ``independent auditor'' means an individual that conducts a pre-deployment evaluation or impact assessment of a covered algorithm in a manner that exercises objective and impartial judgment on all issues within the scope of such evaluation or assessment.
            (B) Exclusion.--An individual is not an independent auditor of a covered algorithm if such individual--
                (i) is or was involved in using in a commercial context, developing, offering, licensing, or deploying the covered algorithm;
                (ii) at any point during the pre-deployment evaluation or impact assessment, has an employment relationship (including a contractor relationship) with a developer or deployer that uses, offers, or licenses the covered algorithm; or
                (iii) at any point during the pre-deployment evaluation or impact assessment, has a direct financial interest or a material indirect financial interest in a developer or deployer that uses, offers, or licenses a covered algorithm, not including routine payment for the auditing services described in subparagraph (A).
        (12) Individual.--The term ``individual'' means a natural person in the United States.
        (13) Personal data.--
            (A) In general.--The term ``personal data''--
                (i) means information that identifies or is linked or reasonably linkable, alone or in combination with other information, to an individual or an individual's device; and
                (ii) shall include derived data and unique persistent identifiers.
            (B) Exclusion.--The term ``personal data'' does not include de-identified data.
        (14) Process.--The term ``process'', with respect to personal data, means to conduct or direct any operation or set of operations performed on such data, including analyzing, organizing, structuring, retaining, storing, using, or otherwise handling such data.
        (15) Protected characteristic.--The term ``protected characteristic'' means any of the following actual or perceived traits of an individual or group of individuals:
            (A) Race.
            (B) Color.
            (C) Ethnicity.
            (D) National origin or nationality.
            (E) Religion.
            (F) Sex (including a sex stereotype, pregnancy, childbirth, or a related medical condition, sexual orientation or gender identity, and sex characteristics, including intersex traits).
            (G) Disability.
            (H) Limited English proficiency.
            (I) Biometric information.
            (J) Familial status.
            (K) Source of income.
            (L) Income level (not including the ability to pay for a specific good or service being offered).
            (M) Age.
            (N) Veteran status.
            (O) Genetic information or medical conditions.
            (P) Any other classification protected by Federal law.
        (16) Public accommodation.--
            (A) In general.--The term ``public accommodation'' means--
                (i) a business that offers goods or services to the general public, regardless of whether the business is operated for profit or operates from a physical facility;
                (ii) a park, road, or pedestrian pathway open to the general public;
                (iii) a means of public transportation; or
                (iv) a publicly owned or operated facility open to the general public.
            (B) Exclusions.--The term ``public accommodation'' does not include a private club or establishment described in section 101(b)(2).
        (17) State.--The term ``State'' means each of the 50 States, the District of Columbia, Puerto Rico, the United States Virgin Islands, Guam, American Samoa, and the Commonwealth of the Northern Mariana Islands.
        (18) State data protection authority.--The term ``State data protection authority'' means an independent public authority of a State that supervises, investigates, and regulates data protection and security law in the State, including handling complaints lodged against persons for violations of State and relevant Federal laws.
        (19) Transfer.--The term ``transfer'', with respect to personal data, means to disclose, release, disseminate, make available, license, rent, or share such data orally, in writing, electronically, or by any other means.

                         TITLE I--CIVIL RIGHTS

SEC. 101. DISCRIMINATION.

    (a) In General.--A developer or deployer shall not offer, license, promote, sell, or use a covered algorithm in a manner that--
        (1) causes or contributes to a disparate impact in;
        (2) otherwise discriminates in; or
        (3) otherwise makes unavailable,
the equal enjoyment of goods, services, or other activities or opportunities, related to a consequential action, on the basis of a protected characteristic.
    (b) Exceptions.--This section shall not apply to--
        (1) the offer, licensing, or use of a covered algorithm for the sole purpose of--
            (A) a developer's or deployer's self-testing (or auditing by an independent auditor at a developer's or deployer's request) to identify, prevent, or mitigate discrimination, or otherwise