[Congressional Bills 119th Congress]
[From the U.S. Government Publishing Office]
[H.R. 7110 Introduced in House (IH)]

119th CONGRESS
  2d Session
                                H. R. 7110

 To require agencies that use, fund, or oversee algorithms to have an 
office of civil rights focused on bias, discrimination, and other harms 
                 of algorithms, and for other purposes.


_______________________________________________________________________


                    IN THE HOUSE OF REPRESENTATIVES

                            January 15, 2026

   Ms. Lee of Pennsylvania (for herself, Ms. Norton, Ms. Tlaib, Mrs. 
    Ramirez, Mr. Johnson of Georgia, Mr. Thanedar, Mr. Thompson of 
   Mississippi, Mr. Evans of Pennsylvania, Ms. Bonamici, Mrs. Watson 
Coleman, Mrs. Foushee, and Mr. Green of Texas) introduced the following 
 bill; which was referred to the Committee on Oversight and Government 
                                 Reform

_______________________________________________________________________

                                 A BILL


 
 To require agencies that use, fund, or oversee algorithms to have an 
office of civil rights focused on bias, discrimination, and other harms 
                 of algorithms, and for other purposes.

    Be it enacted by the Senate and House of Representatives of the 
United States of America in Congress assembled,

SECTION 1. SHORT TITLE.

    This Act may be cited as the ``Eliminating Bias in Algorithmic 
Systems Act of 2026''.

SEC. 2. DEFINITIONS.

    In this Act:
            (1) Agency.--The term ``agency'' has the meaning given the 
        term in section 3502 of title 44, United States Code.
            (2) Covered agency.--The term ``covered agency'' means an 
        agency that--
                    (A) uses, funds, or procures a covered algorithm, 
                or funds or otherwise participates in the development 
                of a covered algorithm; or
                    (B) oversees, regulates, or advises on the 
                development or use of a covered algorithm.
            (3) Covered algorithm.--The term ``covered algorithm'' 
        means a process that--
                    (A) is--
                            (i) a computational process that uses 
                        machine learning, natural language processing, 
                        artificial intelligence techniques, or other 
                        computational processing techniques of similar 
                        or greater complexity; or
                            (ii) a computational process derived from a 
                        process described in clause (i); and
                    (B) has the potential to have a material effect on 
                the impact of, access to, availability of, eligibility 
                for, cost of, terms of, or conditions of--
                            (i) a program operated or funded by an 
                        agency;
                            (ii) an economic opportunity regulated by 
                        an agency; or
                            (iii) rights protected by an agency.
            (4) Protected characteristic.--The term ``protected 
        characteristic'' means any of the following actual or perceived 
        traits of an individual or group of individuals:
                    (A) Race.
                    (B) Color.
                    (C) Ethnicity.
                    (D) National origin, nationality, or immigration 
                status.
                    (E) Religion.
                    (F) Sex (including a sex stereotype, pregnancy, 
                childbirth, or a related medical condition, sexual 
                orientation or gender identity, and sex 
                characteristics, including intersex traits).
                    (G) Disability.
                    (H) Limited English proficiency.
                    (I) Biometric information.
                    (J) Familial or marital status.
                    (K) Source of income.
                    (L) Income level (not including the ability to pay 
                for a specific good or service being offered).
                    (M) Age.
                    (N) Veteran status.
                    (O) Genetic information or medical conditions.
                    (P) Any other classification protected by Federal 
                law.

SEC. 3. CIVIL RIGHTS OFFICES AND REPORTING ON AI BIAS, DISCRIMINATION, 
              AND OTHER HARMS.

    (a) Offices of Civil Rights.--The head of each covered agency shall 
ensure that the covered agency has an office of civil rights that 
employs experts and technologists focused on bias, discrimination, and 
other harms, including the effect or tendency to subject communities, 
groups, or individuals to bias based on, discrimination based on, or 
other harms attributable to possessing or being perceived as possessing 
a protected characteristic.
    (b) Bias, Discrimination, and Other Harms Reports.--Not later than 
1 year after the date of enactment of this Act, and every 2 years 
thereafter, each office of civil rights of a covered agency established 
under subsection (a) shall submit to each congressional committee with 
jurisdiction over the covered agency a report that details--
            (1) the state of the field and technology of covered 
        algorithms with respect to the jurisdiction of the covered 
        agency, 
        including risks relating to bias based on, discrimination based 
        on, and other harms attributable to possessing or being 
        perceived as possessing a protected characteristic;
            (2) any relevant steps the covered agency has taken to 
        mitigate harms from covered algorithms relating to bias based 
        on, discrimination based on, and other harms attributable to 
        possessing or being perceived as possessing a protected 
        characteristic;
            (3) actions the covered agency has taken to engage with 
        relevant stakeholders, including industry representatives, 
        businesses, civil rights advocates, consumer protection 
        organizations, other relevant civil society organizations, 
        academic experts, individuals with technical expertise, 
        organizations representing workers, and affected populations, 
        regarding bias, discrimination, and other harms including the 
        effect or tendency to subject communities, groups, or 
        individuals to bias based on, discrimination based on, and 
        other harms attributable to possessing or being perceived as 
        possessing a protected characteristic; and
            (4) any relevant recommendations for legislation or 
        administrative action to mitigate bias based on, discrimination 
        based on, and other harms attributable to possessing or being 
        perceived as possessing a protected characteristic from covered 
        algorithms, as determined appropriate by the head of the 
        office.
    (c) Interagency Working Group.--Not later than 1 year after the 
date of enactment of this Act, the Assistant Attorney General in charge 
of the Civil Rights Division of the Department of Justice shall 
establish an interagency working group on covered algorithms and civil 
rights, of which each office of civil rights of a covered agency 
established under subsection (a) shall be a member.
    (d) Authorization of Appropriations.--There are authorized to be 
appropriated to each covered agency such sums as may be necessary to 
carry out this Act.