1 _________________________________ _________________________________
2 Councilmember Anita Bonds Councilmember Robert C. White, Jr.
3
4
5 _________________________________ _________________________________
6 Councilmember Janeese Lewis George Councilmember Charles Allen
7
8
9 _________________________________
10 Councilmember Zachary Parker
11
12
13
14
15 A BILL
16 _______
17
18 IN THE COUNCIL OF THE DISTRICT OF COLUMBIA
19 ________________
20
21 To prohibit users of algorithmic decision-making from utilizing algorithmic eligibility
22 determinations in a discriminatory manner, to require corresponding notices to
23 individuals whose personal information is used, and to provide for appropriate means of
24 civil enforcement.
25
26 BE IT ENACTED BY THE COUNCIL OF THE DISTRICT OF COLUMBIA, That this
27 act may be cited as the “Stop Discrimination by Algorithms Act of 2023”.
28 Sec. 2. Findings and declaration of policy.
29 The Council of the District of Columbia makes the following findings:
30 (a) It is the sense of the Council that technological advancements should support the
31 dignity and well-being of the people of the District.
32 (b) Computers and data-derived decision-making tools play ever larger roles in modern
33 life. As of 2019, 90 percent of U.S. adults regularly used the internet. Approximately 76 percent
34 of households in the District of Columbia have a broadband internet subscription, and many who
35 lack a home internet connection use smartphones to go online.
36 (c) When District residents engage in online activities like posting on social media,
37 searching web-based listings for an apartment, or submitting electronic job applications, they
38 generate personalized information that is harvested by data collectors. Data collectors can track
39 hundreds of categories of data about specific individuals including age, gender, employment
40 status and place of employment, income level, sexual orientation, national origin, and religion.
41 (d) Companies often use data from both online and offline sources to create algorithms,
42 which are tools that use machine learning and personal data to make educated guesses about an
43 individual’s preferences, abilities, and future behavior. These algorithms are then incorporated
44 into decision-making processes that affect many aspects of life.
45 (e) Increasingly, algorithms determine an individual’s opportunities to secure
46 employment, insurance, credit, housing, and public accommodations, as well as access to
47 information about those opportunities.
48 (f) Algorithms often rely on personal traits protected under the D.C. Human Rights Act.
49 And algorithmic decision-making can amplify discrimination based on race, gender, sexual
50 orientation, disability, age, source of income, credit information, and other protected traits when
51 algorithmic models replicate existing societal inequalities. Algorithmic decision-making systems
52 that fail to account for bias disproportionately harm marginalized communities.
53 (g) Despite their prevalence and the potential problems they pose, algorithms are poorly
54 understood by most individuals, in part because of the many entities involved and the lack of
55 accountability among those entities.
56 (h) This act seeks to protect individuals and classes of individuals from the harm that
57 results when algorithmic decision-making processes operate without transparency, rely on
58 protected traits and other personal data that are correlated with those traits, or disproportionately
59 limit access to and information about important life opportunities. The act combats these
60 challenges by:
61 (1) Encouraging transparency and accountability by requiring covered entities to
62 provide notice to individuals about how they use personal information in algorithmic
63 decisions, including additional information when an algorithmic decision results in an
64 adverse action; to audit their algorithmic determination practices for discriminatory
65 processing or impact; and to report this information to the Office of the Attorney General;
66 (2) Prohibiting adverse algorithmic decision-making that is based on protected
67 traits or that has the effect of making decisions based on such traits; and
68 (3) Creating public investigatory and enforcement authority, and an individual
69 right of action.
70 Sec. 3. Definitions.
71 The following words and terms when used in this act have the following meanings:
72 (1) “Adverse action” means a denial, cancellation, or other adverse change or assessment
73 regarding an individual’s eligibility for, opportunity to access, or terms of access to important
74 life opportunities.
75 (2) “Algorithmic eligibility determination” means a determination based in whole or in
76 significant part on an algorithmic process that utilizes machine learning, artificial intelligence, or
77 similar techniques to determine an individual’s eligibility for, or opportunity to access, important
78 life opportunities.
79 (3) “Algorithmic information availability determination” means a determination based in
80 whole or in significant part on an algorithmic process that utilizes machine learning, artificial
81 intelligence, or similar techniques to determine an individual’s receipt of advertising, marketing,
82 solicitations, or offers for an important life opportunity.
83 (4) “Covered entity” means any individual, firm, corporation, partnership, cooperative,
84 association, or any other organization, legal entity, or group of individuals however organized,
85 including entities related by common ownership or corporate control, that either makes
86 algorithmic eligibility determinations or algorithmic information availability determinations, or
87 relies on algorithmic eligibility determinations or algorithmic information availability
88 determinations supplied by a service provider, and that meets one of the following criteria:
89 (A) Possesses or controls personal information on more than 25,000 District
90 residents;
91 (B) Has greater than $15 million in average annualized gross receipts for the 3
92 years preceding the most recent fiscal year;
93 (C) Is a data broker, or other entity, that derives 50 percent or more of its annual
94 revenue by collecting, assembling, selling, distributing, providing access to, or maintaining
95 personal information, where some proportion of the personal information concerns a District
96 resident who is not a customer or an employee of that entity; or
97 (D) Is a service provider.
98 (5) “Important life opportunities” means access to, approval for, or offer of credit,
99 education, employment, housing, a place of public accommodation as defined in section 102(24)
100 of the Human Rights Act of 1977, effective December 13, 1977 (D.C. Law 2-38; D.C. Official
101 Code § 2-1401.02(24)), or insurance.
102 (6)(A) “Personal information” means any information held by a covered entity –
103 regardless of how the information is collected, inferred, derived, created, or obtained – that is
104 linked or reasonably linkable to an individual, household, or a personal device.
105 (B) Information is reasonably linkable to an individual, household, or personal
106 device if it can be used on its own or in combination with other information reasonably available
107 to the covered entity, regardless of whether such other information is held by the covered entity,
108 to identify an individual, household, or personal device.
109 (C) Examples of personal information include:
110 (i) Individually identifiable information such as a real name, alias,
111 signature, date of birth, union membership number, postal address, unique personal identifier,
112 online identifier, internet protocol address, media access control (MAC) address, unique device
113 identifier, email address, phone number, account name, social security number, military
114 identification number, driver’s license number, vehicle identification number, passport number,
115 or other similar identifiers;
116 (ii) A person’s race, national origin, religious affiliation, gender identity,
117 sexual orientation, marital status, or disability;
118 (iii) Commercial information, including records of personal property,
119 products or services purchased, obtained, or considered, or other purchasing or consuming
120 histories or tendencies;
121 (iv) Real-time or historical geolocation data more specific than a 50-mile
122 radius;
123 (v) Education records, as defined in 34 C.F.R. § 99.3 or any successor
124 regulation;
125 (vi) Biometric data, including voice signatures, facial geometry,
126 fingerprints, and retina/iris scans;
127 (vii) Inferences drawn from any of the information identified in sub-
128 subparagraphs (i)-(vi) to create a profile about an individual reflecting the individual’s
129 predispositions, behavior, habits, attitudes, intelligence, abilities, and aptitudes.
130 (7) “Service provider” means any entity that performs algorithmic eligibility
131 determinations or algorithmic information availability determinations on behalf of another entity.
132 Sec. 4. Prohibited practices.
133 (a) In general.
134 (1) A covered entity shall not make an algorithmic eligibility determination or an
135 algorithmic information availability determination on the basis of an individual’s or class of
136 individuals’ actual or perceived race, color, religion, national origin, sex, gender identity or
137 expression, sexual orientation, familial status, source of income, or disability in a manner that
138 segregates, discriminates against, or otherwise makes important life opportunities unavailable to
139 an individual or class of individuals.
140 (2) Any practice that has the effect or consequence of violating paragraph (1) of
141 this subsection shall be deemed to be an unlawful discriminatory practice.
142 (b) Exemptions.
143 (1) Nothing in subsection (a) shall limit the availability of the exemptions in
144 section 103 of the Human Rights Act of 1977, effective December 13, 1977 (D.C. Law 2-38;
145 D.C. Official Code § 2-1401.03).
146 (2) Nothing in this act shall prohibit covered entities from using individuals'
147 personal information, as part of an affirmative action plan adopted pursuant
148 to District or federal law, to make algorithmic eligibility determinations
149 or algorithmic information availability
150 determinations.
151 Sec. 5. Relationships with service providers.
152 Any covered entity that relies in whole or in part on a service provider to conduct an
153 algorithmic eligibility determination or an algorithmic information availability determination
154 shall require by written agreement that the service provider implement and maintain measures
155 reasonably designed to ensure that the service provider complies with this act.
156 Sec. 6. Right to notice and disclosure.
157 (a) Notice requirement.
158 A covered entity shall:
159 (1) Develop a notice about how the covered entity uses personal information in
160 algorithmic eligibility determinations and algorithmic information availability determinations,
161 including:
162 (A) What personal information the covered entity collects, generates,
163 infers, uses, and retains;
164 (B) What sources the covered entity uses to collect, generate, or infer
165 personal information;
166 (C) Whether the personal information is shared, sold, leased, or exchanged
167 with any service providers for any kind of consideration, and if so, the names of those service
168 providers, including subsidiaries of the service providers;
169 (D) A brief description of the relationship between the personal
170 information and the algorithmic information availability or algorithmic eligibility
171 determinations;
172 (E) How long the covered entity will hold the personal information; and
173 (F) The rights provided under this act;
174 (2) Ensure that the notice developed and made available under paragraph (1) of
175 this subsection:
176 (A) Is clear, concise, and complete;
177 (B) Does not contain unrelated, confusing, or contradictory materials; and
178 (C) Is in a format that is:
179 (i) Prominent and easily accessible;
180 (ii) Capable of fitting on one printed page; and
181 (iii) Provided in English, as well as in any non-English language
182 spoken by at least 500 individuals in the District of Columbia population;
183 (3) Within 30 days after changing its collection or use practices or policies in a
184 way that affects the content of the notice required by paragraph (1) of this subsection, update that
185 notice;
186 (4) Make the notice required under paragraph (1) of this subsection continuously
187 and conspicuously available:
188 (A) On the covered entity’s website or mobile application, if the covered
189 entity maintains a website or mobile application;
190 (B) At the physical place of business or any offline equivalent the covered
191 entity maintains; and
192 (5) Send the notice required under paragraph (1) of this subsection to an
193 individual before the first algorithmic information availability determination it makes about the
194 individual, by:
195 (A) Mail, if the personal information was gathered through the individual
196 contacting or contracting with the covered entity through mail;
197 (B) Email, if the personal information was gathered through the individual
198 contacting or contracting with the covered entity through email, or if the covered entity has the
199 individual’s email address for another reason;
200 (C) A “pop-up” notification displayed upon navigation to the covered
201 entity’s website or within the covered entity’s mobile application; or
202 (D) A clear and conspicuous link on the covered entity’s website’s
203 homepage, or on the home screen of its mobile application, leading to the notice.
204 (b) A covered entity need not provide the notice described under subsection (a) of this
205 section if another covered entity has provided notice to the same individual for the same action
206 as part of a contracted arrangement with the covered entity.
207 (c) Prohibited acts.
208 A covered entity that is subject to paragraph (a)(1), with respect to any individual whose
209 personal information the covered entity holds as described in that paragraph, may not use any
210 personal information of the individual in an algorithmic eligibility determination unless the
211 covered entity has provided the individual with notice consistent with that paragraph.
212 (d) Adverse action disclosure requirements.
213 If a covered entity takes any adverse action with respect to any individual that is based in
214 whole or in part on the results of an algorithmic eligibility determination, the covered entity shall
215 provide the individual a written or electronic disclosure that includes:
216 (1) The covered entity’s name, address, email address, and telephone number;
217 (2) The factors the determination depended on; and
218 (3) An explanation that the individual may:
219 (A) Access any personal information described in section 3(6)(A)-(C),
220 pertaining to that individual, that the covered entity used to make the determination;
221 (B) Submit corrections to that information; and
222 (C) If the individual submits corrections, request that the covered entity
223 conduct a reasoned reevaluation of the relevant algorithmic eligibility determination, conducted
224 by a human, based on the corrected data.
225 Sec. 7. Auditing for discriminatory processing and reporting requirement.
226 (a) Auditing requirement.
227 A covered entity shall annually audit its algorithmic eligibility determination and
228 algorithmic information availability determination practices to:
229 (1) Determine whether the processing practices discriminate in a manner
230 prohibited by section 4 of this act;
231 (2) Analyze disparate-impact risks of algorithmic eligibility determinations and
232 algorithmic information availability determinations based on actual or perceived race, color,
233 religion, national origin, sex, gender identity or expression, sexual orientation, familial status,
234 genetic information, source of income, or disability;
235 (3) Create and retain for at least 5 years an audit trail that records, for each
236 algorithmic eligibility determination:
237 (A) The type of algorithmic eligibility determination made;
238 (B) The data used in the determination, including the source of any such
239 data;
240 (C) The methodology used by the entity to establish the algorithm;
241 (D) The algorithm used to make the determination;
242 (E) Any data or sets of data used to train the algorithm;
243 (F) Any testing and results for model performance across different
244 subgroups or for discriminatory effects;
245 (G) The methodology used to render the determination; and
246 (H) The ultimate decision rendered;
247 (4) Conduct annual impact assessments of:
248 (A) Existing systems that render algorithmic eligibility determinations and
249 alg