, a facial recognition technology used to verify users' identities when they log in. The AJL and other organizations sent letters to legislators, urging them to press the IRS to end the program. In February 2022, the IRS agreed to halt the program and stop using facial recognition technology. The AJL has since shifted its efforts to convincing other government agencies to abandon facial recognition technology; as of March 2022, the DumpID.me petition has pivoted to calling for an end to the use of ID.me across all government agencies.
. While experimenting with facial detection software in her research, she found that the software could not detect her "highly melanated" face until she donned a white mask. This incident inspired Buolamwini to found the AJL to draw public attention to the existence of bias in artificial intelligence and the threats it can pose to civil rights. Early AJL campaigns focused primarily on bias in
programs that compensate and encourage individuals to locate and disclose the existence of bias in AI systems. AJL intends for the CRASH framework to enable individuals, especially those who have traditionally been excluded from the design of these AI technologies, to report algorithmic harms and to push for change in the AI technologies that companies deploy.
. Their research, entitled "Gender Shades", determined that machine learning models released by IBM and Microsoft were less accurate when analyzing dark-skinned and feminine faces than when analyzing light-skinned and masculine faces. The "Gender Shades" paper was accompanied by the launch of the Safe Face Pledge, an initiative designed with the
(AI) in society and the harms and biases that it can pose. The AJL has engaged in a variety of open online seminars, media appearances, and tech advocacy initiatives to communicate information about bias in AI systems and to promote industry and government action to mitigate the creation and deployment of biased AI systems. In 2021,
commitment to obtaining customer consent for their selfies and skin data to be used in this audit. The AJL and ORCAA audit revealed that the OSA system's performance was biased across participants' skin color and age. The OSA system demonstrated higher accuracy for participants with lighter skin tones, per the
collaborated with AJL and O'Neil Risk
Consulting & Algorithmic Auditing (ORCAA) to conduct the Decode the Bias campaign, which included an audit that explored whether the Olay Skin Advisor (OSA) System included bias against women of color. The AJL chose to collaborate with Olay due to Olay's
In 2019, Buolamwini represented AJL at a congressional hearing of the US House
Committee on Science, Space, and Technology to discuss commercial and government applications of facial recognition technologies. Buolamwini served as a witness at the hearing and spoke on the underperformance of
A research collaboration involving AJL released a white paper in May 2020 calling for the creation of a new United States federal government office to regulate the development and deployment of facial recognition technologies. The white paper proposed that creating a new federal government office for
that urged technology organizations and governments to prohibit lethal use of facial recognition technologies. The Gender Shades project and subsequent advocacy undertaken by AJL and similar groups led multiple tech companies, including Amazon and IBM, to address biases in the development of their
programs (BBPs) that would incentivize individuals to uncover and report instances of algorithmic bias in AI technologies. After conducting interviews with BBP participants and a case study of
Twitter's BBP, AJL researchers developed and proposed a conceptual framework for designing BBP
Skin Type and individual typology angle skin classification scales. The OSA system also demonstrated higher accuracy for participants aged 30–39. Olay has since taken steps to internally audit and mitigate the bias of the OSA system. Olay has also funded 1,000 girls to attend the
and inequities in the performance of AI systems for speech and language modeling across gender and racial populations. The AJL's work in this space centers on highlighting gender and racial disparities in the performance of commercial
. AJL based its development of "Voicing Erasure" on a 2020 PNAS paper, titled "Racial disparities in automated speech recognition", which identified racial disparities in the performance of five commercial ASR systems.
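Such disparities are typically quantified with word error rate (WER): the word-level edit distance between a system's transcript and a reference transcript, divided by the reference length. A minimal sketch of the metric, using hypothetical transcripts rather than data from the study:

```python
# Illustrative sketch of word error rate (WER), the standard metric behind
# ASR disparity audits. Transcripts here are hypothetical examples, not data
# from the PNAS study.

def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level edit distance, normalized by the reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming (Levenshtein) edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all remaining reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all remaining hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# A perfect transcript scores 0.0; misrecognized words raise the score.
print(word_error_rate("the weather is nice today", "the weather is nice today"))  # 0.0
print(word_error_rate("the weather is nice today", "the went her is nice today"))  # 0.4
```

Comparing the average of this score across speaker groups is how such disparities are reported.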
In July 2020, AJL launched the Community Reporting of Algorithmic System Harms (CRASH) Project. This project began in 2019 when Buolamwini and digital security researcher
facial recognition technologies in identifying people with darker skin and feminine features and supported her position with research from the AJL project "Gender Shades".
In March 2020, AJL released a spoken-word artistic piece, titled Voicing Erasure, to raise public awareness of racial bias in automatic
this area would help reduce the risks of mass surveillance and bias posed by facial recognition technologies towards vulnerable populations.
. This documentary focused on the AJL's research and advocacy efforts to spread awareness of algorithmic bias in facial recognition systems.
software; recent campaigns have dealt more broadly with questions of equitability and accountability in AI, including
Additionally, there is a community of other organizations working toward similar goals, including Data & Society,
to release a 2018 study on racial and gender bias in facial recognition algorithms used by commercial systems from
(ASR) systems. The piece was performed by numerous female and non-binary researchers in the field, including
, the AJL uses research, artwork, and policy advocacy to increase societal awareness regarding the use of
systems, which have been shown to underperform for racial minorities and to reinforce gender stereotypes.
recognized AJL as one of the 10 most innovative AI companies in 2021. Additionally, venues such as
1556:"Olay Teams Up With Algorithmic Justice Pioneer Joy Buolamwini To #DecodetheBias In Beauty"
Buolamwini founded the
Algorithmic Justice League in 2016 as a graduate student in the MIT
. Since then, the project has also been co-led by MIT professor and AJL research director
to release an online petition called DumpID.me, calling for the IRS to halt their use of
have featured Buolamwini's work with the AJL in several interviews and articles.
algorithms and even temporarily ban the use of their products by police in 2020.
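The auditing approach behind "Gender Shades" is disaggregated evaluation: computing a classifier's accuracy separately for each intersectional subgroup instead of reporting a single aggregate number. A minimal sketch, using hypothetical predictions rather than the study's data:

```python
# Illustrative sketch of disaggregated evaluation: accuracy per subgroup.
# The subgroups and predictions below are hypothetical.

from collections import defaultdict

def disaggregated_accuracy(records):
    """records: iterable of (subgroup, predicted_label, true_label) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for subgroup, predicted, actual in records:
        total[subgroup] += 1
        correct[subgroup] += (predicted == actual)
    return {group: correct[group] / total[group] for group in total}

# Hypothetical gender-classifier output on a small audit set:
records = [
    ("darker-skinned, feminine", "male", "female"),     # misclassified
    ("darker-skinned, feminine", "female", "female"),
    ("lighter-skinned, masculine", "male", "male"),
    ("lighter-skinned, masculine", "male", "male"),
]
print(disaggregated_accuracy(records))
# {'darker-skinned, feminine': 0.5, 'lighter-skinned, masculine': 1.0}
# The aggregate accuracy (0.75) would hide the gap between subgroups.
```

Reporting per-subgroup scores is what surfaces disparities that a single headline accuracy figure conceals.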
. The CRASH project focused on creating the framework for the development of
named AJL as one of the 10 most innovative AI companies in the world.
camp, to encourage African-American girls to pursue STEM careers.
Buolamwini and AJL were featured in the 2020 Netflix documentary
The AJL has run initiatives to increase public awareness of
113:) is a digital advocacy non-profit organization based in
met at the Bellagio Center Residency Program, hosted by
Distributed Artificial Intelligence Research Institute
AJL founder Buolamwini collaborated with AI ethicist
Georgetown Center on Privacy & Technology
283:In January 2022, the AJL collaborated with
350:AJL initiatives have been funded by the
. Founded in 2016 by computer scientist
16:Digital advocacy non-profit organization
289:Electronic Privacy Information Center
424:Ethics of artificial intelligence
1924:Ethics of science and technology
370:and individual private donors.
459:Margaret Mitchell (scientist)
346:Support and media appearances
299:Olay Decode the Bias campaign
429:Fairness (machine learning)
246:natural language processing
153:algorithmic decision-making
1889:Algorithmic Justice League
360:Alfred P. Sloan Foundation
331:The Rockefeller Foundation
231:Bias in speech recognition
107:Algorithmic Justice League
22:Algorithmic Justice League
219:, which premiered at the
409:Algorithmic transparency
404:Regulation of algorithms
115:Cambridge, Massachusetts
78:Cambridge, Massachusetts
123:artificial intelligence
364:Rockefeller Foundation
275:Algorithmic governance
221:Sundance Film Festival
157:algorithmic governance
449:Sasha Costanza-Chock
356:MacArthur Foundation
335:Sasha Costanza-Chock
285:Fight for the Future
261:Sasha Costanza-Chock
176:Fight for the Future
168:Data for Black Lives
303:In September 2021,
383:The New York Times
368:Mozilla Foundation
253:speech recognition
242:speech recognition
187:Facial recognition
159:, and algorithmic
269:Kimberlé Crenshaw
419:Algorithmic bias
327:Camille François
315:Black Girls Code
237:algorithmic bias
149:algorithmic bias
145:face recognition
439:Emily M. Bender
352:Ford Foundation
444:Joy Buolamwini
414:Digital rights
119:Joy Buolamwini
321:CRASH project
257:Ruha Benjamin
1868:. Retrieved
1859:
1850:
1838:. Retrieved
1829:
1820:
1806:cite journal
1794:. Retrieved
1783:Data Society
1782:
1769:
1757:. Retrieved
1748:
1723:. Retrieved
1714:
1705:
1693:. Retrieved
1684:
1674:
1662:. Retrieved
1653:
1643:
1631:. Retrieved
1622:
1598:. Retrieved
1590:www.olay.com
1589:
1580:
1568:. Retrieved
1559:
1530:. Retrieved
1522:www.olay.com
1521:
1512:
1500:. Retrieved
1491:
1482:
1470:. Retrieved
1461:
1451:
1439:. Retrieved
1430:
1421:
1409:. Retrieved
1400:
1390:
1378:. Retrieved
1369:
1343:. Retrieved
1334:The Register
1332:
1322:
1310:. Retrieved
1301:
1277:. Retrieved
1268:
1203:
1199:
1157:. Retrieved
1119:
1106:
1069:
1063:
1049:cite journal
1037:. Retrieved
1025:
1012:
1000:. Retrieved
991:
981:
969:. Retrieved
960:
922:. Retrieved
911:
901:
889:. Retrieved
880:
871:
859:. Retrieved
850:
841:
829:. Retrieved
825:the original
815:
803:. Retrieved
792:
782:
770:. Retrieved
761:
752:
738:cite journal
728:December 12,
726:. Retrieved
714:
710:
697:
685:. Retrieved
649:. Retrieved
635:
623:. Retrieved
604:
595:
583:. Retrieved
551:
541:
529:. Retrieved
520:Fast Company
518:
492:. Retrieved
484:"Learn More"
478:
464:Resisting AI
454:Timnit Gebru
434:Deborah Raji
381:
375:
372:Fast Company
371:
349:
324:
302:
282:
278:
265:Safiya Noble
250:
234:
225:
214:
193:Timnit Gebru
190:
182:Notable work
174:(DAIR), and
165:
138:
128:Fast Company
126:
110:
106:
104:
36:Abbreviation
18:
1830:Google Docs
1715:Google Docs
1623:www.ajl.org
1302:VentureBeat
1269:www.ajl.org
1026:White Paper
552:Boing Boing
310:Fitzpatrick
65:AI activism
1898:Categories
1129:1805.04508
470:References
380:magazine,
339:bug-bounty
217:Coded Bias
1685:The Verge
1654:MediaWell
1230:0027-8424
1169:cite book
1098:232040593
934:cite news
717:: 77–91.
615:0190-8286
197:Microsoft
141:Media Lab
44:Formation
1870:April 8,
1864:Archived
1840:April 8,
1834:Archived
1796:April 8,
1787:Archived
1785:: 3–86.
1759:April 8,
1753:Archived
1725:April 8,
1719:Archived
1695:April 8,
1689:Archived
1664:April 8,
1658:Archived
1633:April 8,
1627:Archived
1600:April 8,
1594:Archived
1570:April 8,
1564:Archived
1532:April 8,
1526:Archived
1502:April 8,
1496:Archived
1472:April 8,
1466:Archived
1441:April 8,
1435:Archived
1411:April 8,
1405:Archived
1401:The Hill
1380:April 8,
1374:Archived
1345:April 8,
1339:Archived
1312:April 7,
1306:Archived
1279:April 7,
1273:Archived
1248:32205437
1159:April 8,
1150:Archived
1146:21670658
1039:April 8,
1030:Archived
1028:: 3–49.
1002:April 7,
996:Archived
971:April 8,
965:Archived
924:April 7,
918:Archived
891:April 7,
885:Archived
861:April 7,
855:Archived
831:April 7,
805:April 7,
799:Archived
772:April 7,
766:Archived
719:Archived
687:April 7,
681:Archived
651:April 7,
645:Archived
625:April 7,
619:Archived
585:April 7,
579:Archived
531:April 7,
525:Archived
494:April 7,
488:Archived
398:See also
287:and the
161:auditing
70:Location
1431:Reuters
1239:7149386
1208:Bibcode
881:Fortune
575:WBUR-FM
135:History
88:Website
62:Purpose
52:Founder
1749:Medium
1560:Forbes
1246:
1236:
1228:
1144:
1096:
1086:
794:Boston
641:"DAIR"
613:
390:, and
366:, the
362:, the
358:, the
354:, the
267:, and
205:Face++
203:, and
170:, the
1790:(PDF)
1779:(PDF)
1153:(PDF)
1142:S2CID
1124:arXiv
1116:(PDF)
1094:S2CID
1033:(PDF)
1022:(PDF)
722:(PDF)
707:(PDF)
293:ID.me
377:Time
305:Olay
244:and
105:The
97:.org
95:.ajl
47:2016
1860:NPR
1462:CNN
1234:PMC
1216:doi
1204:117
1134:doi
1074:doi
913:NPR
677:CNN
392:CNN
388:NPR
201:IBM
111:AJL
93:www
39:AJL
386:,
263:,
259:,
199:,
178:.
163:.
155:,
151:,
109:(