Algorithms of Oppression

Author: Safiya Noble
Language: English
Subject: Racism, algorithms
Genre: Non-fiction
Published: February 2018
Publisher: NYU Press
Publication place: United States
Pages: 256
ISBN: 978-1-4798-4994-9 (Hardcover)

Algorithms of Oppression: How Search Engines Reinforce Racism is a 2018 book by Safiya Umoja Noble in the fields of information science, machine learning, and human-computer interaction.

Background

Noble earned an undergraduate degree in sociology from California State University, Fresno in the 1990s, then worked in advertising and marketing for fifteen years before going to the University of Illinois Urbana-Champaign for a Master of Library and Information Science degree in the early 2000s. The book's first inspiration came in 2011, when Noble Googled the phrase "black girls" and saw results for pornography on the first page. Noble's doctoral thesis, completed in 2012, was titled "Searching for Black girls: Old traditions in new media." At this time, Noble thought of the title "Algorithms of Oppression" for the eventual book. By then, changes to Google's algorithm had altered the most common results for a search of "black girls," though the underlying biases remain influential. Noble became an assistant professor at the University of California, Los Angeles in 2014. In 2017, she published an article on racist and sexist bias in search engines in The Chronicle of Higher Education. The book was published on February 20, 2018.

Overview

Algorithms of Oppression is a text based on over six years of academic research on Google search algorithms, examining search results from 2009 to 2015. The book addresses the relationship between search engines and discriminatory biases. Noble argues that search algorithms are racist and perpetuate societal problems because they reflect the negative biases that exist in society and in the people who create them. Noble dismantles the idea that search engines are inherently neutral by explaining how their algorithms privilege whiteness, returning positive cues when keywords like "white" are searched as opposed to "Asian," "Hispanic," or "Black." Her main example contrasts the search results for "Black girls" and "white girls" and the biases depicted in those results. These algorithms can then have negative biases against women of color and other marginalized populations, while also affecting Internet users in general by leading to "racial and gender profiling, misrepresentation, and even economic redlining." The book argues that algorithms perpetuate oppression and discriminate against people of color, specifically women of color.

Noble takes a Black intersectional feminist approach to her work, studying how Google's algorithms affect people differently by race and gender. Intersectional feminism takes into account the diverse experiences of women of different races and sexualities when discussing their oppression in society, and how their distinct backgrounds affect their struggles. Additionally, Noble's argument addresses how racism infiltrates the Google algorithm itself, something that is true of many coding systems, including facial recognition and medical care programs. While many new technological systems promote themselves as progressive and unbiased, Noble argues against this point, saying that many technologies, including Google's algorithm, "reflect and reproduce existing inequities."

Noble's main focus is on Google's algorithms, although she also discusses Amazon, Facebook, Twitter, and WordPress. She is invested in who controls what users see and don't see: "Search results reflects the values and norms of the search companies commercial partners and advertisers and often reflect our lowest and most demeaning beliefs, because these ideas circulate so freely and so often that they are normalized and extremely profitable." (Noble, 36)

Chapter Summaries

Chapter 1

In Chapter 1 of Algorithms of Oppression, Safiya Noble explores how Google search's auto-suggestion feature is demoralizing. On September 18, 2011, a mother googled "black girls" attempting to find fun activities to show her stepdaughter and nieces. To her surprise, the results encompassed websites and images of porn. This result encapsulates the data failures specific to people of color and women, which Noble coins algorithmic oppression. Noble also adds that, as a society, we must have a feminist lens, with racial awareness, to understand the "problematic positions about the benign instrumentality of technologies."

Noble also discusses how Google can remove the human curation from the first page of results to eliminate any potential racial slurs or inappropriate imagery. Another example discussed in this text is a public dispute over the results that were returned when "Jew" was searched on Google. The results included a number of anti-Semitic pages, and Google claimed little ownership for the way it provided these identities. Google instead encouraged people to use "Jews" or "Jewish people" and claimed the actions of White supremacist groups were out of Google's control. Unless pages are unlawful, Google will allow its algorithm to continue to act without removing pages.

Noble also reflects on AdWords, Google's advertising tool, and how it can add to the biases on Google. AdWords allows anyone to advertise on Google's search pages and is highly customizable. First, Google ranks ads on relevance and then displays the ads on pages it believes are relevant to the search query taking place. An advertiser can also set a maximum amount of money per day to spend on advertising. The more an advertiser spends on ads, the higher the probability their ad will appear closer to the top. Therefore, if an advertiser is passionate about a topic but that topic is controversial, it may still be the first result to appear in a Google search.

Chapter 2

In Chapter 2 of Algorithms of Oppression, Noble explains that Google has exacerbated racism and how the company continues to deny responsibility for it. Google puts the blame on those who have created the content as well as those who are actively seeking this information. Google's algorithm has maintained social inequalities and stereotypes for Black, Latina, and Asian women, due in large part to Google's design and infrastructure, which normalizes whiteness and men. She explains that the Google algorithm categorizes information in ways that exacerbate stereotypes while also encouraging white hegemonic norms. Noble found that after searching for "black girls," the first search results were common stereotypes of Black girls, or the categories that Google created based on its own idea of a Black girl. Google hides behind its algorithm, which has been shown to perpetuate inequalities.

Chapter 3

In Chapter 3 of Algorithms of Oppression, Safiya Noble discusses how Google's search engine combines multiple sources to create threatening narratives about minorities. She explains a case study in which she searched "black on white crimes" on Google. Noble highlights that the sources and information found through the search pointed to conservative sources that skewed information. These sources displayed racist and anti-Black information from white supremacist sources. Ultimately, she believes this readily available, false information fueled the actions of white supremacist Dylann Roof, who committed a massacre.

Chapter 4

In Chapter 4 of Algorithms of Oppression, Noble furthers her argument by discussing the way in which Google has oppressive control over identity. This chapter highlights multiple examples of women being shamed due to their activity in the porn industry, regardless of whether it was consensual. She critiques the internet's ability to influence one's future due to its permanent nature, and compares U.S. privacy laws to those of the European Union, which provides citizens with "the right to forget or be forgotten." When using search engines such as Google, these breaches of privacy disproportionately affect women and people of color. Google claims that it safeguards our data in order to protect us from losing our information, but fails to address what happens when you want your data to be deleted.

Chapter 5

In Chapter 5 of Algorithms of Oppression, Noble moves the discussion away from Google and onto other information sources deemed credible and neutral. Noble says that prominent libraries, including the Library of Congress, encourage whiteness, heteronormativity, patriarchy, and other societal standards as correct, and alternatives as problematic. She explains this problem by discussing a case between Dartmouth College and the Library of Congress in which the "student-led organization the Coalition for Immigration Reform, Equality (CoFired) and DREAMers" engaged in a two-year battle to change the Library's terminology from "illegal aliens" to "noncitizen" or "unauthorised immigrants." Noble later discusses the problems that ensue from misrepresentation and classification, which allows her to stress the importance of contextualisation. Noble argues that it is not just Google but all digital search engines that reinforce societal structures and discriminatory biases, and by doing so she points out just how interconnected technology and society are.

Chapter 6

In Chapter 6 of Algorithms of Oppression, Safiya Noble discusses possible solutions to the problem of algorithmic bias. She first argues that public policies enacted by local and federal governments will reduce Google's "information monopoly" and regulate the ways in which search engines filter their results. She insists that governments and corporations bear the most responsibility for reforming the systemic issues that lead to algorithmic bias.

Simultaneously, Noble condemns the common neoliberal argument that algorithmic biases will disappear if more women and racial minorities enter the industry as software engineers. She calls this argument "complacent" because it places responsibility on individuals, who have less power than media companies, and indulges a mindset she calls "big-data optimism," a failure to recognize that the institutions themselves do not always solve, but sometimes perpetuate, inequalities. To illustrate this point, she uses the example of Kandis, a Black hairdresser whose business faces setbacks because the review site Yelp has used biased advertising practices and searching strategies against her.

She closes the chapter by calling upon the Federal Communications Commission (FCC) and the Federal Trade Commission (FTC) to "regulate decency," or to limit the amount of racist, homophobic, or prejudiced rhetoric on the Internet. She urges the public to shy away from "colorblind" ideologies toward race because they have historically erased the struggles faced by racial minorities. Lastly, she points out that big-data optimism leaves out discussion of the harms that big data can disproportionately enact upon minority communities.

Conclusion

In Algorithms of Oppression, Safiya Noble explores the social and political implications of the results of our Google searches and our search patterns online. Noble challenges the idea of the internet being a fully democratic or post-racial environment. Each chapter examines a different layer of the algorithmic biases formed by search engines. By outlining crucial points and theories throughout the book, Algorithms of Oppression is not limited to an academic readership, which allows Noble's writing to reach a wider and more inclusive audience.

Critical reception

Critical reception for Algorithms of Oppression has been largely positive. In the Los Angeles Review of Books, Emily Drabinski writes, "What emerges from these pages is the sense that Google's algorithms of oppression comprise just one of the hidden infrastructures that govern our daily lives, and that the others are likely just as hard-coded with white supremacy and misogyny as the one that Noble explores." In PopMatters, Hans Rollman writes that Algorithms of Oppression "demonstrate[s] that search engines, and in particular Google, are not simply imperfect machines, but systems designed by humans in ways that replicate the power structures of the western countries where they are built, complete with all the sexism and racism that are built into those structures." In Booklist, reviewer Lesley Williams states, "Noble's study should prompt some soul-searching about our reliance on commercial search engines and about digital social equity."

In early February 2018, Algorithms of Oppression received press attention when the official Twitter account for the Institute of Electrical and Electronics Engineers expressed criticism of the book, saying that the results of a Google search suggested in its blurb did not match Noble's predictions. IEEE's outreach historian, Alexander Magoun, later revealed that he had not read the book, and issued an apology.

See also

Algorithmic bias
Techlash

