on data models and geospatial mapping and real-time sensors and processing of the environment. Cars with automation levels 1 to 3 were already available on the market in 2021. In 2016 the German government established an 'Ethics
Commission on Automated and Connected Driving', which recommended that connected and automated vehicles (CAVs) be developed if the systems cause fewer accidents than human drivers (a positive balance of risk). It also provided 20 ethical rules for automated and connected driving. In 2020 the European Commission strategy on CAM recommended that it be adopted in Europe to reduce road fatalities and lower emissions; however, self-driving cars also raise many policy, security and legal issues in terms of liability and ethical decision-making in the case of accidents, as well as privacy issues. Issues of trust in autonomous vehicles and community concerns about their safety are key factors to be addressed if AVs are to be widely adopted.
(RAI) are being used to supplement or replace the human judgment of judges, civil servants and police officers in many contexts. In the United States, RAI are being used to generate scores to predict the risk of recidivism in pre-trial detention and sentencing decisions, to evaluate parole for prisoners, and to predict "hot spots" for future crime. These scores may result in automatic effects or may be used to inform decisions made by officials within the justice system. In Canada, ADM has been used since 2014 to automate certain activities conducted by immigration officials and to support the evaluation of some immigrant and visitor applications.
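A score-based instrument of this kind can be illustrated with a small sketch; the feature names, weights and decile mapping below are entirely hypothetical and are not taken from any deployed tool.

```python
# Hypothetical sketch of a score-based risk assessment instrument (RAI):
# a weighted sum of feature values mapped onto a 1-10 decile score.
# Feature names and weights are invented for illustration only.

def risk_score(features: dict, weights: dict) -> int:
    """Return a 1-10 risk decile from a weighted sum of feature values."""
    raw = sum(weights[name] * features.get(name, 0.0) for name in weights)
    clamped = min(max(raw, 0.0), 0.999)   # clamp into [0, 0.999]
    return int(clamped * 10) + 1          # map to deciles 1..10

weights = {"prior_offences": 0.05, "age_under_25": 0.2, "failed_appearances": 0.1}
low = risk_score({"prior_offences": 1}, weights)
high = risk_score({"prior_offences": 8, "age_under_25": 1, "failed_appearances": 3}, weights)
print(low, high)  # 1 10
```

Whether such a score triggers automatic effects or merely informs an official's decision is exactly the policy distinction drawn above.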
Many governments around the world now use automated, algorithmic systems for profiling and targeting policies and services, including algorithmic policing based on risks, surveillance sorting of people such as airport screening, providing services based on risk profiles in child protection, providing employment services and governing the unemployed. A significant application of ADM in social services relates to the use of predictive analytics
platforms, user data, ad servers and their delivery data, inventory management systems, ad traders and ad exchanges. There are various issues with this system, including a lack of transparency for advertisers, unverifiable metrics, lack of control over ad venues, audience tracking and privacy concerns. Internet users who dislike ads have adopted counter-measures such as
, business, health, education, law, employment, transport, media and entertainment, with varying degrees of human oversight or intervention. ADM involves large-scale data from a range of sources, such as databases, text, social media, sensors, images or speech, that is processed using various technologies including computer software, algorithms,
recognition, translations, text, data and simulations. While machine learning has been around for some time, it is becoming increasingly powerful due to recent breakthroughs in training deep neural networks (DNNs) and dramatic increases in data storage capacity and computational power with GPU coprocessors and cloud computing.
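The training loop behind such systems (repeatedly adjusting model weights to reduce error on labelled examples) can be shown in miniature; this is a toy logistic-regression sketch, not a deep neural network:

```python
import math

# Toy illustration of "learning from examples": fit a logistic-regression
# decision boundary to labelled 2-D points by stochastic gradient descent.
# Real ADM systems use far larger models and datasets; the principle
# (adjust weights to reduce error on training examples) is the same.

def train(examples, epochs=2000, lr=0.5):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in examples:
            p = 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
            err = p - y                       # gradient of the log-loss
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Points whose coordinates sum to more than about 1 are labelled 1.
data = [((0.0, 0.0), 0), ((0.9, 0.0), 0), ((0.0, 0.9), 0),
        ((1.0, 1.0), 1), ((1.0, 0.5), 1), ((0.5, 1.0), 1)]
w, b = train(data)
print(all(predict(w, b, x) == y for x, y in data))  # True
```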
, intellectual property rights, the spread of misinformation via media platforms, administrative discrimination, risk and responsibility, unemployment and many others. As ADM becomes more ubiquitous, there is a greater need to address the ethical challenges it raises to ensure good governance in information societies.
which refers to transparency around the reasons for a decision and the ability to explain the basis on which a machine made a decision. For example,
Australia's federal social security delivery agency, Centrelink, developed and implemented an automated process for detecting and collecting debt which
Automated decision-making systems are used in certain computer programs to create buy and sell orders for specific financial transactions and to submit those orders automatically to the international markets. Computer programs can generate orders automatically based on a predefined set of rules using
Automated decision-making involves using data as input to be analyzed within a process, model, or algorithm, or for learning and generating new models. ADM systems may use and connect a wide range of data types and sources depending on the goals and contexts of the system, for example, sensor data for
or crime in policing and criminal justice, predictions of welfare/tax fraud in compliance systems, and predictions of long-term unemployment in employment services. Historically these systems were based on standard statistical analyses; however, from the early 2000s machine learning has increasingly been
and other forms of transport which use automated decision-making systems to replace various aspects of human control of the vehicle. This can range from level 0 (complete human driving) to level 5 (completely autonomous). At level 5 the machine is able to make decisions to control the vehicle based
Machine learning systems based on foundation models run on deep neural networks and use pattern matching to train a single huge system on large amounts of general data such as text and images. Early models tended to start from scratch for each new problem; however, since the early 2020s many are able
between individuals whose data feeds into the system and the platforms and decision-making systems capable of inferring information from that data. On the other hand, it has been observed that in financial trading the information asymmetry between two artificial intelligent agents may be much less
Since the 1950s computers have gone from being able to do basic processing to having the capacity to undertake complex, ambiguous and highly skilled tasks such as image and speech recognition, gameplay, scientific and medical analysis and inferencing across multiple data sources. ADM is now being
involves automating the sale and delivery of digital advertising on websites and platforms via software rather than direct human decision-making. This is sometimes known as the waterfall model, which involves a sequence of steps across various systems and players: publishers and data management
or content-based filtering. This includes music and video platforms, publishing, health information, product databases and search engines. Many recommender systems also provide some agency to users in accepting recommendations and incorporate data-driven algorithmic feedback loops based on the
The quality of the available data and its suitability for use in ADM systems is fundamental to the outcomes, and is often highly problematic for many reasons. Datasets are often highly variable: large-scale data may be controlled by corporations or governments, restricted for privacy or security reasons,
that make recommendations for human decision-makers to act on, sometimes known as augmented intelligence or 'shared decision-making', to fully automated decision-making processes that make decisions on behalf of individuals or organizations without human involvement. Models used in automated
in Canada argues for a critical human rights analysis of the application of ADM in various areas to ensure the use of automated decision-making does not result in infringements on rights, including the rights to equality and non-discrimination; freedom of movement, expression, religion, and
Machine learning (ML) involves training computer programs through exposure to large data sets and examples to learn from experience and solve problems. Machine learning can be used to generate and analyse data as well as make algorithmic calculations and has been applied to image and speech
practices and institutions in government and commercial sectors. As a result, there has been a major shift from targeted monitoring of suspects to the ability to monitor entire populations. The level of surveillance now possible as a result of automated data collection has been described as
Automated decision-making technologies (ADMT) are software-coded digital tools that automate the translation of input data to output data, contributing to the function of automated decision-making systems. There are a wide range of technologies in use across ADM applications and systems.
There are many social, ethical and legal implications of automated decision-making systems. Concerns raised include the lack of transparency and contestability of decisions, incursions on privacy and surveillance, the exacerbation of systemic bias and inequality due to data and algorithmic bias,
continue to advance, accountants and auditors may make use of increasingly sophisticated algorithms which make decisions such as determining what is anomalous, whether to notify personnel, and how to prioritize the tasks assigned to personnel.
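The triage logic described here can be sketched as follows; the z-score rule and the threshold are illustrative assumptions, not a real audit system:

```python
import statistics

# Illustrative audit triage: flag transactions whose amount deviates
# strongly from the mean (a z-score rule), then rank the flagged items
# so the largest deviations are notified first. The threshold is an
# arbitrary example.

def triage(amounts, z_threshold=3.0):
    mean = statistics.mean(amounts)
    spread = statistics.pstdev(amounts)
    flagged = []
    for i, amount in enumerate(amounts):
        z = (amount - mean) / spread if spread else 0.0
        if abs(z) > z_threshold:
            flagged.append((i, z))
    return sorted(flagged, key=lambda item: -abs(item[1]))  # biggest first

txns = [100.0] * 19 + [5000.0]
print(triage(txns))  # only the 5000.0 transaction (index 19) is flagged
```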
The increasing use of automated decision-making systems (ADMS) across a range of contexts presents many benefits and challenges to human society, requiring consideration of the technical, legal, ethical, societal, educational, economic and health consequences.
ADM is being used to replace or augment human decision-making by both public and private-sector organisations for a range of reasons including to help increase consistency, improve efficiency, reduce costs and enable new solutions to complex problems.
Large-scale machine learning language models and image creation programs being developed by companies such as OpenAI and Google in the 2020s have restricted access; however, they are likely to have widespread application in fields such as advertising,
self-driving cars and robotics, identity data for security systems, demographic and financial data for public administration, medical records in health, criminal records in law. This can sometimes involve vast amounts of data and computing power.
(HCI), law, public administration, and media and communications. The automation of media content and algorithmically driven news, video and other content via search systems and platforms is a major focus of academic research in media studies.
(EU). Article 22(1) enshrines the right of data subjects not to be subject to decisions based solely on automated processing which produce legal or similarly significant effects concerning them. The GDPR also includes some rules on the right to explanation.
For machines to learn from data, large corpora are often required, which can be challenging to obtain or compute; however, where available, they have provided significant breakthroughs, for example, in diagnosing chest X-rays.
Questions of biased or incorrect data or algorithms, and concerns that some ADMs are black-box technologies closed to human scrutiny or interrogation, have led to what is referred to as the issue of explainability, or the
(XAI), or interpretable AI, in which the results of the solution can be analysed and understood by humans. XAI algorithms are considered to follow three principles: transparency, interpretability and explainability.
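For simple linear models all three principles can be met directly, because the score decomposes into per-feature contributions. A minimal sketch, with hypothetical feature names and weights:

```python
# Toy illustration of explainability for a linear scoring model: the
# score is a weighted sum, so each feature's contribution (weight * value)
# can be reported directly and ranked by influence. Feature names and
# weights are hypothetical.

def explain(features: dict, weights: dict):
    contributions = {f: weights[f] * v for f, v in features.items()}
    score = sum(contributions.values())
    ranked = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    return score, ranked

weights = {"income": 0.4, "debt": -0.7, "years_employed": 0.2}
score, ranked = explain({"income": 1.0, "debt": 2.0, "years_employed": 3.0},
                        weights)
print(ranked[0])  # ('debt', -1.4): the feature that most influenced the score
```

Deep models do not decompose this cleanly, which is why dedicated XAI techniques are needed for them.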
An ADM system (ADMS) may involve multiple decision points, data sets, and technologies (ADMT) and may sit within a larger administrative or technical system such as a criminal justice system or business process.
ADM systems are often based on machine learning and algorithms which cannot easily be viewed or analysed, leading to concerns that they are 'black box' systems which are not transparent or accountable.
There are different definitions of ADM based on the level of automation involved. Some definitions suggest ADM involves decisions made through purely technological means without human input, such as the EU's General Data Protection Regulation
Online advertising is closely integrated with many digital media platforms, websites and search engines and often involves automated delivery of display advertisements in diverse formats. 'Programmatic'
Similarly scoped and worded provisions, with varying attached rights and obligations, are present in the data protection laws of many other jurisdictions across the world, including
technologies which allow users to automatically filter unwanted advertising from websites and some internet applications. In 2017, 24% of Australian internet users had ad blockers.
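The waterfall sequencing described above can be sketched as trying demand sources in priority order until a bid clears the publisher's floor price; the source names and prices here are hypothetical:

```python
# Sketch of the advertising "waterfall": demand sources are asked in
# priority order and the first bid that clears the publisher's floor
# price wins the impression. Source names and prices are hypothetical.

def run_waterfall(sources, floor):
    """sources: list of (name, bid) pairs in priority order; bid may be None."""
    for name, bid in sources:
        if bid is not None and bid >= floor:
            return name, bid          # impression sold to this source
    return None, 0.0                  # impression goes unfilled

sources = [("direct_deal", None),     # no bid returned
           ("ad_exchange_1", 1.20),
           ("ad_network", 2.50)]
print(run_waterfall(sources, floor=1.00))  # ('ad_exchange_1', 1.2)
```

Because sources are tried strictly in order, the higher bid later in the chain is never considered here, which hints at the transparency and control issues noted above.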
Many academic disciplines and fields are increasingly turning their attention to the development, application and implications of ADM including business, computer sciences,
Research and development are underway into uses of technology to assess argument quality, assess argumentative essays and judge debates. Potential applications of these
processes. It can be utilized in the private sector by business enterprises and in the public sector by governmental organizations and municipalities. As
Automated digital data collections via sensors, cameras, online transactions and social media have significantly expanded the scope, scale, and goals of surveillance
trading strategies which are based on technical analyses, advanced statistical and mathematical computations, or inputs from other electronic sources.
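A minimal example of such a predefined rule set is a moving-average crossover; the strategy, window sizes and thresholds below are illustrative only, not trading advice:

```python
# Illustrative rule-based order generation: a moving-average crossover,
# one of the simplest possible "predefined sets of rules". The strategy,
# window sizes and thresholds are examples only.

def moving_average(prices, window):
    return sum(prices[-window:]) / window

def decide(prices):
    """Emit an order when the short-term average crosses the long-term one."""
    short = moving_average(prices, 3)
    long_ = moving_average(prices, 6)
    if short > long_ * 1.01:
        return "BUY"
    if short < long_ * 0.99:
        return "SELL"
    return "HOLD"

rising = [100.0, 100.0, 100.0, 104.0, 108.0, 112.0]
print(decide(rising))  # BUY: short-term average is well above the long-term
```

A production system would submit the resulting order to a market via an execution interface; here the rule only returns a signal.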
was established in 2018 to study transparency and explainability in the context of socio-technical systems, many of which include ADM and AI.
or surveillance economy to indicate the way digital media involves large-scale tracking and accumulation of data on every interaction.
incomplete, biased, limited in terms of time or coverage, measuring and describing terms in different ways, and many other issues.
span education and society. Scenarios to consider include the assessment and evaluation of
These provisions were not first introduced in the GDPR but have been present in a similar form across Europe since the Data Protection Directive in 1995, and, in France, since the 1978 French law, the loi informatique et libertés.
Digital media, entertainment platforms, and information services increasingly provide content to audiences via automated recommender systems
developed and deployed. Key issues with the use of ADM in social services include bias, fairness, accountability and
Rights to the explanation of public sector automated decisions constituting 'algorithmic treatment' under the French loi pour une République numérique
Technical design of the algorithm, for example where assumptions have been made about how a person will behave
increasingly deployed across all sectors of society and many diverse domains from entertainment to transport.
Governments have been implementing digital technologies to provide more efficient administration and
Emergent bias, where the application of ADM in unanticipated circumstances creates a biased outcome
However, the exact scope and nature of these rights are currently subject to pending review by the Court of Justice of the European Union.
(Article 22). However, ADM technologies and applications can take many forms, ranging from decision-support systems
association; privacy rights and the rights to life, liberty, and security of the person.
, stock imagery and graphic design, as well as other fields such as journalism and law.
Data sources, where data inputs are biased in their collection or selection
led to many cases of wrongful debt collection in what became known as the robodebt scheme.
Decision-making process conducted with varying degrees of human oversight
ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT)
to be adapted to new problems. Examples of these technologies include
than between two human agents or between human and machine agents.
ARC Centre of Excellence for Automated Decision-Making and Society
– e.g. predictions of risks to children from abuse or neglect in child protection
language models, and Google's PaLM language model program.
decision-making systems can be as simple as checklists and
based on demographic information, previous selections,
and detecting the eye condition macular degeneration.
Search (includes 1-2-1, 1-2-many, data matching/merge)
of automated decisions and AI. This is also known as Explainable AI
European General Data Protection Regulation (GDPR)
Key research centres investigating ADM include:
Connected and automated mobility (CAM) involves
ADMTs involving basic computational operations
(an image creation program) and their various
ADMTs for processing of complex data formats
since the early 2000s, often referred to as
around the world, algorithmic tools such as risk assessment instruments
through to artificial intelligence and deep neural networks.
Automated decision-making may increase the information asymmetry
Continuous auditing uses advanced analytical tools to automate
Automated decision-making (ADM) involves the use of data, machines and artificial intelligence
Deep learning AI image models are being used for reviewing x-rays
in 1995, and the 1978 French law, the
Court of Justice of the European Union
Legislative responses to ADM include:
ADMTs relating to space and flows:
ADMTs for assessment and grouping:
Mathematical Calculation (formula)
General Data Protection Regulation
in a range of contexts, including
Ethics of artificial intelligence
loi pour une République numérique
Business rules management systems
Natural Language Processing (NLP)
Matching (two different things)
loi informatique et libertés
actions of the system user.
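A minimal sketch of the collaborative-filtering approach used by many recommender systems, with hypothetical users and items (real platforms combine far more signals):

```python
# Minimal user-based collaborative filtering: recommend what the most
# similar other user liked but this user has not yet seen. Users and
# items are hypothetical.

def similarity(a: set, b: set) -> float:
    """Jaccard similarity between two sets of liked items."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user: str, likes: dict) -> set:
    nearest = max((u for u in likes if u != user),
                  key=lambda u: similarity(likes[user], likes[u]))
    return likes[nearest] - likes[user]

likes = {
    "alice": {"song_a", "song_b", "song_c"},
    "bob":   {"song_a", "song_b", "song_d"},
    "carol": {"song_e"},
}
print(recommend("alice", likes))  # {'song_d'}: bob is the most similar user
```

Feeding the user's reaction to the recommendation back into the `likes` data is the feedback loop the article describes.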
, introduced in 2016, is a
risk assessment instruments
natural language processing
Automated decision support
human computer interaction
argumentation and debate.
(includes link prediction)
Data Protection Directive
used for reviewing x-rays
Automated decision-making
Decision-making software
Algorithm Watch, Germany
Ethical and legal issues
decision-support systems
Government by algorithm
right to an explanation
surveillance capitalism
collaborative filtering
Media and Entertainment
artificial intelligence
Social network analysis
artificial intelligence
Transport and Mobility
(includes forecasting)
augmented intelligence
information asymmetry
Information asymmetry
argument technologies
public administration
ADM may incorporate algorithmic bias arising from:
and the US state of
right to explanation
predictive analytics
Modelling/Simulation
Time series analysis
Predictive analytics
Recommender systems
Decision Management
and privacy in the
autonomous vehicles
recommender systems
Continuous auditing
Continuous auditing
Recommender systems
Informatics Europe
online advertising
self-driving cars
, predictions of
Anomaly detection
725:Machine learning
700:Algorithmic bias
598:algorithmic bias
567:
504:algorithmic bias
443:child protection
363:machine learning
247:Machine learning
241:Machine learning
205:Audio processing
200:Image processing
165:Feature learning
118:ADM Technologies
40:machine learning
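Obligations such as a right to explanation are easier to meet when each automated decision is recorded together with the inputs and model version that produced it. The field names in this audit-log sketch are illustrative, not taken from any regulation.

```python
import datetime
import io
import json

def log_decision(subject_id, inputs, outcome, model_version, logfile):
    """Append one automated decision as a JSON line for later audit or appeal."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "subject_id": subject_id,
        "inputs": inputs,
        "outcome": outcome,
        "model_version": model_version,
    }
    logfile.write(json.dumps(record) + "\n")

# Demo with an in-memory file; a real system would append to durable storage.
buf = io.StringIO()
log_decision("subject-123", {"income": 400}, "deny", "model-v1", buf)
```

Recording the model version alongside the inputs lets a later reviewer reproduce the decision even after the deployed model has changed.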
Bias

Main article: Algorithmic bias

ADM may incorporate algorithmic bias arising from the data a system is trained on, the technical design of the algorithm, or the unanticipated contexts in which its outputs are applied.
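A common first step in auditing an ADM system for bias of this kind is to compare favourable-outcome rates across demographic groups. The sketch below uses synthetic data; the "four-fifths" threshold it mentions is a heuristic from US employment guidelines, not a universal legal standard.

```python
from collections import defaultdict

def selection_rates(decisions):
    """Compute the rate of favourable decisions per group.

    `decisions` is a list of (group, favourable) pairs, where
    `favourable` is True when the automated decision benefited the person.
    """
    totals = defaultdict(int)
    favourable = defaultdict(int)
    for group, fav in decisions:
        totals[group] += 1
        if fav:
            favourable[group] += 1
    return {g: favourable[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest to the highest group selection rate.

    A ratio below 0.8 is often treated as a red flag (the 'four-fifths
    rule' from US employment guidelines), though it is only a heuristic.
    """
    return min(rates.values()) / max(rates.values())

# Synthetic example: group B receives favourable outcomes half as often as A.
data = ([("A", True)] * 60 + [("A", False)] * 40 +
        [("B", True)] * 30 + [("B", False)] * 70)
rates = selection_rates(data)
print(rates, disparate_impact_ratio(rates))
```

A low ratio does not by itself prove the algorithm is the cause; it flags a disparity that then needs investigation of the data and design.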
Explainability

Main article: Explainable AI

Many ADM systems, particularly those based on machine learning, act as "black boxes": the basis of a particular decision is difficult for the people affected to inspect or contest. Research on explainable AI aims to make the reasoning behind automated decisions interpretable to humans.

Information asymmetry

ADM can create an information asymmetry between the organisations that deploy automated systems and the individuals subject to their decisions, because the data, models and decision logic involved are rarely visible to those affected.
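For simple additive models, the kind of explanation a right to explanation envisages can be read directly from per-feature contributions, which is one reason simple models are often preferred where decisions must be justified. The weights and feature names below are hypothetical and for illustration only.

```python
# For a linear score, each feature's contribution is weight * value,
# so the decision decomposes exactly into per-feature terms.
WEIGHTS = {"income": -0.002, "missed_payments": 1.5, "account_age_years": -0.2}
BIAS = 0.5

def explain(features):
    """Return the raw score and per-feature contributions, largest first."""
    contributions = {n: WEIGHTS[n] * v for n, v in features.items()}
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]),
                    reverse=True)
    total = BIAS + sum(contributions.values())
    return total, ranked

total, ranked = explain(
    {"income": 400, "missed_payments": 2, "account_age_years": 3})
print(f"score={total:.2f}")
for name, contribution in ranked:
    print(f"  {name}: {contribution:+.2f}")
```

For non-linear models such as deep networks no exact decomposition like this exists, and explanations must instead be approximated, which is the harder problem explainable AI research addresses.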
Research fields

Organisations researching automated decision-making and its social consequences include research centres in Australia, the Citizen Lab in Canada, and Informatics Europe.

See also

Algorithmic bias
Decision Management
Machine learning
Recommender systems