Will algorithms blind people? The effect of explainable AI and decision-makers' experience on AI-supported decision-making in government

Janssen, M., Hartog, M., Ricardo, M., Ding, A. and Kuk, G. (ORCID: https://orcid.org/0000-0002-1288-3635), 2020. Will algorithms blind people? The effect of explainable AI and decision-makers' experience on AI-supported decision-making in government. Social Science Computer Review. ISSN 0894-4393

Text: 1385111_a1390_Kuk.pdf - Published version. Download (295kB)

Abstract

Computational artificial intelligence (AI) algorithms are increasingly used to support decision-making by governments. Yet algorithms often remain opaque to decision-makers and devoid of clear explanations for the decisions made. In this study, we used an experimental approach to compare decision-making in three situations: humans making decisions 1) without any support from algorithms, 2) supported by business rules (BR) and 3) supported by machine learning (ML). Participants were asked to make the correct decisions given various scenarios, whilst the BR and ML algorithms could provide correct or incorrect suggestions to the decision-maker. This enabled us to evaluate whether participants were able to understand the limitations of BR and ML. The experiment shows that algorithms help decision-makers to make more correct decisions. The findings suggest that explainable AI, combined with experience, helps them detect incorrect suggestions made by algorithms. However, even experienced persons were not able to identify all mistakes. Ensuring the ability to understand and trace back decisions is not sufficient to avoid incorrect decisions. The findings imply that algorithms should be adopted with care, and that selecting appropriate algorithms for supporting decisions and training decision-makers are key factors in increasing accountability and transparency.
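
The abstract contrasts two forms of algorithmic support: business rules, which are explicit and traceable, and machine learning, which typically offers no explanation for its suggestion. The following minimal Python sketch illustrates that contrast only; every field, threshold, and rule in it is hypothetical, since the paper does not publish its scenarios, rules, or model.

    # Hypothetical sketch of the two forms of algorithmic support described
    # in the abstract. All names, fields, and thresholds are illustrative.
    from dataclasses import dataclass


    @dataclass
    class Scenario:
        income: int      # applicant's yearly income (illustrative field)
        dependents: int  # number of dependents (illustrative field)


    def business_rule_suggestion(s: Scenario) -> tuple[str, str]:
        """A business rule is explicit: the suggestion comes with the rule
        that fired, so the decision-maker can trace it back."""
        if s.income < 20_000 and s.dependents > 0:
            return "grant benefit", "rule: income < 20,000 and dependents > 0"
        return "reject", "rule: default case"


    def ml_suggestion(s: Scenario) -> str:
        """A (mock) ML model returns only a label, with no reason attached:
        the opacity the experiment asks decision-makers to cope with."""
        score = 0.8 if s.income < 25_000 else 0.3  # stand-in for a trained model
        return "grant benefit" if score > 0.5 else "reject"


    scenario = Scenario(income=18_000, dependents=2)
    print(business_rule_suggestion(scenario))  # suggestion plus traceable reason
    print(ml_suggestion(scenario))             # suggestion only

As the sketch suggests, a decision-maker can check the BR suggestion against the stated rule, whereas an incorrect ML suggestion offers nothing to check against, which is where, per the findings, experience matters.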

Item Type: Journal article
Publication Title: Social Science Computer Review
Creators: Janssen, M., Hartog, M., Ricardo, M., Ding, A. and Kuk, G.
Publisher: SAGE Publications
Date: 28 December 2020
ISSN: 0894-4393
Identifiers:
DOI: 10.1177/0894439320980118
Other: 1385111
Rights: © The Author(s) 2020. Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/), which permits any use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage).
Divisions: Schools > Nottingham Business School
Record created by: Linda Sullivan
Date Added: 05 Nov 2020 10:50
Last Modified: 31 May 2021 15:07
URI: https://irep.ntu.ac.uk/id/eprint/41512
