An Integrated Approach to Study the Delegation of Conflict-of-Interest Decisions to Autonomous Agents

SCHEME: INTER

CALL: 2018

DOMAIN: IS - Information and Communication Technologies

FIRST NAME: Jean

LAST NAME: Botev

INDUSTRY PARTNERSHIP / PPP: No

INDUSTRY / PPP PARTNER:

HOST INSTITUTION: University of Luxembourg

KEYWORDS: Autonomous Agents, Delegation, Decision-Making, Socio-Technical Systems

START: 2018-11-15

END: 2019-03-14

WEBSITE: https://www.uni.lu

Submitted Abstract

In this age of ubiquitous digital interconnectivity, we may envisage that humans will increasingly delegate their social, economic or data-related transactions to an autonomous agent, for reasons of convenience or complexity. Although the scientific knowledge to create such systems appears to be available, this transformation is unlikely to become commonplace soon, except perhaps for the use of basic digital assistants. We aim to explore whether this is due to a lack of knowledge about human trust in and acceptance of artificial autonomous delegates that make decisions on their behalf, or about how these delegates should be designed. We study these questions using computational agent models that are validated in a series of behavioural experiments defined around the public goods game. We investigate when and how the autonomous agent may evolve from observer, via decision support, to a delegate with full autonomy in decision-making. Using VR and AR technologies, we will investigate whether the representation through which the agent is experienced influences trust. All technology-oriented research is checked against socio-technical acceptance theories through close collaboration with experts in the social sciences. The results of this fundamental research will allow us to explore important questions related to the intelligence and interface of the envisioned agents, and lay the foundation for new types of online markets that bring autonomous agents into real-world applications.
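For readers unfamiliar with the experimental paradigm named above, the following minimal sketch shows one round of a standard linear public goods game; the function name, group size, endowment and multiplication factor are illustrative assumptions and not taken from the project itself.

```python
# Illustrative sketch: one round of a linear public goods game.
# Endowment and multiplier values are arbitrary choices for this example.

def public_goods_round(contributions, endowment=20, multiplier=1.6):
    """Return each player's payoff after one round of a linear public goods game."""
    pool = sum(contributions) * multiplier      # contributions are pooled and multiplied
    share = pool / len(contributions)           # the pool is shared equally among players
    return [endowment - c + share for c in contributions]

# Example: four players, one of whom free-rides entirely.
print(public_goods_round([20, 20, 20, 0]))      # the free-rider receives the highest payoff
```

In the delegation scenarios studied in the project, such contribution decisions would be made either by the human participant or, at increasing levels of autonomy, by the artificial delegate acting on their behalf.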
