Dagstuhl Report on Trustworthy Conversational Agents

On February 18, 2022, the Dagstuhl Report „Conversational Agent as Trustworthy Autonomous System (Trust-CA)“ was published. Its editors are Effie Lai-Chong Law, Asbjørn Følstad, Jonathan Grudin, and Björn Schuller. From the abstract: „This report documents the program and the outcomes of Dagstuhl Seminar 21381 ‚Conversational Agent as Trustworthy Autonomous System (Trust-CA)‘. First, we present the abstracts of the talks delivered by the Seminar’s attendees. Then we report on the origin and process of our six breakout (working) groups. For each group, we describe its contributors, goals and key questions, key insights, and future research. The themes of the groups were derived from a pre-Seminar survey, which also led to a list of suggested readings for the topic of trust in conversational agents. The list is included in this report for references.“ (Abstract Dagstuhl Report) The seminar, attended by scientists and experts from around the world, was held at Schloss Dagstuhl from September 19 to 24, 2021. The report can be downloaded via drops.dagstuhl.de/opus/volltexte/2022/15770/.

Fig.: The on-site group (Photo: Schloss Dagstuhl – LZ GmbH)

Trustworthy Conversational Agents

The Dagstuhl seminar „Conversational Agent as Trustworthy Autonomous System (Trust-CA)“ will take place from September 19 – 24, 2021. According to the website, Schloss Dagstuhl – Leibniz-Zentrum für Informatik „pursues its mission of furthering world class research in computer science by facilitating communication and interaction between researchers“. The organizers of this event are Asbjørn Følstad (SINTEF – Oslo), Jonathan Grudin (Microsoft – Redmond), Effie Lai-Chong Law (University of Leicester), and Björn Schuller (University of Augsburg). They outline the background as follows: „CA, like many other AI/ML-infused autonomous systems, need to gain the trust of their users in order to be deployed effectively. Nevertheless, in the first place, we need to ensure that such systems are trustworthy. Persuading users to trust a non-trustworthy CA is grossly unethical. Conversely, failing to convince users to trust a trustworthy CA that is beneficial to their wellbeing can be detrimental, given that a lack of trust leads to low adoption or total rejection of a system. A deep understanding of how trust is initially built and evolved in human-human interaction (HHI) can shed light on the trust journey in human-automation interaction (HAI). This calls forth a multidisciplinary analytical framework, which is lacking but much needed for informing the design of trustworthy autonomous systems like CA.“ (Website Dagstuhl) Regarding the goal of the workshop, the organizers write: „The overall goal of this Dagstuhl Seminar is to bring together researchers and practitioners, who are currently engaged in diverse communities related to Conversational Agent (CA) to explore the three main challenges on maximising the trustworthiness of and trust in CA as AI/ML-driven autonomous systems – an issue deemed increasingly significant given the widespread uses of CA in every sector of life – and to chart a roadmap for the future research on CA.” (Website Dagstuhl) Oliver Bendel (School of Business FHNW) will talk about his chatbot and voice assistant projects, which have emerged from machine ethics and social robotics since 2013. Further information is available here (Photo: Schloss Dagstuhl).

Fig.: Bird’s eye view of Schloss Dagstuhl (Photo: Schloss Dagstuhl)

International Workshop on Trustworthy Conversational Agents

In the fall of 2021, a five-day workshop on trustworthy conversational agents will be held at Schloss Dagstuhl. Prof. Dr. Oliver Bendel is among the invited participants. According to the website, Schloss Dagstuhl – Leibniz Center for Informatics pursues its mission of furthering world-class research in computer science by facilitating communication and interaction between researchers. Oliver Bendel and his teams have developed several chatbots, such as GOODBOT, LIEBOT, and BESTBOT, in the context of machine ethics since 2013; these were presented at conferences at Stanford University and Jagiellonian University and received international attention. Since the beginning of 2020, he has been preparing to develop several voice assistants that can show empathy and emotion. „Schloss Dagstuhl was founded in 1990 and quickly became established as one of the world’s premier meeting centers for informatics research. Since the very first days of Schloss Dagstuhl, the seminar and workshop meeting program has always been the focus of its programmatic work. In recent years, Schloss Dagstuhl has expanded its operation and also has significant efforts underway in bibliographic services … and in open access publishing.“ (Website Schloss Dagstuhl)

Fig.: Is this voicebot trustworthy?