Modeling and Evaluating Personas with Software Explainability Requirements

Publication: Contribution to book/conference proceedings/edited volume (contributed, peer-reviewed)

Contributors

Abstract

This work focuses on software explainability, i.e., the production of software capable of explaining to users the dynamics that govern its internal functioning. User models that include information about users' requirements and their perceptions of explainability are fundamental when building software with such a capability. This study investigates the process of creating personas that include information about users' explainability perceptions and needs. The proposed approach is based on collecting data with questionnaires, modeling empathy maps, grouping the maps, generating personas from the groups, and evaluating the personas with the Persona Perception Scale method. In an empirical study, personas are created from the questionnaire responses of 61 users. The generated personas are evaluated by 60 users and 38 designers according to the attributes of the Persona Perception Scale method. The results include a set of 5 distinct personas that users rate as representative of them at an average level of 3.7 out of 5 and that designers rate as having a quality of 3.5 out of 5. The median rating is 4 out of 5 for the majority of the criteria judged by users and designers. Both the personas and the approach for creating and evaluating them are contributions of this study to the design of software that satisfies the explainability requirement.
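The abstract describes a pipeline from questionnaire data to grouped empathy maps to personas, but does not state which grouping technique was used. The following is a minimal, hypothetical sketch of that grouping step, assuming respondents' answers are encoded as numeric vectors and using k-means clustering purely as an illustrative stand-in; the respondent encoding, question count, and all variable names are assumptions, not the authors' method.

```python
# Hypothetical sketch: grouping questionnaire-based empathy-map encodings into
# persona clusters. The paper's abstract does not specify the grouping technique;
# k-means is used here only as an illustrative choice.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Assume each of the 61 respondents is encoded as a vector of Likert-style
# answers (1-5) about their explainability needs and perceptions.
n_respondents, n_questions = 61, 12
responses = rng.integers(1, 6, size=(n_respondents, n_questions)).astype(float)

# Group the respondents; the study reports 5 distinct personas, so k = 5.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=42)
labels = kmeans.fit_predict(responses)

# Each cluster centroid can then seed one persona: the mean answer profile of
# its group is a starting point for that persona's explainability needs.
for persona_id, centroid in enumerate(kmeans.cluster_centers_):
    members = int(np.sum(labels == persona_id))
    print(f"Persona {persona_id}: {members} respondents, "
          f"mean answer profile {np.round(centroid, 2)}")
```

In the actual study, the resulting personas were then rated by users and designers on the Persona Perception Scale; the sketch above only illustrates how questionnaire responses might be partitioned before persona generation.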

Details

Original language: English
Title: Human-Computer Interaction. HCI-COLLAB 2021. Communications in Computer and Information Science
Publisher: Springer, Cham
Volume: 1478
ISBN (electronic): 978-3-030-92325-9
ISBN (print): 978-3-030-92324-2
Publication status: Published - 2021
Peer-reviewed: Yes

External IDs

Scopus 85121869756

Keywords