A Profile and Design Space for Characterizing User Interface Adaptation
Víctor López-Jaquero, Vivian Genaro Motti, Francisco Montero, Pascual González López, Nicolas Burny ... 47-67
Explainable Artificial Intelligence in Natural Language Processing
Marian Gabriel Sandu, Ştefan Trăuşan-Matu ... 68-84
1 Laboratory of User Interaction and Software Engineering (LoUISE), Computer Science Dept., University of Castilla-La Mancha
Campus Universitario, 02071 Albacete, Spain
2 Dept. of Information Sciences and Technology (IST), Volgenau School of Engineering, George Mason University
Engineering Building, Rm. 5350, Fairfax Campus, USA
3 Louvain Research Institute in Management and Organizations (LouRIM), Université catholique de Louvain
Place des Doyens, 1 – B-1348 Louvain-la-Neuve, Belgium
Abstract. To better characterize the adaptation process of a user interface, we introduce an adaptation profile and a design space based on the seven adaptation stages defined in the GISATIE life-cycle: goals, initiative, specification, application, transition, interpretation, and evaluation. The adaptation profile expresses who is responsible for each stage of the adaptation cycle: one or several end users, one or several machine agents, one or several third parties, or any combination of the former. The adaptation design space expresses seven key dimensions along which adaptation can be decided and designed: autonomy level, granularity level, task resuming granularity, user interface deployment, technological space coverage, user feedback, and modality. Examples illustrate how to apply this profile and design space to two systems that support user interface adaptation to some extent.
Keywords: adaptation profile, design space, user interface adaptation
Cite this paper as:
López-Jaquero, V., Motti, V. G., Montero, F., López, P. G., Burny, N. A Profile and Design Space for Characterizing User Interface Adaptation.
International Journal of User-System Interaction 14(2),
47-67, 2021.
1 University Politehnica of Bucharest
313 Splaiul Independentei, Bucharest, Romania
2 Institutul de Cercetări în Inteligenţa Artificială
Calea 13 Septembrie nr. 13, Bucureşti
3 Academy of Romanian Scientists
Splaiul Independentei 54, Bucharest, Romania
Abstract. Explainable Artificial Intelligence has received considerable interest and has grown steadily in the last few years. This is because the Machine Learning and Deep Learning fields have expanded rapidly, producing increasingly complex models that are highly accurate but lack explainability and interpretability. The aim of this paper is to present two of the most widely used model-agnostic explanation methods and to experiment with them on a conversational dataset.
Keywords: artificial intelligence; machine learning; deep learning; neural networks; human-AI interaction
Cite this paper as:
Sandu, M. G., Trăuşan-Matu, S. Explainable Artificial Intelligence in Natural Language Processing.
International Journal of User-System Interaction 14(2),
68-84, 2021.