Involving Cognitive Science in Model Transformation for Description Logics

Research output: Contribution to conferences › Presentation slides › Contributed › peer-review

Abstract

Knowledge representation and reasoning (KRR) is a fundamental area in artificial intelligence (AI) research, focusing on
encoding world knowledge as logical formulae in ontologies. This formalism enables logic-based AI systems to deduce new
insights from existing knowledge. Within KRR, description logics (DLs) are a prominent family of languages to represent
knowledge formally. They are decidable fragments of first-order logic, and their models can be visualized as edge- and vertex-labeled directed binary graphs. DLs facilitate various reasoning tasks, including checking the satisfiability of statements and
deciding entailment. However, a significant challenge arises in the computation of models of DL ontologies in the context
of explaining reasoning results. Although existing algorithms efficiently compute models for reasoning tasks, they usually
do not consider aspects of human cognition, leading to models that may be less effective for explanatory purposes. This
paper tackles this challenge by proposing an approach to enhance the intelligibility of models of DL ontologies for users. By
integrating insights from cognitive science and philosophy, we aim to identify key graph properties that make models more
accessible and useful for explanation.
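To make the graph view of DL models concrete, here is a minimal illustrative sketch (not taken from the slides; all names are hypothetical): an interpretation is rendered as a directed graph whose vertices carry concept-name labels and whose edges carry role-name labels, and a simple existential restriction is checked against it.

```python
# A tiny DL interpretation as an edge- and vertex-labeled directed graph.
# Vertex labels are concept names; edge labels are role names.
# Hypothetical data illustrating the axiom  Parent == EXISTS hasChild.TOP
from collections import defaultdict

vertex_labels = {
    "alice": {"Person", "Parent"},
    "bob": {"Person"},
}

# (source individual, role name) -> set of target individuals
edges = defaultdict(set)
edges[("alice", "hasChild")].add("bob")

def satisfies_exists(individual, role):
    """Check whether the existential restriction (EXISTS role.TOP)
    holds at this vertex, i.e. it has an outgoing role-labeled edge."""
    return bool(edges[(individual, role)])

# 'alice' has a hasChild-successor, so she satisfies EXISTS hasChild.TOP;
# 'bob' has none, so he does not.
print(satisfies_exists("alice", "hasChild"))  # True
print(satisfies_exists("bob", "hasChild"))    # False
```

Graph properties of such models (size, branching, label density) are exactly the kind of features the abstract proposes to assess for cognitive accessibility.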

Details

Original language: English
Number of pages: 15
Publication status: Published - 9 Jun 2023
Peer-reviewed: Yes

Conference

Title: Model-Based Reasoning, Abductive Cognition, Creativity 2023
Subtitle: Inferences & Models in Science, Language, and Technology
Abbreviated title: MBR023
Conference number: 9
Duration: 7 - 9 June 2023
Degree of recognition: International event
Location: Sapienza University of Rome
City: Rome
Country: Italy

External IDs

ORCID /0000-0001-5232-5729/work/187997467
ORCID /0000-0001-5398-5569/work/187999516

Keywords