First Notes on Maximum Entropy Entailment for Quantified Implications
Publication: Contribution in book/conference proceedings/anthology/expert opinion › Contribution in conference proceedings › Contributed › Peer-reviewed

Contributors
Abstract
Entropy is a measure of the uninformativeness, or randomness, of a data set: the higher the entropy, the lower the amount of information. In the field of propositional logic it has proven to be a suitable measure to maximize when dealing with models of probabilistic propositional theories. More specifically, it was shown that the maximum-entropy model of a probabilistic propositional theory allows for the deduction of further formulae that humans would intuitively expect, i.e., it enables a kind of common-sense reasoning. In order to carry the technique of maximum entropy entailment over to the field of Formal Concept Analysis, we define the notion of the entropy of a formal context with respect to the frequencies of its object intents, and then define maximum entropy entailment for quantified implication sets, i.e., for sets of partial implications where each implication has an assigned degree of confidence. Furthermore, this entailment technique is utilized to define so-called maximum entropy implicational bases (ME-bases), and a first general example of such an ME-base is provided.
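The abstract defines the entropy of a formal context via the frequencies of its object intents. The paper's exact definition is not reproduced here; a plausible sketch, assuming the standard Shannon entropy over the empirical distribution of object intents (each object represented by the set of attributes it has), could look as follows. The function name `context_entropy` and the toy context are illustrative choices, not taken from the paper:

```python
import math
from collections import Counter

def context_entropy(object_intents):
    """Shannon entropy (in bits) of a formal context, computed over the
    frequency distribution of its object intents.

    `object_intents` is a list with one frozenset of attributes per object;
    objects sharing the same intent contribute to the same frequency bucket.
    """
    n = len(object_intents)
    counts = Counter(object_intents)
    # H = -sum_i p_i * log2(p_i), where p_i is the relative frequency
    # of the i-th distinct object intent.
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy context: four objects over the attribute set {a, b}.
ctx = [frozenset({"a"}), frozenset({"a"}), frozenset({"a", "b"}), frozenset()]
print(context_entropy(ctx))  # 1.5 bits: frequencies 1/2, 1/4, 1/4
```

A context in which every object has the same intent would have entropy 0 (fully uninformative in the sense that one row already determines all rows), while a context whose object intents are all pairwise distinct attains the maximal entropy log2(n).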
Details
| Original language | English |
|---|---|
| Title | Formal Concept Analysis |
| Editors | Karell Bertet, Daniel Borchmann, Peggy Cellier, Sébastien Ferré |
| Publisher | Springer, Berlin [u. a.] |
| Pages | 155-167 |
| Number of pages | 13 |
| Publication status | Published - 13 June 2017 |
| Peer-reviewed | Yes |
Publication series

| Series | Lecture Notes in Computer Science, Volume 10308 |
|---|---|
| ISSN | 0302-9743 |
External IDs

| Scopus | 85021223700 |
|---|---|
| ORCID | /0000-0003-0219-0330/work/153109395 |