First Notes on Maximum Entropy Entailment for Quantified Implications
Research output: Contribution to book/Conference proceedings/Anthology/Report › Conference contribution › Contributed › peer-review
Abstract
Entropy is a measure of the uninformativeness or randomness of a data set: the higher the entropy, the lower the amount of information. In the field of propositional logic it has proven to be a suitable measure to maximize when dealing with models of probabilistic propositional theories. More specifically, it was shown that the maximum-entropy model of a probabilistic propositional theory allows for the deduction of further formulae that humans would somehow expect, i.e., it enables a kind of common-sense reasoning. In order to transfer the technique of maximum entropy entailment to the field of Formal Concept Analysis, we define the notion of the entropy of a formal context with respect to the frequencies of its object intents, and then define maximum entropy entailment for quantified implication sets, i.e., for sets of partial implications where each implication has an assigned degree of confidence. This entailment technique is then utilized to define so-called maximum entropy implicational bases (ME-bases), and a first general example of such an ME-base is provided.
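As a rough illustration of the two ingredients named in the abstract, the following Python sketch computes the entropy of a small formal context from the relative frequencies of its object intents, together with the confidence of a partial implication. The function names, the toy context `K`, and the use of the plain Shannon formula are illustrative assumptions; the paper's precise definitions may differ.

```python
from collections import Counter
from math import log2

def context_entropy(context):
    """Shannon entropy of a formal context, taken over the relative
    frequencies of its object intents (a sketch of the notion described
    in the abstract; the paper's exact definition may differ).

    `context` maps each object to its intent, i.e. the set of
    attributes the object has.
    """
    freq = Counter(frozenset(intent) for intent in context.values())
    n = sum(freq.values())
    # H = -sum_B p_B * log2(p_B), where p_B is the fraction of objects
    # whose intent equals B.
    return -sum((c / n) * log2(c / n) for c in freq.values())

def confidence(context, premise, conclusion):
    """Confidence of the partial implication premise -> conclusion:
    the fraction of objects satisfying the premise that also satisfy
    the conclusion (the usual Luxenburger-style definition)."""
    premise, conclusion = set(premise), set(conclusion)
    support = [g for g, intent in context.items() if premise <= intent]
    if not support:
        return 1.0  # implications with empty support hold vacuously
    hits = [g for g in support if conclusion <= context[g]]
    return len(hits) / len(support)

# Toy context with objects g1..g4 and attributes a, b.
K = {"g1": {"a"}, "g2": {"a"}, "g3": {"a", "b"}, "g4": set()}
print(context_entropy(K))           # intents occur with p = 1/2, 1/4, 1/4 -> 1.5
print(confidence(K, {"a"}, {"b"}))  # 1 of the 3 objects with a also has b -> 0.333...
```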
Details
Original language | English |
---|---|
Title of host publication | Formal Concept Analysis |
Editors | Karell Bertet, Daniel Borchmann, Peggy Cellier, Sébastien Ferré |
Publisher | Springer, Berlin [etc.] |
Pages | 155–167 |
Number of pages | 13 |
Publication status | Published - 13 Jun 2017 |
Peer-reviewed | Yes |
Publication series
Series | Lecture Notes in Computer Science, Volume 10308 |
---|---|
ISSN | 0302-9743 |
External IDs
Scopus | 85021223700 |
---|---|
ORCID | /0000-0003-0219-0330/work/153109395 |