Achieving parametric uniformity for knowledge bases in a relational probabilistic conditional logic with maximum entropy semantics




Publisher
Springer Journals
Copyright
Copyright © 2013 by Springer Science+Business Media Dordrecht
Subject
Computer Science; Artificial Intelligence (incl. Robotics); Mathematics, general; Computer Science, general; Statistical Physics, Dynamical Systems and Complexity
ISSN
1012-2443
eISSN
1573-7470
DOI
10.1007/s10472-013-9369-3

Abstract

When extending probabilistic logic to a relational setting, it is desirable to still be able to use the efficient computation mechanisms developed for the propositional case. In this paper, we investigate the relational probabilistic conditional logic FO-PCL, whose semantics employs the principle of maximum entropy. While this semantics is, in general, defined via the ground instances of the rules in an FO-PCL knowledge base ${\cal R}$, the maximum entropy model can be computed on the level of rules rather than on the level of their instances if ${\cal R}$ is parametrically uniform. We elaborate in detail the reasons that cause ${\cal R}$ not to be parametrically uniform. Based on this investigation, we derive a new syntactic criterion for parametric uniformity and develop an algorithm that transforms any FO-PCL knowledge base ${\cal R}$ into an equivalent knowledge base ${\cal R}^{\prime}$ that is parametrically uniform. This provides a basis for a simplified maximum entropy model computation, since ${\cal R}^{\prime}$ can be used in place of ${\cal R}$ for that computation.
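To illustrate the maximum entropy principle the abstract refers to, here is a generic propositional sketch (not the FO-PCL algorithm from the paper): among all distributions $P$ over possible worlds that satisfy a probabilistic conditional $(B|A)[x]$, i.e. $P(B|A) = x$, the maximum entropy model is $P^{*} = \arg\max_{P \models {\cal R}} H(P)$ with $H(P) = -\sum_{\omega} P(\omega) \log P(\omega)$. The toy computation below finds this distribution numerically for a single conditional $(B|A)[0.8]$ over two propositional atoms; the world encoding and the use of `scipy` are illustrative assumptions, not part of the paper.

```python
# Toy maximum-entropy model for one probabilistic conditional (B|A)[0.8].
# Worlds over atoms A, B are indexed 0=AB, 1=A~B, 2=~AB, 3=~A~B.
# Generic propositional sketch; FO-PCL itself works at the relational level.
import numpy as np
from scipy.optimize import minimize

X = 0.8  # target conditional probability P(B|A)

def neg_entropy(p):
    # Negative Shannon entropy; the epsilon guards against log(0).
    return np.sum(p * np.log(p + 1e-12))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},           # normalization
    {"type": "eq", "fun": lambda p: p[0] - X * (p[0] + p[1])},  # P(AB) = x * P(A)
]
bounds = [(1e-9, 1.0)] * 4
p0 = np.full(4, 0.25)  # start from the uniform distribution

res = minimize(neg_entropy, p0, bounds=bounds,
               constraints=constraints, method="SLSQP")
p = res.x
print(p)
```

At the optimum, $P(B|A) = 0.8$ holds, and the two worlds not mentioned by the conditional ($\neg A B$ and $\neg A \neg B$) receive equal probability, reflecting that maximum entropy adds no information beyond the stated constraints. Roughly speaking, parametric uniformity enables the same kind of simplification at the relational level: when it holds, all ground instances of a rule can share a single optimization parameter.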

Journal

Annals of Mathematics and Artificial Intelligence, Springer Journals

Published: Jul 20, 2013
