explaining logical entailments | Subject | knowledge media |
explaining logical entailments | hasPrincipalInvestigator | 2515c15e5a8e5ef71a6e3a3c05d159fc |
explaining logical entailments | type | Project |
explaining logical entailments | label | Explaining Logical Entailments |
explaining logical entailments | name | Explaining Logical Entailments |
explaining logical entailments | alternative label | Explaining Logical Entailments |
explaining logical entailments | Description | Explaining Entailments in OWL Ontologies

Building error-free, high-quality domain ontologies in OWL (Web Ontology Language), the standard ontology language endorsed by the World Wide Web Consortium, is not an easy task for domain experts, who usually have limited knowledge of OWL and logic. One sign of an erroneous ontology is the occurrence of undesired inferences (entailments), often caused by interactions among apparently innocuous axioms within the ontology. This suggests the need for a tool that lets ontology developers inspect why such an entailment follows from the ontology, so that they can debug and repair it.

This PhD project addresses the problem by developing a Natural Language Generation system capable of producing accessible English explanations of why an entailment follows from an OWL ontology. Justifications for entailments, that is, minimal subsets of the ontology from which the entailment can be drawn, are adopted as the basis for generating these explanations. The thesis focuses on planning the content of an explanation and on how to express OWL inferences in English. Part of its novelty is an assessment of the understandability of OWL inferences, both simple and complex, which enables the selection of the easiest explanation for an entailment from among the alternatives.

The project findings should be of interest to researchers in Natural Language Generation and Knowledge Representation, and to developers of ontology viewing and editing tools and of automated reasoners for OWL who wish to integrate an explanation facility into their systems to support their users, especially non-expert users.

Publications: http://oro.open.ac.uk/cgi/search/simple?screen=Public%3A%3AEPrintSearch&meta_merge=ALL&meta=&person_merge=ALL&person=Nguyen%2C+Tu&date=&satisfyall=ALL&order=-date%2Fcreators_name%2Ftitle&_action_search=Search |
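The notion of a justification described above (a minimal subset of axioms from which an entailment follows) can be made concrete with a small sketch. The Python snippet below is illustrative only, assuming a toy ontology reduced to plain SubClassOf pairs; the class names (Penguin, Bird, FlyingAnimal, Animal) and the brute-force subset search are hypothetical choices for this example, not the project's actual method, which works over full OWL ontologies via a reasoner.

from itertools import combinations

# Toy ontology: each axiom is a SubClassOf pair (subclass, superclass).
# ILLUSTRATIVE ASSUMPTION: real OWL axioms are far richer than this.
axioms = [
    ("Penguin", "Bird"),
    ("Bird", "FlyingAnimal"),
    ("FlyingAnimal", "Animal"),
    ("Penguin", "Animal"),   # redundant axiom: also yields the entailment
]

def entails(axiom_set, sub, sup):
    """Does axiom_set entail sub SubClassOf sup? (transitive closure)"""
    reachable, frontier = {sub}, [sub]
    while frontier:
        cls = frontier.pop()
        for s, t in axiom_set:
            if s == cls and t not in reachable:
                reachable.add(t)
                frontier.append(t)
    return sup in reachable

def justifications(axioms, sub, sup):
    """All minimal subsets of axioms from which sub SubClassOf sup follows."""
    found = []
    for size in range(1, len(axioms) + 1):
        for subset in combinations(axioms, size):
            # skip supersets of an already-found justification (not minimal)
            if any(set(j) <= set(subset) for j in found):
                continue
            if entails(subset, sub, sup):
                found.append(subset)
    return found

for j in justifications(axioms, "Penguin", "Animal"):
    print(j)
# (('Penguin', 'Animal'),)
# (('Penguin', 'Bird'), ('Bird', 'FlyingAnimal'), ('FlyingAnimal', 'Animal'))

Here the entailment Penguin SubClassOf Animal has two justifications: the single redundant axiom and the three-axiom chain. Choosing between such alternatives, by judging which explanation is easiest to understand, is exactly the kind of selection the project investigates.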
explaining logical entailments | in dataset | crc |
explaining logical entailments | organization | explaining logical entailments |