Open-Domain Contextual Link Prediction and its Complementarity with Entailment Graphs

Mark Johnson, Mohammad Javad Hosseini, Shay Cohen, Mark Steedman

06 November 2021

An open-domain knowledge graph (KG) has entities as nodes and natural language relations as edges, and is constructed by extracting (subject, relation, object) triples from text. The task of open-domain link prediction is to infer missing relations in the KG. Previous work has used standard link prediction for this task. Since the triples are extracted from text, they can be grounded in the larger textual context in which they were originally found. However, standard link prediction methods rely only on the KG structure and ignore the textual context of the triples. In this paper, we introduce the new task of open-domain contextual link prediction, which has access to both the textual context and the KG structure to perform link prediction. We build a dataset for the task and propose a model for it. Our experiments show that context is crucial in predicting missing relations. We also demonstrate the utility of contextual link prediction in discovering out-of-context entailments between relations, in the form of entailment graphs (EGs), in which the nodes are relations. The reverse also holds: out-of-context EGs assist in predicting relations in context.
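To make the task setup concrete, the sketch below illustrates one plausible way to represent a context-grounded triple and to frame contextual link prediction as choosing the most likely relation for an incomplete triple. The data layout, field names, and the toy `score_relation` heuristic are assumptions made for exposition only; they are not the dataset format or the model proposed in the paper.

```python
# Illustrative sketch only: a plausible data layout for open-domain contextual
# link prediction. Field names and the scoring interface are assumptions for
# exposition, not the paper's dataset format or model.
from dataclasses import dataclass
from typing import List

@dataclass
class ContextualTriple:
    subject: str    # entity node in the open-domain KG
    relation: str   # natural language relation (edge label)
    obj: str        # entity node in the open-domain KG
    context: str    # sentence the triple was extracted from

# A toy KG: triples grounded in the text they came from.
kg: List[ContextualTriple] = [
    ContextualTriple("Ada Lovelace", "was born in", "London",
                     "Ada Lovelace was born in London in 1815."),
    ContextualTriple("Ada Lovelace", "worked with", "Charles Babbage",
                     "In the 1840s, Ada Lovelace worked with Charles Babbage."),
]

def score_relation(subject: str, candidate: str, obj: str,
                   context: str, graph: List[ContextualTriple]) -> float:
    """Hypothetical scorer: a contextual link predictor would combine the
    textual context with KG structure (other edges touching the same
    entities). Here, lexical overlap plus a neighbour count stands in for
    a learned model."""
    overlap = len(set(candidate.split()) & set(context.split()))
    neighbours = sum(t.subject == subject or t.obj == obj for t in graph)
    return overlap + 0.1 * neighbours

# Predict the missing relation for (Ada Lovelace, ?, London) given its context.
candidates = ["was born in", "died in", "visited"]
context = "Ada Lovelace was born in London in 1815."
best = max(candidates,
           key=lambda r: score_relation("Ada Lovelace", r, "London", context, kg))
print(best)  # -> "was born in" under this toy heuristic
```

In this framing, dropping the `context` argument recovers standard (non-contextual) link prediction, which is why models that ignore the text discard useful signal.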


Venue: EMNLP 2021 (https://2021.emnlp.org/)

File Name: Contextual_lpred_and_its_complementarity_with_entgraphs.pdf