Combining Analogy with Language Models for Knowledge Extraction

Danilo Neves Ribeiro, Kenneth Forbus


Combines language models with analogical learning to extract commonsense facts from web text using only a few examples.
Learning structured knowledge from natural language text has been a long-standing challenge. Previous work has focused on specific domains, mostly extracting knowledge about named entities (e.g., countries, companies, or persons) rather than general-purpose world knowledge (e.g., information about science or everyday objects). In this paper, we combine the Companion Cognitive Architecture with the BERT language model to extract structured knowledge from text, with the goal of automatically inferring missing commonsense facts in an existing knowledge base. Using the principles of distant supervision, the system learns functions called query cases that map statements expressed in natural language into knowledge base relations. The system then uses these query cases to extract structured knowledge via analogical reasoning. We run experiments on 2,679 Simple English Wikipedia articles, where the system learns high-precision facts about a variety of subjects from a few training examples, outperforming strong baselines.
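To make the distant supervision idea concrete, here is a minimal sketch (not the paper's code; the facts, sentences, and function names are hypothetical): a sentence that mentions both arguments of a known knowledge base fact is treated as a noisy training example expressing that fact's relation.

```python
# Hypothetical toy KB and corpus, for illustration only.
KB_FACTS = [
    ("Paris", "capitalOf", "France"),
    ("oak", "isA", "tree"),
]

SENTENCES = [
    "Paris is the capital of France.",
    "An oak is a large tree found in many forests.",
    "France is in Europe.",
]

def distant_supervision(facts, sentences):
    """Pair each sentence with any KB relation whose two arguments it mentions.

    The pairing is noisy: co-occurrence of the arguments does not guarantee
    the sentence actually expresses the relation.
    """
    examples = []
    for sentence in sentences:
        lowered = sentence.lower()
        for head, relation, tail in facts:
            if head.lower() in lowered and tail.lower() in lowered:
                examples.append((sentence, relation, (head, tail)))
    return examples

for sent, rel, args in distant_supervision(KB_FACTS, SENTENCES):
    print(rel, args, "<-", sent)
```

In the paper's setting, such automatically aligned pairs would be generalized (via analogy over query cases) rather than used directly as classifier labels; this sketch only illustrates the alignment step.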


title={Combining Analogy with Language Models for Knowledge Extraction},
author={Danilo Neves Ribeiro and Kenneth Forbus},
booktitle={3rd Conference on Automated Knowledge Base Construction},