Improving Relation Extraction by Pre-trained Language Representations

Christoph Alt, Marc Hübner, Leonhard Hennig

doi:10.24432/C5KW2W

TL;DR

We propose a Transformer-based relation extraction model that uses pre-trained language representations instead of explicit linguistic features.
Current state-of-the-art relation extraction methods typically rely on a set of lexical, syntactic, and semantic features, explicitly computed in a pre-processing step. Training feature extraction models requires additional annotated language resources, which severely restricts the applicability and portability of relation extraction to novel languages. Similarly, pre-processing introduces an additional source of error. To address these limitations, we introduce TRE, a Transformer for Relation Extraction, extending the OpenAI Generative Pre-trained Transformer [Radford et al., 2018]. Unlike previous relation extraction models, TRE uses pre-trained deep language representations instead of explicit linguistic features to inform the relation classification, and combines them with the self-attentive Transformer architecture to effectively model long-range dependencies between entity mentions. TRE allows us to learn implicit linguistic features solely from plain text corpora by unsupervised pre-training, before fine-tuning the learned language representations on the relation extraction task. TRE obtains a new state-of-the-art result on the TACRED and SemEval 2010 Task 8 datasets, achieving a test F1 of 67.4 and 87.1, respectively. Furthermore, we observe a significant increase in sample efficiency. With only 20% of the training examples, TRE matches the performance of our baselines and our model trained from scratch on 100% of the TACRED dataset. We open-source our trained models, experiments, and source code.
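The sketch below illustrates the general recipe the abstract describes, not the authors' released code: a GPT-style Transformer pre-trained on plain text supplies the language representations, and a small classification head is fine-tuned on the relation extraction task. The Hugging Face transformers library, the openai-gpt checkpoint, the entity marker tokens, and the RelationClassifier class are assumptions made purely for illustration; the exact input format and head used by TRE are described in the paper and its released code.

import torch
import torch.nn as nn
from transformers import OpenAIGPTModel, OpenAIGPTTokenizer

class RelationClassifier(nn.Module):
    """Pre-trained Transformer encoder plus a linear relation classification head."""
    def __init__(self, encoder: OpenAIGPTModel, num_relations: int):
        super().__init__()
        self.encoder = encoder  # language representations from unsupervised pre-training
        self.classifier = nn.Linear(encoder.config.n_embd, num_relations)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # Summarize the sequence with the hidden state of its last real token,
        # in the spirit of GPT-style task fine-tuning.
        last = attention_mask.sum(dim=1) - 1
        summary = hidden[torch.arange(hidden.size(0)), last]
        return self.classifier(summary)

tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
encoder = OpenAIGPTModel.from_pretrained("openai-gpt")

# Illustrative entity markers; registered as new tokens so the embedding
# matrix is extended before fine-tuning.
tokenizer.add_special_tokens(
    {"additional_special_tokens": ["<e1>", "</e1>", "<e2>", "</e2>"]})
encoder.resize_token_embeddings(len(tokenizer))

model = RelationClassifier(encoder, num_relations=42)  # e.g. TACRED: 41 relations + no_relation
enc = tokenizer("<e1> Bill Gates </e1> founded <e2> Microsoft </e2> in 1975 .",
                return_tensors="pt")
logits = model(enc["input_ids"], enc["attention_mask"])  # shape: (1, 42)

Fine-tuning such a head together with the pre-trained encoder on labeled relation examples is what lets implicit linguistic features learned from plain text replace explicitly computed lexical, syntactic, and semantic features.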

Citation

@inproceedings{alt2019improving,
  title={Improving Relation Extraction by Pre-trained Language Representations},
  author={Christoph Alt and Marc H{\"u}bner and Leonhard Hennig},
  booktitle={Automated Knowledge Base Construction (AKBC)},
  year={2019},
  url={https://openreview.net/forum?id=BJgrxbqp67},
  doi={10.24432/C5KW2W}
}