Learning Relation Representations from Word Representations

Huda Hakami, Danushka Bollegala

doi:10.24432/C5BC7F

TL;DR

Identifying the relations that connect words is important for various NLP tasks. We model relation representation as a supervised learning problem and learn parametrised operators that map pre-trained word embeddings to relation representations.
Abstract

Identifying the relations that connect words is an important step towards understanding human languages and is useful for various NLP tasks such as knowledge base completion and analogical reasoning. Simple unsupervised operators such as the vector offset between two word embeddings have been shown to recover some specific relationships between those words, if any. Despite this, how to accurately learn generic relation representations from word representations remains unclear. We model relation representation as a supervised learning problem and learn parametrised operators that map pre-trained word embeddings to relation representations. We propose a method for learning relation representations using a feed-forward neural network that performs relation prediction. Our evaluations on two benchmark datasets reveal that the penultimate layer of the trained neural network-based relation predictor acts as a good representation for the relations between words.
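
The abstract does not spell out the exact architecture, so the following is only an illustrative sketch in PyTorch: a small feed-forward network that classifies the relation holding between a word pair and exposes its penultimate hidden layer as the relation representation. The class name, layer sizes, activation, and the concatenation of the two pre-trained word embeddings as input are all assumptions, not details taken from the paper.

import torch
import torch.nn as nn

class RelationPredictor(nn.Module):
    """Feed-forward relation predictor; the penultimate layer output is
    used as the relation representation for a word pair (illustrative)."""

    def __init__(self, embed_dim=300, hidden_dim=200, num_relations=10):
        super().__init__()
        # Input: concatenation of the two pre-trained word embeddings (assumed pairing).
        self.hidden = nn.Sequential(
            nn.Linear(2 * embed_dim, hidden_dim),
            nn.Tanh(),
        )
        # Final layer predicts the relation label.
        self.classifier = nn.Linear(hidden_dim, num_relations)

    def forward(self, head_emb, tail_emb):
        pair = torch.cat([head_emb, tail_emb], dim=-1)
        relation_repr = self.hidden(pair)        # penultimate layer = relation representation
        logits = self.classifier(relation_repr)  # relation prediction (supervised objective)
        return logits, relation_repr

# Toy usage with random stand-ins for pre-trained word embeddings.
model = RelationPredictor()
head = torch.randn(4, 300)   # e.g. embeddings of the first word in each pair
tail = torch.randn(4, 300)   # e.g. embeddings of the second word in each pair
logits, rel_vecs = model(head, tail)
print(rel_vecs.shape)        # torch.Size([4, 200]) -- one relation vector per word pair

After training the classifier on labelled word pairs, rel_vecs (rather than logits) would be the quantity of interest, serving as the learned relation representation.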

Citation

@inproceedings{hakami2019learning,
  title     = {Learning Relation Representations from Word Representations},
  author    = {Huda Hakami and Danushka Bollegala},
  booktitle = {Automated Knowledge Base Construction (AKBC)},
  year      = {2019},
  url       = {https://openreview.net/forum?id=r1e3WW5aTX},
  doi       = {10.24432/C5BC7F}
}