Joint Learning of Hierarchical Word Embeddings from a Corpus and a Taxonomy

Mohammed Alsuhaibani, Takanori Maehara, Danushka Bollegala

doi:10.24432/C50591

TL;DR

We present a method to jointly learn Hierarchical Word Embeddings (HWEs) from a corpus and a taxonomy for identifying hypernymy relations between words.
Abstract

Identifying the hypernym relations that hold between words is a fundamental task in NLP. Word embedding methods have recently shown some capability to encode hypernymy. However, such methods tend not to explicitly encode the hypernym hierarchy that exists between words. In this paper, we propose a method to learn hierarchical word embeddings that capture hypernymy. To learn the word embeddings, the proposed method considers not only the hypernym relations that exist between words in a taxonomy, but also their contextual information in a large text corpus. Experimental results on a supervised hypernymy detection task and a newly proposed hierarchical path completion task show the ability of the proposed method to encode the hierarchy. Moreover, the proposed method outperforms previously proposed methods for learning word and hypernym-specific word embeddings on multiple benchmarks.
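The abstract describes the joint objective only at a high level. As a rough illustration (not the authors' implementation, whose exact objective is given in the paper), the sketch below combines a GloVe-style co-occurrence loss over a corpus with a penalty that keeps each hyponym vector close to its hypernym vector from the taxonomy; the function and parameter names (train_joint_hwe, lambda_tax, etc.) are hypothetical.

# Minimal sketch of joint corpus + taxonomy embedding learning.
# Not the authors' code: the corpus term is GloVe-style weighted least
# squares on log co-occurrences, and the taxonomy term simply pulls each
# hyponym toward its hypernym. The paper's actual formulation may differ.
import numpy as np

def train_joint_hwe(cooc, hypernym_pairs, vocab_size, dim=50,
                    lambda_tax=0.5, lr=0.05, epochs=50, seed=0):
    """cooc: dict {(word_id, context_id): count} from a corpus.
    hypernym_pairs: list of (hyponym_id, hypernym_id) taxonomy edges."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(vocab_size, dim))   # word vectors
    C = rng.normal(scale=0.1, size=(vocab_size, dim))   # context vectors
    bw = np.zeros(vocab_size)
    bc = np.zeros(vocab_size)

    for _ in range(epochs):
        # Corpus objective: fit word-context dot products to log counts.
        for (i, j), x in cooc.items():
            f = min((x / 100.0) ** 0.75, 1.0)            # GloVe-style weighting
            err = W[i] @ C[j] + bw[i] + bc[j] - np.log(x)
            gW, gC = f * err * C[j], f * err * W[i]
            W[i] -= lr * gW
            C[j] -= lr * gC
            bw[i] -= lr * f * err
            bc[j] -= lr * f * err

        # Taxonomy objective: keep each hyponym close to its hypernym so the
        # learned space reflects the hierarchical structure of the taxonomy.
        for hypo, hyper in hypernym_pairs:
            diff = W[hypo] - W[hyper]
            W[hypo] -= lr * lambda_tax * diff
            W[hyper] += lr * lambda_tax * diff
    return W

In this sketch, lambda_tax trades off fitting corpus co-occurrence statistics against respecting taxonomy edges; it stands in for whatever weighting the paper uses between the two information sources.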

Citation

@inproceedings{
alsuhaibani2019joint,
title={Joint Learning of Hierarchical Word Embeddings from a Corpus and a Taxonomy},
author={Mohammed Alsuhaibani and Takanori Maehara and Danushka Bollegala},
booktitle={Automated Knowledge Base Construction (AKBC)},
year={2019},
url={https://openreview.net/forum?id=S1xf-W5paX},
doi={10.24432/C50591}
}