Commonsense Reasoning with Implicit Knowledge in Natural Language

Pratyay Banerjee, Swaroop Mishra, Kuntal Kumar Pal, Arindam Mitra, Chitta Baral

doi:10.24432/C57C7H

TL;DR

We take a middle ground between large language models and knowledge graphs by using smaller language models together with relatively small but targeted natural language text corpora to reason with implicit commonsense.
Abstract

Commonsense reasoning is a research challenge studied since the early days of AI. In recent years, several natural language QA tasks have been proposed in which commonsense reasoning is important. Two common approaches to such tasks are (i) using well-structured commonsense present in knowledge graphs, and (ii) using progressively larger transformer language models. While acquiring and representing commonsense in a formal representation is challenging in approach (i), approach (ii) becomes increasingly resource-intensive. In this work, we take a middle ground: we use smaller language models together with relatively small but targeted natural language text corpora. The advantage of such an approach is that it is less resource-intensive while still being able to use unstructured text corpora. We define different unstructured commonsense knowledge sources, explore three strategies for knowledge incorporation, and propose four methods, competitive with state-of-the-art methods, to reason with implicit commonsense.

Citation

@inproceedings{banerjee2021commonsense,
  title     = {Commonsense Reasoning with Implicit Knowledge in Natural Language},
  author    = {Pratyay Banerjee and Swaroop Mishra and Kuntal Kumar Pal and Arindam Mitra and Chitta Baral},
  booktitle = {3rd Conference on Automated Knowledge Base Construction},
  year      = {2021},
  url       = {https://openreview.net/forum?id=a4-fFL7aCi0},
  doi       = {10.24432/C57C7H}
}