Conference Paper — Year: 2021

Reasoning with Transformer-based Models: Deep Learning, but Shallow Reasoning

Abstract

Recent years have seen impressive performance of transformer-based models on different natural language processing tasks. However, it is not clear to what degree transformers can reason on natural language. To shed light on this question, this survey paper discusses the performance of transformers on different reasoning tasks, including mathematical reasoning, commonsense reasoning, and logical reasoning. We point out successes and limitations of both an empirical and a theoretical nature.
Main file

akbc-2021-reasoning.pdf (335.71 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03344668, version 1 (15-09-2021)

Identifiers

  • HAL Id: hal-03344668, version 1

Cite

Chadi Helwe, Chloé Clavel, Fabian Suchanek. Reasoning with Transformer-based Models: Deep Learning, but Shallow Reasoning. International Conference on Automated Knowledge Base Construction (AKBC), 2021, online, United States. ⟨hal-03344668⟩
522 views
256 downloads
