
Implicit Discourse Relation Classification with Syntax-Aware Contextualized Word Representations

Abstract: Automatically identifying implicit discourse relations requires an in-depth semantic understanding of the text fragments involved in such relations. While early work investigated the usefulness of different classes of input features, current state-of-the-art models mostly rely on standard pretrained word embeddings to model the arguments of a discourse relation. In this paper, we introduce a method to compute contextualized representations of words, leveraging information from the sentence dependency parse, to improve argument representation. The resulting token embeddings encode the structure of the sentence from a dependency point of view in their representations. Experimental results show that the proposed representations achieve state-of-the-art results when input to standard neural network architectures, surpassing complex models that use additional data and consider the interaction between arguments.
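The abstract describes computing syntax-aware contextualized token representations from a sentence dependency parse before feeding them to a standard neural classifier. As a rough, minimal sketch of that general idea only (assuming spaCy for dependency parsing; the concatenation scheme and the randomly initialized dependency-label table below are illustrative assumptions, not the authors' model):

import numpy as np
import spacy

# Requires a spaCy model with a parser and word vectors, e.g.:
#   python -m spacy download en_core_web_md
nlp = spacy.load("en_core_web_md")

# Illustrative stand-in for a learned embedding table over dependency relation labels.
DEP_DIM = 16
rng = np.random.default_rng(0)
dep_table = {label: rng.normal(size=DEP_DIM) for label in nlp.get_pipe("parser").labels}

def syntax_aware_embeddings(sentence: str) -> np.ndarray:
    """One row per token: [token word vector ; head word vector ; dependency-label vector]."""
    doc = nlp(sentence)
    rows = []
    for tok in doc:
        dep_vec = dep_table.get(tok.dep_, np.zeros(DEP_DIM))
        rows.append(np.concatenate([tok.vector, tok.head.vector, dep_vec]))
    return np.stack(rows)

# Example: encode one argument of a (hypothetical) implicit discourse relation.
arg1 = "The company cut its workforce."
print(syntax_aware_embeddings(arg1).shape)  # (num_tokens, 2 * 300 + DEP_DIM) with en_core_web_md

In the paper's setting, such token representations for the two arguments are input to standard neural network architectures for relation classification; the exact way syntax is injected above is only a placeholder.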
Document type: Conference papers

https://hal.archives-ouvertes.fr/hal-03352337
Contributor: Eric Gaussier
Submitted on: Thursday, September 23, 2021 - 10:10:46 AM
Last modification on: Tuesday, October 19, 2021 - 11:18:50 AM

File

18303-78922-1-PB.pdf (files produced by the author(s))

Identifiers

  • HAL Id: hal-03352337, version 1

Citation

Diana Popa, Julien Perez, James Henderson, Éric Gaussier. Implicit Discourse Relation Classification with Syntax-Aware Contextualized Word Representations. 32nd FLAIRS Conference (FLAIRS 2019), Sarasota, Florida, USA, 2019. ⟨hal-03352337⟩

Metrics: 20 record views, 19 file downloads