NCPrompt: NSP-Based Prompt Learning and Contrastive Learning for Implicit Discourse Relation Recognition

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission · Readers: Everyone
Abstract: Implicit Discourse Relation Recognition (IDRR) is the task of classifying the discourse relation sense between an argument pair that lacks an explicit connective. Recently, prompt learning methods have demonstrated success on IDRR. However, prior work primarily transforms IDRR into a connective-cloze task based on the masked language model (MLM), which limits the predicted word to a single token. Moreover, these methods rely on hand-crafted verbalizers, which are time-consuming to construct and less convincing. In this paper, we propose NCPrompt, an NSP-based prompt learning and Contrastive learning method for IDRR. Specifically, we automatically search for the optimal verbalizer for IDRR based on the statistical and expressive features of connectives. Furthermore, we transform the IDRR task into a next sentence prediction (NSP) task and introduce contrastive learning by constructing augmentation views. In this way, multi-token answer words can convey more precise meaning, and contrastive learning helps to generate more informative embeddings, both of which are expected to boost model performance. To our knowledge, we are the first to apply NSP to the IDRR task. Experiments on the PDTB 3.0 corpus demonstrate the effectiveness and superiority of our proposed model.
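As a rough illustration of the NSP-based formulation described in the abstract, the sketch below scores an argument pair against candidate connective-bearing prompts with a pretrained BERT NSP head. This is a minimal sketch under assumptions, not the paper's implementation: the prompt template, the candidate connective phrases, the sense-to-connective mapping, and the choice of bert-base-uncased via Hugging Face Transformers are all hypothetical.

```python
# Minimal sketch (not the paper's implementation): framing IDRR as next
# sentence prediction with a pretrained BERT NSP head. The template and the
# connective-to-sense mapping below are illustrative assumptions only.
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")
model.eval()

# Hypothetical verbalizer: multi-token connective phrases mapped to top-level senses.
CANDIDATES = {
    "Comparison": "however",
    "Contingency": "as a result",
    "Expansion": "in other words",
    "Temporal": "after that",
}

def score_senses(arg1: str, arg2: str) -> dict:
    """Score each sense by how plausible the NSP head finds arg1 followed by
    '<connective>, arg2' as a continuation."""
    scores = {}
    for sense, connective in CANDIDATES.items():
        second = f"{connective}, {arg2}"
        inputs = tokenizer(arg1, second, return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = model(**inputs).logits  # shape (1, 2)
        # In the HF NSP head, index 0 = "is the next sentence".
        prob_next = torch.softmax(logits, dim=-1)[0, 0].item()
        scores[sense] = prob_next
    return scores

if __name__ == "__main__":
    arg1 = "The company cut costs sharply last quarter."
    arg2 = "its profit margin improved."
    print(score_senses(arg1, arg2))
```

Because the connective phrase sits inside the second sentence rather than in a single [MASK] slot, multi-token answer words pose no difficulty; the paper's contrastive-learning component and automatic verbalizer search are not reproduced here.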
Paper Type: long
Research Area: Discourse and Pragmatics
Languages Studied: English
Preprint Status: There is no non-anonymous preprint and we do not intend to release one.
A1: yes
A2: no
A3: yes
B: no
C: yes
D: no
E: no