Semi-supervised Continual Learning with Meta Self-training and Consistency Regularization

Anonymous

16 Oct 2022 (modified: 05 May 2023) · ACL ARR 2022 October Blind Submission
Keywords: continual learning, semi-supervised learning, meta-learning, text classification
Abstract: Recent advances in continual learning (CL) are mainly confined to the supervised setting, which is often impractical. To narrow this gap, we consider a semi-supervised continual learning (SSCL) setting for lifelong language learning. In this paper, we exploit unlabeled data under limited supervision in the CL setting and demonstrate the feasibility of semi-supervised learning in CL. Specifically, we propose a novel method, Meta-Aug, which employs meta self-training and consistency regularization to learn a sequence of semi-supervised tasks. We use prototypical pseudo-labeling and data augmentation to learn efficiently under limited supervision without catastrophic forgetting. Furthermore, replay-based CL methods tend to overfit to memory samples; we mitigate this by applying strong textual augmentation to improve generalization. Extensive experiments on CL benchmark text classification datasets from diverse domains show that our method achieves promising results in SSCL.
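To make the two ingredients named in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): prototypical pseudo-labeling assigns labels to unlabeled examples by nearest class prototype, and a consistency-regularization loss pushes predictions on a strongly augmented view toward those on a weakly augmented view. The function names, confidence threshold, and temperature are illustrative assumptions, and the sketch assumes an encoder that maps text to fixed-size embeddings.

    # Hypothetical sketch, not the paper's code.
    import torch
    import torch.nn.functional as F

    def class_prototypes(emb_labeled, labels, num_classes):
        """Mean embedding per class computed from the small labeled set."""
        protos = torch.stack(
            [emb_labeled[labels == c].mean(dim=0) for c in range(num_classes)]
        )
        return F.normalize(protos, dim=-1)

    def pseudo_label(emb_unlabeled, protos, threshold=0.8, temperature=0.1):
        """Label unlabeled examples by nearest prototype; keep confident ones."""
        sims = F.normalize(emb_unlabeled, dim=-1) @ protos.t()   # cosine similarity
        probs = F.softmax(sims / temperature, dim=-1)
        conf, labels = probs.max(dim=-1)
        mask = conf >= threshold                                  # confidence filter
        return labels[mask], mask

    def consistency_loss(logits_weak, logits_strong):
        """Match predictions on the strong augmentation to the weak-view target."""
        targets = F.softmax(logits_weak.detach(), dim=-1)
        return F.kl_div(
            F.log_softmax(logits_strong, dim=-1), targets, reduction="batchmean"
        )

In a replay-based setup such as the one described, the same strong augmentation used in the consistency term could also be applied to memory samples, which is the mechanism the abstract credits with reducing overfitting to the replay buffer.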
Paper Type: long
Research Area: Machine Learning for NLP