Exploring Low-dimensional Intrinsic Task Subspace via Prompt Tuning
Unsupervised Domain Adaptation of a Pretrained Cross-Lingual Language Model
Contrastive Representation Distillation
Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning
On Transferability of Prompt Tuning for Natural Language Understanding
TransPrompt: Towards an Automatic Transferable Prompting Framework for Few-shot Text Classification
Iterative Network Pruning with Uncertainty Regularization for Lifelong Sentiment Classification
Continual Learning with Hypernetworks
PromptBERT: Improving BERT Sentence Embeddings with Prompts
Parameter-Efficient Tuning by Manipulating Hidden States of Pretrained Language Models For Classification Tasks