AdapterShare: Task Correlation Modeling with Adapter Differentiation

Zhi Chen, Bei Chen, Lu Chen, Kai Yu, Jian-Guang Lou


Abstract
Thanks to the development of pre-trained language models, multitask learning (MTL) methods have achieved great success in the natural language understanding area. However, current MTL methods pay more attention to task selection or model design to fuse as much knowledge as possible, while intrinsic task correlation is often neglected. It is important to learn a sharing strategy among multiple tasks rather than sharing everything. In this paper, we propose AdapterShare, an adapter differentiation method to explicitly model the task correlation among multiple tasks. AdapterShare is automatically learned based on the gradients on tiny held-out validation data. Compared to single-task learning and fully shared MTL methods, our proposed method obtains clear performance improvements. Compared to the existing MTL method AdapterFusion, AdapterShare achieves an absolute 1.90-point average improvement on five dialogue understanding tasks and a 2.33-point gain on NLU tasks.
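The abstract does not spell out the implementation, but the core idea (task-differentiated adapters inside a shared backbone, with some tasks sharing an adapter and others getting their own) can be illustrated with a minimal PyTorch-style sketch. The class names, the bottleneck size, and the dictionary-based task-to-adapter assignment below are illustrative assumptions, not the authors' code; in the paper the assignment is induced from gradients on held-out validation data rather than fixed by hand.

```python
# Minimal sketch (not the authors' implementation): a bottleneck adapter and a
# layer that routes each task to one of a small pool of adapters, so tasks
# mapped to the same adapter share parameters while others stay differentiated.
import torch
import torch.nn as nn


class Adapter(nn.Module):
    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.ReLU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Residual bottleneck transformation, as in standard adapter tuning.
        return h + self.up(self.act(self.down(h)))


class TaskRoutedAdapters(nn.Module):
    """Pool of adapters plus a task->adapter map (here given; in the paper it is learned)."""

    def __init__(self, hidden_size: int, num_adapters: int, task_to_adapter: dict):
        super().__init__()
        self.adapters = nn.ModuleList(Adapter(hidden_size) for _ in range(num_adapters))
        self.task_to_adapter = task_to_adapter  # e.g. {"mnli": 0, "rte": 0, "sst2": 1}

    def forward(self, h: torch.Tensor, task: str) -> torch.Tensor:
        return self.adapters[self.task_to_adapter[task]](h)


# Usage: hidden states from one transformer layer for an "rte" batch.
layer = TaskRoutedAdapters(hidden_size=768, num_adapters=2,
                           task_to_adapter={"mnli": 0, "rte": 0, "sst2": 1})
out = layer(torch.randn(8, 32, 768), task="rte")
```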
Anthology ID:
2022.emnlp-main.728
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10645–10651
URL:
https://aclanthology.org/2022.emnlp-main.728
DOI:
10.18653/v1/2022.emnlp-main.728
Cite (ACL):
Zhi Chen, Bei Chen, Lu Chen, Kai Yu, and Jian-Guang Lou. 2022. AdapterShare: Task Correlation Modeling with Adapter Differentiation. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 10645–10651, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
AdapterShare: Task Correlation Modeling with Adapter Differentiation (Chen et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.728.pdf