Effective Integration of Text Diffusion and Pre-Trained Language Models with Linguistic Easy-First Schedule

Yimin Ou, Ping Jian


Abstract
Diffusion models have become a powerful generative modeling paradigm, achieving great success on continuous data. However, the discrete nature of text results in compatibility issues between continuous diffusion models (CDMs) and pre-trained language models (PLMs): the performance of diffusion models can even degrade when combined with PLMs. To alleviate this issue, we propose to use a pre-trained decoder to convert the denoised embedding vectors into natural language, instead of the widely used rounding operation. In this way, CDMs can be combined with PLMs more effectively. Additionally, existing noise schedules in text diffusion models do not account for the linguistic differences among tokens, which violates the easy-first policy of text generation. We therefore propose a linguistic easy-first schedule that incorporates a measure of word importance, conforming to easy-first linguistic generation behavior and improving generation quality. Experimental results on the E2E dataset and five controllable tasks show that our approach combines the merits of CDMs and PLMs, significantly outperforming other diffusion-based models.
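The easy-first schedule described above can be illustrated with a minimal sketch. The idea is to give more important (harder) tokens a higher noise level at a given timestep, so they are denoised later and easy tokens are committed first. This sketch assumes a standard cosine base schedule and per-token importance scores in [0, 1]; the function names, the `strength` parameter, and the timestep-shifting rule are illustrative assumptions, not the paper's actual formulation.

```python
import math

def cosine_alpha_bar(t, T):
    # Standard cosine noise schedule: alpha_bar decays from 1 (clean)
    # at t = 0 toward 0 (pure noise) at t = T.
    return math.cos((t / T) * math.pi / 2) ** 2

def easy_first_alpha_bar(t, T, importance, strength=0.5):
    """Per-token noise levels at step t.

    importance: list of scores in [0, 1], where 1 = most important.
    More important tokens get a larger effective timestep (more noise),
    so they are resolved later in the reverse process.  'strength' and
    the linear shift are hypothetical choices for this sketch.
    """
    levels = []
    for s in importance:
        t_eff = min(T, t * (1.0 + strength * s))  # shift harder tokens forward
        levels.append(cosine_alpha_bar(t_eff, T))
    return levels
```

For example, at the midpoint of a 1000-step schedule, a token with importance 1.0 receives a lower `alpha_bar` (i.e., more noise) than a token with importance 0.0, matching the easy-first ordering.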
Anthology ID:
2024.lrec-main.493
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Note:
Pages:
5551–5561
URL:
https://aclanthology.org/2024.lrec-main.493
Cite (ACL):
Yimin Ou and Ping Jian. 2024. Effective Integration of Text Diffusion and Pre-Trained Language Models with Linguistic Easy-First Schedule. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 5551–5561, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Effective Integration of Text Diffusion and Pre-Trained Language Models with Linguistic Easy-First Schedule (Ou & Jian, LREC-COLING 2024)
PDF:
https://aclanthology.org/2024.lrec-main.493.pdf