Controllable Factuality in Document-Grounded Dialog Systems Using a Noisy Channel Model

Nico Daheim, David Thulke, Christian Dugast, Hermann Ney


Abstract
In this work, we present a model for document-grounded response generation in dialog that is decomposed into two components according to Bayes’ theorem. One component is a traditional ungrounded response generation model and the other component models the reconstruction of the grounding document based on the dialog context and generated response. We propose different approximate decoding schemes and evaluate our approach on multiple open-domain and task-oriented document-grounded dialog datasets. Our experiments show that the model is more factual in terms of automatic factuality metrics than the baseline model. Furthermore, we outline how introducing scaling factors between the components allows for controlling the tradeoff between factuality and fluency in the model output. Finally, we compare our approach to a recently proposed method to control factuality in grounded dialog, CTRL (Rashkin et al., 2021), and show that both approaches can be combined to achieve additional improvements.
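The decomposition described in the abstract can be illustrated with a small re-scoring sketch: a candidate response is scored by an ungrounded response model p(r | c) plus a "channel" model p(d | c, r) that reconstructs the grounding document, with a scaling factor trading off the two. This is a simplified, hypothetical illustration only; the function names, toy scores, and the single scaling factor lam are assumptions for exposition, and the paper itself proposes approximate decoding schemes rather than plain re-ranking.

```python
def noisy_channel_score(log_p_response_given_context: float,
                        log_p_document_given_context_response: float,
                        lam: float = 1.0) -> float:
    """Log-linear combination of the two Bayes components.

    log p(r | c)    -- ungrounded response model (fluency)
    log p(d | c, r) -- reconstruction of the grounding document (factuality)
    lam             -- scaling factor controlling the factuality/fluency tradeoff
    """
    return log_p_response_given_context + lam * log_p_document_given_context_response

# Toy example: re-rank two candidate responses with hypothetical model scores.
candidates = [
    ("response A", -2.0, -5.0),  # fluent but poorly grounded
    ("response B", -2.5, -3.0),  # slightly less fluent, better grounded
]
best = max(candidates,
           key=lambda c: noisy_channel_score(c[1], c[2], lam=1.0))
print(best[0])  # with lam=1.0 the better-grounded "response B" wins
```

Increasing lam in this sketch weights the document-reconstruction term more heavily, mirroring the controllability the abstract describes.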
Anthology ID:
2022.findings-emnlp.98
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1365–1381
URL:
https://aclanthology.org/2022.findings-emnlp.98
DOI:
10.18653/v1/2022.findings-emnlp.98
Cite (ACL):
Nico Daheim, David Thulke, Christian Dugast, and Hermann Ney. 2022. Controllable Factuality in Document-Grounded Dialog Systems Using a Noisy Channel Model. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 1365–1381, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Controllable Factuality in Document-Grounded Dialog Systems Using a Noisy Channel Model (Daheim et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.98.pdf
Video:
https://aclanthology.org/2022.findings-emnlp.98.mp4