Empowering Tree-structured Entailment Reasoning: Rhetorical Perception and LLM-driven Interpretability

Longyin Zhang, Bowei Zou, Ai Ti Aw


Abstract
This study delves into the construction of entailment trees for science question answering (SQA), employing a novel framework termed Tree-structured Entailment Reasoning (TER). Current research on entailment tree construction faces significant challenges, primarily due to the ambiguities and similarities among candidate science facts, which considerably complicate the fact retrieval process. Moreover, existing models exhibit limitations in effectively modeling the sequence of reasoning states, understanding the intricate relations between neighboring entailment tree nodes, and generating intermediate conclusions. To this end, we explore enhancing TER performance from three aspects: first, improving retrieval capabilities by modeling and referring to the chained reasoning states; second, enhancing TER by infusing knowledge that bridges the gap between reasoning types and rhetorical relations; and third, exploring a task-specific large language model tuning scheme to mitigate deficiencies in intermediate conclusion generation. Experiments on the English EntailmentBank demonstrate the effectiveness of the proposed methods in augmenting the quality of tree-structured entailment reasoning to a certain extent.
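For readers unfamiliar with the task, the sketch below illustrates the general shape of iterative entailment tree construction that the abstract refers to: at each step, premises (leaf facts or previously derived conclusions) are retrieved and combined into an intermediate conclusion until the hypothesis is supported. This is only a minimal, assumption-laden illustration; `score_premise_pair` and `generate_conclusion` are hypothetical placeholders for learned retrieval and generation components, not the authors' models.

```python
from dataclasses import dataclass, field
from itertools import combinations
from typing import List, Optional


@dataclass
class Node:
    """A node in an entailment tree: a leaf fact or an intermediate conclusion."""
    text: str
    children: List["Node"] = field(default_factory=list)


def score_premise_pair(a: Node, b: Node, hypothesis: str) -> float:
    """Placeholder retrieval scorer: prefers premise pairs sharing words with the hypothesis."""
    overlap = set((a.text + " " + b.text).lower().split()) & set(hypothesis.lower().split())
    return float(len(overlap))


def generate_conclusion(a: Node, b: Node) -> str:
    """Placeholder for a learned generator of intermediate conclusions."""
    return f"{a.text}; combined with the fact that {b.text}"


def build_entailment_tree(facts: List[str], hypothesis: str, max_steps: int = 5) -> Optional[Node]:
    """Greedy bottom-up construction: repeatedly merge the best-scoring premise pair."""
    frontier = [Node(f) for f in facts]
    for _ in range(max_steps):
        if len(frontier) < 2:
            break
        # Retrieve the premise pair judged most relevant to the hypothesis.
        a, b = max(combinations(frontier, 2),
                   key=lambda pair: score_premise_pair(pair[0], pair[1], hypothesis))
        conclusion = Node(generate_conclusion(a, b), children=[a, b])
        frontier = [n for n in frontier if n is not a and n is not b] + [conclusion]
    return frontier[-1] if frontier else None


if __name__ == "__main__":
    tree = build_entailment_tree(
        ["an eclipse occurs when one body blocks light from another",
         "the moon can block sunlight from reaching the earth"],
        hypothesis="a solar eclipse occurs when the moon blocks sunlight")
    print(tree.text if tree else "no tree built")
```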
Anthology ID:
2024.lrec-main.513
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
5783–5793
URL:
https://aclanthology.org/2024.lrec-main.513
Cite (ACL):
Longyin Zhang, Bowei Zou, and Ai Ti Aw. 2024. Empowering Tree-structured Entailment Reasoning: Rhetorical Perception and LLM-driven Interpretability. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 5783–5793, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Empowering Tree-structured Entailment Reasoning: Rhetorical Perception and LLM-driven Interpretability (Zhang et al., LREC-COLING 2024)
PDF:
https://aclanthology.org/2024.lrec-main.513.pdf