Evaluating Unsupervised Argument Aligners via Generation of Conclusions of Structured Scientific Abstracts

Yingqiang Gao, Nianlong Gu, Jessica Lam, James Henderson, Richard Hahnloser


Abstract
Scientific abstracts provide a concise summary of research findings, making them a valuable resource for extracting scientific arguments. In this study, we assess various unsupervised approaches for extracting arguments as aligned premise-conclusion pairs: semantic similarity, text perplexity, and mutual information. We aggregate structured abstracts from PubMed Central Open Access papers published in 2022 and evaluate the argument aligners in terms of the performance of language models that we fine-tune to generate the conclusions from the extracted premises given as input prompts. We find that mutual information outperforms the other measures on this task, suggesting that the reasoning process in scientific abstracts hinges mostly on linguistic constructs beyond simple textual similarity.
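The abstract names three unsupervised measures for scoring candidate premise-conclusion alignments: semantic similarity, text perplexity, and mutual information. The sketch below is not the authors' implementation; it is a minimal illustration of how each measure might score a single premise-conclusion pair, assuming GPT-2 as a stand-in causal language model and a MiniLM sentence encoder for embeddings. The model choices, function names, and example sentences are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): score a candidate premise-conclusion
# pair with the three unsupervised measures named in the abstract.
# Assumed stand-in models: GPT-2 (causal LM) and all-MiniLM-L6-v2 (encoder).
import math

import torch
from sentence_transformers import SentenceTransformer, util
from transformers import AutoModelForCausalLM, AutoTokenizer

encoder = SentenceTransformer("all-MiniLM-L6-v2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")
lm = AutoModelForCausalLM.from_pretrained("gpt2").eval()


def semantic_similarity(premise: str, conclusion: str) -> float:
    """Cosine similarity between sentence embeddings of premise and conclusion."""
    emb = encoder.encode([premise, conclusion], convert_to_tensor=True)
    return util.cos_sim(emb[0], emb[1]).item()


@torch.no_grad()
def conditional_nll(context: str, target: str) -> float:
    """Total negative log-likelihood (in nats) of `target` tokens given `context`."""
    ctx_ids = tokenizer(context, return_tensors="pt").input_ids
    tgt_ids = tokenizer(target, return_tensors="pt").input_ids
    input_ids = torch.cat([ctx_ids, tgt_ids], dim=1)
    n_ctx = ctx_ids.shape[1]
    logits = lm(input_ids).logits
    # Causal shift: the logits at position t predict the token at position t+1.
    log_probs = torch.log_softmax(logits[:, :-1], dim=-1)
    targets = input_ids[:, 1:]
    token_nll = -log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    # Sum only over the target (conclusion) tokens, skipping the context span.
    return token_nll[:, n_ctx - 1:].sum().item()


def text_perplexity(premise: str, conclusion: str) -> float:
    """Perplexity of the conclusion conditioned on the premise."""
    n_tokens = len(tokenizer(conclusion).input_ids)
    return math.exp(conditional_nll(premise, conclusion) / n_tokens)


def mutual_information(premise: str, conclusion: str) -> float:
    """Pointwise MI approximation: log p(c | p) - log p(c), in nats.
    The unconditional term uses the BOS token as an empty context."""
    return (conditional_nll(tokenizer.bos_token, conclusion)
            - conditional_nll(premise, conclusion))


if __name__ == "__main__":
    # Hypothetical example pair; a lower perplexity or a higher similarity/MI
    # would indicate a better-aligned premise for this conclusion.
    premise = "Treatment with drug X reduced tumor size in 80% of mice."
    conclusion = "Drug X is a promising candidate for further clinical evaluation."
    print("similarity:", semantic_similarity(premise, conclusion))
    print("perplexity:", text_perplexity(premise, conclusion))
    print("mutual information:", mutual_information(premise, conclusion))
```

In this reading, higher semantic similarity and mutual information (or lower conditional perplexity) mark a conclusion as better aligned with a given premise; the paper evaluates such aligners downstream, by how well a language model fine-tuned on the extracted pairs generates conclusions from premises.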
Anthology ID:
2024.eacl-short.14
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
151–160
URL:
https://aclanthology.org/2024.eacl-short.14
Cite (ACL):
Yingqiang Gao, Nianlong Gu, Jessica Lam, James Henderson, and Richard Hahnloser. 2024. Evaluating Unsupervised Argument Aligners via Generation of Conclusions of Structured Scientific Abstracts. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers), pages 151–160, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Evaluating Unsupervised Argument Aligners via Generation of Conclusions of Structured Scientific Abstracts (Gao et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-short.14.pdf
Video:
https://aclanthology.org/2024.eacl-short.14.mp4