Portfolio item number 1
Short description of portfolio item number 1
Short description of portfolio item number 2
Published in COLING 2022, 2022
In this study, we propose DiverseQA, a method for diversifying answers in unsupervised question answering.
Recommended citation: Yuxiang Nie, Heyan Huang, Zewen Chi, and Xian-Ling Mao. 2022. Unsupervised Question Answering via Answer Diversifying. In Proceedings of the 29th International Conference on Computational Linguistics, pages 1732–1742, Gyeongju, Republic of Korea. International Committee on Computational Linguistics. https://aclanthology.org/2022.coling-1.149/
Published in EMNLP 2022, 2022
In this study, we propose the Compressive Graph Selector Network (CGSN) to capture the global structure of a long document in a compressive and iterative manner.
Recommended citation: Yuxiang Nie, Heyan Huang, Wei Wei, and Xian-Ling Mao. 2022. Capturing Global Structural Information in Long Document Question Answering with Compressive Graph Selector Network. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 5036–5047, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics. https://aclanthology.org/2022.emnlp-main.336/
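The one-line summary above leaves the mechanism implicit. A common pattern in graph-based evidence selection for long-document QA is to score passage chunks (nodes) against the question and then iteratively refine the chunk representations with information from neighbouring chunks, so that evidence far from the question's surface match can still be selected. The sketch below illustrates that general pattern under simplifying assumptions (mean-neighbour message passing, cosine scoring, a hypothetical `iterative_chunk_selection` helper); it is not the CGSN architecture itself.

```python
import torch
import torch.nn.functional as F

def iterative_chunk_selection(
    chunk_embs: torch.Tensor,    # [num_chunks, hidden] chunk (node) embeddings
    question_emb: torch.Tensor,  # [hidden] question embedding
    adjacency: torch.Tensor,     # [num_chunks, num_chunks] 0/1 float graph over chunks
    num_iters: int = 3,
    top_k: int = 2,
) -> torch.Tensor:
    """Toy sketch of graph-based evidence selection for long-document QA.

    Scores chunks against the question after repeatedly blending each chunk's
    representation with its neighbours, so chunks only indirectly related to
    the question can still surface. Illustrative assumption, not the CGSN model.
    """
    node = chunk_embs
    # Row-normalise the adjacency so neighbour aggregation is an average.
    norm_adj = adjacency / adjacency.sum(dim=-1, keepdim=True).clamp(min=1)
    for _ in range(num_iters):
        # Message passing: blend each node with the mean of its neighbours.
        node = 0.5 * node + 0.5 * norm_adj @ node
    # Final relevance scores: similarity between refined chunk states and the question.
    scores = F.cosine_similarity(node, question_emb.unsqueeze(0), dim=-1)
    # Return the indices of the selected (highest-scoring) chunks.
    return scores.topk(k=min(top_k, scores.numel())).indices
```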
Published in ACL 2023 Findings, 2023
In this study, we propose a new task, unsupervised long-document question answering (ULQA), which aims to generate high-quality long-document QA instances in an unsupervised manner. In addition, we propose AttenWalker, a novel unsupervised method that aggregates and generates answers with long-range dependencies to construct long-document QA pairs.
Recommended citation: Yuxiang Nie, Heyan Huang, Wei Wei, and Xian-Ling Mao. 2023. AttenWalker: Unsupervised Long-Document Question Answering via Attention-based Graph Walking. In Findings of the Association for Computational Linguistics: ACL 2023, Toronto, Canada. Association for Computational Linguistics. https://aclanthology.org/2023.findings-acl.862/
Published in NAACL 2024, 2024
In this work, we propose a general mix-Initiative Dynamic Prefix Tuning framework (IDPT) to decouple different initiatives from the generation model.
Recommended citation: Yuxiang Nie, Heyan Huang, Xian-Ling Mao, and Lizi Liao. 2024. Mix-Initiative Response Generation with Dynamic Prefix Tuning. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Mexico City, Mexico. Association for Computational Linguistics. https://arxiv.org/abs/2403.17636
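Prefix tuning learns continuous prompt vectors that are prepended to the model's input while the base model stays frozen; a mix-initiative variant can keep one prefix per initiative and select or mix them at generation time. The sketch below illustrates that idea under assumed names (`InitiativePrefixes`, `initiative_probs`) and soft prefix mixing; it is a minimal illustration, not the IDPT implementation.

```python
import torch
import torch.nn as nn

class InitiativePrefixes(nn.Module):
    """Toy sketch: one learnable prefix per initiative (e.g. user vs. system initiative).

    The base generation model stays frozen; only the prefix parameters are trained.
    This is an illustrative assumption, not the IDPT framework itself.
    """

    def __init__(self, num_initiatives: int, prefix_len: int, hidden_size: int):
        super().__init__()
        # A bank of prefix embeddings: [num_initiatives, prefix_len, hidden_size]
        self.prefixes = nn.Parameter(
            torch.randn(num_initiatives, prefix_len, hidden_size) * 0.02
        )

    def forward(self, token_embeds: torch.Tensor, initiative_probs: torch.Tensor) -> torch.Tensor:
        """Prepend a (soft-)selected prefix to the token embeddings.

        token_embeds:     [batch, seq_len, hidden_size]
        initiative_probs: [batch, num_initiatives], predicted initiative distribution
        """
        # Soft selection: mix the prefix bank by the predicted initiative distribution.
        # A hard variant could use argmax or Gumbel-softmax instead.
        mixed_prefix = torch.einsum("bn,nlh->blh", initiative_probs, self.prefixes)
        return torch.cat([mixed_prefix, token_embeds], dim=1)

if __name__ == "__main__":
    module = InitiativePrefixes(num_initiatives=2, prefix_len=4, hidden_size=16)
    tokens = torch.randn(2, 8, 16)
    probs = torch.softmax(torch.randn(2, 2), dim=-1)
    print(module(tokens, probs).shape)  # torch.Size([2, 12, 16])
```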
Published:
This is a description of your talk, which is a markdown file that can be markdown-ified like any other post. Yay markdown!
Published:
This is a description of your conference proceedings talk. Note the different field in type; you can put anything in this field.
Undergraduate course, University 1, Department, 2014
This is a description of a teaching experience. You can use markdown like any other post.
Workshop, University 1, Department, 2015
This is a description of a teaching experience. You can use markdown like any other post.