Revisiting out-of-distribution robustness in NLP: Benchmarks, analysis, and LLMs evaluations

L Yuan, Y Chen, G Cui, H Gao, F Zou… - Advances in …, 2023 - proceedings.neurips.cc
This paper reexamines the research on out-of-distribution (OOD) robustness in the field of
NLP. We find that the distribution shift settings in previous studies commonly lack adequate …

PromptNER: Prompting for named entity recognition

D Ashok, ZC Lipton - arXiv preprint arXiv:2305.15444, 2023 - arxiv.org
In a surprising turn, Large Language Models (LLMs) together with a growing arsenal of
prompt-based heuristics now offer powerful off-the-shelf approaches providing few-shot …

GLUE-X: Evaluating natural language understanding models from an out-of-distribution generalization perspective

L Yang, S Zhang, L Qin, Y Li, Y Wang, H Liu… - arXiv preprint arXiv …, 2022 - arxiv.org
Pre-trained language models (PLMs) are known to improve the generalization performance
of natural language understanding models by leveraging large amounts of data during the …

Cross-domain data augmentation with domain-adaptive language modeling for aspect-based sentiment analysis

J Yu, Q Zhao, R Xia - Proceedings of the 61st Annual Meeting of …, 2023 - aclanthology.org
Cross-domain Aspect-Based Sentiment Analysis (ABSA) aims to leverage the
useful knowledge from a source domain to identify aspect-sentiment pairs in sentences from …

Decoupled Hyperbolic Graph Attention Network for Cross-domain Named Entity Recognition

J Xu, Y Cai - Proceedings of the 46th International ACM SIGIR …, 2023 - dl.acm.org
To address the scarcity of massive labeled data, cross-domain named entity recognition
(cross-domain NER) attracts increasing attention. Recent studies focus on decomposing …

Mere contrastive learning for cross-domain sentiment analysis

Y Luo, F Guo, Z Liu, Y Zhang - arXiv preprint arXiv:2208.08678, 2022 - arxiv.org
Cross-domain sentiment analysis aims to predict the sentiment of texts in the target domain
using the model trained on the source domain to cope with the scarcity of labeled data …

RFiD: Towards rational fusion-in-decoder for open-domain question answering

C Wang, H Yu, Y Zhang - arXiv preprint arXiv:2305.17041, 2023 - arxiv.org
Open-Domain Question Answering (ODQA) systems necessitate a reader model capable of
generating answers by simultaneously referring to multiple passages. Although …

RobustGEC: Robust Grammatical Error Correction Against Subtle Context Perturbation

Y Zhang, L Cui, E Zhao, W Bi, S Shi - arXiv preprint arXiv:2310.07299, 2023 - arxiv.org
Grammatical Error Correction (GEC) systems play a vital role in assisting people with their
daily writing tasks. However, users may sometimes come across a GEC system that initially …

Generalizing few-shot named entity recognizers to unseen domains with type-related features

Z Wang, Z Zhao, Z Chen, P Ren, M de Rijke… - arXiv preprint arXiv …, 2023 - arxiv.org
Few-shot named entity recognition (NER) has shown remarkable progress in identifying
entities in low-resource domains. However, few-shot NER methods still struggle with out-of …

Prompting large language models for counterfactual generation: An empirical study

Y Li, M Xu, X Miao, S Zhou, T Qian - arXiv preprint arXiv:2305.14791, 2023 - arxiv.org
Large language models (LLMs) have made remarkable progress in a wide range of natural
language understanding and generation tasks. However, their ability to generate …