Revisiting out-of-distribution robustness in NLP: Benchmarks, analysis, and LLMs evaluations
This paper reexamines the research on out-of-distribution (OOD) robustness in the field of
NLP. We find that the distribution shift settings in previous studies commonly lack adequate …
PromptNER: Prompting for named entity recognition
In a surprising turn, Large Language Models (LLMs) together with a growing arsenal of
prompt-based heuristics now offer powerful off-the-shelf approaches providing few-shot …
GLUE-X: Evaluating natural language understanding models from an out-of-distribution generalization perspective
Pre-trained language models (PLMs) are known to improve the generalization performance
of natural language understanding models by leveraging large amounts of data during the …
Cross-domain data augmentation with domain-adaptive language modeling for aspect-based sentiment analysis
Abstract Cross-domain Aspect-Based Sentiment Analysis (ABSA) aims to leverage the
useful knowledge from a source domain to identify aspect-sentiment pairs in sentences from …
Decoupled Hyperbolic Graph Attention Network for Cross-domain Named Entity Recognition
J Xu, Y Cai - Proceedings of the 46th International ACM SIGIR …, 2023 - dl.acm.org
To address the scarcity of massive labeled data, cross-domain named entity recognition
(cross-domain NER) attracts increasing attention. Recent studies focus on decomposing …
Mere contrastive learning for cross-domain sentiment analysis
Cross-domain sentiment analysis aims to predict the sentiment of texts in the target domain
using the model trained on the source domain to cope with the scarcity of labeled data …
RFiD: Towards rational fusion-in-decoder for open-domain question answering
Open-Domain Question Answering (ODQA) systems necessitate a reader model capable of
generating answers by simultaneously referring to multiple passages. Although …
RobustGEC: Robust Grammatical Error Correction Against Subtle Context Perturbation
Grammatical Error Correction (GEC) systems play a vital role in assisting people with their
daily writing tasks. However, users may sometimes come across a GEC system that initially …
Generalizing few-shot named entity recognizers to unseen domains with type-related features
Few-shot named entity recognition (NER) has shown remarkable progress in identifying
entities in low-resource domains. However, few-shot NER methods still struggle with out-of …
Prompting large language models for counterfactual generation: An empirical study
Large language models (LLMs) have made remarkable progress in a wide range of natural
language understanding and generation tasks. However, their ability to generate …