Text-based emotion detection: Advances, challenges, and opportunities

FA Acheampong, C Wenyu… - Engineering …, 2020 - Wiley Online Library
Emotion detection (ED) is a branch of sentiment analysis that deals with the extraction and
analysis of emotions. The evolution of Web 2.0 has put text mining and analysis at the …

A survey on deep learning for textual emotion analysis in social networks

S Peng, L Cao, Y Zhou, Z Ouyang, A Yang, X Li… - Digital Communications …, 2022 - Elsevier
Textual Emotion Analysis (TEA) aims to extract and analyze user emotional states
in texts. Various Deep Learning (DL) methods have developed rapidly, and they have …

TimeLMs: Diachronic language models from Twitter

D Loureiro, F Barbieri, L Neves, LE Anke… - arXiv preprint arXiv …, 2022 - arxiv.org
Despite its importance, the time variable has been largely neglected in the NLP and
language model literature. In this paper, we present TimeLMs, a set of language models …

TweetEval: Unified benchmark and comparative evaluation for tweet classification

F Barbieri, J Camacho-Collados, L Neves… - arXiv preprint arXiv …, 2020 - arxiv.org
The experimental landscape in natural language processing for social media is too
fragmented. Each year, new shared tasks and datasets are proposed, ranging from classics …

ExT5: Towards extreme multi-task scaling for transfer learning

V Aribandi, Y Tay, T Schuster, J Rao, HS Zheng… - arXiv preprint arXiv …, 2021 - arxiv.org
Despite the recent success of multi-task learning and transfer learning for natural language
processing (NLP), few works have systematically studied the effect of scaling up the number …

XLM-T: Multilingual language models in Twitter for sentiment analysis and beyond

F Barbieri, LE Anke, J Camacho-Collados - arXiv preprint arXiv …, 2021 - arxiv.org
Language models are ubiquitous in current NLP, and their multilingual capacity has recently
attracted considerable attention. However, current analyses have almost exclusively focused …

TWHIN-BERT: A socially-enriched pre-trained language model for multilingual tweet representations at Twitter

X Zhang, Y Malkov, O Florez, S Park… - Proceedings of the 29th …, 2023 - dl.acm.org
Pre-trained language models (PLMs) are fundamental for natural language processing
applications. Most existing PLMs are not tailored to the noisy user-generated text on social …

Lifelong pretraining: Continually adapting language models to emerging corpora

X Jin, D Zhang, H Zhu, W Xiao, SW Li, X Wei… - arXiv preprint arXiv …, 2021 - arxiv.org
Pretrained language models (PTLMs) are typically learned over a large, static corpus and
further fine-tuned for various downstream tasks. However, when deployed in the real world …

Emotion recognition from unimodal to multimodal analysis: A review

K Ezzameli, H Mahersia - Information Fusion, 2023 - Elsevier
The omnipresence of numerous information sources in our daily life brings up new
alternatives for emotion recognition in several domains including e-health, e-learning …

Knowledge is a region in weight space for fine-tuned language models

A Gueta, E Venezian, C Raffel, N Slonim, Y Katz… - arXiv preprint arXiv …, 2023 - arxiv.org
Research on neural networks has focused on understanding a single model trained on a
single dataset. However, relatively little is known about the relationships between different …