Text‐based emotion detection: Advances, challenges, and opportunities
FA Acheampong, C Wenyu… - Engineering …, 2020 - Wiley Online Library
Emotion detection (ED) is a branch of sentiment analysis that deals with the extraction and
analysis of emotions. The evolution of Web 2.0 has put text mining and analysis at the …
A survey on deep learning for textual emotion analysis in social networks
S Peng, L Cao, Y Zhou, Z Ouyang, A Yang, X Li… - Digital Communications …, 2022 - Elsevier
Abstract Textual Emotion Analysis (TEA) aims to extract and analyze user emotional states
in texts. Various Deep Learning (DL) methods have developed rapidly, and they have …
TimeLMs: Diachronic language models from Twitter
Despite its importance, the time variable has been largely neglected in the NLP and
language model literature. In this paper, we present TimeLMs, a set of language models …
Tweeteval: Unified benchmark and comparative evaluation for tweet classification
The experimental landscape in natural language processing for social media is too
fragmented. Each year, new shared tasks and datasets are proposed, ranging from classics …
Ext5: Towards extreme multi-task scaling for transfer learning
Despite the recent success of multi-task learning and transfer learning for natural language
processing (NLP), few works have systematically studied the effect of scaling up the number …
XLM-T: Multilingual language models in Twitter for sentiment analysis and beyond
Language models are ubiquitous in current NLP, and their multilingual capacity has recently
attracted considerable attention. However, current analyses have almost exclusively focused …
Twhin-bert: A socially-enriched pre-trained language model for multilingual tweet representations at twitter
Pre-trained language models (PLMs) are fundamental for natural language processing
applications. Most existing PLMs are not tailored to the noisy user-generated text on social …
Lifelong pretraining: Continually adapting language models to emerging corpora
Pretrained language models (PTLMs) are typically learned over a large, static corpus and
further fine-tuned for various downstream tasks. However, when deployed in the real world …
Emotion recognition from unimodal to multimodal analysis: A review
K Ezzameli, H Mahersia - Information Fusion, 2023 - Elsevier
The omnipresence of numerous information sources in our daily life brings up new
alternatives for emotion recognition in several domains including e-health, e-learning …
Knowledge is a region in weight space for fine-tuned language models
Research on neural networks has focused on understanding a single model trained on a
single dataset. However, relatively little is known about the relationships between different …