When Multitask Learning Meets Partial Supervision: A Computer Vision Review
Multitask learning (MTL) aims to learn multiple tasks simultaneously while exploiting their
mutual relationships. By using shared resources to simultaneously calculate multiple …
Large language models are built-in autoregressive search engines
Document retrieval is a key stage of standard Web search engines. Existing dual-encoder
dense retrievers obtain representations for questions and documents independently …
Knowledge-augmented methods for natural language processing
Knowledge in NLP has been a rising trend especially after the advent of large-scale pre-
trained models. Knowledge is critical to equip statistics-based models with common sense …
Muffin: Curating multi-faceted instructions for improving instruction following
In the realm of large language models (LLMs), enhancing instruction-following capability
often involves curating expansive training data. This is achieved through two primary …
The vault: A comprehensive multilingual dataset for advancing code understanding and generation
We present The Vault, a dataset of high-quality code-text pairs in multiple programming
languages for training large language models to understand and generate code. We present …
Task compass: Scaling multi-task pre-training with task prefix
Leveraging task-aware annotated data as supervised signals to assist with self-supervised
learning on large-scale unlabeled data has become a new trend in pre-training language …
Challenges and opportunities of using transformer-based multi-task learning in NLP through ML lifecycle: A position paper
L Torbarina, T Ferkovic, L Roguski, V Mihelcic… - Natural Language …, 2024 - Elsevier
The increasing adoption of natural language processing (NLP) models across industries has
led to practitioners' need for machine learning (ML) systems to handle these models …
Explaining legal judgments: A multitask learning framework for enhancing factual consistency in rationale generation
The explanation of legal judgments is crucial for transparency, fairness, and
trustworthiness, aiming to provide rationales for decision-making. While previous works …
Granular syntax processing with multi-task and curriculum learning
Syntactic processing techniques are the foundation of natural language processing (NLP),
supporting many downstream NLP tasks. In this paper, we conduct pair-wise multi-task …
Natural language processing based on name entity with n-gram classifier machine learning process through ge-based hidden markov model
Natural Language Processing (NLP) is the computational linguistics mode for the
identification and classification of text documents. The NLP process comprises the retrieval …