When Multitask Learning Meets Partial Supervision: A Computer Vision Review

M Fontana, M Spratling, M Shi - Proceedings of the IEEE, 2024 - ieeexplore.ieee.org
Multitask learning (MTL) aims to learn multiple tasks simultaneously while exploiting their
mutual relationships. By using shared resources to simultaneously calculate multiple …

Large language models are built-in autoregressive search engines

N Ziems, W Yu, Z Zhang, M Jiang - arXiv preprint arXiv:2305.09612, 2023 - arxiv.org
Document retrieval is a key stage of standard Web search engines. Existing dual-encoder
dense retrievers obtain representations for questions and documents independently …

Knowledge-augmented methods for natural language processing

C Zhu, Y Xu, X Ren, BY Lin, M Jiang, W Yu - Proceedings of the sixteenth …, 2023 - dl.acm.org
Knowledge in NLP has been a rising trend, especially after the advent of large-scale pre-trained models. Knowledge is critical to equip statistics-based models with common sense …

Muffin: Curating multi-faceted instructions for improving instruction following

R Lou, K Zhang, J Xie, Y Sun, J Ahn, H Xu… - The Twelfth …, 2023 - openreview.net
In the realm of large language models (LLMs), enhancing instruction-following capability
often involves curating expansive training data. This is achieved through two primary …

The vault: A comprehensive multilingual dataset for advancing code understanding and generation

DN Manh, NL Hai, ATV Dau, AM Nguyen… - arXiv preprint arXiv …, 2023 - arxiv.org
We present The Vault, a dataset of high-quality code-text pairs in multiple programming
languages for training large language models to understand and generate code. We present …

Task compass: Scaling multi-task pre-training with task prefix

Z Zhang, S Wang, Y Xu, Y Fang, W Yu, Y Liu… - arXiv preprint arXiv …, 2022 - arxiv.org
Leveraging task-aware annotated data as supervised signals to assist with self-supervised
learning on large-scale unlabeled data has become a new trend in pre-training language …

Challenges and opportunities of using transformer-based multi-task learning in NLP through ML lifecycle: A position paper

L Torbarina, T Ferkovic, L Roguski, V Mihelcic… - Natural Language …, 2024 - Elsevier
The increasing adoption of natural language processing (NLP) models across industries has
led to practitioners' need for machine learning (ML) systems to handle these models …

Explaining legal judgments: A multitask learning framework for enhancing factual consistency in rationale generation

C He, TP Tan, S Xue, Y Tan - Journal of King Saud University-Computer …, 2023 - Elsevier
The explanation of legal judgments is crucial for transparency, fairness, and
trustworthiness, aiming to provide rationales for decision-making. While previous works …

Granular syntax processing with multi-task and curriculum learning

X Zhang, R Mao, E Cambria - Cognitive Computation, 2024 - Springer
Syntactic processing techniques are the foundation of natural language processing (NLP),
supporting many downstream NLP tasks. In this paper, we conduct pair-wise multi-task …

Natural language processing based on named entity with n-gram classifier machine learning process through GE-based hidden Markov model

SD Pande, RK Kanna, I Qureshi - … Learning Applications in …, 2022 - yashikajournals.com
Natural Language Processing (NLP) is the computational linguistics mode for the
identification and classification of text documents. The NLP process comprises the retrieval …