Self-instruct: Aligning language models with self-generated instructions
Large "instruction-tuned" language models (i.e., finetuned to respond to instructions) have
demonstrated a remarkable ability to generalize zero-shot to new tasks. Nevertheless, they …
Large language models can self-improve
Large Language Models (LLMs) have achieved excellent performance on various tasks.
However, fine-tuning an LLM requires extensive supervision. Humans, on the other hand …
Data-driven causal effect estimation based on graphical causal modelling: A survey
In many fields of scientific research and real-world applications, unbiased estimation of
causal effects from non-experimental data is crucial for understanding the mechanism …
Weak localization of radiographic manifestations in pulmonary tuberculosis from chest x-ray: A systematic review
Pulmonary tuberculosis (PTB) is a bacterial infection that affects the lungs. PTB remains one
of the infectious diseases with the highest global mortalities. Chest radiography is a …
Balancing logit variation for long-tailed semantic segmentation
Semantic segmentation usually suffers from a long tail data distribution. Due to the
imbalanced number of samples across categories, the features of those tail classes may get …
Learning from future: A novel self-training framework for semantic segmentation
Self-training has shown great potential in semi-supervised learning. Its core idea is to use
the model learned on labeled data to generate pseudo-labels for unlabeled samples, and in …
A survey on semi-supervised graph clustering
Semi-Supervised Graph Clustering (SSGC) has emerged as a pivotal field at the
intersection of graph clustering and semi-supervised learning (SSL), offering innovative …
Temporal-domain adaptation for satellite image time-series land-cover mapping with adversarial learning and spatially aware self-training
Nowadays, satellite image time series (SITS) are commonly employed to derive land-cover
maps (LCM) to support decision makers in a variety of land management applications. In the …
Linguistic steganalysis in few-shot scenario
Due to the widespread use of text in cyberspace, linguistic steganography, which hides
secret information in normal texts, has developed quickly in recent years. While linguistic …
Self-training with direct preference optimization improves chain-of-thought reasoning
Effective training of language models (LMs) for mathematical reasoning tasks demands high-
quality supervised fine-tuning data. Besides obtaining annotations from human experts, a …