A comprehensive survey of AI-generated content (AIGC): A history of generative AI from GAN to ChatGPT
Recently, ChatGPT, along with DALL-E-2 and Codex, has been gaining significant attention
from society. As a result, many individuals have become interested in related resources and …
Deep unsupervised domain adaptation: A review of recent advances and perspectives
Deep learning has become the method of choice to tackle real-world problems in different
domains, partly because of its ability to learn from data and achieve impressive performance …
LLaVA-Med: Training a large language-and-vision assistant for biomedicine in one day
Conversational generative AI has demonstrated remarkable promise for empowering
biomedical practitioners, but current investigations focus on unimodal text. Multimodal …
The rise and potential of large language model based agents: A survey
For a long time, humanity has pursued artificial intelligence (AI) equivalent to or surpassing
the human level, with AI agents considered a promising vehicle for this pursuit. AI agents are …
Should ChatGPT be biased? Challenges and risks of bias in large language models
E. Ferrara, arXiv preprint arXiv:2304.03738, 2023
As the capabilities of generative language models continue to advance, the implications of
biases ingrained within these models have garnered increasing attention from researchers …
scGPT: toward building a foundation model for single-cell multi-omics using generative AI
Generative pretrained models have achieved remarkable success in various domains such
as language and computer vision. Specifically, the combination of large-scale diverse …
Using DeepSpeed and Megatron to train Megatron-Turing NLG 530B, a large-scale generative language model
Pretrained general-purpose language models can achieve state-of-the-art accuracies in
various natural language processing domains by adapting to downstream tasks via zero …
Fine-tuning language models with just forward passes
Fine-tuning language models (LMs) has yielded success on diverse downstream tasks, but
as LMs grow in size, backpropagation requires a prohibitively large amount of memory …
On the opportunities and risks of foundation models
AI is undergoing a paradigm shift with the rise of models (e.g., BERT, DALL-E, GPT-3) that are
trained on broad data at scale and are adaptable to a wide range of downstream tasks. We …
Auditing large language models: a three-layered approach
Large language models (LLMs) represent a major advance in artificial intelligence (AI)
research. However, the widespread use of LLMs is also coupled with significant ethical and …