Large language models: a comprehensive survey of its applications, challenges, limitations, and future prospects
Within the vast expanse of computerized language processing, a revolutionary entity known
as Large Language Models (LLMs) has emerged, wielding immense power in its capacity to …
Large AI models in health informatics: Applications, challenges, and the future
Large AI models, or foundation models, are models recently emerging with massive scales
both parameter-wise and data-wise, the magnitudes of which can reach beyond billions …
Large language models generate functional protein sequences across diverse families
Deep-learning language models have shown promise in various biotechnological
applications, including protein design and engineering. Here we describe ProGen, a …
Scaling data-constrained language models
The current trend of scaling language models involves increasing both parameter count and
training dataset size. Extrapolating this trend suggests that training dataset size may soon be …
Evolutionary-scale prediction of atomic-level protein structure with a language model
Recent advances in machine learning have leveraged evolutionary information in multiple
sequence alignments to predict protein structure. We demonstrate direct inference of full …
Illuminating protein space with a programmable generative model
Three billion years of evolution has produced a tremendous diversity of protein molecules,
but the full potential of proteins is likely to be much greater. Accessing this potential has …
Transformers as statisticians: Provable in-context learning with in-context algorithm selection
Neural sequence models based on the transformer architecture have demonstrated
remarkable in-context learning (ICL) abilities, where they can perform new tasks …
ProtGPT2 is a deep unsupervised language model for protein design
Protein design aims to build novel proteins customized for specific purposes, thereby
holding the potential to tackle many environmental and biomedical problems. Recent …
Evaluating large language models in generating synthetic hci research data: a case study
P Hämäläinen, M Tavast, A Kunnari - … of the 2023 CHI Conference on …, 2023 - dl.acm.org
Collecting data is one of the bottlenecks of Human-Computer Interaction (HCI) research.
Motivated by this, we explore the potential of large language models (LLMs) in generating …
Learning inverse folding from millions of predicted structures
We consider the problem of predicting a protein sequence from its backbone atom
coordinates. Machine learning approaches to this problem to date have been limited by the …