Physics for neuromorphic computing
Neuromorphic computing takes inspiration from the brain to create energy-efficient hardware
for information processing, capable of highly sophisticated tasks. Systems built with standard …
Designing neural networks through neuroevolution
Much of recent machine learning has focused on deep learning, in which neural network
weights are trained through variants of stochastic gradient descent. An alternative approach …
Explainable Artificial Intelligence (XAI): What we know and what is left to attain Trustworthy Artificial Intelligence
Artificial intelligence (AI) is currently being utilized in a wide range of sophisticated
applications, but the outcomes of many AI models are challenging to comprehend and trust …
Multi-concept customization of text-to-image diffusion
While generative models produce high-quality images of concepts learned from a large-
scale database, a user often wishes to synthesize instantiations of their own concepts (for …
Three types of incremental learning
Incrementally learning new information from a non-stationary stream of data, referred to as
'continual learning', is a key feature of natural intelligence, but a challenging problem for …
Revisiting class-incremental learning with pre-trained models: Generalizability and adaptivity are all you need
Class-incremental learning (CIL) aims to adapt to emerging new classes without forgetting
old ones. Traditional CIL models are trained from scratch to continually acquire knowledge …
On the opportunities and risks of foundation models
AI is undergoing a paradigm shift with the rise of models (e.g., BERT, DALL-E, GPT-3) that are
trained on broad data at scale and are adaptable to a wide range of downstream tasks. We …
FOSTER: Feature boosting and compression for class-incremental learning
The ability to learn new concepts continually is necessary in this ever-changing world.
However, deep neural networks suffer from catastrophic forgetting when learning new …
Foundational challenges in assuring alignment and safety of large language models
This work identifies 18 foundational challenges in assuring the alignment and safety of large
language models (LLMs). These challenges are organized into three different categories …
DyTox: Transformers for continual learning with dynamic token expansion
Deep network architectures struggle to continually learn new tasks without forgetting the
previous tasks. A recent trend indicates that dynamic architectures based on an expansion …