ChatNT: A Multimodal Conversational Agent for DNA, RNA and Protein Tasks

G Richard, BP de Almeida, H Dalla-Torre, C Blum… - bioRxiv, 2024 - biorxiv.org
Language models are thriving, powering conversational agents that assist and empower
humans to solve a number of tasks. Recently, these models were extended to support …

SysCaps: Language Interfaces for Simulation Surrogates of Complex Systems

P Emami, Z Li, S Sinha, T Nguyen - arXiv preprint arXiv:2405.19653, 2024 - arxiv.org
Data-driven simulation surrogates help computational scientists study complex systems.
They can also help inform impactful policy decisions. We introduce a learning framework for …

Sample-Efficient Bayesian Optimization with Transfer Learning for Heterogeneous Search Spaces

A Deshwal, S Cakmak, Y Xia, D Eriksson - arXiv preprint arXiv:2409.05325, 2024 - arxiv.org
Bayesian optimization (BO) is a powerful approach to sample-efficient optimization of black-
box functions. However, in settings with very few function evaluations, a successful …

What explains the success of cross-modal fine-tuning with ORCA?

P García-de-Herreros, V Gautam, P Slusallek… - arXiv preprint arXiv …, 2024 - arxiv.org
ORCA (Shen et al., 2023) is a recent technique for cross-modal fine-tuning, i.e., applying pre-trained
transformer models to modalities beyond their training data. The technique consists …