Large Language Models Can Be Contextual Privacy Protection Learners

Y Xiao, Y Jin, Y Bai, Y Wu, X Yang, X Luo… - Proceedings of the …, 2024 - aclanthology.org
Abstract The proliferation of Large Language Models (LLMs) has driven considerable
interest in fine-tuning them with domain-specific data to create specialized language …

Multi-modal and multi-agent systems meet rationality: A survey

B Jiang, Y Xie, X Wang, WJ Su, CJ Taylor… - ICML 2024 Workshop …, 2024 - openreview.net
Rationality is characterized by logical thinking and decision-making that align with evidence
and logical rules. This quality is essential for effective problem-solving, as it ensures that …

LlamaF: An Efficient Llama2 Architecture Accelerator on Embedded FPGAs

H Xu, Y Li, S Ji - 2024 IEEE 10th World Forum on Internet of …, 2024 - ieeexplore.ieee.org
Large language models (LLMs) have demonstrated remarkable abilities in natural language
processing. However, their deployment on resource-constrained embedded devices …

Towards Seamless User Query to REST API Conversion

H Xu - Proceedings of the 33rd ACM International Conference …, 2024 - dl.acm.org
Integrating Large Language Models (LLMs) with external tools and APIs is essential for
fields such as information retrieval and knowledge management. While LLMs have made …

Enhancing Decision-Making in Offline Reinforcement Learning: Adaptive, Multi-Agent, and Online Perspectives

Y Zhang - 2024 - ses.library.usyd.edu.au
Inspired by the successful application of large models in natural language processing and
computer vision, both the research community and industry have increasingly focused on …

From Transformers to the Future: An In-Depth Exploration of Modern Language Model Architectures

H Xu, Z Bi, H Tseng, X Song, P Feng - osf.io
The Transformer is a neural network architecture that was introduced in the paper Attention
is All You Need by Vaswani et al. in 2017 [294]. It fundamentally changed the way natural …