LLM Maybe LongLM: Self-Extend LLM Context Window Without Tuning
This work elicits LLMs' inherent ability to handle long contexts without fine-tuning. The
limited length of training sequences may limit the application of Large …
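
The Self-Extend technique named in the title maps out-of-range relative positions back into the pretrained range via floor division, while nearby tokens keep their exact positions. A minimal sketch of that remapping, with illustrative group_size and neighbor_window values:

```python
import torch

def self_extend_rel_positions(seq_len, group_size=8, neighbor_window=512):
    # Relative distance from each query position to each key position.
    pos = torch.arange(seq_len)
    rel = pos[:, None] - pos[None, :]                    # [seq, seq]
    # Distant tokens: floor-divide the distance so it falls back into the
    # pretrained position range, shifted to continue from the window edge.
    grouped = rel // group_size + neighbor_window - neighbor_window // group_size
    # Nearby tokens keep their exact distance; the causal mask drops rel < 0.
    return torch.where(rel < neighbor_window, rel, grouped)
```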
KV Cache Compression, But What Must We Give in Return? A Comprehensive Benchmark of Long Context Capable Approaches
Long context capability is a crucial competency for large language models (LLMs) as it
mitigates the human struggle to digest long-form texts. This capability enables complex task …
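
One family of approaches such a benchmark covers is token eviction, which keeps a few initial "sink" tokens plus a recent window and drops everything in between (StreamingLLM-style). A toy sketch, with all sizes illustrative:

```python
import torch

def evict_kv(keys, values, n_sink=4, window=1024):
    # keys/values: [batch, heads, seq, head_dim]; sizes are illustrative.
    seq = keys.size(2)
    if seq <= n_sink + window:
        return keys, values                      # nothing to evict yet
    keep = torch.cat([
        torch.arange(n_sink),                    # attention-sink tokens
        torch.arange(seq - window, seq),         # most recent tokens
    ])
    return keys[:, :, keep], values[:, :, keep]
```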
CoWPE: Adaptive Context Window Adjustment in LLMs for Complex Input Queries
VM Tamanampudi - … General Science (JAIGS), ISSN: 3006-4023, 2024 - ojs.boulibrary.com
Recent work has shown that large language models (LLMs) can adapt their
processing context windows based on the nuance and complexity of the respective input …
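
The snippet implies the window is sized to the query's complexity. Purely as an illustration (this is not CoWPE's actual algorithm, which the snippet does not specify), such a heuristic might look like:

```python
def pick_context_window(query, sizes=(2048, 8192, 32768)):
    """Illustrative complexity heuristic, not the CoWPE method:
    longer, more clause-heavy queries get a larger window."""
    n_tokens = len(query.split())
    n_clauses = query.count(",") + query.count(";") + query.count("?")
    score = n_tokens + 10 * n_clauses
    if score < 50:
        return sizes[0]
    if score < 200:
        return sizes[1]
    return sizes[2]
```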
When Text Embedding Meets Large Language Model: A Comprehensive Survey
Text embedding has become a foundational technology in natural language processing
(NLP) during the deep learning era, driving advancements across a wide array of …
From Reading to Compressing: Exploring the Multi-document Reader for Prompt Compression
Large language models (LLMs) have achieved significant performance gains using
advanced prompting techniques over various tasks. However, the increasing length of …
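
The paper's multi-document reader is not reproduced here; as a point of reference, a generic extractive compression baseline keeps the sentences most similar to the question under a budget, e.g. via TF-IDF cosine similarity:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def compress_prompt(docs, question, budget_sents=5):
    """Generic extractive baseline (not the paper's reader): rank all
    sentences by TF-IDF similarity to the question and keep the top-k."""
    sents = [s.strip() for d in docs for s in d.split(".") if s.strip()]
    tfidf = TfidfVectorizer().fit(sents + [question])
    scores = cosine_similarity(tfidf.transform([question]),
                               tfidf.transform(sents))[0]
    top = sorted(range(len(sents)), key=lambda i: -scores[i])[:budget_sents]
    return ". ".join(sents[i] for i in sorted(top)) + "."
```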
TACO-RL: Task Aware Prompt Compression Optimization with Reinforcement Learning
The increasing prevalence of large language models (LLMs) such as GPT-4 in various
applications has led to a surge in the size of prompts required for optimal performance …
Enhancing Embedding Performance through Large Language Model-Based Text Enrichment and Rewriting
N Harris, A Butani, S Hashmy - arXiv preprint arXiv:2404.12283, 2024 - arxiv.org
Embedding models are crucial for various natural language processing tasks but can be
limited by factors such as limited vocabulary, lack of context, and grammatical errors. This …
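
The enrichment-and-rewriting pipeline is easy to sketch: rewrite the raw text with an LLM, then embed the rewritten version. A minimal sketch using the OpenAI Python SDK as one possible backend; the model names and prompt are illustrative, not necessarily the authors' exact setup:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def enrich_and_embed(text):
    """Rewrite text with an LLM before embedding (sketch; prompt and
    model names are illustrative)."""
    rewritten = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": "Rewrite this text to fix grammar, expand "
                       f"abbreviations, and add brief context:\n\n{text}",
        }],
    ).choices[0].message.content
    return client.embeddings.create(
        model="text-embedding-3-small",
        input=rewritten,
    ).data[0].embedding
```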
Prompt Compression for Large Language Models: A Survey
Leveraging large language models (LLMs) for complex natural language tasks typically
requires long-form prompts to convey detailed requirements and information, which results …
Lossless KV Cache Compression to 2%
Large language models have revolutionized data processing in numerous domains, with
their ability to handle extended context reasoning receiving notable recognition. To speed …
Parse Trees Guided LLM Prompt Compression
Offering rich contexts to Large Language Models (LLMs) has been shown to boost
performance in various tasks, but the resulting longer prompt would increase the …
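
As a toy illustration of parse-guided pruning (not the paper's actual algorithm), one can drop syntactically peripheral tokens from a dependency parse while keeping the predicate-argument core, e.g. with spaCy:

```python
import spacy

nlp = spacy.load("en_core_web_sm")
DROPPABLE = {"det", "punct", "advmod", "amod", "intj"}  # illustrative label set

def prune_prompt(text):
    """Toy parse-guided compression: remove tokens whose dependency
    label marks them as peripheral to the sentence's core structure."""
    doc = nlp(text)
    kept = [tok.text for tok in doc if tok.dep_ not in DROPPABLE]
    return " ".join(kept)
```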