A survey on text-to-SQL parsing: Concepts, methods, and future directions
Text-to-SQL parsing is an essential and challenging task. The goal of text-to-SQL parsing is
to convert a natural language (NL) question to its corresponding structured query language …
Table pre-training: A survey on model architectures, pre-training objectives, and downstream tasks
Since a vast number of tables can be easily collected from web pages, spreadsheets, PDFs,
and various other document types, a flurry of table pre-training frameworks have been …
Harnessing the power of LLMs in practice: A survey on ChatGPT and beyond
This article presents a comprehensive and practical guide for practitioners and end-users
working with Large Language Models (LLMs) in their downstream Natural Language …
TransTab: Learning transferable tabular transformers across tables
Tabular data (or tables) are the most widely used data format in machine learning (ML).
However, ML models often assume that the table structure remains fixed in training and testing …
Shortcut learning of large language models in natural language understanding
Communications of the ACM, Vol. 67, No. 1, January 2024.
MultiHiertt: Numerical reasoning over multi hierarchical tabular and textual data
Numerical reasoning over hybrid data containing both textual and tabular content (e.g.,
financial reports) has recently attracted much attention in the NLP community. However …
Large language models are versatile decomposers: Decomposing evidence and questions for table-based reasoning
Table-based reasoning has shown remarkable progress in a wide range of table-based
tasks. It is a challenging task that requires reasoning over both free-form natural language …
Transformers for tabular data representation: A survey of models and applications
In the last few years, the natural language processing community has witnessed advances
in neural representations of free texts with transformer-based language models (LMs). Given …
HYTREL: Hypergraph-enhanced tabular data representation learning
Language models pretrained on large collections of tabular data have
demonstrated their effectiveness in several downstream tasks. However, many of these …
SEQZERO: Few-shot compositional semantic parsing with sequential prompts and zero-shot models
Recent research has shown promising results on combining pretrained language models (LMs)
with canonical utterances for few-shot semantic parsing. The canonical utterance is often …