A survey on deep learning for software engineering

Y Yang, X Xia, D Lo, J Grundy - ACM Computing Surveys (CSUR), 2022 - dl.acm.org
In 2006, Geoffrey Hinton proposed the concept of training “Deep Neural Networks (DNNs)”
and an improved model training method to break the bottleneck of neural network …

Towards natural language interfaces for data visualization: A survey

L Shen, E Shen, Y Luo, X Yang, X Hu… - IEEE transactions on …, 2022 - ieeexplore.ieee.org
Utilizing Visualization-oriented Natural Language Interfaces (V-NLI) as a complementary
input modality to direct manipulation for visual analytics can provide an engaging user …

Can LLM already serve as a database interface? A big bench for large-scale database grounded text-to-SQLs

J Li, B Hui, G Qu, J Yang, B Li, B Li… - Advances in …, 2024 - proceedings.neurips.cc
Text-to-SQL parsing, which aims at converting natural language instructions into executable
SQLs, has gained increasing attention in recent years. In particular, GPT-4 and Claude-2 …

DIN-SQL: Decomposed in-context learning of text-to-SQL with self-correction

M Pourreza, D Rafiei - Advances in Neural Information …, 2024 - proceedings.neurips.cc
There is currently a significant gap between the performance of fine-tuned models and
prompting approaches using Large Language Models (LLMs) on the challenging task of text …

RESDSQL: Decoupling schema linking and skeleton parsing for text-to-SQL

H Li, J Zhang, C Li, H Chen - Proceedings of the AAAI Conference on …, 2023 - ojs.aaai.org
One of the recent best attempts at Text-to-SQL is the pre-trained language model. Due to the
structural property of the SQL queries, the seq2seq model takes the responsibility of parsing …

TaBERT: Pretraining for joint understanding of textual and tabular data

P Yin, G Neubig, W Yih, S Riedel - arXiv preprint arXiv:2005.08314, 2020 - arxiv.org
Recent years have witnessed the burgeoning of pretrained language models (LMs) for text-
based natural language (NL) understanding tasks. Such models are typically trained on free …

Text-to-SQL empowered by large language models: A benchmark evaluation

D Gao, H Wang, Y Li, X Sun, Y Qian, B Ding… - arXiv preprint arXiv …, 2023 - arxiv.org
Large language models (LLMs) have emerged as a new paradigm for the Text-to-SQL task.
However, the absence of a systematic benchmark inhibits the development of designing …

RAT-SQL: Relation-aware schema encoding and linking for text-to-SQL parsers

B Wang, R Shin, X Liu, O Polozov… - arXiv preprint arXiv …, 2019 - arxiv.org
When translating natural language questions into SQL queries to answer questions from a
database, contemporary semantic parsing models struggle to generalize to unseen …

Graphix-T5: Mixing pre-trained transformers with graph-aware layers for text-to-SQL parsing

J Li, B Hui, R Cheng, B Qin, C Ma, N Huo… - Proceedings of the …, 2023 - ojs.aaai.org
The task of text-to-SQL parsing, which aims at converting natural language questions into
executable SQL queries, has garnered increasing attention in recent years. One of the major …

Bridging textual and tabular data for cross-domain text-to-SQL semantic parsing

XV Lin, R Socher, C Xiong - arXiv preprint arXiv:2012.12627, 2020 - arxiv.org
We present BRIDGE, a powerful sequential architecture for modeling dependencies
between natural language questions and relational databases in cross-DB semantic …