A review of deep learning techniques for speech processing
The field of speech processing has undergone a transformative shift with the advent of deep
learning. The use of multiple processing layers has enabled the creation of models capable …
Recent advances in end-to-end automatic speech recognition
J Li - APSIPA Transactions on Signal and Information …, 2022 - nowpublishers.com
Recently, the speech community has seen a significant trend of moving from deep neural
network based hybrid modeling to end-to-end (E2E) modeling for automatic speech …
Are transformers effective for time series forecasting?
Recently, there has been a surge of Transformer-based solutions for the long-term time
series forecasting (LTSF) task. Despite the growing performance over the past few years, we …
Crossformer: Transformer utilizing cross-dimension dependency for multivariate time series forecasting
Recently many deep models have been proposed for multivariate time series (MTS)
forecasting. In particular, Transformer-based models have shown great potential because …
Images speak in images: A generalist painter for in-context visual learning
In-context learning, as a new paradigm in NLP, allows the model to rapidly adapt to various
tasks with only a handful of prompts and examples. But in computer vision, the difficulties for …
Transformers as statisticians: Provable in-context learning with in-context algorithm selection
Neural sequence models based on the transformer architecture have demonstrated
remarkable in-context learning (ICL) abilities, where they can perform new tasks …
Transformers in time series: A survey
Transformers have achieved superior performance in many tasks in natural language
processing and computer vision, which also triggered great interest in the time series …
Freematch: Self-adaptive thresholding for semi-supervised learning
Pseudo labeling and consistency regularization approaches with confidence-based
thresholding have made great progress in semi-supervised learning (SSL). In this paper, we …
Transformers in medical image analysis
Transformers have dominated the field of natural language processing and have recently
made an impact in the area of computer vision. In the field of medical image analysis …
GPT (Generative Pre-trained Transformer): a comprehensive review on enabling technologies, potential applications, emerging challenges, and future directions
The Generative Pre-trained Transformer (GPT) represents a notable breakthrough in the
domain of natural language processing, which is propelling us toward the development of …