A review and evaluation of the state-of-the-art in PV solar power forecasting: Techniques and optimization
Integration of photovoltaics into power grids is difficult as solar energy is highly dependent
on climate and geography, often fluctuating erratically. This causes penetrations and voltage …
Object detection in 20 years: A survey
Object detection, as one of the most fundamental and challenging problems in computer
vision, has received great attention in recent years. Over the past two decades, we have …
Scaling up your kernels to 31x31: Revisiting large kernel design in CNNs
We revisit large kernel design in modern convolutional neural networks (CNNs). Inspired by
recent advances in vision transformers (ViTs), in this paper, we demonstrate that using a few …
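Large-kernel designs of this kind typically rely on depthwise convolution, where each channel gets its own kernel, so parameters grow as k×k per channel rather than k×k across all channel pairs. A minimal pure-Python sketch of a depthwise 2D convolution (shapes and zero-padding choice are illustrative assumptions, not the paper's implementation):

```python
def depthwise_conv2d(image, kernels):
    """image: [C][H][W], kernels: [C][k][k] (one kernel per channel).
    Zero padding keeps the output the same H x W as the input."""
    C, H, W = len(image), len(image[0]), len(image[0][0])
    k = len(kernels[0])
    pad = k // 2
    out = [[[0.0] * W for _ in range(H)] for _ in range(C)]
    for c in range(C):
        for i in range(H):
            for j in range(W):
                acc = 0.0
                for di in range(k):
                    for dj in range(k):
                        ii, jj = i + di - pad, j + dj - pad
                        # Out-of-bounds taps contribute zero (zero padding).
                        if 0 <= ii < H and 0 <= jj < W:
                            acc += image[c][ii][jj] * kernels[c][di][dj]
                out[c][i][j] = acc
    return out
```

In a real framework this is a single grouped convolution call (e.g. groups equal to the channel count), which is why very large kernels stay affordable.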
FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting
Long-term time series forecasting is challenging since prediction accuracy tends to
decrease dramatically with the increasing horizon. Although Transformer-based methods …
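The "decomposed" part of such forecasters separates a series into a slow trend and a seasonal residual before modeling each. A minimal sketch of that idea using a centered moving average (window size and edge padding are illustrative assumptions, not the paper's exact scheme):

```python
def decompose(series, window=3):
    """Return (trend, seasonal): trend[i] is the mean of a centered
    window (edges padded by repeating the end values); the seasonal
    part is whatever the trend does not explain."""
    n = len(series)
    pad = window // 2
    padded = [series[0]] * pad + list(series) + [series[-1]] * pad
    trend = [sum(padded[i:i + window]) / window for i in range(n)]
    seasonal = [x - t for x, t in zip(series, trend)]
    return trend, seasonal
```

By construction the two components sum back to the original series, so a model can forecast each part separately and recombine them.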
FNet: Mixing tokens with Fourier transforms
We show that Transformer encoder architectures can be sped up, with limited accuracy
costs, by replacing the self-attention sublayers with simple linear transformations that "mix" …
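The mixing transformation here is a parameter-free discrete Fourier transform. A minimal sketch in that spirit, using a naive O(n²) DFT over the token dimension and keeping the real part (the paper applies the transform along both the sequence and hidden dimensions; this 1D version is a simplification):

```python
import cmath

def dft(xs):
    """Naive discrete Fourier transform of a real sequence."""
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                for t, x in enumerate(xs)) for k in range(n)]

def fourier_mix(tokens):
    """tokens: [seq_len][d_model] of floats. Mix each feature column
    across the sequence with a DFT and keep only the real part."""
    seq_len, d = len(tokens), len(tokens[0])
    cols = [dft([tokens[t][f] for t in range(seq_len)]) for f in range(d)]
    return [[cols[f][t].real for f in range(d)] for t in range(seq_len)]
```

Unlike self-attention, this mixing has no learned parameters at all, which is where the speed-up comes from; in practice an FFT replaces the naive DFT.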
Scaling local self-attention for parameter efficient visual backbones
A Vaswani, P Ramachandran… - Proceedings of the …, 2021 - openaccess.thecvf.com
Self-attention has the promise of improving computer vision systems due to parameter-
independent scaling of receptive fields and content-dependent interactions, in contrast to …
Pruning and quantization for deep neural network acceleration: A survey
T Liang, J Glossner, L Wang, S Shi, X Zhang - Neurocomputing, 2021 - Elsevier
Deep neural networks have been applied in many applications exhibiting extraordinary
abilities in the field of computer vision. However, complex network architectures challenge …
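One of the compression techniques such surveys cover is uniform quantization: mapping floating-point weights to small integers plus a shared scale. A minimal sketch (the bit width and symmetric-range choice are illustrative assumptions; real schemes add per-channel scales, zero points, and calibration):

```python
def quantize(weights, bits=8):
    """Map floats to signed ints in [-2**(bits-1), 2**(bits-1) - 1]
    and return (quantized ints, scale) so that w is approximately
    q * scale."""
    qmax = 2 ** (bits - 1) - 1
    # Symmetric scale from the largest magnitude; guard all-zero input.
    scale = max(abs(w) for w in weights) / qmax or 1.0
    q = [max(-qmax - 1, min(qmax, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [qi * scale for qi in q]
```

The round-trip error per weight is bounded by half a quantization step (scale / 2), which is the basic trade-off between bit width and accuracy that the survey examines.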
Fourier neural operator for parametric partial differential equations
The classical development of neural networks has primarily focused on learning mappings
between finite-dimensional Euclidean spaces. Recently, this has been generalized to neural …
ETSformer: Exponential smoothing transformers for time-series forecasting
Transformers have been actively studied for time-series forecasting in recent years. While
often showing promising results in various scenarios, traditional Transformers are not …
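The classical idea the title refers to is simple exponential smoothing: each state is a convex combination of the newest observation and the previous state, so older observations decay geometrically. A minimal sketch of the recurrence (the fixed alpha is an illustrative assumption; the paper's attention variant learns this weighting):

```python
def exponential_smoothing(series, alpha=0.5):
    """s_0 = x_0;  s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""
    smoothed = [series[0]]
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed
```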
Pooling methods in deep neural networks, a review
H Gholamalinezhad, H Khosravi - arXiv preprint arXiv:2009.07485, 2020 - arxiv.org
Nowadays, Deep Neural Networks are among the main tools used in various sciences.
Convolutional Neural Network is a special type of DNN consisting of several convolution …
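The two most common pooling operators such a review covers are max and average pooling, which downsample a feature map by summarizing non-overlapping windows. A minimal sketch over a 2D map (window size and the divisibility assumption are illustrative):

```python
def pool2d(x, size=2, mode="max"):
    """x: [H][W]; H and W are assumed divisible by `size`.
    Each non-overlapping size x size window collapses to its
    maximum (mode="max") or its mean (any other mode)."""
    H, W = len(x), len(x[0])
    out = []
    for i in range(0, H, size):
        row = []
        for j in range(0, W, size):
            window = [x[i + di][j + dj]
                      for di in range(size) for dj in range(size)]
            row.append(max(window) if mode == "max"
                       else sum(window) / len(window))
        out.append(row)
    return out
```

Max pooling keeps the strongest activation per region (useful for detecting presence of a feature), while average pooling keeps the mean response; the review's other variants mostly interpolate between these two.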