A review and evaluation of the state-of-the-art in PV solar power forecasting: Techniques and optimization

R Ahmed, V Sreeram, Y Mishra, MD Arif - Renewable and Sustainable …, 2020 - Elsevier
Integration of photovoltaics into power grids is difficult as solar energy is highly dependent
on climate and geography, often fluctuating erratically. This causes penetration and voltage …

Object detection in 20 years: A survey

Z Zou, K Chen, Z Shi, Y Guo, J Ye - Proceedings of the IEEE, 2023 - ieeexplore.ieee.org
Object detection, as one of the most fundamental and challenging problems in computer
vision, has received great attention in recent years. Over the past two decades, we have …

Scaling up your kernels to 31x31: Revisiting large kernel design in cnns

X Ding, X Zhang, J Han, G Ding - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
We revisit large kernel design in modern convolutional neural networks (CNNs). Inspired by
recent advances in vision transformers (ViTs), in this paper, we demonstrate that using a few …

Fedformer: Frequency enhanced decomposed transformer for long-term series forecasting

T Zhou, Z Ma, Q Wen, X Wang… - … on machine learning, 2022 - proceedings.mlr.press
Long-term time series forecasting is challenging since prediction accuracy tends to
decrease dramatically with the increasing horizon. Although Transformer-based methods …

Fnet: Mixing tokens with fourier transforms

J Lee-Thorp, J Ainslie, I Eckstein, S Ontanon - arXiv preprint arXiv …, 2021 - arxiv.org
We show that Transformer encoder architectures can be sped up, with limited accuracy
costs, by replacing the self-attention sublayers with simple linear transformations that "mix" …
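The token-mixing idea in the snippet above can be illustrated with a minimal sketch, assuming the FNet recipe of applying a 2D discrete Fourier transform over the sequence and hidden dimensions and keeping only the real part (the function name and shapes here are illustrative, not from the paper):

```python
import numpy as np

def fourier_mix(x):
    """Attention-free token mixing: 2D DFT over (sequence, hidden),
    keeping the real part so the output stays real-valued.

    x: (seq_len, hidden_dim) array of token embeddings.
    """
    return np.fft.fft2(x).real

# The layer is parameter-free and preserves the input shape.
x = np.random.randn(8, 16)
y = fourier_mix(x)
assert y.shape == x.shape
```

Because the transform is fixed and linear, this sublayer has no learned parameters, which is the source of the claimed speed-up over self-attention.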

Scaling local self-attention for parameter efficient visual backbones

A Vaswani, P Ramachandran… - Proceedings of the …, 2021 - openaccess.thecvf.com
Self-attention has the promise of improving computer vision systems due to parameter-
independent scaling of receptive fields and content-dependent interactions, in contrast to …

Pruning and quantization for deep neural network acceleration: A survey

T Liang, J Glossner, L Wang, S Shi, X Zhang - Neurocomputing, 2021 - Elsevier
Deep neural networks have been applied in many applications, exhibiting extraordinary
abilities in the field of computer vision. However, complex network architectures challenge …

Fourier neural operator for parametric partial differential equations

Z Li, N Kovachki, K Azizzadenesheli, B Liu… - arXiv preprint arXiv …, 2020 - arxiv.org
The classical development of neural networks has primarily focused on learning mappings
between finite-dimensional Euclidean spaces. Recently, this has been generalized to neural …
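The neural operator named in the title above is commonly built from spectral convolution layers: transform the input to Fourier space, multiply a truncated set of low-frequency modes by learned complex weights, and transform back. A minimal sketch of one such layer follows (an assumption based on the paper's title rather than the truncated abstract; `spectral_conv1d` and its signature are hypothetical):

```python
import numpy as np

def spectral_conv1d(x, weights, modes):
    """Filter a real 1-D signal in Fourier space, keeping only low modes.

    x: (n,) real signal sampled on a uniform grid
    weights: (modes,) complex multipliers (learned in a real FNO)
    modes: number of lowest frequencies to retain
    """
    x_hat = np.fft.rfft(x)                      # to Fourier space
    out_hat = np.zeros_like(x_hat)
    out_hat[:modes] = x_hat[:modes] * weights   # weight the low modes
    return np.fft.irfft(out_hat, n=len(x))      # back to physical space

grid = np.linspace(0, 2 * np.pi, 8, endpoint=False)
x = np.sin(grid)
# With all-ones weights over every retained mode (8 // 2 + 1 = 5),
# the layer reduces to the identity, which makes it easy to check.
y = spectral_conv1d(x, np.ones(5, dtype=complex), modes=5)
```

Because the layer acts on Fourier coefficients rather than grid points, it is resolution-independent: the same weights apply to signals sampled at any grid size.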

Etsformer: Exponential smoothing transformers for time-series forecasting

G Woo, C Liu, D Sahoo, A Kumar, S Hoi - arXiv preprint arXiv:2202.01381, 2022 - arxiv.org
Transformers have been actively studied for time-series forecasting in recent years. While
often showing promising results in various scenarios, traditional Transformers are not …

Pooling methods in deep neural networks, a review

H Gholamalinezhad, H Khosravi - arXiv preprint arXiv:2009.07485, 2020 - arxiv.org
Nowadays, Deep Neural Networks are among the main tools used in various sciences.
Convolutional Neural Network is a special type of DNN consisting of several convolution …
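The core operator the survey above reviews, non-overlapping max pooling, can be sketched in a few lines (a minimal NumPy illustration; the helper name and stride-equals-window assumption are ours, not the survey's):

```python
import numpy as np

def max_pool2d(x, k=2):
    """Non-overlapping k x k max pooling (stride assumed equal to k).

    x: (H, W) feature map; trailing rows/cols that do not fill a full
    window are dropped.
    """
    H, W = x.shape
    x = x[:H - H % k, :W - W % k]
    # Group the map into (H//k, k, W//k, k) blocks, take each block's max.
    return x.reshape(H // k, k, W // k, k).max(axis=(1, 3))

x = np.arange(16.0).reshape(4, 4)
max_pool2d(x)  # → [[ 5.,  7.], [13., 15.]]
```

Average pooling is the same reshape with `.mean(axis=(1, 3))` in place of the max, which is one reason the two are usually discussed together.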