EdgeViTs: Competing light-weight CNNs on mobile devices with vision transformers

J Pan, A Bulat, F Tan, X Zhu, L Dudziak, H Li… - … on Computer Vision, 2022 - Springer
Self-attention based models such as vision transformers (ViTs) have emerged as a very
competitive architectural alternative to convolutional neural networks (CNNs) in computer …
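
As a rough illustration of the self-attention operation behind ViTs (a minimal PyTorch sketch under assumed token and embedding sizes, not the EdgeViTs architecture):

import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    # Single-head self-attention: every token attends to every other token.
    def __init__(self, dim):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)   # joint query/key/value projection
        self.proj = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x):                    # x: (batch, tokens, dim)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = (q @ k.transpose(-2, -1)) * self.scale
        out = attn.softmax(dim=-1) @ v       # attention-weighted sum of values
        return self.proj(out)

tokens = torch.randn(1, 196, 64)             # e.g. 14x14 image patches, 64-dim embeddings
print(SelfAttention(64)(tokens).shape)       # torch.Size([1, 196, 64])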

Revisiting random channel pruning for neural network compression

Y Li, K Adamczewski, W Li, S Gu… - Proceedings of the …, 2022 - openaccess.thecvf.com
Channel (or 3D filter) pruning serves as an effective way to accelerate the inference of
neural networks. There has been a flurry of algorithms that try to solve this practical problem …
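
As a rough illustration of channel pruning (a minimal PyTorch sketch; the layer sizes, the random selection criterion, and the 50% ratio are assumptions for illustration, not the paper's settings):

import torch
import torch.nn as nn

conv1 = nn.Conv2d(3, 32, 3, padding=1)
conv2 = nn.Conv2d(32, 64, 3, padding=1)

keep = torch.randperm(32)[:16].sort().values            # keep a random half of the filters

pruned1 = nn.Conv2d(3, 16, 3, padding=1)
pruned1.weight.data = conv1.weight.data[keep].clone()   # copy only the surviving filters
pruned1.bias.data = conv1.bias.data[keep].clone()

pruned2 = nn.Conv2d(16, 64, 3, padding=1)
pruned2.weight.data = conv2.weight.data[:, keep].clone()  # drop the matching input channels
pruned2.bias.data = conv2.bias.data.clone()

x = torch.randn(1, 3, 32, 32)
print(pruned2(pruned1(x)).shape)                        # torch.Size([1, 64, 32, 32])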

BigNAS: Scaling up neural architecture search with big single-stage models

J Yu, P Jin, H Liu, G Bender, PJ Kindermans… - Computer Vision–ECCV …, 2020 - Springer
Neural architecture search (NAS) has shown promising results discovering models that are
both accurate and fast. For NAS, training a one-shot model has become a popular strategy …

Channel pruning via lookahead search guided reinforcement learning

Z Wang, C Li - Proceedings of the IEEE/CVF winter …, 2022 - openaccess.thecvf.com
Channel pruning has become an effective yet still challenging approach to obtaining compact
neural networks. It aims to prune the optimal set of filters whose removal results in minimal …

DANCE: Differentiable accelerator/network co-exploration

K Choi, D Hong, H Yoon, J Yu, Y Kim… - 2021 58th ACM/IEEE …, 2021 - ieeexplore.ieee.org
This work presents DANCE, a differentiable approach towards the co-exploration of
hardware accelerator and network architecture design. At the heart of DANCE is a …
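
As a rough illustration of making a hardware cost differentiable so it can be optimized jointly with the network (a minimal PyTorch sketch; the MLP latency surrogate, its dimensions, and the penalty weight are assumptions, not DANCE's actual formulation):

import torch
import torch.nn as nn

# Hypothetical differentiable latency surrogate over relaxed architecture parameters.
latency_model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))

arch_params = torch.rand(8, requires_grad=True)   # e.g. softmax-relaxed op-selection weights
task_loss = torch.tensor(1.3)                     # stand-in for the network's training loss
target_latency = 5.0                              # hypothetical budget in ms

predicted = latency_model(arch_params).squeeze()
loss = task_loss + 0.1 * (predicted - target_latency) ** 2   # differentiable hardware penalty
loss.backward()
print(arch_params.grad)                           # gradients reach the architecture parameters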

Real-time semantic segmentation: A brief survey and comparative study in remote sensing

C Broni-Bediako, J Xia, N Yokoya - IEEE Geoscience and …, 2023 - ieeexplore.ieee.org
Real-time semantic segmentation of remote sensing imagery is a challenging task that
requires a tradeoff between effectiveness and efficiency. It has many applications, including …

HELP: Hardware-adaptive efficient latency prediction for NAS via meta-learning

H Lee, S Lee, S Chong… - Advances in Neural …, 2021 - proceedings.neurips.cc
For deployment, neural architecture search should be hardware-aware, in order to satisfy
the device-specific constraints (e.g., memory usage, latency, and energy consumption) and …
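
As a rough illustration of filtering candidate architectures by device-specific constraints during search (a plain-Python sketch; all names and numbers are hypothetical):

candidates = [
    {"name": "arch_a", "accuracy": 0.76, "latency_ms": 12.0, "memory_mb": 45},
    {"name": "arch_b", "accuracy": 0.78, "latency_ms": 31.0, "memory_mb": 80},
    {"name": "arch_c", "accuracy": 0.74, "latency_ms": 8.0,  "memory_mb": 30},
]

def feasible(arch, max_latency_ms=15.0, max_memory_mb=50):
    # Keep only architectures that fit the target device's latency and memory budget.
    return arch["latency_ms"] <= max_latency_ms and arch["memory_mb"] <= max_memory_mb

best = max((a for a in candidates if feasible(a)), key=lambda a: a["accuracy"])
print(best["name"])   # arch_a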

Searching by generating: Flexible and efficient one-shot NAS with architecture generator

SY Huang, WT Chu - … of the IEEE/CVF Conference on …, 2021 - openaccess.thecvf.com
In one-shot NAS, sub-networks need to be searched from the supernet to meet different
hardware constraints. However, the search cost is high, and N separate searches are needed …
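
As a rough illustration of drawing a sub-network from a one-shot supernet by slicing shared weights to a chosen width (a minimal PyTorch sketch under assumed layer sizes; not the paper's architecture generator):

import random
import torch
import torch.nn as nn

super_fc1 = nn.Linear(32, 128)             # supernet layer at its maximum width
super_fc2 = nn.Linear(128, 10)

def run_subnet(x, width):
    # Reuse the supernet's shared weights, but only the first `width` hidden units.
    h = torch.relu(x @ super_fc1.weight[:width].t() + super_fc1.bias[:width])
    return h @ super_fc2.weight[:, :width].t() + super_fc2.bias

x = torch.randn(4, 32)
width = random.choice([32, 64, 96, 128])   # a hardware constraint would pick this width
print(run_subnet(x, width).shape)          # torch.Size([4, 10])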

The Myth of the Pyramid

R Izquierdo-Cordova… - Proceedings of the …, 2024 - openaccess.thecvf.com
A deep-rooted strategy for building convolutional neural networks in computer vision is to
increase the number of filters every time the feature map resolution is decreased. The notion …
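
As a rough illustration of the pyramid convention the paper examines, i.e. doubling the filter count each time the spatial resolution is halved (a minimal PyTorch sketch; the channel counts are illustrative):

import torch
import torch.nn as nn

stages, in_channels = [], 3
for out_channels in (32, 64, 128, 256):     # filters double at every downsampling stage
    stages.append(nn.Conv2d(in_channels, out_channels, 3, stride=2, padding=1))
    stages.append(nn.ReLU())
    in_channels = out_channels

backbone = nn.Sequential(*stages)
print(backbone(torch.randn(1, 3, 224, 224)).shape)   # torch.Size([1, 256, 14, 14])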
