Design space exploration of neural network activation function circuits. T Yang, Y Wei, Z Tu, H Zeng, MA Kinsy, N Zheng, P Ren. IEEE Transactions on Computer-Aided Design of Integrated Circuits and …, 2018. Cited by 60.
AdaBin: Improving binary neural networks with adaptive binary sets. Z Tu, X Chen, P Ren, Y Wang. European Conference on Computer Vision, 379-395, 2022. Cited by 52.
GenImage: A million-scale benchmark for detecting AI-generated image. M Zhu, H Chen, Q Yan, X Huang, G Lin, W Li, Z Tu, H Hu, J Hu, Y Wang. Advances in Neural Information Processing Systems 36, 2024. Cited by 47.
NTIRE 2023 challenge on image denoising: Methods and results. Y Li, Y Zhang, R Timofte, L Van Gool, Z Tu, K Du, H Wang, H Chen, W Li, ... Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2023. Cited by 36.
A survey on transformer compression. Y Tang, Y Wang, J Guo, Z Tu, K Han, H Hu, D Tao. arXiv preprint arXiv:2402.05964, 2024. Cited by 12.
Toward accurate post-training quantization for image super resolution. Z Tu, J Hu, H Chen, Y Wang. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2023. Cited by 8.
CBQ: Cross-block quantization for large language models. X Ding, X Liu, Y Zhang, Z Tu, W Li, J Hu, H Chen, Y Tang, Z Xiong, B Yin, ... arXiv preprint arXiv:2312.07950, 2023. Cited by 4.
Data upcycling knowledge distillation for image super-resolution. Y Zhang, W Li, S Li, J Hu, H Chen, H Wang, Z Tu, W Wang, B Jing, ... arXiv preprint arXiv:2309.14162, 2023. Cited by 2.
CAQ: Context-aware quantization via reinforcement learning. Z Tu, J Ma, T Xia, W Zhao, P Ren, N Zheng. 2021 International Joint Conference on Neural Networks (IJCNN), 1-8, 2021. Cited by 1.
U-DiTs: Downsample Tokens in U-Shaped Diffusion Transformers. Y Tian, Z Tu, H Chen, J Hu, C Xu, Y Wang. arXiv preprint arXiv:2405.02730, 2024.
LIPT: Latency-aware Image Processing Transformer. J Qiao, W Li, H Xie, H Chen, Y Zhou, Z Tu, J Hu, S Lin. arXiv preprint arXiv:2404.06075, 2024.
IPT-V2: Efficient Image Processing Transformer using Hierarchical Attentions. Z Tu, K Du, H Chen, H Wang, W Li, J Hu, Y Wang. arXiv preprint arXiv:2404.00633, 2024.