State-space models with layer-wise nonlinearity are universal approximators with exponential decaying memory. S Wang, B Xue. Advances in Neural Information Processing Systems 36, 2023.
A brief survey on the approximation theory for sequence modelling. H Jiang, Q Li, Z Li, S Wang. Journal of Machine Learning (JML) 2 (1), 1-30, 2023.
Efficient hyperdimensional computing. Z Yan, S Wang, K Tang, WF Wong. Joint European Conference on Machine Learning and Knowledge Discovery in …, 2023.
Inverse approximation theory for nonlinear recurrent neural networks. S Wang, Z Li, Q Li. The 12th International Conference on Learning Representations (Spotlight …), 2024.
StableSSM: Alleviating the Curse of Memory in State-space Models through Stable Reparameterization. S Wang, Q Li. Proceedings of the 41st International Conference on Machine Learning, 2024.
HyperSNN: A new efficient and robust deep learning model for resource constrained control applications. Z Yan, S Wang, K Tang, WF Wong. arXiv preprint arXiv:2308.08222, 2023.
The Effects of Nonlinearity on Approximation Capacity of Recurrent Neural Networks. S Wang, Z Li, Q Li. 2022.
LongSSM: On the Length Extension of State-space Models in Language Modelling. S Wang. arXiv preprint arXiv:2406.02080, 2024.
Integrating Deep Learning and Synthetic Biology: A Co-Design Approach for Enhancing Gene Expression via N-terminal Coding Sequences. Z Yan, W Chu, Y Sheng, K Tang, S Wang, Y Liu, WF Wong. arXiv preprint arXiv:2402.13297, 2024.
Improve Long-term Memory Learning Through Rescaling the Error Temporally. S Wang, Z Yan. arXiv preprint arXiv:2307.11462, 2023.