Lipschitz-margin training: Scalable certification of perturbation invariance for deep neural networks. Y Tsuzuku, I Sato, M Sugiyama. Advances in Neural Information Processing Systems 31, 2018. Cited by 325.
Does distributionally robust supervised learning give robust classifiers? W Hu, G Niu, I Sato, M Sugiyama. International Conference on Machine Learning, 2029-2037, 2018. Cited by 310.
Reducing wrong labels in distant supervision for relation extraction. S Takamatsu, I Sato, H Nakagawa. Proceedings of the 50th Annual Meeting of the Association for Computational …, 2012. Cited by 259.
Ghost cytometry. S Ota, R Horisaki, Y Kawamura, M Ugawa, I Sato, K Hashimoto, ... Science 360 (6394), 1246-1251, 2018. Cited by 231.
Bayesian differential privacy on correlated data. B Yang, I Sato, H Nakagawa. Proceedings of the 2015 ACM SIGMOD International Conference on Management of …, 2015. Cited by 216.
Deep neural network-based computer-assisted detection of cerebral aneurysms in MR angiography. T Nakao, S Hanaoka, Y Nomura, I Sato, M Nemoto, S Miki, E Maeda, ... Journal of Magnetic Resonance Imaging 47 (4), 948-953, 2018. Cited by 189.
A diffusion theory for deep learning dynamics: Stochastic gradient descent exponentially favors flat minima. Z Xie, I Sato, M Sugiyama. arXiv preprint arXiv:2002.03495, 2020. Cited by 138.
Generative adversarial nets from a density ratio estimation perspective. M Uehara, I Sato, M Suzuki, K Nakayama, Y Matsuo. arXiv preprint arXiv:1610.02920, 2016. Cited by 105.
Topic models with power-law using Pitman-Yor process. I Sato, H Nakagawa. Proceedings of the 16th ACM SIGKDD International Conference on Knowledge …, 2010. Cited by 103.
Approximation analysis of stochastic gradient Langevin dynamics by using Fokker-Planck equation and Ito process. I Sato, H Nakagawa. International Conference on Machine Learning, 982-990, 2014. Cited by 96.
Few-shot domain adaptation by causal mechanism transfer. T Teshima, I Sato, M Sugiyama. International Conference on Machine Learning, 9458-9469, 2020. Cited by 94.
Sequential line search for efficient visual design optimization by crowds. Y Koyama, I Sato, D Sakamoto, T Igarashi. ACM Transactions on Graphics (TOG) 36 (4), 1-11, 2017. Cited by 94.
Person name disambiguation by bootstrapping. M Yoshida, M Ikeda, S Ono, I Sato, H Nakagawa. Proceedings of the 33rd International ACM SIGIR Conference on Research and …, 2010. Cited by 87.
Sequential gallery for interactive visual design optimization. Y Koyama, I Sato, M Goto. ACM Transactions on Graphics (TOG) 39 (4), 88:1-88:12, 2020. Cited by 77.
Variational inference based on robust divergences. F Futami, I Sato, M Sugiyama. International Conference on Artificial Intelligence and Statistics, 813-822, 2018. Cited by 73.
Differential privacy without sensitivity. K Minami, HI Arai, I Sato, H Nakagawa. Advances in Neural Information Processing Systems 29, 2016. Cited by 70.
Normalized flat minima: Exploring scale invariant definition of flat minima for neural networks using PAC-Bayesian analysis. Y Tsuzuku, I Sato, M Sugiyama. International Conference on Machine Learning, 9636-9647, 2020. Cited by 67.
Unsupervised domain adaptation based on source-guided discrepancy. S Kuroki, N Charoenphakdee, H Bao, J Honda, I Sato, M Sugiyama. Proceedings of the AAAI Conference on Artificial Intelligence 33 (01), 4122-4129, 2019. Cited by 64.
On the structural sensitivity of deep convolutional networks to the directions of Fourier basis functions. Y Tsuzuku, I Sato. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2019. Cited by 63.
Artificial neural variability for deep learning: On overfitting, noise memorization, and catastrophic forgetting. Z Xie, F He, S Fu, I Sato, D Tao, M Sugiyama. Neural Computation 33 (8), 2163-2192, 2021. Cited by 57.