Neural network with unbounded activation functions is universal approximator. S Sonoda, N Murata. Applied and Computational Harmonic Analysis 43(2), 233-268, 2015. (Cited by 430)
Transport Analysis of Infinitely Deep Neural Network. S Sonoda, N Murata. Journal of Machine Learning Research, 2016. (Cited by 58)
Double Continuum Limit of Deep Neural Networks. S Sonoda, N Murata. ICML 2017 Workshop on Principled Approaches to Deep Learning, 1-5, 2017. (Cited by 43)
A statistical model for predicting the liquid steel temperature in ladle and tundish by bootstrap filter. S Sonoda, N Murata, H Hino, H Kitada, M Kano. ISIJ International 52(6), 1086-1091, 2012. (Cited by 31)
Differentiable multiple shooting layers. S Massaroli, M Poli, S Sonoda, T Suzuki, J Park, A Yamashita, H Asama. Advances in Neural Information Processing Systems 34, 16532-16544, 2021. (Cited by 21)
Learning with optimized random features: Exponential speedup by quantum machine learning without sparsity and low-rank assumptions. H Yamasaki, S Subramanian, S Sonoda, M Koashi. Advances in Neural Information Processing Systems 33, 13674-13687, 2020. (Cited by 17)
Fully-connected network on noncompact symmetric space and ridgelet transform based on Helgason-Fourier analysis. S Sonoda, I Ishikawa, M Ikeda. International Conference on Machine Learning, 20405-20422, 2022. (Cited by 16)
Neural network with unbounded activations is universal approximator. S Sonoda, N Murata. CoRR, abs/1505.03654, 2015. (Cited by 16)
Ridge regression with over-parametrized two-layer networks converge to ridgelet spectrum. S Sonoda, I Ishikawa, M Ikeda. International Conference on Artificial Intelligence and Statistics, 2674-2682, 2021. (Cited by 15)
Sampling hidden parameters from oracle distribution. S Sonoda, N Murata. Artificial Neural Networks and Machine Learning–ICANN 2014: 24th …, 2014. (Cited by 15)
LPML: LLM-prompting markup language for mathematical reasoning. R Yamauchi, S Sonoda, A Sannai, W Kumagai. arXiv preprint arXiv:2309.13078, 2023. (Cited by 11)
How powerful are shallow neural networks with bandlimited random weights? M Li, S Sonoda, F Cao, YG Wang, J Liang. International Conference on Machine Learning, 19960-19981, 2023. (Cited by 10)
Universality of group convolutional neural networks based on ridgelet analysis on groups. S Sonoda, I Ishikawa, M Ikeda. Advances in Neural Information Processing Systems 35, 38680-38694, 2022. (Cited by 9)
Exponential error convergence in data classification with optimized random features: Acceleration by quantum machine learning. H Yamasaki, S Sonoda. arXiv preprint arXiv:2106.09028, 2021. (Cited by 8)
The global optimum of shallow neural network is attained by ridgelet transform. S Sonoda, I Ishikawa, M Ikeda, K Hagihara, Y Sawano, T Matsubara, ... arXiv preprint arXiv:1805.07517v3, 2019. (Cited by 7)
Transportation analysis of denoising autoencoders: a novel method for analyzing deep neural networks. S Sonoda, N Murata. NIPS Workshop on Optimal Transport & Machine Learning, 2017. (Cited by 7)
Ghosts in neural networks: Existence, structure and role of infinite-dimensional null space. S Sonoda, I Ishikawa, M Ikeda. arXiv preprint arXiv:2106.04770, 2021. (Cited by 6)
Nonparametric weight initialization of neural networks via integral representation. S Sonoda, N Murata. arXiv preprint arXiv:1312.6461, 2013. (Cited by 6)
Quantum ridgelet transform: winning lottery ticket of neural networks with quantum computation. H Yamasaki, S Subramanian, S Hayakawa, S Sonoda. International Conference on Machine Learning, 39008-39034, 2023. (Cited by 5)
EEG dipole source localization with information criteria for multiple particle filters. S Sonoda, K Nakamura, Y Kaneda, H Hino, S Akaho, N Murata, ... Neural Networks 108, 68-82, 2018. (Cited by 5)