Kenji Kawaguchi
Presidential Young Professor, National University of Singapore
Deep learning without poor local minima
K Kawaguchi
Advances in Neural Information Processing Systems (NeurIPS), 586-594, 2016
Interpolation consistency training for semi-supervised learning
V Verma, K Kawaguchi, A Lamb, J Kannala, A Solin, Y Bengio, ...
Neural Networks 145, 90-106, 2022
Adaptive activation functions accelerate convergence in deep and physics-informed neural networks
AD Jagtap, K Kawaguchi, GE Karniadakis
Journal of Computational Physics 404, 109136, 2020
Generalization in Deep Learning
K Kawaguchi, LP Kaelbling, Y Bengio
Cambridge University Press, 2022
Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks
AD Jagtap*, K Kawaguchi*, GE Karniadakis
Proceedings of the Royal Society A 476 (2239), 20200334, 2020
How Does Mixup Help With Robustness and Generalization?
L Zhang*, Z Deng*, K Kawaguchi*, A Ghorbani, J Zou
International Conference on Learning Representations (ICLR), 2021
Combined scaling for zero-shot transfer learning
H Pham*, Z Dai*, G Ghiasi*, K Kawaguchi*, H Liu, AW Yu, J Yu, YT Chen, ...
Neurocomputing 555, 126658, 2023
GraphMix: Improved Training of GNNs for Semi-Supervised Learning
V Verma, M Qu, K Kawaguchi, A Lamb, Y Bengio, J Kannala, J Tang
Proceedings of the AAAI Conference on Artificial Intelligence (AAAI), 2021
Theory of Deep Learning III: explaining the non-overfitting puzzle
T Poggio, K Kawaguchi, Q Liao, B Miranda, L Rosasco, X Boix, J Hidary, ...
Massachusetts Institute of Technology, CBMM Memo No. 073, 2018
Bayesian optimization with exponential convergence
K Kawaguchi, LP Kaelbling, T Lozano-Pérez
Advances in Neural Information Processing Systems (NeurIPS) 28, 2809-2817, 2015
Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions
AD Jagtap, Y Shin, K Kawaguchi, GE Karniadakis
Neurocomputing 468, 165-180, 2022
Depth Creates No Bad Local Minima
H Lu, K Kawaguchi
arXiv preprint arXiv:1702.08580, 2017
Towards domain-agnostic contrastive learning
V Verma, T Luong, K Kawaguchi, H Pham, Q Le
International Conference on Machine Learning (ICML), 10530-10541, 2021
Interpolated adversarial training: Achieving robust neural networks without sacrificing too much accuracy
A Lamb, V Verma, K Kawaguchi, A Matyasko, S Khosla, J Kannala, ...
Neural Networks, 2022
When Do Extended Physics-Informed Neural Networks (XPINNs) Improve Generalization?
Z Hu, AD Jagtap, GE Karniadakis, K Kawaguchi
SIAM Journal on Scientific Computing 44 (5), A3158-A3182, 2022
Multi-Task Learning as a Bargaining Game
A Navon, A Shamsian, I Achituve, H Maron, K Kawaguchi, G Chechik, ...
International Conference on Machine Learning (ICML), 2022
Optimization of graph neural networks: Implicit acceleration by skip connections and more depth
K Xu*, M Zhang, S Jegelka, K Kawaguchi*
International Conference on Machine Learning (ICML), 11592-11602, 2021
Meta-learning PINN loss functions
AF Psaros, K Kawaguchi, GE Karniadakis
Journal of Computational Physics 458, 111121, 2022
Decomposition Enhances Reasoning via Self-Evaluation Guided Decoding
Y Xie, K Kawaguchi, Y Zhao, X Zhao, MY Kan, J He, Q Xie
Advances in Neural Information Processing Systems (NeurIPS), 2023
Effect of depth and width on local minima in deep learning
K Kawaguchi, J Huang, LP Kaelbling
Neural Computation 31 (7), 1462-1498, 2019