
RIKEN Center for Advanced Intelligence Project Deep Learning Theory Team

Team Leader: Taiji Suzuki (Ph.D.)

Research Summary


Our team, the Deep Learning Theory Team, studies a wide range of learning systems, including deep learning, from a theoretical viewpoint. We aim to deepen our understanding of complex learning systems and to leverage these insights to construct and apply new machine learning techniques. In particular, because machine learning must handle high-dimensional and complicated data, we study deep learning and structured sparse learning as methods for dealing with such data. We also develop efficient optimization algorithms for large and complicated machine learning problems, building on techniques such as stochastic optimization.
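As a toy illustration of the structured sparse learning mentioned above, the soft-thresholding operator below is the basic proximal step behind lasso-type sparse estimation. This is a minimal sketch for illustration only; the function name and values are invented for this example and are not team code.

```python
def soft_threshold(v, lam):
    """Proximal operator of lam * ||.||_1: elementwise soft-thresholding.

    Shrinks each coordinate toward zero by lam, setting small entries to
    exactly zero. This is the building block of lasso-type sparse methods.
    """
    return [max(abs(x) - lam, 0.0) * (1.0 if x >= 0 else -1.0) for x in v]

# Entries smaller than lam in magnitude become exactly 0; others shrink by lam.
out = soft_threshold([3.0, -0.5, 1.2], 1.0)
```

The exact zeros produced by this operator are what make the resulting estimators sparse.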

Research Subjects:

  • Statistical learning theory for a wide range of learning systems, including deep learning
  • Efficient optimization algorithms for large datasets
  • High-dimensional statistics
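As a similarly minimal illustration of stochastic optimization, the sketch below runs plain stochastic gradient descent on a toy one-dimensional least-squares problem. The function name, data, and step size are invented for this example; the team's research concerns far more general settings.

```python
import random

def sgd_least_squares(data, lr=0.05, epochs=100, seed=0):
    """Stochastic gradient descent for 1-D least squares: find w minimizing
    the average of (w * x - y)^2 over the data. Each step uses the gradient
    of a single randomly drawn sample, which is what makes it 'stochastic'."""
    rng = random.Random(seed)
    w = 0.0
    for _ in range(epochs):
        for _ in range(len(data)):
            x, y = rng.choice(data)
            grad = 2.0 * (w * x - y) * x  # gradient of one sample's loss
            w -= lr * grad
    return w

# Noiseless data generated from y = 3x; SGD should recover w close to 3.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
w = sgd_least_squares(data)
```

Because each update touches only one sample, the per-step cost is independent of the dataset size, which is why such methods scale to large datasets.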

Main Research Fields

  • Informatics

Related Research Fields

  • Mathematical & Physical Sciences
  • Principles of Informatics/Mathematical informatics
  • Principles of Informatics/Statistical science

Keywords

  • Deep learning
  • Statistical learning theory
  • Machine learning
  • Stochastic optimization
  • Mathematical statistics

Selected Publications

  1. Taiji Suzuki, Hiroshi Abe, Tomoya Murata, Shingo Horiuchi, Kotaro Ito, Tokuma Wachi, So Hirai, Masatoshi Yukishima, Tomoaki Nishimura:
     "Spectral pruning: Compressing deep neural networks via spectral analysis and its generalization error"
     The 29th International Joint Conference on Artificial Intelligence and the 17th Pacific Rim International Conference on Artificial Intelligence (IJCAI-PRICAI 2020).
  2. Taiji Suzuki, Hiroshi Abe, Tomoaki Nishimura:
     "Compression based bound for non-compressed network: unified generalization error analysis of large compressible deep neural network"
     The 8th International Conference on Learning Representations (ICLR 2020).
  3. Jimmy Ba, Murat Erdogdu, Taiji Suzuki, Denny Wu, Tianzong Zhang:
     "Generalization of two-layer neural networks: An asymptotic viewpoint"
     The 8th International Conference on Learning Representations (ICLR 2020).
  4. Kenta Oono and Taiji Suzuki:
     "Approximation and non-parametric estimation of ResNet-type convolutional neural networks"
     The 36th International Conference on Machine Learning (ICML 2019), pp. 4922-4931, (2019).
  5. Taiji Suzuki:
     "Adaptivity of deep ReLU network for learning in Besov and mixed smooth Besov spaces: optimal rate and curse of dimensionality"
     The 7th International Conference on Learning Representations (ICLR 2019).
  6. Atsushi Nitanda and Taiji Suzuki:
     "Functional gradient boosting based on residual network perception"
     The 35th International Conference on Machine Learning (ICML 2018), pp. 3819-3828, (2018).
  7. Taiji Suzuki:
     "Fast generalization error bound of deep learning from a kernel perspective"
     The 21st International Conference on Artificial Intelligence and Statistics (AISTATS 2018), pp. 1397-1406, (2018).
  8. Taiji Suzuki, Heishiro Kanagawa, Hayato Kobayashi, Nobuyuki Shimizu, Yukihiro Tagami:
     "Minimax optimal alternating minimization for kernel nonparametric tensor learning"
     The 30th Annual Conference on Neural Information Processing Systems (NIPS 2016), pp. 3783-3791, (2016).


Lab Members

Principal investigator

Taiji Suzuki
Team Leader

Core members

Sho Sonoda
Postdoctoral Researcher

Careers

Position: Seeking a Research Scientist or Postdoctoral Researcher (W20082)
Deadline: Open until filled

Contact Information

Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo
Hongo 7-3-1, Bunkyo-ku, Tokyo 113-8656, JAPAN
Email: taiji [at] mist.i.u-tokyo.ac.jp
