Educational Materials (2026)
– Calculus
– Image Recognition
– Information Geometry
– Materials Informatics
– Neural Network Math: Robotics
– Neural Network Math: Transformer
– Neural Network Math: Transformer (JP)
– Sparse Estimation & HDLSS
– Bayesian Statistics Textbook
– Chemometrics
– Linear Algebra
– Machine Learning Textbook
– Math & Statistics Textbook
– Physics Textbook
– Solutions Manual
– Time Series Textbook
– Why Do Neural Networks Work? (ニューラルネットはなぜ動くのか?)
Full Repository (GitHub)
Academic Papers
2024
Nakayama, Y., Yata, K. and Aoshima, M. (2024).
Test for high-dimensional outliers with principal component analysis.
Japanese Journal of Statistics and Data Science.
2022
Nakayama, Y. (2022).
Support vector machine and optimal parameter selection for high-dimensional imbalanced data.
Communications in Statistics – Simulation and Computation 51(11), 6739–6754.
2021
[OPEN ACCESS]
Nakayama, Y., Yata, K. and Aoshima, M. (2021).
Clustering by principal component analysis with Gaussian kernel in high-dimension, low-sample-size settings.
Journal of Multivariate Analysis 185, 104779.
Nakayama, Y. (2021).
Robust support vector machine for high-dimensional imbalanced data.
Communications in Statistics – Simulation and Computation 50(5), 1524–1540.
2020
Nakayama, Y., Yata, K. and Aoshima, M. (2020).
Bias-corrected support vector machine with Gaussian kernel in high-dimension, low-sample-size settings.
Annals of the Institute of Statistical Mathematics 72(5), 1257–1286.
2018
Yata, K., Aoshima, M. and Nakayama, Y. (2018).
A test of sphericity for high-dimensional data and its application for detection of divergently spiked noise.
Sequential Analysis 37(3), 397–411.
2017
Nakayama, Y., Yata, K. and Aoshima, M. (2017).
Support vector machine and its bias correction in high-dimension, low-sample-size settings.
Journal of Statistical Planning and Inference 191, 88–100.
