Asap: A Stochastic Adaptive PCA Method For Increasing Block Size Setting

Abstract

We propose Asap, an adaptive stochastic optimization algorithm for principal component analysis (PCA) in the increasing block size setting. Asap is a generalized variant of the classical Oja's algorithm (Oja, 1982) that can compute the top-k principal components without requiring step-size tuning. It performs PCA by first-order gradient-based optimization driven by adaptive estimates of lower-order moments, in the spirit of Adagrad and Adam. We provide a theoretical guarantee that Asap converges to the top eigenvector of the true covariance matrix. Notably, our convergence-rate proof is independent of the eigengap and therefore does not require a positive-eigengap assumption, in contrast to many existing algorithms. Empirical results demonstrate that, in the increasing block size setting, Asap consistently outperforms or matches state-of-the-art PCA algorithms.
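The abstract does not spell out the Asap update itself, so the following is only a minimal sketch of the idea it describes: an Oja-style stochastic PCA iteration whose step is rescaled by Adam-like estimates of the gradient's first and second moments, processed over data blocks of increasing size. The function name asap_like_pca, the QR re-orthonormalization, and all hyperparameters are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def asap_like_pca(blocks, d, k, eta=1e-2, beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    """Illustrative Oja-style streaming PCA with Adam-like adaptive moments.

    `blocks` is an iterable of (block_size, d) arrays whose block sizes may
    grow over time. The hyperparameters and the QR re-orthonormalization step
    are illustrative choices, not the published Asap update.
    """
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))  # random orthonormal start
    m = np.zeros((d, k))   # first-moment estimate of the gradient
    v = np.zeros((d, k))   # second-moment estimate of the gradient
    for t, X in enumerate(blocks, start=1):
        g = X.T @ (X @ Q) / X.shape[0]        # stochastic Oja direction: C_hat @ Q
        m = beta1 * m + (1.0 - beta1) * g
        v = beta2 * v + (1.0 - beta2) * g * g
        m_hat = m / (1.0 - beta1 ** t)        # bias-corrected moments, as in Adam
        v_hat = v / (1.0 - beta2 ** t)
        Q = Q + eta * m_hat / (np.sqrt(v_hat) + eps)  # adaptive ascent step
        Q, _ = np.linalg.qr(Q)                # project back to orthonormal columns
    return Q


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    d, k = 20, 3
    basis = np.linalg.qr(rng.standard_normal((d, k)))[0]
    # Blocks of increasing size drawn from a low-rank-plus-noise model.
    blocks = [(basis @ rng.standard_normal((k, b))
               + 0.1 * rng.standard_normal((d, b))).T for b in (8, 16, 32, 64, 128)]
    Q = asap_like_pca(blocks, d, k)
    print("subspace alignment:", np.linalg.norm(basis.T @ Q))
```

The toy driver at the bottom only checks that the recovered columns align with a planted k-dimensional subspace; it is a usage illustration under the same assumptions, not an experiment from the paper.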

APA

Kumar, N., & Kawahara, Y. (2021). Asap: A Stochastic Adaptive PCA Method For Increasing Block Size Setting. Afribary. https://track.afribary.com/works/adaptive-stochastic-algorithm-for-pca
