Randomized density matrix renormalization group algorithm for low-rank tensor train decomposition
Abstract
Tensor train decomposition is a powerful tool for processing high-dimensional data. The density matrix renormalization group (DMRG) is an alternating scheme for computing low-rank tensor train decompositions of large tensors. However, it may suffer from the curse of dimensionality due to the large scale of its subproblems. In this paper, we propose a novel randomized proximal DMRG algorithm for low-rank tensor train decomposition that uses TensorSketch to alleviate the curse of dimensionality. Numerical experiments on synthetic and real-world data demonstrate the effectiveness and efficiency of the proposed algorithm.
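To fix ideas, the tensor train (TT) format the abstract refers to can be illustrated with the classical deterministic TT-SVD of Oseledets, the baseline that DMRG-style and sketching-based methods accelerate. The following is a minimal NumPy sketch, not the paper's randomized algorithm; the function names (`tt_svd`, `tt_to_full`) and the fixed maximal-rank truncation are illustrative assumptions.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way array into TT cores G_k of shape (r_{k-1}, n_k, r_k)
    by sequential truncated SVDs of unfoldings (classical TT-SVD sketch)."""
    dims = tensor.shape
    d = len(dims)
    cores = []
    r_prev = 1
    # First unfolding: rows indexed by (r_0 * n_0), columns by the rest.
    C = tensor.reshape(r_prev * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = min(max_rank, len(s))             # truncate to the target TT rank
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        # Carry the remainder s*Vt forward and refold for the next mode.
        C = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into the full tensor (for checking accuracy)."""
    res = cores[0]                            # shape (1, n_0, r_1)
    for G in cores[1:]:
        res = np.tensordot(res, G, axes=([res.ndim - 1], [0]))
    return res[0, ..., 0]                     # drop the boundary ranks r_0 = r_d = 1
```

For an exactly rank-1 tensor, `tt_svd` with `max_rank=1` recovers it exactly; for general tensors the truncation error accumulates over the sequential SVDs. The cost of these SVDs on large unfoldings is precisely what randomized sketching techniques such as TensorSketch aim to reduce.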