Show simple item record

dc.contributor.author: Batselier, Kim
dc.contributor.author: Yu, Wenjian
dc.contributor.author: Daniel, Luca
dc.contributor.author: Wong, Ngai
dc.date.accessioned: 2019-07-09T19:10:20Z
dc.date.available: 2019-07-09T19:10:20Z
dc.date.issued: 2018-01
dc.date.submitted: 2017-07-25
dc.identifier.issn: 0895-4798
dc.identifier.issn: 1095-7162
dc.identifier.uri: https://hdl.handle.net/1721.1/121552
dc.description.abstract: We propose a new algorithm for the computation of a singular value decomposition (SVD) low-rank approximation of a matrix in the matrix product operator (MPO) format, also called the tensor train matrix format. Our tensor network randomized SVD (TNrSVD) algorithm is an MPO implementation of the randomized SVD algorithm that is able to compute dominant singular values and their corresponding singular vectors. In contrast to the state-of-the-art tensor-based alternating least squares SVD (ALS-SVD) and modified alternating least squares SVD (MALS-SVD) matrix approximation methods, TNrSVD can be up to 13 times faster while achieving better accuracy. In addition, our TNrSVD algorithm also produces accurate approximations in particular cases where both ALS-SVD and MALS-SVD fail to converge. We also propose a new algorithm for the fast conversion of a sparse matrix into its corresponding MPO form, which is up to 509 times faster than the standard tensor train SVD method while achieving machine precision accuracy. The efficiency and accuracy of both algorithms are demonstrated in numerical experiments. Key words: curse of dimensionality, low-rank tensor approximation, matrix factorization, matrix product operator, singular value decomposition (SVD), tensor network, tensor train (TT) decomposition, randomized algorithm
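The abstract builds on the classical randomized SVD (random sketch, orthogonalization, small SVD). A minimal dense NumPy sketch of that base algorithm is below; note this is an illustrative assumption, not the paper's TNrSVD, which carries out these steps in MPO/tensor-train format.

```python
import numpy as np

def randomized_svd(A, rank, n_oversample=10, n_power_iter=2, seed=0):
    """Sketch of the classical randomized SVD. The paper's TNrSVD
    performs the analogous steps in MPO (tensor train matrix) format;
    this dense version is only for illustration."""
    rng = np.random.default_rng(seed)
    k = rank + n_oversample
    # Gaussian sketch: Y spans (approximately) the dominant column space of A.
    Y = A @ rng.standard_normal((A.shape[1], k))
    # Power iterations sharpen the separation of the leading singular values.
    for _ in range(n_power_iter):
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)                 # orthonormal basis for the sketch
    B = Q.T @ A                            # small k-by-n projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank, :]

# Exactly rank-5 test matrix, so the approximation should be near machine precision.
G = np.random.default_rng(1).standard_normal((200, 5))
A = G @ G.T
U, s, Vt = randomized_svd(A, rank=5)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
```

On a matrix of exact rank 5, the relative error `err` is at machine-precision level; the cost is dominated by the matrix products with the thin sketch rather than a full SVD.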
dc.description.sponsorship: Research Grants Council (Hong Kong, China) (17246416)
dc.language.iso: en
dc.publisher: Society for Industrial & Applied Mathematics (SIAM)
dc.relation.isversionof: 10.1137/17m1140480
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
dc.source: SIAM
dc.title: Computing Low-Rank Approximations of Large-Scale Matrices with the Tensor Network Randomized SVD
dc.type: Article
dc.identifier.citation: Batselier, Kim, et al. “Computing Low-Rank Approximations of Large-Scale Matrices with the Tensor Network Randomized SVD.” SIAM Journal on Matrix Analysis and Applications 39, no. 3 (January 2018): 1221–44. © 2018 Society for Industrial and Applied Mathematics.
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.relation.journal: SIAM Journal on Matrix Analysis and Applications
dc.eprint.version: Final published version
dc.type.uri: http://purl.org/eprint/type/JournalArticle
eprint.status: http://purl.org/eprint/status/PeerReviewed
dc.date.updated: 2019-05-15T17:10:18Z
dspace.date.submission: 2019-05-15T17:10:19Z

