Further Limitations of the Known Approaches for Matrix Multiplication
Author(s)
Alman, Josh; Williams, Virginia Vassilevska
Abstract
© Josh Alman and Virginia V. Williams. We consider the techniques behind the current best algorithms for matrix multiplication. Our results are threefold. (1) We provide a unifying framework, showing that all known matrix multiplication running times since 1986 can be achieved from a single very natural tensor: the structural tensor T_q of addition modulo an integer q. (2) We show that if one applies a generalization of the known techniques (arbitrary zeroing out of tensor powers to obtain independent matrix products in order to use the asymptotic sum inequality of Schönhage) to an arbitrary monomial degeneration of T_q, then there is an explicit lower bound, depending on q, on the bound on the matrix multiplication exponent ω that one can achieve. We also show upper bounds on the value α that one can achieve, where α is such that n × n^α × n matrix multiplication can be computed in n^{2+o(1)} time. (3) We show that our lower bound on ω approaches 2 as q goes to infinity. This suggests a promising approach to improving the bound on ω: for variable q, find a monomial degeneration of T_q which, using the known techniques, produces an upper bound on ω as a function of q. Then, take q to infinity. It is not ruled out, and hence possible, that one can obtain ω = 2 in this way.
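For readers unfamiliar with the notation, the structural tensor T_q mentioned above has a standard definition in the literature; the following is a sketch using the usual indexing over Z_q (the indexing convention is assumed here, not quoted from the paper):

\[
  T_q \;=\; \sum_{i=0}^{q-1} \sum_{j=0}^{q-1} x_i \, y_j \, z_{(i+j) \bmod q}
\]

That is, the coefficient of x_i y_j z_k is 1 exactly when i + j ≡ k (mod q), so T_q is the tensor of cyclic convolution of length q.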
Date issued
2018
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Citation
Alman, Josh and Williams, Virginia Vassilevska. 2018. "Further Limitations of the Known Approaches for Matrix Multiplication."
Version: Final published version