Texas A&M University
Mathematics

Seminar in Random Tensors

Date: May 21, 2020

Time: 4:00PM - 5:00PM

Location: Zoom

Speaker: Luke Oeding, Auburn University


Title: Stochastic Alternating Least Squares for Tensor Decomposition

Abstract: Least squares is the standard method for approximating solutions to overdetermined systems of linear equations, and it is known to converge quickly to the optimal solution. A solved linear system can be viewed as a diagonalized system. For many applications the data are multilinear, and we want to exploit that structure; the multilinear analogue of a diagonalized system is a rank decomposition. Alternating Least Squares (ALS) is a standard method for computing a rank decomposition of a tensor by reducing the problem to a sequence of least squares optimizations. While this method can be effective in some situations, it is limited: it does not always converge, and it can be computationally expensive. However, when tensor data arrive sample by sample, we can use stochastic methods to attempt to decompose a model tensor from its samples. We show that under mild regularity and boundedness assumptions, the Stochastic Alternating Least Squares (SALS) method converges. Even though tensor problems often have high complexity, trading exactness for sampling can yield large savings in time and resources. I'll describe the SALS algorithm and its advantages, and give some hints as to why (and when) it converges.
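The classical (non-stochastic) ALS idea mentioned in the abstract — fix all factors but one and solve an ordinary least squares problem for that factor, then cycle — can be sketched in a few lines. The sketch below is illustrative only and is not the speaker's SALS algorithm; the function names (`als_cp`, `khatri_rao`) and all parameter choices are assumptions made for this example.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker product: (I*J) x r from (I x r) and (J x r)."""
    r = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, r)

def als_cp(T, r, iters=200, seed=0):
    """Illustrative ALS sketch for a rank-r CP decomposition of a 3-way
    tensor T ~ sum_k a_k (x) b_k (x) c_k.  Not the talk's SALS method."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, r))
    B = rng.standard_normal((J, r))
    C = rng.standard_normal((K, r))
    # Mode-n unfoldings of T (C-order flattening matches khatri_rao above).
    T1 = T.reshape(I, -1)                      # mode-1: I x (J*K)
    T2 = np.moveaxis(T, 1, 0).reshape(J, -1)   # mode-2: J x (I*K)
    T3 = np.moveaxis(T, 2, 0).reshape(K, -1)   # mode-3: K x (I*J)
    for _ in range(iters):
        # Each update is a plain least squares solve with the other
        # two factors held fixed -- the "sequence of least squares
        # optimizations" the abstract refers to.
        A = T1 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = T2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = T3 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C
```

On an exactly low-rank tensor this loop typically recovers the factors, but, as the abstract notes, ALS need not converge in general; a stochastic variant would instead update the factors from individual samples of the tensor rather than from full unfoldings.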