
Student/Postdoc Working Geometry Seminar

Date: February 10, 2021

Time: 10:30AM - 11:30AM

Location: Zoom

Speaker: Elina Robeva, UBC

Title: Orthogonal decomposition of tensor trains

Abstract: Tensor decomposition has many applications, but it is often a hard problem. Orthogonally decomposable tensors form a small subfamily of tensors that retains many of the nice properties of matrices which general tensors lack. A symmetric tensor is orthogonally decomposable if it can be written as a linear combination of tensor powers of n orthonormal vectors. The decomposition of such a tensor can be found efficiently, its eigenvectors can be computed efficiently, and the set of orthogonally decomposable tensors of low rank is closed and can be described by a set of quadratic equations. One issue with orthogonally decomposable tensors, however, is that they form a very small subset of the set of all tensors. We expand this subset by considering orthogonally decomposable tensor trains: tensors formed by placing an orthogonally decomposable tensor at each vertex of a tensor train and then contracting. We give algorithms for decomposing such tensors both in the setting where the tensors at the vertices are symmetric and in the setting where they are not. This purely theoretical work is joint with Karim Halaseh and Tommi Muller.
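To make the central definition concrete, here is a minimal numpy sketch, assuming an order-3 symmetric tensor; the construction and the power_iteration helper are illustrative choices, not code from the speaker's work. It builds an orthogonally decomposable tensor T = Σᵢ λᵢ vᵢ⊗vᵢ⊗vᵢ from orthonormal vectors vᵢ and recovers one component by tensor power iteration, the standard reason eigenvectors of such tensors can be computed efficiently.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # dimension (assumed for illustration); tensor order is 3

# Orthonormal vectors via QR of a random matrix, and positive weights.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = rng.uniform(1.0, 2.0, size=n)

# Build the symmetric odeco tensor T = sum_i lam_i * v_i (x) v_i (x) v_i.
T = np.zeros((n, n, n))
for i in range(n):
    v = Q[:, i]
    T += lam[i] * np.einsum('a,b,c->abc', v, v, v)

def power_iteration(T, iters=100):
    """Tensor power iteration: v <- T(I, v, v), normalized each step."""
    v = rng.standard_normal(T.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = np.einsum('abc,b,c->a', T, v, v)  # contract T against v twice
        v = w / np.linalg.norm(w)
    return v

v_hat = power_iteration(T)
# For an odeco tensor with positive weights, v_hat aligns (up to sign)
# with one of the columns of Q; one entry below is ~1, the rest ~0.
print(np.abs(Q.T @ v_hat).round(3))
```

In the tensor-train variant discussed in the talk, blocks of this kind sit at the vertices of a train and are contracted along shared edges, and the decomposition algorithms recover the orthonormal components at each vertex.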