Texas A&M University
Mathematics

Seminar in Random Tensors

Fall 2020

 

Date: November 9, 2020
Time: 11:00am
Location: Zoom
Speaker: Liza Rebrova, UCLA
Title: About modewise tensor dimension reduction and fitting low-rank tensors
Abstract: It is no secret that the probabilistic viewpoint in general, and random matrix theory in particular, provides amazing tools for understanding large high-dimensional data. However, in many cases one has to go beyond “simple” matrix models to represent and treat the data correctly. For example, inherently multimodal data is better represented by a tensor, that is, a higher-order generalization of a matrix. The transition to more advanced data structures can sometimes survive re-using old algorithms; however, developing special tools that honor the full structure within the data pays off by making the algorithms both much more efficient and more interpretable. In this talk, I will focus on our new provable methods for modewise (structure-preserving) tensor dimension reduction. I will also discuss their application to the tensor fitting problem and the connections to interpretable learning from multi-modal data through tensor decompositions.
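As a rough illustration of what a modewise (structure-preserving) reduction can look like, the sketch below applies an independent scaled Gaussian map along each mode of a small third-order tensor. This is a generic numerical example, not the speaker's specific construction; the sizes, the Gaussian maps, and the helper `mode_product` are all illustrative choices.

```python
import numpy as np

def mode_product(T, A, mode):
    """n-mode product: multiply tensor T by matrix A along the given mode."""
    T = np.moveaxis(T, mode, 0)
    front, rest = T.shape[0], T.shape[1:]
    out = (A @ T.reshape(front, -1)).reshape((A.shape[0],) + rest)
    return np.moveaxis(out, 0, mode)

rng = np.random.default_rng(0)
n, m = 50, 10  # original and reduced dimension per mode (illustrative sizes)

# A rank-1 test tensor x (outer) y (outer) z, whose Frobenius norm the sketch
# should roughly preserve.
x, y, z = rng.standard_normal(n), rng.standard_normal(n), rng.standard_normal(n)
T = np.einsum('i,j,k->ijk', x, y, z)

# Modewise sketch: an independent scaled Gaussian map applied along each mode,
# shrinking n x n x n to m x m x m instead of sketching a full vectorization.
S = T
for mode in range(3):
    G = rng.standard_normal((m, n)) / np.sqrt(m)
    S = mode_product(S, G, mode)

print(T.shape, '->', S.shape)                # (50, 50, 50) -> (10, 10, 10)
print(np.linalg.norm(T), np.linalg.norm(S))  # norms of comparable magnitude
```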

Date: December 18, 2020
Time: 2:15pm
Location: Zoom
Speaker: S. Kopparty, Rutgers
Title: On the bias of multilinear forms
Abstract: We study the bias of multilinear forms over finite fields, a quantity also studied under the name "analytic rank". We show two results: 1. Random multilinear forms are unbiased with (very) high probability. 2. Unbiased multilinear forms cannot have small tensor rank (which can be restated as saying that analytic rank gives a lower bound on tensor rank). Along the way, we develop some tools involving Fourier analysis, probability, and linear algebra. For example, we show that for any d-dimensional subspace of the space of k x k matrices over F_2, the probability that a uniformly random rank-1 matrix lies in this subspace is at most 2^{d/k}/2^k. Based on joint work with Abhishek Bhrushundi, Pooya Hatami, Prahladh Harsha, and Mrinal Kumar.
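The rank-1 counting statement can be sanity-checked numerically. The sketch below is an illustrative Monte Carlo estimate, not code from the talk: it fixes a random d-dimensional subspace of k x k matrices over F_2, samples uniformly random rank-1 matrices u v^T with u, v nonzero, and compares the empirical membership frequency with the bound 2^{d/k}/2^k quoted in the abstract. The parameters k, d, trials and the helpers gf2_rank, in_span are illustrative assumptions.

```python
import numpy as np

def gf2_rank(M):
    """Rank of a 0/1 matrix over F_2, via Gaussian elimination with XOR row operations."""
    M = np.array(M) % 2
    rank = 0
    for col in range(M.shape[1]):
        pivot = next((r for r in range(rank, M.shape[0]) if M[r, col]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]
        for r in range(M.shape[0]):
            if r != rank and M[r, col]:
                M[r] ^= M[rank]
        rank += 1
    return rank

rng = np.random.default_rng(1)
k, d, trials = 4, 6, 20000  # illustrative sizes, not taken from the talk

# Fix a random d-dimensional subspace of k x k matrices over F_2, stored as
# d linearly independent basis matrices flattened into rows of length k^2.
while True:
    basis = rng.integers(0, 2, size=(d, k * k))
    if gf2_rank(basis) == d:
        break

def in_span(vec, basis):
    """vec is in the F_2-span of the basis rows iff appending it does not raise the rank."""
    return gf2_rank(np.vstack([basis, vec])) == d  # basis has rank d by construction

samples = hits = 0
while samples < trials:
    # A uniformly random rank-1 matrix over F_2 is u v^T with u, v nonzero
    # (this representation is unique over F_2), so rejection-sample nonzero u, v.
    u = rng.integers(0, 2, size=k)
    v = rng.integers(0, 2, size=k)
    if not u.any() or not v.any():
        continue
    samples += 1
    hits += in_span((np.outer(u, v) % 2).reshape(-1), basis)

print('empirical Pr[random rank-1 matrix lies in the subspace]:', hits / trials)
print('bound quoted in the abstract, 2^{d/k}/2^k:', 2 ** (d / k) / 2 ** k)
```

With these illustrative parameters the empirical frequency should stay below the quoted bound, as the result in the abstract guarantees for every subspace of the given dimension.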