
Date Time 
Location  Speaker 
Title and abstract 

01/22 11:00am 
zoom 
Ming Yuan Columbia U. 
Low-rank tensor completion
Many problems can be formulated as recovering a low-rank tensor from partial observations. Although an increasingly common task, tensor recovery remains a challenging problem because of the delicacy associated with the decomposition of higher-order tensors. In this talk, I will describe results from our recent investigation of several common approaches to low-rank tensor completion. 
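[A minimal sketch of the problem setup described in this abstract, not of any method from the talk: a low-rank tensor is observed on a random subset of entries, and the goal is to fill in the rest. All names and the 30% sampling rate are illustrative choices.]

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10

# Ground-truth rank-1 third-order tensor T = a (outer) b (outer) c,
# an assumed toy example of the low-rank structure to be recovered.
a, b, c = rng.standard_normal(n), rng.standard_normal(n), rng.standard_normal(n)
T = np.einsum("i,j,k->ijk", a, b, c)

# Observe a random 30% of the entries; unobserved entries are NaN.
mask = rng.random(T.shape) < 0.3
observed = np.where(mask, T, np.nan)

# The completion problem: recover T from `observed` and `mask` alone.
print(mask.sum(), "of", T.size, "entries observed")
```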

01/29 11:00am 
zoom 
Grigoris Paouris TAMU 
Canceled 

02/05 11:00am 
zoom 
Shachar Lovett UCSD 
Tensor ranks and their applications
Tensors are high-order analogs of matrices. There are several notions of "tensor rank" that extend the standard notion of matrix rank to tensors. In this talk I will describe these notions, with a focus on tensors that are formed as high tensor powers of base tensors. I will also describe some applications of tensor ranks of tensor powers, both in combinatorics and in computer science.
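[A small illustration of how matrix rank extends to tensors, as one example of the notions the abstract mentions; the construction is mine, not from the talk. A CP-rank-2 tensor is built as a sum of two random rank-1 terms, and each matrix unfolding then has rank 2, which lower-bounds the tensor's CP rank.]

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

def rank_one(u, v, w):
    """Rank-1 third-order tensor: the outer product u (outer) v (outer) w."""
    return np.einsum("i,j,k->ijk", u, v, w)

# A tensor of CP rank (at most) 2: sum of two generic rank-1 terms.
T = rank_one(*rng.standard_normal((3, n))) + rank_one(*rng.standard_normal((3, n)))

# Each unfolding (flattening one mode against the other two) has matrix
# rank 2 for generic factors, giving a lower bound on the CP rank.
for mode in range(3):
    M = np.moveaxis(T, mode, 0).reshape(n, -1)
    print("mode", mode, "unfolding rank:", np.linalg.matrix_rank(M))
```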


02/19 11:00am 
zoom 
Henrique Goulart U. Toulouse 
A random matrix perspective on spiked tensor models
From both theoretical and methodological viewpoints, low-rank tensor estimation from noisy data is a difficult problem, yet one rich in applications across numerous domains. Though many different methods have been developed to address it, no theory is available for predicting their performance in practice. Progress has recently been made by studying the asymptotic performance of estimators for certain so-called spiked tensor models, whose dimensions are assumed to be rather large. Yet, these results rely upon techniques and concepts borrowed from statistical physics, which are largely inaccessible to non-experts and difficult to extend to other, more general tensor models. In this talk, I will show how standard but powerful tools from random matrix theory can be leveraged to study these low-rank tensor estimators, opening a new window into spectral properties of random tensors and allowing one to reach several predictions that had previously been obtained only with the statistical physics machinery. 
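[A sketch of a symmetric rank-1 spiked tensor model of the kind the abstract refers to: Y = beta·x⊗x⊗x + noise, with a unit-norm planted spike x and signal-to-noise parameter beta. The normalization and parameter values are illustrative assumptions, not the speaker's.]

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta = 50, 3.0

# Unit-norm planted signal direction.
x = rng.standard_normal(n)
x /= np.linalg.norm(x)

# Spiked tensor: rank-1 signal plus Gaussian noise scaled by 1/sqrt(n).
signal = beta * np.einsum("i,j,k->ijk", x, x, x)
noise = rng.standard_normal((n, n, n)) / np.sqrt(n)
Y = signal + noise

# Contracting Y against the planted direction recovers roughly beta,
# since the noise contribution concentrates around zero.
alignment = np.einsum("ijk,i,j,k->", Y, x, x, x)
print("beta:", beta, "recovered alignment:", alignment)
```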

03/05 11:00am 
zoom 
Dan Mikulincer Weizmann Institute 
A central limit theorem for tensor powers
We introduce the Wishart tensor as the p-th tensor power of a given random vector X in R^n. This is inspired by the classical Wishart matrix, obtained when p = 2. Sums of independent Wishart tensors appear naturally in several settings, such as empirical moment tensors and random geometric graphs. We will discuss possible connections and recent results.
The main focus of the talk will be quantitative estimates for the central limit theorem of Wishart tensors. In this setting, we will explain how Stein's method may be used to exploit the low-dimensional structure which is inherent to tensor powers. Specifically, it will be shown that, under appropriate regularity assumptions, a sum of independent Wishart tensors is close to a Gaussian tensor as soon as the number of summands grows faster than n^(2p-1). 
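[A sketch of the objects in this abstract, with all parameter values my own: the p-th tensor power X^(⊗p) of a random vector X in R^n, and a CLT-normalized sum of d independent such Wishart tensors, which the talk compares to a Gaussian tensor.]

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, d = 4, 3, 2000

def tensor_power(x, p):
    """p-fold tensor (outer) power of the vector x: a symmetric order-p tensor."""
    t = x
    for _ in range(p - 1):
        t = np.multiply.outer(t, x)
    return t

X = rng.standard_normal((d, n))

# For odd p and a symmetric distribution, E[X^(tensor p)] = 0, so the
# CLT-normalized sum of d independent Wishart tensors is simply:
S = sum(tensor_power(x, p) for x in X) / np.sqrt(d)
print("normalized sum shape:", S.shape)
```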

05/07 04:00am 
zoom 
G. Moshkovitz City University of New York (Baruch College) 
An Optimal Inverse Theorem
The partition rank and analytic rank of a tensor measure algebraic structure and bias, respectively. We prove that they are equivalent up to a constant, over any large enough finite field (independently of the number of variables). The proof constructs rational maps computing a rank decomposition for successive derivatives, on a carefully chosen subset of the kernel variety associated with the tensor.
Proving the equivalence between these two quantities is the main question in the "bias implies low rank" line of work in higher-order Fourier analysis, and has been reiterated by multiple authors.
Joint work with Alex Cohen. 
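[A small, self-contained illustration of the bias and analytic rank mentioned in the abstract; the specific trilinear form and field are my own toy choices. For a trilinear form T over F_2, the bias is E[(-1)^T(x,y,z)] over uniform inputs, and the analytic rank is -log_2 of the bias.]

```python
import itertools
import math
import numpy as np

q, n = 2, 2

# A partition-rank-1 trilinear form T(x, y, z) = (a.x)(b.y)(c.z) over F_2.
a = np.array([1, 0])
b = np.array([1, 1])
c = np.array([0, 1])

def T(x, y, z):
    return (a @ x) * (b @ y) * (c @ z) % q

# Bias: average character value over all inputs; analytic rank: -log_q(bias).
vecs = [np.array(v) for v in itertools.product(range(q), repeat=n)]
bias = np.mean([(-1) ** T(x, y, z) for x in vecs for y in vecs for z in vecs])
analytic_rank = -math.log(bias, q)
print("bias:", bias, "analytic rank:", analytic_rank)
```

Here the form has partition rank 1, and its analytic rank comes out below 1, consistent with the direction "low rank implies bias"; the theorem in the abstract concerns the much harder converse.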