Texas A&M University
Mathematics

Inverse Problems and Machine Learning

Date: November 13, 2019

Time: 12:00PM - 1:00PM

Location: BLOC 628

Speaker: Christopher Snyder, UT Austin


Title: Combinatorial Complexity of Deep Networks: Think Weight Configurations, not Perturbations!

Abstract: Did you know that (ReLU) Deep Neural Networks (DNNs) trained on linearly separable data are linear classifiers? While it is widely appreciated that some data assumptions are necessary to explain generalization in deep learning, we observe that very strong data assumptions induce regularity in gradient-descent-trained DNNs that is entirely combinatorial in nature. That is, strong constraints exist between the binary neuron states and the binary output, which simplify the description of the classification map. We present a hierarchical decomposition of the DNN's discrete classification map into logical (AND/OR) combinations of intermediate (True/False) classifiers of the input. Those classifiers that cannot be further decomposed, called atoms, are (interpretable) linear classifiers. Taken together, we obtain a logical circuit with linear classifier inputs that computes the same label as the DNN. This circuit does not structurally resemble the network architecture, and it may require many fewer parameters, depending on the configuration of weights. In these cases, we obtain simultaneously an interpretation and a generalization bound (for the original DNN), connecting two fronts which have historically been investigated separately. We study DNNs in simple, controlled settings, where we obtain superior generalization bounds despite using only combinatorial information (e.g., no margin information). On the MNIST dataset, we show that the learned, internal, logical computations correspond to semantically meaningful (unlabeled) categories that allow DNN descriptions in plain English. We improve the generalization of an already trained network by interpreting, diagnosing, and replacing components within the logical circuit that is the DNN.
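To make the underlying observation concrete, the following minimal Python sketch shows that once the binary ReLU neuron states are held fixed, a network computes a single affine function of its input. This is an illustration, not the speaker's construction: the toy two-layer network, random weights, and helper names (neuron_states, effective_linear_classifier) are assumptions made for the example.

    import numpy as np

    # Toy two-layer ReLU network: x -> ReLU(W1 x + b1) -> w2 . h + b2
    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)   # hidden layer
    w2, b2 = rng.normal(size=4), rng.normal()               # output layer

    def neuron_states(x):
        # Binary on/off pattern of the hidden ReLU neurons at input x.
        return (W1 @ x + b1 > 0).astype(float)

    def effective_linear_classifier(x):
        # With the pattern s fixed, ReLU(W1 x + b1) = s * (W1 x + b1),
        # so the output reduces to the affine map w_eff . x + b_eff.
        s = neuron_states(x)
        w_eff = (w2 * s) @ W1
        b_eff = (w2 * s) @ b1 + b2
        return w_eff, b_eff

    x = np.array([0.3, -1.2])
    w_eff, b_eff = effective_linear_classifier(x)
    # The effective linear classifier reproduces the network output at x:
    h = np.maximum(W1 @ x + b1, 0.0)
    assert np.isclose(w2 @ h + b2, w_eff @ x + b_eff)

On each region of input space where the 0/1 pattern s is constant, the piecewise-linear ReLU map collapses to one such affine function; this is the sense in which the atoms of the decomposition described above are (interpretable) linear classifiers.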