Multilinear compressive sensing and an application to convolutional linear networks - Archive ouverte HAL
Journal article: SIAM Journal on Mathematics of Data Science, 2019

## Multilinear compressive sensing and an application to convolutional linear networks

François Malgouyres
Joseph Landsberg

#### Abstract

We study a deep linear network endowed with a structure. It takes the form of a matrix $X$ obtained by multiplying $K$ matrices (called factors, each corresponding to the action of a layer). The action of each layer (i.e., a factor) is obtained by applying a fixed linear operator to a vector of parameters satisfying a constraint. The number of layers is not limited. Assuming that $X$ is given and the factors have been estimated, the error between the product of the estimated factors and $X$ (i.e., the reconstruction error) is either the statistical or the empirical risk.

In this paper, we provide necessary and sufficient conditions on the network topology under which a stability property holds. The stability property requires that the error on the parameters defining the factors (i.e., the stability of the recovered parameters) scales linearly with the reconstruction error (i.e., the risk). Under these conditions on the network topology, any successful learning task therefore leads to stably defined features, and hence to interpretable layers and an interpretable network.

To establish this, we first evaluate how the Segre embedding and its inverse distort distances. We then show that any deep structured linear network can be cast as a generic multilinear problem (which uses the Segre embedding); this is the *tensorial lifting*. Using the tensorial lifting, we provide necessary and sufficient conditions for the identifiability of the factors (up to a scale rearrangement). We finally provide the necessary and sufficient condition, named by analogy with the usual Null Space Property in the compressed sensing framework, which guarantees that the stability property holds. We illustrate the theory with a practical example where the deep structured linear network is a convolutional linear network. As expected, the conditions are rather strong, but not empty; a simple test on the network topology can be implemented to check whether the condition holds.
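The setting described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it builds $X$ as a product of $K$ factors, each obtained by applying a fixed linear operator $A_k$ to a parameter vector $h_k$ (all sizes, operator choices, and variable names here are illustrative assumptions), then computes the two quantities the stability property relates, namely the reconstruction error and the error on the parameters.

```python
import numpy as np

# Illustrative sketch (not from the paper): a structured deep linear network
# X = M_K ... M_1, where each factor M_k is the image of a parameter vector
# h_k under a fixed linear operator A_k. Sizes below are arbitrary choices.

rng = np.random.default_rng(0)

n = 4   # size of each (square) factor
K = 3   # number of layers / factors
S = 5   # number of parameters per layer

# One fixed linear operator A_k : R^S -> R^{n x n} per layer.
A = [rng.standard_normal((n * n, S)) for _ in range(K)]

def factor(A_k, h_k, n):
    """Apply the fixed linear operator A_k to the parameter vector h_k."""
    return (A_k @ h_k).reshape(n, n)

# "True" parameters and the resulting network matrix X.
h_true = [rng.standard_normal(S) for _ in range(K)]
X = np.linalg.multi_dot([factor(A[k], h_true[k], n) for k in range(K)])

# Slightly perturbed (estimated) parameters and the matrix they produce.
h_est = [h + 1e-3 * rng.standard_normal(S) for h in h_true]
X_est = np.linalg.multi_dot([factor(A[k], h_est[k], n) for k in range(K)])

# The stability property asks that the parameter error be bounded by a
# constant times the reconstruction error (the risk).
reconstruction_error = np.linalg.norm(X_est - X)
parameter_error = sum(np.linalg.norm(h_est[k] - h_true[k]) for k in range(K))
print(reconstruction_error, parameter_error)
```

For a convolutional linear network, as in the paper's example, each $A_k$ would instead map a short vector of kernel coefficients to a (structured) convolution matrix; the generic dense operators above stand in for that structure only to keep the sketch short.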

### Dates and versions

hal-01494267 , version 1 (23-03-2017)
hal-01494267 , version 2 (03-07-2018)
hal-01494267 , version 3 (22-10-2018)
hal-01494267 , version 4 (01-02-2023)

### Identifiers

• HAL Id : hal-01494267 , version 4

### Cite

François Malgouyres, Joseph Landsberg. Multilinear compressive sensing and an application to convolutional linear networks. SIAM Journal on Mathematics of Data Science, 2019, 1 (3), pp.446-475. ⟨hal-01494267v4⟩
