


Matrix Backpropagation - Code

Matrix Backpropagation for Training Deep Networks with Structured Layers

Catalin Ionescu, Orestis Vantzos and Cristian Sminchisescu


Abstract


Deep neural network architectures have recently produced excellent results in a variety of areas in artificial intelligence and visual recognition, well surpassing traditional shallow architectures trained using hand-designed features. The power of deep networks stems both from their ability to perform local computations followed by pointwise non-linearities over increasingly larger receptive fields, and from the simplicity and scalability of the gradient-descent training procedure based on backpropagation. An open problem is the inclusion of layers that perform global, structured matrix computations like segmentation (e.g. normalized cuts) or higher-order pooling (e.g. log-tangent space metrics defined over the manifold of symmetric positive definite matrices) while preserving the validity and efficiency of an end-to-end deep training framework. In this paper we propose a sound mathematical apparatus to formally integrate global structured computation into deep computation architectures. At the heart of our methodology is the development of the theory and practice of backpropagation that generalizes to the calculus of adjoint matrix variations. We perform segmentation experiments using the BSDS and MSCOCO benchmarks and demonstrate that deep networks relying on second-order pooling and normalized cuts layers, trained end-to-end using matrix backpropagation, outperform counterparts that do not take advantage of such global layers. 
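
To give a concrete feel for what such a structured layer and its gradient look like, below is a minimal MATLAB sketch, written for this page and separate from the released code, that backpropagates through second-order pooling followed by a matrix logarithm (one of the layers discussed in the paper). It uses the spectral form of the chain rule for symmetric matrix functions; the variable names, the regularizer eps_reg and the finite-difference check are illustrative assumptions only, not the paper's implementation.

% Minimal MATLAB sketch of matrix backpropagation through a second-order
% pooling layer followed by a matrix logarithm, Y = log(X'*X/n + eps*I).
% Illustration only (not the released code); placeholder names, distinct
% eigenvalues assumed, implicit expansion requires MATLAB R2016b or later.

n = 50; d = 8; eps_reg = 1e-3;
X = randn(n, d);                       % input features for one image/region
G = randn(d, d); G = (G + G') / 2;     % dL/dY, a symmetric upstream gradient

% Forward pass
C = X' * X / n + eps_reg * eye(d);     % second-order (covariance) pooling
C = (C + C') / 2;                      % enforce exact symmetry
[U, S] = eig(C); s = diag(S);          % C = U * diag(s) * U'
Y = U * diag(log(s)) * U';             % log-Euclidean map (tangent space)

% Backward pass: spectral (Daleckii-Krein) rule for f(C) = U * f(S) * U',
%   dL/dC = U * (K .* (U' * (dL/dY) * U)) * U'
% with K(i,j) = (f(s_i)-f(s_j))/(s_i-s_j) off the diagonal, f'(s_i) on it.
f  = log(s); fp = 1 ./ s;
K  = (f - f') ./ (s - s' + eye(d));    % off-diagonal divided differences
K(1:d+1:end) = fp;                     % diagonal: derivative of log
dLdC = U * (K .* (U' * G * U)) * U';
dLdX = X * (dLdC + dLdC') / n;         % backprop through C = X'*X/n

% Finite-difference sanity check on one entry of X, for L = sum(G .* Y)
h = 1e-6; E = zeros(n, d); E(1, 1) = h;
Cp = (X + E)' * (X + E) / n + eps_reg * eye(d); Cp = (Cp + Cp') / 2;
[Up, Sp] = eig(Cp); Yp = Up * diag(log(diag(Sp))) * Up';
fprintf('analytic %.6f vs numeric %.6f\n', dLdX(1, 1), sum(sum(G .* (Yp - Y))) / h);

The finite-difference comparison at the end is a convenient sanity check when implementing layers of this kind; for the versions actually used in the experiments, please refer to the released code below.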


References

C. Ionescu, O. Vantzos, C. Sminchisescu. Matrix Backpropagation for Deep Networks with Structured Layers.
IEEE International Conference on Computer Vision (ICCV), December 2015. pdf.

C. Ionescu, O. Vantzos, C. Sminchisescu. Training Deep Networks with Structured Layers by Matrix Backpropagation.
arXiv preprint arXiv:1509.07838, 2015. pdf. (Long version of the ICCV paper, with significantly more detail.)

 

Code and Results


MATLAB code is available to reproduce the results reported in the article.

Feel free to contact us with questions, suggestions, or bug reports.

Comparative results for the different methods presented in the paper are illustrated below on a set of example images.