Sparse approximation is a well-established theory with a profound impact on the fields of signal and image processing. In this talk we describe two special cases of the sparse representation model: convolutional sparse coding (CSC) and its multi-layered version (ML-CSC). We show that the projection of signals (a.k.a. pursuit) onto the ML-CSC model leads to various deep convolutional neural network architectures. This connection brings a fresh view of CNNs, as we are able to accompany these architectures with theoretical claims, such as the uniqueness of the representations throughout the network and their stable estimation, all guaranteed under simple local sparsity conditions. The take-home message of this talk is this: the ML-CSC model can serve as the theoretical foundation for deep learning.
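To make the pursuit-CNN connection concrete, here is a minimal sketch (not taken from the talk) of a layered soft-thresholding pursuit for ML-CSC. Each layer reduces to ReLU(D_i^T gamma_{i-1} - b_i), which has the same form as the forward pass of a CNN whose filters are the dictionary atoms. For simplicity, dense random matrices stand in for the convolutional dictionaries, and all names and values below are illustrative assumptions.

```python
import numpy as np

def relu(v):
    return np.maximum(v, 0.0)

def layered_thresholding_pursuit(x, dictionaries, biases):
    """Estimate the ML-CSC representations of a signal x by layered
    (non-negative) soft thresholding.  Each step computes
    gamma_i = ReLU(D_i^T @ gamma_{i-1} - b_i), mirroring the
    convolution + bias + ReLU forward pass of a CNN layer."""
    gamma = x
    representations = []
    for D, b in zip(dictionaries, biases):
        gamma = relu(D.T @ gamma - b)  # thresholding promotes sparsity
        representations.append(gamma)
    return representations

# Toy example with two hypothetical layers.
rng = np.random.default_rng(0)
D1 = rng.standard_normal((64, 128))   # stand-in for a convolutional dictionary
D2 = rng.standard_normal((128, 256))
x = rng.standard_normal(64)
gammas = layered_thresholding_pursuit(x, [D1, D2], [0.1, 0.1])
print([g.shape for g in gammas])      # representation sizes per layer
print([int((g > 0).sum()) for g in gammas])  # sparsity of each estimate
```

The sparsity of each estimated representation is exactly what the talk's local sparsity conditions reason about: when the true representations are sparse enough, their layered estimates are provably stable.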