The elusive generalization and easy optimization 2

Misha Belkin
University of California, San Diego (UCSD)

Generalization is the central topic of machine learning and data science. What patterns can be learned from observations, and how can we be sure that they extend to future, not yet seen, data? In this tutorial I will outline the arc of recent developments in the current understanding (or lack thereof) of generalization in machine learning. These developments were driven largely by empirical findings in neural networks, which necessitated revisiting the theoretical foundations of generalization. I will also discuss recent progress in understanding optimization by gradient descent and show why large non-convex systems are remarkably easy to optimize by local methods.
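The ease of optimizing large non-convex systems mentioned above can be illustrated with a minimal sketch: plain gradient descent on a small overparameterized two-layer network (a non-convex objective in its weights) driving the training loss down on random data. The data, network width, step size, and iteration count here are illustrative choices, not taken from the tutorial itself.

```python
import numpy as np

# Illustrative setup: far more parameters than training samples.
rng = np.random.default_rng(0)
n, d, width = 20, 5, 200
X = rng.normal(size=(n, d))
y = rng.normal(size=(n, 1))

# Two-layer tanh network; loss is non-convex in (W1, W2).
W1 = rng.normal(size=(d, width)) / np.sqrt(d)
W2 = rng.normal(size=(width, 1)) / np.sqrt(width)

def loss(W1, W2):
    H = np.tanh(X @ W1)
    return float(np.mean((H @ W2 - y) ** 2))

lr = 0.05
initial = loss(W1, W2)
for _ in range(3000):
    H = np.tanh(X @ W1)                      # forward pass
    r = (H @ W2 - y) / n                     # residual, scaled by 1/n
    gW2 = 2 * H.T @ r                        # gradient of MSE w.r.t. W2
    gW1 = 2 * X.T @ (r @ W2.T * (1 - H**2))  # gradient w.r.t. W1 (tanh' = 1 - tanh^2)
    W1 -= lr * gW1
    W2 -= lr * gW2

final = loss(W1, W2)
print(f"training loss: {initial:.4f} -> {final:.6f}")
```

Despite the non-convexity, plain gradient descent from a random initialization steadily reduces the training loss; overparameterization (width much larger than the number of samples) is what makes this local method so effective here.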
