Notes on the Neural Tangent Kernel

A beginner's guide. Jan 2023

To access the document, click [PDF].
The presentation slides are at [this link].

A 60-page report written to prepare a presentation for the Machine Learning II course offered at Bocconi University, and expanded by devoting time to the Visiting Student Initiative.

Abstract

The following document is an exploration of the results of “Neural Tangent Kernel: Convergence and Generalization in Neural Networks” by Jacot, Gabriel, and Hongler. It was written to better understand the content of the paper's claims. It is not an extension, but rather an expansion of some of the elements needed by a less experienced reader. As it was produced in fulfillment of a semester exam for a Machine Learning course, it does not cover all of the paper's content. The focus is on the formulation and properties of the Neural Tangent Kernel and on the theoretical results needed to understand them. The works cited largely follow those of the authors, with some additional resources that I found helpful. Given the breadth of the subject, some of the content is left for future study.
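For orientation, the central object can be stated compactly (a minimal sketch in the notation of Jacot et al., where $f(\cdot;\theta)$ denotes the network's realization function and $\theta \in \mathbb{R}^P$ its parameters):

$$\Theta(x, x') \;=\; \sum_{p=1}^{P} \partial_{\theta_p} f(x;\theta)\, \partial_{\theta_p} f(x';\theta) \;=\; \big\langle \nabla_\theta f(x;\theta),\, \nabla_\theta f(x';\theta) \big\rangle,$$

i.e. the kernel that governs the evolution of the network function during gradient training, and which the paper shows becomes deterministic and constant in the infinite-width limit.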
Section I discusses the peculiarities of the setting chosen by the authors. Sections II and III introduce the Neural Tangent Kernel and show the conditions under which it is well formulated. Section IV collects interesting interpretations of the consequences of the NTK regime, which are partly checked empirically in Section V. Section VI comments on the results and gives an overview of some of the contributions that stemmed from the original work. Appendix A provides more details on the mathematical objects used throughout the document. The image is a cutout of the wonderful cover of [CPW21].