David I. Spivak
In "Backprop as functor", the authors show that the fundamental elements of deep learning—gradient descent and backpropagation—can be conceptualized as a strong monoidal functor Para(Euc) → Learn from the category of parameterized Euclidean spaces to that of learners, a category developed explicitly to capture parameter update and backpropagation. It was soon realized that there is an isomorphism Learn ≅ Para(Slens), where Slens is the symmetric monoidal category of simple lenses as used in functional programming.
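To fix intuition, a simple lens A → B can be encoded as a pair of functions: a forward "get" map A → B and a backward "put" map A × B → A, and such lenses compose. The following is a minimal illustrative sketch (the `Lens` class and `compose` function are hypothetical names, not from the paper), not the paper's categorical definition:

```python
from dataclasses import dataclass
from typing import Callable

# A simple lens A -> B, encoded (as an assumption for illustration)
# as a forward map get: A -> B and a backward map put: A x B -> A.
@dataclass
class Lens:
    get: Callable[[float], float]
    put: Callable[[float, float], float]

def compose(l1: Lens, l2: Lens) -> Lens:
    """Sequential composition of lenses A -> B -> C: the forward
    pass chains the gets; the backward pass threads the request
    for C back through l2, then l1."""
    return Lens(
        get=lambda a: l2.get(l1.get(a)),
        put=lambda a, c: l1.put(a, l2.put(l1.get(a), c)),
    )

# Example: a "doubling" lens whose put halves the requested output.
double = Lens(get=lambda a: 2.0 * a, put=lambda a, b: b / 2.0)
quad = compose(double, double)
```

Here `quad.get(3.0)` is `12.0`, and `quad.put(3.0, 8.0)` is `2.0`: the request `8.0` is halved twice on the way back, mirroring how backpropagation threads requests through composed layers.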
In this note, we observe that Slens is a full subcategory of Poly, the category of polynomial functors in one variable, via the functor A ↦ Ay^A. Using the fact that (Poly, ⊗) is monoidal closed, we show that a map A → B in Para(Slens) has a natural interpretation in terms of dynamical systems (more precisely, generalized Moore machines) whose interface is the internal-hom type [Ay^A, By^B].
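A generalized Moore machine on a polynomial interface of the shape By^A amounts to a state set S, a readout map S → B, and an update map S × A → S. A minimal sketch, assuming that concrete encoding (the `Moore` class and its field names are illustrative, not the paper's notation):

```python
from dataclasses import dataclass
from typing import Callable

# A Moore machine with interface B y^A: states S, a readout
# S -> B, and a state update S x A -> S. Encoded here with
# float-valued states for concreteness (an assumption).
@dataclass
class Moore:
    state: float
    readout: Callable[[float], float]
    update: Callable[[float, float], float]

    def step(self, a: float) -> float:
        """Emit the current output, then transition on input a."""
        out = self.readout(self.state)
        self.state = self.update(self.state, a)
        return out

# Example: a running-sum machine (output = accumulated state).
acc = Moore(state=0.0, readout=lambda s: s, update=lambda s, a: s + a)
outputs = [acc.step(a) for a in [1.0, 2.0, 3.0]]
```

Running this, `outputs` is `[0.0, 1.0, 3.0]` and the final state is `6.0`: each step reads out before updating, the defining behavior of a Moore (as opposed to Mealy) machine.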
Finally, we review the fact that the category p-Coalg of dynamical systems on any p in Poly forms a topos, and consider the logical propositions that can be stated in its internal language. We give gradient descent as an example, and we conclude by discussing some directions for future work.
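Gradient descent itself can be viewed as such a dynamical system: the state is the parameter value, and each step updates it against the gradient of a loss. A small sketch under that reading (the loss f(p) = (p − 3)² and learning rate are chosen here for illustration; the paper states the example in the internal language of the topos):

```python
# Gradient descent as a dynamical system: the state is the
# parameter p, and the transition is p := p - lr * f'(p),
# here for the illustrative loss f(p) = (p - 3)^2.
def grad(p: float) -> float:
    """Derivative of (p - 3)^2."""
    return 2.0 * (p - 3.0)

p, lr = 0.0, 0.1
for _ in range(200):
    p -= lr * grad(p)
# After iterating, p has converged near the minimizer 3.0.
```

Each step is p ↦ 0.8·p + 0.6, a contraction with fixed point 3.0, so the state converges to the loss minimizer.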
ArXived at: https://dx.doi.org/10.4204/EPTCS.372.2