Perturbation confusion in forward automatic differentiation of higher-order functions

Automatic differentiation (AD) is a technique for augmenting computer programs to compute derivatives. The essence of AD in its forward accumulation mode is to attach perturbations to each number, and to propagate these through the computation by overloading the arithmetic operators. When derivatives are nested, the distinct derivative calculations, and their associated perturbations, must be distinguished. This is typically accomplished by creating a unique tag for each derivative calculation and tagging the perturbations. We exhibit a subtle bug, present in fielded implementations that support derivatives of higher-order functions, in which perturbations are confused despite the tagging machinery, leading to incorrect results. The essence of the bug is as follows: a unique tag is needed for each derivative calculation, but in existing implementations fresh tags are created when taking the derivative of a function at a point. When taking derivatives of higher-order functions, these need not correspond! We exhibit a simple example: a higher-order function f whose derivative at a point x, namely f′(x), is itself a function that calculates a derivative. This situation arises naturally when taking derivatives of curried functions. Two potential solutions are presented, and their deficiencies discussed. One uses eta expansion to delay the creation of fresh tags, putting them into one-to-one correspondence with derivative calculations. The other wraps the outputs of derivative operators with tag-substitution machinery. Both solutions seem very difficult to implement without violating the desirable complexity guarantees of forward AD.
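To make the tagging machinery concrete, here is a minimal sketch of tag-based forward AD in Python. All names (`Dual`, `d`, `_extract`) are hypothetical; this is an illustration of the general scheme the abstract describes, not the implementation studied in the paper, and it does not reproduce the paper's offending higher-order example. The `callable` branch of `_extract` shows the delayed extraction of function-valued results, while `d` creates its tag once per derivative-of-a-function-at-a-point, which is exactly the tag-creation timing the abstract identifies as the root of the confusion.

```python
import itertools

_fresh = itertools.count()  # source of globally unique tags (a gensym)


class Dual:
    """A number `primal` carrying a perturbation `tangent` labelled `tag`."""

    def __init__(self, tag, primal, tangent):
        self.tag, self.primal, self.tangent = tag, primal, tangent

    def __add__(self, other):
        return _add(self, other)

    def __radd__(self, other):
        return _add(other, self)

    def __mul__(self, other):
        return _mul(self, other)

    def __rmul__(self, other):
        return _mul(other, self)


def _tag(x):
    return x.tag if isinstance(x, Dual) else -1


def _lift(x, tag):
    # View x as a Dual at `tag`, with zero perturbation if it carries none.
    return x if _tag(x) == tag else Dual(tag, x, 0.0)


def _add(a, b):
    t = max(_tag(a), _tag(b))
    a, b = _lift(a, t), _lift(b, t)
    return Dual(t, a.primal + b.primal, a.tangent + b.tangent)


def _mul(a, b):
    t = max(_tag(a), _tag(b))
    a, b = _lift(a, t), _lift(b, t)
    return Dual(t, a.primal * b.primal,
                a.primal * b.tangent + a.tangent * b.primal)


def _extract(y, tag):
    """The coefficient of the `tag`-labelled perturbation in y."""
    if callable(y):
        # Function-valued result: delay extraction until application.
        return lambda *args: _extract(y(*args), tag)
    if isinstance(y, Dual):
        if y.tag == tag:
            return y.tangent
        # A perturbation belonging to some other derivative calculation:
        # extract inside both of its components.
        return Dual(y.tag, _extract(y.primal, tag), _extract(y.tangent, tag))
    return 0.0  # y does not depend on the tagged perturbation


def d(f, x):
    """f'(x): perturb x under a fresh tag, run f, extract that tangent."""
    tag = next(_fresh)  # one tag per derivative of a function at a point
    return _extract(f(Dual(tag, x, 1.0)), tag)


# Nested derivatives: d/dx [x * (d/dy (x + y) at y=1)] at x=1.
# Without tags the two perturbations would be conflated, giving 2;
# with tags the result is the correct 1.
print(d(lambda x: x * d(lambda y: x + y, 1.0), 1.0))  # 1.0

# A function-valued derivative: the derivative of x |-> (y |-> x*y) at
# x=2 is the function y |-> y; extraction is delayed per application.
g = d(lambda x: (lambda y: x * y), 2.0)
print(g(5.0))  # 5.0
```

The first proposed repair, eta expansion, can be sketched in the same vocabulary: rather than committing to a single tag when `d` is applied to a function-valued `f`, postpone the real work so that each application of the result performs its own derivative calculation under its own fresh tag. The sketch below (again hypothetical, under the same assumptions as above) makes the cost visible: `f` is re-evaluated on every application, which illustrates why the abstract says such repairs endanger the complexity guarantees of forward AD.

```python
def d_eta(f, x):
    """A sketch of the eta-expansion repair described in the abstract:
    when the value of f is itself a function, postpone extraction so that
    each application is a separate derivative calculation with a fresh tag."""
    tag = next(_fresh)
    y = f(Dual(tag, x, 1.0))  # probe; this tag is abandoned if y is a function
    if callable(y):
        # Eta-expand: re-derive on every application, under a fresh tag.
        return lambda *args: d_eta(lambda x2: f(x2)(*args), x)
    return _extract(y, tag)


g2 = d_eta(lambda x: (lambda y: x * y), 2.0)
print(g2(5.0))  # 5.0, with the tag created at application time
```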
