Perturbation confusion in forward automatic differentiation of higher-order functions

Abstract

Automatic differentiation (AD) is a technique for augmenting computer programs to compute derivatives. The essence of AD in its forward accumulation mode is to attach perturbations to each number, and propagate these through the computation by overloading the arithmetic operators. When derivatives are nested, the distinct derivative calculations, and their associated perturbations, must be distinguished. This is typically accomplished by creating a unique tag for each derivative calculation and tagging the perturbations. We exhibit a subtle bug, present in fielded implementations which support derivatives of higher-order functions, in which perturbations are confused despite the tagging machinery, leading to incorrect results. The essence of the bug is as follows: a unique tag is needed for each derivative calculation, but in existing implementations unique tags are created when taking the derivative of a function at a point. When taking derivatives of higher-order functions, these need not correspond! We exhibit a simple example: a higher-order function f whose derivative at a point x, namely f′(x), is itself a function which calculates a derivative. This situation arises naturally when taking derivatives of curried functions. Two potential solutions are presented, and their deficiencies discussed. One uses eta expansion to delay the creation of fresh tags in order to put them into one-to-one correspondence with derivative calculations. The other wraps outputs of derivative operators with tag substitution machinery. Both solutions seem very difficult to implement without violating the desirable complexity guarantees of forward AD.
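
The tagging machinery the abstract refers to is easy to sketch. The following is a minimal illustrative sketch in Haskell, not the code of any fielded implementation: dual numbers carry a perturbation labelled with the tag of the derivative calculation that introduced it, arithmetic is overloaded to propagate perturbations tag by tag, and tags are threaded explicitly (real systems generate a fresh tag at each invocation of the derivative operator). All names here (D, diffWith, tangent, and so on) are invented for the sketch.

```haskell
-- Tag-based forward AD with dual numbers, for illustration only.
type Tag = Int

data D = Lift Double     -- an ordinary number
       | Dual Tag D D    -- primal and perturbation, labelled with a tag
  deriving Show

-- Primal and tangent parts with respect to one particular tag; structure
-- belonging to other tags is preserved by extracting componentwise.
primal :: Tag -> D -> D
primal _ (Lift a) = Lift a
primal t (Dual t' p p')
  | t == t'   = p
  | otherwise = Dual t' (primal t p) (primal t p')

tangent :: Tag -> D -> D
tangent _ (Lift _) = Lift 0
tangent t (Dual t' p p')
  | t == t'   = p'
  | otherwise = Dual t' (tangent t p) (tangent t p')

instance Num D where
  fromInteger = Lift . fromInteger
  Lift a      + Lift b      = Lift (a + b)
  Dual t p p' + y           = Dual t (p + primal t y) (p' + tangent t y)
  x           + Dual t p p' = Dual t (primal t x + p) (tangent t x + p')
  Lift a      * Lift b      = Lift (a * b)
  Dual t p p' * y           = Dual t (p * primal t y)
                                     (p' * primal t y + p * tangent t y)
  x           * Dual t p p' = Dual t (primal t x * p)
                                     (tangent t x * p + primal t x * p')
  negate (Lift a)      = Lift (negate a)
  negate (Dual t p p') = Dual t (negate p) (negate p')
  abs    = error "abs: omitted from this sketch"
  signum = error "signum: omitted from this sketch"

-- Derivative of f at x, using tag t for this calculation. Distinct
-- (possibly nested) derivative calculations must be given distinct tags.
diffWith :: Tag -> (D -> D) -> D -> D
diffWith t f x = tangent t (f (Dual t x (Lift 1)))

-- The classic nested example: d/dx [ x * (d/dy (x + y))|_{y=1} ] at x = 1.
-- The correct answer is 1; if the two perturbations shared one tag, the
-- perturbation on x would leak into the inner calculation, yielding 2.
example :: D
example = diffWith 0 (\x -> x * diffWith 1 (\y -> x + y) 1) 1

main :: IO ()
main = print example   -- Lift 1.0
```

Re-running example with the same tag supplied to both calls of diffWith reproduces the classic confusion of nested perturbations, producing 2 instead of the correct 1.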

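The abstract's diagnosis, that fresh tags are created per derivative-of-a-function-at-a-point while derivative calculations need not be in one-to-one correspondence with such events, and its first proposed remedy, eta expansion, can also be shown schematically. The sketch below reuses D, Lift, Dual, and tangent from above and adds a mutable tag supply; TagSupply, fresh, diffHO, and diffHO' are hypothetical names, and this is a reconstruction of the shapes the abstract describes, not the paper's code.

```haskell
import Data.IORef (IORef, newIORef, atomicModifyIORef')
-- (reuses Tag, D, Lift, Dual, and tangent from the sketch above)

type TagSupply = IORef Tag

newSupply :: IO TagSupply
newSupply = newIORef 0

fresh :: TagSupply -> IO Tag
fresh s = atomicModifyIORef' s (\n -> (n + 1, n))

-- The shape the abstract attributes to existing implementations: one fresh
-- tag per derivative-of-a-function-at-a-point. For a function-valued f the
-- tangent extraction is deferred into the returned closure, so every later
-- application of that closure shares the single tag t: tag creation is no
-- longer in one-to-one correspondence with derivative calculations.
diffHO :: TagSupply -> (D -> D -> D) -> D -> IO (D -> D)
diffHO s f x = do
  t <- fresh s
  let g = f (Dual t x (Lift 1))      -- perturb the point once, up front
  return (\y -> tangent t (g y))     -- captured, shared tag

-- The eta-expansion remedy sketched in the abstract: delay tag creation
-- until the result is applied, restoring one tag per derivative calculation.
diffHO' :: TagSupply -> (D -> D -> D) -> D -> D -> IO D
diffHO' s f x y = do
  t <- fresh s
  return (tangent t (f (Dual t x (Lift 1)) y))
```

In diffHO the returned closure captures the single tag t, so every later application of f′(x), each a separate derivative calculation, shares it. The eta-expanded diffHO' restores the one-to-one correspondence but re-runs f at every application, one reason the abstract judges such fixes very difficult to implement without violating the desirable complexity guarantees of forward AD.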