Introduction to Differentiable Programming

Authors

Joe Wallwork

Niccolò Zanotti

Derivatives are at the heart of scientific programming. From the Jacobian matrices used to solve nonlinear systems to the gradient vectors used in optimisation methods, from the backpropagation operation in machine learning to the data assimilation methods used in weather forecasting, all of these techniques rely on derivative information. Differentiable programming, also known as automatic or algorithmic differentiation (AD), provides a suite of tools that let users compute derivatives of quantities in their code without having to derive and encode them by hand.
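To give a flavour of the idea, here is a minimal, illustrative sketch of forward-mode AD using dual numbers: each variable carries a derivative ("tangent") alongside its value, and arithmetic operations propagate both via the usual calculus rules. This is only a toy example for intuition; tools like Tapenade and pyadjoint use more sophisticated approaches (source transformation and operator overloading over whole programs, respectively).

```python
from dataclasses import dataclass


@dataclass
class Dual:
    """A value paired with its derivative (tangent)."""
    val: float  # primal value
    der: float  # derivative carried alongside

    def _coerce(self, other):
        # Promote plain numbers to constants (derivative zero)
        return other if isinstance(other, Dual) else Dual(float(other), 0.0)

    def __add__(self, other):
        other = self._coerce(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = self._coerce(other)
        # Product rule: (u v)' = u' v + u v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__


def f(x):
    # An ordinary function: f(x) = x^2 + 3x + 1, so f'(x) = 2x + 3
    return x * x + 3 * x + 1


# Seed the tangent with 1.0 to obtain df/dx at x = 2
y = f(Dual(2.0, 1.0))
print(y.val, y.der)  # → 11.0 7.0, i.e. f(2) = 11 and f'(2) = 7
```

Note that `f` itself contains no derivative code at all: differentiation happens automatically as the overloaded operations execute.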

In Session 1, we will learn about the history and mathematical background of differentiable programming and investigate “forward mode” using the Tapenade AD tool.

In Session 2, we will learn about adjoint methods and "reverse mode", investigate deploying reverse mode using pyadjoint (an operator-overloading AD framework for Python), and see some demonstrations of more advanced usage.

📓 Notebooks