Computing sensitivities is at the core of many compute-intensive calculations in finance today (XVA, FRTB SA-CVA, FRTB SBA, SIMM). These typically depend on hundreds or thousands of inputs, such as market parameters, risk factors, or correlation coefficients, which makes computing the sensitivities a challenging task. The traditional approach of bumping the inputs one by one and re-evaluating (finite differences) causes computational cost to grow linearly with the number of inputs and is often impractical. Algorithmic differentiation (AD) offers an efficient and robust alternative for computing sensitivities.
This video gives an introduction to algorithmic differentiation. It covers the theory for both forward (tangent-linear) and adjoint mode, and explains the concepts using simple examples. These principles are then extended to higher-order derivatives. A live demonstration walks through an adjoint mode implementation for a swap pricer.
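To make the two modes concrete, here is a minimal hand-written sketch (not taken from the video, and using a toy function rather than a swap pricer) of forward and adjoint mode AD for f(x, y) = y·exp(x) + sin(x). Forward mode propagates a derivative ("dot") alongside each intermediate value and yields the sensitivity to one chosen input per sweep; adjoint mode records the forward computation, then sweeps backwards propagating adjoints ("bars") and yields the sensitivities to all inputs in a single sweep:

```python
import math

def f_forward(x, y, x_dot, y_dot):
    """Forward (tangent-linear) mode: each intermediate carries a
    (value, derivative) pair. Seeding x_dot=1, y_dot=0 gives df/dx."""
    a = math.exp(x);  a_dot = math.exp(x) * x_dot
    b = y * a;        b_dot = y_dot * a + y * a_dot
    c = math.sin(x);  c_dot = math.cos(x) * x_dot
    v = b + c;        v_dot = b_dot + c_dot
    return v, v_dot

def f_adjoint(x, y):
    """Adjoint (reverse) mode: forward sweep stores intermediates,
    reverse sweep propagates adjoints from the output back to the
    inputs. One sweep yields df/dx AND df/dy."""
    # forward sweep
    a = math.exp(x)
    b = y * a
    c = math.sin(x)
    v = b + c
    # reverse sweep, seeded with v_bar = 1
    v_bar = 1.0
    b_bar = v_bar            # v = b + c
    c_bar = v_bar
    x_bar = c_bar * math.cos(x)   # c = sin(x)
    y_bar = b_bar * a             # b = y * a
    a_bar = b_bar * y
    x_bar += a_bar * math.exp(x)  # a = exp(x)
    return v, x_bar, y_bar
```

With hundreds of inputs, forward mode needs one sweep per input, whereas a single adjoint sweep delivers all sensitivities at roughly a constant multiple of the cost of the original evaluation, which is why adjoint mode is the method of choice for XVA-style calculations with many inputs and few outputs.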