AD Theory and Manual Implementation

This video gives an introduction to algorithmic differentiation: theory, manual implementation, and a live example.

Algorithmic Differentiation Video Series - Part 1

Computing sensitivities is at the core of many compute-intensive calculations in finance today (XVA, FRTB SA-CVA, FRTB SBA, SIMM). These typically depend on hundreds or thousands of inputs, such as market parameters, risk factors, or correlation coefficients, which makes computing the sensitivities a challenging task. The traditional approach of bumping the inputs one by one and re-evaluating (finite differences) requires one full re-evaluation per input, so the cost grows linearly with the number of inputs and quickly becomes impractical. Algorithmic differentiation (AD) offers an efficient and robust alternative for computing sensitivities.
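To make the cost of bump-and-revalue concrete, here is a minimal sketch in Python. The pricer `price` is a purely hypothetical stand-in (not from the video); the point is that the finite-difference loop needs one extra evaluation per input on top of the base evaluation.

```python
import math

# Hypothetical stand-in for a pricer that depends on n market inputs.
def price(inputs):
    return sum(math.sin(x) * math.exp(-0.1 * i) for i, x in enumerate(inputs))

def bump_and_revalue(f, inputs, h=1e-6):
    """Finite-difference sensitivities: n + 1 evaluations of f for n inputs."""
    base = f(inputs)
    grads = []
    for i in range(len(inputs)):
        bumped = list(inputs)
        bumped[i] += h          # bump one input at a time
        grads.append((f(bumped) - base) / h)
    return grads

sens = bump_and_revalue(price, [0.1 * k for k in range(1, 6)])
```

With 1,000 inputs this means 1,001 pricer calls, which is exactly the scaling problem adjoint AD avoids.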

This video gives an introduction to algorithmic differentiation. It covers the theory for both forward (tangent-linear) and adjoint mode, and explains the concepts using simple examples. These principles are then extended to higher-order derivatives. A live demonstration walks through an adjoint-mode implementation for a swap pricer.
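As a taste of the forward (tangent-linear) mode covered in the video, here is a minimal sketch using dual numbers in Python: each value carries its derivative (`dot`) alongside, and the chain rule is applied operation by operation. The function `f` and the seeding are illustrative assumptions, not the video's example.

```python
class Dual:
    """Forward-mode AD via dual numbers: propagate (value, derivative) pairs."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x, y):
    return x * x * y + 3 * y

# Seed dx = 1, dy = 0 to obtain df/dx at (x, y) = (2, 5)
out = f(Dual(2.0, 1.0), Dual(5.0, 0.0))
print(out.val, out.dot)  # value 35.0, df/dx = 2*x*y = 20.0
```

Note the limitation the video's adjoint mode addresses: each forward sweep yields the derivative with respect to one seeded input, so many inputs require many sweeps.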

Video Duration: 10:56

Topics covered:
  • Algorithmic differentiation theory
  • Forward mode (tangent-linear mode)
  • Adjoint mode
  • Higher order derivatives
  • Live demonstration on swap pricer code
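The adjoint mode from the list above can be sketched in a few lines of Python: a tape records every operation during the forward sweep, and a single reverse sweep then accumulates the sensitivities of the output with respect to all inputs at once. This is an illustrative toy, not the swap-pricer implementation shown in the video.

```python
tape = []  # records (output_node, [(input_node, local_derivative), ...])

class Var:
    """Adjoint-mode AD: taped forward sweep, then one reverse sweep."""
    def __init__(self, val):
        self.val = val
        self.adj = 0.0  # adjoint, accumulated during the reverse sweep

    def __add__(self, other):
        out = Var(self.val + other.val)
        tape.append((out, [(self, 1.0), (other, 1.0)]))
        return out

    def __mul__(self, other):
        out = Var(self.val * other.val)
        tape.append((out, [(self, other.val), (other, self.val)]))
        return out

def backward(out):
    out.adj = 1.0
    for node, parents in reversed(tape):   # reverse sweep over the tape
        for parent, local in parents:
            parent.adj += local * node.adj  # chain rule, accumulated

x, y = Var(2.0), Var(5.0)
z = x * x * y          # forward sweep: z = x^2 * y = 20
backward(z)            # reverse sweep
print(x.adj, y.adj)    # dz/dx = 2*x*y = 20.0, dz/dy = x^2 = 4.0
```

One reverse sweep delivers all input sensitivities, which is why adjoint mode is the natural fit for pricers with thousands of market inputs.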
