ManuelP96/Intro_AAD


Introduction to Algorithmic Adjoint Differentiation

In recent years, the computational cost of many mathematical operations in finance has grown steadily, driven by the increasing complexity of financial products and of the models behind their valuation. Moreover, classical techniques such as finite differences do not give exact values of derivatives, only approximations.

For that reason, when reading about Adjoint Differentiation and its application to finance (mainly through Savine's work), we became interested in its documented results in reducing the time and computational cost of calculating sensitivities.

The first objective was to understand the mathematics behind this technique and a first application (Dupire's Volatility Model). This objective was pursued within a study group at the Faculty of Economics of the Universidad del Rosario during the second semester of 2022; accordingly, this repository contains the material (videos and slides of the sessions) used and produced at that time.

After a brief exposition of the mathematical concepts, we introduce the term adjoint and its formula. We then give a simple calculus example of adjoints and their relation to the partial derivatives of a function.
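As a minimal illustration of this idea (a toy function chosen here, not taken from the repository), the adjoint of each intermediate variable is the derivative of the final output with respect to that variable, propagated backward through the chain rule:

```python
import math

# Toy example: y = a * b, where a = x1 + x2 and b = sin(x1).
x1, x2 = 1.0, 2.0

# Forward sweep: evaluate and store intermediate values.
a = x1 + x2          # a = 3.0
b = math.sin(x1)     # b = sin(1)
y = a * b

# Adjoint sweep: seed y_bar = dy/dy = 1, then apply the
# chain rule to each node in reverse evaluation order.
y_bar = 1.0
a_bar = y_bar * b                            # dy/da = b
b_bar = y_bar * a                            # dy/db = a
x1_bar = a_bar * 1.0 + b_bar * math.cos(x1)  # da/dx1 = 1, db/dx1 = cos(x1)
x2_bar = a_bar * 1.0                         # da/dx2 = 1

# The adjoints of the inputs are exactly the partial derivatives:
# x1_bar = dy/dx1 = sin(x1) + (x1 + x2)*cos(x1),  x2_bar = dy/dx2 = sin(x1)
```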

We translate the calculus-type definitions of AAD into matrix-type definitions for the Forward and Adjoint modes to understand how the algorithms should be developed, and give an example of computing the partial derivatives of a simple function in both Forward and Adjoint mode.
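The contrast between the two modes can be sketched as follows (an illustrative function of our own choosing, not the repository's example): forward mode propagates one tangent direction per sweep, while adjoint mode recovers all partial derivatives in a single reverse sweep.

```python
import math

# f(x1, x2) = x1*x2 + sin(x1)

def forward_mode(x1, x2, dx1, dx2):
    """Propagate tangents (dx1, dx2) alongside values:
    one sweep yields the directional derivative in ONE direction."""
    v1, dv1 = x1 * x2, dx1 * x2 + x1 * dx2
    v2, dv2 = math.sin(x1), math.cos(x1) * dx1
    return v1 + v2, dv1 + dv2

def adjoint_mode(x1, x2):
    """One forward sweep storing values, then one reverse sweep
    that yields ALL partial derivatives at once."""
    v1 = x1 * x2
    v2 = math.sin(x1)
    f = v1 + v2
    # Reverse sweep, seeded with f_bar = 1.
    f_bar = 1.0
    v1_bar = f_bar
    v2_bar = f_bar
    x1_bar = v1_bar * x2 + v2_bar * math.cos(x1)
    x2_bar = v1_bar * x1
    return f, (x1_bar, x2_bar)
```

For n inputs, forward mode needs n sweeps to build the full gradient; adjoint mode needs only one, which is the source of its efficiency for the many-inputs, one-output problems typical of sensitivity calculations.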

As a first simple example in finance, we implemented the AAD technique to compute Greeks in the Black-Scholes-Merton model for a European call option. The Python implementation is not optimized and simply follows the matrix-type algorithm.
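A compact sketch of that kind of computation (a hand-written adjoint sweep of our own, not the repository's code) prices the call in one forward pass and then recovers Delta and Vega together in one reverse pass:

```python
from math import log, sqrt, exp, erf, pi

def N(x):    # standard normal CDF
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def phi(x):  # standard normal density
    return exp(-0.5 * x * x) / sqrt(2.0 * pi)

def bsm_call_adjoint(S, K, r, sigma, T):
    """Return (price, delta, vega) of a European call via one
    forward sweep and one adjoint sweep (unoptimized sketch)."""
    # Forward sweep: break the BSM formula into elementary steps.
    v0 = log(S / K)
    v1 = (r + 0.5 * sigma**2) * T
    v2 = sigma * sqrt(T)
    d1 = (v0 + v1) / v2
    d2 = d1 - v2
    C = S * N(d1) - K * exp(-r * T) * N(d2)

    # Adjoint sweep: seed C_bar = 1, run the steps in reverse.
    C_bar = 1.0
    S_bar = C_bar * N(d1)
    d1_bar = C_bar * S * phi(d1)
    d2_bar = -C_bar * K * exp(-r * T) * phi(d2)
    d1_bar += d2_bar                    # d2 = d1 - v2
    v2_bar = -d2_bar
    v0_bar = d1_bar / v2                # d1 = (v0 + v1)/v2
    v1_bar = d1_bar / v2
    v2_bar += -d1_bar * (v0 + v1) / v2**2
    sigma_bar = v2_bar * sqrt(T)        # v2 = sigma*sqrt(T)
    sigma_bar += v1_bar * sigma * T     # v1 = (r + sigma^2/2)*T
    S_bar += v0_bar / S                 # v0 = log(S/K)

    return C, S_bar, sigma_bar          # price, Delta, Vega
```

The two adjoints agree with the closed-form Greeks, Delta = N(d1) and Vega = S·phi(d1)·sqrt(T), which makes this a convenient correctness check for the technique.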

Next, we review the paper Differential Machine Learning while exploring possible research directions using AAD. The main idea is that AAD is equivalent to backpropagation in machine-learning language, so Savine & Huge combined the Forward and Adjoint modes into a Twin Network. Here we note that TensorFlow 2, a Python library with a C++ core, already implements AAD through its GradientTape API.
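A minimal GradientTape example (assuming TensorFlow 2 is installed) shows the same record-then-replay pattern: the tape records the forward pass and replays it in adjoint mode to produce the gradient.

```python
import tensorflow as tf

x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x                 # forward pass is recorded on the tape
dy_dx = tape.gradient(y, x)   # adjoint sweep: dy/dx = 2x = 6.0
```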

We explain Dupire's volatility formula and review Savine's C++ implementation of this model.
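For reference, in its standard zero-dividend form, Dupire's formula recovers the local volatility surface from call prices $C(K,T)$:

```latex
\sigma^{2}(K,T) \;=\;
\frac{\dfrac{\partial C}{\partial T} + rK\,\dfrac{\partial C}{\partial K}}
     {\dfrac{1}{2}\,K^{2}\,\dfrac{\partial^{2} C}{\partial K^{2}}}
```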

Finally, we implement the Dupire model example in Python, which is useful for understanding how AAD reduces time and computational cost by recording calculations on a Tape.
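The essence of the tape mechanism can be illustrated with a miniature sketch (names and structure are illustrative, not Savine's design): each recorded node stores its parents and the local partial derivatives, so a single backward pass over the record yields every sensitivity.

```python
class Node:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs (parent_node, local_derivative)
        self.adjoint = 0.0

class Tape:
    def __init__(self):
        self.nodes = []          # records operations in evaluation order

    def var(self, value):
        n = Node(value)
        self.nodes.append(n)
        return n

    def add(self, a, b):
        n = Node(a.value + b.value, [(a, 1.0), (b, 1.0)])
        self.nodes.append(n)
        return n

    def mul(self, a, b):
        n = Node(a.value * b.value, [(a, b.value), (b, a.value)])
        self.nodes.append(n)
        return n

    def backward(self, output):
        # Adjoint sweep: traverse the tape in reverse recording order.
        output.adjoint = 1.0
        for n in reversed(self.nodes):
            for parent, d in n.parents:
                parent.adjoint += n.adjoint * d

# Usage: y = (x1 + x2) * x1 at (2, 3); one backward pass gives
# dy/dx1 = 2*x1 + x2 = 7 and dy/dx2 = x1 = 2 simultaneously.
tape = Tape()
x1, x2 = tape.var(2.0), tape.var(3.0)
y = tape.mul(tape.add(x1, x2), x1)
tape.backward(y)
```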

About

Material of AAD Study Group. (2022)
