Parallel sessions will normally consist of four talks, either mini-symposium or contributed.
Mini-courses (Monday 31 August, details TBC)
Prof. Mike Giles: An introduction to the use of adjoints in computational finance
AAD (Adjoint Algorithmic Differentiation, sometimes Adjoint Automatic Differentiation) is used extensively in computational finance for estimating sensitivities (Greeks), especially the sensitivity of a single option value to changes in a large number of input parameters (such as future interest rates or correlation coefficients). The underlying mathematics is the same as that of back-propagation in machine learning, which computes the sensitivity of the average mismatch with the training data with respect to all of the neural network coefficients.
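To give a flavour of the idea before the lectures, here is a minimal sketch (our own illustration, not material from the course) of why reverse/adjoint mode suits the many-inputs, one-output setting: a single forward pass plus a single reverse pass yields the sensitivity of the output to every input, at a cost independent of the number of inputs. The payoff function used here is purely hypothetical.

```python
import math

def value_and_greeks(theta, x):
    # Hypothetical scalar "value": V = exp(sum_i theta_i * x_i).
    # Forward pass: compute and record the intermediate s.
    s = sum(t * xi for t, xi in zip(theta, x))
    v = math.exp(s)
    # Reverse pass: propagate the adjoint v_bar = dV/dV = 1 backwards.
    s_bar = v                              # dV/ds = exp(s)
    greeks = [s_bar * xi for xi in x]      # dV/dtheta_i = exp(s) * x_i
    return v, greeks
```

Note that all the sensitivities come from one reverse sweep, whereas a finite-difference ("bumping") approach would need one re-evaluation per input parameter.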
This set of three lectures (each about 50 minutes long) will give an introduction to the subject for students and others, assuming no prior knowledge beyond a basic familiarity with Monte Carlo and finite difference methods in computational finance. Those interested solely in Monte Carlo simulation are welcome to skip the third lecture.
Lecture 1: the mathematical basics
- A simple example of matrix multiplication
- A black-box view -- forward and reverse mode
- Automatic differentiation
- Adjoints for linear algebra
- Fixed-point iteration
Lecture 2: Monte Carlo calculations
- Pathwise sensitivity analysis
- SDE approximation for European options
- Path-dependent options
- Multiple options
- Binning for expensive pre-computations
- Discontinuous payoffs
- Black-box assembly for multi-stage calculations
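A minimal sketch of pathwise sensitivity analysis, under assumptions of our choosing (geometric Brownian motion, a European call, illustrative parameters; none of this is taken from the lecture): since dS_T/dS_0 = S_T/S_0 under GBM, the pathwise estimator for delta averages e^{-rT} 1{S_T > K} S_T/S_0 over the simulated paths.

```python
import math, random

def pathwise_delta(S0, K, r, sigma, T, n_paths, seed=0):
    # Pathwise Monte Carlo estimator of delta for a European call under GBM.
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
        if ST > K:              # derivative of the payoff is the indicator 1{S_T > K}
            acc += ST / S0
    return math.exp(-r * T) * acc / n_paths
```

The indicator in the payoff derivative hints at the "discontinuous payoffs" bullet above: for digital options the pathwise derivative vanishes almost everywhere, and the plain estimator fails.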
Lecture 3: Finite difference methods
- Forward/backward Kolmogorov PDEs
- Use of adjoints for European option pricing (not sensitivities)
- Sensitivity calculations
- Calibration to European prices
- What can go wrong?
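A discrete analogue (our illustration, not the lecture's notation) of the forward/backward duality behind adjoint finite-difference methods: if discretised values evolve by v^{n+1} = M v^n, then any scalar output g . v^N equals lam^0 . v^0, where lam^n is propagated backwards by the transposed recursion lam^n = M^T lam^{n+1} starting from lam^N = g. One backward sweep therefore yields the sensitivity of the output to every entry of the initial data.

```python
def forward_step(M, v):
    # v_new[i] = sum_j M[i][j] * v[j]
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def adjoint_step(M, lam):
    # lam_old[j] = sum_i M[i][j] * lam[i]  (multiplication by M^T)
    return [sum(M[i][j] * lam[i] for i in range(len(M))) for j in range(len(M[0]))]

def output_and_sensitivities(M, v0, g, n_steps):
    v = list(v0)
    for _ in range(n_steps):
        v = forward_step(M, v)          # forward sweep: evolve the values
    lam = list(g)
    for _ in range(n_steps):
        lam = adjoint_step(M, lam)      # backward sweep: evolve the adjoints
    out = sum(gi * vi for gi, vi in zip(g, v))
    return out, lam                     # lam[j] = d(out)/d(v0[j])
```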
Timings and possible further courses to be confirmed.