
As advertised to Heriot-Watt University and University of Edinburgh students in

​

  • Computational Mathematical Finance

  • Financial Modelling and Optimisation

  • Operational Research

  • Statistics

  • Financial Mathematics

  • Quantitative Finance and Mathematics

  • Actuarial Science and Management

  • Quantitative Financial Risk Management

​

Please find below a list of the completed project placements.

Summer 2021 Placements

Hymans Robertson LLP - Analysis of Financial Model Risks

Supervisor:

Dr Mayukh Gayen, Risk and Modelling Consultant

​

One participating student from 

Heriot-Watt University

Financial models can be calibrated based on historical information, economic validation, and subjective judgement about the future. For a reduced-form statistical model, this information is gathered in statistical terms, and calibration aims at aligning the model to those target statistics.

 

In applications where quantitative information is available from data and little subjectivity is involved, these statistics are robust; on the other hand, a lack of data, or the availability or imposition of qualitative information, makes the calibration more subjective. The first approach is data driven, whereas for the latter we resort to Bayesian approaches. This project explores the model risk associated with choosing one calibration approach over the other.

 

Appreciating that different calibration approaches exist and can each be useful for certain applications, this project has two main aims:

  • Understanding the limitations (or not) of the data-driven (i.e., frequentist) approach to calibrating statistical financial models.

  • Exploring the usefulness (or not) of the Bayesian approach in producing more realistic calibrations.

 

The project should contain a literature review on the above context. 

 

The next step would be to explore Bayesian calibration approaches for a Stochastic Differential Equation (SDE) – the Vasicek interest rate model, say. 
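As an illustration of what such a Bayesian calibration might involve, the following is a minimal random-walk Metropolis sketch for the Vasicek model in Python. The data are synthetic, and the priors, step sizes and parameter values are illustrative assumptions rather than part of the project brief.

```python
import numpy as np

# Minimal random-walk Metropolis sketch for Bayesian calibration of the
# Vasicek model dr = kappa*(theta - r) dt + sigma dW, using its exact
# Gaussian transition density. Data, priors and tuning are illustrative.

rng = np.random.default_rng(0)

def simulate_vasicek(kappa, theta, sigma, r0, dt, n):
    """Generate a sample path from the exact discrete-time transition."""
    r = np.empty(n + 1)
    r[0] = r0
    a = np.exp(-kappa * dt)
    sd = sigma * np.sqrt((1 - a**2) / (2 * kappa))
    for i in range(n):
        r[i + 1] = theta + (r[i] - theta) * a + sd * rng.standard_normal()
    return r

def log_likelihood(params, r, dt):
    kappa, theta, sigma = params
    if kappa <= 0 or sigma <= 0:
        return -np.inf
    a = np.exp(-kappa * dt)
    var = sigma**2 * (1 - a**2) / (2 * kappa)
    mean = theta + (r[:-1] - theta) * a
    resid = r[1:] - mean
    return -0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

def log_prior(params):
    kappa, theta, sigma = params
    # Weakly informative priors (illustrative choices only).
    if kappa <= 0 or sigma <= 0:
        return -np.inf
    return -0.5 * (kappa / 5)**2 - 0.5 * (theta / 0.1)**2 - 0.5 * (sigma / 0.1)**2

# Synthetic data under "true" parameters; in the project this would be market data.
dt = 1 / 252
r_obs = simulate_vasicek(kappa=1.5, theta=0.03, sigma=0.02, r0=0.02, dt=dt, n=1000)

current = np.array([1.0, 0.02, 0.01])
current_lp = log_likelihood(current, r_obs, dt) + log_prior(current)
step = np.array([0.1, 0.002, 0.001])
samples = []
for _ in range(20000):
    proposal = current + step * rng.standard_normal(3)
    prop_lp = log_likelihood(proposal, r_obs, dt) + log_prior(proposal)
    if np.log(rng.uniform()) < prop_lp - current_lp:
        current, current_lp = proposal, prop_lp
    samples.append(current)

posterior = np.array(samples[5000:])  # discard burn-in
print("posterior means (kappa, theta, sigma):", posterior.mean(axis=0))
```

The same exact-transition likelihood can be maximised directly to obtain the frequentist (MLE) calibration, which is a natural baseline for the comparison the project asks for.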

 

A conclusion should be drawn regarding the flexibility, usefulness, and limitations of the Bayesian approach in comparison with the frequentist approach.

 

For Bayesian approaches relevant to this project, some initial references are:

  • Bayesian Estimation of Short-Rate Models 

  • Calibration of Interest Rates 

  • The use of Bayesian methods in financial research 

or anything similar which the student can explore themselves. 

 

For the interest rate model or data-driven calibration, the student can use any standard textbook on stochastic calculus, for example: Interest Rate Models - Theory and Practice: With Smile, Inflation and Credit (2006), Brigo and Mercurio.


Lloyds Banking Group - A comparison of Extended Kalman Filtering (EKF) and Unscented Kalman Filtering (UKF) applied to the estimation of hazard rates and default probabilities implied by the Credit Default Swap (CDS) market

Supervisor:

Dr Colin Burke, Head of Market Risk Model Approval

​

One participating student from 

University of Edinburgh

In measuring and managing counterparty credit risk, estimates of future credit exposure are required, and for financial products involving credit default swaps (CDS), models of hazard rates are typically used. Moreover, modelling of default processes in general often employs hazard rate modelling.

When examining the financial market, the evidence is that hazard rates are stochastic, and affine models of the hazard rate evolution are attractive because of their tractability. The class of affine models usually chosen has a strong correspondence with the so-called short-rate interest rate models. When using such models, model parameters such as mean reversion speeds and volatilities need to be estimated. The modeller is also confronted with a choice of which measure to use: the historical (“physical”) measure or that implied by the market (“risk-neutral”). The two measures are sometimes related by the market price of risk, and if that can be estimated, this can give further insight into the market’s pricing.

​

In an analogous manner to short rates and market quantities such as swap rates, hazard rates are not directly observable in the market but are implicit in observable CDS prices. The Kalman filter is an algorithm that, under certain conditions, notably linearity and Gaussianity of the process, provides an optimal way to extract the unobservable hazard rates. If the market price of risk is chosen in particular ways, the associated hazard rate stochastic process remains in the same class under the risk-neutral and physical measures. The Kalman filter combined with maximum likelihood estimation is therefore an attractive tool for extracting the hazard rate and estimating the parameters of the stochastic process.

​

Unfortunately, the assumption of linearity is invalid when applied to CDS prices, since the CDS pricing function is not linear in the hazard rate. The so-called Extended Kalman Filter (EKF) employs a Taylor series expansion in an attempt to deal with this. The Gaussian assumption, though, is more difficult to deal with: Gaussian-distributed hazard rates are unrealistic, as “negative” hazard rates can be generated, which have no meaning. Modellers have sometimes tried to overcome this by assuming non-Gaussian processes (typically non-central chi-squared) but still applying the EKF.

​

The Unscented Kalman Filter also assumes a Gaussian distribution but deals with non-linearity in a very different way: it employs a set of deterministically chosen sample points which can potentially capture the true mean and covariance to third order, compared with the EKF's first order.

The project is a mix of theoretical and computational work to develop and apply the EKF and UKF to historical CDS data, beginning with a deliberately unrealistic Gaussian/Vasicek process before moving on to a Cox-Ingersoll-Ross (CIR) process. The student needs a working knowledge of both stochastic processes. The industrial supervisor will provide detailed guidance on the estimation of the Gaussian model should the student require it.

​

The student should implement the Kalman filter in VBA/R/Python. The student should derive the linearisation of the CDS pricing function and then apply this to the filter. The industrial supervisor has already set up an EKF estimation of the CIR and Vasicek models using CDS data, and this can be made available should the student require additional help; alternatively, the industrial supervisor's model can be used by the student to benchmark their results. Finally, the student should review the Unscented Kalman Filter algorithm and implement it on the same data set as the EKF.
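For orientation, a minimal sketch of a scalar UKF predict/update step is given below. The state transition and the observation function h() are hypothetical stand-ins (in the project the observation function would be the CDS pricing formula), and all parameter values are illustrative.

```python
import numpy as np

# Sketch of a single Unscented Kalman Filter predict/update step for a scalar
# latent hazard rate. The transition here is an Euler step of a CIR-type model,
# and the observation function h() is a hypothetical stand-in for the
# (nonlinear) CDS pricing function; both would be replaced in the project.

def sigma_points(x, P, alpha=1.0, beta=2.0, kappa0=2.0):
    # Common scaling choices for a one-dimensional state.
    n = 1
    lam = alpha**2 * (n + kappa0) - n
    spread = np.sqrt((n + lam) * P)
    pts = np.array([x, x + spread, x - spread])
    wm = np.array([lam / (n + lam), 0.5 / (n + lam), 0.5 / (n + lam)])
    wc = wm.copy()
    wc[0] += 1 - alpha**2 + beta
    return pts, wm, wc

def ukf_step(x, P, y, f, h, Q, R):
    # Predict: push sigma points through the state transition f.
    pts, wm, wc = sigma_points(x, P)
    fx = f(pts)
    x_pred = np.dot(wm, fx)
    P_pred = np.dot(wc, (fx - x_pred)**2) + Q

    # Update: push predicted sigma points through the observation function h.
    pts2, wm2, wc2 = sigma_points(x_pred, P_pred)
    hy = h(pts2)
    y_pred = np.dot(wm2, hy)
    S = np.dot(wc2, (hy - y_pred)**2) + R             # innovation variance
    C = np.dot(wc2, (pts2 - x_pred) * (hy - y_pred))  # state-obs cross-covariance
    K = C / S                                         # Kalman gain
    x_new = x_pred + K * (y - y_pred)
    P_new = P_pred - K * S * K
    return x_new, P_new

# Illustrative transition (Euler step of CIR) and observation (toy nonlinear map).
dt, kappa, theta, sigma = 1 / 52, 0.8, 0.02, 0.1
f = lambda v: v + kappa * (theta - np.maximum(v, 0)) * dt
h = lambda v: 1.0 - np.exp(-5 * np.maximum(v, 0))   # hypothetical "CDS spread"
x, P = 0.02, 1e-4
x, P = ukf_step(x, P, y=0.11, f=f, h=h, Q=sigma**2 * max(x, 0) * dt, R=1e-6)
print(x, P)
```

In the full implementation this step would be run over the historical CDS time series, and the log-likelihood accumulated from the innovations would feed the maximum likelihood estimation of the model parameters.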

​

Project References:

‘Default Intensity and Expected Recovery of Japanese Banks and “Government”: New Evidence from the CDS Market’, Bank of Japan Working Paper Series, March 2006.

 

A demonstration of the impact of different indices on a financial portfolio can be considered.


Lloyds Banking Group - An investigation into the simulation and model parameterisation of stochastic volatility

Supervisor:

Dr David Saadaoui

​

One participating student from 

University of Edinburgh

In option pricing, it is standard practice to use models of stochastic volatility. Such models are normally used under the pricing measure. For risk management purposes, however, we are also interested in the historical (physical) measure.

​

This project will investigate how successful parameter estimation is for parametric stochastic volatility diffusion models.

​

The student will begin by building strong approximations to univariate diffusion models, starting with the Euler scheme before applying the Milstein scheme, and will then apply MLE scheme(s) to estimate the known parameters. The student will be expected to examine the properties of explicit and implicit schemes.
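As a starting point, a minimal sketch of such a strong-convergence comparison might look like the following; geometric Brownian motion is used purely because its exact solution is available, and all parameter values are illustrative.

```python
import numpy as np

# Strong-convergence comparison of the Euler and Milstein schemes for
# geometric Brownian motion dX = mu*X dt + sigma*X dW, whose exact solution
# is known, so the pathwise (strong) error at T can be measured directly.

rng = np.random.default_rng(1)
mu, sigma, x0, T = 0.05, 0.2, 1.0, 1.0

def strong_error(n_steps, n_paths=5000):
    dt = T / n_steps
    err_euler = err_milstein = 0.0
    for _ in range(n_paths):
        dW = np.sqrt(dt) * rng.standard_normal(n_steps)
        x_e = x_m = x0
        for dw in dW:
            x_e = x_e + mu * x_e * dt + sigma * x_e * dw
            x_m = (x_m + mu * x_m * dt + sigma * x_m * dw
                   + 0.5 * sigma**2 * x_m * (dw**2 - dt))
        x_exact = x0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * dW.sum())
        err_euler += abs(x_e - x_exact)
        err_milstein += abs(x_m - x_exact)
    return err_euler / n_paths, err_milstein / n_paths

for n in (16, 32, 64, 128):
    e, m = strong_error(n)
    print(f"steps={n:4d}  Euler error={e:.5f}  Milstein error={m:.5f}")
```

Halving the step size should roughly halve the Milstein error (strong order 1) but only reduce the Euler error by a factor of about the square root of two (strong order 1/2).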

​

The student will then review the literature on the so-called SABR model or the Heston model.

After this stage, stochastic volatility models will be simulated using the Euler scheme and, if time permits, the Milstein and other higher-order schemes. The student will then apply MLE to estimate the parameters of the known stochastic volatility model, and the effects of time discretisation in the simulated paths will be investigated in the context of the MLE results. Finally, the student will employ the phi-divergence measure to test the significance of the MLE results.
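If the Heston model is chosen, an Euler-type simulation might look roughly like the sketch below; the parameters are illustrative, a full-truncation fix is applied to the variance, and a log-Euler step is used for the asset price.

```python
import numpy as np

# Euler (full-truncation) simulation of the Heston stochastic volatility model
#   dS = mu*S dt + sqrt(v)*S dW1,   dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,
# with corr(dW1, dW2) = rho. Parameters are illustrative, not calibrated.

rng = np.random.default_rng(2)

def simulate_heston(s0=100.0, v0=0.04, mu=0.02, kappa=1.5, theta=0.04,
                    xi=0.5, rho=-0.7, T=1.0, n_steps=252, n_paths=10000):
    dt = T / n_steps
    s = np.full(n_paths, s0)
    v = np.full(n_paths, v0)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
        v_pos = np.maximum(v, 0.0)                 # full truncation of the variance
        s = s * np.exp((mu - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
        v = v + kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2
    return s, v

s_T, v_T = simulate_heston()
print("mean S_T:", s_T.mean(), "mean v_T:", v_T.mean())
```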

​

Although software packages exist that can perform the simulation and estimation automatically, it may add to the student's learning if they code the strong approximations themselves, for example in VBA, in addition to using standard software packages; similar remarks apply to the MLE.

 

Project References:

  • “A Comparative Study of Various Versions of the SABR Model Adapted to Negative Interest Rates”, Sahdrina I, Amsterdam School of Economics, 2017

  • “Numerical Solution of Stochastic Differential Equations”, Kloeden, P. and Platen, E., Springer, 2007

  • “Simulation and Inference for Stochastic Processes with YUIMA”, Iacus S and Yoshida N, Springer 2018
     

Predictiva - Volume Trading for Digital Currencies: Studying the Effect of Trading Using Volume-Based Methods in Unregulated Markets.

Supervisor:

Maysara Hammouda

​

One participating student from

Heriot Watt University

When it comes to trading the financial markets, traders are divided into many categories based on their trading methods. Short-term traders usually prefer using Technical Analysis (TA) because it gives them quick insights into what is happening in the markets. Long-term traders, on the other hand, typically prefer Fundamental Analysis (FA) because they are more concerned about the company’s wellbeing than the short-term price changes. Other traders try to get the best out of both worlds and use a mix of TA and FA strategies.

​

The problem with some unregulated markets, such as the digital currencies (cryptocurrencies) market, is that the assets’ prices can be easily manipulated, which affects the accuracy of the technical indicators. Additionally, these new assets do not usually have strong fundamentals, making it hard to judge the quality of the company’s work. 

​

Is there a way to analyse the changes in such unregulated markets before they translate into price changes, given that TA and FA are of limited use here?

​

We believe there is a way: the one thing which no one can manipulate, the volume!

​

Volume is one of the most critical factors and can give traders vital insights into how the big players are moving the market. Trading volume is especially compelling when dealing with assets with relatively low trading volume: any sizeable volume introduced can cause a considerable change in price, and vice versa. That is why researchers and traders are currently investigating volume-based trading algorithms.

 

The most prominent volume-based method is Volume Profile (VP), which is used to find the actual support and resistance areas based on trading volumes. The other method is Volume Spread Analysis (VSA), which uses the relationship between the price movement and the trading volume to decide on the next logical action. Other methods are also available, but these two have proved effective under many circumstances.
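As a rough illustration of the Volume Profile idea, the sketch below buckets traded volume by price level and reports the point of control. The column names and the synthetic candles are assumptions; real exchange data for a cryptocurrency pair would replace them.

```python
import numpy as np
import pandas as pd

# Minimal Volume Profile sketch: bucket traded volume by price level and
# locate the "point of control" (the price bin with the most volume).
# Column names ('close', 'volume') and the OHLCV source are assumptions.

def volume_profile(candles: pd.DataFrame, n_bins: int = 50) -> pd.Series:
    """Return total volume per price bin, indexed by bin mid-price."""
    prices = candles["close"].to_numpy()
    volumes = candles["volume"].to_numpy()
    hist, edges = np.histogram(prices, bins=n_bins, weights=volumes)
    mids = 0.5 * (edges[:-1] + edges[1:])
    return pd.Series(hist, index=mids, name="volume")

# Illustrative synthetic candles; in the project these would come from an exchange API.
rng = np.random.default_rng(3)
candles = pd.DataFrame({
    "close": 30000 + rng.standard_normal(5000).cumsum() * 20,
    "volume": rng.exponential(1.0, 5000),
})
profile = volume_profile(candles)
point_of_control = profile.idxmax()
print("Point of control (highest-volume price level):", round(point_of_control, 2))
```

High-volume price bins around the point of control are then read as candidate support and resistance areas, which is the signal a VP-based bot would act on.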

​

In this project, the requirements are:
1.    Researching these two methods (VP and VSA) and understanding why they might be useful in digital currency trading.
2.    Creating trading bots based on these two methods. These bots should be able to scan a large number of assets and decide on the most promising ones.
3.    Testing these trading bots using historical data to assess their effectiveness.
4.    Commenting on the results and proposing any modifications/improvements that can increase the effectiveness of these trading methods.
5.    (If time allows) Testing these methods in the live market.

 

Expected Outcomes:

  • A literature review of the most important papers and implementations of the Volume Profile (VP) and the Volume Spread Analysis (VSA) methods.

  • An analysis of why these methods work / do not work and the ways to improve them.

  • A trading agent that can scan the market, apply these methods, and decide on the most promising assets. The agent should then specify the expected targets for these assets.
 

Moody’s Analytics - Carbon Emissions and Climate Environment Based on the Chinese Economy and Climate Simulation in the DICE and RICE Framework

Supervisor:

Dr Jiajia Cui, Associate Director Research

​

One participating student from both

Heriot-Watt University &

University of Edinburgh

At Moody’s Analytics, we help our clients understand how the uncertainty of financial markets can impact their business. Climate change is an environmental issue on a global scale, but also an economic issue. It is generally believed that William Nordhaus was the first economist to use modern economic analysis methods to study climate change issues (Nordhaus, 1975, 1977). As a modern integrated climate and economic model, the DICE model establishes a comprehensive feedback model between the economy and the climate. As the world’s most populous country, China is also the world's fastest-growing major economy, the second-wealthiest nation in the world, and the world's largest manufacturer and exporter. As China’s impact on the world economy grows rapidly, so does its impact on the world’s climate.

 

The project will focus on the climate and economic impact of China in the DICE framework, identify an IAM (integrated assessment model) suitable for the Chinese market, and stress test the impact under different economic and policy scenarios.

 

The project will involve:
•    Review of academic literature on IAM (integrated assessment) models which are potentially suitable for the Chinese market.
•    Simulation of China's economic output, industrial carbon emissions and clean energy costs for the coming 100 years, starting from the year 2000 (a stylised sketch of such a feedback loop is given below).
•    Stress testing of the impact of policy (tax rate, abatement and social cost of carbon) and study of the impact on the global climate.
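To make the feedback structure concrete, here is a deliberately stylised, toy-scale sketch of a DICE-like loop between output, emissions, temperature and damages. Every parameter value is an illustrative placeholder rather than a calibrated DICE/RICE value; the project would replace them with values appropriate to the Chinese economy.

```python
import numpy as np

# Stylised sketch of a DICE-style feedback loop between output, emissions,
# temperature and damages. All parameter values below are illustrative
# placeholders, not calibrated DICE/RICE values.

years = np.arange(2000, 2101, 5)          # 5-year time steps over 100 years
K, L, A = 30.0, 1.3, 3.0                  # capital, labour (bn), productivity
cum_emissions, temperature = 0.0, 1.0     # cumulative CO2 (GtC) and warming (deg C)
gamma, delta, save_rate = 0.3, 0.3, 0.22  # output elasticity, depreciation, savings
sigma_e, mu = 0.35, 0.05                  # emission intensity, abatement fraction
tcre = 0.0018                             # warming per GtC (transient response)
damage_coef, abate_cost_coef = 0.0023, 0.02

for year in years:
    gross_output = A * K**gamma * L**(1 - gamma)           # Cobb-Douglas output
    emissions = sigma_e * (1 - mu) * gross_output           # industrial emissions
    cum_emissions += emissions * 5                           # accumulate over the step
    temperature = 1.0 + tcre * cum_emissions                 # simple carbon-temperature link
    damages = damage_coef * temperature**2                   # quadratic damage fraction
    abatement_cost = abate_cost_coef * mu**2.6               # convex abatement cost fraction
    net_output = gross_output * (1 - damages - abatement_cost)
    K = (1 - delta)**5 * K + save_rate * net_output * 5      # capital accumulation
    A *= 1.08                                                # exogenous productivity growth
    L *= 1.02                                                # population growth per step
    sigma_e *= 0.95                                          # decarbonisation trend
    print(f"{year}: output={net_output:7.1f}  emissions={emissions:5.1f}  T={temperature:4.2f}")
```

The stress tests in the project would then amount to re-running such a loop with different abatement fractions, carbon tax assumptions and social cost of carbon paths, and comparing the resulting emission and temperature trajectories.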

​

References*:

Nordhaus, William (October 2017). "DICE/RICE models - William Nordhaus - Yale Economics". Retrieved October 11, 2018
Nordhaus, William. "Original DICE and RICE models". Archived from the original on October 8, 2014. Retrieved February 19, 2014.

 

Moody’s Analytics - Numerical analysis of the CIR model

Supervisor:

Dr Grieg Smith

Associate Director Research

​

One participating student from

University of Edinburgh

At Moody’s Analytics, we help our clients understand how the uncertainty of financial markets can impact their business.  The project aims to improve a numerical method in our award-winning Economic Scenario Generator software.

​

Moody’s Economic Scenario Generator (SG) is a multi-asset class, multi-risk factor, multi-time step scenario generator which features a wide range of stochastic models used in financial modelling, including the Cox-Ingersoll-Ross (CIR) model. This model has several appealing features that make it widely used in financial mathematics: e.g., it is mean-reverting and non-negative. While a strong solution to this SDE is known to exist, an analytical form for the solution is unknown, and thus implementing such a model amounts to using numerical schemes to approximate the process on a discrete time grid; a notoriously hard problem for the CIR process. However, recent academic literature proposes novel approaches to simulating the CIR process. The purpose of the project is to review possible implementations of the CIR process and assess their suitability in scenario generation.
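By way of illustration, the sketch below contrasts two candidate schemes from that literature: the full-truncation Euler scheme and exact transition sampling via the noncentral chi-squared distribution. Parameter values are illustrative only.

```python
import numpy as np

# Two candidate schemes for simulating the CIR process
#   dX = kappa*(theta - X) dt + sigma*sqrt(X) dW:
# (i) the full-truncation Euler scheme (Lord, Koekkoek & Van Dijk, 2010) and
# (ii) exact transition sampling via the noncentral chi-squared distribution.

rng = np.random.default_rng(4)
kappa, theta, sigma, x0 = 1.0, 0.04, 0.2, 0.02
T, n_steps, n_paths = 1.0, 100, 100000
dt = T / n_steps

# (i) Full-truncation Euler: the state may go negative but is floored at zero
#     wherever it enters the drift and diffusion coefficients.
x = np.full(n_paths, x0)
for _ in range(n_steps):
    xp = np.maximum(x, 0.0)
    x = x + kappa * (theta - xp) * dt + sigma * np.sqrt(xp * dt) * rng.standard_normal(n_paths)
euler_paths = np.maximum(x, 0.0)

# (ii) Exact simulation: X_{t+dt} | X_t is a scaled noncentral chi-squared variable.
d = 4 * kappa * theta / sigma**2                        # degrees of freedom
c = 2 * kappa / (sigma**2 * (1 - np.exp(-kappa * dt)))  # scaling constant
y = np.full(n_paths, x0)
for _ in range(n_steps):
    nonc = 2 * c * y * np.exp(-kappa * dt)              # noncentrality parameter
    y = rng.noncentral_chisquare(d, nonc) / (2 * c)
exact_paths = y

print("Euler  mean/var:", euler_paths.mean(), euler_paths.var())
print("Exact  mean/var:", exact_paths.mean(), exact_paths.var())
print("Theory mean    :", theta + (x0 - theta) * np.exp(-kappa * T))
```

Comparing the simulated moments (and, more carefully, the strong error against the exact scheme) across step sizes gives the kind of convergence analysis the project asks for.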

 

The project will involve:
•    Review of academic literature on simulating the CIR process.
•    Implementing in Matlab (or another scripting language) the selected numerical schemes for the CIR process.
•    Analysis of convergence properties of implemented numerical schemes.

 

References*:

Alfonsi, A. (2015). Affine Diffusions and Related Processes: Simulation, Theory and Applications. Cham, Switzerland: Springer.
Lord, R., Koekkoek, R. and Van Dijk, D. (2010). A comparison of biased simulation schemes for stochastic volatility models. Quantitative Finance, 10(2), 177-194.
Shao, A. (2012). A fast and exact simulation for CIR process. PhD Dissertation, University of Florida.
Kahl, C. and Jäckel, P. (2006). Fast strong approximation Monte Carlo schemes for stochastic volatility models. Quantitative Finance, 6(6), 513-536. doi:10.1080/14697680600841108
Labbé, C., Rémillard, B. and Renaud, J.-F. A simple discretization scheme for nonnegative diffusion processes, with applications to option pricing. arXiv:1011.3247v1
Chassagneux, J.F., Jacquier, A. and Mihaylov, I. (2016). An explicit Euler scheme with strong rate of convergence for financial SDEs with non-Lipschitz coefficients. SIAM Journal on Financial Mathematics, 7(1), 993-1021.

 

More advanced reading
Hefter, M. and Herzwurm, A., 2018. Strong convergence rates for Cox–Ingersoll–Ross processes—full parameter range. Journal of Mathematical Analysis and Applications, 459(2), pp.1079-1101.
Hefter, M. and Jentzen, A., 2019. On arbitrarily slow convergence rates for strong numerical approximations of Cox–Ingersoll–Ross processes and squared Bessel processes. Finance and Stochastics, 23(1), pp.139-172.

 

Moody’s Analytics - Empirical Bayesian techniques for parameter estimation

Supervisor:

Nicholas Miller Smith

Associate Director Research

​

One participating student from

Heriot-Watt University

At Moody’s Analytics, we help our clients understand how the uncertainty of financial markets can impact their business. The project aims to improve parameter calibration in our award-winning Economic Scenario Generator (SG) software.

​

The SG is a multi-asset class, multi-risk factor, multi-time step scenario generator which features a wide range of stochastic models used in financial and macroeconomic modelling. These models are partly calibrated to metrics such as historical averages and volatilities. Target values of the metrics for individual economies typically reflect either the historical experience of that individual economy or an average of the experience across all relevant economies.

​

We would like to apply empirical Bayesian techniques to explore the optimum relative weight that should be placed on individual-economy experience and global experience in setting individual-economy targets. Challenges include the fact that historical time series are of varying length, and are typically both autocorrelated and cross-correlated with the experience of other economies.
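A minimal sketch of the normal-normal empirical Bayes shrinkage idea is given below. It ignores autocorrelation and cross-correlation and uses synthetic data with made-up economy names, so it should be read as a starting point rather than the method the project would ultimately use.

```python
import numpy as np
import pandas as pd

# Minimal empirical Bayes (normal-normal) sketch: shrink each economy's
# historical mean towards the cross-economy ("global") mean, with a weight
# driven by the within-economy sampling variance and a between-economy
# variance estimated from the data themselves. Series lengths may differ;
# autocorrelation and cross-correlation are ignored in this toy version.

def empirical_bayes_means(series_by_economy: dict[str, np.ndarray]) -> pd.DataFrame:
    names = list(series_by_economy)
    means = np.array([series_by_economy[k].mean() for k in names])
    # Sampling variance of each economy's mean (within-economy uncertainty).
    se2 = np.array([series_by_economy[k].var(ddof=1) / len(series_by_economy[k])
                    for k in names])
    global_mean = means.mean()
    # Method-of-moments estimate of the between-economy variance tau^2.
    tau2 = max(means.var(ddof=1) - se2.mean(), 0.0)
    weight_own = tau2 / (tau2 + se2)          # weight on own-economy experience
    shrunk = weight_own * means + (1 - weight_own) * global_mean
    return pd.DataFrame({"raw_mean": means, "weight_on_own": weight_own,
                         "shrunk_mean": shrunk}, index=names)

# Illustrative panel with unequal sample sizes; real data would replace this.
rng = np.random.default_rng(5)
panel = {
    "UK": rng.normal(0.05, 0.15, 60),
    "US": rng.normal(0.07, 0.16, 80),
    "JP": rng.normal(0.02, 0.18, 25),   # short history -> stronger shrinkage
    "DE": rng.normal(0.04, 0.14, 70),
}
print(empirical_bayes_means(panel))
```

Economies with short or noisy histories receive a lower weight on their own experience and are pulled more strongly towards the global average, which is exactly the trade-off the project is asked to quantify on real panel data.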

 

The project will involve:
•    Review of academic literature on applying empirical Bayesian techniques to economic panel data
•    Implementing in R (or another scripting language) an empirical Bayesian technique to an agreed international panel data set
•    Analysis of the out-of-sample performance of that technique.

 

References*:
Gu, J., & Koenker, R. (2017). Empirical Bayesball Remixed: Empirical Bayes Methods for Longitudinal Data. Journal of Applied Econometrics, 32, 575-599.
Ho, S., Lee, A. and Marsden, A. (2011). Use of Bayesian Estimates to determine the Volatility Parameter Input in the Black-Scholes and Binomial Option Pricing Models. Journal of Risk and Financial Management, 4, 74-96. doi:10.3390/jrfm4010074
Laura Liu & Hyungsik Roger Moon & Frank Schorfheide, 2020. "Panel Forecasts of Country-Level Covid-19 Infections," NBER Working Papers 27248, National Bureau of Economic Research, Inc.

 

Moody’s Analytics - Calibration of a stochastic MBS model

Supervisor:

Dr David Redfern

Associate Director Research

​

One participating student from

University of Edinburgh

At Moody’s Analytics, we help our clients understand how the uncertainty of financial markets can impact their business.  This project will involve working with our award-winning Economic Scenario Generator software.

​

Moody’s Economic Scenario Generator (SG) is a multi-asset class, multi-risk factor, multi-time step scenario generator which combines structural and reduced-form modelling approaches in order to capture real-world as well as risk-neutral dynamics.  One of the asset classes that we cover is mortgage-backed securities (MBS) which made headlines in 2008 as a contributing factor behind the global credit crunch.

​

The purpose of this project is to look at the calibration of the MBS model we have in the SG software.  Calibration involves setting the values of the model parameters (of which there are about 15) using optimisation or some other approach, so that the model produces realistic output.  Modelling an MBS requires modelling the behaviour of mortgage holders in terms of when they will refinance, so there are elements of behavioural finance in here too.
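In spirit, the calibration is an optimisation problem of the following form. The toy prepayment model, parameter names and target values below are purely illustrative stand-ins for the actual SG MBS model and its roughly 15 parameters.

```python
import numpy as np
from scipy.optimize import minimize

# Generic least-squares calibration sketch: choose model parameters so that
# model outputs match a set of target statistics. The "model" here is a toy
# prepayment-rate curve with two parameters; in the project the real MBS model
# and market-derived targets would take its place.

def model_prepayment_rate(params, rate_incentive):
    """Hypothetical S-shaped prepayment response to the refinancing incentive."""
    level, steepness = params
    return level / (1.0 + np.exp(-steepness * rate_incentive))

# Illustrative calibration targets (refinancing incentive, observed prepayment rate).
incentives = np.array([-0.02, -0.01, 0.0, 0.01, 0.02])
targets = np.array([0.02, 0.05, 0.12, 0.22, 0.28])

def objective(params):
    fitted = model_prepayment_rate(params, incentives)
    return np.sum((fitted - targets) ** 2)       # sum of squared calibration errors

result = minimize(objective, x0=np.array([0.2, 50.0]), method="Nelder-Mead")
print("calibrated parameters:", result.x, "objective:", result.fun)
```

With ~15 parameters the choice of objective, constraints and optimiser matters much more than in this toy example, which is where the alternative calibration approaches listed below come in.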

​

The project will involve:
•    Understanding the existing model (including the underlying 2-factor Black-Karasinski model for interest rates) and the type of real-world scenarios it produces;
•    Examining and documenting the existing calibration tool, which is written in Excel VBA;
•    Considering alternative calibration approaches that might use more market data (provided) or utilise more advanced optimisation techniques;
•    Generating real-world scenarios to validate the proposed calibrations.

 

References*:

A. Kalotay, D. Yang and F. J. Fabozzi, “An Option-Theoretic Prepayment Model for Mortgages and Mortgage-Backed Securities”, International Journal of Theoretical and Applied Finance, Vol. 07, No. 08, pp. 949-978 (2004)
 
