Research topics

ESR1: Detailed modelling of wrong-way risk (WWR) in CVA. Credit risk comprises not only default risk, but also migration risk, i.e. the risk that the creditworthiness of a given counterparty deteriorates, and recovery risk, i.e. the risk of not being able to recover the exposure at default in full. The XVA framework is the Basel III (and IFRS 9) response to this. Within the XVA setting, a cutting-edge topic is the joint modelling of the probability of default (PD) and the loss given default (LGD). Empirical studies have shown that the PD and the LGD tend to be comonotonic, giving rise to dangerous WWR, especially during crises. PD and LGD have often been modelled as independent random variables, thus underestimating the risk. If we also add the error due to the use of risk-neutral probability measures instead of real-world ones, it is easy to understand why the regulator requires more effective approaches to risk valuation. We will deal with the dependence between PD and LGD, and take into consideration the discrepancies between risk-neutral and physical measures. The basis is a special class of combinatorial processes, urn models: a large family of probabilistic models in which the probability of certain events is described in terms of sampling, replacing and adding balls in one or more boxes. Urn models make complex probabilistic mechanisms intuitive and concrete, while guaranteeing a level of abstraction that allows for general results. We will mainly work with Pólya-like urns, i.e. urns with reinforcement. Thanks to reinforcement, a Pólya-like urn naturally embeds a Bayesian/machine-learning component (data will play an important role here). The initial compositions will define the reinforced urn process (RUP) under the risk-neutral measure, and we will reinforce them using actual observations from financial datasets, thus introducing a physical-measure element.
The dependence between PD and LGD will be represented via different sampling schemes. The first results obtained are encouraging.
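As an illustration of the reinforcement mechanism, the following is a minimal sketch of a two-colour Pólya-like urn used as a PD model; the initial composition (5 "default" balls out of 100, i.e. a 5% risk-neutral PD) and the reinforcement size are purely illustrative, not parameters of the project.

```python
import random

def polya_urn_pd(red, green, reinforcement, n_draws, rng):
    """Two-colour Polya-like urn: draw a ball, replace it together with
    `reinforcement` extra balls of the same colour (red = default).
    Returns the empirical default frequency over the draws."""
    defaults = 0
    for _ in range(n_draws):
        if rng.random() < red / (red + green):
            defaults += 1
            red += reinforcement    # reinforce the 'default' colour
        else:
            green += reinforcement  # reinforce the 'survival' colour
    return defaults / n_draws

rng = random.Random(42)
# Initial composition encodes an (illustrative) 5% risk-neutral PD;
# reinforcing with observed draws shifts the composition over time.
pd_hat = polya_urn_pd(red=5, green=95, reinforcement=1, n_draws=10_000, rng=rng)
```

In the project's setting, the reinforcement would come from actual default observations rather than the urn's own draws, which is how the physical-measure information enters.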

ESR2: Managing portfolio XVA: MVA sensitivities and KVA under Q in P. In this research we manage XVA at portfolio level for interest rate derivatives. Moreover, we will hedge portfolio-level XVA and simulate a hedging strategy. The ISDA standard initial margin model (SIMM) uses sensitivities of financial derivatives to determine today’s initial margins. To compute the MVA, sensitivities along the simulated paths are required. A simple approach for estimating sensitivities with respect to a parameter is based on a finite-difference approximation, known as bump-and-revalue (BR). However, this scheme is not suitable for computing sensitivities along the scenarios, as required for MVA. In Oosterlee's group, the Monte Carlo based Stochastic Grid Bundling Method (SGBM), which partitions the state space at each time step and performs a local regression in each partition, has been developed for the valuation of high-dimensional options. It can be used for MVA sensitivity computation. The adjoint method is advantageous for calculating sensitivities of a small number of securities with respect to a large number of parameters, and fits well within SGBM. Furthermore, for quantities that are not traded, like KVA, the future state of the market should be modelled under a real-world measure (Q in P). Real-world models are calibrated to observed historical time series and are typically used to compute non-traded quantities. We thus need accurate and efficient calibration to financial data, and we will compute KVA in practical valuation. By using SGBM we aim to circumvent the problem of nested simulation in the hedging.
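For concreteness, the bump-and-revalue idea amounts to a central finite difference around the parameter of interest; the pricing function below is a toy stand-in (a linear payoff with known sensitivity), not an actual SIMM input.

```python
def bump_and_revalue(price_fn, theta, bump=1e-4):
    """Central finite-difference ('bump-and-revalue') sensitivity of a
    pricing function with respect to a single parameter theta."""
    return (price_fn(theta + bump) - price_fn(theta - bump)) / (2.0 * bump)

# Toy check: for the linear payoff V(S) = S - K with K = 100,
# the exact sensitivity (delta) with respect to S is 1.
delta = bump_and_revalue(lambda s: s - 100.0, theta=105.0)
```

The cost of BR grows with the number of parameters and with the number of scenario dates, which is why regression-based (SGBM) and adjoint methods are pursued for pathwise MVA sensitivities.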

ESR3: XVA in the context of PDE and hybrid modelling. Numerical methods in finance can be classified into three major approaches: Monte Carlo (MC) methods, PDEs, and numerical integration, each with particular pros and cons depending on the application. All three approaches have been successfully applied to estimate exposure profiles for CVA computation. When high dimensionality arises in XVA, a hybrid MC computational approach that emphasises dimension and variance reduction may be appealing and highly novel. This is especially important for XVA at portfolio level, involving many different derivatives. Although we initially focus on the PDE approach for low dimensions, a hybrid MC computational approach will be built upon a combination of MC and the other approaches, thus joining the main advantages of the individual methods. The hybrid approach will rest on a conditional MC technique, applied to a few dimensions (independent of the others), and on the development of highly efficient models, possibly in closed form, for the remaining dimensions. Since this leaves only a few factors to be simulated by MC, powerful dimension and variance reduction results. Recent advances in all numerical approaches can be integrated into this framework to achieve a highly efficient software toolbox for a comprehensive study of realistic XVA-related issues. We will also develop efficient techniques for the computation of XVA sensitivities, via adjoint algorithmic differentiation (AAD). Incorporation of financial data will facilitate dimension reduction.
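The conditional MC idea can be illustrated on a toy problem: to estimate E[max(Z1 + Z2, 0)] for independent standard normals, simulate Z1 only and integrate out Z2 in closed form, so that only one factor is sampled and the variance of the estimator drops. The problem and parameters below are illustrative.

```python
import math
import random

def std_norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def std_norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def conditional_mc(n_paths, rng):
    """Estimate E[max(Z1 + Z2, 0)], Z1, Z2 ~ N(0,1) independent, by
    simulating Z1 only and using the closed-form inner expectation
    E[max(a + Z2, 0)] = a * Phi(a) + phi(a)."""
    total = 0.0
    for _ in range(n_paths):
        a = rng.gauss(0.0, 1.0)                         # simulated factor
        total += a * std_norm_cdf(a) + std_norm_pdf(a)  # Z2 integrated out
    return total / n_paths

rng = random.Random(0)
estimate = conditional_mc(20_000, rng)
# Exact value is 1/sqrt(pi) ~ 0.5642, since Z1 + Z2 ~ N(0, 2).
```

In the XVA setting, the role of the closed-form inner expectation is played by the highly efficient (possibly analytic) models for the non-simulated dimensions.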

ESR4: XVA in a multicurrency setting. In a global economy, financial institutions operate in different currencies. In the context of XVA they can either fund or post collateral in different currencies. Recently, attention has been given to the extension of the different XVA adjustments from the single-currency to the multicurrency setting, starting from the more classical CVA. In this project we initially address multicurrency extensions to settings where funding spreads include cross-currency basis spreads to hedge the foreign exchange rate risk. Subsequently, collateral spreads are incorporated when collateral is posted in a foreign currency. We start by addressing classical vanilla options. When these cross-currency spreads are constant, the PDE modelling is similar to the single-currency case; however, stochastic cross-currency spreads add a new stochastic factor. The number of factors can be increased further by considering stochastic default intensities for the counterparties involved, thus increasing the dimensionality. Hybrid formulations will be explored when needed. Accurate calibration to relevant financial data in reasonable time is a challenge. Efficient numerical methods will be proposed and GPU computing techniques will be considered.
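As a minimal sketch of the constant-spread first stage, the rate used to discount a foreign-collateralised cash flow can be taken as the domestic rate plus a constant cross-currency basis; sign conventions for the basis vary across markets, and all numbers below are illustrative assumptions.

```python
import math

def discount_factor(rate, t):
    """Continuously compounded discount factor exp(-r * t)."""
    return math.exp(-rate * t)

def collateral_df(r_domestic, xccy_basis, t):
    """Discount factor when collateral is remunerated at the domestic rate
    plus a constant cross-currency basis (the constant-spread setting);
    a stochastic basis would add a further stochastic factor."""
    return discount_factor(r_domestic + xccy_basis, t)

df_dom = collateral_df(r_domestic=0.02, xccy_basis=0.0, t=5.0)     # domestic collateral
df_for = collateral_df(r_domestic=0.02, xccy_basis=-0.003, t=5.0)  # foreign collateral
```

With a constant basis the adjustment is a deterministic shift of the discount curve, which is why the PDE formulation stays close to the single-currency case.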

ESR5: Incremental CVA and collateralised VA (CollVA). Credit Valuation Adjustment (CVA) is often defined as an upfront payment added to a derivative. There is, however, a desire within the industry to reformulate the upfront CVA payment as a running, incremental CVA charge, or as a CVA spread. The running spread simplifies adding new trades with a client in a smart way, where existing trades may also be adjusted, so that the CVA charge is most beneficial for the client. Questions include, for example, how to change strike prices and other trade details in new and/or existing trades, so that the CVA charge is minimised. This relates to portfolio optimisation. The challenge is the never-ending trade-off between accuracy on the one hand and speed of computation on the other. By means of a hedge test we will justify the new CVA definition.
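One common way to recast an upfront CVA as a running spread is to divide by a risky annuity over the trade's payment dates; the schedule and conventions below are a simplified illustration, not the project's definition.

```python
def running_cva_spread(upfront_cva, accruals, discount_factors, survival_probs):
    """Convert an upfront CVA into an equivalent running spread by dividing
    by the risky annuity sum_i alpha_i * D(t_i) * Q(tau > t_i), evaluated
    over the payment dates t_i of the trade."""
    risky_annuity = sum(a * d * q for a, d, q in
                        zip(accruals, discount_factors, survival_probs))
    return upfront_cva / risky_annuity

# Toy schedule: four annual payments, mildly decreasing discounting/survival.
spread = running_cva_spread(
    upfront_cva=0.02,
    accruals=[1.0, 1.0, 1.0, 1.0],
    discount_factors=[0.98, 0.96, 0.94, 0.92],
    survival_probs=[0.99, 0.98, 0.97, 0.96],
)
# The upfront amount is spread over the risky annuity (roughly 54 bps here).
```

The incremental-CVA question is then how this spread changes when a new trade is added or an existing trade's details are amended, which is where the portfolio-optimisation angle enters.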

Secondly, in the context of CollVA, there is the notion of "cheapest to deliver" (CTD), referring to the type of collateral selected according to the lowest interest rates in the market. Here, too, we deal with portfolio optimisation techniques. Models for CTD have already been implemented in the case of deterministic governing interest rates. However, due to this deterministic modelling, the sensitivities related to the CTD decisions fluctuate strongly, implying that the cheapest-to-deliver asset may change on a daily basis, which may be undesirable in practice. We will focus on a stochastic interest rate model and a way to determine sensitivities efficiently, so that the impact of volatility enters the CTD decision. Machine learning and optimisation techniques will be incorporated.
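Following the selection convention above (lowest rate wins), the deterministic CTD decision reduces to a simple argmin over candidate collateral rates, which is precisely why small rate moves can flip the choice from day to day; the currencies and rates below are illustrative.

```python
def cheapest_to_deliver(collateral_rates):
    """Deterministic CTD rule: pick the collateral currency with the lowest
    rate (the selection convention described above). `collateral_rates`
    maps currency -> current collateral rate."""
    return min(collateral_rates, key=collateral_rates.get)

ctd = cheapest_to_deliver({"USD": 0.021, "EUR": 0.018, "GBP": 0.025})
# With these (illustrative) rates the CTD currency is "EUR"; a 40 bp rise
# in the EUR rate would flip the decision to USD, showing the instability.
```

Replacing the point-in-time rates with distributions from a stochastic rate model would let the volatility of each candidate enter the decision, as the project intends.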

ESR6: Unified model for XVA, including WWR, FTD and rating. Standard models to compute XVA describe the dynamics of the underlying (interest rates, FX, ...) and the evolution of the default times; however, a simplified approach is often used. For instance, default times of two counterparties are often modelled separately, with the first-to-default (FTD) impact ignored or approximated; wrong/right-way risk (WWR/RWR) is assumed to be negligible; and dependence relations between rating transitions and credit spreads are disregarded. Such simplifications may lead to erroneous results. The purpose of the project is to present a universal model to compute XVA that takes the above features into account in a unified way. Such a model would jointly describe default times, credit/funding spreads, rating transitions and the underlying (including credit), and will be implemented and tested in realistic market situations. The first part of the project will be devoted to reviewing FTD, WWR and joint spread/rating dynamics, leading to a selection of available models. These models will then be evaluated, providing an exhaustive list of pros and cons, taking into account financial behaviour, data calibration, robustness, implementation characteristics, risk management and model validation, and adaptability to possible evolutions. This evaluation will be performed by implementing the most promising models and comparing their outputs within a common test framework. The result will be a universal model whose behaviour is validated for a portfolio of trades under realistic market conditions.
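To illustrate why the FTD effect matters, one can couple two exponential default times through a one-factor Gaussian copula and estimate the first-to-default probability by simulation; modelling the two counterparties separately would miss the joint behaviour. The hazard rates, correlation and horizon below are illustrative assumptions, not calibrated values.

```python
import math
import random

def std_norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def first_to_default_prob(lam_c, lam_b, rho, horizon, n_paths, rng):
    """Estimate P(min(tau_C, tau_B) <= T) for exponential default times
    with hazard rates lam_c, lam_b, coupled through a one-factor Gaussian
    copula with loading rho on a common systemic factor."""
    hits = 0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)  # common systemic factor
        x_c = rho * z + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        x_b = rho * z + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        # Map correlated normals to exponential default times.
        tau_c = -math.log(1.0 - std_norm_cdf(x_c)) / lam_c
        tau_b = -math.log(1.0 - std_norm_cdf(x_b)) / lam_b
        if min(tau_c, tau_b) <= horizon:
            hits += 1
    return hits / n_paths

rng = random.Random(7)
p_ftd = first_to_default_prob(lam_c=0.02, lam_b=0.03, rho=0.5,
                              horizon=5.0, n_paths=20_000, rng=rng)
```

Under positive correlation the estimated FTD probability sits between the larger marginal default probability and the independence value, quantifying exactly the effect that the simplified separate-modelling approach ignores.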