Researcher at the Bank of England, focusing on the quantitative modelling of financial instability and extreme market risks.
- Modelling COVID-19 contagion: risk assessment and targeted mitigation policies
- Liquidity at Risk: joint stress testing of solvency and liquidity
- Network analysis of the UK reinsurance market
- Business closures and (re)openings in real-time using Google Places
- Fast Kernel Adaptive Metropolis-Hastings
- Random matrix theory and estimation of high-dimensional covariance matrices
Modelling COVID-19 contagion
Royal Society Open Science, 8: 201535
We use a spatial epidemic model with demographic and geographic heterogeneity to study the regional dynamics of COVID-19 across 133 regions in England.
Our model emphasises the role of variability of regional outcomes and heterogeneity across age groups and geographic locations, and provides a framework for assessing the impact of policies targeted towards sub-populations or regions. We define a concept of efficiency for comparative analysis of epidemic control policies and show targeted mitigation policies based on local monitoring to be more efficient than country-level or non-targeted measures.
In particular, our results emphasise the importance of shielding vulnerable sub-populations and show that targeted policies based on local monitoring can considerably lower fatality forecasts and, in many cases, prevent the emergence of second waves which may occur under centralised policies.
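As a minimal illustration of the modelling approach (with entirely hypothetical parameters, not the paper's calibration), a discrete-time SIR model with two regions coupled through a contact matrix already produces heterogeneous regional outcomes:

```python
import numpy as np

def simulate_sir(beta, gamma, mixing, s0, i0, steps=1000, dt=0.1):
    """Discrete-time SIR with regional mixing.

    beta:   per-region transmission rates
    gamma:  recovery rate
    mixing: row-stochastic contact matrix between regions
    """
    s, i = np.asarray(s0, float), np.asarray(i0, float)
    r = 1.0 - s - i
    for _ in range(steps):
        # Force of infection in a region depends on contacts with all regions.
        foi = beta * (mixing @ i)
        new_inf = dt * foi * s
        new_rec = dt * gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

# Two regions: region 0 has a higher transmission rate; weak cross-mixing.
mixing = np.array([[0.9, 0.1], [0.1, 0.9]])
s, i, r = simulate_sir(beta=np.array([0.4, 0.2]), gamma=0.1,
                       mixing=mixing, s0=[0.99, 0.99], i0=[0.01, 0.01])
print(r)  # the final attack rate is larger in the high-transmission region
```

In this toy setting, a targeted policy can be represented as a region-specific reduction of `beta` and compared against a uniform country-level reduction.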
Liquidity at Risk
Journal of Banking & Finance, 118: 105871
The traditional approach to the stress testing of financial institutions focuses on capital adequacy and solvency. Liquidity stress tests have been applied in parallel to and independently from solvency stress tests, often based on scenarios inconsistent with those used for solvency.
We propose a structural framework for the joint stress testing of solvency and liquidity: our approach exploits the mechanisms underlying the solvency-liquidity nexus to derive relations between solvency shocks and liquidity shocks. These relations are then used to model liquidity and solvency risk in a coherent framework, involving external shocks to solvency and endogenous liquidity shocks.
We define the concept of 'Liquidity at Risk', which quantifies the liquidity resources required for a financial institution facing a stress scenario. Finally, we show that the interaction of liquidity and solvency may lead to the amplification of equity losses due to funding costs which arise from liquidity needs.
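The solvency-liquidity nexus can be sketched in a toy example (all mechanics and numbers hypothetical, not the paper's model): a solvency shock triggers variation-margin calls, and a sufficiently severe shock additionally triggers a run-off of short-term funding:

```python
def liquidity_at_risk(equity, solvency_shock, short_term_funding,
                      variation_margin_rate=0.5, rollover_loss=0.3,
                      distress_threshold=0.5):
    """Toy mapping from a solvency shock to an endogenous liquidity need.

    Hypothetical mechanics: the shock triggers variation-margin calls
    proportional to the loss, and if post-shock equity falls below a
    distress threshold, a fraction of short-term funding fails to roll over.
    """
    margin_calls = variation_margin_rate * solvency_shock
    post_shock_equity = equity - solvency_shock
    funding_runoff = 0.0
    if post_shock_equity < distress_threshold * equity:
        funding_runoff = rollover_loss * short_term_funding
    return margin_calls + funding_runoff

# A mild shock generates only margin calls; a severe one adds a funding run-off.
print(liquidity_at_risk(equity=100, solvency_shock=20, short_term_funding=200))  # 10.0
print(liquidity_at_risk(equity=100, solvency_shock=60, short_term_funding=200))  # 90.0
```

The non-linearity at the distress threshold is what makes the liquidity need scenario-dependent rather than a fixed buffer.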
UK reinsurance network analysis
Bank of England Staff Working Paper No. 1000
We provide an empirical analysis of the network structure of the UK reinsurance sector based on 2016 Solvency II regulatory data. We examine counterparty credit risk originating from reinsurance contracts as a source of financial contagion in the insurance industry. The granularity of the Solvency II data provides a new opportunity for detailed analysis of the actual exposures in the system, detection of potential systemic vulnerabilities, and reinsurance spirals. In our multi-layered network approach, we incorporate information on reinsurance contract risk types and ownership structure for both life and non-life insurers.
Our findings suggest that the UK reinsurance sector exhibits the ‘small-world’ property with a scale-free, core-periphery structure and topological characteristics common to other financial networks. These characteristics of risk dispersion from the periphery to the core make the network ‘robust-yet-fragile’ to financial shocks. We explore the robustness of the network to adverse shocks through a stress-simulation exercise, where we find it robust both to system-wide shocks affecting the value of total investments and to idiosyncratic shocks applied to large, highly interconnected reinsurers.
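The ‘robust-yet-fragile’ property can be illustrated on a stylised core-periphery network (a synthetic toy with hypothetical exposures and buffers, not the Solvency II data):

```python
import random

# Toy core-periphery network: a densely connected core of large reinsurers
# and a sparse periphery of primary insurers, each ceding to two core firms.
random.seed(1)
n_core, n_periph = 5, 95
edges = set()
for i in range(n_core):                       # complete core
    for j in range(i + 1, n_core):
        edges.add((i, j))
for p in range(n_core, n_core + n_periph):    # periphery attaches to the core
    for c in random.sample(range(n_core), 2):
        edges.add((c, p))

neighbours = {k: set() for k in range(n_core + n_periph)}
for a, b in edges:
    neighbours[a].add(b)
    neighbours[b].add(a)

def cascade(seed_nodes, exposure=1.0, buffer=1.5):
    """A node defaults once counterparty losses on links to already
    defaulted nodes exceed its capital buffer."""
    defaulted, frontier = set(seed_nodes), list(seed_nodes)
    while frontier:
        nxt = []
        for node in frontier:
            for nb in neighbours[node]:
                if nb in defaulted:
                    continue
                hit = exposure * len(neighbours[nb] & defaulted)
                if hit > buffer:
                    defaulted.add(nb)
                    nxt.append(nb)
        frontier = nxt
    return len(defaulted)

print(cascade({0}))        # 1: a single core default is absorbed
print(cascade({0, 1, 2}))  # 100: a joint core shock sweeps the whole system
```

A single idiosyncratic default is absorbed by the buffers, while a correlated shock to the core propagates through the entire network, which is the robust-yet-fragile pattern in miniature.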
Business closures and (re)openings
Journal of Risk & Financial Management, 15(4): 183
We present a new estimation of business opening and closure rates using data from Google Places — the data set behind the Google Maps service. Our algorithm, through a bisection routine, counts the appearance and disappearance of 'pins' that represent unique businesses.
As a proof of concept, we compute business opening and closure rates for the city of Ottawa during the reopening phase of the COVID-19 pandemic in mid-2021. The lifting of restrictions coincides with a wave of re-entry of temporarily closed businesses, suggesting that government support may have facilitated the survival of hibernating businesses.
Our entry estimates are validated by a survey of new businesses. This methodology allows policymakers to monitor business dynamics in quasi-real-time during rapidly unfolding crises.
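The counting idea can be sketched as a quadtree refinement of the bisection routine (the result cap and the `search` stub are illustrative assumptions, not the Places API):

```python
import random

MAX_RESULTS = 60  # assumed cap on pins returned by a single map query

def count_pins(bbox, search):
    """Recursively split a bounding box until each cell is below the
    result cap, then take the union of the unique place IDs."""
    results = search(bbox)
    if len(results) < MAX_RESULTS:
        return set(results)
    x0, y0, x1, y1 = bbox
    xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
    quads = [(x0, y0, xm, ym), (xm, y0, x1, ym),
             (x0, ym, xm, y1), (xm, ym, x1, y1)]
    found = set()
    for q in quads:
        found |= count_pins(q, search)
    return found

# Fake city: 500 businesses at random coordinates with unique IDs.
random.seed(0)
places = {f"pin{i}": (random.random(), random.random()) for i in range(500)}

def fake_search(bbox):
    x0, y0, x1, y1 = bbox
    hits = [pid for pid, (x, y) in places.items()
            if x0 <= x < x1 and y0 <= y < y1]
    return hits[:MAX_RESULTS]  # dense cells are truncated, like a real API

print(len(count_pins((0, 0, 1, 1), fake_search)))  # 500: all pins recovered
```

Opening and closure rates then follow from set differences between the pins recovered in consecutive snapshots.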
Fast Kernel Adaptive Metropolis-Hastings
MSc dissertation · University of Oxford
We propose Fast Kernel Adaptive Metropolis-Hastings (F-KAMH), a gradient-free adaptive MCMC algorithm that is highly suitable for contexts such as Pseudo-Marginal MCMC.
Our procedure builds on the Kernel Adaptive Metropolis-Hastings (KAMH) sampler of Sejdinovic et al. (2014), which offers a novel approach to sampling from multivariate target distributions with non-linear dependencies between dimensions. KAMH maps the samples into a reproducing kernel Hilbert space, where the choice of proposal distribution is adaptively dictated by the estimated sample covariance in the feature space. This flexibility, however, comes at an increased computational cost.
In F-KAMH, we use a large-scale approximation of the kernel framework based on the random Fourier features of Rahimi and Recht (2007), which significantly reduces the algorithm's complexity. Moreover, our asymptotically exact procedure adapts to the local covariance structure of the target distribution using the entire chain history, in contrast to KAMH's suboptimal approach, which uses only a subsample of it. Consequently, the new sampler offers substantial improvements in effective sample size per unit of computation time. We support these claims with an experimental study on synthetic examples of highly non-linear target distributions.
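The random Fourier feature approximation underlying this speed-up can be sketched as follows (illustrative bandwidth and feature count, not the dissertation's settings):

```python
import numpy as np

rng = np.random.default_rng(0)

def rff(x, omega, b):
    """Random Fourier feature map for the Gaussian kernel
    (Rahimi & Recht): k(x, y) is approximated by z(x) @ z(y),
    with z(x) = sqrt(2/D) * cos(omega @ x + b)."""
    d = omega.shape[0]
    return np.sqrt(2.0 / d) * np.cos(omega @ x + b)

dim, n_features, sigma = 3, 10_000, 1.0
# Frequencies are drawn from the kernel's spectral density.
omega = rng.normal(scale=1.0 / sigma, size=(n_features, dim))
b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)

x, y = rng.normal(size=dim), rng.normal(size=dim)
exact = np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))
approx = rff(x, omega, b) @ rff(y, omega, b)
print(exact, approx)  # the finite-dimensional inner product tracks the kernel
```

Because the feature space is explicit and finite-dimensional, sample covariances can be maintained there directly, which is what drives the reduction in cost relative to working in the implicit RKHS.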
Random matrix theory
BSc dissertation · Imperial College London
This project aims to present significant results of random matrix theory relevant to principal component analysis, including Wigner's semicircle law and the Marchenko-Pastur law, which describe the limiting spectral distributions of large-dimensional random matrices. The work is set in the large-dimensional data regime, where both the number of variables and the sample size tend to infinity while their ratio tends to a finite limit.
Over the past decade, random matrix theory has been a fast-growing area of mathematics, driven by advances in technology and data-collection methods. As a tool for large-dimensional problems, it has found applications in many research areas, such as signal processing, network security, image processing, genetic statistics, stock-market analysis, and other problems in finance and economics.
The project states the key results that enable the construction of a low-dimensional factor model from large, noisy data, together with a general strategy for proving them. A significant portion of the proofs relies on the Stieltjes transform, a common tool for studying the convergence of spectral distributions of large matrices, which is also discussed. An algorithm suggested by El Karoui (2008) for estimating the true population covariance is presented.
Empirical verification of the main theorems shows a fast convergence rate for the Marchenko-Pastur law and a slower one for Wigner's semicircle law. The theory is also applied to real financial data based on the S&P 500 index, for which 12 principal components are identified over a 10-year horizon and 10 over a 5-year horizon.
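The empirical Marchenko-Pastur check can be sketched numerically (illustrative dimensions): the eigenvalues of a sample covariance of pure noise should fall inside the law's support, even though the true spectrum is a point mass at 1.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 200, 800                      # concentration ratio c = p / n = 0.25
X = rng.normal(size=(n, p))          # i.i.d. noise, true covariance = identity
S = X.T @ X / n                      # sample covariance matrix
eig = np.linalg.eigvalsh(S)

# Marchenko-Pastur support: [(1 - sqrt(c))^2, (1 + sqrt(c))^2].
c = p / n
lo, hi = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
print(eig.min(), eig.max(), (lo, hi))
```

Sample eigenvalues spread across the whole support rather than concentrating at 1; in factor estimation, eigenvalues escaping this noise bulk are the ones attributed to genuine principal components.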
I'm Artur Kotlicki, a researcher at the Bank of England.