
Abstracts Research Seminar Winter Term 2017/18

Efstathia Bura: Near-equivalence in Forecasting Accuracy of Linear Dimension Reduction Methods in Large Panels of Macro-variables

We compare the forecast accuracy of widely used linear estimators, such as ordinary least squares (OLS), dynamic factor models (DFMs), principal component regression (PCR), RIDGE regression, partial least squares (PLS) and sliced inverse regression (SIR), a sufficient dimension reduction (SDR) technique, using a large panel of potentially related macroeconomic variables. We find that (a) PCR, RIDGE regression, PLS and SIR exhibit near-equivalent forecasting accuracy overall, but (b) SIR appears to have superior targeting power, requiring only one or two linear combinations of the predictors, whereas PCR and PLS need substantially more than two components to achieve their minimum mean square forecast error. This empirical near-equivalence in forecast accuracy motivated the theoretical contributions that will be presented in this talk. We show that the most widely used linear dimension reduction methods solve closely related maximization problems, and all have closely related solutions that can be decomposed into “signal” and “scaling” components. We organize them under a common scheme that sheds light on their commonalities and differences as well as on their functionality. The competitive advantage of SIR, or of SDR in general, when dealing with large panels of macro variables is the dramatic reduction in the complexity of the forecasting problem, as it delivers the most parsimonious forecast model.
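To make the targeting claim concrete, here is a minimal, textbook-style sketch of SIR on simulated data (the data, dimensions and slice count are placeholder assumptions, not the macro panel or code of the talk): the directions are the leading eigenvectors of the between-slice covariance of X in the predictor-covariance metric, and the forecast uses only one or two such linear combinations.

```python
# Illustrative SIR implementation on simulated data; placeholders throughout.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n, p = 500, 30
X = rng.standard_normal((n, p))
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)  # single-index target

def sir_directions(X, y, n_slices=10, n_dirs=2):
    """Leading SIR directions: eigenvectors of Cov(E[X | y]) in the Sigma metric."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    sigma = Xc.T @ Xc / n                        # predictor covariance
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Xc[idx].mean(axis=0)                 # slice mean of centered X
        M += (len(idx) / n) * np.outer(m, m)     # between-slice covariance
    _, vecs = eigh(M, sigma)                     # generalized eigenproblem
    return vecs[:, ::-1][:, :n_dirs]             # directions of largest eigenvalues

B = sir_directions(X, y)    # forecast y from the one or two combinations X @ B
```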

Josef Teichmann: Machine Learning in Finance

We present several applications of machine learning techniques in finance and show some details of a calibration project. We also present several theoretical insights into why machine learning can make a difference in calibration, risk management and filtering.

Natesh Pillai: Bayesian Factor Models in High Dimensions    

Sparse Bayesian factor models are routinely implemented for parsimonious dependence modeling and dimensionality reduction in high-dimensional applications. We provide theoretical understanding of such Bayesian procedures in terms of posterior convergence rates for inferring high-dimensional covariance matrices where the dimension can be larger than the sample size. We will also discuss other high-dimensional shrinkage priors in the context of factor models.
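For intuition on the setting, the following purely illustrative simulation (hypothetical dimensions and a generic sparse loading matrix, not the priors of the talk) builds the factor-model covariance Omega = Lambda Lambda' + Psi with dimension exceeding sample size, where the raw sample covariance degenerates and shrinkage becomes essential.

```python
# Purely illustrative: a sparse factor-model covariance with p > n.
import numpy as np

rng = np.random.default_rng(1)
p, k, n = 200, 5, 100                          # dimension p exceeds sample size n
Lambda = rng.standard_normal((p, k)) * (rng.random((p, k)) < 0.2)  # sparse loadings
Psi = np.diag(rng.uniform(0.5, 1.5, p))        # idiosyncratic variances
Omega = Lambda @ Lambda.T + Psi                # true covariance: rank-k + diagonal

Z = rng.multivariate_normal(np.zeros(p), Omega, size=n)
S = np.cov(Z, rowvar=False)                    # sample covariance: singular here,
                                               # motivating sparse/shrinkage priors
```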

Johanna F. Ziegel: Elicitability and backtesting: Perspectives for banking regulation

Conditional forecasts of risk measures play an important role in the internal risk management of financial institutions as well as in regulatory capital calculations. In order to assess the forecasting performance of a risk measurement procedure, risk measure forecasts are compared to the realized financial losses over a period of time, and a statistical test of correctness of the procedure is conducted. This process is known as backtesting. Such traditional backtests are concerned with assessing some optimality property of a set of risk measure estimates. However, they are not suited to compare different risk estimation procedures. We investigate the proposal of comparative backtests, which are better suited for method comparisons on the basis of forecasting accuracy but necessitate an elicitable risk measure. We argue that supplementing traditional backtests with comparative backtests will enhance the existing trading book regulatory framework for banks by providing the correct incentive for accuracy of risk measure forecasts. In addition, the comparative backtesting framework could be used by banks internally as well as by researchers to guide the selection of forecasting methods. The discussion focuses on two risk measures, Value-at-Risk and expected shortfall, and is supported by a simulation study and data analysis.

Joint work with Natalia Nolde.
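To illustrate the distinction, a minimal sketch (a standard binomial exceedance test for the traditional backtest, and a Diebold-Mariano-type comparison under the pinball score that makes VaR elicitable; toy code, not the authors' implementation) could look as follows:

```python
# Toy sketch: traditional exceedance backtest vs. comparative backtest for VaR.
import numpy as np
from scipy import stats

def traditional_backtest(losses, var, alpha=0.99):
    """Binomial test: is the exceedance frequency consistent with 1 - alpha?"""
    exceedances = int((losses > var).sum())
    return stats.binomtest(exceedances, len(losses), 1 - alpha).pvalue

def quantile_score(var, losses, alpha=0.99):
    """Pinball loss, a consistent scoring function for the alpha-quantile (VaR)."""
    u = losses - var
    return u * (alpha - (u < 0))

def comparative_backtest(losses, var_a, var_b, alpha=0.99):
    """Diebold-Mariano-type t-statistic on score differences; t < 0 favours A."""
    d = quantile_score(var_a, losses, alpha) - quantile_score(var_b, losses, alpha)
    return d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
```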

Vladimir Veliov: Regularity and approximations of generalized equations; applications in optimal control

We begin by recalling the notions of strong metric regularity and strong (Hölder) sub-regularity of set-valued mappings, x ⇉ F(x), between subsets of Banach spaces, together with some applications to optimal control theory, where the mapping F is associated with the first-order optimality system. These applications require a standard coercivity condition. We then focus on optimal control problems for control-affine systems, where coercivity fails. The analysis of the stability of the solutions of such problems requires an enhancement of the existing general metric regularity theory. We do this by introducing “strong bi-metric regularity” (SbiMR) and prove a version of the important Ljusternik-Graves theorem for SbiMR mappings. We then return to the affine optimal control problems and present applications to numerical methods. We focus on two issues: (i) a Newton-type method and the pertaining convergence analysis; (ii) a discretization scheme of higher-order accuracy than the Euler scheme. In the case of affine problems, the investigation of each of these issues is technically rather different from that in the coercive case, especially for the higher-order discretization.

The talk is based on joint works with J. Preininger and T. Scarinci.
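For readers unfamiliar with the terminology, the standard notion the talk starts from can be sketched as below (in the sense of Dontchev and Rockafellar; notation is generic, not from the paper). The bi-metric variant introduced in the talk refines this, loosely speaking, by working with two different metrics on the image space.

```latex
% Sketch of strong metric regularity; generic notation, not the paper's.
A set-valued map $F : X \rightrightarrows Y$ is \emph{strongly metrically
regular} at $\bar{x}$ for $\bar{y} \in F(\bar{x})$ if there exist
neighbourhoods $U \ni \bar{x}$, $V \ni \bar{y}$ and a constant
$\kappa \ge 0$ such that the localization $y \mapsto F^{-1}(y) \cap U$
is single-valued on $V$ and Lipschitz continuous there with constant
$\kappa$.
```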

Bernd Bischl: Model-Based Optimization for Expensive Black-Box Problems and Hyperparameter Optimization

The talk will cover the main components of sequential model-based optimization algorithms, e.g., surrogate regression models like Gaussian processes or random forests, the initialization phase and point acquisition. Algorithms of this kind represent the state of the art for expensive black-box optimization problems and are becoming increasingly popular for hyperparameter optimization of machine learning algorithms, especially on larger data sets. In a second part, I will cover some recent extensions with regard to parallel point acquisition, multi-criteria optimization and multi-fidelity systems for subsampled data. Most of the covered applications will use support vector machines as examples for hyperparameter optimization. The talk will finish with a brief overview of open questions and challenges.
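A minimal sketch of one such iteration (a Gaussian-process surrogate plus an expected-improvement acquisition on a toy objective; illustrative only, not any particular toolbox's implementation):

```python
# One iteration of sequential model-based optimization: fit a GP surrogate,
# then propose the next evaluation point by expected improvement. The
# objective is a toy stand-in for an expensive black box (e.g., a
# cross-validated learner under given hyperparameters).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

f = lambda x: np.sin(3 * x) + x ** 2            # toy "expensive" objective
X = np.array([[-1.0], [0.0], [1.5]])            # initial design
y = f(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)

def expected_improvement(candidates, gp, y_best):
    mu, sd = gp.predict(candidates, return_std=True)
    z = (y_best - mu) / np.maximum(sd, 1e-12)   # minimization convention
    return (y_best - mu) * norm.cdf(z) + sd * norm.pdf(z)

grid = np.linspace(-2.0, 2.0, 401).reshape(-1, 1)
x_next = grid[np.argmax(expected_improvement(grid, gp, y.min()))]
```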

Stefan Weber: Pricing of Cyber Insurance Contracts in a Network Model

We develop a novel approach for pricing cyber insurance contracts. The considered cyber threats, such as viruses and worms, diffuse in a structured data network. The spread of the cyber infection is modeled by an interacting Markov chain. Conditional on the underlying infection, the occurrence and size of claims are described by a marked point process. We introduce and analyze a new polynomial approximation of claims together with a mean-field approach that allows us to compute aggregate expected losses and prices of cyber insurance. Numerical case studies demonstrate the impact of the network topology and indicate that higher-order approximations are indispensable for the analysis of non-linear claims.

This is joint work with Matthias Fahrenwaldt and Kerstin Weske.
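To give a flavour of the infection dynamics only (the rates, topology and time discretization below are illustrative assumptions, not the paper's model, and claims and pricing are omitted), a toy SIS-type interacting Markov chain on a random network can be simulated as follows:

```python
# Toy SIS-type infection spreading on a random network; all parameters
# are illustrative assumptions, not the paper's calibration.
import numpy as np

rng = np.random.default_rng(2)
n, beta, delta, dt, steps = 50, 0.3, 0.2, 0.1, 200
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.triu(A, 1); A = A + A.T                 # undirected adjacency matrix
x = np.zeros(n); x[0] = 1.0                    # node 0 initially infected

for _ in range(steps):
    pressure = A @ x                           # number of infected neighbours
    p_inf = 1.0 - np.exp(-beta * pressure * dt)   # infection probabilities
    p_rec = 1.0 - np.exp(-delta * dt)             # recovery probability
    u = rng.random(n)
    new_inf = (x == 0) & (u < p_inf)
    recovered = (x == 1) & (u < p_rec)
    x = np.where(new_inf, 1.0, np.where(recovered, 0.0, x))

print("infected share:", x.mean())             # input to expected-loss estimates
```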

Eric Finn Schaanning: Measuring systemic risk: The Indirect Contagion Index

The rapid liquidation of a portfolio can generate substantial mark-to-market losses for market participants who have overlapping portfolios with the distressed institution. In a model of deleveraging, we introduce the notion of liquidity-weighted overlaps to quantify these indirect exposures across portfolios. We apply our methodology to analyse indirect contagion in the European Banking Authority’s 2016 stress test. Key questions that we study are: Which asset classes are the most important channels for price-mediated contagion? How can we quantify the degree of interconnectedness of systemically important institutions? Given institutional portfolio holdings, are the stress scenarios that we consider the “right” ones?
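As a purely hypothetical illustration of the idea (the paper's exact definition may differ), an overlap measure weighted by asset-level price impact could be computed from a holdings matrix:

```python
# Hypothetical sketch: overlaps between bank portfolios weighted by asset
# illiquidity (price impact per unit sold). Numbers and the exact formula
# are illustrative assumptions; see the paper for the actual definition.
import numpy as np

H = np.array([[100.0, 50.0,  0.0],     # holdings matrix: banks x assets
              [ 80.0,  0.0, 40.0],
              [  0.0, 60.0, 30.0]])
impact = np.array([1e-4, 5e-4, 2e-4])  # price impact per unit liquidated

# overlap[i, j]: mark-to-market loss of bank i per unit of bank j's
# portfolio liquidated through commonly held assets
overlap = H @ np.diag(impact) @ H.T
```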

Daniel Rösch: Systematic Effects among LGDs and their Implications on Downturn Estimation

Banks are obliged to provide downturn estimates of loss given default (LGD) in their internal ratings-based approach. While there seems to be a consensus that downturn conditions reflect times in which LGDs are systematically higher, it is unclear which factors best capture these conditions. As LGDs depend on recovery payments which are collected during the varying economic conditions of the resolution process, it is challenging to identify economic variables that capture the systematic impact on LGDs. The aim of this paper is to reveal the nature of systematic effects among LGDs using a Bayesian finite mixture model. Our results show that the systematic patterns of LGDs in the US and Europe strongly deviate from the economic cycle. This calls into question the use of economic variables for downturn modeling and leads to the development of a new method for generating downturn estimates. In comparison to other approaches, our proposal is conservative enough during downturn conditions while avoiding over-conservatism.
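As a stand-in illustration of the mixture idea (simulated bimodal LGDs and a two-component Gaussian mixture fitted by EM, rather than the paper's Bayesian finite mixture and real recovery data):

```python
# Illustrative only: separate a "benign" from a "downturn" LGD regime.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
lgd = np.concatenate([rng.beta(2, 6, 800),     # benign regime: low LGDs
                      rng.beta(6, 2, 200)])    # downturn regime: high LGDs

gm = GaussianMixture(n_components=2, random_state=0).fit(lgd.reshape(-1, 1))
downturn = int(gm.means_.argmax())
print(gm.weights_[downturn], gm.means_[downturn, 0])  # downturn weight and mean
```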