Abstracts Research Seminar Winter Term 2014/15

Nikolaus Hautsch: Estimating the Spot Covariation of Asset Prices – Statistical Theory and Empirical Evidence

We propose a new type of estimator for the spot covariance matrix of a multi-dimensional semi-martingale log asset price process that is subject to noise. The estimator is constructed as a local average of block-wise constant spot covariance estimates. The latter originate from the local method of moments (LMM) proposed by Bibinger et al. (2014), which builds on locally constant approximations of the underlying process. We extend the LMM estimator to allow for autocorrelated noise and propose a consistent estimator of the order of serial dependence. We prove consistency and asymptotic normality of the proposed spot covariance estimator and show that it inherits the near rate-optimality of the underlying LMM approach. Based on extensive simulations, we provide empirical guidance on the optimal implementation of the estimator and apply it to high-frequency data for a cross-section of NASDAQ blue-chip stocks. Estimating spot covariances, correlations and betas in normal as well as extreme-event periods yields novel insights into intraday covariance and correlation dynamics. We show that intraday (co-)variations (i) follow underlying periodicity patterns, (ii) reveal substantial intraday variability associated with (co-)variation risk, (iii) are strongly serially correlated, and (iv) can increase strongly and nearly instantaneously when new information arrives.
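The core idea of local averaging can be illustrated with a toy sketch. The code below is not the LMM estimator of the abstract (it ignores microstructure noise and autocorrelation entirely); it only shows the block structure: compute block-wise realized covariance matrices of log returns in a window around the target time and average them. All names and parameters are illustrative choices.

```python
import numpy as np

def spot_covariance(log_prices, t_idx, block_len=50, n_blocks=10):
    """Toy spot covariance estimate at time index t_idx:
    average of block-wise realized covariance matrices computed
    from log returns in a local window around t_idx.
    (Illustrative only; unlike LMM, this ignores market
    microstructure noise and serial dependence.)"""
    returns = np.diff(log_prices, axis=0)            # (n-1, d) log returns
    half = block_len * n_blocks // 2
    lo = max(t_idx - half, 0)
    hi = min(lo + block_len * n_blocks, len(returns))
    window = returns[lo:hi]
    blocks = [window[i:i + block_len]
              for i in range(0, len(window) - block_len + 1, block_len)]
    # realized covariance per block, rescaled per observation
    covs = [b.T @ b / block_len for b in blocks]
    return np.mean(covs, axis=0)

# usage: two simulated log-price paths with correlated increments
rng = np.random.default_rng(0)
z = rng.standard_normal((2000, 2))
z[:, 1] = 0.8 * z[:, 0] + np.sqrt(1 - 0.8**2) * z[:, 1]
prices = np.cumsum(0.01 * z, axis=0)
cov = spot_covariance(prices, t_idx=1000)
```

Averaging over blocks trades off bias (the covariance is only locally constant) against variance, which is the tuning question the abstract's simulation study addresses for the actual LMM-based estimator.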

Hansjörg Albrecher: Insurance risk and the cost of capital

The development of rules for the determination of premiums under solvency capital requirements is a classical topic in insurance. In recent years the cost-of-capital method for the determination of risk margins has been advocated, together with a particular suggestion for the size of the cost-of-capital rate. In this talk a framework will be developed that considers the viewpoints of regulators, investors and policyholders simultaneously, leading to a quantitative approach towards interpreting and justifying the size of such a rate. Some practical implications of this approach are discussed in the context of Solvency II.
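For context, the standard Solvency II risk margin is the cost-of-capital rate (currently prescribed at 6%) applied to the sum of discounted projected solvency capital requirements (SCRs). A minimal sketch of that standard formula, with illustrative numbers, is below; the talk's contribution is precisely a framework for justifying the size of the rate itself, which this sketch takes as given.

```python
def risk_margin(scr_projection, coc_rate=0.06, discount_rate=0.02):
    """Solvency II-style cost-of-capital risk margin:
    CoC rate times the sum of discounted projected SCRs.
    scr_projection[t] is the SCR for year t (t = 0, 1, ...),
    discounted back from the end of year t+1."""
    return coc_rate * sum(scr / (1 + discount_rate) ** (t + 1)
                          for t, scr in enumerate(scr_projection))

# usage: a run-off portfolio with declining capital requirements
rm = risk_margin([100.0, 80.0, 50.0, 20.0])
```
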

Michaela Szölgyenyi: Dividend maximization under regime switching and incomplete information

De Finetti proposed to use discounted future dividend payments as a valuation principle for a homogeneous insurance portfolio. The concept can be extended to value whole (insurance) companies: the value of the company is the maximal expected discounted dividends that can be paid out over its lifetime. We follow this approach to solve the valuation problem of an insurance company. Extending classical contributions, we study this optimization problem in a framework that allows for shifts of the economic state, which is reasonable given the usually long observation horizon. Furthermore, it is assumed that the current economic phase cannot be observed directly. Specifically, we model the surplus of the insurance company as a diffusion process with an unobservable drift parameter that may shift. This results in a joint filtering and stochastic optimization problem. After applying filtering theory to overcome the uncertainty, we characterize the solution of the optimization problem as the unique viscosity solution of the associated Hamilton-Jacobi-Bellman equation. A numerical treatment of the problem leads to dividend strategies of threshold type, which raises the question of the admissibility of such strategies. We finally discuss this issue and its solution in some detail.

Co-authors: Gunther Leobacher (Johannes Kepler University Linz), Stefan Thonhauser (Graz University of Technology)
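A threshold (barrier) dividend strategy of the type found numerically above can be illustrated by simulation. The sketch below is a deliberately simplified single-regime version with a fully observable constant drift; the talk's actual model has an unobservable, regime-switching drift handled via filtering. All parameter values are illustrative.

```python
import numpy as np

def simulate_threshold_dividends(x0=5.0, mu=1.0, sigma=2.0, barrier=8.0,
                                 r=0.05, T=50.0, dt=0.01, seed=0):
    """Simulate the surplus dX = mu dt + sigma dW on an Euler grid;
    whenever the surplus exceeds the barrier, the excess is paid out
    as a dividend. Returns the discounted cumulative dividends until
    ruin (X <= 0) or the horizon T. (Toy single-regime sketch.)"""
    rng = np.random.default_rng(seed)
    x, t, discounted = x0, 0.0, 0.0
    while t < T and x > 0:
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        if x > barrier:                      # threshold strategy: skim excess
            discounted += np.exp(-r * t) * (x - barrier)
            x = barrier
        t += dt
    return discounted

# usage: one path's discounted dividend payout under the barrier rule
val = simulate_threshold_dividends()
```

Averaging such simulated payouts over many paths approximates the value of a given barrier; optimizing over the barrier level mimics, very crudely, what the HJB-based analysis does exactly.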

Nicolas Turenne: Relation and entity extraction from full texts, and their use in an end-user platform: the case of the VESPA epidemiosurveillance platform

Relation extraction with accurate precision remains a challenge when processing full-text databases. We propose an approach driven by two facets. The first facet is surface analytical extraction with structural and co-occurrence heuristics. The second facet is usage: a pool of users limits the exploration of search hypotheses and assesses the usefulness of the extraction. The extraction tool, called x.ent, is developed in R and is available here: http://cran.r-project.org/web/packages/x.ent/index.html
Usage is oriented towards plant-disease exploration through agricultural news, and a platform built with users is available here:

Harry Zheng: Utility-Risk Portfolio Selection

In this talk we discuss a utility-risk portfolio selection problem. By considering the first-order condition for the objective function, we derive a primitive static problem, called the Nonlinear Moment Problem, subject to a set of constraints involving nonlinear functions of “mean-field terms”, which completely characterizes the optimal terminal wealth. Under a mild assumption on the utility, we establish the existence of optimal solutions for both the utility-downside-risk and the utility-strictly-convex-risk problems; positive answers to both existence questions have long been missing in the literature. In particular, the existence result for the utility-downside-risk problem contrasts with the mean-downside-risk problem considered in Jin et al. (2005), for which non-existence of an optimal solution was proved; we can recover the same non-existence result via the corresponding Nonlinear Moment Problem. This is joint work with K.C. Wong (University of Hong Kong) and S.C.P. Yam (Chinese University of Hong Kong).

Sam Cohen: Ergodic BSDEs with Lévy noise and time dependence

In many control situations, particularly over the very long term, it is sensible to consider the ergodic value of some payoff. In this talk, we shall see how this can be studied in a weak formulation, using the theory of ergodic BSDEs. In particular, we shall consider the case where the underlying stochastic system is infinite dimensional, has Lévy-type jumps, and is not autonomous. We shall also see how this type of equation naturally arises in the valuation of a power plant.


Workshop of the Institute for Statistics and Mathematics on Operations Research and Stochastics in Economics and Business


Risk measures for multivariate random variables will be considered. These play a role in illiquid markets (e.g. markets with transaction costs or taxes) and for networks of banks. The latter application is important for measuring systemic risk and has gained a lot of attention in research since the financial crisis. The value of the risk measure is then a set: the collection of all vectors of capital (in different currencies, or of the different banks) that make the random vector acceptable. In the talk I will present properties, results, and numerical procedures for set-valued risk measures. Time consistency in the dynamic case will be of particular importance and leads to a set-valued Bellman principle. The theory has many applications, among them applications to one-dimensional risk measures acting on random vectors. Examples, including price bounds in markets with transaction costs and systemic risk, will be discussed.
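Why the value of such a risk measure is a set (rather than a number) can be seen in a toy two-currency example with bid-ask exchange rates. The sketch below uses an illustrative worst-case acceptance rule of my own choosing, not any specific risk measure from the talk: a capital vector (c1, c2) is acceptable if, after adding it, every scenario can be liquidated into currency 1 at a nonnegative value.

```python
import numpy as np

def acceptable_capital_grid(payoffs, bid, ask, grid):
    """Toy set-valued risk measure for a bivariate payoff (positions in
    two currencies) under bid-ask exchange rates bid < ask. Returns the
    capital vectors (c1, c2) on a grid such that, after adding capital,
    every scenario liquidates into currency 1 at a nonnegative value.
    (Illustrative worst-case acceptance rule, not a specific measure
    from the talk.)"""
    acceptable = []
    for c1 in grid:
        for c2 in grid:
            x1 = payoffs[:, 0] + c1
            x2 = payoffs[:, 1] + c2
            # liquidate the currency-2 position at the unfavorable rate:
            # sell long positions at bid, cover short positions at ask
            total = x1 + np.where(x2 >= 0, bid * x2, ask * x2)
            if np.all(total >= 0):
                acceptable.append((c1, c2))
    return acceptable

# usage: two scenarios, two currencies; several incomparable capital
# vectors make the position acceptable, hence the value is a set
payoffs = np.array([[-1.0, 2.0], [1.0, -2.0]])
acc = acceptable_capital_grid(payoffs, bid=0.9, ask=1.1,
                              grid=[0.0, 1.0, 2.0, 3.0])
```

Here both (2, 0) and (0, 2) are acceptable while (0, 0) is not, so no single scalar summarizes the required capital: the answer is the whole upper set of acceptable vectors.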


After the recent financial crisis, the monitoring and management of systemic risk has emerged as one of the most important concerns for regulators, governments and market participants. In this talk, we propose two complementary approaches to modeling systemic risk.

In the first approach we model the propagation of balance-sheet or cash-flow insolvency across financial institutions as a cascade process on a network representing their mutual exposures. We derive rigorous asymptotic results for the magnitude of contagion in a large financial network and give an analytical expression for the asymptotic fraction of defaults in terms of network characteristics. We also introduce a criterion for the resilience of a large financial network to an initial shock that can be used as a tool for monitoring systemic risk.
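The finite-network cascade mechanism underlying such asymptotics can be sketched in a few lines. The following is a minimal zero-recovery contagion model of my own construction (not the talk's model, which works with asymptotics of large random networks): a bank defaults once its losses from defaulted counterparties exhaust its capital buffer, and defaults propagate until a fixed point is reached.

```python
def cascade(exposures, capital, initially_defaulted):
    """Default cascade on a network of interbank exposures.
    exposures[i][j] = amount bank i is owed by bank j (i's exposure
    to j), capital[i] = i's capital buffer. Zero recovery: a bank
    defaults once its cumulative losses from defaulted counterparties
    reach its capital. Returns the final set of defaulted banks."""
    defaulted = set(initially_defaulted)
    changed = True
    while changed:                       # iterate to the fixed point
        changed = False
        for i in exposures:
            if i in defaulted:
                continue
            loss = sum(exposures[i].get(j, 0.0) for j in defaulted)
            if loss >= capital[i]:
                defaulted.add(i)
                changed = True
    return defaulted

# usage: bank 0's failure wipes out bank 1, which then topples bank 2,
# while well-capitalized bank 3 survives
exposures = {0: {}, 1: {0: 5.0}, 2: {1: 4.0}, 3: {2: 1.0}}
capital = {0: 1.0, 1: 3.0, 2: 2.0, 3: 5.0}
print(sorted(cascade(exposures, capital, {0})))  # → [0, 1, 2]
```

The resilience criterion mentioned in the abstract asks, in effect, when the expected size of this fixed point stays small relative to the network as the network grows.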

In the second part of the talk, using an equilibrium approach, we introduce a framework to study interbank payments in conjunction with the price impact on external assets. The framework allows us to examine the effects of multilateral clearing via a central clearing counterparty (CCP) on systemic risk and price contagion.


We develop a theory for stochastic control problems which, in various ways, are time inconsistent in the sense that they do not admit a Bellman optimality principle. We study these problems within a game-theoretic framework, and we look for subgame perfect Nash equilibrium points. For a general controlled continuous-time Markov process and a fairly general objective functional, we derive an extension of the standard Hamilton-Jacobi-Bellman equation, in the form of a system of non-linear equations, for the determination of the equilibrium strategy as well as the equilibrium value function. Many examples of time inconsistency in the literature, such as mean-variance problems and non-exponential discounting problems, are easily seen to be special cases of the present theory.
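The mean-variance case makes the source of time inconsistency concrete. For a controlled wealth process, the objective contains the square of a conditional expectation, which is nonlinear in the conditional law, so the tower property does not yield a recursion and the standard dynamic programming argument breaks down:

```latex
% Mean-variance objective for a controlled wealth process X^u:
J(t,x;u) = \mathbb{E}_{t,x}\!\left[X_T^u\right]
         - \frac{\gamma}{2}\,\mathrm{Var}_{t,x}\!\left(X_T^u\right),
\qquad
\mathrm{Var}_{t,x}\!\left(X_T^u\right)
  = \mathbb{E}_{t,x}\!\left[(X_T^u)^2\right]
  - \bigl(\mathbb{E}_{t,x}\!\left[X_T^u\right]\bigr)^{2}.
% The squared conditional expectation term prevents the usual
% Bellman recursion, motivating the equilibrium formulation.
```
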


We consider a weak no-arbitrage condition known as Robust No Unbounded Profit with Bounded Risk (RNUPBR) in the context of continuous-time markets with small proportional transaction costs. We show that the RNUPBR condition on the terminal liquidation value holds if and only if there exists a strictly consistent local martingale system (SCLMS). Moreover, we show that the RNUPBR condition implies the existence of an optimal solution to the utility maximization problem defined on the terminal liquidation value.