Research Seminar Series in Statistics and Mathematics

Location: WU (Vienna University of Economics and Business), Department 4, Room D4.4.008
Date: 04 December 2019, 12:15–13:45
Type: Lecture / discussion
Language: English
Speaker: Dan Zhu (Monash University, Melbourne, Australia)
Organizer: Institute for Statistics and Mathematics
Contact: katrin.artner@wu.ac.at

Dan Zhu (Monash University, Melbourne, Australia) on "Automated IPA for Bayesian MCMC: A New Approach for Local Prior Robustness and Convergence Analysis with Application to Multidimensional Macroeconomic Time Series with Shrinkage Priors"

The Institute for Statistics and Mathematics (Department of Finance, Accounting and Statistics) cordially invites everyone interested to attend the talks in our Research Seminar Series, where internationally renowned scholars from leading universities present and discuss their (working) papers.
No registration required.

The list of talks for the winter term 2019/20 is available via the following link: https://www.wu.ac.at/en/statmath/resseminar

Abstract:
Infinitesimal perturbation analysis (IPA) is a widely used approach to assess local robustness of stochastic dynamic systems. In Bayesian inference, assessing local robustness of posterior Markov chain Monte Carlo (MCMC) inference poses a challenge for existing methods such as finite differencing, symbolic differentiation, and likelihood ratio methods, due to the complex stochastic dependence structure and computational intensity of dependent sampling-based methods. In this paper, we introduce an efficient numerical approach based on automatic differentiation (AD) methods that allows for a comprehensive and exact local sensitivity analysis of MCMC output with respect to all input parameters, i.e. prior hyper-parameters (prior robustness) and chain starting values (convergence). Building on recent developments in AD methods in the classical simulation setting, we develop an AD scheme to differentiate MCMC algorithms in order to compute the sensitivities based on exact (up to computer floating-point error) first-order derivatives of MCMC draws (Jacobians) alongside the estimation algorithm. We focus on methods for Gibbs-based MCMC inference that are applicable to algorithms composed of both continuous and discontinuous high-dimensional mappings, but show how the approach may be extended to cases where Gibbs updates are not available. We illustrate how the methods can be used to help practitioners assess convergence and prior robustness in an application of Bayesian Vector Autoregression (VAR) based analysis with shrinkage priors for US macroeconomic time series data and forecasting.
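To give a flavour of the idea, the following is a minimal sketch, not the authors' method: for a toy conjugate model (y_i ~ N(mu, 1) with prior mu ~ N(m0, tau0_sq)), posterior draws are written as a deterministic function of fixed standard-normal shocks (common random numbers), so each draw can be differentiated with respect to the prior mean m0. A hand-rolled forward-mode dual number stands in for a full AD library; all names (`Dual`, `posterior_draws`) are illustrative.

```python
import random

class Dual:
    """Forward-mode AD number: carries a value and its derivative
    with respect to one chosen input (here the prior mean m0)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = self._wrap(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__
    def __truediv__(self, o):
        o = self._wrap(o)
        return Dual(self.val / o.val,
                    (self.dot * o.val - self.val * o.dot) / (o.val * o.val))

def posterior_draws(m0, tau0_sq, y, shocks):
    """Sample mu | y for y_i ~ N(mu, 1), mu ~ N(m0, tau0_sq), reusing
    fixed shocks so each draw is a differentiable function of m0."""
    n = len(y)
    post_prec = 1.0 / tau0_sq + n                 # posterior precision
    post_mean = (m0 / tau0_sq + sum(y)) / post_prec
    post_sd = post_prec ** -0.5
    return [post_mean + post_sd * z for z in shocks]

random.seed(0)
y = [1.2, 0.8, 1.5, 0.9, 1.1]                     # toy data, unit variance
shocks = [random.gauss(0.0, 1.0) for _ in range(1000)]
tau0_sq = 2.0

m0 = Dual(0.0, 1.0)                               # seed derivative dm0/dm0 = 1
draws = posterior_draws(m0, tau0_sq, y, shocks)

# IPA sensitivity of the posterior-mean estimate w.r.t. the prior mean m0,
# computed alongside the draws; for this conjugate model the exact value
# is (1/tau0_sq) / post_prec, so the AD result can be checked directly.
sens = sum(d.dot for d in draws) / len(draws)
exact = (1.0 / tau0_sq) / (1.0 / tau0_sq + len(y))
```

In a real Gibbs sampler the same principle applies block by block: each update is a mapping from the current state, the fixed random inputs, and the hyper-parameters to the next state, and the chain's Jacobian is propagated through that composition.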
