Talks and Poster Presentations (without Proceedings-Entry):

A. Jung:
"COMPRESSIVE NONPARAMETRIC GRAPHICAL MODEL SELECTION FOR TIME SERIES: A Multitask Learning Approach";
Talk: IDCOM Seminar, Edinburgh (Scotland) (invited); 10-16-2013.



English abstract:
We consider the problem of inferring the conditional independence graph (CIG) of a high-dimensional Gaussian time series. The inference is based on the observation of a finite-length block of the time series. Inspired by the approach of Meinshausen and Bühlmann to graphical model selection for Gaussian Markov random fields, and leveraging results from compressed sensing, we propose a novel compressive and nonparametric selection scheme for the CIG of a Gaussian time series. Our approach generalizes existing selection methods from the case of i.i.d. observations to the case of correlated observations, i.e., we take sample memory into account. In contrast to existing approaches, we do not rely on a finite-dimensional parametric (e.g., a multivariate AR) model but only require the process to be sufficiently underspread. We also give a consistency analysis that reveals conditions under which our selection scheme is consistent as the sample size increases. Numerical experiments compare our approach to an existing (parametric) selection scheme.
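
The scheme builds on the Meinshausen-Bühlmann idea of recovering the CIG edge by edge through sparse (Lasso) neighborhood regressions. The following is a minimal Python sketch of that i.i.d. Gaussian baseline, which the talk generalizes to correlated time-series observations; the function name, the penalty value alpha, and the AND-rule symmetrization are illustrative assumptions, not details taken from the talk.

    import numpy as np
    from sklearn.linear_model import Lasso

    def estimate_cig(X, alpha=0.1):
        """Estimate the conditional independence graph (CIG) from an
        (n_samples, p) data matrix X: Lasso-regress each variable on
        all the others and declare an edge for each nonzero coefficient."""
        p = X.shape[1]
        adjacency = np.zeros((p, p), dtype=bool)
        for j in range(p):
            others = np.arange(p) != j
            fit = Lasso(alpha=alpha).fit(X[:, others], X[:, j])
            # nonzero coefficient => that variable lies in the estimated neighborhood of j
            adjacency[j, others] = np.abs(fit.coef_) > 1e-10
        # AND rule: keep an edge only if both neighborhood regressions select it
        return adjacency & adjacency.T

    # Usage: sample from a chain-structured Gaussian (edges (0,1), (1,2), ...)
    # and check that the estimated graph matches the chain.
    rng = np.random.default_rng(0)
    p = 5
    K = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))  # sparse precision matrix
    X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(K), size=2000)
    print(estimate_cig(X).astype(int))

The talk's contribution replaces these time-domain regressions with a nonparametric, frequency-domain formulation in which the regressions across frequencies are coupled as related tasks, hence the multitask learning viewpoint.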

Keywords:
Sparsity, graphical model selection, multitask learning, nonparametric time series, LASSO


Electronic version of the publication:
http://publik.tuwien.ac.at/files/PubDat_221161.pdf

