On observational variance learning for multivariate Bayesian time series and related models
This thesis is concerned with variance learning in multivariate dynamic linear models (DLMs). Three new models are developed. The first is a dynamic regression model that makes no distributional assumption on the unknown variance matrix. The second extends a known model to allow a comprehensive treatment of missing observations; for this purpose, new distributions that replace the inverse Wishart and matrix-T and that retain conjugacy are introduced. The third is the general multivariate DLM without precise distributional assumptions on the error sequences or on the unknown variance matrix; for this model, analytic updates of the first two moments are derived under weak assumptions that are satisfied by the usual models. Missing observations and time-varying variances are considered in detail for every model. Deterministic and stochastic variance laws for the general multivariate DLM are presented for the first time. Also, by introducing a new distribution that replaces the matrix-beta of earlier work, we prove results on stochastic changes in variance that are consistent with the missing-observation analysis and with variance intervention.