New Economics Papers on Econometrics
By: | Geert Mesters (Netherlands Institute for the Study of Crime and Law Enforcement, and VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam) |
Abstract: | An exact maximum likelihood method is developed for the estimation of parameters in a nonlinear non-Gaussian dynamic panel data model with unobserved random individual-specific and time-varying effects. We propose an estimation procedure based on the importance sampling technique. In particular, a sequence of conditional importance densities is derived which integrates out all random effects from the joint distribution of endogenous variables. We disentangle the integration over both the cross-section and the time series dimensions. The estimation method facilitates the flexible modeling of panels that are large in both dimensions. We evaluate the method in a Monte Carlo study for dynamic panel data models with observations from the Student's t distribution. Finally, we present an extensive empirical study of the interrelationships between the economic growth figures of countries listed in the Penn World Tables. It is shown that our dynamic panel data model can provide an insightful analysis of common and heterogeneous features in worldwide economic growth. |
Keywords: | Panel data, Non-Gaussian, Importance sampling, Random effects, Student's t, Economic growth |
JEL: | C33 C51 F44 |
Date: | 2012–02–06 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012009&r=ecm |
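To make the integration idea concrete, here is a minimal sketch of estimating a likelihood by importance sampling after integrating out a single Gaussian random effect, with Student's t observation errors. The setup (one individual, the prior used as proposal, all parameter values) is an illustrative assumption, not the authors' sequence of conditional importance densities.

```python
# Minimal sketch: importance-sampling estimate of a random-effects
# likelihood (hypothetical setup, not the paper's exact sampler).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = rng.standard_t(df=5, size=20) + 0.5        # observed series, true effect 0.5
sigma_mu, df = 1.0, 5.0                        # assumed effect scale and t dof

def is_loglik(y, n_draws=10_000):
    """log p(y) = log E_mu[ prod_t p(y_t | mu) ] via importance sampling."""
    mu = rng.normal(0.0, sigma_mu, size=n_draws)           # proposal = prior here
    # log-density of all observations given each draw of the random effect
    ll = stats.t.logpdf(y[None, :] - mu[:, None], df).sum(axis=1)
    # proposal equals the prior, so the weights are exp(ll); use log-sum-exp
    m = ll.max()
    return m + np.log(np.exp(ll - m).mean())

print("IS log-likelihood:", is_loglik(y))
```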
By: | Siem Jan Koopman (VU University Amsterdam); Thuy Minh Nguyen (Deutsche Bank, London) |
Abstract: | We show that efficient importance sampling for nonlinear non-Gaussian state space models can be implemented by computationally efficient Kalman filter and smoothing methods. The result provides some new insights but it primarily leads to a simple and fast method for efficient importance sampling. A simulation study and empirical illustration provide some evidence of the computational gains. |
Keywords: | Kalman filter, Monte Carlo maximum likelihood, Simulation smoothing |
JEL: | C32 C51 |
Date: | 2012–01–12 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012008&r=ecm |
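For reference, a minimal sketch of the Kalman filter prediction-error recursions for a local-level model, the kind of linear Gaussian computation on which the paper builds its efficient importance sampling; this is textbook machinery, not the authors' EIS algorithm.

```python
# Minimal sketch: Kalman filter log-likelihood for a local-level model
#   y_t = alpha_t + eps_t,  alpha_{t+1} = alpha_t + eta_t.
import numpy as np

def kalman_loglik(y, sigma_eps2, sigma_eta2, a1=0.0, p1=1e7):
    a, p, loglik = a1, p1, 0.0
    for yt in y:
        f = p + sigma_eps2                       # prediction-error variance
        v = yt - a                               # one-step-ahead prediction error
        loglik -= 0.5 * (np.log(2.0 * np.pi * f) + v * v / f)
        k = p / f                                # Kalman gain
        a = a + k * v                            # predicted next state
        p = p * (1.0 - k) + sigma_eta2           # its variance
    return loglik

rng = np.random.default_rng(1)
alpha = np.cumsum(rng.normal(0.0, 0.3, size=200))
y = alpha + rng.normal(0.0, 1.0, size=200)
print("log-likelihood:", kalman_loglik(y, 1.0, 0.09))
```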
By: | Siem Jan Koopman (VU University Amsterdam); Kai Ming Lee (VU University Amsterdam) |
Abstract: | Unobserved components time series models decompose a time series into a trend, a seasonal, a cycle, an irregular disturbance, and possibly other components. These models have been successfully applied to many economic time series. The standard assumption of a linear model, often appropriate after a logarithmic transformation of the data, facilitates estimation, testing, forecasting and interpretation. However, in some settings the linear-additive framework may be too restrictive. In this paper, we formulate a non-linear unobserved components time series model which allows interactions between the trend-cycle component and the seasonal component. The resulting model is cast into a non-linear state space form and estimated by the extended Kalman filter, adapted for models with diffuse initial conditions. We apply our model to UK travel data and to US unemployment and production series, and show that it can capture increasing seasonal variation and cycle-dependent seasonal fluctuations. |
Keywords: | Seasonal interaction; Unobserved components; Non-linear state space models |
JEL: | C13 C22 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:0000028&r=ecm |
By: | Francisco Blasques (VU University Amsterdam) |
Abstract: | This paper proposes the use of a double correlation coefficient as a nonparametric measure of phase-dependence in time-varying correlations. An asymptotically Gaussian test statistic for the null hypothesis of no phase-dependence is derived from the proposed measure. Finite-sample distributions, power and size are analyzed in a Monte Carlo exercise. An application of this test provides evidence that the correlation strength between major macroeconomic aggregates is both time-varying and phase-dependent in the business cycle. |
Keywords: | nonparametric, phase-dependence, time-varying correlation |
JEL: | C01 C14 C32 |
Date: | 2013–04–04 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2013054&r=ecm |
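To illustrate the kind of hypothesis being tested, here is a minimal sketch that compares correlations across two business-cycle phases with a Fisher z-statistic. The paper's double correlation coefficient is a different, specific nonparametric statistic, so treat this as a stand-in under assumed phase indicators.

```python
# Minimal sketch: test whether a correlation differs across two phases
# (Fisher z comparison; an illustrative stand-in, not the paper's test).
import numpy as np

def fisher_z_test(x, y, phase):
    """phase: boolean array, True = expansion, False = recession."""
    def r(mask):
        return np.corrcoef(x[mask], y[mask])[0, 1], mask.sum()
    r1, n1 = r(phase)
    r0, n0 = r(~phase)
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n0 - 3))
    return (np.arctanh(r1) - np.arctanh(r0)) / se   # ~ N(0,1) under the null

rng = np.random.default_rng(2)
phase = rng.random(500) < 0.7
x = rng.normal(size=500)
y = np.where(phase, 0.2, 0.8) * x + rng.normal(size=500)  # stronger link in recessions
print("z-statistic:", fisher_z_test(x, y, phase))
```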
By: | Lennart Hoogerheide (VU University Amsterdam); Anne Opschoor (Erasmus University Rotterdam); Herman K. van Dijk (Erasmus University Rotterdam, and VU University Amsterdam) |
Abstract: | A class of adaptive sampling methods is introduced for efficient posterior and predictive simulation. The proposed methods are robust in the sense that they can handle target distributions that exhibit non-elliptical shapes such as multimodality and skewness. The basic method makes use of sequences of importance weighted Expectation Maximization steps in order to efficiently construct a mixture of Student-t densities that accurately approximates the target distribution - typically a posterior distribution, of which we only require a kernel - in the sense that the Kullback-Leibler divergence between target and mixture is minimized. We label this approach Mixture of t by Importance Sampling and Expectation Maximization (MitISEM). The constructed mixture is used as a candidate density for quick and reliable application of either Importance Sampling (IS) or the Metropolis-Hastings (MH) method. We also introduce three extensions of the basic MitISEM approach. First, we propose a method for applying MitISEM in a sequential manner. Second, we introduce a permutation-augmented MitISEM approach. Third, we propose a partial MitISEM approach, which aims at approximating the joint distribution by estimating a product of marginal and conditional distributions. This division can substantially reduce the dimension of the approximation problem, which facilitates the application of adaptive importance sampling for posterior simulation in more complex models with larger numbers of parameters. Our results indicate that the proposed methods can substantially reduce the computational burden in econometric models such as DCC and mixture GARCH models and a mixture instrumental variables model. |
Keywords: | mixture of Student-t distributions, importance sampling, Kullback-Leibler divergence, Expectation Maximization, Metropolis-Hastings algorithm, predictive likelihood, DCC GARCH, mixture GARCH, instrumental variables |
JEL: | C11 C22 C26 |
Date: | 2012–03–23 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012026&r=ecm |
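A minimal sketch of the end product of MitISEM: a mixture of Student-t densities used as an importance-sampling candidate for a non-elliptical (here bimodal) target. The EM construction of the mixture is omitted; the component locations, scales and weights below are hand-picked assumptions rather than fitted values.

```python
# Minimal sketch: Student-t mixture as an IS candidate for a bimodal target.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def log_target(x):                       # bimodal target kernel
    return np.logaddexp(stats.norm.logpdf(x, -2, 0.7),
                        stats.norm.logpdf(x,  2, 0.7))

# candidate: equal-weight mixture of two Student-t's (df=5), hand-picked
locs, scales, w = np.array([-2.0, 2.0]), np.array([0.8, 0.8]), np.array([0.5, 0.5])
comp = rng.choice(2, size=50_000, p=w)
draws = locs[comp] + scales[comp] * rng.standard_t(5, size=50_000)

log_q = np.logaddexp(
    stats.t.logpdf((draws - locs[0]) / scales[0], 5) - np.log(scales[0]) + np.log(w[0]),
    stats.t.logpdf((draws - locs[1]) / scales[1], 5) - np.log(scales[1]) + np.log(w[1]))
lw = log_target(draws) - log_q           # log importance weights
wts = np.exp(lw - lw.max())
print("IS estimate of E[X]:", np.sum(wts * draws) / np.sum(wts))  # ~ 0
```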
By: | Falk Brauning (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam) |
Abstract: | We explore a new approach to the forecasting of macroeconomic variables based on a dynamic factor state space analysis. Key economic variables are modeled jointly with principal components from a large time series panel of macroeconomic indicators using a multivariate unobserved components time series model. When the key economic variables are observed at a low frequency and the panel of macroeconomic variables is at a high frequency, we can use our approach for both nowcasting and forecasting purposes. Given a dynamic factor model as the data generation process, we provide Monte Carlo evidence for the finite-sample justification of our parsimonious and feasible approach. We also provide empirical evidence for a U.S. macroeconomic dataset. The unbalanced panel contains quarterly and monthly variables. The forecasting accuracy is measured against a set of benchmark models. We conclude that our dynamic factor state space analysis can lead to higher forecasting precision when panel size and time series dimensions are moderate. |
Keywords: | Kalman filter, Mixed frequency, Nowcasting, Principal components, State space model, Unobserved components time series model |
JEL: | C33 C53 E17 |
Date: | 2012–04–20 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012042&r=ecm |
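A minimal sketch of the two-step idea behind this approach: extract a principal component from a large standardized panel and relate a target variable to it by regression. The paper embeds such factors in a full unobserved components state space model; this static shortcut only illustrates the mechanics on simulated data.

```python
# Minimal sketch: principal-component factor extraction plus a bridge
# regression for the target (illustrative, not the paper's state space model).
import numpy as np

rng = np.random.default_rng(4)
T, N = 120, 40
f_true = rng.normal(size=(T, 1)).cumsum(axis=0) * 0.1       # common factor
panel = f_true @ rng.normal(size=(1, N)) + rng.normal(size=(T, N))

X = (panel - panel.mean(0)) / panel.std(0)                  # standardize panel
U, s, Vt = np.linalg.svd(X, full_matrices=False)            # PCs via SVD
factors = U[:, :1] * s[:1]                                  # first principal component

y = 0.8 * f_true[:, 0] + rng.normal(scale=0.3, size=T)      # target series
beta = np.linalg.lstsq(np.column_stack([np.ones(T), factors]), y, rcond=None)[0]
print("nowcast for last period:", beta[0] + beta[1] * factors[-1, 0])
```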
By: | Cees Diks (CeNDEF, University of Amsterdam); Valentyn Panchenko (University of New South Wales); Oleg Sokolinskiy (Rutgers Business School); Dick van Dijk (Econometric Institute, Erasmus University Rotterdam) |
Abstract: | This paper develops a testing framework for comparing the predictive accuracy of copula-based multivariate density forecasts, focusing on a specific part of the joint distribution. The test is framed in the context of the Kullback-Leibler Information Criterion, but using (out-of-sample) conditional likelihood and censored likelihood in order to focus the evaluation on the region of interest. Monte Carlo simulations document that the resulting test statistics have satisfactory size and power properties in small samples. In an empirical application to daily exchange rate returns we find evidence that the dependence structure varies with the sign and magnitude of returns, such that different parametric copula models achieve superior forecasting performance in different regions of the support. Our analysis highlights the importance of allowing for lower and upper tail dependence for accurate forecasting of common extreme appreciation and depreciation of different currencies. |
Keywords: | Copula-based density forecast, Kullback-Leibler Information Criterion, out-of-sample forecast evaluation |
JEL: | C12 C14 C32 C52 C53 |
Date: | 2013–04–19 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2013061&r=ecm |
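A minimal sketch of the censored likelihood score that focuses density-forecast evaluation on a region of interest, here the left tail of a univariate forecast; the paper works with multivariate copula-based forecasts, so the univariate setting is an illustrative assumption.

```python
# Minimal sketch: censored likelihood scoring of two density forecasts
# on a left-tail region (univariate stand-in for the paper's setting).
import numpy as np
from scipy import stats

def censored_log_score(y, logpdf, cdf, threshold):
    """log f(y) inside the region of interest; log of the forecast's
    probability mass outside the region otherwise."""
    in_region = y < threshold
    return np.where(in_region, logpdf(y), np.log(1.0 - cdf(threshold)))

rng = np.random.default_rng(5)
y = rng.standard_t(4, size=2000)                # fat-tailed "realizations"
thr = np.quantile(y, 0.10)                      # left-tail region of interest
score_norm = censored_log_score(y, stats.norm.logpdf, stats.norm.cdf, thr).mean()
score_t = censored_log_score(y, lambda z: stats.t.logpdf(z, 4),
                             lambda z: stats.t.cdf(z, 4), thr).mean()
print(score_norm, score_t)                      # the t forecast should score higher
```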
By: | Jan F. Kiviet (University of Amsterdam) |
Abstract: | In simple static linear simultaneous equation models the empirical distributions of IV and OLS are examined under alternative sampling schemes and compared with their first-order asymptotic approximations. We demonstrate that the limiting distribution of consistent IV is not affected by conditioning on exogenous regressors, whereas that of inconsistent OLS is. The OLS asymptotic and simulated actual variances are shown to diminish by extending the set of exogenous variables kept fixed in sampling, whereas such an extension disrupts the distribution of IV and deteriorates the accuracy of its standard asymptotic approximation, not only when instruments are weak. Against this background the consequences for the identification of parameters of interest are examined for a setting in which (in practice often incredible) assumptions regarding the zero correlation between instruments and disturbances are replaced by (generally more credible) interval assumptions on the correlation between endogenous regressor and disturbance. This yields OLS-based modified confidence intervals, which are usually conservative. Often they compare favorably with IV-based intervals and accentuate their frailty. |
Keywords: | partial identification, weak instruments, (un)restrained repeated sampling, (un)conditional (limiting) distributions, credible robust inference |
JEL: | C12 C13 C15 C26 J31 |
Date: | 2012–11–27 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012128&r=ecm |
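A minimal sketch of how an interval assumption on the regressor-disturbance correlation maps into bounds on the structural slope, using the OLS inconsistency formula; the paper's modified confidence intervals additionally account for sampling uncertainty, which this point-identification sketch ignores.

```python
# Minimal sketch: bounds on the slope from an assumed interval for
# corr(x, u), exploiting plim(b_OLS) = beta + rho * sigma_u / sigma_x
# with sigma_u^2 = s_e^2 / (1 - rho^2), s_e the OLS residual std. dev.
import numpy as np

rng = np.random.default_rng(6)
n, beta, rho_true = 2000, 1.0, 0.3
u = rng.normal(size=n)
x = rho_true * u + np.sqrt(1 - rho_true**2) * rng.normal(size=n)  # corr(x,u)=0.3
y = beta * x + u

b_ols = np.cov(x, y, ddof=0)[0, 1] / x.var()
e = y - b_ols * x
s_e, s_x = e.std(), x.std()

def beta_given_rho(rho):
    return b_ols - rho * s_e / (s_x * np.sqrt(1 - rho**2))

lo, hi = beta_given_rho(0.5), beta_given_rho(0.1)   # assumed rho in [0.1, 0.5]
print(f"OLS: {b_ols:.3f}, identified set for beta: [{lo:.3f}, {hi:.3f}]")
```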
By: | Norbert Christopeit (University of Bonn); Michael Massmann (VU University Amsterdam) |
Abstract: | Strong consistency of least squares estimators of the slope parameter in simple linear regression models is established for predetermined stochastic regressors. The main result covers a class of models which falls outside the applicability of what is presently available in the literature. An application to the identification of economic models with adaptive learning is discussed. |
Keywords: | linear regression, least-squares, consistency, stochastic regressors, adaptive learning, decreasing gain |
JEL: | C13 C22 D83 D84 |
Date: | 2012–10–12 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012109&r=ecm |
By: | Francisco Blasques (VU University Amsterdam) |
Abstract: | This paper proposes a new set of transformed polynomial functions that provide a flexible setting for nonlinear autoregressive modeling of the conditional mean while at the same time ensuring the strict stationarity, ergodicity, fading memory and existence of moments of the implied stochastic sequence. The great flexibility of the transformed polynomial functions makes them interesting for both parametric and semi-nonparametric autoregressive modeling. This flexibility is established by showing that transformed polynomial sieves are sup-norm-dense in the space of continuous functions and offer appropriate convergence speeds on Hölder function spaces. |
Keywords: | time-series, nonlinear autoregressive models, semi-nonparametric models, method of sieves. |
JEL: | C01 C13 C14 C22 |
Date: | 2012–12–05 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012133&r=ecm |
By: | Jiangyu Ji (VU University Amsterdam); Andre Lucas (VU University Amsterdam, and Duisenberg school of finance) |
Abstract: | We propose a new semiparametric observation-driven volatility model where the form of the error density directly influences the volatility dynamics. This feature distinguishes our model from standard semiparametric GARCH models. The link between the estimated error density and the volatility dynamics follows from the application of the generalized autoregressive score framework of Creal, Koopman, and Lucas (2012). We provide simulated evidence for the estimation efficiency and forecast accuracy of the new model, particularly if errors are fat-tailed and possibly skewed. In an application to equity return data we find that the model also does well in density forecasting. |
Keywords: | volatility clustering, Generalized Autoregressive Score model, kernel density estimation, density forecast evaluation |
JEL: | C10 C14 C22 |
Date: | 2012–05–22 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012055&r=ecm |
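For context, a minimal sketch of a score-driven (GAS) volatility recursion under Student's t errors, the standard parametric variant; the paper's semiparametric model instead lets a kernel estimate of the error density shape the score.

```python
# Minimal sketch: GAS(1,1) volatility with Student's t score
# (standard parametric variant, not the paper's semiparametric model).
import numpy as np

def gas_t_volatility(y, omega, a, b, df):
    """Time-varying variance f_t driven by the scaled t-score."""
    f = np.empty(len(y))
    f[0] = np.var(y)
    for t in range(len(y) - 1):
        w = (df + 1) / (df - 2 + y[t]**2 / f[t])   # t weight downplays outliers
        s = w * y[t]**2 - f[t]                     # score-based innovation
        f[t + 1] = omega + a * s + b * f[t]
    return f

rng = np.random.default_rng(7)
y = rng.standard_t(5, size=500)
print(gas_t_volatility(y, 0.05, 0.1, 0.9, 5.0)[-5:])
```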
By: | Siem Jan Koopman (VU University Amsterdam); Andre Lucas (VU University Amsterdam); Marcel Scharth (VU University Amsterdam) |
Abstract: | We study whether and when parameter-driven time-varying parameter models lead to forecasting gains over observation-driven models. We consider dynamic count, intensity, duration, volatility and copula models, including new specifications that have not been studied earlier in the literature. In an extensive Monte Carlo study, we find that observation-driven generalised autoregressive score (GAS) models have similar predictive accuracy to correctly specified parameter-driven models. In most cases, differences in mean squared errors are smaller than 1% and model confidence sets have low power when comparing these two alternatives. We also find that GAS models outperform many familiar observation-driven models in terms of forecasting accuracy. The results point to a class of observation-driven models with comparable forecasting ability to parameter-driven models, but lower computational complexity. |
Keywords: | Generalised autoregressive score model, Importance sampling, Model confidence set, Nonlinear state space model, Weibull-gamma mixture |
JEL: | C53 C58 C22 |
Date: | 2012–03–06 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012020&r=ecm |
By: | Lukasz Gatarek (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam); Lennart Hoogerheide (VU University Amsterdam); Koen Hooning (Delft University of Technology); Herman K. van Dijk (Econometric Institute, Erasmus University Rotterdam, and VU University Amsterdam) |
Abstract: | Accurate prediction of risk measures such as Value at Risk (VaR) and Expected Shortfall (ES) requires precise estimation of the tail of the predictive distribution. Two novel concepts are introduced that offer a specific focus on this part of the predictive density: the censored posterior, a posterior in which the likelihood is replaced by the censored likelihood; and the censored predictive likelihood, which is used for Bayesian Model Averaging. We perform extensive experiments involving simulated and empirical data. Our results show the ability of these new approaches to outperform the standard posterior and traditional Bayesian Model Averaging techniques in applications of Value-at-Risk prediction in GARCH models. |
Keywords: | censored likelihood, censored posterior, censored predictive likelihood, Bayesian Model Averaging, Value at Risk, Metropolis-Hastings algorithm. |
JEL: | C11 C15 C22 C51 C53 C58 G17 |
Date: | 2013–04–15 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2013060&r=ecm |
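A minimal sketch of the final step of such an exercise: reading Value at Risk and Expected Shortfall off a simulated predictive distribution. The paper's contribution, the censored posterior and censored predictive likelihood, concerns how that predictive distribution is estimated and how models are averaged, not this step.

```python
# Minimal sketch: VaR and ES from simulated predictive draws
# (illustrative t-distributed returns, not the paper's GARCH predictive).
import numpy as np

rng = np.random.default_rng(8)
pred_draws = rng.standard_t(4, size=100_000) * 0.01   # simulated next-day returns

alpha = 0.01
var = -np.quantile(pred_draws, alpha)                 # 99% Value at Risk
es = -pred_draws[pred_draws <= -var].mean()           # 99% Expected Shortfall
print(f"VaR(99%) = {var:.4f}, ES(99%) = {es:.4f}")
```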
By: | Arnold Zellner (posthumously) (University of Chicago, USA); Tomohiro Ando (Keio University, Japan); Nalan Basturk (Erasmus University Rotterdam); Lennart Hoogerheide (Erasmus University Rotterdam); Herman K. van Dijk (VU University Amsterdam, and Erasmus University Rotterdam) |
Abstract: | We discuss Bayesian inferential procedures within the family of instrumental variables regression models and focus on two issues: existence conditions for posterior moments of the parameters of interest under a flat prior, and the potential of Direct Monte Carlo (DMC) approaches for efficient evaluation of such possibly highly non-elliptical posteriors. We show that, for the general case of m endogenous variables under a flat prior, posterior moments of order r exist for the coefficients reflecting the endogenous regressors' effect on the dependent variable if the number of instruments is greater than m+r, even though there is an issue of local non-identification that causes non-elliptical shapes of the posterior. This stresses the need for efficient Monte Carlo integration methods. We introduce an extension of DMC that incorporates an acceptance-rejection sampling step within DMC. This Acceptance-Rejection within Direct Monte Carlo (ARDMC) method has the attractive property that the generated random drawings are independent, which greatly helps the fast convergence of simulation results and facilitates the evaluation of the numerical accuracy. The speed of ARDMC can easily be improved further by making use of parallelized computation on multiple core machines or computer clusters. We note that ARDMC is an analogue to the well-known 'Metropolis-Hastings within Gibbs' sampling in the sense that one 'more difficult' step is used within an 'easier' simulation method. We compare the ARDMC approach with the Gibbs sampler using simulated data and two empirical data sets, involving the settler mortality instrument of Acemoglu et al. (2001) and the father's education instrument used by Hoogerheide et al. (2012a). Even without making use of parallelized computation, an efficiency gain is observed both under strong and weak instruments, where the gain can be enormous in the latter case. |
Keywords: | Instrumental variables, Bayesian inference, Direct Monte Carlo, Acceptance-Rejection, numerical standard errors |
JEL: | C11 C15 C26 C36 |
Date: | 2012–09–24 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012098&r=ecm |
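A minimal sketch of acceptance-rejection sampling, the 'more difficult' step that ARDMC embeds within Direct Monte Carlo; accepted draws are mutually independent, which is the property the abstract emphasizes. The normal target and Cauchy envelope are illustrative choices, not the IV posterior.

```python
# Minimal sketch: acceptance-rejection sampling of N(0,1) with a
# Cauchy envelope (illustrative target, not the paper's IV posterior).
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

def ar_sample(n):
    M = np.sqrt(2 * np.pi / np.e)        # sup of normal/Cauchy density ratio
    out = []
    while len(out) < n:
        x = rng.standard_cauchy(size=n)
        u = rng.random(size=n)
        ratio = stats.norm.pdf(x) / (M * stats.cauchy.pdf(x))
        out.extend(x[u < ratio])         # accepted draws are independent
    return np.array(out[:n])

draws = ar_sample(10_000)
print(draws.mean(), draws.std())         # ~0, ~1
```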
By: | Nalan Basturk (Erasmus University Rotterdam); Cem Cakmakli (University of Amsterdam); Pinar Ceyhan (Erasmus University Rotterdam); Herman K. van Dijk (Erasmus University Rotterdam, and VU University Amsterdam) |
Abstract: | Changing time series properties of US inflation and economic activity are analyzed within a class of extended Phillips Curve (PC) models. First, the misspecification effects of mechanical removal of low frequency movements of these series on posterior inference of a basic PC model are analyzed using a Bayesian simulation based approach. Next, structural time series models that describe changing patterns in low and high frequencies and backward- as well as forward-looking inflation expectation mechanisms are incorporated in the class of extended PC models. Empirical results indicate that the proposed models compare favorably with existing Bayesian Vector Autoregressive and Stochastic Volatility models in terms of fit and predictive performance. Weak identification and dynamic persistence appear less important when time-varying dynamics of high and low frequencies are carefully modeled. Modeling inflation expectations using survey data and adding level shifts and stochastic volatility substantially improves in-sample fit and out-of-sample predictions. No evidence is found of a long-run stable cointegration relation between US inflation and marginal costs. Tails of the complete predictive distributions indicate an increase in the probability of disinflation in recent years. |
Keywords: | New Keynesian Phillips curve, unobserved components, level shifts, inflation expectations |
JEL: | C11 C32 E31 E37 |
Date: | 2013–01–10 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2013011&r=ecm |
By: | Denitsa Stefanova (VU University Amsterdam) |
Abstract: | The paper proposes a model for the dynamics of stock prices that incorporates increased asset co-movements during extreme market downturns in a continuous-time setting. The model is based on the construction of a multivariate diffusion with a pre-specified stationary density with tail dependence. I estimate the model with Markov Chain Monte Carlo using a sequential inference procedure that proves to be well-suited for the problem. The model is able to reproduce stylized features of the dependence structure and the dynamic behaviour of asset returns. |
Keywords: | tail dependence, multivariate diffusion, Markov Chain Monte Carlo |
JEL: | C11 C51 C58 |
Date: | 2012–11–21 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012125&r=ecm |
By: | Ad Ridder (VU University Amsterdam); Bruno Tuffin (Inria Rennes Bretagne Atlantique) |
Abstract: | In rare event simulation, we look for estimators such that the relative accuracy of the output is "controlled" as the event becomes rarer and rarer. Different robustness properties that an estimator is expected to satisfy have been defined in the literature. However, those properties are not adapted to estimators drawn from a parametric family for which the optimal parameter is learned and therefore random. For this reason, we motivate in this paper the need to define probabilistic robustness properties, because the accuracy of the resulting estimator is itself random. We focus especially on the so-called probabilistic bounded relative error property. We additionally provide sufficient conditions, both in general and in Markov settings, for satisfying such a property, illustrate them on simple but standard examples, and hope that this will foster discussions and new work in the area. |
Keywords: | Rare event probability, Importance sampling, Probabilistic robustness, Markov chains |
JEL: | C6 |
Date: | 2012–10–01 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012103&r=ecm |
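A minimal sketch of importance sampling for a rare-event probability via exponential tilting, together with the estimated relative error that robustness properties of this kind aim to control; the fixed tilting parameter below sidesteps the learned, random parameter that motivates the paper.

```python
# Minimal sketch: rare-event probability P(X > c), X ~ N(0,1), by
# importance sampling under an exponentially tilted (mean-shifted) proposal.
import numpy as np

rng = np.random.default_rng(10)
c, n = 4.0, 100_000

x = rng.normal(loc=c, size=n)                # sample from N(c, 1)
w = np.exp(-c * x + c**2 / 2) * (x > c)      # likelihood ratio N(0,1)/N(c,1)
p_hat = w.mean()
rel_err = w.std() / (np.sqrt(n) * p_hat)     # relative standard error
print(f"P(X>{c}) ~ {p_hat:.3e}, relative error {rel_err:.3f}")
```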
By: | Francisco Blasques (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam); Andre Lucas (VU University Amsterdam) |
Abstract: | We characterize the dynamic properties of Generalized Autoregressive Score (GAS) processes by identifying regions of the parameter space that imply stationarity and ergodicity. We show how these regions are affected by the choice of parameterization and scaling, which are key features of GAS models compared to other observation driven models. The Dudley entropy integral is used to ensure the non-degeneracy of such regions. Furthermore, we show how to obtain bounds for these regions in models for time-varying means, variances, or higher-order moments. |
Keywords: | Dudley integral, Durations, Higher-order models, Nonlinear dynamics, Time-varying parameters, Volatility |
JEL: | C13 C22 C58 |
Date: | 2012–06–22 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012059&r=ecm |
By: | Francisco Blasques (VU University Amsterdam) |
Abstract: | This paper proposes a functional specification approach for dynamic stochastic general equilibrium (DSGE) models that explores the properties of the solution method used to approximate policy functions. In particular, the solution-driven specification takes the properties of the solution method directly into account when designing the structural model in order to deliver enhanced flexibility and facilitate parameter identification within the structure imposed by the underlying economic theory. A prototypical application reveals the importance of this method in improving the specification of functional nonlinearities that are consistent with economic theory. The solution-driven specification is also shown to have the potential to greatly improve model fit and provide alternative policy recommendations when compared to standard DSGE model designs. |
Keywords: | Nonlinear Model Specification, DSGE, Perturbation Solutions |
JEL: | C51 E17 E37 |
Date: | 2013–04–19 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2013062&r=ecm |
By: | Siem Jan Koopman (VU University Amsterdam); Rutger Lit (VU University Amsterdam) |
Abstract: | Attack and defense strengths of football teams vary over time due to changes in the teams of players or their managers. We develop a statistical model for the analysis and forecasting of football match results which are assumed to come from a bivariate Poisson distribution with intensity coefficients that change stochastically over time. This development presents a novelty in the statistical time series analysis of match results from football or other team sports. Our treatment is based on state space and importance sampling methods which are computationally efficient. The out-of-sample performance of our methodology is verified in a betting strategy that is applied to the match outcomes from the 2010/11 and 2011/12 seasons of the English Premier League. We show that our statistical modeling framework can produce a significant positive return over the bookmaker's odds. |
Keywords: | Betting, Importance sampling, Kalman filter smoother, Non-Gaussian multivariate time series models, Sport statistics |
JEL: | C32 C35 |
Date: | 2012–09–27 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012099&r=ecm |
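A minimal sketch of match-outcome probabilities implied by a bivariate Poisson score distribution, using the common-shock construction with fixed intensities; in the paper the intensities vary stochastically over time and are estimated by importance sampling.

```python
# Minimal sketch: outcome probabilities from a bivariate Poisson score
# model via the common-shock construction (fixed illustrative intensities).
import numpy as np

rng = np.random.default_rng(11)
lam_home, lam_away, lam_common = 1.4, 1.0, 0.2   # illustrative intensities

n = 200_000
z0 = rng.poisson(lam_common, n)                  # shared shock -> correlated scores
home = rng.poisson(lam_home, n) + z0
away = rng.poisson(lam_away, n) + z0

print("P(home win):", (home > away).mean())
print("P(draw):    ", (home == away).mean())
print("P(away win):", (home < away).mean())
```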
By: | Monica Billio (University of Venice, GRETA Assoc. and School for Advanced Studies in Venice); Roberto Casarin (University of Venice, GRETA Assoc. and School for Advanced Studies in Venice); Francesco Ravazzolo (Norges Bank and BI Norwegian Business School); Herman K. van Dijk (Erasmus University Rotterdam, VU University Amsterdam) |
Abstract: | We propose a Bayesian combination approach for multivariate predictive densities which relies upon a distributional state space representation of the combination weights. Several specifications of multivariate time-varying weights are introduced, with a particular focus on weight dynamics driven by the past performance of the predictive densities and the use of learning mechanisms. In the proposed approach the model set can be incomplete, meaning that all models can be individually misspecified. A Sequential Monte Carlo method is proposed to approximate the filtering and predictive densities. The combination approach is assessed using statistical and utility-based performance measures for evaluating density forecasts. Simulation results indicate that, for a set of linear autoregressive models, the combination strategy is successful in selecting, with probability close to one, the true model when the model set is complete, and it is able to detect parameter instability when the model set includes the true model that has generated subsamples of data. For the macro series we find that incompleteness of the models is relatively large in the 1970s, at the beginning of the 1980s and during the recent financial crisis, and lower during the Great Moderation. With respect to returns of the S&P 500 series, we find that an investment strategy using a combination of predictions from professional forecasters and from a white noise model puts more weight on the white noise model at the beginning of the 1990s and switches to giving more weight to the professional forecasts over time. |
Keywords: | Density Forecast Combination, Survey Forecast, Bayesian Filtering, Sequential Monte Carlo |
JEL: | C11 C15 C53 E37 |
Date: | 2012–11–07 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012118&r=ecm |
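A minimal sketch of performance-driven combination weights for two density forecasts, updated recursively from past log scores with a discount factor; the paper's state space representation of the weights, estimated by Sequential Monte Carlo, is far richer than this simple recursion.

```python
# Minimal sketch: recursive, performance-driven combination weights
# for two density forecasts (discounted log-score updating).
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)
y = rng.standard_t(4, size=300)                       # realized series

logscore = np.column_stack([stats.norm.logpdf(y),     # model 1: N(0,1)
                            stats.t.logpdf(y, 4)])    # model 2: t(4)

eta, w = 0.95, np.array([0.5, 0.5])                   # discount factor, prior weights
for t in range(len(y)):
    w = w**eta * np.exp(logscore[t])                  # discounted Bayes-style update
    w = w / w.sum()
print("final weights (N vs t):", w)
```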
By: | Geert Mesters (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam) |
Abstract: | We study the forecasting of the yearly outcome of the Boat Race between Cambridge and Oxford. We compare the relative performance of different dynamic models for forty years of forecasting. Each model is defined by a binary density conditional on a latent signal that is specified as a dynamic stochastic process with fixed predictors. The out-of-sample predictive ability of the models is compared between each other by using a variety of loss functions and predictive ability tests. We find that the model with its latent signal specified as an autoregressive process cannot be outperformed by the other specifications. This model is able to correctly forecast 30 out of 40 outcomes of the Boat Race. |
Keywords: | Binary time series, Predictive ability, Non-Gaussian state space model |
JEL: | C32 C35 |
Date: | 2012–10–23 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012110&r=ecm |
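A minimal sketch of the class of models compared here: a binary outcome driven by a latent Gaussian autoregressive signal through a probit link. All parameter values are illustrative, and the importance-sampling estimation step is omitted.

```python
# Minimal sketch: binary series generated by a latent AR(1) signal with a
# probit link (illustrative parameters; estimation omitted).
import numpy as np
from scipy import stats

rng = np.random.default_rng(14)
T, phi, sigma = 200, 0.9, 0.3                  # illustrative length and AR(1) values
theta = np.zeros(T)
for t in range(1, T):
    theta[t] = phi * theta[t - 1] + rng.normal(0, sigma)
p = stats.norm.cdf(theta)                      # probit link: win probability
y = rng.random(T) < p                          # simulated win indicators
print("share of wins:", y.mean())
```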
By: | Fady Barsoum (Department of Economics, University of Konstanz, Germany); Sandra Stankiewicz (Department of Economics, University of Konstanz, Germany) |
Abstract: | For modelling mixed-frequency data with a business cycle pattern we introduce the Markov-switching Mixed Data Sampling model with unrestricted lag polynomial (MS-U-MIDAS). Models of the MIDAS class usually use lag polynomials of a specific functional form, which impose some structure on the weights of the regressors included in the model. This may deteriorate the predictive power of the model if the imposed structure differs from the data generating process. When the difference between the available data frequencies is small and there is no risk of parameter proliferation, using an unrestricted lag polynomial might not only simplify the model estimation, but also improve its forecasting performance. We therefore allow the parameters of the unrestricted MIDAS model to change according to a Markov-switching scheme in order to account for the business cycle pattern observed in many macroeconomic variables. We apply this model to a large dataset with the help of factor analysis. Monte Carlo experiments and an empirical forecasting comparison carried out for U.S. GDP growth show that models of the MS-U-MIDAS class exhibit similar or better nowcasting and forecasting performance than their counterparts with restricted lag polynomials. |
Keywords: | Markov-switching, Business cycle, Mixed-frequency data analysis, Forecasting |
JEL: | C22 C53 E37 |
Date: | 2013–05–08 |
URL: | http://d.repec.org/n?u=RePEc:knz:dpteco:1310&r=ecm |
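A minimal sketch of the unrestricted MIDAS (U-MIDAS) regression at the core of the model: a quarterly target regressed on individual monthly lags with freely estimated coefficients, i.e. no lag-polynomial restriction; the Markov-switching layer is omitted.

```python
# Minimal sketch: U-MIDAS regression of a quarterly target on the three
# months of the current quarter, with unrestricted lag coefficients.
import numpy as np

rng = np.random.default_rng(13)
n_q = 80                                       # quarters
x_m = rng.normal(size=3 * n_q)                 # monthly indicator
X = x_m.reshape(n_q, 3)                        # columns: months 1..3 of each quarter
y = X @ np.array([0.5, 0.3, 0.1]) + rng.normal(scale=0.5, size=n_q)

Z = np.column_stack([np.ones(n_q), X])         # unrestricted monthly lags
beta = np.linalg.lstsq(Z, y, rcond=None)[0]
print("U-MIDAS coefficients:", beta.round(2))
```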
By: | Jiaqi Chen; Michael L. Tindall |
Abstract: | This paper describes the structure of a rule-based econometric forecasting system designed to produce multi-equation econometric models. It describes the functioning of a working system that builds the econometric forecasting equation for each series submitted and produces forecasts of the series. The system employs information criteria and cross validation in the equation building process, and it uses Bayesian model averaging to combine forecasts of individual series. The system outperforms standard benchmarks for a variety of national economic datasets. |
Keywords: | Econometrics |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:fip:feddop:1:x:1&r=ecm |