nep-ets New Economics Papers
on Econometric Time Series
Issue of 2019‒09‒23
seven papers chosen by
Jaqueson K. Galimberti
KOF Swiss Economic Institute

  1. Unambiguous inference in sign-restricted VAR models By Robert Calvert Jump
  2. Estimation of Large Dimensional Conditional Factor Models in Finance By Patrick Gagliardini; Elisa Ossola; O. Scaillet
  3. High-dimensional macroeconomic forecasting using message passing algorithms By Korobilis, Dimitris
  4. The Great Moderation: Updated Evidence with Joint Tests for Multiple Structural Changes in Variance and Persistence By Perron, Pierre; Yamamoto, Yohei
  5. Modelling and forecasting the dollar-pound exchange rate in the presence of structural breaks By Jennifer Castle; Takamitsu Kurita
  6. Selecting a Model for Forecasting By Jennifer Castle; Jurgen Doornik; David Hendry
  7. Comparing the forecasting of cryptocurrencies by Bayesian time-varying volatility models By Rick Bohte; Luca Rossini

  1. By: Robert Calvert Jump (University of the West of England, Bristol)
    Abstract: This paper demonstrates how sign restrictions can be used to infer the signs of certain historical shocks from reduced form VAR residuals. This is achieved without recourse to non-sign information. The method is illustrated by an application to the AD-AS model using UK data.
    Keywords: Structural VAR; sign restrictions.
    JEL: C51 C52
    Date: 2018–01–02
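    A minimal sketch of the idea, using a hypothetical bivariate (output, prices) system with made-up residuals and covariance: draw random rotations of the Cholesky factor, keep those satisfying the sign restriction that a demand shock raises both variables, and check whether every admissible rotation implies the same sign for the historical demand shock at a given date. The numbers and the restriction pattern are illustrative, not the paper's actual AD-AS application.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reduced-form residuals and covariance for a 2-variable VAR.
u_t = np.array([0.5, 0.4])            # residuals at one date
Sigma = np.array([[1.0, 0.3],
                  [0.3, 1.0]])        # reduced-form covariance
P = np.linalg.cholesky(Sigma)

signs_of_demand_shock = []
for _ in range(5000):
    # Draw a random orthogonal matrix (QR of a Gaussian draw, sign-normalized).
    Q, R = np.linalg.qr(rng.standard_normal((2, 2)))
    Q = Q @ np.diag(np.sign(np.diag(R)))
    A = P @ Q                          # candidate impact matrix, A @ A.T = Sigma
    # Sign restriction: a demand shock raises both output and prices,
    # i.e. the first column of A is positive (flip the column if needed).
    if A[0, 0] < 0 and A[1, 0] < 0:
        A[:, 0] = -A[:, 0]
    if A[0, 0] > 0 and A[1, 0] > 0:
        eps = np.linalg.solve(A, u_t)  # implied structural shocks
        signs_of_demand_shock.append(np.sign(eps[0]))

# If every admissible rotation implies the same sign, the historical
# demand shock's sign at this date is unambiguously identified.
unambiguous = len(set(signs_of_demand_shock)) == 1
print(unambiguous)
```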
  2. By: Patrick Gagliardini (USI Università della Svizzera italiana; Swiss Finance Institute); Elisa Ossola (European Commission, Joint Research Centre); O. Scaillet (University of Geneva GSEM and GFRI; Swiss Finance Institute; University of Geneva - Research Center for Statistics)
    Abstract: This chapter provides an econometric methodology for inference in large-dimensional conditional factor models in finance. Changes in the business cycle and asset characteristics induce time variation in factor loadings and risk premia that must be accounted for. The growing trend in the use of disaggregated data for individual securities motivates our focus on methodologies for a large number of assets. The beginning of the chapter outlines the concept of an approximate factor structure in the presence of conditional information, and develops an arbitrage pricing theory for large-dimensional factor models in this framework. We then distinguish between two different cases for inference depending on whether factors are observable or not. We focus on diagnosing model specification, estimating conditional risk premia, and testing asset pricing restrictions under increasing cross-sectional and time series dimensions. At the end of the chapter, we review some of the empirical findings and contrast analysis based on individual stocks with analysis based on standard sets of portfolios. We also discuss the impact on computing the time-varying cost of equity for a firm, and summarize differences between results for developed and emerging markets in an international setting.
    Keywords: large panel, factor model, conditional information, risk premium, asset pricing, emerging markets
    JEL: C12 C13 C23 C51 C52 G12
    Date: 2019–08
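    The unobservable-factor case can be caricatured in a few lines: extract principal-components factors from a simulated large panel of returns, then run a cross-sectional regression of average returns on the estimated loadings to obtain risk-premium estimates. This is an unconditional two-pass sketch with invented dimensions, not the chapter's conditional large-panel methodology.

```python
import numpy as np

rng = np.random.default_rng(5)
T, n, kf = 200, 300, 2                        # cross-section larger than T

F = rng.standard_normal((T, kf))              # latent factors
L = rng.standard_normal((n, kf))              # loadings
R = F @ L.T + rng.standard_normal((T, n))     # simulated excess-return panel

# Extract factors by principal components (SVD of the panel).
U, s, Vt = np.linalg.svd(R, full_matrices=False)
F_hat = U[:, :kf] * np.sqrt(T)                # estimated factor paths
L_hat = R.T @ F_hat / T                       # time-series regression loadings

# Second pass: cross-sectional regression of average returns on loadings
# gives (unconditional) risk-premium estimates.
lam_hat, *_ = np.linalg.lstsq(L_hat, R.mean(axis=0), rcond=None)
print(np.round(lam_hat, 2))
```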
  3. By: Korobilis, Dimitris
    Abstract: This paper makes two distinct contributions to the econometric analysis of large information sets and structural instabilities. First, it treats a regression model with time-varying coefficients, stochastic volatility and exogenous predictors as an equivalent high-dimensional static regression problem with thousands of covariates. Inference in this specification proceeds using Bayesian hierarchical priors that shrink the high-dimensional vector of coefficients either towards zero or towards time-invariance. Second, it introduces the frameworks of factor graphs and message passing as a means of designing efficient Bayesian estimation algorithms. In particular, a Generalized Approximate Message Passing (GAMP) algorithm is derived that has low algorithmic complexity and is trivially parallelizable. The result is a comprehensive methodology that can be used to estimate time-varying parameter regressions with an arbitrarily large number of exogenous predictors. In a forecasting exercise for U.S. price inflation this methodology is shown to work very well.
    Keywords: high-dimensional inference; factor graph; Belief Propagation; Bayesian shrinkage; time-varying parameter model
    JEL: C01 C11 C13 C52 C53 C61 E31 E37
    Date: 2019–09–15
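    The first contribution can be sketched as follows: with random-walk coefficients, y_t depends on the initial state plus all coefficient increments up to t, so the time-varying parameter regression stacks into one static regression with T×k unknowns. The ridge penalty below is a crude stand-in for the paper's hierarchical shrinkage priors, and all sizes are toy values.

```python
import numpy as np

rng = np.random.default_rng(1)
T, k = 50, 2                          # 50 periods, 2 predictors (toy sizes)

X = rng.standard_normal((T, k))
# Simulate a random-walk coefficient path and the data.
beta = np.cumsum(0.1 * rng.standard_normal((T, k)), axis=0) + 1.0
y = np.einsum('tk,tk->t', X, beta) + 0.1 * rng.standard_normal(T)

# Static rewriting: with beta_t = beta_{t-1} + v_t, y_t loads on every
# increment v_s with s <= t, so stack the increments into one big
# regressor matrix Z of size T x (T*k).
Z = np.zeros((T, T * k))
for t in range(T):
    for s in range(t + 1):
        Z[t, s * k:(s + 1) * k] = X[t]

# Ridge shrinkage (a crude stand-in for the hierarchical priors):
lam = 1.0
theta = np.linalg.solve(Z.T @ Z + lam * np.eye(T * k), Z.T @ y)
beta_hat = np.cumsum(theta.reshape(T, k), axis=0)  # recover coefficient paths
print(np.round(beta_hat[-1], 2))
```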
  4. By: Perron, Pierre; Yamamoto, Yohei
    Abstract: We assess the empirical evidence about the Great Moderation using a comprehensive framework for testing for multiple structural changes in the coefficients and in the variance of the error term of a linear regression model, provided by Perron, Yamamoto, and Zhou (2019). We apply it to U.S. real GDP and its major components for the period 1960:1 to 2018:4. A notable feature of our approach is that we adopt an unobserved components model, allowing for two breaks in the trend function in 1973:1 and 2008:1, in order to obtain a stationary or cyclical component modelled as an autoregressive process. First, we confirm evidence about the Great Moderation, i.e., a structural change in the variance of the errors in the mid-80s for the various series. Second, additional breaks in variance are found in 1970:3 for GDP and production (goods), after which the sample standard deviation tripled. Hence, part of the Great Moderation can be viewed as a reversion to the pre-70s level of volatility. Third, the evidence about systematic changes in the sum of the autoregressive coefficients (a measure of persistence) is weak over the whole sample period. Finally, we find little evidence of structural changes occurring in both the variance and the coefficients following the Great Recession (2007-2008). These results support views emphasizing the "good luck" hypothesis as a source of the Great Moderation, which continues even after the Great Recession.
    Keywords: the Great Moderation, the Great Recession, multiple structural changes, joint tests, structural change, trend-cycle decomposition
    JEL: C22 C32
    Date: 2019–09
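    A toy version of a variance-break test (a single break located by a sup-likelihood-ratio scan, not the joint multiple-break tests of Perron, Yamamoto, and Zhou): simulate a series whose volatility triples at the midpoint, echoing the pre-/post-moderation pattern, and scan trimmed candidate break dates.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 400
# Volatility triples at the midpoint (standard deviation 1 -> 3).
e = np.r_[rng.standard_normal(T // 2), 3.0 * rng.standard_normal(T // 2)]

def neg_loglik(x):
    # Gaussian negative log-likelihood at the MLE variance of x.
    s2 = x.var()
    return 0.5 * len(x) * (np.log(2 * np.pi * s2) + 1.0)

full = neg_loglik(e)
trim = int(0.15 * T)                  # trim the sample ends, as is standard
lr = [2 * (full - neg_loglik(e[:tb]) - neg_loglik(e[tb:]))
      for tb in range(trim, T - trim)]
tb_hat = trim + int(np.argmax(lr))    # estimated break date
print(tb_hat, round(max(lr), 1))
```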
  5. By: Jennifer Castle; Takamitsu Kurita
    Abstract: We employ a newly-developed partial cointegration system allowing for level shifts to examine whether economic fundamentals form the long-run determinants of the dollar-pound exchange rate in an era of structural change. The paper uncovers a class of local data generation mechanisms underlying long-run and short-run dynamic features of the exchange rate using a set of economic variables that explicitly reflect the central banks’ monetary policy stances and the influence of a forward exchange market. The impact of the Brexit referendum is evaluated by examining forecasts when the dollar-pound exchange rate fell substantially around the vote.
    Keywords: Exchange rates, Monetary policy, General-to-specific approach, Partial cointegrated vector autoregressive models, Structural breaks.
    JEL: C22 C32 C52 F31
    Date: 2019–01–07
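    The role of level shifts can be illustrated with a static cointegrating regression: omitting a step dummy for a known shift date (think of the referendum) can distort the estimated long-run coefficient. All series, parameters and the shift date below are simulated, not the paper's data or its partial cointegrated VAR.

```python
import numpy as np

rng = np.random.default_rng(6)
T = 300
x = np.cumsum(rng.standard_normal(T))          # I(1) fundamental
shift = (np.arange(T) >= 200).astype(float)    # level shift at t = 200
y = 1.0 + 0.8 * x - 2.0 * shift + rng.standard_normal(T)

# Cointegrating regression with and without the step dummy.
Z = np.c_[np.ones(T), x, shift]
b_with, *_ = np.linalg.lstsq(Z, y, rcond=None)
b_without, *_ = np.linalg.lstsq(Z[:, :2], y, rcond=None)
# Long-run coefficient on x under each specification:
print(round(b_with[1], 2), round(b_without[1], 2))
```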
  6. By: Jennifer Castle; Jurgen Doornik; David Hendry
    Abstract: We investigate the role of the significance level when selecting models for forecasting, as it controls both the null retention frequency and the probability of retaining relevant variables when using binary decisions to retain or drop variables. Analysis identifies the best selection significance level in a bivariate model when there are location shifts at or near the forecast origin. The trade-off for selecting variables in forecasting models in a stationary world, namely that variables should be retained if their non-centralities exceed 1, applies in the wide-sense non-stationary settings with structural breaks examined here. The results confirm the optimality of the Akaike Information Criterion for forecasting in completely different settings than those in which it was initially derived. An empirical illustration forecasting UK inflation demonstrates the applicability of the analytics. Simulation then explores the choice of selection significance level for 1-step ahead forecasts in larger models when there are unknown location shifts present under a range of alternative scenarios, using the multipath tree search algorithm Autometrics (Doornik, 2009) and varying the target significance level for the selection of regressors. The costs of model selection are shown to be small. The results provide support for model selection at looser than conventional settings, albeit with many additional features explaining the forecast performance, with the caveat that retaining irrelevant variables that are subject to location shifts can worsen forecast performance.
    Keywords: Model selection; forecasting; location shifts; significance level; Autometrics
    Date: 2018–11–09
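    The retain-or-drop decision can be sketched as a one-pass t-test rule (a crude stand-in for Autometrics' multipath tree search): estimate the full model, keep regressors whose |t| exceeds a critical value, re-estimate, and forecast one step ahead; the critical value plays the role of the selection significance level. Data and sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
T, k = 120, 10
X = rng.standard_normal((T, k))
beta = np.r_[0.8, 0.5, np.zeros(k - 2)]       # two relevant regressors
y = X @ beta + rng.standard_normal(T)

def select_and_forecast(crit):
    # Estimate on all but the last observation, select by |t| > crit,
    # re-estimate the retained model, forecast the held-out period.
    Xt, yt = X[:-1], y[:-1]
    b, *_ = np.linalg.lstsq(Xt, yt, rcond=None)
    resid = yt - Xt @ b
    s2 = resid @ resid / (len(yt) - k)
    se = np.sqrt(s2 * np.diag(np.linalg.inv(Xt.T @ Xt)))
    keep = np.abs(b / se) > crit
    b_sel, *_ = np.linalg.lstsq(Xt[:, keep], yt, rcond=None)
    return float(X[-1, keep] @ b_sel)

for crit in (1.96, 1.28):                     # tighter vs looser criterion
    print(round(select_and_forecast(crit), 2))
```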
  7. By: Rick Bohte; Luca Rossini
    Abstract: This paper studies the forecastability of cryptocurrency time series, focusing on the four most capitalized cryptocurrencies: Bitcoin, Ethereum, Litecoin and Ripple. Different Bayesian models are compared, including models with constant and time-varying volatility, such as stochastic volatility and GARCH. Moreover, some crypto-predictors are included in the analysis, such as the S&P 500 and Nikkei 225. The results show that stochastic volatility significantly outperforms the VAR benchmark in both point and density forecasting. Among error distributions for the stochastic volatility model, the Student-t distribution outperforms the standard normal approach.
    Date: 2019–09
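    The value of time-varying volatility for density forecasting can be illustrated with a GARCH(1,1) toy (not the paper's Bayesian stochastic volatility models): simulate a GARCH return series and compare average one-step-ahead predictive log scores of the true time-varying variance against a constant-variance benchmark. All parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
# Simulate a GARCH(1,1) return series (a stand-in for a crypto series).
T, omega, alpha, beta = 2000, 0.05, 0.15, 0.80
h = np.empty(T)
r = np.empty(T)
h[0] = omega / (1 - alpha - beta)             # unconditional variance
for t in range(T):
    r[t] = np.sqrt(h[t]) * rng.standard_normal()
    if t + 1 < T:
        h[t + 1] = omega + alpha * r[t] ** 2 + beta * h[t]

def norm_logpdf(x, var):
    # Gaussian log-density with mean zero and the given variance.
    return -0.5 * (np.log(2 * np.pi * var) + x ** 2 / var)

# Average predictive log score (a density-forecast criterion):
# time-varying conditional variance vs a constant-variance benchmark.
score_garch = np.mean(norm_logpdf(r, h))
score_const = np.mean(norm_logpdf(r, r.var()))
print(score_garch > score_const)
```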

This nep-ets issue is ©2019 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.