nep-ets New Economics Papers
on Econometric Time Series
Issue of 2018‒10‒01
nine papers chosen by
Jaqueson K. Galimberti
KOF Swiss Economic Institute

  1. On the Choice of Instruments in Mixed Frequency Specification Tests By Yun Liu; Yeonwoo Rho
  2. Nowcasting Canadian Economic Activity in an Uncertain Environment By Tony Chernis; Rodrigo Sekkel
  3. Improving Underlying Scenarios for Aggregate Forecasts: A Multi-level Combination Approach By Cobb, Marcus P A
  4. Non-Gaussian Stochastic Volatility Model with Jumps via Gibbs Sampler By Arthur T. Rego; Thiago R. dos Santos
  5. Detecting exchange rate contagion in Asian exchange rate markets using asymmetric DCC-GARCH and R-vine copulas By Gomez-Gonzalez, Jose; Rojas-Espinosa, Wilmer
  6. Dynamical variety of shapes in financial multifractality By Stanisław Drożdż; Rafał Kowalski; Paweł Oświęcimka; Rafał Rak; Robert Gębarowski
  7. Change-Point Testing and Estimation for Risk Measures in Time Series By Lin Fan; Peter W. Glynn; Markus Pelger
  8. Generalized exogenous processes in DSGE: A Bayesian approach By Meyer-Gohde, Alexander; Neuhoff, Daniel
  9. To sign or not to sign? On the response of prices to financial and uncertainty shocks By Meinen, Philipp; Röhe, Oke

  1. By: Yun Liu; Yeonwoo Rho
    Abstract: Time averaging has been the traditional approach to handling mixed sampling frequencies. However, it ignores information possibly embedded in the high-frequency data. Mixed data sampling (MIDAS) regression models provide a concise way to utilize the additional information in high-frequency variables. In this paper, we propose a specification test, based on a Durbin-Wu-Hausman test, to choose between time-averaging and MIDAS models. In particular, a set of instrumental variables is proposed and theoretically validated when the frequency ratio is large. As a result, our method tends to be more powerful than existing methods, as confirmed in simulations.
    Date: 2018–09
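    The two aggregation schemes contrasted above, equal-weight time averaging versus MIDAS-style lag weighting, can be sketched in a few lines of Python. This is an illustrative sketch only: the exponential Almon weighting is a common MIDAS convention, and all parameter values and the frequency ratio are arbitrary assumptions, not taken from the paper.

```python
import numpy as np

def exp_almon_weights(m, theta1=0.1, theta2=-0.05):
    """Exponential Almon lag weights over m high-frequency lags, normalized to sum to 1."""
    k = np.arange(1, m + 1)
    w = np.exp(theta1 * k + theta2 * k ** 2)
    return w / w.sum()

rng = np.random.default_rng(0)
m = 12                      # illustrative frequency ratio, e.g. 12 months per year
x_hf = rng.normal(size=m)   # one low-frequency period of a high-frequency regressor

x_avg = x_hf.mean()                    # time averaging: equal weights 1/m
x_midas = exp_almon_weights(m) @ x_hf  # MIDAS: flexible, typically declining weights
```

    The specification test asks, in effect, whether the restriction from flexible MIDAS weights down to the equal weights of time averaging is supported by the data.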
  2. By: Tony Chernis; Rodrigo Sekkel
    Abstract: This paper studies short-term forecasting of Canadian real GDP and its expenditure components using combinations of nowcasts from different models. Starting with a medium-sized data set, we use a suite of common nowcasting tools for quarterly real GDP and its expenditure components. Using a two-step combination procedure, the nowcasts are first combined within model classes and then merged into a single point forecast using simple performance-based weighting methods. We find that no single model clearly dominates over all forecast horizons, subsamples and target variables. This highlights that when operating in an uncertain environment, where the choice of model is not clear, combining forecasts is a prudent strategy.
    Keywords: Econometric and statistical methods
    JEL: C53 E52 E37
    Date: 2018
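    The "simple performance-based weighting" in the second combination step can be illustrated with inverse-MSE weights, one standard choice in the forecast-combination literature. The error figures below are hypothetical, and the paper does not specify this exact scheme; this is only a minimal sketch of the idea.

```python
import numpy as np

def inverse_mse_weights(errors):
    """Performance-based combination weights: proportional to the inverse of each
    model's historical mean squared error, normalized to sum to 1."""
    mse = np.mean(np.asarray(errors) ** 2, axis=1)
    inv = 1.0 / mse
    return inv / inv.sum()

# hypothetical past nowcast errors for three models (rows) over eight quarters
errors = np.array([
    [0.2, -0.1, 0.3, 0.1, -0.2, 0.2, 0.1, -0.1],   # accurate model
    [0.5, -0.4, 0.6, 0.3, -0.5, 0.4, 0.5, -0.3],   # less accurate
    [1.0, -0.8, 0.9, 0.7, -1.1, 0.8, 1.0, -0.9],   # least accurate
])
w = inverse_mse_weights(errors)
nowcasts = np.array([2.1, 1.8, 1.2])   # current-quarter point nowcasts (hypothetical)
combined = w @ nowcasts                # single point nowcast
```

    Models with better track records receive larger weights, while the combined nowcast stays within the range of the individual nowcasts.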
  3. By: Cobb, Marcus P A
    Abstract: In some situations, forecasts for a number of sub-aggregations are required for analysis in addition to the aggregate itself. In this context, practitioners typically rely on bottom-up methods to produce a set of consistent forecasts and avoid conflicting messages. However, using this approach exclusively can reduce forecasting accuracy compared with other methods. This paper presents a method for increasing overall accuracy by jointly combining the forecasts for an aggregate, any sub-aggregations, and the components from any number of models and measurement approaches. The framework seeks to benefit from the strengths of each forecasting approach by accounting for their reliability in the combination process and by exploiting the constraints that the aggregation structure imposes on the set of forecasts as a whole. The results of the empirical application suggest that the method successfully allows the strengths of the better-performing approaches to improve the performance of the rest.
    Keywords: Bottom-up forecasting; Forecast combination; Hierarchical forecasting; Reconciling forecasts
    JEL: C53 E27 E37
    Date: 2018–08
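    To make the aggregation constraint concrete, here is a minimal reconciliation sketch: a direct aggregate forecast and component forecasts are adjusted so that the components sum to the aggregate, with the discrepancy spread in proportion to each forecast's assumed error variance. This is not the paper's combination method, only a simple example of exploiting the constraint; all numbers are hypothetical.

```python
import numpy as np

def reconcile(agg_forecast, comp_forecasts, comp_vars, agg_var):
    """Adjust an aggregate forecast and its component forecasts so that the
    components sum exactly to the aggregate. The aggregation discrepancy is
    allocated in proportion to each forecast's (assumed known) error variance,
    so less reliable forecasts absorb more of the adjustment."""
    comp_forecasts = np.asarray(comp_forecasts, dtype=float)
    comp_vars = np.asarray(comp_vars, dtype=float)
    gap = agg_forecast - comp_forecasts.sum()      # aggregation discrepancy
    total_var = agg_var + comp_vars.sum()
    agg_rec = agg_forecast - gap * agg_var / total_var
    comp_rec = comp_forecasts + gap * comp_vars / total_var
    return agg_rec, comp_rec

# hypothetical: direct aggregate forecast 10.0, components summing to only 9.0
agg_rec, comp_rec = reconcile(10.0, [3.0, 4.0, 2.0], [0.5, 0.5, 0.5], 1.0)
```

    After reconciliation the set of forecasts is internally consistent, so no conflicting messages arise between the aggregate and its breakdown.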
  4. By: Arthur T. Rego; Thiago R. dos Santos
    Abstract: In this work, we propose a model for estimating volatility from financial time series, extending the non-Gaussian family of state-space models with exact marginal likelihood proposed by Gamerman, Santos and Franco (2013). In the literature there are models focused on estimating financial asset risk; however, most of them rely on MCMC methods based on Metropolis algorithms, since the full conditional posterior distributions are not known. We present an alternative model capable of estimating the volatility automatically, since all full conditional posterior distributions are known and it is possible to obtain an exact sample of the parameters via the Gibbs sampler. The incorporation of jumps in returns allows the model to capture speculative movements in the data, so that their influence does not propagate to volatility. We evaluate the performance of the algorithm using synthetic and real time series.
    Keywords: Financial time series, Stochastic volatility, Gibbs Sampler, Dynamic linear models
    Date: 2018–08
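    The model structure described above, persistent log-volatility plus occasional jumps that enter returns directly rather than volatility, can be illustrated by simulating from a generic stochastic-volatility-with-jumps process. The parameter values are arbitrary, and this is a data-generating sketch, not the authors' Gibbs sampler.

```python
import numpy as np

rng = np.random.default_rng(42)
T = 1000
phi, mu, sigma_eta = 0.97, -1.0, 0.15   # illustrative persistence, level, vol-of-vol
p_jump, jump_scale = 0.01, 5.0          # illustrative jump probability and size

# AR(1) log-volatility process
h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.normal()

# jumps are added to returns, so speculative spikes do not feed back into h
jumps = rng.binomial(1, p_jump, T) * rng.normal(0.0, jump_scale, T)
returns = np.exp(h / 2) * rng.normal(size=T) + jumps
```

    Because the jump component sits in the observation equation, a large one-day move inflates the return without raising subsequent volatility, which is exactly the propagation the abstract says the model avoids.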
  5. By: Gomez-Gonzalez, Jose; Rojas-Espinosa, Wilmer
    Abstract: This study uses asymmetric DCC-GARCH models and copula functions to study exchange rate contagion in a group of twelve Asia-Pacific countries. Using daily data between November 1991 and March 2017, we show that extreme market movements are mainly associated with the high degree of interdependence among the countries in this region. Evidence of contagion is scarce, and asymmetries do not appear to be important. Specifically, currency co-movements are statistically identical during times of extreme market appreciation and depreciation, indicating that phenomena such as the "fear of appreciation" do not appear to be relevant in the region's foreign exchange markets.
    Keywords: Exchange rate contagion; Asian financial crisis; Copula functions; DCC-GARCH models
    JEL: C32 C51 E42
    Date: 2018–08–21
  6. By: Stanisław Drożdż; Rafał Kowalski; Paweł Oświęcimka; Rafał Rak; Robert Gębarowski
    Abstract: The concept of multifractality offers a powerful formal tool for filtering out the most relevant characteristics of complex time series. The related studies presented thus far in the scientific literature typically limit themselves to evaluating whether or not a time series is multifractal, with the width of the resulting singularity spectrum considered a measure of the degree of complexity involved. However, the complexity of time series generated by natural processes is usually far more intricate than such a bare statement can reflect. As an example, based on long-term records of the S&P500 and NASDAQ - the two leading world stock market indices - the present study shows that they do develop multifractal features, but these features evolve through a variety of shapes, most often strongly asymmetric, whose changes typically correlate with the historically most significant events experienced by the world economy. Relating the index multifractal singularity spectra to those of the component stocks, in turn, reflects the varying degree of correlation among the stocks.
    Date: 2018–09
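    The quantity discussed above, the singularity spectrum and its width, is usually obtained from the generalized Hurst exponents h(q) of a multifractal detrended fluctuation analysis (MFDFA). Below is a minimal MFDFA sketch; it is not the authors' code, and the white-noise input merely exercises the routine (a monofractal series should give h(q) roughly constant near 0.5).

```python
import numpy as np

def mfdfa_hq(x, scales, qs):
    """Minimal MFDFA: generalized Hurst exponents h(q) from linear-detrended
    segment fluctuations of the integrated profile of x."""
    profile = np.cumsum(x - x.mean())
    Fq = np.empty((len(qs), len(scales)))
    for j, s in enumerate(scales):
        n = len(profile) // s
        segs = profile[:n * s].reshape(n, s)
        t = np.arange(s)
        # residual variance after removing a linear trend from each segment
        var = np.array([np.var(seg - np.polyval(np.polyfit(t, seg, 1), t))
                        for seg in segs])
        for i, q in enumerate(qs):
            Fq[i, j] = (np.mean(var ** (q / 2)) ** (1 / q) if q != 0
                        else np.exp(0.5 * np.mean(np.log(var))))
    logs = np.log(scales)
    # h(q) is the log-log slope of the fluctuation function against scale
    return np.array([np.polyfit(logs, np.log(Fq[i]), 1)[0] for i in range(len(qs))])

rng = np.random.default_rng(1)
x = rng.normal(size=4096)                              # monofractal test input
hq = mfdfa_hq(x, scales=[16, 32, 64, 128, 256], qs=[-2, 2])
width_proxy = hq[0] - hq[1]   # h(-2) - h(2): a simple proxy for spectrum width
```

    A genuinely multifractal series yields h(q) varying strongly with q, and the asymmetry of the resulting spectrum is what the paper tracks through time.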
  7. By: Lin Fan; Peter W. Glynn; Markus Pelger
    Abstract: We investigate methods of change-point testing and confidence interval construction for nonparametric estimators of expected shortfall and related risk measures in weakly dependent time series. A key aspect of our work is the ability to detect general multiple structural changes in the tails of time series marginal distributions. Unlike extant approaches that detect tail structural changes through quantities such as the tail index, our approach does not require parametric modeling of the tail and detects more general changes. Additionally, our methods are based on the recently introduced self-normalization technique for time series, allowing for statistical analysis without the issues of consistent standard error estimation. The theoretical foundation for our methods is a set of functional central limit theorems, which we develop under weak assumptions. An empirical study of S&P 500 returns and US 30-Year Treasury bonds illustrates the practical use of our methods in detecting and quantifying market instability via the tails of financial time series during times of financial crisis.
    Date: 2018–09
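    The object being monitored here, a nonparametric expected shortfall estimator, is simple to write down. The sketch below computes it and contrasts the estimate before and after a candidate change point in simulated losses with a mid-sample variance break; the paper's actual test uses self-normalization rather than this naive split-sample comparison, and all data below are synthetic.

```python
import numpy as np

def expected_shortfall(losses, alpha=0.95):
    """Nonparametric expected shortfall: mean loss at or beyond the empirical
    alpha-quantile (the value-at-risk) of the sample."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

rng = np.random.default_rng(7)
# synthetic losses with a structural break in scale halfway through the sample
losses = np.concatenate([rng.normal(0, 1, 500), rng.normal(0, 3, 500)])

# naive split-sample contrast in tail risk at the candidate change point
es_diff = expected_shortfall(losses[500:]) - expected_shortfall(losses[:500])
```

    A persistent shift in this tail statistic, formally assessed with a self-normalized test statistic, is what flags market instability in the paper's empirical study.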
  8. By: Meyer-Gohde, Alexander; Neuhoff, Daniel
    Abstract: We relax the standard assumption in the dynamic stochastic general equilibrium (DSGE) literature that exogenous processes are governed by AR(1) processes, and instead estimate the ARMA(p,q) orders and parameters of exogenous processes. Methodologically, we contribute to the Bayesian DSGE literature by using Reversible Jump Markov Chain Monte Carlo (RJMCMC) to sample from the unknown ARMA orders and their associated parameter spaces of varying dimensions. In estimating the technology process in the neoclassical growth model using postwar US GDP data, we cast considerable doubt on the standard AR(1) assumption in favor of higher-order processes. We find that the posterior concentrates density on hump-shaped impulse responses for all endogenous variables, consistent with alternative empirical estimates and the rigidities behind many richer structural models. When sampling from noninvertible MA representations, a negative response of hours to a positive technology shock is contained within the posterior credible set. While the posterior contains significant uncertainty regarding the exact order, our results are insensitive to the choice of data filter; this contrasts with our ARMA estimates of GDP itself, which vary significantly depending on the choice of HP or first-difference filter.
    Keywords: Bayesian analysis, Dynamic stochastic general equilibrium model, Model evaluation, ARMA, Reversible Jump Markov Chain Monte Carlo
    JEL: C11 C32 C51 C52
    Date: 2018
  9. By: Meinen, Philipp; Röhe, Oke
    Abstract: Based on SVAR models identified by sign restrictions, we estimate the macroeconomic effects of financial and uncertainty shocks in the euro area and the US, paying particular attention to their effects on prices. While our results confirm that such disturbances are important drivers of output fluctuations in both economies, we find the shock responses of consumer prices to be ambiguous. Moreover, restricting prices to co-move with output can considerably attenuate the measured impact of financial and uncertainty shocks on real activity.
    Keywords: Financial Shocks, Uncertainty Shocks, Sign Restrictions, Euro Area, United States
    JEL: C11 C32 E32 E44
    Date: 2018
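    Sign-restriction identification, the core of the approach above, is commonly implemented by drawing random orthogonal rotations of the Cholesky factor of the reduced-form covariance matrix and keeping draws whose impact responses carry the required signs. The sketch below shows that accept/reject step on a hypothetical two-variable system; the covariance matrix and sign pattern are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def draw_sign_identified_impact(sigma, sign_pattern, rng, max_tries=10000):
    """Draw candidate structural impact matrices B = chol(Sigma) @ Q, with Q a
    random orthogonal rotation, and return the first draw whose impact responses
    match the requested sign pattern (np.nan entries are left unrestricted)."""
    P = np.linalg.cholesky(sigma)
    for _ in range(max_tries):
        Q, R = np.linalg.qr(rng.normal(size=sigma.shape))
        Q = Q @ np.diag(np.sign(np.diag(R)))   # normalize the QR sign convention
        B = P @ Q
        if np.all(np.isnan(sign_pattern) | (np.sign(B) == sign_pattern)):
            return B
    raise RuntimeError("no admissible rotation found")

rng = np.random.default_rng(3)
sigma = np.array([[1.0, 0.3],
                  [0.3, 1.0]])                 # illustrative reduced-form covariance
# first shock drives both variables down on impact; second shock unrestricted
pattern = np.array([[-1.0, np.nan],
                    [-1.0, np.nan]])
B = draw_sign_identified_impact(sigma, pattern, rng)
```

    Any accepted B reproduces the reduced-form covariance exactly (B Bᵀ = Σ), so the restrictions narrow the set of admissible structural shocks without changing the fit, which is why adding or dropping a price restriction can alter the measured real effects.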

This nep-ets issue is ©2018 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.