nep-ets New Economics Papers
on Econometric Time Series
Issue of 2018‒11‒12
ten papers chosen by
Jaqueson K. Galimberti
KOF Swiss Economic Institute

  1. Variational Inference for high dimensional structured factor copulas By Galeano San Miguel, Pedro; Ausín Olivera, María Concepción; Nguyen, Hoang
  2. Forecasting Realized Volatility Measures with Multivariate and Univariate Models: The Case of The US Banking Sector By Gianluca Cubadda; Alain Hecq; Antonio Riccardo
  3. Nowcasting the Unemployment Rate in the EU with Seasonal BVAR and Google Search Data By Anttonen, Jetro
  4. The information content of inflation swap rates for the long-term inflation expectations of professionals: Evidence from a MIDAS analysis By Hanoma, Ahmed; Nautz, Dieter
  5. Wavelet analysis for temporal disaggregation By Chiara Perricone
  6. Time-Frequency Response Analysis of Monetary Policy Transmission By Lubos Hanus; Lukas Vacha
  7. Consistent Non-Gaussian Pseudo Maximum Likelihood Estimators By Gabriele Fiorentini; Enrique Sentana
  8. Modeling Time-Variation Over the Business Cycle (1960-2017): An International Perspective By Martinez-Garcia, Enrique
  9. An intuitive method to improve the estimation of output gaps By Wilde, Wolfram; Beckmann, Joscha

  1. By: Galeano San Miguel, Pedro; Ausín Olivera, María Concepción; Nguyen, Hoang
    Abstract: Factor copula models have recently been proposed for describing the joint distribution of a large number of variables in terms of a few common latent factors. In this paper, we employ a Bayesian procedure for fast inference in multi-factor and structured factor copulas. To deal with the high dimensional structure, we apply a variational inference (VI) algorithm to estimate different specifications of factor copula models. Compared to the Markov chain Monte Carlo (MCMC) approach, the variational approximation is much faster and can handle sizeable problems in a few seconds. Another issue with factor copula models is that the bivariate copula functions connecting the variables are unknown in high dimensions. We derive an automatic procedure to recover the hidden dependence structure. Taking advantage of the posterior modes of the latent variables, we select the bivariate copula functions by minimizing the Bayesian information criterion (BIC). Simulation studies in different contexts show that this bivariate copula selection procedure is very accurate in recovering the true copula model. We illustrate the proposed procedure with two high dimensional real data sets.
    Keywords: Variational inference; Model selection; Factor copula
    Date: 2018–10
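The BIC-based selection step described in the abstract can be sketched as follows; the candidate families, parameter counts, and log-likelihood values below are illustrative placeholders, not taken from the paper:

```python
import math

def bic(loglik, n_params, n_obs):
    # BIC = k*ln(n) - 2*log-likelihood; lower is better
    return n_params * math.log(n_obs) - 2.0 * loglik

def select_copula(fits, n_obs):
    # fits maps a copula family name to (maximized log-likelihood, #parameters)
    return min(fits, key=lambda fam: bic(*fits[fam], n_obs))

# hypothetical pairwise fits for one edge of the factor structure, n = 500
fits = {"gaussian": (152.3, 1), "clayton": (149.8, 1), "student-t": (158.1, 2)}
best = select_copula(fits, 500)   # family minimizing BIC
```

In the paper the pairwise log-likelihoods are evaluated at the posterior modes of the latent factors; here they are plain placeholders.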
  2. By: Gianluca Cubadda (DEF & CEIS,University of Rome "Tor Vergata"); Alain Hecq (Maastricht University); Antonio Riccardo (ICE Data Services Italy)
    Abstract: This paper compares the forecasting performance of univariate and multivariate models for realized volatility series. We consider realized volatility measures of the returns of 13 major banks traded on the NYSE. Since our variables are characterized by long range dependence, we use several modelling approaches that are able to capture this feature. We use the forecasting accuracy of the considered models to draw inferences about the underlying mechanism that generated the volatilities of the assets. Our main conclusion is that the contagion effect among the considered volatilities is small or, at least, not well captured by the considered multivariate models.
    Keywords: Consumption,asymmetry,expectations,noisy information
    JEL: C32
    Date: 2018–10–30
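The abstract does not list the models compared; a common univariate benchmark for long-range dependence in realized volatility is the HAR regression of Corsi (2009), which projects next-day RV on daily, weekly, and monthly averages. A minimal sketch on a simulated stand-in series (the data and horizons are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
rv = np.abs(rng.standard_normal(300)) + 0.5   # stand-in for a realized-volatility series

def har_design(rv):
    # regress RV_{t+1} on daily, weekly (5-day), monthly (22-day) averages of RV
    d = rv[21:-1]
    w = np.array([rv[t - 4:t + 1].mean() for t in range(21, len(rv) - 1)])
    m = np.array([rv[t - 21:t + 1].mean() for t in range(21, len(rv) - 1)])
    X = np.column_stack([np.ones_like(d), d, w, m])
    y = rv[22:]
    return X, y

X, y = har_design(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
forecast = X[-1] @ beta   # one-step-ahead fitted value at the sample end
```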
  3. By: Anttonen, Jetro
    Abstract: In this paper a Bayesian vector autoregressive (BVAR) model for nowcasting the seasonally non-adjusted unemployment rate in EU countries is developed. On top of the official statistical releases, the model utilizes Google search data, and the effect of the Google data on the forecasting performance of the model is assessed. The Google data is found to yield modest improvements in the forecasting accuracy of the model. To the author’s knowledge, this is the first time the forecasting performance of Google search data has been studied in the context of Bayesian vector autoregressive models. This paper also adds to the empirical literature on hyperparameter choice in Bayesian vector autoregressive models. The hyperparameters are set at the mode of their posterior distribution, which is found to improve the out-of-sample forecasting accuracy of the model significantly compared to the rule-of-thumb values often used in the literature.
    Keywords: Nowcasting, Forecasting, BVAR, Big Data, Unemployment
    JEL: C32 C53 C82 E27
    Date: 2018–11–05
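Setting hyperparameters at the mode of their posterior is, under a flat hyperprior, equivalent to maximizing the marginal likelihood. The paper's BVAR setup is not reproduced here; a minimal conjugate-regression sketch of the idea, with an illustrative prior-tightness parameter tau chosen on a grid, is:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 120, 3
X = rng.standard_normal((n, k))
beta_true = np.array([0.8, 0.0, -0.5])
y = X @ beta_true + rng.standard_normal(n)

def log_marglik(tau, X, y):
    # y ~ N(0, tau^2 * X X' + I) after integrating out beta ~ N(0, tau^2 I), sigma = 1
    S = tau**2 * (X @ X.T) + np.eye(len(y))
    _, logdet = np.linalg.slogdet(S)
    return -0.5 * (logdet + y @ np.linalg.solve(S, y) + len(y) * np.log(2 * np.pi))

grid = np.linspace(0.05, 2.0, 40)
tau_hat = grid[np.argmax([log_marglik(t, X, y) for t in grid])]
```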
  4. By: Hanoma, Ahmed; Nautz, Dieter
    Abstract: Long-term inflation expectations taken from the Survey of Professional Forecasters are a major source of information for monetary policy. Unfortunately, they are published only on a quarterly basis. This paper investigates the daily information content of inflation-linked swap rates for the next survey outcome. Using a mixed data sampling approach, we find that professionals account for the daily dynamics of inflation swap rates when they submit their long-term inflation expectations. We propose a daily indicator of professionals' inflation expectations that outperforms alternative indicators that ignore the high-frequency dynamics of inflation swap rates. To illustrate the usefulness of the new indicator, we provide new evidence on the (re-)anchoring of U.S. inflation expectations.
    Keywords: Inflation Expectations Dynamics,Expectations Anchoring,MIDAS
    JEL: E31 E52 C22
    Date: 2018
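The MIDAS step maps many daily observations into one low-frequency regressor through a parsimonious lag polynomial. A sketch using exponential Almon weights (a standard MIDAS choice; the parameter values and the daily swap-rate series below are illustrative, not the paper's estimates):

```python
import numpy as np

def exp_almon_weights(n_lags, theta1=0.05, theta2=-0.01):
    # exponential Almon lag polynomial: w_j proportional to exp(theta1*j + theta2*j^2)
    j = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * j + theta2 * j**2)
    return w / w.sum()   # normalized to sum to one

# aggregate ~63 daily swap-rate observations into one quarterly MIDAS regressor
daily_swap = 2.0 + 0.1 * np.sin(np.arange(63) / 10.0)   # illustrative daily series
w = exp_almon_weights(63)
quarterly_regressor = w @ daily_swap[::-1]   # most recent day receives weight w[0]
```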
  5. By: Chiara Perricone (DEF,University of Rome "Tor Vergata")
    Abstract: A problem often faced by economic researchers is the interpolation or distribution of economic time series observed at low frequency into compatible higher frequency data. A method based on wavelet analysis is presented to temporally disaggregate time series. A standard `plausible' method is applied, not to the original time series, but to the smooth components resulting from a discrete wavelet transformation. This first step generates a smoothed component at the desired frequency. Subsequently, a noisy component is added to the smooth series to enforce the natural aggregation constraint of the series. The method is applied to national accounts for the Euro Area, to study both flow and stock variables, and it outperforms other standard methods, such as Stram and Wei or low pass interpolation, when the series of interest is volatile.
    Keywords: wavelet, temporal disaggregation, sector financial accounts
    JEL: C10 C65 C32 E32
    Date: 2018–10–29
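This is not the paper's algorithm, but the two-step idea (smooth at the target frequency, then add a component so the disaggregated values respect the low-frequency totals) can be illustrated with a one-level Haar approximation and an equal-spread constraint adjustment:

```python
import numpy as np

def haar_smooth(x):
    # one-level Haar approximation: pairwise averages, repeated back to full length
    a = 0.5 * (x[0::2] + x[1::2])
    return np.repeat(a, 2)

def enforce_annual_totals(q, annual):
    # spread each year's discrepancy equally over its 4 quarters
    q = q.reshape(-1, 4)
    q = q + (annual - q.sum(axis=1))[:, None] / 4.0
    return q.ravel()

indicator = np.array([1.0, 1.2, 1.1, 1.3, 1.4, 1.6, 1.5, 1.7])  # 2 years, quarterly
smooth = haar_smooth(indicator)
annual = np.array([4.8, 6.4])          # low-frequency totals to be respected
disagg = enforce_annual_totals(smooth, annual)
```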
  6. By: Lubos Hanus (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nabrezi 6, 111 01 Prague 1, Czech Republic; Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic, Pod Vodarenskou Vezi 4, 182 00, Prague, Czech Republic); Lukas Vacha (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nabrezi 6, 111 01 Prague 1, Czech Republic; Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic, Pod Vodarenskou Vezi 4, 182 00, Prague, Czech Republic)
    Abstract: In our study, we consider a new approach to quantifying the effects of economic shocks on monetary transmission. We analyse the widely known price puzzle phenomenon in a time-varying environment using frequency decomposition. We use the frequency response function to measure the power of shocks transferred to different economic cycles. Considering both the time and frequency domains, we quantify the dynamics of shocks implied by monetary policy within an economic system. Studying U.S. monetary policy transmission, the empirical evidence shows that low-frequency cycles of output are prevalent and have positive transfers. Examination of inflation reveals that the frequency responses vary significantly over time and alter the direction of transmission for all cyclical lengths.
    Keywords: cyclicality, frequency, economic systems, monetary policy
    Date: 2018–10
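The frequency response function used in the abstract measures how much of a shock an impulse response passes through at each cycle length. A minimal sketch with an illustrative geometric impulse response (not one estimated in the paper):

```python
import numpy as np

def frequency_response(irf, n_freq=128):
    # transfer function of a finite impulse response: H(w) = sum_k irf[k] e^{-iwk}
    omega = np.linspace(0, np.pi, n_freq)
    k = np.arange(len(irf))
    H = np.exp(-1j * np.outer(omega, k)) @ irf
    return omega, np.abs(H)   # gain at each frequency

irf = 0.8 ** np.arange(20)            # illustrative geometric impulse response
omega, gain = frequency_response(irf)
# for a persistent response, low frequencies (long cycles) carry the largest gain
```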
  7. By: Gabriele Fiorentini (Università di Firenze and RCEA); Enrique Sentana (CEMFI, Centro de Estudios Monetarios y Financieros)
    Abstract: We characterise the mean and variance parameters that distributionally misspecified maximum likelihood estimators can consistently estimate in multivariate conditionally heteroskedastic dynamic regression models. We also provide simple closed-form consistent estimators for the rest. The inclusion of means and the explicit coverage of multivariate models make our procedures useful not only for GARCH models but also in many empirically relevant macro and finance applications involving VARs and multivariate regressions. We study the statistical properties of our proposed consistent estimators, as well as their efficiency relative to Gaussian pseudo maximum likelihood procedures. Finally, we provide finite sample results through Monte Carlo simulations.
    Keywords: Consistency, efficiency, misspecification.
    JEL: C13 C22 C32 C51
    Date: 2018–01
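The paper's setting is dynamic and multivariate; the core consistency point can be seen in the simplest static case. Gaussian pseudo-ML estimators of the mean and variance (the sample moments) remain consistent when the data are in fact Student-t:

```python
import numpy as np

rng = np.random.default_rng(42)
nu = 5.0
# standardized Student-t draws: mean 0, variance 1, but non-Gaussian tails
x = rng.standard_t(nu, size=200_000) / np.sqrt(nu / (nu - 2.0))

# Gaussian pseudo-ML estimators of mean and variance are the sample moments
mu_pml = x.mean()
var_pml = x.var()
# despite the misspecified Gaussian likelihood, both converge to the truth (0 and 1)
```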
  8. By: Martinez-Garcia, Enrique (Federal Reserve Bank of Dallas)
    Abstract: In this paper, I explore the changes in international business cycles with quarterly data for the eight largest advanced economies (U.S., U.K., Germany, France, Italy, Spain, Japan, and Canada) since the 1960s. Using a time-varying parameter model with stochastic volatility for real GDP growth and inflation allows their dynamics to change over time, approximating nonlinearities in the data that otherwise would not be adequately accounted for with linear models (Granger et al. (1991), Granger (2008)). With that empirical model, I document a period of declining macro volatility since the 1980s, followed by increasing (and diverging) inflation volatility since the mid-1990s. I also find significant shifts in inflation persistence and cyclicality, as well as in macro synchronization and even forecastability. The 2008 global recession appears to have had an impact on some of this. I ground my empirical strategy on the reduced-form solution of the workhorse New Keynesian model and, motivated by theory, explore the relationship between greater trade openness (globalization) and the reported shifts in international business cycles. I show that globalization has sizeable (yet nonlinear) effects in the data consistent with the implications of the model—yet globalization’s contribution is not a foregone conclusion, depending crucially on more than the degree of openness of the international economy.
    Keywords: Great Moderation; Globalization; International Business Cycles; Stochastic Volatility; Time-Varying Parameters
    JEL: E31 E32 F41 F44
    Date: 2018–10–01
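The paper's TVP model with stochastic volatility is richer than can be shown here, but the basic time-varying-parameter building block, a regression coefficient evolving as a random walk tracked by the Kalman filter, can be sketched as follows (the noise variances q and r and the simulated data are illustrative):

```python
import numpy as np

def kalman_tvp(y, x, q=0.01, r=1.0):
    # y_t = b_t * x_t + e_t,  b_t = b_{t-1} + u_t  (random-walk coefficient)
    b, P = 0.0, 1.0
    path = []
    for yt, xt in zip(y, x):
        P = P + q                            # predict state variance
        k = P * xt / (xt * P * xt + r)       # Kalman gain
        b = b + k * (yt - xt * b)            # update coefficient estimate
        P = (1.0 - k * xt) * P
        path.append(b)
    return np.array(path)

rng = np.random.default_rng(3)
T = 400
x = rng.standard_normal(T)
b_true = np.linspace(0.0, 2.0, T)            # slowly drifting coefficient
y = b_true * x + 0.3 * rng.standard_normal(T)
b_hat = kalman_tvp(y, x)                     # filtered coefficient path
```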
  9. By: Wilde, Wolfram; Beckmann, Joscha
    Abstract: Standard procedures for output gap estimation, such as the Hodrick-Prescott filter or the production function method, suffer from a phase shift issue at the end of the sample. This often yields unstable and unreliable estimates of the current output gap. However, the current estimate of the output gap is the most relevant one for monetary and fiscal policymakers. The results from time series filters also lack an economic foundation and tend to produce economically implausible estimates of the output gap. This paper introduces and evaluates a new method which is able to reduce the uncertainty of output gap estimates at the end of a sample while allowing for an economic interpretation of the obtained estimate. Our estimates for 12 economies show that we are able to outperform the popular production function methodology (PF) when nowcasting the current output gap.
    Keywords: Output Gap,Policy Evaluation
    JEL: E52 E58
    Date: 2018
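The Hodrick-Prescott benchmark the authors compare against can be computed directly: the trend solves a penalized least-squares problem and the output gap is the residual cycle. A minimal dense-matrix implementation (the simulated log-GDP series is illustrative):

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    # trend solves (I + lam * D'D) tau = y, with D the second-difference operator
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)      # (n-2) x n second-difference matrix
    tau = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
    return tau, y - tau                       # trend, cycle (the "output gap")

# illustrative quarterly log-GDP: smooth growth plus a small cycle
y = np.log(np.linspace(100, 160, 80)) + 0.01 * np.sin(np.arange(80) / 3.0)
trend, cycle = hp_filter(y)
```

The end-of-sample instability the abstract refers to shows up here as revisions to the last few values of `cycle` when new observations are appended and the filter is rerun.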
  10. By: Xuexin Wang
    Abstract: For econometric models defined by conditional moment restrictions, it is well known that popular estimation methods such as the generalized method of moments and generalized empirical likelihood, when based on an arbitrary finite number of unconditional moment restrictions implied by the conditional ones, can render inconsistent estimates. To guarantee consistency, additional identifying assumptions on these unconditional moment restrictions have to be imposed. This paper introduces a simple consistent estimation procedure that does not assume identifying conditions on the implied unconditional moment restrictions. The procedure is based on a weighted L2 norm with a unique weighting function, in which a full continuum of unconditional moment restrictions is employed. It is easy to implement for any dimension of conditioning variables, and no user-chosen number is required. Furthermore, statistical inference is straightforward since the proposed estimator is asymptotically normal. Monte Carlo simulations demonstrate that the new estimator has excellent finite sample properties and outperforms its competitors in the cases we consider.
    Keywords: Characteristic function; A continuum of moments; Identification; Nonlinear Models; Nonintegrable weighting function
    JEL: C12 C22
    Date: 2018–10–29
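The paper's unique nonintegrable weighting function is not reproduced here; the general continuum-of-moments objective, illustrated with a Gaussian weight on a grid of t values for a simple linear model, looks like:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
x = rng.standard_normal(n)
y = 1.5 * x + rng.standard_normal(n)         # true theta = 1.5, E[e|x] = 0

t_grid = np.linspace(-3, 3, 61)
w = np.exp(-0.5 * t_grid**2)                 # illustrative integrable weight
E = np.exp(1j * np.outer(t_grid, x))         # e^{itx_i}, computed once

def objective(theta):
    # || E_n[(y - theta*x) e^{itx}] ||^2 integrated against w(t) on the grid
    m = E @ (y - theta * x) / n
    return np.sum(w * np.abs(m) ** 2)

thetas = np.linspace(0.0, 3.0, 301)
theta_hat = thetas[np.argmin([objective(th) for th in thetas])]
```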

This nep-ets issue is ©2018 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.