nep-ets New Economics Papers
on Econometric Time Series
Issue of 2014‒01‒10
seven papers chosen by
Yong Yin
SUNY at Buffalo

  1. Exact Simulation of Non-stationary Reflected Brownian Motion By Mohammad Mousavi; Peter W. Glynn
  2. Empirical Study of the GARCH model with Rational Errors By Ting Ting Chen; Tetsuya Takaishi
  3. How much random does European Union walk? A time-varying long memory analysis By A. Sensoy; Benjamin Miranda Tabak
  4. Inference Based on SVARs Identified with Sign and Zero Restrictions: Theory and Applications By Arias, Jonas E.; Rubio-Ramírez, Juan F.; Waggoner, Daniel F.
  5. Market Efficiency, Roughness and Long Memory in the PSI20 Index Returns: Wavelet and Entropy Analysis By Rui Pascoal; Ana Margarida Monteiro
  6. The Co-Movement and Causality between the U.S. Real Estate and Stock Markets in the Time and Frequency Domains By Tsangyao Chang; Xiao-lin Li; Stephen M. Miller; Mehmet Balcilar; Rangan Gupta
  7. Bayesian Inference and Forecasting in the Stationary Bilinear Model By Roberto Leon-Gonzalez; Fuyu Yang

  1. By: Mohammad Mousavi; Peter W. Glynn
    Abstract: This paper develops the first method for the exact simulation of reflected Brownian motion (RBM) with non-stationary drift and infinitesimal variance. The running time of generating exact samples of non-stationary RBM at any time $t$ is uniformly bounded by $\mathcal{O}(1/\bar\gamma^2)$, where $\bar\gamma$ is the average drift of the process. The method can be used as a guide for planning simulations of complex queueing systems with non-stationary arrival rates and/or service times.
    Date: 2013–12
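For intuition only, the process in question can be approximated with a naive Euler discretization of Brownian motion reflected at zero. This sketch is biased and is emphatically not the exact algorithm the paper develops; the drift and variance functions are illustrative placeholders.

```python
import math
import random

def simulate_rbm(mu, sigma, t_end, n_steps, seed=None):
    """Euler scheme for Brownian motion reflected at zero, with
    time-varying drift mu(t) and standard deviation sigma(t).
    This discretization is biased: it only illustrates the process
    that the paper shows how to sample exactly."""
    rng = random.Random(seed)
    dt = t_end / n_steps
    x, path = 0.0, [0.0]
    for k in range(n_steps):
        t = k * dt
        z = rng.gauss(0.0, 1.0)
        x = max(x + mu(t) * dt + sigma(t) * math.sqrt(dt) * z, 0.0)  # reflect at 0
        path.append(x)
    return path

# Illustrative non-stationary drift; the average drift (gamma-bar in the
# paper's running-time bound) would be the time average of mu.
path = simulate_rbm(mu=lambda t: -0.5 - 0.1 * t, sigma=lambda t: 1.0,
                    t_end=10.0, n_steps=1000, seed=42)
```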
  2. By: Ting Ting Chen; Tetsuya Takaishi
    Abstract: We use a GARCH model with a fat-tailed error distribution described by a rational function and apply it to stock price data from the Tokyo Stock Exchange. To determine the model parameters we perform Bayesian inference, implemented by the Metropolis-Hastings algorithm with an adaptive multi-dimensional Student's t-proposal density. To compare the model with the GARCH model with standard normal errors we calculate two information criteria, AIC and DIC, and find that both favor the GARCH model with a rational error distribution. We also assess the accuracy of the volatility estimates against the realized volatility and find good accuracy for the GARCH model with a rational error distribution. We conclude that the GARCH model with a rational error distribution is superior to the GARCH model with normal errors and can serve as an alternative to GARCH models with other fat-tailed distributions.
    Date: 2013–12
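The estimation strategy can be sketched with a Gaussian-error GARCH(1,1) likelihood and a plain random-walk Metropolis sampler. The paper's rational error density and adaptive Student-t proposal are not reproduced here, and the data below are simulated placeholders.

```python
import math
import random

def garch_loglik(params, returns):
    """Gaussian log-likelihood of a GARCH(1,1) model:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
    (The paper replaces the normal density with a rational fat-tailed
    one; only the per-observation density term would change.)"""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return -math.inf                     # outside the stationarity region
    sigma2 = omega / (1 - alpha - beta)      # start at the unconditional variance
    ll = 0.0
    for r in returns:
        ll -= 0.5 * (math.log(2 * math.pi * sigma2) + r * r / sigma2)
        sigma2 = omega + alpha * r * r + beta * sigma2
    return ll

def metropolis(returns, n_iter=500, seed=0):
    """Random-walk Metropolis with a flat prior and a fixed Gaussian
    proposal (the paper uses an adaptive Student-t proposal instead)."""
    rng = random.Random(seed)
    theta = [0.1, 0.1, 0.8]
    ll = garch_loglik(theta, returns)
    draws = []
    for _ in range(n_iter):
        prop = [p + rng.gauss(0.0, 0.02) for p in theta]
        ll_prop = garch_loglik(prop, returns)
        if math.log(rng.random()) < ll_prop - ll:  # accept/reject step
            theta, ll = prop, ll_prop
        draws.append(tuple(theta))
    return draws

rng = random.Random(1)
data = [rng.gauss(0.0, 1.0) for _ in range(200)]   # simulated placeholder returns
draws = metropolis(data)
```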
  3. By: A. Sensoy; Benjamin Miranda Tabak
    Abstract: This paper proposes a new efficiency index to model time-varying inefficiency in stock markets. We focus on European stock markets and show that they have different degrees of time-varying efficiency. We observe that the 2008 global financial crisis has had an adverse effect on almost all EU stock markets. However, the Eurozone sovereign debt crisis has had a significant adverse effect only on the markets in France, Spain and Greece. For the late-joining members, joining the EU does not have a uniform effect on stock market efficiency. Our results have important implications for policy makers, investors, risk managers and academics.
    Date: 2013–12
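A classical building block for this kind of efficiency analysis is the rescaled-range (R/S) estimate of the Hurst exponent; applied over rolling windows, it yields a time-varying picture. The sketch below is the textbook estimator, not the specific index the paper proposes.

```python
import math
import random
import statistics

def rs_hurst(series, window_sizes=(8, 16, 32, 64)):
    """Rescaled-range (R/S) estimate of the Hurst exponent.
    H near 0.5 suggests a random walk (consistent with weak-form
    efficiency); H > 0.5 suggests long memory. This is the classic
    estimator, not the time-varying index the paper proposes."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(series) - n + 1, n):
            chunk = series[start:start + n]
            mean = statistics.fmean(chunk)
            cums, c = [], 0.0
            for x in chunk:                       # cumulative deviations from mean
                c += x - mean
                cums.append(c)
            s = statistics.pstdev(chunk)
            if s > 0:
                rs_vals.append((max(cums) - min(cums)) / s)
        if rs_vals:
            log_n.append(math.log(n))
            log_rs.append(math.log(statistics.fmean(rs_vals)))
    # H is the least-squares slope of log(R/S) on log(window size)
    mx, my = statistics.fmean(log_n), statistics.fmean(log_rs)
    num = sum((x - mx) * (y - my) for x, y in zip(log_n, log_rs))
    den = sum((x - mx) ** 2 for x in log_n)
    return num / den

rng = random.Random(7)
returns = [rng.gauss(0.0, 1.0) for _ in range(512)]   # i.i.d. noise: no long memory
h = rs_hurst(returns)
```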
  4. By: Arias, Jonas E.; Rubio-Ramírez, Juan F.; Waggoner, Daniel F.
    Abstract: Are optimism shocks an important source of business cycle fluctuations? Are deficit-financed tax cuts better than deficit-financed spending to increase output? These questions have been previously studied using SVARs identified with sign and zero restrictions, and the answers have been positive and definite in both cases. While the identification of SVARs with sign and zero restrictions is theoretically attractive because it allows the researcher to remain agnostic with respect to the responses of the key variables of interest, we show that the current implementation of these techniques does not respect the agnosticism of the theory. These algorithms impose additional sign restrictions on seemingly unrestricted variables, which bias the results and produce misleading confidence intervals. We provide an alternative and efficient algorithm that does not introduce any additional sign restrictions, hence preserving the agnosticism of the theory. Without the additional restrictions, it is hard to support the claim either that optimism shocks are an important source of business cycle fluctuations or that deficit-financed tax cuts work best at improving output. Our algorithm is not only correct but also faster than current ones.
    Date: 2014–01
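The accept/reject step behind sign-restriction identification can be sketched in a two-variable case: draw a Haar-distributed rotation, rotate the impact matrix, and keep the draw only if the restrictions hold. The restriction imposed below is illustrative (not one from the paper), and the zero-restriction machinery that the paper analyzes is omitted.

```python
import math
import random

def random_orthogonal_2x2(rng):
    """Gram-Schmidt on a 2x2 Gaussian matrix yields a Haar-distributed
    orthogonal matrix, the standard rotation draw in sign-restriction
    algorithms. Returns the two orthonormal columns."""
    a = [rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)]
    na = math.hypot(a[0], a[1])
    q1 = [a[0] / na, a[1] / na]
    b = [rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)]
    d = b[0] * q1[0] + b[1] * q1[1]
    r = [b[0] - d * q1[0], b[1] - d * q1[1]]
    nr = math.hypot(r[0], r[1])
    return q1, [r[0] / nr, r[1] / nr]

def draw_with_sign_restriction(B, rng, max_tries=1000):
    """Accept a rotation only if the impact response of the first shock,
    B @ q1, moves both variables up (an illustrative restriction, not
    one from the paper)."""
    for _ in range(max_tries):
        q1, _ = random_orthogonal_2x2(rng)
        col = [B[0][0] * q1[0] + B[0][1] * q1[1],
               B[1][0] * q1[0] + B[1][1] * q1[1]]
        if col[0] < 0 and col[1] < 0:        # sign normalization: flip the shock
            col = [-col[0], -col[1]]
        if col[0] > 0 and col[1] > 0:
            return col
    return None                              # no draw satisfied the restriction

B = [[1.0, 0.0], [0.5, 1.0]]                 # illustrative impact matrix
col = draw_with_sign_restriction(B, random.Random(0))
```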
  5. By: Rui Pascoal (Faculty of Economics, University of Coimbra, Portugal); Ana Margarida Monteiro (GEMF/Faculty of Economics, University of Coimbra, Portugal)
    Abstract: In this study, features of the financial returns of the PSI20 index related to market efficiency are captured using wavelet- and entropy-based techniques. This characterization covers the following points. First, the detection of long memory, associated with low frequencies, and a global measure of the time series: the Hurst exponent, estimated by several methods including wavelets. Second, the degree of roughness, or regularity variation, associated with the Hölder exponent and the fractal dimension, with estimation based on the multifractal spectrum. Finally, the degree of unpredictability of the series, estimated by approximate entropy. These aspects may also be studied through the concepts of non-extensive entropy and distribution using, for instance, the Tsallis q-triplet. Together they allow one to study the existence of efficiency in the financial market. The study of local roughness, in turn, is performed by considering wavelet-leaders-based entropy: the wavelet coefficients are computed from a multiresolution analysis, and the wavelet leaders are defined as the local suprema of these coefficients near the point under consideration. The resulting entropy is more accurate in that detection than the Hölder exponent. These procedures enhance the capacity to identify the occurrence of financial crashes.
    Keywords: efficiency, long memory, fractal dimension, unpredictability, q-triplet, entropy, wavelets.
    JEL: C22 C61 G14
    Date: 2013–12
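Approximate entropy, one of the unpredictability measures the abstract mentions, can be computed directly from its definition (Pincus). The tolerance r and embedding dimension m below are conventional illustrative choices, and the data are simulated.

```python
import math
import random

def approx_entropy(series, m=2, r=0.1):
    """Approximate entropy (Pincus): near zero for regular series,
    larger for unpredictable ones. r is the matching tolerance."""
    def phi(k):
        n = len(series) - k + 1
        templates = [series[i:i + k] for i in range(n)]
        total = 0.0
        for t1 in templates:
            count = sum(1 for t2 in templates
                        if max(abs(a - b) for a, b in zip(t1, t2)) <= r)
            total += math.log(count / n)     # the self-match keeps count >= 1
        return total / n
    return phi(m) - phi(m + 1)

periodic = [float(i % 2) for i in range(60)]          # perfectly regular
rng = random.Random(5)
noisy = [rng.random() for _ in range(60)]             # unpredictable
```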
  6. By: Tsangyao Chang (Feng Chia University); Xiao-lin Li (Wuhan University); Stephen M. Miller (University of Nevada, Las Vegas and University of Connecticut); Mehmet Balcilar (Eastern Mediterranean University); Rangan Gupta (University of Pretoria)
    Abstract: This study applies wavelet analysis to examine the relationship between the U.S. real estate and stock markets over the period 1890-2012. Wavelet analysis allows the simultaneous examination of co-movement and causality between the two markets in both the time and frequency domains. Our findings provide robust evidence that co-movement and causality vary across frequencies and evolve with time. Examining market co-movement in the time domain, the two markets exhibit positive co-movement over recent decades, except for 1998-2002, when a high negative co-movement emerged. In the frequency domain, the two markets correlate with each other mainly at low frequencies (longer term), except in the second half of the 1900s as well as in 1998-2002, when the two markets correlate at high frequencies (shorter term). In addition, we find that the causal effects between the markets in the frequency domain occur generally at low frequencies (longer term). In the time domain, the time-varying nature of long-run causalities implies structural changes in the two markets. These findings provide a more complete picture of the relationship between the U.S. real estate and stock markets over time and frequency, offering important implications for policymakers and practitioners.
    Keywords: stock market; real estate market; wavelet analysis; frequency domain; time domain
    JEL: C49 E44 G11
    Date: 2013–12
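A crude stand-in for frequency-domain co-movement is to correlate Haar wavelet detail coefficients scale by scale: high correlation at coarse levels corresponds to long-term co-movement. This is not the continuous wavelet coherence the paper uses, and the data below are simulated.

```python
import random

def haar_details(x):
    """Full Haar decomposition: detail coefficients per level,
    finest scale first. Length of x must be a power of two."""
    details, approx = [], list(x)
    while len(approx) > 1:
        half = len(approx) // 2
        details.append([(approx[2 * i] - approx[2 * i + 1]) / 2 ** 0.5
                        for i in range(half)])
        approx = [(approx[2 * i] + approx[2 * i + 1]) / 2 ** 0.5
                  for i in range(half)]
    return details

def corr(a, b):
    """Pearson correlation, written out for portability."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    da = sum((u - ma) ** 2 for u in a) ** 0.5
    db = sum((v - mb) ** 2 for v in b) ** 0.5
    return num / (da * db)

def scale_correlations(x, y):
    """Correlation of detail coefficients level by level, skipping the
    coarsest levels where too few coefficients remain."""
    return [corr(dx, dy)
            for dx, dy in zip(haar_details(x), haar_details(y))
            if len(dx) > 2]

rng = random.Random(11)
x = [rng.gauss(0.0, 1.0) for _ in range(64)]
y = [v + 0.1 * rng.gauss(0.0, 1.0) for v in x]        # y co-moves with x
cors = scale_correlations(x, y)
```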
  7. By: Roberto Leon-Gonzalez (National Graduate Institute for Policy Studies); Fuyu Yang (University of East Anglia)
    Abstract: A stationary bilinear (SB) model can be used to describe processes with a time-varying degree of persistence that depends on past shocks. The SB model has been used to model highly persistent but stationary macroeconomic time series such as inflation. This study develops methods for Bayesian inference, model comparison, and forecasting in the SB model. Using U.K. inflation data, we find that the SB model outperforms the random walk and the first-order autoregressive (AR(1)) model in terms of root mean squared forecast errors for one-step-ahead out-of-sample forecasts. In addition, the SB model is superior to these two models in terms of predictive likelihood for almost all of the forecast observations.
    Date: 2014–01
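One common stationary bilinear specification can be simulated in a few lines, which makes the shock-dependent persistence concrete. The paper's exact parameterization may differ, and the coefficients below are illustrative.

```python
import random

def simulate_bilinear(a, b, n, seed=None):
    """Simulate y_t = a*y_{t-1} + b*y_{t-1}*e_{t-1} + e_t with
    e_t ~ N(0, 1): the effective AR coefficient a + b*e_{t-1} moves
    with past shocks, giving time-varying persistence. (One common
    bilinear form; the paper's exact specification may differ.)"""
    rng = random.Random(seed)
    y, e_prev, path = 0.0, 0.0, []
    for _ in range(n):
        e = rng.gauss(0.0, 1.0)
        y = a * y + b * y * e_prev + e
        path.append(y)
        e_prev = e
    return path

# Illustrative coefficients inside the stationarity region (a^2 + b^2 < 1
# when the shock variance is 1).
path = simulate_bilinear(a=0.7, b=0.1, n=500, seed=3)
```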

This nep-ets issue is ©2014 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.