nep-ets New Economics Papers
on Econometric Time Series
Issue of 2013‒01‒19
six papers chosen by
Yong Yin
SUNY at Buffalo

  1. Coupling between time series: a network view By Saeed Mehraban; Amirhossein Shirazi; Maryam Zamani; Gholamreza Jafari
  2. Dynamics of episodic transient correlations in currency exchange rate returns and their predictability By Milan Žukovič
  3. A mixed splicing procedure for economic time series By Angel de la Fuente
  4. An Autocorrelated Loss Distribution Approach: back to the time series By Dominique Guegan; Bertrand Hassani
  5. The Estimation and Testing of a Linear Regression with Near Unit Root in the Spatial Autoregressive Error Term By Badi H. Baltagi; Chihwa Kao; Long Liu
  6. The Generalised Autocovariance Function By Tommaso Proietti; Alessandra Luati

  1. By: Saeed Mehraban; Amirhossein Shirazi; Maryam Zamani; Gholamreza Jafari
    Abstract: Recently, the visibility graph has been introduced as a novel way of analyzing a time series by mapping it to a complex network. In this paper, we introduce a new visibility algorithm, "cross-visibility", which reveals the conjugation of two coupled time series. The correspondence between the two time series is mapped to a network, "the cross-visibility graph", to demonstrate the correlation between them. We applied the algorithm to several correlated and uncorrelated time series generated by the linear stationary ARFIMA process. The results demonstrate that the cross-visibility graph associated with correlated time series exhibiting power-law auto-correlation is scale-free. If the time series are uncorrelated, the degree distribution of their cross-visibility network deviates from a power law. To further clarify the process, we applied the algorithm to real-world data from the financial trades of two companies and observed significant small-scale coupling in their dynamics.
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1301.1010&r=ets
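As a rough illustration of the idea, the standard natural visibility criterion translates directly into code. The `cross_visibility_edges` variant below is only one plausible reading of the paper's cross-visibility construction, since the abstract does not spell out the exact criterion; all function names are ours.

```python
import numpy as np

def visibility_edges(y):
    """Natural visibility graph of a time series: points (a, y[a]) and
    (b, y[b]) are linked if every intermediate point lies strictly
    below the straight line joining them."""
    n = len(y)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            # height of the line from (a, y[a]) to (b, y[b]) at time c
            if all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges

def cross_visibility_edges(x, y):
    """A hypothetical cross-visibility criterion (the abstract does not
    define it): a point a of series x sees a point b of series y if the
    line from (a, x[a]) to (b, y[b]) stays above the intermediate values
    of *both* series. Purely illustrative."""
    n = min(len(x), len(y))
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            line = lambda c: y[b] + (x[a] - y[b]) * (b - c) / (b - a)
            if all(x[c] < line(c) and y[c] < line(c)
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges
```

With coupled series one would then compare the degree distribution of the resulting graph against a power law, as the paper does for ARFIMA-generated data.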
  2. By: Milan Žukovič
    Abstract: We study the dynamics of linear and non-linear serial dependencies in financial time series in a rolling-window framework. In particular, we focus on the detection of episodes of statistically significant two- and three-point correlations in the returns of several leading currency exchange rates that could offer some potential for their predictability. We employ a rolling-window approach in order to capture the correlation dynamics for different window lengths and analyze the distributions of periods with statistically significant correlations. We find that for sufficiently large window lengths these distributions are well described by a power law. We also measure the predictability itself by a hit rate, i.e., the rate of consistency between the signs of the actual returns and their predictions obtained from a simple correlation-based predictor. We find that during these relatively brief periods the returns are predictable to a certain degree, and that the predictability depends on the choice of the window length.
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1301.1893&r=ets
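The rolling-window detection of significant correlations and the sign-based hit rate can be sketched as follows. The lag-1 statistic and the ±1.96/√w white-noise band are illustrative choices standing in for the paper's actual two- and three-point tests.

```python
import numpy as np

def rolling_autocorr(r, w, lag=1):
    """Lag-`lag` sample autocorrelation of returns `r` in each rolling
    window of length `w` (one value per window)."""
    out = []
    for t in range(w, len(r) + 1):
        x = r[t - w:t]
        x = x - x.mean()
        denom = (x * x).sum()
        out.append((x[lag:] * x[:-lag]).sum() / denom if denom > 0 else 0.0)
    return np.array(out)

def significant_episodes(rho, w, z=1.96):
    """Windows whose correlation exceeds the approximate white-noise
    band +/- z / sqrt(w)."""
    return np.abs(rho) > z / np.sqrt(w)

def hit_rate(r, w):
    """Sign consistency of a naive predictor: forecast the next return
    as sign(rho * r[t-1]), using the correlation of the window ending
    at t."""
    rho = rolling_autocorr(r, w)
    hits, total = 0, 0
    for i, t in enumerate(range(w, len(r))):
        pred = np.sign(rho[i] * r[t - 1])
        if pred != 0 and r[t] != 0:
            hits += int(pred == np.sign(r[t]))
            total += 1
    return hits / total if total else float("nan")
```

A perfectly anti-persistent series (alternating signs) yields strongly negative window correlations and a hit rate of 1, illustrating the link the abstract draws between detected correlations and predictability.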
  3. By: Angel de la Fuente
    Abstract: This note develops a flexible methodology for splicing economic time series that avoids the extreme assumptions implicit in the procedures most commonly used in the literature. It allows the user to split the required correction to the older of the linked series between its levels and growth rates, on the basis of what is known or conjectured about the persistence of the factors that account for the discrepancy between the two series at their linking point. The time profile of the correction is derived from the assumption that the error in the older series reflects inadequate coverage of emerging sectors or activities that grow faster than the aggregate.
    Keywords: linking, splicing, economic series
    JEL: C82 E01
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:bbv:wpaper:1302&r=ets
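A minimal sketch of such a mixed splicing rule, assuming a single link point at the end of the old series. The geometric phase-in and the parameter `theta` (the share of the discrepancy assigned to the level of the old series) are our stand-ins for the note's actual time profile, not its formulas.

```python
import numpy as np

def mixed_splice(old, new_at_link, theta):
    """Mixed level/growth-rate splicing sketch. `theta` = 1 rescales the
    whole old series (pure retropolation: all of the correction goes to
    levels); `theta` = 0 keeps the earliest level and phases the full
    correction in geometrically, so it shows up in growth rates, which
    matches the idea that under-covered fast-growing sectors make the
    error build up gradually over time."""
    old = np.asarray(old, dtype=float)
    ratio = new_at_link / old[-1]          # discrepancy at the link point
    t = np.arange(len(old)) / (len(old) - 1)
    factor = ratio ** (theta + (1 - theta) * t)
    return old * factor
```

For example, with an old series ending at 121 and a new series reading 133.1 at the link, `theta = 0` leaves the first observation untouched and raises the last to 133.1, distributing the 10% correction across the intervening growth rates.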
  4. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon Sorbonne); Bertrand Hassani (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon Sorbonne)
    Abstract: The Advanced Measurement Approach requires financial institutions to develop internal models to evaluate their capital charges. Traditionally, the Loss Distribution Approach (LDA) is used, combining frequencies and severities to build a Loss Distribution Function (LDF). This distribution represents annual losses; consequently, the 99.9th percentile of the distribution, which provides the capital charge, denotes the worst year in a thousand. The current approach suggested by the regulator and implemented in financial institutions assumes the independence of the losses. In this paper, we propose a solution to the issues arising when autocorrelations are detected between the losses. Our approach treats the losses as time series: the losses are aggregated periodically, time series models (AR, ARFI, and Gegenbauer processes) are fitted to the resulting series, and a distribution is fitted to the residuals. Finally, a Monte Carlo simulation is used to construct the LDF, and the pertaining risk measures are evaluated. In order to show the impact of the choice of internal models on the capital charges, the paper draws a parallel between the static traditional approach and an appropriate dynamical modelling. If, when implementing the traditional LDA, no particular distribution proves adequate to the data - as the goodness-of-fit tests reject them - retaining the LDA modelling corresponds to an arbitrary choice. We suggest instead an alternative and robust approach. For the two data sets explored in this paper, the proposed strategies relax the independence assumption and capture the autocorrelations within the losses through time series modelling. The construction of the related LDF enables the computation of the capital charge and therefore permits complying with the regulation while taking into account at the same time the large losses (through adequate distributions on the residuals) and the correlations between losses (through the time series modelling).
    Keywords: Operational risk, time series, Gegenbauer processes, Monte Carlo, risk measures.
    Date: 2012–12
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00771387&r=ets
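The pipeline described above - aggregate losses, fit a time series model, fit the residuals, then Monte Carlo the annual LDF - can be sketched with an AR(1) and residual resampling standing in for the paper's AR/ARFI/Gegenbauer fits and parametric residual distributions. All numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy monthly aggregated losses with AR(1) dependence: a stand-in for
# operational-loss data, with AR(1) as the simplest member of the
# AR / ARFI / Gegenbauer family the authors consider.
true_phi, n = 0.5, 240
innov = rng.lognormal(mean=0.0, sigma=0.5, size=n)
losses = np.empty(n)
losses[0] = innov[0]
for t in range(1, n):
    losses[t] = true_phi * losses[t - 1] + innov[t]

# 1. Fit an AR(1) with intercept by least squares.
X = np.column_stack([np.ones(n - 1), losses[:-1]])
c, phi = np.linalg.lstsq(X, losses[1:], rcond=None)[0]
resid = losses[1:] - X @ np.array([c, phi])

# 2. Monte Carlo: simulate years of 12 monthly losses by pushing
#    resampled residuals through the fitted dynamics, building the
#    annual loss distribution.
n_sims, horizon = 20_000, 12
annual = np.empty(n_sims)
for s in range(n_sims):
    e = rng.choice(resid, size=horizon, replace=True)
    prev, total = losses[-1], 0.0
    for t in range(horizon):
        prev = c + phi * prev + e[t]
        total += prev
    annual[s] = total

# 3. Capital charge as the 99.9% quantile of the annual LDF.
var_999 = np.quantile(annual, 0.999)
print(f"phi = {phi:.2f}, 99.9% annual VaR = {var_999:.1f}")
```

Replacing the resampling with a fitted parametric residual distribution, and the AR(1) with a long-memory process, recovers the structure the abstract describes.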
  5. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244-1020); Chihwa Kao (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244-1020); Long Liu (Department of Economics, College of Business, University of Texas at San Antonio)
    Abstract: This paper considers the estimation of a linear regression with a spatial autoregressive (SAR) error term that is nearly nonstationary. The asymptotic properties of the ordinary least squares (OLS), true generalized least squares (GLS) and feasible generalized least squares (FGLS) estimators, as well as the corresponding Wald test statistics, are derived. Monte Carlo experiments are conducted to study the sampling behavior of the proposed estimators and test statistics.
    Keywords: Spatial Autocorrelation, Ordinary Least Squares, Generalized Least Squares, Two-stage Least Squares, Maximum Likelihood Estimation
    JEL: C23 C33
    Date: 2012–12
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:150&r=ets
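To illustrate the OLS-versus-GLS comparison with a near-unit-root SAR error, here is a toy simulation; the circular weight matrix and lambda = 0.95 are illustrative choices only, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100

# Circular row-normalised weight matrix: each unit's neighbours are
# the two adjacent units.
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

lam = 0.95                       # SAR coefficient near the unit root
beta = np.array([1.0, 2.0])
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
eps = rng.standard_normal(n)
u = np.linalg.solve(np.eye(n) - lam * W, eps)   # u = (I - lam W)^{-1} eps
y = X @ beta + u

# OLS ignores the spatial error structure.
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# True GLS: Omega^{-1} is proportional to (I - lam W)'(I - lam W), so
# whitening both sides by A = I - lam W and running OLS is equivalent.
A = np.eye(n) - lam * W
b_gls = np.linalg.lstsq(A @ X, A @ y, rcond=None)[0]
print("OLS:", b_ols, "GLS:", b_gls)
```

Repeating this over many draws while letting lambda approach 1 with the sample size is the kind of Monte Carlo exercise the paper formalizes.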
  6. By: Tommaso Proietti; Alessandra Luati
    Abstract: The generalised autocovariance function is defined for a stationary stochastic process as the inverse Fourier transform of the power transformation of the spectral density function. Depending on the value of the transformation parameter, this function nests the inverse and the traditional autocovariance functions. A frequency domain non-parametric estimator based on the power transformation of the pooled periodogram is considered and its asymptotic distribution is derived. The results are employed to construct classes of tests of the white noise hypothesis, for clustering and discrimination of stochastic processes and to introduce a novel feature matching estimator of the spectrum.
    Keywords: Stationary Gaussian processes, non-parametric spectral estimation, white noise tests, feature matching, discriminant analysis
    JEL: C14 C22
    Date: 2012–06–06
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:43711&r=ets
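The definition translates almost directly into code: raise a spectral estimate to the power lambda and invert the transform. The sketch below uses the raw periodogram and omits the pooling and asymptotic theory developed in the paper.

```python
import numpy as np

def gacv(x, lam, max_lag=10):
    """Generalised autocovariance sketch: inverse DFT of the periodogram
    raised to the power `lam`. lam = 1 recovers the usual sample
    autocovariances (the lag-0 value equals the sample variance);
    lam = -1 targets the inverse autocovariances."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # periodogram at the Fourier frequencies
    per = np.abs(np.fft.fft(x - x.mean())) ** 2 / n
    # the zero frequency of a demeaned series is exactly 0, which would
    # break negative powers; replace it with the mean of the rest
    per[0] = per[1:].mean()
    c = np.fft.ifft(per ** lam).real
    return c[:max_lag + 1]
```

The white-noise tests in the paper are built from such estimates evaluated at several lags and transformation parameters.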

This nep-ets issue is ©2013 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.