nep-ecm New Economics Papers
on Econometrics
Issue of 2013‒07‒28
fourteen papers chosen by
Sune Karlsson
Orebro University

  1. Robust Estimation and Inference for Threshold Models with Integrated Regressors By Haiqiang Chen
  2. A wavelet approach to multiple cointegration testing By Javier Fernandez-Macho
  3. Estimation and Inference for Varying-coefficient Models with Nonstationary Regressors using Penalized Splines By Haiqiang Chen; Ying Fang; Yingxing Li
  4. A Test for the Null of Multiple Cointegrating Vectors By Javier Fernandez-Macho
  5. Selection Criteria in Regime Switching Conditional Volatility Models By Thomas Chuffart
  6. Time series models with an EGB2 conditional distribution By M. Caivano; A. Harvey
  7. The Geometric Chain-Ladder By D Kuang; Bent Nielsen; J P Nielsen
  8. Are benefits from oil-stocks diversification gone? New evidence from dynamic copulas and high-frequency data By Krenar Avdulaj; Jozef Barunik
  9. Mixed-correlated ARFIMA processes for power-law cross-correlations By Ladislav Kristoufek
  10. Step-indicator Saturation By David Hendry; Jurgen A. Doornik; Felix Pretis
  11. Separating the impact of macroeconomic variables and global frailty in event data By James Wolter
  12. Bootstrapping and conditional simulation in Kriging: Better confidence intervals and optimization By Mehdad, E.; Kleijnen, Jack P.C.
  13. Where Do Thin Tails Come From? By Nassim Nicholas Taleb
  14. We propose a technique to avoid spurious detections of jumps in high-frequency data via an explicit thresholding on available test statistics. By Pierre BAJGROWICZ; Olivier SCAILLET

  1. By: Haiqiang Chen
    Abstract: This paper studies the robust estimation and inference of threshold models with integrated regressors. We derive the asymptotic distribution of the profiled least squares (LS) estimator under the diminishing threshold effect assumption that the size of the threshold effect converges to zero. Depending on how rapidly this sequence converges, the model may be identified or only weakly identified, and asymptotic theorems are developed for both cases. As the convergence rate is unknown in practice, a model-selection procedure is applied to determine the model identification strength and to construct robust confidence intervals, which have the correct asymptotic size irrespective of the magnitude of the threshold effect. The model is then generalized to incorporate endogeneity and serial correlation in the error terms, under which we design a Cochrane-Orcutt feasible generalized least squares (FGLS) estimator that enjoys efficiency gains and robustness against different error specifications, including both I(0) and I(1) errors. Based on this FGLS estimator, we further develop a sup-Wald statistic to test for the existence of the threshold effect. Monte Carlo simulations show that our estimators and test statistics perform well.
    Keywords: Threshold effects; Integrated processes; Nonlinear cointegration; Weak identification.
    JEL: C12 C22 C52
    Date: 2013–07
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2013-034&r=ecm
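    To fix ideas, a minimal sketch of profiled least squares for a threshold model with an integrated regressor: the threshold is concentrated out by a grid search over trimmed quantiles of the threshold variable. The DGP and all names are illustrative assumptions, not the paper's design.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 500
      x = np.cumsum(rng.normal(size=n))          # integrated regressor, I(1)
      q = rng.normal(size=n)                     # stationary threshold variable
      g0, b1, b2 = 0.0, 1.0, 1.3                 # true threshold and slopes
      y = np.where(q <= g0, b1 * x, b2 * x) + rng.normal(size=n)

      def ssr_at(g):
          lo, hi = q <= g, q > g
          ssr = 0.0
          for mask in (lo, hi):                  # regime-wise LS fit
              b = x[mask] @ y[mask] / (x[mask] @ x[mask])
              ssr += np.sum((y[mask] - b * x[mask]) ** 2)
          return ssr

      grid = np.quantile(q, np.linspace(0.15, 0.85, 200))   # trimmed grid
      g_hat = grid[np.argmin([ssr_at(g) for g in grid])]
      print("profiled LS threshold estimate:", g_hat)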
  2. By: Javier Fernandez-Macho
    Abstract: This paper introduces a class of cointegration tests based on estimated low-pass and high-pass regression coefficients from the same wavelet transform of the original time series data.  The procedure can be applied to test the null of cointegration in an n + k multivariate system with n cointegrating relationships without the need for either detrending or differencing.  The proposed non-residual-based wavelet statistics are asymptotically distributed as standard chi-square with nk degrees of freedom regardless of deterministic terms or dynamic regressors, thus offering a simple way of testing for cointegration under the null without the need for special tables.  Small sample quantiles for these wavelet statistics are obtained using Monte Carlo simulation in different situations, including I(1) and higher-order cointegration cases, and it is shown that these wavelet tests exhibit appropriate size and good power when compared to other tests of the null of cointegration.
    Keywords: Brownian motion, cointegration, econometric methods, integrated process, multivariate analysis, spectral analysis, time series models, unit roots, wavelet analysis
    JEL: C22 C12
    Date: 2013–07–11
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:668&r=ecm
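    A rough illustration of the low-pass/high-pass idea, with a single-level Haar transform standing in for the paper's wavelet machinery (the test statistic itself is not reproduced): pairwise averages carry the long-run relation, pairwise differences the short-run dynamics, and coefficients can be estimated from each part.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 512
      x = np.cumsum(rng.normal(size=n))
      y = 2.0 * x + rng.normal(size=n)            # one cointegrating relation

      def haar(z):                                # single-level Haar transform
          z = z.reshape(-1, 2)
          return (z[:, 0] + z[:, 1]) / 2.0, (z[:, 0] - z[:, 1]) / 2.0

      (x_lo, x_hi), (y_lo, y_hi) = haar(x), haar(y)
      b_lo = x_lo @ y_lo / (x_lo @ x_lo)          # low-pass coefficient
      b_hi = x_hi @ y_hi / (x_hi @ x_hi)          # high-pass coefficient
      print(b_lo, b_hi)   # close under cointegration; their discrepancy is testable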
  3. By: Haiqiang Chen; Ying Fang; Yingxing Li
    Abstract: This paper considers estimation and inference for varying-coefficient models with nonstationary regressors. We propose a nonparametric estimation method using penalized splines, which achieves the same optimal convergence rate as kernel-based methods but enjoys computational advantages. Utilizing the mixed model representation of penalized splines, we develop a likelihood ratio test statistic for checking the stability of the regression coefficients. We derive both the exact and the asymptotic null distributions of this test statistic. We also demonstrate its optimality by examining its local power performance. These theoretical findings are well supported by simulation studies.
    Keywords: Nonstationary Time Series; Varying-coefficient Model; Likelihood Ratio Test; Penalized Splines
    JEL: C12 C14 C22
    Date: 2013–07
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2013-033&r=ecm
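    A minimal sketch of a penalized-spline fit for a varying-coefficient model y_t = beta(z_t) x_t + e_t: the coefficient function is expanded in a truncated-power spline basis and estimated by ridge-penalized least squares. The basis, penalty, and DGP are illustrative choices, not the paper's specification.

      import numpy as np

      rng = np.random.default_rng(2)
      n, K, lam = 400, 20, 1.0
      x = np.cumsum(rng.normal(size=n))            # nonstationary regressor
      z = np.linspace(0, 1, n)                     # smoothing variable
      beta = np.sin(2 * np.pi * z)                 # true varying coefficient
      y = beta * x + rng.normal(size=n)

      knots = np.quantile(z, np.linspace(0, 1, K + 2)[1:-1])
      B = np.column_stack([np.ones(n), z] + [np.maximum(z - k, 0) for k in knots])
      X = B * x[:, None]                           # varying-coefficient design
      P = np.zeros((B.shape[1], B.shape[1]))
      P[2:, 2:] = np.eye(K)                        # penalize spline terms only
      theta = np.linalg.solve(X.T @ X + lam * P, X.T @ y)
      beta_hat = B @ theta                         # estimated coefficient path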
  4. By: Javier Fernandez-Macho
    Abstract: This paper examines a test for the null of cointegration in a multivariate system based on the discrepancy between the OLS estimator of the full set of n cointegrating relationships in the n + k system and the OLS estimator of the corresponding relationships among first differences, without making specific assumptions about the short-run dynamics of the multivariate data generating process.  It is shown that the proposed test statistics are asymptotically distributed as standard chi-square with n + k degrees of freedom and are not affected by the inclusion of deterministic terms or dynamic regressors, thus offering a simple way of testing for cointegration under the null without the need for special tables.  Small sample critical values for these statistics are tabulated using Monte Carlo simulation and it is shown that these non-residual-based tests exhibit appropriate size and good power even for quite general error dynamics.  In fact, simulation results suggest that they perform quite reasonably when compared to other tests of the null of cointegration.
    Keywords: Brownian motion, cointegration, econometric methods, integrated process, multivariate analysis, time series models, unit root
    JEL: C22 C12
    Date: 2013–06–01
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:657&r=ecm
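    The core idea in miniature (the scaling and the chi-square statistic are omitted; the system below is an illustrative n = 1, k = 2 case): under the null of cointegration, OLS on levels and OLS on first differences estimate the same long-run coefficients, so their discrepancy should be small.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 500
      X = np.cumsum(rng.normal(size=(n, 2)), axis=0)     # two I(1) regressors
      y = X @ np.array([1.0, -0.5]) + rng.normal(size=n) # cointegrated y
      b_lev = np.linalg.lstsq(X, y, rcond=None)[0]       # levels OLS
      dX, dy = np.diff(X, axis=0), np.diff(y)
      b_dif = np.linalg.lstsq(dX, dy, rcond=None)[0]     # first-difference OLS
      print(b_lev - b_dif)   # small under the null; the test scales this gap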
  5. By: Thomas Chuffart (AMSE - Aix-Marseille School of Economics - Aix-Marseille Univ. - Centre national de la recherche scientifique (CNRS) - École des Hautes Études en Sciences Sociales [EHESS] - Ecole Centrale Marseille (ECM))
    Abstract: A large number of nonlinear conditional heteroskedastic models have been proposed in the literature, and practitioners do not always have the tools to choose the correct specification. In this article, our main interest is whether the usual choice criteria lead them to choose the right specification in a regime-switching framework. We focus on two types of models: the Logistic Smooth Transition GARCH model and the Markov-Switching GARCH model. Through simulation experiments, we highlight that information criteria and loss functions can lead practitioners to a misspecified model. Indeed, depending on the Data Generating Process used in the experiment, choosing a criterion to select a model is a difficult issue. We argue that when selection criteria lead to the wrong model, this is rather due to the difficulty of estimating such models by the Quasi Maximum Likelihood Estimation method (QMLE).
    Keywords: conditional volatility; model selection; GARCH; regime switching
    Date: 2013–07
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-00844413&r=ecm
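    For reference, the generic information criteria such experiments rely on; loglik, k (number of parameters) and n (observations) would come from fitted LST-GARCH and MS-GARCH candidates, whose fitting code is not shown here.

      import numpy as np

      def aic(loglik, k):
          return -2.0 * loglik + 2.0 * k          # Akaike information criterion

      def bic(loglik, k, n):
          return -2.0 * loglik + k * np.log(n)    # Schwarz/Bayesian criterion

      # e.g. pick the specification minimizing BIC among fitted candidates:
      # best = min(candidates, key=lambda m: bic(m.loglik, m.k, n))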
  6. By: M. Caivano; A. Harvey
    Abstract: A time series model in which the signal is buried in noise that is non-Gaussian may throw up observations that, when judged by the Gaussian yardstick, are outliers. We describe an observation-driven model, based on an exponential generalized beta distribution of the second kind (EGB2), in which the signal is a linear function of past values of the score of the conditional distribution. This specification produces a model that is not only easy to implement, but which also facilitates the development of a comprehensive and relatively straightforward theory for the asymptotic distribution of the maximum likelihood estimator. The model is fitted to US macroeconomic time series and compared with Gaussian and Student-t models. A theory is then developed for an EGARCH model based on the EGB2 distribution and the model is fitted to exchange rate data. Finally, dynamic location and scale models are combined and applied to data on the UK rate of inflation.
    Keywords: Beta distribution; EGARCH; fat tails; score; robustness; Student's t; Winsorizing
    JEL: C22 G17
    Date: 2013–07–17
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:1325&r=ecm
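    A minimal score-driven (DCS-type) location filter in the spirit of the paper: the location is updated by the conditional score, which for the EGB2 is a bounded, logistic-type function of the standardized observation, so outliers are automatically downweighted. The symmetric case (equal shape parameters) is used below; the parameterization is an illustrative assumption, not the paper's exact specification.

      import numpy as np

      def egb2_score(w, xi):
          # location score of a symmetric EGB2, up to scale: bounded in w
          return xi * np.tanh(w / 2.0)

      def filter_location(y, omega, phi, kappa, sigma, xi):
          mu = np.empty(len(y))
          mu[0] = omega / (1.0 - phi)
          for t in range(len(y) - 1):
              u = egb2_score((y[t] - mu[t]) / sigma, xi)
              mu[t + 1] = omega + phi * mu[t] + kappa * u   # score-driven update
          return mu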
  7. By: D Kuang (Hiscox, London); Bent Nielsen (Nuffield College, Oxford); J P Nielsen (Cass Business School, City University London)
    Abstract: The log-normal reserving model is considered. The contribution of the paper is to derive explicit expressions for the maximum likelihood estimators. These are expressed in terms of development factors which are geometric averages. The distribution of the estimators is derived. It is shown that the analysis is invariant to traditional measures of exposure.
    Keywords: Arithmetic chain-ladder, geometric chain-ladder, canonical parameter, identification problem, maximum likelihood, log-normal model.
    Date: 2013–07–01
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:1311&r=ecm
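    To see what "development factors which are geometric averages" means in practice, here is a sketch on a toy run-off triangle (numbers invented for illustration): each factor is the geometric average of the observed link ratios, where the arithmetic chain-ladder would use volume-weighted ratios instead.

      import numpy as np

      # rows: accident years, columns: development years
      tri = np.array([[100., 150., 170.],
                      [110., 160., np.nan],
                      [120., np.nan, np.nan]])

      factors = []
      for j in range(tri.shape[1] - 1):
          ratios = tri[:, j + 1] / tri[:, j]         # individual link ratios
          ratios = ratios[~np.isnan(ratios)]
          factors.append(np.exp(np.mean(np.log(ratios))))   # geometric average
      print(factors)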
  8. By: Krenar Avdulaj; Jozef Barunik
    Abstract: Oil is widely perceived as a good diversification tool for stock markets. To fully understand this potential, we propose a new empirical methodology which combines generalized autoregressive score copula functions with high-frequency data and allows us to capture and forecast the conditional time-varying joint distribution of the oil-stocks pair accurately. Our realized GARCH with time-varying copula yields statistically better forecasts of the dependence, as well as of the quantiles of the distribution, when compared to competing models. Using the recently proposed conditional diversification benefits measure, which takes into account higher-order moments and nonlinear dependence, we document decreasing benefits from diversification over the past ten years. Moreover, the diversification benefits implied by our empirical model vary strongly over time. These findings have important implications for portfolio management.
    Date: 2013–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1307.5981&r=ecm
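    Illustration only: a simplified, static diversification-benefit calculation based on expected shortfall (ES), comparing a portfolio's ES with the weighted average of the assets' stand-alone ES. The paper's conditional measure is time-varying and model-based; the independent Student-t draws below merely show the mechanics.

      import numpy as np

      def expected_shortfall(r, q=0.05):
          var = np.quantile(r, q)
          return -r[r <= var].mean()               # average loss beyond VaR

      rng = np.random.default_rng(4)
      oil, stk = rng.standard_t(5, 5000), rng.standard_t(5, 5000)
      w = np.array([0.5, 0.5])
      port = w[0] * oil + w[1] * stk
      es_avg = w[0] * expected_shortfall(oil) + w[1] * expected_shortfall(stk)
      print(1 - expected_shortfall(port) / es_avg)  # >0: diversification gains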
  9. By: Ladislav Kristoufek
    Abstract: We introduce a general framework of Mixed-correlated ARFIMA (MC-ARFIMA) processes which allows for various specifications of univariate and bivariate long-term memory. Apart from the standard case when $H_{xy}=\frac{1}{2}(H_x+H_y)$, MC-ARFIMA also allows for processes with $H_{xy}<\frac{1}{2}(H_x+H_y)$, and for long-range correlated processes which are either short-range cross-correlated or simply correlated. The major contribution of MC-ARFIMA lies in the fact that the processes have well-defined asymptotic properties for $H_x$, $H_y$ and $H_{xy}$, which are derived in the paper, so that the processes can be used in simulation studies comparing various estimators of the bivariate Hurst exponent $H_{xy}$. Moreover, the framework allows for modeling of processes which are found to have $H_{xy}<\frac{1}{2}(H_x+H_y)$.
    Date: 2013–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1307.6046&r=ecm
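    A minimal simulation sketch of the standard case: two ARFIMA(0,d,0) series built from correlated innovations, giving power-law cross-correlation with $H_{xy}=\frac{1}{2}(H_x+H_y)$, where $H = d + 1/2$. Truncation length and parameters are illustrative choices.

      import numpy as np

      def frac_weights(d, m):
          w = np.ones(m)
          for k in range(1, m):
              w[k] = w[k - 1] * (k - 1 + d) / k   # MA weights of (1-L)^{-d}
          return w

      rng = np.random.default_rng(5)
      n, m, rho = 2000, 500, 0.7
      dx, dy = 0.3, 0.4                           # H_x = 0.8, H_y = 0.9
      e = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], n + m)
      x = np.convolve(e[:, 0], frac_weights(dx, m))[m:n + m]
      y = np.convolve(e[:, 1], frac_weights(dy, m))[m:n + m]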
  10. By: David Hendry; Jurgen A. Doornik; Felix Pretis
    Abstract: Using an extension of general-to-specific modelling, based on the recent developments of impulse-indicator saturation (IIS), we consider selecting significant step indicators from a saturating set to capture location shifts.  The approximate non-centrality of the test is derived for a variety of shifts using a 'split-half' analysis, the simplest specialization of a multiple-block search algorithm.  Monte Carlo simulations confirm the accuracy of the nominal significance levels under the null, and show rejections when location shifts occur, with non-null rejection frequencies that improve on the corresponding IIS-based and Chow (1960) tests.
    Keywords: General-to-specific, step-indicator saturation, test power, location shifts, model selection, Autometrics
    JEL: C51 C22
    Date: 2013–06–06
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:658&r=ecm
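    A schematic 'split-half' step-indicator saturation: build the saturating set of step dummies 1(t >= j), select significant ones from each half in turn, then re-estimate with the retained union. Autometrics uses a more general multi-block search; this two-block version, with invented data and an arbitrary critical value, is for illustration only.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 100
      y = rng.normal(size=n)
      y[60:] += 2.0                                # location shift at t = 60
      S = (np.arange(n)[:, None] >= np.arange(1, n)[None, :]).astype(float)

      def significant(cols, crit=2.58):
          X = np.column_stack([np.ones(n), S[:, cols]])
          b, *_ = np.linalg.lstsq(X, y, rcond=None)
          res = y - X @ b
          cov = np.linalg.inv(X.T @ X) * (res @ res / (n - X.shape[1]))
          t = b / np.sqrt(np.diag(cov))
          return [c for c, tv in zip(cols, t[1:]) if abs(tv) > crit]

      half1 = list(range(0, S.shape[1], 2))        # first block of indicators
      half2 = list(range(1, S.shape[1], 2))        # second block
      kept = significant(half1) + significant(half2)
      print(significant(kept))                     # retained step indicators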
  11. By: James Wolter
    Abstract: Global frailty is an unobserved macroeconomic variable.  In event data contexts, this unobserved variable is assumed to impact the hazard rate of event arrivals.  Attempts to identify and estimate the path of frailty are complicated when observed macroeconomic variables also impact hazard rates.  It is possible that the impact of the observed macro variables and global frailty can be confused and identification can fail.  In this paper I show that, under appropriate assumptions, the path of global frailty and the impact of observed macro variables can both be recovered.  This approach differs from previous work in that I do not assume frailty follows a specific stochastic process form.  Previous studies identify global frailty by assuming a stochastic form and using a filtering approach.  However, the chosen stochastic forms are arbitrary and can potentially lead to poor results.  The method in this paper shows how to recover frailty without these assumptions.  This can serve as a model check on filtering approaches.  The methods are applied in simulations and in an application to corporate default.
    JEL: C13 C14 C41 C58
    Date: 2013–07–10
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:667&r=ecm
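    A sketch of the data structure at issue, not of the paper's estimator: event arrivals with intensity lambda(t) = lambda0 * exp(beta * z_t) * Y_t, where z_t is an observed macro covariate and Y_t an unobserved global frailty path. Separating beta from the path of Y_t is the identification problem; all parameter values below are invented.

      import numpy as np

      rng = np.random.default_rng(7)
      T, n_firms, beta, lam0 = 200, 100, 0.5, 0.02
      z = np.cumsum(rng.normal(scale=0.1, size=T))           # observed macro factor
      Y = np.exp(np.cumsum(rng.normal(scale=0.1, size=T)))   # latent frailty path
      intensity = lam0 * np.exp(beta * z) * Y                # per-firm hazard
      events = rng.poisson(n_firms * intensity)              # defaults per period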
  12. By: Mehdad, E.; Kleijnen, Jack P.C. (Tilburg University, Center for Economic Research)
    Abstract: This paper investigates two related questions: (1) How to derive a confidence interval for the output of a combination of simulation inputs not yet simulated? (2) How to select the next combination to be simulated when searching for the optimal combination? To answer these questions, the paper uses parametric bootstrapped Kriging and "conditional simulation". Classic Kriging estimates the variance of its predictor by plugging in the estimated GP parameters, so this variance is biased. The main conclusion is that classic Kriging seems quite robust; i.e., classic Kriging gives acceptable confidence intervals and estimates of the optimal solution.
    Keywords: Simulation; Optimization; Kriging; Bootstrap
    JEL: C0 C1 C9 C15 C44
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:2013038&r=ecm
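    The comparison the paper studies, in miniature: a plug-in (classic) Kriging interval from a fitted Gaussian process versus a parametric bootstrap that refits on data resampled from the estimated GP. sklearn's GaussianProcessRegressor stands in for a Kriging implementation; kernel, noise level, and data are illustrative.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(8)
      X = np.linspace(0, 1, 10)[:, None]
      y = np.sin(4 * X[:, 0]) + rng.normal(scale=0.1, size=10)
      x_new = np.array([[0.55]])

      gp = GaussianProcessRegressor(kernel=RBF(), alpha=0.01).fit(X, y)
      m, s = gp.predict(x_new, return_std=True)     # plug-in variance
      print("plug-in 95% CI:", m[0] - 1.96 * s[0], m[0] + 1.96 * s[0])

      boot = []
      for _ in range(200):                          # parametric bootstrap
          f_b = gp.sample_y(X, random_state=int(rng.integers(1 << 31))).ravel()
          y_b = f_b + rng.normal(scale=0.1, size=10)
          gp_b = GaussianProcessRegressor(kernel=RBF(), alpha=0.01).fit(X, y_b)
          boot.append(gp_b.predict(x_new)[0])
      print("bootstrap 95% CI:", np.percentile(boot, [2.5, 97.5]))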
  13. By: Nassim Nicholas Taleb
    Abstract: The literature of heavy tails starts with a random walk and finds mechanisms that lead to fat tails under aggregation. We follow the inverse route and show how, starting with fat tails, we get to thin tails when deriving the probability distribution of the response to a random variable. We introduce a general dose-response curve and argue that the left- and right-boundedness, or saturation, of the response in natural things leads to thin tails, even when the "underlying" random variable at the source of the exposure is fat-tailed.
    Date: 2013–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1307.6695&r=ecm
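    A numerical illustration of the thesis (the logistic curve is one arbitrary bounded dose-response, not the paper's general family): pushing a fat-tailed variable through a saturating response yields a thin-tailed, bounded output.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(9)
      x = rng.standard_cauchy(100_000)              # fat-tailed "dose"
      response = 1.0 / (1.0 + np.exp(-x))           # bounded S-shaped response
      print(stats.kurtosis(response))               # finite; Cauchy's is undefined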
  14. By: Pierre BAJGROWICZ (University of Geneva); Olivier SCAILLET (University of Geneva and Swiss Finance Institute)
    Abstract: We propose a technique to avoid spurious detections of jumps in high-frequency data via an explicit thresholding on available test statistics. We prove that it eliminates asymptotically all spurious detections. Monte Carlo results show that it also performs well in finite samples. In Dow Jones stocks, spurious detections represent up to 50% of the jumps detected initially between 2006 and 2008. For the majority of stocks, jumps do not cluster in time and no cojump affects all stocks simultaneously, suggesting jump risk is diversifiable. We relate the remaining jumps to macroeconomic news, prescheduled company-specific announcements, and stories from news agencies which include a variety of unscheduled and uncategorized events. The majority of news do not cause jumps. One exception is share buyback announcements. Fed rate news have an important impact but rarely cause jumps. Another finding is that 60% of jumps occur without any news event. For one third of the jumps with no news we observe an unusual behavior in the volume of transactions. Hence, liquidity pressures are probably another important factor of jumps.
    Keywords: jumps, high-frequency data, spurious detections, jumps dynamics, news releases, cojumps
    JEL: C58 G12 G14
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp1136&r=ecm
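    A schematic version of the thresholding idea, not the authors' exact statistic: standardize each return by a local bipower-variation volatility estimate and declare a jump only when the statistic exceeds a conservative universal threshold of order sqrt(2 log n), which keeps spurious detections from accumulating with the sample size. Window length and constants are illustrative choices.

      import numpy as np

      def jump_flags(r, window=50):
          n = len(r)
          mu1 = np.sqrt(2.0 / np.pi)                 # E|Z| for standard normal
          flags = np.zeros(n, dtype=bool)
          for t in range(window, n):
              w = r[t - window:t]
              bpv = np.mean(np.abs(w[1:]) * np.abs(w[:-1])) / mu1**2  # local BPV
              stat = np.abs(r[t]) / np.sqrt(bpv)     # standardized return
              flags[t] = stat > np.sqrt(2.0 * np.log(n))   # universal threshold
          return flags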

This nep-ecm issue is ©2013 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.