nep-ecm New Economics Papers
on Econometrics
Issue of 2017‒12‒11
thirteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Random Coefficient Continuous Systems: Testing for Extreme Sample Path Behaviour By Tao, Yubo; Phillips, Peter C.B.; Yu, Jun
  2. A Neural Stochastic Volatility Model By Rui Luo; Weinan Zhang; Xiaojun Xu; Jun Wang
  3. Testing the lag length of vector autoregressive models: A power comparison between portmanteau and Lagrange multiplier tests By Raja Ben Hajria; Salah Khardani; Hamdi Raïssi
  4. Likelihood-based Risk Estimation for Variance-Gamma Models By Marco Bee; Maria Michela Dickson; Flavio Santi
  5. Job Displacement during the Great Recession: Tight Bounds on Distributional Treatment Effect Parameters using Panel Data By Brantly Callaway
  6. Testing of Parameter's Instability in a Balanced Panel: An Application to Real Effective Exchange Rate for SAARC Countries By Varun Agiwal; Jitendra Kumar; Sumit Kumar Sharma
  7. Estimation for high-frequency data under parametric market microstructure noise By Simon Clinet; Yoann Potiron
  8. Strong consistency of the least squares estimator in regression models with adaptive learning By Norbert Christopeit; Michael Massmann
  9. Uncertainty across volatility regimes By Angelini, Giovanni; Bacchiocchi, Emanuele; Caggiano, Giovanni; Fanelli, Luca
  10. Volatility Transmission in Overlapping Trading Zones By Andreas Masuhr
  11. Equivalent representations of discrete-time two-state panel data models By Tue Gorgens; Dean Hyslop
  12. Tests of Policy Interventions in DSGE Models By M Hashem Pesaran; Ron P Smith
  13. Identification and Generalized Band Spectrum Estimation of the New Keynesian Phillips Curve By Junjie Guo; Juan Carlos Escanciano; Jinho Choi

  1. By: Tao, Yubo (School of Economics, Singapore Management University); Phillips, Peter C.B. (Yale University); Yu, Jun (School of Economics, Singapore Management University)
    Abstract: This paper studies a continuous time dynamic system with a random persistence parameter. The exact discrete time representation is obtained and related to several discrete time random coefficient models currently in the literature. The model distinguishes various forms of unstable and explosive behaviour according to specific regions of the parameter space that open up the potential for testing these forms of extreme behaviour. A two-stage approach that employs realized volatility is proposed for the continuous system estimation, asymptotic theory is developed, and test statistics to identify the different forms of extreme sample path behaviour are proposed. Simulations show that the proposed estimators work well in empirically realistic settings and that the tests have good size and power properties in discriminating characteristics in the data that differ from typical unit root behaviour. The theory is extended to cover models where the random persistence parameter is endogenously determined. An empirical application based on daily real S&P 500 index data over 1964-2015 reveals strong evidence against parameter constancy after early 1980, which strengthens after July 1997, leading to a long duration of what the model characterizes as extreme behaviour in real stock prices.
    Keywords: Continuous time models; Explosive path; Extreme behaviour; Random coefficient autoregression; In-fill asymptotics; Bubble testing.
    JEL: C13 C22 G13
    Date: 2017–11–30
    URL: http://d.repec.org/n?u=RePEc:ris:smuesw:2017_018&r=ecm
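    The two-stage approach above employs realized volatility. As a point of reference, the standard realized-volatility estimator (square root of the sum of squared intraday log-returns) can be sketched as follows; this is the generic textbook estimator, not the authors' full procedure, and the simulated data and parameter values are hypothetical.

```python
import numpy as np

def realized_volatility(log_prices):
    """Realized volatility for one day: the square root of the realized
    variance, i.e. the sum of squared intraday log-returns."""
    returns = np.diff(log_prices)          # intraday log-returns
    return np.sqrt(np.sum(returns ** 2))   # realized volatility

# Illustration on simulated intraday data (hypothetical scale):
# 390 one-minute log-price increments with standard deviation 0.001.
rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0.0, 0.001, size=390))
rv = realized_volatility(prices)
```

    Computed day by day over a long sample, such a series would serve as the volatility input to a two-stage estimation of the kind described above.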
  2. By: Rui Luo; Weinan Zhang; Xiaojun Xu; Jun Wang
    Abstract: In this paper, we show that the recent integration of statistical models with deep recurrent neural networks provides a new way of formulating volatility (the degree of variation of time series) models that have been widely used in time series analysis and prediction in finance. The model comprises a pair of complementary stochastic recurrent neural networks: the generative network models the joint distribution of the stochastic volatility process; the inference network approximates the conditional distribution of the latent variables given the observables. Our focus here is on the formulation of the temporal dynamics of volatility under a stochastic recurrent neural network framework. Experiments on real-world stock price datasets demonstrate that the proposed model yields better volatility estimates and predictions, outperforming strong baseline methods, including deterministic models such as GARCH and its variants, stochastic MCMC-based models, and Gaussian-process-based models, on the average negative log-likelihood measure.
    Date: 2017–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1712.00504&r=ecm
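    GARCH and its variants serve as the deterministic baselines above. A minimal sketch of the GARCH(1,1) conditional-variance filter, with hypothetical parameter values and simulated returns, illustrates what such a baseline computes:

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """GARCH(1,1) conditional-variance recursion:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1},
    initialised at the unconditional variance omega / (1 - alpha - beta)."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = omega / (1.0 - alpha - beta)
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# Hypothetical parameters (alpha + beta < 1 ensures covariance stationarity).
rng = np.random.default_rng(1)
r = rng.normal(0.0, 1.0, size=500)
s2 = garch11_variance(r, omega=0.05, alpha=0.08, beta=0.9)
```

    The recursion is deterministic given past returns, which is precisely the property the stochastic recurrent-network model above relaxes.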
  3. By: Raja Ben Hajria; Salah Khardani; Hamdi Raïssi
    Abstract: In this paper we provide an asymptotic theoretical power comparison in the Bahadur sense, between the portmanteau and Breusch-Godfrey Lagrange Multiplier (LM) tests for the goodness-of-fit checking of vector autoregressive (VAR) models. The merits and the drawbacks of the studied tests are illustrated using Monte Carlo experiments.
    Keywords: VAR model, VECM model, Cointegration, Residual autocorrelations, Portmanteau tests, Lagrange Multiplier tests.
    JEL: C22 C01
    Date: 2017–11
    URL: http://d.repec.org/n?u=RePEc:ucv:wpaper:2017-03&r=ecm
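    For reference, a portmanteau statistic of the Ljung-Box type used in goodness-of-fit checks of this kind can be sketched in its univariate form (the paper's setting is the multivariate VAR version; the data here are simulated white noise):

```python
import numpy as np

def ljung_box_q(residuals, m):
    """Ljung-Box portmanteau statistic at lag m:
    Q = n (n + 2) * sum_{k=1}^{m} rho_k^2 / (n - k),
    where rho_k is the lag-k residual autocorrelation. Q is asymptotically
    chi-square with m - p degrees of freedom for residuals of a fitted AR(p)."""
    e = residuals - residuals.mean()
    n = len(e)
    denom = np.sum(e ** 2)
    q = 0.0
    for k in range(1, m + 1):
        rho_k = np.sum(e[k:] * e[:-k]) / denom   # lag-k autocorrelation
        q += rho_k ** 2 / (n - k)
    return n * (n + 2) * q

# Under the null of white-noise residuals, Q should be moderate.
rng = np.random.default_rng(2)
white_noise = rng.normal(size=1000)
q10 = ljung_box_q(white_noise, m=10)
```

    The Lagrange multiplier alternative instead regresses residuals on their own lags and the original regressors; the Bahadur comparison above ranks the two by the rate at which their p-values vanish under fixed alternatives.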
  4. By: Marco Bee; Maria Michela Dickson; Flavio Santi
    Abstract: Although the variance-gamma distribution is a flexible model for log-returns of financial assets, so far it has found rather limited applications in finance and risk management. One of the reasons is that maximum likelihood estimation of its parameters is not straightforward. We develop an EM-type algorithm that bypasses the evaluation of the full likelihood, which may be difficult because the density is not in closed form and is unbounded for small values of the shape parameter. Moreover, we study the relative efficiency of our approach with respect to the maximum likelihood estimation procedures implemented in the VarianceGamma and ghyp R packages. Extensive simulation experiments and real-data analyses suggest that the multicycle ECM algorithm and the routines in the ghyp R package give the best results in terms of root-mean-squared error, for both parameter and Value-at-Risk estimation.
    Keywords: Multicycle EM algorithm, maximum likelihood, numerical optimization, risk estimation
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:trn:utwprg:2017/03&r=ecm
  5. By: Brantly Callaway (Department of Economics, Temple University)
    Abstract: Late prime-age workers who were displaced during the Great Recession lost on average 39% of their earnings relative to their counterfactual earnings had they not been displaced. But the average effect masks substantial heterogeneity across workers. This paper develops new techniques to bound distributional treatment effect parameters that depend on the joint distribution of potential outcomes -- an object not identified by standard identifying assumptions such as selection on observables or even when treatment is randomly assigned. I show that panel data and an additional assumption on the dependence between untreated potential outcomes for the treated group over time (i) provide more identifying power for distributional treatment effect parameters than existing bounds and (ii) provide a more plausible set of conditions than existing methods that obtain point identification.
    Keywords: Joint Distribution of Potential Outcomes, Distribution of the Treatment Effect, Quantile of the Treatment Effect, Copula Stability Assumption, Panel Data, Job Displacement
    JEL: C14 C31 C33 J63
    Date: 2017–08
    URL: http://d.repec.org/n?u=RePEc:tem:wpaper:1703&r=ecm
  6. By: Varun Agiwal; Jitendra Kumar; Sumit Kumar Sharma
    Abstract: The present paper considers structural breaks in a panel AR(1) model, allowing instability in the mean, the variance and the autoregressive coefficient. The model extends the univariate model proposed by Meligkotsiduo et al. (2004) and relates to existing panel data time series models with breaks studied by Levin et al. (2002), Pesaran (2004), Bai (2010), Liu et al. (2011), and Wachter and Tzavalis (2012). The paper addresses the identification of structural breaks by comparing the posterior probabilities of all possible models: a break in all three parameters, in only two parameters, in one parameter, or no break at all. A simulation study is carried out to validate the derived theorems. An empirical analysis of the real exchange rates of India and its neighbouring countries (SAARC countries including China) is also carried out. The study correctly identifies the common break in 1991, which occurred due to the second Gulf War and the international debt crisis.
    Keywords: Panel autoregressive model, Structural break, Prior and Posterior probability.
    JEL: C11 C12 C23
    Date: 2017–11–11
    URL: http://d.repec.org/n?u=RePEc:eei:rpaper:eeri_rp_2017_11&r=ecm
  7. By: Simon Clinet; Yoann Potiron
    Abstract: In this paper, we propose a general class of noise-robust estimators based on the existing estimators in the non-noisy high-frequency data literature. The market microstructure noise is a known parametric function of the limit order book. The noise-robust estimators are constructed as a plug-in version of their counterparts, where we replace the efficient price, which is non-observable in our framework, by an estimator based on the raw price and the limit order book data. We show that the technology can be directly applied to estimate volatility, high-frequency covariance, functionals of volatility and volatility of volatility in a general nonparametric framework where, depending on the problem at hand, price possibly includes infinite jump activity and sampling times encompass asynchronicity and endogeneity.
    Date: 2017–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1712.01479&r=ecm
  8. By: Norbert Christopeit; Michael Massmann
    Abstract: This paper looks at the strong consistency of the ordinary least squares (OLS) estimator in a stereotypical macroeconomic model with adaptive learning. It is a companion to Christopeit & Massmann (2017, Econometric Theory) which considers the estimator’s convergence in distribution and its weak consistency in the same setting. Under constant gain learning, the model is closely related to stationary, (alternating) unit root or explosive autoregressive processes. Under decreasing gain learning, the regressors in the model are asymptotically collinear. The paper examines, first, the issue of strong convergence of the learning recursion: It is argued that, under constant gain learning, the recursion does not converge in any probabilistic sense, while for decreasing gain learning rates are derived at which the recursion converges almost surely to the rational expectations equilibrium. Secondly, the paper establishes the strong consistency of the OLS estimators, under both constant and decreasing gain learning, as well as rates at which the estimators converge almost surely. In the constant gain model, separate estimators for the intercept and slope parameters are juxtaposed to the joint estimator, drawing on the recent literature on explosive autoregressive models. Thirdly, it is emphasised that strong consistency is obtained in all models although the near-optimal condition for the strong consistency of OLS in linear regression models with stochastic regressors, established by Lai & Wei (1982), is not always met.
    Keywords: adaptive learning, non-stationary regression, ordinary least squares, almost sure convergence
    JEL: C22 C51 D83
    Date: 2017–11–28
    URL: http://d.repec.org/n?u=RePEc:whu:wpaper:17-07&r=ecm
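    The learning recursions studied above can be illustrated with a simple decreasing-gain scheme. The sketch below is a stand-in for the paper's model, not its actual specification: with gain 1/t applied to learning the mean of an i.i.d. process, the recursion reproduces the running sample mean and so converges almost surely by the strong law of large numbers.

```python
import numpy as np

def decreasing_gain_learning(y):
    """Adaptive learning recursion with decreasing gain gamma_t = 1/t:
    a_t = a_{t-1} + gamma_t * (y_t - a_{t-1}).
    With this gain the belief a_t equals the sample mean of y_1..y_t."""
    a = 0.0
    path = []
    for t, y_t in enumerate(y, start=1):
        a += (y_t - a) / t      # belief update toward the latest observation
        path.append(a)
    return np.array(path)

# Learn the mean of an i.i.d. process with true mean 2.0 (hypothetical).
rng = np.random.default_rng(3)
y = 2.0 + rng.normal(size=20000)
beliefs = decreasing_gain_learning(y)
```

    A constant gain (gamma_t = gamma fixed) would instead keep the beliefs perpetually fluctuating, which is the sense in which the paper argues that constant-gain recursions do not converge in any probabilistic sense.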
  9. By: Angelini, Giovanni; Bacchiocchi, Emanuele; Caggiano, Giovanni; Fanelli, Luca
    Abstract: We propose a new non-recursive identification scheme for uncertainty shocks, which exploits breaks in the unconditional volatility of macroeconomic variables. Such an identification approach allows us to address simultaneously two major questions in the empirical literature on uncertainty: (i) Does the relationship between uncertainty and economic activity change across macroeconomic regimes? (ii) Is uncertainty a major cause or effect (or both) of declines in economic activity? Empirical results based on a small-scale VAR with US monthly data for the period 1960-2015 suggest that (i) the effects of uncertainty shocks are regime-dependent, and (ii) uncertainty is an exogenous source of decline in economic activity, rather than an endogenous response to it.
    JEL: C32 C51 E44 G01
    Date: 2017–11–30
    URL: http://d.repec.org/n?u=RePEc:bof:bofrdp:2017_035&r=ecm
  10. By: Andreas Masuhr
    Abstract: Previous volatility spillover models (Engle et al. 1990, Clements et al. 2015) use artificially non-overlapping trading zones to identify sources of volatility transmission between these zones. The problem of non-overlapping zones is overcome here using a copula GARCH approach that allows for multiple overlaps between zones, incorporating vine copulas to model the dependence structure flexibly and to match stylized facts of return data. Stationarity conditions are examined, and identification problems in previous work are pointed out. To handle the relatively large parameter space, the model is estimated by Bayesian methods using a differential evolution MCMC (Braak 2006) approach. Simulation studies are carried out to ensure robustness against copula or error-term misspecification and to analyze the identification problem.
    Date: 2017–11
    URL: http://d.repec.org/n?u=RePEc:cqe:wpaper:6717&r=ecm
  11. By: Tue Gorgens; Dean Hyslop
    Abstract: There are two common approaches to analyzing discrete-time two-state panel data. One focuses on modeling the determinants of state occupancy, the other on modeling the determinants of transition between states. This note shows that there are one-to-one correspondences between the two representations, between the two probability distributions in an unrestricted context, and between low-order Markov models of state occupancy and semi-Markov models of transition between states with strictly limited duration dependence.
    Keywords: Panel data, binary response, dynamic models
    JEL: C33 C35 C41
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2017-654&r=ecm
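    The correspondence between the transition and occupancy representations can be illustrated for a first-order Markov chain with two states: the transition probabilities pin down the occupancy distribution (given an initial condition), and in the long run the occupancy probability of state 1 is p01 / (p01 + p10). The transition probabilities below are hypothetical.

```python
import numpy as np

# Transition representation: probabilities of switching between states 0 and 1.
p01, p10 = 0.2, 0.5                      # hypothetical transition probabilities
P = np.array([[1 - p01, p01],
              [p10, 1 - p10]])           # rows: current state, cols: next state

# Occupancy representation: the marginal state distribution implied by the
# transition model and an initial condition -- the mapping is one-to-one.
pi = np.array([1.0, 0.0])                # start in state 0 with certainty
for _ in range(200):
    pi = pi @ P                          # iterate the transition matrix

# Long-run occupancy probability of state 1.
stationary = p01 / (p01 + p10)
```

    The note's result extends this equivalence to low-order Markov occupancy models and semi-Markov transition models with strictly limited duration dependence.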
  12. By: M Hashem Pesaran (University of Southern California; Trinity College, Cambridge); Ron P Smith (Birkbeck, University of London)
    Abstract: This paper considers tests of the effectiveness of a policy intervention, defined as a change in the parameters of a policy rule, in the context of a macroeconometric dynamic stochastic general equilibrium (DSGE) model. We consider two types of intervention: first, the standard case of a parameter change that does not alter the steady state, and second, one that does alter the steady state, e.g. the target rate of inflation. We consider two types of test: a multi-horizon test, where the post-intervention policy horizon, H, is small and fixed, and a mean policy effect test, where H is allowed to increase without bound. The multi-horizon test requires Gaussian errors, but the mean policy effect test does not. It is shown that neither of these tests is consistent, in the sense that the power of the tests does not tend to unity as H tends to infinity, unless the intervention alters the steady state. This follows directly from the fact that DSGE variables are measured as deviations from the steady state, and the effects of a policy change on target variables decay exponentially fast. We investigate the size and power of the proposed mean effect test by simulating a standard three-equation New Keynesian DSGE model. The simulation results are in line with our theoretical findings and show that in all applications the tests have the correct size; but unless the intervention alters the steady state, their power does not go to unity with H.
    Keywords: counterfactuals, policy analysis, policy ineffectiveness test, macroeconomics.
    JEL: C18 C54 E65
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:bbk:bbkcam:1706&r=ecm
  13. By: Junjie Guo (Indiana University); Juan Carlos Escanciano (Indiana University); Jinho Choi (AMRO and Bank of Korea)
    Abstract: This article proposes a new identification strategy and a new estimation method for the hybrid New Keynesian Phillips curve (NKPC). Unlike the predominant Generalized Method of Moments (GMM) approach, which leads to weak identification of the NKPC with U.S. postwar data, our nonparametric method exploits nonlinear variation in inflation dynamics and provides supporting evidence of point-identification. This article shows that identification of the NKPC is characterized by two conditional moment restrictions. This insight leads to a quantitative method to assess identification in the NKPC. For estimation, the article proposes a closed-form Generalized Band Spectrum Estimator (GBSE) that effectively uses information from the conditional moments, accounts for nonlinear variation, and permits a focus on short-run dynamics. Applying the GBSE to U.S. postwar data, we find a significant coefficient on marginal cost, and that the forward-looking component and inflation inertia are equally quantitatively important in explaining short-run inflation dynamics, substantially reducing sampling uncertainty relative to existing GMM estimates.
    Keywords: Point-identification, New Keynesian Phillips curve, Weak instruments, Nonlinear dependence, Generalized spectrum
    Date: 2017–11
    URL: http://d.repec.org/n?u=RePEc:inu:caeprp:2017014&r=ecm

This nep-ecm issue is ©2017 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.