nep-ecm New Economics Papers
on Econometrics
Issue of 2013‒11‒14
nine papers chosen by
Sune Karlsson
Orebro University

  1. Multivariate stochastic volatility modelling using Wishart autoregressive processes By K. Triantafyllopoulos
  2. Optimal Estimation Strategies for Bivariate Fractional Cointegration Systems By Marcel Aloy; Gilles De Truchis
  3. A bootstrapped spectral test for adequacy in weak ARMA models By Zhu, Ke; Li, Wai-Keung
  4. Modeling of Volatility with Non-linear Time Series Model By Song-Yon Kim; Mun-Chol Kim
  5. Inference on Higher-Order Spatial Autoregressive Models with Increasingly Many Parameters By Abhimanyu Gupta; Peter M. Robinson
  6. Finding starting-values for maximum likelihood estimation of vector STAR models By Schleer, Frauke
  7. Nonlinear Difference-in-Differences in Repeated Cross Sections with Continuous Treatments By Xavier D'Haultfoeuille; Stefan Hoderlein; Yuya Sasaki
  8. There is a VaR Beyond Usual Approximations By Marie Kratz
  9. Predicting the Spread of Financial Innovations: An Epidemiological Approach By Hull, Isaiah

  1. By: K. Triantafyllopoulos
    Abstract: A new multivariate stochastic volatility estimation procedure for financial time series is proposed. A Wishart autoregressive process is considered for the volatility precision covariance matrix, for the estimation of which a two-step procedure is adopted. The first step performs conditional inference on the autoregressive parameters; the second performs unconditional inference based on a Newton-Raphson iterative algorithm. The proposed methodology, which is mostly Bayesian, is suitable for medium-dimensional data and bridges the gap between closed-form estimation and simulation-based estimation algorithms. An example, consisting of foreign exchange rate data, illustrates the proposed methodology.
    Date: 2013–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1311.0530&r=ecm
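    Illustration: the abstract names a Wishart autoregressive (WAR) process for the precision matrix but does not spell out its form. The Python sketch below simulates one common WAR(1) construction (a sum of outer products of Gaussian VAR(1) factors) and draws returns from the implied volatility; the parameter values (K, M, Sigma) are illustrative assumptions, and the paper's two-step estimator is not reproduced.
import numpy as np

rng = np.random.default_rng(0)
p, K, T = 2, 5, 500                  # dimension, degrees of freedom, sample size
M = 0.8 * np.eye(p)                  # autoregressive matrix (illustrative)
Sigma = 0.1 * np.eye(p)              # factor innovation covariance (illustrative)
chol = np.linalg.cholesky(Sigma)

x = rng.standard_normal((K, p))      # latent Gaussian factors
returns = np.empty((T, p))
for t in range(T):
    x = x @ M.T + rng.standard_normal((K, p)) @ chol.T    # VAR(1) factor update
    Lambda = x.T @ x                                       # Wishart-distributed precision matrix
    cov = np.linalg.inv(Lambda)                            # implied volatility (covariance) matrix
    returns[t] = rng.multivariate_normal(np.zeros(p), cov)
print(returns.std(axis=0))           # sample spread reflects the time-varying volatility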
  2. By: Marcel Aloy (AMSE - Aix-Marseille School of Economics - Aix-Marseille Univ. - Centre national de la recherche scientifique (CNRS) - École des Hautes Études en Sciences Sociales [EHESS] - Ecole Centrale Marseille (ECM)); Gilles De Truchis (AMSE - Aix-Marseille School of Economics - Aix-Marseille Univ. - Centre national de la recherche scientifique (CNRS) - École des Hautes Études en Sciences Sociales [EHESS] - Ecole Centrale Marseille (ECM))
    Abstract: Estimation methods for bivariate fractional cointegration models are numerous. In most cases they have non-equivalent asymptotic and finite-sample properties, implying difficulties in determining an optimal estimation strategy. In this paper, we address this issue by means of simulations and provide useful guidance to practitioners. Our Monte Carlo study reveals the superiority of techniques that estimate all parameters of interest jointly over those operating in two steps. In some cases, it also shows that estimators originally designed for stationary cointegration have good finite-sample properties in non-stationary regions of the parameter space.
    Keywords: fractional cointegration; Monte Carlo simulation; Whittle estimation; frequency analysis
    Date: 2013–10
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-00879522&r=ecm
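    Illustration: Whittle-type estimation is central to the comparison above. The Python sketch below implements a standard local Whittle estimator of the long-memory parameter d (a building block of such procedures, not the specific estimators compared in the paper); the bandwidth m = n^0.65 and the simulated ARFIMA(0, d, 0) series are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

def local_whittle_d(x, m):
    """Local Whittle estimate of d from the first m Fourier frequencies of x."""
    n = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / n             # Fourier frequencies
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    I = np.abs(dft) ** 2 / (2 * np.pi * n)                # periodogram ordinates
    def R(d):
        return np.log(np.mean(lam ** (2 * d) * I)) - 2 * d * np.mean(np.log(lam))
    return minimize_scalar(R, bounds=(-0.49, 0.99), method="bounded").x

# Simulate fractionally integrated noise with d = 0.3 via its MA(infinity) expansion
rng = np.random.default_rng(1)
d_true, n = 0.3, 4000
k = np.arange(1, n)
psi = np.concatenate(([1.0], np.cumprod((k - 1 + d_true) / k)))   # MA coefficients
eps = rng.standard_normal(2 * n)
x = np.convolve(eps, psi, mode="full")[n:2 * n]                   # truncated ARFIMA(0, d, 0)
print(local_whittle_d(x, m=int(n ** 0.65)))                       # roughly 0.3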
  3. By: Zhu, Ke; Li, Wai-Keung
    Abstract: This paper proposes a Cramer-von Mises (CM) test statistic to check the adequacy of weak ARMA models. Without imposing a martingale difference assumption on the error terms, the asymptotic null distribution of the CM test is obtained using the Hilbert space approach. Moreover, this CM test is consistent and has nontrivial power against local alternatives of order $n^{-1/2}$. Due to the unknown dependence of the error terms and the estimation effects, a new block-wise random weighting method is constructed to bootstrap the critical values of the test statistic. The new method is easy to implement and its validity is justified. The theory is illustrated by a small simulation study and an application to the S&P 500 stock index.
    Keywords: Block-wise random weighting method; Diagnostic checking; Least squares estimation; Spectral test; Weak ARMA models; Wild bootstrap.
    JEL: C1 C12
    Date: 2013–11–06
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:51224&r=ecm
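    Illustration: the Python sketch below shows the generic idea of a block-wise random weighting bootstrap applied to a simple portmanteau-type statistic built from residual autocovariances; the weights are constant within blocks so that within-block dependence is preserved. It is an assumption-laden stand-in, not the paper's Cramer-von Mises spectral statistic, and it ignores the estimation effects that the paper's theory handles.
import numpy as np

rng = np.random.default_rng(2)
n, max_lag, block_len, B = 500, 10, 20, 499

# White-noise data and a least-squares AR(1) fit stand in for "weak ARMA" residuals
y = rng.standard_normal(n)
phi = np.dot(y[1:], y[:-1]) / np.dot(y[:-1], y[:-1])
e = y[1:] - phi * y[:-1]
e -= e.mean()
m = len(e)

def stat(weights):
    """Sum of squared (weighted) residual autocovariances up to max_lag."""
    s = 0.0
    for k in range(1, max_lag + 1):
        s += (np.sum(weights[k:] * e[k:] * e[:-k]) / m) ** 2
    return m * s

obs = stat(np.ones(m))                                   # observed statistic
boot = np.empty(B)
for b in range(B):
    w = np.repeat(rng.standard_normal(m // block_len + 1), block_len)[:m]
    boot[b] = stat(w)                                    # mean-zero weights, constant within blocks
p_value = (1 + np.sum(boot >= obs)) / (B + 1)
print(p_value)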
  4. By: Song-Yon Kim; Mun-Chol Kim
    Abstract: In this paper, nonlinear time series models are used to describe volatility in financial time series data. To describe volatility, two nonlinear time series models are combined into a TAR (Threshold Autoregressive) model with an AARCH (Asymmetric Autoregressive Conditional Heteroskedasticity) error term, and its parameter estimation is studied.
    Date: 2013–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1311.1154&r=ecm
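    Illustration: the abstract does not state the exact TAR-AARCH specification, so the Python sketch below simulates a hypothetical two-regime TAR(1) process whose errors follow a GJR-type asymmetric ARCH(1); the threshold, regime coefficients, and asymmetry term are illustrative assumptions rather than the paper's model.
import numpy as np

rng = np.random.default_rng(3)
T = 1000
phi_low, phi_high, threshold = 0.6, -0.3, 0.0   # TAR regime coefficients and threshold
omega, alpha, gamma = 0.1, 0.2, 0.15            # ARCH level, ARCH, and asymmetry parameters

y = np.zeros(T)
eps_prev = 0.0
for t in range(1, T):
    # Asymmetric ARCH(1): negative shocks raise next-period volatility more than positive ones
    h = omega + (alpha + gamma * (eps_prev < 0)) * eps_prev ** 2
    eps = np.sqrt(h) * rng.standard_normal()
    phi = phi_low if y[t - 1] <= threshold else phi_high
    y[t] = phi * y[t - 1] + eps
    eps_prev = eps
print(y.std())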
  5. By: Abhimanyu Gupta; Peter M. Robinson
    Abstract: This paper develops consistency and asymptotic normality of instrumental variables and least squares estimates for the parameters of a higher-order spatial autoregressive (SAR) model with regressors. The order of the SAR model and the number of regressors are allowed to approach infinity slowly with sample size, and the permissible rate of growth of the dimension of the parameter space relative to sample size is studied. Besides allowing the number of estimable parameters to increase with the data, this has the advantage of accommodating explicitly some asymptotic regimes that arise in practice. Illustrations are discussed, in particular settings in which the need for such theory arises quite naturally. A Monte Carlo study analyses various implications of the theory in finite samples. For empirical researchers, our work has implications for the choice of model: in particular, if the structure of the spatial weights matrix assumes a partitioning of the data, then the spatial parameters should be allowed to vary over clusters.
    Date: 2013–10–16
    URL: http://d.repec.org/n?u=RePEc:esx:essedp:735&r=ecm
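    Illustration: the Python sketch below estimates a second-order SAR model y = lambda_1 W_1 y + lambda_2 W_2 y + X beta + u by 2SLS, with instruments built from spatial lags of X, which is the standard instrumental-variables idea the paper's asymptotics cover. The random weight matrices, dimensions, and parameter values are illustrative assumptions; the increasing-dimension asymptotics are not reproduced.
import numpy as np

rng = np.random.default_rng(4)
n, p, K = 200, 2, 2                                    # sample size, regressors, SAR order

def row_normalise(A):
    A = A.astype(float) * (1 - np.eye(n))              # no self-neighbours
    rowsum = A.sum(axis=1, keepdims=True)
    rowsum[rowsum == 0] = 1.0                          # guard against empty rows
    return A / rowsum

W = [row_normalise(rng.random((n, n)) < 0.05) for _ in range(K)]   # random sparse weights
X = rng.standard_normal((n, p))
lam_true, beta_true = np.array([0.3, 0.2]), np.array([1.0, -0.5])

# Generate y from the reduced form (I - sum_k lambda_k W_k)^{-1} (X beta + u)
S = np.eye(n) - sum(l * Wk for l, Wk in zip(lam_true, W))
y = np.linalg.solve(S, X @ beta_true + rng.standard_normal(n))

# 2SLS: endogenous regressors D = [W_1 y, W_2 y, X]; instruments Z = [W_k X, W_k^2 X, X]
D = np.column_stack([Wk @ y for Wk in W] + [X])
Z = np.column_stack([Wk @ X for Wk in W] + [Wk @ (Wk @ X) for Wk in W] + [X])
P = Z @ np.linalg.solve(Z.T @ Z, Z.T)                  # projection onto the instrument space
theta = np.linalg.solve(D.T @ P @ D, D.T @ P @ y)      # estimates of (lambda_1, lambda_2, beta)
print(theta)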
  6. By: Schleer, Frauke
    Abstract: This paper focuses on finding starting values for maximum likelihood estimation of Vector STAR models. Based on a Monte Carlo exercise, different procedures are evaluated. Their performance is assessed with respect to model fit and computational effort. I employ i) grid search algorithms and ii) heuristic optimization procedures, namely differential evolution, threshold accepting, and simulated annealing. In the equation-by-equation starting-value search approach, the procedures achieve equally good results. Unless the errors are cross-correlated, equation-by-equation search followed by a derivative-based algorithm can handle such an optimization problem sufficiently well. This result also holds for higher-dimensional VSTAR models, with a slight edge for the heuristic methods. When faced with more complex Vector STAR models for which a multivariate search approach is required, simulated annealing and differential evolution outperform threshold accepting and the grid search with a zoom.
    Keywords: Vector STAR model, starting values, optimization heuristics, grid search, estimation, non-linearities
    JEL: C32 C61 C63
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:zbw:zewdip:13076&r=ecm
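    Illustration: one of the procedures evaluated above is a grid search in which the transition-function parameters are fixed on a grid and the remaining parameters are concentrated out by OLS. The Python sketch below applies that idea to a hypothetical univariate LSTAR(1) model; the grid ranges, simulated data, and logistic transition form are illustrative assumptions, not the paper's Monte Carlo design.
import numpy as np

rng = np.random.default_rng(5)

def G(s, gamma, c):
    """Logistic transition function."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

# Simulate an LSTAR(1) series: y_t = 0.5 y_{t-1} - 0.8 y_{t-1} G(y_{t-1}; 5, 0.2) + e_t
T = 600
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] - 0.8 * y[t - 1] * G(y[t - 1], 5.0, 0.2) + 0.3 * rng.standard_normal()

s, ylag, ycur = y[:-1], y[:-1], y[1:]
best = (np.inf, None)
for gamma in np.exp(np.linspace(np.log(0.5), np.log(50.0), 30)):    # log-spaced gamma grid
    for c in np.quantile(s, np.linspace(0.1, 0.9, 30)):             # c grid on sample quantiles
        g = G(s, gamma, c)
        Xmat = np.column_stack([np.ones_like(ylag), ylag, ylag * g])
        coef, *_ = np.linalg.lstsq(Xmat, ycur, rcond=None)          # linear parameters by OLS
        ssr = np.sum((ycur - Xmat @ coef) ** 2)
        if ssr < best[0]:
            best = (ssr, (gamma, c, coef))
print(best[1][0], best[1][1])    # starting values for (gamma, c), to be refined by MLE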
  7. By: Xavier D'Haultfoeuille (CREST); Stefan Hoderlein (Boston College); Yuya Sasaki (Johns Hopkins University)
    Abstract: This paper studies the identification of nonseparable models with continuous, endogenous regressors, also called treatments, using repeated cross sections. We show that several treatment effect parameters are identified under two assumptions on the effect of time, namely a weak stationarity condition on the distribution of unobservables and time variation in the distribution of endogenous regressors. Other treatment effect parameters are set identified under curvature conditions, but without any functional form restrictions. This result is related to the difference-in-differences idea, but imposes neither additive time effects nor exogenously defined control groups. Furthermore, we investigate two extrapolation strategies that allow us to point identify the entire model: using monotonicity of the error term, or imposing a linear correlated random coefficient structure. Finally, we illustrate our results by studying the effect of mother's age on infants' birth weight.
    Keywords: identification, repeated cross sections, nonlinear models, continuous treatment, random coefficients, endogeneity, difference-in-differences.
    Date: 2013–08–13
    URL: http://d.repec.org/n?u=RePEc:boc:bocoec:839&r=ecm
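    Illustration: for contrast with the nonseparable approach above, the Python sketch below estimates the additive benchmark the paper relaxes, a linear difference-in-differences regression with a continuous treatment in two repeated cross sections. The data-generating process and variable names are hypothetical; this is not the paper's estimator.
import numpy as np

rng = np.random.default_rng(6)
n = 2000
post = rng.integers(0, 2, n)                 # which cross section the observation belongs to
treat = rng.gamma(2.0, 1.0, n)               # continuous treatment intensity
y = 1.0 + 0.5 * treat + 0.3 * post + 0.4 * treat * post + rng.standard_normal(n)

# y = a + b*treat + d*post + tau*(treat*post) + u; tau is the additive DiD effect
X = np.column_stack([np.ones(n), treat, post, treat * post])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef[3])                               # close to the true interaction effect of 0.4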
  8. By: Marie Kratz (SID - Information Systems / Decision Sciences Department - ESSEC Business School, MAP5 - Mathématiques appliquées Paris 5 - CNRS : UMR8145 - Université Paris V - Paris Descartes)
    Abstract: Basel II and Solvency 2 both use the Value-at-Risk (VaR) as the risk measure to compute the Capital Requirements. In practice, to calibrate the VaR, a normal approximation is often chosen for the unknown distribution of the yearly log returns of financial assets. This is usually justified by appeal to the Central Limit Theorem (CLT), when assuming aggregation of independent and identically distributed (iid) observations in the portfolio model. Such a modeling choice, in particular the use of light-tailed distributions, proved during the crisis of 2008/2009 to be an inadequate approximation in the presence of extreme returns; as a consequence, it leads to a gross underestimation of the risks. The main objective of our study is to obtain the most accurate evaluations of the aggregated risk distribution and risk measures when working with financial or insurance data in the presence of heavy tails, and to provide practical solutions for accurately estimating high quantiles of aggregated risks. We explore a new method, called Normex, to handle this problem numerically as well as theoretically, based on properties of upper order statistics. Normex provides accurate results that are only weakly dependent upon the sample size and the tail index. We compare it with existing methods.
    Keywords: Aggregated risk ; (refined) Berry-Esséen Inequality ; (generalized) Central Limit Theorem ; Conditional (Pareto) Distribution ; Conditional (Pareto) Moment ; Convolution ; Expected Shortfall ; Extreme Values ; Financial Data ; High Frequency Data ; Market Risk ; Order Statistics ; Pareto Distribution ; Rate of Convergence ; Risk Measures ; Stable Distribution ; Value-at-Risk
    Date: 2013–11
    URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-00880258&r=ecm
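    Illustration: the Python sketch below reproduces the problem the paper starts from, not the Normex method itself: for an aggregate of heavy-tailed (Pareto) risks, the Gaussian/CLT quantile at the 99.5% level typically falls short of the simulated quantile. The tail index, the number of aggregated risks, and the simulation size are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
alpha, n_agg, level, reps = 2.5, 52, 0.995, 200_000   # tail index, risks aggregated, VaR level

x = rng.pareto(alpha, size=(reps, n_agg)) + 1.0       # classical Pareto draws with x_m = 1
agg = x.sum(axis=1)                                   # simulated aggregated risk

mu = n_agg * alpha / (alpha - 1)                          # mean of the aggregate
var = n_agg * alpha / ((alpha - 1) ** 2 * (alpha - 2))    # variance (finite since alpha > 2)
var_normal = stats.norm.ppf(level, loc=mu, scale=np.sqrt(var))
var_simulated = np.quantile(agg, level)
print(var_normal, var_simulated)                      # the normal approximation understates the VaR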
  9. By: Hull, Isaiah (Research Department, Central Bank of Sweden)
    Abstract: I construct an estimable statistic that predicts whether a financial innovation will spread. The approach embeds the multi-host SIR model from epidemiology within a financial model of correlated securities trade, and takes advantage of the related predictive tools from mathematical epidemiology, including the basic reproductive ratio (R0) and herd immunity. In the model, banks and their creditors are assumed to have imperfect information about a newly created security, and must search over the portfolios of other investors and intermediaries to infer the security's properties. In the absence of historical returns data, a large mass of firms holding the new security and not experiencing insolvency provides a positive signal about the distribution of its returns within the current period, and perpetuates further holding of the security. The model yields a set of structural equations that are used to construct the statistic. I provide two estimation strategies for the statistic and identify 12 theoretical parameter restrictions that enable inference when only a subset of the model's parameters is identifiable. I use the approach to predict the spread of exchange-traded funds (ETFs) and asset-backed securities (ABS). Additionally, I show how regulators can use the method to monitor the joint solvency of depository institutions within a given geographic region.
    Keywords: Econometric Modeling; Econometric Forecasting; Financial Econometrics; Financial Innovation
    JEL: C51 C53 G12 G14
    Date: 2013–10–01
    URL: http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0279&r=ecm
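    Illustration: the Python sketch below shows the epidemiological building block the paper adapts, a basic single-population SIR model with basic reproductive ratio R0 = beta/gamma and herd-immunity threshold 1 - 1/R0, where "infection" would correspond to holding the new security. The rates and initial shares are illustrative assumptions; the paper's multi-host structural statistic is not reproduced.
beta, gamma_rate = 0.4, 0.2       # contact (adoption) and recovery (abandonment) rates
R0 = beta / gamma_rate            # the innovation spreads if R0 > 1
herd_immunity = 1.0 - 1.0 / R0    # share immune at which the spread stops growing
print(R0, herd_immunity)          # 2.0, 0.5

# Discrete-time (Euler) simulation of the susceptible, infected, and recovered shares
S, I, R = 0.99, 0.01, 0.0
dt, peak = 0.1, 0.0
for _ in range(1000):
    new_inf = beta * S * I * dt
    new_rec = gamma_rate * I * dt
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
    peak = max(peak, I)
print(peak)                       # peak share holding the security at any one time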

This nep-ecm issue is ©2013 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.