nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒08‒25
twelve papers chosen by
Sune Karlsson
Örebro universitet

  1. Efficient Estimation for Diffusions Sampled at High Frequency Over a Fixed Time Interval By Nina Munkholt Jakobsen; Michael Sørensen
  2. Log-Transform Kernel Density Estimation of Income Distribution By Arthur Charpentier; Emmanuel Flachaire
  3. GMM Estimation of Affine Term Structure Models By Jaroslava Hlouskova; Leopold Sögner
  4. A note on the bootstrap method for testing the existence of finite moments By Fedotenkov, Igor
  5. Large sample properties of the matrix exponential spatial specification with an application to FDI By Nicolas Debarsy; Fei Jin; Lung-Fei Lee
  6. MGARCH models: tradeoff between feasibility and flexibility By Daniel De Almeida; Luiz Hotta; Esther Ruiz
  7. A Note on Estimating Variance of Finite Population Distribution Function By Sumanta Adhya; Tathagata Banerjee; Gouranga Chattopadhyay
  8. On an asymmetric extension of multivariate Archimedean copulas By Elena Di Bernardino; Didier Rullière
  9. Large and moderate deviations for the threshold estimator of the integrated variance-covariance vector By Hacène Djellout; Hui Jiang
  10. Which pricing approach for options under GARCH with non-normal innovations? By Jean-Guy Simonato; Lars Stentoft
  11. Towards improving the framework for probabilistic forecast evaluation By Leonard A. Smith; Emma B. Suckling; Erica L. Thompson; Trevor Maynard; Hailiang Du
  12. Structural and atheoretic approaches to micro-econometrics of public policy evaluation (in French) By S. Roux

  1. By: Nina Munkholt Jakobsen (University of Copenhagen); Michael Sørensen (University of Copenhagen and CREATES)
    Abstract: Parametric estimation for diffusion processes is considered for high frequency observations over a fixed time interval. The processes solve stochastic differential equations with an unknown parameter in the diffusion coefficient. We find easily verified conditions on approximate martingale estimating functions under which estimators are consistent, rate optimal, and efficient under high frequency (in-fill) asymptotics. The asymptotic distributions of the estimators are shown to be normal variance-mixtures, where the mixing distribution generally depends on the full sample path of the diffusion process over the observation time interval. Utilising the concept of stable convergence, we also obtain the more easily applicable result that for a suitable data dependent normalisation, the estimators converge in distribution to a standard normal distribution. The theory is illustrated by a small simulation study comparing an efficient and a non-efficient estimating function.
    Keywords: Approximate martingale estimating functions, discrete time sampling of diffusions, in-fill asymptotics, normal variance-mixtures, optimal rate, random Fisher information, stable convergence, stochastic differential equation.
    JEL: C22
    Date: 2015–08–06
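As a rough illustration of the in-fill setting (not the paper's general approximate martingale estimating functions), the simplest case dX_t = σ dW_t already shows how high-frequency increments over a fixed interval identify the diffusion parameter:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate dX_t = sigma * dW_t on [0, T] at high frequency (in-fill
# asymptotics: the interval stays fixed while the step size shrinks).
sigma, n, T = 0.5, 100_000, 1.0
dt = T / n
dx = sigma * np.sqrt(dt) * rng.standard_normal(n)

# The realized quadratic variation over [0, T], divided by T,
# estimates sigma^2 consistently as the step size shrinks.
sigma2_hat = np.sum(dx**2) / T
```

In this constant-coefficient case the Fisher information is non-random; the paper's normal variance-mixture limits arise once the diffusion coefficient depends on the sample path.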
  2. By: Arthur Charpentier (UQAM - Université du Québec à Montréal); Emmanuel Flachaire (AMSE - Aix-Marseille School of Economics - EHESS - École des hautes études en sciences sociales - Centre national de la recherche scientifique (CNRS) - Ecole Centrale Marseille (ECM) - AMU - Aix-Marseille Université)
    Abstract: Standard kernel density estimation methods are very often used in practice to estimate density functions. They work well in numerous cases. However, they are known not to work so well with skewed, multimodal and heavy-tailed distributions. Such features are usual with income distributions, defined over the positive support. In this paper, we show that a preliminary logarithmic transformation of the data, combined with standard kernel density estimation methods, can provide a much better fit of the density.
    Date: 2014–11
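The transformation is standard enough to sketch: estimate the density of the logged data with an ordinary kernel, then back-transform with the Jacobian 1/x. A minimal sketch, assuming SciPy's Gaussian KDE as the base estimator (the paper's bandwidth choices may differ):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Heavy-tailed, skewed "income" sample (lognormal for illustration).
income = rng.lognormal(mean=10.0, sigma=1.0, size=5_000)

# Log-transform approach: estimate the density of log(income) with a
# standard kernel, then back-transform with the Jacobian 1/x:
#   f_X(x) = f_Y(log x) / x,  where Y = log X.
kde_logs = gaussian_kde(np.log(income))

def density_log_transform(x):
    x = np.asarray(x, dtype=float)
    return kde_logs(np.log(x)) / x

grid = np.linspace(1e3, 5e5, 200)
fx = density_log_transform(grid)
```

Working on the log scale symmetrizes the skew and compresses the heavy right tail, which is why a fixed-bandwidth kernel performs much better there than on the levels.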
  3. By: Jaroslava Hlouskova; Leopold S\"ogner
    Abstract: This article investigates parameter estimation of affine term structure models by means of the generalized method of moments. Exact moments of the affine latent process as well as of the yields are obtained by using results derived for p-polynomial processes. Then the generalized method of moments, combined with Quasi-Bayesian methods, is used to get reliable parameter estimates and to perform inference. After a simulation study, the estimation procedure is applied to empirical interest rate data.
    Date: 2015–08
  4. By: Fedotenkov, Igor
    Abstract: This paper discusses a bootstrap-based test, which checks whether finite moments exist, and indicates cases of possible misapplication. It notes that a procedure for finding the smallest power to which observations need to be raised, such that the test rejects the hypothesis that the corresponding moment is finite, works poorly as an estimator of the tail index or of moments. This is especially the case for very low- and high-order moments. Several examples of correct usage of the test are also shown. The main result is derived analytically, and a Monte Carlo experiment is presented.
    Keywords: Bootstrap, finite moment, heavy tails, tail index, test.
    JEL: C00 C12
    Date: 2015–06–16
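The phenomenon under test can be illustrated without the paper's bootstrap machinery. A minimal sketch, assuming a Pareto tail with index α = 2.5, so that E|X|^p is finite if and only if p < α:

```python
import numpy as np

rng = np.random.default_rng(1)

# Pareto sample with tail index alpha = 2.5 via inverse-CDF sampling:
# CDF F(x) = 1 - x^(-alpha) for x >= 1.
alpha = 2.5
x = (1.0 - rng.random(1_000_000)) ** (-1.0 / alpha)

# The first moment exists (E X = alpha / (alpha - 1)), so its sample
# version stabilises; the fourth moment does not exist, so its sample
# version is dominated by the few largest observations and keeps
# drifting as the sample grows.
mean_p1 = x.mean()                        # settles near 2.5 / 1.5
mean_p4_half = np.mean(x[:500_000] ** 4)  # unstable across subsamples
mean_p4_full = np.mean(x ** 4)
```

This instability of sample moments beyond the tail index is exactly what a finite-moment test must separate from ordinary sampling noise.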
  5. By: Nicolas Debarsy (LEO - Laboratoire d'Économie d'Orléans - CNRS - UO - Université d'Orléans); Fei Jin (School of Economics - SUFE - Shanghai University of Finance and Economics); Lung-Fei Lee (Department of Economics - OSU - Ohio State University [Columbus])
    Abstract: This paper studies large sample properties of the matrix exponential spatial specification (MESS). We find that the quasi-maximum likelihood estimator (QMLE) for the MESS is consistent under heteroskedasticity, a property not shared by the QMLE of the SAR model. For the general model that has MESS in both the dependent variable and disturbances, labeled MESS(1,1), the QMLE can be consistent under unknown heteroskedasticity when the spatial weights matrices in the two MESS processes are commutative. We also consider the generalized method of moments estimator (GMME). In the homoskedastic case, we derive a best GMME that is as efficient as the maximum likelihood estimator under normality and can be asymptotically more efficient than the QMLE under non-normality. In the heteroskedastic case, an optimal GMME can be more efficient than the QMLE asymptotically. The QML approach for the MESS model has the computational advantage over that of a SAR model. The computational simplicity carries over to MESS models with any finite order of spatial matrices. No parameter range needs to be imposed in order for the model to be stable. Results of Monte Carlo experiments for finite sample properties of the estimators are reported. Finally, the MESS(1,1) is applied to Belgium's outward FDI data and we observe that the dominant motivation of Belgium's outward FDI lies in finding cheaper factor inputs.
    Date: 2015–09–01
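The stability claim (no parameter range needs to be imposed) follows from the matrix exponential always being invertible, with inverse exp(-αW). A minimal sketch, assuming a small ring-shaped, row-normalised weights matrix:

```python
import numpy as np
from scipy.linalg import expm

# Row-normalised spatial weights matrix W for a ring of 6 units:
# each unit's neighbours are the two adjacent units.
n = 6
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

# MESS transforms the dependent variable by a matrix exponential:
#   exp(alpha * W) y = X beta + eps.
# exp(alpha * W) is invertible for every alpha, with inverse
# exp(-alpha * W), so no stability restriction on alpha is needed.
alpha = 1.7  # deliberately outside the range a SAR coefficient would allow
S = expm(alpha * W)
S_inv = expm(-alpha * W)
```

Contrast this with a SAR model, where (I - ρW) must be invertible and ρ is restricted to an interval determined by the eigenvalues of W.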
  6. By: Daniel De Almeida; Luiz Hotta; Esther Ruiz
    Abstract: The parameters of popular multivariate GARCH (MGARCH) models are restricted so that their estimation is feasible in large systems and so that covariance stationarity and positive definiteness of the conditional covariance matrices are guaranteed. These restrictions limit the dynamics the models can represent, assuming, for example, that volatilities evolve in a univariate fashion, related neither to each other nor to the correlations. This paper updates previous surveys on parametric MGARCH models, focusing on their limitations in representing the dynamics observed in real systems of financial returns. The conclusions are illustrated using simulated data and a five-dimensional system of exchange rate returns.
    Keywords: BEKK , DCC , Multivariate conditional heteroscedasticity , Variance targeting , VECH
    JEL: C32 C52 C58
    Date: 2015–07
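The feasibility end of the tradeoff can be sketched with the scalar VECH recursion under variance targeting, one of the restricted specifications such surveys cover. The scalars (a, b) below are illustrative values, not estimates:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two series of demeaned returns (simulated placeholders).
T_obs, k = 1_000, 2
r = rng.standard_normal((T_obs, k)) * 0.01

# Scalar VECH with variance targeting: every variance and covariance
# shares the same two scalars (a, b), and the intercept is pinned to
# the sample covariance S so the model targets it unconditionally:
#   H_t = (1 - a - b) * S + a * r_{t-1} r_{t-1}' + b * H_{t-1}.
a, b = 0.05, 0.90
S = np.cov(r, rowvar=False)
H = S.copy()
for t in range(1, T_obs):
    H = (1 - a - b) * S + a * np.outer(r[t - 1], r[t - 1]) + b * H

# With a, b >= 0 and a + b < 1, H remains positive definite.
eigvals = np.linalg.eigvalsh(H)
```

Feasibility comes from estimating only two dynamic parameters regardless of dimension; the cost, as the survey stresses, is that all variances and covariances are forced to share identical dynamics.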
  7. By: Sumanta Adhya; Tathagata Banerjee; Gouranga Chattopadhyay
    Abstract: Estimating the finite population distribution function is an important problem for survey samplers, since it summarizes almost all the relevant information of interest about the finite population. Moreover, due to its nonlinearity, estimating the variance of estimators of the distribution function has remained an active area of research since Chambers et al. (1992). Both analytic and resampling-based variance estimators have been developed. Here we propose a bootstrap hybrid variance estimator for a model-based semi-parametric estimator of the finite population distribution function. We prove its consistency and also show that its numerical performance is superior to that of the analytical estimator.
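For orientation only: the design-based distribution function estimator, and a naive bootstrap variance for it, can be sketched as below. This is the generic baseline, not the paper's model-based hybrid estimator:

```python
import numpy as np

rng = np.random.default_rng(4)

# Finite population of N incomes; we observe a simple random sample of n.
N, n = 10_000, 400
population = rng.lognormal(10.0, 1.0, size=N)
sample = rng.choice(population, size=n, replace=False)

# Design-based estimator of the population distribution function at t:
#   F_hat(t) = (1/n) * sum 1{y_i <= t}.
t = np.median(population)
F_hat = np.mean(sample <= t)

# Naive bootstrap variance of F_hat(t): resample the sample with
# replacement and take the variance of the replicated estimates.
B = 2_000
boot = np.array([np.mean(rng.choice(sample, size=n) <= t) for _ in range(B)])
var_boot = boot.var(ddof=1)
```

The nonlinearity the abstract mentions enters because F_hat(t) is an indicator-based statistic, which is why resampling approaches are attractive here.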
  8. By: Elena Di Bernardino (CEDRIC - Centre d'Etude et De Recherche en Informatique du Cnam - Conservatoire National des Arts et Métiers [CNAM]); Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1)
    Abstract: Archimedean copulas are copulas determined by a specific real function, called the generator. Composed with the copula at a given point, this generator can be expressed as a linear form of the generators of the point's components. In this paper, we discuss the case where this function is instead expressed as a quadratic form (the resulting copulas are called here multivariate Archimatrix copulas). This extends Archimedean copulas and allows, for example, the construction of asymmetric copulas. The parameters of this new class are grouped in a matrix, facilitating usual applications such as level-curve determination or estimation. Choices such as sub-model stability help associate each parameter with one bivariate projection of the copula. We also give admissibility conditions for the considered Archimatrix copulas. We propose different examples, including natural multivariate extensions of Farlie-Gumbel-Morgenstern, Gumbel-Barnett, or particular Archimax copulas.
    Date: 2015–05–04
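The symmetric baseline being extended can be made concrete: an Archimedean copula applies one generator to the sum of inverse-generator terms, which forces exchangeability. The Clayton generator below is a standard example; the paper's quadratic Archimatrix form replaces this linear sum:

```python
import numpy as np

# A d-variate Archimedean copula built from a single generator psi:
#   C(u_1, ..., u_d) = psi(psi^{-1}(u_1) + ... + psi^{-1}(u_d)).
# Clayton generator (theta > 0) as a concrete example:
theta = 2.0
psi = lambda t: (1.0 + t) ** (-1.0 / theta)
psi_inv = lambda u: u ** (-theta) - 1.0

def clayton(u):
    u = np.asarray(u, dtype=float)
    return psi(np.sum(psi_inv(u)))

# The arguments enter only through an unordered sum, so the copula is
# exchangeable (symmetric in its arguments):
c1 = clayton([0.3, 0.7, 0.5])
c2 = clayton([0.7, 0.5, 0.3])
```

Replacing the sum with a quadratic form in the psi_inv terms breaks this built-in symmetry, which is precisely what makes asymmetric extensions possible.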
  9. By: Hacène Djellout (Laboratoire de Mathématiques - UBP - Université Blaise Pascal - Clermont-Ferrand 2 - CNRS); Hui Jiang (Nanjing University of Aeronautics and Astronautics - Department of Mathematics)
    Abstract: Recently, considerable interest has been paid in financial econometrics to the estimation of realized volatility and covolatility using high-frequency data on financial price processes. Threshold estimation is one of the useful techniques for inference on jump-type stochastic processes from discrete observations. In this paper, we adopt the threshold estimator introduced by Mancini, where only the variations under a given threshold function are taken into account. The purpose of this work is to investigate large and moderate deviations for the threshold estimator of the integrated variance-covariance vector. This paper extends previous work by Djellout, Guillin and Samoura, where the problem was studied in the absence of a jump component. We use an approximation lemma to prove the LDP. As the reader may expect, we obtain the same results as in the case without jumps.
    Date: 2015–04–03
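Mancini-type thresholding is easy to sketch: discard squared increments above a vanishing threshold so that jumps do not contaminate the integrated variance. A minimal one-dimensional sketch, with an illustrative threshold choice r(Δ) = Δ^0.9:

```python
import numpy as np

rng = np.random.default_rng(5)

# High-frequency increments of a jump diffusion on [0, 1]:
# continuous part sigma * dW plus a handful of large jumps.
sigma, n = 0.3, 50_000
dt = 1.0 / n
dx = sigma * np.sqrt(dt) * rng.standard_normal(n)
jump_times = rng.choice(n, size=5, replace=False)
dx[jump_times] += rng.normal(0.0, 0.5, size=5)

# Threshold estimator: keep only increments whose square falls below a
# vanishing threshold function of the step size. Continuous increments
# are of order sqrt(dt), so they pass; jumps do not.
threshold = dt ** 0.9
rv_plain = np.sum(dx ** 2)                         # contaminated by jumps
rv_thresh = np.sum(dx[dx ** 2 <= threshold] ** 2)  # close to sigma^2 = 0.09
```

The deviation results in the paper concern exactly this truncated sum, as the sampling frequency increases.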
  10. By: Jean-Guy Simonato (HEC Montréal); Lars Stentoft (University of Western Ontario and CREATES)
    Abstract: Two different pricing frameworks are typically used in the literature when pricing options under GARCH with non-normal innovations: the equilibrium approach and the no-arbitrage approach. Each framework can accommodate various forms of GARCH and innovation distributions, but empirical implementation and tests are typically done in one framework or the other because of the computational challenges involved in obtaining the relevant pricing parameters. We contribute to the literature by comparing and documenting the empirical performance of a GARCH specification which can be readily implemented in both pricing frameworks. The model uses a parsimonious GARCH specification with skewed and leptokurtic Johnson Su innovations together with either the equilibrium-based framework or the no-arbitrage-based framework. Using a large sample of options on the S&P 500 index, we find that the two approaches give rise to very similar pricing errors when implemented with time-varying pricing parameters. However, when implemented with constant pricing parameters, the performance of the no-arbitrage approach deteriorates in periods of high volatility relative to the equilibrium approach, whose performance remains stable and on par with the models with time-varying pricing parameters.
    Keywords: Option pricing, Equilibrium approach, No-arbitrage approach
    JEL: C22 C53 G13
    Date: 2015–07–10
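The Monte Carlo backbone shared by both pricing frameworks can be sketched under simplifying assumptions: normal innovations rather than the paper's Johnson Su, and risk-neutral parameters taken as given rather than derived within either framework:

```python
import numpy as np

rng = np.random.default_rng(6)

# European call under a risk-neutral GARCH(1,1) with normal innovations.
# Illustrative parameters; r is a daily rate and T_days the maturity.
S0, K, r, T_days = 100.0, 100.0, 0.0002, 60
omega, a, b = 1e-6, 0.08, 0.90
n_paths = 20_000

h = np.full(n_paths, omega / (1 - a - b))  # start at unconditional variance
logS = np.full(n_paths, np.log(S0))
for _ in range(T_days):
    z = rng.standard_normal(n_paths)
    logS += r - 0.5 * h + np.sqrt(h) * z   # martingale condition under Q
    h = omega + a * h * z**2 + b * h       # GARCH(1,1) variance recursion

payoff = np.maximum(np.exp(logS) - K, 0.0)
call = np.exp(-r * T_days) * payoff.mean()
```

What distinguishes the equilibrium and no-arbitrage approaches is how the physical parameters and innovation distribution are mapped into this risk-neutral recursion, not the simulation step itself.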
  11. By: Leonard A. Smith; Emma B. Suckling; Erica L. Thompson; Trevor Maynard; Hailiang Du
    Abstract: The evaluation of forecast performance plays a central role both in the interpretation and use of forecast systems and in their development. Different evaluation measures (scores) are available, often quantifying different characteristics of forecast performance. The properties of several proper scores for probabilistic forecast evaluation are contrasted and then used to interpret decadal probability hindcasts of global mean temperature. The Continuous Ranked Probability Score (CRPS), Proper Linear (PL) score, and IJ Good's logarithmic score (also referred to as Ignorance) are compared; although information from all three may be useful, the logarithmic score has an immediate interpretation and is not insensitive to forecast busts. Neither CRPS nor PL is local; this is shown to produce counterintuitive evaluations under CRPS. Benchmark forecasts from empirical models like Dynamic Climatology place the scores in context. Comparing scores of forecast systems based on physical models (in this case HadCM3, from the CMIP5 decadal archive) against such benchmarks is more informative than comparing forecast systems based on similar physical simulation models only with each other. It is shown that a forecast system based on HadCM3 outperforms Dynamic Climatology in decadal global mean temperature hindcasts; Dynamic Climatology previously outperformed a forecast system based upon HadGEM2, and reasons for these results are suggested. Forecasts of aggregate data (5-year means of global mean temperature) are, of course, narrower than forecasts of annual averages due to the suppression of variance; while the average “distance” between the forecasts and a target may be expected to decrease, little if any discernible improvement in probabilistic skill is achieved.
    JEL: C1
    Date: 2015–07–17
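The contrast between a local and a non-local score can be sketched for an ensemble forecast. The numbers below are illustrative, not the paper's hindcast data, and the Gaussian dressing of the ensemble is an assumption for the sketch:

```python
import numpy as np

rng = np.random.default_rng(7)

# An m-member ensemble forecast and the verifying observation.
ensemble = rng.normal(0.4, 0.1, size=100)
obs = 0.55

# Ignorance (logarithmic score) of a Gaussian fitted to the ensemble:
# -log2 of the forecast density at the outcome. It is local: it depends
# only on the probability assigned to what actually happened.
mu, sd = ensemble.mean(), ensemble.std(ddof=1)
density = np.exp(-0.5 * ((obs - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
ignorance = -np.log2(density)

# CRPS for an ensemble: E|X - y| - 0.5 * E|X - X'|. It is not local:
# it also rewards ensemble members that are merely near the outcome.
abs_err = np.mean(np.abs(ensemble - obs))
spread = np.mean(np.abs(ensemble[:, None] - ensemble[None, :]))
crps = abs_err - 0.5 * spread
```

Locality is what makes the logarithmic score sensitive to forecast busts: if the outcome falls where the forecast puts negligible probability, Ignorance explodes while CRPS may barely react.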
  12. By: S. Roux
    Abstract: This article aims to present and compare structural and atheoretic approaches to the micro-econometrics of public policy evaluation. Although these approaches are often opposed because they rely on different scientific methodologies, they complement each other in the lessons that can be drawn from them. Two illustrations are presented: the evaluation of the 1998 workweek reduction (Crépon, Leclair, Roux [1998]) and the local effect of speed enforcement cameras on road accidents (Roux, Zamora [2013]).
    Keywords: Evaluation methods, Structural models, Natural Experiments.
    JEL: B23 C10 C21 C52
    Date: 2015

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.