
on Econometrics 
By:  Nina Munkholt Jakobsen (University of Copenhagen); Michael Sørensen (University of Copenhagen and CREATES) 
Abstract:  Parametric estimation for diffusion processes is considered for high-frequency observations over a fixed time interval. The processes solve stochastic differential equations with an unknown parameter in the diffusion coefficient. We find easily verified conditions on approximate martingale estimating functions under which estimators are consistent, rate optimal, and efficient under high-frequency (infill) asymptotics. The asymptotic distributions of the estimators are shown to be normal variance-mixtures, where the mixing distribution generally depends on the full sample path of the diffusion process over the observation time interval. Utilising the concept of stable convergence, we also obtain the more easily applicable result that, for a suitable data-dependent normalisation, the estimators converge in distribution to a standard normal distribution. The theory is illustrated by a small simulation study comparing an efficient and a non-efficient estimating function. 
Keywords:  Approximate martingale estimating functions, discrete-time sampling of diffusions, infill asymptotics, normal variance-mixtures, optimal rate, random Fisher information, stable convergence, stochastic differential equation. 
JEL:  C22 
Date:  2015–08–06 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201533&r=ecm 
By:  Arthur Charpentier (UQAM - Université du Québec à Montréal); Emmanuel Flachaire (AMSE - Aix-Marseille School of Economics - EHESS - École des hautes études en sciences sociales - CNRS - Centre national de la recherche scientifique - ECM - École Centrale Marseille - AMU - Aix-Marseille Université) 
Abstract:  Standard kernel density estimation methods are very often used in practice to estimate density functions. They work well in numerous cases but are known not to work so well with skewed, multimodal and heavy-tailed distributions. Such features are common in income distributions, which are defined over the positive support. In this paper, we show that a preliminary logarithmic transformation of the data, combined with standard kernel density estimation methods, can provide a much better fit of the density. 
Date:  2014–11 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:halshs01115988&r=ecm 
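The transformation the abstract describes can be sketched in a few lines: estimate a kernel density on the log of the data, then map it back with the change-of-variables formula f_X(x) = f_Y(log x)/x. This is a minimal illustration of the idea, not the authors' implementation; the log-normal "income" sample and bandwidth choices are assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical positive, right-skewed "income" sample (log-normal).
x = rng.lognormal(mean=0.0, sigma=1.0, size=2000)

# Standard KDE applied directly to the skewed data.
kde_direct = gaussian_kde(x)

# KDE on the log-transformed data; back-transform the density:
# if Y = log(X) has density f_Y, then f_X(x) = f_Y(log x) / x.
kde_log = gaussian_kde(np.log(x))

def density_logkde(t):
    t = np.asarray(t, dtype=float)
    return kde_log(np.log(t)) / t

grid = np.linspace(0.05, 10.0, 500)
f_direct = kde_direct(grid)   # tends to oversmooth near zero
f_log = density_logkde(grid)  # adapts to the skewed positive support
```

Both estimators return valid densities on the positive half-line; the point of the paper is that the log-transform version fits skewed, heavy-tailed shapes much better.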
By:  Jaroslava Hlouskova; Leopold Sögner 
Abstract:  This article investigates parameter estimation of affine term structure models by means of the generalized method of moments. Exact moments of the affine latent process, as well as of the yields, are obtained by using results derived for polynomial processes. Then the generalized method of moments, combined with Quasi-Bayesian methods, is used to obtain reliable parameter estimates and to perform inference. After a simulation study, the estimation procedure is applied to empirical interest rate data. 
Date:  2015–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1508.01661&r=ecm 
By:  Fedotenkov, Igor 
Abstract:  This paper discusses a bootstrap-based test that checks whether finite moments exist, and indicates cases of possible misapplication. It notes that a procedure for finding the smallest power to which observations need to be raised, such that the test rejects the hypothesis that the corresponding moment is finite, works poorly as an estimator of the tail index or as a moment estimator, especially for very low- and high-order moments. Several examples of correct usage of the test are also shown. The main result is derived analytically, and a Monte Carlo experiment is presented. 
Keywords:  Bootstrap, finite moment, heavy tails, tail index, test. 
JEL:  C00 C12 
Date:  2015–06–16 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:66033&r=ecm 
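The distinction between finite and infinite moments that the test targets is easy to see in simulation. The sketch below (my illustration, not the paper's test) draws from a Pareto distribution with tail index alpha = 2.5, for which E|X|^k is finite only when k < alpha: sample moments of low order stabilize across replications, while high-order sample moments fluctuate wildly.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 2.5  # tail index: E|X|^k is finite iff k < alpha

def sample_moment(n, k):
    # Pareto(alpha) draws on [1, inf) via inverse transform: X = U^(-1/alpha).
    x = rng.uniform(size=n) ** (-1.0 / alpha)
    return np.mean(x ** k)

# 1st moment (k = 1 < alpha): the sample mean settles down as n grows.
# 4th moment (k = 4 > alpha): no finite limit, dominated by the sample max.
m1 = [sample_moment(100_000, 1) for _ in range(20)]
m4 = [sample_moment(100_000, 4) for _ in range(20)]

# Dispersion relative to the level across replications.
cv1 = np.std(m1) / np.mean(m1)   # small: the moment exists
cv4 = np.std(m4) / np.mean(m4)   # large: the moment does not exist
```

This instability of high-order sample moments is precisely why, as the abstract warns, a power-search procedure built on such a test makes a poor tail-index or moment estimator.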
By:  Nicolas Debarsy (LEO - Laboratoire d'Économie d'Orléans - CNRS - UO - Université d'Orléans); Fei Jin (School of Economics - SUFE - Shanghai University of Finance and Economics); Lung-Fei Lee (Department of Economics - OSU - Ohio State University [Columbus]) 
Abstract:  This paper studies large sample properties of the matrix exponential spatial specification (MESS). We find that the quasi-maximum likelihood estimator (QMLE) for the MESS is consistent under heteroskedasticity, a property not shared by the QMLE of the SAR model. For the general model that has MESS in both the dependent variable and the disturbances, labeled MESS(1,1), the QMLE can be consistent under unknown heteroskedasticity when the spatial weights matrices in the two MESS processes are commutative. We also consider the generalized method of moments estimator (GMME). In the homoskedastic case, we derive a best GMME that is as efficient as the maximum likelihood estimator under normality and can be asymptotically more efficient than the QMLE under non-normality. In the heteroskedastic case, an optimal GMME can be more efficient than the QMLE asymptotically. The QML approach for the MESS model has a computational advantage over that of a SAR model, and this computational simplicity carries over to MESS models with any finite order of spatial matrices. No parameter range needs to be imposed in order for the model to be stable. Results of Monte Carlo experiments on the finite sample properties of the estimators are reported. Finally, the MESS(1,1) is applied to Belgium's outward FDI data, and we observe that the dominant motivation of Belgium's outward FDI lies in finding cheaper factor inputs. 
Date:  2015–09–01 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal00858174&r=ecm 
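The "no parameter range" remark in the abstract follows from the matrix exponential itself: exp(alpha W) is invertible for every alpha, with inverse exp(-alpha W), so the model is always stable. A minimal simulation sketch of the MESS in the dependent variable, exp(alpha W) y = X beta + eps, under an assumed random weights matrix (not the paper's empirical design):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
n = 30

# Hypothetical row-normalized spatial weights matrix W (zero diagonal).
W = rng.uniform(size=(n, n))
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)

alpha, beta = 0.4, 1.5
X = rng.normal(size=n)
eps = rng.normal(scale=0.1, size=n)

# MESS: exp(alpha W) y = X beta + eps, so y = exp(-alpha W)(X beta + eps).
# exp(alpha W) is invertible for ANY alpha -- hence no stability range.
S = expm(alpha * W)
y = expm(-alpha * W) @ (X * beta + eps)

# Applying the forward transform recovers the regression disturbances.
resid = S @ y - X * beta
```

Contrast this with the SAR model (I - rho W) y = X beta + eps, where rho must be restricted so that I - rho W stays invertible.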
By:  Daniel De Almeida; Luiz Hotta; Esther Ruiz 
Abstract:  The parameters of popular multivariate GARCH (MGARCH) models are restricted so that their estimation is feasible in large systems and so that covariance stationarity and positive definiteness of the conditional covariance matrices are guaranteed. These restrictions limit the dynamics that the models can represent, assuming, for example, that volatilities evolve in a univariate fashion, related neither to each other nor to the correlations. This paper updates previous surveys on parametric MGARCH models, focusing on their limitations in representing the dynamics observed in real systems of financial returns. The conclusions are illustrated using simulated data and a five-dimensional system of exchange rate returns. 
Keywords:  BEKK, DCC, Multivariate conditional heteroscedasticity, Variance targeting, VECH 
JEL:  C32 C52 C58 
Date:  2015–07 
URL:  http://d.repec.org/n?u=RePEc:cte:wsrepe:ws1516&r=ecm 
By:  Sumanta Adhya; Tathagata Banerjee; Gouranga Chattopadhyay 
Abstract:  Estimating the finite population distribution function is an important problem for survey samplers, since it summarizes almost all the relevant information of interest about the finite population. Moreover, owing to its nonlinearity, estimating the variance of estimators of the distribution function has remained an active area of research since Chambers et al. (1992). Both analytic and resampling-based variance estimators have been developed. Here we propose a bootstrap hybrid variance estimator for a model-based semiparametric estimator of the finite population distribution function. We prove its consistency and also show that its numerical performance is superior to that of the analytic estimator. 
URL:  http://d.repec.org/n?u=RePEc:iim:iimawp:13715&r=ecm 
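The basic objects in this abstract are easy to make concrete. Under simple iid sampling (a deliberately simpler design than the paper's model-based semiparametric setting), the distribution function at a point t is estimated by the empirical proportion of observations below t, and a bootstrap variance estimate is obtained by resampling with replacement; the gamma "population" below is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical sample from a positive-valued population.
y = rng.gamma(shape=2.0, scale=1.0, size=200)
t = 2.0  # point at which the distribution function is estimated

# Point estimate: the empirical distribution function at t.
f_hat = np.mean(y <= t)

# Bootstrap variance: resample with replacement, re-estimate, take the
# variance across the B bootstrap replicates.
B = 2000
boot = np.array([np.mean(rng.choice(y, size=y.size) <= t) for _ in range(B)])
var_boot = boot.var(ddof=1)

# Analytic benchmark for iid sampling: F(1 - F) / n.
var_analytic = f_hat * (1.0 - f_hat) / y.size
```

In this toy setting the two variance estimates agree closely; the paper's contribution is a hybrid bootstrap that remains valid for the more complex model-based estimator, where no such simple analytic formula is adequate.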
By:  Elena Di Bernardino (CEDRIC - Centre d'Étude et de Recherche en Informatique du Cnam - Conservatoire National des Arts et Métiers [CNAM]); Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1) 
Abstract:  Archimedean copulas are copulas determined by a specific real function, called the generator. Composed with the copula at a given point, this generator can be expressed as a linear form of the generators of the components of that point. In this paper, we discuss the case where this function is instead expressed as a quadratic form (the resulting copulas are called here multivariate Archimatrix copulas). This extends Archimedean copulas and allows, for example, the construction of asymmetric copulas. The parameters of this new class of copulas are grouped within a matrix, which facilitates usual applications such as level-curve determination or estimation. Choices such as sub-model stability help associate each parameter with one bivariate projection of the copula. We also give admissibility conditions for the considered Archimatrix copulas. We propose several examples, including natural multivariate extensions of the Farlie-Gumbel-Morgenstern, Gumbel-Barnett, and particular Archimax copulas. 
Date:  2015–05–04 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal01147778&r=ecm 
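The generator construction in the abstract's first sentences can be written out directly: an Archimedean copula is C(u, v) = psi(psi_inv(u) + psi_inv(v)) for a generator psi. The sketch below uses the Clayton generator as a concrete example of the classical (linear-form) case that the paper generalizes; the Clayton choice and theta value are illustrative assumptions, not the Archimatrix construction itself.

```python
def clayton_copula(u, v, theta=2.0):
    """Archimedean copula C(u,v) = psi(psi_inv(u) + psi_inv(v)) built from
    the Clayton generator psi(t) = (1 + t)^(-1/theta), theta > 0.
    Arguments u, v must lie in (0, 1]."""
    psi_inv = lambda w: w ** (-theta) - 1.0       # inverse generator
    psi = lambda t: (1.0 + t) ** (-1.0 / theta)   # generator
    return psi(psi_inv(u) + psi_inv(v))
```

Two sanity checks follow from the definition: the uniform margins property C(u, 1) = u, and agreement with the closed form (u^-theta + v^-theta - 1)^(-1/theta). The Archimatrix idea replaces the sum psi_inv(u) + psi_inv(v) with a quadratic form in the component generators, with the coefficients collected in a matrix.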
By:  Hacène Djellout (Laboratoire de Mathématiques - UBP - Université Blaise Pascal - Clermont-Ferrand 2 - CNRS); Hui Jiang (Nanjing University of Aeronautics and Astronautics - Department of Mathematics) 
Abstract:  Considerable interest has recently been paid in financial econometrics to the problem of estimating realized volatility and co-volatility from high-frequency data on financial price processes. Threshold estimation is a useful technique for inference on jump-type stochastic processes from discrete observations. In this paper, we adopt the threshold estimator introduced by Mancini, in which only the variations below a given threshold function are taken into account. The purpose of this work is to investigate large and moderate deviations for the threshold estimator of the integrated variance-covariance vector. This paper extends previous work by Djellout, Guillin and Samoura, where the problem was studied in the absence of a jump component. We use an approximation lemma to prove the large deviation principle and, as one might expect, obtain the same results as in the case without jumps. 
Date:  2015–04–03 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal01147189&r=ecm 
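The Mancini-type threshold estimator the abstract refers to discards squared increments above a threshold r(dt), so jumps are excluded while the diffusive variation is kept. A minimal one-dimensional simulation sketch (the jump-diffusion parameters and the threshold exponent are assumptions for illustration, not the paper's choices):

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate increments of a jump-diffusion on [0, T]: sigma dW plus jumps.
T, n = 1.0, 10_000
dt = T / n
sigma = 0.3
increments = sigma * rng.normal(scale=np.sqrt(dt), size=n)

# Contaminate a few increments with large jumps.
jump_idx = rng.choice(n, size=5, replace=False)
increments[jump_idx] += rng.normal(scale=1.0, size=5)

# Plain realized variance: badly inflated by the jumps.
rv_plain = np.sum(increments ** 2)

# Threshold estimator: keep only increments whose square lies below
# r(dt); a power threshold r(dt) = dt^0.9 is one common choice.
r = dt ** 0.9
rv_thresh = np.sum(increments[increments ** 2 <= r] ** 2)

# Target: the integrated variance sigma^2 * T = 0.09.
```

Because diffusive increments are of order sqrt(dt) while dt^0.45 shrinks more slowly, essentially all diffusive variation survives the cut and the jumps are removed, so rv_thresh lands near 0.09 while rv_plain does not.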
By:  Jean-Guy Simonato (HEC Montréal); Lars Stentoft (University of Western Ontario and CREATES) 
Abstract:  Two different pricing frameworks are typically used in the literature when pricing options under GARCH with non-normal innovations: the equilibrium approach and the no-arbitrage approach. Each framework can accommodate various forms of GARCH and innovation distributions, but empirical implementation and tests are typically done in one framework or the other because of the computational challenges involved in obtaining the relevant pricing parameters. We contribute to the literature by comparing and documenting the empirical performance of a GARCH specification which can be readily implemented in both pricing frameworks. The model uses a parsimonious GARCH specification with skewed and leptokurtic Johnson Su innovations together with either the equilibrium-based framework or the no-arbitrage-based framework. Using a large sample of options on the S&P 500 index, we find that the two approaches give rise to very similar pricing errors when implemented with time-varying pricing parameters. However, when implemented with constant pricing parameters, the performance of the no-arbitrage approach deteriorates in periods of high volatility relative to the equilibrium approach, whose performance remains stable and on par with the models with time-varying pricing parameters. 
Keywords:  Option pricing, Equilibrium approach, No-arbitrage approach 
JEL:  C22 C53 G13 
Date:  2015–07–10 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201532&r=ecm 
By:  Leonard A. Smith; Emma B. Suckling; Erica L. Thompson; Trevor Maynard; Hailiang Du 
Abstract:  The evaluation of forecast performance plays a central role both in the interpretation and use of forecast systems and in their development. Different evaluation measures (scores) are available, often quantifying different characteristics of forecast performance. The properties of several proper scores for probabilistic forecast evaluation are contrasted and then used to interpret decadal probability hindcasts of global mean temperature. The Continuous Ranked Probability Score (CRPS), the Proper Linear (PL) score, and I J Good's logarithmic score (also referred to as Ignorance) are compared; although information from all three may be useful, the logarithmic score has an immediate interpretation and is not insensitive to forecast busts. Neither CRPS nor PL is local; this is shown to produce counterintuitive evaluations by CRPS. Benchmark forecasts from empirical models such as Dynamic Climatology place the scores in context. Comparing scores for forecast systems based on physical models (in this case HadCM3, from the CMIP5 decadal archive) against such benchmarks is more informative than internally comparing systems based on similar physical simulation models with each other. It is shown that a forecast system based on HadCM3 outperforms Dynamic Climatology in decadal global mean temperature hindcasts; Dynamic Climatology previously outperformed a forecast system based on HadGEM2, and reasons for these results are suggested. Forecasts of aggregated data (5-year means of global mean temperature) are, of course, narrower than forecasts of annual averages due to the suppression of variance; while the average "distance" between the forecasts and a target may be expected to decrease, little if any discernible improvement in probabilistic skill is achieved. 
JEL:  C1 
Date:  2015–07–17 
URL:  http://d.repec.org/n?u=RePEc:ehl:lserod:62949&r=ecm 
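The logarithmic (Ignorance) score favoured in the abstract is simply minus the log of the forecast density at the outcome, so a confident forecast that misses badly (a "bust") is punished without bound. A small sketch under assumed Gaussian forecast densities (the forecast parameters and outcome value are illustrative, not from the paper's hindcasts):

```python
import numpy as np
from scipy.stats import norm

def ignorance(forecast_pdf, outcome):
    """I J Good's logarithmic score: -log2 of the forecast density at the
    realized outcome.  Lower is better; it is local (depends only on the
    density at the outcome) and unbounded for forecast busts."""
    return -np.log2(forecast_pdf(outcome))

outcome = 0.9  # hypothetical observed temperature anomaly

sharp = lambda x: norm.pdf(x, loc=0.8, scale=0.1)  # good, confident forecast
wide  = lambda x: norm.pdf(x, loc=0.0, scale=1.0)  # vague, climatology-like
bust  = lambda x: norm.pdf(x, loc=0.0, scale=0.1)  # confident but wrong

s_sharp = ignorance(sharp, outcome)
s_wide = ignorance(wide, outcome)
s_bust = ignorance(bust, outcome)
```

The ordering s_sharp < s_wide < s_bust illustrates the abstract's point: the confident miss scores far worse than the vague benchmark, whereas a non-local score such as CRPS can rank such cases counterintuitively.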
By:  S. Roux 
Abstract:  This article aims to present and compare structural and atheoretic approaches to the microeconometrics of public policy evaluation. Although these approaches are often opposed because they rely on different scientific methodologies, they complement each other in the lessons that can be drawn from them. Two illustrations are presented: the evaluation of the 1998 workweek reduction (Crépon, Leclair, Roux [1998]) and the local effect of speed enforcement cameras on road accidents (Roux, Zamora [2013]). 
Keywords:  Evaluation methods, Structural models, Natural Experiments. 
JEL:  B23 C10 C21 C52 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:bfr:banfra:565&r=ecm 