Econometrics
http://lists.repec.org/mailman/listinfo/nep-ecm
Issue of 2015-08-25, edited by Sune Karlsson

Efficient Estimation for Diffusions Sampled at High Frequency Over a Fixed Time Interval
http://d.repec.org/n?u=RePEc:aah:create:2015-33&r=ecm
Parametric estimation for diffusion processes is considered for high-frequency observations over a fixed time interval. The processes solve stochastic differential equations with an unknown parameter in the diffusion coefficient. We find easily verified conditions on approximate martingale estimating functions under which estimators are consistent, rate optimal, and efficient under high-frequency (in-fill) asymptotics. The asymptotic distributions of the estimators are shown to be normal variance-mixtures, where the mixing distribution generally depends on the full sample path of the diffusion process over the observation time interval. Utilising the concept of stable convergence, we also obtain the more easily applicable result that, under a suitable data-dependent normalisation, the estimators converge in distribution to a standard normal distribution. The theory is illustrated by a small simulation study comparing an efficient and a non-efficient estimating function.
Nina Munkholt Jakobsen, Michael Sørensen, 2015-08-06
Keywords: approximate martingale estimating functions, discrete-time sampling of diffusions, in-fill asymptotics, normal variance-mixtures, optimal rate, random Fisher information, stable convergence, stochastic differential equation
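The simplest instance of this in-fill setting is a diffusion-coefficient parameter identified from the quadratic variation over a fixed interval. A toy sketch (the paper's estimating-function machinery is not reproduced; the model, parameter values and seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy in-fill experiment: estimate sigma in dX_t = sigma dB_t from n
# equispaced observations of X on the fixed interval [0, 1]. The
# quadratic variation identifies sigma even though the interval is fixed.
sigma = 0.8
estimates = {}
for n in (100, 10_000):
    dx = sigma * np.sqrt(1.0 / n) * rng.normal(size=n)  # increments of X
    estimates[n] = np.sqrt(np.sum(dx ** 2))             # realised volatility
```

As the sampling frequency grows on the fixed interval, the realised-volatility estimate concentrates around the true value, illustrating the in-fill consistency discussed in the abstract.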
Log-Transform Kernel Density Estimation of Income Distribution
http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-01115988&r=ecm
Standard kernel density estimation methods are very often used in practice to estimate density functions, and they work well in numerous cases. However, they are known to perform poorly with skewed, multimodal and heavy-tailed distributions. Such features are usual in income distributions, which are defined over the positive support. In this paper, we show that a preliminary logarithmic transformation of the data, combined with standard kernel density estimation methods, can provide a much better fit of the density.
Arthur Charpentier, Emmanuel Flachaire, 2014-11
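A minimal sketch of the transformation idea, assuming a hypothetical lognormal income sample and SciPy's Gaussian KDE: fit the kernel estimator to log(income) and map the density back with the change-of-variables Jacobian 1/x.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Hypothetical income sample: lognormal, i.e. positive, skewed, heavy right tail.
income = rng.lognormal(mean=10.0, sigma=1.0, size=5000)

# Log-transform approach: estimate the density of log(income) with a
# standard Gaussian KDE, then map it back to the income scale via the
# Jacobian 1/x of the inverse transform.
kde_log = gaussian_kde(np.log(income))

def density_logkde(x):
    x = np.asarray(x, dtype=float)
    return kde_log(np.log(x)) / x

# Sanity check: the back-transformed density integrates to roughly one
# over the observed range (trapezoid rule on a log-spaced grid).
grid = np.geomspace(income.min(), income.max(), 4000)
dens = density_logkde(grid)
mass = float(np.sum(0.5 * (dens[1:] + dens[:-1]) * np.diff(grid)))
```

Because the kernel is applied on the log scale, the implied bandwidth on the income scale grows with x, which is what lets the estimator track a heavy right tail.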
GMM Estimation of Affine Term Structure Models
http://d.repec.org/n?u=RePEc:arx:papers:1508.01661&r=ecm
This article investigates parameter estimation of affine term structure models by means of the generalized method of moments. Exact moments of the affine latent process, as well as of the yields, are obtained using results derived for p-polynomial processes. The generalized method of moments, combined with quasi-Bayesian methods, is then used to obtain reliable parameter estimates and to perform inference. After a simulation study, the estimation procedure is applied to empirical interest rate data.
Jaroslava Hlouskova, Leopold Sögner, 2015-08

A note on the bootstrap method for testing the existence of finite moments
http://d.repec.org/n?u=RePEc:pra:mprapa:66033&r=ecm
This paper discusses a bootstrap-based test that checks whether finite moments exist, and indicates cases of possible misapplication. It notes that a procedure for finding the smallest power to which observations need to be raised, such that the test rejects the hypothesis that the corresponding moment is finite, works poorly as an estimator of the tail index or as a moment estimator. This is especially the case for very low- and high-order moments. Several examples of correct usage of the test are also shown. The main result is derived analytically, and a Monte Carlo experiment is presented.
Igor Fedotenkov, 2015-06-16
Keywords: bootstrap, finite moment, heavy tails, tail index, test
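The bootstrap test itself is not reproduced here, but the phenomenon it targets is easy to illustrate: when the p-th moment is infinite, the largest observation contributes a non-vanishing share of the p-th power sum. A sketch with assumed simulated data (the max-share diagnostic below is a classical heuristic, not the paper's statistic):

```python
import numpy as np

rng = np.random.default_rng(1)

def max_share(x, p):
    """Share of the p-th power sum contributed by the largest observation.
    Under a finite p-th moment this ratio tends to zero as n grows; when
    the p-th moment is infinite it stays bounded away from zero."""
    a = np.abs(x) ** p
    return a.max() / a.sum()

n = 100_000
light = rng.normal(size=n)               # all moments finite
heavy = rng.pareto(1.5, size=n) + 1.0    # Pareto tail index 1.5: E[X^2] infinite

share_light = max_share(light, 2.0)   # negligible
share_heavy = max_share(heavy, 2.0)   # non-negligible
```

For the normal sample the second-moment sum is spread over all observations, while for the Pareto sample with tail index 1.5 a single extreme observation dominates it.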
Large sample properties of the matrix exponential spatial specification with an application to FDI
http://d.repec.org/n?u=RePEc:hal:journl:hal-00858174&r=ecm
This paper studies large sample properties of the matrix exponential spatial specification (MESS). We find that the quasi-maximum likelihood estimator (QMLE) for the MESS is consistent under heteroskedasticity, a property not shared by the QMLE of the SAR model. For the general model with MESS in both the dependent variable and the disturbances, labeled MESS(1,1), the QMLE can be consistent under unknown heteroskedasticity when the spatial weights matrices in the two MESS processes are commutative. We also consider the generalized method of moments estimator (GMME). In the homoskedastic case, we derive a best GMME that is as efficient as the maximum likelihood estimator under normality and can be asymptotically more efficient than the QMLE under non-normality. In the heteroskedastic case, an optimal GMME can be asymptotically more efficient than the QMLE. The QML approach for the MESS model has a computational advantage over that for the SAR model, and this simplicity carries over to MESS models with any finite order of spatial matrices. Moreover, no parameter range needs to be imposed for the model to be stable. Results of Monte Carlo experiments on the finite sample properties of the estimators are reported. Finally, the MESS(1,1) is applied to Belgium's outward FDI data, and we observe that the dominant motivation of Belgium's outward FDI lies in finding cheaper factor inputs.
Nicolas Debarsy, Fei Jin, Lung-Fei Lee, 2015-09-01
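The computational advantage comes from the Jacobian: with a zero-diagonal weights matrix W, det(exp(aW)) = exp(a · tr W) = 1 for every a, so the QML log-likelihood needs no log-determinant term and no stability restriction on a, unlike log det(I - rho W) in the SAR model. A small sketch with hypothetical data (the weights matrix and parameter values are assumptions):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
n = 6

# Hypothetical row-normalised spatial weights matrix with a zero diagonal.
W = rng.random((n, n))
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)

# MESS model: expm(alpha * W) y = X beta + eps,
# so y = expm(-alpha * W) (X beta + eps).
alpha, beta = 0.4, np.array([1.0, -2.0])
X = rng.normal(size=(n, 2))
eps = rng.normal(size=n)
y = expm(-alpha * W) @ (X @ beta + eps)

# Because diag(W) = 0 implies tr(W) = 0, the Jacobian of the transform
# is det(expm(alpha * W)) = exp(alpha * tr(W)) = 1 for any alpha.
jac = np.linalg.det(expm(alpha * W))
```

The unit Jacobian holds for any alpha, which is why no parameter range has to be imposed for stability.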
MGARCH models: tradeoff between feasibility and flexibility
http://d.repec.org/n?u=RePEc:cte:wsrepe:ws1516&r=ecm
The parameters of popular multivariate GARCH (MGARCH) models are restricted so that their estimation is feasible in large systems and so that covariance stationarity and positive definiteness of the conditional covariance matrices are guaranteed. These restrictions limit the dynamics that the models can represent, assuming, for example, that volatilities evolve in a univariate fashion, related neither to each other nor to the correlations. This paper updates previous surveys on parametric MGARCH models, focusing on their limitations in representing the dynamics observed in real systems of financial returns. The conclusions are illustrated using simulated data and a five-dimensional system of exchange rate returns.
Daniel De Almeida, Luiz Hotta, Esther Ruiz, 2015-07
Keywords: BEKK, DCC, multivariate conditional heteroscedasticity, variance targeting, VECH
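One of the restrictions discussed (volatilities that evolve univariately, with the cross-sectional dependence frozen into a constant correlation) is exactly the CCC-GARCH model. A minimal two-asset sketch with assumed parameter values:

```python
import numpy as np

rng = np.random.default_rng(7)

# CCC-GARCH sketch for two assets: each conditional variance follows its
# own univariate GARCH(1,1), and the conditional correlation is constant.
T = 5
omega, alpha, beta = 0.05, 0.08, 0.90
R = np.array([[1.0, 0.3],
              [0.3, 1.0]])          # constant conditional correlation
h = np.array([0.5, 0.5])            # initial conditional variances
eps = rng.normal(size=(T, 2))

covs = []
for t in range(T):
    D = np.diag(np.sqrt(h))
    H = D @ R @ D                   # conditional covariance matrix
    covs.append(H)
    r = D @ np.linalg.cholesky(R) @ eps[t]   # simulated returns
    h = omega + alpha * r ** 2 + beta * h    # univariate variance updates
```

Positive definiteness is automatic here (R is positive definite and the variances stay positive), but the price is that correlations can never respond to the data, which is the kind of limitation the survey examines.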
A Note on Estimating Variance of Finite Population Distribution Function
http://d.repec.org/n?u=RePEc:iim:iimawp:13715&r=ecm
Estimating the finite population distribution function is an important problem for survey samplers, since it summarizes almost all the relevant information of interest about the finite population. Moreover, due to its nonlinearity, estimating the variance of estimators of the distribution function has remained an active area of research since Chambers et al. (1992). Both analytic and resampling-based variance estimators have been developed previously. Here we propose a bootstrap hybrid variance estimator for a model-based semi-parametric estimator of the finite population distribution function. We prove its consistency and also show that its numerical performance is superior to that of the analytical estimator.
Sumanta Adhya, Tathagata Banerjee, Gouranga Chattopadhyay
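The paper's hybrid model-based estimator is not reproduced here; as a baseline, the empirical distribution function and a naive bootstrap variance estimate (with assumed simulated data) look like this:

```python
import numpy as np

rng = np.random.default_rng(3)

def ecdf_at(sample, t):
    """Empirical distribution function of `sample` evaluated at t."""
    return np.mean(sample <= t)

# Hypothetical survey sample and evaluation point.
y = rng.exponential(scale=2.0, size=400)
t = 2.0

f_hat = ecdf_at(y, t)

# Naive bootstrap variance: resample with replacement, recompute the
# estimator, and take the variance across replicates.
B = 2000
boot = np.array([ecdf_at(rng.choice(y, size=y.size, replace=True), t)
                 for _ in range(B)])
var_boot = boot.var(ddof=1)

# For the plain ECDF the analytic variance F(1 - F)/n is known, which
# gives a convenient cross-check on the bootstrap.
var_analytic = f_hat * (1.0 - f_hat) / y.size
```

For nonlinear or model-assisted estimators no such closed form is available, which is what motivates the resampling-based variance estimators the note studies.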
On an asymmetric extension of multivariate Archimedean copulas
http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01147778&r=ecm
Archimedean copulas are copulas determined by a specific real function, called the generator. Composed with the copula at a given point, this generator can be expressed as a linear form in the generators of the point's components. In this paper, we discuss the case where this function is instead expressed as a quadratic form (leading to what we call multivariate Archimatrix copulas). This extends Archimedean copulas and allows, for example, the construction of asymmetric copulas. The parameters of this new class are grouped within a matrix, which facilitates usual applications such as level-curve determination or estimation. Choices such as sub-model stability help associate each parameter with one bivariate projection of the copula. We also give admissibility conditions for the considered Archimatrix copulas. We propose different examples, including natural multivariate extensions of Farlie-Gumbel-Morgenstern, Gumbel-Barnett, and particular Archimax copulas.
Elena Di Bernardino, Didier Rullière, 2015-05-04
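The quadratic-form Archimatrix construction is not reproduced here; the baseline it extends, a symmetric Archimedean copula built from a generator (Clayton's generator is used below as an assumed example), can be sketched as:

```python
import numpy as np

def clayton_generator(t, theta):
    """Clayton generator phi(t) = (t^(-theta) - 1) / theta, theta > 0."""
    return (t ** (-theta) - 1.0) / theta

def clayton_generator_inv(s, theta):
    """Inverse generator phi^(-1)(s) = (1 + theta * s)^(-1/theta)."""
    return (1.0 + theta * s) ** (-1.0 / theta)

def clayton_copula(u, theta):
    """Archimedean copula C(u_1, ..., u_d) = phi^(-1)(sum_i phi(u_i)):
    the generator composed with the copula is a *linear* form in the
    component generators, which is the case the paper generalizes."""
    u = np.asarray(u, dtype=float)
    return clayton_generator_inv(np.sum(clayton_generator(u, theta)), theta)

c = clayton_copula([0.3, 0.7], theta=2.0)
```

Note the construction is fully exchangeable in its arguments; replacing the linear form by a quadratic form, as the paper does, is what makes asymmetry possible.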
Large Deviations of the Threshold Estimator of Integrated (Co-)Volatility Vector in the Presence of Jumps
http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01147189&r=ecm
Recently, considerable interest has been paid to the problem of estimating realized volatility and covolatility using high-frequency data on financial price processes. Threshold estimation is one of the useful techniques for inference on jump-type stochastic processes from discrete observations. In this paper, we adopt the threshold estimator introduced by Mancini, in which only the variations below a given threshold function are taken into account. The purpose of this work is to investigate large and moderate deviations for the threshold estimator of the integrated variance-covariance vector. The paper extends the previous work of Djellout, Guillin and Samoura, where the problem was studied in the absence of a jump component. We use an approximation lemma to prove the large deviation principle and, as the reader may expect, we obtain the same results as in the case without jumps.
Hacène Djellout, Hui Jiang, 2015-04-03
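Mancini's truncation idea is easy to sketch in the univariate case: discard increments above a vanishing threshold so that jumps drop out of the realised variance. A toy simulation (the threshold constant, exponent and jump sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate increments of a jump-diffusion on [0, 1]: dX = sigma dB + jumps.
n = 20_000
dt = 1.0 / n
sigma = 0.5
increments = sigma * np.sqrt(dt) * rng.normal(size=n)

# Add five large jumps, each of absolute size at least one.
jump_idx = rng.choice(n, size=5, replace=False)
increments[jump_idx] += rng.choice([-1.0, 1.0], size=5) * (1.0 + rng.random(5))

# Realised variance keeps the jump contribution; the threshold estimator
# discards increments larger than a vanishing threshold r(dt), here an
# ad hoc choice c * dt^0.49 (any exponent below 1/2 works in theory).
realised_var = np.sum(increments ** 2)
threshold = 4.0 * sigma * dt ** 0.49
truncated_var = np.sum(increments[np.abs(increments) <= threshold] ** 2)
```

The truncated sum recovers the integrated variance sigma^2 * 1 = 0.25, while the plain realised variance is inflated by the squared jump sizes.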
Which pricing approach for options under GARCH with non-normal innovations?
http://d.repec.org/n?u=RePEc:aah:create:2015-32&r=ecm
Two different pricing frameworks are typically used in the literature when pricing options under GARCH with non-normal innovations: the equilibrium approach and the no-arbitrage approach. Each framework can accommodate various forms of GARCH and innovation distributions, but empirical implementation and tests are typically done in one framework or the other because of the computational challenges involved in obtaining the relevant pricing parameters. We contribute to the literature by comparing and documenting the empirical performance of a GARCH specification that can be readily implemented in both pricing frameworks. The model uses a parsimonious GARCH specification with skewed and leptokurtic Johnson Su innovations together with either the equilibrium-based framework or the no-arbitrage-based framework. Using a large sample of options on the S&P 500 index, we find that the two approaches give rise to very similar pricing errors when implemented with time-varying pricing parameters. However, when implemented with constant pricing parameters, the performance of the no-arbitrage approach deteriorates in periods of high volatility relative to the equilibrium approach, whose performance remains stable and on par with the models with time-varying pricing parameters.
Jean-Guy Simonato, Lars Stentoft, 2015-07-10
Keywords: option pricing, equilibrium approach, no-arbitrage approach
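Whichever framework supplies the risk-neutral parameters, pricing ultimately reduces to Monte Carlo under risk-neutral GARCH dynamics. A schematic pricer with normal innovations and a Duan-style risk-neutral drift (the paper's Johnson Su innovations are not implemented, and all parameter values are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

def garch_call_price(s0, strike, r, omega, alpha, beta, h0, n_days, n_paths):
    """Schematic Monte Carlo price of a European call under risk-neutral
    GARCH(1,1) dynamics with normal innovations: the daily log-return is
    r - h/2 + sqrt(h) z, so the discounted price is a martingale."""
    h = np.full(n_paths, h0)
    log_s = np.full(n_paths, np.log(s0))
    for _ in range(n_days):
        z = rng.normal(size=n_paths)
        log_s += r - 0.5 * h + np.sqrt(h) * z
        h = omega + alpha * h * z ** 2 + beta * h   # GARCH(1,1) update
    payoff = np.maximum(np.exp(log_s) - strike, 0.0)
    return np.exp(-r * n_days) * payoff.mean()

price = garch_call_price(s0=100.0, strike=100.0, r=0.0001,
                         omega=1e-6, alpha=0.08, beta=0.90,
                         h0=1e-4, n_days=60, n_paths=50_000)
```

Swapping the innovation distribution or the pricing-parameter mapping changes only the draw of `z` and the drift correction, which is why the two frameworks can share one simulation engine.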
Towards improving the framework for probabilistic forecast evaluation
http://d.repec.org/n?u=RePEc:ehl:lserod:62949&r=ecm
The evaluation of forecast performance plays a central role both in the interpretation and use of forecast systems and in their development. Different evaluation measures (scores) are available, often quantifying different characteristics of forecast performance. The properties of several proper scores for probabilistic forecast evaluation are contrasted and then used to interpret decadal probability hindcasts of global mean temperature. The Continuous Ranked Probability Score (CRPS), the Proper Linear (PL) score, and IJ Good's logarithmic score (also referred to as Ignorance) are compared; although information from all three may be useful, the logarithmic score has an immediate interpretation and is not insensitive to forecast busts. Neither CRPS nor PL is local; this is shown to produce counter-intuitive evaluations by CRPS. Benchmark forecasts from empirical models like Dynamic Climatology place the scores in context. Comparing scores for forecast systems based on physical models (in this case HadCM3, from the CMIP5 decadal archive) against such benchmarks is more informative than comparing forecast systems based on similar physical simulation models with each other. It is shown that a forecast system based on HadCM3 outperforms Dynamic Climatology in decadal global mean temperature hindcasts; Dynamic Climatology previously outperformed a forecast system based upon HadGEM2, and reasons for these results are suggested. Forecasts of aggregate data (5-year means of global mean temperature) are, of course, narrower than forecasts of annual averages due to the suppression of variance; while the average “distance” between the forecasts and a target may be expected to decrease, little if any discernible improvement in probabilistic skill is achieved.
Leonard A. Smith, Emma B. Suckling, Erica L. Thompson, Trevor Maynard, Hailiang Du, 2015-07-17
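The contrast between the logarithmic score's sensitivity to busts and CRPS's roughly linear growth can be computed directly for a Gaussian forecast, using the closed-form CRPS; the forecast and outcome values below are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

def ignorance(mu, sigma, y):
    """IJ Good's logarithmic (Ignorance) score: -log2 p(y)."""
    return -np.log2(norm.pdf(y, mu, sigma))

def crps_gaussian(mu, sigma, y):
    """Closed-form CRPS for a Gaussian forecast N(mu, sigma^2)."""
    z = (y - mu) / sigma
    return sigma * (z * (2.0 * norm.cdf(z) - 1.0)
                    + 2.0 * norm.pdf(z) - 1.0 / np.sqrt(np.pi))

# A good outcome versus a forecast bust for the same N(0, 1) forecast.
good = ignorance(0.0, 1.0, y=0.5)       # modest score
bust = ignorance(0.0, 1.0, y=5.0)       # heavily penalised
crps_good = crps_gaussian(0.0, 1.0, 0.5)
crps_bust = crps_gaussian(0.0, 1.0, 5.0)
```

The logarithmic score explodes quadratically in the standardised error for a Gaussian forecast, while CRPS grows only linearly in it, which is the (in)sensitivity to busts discussed in the abstract.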
Structural and atheoretic approaches to the micro-econometrics of public policy evaluation (in French)
http://d.repec.org/n?u=RePEc:bfr:banfra:565&r=ecm
This article presents and compares structural and atheoretic approaches to the micro-econometrics of public policy evaluation. Although these approaches are often opposed because they rely on different scientific methodologies, they complement each other in the lessons that can be drawn from them. Two illustrations are presented: the evaluation of the 1998 workweek reduction (Crépon, Leclair, Roux [1998]) and the local effect of speed enforcement cameras on road accidents (Roux, Zamora [2013]).
S. Roux, 2015
Keywords: evaluation methods, structural models, natural experiments