NEP: New Economics Papers on Econometrics
By: | Lemieux, Thomas; Marmer, Vadim |
Abstract: | We consider weak identification in the fuzzy regression discontinuity (FRD) model. In this model, the treatment effect is identified through a discontinuity in the conditional probability of treatment assignment. Weak identification corresponds to the situation where the discontinuity is of a small magnitude. When identification is weak, we show that the usual t-test based on the FRD estimator and its standard error suffers from asymptotic size distortions. To eliminate those size distortions, we propose a modified t-statistic that uses a null-restricted version of the standard error of the FRD estimator. Simple and asymptotically valid confidence sets for the treatment effect can also be constructed using the FRD estimator and its null-restricted standard error; a schematic sketch of such a statistic follows this entry. |
Keywords: | Nonparametric inference; regression discontinuity design; treatment effect; weak identification |
JEL: | C12 C13 C14 |
Date: | 2010–05–15 |
URL: | http://d.repec.org/n?u=RePEc:ubc:pmicro:vadim_marmer-2010-19&r=ecm |
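The null-restricted t-statistic described above studentizes the numerator of the FRD ratio under the null rather than the ratio itself, which keeps the test well behaved when the jump in treatment probability is small. Below is a minimal sketch under simplifying assumptions: local averages within a fixed bandwidth stand in for the paper's kernel-based estimators, and all variable names and the bandwidth rule are illustrative.

```python
import numpy as np

def frd_null_restricted_t(x, y, t, theta0, cutoff=0.0, h=0.5):
    """Fieller-type t-statistic for H0: theta = theta0 in a fuzzy RD design.

    x: running variable, y: outcome, t: treatment indicator (numpy arrays).
    Local averages within bandwidth h replace the paper's kernel-based
    FRD estimators; this is a sketch, not the paper's exact statistic."""
    right = (x >= cutoff) & (x < cutoff + h)
    left = (x < cutoff) & (x >= cutoff - h)
    dy = y[right].mean() - y[left].mean()      # jump in the outcome
    dt = t[right].mean() - t[left].mean()      # jump in treatment probability
    z = y - theta0 * t                         # null-restricted outcome
    var = z[right].var(ddof=1) / right.sum() + z[left].var(ddof=1) / left.sum()
    return (dy - theta0 * dt) / np.sqrt(var)
```

Inverting the test, the set {theta0 : |t(theta0)| <= 1.96} gives a confidence set for the treatment effect that remains informative under weak identification.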
By: | Schwarz, Maik; Van Bellegem, Sébastien |
Abstract: | We estimate the distribution of a real-valued random variable from contaminated observations. The additive error is supposed to be normally distributed, but with unknown variance. The distribution is identifiable from the observations if we restrict the class of considered distributions by a simple condition in the time domain. A minimum distance estimator is shown to be consistent imposing only a slightly stronger assumption than the identification condition. |
Keywords: | deconvolution, error measurement, density estimation |
Date: | 2009–10–06 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22200&r=ecm |
By: | Bouezmarni, Taoufik; Van Bellegem, Sébastien |
Abstract: | The paper introduces a new nonparametric estimator of the spectral density, obtained by smoothing the periodogram with the probability density of a Beta random variable (Beta kernel); see the sketch after this entry. The estimator is proved to be bounded for short memory data, and diverges at the origin for long memory data. The convergence in probability of the relative error and Monte Carlo simulations suggest that the estimator automatically adapts to the long- or short-range dependence of the process. A cross-validation procedure is also studied in order to select the nuisance parameter of the estimator. Illustrations on historical as well as recent returns and absolute returns of the S&P500 index show the reasonable performance of the estimator, and show that the data-driven estimator is a valuable tool for the detection of long memory as well as hidden periodicities in stock returns. |
Keywords: | spectral density, long range dependence, nonparametric estimation |
Date: | 2009–09–11 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22191&r=ecm |
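A minimal sketch of periodogram smoothing with Beta kernels, in the spirit of the entry above: the Fourier frequencies are rescaled to (0, 1) and each evaluation point gets Nadaraya-Watson weights from a Beta density whose shape depends on that point. The smoothing parameter b and the normalization are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from scipy.stats import beta

def periodogram(x):
    """Periodogram at the positive Fourier frequencies lambda_j = 2*pi*j/n."""
    n = len(x)
    dft = np.fft.fft(x - np.mean(x))
    m = n // 2
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n
    I = (np.abs(dft[1:m + 1]) ** 2) / (2.0 * np.pi * n)
    return lam, I

def beta_kernel_spectral(x, b, grid):
    """Nadaraya-Watson smoothing of the periodogram with Beta kernels.

    b is the smoothing (nuisance) parameter; grid holds frequencies in
    (0, pi) at which the spectral density is estimated."""
    lam, I = periodogram(x)
    u = lam / np.pi                       # rescale frequencies to (0, 1)
    est = np.empty(len(grid))
    for k, lam0 in enumerate(grid):
        s = lam0 / np.pi
        w = beta.pdf(u, s / b + 1.0, (1.0 - s) / b + 1.0)  # Beta kernel weights
        est[k] = np.sum(w * I) / np.sum(w)
    return est
```

Because the Beta kernel's shape varies with the evaluation point, no separate boundary correction is needed at frequencies near 0 or pi.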
By: | Bigot, Jérôme; Van Bellegem, Sébastien |
Abstract: | This paper proposes a new wavelet-based method for deconvolving a density. The estimator combines the ideas of nonlinear wavelet thresholding with periodised Meyer wavelets and estimation by information projection. It is guaranteed to be in the class of density functions; in particular, it is positive everywhere by construction. The asymptotic optimality of the estimator is established in terms of the rate of convergence of the Kullback-Leibler discrepancy over Besov classes. Finite sample properties are investigated in detail and show the excellent empirical performance of the estimator, compared with other recently introduced estimators. |
Keywords: | deconvolution, wavelet thresholding, adaptive estimation |
Date: | 2009–02–11 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22136&r=ecm |
By: | Faugeras, Olivier |
Abstract: | To make a prediction of a response variable from an explanatory one which takes into account features such as multimodality, a nonparametric approach based on an estimate of the conditional density is advocated and considered. In particular, we build point and interval predictors based on the quantile-copula estimator of the conditional density by Faugeras [8]. The consistency of these predictors is proved through a uniform consistency result for the conditional density estimator. Finally, the practical implementation of these predictors is discussed, and an application to a real data set illustrates the proposed methods. |
Keywords: | nonparametric estimation, modal regressor, level-set |
Date: | 2009–12–07 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22247&r=ecm |
By: | Xinghua Zheng; Yingying Li |
Abstract: | We consider the estimation of integrated covariance matrices of high dimensional diffusion processes by using high frequency data. We start by studying the most commonly used estimator, the realized covariance matrix (RCV), sketched after this entry. We show that in the high dimensional case, when the dimension p and the observation frequency n grow at the same rate, the limiting empirical spectral distribution of RCV depends on the covolatility processes not only through the underlying integrated covariance matrix Sigma, but also on how the covolatility processes vary in time. In particular, for two high dimensional diffusion processes with the same integrated covariance matrix, the empirical spectral distributions of their RCVs can be very different. Hence, in terms of making inference about the spectrum of the integrated covariance matrix, the RCV is in general not a good proxy to rely on in the high dimensional case. We then propose an alternative estimator, the time-variation adjusted realized covariance matrix (TVARCV), for a class of diffusion processes. We show that the limiting empirical spectral distribution of the TVARCV depends solely on that of Sigma through a Marcenko-Pastur equation; hence the TVARCV can be used to recover the empirical spectral distribution of Sigma by inverting the Marcenko-Pastur equation, which can then be applied in further applications such as portfolio allocation and risk management. |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1005.1862&r=ecm |
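The baseline estimator studied above is simple to state: the realized covariance matrix is the sum of outer products of high-frequency return vectors. A minimal sketch, assuming a synchronous observation grid (the TVARCV adjustment itself is paper-specific and not reproduced):

```python
import numpy as np

def realized_covariance(log_prices):
    """Realized covariance matrix from an (n+1) x p array of log-prices
    observed on a common intraday grid: sum of outer products of returns."""
    r = np.diff(log_prices, axis=0)   # n x p high-frequency returns
    return r.T @ r

# The entry's object of interest is the spectrum of such matrices:
# np.linalg.eigvalsh(realized_covariance(log_prices)) gives the eigenvalues
# whose empirical distribution is compared with that of Sigma.
```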
By: | Bontemps, Christophe; Racine, Jeffrey S.; Simioni, Michel |
Abstract: | The estimation of conditional probability distribution functions (PDFs) in a kernel nonparametric framework has recently received attention. As emphasized by Hall, Racine & Li (2004), these conditional PDFs are extremely useful for a range of tasks including modelling and predicting consumer choice. The aim of this paper is threefold. First, we implement nonparametric kernel estimation of the PDF with a binary choice variable and both continuous and discrete explanatory variables. Second, we address the issue of the performance of this nonparametric estimator when compared to a classic off-the-shelf parametric estimator, namely a probit; both rivals are sketched after this entry. We propose to evaluate these estimators in terms of their predictive performance, in line with the recent "revealed performance" test proposed by Racine & Parmeter (2009). Third, we provide a detailed discussion of the results, focusing on environmental insights provided by the two estimators and revealing some patterns that can only be detected using the nonparametric estimator. |
Keywords: | binary choice models, nonparametric estimation, specification tests |
Date: | 2009–12–16 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22249&r=ecm |
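A minimal sketch of the two rivals, assuming a binary outcome y and continuous covariates X (hypothetical names). The paper's kernel estimator also handles discrete covariates; here, local-linear kernel regression of y on continuous covariates stands in for the conditional-probability estimator.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.nonparametric.kernel_regression import KernelReg

def fit_rivals(y, X):
    """Off-the-shelf probit versus a kernel estimate of P(y=1|x)."""
    probit = sm.Probit(y, sm.add_constant(X)).fit(disp=0)
    kernel = KernelReg(endog=y, exog=X, var_type='c' * X.shape[1],
                       reg_type='ll')   # local-linear, cross-validated bandwidth
    return probit, kernel
```

Predictive performance can then be compared on a hold-out sample, in the spirit of the revealed-performance test, by contrasting predicted probabilities on data not used for estimation.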
By: | Casanova, Sandrine |
Abstract: | Estimating the cumulative distribution function in survey sampling is of interest not only for the whole population but also for sub-populations (domains). However, in most practical applications, sample sizes in the domains are not large enough to produce sufficiently precise estimators. Therefore, we propose new nonparametric estimators of the cumulative distribution function in a domain based on M-quantile estimation. The obtained estimators are compared by simulations and applied to real data. |
Date: | 2009–12 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22254&r=ecm |
By: | Alysha M De Livera |
Abstract: | A new automatic forecasting procedure is proposed based on a recent exponential smoothing framework which incorporates a Box-Cox transformation and ARMA residual corrections. The procedure is complete with well-defined methods for initialization, estimation, likelihood evaluation, and analytical derivation of point and interval predictions under a Gaussian error assumption. The algorithm is examined extensively by applying it to single seasonal and non-seasonal time series from the M and the M3 competitions, and is shown to provide competitive out-of-sample forecast accuracy compared to the best methods in these competitions and to the traditional exponential smoothing framework. The proposed algorithm can be used as an alternative to existing automatic forecasting procedures in modeling single seasonal and non-seasonal time series. In addition, it provides the new option of automatic modeling of multiple seasonal time series which cannot be handled using any of the existing automatic forecasting procedures. The proposed automatic procedure is further illustrated by applying it to two multiple seasonal time series involving call center data and electricity demand data. |
Keywords: | Exponential smoothing, state space models, automatic forecasting, Box-Cox transformation, residual adjustment, multiple seasonality, time series |
JEL: | C22 C53 |
Date: | 2010–04–28 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2010-10&r=ecm |
By: | Costantini, Mauro (Department of Economics, University of Vienna, Vienna, Austria); Gunter, Ulrich (Department of Economics, University of Vienna, Vienna, Austria); Kunst, Robert M. (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria and Department of Economics, University of Vienna, Vienna, Austria) |
Abstract: | We use data generated by a macroeconomic DSGE model to study the benefits of forecast combinations based on forecast-encompassing tests relative to simple uniformly weighted forecast averages across rival models; a schematic encompassing regression is sketched after this entry. The assumed rival models are four linear autoregressive specifications, one of them a more sophisticated factor-augmented vector autoregression (FAVAR). The forecaster is assumed not to know the true data-generating DSGE model. The results depend critically on the prediction horizon. While one-step prediction hardly supports test-based combinations, the test-based procedure attains a clear lead at prediction horizons greater than two. |
Keywords: | Combining forecasts, encompassing tests, model selection, time series, DSGE model |
JEL: | C15 C32 C53 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:ihs:ihsesp:251&r=ecm |
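One common form of forecast-encompassing test, given here as a hedged sketch (the paper's exact test may differ): regress the forecast error of model 1 on the difference between the rival forecasts and test whether the slope is zero.

```python
import numpy as np
import statsmodels.api as sm

def encompassing_test(y, f1, f2):
    """Regression y - f1 = a + lambda*(f2 - f1) + e; a small p-value for
    lambda rejects the hypothesis that forecast f1 encompasses f2,
    suggesting a combination of the two forecasts is worthwhile."""
    res = sm.OLS(y - f1, sm.add_constant(f2 - f1)).fit()
    return res.params[1], res.pvalues[1]
```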
By: | Peter Ruckdeschel (Fraunhofer ITWM, Department of Financial Mathematics, Dept. of Mathematics, University of Kaiserslautern); Nataliya Horbenko (Fraunhofer ITWM, Department of Financial Mathematics, Dept. of Mathematics, University of Kaiserslautern) |
Abstract: | We study global and local robustness properties of several estimators for shape and scale in a generalized Pareto model. The estimators considered in this paper cover maximum likelihood estimators, skipped maximum likelihood estimators, Cramér-von Mises minimum distance estimators, and, as a special case of quantile-based estimators, the Pickands estimator (sketched after this entry). We further consider an estimator matching the population median and an asymmetric, robust estimator of scale (kMAD) to the empirical ones (kMedMAD), which may be tuned to an expected FSBP of 34%. These estimators are compared to one-step estimators distinguished as optimal in the shrinking neighborhood setting, i.e., the most bias-robust estimator minimizing the maximal (asymptotic) bias and the estimator minimizing the maximal (asymptotic) MSE. For each of these estimators, we determine the finite sample breakdown point, the influence function, as well as statistical accuracy measured by asymptotic bias, variance, and mean squared error, all evaluated uniformly on shrinking convex contamination neighborhoods. Finally, we check these asymptotic theoretical findings against finite sample behavior by an extensive simulation study. |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1005.1476&r=ecm |
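The Pickands estimator mentioned above has a closed form in three upper order statistics; a minimal sketch:

```python
import numpy as np

def pickands_shape(sample, k):
    """Pickands estimator of the extreme-value/GPD shape parameter xi,
    built from the order statistics X_(n-k+1), X_(n-2k+1), X_(n-4k+1)
    (1-based); requires 4*k <= n."""
    x = np.sort(sample)
    n = len(x)
    if 4 * k > n:
        raise ValueError("need 4*k <= n")
    num = x[n - k] - x[n - 2 * k]
    den = x[n - 2 * k] - x[n - 4 * k]
    return np.log(num / den) / np.log(2.0)
```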
By: | Farré, Lídia (Institut d’Anàlisi Econòmica (IAE)); Klein, Roger (Rutgers University); Vella, Francis (Georgetown University) |
Abstract: | An innovation which bypasses the need for instruments when estimating endogenous treatment effects is identification via conditional second moments. The most general of these approaches is Klein and Vella (2010) which models the conditional variances semiparametrically. While this is attractive, as identification is not reliant on parametric assumptions for variances, the non-parametric aspect of the estimation may discourage practitioners from its use. This paper outlines how the estimator can be implemented parametrically. The use of parametric assumptions is accompanied by a large reduction in computational and programming demands. We illustrate the approach by estimating the return to education using a sample drawn from the National Longitudinal Survey of Youth 1979. Accounting for endogeneity increases the estimate of the return to education from 6.8% to 11.2%. |
Keywords: | return to education, heteroskedasticity, endogeneity |
JEL: | J31 C31 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp4935&r=ecm |
By: | Regnard, Nazim; Zakoian, Jean-Michel |
Abstract: | A novel GARCH(1,1) model, with coefficients that are functions of the realizations of an exogenous process, is considered for the volatility of daily gas prices; a simulation sketch follows this entry. A distinctive feature of the model is that it produces non-stationary solutions. The probability properties, and the convergence and asymptotic normality of the Quasi-Maximum Likelihood Estimator (QMLE), were derived by Regnard and Zakoian (2009). Here, the prediction properties of the model are considered. We derive a strongly consistent estimator of the asymptotic variance of the QMLE. An application to daily gas spot prices from the Zeebrugge market is presented. Apart from conditional heteroskedasticity, an empirical finding is the existence of distinct volatility regimes depending on the temperature level. |
Keywords: | GARCH; Gas prices; Nonstationary models; Periodic models; Quasi-maximum likelihood estimation; Time-varying coefficients |
JEL: | C53 C22 |
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:22642&r=ecm |
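A minimal simulation sketch of a GARCH(1,1) whose coefficients are driven by an exogenous process, in the spirit of the entry above. The regime functions omega, alpha, beta (e.g., step functions of temperature) are illustrative assumptions.

```python
import numpy as np

def simulate_exog_garch(z, omega, alpha, beta, rng=None):
    """Simulate eps_t = sigma_t * eta_t with
    sigma_t^2 = omega(z_t) + alpha(z_t)*eps_{t-1}^2 + beta(z_t)*sigma_{t-1}^2,
    where z is an observed exogenous path (e.g., daily temperature)."""
    rng = np.random.default_rng(rng)
    T = len(z)
    eps, sig2 = np.empty(T), np.empty(T)
    sig2[0] = omega(z[0])                          # simple initialization
    eps[0] = np.sqrt(sig2[0]) * rng.standard_normal()
    for t in range(1, T):
        sig2[t] = omega(z[t]) + alpha(z[t]) * eps[t-1]**2 + beta(z[t]) * sig2[t-1]
        eps[t] = np.sqrt(sig2[t]) * rng.standard_normal()
    return eps, sig2
```

With, say, alpha and beta larger in a cold regime, the resulting solution need not be stationary, which is the feature the paper emphasizes.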
By: | Jordà, Òscar; Knüppel, Malte; Marcellino, Massimiliano |
Abstract: | Measuring and displaying uncertainty around path-forecasts, i.e. forecasts made in period T about the expected trajectory of a random variable in periods T+1 to T+H, is a key ingredient for decision making under uncertainty. The probabilistic assessment about the set of possible trajectories that the variable may follow over time is summarized by the simultaneous confidence region generated from its forecast generating distribution. However, if the null model is only approximative or altogether unavailable, one cannot derive analytic expressions for this confidence region, and its non-parametric estimation is impractical given commonly available predictive sample sizes. Instead, this paper derives approximate rectangular confidence regions that control the false discovery rate, which are a function of the predictive sample covariance matrix and the empirical distribution of the Mahalanobis distance of the path-forecast errors; a heuristic sketch follows this entry. These rectangular regions are simple to construct and appear to work well in a variety of cases explored empirically and by simulation. The proposed techniques are applied to provide confidence bands around the Fed and Bank of England real-time path-forecasts of growth and inflation. |
Keywords: | Path forecast, forecast uncertainty, simultaneous confidence region, Scheffé's S-method, Mahalanobis distance, false discovery rate |
JEL: | C32 C52 C53 |
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:zbw:bubdp1:201006&r=ecm |
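A heuristic reading of the construction, as a sketch only: estimate the error covariance across horizons from past path-forecast errors, compute the empirical distribution of their normalized Mahalanobis distances, and scale per-horizon standard errors by an empirical quantile. The normalization and calibration below are assumptions; the paper's exact FDR-controlling rule is not reproduced.

```python
import numpy as np

def rectangular_band(path_forecast, past_errors, alpha=0.10):
    """Heuristic rectangular band around an H-step path forecast.

    past_errors: m x H array of historical path-forecast errors (m > H
    is needed for the covariance matrix to be invertible)."""
    H = past_errors.shape[1]
    sigma = np.cov(past_errors, rowvar=False)           # H x H error covariance
    sig_inv = np.linalg.inv(sigma)
    d = np.sqrt(np.einsum('ij,jk,ik->i', past_errors, sig_inv, past_errors) / H)
    c = np.quantile(d, 1.0 - alpha)                     # empirical calibration
    half_width = c * np.sqrt(np.diag(sigma))
    return path_forecast - half_width, path_forecast + half_width
```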
By: | Rama Cont (IEOR Dept., Columbia University, New York, USA, and Laboratoire de Probabilites et Modeles Aleatoires, CNRS-Universite Paris VI, France); Cecilia Mancini (Dipartimento di Matematica per le Decisioni, Universita' degli Studi di Firenze) |
Abstract: | We propose two nonparametric tests for investigating the pathwise properties of a signal modeled as the sum of a Lévy process and a Brownian semimartingale. Using a nonparametric threshold estimator for the continuous component of the quadratic variation (sketched after this entry), we design a test for the presence of a continuous martingale component in the process and a test for establishing whether the jumps have finite or infinite variation, based on observations on a discrete time grid. We evaluate the performance of our tests using simulations of various stochastic models and use the tests to investigate the fine structure of the DM/USD exchange rate fluctuations and SPX futures prices. In both cases, our tests reveal the presence of a non-zero Brownian component and a finite variation jump component. |
Keywords: | Threshold estimator, central limit theorem, test for finite variation jumps, test for Brownian component. |
Date: | 2010–01 |
URL: | http://d.repec.org/n?u=RePEc:flo:wpaper:2010-02&r=ecm |
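The threshold estimator underlying both tests discards increments larger than a deterministic cutoff shrinking with the time step; below, a minimal sketch with a power threshold, where the constant c and exponent gamma (< 1/2) are illustrative tuning choices.

```python
import numpy as np

def threshold_realized_variance(x, c=3.0, gamma=0.49):
    """Threshold/truncated estimator of the continuous quadratic variation
    from equally spaced observations x of a log-price.

    Increments exceeding r(dt) = c * dt**gamma are attributed to jumps
    and excluded (gamma < 1/2, as in the threshold-estimator literature)."""
    dx = np.diff(x)
    dt = 1.0 / len(dx)
    thresh = c * dt ** gamma
    cont = dx[np.abs(dx) <= thresh]
    iv_hat = np.sum(cont ** 2)           # integrated variance estimate
    jumps = np.sum(dx ** 2) - iv_hat     # squared-jump part of realized variance
    return iv_hat, jumps
```

The retained sum estimates the continuous quadratic variation; the discarded part estimates the jump contribution, which is the raw material of the entry's two tests.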
By: | Andrés M. Alonso; David Casado; Juan Romo |
Abstract: | A popular approach for classifying functional data is based on the distances from the function or its derivatives to group representative functions (usually the means) or their derivatives. In this paper, we propose using a combination of those distances; see the sketch after this entry. Simulation studies show that our procedure performs very well, resulting in smaller test classification errors. Applications to real data show that our procedure performs as well as, and in some cases better than, other classification methods. |
Keywords: | Discriminant analysis, Functional data, Weighted distances |
Date: | 2009–07 |
URL: | http://d.repec.org/n?u=RePEc:cte:wsrepe:ws093915&r=ecm |
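A minimal sketch of distance-based classification with a weighted combination of curve and derivative distances, on a common discretization grid; the single weight w and the use of squared L2 distances are illustrative assumptions (how best to combine the distances is precisely what the paper studies).

```python
import numpy as np

def classify_curve(f, df, group_means, group_dmeans, w=0.5):
    """Assign a discretized curve f (and its derivative df) to the group
    minimizing a weighted combination of squared distances to the group
    mean curves and mean derivative curves."""
    scores = [(1 - w) * np.mean((f - m) ** 2) + w * np.mean((df - dm) ** 2)
              for m, dm in zip(group_means, group_dmeans)]
    return int(np.argmin(scores))
```

Here group_means and group_dmeans would be computed from a training sample of labelled curves evaluated on the same grid.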
By: | José Antonio Carnicero; Michael P. Wiper; Concepción Ausín |
Abstract: | This paper introduces a new non-parametric approach to the modeling of circular data, based on the use of Bernstein polynomial densities, which generalizes the standard Bernstein polynomial model to account for the specific characteristics of circular data; a generic sketch follows this entry. It is shown that the trigonometric moments of the proposed circular Bernstein polynomial distribution can all be derived in closed form. We comment on how to fit the Bernstein polynomial density approximation to a sample of data and illustrate our approach with a real data example. |
Keywords: | Circular data, Non-parametric modeling, Bernstein polynomials |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:cte:wsrepe:ws102511&r=ecm |
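For orientation, a sketch of a plain Bernstein polynomial density applied to angles rescaled to [0, 1]: a mixture of Beta(j+1, k-j) densities with nonnegative weights summing to one. This is the standard non-circular construction; the paper's circular generalization, which handles the wrap-around at 0 = 2*pi, is not reproduced.

```python
import numpy as np
from scipy.stats import beta

def bernstein_density(theta, weights):
    """Bernstein polynomial density for angles theta in [0, 2*pi):
    mixture of Beta(j+1, k-j) densities on the rescaled angle."""
    w = np.asarray(weights)               # nonnegative, summing to one
    k = len(w)
    u = np.asarray(theta) / (2.0 * np.pi)
    comps = np.array([beta.pdf(u, j + 1, k - j) for j in range(k)])
    return (w @ comps) / (2.0 * np.pi)    # Jacobian of the angle rescaling
```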
By: | Marco Aiolfi (Goldman Sachs Asset Management); Carlos Capistrán (Banco de México); Allan Timmermann (University of California, San Diego and CREATES) |
Abstract: | We consider combinations of subjective survey forecasts and model-based forecasts from linear and non-linear univariate specifications as well as multivariate factor-augmented models. Empirical results suggest that a simple equal-weighted average of survey forecasts outperforms the best model-based forecasts for a majority of macroeconomic variables and forecast horizons. Additional improvements can in some cases be gained by using a simple equal-weighted average of survey and model-based forecasts. We also provide an analysis of the importance of model instability for explaining gains from forecast combination. Analytical and simulation results uncover break scenarios where forecast combinations outperform the best individual forecasting model. |
Keywords: | Time-series forecasts, survey forecasts, model instability |
JEL: | C22 C53 |
Date: | 2010–05–06 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-21&r=ecm |
By: | Yin Liao; Heather M. Anderson; Farshid Vahid |
Abstract: | Realized volatility of stock returns is often decomposed into two distinct components that are attributed to continuous price variation and jumps. This paper proposes a tobit multivariate factor model for the jumps, coupled with a standard multivariate factor model for the continuous sample path, to jointly forecast volatility in three Chinese Mainland stocks. Out-of-sample forecast analysis shows that separate multivariate factor models for the two volatility processes outperform a single multivariate factor model of realized volatility, and that a single multivariate factor model of realized volatility outperforms univariate models. |
Keywords: | Realized Volatility, Bipower Variation, Jumps, Common Factors, Forecasting |
JEL: | C13 C32 C52 C53 G32 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2010-11&r=ecm |
By: | Florens, Jean-Pierre; Johannes, Jan; Van Bellegem, Sébastien |
Date: | 2009–08 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22276&r=ecm |
By: | Rothe, Christoph |
Abstract: | In this paper, we study the effect of a small ceteris paribus change in the marginal distribution of a binary covariate on some feature of the unconditional distribution of an outcome variable of interest. We show that the RIF regression techniques recently proposed by Firpo, Fortin, and Lemieux (2009), sketched after this entry, do not estimate this quantity. Moreover, we show that such parameters are in general only partially identified, and we derive straightforward expressions for the identified set. The results are implemented in the context of an empirical application that studies the effect of union membership rates on the distribution of wages. |
Keywords: | unconditional partial effect, partial identification, unconditional quantile regression |
JEL: | C14 C31 |
Date: | 2009–09–02 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22190&r=ecm |
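For reference, the recentered influence function (RIF) of a quantile, the building block of the RIF regression discussed above: q + (tau - 1{y <= q})/f(q), with the density at the quantile estimated by a kernel. A minimal sketch:

```python
import numpy as np
from scipy.stats import gaussian_kde

def rif_quantile(y, tau):
    """RIF of the tau-th quantile of y (Firpo, Fortin and Lemieux, 2009):
    q + (tau - 1{y <= q}) / f(q), with f(q) from a Gaussian KDE."""
    y = np.asarray(y, dtype=float)
    q = np.quantile(y, tau)
    f_q = gaussian_kde(y)(q)[0]
    return q + (tau - (y <= q)) / f_q
```

Regressing rif_quantile(y, tau) on covariates by OLS gives the unconditional quantile regression; the entry's point is that for a binary covariate the resulting coefficient does not identify the effect of a marginal shift in that covariate's distribution.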
By: | Keith Ord; Ralph Snyder; Adrian Beaumont |
Abstract: | Organizations with large-scale inventory systems typically have a large proportion of items for which demand is intermittent and low volume. We examine different approaches to forecasting for such products, paying particular attention to the need for inventory planning over a multi-period lead-time when the underlying process may be non-stationary. We develop a forecasting framework based upon the zero-inflated Poisson (ZIP) distribution, which enables the explicit evaluation of the multi-period lead-time demand distribution in special cases and an effective simulation scheme more generally; see the sketch after this entry. We also develop performance measures related to the entire predictive distribution, rather than focusing exclusively upon point predictions. The ZIP model is compared to a number of existing methods using data on the monthly demand for 1,046 automobile parts, provided by a US automobile manufacturer. We conclude that the ZIP scheme compares favorably to other approaches, including variations of Croston's method, as well as providing a straightforward basis for inventory planning. |
Keywords: | Croston's method; Exponential smoothing; Intermittent demand; Inventory control; Prediction likelihood; State space models; Zero-inflated Poisson distribution |
JEL: | C22 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2010-12&r=ecm |
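A minimal sketch of the ZIP building block and a lead-time demand simulation. Holding the zero-inflation probability p0 and Poisson mean lam fixed over the lead-time is a simplifying assumption; the paper's state-space formulation lets the process evolve over time.

```python
import numpy as np
from scipy.stats import poisson

def zip_pmf(k, p0, lam):
    """Zero-inflated Poisson pmf: extra point mass p0 at zero."""
    base = poisson.pmf(k, lam)
    return np.where(np.asarray(k) == 0, p0 + (1 - p0) * base, (1 - p0) * base)

def lead_time_demand(p0, lam, lead_time, n_sims=100_000, seed=None):
    """Monte-Carlo distribution of total demand over a multi-period
    lead-time under i.i.d. ZIP demand."""
    rng = np.random.default_rng(seed)
    zero = rng.random((n_sims, lead_time)) < p0
    counts = rng.poisson(lam, size=(n_sims, lead_time))
    return np.where(zero, 0, counts).sum(axis=1)
```

Quantiles of the simulated lead-time demand translate directly into reorder levels, the inventory-planning use highlighted in the entry.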
By: | Fève, Patrick; Matheron, Julien; Sahuc, Jean-Guillaume |
Date: | 2009–12 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22257&r=ecm |
By: | Caporin, M. (Erasmus Econometric Institute); McAleer, M.J. (Erasmus Econometric Institute) |
Abstract: | DAMGARCH is a new model that extends the VARMA-GARCH model of Ling and McAleer (2003) by introducing multiple thresholds and time-dependent structure in the asymmetry of the conditional variances. Analytical expressions for the news impact surface implied by the new model are also presented. DAMGARCH models the shocks affecting the conditional variances on the basis of an underlying multivariate distribution. It is possible to model explicitly asset-specific shocks and common innovations by partitioning the multivariate density support. This paper presents the model structure, describes the implementation issues, and provides the conditions for the existence of a unique stationary solution, and for consistency and asymptotic normality of the quasi-maximum likelihood estimators. The paper also presents an empirical example to highlight the usefulness of the new model. |
Keywords: | multivariate asymmetry; conditional variance; stationarity conditions; asymptotic theory; multivariate news impact curve |
Date: | 2010–05–11 |
URL: | http://d.repec.org/n?u=RePEc:dgr:eureir:1765019452&r=ecm |
By: | Alessio Moneta (Max Planck Institute of Economics); Doris Entner (Helsinki Institute for Information Technology); Patrik Hoyer (Helsinki Institute for Information Technology and Massachusetts Institute of Technology); Alex Coad (Max Planck Institute of Economics) |
Abstract: | Structural vector-autoregressive models are potentially very useful tools for guiding both macro- and microeconomic policy. In this paper, we present a recently developed method for exploiting non-Gaussianity in the data when estimating such models, with the aim of capturing the causal structure underlying the data, and show how the method can be applied to both microeconomic data (processes of firm growth and firm performance) and macroeconomic data (effects of monetary policy); a schematic sketch follows this entry. |
Keywords: | Causality, Structural VAR, Independent Components Analysis, Non-Gaussianity, Firm Growth, Monetary Policy |
JEL: | C32 C52 D21 E52 L21 |
Date: | 2010–05–11 |
URL: | http://d.repec.org/n?u=RePEc:jrp:jrpwrp:2010-031&r=ecm |
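A schematic sketch of the idea: fit a reduced-form VAR, then unmix its residuals with ICA, which under independent non-Gaussian structural shocks recovers the contemporaneous mixing matrix up to scaling, sign, and permutation. The paper's LiNGAM-style causal ordering and pruning steps are not reproduced, and the libraries used here are stand-ins rather than the authors' implementation.

```python
import numpy as np
from statsmodels.tsa.api import VAR
from sklearn.decomposition import FastICA

def ica_svar(data, lags=2, seed=0):
    """Reduced-form VAR plus ICA on the residuals: resid ~ shocks @ mixing.T,
    with statistically independent, non-Gaussian structural shocks."""
    var_res = VAR(data).fit(lags)
    ica = FastICA(whiten='unit-variance', random_state=seed)
    shocks = ica.fit_transform(var_res.resid)   # estimated structural shocks
    return shocks, ica.mixing_, var_res
```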
By: | Verardi, Vincenzo (Free University of Brussels); Wagner, Joachim (Leuphana University Lüneburg) |
Abstract: | In empirical studies it often happens that some variables for some units are far away from the other observations in the sample. These extreme observations, or outliers, often have a large impact on the results of statistical analyses: conclusions based on a sample with and without these units may differ drastically. While applied researchers tend to be aware of this, the detection of outliers and their appropriate treatment is often dealt with in a rather sloppy manner. One reason for this habit seems to be the lack of availability of appropriate canned programs for robust methods that can be used in the presence of outliers. Our paper intends to improve on this situation by presenting a highly robust method for estimation of the popular linear fixed effects panel data model, and by supplying Stata code for it; a simple Python analogue is sketched after this entry. In an application from the field of the micro-econometrics of international firm activities we demonstrate that outliers can indeed drive results. |
Keywords: | Stata, outliers, panel data, robust estimation, exporter productivity premium |
JEL: | C23 C81 C87 F14 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp4928&r=ecm |
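Not the authors' high-breakdown Stata routine, but a minimal Python analogue to show the mechanics: apply the within (fixed-effects) transformation, then fit a robust M-estimator. Huber's M-estimator below does not have the high breakdown point the paper targets; it is only an illustration.

```python
import numpy as np
import statsmodels.api as sm

def robust_fixed_effects(y, X, groups):
    """Within-transform y and X by panel unit, then fit a Huber M-estimator.
    groups: array of panel-unit identifiers, one per observation."""
    def demean(a):
        a = np.asarray(a, dtype=float).copy()
        for g in np.unique(groups):
            m = groups == g
            a[m] = a[m] - a[m].mean(axis=0)
        return a
    return sm.RLM(demean(y), demean(X), M=sm.robust.norms.HuberT()).fit()
```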
By: | Cecilia Mancini (Dipartimento di Matematica per le Decisioni, Universita' degli Studi di Firenze); Fabio Gobbi (Department of Mathematical Economics, University of Bologna) |
Abstract: | In this paper we consider two semimartingales driven by Wiener processes and (possibly infinite activity) jumps. Given discrete observations, we separately estimate the integrated covariation IC and the sum of the co-jumps. The Realized Covariation (RC) converges to the sum of IC and the co-jumps as the number of observations increases to infinity. Our threshold (or truncated) estimator \hat{IC}_n, sketched after this entry, excludes from RC all the terms containing jumps in the finite activity case and the terms containing jumps over the threshold in the infinite activity case, and is consistent. To further measure the dependence between the two processes, the betas, \beta^{(1,2)} and \beta^{(2,1)}, and the correlation coefficient \rho^{(1,2)} among the Brownian semimartingale parts are also consistently estimated. In the presence of only finite activity jumps: 1) we reach CLTs for \hat{IC}_n, \hat\beta^{(i,j)} and \hat\rho^{(1,2)}; 2) combining thresholding with the observation selection proposed in Hayashi and Yoshida (2005), we reach an estimate of IC which is robust to asynchronous data. We report the results of an illustrative application, given in a web appendix (www.dmd.unifi.it/upload/sub/persone/mancini/WebAppendix3.pdf), to two very different simulated realistic asset price models, and we see that the finite sample performances of \hat{IC}_n and of the co-jumps estimator are good for values of the observation step large enough to avoid the typical problems arising in the presence of microstructure noise in the data. However, we find that the co-jumps estimators are more sensitive than \hat{IC}_n to the choice of the threshold. Finding criteria for optimal threshold selection is the object of further research. |
Keywords: | co-jumps, integrated covariation, integrated variance, finite activity jumps, infinite activity jumps, threshold estimator |
Date: | 2010–03 |
URL: | http://d.repec.org/n?u=RePEc:flo:wpaper:2010-05&r=ecm |
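A bivariate companion to the univariate threshold sketch earlier in this digest: for two synchronously observed log-price paths, keep an increment product only when both increments fall below the threshold, so that co-jumps are excluded. The constant c and exponent gamma are illustrative tuning choices.

```python
import numpy as np

def threshold_covariation(x1, x2, c=3.0, gamma=0.49):
    """Threshold estimator of integrated covariation from synchronous
    log-price paths x1, x2; also returns the implied co-jump part."""
    d1, d2 = np.diff(x1), np.diff(x2)
    dt = 1.0 / len(d1)
    th = c * dt ** gamma
    keep = (np.abs(d1) <= th) & (np.abs(d2) <= th)
    ic_hat = np.sum(d1[keep] * d2[keep])     # integrated covariation estimate
    cojumps = np.sum(d1 * d2) - ic_hat       # realized covariation minus IC
    return ic_hat, cojumps
```

The betas in the entry then follow as ratios, e.g. an estimate of \beta^{(1,2)} as IC divided by a threshold estimate of the relevant integrated variance.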
By: | Ana P. Palacios; Juan Miguel Marín; Michael P. Wiper |
Abstract: | Bacterial growth models are commonly used in food safety. Such models permit the prediction of microbial safety and the shelf life of perishable foods. In this paper, we study the problem of modelling bacterial growth when we observe multiple experimental results under identical environmental conditions. We develop a hierarchical version of the Gompertz equation (sketched after this entry) to take into account the possibility of replicated experiments, and we show how it can be fitted using a fully Bayesian approach. This approach is illustrated using experimental data from Listeria monocytogenes growth, and the results are compared with alternative models. Model selection is undertaken throughout using an appropriate version of the deviance information criterion and the posterior predictive loss criterion. Models are fitted using WinBUGS via R2WinBUGS. |
Keywords: | Predictive microbiology, Growth models, Gompertz curve, Bayesian hierarchical modelling |
Date: | 2010–04 |
URL: | http://d.repec.org/n?u=RePEc:cte:wsrepe:ws102109&r=ecm |
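The likely building block is the modified Gompertz curve standard in predictive microbiology (Zwietering parameterization); the hierarchical Bayesian layer that pools replicated experiments is paper-specific and not reproduced. The parameter names below are the conventional ones, an assumption on our part.

```python
import numpy as np

def gompertz_growth(t, A, mu, lam):
    """Modified Gompertz curve: A = asymptotic log-count increase,
    mu = maximal specific growth rate, lam = lag time."""
    return A * np.exp(-np.exp(mu * np.e * (lam - t) / A + 1.0))
```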
By: | Michiels F.; De Schepper A. |
Abstract: | The selection of copulas is an important aspect of dependence modeling. In many practical applications, only a limited number of copulas is tested, and the modeling applications are usually restricted to the bivariate case. One explanation is the fact that no graphical copula tool exists which allows one to assess the goodness-of-fit of a large set of (possibly higher-dimensional) copula functions at once. This paper seeks to overcome this problem by developing a new graphical tool for copula selection, based on a statistical analysis technique called 'principal coordinate analysis' (sketched after this entry). The advantage is threefold. In the first place, when projecting the empirical copula of a modeling application on a two-dimensional copula space, it allows us to visualize the fit of a whole collection of multivariate copulas at once. Secondly, the visual tool allows one to identify 'search' directions for potential fit improvements (e.g. through the use of copula transforms). Finally, in the bivariate case the tool also makes it possible to give a two-dimensional visual overview of a large number of known copula families, for a common concordance value, leading to a better understanding and a more efficient use of the different copula families. The practical use of the new graphical tool is illustrated for two two-dimensional and two three-dimensional fitting applications. |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:ant:wpaper:2010004&r=ecm |
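Principal coordinate analysis is classical multidimensional scaling: given pairwise distances between copulas (for instance, an L2 distance between copula functions evaluated on a grid, our assumption here), double-centre the squared distance matrix and embed via its leading eigenvectors.

```python
import numpy as np

def principal_coordinates(D, ndim=2):
    """Classical MDS / principal coordinate analysis: embed objects with
    pairwise distance matrix D (n x n, symmetric) into ndim dimensions."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J                # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:ndim]      # leading eigenvalues
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))
```

Plotting the rows, with the empirical copula as one of the embedded objects, gives the at-a-glance goodness-of-fit picture the entry describes.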
By: | Cecilia Mancini (Dipartimento di Matematica per le Decisioni, Universita' degli Studi di Firenze) |
Abstract: | In this paper we consider a semimartingale model for the evolution of the price of a financial asset, driven by a Brownian motion (plus drift) and possibly infinite activity jumps. Given discrete observations, the threshold estimator is able to separate the integrated variance from the sum of the squared jumps. This is important for measuring and forecasting the asset risks. The exact convergence speed was previously found in the literature only when the jumps are of finite variation. Here we give the speed even in the presence of infinite variation jumps, as they appear e.g. in some CGMY-plus-diffusion models. |
Keywords: | Integrated variance, threshold estimator, convergence speed, infinite activity stable Lévy jumps |
Date: | 2010–04 |
URL: | http://d.repec.org/n?u=RePEc:flo:wpaper:2010-03&r=ecm |
By: | Caporin, M. (Erasmus Econometric Institute); McAleer, M.J. (Erasmus Econometric Institute) |
Abstract: | In the last 15 years, several multivariate GARCH (MGARCH) models have appeared in the literature. The two most widely known and used are the scalar BEKK model of Engle and Kroner (1995) and Ding and Engle (2001), and the DCC model of Engle (2002). Some recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. In this paper, we provide an empirical comparison of a set of MGARCH models, namely BEKK, DCC, Corrected DCC (cDCC) of Aielli (2008), CCC of Bollerslev (1990), Exponentially Weighted Moving Average, and covariance shrinking of Ledoit and Wolf (2004), using the historical data of 89 US equities. Our methods follow some of the approach described in Patton and Sheppard (2009), and contribute to the literature in several directions. First, we consider a wide range of models, including the recent cDCC model and covariance shrinking. Second, we use a range of tests and approaches for direct and indirect model comparison, including the Weighted Likelihood Ratio test of Amisano and Giacomini (2007). Third, we examine how the model rankings are influenced by the cross-sectional dimension of the problem. |
Keywords: | covariance forecasting; model confidence set; model ranking; MGARCH; model comparison |
Date: | 2010–05–11 |
URL: | http://d.repec.org/n?u=RePEc:dgr:eureir:1765019447&r=ecm |
By: | Vladimir Panov |
Abstract: | In this article, we present new ideas concerning Non-Gaussian Component Analysis (NGCA). We use the structural assumption that a high-dimensional random vector X can be represented as a sum of two components: a low-dimensional signal S and a noise component N. We show that this assumption yields a special representation for the density function of X. Similar facts are proven in the original papers about NGCA ([1], [5], [13]), but our representation differs from the previous versions. The new form helps us to provide strong theoretical support for the algorithm; moreover, it suggests some new approaches to multidimensional statistical analysis. In this paper, we establish important results for the NGCA procedure using the new representation, and show the benefits of our method. |
Keywords: | dimension reduction, non-Gaussian components, EDR subspace, classification problem, Value at Risk |
JEL: | C13 C14 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2010-026&r=ecm |
By: | Rothe, Christoph |
Abstract: | This note demonstrates identification of Unconditional Partial Effects introduced by Firpo, Fortin, and Lemieux (2009) in nonseparable triangular models with endogenous regressors via a control variable approach, as employed by Imbens and Newey (2009). |
JEL: | C14 |
Date: | 2009–06–30 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22180&r=ecm |
By: | Don McLeish |
Abstract: | Consider a process, stochastic or deterministic, obtained by using a numerical integration scheme, or from Monte-Carlo methods involving an approximation to an integral, or a Newton-Raphson iteration to approximate the root of an equation. We assume that we can sample from the distribution of the process from time 0 to finite time n. We propose a scheme for unbiased estimation of the limiting value of the process, together with estimates of standard error, and apply it to examples including numerical integrals, root-finding, and option pricing in a Heston stochastic volatility model; a generic sketch of the debiasing idea follows this entry. This yields unbiased estimators in place of biased ones in many potential applications. |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1005.2228&r=ecm |
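A generic sketch of randomized-truncation debiasing, one way to make such schemes unbiased (a hedged reading of the idea; the paper's specific weighting and variance analysis are not reproduced). If S_n converges to the limit, then E[S_0 + sum_{n<=N} (S_n - S_{n-1})/P(N >= n)] equals lim_n E[S_n] for a random truncation level N, provided the interchange of expectation and limit is justified.

```python
import numpy as np

def debiased_limit(sample_path, p=0.5, seed=None):
    """Single randomized-truncation replicate: draw N ~ Geometric(p) on
    {1, 2, ...}, so P(N >= n) = (1-p)**(n-1), and reweight increments.
    sample_path(n) must return a fresh realization (S_0, ..., S_n)."""
    rng = np.random.default_rng(seed)
    N = int(rng.geometric(p))
    S = np.asarray(sample_path(N), dtype=float)
    weights = (1.0 - p) ** (-np.arange(N))    # 1/P(N >= n) for n = 1..N
    return S[0] + np.sum(np.diff(S) * weights)
```

Averaging independent replicates gives the unbiased estimate with the usual sample standard error; a larger p keeps paths short at the cost of a higher-variance estimator.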
By: | Bertrand B. Maillet (ABN AMRO Advisors, Variances and University of Paris-1 (CES/CNRS and EIF)); Jean-Philippe R. Médecin (Paris School of Economics, University of Paris-1 and Variances) |
Abstract: | Following Bali and Weinbaum (2005) and Maillet et al. (2010), we present several estimates of volatilities computed with high- and low-frequency data and complement their results using additional measures of risk and several alternative methods for tail-index estimation (the classical Hill estimator is sketched after this entry). The aim here is to confirm previous results regarding the slope of the tail of various risk measure distributions, in order to define the high watermarks of market risks. We also produce synthetic general results concerning the estimation of tail-indexes related to expressions of the L-moments. Based on estimates of tail-indexes, retrieved from the high-frequency 30-minute sampled CAC40 French stock index series over the period 1997-2009, using non-parametric generalized Hill, maximum likelihood and various kinds of L-moment methods for the estimation of both a Generalized Extreme Value density and a Generalized Pareto Distribution, we confirm that a heavy-tail density specification of the log-volatility is not necessary. |
Keywords: | Financial Crisis, Realized Volatility, Range-based Volatility, Extreme Value Distributions, Tail-index, L-moments, High Frequency Data. |
JEL: | G10 G14 |
URL: | http://d.repec.org/n?u=RePEc:ven:wpaper:2010_10&r=ecm |
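The classical Hill estimator referred to above (the entry also uses a generalized variant, not reproduced here): the mean log-spacing of the k largest observations over the (k+1)-th largest, for positive data.

```python
import numpy as np

def hill_estimator(sample, k):
    """Hill estimator of the tail index xi from the k upper order
    statistics of a positive sample; sensitive to the choice of k."""
    x = np.sort(np.asarray(sample, dtype=float))[::-1]   # descending order
    return float(np.mean(np.log(x[:k]) - np.log(x[k])))
```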
By: | Bisio Laura; Faccini Andrea |
Abstract: | The aim of this paper is to verify whether a proper SVEC representation of a standard Real Business Cycle model exists even when the capital stock series is omitted. The question is relevant because the common unavailability of sufficiently long medium-frequency capital series prevents researchers from including capital in the widespread structural VAR (SVAR) representations of DSGE models, which is supposed to be the cause of biased SVAR estimates. Indeed, a large debate about the truncation and small-sample bias affecting the SVAR performance in approximating DSGE models has recently arisen. In our view, the estimates may be less distorted when the RBC dynamics is approximated through a SVEC model, as the information provided by the cointegrating relations among some variables might compensate for the exclusion of the capital stock series from the empirical representation of the model. |
Keywords: | RBC, SVAR, SVEC model, cointegration |
JEL: | E27 E32 C32 C52 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:ter:wpaper:0066&r=ecm |
By: | Genton, Mark G.; Ruiz-Gazen, Anne |
Abstract: | We introduce the hair-plot to visualize influential observations in dependent data. It consists of all trajectories of the value of an estimator when each observation is modified in turn by an additive perturbation; see the sketch after this entry. We define two measures of influence: the local influence, which describes the rate of departure from the original estimate due to a small perturbation of each observation; and the asymptotic influence, which indicates the influence on the original estimate of the most extreme contamination for each observation. The cases of estimators defined as quadratic forms or ratios of quadratic forms are investigated in detail. Sample autocovariances, covariograms and variograms belong to the first case. Sample autocorrelations, correlograms, and indices of spatial autocorrelation such as Moran's I belong to the second case. We illustrate our approach on various datasets from time series analysis and spatial statistics. |
Keywords: | autocovariance, Moran's I, outlier |
Date: | 2009–06–23 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22176&r=ecm |
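Computing the raw material of a hair-plot is a direct recomputation exercise; a minimal sketch (the example estimator, a lag-1 autocorrelation, is one instance of the ratio-of-quadratic-forms case mentioned in the entry):

```python
import numpy as np

def hair_trajectories(x, estimator, deltas):
    """One 'hair' per observation: re-evaluate the estimator after adding
    each perturbation delta to observation i in turn.
    Returns an (n, len(deltas)) array of estimator values."""
    x = np.asarray(x, dtype=float)
    out = np.empty((len(x), len(deltas)))
    for i in range(len(x)):
        for j, d in enumerate(deltas):
            xp = x.copy()
            xp[i] += d
            out[i, j] = estimator(xp)
    return out

# Example estimator: lag-1 sample autocorrelation.
acf1 = lambda v: np.corrcoef(v[:-1], v[1:])[0, 1]
```

Plotting each row of hair_trajectories(x, acf1, deltas) against deltas reproduces the hairs; their slopes at zero correspond to the local influence, and their limits for extreme deltas to the asymptotic influence.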
By: | Jeroen V.K. Rombouts (Institute of Applied Economics at HEC Montréal, CIRANO, CIRPEE, Université catholique de Louvain (CORE)); Lars Stentoft (Department of Finance at HEC Montréal, CIRANO, CIRPEE and CREATES) |
Abstract: | In recent years, multivariate models for asset returns have received much attention; this is particularly the case for models with time-varying volatility. In this paper we consider models of this class and examine their potential when it comes to option pricing. Specifically, we derive the risk-neutral dynamics for a general class of multivariate heteroskedastic models, and we provide a feasible way to price options in this framework. Our framework can be used irrespective of the assumed underlying distribution and dynamics, and it nests several important special cases. We provide an application to options on the minimum of two indices. Our results show that not only is correlation important for these options, but so is allowing this correlation to be dynamic. Moreover, we show that for the general model exposure to correlation risk carries an important premium, and when this is neglected option prices are estimated with errors. Finally, we show that when neglecting the non-Gaussian features of the data, option prices are also estimated with large errors. |
Keywords: | Multivariate risk premia, Option pricing, GARCH models |
JEL: | C11 C15 C22 G13 |
Date: | 2010–04–30 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-19&r=ecm |
By: | Bontemps, Christian; Magnac, Thierry; Maurin, Eric |
Date: | 2009–09 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22272&r=ecm |
By: | Daskovska, Alexandra; Simar, Léopold; Van Bellegem, Sébastien |
Abstract: | The Malmquist Productivity Index (MPI) suggests a convenient way of measuring the productivity change of a given unit between two consecutive time periods. Until now, only a static approach for analyzing the MPI was available in the literature. However, this hides potentially valuable information given by the evolution of productivity over time. In this paper, we introduce a dynamic procedure for forecasting the MPI. We compare several approaches and give credit to a method based on the assumption of circularity. Because the MPI is not circular, we present a new decomposition of the MPI in which the time-varying indices are circular. Based on that decomposition, a new working dynamic forecasting procedure is proposed and illustrated. To construct prediction intervals for the MPI, we extend the bootstrap method in order to take into account potential serial correlation in the data. We illustrate all the new techniques described above by forecasting the productivity index of 17 OECD countries, constructed from their GDP, labor and capital stock. |
Keywords: | Malmquist Productivity Index, circularity, efficiency, smooth bootstrap |
Date: | 2009–06–11 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:22150&r=ecm |