nep-ecm New Economics Papers
on Econometrics
Issue of 2010‒10‒02
twenty-six papers chosen by
Sune Karlsson
Örebro University

  1. Nonparametric stochastic frontier estimation via profile likelihood By Carlos Martins-Filho; Feng Yao
  2. Efficient semiparametric instrumental variable estimation By Feng Yao; Junsen Zhang
  3. Iterative Regularization in Nonparametric Instrumental Regression By Johannes, Jan; Van Bellegem, Sébastien; Vanhems, Anne
  4. A semiparametric Bayesian approach to the analysis of financial time series with applications to value at risk estimation By Concepción Ausín; Pedro Galeano; Pulak Ghosh
  5. Nonparametric Frontier Estimation from Noisy Data By Florens, Jean-Pierre; Schwarz, Maik; Van Bellegem, Sébastien
  6. Nonparametric Estimation of An Instrumental Regression: A Quasi-Bayesian Approach Based on Regularized Posterior By Florens, Jean-Pierre; Simoni, Anna
  7. Unit root testing under a local break in trend By David I. Harvey; Stephen J. Leybourne; A. M. Robert Taylor
  8. Indirect Inference Based on the Score By Peter Fuleky; Eric Zivot
  9. Jumpy or Kinky? Regression Discontinuity without the Discontinuity By Dong, Yingying
  10. Efficient Evaluation of Multidimensional Time-Varying Density Forecasts with an Application to Risk Management By Evarist Stoja; Arnold Polanski
  11. A note on some properties of a skew-normal density By Carlos Martins-Filho; Feng Yao
  12. Empirical power of the Kwiatkowski-Phillips-Schmidt-Shin test By Ewa M. Syczewska
  13. Regularizing priors for linear inverse problems By Florens, Jean-Pierre; Simoni, Anna
  14. Estimating standard errors for the Parks model: Can jackknifing help? By Reed, W. Robert; Webb, Rachel S.
  15. Spatial Chow-Lin Methods for Data Completion in Econometric Flow Models By Polasek, Wolfgang; Sellner, Richard
  16. Residual-based tests for cointegration and multiple deterministic structural breaks: A Monte Carlo study By Matteo Mogliani
  17. On the forecasting accuracy of multivariate GARCH models By LAURENT, Sébastien; ROMBOUTS, Jeroen V. K.; VIOLANTE, Francesco
  18. Estimation of operational value-at-risk in the presence of minimum collection threshold: An empirical study By Chernobai, Anna; Menn, Christian; Rachev, Svetlozar T.; Trück, Stefan
  19. Bayesian inference for hedge funds with stable distribution of returns By Güner, Biliana; Rachev, Svetlozar T.; Edelman, Daniel; Fabozzi, Frank J.
  20. Aggregation of exponential smoothing processes with an application to portfolio risk evaluation By SBRANA, Giacomo; SILVESTRINI, Andrea
  21. Endogeneity and Instrumental Variables in Dynamic Models By Florens, Jean-Pierre; Simon, Guillaume
  22. The Effect of Unobserved Heterogeneity in Stochastic Frontier Estimation: Comparison of Cross Section and Panel with Simulated Data for The Postal Sector By Cazals, Catherine; Dudley, Paul; Florens, Jean-Pierre; Jones, Michael
  23. Identification and estimation of sequential English auctions By Laurent Lamy
  24. A Cyclical Model of Exchange Rate Volatility By Evarist Stoja; Richard D. F. Harris; Fatih Yilmaz
  25. The Estimation of Causal Effects by Difference-in-Difference Methods By Michael Lechner
  26. A New Framework To Estimate the Risk-Neutral Probability Density Functions Embedded in Options Prices By Kevin C. Cheng

  1. By: Carlos Martins-Filho (Department of Economics, University of Colorado); Feng Yao (Department of Economics, West Virginia University)
    Abstract: We consider the estimation of a nonparametric stochastic frontier model with a composite error density which is known up to a finite parameter vector. Our primary interest is in the estimation of the parameter vector, as it provides the basis for estimation of firm-specific (in)efficiency. Our frontier model is similar to that of Fan et al. (1996), but here we extend their work in that: a) we establish the asymptotic properties of their estimation procedure, and b) we propose and establish the asymptotic properties of an alternative estimator based on the maximization of a conditional profile likelihood function. The estimator proposed in Fan et al. (1996) is asymptotically normally distributed but has a bias which does not vanish as the sample size n → ∞. In contrast, our proposed estimator is asymptotically normally distributed and correctly centered at the true value of the parameter vector. In addition, our estimator is shown to be efficient in a broad class of semiparametric estimators. Our estimation procedure provides a fast converging alternative to the recently proposed estimator in Kumbhakar et al. (2007). A Monte Carlo study is performed to shed light on the finite sample properties of these competing estimators.
    Keywords: stochastic frontier models; nonparametric frontiers; profile likelihood estimation.
    JEL: C14 C22
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:wvu:wpaper:10-09&r=ecm
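    For concreteness (this particular specification is not stated in the abstract): in the canonical normal-half-normal model of Aigner, Lovell and Schmidt (1977), where the composite error is \varepsilon = v - u with v \sim N(0, \sigma_v^2) and u the absolute value of a N(0, \sigma_u^2) variate, the composite density is known in closed form,

      f(\varepsilon) = \frac{2}{\sigma}\,\phi\!\left(\frac{\varepsilon}{\sigma}\right)
      \Phi\!\left(-\frac{\lambda\varepsilon}{\sigma}\right),
      \qquad \sigma^2 = \sigma_u^2 + \sigma_v^2, \quad \lambda = \sigma_u/\sigma_v,

    with \phi and \Phi the standard normal pdf and cdf; the finite parameter vector of the abstract then corresponds to (\sigma_u, \sigma_v).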
  2. By: Feng Yao (Department of Economics, West Virginia University); Junsen Zhang (Department of Economics, The Chinese University of Hong Kong)
    Abstract: We consider the estimation of a semiparametric regression model where the data are independently and identically distributed. Our primary interest is in the estimation of the parameter vector, where the associated regressors are correlated with the errors and contain both continuous and discrete variables. We propose three estimators by adapting the frameworks of Robinson (1988) and Li and Stengos (1996) and establish their asymptotic properties. They are asymptotically normally distributed and correctly centered at the true value of the parameter vector. Among a class of semiparametric IV estimators with conditional moment restriction, the first two are efficient under conditional homoskedasticity and the last one is efficient under heteroskedasticity. They allow the reduced form to be nonparametric, are asymptotically equivalent to semiparametric IV estimators that optimally select the instrument, and reach the semiparametric efficiency bounds in Chamberlain (1992). A Monte Carlo study is performed to shed light on the finite sample properties of these competing estimators. Their applicability is illustrated with an empirical data set.
    Keywords: Instrumental variables, semiparametric regression, efficient estimation.
    JEL: C14 C21
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:wvu:wpaper:10-11&r=ecm
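    A minimal numerical sketch of the double-residual idea behind such estimators (Python; the data-generating process, the bandwidth, and the helper nw_smooth are illustrative assumptions, not the authors' estimator):

      import numpy as np

      def nw_smooth(x, y, h):
          # Nadaraya-Watson kernel regression of y on scalar x (Gaussian kernel)
          d = (x[:, None] - x[None, :]) / h
          w = np.exp(-0.5 * d**2)
          return w @ y / w.sum(axis=1)

      rng = np.random.default_rng(0)
      n = 500
      z = rng.uniform(size=n)                            # exogenous smooth component
      w_iv = rng.normal(size=n)                          # instrument
      u = rng.normal(size=n)
      x = 0.8 * w_iv + np.sin(2 * np.pi * z) + 0.5 * u   # endogenous regressor
      y = 1.5 * x + np.cos(2 * np.pi * z) + u            # true beta = 1.5

      h = n ** (-1 / 5)                                  # rule-of-thumb bandwidth
      ey, ex, ew = (nw_smooth(z, v, h) for v in (y, x, w_iv))
      ry, rx, rw = y - ey, x - ex, w_iv - ew             # partial out the nonparametric part
      beta_iv = (rw @ ry) / (rw @ rx)                    # simple IV on the residuals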
  3. By: Johannes, Jan; Van Bellegem, Sébastien; Vanhems, Anne
    Abstract: We consider the nonparametric regression model with an additive error that is correlated with the explanatory variables. We suppose the existence of instrumental variables that are considered in this model for the identification and the estimation of the regression function. Nonparametric estimation by instrumental variables is an ill-posed linear inverse problem with an unknown but estimable operator. We provide a new estimator of the regression function using an iterative regularization method (the Landweber-Fridman method). The optimal number of iterations and the convergence of the mean square error of the resulting estimator are derived under both mild and severe degrees of ill-posedness. A Monte Carlo exercise shows the impact of some parameters on the estimator and demonstrates the reasonable finite sample performance of the new estimator.
    Keywords: Nonparametric estimation; Instrumental variable; Ill-posed inverse problem
    JEL: C14 C30
    Date: 2010–07–16
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:23124&r=ecm
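    A minimal numerical sketch of the Landweber iteration at the heart of the method (Python; the discretized operator, noise level, step size and stopping index are illustrative assumptions). The stopping index plays the role of the regularization parameter, which is what the paper's optimality analysis concerns:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      t = np.linspace(0, 1, n)
      K = np.tril(np.ones((n, n))) / n            # discretized integration operator: smoothing, hence ill-posed
      x_true = np.sin(np.pi * t)
      y = K @ x_true + 0.01 * rng.normal(size=n)  # noisy indirect observations

      tau = 1.0 / np.linalg.norm(K, 2) ** 2       # step size; convergence requires 0 < tau < 2/||K||^2
      x = np.zeros(n)
      for k in range(2000):                       # stopping early regularizes the solution
          x = x + tau * K.T @ (y - K @ x)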
  4. By: Concepción Ausín; Pedro Galeano; Pulak Ghosh
    Abstract: Financial time series analysis deals with the understanding of data collected on financial markets. Several parametric distribution models have been entertained for describing, estimating and predicting the dynamics of financial time series. Alternatively, this article considers a Bayesian semiparametric approach. In particular, the usual parametric distributional assumptions of the GARCH-type models are relaxed by considering the class of location-scale mixtures of Gaussian distributions with a Dirichlet process prior on the mixing distribution, leading to a Dirichlet process mixture model. The proposed specification allows for greater flexibility in capturing both the skewness and kurtosis frequently observed in financial returns. The Bayesian model provides statistical inference with finite sample validity. Furthermore, it is also possible to obtain predictive distributions for the Value at Risk (VaR), which has become the most widely used measure of market risk for practitioners. Through a simulation study, we demonstrate the performance of the proposed semiparametric method and compare results with the ones from a normal distribution assumption. We also demonstrate the superiority of our proposed semiparametric method using real data from the Bombay Stock Exchange Index (BSE-30) and the Hang Seng Index (HSI).
    Keywords: Bayesian estimation, Deviance information criterion, Dirichlet process mixture, Financial time series, Location-scale Gaussian mixture, Markov chain Monte Carlo
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws103822&r=ecm
  5. By: Florens, Jean-Pierre; Schwarz, Maik; Van Bellegem, Sébastien
    Abstract: A new nonparametric estimator of a production frontier is defined and studied when the data set of production units is contaminated by measurement error. The measurement error is assumed to be an additive normal random variable on the input variable, but its variance is unknown. The estimator is a modification of the m-frontier, which necessitates the computation of a consistent estimator of the conditional survival function of the input variable given the output variable. In this paper, the identification and the consistency of a new estimator of the survival function are proved in the presence of additive noise with unknown variance. The performance of the estimator is also studied through simulated data.
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:22897&r=ecm
  6. By: Florens, Jean-Pierre; Simoni, Anna
    Abstract: We propose a Quasi-Bayesian nonparametric approach to estimating the structural relationship φ among endogenous variables when instruments are available. We show that the posterior distribution of φ is inconsistent in the frequentist sense. We interpret this fact as the ill-posedness of the Bayesian inverse problem defined by the relation that characterizes the structural function φ. To solve this problem, we construct a regularized posterior distribution, based on a Tikhonov regularization of the inverse of the marginal variance of the sample, which is justified by a penalized projection argument. This regularized posterior distribution is consistent in the frequentist sense and its mean can be interpreted as the mean of the exact posterior distribution resulting from a Gaussian prior distribution with a shrinking covariance operator.
    JEL: C11 C14 C30
    Date: 2010–03
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:22895&r=ecm
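    For orientation (this is the deterministic prototype, not the paper's exact construction): Tikhonov regularization replaces the unstable inversion of K in the structural equation r = Kφ by

      \hat{\varphi}_\alpha = (\alpha I + K^* K)^{-1} K^* \hat{r}, \qquad \alpha > 0,

    where K^* is the adjoint of K and the penalty α shrinks to zero with the sample size; the regularized posterior applies an analogous penalization to the inverse of the marginal variance operator.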
  7. By: David I. Harvey; Stephen J. Leybourne; A. M. Robert Taylor
    Abstract: It is well known that it is vital to account for trend breaks when testing for a unit root. In practice, uncertainty exists over whether or not a trend break is present and, if it is, where it is located. Harris et al. (2009) and Carrion-i-Silvestre et al. (2009) propose procedures which account for both of these forms of uncertainty. Each uses what amounts to a pre-test for a trend break, accounting for a trend break (with the associated break fraction estimated from the data) in the unit root procedure only where the pre-test signals a break. Assuming the break magnitude is fixed (independent of sample size), these authors show that their methods achieve near asymptotically efficient unit root inference in both trend break and no trend break environments. These asymptotic results are, however, somewhat at odds with the finite sample simulations reported in both papers. These show the presence of pronounced "valleys" in the finite sample power functions (when mapped as functions of the break magnitude) of the tests, such that power is initially high for very small breaks, then decreases as the break magnitude increases, before increasing again. Here we show that treating the break magnitude as local to zero (in a Pitman drift sense) allows the asymptotic analysis to very closely approximate this finite sample effect, thereby providing useful analytical insights into the observed phenomenon. In response to this problem we propose practical solutions, based either on the use of a with-break unit root test with adaptive critical values, or on a union of rejections principle taken across with-break and without-break unit root tests. The former is shown to eliminate power valleys but at the expense of power when no break is present, while the latter considerably mitigates the valleys while not losing all the power gains available when no break exists.
    Keywords: Unit root test; local trend break; asymptotic local power; union of rejections; adaptive critical values
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:not:notgts:10/05&r=ecm
  8. By: Peter Fuleky (University of Hawaii); Eric Zivot (University of Washington)
    Abstract: The Efficient Method of Moments (EMM) estimator popularized by Gallant and Tauchen (1996) is an indirect inference estimator based on the simulated auxiliary score evaluated at the sample estimate of the auxiliary parameters. We study an alternative estimator that uses the sample auxiliary score evaluated at the simulated binding function which maps the structural parameters of interest to the auxiliary parameters. We show that the alternative estimator has the same asymptotic properties as the EMM estimator but in finite samples behaves more like the distance-based indirect inference estimator of Gourieroux, Monfort and Renault (1993).
    Date: 2010–06
    URL: http://d.repec.org/n?u=RePEc:udb:wpaper:uwec-2010-08&r=ecm
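    A sketch of the two criteria, in notation assumed here rather than taken from the paper: let f(y|ψ) be the auxiliary density, let the sample estimate of ψ come from the observed data, and let the binding function be estimated from data simulated at the structural parameter θ. The EMM estimator solves

      \hat\theta = \arg\min_\theta\; m_S(\theta;\hat\psi)'\,\hat{W}\,m_S(\theta;\hat\psi),
      \qquad m_S(\theta;\hat\psi) = \frac{1}{S}\sum_{s=1}^{S}
      \frac{\partial \log f(\tilde{y}_s(\theta)\mid\psi)}{\partial\psi}\bigg|_{\psi=\hat\psi},

    while the alternative estimator swaps the roles of sample and simulation, evaluating the sample score at the simulated binding function:

      \tilde\theta = \arg\min_\theta\; m_T(\theta)'\,\hat{W}\,m_T(\theta),
      \qquad m_T(\theta) = \frac{1}{T}\sum_{t=1}^{T}
      \frac{\partial \log f(y_t\mid\psi)}{\partial\psi}\bigg|_{\psi=\tilde\psi(\theta)}.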
  9. By: Dong, Yingying
    Abstract: Regression Discontinuity (RD) models identify local treatment effects by associating a discrete change in the mean outcome with a corresponding discrete change in the probability of treatment at a known threshold of a running variable. This paper shows that it is possible to identify RD model treatment effects without a discontinuity. The intuition is that identification can come from a slope change (a kink) instead of a discrete level change (a jump) in the treatment probability. Formally this can be shown using L'Hôpital's rule. The identification results are interpreted intuitively using instrumental variable models. Estimators are proposed that can be applied in the presence or absence of a discontinuity, by exploiting either a jump or a kink.
    Keywords: Regression Discontinuity; Fuzzy design; Average treatment effect; Identification; Jump; Kink; Threshold
    JEL: C21 C25
    Date: 2010–08–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:25461&r=ecm
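    The L'Hôpital argument can be made concrete (standard fuzzy RD notation, assumed here): with running variable X, threshold c, outcome Y and treatment T, the fuzzy RD estimand is

      \tau = \frac{\lim_{x\downarrow c} E[Y|X=x] - \lim_{x\uparrow c} E[Y|X=x]}
                  {\lim_{x\downarrow c} E[T|X=x] - \lim_{x\uparrow c} E[T|X=x]},

    and when both jumps vanish while the slopes change at c, the same ratio applied to one-sided derivatives still identifies the effect:

      \tau = \frac{\partial_{+} E[Y|X=x]\big|_{x=c} - \partial_{-} E[Y|X=x]\big|_{x=c}}
                  {\partial_{+} E[T|X=x]\big|_{x=c} - \partial_{-} E[T|X=x]\big|_{x=c}}.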
  10. By: Evarist Stoja; Arnold Polanski
    Abstract: We propose two simple evaluation methods for time-varying density forecasts of continuous higher-dimensional random variables. Both methods are based on the probability integral transformation for unidimensional forecasts. The first method tests multinormal densities and relies on the rotation of the coordinate system. The advantage of the second method is not only its applicability to any continuous distribution but also the evaluation of the forecast accuracy in specific regions of its domain as defined by the user’s interest. We show that the latter property is particularly useful for evaluating a multidimensional generalization of the Value at Risk. In simulations and in an empirical study, we examine the performance of both tests.
    Keywords: Multivariate Density Forecast Evaluation, Probability Integral Transformation, Multidimensional Value at Risk, Monte Carlo Simulations
    JEL: C52 C53
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:bri:uobdis:09/617&r=ecm
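    A minimal sketch of the unidimensional building block, the probability integral transformation (Python; the data-generating process and the choice of a Kolmogorov-Smirnov uniformity test are illustrative assumptions):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      y = rng.standard_t(df=5, size=1000)    # realized outcomes with fat tails
      z = stats.norm.cdf(y)                  # PIT under a misspecified N(0,1) forecast
      # Under a correct density forecast, z would be iid U(0,1)
      ks_stat, p_value = stats.kstest(z, "uniform")   # expect a rejection here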
  11. By: Carlos Martins-Filho (Department of Economics, University of Colorado); Feng Yao (Department of Economics, West Virginia University)
    Abstract: The sum of two independent random variables with normal and half normal densities has a skew-normal density (Azzalini, 1985). In this note we show that this skew-normal density satisfies all assumptions required in establishing the asymptotic properties of the estimators discussed in Martins-Filho and Yao (2010).
    Keywords: skew-normal density; semi-parametric stochastic frontiers.
    JEL: C14 C22
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:wvu:wpaper:10-10&r=ecm
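    For reference, the Azzalini (1985) skew-normal density with shape parameter α is

      f(x; \alpha) = 2\,\phi(x)\,\Phi(\alpha x),

    which reduces to the standard normal at α = 0; after rescaling, the sum v + u of the abstract, with v ~ N(0, σ_v²) and u half-normal with scale σ_u, is skew-normal with α = σ_u/σ_v.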
  12. By: Ewa M. Syczewska (Warsaw School of Economics)
    Abstract: The aim of this paper is to study the properties of the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test, introduced in Kwiatkowski et al. (1992). The null hypothesis of the test corresponds to stationarity of a series, the alternative to its nonstationarity. The distribution of the test statistic is nonstandard and, as shown in the original paper, converges asymptotically to a functional of Brownian bridges. The authors produced tables of critical values based on this asymptotic approximation. Here we present the results of a simulation experiment aimed at studying the small sample properties of the test and its empirical power.
    Keywords: KPSS test, stationarity, integration, empirical power of KPSS test
    JEL: C12 C16
    Date: 2010–09–23
    URL: http://d.repec.org/n?u=RePEc:wse:wpaper:45&r=ecm
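    A minimal sketch of such a power experiment (Python, using the KPSS implementation in statsmodels; the sample size, replication count and random-walk alternative are illustrative assumptions):

      import numpy as np
      from statsmodels.tsa.stattools import kpss

      rng = np.random.default_rng(2)
      T, reps, rejections = 200, 1000, 0
      for _ in range(reps):
          y = np.cumsum(rng.normal(size=T))   # random walk: a nonstationary alternative
          stat, p, lags, crit = kpss(y, regression="c", nlags="auto")
          rejections += stat > crit["5%"]     # reject the stationarity null at the 5% level
      power = rejections / reps               # empirical power against this alternative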
  13. By: Florens, Jean-Pierre; Simoni, Anna
    Abstract: We consider statistical linear inverse problems in Hilbert spaces of the type Ŷ = Kx + U, where we want to estimate the function x from indirect noisy functional observations Ŷ. In several applications the operator K has an inverse that is not continuous on the whole space of reference; this phenomenon is known as ill-posedness of the inverse problem. We use a Bayesian approach and a conjugate-Gaussian model. For a very general specification of the probability model the posterior distribution of x is known to be inconsistent in a frequentist sense. Our first contribution consists in constructing a class of Gaussian prior distributions on x that are shrinking with the measurement error U, and we show that, under mild conditions, the corresponding posterior distribution is consistent in a frequentist sense and converges at the optimal rate of contraction. Then, a class of posterior mean estimators for x is given. We propose an empirical Bayes procedure for selecting an estimator in this class that mimics the posterior mean that has the smallest risk on the true x.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:22884&r=ecm
  14. By: Reed, W. Robert; Webb, Rachel S.
    Abstract: Non-spherical errors, namely heteroscedasticity, serial correlation and cross-sectional correlation, are commonly present within panel data sets. These can cause significant problems for econometric analyses. The FGLS(Parks) estimator has been demonstrated to produce considerable efficiency gains in these settings. However, it suffers from underestimation of coefficient standard errors, oftentimes severe. Potentially, jackknifing the FGLS(Parks) estimator could allow one to maintain the efficiency advantages of FGLS(Parks) while producing more reliable estimates of coefficient standard errors. Accordingly, this study investigates the performance of the jackknife estimator of FGLS(Parks) using Monte Carlo experimentation. We find that jackknifing can, in narrowly defined situations, substantially improve the estimation of coefficient standard errors. However, its overall performance is not sufficient to make it a viable alternative to other panel data estimators.
    Keywords: Panel data estimation, Parks model, cross-sectional correlation, jackknife, Monte Carlo
    JEL: C23 C15
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:201023&r=ecm
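    A minimal sketch of delete-one jackknife standard errors, shown here for plain OLS rather than FGLS(Parks) (Python; the data-generating process is an illustrative assumption):

      import numpy as np

      def ols_beta(X, y):
          # OLS coefficient vector
          return np.linalg.lstsq(X, y, rcond=None)[0]

      rng = np.random.default_rng(3)
      n = 100
      X = np.column_stack([np.ones(n), rng.normal(size=n)])
      y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)

      beta_full = ols_beta(X, y)
      # Recompute the estimator n times, each time deleting one observation
      jk = np.array([ols_beta(np.delete(X, i, axis=0), np.delete(y, i)) for i in range(n)])
      pseudo = n * beta_full - (n - 1) * jk               # jackknife pseudo-values
      se_jack = pseudo.std(axis=0, ddof=1) / np.sqrt(n)   # jackknife standard errors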
  15. By: Polasek, Wolfgang (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria, and Faculty of Science, University of Porto, Porto, Portugal); Sellner, Richard (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria)
    Abstract: Flow data across regions can be modeled by spatial econometric models; see LeSage and Pace (2009). Recently, regional studies have become interested in the aggregation and disaggregation of flow models, because trade data often cannot be obtained at a disaggregated level and are published only at an aggregate level. Furthermore, missing data in disaggregated flow models occur quite often, since detailed measurements are often not possible at all observation points in time and space. In this paper we develop classical and Bayesian methods to complete flow data. The Chow and Lin (1971) method was developed for completing disaggregated incomplete time series data. We extend this method in a general framework to spatially correlated flow data using the cross-sectional Chow-Lin method of Polasek et al. (2009). The missing disaggregated data can be obtained either by feasible GLS prediction or by a Bayesian (posterior) predictive density.
    Keywords: Missing values in spatial econometrics, MCMC, non-spatial Chow-Lin (CL) and spatial Chow-Lin (SCL) methods, spatial internal flow (SIF) models, origin and destination (OD) data
    JEL: C11 C15 C52 E17 R12
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:ihs:ihsesp:255&r=ecm
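    In the classical non-spatial case, with aggregation matrix C, indicator matrix X and disturbance covariance Σ, the Chow-Lin GLS predictor of the disaggregate series y is

      \hat\beta = \left[X'C'(C\Sigma C')^{-1}CX\right]^{-1} X'C'(C\Sigma C')^{-1} y_a,
      \qquad
      \hat{y} = X\hat\beta + \Sigma C'(C\Sigma C')^{-1}\left(y_a - CX\hat\beta\right),

    where y_a = Cy is the observed aggregate; the spatial extensions replace Σ by a covariance implied by a spatial autoregressive structure.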
  16. By: Matteo Mogliani
    Abstract: The aim of this paper is to study the performance of residual-based tests for cointegration in the presence of multiple deterministic structural breaks via Monte Carlo simulations. We consider the KPSS-type LM tests proposed in Carrion-i-Silvestre and Sansò (2006) and in Bartley, Lee and Strazicich (2001), as well as the Schmidt and Phillips-type LM tests proposed in Westerlund and Edgerton (2007). This exercise allows us to cover a wide set of single-equation cointegration estimators. Monte Carlo experiments reveal a trade-off between size and power distortions across tests and models. KPSS-type tests display large size distortions under multiple-break scenarios, while Schmidt and Phillips-type tests appear well-sized across all simulations. However, when regressors are endogenous, the former group of tests displays quite high power against the alternative hypothesis, while the latter shows severely low power.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:pse:psecon:2010-22&r=ecm
  17. By: LAURENT, Sébastien (Maastricht University, The Netherlands; Université catholique de Louvain, CORE, B-1348 Louvain-la-Neuve, Belgium); ROMBOUTS, Jeroen V. K. (HEC Montréal, CIRANO, CIRPEE; Université catholique de Louvain, CORE, B-1348 Louvain-la-Neuve, Belgium); VIOLANTE, Francesco (Université de Namur, CeReFim, B-5000 Namur, Belgium; Université catholique de Louvain, CORE, B-1348 Louvain-la-Neuve, Belgium)
    Abstract: This paper addresses the question of the selection of multivariate GARCH models in terms of variance matrix forecasting accuracy, with a particular focus on relatively large scale problems. We consider 10 assets from NYSE and NASDAQ and compare 125 model-based one-step-ahead conditional variance forecasts over a period of 10 years using the model confidence set (MCS) and the Superior Predictive Ability (SPA) tests. Model performances are evaluated using four statistical loss functions which account for different types and degrees of asymmetry with respect to over/under predictions. When considering the full sample, MCS results are strongly driven by short periods of high market instability during which multivariate GARCH models appear to be inaccurate. Over relatively unstable periods, such as the dot-com bubble, the set of superior models is composed of more sophisticated specifications such as orthogonal and dynamic conditional correlation (DCC), both with leverage effect in the conditional variances. However, unlike the DCC models, our results show that the orthogonal specifications tend to underestimate the conditional variance. Over calm periods, a simple assumption like constant conditional correlation and symmetry in the conditional variances cannot be rejected. Finally, during the 2007-2008 financial crisis, accounting for non-stationarity in the conditional variance process generates superior forecasts. The SPA test suggests that, independently of the period, the best models do not provide significantly better forecasts than the DCC model of Engle (2002) with leverage in the conditional variances of the returns.
    Keywords: variance matrix, forecasting, multivariate GARCH, loss function, model confidence set, superior predictive ability
    JEL: C10 C32 C51 C52 C53 G10
    Date: 2010–05–01
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2010025&r=ecm
  18. By: Chernobai, Anna; Menn, Christian; Rachev, Svetlozar T.; Trück, Stefan
    Abstract: The recently finalized Basel II Capital Accord requires banks to adopt a procedure to estimate the operational risk capital charge. The Advanced Measurement Approaches, currently mandated for all large internationally active US banks, require the use of historical operational loss data. Operational loss databases are typically subject to a minimum recording threshold of roughly $10,000. We demonstrate that ignoring such thresholds leads to biases in the corresponding parameter estimates. Using publicly available operational loss data, we analyze the effects of model misspecification on the resulting expected loss, Value-at-Risk, and Conditional Value-at-Risk figures and show that underestimation of the regulatory capital is a consequence of such model error. The choice of an adequate loss distribution is conducted via in-sample goodness-of-fit procedures and backtesting, using both classical and robust methodologies.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:zbw:kitwps:4&r=ecm
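    The bias mechanism is standard left truncation: losses below the threshold H never enter the database, so recorded losses are draws from the conditional density

      f_H(x; \theta) = \frac{f(x; \theta)}{1 - F(H; \theta)}, \qquad x \geq H,

    and fitting the unconditional f(·; θ) to such data biases the parameter estimates and hence the resulting VaR figures; maximizing the likelihood built from f_H corrects for the threshold.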
  19. By: Güner, Biliana; Rachev, Svetlozar T.; Edelman, Daniel; Fabozzi, Frank J.
    Abstract: Recently, a body of academic literature has focused on the area of stable distributions and their application potential for improving our understanding of the risk of hedge funds. At the same time, research has sprung up that applies standard Bayesian methods to hedge fund evaluation. Little or no academic attention has been paid to the combination of these two topics. In this paper, we consider Bayesian inference for alpha-stable distributions with particular regard to hedge fund performance and risk assessment. After constructing Bayesian estimators for alpha-stable distributions in the context of an ARMA-GARCH time series model with stable innovations, we compare our risk evaluation and prediction results to the predictions of several competing conditional and unconditional models that are estimated in both the frequentist and Bayesian setting. We find that the conditional Bayesian model with stable innovations has superior risk prediction capabilities compared with other approaches and, in particular, produced better risk forecasts of the abnormally large losses that some hedge funds sustained in the months of September and October 2008.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:zbw:kitwps:1&r=ecm
  20. By: SBRANA, Giacomo (Université de Strasbourg, BETA, F-67085 Strasbourg, France); SILVESTRINI, Andrea (Bank of Italy, Economics, Research and International Relations Area, Economic and Financial Statistics Department, I-00184 Roma, Italy)
    Abstract: In this paper we propose a unified framework to analyse contemporaneous and temporal aggregation of exponential smoothing (EWMA) models. Focusing on a vector IMA(1,1) model, we obtain a closed form representation for the parameters of the contemporaneously and temporally aggregated process as a function of the parameters of the original one. In the framework of EWMA estimates of volatility, we present an application dealing with Value-at-Risk (VaR) prediction at different sampling frequencies for an equally weighted portfolio composed of multiple indices. We apply the aggregation results by inferring the decay factor in the portfolio volatility equation from the estimated vector IMA(1,1) model of squared returns. Empirical results show that VaR predictions delivered using this suggested approach are at least as accurate as those obtained by applying the standard univariate RiskMetrics™ methodology.
    Keywords: contemporaneous and temporal aggregation, EWMA, volatility, Value-at-Risk
    JEL: C10 C32 C43
    Date: 2010–07–01
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2010039&r=ecm
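    For reference, the RiskMetrics-style EWMA recursion with decay factor λ (0.94 for daily data in the standard methodology) and the associated Gaussian VaR for a unit position are

      \hat\sigma_t^2 = \lambda\,\hat\sigma_{t-1}^2 + (1-\lambda)\,r_{t-1}^2,
      \qquad VaR_t(\alpha) = z_{1-\alpha}\,\hat\sigma_t;

    EWMA forecasts are optimal for an IMA(1,1) process, which is the link that lets the paper infer the portfolio decay factor from the estimated vector IMA(1,1) parameters.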
  21. By: Florens, Jean-Pierre; Simon, Guillaume
    Abstract: The objective of the paper is to develop the theory of endogeneity in dynamic models in discrete and continuous time, in particular for diffusions and counting processes. We first provide an extension of the separable set-up to a separable dynamic framework given in terms of a semi-martingale decomposition. Then we define our function of interest as a stopping time for an additional noise process, whose role is played by a Brownian motion for diffusions and by a Poisson process for counting processes.
    JEL: C14 C32 C51
    Date: 2010–04
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:22896&r=ecm
  22. By: Cazals, Catherine; Dudley, Paul; Florens, Jean-Pierre; Jones, Michael
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:22880&r=ecm
  23. By: Laurent Lamy
    Abstract: Brendstrup (2007) and Brendstrup and Paarsch (2006) claim that sequential English auction models with multi-unit demand can be identified from the distribution of the last stage winning price and without any assumption on bidding behavior in the earliest stages. We show that their identification strategy is not correct and that non-identification occurs even if equilibrium behavior is assumed in the earliest stages. For two-stage sequential auctions, an estimation procedure that has an equilibrium foundation and that uses the winning price at both stages is developed and supported by Monte Carlo experiments. Identification under general affiliated multi-unit demand schemes is also investigated.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:pse:psecon:2010-16&r=ecm
  24. By: Evarist Stoja; Richard D. F. Harris; Fatih Yilmaz
    Abstract: In this paper, we investigate the long run dynamics of the intraday range of the GBP/USD, JPY/USD and CHF/USD exchange rates. We use a non-parametric filter to extract the low frequency component of the intraday range, and model the cyclical deviation of the range from the long run trend as a stationary autoregressive process. We find that the long run trend is time-varying but highly persistent, while the cyclical component is strongly mean reverting. This has important implications for modelling and forecasting volatility over both short and long horizons. As an illustration, we use the cyclical volatility model to generate out-of-sample forecasts of exchange rate volatility for horizons of up to one year under the assumption that the long run trend is fully persistent. As a benchmark, we compare the forecasts of the cyclical volatility model with those of the two-factor intraday range-based EGARCH model of Brandt and Jones (2006). Not only is the cyclical volatility model significantly easier to estimate than the EGARCH model, but it also offers a substantial improvement in out-of-sample forecast performance.
    Keywords: Conditional volatility, Intraday range, Hodrick-Prescott filter
    JEL: C15 C22
    Date: 2010–10
    URL: http://d.repec.org/n?u=RePEc:bri:uobdis:10/618&r=ecm
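    A minimal sketch of the trend/cycle decomposition step (Python, using the Hodrick-Prescott filter named in the keywords; the placeholder series and the smoothing parameter are illustrative assumptions):

      import numpy as np
      from statsmodels.tsa.filters.hp_filter import hpfilter

      rng = np.random.default_rng(4)
      # Placeholder for the log intraday range: a slow-moving trend plus noise
      log_range = np.cumsum(0.01 * rng.normal(size=1000)) + 0.3 * rng.normal(size=1000)

      cycle, trend = hpfilter(log_range, lamb=129600)   # the smoothing parameter is a tuning choice
      phi = np.polyfit(cycle[:-1], cycle[1:], 1)[0]     # AR(1) coefficient of the cyclical component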
  25. By: Michael Lechner
    Abstract: This survey gives a brief overview of the literature on the difference-in-difference (DiD) estimation strategy and discusses major issues using a treatment effect perspective. In this sense, it gives a somewhat different view on DiD than the standard textbook discussion of the difference-in-difference model, though it is also not as complete as the latter. The survey also contains a couple of extensions to the literature, for example a discussion of, and suggestions for, non-linear DiD as well as DiD based on propensity-score type matching methods.
    Keywords: Causal inference, counterfactual analysis, before-after-treatment-control design, control group design with pretest and posttest
    JEL: C21 C23 C31 C33
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:usg:dp2010:2010-28&r=ecm
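    The basic two-group, two-period estimand underlying the survey is

      \hat\tau_{DiD} = \left(\bar{Y}_{treat,post} - \bar{Y}_{treat,pre}\right)
                     - \left(\bar{Y}_{control,post} - \bar{Y}_{control,pre}\right),

    which identifies the average treatment effect on the treated under a common-trends assumption; the survey's extensions (non-linear DiD, propensity-score matching DiD) relax the functional form implicit in this simple difference.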
  26. By: Kevin C. Cheng
    Abstract: Building on the widely used double-lognormal approach of Bahra (1997), this paper presents a multi-lognormal approach with restrictions to extract risk-neutral probability density functions (RNPs) for various asset classes. The contributions are twofold: first, on the technical side, the paper proposes useful transformations/restrictions to Bahra’s original formulation for achieving economically sensible outcomes; second, the paper compares the statistical properties of the estimated RNPs among major asset classes, including commodities, the S&P 500, the dollar/euro exchange rate, and the US 10-year Treasury Note. Finally, a Monte Carlo study suggests that the multi-lognormal approach outperforms the double-lognormal approach.
    Date: 2010–08–02
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:10/181&r=ecm
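    In Bahra's (1997) double-lognormal approach, which the paper generalizes, the risk-neutral density is the mixture

      q(S_T) = \theta\,\ell(S_T; \mu_1, \sigma_1) + (1-\theta)\,\ell(S_T; \mu_2, \sigma_2),
      \qquad 0 \leq \theta \leq 1,

    where \ell denotes a lognormal density and the parameters are chosen to minimize squared deviations between observed option prices and those implied by q; this is consistent with the Breeden-Litzenberger result that q(K) = e^{r\tau}\,\partial^2 C/\partial K^2 for call prices C(K).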

This nep-ecm issue is ©2010 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.