nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒06‒27
fifty-four papers chosen by
Sune Karlsson
Orebro University

  1. A Range-Based Test for the Parametric Form of the Volatility in Diffusion Models By Mark Podolskij; Daniel Ziggel
  2. Estimation of Volatility Functionals in the Simultaneous Presence of Microstructure Noise and Jumps By Mark Podolskij; Mathias Vetter
  3. New tests for jumps: a threshold-based approach By Mark Podolskij; Daniel Ziggel
  4. An Econometric Analysis of Modulated Realised Covariance, Regression and Correlation in Noisy Diffusion Models By Silja Kinnebrock; Mark Podolskij
  5. The Pearson diffusions: A class of statistically tractable diffusion processes By Michael Sørensen; Julie Lyng Forman
  6. Bipower-type estimation in a noisy diffusion setting By Mark Podolskij; Mathias Vetter
  7. Semiparametric estimation of duration models when the parameters are subject to inequality constraints and the error distribution is unknown By Kulan Ranasinghe; Mervyn J. Silvapulle
  8. Optimal Inference for Instrumental Variables Regression with non-Gaussian Errors By Matias D. Cattaneo; Richard K. Crump; Michael Jansson
  9. Bias-reduced estimation of long memory stochastic volatility By Per Frederiksen; Morten Ørregaard Nielsen
  10. Parametric inference for discretely sampled stochastic differential equations By Michael Sørensen
  11. Inference regarding multiple structural changes in linear models estimated via two stage least squares By Hall, Alastair R.; Han, Sanggohn; Boldea, Otilia
  12. Estimation of semiparametric stochastic frontiers under shape constraints with application to pollution generating technologies By Kortelainen, Mika
  13. Local polynomial Whittle estimation of perturbed fractional processes By Per Frederiksen; Frank S. Nielsen; Morten Ørregaard Nielsen
  14. An analysis of the indicator saturation estimator as a robust regression estimator By Søren Johansen; Bent Nielsen
  15. Testing for conditional heteroscedasticity in the components of inflation By Carmen Broto; Esther Ruiz
  16. Microstructure Noise in the Continuous Case: The Pre-Averaging Approach - JLMPV-9 By Jean Jacod; Yingying Li; Per A. Mykland; Mark Podolskij; Mathias Vetter
  17. Local Linear Density Estimation for Filtered Survival Data, with Bias Correction By Jens Perch Nielsen; Carsten Tanggaard; M.C. Jones
  18. Semiparametric Power Envelopes for Tests of the Unit Root Hypothesis By Michael Jansson
  19. Efficient estimation for ergodic diffusions sampled at high frequency By Michael Sørensen
  20. Use of Propensity Scores in Non-Linear Response Models: The Case for Health Care Expenditures By Anirban Basu; Daniel Polsky; Willard G. Manning
  21. Estimating High-Frequency Based (Co-) Variances: A Unified Approach By Ingmar Nolte; Valeri Voev
  22. Forecasting with the age-period-cohort model and the extended chain-ladder model By D. Kuang; Bent Nielsen; J. P. Nielsen
  23. Non-linear DSGE Models, The Central Difference Kalman Filter, and The Mean Shifted Particle Filter By Martin Møller Andreasen
  24. Likelihood-Based Inference in Nonlinear Error-Correction Models By Dennis Kristensen; Anders Rahbek
  25. Local polynomial Whittle estimation covering non-stationary fractional processes By Frank S. Nielsen
  26. Inference for the jump part of quadratic variation of Itô semimartingales By Almut Veraart
  27. Power variation for Gaussian processes with stationary increments By Ole E. Barndorff-Nielsen; José Manuel Corcuera; Mark Podolskij
  28. Multiplicative Measurement Error and the Simulation Extrapolation Method By Elena Biewen; Sandra Nolte; Martin Rosemann
  29. Testing a Model of the UK by the Method of Indirect Inference By Meenagh, David; Minford, Patrick; Theodoridis, Konstantinos
  30. Reduced-Rank Regression: A Useful Determinant Identity By Peter Reinhard Hansen
  31. Volatility extraction using the Kalman filter By Alexandr Kuchynka
  32. Continuous-Time Models, Realized Volatilities, and Testable Distributional Implications for Daily Stock Returns By Torben G. Andersen; Tim Bollerslev; Per Houmann Frederiksen; Morten Ørregaard Nielsen
  33. Small Bandwidth Asymptotics for Density-Weighted Average Derivatives By Matias D. Cattaneo; Richard K. Crump; Michael Jansson
  34. Bayesian Analysis of a Probit Panel Data Model with Unobserved Individual Heterogeneity and Autocorrelated Errors By Martin Burda; Roman Liesenfeld; Jean-Francois Richard
  35. Structural estimation of jump-diffusion processes in macroeconomics By Olaf Posch
  36. Jumps and Betas: A New Framework for Disentangling and Estimating Systematic Risks By Viktor Todorov; Tim Bollerslev
  37. Representation and Weak Convergence of Stochastic Integrals with Fractional Integrator Processes By James Davidson; Nigar Hashimzade
  38. Optimal Instrumental Variables Generators Based on Improved Hausman Regression, with an Application to Hedge Funds Returns By Francois-Éric Racicot; Raymond Théoret
  39. Modeling Dependencies in Finance using Copulae By Wolfgang Härdle; Ostap Okhrin; Yarema Okhrin
  40. A Discrete-Time Model for Daily S&P500 Returns and Realized Variations: Jumps and Leverage Effects By Tim Bollerslev; Uta Kretschmer; Christian Pigorsch; George Tauchen
  41. Dynamic Estimation of Volatility Risk Premia and Investor Risk Aversion from Option-Implied and Realized Volatilities By Tim Bollerslev; Michael Gibson; Hao Zhou
  42. Long Memory in Stock Market Volatility and the Volatility-in-Mean Effect: The FIEGARCH-M Model By Bent Jesper Christensen; Morten Ørregaard Nielsen; Jie Zhu
  43. A Reduced Form Framework for Modeling Volatility of Speculative Prices based on Realized Variation Measures By Torben G. Andersen; Tim Bollerslev; Xin Huang
  44. Ensuring the Validity of the Micro Foundation in DSGE Models By Martin Møller Andreasen
  45. Determinants of Birthweight Outcomes: Quantile Regressions Based on Panel Data By Stefan Holst Bache; Christian M. Dahl; Johannes Tang
  46. Bipower variation for Gaussian processes with stationary increments By Ole E. Barndorff-Nielsen; José Manuel Corcuera; Mark Podolskij; Jeannette H.C. Woerner
  47. The Role of Implied Volatility in Forecasting Future Realized Volatility and Jumps in Foreign Exchange, Stock, and Bond Markets By Thomas Busch; Thomas Busch; Bent Jesper Christensen; Morten Ørregaard Nielsen
  48. How to Maximize the Likelihood Function for a DSGE Model By Martin Møller Andreasen
  49. Risk, Jumps, and Diversification By Tim Bollerslev; Tzuo Hann Law; George Tauchen
  50. An iterated GMM procedure for estimating the Campbell-Cochrane habit formation model, with an application to Danish stock and bond returns By Tom Engsted; Stig V. Møller
  51. Pricing Volatility of Stock Returns with Volatile and Persistent Components By Jie Zhu
  52. Handling class imbalance in customer churn prediction By J. BUREZ; D. VAN DEN POEL
  53. Estimation of Random Coefficient Demand Models: Challenges, Difficulties and Warnings By Christopher R. Knittel; Konstantinos Metaxoglou
  54. Trygve Haavelmo’s visit in Aarhus 1938-39 By Olav Bjerkholt

  1. By: Mark Podolskij; Daniel Ziggel (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We propose a new test for the parametric form of the volatility function in continuous-time diffusion models of the type dX_t = a(t,X_t)dt + σ(t,X_t)dW_t. Our approach involves a range-based estimation of the integrated volatility and the integrated quarticity, which are used to construct the test statistic. Under rather weak assumptions on the drift and volatility we prove weak convergence of the test statistic to a centered mixed Gaussian distribution. As a consequence we obtain a test which is consistent against any fixed alternative. We also provide a test for neighborhood hypotheses. Moreover, we present a parametric bootstrap procedure which provides a better approximation of the distribution of the test statistic. Finally, a Monte Carlo study demonstrates that the range-based test is more powerful than the return-based test when compared at the same sampling frequency.
    Keywords: Bipower Variation, Central Limit Theorem, Diffusion Models, Goodness-of-Fit Testing, High-Frequency Data, Integrated Volatility, Range-Based Bipower Variation, Semimartingale Theory
    JEL: C12 C14
    Date: 2008–05–14
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-22&r=ecm
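    The contrast drawn in this abstract — range-based versus return-based estimation of integrated volatility — can be illustrated with a toy simulation. The sketch below uses a generic Parkinson-style range estimator under constant volatility; it is not the authors' test statistic, and all parameter values are illustrative assumptions.

```python
import numpy as np

def range_based_iv(high, low):
    # Parkinson-style proxy: sum of squared log-ranges, scaled by 1/(4 ln 2)
    r = np.log(high) - np.log(low)
    return np.sum(r ** 2) / (4.0 * np.log(2.0))

def return_based_iv(prices):
    # Realized variance: sum of squared log-returns
    r = np.diff(np.log(prices))
    return np.sum(r ** 2)

# Simulate one "day" of a constant-volatility price path (values illustrative)
rng = np.random.default_rng(0)
n_blocks, ticks, sigma = 78, 60, 0.2            # true IV = sigma^2 = 0.04
dt = 1.0 / (n_blocks * ticks)
path = np.exp(np.cumsum(sigma * np.sqrt(dt) * rng.standard_normal(n_blocks * ticks)))
blocks = path.reshape(n_blocks, ticks)

rv_ret = return_based_iv(np.concatenate(([1.0], path)))      # tick-level RV
rv_range = range_based_iv(blocks.max(axis=1), blocks.min(axis=1))
```

    Note that the discretely observed high/low understates the continuous range, so the range-based proxy is biased slightly downward when each block contains few ticks.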
  2. By: Mark Podolskij; Mathias Vetter (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We propose a new concept of modulated bipower variation for diffusion models with microstructure noise. We show that this method provides simple estimates for such important quantities as integrated volatility or integrated quarticity. Under mild conditions the consistency of modulated bipower variation is proven. Under further assumptions we prove stable convergence of our estimates with the optimal rate n^(-1/4). Moreover, we construct estimates which are robust to finite activity jumps.
    Keywords: Bipower Variation, Central Limit Theorem, Finite Activity Jumps, High-Frequency Data, Integrated Volatility, Microstructure Noise, Semimartingale Theory, Subsampling
    JEL: C10 C13 C14
    Date: 2007–09–19
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-27&r=ecm
  3. By: Mark Podolskij; Daniel Ziggel (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: In this paper we propose a test to determine whether jumps are present in a discretely sampled process or not. We use the concept of truncated power variation to construct our test statistics for (i) semimartingale models and (ii) semimartingale models with noise. The test statistics converge to infinity if jumps are present and have a normal distribution otherwise. Our method is valid (under very weak assumptions) for all semimartingales with absolutely continuous characteristics and a rather general model for the noise process. We finally implement the test and present the simulation results. Our simulations suggest that for semimartingale models the new test is much more powerful than the tests proposed by Barndorff-Nielsen and Shephard (2006) and Aït-Sahalia and Jacod (2008).
    Keywords: Central Limit Theorem, High-Frequency Data, Microstructure Noise, Semimartingale Theory, Tests for Jumps, Truncated Power Variation
    JEL: C10 C13 C14
    Date: 2008–06–20
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-34&r=ecm
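    The threshold idea behind truncated power variation can be sketched in a few lines: increments larger than a vanishing cutoff u_n = c·n^(-ϖ), ϖ in (0, 1/2), are attributed to jumps and discarded. This is a generic illustration of the principle, not the authors' test statistic; the constants c and ϖ and the simulated design are illustrative assumptions.

```python
import numpy as np

def truncated_rv(returns, c=3.0, varpi=0.49):
    # Keep only increments below the threshold u_n = c * n^(-varpi);
    # jump increments exceed u_n asymptotically and are discarded.
    n = len(returns)
    u = c * n ** (-varpi)
    return np.sum(returns[np.abs(returns) <= u] ** 2)

rng = np.random.default_rng(1)
n = 10_000
cont = 0.2 * np.sqrt(1.0 / n) * rng.standard_normal(n)   # Brownian part, IV = 0.04
jumps = np.zeros(n)
jumps[[2500, 7500]] = [0.06, -0.05]                      # two finite-activity jumps
r = cont + jumps

rv_all = np.sum(r ** 2)        # estimates IV + sum of squared jumps
rv_trunc = truncated_rv(r)     # estimates IV only
jump_part = rv_all - rv_trunc  # isolates the jump contribution
```

    A jump test can then be built on the scaled difference between the full and the truncated variation, which is large only when jumps are present.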
  4. By: Silja Kinnebrock; Mark Podolskij (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper introduces a new estimator to measure the ex-post covariation between high-frequency financial time series under market microstructure noise. We provide an asymptotic limit theory (including feasible central limit theorems) for standard methods such as regression, correlation analysis and covariance, for which we obtain the optimal rate of convergence. We demonstrate some positive semidefinite estimators of the covariation and construct a positive semidefinite estimator of the conditional covariance matrix in the central limit theorem. Furthermore, we indicate how the assumptions on the noise process can be relaxed and how our method can be applied to non-synchronous observations. We also present an empirical study of how high-frequency correlations, regressions and covariances change through time.
    Keywords: Central Limit Theorem, Diffusion Models, Market Microstructure Noise, Non-synchronous Trading, High-Frequency Data, Semimartingale Theory
    Date: 2008–05–16
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-23&r=ecm
  5. By: Michael Sørensen; Julie Lyng Forman (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: The Pearson diffusions form a flexible class of diffusions defined by having linear drift and quadratic squared diffusion coefficient. It is demonstrated that for this class explicit statistical inference is feasible. Explicit optimal martingale estimating functions are found, and the corresponding estimators are shown to be consistent and asymptotically normal. The discussion covers GMM, quasi-likelihood, and nonlinear weighted least squares estimation too, and it is discussed how explicit likelihood or approximate likelihood inference is possible for the Pearson diffusions. A complete model classification is presented for the ergodic Pearson diffusions. The class of stationary distributions equals the full Pearson system of distributions. Well-known instances are the Ornstein-Uhlenbeck processes and the square root (CIR) processes. Also diffusions with heavy-tailed and skew marginals are included. Special attention is given to a skew t-type distribution. Explicit formulae for the conditional moments and the polynomial eigenfunctions are derived. The analytical tractability is inherited by transformed Pearson diffusions, integrated Pearson diffusions, sums of Pearson diffusions, and stochastic volatility models with Pearson volatility process. For the non-Markov models explicit optimal prediction-based estimating functions are found and shown to yield consistent and asymptotically normal estimators.
    Keywords: eigenfunction, ergodic diffusion, integrated diffusion, martingale estimating function, likelihood inference, mixing, optimal estimating function, Pearson system, prediction based estimating function, quasi likelihood, spectral methods, stochastic differential equation, stochastic volatility
    JEL: C22 C51
    Date: 2007–09–27
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-28&r=ecm
  6. By: Mark Podolskij; Mathias Vetter (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We consider a new class of estimators for volatility functionals in the setting of frequently observed Itô diffusions which are disturbed by i.i.d. noise. These statistics extend the approach of pre-averaging as a general method for the estimation of the integrated volatility in the presence of microstructure noise and are closely related to the original concept of bipower variation in the no-noise case. We show that this approach provides efficient estimators for a large class of integrated powers of volatility and prove the associated (stable) central limit theorems. In a more general Itô semimartingale framework this method can be used to define both estimators for the entire quadratic variation of the underlying process and jump-robust estimators which are consistent for various functionals of volatility. As a by-product we obtain a simple test for the presence of jumps in the underlying semimartingale.
    Keywords: Bipower Variation, Central Limit Theorem, High-Frequency Data, Microstructure Noise, Quadratic Variation, Semimartingale Theory, Test for Jumps
    JEL: C10 C13 C14
    Date: 2008–05–26
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-25&r=ecm
  7. By: Kulan Ranasinghe; Mervyn J. Silvapulle
    Abstract: This paper proposes a semiparametric method for estimating duration models when there are inequality constraints on some parameters and the error distribution may be unknown. Thus, the setting considered here is particularly suitable for practical applications. The parameters in duration models are usually estimated by a quasi-MLE. Recent advances show that a semiparametrically efficient estimator [SPE] has better asymptotic optimality properties than the QMLE provided that the parameter space is unrestricted. However, in several important duration models the parameter space is restricted; for example, in the commonly used linear duration model some parameters are non-negative. In such cases, the SPE may turn out to be outside the allowed parameter space and hence is unsuitable for use. To overcome this difficulty, we propose a new constrained semiparametric estimator. In a simulation study involving duration models with inequality constraints on parameters, the new estimator proposed in this paper performed better than its competitors. An empirical example is provided to illustrate the application of the new constrained semiparametric estimator and to show how it overcomes difficulties encountered when the unconstrained estimate of a nonnegative parameter turns out to be negative.
    Keywords: Adaptive inference; Conditional duration model; Constrained inference; Efficient semiparametric estimation; Order restricted inference; Semiparametric efficiency bound.
    JEL: C41 C14
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2008-5&r=ecm
  8. By: Matias D. Cattaneo; Richard K. Crump; Michael Jansson (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper is concerned with inference on the coefficient on the endogenous regressor in a linear instrumental variables model with a single endogenous regressor, nonrandom exogenous regressors and instruments, and i.i.d. errors whose distribution is unknown. It is shown that under mild smoothness conditions on the error distribution it is possible to develop tests which are “nearly” efficient when identification is weak and consistent and asymptotically optimal when identification is strong. In addition, an estimator is presented which can be used in the usual way to construct valid (indeed, optimal) confidence intervals when identification is strong. The estimator is of the two stage least squares variety and is asymptotically efficient under strong identification whether or not the errors are normal.
    Keywords: Instrumental variables regression, weak instruments, adaptive estimation
    JEL: C14 C31
    Date: 2007–06–25
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-11&r=ecm
  9. By: Per Frederiksen; Morten Ørregaard Nielsen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We propose to use a variant of the local polynomial Whittle estimator to estimate the memory parameter in volatility for long memory stochastic volatility models with potential nonstationarity in the volatility process. We show that the estimator is asymptotically normal and capable of obtaining bias reduction as well as a rate of convergence arbitrarily close to the parametric rate, n^(1/2). A Monte Carlo study is conducted to support the theoretical results, and an analysis of daily exchange rates demonstrates the empirical usefulness of the estimators.
    Keywords: Bias reduction, local Whittle estimation, long memory stochastic volatility model
    JEL: C14 C22
    Date: 2008–06–24
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-35&r=ecm
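    For orientation, the plain (non-polynomial) local Whittle estimator that this entry refines minimizes a local Whittle likelihood over the first m periodogram ordinates. A minimal grid-search sketch, with the bandwidth m and the parameter grid chosen purely for illustration:

```python
import numpy as np

def local_whittle_d(x, m):
    # Local Whittle estimate of the memory parameter d from the
    # first m Fourier frequencies of the periodogram.
    n = len(x)
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2.0 * np.pi * n)

    def objective(d):
        # R(d) = log( mean of lam^{2d} I_j ) - 2d * mean of log(lam)
        return np.log(np.mean(lam ** (2 * d) * I)) - 2.0 * d * np.mean(np.log(lam))

    grid = np.linspace(-0.49, 0.99, 297)   # crude grid search for illustration
    return grid[np.argmin([objective(d) for d in grid])]

rng = np.random.default_rng(3)
# Sanity check on white noise, whose true memory parameter is d = 0
d_hat = local_whittle_d(rng.standard_normal(4096), m=200)
```

    The local polynomial variants discussed in this issue replace the implicit constant approximation of the short-run spectrum near frequency zero with a polynomial, trading a mild variance inflation for bias reduction.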
  10. By: Michael Sørensen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: A review is given of parametric estimation methods for discretely sampled multivariate diffusion processes. The main focus is on estimating functions and asymptotic results. Maximum likelihood estimation is briefly considered, but the emphasis is on computationally less demanding martingale estimating functions. Particular attention is given to explicit estimating functions. Results on both fixed frequency and high frequency asymptotics are given. When choosing among the many estimators available, guidance is provided by simple criteria for high frequency efficiency and rate optimality that are presented in the framework of approximate martingale estimating functions.
    Keywords: Asymptotic results, discrete time observation of a diffusion, efficiency, eigenfunctions, explicit inference, generalized method of moments, likelihood inference, martingale estimating functions, high frequency asymptotics, Pearson diffusions.
    JEL: C22 C32
    Date: 2008–04–04
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-18&r=ecm
  11. By: Hall, Alastair R.; Han, Sanggohn; Boldea, Otilia
    Abstract: In this paper, we extend Bai and Perron’s (1998, Econometrica, p.47-78) framework for multiple break testing to linear models estimated via Two Stage Least Squares (2SLS). Within our framework, the break points are estimated simultaneously with the regression parameters via minimization of the residual sum of squares on the second step of the 2SLS estimation. We establish the consistency of the resulting estimated break point fractions. We show that various F-statistics for structural instability based on the 2SLS estimator have the same limiting distribution as the analogous statistics for OLS considered by Bai and Perron (1998). This allows us to extend Bai and Perron’s (1998) sequential procedure for selecting the number of break points to the 2SLS setting. Our methods also allow for structural instability in the reduced form that has been identified a priori using data-based methods. As an empirical illustration, our methods are used to assess the stability of the New Keynesian Phillips curve.
    Keywords: unknown break points; structural change; instrumental variables; endogenous regressors; structural stability tests; new Keynesian Phillips curve
    JEL: C13 C32 C12 C22 C01
    Date: 2008–06–20
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:9251&r=ecm
  12. By: Kortelainen, Mika
    Abstract: A number of studies have explored the semi- and nonparametric estimation of stochastic frontier models by using kernel regression or other nonparametric smoothing techniques. In contrast to popular deterministic nonparametric estimators, these approaches do not allow one to impose any shape constraints (or regularity conditions) on the frontier function. On the other hand, as many of the previous techniques are based on the nonparametric estimation of the frontier function, the convergence rate of frontier estimators can be sensitive to the number of inputs, which is generally known as “the curse of dimensionality” problem. This paper proposes a new semiparametric approach for stochastic frontier estimation that avoids the curse of dimensionality and allows one to impose shape constraints on the frontier function. Our approach is based on the single-index model and applies both single-index estimation techniques and shape-constrained nonparametric least squares. In addition to production frontier and technical efficiency estimation, we show how the technique can be used to estimate pollution generating technologies. The new approach is illustrated by an empirical application to the environmentally adjusted performance evaluation of U.S. coal-fired electric power plants.
    Keywords: stochastic frontier analysis (SFA); nonparametric least squares; single-index model; sliced inverse regression; monotone rank correlation estimator; environmental efficiency
    JEL: C51 Q52 C14 D24
    Date: 2008–06–20
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:9257&r=ecm
  13. By: Per Frederiksen; Frank S. Nielsen; Morten Ørregaard Nielsen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We propose a semiparametric local polynomial Whittle with noise (LPWN) estimator of the memory parameter in long memory time series perturbed by a noise term which may be serially correlated. The estimator approximates the spectrum of the perturbation as well as that of the short-memory component of the signal by two separate polynomials. Furthermore, an empirical investigation of the 30 DJIA stocks shows that this estimator indicates stronger persistence in volatility than the standard local Whittle estimator.
    Keywords: Bias reduction, local Whittle, long memory, perturbed fractional process, semiparametric estimation, stochastic volatility
    JEL: C22
    Date: 2008–06–09
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-29&r=ecm
  14. By: Søren Johansen; Bent Nielsen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: An algorithm suggested by Hendry (1999) for estimation in a regression with more regressors than observations is analyzed with the purpose of finding an estimator that is robust to outliers and structural breaks. This estimator is an example of a one-step M-estimator based on Huber's skip function. The asymptotic theory is derived in the situation where there are no outliers or structural breaks using empirical process techniques. Stationary processes, trend stationary autoregressions and unit root processes are considered.
    Keywords: Empirical processes, Huber's skip, indicator saturation, M-estimator, outlier robustness, vector autoregressive process
    JEL: C32
    Date: 2008–02–05
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-09&r=ecm
  15. By: Carmen Broto (Banco de España); Esther Ruiz (Universidad Carlos III de Madrid)
    Abstract: In this paper we propose a model for monthly inflation with stochastic trend, seasonal and transitory components with QGARCH disturbances. This model distinguishes whether the long-run or short-run components are heteroscedastic. Furthermore, the uncertainty associated with these components may increase with the level of inflation as postulated by Friedman. We propose to use the differences between the autocorrelations of squares and the squared autocorrelations of the auxiliary residuals to identify heteroscedastic components. We show that conditional heteroscedasticity truly present in the data can go undetected when looking at the correlations of standardized residuals, while the autocorrelations of auxiliary residuals have more power to detect conditional heteroscedasticity. Furthermore, the proposed statistics can help to decide which component is heteroscedastic. Their finite sample performance is compared with that of a Lagrange Multiplier test by means of Monte Carlo experiments. Finally, we use auxiliary residuals to detect conditional heteroscedasticity in monthly inflation series of eight OECD countries.
    Keywords: Leverage effect, QGARCH, seasonality, structural time series models, unobserved component
    JEL: C22 C52 E31
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:bde:wpaper:0812&r=ecm
  16. By: Jean Jacod; Yingying Li; Per A. Mykland; Mark Podolskij; Mathias Vetter (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper presents a generalized pre-averaging approach for estimating the integrated volatility. This approach also provides consistent estimators of other powers of volatility – in particular, it gives feasible ways to consistently estimate the asymptotic variance of the estimator of the integrated volatility. We show that our approach, which possesses an intuitive transparency, can generate rate-optimal estimators (with convergence rate n^(-1/4)).
    Keywords: consistency, continuity, discrete observation, Itô process, leverage effect, pre-averaging, quarticity, realized volatility, stable convergence
    JEL: C10 C13 C14
    Date: 2007–12–10
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-43&r=ecm
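    A heavily simplified toy version of the pre-averaging idea — non-overlapping block means with no bias correction, so emphatically not the JLMPV estimator itself — already shows how averaging tames i.i.d. microstructure noise. All parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n, k, sigma, omega = 23_400, 100, 0.2, 1e-3    # ticks, block size, vol, noise std
dt = 1.0 / n
X = np.cumsum(sigma * np.sqrt(dt) * rng.standard_normal(n))  # efficient log-price
Y = X + omega * rng.standard_normal(n)                       # observed with noise

rv_raw = np.sum(np.diff(Y) ** 2)   # dominated by noise: approx IV + 2*n*omega^2

# Pre-average: block means are far less noisy than individual ticks.
# For consecutive time-averages of Brownian motion over intervals of length h,
# Var(mean_{i+1} - mean_i) = (2/3) * sigma^2 * h, hence the 3/2 rescaling.
bars = Y.reshape(-1, k).mean(axis=1)
rv_pre = 1.5 * np.sum(np.diff(bars) ** 2)
```

    The raw tick-level realized variance is swamped by the noise term 2nω², while the rescaled pre-averaged statistic lands near the true integrated volatility of σ² = 0.04; the full approach in this entry uses overlapping weighted windows and an explicit noise bias correction to achieve the optimal n^(-1/4) rate.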
  17. By: Jens Perch Nielsen; Carsten Tanggaard; M.C. Jones (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: A class of local linear kernel density estimators based on weighted least squares kernel estimation is considered within the framework of Aalen’s multiplicative intensity model. This model includes the filtered data model that, in turn, allows for truncation and/or censoring in addition to accommodating unusual patterns of exposure as well as occurrence. It is shown that the local linear estimators corresponding to all different weightings have the same pointwise asymptotic properties. However, the weighting previously used in the literature in the i.i.d. case is seen to be far from optimal when it comes to exposure robustness, and a simple alternative weighting is to be preferred. Indeed, this weighting has, effectively, to be well chosen in a ‘pilot’ estimator of the survival function as well as in the main estimator itself. We also investigate multiplicative and additive bias correction methods within our framework. The multiplicative bias correction method proves to be best in a simulation study comparing the performance of the considered estimators. An example concerning old age mortality demonstrates the importance of the improvements provided.
    Keywords: Aalen’s multiplicative model, additive bias correction, censoring, counting processes, exposure robustness, kernel density estimation, multiplicative bias correction, old age mortality
    Date: 2007–06–14
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-13&r=ecm
  18. By: Michael Jansson (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper derives asymptotic power envelopes for tests of the unit root hypothesis in a zero-mean AR(1) model. The power envelopes are derived using the limits of experiments approach and are semiparametric in the sense that the underlying error distribution is treated as an unknown infinite-dimensional nuisance parameter. Adaptation is shown to be possible when the error distribution is known to be symmetric and to be impossible when the error distribution is unrestricted. In the latter case, two conceptually distinct approaches to nuisance parameter elimination are employed in the derivation of the semiparametric power bounds. One of these bounds, derived under an invariance restriction, is shown by example to be sharp, while the other, derived under a similarity restriction, is conjectured not to be globally attainable.
    Keywords: Unit root testing, semiparametric efficiency
    JEL: C14 C22
    Date: 2007–06–25
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-12&r=ecm
  19. By: Michael Sørensen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: A general theory of efficient estimation for ergodic diffusions sampled at high frequency is presented. High frequency sampling is now possible in many applications, in particular in finance. The theory is formulated in terms of approximate martingale estimating functions and covers a large class of estimators including most of the previously proposed estimators for diffusion processes, for instance GMM-estimators and the maximum likelihood estimator. Simple conditions are given that ensure rate optimality, where estimators of parameters in the diffusion coefficient converge faster than estimators of parameters in the drift coefficient, and efficiency. The conditions turn out to be equal to those implying small delta-optimality in the sense of Jacobsen and thus give an interpretation of this concept in terms of classical statistical concepts. Optimal martingale estimating functions in the sense of Godambe and Heyde are shown to give rate optimal and efficient estimators under weak conditions.
    Keywords: Approximate martingale estimating functions, discrete time observation of a diffusion, efficiency, Euler approximation, generalized method of moments, optimal estimating function, optimal rate, small delta-optimality
    JEL: C22 C32
    Date: 2008–01–22
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-46&r=ecm
  20. By: Anirban Basu; Daniel Polsky; Willard G. Manning
    Abstract: Under the assumption of no unmeasured confounders, a large literature exists on methods that can be used to estimate average treatment effects (ATE) from observational data, spanning regression models, propensity score adjustments using stratification, weighting or regression, and even the combination of both as in doubly-robust estimators. However, comparison of these alternative methods is sparse in the context of data generated via non-linear models where treatment effects are heterogeneous, as is the case for healthcare cost data. In this paper, we compare the performance of alternative regression and propensity score-based estimators in estimating average treatment effects on outcomes that are generated via non-linear models. Using simulations, we find that in moderate size samples (n = 5000), balancing on estimated propensity scores balances the covariate means across treatment arms but fails to balance higher-order moments and covariances amongst covariates, raising concern about its use in non-linear outcomes generating mechanisms. We also find that besides inverse-probability weighting (IPW) with propensity scores, no one estimator is consistent under all data generating mechanisms. The IPW estimator is itself prone to inconsistency due to misspecification of the model for estimating propensity scores. Even when it is consistent, the IPW estimator is usually extremely inefficient. Thus care should be taken before naively applying any one estimator to estimate ATE in these data. We develop a recommendation for an algorithm which may help applied researchers to arrive at the optimal estimator. We illustrate the application of this algorithm and also the performance of alternative methods in a cost dataset on breast cancer treatment.
    JEL: C01 C21 I10
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:14086&r=ecm
  21. By: Ingmar Nolte; Valeri Voev (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We propose a unified framework for estimating integrated variances and covariances based on simple OLS regressions, allowing for a general market microstructure noise specification. We show that our estimators can outperform, in terms of the root mean squared error criterion, the most recent and commonly applied estimators, such as the realized kernels of Barndorff-Nielsen, Hansen, Lunde & Shephard (2006), the two-scales realized variance of Zhang, Mykland & Aït-Sahalia (2005), the Hayashi & Yoshida (2005) covariance estimator, and the realized variance and covariance with the optimal sampling frequency derived in Bandi & Russell (2005a) and Bandi & Russell (2005b). For a realistic trading scenario, the efficiency gains resulting from our approach are in the range of 35% to 50%.
    Keywords: High frequency data, Realized volatility and covariance, Market microstructure
    JEL: G10 F31 C32
    Date: 2008–06–10
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-31&r=ecm
  22. By: D. Kuang (Department of Statistics, University of Oxford); Bent Nielsen (Nuffield College, Oxford University); J. P. Nielsen (Cass Business School)
    Abstract: We consider forecasting from age-period-cohort models, as well as from the extended chain-ladder model. The parameters of these models are known only to be identified up to linear trends. Forecasts from such models may therefore depend on arbitrary linear trends. A condition for invariant forecasts is proposed. A number of standard forecast models are analysed.
    Keywords: Age-period-cohort model; Chain-ladder model; Forecasting; Identification.
    Date: 2008–06–16
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0809&r=ecm
  23. By: Martin Møller Andreasen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper shows how non-linear DSGE models with potential non-normal shocks can be estimated by Quasi-Maximum Likelihood based on the Central Difference Kalman Filter (CDKF). The advantage of this estimator is that evaluating the quasi log-likelihood function only takes a fraction of a second. The second contribution of this paper is to derive a new particle filter which we term the Mean Shifted Particle Filter (MSPFb). We show that the MSPFb outperforms the standard Particle Filter by delivering more precise state estimates, and in general the MSPFb has lower Monte Carlo variation in the reported log-likelihood function.
    Keywords: Multivariate Stirling interpolation, Particle filtering, Non-linear DSGE models, Non-normal shocks, Quasi-maximum likelihood
    JEL: C13 C15 E10 E32
    Date: 2008–06–20
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-33&r=ecm
  24. By: Dennis Kristensen; Anders Rahbek (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We consider a class of vector nonlinear error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties of the process in terms of stochastic and deterministic trends as well as stationary components. In particular, the behaviour of the cointegrating relations is described in terms of geometric ergodicity. Despite the fact that no deterministic terms are included, the process will have both stochastic trends and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters, and the short-run parameters. Asymptotic theory is provided for these and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study reveals that cointegration vectors and the shape of the adjustment are quite accurately estimated by maximum likelihood, while at the same time there is very little information about some of the individual parameters entering the adjustment function.
    Date: 2007–11–19
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-38&r=ecm
  25. By: Frank S. Nielsen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper extends the local polynomial Whittle estimator of Andrews & Sun (2004) to fractionally integrated processes covering stationary and non-stationary regions. We utilize the notion of the extended discrete Fourier transform and periodogram to extend the local polynomial Whittle estimator to the non-stationary region. By approximating the short-run component of the spectrum by a polynomial, instead of a constant, in a shrinking neighborhood of zero, we alleviate some of the bias that the classical local Whittle estimator is prone to. A simulation study illustrates the performance of the proposed estimator compared to the classical local Whittle estimator and the local polynomial Whittle estimator. The empirical relevance of the proposed estimator is demonstrated through an analysis of credit spreads.
    Keywords: Bias reduction, fractional integration, local polynomial, local Whittle estimation, long memory.
    JEL: C22
    Date: 2008–06–02
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-28&r=ecm
  26. By: Almut Veraart (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: Recent research has focused on modelling asset prices by Itô semimartingales. In such a modelling framework, the quadratic variation consists of a continuous and a jump component. This paper is about inference on the jump part of the quadratic variation, which can be estimated by the difference of realised variance and realised multipower variation. The main contribution of this paper is twofold. First, it provides a bivariate asymptotic limit theory for realised variance and realised multipower variation in the presence of jumps. Second, this paper presents new, consistent estimators for the jump part of the asymptotic variance of the estimation bias. Eventually, this leads to a feasible asymptotic theory which is applicable in practice. Finally, Monte Carlo studies reveal a good finite sample performance of the proposed feasible limit theory.
    Keywords: Quadratic variation, Itô semimartingale, stochastic volatility, jumps, realised variance, realised multipower variation, high–frequency data
    JEL: C13 C14 G10 G12
    Date: 2008–03–31
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-17&r=ecm
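    The difference-based jump estimator in the abstract above can be illustrated with a small simulation. This is a generic sketch, not the paper's estimator: it uses plain bipower variation (the simplest multipower variation) and invented parameter values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate one "day" of one-second log-returns: a diffusion plus one jump.
n = 23400                        # seconds in a 6.5-hour trading session
sigma = 0.2 / np.sqrt(252)       # daily diffusive volatility (20% annualized)
r = rng.normal(0.0, sigma / np.sqrt(n), n)
r[n // 2] += 0.01                # a single 1% jump

# Realized variance estimates the total quadratic variation.
rv = np.sum(r ** 2)

# Bipower variation is robust to jumps and estimates the continuous part;
# mu1 = E|Z| for standard normal Z.
mu1 = np.sqrt(2.0 / np.pi)
bv = np.sum(np.abs(r[1:]) * np.abs(r[:-1])) / mu1 ** 2

# The truncated difference estimates the jump part of quadratic variation,
# here close to 0.01**2 = 1e-4.
jump_var = max(rv - bv, 0.0)
print(rv, bv, jump_var)
```

The paper's contribution is the joint limit theory for (rv, bv) and feasible standard errors for jump_var; the point estimate itself is this simple difference.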
  27. By: Ole E. Barndorff-Nielsen; José Manuel Corcuera; Mark Podolskij (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We develop the asymptotic theory for the realised power variation of the processes X = f • G, where G is a Gaussian process with stationary increments. More specifically, under some mild assumptions on the variance function of the increments of G and a certain regularity condition on the path of the process f, we prove convergence in probability for the properly normalised realised power variation. Moreover, under a further assumption on the Hölder index of the path of f, we show an associated stable central limit theorem. The main tool is a general central limit theorem, due essentially to Hu & Nualart (2005), Nualart & Peccati (2005) and Peccati & Tudor (2005), for sequences of random variables which admit a chaos representation.
    Keywords: Central Limit Theorem, Chaos Expansion, Gaussian Processes, High-Frequency Data, Multiple Wiener-Itô Integrals, Power Variation
    JEL: C10 C13 C14
    Date: 2007–12–07
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-42&r=ecm
  28. By: Elena Biewen; Sandra Nolte; Martin Rosemann
    Abstract: Whereas additive measurement error has received considerable treatment in the literature, less work has been done on multiplicative noise. In this paper we concentrate on multiplicative measurement error in the covariates, which, contrary to additive error, not only modifies the original value proportionally but also preserves the structural zeros. This paper compares three variants of specifying the multiplicative measurement error model in the simulation step of the Simulation-Extrapolation (SIMEX) method originally proposed by Cook and Stefanski (1994): i) as an additive model without using a logarithmic transformation, ii) as the well-known logarithmic transformation of the multiplicative error model, and iii) as an approach using the multiplicative measurement error model as such. The aim of the paper is to analyze how well these three approaches reduce the bias caused by the multiplicative measurement error. We apply the three variants to the case of data masking by multiplicative measurement error, in order to obtain parameter estimates of the true data generating process. We produce Monte Carlo evidence on how the loss of data quality can be minimized.
    Keywords: Errors-in-variables in nonlinear models, disclosure limitation methods, multiplicative error
    JEL: C13 C21
    Date: 2008–01
    URL: http://d.repec.org/n?u=RePEc:iaw:iawdip:39&r=ecm
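    Variant ii) above, the logarithmic transformation, turns the multiplicative error into an additive one on the log scale, so standard additive SIMEX applies. The sketch below is a generic illustration under invented parameters (known error variance, no structural zeros), not the authors' implementation: extra noise is added at several levels lambda, the attenuated slope is tracked, and a quadratic in lambda is extrapolated back to lambda = -1.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: y depends on log X, but we only observe
# log W = log X + log U, i.e. additive error on the log scale.
n = 5000
log_x = rng.normal(0.0, 1.0, n)
y = 1.0 * log_x + rng.normal(0.0, 0.5, n)   # true slope beta = 1

s_u = 0.5                                    # assumed known error std. dev.
log_w = log_x + rng.normal(0.0, s_u, n)

def slope(x, y):
    x = x - x.mean()
    return np.sum(x * (y - y.mean())) / np.sum(x * x)

naive = slope(log_w, y)                      # attenuated towards zero (~0.8)

# Simulation step: re-estimate with extra error of variance lambda * s_u^2.
lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
B = 100
means = []
for lam in lambdas:
    est = [slope(log_w + np.sqrt(lam) * s_u * rng.normal(0.0, 1.0, n), y)
           for _ in range(B)]
    means.append(np.mean(est))

# Extrapolation step: fit a quadratic in lambda, evaluate at lambda = -1.
coef = np.polyfit(lambdas, means, 2)
simex = np.polyval(coef, -1.0)
print(naive, simex)
```

The SIMEX estimate recovers most, though not all, of the attenuation bias; the residual gap reflects the quadratic approximation to the exact attenuation curve.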
  29. By: Meenagh, David; Minford, Patrick; Theodoridis, Konstantinos
    Abstract: We use the method of indirect inference to test a full open economy model of the UK that has been in forecasting use for three decades. The test establishes, using a Wald statistic, whether the parameters of a time-series representation estimated on the actual data lie within some confidence interval of the model-implied distribution. Various forms of time-series representations that could deal with the UK's various changes of monetary regime are tried; two are retained as adequate. The model is rejected under one but marginally accepted under the other, suggesting that with some modifications it could achieve general acceptability and that the testing method is worth investigating further.
    Keywords: Bootstrap; Indirect inference; Model evaluation; Non-linear Time Series Models; Open economy models; UK models
    JEL: C12 C32
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:6849&r=ecm
  30. By: Peter Reinhard Hansen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We derive an identity for the determinant of a product involving non-squared matrices. The identity can be used to derive the maximum likelihood estimator in reduced-rank regressions with Gaussian innovations. Furthermore, the identity sheds light on the structure of the estimation problem that arises when the reduced-rank parameters are subject to additional constraints.
    Keywords: Determinant Identity, Reduced Rank Regression, Least Squares
    JEL: C3 C32
    Date: 2008–01–15
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-02&r=ecm
  31. By: Alexandr Kuchynka (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic; Faculty of Economics, University of West Bohemia in Pilsen; Institute of Information Theory and Automation of the ASCR)
    Abstract: This paper focuses on the extraction of volatility of financial returns. The volatility process is modeled as a superposition of two autoregressive processes which represent the more persistent factor and the quickly mean-reverting factor. As the volatility is not observable, the logarithm of the daily high-low range is employed as its proxy. The estimation of parameters and volatility extraction are performed using a modified version of the Kalman filter which takes into account the finite sample distribution of the proxy.
    Keywords: volatility, stochastic volatility models, Kalman filter, volatility proxy
    JEL: C22 G15
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2008_10&r=ecm
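    A stripped-down version of the filtering idea in the abstract above can be sketched with a single AR(1) volatility factor and a standard (unmodified) Kalman filter; the log-range proxy is treated as the latent log-volatility plus Gaussian measurement noise, and all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Latent log-volatility h follows an AR(1); the observed proxy y is h
# plus measurement noise (standing in for the log high-low range).
T = 2000
phi, q, r_var = 0.98, 0.02, 0.3     # persistence, state and obs. noise variances
h = np.zeros(T)
for t in range(1, T):
    h[t] = phi * h[t - 1] + rng.normal(0.0, np.sqrt(q))
y = h + rng.normal(0.0, np.sqrt(r_var), T)

# Kalman filter recursions for x_t = phi x_{t-1} + w,  y_t = x_t + v.
x_f = np.zeros(T)
x, p = 0.0, q / (1.0 - phi ** 2)    # start from the stationary distribution
for t in range(T):
    x_pred = phi * x if t > 0 else x
    p_pred = phi ** 2 * p + q if t > 0 else p
    k = p_pred / (p_pred + r_var)   # Kalman gain
    x = x_pred + k * (y[t] - x_pred)
    p = (1.0 - k) * p_pred
    x_f[t] = x

rmse_proxy = np.sqrt(np.mean((y - h) ** 2))
rmse_filter = np.sqrt(np.mean((x_f - h) ** 2))
print(rmse_proxy, rmse_filter)
```

The filtered estimate tracks the latent factor much more closely than the raw proxy; the paper's contribution is precisely to replace the Gaussian measurement assumption with the finite-sample distribution of the range proxy.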
  32. By: Torben G. Andersen; Tim Bollerslev; Per Houmann Frederiksen; Morten Ørregaard Nielsen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We provide an empirical framework for assessing the distributional properties of daily speculative returns within the context of the continuous-time modeling paradigm traditionally used in asset pricing finance. Our approach builds directly on recently developed realized variation measures and non-parametric jump detection statistics constructed from high-frequency intraday data. A sequence of relatively simple-to-implement moment-based tests involving various transforms of the daily returns speak directly to the import of different features of the underlying continuous-time processes that might have generated the data. As such, the tests may serve as a useful diagnostic tool in the specification of empirically more realistic asset pricing models. Our results are also directly related to the popular mixture-of-distributions hypothesis and the role of the corresponding latent information arrival process. On applying our sequential test procedure to the thirty individual stocks in the Dow Jones Industrial Average index, the data suggest that it is important to allow for both time-varying diffusive volatility, jumps, and leverage effects in order to satisfactorily describe the daily stock price dynamics. At a broader level, the empirical results also illustrate how the realized variation measures and high-frequency sampling schemes may be used in eliciting important distributional features and asset pricing implications more generally.
    Keywords: Return distributions, continuous-time models, mixture-of-distributions hypothesis, financial-time sampling, high-frequency data, volatility signature plots, realized volatilities, jumps, leverage and volatility feedback effects
    JEL: C1 G1
    Date: 2007–08–16
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-21&r=ecm
  33. By: Matias D. Cattaneo; Richard K. Crump; Michael Jansson (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper proposes (apparently) novel standard error formulas for the density-weighted average derivative estimator of Powell, Stock, and Stoker (1989). Asymptotic validity of the standard errors developed in this paper does not require the use of higher-order kernels and the standard errors are "robust" in the sense that they accommodate (but do not require) bandwidths that are smaller than those for which conventional standard errors are valid. Moreover, the results of a Monte Carlo experiment suggest that the finite sample coverage rates of confidence intervals constructed using the standard errors developed in this paper coincide (approximately) with the nominal coverage rates across a nontrivial range of bandwidths.
    Keywords: Semiparametric estimation, density-weighted average derivatives
    JEL: C14 C21
    Date: 2008–05–20
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-24&r=ecm
  34. By: Martin Burda; Roman Liesenfeld; Jean-Francois Richard
    Abstract: In this paper, we perform Bayesian analysis of a panel probit model with unobserved individual heterogeneity and serially correlated errors. We augment the data with latent variables and sample the unobserved heterogeneity component as one Gibbs block per individual using a flexible piecewise linear approximation to the marginal posterior density. The latent time effects are simulated as another Gibbs block. For this purpose we develop a new user-friendly form of the Efficient Importance Sampling proposal density for an Acceptance-Rejection Metropolis-Hastings step. We apply our method to the analysis of product innovation activity of a panel of German manufacturing firms in response to imports, foreign direct investment and other control variables. The dataset used here was analyzed under more restrictive assumptions by Bertschek and Lechner (1998) and Greene (2004). Although our results differ to a certain degree from these benchmark studies, we confirm the positive effect of imports and FDI on firms' innovation activity. Moreover, unobserved firm heterogeneity is shown to play a far more significant role in the application than the latent time effects.
    Keywords: Dynamic latent variables; Markov Chain Monte Carlo; importance sampling
    JEL: C11 C13 C15 C23 C25
    Date: 2008–06–16
    URL: http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa-321&r=ecm
  35. By: Olaf Posch (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: Understanding the process of economic growth involves comparing competing theoretical models and evaluating their empirical relevance. Our approach is to take the neoclassical stochastic growth model directly to the data and make inferences about the model parameters of interest. In this paper, output follows a jump-diffusion process. By imposing parameter restrictions we derive two solutions in explicit form. Based on them, we obtain transition densities in closed form and employ maximum likelihood techniques to estimate the model parameters. In extensive Monte Carlo simulations we demonstrate that population parameters of the underlying data generating process can be recovered. We find empirical evidence for jumps in monthly and quarterly data on industrial production for the UK, the US, Germany, and the euro area (Euro12).
    Keywords: Jump-diffusion estimation, Stochastic growth, Closed form solutions
    JEL: C13 E32 O40
    Date: 2007–09–14
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-23&r=ecm
  36. By: Viktor Todorov; Tim Bollerslev (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We provide a new theoretical framework for disentangling and estimating sensitivity towards systematic diffusive and jump risks in the context of factor pricing models. Our estimates of the sensitivities towards systematic risks, or betas, are based on the notion of increasingly finer sampled returns over fixed time intervals. In addition to establishing consistency of our estimators, we also derive Central Limit Theorems characterizing their asymptotic distributions. In an empirical application of the new procedures using high-frequency data for forty individual stocks and an aggregate market portfolio, we find the estimated diffusive and jump betas with respect to the market to be quite different for many of the stocks. Our findings have direct and important implications for empirical asset pricing finance and practical portfolio and risk management decisions.
    Keywords: Factor models, systematic risk, common jumps, high-frequency data, realized variation
    JEL: C13 C14 G10 G12
    Date: 2007–08–16
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-15&r=ecm
  37. By: James Davidson; Nigar Hashimzade (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper considers the asymptotic distribution of the covariance of a nonstationary fractionally integrated process with the stationary increments of another such process - possibly, itself. Questions of interest include the relationship between the harmonic representation of these random variables, which we have analysed in a previous paper, and the construction derived from moving average representations in the time domain. The limiting integrals are shown to be expressible in terms of functionals of Itô integrals with respect to two distinct Brownian motions. Their mean is nonetheless shown to match that of the harmonic representation, and they satisfy the required integration by parts rule. The advantages of our approach over the harmonic analysis include the facts that our formulae are valid for the full range of the long memory parameters, and extend to non-Gaussian processes.
    Keywords: Stochastic integral, weak convergence, fractional Brownian motion
    JEL: C22 C32
    Date: 2007–12–21
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-45&r=ecm
  38. By: Francois-Éric Racicot (Département des sciences administratives, Université du Québec (Outaouais), LRSP et Chaire d'information financière et organisationnelle); Raymond Théoret (Département de stratégie des affaires, Université du Québec (Montréal), et Chaire d'information financière et organisationnelle)
    Keywords: Asset Pricing Models, specification errors, Hausman test, GMM, optimal instruments.
    JEL: C13 C19 C49 G12 G31
    Date: 2008–01–06
    URL: http://d.repec.org/n?u=RePEc:pqs:wpaper:012008&r=ecm
  39. By: Wolfgang Härdle; Ostap Okhrin; Yarema Okhrin
    Abstract: In this paper we provide a review of copula theory with applications to finance. We illustrate the idea in the bivariate framework and discuss the simple, elliptical and Archimedean classes of copulae. Since copulae model the dependency structure between random variables, we then explain the link between copulae and common dependency measures, such as Kendall's tau and Spearman's rho. In the next section the copulae are generalized to the multivariate case. In this general setup we discuss and provide an extensive literature review of estimation and simulation techniques. A separate section is devoted to goodness-of-fit tests. We illustrate the importance of copulae in finance with examples from asset allocation problems, Value-at-Risk and time series models. The paper is complemented with an extensive simulation study and an application to financial data.
    Keywords: Distribution functions, Dimension Reduction, Risk management, Statistical models
    JEL: C00 C14 C51
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2008-043&r=ecm
  40. By: Tim Bollerslev; Uta Kretschmer; Christian Pigorsch; George Tauchen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We develop an empirically highly accurate discrete-time daily stochastic volatility model that explicitly distinguishes between the jump and continuous-time components of price movements, using nonparametric realized variation and bipower variation measures constructed from high-frequency intraday data. The model setup allows us to directly assess the structural inter-dependencies among the shocks to returns and the two different volatility components. The model estimates suggest that the leverage effect, or asymmetry between returns and volatility, works primarily through the continuous volatility component. The excellent fit of the model makes it an ideal candidate for an easy-to-implement auxiliary model in the context of indirect estimation of empirically more realistic continuous-time jump diffusion and Lévy-driven stochastic volatility models, effectively incorporating the interdaily dependencies inherent in the high-frequency intraday data.
    Keywords: Realized volatility, Bipower variation, Jumps, Leverage effect, Simultaneous equation model
    JEL: C1 C3 C5 G1
    Date: 2007–08–16
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-22&r=ecm
  41. By: Tim Bollerslev; Michael Gibson; Hao Zhou (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper proposes a method for constructing a volatility risk premium, or investor risk aversion, index. The method is intuitive and simple to implement, relying on the sample moments of the recently popularized model-free realized and option-implied volatility measures. A small-scale Monte Carlo experiment confirms that the procedure works well in practice. Implementing the procedure with actual S&P500 option-implied volatilities and high-frequency five-minute-based realized volatilities indicates significant temporal dependencies in the estimated stochastic volatility risk premium, which we in turn relate to a set of macro-finance state variables. We also find that the extracted volatility risk premium helps predict future stock market returns.
    Keywords: Stochastic Volatility Risk Premium, Model-Free Implied Volatility, Model-Free Realized Volatility, Black-Scholes, GMM Estimation, Return Predictability
    JEL: G12 G13 C51 C52
    Date: 2007–08–16
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-16&r=ecm
  42. By: Bent Jesper Christensen; Morten Ørregaard Nielsen; Jie Zhu (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We extend the fractionally integrated exponential GARCH (FIEGARCH) model for daily stock return data with long memory in return volatility of Bollerslev and Mikkelsen (1996) by introducing a possible volatility-in-mean effect. To prevent the long memory property of volatility from carrying over to returns, we consider a filtered FIEGARCH-in-mean (FIEGARCH-M) effect in the return equation. The filtering of the volatility-in-mean component thus allows the co-existence of long memory in volatility and short memory in returns. We present an application to the S&P 500 index which documents the empirical relevance of our model.
    Keywords: FIEGARCH, financial leverage, GARCH, long memory, risk-return tradeoff, stock returns, volatility feedback
    JEL: C22
    Date: 2007–06–12
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-10&r=ecm
  43. By: Torben G. Andersen; Tim Bollerslev; Xin Huang (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: Building on realized variance and bi-power variation measures constructed from high-frequency financial prices, we propose a simple reduced form framework for effectively incorporating intraday data into the modeling of daily return volatility. We decompose the total daily return variability into the continuous sample path variance, the variation arising from discontinuous jumps that occur during the trading day, as well as the overnight return variance. Our empirical results, based on long samples of high-frequency equity and bond futures returns, suggest that the dynamic dependencies in the daily continuous sample path variability are well described by an approximate long-memory HAR-GARCH model, while the overnight returns may be modelled by an augmented GARCH type structure. The dynamic dependencies in the non-parametrically identified significant jumps appear to be well described by the combination of an ACH model for the time-varying jump intensities coupled with a relatively simple log-linear structure for the jump sizes. Lastly, we discuss how the resulting reduced form model structure for each of the three components may be used in the construction of out-of-sample forecasts for the total return volatility.
    Keywords: Stochastic Volatility, Realized Variation, Bipower Variation, Jumps, Hazard Rates, Overnight Volatility
    JEL: C1 G1 C2
    Date: 2007–08–16
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-14&r=ecm
  44. By: Martin Møller Andreasen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation in these models remains to be established.
    Keywords: Deterministic trends, DSGE models, Error distributions, Moment generating functions, Stochastic trends, Stochastic volatility, Unit-roots
    JEL: E10 E30
    Date: 2008–05–26
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-26&r=ecm
  45. By: Stefan Holst Bache; Christian M. Dahl; Johannes Tang (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: Low birthweight outcomes are associated with large social and economic costs, and therefore the possible determinants of low birthweight are of great interest. One such determinant which has received considerable attention is maternal smoking. From an economic perspective this is in part due to the possibility that smoking habits can be influenced through policy conduct. It is widely believed that maternal smoking reduces birthweight; however, the crucial difficulty in estimating such effects is the unobserved heterogeneity among mothers. We consider extensions of three panel data models to a quantile regression framework in order to control for heterogeneity and to infer conclusions about causality across the entire birthweight distribution. We obtain estimation results for maternal smoking and other interesting determinants, applying these to data obtained from Aarhus University Hospital, Skejby (Denmark). We examine the use of both balanced and unbalanced panels. In conclusion, our results show the importance of considering conditional quantiles and controlling for unobserved heterogeneity when estimating determinants of birthweight outcomes. An example of this is the change in magnitude and significance of prenatal smoking. Controlling for unobserved effects does not change the fact that smoking reduces birthweight, but it shows that the effect is primarily a problem in the left tail of the distribution on a slightly smaller scale.
    Keywords: Random Correlated Effects, Fixed Effects, Cross Section, Quantile Regression, Maternal Smoking, Birthweight
    JEL: C13 C23 I10
    Date: 2008–05–08
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-20&r=ecm
  46. By: Ole E. Barndorff-Nielsen; José Manuel Corcuera; Mark Podolskij; Jeannette H.C. Woerner (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: Convergence in probability and central limit laws of bipower variation for Gaussian processes with stationary increments and for integrals with respect to such processes are derived. The main tools of the proofs are some recent powerful techniques of Wiener/Itô/Malliavin calculus for establishing limit laws, due to Nualart, Peccati and others.
    Keywords: Bipower Variation, Central Limit Theorem, Chaos Expansion, Gaussian Processes, Multiple Wiener-Itô Integrals.
    Date: 2008–05–08
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-21&r=ecm
  47. By: Thomas Busch; Bent Jesper Christensen; Morten Ørregaard Nielsen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We study the forecasting of future realized volatility in the stock, bond, and foreign exchange markets, as well as the continuous sample path and jump components of this, from variables in the information set, including implied volatility backed out from option prices. Recent nonparametric statistical techniques of Barndorff-Nielsen & Shephard (2004, 2006) are used to separate realized volatility into its continuous and jump components, which enhances forecasting performance, as shown by Andersen, Bollerslev & Diebold (2005). We generalize the heterogeneous autoregressive (HAR) model of Corsi (2004) to include implied volatility as an additional regressor, and to the separate forecasting of the realized components. We also introduce a new vector HAR (VecHAR) model for the resulting simultaneous system, controlling for possible endogeneity issues in the forecasting equations. We show that implied volatility contains incremental information about future volatility relative to both continuous and jump components of past realized volatility. Indeed, in the foreign exchange market, implied volatility completely subsumes the information content of daily, weekly, and monthly realized volatility measures, when forecasting future realized volatility or its continuous component. In addition, implied volatility is an unbiased forecast of future realized volatility in the foreign exchange and stock markets. Perhaps surprisingly, the jump component of realized return volatility is, to some extent, predictable, and options appear to be calibrated to incorporate information about future jumps in all three markets.
    Keywords: Bipower variation, HAR, Heterogeneous Autoregressive Model, implied volatility, jumps, options, realized volatility, VecHAR, volatility forecasting
    JEL: C22 C32 F31 G1
    Date: 2007–06–06
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-09&r=ecm
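The HAR regression described above (Corsi's daily/weekly/monthly realized-volatility regressors, augmented here with implied volatility) can be sketched as a simple OLS problem. This is a minimal illustration on simulated data, not the paper's estimation; all variable names and the simulated series are illustrative assumptions.

```python
import numpy as np

def har_iv_design(rv, iv):
    """Build the regressor matrix for a HAR regression of realized
    volatility augmented with implied volatility. rv and iv are 1-D
    arrays of daily observations (illustrative variable names)."""
    n = len(rv)
    rows, y = [], []
    for t in range(22, n - 1):
        rv_d = rv[t]                      # daily RV
        rv_w = rv[t - 4:t + 1].mean()     # weekly average RV
        rv_m = rv[t - 21:t + 1].mean()    # monthly average RV
        rows.append([1.0, rv_d, rv_w, rv_m, iv[t]])
        y.append(rv[t + 1])               # next day's RV is the target
    return np.array(rows), np.array(y)

# Fit by OLS on simulated data (purely illustrative numbers).
rng = np.random.default_rng(0)
rv = np.abs(rng.normal(1.0, 0.2, 500))
iv = rv + rng.normal(0.0, 0.05, 500)      # implied vol tracks RV with noise
X, y = har_iv_design(rv, iv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # coefficients on [const, daily, weekly, monthly, implied]
```

The paper's VecHAR system stacks several such equations (continuous and jump components as well) and estimates them jointly; the single-equation sketch above only conveys the regressor construction.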
  48. By: Martin Møller Andreasen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper extends two optimization routines to deal with objective functions for DSGE models. The optimization routines are i) a version of Simulated Annealing developed by Corana, Marchesi & Ridella (1987), and ii) the evolutionary algorithm CMA-ES developed by Hansen, Müller & Koumoutsakos (2003). Following these extensions, we examine the ability of the two routines to maximize the likelihood function for a sequence of test economies. Our results show that the CMA-ES routine clearly outperforms Simulated Annealing in its ability to find the global optimum and in efficiency. With 10 unknown structural parameters in the likelihood function, the CMA-ES routine finds the global optimum in 95% of our test economies compared to 89% for Simulated Annealing. When the number of unknown structural parameters in the likelihood function increases to 20 and 35, then the CMA-ES routine finds the global optimum in 85% and 71% of our test economies, respectively. The corresponding numbers for Simulated Annealing are 70% and 0%.
    Keywords: CMA-ES optimization routine, Multimodal objective function, Nelder-Mead simplex routine, Non-convex search space, Resampling, Simulated Annealing
    JEL: C61 C88 E30
    Date: 2008–06–19
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-32&r=ecm
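For readers unfamiliar with the first routine above, the core idea of simulated annealing is to accept occasional uphill moves with a temperature-controlled probability so the search can escape local optima. The following is a bare-bones generic sketch, not the Corana, Marchesi & Ridella (1987) variant used in the paper; the cooling schedule and step scale are illustrative choices.

```python
import math
import random

def simulated_annealing(f, x0, steps=20000, t0=1.0, scale=0.5, seed=1):
    """Minimize f by random perturbations, accepting worse candidates
    with Boltzmann probability exp(-(increase)/temperature)."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9          # linear cooling schedule
        cand = [xi + rng.gauss(0, scale) for xi in x]
        fc = f(cand)
        # accept downhill moves always, uphill moves with probability exp(Δ/t)
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
    return best, fbest

# A standard multimodal test function (global minimum 0 at the origin).
def rastrigin(v):
    return sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in v)

xopt, fopt = simulated_annealing(rastrigin, [3.0, -2.0])
print(xopt, fopt)
```

The many local minima of a function like this are exactly what make DSGE likelihoods hard for local optimizers, which motivates the paper's comparison of global routines.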
  49. By: Tim Bollerslev; Tzuo Hann Law; George Tauchen (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We test for price discontinuities, or jumps, in a panel of high-frequency intraday returns for forty large-cap stocks and an equiweighted index from these same stocks. Jumps are naturally classified into two types: common and idiosyncratic. Common jumps affect all stocks, albeit to varying degrees, while idiosyncratic jumps are stock-specific. Despite the fact that each of the stocks has a beta of about unity with respect to the index, common jumps are virtually never detected in the individual stocks. This is truly puzzling, as an index can jump only if one or more of its components jump. To resolve this puzzle, we propose a new test for cojumps. Using this new test we find strong evidence for many modest-sized common jumps that simply pass through the standard jump detection statistic, while they appear highly significant in the cross section based on the new cojump identification scheme. Our results are further corroborated by a striking within-day pattern in the non-diversifiable cojumps.
    Keywords: risk, diversification
    JEL: C12 C32 C33 G12 G14
    Date: 2007–08–16
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-19&r=ecm
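The intuition behind a cross-sectional cojump statistic can be illustrated with the mean cross-product of returns across stocks in one intraday interval: idiosyncratic moves largely cancel in the products, while a common jump pushes all returns in the same direction and inflates the statistic. This is a sketch of the idea only, not the paper's exact test; the example returns are invented.

```python
import numpy as np

def mean_cross_product(returns):
    """Average of r_i * r_j over all ordered pairs i != j of stocks
    in a single intraday interval."""
    r = np.asarray(returns, dtype=float)   # shape: (n_stocks,)
    n = len(r)
    s = r.sum()
    # sum over i != j of r_i * r_j equals (sum r)^2 - sum r^2
    return (s * s - (r * r).sum()) / (n * (n - 1))

# Idiosyncratic moves largely cancel; a common jump does not.
idio = np.array([0.01, -0.012, 0.008, -0.009])
common = idio + 0.02                        # every stock jumps by 2%
print(mean_cross_product(idio), mean_cross_product(common))
```

A modest common jump that is too small to trigger a univariate jump test in any single stock can still dominate this cross-sectional average, which is the resolution of the puzzle the abstract describes.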
  50. By: Tom Engsted; Stig V. Møller (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We suggest an iterated GMM approach to estimate and test the consumption based habit persistence model of Campbell and Cochrane (1999), and we apply the approach to annual and quarterly Danish stock and bond returns. For comparative purposes we also estimate and test the standard CRRA model. In addition, we compare the pricing errors of the different models using Hansen and Jagannathan’s (1997) specification error measure. The main result is that for Denmark the Campbell-Cochrane model does not seem to perform markedly better than the CRRA model. For the long annual sample period covering more than 80 years there is absolutely no evidence of superior performance of the Campbell-Cochrane model. For the shorter and more recent quarterly data over a 20-30 year period, there is some evidence of counter-cyclical time-variation in the degree of risk-aversion, in accordance with the Campbell-Cochrane model, but the model does not produce lower pricing errors or more plausible parameter estimates than the CRRA model.
    Keywords: Consumption-based model, habit persistence, GMM, pricing error
    JEL: G12
    Date: 2008–02–27
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-12&r=ecm
  51. By: Jie Zhu (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: In this paper a two-component volatility model based on the component's first moment is introduced to describe the dynamics of speculative return volatility. The two components capture the volatile and persistent parts of volatility, respectively. The model is then applied to 10 Asia-Pacific stock markets, and the in-mean effects of the components on returns are tested. The empirical results show that the persistent component accounts for much more of the volatility dynamics than the volatile component. Nevertheless, the volatile component is found to be a significant pricing factor of asset returns for most markets, with a positive risk-premium effect between returns and the volatile component, while the persistent component is not significantly priced in the return dynamics.
    Keywords: Risk, Return, In-mean effect, Volatile, Persistent, Innovations
    JEL: C14 G12 G15
    Date: 2008–03–05
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-14&r=ecm
  52. By: J. BUREZ; D. VAN DEN POEL
    Abstract: Customer churn is often a rare event in service industries, but one of great interest and great value. Until recently, however, class imbalance has not received much attention in the context of data mining (Weiss, 2004). In this study, we investigate how to better handle class imbalance in churn prediction. Using more appropriate evaluation metrics (AUC, lift), we investigate the increase in performance from sampling (both random and advanced under-sampling) and from two specific modelling techniques (gradient boosting and weighted random forests) compared to some standard modelling techniques.
    AUC and lift prove to be good evaluation metrics. AUC does not depend on a threshold and is therefore a better overall evaluation metric than accuracy. Lift is closely related to accuracy, but has the advantage of being widely used in marketing practice (Ling and Li, 1998). Results show that under-sampling can lead to improved prediction accuracy, especially when evaluated with AUC. Unlike Ling and Li (1998), we find that there is no need to under-sample until there are as many churners in the training set as non-churners. Results show no increase in predictive performance from the advanced sampling technique CUBE in this study. This is in line with the findings of Japkowicz (2000), who noted that sophisticated sampling techniques did not give any clear advantage. Weighted random forests, as a cost-sensitive learner, perform significantly better than random forests and are therefore advised, although they should always be compared to logistic regression. Boosting is a very robust classifier, but never outperforms any other technique.
    Keywords: rare events, class imbalance, undersampling, oversampling, boosting, random forests, CUBE, customer churn, classifier
    Date: 2008–05
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:08/517&r=ecm
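Two of the ingredients above, threshold-free AUC and random under-sampling of the majority class, can be sketched in a few lines. This is a generic illustration, not the study's pipeline; the toy data and the `ratio` parameter are assumptions, and the AUC routine ignores tied scores for brevity.

```python
import numpy as np

def auc(scores, labels):
    """Threshold-free AUC via the Mann-Whitney rank statistic: the
    probability that a random churner outscores a random non-churner
    (ties in scores are ignored for brevity)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    order = scores.argsort()
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)   # 1-based ascending ranks
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def random_undersample(X, y, ratio=1.0, seed=0):
    """Keep all minority-class (churner) rows and a random subset of the
    majority class; ratio = majority rows kept per minority row."""
    rng = np.random.default_rng(seed)
    pos = np.flatnonzero(y == 1)
    neg = np.flatnonzero(y == 0)
    keep_neg = rng.choice(neg, size=min(len(neg), int(ratio * len(pos))),
                          replace=False)
    idx = np.concatenate([pos, keep_neg])
    return X[idx], y[idx]

y = np.array([1, 1, 0, 0, 0, 0, 0, 0])
X = np.arange(16).reshape(8, 2)
Xb, yb = random_undersample(X, y, ratio=2.0)
print(len(yb), yb.sum())                         # 6 rows, 2 churners
print(auc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]))   # perfect ranking -> 1.0
```

The study's finding that a 1:1 churner/non-churner ratio is unnecessary corresponds to choosing `ratio` larger than 1 here.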
  53. By: Christopher R. Knittel; Konstantinos Metaxoglou
    Abstract: Empirical exercises in economics frequently involve estimation of highly nonlinear models. The criterion function may not be globally concave or convex and may exhibit many local extrema. Choosing among these local extrema is non-trivial for a variety of reasons. In this paper, we analyze the sensitivity of parameter estimates, and most importantly of economic variables of interest, to both starting values and the type of non-linear optimization algorithm employed. We focus on a class of demand models for differentiated products that have been used extensively in industrial organization, and more recently in public and labor economics. We find that convergence may occur at a number of local extrema, at saddles, and in regions of the objective function where the first-order conditions are not satisfied. We find own- and cross-price elasticities that differ by a factor of over 100 depending on the set of candidate parameter estimates. In an attempt to evaluate the welfare effects of a change in an industry's structure, we undertake a hypothetical merger exercise. Our calculations indicate consumer welfare effects can vary from positive values to negative seventy billion dollars depending on the set of parameter estimates used.
    JEL: C1 C61 C81 L1 L4
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:14080&r=ecm
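The sensitivity to starting values described above is easy to reproduce with a multi-start experiment: run a purely local optimizer from several starting points on a multimodal function and compare the reported "optima". The objective and optimizer below are deliberately simple illustrations, not the differentiated-products demand objective or the algorithms studied in the paper.

```python
import math

def objective(x):
    # A deliberately multimodal criterion function (illustrative only).
    return math.sin(3 * x) + 0.1 * x * x

def local_minimize(f, x0, lr=0.01, steps=2000, h=1e-6):
    """Plain gradient descent with a central-difference derivative; it can
    only reach the local minimum of whichever basin x0 starts in."""
    x = x0
    for _ in range(steps):
        grad = (f(x + h) - f(x - h)) / (2 * h)
        x -= lr * grad
    return x

# Restarting from different points yields different converged "optima",
# so any downstream economic quantity computed from them differs too.
solutions = sorted({round(local_minimize(objective, s), 2)
                    for s in [-4.0, -2.0, 0.0, 2.0, 4.0]})
print(solutions)
```

With real demand models the problem is worse than this one-dimensional picture suggests, since convergence can also be declared at saddles or at points where the first-order conditions fail, as the abstract notes.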
  54. By: Olav Bjerkholt (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: Trygve Haavelmo spent the academic year 1938/39 at the University of Aarhus as a teacher in statistics. Immediately after his Aarhus stay he left for the United States, where he completed The Probability Approach in Econometrics (1944) and later worked at the Cowles Commission before returning to Norway in 1947. The purpose of the paper is to assess whether Haavelmo was already on a path towards the Probability Approach while in Aarhus or whether, as suggested in the history of econometrics literature, this path did not really open up until Haavelmo came to the U.S.A. and was converted to probability reasoning. The paper gives a survey of Haavelmo’s papers and other work while in Aarhus. The evidence indicates that Haavelmo had adopted probability ideas by the time he was in Aarhus and seemed well prepared to embark on his magnum opus.
    Keywords: Economic history, the probability approach in econometrics
    JEL: B23 B31
    Date: 2007–11–26
    URL: http://d.repec.org/n?u=RePEc:aah:create:2007-40&r=ecm

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.