
on Econometrics 
By:  Mark Podolskij; Daniel Ziggel (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We propose a new test for the parametric form of the volatility function in continuous time diffusion models of the type dX_t = a(t,X_t)dt + σ(t,X_t)dW_t. Our approach involves a range-based estimation of the integrated volatility and the integrated quarticity, which are used to construct the test statistic. Under rather weak assumptions on the drift and volatility we prove weak convergence of the test statistic to a centered mixed Gaussian distribution. As a consequence we obtain a test which is consistent against any fixed alternative. We also provide a test for neighborhood hypotheses. Moreover, we present a parametric bootstrap procedure which provides a better approximation of the distribution of the test statistic. Finally, it is demonstrated by means of a Monte Carlo study that the range-based test is more powerful than the return-based test when comparing at the same sampling frequency.
Keywords:  Bipower Variation, Central Limit Theorem, Diffusion Models, Goodness-of-Fit Testing, High-Frequency Data, Integrated Volatility, Range-Based Bipower Variation, Semimartingale Theory
JEL:  C12 C14 
Date:  2008–05–14 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200822&r=ecm 
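As a rough illustration of the range-based idea in the abstract above (this is a generic realized-range estimator of integrated variance, not the authors' test statistic), one can sum squared high-low ranges over intervals and rescale by the Parkinson constant 4·log 2, which is exact for a continuously observed driftless Brownian path; the volatility level and block count below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def realized_range_variance(path, n_intervals):
    # Sum of squared high-low ranges per interval, scaled by the
    # Parkinson constant 4*log(2) (exact for continuously observed
    # driftless Brownian motion; discrete sampling biases it slightly
    # downward).
    blocks = np.array_split(path, n_intervals)
    ranges = np.array([b.max() - b.min() for b in blocks])
    return np.sum(ranges ** 2) / (4.0 * np.log(2.0))

# Brownian path with constant volatility sigma on [0, 1], so the
# integrated variance is sigma**2 = 0.04.
sigma, n = 0.2, 200_000
dt = 1.0 / n
path = np.cumsum(sigma * np.sqrt(dt) * rng.standard_normal(n))
iv_range = realized_range_variance(path, n_intervals=100)
```

The estimate should land close to sigma**2; with stochastic volatility the same sum converges to the integrated variance, which is the quantity the paper's test builds on.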
By:  Mark Podolskij; Mathias Vetter (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We propose a new concept of modulated bipower variation for diffusion models with microstructure noise. We show that this method provides simple estimates for such important quantities as integrated volatility or integrated quarticity. Under mild conditions the consistency of modulated bipower variation is proven. Under further assumptions we prove stable convergence of our estimates with the optimal rate n^(1/4). Moreover, we construct estimates which are robust to finite activity jumps.
Keywords:  Bipower Variation, Central Limit Theorem, Finite Activity Jumps, HighFrequency Data, Integrated Volatility, Microstructure Noise, Semimartingale Theory, Subsampling 
JEL:  C10 C13 C14 
Date:  2007–09–19 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200727&r=ecm 
By:  Mark Podolskij; Daniel Ziggel (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  In this paper we propose a test to determine whether jumps are present in a discretely sampled process or not. We use the concept of truncated power variation to construct our test statistics for (i) semimartingale models and (ii) semimartingale models with noise. The test statistics converge to infinity if jumps are present and have a normal distribution otherwise. Our method is valid (under very weak assumptions) for all semimartingales with absolutely continuous characteristics and a rather general model for the noise process. We finally implement the test and present the simulation results. Our simulations suggest that for semimartingale models the new test is much more powerful than the tests proposed by Barndorff-Nielsen and Shephard (2006) and Aït-Sahalia and Jacod (2008).
Keywords:  Central Limit Theorem, HighFrequency Data, Microstructure Noise, Semimartingale Theory, Tests for Jumps, Truncated Power Variation 
JEL:  C10 C13 C14 
Date:  2008–06–20 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200834&r=ecm 
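The core mechanism behind truncated power variation can be sketched as follows (a simplified illustration, not the paper's studentized test statistic): increments larger than a threshold of order sqrt(dt) are attributed to jumps and discarded, so the truncated sum stays close to the integrated variance while the plain realized variance picks up the squared jump; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def truncated_rv(returns, u):
    # Realized variance computed from increments below the threshold u
    # only; large (jump-sized) increments are discarded.
    return np.sum(returns[np.abs(returns) <= u] ** 2)

n = 10_000
dt = 1.0 / n
sigma = 0.3
returns = sigma * np.sqrt(dt) * rng.standard_normal(n)  # diffusive part
returns[n // 2] += 0.5                                  # add one jump

u = 4 * sigma * np.sqrt(dt)     # threshold shrinking like sqrt(dt)
rv = np.sum(returns ** 2)       # picks up the squared jump (~0.25)
trv = truncated_rv(returns, u)  # stays near the integrated variance 0.09
```

A test in this spirit compares rv and trv: under no jumps their (scaled) difference is asymptotically normal, while a jump drives it to infinity as dt shrinks.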
By:  Silja Kinnebrock; Mark Podolskij (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper introduces a new estimator to measure the ex-post covariation between high-frequency financial time series under market microstructure noise. We provide an asymptotic limit theory (including feasible central limit theorems) for standard methods such as regression, correlation analysis and covariance, for which we obtain the optimal rate of convergence. We demonstrate some positive semidefinite estimators of the covariation and construct a positive semidefinite estimator of the conditional covariance matrix in the central limit theorem. Furthermore, we indicate how the assumptions on the noise process can be relaxed and how our method can be applied to non-synchronous observations. We also present an empirical study of how high-frequency correlations, regressions and covariances change through time.
Keywords:  Central Limit Theorem, Diffusion Models, Market Microstructure Noise, Non-synchronous Trading, High-Frequency Data, Semimartingale Theory
Date:  2008–05–16 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200823&r=ecm 
By:  Michael Sørensen; Julie Lyng Forman (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  The Pearson diffusions form a flexible class of diffusions defined by having linear drift and quadratic squared diffusion coefficient. It is demonstrated that for this class explicit statistical inference is feasible. Explicit optimal martingale estimating functions are found, and the corresponding estimators are shown to be consistent and asymptotically normal. The discussion covers GMM, quasi-likelihood, and nonlinear weighted least squares estimation too, and it is discussed how explicit likelihood or approximate likelihood inference is possible for the Pearson diffusions. A complete model classification is presented for the ergodic Pearson diffusions. The class of stationary distributions equals the full Pearson system of distributions. Well-known instances are the Ornstein-Uhlenbeck processes and the square root (CIR) processes. Also diffusions with heavy-tailed and skew marginals are included. Special attention is given to a skew t-type distribution. Explicit formulae for the conditional moments and the polynomial eigenfunctions are derived. The analytical tractability is inherited by transformed Pearson diffusions, integrated Pearson diffusions, sums of Pearson diffusions, and stochastic volatility models with Pearson volatility process. For the non-Markov models explicit optimal prediction-based estimating functions are found and shown to yield consistent and asymptotically normal estimators.
Keywords:  eigenfunction, ergodic diffusion, integrated diffusion, martingale estimating function, likelihood inference, mixing, optimal estimating function, Pearson system, prediction-based estimating function, quasi-likelihood, spectral methods, stochastic differential equation, stochastic volatility
JEL:  C22 C51 
Date:  2007–09–27 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200728&r=ecm 
By:  Mark Podolskij; Mathias Vetter (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We consider a new class of estimators for volatility functionals in the setting of frequently observed Itô diffusions which are disturbed by i.i.d. noise. These statistics extend the approach of pre-averaging as a general method for the estimation of the integrated volatility in the presence of microstructure noise and are closely related to the original concept of bipower variation in the no-noise case. We show that this approach provides efficient estimators for a large class of integrated powers of volatility and prove the associated (stable) central limit theorems. In a more general Itô semimartingale framework this method can be used to define both estimators for the entire quadratic variation of the underlying process and jump-robust estimators which are consistent for various functionals of volatility. As a by-product we obtain a simple test for the presence of jumps in the underlying semimartingale.
Keywords:  Bipower Variation, Central Limit Theorem, High-Frequency Data, Microstructure Noise, Quadratic Variation, Semimartingale Theory, Test for Jumps
JEL:  C10 C13 C14 
Date:  2008–05–26 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200825&r=ecm 
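The intuition behind pre-averaging can be sketched in a deliberately simplified form (block means rather than the paper's weighted windows, and without its exact bias-correction constants — all parameter values are illustrative assumptions): averaging K ~ sqrt(n) noisy prices before differencing shrinks the noise contribution from order n·ω² down to a negligible term, after which a known scaling recovers the integrated variance:

```python
import numpy as np

rng = np.random.default_rng(2)

n = 100_000
dt = 1.0 / n
sigma, omega = 0.2, 0.001    # volatility and noise std (illustrative)
X = np.cumsum(sigma * np.sqrt(dt) * rng.standard_normal(n))
Y = X + omega * rng.standard_normal(n)   # observed noisy prices

# Naive realized variance explodes: E[rv] ~ sigma**2 + 2*n*omega**2.
rv_naive = np.sum(np.diff(Y) ** 2)

# Pre-average: block means over K ~ sqrt(n) observations.
K = int(np.sqrt(n))
means = Y[: (n // K) * K].reshape(-1, K).mean(axis=1)
pre_rets = np.diff(means)
# For differences of adjacent block means of a Brownian path,
# Var = (2/3)*sigma**2*K*dt + 2*omega**2/K, so rescale by 3/2 and
# subtract an estimated (here tiny) noise term.
omega2_hat = rv_naive / (2 * n)
iv_hat = 1.5 * (np.sum(pre_rets ** 2)
                - len(pre_rets) * 2 * omega2_hat / K)
```

The naive rv_naive is dominated by noise (about 0.24 here versus a true integrated variance of 0.04), while iv_hat is close to 0.04; the paper's estimators refine this idea with general weight functions and achieve the optimal n^(1/4) rate with a full limit theory.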
By:  Kulan Ranasinghe; Mervyn J. Silvapulle 
Abstract:  This paper proposes a semiparametric method for estimating duration models when there are inequality constraints on some parameters and the error distribution may be unknown. Thus, the setting considered here is particularly suitable for practical applications. The parameters in duration models are usually estimated by a quasi-MLE (QMLE). Recent advances show that a semiparametrically efficient estimator (SPE) has better asymptotic optimality properties than the QMLE provided that the parameter space is unrestricted. However, in several important duration models the parameter space is restricted; for example, in the commonly used linear duration model some parameters are nonnegative. In such cases, the SPE may turn out to be outside the allowed parameter space and hence be unsuitable for use. To overcome this difficulty, we propose a new constrained semiparametric estimator. In a simulation study involving duration models with inequality constraints on parameters, the new estimator proposed in this paper performed better than its competitors. An empirical example is provided to illustrate the application of the new constrained semiparametric estimator and to show how it overcomes difficulties encountered when the unconstrained estimator of a nonnegative parameter turns out to be negative.
Keywords:  Adaptive inference; Conditional duration model; Constrained inference; Efficient semiparametric estimation; Order restricted inference; Semiparametric efficiency bound. 
JEL:  C41 C14 
Date:  2008–06 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20085&r=ecm 
By:  Mathias D. Cattaneo; Richard K. Crump; Michael Jansson (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper is concerned with inference on the coefficient on the endogenous regressor in a linear instrumental variables model with a single endogenous regressor, nonrandom exogenous regressors and instruments, and i.i.d. errors whose distribution is unknown. It is shown that under mild smoothness conditions on the error distribution it is possible to develop tests which are “nearly” efficient when identification is weak and consistent and asymptotically optimal when identification is strong. In addition, an estimator is presented which can be used in the usual way to construct valid (indeed, optimal) confidence intervals when identification is strong. The estimator is of the two-stage least squares variety and is asymptotically efficient under strong identification whether or not the errors are normal.
Keywords:  Instrumental variables regression, weak instruments, adaptive estimation 
JEL:  C14 C31 
Date:  2007–06–25 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200711&r=ecm 
By:  Per Frederiksen; Morten Ørregaard Nielsen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We propose to use a variant of the local polynomial Whittle estimator to estimate the memory parameter in volatility for long memory stochastic volatility models with potential nonstationarity in the volatility process. We show that the estimator is asymptotically normal and capable of obtaining bias reduction as well as a rate of convergence arbitrarily close to the parametric rate, n^(1/2). A Monte Carlo study is conducted to support the theoretical results, and an analysis of daily exchange rates demonstrates the empirical usefulness of the estimators.
Keywords:  Bias reduction, local Whittle estimation, long memory stochastic volatility model 
JEL:  C14 C22 
Date:  2008–06–24 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200835&r=ecm 
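The baseline this paper refines is Robinson's local Whittle estimator, which can be sketched as follows (the plain estimator with a grid search, not the paper's polynomial-corrected variant; the bandwidth m, memory parameter d = 0.3 and sample size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

def arfima_0d0(n, d, rng):
    # Simulate ARFIMA(0,d,0) via the truncated MA(inf) expansion of
    # (1-L)^(-d), whose coefficients satisfy psi_k = psi_{k-1}*(k-1+d)/k.
    k = np.arange(1, n)
    psi = np.concatenate(([1.0], np.cumprod((k - 1 + d) / k)))
    eps = rng.standard_normal(2 * n)
    return np.convolve(eps, psi)[n:2 * n]   # discard burn-in

def local_whittle_d(x, m):
    # Minimize the local Whittle objective over d using the first m
    # Fourier frequencies of the periodogram.
    n = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
    def obj(d):
        return (np.log(np.mean(lam ** (2 * d) * I))
                - 2 * d * np.mean(np.log(lam)))
    grid = np.linspace(-0.49, 0.99, 297)
    return grid[np.argmin([obj(d) for d in grid])]

x = arfima_0d0(5000, d=0.3, rng=rng)
d_hat = local_whittle_d(x, m=300)
```

The estimate d_hat should be close to the true memory parameter 0.3; the LPWN estimator replaces the locally constant spectrum approximation with polynomials for both the signal's short-memory component and the perturbation.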
By:  Michael Sørensen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  A review is given of parametric estimation methods for discretely sampled multivariate diffusion processes. The main focus is on estimating functions and asymptotic results. Maximum likelihood estimation is briefly considered, but the emphasis is on computationally less demanding martingale estimating functions. Particular attention is given to explicit estimating functions. Results on both fixed frequency and high frequency asymptotics are given. When choosing among the many estimators available, guidance is provided by simple criteria for high frequency efficiency and rate optimality that are presented in the framework of approximate martingale estimating functions.
Keywords:  Asymptotic results, discrete time observation of a diffusion, efficiency, eigenfunctions, explicit inference, generalized method of moments, likelihood inference, martingale estimating functions, high frequency asymptotics, Pearson diffusions.
JEL:  C22 C32 
Date:  2008–04–04 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200818&r=ecm 
By:  Hall, Alastair R.; Han, Sanggohn; Boldea, Otilia 
Abstract:  In this paper, we extend Bai and Perron’s (1998, Econometrica, pp. 47–78) framework for multiple break testing to linear models estimated via Two Stage Least Squares (2SLS). Within our framework, the break points are estimated simultaneously with the regression parameters via minimization of the residual sum of squares in the second step of the 2SLS estimation. We establish the consistency of the resulting estimated break point fractions. We show that various F-statistics for structural instability based on the 2SLS estimator have the same limiting distribution as the analogous statistics for OLS considered by Bai and Perron (1998). This allows us to extend Bai and Perron’s (1998) sequential procedure for selecting the number of break points to the 2SLS setting. Our methods also allow for structural instability in the reduced form that has been identified a priori using data-based methods. As an empirical illustration, our methods are used to assess the stability of the New Keynesian Phillips curve.
Keywords:  unknown break points; structural change; instrumental variables; endogenous regressors; structural stability tests; new Keynesian Phillips curve 
JEL:  C13 C32 C12 C22 C01 
Date:  2008–06–20 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:9251&r=ecm 
By:  Kortelainen, Mika 
Abstract:  A number of studies have explored the semi- and nonparametric estimation of stochastic frontier models by using kernel regression or other nonparametric smoothing techniques. In contrast to popular deterministic nonparametric estimators, these approaches do not allow one to impose any shape constraints (or regularity conditions) on the frontier function. On the other hand, as many of the previous techniques are based on the nonparametric estimation of the frontier function, the convergence rate of frontier estimators can be sensitive to the number of inputs, which is generally known as “the curse of dimensionality” problem. This paper proposes a new semiparametric approach for stochastic frontier estimation that avoids the curse of dimensionality and allows one to impose shape constraints on the frontier function. Our approach is based on the single-index model and applies both single-index estimation techniques and shape-constrained nonparametric least squares. In addition to production frontier and technical efficiency estimation, we show how the technique can be used to estimate pollution generating technologies. The new approach is illustrated by an empirical application to the environmentally adjusted performance evaluation of U.S. coal-fired electric power plants.
Keywords:  stochastic frontier analysis (SFA); nonparametric least squares; single-index model; sliced inverse regression; monotone rank correlation estimator; environmental efficiency
JEL:  C51 Q52 C14 D24 
Date:  2008–06–20 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:9257&r=ecm 
By:  Per Frederiksen; Frank S. Nielsen; Morten Ørregaard Nielsen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We propose a semiparametric local polynomial Whittle with noise (LPWN) estimator of the memory parameter in long memory time series perturbed by a noise term which may be serially correlated. The estimator approximates the spectrum of the perturbation as well as that of the short-memory component of the signal by two separate polynomials. Furthermore, an empirical investigation of the 30 DJIA stocks shows that this estimator indicates stronger persistence in volatility than the standard local Whittle estimator.
Keywords:  Bias reduction, local Whittle, long memory, perturbed fractional process, semiparametric estimation, stochastic volatility 
JEL:  C22 
Date:  2008–06–09 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200829&r=ecm 
By:  Søren Johansen; Bent Nielsen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  An algorithm suggested by Hendry (1999) for estimation in a regression with more regressors than observations is analyzed with the purpose of finding an estimator that is robust to outliers and structural breaks. This estimator is an example of a one-step M-estimator based on Huber's skip function. The asymptotic theory is derived in the situation where there are no outliers or structural breaks using empirical process techniques. Stationary processes, trend stationary autoregressions and unit root processes are considered.
Keywords:  Empirical processes, Huber's skip, indicator saturation, M-estimator, outlier robustness, vector autoregressive process
JEL:  C32 
Date:  2008–02–05 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200809&r=ecm 
By:  Carmen Broto (Banco de España); Esther Ruiz (Universidad Carlos III de Madrid) 
Abstract:  In this paper we propose a model for monthly inflation with stochastic trend, seasonal and transitory components with QGARCH disturbances. This model distinguishes whether the longrun or shortrun components are heteroscedastic. Furthermore, the uncertainty associated with these components may increase with the level of inflation as postulated by Friedman. We propose to use the differences between the autocorrelations of squares and the squared autocorrelations of the auxiliary residuals to identify heteroscedastic components. We show that conditional heteroscedasticity truly present in the data can be rejected when looking at the correlations of standardized residuals while the autocorrelations of auxiliary residuals have more power to detect conditional heteroscedasticity. Furthermore, the proposed statistics can help to decide which component is heteroscedastic. Their finite sample performance is compared with that of a Lagrange Multiplier test by means of Monte Carlo experiments. Finally, we use auxiliary residuals to detect conditional heteroscedasticity in monthly inflation series of eight OECD countries. 
Keywords:  Leverage effect, QGARCH, seasonality, structural time series models, unobserved component 
JEL:  C22 C52 E31 
Date:  2008–06 
URL:  http://d.repec.org/n?u=RePEc:bde:wpaper:0812&r=ecm 
By:  Jean Jacod; Yingying Li; Per A. Mykland; Mark Podolskij; Mathias Vetter (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper presents a generalized pre-averaging approach for estimating the integrated volatility. This approach also provides consistent estimators of other powers of volatility – in particular, it gives feasible ways to consistently estimate the asymptotic variance of the estimator of the integrated volatility. We show that our approach, which possesses an intuitive transparency, can generate rate optimal estimators (with convergence rate n^(1/4)).
Keywords:  consistency, continuity, discrete observation, Itô process, leverage effect, pre-averaging, quarticity, realized volatility, stable convergence
JEL:  C10 C13 C14 
Date:  2007–12–10 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200743&r=ecm 
By:  Jens Perch Nielsen; Carsten Tanggaard; M.C. Jones (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  A class of local linear kernel density estimators based on weighted least squares kernel estimation is considered within the framework of Aalen’s multiplicative intensity model. This model includes the filtered data model that, in turn, allows for truncation and/or censoring in addition to accommodating unusual patterns of exposure as well as occurrence. It is shown that the local linear estimators corresponding to all different weightings have the same pointwise asymptotic properties. However, the weighting previously used in the literature in the i.i.d. case is seen to be far from optimal when it comes to exposure robustness, and a simple alternative weighting is to be preferred. Indeed, this weighting has, effectively, to be well chosen in a ‘pilot’ estimator of the survival function as well as in the main estimator itself. We also investigate multiplicative and additive bias correction methods within our framework. The multiplicative bias correction method proves to be best in a simulation study comparing the performance of the considered estimators. An example concerning old age mortality demonstrates the importance of the improvements provided.
Keywords:  Aalen’s multiplicative model, additive bias correction, censoring, counting processes, exposure robustness, kernel density estimation, multiplicative bias correction, old age mortality 
Date:  2007–06–14 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200713&r=ecm 
By:  Michael Jansson (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper derives asymptotic power envelopes for tests of the unit root hypothesis in a zeromean AR(1) model. The power envelopes are derived using the limits of experiments approach and are semiparametric in the sense that the underlying error distribution is treated as an unknown infinitedimensional nuisance parameter. Adaptation is shown to be possible when the error distribution is known to be symmetric and to be impossible when the error distribution is unrestricted. In the latter case, two conceptually distinct approaches to nuisance parameter elimination are employed in the derivation of the semiparametric power bounds. One of these bounds, derived under an invariance restriction, is shown by example to be sharp, while the other, derived under a similarity restriction, is conjectured not to be globally attainable. 
Keywords:  Unit root testing, semiparametric efficiency 
JEL:  C14 C22 
Date:  2007–06–25 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200712&r=ecm 
By:  Michael Sørensen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  A general theory of efficient estimation for ergodic diffusions sampled at high frequency is presented. High frequency sampling is now possible in many applications, in particular in finance. The theory is formulated in terms of approximate martingale estimating functions and covers a large class of estimators including most of the previously proposed estimators for diffusion processes, for instance GMM estimators and the maximum likelihood estimator. Simple conditions are given that ensure rate optimality, where estimators of parameters in the diffusion coefficient converge faster than estimators of parameters in the drift coefficient, and efficiency. The conditions turn out to be equal to those implying small delta-optimality in the sense of Jacobsen and thus give an interpretation of this concept in terms of classical statistical concepts. Optimal martingale estimating functions in the sense of Godambe and Heyde are shown to give rate optimal and efficient estimators under weak conditions.
Keywords:  Approximate martingale estimating functions, discrete time observation of a diffusion, efficiency, Euler approximation, generalized method of moments, optimal estimating function, optimal rate, small delta-optimality
JEL:  C22 C32 
Date:  2008–01–22 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200746&r=ecm 
By:  Anirban Basu; Daniel Polsky; Willard G. Manning 
Abstract:  Under the assumption of no unmeasured confounders, a large literature exists on methods that can be used to estimate average treatment effects (ATE) from observational data, spanning regression models, propensity score adjustments using stratification, weighting or regression, and even the combination of both as in doubly-robust estimators. However, comparison of these alternative methods is sparse in the context of data generated via nonlinear models where treatment effects are heterogeneous, such as in the case of healthcare cost data. In this paper, we compare the performance of alternative regression and propensity score-based estimators in estimating average treatment effects on outcomes that are generated via nonlinear models. Using simulations, we find that in moderate-size samples (n = 5000), balancing on estimated propensity scores balances the covariate means across treatment arms but fails to balance higher-order moments and covariances amongst covariates, raising concern about its use in nonlinear outcome-generating mechanisms. We also find that, besides inverse-probability weighting (IPW) with propensity scores, no one estimator is consistent under all data generating mechanisms. The IPW estimator is itself prone to inconsistency due to misspecification of the model for estimating propensity scores. Even when it is consistent, the IPW estimator is usually extremely inefficient. Thus care should be taken before naively applying any one estimator to estimate ATE in these data. We develop a recommendation for an algorithm which may help applied researchers to arrive at the optimal estimator. We illustrate the application of this algorithm and also the performance of alternative methods in a cost dataset on breast cancer treatment.
JEL:  C01 C21 I10 
Date:  2008–06 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:14086&r=ecm 
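The IPW estimator discussed in the abstract can be sketched on simulated data with a nonlinear (exponential) outcome model; everything below — the data generating process, the parameter values, and the use of the true propensity score for clarity (in practice it would be estimated, e.g. by logit) — is an illustrative assumption, not the paper's simulation design:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated observational data: both the treatment probability and
# the (exponential, hence nonlinear) outcome depend on covariate x,
# so the naive difference in means is confounded.
n = 20_000
x = rng.standard_normal(n)
p = 1.0 / (1.0 + np.exp(-x))              # true propensity score
t = (rng.uniform(size=n) < p).astype(float)
y = np.exp(0.5 * x + 0.4 * t) + 0.1 * rng.standard_normal(n)

# IPW: weight treated outcomes by 1/p and controls by 1/(1-p).
ate_ipw = np.mean(t * y / p) - np.mean((1 - t) * y / (1 - p))

# Confounded benchmark: raw difference in group means.
naive = y[t == 1].mean() - y[t == 0].mean()
```

Here the true ATE is (e^0.4 - 1)·E[e^(0.5x)] ≈ 0.557; ate_ipw recovers it while naive is badly biased upward, illustrating why weighting (with correctly specified scores) matters in nonlinear settings.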
By:  Ingmar Nolte; Valeri Voev (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We propose a unified framework for estimating integrated variances and covariances based on simple OLS regressions, allowing for a general market microstructure noise specification. We show that our estimators can outperform, in terms of the root mean squared error criterion, the most recent and commonly applied estimators, such as the realized kernels of Barndorff-Nielsen, Hansen, Lunde & Shephard (2006), the two-scales realized variance of Zhang, Mykland & Aït-Sahalia (2005), the Hayashi & Yoshida (2005) covariance estimator, and the realized variance and covariance with the optimal sampling frequency derived in Bandi & Russell (2005a) and Bandi & Russell (2005b). For a realistic trading scenario, the efficiency gains resulting from our approach are in the range of 35% to 50%.
Keywords:  High frequency data, Realized volatility and covariance, Market microstructure 
JEL:  G10 F31 C32 
Date:  2008–06–10 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200831&r=ecm 
By:  D. Kuang (Department of Statistics, University of Oxford); Bent Nielsen (Nuffield College, Oxford University); J. P. Nielsen (Cass Business School) 
Abstract:  We consider forecasting from age-period-cohort models, as well as from the extended chain-ladder model. The parameters of these models are known only to be identified up to linear trends. Forecasts from such models may therefore depend on arbitrary linear trends. A condition for invariant forecasts is proposed. A number of standard forecast models are analysed.
Keywords:  Age-period-cohort model; Chain-ladder model; Forecasting; Identification.
Date:  2008–06–16 
URL:  http://d.repec.org/n?u=RePEc:nuf:econwp:0809&r=ecm 
By:  Martin Møller Andreasen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper shows how nonlinear DSGE models with potentially non-normal shocks can be estimated by Quasi-Maximum Likelihood based on the Central Difference Kalman Filter (CDKF). The advantage of this estimator is that evaluating the quasi log-likelihood function only takes a fraction of a second. The second contribution of this paper is to derive a new particle filter which we term the Mean Shifted Particle Filter (MSPFb). We show that the MSPFb outperforms the standard Particle Filter by delivering more precise state estimates, and in general the MSPFb has lower Monte Carlo variation in the reported log-likelihood function.
Keywords:  Multivariate Stirling interpolation, Particle filtering, Nonlinear DSGE models, Non-normal shocks, Quasi-maximum likelihood
JEL:  C13 C15 E10 E32 
Date:  2008–06–20 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200833&r=ecm 
By:  Dennis Kristensen; Anders Rahbek (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We consider a class of vector nonlinear error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties of the process in terms of stochastic and deterministic trends as well as stationary components. In particular, the behaviour of the cointegrating relations is described in terms of geometric ergodicity. Despite the fact that no deterministic terms are included, the process will have both stochastic trends and a linear trend in general. Gaussian likelihood-based estimators are considered for the long run cointegration parameters and the short-run parameters. Asymptotic theory is provided for these, and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study reveals that cointegration vectors and the shape of the adjustment are quite accurately estimated by maximum likelihood, while at the same time there is very little information about some of the individual parameters entering the adjustment function.
Date:  2007–11–19 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200738&r=ecm 
By:  Frank S. Nielsen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper extends the local polynomial Whittle estimator of Andrews & Sun (2004) to fractionally integrated processes covering stationary and nonstationary regions. We utilize the notion of the extended discrete Fourier transform and periodogram to extend the local polynomial Whittle estimator to the nonstationary region. By approximating the short-run component of the spectrum by a polynomial, instead of a constant, in a shrinking neighborhood of zero we alleviate some of the bias that the classical local Whittle estimator is prone to. A simulation study illustrates the performance of the proposed estimator compared to the classical local Whittle estimator and the local polynomial Whittle estimator. The empirical justification of the proposed estimator is shown through an analysis of credit spreads.
Keywords:  Bias reduction, fractional integration, local polynomial, local Whittle estimation, long memory. 
JEL:  C22 
Date:  2008–06–02 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200828&r=ecm 
By:  Almut Veraart (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  Recent research has focused on modelling asset prices by Itô semimartingales. In such a modelling framework, the quadratic variation consists of a continuous and a jump component. This paper is about inference on the jump part of the quadratic variation, which can be estimated by the difference of realised variance and realised multipower variation. The main contribution of this paper is twofold. First, it provides a bivariate asymptotic limit theory for realised variance and realised multipower variation in the presence of jumps. Second, this paper presents new, consistent estimators for the jump part of the asymptotic variance of the estimation bias. Eventually, this leads to a feasible asymptotic theory which is applicable in practice. Finally, Monte Carlo studies reveal a good finite sample performance of the proposed feasible limit theory. 
Keywords:  Quadratic variation, Itô semimartingale, stochastic volatility, jumps, realised variance, realised multipower variation, high-frequency data
JEL:  C13 C14 G10 G12 
Date:  2008–03–31 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200817&r=ecm 
By:  Ole E. BarndorffNielsen; José Manuel Corcuera; Mark Podolskij (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We develop the asymptotic theory for the realised power variation of the processes X = f • G, where G is a Gaussian process with stationary increments. More specifically, under some mild assumptions on the variance function of the increments of G and a certain regularity condition on the path of the process f, we prove convergence in probability for the properly normalised realised power variation. Moreover, under a further assumption on the Hölder index of the path of f, we show an associated stable central limit theorem. The main tool is a general central limit theorem, due essentially to Hu & Nualart (2005), Nualart & Peccati (2005) and Peccati & Tudor (2005), for sequences of random variables which admit a chaos representation. 
Keywords:  Central Limit Theorem, Chaos Expansion, Gaussian Processes, High-Frequency Data, Multiple Wiener-Itô Integrals, Power Variation 
JEL:  C10 C13 C14 
Date:  2007–12–07 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200742&r=ecm 
By:  Elena Biewen; Sandra Nolte; Martin Rosemann 
Abstract:  Whereas additive measurement error has received considerable treatment in the literature, less work has been done on multiplicative noise. In this paper we concentrate on multiplicative measurement error in the covariates, which, contrary to additive error, not only modifies the original value proportionally but also preserves structural zeros. This paper compares three variants of specifying the multiplicative measurement error model in the simulation step of the Simulation-Extrapolation (SIMEX) method originally proposed by Cook and Stefanski (1994): i) as an additive model without using a logarithmic transformation, ii) as the well-known logarithmic transformation of the multiplicative error model, and iii) as an approach using the multiplicative measurement error model as such. The aim of the paper is to analyze how well these three approaches reduce the bias caused by the multiplicative measurement error. We apply the three variants to the case of data masking by multiplicative measurement error, in order to obtain parameter estimates of the true data generating process. We produce Monte Carlo evidence on how the reduction of data quality can be minimized. 
Keywords:  Errors-in-variables in nonlinear models, disclosure limitation methods, multiplicative error 
JEL:  C13 C21 
Date:  2008–01 
URL:  http://d.repec.org/n?u=RePEc:iaw:iawdip:39&r=ecm 
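The SIMEX idea for multiplicative error (variant ii above, working additively on the log scale) can be sketched on simulated data. Everything here is an illustrative assumption: the sample size, noise level, number of replications, and the quadratic extrapolation rule.

```python
import numpy as np

rng = np.random.default_rng(1)
n, b, sig = 20000, 2.0, 0.4

x = np.exp(rng.normal(0.0, 0.5, n))           # true covariate
u = np.exp(rng.normal(-sig**2 / 2, sig, n))   # multiplicative error, E[u] = 1
w = x * u                                     # observed, masked covariate
y = b * x + rng.normal(0.0, 0.1, n)

def slope(z, resp):
    # OLS slope of resp on z
    return np.cov(z, resp)[0, 1] / np.var(z, ddof=1)

naive = slope(w, y)  # attenuated by the measurement error

# SIMEX simulation step: add extra mean-one multiplicative noise at
# levels lam > 0 (additive on the log scale), average over replications
lams = np.array([0.5, 1.0, 1.5, 2.0])
est = []
for lam in lams:
    extra = [slope(w * np.exp(rng.normal(-lam * sig**2 / 2,
                                         np.sqrt(lam) * sig, n)), y)
             for _ in range(20)]
    est.append(np.mean(extra))

# extrapolation step: fit a quadratic in lam and evaluate at lam = -1
coef = np.polyfit(np.concatenate([[0.0], lams]),
                  np.concatenate([[naive], est]), 2)
simex = np.polyval(coef, -1.0)
```

The naive slope is pulled toward zero by the error; the extrapolated SIMEX estimate moves it back toward the true coefficient.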
By:  Meenagh, David; Minford, Patrick; Theodoridis, Konstantinos 
Abstract:  We use the method of indirect inference to test a full open economy model of the UK that has been in forecasting use for three decades. The test establishes, using a Wald statistic, whether the parameters of a time-series representation estimated on the actual data lie within some confidence interval of the model-implied distribution. Various forms of time-series representations that could deal with the UK's various changes of monetary regime are tried; two are retained as adequate. The model is rejected under one but marginally accepted under the other, suggesting that with some modifications it could achieve general acceptability and that the testing method is worth investigating further. 
Keywords:  Bootstrap; Indirect inference; Model evaluation; Nonlinear Time Series Models; Open economy models; UK models 
JEL:  C12 C32 
Date:  2008–06 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:6849&r=ecm 
By:  Peter Reinhard Hansen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We derive an identity for the determinant of a product involving non-square matrices. The identity can be used to derive the maximum likelihood estimator in reduced-rank regressions with Gaussian innovations. Furthermore, the identity sheds light on the structure of the estimation problem that arises when the reduced-rank parameters are subject to additional constraints. 
Keywords:  Determinant Identity, Reduced Rank Regression, Least Squares 
JEL:  C3 C32 
Date:  2008–01–15 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200802&r=ecm 
By:  Alexandr Kuchynka (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic; Faculty of Economics, University of West Bohemia in Pilsen; Institute of Information Theory and Automation of the ASCR) 
Abstract:  This paper focuses on the extraction of volatility of financial returns. The volatility process is modeled as a superposition of two autoregressive processes which represent the more persistent factor and the quickly mean-reverting factor. As the volatility is not observable, the logarithm of the daily high-low range is employed as its proxy. The estimation of parameters and volatility extraction are performed using a modified version of the Kalman filter which takes into account the finite sample distribution of the proxy. 
Keywords:  volatility, stochastic volatility models, Kalman filter, volatility proxy 
JEL:  C22 G15 
Date:  2008–06 
URL:  http://d.repec.org/n?u=RePEc:fau:wpaper:wp2008_10&r=ecm 
By:  Torben G. Andersen; Tim Bollerslev; Per Houmann Frederiksen; Morten Ørregaard Nielsen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We provide an empirical framework for assessing the distributional properties of daily speculative returns within the context of the continuous-time modeling paradigm traditionally used in asset pricing finance. Our approach builds directly on recently developed realized variation measures and nonparametric jump detection statistics constructed from high-frequency intraday data. A sequence of relatively simple-to-implement moment-based tests involving various transforms of the daily returns speak directly to the importance of different features of the underlying continuous-time processes that might have generated the data. As such, the tests may serve as a useful diagnostic tool in the specification of empirically more realistic asset pricing models. Our results are also directly related to the popular mixture-of-distributions hypothesis and the role of the corresponding latent information arrival process. On applying our sequential test procedure to the thirty individual stocks in the Dow Jones Industrial Average index, the data suggest that it is important to allow for both time-varying diffusive volatility, jumps, and leverage effects in order to satisfactorily describe the daily stock price dynamics. At a broader level, the empirical results also illustrate how the realized variation measures and high-frequency sampling schemes may be used in eliciting important distributional features and asset pricing implications more generally. 
Keywords:  Return distributions, continuous-time models, mixture-of-distributions hypothesis, financial-time sampling, high-frequency data, volatility signature plots, realized volatilities, jumps, leverage and volatility feedback effects 
JEL:  C1 G1 
Date:  2007–08–16 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200721&r=ecm 
By:  Matias D. Cattaneo; Richard K. Crump; Michael Jansson (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper proposes (apparently) novel standard error formulas for the density-weighted average derivative estimator of Powell, Stock, and Stoker (1989). Asymptotic validity of the standard errors developed in this paper does not require the use of higher-order kernels, and the standard errors are "robust" in the sense that they accommodate (but do not require) bandwidths that are smaller than those for which conventional standard errors are valid. Moreover, the results of a Monte Carlo experiment suggest that the finite sample coverage rates of confidence intervals constructed using the standard errors developed in this paper coincide (approximately) with the nominal coverage rates across a nontrivial range of bandwidths. 
Keywords:  Semiparametric estimation, density-weighted average derivatives 
JEL:  C14 C21 
Date:  2008–05–20 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200824&r=ecm 
By:  Martin Burda; Roman Liesenfeld; JeanFrancois Richard 
Abstract:  In this paper, we perform Bayesian analysis of a panel probit model with unobserved individual heterogeneity and serially correlated errors. We augment the data with latent variables and sample the unobserved heterogeneity component as one Gibbs block per individual using a flexible piecewise linear approximation to the marginal posterior density. The latent time effects are simulated as another Gibbs block. For this purpose we develop a new user-friendly form of the Efficient Importance Sampling proposal density for an Acceptance-Rejection Metropolis-Hastings step. We apply our method to the analysis of product innovation activity of a panel of German manufacturing firms in response to imports, foreign direct investment and other control variables. The dataset used here was analyzed under more restrictive assumptions by Bertschek and Lechner (1998) and Greene (2004). Although our results differ to some degree from these benchmark studies, we confirm the positive effect of imports and FDI on firms' innovation activity. Moreover, unobserved firm heterogeneity is shown to play a far more significant role in the application than the latent time effects. 
Keywords:  Dynamic latent variables; Markov Chain Monte Carlo; importance sampling 
JEL:  C11 C13 C15 C23 C25 
Date:  2008–06–16 
URL:  http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa321&r=ecm 
By:  Olaf Posch (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  Understanding the process of economic growth involves comparing competing theoretical models and evaluating their empirical relevance. Our approach is to take the neoclassical stochastic growth model directly to the data and make inferences about the model parameters of interest. In this paper, output follows a jump-diffusion process. By imposing parameter restrictions we derive two solutions in explicit form. Based on them, we obtain transition densities in closed form and employ maximum likelihood techniques to estimate the model parameters. In extensive Monte Carlo simulations we demonstrate that the population parameters of the underlying data generating process can be recovered. We find empirical evidence for jumps in monthly and quarterly data on industrial production for the UK, the US, Germany, and the euro area (Euro12). 
Keywords:  Jump-diffusion estimation, Stochastic growth, Closed-form solutions 
JEL:  C13 E32 O40 
Date:  2007–09–14 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200723&r=ecm 
By:  Viktor Todorov; Tim Bollerslev (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We provide a new theoretical framework for disentangling and estimating sensitivity towards systematic diffusive and jump risks in the context of factor pricing models. Our estimates of the sensitivities towards systematic risks, or betas, are based on the notion of increasingly finer sampled returns over fixed time intervals. In addition to establishing consistency of our estimators, we also derive Central Limit Theorems characterizing their asymptotic distributions. In an empirical application of the new procedures using high-frequency data for forty individual stocks and an aggregate market portfolio, we find the estimated diffusive and jump betas with respect to the market to be quite different for many of the stocks. Our findings have direct and important implications for empirical asset pricing finance and practical portfolio and risk management decisions. 
Keywords:  Factor models, systematic risk, common jumps, high-frequency data, realized variation 
JEL:  C13 C14 G10 G12 
Date:  2007–08–16 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200715&r=ecm 
By:  James Davidson; Nigar Hashimzade (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper considers the asymptotic distribution of the covariance of a nonstationary fractionally integrated process with the stationary increments of another such process (possibly itself). Questions of interest include the relationship between the harmonic representation of these random variables, which we have analysed in a previous paper, and the construction derived from moving average representations in the time domain. The limiting integrals are shown to be expressible in terms of functionals of Itô integrals with respect to two distinct Brownian motions. Their mean is nonetheless shown to match that of the harmonic representation, and they satisfy the required integration by parts rule. The advantages of our approach over the harmonic analysis include the facts that our formulae are valid for the full range of the long memory parameters, and extend to non-Gaussian processes. 
Keywords:  Stochastic integral, weak convergence, fractional Brownian motion 
JEL:  C22 C32 
Date:  2007–12–21 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200745&r=ecm 
By:  François-Éric Racicot (Département des sciences administratives, Université du Québec (Outaouais), LRSP et Chaire d'information financière et organisationnelle); Raymond Théoret (Département de stratégie des affaires, Université du Québec (Montréal), et Chaire d'information financière et organisationnelle) 
Keywords:  Asset Pricing Models, specification errors, Hausman test, GMM, optimal instruments. 
JEL:  C13 C19 C49 G12 G31 
Date:  2008–01–06 
URL:  http://d.repec.org/n?u=RePEc:pqs:wpaper:012008&r=ecm 
By:  Wolfgang Härdle; Ostap Okhrin; Yarema Okhrin 
Abstract:  In this paper we provide a review of copula theory with applications to finance. We illustrate the idea in the bivariate framework and discuss the simple, elliptical and Archimedean classes of copulae. Since copulae model the dependency structure between random variables, we then explain the link between copulae and common dependency measures, such as Kendall's tau and Spearman's rho. Next, copulae are generalized to the multivariate case. In this general setup we discuss and provide an extensive literature review of estimation and simulation techniques. A separate section is devoted to goodness-of-fit tests. We illustrate the importance of copulae in finance with examples from asset allocation problems, Value-at-Risk and time series models. The paper is complemented with an extensive simulation study and an application to financial data. 
Keywords:  Distribution functions, Dimension Reduction, Risk management, Statistical models 
JEL:  C00 C14 C51 
Date:  2008–06 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2008043&r=ecm 
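The link between a copula parameter and Kendall's tau mentioned in the abstract above can be illustrated for the Clayton copula, where tau = theta/(theta + 2). The conditional-inversion sampler and sample size below are illustrative choices, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n = 2.0, 1500  # Clayton parameter; implied Kendall's tau = 0.5

# sample (u, v) from a Clayton copula by conditional inversion:
# v = (u^{-theta} (t^{-theta/(1+theta)} - 1) + 1)^{-1/theta}
u = rng.random(n)
t = rng.random(n)
v = (u ** -theta * (t ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)

# empirical Kendall's tau: average sign concordance over all pairs
du = np.sign(u[:, None] - u[None, :])
dv = np.sign(v[:, None] - v[None, :])
tau_hat = (du * dv).sum() / (n * (n - 1))

tau_theory = theta / (theta + 2.0)
```

The empirical tau should land close to the theoretical value of 0.5, which is the kind of moment-matching relation exploited by rank-based copula estimators.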
By:  Tim Bollerslev; Uta Kretschmer; Christian Pigorsch; George Tauchen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We develop an empirically highly accurate discrete-time daily stochastic volatility model that explicitly distinguishes between the jump and continuous components of price movements, using nonparametric realized variation and bipower variation measures constructed from high-frequency intraday data. The model setup allows us to directly assess the structural interdependencies among the shocks to returns and the two different volatility components. The model estimates suggest that the leverage effect, or asymmetry between returns and volatility, works primarily through the continuous volatility component. The excellent fit of the model makes it an ideal candidate for an easy-to-implement auxiliary model in the context of indirect estimation of empirically more realistic continuous-time jump diffusion and Lévy-driven stochastic volatility models, effectively incorporating the interdaily dependencies inherent in the high-frequency intraday data. 
Keywords:  Realized volatility, Bipower variation, Jumps, Leverage effect, Simultaneous equation model 
JEL:  C1 C3 C5 G1 
Date:  2007–08–16 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200722&r=ecm 
By:  Tim Bollerslev; Michael Gibson; Hao Zhou (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper proposes a method for constructing a volatility risk premium, or investor risk aversion, index. The method is intuitive and simple to implement, relying on the sample moments of the recently popularized model-free realized and option-implied volatility measures. A small-scale Monte Carlo experiment confirms that the procedure works well in practice. Implementing the procedure with actual S&P500 option-implied volatilities and high-frequency five-minute-based realized volatilities indicates significant temporal dependencies in the estimated stochastic volatility risk premium, which we in turn relate to a set of macro-finance state variables. We also find that the extracted volatility risk premium helps predict future stock market returns. 
Keywords:  Stochastic Volatility Risk Premium, Model-Free Implied Volatility, Model-Free Realized Volatility, Black-Scholes, GMM Estimation, Return Predictability 
JEL:  G12 G13 C51 C52 
Date:  2007–08–16 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200716&r=ecm 
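A minimal, sample-moment sketch of a volatility risk premium series in the spirit of the abstract above. The input series are simulated placeholders for option-implied and realized variances, not the paper's data or its exact construction.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 500  # hypothetical number of observation periods

# placeholder series: model-free implied variance typically exceeds
# realized variance on average, reflecting a premium for volatility risk
realized_var = np.abs(rng.normal(0.04, 0.01, T))
implied_var = realized_var + np.abs(rng.normal(0.01, 0.005, T))

# a simple sample-moment estimate of the volatility risk premium
vrp = implied_var - realized_var
vrp_mean = vrp.mean()
```

In practice the two inputs would be squared model-free implied volatility from option prices and summed squared high-frequency returns over the same horizon.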
By:  Bent Jesper Christensen; Morten Ørregaard Nielsen; Jie Zhu (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We extend the fractionally integrated exponential GARCH (FIEGARCH) model for daily stock return data with long memory in return volatility of Bollerslev and Mikkelsen (1996) by introducing a possible volatility-in-mean effect. To prevent the long memory property of volatility from carrying over to returns, we consider a filtered FIEGARCH-in-mean (FIEGARCH-M) effect in the return equation. The filtering of the volatility-in-mean component thus allows the coexistence of long memory in volatility and short memory in returns. We present an application to the S&P 500 index which documents the empirical relevance of our model. 
Keywords:  FIEGARCH, financial leverage, GARCH, long memory, risk-return tradeoff, stock returns, volatility feedback 
JEL:  C22 
Date:  2007–06–12 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200710&r=ecm 
By:  Torben G. Andersen; Tim Bollerslev; Xin Huang (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  Building on realized variance and bipower variation measures constructed from high-frequency financial prices, we propose a simple reduced form framework for effectively incorporating intraday data into the modeling of daily return volatility. We decompose the total daily return variability into the continuous sample path variance, the variation arising from discontinuous jumps that occur during the trading day, as well as the overnight return variance. Our empirical results, based on long samples of high-frequency equity and bond futures returns, suggest that the dynamic dependencies in the daily continuous sample path variability are well described by an approximate long-memory HAR-GARCH model, while the overnight returns may be modelled by an augmented GARCH type structure. The dynamic dependencies in the nonparametrically identified significant jumps appear to be well described by the combination of an ACH model for the time-varying jump intensities coupled with a relatively simple log-linear structure for the jump sizes. Lastly, we discuss how the resulting reduced form model structure for each of the three components may be used in the construction of out-of-sample forecasts for the total return volatility. 
Keywords:  Stochastic Volatility, Realized Variation, Bipower Variation, Jumps, Hazard Rates, Overnight Volatility 
JEL:  C1 G1 C2 
Date:  2007–08–16 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200714&r=ecm 
By:  Martin Møller Andreasen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation in these models remains to be established. 
Keywords:  Deterministic trends, DSGE models, Error distributions, Moment generating functions, Stochastic trends, Stochastic volatility, Unit roots 
JEL:  E10 E30 
Date:  2008–05–26 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200826&r=ecm 
By:  Stefan Holst Bache; Christian M. Dahl; Johannes Tang (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  Low birthweight outcomes are associated with large social and economic costs, and therefore the possible determinants of low birthweight are of great interest. One such determinant which has received considerable attention is maternal smoking. From an economic perspective this is in part due to the possibility that smoking habits can be influenced through policy conduct. It is widely believed that maternal smoking reduces birthweight; however, the crucial difficulty in estimating such effects is the unobserved heterogeneity among mothers. We consider extensions of three panel data models to a quantile regression framework in order to control for heterogeneity and to infer conclusions about causality across the entire birthweight distribution. We obtain estimation results for maternal smoking and other interesting determinants, applying these to data obtained from Aarhus University Hospital, Skejby (Denmark). We examine the use of both balanced and unbalanced panels. In conclusion, our results show the importance of considering conditional quantiles and controlling for unobserved heterogeneity when estimating determinants of birthweight outcomes. An example of this is the change in magnitude and significance of prenatal smoking. Controlling for unobserved effects does not change the fact that smoking reduces birthweight, but it shows that the effect is concentrated in the left tail of the distribution and is somewhat smaller in magnitude. 
Keywords:  Random Correlated Effects, Fixed Effects, Cross Section, Quantile Regression, Maternal Smoking, Birthweight 
JEL:  C13 C23 I10 
Date:  2008–05–08 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200820&r=ecm 
By:  Ole E. Barndorff-Nielsen; José Manuel Corcuera; Mark Podolskij; Jeannette H.C. Woerner (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  Convergence in probability and central limit laws of bipower variation for Gaussian processes with stationary increments and for integrals with respect to such processes are derived. The main tools of the proofs are some recent powerful techniques of Wiener/Itô/Malliavin calculus for establishing limit laws, due to Nualart, Peccati and others. 
Keywords:  Bipower Variation, Central Limit Theorem, Chaos Expansion, Gaussian Processes, Multiple Wiener-Itô Integrals. 
Date:  2008–05–08 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200821&r=ecm 
By:  Thomas Busch; Bent Jesper Christensen; Morten Ørregaard Nielsen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We study the forecasting of future realized volatility in the stock, bond, and foreign exchange markets, as well as its continuous sample path and jump components, from variables in the information set, including implied volatility backed out from option prices. Recent nonparametric statistical techniques of Barndorff-Nielsen & Shephard (2004, 2006) are used to separate realized volatility into its continuous and jump components, which enhances forecasting performance, as shown by Andersen, Bollerslev & Diebold (2005). We generalize the heterogeneous autoregressive (HAR) model of Corsi (2004) to include implied volatility as an additional regressor, and to the separate forecasting of the realized components. We also introduce a new vector HAR (VecHAR) model for the resulting simultaneous system, controlling for possible endogeneity issues in the forecasting equations. We show that implied volatility contains incremental information about future volatility relative to both continuous and jump components of past realized volatility. Indeed, in the foreign exchange market, implied volatility completely subsumes the information content of daily, weekly, and monthly realized volatility measures when forecasting future realized volatility or its continuous component. In addition, implied volatility is an unbiased forecast of future realized volatility in the foreign exchange and stock markets. Perhaps surprisingly, the jump component of realized return volatility is, to some extent, predictable, and options appear to be calibrated to incorporate information about future jumps in all three markets. 
Keywords:  Bipower variation, HAR, Heterogeneous Autoregressive Model, implied volatility, jumps, options, realized volatility, VecHAR, volatility forecasting 
JEL:  C22 C32 F31 G1 
Date:  2007–06–06 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200709&r=ecm 
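The HAR regression with implied volatility as an extra regressor, as described in the abstract above, reduces to OLS of next-day realized volatility on its daily, weekly, and monthly averages plus the implied-volatility series. The data below are simulated stand-ins, not the paper's series.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 1000
rv = np.abs(rng.normal(1.0, 0.2, T))  # hypothetical daily realized variance

def rolling_mean(x, win):
    # mean over trailing windows of length `win`
    c = np.cumsum(np.insert(x, 0, 0.0))
    return (c[win:] - c[:-win]) / win

# HAR regressors, all dated t, forecasting rv at t+1 (t = 21, ..., T-2)
d = rv[21:-1]                      # daily:   rv_t
w = rolling_mean(rv, 5)[17:-1]     # weekly:  mean of rv_{t-4..t}
m = rolling_mean(rv, 22)[:-1]      # monthly: mean of rv_{t-21..t}
iv = d + rng.normal(0, 0.05, d.size)  # hypothetical implied-vol regressor
y = rv[22:]                        # target: rv_{t+1}

X = np.column_stack([np.ones_like(d), d, w, m, iv])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The VecHAR extension would stack analogous equations for the continuous and jump components into one system estimated jointly.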
By:  Martin Møller Andreasen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper extends two optimization routines to deal with objective functions for DSGE models. The optimization routines are i) a version of Simulated Annealing developed by Corana, Marchesi & Ridella (1987), and ii) the evolutionary algorithm CMA-ES developed by Hansen, Müller & Koumoutsakos (2003). Following these extensions, we examine the ability of the two routines to maximize the likelihood function for a sequence of test economies. Our results show that the CMA-ES routine clearly outperforms Simulated Annealing in its ability to find the global optimum and in efficiency. With 10 unknown structural parameters in the likelihood function, the CMA-ES routine finds the global optimum in 95% of our test economies compared to 89% for Simulated Annealing. When the number of unknown structural parameters in the likelihood function increases to 20 and 35, the CMA-ES routine finds the global optimum in 85% and 71% of our test economies, respectively. The corresponding numbers for Simulated Annealing are 70% and 0%. 
Keywords:  CMA-ES optimization routine, Multimodal objective function, Nelder-Mead simplex routine, Non-convex search space, Resampling, Simulated Annealing 
JEL:  C61 C88 E30 
Date:  2008–06–19 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200832&r=ecm 
By:  Tim Bollerslev; Tzuo Hann Law; George Tauchen (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We test for price discontinuities, or jumps, in a panel of high-frequency intraday returns for forty large-cap stocks and an equiweighted index constructed from these same stocks. Jumps are naturally classified into two types: common and idiosyncratic. Common jumps affect all stocks, albeit to varying degrees, while idiosyncratic jumps are stock-specific. Despite the fact that each of the stocks has a beta of about unity with respect to the index, common jumps are virtually never detected in the individual stocks. This is truly puzzling, as an index can jump only if one or more of its components jump. To resolve this puzzle, we propose a new test for cojumps. Using this new test we find strong evidence for many modest-sized common jumps that simply pass through the standard jump detection statistic, while they appear highly significant in the cross section based on the new cojump identification scheme. Our results are further corroborated by a striking within-day pattern in the nondiversifiable cojumps. 
Keywords:  risk, diversification 
JEL:  C12 C32 C33 G12 G14 
Date:  2007–08–16 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200719&r=ecm 
By:  Tom Engsted; Stig V. Møller (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  We suggest an iterated GMM approach to estimate and test the consumption based habit persistence model of Campbell and Cochrane (1999), and we apply the approach to annual and quarterly Danish stock and bond returns. For comparative purposes we also estimate and test the standard CRRA model. In addition, we compare the pricing errors of the different models using Hansen and Jagannathan’s (1997) specification error measure. The main result is that for Denmark the Campbell-Cochrane model does not seem to perform markedly better than the CRRA model. For the long annual sample period covering more than 80 years there is absolutely no evidence of superior performance of the Campbell-Cochrane model. For the shorter and more recent quarterly data over a 20-30 year period, there is some evidence of countercyclical time-variation in the degree of risk aversion, in accordance with the Campbell-Cochrane model, but the model does not produce lower pricing errors or more plausible parameter estimates than the CRRA model. 
Keywords:  Consumptionbased model, habit persistence, GMM, pricing error 
JEL:  G12 
Date:  2008–02–27 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200812&r=ecm 
By:  Jie Zhu (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  In this paper a two-component volatility model based on the components' first moments is introduced to describe the dynamics of speculative return volatility. The two components capture the volatile and persistent parts of volatility, respectively. The model is then applied to 10 Asia-Pacific stock markets, and the in-mean effects of the components on returns are tested. The empirical results show that the persistent component accounts for much more of the volatility dynamics than the volatile component. The volatile component, however, is found to be a significant pricing factor of asset returns for most markets, with a positive risk-premium effect between returns and the volatile component, while the persistent component is not significantly priced in the return dynamics. 
Keywords:  Risk, Return, In-mean effect, Volatile, Persistent, Innovations 
JEL:  C14 G12 G15 
Date:  2008–03–05 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200814&r=ecm 
By:  J. BUREZ; D. VAN DEN POEL 
Abstract:  Customer churn is often a rare event in service industries, but one of great interest and great value. Until recently, however, class imbalance has not received much attention in the context of data mining (Weiss, 2004). In this study, we investigate how to better handle class imbalance in churn prediction. Using more appropriate evaluation metrics (AUC, lift), we investigate the performance gains from sampling (both random and advanced undersampling) and from two specific modelling techniques (gradient boosting and weighted random forests) compared to some standard modelling techniques. AUC and lift prove to be good evaluation metrics. AUC does not depend on a threshold, and is therefore a better overall evaluation metric than accuracy. Lift is closely related to accuracy, but has the advantage of being widely used in marketing practice (Ling and Li, 1998). Results show that undersampling can lead to improved prediction accuracy, especially when evaluated with AUC. Unlike Ling and Li (1998), we find that there is no need to undersample so that there are as many churners in the training set as non-churners. Results show no increase in predictive performance when using the advanced sampling technique CUBE in this study. This is in line with findings of Japkowicz (2000), who noted that using sophisticated sampling techniques did not give any clear advantage. Weighted random forests, as a cost-sensitive learner, perform significantly better than random forests and are therefore advised. They should, however, always be compared to logistic regression. Boosting is a very robust classifier, but never outperforms any other technique. 
Keywords:  rare events, class imbalance, undersampling, oversampling, boosting, random forests, CUBE, customer churn, classifier 
Date:  2008–05 
URL:  http://d.repec.org/n?u=RePEc:rug:rugwps:08/517&r=ecm 
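The paper's data and models are not reproduced here; the following is a minimal self-contained sketch of two ingredients the abstract discusses: a rank-based (Mann-Whitney) AUC, which needs no classification threshold, and random undersampling of the majority class. The fully balanced 50/50 target below is just one illustrative choice; the abstract itself notes that full balancing is not necessary:

```python
import numpy as np

def auc(y_true, scores):
    """Rank-based AUC: P(score of a positive > score of a negative).
    Assumes distinct scores; ties would need averaged ranks."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores)
    order = np.argsort(scores)
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = int(y_true.sum())
    n_neg = len(y_true) - n_pos
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def undersample(X, y, rng):
    """Randomly drop majority-class (y == 0) rows until classes are balanced."""
    pos = np.flatnonzero(y == 1)
    neg = np.flatnonzero(y == 0)
    keep = rng.choice(neg, size=len(pos), replace=False)
    idx = np.concatenate([pos, keep])
    return X[idx], y[idx]
```

For example, `auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9])` returns 1.0 (perfect ranking), regardless of where a decision threshold would be placed, which is what makes AUC a threshold-free metric.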
By:  Christopher R. Knittel; Konstantinos Metaxoglou 
Abstract:  Empirical exercises in economics frequently involve estimation of highly nonlinear models. The criterion function may not be globally concave or convex and may exhibit many local extrema. Choosing among these local extrema is nontrivial for a variety of reasons. In this paper, we analyze the sensitivity of parameter estimates, and most importantly of economic variables of interest, to both starting values and the type of nonlinear optimization algorithm employed. We focus on a class of demand models for differentiated products that have been used extensively in industrial organization, and more recently in public and labor economics. We find that convergence may occur at a number of local extrema, at saddle points, and in regions of the objective function where the first-order conditions are not satisfied. We find own- and cross-price elasticities that differ by a factor of over 100 depending on the set of candidate parameter estimates. In an attempt to evaluate the welfare effects of a change in an industry's structure, we undertake a hypothetical merger exercise. Our calculations indicate that the consumer welfare effects can range from positive values to negative seventy billion dollars depending on the set of parameter estimates used. 
JEL:  C1 C61 C81 L1 L4 
Date:  2008–06 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:14080&r=ecm 
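The paper's demand models and optimizers are not reproduced here; as a toy illustration of why starting values matter, the sketch below runs plain gradient descent (a stand-in for the algorithms the paper compares) from several starts on a one-dimensional multimodal objective. Different starts converge to different local minima with very different objective values, so a multistart strategy that keeps the best solution is the minimal safeguard:

```python
import numpy as np

def f(x):
    """Multimodal objective with two local minima (near x = -1.30 and x = 1.13)."""
    return x**4 - 3 * x**2 + x

def grad(x):
    return 4 * x**3 - 6 * x + 1

def local_min(x0, lr=0.01, steps=5000):
    """Plain gradient descent: converges only to the basin containing x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

starts = [-2.0, 0.5, 2.0]
sols = [local_min(x0) for x0 in starts]   # each start finds its local minimum
best = min(sols, key=f)                   # keep the lowest objective value
```

Starts at 0.5 and 2.0 both end up in the shallower basin around x = 1.13, while the start at -2.0 finds the global minimum near x = -1.30; judging convergence alone, without comparing objective values across starts, would pick the wrong answer two times out of three here.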
By:  Olav Bjerkholt (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  Trygve Haavelmo spent the academic year 1938/39 at the University of Aarhus as a teacher in statistics. Immediately after his Aarhus stay he left for the United States, where he completed The Probability Approach in Econometrics (1944) and later worked at the Cowles Commission before returning to Norway in 1947. The purpose of the paper is to assess whether Haavelmo was already on a path towards the Probability Approach while in Aarhus or whether, as suggested in the history-of-econometrics literature, this path did not really open up until Haavelmo came to the U.S.A. and was converted to probability reasoning. The paper surveys Haavelmo's papers and other work from his time in Aarhus. The evidence indicates that Haavelmo had adopted probability ideas by the time he was in Aarhus and seemed well prepared to embark on his magnum opus. 
Keywords:  Economic history, the probability approach in econometrics 
JEL:  B23 B31 
Date:  2007–11–26 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200740&r=ecm 