
NEP: New Economics Papers on Econometrics
By:  Jakob Grazzini; Matteo G. Richiardi; Mike Tsionas 
Abstract:  We consider Bayesian inference techniques for Agent-Based (AB) models, as an alternative to simulated minimum distance (SMD). We discuss the specificities of AB models with respect to models with exact aggregation results (such as DSGE models), and how this impacts estimation. Three computationally heavy steps are involved: (i) simulating the model, (ii) estimating the likelihood and (iii) sampling from the posterior distribution of the parameters. The computational complexity of AB models implies that efficient techniques have to be used for points (ii) and (iii), possibly involving approximations. We first discuss nonparametric (kernel density) estimation of the likelihood, coupled with Markov chain Monte Carlo sampling schemes. We then turn to parametric approximations of the likelihood, which can be derived by observing the distribution of the simulation outcomes around the statistical equilibria, or by assuming a specific form for the distribution of external deviations in the data. Finally, we introduce Approximate Bayesian Computation techniques for likelihood-free estimation. These allow embedding SMD methods in a Bayesian framework, and are particularly suited when robust estimation is needed. These techniques are tested, for the sake of comparison, in the same price discovery model used by Grazzini and Richiardi (2015) to illustrate SMD techniques. 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:cca:wplabo:145&r=ecm 
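The likelihood-free Approximate Bayesian Computation step described in the abstract above can be illustrated with a minimal rejection sampler. Everything in this sketch is a hypothetical stand-in: a simple autoregression plays the role of the AB model (not the Grazzini-Richiardi price discovery model), and the summary statistic, prior and tolerance are chosen only to show the accept/reject logic.

```python
import random
import statistics

def simulate_model(theta, n=200, seed=None):
    """Toy stand-in for an agent-based model run: an AR(1)-style series
    whose persistence is governed by the parameter theta."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = theta * x + rng.gauss(0, 1)
        out.append(x)
    return out

def summary(series):
    # Summary statistic used to compare simulated and observed data.
    return statistics.pstdev(series)

def abc_rejection(observed, prior_draw, n_draws=2000, tol=0.1, seed=1):
    """ABC rejection sampling: keep parameter draws whose simulated
    summary statistic lies within tol of the observed one."""
    rng = random.Random(seed)
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_draw(rng)
        s_sim = summary(simulate_model(theta, seed=rng.randrange(10**9)))
        if abs(s_sim - s_obs) < tol:
            accepted.append(theta)
    return accepted

observed = simulate_model(0.5, seed=42)          # "data", true theta = 0.5
post = abc_rejection(observed, prior_draw=lambda rng: rng.uniform(0, 0.95))
print(len(post), round(sum(post) / max(len(post), 1), 2))
```

The accepted draws approximate the posterior under the chosen summary statistic; shrinking `tol` trades acceptance rate for accuracy.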
By:  Mark Podolskij (Heidelberg University and CREATES); Bezirgen Veliyev (Aarhus University and CREATES); Nakahiro Yoshida (Graduate School of Mathematical Sciences) 
Abstract:  In this paper, we study the Edgeworth expansion for a pre-averaging estimator of quadratic variation in the framework of continuous diffusion models observed with noise. More specifically, we obtain a second order expansion for the joint density of the estimators of quadratic variation and its asymptotic variance. Our approach is based on martingale embedding, Malliavin calculus and stable central limit theorems for continuous diffusions. Moreover, we derive the density expansion for the studentized statistic, which might be applied to construct asymptotic confidence regions. 
Keywords:  diffusion processes, Edgeworth expansion, high frequency observations, quadratic variation, pre-averaging. 
JEL:  C10 C13 C14 
Date:  2015–12–14 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201560&r=ecm 
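The pre-averaging construction the abstract refers to can be sketched numerically. The sketch below uses the standard weight function g(x) = min(x, 1-x) (so psi1 = 1, psi2 = 1/12) and window k ~ sqrt(n); discrete-sample corrections are omitted, so treat the constants as an illustrative approximation rather than the estimator studied in the paper.

```python
import math
import random

def preaveraged_qv(y):
    """Simplified pre-averaging estimator of quadratic variation from
    noisy observations y of a semimartingale on [0, 1]."""
    n = len(y) - 1
    k = int(math.sqrt(n))                      # window k_n ~ theta*sqrt(n), theta = 1
    g = [min(j / k, 1 - j / k) for j in range(1, k)]
    dy = [y[i + 1] - y[i] for i in range(n)]
    # Pre-averaged increments: local weighted sums of raw increments.
    bars = [sum(gj * dy[i + j] for j, gj in enumerate(g))
            for i in range(n - k + 2)]
    rv = sum(d * d for d in dy)                # naive realized variance
    noise_var = rv / (2 * n)                   # estimate of noise variance
    est = (12 / k) * sum(b * b for b in bars) - 12 * noise_var
    return est, rv

# Brownian motion observed with additive microstructure noise.
rng = random.Random(3)
n = 5000
x, path = 0.0, [0.0]
for _ in range(n):
    x += rng.gauss(0, math.sqrt(1 / n))        # true quadratic variation = 1
    path.append(x)
noisy = [p + rng.gauss(0, 0.05) for p in path]

est, rv = preaveraged_qv(noisy)
print(round(est, 3), round(rv, 3))
```

The naive realized variance is dominated by the noise term (roughly 2 n omega^2), while the pre-averaged estimate stays near the true quadratic variation.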
By:  Morana, Claudio 
Abstract:  The paper introduces a new simple semiparametric estimator of the conditional variance-covariance and correlation matrix (SPDCC). While sharing a similar sequential approach with existing dynamic conditional correlation (DCC) methods, SPDCC has the advantage of not requiring the direct parameterization of the conditional covariance or correlation processes, therefore also avoiding any assumption on their long-run target. In the proposed framework, conditional variances are estimated by univariate GARCH models, for actual and suitably transformed series, in the first step; the latter are then nonlinearly combined in the second step, according to basic properties of the covariance and correlation operators, to yield nonparametric estimates of the various conditional covariances and correlations. Moreover, in contrast to available DCC methods, SPDCC also allows for straightforward estimation in the non-simultaneous case, i.e., the estimation of conditional cross-covariances and correlations displaced at any time horizon of interest. Finally, a simple ex-post procedure grounded on nonlinear shrinkage is proposed to ensure well-behaved conditional covariance and correlation matrices. Due to its sequential implementation and scant computational burden, SPDCC is very simple to apply and suitable for the modeling of vast sets of conditionally heteroskedastic time series. 
Keywords:  Multivariate GARCH model, dynamic conditional correlation, semiparametric estimation 
JEL:  C30 C51 
Date:  2015–12–10 
URL:  http://d.repec.org/n?u=RePEc:mib:wpaper:317&r=ecm 
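The second step of the approach, recovering covariances from univariate conditional variances of transformed series, rests on the polarization identity Cov(x, y) = [Var(x + y) - Var(x) - Var(y)] / 2. The sketch below is a simplification that uses EWMA variances in place of the fitted univariate GARCH models named in the abstract, purely to show how variances of the transformed series combine into conditional covariances and correlations.

```python
import random

def ewma_var(x, lam=0.94):
    """Recursive conditional-variance proxy (stand-in for a fitted
    univariate GARCH model)."""
    h, out = x[0] ** 2 + 1e-8, []
    for xt in x:
        h = lam * h + (1 - lam) * xt * xt
        out.append(h)
    return out

def conditional_corr(x, y, lam=0.94):
    hx = ewma_var(x, lam)
    hy = ewma_var(y, lam)
    hs = ewma_var([a + b for a, b in zip(x, y)], lam)   # transformed series x + y
    # Polarization: Var(x+y) - Var(x) - Var(y) = 2 Cov(x, y).
    cov = [(s - a - b) / 2 for s, a, b in zip(hs, hx, hy)]
    return [c / (a * b) ** 0.5 for c, a, b in zip(cov, hx, hy)]

# Simulated bivariate returns with true correlation 0.6.
rng = random.Random(11)
x, y = [], []
for _ in range(3000):
    common = rng.gauss(0, 1)
    x.append(common)
    y.append(0.6 * common + 0.8 * rng.gauss(0, 1))

rho = conditional_corr(x, y)
avg = sum(rho[500:]) / len(rho[500:])
print(round(avg, 3))
```

Because the three variance recursions share the same smoothing weights, the implied covariance matrix stays positive semi-definite, so the estimated correlations remain in [-1, 1] by construction.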
By:  Matthieu Garcin (Centre d'Economie de la Sorbonne & Natixis Asset Management); Clément Goulet (Centre d'Economie de la Sorbonne) 
Abstract:  In this paper we propose a new model for estimating returns and volatility. Our approach is based both on the wavelet denoising technique and on variational theory. We show that the volatility can be expressed as a nonparametric functional form of past returns. Therefore, we are able to forecast both returns and volatility and to build confidence intervals for predicted returns. Our technique outperforms classical time series models. Our model does not require stationarity of the observed log-returns, it preserves the volatility stylised facts, and it is based on a fully nonparametric form. This nonparametric form is obtained thanks to multiplicative noise theory. To our knowledge, this is the first time that such a method is used for financial modelling. We propose an application to intraday and daily financial data. 
Keywords:  Volatility modeling; non variational calculus; wavelet theory; trading strategy 
JEL:  C14 C51 C53 C58 
Date:  2015–09 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:15086&r=ecm 
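The wavelet-denoising ingredient can be sketched with a one-level Haar transform and the universal threshold of Donoho and Johnstone. This is a generic illustration of the denoising step only, not the authors' multiplicative-noise construction; the signal, noise level and sample size are made up.

```python
import math
import random

def haar_forward(x):
    # One level of the orthonormal Haar transform: averages and details.
    a = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def haar_inverse(a, d):
    out = []
    for ai, di in zip(a, d):
        out.append((ai + di) / math.sqrt(2))
        out.append((ai - di) / math.sqrt(2))
    return out

def denoise(y, sigma):
    a, d = haar_forward(y)
    lam = sigma * math.sqrt(2 * math.log(len(y)))    # universal threshold
    d = [di if abs(di) > lam else 0.0 for di in d]   # hard thresholding
    return haar_inverse(a, d)

rng = random.Random(5)
n = 256
pure = [math.sin(2 * math.pi * i / n) for i in range(n)]
noisy = [p + rng.gauss(0, 0.5) for p in pure]
den = denoise(noisy, 0.5)

mse = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v)) / len(u)
print(round(mse(noisy, pure), 4), round(mse(den, pure), 4))
```

Because the smooth signal's detail coefficients are tiny while the noise spreads evenly across coefficients, thresholding the details removes noise at little cost to the signal.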
By:  CHEN, Cathy W.S.; WENG, Monica M.C.; WATANABE, Toshiaki 
Abstract:  To allow for a higher degree of flexibility in model parameters, we propose a general and time-varying nonlinear smooth transition (ST) heteroskedastic model with a second-order logistic function of varying speed in the mean and variance. This paper evaluates the performance of Value-at-Risk (VaR) measures in a class of risk models, especially focusing on three distinct ST functions with GARCH structures: first- and second-order logistic functions, and the exponential function. The likelihood function is non-differentiable with respect to the threshold values and the delay parameter. We employ Bayesian Markov chain Monte Carlo sampling methods to update the estimates and quantile forecasts. The proposed methods are illustrated using simulated data and an empirical study. We estimate VaR forecasts for the proposed models alongside some competing asymmetric models with skew and fat-tailed error probability distributions, including realized volatility models. To evaluate the accuracy of VaR estimates, we implement two loss functions and three backtests. The results show that the ST model with a second-order logistic function and skew Student’s t error is a worthy choice at the 1% level, when compared to a range of existing alternatives. 
Keywords:  Second-order logistic transition function, Backtesting, Markov chain Monte Carlo methods, Value-at-Risk, Volatility forecasting, Realized volatility models 
Date:  2015–12–08 
URL:  http://d.repec.org/n?u=RePEc:hit:hiasdp:hiase16&r=ecm 
By:  Matthieu Garcin (Centre d'Economie de la Sorbonne & Natixis Asset Management); Dominique Guegan (Centre d'Economie de la Sorbonne) 
Abstract:  By filtering wavelet coefficients, it is possible to construct a good estimate of a pure signal from noisy data. In the case of a simple linear noise influence, Donoho and Johnstone (1994) have already defined an optimal filter design in the sense of a good reconstruction of the pure signal. We set here a different framework where the influence of the noise is nonlinear. In particular, we propose an optimal method to filter the wavelet coefficients of a discrete dynamical system disrupted by a weak noise, in order to construct good estimates of the pure signal, including the Bayes estimate, the minimax estimate, the oracular estimate and the thresholding estimate. We present the example of a simple chaotic dynamical system, as well as an adaptation of our technique, in order to show empirically the robustness of the thresholding method in the presence of leptokurtic noise. Moreover, we test both hard and soft thresholding, and also another, smoother kind of thresholding which seems to have almost the same reconstruction power as hard thresholding. 
Keywords:  wavelets; dynamical systems; chaos; Gaussian noise; Cauchy noise; thresholding; non-equispaced design; nonlinear noise impact 
Date:  2015–10 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:15085&r=ecm 
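The thresholding rules compared above are one-liners. The "smoother" intermediate rule below is the non-negative garrote, shown here only as a plausible example of a rule between hard and soft; the abstract does not name the authors' exact rule.

```python
def hard(t, lam):
    # Keep-or-kill: coefficient unchanged beyond the threshold, zero below.
    return t if abs(t) > lam else 0.0

def soft(t, lam):
    # Shrink every surviving coefficient toward zero by lam.
    s = abs(t) - lam
    return (1 if t > 0 else -1) * s if s > 0 else 0.0

def garrote(t, lam):
    # Non-negative garrote: mild shrinkage, approaches hard for large |t|.
    return t - lam * lam / t if abs(t) > lam else 0.0

print(hard(2.0, 1.0), soft(2.0, 1.0), garrote(2.0, 1.0))
```

Hard thresholding preserves large coefficients exactly but is discontinuous at the threshold; soft is continuous but biases large coefficients; the garrote sits between the two.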
By:  Jose Apesteguia; Miguel A. Ballester 
Abstract:  Suppose that, when evaluating two alternatives x and y by means of a parametric utility function, low values of the parameter indicate a preference for x and high values indicate a preference for y. We say that a stochastic choice model is monotone whenever the probability of choosing x is decreasing in the preference parameter. We show that the standard use of random utility models in the context of risk and time preferences may sharply violate this monotonicity property, and argue that their use in preference estimation may be problematic. In particular, they may pose identification problems and yield biased estimates. We then establish that the alternative random parameter models, in contrast, are always monotone. We show in an empirical application that standard risk-aversion assessments may be severely biased. 
Keywords:  Stochastic Choice; Preference Parameters; Random Utility Models; Random Parameter Models; Risk Aversion; Delay Aversion. 
JEL:  C25 D81 
Date:  2015–11 
URL:  http://d.repec.org/n?u=RePEc:upf:upfgen:1499&r=ecm 
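The monotonicity property at issue is easy to probe numerically. The lottery, payoffs and logit noise scale below are hypothetical, but the exercise mirrors the paper's point: under an additive random utility (logit) model with CRRA utility, the probability of choosing the riskier option need not be monotone in the risk-aversion parameter.

```python
import math

def crra(x, r):
    # CRRA utility of a positive payoff x; r = 1 handled as log utility.
    return math.log(x) if abs(r - 1) < 1e-9 else (x ** (1 - r) - 1) / (1 - r)

def p_risky_logit(r, noise=1.0):
    """Logit (additive random utility) probability of choosing a 50/50
    lottery over 10 or 90 instead of 40 for sure, under CRRA utility."""
    eu_risky = 0.5 * crra(10.0, r) + 0.5 * crra(90.0, r)
    u_safe = crra(40.0, r)
    return 1.0 / (1.0 + math.exp(-(eu_risky - u_safe) / noise))

def is_monotone_decreasing(vals, tol=1e-12):
    return all(b <= a + tol for a, b in zip(vals, vals[1:]))

grid = [i / 100 for i in range(-100, 301)]      # risk aversion r from -1 to 3
probs = [p_risky_logit(r) for r in grid]
print(is_monotone_decreasing(probs))
```

The check prints False: the choice probability first falls and then creeps back toward 1/2 as utility differences shrink with rising risk aversion, which is exactly the kind of non-monotonicity the paper warns can bias estimation.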
By:  Lee, Sokbae; Salanié, Bernard 
Abstract:  Multivalued treatment models have so far been studied only under restrictive assumptions: ordered choice, or more recently unordered monotonicity. We show how marginal treatment effects can be identified in a more general class of models. Our results rely on two main assumptions: treatment assignment must be a measurable function of threshold-crossing rules, and enough continuous instruments must be available. On the other hand, we do not require any kind of monotonicity condition. We illustrate our approach on several commonly used models, and we also discuss the identification power of discrete instruments. 
Keywords:  Discrete Choice; Identification; Monotonicity; Treatment evaluation 
JEL:  C14 C21 
Date:  2015–12 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:10970&r=ecm 
By:  Andreas Basse-O'Connor (Department of Mathematics); Raphaël Lachièze-Rey (Heidelberg University, Department of Mathematics); Mark Podolskij (Department of Mathematics and CREATES) 
Abstract:  In this paper we present some new limit theorems for the power variation of kth-order increments of stationary increments Lévy driven moving averages. In this infill sampling setting, the asymptotic theory gives very surprising results, which (partially) have no counterpart in the theory of discrete moving averages. More specifically, we show that the first order limit theorems and the mode of convergence strongly depend on the interplay between the given order of the increments, the considered power p, the Blumenthal-Getoor index of the driving pure jump Lévy process L and the behaviour of the kernel function g at 0. The first order asymptotic theory essentially comprises three cases: stable convergence towards a certain infinitely divisible distribution, an ergodic type limit theorem, and convergence in probability towards an integrated random process. We also prove a second order limit theorem connected to the ergodic type result. When the driving Lévy process L is a symmetric stable process we obtain two different limits: a central limit theorem and convergence in distribution towards a stable random variable. 
Keywords:  Power variation, limit theorems, moving averages, fractional processes, stable convergence, high frequency data 
JEL:  C10 C13 C14 
Date:  2015–12–01 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201556&r=ecm 
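The object of study above, the power variation of kth-order increments, is simple to compute for a discretely observed path; a minimal sketch with a made-up path:

```python
def increments(x, k):
    """kth-order (iterated) differences of an observed path."""
    for _ in range(k):
        x = [b - a for a, b in zip(x, x[1:])]
    return x

def power_variation(x, p, k=1):
    # V(p)_n = sum of |kth-order increments| raised to the power p.
    return sum(abs(d) ** p for d in increments(x, k))

path = [0.0, 1.0, 3.0, 6.0, 10.0]
print(power_variation(path, 2, k=1))   # squared first-order increments
print(power_variation(path, 2, k=2))   # squared second-order increments
```

The limit theory in the paper concerns the behaviour of exactly these sums as the sampling frequency grows, with the interplay of k, p, the Blumenthal-Getoor index and the kernel behaviour at 0 deciding the limit.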
By:  Xin Geng (IFPRI); Carlos Martins-Filho (University of Colorado, Department of Economics); Feng Yao (West Virginia University, Department of Economics) 
Abstract:  We propose a kernel-based estimator for a partially linear regression in a triangular system where endogenous regressors appear in both the nonparametric and linear components of the regression. Compared with alternative estimators currently available in the literature (Ai and Chen 2003; Otsu 2011), our estimator has an explicit functional form, is easier to implement, and exhibits better experimental finite sample performance. The estimator is inspired by the control function approach of Newey et al. (1999) and was initially proposed by Martins-Filho and Yao (2012). It exploits conditional moment restrictions that make it suitable for additive regression estimation as in Kim et al. (1999) and Manzan and Zerom (2005). We establish consistency and asymptotic normality of the estimator for the parameters in the linear component of the model and give a uniform convergence rate for the estimator of the nonparametric component. In addition, for statistical inference, a consistent estimator for the covariance of the limiting distribution of the parametric estimator is provided. We illustrate the empirical viability of our estimation procedure by applying it to the study of the impact of foreign aid and policy on growth of per capita gross domestic product (GDP) in developing countries. 
Keywords:  partially linear regression, endogeneity, semiparametric instrumental variable estimation 
JEL:  C14 C36 
Date:  2015–10 
URL:  http://d.repec.org/n?u=RePEc:wvu:wpaper:1546&r=ecm 
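The control-function idea behind the estimator (in the spirit of Newey et al. 1999) can be shown in its simplest parametric form: regress the endogenous regressor on the instrument, then include the first-stage residual as an extra regressor. The data-generating numbers below are hypothetical, and the sketch is linear throughout, unlike the paper's partially linear, kernel-based setting.

```python
import random

def ols1(y, x):
    # Slope from regressing y on x (with intercept, via centering).
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / sxx

def ols2(y, x1, x2):
    # Coefficient on x1 when y is regressed on x1 and x2 (centered data).
    m = lambda v: sum(v) / len(v)
    c = lambda u, v, mu, mv: sum((a - mu) * (b - mv) for a, b in zip(u, v))
    m1, m2, my = m(x1), m(x2), m(y)
    s11, s22 = c(x1, x1, m1, m1), c(x2, x2, m2, m2)
    s12 = c(x1, x2, m1, m2)
    s1y, s2y = c(x1, y, m1, my), c(x2, y, m2, my)
    return (s1y * s22 - s2y * s12) / (s11 * s22 - s12 * s12)

rng = random.Random(9)
n = 2000
z = [rng.gauss(0, 1) for _ in range(n)]                 # instrument
v = [rng.gauss(0, 1) for _ in range(n)]                 # first-stage error
u = [0.8 * vi + 0.6 * rng.gauss(0, 1) for vi in v]      # correlated with v
x = [zi + vi for zi, vi in zip(z, v)]                   # endogenous regressor
y = [2.0 * xi + ui for xi, ui in zip(x, u)]             # true slope = 2

pi = ols1(x, z)                                         # first stage
vhat = [xi - pi * zi for xi, zi in zip(x, z)]           # control function
beta_ols = ols1(y, x)                                   # biased
beta_cf = ols2(y, x, vhat)                              # bias-corrected
print(round(beta_ols, 3), round(beta_cf, 3))
```

Including the first-stage residual absorbs the part of the error correlated with the regressor, so the coefficient on x is recovered; the paper's estimator applies the same logic with nonparametric first and second stages.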
By:  Kim, Jae; Choi, In 
Abstract:  This paper reevaluates the key past results of unit root tests, emphasizing that the use of a conventional level of significance is not in general optimal because the tests have low power. The optimal levels for popular unit root tests, chosen using the line of enlightened judgement under a symmetric loss function, are found to be much higher than conventional ones. We also propose simple calibration rules for the optimal level of significance for a range of unit root tests based on asymptotic local power. At the optimal levels, many time series in the extended Nelson-Plosser data set are judged to be trend-stationary, including real income variables, employment variables and money stock. We also find nearly all real exchange rates covered in the Elliott-Pesavento study to be stationary at the optimal levels, which lends strong support to purchasing power parity. Additionally, most of the real interest rates covered in the Rapach-Weber study are found to be stationary. 
Keywords:  Expected Loss; Optimal Level of Significance; Power of the Test; Response Surface 
JEL:  C12 E30 F30 
Date:  2015–12–17 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:68411&r=ecm 
By:  Harvey Goldstein; Peter Lynn; Graciela Muniz-Terrera; Rebecca Hardy; Colm O’Muircheartaigh; Chris J. Skinner; Risto Lehtonen 
Abstract:  In an opening paper, Harvey Goldstein questions the need for observational studies to achieve representativeness of real populations, in particular for longitudinal studies. He draws upon recent debates and argues for the need to distinguish scientific inference from population inference. The points he raises are then debated in commentaries by Peter Lynn, Graciela Muniz-Terrera and Rebecca Hardy, Colm O'Muircheartaigh, Chris Skinner and Risto Lehtonen. These commentaries are followed by a response from Goldstein. 
JEL:  C1 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:ehl:lserod:64705&r=ecm 
By:  Koen Jochmans (Département d'économie); Thierry Magnac (Groupe de recherche en économie mathématique et quantitative) 
Abstract:  Consider estimating the slope coefficients of a fixed-effect binary-choice model from two-period panel data. Two approaches to semiparametric estimation at the regular parametric rate have been proposed. One is based on a sufficient statistic, the other on a conditional-median restriction. We show that, under standard assumptions, both approaches are equivalent. 
Keywords:  binary choice, fixed effects, panel data, regular estimation, sufficiency. 
Date:  2015–12 
URL:  http://d.repec.org/n?u=RePEc:spo:wpecon:info:hdl:2441/2t7dgrpjh58e9a93hqot3nu9k3&r=ecm 
By:  Ruijun Bu; Jie Cheng; Kaddour Hadri 
Abstract:  Reducible diffusions (RDs) are nonlinear transformations of analytically solvable Basic Diffusions (BDs). They are therefore constructed to be analytically tractable and flexible diffusion processes. The existing literature on RDs has mostly focused on time-homogeneous transformations, which to a significant extent fail to exploit the full potential of RDs from both theoretical and practical points of view. In this paper, we propose flexible and economically justifiable time-variations in the transformations of RDs. Concentrating on Constant Elasticity of Variance (CEV) RDs, we consider nonlinear dynamics for our time-varying transformations with both deterministic and stochastic designs. Such time-variations can greatly enhance the flexibility of RDs while maintaining sufficient tractability of the resulting models. Our approach retains the benefits of classical inferential techniques while accommodating the advocated time-varying nonlinear dynamics. Our application to UK and US short-term interest rates suggests that, from an empirical point of view, time-varying transformations are highly relevant and statistically significant. 
Keywords:  Stochastic Differential Equation, Reducible Diffusion, Constant Elasticity of Variance, Time-Varying Transformation, Maximum Likelihood Estimation, Short-Term Interest Rate 
JEL:  C13 C32 G12 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:qub:wpaper:1401&r=ecm 
By:  Mark Podolskij (Aarhus University, Department of Mathematics and CREATES); Christian Schmidt (Aarhus University, Department of Mathematics and CREATES); Mathias Vetter (Christian-Albrechts-Universität zu Kiel, Mathematisches Seminar) 
Abstract:  In this paper we examine the asymptotic theory for U-statistics and V-statistics of discontinuous Itô semimartingales that are observed at high frequency. For different types of kernel functions we show laws of large numbers and associated stable central limit theorems. In most of the cases the limiting process will be conditionally centered Gaussian. The structure of the kernel function determines whether the jump and/or the continuous part of the semimartingale contribute to the limit. 
Keywords:  central limit theorems, Itô semimartingales, stable convergence, U-statistics. 
JEL:  C10 C13 C14 
Date:  2015–11–20 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201552&r=ecm 
By:  Andreas Basse-O'Connor (Department of Mathematics); Mark Podolskij (Department of Mathematics and CREATES) 
Abstract:  In this paper we present some limit theorems for the power variation of stationary increments Lévy driven moving averages in the setting of critical regimes. In [5] the authors derived first and second order asymptotic results for kth-order increments of stationary increments Lévy driven moving averages. The limit theory heavily depends on the interplay between the given order of the increments, the considered power, the Blumenthal-Getoor index of the driving pure jump Lévy process L and the behavior of the kernel function g at 0. In this work we study the critical cases, which were not covered in the original work [5]. 
Keywords:  Power variation, limit theorems, moving averages, fractional processes, stable convergence, high frequency data 
JEL:  C10 C13 C14 
Date:  2015–12–01 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201557&r=ecm 
By:  Francesco Sergi (Centre d'Economie de la Sorbonne) 
Abstract:  The purpose of this contribution to the epistemology and history of recent macroeconomics is to construct a clear understanding of econometric methods and problems in New Classical macroeconomics. Most historical work has so far focused on the theoretical aspects or policy implications of this research program, set in motion by Robert Lucas in the early seventies. By contrast, the empirical and econometric work of New Classical macroeconomics has received little attention. I focus especially on the contributions gathered in Rational Expectations and Econometric Practice, edited in 1981 by Lucas and Thomas Sargent. The main claim of this article is that the publication of this book must be regarded as a turn in macroeconomics, one that brought macroeconometric modeling methodology closer to Lucas's conception of models. The analysis of New Classical macroeconometrics through the Lucas methodology allows us to propose an original historical account of the methods presented in Rational Expectations and Econometric Practice, but also of the problems that flawed this approach. 
Keywords:  history of macroeconomics; Lucas (Robert); Sargent (Thomas); macroeconometrics; modeling methodology 
JEL:  B22 B41 
Date:  2015–11 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:15088&r=ecm 
By:  Franses, Ph.H.B.F.; de Bruijn, B. 
Abstract:  Many publicly available macroeconomic forecasts are judgmentally-adjusted model-based forecasts. In practice usually only a single final forecast is available, and neither the underlying econometric model nor the size of and reason for the adjustment are known. Hence, the relative weights given to the model forecasts and to the judgment are usually unknown to the analyst. This paper proposes a methodology to evaluate the quality of such final forecasts, and also to allow learning from past errors. To do so, the analyst needs benchmark forecasts. We propose two such benchmarks. The first is the simple no-change forecast, the bottom-line forecast that an expert should be able to improve upon. The second benchmark is an estimated model-based forecast, found as the best forecast given the realizations and the final forecasts. We illustrate this methodology for two sets of GDP growth forecasts, one for the US and one for the Netherlands. These applications tell us that adjustment appears most effective in periods of first recovery from a recession. 
Keywords:  forecast decomposition, expert adjustment, total least squares 
JEL:  C20 C51 
Date:  2015–11–01 
URL:  http://d.repec.org/n?u=RePEc:ems:eureir:79222&r=ecm 
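The first benchmark, the no-change forecast, makes the evaluation concrete: an expert's final forecasts should at least beat last period's realization. The growth numbers below are invented for illustration.

```python
def rmse(forecasts, actuals):
    return (sum((f - a) ** 2 for f, a in zip(forecasts, actuals))
            / len(actuals)) ** 0.5

# Realized GDP growth and an expert's published (judgmentally adjusted)
# forecasts for periods 2 through 4; hypothetical numbers.
actuals = [2.0, 4.0, 6.0, 8.0]
expert = [3.5, 6.5, 7.5]

no_change = actuals[:-1]     # forecast for t is the realization at t - 1
targets = actuals[1:]

print(rmse(expert, targets), rmse(no_change, targets))
```

Here the expert clearly improves on the no-change benchmark; the paper's methodology asks this question systematically, and adds the estimated model-based benchmark when the underlying model is unobserved.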
By:  Niels Haldrup (Aarhus University and CREATES); J. Eduardo Vera-Valdés (Aarhus University and CREATES) 
Abstract:  It is commonly argued that observed long memory in time series variables can result from cross-sectional aggregation of dynamic heterogeneous micro units. For instance, Granger (1980) demonstrated that aggregation of AR(1) processes with Beta distributed AR coefficients can exhibit long memory under certain conditions, and that the aggregated series will have an autocorrelation function that exhibits hyperbolic decay. In this paper, we analyze this phenomenon further. We demonstrate that the aggregation argument leading to long memory is consistent with a wide range of definitions of long memory. In a simulation study we seek to quantify Granger's result and find that both the time series and cross-sectional dimensions indeed have to be rather large to reflect the theoretical asymptotic results. Long memory can result even for moderate T, N dimensions, but the degree of memory can vary considerably from its theoretical value. Also, Granger's result is most precise in samples with a relatively high degree of memory. Finally, we show that even though the aggregated process behaves as a generalized fractional process and thus converges to a fractional Brownian motion asymptotically, the fractionally differenced series does not behave according to an ARMA process. In particular, although the autocorrelation function is summable, so that the fractionally differenced process satisfies the conditions for being I(0), it still exhibits hyperbolic decay. This may have consequences for the validity of ARFIMA time series modeling of long memory processes when the source of the memory is aggregation. 
Keywords:  Long memory, Fractional Integration, Aggregation 
JEL:  C2 C22 
Date:  2015–12–12 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201559&r=ecm 
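Granger's mechanism can be checked directly from the mixture form of the aggregate autocorrelation function: with unit-specific AR(1) coefficients phi_i and stationary variances 1/(1 - phi_i^2), the aggregate ACF is the variance-weighted mixture of the individual geometric ACFs, which by Jensen's inequality decays more slowly than the geometric ACF of a single AR(1) matched at lag one. The Beta parameters below are illustrative, not Granger's exact specification.

```python
import math
import random

rng = random.Random(7)
N = 500
# AR(1) coefficients with substantial mass near 1 (phi^2 Beta-distributed,
# in the spirit of Granger's setup).
phi = [math.sqrt(rng.betavariate(2.0, 1.0)) for _ in range(N)]

# Stationary variance of each unit = weight in the aggregate ACF.
w = [1.0 / (1.0 - p * p) for p in phi]
wsum = sum(w)

def rho_mix(k):
    # ACF of the cross-sectional aggregate: variance-weighted mixture
    # of the individual geometric ACFs phi_i^k.
    return sum(wi * p ** k for wi, p in zip(w, phi)) / wsum

rho1 = rho_mix(1)
for k in (1, 10, 50, 200):
    print(k, round(rho_mix(k), 4), round(rho1 ** k, 4))
```

At long lags the mixture ACF is dominated by the units with phi near one and remains sizable long after the matched single AR(1)'s ACF has become negligible, which is the hyperbolic-versus-geometric contrast the abstract describes.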