nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒12‒20
nineteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Bayesian Estimation of Agent-Based Models. By Jakob Grazzini; Matteo G. Richiardi; Mike Tsionas
  2. Edgeworth expansion for the pre-averaging estimator By Mark Podolskij; Bezirgen Veliyev; Nakahiro Yoshida
  3. Semiparametric Estimation of Multivariate GARCH Models By Claudio, Morana
  4. A fully non-parametric heteroskedastic model By Matthieu Garcin; Clément Goulet
  5. Employing Bayesian Forecasting of Value-at-Risk to Determine an Appropriate Model for Risk Management By CHEN, Cathy W.S.; WENG, Monica M.C.; WATANABE, Toshiaki
  6. Optimal wavelet shrinkage of a noisy dynamical system with non-linear noise impact By Matthieu Garcin; Dominique Guegan
  7. Monotone stochastic choice models: The case of risk and time preferences By Jose Apesteguia; Miguel A. Ballester
  8. Identifying Effects of Multivalued Treatments By Lee, Sokbae; Salanié, Bernard
  9. Limit theorems for stationary increments Lévy driven moving averages By Andreas Basse-O'Connor; Raphaël Lachièze-Rey; Mark Podolskij
  10. Estimation of a Partially Linear Regression in Triangular Systems By Xin Geng; Carlos Martins-Filho; Feng Yao
  11. Unit Roots in Economic and Financial Time Series: A Re-Evaluation based on Enlightened Judgement By Kim, Jae; Choi, In
  12. Population sampling in longitudinal surveys By Harvey Goldstein; Peter Lynn; Graciela Muniz-Terrera; Rebecca Hardy; Colm O’Muircheartaigh; Chris J. Skinner; Risto Lehtonen
  13. A note on sufficiency in binary panel models By Koen Jochmans; Thierry Magnac
  14. Reducible Diffusions with Time-Varying Transformations with Application to Short-Term Interest Rates By Ruijun Bu; Jie Cheng; Kaddour Hadri
  15. On U- and V-statistics for discontinuous Itô semimartingales By Mark Podolskij; Christian Schmidt; Mathias Vetter
  16. On critical cases in limit theory for stationary increments Lévy driven moving averages By Andreas Basse-O'Connor; Mark Podolskij
  17. Robert Lucas and the Twist of Modeling Methodology. On some Econometric Methods and Problems in New Classical Macroeconomics By Francesco Sergi
  18. Benchmarking judgmentally adjusted forecasts By Franses, Ph.H.B.F.; de Bruijn, B.
  19. Long Memory, Fractional Integration, and Cross-Sectional Aggregation By Niels Haldrup; J. Eduardo Vera-Valdés

  1. By: Jakob Grazzini; Matteo G. Richiardi; Mike Tsionas
    Abstract: We consider Bayesian inference techniques for Agent-Based (AB) models, as an alternative to simulated minimum distance (SMD). We discuss the specificities of AB models with respect to models with exact aggregation results (such as DSGE models), and how these impact estimation. Three computationally heavy steps are involved: (i) simulating the model, (ii) estimating the likelihood and (iii) sampling from the posterior distribution of the parameters. The computational complexity of AB models implies that efficient techniques have to be used for steps (ii) and (iii), possibly involving approximations. We first discuss non-parametric (kernel density) estimation of the likelihood, coupled with Markov chain Monte Carlo sampling schemes. We then turn to parametric approximations of the likelihood, which can be derived by observing the distribution of the simulation outcomes around the statistical equilibria, or by assuming a specific form for the distribution of external deviations in the data. Finally, we introduce Approximate Bayesian Computation techniques for likelihood-free estimation. These allow embedding SMD methods in a Bayesian framework, and are particularly suited when robust estimation is needed. These techniques are tested, for the sake of comparison, in the same price discovery model used by Grazzini and Richiardi (2015) to illustrate SMD techniques.
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:cca:wplabo:145&r=ecm
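    A minimal Python sketch of the pipeline in steps (ii) and (iii): a kernel density estimate of the likelihood built from repeated simulations, wrapped in a random-walk Metropolis sampler. The "model" below is a toy stand-in, not the paper's price discovery model, and all names and settings are invented for the illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def simulate_ab_model(theta, n_agents=100, n_periods=20):
    """Toy stand-in for an AB model: each period the price moves by the
    average of the agents' noisy signals; returns mean absolute price change."""
    steps = theta * rng.standard_normal((n_periods, n_agents)).mean(axis=1)
    return np.abs(steps).mean()

def log_lik_kde(theta, obs, n_sims=100):
    """Step (ii): non-parametric (kernel density) likelihood estimate,
    built from repeated simulations at a fixed parameter value."""
    sims = np.array([simulate_ab_model(theta) for _ in range(n_sims)])
    return gaussian_kde(sims).logpdf(obs)[0]

# Step (iii): random-walk Metropolis on the posterior, flat prior on (0, 2).
obs = simulate_ab_model(0.8)          # pretend this summary statistic is data
theta = 1.0
ll = log_lik_kde(theta, obs)
chain = []
for _ in range(200):                  # deliberately short chain
    prop = theta + 0.1 * rng.standard_normal()
    if 0.0 < prop < 2.0:
        ll_prop = log_lik_kde(prop, obs)
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
    chain.append(theta)
print("posterior mean of theta:", np.mean(chain[50:]))
```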
  2. By: Mark Podolskij (Heidelberg University and CREATES); Bezirgen Veliyev (Aarhus University and CREATES); Nakahiro Yoshida (Graduate School of Mathematical Sciences)
    Abstract: In this paper, we study the Edgeworth expansion for a pre-averaging estimator of quadratic variation in the framework of continuous diffusion models observed with noise. More specifically, we obtain a second order expansion for the joint density of the estimators of quadratic variation and its asymptotic variance. Our approach is based on martingale embedding, Malliavin calculus and stable central limit theorems for continuous diffusions. Moreover, we derive the density expansion for the studentized statistic, which might be applied to construct asymptotic confidence regions.
    Keywords: diffusion processes, Edgeworth expansion, high frequency observations, quadratic variation, pre-averaging.
    JEL: C10 C13 C14
    Date: 2015–12–14
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-60&r=ecm
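    The object being expanded is the pre-averaging estimator itself. Below is a minimal numpy sketch of that estimator (not of the Edgeworth expansion), under the standard choices g(x) = min(x, 1-x) and a window length k of order theta*sqrt(n), and ignoring finite-sample boundary corrections.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a Brownian martingale observed with additive microstructure noise:
# Y_i = X_{i/n} + eps_i.  The true quadratic variation is sigma**2.
n, sigma, omega = 23400, 0.2, 0.005
X = np.cumsum(sigma * rng.standard_normal(n) / np.sqrt(n))
Y = X + omega * rng.standard_normal(n)

# Pre-averaging with weight g(x) = min(x, 1-x) over windows of k ~ theta*sqrt(n).
theta = 1.0
k = int(theta * np.sqrt(n))
j = np.arange(1, k)
g = np.minimum(j / k, 1 - j / k)
psi1 = k * np.sum(np.diff(np.concatenate(([0.0], g, [0.0]))) ** 2)  # ~ 1
psi2 = np.sum(g ** 2) / k                                           # ~ 1/12

dY = np.diff(Y)
# Pre-averaged increments: weighted local sums of raw increments.
Ybar = np.convolve(dY, g[::-1], mode="valid")

omega2_hat = np.sum(dY ** 2) / (2 * n)        # noise variance estimate
qv_hat = (np.sum(Ybar ** 2) / (k * psi2)
          - psi1 * omega2_hat / (theta ** 2 * psi2))   # bias-corrected
print(qv_hat, "vs true", sigma ** 2)
```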
  3. By: Claudio, Morana
    Abstract: The paper introduces a new simple semiparametric estimator of the conditional variance-covariance and correlation matrix (SP-DCC). While sharing a similar sequential approach with existing dynamic conditional correlation (DCC) methods, SP-DCC has the advantage of not requiring the direct parameterization of the conditional covariance or correlation processes, and therefore avoids any assumption on their long-run target. In the proposed framework, conditional variances are estimated in the first step by univariate GARCH models, for the actual series and for suitably transformed series; in the second step, the latter are nonlinearly combined, according to basic properties of the covariance and correlation operators, to yield nonparametric estimates of the various conditional covariances and correlations. Moreover, in contrast to available DCC methods, SP-DCC allows for straightforward estimation in the non-simultaneous case as well, i.e., for conditional cross-covariances and correlations displaced at any time horizon of interest. A simple ex-post procedure based on nonlinear shrinkage is finally proposed to ensure well-behaved conditional covariance and correlation matrices. Due to its sequential implementation and scant computational burden, SP-DCC is very simple to apply and suitable for modeling vast sets of conditionally heteroskedastic time series.
    Keywords: Multivariate GARCH model, dynamic conditional correlation, semiparametric estimation
    JEL: C30 C51
    Date: 2015–12–10
    URL: http://d.repec.org/n?u=RePEc:mib:wpaper:317&r=ecm
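    A schematic version of the two-step idea, with a RiskMetrics-style EWMA filter standing in for the univariate GARCH fits of the first step. The second step combines the variances of the sum and difference series through the polarization identity Cov(x, y) = [Var(x+y) - Var(x-y)]/4, with an ex-post clipping of correlations as a crude stand-in for the shrinkage step.

```python
import numpy as np

rng = np.random.default_rng(2)

def ewma_var(x, lam=0.94):
    """Stand-in for the univariate GARCH step: an EWMA conditional-variance
    filter (any univariate volatility model would do here)."""
    h = np.empty_like(x)
    h[0] = x.var()
    for t in range(1, len(x)):
        h[t] = lam * h[t - 1] + (1 - lam) * x[t - 1] ** 2
    return h

# Two simulated return series with a slowly moving true correlation.
T = 1000
z = rng.standard_normal((T, 2))
rho = 0.5 + 0.4 * np.sin(np.linspace(0, 6, T))
x = z[:, 0]
y = rho * z[:, 0] + np.sqrt(1 - rho ** 2) * z[:, 1]

# Step 1: conditional variances of the series and of their sum/difference.
hx, hy = ewma_var(x), ewma_var(y)
hsum, hdif = ewma_var(x + y), ewma_var(x - y)

# Step 2: combine via the polarization identity, then form a conditional
# correlation, clipped into [-1, 1] ex post.
cov_xy = (hsum - hdif) / 4
corr_xy = np.clip(cov_xy / np.sqrt(hx * hy), -1, 1)
print(corr_xy[-5:])
```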
  4. By: Matthieu Garcin (Centre d'Economie de la Sorbonne & Natixis Asset Management); Clément Goulet (Centre d'Economie de la Sorbonne)
    Abstract: In this paper we propose a new model for estimating returns and volatility. Our approach is based both on the wavelet denoising technique and on variational theory. We show that volatility can be expressed as a non-parametric functional form of past returns. We are therefore able to forecast both returns and volatility and to build confidence intervals for predicted returns. Our technique outperforms classical time-series methods. Our model does not require stationarity of the observed log-returns, it preserves the stylised facts of volatility, and it is based on a fully non-parametric form. This non-parametric form is obtained thanks to multiplicative noise theory. To our knowledge, this is the first time such a method has been used for financial modelling. We propose an application to intraday and daily financial data.
    Keywords: Volatility modeling; non variational calculus; wavelet theory; trading strategy
    JEL: C14 C51 C53 C58
    Date: 2015–09
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:15086&r=ecm
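    A rough illustration of the wavelet-denoising ingredient only, applying PyWavelets to log squared returns with the Donoho-Johnstone universal threshold; the paper's variational, multiplicative-noise construction is not reproduced here, and the data-generating process is invented for the example.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(3)

# Simulated log-returns with a slowly moving volatility level.
T = 1024
true_vol = 0.01 * (1 + 0.5 * np.sin(np.linspace(0, 4 * np.pi, T)))
r = true_vol * rng.standard_normal(T)

# Wavelet-denoise a volatility proxy (log squared returns) with soft
# thresholding at the universal threshold.
proxy = np.log(r ** 2 + 1e-12)
coeffs = pywt.wavedec(proxy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise scale, finest level
thr = sigma * np.sqrt(2 * np.log(T))
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
log_vol2 = pywt.waverec(coeffs, "db4")[:T]
vol_hat = np.exp(log_vol2 / 2)
print("corr(vol_hat, true_vol):", np.corrcoef(vol_hat, true_vol)[0, 1])
```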
  5. By: CHEN, Cathy W.S.; WENG, Monica M.C.; WATANABE, Toshiaki
    Abstract: To allow for a higher degree of flexibility in model parameters, we propose a general and time-varying nonlinear smooth transition (ST) heteroskedastic model with a second-order logistic function of varying speed in the mean and variance. This paper evaluates the performance of Value-at-Risk (VaR) measures in a class of risk models, focusing especially on three distinct ST functions with GARCH structures: first- and second-order logistic functions, and the exponential function. The likelihood function is non-differentiable with respect to the threshold values and the delay parameter. We employ Bayesian Markov chain Monte Carlo sampling methods to update the estimates and quantile forecasts. The proposed methods are illustrated using simulated data and an empirical study. We estimate VaR forecasts for the proposed models alongside some competing asymmetric models with skew and fat-tailed error probability distributions, including realized volatility models. To evaluate the accuracy of VaR estimates, we implement two loss functions and three backtests. The results show that the ST model with a second-order logistic function and skew Student’s t error is a worthy choice at the 1% level, when compared to a range of existing alternatives.
    Keywords: Second-order logistic transition function, Backtesting, Markov chain Monte Carlo methods, Value-at-Risk, Volatility forecasting, Realized volatility models
    Date: 2015–12–08
    URL: http://d.repec.org/n?u=RePEc:hit:hiasdp:hias-e-16&r=ecm
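    The abstract does not name the three backtests; as an example of the kind of VaR accuracy check involved, here is Kupiec's proportion-of-failures likelihood-ratio test with invented out-of-sample numbers.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(violations, n, alpha=0.01):
    """Kupiec's proportion-of-failures LR backtest.
    H0: the true violation rate of the VaR forecasts equals alpha."""
    x, pi_hat = violations, violations / n
    if x == 0:
        lr = -2.0 * n * np.log(1.0 - alpha)
    else:
        lr = -2.0 * (x * np.log(alpha / pi_hat)
                     + (n - x) * np.log((1.0 - alpha) / (1.0 - pi_hat)))
    return lr, chi2.sf(lr, df=1)    # LR ~ chi2(1) under H0

# Invented out-of-sample record: 1% VaR, 500 trading days, 9 violations.
lr, pval = kupiec_pof(violations=9, n=500, alpha=0.01)
print(f"LR = {lr:.2f}, p-value = {pval:.3f}")
```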
  6. By: Matthieu Garcin (Centre d'Economie de la Sorbonne & Natixis Asset Management); Dominique Guegan (Centre d'Economie de la Sorbonne)
    Abstract: By filtering wavelet coefficients, it is possible to construct a good estimate of a pure signal from noisy data. In particular, for a simple linear noise influence, Donoho and Johnstone (1994) defined an optimal filter design in the sense of a good reconstruction of the pure signal. Here we consider a different framework, in which the influence of the noise is non-linear. In particular, we propose an optimal method to filter the wavelet coefficients of a discrete dynamical system disrupted by a weak noise, in order to construct good estimates of the pure signal, including the Bayes estimate, the minimax estimate, the oracular estimate and the thresholding estimate. We present the example of a simple chaotic dynamical system, as well as an adaptation of our technique, in order to show empirically the robustness of the thresholding method in the presence of leptokurtic noise. Moreover, we test both hard and soft thresholding, and also another, smoother kind of thresholding which seems to have almost the same reconstruction power as hard thresholding.
    Keywords: wavelets; dynamical systems; chaos; Gaussian noise; Cauchy noise; thresholding; nonequispaced design; non-linear noise impact
    Date: 2015–10
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:15085&r=ecm
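    A self-contained numpy sketch of the hard versus soft thresholding comparison on a noisy chaotic signal (a logistic map), using a plain Haar transform and the universal threshold with a known noise scale; this is a generic illustration, not the paper's optimal filter.

```python
import numpy as np

rng = np.random.default_rng(4)

# Noisy observations of a chaotic dynamical system (logistic map).
n = 1024
x = np.empty(n); x[0] = 0.3
for t in range(n - 1):
    x[t + 1] = 3.9 * x[t] * (1 - x[t])
y = x + 0.05 * rng.standard_normal(n)    # weak additive noise

def haar(v):
    """Full Haar decomposition: approximation + details, coarsest first."""
    out = []
    while len(v) > 1:
        out.append((v[::2] - v[1::2]) / np.sqrt(2))
        v = (v[::2] + v[1::2]) / np.sqrt(2)
    return v, out[::-1]

def ihaar(a, details):
    """Inverse Haar transform from approximation and detail coefficients."""
    for d in details:
        v = np.empty(2 * len(a))
        v[::2] = (a + d) / np.sqrt(2)
        v[1::2] = (a - d) / np.sqrt(2)
        a = v
    return a

a, det = haar(y)
thr = 0.05 * np.sqrt(2 * np.log(n))      # universal threshold, known scale

hard = [np.where(np.abs(d) > thr, d, 0.0) for d in det]
soft = [np.sign(d) * np.maximum(np.abs(d) - thr, 0.0) for d in det]
print("hard-threshold MSE:", np.mean((ihaar(a, hard) - x) ** 2))
print("soft-threshold MSE:", np.mean((ihaar(a, soft) - x) ** 2))
```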
  7. By: Jose Apesteguia; Miguel A. Ballester
    Abstract: Suppose that, when evaluating two alternatives x and y by means of a parametric utility function, low values of the parameter indicate a preference for x and high values indicate a preference for y. We say that a stochastic choice model is monotone whenever the probability of choosing x is decreasing in the preference parameter. We show that the standard use of random utility models in the context of risk and time preferences may sharply violate this monotonicity property, and argue that their use in preference estimation may be problematic. In particular, they may pose identification problems and yield biased estimates. We then establish that the alternative random parameter models, in contrast, are always monotone. We show in an empirical application that standard risk-aversion assessments may be severely biased.
    Keywords: Stochastic Choice; Preference Parameters; Random Utility Models; Random Parameter Models; Risk Aversion; Delay Aversion.
    JEL: C25 D81
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:upf:upfgen:1499&r=ecm
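    A small numerical illustration of the non-monotonicity at issue: under a logit random utility model with CRRA utility, the probability of choosing a risky lottery over a safe payment first falls and then drifts back toward 1/2 as risk aversion grows, because the utility scale collapses relative to the fixed noise. Payoffs and the noise scale below are invented for the example.

```python
import numpy as np

def crra(c, r):
    """CRRA utility, normalized so u(1) = 0; r = 1 gives log utility."""
    return np.log(c) if np.isclose(r, 1.0) else (c ** (1.0 - r) - 1.0) / (1.0 - r)

def p_risky(r, lam=0.1):
    """Logit random-utility probability of choosing a 50/50 lottery paying
    10 or 1 over a sure payment of 5; lam is the fixed logit noise scale."""
    eu_risky = 0.5 * crra(10.0, r) + 0.5 * crra(1.0, r)
    eu_safe = crra(5.0, r)
    return 1.0 / (1.0 + np.exp(-(eu_risky - eu_safe) / lam))

for r in [0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0]:
    print(f"risk aversion r = {r:5.1f}: P(choose risky) = {p_risky(r):.3f}")
# A monotone model would have P(choose risky) decreasing in r throughout;
# here it falls and then creeps back toward 1/2 as utility differences
# vanish relative to the noise, which is the distortion the paper flags.
```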
  8. By: Lee, Sokbae; Salanié, Bernard
    Abstract: Multivalued treatment models have only been studied so far under restrictive assumptions: ordered choice, or more recently unordered monotonicity. We show how marginal treatment effects can be identified in a more general class of models. Our results rely on two main assumptions: treatment assignment must be a measurable function of threshold-crossing rules; and enough continuous instruments must be available. On the other hand, we do not require any kind of monotonicity condition. We illustrate our approach on several commonly used models, and we also discuss the identification power of discrete instruments.
    Keywords: Discrete Choice; Identification; Monotonicity; Treatment evaluation
    JEL: C14 C21
    Date: 2015–12
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:10970&r=ecm
  9. By: Andreas Basse-O'Connor (Department of Mathematics); Raphaël Lachièze-Rey (Heidelberg University - Department of Mathematics); Mark Podolskij (Department of Mathematics and CREATES)
    Abstract: In this paper we present some new limit theorems for the power variation of k-th order increments of stationary increments Lévy driven moving averages. In this infill sampling setting, the asymptotic theory gives very surprising results, which (partially) have no counterpart in the theory of discrete moving averages. More specifically, we will show that the first order limit theorems and the mode of convergence strongly depend on the interplay between the given order of the increments, the considered power p, the Blumenthal-Getoor index of the driving pure jump Lévy process L and the behaviour of the kernel function g at 0. The first order asymptotic theory essentially comprises three cases: stable convergence towards a certain infinitely divisible distribution, an ergodic type limit theorem, and convergence in probability towards an integrated random process. We also prove the second order limit theorem connected to the ergodic type result. When the driving Lévy process L is a symmetric stable process we obtain two different limits: a central limit theorem and convergence in distribution towards a stable random variable.
    Keywords: Power variation, limit theorems, moving averages, fractional processes, stable convergence, high frequency data
    JEL: C10 C13 C14
    Date: 2015–12–01
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-56&r=ecm
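    A toy check of the objects involved: the power variation of k-th order increments, computed here for a symmetric stable Lévy process (the degenerate moving average with kernel g = 1), with the stability index playing the role of the Blumenthal-Getoor index.

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(5)

def power_variation(x, p, k=1):
    """V(p)_n: sum of |k-th order increments of x| raised to the power p."""
    d = np.asarray(x, dtype=float)
    for _ in range(k):
        d = np.diff(d)
    return np.sum(np.abs(d) ** p)

# A symmetric alpha-stable Levy process on [0, 1].
n, alpha = 10_000, 1.5
dX = levy_stable.rvs(alpha, 0.0, scale=(1.0 / n) ** (1.0 / alpha), size=n,
                     random_state=rng)
X = np.cumsum(dX)
for p in [0.5, 1.0, 1.25]:
    stat = n ** (p / alpha - 1.0) * power_variation(X, p, k=1)
    print(f"p = {p}: normalized V(p)_n = {stat:.3f}")
# For p < alpha these normalized statistics converge (a law of large
# numbers); p >= alpha falls into the degenerate/critical regimes that the
# limit theory treats separately.
```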
  10. By: Xin Geng (IFPRI); Carlos Martins-Filho (University of Colorado, Department of Economics); Feng Yao (West Virginia University, Department of Economics)
    Abstract: We propose a kernel-based estimator for a partially linear regression in a triangular system where endogenous regressors appear both in the nonparametric and linear components of the regression. Compared with alternative estimators currently available in the literature (Ai and Chen 2003; Otsu 2011), our estimator has an explicit functional form, is easier to implement, and exhibits better experimental finite sample performance. The estimator is inspired by the control function approach of Newey et al. (1999) and was initially proposed by Martins-Filho and Yao (2012). It exploits conditional moment restrictions that make it suitable for additive regression estimation as in Kim et al. (1999) and Manzan and Zerom (2005). We establish consistency and asymptotic normality of the estimator for the parameters in the linear component of the model and give a uniform convergence rate for the estimator of the nonparametric component. In addition, for statistical inference, a consistent estimator for the covariance of the limiting distribution of the parametric estimator is provided. We illustrate the empirical viability of our estimation procedure by applying it to the study of the impact of foreign aid and policy on growth of per capita gross domestic product (GDP) in developing countries.
    Keywords: partially linear regression, endogeneity, semiparametric instrumental variable estimation
    JEL: C14 C36
    Date: 2015–10
    URL: http://d.repec.org/n?u=RePEc:wvu:wpaper:15-46&r=ecm
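    A schematic control-function implementation in the spirit of the estimator: first-stage kernel residuals as the control, then Robinson-type partialling to recover the linear coefficient. The data-generating process, bandwidth and kernel are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(6)

def nw(xq, x, y, h=0.3):
    """Nadaraya-Watson regression with a Gaussian product kernel."""
    d2 = (((xq[:, None, :] - x[None, :, :]) / h) ** 2).sum(axis=2)
    w = np.exp(-0.5 * d2)
    return (w @ y) / w.sum(axis=1)

# DGP: y = beta*x + g(z) + u, with x endogenous through v and an
# instrument w entering the first stage x = m(w) + v.
n, beta = 1000, 1.0
w = rng.uniform(-2, 2, n)
z = rng.uniform(-2, 2, n)
v = rng.standard_normal(n)
u = 0.7 * v + rng.standard_normal(n)       # endogeneity: corr(u, v) != 0
x = np.sin(w) + w + v                      # first stage, m(w) = sin(w) + w
y = beta * x + np.cos(z) + u               # nonparametric part g(z) = cos(z)

# Control function: first-stage kernel residuals estimate v.
vhat = x - nw(w[:, None], w[:, None], x)

# Robinson-type partialling on (z, vhat), then OLS on the residuals.
S = np.column_stack([z, vhat])
ey = y - nw(S, S, y)
ex = x - nw(S, S, x)
print("beta_hat =", (ex @ ey) / (ex @ ex))
```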
  11. By: Kim, Jae; Choi, In
    Abstract: This paper re-evaluates key past results of unit root testing, emphasizing that the use of a conventional level of significance is not in general optimal because the test has low power. The optimal levels for popular unit root tests, chosen along the line of enlightened judgement under a symmetric loss function, are found to be much higher than the conventional ones. We also propose simple calibration rules for the optimal level of significance for a range of unit root tests based on asymptotic local power. At the optimal levels, many time series in the extended Nelson-Plosser data set are judged to be trend-stationary, including real income variables, employment variables and money stock. We also find nearly all real exchange rates covered in the Elliott-Pesavento study to be stationary at the optimal levels, which lends strong support to purchasing power parity. Additionally, most of the real interest rates covered in the Rapach-Weber study are found to be stationary.
    Keywords: Expected Loss; Optimal Level of Significance; Power of the Test; Response Surface
    JEL: C12 E30 F30
    Date: 2015–12–17
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:68411&r=ecm
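    The calibration idea can be mimicked by Monte Carlo: simulate the null and a fixed alternative, trace out power as a function of the level, and pick the level that minimizes the sum of type I and type II error probabilities (a symmetric loss). The Dickey-Fuller variant, sample size and alternative below are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def df_tstat(y):
    """Dickey-Fuller t-statistic, no constant: regress dy_t on y_{t-1}."""
    dy, ylag = np.diff(y), y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)
    resid = dy - rho * ylag
    se = np.sqrt((resid @ resid) / (len(dy) - 1) / (ylag @ ylag))
    return rho / se

def ar1(T, phi):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = phi * y[t - 1] + rng.standard_normal()
    return y

T, reps, phi_alt = 100, 2000, 0.9
null_t = np.array([df_tstat(np.cumsum(rng.standard_normal(T)))
                   for _ in range(reps)])                    # unit root
alt_t = np.array([df_tstat(ar1(T, phi_alt)) for _ in range(reps)])

# Symmetric loss: choose alpha minimizing alpha + beta(alpha), where beta
# is the type II error rate of the left-tailed test at that level.
alphas = np.linspace(0.01, 0.99, 99)
crit = np.quantile(null_t, alphas)          # left-tail critical values
beta = np.array([(alt_t > c).mean() for c in crit])
print("optimal level:", alphas[np.argmin(alphas + beta)])
```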
  12. By: Harvey Goldstein; Peter Lynn; Graciela Muniz-Terrera; Rebecca Hardy; Colm O’Muircheartaigh; Chris J. Skinner; Risto Lehtonen
    Abstract: In an opening paper Harvey Goldstein questions the need for observational studies to achieve representativeness for real populations, in particular for longitudinal studies. He draws upon recent debates and argues for the need to distinguish scientific inference from population inference. The points he raises are then debated in commentaries by Peter Lynn, Graciela Muniz-Terrera and Rebecca Hardy, Colm O'Muircheartaigh, Chris Skinner and Risto Lehtonen. These commentaries are followed by a response from Goldstein.
    JEL: C1
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:64705&r=ecm
  13. By: Koen Jochmans (Département d'économie); Thierry Magnac (Groupe de recherche en économie mathématique et quantitative)
    Abstract: Consider estimating the slope coefficients of a fixed-effect binary-choice model from two-period panel data. Two approaches to semiparametric estimation at the regular parametric rate have been proposed. One is based on a sufficient statistic, the other is based on a conditional-median restriction. We show that, under standard assumptions, both approaches are equivalent.
    Keywords: binary choice, fixed effects, panel data, regular estimation, sufficiency.
    Date: 2015–12
    URL: http://d.repec.org/n?u=RePEc:spo:wpecon:info:hdl:2441/2t7dgrpjh58e9a93hqot3nu9k3&r=ecm
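    The sufficiency approach is easy to state in code for the two-period logit case: conditioning on y_i1 + y_i2 = 1 eliminates the fixed effect, and the slope is estimated by a logit of the second-period choice on the covariate difference among switchers. A minimal simulation sketch:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(8)

# Two-period fixed-effect logit: y_it = 1{ b*x_it + a_i + e_it > 0 },
# with e_it i.i.d. logistic and arbitrary individual effects a_i.
n, b_true = 5000, 1.0
a = 2.0 * rng.standard_normal(n)
x = rng.standard_normal((n, 2))
y = (rng.uniform(size=(n, 2)) < expit(b_true * x + a[:, None])).astype(int)

# Sufficiency: conditional on y_i1 + y_i2 = 1 the fixed effect drops out,
# and P(y_i2 = 1 | switcher) = Lambda(b * (x_i2 - x_i1)).
sw = y.sum(axis=1) == 1                    # keep switchers only
dx, d = x[sw, 1] - x[sw, 0], y[sw, 1]

def negloglik(b):
    p = np.clip(expit(b * dx), 1e-10, 1 - 1e-10)
    return -np.sum(d * np.log(p) + (1 - d) * np.log(1 - p))

print("b_hat =", minimize(negloglik, x0=[0.0]).x[0])
```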
  14. By: Ruijun Bu; Jie Cheng; Kaddour Hadri
    Abstract: Reducible diffusions (RDs) are nonlinear transformations of analytically solvable Basic Diffusions (BDs) and are therefore constructed to be analytically tractable and flexible diffusion processes. The existing literature on RDs has mostly focused on time-homogeneous transformations, which to a significant extent fail to explore the full potential of RDs from both theoretical and practical points of view. In this paper, we propose flexible and economically justifiable time-variations to the transformations of RDs. Concentrating on the Constant Elasticity Variance (CEV) RDs, we consider nonlinear dynamics for our time-varying transformations with both deterministic and stochastic designs. Such time-variations can greatly enhance the flexibility of RDs while maintaining sufficient tractability of the resulting models. Our approach retains the benefits of classical inferential techniques while accommodating the advocated time-varying nonlinear dynamics. Our application to UK and US short-term interest rates suggests that, from an empirical point of view, time-varying transformations are highly relevant and statistically significant.
    Keywords: Stochastic Differential Equation, Reducible Diffusion, Constant Elasticity Variance, Time-Varying Transformation, Maximum Likelihood Estimation, Short-Term Interest Rate
    JEL: C13 C32 G12
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:qub:wpaper:1401&r=ecm
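    A toy construction of a reducible diffusion with a time-varying transformation: an exactly simulated Ornstein-Uhlenbeck basic diffusion mapped through an invertible transformation with a deterministic time-varying scale. The particular transformation is invented for the example and is not the paper's CEV specification.

```python
import numpy as np

rng = np.random.default_rng(9)

# Basic diffusion: an Ornstein-Uhlenbeck process, simulated from its exact
# Gaussian transition density.
T, n = 10.0, 5000
dt = T / n
kappa, mu, sig = 1.0, 0.0, 0.5
a = np.exp(-kappa * dt)
s = sig * np.sqrt((1 - a ** 2) / (2 * kappa))
Y = np.zeros(n)
for t in range(1, n):
    Y[t] = mu + (Y[t - 1] - mu) * a + s * rng.standard_normal()

# Reducible diffusion: an invertible transformation of Y with a
# deterministic, time-varying scale c(t); this toy c(t) is illustrative only.
tgrid = np.linspace(0.0, T, n)
c = 1.0 + 0.3 * np.sin(2 * np.pi * tgrid / T)
X = np.exp(Y) * c
# Because x -> exp(y) * c(t) is invertible at each t, the likelihood of X
# follows from the known transition density of Y by a change of variables,
# which is what keeps such models amenable to maximum likelihood.
print(X[:5])
```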
  15. By: Mark Podolskij (Aarhus University, Department of Mathematics and CREATES); Christian Schmidt (Aarhus University, Department of Mathematics and CREATES); Mathias Vetter (Christian-Albrechts-Universität zu Kiel, Mathematisches Seminar)
    Abstract: In this paper we examine the asymptotic theory for U-statistics and V-statistics of discontinuous Itô semimartingales that are observed at high frequency. For different types of kernel functions we show laws of large numbers and associated stable central limit theorems. In most of the cases the limiting process will be conditionally centered Gaussian. The structure of the kernel function determines whether the jump and/or the continuous part of the semimartingale contribute to the limit.
    Keywords: central limit theorems, Itô semimartingales, stable convergence, U-statistics.
    JEL: C10 C13 C14
    Date: 2015–11–20
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-52&r=ecm
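    The statistics under study are easy to compute: averages of a kernel H over all pairs of (scaled) high-frequency increments. A small sketch for a Brownian path plus compound Poisson jumps; whether the jumps matter in the limit depends on how fast H grows, which is the structural point of the paper.

```python
import numpy as np

rng = np.random.default_rng(10)

# Increments of an Ito semimartingale on [0, 1]: Brownian part plus a
# compound Poisson jump component (about 5 jumps on average).
n, sigma = 1000, 1.0
dX = sigma * rng.standard_normal(n) / np.sqrt(n)
jump_times = rng.uniform(size=n) < 5.0 / n
dX[jump_times] += rng.standard_normal(jump_times.sum())

def u_statistic(dX, H):
    """U-statistic of scaled increments: average of H over all pairs i < j."""
    s = np.sqrt(len(dX)) * dX
    i, j = np.triu_indices(len(s), k=1)
    return H(s[i], s[j]).mean()

# A kernel with linear growth: both the continuous part and the (finitely
# many) jump increments contribute noticeably to the average.
print("H = |x||y|      :", u_statistic(dX, lambda x, y: np.abs(x) * np.abs(y)))
# A bounded kernel: the jumps are asymptotically negligible.
print("H = cos(x)cos(y):", u_statistic(dX, lambda x, y: np.cos(x) * np.cos(y)))
```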
  16. By: Andreas Basse-O'Connor (Department of Mathematics); Mark Podolskij (Department of Mathematics and CREATES)
    Abstract: In this paper we present some limit theorems for power variation of stationary increments Lévy driven moving averages in the setting of critical regimes. In [5] the authors derived first and second order asymptotic results for k-th order increments of stationary increments Lévy driven moving averages. The limit theory heavily depends on the interplay between the given order of the increments, the considered power, the Blumenthal-Getoor index of the driving pure jump Lévy process L and the behavior of the kernel function g at 0. In this work we will study the critical cases, which were not covered in the original work [5].
    Keywords: Power variation, limit theorems, moving averages, fractional processes, stable convergence, high frequency data
    JEL: C10 C13 C14
    Date: 2015–12–01
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-57&r=ecm
  17. By: Francesco Sergi (Centre d'Economie de la Sorbonne)
    Abstract: The purpose of this contribution to the epistemology and history of recent macroeconomics is to construct a clear understanding of econometric methods and problems in New Classical macroeconomics. Most historical work has so far focused on the theoretical or policy-implication aspects of this research program, set in motion by Robert Lucas in the early seventies. By contrast, the empirical and econometric work of New Classical macroeconomics has received little attention. I focus especially on the contributions gathered in Rational Expectations and Econometric Practice, edited in 1981 by Lucas and Thomas Sargent. The main claim of this article is that the publication of this book must be regarded as a turn in macroeconomics, one that would bring macroeconometric modeling methodology closer to Lucas's conception of models. The analysis of New Classical macroeconometrics through the Lucas methodology allows us to propose an original historical account of the methods presented in Rational Expectations and Econometric Practice, but also of the problems that flawed this approach.
    Keywords: history of macroeconomics; Lucas (Robert); Sargent (Thomas); macroeconometrics; modeling methodology
    JEL: B22 B41
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:15088&r=ecm
  18. By: Franses, Ph.H.B.F.; de Bruijn, B.
    Abstract: Many publicly available macroeconomic forecasts are judgmentally-adjusted model-based forecasts. In practice, usually only a single final forecast is available; neither the underlying econometric model nor the size of and reason for the adjustment are known. Hence, the relative weights given to the model forecasts and to the judgment are usually unknown to the analyst. This paper proposes a methodology to evaluate the quality of such final forecasts, and also to allow learning from past errors. To do so, the analyst needs benchmark forecasts. We propose two such benchmarks. The first is the simple no-change forecast, which is the bottom-line forecast that an expert should be able to improve upon. The second benchmark is an estimated model-based forecast, found as the best forecast given the realizations and the final forecasts. We illustrate this methodology for two sets of GDP growth forecasts, one for the US and one for the Netherlands. These applications tell us that adjustment appears most effective in periods of first recovery from a recession.
    Keywords: forecast decomposition, expert adjustment, total least squares
    JEL: C20 C51
    Date: 2015–11–01
    URL: http://d.repec.org/n?u=RePEc:ems:eureir:79222&r=ecm
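    A compact sketch of the two benchmarks on invented data: the no-change forecast, and a total-least-squares fit that treats both the final forecast and the realization as noisy (computed from the smallest right singular vector of the centered data matrix).

```python
import numpy as np

rng = np.random.default_rng(11)

# Invented quarterly GDP growth realizations y and expert final forecasts f.
T = 60
y = 0.5 + 0.8 * rng.standard_normal(T)
f = 0.3 + 0.7 * y + 0.4 * rng.standard_normal(T)   # noisy, biased expert

rmse = lambda e: np.sqrt(np.mean(e ** 2))
# Benchmark 1: the no-change forecast y_{t-1}.
print("final forecast RMSE:", rmse(y[1:] - f[1:]))
print("no-change RMSE:    ", rmse(y[1:] - y[:-1]))

# Benchmark 2: total least squares line through (f, y); the smallest right
# singular vector of the centered data gives the orthogonal-fit slope.
A = np.column_stack([f - f.mean(), y - y.mean()])
_, _, Vt = np.linalg.svd(A, full_matrices=False)
slope = -Vt[-1, 0] / Vt[-1, 1]
intercept = y.mean() - slope * f.mean()
print(f"TLS benchmark: y ~ {intercept:.2f} + {slope:.2f} * f")
```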
  19. By: Niels Haldrup (Aarhus University and CREATES); J. Eduardo Vera-Valdés (Aarhus University and CREATES)
    Abstract: It is commonly argued that observed long memory in time series variables can result from cross-sectional aggregation of dynamic heterogeneous micro units. For instance, Granger (1980) demonstrated that aggregation of AR(1) processes with a Beta-distributed AR coefficient can exhibit long memory under certain conditions, and that the aggregated series will have an autocorrelation function that exhibits hyperbolic decay. In this paper, we analyze this phenomenon further. We demonstrate that the aggregation argument leading to long memory is consistent with a wide range of definitions of long memory. In a simulation study we seek to quantify Granger's result and find that both the time series and the cross-sectional dimension indeed have to be rather large to reflect the theoretical asymptotic results. Long memory can result even for moderate T, N dimensions, but the estimated degree of memory can vary considerably from the theoretical one. Also, Granger's result is most precise in samples with a relatively high degree of memory. Finally, we show that even though the aggregated process behaves as a generalized fractional process and thus converges asymptotically to a fractional Brownian motion, the fractionally differenced series does not behave according to an ARMA process. In particular, although its autocorrelation function is summable, so that the fractionally differenced process satisfies the conditions for being I(0), it still exhibits hyperbolic decay. This may have consequences for the validity of ARFIMA time series modeling of long memory processes when the source of the memory is aggregation.
    Keywords: Long memory, Fractional Integration, Aggregation
    JEL: C2 C22
    Date: 2015–12–12
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-59&r=ecm
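    Granger's mechanism is easy to reproduce: average N AR(1) processes whose squared coefficients are Beta(p, q) distributed, then estimate the memory parameter by a GPH log-periodogram regression. With q = 1.5 the theoretical order of integration is d = 1 - q/2 = 0.25; as the abstract stresses, moderate (N, T) gives estimates that can deviate considerably from it.

```python
import numpy as np

rng = np.random.default_rng(12)

# Cross-sectional aggregation of AR(1) processes, phi^2 ~ Beta(p, q).
N, T, burn = 500, 5000, 2000
p, q = 2.0, 1.5                      # implies d = 1 - q/2 = 0.25
phi = np.sqrt(rng.beta(p, q, size=N))
x = np.zeros(N)
agg = np.empty(T + burn)
for t in range(T + burn):
    x = phi * x + rng.standard_normal(N)
    agg[t] = x.mean()
agg = agg[burn:]

# GPH log-periodogram regression: log I(w_j) on -2*log(2*sin(w_j/2));
# the slope estimates the memory parameter d.
m = int(T ** 0.5)                    # number of low frequencies used
j = np.arange(1, m + 1)
w = 2 * np.pi * j / T
I = np.abs(np.fft.fft(agg - agg.mean())[1:m + 1]) ** 2 / (2 * np.pi * T)
X = -2 * np.log(2 * np.sin(w / 2))
d_hat = np.polyfit(X, np.log(I), 1)[0]
print(f"estimated d: {d_hat:.2f} (theoretical: 0.25)")
```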

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.