nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒01‒31
thirteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Indirect inference with time series observed with error By Eduardo Rossi ; Paolo Santucci de Magistris
  2. Nonparametric likelihood for volatility under high frequency data By Lorenzo Camponovo ; Yukitoshi Matsushita ; Taisuke Otsu
  3. Bayesian Modeling of Dynamic Extreme Values: Extension of Generalized Extreme Value Distributions with Latent Stochastic Processes By Jouchi Nakajima ; Tsuyoshi Kunihama ; Yasuhiro Omori
  4. Estimating Stable Factor Models By Indirect Inference By Giorgio Calzolari ; Roxana Halbleib
  5. Inference about the Indirect Effect: a Likelihood Approach By Noud P.A. van Giersbergen
  6. Inference in High-dimensional Dynamic Panel Data Models By Anders Bredahl Kock ; Haihan Tang
  7. The Statistical Implications of Common Identifying Restrictions for DSGE Models By Stephen Morris
  8. Forecast-error-based estimation of forecast uncertainty when the horizon is increased By Knüppel, Malte
  9. Bootstrap inference of matching estimators for average treatment effects By Taisuke Otsu ; Yoshiyasu Rai
  10. Indirect estimation and econometrics exams: how to live a round life By Giorgio Calzolari
  11. Combining Alphas via Bounded Regression By Zura Kakushadze
  12. Slutsky Matrix Norms and Revealed Preference Tests of Consumer Behaviour By Victor Aguiar ; Roberto Serrano
  13. A Narrative Approach to a Fiscal DSGE Model By Thorsten Drautzburg

  1. By: Eduardo Rossi (University of Pavia); Paolo Santucci de Magistris (Aarhus University and CREATES)
    Abstract: We analyze the properties of the indirect inference estimator when the observed series are contaminated by measurement error. We show that the indirect inference estimates are asymptotically biased when the nuisance parameters of the measurement error distribution are neglected in the indirect estimation. We propose to solve this inconsistency by jointly estimating the nuisance and the structural parameters. Under standard assumptions, this estimator is consistent and asymptotically normal. A condition for the identification of ARMA-plus-noise models is obtained. The proposed methodology is used to estimate the parameters of continuous-time stochastic volatility models with auxiliary specifications based on realized volatility measures. Monte Carlo simulations show the bias reduction of the indirect estimates obtained when the microstructure noise is explicitly modeled. Finally, an empirical application illustrates the relevance of a realistic specification of the microstructure noise distribution to match the features of the observed log-returns at high frequencies.
    Keywords: Indirect inference, measurement error, stochastic volatility, realized volatility
    JEL: C13 C15 C22 C58
    Date: 2014–12–31
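    The attenuation bias described above can be seen in a toy calculation (all parameter values hypothetical, not from the paper): an AR(1) signal observed with additive noise exhibits a damped first-order autocorrelation, which is what misleads an auxiliary model that ignores the noise.

```python
# Attenuation of the lag-1 autocorrelation of an AR(1) process observed
# with additive measurement error. For x_t = phi*x_{t-1} + e_t with
# Var(e) = s2_e, and y_t = x_t + u_t with Var(u) = s2_u, the observed
# lag-1 autocorrelation is phi * var_x / (var_x + s2_u).

def observed_autocorr(phi, s2_e, s2_u):
    var_x = s2_e / (1.0 - phi ** 2)       # stationary variance of the AR(1)
    return phi * var_x / (var_x + s2_u)   # attenuated autocorrelation

# With phi = 0.9 and equal innovation and noise variances, the
# autocorrelation seen by an auxiliary model drops from 0.90 to about 0.76.
print(round(observed_autocorr(0.9, 1.0, 1.0), 2))
```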
  2. By: Lorenzo Camponovo ; Yukitoshi Matsushita ; Taisuke Otsu
    Abstract: We propose a nonparametric likelihood inference method for the integrated volatility under high frequency financial data. The nonparametric likelihood statistic, which contains the conventional statistics such as empirical likelihood and Pearson's chi-square as special cases, is not asymptotically pivotal under the so-called infill asymptotics, where the number of high frequency observations in a fixed time interval increases to infinity. We show that multiplying by a correction term recovers the chi-square limiting distribution. Furthermore, we establish Bartlett correction for our modified nonparametric likelihood statistic under the constant and general non-constant volatility cases. In contrast to the existing literature, the empirical likelihood statistic is not Bartlett correctable under the infill asymptotics. However, by choosing adequate tuning constants for the power divergence family, we show that the second-order refinement to the order n^{-2} can be achieved.
    Keywords: Nonparametric likelihood, Volatility, High frequency data
    JEL: C14
    Date: 2015–01
  3. By: Jouchi Nakajima (Bank of Japan); Tsuyoshi Kunihama (Department of Statistical Science, Duke University); Yasuhiro Omori (Faculty of Economics, The University of Tokyo)
    Abstract: This paper develops Bayesian inference for extreme value models with a flexible time-dependent latent structure. The generalized extreme value distribution is utilized to incorporate state variables that follow an autoregressive moving average (ARMA) process with Gumbel-distributed innovations. The time-dependent extreme value distribution is combined with heavy-tailed error terms. An efficient Markov chain Monte Carlo algorithm is proposed using a state space representation with a mixture of normal distributions approximating the Gumbel distribution. The methodology is illustrated using extreme values of stock returns and electricity demand. Estimation results show the usefulness of the proposed model and provide evidence that the latent autoregressive process and heavy-tailed errors play an important role in describing the monthly series of minimum stock returns and maximum electricity demand.
    Date: 2015–01
  4. By: Giorgio Calzolari (Dipartimento di Statistica, University of Firenze, Italy); Roxana Halbleib (Department of Economics, University of Konstanz, Germany)
    Abstract: Financial returns exhibit common behavior best described by factor models, but also fat tails, which may be captured by α-stable distributions. This paper concentrates on estimating factor models with multivariate α-stable distributed and independent factors and idiosyncratic noises under the assumption of a time-constant distribution (static factor models) or a time-varying conditional distribution (GARCH factor models). While simulation from such a distribution is straightforward, estimation encounters difficulties. These difficulties are overcome in this paper by implementing the indirect inference estimation method with the multivariate Student's t as the auxiliary distribution.
    Keywords: Symmetric Multivariate α-stable Distribution, Factor Models, Indirect Inference, Multivariate Student’s t Distribution, Discrete Spectral Measures, GARCH Models
    JEL: C13 C15 C18 C38 C46
    Date: 2014–12–28
  5. By: Noud P.A. van Giersbergen
    Abstract: Prior research for constructing confidence intervals for an indirect effect has focused on a Wald statistic. In this paper, however, the inference problem is analyzed from a likelihood ratio (LR) perspective. When testing the null hypothesis $H_{0}:\ \alpha \beta =0$, the LR test statistic leads to the minimum of two t-ratios, whose size can be controlled. A confidence interval is obtained by inverting the LR statistic. Another confidence interval is obtained by inverting the sum of two pivotal t-statistics. In the Monte Carlo simulations, this latter confidence interval is the best performer: it outperforms the commonly used existing methods.
    Date: 2014–12–30
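    The minimum-of-two-t-ratios statistic mentioned in the abstract amounts to a joint-significance rule: reject H0: αβ = 0 only when both coefficients are individually significant. A minimal sketch (all t-values and names hypothetical):

```python
# "Min-t" test for H0: alpha*beta = 0. In a mediation setting, t_a and
# t_b are the t-ratios of alpha and beta from the two regressions that
# define the indirect effect; the values used below are made up.

def min_t_statistic(t_a, t_b):
    return min(abs(t_a), abs(t_b))

def reject_h0(t_a, t_b, crit=1.96):
    # Reject only if BOTH coefficients are individually significant.
    return min_t_statistic(t_a, t_b) > crit

print(reject_h0(2.5, 3.1))   # both t-ratios large -> True (reject)
print(reject_h0(2.5, 0.4))   # beta insignificant  -> False (do not reject)
```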
  6. By: Anders Bredahl Kock (Aarhus University and CREATES); Haihan Tang (Cambridge University)
    Abstract: We establish oracle inequalities for a version of the Lasso in high-dimensional fixed effects dynamic panel data models. The inequalities are valid for the coefficients of the dynamic and exogenous regressors. Separate oracle inequalities are derived for the fixed effects. Next, we show how one can conduct simultaneous inference on the parameters of the model and construct a uniformly valid estimator of the asymptotic covariance matrix which is robust to conditional heteroskedasticity in the error terms. Allowing for conditional heteroskedasticity is important in dynamic models as the conditional error variance may be non-constant over time and depend on the covariates. Furthermore, our procedure allows for inference on high-dimensional subsets of the parameter vector of an increasing cardinality. We show that the confidence bands resulting from our procedure are asymptotically honest and contract at the optimal rate. This rate is different for the fixed effects than for the remaining parts of the parameter vector.
    Keywords: Panel data, Dynamic models, Lasso, Desparsification, High-dimensional data, Uniform inference, Honest inference, Oracle inequality, Confidence intervals, Tests
    JEL: C13 C23
    Date: 2014–12–30
  7. By: Stephen Morris (UC San Diego)
    Abstract: I reveal identification failures in a well-known dynamic stochastic general equilibrium (DSGE) model, and study the statistical implications of common identifying restrictions. First, I provide a fully analytical methodology for determining all observationally equivalent values of the structural parameters in any parameter space. I show that either parameter admissibility or sign restrictions may yield global identification for some parameter realizations, but not for others. Second, I derive a "plug-in" maximum likelihood estimator, which requires no numerical search. I use this tool to demonstrate that the idiosyncratic identifying restriction directly impinges on both the location and distribution of the small-sample MLE, and compute correctly sized confidence intervals.
    Date: 2014
  8. By: Knüppel, Malte
    Abstract: Recently, several institutions have increased their forecast horizons, and many institutions rely on their past forecast errors to estimate measures of forecast uncertainty. This work addresses the question of how the latter estimation can be accomplished if only very few errors are available for the new forecast horizons. It extends the results of Knüppel (2014) in order to relax the condition on the data structure required for the SUR estimator to be independent of unknown quantities. It turns out that the SUR estimator of forecast uncertainty tends to deliver large efficiency gains compared to the OLS estimator (i.e. the sample mean of the squared forecast errors) in the case of increased forecast horizons. The SUR estimator is applied to the forecast errors of the Bank of England and the FOMC.
    Keywords: multi-step-ahead forecasts,forecast error variance,SUR
    JEL: C13 C32 C53
    Date: 2014
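    For reference, the "OLS estimator" the abstract compares against is simply the sample mean of the squared forecast errors at a given horizon (the error values below are made up for illustration):

```python
# Sample mean of squared forecast errors -- the simple per-horizon
# estimator of forecast error variance that the SUR estimator improves on.

def forecast_error_variance(errors):
    return sum(e * e for e in errors) / len(errors)

two_step_errors = [0.4, -0.6, 0.1, -0.3]   # hypothetical h=2 forecast errors
print(round(forecast_error_variance(two_step_errors), 3))
```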
  9. By: Taisuke Otsu ; Yoshiyasu Rai
    Abstract: Abadie and Imbens (2008) showed that the standard naive bootstrap is inconsistent to estimate the distribution of the matching estimator for treatment effects with a fixed number of matches. This article proposes an asymptotically valid inference method for the matching estimators based on the wild bootstrap. The key idea is to resample not only the regression residuals of the treated and untreated observations but also those used to estimate the average treatment effects. The proposed method is valid even for the case of vector covariates by incorporating the bias correction method of Abadie and Imbens (2011), and is applicable to estimating both the average treatment effect and its counterpart for the treated population. A simulation study indicates that our wild bootstrap method compares favorably with the asymptotic normal approximation. As an empirical illustration, we apply our bootstrap method to the National Supported Work data.
    Keywords: Treatment effect, matching, bootstrap
    JEL: C21
    Date: 2015–01
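    The wild-bootstrap resampling step at the heart of the proposal can be sketched as follows. This is a generic illustration of wild-bootstrap weights, not the paper's full procedure, and the function names and residual values are made up:

```python
import random

def wild_bootstrap_draw(residuals, seed=0):
    # Multiply each residual by an independent Rademacher (+/-1) weight,
    # preserving its magnitude while randomizing its sign.
    rng = random.Random(seed)
    return [e * rng.choice([-1.0, 1.0]) for e in residuals]

residuals = [0.5, -1.2, 0.3, 0.9]
star = wild_bootstrap_draw(residuals)
# Magnitudes are preserved; only the signs are resampled.
print(all(abs(a) == abs(b) for a, b in zip(residuals, star)))  # True
```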
  10. By: Giorgio Calzolari (Dipartimento di Statistica, Informatica, Applicazioni "G. Parenti", Università di Firenze)
    Abstract: The use of Monte Carlo methods to generate exam data sets is nowadays a well-established practice among econometrics examiners all over the world. Its advantages are well known: providing each student a different data set ensures that estimates are actually computed individually, rather than copied from someone sitting nearby. The method however has a major fault: initial "random errors", such as mistakes in downloading the assigned dataset, might generate downward bias in student evaluation. We propose a set of calibration algorithms, typical of indirect estimation methods, that solve the issue of initial "random errors" and reduce evaluation bias. Ensuring round initial estimates of the parameters for each individual data set, our calibration procedures allow the students to determine if they have started the exam correctly. When initial estimates are not round numbers, this random error in the initial stage of the exam can be corrected for immediately, thus reducing evaluation bias. The procedure offers the further advantage of rounding markers' life by allowing them to check round-number answers only, rather than lists of numbers with many decimal digits.
    Keywords: Indirect estimation, round numbers, econometrics exams
    Date: 2015–01
  11. By: Zura Kakushadze
    Abstract: We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications typically there is insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or skewed distribution against, e.g., turnover. This can be rectified by imposing bounds on alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples.
    Date: 2015–01
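    The core computational idea, least squares under box constraints on the weights, can be sketched with a toy projected-gradient solver. The paper's actual algorithm, bounds, and data differ; everything below (names, matrices, bounds) is illustrative:

```python
# Toy box-constrained least squares: minimize ||A w - b||^2 subject to
# lo <= w_j <= hi, via projected gradient descent. A production system
# would use a dedicated solver; this only illustrates the idea of
# imposing bounds on regression weights to force diversification.

def bounded_regression(A, b, lo, hi, steps=5000, lr=0.01):
    n = len(A[0])
    w = [0.0] * n
    for _ in range(steps):
        resid = [sum(A[i][j] * w[j] for j in range(n)) - b[i]
                 for i in range(len(A))]
        grad = [2 * sum(A[i][j] * resid[i] for i in range(len(A)))
                for j in range(n)]
        # Gradient step, then projection onto the box [lo, hi].
        w = [min(hi, max(lo, w[j] - lr * grad[j])) for j in range(n)]
    return w

# Two hypothetical "alpha streams" whose unconstrained weights (2.0, 0.1)
# would be badly skewed; the bounds [0.2, 0.8] force diversification.
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [2.0, 0.1, 2.1]
print([round(x, 2) for x in bounded_regression(A, b, 0.2, 0.8)])  # [0.8, 0.7]
```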
  12. By: Victor Aguiar ; Roberto Serrano
    Abstract: Given any observed finite sequence of prices, wealth and demand choices, we characterize the relation between its underlying Slutsky matrix norm (SMN) and some popular discrete revealed preference (RP) measures of departures from rationality, such as the Afriat index. We show that testing rationality in the SMN approach with finite data is equivalent to testing it under the RP approach. We propose a way to "summarize" the departures from rationality in a systematic fashion in finite datasets. Finally, these ideas are extended to an observed demand with noise due to measurement error; we formulate an appropriate modification of the SMN approach in this case and derive closed-form asymptotic results under standard regularity conditions.
    Keywords: consumer theory; rationality; Slutsky matrix function; revealed preference approach; bounded rationality.
    Date: 2015
  13. By: Thorsten Drautzburg (Federal Reserve Bank of Philadelphia)
    Abstract: DSGE models are used for analyzing policy and the sources of business cycles. A competing approach uses VARs that are partially identified using, for example, narrative shock measures and are often viewed as imposing fewer restrictions on the data. Narrative shocks are identified non-structurally through information external to particular models. This paper uses non-structural narrative shock measures to inform the structural estimation of DSGE models. Since fiscal policy has received much recent attention but the foundations of the fiscal side of DSGE models are less well studied than their monetary building block, fiscal DSGE models are a particularly promising application. Preliminary results from a standard medium-scale DSGE model support this argument: structurally identified monetary shocks line up well with narrative measures, whereas government spending shocks do not. Extending the model to include distortionary taxes and more general fiscal policy processes, I find that model-implied labor tax shocks line up well with narrative tax shocks. Including different narrative shock measures affects parameter identification and implied measures such as fiscal multipliers.
    Date: 2014

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.