
on Econometrics 
By:  Eduardo Rossi (University of Pavia ); Paolo Santucci de Magistris (Aarhus University and CREATES ) 
Abstract:  We analyze the properties of the indirect inference estimator when the observed series are contaminated by measurement error. We show that the indirect inference estimates are asymptotically biased when the nuisance parameters of the measurement error distribution are neglected in the indirect estimation. We propose to solve this inconsistency by jointly estimating the nuisance and the structural parameters. Under standard assumptions, this estimator is consistent and asymptotically normal. A condition for the identification of ARMA-plus-noise models is obtained. The proposed methodology is used to estimate the parameters of continuous-time stochastic volatility models with auxiliary specifications based on realized volatility measures. Monte Carlo simulations show the bias reduction of the indirect estimates obtained when the microstructure noise is explicitly modeled. Finally, an empirical application illustrates the relevance of a realistic specification of the microstructure noise distribution to match the features of the observed log-returns at high frequencies. 
Keywords:  Indirect inference, measurement error, stochastic volatility, realized volatility 
JEL:  C13 C15 C22 C58 
Date:  2014–12–31 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201457&r=ecm 
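The attenuation effect the abstract describes can be illustrated with a minimal simulation sketch (hypothetical AR(1)-plus-noise values, not the paper's stochastic volatility application): neglecting the measurement error biases the auxiliary autocorrelation toward zero, while matching it against simulations from the noise-contaminated model recovers the structural parameter. For brevity the noise variance is held at its true value here; the paper estimates the nuisance and structural parameters jointly.

```python
import random

def simulate(phi, sigma_e, sigma_u, n, seed):
    """AR(1) latent series observed with additive measurement error."""
    rng = random.Random(seed)
    x, ys = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sigma_e)
        ys.append(x + rng.gauss(0.0, sigma_u))  # contaminated observation
    return ys

def acf1(y):
    """First-order sample autocorrelation (the auxiliary statistic here)."""
    m = sum(y) / len(y)
    num = sum((y[t] - m) * (y[t - 1] - m) for t in range(1, len(y)))
    den = sum((v - m) ** 2 for v in y)
    return num / den

phi, sigma_e, sigma_u, n = 0.8, 1.0, 1.0, 10000  # illustrative "true" values
y_obs = simulate(phi, sigma_e, sigma_u, n, seed=42)

# naive auxiliary estimate: attenuated toward zero because the noise is ignored
rho_naive = acf1(y_obs)

# indirect-inference-style grid search: the simulator includes the noise,
# so matching the auxiliary statistic recovers phi without the attenuation bias
best_phi, best_dist = None, float("inf")
for cand in [i / 100 for i in range(50, 100)]:
    d = abs(acf1(simulate(cand, sigma_e, sigma_u, n, seed=7)) - rho_naive)
    if d < best_dist:
        best_phi, best_dist = cand, d
```

With these values the theoretical first-order autocorrelation of the noisy series is about 0.59, well below phi = 0.8, while the grid search lands near 0.8.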
By:  Lorenzo Camponovo ; Yukitoshi Matsushita ; Taisuke Otsu 
Abstract:  We propose a nonparametric likelihood inference method for the integrated volatility under high-frequency financial data. The nonparametric likelihood statistic, which contains conventional statistics such as empirical likelihood and Pearson's chi-square as special cases, is not asymptotically pivotal under the so-called infill asymptotics, where the number of high-frequency observations in a fixed time interval increases to infinity. We show that multiplying by a correction term recovers the chi-square limiting distribution. Furthermore, we establish the Bartlett correction for our modified nonparametric likelihood statistic under the constant and general non-constant volatility cases. In contrast to the existing literature, the empirical likelihood statistic is not Bartlett correctable under the infill asymptotics. However, by choosing adequate tuning constants for the power divergence family, we show that the second-order refinement to the order n^{-2} can be achieved. 
Keywords:  Nonparametric likelihood, Volatility, High frequency data 
JEL:  C14 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2015/581&r=ecm 
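For intuition, here is a minimal empirical likelihood statistic for a mean, the simplest special case of the power divergence family the abstract mentions; this i.i.d. sketch does not implement the infill-asymptotics correction or the integrated volatility setting.

```python
import math

def el_stat(x, mu):
    """-2 log empirical likelihood ratio for H0: E[X] = mu
    (asymptotically chi-square(1) in the i.i.d. case)."""
    d = [xi - mu for xi in x]
    n = len(d)
    if max(d) <= 0 or min(d) >= 0:
        return float("inf")  # mu lies outside the convex hull of the data
    # valid range for the Lagrange multiplier (keeps all weights positive)
    lo = (1.0 / n - 1.0) / max(d) + 1e-10
    hi = (1.0 / n - 1.0) / min(d) - 1e-10
    g = lambda lam: sum(di / (1.0 + lam * di) for di in d)  # decreasing in lam
    for _ in range(200):  # bisection for the root of g
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 2.0 * sum(math.log(1.0 + lam * di) for di in d)
```

At the sample mean the statistic is zero; it grows as the hypothesized mean moves away from the data.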
By:  Jouchi Nakajima (Bank of Japan ); Tsuyoshi Kunihama (Department of Statistical Science, Duke University ); Yasuhiro Omori (Faculty of Economics, The University of Tokyo ) 
Abstract:  This paper develops Bayesian inference of extreme value models with a flexible time-dependent latent structure. The generalized extreme value distribution is utilized to incorporate state variables that follow an autoregressive moving average (ARMA) process with Gumbel-distributed innovations. The time-dependent extreme value distribution is combined with heavy-tailed error terms. An efficient Markov chain Monte Carlo algorithm is proposed using a state space representation with a mixture of normal distributions approximating the Gumbel distribution. The methodology is illustrated using extreme data on stock returns and electricity demand. Estimation results show the usefulness of the proposed model and provide evidence that the latent autoregressive process and heavy-tailed errors play an important role in describing the monthly series of minimum stock returns and maximum electricity demand. 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2014cf952&r=ecm 
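The generalized extreme value distribution at the core of the model can be sketched as follows (plain static GEV functions; the paper's time-dependent latent structure and MCMC sampler are not reproduced here):

```python
import math

def gev_cdf(x, mu=0.0, sigma=1.0, xi=0.0):
    """CDF of the generalized extreme value distribution."""
    z = (x - mu) / sigma
    if abs(xi) < 1e-12:            # Gumbel limit as xi -> 0
        return math.exp(-math.exp(-z))
    t = 1.0 + xi * z
    if t <= 0:                     # outside the support
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

def gev_quantile(p, mu=0.0, sigma=1.0, xi=0.0):
    """Inverse CDF; e.g. p = 0.99 gives the 100-period return level."""
    if abs(xi) < 1e-12:
        return mu - sigma * math.log(-math.log(p))
    return mu + sigma * ((-math.log(p)) ** (-xi) - 1.0) / xi
```

The quantile function is the exact inverse of the CDF, which is the round-trip property checked below.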
By:  Giorgio Calzolari (Dipartimento di Statistica, University of Firenze, Italy ); Roxana Halbleib (Department of Economics, University of Konstanz, Germany ) 
Abstract:  Financial returns exhibit common behavior best described by factor models, but also fat tails, which may be captured by α-stable distributions. This paper concentrates on estimating factor models with multivariate α-stable distributed and independent factors and idiosyncratic noises under the assumption of a time-constant distribution (static factor models) or a time-varying conditional distribution (GARCH factor models). While simulation from such a distribution is straightforward, estimation encounters difficulties. These difficulties are overcome in this paper by implementing the indirect inference estimation method with the multivariate Student's t as the auxiliary distribution. 
Keywords:  Symmetric Multivariate α-stable Distribution, Factor Models, Indirect Inference, Multivariate Student's t Distribution, Discrete Spectral Measures, GARCH Models 
JEL:  C13 C15 C18 C38 C46 
Date:  2014–12–28 
URL:  http://d.repec.org/n?u=RePEc:knz:dpteco:1425&r=ecm 
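Simulation from an α-stable law, which the abstract notes is straightforward, is commonly done with the Chambers-Mallows-Stuck method; this sketch covers only the univariate symmetric case with α ≠ 1, not the paper's multivariate factor setting.

```python
import math, random

def stable_symmetric(alpha, n, seed=0):
    """Draw n symmetric alpha-stable variates (beta = 0, alpha != 1)
    via the Chambers-Mallows-Stuck method."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u = rng.uniform(-math.pi / 2, math.pi / 2)   # uniform angle
        w = rng.expovariate(1.0)                      # unit exponential
        x = (math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)
             * (math.cos(u * (1.0 - alpha)) / w) ** ((1.0 - alpha) / alpha))
        out.append(x)
    return out

# illustrative draw with alpha = 1.7: symmetric around zero, fatter tails than a normal
xs = stable_symmetric(1.7, 5000, seed=1)
```

For α = 2 the recipe collapses to a normal distribution; for α < 2 the tails are heavy enough that the variance is infinite.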
By:  Noud P.A. van Giersbergen 
Abstract:  Prior research on constructing confidence intervals for an indirect effect has focused on a Wald statistic. In this paper, however, the inference problem is analyzed from a likelihood ratio (LR) perspective. When testing the null hypothesis $H_{0}:\ \alpha \beta =0$, the LR test statistic leads to the minimum of two t-ratios, whose size can be controlled. A confidence interval is obtained by inverting the LR statistic. Another confidence interval is obtained by inverting the sum of two pivotal t-statistics. In the Monte Carlo simulations, this latter confidence interval is the best performer: it outperforms the commonly used existing methods. 
Date:  2014–12–30 
URL:  http://d.repec.org/n?u=RePEc:ame:wpaper:1410&r=ecm 
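The min-|t| rule that the LR statistic reduces to can be sketched directly (an illustrative helper with a fixed 1.96 critical value; the paper derives the size properties and the interval inversion formally):

```python
def min_t_test(alpha_hat, se_alpha, beta_hat, se_beta, crit=1.96):
    """Reject H0: alpha*beta = 0 iff both t-ratios are significant,
    i.e. the minimum of the two |t|'s exceeds the critical value."""
    t_min = min(abs(alpha_hat / se_alpha), abs(beta_hat / se_beta))
    return t_min, t_min > crit
```

The indirect effect is declared nonzero only when both of its component coefficients are individually significant.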
By:  Anders Bredahl Kock (Aarhus University and CREATES ); Haihan Tang (Cambridge University ) 
Abstract:  We establish oracle inequalities for a version of the Lasso in high-dimensional fixed effects dynamic panel data models. The inequalities are valid for the coefficients of the dynamic and exogenous regressors. Separate oracle inequalities are derived for the fixed effects. Next, we show how one can conduct simultaneous inference on the parameters of the model and construct a uniformly valid estimator of the asymptotic covariance matrix which is robust to conditional heteroskedasticity in the error terms. Allowing for conditional heteroskedasticity is important in dynamic models as the conditional error variance may be non-constant over time and depend on the covariates. Furthermore, our procedure allows for inference on high-dimensional subsets of the parameter vector of increasing cardinality. We show that the confidence bands resulting from our procedure are asymptotically honest and contract at the optimal rate. This rate is different for the fixed effects than for the remaining parts of the parameter vector. 
Keywords:  Panel data, Dynamic models, Lasso, Desparsification, High-dimensional data, Uniform inference, Honest inference, Oracle inequality, Confidence intervals, Tests 
JEL:  C13 C23 
Date:  2014–12–30 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201458&r=ecm 
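A bare-bones Lasso via cyclic coordinate descent illustrates the estimator underlying the oracle inequalities (a generic cross-section sketch with made-up data; the paper's fixed-effects panel version and desparsification step are not implemented here):

```python
import random

def soft_threshold(z, g):
    """Soft-thresholding operator used in the Lasso coordinate update."""
    return (z - g) if z > g else (z + g) if z < -g else 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent, minimizing
    (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    r = list(y)  # residuals y - Xb, with b = 0 initially
    for _ in range(n_iter):
        for j in range(p):
            xj = [X[i][j] for i in range(n)]
            # correlation of x_j with the partial residual (b_j added back in)
            rho = sum(xj[i] * (r[i] + xj[i] * b[j]) for i in range(n)) / n
            norm = sum(v * v for v in xj) / n
            b_new = soft_threshold(rho, lam) / norm
            if b_new != b[j]:
                for i in range(n):
                    r[i] += xj[i] * (b[j] - b_new)
                b[j] = b_new
    return b

# made-up example: the first regressor carries the signal, the second is pure noise
rng = random.Random(3)
X = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(60)]
y = [2.0 * row[0] + 0.1 * rng.gauss(0, 1) for row in X]
b = lasso_cd(X, y, lam=0.8)
```

With this penalty the irrelevant coefficient is shrunk exactly to zero while the signal coefficient stays nonzero (though shrunk toward zero, which is what the desparsification step in the paper corrects).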
By:  Stephen Morris (UC San Diego ) 
Abstract:  I reveal identification failures in a well-known dynamic stochastic general equilibrium (DSGE) model, and study the statistical implications of common identifying restrictions. First, I provide a fully analytical methodology for determining all observationally equivalent values of the structural parameters in any parameter space. I show that either parameter admissibility or sign restrictions may yield global identification for some parameter realizations, but not for others. Second, I derive a "plug-in" maximum likelihood estimator, which requires no numerical search. I use this tool to demonstrate that the idiosyncratic identifying restriction directly impinges on both the location and distribution of the small-sample MLE, and compute correctly sized confidence intervals. 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:red:sed014:738&r=ecm 
By:  Knüppel, Malte 
Abstract:  Recently, several institutions have increased their forecast horizons, and many institutions rely on their past forecast errors to estimate measures of forecast uncertainty. This work addresses the question of how the latter estimation can be accomplished if only very few errors are available for the new forecast horizons. It extends the results of Knüppel (2014) in order to relax the condition on the data structure required for the SUR estimator to be independent of unknown quantities. It turns out that the SUR estimator of forecast uncertainty tends to deliver large efficiency gains compared to the OLS estimator (i.e., the sample mean of the squared forecast errors) in the case of increased forecast horizons. The SUR estimator is applied to the forecast errors of the Bank of England and the FOMC. 
Keywords:  Multi-step-ahead forecasts, forecast error variance, SUR 
JEL:  C13 C32 C53 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:zbw:bubdps:402014&r=ecm 
By:  Taisuke Otsu ; Yoshiyasu Rai 
Abstract:  Abadie and Imbens (2008) showed that the standard naive bootstrap is inconsistent for estimating the distribution of the matching estimator for treatment effects with a fixed number of matches. This article proposes an asymptotically valid inference method for the matching estimators based on the wild bootstrap. The key idea is to resample not only the regression residuals of treated and untreated observations but also the ones used to estimate the average treatment effects. The proposed method is valid even in the case of vector covariates by incorporating the bias correction method of Abadie and Imbens (2011), and is applicable to estimating both the average treatment effect and its counterpart for the treated population. A simulation study indicates that our wild bootstrap method compares favorably with the asymptotic normal approximation. As an empirical illustration, we apply our bootstrap method to the National Supported Work data. 
Keywords:  Treatment effect, matching, bootstrap 
JEL:  C21 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2015/580&r=ecm 
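A generic wild bootstrap for a regression slope illustrates the resampling idea: residuals are reweighted by Rademacher draws, which preserves heteroskedasticity. This is not the authors' matching-specific scheme, which also resamples the residuals used for the treatment effect estimate.

```python
import random

def wild_bootstrap_ci(x, y, n_boot=999, seed=0):
    """Percentile CI for the slope of y = a + b*x using the wild bootstrap."""
    rng = random.Random(seed)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    a = my - b * mx
    resid = [yi - a - b * xi for xi, yi in zip(x, y)]
    boots = []
    for _ in range(n_boot):
        # Rademacher weights flip each residual's sign independently
        ystar = [a + b * xi + ri * rng.choice((-1.0, 1.0))
                 for xi, ri in zip(x, resid)]
        mys = sum(ystar) / n
        bstar = sum((xi - mx) * (yi - mys) for xi, yi in zip(x, ystar)) / sxx
        boots.append(bstar)
    boots.sort()
    return b, (boots[int(0.025 * n_boot)], boots[int(0.975 * n_boot)])

# made-up heteroskedastic data: noise amplitude grows with x
xs = [i / 10 for i in range(30)]
ys = [1.0 + 2.0 * xi + 0.05 * ((-1) ** i) * (1 + xi) for i, xi in enumerate(xs)]
b_hat, (lo, hi) = wild_bootstrap_ci(xs, ys)
```

Because each draw only changes residual signs, the design points and the heteroskedasticity pattern are kept fixed across bootstrap samples.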
By:  Giorgio Calzolari (Dipartimento di Statistica, Informatica, Applicazioni "G. Parenti", Università di Firenze ) 
Abstract:  The use of Monte Carlo methods to generate exam data sets is nowadays a well-established practice among econometrics examiners all over the world. Its advantages are well known: providing each student a different data set ensures that estimates are actually computed individually, rather than copied from someone sitting nearby. The method, however, has a major fault: initial "random errors", such as mistakes in downloading the assigned data set, might generate downward bias in student evaluation. We propose a set of calibration algorithms, typical of indirect estimation methods, that solve the issue of initial "random errors" and reduce evaluation bias. By ensuring round initial estimates of the parameters for each individual data set, our calibration procedures allow the students to determine whether they have started the exam correctly. When initial estimates are not round numbers, this random error in the initial stage of the exam can be corrected immediately, thus reducing evaluation bias. The procedure offers the further advantage of easing markers' lives by allowing them to check round-number answers only, rather than lists of numbers with many decimal digits. 
Keywords:  Indirect estimation, round numbers, econometrics exams 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:fir:econom:wp2015_01&r=ecm 
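The round-initial-estimates idea can be sketched for simple OLS: projecting the residuals to be orthogonal to the regressors makes the estimates exactly equal to preset round numbers (an illustrative construction, not the authors' calibration algorithms).

```python
import random

def calibrated_dataset(a, b, n, seed):
    """Generate (x, y) so that simple OLS on y = a + b*x returns exactly (a, b):
    the residuals are projected to be orthogonal to the constant and to x."""
    rng = random.Random(seed)
    x = [rng.uniform(0, 10) for _ in range(n)]
    e = [rng.gauss(0, 1) for _ in range(n)]
    me = sum(e) / n
    e = [ei - me for ei in e]                      # orthogonal to the constant
    mx = sum(x) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxe = sum((xi - mx) * ei for xi, ei in zip(x, e))
    e = [ei - (sxe / sxx) * (xi - mx) for xi, ei in zip(x, e)]  # orthogonal to x
    y = [a + b * xi + ei for xi, ei in zip(x, e)]
    return x, y

def ols(x, y):
    """Simple OLS intercept and slope."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b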
By:  Zura Kakushadze 
Abstract:  We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications, there is typically insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or a distribution skewed against, e.g., turnover. This can be rectified by imposing bounds on the alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples. 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1501.05381&r=ecm 
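Box-constrained least squares, the core of the bounded regression step, can be sketched with projected gradient descent (a generic illustration; the paper works with weighted regression over SCM principal components):

```python
def bounded_regression(X, y, lo, hi, lr=0.01, n_iter=5000):
    """Least squares with box bounds lo <= w_j <= hi on the weights,
    solved by projected gradient descent on (1/n)||y - Xw||^2."""
    n, p = len(X), len(X[0])
    w = [min(max(0.0, lo), hi)] * p   # start inside the box
    for _ in range(n_iter):
        r = [sum(X[i][j] * w[j] for j in range(p)) - y[i] for i in range(n)]
        for j in range(p):
            g = 2.0 * sum(X[i][j] * r[i] for i in range(n)) / n  # gradient
            w[j] = min(max(w[j] - lr * g, lo), hi)   # project onto [lo, hi]
    return w
```

When the unconstrained optimum lies outside the box, the solution sits on the boundary; when it lies inside, the bounds are inactive and plain least squares is recovered.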
By:  Victor Aguiar ; Roberto Serrano 
Abstract:  Given any observed finite sequence of prices, wealth and demand choices, we characterize the relation between its underlying Slutsky matrix norm (SMN) and some popular discrete revealed preference (RP) measures of departures from rationality, such as the Afriat index. We show that testing rationality in the SMN approach with finite data is equivalent to testing it under the RP approach. We propose a way to “summarize” the departures from rationality in a systematic fashion in finite datasets. Finally, these ideas are extended to observed demand with noise due to measurement error; we formulate an appropriate modification of the SMN approach for this case and derive closed-form asymptotic results under standard regularity conditions. 
Keywords:  consumer theory; rationality; Slutsky matrix function; revealed preference approach; bounded rationality. 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:bro:econwp:20151&r=ecm 
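The revealed preference side can be illustrated with a direct GARP check on finite data (a standard textbook construction using Warshall's transitive closure, not the paper's SMN computation):

```python
def satisfies_garp(prices, bundles):
    """Check the Generalized Axiom of Revealed Preference on finite data.
    prices[t] and bundles[t] are lists of the same length for each observation t."""
    T = len(prices)
    dot = lambda p, x: sum(pi * xi for pi, xi in zip(p, x))
    # direct revealed preference: t R0 s iff bundle s was affordable at t's choice
    R = [[dot(prices[t], bundles[t]) >= dot(prices[t], bundles[s])
          for s in range(T)] for t in range(T)]
    for k in range(T):              # Warshall transitive closure
        for i in range(T):
            for j in range(T):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    for t in range(T):
        for s in range(T):
            # violation: t revealed preferred to s, yet s strictly cheaper choice
            if R[t][s] and dot(prices[s], bundles[s]) > dot(prices[s], bundles[t]):
                return False
    return True
```

Two observations that never make each other's bundle affordable pass trivially; a strict preference cycle fails.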
By:  Thorsten Drautzburg (Federal Reserve Bank of Philadelphia ) 
Abstract:  DSGE models are used for analyzing policy and the sources of business cycles. A competing approach uses VARs that are partially identified using, for example, narrative shock measures, and are often viewed as imposing fewer restrictions on the data. Narrative shocks are identified nonstructurally, through information external to particular models. This paper uses nonstructural narrative shock measures to inform the structural estimation of DSGE models. Since fiscal policy has received much recent attention but the foundations of the fiscal side of DSGE models are less well studied than their monetary building block, fiscal DSGE models are a particularly promising application. Preliminary results from a standard medium-scale DSGE model support this argument: structurally identified monetary shocks line up well with narrative measures, whereas government spending shocks do not. Extending the model to include distortionary taxes and more general fiscal policy processes, I find that model-implied labor tax shocks line up well with narrative tax shocks. Including different narrative shock measures affects parameter identification and implied measures such as fiscal multipliers. 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:red:sed014:791&r=ecm 