
on Econometrics 
By:  Kruse, Yves Robinson; Kaufmann, Hendrik 
Abstract:  This paper provides a comprehensive Monte Carlo comparison of different finite-sample bias-correction methods for autoregressive processes. We consider situations where the process is either mildly explosive or has a unit root. The case of highly persistent stationary processes is also studied. We compare the empirical performance of the plain OLS estimator with an OLS and a Cauchy estimator based on recursive demeaning, as well as an estimator based on second differencing. In addition, we consider three different approaches to bias-correction for the OLS estimator: (i) bootstrap, (ii) jackknife and (iii) indirect inference. The estimators are evaluated in terms of bias and root mean squared error (RMSE) in a variety of practically relevant settings. Our findings suggest that the indirect inference method clearly performs best in terms of RMSE for all considered orders of integration. If bias-correction abilities alone are considered, the jackknife works best for stationary and unit root processes. For the explosive case, the bootstrap and indirect inference can be recommended. As an empirical application, we study Asian stock market overvaluation during bubbles and emphasize the importance of bias-correction for explosive series.
JEL:  C13 C22 G12 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:112897&r=ecm 
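The split-sample jackknife is one simple instance of the bias-correction idea compared above. The sketch below is not the authors' implementation; the zero-mean AR(1) setup, sample size and coefficient are illustrative choices. It combines the full-sample OLS estimate with estimates from two half-samples to remove the first-order bias:

```python
import numpy as np

def ols_ar1(y):
    # OLS slope in y_t = rho * y_{t-1} + e_t (zero-mean AR(1))
    return np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])

def jackknife_ar1(y, m=2):
    # Split-sample jackknife: the full-sample estimate is combined with
    # m non-overlapping subsample estimates to cancel the O(1/T) bias.
    n = len(y)
    full = ols_ar1(y)
    subs = [ols_ar1(y[i * n // m:(i + 1) * n // m]) for i in range(m)]
    return (m / (m - 1)) * full - (1 / (m - 1)) * float(np.mean(subs))

# illustrative highly persistent AR(1)
rng = np.random.default_rng(0)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.95 * y[t - 1] + rng.standard_normal()
print(ols_ar1(y), jackknife_ar1(y))
```

Averaged over many replications, the jackknife estimate sits markedly closer to the true coefficient than plain OLS, at the cost of a somewhat larger variance, consistent with the bias versus RMSE trade-off the abstract reports.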
By:  Haas, Markus; Liu, JiChun 
Abstract:  We consider a multivariate Markov-switching GARCH model which allows for regime-specific volatility dynamics, leverage effects, and correlation structures. Stationarity conditions are derived, and consistency of the maximum likelihood estimator (MLE) is established under the assumption of Gaussian innovations. A Lagrange Multiplier (LM) test for correct specification of the correlation dynamics is devised, and a simple recursion for computing multi-step-ahead conditional covariance matrices is provided. The theory is illustrated with an application to global stock market and real estate equity returns. The empirical analysis highlights the importance of the conditional distribution in Markov-switching time series models. Specifications with Student's t innovations dominate their Gaussian counterparts both in- and out-of-sample. The dominating specification appears to be a two-regime Student's t process with correlations which are higher in the turbulent (high-volatility) regime.
JEL:  C32 C51 C58 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:112855&r=ecm 
By:  Manner, Hans; Hafner, Christian; Simar, Leopold 
Abstract:  Stochastic frontier models are widely used to measure, e.g., technical efficiencies of firms. The classical stochastic frontier model often suffers from the empirical artefact that the residuals of the production function may have a positive skewness, whereas a negative one is expected under the model, which leads to all firms being estimated as fully efficient. We propose a new approach to the problem by generalizing the distribution used for the inefficiency variable. This generalized stochastic frontier model allows the sample data to have the wrong skewness while estimating well-defined and non-degenerate efficiency measures. We discuss the statistical properties of the model and propose a test for the symmetry of the error term (no inefficiency). We provide a simulation study to show that our model delivers estimators of efficiency with smaller bias than those of the classical model even if the population skewness has the correct sign. Finally, we apply the model to data on the U.S. textile industry for 1958-2005, and show that for a number of years our model suggests technical efficiencies well below the frontier, while the classical one estimates no inefficiency in those years.
JEL:  C13 C18 D24 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:112812&r=ecm 
By:  Demetrescu, Matei; Kruse, Robinson 
Abstract:  Normality testing is an evergreen topic in statistics, econometrics and other disciplines. This paper focuses on testing economic time series for normality in a robust way, taking specific data features such as serial dependence and time-varying volatility into account. We suggest tests based on raw moments of the probability integral transform of the standardized time series. The use of raw moments is advantageous as they are quite sensitive to deviations from the null other than asymmetry and excess kurtosis. To standardize the series, nonparametric estimators of the (time-varying) variance may be used, but the mean as a function of time has to be estimated parametrically. Short-run dynamics is taken into account using the Heteroskedasticity and Autocorrelation Robust [HAR] approach of Kiefer and Vogelsang (2005, ET). The effect of estimation uncertainty arising from the estimated standardization is accounted for by a necessary modification. In a simulation study, we compare the suggested tests to a benchmark test by Bai and Ng (2005, JBES). The results show that the new tests perform well in terms of size (mainly due to the adopted fixed-b framework for long-run covariance estimation), but also in terms of power. An empirical application to G7 industrial production growth rates sheds further light on the empirical usefulness and limitations of the proposed tests.
JEL:  C22 C46 C52 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:113221&r=ecm 
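Under the null of normality, the probability integral transform (PIT) of the standardized series is uniform on (0,1), so its k-th raw moment should be close to 1/(k+1). The following is a minimal i.i.d. sketch of this building block only; it ignores the HAR corrections and the estimation-uncertainty adjustment the paper develops, and the function name is hypothetical:

```python
import numpy as np
from scipy.stats import norm

def pit_raw_moments(x, ks=(1, 2, 3, 4)):
    # Standardize, map through the normal CDF, and compute raw moments.
    # Under normality u = Phi(z) is uniform, so E[u**k] = 1 / (k + 1).
    z = (x - x.mean()) / x.std(ddof=1)
    u = norm.cdf(z)
    return {k: float((u ** k).mean()) for k in ks}

rng = np.random.default_rng(0)
print(pit_raw_moments(rng.standard_normal(5000)))
```

Suitably studentized deviations of these moments from 1/(k+1) form the basis of the test statistics; serial dependence in the data is what necessitates the fixed-b HAR variance estimation discussed in the abstract.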
By:  Hanck, Christoph; Demetrescu, Matei; Kruse, Robinson 
Abstract:  The fixed-b asymptotic framework provides refinements in the use of heteroskedasticity and autocorrelation consistent variance estimators. We show, however, that the fixed-b limiting distributions of t-statistics are not pivotal when the variance of the underlying data generating process changes over time. To regain pivotal fixed-b inference under such time heteroskedasticity, we discuss three alternative approaches: we (1) employ the wild bootstrap (Cavaliere and Taylor, 2008, ET), (2) resort to time transformations (Cavaliere and Taylor, 2008, JTSA), and (3) suggest picking the suitable asymptotic framework according to the outcome of a heteroskedasticity test, since small-b asymptotics deliver standard limiting distributions irrespective of the so-called variance profile of the series. We quantify the degree of size distortions from using the standard fixed-b approach and compare the effectiveness of the corrections via simulations. We also provide an empirical application to excess returns.
JEL:  C12 C32 C15 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:112916&r=ecm 
By:  Schwiebert, Jörg; Wagner, Joachim 
Abstract:  The fractional probit (or fractional logit) model is used when the outcome variable is a fractional response variable, i.e. a variable taking values between zero and one. In case of excess zeros, the fractional probit model might not be the optimal modeling device since this model does not predict zeros. As a solution, the two-part model has been proposed, which assumes different processes for having a (non)zero outcome and, conditionally on having a non-zero outcome, the actual outcome. However, the two-part model assumes independence of these processes. This paper proposes a generalization of the two-part model which allows for dependence of these processes and which also nests the two-part model as a special case. A simulation study indicates that the proposed estimator performs well in finite samples. Two empirical examples illustrate that the model proposed in this paper improves upon the fractional probit and two-part model in terms of model fit and also leads to different marginal effects.
JEL:  C25 C35 C51 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:113059&r=ecm 
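A minimal sketch of the independent two-part benchmark that the paper generalizes may be helpful. This is not the authors' code; the BFGS-based estimation and the function names are illustrative. It fits a probit for the zero versus non-zero decision, then a fractional probit via Bernoulli quasi-likelihood on the positive observations:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_two_part(X, y):
    # Part 1: probit for P(y > 0 | x).
    d = (y > 0).astype(float)
    def probit_negll(g):
        p = norm.cdf(X @ g).clip(1e-10, 1 - 1e-10)
        return -np.sum(d * np.log(p) + (1 - d) * np.log(1 - p))
    gamma = minimize(probit_negll, np.zeros(X.shape[1]), method="BFGS").x
    # Part 2: fractional probit (Bernoulli quasi-likelihood) on y > 0.
    Xp, yp = X[y > 0], y[y > 0]
    def frac_negql(b):
        m = norm.cdf(Xp @ b).clip(1e-10, 1 - 1e-10)
        return -np.sum(yp * np.log(m) + (1 - yp) * np.log(1 - m))
    beta = minimize(frac_negql, np.zeros(X.shape[1]), method="BFGS").x
    return gamma, beta
```

The paper's generalization links the two parts through a dependence parameter; the independent model above is the special case it nests.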
By:  Vogler, Jan; Liesenfeld, Roman; Richard, JeanFrancois 
Abstract:  PRELIMINARY DRAFT. We discuss maximum likelihood (ML) analysis for panel count data models, in which the observed counts are linked via a measurement density to a latent Gaussian process with spatial as well as temporal dynamics and random effects. For likelihood evaluation, which requires high-dimensional integration, we rely upon Efficient Importance Sampling (EIS). The algorithm we develop extends existing EIS implementations by constructing importance sampling densities which closely approximate the non-trivial spatiotemporal correlation structure under dynamic spatial panel models. In order to make this high-dimensional approximation computationally feasible, our EIS implementation exploits the typical sparsity of spatial precision matrices in such a way that all the high-dimensional matrix operations it requires can be performed using computationally fast sparse matrix functions. We use the proposed sparse EIS-ML approach for an extensive empirical study analyzing the socio-demographic determinants and the space-time dynamics of urban crime in Pittsburgh, USA, between 2008 and 2013, for a panel of monthly crime rates at census-tract level.
JEL:  C15 C01 C23 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:113131&r=ecm 
By:  Kaeding, Matthias 
Abstract:  Popular link functions often fit skewed binary data poorly. We propose the log-Burr link as a flexible alternative. The link nests the complementary log-log and logit links as special cases, determined by a shape parameter which can be estimated from the data. Shrinkage priors are used for the shape parameter; furthermore, the parameter is allowed to vary between subgroups for clustered data. Nonlinear effects are modeled via basis function expansions. Inference is done in a fully Bayesian framework. Posterior simulation is carried out via the No-U-Turn sampler implemented in Stan, avoiding convergence problems of the Gibbs sampler and allowing for easy use of non-conjugate priors. Regression coefficients associated with basis functions are reparameterized as random effects to speed up convergence. The proposed methods and the effect of misspecifying the modeled data-generating process are investigated in a simulation study. The approach is applied to large-scale unemployment data.
JEL:  C10 C11 C63 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:113043&r=ecm 
By:  Zadrozny, Peter A. 
Abstract:  Chen and Zadrozny (1998) developed the linear extended Yule-Walker (XYW) method for determining the parameters of a vector autoregressive (VAR) model with available covariances of mixed-frequency observations on the variables of the model. If the parameters are determined uniquely for available population covariances, then the VAR model is identified. The present paper extends the original XYW method for determining all ARMA parameters of a vector autoregressive moving-average (VARMA) model with available covariances of single- or mixed-frequency observations on the variables of the model. The paper proves that under conditions of stationarity, regularity, miniphaseness, controllability, observability, and diagonalizability on the parameters of the model, the parameters are determined uniquely with available population covariances of single- or mixed-frequency observations, so that the VARMA model is identified with the single- or mixed-frequency covariances.
Keywords:  block-Vandermonde eigenvectors of block-companion state-transition matrix of state-space representation, matrix spectral factorization
JEL:  C32 C80 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:cfswop:526&r=ecm 
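For intuition, the classical scalar, single-frequency Yule-Walker step that the XYW method builds on recovers AR coefficients from autocovariances by solving a Toeplitz system. The sketch below covers this textbook case only, not the paper's extension to VARMA models or mixed frequencies:

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def yule_walker(gamma):
    # Solve the Yule-Walker system Gamma * phi = gamma[1:] for the
    # AR(p) coefficients, where Gamma is the symmetric Toeplitz matrix
    # built from the autocovariances gamma[0..p-1].
    p = len(gamma) - 1
    return solve_toeplitz(gamma[:p], gamma[1:])

# Population autocovariances of a stationary AR(1) with phi = 0.6 and
# unit innovation variance: gamma_k = phi**k / (1 - phi**2).
phi = 0.6
gamma = np.array([phi ** k for k in range(3)]) / (1 - phi ** 2)
print(yule_walker(gamma))  # an AR(2) fit recovers [0.6, 0.0]
```

The identification question in the abstract is precisely when such a mapping from (possibly mixed-frequency) population covariances back to the model parameters has a unique solution.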
By:  von Schweinitz, Gregor; Sarlin, Peter 
Abstract:  In past years, the most common approaches for deriving early-warning models belong to the family of binary-choice methods, which have been coupled with a separate loss function to optimize model signals based on policymakers' preferences. The evidence in this paper shows that early-warning models should not be used in this traditional way, as the optimization of thresholds produces an in-sample overfit at the expense of out-of-sample performance. Instead of ex-post threshold optimization based upon a loss function, policymakers' preferences should rather be included directly as weights in the estimation function. Doing so strongly improves the out-of-sample performance of early-warning systems.
JEL:  C35 C53 G01 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:112964&r=ecm 
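The idea of weighting the estimation itself, rather than optimizing a signalling threshold ex post, can be sketched as a preference-weighted logit likelihood. This is a hypothetical minimal version, not the authors' specification; the parameter mu stands in for the policymaker's relative aversion to missed crises:

```python
import numpy as np
from scipy.optimize import minimize

def weighted_logit(X, y, mu=0.5):
    # Weight crisis observations (y = 1) by mu and tranquil ones by
    # 1 - mu in the log-likelihood; mu = 0.5 reproduces ordinary MLE.
    w = np.where(y == 1, mu, 1 - mu)
    def negll(beta):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        p = p.clip(1e-12, 1 - 1e-12)
        return -np.sum(w * (y * np.log(p) + (1 - y) * np.log(1 - p)))
    return minimize(negll, np.zeros(X.shape[1]), method="BFGS").x
```

Raising mu tends to shift fitted crisis probabilities upward, so the preference for catching crises is built into the coefficients rather than grafted on through a threshold afterwards.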
By:  NAKABAYASHI Jun; HIROSE Yohsuke 
Abstract:  This paper offers an analytical framework for the scoring auction. We first characterize a symmetric monotone equilibrium in the scoring auction. We then propose a semiparametric procedure to identify the joint distribution of the bidder's multidimensional signal from scoring auction data. Our approach allows for a broad class of scoring rules in settings with multidimensional signals. Finally, using our analytical framework, we conduct an empirical experiment to estimate the impact of changing auction formats and scoring rules. The data on scoring auctions come from public procurement auctions for construction projects in Japan.
Date:  2016–02 
URL:  http://d.repec.org/n?u=RePEc:eti:dpaper:16008&r=ecm 
By:  Mutschler, Willi 
Abstract:  This note shows how to derive unconditional moments, cumulants and polyspectra of order higher than two for the pruned state-space representation of nonlinear DSGE models. Useful matrix tools and computational aspects are also discussed.
JEL:  C10 C51 E10 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:113138&r=ecm 
By:  Santiago Pereda Fernández (Banca d’Italia) 
Abstract:  I propose a method to correct for test score manipulation and apply it to a natural experiment in the Italian education system, consisting of the random assignment of external monitors to classrooms. The empirical strategy is based on a likelihood approach, using nonlinear panel data methods to obtain clean estimates of cheating while controlling for unobserved heterogeneity. The likelihood of each classroom's scores is then used to correct them for cheating. Cheating is not associated with an increase in the correlation of the answers once we control for mean test scores. The method estimates manipulation to be more frequent in the South and Islands, and among female students and immigrants in Italian tests. A simulation shows how manipulation reduces the accuracy of an exam in reflecting students' knowledge, and that the correction proposed in this paper makes up for about half of this loss.
Keywords:  cheating correction, copula, nonlinear panel data, test scores manipulation 
JEL:  C23 C25 I28 
Date:  2016–01 
URL:  http://d.repec.org/n?u=RePEc:bdi:wptemi:td_1047_16&r=ecm 
By:  Schreiber, Sven; Breitung, Jörg 
Abstract:  We extend the frequency-specific Granger-causality test of Breitung and Candelon (2006) to a more general null hypothesis that allows for noncausality at unknown frequencies within an interval, instead of having to prespecify a single frequency. This setup corresponds better to most hypotheses that are typically analyzed in applied research and is easy to implement. We also discuss a test approach that departs from strict noncausality, given the impossibility of (non-trivial) noncausality over a continuum of frequencies. In an empirical application dealing with the dynamics of US temperatures and CO2 emissions, we find that emissions cause temperature changes only at very low frequencies, corresponding to cycles of more than 30 years.
JEL:  C32 Q54 C53 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:113111&r=ecm 
By:  von Gaudecker, HansMartin; Drerup, Tilman; Enke, Benjamin 
Abstract:  While stock market expectations are among the most important primitives of portfolio choice models, their measurement has proved challenging for some respondents. We argue that the magnitude of measurement error in subjective expectations can be used as an indicator of the degree to which economic models of portfolio choice provide an adequate representation of individual decision processes. In order to explore this conjecture empirically, we estimate a semiparametric double-index model on a dataset specifically collected for this purpose. Stock market participation reacts strongly to changes in model parameters for respondents at the lower end of the measurement error distribution; these effects are much less pronounced for individuals at the upper end. Our findings indicate that measurement error in subjective expectations provides useful information to uncover heterogeneity in choice behavior.
JEL:  C35 C51 G11 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:vfsc15:112871&r=ecm 