
on Econometrics 
By:  Federico Bassetti (University of Pavia); Roberto Casarin (University of Venice); Francesco Ravazzolo (Norges Bank (Central Bank of Norway) and BI Norwegian Business School) 
Abstract:  We introduce a Bayesian approach to predictive density calibration and combination that accounts for parameter uncertainty and model set incompleteness through the use of random calibration functionals and random combination weights. Building on the work of Ranjan and Gneiting (2010) and Gneiting and Ranjan (2013), we use infinite beta mixtures for the calibration. The proposed Bayesian nonparametric approach takes advantage of the flexibility of Dirichlet process mixtures to achieve any continuous deformation of linearly combined predictive distributions. The inference procedure is based on Gibbs sampling and accounts for uncertainty in the number of mixture components, the mixture weights, and the calibration parameters. Weak posterior consistency of the Bayesian nonparametric calibration is established under suitable conditions on the unknown true density. We study the methodology in simulation examples with fat tails and multimodal densities and apply it to density forecasts of daily S&P returns and daily maximum wind speed at the Frankfurt airport. 
Keywords:  Forecast calibration, Forecast combination, Density forecast, Beta mixtures, Bayesian nonparametrics, Slice sampling. 
JEL:  C13 C14 C51 C53 
Date:  2015–02–26 
URL:  http://d.repec.org/n?u=RePEc:bno:worpap:2015_03&r=ecm 
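As a minimal illustration of the beta calibration idea (not the authors' Bayesian nonparametric sampler), the sketch below deforms a predictive CDF with a fixed two-component beta mixture. The standard normal predictive CDF, the integer beta parameters, and the weights are all illustrative assumptions, chosen so the beta CDFs have closed forms:

```python
import math

def norm_cdf(y):
    # predictive CDF F(y): assumed standard normal for illustration
    return 0.5 * (1.0 + math.erf(y / math.sqrt(2.0)))

def beta22_cdf(u):
    # Beta(2,2) CDF in closed form: I_u(2,2) = 3u^2 - 2u^3
    return 3.0 * u ** 2 - 2.0 * u ** 3

def beta31_cdf(u):
    # Beta(3,1) CDF in closed form: I_u(3,1) = u^3
    return u ** 3

def calibrated_cdf(y, weights=(0.6, 0.4)):
    # beta-mixture deformation: C(y) = sum_k w_k * B_k(F(y))
    u = norm_cdf(y)
    return weights[0] * beta22_cdf(u) + weights[1] * beta31_cdf(u)
```

Each B_k is a continuous increasing map of [0,1] onto itself, so C remains a valid CDF; in the paper, a Dirichlet process mixture makes the number of components, the beta parameters, and the weights random.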
By:  Sibbertsen, Philipp; Leschinski, Christian; Holzhausen, Marie 
Abstract:  This paper provides a multivariate score-type test to distinguish between true and spurious long memory. The test is based on the weighted sum of the partial derivatives of the multivariate local Whittle likelihood function. This approach takes phase shifts in the multivariate spectrum into account. The resulting pivotal limiting distribution is independent of the dimension of the process, which makes the test easy to apply in practice. We prove the consistency of our test against the alternative of random level shifts or monotonic trends. A Monte Carlo analysis shows good finite-sample properties of the test in terms of size and power. Additionally, we apply our test to the log-absolute returns of the S&P 500, DAX, FTSE, and NIKKEI. The multivariate test gives formal evidence that these series are contaminated by level shifts. 
Keywords:  Multivariate Long Memory, Semiparametric Estimation, Spurious Long Memory, Volatility 
JEL:  C12 C32 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:han:dpaper:dp547&r=ecm 
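The paper's test is multivariate, but the underlying local Whittle machinery can be sketched in the univariate case. The grid search, the bandwidth m, and the white-noise example below are illustrative assumptions:

```python
import cmath
import math
import random

def periodogram(x, j):
    # periodogram ordinate at the Fourier frequency lam_j = 2*pi*j/n
    n = len(x)
    lam = 2.0 * math.pi * j / n
    s = sum(x[t] * cmath.exp(-1j * lam * t) for t in range(n))
    return abs(s) ** 2 / (2.0 * math.pi * n)

def local_whittle_d(x, m):
    # grid-search minimizer of the local Whittle objective
    # R(d) = log( (1/m) sum_j lam_j^(2d) I(lam_j) ) - 2d (1/m) sum_j log lam_j
    n = len(x)
    lams = [2.0 * math.pi * j / n for j in range(1, m + 1)]
    I = [periodogram(x, j) for j in range(1, m + 1)]
    mean_loglam = sum(math.log(l) for l in lams) / m
    best_R, best_d = float("inf"), 0.0
    for k in range(-99, 100):
        d = k / 100.0
        G = sum((lams[j] ** (2.0 * d)) * I[j] for j in range(m)) / m
        R = math.log(G) - 2.0 * d * mean_loglam
        if R < best_R:
            best_R, best_d = R, d
    return best_d

random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(256)]
d_hat = local_whittle_d(x, 16)   # expected to be small for white noise (true d = 0)
```

The paper works with the partial derivatives of the multivariate version of this objective, including phase-shift terms, rather than with the univariate point estimate shown here.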
By:  Tommaso Proietti (University of Rome “Tor Vergata” and CREATES); Martyna Marczak (University of Hohenheim); Gianluigi Mazzi (Statistical Office of the European Communities) 
Abstract:  EuroMInd-D is a density estimate of monthly gross domestic product (GDP) constructed according to a bottom-up approach, pooling the density estimates of eleven GDP components, by output and expenditure type. The component density estimates are obtained from a medium-size dynamic factor model of a set of coincident time series, handling mixed frequencies of observation and ragged-edge data structures. They reflect both parameter and filtering uncertainty and are obtained by implementing a bootstrap algorithm for simulating from the distribution of the maximum likelihood estimators of the model parameters, and conditional simulation filters for simulating from the predictive distribution of GDP. Both algorithms process the data sequentially as they become available in real time. The GDP density estimates for the output and expenditure approaches are combined using alternative weighting schemes and evaluated with different tests based on the probability integral transform and by applying scoring rules. 
Keywords:  Density Forecast Combination and Evaluation, Mixed–Frequency Data, Dynamic Factor Models, State Space Models 
JEL:  C32 C52 C53 E37 
Date:  2015–02–24 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201512&r=ecm 
By:  TAEHWAN KIM (Yonsei University); SOOBIN JEONG (Yonsei University); BONGHWAN KIM (University of California, San Diego); HYUNGHO MOON (University of California, San Diego) 
Abstract:  Spurious rejections of the standard Dickey-Fuller (DF) test caused by a single variance break have been reported, and some solutions to correct the problem have been proposed in the literature. Kim et al. (2002) put forward a correctly-sized unit root test robust to a single variance break, called the KLN test. However, there can be more than one break in variance in time-series data, as documented in Zhou and Perron (2008), so allowing only one break can be too restrictive. In this paper, we show that multiple breaks in variance can generate spurious rejections not only by the standard DF test but also by the KLN test. We then propose a bootstrap-based unit root test that is correctly sized in the presence of multiple breaks in variance. Simulation experiments demonstrate that the proposed test performs well regardless of the number and location of the breaks in innovation variance. 
Keywords:  Dickey-Fuller test; variance break; wild bootstrap. 
JEL:  C12 C15 
Date:  2014–11 
URL:  http://d.repec.org/n?u=RePEc:yon:wpaper:2014rwp70&r=ecm 
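A stylized sketch of the wild-bootstrap idea for a driftless DF regression. The simulated series, the break location, and the number of bootstrap replications are illustrative assumptions, not the authors' exact procedure:

```python
import random

def df_tstat(y):
    # Dickey-Fuller t-statistic from the regression dy_t = rho*y_{t-1} + e_t (no constant)
    dy = [y[t] - y[t-1] for t in range(1, len(y))]
    ylag = y[:-1]
    sxx = sum(a * a for a in ylag)
    rho = sum(a * b for a, b in zip(ylag, dy)) / sxx
    resid = [d - rho * l for d, l in zip(dy, ylag)]
    s2 = sum(e * e for e in resid) / (len(dy) - 1)
    return rho / (s2 / sxx) ** 0.5

def wild_bootstrap_pvalue(y, B=199, seed=1):
    # wild bootstrap under the unit-root null: resample the first differences with
    # Rademacher multipliers, which preserves any shifts in the innovation variance
    rng = random.Random(seed)
    stat = df_tstat(y)
    e = [y[t] - y[t-1] for t in range(1, len(y))]
    hits = 0
    for _ in range(B):
        yb = [y[0]]
        for et in e:
            yb.append(yb[-1] + et * rng.choice((-1.0, 1.0)))
        if df_tstat(yb) <= stat:   # the DF test rejects in the left tail
            hits += 1
    return (hits + 1) / (B + 1)

# random walk whose innovation standard deviation triples mid-sample
gen = random.Random(11)
y = [0.0]
for t in range(200):
    y.append(y[-1] + gen.gauss(0.0, 1.0 if t < 100 else 3.0))
p = wild_bootstrap_pvalue(y)
```

Because the multipliers rescale each original difference in place, the bootstrap samples inherit the variance profile of the data, which is what keeps the test correctly sized under breaks.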
By:  James Davidson; Dooruj Rambaccussing 
Abstract:  This paper develops a new test of true versus spurious long memory, based on log-periodogram estimation of the long memory parameter using skip-sampled data. A correction factor is derived to overcome the bias in this estimator due to aliasing. The procedure is designed to be used in the context of a conventional test of significance of the long memory parameter, and a composite test procedure is described that has the properties of known asymptotic size and consistency. The test is implemented using the bootstrap, with the null distribution approximated using a dependent-sample bootstrap technique to account for short-run dependence following fractional differencing. The properties of the test are investigated in a set of Monte Carlo experiments. The procedure is illustrated by applications to exchange rate volatility and dividend growth series. 
Keywords:  Long Memory, Self-similarity, Bootstrap 
JEL:  C12 C14 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:dun:dpaper:286&r=ecm 
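Log-periodogram (GPH-type) estimation on skip-sampled data can be sketched as follows. The aliasing correction factor derived in the paper is omitted, and the bandwidths and white-noise example are illustrative assumptions:

```python
import cmath
import math
import random

def periodogram(x, j):
    # periodogram ordinate at the Fourier frequency lam_j = 2*pi*j/n
    n = len(x)
    lam = 2.0 * math.pi * j / n
    s = sum(x[t] * cmath.exp(-1j * lam * t) for t in range(n))
    return abs(s) ** 2 / (2.0 * math.pi * n)

def gph_d(x, m):
    # GPH log-periodogram regression: log I(lam_j) ~ c - 2d * log lam_j, j = 1..m
    n = len(x)
    ys = [math.log(periodogram(x, j)) for j in range(1, m + 1)]
    xs = [-2.0 * math.log(2.0 * math.pi * j / n) for j in range(1, m + 1)]
    xbar, ybar = sum(xs) / m, sum(ys) / m
    num = sum((a - xbar) * (b - ybar) for a, b in zip(xs, ys))
    den = sum((a - xbar) ** 2 for a in xs)
    return num / den

random.seed(2)
x = [random.gauss(0.0, 1.0) for _ in range(512)]
d_full = gph_d(x, 32)        # estimate on the full sample
d_skip = gph_d(x[::2], 16)   # estimate on the skip-sampled series (every 2nd point)
```

Under true long memory both estimates target the same d (after the paper's aliasing correction), whereas spurious long memory from level shifts behaves differently under skip-sampling; that contrast is the basis of the test.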
By:  Medel, Carlos; Pincheira, Pablo 
Abstract:  We analyse the multi-horizon forecasting performance of several strategies for estimating the stationary AR(1) model in a near-unity context. We focus on Andrews' (1993) exact median-unbiased estimator (BC), the OLS estimator, and the driftless random walk (RW). In addition, we explore the forecasting performance of pairwise combinations of these individual strategies. We do this to investigate whether Andrews' (1993) correction of the OLS downward bias helps reduce mean squared forecast errors. Via simulations, we find that BC forecasts typically outperform OLS forecasts. When BC is compared to the RW we obtain mixed results, favouring the latter as the persistence of the true process increases. Interestingly, we also find that the combination of BC and RW performs well when the persistence of the process is high. 
Keywords:  Near-unity autoregression; median-unbiased estimation; unbiasedness; unit root model; forecasting; forecast combinations 
JEL:  C22 C52 C53 C63 
Date:  2015–03–04 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:62552&r=ecm 
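A stripped-down version of the forecasting comparison between OLS-AR(1) and the driftless random walk can be simulated as below. Andrews' median-unbiased correction is omitted here, since it relies on tabulated quantiles; the persistence level, sample size, and replication count are illustrative assumptions:

```python
import random

def ols_ar1(y):
    # OLS slope of y_t on y_{t-1} (driftless AR(1), no intercept)
    num = sum(y[t] * y[t-1] for t in range(1, len(y)))
    den = sum(y[t-1] ** 2 for t in range(1, len(y)))
    return num / den

def one_step_msfe(phi=0.95, n=100, reps=200, seed=3):
    # average squared one-step forecast error: OLS-AR(1) vs. random walk
    rng = random.Random(seed)
    se_ols = se_rw = 0.0
    for _ in range(reps):
        y = [0.0]
        for _ in range(n + 1):
            y.append(phi * y[-1] + rng.gauss(0.0, 1.0))
        train, actual = y[:-1], y[-1]
        se_ols += (actual - ols_ar1(train) * train[-1]) ** 2
        se_rw += (actual - train[-1]) ** 2   # random-walk forecast: no change
    return se_ols / reps, se_rw / reps

msfe_ols, msfe_rw = one_step_msfe()
```

Raising phi toward 1 in this setup is the "near-unity" regime the paper studies, where the RW forecast becomes increasingly competitive despite the model being stationary.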
By:  Fernández Kranz, Daniel; Lechner, Michael; Rodriguez-Planas, Nuria 
Abstract:  In this note, we show that the OLS and fixed-effects (FE) estimators of the popular difference-in-differences model may deviate when there is time-varying panel nonresponse. If such nonresponse does not affect the common-trend assumption, then OLS and FE are consistent, but OLS is more precise. However, if nonresponse affects the common-trend assumption, then FE estimation may still be consistent, while OLS will be inconsistent. We provide simulation as well as empirical evidence of this phenomenon. We conclude that, in the case of unbalanced panels, any evidence of deviating OLS and FE estimates should be considered evidence that nonresponse is not ignorable for the difference-in-differences estimation. 
Keywords:  Difference-in-differences estimation, attrition, panel estimation, balanced panel, unbalanced panel 
JEL:  C21 C31 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:usg:econwp:2015:07&r=ecm 
By:  Mukhoti, Sujay 
Abstract:  In this paper I present a new single-factor stochastic volatility model for asset returns observed in discrete time and their latent volatility. This model unites the feedback effect and return skewness using a common factor for the return and its volatility. Further, it generalizes the existing stochastic volatility framework with constant feedback to one with time-varying feedback and, as a consequence, time-varying skewness. However, the presence of a dynamic feedback effect violates the weak-stationarity assumption usually made for the latent volatility process. The concept of bounded stationarity is proposed in this paper to address the issue of non-stationarity. A characterization of the error distributions for returns and volatility is provided on the basis of the existence of conditional moments. Finally, an application of the model is illustrated using S&P 100 daily returns under the assumption of Normal errors and a half-Normal common-factor distribution. 
Keywords:  Stochastic volatility, Bounded stationarity, Leverage, Feedback, Skewness, Single factor model 
JEL:  C11 C58 
Date:  2014–06–28 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:62532&r=ecm 
By:  Chopin, Nicolas; Gadat, Sébastien; Guedj, Benjamin; Guyader, Arnaud; Vernet, Elodie 
Abstract:  This paper reviews some recent developments in Bayesian statistics for high-dimensional data. After giving some brief motivations in a short introduction, we describe new advances in the understanding of Bayesian posterior computation as well as theoretical contributions in nonparametric and high-dimensional Bayesian approaches. From an applied point of view, we describe the so-called SQMC particle method for computing the Bayesian posterior, and provide a nonparametric analysis of the widespread ABC method. On the theoretical side, we describe some recent advances in Bayesian consistency for a nonparametric hidden Markov model as well as new PAC-Bayesian results for different models of high-dimensional regression. 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:tse:wpaper:29078&r=ecm 
By:  TAEHWAN KIM (Yonsei University); YONGCHEOL SHIN (University of York); JIN SEO CHO (Yonsei University) 
Abstract:  Xiao (2009) develops a novel estimation technique for quantile cointegrated time series by extending Phillips and Hansen's (1990) semiparametric approach and Saikkonen's (1991) parametrically augmented approach. This paper extends Pesaran and Shin's (1998) autoregressive distributed-lag approach to quantile regression by jointly analysing short-run dynamics and long-run cointegrating relationships across a range of quantiles. We derive the asymptotic theory and provide a general package in which the model can be estimated and tested within and across quantiles. We further confirm our theoretical results by Monte Carlo simulations. The main utility of this analysis is demonstrated through an empirical application to dividend policy in the U.S. 
Keywords:  QARDL, Quantile Regression, Long-run Cointegrating Relationship, Dividend Smoothing, Time-varying Rolling Estimation. 
JEL:  C22 G35 
Date:  2014–11 
URL:  http://d.repec.org/n?u=RePEc:yon:wpaper:2014rwp69&r=ecm 
By:  Yunmi Kim (University of Seoul); TaeHwan Kim (Yonsei University); Tolga Ergun (State Street Corporation) 
Abstract:  It is well known that any statistic based on sample averages can be sensitive to outliers. Some examples are conventional moments-based statistics such as the sample mean, the sample variance, or the sample covariance of a set of observations on two variables. Given that the sample correlation is defined as the sample covariance divided by the product of the sample standard deviations, one might suspect that the impact of outliers on the correlation coefficient is muted or even absent because of a 'dampening effect', i.e., the effects of outliers on the numerator and the denominator of the correlation coefficient can cancel each other. In this paper, we formally investigate this issue. Contrary to such an expectation, we show analytically and by simulations that the distortion caused by outliers in the behavior of the correlation coefficient can be fairly large in some cases, especially when outliers are present in both variables at the same time. We call such outliers 'coincidental outliers'. We consider some robust alternative measures and compare their performance in the presence of such coincidental outliers. 
Keywords:  Correlation; robust statistic; outliers 
JEL:  C13 C18 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:yon:wpaper:2015rwp77&r=ecm 
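The coincidental-outlier effect is easy to reproduce: in the sketch below, a single outlier planted in both of two independent series inflates the Pearson correlation, while a rank-based alternative barely moves. The sample size, the outlier value, and the choice of Spearman's rank correlation as the robust comparison are illustrative assumptions:

```python
import random

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

def ranks(v):
    # map each value to its rank (0 = smallest); ties are negligible for floats
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(x, y):
    # rank correlation: a simple robust alternative to the Pearson coefficient
    return pearson(ranks(x), ranks(y))

rng = random.Random(4)
x = [rng.gauss(0.0, 1.0) for _ in range(200)]
y = [rng.gauss(0.0, 1.0) for _ in range(200)]   # independent of x: true correlation 0
x[0], y[0] = 10.0, 10.0                         # one coincidental outlier pair
```

The outlier adds roughly the same large term to the covariance as to each variance, but the cross-product term grows faster than the square-root of the variance product, so no full cancellation occurs.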
By:  Sylvain Barde 
Abstract:  The recent increase in the breadth of computational methodologies has been matched by a corresponding increase in the difficulty of comparing the relative explanatory power of models from different methodological lineages. In order to help address this problem, a universal information criterion (UIC) is developed that is analogous to the Akaike information criterion (AIC) in its theoretical derivation and yet can be applied to any model able to generate simulated or predicted data, regardless of its methodology. Both the AIC and the proposed UIC rely on the Kullback-Leibler (KL) distance between model predictions and real data as a measure of prediction accuracy. Instead of using the maximum likelihood approach of the AIC, the proposed UIC relies on the literal interpretation of the KL distance as the inefficiency of compressing real data using modelled probabilities, and therefore uses the output of a universal compression algorithm to obtain an estimate of the KL distance. Several Monte Carlo tests are carried out in order to (a) confirm the performance of the algorithm and (b) evaluate the ability of the UIC to identify the true data-generating process from a set of alternative models. 
Keywords:  AIC; Minimum description length; Model selection 
JEL:  B41 C15 C52 C63 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:ukc:ukcedp:1504&r=ecm 
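A toy version of the compression idea, using zlib on coarsely quantized residuals rather than the paper's universal compression of modelled probabilities (the quantization scheme and both "models" below are assumptions for illustration only), shows how description length separates a true model from a misspecified one:

```python
import random
import zlib

def description_length(residuals):
    # crude description-length proxy: zlib-compressed size of coarsely
    # quantized residuals; a smaller value means the model leaves less to encode
    q = bytes(min(255, max(0, int(r * 32) + 128)) for r in residuals)
    return len(zlib.compress(q, 9))

rng = random.Random(5)
x = [rng.gauss(0.0, 1.0) for _ in range(1000)]
y = [2.0 * a + 0.1 * rng.gauss(0.0, 1.0) for a in x]   # true model: y = 2x + small noise

res_true = [b - 2.0 * a for a, b in zip(x, y)]   # residuals under the true model
res_null = list(y)                               # residuals under a "no effect" model
```

The true model's residuals are concentrated on a few quantization symbols and compress well; the null model's residuals span the whole byte range and are nearly incompressible, mirroring the KL-as-coding-inefficiency interpretation.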
By:  Yae Ji Jun (Yonsei University); Jin Seo Cho (Yonsei University) 
Abstract:  This paper analyzes the interrelationships among the Wald, likelihood ratio, and Lagrange multiplier statistics for testing neglected nonlinearity. We show that the three test statistics are equivalent under the null hypothesis, although there exists a twofold identification problem. This implies that the trinity property holds for these tests as in the standard case. 
Keywords:  Neglected Nonlinearity, Wald statistics, likelihood ratio statistics, Lagrange multiplier statistics, Trinity. 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:yon:wpaper:2015rwp78&r=ecm 
By:  Frederik Meudt; Martin Theissen; Rudi Schäfer; Thomas Guhr 
Abstract:  In complex systems, crucial parameters are often subject to unpredictable changes in time. Climate, biological evolution and networks provide numerous examples of such non-stationarities. In many cases, improved statistical models are urgently called for. In a general setting, we study systems of correlated quantities to which we refer as amplitudes. We are interested in the case of non-stationarity, i.e., seemingly random covariances. We present a general method to derive the distribution of the covariances from the distribution of the amplitudes. To ensure analytical tractability, we construct a properly deformed Wishart ensemble of random matrices. We apply our method to financial returns, where the wealth of data allows us to carry out statistically significant tests. The ensemble that we find is characterized by an algebraic distribution, which improves the understanding of large events. 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1503.01584&r=ecm 
By:  Rudi Schäfer; Sonja Barkhofen; Thomas Guhr; Hans-Jürgen Stöckmann; Ulrich Kuhl 
Abstract:  A defining feature of non-stationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, average over the time-dependent parameters. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here we consider two concrete, but diverse examples of such non-stationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end we have to estimate the parameter distribution for univariate time series in a highly non-stationary situation. 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1503.02177&r=ecm 
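The compounding mechanism can be illustrated directly: drawing a fresh lognormal scale for each Gaussian observation produces heavy tails relative to a fixed-variance Gaussian. The lognormal parameter choice below is an illustrative assumption, not the parameter distribution estimated in the paper:

```python
import random

def sample_kurtosis(x):
    # m4 / m2^2; approximately 3 for Gaussian data, larger for heavy tails
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    return m4 / m2 ** 2

rng = random.Random(6)
gauss = [rng.gauss(0.0, 1.0) for _ in range(50000)]
# compound: each observation gets its own volatility draw, mimicking a
# parameter that fluctuates over time
compounded = [rng.gauss(0.0, 1.0) * rng.lognormvariate(0.0, 0.5)
              for _ in range(50000)]
```

Mixing over the scale parameter is exactly the compounding step described in the abstract: the marginal distribution of the long sample is the local Gaussian averaged over the parameter distribution, and the excess kurtosis it acquires is a simple diagnostic of that averaging.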
By:  An, Yonghong; Tang, Xun 
Abstract:  We introduce a structural model of procurement auctions with incomplete contracts, where a procurer chooses an initial project specification endogenously. The contract between the procurer and the winner of the auction is incomplete in that the two parties may agree to adopt a new feasible specification later, and negotiate an additional transfer via Nash Bargaining where both parties’ disagreement values depend on the auction price. In a Perfect Bayesian Equilibrium, contractors competing in the auction take account of such incompleteness while quoting prices. We show that the model primitives are nonparametrically identified and propose a feasible estimation procedure. Using data from highway procurement auctions in California, we estimate the structural elements that determine the holdup due to incompleteness, and infer how a contractor’s bargaining power and the markup in the price quoted vary with its characteristics and the features of the construction project. We also find that ignoring the existence of contract incompleteness in the structural analysis of the bidding data leads to substantial overestimation of the markups in the prices. 
Keywords:  Identification, estimation, incomplete contracts, procurement auctions 
JEL:  C14 D44 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:62602&r=ecm 
By:  Esther Ruiz; Pilar Poncela 
Abstract:  In the context of Dynamic Factor Models (DFMs), we compare point and interval estimates of the underlying unobserved factors extracted using small- and big-data procedures. Our paper differs from previous works in the related literature in several ways. First, we focus on factor extraction rather than on the prediction of a given variable in the system. Second, the comparisons are carried out by applying the procedures considered to the same data. Third, we are interested not only in point estimates but also in confidence intervals for the factors. Based on a simulated system and the macroeconomic data set popularized by Stock and Watson (2012), we show that, for a given procedure, factor estimates based on different cross-sectional dimensions are highly correlated. On the other hand, given the cross-sectional dimension, the Maximum Likelihood Kalman filter and smoother (KFS) factor estimates are highly correlated with those obtained using hybrid Principal Components (PC) and KFS procedures. The PC estimates are somewhat less correlated. Finally, the PC intervals based on asymptotic approximations are unrealistically tiny. 
Keywords:  Confidence intervals, Kalman filter, Principal components, QuasiMaximum Likelihood, Sectorial Factors 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:cte:wsrepe:ws1502&r=ecm 
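A bare-bones sketch of PC factor extraction from a simulated panel, using power iteration on the second-moment matrix. The panel dimensions, loadings, and noise level are illustrative assumptions, and the Kalman-filter-based procedures and interval estimates compared in the paper are not shown:

```python
import random

def first_pc_factor(panel):
    # panel[t][i]: observation of series i at time t; returns the first
    # principal-component factor estimate (a T-vector)
    T, N = len(panel), len(panel[0])
    S = [[sum(panel[t][i] * panel[t][j] for t in range(T)) for j in range(N)]
         for i in range(N)]                      # N x N second-moment matrix
    v = [1.0] * N
    for _ in range(100):                         # power iteration: top eigenvector
        w = [sum(S[i][j] * v[j] for j in range(N)) for i in range(N)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    # project the panel on the leading eigenvector (estimated loadings)
    return [sum(panel[t][j] * v[j] for j in range(N)) for t in range(T)]

rng = random.Random(7)
T, N = 120, 20
f = [rng.gauss(0.0, 1.0) for _ in range(T)]             # latent factor
load = [rng.uniform(0.5, 1.5) for _ in range(N)]        # factor loadings
panel = [[load[j] * f[t] + 0.5 * rng.gauss(0.0, 1.0) for j in range(N)]
         for t in range(T)]
fhat = first_pc_factor(panel)
```

The factor is identified only up to sign and scale, so comparisons like those in the paper are made via correlations between the latent and extracted series.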
By:  Laffers, Lukas (Department of Mathematics); Mellace, Giovanni (Department of Business and Economics) 
Abstract:  In this paper we show that the testable implications derived in Huber and Mellace (2013) are the best possible for detecting invalid instruments in the presence of heterogeneous treatment effects and endogeneity. We also provide a formal proof that those testable implications are only necessary, but not sufficient, conditions for instrument validity. 
Keywords:  Testing IV validity; Local average treatment effect; Moment inequalities; Bounds 
JEL:  C12 C21 C26 
Date:  2015–03–03 
URL:  http://d.repec.org/n?u=RePEc:hhs:sdueko:2015_004&r=ecm 