
on Econometrics 
By:  Gael M. Martin; Brendan P.M. McCabe; David T. Frazier; Worapree Maneesoonthorn; Christian P. Robert 
Abstract:  A new approach to inference in state space models is proposed, using approximate Bayesian computation (ABC). ABC avoids evaluation of an intractable likelihood by matching summary statistics computed from observed data with statistics computed from data simulated from the true process, based on parameter draws from the prior. Draws that produce a ‘match’ between observed and simulated summaries are retained, and used to estimate the inaccessible posterior; as exact inference is not possible in the state space setting, we pursue summaries via the maximization of an auxiliary likelihood function. We derive conditions under which this auxiliary likelihood-based approach achieves Bayesian consistency and show that – in a precise limiting sense – results yielded by the auxiliary maximum likelihood estimator are replicated by the auxiliary score. Particular attention is given to a structure in which the state variable is driven by a continuous time process, with exact inference typically infeasible in this case due to intractable transitions. Two models for continuous time stochastic volatility are used for illustration, with auxiliary likelihoods constructed by applying computationally efficient filtering methods to discrete time approximations. The extent to which the conditions for consistency are satisfied is demonstrated in both cases, and the accuracy of the proposed technique when applied to a square root volatility model is also demonstrated numerically. In multiple parameter settings a separate treatment of each parameter, based on integrated likelihood techniques, is advocated as a way of avoiding the curse of dimensionality associated with ABC methods. 
Keywords:  Likelihood-free methods, latent diffusion models, Bayesian consistency, asymptotic sufficiency, unscented Kalman filter, stochastic volatility 
JEL:  C11 C22 C58 
Date:  2016 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201609&r=ecm 
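The rejection-ABC loop the abstract describes can be sketched in a minimal toy version, with an AR(1) standing in for the state space model and the lag-one autocorrelation standing in for the auxiliary-likelihood summary; the model, prior, and tolerance here are purely illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=200):
    # toy stand-in for the state space model: a Gaussian AR(1)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = theta * y[t - 1] + rng.standard_normal()
    return y

def summary(y):
    # illustrative summary statistic: lag-one autocorrelation,
    # playing the role of the auxiliary-likelihood-based summary
    return np.corrcoef(y[:-1], y[1:])[0, 1]

def abc_rejection(y_obs, n_draws=2000, tol=0.05):
    s_obs = summary(y_obs)
    kept = []
    for _ in range(n_draws):
        theta = rng.uniform(0.0, 0.95)        # draw from the prior
        s_sim = summary(simulate(theta))      # simulate and summarize
        if abs(s_sim - s_obs) < tol:          # a 'match' -> retain the draw
            kept.append(theta)
    return np.array(kept)

y_obs = simulate(0.7)
draws = abc_rejection(y_obs)                  # retained draws estimate the posterior
```

The retained draws form the ABC approximation to the posterior; in the paper the summary is instead the auxiliary MLE (or score), which carries the asymptotic sufficiency properties discussed above.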
By:  Federico A. Bugni (Duke University); Mehmet Caner (Ohio State University); Anders Bredahl Kock (Aarhus University and CREATES); Soumendra Lahiri (North Carolina State University) 
Abstract:  This paper considers the problem of inference in a partially identified moment (in)equality model with possibly many moment inequalities. Our contribution is to propose a novel two-step inference method based on the combination of two ideas. On the one hand, our test statistic and critical values are based on those proposed by Chernozhukov et al. (2014c) (CCK14, hereafter). On the other hand, we propose a new first-step selection procedure based on the Lasso. Some of the advantages of our two-step inference method are that (i) it can be used to conduct hypothesis tests and to construct confidence sets for the true parameter value that are uniformly valid over both the underlying parameter and the distribution of the data; (ii) our test is asymptotically optimal in a minimax sense; and (iii) our method has better power than CCK14 in large parts of the parameter space, both in theory and in simulations. Finally, we show that the Lasso-based first step can be implemented with a thresholding least squares procedure that makes it extremely simple to compute. 
Keywords:  Many moment inequalities, self-normalizing sum, multiplier bootstrap, empirical bootstrap, Lasso, inequality selection 
JEL:  C13 C23 C26 
Date:  2016–04–26 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201612&r=ecm 
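The flavour of the first-step moment selection can be illustrated with a stylized thresholding rule on studentized sample moments: inequalities that are clearly slack are discarded before the second-step test. The threshold constant and the data below are illustrative only, not the paper's tuning:

```python
import numpy as np

def select_moments(m_bar, sigma, n, c=2.0):
    """Toy first-step selection for moment inequalities E[m_j] <= 0:
    discard inequalities whose studentized sample means are far below
    zero (clearly slack), keep those close to binding."""
    t = np.sqrt(n) * m_bar / sigma
    lam = c * np.sqrt(np.log(len(m_bar)))   # illustrative tuning constant
    return np.where(t > -lam)[0]            # indices of retained moments

rng = np.random.default_rng(1)
n, p = 500, 6
# moments 0-2 are binding (mean 0); moments 3-5 are deeply slack (mean -1)
data = rng.standard_normal((n, p)) + np.r_[np.zeros(3), -np.ones(3)]
kept = select_moments(data.mean(axis=0), data.std(axis=0, ddof=1), n)
```

Only the (near-)binding inequalities survive the first step, which is what gives the two-step procedure its power gains over using all moments.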
By:  Sujay Mukhoti; Pritam Ranjan 
Abstract:  In an efficient stock market, the log-returns and their time-dependent variances are often jointly modelled by stochastic volatility models (SVMs). Many SVMs assume that the errors in the log-return and latent volatility processes are uncorrelated, which is unrealistic. It turns out that if a non-zero correlation is included in the SVM (e.g., Shephard (2005)), then the expected log-return at time t conditional on the past returns is non-zero, which is not a desirable feature of an efficient stock market. In this paper, we propose a mean correction for such an SVM for discrete-time returns with non-zero correlation. We also find closed-form analytical expressions for the higher moments of the log-return and its lead-lag correlations with the volatility process. We compare the performance of the proposed and classical SVMs on S&P 500 index returns obtained from NYSE. 
Date:  2016–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1605.02418&r=ecm 
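A minimal simulation of a discrete-time SVM with correlated errors shows the leverage-type lead-lag dependence the abstract refers to; the parameterization below is a generic illustration, not the authors' specification:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_sv(n=50_000, mu=-1.0, phi=0.95, tau=0.2, rho=-0.5):
    """Discrete-time SV with correlated errors:
    h[t+1] = mu + phi*(h[t] - mu) + tau*eta[t],  y[t] = exp(h[t]/2)*eps[t],
    with corr(eps[t], eta[t]) = rho (the 'leverage' correlation)."""
    h = np.empty(n)
    y = np.empty(n)
    h[0] = mu
    for t in range(n):
        eps = rng.standard_normal()
        eta = rho * eps + np.sqrt(1 - rho**2) * rng.standard_normal()
        y[t] = np.exp(h[t] / 2) * eps
        if t + 1 < n:
            h[t + 1] = mu + phi * (h[t] - mu) + tau * eta
    return y, h

y, h = simulate_sv()
```

With rho < 0, today's return shock is negatively correlated with tomorrow's log-volatility, so corr(y[t], h[t+1]) is negative even though the unconditional mean of y stays near zero; it is exactly this kind of correlation channel that necessitates the mean correction the paper proposes.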
By:  Robinson Kruse (Rijksuniversiteit Groningen and CREATES); Christian Leschinski (Leibniz University Hannover); Michael Will (Leibniz University Hannover) 
Abstract:  This paper extends the popular Diebold-Mariano test to situations when the forecast error loss differential exhibits long memory. It is shown that this situation can arise frequently, since long memory can be transmitted from forecasts and the forecast objective to forecast error loss differentials. The nature of this transmission mainly depends on the (un)biasedness of the forecasts and whether the involved series share common long memory. Further results show that the conventional Diebold-Mariano test is invalidated under these circumstances. Robust statistics based on a memory and autocorrelation consistent estimator and an extended fixed-bandwidth approach are considered. The subsequent Monte Carlo study provides a novel comparison of these robust statistics. As empirical applications, we conduct forecast comparison tests for the realized volatility of the Standard and Poor's 500 index among recent extensions of the heterogeneous autoregressive model. While we find that forecasts improve significantly if jumps in the log-price process are considered separately from continuous components, improvements achieved by the inclusion of implied volatility turn out to be insignificant in most situations. 
Keywords:  Equal Predictive Ability, Long Memory, Diebold-Mariano Test, Long-run Variance Estimation, Realized Volatility 
JEL:  C22 C52 C53 
Date:  2016–05–19 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201617&r=ecm 
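For reference, the conventional Diebold-Mariano statistic that the paper shows is invalidated under long memory can be sketched as follows, here under squared-error loss with a Bartlett (Newey-West) long-run variance; the data are synthetic and for illustration only:

```python
import numpy as np

def dm_test(e1, e2, h=1):
    """Classic Diebold-Mariano statistic for forecast errors e1, e2 under
    squared-error loss, with a Bartlett long-run variance estimate using
    h - 1 autocovariance lags. Asymptotically N(0,1) under short memory;
    this is the benchmark shown to fail when d_t has long memory."""
    d = e1**2 - e2**2                     # loss differential
    n = len(d)
    dbar = d.mean()
    dc = d - dbar
    lrv = dc @ dc / n                     # lag-0 term
    for k in range(1, h):
        w = 1 - k / h                     # Bartlett kernel weight
        lrv += 2 * w * (dc[:-k] @ dc[k:]) / n
    return dbar / np.sqrt(lrv / n)

rng = np.random.default_rng(3)
e1 = rng.standard_normal(2000)            # forecast errors, method 1
e2 = 0.5 * rng.standard_normal(2000)      # method 2 is clearly more accurate
stat = dm_test(e1, e2)                    # large positive: method 2 wins
```

Under long memory in d_t, the long-run variance above is no longer the right normalization, which is why the paper turns to memory and autocorrelation consistent estimators and fixed-bandwidth asymptotics instead.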
By:  Xavier D'Haultfoeuille (Centre de Recherche en Économie et Statistique (CREST)); Roland Rathelot (University of Warwick) 
Abstract:  We consider the issue of measuring segregation in a population of small units, considering establishments in our application. Each establishment may have a different probability of hiring an individual from the minority group. We define segregation indices as inequality indices on these unobserved, random probabilities. Because these probabilities are measured with error by sample proportions, standard estimators are inconsistent. We model this problem as a nonparametric binomial mixture. Under this testable assumption and conditions satisfied by standard segregation indices, such indices are partially identified, and sharp bounds can be easily obtained by an optimization over a low-dimensional space. We also develop bootstrap confidence intervals and a test of the binomial mixture model. Finally, we apply our method to measure the segregation of foreigners in small French firms. 
Keywords:  segregation, small units, partial identification 
JEL:  C13 C14 J71 
Date:  2016–05 
URL:  http://d.repec.org/n?u=RePEc:crm:wpaper:1611&r=ecm 
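The core measurement problem is easy to see in simulation: with small units, the dispersion of observed hiring proportions overstates the dispersion of the true probabilities, because binomial noise adds variance. The mixing distribution and unit size below are hypothetical, and the simple method-of-moments correction shown is only the variance-based special case of the binomial-mixture idea:

```python
import numpy as np

rng = np.random.default_rng(4)

n_units, k = 20_000, 10               # many small establishments, k hires each
p = rng.beta(2.0, 8.0, n_units)       # true (unobserved) minority-hire probabilities
x = rng.binomial(k, p)                # observed minority counts
phat = x / k                          # observed proportions

naive = phat.var()                    # overstates Var(p): includes binomial noise
# method-of-moments correction: phat*(1-phat)/(k-1) is unbiased for p*(1-p)/k,
# so subtracting its mean removes the binomial component of the variance
corrected = phat.var() - (phat * (1 - phat)).mean() / (k - 1)
```

Here the naive variance is roughly double the true Var(p), while the corrected estimate recovers it; for general (non-variance) segregation indices no such closed-form correction exists, which is what motivates the paper's partial-identification bounds.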
By:  P. A. V. B. Swamy; I-Lok Chang; Jatinder S. Mehta; William H. Greene; Stephen G. Hall; George S. Tavlas 
Abstract:  We develop a procedure for removing four major specification errors from the usual formulation of binary choice models. The model that results from this procedure is different from the conventional probit and logit models. This difference arises as a direct consequence of our relaxation of the usual assumption that omitted regressors constituting the error term of a latent linear regression model do not introduce omitted regressor biases into the coefficients of the included regressors. 
URL:  http://d.repec.org/n?u=RePEc:lec:leecon:16/11&r=ecm 
By:  Tomás del Barrio Castro (Universitat de les Illes Balears); Alain Hecq (Maastricht University) 
Abstract:  This paper investigates the presence of deterministic seasonal features within a mixed frequency vector autoregressive model. A strategy based on Wald tests is proposed. 
Keywords:  deterministic seasonal features, mixed frequency VARs 
JEL:  C32 
Date:  2016 
URL:  http://d.repec.org/n?u=RePEc:ubi:deawps:76&r=ecm 
By:  Matsypura, Dmytro; Neo, Emily; Prokhorov, Artem 
Abstract:  We formulate the problem of finding and estimating the optimal hierarchical Archimedean copula as an amended shortest path problem. The standard network flow problem is amended by certain constraints specific to copulas, which limit the scalability of the problem. However, we show in dimensions as high as twenty that the new approach dominates the alternatives, which usually require recursive estimation or full enumeration. 
Keywords:  network flow problem, copulas 
Date:  2016–04–16 
URL:  http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/14745&r=ecm 
By:  Takashi Shinzato 
Abstract:  In the present work, eigenvalue distributions defined by a random rectangular matrix whose components are neither independently nor identically distributed are analyzed using replica analysis and belief propagation. In particular, we consider the case in which the components are independently but not identically distributed; for example, only the components in each row or in each column may be identically distributed. We also consider the more general case in which the components are correlated with one another. We use the replica approach while making only weak assumptions in order to determine the asymptotic eigenvalue distribution and to derive an algorithm for doing so, based on belief propagation. One of our findings supports the results obtained from Feynman diagrams. We present the results of several numerical experiments that validate our proposed methods. 
Date:  2016–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1605.06840&r=ecm 
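As a numerical baseline for the general non-i.i.d. setting the paper studies, the i.i.d. special case can be checked directly: the eigenvalues of the sample covariance of a random rectangular matrix then follow the Marchenko-Pastur law, whose support gives a simple sanity check. Dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

n, m = 1000, 2000                        # rectangular matrix, aspect ratio q = n/m
q = n / m
X = rng.standard_normal((n, m)) / np.sqrt(m)
eig = np.linalg.eigvalsh(X @ X.T)        # eigenvalues of the sample covariance

# Marchenko-Pastur support for the i.i.d. case: [(1-sqrt(q))^2, (1+sqrt(q))^2]
lo, hi = (1 - np.sqrt(q))**2, (1 + np.sqrt(q))**2
```

In the i.i.d. case the empirical spectrum fills this interval and has mean one; the replica/belief-propagation machinery in the paper is what replaces this closed form once the components are row- or column-dependent or mutually correlated.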
By:  Constantin Bürgi (The George Washington University); Tara M. Sinclair (The George Washington University) 
Abstract:  Empirical studies in the forecast combination literature have shown that it is notoriously difficult to improve upon the simple average despite the availability of optimal combination weights. In particular, historical performance-based combination approaches do not select forecasters that improve upon the simple average going forward. This paper shows that this is due to the high correlation among forecasters, which only by chance causes some individuals to have lower root mean squared errors (RMSE) than the simple average. We introduce a new nonparametric approach to eliminate forecasters who perform well based purely on chance, as well as poor performers. This leaves a subset of forecasters with better performance in subsequent periods. It improves upon the simple average in the SPF for bond yields, where some forecasters may be more likely to have specialized knowledge. 
Keywords:  Forecast combination; Forecast evaluation; Multiple model comparisons; Real-time data; Survey of Professional Forecasters 
JEL:  C22 C52 C53 
Date:  2015–12 
URL:  http://d.repec.org/n?u=RePEc:gwc:wpaper:2015006&r=ecm 
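The correlation mechanism in the abstract can be reproduced in a toy simulation: when forecasters share a large common error component, the simple average is hard to beat, and the few individuals who do beat it in a given sample do so largely by chance. The data-generating process below is illustrative, not the SPF:

```python
import numpy as np

rng = np.random.default_rng(6)

T, J = 200, 30
target = rng.standard_normal(T)
common = rng.standard_normal(T)                  # error component shared by all
# each forecaster = target + shared error + small idiosyncratic error
fcst = (target[:, None] + 0.9 * common[:, None]
        + 0.3 * rng.standard_normal((T, J)))

rmse = np.sqrt(((fcst - target[:, None])**2).mean(axis=0))    # individual RMSEs
avg_rmse = np.sqrt(((fcst.mean(axis=1) - target)**2).mean())  # simple average
n_beat = int((rmse < avg_rmse).sum())            # individuals beating the average
```

Averaging only diversifies away the small idiosyncratic part, so the gain over the typical forecaster is modest, and which forecasters land below the average line is driven by sampling noise; this is the pattern the paper's elimination procedure is designed to separate from genuine skill.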
By:  Antoine Mandel (Centre d'Economie de la Sorbonne  Paris School of Economics); Amir Sani (Centre d'Economie de la Sorbonne  Paris School of Economics) 
Abstract:  Nonparametric forecast combination methods choose a set of static weights to combine over candidate forecasts, as opposed to traditional forecasting approaches, such as ordinary least squares, that combine over information (e.g. exogenous variables). While they are robust to noise, structural breaks, inconsistent predictors and changing dynamics in the target variable, sophisticated combination methods fail to outperform the simple mean. Time-varying weights have been suggested as a way forward. Here we address the challenge to “develop methods better geared to the intermittent and evolving nature of predictive relations” in Stock and Watson (2001) and propose a data-driven machine learning approach to learn time-varying forecast combinations for output, inflation or any macroeconomic time series of interest. Further, the proposed procedure “hedges” combination weights against poor performance relative to the mean, while optimizing weights to minimize the performance gap to the best candidate forecast in hindsight. Theoretical results are reported along with empirical performance on a standard macroeconomic dataset for predicting output and inflation. 
Keywords:  Forecast combinations; Machine Learning; Econometrics; Forecasting; Forecast combination Puzzle 
JEL:  C71 D85 
Date:  2016–04 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:16036&r=ecm 
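The "track the best forecast in hindsight" idea comes from the online-learning literature; a minimal sketch of one such scheme, the exponentially weighted average forecaster with multiplicative weight updates, is shown below. This is a generic illustration of the family of methods, not the authors' exact procedure, and the learning rate and experts are arbitrary:

```python
import numpy as np

def hedge_combine(forecasts, target, eta=0.1):
    """Exponentially weighted average forecaster: at each date, combine
    experts with the current weights, then down-weight each expert
    multiplicatively in its squared-error loss. The combination's
    cumulative loss provably tracks the best expert in hindsight."""
    T, J = forecasts.shape
    w = np.ones(J) / J                       # start from the simple average
    combo = np.empty(T)
    for t in range(T):
        combo[t] = w @ forecasts[t]          # time-varying combination
        loss = (forecasts[t] - target[t])**2
        w *= np.exp(-eta * loss)             # multiplicative update
        w /= w.sum()
    return combo

rng = np.random.default_rng(7)
T = 500
target = rng.standard_normal(T)
# one accurate expert plus four noisy ones
experts = np.column_stack([target + 0.1 * rng.standard_normal(T)]
                          + [target + rng.standard_normal(T) for _ in range(4)])
combo = hedge_combine(experts, target)
rmse_combo = np.sqrt(((combo - target)**2).mean())
rmse_mean = np.sqrt(((experts.mean(axis=1) - target)**2).mean())
```

Starting from equal weights means early performance is close to the simple mean, while the updates migrate weight toward the best expert over time, which is the hedging behaviour the abstract describes.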
By:  S Ouliaris; A R Pagan (UniSyd) 
Abstract:  Structural VARs are used to compute impulse responses to shocks. One problem that has arisen involves the information needed to perform this task, i.e. how the shocks are to be separated into those representing technology, monetary effects, etc. Increasingly, the signs of impulse responses are used for this task. However, it is often desirable to impose some parametric assumption as well, e.g. that monetary shocks have no long-run impact on output. Existing methods for combining sign and parametric restrictions are not well developed. In this paper we provide a relatively simple way to allow for these combinations and show how it works in a number of different contexts. 
Keywords:  VAR 
Date:  2015–05–11 
URL:  http://d.repec.org/n?u=RePEc:qut:auncer:2015_03&r=ecm 
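The draw-and-check logic behind sign restrictions can be sketched in a bivariate toy model: rotate a Cholesky factor of the reduced-form covariance by a random orthogonal matrix and keep rotations whose impact responses carry the required signs. The covariance matrix and the restrictions below are illustrative; the paper's contribution is combining such sign restrictions with parametric (e.g. long-run) restrictions:

```python
import numpy as np

rng = np.random.default_rng(8)

Sigma = np.array([[1.0, 0.3],
                  [0.3, 1.0]])               # reduced-form error covariance
P = np.linalg.cholesky(Sigma)

accepted = []
for _ in range(2000):
    th = rng.uniform(0, 2 * np.pi)
    Q = np.array([[np.cos(th), -np.sin(th)],  # random 2x2 rotation
                  [np.sin(th),  np.cos(th)]])
    B = P @ Q                                 # candidate impact matrix, B @ B.T = Sigma
    # sign restriction on the first shock: both variables rise on impact
    if B[0, 0] > 0 and B[1, 0] > 0:
        accepted.append(B)
```

Every accepted B reproduces the reduced-form covariance exactly, so the sign restrictions only narrow the set of admissible structural models; adding a parametric zero restriction on top would pin the rotation angle down further, which is the combination the paper formalizes.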
By:  Sergey Ivashchenko (National Research University Higher School of Economics) 
Abstract:  This article proposes several nonlinear Markov-switching filters and compares their properties. Two of them are sigma point filters: the Markov switching central difference Kalman filter (MSCDKF) and MSCDKFA. Two are Gaussian assumed filters: the Markov switching quadratic Kalman filter (MSQKF) and MSQKFA. A small-scale financial MSDSGE model is used for tests. MSQKF greatly outperforms the other filters in terms of computational costs. It is also the first- or second-best according to most tests of filtering quality (including the quality of quasi-maximum likelihood estimation using a filter, and the RMSE and LPS of unobserved variables). 
Keywords:  regime switching, second-order approximation, nonlinear MSDSGE estimation, MSQKF, MSCDKF 
JEL:  C13 C32 E32 
Date:  2016 
URL:  http://d.repec.org/n?u=RePEc:hig:wpaper:136/ec/2016&r=ecm 