
on Econometrics 
By:  Bent Jesper Christensen (Aarhus University and CREATES); Rasmus T. Varneskov (Aarhus University and CREATES) 
Abstract:  This paper introduces a new estimator of the fractional cointegrating vector between stationary long memory processes that is robust to low-frequency contamination such as level shifts, i.e., structural changes in the means of the series, and deterministic trends. In particular, the proposed medium band least squares (MBLS) estimator uses sample-dependent trimming of frequencies in the vicinity of the origin to account for such contamination. Consistency and asymptotic normality of the MBLS estimator are established, a feasible inference procedure is proposed, and rigorous tools for assessing the cointegration strength and testing MBLS against the existing narrow band least squares estimator are developed. Finally, the asymptotic framework for the MBLS estimator is used to provide new perspectives on volatility factors in an empirical application to long-span realized variance series for S&P 500 equities. 
Keywords:  Deterministic Trends, Factor Models, Fractional Cointegration, Long Memory, Realized Variance, Semiparametric Estimation, Structural Change 
JEL:  C12 C14 C32 C58 
Date:  2015–05–27 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201525&r=ecm 
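The trimming idea behind MBLS can be illustrated with a small numerical sketch. The snippet below is not the authors' estimator; it is a generic band spectral least squares regression in which the band choice (`lo`, `hi`), the AR(1) data-generating process, and the size of the level shift are all illustrative assumptions. Setting `lo > 1` trims the frequencies nearest the origin, which is where a level shift concentrates its contamination.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = np.zeros(n)
for t in range(1, n):                      # persistent AR(1) regressor (illustrative)
    x[t] = 0.9 * x[t - 1] + rng.standard_normal()
y = 2.0 * x + 0.2 * rng.standard_normal(n)  # true cointegrating coefficient: 2
y[n // 2:] += 5.0                           # level shift: low-frequency contamination

def band_ls(y, x, lo, hi):
    """Band spectral least squares of y on x over Fourier frequencies
    j = lo..hi; lo > 1 trims the frequencies nearest the origin (the MBLS
    idea), while lo = 1 is the untrimmed narrow-band (NBLS) case."""
    wy = np.fft.fft(y - y.mean())
    wx = np.fft.fft(x - x.mean())
    j = np.arange(lo, hi + 1)
    num = np.real(np.conj(wx[j]) * wy[j]).sum()
    den = (np.abs(wx[j]) ** 2).sum()
    return num / den

beta_nbls = band_ls(y, x, 1, 150)   # no trimming: exposed to the shift
beta_mbls = band_ls(y, x, 8, 150)   # trim the 7 lowest frequencies
print(beta_nbls, beta_mbls)
```

The shift's spectral leakage decays away from the origin, so the trimmed band largely avoids it; the actual MBLS theory makes the trimming sample-dependent rather than fixed as here.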
By:  Ulrich Hounyo (Oxford-Man Institute, University of Oxford, and Aarhus University and CREATES); Rasmus T. Varneskov (Aarhus University and CREATES) 
Abstract:  We provide a new resampling procedure, the local stable bootstrap, that is able to mimic the dependence properties of realized power variations for pure-jump semimartingales observed at different frequencies. This allows us to propose a bootstrap estimator and inference procedure for the activity index of the underlying process, β, as well as a bootstrap test for whether it obeys a jump-diffusion or a pure-jump process, that is, of the null hypothesis H0: β=2 against the alternative H1: β<2. We establish first-order asymptotic validity of the resulting bootstrap power variations, activity index estimator, and diffusion test for H0. Moreover, the finite sample size and power properties of the proposed diffusion test are compared to those of benchmark tests using Monte Carlo simulations. Unlike existing procedures, our bootstrap test is correctly sized in general settings. Finally, we illustrate the use and properties of the new bootstrap diffusion test using high-frequency data on three FX series, the S&P 500, and the VIX. 
Keywords:  Activity index, Bootstrap, Blumenthal-Getoor index, Confidence Intervals, High-frequency Data, Hypothesis Testing, Realized Power Variation, Stable Processes 
JEL:  C12 C14 C15 G1 
Date:  2015–05–27 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201526&r=ecm 
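To fix ideas about realized power variations and the activity index, here is a minimal sketch; it is not the authors' local stable bootstrap, but a simple two-frequency scaling-ratio estimate of β for a simulated Brownian path, where β = 2 should be recovered. The power p, the sample size, and the two sampling gaps are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2 ** 14
dt = 1.0 / n
# Brownian path on [0, 1]: a pure diffusion, so the activity index is beta = 2
path = np.concatenate([[0.0], np.cumsum(np.sqrt(dt) * rng.standard_normal(n))])

def realized_power_variation(path, p, k):
    """Sum of |increments|^p when the path is sampled at every k-th point."""
    return (np.abs(np.diff(path[::k])) ** p).sum()

p = 1.0
v_fine = realized_power_variation(path, p, 1)
v_coarse = realized_power_variation(path, p, 2)
# Scaling law V(p, dt) ~ C * dt^(p/beta - 1): the ratio across the two
# sampling frequencies identifies beta.
beta_hat = p * np.log(2) / (np.log(2) - np.log(v_fine / v_coarse))
print(beta_hat)
```

A pure-jump β-stable path would drive `beta_hat` below 2, which is the contrast the paper's bootstrap diffusion test formalizes.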
By:  Esposito, Francesco Paolo; Cummins, Mark 
Abstract:  In this article we use a partial integro-differential approach to construct and extend a nonlinear filter to include jump components in the system state. We employ the enhanced filter to estimate the latent state of multivariate parametric jump-diffusions. The devised procedure is flexible and can be applied to non-affine diffusions as well as to state-dependent jump intensities and jump size distributions. The particular design of the system state can also provide an estimate of the jump times and sizes. With the same approach by which the filter has been devised, we implement an approximate likelihood for the parameter estimation of models of the jump-diffusion class. In developing the estimation function, we take particular care to design a simplified algorithm for its computation. The likelihood function is then characterised in the application to stochastic volatility models with jumps. In the empirical section we validate the proposed approach via Monte Carlo experiments. We treat the volatility as an intrinsic latent factor that is partially observable through the integrated variance, a new system state component introduced to increase the filtered information content and allow closer tracking of the latent volatility factor. Further, we analyse the structure of the measurement error, particularly in relation to the presence of jumps in the system. In connection with this, we detect and address an issue arising in the update equation, improving the system state estimate. 
Keywords:  latent state variables, nonlinear filtering, finite difference method, multivariate jump-diffusions, likelihood estimation 
JEL:  C13 
Date:  2015–05–01 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:64987&r=ecm 
By:  W. Robert Reed (University of Canterbury) 
Abstract:  This paper demonstrates that unit root tests can suffer from inflated Type I error rates when data are cointegrated. Results from Monte Carlo simulations show that three commonly used unit root tests – the ADF, Phillips-Perron, and DF-GLS tests – frequently overreject the true null of a unit root for at least one of the cointegrated variables. The reason for this overrejection is that unit root tests, designed for random walk data, are often misspecified when data are cointegrated. While the addition of lagged differenced (LD) terms can eliminate the size distortion, this “success” is spurious, driven by collinearity between the lagged dependent variable and the LD explanatory variables. Accordingly, standard diagnostics such as (i) testing for serial correlation in the residuals and (ii) using information criteria to select among different lag specifications are futile. The implication of these results is that researchers should be conservative in the weight 
Keywords:  Unit root testing, cointegration, DF-GLS test, Augmented Dickey-Fuller test, Phillips-Perron test, simulation 
JEL:  C32 C22 C18 
Date:  2015–05–30 
URL:  http://d.repec.org/n?u=RePEc:cbt:econwp:15/11&r=ecm 
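The over-rejection mechanism can be reproduced in a few lines. The simulation below is a hypothetical design, not the paper's: y error-corrects toward a random walk x, so y is I(1) but its first difference is serially correlated, and a no-lag Dickey-Fuller regression rejects the true unit root null well above the nominal 5% level (the critical value −2.86 is the usual large-sample approximation for the constant-only case).

```python
import numpy as np

rng = np.random.default_rng(5)

def df_tstat(y):
    """Dickey-Fuller t-statistic: regress dy_t on a constant and y_{t-1},
    with no lag augmentation."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ dy
    resid = dy - X @ b
    s2 = resid @ resid / (len(dy) - 2)
    return b[1] / np.sqrt(s2 * XtX_inv[1, 1])

reps, T, crit = 500, 100, -2.86   # approx. 5% DF critical value (with constant)
rejections = 0
for _ in range(reps):
    e = rng.standard_normal((T, 2))
    x = np.cumsum(e[:, 0])                 # common stochastic trend
    y = np.empty(T)
    y[0] = x[0]
    for t in range(1, T):
        # error correction: y adjusts 80% of the way toward x each period
        y[t] = y[t - 1] - 0.8 * (y[t - 1] - x[t - 1]) + e[t, 1]
    if df_tstat(y) < crit:
        rejections += 1
rate = rejections / reps
print(rate)
```

The error-correction term induces negative serial correlation in Δy, the classic setting in which unaugmented unit root tests over-reject.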
By:  Naoto Kunitomo (Faculty of Economics, The University of Tokyo); Seisho Sato (Faculty of Economics, The University of Tokyo) 
Abstract:  The use of seasonally adjusted (official) data may pose statistical problems, because it is common practice to use X-12-ARIMA in the official seasonal adjustment, which adopts univariate ARIMA time series modeling with some refinements. Instead of using the seasonally adjusted data, we propose a new method, called the Separating Information Maximum Likelihood (SIML) estimation, for estimating the structural parameters and relationships among nonstationary economic time series with seasonality and noise. We show that the SIML estimation can identify the nonstationary trend, the seasonality and the noise components, which have been observed in many macroeconomic time series, and recover the structural parameters and relationships among the nonstationary trends with seasonality. The SIML estimation is consistent and asymptotically normal when the sample size is large. Based on simulations, we find that the SIML estimator has reasonable finite sample properties and thus would be useful in practice. 
Date:  2015–06 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2015cf977&r=ecm 
By:  Shonosuke Sugasawa (Graduate School of Economics, The University of Tokyo); Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo) 
Abstract:  The article considers a nested error regression model with heteroscedastic variance functions for analyzing clustered data, where normality of the underlying distributions is not assumed. Classical methods in normal nested error regression models with homogeneous variances are extended in two directions: heterogeneous variance functions for the error terms and non-normal distributions for the random effects and error terms. Consistent estimators for the model parameters are suggested, and second-order approximations of their biases and variances are derived. The mean squared errors of the empirical best linear unbiased predictors are expressed explicitly to second order. Second-order unbiased estimators of the mean squared errors are provided analytically in closed form. The proposed model and the resulting procedures are numerically investigated through simulation and empirical studies. 
Date:  2015–06 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2015cf978&r=ecm 
By:  Bayati, Mohsen (Stanford University); Erdogdu, Murat A. (Stanford University); Montanari, Andrea (Stanford University) 
Abstract:  We study the fundamental problems of variance and risk estimation in high-dimensional statistical modeling. In particular, we consider the problem of learning a coefficient vector θ0 ∈ R^p from noisy linear observations y = Xθ0 + w ∈ R^n (p > n) and the popular estimation procedure of solving the ℓ1-penalized least squares objective known as the LASSO or Basis Pursuit DeNoising (BPDN). In this context, we develop new estimators for the ℓ2 estimation risk ‖θ̂ − θ0‖2 and the variance of the noise when the distributions of θ0 and w are unknown. These can be used to select the regularization parameter optimally. Our approach combines Stein's unbiased risk estimate [Ste81] and the recent results of [BM12a, BM12b] on the analysis of approximate message passing and the risk of the LASSO. We establish high-dimensional consistency of our estimators for sequences of matrices X of increasing dimensions with independent Gaussian entries. We establish validity for a broader class of Gaussian designs, conditional on a certain conjecture from statistical physics. To the best of our knowledge, this result is the first that provides an asymptotically consistent risk estimator for the LASSO solely based on data. In addition, we demonstrate through simulations that our variance estimation outperforms several existing methods in the literature. 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:ecl:stabus:3284&r=ecm 
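As context for the risk and variance estimation problem, the sketch below fits the LASSO by coordinate descent and forms a naive noise-variance estimate RSS/(n − df), with df the number of nonzero coefficients. This is a textbook-style baseline, not the AMP-based estimator of the paper, and the penalty choice is an illustrative universal-threshold-style rule.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, sigma = 100, 200, 1.0
X = rng.standard_normal((n, p))
theta0 = np.zeros(p)
theta0[:10] = 2.0                       # 10 active coefficients (illustrative)
y = X @ theta0 + sigma * rng.standard_normal(n)

def lasso_cd(X, y, lam, n_iter=100):
    """Coordinate descent for 0.5*||y - X b||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y.copy()
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]      # put coordinate j back into the residual
            z = X[:, j] @ r
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta

lam = sigma * np.sqrt(2 * n * np.log(p))   # universal-threshold-style choice
beta = lasso_cd(X, y, lam)
df = int((beta != 0).sum())                 # degrees-of-freedom proxy for the LASSO
resid = y - X @ beta
sigma2_hat = resid @ resid / (n - df)
print(df, sigma2_hat)
```

Shrinkage bias inflates this naive variance estimate, which is one motivation for the more careful data-driven estimators the paper constructs.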
By:  Abdelkamel Alj; Christophe Ley; Guy Melard 
Keywords:  nonstationary process; multivariate time series; timevarying models 
Date:  2015–06 
URL:  http://d.repec.org/n?u=RePEc:eca:wpaper:2013/200183&r=ecm 
By:  Fuchun Li 
Abstract:  The author proposes a test for the parametric specification of each component in the diffusion matrix of a d-dimensional diffusion process. Overall, d(d−1)/2 test statistics are constructed for the off-diagonal components, while d test statistics are constructed for the main diagonal components. Using theories of degenerate U-statistics, each of these test statistics is shown to follow an asymptotic standard normal distribution under the null hypothesis, while diverging to infinity if the component is misspecified over a significant range. Our tests strongly reject the specification of diffusion functions in a variety of popular univariate interest rate models for daily 7-day eurodollar spot rates, and the specification of the diffusion matrix in some popular multivariate affine term-structure models for monthly U.S. Treasury yields. 
Keywords:  Asset Pricing, Econometric and statistical methods, Interest rates 
JEL:  C12 C14 E17 E43 G12 G20 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:bca:bocawp:1517&r=ecm 
By:  Violetta Dalla; Javier Hidalgo 
Keywords:  Nonparametric regression, Breaks/smoothness, Strong dependence, Extremevalues distribution, Frequency domain bootstrap algorithms. 
JEL:  C14 C22 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2015/584&r=ecm 
By:  Bertille Antoine (Simon Fraser University); Otilia (Tilburg University) 
Abstract:  Decades of empirical evidence suggest that many macroeconometric and financial models are subject to both instability and identification problems. In this paper, we address both issues under the unified framework of time-varying information, which includes changes in instrument strength, changes in the second moment of instruments, and changes in the variance of moment conditions. We develop a comprehensive econometric method that detects and exploits these changes to increase the efficiency of the estimates of the (stable) structural parameters. We estimate a New Keynesian Phillips Curve and obtain more precise estimates of the price indexation parameters than standard methods. An extensive simulation study also shows that our method delivers substantial efficiency gains in finite samples. 
Keywords:  GMM, Weak instruments, Break point, Change in identification strength 
JEL:  C13 C22 C26 C36 C51 
Date:  2015–06–04 
URL:  http://d.repec.org/n?u=RePEc:sfu:sfudps:dp1504&r=ecm 
By:  Esposito, Francesco Paolo; Cummins, Mark 
Abstract:  Extending the previous risk model backtesting literature, we construct multiple hypothesis testing (MHT) with the stationary bootstrap. We conduct multiple tests which control for the generalized confidence level and employ the bootstrap MHT to design multiple comparison testing. We consider absolute and relative predictive ability to test a range of competing risk models, focusing on Value-at-Risk (VaR) and Expected Shortfall (ExS). In devising the test for absolute predictive ability, we take the route of recent literature and construct balanced simultaneous confidence sets that control for the generalized familywise error rate, which is the joint probability of rejecting true hypotheses. We implement a step-down method which increases the power of the MHT in isolating false discoveries. In testing for ExS model predictive ability, we design a new simple test to draw inference about recursive model forecasting capability. In the second suite of statistical testing, we develop a novel device for measuring relative predictive ability in the bootstrap MHT framework. The device, which we coin the multiple comparison mapping, provides a statistically robust instrument designed to answer the question: ''which model is the best model?''. 
Keywords:  value-at-risk, expected shortfall, bootstrap multiple hypothesis testing, generalized familywise error rate, multiple comparison map 
JEL:  C12 
Date:  2015–03–01 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:64986&r=ecm 
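The stationary bootstrap underlying the MHT construction resamples random-length blocks so that weak dependence in the backtesting data is preserved. A minimal sketch follows, applied to an illustrative i.i.d. VaR violation ("hit") series with hypothetical parameter choices; the expected block length is 1/p.

```python
import numpy as np

def stationary_bootstrap(x, p, rng):
    """One stationary-bootstrap resample (geometric block lengths): each
    step either continues the current block or restarts at a uniformly
    random index, wrapping around the end of the series."""
    n = len(x)
    idx = np.empty(n, dtype=int)
    i = int(rng.integers(n))
    for t in range(n):
        idx[t] = i
        i = int(rng.integers(n)) if rng.random() < p else (i + 1) % n
    return x[idx]

rng = np.random.default_rng(6)
# illustrative hit series: 1 = realized loss exceeded the VaR forecast
hits = (rng.random(1000) < 0.05).astype(float)
boot_rates = np.array([stationary_bootstrap(hits, p=0.1, rng=rng).mean()
                       for _ in range(200)])
lo_ci, hi_ci = np.percentile(boot_rates, [2.5, 97.5])
print(hits.mean(), lo_ci, hi_ci)
```

The paper builds simultaneous (balanced) confidence sets across many models on top of resamples like these; this sketch only shows the resampling engine and a single-series interval.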
By:  Joshua C.C. Chan; Eric Eisenstat 
Abstract:  Empirical work in macroeconometrics has been mostly restricted to using VARs, even though there are strong theoretical reasons to consider general VARMAs. A number of articles in the last two decades have conjectured that this is because estimation of VARMAs is perceived to be challenging and proposed various ways to simplify it. Nevertheless, VARMAs continue to be largely dominated by VARs, particularly in terms of developing useful extensions. We address these computational challenges with a Bayesian approach. Specifically, we develop a Gibbs sampler for the basic VARMA, and demonstrate how it can be extended to models with time-varying VMA coefficients and stochastic volatility. We illustrate the methodology through a macroeconomic forecasting exercise. We show that in a class of models with stochastic volatility, VARMAs produce better density forecasts than VARs, particularly for short forecast horizons. 
Keywords:  state space, stochastic volatility, factor model, macroeconomic forecasting, density forecast 
JEL:  C11 C32 C53 
Date:  2015–06 
URL:  http://d.repec.org/n?u=RePEc:een:camaaa:201519&r=ecm 
By:  Farbmacher, Helmut; Kögel, Heinrich (Munich Center for the Economics of Aging (MEA)) 
Abstract:  In the presence of heteroskedasticity, conventional standard errors (which assume homoskedasticity) can be biased up or down. The most common form of heteroskedasticity leads to conventional standard errors that are too small. When Wald tests based on these standard errors are insignificant, heteroskedasticity-robust standard errors do not change inference. On the other hand, inference is conservative in a setting with upward-biased conventional standard errors. We discuss the power gains from using robust standard errors in this case and also potential problems of heteroskedasticity tests. As a solution for the poor performance of the usual heteroskedasticity tests in this setting, we propose a modification of the White test which has better properties. We illustrate our findings using a study in labor economics. The correct standard errors turn out to be around 15 percent lower, leading to different policy conclusions. Moreover, only our modified test is able to detect heteroskedasticity in this application. 
Date:  2015–03–10 
URL:  http://d.repec.org/n?u=RePEc:mea:meawpa:201503&r=ecm 
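The gap between conventional and robust standard errors is easy to demonstrate. In the sketch below (an illustrative DGP, not the paper's application), the error variance rises with the regressor, so conventional standard errors understate the sampling variability while the White/HC0 sandwich estimator does not.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x = rng.standard_normal(n)
e = rng.standard_normal(n) * np.abs(x)     # error variance rises with x^2
y = 1.0 + 2.0 * x + e

X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y                      # OLS coefficients
u = y - X @ b

# Conventional (homoskedastic) standard errors: s^2 * (X'X)^{-1}
se_conv = np.sqrt(np.diag(XtX_inv) * (u @ u) / (n - 2))

# White/HC0 sandwich: (X'X)^{-1} X' diag(u^2) X (X'X)^{-1}
meat = X.T @ (X * (u ** 2)[:, None])
se_hc0 = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))
print(se_conv[1], se_hc0[1])
```

Here the robust slope standard error is markedly larger than the conventional one, the "too small" direction the abstract describes; reversing the variance pattern produces the upward-biased case the authors analyze.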
By:  Daniel Felix Ahelegbey (Department of Economics, University of Venice Cà Foscari) 
Abstract:  Recent advances in empirical finance have seen a growing interest in the application of network models to analyse contagion, spillover effects and risk propagation channels in the system. While interconnectivity among financial institutions has been widely studied, only a few papers review networks in finance, and they do not focus on the econometric aspects. This paper surveys the state of the art in statistical inference and application of networks from a multidisciplinary perspective, specifically in the context of systemic risk. We contribute to the literature on network econometrics by relating network models to multivariate analysis, with potential applications in econometrics and finance. 
Keywords:  Bayesian inference, Graphical models, Model selection, Systemic risk. 
JEL:  C11 C15 C52 G01 G17 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:ven:wpaper:2015:13&r=ecm 
By:  Laurent Callot (University of Amsterdam and CREATES); Johannes Tang Kristensen (University of Southern Denmark and CREATES) 
Abstract:  This paper shows that the parsimoniously time-varying methodology of Callot and Kristensen (2015) can be applied to factor models. We apply this method to study macroeconomic instability in the US from 1959:1 to 2006:4, with a particular focus on the Great Moderation. Models with parsimoniously time-varying parameters are models with an unknown number of break points at unknown locations. The parameters are assumed to follow a random walk with a positive probability that an increment is exactly equal to zero, so that the parameters do not vary at every point in time. The vector of increments, which is high dimensional by construction and sparse by assumption, is estimated using the Lasso. We apply this method to the estimation of static factor models and factor-augmented autoregressions using a set of 190 quarterly observations of 144 US macroeconomic series from Stock and Watson (2009). We find that the parameters of both models exhibit a higher degree of instability in the period from 1970:1 to 1984:4 relative to the following 15 years. In our setting, the Great Moderation appears as the gradual ending of a period of high structural instability that took place in the 1970s and early 1980s. 
Keywords:  Parsimoniously time-varying parameters, factor models, structural break, Lasso 
JEL:  C01 C13 C32 C38 E32 
Date:  2015–06–01 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201529&r=ecm 
By:  Nathaniel T. Wilcox (Economic Science Institute (Chapman University) and Center for the Economic Analysis of Risk (Georgia State University)) 
Abstract:  I compare the generalization ability, or out-of-sample predictive success, of four probabilistic models of binary discrete choice under risk. One model is the conventional homoscedastic latent index model (the simple logit) that is common in applied econometrics: this model is “context-free” in the sense that its error part is homoscedastic with respect to decision sets. The other three models are also latent index models, but their error part is heteroscedastic with respect to decision sets: in that sense they are “context-dependent” models. Context-dependent models of choice under risk arise from several different theoretical perspectives. Here I consider my own “contextual utility” model (Wilcox 2011), the “decision field theory” model of Busemeyer and Townsend (1993) and the “Blavatskyy-Fishburn” model (Fishburn 1978; Blavatskyy 2014). In a new experiment, all three context-dependent models outperform the context-free model in prediction, and significantly outperform a linear probability model (suggested by contemporary applied practice a la Angrist and Pischke 2009) when the latent preference structure is rank-dependent utility (Quiggin 1982). All of this holds true for function-free estimations of outcome utilities and probability weights as well as parametric estimations. Preoccupation with theories of the deterministic structure of choice under risk, to the exclusion of theories of error, is a mistake. 
Keywords:  risk, discrete choice, probabilistic choice, heteroscedasticity, prediction 
JEL:  C25 C91 D81 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:chu:wpaper:1511&r=ecm 
By:  A. Sayers; J. Heron; A. Smith; C. MacdonaldWallis; M. Gilthorpe; F. Steele; K. Tilling 
Abstract:  There is a growing debate with regard to the appropriate methods of analysis of growth trajectories and their association with prospective dependent outcomes. Using the example of childhood growth and adult blood pressure (BP), we conducted an extensive simulation study to explore four two-stage and two joint modelling methods, and compared their bias and coverage in estimation of the (unconditional) association between birth length and later BP, and the association between growth rate and later BP (conditional on birth length). We show that the two-stage method of using multilevel models to estimate growth parameters and relating these to the outcome gives unbiased estimates of the conditional associations between growth and outcome. Using simulations, we demonstrate that the simple methods resulted in bias in the presence of measurement error, as did the two-stage multilevel method when looking at the total (unconditional) association of birth length with the outcome. The two joint modelling methods gave unbiased results, but using the reinflated residuals led to undercoverage of the confidence intervals. We conclude that either joint modelling or the simpler two-stage multilevel approach can be used to estimate conditional associations between growth and later outcomes, but that only joint modelling is unbiased with nominal coverage for unconditional associations. 
Keywords:  lifecourse epidemiology; joint model; multilevel model; measurement error; growth 
JEL:  C1 
Date:  2014–09 
URL:  http://d.repec.org/n?u=RePEc:ehl:lserod:62246&r=ecm 
By:  Goh, Joel (Stanford University); Bayati, Mohsen (Stanford University); Zenios, Stefanos A. (Stanford University); Singh, Sundeep (Stanford University); Moore, David (Stanford University) 
Abstract:  Cost-effectiveness studies of medical innovations often suffer from data inadequacy. When Markov chains are used as a modeling framework for such studies, this data inadequacy can manifest itself as imprecise estimates for many elements of the transition matrix. In this paper, we study how to compute maximal and minimal values for the discounted value of the chain (with respect to a vector of state-wise costs or rewards) as these uncertain transition parameters jointly vary within a given uncertainty set. We show that these problems are computationally tractable if the uncertainty set has a row-wise structure. Conversely, we prove that if the row-wise structure is relaxed slightly, the problems become computationally intractable (NP-hard). We apply our model to assess the cost-effectiveness of fecal immunochemical testing (FIT), a new screening method for colorectal cancer. Our results show that despite the large uncertainty in FIT's performance, it is highly cost-effective relative to the prevailing screening method of colonoscopy. 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:ecl:stabus:3283&r=ecm 
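For the row-wise case with interval bounds, the inner maximization over each row of the transition matrix is a linear program over a box intersected with the probability simplex, solvable by greedy allocation; nesting it inside value iteration yields the maximal discounted value. The two-state chain, cost vector, and interval bounds below are illustrative assumptions, not the paper's FIT application.

```python
import numpy as np

def worst_row(v, lo, hi):
    """Maximize p @ v over {lo <= p <= hi, p.sum() == 1}: start from the
    lower bounds and greedily pour the remaining mass onto high-v states."""
    p = lo.copy()
    rem = 1.0 - p.sum()
    for j in np.argsort(-v):
        add = min(hi[j] - p[j], rem)
        p[j] += add
        rem -= add
    return p

def robust_discounted_value(c, lo, hi, gamma=0.97, iters=600):
    """Maximal discounted cost of the chain under row-wise interval
    uncertainty, via value iteration on v = c + gamma * max_p P v."""
    v = np.zeros(len(c))
    for _ in range(iters):
        v = c + gamma * np.array([worst_row(v, lo[i], hi[i]) @ v
                                  for i in range(len(c))])
    return v

c = np.array([1.0, 0.0])                  # state-wise costs (illustrative)
lo = np.array([[0.5, 0.3], [0.1, 0.6]])   # row-wise lower bounds
hi = np.array([[0.7, 0.5], [0.4, 0.9]])   # row-wise upper bounds
v_max = robust_discounted_value(c, lo, hi)
mid = (lo + hi) / 2                       # nominal chain (rows sum to 1 here)
v_nom = robust_discounted_value(c, mid, mid)
print(v_max, v_nom)
```

Minimal values follow by pouring mass onto low-v states instead; the paper's contribution is showing this row-wise structure is exactly what keeps the problem tractable.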
By:  Chevillon, Guillaume (ESSEC Business School); Hecq, Alain (Maastricht University (Department of Quantitative Economics)); Laurent, Sébastien (Aix-Marseille University (Aix-Marseille School of Economics)) 
Abstract:  This paper shows that large dimensional vector autoregressive (VAR) models of finite order can generate long memory in the marginalized univariate series. We derive high-level assumptions under which the final equation representation of a VAR(1) leads to univariate fractional white noises and verify the validity of these assumptions for two specific models. We consider the implications of our findings for the variances of asset returns, where the so-called golden rule of realized variances states that they always tend to exhibit fractional integration of a degree close to 0.4. 
Keywords:  Long memory; Vector Autoregressive Model; Marginalization; Final Equation Representation; Volatility 
JEL:  C10 C32 C58 
Date:  2015–06 
URL:  http://d.repec.org/n?u=RePEc:ebg:essewp:dr15007&r=ecm 
By:  Christopher G. Gibbs (School of Economics, UNSW Business School, UNSW) 
Abstract:  This paper proposes a new dynamic forecast combination strategy for forecasting inflation. The procedure draws on explanations of why the forecast combination puzzle exists and the stylized fact that Phillips curve forecasts of inflation exhibit significant time-variation in forecast accuracy. The forecast combination puzzle is the empirical observation that a simple average of point forecasts is often the best forecasting strategy. The forecast combination puzzle exists because many dynamic weighting strategies tend to shift weights toward Phillips curve forecasts after they exhibit a significant period of relative forecast improvement, which is often when their forecast accuracy begins to deteriorate. The proposed strategy in this paper weights forecasts according to their expected performance rather than their past performance to anticipate these changes in forecast accuracy. The forward-looking approach is shown to robustly beat equal-weight combined and benchmark univariate forecasts of inflation in real-time out-of-sample exercises on U.S. and New Zealand inflation data. 
Keywords:  Forecast combination, inflation, forecast pooling, forecast combination puzzle, Phillips curve 
JEL:  E17 E47 C53 
Date:  2015–04 
URL:  http://d.repec.org/n?u=RePEc:swe:wpaper:201509&r=ecm 
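For contrast with the paper's forward-looking weights, the sketch below implements the standard backward-looking scheme the combination puzzle concerns: rolling inverse-MSE weights versus a simple average, on an illustrative simulated pair of forecasts (all parameters hypothetical).

```python
import numpy as np

rng = np.random.default_rng(7)
T, window = 400, 60
y = rng.standard_normal(T)                  # target series (illustrative)
f1 = y + 0.5 * rng.standard_normal(T)       # more accurate forecaster
f2 = y + 1.5 * rng.standard_normal(T)       # noisier forecaster
F = np.column_stack([f1, f2])

eq = F.mean(axis=1)                         # equal-weight combination

# Rolling inverse-MSE weights based on past performance only
comb = np.full(T, np.nan)
for t in range(window, T):
    mse = ((F[t - window:t] - y[t - window:t, None]) ** 2).mean(axis=0)
    w = (1.0 / mse) / (1.0 / mse).sum()
    comb[t] = F[t] @ w

mse_eq = ((eq[window:] - y[window:]) ** 2).mean()
mse_w = ((comb[window:] - y[window:]) ** 2).mean()
print(mse_eq, mse_w)
```

With independent errors of very different variances, performance weighting wins here; the puzzle arises when accuracy rankings shift over time, which is exactly the instability the paper's expected-performance weights are designed to anticipate.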
By:  Christopher F Baum (Boston College; DIW Berlin); Hans Lööf (Royal Institute of Technology, Stockholm); Pardis Nabavi (Royal Institute of Technology, Stockholm); Andreas Stephan (Jönköping International Business School) 
Abstract:  We evaluate a Generalized Structural Equation Model (GSEM) approach to the estimation of the relationship between R&D, innovation and productivity that focuses on the potentially crucial heterogeneity across technology and knowledge levels. The model accounts for selectivity and handles the endogeneity of this relationship in a recursive framework. Employing a panel of Swedish firms observed in three consecutive Community Innovation Surveys, our maximum likelihood estimates show that many key channels of influence among the model's components differ meaningfully in their statistical significance and magnitude across sectors defined by different technology levels. 
Keywords:  R&D, Innovation, Productivity, Generalized Structural Equation Model, Community Innovation Survey 
JEL:  C23 L6 O32 O52 
Date:  2015–05–29 
URL:  http://d.repec.org/n?u=RePEc:boc:bocoec:876&r=ecm 