
on Econometrics 
By:  Harris, D; Kew, H; Taylor, AMR 
Abstract:  In this paper we contribute to two separate literatures. Our principal contribution is made to the literature on break fraction estimation. Here we investigate the properties of a class of weighted residual sum of squares estimators for the location of a level break in time series whose shocks display nonstationary volatility (permanent changes in unconditional volatility). This class contains the ordinary least squares (OLS) and weighted least squares (WLS) estimators, the latter based on the true volatility process. For fixed magnitude breaks we show that the estimator attains the same consistency rate under nonstationary volatility as under homoskedasticity. We also provide local limiting distribution theory for the estimator when the break magnitude is either local-to-zero at some rate in the sample size or exactly zero. The former includes the Pitman drift rate, which is shown via Monte Carlo experiments to predict well the key features of the finite sample behaviour of the OLS estimator and a feasible version of the WLS estimator based on an adaptive estimate of the volatility path of the shocks. The simulations highlight the importance of the break location, break magnitude, and the form of nonstationary volatility for the finite sample performance of these estimators, and show that the feasible WLS estimator can deliver significant improvements over the OLS estimator in certain heteroskedastic environments. We also contribute to the unit root testing literature. We demonstrate how the results in the first part of the paper can be applied, by using level break fraction estimators on the first differences of the data, when testing for a unit root in the presence of trend breaks and/or nonstationary volatility. 
In practice it will be unknown whether a trend break is present and so we also discuss methods to select between the break and no break cases, considering both standard information criteria and feasible weighted information criteria based on our adaptive volatility estimator. Simulation evidence suggests that the use of these feasible weighted estimators and information criteria can deliver unit root tests with significantly improved finite sample behaviour under heteroskedasticity relative to their unweighted counterparts. 
Keywords:  Level break fraction, nonstationary volatility, adaptive estimation, feasible weighted estimator, information criteria, unit root tests and trend breaks. 
Date:  2017–09 
URL:  http://d.repec.org/n?u=RePEc:esy:uefcwp:20329&r=ecm 
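The OLS member of this class of estimators is easy to sketch numerically. The following is a minimal illustration, not the paper's design: a simulated mean-shift series with a mid-sample volatility increase, and the break fraction estimated by minimising the residual sum of squares over trimmed candidate break dates.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a level break at fraction 0.6 of the sample, with a one-off
# upward shift in shock volatility (a simple form of nonstationary volatility).
T = 400
t = np.arange(T)
mu = np.where(t < 0.6 * T, 0.0, 3.0)     # level break of magnitude 3
sigma = np.where(t < T // 2, 1.0, 2.0)   # volatility doubles mid-sample
y = mu + sigma * rng.standard_normal(T)

def ols_break_fraction(y, trim=0.15):
    """OLS estimator of the level-break fraction: the candidate break
    date minimising the residual sum of squares of a two-regime mean fit."""
    n = len(y)
    candidates = range(int(trim * n), int((1 - trim) * n))
    def ssr(k):
        return ((y[:k] - y[:k].mean()) ** 2).sum() + ((y[k:] - y[k:].mean()) ** 2).sum()
    k_hat = min(candidates, key=ssr)
    return k_hat / n

tau_hat = ols_break_fraction(y)
```

A feasible WLS variant would first estimate the volatility path adaptively and reweight the residuals before the same grid search; the OLS version above is the simplest member of the class.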
By:  KUROZUMI, Eiji 
Abstract:  This paper proposes constructing a confidence set for the date of a structural change at the end of a sample in a mean shift model. While the break fraction, the ratio of the number of observations before the break to the sample size, is typically assumed to take a value in the (0, 1) open interval, we consider the case where a permissible break date is included in a fixed number of observations at the end of the sample and thus the break fraction approaches one as the sample size goes to infinity. We propose inverting the test for the break date to construct a confidence set, while critical values are obtained by using the subsampling method. By using Monte Carlo simulations, we show that the confidence set proposed in this paper can control the coverage rate in finite samples well, while the average length of the confidence set is comparable to existing methods based on asymptotic theory with a fixed break fraction in the (0, 1) interval. 
Keywords:  structural change, coverage rate, subsampling method 
JEL:  C12 C15 C22 
Date:  2017–09 
URL:  http://d.repec.org/n?u=RePEc:hit:econdp:201706&r=ecm 
By:  Christian Murray (University of Houston); Juan Urquiza (Pontificia Universidad Católica de Chile) 
Abstract:  Over the last decade, applied researchers have estimated forward-looking Taylor rules with interest rate smoothing via Nonlinear Least Squares. A common empirical finding for post-Volcker samples, based on asymptotic theory, is that the Federal Reserve adheres to the Taylor Principle. We explore the possibility of weak identification and spurious inference in estimated Taylor rule regressions with interest rate smoothing. We argue that the presence of smoothing subjects the parameters of interest to the Zero Information Limit Condition analyzed by Nelson and Startz (2007, Journal of Econometrics). We demonstrate that confidence intervals based on standard methods such as the delta method can have severe coverage problems when interest rate smoothing is persistent. We then demonstrate that alternative methodologies such as Fieller (1940, 1954), Krinsky and Robb (1986), and the Anderson-Rubin (1949) test have better finite sample coverage. We reconsider the results of four recent empirical studies and show that the evidence supporting the Taylor Principle can be reversed over half of the time. 
Keywords:  Spurious Inference; Zero-Information-Limit-Condition; Interest Rate Smoothing; Nonlinear Least Squares. 
JEL:  C12 C22 E52 
Date:  2017–09–03 
URL:  http://d.repec.org/n?u=RePEc:hou:wpaper:201727409&r=ecm 
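The gap between delta-method and Fieller intervals under weak identification is easy to reproduce for a simple ratio of two estimates. The numbers below are illustrative, not taken from any of the cited studies: when the denominator estimate is small relative to its noise, the Fieller interval (obtained by inverting a quadratic in the ratio) is far wider than the delta-method interval, which understates the uncertainty.

```python
import math

# Point estimates for numerator a and denominator b of the ratio theta = a/b,
# with variances va, vb and zero covariance (illustrative numbers only).
a, va = 1.0, 0.04
b, vb = 0.5, 0.04   # b is small relative to its standard error: weak identification
z = 1.96            # 95% normal critical value

# Delta-method interval for a/b
theta = a / b
var_delta = va / b**2 + (a**2 / b**4) * vb
d_lo, d_hi = theta - z * math.sqrt(var_delta), theta + z * math.sqrt(var_delta)

# Fieller interval: invert (b^2 - z^2 vb) th^2 - 2 a b th + (a^2 - z^2 va) <= 0
A = b**2 - z**2 * vb
B = -2.0 * a * b
C = a**2 - z**2 * va
disc = math.sqrt(B**2 - 4.0 * A * C)
f_lo, f_hi = (-B - disc) / (2.0 * A), (-B + disc) / (2.0 * A)
```

With these numbers the delta interval is roughly [0.25, 3.75] while the Fieller interval stretches to about 9.45 on the upper side, illustrating why the delta method can badly understate coverage when smoothing is persistent.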
By:  Elena Di Bernardino (CEDRIC  Centre d'Etude et De Recherche en Informatique du Cnam  CNAM  Conservatoire National des Arts et Métiers [CNAM]); Didier Rullière (SAF  Laboratoire de Sciences Actuarielle et Financière  UCBL  Université Claude Bernard Lyon 1) 
Abstract:  The class of multivariate Archimedean copulas is defined by using a real-valued function called the generator of the copula. This generator satisfies some properties, including d-monotony. We propose here a new basic transformation of this generator, preserving these properties, thus ensuring the validity of the transformed generator and inducing a proper valid copula. This transformation acts only on a specific portion of the generator; it allows both the non-reduction of the likelihood on a given dataset and the choice of the upper tail dependence coefficient of the transformed copula. Numerical illustrations show the utility of this construction, which can improve the fit of a given copula both in its central part and in its tail. 
Keywords:  transformations, Archimedean copulas, distortions, tail dependence coefficients, likelihood 
Date:  2017–03–03 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01347869&r=ecm 
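A minimal sketch of the generator construction, using the standard Gumbel generator rather than the authors' transformation: any valid d-monotone generator ψ defines an Archimedean copula C(u, v) = ψ(ψ⁻¹(u) + ψ⁻¹(v)), and the upper tail dependence coefficient can be checked numerically against its closed form.

```python
import math

# Gumbel generator psi(t) = exp(-t^(1/theta)) and its inverse.
theta = 2.0
psi = lambda t: math.exp(-t ** (1.0 / theta))
psi_inv = lambda u: (-math.log(u)) ** theta

def copula(u, v):
    # Bivariate Archimedean copula built from the generator.
    return psi(psi_inv(u) + psi_inv(v))

# Numerical upper tail dependence coefficient:
# lambda_U = lim_{u->1} (1 - 2u + C(u, u)) / (1 - u);
# for the Gumbel copula it equals 2 - 2^(1/theta).
u = 1.0 - 1e-5
lam_numeric = (1.0 - 2.0 * u + copula(u, u)) / (1.0 - u)
lam_exact = 2.0 - 2.0 ** (1.0 / theta)
```

The paper's transformation replaces ψ on a portion of its domain while preserving validity, which shifts lambda_U without disturbing the fit in the central part; the boundary check `copula(u, 1) = u` holds for any valid generator.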
By:  Houllier, Melanie (The London Institute for Banking and Finance); Murphy, David (Bank of England) 
Abstract:  The advent of mandatory central clearing for certain types of over-the-counter derivatives and margin requirements for others means that margin is the most important mitigation mechanism for many counterparty credit risks. Initial margin requirements are typically calculated using risk-based margin models, and these models must be tested to ensure that they are prudent. However, two different margin models can calculate substantially different levels of margin yet both pass the usual tests. This paper presents a new approach to parameter selection based on the statistical properties of the worst loss over a margin period of risk estimated by the margin model under test. This measure is related to risk estimated at a fixed confidence level yet leads to a more powerful test which is better able to justify the choice of parameters used in margin models. The proposed test is applied to a variety of volatility estimation techniques on a long history of returns of the S&P 500 index. Well-known techniques, including exponentially weighted moving average volatility estimation and generalised autoregressive conditional heteroskedasticity approaches, are considered, and novel approaches derived from signal processing are also analysed. In each case a range of model parameters which give rise to acceptable risk estimates is identified. 
Keywords:  Conditional volatility; filtered volatility; GARCH(1,1); initial margin model; model backtesting; volatility estimation 
JEL:  C12 C52 G13 
Date:  2017–09–01 
URL:  http://d.repec.org/n?u=RePEc:boe:boeewp:0673&r=ecm 
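A toy version of the volatility-filter-plus-margin pipeline, assuming an EWMA filter, a 99% normal quantile, and i.i.d. simulated returns — none of which are the paper's actual backtest design:

```python
import numpy as np

rng = np.random.default_rng(1)
r = 0.01 * rng.standard_normal(10_000)   # simulated daily returns, 1% volatility

def ewma_margin(r, lam=0.97, z=2.33, seed_var=1e-4):
    """One-day 99% initial margin from an EWMA volatility filter:
    sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2."""
    sigma2 = np.empty(len(r))
    sigma2[0] = seed_var
    for t in range(1, len(r)):
        sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * r[t - 1] ** 2
    return z * np.sqrt(sigma2)

margin = ewma_margin(r)
breaches = (-r > margin)             # losses exceeding the posted margin
breach_rate = breaches[100:].mean()  # drop the filter's burn-in period
```

A simple backtest then asks whether the breach rate is consistent with the 1% target; the paper's contribution is a more powerful test built on the distribution of the worst loss over the margin period of risk rather than a single breach count.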
By:  Bibi, Abdelouahab; Ghezal, Ahmed 
Abstract:  In this paper, we propose a natural extension of time-invariant coefficient threshold GARCH (TGARCH) processes to periodically time-varying coefficient (PTGARCH) ones. Some theoretical probabilistic properties of such models are discussed; in particular, we first establish necessary and sufficient conditions which ensure the strict stationarity and ergodicity (in the periodic sense) of the solution of PTGARCH. Second, we extend the standard results for the limit theory of the popular quasi-maximum likelihood estimator (QMLE) for estimating the unknown parameters of the model. More precisely, the strong consistency and the asymptotic normality of the QMLE are studied both when the innovation process is i.i.d. (strong case) and when it is not (semi-strong case). The finite-sample properties of the QMLE are illustrated by a Monte Carlo study. Our proposed model is applied to the exchange rates of the Algerian Dinar against the U.S. dollar and the single European currency (Euro). 
Keywords:  Periodic asymmetric GARCH model, Stationarity, Strong consistency, Asymptotic normality. 
JEL:  C13 
Date:  2017–09–04 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:81126&r=ecm 
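A simulation sketch of a periodic threshold GARCH recursion with period 2; the coefficient values are purely illustrative and chosen to satisfy the usual stationarity condition, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# Periodic TGARCH(1,1) with period S = 2:
# sigma2_t = omega[s] + ap[s]*max(e,0)^2 + am[s]*min(e,0)^2 + beta[s]*sigma2_{t-1},
# where s = t mod S. Coefficients vary periodically; am > ap gives the
# threshold asymmetry (negative shocks raise volatility more).
S = 2
omega = np.array([0.05, 0.10])
ap    = np.array([0.05, 0.08])   # reaction to positive shocks
am    = np.array([0.15, 0.20])   # larger reaction to negative shocks
beta  = np.array([0.70, 0.65])

T = 20_000
eps = np.empty(T)
sigma2 = np.empty(T)
sigma2[0] = omega[0] / (1 - beta[0])
eps[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
for t in range(1, T):
    s = t % S
    e = eps[t - 1]
    sigma2[t] = (omega[s] + ap[s] * max(e, 0.0) ** 2
                 + am[s] * min(e, 0.0) ** 2 + beta[s] * sigma2[t - 1])
    eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
```

In a long simulation the asymmetry is visible directly: average conditional variance following negative shocks exceeds that following positive shocks.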
By:  Aufenanger, Tobias 
Abstract:  This paper proposes a way of using observational pretest data for the design of experiments. In particular, it suggests training a random forest on the pretest data and stratifying the allocation of treatments to experimental units on the predicted dependent variables. This approach reduces much of the arbitrariness involved in defining strata directly on the basis of covariates. A simulation on 300 random samples drawn from six data sets shows that this algorithm is extremely effective in increasing power compared to random allocation and to traditional ways of stratification. In more than 80% of all samples the estimated variance of the treatment estimator is lower and the estimated power is higher than for standard designs such as complete randomization, conventional stratification or Mahalanobis matching. 
Keywords:  experiment design,treatment allocation 
Date:  2017 
URL:  http://d.repec.org/n?u=RePEc:zbw:iwqwdp:162017&r=ecm 
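The allocation step can be sketched as follows. A least-squares predictor stands in for the paper's random forest so the example needs only numpy; the stratification logic — sort units by predicted outcome, pair adjacent units, randomise within pairs — is unchanged.

```python
import numpy as np

rng = np.random.default_rng(3)

# Pretest data: covariates X and outcome y for the experimental units.
n, p = 40, 3
X = rng.standard_normal((n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.3 * rng.standard_normal(n)

# Stand-in predictor (the paper trains a random forest here).
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta                      # predicted outcomes

# Stratify on the predictions: sort units by y_hat, form pairs of
# adjacent units, and randomise one unit of each pair into treatment.
order = np.argsort(y_hat)
treat = np.zeros(n, dtype=bool)
for pair in order.reshape(-1, 2):
    treat[rng.choice(pair)] = True
```

Pairing on predicted outcomes balances the treatment and control groups on everything the predictor captures, which is what drives the variance reduction relative to complete randomization.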
By:  Chakraborty, Chiranjit (Bank of England); Joseph, Andreas (Bank of England) 
Abstract:  We introduce machine learning in the context of central banking and policy analyses. Our aim is to give an overview broad enough to allow the reader to place machine learning within the wider range of statistical modelling and computational analyses, and to provide an idea of its scope and limitations. We review the underlying technical sources and the nascent literature applying machine learning to economic and policy problems. We present popular modelling approaches, such as artificial neural networks, tree-based models, support vector machines, recommender systems and different clustering techniques. Important concepts like the bias-variance trade-off, optimal model complexity, regularisation and cross-validation are discussed to enrich the econometrics toolbox in their own right. We present three case studies relevant to central bank policy, financial regulation and economic modelling more widely. First, we model the detection of alerts on the balance sheets of financial institutions in the context of banking supervision. Second, we perform a projection exercise for UK CPI inflation on a medium-term horizon of two years. Here, we introduce a simple training-testing framework for time series analyses. Third, we investigate the funding patterns of technology start-ups with the aim of detecting potentially disruptive innovators in financial technology. Machine learning models generally outperform traditional modelling approaches in prediction tasks, while open research questions remain with regard to their causal inference properties. 
Keywords:  Machine learning; artificial intelligence; big data; econometrics; forecasting; inflation; financial markets; banking supervision; financial technology 
JEL:  A12 A33 C14 C38 C44 C45 C51 C52 C53 C54 C61 C63 C87 E37 E58 G17 Y20 
Date:  2017–09–04 
URL:  http://d.repec.org/n?u=RePEc:boe:boeewp:0674&r=ecm 
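The training-testing idea for time series in the second case study amounts to walk-forward (expanding-window) evaluation: each forecast is made using only data observed before the forecast date. A minimal sketch with an AR(1) stand-in for the inflation model, illustrative only:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated series: AR(1) with coefficient 0.8.
T = 200
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.8 * y[t - 1] + rng.standard_normal()

def walk_forward_forecasts(y, first_test=150):
    """Expanding-window evaluation: at each test date, fit an AR(1) by OLS
    on all observations strictly before it, then forecast one step ahead."""
    preds = []
    for t in range(first_test, len(y)):
        x_tr, y_tr = y[:t - 1], y[1:t]         # lag pairs inside the training window
        phi = (x_tr @ y_tr) / (x_tr @ x_tr)    # OLS slope through the origin
        preds.append(phi * y[t - 1])
    return np.array(preds)

preds = walk_forward_forecasts(y)
rmse = np.sqrt(np.mean((y[150:] - preds) ** 2))
```

The key discipline is that the split respects time ordering: shuffling-based cross-validation would leak future information into the training set.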
By:  Kapetanios, G; Price, SG; Young, G 
Abstract:  A financial conditions index (FCI) is designed to summarise the state of financial markets. We construct two with UK data. The first is the first principal component (PC) of a set of financial indicators. The second comes from a new approach taking information from a large set of macroeconomic variables weighted by the joint covariance with a subset of the financial indicators (a set of spreads), using multivariate partial least squares, again using the first factor. The resulting FCIs are broadly similar. They both have some forecasting power for monthly GDP in a quasi-real-time recursive evaluation over 2011–2014 and outperform an FCI produced by Goldman Sachs. A second factor, which may be interpreted as a monetary conditions index, adds further forecast power, while third factors have a mixed effect on performance. The FCIs are used to improve identification of credit supply shocks in an SVAR. Relative to an SVAR excluding an FCI, the main effects on the (adverse) credit shock IRFs are to make the positive impact on inflation more precise and to reveal an increased positive impact on spreads. 
Keywords:  Forecasting, Financial conditions index, Targeted data reduction, Multivariate partial least squares, Credit shocks 
Date:  2017–08–29 
URL:  http://d.repec.org/n?u=RePEc:esy:uefcwp:20328&r=ecm 
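The first-PC construction of an FCI can be sketched on simulated data; the factor structure and loadings below are illustrative, not the paper's UK dataset:

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulated panel of financial indicators driven by one common factor f
# (spreads, asset-price growth, etc.); loadings and noise are illustrative.
T, N = 300, 8
f = 0.1 * np.cumsum(rng.standard_normal(T))      # persistent common factor
loadings = rng.uniform(0.8, 1.2, N)
X = np.outer(f, loadings) + 0.3 * rng.standard_normal((T, N))

# The FCI is the first principal component of the standardised indicators.
Z = (X - X.mean(0)) / X.std(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
fci = Z @ Vt[0]

corr = np.corrcoef(fci, f)[0, 1]
```

The sign of a principal component is arbitrary, so the recovered index tracks the common factor up to sign; the paper's second construction replaces the PC step with multivariate partial least squares targeted at a set of spreads.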
By:  Donald W.K. Andrews (Cowles Foundation, Yale University) 
Abstract:  This paper introduces identification-robust subvector tests and confidence sets (CS's) that have asymptotic size equal to their nominal size and are asymptotically efficient under strong identification. Hence, inference is as good asymptotically as standard methods under standard regularity conditions, but also is identification robust. The results do not require special structure on the models under consideration, or strong identification of the nuisance parameters, as many existing methods do. We provide general results under high-level conditions that can be applied to moment condition, likelihood, and minimum distance models, among others. We verify these conditions under primitive conditions for moment condition models. In another paper, we do so for likelihood models. The results build on the approach of Chaudhuri and Zivot (2011), who introduce a C(α)-type Lagrange multiplier test and employ it in a Bonferroni subvector test. Here we consider two-step tests and CS's that employ a C(α)-type test in the second step. The two-step tests are closely related to Bonferroni tests, but are not asymptotically conservative and achieve asymptotic efficiency under strong identification. 
Keywords:  Asymptotics, Confidence set, Identificationrobust, Inference, Instrumental variables, Moment condition, Robust, Test 
JEL:  C10 C12 
Date:  2017–09 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:3005&r=ecm 
By:  Kocięcki, Andrzej 
Abstract:  The paper proposes a methodologically sound approach to dealing with set-identified Structural VAR (SVAR) models under zero and sign restrictions. What distinguishes our method from that proposed by Arias, Rubio-Ramírez and Waggoner (2016) is that we isolate many special cases for which we arrive at more efficient algorithms to draw from the posterior. We illustrate our approach with two substantive empirical examples. First, we challenge the output puzzle found by Uhlig (2005). Second, we check the robustness of the results of Beaudry et al. (2014) concerning the impact of optimism shocks on the economy. 
Keywords:  Set-identified Structural VAR, Sign restrictions, Monetary policy, Bayesian 
JEL:  C11 C18 C3 E5 E52 
Date:  2017–08–23 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:81094&r=ecm 
By:  Glaser, Stephanie 
Abstract:  Despite the increasing availability of spatial count data in research areas like technology spillovers, patenting activities, insurance payments, and crime forecasting, specialized models for analysing such data have received little attention in the econometric literature so far. The few existing approaches can be broadly classified into observation-driven models, where the random spatial effects enter the moments of the dependent variable directly, and parameter-driven models, where the random spatial effects are unobservable and induced via a latent process. Moreover, within these groups the modelling approaches (and therefore the interpretation) of spatial effects are quite heterogeneous, stemming in part from the nonlinear structure of count data models. The purpose of this survey is to compare and contrast the various approaches for econometric modelling of spatial counts discussed in the literature. 
Date:  2017 
URL:  http://d.repec.org/n?u=RePEc:zbw:hohdps:192017&r=ecm 
By:  Clark, Todd E. (Federal Reserve Bank of Cleveland); McCracken, Michael W. (Federal Reserve Bank of St. Louis); Mertens, Elmar (Bank for International Settlements) 
Abstract:  We develop uncertainty measures for point forecasts from surveys such as the Survey of Professional Forecasters, Blue Chip, or the Federal Open Market Committee's Summary of Economic Projections. At a given point in time, these surveys provide forecasts for macroeconomic variables at multiple horizons. To track time-varying uncertainty in the associated forecast errors, we derive a multiple-horizon specification of stochastic volatility. Compared to constant-variance approaches, our stochastic-volatility model improves the accuracy of uncertainty measures for survey forecasts. 
Keywords:  Stochastic volatility; survey forecasts; prediction 
JEL:  C32 C53 E47 
Date:  2017–08–28 
URL:  http://d.repec.org/n?u=RePEc:fip:fedlwp:2017026&r=ecm 
By:  Kollmann, Robert 
Abstract:  This paper presents a simple and fast maximum likelihood estimation method for nonlinear DSGE models that are solved using a second (or higher) order accurate approximation. The method requires that the number of observables equals the number of exogenous shocks. Exogenous innovations are extracted recursively by inverting the observation equation, which allows easy computation of the likelihood function. 
Keywords:  Estimation of nonlinear DSGE models; observation equation inversion 
JEL:  C51 C63 C68 E37 
Date:  2017–08 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:12262&r=ecm 
By:  Robert Kollmann 
Abstract:  This paper presents a simple and fast maximum likelihood estimation method for nonlinear DSGE models that are solved using a second (or higher) order accurate approximation. The method requires that the number of observables equals the number of exogenous shocks. Exogenous innovations are extracted recursively by inverting the observation equation, which allows easy computation of the likelihood function. 
Keywords:  Estimation of nonlinear DSGE models, observation equation inversion 
JEL:  C51 C63 C68 E37 
Date:  2017–09 
URL:  http://d.repec.org/n?u=RePEc:een:camaaa:201755&r=ecm 
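The inversion idea can be illustrated on a toy model with one observable and one shock, the paper's required configuration; the quadratic observation equation below is a stand-in for a genuine second-order DSGE approximation:

```python
import math
import numpy as np

rng = np.random.default_rng(2)

# Toy nonlinear state-space model:
#   x_t = rho * x_{t-1} + eps_t,   eps_t ~ N(0, sig^2)
#   y_t = x_t + alpha * x_t^2      (no measurement error)
rho, sig, alpha = 0.9, 0.1, 0.05

T = 200
x = np.zeros(T)
eps_true = sig * rng.standard_normal(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + eps_true[t]
y = x + alpha * x ** 2

def loglik_by_inversion(y, rho, sig, alpha, x0=0.0):
    """Recursively invert the observation equation y = x + alpha x^2 for x_t
    (root closest to y for small alpha), recover eps_t = x_t - rho x_{t-1},
    and accumulate the Gaussian log density plus the Jacobian term."""
    ll, x_prev = 0.0, x0
    for t in range(1, len(y)):
        x_t = (-1.0 + math.sqrt(1.0 + 4.0 * alpha * y[t])) / (2.0 * alpha)
        eps = x_t - rho * x_prev
        jac = 1.0 / (1.0 + 2.0 * alpha * x_t)   # |d eps / d y_t| = |dx/dy|
        ll += (-0.5 * math.log(2 * math.pi * sig**2)
               - 0.5 * (eps / sig) ** 2 + math.log(jac))
        x_prev = x_t
    return ll

ll_true = loglik_by_inversion(y, rho, sig, alpha)
ll_wrong = loglik_by_inversion(y, 0.3, sig, alpha)  # misspecified persistence
```

Because the observation equation is inverted exactly, no filtering approximation is needed, which is what makes the method fast; the likelihood correctly favours the true persistence parameter over a misspecified one.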
By:  Irma Hindrayanto; Jan P.A.M. Jacobs; Denise R. Osborn; Jing Tian 
Abstract:  Economists typically use seasonally adjusted data in which the assumption is imposed that seasonality is uncorrelated with trend and cycle. The importance of this assumption has been highlighted by the Great Recession. The paper examines an unobserved components model that permits nonzero correlations between seasonal and nonseasonal shocks. Identification conditions for estimation of the parameters are discussed from the perspectives of both analytical and simulation results. Applications to UK household consumption expenditures and US employment reject the zero correlation restrictions and also show that the correlation assumptions imposed have important implications for the evolution of the trend and cycle in the post-Great Recession period. 
Keywords:  Trend-cycle-seasonal decomposition, unobserved components, seasonal adjustment, employment, Great Recession 
JEL:  C22 E24 E32 E37 F01 
Date:  2017–08 
URL:  http://d.repec.org/n?u=RePEc:een:camaaa:201757&r=ecm 
By:  Fernando J. Pérez Forero (Central Reserve Bank of Peru) 
Abstract:  The stance of monetary policy is of general interest to academics, policy makers and the private sector. It is not directly observable, since the Fed has used different monetary instruments at different points in time. This paper provides a measure of this stance for the last forty-five years, constructed as a weighted average of a pool of instruments. We extend Bernanke and Mihov (1998)'s Interbank Market model by allowing structural parameters and shock variances to change over time. In particular, we follow the recent work of Canova and Pérez Forero (2015) for estimating non-recursive TVC-VARs with Bayesian methods. The estimated stance measure describes how tight or loose monetary policy was over time and takes into account the uncertainty related to posterior estimates of time-varying parameters. Finally, we show how the monetary transmission mechanism has changed over time, focusing on the period after the Great Recession. 
Keywords:  SVARs, Interbank Market, Operating Procedures, Monetary Policy Stance, Time-varying parameters, Bayesian Methods, Multi-move Metropolis within Gibbs Sampling 
JEL:  C11 E51 E52 E58 
Date:  2017–08 
URL:  http://d.repec.org/n?u=RePEc:apc:wpaper:2017102&r=ecm 