
on Econometrics 
By:  Kruiniger, Hugo 
Abstract:  In this paper we consider two kinds of generalizations of Lancaster's (Review of Economic Studies, 2002) Modified ML estimator (MMLE) for the panel AR(1) model with fixed effects, arbitrary initial conditions, and possibly covariates when the time dimension, T, is fixed. When the autoregressive parameter ρ=1, the limiting modified profile loglikelihood function for this model has a stationary point of inflection and ρ is first-order underidentified but second-order identified. We show that the generalized MMLEs exist w.p.a.1 and are uniquely defined w.p.1 and consistent for any value of ρ≥1. When ρ=1, the rate of convergence of the MMLEs is N^{1/4}, where N is the cross-sectional dimension of the panel. We then develop an asymptotic theory for GMM estimators when one of the parameters is only second-order identified and use this to derive the limiting distributions of the MMLEs, which are generally asymmetric when ρ=1. One kind of generalized MMLE depends on a weight matrix W_{N}, and we show that a suitable choice of W_{N} yields an asymptotically unbiased MMLE. We also show that Quasi LM tests that are based on the modified profile loglikelihood and use its expected rather than observed Hessian, with an additional modification for ρ=1, and confidence regions that are based on inverting these tests have correct asymptotic size in a uniform sense when ρ≤1. Finally, we investigate the finite sample properties of the MMLEs and the QLM test in a Monte Carlo study. 
Keywords:  dynamic panel data, expected Hessian, fixed effects, Generalized Method of Moments (GMM), inflection point, Modified Maximum Likelihood, Quasi LM test, second-order identification, singular information matrix, weak moment conditions. 
JEL:  C11 C12 C13 C23 
Date:  2018–06–16 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:88623&r=ecm 
By:  Bognanni, Mark (Federal Reserve Bank of Cleveland) 
Abstract:  This paper develops a new class of structural vector autoregressions (SVARs) with time-varying parameters, which I call a drifting SVAR (DSVAR). The DSVAR is the first structural time-varying parameter model to allow for internally consistent probabilistic inference under exact- or set-identification, nesting the widely used SVAR framework as a special case. I prove that the DSVAR implies a reduced-form representation, from which structural inference can proceed similarly to the widely used two-step approach for SVARs: beginning with estimation of a reduced form and then choosing among observationally equivalent candidate structural parameters via the imposition of identifying restrictions. In a special case, the implied reduced form is a tractable known model for which I provide the first algorithm for Bayesian estimation of all free parameters. I demonstrate the framework in the context of Baumeister and Peersman's (2013b) work on time variation in the elasticity of oil demand. 
Keywords:  structural vector autoregressions; time-varying parameters; Gibbs sampling; stochastic volatility; Bayesian inference 
JEL:  C11 C15 C32 C52 E3 E4 E5 
Date:  2018–09–11 
URL:  http://d.repec.org/n?u=RePEc:fip:fedcwp:1811&r=ecm 
By:  Knut Are Aastveit (Norges Bank); James Mitchell (Warwick Business School); Francesco Ravazzolo (Free University of Bozen/Bolzano); Herman van Dijk (Erasmus University, Norges Bank) 
Abstract:  Increasingly, professional forecasters and academic researchers in economics present model-based and subjective or judgment-based forecasts that are accompanied by some measure of uncertainty. In its most complete form this measure is a probability density function for future values of the variables of interest. At the same time, combinations of forecast densities are being used in order to integrate information coming from several sources such as experts, models and large microdata sets. Given this increased relevance of forecast density combinations, the genesis and evolution of this approach, both inside and outside economics, is explored. A fundamental density combination equation is specified, which shows that various frequentist as well as Bayesian approaches give different specific contents to this density. In its simplest case, it is a restricted finite mixture, giving fixed equal weights to the various individual densities. The specification of the fundamental density combination has been made more flexible in the recent literature: it has evolved from using simple average weights to optimized weights and then to `richer' procedures that allow for time variation, learning features and model incompleteness. The recent history and evolution of forecast density combination methods, together with their potential and benefits, are illustrated in the policy-making environment of central banks. 
Keywords:  Forecasting; Model Uncertainty; Density Combinations 
JEL:  C10 C11 
Date:  2018–09–02 
URL:  http://d.repec.org/n?u=RePEc:tin:wpaper:20180069&r=ecm 
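The restricted finite mixture with fixed equal weights mentioned in the abstract above can be sketched in a few lines. This is a hypothetical illustration: the Gaussian component densities and their parameters are invented for the example, not taken from the paper.

```python
import numpy as np

def combine_densities(component_pdfs, weights=None):
    """Pool forecast densities into a finite mixture; default is fixed equal weights."""
    n = len(component_pdfs)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    assert np.all(w >= 0) and np.isclose(w.sum(), 1.0)
    return lambda y: sum(wi * p(y) for wi, p in zip(w, component_pdfs))

def normal_pdf(mu, sigma):
    # Illustrative Gaussian component density.
    return lambda y: np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Two "expert" forecast densities pooled with fixed equal weights.
pool = combine_densities([normal_pdf(0.0, 1.0), normal_pdf(1.0, 2.0)])
```

Richer schemes in the literature replace the fixed weights with optimized or time-varying ones; only the `weights` argument would change here.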
By:  Martin Burda; Remi Daviet 
Abstract:  Practical use of nonparametric Bayesian methods requires the availability of efficient algorithms for posterior inference. The inherently serial nature of Markov Chain Monte Carlo (MCMC) imposes limitations on its efficiency and scalability. In recent years there has been a surge of research activity devoted to developing alternative implementation methods that target parallel computing environments. Sequential Monte Carlo (SMC), also known as the particle filter, has been gaining popularity due to its desirable properties. SMC uses a genetic mutation-selection sampling approach with a set of particles representing the posterior distribution of a stochastic process. We propose to enhance the performance of SMC by utilizing Hamiltonian transition dynamics in the particle transition phase, in place of the random-walk dynamics used in the previous literature. We call the resulting procedure Hamiltonian Sequential Monte Carlo (HSMC). Hamiltonian transition dynamics have been shown to yield superior mixing and convergence properties relative to random-walk transition dynamics in the context of MCMC procedures. The rationale behind HSMC is to translate such gains to the SMC environment. We apply both SMC and HSMC to a panel discrete choice model with a nonparametric distribution of unobserved individual heterogeneity. We contrast both methods in terms of convergence properties and show the favorable performance of HSMC. 
Keywords:  Particle filtering, Bayesian nonparametrics, mixed panel logit, discrete choice 
JEL:  C11 C14 C15 C23 C25 
Date:  2018–09–12 
URL:  http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa618&r=ecm 
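As a rough illustration of the reweight-resample-mutate cycle that SMC builds on, the sketch below targets a standard normal and uses a single random-walk Metropolis mutation; the Hamiltonian dynamics that distinguish HSMC would replace that mutation step. All distributional choices are illustrative, not the paper's panel logit application.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: a standard normal log-density (up to a constant).
def log_target(x):
    return -0.5 * x ** 2

# 1) Initialize particles from a diffuse proposal and importance-weight them.
N = 5000
particles = rng.normal(0.0, 3.0, size=N)
log_w = log_target(particles) - (-0.5 * (particles / 3.0) ** 2 - np.log(3.0))

# 2) Normalize the weights and resample to equalize them.
w = np.exp(log_w - log_w.max())
w /= w.sum()
particles = particles[rng.choice(N, size=N, p=w)]

# 3) Mutation phase: one random-walk Metropolis move per particle.
#    (HSMC replaces this step with a Hamiltonian trajectory.)
prop = particles + rng.normal(0.0, 0.5, size=N)
accept = np.log(rng.uniform(size=N)) < log_target(prop) - log_target(particles)
particles = np.where(accept, prop, particles)
```

The mutation step is where mixing gains accrue: a Hamiltonian trajectory can move particles much further per iteration than a local random-walk step.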
By:  Ross Doppelt (Penn State); Keith O'Hara (New York University) 
Abstract:  We introduce a new method for Bayesian estimation of fractionally integrated vector autoregressions (FIVARs). The FIVAR, which nests a standard VAR as a special case, allows each series to exhibit long memory, meaning that low frequencies can play a dominant role, a salient feature of many macroeconomic and financial time series. Although the parameter space is typically high-dimensional, our inferential procedure is computationally tractable and relatively easy to implement. We apply our methodology to the identification of technology shocks, an empirical problem in which business-cycle predictions depend on carefully accounting for low-frequency fluctuations. 
Date:  2018 
URL:  http://d.repec.org/n?u=RePEc:red:sed018:1212&r=ecm 
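The long memory in a FIVAR enters through fractional differencing. As a minimal illustration of the mechanics (not the authors' estimation procedure), the weights of the fractional-difference operator (1 - L)^d can be computed with a simple recursion:

```python
import numpy as np

def frac_diff_weights(d, n):
    """First n coefficients of (1 - L)^d: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply (1 - L)^d to a series, truncating the filter at the sample start."""
    w = frac_diff_weights(d, len(x))
    return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])
```

For d = 1 this reduces to the ordinary first difference; for fractional d the weights decay hyperbolically, which is exactly the slow low-frequency decay that gives long memory.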
By:  Den Haan, Wouter; Drechsel, Thomas 
Abstract:  Exogenous random structural disturbances are the main driving force behind fluctuations in most business cycle models, and typically a wide variety is used. This paper documents that a minor misspecification of the structural disturbances can lead to large distortions in parameter estimates and implied model properties, such as impulse response functions with a wrong shape and even an incorrect sign. We propose a novel concept, the agnostic structural disturbance (ASD), that can be used both to detect and to correct for misspecification of the structural disturbances. In contrast to regular disturbances and wedges, ASDs do not impose additional restrictions on policy functions. When applied to the Smets-Wouters (SW) model, we find that its risk-premium disturbance and its investment-specific productivity disturbance are rejected in favor of our ASDs. While agnostic in nature, studying the estimated associated coefficients and the impulse response functions of these ASDs allows us to interpret them economically as a risk-premium/preference and an investment-specific productivity type disturbance as in SW, but our results indicate that they enter the model quite differently than the original SW disturbances. Our procedure also selects an additional wage markup disturbance that is associated with increased capital efficiency. 
Keywords:  DSGE; full-information model estimation; structural disturbances 
JEL:  C13 C52 E30 
Date:  2018–08 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:13145&r=ecm 
By:  HAFNER Christian, (CORE and ISBA, UCLouvain); HERWARTZ Helmut, (University of Goettingen); MAXAND Simone, (University of Helsinki) 
Abstract:  Multivariate GARCH models are widely used to model the volatility and correlation dynamics of financial time series. These models are typically silent about the transmission of implied orthogonalized shocks to vector returns. We propose a loss statistic to discriminate in a data-driven way between alternative structural assumptions about the transmission scheme. In its structural form, a four-dimensional system comprising US and Latin American stock market returns points to a substantial volatility transmission from the US to the Latin American markets. The identified structural model improves the estimation of classical measures of portfolio risk, as well as corresponding variations. 
Keywords:  structural innovations; identifying assumptions; MGARCH; portfolio risk; volatility transmission 
JEL:  C32 G15 
Date:  2018–07–25 
URL:  http://d.repec.org/n?u=RePEc:cor:louvco:2018020&r=ecm 
By:  Kengo Kato; Yuya Sasaki; Takuya Ura 
Abstract:  This paper studies the nonparametric inference problem for the probability density function of a latent variable in the measurement error model with repeated measurements. We construct a system of linear complex-valued moment restrictions based on Kotlarski's identity, and then establish a confidence band for the density of the latent variable. Our confidence band controls the asymptotic size uniformly over a class of data generating processes, and it is consistent against all fixed alternatives. Simulation studies support our theoretical results. 
Date:  2018–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1808.09375&r=ecm 
By:  Abel Brodeur (Department of Economics, University of Ottawa, Ottawa, ON); Nikolai Cook (Department of Economics, University of Ottawa, Ottawa, ON); Anthony Heyes (Department of Economics, University of Ottawa, Ottawa, ON, and University of Sussex) 
Abstract:  The economics 'credibility revolution' has promoted the identification of causal relationships using difference-in-differences (DID), instrumental variables (IV), randomized control trials (RCT) and regression discontinuity design (RDD) methods. The extent to which a reader should trust claims about the statistical significance of results proves very sensitive to the method used. Applying multiple methods to 13,440 hypothesis tests reported in 25 top economics journals in 2015, we show that selective publication and p-hacking are a substantial problem in research employing DID and (in particular) IV. RCT and RDD are much less problematic. Almost 25% of claims of marginally significant results in IV papers are misleading. 
Keywords:  Research methods, causal inference, p-curves, p-hacking, publication bias. 
JEL:  A11 B41 C13 C44 
Date:  2018 
URL:  http://d.repec.org/n?u=RePEc:ott:wpaper:1809e&r=ecm 
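A simple way to see the kind of evidence such studies exploit is a caliper-style comparison of the mass of test statistics just below and just above the 1.96 threshold. The sketch below uses simulated data and an invented 'nudging' mechanism purely for illustration; it is not the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

def caliper_counts(z_stats, threshold=1.96, width=0.25):
    """Count |z|-statistics just below vs. just above a significance cutoff.
    A surplus just above the cutoff is the telltale pattern of p-hacking
    and selective publication."""
    z = np.abs(np.asarray(z_stats, dtype=float))
    below = int(np.sum((z >= threshold - width) & (z < threshold)))
    above = int(np.sum((z >= threshold) & (z < threshold + width)))
    return below, above

# Honest z-statistics vs. a 'hacked' set in which values just short of 1.96
# have been nudged over the line (an invented mechanism, for illustration).
honest = rng.normal(0.0, 2.0, size=100_000)
hacked = np.where((np.abs(honest) > 1.6) & (np.abs(honest) < 1.96),
                  np.sign(honest) * 2.0, honest)
```

For the honest draws the counts on either side of 1.96 are similar; for the hacked set the mass just below the cutoff vanishes while the mass just above it balloons.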
By:  Eric Beutner; Alexander Heinemann; Stephan Smeekes 
Abstract:  This paper proposes a fixed-design residual bootstrap method for the two-step estimator of Francq and Zakoïan (2015) associated with the conditional Value-at-Risk. The bootstrap's consistency is proven under mild assumptions for a general class of volatility models, and bootstrap intervals are constructed for the conditional Value-at-Risk to quantify the uncertainty induced by estimation. A large-scale simulation study reveals that the coverage of the equal-tailed percentile interval based on the fixed-design residual bootstrap tends to fall short of its nominal value. In contrast, the reversed-tails interval based on the fixed-design residual bootstrap yields accurate coverage. In the simulation study we also consider the recursive-design bootstrap. It turns out that the recursive-design and the fixed-design bootstrap perform equally well in terms of average coverage, yet in smaller samples the fixed-design scheme leads on average to shorter intervals. An empirical application illustrates interval estimation using the fixed-design residual bootstrap. 
Date:  2018–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1808.09125&r=ecm 
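The flavor of a fixed-design residual bootstrap can be conveyed with a deliberately simplified scale model. The paper treats a general class of volatility models; the i.i.d.-scale model, sample sizes and distributions below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(2)

def var_two_step(r, alpha=0.05):
    """Two-step VaR for the toy scale model r_t = sigma * z_t:
    step 1 estimates sigma, step 2 takes an empirical quantile of the
    rescaled residuals."""
    sigma = r.std()
    z = r / sigma
    return -sigma * np.quantile(z, alpha)

# Simulated returns (illustrative only).
r = 0.01 * rng.standard_t(df=6, size=1000)
var_hat = var_two_step(r)

# Fixed-design residual bootstrap: hold the estimated scale fixed when
# generating bootstrap samples, resampling only the residuals.
sigma_hat = r.std()
z_hat = r / sigma_hat
B = 500
boot = np.array([var_two_step(sigma_hat * rng.choice(z_hat, size=len(r)))
                 for _ in range(B)])

# Equal-tailed percentile interval for the VaR.
lo, hi = np.quantile(boot, [0.025, 0.975])
```

A recursive-design scheme would instead feed each bootstrap draw back into the volatility recursion; in this i.i.d. toy model the two coincide, which is precisely why the real comparison requires a genuine GARCH-type model.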
By:  Carol Alexander; Emese Lazar; Silvia Stanescu 
Abstract:  Conditional returns distributions generated by a GARCH process, which are important for many applications in market risk assessment and portfolio optimization, are typically generated via simulation. This paper extends previous research on analytic moments of GARCH returns distributions in several ways: we consider a general GARCH model, the GJR specification with a generic innovation distribution; we derive analytic expressions for the first four conditional moments of the forward return, of the forward variance, of the aggregated return and of the aggregated variance, with the corresponding moments for some specific GARCH models widely used in practice recovered as special cases; we derive the limits of these moments as the time horizon increases, establishing regularity conditions for the moments of aggregated returns to converge to normal moments; and we demonstrate empirically that some excellent approximate predictive distributions can be obtained from these analytic moments, thus precluding the need for time-consuming simulations. 
Date:  2018–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1808.09666&r=ecm 
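The limiting behavior of conditional moments as the horizon grows can be illustrated with the textbook GARCH(1,1) variance forecast, a simple special case of the GJR family treated in the paper: the forecast decays geometrically toward the unconditional variance.

```python
def garch11_variance_forecast(omega, alpha, beta, sigma2_next, h):
    """h-step-ahead conditional variance E_t[sigma^2_{t+h}] of a GARCH(1,1).

    The forecast decays geometrically at rate (alpha + beta) toward the
    unconditional variance omega / (1 - alpha - beta), provided
    alpha + beta < 1.
    """
    persistence = alpha + beta
    sigma2_bar = omega / (1.0 - persistence)
    return sigma2_bar + persistence ** (h - 1) * (sigma2_next - sigma2_bar)
```

For h = 1 the forecast is just next period's variance; as h grows it converges to the unconditional variance, mirroring the paper's result that aggregated-return moments approach normal moments at long horizons.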
By:  HAFNER Christian, (CORE and ISBA, UCLouvain) 
Abstract:  The recent evolution of cryptocurrencies has been characterized by bubble-like behavior and extreme volatility. While it is difficult to assign an intrinsic value to a specific cryptocurrency, one can employ recently proposed bubble tests that rely on recursive applications of classical unit root tests. This paper extends this approach to the case where volatility is time-varying, assuming a deterministic long-run component that may take into account a decrease of unconditional volatility as the cryptocurrency matures and gains wider market dissemination. Volatility also includes a stochastic short-run component to capture volatility clustering. The wild bootstrap is shown to correctly adjust the size properties of the bubble test, which retains good power properties. In an empirical application using eleven of the largest cryptocurrencies and the CRIX index, the general evidence in favor of bubbles is confirmed, but is much less pronounced than under constant volatility. 
Keywords:  cryptocurrencies; speculative bubbles; wild bootstrap; volatility 
JEL:  C14 C43 Z11 
Date:  2018–07–25 
URL:  http://d.repec.org/n?u=RePEc:cor:louvco:2018019&r=ecm 
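The core of the wild bootstrap is to multiply each residual by an external mean-zero, unit-variance multiplier, which preserves the heteroskedasticity pattern across replications. A minimal Rademacher-multiplier sketch follows; the residual series is simulated for illustration and is not from the paper's application.

```python
import numpy as np

rng = np.random.default_rng(3)

def wild_bootstrap_residuals(resid, B, rng):
    """Wild bootstrap: multiply each residual by an independent Rademacher
    sign, preserving the heteroskedasticity pattern in every replication."""
    signs = rng.choice([-1.0, 1.0], size=(B, len(resid)))
    return signs * resid

# Illustrative residual series with deterministically time-varying volatility.
t = np.arange(500)
resid = rng.normal(size=500) * (1.0 + np.sin(t / 50.0) ** 2)
star = wild_bootstrap_residuals(resid, B=200, rng=rng)
```

Because only the signs are randomized, every bootstrap replication inherits the original volatility profile, which is what makes the scheme robust to time-varying volatility in the bubble test.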
By:  Josselin Garnier; Knut Solna 
Abstract:  Oil price data have a complicated multiscale structure that may vary with time. We use time-frequency analysis to identify the main features of these variations and, in particular, the regime shifts. The analysis is based on a wavelet-based decomposition and analysis of the associated scale spectrum. The joint estimation of the local Hurst exponent and the volatility is the key to detecting and identifying regime shifting and switching of the oil price. In particular, the framework involves modeling in terms of a process of `multifractional' type, so that both the roughness and the volatility of the price process may vary with time. Special epochs then emerge as a result of these degrees of freedom and, moreover, as a result of the special type of spectral estimator used. These special epochs are discussed and related to historical events; some of them are not detected by standard analysis based on maximum likelihood estimation. The paper presents a novel algorithm for robust detection of such special epochs and multifractional behavior in financial or other types of data. In the financial context, insight about such behavior of the asset price is important for evaluating financial contracts involving the asset. 
Date:  2018–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1808.09382&r=ecm 
By:  Damjana Kokol Bukov\v{s}ek; Toma\v{z} Ko\v{s}ir; Bla\v{z} Moj\v{s}kerc; Matja\v{z} Omladi\v{c} 
Abstract:  When choosing the right copula for our data, a key point is to identify the family that describes the data best. In this respect, a better choice of copula can be guided by information about the (non)symmetry of the data. Exchangeability as a probability concept (first next to independence) has been studied since the 1930's, copulas have been studied since the 1950's, and even the most important class of copulas from the point of view of applications, i.e. the ones arising from shock models such as Marshall's copulas, have been studied since the 1960's. However, the question of non-exchangeability of copulas was brought up only in 2006 and has been intensively studied ever since. One of the main contributions of this paper is the maximal asymmetry function for a family of copulas. We compute this function for the major families of shock-based copulas, i.e. Marshall, maxmin and reflected maxmin (RMM for short) copulas, and also for some other important families. We compute the sharp bound of the asymmetry measure $\mu_\infty$, the most important of the asymmetry measures, for the family of Marshall copulas and the family of maxmin copulas; for both families it equals $\frac{4}{27}\ (\approx 0.148)$. One should compare this bound to the one for the class of PQD copulas to which they belong, which is $3-2\sqrt{2}\ (\approx 0.172)$, and to the general bound for all copulas, which is $\frac{1}{3}$. Furthermore, we give the sharp bound of the same asymmetry measure for RMM copulas, which is $3-2\sqrt{2}$, compared to the same bound for the NQD copulas to which they belong, which is $\sqrt{5}-2\ (\approx 0.236)$. One of our main results is also the statistical interpretation of shocks in a given model at which the maximal asymmetry measure bound is attained. These interpretations for the three families studied are illustrated by examples that should be helpful to practitioners when choosing a model for their data. 
Date:  2018–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1808.09698&r=ecm 
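The asymmetry measure $\mu_\infty$ discussed above is the sup-distance between a copula and its transpose, and can be approximated numerically on a grid. The sketch below uses the classical Marshall-Olkin copula as an illustrative asymmetric shock-model family; the parameter values are arbitrary.

```python
import numpy as np

def mu_inf(C, n=400):
    """Approximate mu_infinity(C) = sup_{u,v} |C(u,v) - C(v,u)| on an n x n grid."""
    u = np.linspace(0.0, 1.0, n)
    U, V = np.meshgrid(u, u)
    vals = C(U, V)
    return np.abs(vals - vals.T).max()

def marshall_olkin(a, b):
    """Marshall-Olkin copula C(u,v) = min(u^(1-a) * v, u * v^(1-b)),
    asymmetric whenever a != b."""
    return lambda u, v: np.minimum(u ** (1.0 - a) * v, u * v ** (1.0 - b))

independence = lambda u, v: u * v
```

The independence copula is exchangeable, so its measure is zero; an asymmetric Marshall-Olkin copula gives a strictly positive value, which must stay below the universal bound of $\frac{1}{3}$.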
By:  CHIRANJIT MUKHOPADHYAY (Indian Institute of Science) 
Abstract:  Short-horizon Event Studies (ES) in financial research are concerned with the effects of firm-specific or market-wide events, such as stock splits, earnings announcements, mergers and acquisitions, derivatives introductions etc., on the underlying firms' stock prices. Though the field has been around for half a century, and evidence abounds about the phenomenon of Event Induced Variance (EIV), methodological development in the ES literature has mostly focused only on a shift in location of the expected abnormal returns. In this work, a random-effect model is proposed which explicitly accounts for the (empirically observed) cross-sectional variance of the (predicted) abnormal returns, along with another parameter accommodating (another empirical phenomenon of) a change in post-event volatility. Under this model, the null hypothesis of "no event effect" also involves these additional variance parameters beyond the usual mean. This necessitates development of new tests for this and other hypotheses of interest in ES, for which new Likelihood Ratio Tests (LRTs) are derived. As is standard in the ES literature, the specification and power behavior of the newly developed LRTs are compared with those of the existing ES tests, using real returns of 1231 stocks listed on the National Stock Exchange, India between April 1998 and January 2016. 100,000 samples of sizes 5 and 50 are drawn to estimate and compare the probabilities of type-I error and power of the tests. The new LRTs are compared with both popular and recent parametric and nonparametric ES tests in the literature. The powers are compared under both presence and absence of a shift in location of the distribution of the abnormal returns, along with those of the two components of EIV. The newly developed LRTs are found to be adequately specified, and far more powerful than the existing ES tests in the literature, for a wide spectrum of alternatives. 
Keywords:  Cumulative Abnormal Return, Efficient Market Hypothesis, Event Induced Variance, Power, Specification 
JEL:  G14 C12 C58 
Date:  2018–06 
URL:  http://d.repec.org/n?u=RePEc:sek:iacpro:6408700&r=ecm 
By:  WEBER Matthias, (Bank of Lithuania and Vilnius University); STRIAUKAS Jonas, (CORE, UCLouvain); SCHUMACHER Martin, (University of Freiburg); BINDER Harald, (University of Freiburg) 
Abstract:  Often, variables are linked to each other via a network. When such a network structure is known, this knowledge can be incorporated into regularized regression settings. In particular, an additional network penalty can be added on top of another penalty term, such as a Lasso penalty. However, when the type of interaction via the network is unknown (that is, whether connections are of an activating or a repressing type), the connection signs have to be estimated simultaneously with the covariate coefficients. This can be done with an algorithm that iterates between a connection-sign estimation step and a covariate-coefficient estimation step. We show detailed simulation results for such an algorithm, which performs well in a variety of settings. We also briefly describe the R package that we developed for this purpose, which is publicly available. 
Keywords:  network regression; network penalty; connection sign estimation; regularized regression 
Date:  2018–06–11 
URL:  http://d.repec.org/n?u=RePEc:cor:louvco:2018018&r=ecm 
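For a fixed, known set of connection signs, the covariate-coefficient step reduces to a Lasso-plus-network-penalty problem. The sketch below solves one such problem by proximal gradient descent; the penalty form, the Laplacian and the data are illustrative assumptions, not the authors' R package or their sign-estimation iteration.

```python
import numpy as np

rng = np.random.default_rng(4)

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def network_lasso(X, y, L, lam1=0.1, lam2=1.0, n_iter=2000):
    """Proximal gradient descent for
        0.5*||y - X b||^2 + (lam2/2) * b' L b + lam1 * ||b||_1,
    where L is a graph Laplacian encoding the (signed) network penalty."""
    b = np.zeros(X.shape[1])
    # Step size from a bound on the Lipschitz constant of the smooth part.
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 + lam2 * np.linalg.norm(L, 2))
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) + lam2 * (L @ b)
        b = soft_threshold(b - step * grad, step * lam1)
    return b

# Toy data: covariates 0 and 1 are connected in the network (activating sign)
# and share equal effects; covariates 2 and 3 are irrelevant.
X = rng.normal(size=(200, 4))
y = X[:, 0] + X[:, 1] + 0.5 * rng.normal(size=200)
L = np.array([[1.0, -1.0, 0.0, 0.0],
              [-1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.0]])
b_hat = network_lasso(X, y, L)
```

In the full algorithm, the off-diagonal signs of L would themselves be re-estimated between coefficient updates, flipping a connection to repressing when the current coefficients suggest opposite-signed effects.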