NEP: New Economics Papers on Econometrics
By: | Alessio Volpicella (Queen Mary University of London) |
Abstract: | Sign-restricted Structural Vector Autoregressions (SVARs) are increasingly common. However, they usually result in a set of structural parameters that have very different implications in terms of impulse responses, elasticities, historical decompositions and forecast error variance decompositions (FEVD). This makes it difficult to derive meaningful economic conclusions, and there is always the risk of retaining structural parameters with implausible implications. This paper imposes bounds on the FEVD as a way of sharpening the set identification induced by sign restrictions. First, in bivariate and trivariate settings, this paper analytically proves that bounds on the FEVD reduce the identified set. For higher-dimensional SVARs, I establish the conditions under which placing bounds on the FEVD delivers a non-empty set and sharpens inference; algorithms to detect non-emptiness and reduction are also provided. Second, under a convexity criterion, a prior-robust approach to estimation and inference is proposed. Third, this paper suggests a procedure to derive theory-driven bounds that are consistent with the implications of a variety of popular, but different, DSGE models, with real, nominal, and financial frictions, and with sufficiently wide ranges for their parameters. The methodology is generalized to incorporate uncertainty about the bounds themselves. Fourth, a Monte Carlo exercise verifies the effectiveness of those bounds in identifying the data-generating process relative to sign restrictions. Finally, a monetary policy application shows that bounds on the FEVD tend to remove unreasonable implications, increase estimation precision, and sharpen and also alter the inference of models identified through sign restrictions. |
Keywords: | Bounds, Forecast Error Variance, Monetary Policy, Set Identification, Sign Restrictions, Structural Vector Autoregressions (SVARs) |
JEL: | C32 C53 E10 E52 |
Date: | 2019–07–29 |
URL: | http://d.repec.org/n?u=RePEc:qmw:qmwecw:890&r=all |
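To make the accept/reject logic concrete, here is a minimal sketch (not the paper's code) under strong simplifying assumptions: a VAR(1) with known coefficient matrix `B` and innovation covariance `Sigma`, a hypothetical user-supplied `sign_checks` predicate encoding the sign restrictions, and element-wise bound matrices `fevd_lo`/`fevd_hi` on the FEVD shares.

```python
import numpy as np

def fevd(irf):
    """FEVD shares from an IRF array: irf[h, i, j] is the response of
    variable i to shock j at horizon h. Returns an (n, n) matrix whose
    row i gives each shock's share of variable i's forecast error variance."""
    mse = (irf ** 2).sum(axis=0)
    return mse / mse.sum(axis=1, keepdims=True)

def draw_accepted(B, Sigma, sign_checks, fevd_lo, fevd_hi,
                  H=16, n_draws=10_000, seed=0):
    """Accept/reject over Haar-distributed rotations: keep draws satisfying
    both the sign restrictions and the FEVD bounds. fevd_lo/fevd_hi are
    (n, n) arrays (use 0 and 1 for unrestricted entries)."""
    rng = np.random.default_rng(seed)
    n = Sigma.shape[0]
    P = np.linalg.cholesky(Sigma)
    accepted = []
    for _ in range(n_draws):
        Q, R = np.linalg.qr(rng.standard_normal((n, n)))
        Q = Q @ np.diag(np.sign(np.diag(R)))     # unique orientation -> Haar
        A = P @ Q                                # candidate impact matrix
        irf = np.stack([np.linalg.matrix_power(B, h) @ A for h in range(H)])
        if not sign_checks(irf):
            continue                             # violates sign restrictions
        shares = fevd(irf)
        if np.all(shares >= fevd_lo) and np.all(shares <= fevd_hi):
            accepted.append(irf)                 # satisfies FEVD bounds too
    return accepted
```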
By: | Yukitoshi Matsushita; Taisuke Otsu |
Abstract: | This paper sheds light on problems of statistical inference under alternative or nonstandard asymptotic frameworks from the perspective of jackknife empirical likelihood (JEL). Examples include small bandwidth asymptotics for semiparametric inference, many covariates asymptotics for regression models, and many-weak instruments asymptotics for instrumental variable regression. We first establish Wilks' theorem for the JEL statistic on a general semiparametric inference problem under the conventional asymptotics. We then show that the JEL statistics lose asymptotic pivotalness under the above nonstandard asymptotic frameworks, and argue that these phenomena are understood as emergence of Efron and Stein's (1981) bias of the jackknife variance estimator in the first order. Finally we propose a modification of JEL to recover asymptotic pivotalness under both the conventional and nonstandard asymptotics. Our modification works for all above examples and provides a unified framework to investigate nonstandard asymptotic problems. |
Keywords: | Jackknife, Empirical likelihood, Nonstandard asymptotics |
JEL: | C14 |
Date: | 2019–07 |
URL: | http://d.repec.org/n?u=RePEc:cep:stiecm:605&r=all |
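A minimal sketch of the JEL construction for a scalar parameter (illustrative only; the paper treats general semiparametric problems): form jackknife pseudo-values, then apply the standard empirical-likelihood ratio to them. Under conventional asymptotics the statistic is approximately chi-squared with one degree of freedom, which is the Wilks-type pivotalness whose failure under nonstandard asymptotics the paper studies.

```python
import numpy as np
from scipy.optimize import brentq

def jel_statistic(x, stat, theta0):
    """Jackknife empirical likelihood ratio for H0: theta = theta0 (scalar)."""
    n = len(x)
    loo = np.array([stat(np.delete(x, i)) for i in range(n)])
    v = n * stat(x) - (n - 1) * loo              # jackknife pseudo-values
    z = v - theta0
    if z.min() >= 0 or z.max() <= 0:
        return np.inf                            # theta0 outside convex hull
    lam_lo = (-1 + 1e-8) / z.max()               # keep 1 + lam*z_i > 0
    lam_hi = (-1 + 1e-8) / z.min()
    g = lambda lam: np.sum(z / (1 + lam * z))    # strictly decreasing in lam
    lam = brentq(g, lam_lo, lam_hi)
    return 2 * np.sum(np.log1p(lam * z))         # ~ chi2(1) conventionally

# usage, e.g. for the mean: w = jel_statistic(x, np.mean, theta0=0.0)
```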
By: | Germano Ruisi (Queen Mary University of London) |
Abstract: | In recent years local projections have become a more and more popular methodology for the estimation of impulse responses. Besides being relatively easy to implement, the main strength of this approach relative to the traditional VAR one is that there is no need to impose any specific assumption on the dynamics of the data. This paper models local projections in a time-varying framework and provides a Gibbs sampler routine to estimate them. A simulation study shows how the performance of the algorithm is satisfactory while the usefulness of the model developed here is shown through an application to fiscal policy shocks. |
Keywords: | Time-Varying Coefficients, Local Projections |
JEL: | C11 C32 C36 E32 |
Date: | 2019–07–29 |
URL: | http://d.repec.org/n?u=RePEc:qmw:qmwecw:891&r=all |
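For readers unfamiliar with local projections, the fixed-coefficient baseline is just a sequence of horizon-specific OLS regressions; the paper's contribution (time-varying coefficients estimated by a Gibbs sampler) is not reproduced in this sketch.

```python
import numpy as np

def local_projection_irf(y, shock, H=12, lags=4):
    """Jorda-style local projections: for each horizon h, regress y_{t+h}
    on shock_t and `lags` lags of y by OLS; beta_h traces out the IRF."""
    y, shock = np.asarray(y, float), np.asarray(shock, float)
    T = len(y)
    irf = np.empty(H + 1)
    for h in range(H + 1):
        m = T - h - lags                         # usable observations
        X = np.column_stack(
            [np.ones(m), shock[lags:T - h]]
            + [y[lags - l:T - h - l] for l in range(1, lags + 1)]
        )
        beta, *_ = np.linalg.lstsq(X, y[lags + h:T], rcond=None)
        irf[h] = beta[1]                         # coefficient on the shock
    return irf
```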
By: | Xuehai Zhang (Paderborn University) |
Abstract: | A general class of SemiMEM (semiparametric multiplicative error) models is proposed by introducing a scale function into a MEM (multiplicative error) model to analyze non-negative observations. The estimation of the scale function is not limited by any parametric model specification, and the moment conditions are weakened via the Box-Cox transformation. To this end, an equivalent scale function is estimated by a local linear approach and converted to the scale function under weak moment conditions. Estimation of the equivalent scale function, the bandwidth, the constant factor in the asymptotic variance and the power transformation parameter is based on iterative plug-in (IPI) algorithms. For the power transformation estimation, maximum likelihood estimation (MLE), a normality test and quantile-quantile regression (QQr) are employed, and simulation algorithms for the confidence interval of the estimated power transformation parameter are developed with the block bootstrap method. The algorithms fit the selected real data well. |
Date: | 2019–08 |
URL: | http://d.repec.org/n?u=RePEc:pdn:ciepap:122&r=all |
By: | Rico Krueger; Taha H. Rashidi; Akshay Vij |
Abstract: | In this paper, we contrast parametric and semi-parametric representations of unobserved heterogeneity in hierarchical Bayesian multinomial logit models and leverage these methods to infer distributions of willingness to pay for features of shared automated vehicle (SAV) services. Specifically, we compare the multivariate normal (MVN), finite mixture of normals (F-MON) and Dirichlet process mixture of normals (DP-MON) mixing distributions. The latter promises to be particularly flexible with respect to the shapes it can assume and, unlike other semi-parametric approaches, does not require that its complexity be fixed prior to estimation. However, its properties relative to simpler mixing distributions are not well understood. In this paper, we evaluate the performance of the MVN, F-MON and DP-MON mixing distributions using simulated data and real data sourced from a stated choice study on preferences for SAV services in New York City. Our analysis shows that the DP-MON mixing distribution provides superior fit to the data and performs at least as well as the competing methods at out-of-sample prediction. The DP-MON mixing distribution also offers substantive behavioural insights into the adoption of SAVs. We find that preferences for in-vehicle travel time by SAV with ride-splitting are strongly polarised. Whereas one third of the sample is willing to pay between 10 and 80 USD/h to avoid sharing a vehicle with strangers, the remainder of the sample is either indifferent to ride-splitting or even desires it. Moreover, we estimate that new technologies such as vehicle automation and electrification are relatively unimportant to travellers. This suggests that travellers may primarily derive indirect, rather than immediate benefits from these new technologies through increases in operational efficiency and lower operating costs. |
Date: | 2019–07 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1907.09639&r=all |
By: | McCracken, Michael W. (Federal Reserve Bank of St. Louis) |
Abstract: | We investigate a test of equal predictive ability delineated in Giacomini and White (2006; Econometrica). In contrast to a claim made in the paper, we show that their test statistic need not be asymptotically Normal when a fixed window of observations is used to estimate model parameters. An example is provided in which, instead, the test statistic diverges with probability one under the null. Simulations reinforce our analytical results. |
Keywords: | prediction; out-of-sample; inference |
JEL: | C12 C52 C53 |
Date: | 2019–07–29 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedlwp:2019-018&r=all |
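The statistic at issue is, in its unconditional form, a t-ratio of mean loss differentials with a HAC variance; the note's point is that with a fixed estimation window this ratio need not converge to N(0,1). A standard construction (not McCracken's code) looks as follows.

```python
import numpy as np

def gw_unconditional_stat(loss1, loss2, bandwidth=None):
    """Unconditional Giacomini-White / Diebold-Mariano-type statistic:
    mean loss differential over its Bartlett (Newey-West) HAC standard error."""
    d = np.asarray(loss1) - np.asarray(loss2)
    n = len(d)
    if bandwidth is None:
        bandwidth = int(np.floor(4 * (n / 100) ** (2 / 9)))  # NW rule of thumb
    u = d - d.mean()
    var = u @ u / n
    for j in range(1, bandwidth + 1):
        w = 1 - j / (bandwidth + 1)              # Bartlett kernel weight
        var += 2 * w * (u[j:] @ u[:-j]) / n
    return d.mean() / np.sqrt(var / n)
```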
By: | Jon Danielsson; Lerby Ergun; Laurens de Haan; Casper G. de Vries |
Abstract: | The selection of upper order statistics in tail estimation is notoriously difficult. Methods that are based on asymptotic arguments, like minimizing the asymptotic MSE, do not perform well in finite samples. Here, we advance a data-driven method that minimizes the maximum distance between the fitted Pareto type tail and the observed quantile. To analyze the finite sample properties of the metric, we perform rigorous simulation studies. In most cases, the finite sample-based methods perform best. To demonstrate the economic relevance of choosing the proper methodology, we use daily equity return data from the CRSP database and find economically relevant variation between the tail index estimates. |
Keywords: | Econometric and statistical methods; Financial stability |
JEL: | C01 C14 C58 |
Date: | 2019–08 |
URL: | http://d.repec.org/n?u=RePEc:bca:bocawp:19-28&r=all |
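A simplified version of the idea (my reading of the abstract, not the authors' exact metric): for each candidate number of upper order statistics k, fit the Pareto tail via the Hill estimator and measure the largest vertical gap between fitted and empirical upper quantiles; choose the k that minimizes the gap.

```python
import numpy as np

def hill(xs, k):
    """Hill estimator of the tail index; xs sorted in decreasing order."""
    return 1.0 / np.mean(np.log(xs[:k]) - np.log(xs[k]))

def select_k(x, k_min=2, k_max=None):
    """x: positive tail observations (e.g., absolute returns). Pick k by
    minimizing the max distance between fitted Pareto and empirical quantiles."""
    xs = np.sort(np.asarray(x, float))[::-1]
    n = len(xs)
    if k_max is None:
        k_max = n // 10
    j = np.arange(1, k_max + 1)                  # upper order statistics compared
    best_k, best_dist = None, np.inf
    for k in range(k_min, k_max + 1):
        alpha = hill(xs, k)
        q_fit = xs[k] * (k / j) ** (1 / alpha)   # Pareto quantile at prob j/n
        dist = np.max(np.abs(xs[:k_max] - q_fit))
        if dist < best_dist:
            best_k, best_dist = k, dist
    return best_k, hill(xs, best_k)
```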
By: | Bajzik, Jozef; Havranek, Tomas; Irsova, Zuzana; Schwarz, Jiri |
Abstract: | A key parameter in international economics is the elasticity of substitution between domestic and foreign goods, also called the Armington elasticity. Yet estimates vary widely. We collect 3,524 reported estimates of the elasticity, construct 34 variables that reflect the context in which researchers obtain their estimates, and examine what drives the heterogeneity in the results. To account for inherent model uncertainty, we employ Bayesian and frequentist model averaging. We present the first application of newly developed non-linear techniques to correct for publication bias. Our main results are threefold. First, there is publication bias against small and statistically insignificant elasticities. Second, differences in results are best explained by differences in data: aggregation, frequency, size, and dimension. Third, the mean elasticity implied by the literature after correcting for both publication bias and potential misspecifications is 3. |
Keywords: | Armington; trade elasticity; meta-analysis; publication bias; Bayesian model averaging |
JEL: | C83 D12 F14 |
Date: | 2019–07–12 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:95031&r=all |
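The publication-bias logic can be illustrated with the classic linear FAT-PET meta-regression; the newly developed non-linear corrections the paper applies are not reproduced here.

```python
import numpy as np

def fat_pet(estimates, std_errors):
    """FAT-PET meta-regression: estimate_i = b0 + b1 * se_i + u_i, by WLS
    with weights 1/se^2. b1 tests for funnel asymmetry (publication bias);
    b0 is the bias-corrected mean effect."""
    se = np.asarray(std_errors, float)
    y = np.asarray(estimates, float)
    X = np.column_stack([np.ones_like(se), se])
    w = 1.0 / se ** 2
    XtWX = X.T @ (w[:, None] * X)
    b = np.linalg.solve(XtWX, X.T @ (w * y))
    resid = y - X @ b
    sigma2 = (w * resid ** 2).sum() / (len(y) - 2)
    cov = sigma2 * np.linalg.inv(XtWX)
    return {"b0_corrected_mean": b[0], "b1_funnel_asymmetry": b[1],
            "std_errors": np.sqrt(np.diag(cov))}
```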
By: | Petr Koldanov |
Abstract: | The wide class of elliptically contoured distributions is a popular model of stock return distributions. However, the important question of the adequacy of this model remains open: some existing results reject it while others support it. Such results are obtained by testing properties of the elliptical model for each pair of stocks from a given market. This paper proves a new property: the equality of the Kendall $\tau$ correlation coefficient and the probability of sign coincidence for any pair of random variables with an elliptically contoured distribution. Distribution-free statistical tests of this property for any pair of stocks are constructed. A Holm multiple hypothesis testing procedure based on the individual tests is constructed and applied to stock market data for a given year. A new procedure for testing the elliptical model of stock return distributions over all years of an observation period is proposed. The procedure is applied to stock market data from China, the USA, Great Britain and Germany for the period from 2003 to 2014. It is shown that for the US, British and German stock markets the hypothesis of an elliptical model of stock return distributions can be accepted, but for the Chinese stock market it is rejected in some cases. |
Date: | 2019–07 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1907.10306&r=all |
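For a centered elliptical pair with continuous margins, two arcsine laws give Kendall's tau = (2/pi) arcsin(rho) and P(sign coincidence) = 1/2 + arcsin(rho)/pi, hence tau = 2p - 1. The sketch below bootstraps that implication for one pair of series; the normalization and the paper's exact distribution-free statistic may differ from this reconstruction. Across many pairs, the resulting p-values would then be fed to a Holm correction (e.g., statsmodels' multipletests with method='holm').

```python
import numpy as np
from scipy.stats import kendalltau

def elliptical_sign_test(x, y, n_boot=2000, seed=0):
    """Paired bootstrap of the discrepancy tau - (2p - 1), where p is the
    frequency of sign coincidence around the sample medians."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    rng = np.random.default_rng(seed)

    def discrepancy(xb, yb):
        tau, _ = kendalltau(xb, yb)
        p = np.mean(np.sign(xb - np.median(xb)) == np.sign(yb - np.median(yb)))
        return tau - (2 * p - 1)

    d_hat = discrepancy(x, y)
    n = len(x)
    boot = np.array([discrepancy(x[idx], y[idx])
                     for idx in rng.integers(0, n, (n_boot, n))])
    # two-sided bootstrap p-value for H0: discrepancy = 0
    return d_hat, np.mean(np.abs(boot - boot.mean()) >= abs(d_hat))
```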
By: | Anne Opschoor (Vrije Universiteit Amsterdam); André Lucas (Vrije Universiteit Amsterdam) |
Abstract: | We propose a new score-driven model to capture the time-varying volatility and tail behavior of realized kernels. We assume realized kernels follow an F distribution with two time-varying degrees-of-freedom parameters, accounting for the Vol-of-Vol and the tail shape of the realized kernel distribution. The resulting score-driven dynamics imply that the influence of large (outlying) realized kernels on future volatilities and tail-shapes is mitigated. We apply our model to 30 stocks from the S&P 500 index over the period 2001-2014. The results show that tail shapes vary over time, even after correcting for the time-varying mean and Vol-of-Vol of the realized kernels. The model outperforms a number of recent competitors, both in-sample and out-of-sample. In particular, accounting for time-varying tail shapes matters for both density forecasts and forecasts of volatility risk quantiles. |
Keywords: | realized kernel, heavy tails, F distribution, time-varying shape-parameter, Vol-of-Vol, score-driven dynamics |
JEL: | C32 C58 |
Date: | 2019–07–31 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20190051&r=all |
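The flavor of score-driven dynamics can be seen in the simpler Beta-t-EGARCH recursion below (Harvey's model for returns, not the paper's F model for realized kernels): because the scaled score u_t is bounded, a single outlying observation moves the volatility path only so much, which is exactly the mitigation property the abstract highlights.

```python
import numpy as np

def beta_t_egarch_filter(y, omega, alpha, beta, nu):
    """Score-driven log-volatility recursion lam_{t+1} = omega + beta*lam_t
    + alpha*u_t with Student-t scaled score u_t in (-1, nu). Requires beta < 1."""
    T = len(y)
    lam = np.empty(T)
    lam[0] = omega / (1 - beta)                  # unconditional level
    for t in range(T - 1):
        sig2 = np.exp(2 * lam[t])
        # bounded score: outliers have limited impact on future volatility
        u = (nu + 1) * y[t] ** 2 / (nu * sig2 + y[t] ** 2) - 1
        lam[t + 1] = omega + beta * lam[t] + alpha * u
    return np.exp(lam)                           # fitted conditional vol path
```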
By: | Jérôme TRINH (Institut Polytechnique de Paris, CREST; Thema, University of Cergy-Pontoise.) |
Abstract: | This article develops a methodology to compute up-to-date quarterly macroeconomic data for emerging countries by adapting a well-known method of temporal disaggregation to time series with small sample sizes and unstable relationships between them. By incorporating different procedures for structural break detection, the prediction of higher-frequency estimates of yearly official data can be improved. A methodology with a model selection procedure and disaggregation formulas is proposed. Its predictive performance is assessed using empirical data from advanced countries and simulated time series. An application to the Chinese national accounts allows the estimation of the cyclical components of the Chinese expenditure accounts and shows that the Chinese economy has second-order moments more in line with emerging countries than with advanced economies like the United States. |
Keywords: | Time series, macroeconomic forecasting, disaggregation, structural change, business cycles, emerging economies |
Date: | 2019–06–27 |
URL: | http://d.repec.org/n?u=RePEc:crs:wpaper:2019-11&r=all |
By: | Daisuke Kurisu; Taisuke Otsu |
Abstract: | This paper studies the uniform convergence rates of Li and Vuong's (1998) nonparametric deconvolution estimator and its regularized version by Comte and Kappus (2015) for the classical measurement error model, where repeated measurements are available. Our assumptions are weaker than existing results, such as Li and Vuong (1998) which requires bounded support, and a specialization of Bonhomme and Robin (2010) which requires the existence of moment generating functions of certain observables. Moreover, our uniform convergence rates are typically faster than those obtained in these papers. |
Keywords: | measurement error, deconvolution, uniform convergence |
JEL: | C14 |
Date: | 2019–07 |
URL: | http://d.repec.org/n?u=RePEc:cep:stiecm:604&r=all |
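A compact numerical version of the repeated-measurement deconvolution idea, under the extra simplifying assumption of symmetric errors (so the error characteristic function is real and can be recovered as the square root of that of Y1 - Y2, a simplification of the Li-Vuong construction); the spectral cutoff plays the role of the regularization analyzed by Comte and Kappus (2015).

```python
import numpy as np

def deconvolve_density(y1, y2, x_grid, cutoff):
    """Density of X from Y_j = X + e_j (j = 1, 2) with e_1, e_2 i.i.d.,
    independent of X, and symmetric: phi_e(t)^2 = phi_{Y1-Y2}(t), so
    phi_X(t) = phi_{Y1}(t) / sqrt(phi_{Y1-Y2}(t)); invert with a cutoff."""
    t = np.linspace(-cutoff, cutoff, 512)
    phi_y1 = np.mean(np.exp(1j * np.outer(t, y1)), axis=1)
    phi_diff = np.mean(np.exp(1j * np.outer(t, y1 - y2)), axis=1).real
    phi_e = np.sqrt(np.clip(phi_diff, 1e-12, None))  # real, positive by symmetry
    phi_x = phi_y1 / phi_e
    dt = t[1] - t[0]
    # inverse Fourier transform on the grid, truncated at |t| <= cutoff
    f = (np.exp(-1j * np.outer(x_grid, t)) @ phi_x).real * dt / (2 * np.pi)
    return np.clip(f, 0, None)
```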
By: | Manh D. Pham (School of Economics and Centre for Efficiency and Productivity Analysis (CEPA) at The University of Queensland, Australia); Léopold Simar (Institut de Statistique, Biostatistique et Sciences Actuarielles, Université catholique de Louvain, B1348 Louvain-la-Neuve, Belgium); Valentin Zelenyuk (School of Economics and Centre for Efficiency and Productivity Analysis (CEPA) at The University of Queensland, Australia) |
Abstract: | The Malmquist Productivity Index (MPI) has gained popularity amongst studies on the dynamic change of productivity of decision making units (DMUs). In practice, this index is frequently reported at aggregate levels (e.g., public and private firms) in the form of simple equally-weighted arithmetic or geometric means of individual MPIs. A number of studies have emphasized that it is necessary to account for the relative importance of individual DMUs in the aggregation of indices in general and of the MPI in particular. While more suitable aggregations of MPIs have been introduced in the literature, their statistical properties have not been revealed yet, preventing applied researchers from making essential statistical inferences such as confidence intervals and hypothesis testing. In this paper, we fill this gap by developing a full asymptotic theory for an appealing aggregation of MPIs. On this basis, some meaningful statistical inferences are proposed and their finite-sample performances are verified via extensive Monte Carlo experiments. |
Keywords: | aggregation, asymptotics, DEA, hypothesis test, inference, Malmquist index, productivity |
JEL: | C14 C44 C51 D24 M11 |
Date: | 2019–07 |
URL: | http://d.repec.org/n?u=RePEc:qld:uqcepa:138&r=all |
By: | David S. Miller |
Abstract: | This note introduces a general method to derive recession probabilities from forecasts using real-time data in parsimoniously specified logistic regressions. I apply two specifications of the general method that produces an implied recession probability to forecasts contained in releases of the Survey of Professional Forecasters (SPF). Using yearly forecasts from the 2018:Q3 SPF, the probability of a recession peaks between 30 percent in 2020 and 40 percent in 2021. Using quarterly forecasts, the probability of a recession within four quarters is monotonically increasing during the forecast, hitting a high between 35 and 40 percent in 2019:Q3. |
Date: | 2019–05–06 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedgfn:2019-05-06&r=all |
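A minimal version of the mapping from forecasts to recession probabilities (variable names are illustrative; this is not the note's exact specification or data):

```python
import numpy as np
import statsmodels.api as sm

def recession_logit(forecasts, recession_indicator):
    """Parsimonious logit P(recession) = Logit(a + b * forecast). `forecasts`
    could hold, e.g., SPF-implied real GDP growth; `recession_indicator` is 0/1."""
    X = sm.add_constant(np.asarray(forecasts, float))
    return sm.Logit(np.asarray(recession_indicator), X).fit(disp=0)

# implied probability for a new forecast value f_new:
#   model = recession_logit(f_hist, rec_hist)
#   p = model.predict(np.array([[1.0, f_new]]))
```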
By: | Mitchell, James (University of Warwick); Weale, Martin (King's College London) |
Abstract: | This paper considers the production and evaluation of density forecasts, paying attention to whether and how the probabilities of outlying observations are quantified and communicated. Particular focus is given to the ‘censored’ nature of the Bank of England’s fan charts, given that (a fact commonly ignored) they describe only the inner 90% (best critical region) of the forecast distribution. A new estimator is proposed that fits a potentially skewed and fat-tailed density to the inner observations, acknowledging that the outlying observations may be drawn from a different but unknown distribution. In forecasting applications, the motivation for this could reflect the view that outlying forecast errors reflect (realised) unknown unknowns, or events not expected to recur, that should be censored before quantifying known unknowns. |
Keywords: | forecasting uncertainty; fan charts; skewed densities; best critical region; density forecasting; censoring; forecasting evaluation; |
JEL: | C24 C46 C53 E58 |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:wrk:wrkemf:27&r=all |
By: | Keith Blackwell |
Abstract: | This paper introduces a simple symmetric Quantal Response Statistical Equilibrium (QRSE) model that can fit many commonly observed distributions of economic and financial data, including the Laplace, Gaussian, Logistic, and Student's t distributions. This paper also introduces the application of QRSE to a financial market setting. A QRSE market model uses the joint probability distribution of asset returns and the entropy-constrained buy/sell decisions of investors to explain stylized facts commonly observed in the distributions of asset returns and economic data, such as fat tails, excess peakedness, and skew. Using the simplified model, this paper extends the existing logic and understanding of QRSE in order to provide a behavioral explanation for these commonly observed distributions of data. |
Date: | 2019–08 |
URL: | http://d.repec.org/n?u=RePEc:new:wpaper:1912&r=all |
By: | Xuehai Zhang (Paderborn University) |
Abstract: | Risk management has been emphasized by financial institutions and the Basel Committee on Banking Supervision (BCBS). The core issue in risk management is the measurement of the risks. Value at Risk (VaR) and Expected Shortfall (ES) are the widely used tools in quantitative risk management. Because VaR performs poorly in capturing tail risk, ES is recommended as the financial risk management metric by the BCBS. In this paper, we develop a general class of SemiGARCH models with a time-varying scale function; GARCH-class models based on the conditional t-distribution are parametric extensions. Backtesting with the semiparametric approach is also discussed. Following Basel III, the traffic light tests are applied in the model validation. Finally, we propose loss functions reflecting the views of regulators and firms, combined with a power transformation in model selection, and show that semiparametric models are a necessary option in practical financial risk management. |
Date: | 2019–08 |
URL: | http://d.repec.org/n?u=RePEc:pdn:ciepap:123&r=all |
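The Basel traffic-light test mentioned in the abstract is a simple exceedance count; a sketch (VaR passed as a positive loss quantile):

```python
import numpy as np
from scipy.stats import binom

def traffic_light(returns, var99):
    """Basel traffic-light backtest: count days the loss exceeds the 99% VaR.
    Zone boundaries assume the standard ~250-day window: green <= 4,
    yellow 5-9, red >= 10 exceedances."""
    returns, var99 = np.asarray(returns), np.asarray(var99)
    exc = int(np.sum(returns < -var99))
    zone = "green" if exc <= 4 else ("yellow" if exc <= 9 else "red")
    p_cum = binom.cdf(exc, len(returns), 0.01)   # cumulative prob. if model correct
    return exc, zone, p_cum
```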
By: | Ray C. Fair (Cowles Foundation, Yale University) |
Abstract: | This comment points out mismeasurement of three of the variables in the DSGE model in Smets and Wouters (2007) and in models that use the Smets-Wouters model as a benchmark. The mismeasurement appears serious enough to call into question the reliability of empirical results using these variables. |
Keywords: | DSGE models, Macro data |
JEL: | E12 E32 |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:cwl:cwldpp:2166r&r=all |
By: | Thomas Despois (PJSE - Paris Jourdan Sciences Economiques - UP1 - Université Panthéon-Sorbonne - ENS Paris - École normale supérieure - Paris - INRA - Institut National de la Recherche Agronomique - EHESS - École des hautes études en sciences sociales - ENPC - École des Ponts ParisTech - CNRS - Centre National de la Recherche Scientifique, PSE - Paris School of Economics); Catherine Doz (PJSE - Paris Jourdan Sciences Economiques - UP1 - Université Panthéon-Sorbonne - ENS Paris - École normale supérieure - Paris - INRA - Institut National de la Recherche Agronomique - EHESS - École des hautes études en sciences sociales - ENPC - École des Ponts ParisTech - CNRS - Centre National de la Recherche Scientifique, PSE - Paris School of Economics) |
Abstract: | Dynamic factor models (DFMs) have been successfully used in various applications, and have become increasingly popular. Nevertheless, they suffer from two weaknesses. First, the factors are very difficult to interpret with the usual estimation methods. Second, there is recent and mounting evidence that they can be subject to structural instability, potentially leading to an inconsistent estimation of the factors. The central point of this paper is to tackle the uninterpretability issue. We consider two families of methods to get interpretable factors: factor rotations and sparse PCA. When using a large dataset of US macroeconomic and financial variables, they recover the same factor representation, with a simple structure in the loadings offering a clear economic interpretation of the factors, which appears to be stable over time. This provides a new lens for DFM applications, and especially for studying the structural instability issue. Using this factor representation, we find no evidence for the emergence of a new factor or for a widespread break in the factor loadings at a given time. The structural instability in the model seems rather to consist in the order of the factors switching over time (apparently in relation with the structural breaks in the data), and in breaks in a limited number of loadings, which can be localized and interpreted. |
Keywords: | dynamic factor models,factor rotations,sparse PCA,structural break |
Date: | 2019–08 |
URL: | http://d.repec.org/n?u=RePEc:hal:psewpa:halshs-02235543&r=all |
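Of the two interpretability devices the abstract mentions, factor rotation is the easier to sketch: extract principal-component loadings, then apply a varimax rotation (standard algorithm below; sparse PCA could instead be run with sklearn.decomposition.SparsePCA).

```python
import numpy as np

def varimax(loadings, tol=1e-6, max_iter=500):
    """Varimax rotation of a (p x k) loadings matrix via the classical
    SVD-based iteration; returns the rotated loadings."""
    p, k = loadings.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        Lr = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / p)
        )
        R = u @ vt
        if s.sum() < var_old * (1 + tol):
            break
        var_old = s.sum()
    return loadings @ R

# usage: given standardized data X (T x N) and k factors,
#   eigval, eigvec = np.linalg.eigh(np.cov(X, rowvar=False))
#   loadings = eigvec[:, -k:] * np.sqrt(eigval[-k:])
#   rotated = varimax(loadings)
```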
By: | Tara M. Sinclair (The George Washington University) |
Abstract: | Throughout the history of macroeconomic forecasting, several major themes have remained surprisingly consistent. The failure to forecast economic downturns ahead of time is perhaps the most significant of these. Forecasting approaches have changed, but forecasts for recessions have not improved. What can we learn from past evaluations of macroeconomic forecasts? Is it possible to predict major economic shocks, or is it a fool’s errand? This chapter discusses how forecasting techniques have evolved over time and yet the record on forecasting recessions remains dismal. There are several competing hypotheses for why forecasters fail to foresee recessions, but little evidence that any of them will be addressed before the next recession occurs. This suggests planners and policymakers should expect to be surprised by the arrival of downturns and should develop ways to be prepared for recessions without clear warning of their coming. |
Keywords: | Forecast evaluation, recessions |
JEL: | E37 C53 |
Date: | 2019–08 |
URL: | http://d.repec.org/n?u=RePEc:gwc:wpaper:2019-003&r=all |
By: | Michael Weylandt; Yu Han; Katherine B. Ensor |
Abstract: | Financial markets for Liquefied Natural Gas (LNG) are an important and rapidly-growing segment of commodities markets. Like other commodities markets, there is an inherent spatial structure to LNG markets, with different price dynamics at different delivery hubs. Certain hubs support highly liquid markets, allowing efficient and robust price discovery, while others are highly illiquid, limiting the effectiveness of standard risk management techniques. We propose a joint modeling strategy, which uses high-frequency information from thickly-traded hubs to improve volatility estimation and risk management at thinly traded hubs. The resulting model has superior in- and out-of-sample predictive performance, particularly for several commonly used risk management metrics, demonstrating that joint modeling is indeed possible and useful. To improve estimation, a Bayesian estimation strategy is employed and data-driven weakly informative priors are suggested. Our model is robust to sparse data and can be effectively used in any market with similar irregular patterns of data availability. |
Date: | 2019–07 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1907.10152&r=all |
By: | Anne Opschoor (Vrije Universiteit Amsterdam); André Lucas (Vrije Universiteit Amsterdam) |
Abstract: | We present a new model to decompose total daily return volatility into a filtered (high-frequency based) open-to-close volatility and a time-varying scaling factor. We use score-driven dynamics based on fat-tailed distributions to limit the impact of incidental large observations. Applying our new model to 100 stocks of the S&P 500 during the period 2001-2014 and evaluating (in-sample and out-of-sample) in terms of Value-at-Risk and Expected Shortfall, we find our model outperforms alternatives like the HEAVY model, which uses close-to-close returns and realized variances, and models treating close-to-open and open-to-close returns as separate processes. Results also indicate that the ratio between total and open-to-close volatility changes substantially through time, especially for financial stocks. |
Keywords: | overnight volatility, realized variance, F distribution, score-driven dynamics |
JEL: | C32 C58 |
Date: | 2019–07–31 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20190052&r=all |
By: | Azam, Jean-Paul |
Date: | 2019–08 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:123287&r=all |
By: | Li, Yong (Renmin University of China); Wang, Nianling (Renmin University of China); Yu, Jun (School of Economics, Singapore Management University) |
Abstract: | The power-posterior method of Friel and Pettitt (2008) has been used to estimate the marginal likelihoods of competing Bayesian models. In this paper it is shown that the Bernstein-von Mises (BvM) theorem holds for the power posteriors under regularity conditions. Due to the BvM theorem, the power posteriors, when adjusted by the square root of the corresponding grid points, converge to the same normal distribution as the original posterior distribution, facilitating the implementation of importance sampling for the purpose of estimating the marginal likelihood. Unlike the power-posterior method that requires repeated posterior sampling from the power posteriors, the new method only requires the posterior output from the original posterior. Hence, it is computationally more efficient to implement. Moreover, it completely avoids the coding efforts associated with drawing samples from the power posteriors. Numerical efficiency of the proposed method is illustrated using two models in economics and finance. |
Keywords: | Bayes factor; Marginal likelihood; Markov Chain Monte Carlo; Model choice; Power posteriors; Importance sampling |
JEL: | C11 C12 |
Date: | 2019–07–22 |
URL: | http://d.repec.org/n?u=RePEc:ris:smuesw:2019_016&r=all |
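The computational idea can be illustrated with Gelfand-Dey-style reciprocal importance sampling, which likewise needs only the original posterior draws; the BvM theorem motivates fitting a normal importance density to those draws. This is a sketch in the spirit of the paper, not its exact estimator.

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_marglik(draws, log_post_kernel):
    """Reciprocal importance sampling for the log marginal likelihood:
    1/m = E_post[ q(theta) / (p(y|theta) p(theta)) ], with q a normal fitted
    to the posterior draws. draws: (N, d) array; log_post_kernel(theta)
    returns log p(y|theta) + log p(theta)."""
    q = multivariate_normal(draws.mean(axis=0), np.cov(draws, rowvar=False))
    log_w = q.logpdf(draws) - np.array([log_post_kernel(th) for th in draws])
    c = log_w.max()                              # log-sum-exp for stability
    return -(c + np.log(np.mean(np.exp(log_w - c))))
```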