NEP: New Economics Papers on Econometrics
By: | Legrand, Romain |
Abstract: | Time-varying VAR models have become increasingly popular and are now widely used for policy analysis and forecast purposes. They constitute fundamental tools for the anticipation and analysis of economic crises, which represent rapid shifts in dynamic responses and shock volatility. Yet, despite their flexibility, time-varying VARs remain subject to a number of limitations. On the theoretical side, the conventional random walk assumption used for the dynamic parameters appears excessively restrictive. It also conceals the potential heterogeneities existing between the dynamic processes of different variables. On the application side, the standard two-pass procedure building on the Kalman filter proves excessively complicated and suffers from low efficiency. Based on these considerations, this paper contributes to the literature in four directions: i) it introduces a general time-varying VAR model which relaxes the standard random walk assumption and defines the dynamic parameters as general autoregressive processes with variable-specific mean values and autoregressive coefficients. ii) it develops an estimation procedure for the model which is simple, transparent and efficient. The procedure requires no sophisticated Kalman filtering methods and reduces to a standard Gibbs sampling algorithm. iii) as an extension, it develops efficient procedures to estimate endogenously the mean values and autoregressive coefficients associated with each variable-specific autoregressive process. iv) through a case study of the Great Recession for four major economies (Canada, the Euro Area, Japan and the United States), it establishes that forecast accuracy can be significantly improved by using the proposed general time-varying model and its extensions in place of the traditional random walk specification. |
Keywords: | Time-varying coefficients; Stochastic volatility; Bayesian methods; Markov Chain Monte Carlo methods; Forecasting; Great Recession |
JEL: | C11 C15 C22 E32 F47 |
Date: | 2018–09–10 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:88925&r=ecm |
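A minimal Python sketch of the paper's central modelling idea: dynamic VAR coefficients that follow variable-specific autoregressive processes rather than random walks. The parameter values (mu, rho, sig) and the bivariate VAR(1) design are illustrative assumptions, not the paper's specification; setting rho = 1 recovers the conventional random walk case, and the Gibbs sampling estimation itself is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(0)
    T, n = 200, 2                             # sample size and number of VAR variables
    k = n * n                                 # number of dynamic VAR(1) coefficients

    # Illustrative variable-specific means and AR coefficients of the dynamic
    # parameters; the paper estimates these within the Gibbs sampler.
    mu  = np.array([0.5, 0.1, 0.1, 0.5])      # long-run means of the coefficients
    rho = np.array([0.95, 0.80, 0.80, 0.95])  # persistence (rho = 1 gives the random walk)
    sig = 0.02                                # innovation std of the coefficient processes

    # Dynamic coefficients: b_t = mu + rho * (b_{t-1} - mu) + e_t
    B = np.zeros((T, k))
    B[0] = mu
    for t in range(1, T):
        B[t] = mu + rho * (B[t - 1] - mu) + sig * rng.standard_normal(k)

    # Data from the time-varying VAR(1): y_t = B_t y_{t-1} + u_t
    y = np.zeros((T, n))
    for t in range(1, T):
        y[t] = B[t].reshape(n, n) @ y[t - 1] + 0.1 * rng.standard_normal(n)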
By: | Mengheng Li (Economics Discipline Group, University of Technology, Sydney); Marcel Scharth (School of Economics, University of Sydney, Sydney) |
Abstract: | We develop a flexible modeling and estimation framework for a high-dimensional factor stochastic volatility (SV) model. Our specification allows for leverage effects, asymmetry and heavy tails across all systematic and idiosyncratic components of the model. This framework accounts for well-documented features of univariate financial time series, while introducing a flexible dependence structure that incorporates tail dependence and asymmetries such as stronger correlations following downturns. We develop an efficient Markov chain Monte Carlo (MCMC) algorithm for posterior simulation based on the particle Gibbs, ancestor sampling, and particle efficient importance sampling methods. We build computationally efficient model selection into our estimation framework to obtain parsimonious specifications in practice. We validate the performance of our proposed estimation method via extensive simulation studies for univariate and multivariate simulated datasets. An empirical study shows that the model outperforms other multivariate models in terms of value-at-risk evaluation and portfolio selection performance for a sample of US and Australian stocks. |
Keywords: | Generalised hyperbolic skew Student’s t-distribution; Metropolis-Hastings algorithm; Importance sampling; Particle filter; Particle Gibbs; State space model; Time-varying covariance matrix; Factor model |
JEL: | C11 C32 C53 G32 |
Date: | 2018–08–24 |
URL: | http://d.repec.org/n?u=RePEc:uts:ecowps:49&r=ecm |
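One building block of the model above is a stochastic-volatility process with leverage, i.e. correlated return and volatility shocks. The short simulation below illustrates that ingredient for a single series; the parameter values and the bivariate-normal shock structure are assumptions for illustration only, and the particle Gibbs / efficient importance sampling estimation is beyond a few lines.

    import numpy as np

    rng = np.random.default_rng(1)
    T = 1000
    mu_h, phi, sigma_eta = -1.0, 0.97, 0.15   # illustrative SV parameters
    rho = -0.4                                # leverage: corr(return shock, volatility shock)

    h = np.zeros(T)                           # log-volatility
    y = np.zeros(T)                           # returns
    h[0] = mu_h
    for t in range(T - 1):
        # Negative rho means negative returns tend to raise future volatility
        eps, eta = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]])
        y[t] = np.exp(h[t] / 2) * eps
        h[t + 1] = mu_h + phi * (h[t] - mu_h) + sigma_eta * eta
    y[-1] = np.exp(h[-1] / 2) * rng.standard_normal()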
By: | Stephan Smeekes; Etienne Wijler |
Abstract: | In this paper we propose the Single-equation Penalized Error Correction Selector (SPECS) as an automated estimation procedure for dynamic single-equation models with a large number of potentially (co)integrated variables. By extending the classical single-equation error correction model, SPECS enables the researcher to model large cointegrated datasets without necessitating any form of pre-testing for the order of integration or cointegrating rank. We show that SPECS is able to consistently estimate an appropriate linear combination of the cointegrating vectors that may occur in the underlying DGP, while simultaneously enabling the correct recovery of sparsity patterns in the corresponding parameter space. A simulation study shows strong selective capabilities, as well as superior predictive performance in the context of nowcasting compared to high-dimensional models that ignore cointegration. An empirical application to nowcasting Dutch unemployment rates using Google Trends confirms the strong practical performance of our procedure. |
Date: | 2018–09 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1809.08889&r=ecm |
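As a rough illustration of the single-equation idea, the sketch below runs a plain lasso on an error-correction regression of the differenced target on lagged levels and lagged differences of a simulated cointegrated system. SPECS itself uses a purpose-built penalty with the theoretical guarantees described in the abstract; the off-the-shelf sklearn lasso, the penalty level, and the toy data-generating process here are stand-in assumptions.

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(2)
    T, N = 200, 10
    trend = np.cumsum(rng.standard_normal(T))         # common stochastic trend
    Z = trend[:, None] + rng.standard_normal((T, N))  # cointegrated levels
    y = Z[:, 0]

    dy = np.diff(y)               # delta y_t, t = 1..T-1
    dZ = np.diff(Z, axis=0)       # delta z_t, t = 1..T-1

    # Regressors for t = 2..T-1: lagged levels z_{t-1} and lagged differences delta z_{t-1}
    X = np.hstack([Z[1:-1], dZ[:-1]])
    fit = Lasso(alpha=0.1).fit(X, dy[1:])
    print("selected regressors:", np.flatnonzero(fit.coef_))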
By: | Daouia, Abdelaati; Girard, Stéphane; Stupfler, Gilles |
Abstract: | Risk measures of a financial position are traditionally based on quantiles. Replacing quantiles with their least squares analogues, called expectiles, has recently received increasing attention. The novel expectile-based risk measures satisfy all coherence requirements. We revisit their extreme value estimation for heavy-tailed distributions. First, we estimate the underlying tail index via weighted combinations of top order statistics and asymmetric least squares estimates. The resulting expectHill estimators are then used as the basis for estimating tail expectiles and Expected Shortfall. The asymptotic theory of the proposed estimators is provided, along with numerical simulations and applications to actuarial and financial data. |
Keywords: | Asymmetric least squares; Coherent risk measures; Expected shortfall; Expectile; Extrapolation; Extremes; Heavy tails; Tail index |
JEL: | C13 C14 |
Date: | 2018–09 |
URL: | http://d.repec.org/n?u=RePEc:tse:wpaper:32939&r=ecm |
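The two ingredients combined by the expectHill estimators are (i) the Hill estimator of the tail index, built from top order statistics, and (ii) sample expectiles obtained by asymmetric least squares. The sketch below implements the two ingredients separately on simulated Pareto data; the weighting scheme that combines them and the extrapolation to extreme levels follow the paper and are not reproduced here.

    import numpy as np

    def hill(x, k):
        """Hill estimator of the tail index from the k largest observations."""
        xs = np.sort(x)
        return np.mean(np.log(xs[-k:] / xs[-k - 1]))

    def expectile(x, tau, tol=1e-10):
        """Sample expectile at level tau via the asymmetric-least-squares fixed point."""
        e = x.mean()
        while True:
            w = np.where(x > e, tau, 1 - tau)
            e_new = np.sum(w * x) / np.sum(w)
            if abs(e_new - e) < tol:
                return e_new
            e = e_new

    rng = np.random.default_rng(3)
    x = rng.pareto(3.0, 5000) + 1.0      # Pareto tail with index 3, i.e. gamma = 1/3
    print(hill(x, k=200), expectile(x, tau=0.99))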
By: | Matthew Harding; Carlos Lamarche; M. Hashem Pesaran |
Abstract: | This paper proposes a quantile regression estimator for a heterogeneous panel model with lagged dependent variables and interactive effects. The paper adopts the Common Correlated Effects (CCE) approach proposed by Pesaran (2006) and Chudik and Pesaran (2015) and demonstrates that the extension to the estimation of dynamic quantile regression models is feasible under similar conditions to the ones used in the literature. We establish consistency and derive the asymptotic distribution of the new quantile regression estimator. Monte Carlo studies are carried out to study the small sample behavior of the proposed approach. The evidence shows that the estimator can significantly improve on the performance of existing estimators as long as the time series dimension of the panel is large. We present an application to the evaluation of Time-of-Use pricing using a large randomized control trial. |
Keywords: | common correlated effects, dynamic panel, quantile regression, smart meter, randomized experiment |
JEL: | C21 C31 C33 D12 L94 |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:ces:ceswps:_7211&r=ecm |
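A stripped-down sketch of the common-correlated-effects idea in a quantile setting: cross-sectional averages of the dependent variable and the regressor enter as proxies for the unobserved factor, and the augmented median regression is fit with statsmodels. The pooled specification, the static design and the simulated data are simplifying assumptions; the paper's estimator allows heterogeneous coefficients, lagged dependent variables and the full quantile process.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    N, T = 50, 100
    f = rng.standard_normal(T)                      # unobserved common factor
    lam = rng.standard_normal(N)                    # factor loadings
    x = rng.standard_normal((N, T)) + np.outer(lam, f)
    y = 1.0 * x + np.outer(lam, f) + rng.standard_normal((N, T))   # true slope = 1

    # Cross-sectional averages as factor proxies (the CCE idea)
    ybar, xbar = y.mean(axis=0), x.mean(axis=0)
    df = pd.DataFrame({
        "y": y.ravel(),
        "x": x.ravel(),
        "ybar": np.tile(ybar, N),
        "xbar": np.tile(xbar, N),
    })
    fit = smf.quantreg("y ~ x + ybar + xbar", df).fit(q=0.5)
    print(fit.params)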
By: | Max H. Farrell; Tengyuan Liang; Sanjog Misra |
Abstract: | We study deep neural networks and their use in semiparametric inference. We provide new rates of convergence for deep feedforward neural nets and, because our rates are sufficiently fast (in some cases minimax optimal), prove that semiparametric inference is valid using deep nets for first-step estimation. Our estimation rates and semiparametric inference results are the first in the literature to handle the current standard architecture: fully connected feedforward neural networks (multi-layer perceptrons), with the now-default rectified linear unit (ReLU) activation function and a depth explicitly diverging with the sample size. We discuss other architectures as well, including fixed-width, very deep networks. We establish nonasymptotic bounds for these deep ReLU nets, for both least squares and logistic losses in nonparametric regression. We then apply our theory to develop semiparametric inference, focusing on treatment effects and expected profits for concreteness, and demonstrate their effectiveness with an empirical application to direct mail marketing. Inference in many other semiparametric contexts can be readily obtained. |
Date: | 2018–09 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1809.09953&r=ecm |
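The kind of two-step exercise the paper justifies can be sketched as follows: ReLU networks estimate the outcome regressions and the propensity score in a first step, and a standard doubly-robust (AIPW) average-treatment-effect formula is applied in the second. The small sklearn networks, the simulated design with a true effect of 2, and the absence of cross-fitting are simplifying assumptions made only for illustration.

    import numpy as np
    from sklearn.neural_network import MLPRegressor, MLPClassifier

    rng = np.random.default_rng(5)
    n = 5000
    X = rng.standard_normal((n, 5))
    p = 1 / (1 + np.exp(-X[:, 0]))                         # true propensity score
    D = rng.binomial(1, p)                                 # treatment indicator
    Y = 2.0 * D + X[:, 1] ** 2 + rng.standard_normal(n)    # true ATE = 2

    relu = dict(activation="relu", hidden_layer_sizes=(64, 64), max_iter=2000)
    m1 = MLPRegressor(**relu).fit(X[D == 1], Y[D == 1])    # E[Y | X, D = 1]
    m0 = MLPRegressor(**relu).fit(X[D == 0], Y[D == 0])    # E[Y | X, D = 0]
    ps = MLPClassifier(**relu).fit(X, D).predict_proba(X)[:, 1]
    ps = np.clip(ps, 0.01, 0.99)                           # trim extreme propensities

    mu1, mu0 = m1.predict(X), m0.predict(X)
    aipw = mu1 - mu0 + D * (Y - mu1) / ps - (1 - D) * (Y - mu0) / (1 - ps)
    print("AIPW ATE estimate:", aipw.mean())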
By: | Manabu Asai (Faculty of Economics, Soka University, Japan); Shelton Peiris (School of Mathematics and Statistics, University of Sydney, Australia); Michael McAleer (Department of Quantitative Finance, National Tsing Hua University, Taiwan; Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam, The Netherlands; Department of Quantitative Economics, Complutense University of Madrid, Spain; and Institute of Advanced Sciences, Yokohama National University, Japan); David E. Allen (School of Mathematics and Statistics, University of Sydney, Australia; Department of Finance, Asia University, Taiwan; and School of Business and Law, Edith Cowan University, Western Australia) |
Abstract: | Recent developments in econometric methods enable estimation and testing of general long memory processes, which include the general Gegenbauer process. This paper considers the error correction model for a vector general long memory process, which encompasses the vector autoregressive fractionally-integrated moving average and general Gegenbauer processes. We modify the tests for unit roots and cointegration, based on the concept of heterogeneous autoregression. The Monte Carlo simulations show that the finite sample properties of the modified tests are satisfactory, while the conventional tests suffer from size distortion. Empirical results for interest rate series for the U.S.A. and Australia indicate that: (1) the modified unit root test detects unit roots for all series, (2) after differencing, all series favour the general Gegenbauer process, (3) the modified test for cointegration finds only two cointegrating vectors, and (4) the zero interest rate policy in the U.S.A. has no effect on the cointegrating vector for the two countries. |
Keywords: | Long Memory Processes; Gegenbauer Process; Dickey-Fuller Tests; Cointegration; Differencing; Interest Rates. |
JEL: | C22 C32 C51 |
Date: | 2018–09 |
URL: | http://d.repec.org/n?u=RePEc:ucm:doicae:1822&r=ecm |
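A general Gegenbauer process can be simulated by truncating the MA(infinity) expansion of (1 - 2uL + L^2)^(-d), whose weights are Gegenbauer polynomials obeying a simple recursion. The sketch below does exactly that for illustrative values of d and u; the truncation lag and parameter values are assumptions, and the paper's modified unit root and cointegration tests are not implemented here.

    import numpy as np

    def gegenbauer_weights(d, u, m):
        """MA weights of (1 - 2u*L + L^2)^(-d), truncated at lag m,
        via the standard Gegenbauer polynomial recursion."""
        c = np.zeros(m + 1)
        c[0] = 1.0
        if m >= 1:
            c[1] = 2.0 * d * u
        for j in range(2, m + 1):
            c[j] = (2.0 * u * (j + d - 1) * c[j - 1] - (j + 2 * d - 2) * c[j - 2]) / j
        return c

    rng = np.random.default_rng(6)
    T, m = 500, 1000
    d, u = 0.3, 0.8                      # long-memory parameter and cosine of the Gegenbauer frequency
    psi = gegenbauer_weights(d, u, m)
    eps = rng.standard_normal(T + m)
    y = np.convolve(eps, psi)[m : m + T]   # y_t = sum_j psi_j * eps_{t-j}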
By: | Yang, Cynthia Fan |
Abstract: | This paper considers panel data models with cross-sectional dependence arising from both spatial autocorrelation and unobserved common factors. It derives conditions for model identification and proposes estimation methods that employ cross-sectional averages as factor proxies, including 2SLS, Best 2SLS, and GMM estimators. The proposed estimators are robust to unknown heteroskedasticity and serial correlation in the disturbances, do not require estimating the number of unknown factors, and are computationally tractable. The paper establishes the asymptotic distributions of these estimators and compares their consistency and efficiency properties. Extensive Monte Carlo experiments lend support to the theoretical findings and demonstrate the satisfactory finite sample performance of the proposed estimators. The empirical section of the paper finds strong evidence of spatial dependence of real house price changes across 377 Metropolitan Statistical Areas in the US from 1975Q1 to 2014Q4. The results also reveal that population and income growth have significantly positive direct and spillover effects on house price changes. These findings are robust to different specifications of the spatial weights matrix constructed based on distance, migration flows, and pairwise correlations. |
Keywords: | Cross-sectional dependence, Common factors, Spatial panel data models, Generalized method of moments, House prices |
JEL: | C13 C23 R21 R31 |
Date: | 2017–11–01 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:89032&r=ecm |
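A heavily simplified sketch of the estimation idea: the spatial lag of the dependent variable is instrumented by spatial lags of the exogenous regressor, while cross-sectional averages serve as factor proxies. The circular weights matrix, the simulated design, and the pooled 2SLS with homogeneous coefficients on the averages are illustrative assumptions; the paper's Best 2SLS and GMM refinements and its identification conditions are not reproduced.

    import numpy as np

    rng = np.random.default_rng(7)
    N, T = 50, 40
    # Row-normalised nearest-neighbour spatial weights matrix (illustrative)
    W = np.zeros((N, N))
    for i in range(N):
        W[i, (i - 1) % N] = W[i, (i + 1) % N] = 0.5

    f = rng.standard_normal(T)                    # unobserved common factor
    lam = rng.standard_normal(N)                  # factor loadings
    rho, beta = 0.4, 1.0
    x = rng.standard_normal((N, T)) + 0.5 * np.outer(lam, f)
    # y_t solves y_t = rho*W y_t + beta*x_t + lam*f_t + e_t each period
    A = np.linalg.inv(np.eye(N) - rho * W)
    y = A @ (beta * x + np.outer(lam, f) + rng.standard_normal((N, T)))

    # Stack the panel; cross-sectional averages proxy for the factor
    Wy = W @ y
    ybar, xbar = y.mean(axis=0), x.mean(axis=0)
    Z = np.column_stack([Wy.ravel(), x.ravel(), np.tile(ybar, N), np.tile(xbar, N)])
    # Instruments: spatial lags of x plus the included exogenous regressors
    H = np.column_stack([(W @ x).ravel(), (W @ W @ x).ravel(), x.ravel(),
                         np.tile(ybar, N), np.tile(xbar, N)])
    Zhat = H @ np.linalg.solve(H.T @ H, H.T @ Z)               # first-stage projection
    coef = np.linalg.solve(Zhat.T @ Z, Zhat.T @ y.ravel())     # 2SLS
    print("rho, beta estimates:", coef[:2])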
By: | Laurens Cherchye; Thomas Demuynck; Bram De Rock; Marijn Verschelde |
Abstract: | We propose a novel nonparametric method for structural production analysis in the presence of unobserved heterogeneity in productivity. We assume cost minimization as the firms' behavioral objective, and we model productivity as a factor on which firms condition the demand for the observed inputs. Our model can equivalently be represented in terms of endogenously chosen latent input costs that guarantee data consistency with our behavioral assumption, and we argue that this avoids a simultaneity bias in a natural way. Our Monte Carlo simulation and empirical application to Belgian manufacturing data show that our method allows for drawing strong and robust conclusions, despite its nonparametric orientation. For example, our results pinpoint a clear link between international exposure and productivity and show that primary inputs are substituted for materials rather than for productivity enhancement. |
Keywords: | productivity, unobserved heterogeneity, simultaneity bias, nonparametric production analysis, cost minimisation, manufacturing |
Date: | 2018–09 |
URL: | http://d.repec.org/n?u=RePEc:eca:wpaper:2013/277180&r=ecm |
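The nonparametric baseline that the paper departs from is the classical revealed-preference test of cost minimisation without unobserved productivity: an observation's cost at its own prices should not exceed the cost, at those same prices, of any other observed input bundle producing at least as much output. The check below implements that baseline condition on simulated data; the paper's latent, endogenously chosen input costs, which let heterogeneous productivity rationalise the data, are not reproduced here.

    import numpy as np

    def cost_min_violations(y, X, W):
        """Pairs (t, s) violating the baseline cost-minimisation condition:
        w_t'x_t must not exceed w_t'x_s whenever y_s >= y_t."""
        viol = []
        for t in range(len(y)):
            for s in range(len(y)):
                if y[s] >= y[t] and W[t] @ X[t] > W[t] @ X[s] + 1e-9:
                    viol.append((t, s))
        return viol

    rng = np.random.default_rng(8)
    T, n_inputs = 30, 3
    X = rng.uniform(1, 5, (T, n_inputs))                    # observed input quantities
    W = rng.uniform(1, 2, (T, n_inputs))                    # observed input prices
    y = X.prod(axis=1) ** 0.3 * rng.uniform(0.8, 1.2, T)    # output with "productivity" noise
    print(len(cost_min_violations(y, X, W)), "violating pairs")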
By: | Lillestøl, Jostein (Dept. of Business and Management Science, Norwegian School of Economics) |
Abstract: | This report deals with the analysis of data used by tax officers to support their claim of tax fraud at a pizzeria. The possibilities of embezzlement under study are overreporting of take-away sales and underreporting of cash payments. Several modelling approaches are explored, ranging from simple well-known methods to presumably more precise tools. More specifically, we contrast common methods based on normal assumptions and models based on Gamma-assumptions. For the latter, both maximum likelihood and Bayesian approaches are covered. Several criteria for the choice of method in practice are discussed, among them, how easy the method is to understand, justify and communicate to the parties. Some dilemmas present themselves: the choice of statistical method, its role in building the evidence, the choice of risk factor, the application of legal principles like “clear and convincing evidence” and “beyond reasonable doubt”. The insights gained may be useful for both tax officers and defenders of the taxpayer, as well as for expert witnesses. |
Keywords: | Gamma-Beta analysis; Bayesian Gamma-analysis; Risk analysis |
JEL: | C00 C10 C11 C13 |
Date: | 2018–09–25 |
URL: | http://d.repec.org/n?u=RePEc:hhs:nhhfms:2018_012&r=ecm |
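The practical point of contrasting normal- and Gamma-based models can be illustrated in a few lines: the two distributional assumptions can imply very different tail probabilities, and hence very different strength of evidence, from the same data. The daily cash-sales shares below are simulated (the report's data are not reproduced), and the 0.6 threshold is an arbitrary illustrative cut-off.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    # Hypothetical daily cash-sales shares, assumed here for illustration
    cash_share = rng.gamma(shape=8.0, scale=0.05, size=250)

    # Normal-based estimate of P(share > 0.6)
    mu, sd = cash_share.mean(), cash_share.std(ddof=1)
    p_normal = stats.norm.sf(0.6, loc=mu, scale=sd)

    # Gamma-based estimate (maximum likelihood fit with location fixed at 0)
    a, loc, scale = stats.gamma.fit(cash_share, floc=0)
    p_gamma = stats.gamma.sf(0.6, a, loc=loc, scale=scale)

    print(f"P(share > 0.6): normal {p_normal:.4f}, gamma {p_gamma:.4f}")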