New Economics Papers on Econometrics
By: | Gergely Akos Ganics (Banco de España) |
Abstract: | How should researchers combine predictive densities to improve their forecasts? I propose consistent estimators of weights which deliver density forecast combinations approximating the true predictive density, conditional on the researcher’s information set. Monte Carlo simulations confirm that the proposed methods work well for sample sizes of practical interest. In an empirical example of forecasting monthly US industrial production, I demonstrate that the estimator delivers density forecasts which are superior to well-known benchmarks, such as the equal weights scheme. Specifically, I show that housing permits had valuable predictive power before and after the Great Recession. Furthermore, stock returns and corporate bond spreads proved to be useful predictors during the recent crisis, suggesting that financial variables help with density forecasting in a highly leveraged economy.
Keywords: | density forecasts, forecast combinations, probability integral transform, Kolmogorov-Smirnov, Cramer-von Mises, Anderson-Darling, Kullback-Leibler information criterion |
JEL: | C13 C22 C53 |
Date: | 2017–12 |
URL: | http://d.repec.org/n?u=RePEc:bde:wpaper:1751&r=ecm |
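A minimal sketch of the PIT-based idea: pick the combination weight that makes the probability integral transforms (PITs) of the combined forecast as close to uniform as possible, here in the Kolmogorov-Smirnov sense. The two normal candidate densities, the grid search, and all names are illustrative assumptions, not the paper's exact estimator.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = rng.normal(0.5, 1.0, size=500)           # realized values; truth is N(0.5, 1)

# Two candidate predictive densities, both misspecified in the mean
cdf1 = lambda x: stats.norm.cdf(x, loc=0.0, scale=1.0)
cdf2 = lambda x: stats.norm.cdf(x, loc=1.0, scale=1.0)

def ks_uniform(w):
    """KS distance of the combined-forecast PITs from U(0, 1)."""
    pit = w * cdf1(y) + (1 - w) * cdf2(y)    # CDF of the mixture density
    return stats.kstest(pit, "uniform").statistic

grid = np.linspace(0.0, 1.0, 101)
w_hat = grid[np.argmin([ks_uniform(w) for w in grid])]
print(f"estimated weight on model 1: {w_hat:.2f}")   # roughly 0.5 by symmetry here

The same grid search works with the Cramer-von Mises or Anderson-Darling statistics from the keywords; only the objective changes.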
By: | Pierre Nguimkeu; Augustine Denteh; Rusty Tchernis |
Abstract: | Participation in social programs is often misreported in survey data, complicating the estimation of the effects of those programs. In this paper, we propose a model to estimate treatment effects under endogenous participation and endogenous misreporting. We show that failure to account for endogenous misreporting can result in the estimate of the treatment effect having an opposite sign from the true effect. We present an expression for the asymptotic bias of both OLS and IV estimators and discuss the conditions under which sign reversal may occur. We provide a method for eliminating this bias when researchers have access to information related to both participation and misreporting. We establish the consistency and asymptotic normality of our estimator and assess its small sample performance through Monte Carlo simulations. An empirical example is given to illustrate the proposed method. |
JEL: | C35 C51 I28 |
Date: | 2017–12 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:24117&r=ecm |
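A toy Monte Carlo (my own construction, not the paper's model) showing how endogenous misreporting of participation can reverse the sign of a naive treatment-effect comparison; the true effect is +1 throughout.

import numpy as np

rng = np.random.default_rng(1)
n = 100_000
d = rng.integers(0, 2, n)             # true participation
u = rng.normal(size=n)                # unobservable
y = 1.0 * d + u                       # true treatment effect: +1

# Endogenous misreporting: participants with good outcomes deny participation
r = d * (u <= 0)                      # reported participation

print(f"using true participation:     {y[d == 1].mean() - y[d == 0].mean():+.2f}")  # about +1.0
print(f"using reported participation: {y[r == 1].mean() - y[r == 0].mean():+.2f}")  # negative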
By: | Shih-Kang Chao; Wolfgang K. Härdle; Chen Huang |
Abstract: | More and more data are observed in the form of curves. Numerous applications in finance, neuroeconomics, demographics, and weather and climate analysis make it necessary to extract common patterns and prompt joint modelling of individual curve variation. Such joint variation analysis has focused on fluctuations around a mean curve, a statistical task that can be solved via functional PCA. In a variety of questions concerning the above applications, one is more interested in the tails, calling for studies of tail event curves (TEC). With increasing dimension of the curves and complexity of the covariates, however, one faces numerical problems and has to address sparsity-related issues. Here the idea of FActorisable Sparse Tail Event Curves (FASTEC) via multivariate asymmetric least squares regression (expectile regression) in a high-dimensional framework is proposed. Expectile regression captures the tail moments globally, and the smooth loss function improves the convergence rate of the iterative estimation algorithm compared with quantile regression. The necessary penalization is done via the nuclear norm. Finite sample oracle properties of the estimator associated with the asymmetric squared error loss and the nuclear norm regularizer are studied formally in this paper. As an empirical illustration, the FASTEC technique is applied to fMRI data to see whether an individual’s risk perception can be recovered from brain activities. Results show that factor loadings over different tail levels can be employed to predict individuals’ risk attitudes.
Keywords: | high-dimensional M-estimator, nuclear norm regularizer, factorization, expectile regression, fMRI, risk perception, multivariate functional data
JEL: | C38 C55 C61 C91 D87
Date: | 2016–08 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2016-058&r=ecm |
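A compact sketch of the FASTEC ingredients under heavy simplification: multivariate expectile regression (asymmetric squared loss) penalized by the nuclear norm, solved by proximal gradient with singular-value soft-thresholding. The step size, penalty level, and rank-one design are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)
n, p, q, tau, lam, step = 200, 10, 8, 0.9, 0.5, 0.01
X = rng.normal(size=(n, p))
G_true = np.outer(rng.normal(size=p), rng.normal(size=q))    # rank-one coefficient matrix
Y = X @ G_true + rng.normal(size=(n, q))

def expectile_grad(G):
    R = Y - X @ G                                    # residual matrix
    W = np.where(R < 0, 1 - tau, tau)                # asymmetric weights of the expectile loss
    return -(2.0 / n) * X.T @ (W * R)

def prox_nuclear(G, t):
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt  # soft-threshold the singular values

G = np.zeros((p, q))
for _ in range(2000):                                # proximal gradient iterations
    G = prox_nuclear(G - step * expectile_grad(G), step * lam)
print("estimated rank:", np.linalg.matrix_rank(G, tol=1e-6))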
By: | Ryo Okui; Wendun Wang |
Abstract: | This paper develops a new model and a new estimation procedure for panel data that allow us to identify heterogeneous structural breaks. In many applications, there are good reasons to suspect that structural breaks occur at different time points across individual units and the sizes of the breaks differ too. We model individual heterogeneity using a grouped pattern such that individuals within a given group share the same regression coefficients. For each group, we allow common structural breaks in the coefficients, while the number of breaks, the break points, and the size of breaks can differ across groups. To estimate the model, we develop a hybrid procedure of the grouped fixed effects approach and adaptive group fused Lasso (least absolute shrinkage and selection operator). We show that our method can consistently identify the latent group structure, detect structural breaks, and estimate the regression parameters. Monte Carlo results demonstrate a good performance of the proposed method in finite samples. We apply our method to two cross-country empirical studies and illustrate the importance of taking heterogeneous structural breaks into account. |
Date: | 2018–01 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1801.04672&r=ecm |
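A stylized fragment of the break-detection problem (one group, one break in the mean), estimated here by plain least squares over candidate dates rather than the paper's adaptive group fused Lasso; all numbers are illustrative.

import numpy as np

rng = np.random.default_rng(3)
T, true_break = 200, 120
y = np.concatenate([rng.normal(0.0, 1, true_break),
                    rng.normal(1.5, 1, T - true_break)])

def ssr_at(k):
    """Sum of squared residuals with separate means before and after date k."""
    return ((y[:k] - y[:k].mean())**2).sum() + ((y[k:] - y[k:].mean())**2).sum()

k_hat = min(range(10, T - 10), key=ssr_at)     # trim the sample ends
print("estimated break date:", k_hat)          # close to 120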
By: | Petra Burdejová; Wolfgang K. Härdle
Abstract: | High-frequency data provide a wealth of information for forecasting and help to quantify and guard against future risk based on extremes. This tail behaviour is very often driven by exogenous components and may be modelled conditional on other variables. However, many of these phenomena are observed over time, exhibiting non-trivial dynamics and dependencies. We propose a functional dynamic factor model to study the dynamics of expectile curves. The complexity of the model and the number of dependent variables are reduced by lasso penalization. The functional factors serve as a low-dimensional representation of the conditional tail event, while the time variation is captured by the factor loadings. We illustrate the model with an application to climatology, where daily data on temperature, rainfall, and wind strength are available over many years.
Keywords: | factor model, functional data, expectiles, extremes. |
JEL: | C14 C38 C61 Q54 |
Date: | 2017–08 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2017-027&r=ecm |
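One building block, sketched under illustrative assumptions (simulated daily temperatures): the tau-expectile curve computed by the standard fixed-point iteration, with principal components of the yearly curves as a simple stand-in for the functional factors.

import numpy as np

rng = np.random.default_rng(4)
days = np.arange(365)
temps = 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 3, size=(40, 365))  # 40 years

def expectile(x, tau=0.95, iters=100):
    e = x.mean()
    for _ in range(iters):                     # fixed-point iteration
        w = np.where(x > e, tau, 1 - tau)
        e = np.sum(w * x) / np.sum(w)
    return e

curve = np.array([expectile(temps[:, d]) for d in range(365)])   # tail event curve
print("peak of the 0.95-expectile curve:", curve.max().round(1))

U, s, Vt = np.linalg.svd(temps - temps.mean(axis=0), full_matrices=False)
print("variation share of the first factor:", (s[0]**2 / (s**2).sum()).round(2))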
By: | Jungyoon Lee; Peter M Robinson |
Abstract: | We consider adaptive tests and estimates which are asymptotically efficient in the presence of an unknown, nonparametric distributional form in pure spatial models. A novel adaptive Lagrange Multiplier testing procedure for lack of spatial dependence is proposed and extended to linear regression with spatially correlated errors. Feasibility of adaptive estimation is verified, and its efficiency improvement over Gaussian pseudo maximum likelihood is shown to be either less than, or more than, that for models with explanatory variables. The paper covers a general class of semiparametric spatial models allowing nonlinearity in the parameters and/or the weight matrix, in addition to the unknown distribution.
Keywords: | Efficient test, adaptive estimation, spatial models |
JEL: | C12 C13 C14 C21 |
Date: | 2018–01 |
URL: | http://d.repec.org/n?u=RePEc:cep:stiecm:596&r=ecm |
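For reference, the classical Gaussian Lagrange Multiplier test for lack of spatial dependence that the paper adapts to unknown distributions; the "two nearest neighbours on a circle" weight matrix is an illustrative assumption.

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 100
W = np.zeros((n, n))
for i in range(n):                               # two nearest neighbours on a circle
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5  # row-standardized weights

e = rng.normal(size=n)                           # residuals; H0: no spatial dependence
num = (e @ W @ e) / (e @ e / n)
lm = num**2 / np.trace(W.T @ W + W @ W)          # LM statistic, chi-squared(1) under H0
print(f"LM = {lm:.2f}, p-value = {1 - stats.chi2.cdf(lm, df=1):.2f}")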
By: | Sarlota Smutna (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nabrezi 6, 111 01 Prague 1, Czech Republic; Charles University Environment Centre, José Martího 407/2, 162 00, Prague, Czech Republic); Milan Scasny (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nabrezi 6, 111 01 Prague 1, Czech Republic; Charles University Environment Centre, José Martího 407/2, 162 00, Prague, Czech Republic) |
Abstract: | This paper deals with the problem of censored data in household demand analysis when budget survey data are used. Micro-data, in contrast with aggregated data, usually contain a significant share of zero observations (no consumption recorded), which leads to censoring and a potential selectivity problem, resulting in biased estimates if an inappropriate econometric model is used. We review the treatment methods available in the literature that control for the selectivity problem: the Tobit model, the Two-part model, the Double-hurdle model, and the Sample selection model with three different estimators (FIML, Heckman two-step, and Cosslett’s semi-parametric estimator). In an empirical example we show, first, that treatment methods are necessary even for low levels of censoring and, second, that the choice of treatment method matters even across products within the same dataset. We compare the performance of the above single-equation demand models together with OLS. Household demand is analysed for 13 food products with widely varying levels of censoring. We find that the Heckman two-step procedure and Cosslett’s semi-parametric estimator perform best among all examined techniques in our case and that these two estimators yield similar estimates of income and own-price elasticities. The Two-part model performs comparably, but its estimation results differ from those of the Heckman two-step and Cosslett estimators. The OLS estimates are biased, and both OLS and the Tobit model perform poorly.
Keywords: | demand analysis, censoring, selectivity, Heckman two-step estimator |
JEL: | C24 D12 R22 |
Date: | 2017–09 |
URL: | http://d.repec.org/n?u=RePEc:fau:wpaper:wp2017_21&r=ecm |
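A compact sketch of the Heckman two-step estimator that comes out best in the paper: a probit for the decision to purchase, then OLS augmented with the inverse Mills ratio on the purchasing households. The simulated data and variable names are illustrative.

import numpy as np
from scipy.stats import norm
from statsmodels.discrete.discrete_model import Probit

rng = np.random.default_rng(6)
n = 5000
z = rng.normal(size=n)                          # selection covariate (exclusion restriction)
x = rng.normal(size=n)                          # demand covariate
u, v = rng.multivariate_normal([0, 0], [[1, .6], [.6, 1]], n).T
buy = (0.5 + z + v > 0)                         # purchase decision
expend = np.where(buy, 2 + x + u, np.nan)       # expenditure observed only if buying

# Step 1: probit for selection, then the inverse Mills ratio
Zmat = np.column_stack([np.ones(n), z])
xb = Zmat @ Probit(buy.astype(float), Zmat).fit(disp=0).params
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: OLS on the selected sample, augmented with the Mills ratio
Xmat = np.column_stack([np.ones(n), x, imr])[buy]
beta, *_ = np.linalg.lstsq(Xmat, expend[buy], rcond=None)
print("slope on x (true value 1.0):", beta[1].round(2))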
By: | Josep Lluís Carrion-i-Silvestre (AQR-IREA Research Group, Department of Econometrics, Statistics, and Spanish Economy, University of Barcelona. Av. Diagonal, 690. 08034 Barcelona.); María Dolores Gadea (Department of Applied Economics, University of Zaragoza. Gran Vía, 4, 50005 Zaragoza (Spain).); Antonio Montañés (Department of Applied Economics, University of Zaragoza. Gran Vía, 4, 50005 Zaragoza (Spain).) |
Abstract: | The paper investigates the estimation bias of autoregressive models for bounded stochastic processes and the performance of the standard procedures in the literature that aim to correct this bias. It is shown that, in some cases, the bounded nature of the stochastic process worsens the estimation bias, which motivates the design of bound-specific bias correction methods. The paper focuses on two popular autoregressive estimation bias correction procedures, which are extended to cover bounded stochastic processes. A finite sample performance analysis of the new proposals is carried out using Monte Carlo simulations, which reveals that accounting for the bounded nature of the stochastic process leads to improvements in the estimation of autoregressive models. Finally, an illustration is given using the current account balance of several developed countries, for which shock persistence measures are computed.
Keywords: | Bounded stochastic processes, estimation bias, unit root tests, current account balance
JEL: | C22 C32 E32 Q43
Date: | 2017–11 |
URL: | http://d.repec.org/n?u=RePEc:ira:wpaper:201719&r=ecm |
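A small illustration of the underlying problem: the downward small-sample bias of the AR(1) coefficient estimate and Kendall's classical correction rho_hat + (1 + 3*rho_hat)/T. The paper's bound-specific corrections go further; the hard bounds imposed below are an illustrative assumption.

import numpy as np

rng = np.random.default_rng(7)
T, rho, bound, reps = 100, 0.9, 2.0, 2000
est = np.empty(reps)
for r in range(reps):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = np.clip(rho * y[t - 1] + rng.normal(), -bound, bound)  # bounded AR(1)
    yd = y - y.mean()
    est[r] = (yd[:-1] @ yd[1:]) / (yd[:-1] @ yd[:-1])   # OLS estimate of rho

print("mean OLS estimate:      ", est.mean().round(3))              # well below 0.9
print("mean Kendall-corrected: ", (est + (1 + 3 * est) / T).mean().round(3))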
By: | Michael P. Cameron (University of Waikato); Jacques Poot (University of Waikato) |
Abstract: | We demonstrate that the conventional OLS and fixed effects estimators of gravity models of migration are biased, and that the interpretation of coefficients in the fixed effects model is typically incorrect. We present a new best linear unbiased estimator for gravity models of migration. |
Keywords: | gross migration flows; gravity model; New Zealand |
JEL: | O15 R23 |
Date: | 2018–01–15 |
URL: | http://d.repec.org/n?u=RePEc:wai:econwp:18/01&r=ecm |
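For context only, the conventional estimator the abstract criticizes (not the authors' new one, which is not spelled out here): a log-linear gravity regression of migration flows on log distance with origin and destination fixed effects, on synthetic data.

import numpy as np

rng = np.random.default_rng(8)
R = 12                                            # regions
orig, dest = np.meshgrid(np.arange(R), np.arange(R), indexing="ij")
o, d = orig[orig != dest], dest[orig != dest]     # drop within-region flows
log_dist = np.log(1 + np.abs(o - d))
log_flow = 1.0 - 0.8 * log_dist + rng.normal(0, .3, o.size)        # synthetic flows

X = np.column_stack([log_dist,
                     (o[:, None] == np.arange(R)).astype(float),       # origin FE
                     (d[:, None] == np.arange(1, R)).astype(float)])   # destination FE
beta, *_ = np.linalg.lstsq(X, log_flow, rcond=None)
print("distance elasticity (true -0.8):", beta[0].round(2))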
By: | Yingxing Li; Chen Huang; Wolfgang Karl Härdle |
Abstract: | This paper considers a fast and effective algorithm for conducting functional principal component analysis with multivariate factors. Compared with the univariate case, our approach can be more powerful in revealing spatial connections or extracting important features in images. To facilitate fast computation, we connect singular value decomposition with penalized smoothing and avoid estimating a huge-dimensional covariance operator. Under regularity assumptions, the results indicate that we may enjoy the optimal convergence rate by exploiting the smoothness inherent to functional objects. We apply our method to the analysis of brain image data. The extracted factors provide excellent recovery of the risk-related regions of interest in the human brain, and the estimated loadings are very informative in revealing individual risk attitudes.
Keywords: | Principal Component Analysis; Penalized Smoothing; Asymptotics; functional Magnetic Resonance Imaging (fMRI)
JEL: | C13 C20 E37 |
Date: | 2017–08 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2017-024&r=ecm |
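A sketch of the computational shortcut described above, under my simplifications: smooth each observed curve with a second-difference roughness penalty, then read factors and loadings off an SVD of the smoothed data matrix, with no large covariance operator. The penalty weight and data are illustrative.

import numpy as np

rng = np.random.default_rng(9)
n_curves, m, lam = 50, 200, 50.0
t = np.linspace(0, 1, m)
Y = (rng.normal(size=(n_curves, 1)) * np.sin(2 * np.pi * t)   # one true factor
     + rng.normal(0, .5, size=(n_curves, m)))                 # observation noise

D = np.diff(np.eye(m), n=2, axis=0)                           # second-difference operator
S = np.linalg.solve(np.eye(m) + lam * D.T @ D, np.eye(m))     # smoother matrix (symmetric)
Y_smooth = Y @ S                                              # penalized smoothing, row-wise

U, s, Vt = np.linalg.svd(Y_smooth - Y_smooth.mean(0), full_matrices=False)
factor = Vt[0]                                                # first functional component
loadings = U[:, 0] * s[0]                                     # individual scores
print("variance share of the first component:", (s[0]**2 / (s**2).sum()).round(2))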
By: | Paccagnini, Alessia |
Abstract: | Dynamic Stochastic General Equilibrium (DSGE) models are the main tool used in academia and in central banks to evaluate the business cycle for policy and forecasting analyses. Despite recent advances in improving the fit of DSGE models to the data, the misspecification issue remains. The aim of this survey is to shed light on the different forms of misspecification in DSGE modeling and on how the researcher can identify their sources. In addition, some remedies for dealing with misspecification are discussed.
Keywords: | DSGE Models, Misspecification, Estimation, Bayesian Estimation |
JEL: | C1 C11 C5 E0 E5 E50 E58 E60 |
Date: | 2017–11–24 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:82914&r=ecm |
By: | Xinjue Li; Lenka Zbonakova; Wolfgang Karl Härdle |
Abstract: | In the present paper we propose a new method, the Penalized Adaptive Method (PAM), for data-driven detection of structural changes in sparse linear models. The method is able to locate the longest homogeneous intervals over the data sample and simultaneously choose the most appropriate variables with the help of penalized regression models. The method is simple yet flexible and can be safely applied in high-dimensional cases with different sources of parameter change. Compared with the adaptive method in linear models, its combination with dimension reduction yields a method which selects the relevant variables and detects structural breaks while steadily reducing the forecast error in high-dimensional data. When applying PAM to bond risk premia modelling, the locally selected variables and their estimated coefficient loadings identified in the longest stable subsamples over time align with the true structural changes observed throughout the market.
Keywords: | SCAD penalty, propagation-separation, adaptive window choice, multiplier bootstrap, bond risk premia |
JEL: | C13 C20 E37 |
Date: | 2017–08 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2017-023&r=ecm |
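The SCAD building block named in the keywords, as a univariate sketch: the penalty of Fan and Li (2001) and its closed-form thresholding rule, which penalized-regression estimators of this kind apply coordinate-wise; a = 3.7 is the customary choice.

import numpy as np

def scad_penalty(theta, lam, a=3.7):
    t = np.abs(theta)
    return np.where(t <= lam, lam * t,
           np.where(t <= a * lam,
                    -(t**2 - 2 * a * lam * t + lam**2) / (2 * (a - 1)),
                    (a + 1) * lam**2 / 2))

def scad_threshold(z, lam, a=3.7):
    """Minimizer of 0.5*(z - theta)**2 + scad_penalty(theta, lam, a)."""
    t = np.abs(z)
    soft = np.sign(z) * np.maximum(t - lam, 0)                # soft-thresholding zone
    mid = ((a - 1) * z - np.sign(z) * a * lam) / (a - 2)      # intermediate zone
    return np.where(t <= 2 * lam, soft, np.where(t <= a * lam, mid, z))

z = np.linspace(-4, 4, 9)
print(scad_threshold(z, lam=1.0))   # small inputs shrunk to zero, large ones left unbiased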