Operations Research
http://lists.repec.org/mailman/listinfo/nep-ore
2015-11-21
Higher-order statistics for DSGE models
http://d.repec.org/n?u=RePEc:cqe:wpaper:4315&r=ore
This note derives closed-form expressions for unconditional moments, cumulants, and polyspectra of order higher than two for linear and nonlinear (pruned) DSGE models. The procedures are demonstrated by means of the Smets and Wouters (2007) model (first-order approximation), the An and Schorfheide (2007) model (second-order approximation), and the canonical neoclassical growth model (third-order approximation). Both the Gaussian and Student's t-distribution are considered as the underlying stochastic process. Useful matrix tools and computational aspects are also discussed.
Willi Mutschler
higher-order moments, cumulants, polyspectra, nonlinear DSGE, pruning
2015-11
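The closed-form flavour of the note can be illustrated in the simplest scalar case. For a linear AR(1), x_t = a*x_{t-1} + e_t, cumulants are additive over the MA(infinity) representation, so the third unconditional cumulant is kappa_3(x) = kappa_3(e) / (1 - a^3). The stdlib-only Python sketch below is a toy check of this one formula, not the note's matrix-based procedures; all names and parameter values are mine. It uses skewed, shifted-exponential errors, whose third cumulant is 2:

```python
import random

def simulate_ar1(a, n, seed=0):
    """Simulate x_t = a*x_{t-1} + e_t with skewed errors e = Exp(1) - 1."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        e = rng.expovariate(1.0) - 1.0   # mean 0, variance 1, third cumulant 2
        x = a * x + e
        out.append(x)
    return out

def third_cumulant(xs):
    """Sample third central moment = estimator of the third cumulant."""
    m = sum(xs) / len(xs)
    return sum((v - m) ** 3 for v in xs) / len(xs)

a = 0.5
xs = simulate_ar1(a, 200_000)
k3_theory = 2.0 / (1.0 - a ** 3)     # closed form: kappa_3(e) / (1 - a^3)
k3_sample = third_cumulant(xs)
print(k3_theory, k3_sample)
```

With a = 0.5 and kappa_3(e) = 2, the closed form gives about 2.29, and the simulated estimate should land close to it.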
Dissecting Models' Forecasting Performance
http://d.repec.org/n?u=RePEc:kof:wpskof:15-397&r=ore
In this paper we suggest an approach to comparing models' forecasting performance in unstable environments. Our approach combines the Cumulated Sum of Squared Forecast Error Differential (CSSFED), suggested earlier in Welch and Goyal (2008), with the Bayesian change point analysis of Barry and Hartigan (1993). The latter methodology provides a formal statistical analysis of the CSSFED time series, which turns out to be a powerful graphical tool for tracking how the relative forecasting performance of competing models evolves over time. We illustrate the suggested approach using forecasts of the GDP growth rate in Switzerland.
Boriss Siliverstovs
Forecasting, Forecast Evaluation, Change Point Detection, Bayesian Estimation
2015-11
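The CSSFED statistic at the heart of the paper is just a running sum of squared-forecast-error differentials, CSSFED_t = sum_{s<=t} (e_bench,s^2 - e_alt,s^2): an upward-drifting path signals that the alternative model is beating the benchmark over that stretch. A minimal Python sketch (toy data and names are mine; the Bayesian change-point layer is not included):

```python
from itertools import accumulate

def cssfed(errors_bench, errors_alt):
    """Cumulated Sum of Squared Forecast Error Differential:
    CSSFED_t = sum_{s<=t} (e_bench_s^2 - e_alt_s^2).
    An upward-drifting path means the alternative model is winning."""
    diffs = (eb ** 2 - ea ** 2 for eb, ea in zip(errors_bench, errors_alt))
    return list(accumulate(diffs))

# toy forecast errors: the alternative becomes more accurate from t = 3 on
bench = [1.0, -1.0, 0.5, 2.0, -2.0, 1.5]
alt   = [1.0, -1.2, 0.5, 0.5, -0.5, 0.2]
path = cssfed(bench, alt)
print(path)
```

Plotting such a path against time is exactly the graphical device the change-point analysis is then applied to.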
A data-cleaning augmented Kalman filter for robust estimation of state space models
http://d.repec.org/n?u=RePEc:zbw:hohdps:132015&r=ore
This article presents a robust augmented Kalman filter that extends the data-cleaning filter (Masreliez and Martin, 1977) to the general state space model featuring nonstationary and regression effects. The robust filter shrinks the observations towards their one-step-ahead prediction based on the past, by bounding the effect of the information carried by a new observation according to an influence function. When maximum likelihood estimation is carried out on the replacement data, an M-type estimator is obtained. We investigate the performance of the robust AKF in two applications, using as the modeling framework the basic structural time series model, a popular unobserved components model for the analysis of seasonal time series. First, a Monte Carlo experiment is conducted to evaluate the comparative accuracy of the proposed method for estimating the variance parameters. Second, the method is applied in a forecasting context to a large set of European trade statistics series.
Marczak, Martyna
Proietti, Tommaso
Grassi, Stefano
robust filtering, augmented Kalman filter, structural time series model, additive outlier, innovation outlier
2015
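The data-cleaning idea can be sketched for the scalar local level model: run a standard Kalman recursion, but pass the standardized innovation through a bounded (Huber-type) psi-function before the measurement update, so an outlying observation is shrunk towards its one-step-ahead prediction. This is a minimal illustration of the Masreliez-Martin device, not the authors' augmented filter with nonstationary and regression effects; parameter values and names are mine:

```python
def robust_local_level_filter(ys, q=0.1, r=1.0, c=2.0):
    """Scalar local level model: state x_t = x_{t-1} + w_t (variance q),
    observation y_t = x_t + v_t (variance r).  The standardized innovation
    is passed through a Huber psi-function, bounding the influence of
    outlying observations (the data-cleaning idea of Masreliez and Martin)."""
    x, p = ys[0], r                          # crude initialization at y_0
    states = []
    for y in ys:
        p_pred = p + q                       # time update
        s = p_pred + r                       # innovation variance
        nu = (y - x) / s ** 0.5              # standardized innovation
        nu = max(-c, min(c, nu))             # Huber bound: clip at +/- c
        k = p_pred / s                       # Kalman gain
        x = x + k * s ** 0.5 * nu            # update on the cleaned innovation
        p = (1 - k) * p_pred
        states.append(x)
    return states

ys = [0.1, 0.2, 0.1, 10.0, 0.3, 0.2, 0.1]    # one additive outlier at t = 3
states = robust_local_level_filter(ys)
print(states)
```

Because the innovation is clipped, the outlier at t = 3 moves the filtered level only modestly instead of dragging it towards 10.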
Exponential Smoothing, Long Memory and Volatility Prediction
http://d.repec.org/n?u=RePEc:aah:create:2015-51&r=ore
Extracting and forecasting the volatility of financial markets is an important empirical problem. The paper provides a time series characterization of the volatility components arising when the volatility process is fractionally integrated, and proposes a new predictor that can be seen as an extension of the very popular and successful forecasting and signal extraction scheme known as exponential smoothing (ES). First, we derive a generalization of the Beveridge-Nelson result, decomposing the series into the sum of fractional noise processes with decreasing orders of integration. Second, we consider three models that are natural extensions of ES: the fractionally integrated first-order moving average (FIMA) model, a new integrated moving average model formulated in terms of the fractional lag operator (FLagIMA), and a fractional equal-root integrated moving average (FerIMA) model, proposed originally by Hosking. We investigate the properties of the volatility components and the forecasts arising from these specifications, which depend uniquely on the memory and the moving average parameters. For statistical inference we show that, under mild regularity conditions, the Whittle pseudo-maximum likelihood estimator is consistent and asymptotically normal. The estimation results show that the log-realized variance series are mean reverting but nonstationary. An out-of-sample rolling forecast exercise illustrates that the three generalized ES predictors improve significantly upon commonly used methods for forecasting realized volatility, and that the estimated model confidence sets include the newly proposed fractional lag predictor in all occurrences.
Tommaso Proietti
Realized Volatility, Volatility Components, Fractional lag models, Fractional equal-root IMA model, Model Confidence Set
2015-06-01
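As background for the ES extensions: simple exponential smoothing updates the one-step-ahead forecast as f_{t+1} = (1 - lambda)*y_t + lambda*f_t, which is the optimal predictor for an IMA(1,1) process, while long-memory variants replace the implicit geometric weights by the hyperbolically decaying coefficients of the fractional difference operator (1 - L)^d, generated by pi_0 = 1, pi_j = pi_{j-1}*(j - 1 - d)/j. A stdlib sketch of both ingredients (the paper's actual FIMA/FLagIMA/FerIMA predictors combine them in more elaborate ways; names are mine):

```python
def exp_smoothing_forecasts(ys, lam=0.7):
    """One-step-ahead exponential smoothing: f_{t+1} = (1-lam)*y_t + lam*f_t.
    ES is the optimal predictor of an IMA(1,1) process."""
    f = ys[0]                      # initialize the forecast at the first value
    forecasts = [f]
    for y in ys[:-1]:
        f = (1 - lam) * y + lam * f
        forecasts.append(f)
    return forecasts

def frac_diff_weights(d, n):
    """First n coefficients of (1-L)^d: pi_0 = 1, pi_j = pi_{j-1}*(j-1-d)/j.
    These hyperbolically decaying weights are the long-memory analogue of
    the geometric weights implicit in exponential smoothing."""
    pis = [1.0]
    for j in range(1, n):
        pis.append(pis[-1] * (j - 1 - d) / j)
    return pis

fc = exp_smoothing_forecasts([1.0, 2.0, 3.0, 4.0])
ws = frac_diff_weights(0.4, 5)
print(fc)
print(ws)
```

The slowly decaying pi weights are what let the long-memory predictors discount the distant past hyperbolically rather than geometrically.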
A decomposition for the space of games with externalities
http://d.repec.org/n?u=RePEc:pra:mprapa:67932&r=ore
The main goal of this paper is to present a different perspective from the more `traditional' approaches to studying solutions for games with externalities. We provide a direct sum decomposition for the vector space of these games and use the basic representation theory of the symmetric group to study linear symmetric solutions. In our analysis we identify all irreducible subspaces that are relevant to the study of linear symmetric solutions, and we then use this decomposition to derive some applications involving characterizations of classes of solutions.
Sanchez-Perez, Joss
Games with externalities; value; representation theory; symmetric group.
2015
Forecasting with Instabilities: an Application to DSGE Models with Financial Frictions
http://d.repec.org/n?u=RePEc:ucn:wpaper:201523&r=ore
This paper examines whether the presence of parameter instabilities in dynamic stochastic general equilibrium (DSGE) models affects their forecasting performance. We apply this analysis to medium-scale DSGE models with and without financial frictions for the US economy. Over the forecast period 2001-2013, the models augmented with financial frictions lead to an improvement in forecasts for inflation and the short-term interest rate, while for the GDP growth rate the performance depends on the horizon/period. We interpret this finding taking parameter instabilities into account. The fluctuation test shows that models with financial frictions outperform in forecasting inflation, but not the GDP growth rate.
Roberta Cardani
Alessia Paccagnini
Stefania Villa
Bayesian estimation; Forecasting; Financial frictions; Parameter instabilities
2015-10
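The fluctuation-type evidence reported in the paper amounts to tracking relative forecast accuracy over rolling windows; the formal test then studentizes this path and compares it with critical bands. A toy sketch of the underlying rolling loss differential (data, window choice, and names are mine):

```python
def rolling_loss_differential(e_bench, e_alt, window=3):
    """Rolling mean of squared-error loss differentials, the quantity a
    fluctuation test tracks over time (the test itself studentizes it and
    compares it with critical bands).  Positive values: alternative wins."""
    d = [eb ** 2 - ea ** 2 for eb, ea in zip(e_bench, e_alt)]
    return [sum(d[t - window + 1 : t + 1]) / window
            for t in range(window - 1, len(d))]

# toy forecast errors: benchmark vs. an alternative model
bench = [0.5, 1.0, -1.5, 2.0, -0.5, 1.0, 2.5]
alt   = [0.6, 0.8, -0.5, 0.5, -0.4, 0.3, 0.5]
r = rolling_loss_differential(bench, alt)
print(r)
```

A path that stays positive in some subperiods and not others is exactly the kind of time-varying relative performance the paper's analysis is after.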
GEL Estimation for Heavy-Tailed GARCH Models with Robust Empirical Likelihood Inference
http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/13795&r=ore
We construct a Generalized Empirical Likelihood estimator for a GARCH(1,1) model with a possibly heavy-tailed error. The estimator imbeds tail-trimmed estimating equations allowing for over-identifying conditions, asymptotic normality, efficiency and empirical likelihood based confidence regions for very heavy-tailed random volatility data. We show the implied probabilities from the tail-trimmed Continuously Updated Estimator elevate weight for usable large values, assign large but not maximum weight to extreme observations, and give the lowest weight to non-leverage points. We derive a higher order expansion for GEL with imbedded tail-trimming (GELITT), which reveals higher order bias and efficiency properties, available when the GARCH error has a finite second moment. Higher order asymptotics for GEL without tail-trimming requires the error to have moments of substantially higher order. We use first order asymptotics and higher order bias to justify the choice of the number of trimmed observations in any given sample. We also present robust versions of Generalized Empirical Likelihood Ratio, Wald, and Lagrange Multiplier tests, and an efficient and heavy-tail robust moment estimator with an application to expected shortfall estimation. Finally, we present a broad simulation study for GEL and GELITT, and demonstrate profile-weighted expected shortfall for the Russian Ruble - US Dollar exchange rate. We show that tail-trimmed CUE-GMM dominates other estimators in terms of bias, MSE and approximate normality. AMS classifications: 62M10, 62F35.
Hill, Jonathan B.
Prokhorov, Artem
GEL ; GARCH ; tail trimming ; heavy tails ; robust inference ; efficient moment estimation ; expected shortfall ; Russian Ruble
2015-09-11
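The tail-trimming device can be caricatured in one line: drop the k most extreme observations before forming a moment, so that heavy-tailed draws cannot dominate the estimate. The paper trims estimating equations inside GEL rather than raw data, so the following stdlib sketch (all names mine) only conveys the flavour:

```python
def tail_trimmed_mean(xs, k):
    """Moment estimator with the k most extreme observations (by |x|)
    removed: a simplified stand-in for the tail-trimming device used to
    restore standard inference under heavy tails."""
    if k == 0:
        return sum(xs) / len(xs)
    kept = sorted(xs, key=abs)[:len(xs) - k]   # drop the k largest |x|
    return sum(kept) / len(kept)

sample = [0.2, -0.1, 0.4, -0.3, 25.0]          # one heavy-tailed observation
print(tail_trimmed_mean(sample, 0), tail_trimmed_mean(sample, 1))
```

One extreme draw moves the untrimmed mean from roughly 0 to about 5; trimming it restores an estimate near the bulk of the data.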
Estimating Non-Linear DSGEs with the Approximate Bayesian Computation: an application to the Zero Lower Bound
http://d.repec.org/n?u=RePEc:saq:wpaper:06/15&r=ore
Non-linear model estimation is generally perceived as impractical and computationally burdensome. This perception has limited the diffusion of non-linear model estimation. In this paper a simple set of techniques going under the name of Approximate Bayesian Computation (ABC) is proposed. ABC is a set of Bayesian techniques based on moment matching: moments are obtained by simulating the model conditional on draws from the prior distribution. An accept-reject criterion is applied to the simulations, and an approximate posterior distribution is obtained from the accepted draws. A series of techniques are presented (ABC-regression, ABC-MCMC, ABC-SMC). To assess their small-sample performance, Monte Carlo experiments are run on AR(1) processes and on an RBC model, showing that ABC estimators outperform the Limited Information Method (Kim, 2002), a GMM-style estimator. In the remainder, the estimation of a New Keynesian model with a zero lower bound on the interest rate is performed. Non-Gaussian moments are exploited in the estimation procedure.
Valerio Scalone
Monte-Carlo analysis, Method of moments, Bayesian, Zero Lower Bound, DSGE estimation.
2015-11
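The ABC accept-reject step described in the abstract is easy to sketch for an AR(1): draw the parameter from its prior, simulate the model, and keep the draw if the simulated summary statistic (here the lag-1 autocorrelation) is close enough to the observed one. A stdlib-only toy version follows; the prior, tolerance, and summary statistic are my choices for illustration, not the paper's:

```python
import random

def abc_ar1(data, n_draws=1000, tol=0.05, seed=1):
    """Accept-reject ABC for the AR(1) coefficient rho: draw rho from a
    uniform prior, simulate the model, and keep draws whose simulated
    lag-1 autocorrelation is within tol of the observed one."""
    rng = random.Random(seed)

    def lag1_autocorr(xs):
        m = sum(xs) / len(xs)
        num = sum((xs[t] - m) * (xs[t - 1] - m) for t in range(1, len(xs)))
        den = sum((x - m) ** 2 for x in xs)
        return num / den

    def simulate(rho, n):
        x, out = 0.0, []
        for _ in range(n):
            x = rho * x + rng.gauss(0.0, 1.0)
            out.append(x)
        return out

    s_obs = lag1_autocorr(data)
    accepted = []
    for _ in range(n_draws):
        rho = rng.uniform(-0.99, 0.99)          # draw from the prior
        s_sim = lag1_autocorr(simulate(rho, len(data)))
        if abs(s_sim - s_obs) < tol:            # accept-reject step
            accepted.append(rho)
    return accepted

# observed data from an AR(1) with rho = 0.8
true_rho, rng = 0.8, random.Random(0)
x, data = 0.0, []
for _ in range(300):
    x = true_rho * x + rng.gauss(0.0, 1.0)
    data.append(x)
post = abc_ar1(data)
print(len(post), sum(post) / len(post))
```

The accepted draws form the approximate posterior sample; its mean should sit close to the true coefficient 0.8.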
Bounding average treatment effects: A linear programming approach
http://d.repec.org/n?u=RePEc:unm:umagsb:2015027&r=ore
We show how to obtain bounds on mean treatment effects by solving a simple linear programming problem. The use of a linear programme is convenient from a practical point of view because it avoids the need to derive closed-form solutions. Monotonicity or concavity restrictions are imposed or omitted by simply adding sets of linear restrictions to the linear programme, or removing them from it.
Demuynck T.
Semiparametric and Nonparametric Methods: General; Optimization Techniques; Programming Models; Dynamic Analysis; Human Capital; Skills; Occupational Choice; Labor Productivity
2015
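In the special case with no shape restrictions and a bounded outcome, the optimal value of such a linear programme coincides with the classical worst-case (Manski-type) bounds, which do have a closed form; imposing monotonicity or concavity then just appends linear constraints. A stdlib sketch of the unrestricted benchmark (data and names are mine):

```python
def worst_case_ate_bounds(y_treated, y_control, p_treated, lo=0.0, hi=1.0):
    """Worst-case (no-assumption) bounds on the average treatment effect
    for an outcome known to lie in [lo, hi]: the unobserved counterfactual
    means are replaced by the worst admissible values lo and hi."""
    m1 = sum(y_treated) / len(y_treated)    # E[Y | D = 1]
    m0 = sum(y_control) / len(y_control)    # E[Y | D = 0]
    p, q = p_treated, 1.0 - p_treated
    # E[Y(1)] in [m1*p + lo*q, m1*p + hi*q]; E[Y(0)] in [m0*q + lo*p, m0*q + hi*p]
    lower = (m1 * p + lo * q) - (m0 * q + hi * p)
    upper = (m1 * p + hi * q) - (m0 * q + lo * p)
    return lower, upper

lb, ub = worst_case_ate_bounds([0.9, 0.8, 1.0], [0.2, 0.4], p_treated=0.5)
print(lb, ub)
```

Note that without restrictions the bound width is always hi - lo, which is exactly why adding linear shape constraints to the programme is attractive: they tighten the interval without requiring any new closed-form derivation.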