on Forecasting
By: | Marcin Chlebus (Faculty of Economic Sciences, University of Warsaw) |
Abstract: | The study proposes two-step EWS-GARCH models for forecasting Value-at-Risk. EWS-GARCH allows different distributions of returns to be used in Value-at-Risk forecasting, depending on the forecasted state of the financial time series. The study considers EWS-GARCH specifications with GARCH(1,1), or GARCH(1,1) amended with the empirical distribution of returns, as the Value-at-Risk model in a state of tranquillity, and with the empirical tail, exponential or Pareto distribution used to forecast Value-at-Risk in a state of turbulence. The quality of the Value-at-Risk forecasts was evaluated on the basis of their adequacy (the excess ratio, the Kupiec test, the Christoffersen test, the asymptotic test of unconditional coverage and the back-testing criteria defined by the Basel Committee) and of loss functions (the Lopez quadratic loss function, the Abad & Benito absolute loss function, the third version of the Caporin loss function and the excessive-cost function proposed in the study). The results indicate that EWS-GARCH models may improve the quality of Value-at-Risk forecasts generated by benchmark models. However, the choice of the best assumptions for an EWS-GARCH model should depend on the goals of the Value-at-Risk forecasting model; the final selection may depend on the expected level of adequacy, conservatism and cost of the model. |
Keywords: | Value-at-Risk, GARCH, forecasting, state of turbulence, regime switching, risk management, risk measure, market risk. |
JEL: | G17 C51 C52 C53 |
Date: | 2016 |
URL: | http://d.repec.org/n?u=RePEc:war:wpaper:2016-06&r=for |
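The following is a minimal sketch of the two-step idea described in the entry above: a crude state forecast first decides whether the series is in tranquillity or turbulence, and the Value-at-Risk forecast is then taken from a GARCH(1,1)-normal model or from the empirical tail of past returns, respectively. All parameter values, the state-detection rule and the simulated returns are illustrative assumptions, not the specification estimated in the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
returns = rng.standard_t(df=5, size=1000) * 0.01   # stand-in for daily returns

alpha = 0.01                        # 1% Value-at-Risk level
omega, a1, b1 = 1e-6, 0.08, 0.90    # assumed GARCH(1,1) parameters

# GARCH(1,1) conditional variance recursion
sigma2 = np.empty(len(returns))
sigma2[0] = returns.var()
for t in range(1, len(returns)):
    sigma2[t] = omega + a1 * returns[t - 1] ** 2 + b1 * sigma2[t - 1]

# Step 1: crude state forecast -- flag turbulence when conditional volatility
# exceeds a high quantile of its own history (a stand-in for the EWS step).
turbulent = np.sqrt(sigma2[-1]) > np.quantile(np.sqrt(sigma2), 0.95)

# Step 2: the VaR forecast depends on the forecasted state.
if turbulent:
    var_forecast = np.quantile(returns, alpha)              # empirical tail
else:
    var_forecast = norm.ppf(alpha) * np.sqrt(sigma2[-1])    # GARCH(1,1)-normal

print("state:", "turbulence" if turbulent else "tranquillity",
      "| 1% VaR:", round(var_forecast, 4))
```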
By: | Dovern, Jonas; Hartmann, Matthias |
Abstract: | We propose an imperfect information model for the expectations of macroeconomic forecasters that explains differences in average disagreement levels across forecasters by means of cross-sectional heterogeneity in the variance of private noise signals. We show that the forecaster-specific signal-to-noise ratios determine both the average individual disagreement level and an individual's forecast performance: forecasters with very noisy signals deviate strongly from the average forecast and report forecasts with low accuracy. We take the model to the data by testing empirically for this implied correlation. Evidence based on data from the Surveys of Professional Forecasters for the US and for the Euro Area supports the model for short- and medium-run forecasts but rejects it based on its implications for long-run forecasts. |
Keywords: | disagreement; expectations; imperfect information; signal-to-noise ratio. |
Date: | 2016–03–14 |
URL: | http://d.repec.org/n?u=RePEc:awi:wpaper:0611&r=for |
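A small simulation can illustrate the mechanism described in the entry above: forecasters who observe noisier private signals should both deviate more from the consensus and forecast less accurately. The setup below (signal structure, noise variances, naive reporting rule) is a stylised assumption of ours, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 200, 30                          # periods, forecasters
noise_sd = rng.uniform(0.2, 2.0, N)     # heterogeneous private-signal noise

target = rng.normal(0, 1, T)
signals = target[:, None] + rng.normal(0, 1, (T, N)) * noise_sd
forecasts = signals                     # naive rule: report the private signal

consensus = forecasts.mean(axis=1)
disagreement = np.abs(forecasts - consensus[:, None]).mean(axis=0)
accuracy = ((forecasts - target[:, None]) ** 2).mean(axis=0)

# Forecasters who deviate more from the consensus should also be less accurate.
print("corr(disagreement, squared error):",
      round(np.corrcoef(disagreement, accuracy)[0, 1], 3))
```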
By: | Ching-Wai Chiu (Bank of England); Haroon Mumtaz (School of Economics and Finance Queen Mary); Gabor Pinter (Bank of England; Centre for Macroeconomics (CFM)) |
Abstract: | We introduce a Bayesian VAR model with non-Gaussian disturbances that are modelled with a finite mixture of normal distributions. Importantly, we allow for regime switching among the different components of the mixture of normals. Our model is highly flexible and can capture distributions that are fat-tailed, skewed and even multimodal. We show that our model can generate large out-of-sample forecast gains relative to standard forecasting models, especially during tranquil periods. Our model forecasts are also competitive with those generated by the conventional VAR model with stochastic volatility. |
JEL: | C11 C32 C52 |
Date: | 2016–02 |
URL: | http://d.repec.org/n?u=RePEc:cfm:wpaper:1609&r=for |
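The sketch below illustrates only the disturbance process the abstract describes: errors drawn from a two-component mixture of normals whose active component follows a Markov chain, fed into a single autoregression. The transition probabilities and component variances are assumed for illustration, and the full Bayesian VAR estimation is not attempted here.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 500
P = np.array([[0.97, 0.03],        # assumed regime transition probabilities
              [0.10, 0.90]])
sds = np.array([0.5, 2.5])         # low- and high-volatility mixture components

state = 0
errors = np.empty(T)
for t in range(T):
    state = rng.choice(2, p=P[state])          # regime switching
    errors[t] = rng.normal(0.0, sds[state])    # mixture-of-normals disturbance

# A simple AR(1) driven by these non-Gaussian errors (one-equation "VAR")
y = np.empty(T)
y[0] = errors[0]
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + errors[t]

excess_kurt = ((errors - errors.mean()) ** 4).mean() / errors.var() ** 2 - 3
print("excess kurtosis of disturbances:", round(excess_kurt, 2))
```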
By: | Jos Jansen; Jasper de Winter |
Abstract: | We investigate to what extent model-based near-term GDP forecasts can be improved by combining them with judgmental (quarterly) forecasts by professional analysts (the Consensus survey) in a real-time setting. Our analysis covers the G7 countries over the years 1999-2013. We consider two combination schemes: the weighted average and the linear combination. Incorporating subjective information delivers sizable gains in the forecasting ability of statistical models for all countries except Japan, even when the subjective forecasts are somewhat dated. Accuracy gains are much more pronounced in the volatile period after 2008, owing to a marked improvement in the predictive power of Consensus forecasts. Since 2008, Consensus forecasts have been superior at the moment of publication for most countries. For some countries, Consensus forecasts can be enhanced by model-based forecasts between the quarterly release dates of the Consensus survey, as the latter embody more recent monthly information. |
Keywords: | Forecast combination; encompassing test; nowcasting; factor models; judgment |
JEL: | C33 C53 E37 |
Date: | 2016–03 |
URL: | http://d.repec.org/n?u=RePEc:dnb:dnbwpp:507&r=for |
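The two combination schemes mentioned in the abstract above can be sketched in a few lines: a fixed-weight average of the model and survey forecasts, and a linear combination with weights estimated by least squares on a training window. The simulated "actual", "model" and "survey" series below are placeholders, not the G7 data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 120
actual = rng.normal(2.0, 1.0, T)                  # stand-in for GDP growth
model_fc = actual + rng.normal(0, 0.8, T)         # statistical model forecast
survey_fc = actual + rng.normal(0, 0.6, T)        # Consensus-style forecast

# Scheme 1: fixed-weight average
w = 0.5
avg_combo = w * model_fc + (1 - w) * survey_fc

# Scheme 2: linear combination, weights fitted on a training window
X = np.column_stack([np.ones(T), model_fc, survey_fc])
beta, *_ = np.linalg.lstsq(X[:80], actual[:80], rcond=None)
lin_combo = X[80:] @ beta

rmse = lambda f, a: np.sqrt(np.mean((f - a) ** 2))
print("RMSE model:", round(rmse(model_fc[80:], actual[80:]), 3),
      "| average:", round(rmse(avg_combo[80:], actual[80:]), 3),
      "| linear:", round(rmse(lin_combo, actual[80:]), 3))
```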
By: | Porshakov, Alexey; Deryugina, Elena; Ponomarenko, Alexey; Sinyakov, Andrey |
Abstract: | Real-time assessment of quarterly GDP growth rates is crucial for evaluating the economy's current state, given that the respective data are normally subject to substantial publication delays by national statistical agencies. The large sets of real-time indicators that could be used to approximate GDP growth in the quarter of interest are in practice characterized by unbalanced data, mixed frequencies, systematic data revisions and, more generally, the curse of dimensionality. These issues can, however, be addressed in practice by dynamic factor modeling, which has recently been recognized as a helpful tool for evaluating current economic conditions on the basis of higher-frequency indicators. Our main results show that dynamic factor models predict Russian GDP dynamics better than other common alternative specifications. At the same time, we show empirically that the arrival of new data consistently improves the DFM's predictive accuracy across sequential nowcast vintages. We also analyze the evolution of the nowcast as the dataset of explanatory variables is gradually expanded, and provide a framework for estimating the contributions of different blocks of predictors to nowcasts of Russian GDP. |
Keywords: | GDP nowcast, dynamic factor models, principal components, Kalman filter, nowcast evolution |
JEL: | C53 C82 E17 |
Date: | 2015–05–28 |
URL: | http://d.repec.org/n?u=RePEc:bof:bofitp:urn:nbn:fi:bof-201506091268&r=for |
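A heavily simplified stand-in for the nowcasting exercise is sketched below: a common factor is extracted from a panel of monthly indicators by principal components and bridged to quarterly GDP growth by OLS. The paper's actual dynamic factor model additionally handles mixed frequencies, ragged edges and data revisions via the Kalman filter; everything here, including the simulated panel, is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(4)
T, N = 240, 20                                  # months, monthly indicators
factor = np.cumsum(rng.normal(0, 0.3, T))
panel = np.outer(factor, rng.uniform(0.5, 1.5, N)) + rng.normal(0, 1, (T, N))

# Principal-components estimate of the common factor
Z = (panel - panel.mean(0)) / panel.std(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
f_hat = Z @ Vt[0]

# Quarterly GDP growth related to the quarterly-averaged monthly factor
f_q = f_hat.reshape(-1, 3).mean(axis=1)         # 80 quarters
gdp = 0.5 * f_q + rng.normal(0, 0.2, len(f_q))  # simulated quarterly target

# Bridge regression on all but the latest quarter, then nowcast it
X = np.column_stack([np.ones(len(f_q) - 1), f_q[:-1]])
beta, *_ = np.linalg.lstsq(X, gdp[:-1], rcond=None)
nowcast = beta[0] + beta[1] * f_q[-1]
print("nowcast of latest quarter:", round(nowcast, 3), "| actual:", round(gdp[-1], 3))
```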
By: | Yahia Salhi (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Pierre-Emmanuel Thérond (Galea & Associés, SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Julien Tomas (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1) |
Abstract: | This article illustrates a credibility approach to mortality. Interest from life insurers in assessing the mortality risk of their portfolios has increased considerably. The new regulation and norms, Solvency II, highlight the need for life tables that best reflect the experience of insured portfolios in order to quantify the underlying mortality risk reliably. In this context, and following the work of Bühlmann and Gisler (2005) and Hardy and Panjer (1998), we propose a credibility approach that consists in revising the assumption on the mortality curve as new observations arrive. Unlike the methodology of Hardy and Panjer (1998), which updates aggregate deaths, we choose to impose an age structure on these deaths. Formally, we use a Makeham graduation model. Such an adjustment adds structure to the mortality pattern, which is useful when portfolios are of limited size, so as to ensure a good representation over the entire age range considered. We investigate the divergences in the mortality forecasts generated by the classical credibility approaches to mortality, including Hardy and Panjer (1998) and the Poisson-Gamma model, on portfolios originating from various French insurance companies. |
Keywords: | Graduation, Life insurance, Mortality, Credibility, Makeham law, Extrapolation |
Date: | 2016 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:hal-01232683&r=for |
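As a building block of the approach above, the snippet below graduates crude death rates with a Makeham law, mu(x) = A + B*c^x, by nonlinear least squares. The exposures, starting values and fitting method are illustrative assumptions and do not reproduce the credibility updating itself.

```python
import numpy as np
from scipy.optimize import curve_fit

def makeham(x, A, B, c):
    """Makeham force of mortality: mu(x) = A + B * c**x."""
    return A + B * np.power(c, x)

ages = np.arange(40, 90)
true_mu = makeham(ages, 5e-4, 3e-5, 1.10)           # assumed "true" mortality
rng = np.random.default_rng(5)
exposures = rng.integers(500, 2000, len(ages))      # portfolio of limited size
deaths = rng.poisson(true_mu * exposures)
crude_rates = deaths / exposures

# Graduate the noisy crude rates with the Makeham law
params, _ = curve_fit(makeham, ages, crude_rates, p0=[1e-4, 1e-5, 1.1], maxfev=10000)
print("fitted (A, B, c):", params)
```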
By: | Tena, Juan de Dios; Wiper, Michael P.; Corona, Francisco |
Abstract: | Identifying the important matches in international football tournaments is of great relevance for a variety of decision makers, such as organizers, team coaches and media managers. This paper addresses the issue by analyzing how the statistical approach used to estimate the outcome of a game affects the identification of decisive matches in international tournaments for national football teams. We extend the measure of decisiveness proposed by Geenens (2014) so that match importance can be predicted or evaluated before, during and after a particular game in the tournament. Using information from the 2014 FIFA World Cup, our results suggest that Poisson and kernel regressions significantly outperform the forecasts of ordered probit models. Moreover, we find that the identification of the key, but not the most important, matches depends on the model considered. We also apply this methodology to identify the favorite teams and to predict the most important matches of the 2015 Copa America before the start of the competition. |
Keywords: | Kernel regression; Poisson model; Entropy; Ordered probit model; Game importance |
Date: | 2015–06–01 |
URL: | http://d.repec.org/n?u=RePEc:cte:wsrepe:21174&r=for |
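The snippet below illustrates one ingredient only: outcome probabilities for a single match derived from two independent Poisson goal processes, together with the entropy of that outcome distribution as a crude uncertainty measure. Geenens' (2014) decisiveness measure extended in the paper works at the tournament level, so this is a simplified sketch with invented expected-goal rates.

```python
import numpy as np
from scipy.stats import poisson

lam_home, lam_away = 1.6, 1.1      # assumed expected goals for each team
max_goals = 10
gh = poisson.pmf(np.arange(max_goals + 1), lam_home)
ga = poisson.pmf(np.arange(max_goals + 1), lam_away)
joint = np.outer(gh, ga)           # joint probability of each scoreline

p_home = np.tril(joint, -1).sum()  # home team scores more
p_draw = np.trace(joint)
p_away = np.triu(joint, 1).sum()

probs = np.array([p_home, p_draw, p_away])
entropy = -(probs * np.log(probs)).sum()
print("P(home, draw, away):", probs.round(3), "| outcome entropy:", round(entropy, 3))
```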
By: | Davide Delle Monache (Bank of Italy); Ivan Petrella (Bank of England) |
Abstract: | This paper introduces an adaptive algorithm for time-varying autoregressive models in the presence of heavy tails. The evolution of the parameters is determined by the score of the conditional distribution. The resulting model is observation-driven and is estimated by classical methods. Meaningful restrictions are imposed on the model parameters so as to attain local stationarity and bounded mean values. In particular, we consider time variation in both coefficients and volatility, emphasizing how the two interact. Moreover, we show how the proposed approach generalizes the various adaptive algorithms used in the literature. The model is applied to the analysis of inflation dynamics. Allowing for heavy tails leads to significant improvements in terms of fit and forecast. The adoption of the Student's-t distribution proves to be crucial in order to obtain well-calibrated density forecasts. These results are obtained using the US CPI inflation rate and are confirmed by other inflation indicators as well as the CPI of the other G7 countries. |
Keywords: | adaptive algorithms, inflation, score driven models, student-t, time-varying parameters. |
JEL: | C22 C51 C53 E31 |
Date: | 2016–02 |
URL: | http://d.repec.org/n?u=RePEc:bdi:wptemi:td_1052_16&r=for |
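A stylised score-driven update of the kind described above can be written in a few lines: the time-varying AR(1) coefficient is moved in the direction of the score of a Student-t conditional likelihood, which automatically downweights outliers, and is clipped to enforce a crude local-stationarity bound. The updating rule, step size and simulated data are assumptions of this sketch, not the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(6)
T, nu, sigma2 = 300, 5.0, 1.0
learn = 0.02                                   # assumed learning rate

# Simulate data with a slowly drifting AR(1) coefficient and Student-t errors
phi_true = 0.4 + 0.3 * np.sin(np.linspace(0, 3, T))
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true[t] * y[t - 1] + rng.standard_t(nu)

phi = np.zeros(T)
for t in range(1, T):
    e = y[t] - phi[t - 1] * y[t - 1]
    # score of the Student-t log-likelihood with respect to the AR coefficient
    score = (nu + 1) * e * y[t - 1] / (nu * sigma2 + e ** 2)
    phi[t] = np.clip(phi[t - 1] + learn * score, -0.99, 0.99)  # stationarity bound

print("final estimate:", round(phi[-1], 3), "| true value:", round(phi_true[-1], 3))
```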
By: | Forni, Mario; Giovannelli, Alessandro; Lippi, Marco; Soccorsi, Stefano |
Abstract: | The paper compares the pseudo real-time forecasting performance of three Dynamic Factor Models: (i) the standard principal-component model, Stock and Watson (2002a), (ii) the model based on generalized principal components, Forni et al. (2005), and (iii) the model recently proposed in Forni et al. (2015b) and Forni et al. (2015a). We employ a large monthly dataset of macroeconomic and financial time series for the US economy, which includes the Great Moderation, the Great Recession and the subsequent recovery. Using a rolling window for estimation and prediction, we find that (iii) neatly outperforms (i) and (ii) in the Great Moderation period for both Industrial Production and Inflation, and for Inflation over the full sample. However, (iii) is outperformed by (i) and (ii) over the full sample for Industrial Production. |
Date: | 2016–03 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:11161&r=for |
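A generic rolling-window, pseudo real-time comparison of the kind used in the paper is sketched below: each competing model is re-estimated on a moving window and judged by its one-step-ahead RMSE. The competing "models" here (an AR(1) and a window mean) are placeholders, not the three dynamic factor models actually compared.

```python
import numpy as np

rng = np.random.default_rng(7)
T, window = 300, 120
y = np.cumsum(rng.normal(0, 1, T)) * 0.01 + rng.normal(0, 0.5, T)  # toy series

def ar1_forecast(history):
    # OLS AR(1) without intercept, fitted on the current window
    x, ylead = history[:-1], history[1:]
    phi = np.dot(x, ylead) / np.dot(x, x)
    return phi * history[-1]

def mean_forecast(history):
    return history.mean()

errors = {"AR(1)": [], "window mean": []}
for t in range(window, T - 1):
    hist = y[t - window:t]                     # rolling estimation window
    errors["AR(1)"].append(ar1_forecast(hist) - y[t])
    errors["window mean"].append(mean_forecast(hist) - y[t])

for name, e in errors.items():
    print(name, "RMSE:", round(float(np.sqrt(np.mean(np.square(e)))), 4))
```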
By: | Brian K. Chen; Hawre Jalal; Hideki Hashimoto; Sze-Chuan Suen; Karen Eggleston; Michael Hurley; Lena Schoemaker; Jay Bhattacharya |
Abstract: | Japan has experienced pronounced population aging, and now has the highest proportion of elderly adults in the world. Yet few projections of Japan’s future demography go beyond estimating population by age and sex to forecast the complex evolution of the health and functioning of the future elderly. This study adapts to the Japanese population the Future Elderly Model (FEM), a demographic and economic state-transition microsimulation model that projects the health conditions and functional status of Japan’s elderly population in order to estimate disability, health, and the need for long-term care. Our FEM simulation suggests that by 2040, over 27 percent of Japan’s elderly will exhibit 3 or more limitations in IADLs and social functioning; almost one in four will experience difficulties with 3 or more ADLs; and approximately one in five will suffer limitations in cognitive or intellectual functioning. Since the majority of the increase in disability arises from the aging of the Japanese population, prevention efforts that reduce age-specific disability (or future compression of morbidity among middle-aged Japanese) may have only a limited impact on reducing the overall prevalence of disability among Japanese elderly. |
JEL: | I1 J1 J11 J14 |
Date: | 2016–01 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:21870&r=for |
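The entry above describes a state-transition microsimulation; the toy version below moves simulated individuals year by year between "healthy", "disabled" and "dead" states with age-dependent probabilities and reports projected disability prevalence in 2040. All transition rates, the initial population and the state space are invented for illustration and bear no relation to the FEM's estimated inputs.

```python
import numpy as np

rng = np.random.default_rng(8)
N, start_year, end_year = 10000, 2016, 2040
ages = rng.integers(65, 90, N).astype(float)
state = np.zeros(N, dtype=int)              # 0 healthy, 1 disabled, 2 dead

def p_disable(age):                          # assumed age-dependent rates
    return 0.01 + 0.002 * (age - 65)

def p_die(age, s):
    return (0.01 + 0.003 * (age - 65)) * (2.0 if s == 1 else 1.0)

for year in range(start_year, end_year):
    for i in range(N):
        if state[i] == 2:
            continue                         # the dead stay dead
        if rng.random() < p_die(ages[i], state[i]):
            state[i] = 2
        elif state[i] == 0 and rng.random() < p_disable(ages[i]):
            state[i] = 1
        ages[i] += 1

alive = state != 2
print("projected disability prevalence among survivors in 2040:",
      round(float((state[alive] == 1).mean()), 3))
```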
By: | Zeineb Affes (Centre d'Economie de la Sorbonne); Rania Hentati-Kaffel (Centre d'Economie de la Sorbonne) |
Abstract: | Using a large panel of US banks over the period 2008-2013, this paper proposes an early-warning framework to identify banks heading towards bankruptcy. We conduct a comparative analysis based on both Canonical Discriminant Analysis and Logit models to determine which of these models is more accurate. Moreover, we analyze and improve the suitability of the models by comparing different optimal cut-off scores (ROC curve vs theoretical value). The main conclusions are: (i) results vary with the cut-off value of the score; (ii) the logistic regression using 0.5 as the critical cut-off value outperforms the DA model, with an average correct-classification rate of 96.22%, although it produces the highest type I error rate, 42.67%; (iii) ROC-curve validation improves the quality of the model by minimizing the misclassification of bankrupt banks: only 4.42% on average, and 0% in both 2012 and 2013. It thus yields better prediction of bank failures, while delivering on average the highest type II error rate, 8.43%. |
Keywords: | Bankruptcy prediction; Canonical Discriminant Analysis; Logistic regression; CAMELS; ROC curve; Early-warning system |
JEL: | G21 G33 C25 C38 C53 |
Date: | 2016–02 |
URL: | http://d.repec.org/n?u=RePEc:mse:cesdoc:16016&r=for |
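The core comparison in the entry above can be sketched as follows: fit a logistic-regression early-warning model, classify once with the 0.5 cut-off and once with a cut-off chosen from the ROC curve (here via Youden's J), and compare error rates. The simulated predictors stand in for CAMELS-type ratios, and labelling type I errors as missed failures and type II errors as false alarms is our assumption about the paper's convention.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, confusion_matrix

rng = np.random.default_rng(9)
n = 2000
X = rng.normal(0, 1, (n, 5))                        # stand-ins for CAMELS ratios
logit = X @ np.array([1.2, -0.8, 0.5, 0.0, -0.3]) - 2.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))        # True = failed bank

model = LogisticRegression(max_iter=1000).fit(X, y)
p = model.predict_proba(X)[:, 1]

fpr, tpr, thresholds = roc_curve(y, p)
roc_cutoff = thresholds[np.argmax(tpr - fpr)]       # Youden's J cut-off

for name, cut in [("0.5 cut-off", 0.5), ("ROC cut-off", roc_cutoff)]:
    tn, fp, fn, tp = confusion_matrix(y, p >= cut).ravel()
    print(f"{name}: accuracy={(tp + tn) / n:.3f}, "
          f"type I (missed failures)={fn / (fn + tp):.3f}, "
          f"type II (false alarms)={fp / (fp + tn):.3f}")
```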