on Forecasting |
By: | Norio Kitagawa (Graduate School of Business Administration, Kobe University, Japan); Akinobu Shuto (Research Institute for Economics & Business Administration (RIEB), Kobe University, Japan) |
Abstract: | This study investigates the effect of managerial discretion over initial earnings forecasts on future performance. First, by estimating the discretionary portion of initial management earnings forecasts (defined as discretionary forecasts) based on the findings of fundamental analysis research, we find that firms with higher discretionary forecasts are more likely to miss their earnings forecasts at the end of the fiscal year and to revise their forecasts downward during the period in order to meet them, suggesting that forecast management through discretionary forecasting produces less credible management forecasts in terms of ex-post realization. Second, using a hedge-portfolio test and regression analysis, we find that firms with higher discretionary forecasts earn consistently negative abnormal returns, suggesting that investors do not fully understand the implications of discretionary forecasts for the credibility of management earnings forecasts and thus overprice them at the forecast announcement. |
Keywords: | management earnings forecasts, forecast credibility, mispricing, forecast error, forecast revision, Japan |
JEL: | M41 |
Date: | 2013–10 |
URL: | http://d.repec.org/n?u=RePEc:kob:dpaper:dp2013-30&r=for |
By: | Takuya Iwasaki (Faculty of Commerce, Kansai University, Japan); Norio Kitagawa (Graduate School of Business Administration, Kobe University, Japan); Akinobu Shuto (Research Institute for Economics & Business Administration (RIEB), Kobe University, Japan) |
Keywords: | management forecasts, forecasts management, forecast innovations, product market competition, Japan |
JEL: | M41 |
Date: | 2013–10 |
URL: | http://d.repec.org/n?u=RePEc:kob:dpaper:dp2013-31&r=for |
By: | Estian Calitz (Department of Economics, University of Stellenbosch); Krige Siebrits (Department of Economics, University of Stellenbosch); Ian Stuart (Treasury, Government of South Africa) |
Abstract: | Forecasting accuracy is important for fiscal policy credibility. Three questions are posed. Firstly, are the forecasts of South Africa’s National Treasury good compared to those of non-government economists? With reference to the mean absolute error and the root mean square error (van der Watt, 2013), it is concluded that non-government economists do not necessarily forecast GDP and inflation better than Treasury. Secondly, have the forecasts of National Treasury been good over time and compared to those of other countries? The forecast error (the final figure minus the budget estimate) is calculated using data for 2000/01-2010/11. This measure is most relevant because, retrospectively, the outcome of fiscal policy is analysed and judged with reference to final figures. National Treasury’s budget forecast errors are found to be significant. Margins of error in forecasting revenue, expenditure and GDP have partially neutralised one another in terms of their impact on the budget balance as a percentage of GDP. Except towards the end of the period, the fiscal balance was better than budgeted. On average, and calculated as a percentage of GDP, revenue forecasting inaccuracies made the biggest contribution to inaccurate estimates of the budget balance, but this is largely explained by GDP forecasting inaccuracies. South African fiscal forecasts show a smaller forecast error than those of 14 member countries of the European Union. Thirdly, has the forecasting ability of National Treasury improved over time? A trend line shows higher Treasury forecast errors towards the end of the period and an underestimation bias in GDP and revenue forecasts. A simple example of the dynamics of fiscal politics is presented to demonstrate that persistent underestimation of revenue could also erode fiscal credibility. |
Keywords: | fiscal policy, fiscal forecasts, fiscal credibility |
JEL: | E60 H3 H61 H62 |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:sza:wpaper:wpapers200&r=for |
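The accuracy measures the abstract above relies on (mean absolute error, root mean square error, and the forecast error defined as the final figure minus the budget estimate) can be sketched in a few lines; the numbers below are purely illustrative, not South African budget data.

```python
import math

def forecast_error(final, estimate):
    """Forecast error as defined above: final figure minus budget estimate."""
    return final - estimate

def mae(finals, estimates):
    """Mean absolute error over a sequence of forecasts."""
    errors = [forecast_error(f, e) for f, e in zip(finals, estimates)]
    return sum(abs(e) for e in errors) / len(errors)

def rmse(finals, estimates):
    """Root mean square error; penalises large misses more heavily than MAE."""
    errors = [forecast_error(f, e) for f, e in zip(finals, estimates)]
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical revenue outcomes vs. budget estimates, for illustration only.
finals = [100.0, 105.0, 111.0, 118.0]
estimates = [98.0, 103.0, 108.0, 120.0]
print(mae(finals, estimates))   # average absolute miss
print(rmse(finals, estimates))  # quadratic-mean miss
```

A persistent positive mean of `forecast_error` over revenue outcomes is the underestimation bias the abstract describes.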
By: | Bessec, Marie |
Abstract: | In recent years, factor models have received increasing attention from both econometricians and practitioners in the forecasting of macroeconomic variables. In this context, Bai and Ng (2008) find an improvement in selecting indicators according to the forecast variable prior to factor estimation (targeted predictors). In particular, they propose using the LARS-EN algorithm to remove irrelevant predictors. In this paper, we adapt the Bai and Ng procedure to a setup in which data releases are delayed and staggered. In the pre-selection step, we replace actual data with estimates obtained on the basis of past information, where the structure of the available information replicates the one a forecaster would face in real time. We estimate on the reduced dataset the dynamic factor model of Giannone, Reichlin and Small (2008) and Doz, Giannone and Reichlin (2011), which is particularly suitable for the very short-term forecast of GDP. A pseudo real-time evaluation on French data shows the potential of our approach. |
Keywords: | Factor model; GDP forecasting; Large dataset; Targeted predictors; Variable selection; |
JEL: | C22 E32 E37 |
Date: | 2013–09 |
URL: | http://d.repec.org/n?u=RePEc:dau:papers:123456789/10079&r=for |
By: | Minchul Shin (Department of Economics, University of Pennsylvania); Molin Zhong (Department of Economics, University of Pennsylvania) |
Abstract: | This paper examines the importance of realized volatility in bond yield density prediction. We incorporate realized volatility into a Dynamic Nelson-Siegel (DNS) model with stochastic volatility and evaluate its predictive performance on US bond yield data. When compared to popular specifications in the DNS literature without realized volatility, we find that having this information improves density forecasting performance. |
Keywords: | Dynamic factor model, forecasting, stochastic volatility, term structure of interest rates |
JEL: | C5 G1 E4 |
Date: | 2013–11–04 |
URL: | http://d.repec.org/n?u=RePEc:pen:papers:13-064&r=for |
By: | Maximo Camacho (Universidad de Murcia); Gabriel Perez-Quiros (Banco de España); Pilar Poncela (Universidad Autónoma de Madrid) |
Abstract: | Practitioners do not always use research findings, as the research is not always conducted in a manner relevant to real-world practice. This survey seeks to close the gap between research and practice in respect of short-term forecasting in real time. To this end, we review the most relevant recent contributions to the literature, examining their pros and cons, and we take the liberty of proposing some avenues of future research. We include bridge equations, MIDAS, VARs, factor models and Markov-switching factor models, all allowing for mixed-frequency data and ragged ends. Using the four constituent monthly series of the Stock-Watson coincident index (industrial production, employment, income and sales), we evaluate their empirical performance in forecasting quarterly US GDP growth rates in real time. Finally, we review the main results regarding the number of predictors in factor-based forecasts and how the more informative or representative variables can be selected. |
Keywords: | Forecasting, GDP growth, time series |
JEL: | E32 C22 E27 |
Date: | 2013–11 |
URL: | http://d.repec.org/n?u=RePEc:bde:wpaper:1318&r=for |
By: | Jakob W. Messner; Georg J. Mayr; Daniel S. Wilks; Achim Zeileis |
Abstract: | Extended logistic regression is a recent ensemble calibration method that extends logistic regression to provide full continuous probability distribution forecasts. It assumes conditional logistic distributions for the (transformed) predictand and fits these using selected predictand category probabilities. In this study we compare extended logistic regression to the closely related ordered and censored logistic regression models. Ordered logistic regression avoids the logistic distribution assumption but does not yield full probability distribution forecasts, whereas censored regression directly fits the full conditional predictive distributions. To compare the performance of these and other ensemble post-processing methods we used wind speed and precipitation data from two European locations and ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF). Ordered logistic regression performed similarly to extended logistic regression for probability forecasts of discrete categories whereas full predictive distributions were better predicted by censored regression. |
Keywords: | probabilistic forecasting, extended logistic regression, ordered logistic regression, heteroscedasticity |
JEL: | C53 C25 Q42 |
Date: | 2013–10 |
URL: | http://d.repec.org/n?u=RePEc:inn:wpaper:2013-32&r=for |
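As a rough illustration of the extended logistic regression compared in the abstract above, a single logistic model gives P(V &lt;= q) for any threshold q; the g(q) = b*sqrt(q) link is a common choice for precipitation in this literature (Wilks, 2009), and the coefficients below are illustrative placeholders, not fitted values from the study.

```python
import math

def ext_logistic_prob(q, ens_mean, a0, a1, b):
    """Extended logistic regression sketch: P(V <= q) as a logistic CDF
    evaluated at a linear predictor that includes the threshold through
    g(q) = b*sqrt(q). a0, a1 and b would be fitted to training data;
    the values used below are hypothetical."""
    z = a0 + a1 * ens_mean + b * math.sqrt(q)
    return 1.0 / (1.0 + math.exp(-z))

# With a negative ensemble-mean coefficient, a larger ensemble mean
# lowers the probability of staying below a given threshold.
p_low = ext_logistic_prob(1.0, 1.0, 0.0, -0.5, 1.0)
p_high = ext_logistic_prob(4.0, 1.0, 0.0, -0.5, 1.0)
```

Because the threshold enters the predictor directly, the same fitted coefficients yield a full predictive CDF rather than separate models per category, which is the property the abstract contrasts with ordered logistic regression.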
By: | Niels S. Hansen (Aarhus University and CREATES); Asger Lunde (Aarhus University and CREATES) |
Abstract: | In this paper we are interested in the term structure of futures contracts on oil. The objective is to specify a relatively parsimonious model which explains the data well and performs well in real-time out-of-sample forecasting. The dynamic Nelson-Siegel model is normally used to analyze and forecast interest rates of different maturities. The structure of oil futures resembles the structure of interest rates, which motivates the use of this model for our purposes. The data set is vast, and the dynamic Nelson-Siegel model allows for a significant dimension reduction by introducing three factors. By performing a series of cross-section regressions we obtain time series for these factors, and we focus on modeling their joint distribution. Using a copula decomposition, we can set up a model for each factor individually along with a model for their dependence structure. Once a reasonable model for the factors has been specified, it can be used to forecast prices of futures contracts with different maturities. The outcome of this exercise is a class of models which describes the observed futures contracts well and forecasts better than conventional benchmarks. We carry out a real-time value-at-risk analysis and show that our class of models performs well. |
Keywords: | Oil futures, Nelson-Siegel, Normal Inverse Gaussian, GARCH, Copula. |
JEL: | G17 C32 C53 |
Date: | 2013–10–25 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2013-36&r=for |
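A minimal sketch of the Nelson-Siegel building blocks the abstract above refers to, assuming the standard three-factor parameterisation; the decay value 0.0609 is a common choice in the dynamic Nelson-Siegel literature, not a number taken from the paper.

```python
import math

def ns_loadings(tau, lam=0.0609):
    """Nelson-Siegel factor loadings for maturity tau.

    lam is the usual decay parameter; 0.0609 (with tau in months) is a
    common choice in the literature, used here purely for illustration.
    """
    x = lam * tau
    slope = (1.0 - math.exp(-x)) / x       # loading on the slope factor
    curvature = slope - math.exp(-x)       # loading on the curvature factor
    return (1.0, slope, curvature)         # level loading is constant at 1

def ns_curve(tau, level, slope, curvature, lam=0.0609):
    """Fitted value at maturity tau given the three factors."""
    l1, l2, l3 = ns_loadings(tau, lam)
    return level * l1 + slope * l2 + curvature * l3
```

Stacking `ns_loadings` across the observed maturities gives the regressor matrix for one day's cross-section OLS; running that regression day by day yields the three factor time series whose joint distribution the paper then models with copulas.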
By: | Chèze, Benoit; Chevallier, Julien; Gastineau, Pascal |
Abstract: | The aim of this article is to investigate whether anticipated technological progress can be expected to be strong enough to offset the carbon dioxide (CO2) emissions resulting from the rapid growth of air transport. Aviation CO2 emissions projections are provided at the worldwide level and for eight geographical zones until 2025. Total air traffic flows are first forecast using a dynamic panel-data econometric model and then converted into corresponding quantities of air traffic CO2 emissions, through jet fuel demand forecasts, using specific hypotheses and energy factors. None of our nine scenarios appears compatible with the objective of 450 ppm CO2-eq. (a.k.a. “scenario of type I”) recommended by the Intergovernmental Panel on Climate Change (IPCC). Nor is any compatible with the IPCC scenario of type III, which aims at limiting global warming to 3.2°C. Thus, aviation CO2 emissions are unlikely to diminish over the next decade unless there is a radical shift in technology and/or travel demand is restricted. |
Keywords: | Air transport; CO2 emissions; Forecasting; Climate change; |
JEL: | C53 L93 Q47 Q54 |
Date: | 2013–01 |
URL: | http://d.repec.org/n?u=RePEc:dau:papers:123456789/9262&r=for |
By: | Dimitrios D. Thomakos (University of Peloponnese); Konstantinos Nikolopoulos (Bangor Business School) |
Abstract: | In this study, building on earlier work on the properties and performance of the univariate Theta method for a unit-root data-generating process, we: (a) derive new theoretical formulations for the application of the method to multivariate time series; (b) investigate the conditions under which the multivariate Theta method is expected to forecast better than the univariate one; (c) evaluate the bivariate form of the method through simulations; (d) evaluate this latter model on real macroeconomic and financial time series. The study provides sufficient empirical evidence to illustrate the suitability of the method for vector forecasting; furthermore, it provides the motivation for further investigation of the multivariate Theta method in higher dimensions. |
Keywords: | Theta method; univariate; multivariate time series; unit roots; vector forecasting |
Date: | 2013–07 |
URL: | http://d.repec.org/n?u=RePEc:bng:wpaper:13004&r=for |
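For readers unfamiliar with the univariate Theta method the abstract above builds on: a theta line rescales the local curvature of the series around a fitted linear trend. This sketch shows only the classical decomposition, not the multivariate extension derived in the paper.

```python
def theta_lines(x, theta=2.0):
    """Classical Theta decomposition (Assimakopoulos & Nikolopoulos, 2000):
    fit a least-squares linear trend (the theta=0 line) and form the theta
    line Z_t = theta * X_t + (1 - theta) * T_t, which amplifies local
    curvature for theta > 1. Classic forecasting combines an extrapolated
    trend with simple exponential smoothing of the theta=2 line."""
    n = len(x)
    tbar = (n - 1) / 2.0
    xbar = sum(x) / n
    sxx = sum((t - tbar) ** 2 for t in range(n))
    slope = sum((t - tbar) * (x[t] - xbar) for t in range(n)) / sxx
    trend = [xbar + slope * (t - tbar) for t in range(n)]
    zline = [theta * x[t] + (1.0 - theta) * trend[t] for t in range(n)]
    return trend, zline
```

For a perfectly linear series the trend and the theta line coincide with the data; curvature is what the theta parameter magnifies.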
By: | Peter H. Sullivan |
Abstract: | In this paper I compare the performance of three approaches to modeling temporal instability of the relationship between the euro-dollar exchange rate and macroeconomic fundamentals. Each of the three approaches considered -- adaptive learning, Markov-switching and Imperfect Knowledge Economics (IKE) -- recognizes that market participants revise forecasting strategies, at least intermittently, and that, as a result, the relationship between the exchange rate and fundamentals is temporally unstable. The central question in the literature addressed by this paper is which of the three approaches to modeling revisions of market participants' forecasting strategies is most empirically relevant for understanding the connection between currency fluctuations and fundamentals. One objective of comparing the out-of-sample forecasting performance of the three approaches is to test to what extent growth-of-knowledge considerations, as proposed by Frydman and Goldberg (2007, 2011), are empirically relevant for our understanding of currency fluctuations. I find that only the IKE model, developed in Sullivan (2013), is able to significantly outperform the random walk benchmark, suggesting that different sets of fundamentals matter during different time periods in ways that do not conform to an overarching probability law. |
JEL: | F31 C58 E44 E47 |
Date: | 2013–11–10 |
URL: | http://d.repec.org/n?u=RePEc:jmp:jm2013:psu387&r=for |
By: | Estian Calitz (Department of Economics, University of Stellenbosch); Krige Siebrits (Department of Economics, University of Stellenbosch); Ian Stuart (Treasury, Government of South Africa) |
Abstract: | The paper investigates whether fiscal credibility in South Africa (SA) would be enhanced by following the international trend of establishing a fiscal council. Given that fiscal councils and numerical fiscal rules are increasingly seen as complementary aspects of fiscal policymaking frameworks, we survey evidence on fiscal councils, with reference to empirical studies and country experience – Chile in particular. Whilst earlier studies generated inconclusive results about the link between fiscal councils and good fiscal performance, more recent studies have found that the involvement of fiscal councils has contributed to more accurate macroeconomic and budgetary forecasts. In the light of this evidence – in particular, the increasingly recognised need for flexibility in fiscal rules, respect for the country’s political environment in considering the appropriateness of fiscal councils and the importance of transparency in any fiscal regime – we discuss lessons for SA, and the mechanics of our proposal. SA’s fiscal performance and regime are assessed, with reference to the literature’s finding of historical fiscal sustainability and macro fiscal forecasting accuracy and various measures characterising the current transparency-enhancing regime of fiscal discretion. It is recognised that SA does not have numerical fiscal rules and that the National Treasury has not been outperformed by non-government economists in forecasting key variables used in drafting the annual budget. Projections nevertheless become increasingly inaccurate over three-year periods. On average, budget deficit forecasting errors during the previous decade have been lower than in European Union countries. The case for a fiscal council on the basis of better short-term forecasting accuracy alone is not strong. 
Instead of a fiscal council, an institutional innovation is proposed, namely structured bi-annual discussions of government’s macroeconomic budget forecasts in public parliamentary hearings, integrated into the budget process. This avoids drainage of scarce resources from Treasury and political pitfalls encountered elsewhere and might strengthen credibility of medium-term projections. |
Keywords: | fiscal rules, fiscal policy, fiscal council, fiscal transparency, fiscal forecasts |
JEL: | H61 H68 |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:sza:wpaper:wpapers201&r=for |
By: | Sévi, Benoît; Chevallier, Julien |
Abstract: | This paper evaluates the predictability of WTI light sweet crude oil futures by using the variance risk premium, i.e. the difference between model-free measures of implied and realized volatilities. Additional regressors known for their ability to explain crude oil futures prices are also considered, capturing macroeconomic, financial and oil-specific influences. The results indicate that the explanatory power of the (negative) variance risk premium on oil excess returns is particularly strong (up to 25% for the adjusted R-squared across our regressions). It complements other financial (e.g. default spread) and oil-specific (e.g. US oil stocks) factors highlighted in previous literature. |
Keywords: | Oil Futures; Variance Risk Premium; Forecasting; |
JEL: | C32 G17 Q47 |
Date: | 2013–05 |
URL: | http://d.repec.org/n?u=RePEc:dau:papers:123456789/11714&r=for |
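The variance risk premium used as the key predictor in the abstract above is the difference between an option-implied variance measure and realized variance; a minimal sketch, with hypothetical numbers in place of WTI data:

```python
def realized_variance(intraday_returns):
    """Model-free realized variance: the sum of squared intraday returns
    over the measurement window."""
    return sum(r * r for r in intraday_returns)

def variance_risk_premium(implied_var, realized_var):
    """Variance risk premium, following the sign convention in the
    abstract: implied variance minus realized variance."""
    return implied_var - realized_var

# Hypothetical five-minute returns and an illustrative implied-variance
# level; neither is taken from the paper's data.
five_min_returns = [0.001, -0.002, 0.0015, -0.0005]
rv = realized_variance(five_min_returns)
vrp = variance_risk_premium(0.04, rv)
```

In a predictive regression, `vrp` computed at the start of each period would sit on the right-hand side with the macroeconomic and oil-specific controls the abstract lists.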
By: | Nalan Basturk (Erasmus University Rotterdam Econometric Institute, Tinbergen Institute); Cem Cakmakli (University of Amsterdam Department of Quantitative Economics, Koç University); Pinar Ceyhan (Erasmus University Rotterdam Econometric Institute, Tinbergen Institute); Herman K. van Dijk (Erasmus University Rotterdam Econometric Institute, Tinbergen Institute, VU University Amsterdam Department of Econometrics) |
Abstract: | Changing time series properties of US inflation and economic activity, measured as marginal costs, are modeled within a set of extended Phillips Curve (PC) models. It is shown that mechanical removal or modeling of simple low-frequency movements in the data may yield poor predictive results which depend on the model specification used. Basic PC models are extended to include structural time series models that describe typical time-varying patterns in levels and volatilities. Forward- and backward-looking expectation components for inflation are incorporated and their relative importance is evaluated. Survey data on expected inflation are introduced to strengthen the information in the likelihood. Use is made of simulation-based Bayesian techniques for the empirical analysis. No credible evidence is found of endogeneity and long-run stability between inflation and marginal costs. Backward-looking inflation expectations appear stronger than forward-looking ones. Levels and volatilities of inflation are estimated more precisely using rich PC models. The extended PC structures compare favorably with existing basic Bayesian vector autoregressive and stochastic volatility models in terms of fit and prediction. Tails of the complete predictive distributions indicate an increase in the probability of deflation in recent years. |
Keywords: | New Keynesian Phillips curve, unobserved components, time varying parameters, level shifts, inflation expectations, survey data |
JEL: | C11 C32 E31 E37 |
Date: | 2013–11 |
URL: | http://d.repec.org/n?u=RePEc:koc:wpaper:1321&r=for |
By: | Yuta Kurose (Center for the Study of Finance and Insurance, Osaka University); Yasuhiro Omori (Faculty of Economics, University of Tokyo) |
Abstract: | A multivariate stochastic volatility model with dynamic equicorrelation and cross leverage effect is proposed and estimated. Using a Bayesian approach, an efficient Markov chain Monte Carlo algorithm is described where we use the multi-move sampler, which generates multiple latent variables simultaneously. Numerical examples are provided to show its sampling efficiency in comparison with the simple algorithm that generates one latent variable at a time given other latent variables. Furthermore, the proposed model is applied to multivariate daily stock price index data. The empirical study shows that our novel model provides a substantial improvement in forecasting with respect to out-of-sample hedging performance. |
Date: | 2013–11 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2013cf907&r=for |
By: | Lang, Michael |
Abstract: | This paper builds upon the model of Kaminsky and Reinhart (1999) and extends it to triple crises. It applies a new visualisation approach combining elements of an event study analysis and a fan chart technique. This approach illustrates the deviation of fundamentals in the run-up to balance-of-payments problems. The results suggest that both systemic banking crises and deteriorating government finances are highly significant leading indicators. Taking these indicators into account helps build a new early warning system for currency crises. The results are highly significant and robust. The out-of-sample forecasts demonstrate the strong predictive power of the model. |
Keywords: | currency crisis,financial sector vulnerability,early warning system |
JEL: | F30 F31 F34 F41 G01 |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:zbw:fsfmwp:205&r=for |
By: | Alessandro Innocenti; Tommaso Nannicini; Roberto Ricciuti |
Abstract: | We evaluate the impact of timing on decision outcome, when both the timing and the relevant decision are chosen under uncertainty. Betting markets provide the testing ground, as we exploit an original dataset containing more than one million online bets on games of the Italian Major Soccer League. We find that individuals perform systematically better when they place their bets farther away from the game day. The better performance of early bettors holds controlling for (time-invariant) unobservable ability, learning during the season, and timing of the odds. We attribute this result to the increase of noisy information on game day, which hampers the capacity of late (non-professional) bettors to use very simple prediction methods, such as team rankings or last game results. We also find that more successful bettors tend to bet in advance, focus on a smaller set of events, and prefer events associated with smaller betting odds. |
Keywords: | decision timing, information overload, betting, sports forecasting |
JEL: | D81 D83 |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:igi:igierp:502&r=for |
By: | Papadimitriou, Theophilos (Democritus University of Thrace, Department of Economics); Gogas, Periklis (Democritus University of Thrace, Department of Economics); Plakandaras, Vasilios (Democritus University of Thrace, Department of Economics) |
Abstract: | In this paper, we approximate the empirical findings of Papadamou and Markopoulos (2012) on the NOK/USD exchange rate under a Machine Learning (ML) framework. By applying Support Vector Regression (SVR) to a general monetary exchange rate model and a Dynamic Evolving Neuro-Fuzzy Inference System (DENFIS) to extract model structure, we test the validity of popular monetary exchange rate models. We reach mixed results: the sign of the interest rate differential coefficient is consistent only with the model proposed by Bilson (1978), while the sign of the inflation rate differential coefficient is approximated by the model of Frankel (1979). Adopting various inflation expectation estimates, our SVR model fits actual data with a small Mean Absolute Percentage Error when an autoregressive approach excluding energy prices is adopted for inflation expectations. Overall, our empirical findings conclude that for a small open petroleum-producing country such as Norway, fundamentals possess significant forecasting ability when used in exchange rate forecasting. |
Keywords: | International Financial Markets; Foreign Exchange; Support Vector Regression; Monetary exchange rate models |
JEL: | F30 F31 G15 |
Date: | 2013–11–07 |
URL: | http://d.repec.org/n?u=RePEc:ris:duthrp:2013_005&r=for |
By: | Cary Deck (Department of Economics, University of Arkansas and Economic Science Institute, Chapman University); Li Hao (Department of Economics, University of Arkansas); David Porter (Economic Science Institute, Chapman University) |
Abstract: | Laboratory experiments have demonstrated that prediction market prices weakly aggregate the disparate information of the traders about states (moves) of nature. However, in many practical applications one might want to predict the move of a strategic participant. This is particularly important in aggressor-defender contests. This paper reports a set of such experiments where the defender may have the advantage of observing a prediction market on the aggressor’s action. The results of the experiments indicate that: the use of prediction markets does not increase the defender’s win rate; prediction markets contain reliable information regarding aggressors’ decisions, namely excess bid information, that is not being exploited by defenders; and the existence of a prediction market alters the behavior of the aggressor whose behavior is being forecast. |
Keywords: | Information Aggregation, Prediction Markets, Weak-Link Contests, Colonel Blotto |
JEL: | C7 C9 D7 D8 G1 |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:chu:wpaper:13-27&r=for |
By: | Xiangjin B. Chen; Jiti Gao; Degui Li; Param Silvapulle |
Abstract: | This paper introduces a new specification for the heterogeneous autoregressive (HAR) model of the realized volatility of S&P500 index returns. In this new model, the coefficients of the HAR are allowed to be time-varying with unknown functional forms. We propose a local linear method for estimating this TVC-HAR model, as well as a bootstrap method for constructing confidence intervals for the time-varying coefficient functions. In addition, the estimated nonparametric TVC-HAR model is calibrated by fitting parametric polynomial functions that minimise an L2-type criterion. The calibrated TVC-HAR and simple HAR models are tested separately against the nonparametric TVC-HAR model. Test statistics constructed with the generalised likelihood ratio method, augmented with a bootstrap, provide evidence in favour of the calibrated TVC-HAR model. More importantly, the results of the conditional predictive ability test developed by Giacomini and White (2006) indicate that the nonparametric TVC-HAR model consistently outperforms its calibrated counterpart, as well as the simple HAR and HAR-GARCH models, in out-of-sample forecasting. |
Keywords: | Bootstrap method, heterogeneous autoregressive model, locally stationary process, nonparametric method |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2013-21&r=for |
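The baseline HAR structure that the paper above makes time-varying regresses tomorrow's realized volatility on daily, weekly and monthly averages of past realized volatility (Corsi, 2009); a sketch of the regressor construction, with constant coefficients implied:

```python
def har_dataset(rv):
    """Build (regressors, target) pairs for the standard constant-coefficient
    HAR-RV model: RV_t on RV_{t-1}, the 5-day average and the 22-day average
    of past RV. The paper above lets the betas vary over time, which this
    sketch deliberately does not reproduce."""
    X, y = [], []
    for t in range(22, len(rv)):
        daily = rv[t - 1]
        weekly = sum(rv[t - 5:t]) / 5.0
        monthly = sum(rv[t - 22:t]) / 22.0
        X.append((1.0, daily, weekly, monthly))  # intercept plus three lags
        y.append(rv[t])
    return X, y
```

Fitting OLS on `X, y` gives the simple HAR benchmark the abstract compares against; the TVC-HAR replaces the four constant betas with unknown functions of time estimated by local linear smoothing.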
By: | Xibin Zhang; Maxwell L. King |
Abstract: | This paper aims to investigate a Bayesian sampling approach to parameter estimation in the GARCH model with an unknown conditional error density, which we approximate by a mixture of Gaussian densities centered at individual errors and scaled by a common standard deviation. This mixture density has the form of a kernel density estimator of the errors with its bandwidth being the standard deviation. This study is motivated by the lack of robustness in GARCH models with a parametric assumption for the error density when used for error-density based inference such as value-at-risk (VaR) estimation. A contribution of the paper is to construct the likelihood and posterior of the model and bandwidth parameters under the kernel-form error density, and to derive the one-step-ahead posterior predictive density of asset returns. We also investigate the use and benefit of localized bandwidths in the kernel-form error density. A Monte Carlo simulation study reveals that the robustness of the kernel-form error density compensates for the loss of accuracy when using this density. Applying this GARCH model to daily return series of 42 assets in stock, commodity and currency markets, we find that this GARCH model is favored against the GARCH model with a skewed Student t error density for all stock indices, two out of 11 currencies and nearly half of the commodities. This provides an empirical justification for the value of the proposed GARCH model. |
Keywords: | Bayes factors, Gaussian kernel error density, localized bandwidths, Markov chain Monte Carlo, value-at-risk |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2013-19&r=for |
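The kernel-form error density described in the abstract above is a mixture of Gaussians centered at the individual errors with a common standard deviation, which plays the role of the bandwidth; a direct sketch:

```python
import math

def kernel_error_density(x, errors, h):
    """Mixture of Gaussians centered at the individual errors with common
    standard deviation h: exactly a Gaussian kernel density estimator of
    the errors with bandwidth h, as described in the abstract. The paper
    treats h as a parameter with its own posterior; here it is fixed."""
    n = len(errors)
    norm = 1.0 / (n * h * math.sqrt(2.0 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - e) / h) ** 2) for e in errors)
```

With a single error at zero and unit bandwidth, the density reduces to a standard normal, which makes the mixture construction easy to check.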
By: | Xibin Zhang; Maxwell L. King; Han Lin Shang |
Abstract: | We propose to approximate the unknown error density of a nonparametric regression model by a mixture of Gaussian densities with means being the individual error realizations and variance a constant parameter. This mixture density has the form of a kernel density estimator of error realizations. We derive an approximate likelihood and posterior for bandwidth parameters in the kernel-form error density and the Nadaraya-Watson regression estimator and develop a sampling algorithm. A simulation study shows that when the true error density is non-Gaussian, the kernel-form error density is often favored against its parametric counterparts including the correct error density assumption. Our approach is demonstrated through a nonparametric regression model of the Australian All Ordinaries daily return on the overnight FTSE and S&P 500 returns. Using the estimated bandwidths, we derive the one-day-ahead density forecast of the All Ordinaries return, and a distribution-free value-at-risk is obtained. The proposed algorithm is also applied to a nonparametric regression model involved in state–price density estimation based on S&P 500 options data. |
Keywords: | Bayes factors, kernel-form error density, Metropolis-Hastings algorithm, state–price density, value-at-risk |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2013-20&r=for |
By: | Michael J. Lamla (KOF Swiss Economic Institute, ETH Zurich, Switzerland); Lena Dräger (University of Hamburg, Germany); Damjan Pfajfar (University of Tilburg, Netherlands) |
Abstract: | Using the microdata of the Michigan Survey of Consumers, we evaluate whether U.S. consumers form macroeconomic expectations consistent with different economic concepts. We check whether their expectations are in line with the Phillips Curve, the Taylor Rule and the Income Fisher Equation. We observe that 50% of the surveyed population have expectations consistent with the Income Fisher equation and the Taylor Rule, while 25% are in line with the Phillips Curve. However, only 6% of consumers form theory-consistent expectations with respect to all three concepts. For the Taylor Rule and the Phillips curve we observe a strong cyclical pattern. For all three concepts we find significant differences across demographic groups. Evaluating determinants of consistency, we provide evidence that the likelihood of having theory-consistent expectations with respect to the Phillips curve and the Taylor rule falls during recessions and with inflation higher than 2%. Moreover, consistency with respect to all three concepts is affected by changes in the communication policy of the Fed, where the strongest positive effect on consistency comes from the introduction of the official inflation target. Finally, we show that consumers with theory-consistent expectations have lower absolute inflation forecast errors and are closer to professionals' inflation forecasts. |
Keywords: | Macroeconomic expectations, microdata, macroeconomic literacy, central bank communication, consumer forecast accuracy |
JEL: | C25 D84 E31 |
Date: | 2013–11 |
URL: | http://d.repec.org/n?u=RePEc:kof:wpskof:13-345&r=for |
By: | Betz, Frank; Oprica, Silviu; Peltonen, Tuomas A.; Sarlin, Peter |
Abstract: | The paper develops an early-warning model for predicting vulnerabilities leading to distress in European banks using both bank and country-level data. As outright bank failures have been rare in Europe, the paper introduces a novel dataset that complements bankruptcies and defaults with state interventions and mergers in distress. The signals of the early warning model are calibrated not only according to the policy-maker’s preferences between type I and II errors, but also to take into account the potential systemic relevance of each individual financial institution. The key findings of the paper are that complementing bank-specific vulnerabilities with indicators for macro-financial imbalances and banking sector vulnerabilities improves model performance and yields useful out-of-sample predictions of bank distress during the current financial crisis. |
Keywords: | bank distress, early-warning model, prudential policy, signal evaluation |
JEL: | E44 E58 F01 F37 G01 |
Date: | 2013–10 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20131597&r=for |