nep-for New Economics Papers
on Forecasting
Issue of 2012‒05‒15
23 papers chosen by
Rob J Hyndman
Monash University

  1. Forecasting national recessions using state level data By Michael T. Owyang; Jeremy M. Piger; Howard J. Wall
  2. Using the Yield Curve in Forecasting Output Growth and Inflation By Eric Hillebrand; Huiyu Huang; Tae-Hwy Lee; Canlin Li
  3. The short term prediction of analysts' forecast error By Boudt, Kris; De Goeij, Peter; Thewissen, James; Van Campenhout, Geert
  4. Combining Recession Probability Forecasts from a Dynamic Probit Indicator By Thomas Theobald
  5. Robust Ranking of Multivariate GARCH Models by Problem Dimension By Massimiliano Caporin; Michael McAleer
  6. Forecasting demand for high speed rail By Börjesson, Maria
  7. Large time-varying parameter VARs By Koop, Gary; Korobilis, Dimitris
  8. Inter-temporal variation in the travel time and travel cost parameters of transport models By Börjesson, Maria
  9. Do changes in distance-to-default anticipate changes in the credit rating? By Nidhi Aggarwal; Manish Singh; Susan Thomas
  10. Evaluating DSGE model forecasts of comovements By Edward Herbst; Frank Schorfheide
  11. What does financial volatility tell us about macroeconomic fluctuations? By Marcelle Chauvet; Zeynep Senyuz; Emre Yoldas
  12. Stein-Rule Estimation and Generalized Shrinkage Methods for Forecasting Using Many Predictors By Eric Hillebrand; Tae-Hwy Lee
  13. Alternative Modeling for Long Term Risk By Dominique Guegan; Xin Zhao
  14. The Future of Oil: Geology versus Technology By Michael Kumhof; Jaromir Benes; Ondra Kamenik; Susanna Mursula; Marcelle Chauvet; Jack Selody; Douglas Laxton
  15. The Causal Effect of Cognitive Abilities on Economic Behavior: Evidence from a Forecasting Task with Varying Cognitive Load By Ondrej Rydval
  16. Econometric Model of Investment in Fixed Assets in Russian Federation: Estimates and Forecasts By Elena Mitsek; Sergey Mitsek
  17. Methods for Determining Business-Cycle Turning Points under Real-Time Conditions By Daniel Detzer; Christian R. Proaño; Katja Rietzler; Sven Schreiber; Thomas Theobald; Sabine Stephan
  18. Corporate Social Responsibility and Stock Market Efficiency By Leonardo Becchetti; Rocco Ciciretti; Alessandro Giovannelli
  19. A Multivariate Random Walk Model with Slowly Changing Drift and Cross-correlation Applied to Finance By Yuanhua Feng; David Hand; Yuanhua Feng
  20. Estimating a semiparametric asymmetric stochastic volatility model with a Dirichlet process mixture By Mark J. Jensen; John M. Maheu
  21. Forecasting Extreme Volatility of FTSE-100 With Model Free VFTSE, Carr-Wu and Generalized Extreme Value (GEV) Option Implied Volatility Indices By Sheri M. Markose; Yue Peng; Amadeo Alentorn
  22. Tobin’s Q versus CAPE versus CAPER: Predicting Stock Market Returns Using Fundamentals and Momentum By Ed Tower
  23. Modeling with Limited Data: Estimating Potential Growth in Cambodia By Phurichai Rungcharoenkitkul

  1. By: Michael T. Owyang; Jeremy M. Piger; Howard J. Wall
    Abstract: A large literature studies the information contained in national-level economic indicators, such as financial and aggregate economic activity variables, for forecasting U.S. business cycle phases (expansions and recessions). In this paper, we investigate whether there is additional information regarding business cycle phases contained in subnational measures of economic activity. Using a probit model to predict the NBER expansion and recession classification, we assess the forecasting benefits of adding state-level employment growth to a common list of national-level predictors. As the state-level data add a large number of variables to the model, we employ a Bayesian model averaging procedure to construct forecasts. Based on a variety of forecast evaluation metrics, we find that including state-level employment growth substantially improves short-horizon forecasts of the business cycle phase. The gains in forecast accuracy are concentrated during months of national recession. Posterior inclusion probabilities indicate substantial uncertainty regarding which states belong in the model, highlighting the importance of the Bayesian model averaging approach.
    Keywords: Recessions ; Business cycles ; Economic conditions
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2012-013&r=for
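    A minimal sketch of the probit step described above, regressing a one-month-ahead NBER recession indicator on a few national and state-level predictors; the data file, column names and predictor choice are illustrative assumptions, and the paper's Bayesian model averaging over state subsets is not reproduced here.
      import pandas as pd
      import statsmodels.api as sm

      # Hypothetical monthly data: NBER 0/1 indicator, national predictors,
      # and state-level employment growth columns.
      df = pd.read_csv("recession_data.csv", index_col=0, parse_dates=True)
      y = df["nber_recession"]
      X = sm.add_constant(df[["term_spread", "fed_funds", "emp_growth_CA", "emp_growth_TX"]])

      # Regress next month's recession indicator on this month's predictors.
      probit = sm.Probit(y.shift(-1).iloc[:-1], X.iloc[:-1]).fit(disp=0)
      print(probit.predict(X.iloc[[-1]]))     # recession probability for the coming month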
  2. By: Eric Hillebrand (Aarhus University and CREATES); Huiyu Huang (GMO Emerging Markets); Tae-Hwy Lee (University of California, Riverside); Canlin Li (Federal Reserve Board)
    Abstract: Following Diebold and Li (2006), we use the Nelson-Siegel (NS, 1987) yield curve factors. However, the NS yield curve factors are not supervised for a specific forecast target, in the sense that the same factors are used for forecasting different variables, e.g., output growth or inflation. We propose a modified NS factor model, where the new NS yield curve factors are supervised for a specific variable to forecast. We show it outperforms the conventional (non-supervised) NS factor model in out-of-sample forecasting of monthly US output growth and inflation. The original NS factor model combines information (CI) from the predictors and uses factors of the predictors (the yield curve). The new supervised NS factor model combines forecasts (CF) and uses factors of forecasts of output growth or inflation conditional on the yield curve. We formalize the concept of supervision, and demonstrate analytically and numerically how supervision works. For both CF and CI schemes, principal components (PC) may be used in place of the NS factors. In out-of-sample forecasting of U.S. monthly output growth and inflation, we find that supervised CF-factor models (CF-NS, CF-PC) are substantially better than unsupervised CI-factor models (CI-NS, CI-PC), especially at longer forecast horizons.
    Keywords: Level, slope, and curvature of the yield curve, Nelson-Siegel factors, Supervised factor models, Combining forecasts, Principal components.
    JEL: C5 E4 G1
    Date: 2011–12–20
    URL: http://d.repec.org/n?u=RePEc:aah:create:2012-17&r=for
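    A stylized numpy contrast of the two schemes discussed above: combining information (principal components of the raw predictors, then a forecast regression) versus combining forecasts (one fitted value of the target per predictor, then principal components of those fitted values). The Nelson-Siegel factor construction and the paper's exact supervision device are not reproduced; data and dimensions are illustrative.
      import numpy as np

      rng = np.random.default_rng(0)
      T, N, k = 200, 17, 3                 # months, yields at N maturities, factors
      X = rng.standard_normal((T, N))      # predictors, e.g. the yield curve
      y = X[:, :3].sum(axis=1) + rng.standard_normal(T)   # target, e.g. output growth

      def pcs(Z, k):
          """First k principal-component scores of Z."""
          Zc = Z - Z.mean(axis=0)
          _, _, Vt = np.linalg.svd(Zc, full_matrices=False)
          return Zc @ Vt[:k].T

      def fit_forecast(F, y):
          """Regress y_{t+1} on F_t and forecast from the last observation."""
          F1 = np.column_stack([np.ones(len(F)), F])
          beta = np.linalg.lstsq(F1[:-1], y[1:], rcond=None)[0]
          return F1[-1] @ beta

      ci = fit_forecast(pcs(X, k), y)      # CI scheme: factors of the predictors

      yhat = np.empty((T, N))              # CF scheme: factors of per-predictor fits
      for j in range(N):
          Xj = np.column_stack([np.ones(T), X[:, j]])
          yhat[:, j] = Xj @ np.linalg.lstsq(Xj, y, rcond=None)[0]
      cf = fit_forecast(pcs(yhat, k), y)
      print("CI-PC forecast:", ci, " CF-PC forecast:", cf)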
  3. By: Boudt, Kris (KULeuven, Lessius University College, V.U.University of Amsterdam); De Goeij, Peter (Tilburg University); Thewissen, James (KULeuven, Lessius University College, HUBrussel); Van Campenhout, Geert (Hogeschool-Universiteit Brussel (HUB), KULeuven)
    Abstract: We examine the profitability of implementing a short term trading strategy based on predicting the error in analysts' earnings per share forecasts using publicly available information. In the 1998-2010 I/B/E/S data, the strategy of taking a long (short) position in stocks with the most pessimistic (optimistic) consensus forecast and closing the position on the first post-announcement day has an annual gross abnormal return of 16.56%, after correcting for market risk, size, book-to-market and price momentum effects. A key insight is that the profitability of the trading strategy stems from using robust forecasting methods and from focusing on the stocks with the most extreme predicted forecast errors. Trading strategies using least squares regression and/or focusing merely on the sign of the forecast error are not profitable.
    Keywords: Financial analysts, Forecast error, Short term prediction, Trading strategy
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:hub:wpecon:201216&r=for
  4. By: Thomas Theobald (Macroeconomic Policy Institute)
    Abstract: This paper analyzes the real-time out-of-sample performance of three kinds of combination schemes. While for each the set of underlying forecasts is slightly modified, all of them are real-time recession probability forecasts generated by a dynamic probit indicator. Among the considered aggregations the most efficient turns out to be one that neglects the correlations between the forecast errors.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:imk:wpaper:89-2012&r=for
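    One simple aggregation in the spirit of the abstract above, weighting the individual recession-probability forecasts by their inverse past Brier scores so that correlations between forecast errors are ignored; the three indicator series and their short history are purely illustrative.
      import numpy as np

      # Rows: past months; columns: competing probit recession-probability forecasts.
      p = np.array([[0.2, 0.3, 0.1],
                    [0.7, 0.6, 0.8],
                    [0.9, 0.8, 0.7]])
      y = np.array([0, 1, 1])                          # realised recession indicator

      brier = ((p - y[:, None]) ** 2).mean(axis=0)     # past accuracy of each forecast
      w = (1.0 / brier) / (1.0 / brier).sum()          # inverse-Brier weights, no correlations
      p_new = np.array([0.6, 0.5, 0.9])                # current individual forecasts
      print("combined recession probability:", w @ p_new)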
  5. By: Massimiliano Caporin (Dipartimento di Scienze Economiche "Marco Fanno" (Department of Economics and Management), Università degli Studi di Padova); Michael McAleer (Econometrisch Instituut (Econometric Institute), Faculteit der Economische Wetenschappen (Erasmus School of Economics), Erasmus Universiteit, Tinbergen Instituut (Tinbergen Institute).)
    Abstract: During the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. Recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. We provide an empirical comparison of alternative MGARCH models, namely BEKK, DCC, Corrected DCC (cDCC), CCC, OGARCH, Exponentially Weighted Moving Average, and covariance shrinking, using historical data for 89 US equities. We contribute to the literature in several directions. First, we consider a wide range of models, including the recent cDCC and covariance shrinking models. Second, we use a range of tests and approaches for direct and indirect model comparison, including the Model Confidence Set. Third, we examine how the robust model rankings are influenced by the cross-sectional dimension of the problem.
    Keywords: Covariance forecasting, model confidence set, robust model ranking, MGARCH, robust model comparison.
    JEL: C32 C53 C52
    Date: 2012–04
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1206&r=for
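    A short sketch of one of the simpler covariance forecasts in a comparison set of this kind, the exponentially weighted moving average recursion; the return data and the smoothing parameter 0.94 are illustrative assumptions, and the BEKK, DCC and cDCC models themselves are not implemented here.
      import numpy as np

      def ewma_covariance(returns, lam=0.94):
          """EWMA recursion S_{t+1} = lam * S_t + (1 - lam) * r_t r_t';
          returns the next-day covariance forecast."""
          S = np.cov(returns[:20].T)            # initialise from an early window
          for r in returns[20:]:
              r = r[:, None]
              S = lam * S + (1 - lam) * (r @ r.T)
          return S

      rng = np.random.default_rng(1)
      rets = rng.standard_normal((500, 5)) * 0.01    # illustrative daily returns, 5 assets
      print(ewma_covariance(rets).round(6))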
  6. By: Börjesson, Maria (KTH)
    Abstract: It is sometimes argued that standard state-of-practice logit-based models cannot forecast the demand for substantially reduced travel times, for instance due to High Speed Rail (HSR). The present paper investigates this issue by reviewing travel time elasticities for long-distance rail travel in the literature and comparing these with elasticities observed when new HSR lines have opened. This paper also validates the Swedish official long-distance model and its forecasted demand for a proposed new HSR track, using aggregate data revealing how the air-rail modal split varies with the difference in generalized travel time between rail and air. The official linear-in-parameters long-distance model is also compared to a model applying Box-Cox transformations. The paper contributes to the empirical literature on long-distance travel, long-distance elasticities and HSR passenger demand forecasts. Results indicate that the Swedish state-of-practice model, and similar models, are indeed able to predict the demand for HSR reasonably well. The non-linear model, however, has better model fit and slightly higher elasticities.
    Keywords: High speed rail; Travel demand; Forecasting; Air-rail share; Cost-benefit analysis
    JEL: C25 D61 J22 R41 R42
    Date: 2012–05–03
    URL: http://d.repec.org/n?u=RePEc:hhs:ctswps:2012_012&r=for
  7. By: Koop, Gary; Korobilis, Dimitris
    Abstract: In this paper we develop methods for estimation and forecasting in large time-varying parameter vector autoregressive models (TVP-VARs). To overcome computational constraints with likelihood-based estimation of large systems, we rely on Kalman filter estimation with forgetting factors. We also draw on ideas from the dynamic model averaging literature and extend the TVP-VAR so that its dimension can change over time. A final extension lies in the development of a new method for estimating, in a time-varying manner, the parameter(s) of the shrinkage priors commonly-used with large VARs. These extensions are operationalized through the use of forgetting factor methods and are, thus, computationally simple. An empirical application involving forecasting inflation, real output, and interest rates demonstrates the feasibility and usefulness of our approach.
    Keywords: Bayesian VAR; forecasting; time-varying coefficients; state-space model
    JEL: E27 C52 E37 C11
    Date: 2012–02–28
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:38591&r=for
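    A small sketch of the forgetting-factor Kalman recursion underlying the estimation strategy described above, written for a single time-varying-parameter regression rather than the paper's full large TVP-VAR; the forgetting factor 0.99, the measurement variance and the simulated data are illustrative.
      import numpy as np

      def tvp_kalman(y, X, lam=0.99, v=1.0):
          """y_t = x_t' theta_t + e_t with the state covariance inflated by 1/lam
          each period (forgetting factor) instead of an explicit state noise term."""
          T, K = X.shape
          theta = np.zeros(K)
          P = np.eye(K) * 10.0              # diffuse initial state covariance
          fitted = np.empty(T)
          for t in range(T):
              P = P / lam                   # forgetting step
              x = X[t]
              fitted[t] = x @ theta         # one-step-ahead prediction
              F = x @ P @ x + v
              gain = P @ x / F
              theta = theta + gain * (y[t] - fitted[t])
              P = P - np.outer(gain, x) @ P
          return theta, fitted

      rng = np.random.default_rng(2)
      T = 300
      X = np.column_stack([np.ones(T), rng.standard_normal(T)])
      beta = np.column_stack([np.linspace(0, 2, T), np.linspace(1, -1, T)])   # drifting coefficients
      y = (X * beta).sum(axis=1) + 0.5 * rng.standard_normal(T)
      print("final coefficient estimates:", tvp_kalman(y, X)[0])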
  8. By: Börjesson, Maria (KTH, ABE)
    Abstract: The parameters for travel time and travel cost are central in travel demand forecasting models. Since valuation of infrastructure investments requires prediction of travel demand for future evaluation years, inter-temporal variation of the travel time and travel cost parameters is a key issue in forecasting. Using two identical stated choice experiments conducted among Swedish drivers with an interval of 13 years, 1994 and 2007, this paper estimates the inter-temporal variation in travel time and cost parameters. It is found that the travel time parameter has remained constant over time but that the travel cost parameter has declined in real terms. The trend decline in the cost parameter can be entirely explained by the higher average income level in the 2007 sample compared to the 1994 sample. The results support the recommendation to keep the travel time parameter constant over time in forecast models but to deflate the travel cost parameter according to forecasts of income increases among travellers and the relevant income elasticity of the cost parameter. Evidence from this study further suggests that the inter-temporal and the cross-sectional income elasticities of the cost parameter are equal. The average elasticity is found to be between -0.8 and -0.9 in the present sample of drivers, and the elasticity is found to increase with real income level, both in the cross-section and over time.
    Keywords: Travel demand forecasting; Inter-temporal income elasticity; Marginal disutility of time; Marginal disutility of cost; Time parameter; Cost parameter; Stated preference; Replicated survey
    JEL: C25 D61 J22 R41 R42
    Date: 2012–05–04
    URL: http://d.repec.org/n?u=RePEc:hhs:ctswps:2012_016&r=for
  9. By: Nidhi Aggarwal (Indira Gandhi Institute of Development Research); Manish Singh (Indira Gandhi Institute of Development Research); Susan Thomas (Indira Gandhi Institute of Development Research)
    Abstract: Distance-to-default (DtD) from the Merton model has been used in the credit risk literature, most successfully as an input into reduced form models for forecasting default. In this paper, we suggest that the change in the DtD is informative for predicting change in the credit rating. This is directly useful for situations where forecasts of credit rating changes are required. More generally, it contributes to our knowledge about reduced form models of credit risk.
    Keywords: Distance to Default, rating downgrades, rating change, forecasts, event study analysis, probit models, simulation, bootstrap, crisis analysis
    JEL: C53 C58 G14 G17 G21
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:ind:igiwpp:2012-010&r=for
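    For reference, a minimal computation of the Merton-model distance to default that serves as the paper's key input; in practice the asset value and asset volatility must first be backed out from equity prices, a step this sketch skips by assuming them directly.
      import numpy as np

      def distance_to_default(V, F, mu, sigma, T=1.0):
          """Merton DtD = [ln(V/F) + (mu - 0.5*sigma^2) * T] / (sigma * sqrt(T)),
          with V asset value, F face value of debt, mu asset drift, sigma asset volatility."""
          return (np.log(V / F) + (mu - 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))

      # Illustrative firm: assets 120, debt 100, drift 6%, asset volatility 25%, 1-year horizon.
      print(distance_to_default(V=120.0, F=100.0, mu=0.06, sigma=0.25))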
  10. By: Edward Herbst; Frank Schorfheide
    Abstract: This paper develops and applies tools to assess multivariate aspects of Bayesian Dynamic Stochastic General Equilibrium (DSGE) model forecasts and their ability to predict comovements among key macroeconomic variables. We construct posterior predictive checks to evaluate conditional and unconditional density forecasts, in addition to checks for root-mean-squared errors and event probabilities associated with these forecasts. The checks are implemented on a three-equation DSGE model as well as the Smets and Wouters (2007) model using real-time data. We find that the additional features incorporated into the Smets-Wouters model do not lead to a uniform improvement in the quality of density forecasts and prediction of comovements of output, inflation, and interest rates.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2012-11&r=for
  11. By: Marcelle Chauvet; Zeynep Senyuz; Emre Yoldas
    Abstract: This paper provides an extensive analysis of the predictive ability of financial volatility measures for economic activity. We construct monthly measures of aggregated and industry-level stock volatility, and bond market volatility from daily returns. We model log financial volatility as composed of a long-run component that is common across all series, and a short-run component. Because volatility proxies are characterized by large measurement error, this decomposition helps uncover their fundamental information and relationship with the economy. We find that there are substantial gains from using the long-run component of the volatility measures for linearly projecting future economic activity, as well as for forecasting business cycle turning points. When we allow for asymmetry in the long-run volatility component, we find that it provides early signals of upcoming recessions. In a real-time out-of-sample analysis of the last recession, we find that these signals are concomitant with the first signs of distress in the financial markets due to problems in the housing sector around mid-2007, and the implied chronology is consistent with the crisis timeline.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2012-09&r=for
  12. By: Eric Hillebrand (Aarhus University and CREATES); Tae-Hwy Lee (University of California, Riverside)
    Abstract: We examine the Stein-rule shrinkage estimator for possible improvements in estimation and forecasting when there are many predictors in a linear time series model. We consider the Stein-rule estimator of Hill and Judge (1987) that shrinks the unrestricted unbiased OLS estimator towards a restricted biased principal component (PC) estimator. Since the Stein-rule estimator combines the OLS and PC estimators, it is a model-averaging estimator and produces a combined forecast. The conditions under which the improvement can be achieved depend on several unknown parameters that determine the degree of the Stein-rule shrinkage. We conduct Monte Carlo simulations to examine these parameter regions. The overall picture that emerges is that the Stein-rule shrinkage estimator can dominate both OLS and principal components estimators within an intermediate range of the signal-to-noise ratio. If the signal-to-noise ratio is low, the PC estimator is superior. If the signal-to-noise ratio is high, the OLS estimator is superior. In out-of-sample forecasting with AR(1) predictors, the Stein-rule shrinkage estimator can dominate both OLS and PC estimators when the predictors exhibit low persistence.
    Keywords: Stein-rule, shrinkage, risk, variance-bias tradeoff, OLS, principal components.
    JEL: C1 C2 C5
    Date: 2012–04–30
    URL: http://d.repec.org/n?u=RePEc:aah:create:2012-18&r=for
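    A stylised sketch of shrinking the unrestricted OLS estimator toward a restricted principal-components estimator; the fixed convex-combination weight below is an illustrative stand-in, not the Hill and Judge (1987) Stein-rule weight analysed in the paper.
      import numpy as np

      rng = np.random.default_rng(3)
      T, N, k = 150, 20, 3
      X = rng.standard_normal((T, N))
      y = X @ np.r_[np.ones(3), np.zeros(N - 3)] + rng.standard_normal(T)

      b_ols = np.linalg.lstsq(X, y, rcond=None)[0]          # unrestricted OLS estimator

      # Restricted PC estimator: regress y on the first k principal components,
      # then map the factor coefficients back to the predictor space.
      Xc = X - X.mean(axis=0)
      _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
      F = Xc @ Vt[:k].T
      g = np.linalg.lstsq(F, y - y.mean(), rcond=None)[0]
      b_pc = Vt[:k].T @ g

      w = 0.5                                               # illustrative shrinkage weight
      b_shrink = w * b_ols + (1 - w) * b_pc                 # combined (model-averaging) estimator
      print(np.round(b_shrink[:5], 3))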
  13. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon Sorbonne, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Xin Zhao (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon Sorbonne)
    Abstract: In this paper, we propose an alternative approach to estimate long-term risk. Instead of using the static square root method, we use a dynamic approach based on volatility forecasting by non-linear models. We explore the possibility of improving the estimations with different models and distributions. By comparing the estimates of two risk measures, value at risk and expected shortfall, obtained with different models and innovations at short-, medium- and long-term horizons, we find that the best model varies with the forecasting horizon and that the generalized Pareto distribution gives the most conservative estimates with all the models at all horizons. The empirical results show that the square root method underestimates risk at long horizons and that our approach is more competitive for long-term risk estimation.
    Keywords: Long memory, Value at Risk, expected shortfall, extreme value distribution.
    Date: 2012–04
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00694449&r=for
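    A short illustration of the static benchmark the paper argues against, the square-root-of-time scaling of a one-day value at risk, next to a dynamic alternative that aggregates a GARCH(1,1) variance forecast over the horizon; the volatility level and GARCH parameters are assumed for illustration.
      import numpy as np

      sigma_1d = 0.012                  # assumed one-day volatility
      z99 = 2.326                       # 99% normal quantile
      h = 250                           # long horizon in days

      var_sqrt = z99 * sigma_1d * np.sqrt(h)            # static square-root-of-time rule

      # Dynamic alternative: iterate the GARCH(1,1) variance forecast and sum it.
      omega, alpha, beta = 1e-6, 0.08, 0.90             # illustrative GARCH parameters
      sigma2, total_var = sigma_1d ** 2, 0.0
      for _ in range(h):
          total_var += sigma2
          sigma2 = omega + (alpha + beta) * sigma2      # multi-step-ahead recursion
      var_dyn = z99 * np.sqrt(total_var)

      print(f"sqrt-rule VaR: {var_sqrt:.3f}, GARCH-based VaR: {var_dyn:.3f}")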
  14. By: Michael Kumhof; Jaromir Benes; Ondra Kamenik; Susanna Mursula; Marcelle Chauvet; Jack Selody; Douglas Laxton
    Abstract: We discuss and reconcile two diametrically opposed views concerning the future of world oil production and prices. The geological view expects that physical constraints will dominate the future evolution of oil output and prices. It is supported by the fact that world oil production has plateaued since 2005 despite historically high prices, and that spare capacity has been near historic lows. The technological view of oil expects that higher oil prices must eventually have a decisive effect on oil output, by encouraging technological solutions. It is supported by the fact that high prices have, since 2003, led to upward revisions in production forecasts based on a purely geological view. We present a nonlinear econometric model of the world oil market that encompasses both views. The model performs far better than existing empirical models in forecasting oil prices and oil output out of sample. Its point forecast is for a near doubling of the real price of oil over the coming decade. The error bands are wide, and reflect sharply differing judgments on ultimately recoverable reserves, and on future price elasticities of oil demand and supply.
    Keywords: Demand, Economic models, External shocks, Oil prices, Oil production, Supply
    Date: 2012–05–02
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:12/109&r=for
  15. By: Ondrej Rydval
    Abstract: We identify the causal effect of cognitive abilities on economic behavior in an experimental setting. Using a forecasting task with varying cognitive load, we identify the causal effect of working memory on subjects' forecasting performance, while also accounting for the effect of other cognitive, personality and demographic characteristics. Addressing the causality is important for understanding the nature of various decision-making errors, as well as for providing reliable policy implications in contexts such as student placement, personnel assignment, and public policy programs designed to augment abilities of the disadvantaged. We further argue that establishing the causality of cognitive abilities is a prerequisite for studying their interaction with financial incentives, with implications for the design of efficient incentive schemes.
    Keywords: cognitive ability; causality; experiment; financial incentives; performance; working memory;
    JEL: C81 C91 D80 D83 J24
    Date: 2012–04
    URL: http://d.repec.org/n?u=RePEc:cer:papers:wp457&r=for
  16. By: Elena Mitsek; Sergey Mitsek
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:deg:conpap:c016_030&r=for
  17. By: Daniel Detzer; Christian R. Proaño; Katja Rietzler; Sven Schreiber (IMK at the Hans-Boeckler-Foundation); Thomas Theobald (IMK at the Hans-Boeckler-Foundation); Sabine Stephan (IMK at the Hans-Boeckler-Foundation)
    Abstract: Forecasting business-cycle turning points under real-time conditions. One of the greatest challenges in business cycle research is the timely and reliable identification of cyclical turning points. Data availability in real time constitutes a fundamental problem: first, there is a publication lag of several months for some of the indicators concerning the real economy, and second, those indicators are subject to substantial revisions even afterwards. The IMK undertook a systematic analysis of the problem of detecting business-cycle turning points in real time for Germany, applying and comparing four different econometric model classes. The employed methods recognize turning points two to four months ahead of official statistics in real time, for the evaluation sample of 2007 through 2010. A (nonlinear) dynamic probit model and a (linear) so-called subset VAR model seem especially well suited for this task. Based on our research results, we conclude that it is advisable to combine many indicators when detecting turning points.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:imk:studie:27-2012&r=for
  18. By: Leonardo Becchetti (Faculty of Economics, University of Rome "Tor Vergata"); Rocco Ciciretti (Faculty of Economics, University of Rome "Tor Vergata"); Alessandro Giovannelli (Faculty of Economics, University of Rome "Tor Vergata")
    Abstract: We investigate the relationship between Corporate Social Responsibility (hereafter CSR) and I/B/E/S Details analysts' earnings per share (EPS) forecasts using a large sample of US firm forecasts for the 1997-2004 period. We show that the net difference between CSR strengths and weaknesses significantly reduces both the absolute earnings forecast error and its standard deviation after controlling for standard regressors and year, industry, and firm/broker effects. Our findings are consistent with the hypothesis that reduced transaction costs (and conflicts) with stakeholders and more transparent accounting practices implied by CSR significantly affect the bias. The CSR effect is strongly asymmetric and mainly driven by CSR weaknesses, consistent with the fact that the predicted channels of influence are mainly captured by CSR weakness scores. A crucial aspect of our findings is that CSR contributes to making financial markets efficient, as unbiasedness and efficiency are (in almost all specifications) not violated in the subsample of the top 20 percent (lowest CSR weaknesses) companies, while they are in the bottom 20 percent CSR companies.
    Keywords: Earnings per Share; Analyst Forecast; Corporate Social Responsibility
    JEL: D84 E44 F30 G17 C53
    Date: 2012–05–04
    URL: http://d.repec.org/n?u=RePEc:rtv:ceisrp:233&r=for
  19. By: Yuanhua Feng (University of Paderborn); David Hand (Imperial College); Yuanhua Feng (Brunel University)
    Abstract: A new multivariate random walk model with slowly changing drift and cross-correlations for multivariate processes is introduced and investigated in detail. In the model, not only the drifts and the cross-covariances but also the cross-correlations between single series are allowed to change slowly over time. The model can accommodate any number of components, such as a large number of assets. The model is particularly useful for modelling and forecasting the value of financial portfolios under very complex market conditions. Kernel estimation of the local covariance matrix is used. The integrated effect of the estimation errors involved in estimating the integrated processes is derived. The practical relevance of the model and its estimation is illustrated by an application to several foreign exchange rates.
    Keywords: Forecasting, Kernel estimation, Multivariate time series analysis, Portfolio return, Slowly changing multivariate random walk
    Date: 2012–05
    URL: http://d.repec.org/n?u=RePEc:pdn:wpaper:50&r=for
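    A minimal numpy sketch of the kernel estimation device mentioned above: each local estimate is a Gaussian-kernel-weighted mean and covariance of the multivariate returns around a target date. The bandwidth and the simulated data are illustrative, and the paper's drift and cross-correlation decomposition is not reproduced.
      import numpy as np

      def local_covariance(returns, t0, bandwidth=30.0):
          """Gaussian-kernel-weighted covariance of multivariate returns around time t0."""
          T, N = returns.shape
          w = np.exp(-0.5 * ((np.arange(T) - t0) / bandwidth) ** 2)
          w /= w.sum()
          mu = w @ returns                               # locally weighted mean (drift)
          demeaned = returns - mu
          return (demeaned * w[:, None]).T @ demeaned    # locally weighted covariance

      rng = np.random.default_rng(4)
      rets = rng.standard_normal((400, 3)) * 0.01        # illustrative returns, 3 exchange rates
      print(local_covariance(rets, t0=200).round(6))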
  20. By: Mark J. Jensen; John M. Maheu
    Abstract: In this paper, we extend the parametric, asymmetric, stochastic volatility model (ASV), where returns are correlated with volatility, by flexibly modeling the bivariate distribution of the return and volatility innovations nonparametrically. Its novelty is in modeling the joint, conditional, return-volatility distribution with an infinite mixture of bivariate Normal distributions with mean zero vectors, but having unknown mixture weights and covariance matrices. This semiparametric ASV model nests stochastic volatility models whose innovations are distributed as either Normal or Student-t distributions, and the response of volatility to unexpected return shocks is more general than the fixed asymmetric response of the parametric ASV model. The unknown mixture parameters are modeled with a Dirichlet process prior. This prior ensures a parsimonious, finite, posterior mixture that best represents the distribution of the innovations and a straightforward sampler of the conditional posteriors. We develop a Bayesian Markov chain Monte Carlo sampler to fully characterize the parametric and distributional uncertainty. Nested model comparisons and out-of-sample predictions with the cumulative marginal likelihoods and one-day-ahead predictive log-Bayes factors between the semiparametric and parametric versions of the ASV model show that the semiparametric model projects more accurate empirical market returns. A major reason is how volatility responds to an unexpected market movement. When the market is tranquil, expected volatility reacts to a negative (positive) price shock by rising (initially declining, but then rising when the positive shock is large). However, when the market is volatile, the degree of asymmetry and the size of the response in expected volatility are muted. In other words, when times are good, no news is good news, but when times are bad, neither good nor bad news matters with regard to volatility.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fedawp:2012-06&r=for
  21. By: Sheri M. Markose; Yue Peng; Amadeo Alentorn
    Abstract: Since their introduction in 2003, volatility indices such as the VIX, based on the model-free implied volatility (MFIV), have become the industry standard for assessing equity market volatility. MFIV suffers from an estimation bias that typically understates volatility during extreme market conditions, owing to sparse data for options traded at very high or very low strike prices (Jiang and Tian, 2007). To address this problem, we propose modifications to the CBOE MFIV using the Carr and Wu (2009) moneyness-based interpolations and extrapolations of implied volatilities, and the so-called GEV-IV derived from the Generalised Extreme Value (GEV) option pricing model of Markose and Alentorn (2011). GEV-IV gives the best forecasting performance for realised volatility of the FTSE-100, compared to the model-free VFTSE, Black-Scholes IV and the Carr-Wu case, both during normal and extreme market conditions in 2008, when realised volatility peaked at 80%. The success of GEV-IV comes from the explicit modelling of the implied tail shape parameter and the time scaling of volatility in the risk neutral density, which can rapidly and flexibly reflect extreme market sentiments present in traded option prices.
    Date: 2012–03–01
    URL: http://d.repec.org/n?u=RePEc:esx:essedp:713&r=for
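    For orientation, a sketch of the CBOE-style model-free implied variance computation underlying indices such as the VIX and VFTSE; the option quotes below are made up, and the Carr-Wu moneyness-based interpolation and the GEV pricing refinement proposed in the paper are not shown.
      import numpy as np

      def model_free_implied_variance(strikes, quotes, F, K0, r, T):
          """MFIV = (2/T) * sum_i dK_i / K_i^2 * exp(r*T) * Q(K_i) - (1/T) * (F/K0 - 1)^2,
          where Q(K_i) are out-of-the-money option mid quotes."""
          strikes = np.asarray(strikes, float)
          dK = np.gradient(strikes)                      # strike spacing
          term1 = (2.0 / T) * np.sum(dK / strikes ** 2 * np.exp(r * T) * quotes)
          term2 = (1.0 / T) * (F / K0 - 1.0) ** 2
          return term1 - term2

      # Illustrative FTSE-style inputs: strikes, OTM mid quotes, forward, rate, maturity.
      strikes = np.arange(5400, 6650, 50)
      quotes = np.maximum(5.0, 120.0 - 0.02 * np.abs(strikes - 6000))
      sigma2 = model_free_implied_variance(strikes, quotes, F=6010.0, K0=6000.0, r=0.01, T=30 / 365)
      print("volatility index level:", 100 * np.sqrt(sigma2))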
  22. By: Ed Tower
    Abstract: This paper predicts the stock market using Tobin's q, momentum, the Campbell-Shiller CAPE, and a new variant of the CAPE, the CAPER, whose trend earnings are calculated using regressions of log earnings on time. The CAPER is superior to the CAPE, but q emerges as by far the best of the predictors. Two versions of the model are built. The one with momentum predicts a 29% fall in real wealth over the eight years from end-2010. The one without momentum predicts real wealth to increase over all time horizons, but by only 32% even after fifteen years.
    Keywords: CAPE, CAPER, Tobin’s q, momentum, stock market
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:duk:dukeec:12-02&r=for
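    A minimal sketch of the CAPER construction described above: fit a time trend to log real earnings by least squares and use the exponentiated fitted value at the forecast date as trend earnings; the earnings series and index level here are illustrative.
      import numpy as np

      years = np.arange(1990, 2011)
      rng = np.random.default_rng(5)
      earnings = 30.0 * np.exp(0.04 * (years - 1990) + 0.1 * rng.standard_normal(len(years)))

      cape_earnings = earnings[-10:].mean()             # CAPE-style: trailing 10-year average

      # CAPER-style: trend earnings from a regression of log earnings on time.
      slope, intercept = np.polyfit(years, np.log(earnings), 1)
      trend_earnings = np.exp(intercept + slope * years[-1])

      price = 1200.0                                    # illustrative real index level
      print("CAPE:", price / cape_earnings, " CAPER:", price / trend_earnings)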
  23. By: Phurichai Rungcharoenkitkul
    Abstract: This paper proposes a framework to analyze long-term potential growth that combines a simple quantitative model with an investigative approach of ‘growth diagnostics’. The framework is used to forecast potential growth for Cambodia, and to conduct simulations about the main drivers of growth in that country. The main result is that Cambodia compares less favorably against other lower-income Asian economies in terms of its investment rate, which in turn is constrained by the poor quality of its infrastructure. Bridging this gap can lift Cambodia’s potential growth by more than one percentage point.
    Keywords: Economic growth, Economic models, Production growth
    Date: 2012–04–11
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:12/96&r=for

This nep-for issue is ©2012 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.