nep-for New Economics Papers
on Forecasting
Issue of 2013‒06‒04
fifteen papers chosen by
Rob J Hyndman
Monash University

  1. Forecasting Day-Ahead Electricity Prices: Utilizing Hourly Prices By Eran Raviv; Kees E. Bouwman; Dick van Dijk
  2. GFC-Robust Risk Management under the Basel Accord using Extreme Value Methodologies By Juan-Angel Jimenez-Martin; Michael McAleer; Teodosio Perez Amaral; Paulo Araujo Santos
  3. The impact of forecasting errors on warehouse labor efficiency: A case study in consumer electronics By Kim, T.Y.; Dekker, R.; Heij, C.
  4. Has the Basel II Accord Encouraged Risk Management During the 2008-09 Financial Crisis? By Michael McAleer; Juan-Angel Jimenez-Martin; Teodosio Pérez-Amaral
  5. What Do Experts Know About Forecasting Journal Quality? A Comparison with ISI Research Impact in Finance? By Chang, C.L.; McAleer, M.J.
  6. Worldwide equity Risk Prediction By David Ardia; Lennart F. Hoogerheide
  7. Forecasting Value-at-Risk using Block Structure Multivariate Stochastic Volatility Models By Manabu Asai; Massimiliano Caporin; Michael McAleer
  8. Modeling and Simulation: An Overview By Michael McAleer; Felix Chan; Les Oxley
  9. Exchange Rate Predictability and a Monetary Model with Time-varying Cointegration Coefficients By Cheolbeom Park; Sookyung Park
  10. Forecasting the Public's Acceptability of Municipal Water Regulation and Price Rationing for Communities on the Ogallala Aquifer By Edwards, Jeffrey A.; Wade, Tara R.; Burkey, Mark L.; Pumphrey, R. Gary
  11. How Pro-Poor Growth Affects the Demand for Energy By Paul Gertler; Orie Shelef; Catherine Wolfram; Alan Fuchs
  12. Post-recession US employment through the lens of a non-linear Okun’s law By Menzie Chinn; Laurent Ferrara; Valérie Mignon
  13. Semi-automatic Non-linear Model selection By Jennifer Castle; David Hendry
  14. 'Lucas' In The Laboratory By Elena Asparouhova; Peter Bossaerts; Nilanjan Roy; William Zame
  15. Ten Things You Should Know About DCC By Caporin, M.; McAleer, M.J.

  1. By: Eran Raviv (Erasmus University Rotterdam); Kees E. Bouwman (Erasmus University Rotterdam); Dick van Dijk (Erasmus University Rotterdam)
    Abstract: The daily average price of electricity represents the price of electricity to be delivered over the full next day and serves as a key reference price in the electricity market. It is an aggregate that equals the average of hourly prices for delivery during each of the 24 individual hours. This paper demonstrates that the disaggregated hourly prices contain useful predictive information for the daily average price. Multivariate models for the full panel of hourly prices significantly outperform univariate models of the daily average price, with reductions in Root Mean Squared Error of up to 16%. Substantial care is required in order to achieve these forecast improvements. Rich multivariate models are needed to exploit the relations between different hourly prices, but the risk of overfitting must be mitigated by using dimension reduction techniques, shrinkage and forecast combinations.
    Keywords: Electricity market, Forecasting, Hourly prices, Dimension reduction, Shrinkage, Forecast combinations
    JEL: C53 C32 Q47
    Date: 2013–05–17
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:2013068&r=for
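The combination strategy the abstract describes (deriving the daily forecast from the panel of hourly forecasts, then guarding against overfitting by combining competing forecasts) can be sketched as follows. This is an illustrative outline with hypothetical function names and naive equal weights, not the authors' actual specification:

```python
import numpy as np

def rmse(forecast, actual):
    """Root Mean Squared Error between two 1-D arrays of daily prices."""
    forecast = np.asarray(forecast, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return float(np.sqrt(np.mean((forecast - actual) ** 2)))

def daily_from_hourly(hourly_forecasts):
    """The daily average price equals the mean of the 24 hourly prices,
    so a panel of hourly forecasts (n_days x 24) implies a daily forecast."""
    return np.asarray(hourly_forecasts, dtype=float).mean(axis=1)

def combine(forecasts, weights=None):
    """Equal-weighted (or weighted) combination of competing daily forecasts,
    a simple hedge against overfitting any single rich multivariate model."""
    forecasts = np.asarray(forecasts, dtype=float)  # shape: (n_models, n_days)
    if weights is None:
        weights = np.full(forecasts.shape[0], 1.0 / forecasts.shape[0])
    return weights @ forecasts
```

Comparing `rmse(combine(...), actual)` against the RMSE of a univariate daily model is the kind of evaluation the paper's reported 16% improvement refers to.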
  2. By: Juan-Angel Jimenez-Martin (Complutense University of Madrid, Spain); Michael McAleer (Complutense University of Madrid, Spain, Erasmus School of Economics, Erasmus University Rotterdam, The Netherlands, and Kyoto University, Japan); Teodosio Perez Amaral (Complutense University of Madrid, Spain); Paulo Araujo Santos (University of Lisbon, Portugal)
    Abstract: In this paper we provide further evidence on the suitability of the median of the point VaR forecasts of a set of models as a GFC-robust strategy by using an additional set of new extreme value forecasting models and by extending the sample period for comparison. These extreme value models include DPOT and Conditional EVT. Such models might be expected to be useful in explaining financial data, especially in the presence of extreme shocks that arise during a GFC. Our empirical results confirm that the median remains GFC-robust even in the presence of these new extreme value models. This is illustrated by using the S&P500 index before, during and after the 2008-09 GFC. We investigate the performance of a variety of single and combined VaR forecasts in terms of daily capital requirements and violation penalties under the Basel II Accord, as well as other criteria, including several tests for independence of the violations. The strategy based on the median, or more generally, on combined forecasts of single models, is straightforward to incorporate into existing computer software packages that are used by banks and other financial institutions.
    Keywords: Value-at-Risk (VaR), DPOT, daily capital charges, robust forecasts, violation penalties, optimizing strategy, aggressive risk management, conservative risk management, Basel, global financial crisis
    JEL: G32 G11 G17 C53 C22
    Date: 2013–05–21
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:2013070&r=for
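The median-of-models strategy is straightforward to express in code, which is part of the paper's point about ease of adoption. The sketch below is a minimal illustration (hypothetical helper names; VaR here is a negative return threshold), not the authors' implementation:

```python
import numpy as np

def median_var(var_forecasts):
    """GFC-robust combined forecast: the per-day median of the point VaR
    forecasts produced by a set of candidate models.
    var_forecasts: (n_models, n_days) array of (negative) VaR thresholds."""
    return np.median(np.asarray(var_forecasts, dtype=float), axis=0)

def count_violations(returns, var):
    """A violation occurs on days when the realised return falls below the
    forecast VaR, i.e., the loss exceeds the estimate; under Basel II the
    violation count drives the penalty multiplier on capital charges."""
    returns = np.asarray(returns, dtype=float)
    return int(np.sum(returns < np.asarray(var, dtype=float)))
```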
  3. By: Kim, T.Y.; Dekker, R.; Heij, C.
    Abstract: Efficiency of outbound warehouse operations depends on the management of demand forecasts and associated labor planning. A case study in consumer electronics shows that warehouse management systematically over-forecasts actual orders, by 3% on average and by 6-12% in busy periods (at the end of each month and also in the months September, October, and November). A time series model that corrects order forecasts for the biases in preceding weeks reduces the bias to less than 2%, both on average and also in busy periods. The arrangements with the labor provider imply potential benefits of intentional over-forecasting and the associated ample labor supply for the warehouse. As compared to under-forecasted days, labor productivity on over-forecasted days is higher by 12% for loading activities and by 4% for picking and total outbound activities. Similar productivity gains are found if unbiased forecasts are compared with the optimal bias obtained from non-linear models estimated from daily data on bias and labor efficiency. The positive effects of intentional over-forecasting on productivity are confirmed in a structural equations model. By following similar methodologies as described in this paper, warehouse managers can determine the amount of intentional forecast bias that works best for their situation. The information required for this kind of evidence-based labor management consists of historical data on order sizes, forecasts, and labor productivity, and the outcomes depend on the available hiring strategies and cost structures.
    Keywords: time series;forecasting;decision support;case study;labor efficiency;warehouse planning
    Date: 2013–05–01
    URL: http://d.repec.org/n?u=RePEc:dgr:eureir:1765040238&r=for
  4. By: Michael McAleer (Erasmus University Rotterdam, The Netherlands; National Chung Hsing University, Taiwan); Juan-Angel Jimenez-Martin (Complutense University of Madrid, Spain); Teodosio Pérez-Amaral (Complutense University of Madrid, Spain)
    Abstract: The Basel II Accord requires that banks and other Authorized Deposit-taking Institutions (ADIs) communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realised losses exceed the estimated VaR. In this paper we define risk management in terms of choosing sensibly from a variety of risk models, discuss the selection of optimal risk models, consider combining alternative risk models, discuss the choice between a conservative and aggressive risk management strategy, and evaluate the effects of the Basel II Accord on risk management. We also examine how risk management strategies performed during the 2008-09 financial crisis, evaluate how the financial crisis affected risk management practices, forecasting VaR and daily capital charges, and discuss alternative policy recommendations, especially in light of the financial crisis. These issues are illustrated using Standard and Poor’s 500 Index, with an emphasis on how risk management practices were monitored and encouraged by the Basel II Accord regulations during the financial crisis.
    Keywords: Value-at-Risk (VaR), daily capital charges, exogenous and endogenous violations, violation penalties, optimizing strategy, risk forecasts, aggressive or conservative risk management strategies, Basel II Accord, financial crisis
    JEL: G32 G11 G17 C53 C22
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:2009039&r=for
  5. By: Chang, C.L.; McAleer, M.J.
    Abstract: Experts possess knowledge and information that are not publicly available. The paper is concerned with forecasting academic journal quality and research impact using a survey of international experts from a national project on ranking academic finance journals in Taiwan. A comparison is made with publicly available bibliometric data, namely the Thomson Reuters ISI Web of Science citations database (hereafter ISI) for the Business - Finance (hereafter Finance) category. The paper analyses the leading international journals in Finance using expert scores and quantifiable Research Assessment Measures (RAMs), and highlights the similarities and differences in the expert scores and alternative RAMs, where the RAMs are based on alternative transformations of citations taken from the ISI database. Alternative RAMs may be calculated annually or updated daily to answer the perennial questions as to When, Where and How (frequently) published papers are cited (see Chang et al. (2011a, b, c)). The RAMs include the most widely used RAM, namely the classic 2-year impact factor including journal self citations (2YIF), 2-year impact factor excluding journal self citations (2YIF*), 5-year impact factor including journal self citations (5YIF), Immediacy (or zero-year impact factor (0YIF)), Eigenfactor, Article Influence, C3PO (Citation Performance Per Paper Online), h-index, PI-BETA (Papers Ignored - By Even The Authors), 2-year Self-citation Threshold Approval Ratings (2Y-STAR), Historical Self-citation Threshold Approval Ratings (H-STAR), Impact Factor Inflation (IFI), and Cited Article Influence (CAI). As data are not available for 5YIF, Article Influence and CAI for 13 of the leading 34 journals considered, 10 RAMs are analysed for 21 highly-cited journals in Finance. The harmonic mean of the ranks of the 10 RAMs for the 34 highly-cited journals is also presented. It is shown that emphasizing the 2-year impact factor of a journal, which partly answers the question as to When published papers are cited, to the exclusion of other informative RAMs, which answer Where and How (frequently) published papers are cited, can lead to a distorted evaluation of journal impact and influence relative to the Harmonic Mean rankings. A linear regression model is used to forecast expert scores on the basis of RAMs that capture journal impact, journal policy, the number of high quality papers, and quantitative information about a journal. The robustness of the rankings is also analysed.
    Keywords: robustness;IFI;PI-BETA;STAR;article influence;eigenfactor;h-index;C3PO;impact factor;expert scores;journal quality;RAMs;harmonic mean
    Date: 2013–02–01
    URL: http://d.repec.org/n?u=RePEc:dgr:eureir:1765038715&r=for
  6. By: David Ardia; Lennart F. Hoogerheide
    Abstract: Various GARCH models are applied to daily returns of more than 1200 constituents of major stock indices worldwide. The value-at-risk forecast performance is investigated for different markets and industries, considering the test for correct conditional coverage using the false discovery rate (FDR) methodology. For most of the markets and industries we find the same two conclusions. First, an asymmetric GARCH specification is essential when forecasting the 95% value-at-risk. Second, for both the 95% and 99% value-at-risk it is crucial that the innovations’ distribution is fat-tailed (e.g., Student-t or – even better – a non-parametric kernel density estimate).
    Keywords: GARCH, value-at-risk, equity, worldwide, false discovery rate
    JEL: C11 C22 C52
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:1312&r=for
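The two conclusions above (asymmetry in the variance dynamics, fat-tailed innovations) can be sketched with a GJR-type GARCH variance recursion and an empirical-quantile VaR. This is a toy illustration with arbitrary placeholder parameters, not the models or estimation procedure used in the paper:

```python
import numpy as np

def gjr_garch_variance(returns, omega=0.05, alpha=0.05, gamma=0.10, beta=0.85):
    """Conditional-variance recursion of a GJR-GARCH(1,1):
    sigma2_t = omega + (alpha + gamma * 1[r_{t-1} < 0]) * r_{t-1}^2 + beta * sigma2_{t-1}.
    The gamma term is the asymmetry: negative returns raise variance more."""
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty(len(r))
    sigma2[0] = np.var(r)  # simple initialisation at the sample variance
    for t in range(1, len(r)):
        leverage = gamma * r[t - 1] ** 2 if r[t - 1] < 0 else 0.0
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + leverage + beta * sigma2[t - 1]
    return sigma2

def empirical_var(returns, sigma2, level=0.95):
    """95% VaR with a non-parametric tail: scale the conditional volatility by
    the empirical 5% quantile of the standardized residuals instead of a
    normal quantile, which understates fat tails."""
    z = np.asarray(returns, dtype=float) / np.sqrt(sigma2)
    q = np.quantile(z, 1.0 - level)
    return q * np.sqrt(sigma2)
```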
  7. By: Manabu Asai (Soka University, Japan); Massimiliano Caporin (University of Padova, Italy); Michael McAleer (Erasmus University Rotterdam, The Netherlands, Complutense University of Madrid, Spain, and Kyoto University, Japan)
    Abstract: Most multivariate variance or volatility models suffer from a common problem, the “curse of dimensionality”. For this reason, most are fitted under strong parametric restrictions that reduce the interpretation and flexibility of the models. Recently, the literature has focused on multivariate models with milder restrictions, whose purpose is to balance the need for interpretability and efficiency faced by model users against the computational problems that may emerge when the number of assets can be very large. We contribute to this strand of the literature by proposing a block-type parameterization for multivariate stochastic volatility models. The empirical analysis on stock returns on the US market shows that 1% and 5% Value-at-Risk thresholds based on one-step-ahead forecasts of covariances by the new specification are satisfactory for the period including the Global Financial Crisis.
    Keywords: block structures; multivariate stochastic volatility; curse of dimensionality; leverage effects; multi-factors; heavy-tailed distribution
    JEL: C32 C51 C10
    Date: 2013–05–27
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:2013073&r=for
  8. By: Michael McAleer (University of Canterbury); Felix Chan; Les Oxley
    Abstract: The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts; whether forecast updates are progressive; a constrained mixture vector autoregressive model; whether all estimators are born equal (the empirical properties of some estimators of long memory); characterising trader manipulation in a limit-order driven market; measuring bias in a term-structure model of commodity prices through the comparison of simultaneous and sequential estimation; modeling tail credit risk using transition matrices; evaluation of the DPC-based inclusive payment system in Japan for cataract operations by a new model; the matching of lead underwriters and issuing firms in the Japanese corporate bond market; stochastic life table forecasting (a time-simultaneous fan chart application); adaptive survey designs for sampling rare and clustered populations; income distribution inequality, globalization, and innovation (a general equilibrium simulation); whether exchange rates affect consumer prices (a comparative analysis for Australia, China and India); the impacts of exchange rates on Australia's domestic and outbound travel markets; the clean development mechanism in China (regional distribution and prospects); design and implementation of a Web-based groundwater data management system; the impact of serial correlation on testing for structural change in binary choice models (Monte Carlo evidence); and coercive journal self-citations, impact factor, journal influence and article influence.
    Keywords: Modeling, simulation, forecasting, time series models, trading, credit risk, empirical finance, health economics, sampling, groundwater systems, exchange rates, structural change, citations
    JEL: C15 C63 E27 E37 E47 F37 F47
    Date: 2013–05–20
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:13/18&r=for
  9. By: Cheolbeom Park (Department of Economics, Korea University, Seoul, Republic of Korea); Sookyung Park (Department of Economics, Korea University, Seoul, Republic of Korea)
    Abstract: Many studies have pointed out that the underlying relations and functions for the monetary model (e.g. the PPP relation, the money demand function, monetary policy rule, etc.) have undergone parameter instabilities and that the relation between exchange rates and macro fundamentals is unstable due to the shift in the economic models in foreign exchange traders’ views or the scapegoat effect in Bacchetta and van Wincoop (2009). Facing this, we consider a monetary model with time-varying cointegration coefficients in order to understand exchange rate movements. We provide statistical evidence against the standard monetary model with constant cointegration coefficients but find favorable evidence for the time-varying cointegration relationship between exchange rates and monetary fundamentals. Furthermore, we demonstrate that deviations between the exchange rate and fundamentals from the time-varying cointegration relation have strong predictive power for future changes in exchange rates through in-sample analysis, out-of-sample analysis, and directional accuracy tests.
    Keywords: Exchange rate, Monetary model, Predictability, Time-varying cointegration
    JEL: F31 F47
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:iek:wpaper:1302&r=for
  10. By: Edwards, Jeffrey A.; Wade, Tara R.; Burkey, Mark L.; Pumphrey, R. Gary
    Abstract: Increasing the price of municipal water is widely considered the most effective mechanism for enhancing municipal water conservation, whether during times of drought or not. However, increasing the price of something that is considered to be, literally, a life-giving resource is politically taboo. This study follows two others that evaluate survey data with Likert scale responses to determine whether or not constituents would outright reject the idea of using price to ration municipal water. It goes several steps further: it controls for both community and respondent level variables, calculates and evaluates in-sample response probabilities, and, most importantly, attempts to forecast the attitudes of constituents in communities that are not in our survey sample. In the end, our model produces both in-sample and out-of-sample response probabilities that are reasonable and relatively stable across communities; it therefore provides communities and researchers with a means to gauge public support for pricing initiatives.
    Keywords: water conservation, water pricing, regulation, ethanol, Likert scale, ordered logit, Environmental Economics and Policy, Q25, Q56, R52,
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:ags:aaea13:149579&r=for
  11. By: Paul Gertler; Orie Shelef; Catherine Wolfram; Alan Fuchs
    Abstract: Most of the future growth in energy use is forecast to come from the developing world. Understanding the likely pace and specific location of this growth is essential to inform decisions about energy infrastructure investments and to improve greenhouse gas emissions forecasts. We argue that countries with pro-poor economic growth will experience larger increases in energy demand than countries where growth is more regressive. When poor households’ incomes go up, their energy demand increases along the extensive margin as they buy energy-using assets for the first time. We also argue that the speed at which households come out of poverty affects their asset purchase decisions. We provide empirical support for these hypotheses by examining the causal impact of increases in household income on asset accumulation and energy use in the context of Mexico’s conditional cash transfer program. We find that transfers had a large effect on asset accumulation among the low-income program beneficiaries, and the effect is greater when the cash is transferred over a shorter time period. We apply lessons from the household analysis to aggregate energy forecast models using country-level panel data. Our results suggest that existing forecasts could grossly underestimate future energy use in the developing world.
    JEL: Q41 Q47
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:19092&r=for
  12. By: Menzie Chinn; Laurent Ferrara; Valérie Mignon
    Abstract: This paper aims at investigating the relationship between employment and GDP in the United States. We disentangle trend and cyclical employment components by estimating a non-linear Okun’s law based on a smooth transition error-correction model that simultaneously accounts for long-term relationships between growth and employment and short-run instability over the business cycle. Our findings based on out-of-sample conditional forecasts show that, since the exit of the 2008-09 recession, US employment is on average around 1% below the level implied by the long run output-employment relationship, meaning that a trend employment loss of about 1.2 million jobs cannot be attributed to the identified cyclical factors.
    Keywords: Okun’s law, trend employment, non-linear modeling
    JEL: E24 E32 C22
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:drm:wpaper:2013-12&r=for
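The "smooth transition" ingredient of such models can be sketched with the standard logistic transition function, which lets the Okun coefficient vary continuously between a recession and an expansion regime. This is a generic textbook sketch with placeholder parameter values, not the paper's estimated specification:

```python
import math

def logistic_transition(s, gamma=5.0, c=0.0):
    """Logistic transition function G(s) in (0, 1): close to 0 in one regime
    (state variable s well below the threshold c) and close to 1 in the
    other; gamma controls how abrupt the regime switch is."""
    return 1.0 / (1.0 + math.exp(-gamma * (s - c)))

def regime_coefficient(s, beta_low, beta_high, gamma=5.0, c=0.0):
    """Okun coefficient that varies smoothly with the state variable s,
    interpolating between a 'recession' and an 'expansion' response."""
    g = logistic_transition(s, gamma, c)
    return (1.0 - g) * beta_low + g * beta_high
```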
  13. By: Jennifer Castle; David Hendry
    Abstract: We consider model selection for non-linear dynamic equations with more candidate variables than observations, based on a general class of non-linear-in-the-variables functions, addressing possible location shifts by impulse-indicator saturation.  After an automatic search delivers a simplified congruent terminal model, an encompassing test can be implemented against an investigator's preferred non-linear function.  When that is non-linear in the parameters, such as a threshold model, the overall approach can only be semi-automatic.  The method is applied to re-analyze an empirical model of real wages in the UK over 1860-2004, updated and extended to 2005-2011 for forecast evaluation.
    Keywords: Non-linear models, location shifts, model selection, autometrics, impulse-indicator saturation
    JEL: C51 C22
    Date: 2013–05–16
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:654&r=for
  14. By: Elena Asparouhova; Peter Bossaerts; Nilanjan Roy; William Zame
    Abstract: This paper reports on experimental tests of an instantiation of the Lucas asset pricing model with heterogeneous agents and time-varying private income streams. Central features of the model (infinite horizon, perishability of consumption, stationarity) present difficult challenges and require a novel experimental design. The experimental evidence provides broad support for the qualitative pricing and consumption predictions of the model (prices move with fundamentals, agents smooth consumption) but sharp differences from the quantitative predictions emerge (asset prices display excess volatility, agents do not hedge price risk). Generalized Method of Moments (GMM) tests of the stochastic Euler equations yield very different conclusions depending on the instruments chosen. It is suggested that the qualitative agreement with and quantitative deviation from theoretical predictions arise from agents' expectations about future prices, which are almost self-fulfilling and yet very different from what they would need to be if they were exactly self-fulfilling (as the Lucas model requires).
    JEL: C92 E21 E32 G12
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:19068&r=for
  15. By: Caporin, M.; McAleer, M.J.
    Abstract: The purpose of the paper is to discuss ten things potential users should know about the limits of the Dynamic Conditional Correlation (DCC) representation for estimating and forecasting time-varying conditional correlations. The reasons given for caution about the use of DCC include the following: DCC represents the dynamic conditional covariances of the standardized residuals, and hence does not yield dynamic conditional correlations; DCC is stated rather than derived; DCC has no moments; DCC does not have testable regularity conditions; DCC yields inconsistent two step estimators; DCC has no asymptotic properties; DCC is not a special case of GARCC, which has testable regularity conditions and standard asymptotic properties; DCC is not dynamic empirically as the effect of news is typically extremely small; DCC cannot be distinguished empirically from diagonal BEKK in small systems; and DCC may be a useful filter or a diagnostic check, but it is not a model.
    Keywords: conditional correlations;regularity conditions;conditional covariances;BEKK;DCC;GARCC;assumed properties;asymptotic properties;derived model;diagnostic check;filter;moments;stated representation;two step estimators
    Date: 2013–03–01
    URL: http://d.repec.org/n?u=RePEc:dgr:eureir:1765039599&r=for
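The first of the ten points is easiest to see from the standard DCC recursion itself, which operates on standardized residuals and only produces a correlation matrix after explicit rescaling. The sketch below is the textbook recursion as usually stated (with illustrative parameter values), included only to make that point concrete:

```python
import numpy as np

def dcc_step(Q_prev, eps, S, a=0.05, b=0.90):
    """One step of the (stated, not derived) DCC recursion on the
    standardized residuals eps:
        Q_t = (1 - a - b) * S + a * eps eps' + b * Q_{t-1}
    Q_t is a conditional covariance matrix of standardized residuals; it
    only becomes a correlation matrix R_t after rescaling by its diagonal."""
    eps = np.asarray(eps, dtype=float).reshape(-1, 1)
    Q = (1.0 - a - b) * S + a * (eps @ eps.T) + b * Q_prev
    d = 1.0 / np.sqrt(np.diag(Q))
    R = Q * np.outer(d, d)  # rescale so that diag(R) = 1
    return Q, R
```

Note also that with typical estimates of `a` near zero, `Q_t` barely responds to news, which is the "not dynamic empirically" point in the abstract.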

This nep-for issue is ©2013 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.