nep-for New Economics Papers
on Forecasting
Issue of 2012‒06‒25
twenty-two papers chosen by
Rob J Hyndman
Monash University

  1. Volatility forecast combinations using asymmetric loss functions By Elena Andreou; Constantinos Kourouyiannis; Andros Kourtellos
  2. Evaluating Macroeconomic Forecasts: A Concise Review of Some Recent Developments By Philip Hans Franses; Michael McAleer; Rianne Legerstee
  3. A Comprehensive Look at Financial Volatility Prediction by Economic Variables By Charlotte Christiansen; Maik Schmeling; Andreas Schrimpf
  4. Real-time forecasting in a data-rich environment By LIEBERMANN, JOELLE
  5. Generating short-term forecasts of the Lithuanian GDP using factor models By Julius Stakenas
  6. Is there an Optimal Forecast Combination? A Stochastic Dominance Approach to Forecast Combination Puzzle By Mehmet Pinar; Thanasis Stengos; M. Ege Yazgan
  7. Short-Run Regional Forecasts: Spatial Models through Varying Cross-Sectional and Temporal Dimensions By Matías Mayor-Fernández; Roberto Patuelli
  8. Forecasting Consumption in Real Time: The Role of Consumer Confidence Surveys By Kajal Lahiri; George Monokroussos; Yongchen Zhao
  9. Forecasting Value-at-Risk with Time-Varying Variance, Skewness and Kurtosis in an Exponential Weighted Moving Average Framework By Alexandros Gabrielsen; Paolo Zagaglia; Axel Kirchner; Zhuoshi Liu
  10. A New Structural Break Model with Application to Canadian Inflation Forecasting By John M. Maheu; Yong Song
  11. Large Time-Varying Parameter VARs By Koop, Gary; Korobilis, Dimitris
  12. Information Rigidity and Correcting Inefficiency in USDA’s Commodity Forecasts By MacDonald, Stephen; Isengildina-Massa, Olga
  13. Robust volatility forecasts in the presence of structural breaks By Elena Andreou; Eric Ghysels; Constantinos Kourouyiannis
  14. Do Analysts’ Earnings Per Share Forecasts Contain Valuable Information Beyond One Quarter? The Case of Publicly Traded Agribusiness Firms By Lewis, Daniel; Manfredo, Mark R.; Sanders, Dwight R.; Scott, Winifred
  15. Bayesian Model Averaging, Learning and Model Selection By Mitra, Kaushik; Evans, George W.; Honkapohja, Seppo
  16. Factor-Based Forecasting in the Presence of Outliers: Are Factors Better Selected and Estimated by the Median than by The Mean? By Johannes Tang Kristensen
  17. A new model of trend inflation By Chan, Joshua; Koop, Gary; Potter, Simon
  18. Identifying Speculative Bubbles with an Infinite Hidden Markov Model By Shu-Ping Shi; Yong Song
  19. Expectation Formation and Monetary DSGE Models: Beyond the Rational Expectations Paradigm By Fabio Milani; Ashish Rajbhandari
  20. Rethinking Age-Period-Cohort Mortality Trend Models By Daniel Alai; Michael Sherris
  21. eMPF Econometric Model of Public Finance By Sławomir Dudek; Tomasz Zając; Kamil Danielski; Magdalena Zachłod-Jelec; Paweł Kolski; Dawid Pachucki; Iwona Fudała-Poradzińska
  22. Impacts of Permanent and Transitory Shocks on Optimal Length of Moving Average to Predict Wheat Basis By Lee, Yoonsuk; Brorsen, B. Wade

  1. By: Elena Andreou; Constantinos Kourouyiannis; Andros Kourtellos
    Abstract: The paper deals with the problem of model uncertainty in forecasting volatility, using forecast combinations and a flexible family of asymmetric loss functions that allow for the possibility that an investor would attach different preferences to high vis-a-vis low volatility periods. Using daily as well as 5-minute data for US and major international stock market indices, we provide volatility forecasts by minimizing the Homogeneous Robust Loss function of the Realized Volatility and the combined forecast. Our findings show that forecast combinations based on the homogeneous robust loss function significantly outperform simple forecast combination methods, especially during the recent financial crisis.
    Keywords: asymmetric loss functions, forecast combinations, realized volatility, volatility forecasting
    Date: 2012–05
    URL: http://d.repec.org/n?u=RePEc:ucy:cypeua:07-2012&r=for
  2. By: Philip Hans Franses (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam); Michael McAleer (Erasmus University Rotterdam, Tinbergen Institute, Kyoto University, Complutense University of Madrid); Rianne Legerstee (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam, Tinbergen Institute, The Netherlands)
    Abstract: Macroeconomic forecasts are frequently produced, widely published, intensively discussed and comprehensively used. The formal evaluation of such forecasts has a long research history. Recently, a new angle to the evaluation of forecasts has been addressed, and in this review we analyse some recent developments from that perspective. The literature on forecast evaluation predominantly assumes that macroeconomic forecasts are generated from econometric models. In practice, however, most macroeconomic forecasts, such as those from the IMF, World Bank, OECD, Federal Reserve Board, Federal Open Market Committee (FOMC) and the ECB, are typically based on econometric model forecasts jointly with human intuition. This seemingly inevitable combination renders most of these forecasts biased and, as such, their evaluation becomes non-standard. In this review, we consider the evaluation of two forecasts in which: (i) the two forecasts are generated from two distinct econometric models; (ii) one forecast is generated from an econometric model and the other is obtained as a combination of a model and intuition; and (iii) the two forecasts are generated from two distinct (but unknown) combinations of different models and intuition. It is shown that alternative tools are needed to compare and evaluate the forecasts in each of these three situations. These alternative techniques are illustrated by comparing the forecasts from the (econometric) Staff of the Federal Reserve Board and the FOMC on inflation, unemployment and real GDP growth. It is shown that the FOMC does not forecast significantly better than the Staff, and that the intuition of the FOMC does not add significantly in forecasting the actual values of the economic fundamentals. This would seem to belie the purported expertise of the FOMC.
    Keywords: Macroeconomic forecasts, econometric models, human intuition, biased forecasts, forecast performance, forecast evaluation, forecast comparison.
    JEL: C22 C51 C52 C53 E27 E37
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:821&r=for
  3. By: Charlotte Christiansen; Maik Schmeling; Andreas Schrimpf
    Abstract: We investigate if asset return volatility is predictable by macroeconomic and financial variables and shed light on the economic drivers of financial volatility. Our approach is distinct due to its comprehensiveness: First, we employ a data-rich forecast methodology to handle a large set of potential predictors in a Bayesian Model Averaging approach, and, second, we take a look at multiple asset classes (equities, foreign exchange, bonds, and commodities) over long time spans. We find that proxies for credit risk and funding (il)liquidity consistently show up as common predictors of volatility across asset classes. Variables capturing time-varying risk premia also perform well as predictors of volatility. While forecasts by macro-finance augmented models also achieve forecasting gains out-of-sample relative to autoregressive benchmarks, the performance varies across asset classes and over time.
    Keywords: Realised volatility; Forecasting; Data-rich modeling; Bayesian model averaging; Model uncertainty
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:bis:biswps:374&r=for
  4. By: LIEBERMANN, JOELLE
    Abstract: This paper assesses the ability of different models to forecast key real and nominal U.S. monthly macroeconomic variables in a data-rich environment and from the perspective of a real-time forecaster, i.e. taking into account the real-time data revisions process and data flow. We find that for the real variables predictability is confined over the recent recession/crisis period. This is in line with the findings of D’Agostino and Giannone (2012) that gains in relative performance of models using large datasets over univariate models are driven by downturn periods which are characterized by higher comovements. Regarding inflation, results are stable across time, but predictability is mainly found at the very short-term horizons. Inflation is known to be hard to forecast, but by exploiting timely information one obtains gains at nowcasting and forecasting one-month ahead, especially with Bayesian VARs. Furthermore, for both real and nominal variables, the direct pooling of information using a high dimensional model (dynamic factor model or Bayesian VAR) which takes into account the cross-correlation between the variables and efficiently deals with the “ragged edge” structure of the dataset, yields more accurate forecasts than the indirect pooling of bi-variate forecasts/models.
    Keywords: Real-time data; Nowcasting; Forecasting; Factor model; Bayesian VAR; Forecast pooling
    JEL: C53 E52 C33 C11
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:39452&r=for
  5. By: Julius Stakenas (Bank of Lithuania)
    Abstract: This paper focuses on short-term Lithuanian GDP forecasting using a large monthly dataset. The forecasting accuracy of various factor model specifications is assessed in an out-of-sample forecasting exercise. It is argued that factor extraction using a simple principal components method might lead to a loss of important information related to GDP forecasting; therefore, other methods should also be considered. The performance of several factor models, which link the factor extraction step to GDP forecasting, was tested. The effect of using a weighted principal components model, with weights depending on each variable’s absolute correlation with GDP, was explored in greater detail. Although factor models performed better than a naive benchmark forecast for GDP nowcasting and 1-quarter-ahead forecasting, we were unable to establish a ranking among the different factor model specifications. We also find that a small-scale factor model with 5 variables (which could be regarded as the most important monthly variables for GDP nowcasting) is able to nowcast GDP better than models using the full dataset of 52 variables, which might indicate that, for the Lithuanian economy, smaller-scale factor models may be more suitable.
    Keywords: GDP forecasting, factor models, principal components
    JEL: C22 E37
    Date: 2012–06–18
    URL: http://d.repec.org/n?u=RePEc:lie:wpaper:13&r=for
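    The correlation-weighted principal components idea described in the abstract can be sketched as follows. This is an illustrative Python implementation under assumed details (standardization, SVD-based extraction, the exact weighting rule), not code from the paper:

    ```python
    import numpy as np

    def weighted_principal_components(X, y, n_factors=2):
        """Extract factors from a T x N panel X, up-weighting each series by
        its absolute correlation with the target y before applying principal
        components (here via SVD). Illustrative sketch only."""
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each series
        ys = (y - y.mean()) / y.std()
        w = np.abs(Xs.T @ ys) / len(ys)             # |corr(x_i, y)| per series
        U, S, _ = np.linalg.svd(Xs * w, full_matrices=False)
        return U[:, :n_factors] * S[:n_factors]     # first n_factors factors
    ```

    With equal weights this reduces to ordinary principal components; series weakly correlated with GDP are shrunk before factor extraction, so they contribute less to the estimated factors.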
  6. By: Mehmet Pinar (University of Guelph; Fondazione Eni Enrico Mattei); Thanasis Stengos (University of Guelph); M. Ege Yazgan (Istanbul Bilgi University)
    Abstract: The forecast combination puzzle refers to the finding that a simple average forecast combination outperforms more sophisticated weighting schemes and/or the best individual model. This paper derives optimal (worst) forecast combinations based on stochastic dominance (SD) analysis with differential forecast weights. For the optimal (worst) forecast combination, this index will minimize (maximize) forecast errors by combining time-series model based forecasts at a given probability level. By weighting each forecast differently, we find the optimal (worst) forecast combination that does not rely on arbitrary weights. Using two weekly exchange rate series, the Japanese Yen/U.S. Dollar and the U.S. Dollar/Great Britain Pound, for the period from 1975 to 2010, we find that the simple average forecast combination is neither the worst nor the best forecast combination, which provides partial support for the forecast combination puzzle. In that context, the random walk model consistently contributes considerably more than an equal weight to the worst forecast combination for all variables being forecasted and for all forecast horizons, whereas a flexible neural network autoregressive model and a self-exciting threshold autoregressive model always enter the best forecast combination with much greater than equal weights.
    Keywords: Nonparametric Stochastic Dominance; Mixed Integer Programming; Forecast combinations; Forecast combination puzzle
    JEL: C12 C13 C14 C15 G01
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:17_12&r=for
  7. By: Matías Mayor-Fernández (Department of Applied Economics, University of Oviedo, Spain); Roberto Patuelli (Department of Economics, Faculty of Economics-Rimini, University of Bologna, Italy; The Rimini Centre for Economic Analysis (RCEA), Italy)
    Abstract: In any economic analysis, regions or municipalities should not be regarded as isolated spatial units, but rather as highly interrelated small open economies. These spatial interrelations must also be considered when the aim is to forecast economic variables. For example, policy makers need accurate forecasts of the evolution of unemployment in order to design short- or long-run local welfare policies. These predictions should then consider the spatial interrelations and dynamics of regional unemployment. In addition, a number of papers have demonstrated the improvement in the reliability of long-run forecasts when spatial dependence is accounted for. We estimate a heterogeneous-coefficients dynamic panel model employing a spatial filter in order to account for spatial heterogeneity and/or spatial autocorrelation in both the levels and the dynamics of unemployment, as well as a spatial vector-autoregressive (SVAR) model. We compare the short-run forecasting performance of these methods and, in particular, carry out a sensitivity analysis in order to investigate whether the number and size of the administrative regions influence their relative forecasting performance. We compute short-run unemployment forecasts for two countries with different administrative territorial divisions and data frequency: Switzerland (26 regions, monthly data for 34 years) and Spain (47 regions, quarterly data for 32 years).
    Keywords: regional forecasts; spatial econometrics; dynamic panel; SVAR
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:15_12&r=for
  8. By: Kajal Lahiri; George Monokroussos; Yongchen Zhao
    Abstract: We study the role of consumer sentiment in forecasting personal consumption expenditure. We re-examine existing models of consumption and sentiment using both quarterly and monthly data in real time. Furthermore, we produce forecasts of consumption expenditures using a dynamic factor model with nearly two hundred explanatory variables, with and without consumer sentiment measures. We find that consumer sentiment generally contributes to the accuracy of consumption forecasts. This contribution is a combined effect of the unique information contained in the sentiment index and, more importantly, of the timeliness of its release.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:nya:albaec:12-02&r=for
  9. By: Alexandros Gabrielsen (Sumitomo Mitsui Banking Corporation, UK); Paolo Zagaglia (Department of Economics, University of Bologna, Italy); Axel Kirchner (Deutsche Bank, UK); Zhuoshi Liu (Bank of England, UK)
    Abstract: This paper provides insight into the time-varying dynamics of the shape of the distribution of financial return series by proposing an exponentially weighted moving average model that jointly estimates volatility, skewness and kurtosis over time, using a modified form of the Gram-Charlier density in which skewness and kurtosis appear directly in the functional form of the density. In this setting, VaR can be described as a function of the time-varying higher moments by applying the Cornish-Fisher expansion of the first four moments. The predictive performance of the proposed model for 1-day and 10-day VaR forecasts is evaluated in comparison with historical simulation, filtered historical simulation and GARCH models. The adequacy of the VaR forecasts is evaluated under the unconditional, independence and conditional likelihood ratio tests as well as Basel II regulatory tests. The results presented have significant implications for risk management, trading and hedging activities as well as for the pricing of equity derivatives.
    Keywords: exponential weighted moving average, time-varying higher moments, Cornish-Fisher expansion, Gram-Charlier density, risk management, Value-at-Risk
    JEL: C51 C52 C53 G15
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:34_12&r=for
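    The Cornish-Fisher step described above maps the first four moments into a VaR quantile. A minimal sketch (the function name and interface are hypothetical, and the EWMA updating of the moments themselves is omitted):

    ```python
    from statistics import NormalDist

    def cornish_fisher_var(mu, sigma, skew, exkurt, alpha=0.01):
        """VaR at level alpha via the Cornish-Fisher expansion: the Gaussian
        quantile z is adjusted for skewness and excess kurtosis, and VaR is
        reported as a positive loss. Illustrative sketch only."""
        z = NormalDist().inv_cdf(alpha)
        z_cf = (z
                + (z**2 - 1) * skew / 6
                + (z**3 - 3 * z) * exkurt / 24
                - (2 * z**3 - 5 * z) * skew**2 / 36)
        return -(mu + sigma * z_cf)
    ```

    With zero skewness and excess kurtosis this collapses to the usual Gaussian VaR; negative skew and fat tails push the 1% VaR further out.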
  10. By: John M. Maheu (Department of Economics, University of Toronto, Canada; RCEA, Italy); Yong Song (CenSoC, University of Technology, Sydney, Australia; RCEA, Italy)
    Abstract: This paper develops an efficient approach to model and forecast time-series data with an unknown number of change-points. Using a conjugate prior and conditional on time-invariant parameters, the predictive density and the posterior distribution of the change-points have closed forms. The conjugate prior is further modeled as hierarchical to exploit the information across regimes. This framework allows breaks in the variance, the regression coefficients or both. Regime duration can be modelled as a Poisson distribution. A new efficient Markov Chain Monte Carlo sampler draws the parameters as one block from the posterior distribution. An application to the Canadian inflation time series shows the gains in forecasting precision that our model provides.
    Keywords: multiple change-points, regime duration, inflation targeting, predictive density, MCMC
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:27_12&r=for
  11. By: Koop, Gary; Korobilis, Dimitris
    Abstract: In this paper we develop methods for estimation and forecasting in large time-varying parameter vector autoregressive models (TVP-VARs). To overcome computational constraints with likelihood-based estimation of large systems, we rely on Kalman filter estimation with forgetting factors. We also draw on ideas from the dynamic model averaging literature and extend the TVP-VAR so that its dimension can change over time. A final extension lies in the development of a new method for estimating, in a time-varying manner, the parameter(s) of the shrinkage priors commonly used with large VARs. These extensions are operationalized through the use of forgetting factor methods and are, thus, computationally simple. An empirical application involving forecasting inflation, real output, and interest rates demonstrates the feasibility and usefulness of our approach.
    Keywords: Bayesian VAR, forecasting, time-varying coefficients, state-space model
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:edn:sirdps:317&r=for
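    The forgetting-factor device that keeps estimation of large TVP-VARs tractable can be illustrated on a single time-varying-parameter regression. This is a hypothetical sketch, not the authors' code; the prior variance, forgetting factor, and measurement variance are assumed values:

    ```python
    import numpy as np

    def tvp_filter(y, X, lam=0.99, v=1.0):
        """Kalman filter for y_t = x_t' beta_t + e_t with slowly drifting
        coefficients: instead of specifying a state-noise covariance, the
        predicted state variance is inflated by 1/lam (the forgetting
        factor), so older observations are discounted at rate lam."""
        T, k = X.shape
        beta = np.zeros(k)
        P = 10.0 * np.eye(k)                 # loose prior variance (assumed)
        path = np.zeros((T, k))
        for t in range(T):
            P = P / lam                      # forgetting-factor prediction step
            x = X[t]
            f = x @ P @ x + v                # one-step forecast error variance
            K = P @ x / f                    # Kalman gain
            beta = beta + K * (y[t] - x @ beta)
            P = P - np.outer(K, x @ P)
            path[t] = beta
        return path
    ```

    Setting lam = 1 recovers constant-parameter recursive least squares; smaller lam lets the coefficients move faster, with no state covariance matrix to estimate.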
  12. By: MacDonald, Stephen; Isengildina-Massa, Olga
    Abstract: This study investigates the rationality of monthly revisions in annual forecasts of supply, demand and price for U.S. corn, cotton, soybeans, and wheat, published in the World Agricultural Supply and Demand Estimates over 1985/86-2010/11. The findings indicate that USDA's forecast revisions are not independent across months, and that forecasts are typically smoothed. Adjustment for smoothing in a subset of forecasts (1998/2000-2010/11) showed mixed results: significant improvements for soybean use forecasts, cotton exports, and a broad cross-section of forecasts published in October. However, accuracy deteriorated in some cases, particularly for late-season preliminary data revisions.
    Keywords: Crop Production/Industries, Research Methods/ Statistical Methods,
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ags:aaea12:124890&r=for
  13. By: Elena Andreou; Eric Ghysels; Constantinos Kourouyiannis
    Abstract: Financial time series often undergo periods of structural change that yield biased estimates or forecasts of volatility, and thereby of risk management measures. We show that in the context of GARCH diffusion models, ignoring structural breaks in the leverage coefficient and the constant can lead to biased and inefficient AR-RV and GARCH-type volatility estimates. Similarly, we find that volatility forecasts based on AR-RV and GARCH-type models that take structural breaks into account, by estimating the parameters only in the post-break period, significantly outperform those that ignore them. Hence, we propose a Flexible Forecast Combination method that takes into account not only information from different volatility models, but from different subsamples as well. This method consists of two main steps: First, it splits the estimation period into subsamples based on structural breaks detected by a change-point test. Second, it forecasts volatility by weighting information from all subsamples so as to minimize a particular loss function, such as the Squared Error or QLIKE. An empirical application using the S&P 500 Index shows that our approach performs better, especially in periods of high volatility, compared to a large set of individual volatility models, simple averaging methods, and Forecast Combinations under Regime Switching.
    Keywords: forecast, combinations, volatility, structural breaks
    Date: 2012–05
    URL: http://d.repec.org/n?u=RePEc:ucy:cypeua:08-2012&r=for
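    The loss-minimizing weighting in the second step can be illustrated with a QLIKE criterion over two candidate variance forecasts. This grid-search sketch is a simplification for illustration; the paper's actual optimization and subsample weighting are richer:

    ```python
    import numpy as np

    def qlike(var_fc, rv):
        """QLIKE loss between a variance forecast and realized variance."""
        r = rv / var_fc
        return np.mean(r - np.log(r) - 1.0)

    def combine_two_forecasts(f1, f2, rv, grid=101):
        """Grid-search the convex weight w in [0, 1] minimizing QLIKE of the
        combined forecast w*f1 + (1-w)*f2 against realized variance rv."""
        ws = np.linspace(0.0, 1.0, grid)
        losses = [qlike(w * f1 + (1 - w) * f2, rv) for w in ws]
        return ws[int(np.argmin(losses))]
    ```

    The same search applies unchanged with a squared-error loss; QLIKE penalizes under-prediction of variance more heavily, which matters in high-volatility periods.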
  14. By: Lewis, Daniel; Manfredo, Mark R.; Sanders, Dwight R.; Scott, Winifred
    Abstract: Analysts’ earnings-per-share forecasts over multiple-quarter time horizons for eleven agribusiness companies are evaluated using a mean absolute scaled error and a direct test. Results illustrate that unique information is consistently found. Rational and efficient expectations are formed periodically. Analysts’ performance declines as the time horizon increases.
    Keywords: Agribusiness, Agricultural Finance, Industrial Organization,
    Date: 2012–06–01
    URL: http://d.repec.org/n?u=RePEc:ags:aaea12:124480&r=for
  15. By: Mitra, Kaushik; Evans, George W.; Honkapohja, Seppo
    Abstract: Agents have two forecasting models, one consistent with the unique rational expectations equilibrium, another that assumes a time-varying parameter structure. When agents use Bayesian updating to choose between models in a self-referential system, we find that learning dynamics lead to selection of one of the two models. However, there are parameter regions for which the non-rational forecasting model is selected in the long-run. A key structural parameter governing outcomes measures the degree of expectations feedback in Muth's model of price determination.
    Keywords: Learning dynamics, Bayesian model averaging, grain of truth, self-referential systems,
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:edn:sirdps:314&r=for
  16. By: Johannes Tang Kristensen (Aarhus University and CREATES)
    Abstract: Macroeconomic forecasting using factor models estimated by principal components has become a popular research topic, with many theoretical and applied contributions in the literature. In this paper we attempt to address an often neglected issue in these models: the problem of outliers in the data. Most papers take an ad hoc approach to this problem and simply screen datasets prior to estimation, removing anomalous observations. We investigate whether forecasting performance can be improved by using the original unscreened dataset and replacing principal components with a robust alternative. We propose an estimator based on least absolute deviations (LAD) as this alternative and establish a tractable method for computing it. In addition, we demonstrate the robustness features of the estimator through a number of Monte Carlo simulation studies. Finally, we apply our proposed estimator in a simulated real-time forecasting exercise to test its merits. We use a newly compiled dataset of US macroeconomic series spanning the period 1971:2–2011:4. Our findings suggest that the chosen treatment of outliers does affect forecasting performance, and that in many cases improvements can be made using a robust estimator such as our proposed LAD estimator.
    Keywords: Forecasting, Factor Models, Principal Components Analysis, Robust Estimation, Least Absolute Deviations
    JEL: C38 C53 E37
    Date: 2012–06–08
    URL: http://d.repec.org/n?u=RePEc:aah:create:2012-28&r=for
  17. By: Chan, Joshua; Koop, Gary; Potter, Simon
    Abstract: This paper introduces a new model of trend (or underlying) inflation. In contrast to many earlier approaches, which allow trend inflation to evolve according to a random walk, ours is a bounded model which ensures that trend inflation is constrained to lie in an interval. The bounds of this interval can either be fixed or estimated from the data. Our model also allows for a time-varying degree of persistence in the transitory component of inflation. The bounds placed on trend inflation mean that standard econometric methods for estimating linear Gaussian state space models cannot be used, and we develop a posterior simulation algorithm for estimating the bounded trend inflation model. In an empirical exercise with CPI inflation we find the model to work well, yielding more sensible measures of trend inflation and forecasting better than popular alternatives such as the unobserved components stochastic volatility model.
    Keywords: Constrained inflation; non-linear state space model; underlying inflation; inflation targeting; inflation forecasting; Bayesian
    JEL: C51 E31 C15 E37 C11
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:39496&r=for
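    The bounding device can be illustrated by simulating a random-walk trend whose innovations are redrawn whenever a step would leave the interval. This is a rejection-sampling sketch of the idea only, not the authors' posterior simulation algorithm, and all parameter values are assumed:

    ```python
    import numpy as np

    def simulate_bounded_trend(T=200, lo=0.0, hi=5.0, sd=0.2, seed=0):
        """Random-walk trend constrained to [lo, hi]: steps that would exit
        the interval are redrawn, so the trend can approach but never cross
        the bounds. Illustrative sketch only."""
        rng = np.random.default_rng(seed)
        tau = np.empty(T)
        tau[0] = 0.5 * (lo + hi)
        for t in range(1, T):
            step = tau[t - 1] + rng.normal(0.0, sd)
            while not (lo <= step <= hi):    # reject steps outside the bounds
                step = tau[t - 1] + rng.normal(0.0, sd)
            tau[t] = step
        return tau
    ```

    Unlike an unbounded random walk, the simulated trend cannot wander to implausible inflation levels, which is the economic motivation for the bounds.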
  18. By: Shu-Ping Shi (Australian National University, Australia); Yong Song (University of Technology Sydney, Australia)
    Abstract: This paper proposes an infinite hidden Markov model (iHMM) to detect, date stamp, and estimate speculative bubbles. Three features make this new approach attractive to practitioners. First, the iHMM is capable of capturing the nonlinear dynamics of different types of bubble behaviors, as it allows an infinite number of regimes. Second, the implementation of this procedure is straightforward, as the detection, dating, and estimation of bubbles are done simultaneously in a coherent Bayesian framework. Third, the iHMM, by assuming hierarchical structures, is parsimonious and superior in out-of-sample forecasting. Two empirical applications are presented: one to the Argentinian money base, exchange rate, and consumer price from January 1983 to November 1989; and the other to the U.S. oil price from April 1983 to December 2010. We find prominent results, which have not been discovered by the existing finite hidden Markov model. Model comparison shows that the iHMM is strongly supported by the predictive likelihood.
    Keywords: speculative bubbles, infinite hidden Markov model, Dirichlet process
    JEL: C11 C15
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:26_12&r=for
  19. By: Fabio Milani (Department of Economics, University of California-Irvine); Ashish Rajbhandari (Department of Economics, University of California-Irvine)
    Abstract: Empirical work in macroeconomics almost universally relies on the hypothesis of rational expectations. This paper departs from the literature by considering a variety of alternative expectations formation models. We study the econometric properties of a popular New Keynesian monetary DSGE model under different expectational assumptions: the benchmark case of rational expectations, rational expectations extended to allow for `news' about future shocks, near-rational expectations and learning, and observed subjective expectations from surveys. The results show that the econometric evaluation of the model is extremely sensitive to how expectations are modeled. The posterior distributions for the structural parameters significantly shift when the assumption of rational expectations is modified. Estimates of the structural disturbances under different expectation processes are often dissimilar. The modeling of expectations has important effects on the ability of the model to fit macroeconomic time series. The model achieves its worst fit under rational expectations. The introduction of news improves fit. The best-fitting specifications, however, are those that assume learning. Expectations also have large effects on forecasting. Survey expectations, news, and learning all work to improve the model's one-step-ahead forecasting accuracy. Rational expectations, however, dominate over longer horizons, such as one year ahead or beyond.
    Keywords: Expectation formation; Rational expectations; News shocks; Adaptive learning; Survey expectations; Econometric evaluation of DSGE models; Forecasting
    JEL: C52 D84 E32 E37 E50
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:irv:wpaper:111212&r=for
  20. By: Daniel Alai (ARC Centre of Excellence in Population Ageing Research, Australian School of Business, University of New South Wales); Michael Sherris (School of Risk and Actuarial Studies and ARC Centre of Excellence in Population Ageing Research, Australian School of Business, University of New South Wales)
    Abstract: Longevity risk arising from uncertain mortality improvement is one of the major risks facing annuity providers and pension funds. In this paper we show how applying trend models from non-life claims reserving to age-period-cohort mortality trends provides new insight into estimating mortality improvement and quantifying its uncertainty. Age, period, and cohort trends are modelled with distinct effects for each age, calendar year, and birth year in a generalized linear models framework. The effects are distinct in the sense that they are not conjoined with age coefficients; borrowing from regression terminology, we denote them as main effects. Mortality models in this framework for age-period, age-cohort, and age-period-cohort effects are assessed using national population mortality data from Norway and Australia to show the relative significance of cohort effects as compared to period effects. Results are compared with the traditional Lee-Carter model. The bilinear period effect in the Lee-Carter model is shown to resemble a main cohort effect in these trend models. However, the approach avoids the limitations of the Lee-Carter model when forecasting with the age-cohort trend model.
    Keywords: Mortality Modelling, Age-Period-Cohort Models, Generalized Linear Models, Lee-Carter Models
    JEL: G22 G23 C51 C18
    Date: 2012–05
    URL: http://d.repec.org/n?u=RePEc:asb:wpaper:201212&r=for
  21. By: Sławomir Dudek (Ministry of Finance, Poland); Tomasz Zając (Ministry of Finance, Poland); Kamil Danielski; Magdalena Zachłod-Jelec (Ministry of Finance, Poland); Paweł Kolski (Ministry of Finance, Poland); Dawid Pachucki (Ministry of Finance, Poland); Iwona Fudała-Poradzińska (Ministry of Finance, Poland)
    Abstract: This paper presents the Econometric Model of Public Finance, eMPF. The model has been developed and maintained at the Polish Ministry of Finance to facilitate the forecasting process, especially for budget and convergence programme purposes, and to deliver scenario analyses. We present the particular blocks of the model and responses to some standard shocks. The eMPF model is a medium-size quarterly macroeconometric model of the Polish economy. It was estimated on seasonally adjusted data over the 1995-2010 sample. The model consists of 352 variables, of which 279 are endogenous and about 40 are explained by stochastic ECM-type equations. The long run of the model is theory based and is derived from the optimization conditions of the market participants. Microeconomic foundations of the long-run equilibrium impose constraints on the dynamics of the model to force it to converge to the steady state. In the short run the model is demand driven, with elasticities estimated to reflect the historical path of the variables and rigidities in the economy. Taking into account the mix of economic theory and the willingness to fit the equations to the data, the eMPF model belongs to the so-called hybrid models family. Two sectors are identified in the model: the market sector and the general government sector, both summing up to the total economy according to the ESA95 methodology. Within the market sector two additional subsectors are recognised, households and companies, but only part of the institutional accounts is incorporated for these two subsectors. To fulfil the needs of the Ministry of Finance in preparing fiscal policy analyses, the model has a quite detailed public finance block.
    Keywords: structural macroeconometric model, macroeconomic model, Polish economy
    JEL: E10 E17 E20 E50 E60
    Date: 2012–06–12
    URL: http://d.repec.org/n?u=RePEc:fpo:wpaper:14&r=for
  22. By: Lee, Yoonsuk; Brorsen, B. Wade
    Abstract: A new stochastic process is introduced where permanent changes occur following a Poisson jump process and temporary changes occur following a normal distribution. The model is estimated using hard wheat basis data and is used to explain why the optimal length of moving average to forecast basis varies over time. The estimated probability of jumps is large and thus the optimal length of moving average is small.
    Keywords: basis, jump-diffusion process, Monte Carlo simulation, Research and Development/Tech Change/Emerging Technologies, Research Methods/ Statistical Methods,
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ags:aaea12:125001&r=for
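    The process described in the abstract combines permanent Poisson-timed level shifts with transitory normal noise. A simulation sketch under assumed parameter values (names and magnitudes are illustrative, not estimates from the paper):

    ```python
    import numpy as np

    def simulate_basis(T=500, jump_prob=0.1, jump_sd=1.0, noise_sd=0.5, seed=0):
        """Each period a permanent level shift occurs with probability
        jump_prob (a discrete-time stand-in for the Poisson jump process);
        the observed series is the cumulated level plus transitory noise."""
        rng = np.random.default_rng(seed)
        jumps = rng.random(T) < jump_prob
        level = np.cumsum(jumps * rng.normal(0.0, jump_sd, T))  # permanent shifts
        return level + rng.normal(0.0, noise_sd, T)             # transitory noise
    ```

    When jump_prob is high, observations from before the latest shift carry stale information about the current level, which is the intuition for why a short moving average then forecasts the basis better.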

This nep-for issue is ©2012 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.