nep-for New Economics Papers
on Forecasting
Issue of 2006‒02‒05
thirteen papers chosen by
Rob J Hyndman
Monash University

  1. Macroeconomic derivatives: an initial analysis of market-based macro forecasts, uncertainty, and risk By Refet S. Gürkaynak; Justin Wolfers
  2. A review of core inflation and an evaluation of its measures By Robert Rich; Charles Steindel
  3. Modelling inflation dynamics: a critical review of recent research By Jeremy Rudd; Karl Whelan
  4. Cheap versus Expensive Trades: Assessing the Determinants of Market Impact Costs By Jacob A. Bikker; Laura Spierdijk; Pieter Jelle van der Sluis
  5. Using subjective expectations to forecast longevity: do survey respondents know something we don't know? By Maria G. Perozek
  6. If, At First, The Idea is Not Absurd, Then There is No Hope For It: Towards 15 MtC in the UK Transport Sector. By Robin Hickman; David Banister
  7. Pivot-Point Procedures in Practical Travel Demand Forecasting By Andrew Daly; James Fox; Jan Gerrit Tuinenga
  8. Comparing the Point Predictions and Subjective Probability Distributions of Professional Forecasters By Joseph Engelberg; Charles F. Manski; Jared Williams
  9. PEARL - The New Regional Forecasting Model of the Netherlands By Dik Leering; Andries Hans De Jong
  10. In search of a modelling strategy for projecting internal migration in European countries - Demographic versus economic-geographical approaches By Leo JG Van Wissen; Nicole Gaag; Phil Rees; John Stillwell
  11. Prediction of Land use change in urbanization control districts using neural network - A Case Study of Regional Hub City in Japan By Yoshitaka Kajita; Satoshi Toi; Hiroshi Tatsumi
  12. Representing future urban and regional scenarios for flood hazard mitigation By Jose I. Barredo; Carlo Lavalle; Valentina Sagris; Guy Engelen
  13. GIS-based Forecast of Landscape Changes with the Ito Land Readjustment Project By Hiroshi Tatsumi; Satoshi Toi; Kazunari Tanaka; Sanpei Yamashita; Yoshitaka Kajita

  1. By: Refet S. Gürkaynak; Justin Wolfers
    Abstract: In September 2002, a new market in "Economic Derivatives" was launched allowing traders to take positions on future values of several macroeconomic data releases. We provide an initial analysis of the prices of these options. We find that market-based measures of expectations are similar to survey-based forecasts, although the market-based measures somewhat more accurately predict financial market responses to surprises in data. These markets also provide implied probabilities of the full range of specific outcomes, allowing us to measure uncertainty, assess its driving forces, and compare this measure of uncertainty with the dispersion of point-estimates among individual forecasters (a measure of disagreement). We also assess the accuracy of market-generated probability density forecasts. A consistent theme is that few of the behavioral anomalies present in surveys of professional forecasts survive in equilibrium, and that these markets are remarkably well calibrated. Finally we assess the role of risk, finding little evidence that risk-aversion drives a wedge between market prices and probabilities in this market.
    Keywords: Derivative securities ; Macroeconomics ; Forecasting
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:fip:fedfwp:2005-26&r=for
  2. By: Robert Rich; Charles Steindel
    Abstract: This paper provides a review of the concept of core inflation and evaluates the performance of several proposed measures. We first consider the rationale of a central bank in setting its inflation goal in terms of a selected rate of consumer price growth and the use of a core inflation measure as a means of achieving this long-term policy objective. We then discuss desired attributes of a core measure of inflation, such as ease of design, accuracy in tracking trend inflation, and predictive content for future movements in aggregate inflation. Using these attributes as criteria, we evaluate several candidate series that have been proposed as core measures of consumer price index (CPI) inflation and personal consumption expenditure (PCE) inflation for the United States. The candidate series are inflation excluding food and energy, inflation excluding energy, and median inflation, as well as exponentially smoothed versions of aggregate inflation and the aforementioned individual series. For PCE inflation, we examine quarterly data starting in 1959. Unlike previous research, we confine our analysis to the methodologically consistent CPI index, which is only available starting in 1978. We find that most of the candidate series, including the familiar ex-food and energy measure, demonstrate the ability to match the mean rate of aggregate inflation and track movements in its underlying trend. In the within-sample analysis, we find that core measures derived through exponential smoothing, in combination with simple measures of economic slack, have substantial explanatory content for changes in aggregate inflation several years in advance. In the out-of-sample analysis, however, we find that no measure performs consistently well in forecasting inflation. Moreover, we document evidence of some parameter instability in the estimated forecasting models. Taken together, our findings lead us to conclude that there is no individual measure of core inflation that can be considered superior to other measures.
    Keywords: Inflation (Finance) ; Consumer price indexes ; Consumption (Economics)
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:fip:fednsr:236&r=for
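The exponentially smoothed core measures evaluated in this paper are, at heart, one-sided weighted averages of past inflation. A minimal sketch of such a smoother (the gain value is illustrative, not the paper's calibration):

```python
def exponentially_smoothed_core(inflation, gain=0.1):
    """One-sided exponential smoothing of an inflation series.
    Each core value is gain * current inflation + (1 - gain) * previous core."""
    core = [inflation[0]]  # initialize at the first observation
    for pi in inflation[1:]:
        core.append(gain * pi + (1.0 - gain) * core[-1])
    return core
```

A higher gain tracks recent inflation more closely; a lower gain yields a smoother estimate of the underlying trend.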
  3. By: Jeremy Rudd; Karl Whelan
    Abstract: In recent years, a broad academic consensus has arisen around the use of rational expectations sticky-price models to capture inflation dynamics. These models are seen as providing an empirically reasonable characterization of observed inflation behavior once suitable measures of the output gap are chosen; and, moreover, are perceived to be robust to the Lucas critique in a way that earlier econometric models of inflation are not. We review the principal conclusions of this literature concerning: 1) the ability of these models to fit the data; 2) the importance of rational forward-looking expectations in price setting; and 3) the appropriate measure of inflationary pressures. We argue that existing rational expectations sticky-price models fail to provide a useful empirical description of the inflation process, especially relative to traditional econometric Phillips curves of the sort commonly employed for policy analysis and forecasting.
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2005-66&r=for
  4. By: Jacob A. Bikker; Laura Spierdijk; Pieter Jelle van der Sluis
    Abstract: This paper assesses the determinants of market impact costs of institutional equity trades, using unique data from the world’s second largest pension fund. We allow the impact of trade characteristics and market conditions on trading costs to depend on the level of trading costs itself and establish significant differences in the responses of cheaper and more expensive trades. We explain the distinct responses from differences in information content and demand for liquidity between trades with high and low trading costs. Finally, to illustrate the practical relevance of the approach, we use our method to forecast future trading costs.
    Keywords: market impact costs; quantile regression; forecasting market impact costs
    JEL: G11 C53
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:dnb:dnbwpp:069&r=for
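The quantile-regression machinery behind the paper's cheap-versus-expensive comparison rests on the pinball (check) loss, whose minimizer is the requested quantile of the cost distribution. A minimal sketch with invented cost data:

```python
def pinball_loss(q, y, tau):
    """Check-function loss; minimized over q at the tau-quantile of y."""
    return sum((tau if yi >= q else tau - 1.0) * (yi - q) for yi in y)

def empirical_quantile(y, tau):
    """Brute-force tau-quantile: the sample point minimizing the pinball loss."""
    return min(y, key=lambda q: pinball_loss(q, y, tau))
```

In a full quantile regression, the constant q is replaced by a linear function of trade characteristics and the same loss is minimized over its coefficients, one fit per quantile.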
  5. By: Maria G. Perozek
    Abstract: Future old-age mortality is notoriously difficult to predict because it requires not only an understanding of the process of senescence, which is influenced by genetic, environmental and behavioral factors, but also a prediction of how these factors will evolve going forward. In this paper, I argue that individuals are uniquely qualified to predict their own mortality based on their own genetic background, as well as environmental and behavioral risk factors that are often known only to the individual. Using expectations data from the 1992 HRS, I construct subjective cohort life tables that are shown to predict the unusual direction of revisions to U.S. life expectancy by gender between 1992 and 2004; that is, the SSA revised up male life expectancy in 2004 and at the same time revised down female life expectancy, narrowing the gender gap in longevity by 25 percent over this period. Further, the subjective expectations of women suggest that female life expectancies produced by the Social Security Actuary might still be on the high side, while the subjective life expectancies for men appear to be roughly in line with the 2004 life tables.
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2005-68&r=for
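A cohort life table of the kind described reduces, at its core, to cumulating survival probabilities into a life expectancy. A schematic sketch with hypothetical death probabilities (not the HRS data or the paper's estimation method):

```python
def life_expectancy(qx):
    """Expected remaining years of life, given annual death probabilities qx
    (qx[i] = probability of dying in year i, conditional on surviving to its
    start).  Person-years within each year use a trapezoid approximation."""
    survivors = 1.0
    expectancy = 0.0
    for q in qx:
        next_survivors = survivors * (1.0 - q)
        expectancy += (survivors + next_survivors) / 2.0  # person-years lived this year
        survivors = next_survivors
    return expectancy
```

Replacing official qx values with probabilities implied by survey respondents' subjective survival expectations yields a subjective life table of the sort the paper constructs.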
  6. By: Robin Hickman; David Banister
    Abstract: This paper examines the possibilities of reducing transport carbon dioxide emissions in the UK by 60 per cent by 2030 using a modified scenario building and backcasting approach. It examines a range of policy measures (behavioural and technological), assessing how they can be effectively combined to achieve the required level of emissions reduction. The intention is to evaluate whether such an ambitious target is feasible, and to identify the main problems (including the transition costs) and the main decision points over the 30-year time horizon. This paper outlines the first stages of the research, providing: an introduction to futures studies, including a review of forecasting, scenario building and backcasting approaches; an assessment of the UK transport sector's contribution to climate change and global warming; and a target-setting exercise for 2030, forecasting the business-as-usual situation for all forms of transport in the UK and assessing the scale of change required to achieve the emissions reductions. The benefit of scenario building and backcasting is that innovative packages of policy measures can be developed to address emissions reduction targets. The approach allows trend-breaking analysis: it highlights the policy and planning choices to be made, identifies the key stakeholders that should be included in the process, and assesses the main decisions that have to be taken (the step changes). It also provides a longer-term background against which more detailed analysis can take place.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p791&r=for
  7. By: Andrew Daly; James Fox; Jan Gerrit Tuinenga
    Abstract: For many cities, regions and countries, large-scale model systems have been developed to support the development of transport policy. These models are intended to predict the traffic flows that are likely to result from assumed exogenous developments and transport policies affecting people and businesses in the relevant area. The accuracy of the model is crucial to determining the quality of the information that can be extracted as input to the planning and policy analysis process. A frequent approach to modelling, which can substantially enhance the accuracy of the model, is to formulate the model as predicting changes relative to a base-year situation. Often, base-year traffic flows can be observed rather accurately, and restricting the model to predicting differences reduces the scope for errors in the modelling – whether caused by errors in the model itself or in its inputs – to influence the outputs. Such approaches are called ‘pivot point’ methods, or sometimes incremental models. These approaches have proved beneficial in practical planning situations and now form part of the recommended ‘VaDMA’ (Variable Demand Modelling Advice) guidelines issued by the UK Department for Transport. While the principle of the pivot point is clear, it can be implemented in practical model systems in a number of ways, and the choice between these can have a substantial influence on the model forecasts. In particular, modellers need to consider: 1) whether the change predicted by the model should be expressed as an absolute difference or a proportional ratio, or whether a mixed approach is necessary; 2) how to deal with apparent growth in ‘green-field’ situations when applying these approaches; 3) at what level in the model the pivoting should apply, i.e. at the level of mode choice, destination choice, overall travel frequency or combinations of these; and 4) whether the pivoting is best undertaken as an operation conducted on a ‘base matrix’, or whether the model should be constructed so that it automatically reproduces the base-year situation with base-year inputs. The paper reviews the alternative approaches to each of these issues, discussing current practice and attempting to establish the basis on which alternative approaches might be chosen; in particular, whether pivoting is treated as a correction to a model that is in principle correctly specified but incorporates some error, perhaps from faulty data, or as a partial replacement for a model that handles at best part of the situation. These views of pivoting lead to different procedures. The paper goes on to present and justify the approach that the authors have found useful in a number of large-scale modelling studies in The Netherlands, the United Kingdom and elsewhere, pointing out the problems that have led to the calculations that are recommended.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p784&r=for
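The absolute-difference and proportional-ratio pivoting rules discussed above can be sketched as follows. This is a deliberately naive illustration; practical implementations add special rules for zero or small base cells (the 'green-field' problem):

```python
def pivot_forecast(base_obs, base_model, future_model, method="ratio"):
    """Adjust a model forecast to pivot off an observed base-year value.

    'ratio' applies the model's proportional change to the observed base;
    'difference' applies the model's absolute change instead.
    """
    if method == "ratio":
        if base_model == 0:
            raise ValueError("ratio pivot undefined for a zero base ('green-field' cell)")
        return base_obs * future_model / base_model
    if method == "difference":
        return base_obs + (future_model - base_model)
    raise ValueError("unknown method: %s" % method)
```

For example, with an observed base flow of 100, a modelled base of 80 and a modelled future of 120, the ratio rule forecasts 150 while the difference rule forecasts 140 – the divergence that motivates mixed approaches.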
  8. By: Joseph Engelberg; Charles F. Manski; Jared Williams
    Abstract: We use data from the Survey of Professional Forecasters to compare point forecasts of GDP growth and inflation with the subjective probability distributions held by forecasters. We find that SPF forecasters summarize their underlying distributions in different ways and that their summaries tend to be favorable relative to the central tendency of the underlying distributions. We also find that those forecasters who report favorable point estimates in the current survey tend to do so in subsequent surveys. These findings, plus the inescapable fact that point forecasts reveal nothing about the uncertainty that forecasters feel, suggest that the SPF and similar surveys should not ask for point forecasts. It seems more reasonable to elicit probabilistic expectations and derive measures of central tendency and uncertainty, as we do here.
    JEL: C42 E27 E47
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:11978&r=for
  9. By: Dik Leering; Andries Hans De Jong
    Abstract: The Netherlands has a rather long history of developing models in the field of regional forecasts. Among other things, these forecasts are used as an instrument for planning house-building. In 2004 Statistics Netherlands and the Spatial Planning Bureau started the development of a new model, called PEARL (which stands for 'Population Extrapolations At Regional Level'). It is an integrated model for forecasting the population (by ethnic group) and households. PEARL will be used to regionalize the official forecasts of population (by ethnic group) and households at the national level, which are compiled by Statistics Netherlands. The lowest level of the regional forecasts will be the municipal level, which permits aggregation to larger NUTS regions, such as 'COROP' and 'province'. The forecast horizon of the regional forecasts will be 15 to 20 years, although computations for a longer period are possible. An important objective is for PEARL to provide the official regional forecast from 2007 onwards. Assumptions on demographic (growth) components (fertility, mortality, internal and external migration) and transition rates (with respect to the life course) will be formulated at the municipal level and used as input for PEARL; in this way, transparency of the model's outcomes is promoted. In order to achieve consistency between population and households, PEARL consists of both a macro-layer and a micro-layer. At the macro-layer (the municipal level) the assumptions are applied, while in the micro-layer (the individual level) the resulting events are administered. The micro-layer thus consists of approximately 16 million persons and approximately 7 million households. In switching between the macro- and the micro-layer, PEARL distinguishes itself from more conventional models. The primary goal is to use PEARL as a (robust) instrument for forecasting. However, it may also be used as a tool for compiling scenarios. This can be done at the macro level (by formulating alternative assumptions at the municipal level), but also at the micro level (by using alternative figures on risks); in the latter application, PEARL is used as a micro-simulation model. The PEARL software is written in Delphi 5. The intention is to publish first outcomes (with a limited scope) in the second half of 2005.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p420&r=for
  10. By: Leo JG Van Wissen; Nicole Gaag; Phil Rees; John Stillwell
    Abstract: Internal migration is the most volatile and difficult-to-predict component of regional demographic change. A purely demographic approach using age- and sex-specific parameters of migration intensities cannot fully capture migration trends over time. One approach to better describing past trends and forecasting future trends is to use additional non-demographic information, such as regional economic indicators. In this paper we compare the predictive performance of pure demographic and extended economic-geographical models using data from four European countries at the so-called NUTS 2 level. The models are nested within a GLM specification, which allows both the demographic and the extended models to be written as specific cases of loglinear models. Model fit and performance can therefore be compared directly.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p787&r=for
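The GLM nesting the authors exploit means that a demographic model and its economic-geographical extension can be compared on a common Poisson log-likelihood. A toy sketch (the flow numbers and fitted values are invented, not the paper's data):

```python
import math

def poisson_loglik(flows, fitted):
    """Poisson log-likelihood up to the log(y!) constant,
    which cancels when comparing nested models on the same data."""
    return sum(y * math.log(m) - m for y, m in zip(flows, fitted))

# Hypothetical origin-destination migration flows and two sets of fitted means:
flows = [120, 45, 80, 30]
demographic = [100, 60, 75, 40]  # fitted from age/sex rates only
extended = [115, 48, 82, 28]     # fitted with regional economic indicators added

# Likelihood-ratio statistic for the nested comparison:
lr = 2 * (poisson_loglik(flows, extended) - poisson_loglik(flows, demographic))
```

A positive likelihood-ratio statistic, judged against a chi-squared distribution with degrees of freedom equal to the number of added parameters, indicates that the extra covariates improve the fit.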
  11. By: Yoshitaka Kajita; Satoshi Toi; Hiroshi Tatsumi
    Abstract: Land use in urban areas changes according to the economic mechanism of the market. Urbanization control districts are designated under the city planning and zoning act as regions where urbanization should be restrained. Within these zones, however, there are many areas where building-form regulation is looser than in the urbanization promotion zones intended to form the city area. As a result, disorderly development, such as the siting of large-scale commercial and leisure facilities unsuited to the surrounding environment, occurs in urbanization control districts. At the same time, existing villages are losing vitality as population decline, falling birth rates and an aging population become problems. To cope with these issues, it is important for urban planning to understand past conditions of land use. This paper describes the spatial structure of urbanization control districts, based on present conditions and the change structure of land use, using surveyed mesh data and copies of the development permission register in a local hub city in Japan. Land use forecasting systems are then designed using a neural network. Although land use is classified differently in each survey year, a common classification of land use is proposed, considering the similarity of spatial distributions and the physical meanings of the land use categories. The distribution by mesh of each land use division is then studied, and the spatial distribution of land use and its transitions are discussed. Next, land use forecasting models are built using a neural network. Because the features and structure of land use change in an area depend on whether development projects are carried out, the meshes are divided into two groups and separate forecasting models are designed. Though the proposed approach is a macroscopic method of land use forecasting, it is useful for investigating urban policies for development projects and for evaluating their effects.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p415&r=for
  12. By: Jose I. Barredo; Carlo Lavalle; Valentina Sagris; Guy Engelen
    Abstract: In this paper we analyse urban and regional growth trends by using dynamic spatial models. The objective of this approach is twofold: on the one hand to monitor sustainable development trends, and on the other hand to assess flood risk in urban areas. We propose the use of future urban scenarios in order to forecast the effects of urban and regional planning policies. In the last 20 years the extent of built-up areas in Europe has increased by 20%, clearly exceeding the 6% rate of population growth over the same period. This trend contributes to unsustainable development patterns; moreover, the exposure to natural hazards is increasing in large regions of Europe. The paper is organised in two parts. In the first part we analyse a case study in the Friuli-Venezia Giulia (FVG) Region in northern Italy. We analyse several spatial indicators in the form of maps describing population growth and patterns, and the historical growth of built-up areas. We then show the results of a dynamic spatial model for simulating land use scenarios. The model is based on a spatial dynamics bottom-up approach, and can be defined as a cellular automata (CA)-based model. Future urban scenarios are produced by taking into account several factors (e.g. land use development, population growth or spatial planning policies). Urban simulations offer a useful approach to understanding the consequences of current spatial planning policies. Inappropriate regional and urban planning can exacerbate the negative effects of extreme hydrological processes. Good land management and planning practices, including appropriate land use and development control in flood-prone areas, represent suitable non-structural solutions to minimise flood damages. The overall effects of these measures in terms of both sustainable development and flood defence can be quantified with the proposed modelling approach. In the second part of the paper we show some preliminary results of a pilot case study. Two future simulations produced by the model were used for a flood risk assessment in Pordenone (one of the four provinces of FVG). In the last 100 years Pordenone has suffered several floods. The two major events were the heavy floods of 1966 (a 100-year flood event; >500 mm of rain in 36 hours) and 2002 (up to 580 mm of rain in 36 hours). The disastrous consequences of those heavy floods have shown how vulnerable this area is. The flood risk analysis is based on a hydrological hazard map for the Livenza River catchment area, provided by the regional Water Authority. That map covers most of the flood hazard areas of Pordenone province. Early results of this study show that the main driving force of natural disaster damage is not only increasing flood hazard, but increasing vulnerability, mainly due to urbanisation in flood-prone areas.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p147&r=for
  13. By: Hiroshi Tatsumi; Satoshi Toi; Kazunari Tanaka; Sanpei Yamashita; Yoshitaka Kajita
    Abstract: The purpose of the present research is to consolidate geographic information into a GIS for the Ito Land Readjustment Project presently being implemented in northern Kyushu in Japan, and to forecast landscape changes through a before/after comparison of the project. First, the topographic map, 50-meter mesh digital elevation map, aerial photograph, land use map, project plan, land use zone plan, district plan, substitute lot plan and other geographic information for the project district were collected from various sources and systematized, and the data were aggregated using ArcGIS. Next, building and structure data from before and after the project were prepared, and the height data of these buildings and structures were combined with the land elevation data. The major viewpoint fields in the district were then selected to examine the extent of change in the areas visible from these viewpoint fields before and after the project. Moreover, focusing on the mountains in the visibility areas, we forecasted how much the visible area of the mountains would decrease at each viewpoint field after construction of the buildings and structures in the project. Finally, we produced 3-D images of the project district using ArcScene, and investigated the extent of visibility of the mountain-range skyline from each viewpoint field as a before/after comparison of the project.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p199&r=for

This nep-for issue is ©2006 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.