
NEP: New Economics Papers on Econometrics
By:  Marco Del Negro; Frank Schorfheide; Frank Smets; Raf Wouters 
Abstract:  The paper provides new tools for the evaluation of DSGE models and applies them to a large-scale New Keynesian dynamic stochastic general equilibrium (DSGE) model with price and wage stickiness and capital accumulation. Specifically, we approximate the DSGE model by a vector autoregression (VAR) and then systematically relax the implied cross-equation restrictions. Let λ denote the extent to which the restrictions are being relaxed. We document how the in- and out-of-sample fit of the resulting specification (DSGE-VAR) changes as a function of λ. Furthermore, we learn about the precise nature of the misspecification by comparing the DSGE model's impulse responses to structural shocks with those of the best-fitting DSGE-VAR. We find that the degree of misspecification in large-scale DSGE models is no longer so large as to prevent their use in day-to-day policy analysis, yet it is not small enough that it can be ignored.
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:200437&r=ecm 
By:  Marco Del Negro; Frank Schorfheide 
Abstract:  This paper uses a novel method for conducting policy analysis with potentially misspecified DSGE models (Del Negro and Schorfheide 2004) and applies it to a simple New Keynesian DSGE model. We illustrate the sensitivity of the results to assumptions on the policy invariance of model misspecifications. 
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:200438&r=ecm 
By:  Peter Reinhard Hansen; Asger Lunde; James M. Nason 
Abstract:  This paper studies tests of calendar effects in equity returns. It is necessary to control for all possible calendar effects to avoid spurious results. The authors contribute to the calendar effects literature with a test for calendar-specific anomalies that conditions on the nuisance of possible calendar effects, so that their approach produces results that are robust to data mining. Unfortunately, attempts to control for a large number of possible calendar effects have the downside of diminishing the power of the test, making it more difficult to detect actual anomalies. The authors show that their test achieves good power properties because it exploits the correlation structure of (excess) returns specific to the calendar effect being studied. They implement the test with bootstrap methods and apply it to stock indices from Denmark, France, Germany, Hong Kong, Italy, Japan, Norway, Sweden, the United Kingdom, and the United States. Bootstrap p-values reveal that calendar effects are significant for returns in most of these equity markets, but end-of-the-year effects are predominant. It also appears that, beginning in the late 1980s, calendar effects have diminished except in small-cap stock indices.
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:200502&r=ecm 
By:  Pierluigi Balduzzi; Cesare Robotti 
Abstract:  This paper considers two alternative formulations of the linear factor model (LFM) with non-traded factors. The first formulation is the traditional LFM, where the estimation of risk premia and alphas is performed by means of a cross-sectional regression of average returns on betas. The second formulation (LFM*) replaces the factors with their projections on the span of excess returns. This formulation requires only time-series regressions for the estimation of risk premia and alphas. We compare the theoretical properties of the two approaches and study the small-sample properties of estimates and test statistics. Our results show that when estimating risk premia and testing multi-beta models, the LFM* formulation should be considered in addition to, or even instead of, the more traditional LFM formulation.
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:200504&r=ecm 
By:  Peter Reinhard Hansen; Asger Lunde; James M. Nason 
Abstract:  The paper introduces the model confidence set (MCS) and applies it to the selection of forecasting models. An MCS is a set of models that is constructed so that it will contain the “best” forecasting model, given a level of confidence. Thus, an MCS is analogous to a confidence interval for a parameter. The MCS acknowledges the limitations of the data, so that uninformative data yield an MCS with many models, whereas informative data yield an MCS with only a few models. We revisit the empirical application in Stock and Watson (1999) and apply the MCS procedure to their set of inflation forecasts. In the first, pre-1984 subsample we obtain an MCS that contains only a few models, notably versions of the Solow-Gordon Phillips curve. On the other hand, the second, post-1984 subsample contains little information and results in a large MCS. Yet the random walk forecast is not contained in the MCS for either of the samples. This outcome shows that the random walk forecast is inferior to inflation forecasts based on Phillips-curve-like relationships.
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:200507&r=ecm 
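The elimination logic behind a model confidence set can be sketched in code. The function below is a deliberately simplified, hypothetical rendition of the idea rather than the Hansen-Lunde-Nason procedure itself (which uses studentised statistics and a block bootstrap suited to dependent data): it repeatedly tests equal predictive ability across the surviving models and drops the worst performer until the test no longer rejects.

```python
import random
import statistics

def model_confidence_set(losses, alpha=0.10, boot=500, rng=random):
    """Simplified MCS-style elimination loop (illustrative only).

    losses: dict mapping model name -> list of forecast losses.
    While a bootstrap test of equal predictive ability rejects at
    level alpha, drop the model with the largest average loss;
    return the surviving set.
    """
    survivors = list(losses)
    n = len(next(iter(losses.values())))
    while len(survivors) > 1:
        means = {m: statistics.mean(losses[m]) for m in survivors}
        grand = statistics.mean(means.values())
        # observed statistic: largest deviation of a model's average
        # loss from the average across surviving models
        t_obs = max(means[m] - grand for m in survivors)
        exceed = 0
        for _ in range(boot):
            idx = [rng.randrange(n) for _ in range(n)]
            # recentred bootstrap means mimic the null of equal ability
            bm = {m: statistics.mean(losses[m][i] for i in idx) - means[m]
                  for m in survivors}
            bgrand = statistics.mean(bm.values())
            if max(bm[m] - bgrand for m in survivors) >= t_obs:
                exceed += 1
        if exceed / boot >= alpha:   # equal ability not rejected: stop
            break
        survivors.remove(max(survivors, key=lambda m: means[m]))
    return survivors
```

With informative loss data the loop shrinks the set quickly; with uninformative data the first test fails to reject and many models survive, mirroring the confidence-interval analogy in the abstract.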
By:  Jeffrey C. Fuhrer; Giovanni P. Olivei 
Abstract:  This paper compares different methods for estimating forward-looking output and inflation Euler equations and shows that weak identification can be an issue in conventional GMM estimation. The authors propose a GMM procedure that imposes the dynamic constraints implied by the forward-looking relation on the instrument set. This “optimal instruments” procedure is more reliable than conventional GMM, and it provides a robust alternative for estimating dynamic macroeconomic relations. Empirical applications of this procedure suggest only a limited role for expectational terms.
Keywords:  Keynesian economics ; Macroeconomics 
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedbwp:042&r=ecm 
By:  N. Kundan Kishor; Evan F. Koenig 
Abstract:  Conventional VAR estimation and forecasting ignores the fact that economic data are often subject to revision many months or years after their initial release. This paper shows how VAR analysis can be modified to account for such revisions. The proposed approach assumes that government statistical releases are efficient with a finite lag. It takes no stand on whether earlier revisions are “noise” or “news.” The technique is illustrated using data on employment and the unemployment rate, real GDP and the unemployment rate, and real GDP and the GDP/consumption ratio. In each case, the proposed procedure outperforms conventional VAR analysis and the more restrictive methods for handling the data-revision problem that are found in the existing literature.
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:feddwp:0501&r=ecm 
By:  Anil Kumar 
Abstract:  Econometric models with nonlinear budget sets frequently arise in the study of the impact of taxation on labor supply. Blomquist and Newey (2002) have suggested a nonparametric method to estimate the uncompensated wage and income effects when the budget set is nonlinear. This paper extends their nonparametric estimation method to censored dependent variables. The modified method is applied to estimate female wage and income elasticities using the 1987 PSID. I find evidence of bias if the nonlinearity in the budget set is ignored. The median compensated elasticity is estimated at 1.19 (with a standard error of 0.19).
Keywords:  Labor supply ; Women - Employment ; Wages
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:feddwp:0505&r=ecm 
By:  Athanasios Orphanides; Simon van Norden 
Abstract:  A stable predictive relationship between inflation and the output gap, often referred to as a Phillips curve, provides the basis for countercyclical monetary policy in many models. In this paper, we evaluate the usefulness of alternative univariate and multivariate estimates of the output gap for predicting inflation. Many of the ex post output gap measures we examine appear to be quite useful for predicting inflation. However, forecasts using real-time estimates of the same measures do not perform nearly as well. The relative usefulness of real-time output gap estimates diminishes further when compared to simple bivariate forecasting models which use past inflation and output growth. Forecast performance also appears to be unstable over time, with models often performing differently over periods of high and low inflation. These results call into question the practical usefulness of the output gap concept for forecasting inflation.
Keywords:  Inflation (Finance) ; Phillips curve ; Input-output analysis
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:200468&r=ecm 
By:  Stefania D'Amico 
Abstract:  This paper proposes a method for predicting the probability density of a variable of interest in the presence of model ambiguity. In the first step, each candidate parametric model is estimated minimizing the Kullback-Leibler 'distance' (KLD) from a reference nonparametric density estimate. Given that the KLD represents a measure of uncertainty about the true structure, in the second step, its information content is used to rank and combine the estimated models. The paper shows that the KLD between the nonparametric and the parametric density estimates is asymptotically normally distributed. This result leads to determining the weights in the model combination, using the distribution function of a Normal centered on the average performance of all plausible models. Consequently, the final weight is determined by the ability of a given model to perform better than the average. As such, this combination technique does not require the true structure to belong to the set of competing models and is computationally simple. I apply the proposed method to estimate the density function of daily stock returns under different phases of the business cycle. The results indicate that the double Gamma distribution is superior to the Gaussian distribution in modeling stock returns, and that the combination outperforms each individual candidate model both in- and out-of-sample.
Keywords:  Rate of return ; Econometric models ; Stocks 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:200509&r=ecm 
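The weighting rule described in the abstract can be sketched compactly. The function below is a hypothetical simplification: each model's weight is the probability, under a Normal centred on the average KLD of all candidate models, that the model performs better than that average, after which the weights are renormalised. The `scale` argument stands in for the asymptotic standard deviation that the paper derives; here it is just a free parameter.

```python
import math
import statistics

def kld_weights(klds, scale=1.0):
    """Combination weights from Kullback-Leibler distances (sketch).

    klds: list of estimated KLDs, one per candidate model (smaller
    is better).  Weight_j = Phi((avg - kld_j) / scale), renormalised,
    so models that beat the average performance receive more weight.
    """
    avg = statistics.mean(klds)
    # standard Normal CDF via erf; large when a model beats the average
    raw = [0.5 * (1.0 + math.erf((avg - k) / (scale * math.sqrt(2.0))))
           for k in klds]
    total = sum(raw)
    return [r / total for r in raw]
```

Note that, as in the abstract, the rule never requires the true model to be among the candidates: weights depend only on relative performance against the pool average.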
By:  Todd E. Clark; Michael W. McCracken 
Abstract:  This paper presents analytical, Monte Carlo, and empirical evidence on the effectiveness of combining recursive and rolling forecasts when linear predictive models are subject to structural change. We first characterize the bias-variance trade-off faced when choosing among the recursive scheme, the rolling scheme, and a scalar convex combination of the two. From that, we derive pointwise-optimal, time-varying and data-dependent observation windows and combining weights designed to minimize mean square forecast error. We then consider other methods of forecast combination, including Bayesian methods that shrink the rolling forecast toward the recursive one, and Bayesian model averaging. Monte Carlo experiments and several empirical examples indicate that although the recursive scheme is often difficult to beat, when gains can be obtained, some form of shrinkage can often improve forecast accuracy relative to forecasts made using the recursive scheme or the rolling scheme with a fixed window width.
Keywords:  Forecasting 
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedkrw:rwp0410&r=ecm 
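The two baseline schemes the paper combines are easy to make concrete. The toy sketch below uses the simplest possible predictive model, a mean forecast, which is my illustrative choice, not the paper's: the recursive scheme estimates on all observations to date, while the rolling scheme uses only the most recent `window` of them. A scalar convex combination of the two forecast paths is the object the paper optimises over.

```python
def recursive_and_rolling_forecasts(y, window):
    """One-step-ahead mean forecasts under the two schemes (sketch).

    Returns (recursive, rolling): at each date t >= window, the
    recursive forecast averages y[0:t] (expanding sample) and the
    rolling forecast averages y[t-window:t] (fixed-width sample).
    """
    rec, roll = [], []
    for t in range(window, len(y)):
        rec.append(sum(y[:t]) / t)                  # expanding estimation sample
        roll.append(sum(y[t - window:t]) / window)  # fixed-width estimation sample
    return rec, roll
```

After a structural break the rolling forecast adapts faster (lower bias) while the recursive forecast averages over more data (lower variance), which is exactly the trade-off the paper's combining weights are designed to balance.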
By:  Siddhartha Chib; Michael J. Dueker 
Abstract:  This article presents a non-Markovian regime switching model in which the regime states depend on the sign of an autoregressive latent variable. The magnitude of the latent variable indexes the 'strength' of the state, or how deeply the system is embedded in the current regime. In this model, regimes have dynamics, not only persistence, so that one regime can gradually give way to another. In this framework, it is natural to allow the autoregressive latent variable to be endogenous, so that regimes are determined jointly with the observed data. We apply the model to GDP growth, as in Hamilton (1989), Albert and Chib (1993) and Filardo and Gordon (1998), to illustrate the relation of the regimes to NBER-dated recessions and the time-varying expected durations of regimes. The article makes use of the Metropolis-Hastings algorithm to make multi-move draws of the latent regime strength variable, where the extended Kalman filter provides a valid proposal density for the latent variable.
Keywords:  Time-series analysis ; Business cycles
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedlwp:2004030&r=ecm 
By:  Marco J. Lombardi (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti"); Simon J. Godsill (Cambridge University Engineering Department, Signal Processing Lab) 
Abstract:  In this paper we propose an online Bayesian filtering and smoothing method for time series models with heavy-tailed alpha-stable noise, with a particular focus on TVAR models. Alpha-stable processes have been shown in the past to be a good model for many naturally occurring noise sources. We first point out how a filter that fails to take into account the heavy-tailed character of the noise performs poorly, and then examine how an alpha-stable-based particle filter can be devised to overcome this problem. The filtering methodology is based on a scale mixture of normals (SMiN) representation of the alpha-stable distribution, which allows an efficient Rao-Blackwellised implementation within a conditionally Gaussian framework and requires no direct evaluation of the alpha-stable density, which is in general unavailable in closed form. The methodology is shown to work well, outperforming the traditional Gaussian methods both on simulated data and on real audio data sets.
Keywords:  Particle filters, Kalman filter, Alpha-stable distributions, Scale mixture of normals.
Date:  2004–05–01 
URL:  http://d.repec.org/n?u=RePEc:fir:econom:wp2004_05&r=ecm 
By:  Marco J. Lombardi (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti"); Giorgio Calzolari (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti") 
Abstract:  The alpha-stable family of distributions constitutes a generalization of the Gaussian distribution, allowing for asymmetry and thicker tails. Its practical usefulness is coupled with a marked theoretical appeal, as it stems from a generalized version of the central limit theorem in which the assumption of finite variance is replaced by a less restrictive assumption of suitably regular tail behavior. Estimation difficulties have, however, hindered its diffusion among practitioners. Since simulated values from alpha-stable distributions can be straightforwardly obtained, the indirect inference approach could prove useful in overcoming these estimation difficulties. In this paper we describe how to implement such a method using a skew-t distribution as the auxiliary model. The indirect inference approach is introduced in the setting of estimating the distribution parameters and then extended to linear time-series models with alpha-stable disturbances. The performance of this estimation method is then assessed on simulated data. An application to time-series models for the inflation rate concludes the paper.
Keywords:  Indirect inference, Alpha-stable distributions, Heavy tails.
Date:  2004–06–01 
URL:  http://d.repec.org/n?u=RePEc:fir:econom:wp2004_07&r=ecm 
By:  Marco J. Lombardi (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti") 
Abstract:  The alpha-stable family of distributions constitutes a generalization of the Gaussian distribution, allowing for asymmetry and thicker tails. Its practical usefulness is coupled with a marked theoretical appeal, given that it stems from a generalized version of the central limit theorem in which the assumption of finite variance is replaced by a less restrictive assumption of suitably regular tail behavior. The absence of a closed-form density function and the associated estimation difficulties have, however, hindered its diffusion among practitioners. In this paper I introduce a novel approach for Bayesian inference in the setting of alpha-stable distributions that resorts to an FFT of the characteristic function in order to approximate the likelihood function; the posterior distributions of the parameters are then produced via a random-walk MCMC method. Contrary to the other MCMC schemes proposed in the literature, the proposed approach does not require auxiliary variables, and so it is less computationally expensive, especially when large sample sizes are involved. A simulation exercise highlights the empirical properties of the sampler; an application to audio noise data demonstrates how this estimation scheme performs in practical applications.
Keywords:  Alpha-stable distributions, Infinite variance, MCMC.
Date:  2004–09–01 
URL:  http://d.repec.org/n?u=RePEc:fir:econom:wp2004_11&r=ecm 
By:  Teodosio Perez-Amaral (Universidad Complutense de Madrid, Departamento de Economía Cuantitativa); Giampiero M. Gallo (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti"); Halbert L. White (University of California, San Diego, Department of Economics) 
Abstract:  In Perez-Amaral, Gallo, and White (2003), the authors proposed an automatic predictive modelling tool called the Relevant Transformation of the Inputs Network Approach (RETINA). It is designed to embody flexibility (using nonlinear transformations of the predictors of interest), selective search within the range of possible models, control of collinearity, out-of-sample forecasting ability, and computational simplicity. In this paper we compare the characteristics of RETINA with those of PcGets, a well-known automatic modelling method proposed by David Hendry. We point out similarities, differences, and complementarities of the two methods. In an example using US telecommunications demand data we find that RETINA can improve both in- and out-of-sample over the usual linear regression model, and over some models suggested by PcGets. Thus, both methods are useful components of the modern applied econometrician's automated modelling tool chest.
Keywords:  Model selection, cross-validation, flexible modelling, information criteria, forecasting.
Date:  2004–10–04 
URL:  http://d.repec.org/n?u=RePEc:fir:econom:wp2004_12&r=ecm 
By:  Jan G. De Gooijer; Rob J. Hyndman 
Abstract:  We review the past 25 years of time series research that has been published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982-1985; International Journal of Forecasting 1985-2005). During this period, over one third of all papers published in these journals concerned time series forecasting. We also review highly influential works on time series forecasting that have been published elsewhere during this period. Enormous progress has been made in many areas, but we find that there are a large number of topics in need of further development. We conclude with comments on possible future research directions in this field.
Keywords:  Accuracy measures; ARCH model; ARIMA model; Combining; Count data; Densities; Exponential smoothing; Kalman Filter; Long memory; Multivariate; Neural nets; Nonlinearity; Prediction intervals; Regime switching models; Robustness; Seasonality; State space; Structural models; Transfer function; Univariate; VAR. 
JEL:  C53 C22 C32 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:200512&r=ecm 
By:  Rob J. Hyndman; Anne B. Koehler 
Abstract:  We discuss and compare measures of accuracy of univariate time series forecasts. The methods used in the M-competition and the M3-competition, and many of the measures recommended by previous authors on this topic, are found to be inadequate, and many of them are degenerate in commonly occurring situations. Instead, we propose that the mean absolute scaled error become the standard measure for comparing forecast accuracy across multiple time series.
Keywords:  Forecast accuracy, Forecast evaluation, Forecast error measures, M-competition, Mean absolute scaled error.
JEL:  C53 C52 C22 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:200513&r=ecm 
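The proposed measure is simple to state: scale each out-of-sample forecast error by the in-sample mean absolute error of the one-step naive (random-walk) forecast, so the result is unit-free and comparable across series. A minimal sketch:

```python
def mase(actual, forecast, train):
    """Mean absolute scaled error (sketch).

    actual, forecast: out-of-sample values and their forecasts.
    train: the in-sample series used to compute the scaling factor,
    the mean absolute error of the naive forecast Y_t = Y_{t-1}.
    A value below 1 means the forecast beat the in-sample naive method
    on average; above 1 means it did worse.
    """
    # in-sample MAE of the one-step naive forecast
    scale = sum(abs(train[t] - train[t - 1])
                for t in range(1, len(train))) / (len(train) - 1)
    errors = [abs(a - f) for a, f in zip(actual, forecast)]
    return sum(errors) / len(errors) / scale
```

Unlike percentage errors, this scaling stays well defined when the series passes through zero, which is one of the degeneracies of earlier measures the abstract alludes to.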
By:  George Kapetanios (Queen Mary, University of London) 
Abstract:  The question of variable selection in a regression model is a major open research topic in econometrics. Traditionally two broad classes of methods have been used. One is sequential testing and the other is information criteria. The advent of large datasets used by institutions such as central banks has exacerbated this model selection problem. This paper provides a new solution in the context of information criteria. The solution rests on the judicious selection of a subset of models for consideration, using nonstandard optimisation algorithms for information criterion minimisation. In particular, simulated annealing and genetic algorithms are considered. Both a Monte Carlo study and an empirical forecasting application to UK CPI inflation suggest that the new methods are worthy of further consideration.
Keywords:  Simulated Annealing, Genetic Algorithms, Information criteria, Model selection, Forecasting, Inflation. 
JEL:  C11 C15 C53 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp533&r=ecm 
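Searching over regressor subsets by simulated annealing can be sketched generically. The code below is a toy illustration of the idea, not the paper's algorithm: the information criterion is abstracted into a callable, a subset is a tuple of 0/1 inclusion flags, the neighbourhood is a single flipped flag, and the cooling schedule is an illustrative choice.

```python
import math
import random

def anneal_subset(criterion, n_vars, steps=2000, t0=1.0, rng=random):
    """Simulated annealing over 0/1 inclusion flags for n_vars regressors.

    criterion: callable mapping a flag tuple to the value to minimise
    (e.g. the BIC of the regression on the included variables).
    Returns the best subset found and its criterion value.
    """
    current = tuple(rng.randrange(2) for _ in range(n_vars))
    cur_val = criterion(current)
    best, best_val = current, cur_val
    for step in range(steps):
        temp = t0 / (1 + step)                     # cooling schedule
        j = rng.randrange(n_vars)                  # flip one inclusion flag
        cand = current[:j] + (1 - current[j],) + current[j + 1:]
        cand_val = criterion(cand)
        # always accept improvements; accept deteriorations with
        # probability exp(-delta / temperature)
        if cand_val <= cur_val or \
                rng.random() < math.exp((cur_val - cand_val) / temp):
            current, cur_val = cand, cand_val
        if cur_val < best_val:
            best, best_val = current, cur_val
    return best, best_val
```

The appeal over exhaustive search is clear from the scale of the problem: with K candidate regressors there are 2^K subsets, so for the large datasets mentioned in the abstract enumeration is infeasible while a stochastic search remains cheap.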
By:  George Kapetanios (Queen Mary, University of London) 
Abstract:  It is well known that instrumental variables (IV) estimation is sensitive to the choice of instruments both in small samples and asymptotically. Recently, Donald and Newey (2001) suggested a simple method for choosing the instrument set. The method involves minimising the approximate mean square error (MSE) of a given IV estimator where the MSE is obtained using refined asymptotic theory. An issue with the work of Donald and Newey (2001) is the fact that when considering large sets of valid instruments, it is not clear how to order the instruments in order to choose which ones ought to be included in the estimation. The present paper provides a possible solution to the problem using nonstandard optimisation algorithms. The properties of the algorithms are discussed. A Monte Carlo study illustrates the potential of the new method. 
Keywords:  Instrumental Variables, MSE, Simulated Annealing, Genetic Algorithms. 
JEL:  C12 C15 C23 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp534&r=ecm 
By:  George Kapetanios (Queen Mary, University of London) 
Abstract:  Panel datasets have been increasingly used in economics to analyse complex economic phenomena. One of the attractions of panel datasets is the ability to use an extended dataset to obtain information about parameters of interest which are assumed to have common values across panel units. However, the assumption of poolability has not been studied extensively beyond tests that determine whether a given dataset is poolable. We propose an information criterion method that enables a set of series to be divided into a set of poolable series, for which the hypothesis of a common parameter subvector cannot be rejected, and a set of series for which the poolability hypothesis fails. The method can be extended to analyse datasets with multiple clusters of series with similar characteristics. We discuss the theoretical properties of the method and investigate its small sample performance in a Monte Carlo study.
Keywords:  Panel datasets, Poolability, Information criteria, Genetic Algorithm, Simulated Annealing. 
JEL:  C12 C15 C23 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp535&r=ecm 
By:  Andrea Cipollini (Queen Mary, University of London); George Kapetanios (Queen Mary, University of London) 
Abstract:  In this paper we compare the performance of a regional indicator of vulnerability in predicting, out of sample, the crisis events affecting the South East Asian region during the 1997-98 period. A dynamic factor method is used to retrieve the vulnerability indicator, and stochastic simulation is used to produce probability forecasts. The empirical findings suggest evidence of financial contagion.
Keywords:  Financial contagion, Dynamic factor model. 
JEL:  C32 C51 F34 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp538&r=ecm 
By:  George Kapetanios (Queen Mary, University of London) 
Abstract:  The problem of structural change justifiably attracts considerable attention in econometrics. A number of different paradigms have been adopted, ranging from structural breaks, which are sudden and rare, to time-varying coefficient models, which exhibit structural change more frequently and continuously. This paper is concerned with parametric econometric models whose coefficients change deterministically and smoothly over time. In particular, we provide and discuss tests of the null hypothesis of no structural change against the alternative of smooth deterministic structural change. We provide asymptotic tests for this null hypothesis; however, their finite-sample performance is poor, as they over-reject significantly. To address this problem we propose and justify bootstrap-based tests, which perform well in an extensive Monte Carlo study.
Keywords:  Structural change, Nonstationarity, Deterministic time-variation.
JEL:  C10 C14 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp539&r=ecm 
By:  George Kapetanios (Queen Mary, University of London) 
Abstract:  The problem of structural change justifiably attracts considerable attention in econometrics. A number of different paradigms have been adopted, ranging from structural breaks, which are sudden and rare, to time-varying coefficient models, which exhibit structural change more frequently and continuously. This paper is concerned with parametric econometric models whose coefficients change deterministically and smoothly over time. In particular, we provide a new estimator for unconditional time-varying variances in regression models. A small Monte Carlo study indicates that the method works reasonably well for moderately large sample sizes.
Keywords:  Structural change, Nonstationarity, Deterministic time-variation.
JEL:  C10 C14 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp540&r=ecm 
By:  Gonzalo CambaMendez (European Central Bank); George Kapetanios (Queen Mary, University of London) 
Abstract:  Testing the rank of a matrix of estimated parameters is key in a large variety of econometric modelling scenarios. This paper describes general methods to test for the rank of a matrix, and provides details on a variety of modelling scenarios in the econometrics literature where these tests are required. 
Keywords:  Multiple time series, Model specification, Tests of rank. 
JEL:  C12 C15 C32 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp541&r=ecm 
By:  Hashem Dezhbakhsh (Emory University); Daniel Levy (BarIlan University) 
Abstract:  Although linearly interpolated series are often used in economics, little has been done to examine the effects of interpolation on time series properties and on statistical inference. We show that linear interpolation of a trend-stationary series superimposes a 'periodic' structure on the moments of the series. Using conventional time series methods to make inference about the interpolated series may therefore be invalid. Also, the interpolated series may exhibit more shock persistence than the original trend-stationary series.
Keywords:  Linear Interpolation, Trend-Stationary Series, Shock Persistence, Periodic Properties of Time Series
JEL:  C10 C22 C82 E37 
Date:  2005–05–15 
URL:  http://d.repec.org/n?u=RePEc:wpa:wuwpem:0505004&r=ecm 
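The artificial persistence the abstract warns about is easy to exhibit numerically. The sketch below (my own illustration, not the paper's example) linearly interpolates white noise to a higher frequency and compares first-order autocorrelations: the raw noise is serially uncorrelated, but its interpolated version is strongly persistent purely as an artefact of the interpolation.

```python
import random
import statistics

def interpolate_linear(low_freq, k):
    """Linearly interpolate a low-frequency series to k sub-periods
    (e.g. annual observations spread across quarters with k = 4)."""
    out = []
    for a, b in zip(low_freq, low_freq[1:]):
        for j in range(k):
            out.append(a + (b - a) * j / k)
    out.append(low_freq[-1])
    return out

def acf1(x):
    """Sample first-order autocorrelation."""
    m = statistics.mean(x)
    num = sum((a - m) * (b - m) for a, b in zip(x, x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den

rng = random.Random(42)
raw = [rng.gauss(0.0, 1.0) for _ in range(200)]   # serially uncorrelated
smooth = interpolate_linear(raw, 4)
# the interpolated series is far more persistent than the original
print(acf1(raw), acf1(smooth))
```

Standard inference applied to `smooth` would wrongly conclude the underlying process is highly persistent, which is the paper's point about invalid inference and spurious shock persistence.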
By:  Jonathan B. Hill (Florida International University) 
Abstract:  In this paper we analyze the asymptotic properties of the popular distribution tail estimator of B. Hill (1975) for heavy-tailed, heterogeneous, dependent processes. We prove the Hill estimator is weakly consistent for functionals of mixingales and L0-approximable processes with regularly varying tails, covering ARMA, GARCH, and many IGARCH and FIGARCH processes. Moreover, for functionals of processes near-epoch dependent on a mixing process, we prove that a Gaussian distribution limit exists. In this case, as opposed to all existing prior results in the literature, we do not require the asymptotic variance of the Hill estimator to be bounded, and we develop a Newey-West kernel estimator of the variance. We expedite the theory by defining 'extremal mixingale' and 'extremal NED' properties that hold exclusively in the extreme distribution tails, dispensing with dependence restrictions in the non-extremal support, and prove that a broad class of linear processes is extremal NED. We demonstrate that, for greater degrees of serial dependence, more tail information is required to ensure asymptotic normality, both in theory and in practice.
Keywords:  Hill estimator; regular variation; infinite variance; near epoch dependence; mixingales 
JEL:  C1 C2 C3 C4 C5 C8 
Date:  2005–05–20 
URL:  http://d.repec.org/n?u=RePEc:wpa:wuwpem:0505005&r=ecm 