
on Econometrics 
By:  Zongwu Cai (Department of Mathematics & Statistics, University of North Carolina at Charlotte; Fujian Key Laboratory of Statistical Sciences, Xiamen University); Zhijie Xiao (Boston College) 
Abstract:  We study quantile regression estimation for dynamic models with partially varying coefficients, so that the values of some coefficients may be functions of informative covariates. Estimators of both the parametric and the nonparametric functional coefficients are proposed. In particular, we propose a three-stage semiparametric procedure. Both consistency and asymptotic normality of the proposed estimators are derived. We demonstrate that the parametric estimators are root-n consistent and that the estimation of the functional coefficients is oracle. In addition, efficiency of parameter estimation is discussed and a simple efficient estimator is proposed. A simple and easily implemented test for the hypothesis of varying coefficients is proposed. A Monte Carlo experiment is conducted to evaluate the performance of the proposed estimators. 
Keywords:  Efficiency; nonlinear time series; partially linear; partially varying coefficients; quantile regression; semiparametric 
Date:  2010–11–22 
URL:  http://d.repec.org/n?u=RePEc:boc:bocoec:761&r=ecm 
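As a minimal illustration of the building block behind these quantile methods, the check (pinball) loss can be sketched in Python. The code below is not the authors' three-stage procedure, only the elementary fact that minimizing the check loss over a constant recovers the sample quantile:

```python
import numpy as np

def check_loss(u, tau):
    """Koenker-Bassett check (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def quantile_by_check_loss(y, tau):
    """The tau-th sample quantile minimizes the summed check loss over constants.
    Searching over the observed values suffices, since the minimizer is an
    order statistic."""
    losses = [np.sum(check_loss(y - c, tau)) for c in y]
    return y[int(np.argmin(losses))]

y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(quantile_by_check_loss(y, 0.5))   # 3.0, the sample median
print(quantile_by_check_loss(y, 0.25))  # 2.0
```

Quantile regression replaces the constant with a (possibly varying-coefficient) linear predictor and minimizes the same loss.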
By:  Federico Crudu 
Abstract:  In this paper we introduce a weighted Z-estimator for moment condition models in the presence of auxiliary information on the unknown distribution of the data, under the assumption of weak dependence. The resulting weighted estimator is shown to be consistent and asymptotically normal. Its small-sample properties are checked via Monte Carlo experiments. 
Keywords:  Z-estimators; M-estimators; GMM; Generalized Empirical Likelihood; blocking techniques; ?mixing. 
JEL:  C12 C14 C22 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:cns:cnscwp:201022&r=ecm 
By:  Ralph W Bailey; John T Addison 
Keywords:  nonparametric regression; Nadaraya-Watson; kernel density; conditional expectation estimator; conditional variance estimator; local polynomial estimator 
JEL:  C14 
Date:  2010–11 
URL:  http://d.repec.org/n?u=RePEc:bir:birmec:1030&r=ecm 
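The Nadaraya-Watson estimator named in the keywords admits a compact sketch; the Gaussian kernel and the fixed bandwidth `h` below are illustrative choices, not taken from the paper:

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Nadaraya-Watson kernel regression estimate of E[Y | X = x0]:
    a locally weighted average of y, with Gaussian kernel weights."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)  # Gaussian kernel, bandwidth h
    return np.sum(w * y) / np.sum(w)

x = np.linspace(0.0, 1.0, 11)
y = 2.0 * x  # noiseless linear relationship for illustration
print(nadaraya_watson(0.5, x, y, h=0.1))  # ≈ 1.0 by symmetry of the design
```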
By:  Juan Carlos Escanciano; Chuan Goh 
Abstract:  This paper introduces a broad family of tests for the hypothesis of linearity in parameters of functions that are identified by conditional quantile restrictions involving instrumental variables. These tests are tantamount to assessments of lack of fit for quantile regression models involving endogenous conditioning variables, and may be applied to assess the validity of post-estimation inferences regarding the counterfactual effect of endogenous treatments on the distribution of outcomes. We show that the use of an orthogonal projection on the tangent space of nuisance parameters at each quantile index improves power performance and facilitates the simulation of critical values via the application of simple multiplier-type bootstrap procedures. Monte Carlo evidence is included, along with an application to an empirical analysis of the structure of demand for a particular subsegment of the market for antibacterial drugs in India. 
Keywords:  Quantile regression, instrumental variables, structural models 
JEL:  C12 C31 C52 
Date:  2010–11–19 
URL:  http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa415&r=ecm 
By:  Andreea Borla (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales (EHESS) - CNRS : UMR6579); Costin Protopopescu (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales (EHESS) - CNRS : UMR6579) 
Abstract:  We propose an estimator for the fractional derivative of a distribution function. Our estimator, based on finite differences of the empirical distribution function, generalizes the estimator proposed by Maltz for the nonnegative real case. The asymptotic bias, the variance and the consistency of the estimator are studied. Finally, the optimal choice of the ''smoothing parameter'' shows that even in the fractional case, Stone's rate of convergence is achieved. 
Keywords:  fractional derivative; nonparametric estimation; distribution function; generalized differences 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:halshs00536979_v1&r=ecm 
By:  Qiankun Zhou (School of Economics, Singapore Management University); Jun Yu (School of Economics, Singapore Management University) 
Abstract:  The asymptotic distributions of the least squares estimator of the mean reversion parameter (κ) are developed in a general class of diffusion models under three sampling schemes, namely, long-span, infill, and the combination of long-span and infill. The models have an affine structure in the drift function, but allow for nonlinearity in the diffusion function. The limiting distributions are quite different under the alternative sampling schemes. In particular, the infill limiting distribution is nonstandard and depends on the initial condition and the time span, whereas the other two are Gaussian. Moreover, while the other two distributions are discontinuous at κ = 0, the infill distribution is continuous in κ. This property provides an answer to the Bayesian criticism of the unit root asymptotics. Monte Carlo simulations suggest that the infill asymptotic distribution provides a more accurate approximation to the finite sample distribution than the other two distributions in empirically realistic settings. The empirical application using the U.S. Federal funds rate highlights the difference in statistical inference based on the alternative asymptotic distributions and suggests strong evidence of a unit root in the data. 
Keywords:  Vasicek Model, One-factor Model, Mean Reversion, Infill Asymptotics, Long-span Asymptotics, Unit Root Test 
JEL:  C12 C22 G12 
Date:  2010–01 
URL:  http://d.repec.org/n?u=RePEc:siu:wpaper:202010&r=ecm 
By:  Zhenlin Yang (School of Economics, Singapore Management University) 
Abstract:  The biasedness issue arising from the maximum likelihood estimation of the spatial autoregressive (SAR) model is further investigated under a broader setup than that in Bao and Ullah (2007a). A major difficulty in analytically evaluating the expectations of ratios of quadratic forms is overcome by a simple bootstrap procedure. With that, the corrections on the bias and variance of the spatial estimator can easily be made up to third order, and once this is done, the estimators of the other model parameters become nearly unbiased. Compared with the analytical approach, the new approach is much simpler and can easily be extended to other models of a similar structure. Extensive Monte Carlo results show that the new approach performs excellently in general. 
Keywords:  Third-order bias; Third-order variance; Bootstrap; Concentrated estimating equation; Monte Carlo; Quasi-MLE; Spatial layout. 
JEL:  C10 C21 
Date:  2010–10 
URL:  http://d.repec.org/n?u=RePEc:siu:wpaper:122010&r=ecm 
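The paper's third-order corrections are model-specific, but the generic first-order bootstrap bias correction they build on can be sketched as follows; the `/n` variance estimator is just a convenient biased example, not the SAR setting:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_bias_correct(estimator, sample, n_boot=2000):
    """First-order bootstrap bias correction: estimate the bias of `estimator`
    by resampling, then subtract it (theta_bc = 2*theta_hat - mean of the
    bootstrap replicates)."""
    theta_hat = estimator(sample)
    boot = [estimator(rng.choice(sample, size=sample.size, replace=True))
            for _ in range(n_boot)]
    return 2.0 * theta_hat - np.mean(boot)

# The /n variance estimator is biased downward; the correction pushes it up.
sample = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
naive = np.var(sample)  # divides by n -> biased low
corrected = bootstrap_bias_correct(np.var, sample)
print(naive, corrected)
```

The paper iterates this idea to third order within a concentrated estimating equation; the sketch above only shows the basic mechanism.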
By:  Nikolaus Hautsch; Peter Malec; Melanie Schienle 
Abstract:  We propose a novel approach to modeling serially dependent positive-valued variables that realize a nontrivial proportion of zero outcomes. This is a typical phenomenon in financial time series observed at high frequencies, such as cumulated trading volumes or the time between potentially simultaneously occurring market events. We introduce a flexible point-mass mixture distribution and develop a semiparametric specification test explicitly tailored for such distributions. Moreover, we propose a new type of multiplicative error model (MEM) based on a zero-augmented distribution, which incorporates an autoregressive binary choice component and thus captures the (potentially different) dynamics of both zero occurrences and strictly positive realizations. Applying the proposed model to high-frequency cumulated trading volumes of liquid NYSE stocks, we show that the model captures both the dynamic and distributional properties of the data very well and is able to correctly predict future distributions. 
Keywords:  high-frequency data, point-mass mixture, multiplicative error model, excess zeros, semiparametric specification test, market microstructure 
JEL:  C22 C25 C14 C16 C51 
Date:  2010–11 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2010055&r=ecm 
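A zero-augmented distribution of the kind described can be illustrated with the simplest static case: a point mass at zero mixed with an exponential positive part. The exponential choice is an assumption for illustration only; the paper's point-mass mixture is more flexible and dynamic:

```python
import numpy as np

def fit_zero_augmented_exponential(x):
    """MLE for a point-mass mixture: P(X = 0) = pi, and X | X > 0 is
    Exponential(lam). The likelihood factorizes, so pi_hat is the share of
    zeros and lam_hat is 1 / mean of the strictly positive observations."""
    x = np.asarray(x, dtype=float)
    pi_hat = np.mean(x == 0)
    lam_hat = 1.0 / np.mean(x[x > 0])
    return pi_hat, lam_hat

# e.g. cumulated volumes with excess zeros
x = [0.0, 0.0, 1.0, 3.0, 2.0, 0.0, 2.0]
print(fit_zero_augmented_exponential(x))  # (3/7, 1/2)
```

The MEM in the paper makes both pi and the positive-part scale time-varying and autoregressive.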
By:  Jun Yu (School of Economics, Singapore Management University) 
Abstract:  This chapter overviews some recent advances in simulation-based methods for estimating financial time series models that are widely used in financial economics. The simulation-based methods have proven particularly useful when the likelihood function and moments do not have tractable forms, and hence the maximum likelihood (ML) method and the generalized method of moments (GMM) are difficult to use. They are also capable of improving the finite sample performance of the traditional methods. Both frequentist and Bayesian simulation-based methods are reviewed. Frequentist simulation-based methods cover various forms of simulated maximum likelihood (SML) methods, the simulated generalized method of moments (SGMM), the efficient method of moments (EMM), and the indirect inference (II) method. Bayesian simulation-based methods cover various MCMC algorithms. Each simulation-based method is discussed in the context of a specific financial time series model as a motivating example. Empirical applications, based on real exchange rates, interest rates and equity data, illustrate how the simulation-based methods are implemented. In particular, SML is applied to a discrete-time stochastic volatility model, EMM to a continuous-time stochastic volatility model, MCMC to a credit risk model, and the II method to a term structure model. 
Keywords:  Generalized method of moments, Maximum likelihood, MCMC, Indirect Inference, Credit risk, Stock price, Exchange rate, Interest rate. 
Date:  2010–10 
URL:  http://d.repec.org/n?u=RePEc:siu:wpaper:192010&r=ecm 
By:  Yong Li (Business School, Sun Yat-Sen University); Jun Yu (School of Economics, Singapore Management University) 
Abstract:  A new posterior odds analysis is proposed to test for a unit root in volatility dynamics in the context of stochastic volatility models. This analysis extends the Bayesian unit root test of So and Li (1999, Journal of Business & Economic Statistics) in two important ways. First, a numerically more stable algorithm is introduced to compute the Bayes factor, taking into account the special structure of the competing models. Owing to its numerical stability, the algorithm overcomes the problem of divergent “size” in the marginal likelihood approach. Second, to improve the “power” of the unit root test, a mixed prior specification with random weights is employed. It is shown that the posterior odds ratio is a by-product of Bayesian estimation and can be easily computed by MCMC methods. A simulation study examines the “size” and “power” performance of the new method. An empirical study, based on time series data covering the subprime crisis, reveals some interesting results. 
Keywords:  Bayes factor; Mixed Prior; Markov Chain Monte Carlo; Posterior odds ratio; Stochastic volatility models; Unit root testing. 
Date:  2010–10 
URL:  http://d.repec.org/n?u=RePEc:siu:wpaper:212010&r=ecm 
By:  Badi H. Baltagi (Department of Economics and Center for Policy Research, Syracuse University); Zhenlin Yang (School of Economics, Singapore Management University) 
Abstract:  The robustness of the LM tests for spatial error dependence of Burridge (1980) for the linear regression model and Anselin (1988) for the panel regression model is examined. While both tests are asymptotically robust against distributional misspecification, their finite sample behavior can be sensitive to the spatial layout. To overcome this shortcoming, standardized LM tests are suggested. Monte Carlo results show that the new tests possess good finite sample properties. An important observation made throughout this study is that the LM tests for spatial dependence need to be both mean- and variance-adjusted for good finite sample performance to be achieved. The former is, however, often neglected in the literature. 
Keywords:  Distributional misspecification; Group interaction; LM test; Moran’s I Test; Robustness; Spatial panel models. 
JEL:  C23 C5 
Date:  2010–09 
URL:  http://d.repec.org/n?u=RePEc:siu:wpaper:112010&r=ecm 
By:  Han Lin Shang 
Abstract:  This paper uses half-hourly electricity demand data from South Australia as an empirical study of nonparametric modeling and forecasting methods for prediction from half an hour ahead to one year ahead. A notable feature of the univariate time series of electricity demand is the presence of both intraweek and intraday seasonalities. An intraday seasonal cycle is apparent from the similarity of the demand from one day to the next, and an intraweek seasonal cycle is evident from comparing the demand on the corresponding day of adjacent weeks. There is a strong appeal in using forecasting methods that are able to capture both seasonalities. In this paper, the forecasting methods slice a seasonal univariate time series into a time series of curves. The forecasting methods reduce the dimensionality by applying functional principal component analysis to the observed data, and then utilize a univariate time series forecasting method and functional principal component regression techniques. When data points in the most recent curve are sequentially observed, updating methods can improve the point and interval forecast accuracy. We also revisit a nonparametric approach to constructing prediction intervals of updated forecasts, and evaluate the interval forecast accuracy. 
Keywords:  Functional principal component analysis; functional time series; multivariate time series; ordinary least squares; penalized least squares; ridge regression; seasonal time series 
JEL:  C88 C63 C14 C22 
Date:  2010–10–18 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201019&r=ecm 
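The dimension-reduction step described above, functional principal component analysis on a matrix of daily curves, can be sketched via the SVD. The data here are synthetic placeholders for half-hourly demand, not the South Australian series:

```python
import numpy as np

def functional_pca(curves, n_comp):
    """Functional PCA on a (days x gridpoints) matrix of discretized daily
    curves: center, take the SVD, keep the leading components. Returns the
    mean curve, the component loadings, and per-day scores."""
    mean = curves.mean(axis=0)
    centered = curves - mean
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    loadings = vt[:n_comp]           # principal component curves
    scores = centered @ loadings.T   # one score time series per component
    return mean, loadings, scores

rng = np.random.default_rng(1)
curves = rng.standard_normal((30, 48))  # 30 days of 48 half-hourly values
mean, loadings, scores = functional_pca(curves, n_comp=3)
recon = mean + scores @ loadings        # rank-3 reconstruction of every day
print(recon.shape)  # (30, 48)
```

Forecasting then reduces to univariate models for each score series, as the abstract describes.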
By:  Markus Eberhardt; Christian Helmers 
Abstract:  This paper surveys the most popular parametric and semiparametric estimators for Cobb-Douglas production functions arising from the econometric literature of the past two decades. We focus on the different approaches dealing with ‘transmission bias’ in firm-level studies, which arises from firms' reaction to unobservable productivity realisations when making input choices. The contribution of the paper is threefold: we provide applied economists with (i) an in-depth discussion of the estimation problem and the solutions suggested in the literature; (ii) a detailed empirical example using FAME data for UK high-tech firms, emphasising analytical tools to investigate data properties and the robustness of the empirical results; (iii) a powerful illustration of the impact of estimator choice on TFP estimates, using matched data on patents in ‘TFP regressions’. Our discussion concludes that while from a theoretical point of view the different estimators are conceptually very similar, in practice the choice of the preferred estimator is far from arbitrary and instead requires in-depth analysis of the data properties rather than blind belief in asymptotic consistency. 
Keywords:  Productivity, production function, UK firms, panel data estimates 
JEL:  D21 D24 L25 O23 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:513&r=ecm 
By:  Belyaev, Yuri (Centre of Biostochastics, SLU-Umeå); Kriström, Bengt (CERE, SLU-Umeå and Umeå University) 
Abstract:  We analyze an approach to quantitative information elicitation in surveys that includes many currently popular variants as special cases. Rather than asking the individual to state a point estimate or select between given brackets, the individual can self-select any interval of choice. We propose a new estimator for such interval-censored data. It can be viewed as an extension of Turnbull's (1976) estimator for interval-censored data. A detailed empirical example is provided, using a survey on the valuation of a public good. We estimate survival functions based on a Weibull and a mixed Weibull/exponential distribution, and prove that a consistent maximum likelihood estimator exists and that its accuracy can be consistently estimated by resampling methods in these two families of distributions. 
Keywords:  Interval data; Maximum Likelihood; Turnbull estimator; willingness-to-pay; quantitative elicitation 
JEL:  C25 
Date:  2010–02–15 
URL:  http://d.repec.org/n?u=RePEc:hhs:slucer:2010_002&r=ecm 
By:  Tore Selland Kleppe (Department of Mathematics, University of Bergen); Jun Yu (School of Economics, Singapore Management University); Hans J. Skaug (Department of Mathematics, University of Bergen) 
Abstract:  A new algorithm is developed to provide simulated maximum likelihood estimation of the GARCH diffusion model of Nelson (1990) based on return data only. The method combines two accurate approximation procedures, namely, the polynomial expansion of Aït-Sahalia (2008) to approximate the transition probability density of return and volatility, and the Efficient Importance Sampler (EIS) of Richard and Zhang (2007) to integrate out the volatility. The first- and second-order terms in the polynomial expansion are used to generate a baseline importance density for an EIS algorithm. The higher-order terms are included when evaluating the importance weights. Monte Carlo experiments show that the new method works well and that the discretization error is well controlled by the polynomial expansion. In the empirical application, we fit the GARCH diffusion to equity data, perform diagnostics on the model fit, and test the finiteness of the importance weights. 
Keywords:  Efficient importance sampling; GARCH diffusion model; Simulated maximum likelihood; Stochastic volatility 
JEL:  C11 C15 G12 
Date:  2010–01 
URL:  http://d.repec.org/n?u=RePEc:siu:wpaper:132010&r=ecm 
By:  Clément Bosquet (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales (EHESS) - CNRS : UMR6579); Hervé Boulhol (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I) 
Abstract:  Following Santos Silva and Tenreyro (2006), various studies have used the Poisson Pseudo-Maximum Likelihood estimator to estimate gravity specifications of trade flows and non-count data models more generally. Some papers also report results based on the Negative Binomial estimator, which is more general and encompasses the Poisson assumption as a special case. This note shows that the Negative Binomial estimator is inappropriate when applied to a continuous dependent variable whose unit of measurement is arbitrary, because the estimates artificially depend on that choice. 
Keywords:  pseudo-maximum likelihood methods; negative binomial estimator; Poisson regression; gamma PML 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:hal:cesptp:halshs00535594_v1&r=ecm 
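The note's point is easiest to see numerically for the Poisson PML, whose slope estimates are invariant to rescaling the dependent variable (only the intercept shifts). With a single binary regressor the estimator has a closed form; this demonstration is a sketch of the invariance property, not the note's derivation:

```python
import numpy as np

def poisson_pml_binary(y, d):
    """Poisson pseudo-ML for E[y | d] = exp(a + b*d) with a binary regressor d
    has a closed form: the fitted group means match the sample group means."""
    a = np.log(y[d == 0].mean())
    b = np.log(y[d == 1].mean()) - a
    return a, b

y = np.array([1.2, 0.7, 2.5, 3.1, 4.0, 2.2])
d = np.array([0, 0, 0, 1, 1, 1])
a1, b1 = poisson_pml_binary(y, d)
a2, b2 = poisson_pml_binary(1000.0 * y, d)  # change the units of y
# Slope unchanged; intercept shifts by log(1000):
print(np.isclose(b1, b2), np.isclose(a2 - a1, np.log(1000.0)))
```

The Negative Binomial pseudo-likelihood lacks this invariance, which is the note's criticism.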
By:  John Gibson (University of Waikato); Bonggeun Kim (Seoul National University); Susan Olivia (Monash University) 
Abstract:  Standard error corrections for clustered samples impose untested restrictions on spatial correlations. Our example shows that these restrictions are too conservative compared with a spatial error model that exploits information on the exact locations of observations, causing inference errors when cluster corrections are used. 
Keywords:  clustered samples; GPS; spatial correlation 
JEL:  C31 C81 
Date:  2011–08–18 
URL:  http://d.repec.org/n?u=RePEc:wai:econwp:10/07&r=ecm 
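The clustered standard errors being critiqued are the usual one-way sandwich estimator, which can be sketched as follows (synthetic data, no small-sample degrees-of-freedom correction):

```python
import numpy as np

def ols_cluster_se(y, X, cluster):
    """OLS with one-way cluster-robust (sandwich) standard errors:
    bread = (X'X)^-1, meat = sum over clusters g of (X_g' u_g)(X_g' u_g)'."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    u = y - X @ beta
    k = X.shape[1]
    meat = np.zeros((k, k))
    for g in np.unique(cluster):
        score = X[cluster == g].T @ u[cluster == g]  # cluster score vector
        meat += np.outer(score, score)
    V = XtX_inv @ meat @ XtX_inv
    return beta, np.sqrt(np.diag(V))

rng = np.random.default_rng(2)
n = 40
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = X @ np.array([1.0, 0.5]) + rng.standard_normal(n)
cluster = np.repeat(np.arange(8), 5)  # 8 clusters of 5 observations
beta, se = ols_cluster_se(y, X, cluster)
print(beta.shape, se.shape)  # (2,) (2,)
```

The paper's spatial error model replaces the block-diagonal correlation implicit in the meat with a structure estimated from the GPS locations of observations.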
By:  Kociecki, Andrzej 
Abstract:  The article presents the problem of identification in parametric models from an algebraic point of view. We argue that this is not just another perspective but the proper one. That is, using our approach we can see the very nature of the identification problem, which is slightly different from that suggested in the literature. In practice, it means that in many models we can unambiguously estimate parameters that have been thought to be unidentifiable. This is illustrated in the case of the Simultaneous Equations Model (SEM), where our analysis leads to the conclusion that the existing identification conditions, although correct, are based on the inappropriate premise that only the structural parameters that are in one-to-one correspondence with the reduced form parameters are identified. We show that this is not true. In fact, there are other structural parameters which are identified but cannot be uniquely recovered from the reduced form parameters. Although we apply our theory only to the SEM, it can be used in many standard econometric models. 
Keywords:  identification; group theory 
JEL:  C01 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:26820&r=ecm 
By:  Rapisarda, Grazia; Echeverry, David 
Abstract:  When estimating Loss Given Default (LGD) parameters using a workout approach, i.e. discounting cash flows over the workout period, the problem arises of how to take into account partial recoveries from incomplete workouts. The simplest approach would base LGD on complete recovery profiles only. Whilst simple, this approach may lead to data selection bias, which may be at the basis of regulatory guidance requiring an assessment of the relevance of incomplete workouts to LGD estimation. Despite its importance, few academic contributions have covered this topic. We add to this literature by developing a nonparametric estimator that, under certain distributional assumptions on the recovery profiles, aggregates complete and incomplete workout data to produce unbiased and more efficient estimates of mean LGD than those obtained from the estimator based on resolved cases only. Our estimator is appropriate in LGD estimation for wholesale portfolios, where the exposure-weighted LGD estimators available in the literature would not be applicable under Basel II regulatory guidance. 
Keywords:  Credit risk; bank loans; loss-given-default; LGD; incomplete observations; mortality curves 
JEL:  C14 G32 
Date:  2010–11–16 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:26797&r=ecm 
By:  Martin Huber; Giovanni Mellace 
Abstract:  In the presence of an endogenous treatment and a valid instrument, causal effects are (nonparametrically) point identified only for the subpopulation of compliers, given that the treatment is monotone in the instrument. Further populations of likely policy interest have been widely ignored in econometrics. Therefore, we use treatment monotonicity and/or stochastic dominance assumptions to derive sharp bounds on the average treatment effects for the treated population, the entire population, the compliers, the always-takers, and the never-takers. We also provide an application to labor market data and briefly discuss testable implications of the instrumental exclusion restriction and stochastic dominance. 
Keywords:  Instrument, noncompliance, principal stratification, nonparametric bounds 
JEL:  C14 C31 
Date:  2010–11 
URL:  http://d.repec.org/n?u=RePEc:usg:dp2010:201031&r=ecm 
By:  D M NACHANE 
Abstract:  The aim of this paper is to take stock of the important recent contributions to spectral analysis, especially as they apply to nonstationary processes. Nonstationary processes are particularly relevant in the empirical sciences, where most phenomena exhibit pronounced departures from stationarity. 
Keywords:  spectral analysis, nonstationarity, empirical sciences, time series 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ess:wpaper:id:3191&r=ecm 
By:  Liebl, Dominik 
Abstract:  Classical univariate and multivariate time series models have difficulty dealing with the high variability of hourly electricity spot prices. We propose instead to model the daily mean electricity supply functions using a dynamic factor model, and then to derive the hourly electricity spot prices by evaluating the estimated supply functions at the corresponding hourly values of demand for electricity. Supply functions are price (EUR/MWh) functions that increase monotonically with demand for electricity (MW). Apart from this new conceptual approach, which allows us to represent the auction design of energy exchanges in a most natural way, our main contribution is an extraordinarily simple algorithm to estimate the factor structure of the dynamic factor model. We decompose the time series into a functional spherical component and a univariate scaling component. The elements of the spherical component are all standardized to unit size, such that we can robustly estimate the factor structure. This algorithm is much simpler than procedures suggested in the literature. In order to use a parsimonious labeling, we refer to the daily mean supply curves simply as price curves. 
Keywords:  Factor Analysis; functional time series data; sparse data; electricity spot market prices; European Electricity Exchange (EEX) 
JEL:  C14 C22 C1 C01 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:26800&r=ecm 
By:  Elena Andreou; Eric Ghysels; Andros Kourtellos 
Abstract:  We introduce easy-to-implement regression-based methods for predicting quarterly real economic activity that use daily financial data and rely on forecast combinations of MIDAS regressions. Our analysis is designed to elucidate the value of daily information and provide real-time forecast updates of the current (nowcasting) and future quarters. Our findings show that while on average the predictive ability of all models worsens substantially following the financial crisis, the models we propose suffer relatively smaller losses than the traditional ones. Moreover, these predictive gains are primarily driven by the classes of government securities, equities, and especially corporate risk. 
Keywords:  MIDAS, macro forecasting, leads, daily financial information, daily factors. 
Date:  2010–11 
URL:  http://d.repec.org/n?u=RePEc:ucy:cypeua:92010&r=ecm 
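MIDAS regressions of the kind used above compress many high-frequency lags into a few parameters via a weighting scheme. A common choice, sketched below, is the exponential Almon lag polynomial; the parameter values are purely illustrative:

```python
import numpy as np

def exp_almon_weights(n_lags, theta1, theta2):
    """Exponential Almon lag polynomial used in MIDAS regressions:
    w_k proportional to exp(theta1*k + theta2*k^2), normalized to sum to one,
    so a long daily lag window is governed by just two parameters."""
    k = np.arange(1, n_lags + 1, dtype=float)
    raw = np.exp(theta1 * k + theta2 * k ** 2)
    return raw / raw.sum()

w = exp_almon_weights(60, theta1=0.05, theta2=-0.005)  # 60 daily lags
print(round(w.sum(), 6), bool(np.all(w > 0)))  # 1.0 True
```

The quarterly regression then uses the weighted sum of the daily predictor as a single regressor, and theta1, theta2 are estimated jointly with the slope.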
By:  Anne B. Koehler; Ralph D. Snyder; J. Keith Ord; Adrian Beaumont 
Abstract:  Compositional time series are formed from measurements of proportions that sum to one in each period of time. We might be interested in forecasting the proportion of home loans that have adjustable rates, the proportion of nonagricultural jobs in manufacturing, the proportion of a rock's geochemical composition that is a specific oxide, or the proportion of an election betting market choosing a particular candidate. A problem may involve many related time series of proportions. There could be several categories of nonagricultural jobs or several oxides in the geochemical composition of a rock that are of interest. In this paper we provide a statistical framework for forecasting these special kinds of time series. We build on the innovations state space framework underpinning the widely used methods of exponential smoothing. We couple this with a generalized logistic transformation to convert the measurements from the unit interval to the entire real line. The approach is illustrated with two applications: the proportion of new home loans in the U.S. that have adjustable rates; and four probabilities for specified candidates winning the 2008 Democratic presidential nomination. 
Keywords:  compositional time series, innovations state space models, exponential smoothing, forecasting proportions 
JEL:  C22 
Date:  2010–11 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201020&r=ecm 
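The paper uses a generalized logistic transformation; as a simpler stand-in, the standard additive log-ratio transform below illustrates how proportions can be mapped to the real line for unconstrained forecasting and mapped back afterwards:

```python
import numpy as np

def alr(p):
    """Additive log-ratio transform: maps K proportions (summing to one) to
    K-1 unconstrained reals, using the last category as the reference."""
    p = np.asarray(p, dtype=float)
    return np.log(p[:-1] / p[-1])

def alr_inverse(z):
    """Inverse transform: back from R^(K-1) to the unit simplex."""
    e = np.append(np.exp(z), 1.0)
    return e / e.sum()

p = np.array([0.5, 0.3, 0.2])          # shares of three categories
z = alr(p)                             # forecast on this unconstrained scale
print(np.allclose(alr_inverse(z), p))  # True (round trip)
```

Forecasts produced on the transformed scale, e.g. by exponential smoothing, are guaranteed after back-transformation to be positive and to sum to one.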
By:  Peter C.B. Phillips (Yale University); Jun Yu (School of Economics, Singapore Management University) 
Abstract:  An error is corrected in Yu and Phillips (2001) (Econometrics Journal, 4, 210-224), where a time transformation was used to induce Gaussian disturbances in the discrete time equivalent model. It is shown that the error process in this model is not a martingale and the Dambis-Dubins-Schwarz (DDS) theorem is not directly applicable. However, a detrended error process is a martingale, the DDS theorem is applicable, and the corresponding stopping time correctly induces Gaussianity. We show that the two stopping time sequences differ by O(a^2), where a is the prespecified normalized timing constant. 
Keywords:  Nonlinear Diffusion, Normalizing Transformation, Level Effect, DDS Theorem. 
Date:  2010–10 
URL:  http://d.repec.org/n?u=RePEc:siu:wpaper:182010&r=ecm 
By:  Faust, Jon; Gupta, Abhishek 
Abstract:  In this paper, we develop and apply tools to evaluate the strengths and weaknesses of dynamic stochastic general equilibrium (DSGE) models. In particular, this paper makes three contributions: one, it argues the need for such tools to evaluate the usefulness of these models; two, it defines these tools, which take the form of prior and particularly posterior predictive analysis, and provides illustrations; and three, it provides a justification for the use of these tools in the DSGE context, in defense against the standard criticisms of their use. 
Keywords:  Prior and posterior predictive analysis; DSGE Model Evaluation; Monetary Policy. 
JEL:  C52 E1 C11 
Date:  2010–10–30 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:26721&r=ecm 
By:  Tödter, Karl-Heinz 
Abstract:  The carry-over effect is the advance contribution of the old year to growth in the new year. Among practitioners, the informative content of the carry-over effect for short-term forecasting is undisputed, and it is used routinely in economic forecasting. In this paper, the carry-over effect is analysed 'statistically' and it is shown how it reduces the uncertainty of short-term economic forecasts. This is followed by an empirical analysis of the carry-over effect using simple forecast models as well as Bundesbank and Consensus projections. 
Keywords:  forecast uncertainty, growth rates, carry-over effect, variance contribution, Chebyshev density 
JEL:  C53 E37 C16 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:zbw:bubdp1:201021&r=ecm 
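The statistical carry-over effect has a simple closed form: the old year's final-quarter level relative to its annual average, minus one. This is the standard textbook definition, sketched here with made-up quarterly levels rather than the paper's data:

```python
import numpy as np

def carryover_effect(quarterly_levels_old_year):
    """Statistical carry-over: the growth rate the new year would record if
    activity stayed flat at the old year's final-quarter level, i.e. the Q4
    level relative to the old year's average, minus one."""
    q = np.asarray(quarterly_levels_old_year, dtype=float)
    return q[-1] / q.mean() - 1.0

# GDP flat for three quarters, then a strong Q4:
print(round(carryover_effect([100.0, 100.0, 100.0, 104.0]), 4))  # 0.0297
```

A strong final quarter thus guarantees part of the new year's annual growth before the year even begins, which is why the effect reduces short-term forecast uncertainty.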