
on Econometrics 
By:  Juan José Dolado; Jesús Gonzalo; Laura Mayoral 
Abstract:  This paper presents an overview of some new results regarding an easily implementable Wald test statistic (EFDF test) of the null hypothesis that a time-series process is I(1) or I(0) against fractional I(d) alternatives, with d ∈ (0,1), allowing for unknown deterministic components and serial correlation in the error term. Specifically, we argue that the EFDF test has better power properties under fixed alternatives than other available tests for fractional roots, and we analyze how to implement this test when the deterministic components or the long-memory parameter are subject to structural breaks. 
Keywords:  Fractional processes, Deterministic components, Power, Structural breaks 
JEL:  C12 C22 
Date:  2008–01 
URL:  http://d.repec.org/n?u=RePEc:cte:werepe:we20080129&r=ecm 
By:  Eduardo Mendes; Les Oxley (University of Canterbury); William Rea; Marco Reale 
Abstract:  It is now recognised that long memory and structural change can be confused, because the statistical properties of time series of the lengths typical in finance and econometrics are similar under both models. We propose a new set of methods aimed at distinguishing between long memory and structural change. The approach, which utilises computationally efficient methods based upon Atheoretical Regression Trees (ART), establishes through simulation the bivariate distribution of the fractional integration parameter, d, with regime length for simulated fractionally integrated series. This bivariate distribution is then compared with that of the data. We also combine ART with the established goodness-of-fit test for long memory series due to Beran. We apply these methods to the realized volatility series of 16 stocks in the Dow Jones Industrial Average and show that in these series the value of the fractional integration parameter is not constant over time. The mathematical consequence is that the definition of H self-similarity is violated. We present evidence that these series have structural breaks. 
Keywords:  Long-range dependence; Strong dependence; Global dependence; Hurst phenomena 
JEL:  C22 
Date:  2008–01–29 
URL:  http://d.repec.org/n?u=RePEc:cbt:econwp:08/04&r=ecm 
By:  Beatrice Pataracchia 
Abstract:  In this paper we propose a method to derive the spectral representation of a particular class of nonlinear models: Markov Switching ARMA models. The procedure simply relies on the Riesz-Fisher Theorem, which characterizes the spectral density as the Fourier transform of the autocovariance function. We explicitly derive the analytical structure of the spectral density in the simple Markov Switching AR(1) case. Finally, a monetary policy application of a Markov Switching VAR(4) is presented. 
Keywords:  Multivariate ARMA models; Regime-switching models; Markov switching models; Frequency domain 
JEL:  C32 C44 E52 
Date:  2008–03 
URL:  http://d.repec.org/n?u=RePEc:usi:wpaper:528&r=ecm 
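The Riesz-Fisher route described in the abstract (the spectral density as the Fourier transform of the autocovariance function) can be illustrated numerically. The sketch below applies the idea to an ordinary AR(1) rather than the paper's Markov-switching case, so that the truncated Fourier sum can be checked against the known closed form; all function names are illustrative.

```python
import numpy as np

def ar1_autocov(phi, sigma2, max_lag):
    """Autocovariances gamma(0..max_lag) of a stationary AR(1)."""
    gamma0 = sigma2 / (1.0 - phi**2)
    return gamma0 * phi ** np.arange(max_lag + 1)

def spectral_density_from_autocov(gamma, freqs):
    """Riesz-Fisher: f(w) = (1/2pi) sum_k gamma(|k|) e^{-iwk},
    using symmetry and truncating at the available lags."""
    lags = np.arange(1, len(gamma))
    f = gamma[0] + 2.0 * np.sum(
        gamma[1:, None] * np.cos(np.outer(lags, freqs)), axis=0)
    return f / (2.0 * np.pi)

def ar1_spectral_density(phi, sigma2, freqs):
    """Closed-form AR(1) spectral density for comparison."""
    return sigma2 / (2.0 * np.pi * np.abs(1.0 - phi * np.exp(-1j * freqs))**2)

phi, sigma2 = 0.6, 1.0
freqs = np.linspace(0.0, np.pi, 64)
gamma = ar1_autocov(phi, sigma2, max_lag=200)
f_num = spectral_density_from_autocov(gamma, freqs)
f_exact = ar1_spectral_density(phi, sigma2, freqs)
```

For the Markov-switching case treated in the paper, only the autocovariance function changes; the Fourier-transform step is the same.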
By:  Segers, J.J.J.; Akker, R. van den; Werker, B.J.M. (Tilburg University, Center for Economic Research) 
Abstract:  At the heart of the copula methodology in statistics is the idea of separating marginal distributions from the dependence structure. However, as shown in this paper, this separation is not to be taken for granted: in the model where the copula is known and the marginal distributions are completely unknown, the empirical distribution functions are semiparametrically efficient if and only if the copula is the independence copula. Incorporating the knowledge of the copula into a nonparametric likelihood yields an estimation procedure which simulations show to outperform the empirical distribution functions, the amount of improvement depending on the copula. Although the known-copula model is arguably artificial, it provides an instructive stepping stone to the more general model of a parametrically specified copula and arbitrary margins. 
Keywords:  independence copula; nonparametric maximum likelihood estimator; score function; semiparametric efficiency; tangent space 
JEL:  C14 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200840&r=ecm 
By:  Lennart Hoogerheide (Erasmus University Rotterdam); Herman K. van Dijk (Erasmus University Rotterdam) 
Abstract:  Highly non-elliptical posterior distributions may occur in several econometric models, in particular when the likelihood information is allowed to dominate and data information is weak. We explain the issue of highly non-elliptical posteriors in a model for the effect of education on income, using data from the well-known Angrist and Krueger (1991) study, and discuss how a so-called Information Matrix or Jeffreys' prior may be used as a `regularization prior' that, in combination with the likelihood, yields posteriors with desirable properties. We further consider an 8-dimensional bimodal posterior distribution in a 2-regime mixture model for real US GNP growth. In order to perform a Bayesian posterior analysis using indirect sampling methods in these models, one has to find a good candidate density. In a recent paper, Hoogerheide, Kaashoek and Van Dijk (2007) introduced a class of neural network functions as candidate densities in the case of non-elliptical posteriors. In the present paper, the connection between canonical model structures, non-elliptical credible sets, and more sophisticated neural network simulation techniques is explored. In all examples considered in this paper (a bimodal distribution of Gelman and Meng (1991) and posteriors in IV and mixture models), the mixture of Student's t distributions is clearly a much better candidate than a single Student's t candidate, yielding far more precise estimates of posterior means after the same amount of computing time, whereas the Student's t candidate almost completely misses substantial parts of the parameter space. 
Keywords:  instrumental variables; vector error correction model; mixture model; importance sampling; Markov chain Monte Carlo; neural network 
JEL:  C11 C15 C45 
Date:  2008–04–08 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20080036&r=ecm 
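Why a mixture-of-Student's-t candidate helps with bimodal targets can be seen in a toy importance-sampling exercise. The sketch below uses a one-dimensional bimodal target (a stand-in, not the paper's models) and a two-component t candidate placed near the modes; a single t candidate centered between the modes would waste most of its draws. All names and parameter values are illustrative.

```python
import math
import numpy as np

rng = np.random.default_rng(4)

def t_pdf(x, df, loc, scale):
    """Student-t density with location and scale."""
    z = (x - loc) / scale
    c = math.gamma((df + 1) / 2) / (
        math.gamma(df / 2) * math.sqrt(df * math.pi) * scale)
    return c * (1.0 + z**2 / df) ** (-(df + 1) / 2)

def target(x):
    """Unnormalized bimodal 'posterior': two well-separated normal bumps."""
    return np.exp(-0.5 * (x - 3.0)**2) + np.exp(-0.5 * (x + 3.0)**2)

def candidate_pdf(x):
    """Mixture of two Student-t densities located near the modes."""
    return 0.5 * t_pdf(x, 5, -3.0, 1.2) + 0.5 * t_pdf(x, 5, 3.0, 1.2)

def candidate_draws(n):
    comp = rng.integers(0, 2, n)
    locs = np.where(comp == 0, -3.0, 3.0)
    return locs + 1.2 * rng.standard_t(5, n)

n = 50_000
x = candidate_draws(n)
w = target(x) / candidate_pdf(x)              # importance weights
posterior_mean = np.sum(w * x) / np.sum(w)    # near 0 by symmetry
ess = np.sum(w)**2 / np.sum(w**2)             # effective sample size
```

Because the candidate covers both modes with heavy tails, the weights stay bounded and the effective sample size remains a large fraction of n.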
By:  Canova, Fabio 
Abstract:  This chapter highlights the problems that structural methods and SVAR approaches have when estimating DSGE models and examining their ability to capture important features of the data. We show that structural methods are subject to severe identification problems due, in large part, to the nature of DSGE models. The problems can be patched up in a number of ways, but solved only if DSGEs are completely reparametrized or respecified. The potential misspecification of the structural relationships gives Bayesian methods an edge over classical ones in structural estimation. SVAR approaches may face invertibility problems, but simple diagnostics can help to detect and remedy these problems. A pragmatic empirical approach ought to use the flexibility of SVARs against potential misspecification of the structural relationships, but must firmly tie SVARs to the class of DSGE models which could have generated the data. 
Keywords:  DSGE models; Identification; Invertibility; SVAR models 
JEL:  C10 C52 E32 E50 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:6791&r=ecm 
By:  Mohamed Boutahar (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - École des Hautes Études en Sciences Sociales - CNRS : UMR6579); Gilles Dufrénot (GREQAM); Anne Péguin-Feissolle (GREQAM) 
Abstract:  This paper generalizes standard long memory modelling by assuming that the long memory parameter d is stochastic and time-varying: we introduce a STAR process on this parameter, characterized by a logistic function, and propose an estimation method for this model. Some simulation experiments are conducted. The empirical results suggest that this new model offers an interesting competing framework for describing the persistent dynamics of some financial series. 
Keywords:  Long memory, Logistic function, STAR 
Date:  2008–04–23 
URL:  http://d.repec.org/n?u=RePEc:hal:papers:halshs00275254_v1&r=ecm 
By:  Barnett, William A.; Serletis, Apostolos 
Abstract:  This paper is an up-to-date survey of the state of the art in consumer demand modelling. We review and evaluate advances in a number of related areas, including different approaches to empirical demand analysis, such as the differential approach, the locally flexible functional forms approach, the semi-nonparametric approach, and a nonparametric approach. We also address estimation issues, including sampling-theoretic and Bayesian estimation methods, and discuss the limitations of the currently common approaches. Finally, we highlight the challenge of achieving economic regularity, for consistency with the assumptions of the underlying neoclassical economic theory, as well as econometric regularity, when variables are nonstationary. 
Keywords:  Representative consumer; Engel curves; rank; flexible functional forms; parametric tests; nonparametric tests; theoretical regularity 
JEL:  C14 C50 C30 C11 
Date:  2008–04–22 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:8413&r=ecm 
By:  Laurent Ferrara (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I; DGEI-DAMEP - Banque de France); Thomas Raffinot (CPR Asset Management) 
Abstract:  Nonparametric methods have empirically proved to be of great interest in the statistical literature for forecasting stationary time series, but very few applications have been proposed in the econometrics literature. In this paper, our aim is to test whether nonparametric statistical procedures based on a kernel method can improve on classical linear models for nowcasting the euro area manufacturing industrial production index (IPI), using business surveys released by the European Commission. Moreover, we use a bootstrap methodology to estimate confidence intervals for the nowcasts. 
Keywords:  Nonparametric, Kernel, nowcasting, bootstrap, Euro area IPI. 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:hal:papers:halshs00275769_v1&r=ecm 
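A minimal sketch of the two ingredients the abstract combines: a Nadaraya-Watson kernel regression for the nowcast and a residual bootstrap for its confidence interval. The data here are synthetic stand-ins for a survey balance and IPI growth (the paper's actual series and bandwidth choices are not reproduced), and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def nw_kernel_regression(x_train, y_train, x0, h):
    """Nadaraya-Watson estimate of E[y | x = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_train - x0) / h) ** 2)
    return np.sum(w * y_train) / np.sum(w)

# Synthetic stand-ins: x = survey balance, y = IPI growth.
n = 200
x = rng.normal(size=n)
y = np.sin(x) + 0.2 * rng.normal(size=n)

x0, h = 0.5, 0.3                      # nowcast point and bandwidth
point = nw_kernel_regression(x, y, x0, h)

# Residual bootstrap: resample residuals around the kernel fit,
# re-estimate the nowcast, and take percentiles.
fitted = np.array([nw_kernel_regression(x, y, xi, h) for xi in x])
resid = y - fitted
boot = []
for _ in range(500):
    y_star = fitted + rng.choice(resid, size=n, replace=True)
    boot.append(nw_kernel_regression(x, y_star, x0, h))
lo, hi = np.percentile(boot, [5, 95])
```

In practice the bandwidth h would be chosen by cross-validation rather than fixed.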
By:  Clive G. Bowsher; Roland Meeks 
Abstract:  The class of Functional Signal plus Noise (FSN) models is introduced, providing a new, general method for modelling and forecasting time series of economic functions. The underlying, continuous economic function (or "signal") is a natural cubic spline whose dynamic evolution is driven by a cointegrated vector autoregression for the ordinates (or "y-values") at the knots of the spline. The natural cubic spline provides flexible cross-sectional fit and results in a linear, state space model. This FSN model achieves dimension reduction, provides a coherent description of the observed yield curve and its dynamics as the cross-sectional dimension N becomes large, and can feasibly be estimated and used for forecasting when N is large. The integration and cointegration properties of the model are derived. The FSN models are then applied to forecasting 36-dimensional yield curves for US Treasury bonds at the one-month-ahead horizon. The method consistently outperforms the Diebold and Li (2006) and random walk forecasts on the basis of both mean square forecast error criteria and economically relevant loss functions derived from the realised profits of pairs trading algorithms. The analysis also highlights, in a concrete setting, the dangers of attempting to infer the relative economic value of model forecasts from their associated mean square forecast errors. 
Keywords:  Time-series analysis; Forecasting; Mathematical models; Macroeconomics - Econometric models 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:fip:feddwp:0804&r=ecm 
By:  Daniel Millimet (Southern Methodist University); Rusty Tchernis (Indiana University Bloomington) 
Abstract:  We characterize the bias of propensity-score-based estimators of common average treatment effect parameters in the case of selection on unobservables. We then propose a new minimum-biased estimator of the average treatment effect. We assess the finite sample performance of our estimator using simulated data, as well as a timely application examining the causal effect of the School Breakfast Program on childhood obesity. We find our new estimator to be quite advantageous in many situations, even when selection is only on observables. 
Keywords:  Treatment Effects, Propensity Score, Bias, Unconfoundedness, Selection on Unobservables 
JEL:  C21 C52 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:inu:caeprp:2008008&r=ecm 
By:  Viktor Winschel; Markus Krätzig 
Abstract:  We present an object-oriented software framework that allows users to specify, solve, and estimate nonlinear dynamic stochastic general equilibrium (DSGE) models. The implemented solution methods for finding the unknown policy function are the standard linearization around the deterministic steady state, and a function iterator using a multivariate global Chebyshev polynomial approximation with the Smolyak operator to overcome the curse of dimensionality. The operator is also useful for numerical integration, and we use it for the integrals arising in rational expectations and in nonlinear state space filters. The estimation step is done by a parallel Metropolis-Hastings (MH) algorithm, using a linear or nonlinear filter. Implemented are the Kalman, Extended Kalman, Particle, Smolyak Kalman, Smolyak Sum, and Smolyak Kalman Particle filters. The MH sampling step can be interactively monitored and controlled by sequence and statistics plots. The number of parallel threads can be adjusted to benefit from multiprocessor environments. JBendge is based on the framework JStatCom, which provides a standardized application interface. All tasks are supported by an elaborate multithreaded graphical user interface (GUI) with project management and data handling facilities. 
Keywords:  Dynamic Stochastic General Equilibrium (DSGE) Models, Bayesian Time Series Econometrics, Java, Software Development 
JEL:  C11 C13 C15 C32 C52 C63 C68 C87 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2008034&r=ecm 
By:  Lanne, Markku; Saikkonen, Pentti 
Abstract:  This paper is concerned with univariate noncausal autoregressive models and their potential usefulness in economic applications. We argue that noncausal autoregressive models are especially well suited for modeling expectations. Unlike conventional causal autoregressive models, they explicitly show how the considered economic variable is affected by expectations and how expectations are formed. Noncausal autoregressive models can also be used to examine the related issue of backward-looking or forward-looking dynamics of an economic variable. We show in the paper how the parameters of a noncausal autoregressive model can be estimated by the method of maximum likelihood and how related test procedures can be obtained. Because noncausal autoregressive models cannot be distinguished from conventional causal autoregressive models by second-order properties or Gaussian likelihood, a detailed discussion of their specification is provided. Motivated by economic applications, we explicitly use a forward-looking autoregressive polynomial in the formulation of the model. This is different from the practice used in the previous statistics literature on noncausal autoregressions and, in addition to its economic motivation, it is also convenient from a statistical point of view. In particular, it facilitates obtaining likelihood-based diagnostic tests for the specified orders of the backward-looking and forward-looking autoregressive polynomials. Such test procedures are useful not only in the specification of the model but also in testing economically interesting hypotheses, such as whether the considered variable exhibits only forward-looking behavior. As an empirical application, we model U.S. inflation dynamics, which, according to our results, are purely forward-looking. 
Keywords:  Noncausal autoregression; expectations; inflation persistence 
JEL:  C52 E31 C22 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:8411&r=ecm 
By:  Hahn, Jinyong; Hirano, Keisuke; Karlan, Dean 
Abstract:  Many social experiments are run in multiple waves, or are replications of earlier social experiments. In principle, the sampling design can be modified in later stages or replications to allow for more efficient estimation of causal effects. We consider the design of a twostage experiment for estimating an average treatment effect, when covariate information is available for experimental subjects. We use data from the first stage to choose a conditional treatment assignment rule for units in the second stage of the experiment. This amounts to choosing the propensity score, the conditional probability of treatment given covariates. We propose to select the propensity score to minimize the asymptotic variance bound for estimating the average treatment effect. Our procedure can be implemented simply using standard statistical software and has attractive largesample properties. 
JEL:  C90 C42 C93 C01 
Date:  2008–04–15 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:8315&r=ecm 
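To give intuition for minimizing the asymptotic variance bound over the propensity score: the pointwise contribution of e(x) to the bound for the ATE is s1(x)^2/e(x) + s0(x)^2/(1-e(x)), whose minimizer is the Neyman allocation e*(x) = s1/(s1+s0). This is an illustrative simplification with known within-arm standard deviations, not the paper's full two-stage procedure (which estimates these quantities from first-stage data); function names are hypothetical.

```python
import numpy as np

def variance_bound_term(e, s1, s0):
    """Pointwise contribution of the propensity score e to the asymptotic
    variance bound for the ATE: s1^2/e + s0^2/(1-e)."""
    return s1**2 / e + s0**2 / (1.0 - e)

def optimal_propensity(s1, s0):
    """Neyman-allocation minimiser of the pointwise variance term:
    treat more often where treated outcomes are noisier."""
    return s1 / (s1 + s0)

s1, s0 = 2.0, 1.0
e_star = optimal_propensity(s1, s0)            # 2/3 here
grid = np.linspace(0.01, 0.99, 981)
values = variance_bound_term(grid, s1, s0)
assert abs(grid[np.argmin(values)] - e_star) < 0.01
```

Setting the derivative -s1^2/e^2 + s0^2/(1-e)^2 to zero gives s1/e = s0/(1-e), hence e* = s1/(s1+s0), which the grid search above confirms.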
By:  Erik Hjalmarsson 
Abstract:  This paper analyzes the asymptotic properties of long-horizon estimators under both the null hypothesis and an alternative of predictability. Asymptotically, under the null of no predictability, the long-run estimator is an increasing deterministic function of the short-run estimate and the forecasting horizon. Under the alternative of predictability, the conditional distribution of the long-run estimator, given the short-run estimate, is no longer degenerate, and the expected pattern of coefficient estimates across horizons differs from that under the null. Importantly, however, under the alternative, highly endogenous regressors, such as the dividend-price ratio, tend to deviate much less than exogenous regressors, such as the short interest rate, from the pattern expected under the null, making it more difficult to distinguish between the null and the alternative. 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgif:928&r=ecm 
By:  Jonas Dovern (The Kiel Institute for the World Economy (IfW)); Ulrich Fritsche (Department for Economics and Politics, University of Hamburg, and DIW Berlin) 
Abstract:  A couple of recent papers have shifted the focus towards disagreement among professional forecasters. When dealing with survey data that is sampled at a higher-than-annual frequency and includes only fixed-event forecasts (e.g. expectations of average annual growth rates), measures of disagreement across forecasters are naturally distorted by a component that mainly reflects the time-varying forecast horizon. We use data from the Survey of Professional Forecasters, which reports both fixed-event and fixed-horizon forecasts, to evaluate different methods for extracting the "fundamental" component of disagreement. Based on the paper's results, we suggest two methods to estimate dispersion measures from panels of fixed-event forecasts: a moving average transformation of the underlying forecasts, and estimation with constant forecast-horizon effects. Both models are easy to handle and deliver equally well-performing results, which show a surprisingly high correlation (up to 0.94) with the true dispersion. 
Keywords:  survey data, dispersion, disagreement, fixed-event forecasts 
JEL:  C22 C32 E37 
Date:  2008–05 
URL:  http://d.repec.org/n?u=RePEc:hep:macppr:200801&r=ecm 
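The moving-average transformation the abstract mentions is commonly approximated by weighting the current- and next-year fixed-event forecasts by the number of months each calendar year contributes to a 12-month horizon. The sketch below illustrates that weighting on synthetic forecaster panels; the exact weighting scheme and data are assumptions for illustration, not the paper's specification.

```python
import numpy as np

def fixed_horizon_approx(f_current, f_next, month):
    """Approximate a 12-month-ahead fixed-horizon forecast from two
    fixed-event forecasts (current- and next-calendar-year averages),
    weighting by the months each year contributes to the horizon."""
    w = (12 - month) / 12.0            # months left in the current year
    return w * f_current + (1.0 - w) * f_next

# In January the current-year forecast dominates the weighted average.
assert abs(fixed_horizon_approx(2.0, 3.0, month=1) - (2.0 * 11 + 3.0) / 12) < 1e-9

# Synthetic panel: 20 forecasters, 12 survey months, two fixed events.
rng = np.random.default_rng(1)
n_forecasters = 20
months = np.arange(1, 13)
f_cur = 2.0 + 0.3 * rng.normal(size=(12, n_forecasters))
f_nxt = 2.5 + 0.3 * rng.normal(size=(12, n_forecasters))
fh = np.array([fixed_horizon_approx(f_cur[m - 1], f_nxt[m - 1], m)
               for m in months])
dispersion = fh.std(axis=1)  # cross-forecaster dispersion, horizon effect removed
```

After the transformation, dispersion across forecasters can be compared across survey months without the mechanical shrinkage that a shortening fixed-event horizon induces.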
By:  Prof D.S.G. Pollock 
Abstract:  This paper shows how a frequency-selective filter that is applicable to short trended data sequences can be implemented via a frequency-domain approach. A filtered sequence can be obtained by multiplying the Fourier ordinates of the data by the ordinates of the frequency response of the filter and by applying the inverse Fourier transform to carry the product back into the time domain. Using this technique, it is possible, within the constraints of a finite sample, to design an ideal frequency-selective filter that will preserve all elements within a specified range of frequencies and that will remove all elements outside it. Approximations to ideal filters that are implemented in the time domain are commonly based on truncated versions of the infinite sequences of coefficients derived from the Fourier transforms of rectangular frequency response functions. An alternative to truncating an infinite sequence of coefficients is to wrap it around a circle of a circumference equal in length to the data sequence and to add the overlying coefficients. The coefficients of the wrapped filter can also be obtained by applying a discrete Fourier transform to a set of ordinates sampled from the frequency response function. Applying the coefficients to the data via circular convolution produces results that are identical to those obtained by multiplication in the frequency domain, which constitutes a more efficient approach. 
Keywords:  Linear filtering; Frequency-domain analysis 
JEL:  C22 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:lec:leecon:08/13&r=ecm 
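The equivalence the abstract asserts (multiplication of Fourier ordinates in the frequency domain versus circular convolution with the wrapped coefficients) is the circular convolution theorem, and can be verified directly. This sketch applies an ideal lowpass filter both ways; it omits the detrending the paper handles for trended data, and all names are illustrative.

```python
import numpy as np

def ideal_lowpass_fft(x, cutoff):
    """Zero all Fourier ordinates with |frequency| > cutoff (radians)."""
    n = len(x)
    freqs = 2.0 * np.pi * np.fft.fftfreq(n)
    response = (np.abs(freqs) <= cutoff).astype(float)
    return np.real(np.fft.ifft(np.fft.fft(x) * response)), response

def wrapped_filter_coeffs(response):
    """Wrapped time-domain coefficients: inverse DFT of the sampled
    frequency response."""
    return np.real(np.fft.ifft(response))

def circular_convolve(x, psi):
    """Circular convolution of the data with the wrapped coefficients."""
    n = len(x)
    return np.array([np.sum(psi * x[(k - np.arange(n)) % n]) for k in range(n)])

rng = np.random.default_rng(2)
x = np.cumsum(rng.normal(size=64))            # a short trended sequence
y_fft, response = ideal_lowpass_fft(x, cutoff=np.pi / 4)
psi = wrapped_filter_coeffs(response)
y_conv = circular_convolve(x, psi)            # identical up to rounding
```

The FFT route costs O(n log n) against O(n^2) for explicit circular convolution, which is the efficiency point made in the abstract.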
By:  Ibrahim Ahamada (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I; Paris School of Economics); Philippe Jolivaldt (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I; Paris School of Economics) 
Abstract:  A unit root test based on wavelet theory was recently proposed (Gençay and Fan, 2007). While the new test is supposed to be robust to the initial value, we bring out, by contrast, the significant effects of the initial value on its size and power. We also find that the wavelet unit root test and the ADF test are equally efficient once the data are corrected for the initial value. Our approach is based on Monte Carlo experiments. 
Keywords:  Unit root tests, wavelets, Monte Carlo experiments, size-power curve. 
Date:  2008–03 
URL:  http://d.repec.org/n?u=RePEc:hal:papers:halshs00275767_v1&r=ecm 
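The kind of Monte Carlo size experiment the abstract describes can be sketched with a simple Dickey-Fuller statistic (not the wavelet test itself): simulate random walks with different initial values and record how often the 5% test rejects. The no-constant regression and the asymptotic critical value used here are illustrative simplifications.

```python
import numpy as np

def df_tstat(y):
    """Dickey-Fuller t-statistic from the no-constant regression
    dy_t = rho * y_{t-1} + e_t."""
    dy, ylag = np.diff(y), y[:-1]
    rho = np.sum(ylag * dy) / np.sum(ylag**2)
    resid = dy - rho * ylag
    s2 = np.sum(resid**2) / (len(dy) - 1)
    return rho / np.sqrt(s2 / np.sum(ylag**2))

def rejection_rate(y0, n=100, reps=2000, crit=-1.95, seed=0):
    """Monte Carlo size of the 5% DF test (asymptotic no-constant
    critical value) when the random walk starts at y0 instead of 0."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        y = y0 + np.cumsum(rng.normal(size=n))
        if df_tstat(y) < crit:
            rejections += 1
    return rejections / reps

r_zero = rejection_rate(0.0)     # close to the nominal 5%
r_large = rejection_rate(100.0)  # distorted by the large initial value
```

A large initial value makes the lagged level behave almost like a deterministic regressor, pulling the statistic toward a standard normal and distorting the size, which is the sensitivity the paper documents.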
By:  Mehrotra, Aaron (BOFIT); Sánchez-Fung, José R. (BOFIT) 
Abstract:  This paper forecasts inflation in China over a 12-month horizon. The analysis runs 15 alternative models and finds that only those incorporating many predictors via principal components display better forecasting performance than the univariate benchmark. 
Keywords:  inflation forecasting; datarich environment; principal components; China 
JEL:  C53 E31 
Date:  2008–04–21 
URL:  http://d.repec.org/n?u=RePEc:hhs:bofitp:2008_002&r=ecm 
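A data-rich forecast of the kind the abstract compares can be sketched as: extract the first principal component from a standardized predictor panel, then run a direct h-step regression of inflation on its own lag and the component. The data below are synthetic (one common factor drives everything) and the setup is a simplification of a factor-augmented forecasting model, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(3)
T, N, h = 200, 30, 12

# Synthetic panel: an AR(1) common factor drives predictors and inflation.
factor = np.zeros(T)
for t in range(1, T):
    factor[t] = 0.8 * factor[t - 1] + rng.normal()
loadings = rng.uniform(0.5, 1.5, N)
X = np.outer(factor, loadings) + rng.normal(size=(T, N))
infl = 0.5 * factor + rng.normal(size=T)

# First principal component of the standardized panel via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
pc1 = np.linalg.svd(Z, full_matrices=False)[0][:, 0]

def direct_forecast(y, regressors, h):
    """Direct h-step forecast: regress y_{t+h} on regressors at t,
    then predict from the last observation."""
    R = np.column_stack([np.ones(len(y) - h)] + [r[:-h] for r in regressors])
    beta = np.linalg.lstsq(R, y[h:], rcond=None)[0]
    return np.concatenate(([1.0], [r[-1] for r in regressors])) @ beta

fc_factor = direct_forecast(infl, [infl, pc1], h)  # factor-augmented model
fc_ar = direct_forecast(infl, [infl], h)           # univariate benchmark
```

Comparing the two forecasts over a rolling out-of-sample window, rather than a single draw, is what the paper's 15-model horse race amounts to.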
By:  Mercereau, Benoît; Miniane, Jacques Alain 
Abstract:  The present value model of the current account has been very popular, as it provides an optimal benchmark to which actual current account series have often been compared. We show why persistence in observed current account data makes the estimated optimal series very sensitive to smallsample estimation error, making it close to impossible to determine whether the paths of the two series truly bear any relation to each other. Moreover, the standard Wald test of the model will falsely accept or reject the model with substantial probability. Monte Carlo simulations and estimations using annual and quarterly data from five OECD countries strongly support our predictions. In particular, we conclude that two important consensus results in the literature – that the optimal series is highly correlated with the actual series, but substantially less volatile – are not statistically robust. 
Keywords:  Current account, present value model, model evaluation 
JEL:  C11 C52 F32 F41 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:zbw:ifwedp:7211&r=ecm 
By:  Colander, David C. 
Abstract:  This paper asks the question: Why has the “general-to-specific” cointegrated VAR approach as developed in Europe had only limited success in the US as a tool for doing empirical macroeconomics, where what might be called a “theory comes first” approach dominates? The reason this paper highlights is the incompatibility of the European approach with the US focus on the journal publication metric for advancement. Specifically, the European “general-to-specific” cointegrated VAR approach requires researcher judgment to be part of the analysis, and the US focus on a journal publication metric discourages such research methods. The US “theory comes first” approach fits much better with the journal publication metric. 
Keywords:  Incentives, empirical work, econometrics, methodology, cointegration, VAR 
JEL:  B4 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:zbw:ifwedp:7213&r=ecm 
By:  TENENHAUS, Michel 
Abstract:  In this research, the authors explore the use of ULS-SEM (Unweighted Least Squares Structural Equation Modelling), PLS (Partial Least Squares), GSCA (Generalized Structured Component Analysis), path analysis on block principal components, and path analysis on block scales on customer satisfaction data. 
Keywords:  Component-based SEM; covariance-based SEM; GSCA; path analysis; PLS path modelling; Structural Equation Modelling; Unweighted Least Squares 
JEL:  C10 C23 
Date:  2008–01–01 
URL:  http://d.repec.org/n?u=RePEc:ebg:heccah:0887&r=ecm 