nep-ecm New Economics Papers
on Econometrics
Issue of 2006‒03‒18
thirty papers chosen by
Sune Karlsson
Orebro University

  1. Bias Corrected Instrumental Variables Estimation for Dynamic Panel Models with Fixed Effects By Jinyong Hahn; Jerry Hausman; Guido Kuersteiner
  2. Difference in Difference Meets Generalized Least Squares: Higher Order Properties of Hypotheses Tests By Jerry Hausman; Guido Kuersteiner
  3. Improving the Reliability of Bootstrap Tests with the Fast Double Bootstrap By Russell Davidson; James MacKinnon
  4. Computing the Distributions of Economic Models Via Simulation By John Stachurski
  5. The Limit Distribution of the CUSUM of Squares Test Under General Mixing Conditions By Ai Deng; Pierre Perron
  6. Variance Estimation in a Random Coefficients Model By Schlicht, Ekkehart; Ludsteck, Johannes
  7. A Genetic Algorithm for the Structural Estimation of Games with Multiple Equilibria By VICTOR AGUIRREGABIRIA; PEDRO MIRA
  8. Small Concentration Asymptotics and Instrumental Variables Inference By D.S. Poskitt; C.L. Skeels
  9. Dealing with Structural Breaks By Pierre Perron
  10. Limit theorems for bipower variation in financial econometrics By Ole E. Barndorff-Nielsen; Sven Erik Graversen; Jean Jacod; Neil Shephard
  11. Regression Discontinuity Inference with Specification Error By David S. Lee; David Card
  12. Understanding Spurious Regression in Financial Economics By Ai Deng
  13. Limit theorems for multipower variation in the presence of jumps By Ole E. Barndorff-Nielsen; Neil Shephard; Matthias Winkel
  14. Forecasting Inflation and GDP growth: Comparison of Automatic Leading Indicator (ALI) Method with Macro Econometric Structural Models (MESMs) By Duo Qin; Marie Anne Cagas; Geoffrey Ducanes; Nedelyn Magtibay-Ramos; Pilipinas Quising
  15. Estimation of Structural Parameters and Marginal Effects in Binary Choice Panel Data Models with Fixed Effects By Ivan Fernandez-Val;
  16. Testing for Reference Dependence: An Application to the Art Market By Alan Beggs; Kathryn Graddy
  17. A Comparison of Alternative Asymptotic Frameworks to Analyze a Structural Change in a Linear Time Trend By Ai Deng; Pierre Perron
  18. Testing for Shifts in Trend with an Integrated or Stationary Noise Component By Pierre Perron; Tomoyoshi Yabu
  19. Estimating Deterministic Trends with an Integrated or Stationary Noise Component By Pierre Perron; Tomoyoshi Yabu
  20. Estimating quadratic variation when quoted prices jump by a constant increment By Jeremy Large
  21. A generalised dynamic factor model for the Belgian economy - Useful business cycle indicators and GDP growth forecasts By Christophe Van Nieuwenhuyze
  22. A candidate-set-free algorithm for generating D-optimal split-plot designs By Jones B.; Goos P.
  23. Variation, jumps, market frictions and high frequency data in financial econometrics By Neil Shephard; Ole E. Barndorff-Nielsen
  24. Exploring the Usefulness of a Non-Random Holdout Sample for Model Validation: Welfare Effects on Female Behavior By Michael P. Keane; Kenneth I. Wolpin
  25. Multi-step Forecasting in Unstable Economies: Robustness Issues in the Presence of Location Shifts By Guillaume Chevillon
  26. TESTING FOR ASYMMETRY IN INTEREST RATE VOLATILITY IN THE PRESENCE OF A NEGLECTED LEVEL EFFECT By O.T. Henry; S. Suardi
  27. NECESSARY AND SUFFICIENT CONDITIONS FOR STABILITY OF FINITE STATE MARKOV CHAINS By John Stachurski
  28. A Macroeconometric Model of the Chinese Economy By Duo Qin; Marie Anne Cagas; Geoffrey Ducanes; Xinhua He; Rui Liu; Shiguo Liu; Nedelyn Magtibay-Ramos; Pilipinas Quising
  29. VAR Modelling Approach and Cowles Commission Heritage By Duo Qin
  30. Macroeconometric Modelling with a Global Perspective By M. Hashem Pesaran; Ron Smith

  1. By: Jinyong Hahn (UCLA); Jerry Hausman; Guido Kuersteiner (Department of Economics, Boston University)
    Abstract: This paper proposes a new instrumental variables estimator for a dynamic panel model with fixed effects that has good bias and mean squared error properties even when identification of the model becomes weak near the unit circle. We adopt a weak instrument asymptotic approximation to study the behavior of various estimators near the unit circle. We show that an estimator based on long differencing the model is much less biased than conventional implementations of the GMM estimator for the dynamic panel model. We also show that under the weak instrument approximation such conventional estimators are dominated in terms of mean squared error by an estimator with far fewer moment conditions. The long difference estimator mimics the infeasible optimal procedure through its reliance on a small set of moment conditions.
    Keywords: dynamic panel, bias correction, second order, unit root, weak instrument
    JEL: C13 C23 C51
    Date: 2005–07
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-024&r=ecm
  2. By: Jerry Hausman (MIT, Department of Economics); Guido Kuersteiner (Department of Economics, Boston University)
    Abstract: We investigate estimation and inference in difference in difference econometric models used in the analysis of treatment effects. When the innovations in such models display serial correlation, commonly used ordinary least squares (OLS) procedures are inefficient and may lead to tests with incorrect size. Implementation of feasible generalized least squares (FGLS) procedures is often hindered by too few observations in the cross section to allow for unrestricted estimation of the weight matrix without leading to tests with similar size distortions as conventional OLS based procedures. We analyze the small sample properties of FGLS based tests with a higher order Edgeworth expansion that allows us to construct a size corrected version of the test. We also address the question of optimal temporal aggregation as a method to reduce the dimension of the weight matrix. We apply our procedure to data on regulation of mobile telephone service prices. We find that a size corrected FGLS based test outperforms tests based on OLS.
    Date: 2005–03
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-010&r=ecm
  3. By: Russell Davidson (McGill University); James MacKinnon (Queen's University)
    Abstract: We first propose two procedures for estimating the rejection probabilities of bootstrap tests in Monte Carlo experiments without actually computing a bootstrap test for each replication. These procedures are only about twice as expensive (per replication) as estimating rejection probabilities for asymptotic tests. We then propose a new procedure for computing bootstrap P values that will often be more accurate than ordinary ones. This "fast double bootstrap" is closely related to the double bootstrap, but it is far less computationally demanding. Simulation results for three different cases suggest that this procedure can be very useful in practice.
    Keywords: bootstrap test, double bootstrap, Monte Carlo experiment, rejection frequency
    JEL: C12 C15
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1044&r=ecm
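    A minimal Python sketch of the fast double bootstrap p-value described above, applied to a one-sample t-test of a zero mean; the test, the resampling scheme and the sample sizes are illustrative assumptions rather than the authors' setup. Each first-level bootstrap sample spawns a single second-level sample, and the FDB p-value compares the first-level statistics with a quantile of the second-level ones.

        import numpy as np

        def tstat(x):
            # t-statistic for H0: mean = 0
            return np.sqrt(len(x)) * x.mean() / x.std(ddof=1)

        def fdb_pvalue(x, B=999, seed=0):
            rng = np.random.default_rng(seed)
            n, tau_hat = len(x), tstat(x)
            xc = x - x.mean()                              # impose the null before resampling
            tau1, tau2 = np.empty(B), np.empty(B)          # first- and second-level statistics
            for j in range(B):
                xb = rng.choice(xc, size=n, replace=True)  # first-level bootstrap sample
                tau1[j] = tstat(xb)
                xbb = rng.choice(xb - xb.mean(), size=n, replace=True)  # one second-level sample
                tau2[j] = tstat(xbb)
            p1 = np.mean(np.abs(tau1) >= abs(tau_hat))     # ordinary bootstrap p-value
            q = np.quantile(np.abs(tau2), 1.0 - p1)        # (1 - p1) quantile of second level
            return np.mean(np.abs(tau1) >= q)              # fast double bootstrap p-value

        x = np.random.default_rng(1).standard_normal(50) + 0.3
        print(fdb_pvalue(x))

    Because only one second-level sample is drawn per first-level sample, the cost is roughly twice that of an ordinary bootstrap test, far below that of a full double bootstrap with its own inner loop.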
  4. By: John Stachurski
    Abstract: This paper studies the convergence properties of a Monte Carlo algorithm for computing distributions of state variables when the underlying model is a Markov chain with absolutely continuous transition probabilities. We show that the L1 error of the estimator always converges to zero with probability one. In addition, rates of convergence are established for L1 and integral mean squared errors. The algorithm is shown to have many applications in economics.
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:949&r=ecm
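    A sketch in the spirit of the simulation estimator analysed here: for a chain with a known transition density p(x, y), average p(X_t, y) along one simulated path to estimate the marginal density at y. The Gaussian AR(1) transition and all parameter values are illustrative assumptions, chosen because the stationary density is known in closed form for comparison.

        import numpy as np
        from scipy.stats import norm

        a, sigma = 0.7, 1.0                       # illustrative chain: X' = a*X + sigma*N(0, 1)

        def p(x, y):
            # transition density of the chain at y, given the current state x
            return norm.pdf(y, loc=a * x, scale=sigma)

        def lookahead_density(y_grid, n=5000, x0=0.0, seed=0):
            rng = np.random.default_rng(seed)
            x, path = x0, np.empty(n)
            for t in range(n):                    # simulate one long path of the chain
                path[t] = x
                x = a * x + sigma * rng.standard_normal()
            # density estimate at each grid point: average transition density along the path
            return np.array([p(path, y).mean() for y in y_grid])

        grid = np.linspace(-4, 4, 9)
        est = lookahead_density(grid)
        true = norm.pdf(grid, scale=sigma / np.sqrt(1 - a**2))   # known stationary density
        print(np.round(est, 3))
        print(np.round(true, 3))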
  5. By: Ai Deng (Department of Economics, Boston University); Pierre Perron (Department of Economics, Boston University)
    Abstract: We consider the CUSUM of squares test in a linear regression model with general mixing assumptions on the regressors and the errors. We derive its limit distribution and show how it depends on the nature of the error process. We suggest a corrected version that has a limit distribution free of nuisance parameters. We also discuss how it provides an improvement over the standard approach to testing for a change in the variance in a univariate time series. Simulation evidence is presented to support this. We illustrate the usefulness of our method by analyzing changes in the variance of stock returns and a variety of macroeconomic time series, as well as by testing for a change in the variance of the residuals in a typical four-variable VAR model. Our results show the widespread prevalence of changes in the variance of such series and the fact that the variability of shocks affecting the U.S. economy has decreased.
    Keywords: Change-point, Variance shift, Recursive residuals, Dynamic models, Conditional heteroskedasticity.
    JEL: D80 D91 G11 E21
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-043&r=ecm
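    For concreteness, the sketch below computes a nuisance-parameter-free statistic in this spirit for a change in variance: partial sums of the demeaned squared series are scaled by a Bartlett-kernel (HAC) estimate of the long-run variance of the squares, so that the limit is the supremum of a Brownian bridge. The bandwidth rule and the simulated data are assumptions, and this generic variant is not necessarily the exact correction proposed in the paper.

        import numpy as np

        def cusq_corrected(e, bandwidth=None):
            # CUSUM-of-squares-type statistic for a variance change, normalised by a
            # HAC estimate of the long-run variance of e_t^2
            T = len(e)
            u = e**2 - np.mean(e**2)                     # demeaned squared series
            if bandwidth is None:
                bandwidth = int(np.floor(4 * (T / 100) ** (2 / 9)))
            omega = u @ u / T
            for l in range(1, bandwidth + 1):            # Bartlett-weighted autocovariances
                omega += 2 * (1 - l / (bandwidth + 1)) * (u[l:] @ u[:-l]) / T
            bridge = np.cumsum(u) / np.sqrt(T * omega)   # scaled partial sums
            return np.max(np.abs(bridge))                # compare with sup|Brownian bridge| quantiles

        rng = np.random.default_rng(0)
        e = np.concatenate([rng.standard_normal(200), 2.0 * rng.standard_normal(200)])
        print(round(cusq_corrected(e), 2))               # the variance doubles mid-sample

    A value well above the 5% critical value of the supremum of the absolute Brownian bridge (roughly 1.36) signals a shift in variance.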
  6. By: Schlicht, Ekkehart; Ludsteck, Johannes
    Abstract: This paper describes an estimator for a standard state-space model with coefficients generated by a random walk that is statistically superior to the Kalman filter as applied to this particular class of models. Two closely related estimators for the variances are introduced: a maximum likelihood estimator and a moments estimator built on the idea of equating certain moments to their expectations. These estimators perform quite similarly in many cases. In some cases, however, the moments estimator is preferable both to the proposed likelihood estimator and to the Kalman filter as implemented in the program package EViews.
    JEL: C52 C51 C22 C2
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:lmu:muenec:904&r=ecm
  7. By: VICTOR AGUIRREGABIRIA (Department of Economics, Boston University); PEDRO MIRA (Centro de Estudios Monetarios y Financieros (CEMFI))
    Abstract: This paper proposes an algorithm to obtain maximum likelihood estimates of structural parameters in discrete games with multiple equilibria. The method combines a genetic algorithm (GA) with a pseudo maximum likelihood (PML) procedure. The GA searches efficiently over the huge space of possible combinations of equilibria in the data. The PML procedure avoids the repeated computation of equilibria for each trial value of the parameters of interest. To test the ability of this method to get maximum likelihood estimates, we present a Monte Carlo experiment in the context of a game of price competition and collusion.
    Keywords: Empirical games, Maximum likelihood estimation, Multiple equilibria, Genetic algorithms
    JEL: C13 C35
    Date: 2005–01
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-001&r=ecm
  8. By: D.S. Poskitt; C.L. Skeels
    Abstract: Poskitt and Skeels (2005) provide a new approximation to the sampling distribution of the IV estimator in a simultaneous equations model; the approximation is appropriate when the concentration parameter associated with the reduced form model is small. We present approximations to the sampling distributions of various functions of the IV estimator based upon small-concentration asymptotics, and investigate hypothesis testing procedures and confidence region construction using these approximations. We explore the relationship between our work and the K statistic of Kleibergen (2002) and demonstrate that our results can be used to explain the sampling behaviour of the K statistic in simultaneous equations models where identification is weak.
    Keywords: simultaneous equations model, IV estimator, weak identification, weak instruments, small-concentration asymptotics
    JEL: C10 C12 C13 C30
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:948&r=ecm
  9. By: Pierre Perron (Department of Economics, Boston University)
    Abstract: This chapter is concerned with methodological issues related to estimation, testing and computation in the context of structural changes in linear models. A central theme of the review is the interplay between structural change and unit roots, and methods to distinguish between the two. The topics covered are: methods related to estimation and inference about break dates for single equations with or without restrictions, with extensions to multi-equation systems where allowance is also made for changes in the variability of the shocks; tests for structural changes, including tests for a single or multiple changes, tests valid with unit root or trending regressors, and tests for changes in the trend function of a series that can be integrated or trend-stationary; testing for a unit root versus trend-stationarity in the presence of structural changes in the trend function; testing for cointegration in the presence of structural changes; and issues related to long memory and level shifts. Our focus is on the conceptual issues about the frameworks adopted and the assumptions imposed as they relate to potential applicability. We also highlight the potential problems that can occur with methods that are commonly used and recent work that has been done to overcome them.
    Date: 2005–04
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-017&r=ecm
  10. By: Ole E. Barndorff-Nielsen (Department of Mathematical Sciences, University of Aarhus, Ny Munkegade, DK-8000 Aarhus C, Denmark); Sven Erik Graversen (Department of Mathematical Sciences, University of Aarhus, Ny Munkegade, DK-8000 Aarhus C, Denmark); Jean Jacod (Laboratoire de Probabilités et Modéles Aléatoires (CNRS UMR 7599), Université Pierre et Marie Curie, 4 Place Jussieu, 75252 Paris Cedex 05, France); Neil Shephard (Nuffield College, Oxford)
    Abstract: In this paper we provide an asymptotic analysis of generalised bipower measures of the variation of price processes in financial economics. These measures encompass the usual quadratic variation, power variation and bipower variations which have been highlighted in recent years in financial econometrics. The analysis is carried out under some rather general Brownian semimartingale assumptions, which allow for standard leverage effects.
    Keywords: Bipower variation, Power variation, Quadratic variation, Semimartingales, Stochastic volatility
    Date: 2006–03–09
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0506&r=ecm
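    The two central quantities are straightforward to compute from a grid of intraday returns. The sketch below contrasts realised (quadratic) variation with realised bipower variation; the simulated 5-minute returns and the single added jump are purely illustrative assumptions.

        import numpy as np

        def realised_variance(r):
            # realised quadratic variation: sum of squared intraday returns
            return np.sum(r**2)

        def bipower_variation(r):
            # realised bipower variation: mu1^(-2) * sum |r_i||r_{i-1}|, mu1 = E|N(0,1)| = sqrt(2/pi)
            mu1 = np.sqrt(2 / np.pi)
            return np.sum(np.abs(r[1:]) * np.abs(r[:-1])) / mu1**2

        rng = np.random.default_rng(0)
        r = 0.01 * rng.standard_normal(288)       # one day of simulated 5-minute returns
        r[100] += 0.05                            # add a single jump
        print(realised_variance(r), bipower_variation(r))

    Because bipower variation is robust to rare jumps, the difference between realised variance and bipower variation is commonly used to estimate the jump contribution to quadratic variation.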
  11. By: David S. Lee; David Card
    Abstract: A regression discontinuity (RD) research design is appropriate for program evaluation problems in which treatment status (or the probability of treatment) depends on whether an observed covariate exceeds a fixed threshold. In many applications the treatment-determining covariate is discrete. This makes it impossible to compare outcomes for observations "just above" and "just below" the treatment threshold, and requires the researcher to choose a functional form for the relationship between the treatment variable and the outcomes of interest. We propose a simple econometric procedure to account for uncertainty in the choice of functional form for RD designs with discrete support. In particular, we model deviations of the true regression function from a given approximating function -- the specification errors -- as random. Conventional standard errors ignore the group structure induced by specification errors and tend to overstate the precision of the estimated program impacts. The proposed inference procedure that allows for specification error also has a natural interpretation within a Bayesian framework.
    JEL: C1 C5 J0
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:nbr:nberte:0322&r=ecm
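    The group structure induced by specification error suggests clustering at the level of the discrete running variable. The sketch below fits a simple RD regression and clusters the standard errors on the support points of that variable; the linear specification, the simulated data and the function names are illustrative assumptions rather than the authors' exact procedure.

        import numpy as np

        def rd_cluster_se(y, x, cutoff):
            # OLS of y on [1, D, x, D*x], with D = 1{x >= cutoff}; standard errors are
            # clustered on the discrete values of x, treating specification error as a
            # common shock within each x-cell
            d = (x >= cutoff).astype(float)
            X = np.column_stack([np.ones_like(x), d, x, d * x])
            XtX_inv = np.linalg.inv(X.T @ X)
            beta = XtX_inv @ X.T @ y
            u = y - X @ beta
            meat = np.zeros((X.shape[1], X.shape[1]))
            for v in np.unique(x):                       # one cluster per support point of x
                g = x == v
                s = X[g].T @ u[g]
                meat += np.outer(s, s)
            V = XtX_inv @ meat @ XtX_inv
            return beta, np.sqrt(np.diag(V))             # beta[1] is the jump at the cutoff

        rng = np.random.default_rng(0)
        x = rng.integers(0, 20, size=2000).astype(float)          # discrete running variable
        y = 0.5 * x + 2.0 * (x >= 10) + rng.standard_normal(2000)
        b, se = rd_cluster_se(y, x, cutoff=10)
        print(np.round(b, 2), np.round(se, 2))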
  12. By: Ai Deng (Department of Economics, Boston University)
    Abstract: This paper provides an asymptotic theory for the spurious regression analyzed by Ferson, Sarkissian and Simin (2003). The asymptotic framework developed by Nabeya and Perron (1994) is used to provide approximations for the various estimates and statistics. Also, using a fixed-bandwidth asymptotic framework, a convergent t test is constructed, following Sun (2005). These are shown to be accurate and to explain the simulation findings in Ferson et al. (2003). Monte Carlo studies show that our asymptotic distribution provides a very good finite sample approximation for sample sizes often encountered in finance. Our analysis also reveals an important potential problem in the theoretical hypothesis testing literature on predictability. A possible reconciling interpretation is provided.
    Keywords: spurious regression, observational equivalence, Nabeya-Perron asymptotics, fixed-b asymptotics, data mining, nearly integrated, nearly white noise (NINW)
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-044&r=ecm
  13. By: Ole E. Barndorff-Nielsen (Department of Mathematical Sciences, University of Aarhus, Ny Munkegade, DK-8000 Aarhus C, Denmark); Neil Shephard (Nuffield College, Oxford); Matthias Winkel (Department of Statistics, University of Oxford, 1 South Parks Road, Oxford, OX1 3TG, U.K.)
    Abstract: In this paper we provide a systematic study of the robustness of probability limits and central limit theory for realised multipower variation when we add finite activity and infinite activity jump processes to an underlying Brownian semimartingale.
    Keywords: Bipower variation, Infinite activity, Multipower variation, Power variation, Quadratic variation, Semimartingales, Stochastic volatility
    Date: 2006–03–09
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0507&r=ecm
  14. By: Duo Qin (Queen Mary, University of London); Marie Anne Cagas (Asian Development Bank (ADB), and University of the Philippines); Geoffrey Ducanes (Asian Development Bank (ADB), and University of the Philippines); Nedelyn Magtibay-Ramos (Asian Development Bank (ADB)); Pilipinas Quising (Asian Development Bank (ADB))
    Abstract: This paper compares forecast performance of the ALI method and the MESMs and seeks ways of improving the ALI method. Inflation and GDP growth form the forecast objects for comparison, using data from China, Indonesia and the Philippines. The ALI method is found to produce better forecasts than those by MESMs in general, but the method is found to involve greater uncertainty in choosing indicators, mixing data frequencies and utilizing unrestricted VARs. Two possible improvements are found helpful to reduce the uncertainty: (i) give theory priority in choosing indicators and include theory-based disequilibrium shocks in the indicator sets; and (ii) reduce the VARs by means of the general→specific model reduction procedure.
    Keywords: Dynamic factor models, Model reduction, VAR
    JEL: E31 C53
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp554&r=ecm
  15. By: Ivan Fernandez-Val (Department of Economics, Boston University);
    Abstract: Fixed effects estimates of structural parameters in nonlinear panel models can be severely biased due to the incidental parameters problem. In this paper I show that the most important component of this incidental parameters bias for probit fixed effects estimators of index coefficients is proportional to the true parameter value, using a large-T expansion of the bias. This result allows me to derive a lower bound for this bias, and to show that fixed effects estimates of ratios of coefficients and average marginal effects have zero bias in the absence of heterogeneity and have negligible bias relative to their true values for a wide range of distributions of regressors and individual effects. Numerical examples suggest that this small bias property also holds for logit and linear probability models, and for exogenous variables in dynamic binary choice models. An empirical analysis of female labor force participation using data from the PSID shows that whereas the significant biases in fixed effects estimates of model parameters do not contaminate the estimates of marginal effects in static models, estimates of both index coefficients and marginal effects can be severely biased in dynamic models. Improved bias corrected estimators for index coefficients and marginal effects are also proposed for both static and dynamic models.
    JEL: C23 C25 J22
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-36&r=ecm
  16. By: Alan Beggs; Kathryn Graddy
    Abstract: This paper tests for reference dependence, using data from Impressionist and Contemporary Art auctions. We distinguish reference dependence based on "rule of thumb" learning from reference dependence based on "rational" learning. Furthermore, we distinguish pure reference dependence from effects due to loss aversion. Thus, we use actual market data to test essential characteristics of Kahneman and Tversky's Prospect Theory. The main methodological innovations of this paper are, firstly, that reference dependence can be identified separately from loss aversion. Secondly, we introduce a consistent non-linear estimator to deal with the measurement error problems involved in testing for loss aversion. In this dataset, we find strong reference dependence but no loss aversion.
    Keywords: Reference Dependence, Loss Aversion, Prospect Theory, Art, Auctions
    JEL: D81 D44 L82
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:228&r=ecm
  17. By: Ai Deng (Department of Economics, Boston University); Pierre Perron (Columbia Business School)
    Abstract: This paper considers various asymptotic approximations to the finite sample distribution of the estimate of the break date in a simple one-break model for a linear trend function that exhibits a change in slope, with or without a concurrent change in intercept. The noise component is either stationary or has an autoregressive unit root. Our main focus is on comparing the so-called “bounded-trend” and “unbounded-trend” asymptotic frameworks. Not surprisingly, the “bounded-trend” asymptotic framework is of little use when the noise component is integrated. When the noise component is stationary, we obtain the following results. If the intercept does not change and is not allowed to change in the estimation, both frameworks yield the same approximation. However, when the intercept is allowed to change, whether or not it actually changes in the data, the “bounded-trend” asymptotic framework completely misses important features of the finite sample distribution of the estimate of the break date, especially the pronounced bimodality that was uncovered by Perron and Zhu (2005) and shown to be well captured using the “unbounded-trend” asymptotic framework. Simulation experiments confirm our theoretical findings, which expose the drawbacks of using the “bounded-trend” asymptotic framework in the context of structural change models.
    Keywords: change-point, confidence intervals, shrinking shifts, bounded trend, level shift.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-029&r=ecm
  18. By: Pierre Perron (Department of Economics, Boston University); Tomoyoshi Yabu (Department of Economics, Boston University)
    Keywords: structural change, unit root, median unbiased estimates, GLS procedure, super efficient estimates
    JEL: C22
    Date: 2005–07
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-026&r=ecm
  19. By: Pierre Perron (Department of Economics, Boston University); Tomoyoshi Yabu (Department of Economics, Boston University)
    Keywords: linear trend, unit root, median unbiased estimates, GLS procedure, super efficient estimates
    JEL: C22
    Date: 2004–10
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-035&r=ecm
  20. By: Jeremy Large (Nuffield College, Oxford)
    Abstract: Financial assets' quoted prices normally change through frequent revisions, or jumps. For markets where quotes are almost always revised by the minimum price tick, this paper proposes a new estimator of Quadratic Variation which is robust to microstructure effects. It compares the number of alternations, where quotes are revised back to their previous price, to the number of other jumps. Many markets exhibit a lack of autocorrelation in their quotes' alternation pattern. Under quite general 'no leverage' assumptions, whenever this is so the proposed statistic is consistent as the intensity of jumps increases without bound. After an empirical implementation, some useful corollaries of this are given.
    JEL: C10 C22 C80
    Date: 2006–03–09
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0505&r=ecm
  21. By: Christophe Van Nieuwenhuyze (National Bank of Belgium, Research Department)
    Abstract: This paper aims to extract the common variation in a data set of 509 conjunctural series as an indication of the Belgian business cycle. The data set contains information on business and consumer surveys of Belgium and its neighbouring countries, macroeconomic variables and some worldwide watched indicators such as the ISM and the OECD confidence indicators. The statistical framework used is the One-sided Generalised Dynamic Factor Model developed by Forni, Hallin, Lippi and Reichlin (2005). The model splits the series into a common component, driven by the business cycle, and an idiosyncratic component. Well-known indicators such as the EC economic sentiment indicator for Belgium and the NBB overall synthetic curve contain a high amount of business cycle information. Furthermore, the richness of the model allows us to determine the cyclical properties of the series and to forecast GDP growth, all within the same unified setting. We classify the common component of the variables into leading, lagging and coincident with respect to the common component of quarter-on-quarter GDP growth. 22% of the variables are found to be leading. Amongst the most leading variables we find asset prices and international confidence indicators such as the ISM and some OECD indicators. In general, national business confidence surveys are found to coincide with Belgian GDP, while they lead euro area GDP and its confidence indicators. Consumer confidence seems to lag. Although the model captures the dynamic common variation contained in the data set, forecasts based on that information are insufficient to deliver a good proxy for GDP growth as a result of a non-negligible idiosyncratic part in GDP's variance. Lastly, we explore the dependence of the model's results on the data set and show through a data reduction process that the idiosyncratic part of GDP's quarter-on-quarter growth can be dramatically reduced. However, this does not improve the forecasts.
    Keywords: Dynamic factor model, business cycle, leading indicators, forecasting, data reduction.
    JEL: C33 C43 E32 E37
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:nbb:reswpp:200603-2&r=ecm
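    As a rough illustration of extracting common variation from a large panel, the sketch below uses a static principal-components approximation to the common component. This is a deliberate simplification, not the one-sided generalised dynamic factor model of Forni, Hallin, Lippi and Reichlin used in the paper, and the simulated panel is an assumption.

        import numpy as np

        def static_factor(panel, n_factors=1):
            # static principal-components approximation: panel is T x N, standardised by column
            Z = (panel - panel.mean(0)) / panel.std(0)
            vals, vecs = np.linalg.eigh(Z.T @ Z / len(Z))    # eigen-decomposition of covariance
            load = vecs[:, -n_factors:]                      # loadings on the leading factors
            factors = Z @ load                               # estimated common factors
            common = factors @ load.T * panel.std(0) + panel.mean(0)
            return factors, common

        rng = np.random.default_rng(0)
        cycle = np.cumsum(rng.standard_normal(120))          # one latent "business cycle"
        panel = np.outer(cycle, rng.uniform(0.5, 1.5, 30)) + rng.standard_normal((120, 30))
        factors, common = static_factor(panel)
        print(abs(np.corrcoef(factors[:, 0], cycle)[0, 1]))  # factor tracks the cycle up to sign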
  22. By: Jones B.; Goos P.
    Abstract: We introduce a new method for generating optimal split-plot designs. These designs are optimal in the sense that they are efficient for estimating the fixed effects of the statistical model that is appropriate given the split-plot design structure. One advantage of the method is that it does not require the prior specification of a candidate set. This makes the production of split-plot designs computationally feasible in situations where the candidate set is too large to be tractable. The method allows for flexible choice of the sample size and supports inclusion of both continuous and categorical factors. The model can be any linear regression model and may include arbitrary polynomial terms in the continuous factors and interaction terms of any order. We demonstrate the usefulness of this flexibility with a 100-run polypropylene experiment involving 11 factors where we found a design that is substantially more efficient than designs produced using other approaches.
    Date: 2006–02
    URL: http://d.repec.org/n?u=RePEc:ant:wpaper:2006006&r=ecm
  23. By: Neil Shephard; Ole E. Barndorff-Nielsen
    Abstract: We will review the econometrics of non-parametric estimation of the components of the variation of asset prices. This very active literature has been stimulated by the recent advent of complete records of transaction prices, quote data and order books. In our view the interaction of the new data sources with new econometric methodology is leading to a paradigm shift in one of the most important areas in econometrics: volatility measurement, modelling and forecasting. We will describe this new paradigm which draws together econometrics with arbitrage free financial economics theory. Perhaps the two most influential papers in this area have been Andersen, Bollerslev, Diebold and Labys (2001) and Barndorff-Nielsen and Shephard (2002), but many other papers have made important contributions. This work is likely to have deep impacts on the econometrics of asset allocation and risk management. One of our observations will be that inferences based on these methods, computed from observed market prices and so under the physical measure, are also valid as inferences under all equivalent measures. This puts this subject also at the heart of the econometrics of derivative pricing. One of the most challenging problems in this context is dealing with various forms of market frictions, which obscure the efficient price from the econometrician. Here we will characterise four types of statistical models of frictions and discuss how econometricians have been attempting to overcome them.
    Keywords: Quadratic Variation, Volatility, Realised Volatility
    JEL: C14 C22
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:240&r=ecm
  24. By: Michael P. Keane (Department of Economics, Yale University); Kenneth I. Wolpin (Department of Economics, University of Pennsylvania)
    Abstract: Opportunities for external validation of behavioral models in the social sciences, based on randomized social experiments or on large regime shifts that can be treated as experiments for the purpose of model validation, are extremely rare. In this paper, we consider an alternative approach, namely mimicking the essential element of regime change by non-randomly holding out from estimation a portion of the sample that faces a significantly different policy regime. The non-random holdout sample is used for model validation/selection. We illustrate the non-random holdout sample approach to model validation in the context of a model of welfare program participation. The policy heterogeneity that we exploit to generate a non-random holdout sample takes advantage of the wide variation across states that has existed in welfare policy.
    Keywords: Model validation, Hold-out sample, Public welfare
    JEL: C52 C53 J1 J2
    Date: 2006–05–01
    URL: http://d.repec.org/n?u=RePEc:pen:papers:06-006&r=ecm
  25. By: Guillaume Chevillon
    Abstract: To forecast at several, say h, periods into the future, a modeller faces two techniques: iterating one-step-ahead forecasts (the IMS technique) or directly modelling the relation between observations separated by an h-period interval and using it for forecasting (DMS forecasting). It is known that unit-root non-stationarity and residual autocorrelation benefit DMS accuracy in finite samples. We analyze here the effect of structural breaks as observed in unstable economies, and show that the benefits of DMS stem from its better appraisal of the dynamic relationships of interest for forecasting. It thus acts in between congruent modelling and intercept correction. We apply our results to forecasting the South African GDP over the last thirty years, as this economy exhibits significant instability. We analyze the forecasting properties of 31 competing models. We find that the GDP of South Africa is best forecast, 4 quarters ahead, using direct multi-step techniques, in line with our theoretical results.
    Keywords: Multi-step Forecasting, Structural Breaks, South Africa
    JEL: C32 C53 E3
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:257&r=ecm
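    The two competing techniques can be written down in a few lines for an AR(1), an illustrative special case rather than the paper's general setup: IMS fits the one-step model and iterates it h times, while DMS regresses y_t directly on y_{t-h}.

        import numpy as np

        def ar_fit(x, z):
            # OLS of z on a constant and x; returns (intercept, slope)
            X = np.column_stack([np.ones_like(x), x])
            return np.linalg.lstsq(X, z, rcond=None)[0]

        def ims_vs_dms(y, h):
            c, rho = ar_fit(y[:-1], y[1:])        # one-step AR(1) fit
            f = y[-1]
            for _ in range(h):                    # iterated multi-step (IMS) forecast
                f = c + rho * f
            c_h, rho_h = ar_fit(y[:-h], y[h:])    # direct h-step regression (DMS) forecast
            return f, c_h + rho_h * y[-1]

        rng = np.random.default_rng(0)
        y = np.zeros(200)
        for t in range(1, 200):                   # AR(1) whose mean shifts late in the sample
            mu = 3.0 if t > 150 else 0.0
            y[t] = mu + 0.8 * (y[t - 1] - mu) + rng.standard_normal()
        print(ims_vs_dms(y, h=4))                 # compare the two 4-step-ahead forecasts

    Location shifts of this kind are one setting in which, as the abstract argues, direct multi-step estimation can improve on iterating the one-step model.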
  26. By: O.T. Henry; S. Suardi
    Abstract: Empirical evidence documents a level effect in the volatility of short term rates of interest. That is, volatility is positively correlated with the level of the short term interest rate. Using Monte Carlo simulations, this paper examines the performance of the commonly used Engle-Ng (1993) tests, which differentiate the effect of good and bad news on the predictability of future short rate volatility. Our results show that the tests exhibit serious size distortions and loss of power in the face of a neglected level effect.
    Keywords: Level Effects; Asymmetry; Engle-Ng Tests
    JEL: C12 G12 E44
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:945&r=ecm
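    The joint Engle-Ng test is a short auxiliary regression, sketched below for a series of (optionally standardised) residuals. The T*R^2 form with a chi-squared(3) reference distribution follows the usual presentation of the sign and size bias tests; applying it to raw residuals, as in the example, is an illustrative shortcut.

        import numpy as np
        from scipy import stats

        def engle_ng_joint_test(eps, h=None):
            # regress squared (standardised) residuals on a constant, a negative-sign dummy
            # and sign-scaled lagged residuals; T*R^2 is chi-squared(3) under no asymmetry
            z = eps if h is None else eps / np.sqrt(h)
            y = z[1:] ** 2
            s_neg = (z[:-1] < 0).astype(float)
            X = np.column_stack([np.ones_like(y), s_neg, s_neg * z[:-1], (1 - s_neg) * z[:-1]])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
            lm = len(y) * r2
            return lm, stats.chi2.sf(lm, df=3)    # statistic and p-value

        rng = np.random.default_rng(0)
        print(engle_ng_joint_test(rng.standard_normal(1000)))   # iid noise: should not reject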
  27. By: John Stachurski
    Abstract: This note considers finite state Markov chains with overlapping supports. While the overlapping supports condition is known to be necessary and sufficient for stability of these chains, the result is typically presented in a more general context. As such, one objective of the note is to provide an exposition, along with simple proofs corresponding to the finite case. A second is to provide an additional equivalent condition that should be useful in applications.
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:951&r=ecm
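    A hedged sketch of the kind of check involved: the code below verifies a standard sufficient condition for stability of a finite state chain (some power of the transition matrix has a strictly positive column, so all states reach a common state in the same number of steps) and then computes the unique stationary distribution. This Doeblin-type check is a textbook condition and is not necessarily the note's exact formulation of overlapping supports.

        import numpy as np

        def has_positive_column(P, max_power=None):
            # look for a power of P with a column of strictly positive entries
            n = len(P)
            Q = np.eye(n)
            for _ in range(max_power or n * n):
                Q = Q @ P
                if np.any(np.all(Q > 0, axis=0)):
                    return True
            return False

        def stationary_distribution(P):
            # unique solution of pi P = pi with sum(pi) = 1, via the left eigenvector of P
            vals, vecs = np.linalg.eig(P.T)
            pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
            return pi / pi.sum()

        P = np.array([[0.9, 0.1, 0.0],
                      [0.2, 0.5, 0.3],
                      [0.0, 0.4, 0.6]])
        print(has_positive_column(P), stationary_distribution(P).round(3))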
  28. By: Duo Qin (Queen Mary, University of London); Marie Anne Cagas (Asian Development Bank (ADB)); Geoffrey Ducanes (Asian Development Bank (ADB)); Xinhua He (Institute of World Economics & Politics (IWEP), Chinese Academy of Social Sciences (CASS)); Rui Liu (Institute of World Economics & Politics (IWEP), Chinese Academy of Social Sciences (CASS)); Shiguo Liu (Institute of World Economics & Politics (IWEP), Chinese Academy of Social Sciences (CASS)); Nedelyn Magtibay-Ramos (Asian Development Bank (ADB)); Pilipinas Quising (Asian Development Bank (ADB))
    Abstract: This paper describes a quarterly macroeconometric model of the Chinese economy. The model comprises household consumption, investment, government, trade, production, prices, money, and employment blocks. The equilibrium-correction form is used for all the behavioral equations and the general→simple dynamic specification approach is adopted. Great efforts have been made to achieve the best possible blend of standard long-run theories, country-specific institutional features and short-run dynamics in data. The tracking performance of the model is evaluated. Forecasting and empirical investigation of a number of topical macroeconomic issues utilizing model simulations have shown the model to be immensely useful.
    Keywords: Macroeconometric model, Chinese economy, Forecasts, Simulations
    JEL: C51 E17
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp553&r=ecm
  29. By: Duo Qin (Queen Mary, University of London)
    Abstract: This paper examines the rise of the VAR approach from a historical perspective. It shows that the VAR approach arises as a systematic solution to the issue of ‘model choice’ bypassed by Cowles Commission (CC) researchers, and that the approach essentially inherits and enhances the CC legacy rather than abandons or opposes it. It argues that the approach is not so atheoretical as widely believed and that it helps reform econometrics by shifting research focus from measurement of given theories to identification/verification of data-coherent theories, and hence from confirmatory analysis to a mixture of confirmatory and exploratory analysis.
    Keywords: VAR, Macroeconometrics, Methodology, Rational expectations, Structural model
    JEL: B23 B40 C10 C30 C50
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp557&r=ecm
  30. By: M. Hashem Pesaran; Ron Smith
    Abstract: This paper provides a synthesis and further development of a global modelling approach introduced in Pesaran, Schuermann and Weiner (2004), where country-specific models in the form of VARX* structures are estimated relating a vector of domestic variables, xit, to their foreign counterparts, x*it, and then consistently combined to form a Global VAR (GVAR). It is shown that the VARX* models can be derived as the solution to a dynamic stochastic general equilibrium (DSGE) model where over-identifying long-run theoretical relations can be tested and imposed if acceptable. This gives the system a transparent long-run theoretical structure. Similarly, short-run over-identifying theoretical restrictions can be tested and imposed if accepted. Alternatively, if one has less confidence in the short-run theory the dynamics can be left unrestricted. The assumption of the weak exogeneity of the foreign variables for the long-run parameters can be tested, where the x*it variables can be interpreted as proxies for global factors. Rather than using deviations from ad hoc statistical trends, the equilibrium values of the variables reflecting the long-run theory embodied in the model can be calculated. This approach has been used in a wide variety of contexts and for a wide variety of purposes. The paper also provides some new results.
    Keywords: Global VAR (GVAR), DSGE models, VARX
    JEL: C32 E17 F42
    URL: http://d.repec.org/n?u=RePEc:scp:wpaper:06-43&r=ecm

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.