
on Econometrics 
By:  Jinyong Hahn (UCLA); Jerry Hausman; Guido Kuersteiner (Department of Economics, Boston University) 
Abstract:  This paper proposes a new instrumental variables estimator for a dynamic panel model with fixed effects with good bias and mean squared error properties even when identification of the model becomes weak near the unit circle. We adopt a weak instrument asymptotic approximation to study the behavior of various estimators near the unit circle. We show that an estimator based on long differencing the model is much less biased than conventional implementations of the GMM estimator for the dynamic panel model. We also show that under the weak instrument approximation such conventional estimators are dominated in terms of mean squared error by an estimator with far fewer moment conditions. The long difference estimator mimics the infeasible optimal procedure through its reliance on a small set of moment conditions. 
Keywords:  dynamic panel, bias correction, second order, unit root, weak instrument 
JEL:  C13 C23 C51 
Date:  2005–07 
URL:  http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005024&r=ecm 
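As a rough numerical illustration of the long-difference idea in the abstract above, the sketch below simulates an AR(1) panel with fixed effects and computes a long-difference IV estimate. The DGP, the parameter values, and the use of the initial observation y_i0 as the sole instrument are illustrative assumptions for this sketch, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, a = 20000, 6, 0.8

# Panel AR(1) with fixed effects: y_it = a*y_i,t-1 + eta_i + eps_it,
# started from the stationary distribution given eta_i.
eta = rng.standard_normal(N)
y = np.empty((N, T + 1))
y[:, 0] = eta / (1 - a) + rng.standard_normal(N) / np.sqrt(1 - a**2)
for t in range(T):
    y[:, t + 1] = a * y[:, t] + eta + rng.standard_normal(N)

# Long difference: y_iT - y_i1 = a*(y_i,T-1 - y_i0) + (eps_iT - eps_i1);
# the fixed effect drops out, and y_i0 is uncorrelated with both shocks.
dy = y[:, T] - y[:, 1]
dx = y[:, T - 1] - y[:, 0]
z = y[:, 0]                      # instrument for the long-differenced regressor
a_iv = (z @ dy) / (z @ dx)       # just-identified IV estimate of a
```

With a large cross-section the estimate lands close to the true autoregressive parameter even though only one moment condition is used.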
By:  Jerry Hausman (MIT, Department of Economics); Guido Kuersteiner (Department of Economics, Boston University) 
Abstract:  We investigate estimation and inference in difference-in-differences econometric models used in the analysis of treatment effects. When the innovations in such models display serial correlation, commonly used ordinary least squares (OLS) procedures are inefficient and may lead to tests with incorrect size. Implementation of feasible generalized least squares (FGLS) procedures is often hindered by too few observations in the cross section to allow for unrestricted estimation of the weight matrix without producing tests with size distortions similar to those of conventional OLS-based procedures. We analyze the small sample properties of FGLS-based tests with a higher order Edgeworth expansion that allows us to construct a size-corrected version of the test. We also address the question of optimal temporal aggregation as a method to reduce the dimension of the weight matrix. We apply our procedure to data on regulation of mobile telephone service prices. We find that a size-corrected FGLS-based test outperforms tests based on OLS. 
Date:  2005–03 
URL:  http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005010&r=ecm 
By:  Russell Davidson (McGill University); James MacKinnon (Queen's University) 
Abstract:  We first propose two procedures for estimating the rejection probabilities of bootstrap tests in Monte Carlo experiments without actually computing a bootstrap test for each replication. These procedures are only about twice as expensive (per replication) as estimating rejection probabilities for asymptotic tests. We then propose a new procedure for computing bootstrap P values that will often be more accurate than ordinary ones. This "fast double bootstrap" is closely related to the double bootstrap, but it is far less computationally demanding. Simulation results for three different cases suggest that this procedure can be very useful in practice. 
Keywords:  bootstrap test, double bootstrap, Monte Carlo experiment, rejection frequency 
JEL:  C12 C15 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:qed:wpaper:1044&r=ecm 
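The fast double bootstrap described in the abstract above can be sketched for a simple one-sample t test. The test statistic, the null imposed by centering, and the number of replications are illustrative choices; the FDB step itself (comparing first-level statistics to a quantile of second-level statistics) follows the idea named in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def tstat(x):
    # standard one-sample t statistic for H0: mean = 0
    return np.sqrt(len(x)) * x.mean() / x.std(ddof=1)

def fdb_pvalue(x, B=399):
    """Fast double bootstrap p-value: one second-level statistic per
    first-level replication, instead of a full inner bootstrap loop."""
    tau = abs(tstat(x))
    xc = x - x.mean()                        # impose the null by centering
    t1 = np.empty(B); t2 = np.empty(B)
    for j in range(B):
        xb = rng.choice(xc, size=len(xc), replace=True)
        t1[j] = abs(tstat(xb))
        xbb = rng.choice(xb - xb.mean(), size=len(xc), replace=True)
        t2[j] = abs(tstat(xbb))              # single second-level statistic
    p1 = (t1 >= tau).mean()                  # ordinary bootstrap p-value
    q = np.quantile(t2, 1.0 - p1)            # (1-p1) quantile of 2nd level
    return (t1 >= q).mean()                  # FDB p-value
```

Compared with a full double bootstrap (B inner loops per outer replication), this costs only twice the ordinary bootstrap.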
By:  John Stachurski 
Abstract:  This paper studies the convergence properties of a Monte Carlo algorithm for computing distributions of state variables when the underlying model is a Markov chain with absolutely continuous transition probabilities. We show that the L1 error of the estimator always converges to zero with probability one. In addition, rates of convergence are established for L1 and integral mean squared errors. The algorithm is shown to have many applications in economics. 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:mlb:wpaper:949&r=ecm 
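A minimal sketch of the Monte Carlo density estimator studied in the abstract above, for the special case of a Gaussian AR(1) chain (the model, parameters, and sample size are illustrative assumptions): the stationary density is estimated by averaging the known transition density over the simulated path.

```python
import numpy as np

rng = np.random.default_rng(1)
rho, sigma = 0.5, 1.0

def p(x, y):
    # transition density of y_{t+1} given y_t = x (Gaussian AR(1))
    return np.exp(-0.5 * ((y - rho * x) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# simulate the chain
n = 20000
x = np.empty(n); x[0] = 0.0
for t in range(n - 1):
    x[t + 1] = rho * x[t] + sigma * rng.standard_normal()

def f_hat(y):
    # look-ahead estimator: average the transition density over the sample
    return p(x, y).mean()

# true stationary density is N(0, sigma^2 / (1 - rho^2)); value at zero:
s2 = sigma**2 / (1 - rho**2)
true_at_zero = 1.0 / np.sqrt(2 * np.pi * s2)
```

Averaging the transition density, rather than smoothing the simulated points with a kernel, is what delivers the strong L1 convergence properties discussed in the abstract.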
By:  Ai Deng (Department of Economics, Boston University); Pierre Perron (Department of Economics, Boston University) 
Abstract:  We consider the CUSUM of squares test in a linear regression model with general mixing assumptions on the regressors and the errors. We derive its limit distribution and show how it depends on the nature of the error process. We suggest a corrected version that has a limit distribution free of nuisance parameters. We also discuss how it provides an improvement over the standard approach to testing for a change in the variance of a univariate time series. Simulation evidence is presented to support this. We illustrate the usefulness of our method by analyzing changes in the variance of stock returns and a variety of macroeconomic time series, as well as by testing for a change in the variance of the residuals in a typical four-variable VAR model. Our results show the widespread prevalence of changes in the variance of such series and the fact that the variability of shocks affecting the U.S. economy has decreased. 
Keywords:  Changepoint, Variance shift, Recursive residuals, Dynamic models, Conditional heteroskedasticity. 
JEL:  D80 D91 G11 E21 
Date:  2005–11 
URL:  http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005043&r=ecm 
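The flavor of a nuisance-parameter-free CUSUM of squares statistic can be sketched as follows: partial sums of squared residuals are compared to their trend and scaled by a long-run variance estimate of the squares. This is a generic sketch of the idea (Bartlett-kernel correction, bandwidth rule), not the paper's exact proposal.

```python
import numpy as np

def cusq_stat(e, bandwidth=None):
    """Sup-norm CUSUM-of-squares statistic, scaled by a Newey-West
    long-run variance of e_t^2 so the limit is nuisance-parameter free."""
    e = np.asarray(e, dtype=float)
    T = len(e)
    v = e**2 - (e**2).mean()
    L = bandwidth if bandwidth is not None else int(np.floor(4 * (T / 100) ** (2 / 9)))
    omega = v @ v / T                      # Bartlett-kernel long-run variance
    for l in range(1, L + 1):
        w = 1 - l / (L + 1)
        omega += 2 * w * (v[l:] @ v[:-l]) / T
    s = np.cumsum(e**2)
    k = np.arange(1, T + 1)
    return np.max(np.abs(s - (k / T) * s[-1])) / np.sqrt(T * omega)

rng = np.random.default_rng(5)
e = rng.standard_normal(200)
stat = cusq_stat(e)
```

Note the statistic is invariant to rescaling the residuals, as a variance-change test should be.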
By:  Schlicht, Ekkehart; Ludsteck, Johannes 
Abstract:  This paper describes an estimator for a standard state-space model with coefficients generated by a random walk that is statistically superior to the Kalman filter as applied to this particular class of models. Two closely related estimators for the variances are introduced: a maximum likelihood estimator and a moments estimator that builds on the idea of setting certain moments equal to their expectations. These estimators perform quite similarly in many cases. In some cases, however, the moments estimator is preferable both to the proposed likelihood estimator and to the Kalman filter as implemented in the software package EViews. 
JEL:  C52 C51 C22 C2 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:lmu:muenec:904&r=ecm 
By:  Victor Aguirregabiria (Department of Economics, Boston University); Pedro Mira (Centro de Estudios Monetarios y Financieros (CEMFI)) 
Abstract:  This paper proposes an algorithm to obtain maximum likelihood estimates of structural parameters in discrete games with multiple equilibria. The method combines a genetic algorithm (GA) with a pseudo maximum likelihood (PML) procedure. The GA searches efficiently over the huge space of possible combinations of equilibria in the data. The PML procedure avoids the repeated computation of equilibria for each trial value of the parameters of interest. To test the ability of this method to get maximum likelihood estimates, we present a Monte Carlo experiment in the context of a game of price competition and collusion. 
Keywords:  Empirical games, Maximum likelihood estimation, Multiple equilibria, Genetic algorithms 
JEL:  C13 C35 
Date:  2005–01 
URL:  http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005001&r=ecm 
By:  D.S. Poskitt; C.L. Skeels 
Abstract:  Poskitt and Skeels (2005) provide a new approximation to the sampling distribution of the IV estimator in a simultaneous equations model; the approximation is appropriate when the concentration parameter associated with the reduced form model is small. We present approximations to the sampling distributions of various functions of the IV estimator based upon small-concentration asymptotics, and investigate hypothesis testing procedures and confidence region construction using these approximations. We explore the relationship between our work and the K statistic of Kleibergen (2002) and demonstrate that our results can be used to explain the sampling behaviour of the K statistic in simultaneous equations models where identification is weak. 
Keywords:  simultaneous equations model, IV estimator, weak identification, weak instruments, small-concentration asymptotics 
JEL:  C10 C12 C13 C30 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:mlb:wpaper:948&r=ecm 
By:  Pierre Perron (Department of Economics, Boston University) 
Abstract:  This chapter is concerned with methodological issues related to estimation, testing and computation in the context of structural changes in linear models. A central theme of the review is the interplay between structural change and unit roots, and methods to distinguish between the two. The topics covered are: methods related to estimation and inference about break dates for single equations with or without restrictions, with extensions to multi-equation systems where allowance is also made for changes in the variability of the shocks; tests for structural changes, including tests for single or multiple changes, tests valid with unit-root or trending regressors, and tests for changes in the trend function of a series that can be integrated or trend-stationary; testing for a unit root versus trend stationarity in the presence of structural changes in the trend function; testing for cointegration in the presence of structural changes; and issues related to long memory and level shifts. Our focus is on conceptual issues about the frameworks adopted and the assumptions imposed as they relate to potential applicability. We also highlight the potential problems that can occur with methods that are commonly used and recent work that has been done to overcome them. 
Date:  2005–04 
URL:  http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005017&r=ecm 
By:  Ole E. Barndorff-Nielsen (Department of Mathematical Sciences, University of Aarhus, Ny Munkegade, DK-8000 Aarhus C, Denmark); Sven Erik Graversen (Department of Mathematical Sciences, University of Aarhus, Ny Munkegade, DK-8000 Aarhus C, Denmark); Jean Jacod (Laboratoire de Probabilités et Modèles Aléatoires (CNRS UMR 7599), Université Pierre et Marie Curie, 4 Place Jussieu, 75252 Paris Cedex 05, France); Neil Shephard (Nuffield College, Oxford) 
Abstract:  In this paper we provide an asymptotic analysis of generalised bipower measures of the variation of price processes in financial economics. These measures encompass the usual quadratic variation, power variation and bipower variations which have been highlighted in recent years in financial econometrics. The analysis is carried out under some rather general Brownian semimartingale assumptions, which allow for standard leverage effects. 
Keywords:  Bipower variation, Power variation, Quadratic variation, Semimartingales, Stochastic volatility 
Date:  2006–03–09 
URL:  http://d.repec.org/n?u=RePEc:nuf:econwp:0506&r=ecm 
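The quadratic and bipower variation measures encompassed by the abstract above can be sketched numerically (the constant-volatility Brownian path, sample size, and single added jump are illustrative assumptions): realised variance picks up a jump in full, while bipower variation is nearly unaffected by it.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10000
r = rng.standard_normal(n) / np.sqrt(n)   # Brownian increments, integrated variance = 1

mu1 = np.sqrt(2 / np.pi)                  # E|Z| for standard normal Z

def rv(r):
    # realised (quadratic) variation: sum of squared returns
    return np.sum(r**2)

def bv(r):
    # realised bipower variation: scaled sum of adjacent absolute returns
    return np.sum(np.abs(r[1:]) * np.abs(r[:-1])) / mu1**2

r_jump = r.copy()
r_jump[n // 2] += 0.5                     # add a single price jump
```

Both measures estimate the integrated variance of one on the continuous path; after the jump, rv jumps by roughly the squared jump size while bv barely moves, which is the robustness property exploited in this literature.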
By:  David S. Lee; David Card 
Abstract:  A regression discontinuity (RD) research design is appropriate for program evaluation problems in which treatment status (or the probability of treatment) depends on whether an observed covariate exceeds a fixed threshold. In many applications the treatment-determining covariate is discrete. This makes it impossible to compare outcomes for observations "just above" and "just below" the treatment threshold, and requires the researcher to choose a functional form for the relationship between the treatment variable and the outcomes of interest. We propose a simple econometric procedure to account for uncertainty in the choice of functional form for RD designs with discrete support. In particular, we model deviations of the true regression function from a given approximating function (the specification errors) as random. Conventional standard errors ignore the group structure induced by specification errors and tend to overstate the precision of the estimated program impacts. The proposed inference procedure that allows for specification error also has a natural interpretation within a Bayesian framework. 
JEL:  C1 C5 J0 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberte:0322&r=ecm 
By:  Ai Deng (Department of Economics, Boston University) 
Abstract:  This paper provides an asymptotic theory for the spurious regression analyzed by Ferson, Sarkissian and Simin (2003). The asymptotic framework developed by Nabeya and Perron (1994) is used to provide approximations for the various estimates and statistics. Also, using a fixed-bandwidth asymptotic framework, a convergent t test is constructed, following Sun (2005). These are shown to be accurate and to explain the simulation findings in Ferson et al. (2003). Monte Carlo studies show that our asymptotic distribution provides a very good finite sample approximation for sample sizes often encountered in finance. Our analysis also reveals an important potential problem in the theoretical hypothesis testing literature on predictability. A possible reconciling interpretation is provided. 
Keywords:  spurious regression, observational equivalence, Nabeya-Perron asymptotics, fixed-b asymptotics, data mining, nearly integrated, nearly white noise (NINW) 
Date:  2005–12 
URL:  http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005044&r=ecm 
By:  Ole E. Barndorff-Nielsen (Department of Mathematical Sciences, University of Aarhus, Ny Munkegade, DK-8000 Aarhus C, Denmark); Neil Shephard (Nuffield College, Oxford); Matthias Winkel (Department of Statistics, University of Oxford, 1 South Parks Road, Oxford, OX1 3TG, U.K.) 
Abstract:  In this paper we provide a systematic study of the robustness of probability limits and central limit theory for realised multipower variation when we add finite activity and infinite activity jump processes to an underlying Brownian semimartingale. 
Keywords:  Bipower variation, Infinite activity, Multipower variation, Power variation, Quadratic variation, Semimartingales, Stochastic volatility 
Date:  2006–03–09 
URL:  http://d.repec.org/n?u=RePEc:nuf:econwp:0507&r=ecm 
By:  Duo Qin (Queen Mary, University of London); Marie Anne Cagas (Asian Development Bank (ADB), and University of the Philippines); Geoffrey Ducanes (Asian Development Bank (ADB), and University of the Philippines); Nedelyn MagtibayRamos (Asian Development Bank (ADB)); Pilipinas Quising (Asian Development Bank (ADB)) 
Abstract:  This paper compares the forecast performance of the ALI method and the MESMs and seeks ways of improving the ALI method. Inflation and GDP growth form the forecast objects for comparison, using data from China, Indonesia and the Philippines. The ALI method is found to produce better forecasts than the MESMs in general, but it is also found to involve greater uncertainty in choosing indicators, mixing data frequencies and utilizing unrestricted VARs. Two possible improvements are found helpful in reducing the uncertainty: (i) give theory priority in choosing indicators and include theory-based disequilibrium shocks in the indicator sets; and (ii) reduce the VARs by means of the general→specific model reduction procedure. 
Keywords:  Dynamic factor models, Model reduction, VAR 
JEL:  E31 C53 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp554&r=ecm 
By:  Ivan Fernandez-Val (Department of Economics, Boston University) 
Abstract:  Fixed effects estimates of structural parameters in nonlinear panel models can be severely biased due to the incidental parameters problem. In this paper I show that the most important component of this incidental parameters bias for probit fixed effects estimators of index coefficients is proportional to the true parameter value, using a large-T expansion of the bias. This result allows me to derive a lower bound for this bias, and to show that fixed effects estimates of ratios of coefficients and average marginal effects have zero bias in the absence of heterogeneity and have negligible bias relative to their true values for a wide range of distributions of regressors and individual effects. Numerical examples suggest that this small bias property also holds for logit and linear probability models, and for exogenous variables in dynamic binary choice models. An empirical analysis of female labor force participation using data from the PSID shows that whereas the significant biases in fixed effects estimates of model parameters do not contaminate the estimates of marginal effects in static models, estimates of both index coefficients and marginal effects can be severely biased in dynamic models. Improved bias-corrected estimators for index coefficients and marginal effects are also proposed for both static and dynamic models. 
JEL:  C23 C25 J22 
Date:  2005–10 
URL:  http://d.repec.org/n?u=RePEc:bos:wpaper:wp200536&r=ecm 
By:  Alan Beggs; Kathryn Graddy 
Abstract:  This paper tests for reference dependence, using data from Impressionist and Contemporary Art auctions. We distinguish reference dependence based on "rule of thumb" learning from reference dependence based on "rational" learning. Furthermore, we distinguish pure reference dependence from effects due to loss aversion. Thus, we use actual market data to test essential characteristics of Kahneman and Tversky's Prospect Theory. The main methodological innovations of this paper are, first, that reference dependence can be identified separately from loss aversion; second, we introduce a consistent nonlinear estimator to deal with the measurement error problems involved in testing for loss aversion. In this dataset, we find strong reference dependence but no loss aversion. 
Keywords:  Reference Dependence, Loss Aversion, Prospect Theory, Art, Auctions 
JEL:  D81 D44 L82 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:228&r=ecm 
By:  Ai Deng (Department of Economics, Boston University); Pierre Perron (Columbia Business School) 
Abstract:  This paper considers various asymptotic approximations to the finite sample distribution of the estimate of the break date in a simple one-break model for a linear trend function that exhibits a change in slope, with or without a concurrent change in intercept. The noise component is either stationary or has an autoregressive unit root. Our main focus is on comparing the so-called "bounded-trend" and "unbounded-trend" asymptotic frameworks. Not surprisingly, the "bounded-trend" asymptotic framework is of little use when the noise component is integrated. When the noise component is stationary, we obtain the following results. If the intercept does not change and is not allowed to change in the estimation, both frameworks yield the same approximation. However, when the intercept is allowed to change, whether or not it actually changes in the data, the "bounded-trend" asymptotic framework completely misses important features of the finite sample distribution of the estimate of the break date, especially the pronounced bimodality that was uncovered by Perron and Zhu (2005) and shown to be well captured using the "unbounded-trend" asymptotic framework. Simulation experiments confirm our theoretical findings, which expose the drawbacks of using the "bounded-trend" asymptotic framework in the context of structural change models. 
Keywords:  changepoint, confidence intervals, shrinking shifts, bounded trend, level shift. 
Date:  2005–08 
URL:  http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005029&r=ecm 
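A least-squares break-date estimate of the kind studied in the abstract above can be sketched as follows. The slope-change model, the true break date, and the trimming rule are illustrative assumptions; the estimator simply minimizes the sum of squared residuals over candidate break dates.

```python
import numpy as np

rng = np.random.default_rng(3)
T, Tb, delta = 100, 50, 0.5
t = np.arange(1, T + 1, dtype=float)
# linear trend with a slope change of delta at date Tb, stationary noise
y = 1.0 + 0.3 * t + delta * np.maximum(t - Tb, 0) + rng.standard_normal(T)

def estimate_break(y, t, trim=10):
    """Grid search: fit the broken-trend regression at each candidate date
    and return the date minimizing the sum of squared residuals."""
    best, best_ssr = None, np.inf
    for k in range(trim, len(t) - trim):
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - t[k], 0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        ssr = ((y - X @ beta) ** 2).sum()
        if ssr < best_ssr:
            best, best_ssr = t[k], ssr
    return best

tb_hat = estimate_break(y, t)
```

With a clear slope change relative to the noise, the estimated date lands very close to the true one; the interesting distribution theory in the paper concerns exactly how this estimate behaves when the change is small or the noise is integrated.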
By:  Pierre Perron (Department of Economics, Boston University); Tomoyoshi Yabu (Department of Economics, Boston University) 
Keywords:  structural change, unit root, median-unbiased estimates, GLS procedure, super-efficient estimates 
JEL:  C22 
Date:  2005–07 
URL:  http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005026&r=ecm 
By:  Pierre Perron (Department of Economics, Boston University); Tomoyoshi Yabu (Department of Economics, Boston University) 
Keywords:  linear trend, unit root, median-unbiased estimates, GLS procedure, super-efficient estimates 
JEL:  C22 
Date:  2004–10 
URL:  http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005035&r=ecm 
By:  Jeremy Large (Nuffield College, Oxford) 
Abstract:  Financial assets' quoted prices normally change through frequent revisions, or jumps. For markets where quotes are almost always revised by the minimum price tick, this paper proposes a new estimator of Quadratic Variation which is robust to microstructure effects. It compares the number of alternations, where quotes are revised back to their previous price, to the number of other jumps. Many markets exhibit a lack of autocorrelation in their quotes' alternation pattern. Under quite general 'no leverage' assumptions, whenever this is so the proposed statistic is consistent as the intensity of jumps increases without bound. After an empirical implementation, some useful corollaries of this are given. 
JEL:  C10 C22 C80 
Date:  2006–03–09 
URL:  http://d.repec.org/n?u=RePEc:nuf:econwp:0505&r=ecm 
By:  Christophe Van Nieuwenhuyze (National Bank of Belgium, Research Department) 
Abstract:  This paper aims to extract the common variation in a data set of 509 conjunctural series as an indication of the Belgian business cycle. The data set contains information on business and consumer surveys of Belgium and its neighbouring countries, macroeconomic variables and some worldwide watched indicators such as the ISM and the OECD confidence indicators. The statistical framework used is the One-sided Generalised Dynamic Factor Model developed by Forni, Hallin, Lippi and Reichlin (2005). The model splits the series into a common component, driven by the business cycle, and an idiosyncratic component. Well-known indicators such as the EC economic sentiment indicator for Belgium and the NBB overall synthetic curve contain a high amount of business cycle information. Furthermore, the richness of the model allows us to determine the cyclical properties of the series and to forecast GDP growth, all within the same unified setting. We classify the common component of the variables into leading, lagging and coincident with respect to the common component of quarter-on-quarter GDP growth. 22% of the variables are found to be leading. Amongst the most leading variables we find asset prices and international confidence indicators such as the ISM and some OECD indicators. In general, national business confidence surveys are found to coincide with Belgian GDP, while they lead euro area GDP and its confidence indicators. Consumer confidence seems to lag. Although the model captures the dynamic common variation contained in the data set, forecasts based on that information are insufficient to deliver a good proxy for GDP growth, as a result of a non-negligible idiosyncratic part in GDP's variance. Lastly, we explore the dependence of the model's results on the data set and show through a data reduction process that the idiosyncratic part of GDP's quarter-on-quarter growth can be dramatically reduced. However, this does not improve the forecasts. 
Keywords:  Dynamic factor model, business cycle, leading indicators, forecasting, data reduction. 
JEL:  C33 C43 E32 E37 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:nbb:reswpp:2006032&r=ecm 
By:  Jones B.; Goos P. 
Abstract:  We introduce a new method for generating optimal split-plot designs. These designs are optimal in the sense that they are efficient for estimating the fixed effects of the statistical model that is appropriate given the split-plot design structure. One advantage of the method is that it does not require the prior specification of a candidate set. This makes the production of split-plot designs computationally feasible in situations where the candidate set is too large to be tractable. The method allows for flexible choice of the sample size and supports inclusion of both continuous and categorical factors. The model can be any linear regression model and may include arbitrary polynomial terms in the continuous factors and interaction terms of any order. We demonstrate the usefulness of this flexibility with a 100-run polypropylene experiment involving 11 factors where we found a design that is substantially more efficient than designs produced using other approaches. 
Date:  2006–02 
URL:  http://d.repec.org/n?u=RePEc:ant:wpaper:2006006&r=ecm 
By:  Neil Shephard; Ole E. Barndorff-Nielsen 
Abstract:  We will review the econometrics of nonparametric estimation of the components of the variation of asset prices. This very active literature has been stimulated by the recent advent of complete records of transaction prices, quote data and order books. In our view the interaction of the new data sources with new econometric methodology is leading to a paradigm shift in one of the most important areas in econometrics: volatility measurement, modelling and forecasting. We will describe this new paradigm which draws together econometrics with arbitrage free financial economics theory. Perhaps the two most influential papers in this area have been Andersen, Bollerslev, Diebold and Labys (2001) and Barndorff-Nielsen and Shephard (2002), but many other papers have made important contributions. This work is likely to have deep impacts on the econometrics of asset allocation and risk management. One of our observations will be that inferences based on these methods, computed from observed market prices and so under the physical measure, are also valid as inferences under all equivalent measures. This puts this subject also at the heart of the econometrics of derivative pricing. One of the most challenging problems in this context is dealing with various forms of market frictions, which obscure the efficient price from the econometrician. Here we will characterise four types of statistical models of frictions and discuss how econometricians have been attempting to overcome them. 
Keywords:  Quadratic Variation, Volatility, Realised Volatility 
JEL:  C14 C22 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:240&r=ecm 
By:  Michael P. Keane (Department of Economics, Yale University); Kenneth I. Wolpin (Department of Economics, University of Pennsylvania) 
Abstract:  Opportunities for external validation of behavioral models in the social sciences based on randomized social experiments, or on large regime shifts that can be treated as experiments for the purpose of model validation, are extremely rare. In this paper, we consider an alternative approach, namely mimicking the essential element of regime change by non-randomly holding out from estimation a portion of the sample that faces a significantly different policy regime. The non-random holdout sample is used for model validation/selection. We illustrate the non-random holdout sample approach to model validation in the context of a model of welfare program participation. The policy heterogeneity that we exploit to generate a non-random holdout sample takes advantage of the wide variation across states that has existed in welfare policy. 
Keywords:  Model validation, Holdout sample, Public welfare 
JEL:  C52 C53 J1 J2 
Date:  2006–05–01 
URL:  http://d.repec.org/n?u=RePEc:pen:papers:06006&r=ecm 
By:  Guillaume Chevillon 
Abstract:  To forecast several, say h, periods into the future, a modeller faces a choice between two techniques: iterating one-step-ahead forecasts (the IMS technique) or directly modelling the relation between observations separated by an h-period interval and using it for forecasting (the DMS technique). It is known that unit-root nonstationarity and residual autocorrelation benefit DMS accuracy in finite samples. We analyze here the effect of structural breaks as observed in unstable economies, and show that the benefits of DMS stem from its better appraisal of the dynamic relationships of interest for forecasting. It thus acts in between congruent modelling and intercept correction. We apply our results to forecasting the South African GDP over the last thirty years, as this economy exhibits significant instability. We analyze the forecasting properties of 31 competing models. We find that the GDP of South Africa is best forecast, 4 quarters ahead, using direct multi-step techniques, in line with our theoretical results. 
Keywords:  Multi-step Forecasting, Structural Breaks, South Africa 
JEL:  C32 C53 E3 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:257&r=ecm 
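The IMS/DMS distinction in the abstract above can be sketched for an AR(1) (the model, parameters, and forecast horizon are illustrative assumptions): IMS fits the one-step model and powers up the coefficient, while DMS regresses the series directly on its h-period lag.

```python
import numpy as np

rng = np.random.default_rng(6)
phi, n, h = 0.8, 5000, 4
y = np.empty(n); y[0] = 0.0
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.standard_normal()

# IMS: estimate the one-step AR(1) coefficient and iterate it h times
phi_hat = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
f_ims = phi_hat**h * y[-1]

# DMS: regress y_t directly on y_{t-h} and forecast in one step
gamma_hat = (y[:-h] @ y[h:]) / (y[:-h] @ y[:-h])
f_dms = gamma_hat * y[-1]
```

In this correctly specified, break-free setting the DMS coefficient converges to phi^h and the two forecasts nearly coincide; the paper's point is that under structural breaks or misspecified dynamics the two can differ substantially, often in DMS's favour.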
By:  O.T. Henry; S. Suardi 
Abstract:  Empirical evidence documents a level effect in the volatility of short-term interest rates: volatility is positively correlated with the level of the short-term interest rate. Using Monte Carlo simulations, this paper examines the performance of the commonly used Engle-Ng (1993) tests, which differentiate the effect of good and bad news on the predictability of future short-rate volatility. Our results show that the tests exhibit serious size distortions and loss of power in the face of a neglected level effect. 
Keywords:  Level Effects; Asymmetry; Engle-Ng Tests 
JEL:  C12 G12 E44 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:mlb:wpaper:945&r=ecm 
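The simplest of the Engle-Ng diagnostics, the sign bias test, can be sketched as below. The asymmetric DGP (negative shocks raising next-period variance) and the plain OLS t statistic are illustrative assumptions; the full Engle-Ng battery also includes negative and positive size bias regressors and a joint test.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5000
e = np.empty(n); e[0] = rng.standard_normal()
for t in range(1, n):
    sig2 = 1.0 + 0.8 * (e[t - 1] < 0)        # bad news raises variance
    e[t] = np.sqrt(sig2) * rng.standard_normal()

def sign_bias_test(e):
    """Sign bias regression: e_t^2 on a constant and 1{e_{t-1} < 0};
    returns the slope and its OLS t statistic."""
    y = e[1:] ** 2
    X = np.column_stack([np.ones(len(y)), (e[:-1] < 0).astype(float)])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    s2 = resid @ resid / (len(y) - 2)
    var_b = s2 * np.linalg.inv(X.T @ X)
    return b[1], b[1] / np.sqrt(var_b[1, 1])

slope, t_sb = sign_bias_test(e)
```

The paper's warning is that when volatility depends on the level of the rate and this is ignored, tests of this form over- or under-reject.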
By:  John Stachurski 
Abstract:  This note considers finite-state Markov chains with overlapping supports. While the overlapping supports condition is known to be necessary and sufficient for stability of these chains, the result is typically presented in a more general context. As such, one objective of the note is to provide an exposition, along with simple proofs, for the finite case. Second, the note provides an additional equivalent condition which should be useful in applications. 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:mlb:wpaper:951&r=ecm 
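For a finite chain, the overlapping supports condition in the abstract above is easy to check directly, and stability can be seen by iterating the transition matrix from different initial distributions. The example matrix below is an illustrative assumption.

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.6, 0.4]])

def supports_overlap(P):
    """True if every pair of rows places positive probability on some
    common successor state."""
    S = (P > 0).astype(int)
    return bool(np.all(S @ S.T > 0))      # entry (i,j): # shared support states

def iterate_dist(P, mu0, iters=500):
    # push an initial distribution through the chain many times
    mu = np.asarray(mu0, dtype=float)
    for _ in range(iters):
        mu = mu @ P
    return mu
```

When supports overlap, iterates from any two initial distributions converge to the same stationary distribution, which is the stability property the note characterises; the identity matrix, whose rows have pairwise disjoint supports, fails the check.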
By:  Duo Qin (Queen Mary, University of London); Marie Anne Cagas (Asian Development Bank (ADB)); Geoffrey Ducanes (Asian Development Bank (ADB)); Xinhua He (Institute of World Economics & Politics (IWEP), Chinese Academy of Social Sciences (CASS)); Rui Liu (Institute of World Economics & Politics (IWEP), Chinese Academy of Social Sciences (CASS)); Shiguo Liu (Institute of World Economics & Politics (IWEP), Chinese Academy of Social Sciences (CASS)); Nedelyn MagtibayRamos (Asian Development Bank (ADB)); Pilipinas Quising (Asian Development Bank (ADB)) 
Abstract:  This paper describes a quarterly macroeconometric model of the Chinese economy. The model comprises household consumption, investment, government, trade, production, prices, money, and employment blocks. The equilibriumcorrection form is used for all the behavioral equations and the general→simple dynamic specification approach is adopted. Great efforts have been made to achieve the best possible blend of standard longrun theories, countryspecific institutional features and shortrun dynamics in data. The tracking performance of the model is evaluated. Forecasting and empirical investigation of a number of topical macroeconomic issues utilizing model simulations have shown the model to be immensely useful. 
Keywords:  Macroeconometric model, Chinese economy, Forecasts, Simulations 
JEL:  C51 E17 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp553&r=ecm 
By:  Duo Qin (Queen Mary, University of London) 
Abstract:  This paper examines the rise of the VAR approach from a historical perspective. It shows that the VAR approach arises as a systematic solution to the issue of ‘model choice’ bypassed by Cowles Commission (CC) researchers, and that the approach essentially inherits and enhances the CC legacy rather than abandons or opposes it. It argues that the approach is not so atheoretical as widely believed and that it helps reform econometrics by shifting research focus from measurement of given theories to identification/verification of datacoherent theories, and hence from confirmatory analysis to a mixture of confirmatory and exploratory analysis. 
Keywords:  VAR, Macroeconometrics, Methodology, Rational expectations, Structural model 
JEL:  B23 B40 C10 C30 C50 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp557&r=ecm 
By:  M. Hashem Pesaran; Ron Smith 
Abstract:  This paper provides a synthesis and further development of a global modelling approach introduced in Pesaran, Schuermann and Weiner (2004), where country-specific models in the form of VARX* structures are estimated relating a vector of domestic variables, xit, to their foreign counterparts, x*it, and then consistently combined to form a Global VAR (GVAR). It is shown that the VARX* models can be derived as the solution to a dynamic stochastic general equilibrium (DSGE) model where over-identifying long-run theoretical relations can be tested and imposed if acceptable. This gives the system a transparent long-run theoretical structure. Similarly, short-run over-identifying theoretical restrictions can be tested and imposed if accepted. Alternatively, if one has less confidence in the short-run theory, the dynamics can be left unrestricted. The assumption of the weak exogeneity of the foreign variables for the long-run parameters can be tested, and the x*it variables can be interpreted as proxies for global factors. Rather than using deviations from ad hoc statistical trends, the equilibrium values of the variables reflecting the long-run theory embodied in the model can be calculated. This approach has been used in a wide variety of contexts and for a wide variety of purposes. The paper also provides some new results. 
Keywords:  Global VAR (GVAR), DSGE models, VARX* 
JEL:  C32 E17 F42 
URL:  http://d.repec.org/n?u=RePEc:scp:wpaper:0643&r=ecm 