nep-ecm New Economics Papers
on Econometrics
Issue of 2005‒02‒01
fifteen papers chosen by
Sune Karlsson
Stockholm School of Economics

  1. Copula Based Monte Carlo Integration in Financial Problems By Alessio Sancetta
  2. Empirical Exchange Rate Models of the Nineties: Are Any Fit to Survive? By Yin-Wong Cheung; Menzie Chinn; Antonio Garcia Pascual
  3. Covariate selection for non-parametric estimation of treatment effects By de Luna, Xavier; Waernbaum, Ingeborg
  4. New Simple Tests for Panel Cointegration By Westerlund, Joakim
  5. Pooled Unit Root Tests in Panels with a Common Factor By Westerlund, Joakim
  6. Panel Cointegration Tests of the Fisher Hypothesis By Westerlund, Joakim
  7. Testing for Error Correction in Panel Data By Westerlund, Joakim
  8. Testing for Panel Cointegration with Multiple Structural Breaks By Westerlund, Joakim
  9. Estimation of a Binary Choice Model with Grouped Choice Data By Kurkalova, Lyubov; Rabotyagov, Sergey
  10. Where Are We Now? Real-Time Estimates of the Macro Economy By Martin D.D. Evans
  11. Practical Volatility and Correlation Modeling for Financial Market Risk Management By Torben G. Andersen; Tim Bollerslev; Peter F. Christoffersen; Francis X. Diebold
  12. Measurement bias due to response styles: a structural equation model assessing the effects of modes of data collection By Weijters; Schillewaert, N.; Geuens, M.
  13. Nonparametric tests of optimizing behavior in public service provision: Methodology and an application to local safety By Cherchye L.; De Borger B.; Van Puyenbroeck T.
  14. Covariate Measurement Error in Endogenous Stratified Samples By Esmeralda Ramalho
  15. Binary models with misclassification in the variable of interest By Esmeralda Ramalho

  1. By: Alessio Sancetta
    Abstract: A computational technique that transforms integrals over R^K, or some of its subsets, into integrals over the hypercube [0, 1]^K can be exploited in order to solve integrals via Monte Carlo integration without the need to simulate from the original distribution; all that is needed is to simulate iid uniform [0, 1] pseudo-random variables. In particular, the technique arises from the copula representation of multivariate distributions and the use of the marginal quantile function of the data. The procedure is further simplified if the quantile function has a closed form. Several financial applications are considered in order to highlight the scope of this numerical technique for financial problems.
    Keywords: Copula, Martingale, Monte Carlo Integral, Quantile Transform, Utility Function.
    JEL: C15 G11 G12
    Date: 2005–01
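    The quantile-transform device described in this abstract can be illustrated with a minimal sketch (an illustration of the general idea, not the paper's procedure): since E[f(X)] = ∫₀¹ f(Q(u)) du for a marginal quantile function Q, iid uniform draws are all that is needed. Here the closed-form-style normal quantile from Python's standard library plays the role of Q.

```python
import math
import random
from statistics import NormalDist

def mc_expectation(f, quantile, n=200_000, seed=42):
    """Monte Carlo estimate of E[f(X)] via the quantile transform:
    E[f(X)] = integral over [0,1] of f(Q(u)) du, so only iid
    uniform(0,1) draws are needed, never draws from X itself."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = rng.random()          # iid uniform pseudo-random variable
        total += f(quantile(u))   # map through the marginal quantile
    return total / n

# Example: E[exp(X)] for X ~ N(0,1); the exact value is exp(1/2) ~ 1.6487.
q = NormalDist().inv_cdf          # normal quantile, available in closed-form style
est = mc_expectation(math.exp, q)
```

    With 200,000 uniform draws the estimate lands close to exp(1/2); for a multivariate problem the same idea applies coordinate-wise through the copula.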
  2. By: Yin-Wong Cheung (University of California, Santa Cruz); Menzie Chinn (University of Wisconsin, Madison); Antonio Garcia Pascual (International Monetary Fund)
    Abstract: Previous assessments of forecasting performance of exchange rate models have focused upon a narrow set of models, typically of the 1970s vintage. The canonical papers in this literature are by Meese and Rogoff (1983, 1988), who examined monetary and portfolio balance models. Succeeding works by Mark (1995) and Chinn and Meese (1995) focused on similar models. In this paper we re-assess exchange rate prediction using a wider set of models that have been proposed in the last decade: interest rate parity, productivity based models, and a composite specification incorporating the real interest differential, portfolio balance and nontradables price channels. The performance of these models is compared against two reference specifications: purchasing power parity and the Dornbusch-Frankel sticky-price monetary model. The models are estimated in error correction and first-difference specifications. Rather than estimating the cointegrating vector over the entire sample and treating it as part of the ex ante information set, as is commonly done in the literature, we also update the cointegrating vector, thereby generating true ex ante forecasts. We examine model performance at various forecast horizons (1 quarter, 4 quarters, 20 quarters) using differing metrics (mean squared error, direction of change), as well as the "consistency" test of Cheung and Chinn (1998). No model consistently outperforms a random walk by a mean squared error measure; however, along a direction-of-change dimension, certain structural models do outperform a random walk with statistical significance. Moreover, one finds that these forecasts are cointegrated with the actual values of exchange rates, although in a large number of cases the elasticity of the forecasts with respect to the actual values is different from unity. Overall, model/specification/currency combinations that work well in one period will not necessarily work well in another period.
    Keywords: exchange rates, monetary model, productivity, interest rate parity, purchasing power parity, forecasting performance
    Date: 2003–06–01
  3. By: de Luna, Xavier (Umeå University); Waernbaum, Ingeborg (Umeå University)
    Abstract: In observational studies, the non-parametric estimation of a binary treatment effect is often performed by matching each treated individual with a control unit which is similar in observed characteristics (covariates). In practical applications, the reservoir of covariates available may be extensive and the question arises which covariates should be matched for. The current practice consists of matching for covariates which are not balanced for the treated and the control groups, i.e. covariates affecting the treatment assignment. This paper develops a theory based on graphical models, whose results emphasize the need for methods looking both at how the covariates affect the treatment assignment and the outcome. Furthermore, we propose identification algorithms to select a minimal set of covariates to match for. An application to the estimation of the effect of a social program is used to illustrate the implementation of such algorithms.
    Keywords: Graphical models; matching estimators; observational studies; potential outcomes; social programs
    JEL: C14
    Date: 2005–01–25
  4. By: Westerlund, Joakim (Department of Economics, Lund University)
    Abstract: We propose two new simple residual-based panel data tests for the null of no cointegration. The tests are simple because they do not require any correction for the temporal dependencies of the data. Yet they are able to accommodate individual specific short-run dynamics, individual specific intercept and trend terms, as well as individual specific slope parameters. A third test that is modified to accommodate for cross-sectionally dependent data is also proposed. We derive the limiting distributions of the tests and show that they are free of nuisance parameters. Our Monte Carlo results suggest that the asymptotic results are borne out well even in very small samples.
    Keywords: Panel Cointegration; Residual-Based Tests; Cross-Sectional Dependence; Monte Carlo Simulation.
    JEL: C12 C31 C33
    Date: 2005–01–26
  5. By: Westerlund, Joakim (Department of Economics, Lund University)
    Abstract: This paper proposes new pooled panel unit root tests that are appropriate when the data exhibit cross-sectional dependence that is generated by a single common factor. Using sequential limit arguments, we show that the tests have a limiting normal distribution that is free of nuisance parameters and that they are unbiased against heterogeneous local alternatives. Our Monte Carlo results indicate that the tests perform well in comparison to other popular tests that also presume a common factor structure for the cross-sectional dependence.
    Keywords: Pooled Unit Root Tests; Panel Data; Common Factor; Cross-Sectional Dependence; Monte Carlo Simulation.
    JEL: C12 C31 C33
    Date: 2005–01–26
  6. By: Westerlund, Joakim (Department of Economics, Lund University)
    Abstract: Recent empirical studies suggest that the Fisher hypothesis, stating that inflation and nominal interest rates should cointegrate with a unit parameter on inflation, does not hold, a finding at odds with many theoretical models. This paper argues that these results can be explained in part by the low power inherent in univariate cointegration tests and that the use of panel data should generate more powerful tests. In doing so, we propose two new panel cointegration tests, which are shown by simulation to be more powerful than other existing tests. Applying these tests to a panel of monthly data covering the period 1980:1 to 1999:12 on 14 OECD countries, we find evidence supportive of the Fisher hypothesis.
    Keywords: Fisher Hypothesis; Residual-Based Panel Cointegration Test; Monte Carlo Simulation.
    JEL: C12 C15 C32 C33 E40
    Date: 2005–01–26
  7. By: Westerlund, Joakim (Department of Economics, Lund University)
    Abstract: This paper proposes four new tests for the null hypothesis of no cointegration in panel data that are based on the error correction parameter in a conditional error correction model. The limit distributions of the test statistics are derived and critical values are provided. Our Monte Carlo results suggest that the tests have reasonable size properties and good power relative to other popular residual-based cointegration tests. These differences arise because the latter impose a possibly invalid common factor restriction. In our empirical application, we present evidence suggesting that international health care expenditures and GDP are cointegrated once the possibility of an invalid common factor restriction has been accounted for.
    Keywords: Panel Cointegration Test; Monte Carlo Simulation; Common Factor Restriction; International Health Care Expenditures.
    JEL: C12 C32 C33 O30
    Date: 2005–01–26
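    The logic of basing a test on the error correction parameter can be sketched in a single-unit, pure-Python version (a simplified illustration, not the panel statistics proposed in the paper): regress the change in y on a constant, lagged y, lagged x, and the change in x; under cointegration the coefficient on lagged y is negative, pulling y back toward equilibrium.

```python
import random

def ols(X, y):
    """OLS via normal equations, solved by Gaussian elimination with pivoting."""
    k, n = len(X[0]), len(X)
    A = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    c = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        c[col], c[piv] = c[piv], c[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for cc in range(col, k):
                A[r][cc] -= f * A[col][cc]
            c[r] -= f * c[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):            # back substitution
        beta[r] = (c[r] - sum(A[r][cc] * beta[cc] for cc in range(r + 1, k))) / A[r][r]
    return beta

# Simulated cointegrated pair: x is a random walk, y = x + a stationary AR(1)
# error, so the true error-correction coefficient is -0.5.
random.seed(1)
T = 500
x, u = [0.0], [0.0]
for t in range(1, T):
    x.append(x[-1] + random.gauss(0, 1))
    u.append(0.5 * u[-1] + random.gauss(0, 1))
y = [xi + ui for xi, ui in zip(x, u)]

rows = [[1.0, y[t - 1], x[t - 1], x[t] - x[t - 1]] for t in range(1, T)]
dy = [y[t] - y[t - 1] for t in range(1, T)]
alpha = ols(rows, dy)[1]                      # error-correction coefficient
```

    A significantly negative alpha is evidence of error correction, and hence of cointegration; the tests in the paper are panel analogues with properly derived critical values.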
  8. By: Westerlund, Joakim (Department of Economics, Lund University)
    Abstract: This paper proposes an LM test for the null hypothesis of cointegration that allows for the possibility of multiple structural breaks in both the level and trend of a cointegrated panel regression. The test is general enough to allow for endogenous regressors, serial correlation and an unknown number of breaks that may be located at different dates for different individuals. We derive the limiting distribution of the test and conduct a small Monte Carlo study to investigate its finite sample properties. In our empirical application to the Feldstein-Horioka Puzzle, we find evidence of cointegration between saving and investment once a level break is accommodated.
    Keywords: Panel Cointegration; Residual-Based Cointegration Test; Structural Break; Monte Carlo Simulation; Feldstein-Horioka Puzzle.
    JEL: C12 C32 C33 F21
    Date: 2005–01–26
  9. By: Kurkalova, Lyubov; Rabotyagov, Sergey
    Abstract: We propose an econometric technique for estimating the parameters of a binary choice model when only aggregated data are available on the choices made. The method performs favorably in applications to both simulated and real world choice data.
    Date: 2005–01–24
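    One classical special case of this problem is the grouped logit: when only cell-level counts of choices are observed, the binomial likelihood still identifies the individual-level parameters. The sketch below fits such a model by Newton-Raphson (a textbook illustration under assumed data, not necessarily the authors' estimator).

```python
import math
import random

def fit_grouped_logit(xs, ns, ys, iters=25):
    """Newton-Raphson MLE of a logit model observed only as group counts.
    xs: group covariate values; ns: trials per group; ys: successes per group."""
    a, b = 0.0, 0.0
    for _ in range(iters):
        ga = gb = h11 = h12 = h22 = 0.0
        for x, n, y in zip(xs, ns, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            w = n * p * (1.0 - p)              # information weight
            ga += y - n * p                    # score w.r.t. intercept
            gb += x * (y - n * p)              # score w.r.t. slope
            h11 += w; h12 += w * x; h22 += w * x * x
        det = h11 * h22 - h12 * h12
        a += (h22 * ga - h12 * gb) / det       # Newton step: information^-1 * score
        b += (h11 * gb - h12 * ga) / det
    return a, b

# Simulate aggregated choice data from a known logit and recover it.
random.seed(0)
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ns = [2000] * 5
true_a, true_b = 0.3, 0.8
ys = [sum(random.random() < 1.0 / (1.0 + math.exp(-(true_a + true_b * x)))
          for _ in range(n)) for x, n in zip(xs, ns)]
a_hat, b_hat = fit_grouped_logit(xs, ns, ys)
```

    Because the grouped-logit log-likelihood is concave, Newton-Raphson converges reliably from a zero start.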
  10. By: Martin D.D. Evans
    Abstract: This paper describes a method for calculating daily real-time estimates of the current state of the U.S. economy. The estimates are computed from data on scheduled U.S. macroeconomic announcements using an econometric model that allows for variable reporting lags, temporal aggregation, and other complications in the data. The model can be applied to find real-time estimates of GDP, inflation, unemployment or any other macroeconomic variable of interest. In this paper I focus on the problem of estimating the current level of and growth rate in GDP. I construct daily real-time estimates of GDP that incorporate public information known on the day in question. The real-time estimates produced by the model are uniquely suited to studying how perceived developments in the macro economy are linked to asset prices over a wide range of frequencies. The estimates also provide, for the first time, daily time series that can be used in practical policy decisions.
    JEL: E3 C3
    Date: 2005–01
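    The filtering problem described above can be caricatured by a scalar state-space model: a persistent daily state observed, with noise, only on scheduled announcement days, with a Kalman filter delivering a daily real-time estimate (a toy illustration of the logic; Evans's actual model additionally handles variable reporting lags and temporal aggregation).

```python
import random

random.seed(2)
phi, q, r = 0.95, 0.1 ** 2, 0.2 ** 2          # AR(1) state; state and obs noise variances
T = 1000

# Latent daily state, with noisy "announcements" every 21 days.
g = [0.0]
for t in range(1, T):
    g.append(phi * g[-1] + random.gauss(0, 0.1))
obs = {t: g[t] + random.gauss(0, 0.2) for t in range(0, T, 21)}

m, P = 0.0, 1.0                                # prior mean and variance
est = []
for t in range(T):
    m, P = phi * m, phi * phi * P + q          # predict one day ahead
    if t in obs:                               # update only on announcement days
        K = P / (P + r)                        # Kalman gain
        m += K * (obs[t] - m)
        P *= 1.0 - K
    est.append(m)                              # daily real-time estimate
```

    Between announcements the estimate decays toward the unconditional mean while its variance grows; each announcement snaps it back, mimicking daily real-time updating.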
  11. By: Torben G. Andersen; Tim Bollerslev; Peter F. Christoffersen; Francis X. Diebold
    Abstract: What do academics have to offer market risk management practitioners in financial institutions? Current industry practice largely follows one of two extremely restrictive approaches: historical simulation or RiskMetrics. In contrast, we favor flexible methods based on recent developments in financial econometrics, which are likely to produce more accurate assessments of market risk. Clearly, the demands of real-world risk management in financial institutions (in particular, real-time risk tracking in very high-dimensional situations) impose strict limits on model complexity. Hence we stress parsimonious models that are easily estimated, and we discuss a variety of practical approaches for high-dimensional covariance matrix modeling, along with what we see as some of the pitfalls and problems in current practice. In so doing we hope to encourage further dialog between the academic and practitioner communities, hopefully stimulating the development of improved market risk management technologies that draw on the best of both worlds.
    JEL: G1
    Date: 2005–01
  12. By: Weijters; Schillewaert, N.; Geuens, M.
    Abstract: This paper validates measures of response styles as latent variables using structural equation modeling. Next to measurement validation, the main objective is to assess whether different modes of data collection bring along measurement bias due to response styles. Results indicate that Internet panel and telephone survey respondents do not show a higher yeah-saying tendency than people responding to a postal mail survey. Participants in web panel surveys also use the range of rating scales similarly to postal mail participants. Telephone survey respondents use a wider range of rating scale options, which may be due to primacy and recency effects of the response options. Internet pop-up surveys seem to lead to more yeah-saying, while respondents also use a narrower range of the rating scale.
    Keywords: Note
    Date: 2004–11–15
  13. By: Cherchye L.; De Borger B.; Van Puyenbroeck T.
    Abstract: We develop a positive non-parametric model of public sector production that allows us to test whether an implicit procedure of cost minimization at shadow prices can rationalize the outcomes of public sector activities. The basic model focuses on multiple C-outputs and does not imply any explicit or implicit assumption regarding the trade-offs between the different inputs (in terms of relative shadow prices) or outputs (in terms of relative valuation). The proposed methodology is applied to a cross-section sample of 546 Belgian municipal police forces. Drawing on detailed task-allocation data and controlling, among other things, for the presence of state police forces, the cost minimization hypothesis is found to provide a good fit of the data. Imposing additional structure on output valuation, derived from available ordinal information, yields equally convincing goodness-of-fit results. By contrast, we find that aggregating the labor input over task specializations, a common practice in efficiency assessments of police departments, entails a significantly worse fit of the data.
    Date: 2005–01
  14. By: Esmeralda Ramalho (Department of Economics, University of Évora)
    Abstract: In this paper we propose a general framework to deal with the presence of covariate measurement error in endogenous stratified samples. Using Chesher's (2000) methodology, we develop approximately consistent estimators for the parameters of the structural model, in the sense that their inconsistency is of smaller order than that of the conventional estimators which ignore the existence of covariate measurement error. The approximate bias corrected estimators are obtained by applying the generalized method of moments (GMM) to a modified version of the moment indicators suggested by Imbens and Lancaster (1996) for endogenous stratified samples. Only the specification of the conditional distribution of the response variable given the latent covariates and the classical additive measurement error model assumption are required, the availability of information on both the marginal probability of the strata in the population and the variance of the measurement error not being essential. A score test to detect the presence of covariate measurement error arises as a by-product of this approach. Monte Carlo evidence is presented which suggests that, in endogenous stratified samples of moderate sizes, the modified GMM estimators perform well.
    Keywords: endogenous stratified samples, covariate measurement error, generalized method of moments estimation, score tests
    JEL: C51 C52
    Date: 2004
  15. By: Esmeralda Ramalho (Department of Economics, University of Évora)
    Abstract: In this paper we propose a general framework to deal with datasets where a binary outcome is subject to misclassification and, for some sampling units, neither the error-prone variable of interest nor the covariates are recorded. A model to describe the observed data is formalized and efficient likelihood-based generalized method of moments (GMM) estimators are suggested. These estimators merely require the formulation of the conditional distribution of the latent outcome given the covariates. The conditional probabilities which describe the error and the nonresponse mechanisms are estimated simultaneously with the parameters of interest. In a small Monte Carlo simulation study our GMM estimators revealed a very promising performance.
    Keywords: nonignorable nonresponse, misclassification, generalized method of moments estimation.
    JEL: C51 C52
    Date: 2004
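    The structure of a binary model with a misclassified outcome can be written down directly: if the latent choice probability is F(x'b) and a0, a1 denote the two misclassification rates, the observed-data probability mixes them. The sketch below uses a generic formulation common in this literature, not necessarily the exact model of the paper.

```python
import math

def observed_prob(xb, a0, a1):
    """P(observed y = 1 | x) when a latent logit outcome is misclassified.
    a0 = P(report 1 | true y = 0), a1 = P(report 0 | true y = 1)."""
    p = 1.0 / (1.0 + math.exp(-xb))            # latent choice probability F(x'b)
    return a0 * (1.0 - p) + (1.0 - a1) * p

# With no misclassification the observed and latent probabilities coincide;
# with a0 = 0.1, a1 = 0.2 the observed probability is squeezed into [0.1, 0.8].
p_clean = observed_prob(0.0, 0.0, 0.0)
p_noisy = observed_prob(50.0, 0.1, 0.2)
```

    Because misclassification compresses the observed probabilities away from 0 and 1, ignoring it attenuates the estimated coefficients, which is what motivates estimating the error rates jointly with the parameters of interest.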

This nep-ecm issue is ©2005 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.