nep-ecm New Economics Papers
on Econometrics
Issue of 2006‒06‒24
fourteen papers chosen by
Sune Karlsson
Örebro University

  1. Asymptotic Properties for a Class of Partially Identified Models By Beresteanu, Arie; Molinari, Francesca
  2. Asymptotics and Consistent Bootstraps for DEA Estimators in Non-parametric Frontier Models By Alois Kneip; Léopold Simar; Paul W. Wilson
  3. Forecasting Canadian Time Series with the New Keynesian Model By Ali Dib; Mohamed Gammoudi; Kevin Moran
  4. Forecasting Commodity Prices: GARCH, Jumps, and Mean Reversion By Jean-Thomas Bernard; Lynda Khalaf; Maral Kichian; Sebastien McMahon
  5. Towards New Empirical Versions of Financial and Accounting Models Corrected for Measurement Errors By François-Éric Racicot; Raymond Théoret; Alain Coen
  6. Parametric Binary Choice Models By LECHNER, Michael; LOLLIVIER, Stefan; MAGNAC, Thierry
  7. Time Series Analysis By Francis X. Diebold; Lutz Kilian; Marc Nerlove
  8. Estimating the finite population total under frame imperfections and nonresponse By Ängsved, Marianne
  9. Survival on the Titanic: Illustrating Wald and LM Tests for Proportions and Logits By Robert Dixon; William Griffiths
  10. A New Approach Based on Cumulants for Estimating Financial Regression Models with Errors in the Variables: the Fama and French Model Revisited By Alain Coen; François-Éric Racicot; Raymond Théoret
  11. On Selection of Components for a Diffusion Index Model: It's not the Size, It's How You Use It By Boriss Siliverstovs; Konstantin A. Kholodilin
  12. Reexamining the linkages between inflation and output growth: A bivariate ARFIMA-FIGARCH approach By Mustafa Caglayan; Feng Jiang
  13. Robust Multidimensional Poverty Comparisons with Discrete Indicators of Well-being By Jean-Yves Duclos; David Sahn; Stephen D. Younger
  14. Measurement of Business Cycles By Don Harding; Adrian Pagan

  1. By: Beresteanu, Arie; Molinari, Francesca
    Abstract: We propose inference procedures for partially identified population features for which the population identification region can be written as a transformation of the Aumann expectation of a properly defined set-valued random variable (SVRV). An SVRV is a mapping that associates a set (rather than a real number) with each element of the sample space. Examples of population features in this class include sample means and best linear predictors with interval outcome data, and parameters of semiparametric binary models with interval regressor data. We extend the analogy principle to SVRVs, and show that the sample analog estimator of the population identification region is given by a transformation of a Minkowski average of SVRVs. Using the results of the mathematics literature on SVRVs, we show that this estimator converges in probability to the identification region of the model with respect to the Hausdorff distance. We then show that the Hausdorff distance between the estimator and the population identification region, when properly normalized by √n, converges in distribution to the supremum of a Gaussian process whose covariance kernel depends on parameters of the population identification region. We provide consistent bootstrap procedures to approximate this limiting distribution. Using arguments similar to those applied for vector-valued random variables, we develop a methodology to test assumptions about the true identification region and to calculate the power of the test. We show that these results can be used to construct a confidence collection, that is, a collection of sets that, when specified as a null hypothesis for the true value of the population identification region, cannot be rejected by our test.
    Keywords: Partial Identification, Confidence Collections, Set-Valued Random Variables
    JEL: C14
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:duk:dukeec:06-04&r=ecm
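    To fix ideas, here is a minimal Python sketch of the sample-analog construction in the simplest case the abstract mentions, the mean with interval outcome data: the estimated identification region is the interval between the two sample means, and inference rests on a √n-normalized Hausdorff distance. All data and names below are invented; this illustrates the construction, not the authors' procedure.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 500
      # Hypothetical interval outcome data: only the bracket [y_lo, y_hi]
      # containing the true outcome is observed for each unit.
      y_lo = rng.normal(0.0, 1.0, n)
      y_hi = y_lo + rng.uniform(0.5, 1.5, n)

      # Sample analog of the identification region for E[y]: the Minkowski
      # average of the intervals [y_lo_i, y_hi_i] is [mean(y_lo), mean(y_hi)].
      region_hat = (y_lo.mean(), y_hi.mean())

      def hausdorff(a, b):
          # Hausdorff distance between two closed intervals.
          return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

      # Root-n-normalized distance to a hypothesized region: the quantity
      # whose limiting distribution the paper derives and bootstraps.
      H0 = (0.0, 1.0)
      print(region_hat, np.sqrt(n) * hausdorff(region_hat, H0))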
  2. By: Alois Kneip; Léopold Simar; Paul W. Wilson
    Abstract: Non-parametric data envelopment analysis (DEA) estimators based on linear programming methods have been widely applied in analyses of productive efficiency. The distributions of these estimators remain unknown except in the simple case of one input and one output, and previous bootstrap methods proposed for inference have not been proven consistent, making inference doubtful. This paper derives the asymptotic distribution of DEA estimators under variable returns to scale. This result is then used to prove that two different bootstrap procedures (one based on sub-sampling, the other based on smoothing) provide consistent inference. The smooth bootstrap requires smoothing the irregularly-bounded density of inputs and outputs as well as smoothing of the DEA frontier estimate. Both bootstrap procedures allow for dependence of the inefficiency process on output levels and the mix of inputs in the case of input-oriented measures, or on input levels and the mix of outputs in the case of output-oriented measures.
    Keywords: bootstrap, frontier, efficiency, data envelopment analysis, DEA
    JEL: C12 C14 C15
    Date: 2006–04
    URL: http://d.repec.org/n?u=RePEc:bon:bonedp:bgse12_2006&r=ecm
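    The estimator whose asymptotics are derived here is itself a linear program. Below is a compact Python sketch of the input-oriented, variable-returns-to-scale DEA score on simulated data, using SciPy; the sub-sampling and smoothed bootstraps are omitted, and the function name and data are mine, not the authors'.

      import numpy as np
      from scipy.optimize import linprog

      def dea_input_vrs(X, Y, o):
          # Input-oriented efficiency of unit o under VRS.
          # Decision variables: [theta, lambda_1, ..., lambda_n].
          n, p = X.shape
          q = Y.shape[1]
          c = np.r_[1.0, np.zeros(n)]               # minimize theta
          # Inputs:  sum_j lam_j * X[j, i] <= theta * X[o, i]
          A_in = np.c_[-X[o][:, None], X.T]
          # Outputs: sum_j lam_j * Y[j, k] >= Y[o, k]
          A_out = np.c_[np.zeros((q, 1)), -Y.T]
          A_ub = np.vstack([A_in, A_out])
          b_ub = np.r_[np.zeros(p), -Y[o]]
          A_eq = np.r_[0.0, np.ones(n)][None, :]    # VRS: lambdas sum to one
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                        bounds=[(None, None)] + [(0, None)] * n)
          return res.x[0]

      rng = np.random.default_rng(1)
      X = rng.uniform(1.0, 2.0, (50, 2))
      Y = X.sum(axis=1, keepdims=True) * rng.uniform(0.5, 1.0, (50, 1))
      # A sub-sampling bootstrap would re-run dea_input_vrs on random
      # subsamples of size m < n drawn from (X, Y).
      print([round(dea_input_vrs(X, Y, o), 3) for o in range(3)])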
  3. By: Ali Dib; Mohamed Gammoudi; Kevin Moran
    Abstract: The authors document the out-of-sample forecasting accuracy of the New Keynesian model for Canada. They estimate their variant of the model on a series of rolling subsamples, computing out-of-sample forecasts one to eight quarters ahead at each step. They compare these forecasts with those arising from simple vector autoregression (VAR) models, using econometric tests of forecasting accuracy. Their results show that the forecasting accuracy of the New Keynesian model compares favourably with that of the benchmarks, particularly as the forecasting horizon increases. These results suggest that the model could become a useful forecasting tool for Canadian time series. The authors invoke the principle of parsimony to explain their findings.
    Keywords: Business fluctuations and cycles; Economic models; Econometric and statistical methods
    JEL: E32 E37 C12
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:06-4&r=ecm
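    The VAR benchmark side of such an exercise is easy to sketch in Python: re-estimate on a rolling window, forecast one to eight quarters ahead, and accumulate forecast errors by horizon. The three simulated series below are stand-ins for the Canadian data, and the formal forecast-accuracy tests the authors apply are omitted.

      import numpy as np
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(2)
      T = 200
      data = np.zeros((T, 3))   # stand-ins for output growth, inflation, interest rate
      for t in range(1, T):
          data[t] = 0.5 * data[t - 1] + rng.normal(size=3)

      H, window = 8, 120        # horizons 1..8, rolling estimation window
      errs = []
      for t in range(window, T - H):
          res = VAR(data[t - window:t]).fit(2)             # rolling re-estimation
          fcst = res.forecast(data[t - res.k_ar:t], steps=H)
          errs.append(fcst - data[t:t + H])
      rmse = np.sqrt((np.array(errs) ** 2).mean(axis=0))   # (horizon, series)
      print(np.round(rmse, 2))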
  4. By: Jean-Thomas Bernard; Lynda Khalaf; Maral Kichian; Sebastien McMahon
    Abstract: Fluctuations in the prices of various natural resource products are of concern in both policy and business circles; hence, it is important to develop accurate price forecasts. Structural models provide valuable insights into the causes of price movements, but they are not necessarily best suited for forecasting, given the multiplicity of known and unknown factors that affect supply and demand conditions in these markets. Parsimonious representations of price processes often prove more useful for forecasting purposes. Central questions in such stochastic models often revolve around the time-varying trend, the stochastic convenience yield and volatility, and mean reversion. The authors seek to assess and compare alternative approaches to modelling these effects, focusing on forecast performance. Three econometric specifications are considered that cover the most up-to-date models in the recent literature on commodity prices: (i) random-walk models with autoregressive conditional heteroscedasticity (ARCH) or generalized ARCH (GARCH) effects, and with normal or Student-t innovations; (ii) Poisson-based jump-diffusion models with ARCH or GARCH effects, and with normal or Student-t innovations; and (iii) mean-reverting models that allow for uncertainty in the equilibrium price.
    Keywords: Econometric and statistical methods
    JEL: C52 C53 E37
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:06-14&r=ecm
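    Of these ingredients, the GARCH one is the easiest to make concrete. The Python sketch below estimates a Gaussian GARCH(1,1) by maximum likelihood on simulated returns; jump components and Student-t innovations would enter through additional terms in the likelihood. Everything here is illustrative, not the authors' code.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(3)
      # Simulate a GARCH(1,1) series as a stand-in for commodity price returns.
      T, omega, alpha, beta = 1500, 0.1, 0.1, 0.8
      r, h = np.zeros(T), np.zeros(T)
      h[0] = omega / (1 - alpha - beta)
      r[0] = np.sqrt(h[0]) * rng.normal()
      for t in range(1, T):
          h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
          r[t] = np.sqrt(h[t]) * rng.normal()

      def neg_loglik(params, r):
          # Gaussian GARCH(1,1) negative log-likelihood.
          w, a, b = params
          if w <= 0 or a < 0 or b < 0 or a + b >= 1:
              return np.inf                 # enforce positivity and stationarity
          h = np.empty_like(r)
          h[0] = r.var()
          for t in range(1, len(r)):
              h[t] = w + a * r[t - 1] ** 2 + b * h[t - 1]
          return 0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

      fit = minimize(neg_loglik, x0=[0.05, 0.05, 0.9], args=(r,), method='Nelder-Mead')
      print(np.round(fit.x, 3))             # should be near (0.1, 0.1, 0.8)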
  5. By: François-Éric Racicot (Département des sciences administratives, Université du Québec (Outaouais) et LRSP); Raymond Théoret (Département de stratégie des affaires, Université du Québec (Montréal)); Alain Coen (Département de stratégie des affaires, Université du Québec (Montréal))
    Abstract: In this paper, we propose a new empirical version of the Fama and French model based on the Hausman (1978) specification test and aimed at discarding measurement errors in the variables. The proposed empirical framework is general enough to be used for correcting other financial and accounting models for measurement errors. Correcting for measurement errors is important at many levels, such as information disclosure, corporate governance and the protection of investors.
    Keywords: Asset pricing, portfolio selection, errors in variables, measurement errors, higher moments, instrumental variables, specification test, corporate governance, protection of investors.
    JEL: C13 C19 C49 G12 G31
    Date: 2006–03–01
    URL: http://d.repec.org/n?u=RePEc:pqs:wpaper:132006&r=ecm
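    A stylized Python version of the kind of Hausman (1978) comparison the authors build on: OLS on a mismeasured regressor against an IV estimator, with the contrast between the two slope estimates forming the test statistic. The instrument, sample size, and parameter values are all invented for the illustration.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      n = 2000
      x_true = rng.normal(size=n)
      x_obs = x_true + rng.normal(scale=0.7, size=n)   # regressor observed with error
      z = x_true + rng.normal(scale=0.5, size=n)       # instrument: clean proxy for x_true
      y = 1.0 + 2.0 * x_true + rng.normal(size=n)

      X = np.c_[np.ones(n), x_obs]
      Z = np.c_[np.ones(n), z]

      b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
      e = y - X @ b_ols
      V_ols = e @ e / (n - 2) * np.linalg.inv(X.T @ X)

      # 2SLS: project X on Z, then regress y on the fitted values.
      X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
      b_iv = np.linalg.lstsq(X_hat, y, rcond=None)[0]
      e_iv = y - X @ b_iv                              # residuals use observed X
      V_iv = e_iv @ e_iv / (n - 2) * np.linalg.inv(X_hat.T @ X_hat)

      # Hausman statistic on the slope; approximately chi-square(1) under the
      # null of no measurement error.  (The variance difference can be
      # negative in finite samples, in which case the test is inconclusive.)
      H = (b_iv[1] - b_ols[1]) ** 2 / (V_iv[1, 1] - V_ols[1, 1])
      print(round(H, 2), stats.chi2.sf(H, df=1))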
  6. By: LECHNER, Michael; LOLLIVIER, Stefan; MAGNAC, Thierry
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:ide:wpaper:5512&r=ecm
  7. By: Francis X. Diebold (Department of Economics, University of Pennsylvania); Lutz Kilian (Department of Economics, University of Michigan); Marc Nerlove (Department of Agricultural and Resource Economics, University of Maryland)
    Abstract: We provide a concise overview of time series analysis in the time and frequency domains, with lots of references for further reading.
    Keywords: time series analysis, time domain, frequency domain
    JEL: C22
    Date: 2006–05–01
    URL: http://d.repec.org/n?u=RePEc:pen:papers:06-019&r=ecm
  8. By: Ängsved, Marianne (Statistics Sweden)
    Abstract: When sampling from a finite population, access to a good sampling frame is of vital importance. However, the statistician often has to face the problem of non-negligible frame imperfections, e.g. overcoverage and undercoverage. Moreover, error from nonresponse is an increasing problem in many surveys today. In this paper we discuss different approaches to dealing with these problems simultaneously. In particular, we address the situation in which a new, up-to-date register exists, and the improvements this brings.
    Keywords: Finite population sampling; target population; sampling frame; overcoverage; undercoverage; nonresponse; GREG estimator; calibration
    JEL: C13
    Date: 2006–06–19
    URL: http://d.repec.org/n?u=RePEc:hhs:oruesi:2006_004&r=ecm
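    A toy Python version of the estimation problem: a Horvitz-Thompson estimator of the population total under simple random sampling, with a one-class response-rate adjustment standing in for the paper's richer GREG/calibration machinery. The population, response mechanism, and rates are invented.

      import numpy as np

      rng = np.random.default_rng(5)
      N, n = 10000, 500
      y = rng.gamma(2.0, 50.0, N)                 # hypothetical study variable
      sample = rng.choice(N, n, replace=False)    # SRS without replacement, pi = n/N
      pi = n / N

      # Nonresponse: each sampled unit responds with probability 0.7.
      responds = rng.random(n) < 0.7
      y_s = y[sample]

      # Naive HT estimator using respondents only (biased downward):
      t_naive = np.sum(y_s[responds] / pi)
      # Adjusted: inflate design weights by the inverse observed response rate
      # (a one-class version of weighting-class adjustment).
      rr = responds.mean()
      t_adj = np.sum(y_s[responds] / (pi * rr))
      print(round(y.sum()), round(t_naive), round(t_adj))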
  9. By: Robert Dixon; William Griffiths
    Abstract: Students are very interested in lecture examples and class exercises involving data connected to the maiden voyage and sinking of the liner Titanic. Information on the passengers and their fate can be used to explore the relationships among various tests for differences in survival rates across groups of passengers. Among the concepts examined are tests for differences of proportions using a normal distribution, a chi-square test for independence, a test for the equality of two logits, and a test for the significance of the coefficient of a binary variable in a logit model. The relationship between Wald and LM test statistics is also examined. Two related examples are given, one to be used for step-by-step instructional purposes and one to be given as an exercise to students.
    Keywords: Contingency table, Difference in proportions, Logit model, Statistical tests
    JEL: A22 A23 C12 C25
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:964&r=ecm
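    The core comparison is easy to reproduce in Python. Below, the Wald test evaluates the variance of the difference in proportions at the unrestricted estimates, while the LM (score) test evaluates it at the pooled estimate under the null; the squared LM statistic equals the Pearson chi-square for the 2x2 table. The counts are illustrative numbers in the spirit of the example, not taken from the paper.

      import numpy as np
      from scipy import stats

      # Hypothetical survived/total counts for two groups of passengers.
      n1, s1 = 325, 203      # e.g. first class (illustrative)
      n2, s2 = 706, 178      # e.g. third class (illustrative)
      p1, p2 = s1 / n1, s2 / n2

      # Wald test: variance at the unrestricted estimates.
      se_w = np.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
      z_wald = (p1 - p2) / se_w

      # LM (score) test: variance at the pooled estimate under H0: p1 = p2.
      p0 = (s1 + s2) / (n1 + n2)
      se_lm = np.sqrt(p0 * (1 - p0) * (1 / n1 + 1 / n2))
      z_lm = (p1 - p2) / se_lm

      # The squared LM statistic equals the Pearson chi-square statistic.
      table = np.array([[s1, n1 - s1], [s2, n2 - s2]])
      chi2 = stats.chi2_contingency(table, correction=False)[0]
      print(z_wald ** 2, z_lm ** 2, chi2)   # z_lm**2 == chi2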
  10. By: Alain Coen (Département de stratégie des affaires, Université du Québec (Montréal)); François-Éric Racicot (Département des sciences administratives, Université du Québec (Outaouais) et LRSP); Raymond Théoret (Département de stratégie des affaires, Université du Québec (Montréal))
    Abstract: This paper proposes to revisit both the CAPM and the three-factor model of Fama and French (1993) in the presence of errors in the variables. To reduce the bias induced by measurement and specification errors, we transpose to the cost of equity an estimator based on cumulants of order three and four, initially developed by Dagenais and Dagenais (1997) and later generalized to financial models by Racicot (2003). Our results show that our technique has substantial and significant consequences for the measurement of the cost of equity. We obtain ipso facto a new estimator of the Jensen alpha.
    Keywords: Errors in the variables, cumulants, higher moments, instrumental variables, cost of equity, Jensen alpha.
    JEL: C13 C49 G12 G31
    Date: 2006–05–01
    URL: http://d.repec.org/n?u=RePEc:pqs:wpaper:142006&r=ecm
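    The idea behind cumulant-based estimators is that, when the true regressor is non-normal and the measurement error is normal, polynomials of the demeaned observed regressor are valid instruments. The Python sketch below uses a simplified pair of higher-moment instruments in the spirit of Dagenais and Dagenais (1997), not the authors' exact construction; data and parameters are invented.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 5000
      # Skewed true regressor, so higher-moment instruments carry information.
      x_true = rng.gamma(2.0, 1.0, n)
      x = x_true + rng.normal(scale=0.8, size=n)   # observed with measurement error
      y = 1.0 + 2.0 * x_true + rng.normal(size=n)

      xd, yd = x - x.mean(), y - y.mean()
      # Simplified higher-moment instruments: recentred square and cube of the
      # demeaned regressor (valid here because the measurement error is normal).
      Z = np.c_[xd ** 2 - xd.var(), xd ** 3 - 3 * xd.var() * xd]

      b_ols = (xd @ yd) / (xd @ xd)                        # attenuated by the errors
      xd_hat = Z @ np.linalg.lstsq(Z, xd, rcond=None)[0]   # first stage on Z
      b_iv = (xd_hat @ yd) / (xd_hat @ xd)                 # 2SLS slope
      print(round(b_ols, 3), round(b_iv, 3))               # b_iv should be near 2.0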
  11. By: Boriss Siliverstovs; Konstantin A. Kholodilin
    Abstract: This paper suggests a novel approach to pre-selection of the component series of the diffusion index based on their individual forecasting performance. It is shown that this targeted selection substantially improves forecasting performance compared with diffusion index models based on the largest available dataset.
    Keywords: Diffusion index, forecasting, optimal subset of data
    JEL: E32 C10
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp598&r=ecm
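    A crude Python version of the targeted-selection idea: score each candidate series by its individual hold-out forecasting RMSE, keep the best k, and use the first principal component of that subset as the diffusion index. The data-generating process, split point, and k are all invented; the paper's actual selection criteria may differ.

      import numpy as np

      rng = np.random.default_rng(7)
      T, N, k = 160, 40, 10
      F = rng.normal(size=T)                      # latent factor
      X = np.empty((T, N))
      X[:, :15] = np.outer(F, rng.uniform(0.8, 1.2, 15)) + rng.normal(size=(T, 15))
      X[:, 15:] = rng.normal(size=(T, N - 15))    # pure-noise candidates
      y = np.r_[0.0, F[:-1]] + 0.3 * rng.normal(size=T)   # target led by the factor

      # Step 1: one-step hold-out RMSE of each candidate on its own.
      split = 120
      def rmse_of(xj):
          b = np.polyfit(xj[:split - 1], y[1:split], 1)
          return np.sqrt(np.mean((np.polyval(b, xj[split - 1:-1]) - y[split:]) ** 2))
      scores = np.array([rmse_of(X[:, j]) for j in range(N)])
      best = np.argsort(scores)[:k]

      # Step 2: first principal component of the selected subset as the index.
      Xs = (X[:, best] - X[:, best].mean(0)) / X[:, best].std(0)
      idx = np.linalg.svd(Xs, full_matrices=False)[0][:, 0]
      b = np.polyfit(idx[:-1], y[1:], 1)          # diffusion index forecast equation
      print(sorted(best.tolist()))                # mostly the informative columns 0..14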
  12. By: Mustafa Caglayan and Feng Jiang
    Abstract: In this paper, motivated by recent theoretical developments showing that inflation can exhibit long-memory properties due to the output growth process, we propose a new class of bivariate processes to simultaneously investigate the dual long-memory properties in the mean and the conditional variance of inflation and output growth series. We estimate the model using monthly UK data and document the presence of dual long-memory properties in both series. Then, using the conditional variances generated from our bivariate model, we employ Granger causality tests to scrutinize the linkages between the means and the volatilities of inflation and output growth.
    JEL: C32 E31
    URL: http://d.repec.org/n?u=RePEc:gla:glaewp:2006_8&r=ecm
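    Two of the paper's building blocks can be sketched in a few lines of Python: the fractional-differencing filter (1 - L)^d that gives ARFIMA and FIGARCH processes their long memory, and a Granger causality test. The simulation below is a stand-in for the inflation/output-growth data, not the paper's bivariate estimator.

      import numpy as np
      from statsmodels.tsa.stattools import grangercausalitytests

      def frac_diff(x, d, k_max=100):
          # Apply the (1 - L)^d filter, truncated at k_max lags.
          # Weights: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k.
          w = np.ones(k_max + 1)
          for k in range(1, k_max + 1):
              w[k] = w[k - 1] * (k - 1 - d) / k
          return np.convolve(x, w)[:len(x)]

      rng = np.random.default_rng(8)
      # Long-memory "inflation" series: (1 - L)^{-0.3} applied to white noise.
      infl = frac_diff(rng.normal(size=600), -0.3)
      # "Output growth" driven by lagged inflation, so causality should appear.
      growth = 0.4 * np.r_[0.0, infl[:-1]] + rng.normal(size=600)

      # Test whether the second column Granger-causes the first.
      res = grangercausalitytests(np.c_[growth, infl], maxlag=4, verbose=False)
      print({lag: round(r[0]['ssr_ftest'][1], 4) for lag, r in res.items()})  # p-values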
  13. By: Jean-Yves Duclos; David Sahn; Stephen D. Younger
    Abstract: This paper provides a method to make robust multidimensional poverty comparisons when one or more of the dimensions of well-being or deprivation is discrete. Sampling distributions for the statistics used in these poverty comparisons are provided. Several examples show that the methods are both practical and interesting in the sense that they can provide richer information than do univariate poverty comparisons.
    Keywords: Multidimensional Poverty, Stochastic Dominance
    JEL: D31 D63 I31 I32
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:0628&r=ecm
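    To see what a comparison with one discrete dimension involves, here is a bare-bones Python sketch: the joint "intersection" headcount is evaluated over a grid of income cutoffs and every value of a discrete schooling indicator, and a robust ranking requires one sign over the whole grid. The populations are simulated, and the sampling distributions the paper supplies for formal inference are omitted.

      import numpy as np

      rng = np.random.default_rng(9)
      # Two hypothetical populations: continuous income and a discrete
      # well-being indicator (e.g. years of schooling, 0..12).
      inc_a, sch_a = rng.gamma(2, 500, 3000), rng.integers(0, 13, 3000)
      inc_b, sch_b = rng.gamma(2, 550, 3000), rng.integers(1, 13, 3000)

      def joint_headcount(inc, sch, z_inc, z_sch):
          # Intersection headcount: poor in BOTH dimensions.
          return np.mean((inc <= z_inc) & (sch <= z_sch))

      # First-order dominance surface over a grid of poverty lines.
      z_incs = np.quantile(np.r_[inc_a, inc_b], np.linspace(0.05, 0.95, 19))
      diff = np.array([[joint_headcount(inc_a, sch_a, zi, zs)
                        - joint_headcount(inc_b, sch_b, zi, zs)
                        for zs in range(13)] for zi in z_incs])
      print('A poorer than B everywhere on the grid:', bool((diff >= 0).all()))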
  14. By: Don Harding; Adrian Pagan
    Abstract: We describe different ways of measuring the business cycle. Institutions such as the NBER, OECD and IMF do this by locating the turning points in series taken to represent the aggregate level of economic activity. The turning points are determined according to rules that either come from a parametric model or are non-parametric. Once the turning points are located, information can be extracted on cycle characteristics. We also distinguish cases where a single series or multiple series are used to represent the level of activity.
    JEL: E32
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:966&r=ecm
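    The non-parametric approach is easy to make concrete: the quarterly rule used in this literature marks a peak (trough) where the level series is a strict local maximum (minimum) within two quarters on either side. The Python sketch below, on a simulated log activity series, omits the alternation and minimum-phase-length censoring rules the full procedure imposes.

      import numpy as np

      def turning_points(y, k=2):
          # Peak (trough) at t: strict local max (min) over a +/- k window.
          peaks, troughs = [], []
          for t in range(k, len(y) - k):
              others = np.delete(y[t - k:t + k + 1], k)   # window minus the centre
              if (others < y[t]).all():
                  peaks.append(t)
              if (others > y[t]).all():
                  troughs.append(t)
          return peaks, troughs

      rng = np.random.default_rng(10)
      y = np.cumsum(0.005 + 0.01 * rng.normal(size=120))  # hypothetical log activity
      peaks, troughs = turning_points(y)
      # One cycle characteristic: contraction durations, peak to next trough.
      contractions = [next(tr for tr in troughs if tr > pk) - pk
                      for pk in peaks if any(tr > pk for tr in troughs)]
      print(peaks, troughs, contractions)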

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.