nep-ecm New Economics Papers
on Econometrics
Issue of 2010‒07‒03
nineteen papers chosen by
Sune Karlsson
Örebro University

  1. Identification And Estimation Of Interventions Using Changes In Inequality Measures By Firpo, Sergio
  2. Selection of weak VARMA models by Akaike's information criteria By Boubacar Mainassara, Yacouba
  3. Bootstrapping Structural VARs: Avoiding a Potential Bias in Confidence Intervals for Impulse Response Functions By Phillips, Kerk L.; Spencer, David E.
  4. Model Selection Criteria in Multivariate Models with Multiple Structural Changes By Eiji Kurozumi; Purevdorj Tuvaandorj
  5. Estimating Gravity Models of International Trade with Correlated Time-Fixed Regressors: To IV or not IV? By Mitze, Timo
  6. Ten Things We Should Know About Time Series By Michael McAleer; Les Oxley
  7. Realized Volatility Risk By David E. Allen; Michael McAleer; Marcel Scharth
  8. Multivariate extremality measure By Henry Laniado; Rosa E. Lillo; Juan Romo
  9. Robust Estimation of Some Nonregular Parameters By Kyungchul Song
  10. Comparing quantile residual life functions by confidence bands By Alba M. Franco-Pereira; Rosa E. Lillo; Juan Romo
  11. Scaling methods for categorical self-assessed health measures By Patricia Cubí Mollá
  12. Simplicial similarity and its application to hierarchical clustering By Ángel López; Juan Romo
  13. Convergence test in the presence of structural changes: an empirical procedure based on panel data with cross-sectional dependence By Niang, Abdou-Aziz; Pichery, Marie-Claude; Edjo, Marcellin
  14. Concave-Monotone Treatment Response and Monotone Treatment Selection: With an Application to the Returns to Schooling By Okumura, Tsunao; Usui, Emiko
  15. Estimating Dynamic Models with Aggregate Shocks And an Application to Mortgage Default in Colombia By Juan Esteban Carranza; Salvador Navarro
  16. Validation of credit default probabilities via multiple testing procedures By Sebastian Döhler
  17. Reducing Status Quo Bias in Choice Experiments – An Application of a Protest Reduction Entreaty By Ole Bonnichsen; Jacob Ladenburg
  18. Empirical econometric evaluation of alternative methods of dealing with missing values in investment climate surveys By Escribano, Alvaro; Pena, Jorge; Guasch, J. Luis
  19. Does capacity utilisation help estimating the TFP cycle? By Christophe Planas; Werner Roeger; Alessandro Rossi

  1. By: Firpo, Sergio
    Abstract: This paper presents semiparametric estimators of changes in inequality measures of a dependent variable distribution, taking into account possible changes in the distributions of covariates. When we do not impose parametric assumptions on the conditional distribution of the dependent variable given covariates, this problem becomes equivalent to estimation of distributional impacts of interventions (treatment) when selection into the program is based on observable characteristics. The distributional impacts of a treatment will be calculated as differences in inequality measures of the potential outcomes of receiving and not receiving the treatment. These differences are called here Inequality Treatment Effects (ITE). The estimation procedure involves a first non-parametric step in which the probability of receiving treatment given covariates, the propensity score, is estimated. Using the inverse probability weighting method to estimate parameters of the marginal distribution of potential outcomes, in the second step weighted sample versions of inequality measures are computed. Root-N consistency, asymptotic normality and semiparametric efficiency are shown for the semiparametric estimators proposed. A Monte Carlo exercise is performed to investigate the finite-sample behavior of the estimator derived in the paper. We also apply our method to the evaluation of a job training program.
    Date: 2010–06–16
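For readers unfamiliar with the two-step procedure the abstract describes (propensity score first, weighted inequality measures second), here is a deliberately simplified numpy sketch. It is an illustration, not the authors' estimator: the propensity score is fit by a hand-rolled Newton-Raphson logit, the variance stands in for a general inequality measure, and the simulated data-generating process is an assumption of this sketch.

```python
import numpy as np

def fit_logit(X, d, iters=50):
    """Newton-Raphson logistic regression: returns fitted P(D=1|X)."""
    X1 = np.column_stack([np.ones(len(X)), X])  # add intercept
    b = np.zeros(X1.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X1 @ b))
        grad = X1.T @ (d - p)                    # score
        hess = X1.T @ (X1 * (p * (1 - p))[:, None])  # information matrix
        b += np.linalg.solve(hess, grad)
    return 1.0 / (1.0 + np.exp(-X1 @ b))

def ipw_variance_treatment_effect(y, d, pscore):
    """Difference in a simple inequality measure (the variance) between the
    inverse-probability-reweighted treated and control outcome distributions."""
    w1 = d / pscore                # weights recovering the Y(1) distribution
    w0 = (1 - d) / (1 - pscore)    # weights recovering the Y(0) distribution
    m1 = np.average(y, weights=w1)
    m0 = np.average(y, weights=w0)
    v1 = np.average((y - m1) ** 2, weights=w1)
    v0 = np.average((y - m0) ** 2, weights=w0)
    return v1 - v0

# Toy data: selection on an observable x, treatment raises outcome dispersion.
rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)
d = (rng.uniform(size=n) < 1 / (1 + np.exp(-x))).astype(float)
y = 1.0 + 0.5 * x + d * rng.normal(scale=2.0, size=n) + (1 - d) * rng.normal(size=n)

ps = fit_logit(x.reshape(-1, 1), d)
print(ipw_variance_treatment_effect(y, d, ps))
```

The true variance treatment effect in this design is 3.0 (treated outcome variance 4.25 versus control 1.25), so the estimate should land near that value.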
  2. By: Boubacar Mainassara, Yacouba
    Abstract: This article considers the problem of order selection for vector autoregressive moving-average (VARMA) models and the sub-class of vector autoregressive (VAR) models under the assumption that the errors are uncorrelated but not necessarily independent. We relax the standard independence assumption to extend the range of application of VARMA models and to cover linear representations of general nonlinear processes. We propose a modification of the corrected AIC (Akaike information criterion) version (AICc) introduced by Hurvich and Tsai (1989). This modified criterion is an approximately unbiased estimator of the Kullback-Leibler discrepancy, which was originally used to derive AIC-based criteria. Moreover, this criterion requires estimation of the matrices involved in the asymptotic variance of the quasi-maximum likelihood (QML) estimator of the models, which provides additional information about the models. Monte Carlo experiments show that the proposed modified criterion estimates the model orders more accurately than the standard AIC and AICc in large samples, and often in small samples.
    Keywords: AIC; discrepancy; Kullback-Leibler information; QMLE/LSE; order selection; structural representation; weak VARMA models.
    JEL: C52 C22 C01
    Date: 2010–06–21
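For orientation, the baseline criteria that the paper modifies can be written down directly. This is only the standard AIC and the Hurvich-Tsai AICc, not the author's modified criterion for weak VARMA models:

```python
def aic_aicc(loglik, k, n):
    """Standard AIC and the small-sample corrected AICc of Hurvich and
    Tsai (1989), for a model with k estimated parameters and n observations."""
    aic = -2.0 * loglik + 2.0 * k
    # The correction term 2k(k+1)/(n-k-1) penalizes small samples and
    # vanishes as n grows, so AICc converges to AIC.
    aicc = aic + 2.0 * k * (k + 1) / (n - k - 1)
    return aic, aicc

print(aic_aicc(-100.0, 3, 20))    # small sample: correction is noticeable
print(aic_aicc(-100.0, 3, 2000))  # large sample: AICc is close to AIC
```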
  3. By: Phillips, Kerk L.; Spencer, David E.
    Abstract: Constructing bootstrap confidence intervals for impulse response functions (IRFs) from structural vector autoregression (SVAR) models has become standard practice in empirical macroeconomic research. The accuracy of such confidence intervals can deteriorate severely, however, if the bootstrap IRFs are biased. In this paper, we document an apparently common source of bias in the estimation of the VAR error covariance matrix. The bias is easily corrected with a straightforward scale adjustment. This bias is often unrecognized because it only affects the bootstrap estimates of the error variance, not the original OLS estimates. Nevertheless, as we illustrate here, analytically, with sampling experiments, and in an example from the literature, the bootstrap error variance bias can have significant distorting effects on bootstrap IRF confidence intervals even if the original IRF estimate relies on unbiased parameter estimates.
    Keywords: impulse response function; structural VAR; bias; bootstrap
    JEL: C32 E32 E37
    Date: 2010–02
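The scale adjustment the abstract refers to can be illustrated on a univariate AR(1), used here as a deliberately simplified stand-in for a VAR. The specific rescaling factor sqrt(T/(T-k)) is an assumption of this sketch, chosen to undo the (T-k)/T understatement of the error variance in OLS residuals before they are resampled:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(1) process y_t = 0.5 * y_{t-1} + e_t.
T = 200
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + rng.normal()

# OLS of y_t on a constant and y_{t-1}.
X = np.column_stack([np.ones(T - 1), y[:-1]])
beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
resid = y[1:] - X @ beta

# OLS residuals understate the error variance by roughly (T - k) / T;
# rescaling them before resampling removes this source of bootstrap bias
# while leaving the original OLS coefficient estimates untouched.
T_eff, k = len(resid), X.shape[1]
resid_adj = resid * np.sqrt(T_eff / (T_eff - k))

# One bootstrap redraw of the centered, rescaled residuals.
boot_e = rng.choice(resid_adj - resid_adj.mean(), size=T_eff, replace=True)
```

A bootstrap IRF exercise would then rebuild the series recursively from `beta` and `boot_e` and re-estimate; only the rescaling step differs from the naive residual bootstrap.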
  4. By: Eiji Kurozumi; Purevdorj Tuvaandorj
    Abstract: This paper considers the issue of selecting the number of regressors and the number of structural breaks in multivariate regression models in the possible presence of multiple structural changes. We develop a modified Akaike information criterion (AIC), a modified Mallows' Cp criterion and a modified Bayesian information criterion (BIC). The penalty terms in these criteria are shown to differ from the usual ones. We prove that the modified BIC consistently selects the regressors and the number of breaks, whereas the modified AIC and the modified Cp criterion tend to over-select them with positive probability. The finite-sample performance of these criteria is investigated through Monte Carlo simulations, and it turns out that our modification is successful in comparison with the classical model selection criteria and the sequential testing procedure with the robust method.
    Keywords: structural breaks, AIC, Mallows' Cp, BIC, information criteria
    JEL: C13 C32
    Date: 2010–06
  5. By: Mitze, Timo
    Abstract: Gravity type models are widely used in international economics. In these models the inclusion of time-fixed regressors such as geographical or cultural distance, language and institutional (dummy) variables is often of vital importance, e.g. to analyse the impact of trade costs on internationalization activity. This paper analyses the problem of parameter inconsistency due to a correlation of the time-fixed regressors with the combined error term in panel data settings. A common solution is to use Instrumental-Variable (IV) estimation in the spirit of Hausman and Taylor (1981), since a standard Fixed Effects Model (FEM) estimation is not applicable. However, some potential shortcomings of the IV approach have recently given rise to the use of non-IV two-step estimators. Given their growing number of empirical applications, we aim to compare the performance of IV and non-IV approaches in the presence of time-fixed variables and right-hand-side endogeneity using Monte Carlo simulations, where we explicitly control for the problem of IV selection in the Hausman-Taylor case. The simulation results show that the Hausman-Taylor model with perfect knowledge about the underlying data structure (instrument orthogonality) has on average the smallest bias. However, compared to the empirically relevant specification with imperfect knowledge and instruments chosen by statistical criteria, simple non-IV rival estimators perform equally well or even better. We illustrate these findings by estimating gravity type models for German regional export activity within the EU. The results show that the Hausman-Taylor specification tends to overestimate the role of trade costs proxied by geographical distance.
    Keywords: Gravity model; Exports; Instrumental variables; two-step estimators; Monte Carlo simulations
    JEL: C52 C23 C15
    Date: 2010–06–26
  6. By: Michael McAleer (University of Canterbury); Les Oxley (University of Canterbury)
    Abstract: Time series data affect many aspects of our lives. This paper highlights ten things we should all know about time series, namely: a good working knowledge of econometrics and statistics, an awareness of measurement errors, testing for zero frequency, seasonal and periodic unit roots, analysing fractionally integrated and long memory processes, estimating VARFIMA models, using and interpreting cointegrating models carefully, choosing sensibly among univariate conditional, stochastic and realized volatility models, not confusing thresholds, asymmetry and leverage, not underestimating the complexity of multivariate volatility models, and thinking carefully about forecasting models and expertise.
    Keywords: Unit roots; fractional integration; long memory; VARFIMA; cointegration; volatility; thresholds; asymmetry; leverage; forecasting models and expertise
    JEL: C22 C32
    Date: 2010–06–01
  7. By: David E. Allen; Michael McAleer (University of Canterbury); Marcel Scharth
    Abstract: In this paper we document that realized variation measures constructed from high-frequency returns reveal a large degree of volatility risk in stock and index returns, where we characterize volatility risk by the extent to which forecasting errors in realized volatility are substantive. Even though returns standardized by ex post quadratic variation measures are nearly Gaussian, this unpredictability brings considerably more uncertainty to the empirically relevant ex ante distribution of returns. Carefully modeling this volatility risk is fundamental. We propose a dually asymmetric realized volatility (DARV) model, which incorporates the important fact that realized volatility series are systematically more volatile in high-volatility periods. Returns in this framework display time-varying volatility, skewness and kurtosis. We provide a detailed account of the empirical advantages of the model using data on the S&P 500 index and eight other indexes and stocks.
    Keywords: Realized volatility; volatility of volatility; volatility risk; value-at-risk; forecasting; conditional heteroskedasticity
    Date: 2010–05–01
  8. By: Henry Laniado; Rosa E. Lillo; Juan Romo
    Abstract: We propose a new multivariate order based on a concept that we will call "extremality". Given a unit vector, extremality allows us to measure the "farness" of a point with respect to a data cloud or to a distribution in the direction of that vector. We establish the most relevant properties of this measure and provide the theoretical basis for its nonparametric estimation. We include two applications in finance: a multivariate Value at Risk (VaR) with level sets constructed through extremality, and a portfolio selection strategy based on the order induced by extremality.
    Keywords: Extremality, Oriented cone, Value at risk, Portfolio selection
    Date: 2010–06
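The directional idea can be conveyed with a deliberately simplified sketch: here the "farness" of a point along a unit vector is approximated by the fraction of the data cloud projecting beyond it in that direction. This half-space version is an assumption of the sketch; the paper's measure is built from oriented cones, not half-spaces:

```python
import numpy as np

def extremality(point, cloud, u):
    """Share of the cloud lying beyond `point` in direction u.
    Values near 0 mean the point is extreme (far out) along u;
    values near 0.5 mean it sits in the middle of the cloud."""
    u = np.asarray(u, dtype=float)
    u = u / np.linalg.norm(u)          # normalize the direction
    return np.mean(cloud @ u > point @ u)

rng = np.random.default_rng(2)
cloud = rng.normal(size=(10000, 2))    # bivariate standard normal cloud
u = np.array([1.0, 1.0])               # direction of interest

print(extremality(np.array([3.0, 3.0]), cloud, u))  # far out along u: near 0
print(extremality(np.array([0.0, 0.0]), cloud, u))  # central point: near 0.5
```

Ordering points by this quantity for a fixed direction induces a directional ranking of the kind the abstract exploits for VaR level sets and portfolio selection.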
  9. By: Kyungchul Song (Department of Economics, University of Pennsylvania)
    Abstract: This paper develops optimal estimation of a potentially nondifferentiable functional Γ(β) of a regular parameter β, when Γ satisfies certain conditions. Primary examples are min or max functionals that frequently appear in the analysis of partially identified models. This paper investigates both the average risk approach and the minimax approach. The average risk approach considers average local asymptotic risk with a weight function Π over β-q(β) for a fixed location-scale equivariant map q, and the minimax approach searches for a robust decision that minimizes the local asymptotic maximal risk. In both approaches, optimal decisions are proposed. Certainly, the average risk approach is preferable to the minimax approach when one has fairly accurate information of β-q(β). When one does not, one may ask whether the average risk decision with a certain weight function Π is as robust as the minimax decision. This paper specifies conditions for Γ such that the answer is negative. This paper discusses some results from Monte Carlo simulation studies.
    Keywords: Local Asymptotic Minimax Estimation, Average Risks, Limit Experiments, Nondifferentiable Functionals, Partial Identification
    JEL: C10 C13 C14 C44
    Date: 2010–06–17
  10. By: Alba M. Franco-Pereira; Rosa E. Lillo; Juan Romo
    Abstract: A quantile residual life function is the quantile of the remaining life of a surviving subject, as it varies with time. In this article we present a nonparametric method for constructing confidence bands for the difference of two quantile residual life functions. These bands provide evidence of an ordering between two random variables with respect to the quantile residual life order introduced in Franco-Pereira et al. (2010). A simulation study has been carried out in order to evaluate and illustrate the performance and the consistency of this new methodology. We also present applications to real data examples.
    Keywords: Quantile residual life, Confidence bands
    Date: 2010–06
  11. By: Patricia Cubí Mollá (Universidad de Alicante)
    Abstract: The lack of a continuous health valuation is a major drawback in health analyses over broad populations. Using categorical health variables to estimate a continuous health variable is a common procedure in health studies. The most common approaches (the ordered probit/logit model and the interval regression model) do not admit any skewness in the distribution of health. In the present study a new procedure is suggested: attaching a log-normal distribution to health values. Different scaling procedures have been compared, with data obtained from the Catalan Health Survey (2006). The validity of the scaling approaches is assessed by measuring to what extent the health values derived from categorical health variables fit the actual health values. Two different health tariffs have been used for each procedure (the VAS tariff and the TTO tariff), so that the results are robust to the selection of a metric. In general, models under log-normality outperform the other approaches.
    Keywords: Health-Related Quality of Life, Health Measurement, Interval
    JEL: C01 I10
    Date: 2010–01
  12. By: Ángel López; Juan Romo
    Abstract: In the present document, an extension of the statistical depth notion is introduced with the aim of measuring proximities between pairs of points. In particular, we extend the simplicial depth function, which measures how central a point is by using random simplices (triangles in the two-dimensional case). The paper is structured as follows: first, a brief introduction to statistical depth functions; next, the simplicial similarity function is defined and its properties are studied; finally, we present a few graphical examples to show its behavior with symmetric and asymmetric distributions, and apply the function to hierarchical clustering.
    Keywords: Statistical depth, Similarity measures, Hierarchical clustering
    Date: 2010–06
  13. By: Niang, Abdou-Aziz; Pichery, Marie-Claude; Edjo, Marcellin
    Abstract: This paper presents an empirical testing procedure for economic convergence. Building on the unit root test proposed by Moon and Perron (2004), we propose a modification of the Evans (1996) testing procedure for the convergence hypothesis. The advantage of this modified procedure is that it makes it possible to take into account cross-sectional dependences that affect GDP per capita, as well as structural instabilities in these aggregates. Applying the procedure to OECD member countries and CFA zone member countries leads to acceptance of the hypothesis of economic convergence for both groups of countries, and shows that the convergence rate is significantly lower in the OECD sample. However, the tests applied to the global sample composed of all countries in these two samples lead to a rejection of the convergence hypothesis.
    Keywords: β-convergence; Unit root; Panel data; Factor model; Cross-sectional dependence; Structural change
    JEL: C23 C22 O40 R11
    Date: 2010–04–01
  14. By: Okumura, Tsunao; Usui, Emiko
    Abstract: This paper identifies sharp bounds on the mean treatment response and average treatment effect under the assumptions of both concave monotone treatment response (concave-MTR) and monotone treatment selection (MTS). We use our bounds and the US National Longitudinal Survey of Youth to estimate mean returns to schooling. Our upper-bound estimates are substantially smaller than (1) estimates using only the concave-MTR assumption of Manski (1997) and (2) estimates using only the MTR and MTS assumptions of Manski and Pepper (2000). They fall in the lower range of the point estimates given in previous studies that assume linear wage functions. This is because ability bias is corrected by assuming MTS when the functions are close to linear. Our results therefore imply that the higher returns reported in previous studies are likely to be overestimated.
    Keywords: Nonparametric Methods, Partial Identification, Sharp Bounds, Treatment Response, Returns to Schooling
    JEL: C14 J24
    Date: 2010–06
  15. By: Juan Esteban Carranza; Salvador Navarro
    Abstract: We estimate a dynamic model of mortgage default for a cohort of Colombian debtors between 1997 and 2004. We use the estimated model to study the effects on default of a class of policies that affected the evolution of mortgage balances in Colombia during the 1990s. We propose a framework for estimating dynamic behavioral models accounting for the presence of unobserved state variables that are correlated across individuals and across time periods. We extend the standard literature on the structural estimation of dynamic models by incorporating an unobserved common correlated shock that affects all individuals' static payoffs and the dynamic continuation payoffs associated with different decisions. Given a standard parametric specification of the dynamic problem, we show that the aggregate shocks are identified from the variation in the observed aggregate behavior. The shocks and their transition are separately identified, provided there is enough cross-sectional variation of the observed states.
    Date: 2010–06–22
  16. By: Sebastian Döhler
    Abstract: We apply multiple testing procedures to the validation of estimated default probabilities in credit rating systems. The goal is to identify rating classes for which the probability of default is estimated inaccurately, while still maintaining a predefined level of committing type I errors as measured by the familywise error rate (FWER) and the false discovery rate (FDR). For the FWER, we also consider procedures that take into account possible discreteness of the data or the test statistics. The performance of these methods is illustrated in a simulation setting and on empirical default data.
    Date: 2010–06
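To illustrate the kind of procedures involved (these are the two textbook adjustments, not the paper's discreteness-aware variants), here is a minimal numpy sketch of Holm's step-down procedure for FWER control and the Benjamini-Hochberg step-up procedure for FDR control, applied to a vector of per-rating-class p-values:

```python
import numpy as np

def holm(pvals, alpha=0.05):
    """Holm step-down procedure: controls the familywise error rate.
    The i-th smallest p-value is compared with alpha / (m - i + 1);
    testing stops at the first failure."""
    m = len(pvals)
    order = np.argsort(pvals)
    reject = np.zeros(m, dtype=bool)
    for rank, i in enumerate(order):
        if pvals[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break
    return reject

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: controls the false discovery
    rate. Rejects all hypotheses up to the largest k with
    p_(k) <= alpha * k / m."""
    m = len(pvals)
    order = np.argsort(pvals)
    below = pvals[order] <= alpha * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if below.any():
        kmax = np.nonzero(below)[0].max()
        reject[order[: kmax + 1]] = True
    return reject

# Hypothetical p-values, one per rating class (for illustration only).
p = np.array([0.001, 0.008, 0.02, 0.035, 0.6])
print(holm(p))                # stricter: FWER control
print(benjamini_hochberg(p))  # more liberal: FDR control
```

As expected, BH rejects at least as many hypotheses as Holm, reflecting the weaker (FDR rather than FWER) error guarantee.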
  17. By: Ole Bonnichsen (Institute of Food and Resource Economics, University of Copenhagen); Jacob Ladenburg (Danish Institute of Governmental Research)
    Abstract: In stated preference literature, the tendency to choose the alternative representing the status quo situation seems to exceed real life status quo effects. Accordingly, status quo bias can be a problem. In Choice Experiments, status quo bias is found to be strongly correlated with protest attitudes toward the cost attribute. If economic values are to be elicited, this problem is difficult to remedy. In a split sample framework we test a novel ex-ante entreaty aimed specifically at the cost attribute and find that it effectively reduces status quo bias and improves the internal validity of the hypothetical preferences.
    Keywords: Choice Experiment, Status Quo Bias, Entreaty, Stated Preference
    JEL: C10 C51 C52 C90
    Date: 2010–06
  18. By: Escribano, Alvaro; Pena, Jorge; Guasch, J. Luis
    Abstract: Investment climate surveys are valuable instruments that improve our understanding of the economic, social, political, and institutional factors determining economic growth, particularly in emerging and transition economies. At the same time, however, they have to overcome some difficult issues related to the quality of the information provided: measurement errors, outlier observations, and missing data are frequently found in these datasets. This paper discusses the applicability of recent procedures to deal with missing observations in investment climate surveys. In particular, it presents a simple replacement mechanism -- for application in models with a large number of explanatory variables -- which in turn is a proxy for two methods: multiple imputation and an export-import algorithm. The performance of this method in the context of total factor productivity estimation in extended production functions is evaluated using investment climate surveys from four countries: India, South Africa, Tanzania, and Turkey. It is shown that the method is very robust and performs reasonably well even under different assumptions on the nature of the mechanism generating missing data.
    Keywords: E-Business, Statistical & Mathematical Sciences, Economic Theory & Research, Information Security & Privacy, Information and Records Management
    Date: 2010–06–01
  19. By: Christophe Planas; Werner Roeger; Alessandro Rossi
    Abstract: In the production function approach, accurate output gap assessment requires a careful evaluation of the TFP cycle. In this paper we propose a bivariate model that links TFP to capacity utilisation, and we show that this model improves the TFP trend-cycle decomposition over univariate and Hodrick-Prescott filtering. In particular, we show that estimates of the TFP cycle that load information about capacity utilisation are less revised than univariate and HP estimates, both with 2009 and real-time TFP data vintages. We obtain this evidence for twelve pre-enlargement EU countries.
    JEL: C11 E23 E32
    Date: 2010–05

This nep-ecm issue is ©2010 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.