nep-ecm New Economics Papers
on Econometrics
Issue of 2010‒02‒13
twenty-one papers chosen by
Sune Karlsson
Orebro University

  1. Bayesian Estimation of Stochastic-Transition Markov-Switching Models for Business Cycle Analysis By Monica Billio; Roberto Casarin
  2. Instrumental Variables Estimation with Partially Missing Instruments By Mogstad, Magne; Wiswall, Matthew
  3. Cross-sectional Dependence in Panel Data Analysis By Sarafidis, Vasilis; Wansbeek, Tom
  4. Bootstrap prediction mean squared errors of unobserved states based on the Kalman filter with estimated parameters By Alejandro Rodríguez; Esther Ruiz
  5. Modelling Heterogeneity and Dynamics in the Volatility of Individual Wages By Hospido, Laura
  6. Mean Shift detection under long-range dependencies with ART By Willert, Juliane
  7. "Do We Really Need Both BEKK and DCC? A Tale of Two Multivariate GARCH Models" By Massimiliano Caporin; Michael McAleer
  8. A Computationally Efficient Analytic Procedure for the Random Effects Probit Model By Peng-Hsuan Ke; Wen-Jen Tsay
  9. Do Peers Affect Student Achievement? Evidence from Canada Using Group Size Variation By Boucher, Vincent; Bramoullé, Yann; Djebbari, Habiba; Fortin, Bernard
  10. Euler-Equation Estimation for Discrete Choice Models: A Capital Accumulation Application By Russell Cooper; John Haltiwanger; Jonathan Willis
  11. Testing for Mobility Dominance By Yélé Maweki Batana; Jean-Yves Duclos
  12. How to evaluate an Early Warning System? By Elena-Ivona Dumitrescu; Christophe Hurlin; Bertrand Candelon
  13. Exit times in non-Markovian drifting continuous-time random walk processes By Miquel Montero; Javier Villarroel
  14. Modelling Addiction in Life-Cycle Models: Revisiting the Treatment of Latent Stocks and Other Unobservables By Biørn, Erik
  15. Beliefs and Actions in the Trust Game: Creating Instrumental Variables to Estimate the Causal Effect By Costa-Gomes, Miguel A.; Huck, Steffen; Weizsäcker, Georg
  16. Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide By Schlotter, Martin; Schwerdt, Guido; Woessmann, Ludger
  17. A New Approximation to the Normal Distribution Quantile Function By Paul M. Voutier
  18. Treatment Evaluation in the Case of Interactions within Markets By Ferracci, Marc; Jolivet, Grégory; van den Berg, Gerard J.
  19. Cross-Correlation Dynamics in Financial Time Series By Thomas Conlon; Heather J. Ruskin; Martin Crane
  20. New methods of estimating stochastic volatility and the stock return By Alghalith, Moawia
  21. Asset returns and volatility clustering in financial time series By Jie-Jun Tseng; Sai-Ping Li

  1. By: Monica Billio; Roberto Casarin
    Abstract: We propose a new class of Markov-switching (MS) models for business cycle analysis. As is usually done in the literature, we assume that the MS latent factor drives the dynamics of the business cycle, but the transition probabilities can vary randomly over time. Transition probabilities are generated by random processes which may account for the stochastic duration of the regimes and for possible stochastic relations between the MS probabilities and some explanatory variables, such as autoregressive components and exogenous variables. The presence of latent factors and nonlinearities calls for the use of simulation-based inference methods. We propose a full Bayesian inference approach which can be naturally combined with Monte Carlo methods. We discuss the choice of the priors and a Markov chain Monte Carlo (MCMC) algorithm for estimating the parameters and the latent variables. We provide an application of the model and of the MCMC procedure to Euro area data. We also carry out a real-time comparison between different models by employing sequential Monte Carlo methods and some concordance statistics, which are widely used in business cycle analysis.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:ubs:wpaper:1002&r=ecm
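    The core idea — a latent regime whose transition probabilities vary over time with an explanatory variable — can be illustrated with a small simulation. This is only a hedged sketch, not the authors' model: the logistic link, the sinusoidal exogenous driver, and all parameter values below are illustrative assumptions.

    ```python
    import math
    import random

    def simulate_ms(T=500, seed=42):
        """Simulate a two-regime Markov-switching mean process whose
        transition probabilities vary over time as a logistic function
        of an exogenous variable x_t (illustrative assumption)."""
        rng = random.Random(seed)
        mu = [-0.5, 1.0]                     # regime-specific means (hypothetical)
        s, states, y = 0, [], []
        for t in range(T):
            x = math.sin(2 * math.pi * t / 100)        # exogenous driver
            # probability of staying in the current regime, shifted by x_t
            p_stay = 1.0 / (1.0 + math.exp(-(2.0 + 1.5 * x)))
            if rng.random() > p_stay:
                s = 1 - s                              # regime switch
            states.append(s)
            y.append(mu[s] + 0.3 * rng.gauss(0, 1))    # observed series
        return states, y

    states, y = simulate_ms()
    ```

    In the paper the latent states and the transition-probability parameters are estimated jointly by MCMC; the sketch only shows the data-generating side of such a model.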
  2. By: Mogstad, Magne (Statistics Norway); Wiswall, Matthew (New York University)
    Abstract: We examine instrumental variables estimation in situations where the instrument is only observed for a sub-sample, which is fairly common in empirical research. Typically, researchers simply limit the analysis to the sub-sample where the instrument is non-missing. We show that when the instrument is non-randomly missing, standard IV estimators require strong, auxiliary assumptions to be consistent. In many (quasi-)natural experiments, the auxiliary assumptions are unlikely to hold. We therefore introduce alternative IV estimators that are robust to non-randomly missing instruments without auxiliary assumptions. A Monte Carlo study illustrates our results.
    Keywords: instrumental variables, partially missing instruments, sample selection, sub-sample estimation
    JEL: C31 C34
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp4689&r=ecm
  3. By: Sarafidis, Vasilis; Wansbeek, Tom
    Abstract: This paper provides an overview of the existing literature on panel data models with error cross-sectional dependence. We distinguish between spatial dependence and factor structure dependence and we analyse the implications of weak and strong cross-sectional dependence on the properties of the estimators. We consider estimation under strong and weak exogeneity of the regressors for both T fixed and T large cases. Available tests for error cross-sectional dependence and methods for determining the number of factors are discussed in detail. The finite-sample properties of some estimators and statistics are investigated using Monte Carlo experiments.
    Keywords: Panel data; Cross-sectional dependence; Spatial dependence; Factor structure; Strong/Weak exogeneity.
    JEL: C50 C33
    Date: 2010–02
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:20367&r=ecm
  4. By: Alejandro Rodríguez; Esther Ruiz
    Abstract: Prediction intervals in State Space models can be obtained by assuming Gaussian innovations and using the prediction equations of the Kalman filter, where the true parameters are substituted by consistent estimates. This approach has two limitations. First, it does not incorporate the uncertainty due to parameter estimation. Second, the Gaussianity assumption of future innovations may be inaccurate. To overcome these drawbacks, Wall and Stoffer (2002) propose to obtain prediction intervals by using a bootstrap procedure that requires the backward representation of the model. Obtaining this representation increases the complexity of the procedure and limits its implementation to models for which it exists. The bootstrap procedure proposed by Wall and Stoffer (2002) is further complicated by the fact that the intervals are obtained for the prediction errors instead of for the observations. In this paper, we propose a bootstrap procedure for constructing prediction intervals in State Space models that does not need the backward representation of the model and is based on obtaining the intervals directly for the observations. Therefore, its application is much simpler, without losing the good behavior of bootstrap prediction intervals. We study its finite sample properties and compare them with those of the standard and the Wall and Stoffer (2002) procedures for the Local Level Model. Finally, we illustrate the results by implementing the new procedure to obtain prediction intervals for future values of a real time series.
    Keywords: NAIRU, Output gap, Parameter uncertainty, Prediction Intervals, State Space Models
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws100301&r=ecm
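    The idea of bootstrapping intervals directly for the future observations — resampling residuals and simulating the model forward, rather than working with prediction errors via a backward representation — can be sketched in a simpler AR(1) setting. This is an illustrative analogy, not the authors' state-space algorithm; the percentile interval, the no-intercept OLS fit, and all parameter values are assumptions of the sketch.

    ```python
    import random

    def ar1_bootstrap_interval(y, h=1, B=999, alpha=0.05, seed=0):
        """Percentile bootstrap prediction interval for y_{T+h} in an AR(1),
        built directly for the future observation by resampling residuals."""
        # OLS estimate of the AR(1) coefficient (no intercept, for brevity)
        num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
        den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
        phi = num / den
        resid = [y[t] - phi * y[t - 1] for t in range(1, len(y))]
        rng = random.Random(seed)
        draws = []
        for _ in range(B):
            yf = y[-1]
            for _ in range(h):                     # simulate h steps ahead
                yf = phi * yf + rng.choice(resid)  # resampled innovation
            draws.append(yf)
        draws.sort()
        lo = draws[int((alpha / 2) * B)]
        hi = draws[int((1 - alpha / 2) * B)]
        return lo, hi

    # toy series with true phi = 0.7
    rng = random.Random(1)
    y = [0.0]
    for _ in range(200):
        y.append(0.7 * y[-1] + rng.gauss(0, 1))
    lo, hi = ar1_bootstrap_interval(y)
    ```

    A full version of the paper's procedure would additionally re-estimate the parameters on each bootstrap replicate, so that parameter uncertainty enters the interval as well.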
  5. By: Hospido, Laura (Bank of Spain)
    Abstract: This paper presents a model for the heterogeneity and dynamics of the conditional mean and the conditional variance of standardized individual wages. In particular, a heteroskedastic autoregressive model with multiple individual fixed effects is proposed. The expression for a modified likelihood function is obtained for estimation and inference in a fixed-T context. Using a bias-corrected likelihood approach makes it possible to reduce the estimation bias to a term of order 1/T². The small sample performance of the bias corrected estimator is investigated in a Monte Carlo simulation study. The simulation results show that the bias of the maximum likelihood estimator is substantially corrected for designs that are broadly calibrated to the data used in the empirical analysis, drawn from the 1968-1993 Panel Study of Income Dynamics. The empirical results show that it is important to account for individual unobserved heterogeneity and dynamics in the variance, and that the latter is driven by job mobility. The model also explains the non-normality observed in logwage data.
    Keywords: panel data, dynamic nonlinear models, conditional heteroskedasticity, fixed effects, bias reduction, individual wages
    JEL: C23 J31
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp4712&r=ecm
  6. By: Willert, Juliane
    Abstract: Atheoretical regression trees (ART) are applied to detect changes in the mean of a stationary long memory time series when their location and number are unknown. It is shown that the BIC, which is almost always used as a pruning method, does not operate well in the long memory framework. A new method is developed to determine the number of mean shifts. A Monte Carlo study and an application are given to show the performance of the method.
    Keywords: long memory, mean shift, regression tree, ART, BIC.
    JEL: C14 C22
    Date: 2010–02
    URL: http://d.repec.org/n?u=RePEc:han:dpaper:dp-437&r=ecm
  7. By: Massimiliano Caporin (Dipartimento di Scienze Economiche "Marco Fanno", Universita degli Studi di Padova); Michael McAleer (Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute)
    Abstract: The management and monitoring of very large portfolios of financial assets are routine for many individuals and organizations. The two most widely used models of conditional covariances and correlations in the class of multivariate GARCH models are BEKK and DCC. It is well known that BEKK suffers from the archetypal "curse of dimensionality", whereas DCC does not. It is argued in this paper that this is a misleading interpretation of the suitability of the two models for use in practice. The primary purpose of this paper is to analyze the similarities and dissimilarities between BEKK and DCC, both with and without targeting, on the basis of the structural derivation of the models, the availability of analytical forms for the sufficient conditions for existence of moments, sufficient conditions for consistency and asymptotic normality of the appropriate estimators, and computational tractability for ultra large numbers of financial assets. Based on theoretical considerations, the paper sheds light on how to discriminate between BEKK and DCC in practical applications.
    Date: 2010–02
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2010cf713&r=ecm
  8. By: Peng-Hsuan Ke (Institute of Economics, Academia Sinica, Taipei, Taiwan); Wen-Jen Tsay (Institute of Economics, Academia Sinica, Taipei, Taiwan)
    Abstract: It is found in Lee (2000) and Rabe-Hesketh et al. (2005) that the typical numerical-integration procedure suggested by Butler and Moffitt (1982) for the random effects probit model becomes biased when the correlation coefficient within each unit is relatively large. This could possibly explain why Guilkey and Murphy (1993, p. 316) recommend that if only two points (T=2) are available, then one may as well use the probit estimator. This paper tackles this issue by deriving an analytic formula for the likelihood function of the random effects probit model with T=2. Thus, the numerical-integration procedure is not required for the closed-form approach, and the possible bias generated by numerical integration is avoided. The simulation outcomes show that the root mean squared error (RMSE) of the random effects probit estimator (MLE) using our method can be over 40% less than that of the probit estimator when the cross correlation is 0.9.
    Keywords: Discrete choice, random effects, panel probit model, error function
    JEL: C23 C24
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:sin:wpaper:10-a001&r=ecm
  9. By: Boucher, Vincent (University of Montreal); Bramoullé, Yann (Université Laval); Djebbari, Habiba (Université Laval); Fortin, Bernard (Université Laval)
    Abstract: We provide the first empirical application of a new approach proposed by Lee (2007) to estimate peer effects in a linear-in-means model. This approach makes it possible to control for group-level unobservables and to solve the reflection problem. We investigate peer effects in student achievement in Mathematics, Science, French and History in Quebec secondary schools. We estimate the model using maximum likelihood and instrumental variables methods. We find evidence of peer effects. The endogenous peer effect is positive, when significant, and some contextual peer effects matter. Using calibrated Monte Carlo simulations, we find that high dispersion in group sizes helps with potential issues of weak identification.
    Keywords: reflection problem, student achievement, peer effects
    JEL: C31 I20 Z13
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp4723&r=ecm
  10. By: Russell Cooper; John Haltiwanger; Jonathan Willis
    Abstract: This paper studies capital adjustment at the establishment level. Our goal is to characterize capital adjustment costs, which are important for understanding both the dynamics of aggregate investment and the impact of various policies on capital accumulation. Our estimation strategy searches for parameters that minimize ex post errors in an Euler equation. This strategy is quite common in models for which adjustment occurs in each period. Here, we extend that logic to the estimation of parameters of dynamic optimization problems in which non-convexities lead to extended periods of investment inactivity. In doing so, we create a method to take into account censored observations stemming from intermittent investment. This methodology allows us to take the structural model directly to the data, avoiding time-consuming simulation-based methods. To study the effectiveness of this methodology, we first undertake several Monte Carlo exercises using data generated by the structural model. We then estimate capital adjustment costs for U.S. manufacturing establishments in two sectors.
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:cen:wpaper:10-02&r=ecm
  11. By: Yélé Maweki Batana; Jean-Yves Duclos
    Abstract: This paper proposes tests for stochastic dominance in mobility based on the empirical likelihood ratio. Two views of mobility are considered, based either on measures of absolute mobility or on transition matrices. First-order and second-order dominance conditions in mobility are first derived, followed by the derivation of statistical inference techniques to test a null hypothesis of non-dominance against an alternative of mobility dominance. An empirical analysis, based on the US Panel Study of Income Dynamics (PSID), is performed by comparing four income mobility periods ranging from 1970 to 1990.
    Keywords: Mobility, Stochastic dominance, Transition matrices, Empirical Likelihood ratio, Bootstrap tests
    JEL: C10 C12 C13 D31 J60
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:1002&r=ecm
  12. By: Elena-Ivona Dumitrescu (LEO - Laboratoire d'économie d'Orleans - CNRS : UMR6221 - Université d'Orléans); Christophe Hurlin (LEO - Laboratoire d'économie d'Orleans - CNRS : UMR6221 - Université d'Orléans); Bertrand Candelon (Laboratoire d'Economie d'Orléans - Université d'Orléans - CNRS : FRE2783)
    Abstract: This paper proposes a new statistical framework, originating from the traditional credit-scoring literature, to evaluate currency crisis Early Warning Systems (EWS). Based on an assessment of the predictive power of panel logit and Markov frameworks, we find that the panel logit model outperforms the Markov-switching specifications. Furthermore, the introduction of forward-looking variables clearly improves the forecasting properties of the EWS. This improvement confirms the adequacy of second generation crisis models in explaining the occurrence of crises.
    Keywords: currency crisis; Early Warning System; credit-scoring
    Date: 2010–01–01
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-00450050_v1&r=ecm
  13. By: Miquel Montero; Javier Villarroel
    Abstract: By appealing to renewal theory we determine the equations that the mean exit time of a continuous-time random walk with drift satisfies, both when the present coincides with a jump instant and when it does not. Particular attention is paid to the corrections ensuing from the non-Markovian nature of the process. We show that when drift and jumps have the same sign the relevant integral equations can be solved in closed form. The case when holding times have the classical Erlang distribution is considered in detail.
    Date: 2010–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1002.0571&r=ecm
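    The Erlang holding-time case lends itself to a quick Monte Carlo check of mean exit times. The sketch below is an illustrative simulation under stated assumptions — Gaussian jumps, arbitrary parameter values, and the position monitored only at jump instants — and is not the authors' analytical solution.

    ```python
    import random

    def mean_exit_time(drift=0.05, jump_sd=1.0, shape=2, rate=1.0,
                       barrier=5.0, n_paths=2000, seed=7):
        """Monte Carlo estimate of the mean exit time of a drifting CTRW
        from the interval (-barrier, barrier). Holding times are Erlang
        (gamma with integer shape); jumps are Gaussian (illustrative).
        For simplicity, exit is only checked at jump instants."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_paths):
            x, t = 0.0, 0.0
            while -barrier < x < barrier:
                tau = rng.gammavariate(shape, 1.0 / rate)   # Erlang waiting time
                t += tau
                x += drift * tau + rng.gauss(0.0, jump_sd)  # drift + jump
            total += t
        return total / n_paths

    t_hat = mean_exit_time()
    ```

    As a sanity check, increasing the drift should shorten the estimated mean exit time from a symmetric interval, which a simulation of this kind reproduces.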
  14. By: Biørn, Erik (Dept. of Economics, University of Oslo)
    Abstract: Dynamic modeling of demand for goods whose cumulated stocks enter an intertemporal utility function as latent variables is discussed. The issues include: how to represent addiction, how to handle unobserved expectations and changing plans, and how to deal with `dynamic inconsistency'. Arguments are put forth for giving all optimizing conditions attention, not only those in which all variables are observable. If the latter, fairly common, `limited information-reduced dimension' strategy is pursued, problems are shown to arise in attempting to identify coefficients of the preference structure and to test for addictive stocks. Examples, based on quadratic utility functions, illustrate the main points and challenge the validity of testing the `rational addiction' hypothesis by using linear, single-equation autoregressive models, as suggested by Becker, Grossman, and Murphy (1994) and adopted in several subsequent studies.
    Keywords: Life-cycle model; Addiction; Identification; Latent stocks; Perfect foresight; Rational expectations; Dynamic inconsistency
    JEL: C32 C51 D91 I12
    Date: 2009–12–15
    URL: http://d.repec.org/n?u=RePEc:hhs:osloec:2009_026&r=ecm
  15. By: Costa-Gomes, Miguel A. (University of Aberdeen); Huck, Steffen (University College London); Weizsäcker, Georg (University College London)
    Abstract: In many economic contexts, an elusive variable of interest is the agent's expectation about relevant events, e.g. about other agents' behavior. Recent experimental studies as well as surveys have asked participants to state their beliefs explicitly, but little is known about the causal relation between beliefs and other behavioral variables. This paper discusses the possibility of creating exogenous instrumental variables for belief statements by shifting the probabilities of the relevant events. We conduct trust game experiments where the amount sent back by the second player (trustee) is exogenously varied by a random process, in a way that informs only the first player (trustor) about the realized variation. The procedure allows us to detect causal links from beliefs to actions under plausible assumptions. The IV estimates indicate a significant causal effect, comparable to the connection between beliefs and actions that is suggested by OLS analyses.
    Keywords: social capital, trust game, instrumental variables, belief elicitation
    JEL: C72 C81 C91 D84
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp4709&r=ecm
  16. By: Schlotter, Martin (Ifo Institute for Economic Research); Schwerdt, Guido (Ifo Institute for Economic Research); Woessmann, Ludger (Ifo Institute for Economic Research)
    Abstract: Education policy-makers and practitioners want to know which policies and practices can best achieve their goals. But research that can inform evidence-based policy often requires complex methods to distinguish causation from accidental association. Avoiding econometric jargon and technical detail, this paper explains the main idea and intuition of leading empirical strategies devised to identify causal impacts and illustrates their use with real-world examples. It covers six evaluation methods: controlled experiments, lotteries of oversubscribed programs, instrumental variables, regression discontinuities, differences-in-differences, and panel-data techniques. Illustrative applications include evaluations of early-childhood interventions, voucher lotteries, funding programs for the disadvantaged, and compulsory-schooling and tracking reforms.
    Keywords: causal effects, education, policy evaluation, non-technical guide
    JEL: I20 C01
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp4725&r=ecm
  17. By: Paul M. Voutier
    Abstract: We present a new approximation to the normal distribution quantile function. It has a similar form to the approximation of Beasley and Springer [3], providing a maximum absolute error of less than $2.5 \cdot 10^{-5}$. This is less accurate than [3], but still sufficient for many applications. However, it is faster than [3]; this speed advantage is its primary benefit, which can be crucial in many applications, including in financial markets.
    Date: 2010–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1002.0567&r=ecm
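    Claims about the maximum absolute error of a quantile approximation are typically verified against a slow, high-accuracy reference inverse. The sketch below builds such a reference by bisection on the erf-based normal CDF; it is a generic validation tool, not Voutier's approximation itself, and the bracketing interval and tolerance are illustrative choices.

    ```python
    import math

    def norm_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def norm_quantile(p, tol=1e-10):
        """Reference inverse CDF by bisection: slow but accurate, suitable
        as a yardstick when measuring the error of a fast approximation."""
        lo, hi = -10.0, 10.0       # bracket covers any practical p
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if norm_cdf(mid) < p:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    q = norm_quantile(0.975)       # approximately 1.95996
    ```

    A candidate approximation's maximum absolute error can then be estimated by evaluating both functions on a fine grid of probabilities and taking the largest discrepancy.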
  18. By: Ferracci, Marc (CREST-INSEE); Jolivet, Grégory (University of Bristol); van den Berg, Gerard J. (University of Mannheim)
    Abstract: We extend the standard evaluation framework to allow for interactions between individuals within segmented markets. An individual's outcome depends not only on the assigned treatment status but also on (features of) the distribution of the assigned treatments in his market. To evaluate how the distribution of treatments within a market causally affects the average effect within the market, averaged over the full population, we develop an identification and estimation method in two steps. The first one focuses on the distribution of the treatment within markets and between individuals and the second step addresses the distribution of the treatment between markets. We apply our method to data on training programs for unemployed workers in France. We use a rich administrative register of unemployment and training spells as well as the information on local labor demand that is used by unemployment agencies to allocate training programs. The results show that the average treatment effect on the employment rate causally decreases with respect to the proportion treated in the market. Our analysis accounts for unobserved heterogeneity between markets (using the longitudinal dimension of the data) and, in a robustness check, between individuals.
    Keywords: treatment evaluation, equilibrium effects, matching estimators
    JEL: C13 C14 C21 C31 J64
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp4700&r=ecm
  19. By: Thomas Conlon; Heather J. Ruskin; Martin Crane
    Abstract: The dynamics of the equal-time cross-correlation matrix of multivariate financial time series is explored by examination of the eigenvalue spectrum over sliding time windows. Empirical results for the S&P 500 and the Dow Jones Euro Stoxx 50 indices reveal that the dynamics of the small eigenvalues of the cross-correlation matrix, over these time windows, oppose those of the largest eigenvalue. This behaviour is shown to be independent of the size of the time window and the number of stocks examined. A basic one-factor model is then proposed, which captures the main dynamical features of the eigenvalue spectrum of the empirical data. Through the addition of perturbations to the one-factor model (leading to a 'market plus sectors' model), additional sectoral features are added, resulting in an Inverse Participation Ratio comparable to that found for empirical data. By partitioning the eigenvalue time series, we then show that negative index returns (drawdowns) are associated with periods where the largest eigenvalue is greatest, while positive index returns (drawups) are associated with periods where the largest eigenvalue is smallest. The study of correlation dynamics provides some insight into the collective behaviour of traders with varying strategies.
    Date: 2010–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1002.0321&r=ecm
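    The link between the largest eigenvalue and market-wide co-movement is easiest to see in the two-asset case, where the correlation matrix has a closed-form spectrum. The sketch below uses hypothetical correlation values purely for illustration; it is not the paper's sliding-window analysis.

    ```python
    def corr2_eigenvalues(rho):
        """Eigenvalues of the 2x2 correlation matrix [[1, rho], [rho, 1]].
        The characteristic equation (1 - lam)**2 - rho**2 = 0 gives
        lam = 1 + rho and lam = 1 - rho."""
        return 1.0 + rho, 1.0 - rho

    # as cross-correlation rises (e.g. in a drawdown), the largest
    # eigenvalue grows while the smaller one shrinks in compensation
    calm = corr2_eigenvalues(0.2)      # (1.2, 0.8)
    stressed = corr2_eigenvalues(0.8)  # (1.8, 0.2)
    ```

    The same trade-off drives the empirical finding: since the eigenvalues of a correlation matrix sum to the number of assets, a growing largest eigenvalue forces the small eigenvalues down, which is the opposing dynamic the abstract describes.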
  20. By: Alghalith, Moawia
    Abstract: We present a new method of estimating the asset stochastic volatility and return. In doing so, we overcome some of the limitations of the existing random walk models, such as the GARCH/ARCH models.
    Keywords: portfolio; investment; stock; stochastic volatility
    JEL: C13 G12 G0
    Date: 2010–01–28
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:20303&r=ecm
  21. By: Jie-Jun Tseng; Sai-Ping Li
    Abstract: An analysis of the stylized facts in financial time series is carried out. We find that it is the slow decay behaviour in the autocorrelation functions of absolute returns, rather than the heavy tails in asset return distributions, that is directly related to the degree of clustering of large fluctuations within the financial time series. We also introduce an index to quantitatively measure the clustering behaviour of fluctuations in these time series and show that big losses in financial markets usually lump more severely than big gains. We further give examples to demonstrate that, compared to conventional methods, our index enables one to extract more information from the financial time series.
    Date: 2010–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1002.0284&r=ecm
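    The stylized fact at issue — raw returns are nearly uncorrelated while absolute returns show persistent autocorrelation — is easy to reproduce on synthetic data. The sketch below uses a GARCH(1,1) generator as an illustrative source of volatility clustering; it is not the paper's clustering index, and all parameter values are assumptions.

    ```python
    import random

    def simulate_garch(n=4000, omega=0.05, alpha=0.1, beta=0.85, seed=3):
        """Simulate a GARCH(1,1) return series: an illustrative generator
        of volatility clustering, not the paper's empirical data."""
        rng = random.Random(seed)
        h = omega / (1 - alpha - beta)          # unconditional variance
        r = []
        for _ in range(n):
            ret = (h ** 0.5) * rng.gauss(0, 1)
            r.append(ret)
            h = omega + alpha * ret ** 2 + beta * h
        return r

    def acf(x, lag):
        """Sample autocorrelation of a series at a given lag."""
        m = sum(x) / len(x)
        num = sum((x[t] - m) * (x[t - lag] - m) for t in range(lag, len(x)))
        den = sum((v - m) ** 2 for v in x)
        return num / den

    r = simulate_garch()
    raw = acf(r, 1)                             # near zero
    absolute = acf([abs(v) for v in r], 1)      # clearly positive
    ```

    Repeating the comparison over a range of lags would show the slow decay in the absolute-return autocorrelations that the abstract links to the degree of clustering.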

This nep-ecm issue is ©2010 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.