nep-ecm New Economics Papers
on Econometrics
Issue of 2007‒11‒10
nineteen papers chosen by
Sune Karlsson
Orebro University

  1. On the efficiency and consistency of likelihood estimation in multivariate conditionally heteroskedastic dynamic regression models By Gabriele Fiorentini; Enrique Sentana
  2. Indirect estimation of large conditionally heteroskedastic factor models, with an application to the Dow 30 stocks By Gabriele Fiorentini; Giorgio Calzolari; Enrique Sentana
  3. On the distribution of penalized maximum likelihood estimators: The LASSO, SCAD, and thresholding. By Pötscher, Benedikt M.; Leeb, Hannes
  4. General distribution tests based on the method of Maximum Entropy density By Thanasis Stengos; Ximing Wu
  5. NoVaS Transformations: Flexible Inference for Volatility Forecasting By Dimitrios D. Thomakos; Dimitris N. Politis
  6. Expectations Hypothesis Tests in the Presence of Model Uncertainty By Erdenebat Bataa; Dong H. Kim; Denise R. Osborn
  7. On approximating the distributions of goodness-of-fit test statistics based on the empirical distribution function: The case of unknown parameters By Marco Capasso; Lucia Alessi; Matteo Barigozzi; Giorgio Fagiolo
  8. Bandwidth Selection and the Estimation of Treatment Effects with Unbalanced Data By Jose Galdo; Jeffrey Smith; Dan Black
  9. Multivariate forecast evaluation and rationality testing By Ivana Komunjer; Michael T. Owyang
  10. On Sequential Estimation and Prediction for Discrete Time Series By Gusztav Morvai; Benjamin Weiss
  11. Method-of-Moment View of Linear Simultaneous Equation Systems By Myoung-jae Lee
  12. Bayesian Estimation of Unknown Regression Error Heteroscedasticity By Hiroaki Chigira; Tsunemasa Shiba
  13. Modelling good and bad volatility By Matteo Pelagatti
  14. Out-of-Sample Forecasting of Unemployment Rates with Pooled STVECM Forecasts By Costas Milas; Philip Rothman
  15. Asymptotics for the Hirsch Index By Beirlant, J.; Einmahl, J.H.J.
  16. Prediction of random effects and effects of misspecification of their distribution By Charles McCulloch
  17. Difference in Generalized-Differences with Panel Data: Effects of Moving from Private to Public School on Test Scores By Myoung-jae Lee
  18. Models for Non-Exclusive Multinomial Choice, with Application to Indonesian Rural Households By Christopher L. Gilbert; Francesca Modena
  19. Martingales and First Passage Times of AR(1) Sequences By Alex Novikov; Nino Kordzakhia

  1. By: Gabriele Fiorentini (University of Florence and The Rimini Centre for Economic Analysis, Italy.); Enrique Sentana (CEMFI, Spain)
    Abstract: We rank the efficiency of several likelihood-based parametric and semiparametric estimators of conditional mean and variance parameters in multivariate dynamic models with i.i.d. spherical innovations, and show that Gaussian pseudo maximum likelihood estimators are inefficient except under normality. We also provide conditions for partial adaptivity of semiparametric procedures, and relate them to the consistency of distributionally misspecified maximum likelihood estimators. We propose Hausman tests that compare Gaussian pseudo maximum likelihood estimators with more efficient but less robust competitors. We also study the efficiency of sequential estimators of the shape parameters. Finally, we provide finite sample results through Monte Carlo simulations.
    Keywords: Adaptivity, ARCH, Elliptical Distributions, Financial Returns, Hausman tests, Semiparametric Estimators, Sequential Estimators.
    JEL: C13 C14 C12 C51 C52
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:38-07&r=ecm
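Since this abstract (and others in this issue) relies on Hausman tests, a generic sketch may be useful. The quadratic form below is the textbook Hausman statistic comparing a robust but inefficient estimator with an efficient but fragile one; it is not the paper's specific construction, and the function name and interface are illustrative only.

```python
import numpy as np

def hausman_statistic(b_robust, b_efficient, V_robust, V_efficient):
    """Textbook Hausman statistic
        H = (b_r - b_e)' [V_r - V_e]^{-1} (b_r - b_e),
    asymptotically chi-squared (df = dim b) under the null that both
    estimators are consistent, in which case they should agree."""
    d = np.asarray(b_robust, dtype=float) - np.asarray(b_efficient, dtype=float)
    V = np.asarray(V_robust, dtype=float) - np.asarray(V_efficient, dtype=float)
    return float(d @ np.linalg.solve(V, d))
```

A large value of H signals that the efficient estimator's extra assumptions (here, a specific innovation distribution) are suspect.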
  2. By: Gabriele Fiorentini (University of Florence and The Rimini Centre for Economic Analysis, Italy.); Giorgio Calzolari (University of Florence); Enrique Sentana (CEMFI, Spain)
    Abstract: We derive indirect estimators of conditionally heteroskedastic factor models in which the volatilities of common and idiosyncratic factors depend on their past unobserved values by calibrating the score of a Kalman-filter approximation with inequality constraints on the auxiliary model parameters. We also propose alternative indirect estimators for large-scale models, and explain how to apply our procedures to many other dynamic latent variable models. We analyse the small sample behaviour of our indirect estimators and several likelihood-based procedures through an extensive Monte Carlo experiment with empirically realistic designs. Finally, we apply our procedures to weekly returns on the Dow 30 stocks.
    Keywords: ARCH, Idiosyncratic risk, Inequality constraints, Kalman filter, Sequential estimators, Simulation estimators, Volatility.
    JEL: C13 C15 C32
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:40-07&r=ecm
  3. By: Pötscher, Benedikt M.; Leeb, Hannes
    Abstract: We study the distributions of the LASSO, SCAD, and thresholding estimators, in finite samples and in the large-sample limit. The asymptotic distributions are derived for both the case where the estimators are tuned to perform consistent model selection and for the case where the estimators are tuned to perform conservative model selection. Our findings complement those of Knight and Fu (2000) and Fan and Li (2001). We show that the distributions are typically highly nonnormal regardless of how the estimator is tuned, and that this property persists in large samples. An impossibility result regarding estimation of the estimators' distribution function is also provided.
    Keywords: Penalized maximum likelihood; LASSO; SCAD; thresholding; post-model-selection estimator; finite-sample distribution; asymptotic distribution; estimation of distribution; uniform consistency
    JEL: C13 C2
    Date: 2007–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:5615&r=ecm
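A minimal sketch (mine, not from the paper) of why such estimators have highly non-normal distributions: in an orthonormal design the LASSO reduces to soft-thresholding, whose sampling distribution mixes a point mass at exactly zero with a continuous part.

```python
import numpy as np

def soft_threshold(z, lam):
    """LASSO estimate of a normal mean: shrink z toward 0 by lam,
    and set it to exactly 0 whenever |z| <= lam."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def hard_threshold(z, lam):
    """Hard-thresholding: keep z if |z| > lam, else return 0."""
    return np.where(np.abs(z) > lam, z, 0.0)

# Simulate estimates of theta = 0 and apply a conservative tuning
# lam ~ n^(-1/2); a large share of estimates is exactly zero, so the
# distribution cannot be normal.
rng = np.random.default_rng(0)
n = 200
z = rng.standard_normal(10_000) / np.sqrt(n)
est = soft_threshold(z, 1.0 / np.sqrt(n))
print("share of estimates exactly 0:", float(np.mean(est == 0.0)))
```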
  4. By: Thanasis Stengos (University of Guelph, Canada and The Rimini Centre for Economic Analysis, Rimini, Italy.); Ximing Wu (Texas A&M University, USA and University of Guelph, Canada)
    Abstract: The proposed tests are derived from maximizing the differential entropy subject to moment constraints. By exploiting the equivalence between the Maximum Entropy and Maximum Likelihood estimates of the general exponential family, we can use the conventional Likelihood Ratio, Wald and Lagrange Multiplier testing principles in the maximum entropy framework. In particular we use the Lagrange Multiplier method to derive tests for normality and their asymptotic properties. Monte Carlo evidence suggests that the proposed tests have desirable small sample properties.
    Keywords: distribution test, maximum entropy, normality.
    JEL: C1 C12 C16
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:24-07&r=ecm
  5. By: Dimitrios D. Thomakos (University of Peloponnese, Greece and The Rimini Centre for Economic Analysis, Italy.); Dimitris N. Politis (University of California, San Diego, USA)
    Abstract: In this paper we contribute several new results on the NoVaS transformation approach for volatility forecasting introduced by Politis (2003a,b, 2007). In particular: (a) we introduce an alternative target distribution (uniform); (b) we present a new method for volatility forecasting using NoVaS; (c) we show that the NoVaS methodology is applicable in situations where (global) stationarity fails, such as the cases of local stationarity and/or structural breaks; (d) we show how to apply the NoVaS ideas in the case of returns with asymmetric distribution; and finally (e) we discuss the application of NoVaS to the problem of estimating value at risk (VaR). The NoVaS methodology allows for a flexible approach to inference and has immediate applications in the context of short time series and series that exhibit local behavior (e.g. breaks, regime switching, etc.). We conduct an extensive simulation study on the predictive ability of the NoVaS approach and find that NoVaS forecasts lead to a much 'tighter' distribution of the forecasting performance measure for all data generating processes. This is especially relevant in the context of volatility predictions for risk management. We further illustrate the use of NoVaS for a number of real datasets and compare the forecasting performance of NoVaS-based volatility forecasts with realized and range-based volatility measures.
    Keywords: ARCH, GARCH, local stationarity, structural breaks, VaR, volatility.
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:44-07&r=ecm
  6. By: Erdenebat Bataa (Centre for Growth and Business Cycles Research, Economics, University of Manchester); Dong H. Kim (Department of Economics, Korea University and Economics, University of Manchester); Denise R. Osborn (Centre for Growth and Business Cycles Research, Economics, University of Manchester)
    Abstract: We extend vector autoregressive (VAR) model based expectations hypothesis tests of the term structure by relaxing some specification assumptions in order to reflect model uncertainty. Firstly, the wild bootstrap is used to allow for conditional heteroskedasticity of unknown form in the VAR residuals. Secondly, the model selection procedure is endogenized in the bootstrap replications and supplemented with a robust multivariate autocorrelation test. Finally, a stationarity correction is introduced to prevent the bias corrected VAR coefficients from becoming explosive. When the new methodology is applied to extensive US term structure data it emerges that the model uncertainty goes a long way in explaining the empirical rejections of the theory.
    Keywords: expectations hypothesis, term structure, wild bootstrap, conditional heteroskedasticity
    JEL: G10 E43
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:iek:wpaper:0703&r=ecm
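The wild bootstrap mentioned in the abstract can be sketched in a few lines. The Rademacher-weight version below is a standard textbook variant that preserves conditional heteroskedasticity of unknown form; it is not necessarily the authors' exact implementation, and the function name is mine.

```python
import numpy as np

def wild_bootstrap_samples(y, X, n_boot=999, seed=0):
    """Generate wild-bootstrap pseudo-samples for the regression y = X b + u.

    Each OLS residual is multiplied by an independent Rademacher weight
    (+1 or -1 with equal probability), so the squared residuals -- and
    hence any heteroskedasticity pattern -- are preserved exactly."""
    rng = np.random.default_rng(seed)
    b = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS coefficients
    fitted = X @ b
    resid = y - fitted
    for _ in range(n_boot):
        v = rng.choice([-1.0, 1.0], size=len(y))  # Rademacher weights
        yield fitted + resid * v
```

In practice one recomputes the test statistic on each pseudo-sample to build its bootstrap null distribution.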
  7. By: Marco Capasso; Lucia Alessi; Matteo Barigozzi; Giorgio Fagiolo
    Abstract: This note discusses some problems that may arise when approximating, via Monte-Carlo simulations, the distributions of goodness-of-fit test statistics based on the empirical distribution function. We argue that failing to re-estimate unknown parameters on each simulated Monte-Carlo sample -- and thus failing to employ this information in building the test statistic -- may lead to incorrect, overly conservative testing. Furthermore, we present a simple example suggesting that the impact of this possible mistake may turn out to be dramatic and does not vanish as the sample size increases.
    Keywords: Goodness of fit tests, Critical values, Anderson-Darling statistic, Kolmogorov-Smirnov statistic, Kuiper Statistic, Cramer-Von Mises statistic, Empirical Distribution function, Monte-Carlo Simulations
    Date: 2007–11–06
    URL: http://d.repec.org/n?u=RePEc:ssa:lemwps:2007/23&r=ecm
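The note's point can be reproduced numerically: Monte-Carlo critical values for a Kolmogorov-Smirnov-type normality test are markedly smaller when the mean and variance are re-estimated on every simulated sample (the Lilliefors case) than when they are treated as known, so using the known-parameter values is conservative. This is an illustrative sketch, not the authors' example.

```python
import numpy as np
from math import erf, sqrt

normal_cdf = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0))))

def ks_statistic(x, reestimate):
    """KS distance of the sample from N(0,1), after standardizing with
    either estimated moments (reestimate=True) or the true (0,1)."""
    x = np.sort(x)
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=1) if reestimate else x
    cdf = normal_cdf(z)
    up = np.arange(1, n + 1) / n - cdf
    down = cdf - np.arange(0, n) / n
    return max(up.max(), down.max())

def mc_critical_value(n, reestimate, level=0.05, n_sim=2000, seed=0):
    """Monte-Carlo critical value under the normal null."""
    rng = np.random.default_rng(seed)
    stats = [ks_statistic(rng.standard_normal(n), reestimate)
             for _ in range(n_sim)]
    return float(np.quantile(stats, 1 - level))

cv_reestimated = mc_critical_value(50, reestimate=True)   # correct procedure
cv_known = mc_critical_value(50, reestimate=False)        # ignores estimation
print(cv_reestimated, cv_known)  # the first is noticeably smaller
```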
  8. By: Jose Galdo (McMaster University and IZA); Jeffrey Smith (University of Michigan, NBER, IFS, PSI, ZEW and IZA); Dan Black (University of Chicago and NORC)
    Abstract: This paper addresses the selection of smoothing parameters for estimating the average treatment effect on the treated using matching methods. Because precise estimation of the expected counterfactual is particularly important in regions containing the mass of the treated units, we define and implement weighted cross-validation approaches that improve over conventional methods by considering the location of the treated units in the selection of the smoothing parameters. We also implement a locally varying bandwidth method that uses larger bandwidths in areas where the mass of the treated units is located. A Monte Carlo study compares our proposed methods to the conventional unweighted method and to a related method inspired by Bergemann et al. (2005). The Monte Carlo analysis indicates efficiency gains from all methods that take account of the location of the treated units. We also apply all five methods to bandwidth selection in the context of the data from LaLonde’s (1986) study of the performance of non-experimental estimators using the experimental data from the National Supported Work (NSW) Demonstration program as a benchmark. Overall, both the Monte Carlo analysis and the empirical application show feasible precision gains for the weighted cross-validation and the locally varying bandwidth approaches.
    JEL: C13 C14
    Date: 2007–10
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp3095&r=ecm
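The core idea of weighting cross-validation by where the treated units sit can be sketched with a leave-one-out criterion for a Nadaraya-Watson regression. This is a generic illustration under my own naming, not the authors' estimator: the weights argument would, in their setting, reflect the local mass of treated units.

```python
import numpy as np

def weighted_cv_score(h, x, y, weights):
    """Weighted leave-one-out cross-validation score for a Gaussian-kernel
    Nadaraya-Watson regression with bandwidth h. Each validation point's
    squared error is weighted, so the bandwidth is judged by its fit in
    the regions that matter most (e.g. where treated units are located)."""
    n = len(x)
    errs = np.empty(n)
    for i in range(n):
        d = (x - x[i]) / h
        k = np.exp(-0.5 * d * d)   # Gaussian kernel weights
        k[i] = 0.0                 # leave observation i out
        errs[i] = y[i] - (k @ y) / k.sum()
    return float(np.average(errs ** 2, weights=weights))
```

One would minimize this score over a grid of candidate bandwidths h.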
  9. By: Ivana Komunjer; Michael T. Owyang
    Abstract: In this paper, we propose a new family of multivariate loss functions that can be used to test the rationality of vector forecasts without assuming independence across individual variables. When only one variable is of interest, the loss function reduces to the flexible asymmetric family recently proposed by Elliott, Komunjer, and Timmermann (2005). Following their methodology, we derive a GMM test for multivariate forecast rationality that allows the forecast errors to be dependent, and takes into account forecast estimation uncertainty. We use our test to study the rationality of macroeconomic vector forecasts in the growth rate in nominal output, the CPI inflation rate, and a short-term interest rate.
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2007-047&r=ecm
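For a single variable, the flexible asymmetric loss family of Elliott, Komunjer, and Timmermann (2005) can be sketched as below; the multivariate extension proposed in this paper is not reproduced here, and the function interface is my own.

```python
import numpy as np

def asymmetric_loss(e, alpha=0.5, p=2):
    """Flexible asymmetric loss on a forecast error e:
        L(e) = [alpha + (1 - 2*alpha) * 1(e < 0)] * |e|^p.
    alpha = 0.5 recovers (scaled) absolute loss for p=1 or squared loss
    for p=2; alpha != 0.5 penalizes under- and over-prediction
    differently."""
    e = np.asarray(e, dtype=float)
    weight = np.where(e < 0, 1.0 - alpha, alpha)
    return weight * np.abs(e) ** p
```

A rationality test then asks whether observed forecast errors satisfy the first-order condition of minimizing such a loss.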
  10. By: Gusztav Morvai; Benjamin Weiss
    Abstract: The problem of extracting as much information as possible from a sequence of observations of a stationary stochastic process X0, X1, …, Xn has been considered by many authors from different points of view. It has long been known through the work of D. Bailey that no universal estimator for P(Xn+1 | X0, X1, …, Xn) can be found which converges to the true conditional probability almost surely. Despite this result, for restricted classes of processes, or for sequences of estimators along stopping times, universal estimators can be found. We present here a survey of some of the recent work that has been done along these lines.
    Keywords: Nonparametric estimation; Stationary processes
    Date: 2007–09
    URL: http://d.repec.org/n?u=RePEc:huj:dispap:dp464&r=ecm
  11. By: Myoung-jae Lee (Department of Economics, Korea University)
    Abstract: In this paper, we review the modern method-of-moment-based approaches to identification and estimation of linear simultaneous equation systems. First, we present the rank condition for the structural form (SF) parameter identification. The rank condition comes naturally and is much easier to understand than that in the conventional reduced-form-based indirect approach. Then, we show how to estimate all SF parameters jointly (in a single step) with method-of-moment estimators. As it turns out, using only unconditional moments, but not any conditional moments, greatly simplifies the identification and estimation issues, and makes light work of conveying the essential ideas involved.
    Keywords: methods of moments, linear simultaneous equations, system estimation
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:iek:wpaper:0719&r=ecm
  12. By: Hiroaki Chigira; Tsunemasa Shiba
    Abstract: We propose a Bayesian procedure to estimate possibly heteroscedastic variances of the regression error term, without assuming any structure on them. Our prior information, in this paper, is elicited from the well-known Eicker-White Heteroscedasticity Consistent Variance-Covariance Matrix Estimator ("HCCM" hereafter). After we obtain an estimate via the HCCM, we set up a Bayesian model for the entire structure and use MCMC to simulate posterior pdf's of possibly heteroscedastic variances whose structures are unknown. In addition to the numerical examples, we present an empirical investigation of a set of Japanese pharmaceutical and biomedical companies' stock prices.
    Date: 2007–10
    URL: http://d.repec.org/n?u=RePEc:hst:hstdps:d07-221&r=ecm
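The Eicker-White HCCM that the authors use as prior information has the standard sandwich form, sketched below (the Bayesian step of the paper is not shown, and the function name is illustrative).

```python
import numpy as np

def hc0_covariance(y, X):
    """Eicker-White heteroskedasticity-consistent (HC0) covariance matrix
    of the OLS estimator:
        (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1},
    where e are the OLS residuals. The squared residuals e_i^2 are the
    crude per-observation variance estimates the HCCM is built on."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b                              # OLS residuals
    XtX_inv = np.linalg.inv(X.T @ X)
    meat = X.T @ (X * (e ** 2)[:, None])       # X' diag(e^2) X
    return XtX_inv @ meat @ XtX_inv
```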
  13. By: Matteo Pelagatti
    Abstract: The returns of many financial assets show significant skewness, but in the literature this issue is only marginally dealt with. Our conjecture is that this distributional asymmetry may be due to two different dynamics in positive and negative returns. In this paper we propose a process that allows the simultaneous modelling of skewed conditional returns and different dynamics in their conditional second moments. The main stochastic properties of the model are analyzed and necessary and sufficient conditions for weak and strict stationarity are derived. An application to the daily returns on the principal index of the London Stock Exchange supports our model when compared to other frequently used GARCH-type models, which are nested into ours.
    Keywords: Volatility, Skewness, GARCH, Asymmetric Dynamics, Stationarity
    JEL: C22 C53 G10
    Date: 2007–11
    URL: http://d.repec.org/n?u=RePEc:mis:wpaper:20071101&r=ecm
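The idea of different variance dynamics for positive and negative returns can be illustrated with a generic asymmetric GARCH(1,1)-style recursion (GJR-type). This is NOT Pelagatti's exact specification, which additionally models skewed conditional returns; the sketch only shows the asymmetry mechanism.

```python
import numpy as np

def asymmetric_garch_path(n, omega, alpha_pos, alpha_neg, beta, seed=0):
    """Simulate returns whose conditional variance reacts with coefficient
    alpha_pos to positive shocks and alpha_neg to negative shocks:
        s2_{t+1} = omega + a(r_t) * r_t^2 + beta * s2_t,
    with a(r) = alpha_pos if r >= 0 else alpha_neg."""
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    # start at the (symmetric-innovation) unconditional variance
    s2 = omega / (1.0 - 0.5 * (alpha_pos + alpha_neg) - beta)
    for t in range(n):
        r[t] = np.sqrt(s2) * rng.standard_normal()
        a = alpha_pos if r[t] >= 0 else alpha_neg
        s2 = omega + a * r[t] ** 2 + beta * s2
    return r
```

With alpha_neg > alpha_pos, negative returns raise future volatility more than positive ones of the same size ("bad" volatility).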
  14. By: Costas Milas (Keele University, UK and The Rimini Centre for Economic Analysis, Italy.); Philip Rothman (East Carolina University, USA)
    Abstract: In this paper we use smooth transition vector error-correction models (STVECMs) in a simulated out-of-sample forecasting experiment for the unemployment rates of the four non-Euro G-7 countries: the U.S., U.K., Canada, and Japan. For the U.S., pooled forecasts constructed by taking the median value across the point forecasts generated by the linear and STVECM models appear to perform better than the linear AR(p) benchmark, particularly during business cycle expansions. Such pooling also tends to lead to statistically significant forecast improvement for the U.K. 'Reality checks' of these results suggest that they do not stem from data snooping.
    Keywords: nonlinear, asymmetric, STVECM, pooled forecasts, Diebold-Mariano
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:49-07&r=ecm
  15. By: Beirlant, J.; Einmahl, J.H.J. (Tilburg University, Center for Economic Research)
    Abstract: Over the last decade, methods for quantifying the research output of individual researchers have become quite popular in academic policy making. The h-index (Hirsch, 2005) constitutes an interesting quality measure that has attracted a lot of attention recently. It is now a standard measure available, for instance, on the Web of Science. In this paper we establish the asymptotic normality of the empirical h-index. The rate of convergence is non-standard: √h/(1 + nf(h)), where f is the density of the citation distribution and n the number of publications of a researcher. In case the citations follow a Pareto-type or a Weibull-type distribution as defined in extreme value theory, our general result nicely specializes to results that are useful for constructing confidence intervals for the h-index.
    Keywords: Asymptotic normality; confidence interval; extreme value theory; research output; scientometrics; tail empirical process
    JEL: C13 C14
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200786&r=ecm
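The empirical h-index whose asymptotics the paper studies is simple to compute, e.g.:

```python
def h_index(citations):
    """Empirical h-index (Hirsch, 2005): the largest h such that at
    least h of the researcher's papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with >= 4 citations each
```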
  16. By: Charles McCulloch (University of California, San Francisco)
    Abstract: Statistical models that include random effects are commonly used to analyze longitudinal and clustered data. These models are often used to derive predicted values of the random effects, for example in predicting which physicians or hospitals are performing exceptionally well or exceptionally poorly. I start this talk with a brief introduction and several examples of the use of prediction of random effects in practice. In typical applications, the data analyst specifies a parametric distribution for the random effects (often Gaussian) although there is little information available to guide this choice. Are predictions sensitive to this specification? Through theory, simulations, and an example illustrating the prediction of who is likely to go on to develop high blood pressure, I show that misspecification can have a moderate impact on predictions of random effects and describe simple ways to diagnose such sensitivity.
    Date: 2007–10–31
    URL: http://d.repec.org/n?u=RePEc:boc:wsug07:12&r=ecm
  17. By: Myoung-jae Lee (Department of Economics, Korea University)
    Abstract: Difference in differences (DD) relies on the key identifying condition that the untreated response variable would have grown equally across the control and treatment groups; i.e., the 'time effects' across the groups are the same. This condition can be rewritten as the 'group effects' across the time points being the same, with which this paper generalizes DD to difference in generalized-differences (DG). DG is indexed by a parameter η, and includes DD as a special case when η = 1. This makes it possible to use DG as a sensitivity analysis for DD by trying values of η other than one. Going beyond sensitivity analysis, one may desire to fix η. For this, we provide a way to get a benchmark value of η using a dynamic panel data model for the control group. An empirical analysis is provided for the effects of moving from private to public school on test scores. In the empirical analysis, (i) the DD magnitude is fairly sensitive to changes in η around one, but its statistical significance is not, (ii) η is significantly smaller than one for math scores (and possibly for science scores), and (iii) DG yields a significant negative effect of about 3-5% for reading scores, but the effects are ambiguous or insignificant for the other scores. η being less than 1 means that, had the movers to public school stayed, the score gap between the movers and stayers would have narrowed. That is, the move to public school is likely to have been involuntary.
    Keywords: treatment effect, difference in differences, panel data, private-school effect
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:iek:wpaper:0721&r=ecm
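The usual DD estimator can be written with an illustrative weight eta on the control-group time change, so that eta = 1 recovers DD and other values probe the common-trend assumption. Note this weighting is only one plausible reading of the paper's DG parameterization, which should be taken from the paper itself.

```python
import numpy as np

def diff_in_diffs(treat_pre, treat_post, ctrl_pre, ctrl_post, eta=1.0):
    """DD-style estimator with a weight eta on the control-group change:
        (mean treated change) - eta * (mean control change).
    eta = 1.0 gives the standard difference-in-differences estimator;
    varying eta around 1 serves as a crude sensitivity check.
    CAUTION: this is an illustrative reading, not Lee's exact DG."""
    d_treat = np.mean(treat_post) - np.mean(treat_pre)
    d_ctrl = np.mean(ctrl_post) - np.mean(ctrl_pre)
    return d_treat - eta * d_ctrl
```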
  18. By: Christopher L. Gilbert; Francesca Modena
    Abstract: Textbook discussions of discrete choice modelling focus on binomial and multinomial choice models in which agents select a single response. We consider the situation of non-exclusive multinomial choice. The widely used Marginal Logit Model imposes independence and has other disadvantages. We propose two models which account for non-exclusive and dependent multiple responses and require at least one response. In the first and simpler specification, the Poisson-multinomial, households first choose the number of responses to a specific shock, and then the specific choices are identified to maximize household utility conditional on the former choice. The second specification, the threshold-multinomial, generalizes the standard multinomial logit model by supposing that agents will choose more than one response if the utility they derive from other choices is “close” to that of the utility-maximizing choice. We apply these two approaches to reported responses of rural Indonesian households to demographic and economic shocks.
    Keywords: Discrete choice models, Marginal logit, Shocks, Risk coping strategies
    JEL: C25 C51 O12
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:trn:utwpde:0724&r=ecm
  19. By: Alex Novikov (School of Finance and Economics, University of Technology, Sydney); Nino Kordzakhia
    Abstract: Using the martingale approach we find sufficient conditions for exponential boundedness of first passage times over a level for ergodic first order autoregressive sequences (AR(1)). Further, we prove a martingale identity and use it for obtaining explicit bounds for the expectation of exit times.
    Keywords: first passage times; autoregressive processes; martingales; exponential boundedness
    Date: 2007–10–01
    URL: http://d.repec.org/n?u=RePEc:uts:rpaper:205&r=ecm
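The object of study here -- the first passage time of an ergodic AR(1) sequence over a level -- can be simulated directly. The paper's contribution is analytic (martingale-based bounds on the expectation), which this sketch does not reproduce; it only illustrates the quantity being bounded.

```python
import numpy as np

def first_passage_time(rho, sigma, level, x0=0.0, max_steps=100_000, rng=None):
    """First time t at which the AR(1) sequence
        X_t = rho * X_{t-1} + sigma * eps_t,  eps_t ~ N(0,1) i.i.d.,
    started at x0, exceeds 'level'. Requires |rho| < 1 (ergodic case).
    Returns max_steps if the level is not reached by then (censoring)."""
    if rng is None:
        rng = np.random.default_rng()
    x, t = x0, 0
    while x <= level and t < max_steps:
        x = rho * x + sigma * rng.standard_normal()
        t += 1
    return t

# Monte-Carlo estimate of the expected exit time over level 2.0
rng = np.random.default_rng(42)
times = [first_passage_time(0.5, 1.0, 2.0, rng=rng) for _ in range(500)]
print("mean exit time over level 2.0:", float(np.mean(times)))
```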

This nep-ecm issue is ©2007 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.