nep-ecm New Economics Papers
on Econometrics
Issue of 2010‒01‒23
twenty papers chosen by
Sune Karlsson
Örebro University

  1. Structural change tests for GEL criteria By Alain Guay; Jean-Francois Lamarche
  2. GARCH-Based Identification and Estimation of Triangular Systems By Todd, Prono
  3. The impacts of outliers on different estimators for GARCH processes: an empirical study By Ardelean, Vlad
  4. Simple GMM Estimation of the Semi-Strong GARCH(1,1) Model By Todd, Prono
  5. A Gaussian Test for Cointegration By Gulasekaran Rajaguru; Tilak Abeysinghe
  6. Dynamic Panel Data Models Featuring Endogenous Interaction and Spatially Correlated Errors By Jan P.A.M. Jacobs; Jenny E. Ligthart; Hendrik Vrijburg
  7. Improving the Forecasting of Dynamic Conditional Correlation: a Volatility Dependent Approach By Edoardo Otranto
  8. GMM, Generalized Empirical Likelihood, and Time Series By Federico Crudu
  9. Bayesian Inference in a Stochastic Volatility Nelson-Siegel Model By Nikolaus Hautsch; Fuyu Yang
  10. Spatial-serial dependency in multivariate GARCH models and dynamic copulas: a simulation study By Klein, Ingo; Köck, Christian; Tinkl, Fabian
  11. Market Proxies, Correlation, and Relative Mean-Variance Efficiency: Still Living with the Roll Critique By Todd, Prono
  12. Substitution Patterns of the Random Coefficients Logit By Thomas J. Steenburgh; Andrew Ainslie
  13. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis By Martin Paldam; Laurent Callot
  14. Impact of Model Specification Decisions on Unit Root Tests By Atiq-ur-Rehman, Atiq-ur-Rehman; Zaman, Asad
  15. Constructing a quasilinear moving average using the scaling function By Schlüter, Stefan
  16. Non-Extensitivity versus informative moments for financial models: a unifying framework and empirical results By Herrmann, Klaus
  17. A quarterly fiscal database for the euro area based on intra-annual fiscal information By Joan Paredes; Diego J. Pedregal; Javier J. Pérez
  18. Welfare Rankings From Multivariate Data, A Non-Parametric Approach By Gordon Anderson; Ian Crawford; Andrew Leicester
  19. A tail quantile approximation formula for the student t and the symmetric generalized hyperbolic distribution By Schlüter, Stephan; Fischer, Matthias
  20. Class Size and the Regression Discontinuity Design: The Case of Public Schools By Cohen-Zada, Danny; Gradstein, Mark; Reuven, Ehud

  1. By: Alain Guay (Department of Economics, Universite du Quebec a Montreal); Jean-Francois Lamarche (Department of Economics, Brock University)
    Abstract: This paper examines structural change tests based on generalized empirical likelihood methods in the time series context. Standard structural change tests for the generalized method of moments are adapted to the generalized empirical likelihood context. We show that when moment conditions are properly smoothed, these test statistics converge to the same asymptotic distributions as in the generalized method of moments, in cases with both known and unknown breakpoints. We also suggest new structural change test statistics specific to generalized empirical likelihood methods of estimation. A simulation study examines the small-sample properties of the tests.
    Keywords: Generalized empirical likelihood, generalized method of moments, parameter instability, structural change
    JEL: C12 C32
    Date: 2009–12
  2. By: Todd, Prono
    Abstract: The diagonal GARCH(1,1) model is shown to support identification of the triangular system and is argued to be a higher-moment analog to traditional exclusion restrictions. Estimators for this result include QML and GMM. For the GMM estimator, only partial parameterization of the conditional covariance matrix is required. An alternative weighting matrix for the GMM estimator is also proposed.
    Keywords: Triangular Systems; Endogeneity; Identification; Heteroskedasticity; Quasi Maximum Likelihood; Generalized Method of Moments; GARCH; QML; GMM
    JEL: C13 C32
    Date: 2009–09
  3. By: Ardelean, Vlad
    Abstract: Maximum likelihood estimation (MLE) is the most widely used method to estimate the parameters of a GARCH(p,q) process. This is due to the fact that the MLE is, among other properties, asymptotically efficient. However, the MLE is sensitive to outliers, which can occur in time series. In order to abate the influence of outliers, robust estimators are introduced. A Monte Carlo study then compares the introduced estimators.
    Keywords: GARCH, Robust-Estimates, M-Estimates
    Date: 2009
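The sensitivity described in the abstract is easy to exhibit: the Gaussian quasi log-likelihood behind the (Q)MLE weights each observation by its squared return relative to the fitted conditional variance, so a single additive outlier blows up the objective at the true parameters and drags the optimizer away from them. A minimal stdlib sketch (the parameter values and injected outlier are illustrative, not taken from the paper):

```python
import math
import random

def garch11_qml_negloglik(omega, alpha, beta, returns):
    """Negative Gaussian quasi log-likelihood of a GARCH(1,1) process:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    sigma2 = omega / (1.0 - alpha - beta)  # start at unconditional variance
    nll = 0.0
    for r in returns:
        nll += 0.5 * (math.log(2 * math.pi) + math.log(sigma2) + r * r / sigma2)
        sigma2 = omega + alpha * r * r + beta * sigma2
    return nll

# Simulate a short GARCH(1,1) path and contaminate one observation.
random.seed(0)
omega, alpha, beta = 0.1, 0.1, 0.8
sigma2, clean = omega / (1 - alpha - beta), []
for _ in range(500):
    r = math.sqrt(sigma2) * random.gauss(0, 1)
    clean.append(r)
    sigma2 = omega + alpha * r * r + beta * sigma2

dirty = clean.copy()
dirty[250] += 10.0  # additive outlier

# The outlier inflates the objective at the true parameters, which is
# precisely what pulls the non-robust MLE away from them.
print(garch11_qml_negloglik(omega, alpha, beta, clean))
print(garch11_qml_negloglik(omega, alpha, beta, dirty))
```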
  4. By: Todd, Prono
    Abstract: Efficient GMM estimation of the semi-strong GARCH(1,1) model requires simultaneous estimation of the conditional third and fourth moments. This paper proposes a simple alternative to efficient GMM based upon the unconditional skewness of residuals and the autocovariances of squared residuals. An advantage of this simple alternative is that neither the third nor the fourth conditional moment needs to be estimated. A second advantage is that linear estimators apply to all of the parameters in the model, making estimation straightforward in practice. The proposed estimators are IV-like with potentially many instruments. Sequential estimation involves TSLS in a first step followed by linear GMM. Simultaneous estimation involves either two-step GMM or CUE. A Monte Carlo study of the proposed estimators is included.
    Keywords: GARCH; Time Series Heteroskedasticity; GMM; CUE; Many Moments; Conditional Moment Restrictions; Consistency; Robust Statistics
    JEL: C53 G12 C22
    Date: 2010–01
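The unconditional moments behind such an estimator are cheap to compute: for a GARCH(1,1), the squared process behaves like an ARMA(1,1), so its autocovariances satisfy gamma_k = (alpha + beta) * gamma_{k-1} for lags k >= 2, and a ratio of sample autocovariances already yields an IV-like estimate of the persistence. A rough illustration (not the paper's estimator; all parameter values are made up):

```python
import math
import random

def autocov(x, lag):
    """Sample autocovariance of a series at the given lag."""
    n = len(x)
    m = sum(x) / n
    return sum((x[t] - m) * (x[t - lag] - m) for t in range(lag, n)) / n

# Simulate a GARCH(1,1) path with persistence alpha + beta = 0.9.
random.seed(1)
omega, alpha, beta = 0.1, 0.1, 0.8
sigma2, r2 = omega / (1 - alpha - beta), []
for _ in range(20000):
    r = math.sqrt(sigma2) * random.gauss(0, 1)
    r2.append(r * r)
    sigma2 = omega + alpha * r * r + beta * sigma2

# gamma_k = (alpha + beta) * gamma_{k-1} for k >= 2, so this ratio
# estimates alpha + beta from unconditional moments alone -- no
# conditional third or fourth moment is needed.
persistence_hat = autocov(r2, 3) / autocov(r2, 2)
print(persistence_hat)
```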
  5. By: Gulasekaran Rajaguru (School of Business, Bond University, Australia); Tilak Abeysinghe (Department of Economics, National University of Singapore)
    Abstract: We use a mixed-frequency regression technique to develop a test for cointegration under the null of stationarity of the deviations from a long-run relationship. What is noteworthy about this MA unit root test, based on a variance-difference, is that, instead of having to deal with non-standard distributions, it takes the testing back to the normal distribution and offers a way to increase power without having to increase the sample size substantially. Monte Carlo simulations show minimal size distortions even when the AR root is close to unity and that the test offers substantial gains in power against near-null alternatives in moderate size samples. An empirical exercise illustrates the relative usefulness of the test further.
    Keywords: Null of stationarity, MA unit root, mixed-frequency regression, variance difference, normal distribution, power.
    JEL: C12 C22
    Date: 2009–12
  6. By: Jan P.A.M. Jacobs (University of Groningen, CAMA and CIRANO); Jenny E. Ligthart (International Studies Program. Andrew Young School of Policy Studies, Georgia State University); Hendrik Vrijburg (Erasmus University Rotterdam)
    Abstract: We extend the three-step generalized method of moments (GMM) approach of Kapoor, Kelejian, and Prucha (2007), which corrects for spatially correlated errors in static panel data models, by introducing a spatial lag and a one-period lag of the dependent variable as additional explanatory variables. Combining the extended Kapoor, Kelejian, and Prucha (2007) approach with the dynamic panel data model GMM estimators of Arellano and Bond (1991) and Blundell and Bond (1998) and supplementing the dynamic instruments by lagged and weighted exogenous variables as suggested by Kelejian and Robinson (1993) yields new spatial dynamic panel data estimators. The performance of these spatial dynamic panel data estimators is investigated by means of Monte Carlo simulations. We show that differences in bias as well as root mean squared error between spatial GMM estimates and corresponding GMM estimates in which spatial error correlation is ignored are small.
    Keywords: Dynamic panel models, spatial lag, spatial error, GMM estimation
    Date: 2009–12–01
  7. By: Edoardo Otranto
    Abstract: Forecasting volatility in a multivariate framework has received many contributions in the recent literature, but problems in estimation are still frequently encountered when dealing with a large set of time series. Dynamic Conditional Correlation (DCC) modeling is probably the most widely used approach; it has the advantage of separating the estimation of the volatility of each time series (with great flexibility, using single univariate models) from that of the correlation part (with the strong constraint of imposing the same dynamics on all the correlations). We propose a modification of the DCC model that provides different dynamics for each correlation, simply by hypothesizing a dependence on the volatility structure of each time series. This new model adds only two parameters with respect to the original DCC model. Its performance is evaluated in terms of out-of-sample forecasts against the DCC model and other multivariate GARCH models. The results on four data sets seem to favor the new model.
    Keywords: Dynamic conditional correlation; GARCH distance; Multivariate
    JEL: C32 C53 G10
    Date: 2009
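For reference, the baseline DCC(1,1) recursion of Engle (2002) that the paper modifies shares a single (a, b) pair across all correlations. A bivariate stdlib sketch of that baseline (the data and parameter values are illustrative):

```python
def dcc_correlations(z, a=0.05, b=0.90):
    """Baseline bivariate DCC(1,1) recursion (Engle 2002):

        Q_t = (1 - a - b) * S + a * z_{t-1} z_{t-1}' + b * Q_{t-1},

    where S is the unconditional second-moment matrix of the standardized
    residuals z and the conditional correlation is Q_t rescaled to unit
    diagonal. Note the single (a, b) pair shared by every correlation --
    the constraint the paper relaxes. Returns the correlation path."""
    n = len(z)
    s11 = sum(x * x for x, _ in z) / n
    s22 = sum(y * y for _, y in z) / n
    s12 = sum(x * y for x, y in z) / n
    q11, q22, q12 = s11, s22, s12          # initialize Q_0 at S
    rho = []
    for x, y in z:
        rho.append(q12 / (q11 * q22) ** 0.5)
        q11 = (1 - a - b) * s11 + a * x * x + b * q11
        q22 = (1 - a - b) * s22 + a * y * y + b * q22
        q12 = (1 - a - b) * s12 + a * x * y + b * q12
    return rho

corrs = dcc_correlations([(0.5, 0.4), (-1.2, -0.9), (0.3, -0.1), (2.0, 1.5)])
print(corrs)
```

Because each Q_t is a positive-semidefinite combination of positive-semidefinite matrices, the rescaled correlations stay in [-1, 1] by construction.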
  8. By: Federico Crudu
    Abstract: In this paper we extend the results of Kitamura (1997) for blockwise empirical likelihood (BEL) to the more general class of GEL estimators. The resulting BGEL estimator is proved to be consistent and asymptotically normal and attains the semiparametric lower bound. In addition, we define the BGEL versions of the classical trinity of tests: the Wald, Lagrange multiplier, and likelihood ratio tests. As expected, the resulting tests are chi-square distributed. We find via Monte Carlo experiments that the overidentification tests that stem from the BGEL estimator generally have better small-sample properties than the J test.
    JEL: C12 C14 C22
    Date: 2009
  9. By: Nikolaus Hautsch; Fuyu Yang
    Abstract: In this paper, we develop and apply Bayesian inference for an extended Nelson-Siegel (1987) term structure model capturing interest rate risk. The so-called Stochastic Volatility Nelson-Siegel (SVNS) model allows for stochastic volatility in the underlying yield factors. We propose a Markov chain Monte Carlo (MCMC) algorithm to efficiently estimate the SVNS model using simulation-based inference. Applying the SVNS model to monthly U.S. zero-coupon yields, we find significant evidence for time-varying volatility in the yield factors. This is particularly true for the level and slope volatilities, which also reveal the highest persistence. It turns out that the inclusion of stochastic volatility improves the model's goodness-of-fit and clearly reduces forecasting uncertainty, particularly in low-volatility periods. The proposed approach is shown to work efficiently and is easily adapted to alternative specifications of dynamic factor models revealing (multivariate) stochastic volatility.
    Keywords: term structure of interest rates, stochastic volatility, dynamic factor model, Markov chain Monte Carlo
    JEL: C5 C11 C32
    Date: 2010–01
  10. By: Klein, Ingo; Köck, Christian; Tinkl, Fabian
    Abstract: The serial dependency of multivariate financial data is often filtered by considering the residuals of univariate GARCH models adapted to every single series. This is the correct filtering strategy if the multivariate process follows a so-called copula-based multivariate dynamic (CMD) model. These multivariate dynamic models combine univariate GARCH models in a linear or nonlinear way. In these models the parameters of the marginal distributions (the univariate GARCH models) and the dependence parameters are separable, in the sense that they can be estimated in two or more steps: in the first step the parameters of the marginal distributions are estimated, and in the second step the parameter(s) of dependence. Several multivariate GARCH models, such as the CCC and the DCC model, belong to the class of CMD models. In contrast, the BEKK model, for example, does not belong to this class. If the BEKK model is correctly specified, the above-mentioned filtering strategy could fail from a theoretical point of view. Up to now, it is not known which dynamic copula is incorporated in a BEKK model. We show that if the distribution of the innovations (i.e. the residuals) of an MGARCH model is spherical, the conditional distribution of the whole MGARCH process belongs to the elliptical distribution family. Therefore, estimating the dependence of a BEKK model with copulas from the elliptical family should be an appropriate strategy to identify the dependence (i.e. correlation) between the univariate time series. Furthermore, we show that a diagonal BEKK model can be separated into its margins and a copula, but that this strategy falls short for full BEKK models.
    Date: 2009
  11. By: Todd, Prono
    Abstract: A test of the CAPM is developed conditional on a prior belief about the correlation between the true market return and the proxy return used in the test. Consideration is given to the effect of the proxy's mismeasurement of the market return on the estimation of the market model. Failure to grant this consideration biases tests towards rejection by overstating the inefficiency of the proxy. An extension of the proposed test to a CAPM with conditioning information links mismeasurement of the market return to time-variation in beta.
    Keywords: Asset pricing; CAPM; portfolio efficiency; multivariate testing; bootstrap hypothesis testing; triangular systems; endogeneity; identification; GMM; conditional heteroskedasticity; GARCH
    JEL: C32 G12
    Date: 2009–09
  12. By: Thomas J. Steenburgh (Harvard Business School, Marketing Unit); Andrew Ainslie (UCLA Anderson, School of Management)
    Abstract: Previous research suggests that the random coefficients logit is a highly flexible model that overcomes the problems of the homogeneous logit by allowing for differences in tastes across individuals. The purpose of this paper is to show that this is not true. We prove that the random coefficients logit imposes restrictions on individual choice behavior that limit the types of substitution patterns that can be found through empirical analysis, and we raise fundamental questions about when the model can be used to recover individuals' preferences from their observed choices. Part of the misunderstanding about the random coefficients logit can be attributed to the lack of cross-level inference in previous research. To overcome this deficiency, we design several Monte Carlo experiments to show what the model predicts at both the individual and the population levels. These experiments show that the random coefficients logit leads a researcher to very different conclusions about individuals' tastes depending on how alternatives are presented in the choice set. In turn, these biased parameter estimates affect counterfactual predictions. In one experiment, the market-share predictions for a given alternative in a given choice set range between 17% and 83% depending on how the alternatives are displayed, both in the data used for estimation and in the counterfactual scenario under consideration. This occurs even though the market shares observed in the data are always about 50% regardless of the display.
    Date: 2010–01
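The population quantities discussed in such experiments can be reproduced with a few lines of simulation: a random coefficients logit averages plain logit choice probabilities over draws of the taste coefficient. A minimal one-attribute sketch (the attribute values and taste distribution are illustrative, not the authors' design):

```python
import math
import random

def mixed_logit_shares(x_attrs, mu, sigma, ndraws=5000, seed=0):
    """Choice probabilities for a one-attribute random coefficients logit,
    beta_i ~ N(mu, sigma), utility u_ij = beta_i * x_j, computed by
    averaging plain logit probabilities over Monte Carlo draws of beta."""
    rng = random.Random(seed)
    shares = [0.0] * len(x_attrs)
    for _ in range(ndraws):
        beta = rng.gauss(mu, sigma)
        expu = [math.exp(beta * x) for x in x_attrs]
        tot = sum(expu)
        for j, e in enumerate(expu):
            shares[j] += e / (tot * ndraws)
    return shares

# Three alternatives differing in one attribute; tastes centered at 0.5,
# so the high-attribute alternative attracts the largest average share.
shares = mixed_logit_shares([1.0, 2.0, 3.0], mu=0.5, sigma=1.0)
print(shares)
```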
  13. By: Martin Paldam (School of Economics and Management, University of Aarhus, Denmark); Laurent Callot (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, of which the PET is an estimate. The FAT-PET MRA is a meta regression analysis, on the data of the funnel, which jointly estimates the FAT and the PET. Ideal funnels are lean and symmetric. Empirical funnels are wide, and most have asymmetries biasing the plain average. Many asymmetries are due to censoring made during the research-publication process. The PET is tooled to correct the average for censoring. We show that estimation faults and misspecification may cause natural asymmetries, which the PET does not correct. If the MRA includes controls for omitted variables, the PET does correct for omitted variables bias. Thus, it is important to know the reason for an asymmetry.
    Keywords: Meta-analysis, funnel asymmetry, meta average
    JEL: B4 C9
    Date: 2010–01–14
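The FAT-PET MRA itself is a small regression: each estimate is regressed on its own standard error, weighted by precision, so the intercept (PET) is the corrected meta average and the slope is the FAT. A sketch of that baseline regression (the toy data are illustrative; real applications add controls and robust inference):

```python
def fat_pet(estimates, ses):
    """FAT-PET meta-regression: estimate_i = PET + FAT * se_i + error_i,
    fitted by weighted least squares with precision weights 1 / se_i^2.
    Returns (PET, FAT): the corrected meta average and the asymmetry slope."""
    w = [1.0 / s ** 2 for s in ses]
    sw = sum(w)
    sx = sum(wi * s for wi, s in zip(w, ses))
    sy = sum(wi * e for wi, e in zip(w, estimates))
    sxx = sum(wi * s * s for wi, s in zip(w, ses))
    sxy = sum(wi * s * e for wi, s, e in zip(w, ses, estimates))
    fat = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    pet = (sy - fat * sx) / sw
    return pet, fat

# On noiseless data with true effect 0.5 and censoring slope 2.0 the
# regression recovers both exactly.
ses = [0.1, 0.2, 0.3, 0.4]
ests = [0.5 + 2.0 * s for s in ses]
pet, fat = fat_pet(ests, ses)
print(pet, fat)
```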
  14. By: Atiq-ur-Rehman, Atiq-ur-Rehman; Zaman, Asad
    Abstract: The performance of unit root tests depends on several specification decisions made prior to their application, e.g., whether or not to include a deterministic trend. Since there is no standard procedure for making such decisions, practitioners routinely make several arbitrary specification decisions. In Monte Carlo studies, the design of the DGP supports these decisions, but for real data, such specification decisions are often unjustifiable and sometimes incompatible with the data. We argue that the problems posed by the choice of initial specification are quite complex and that the existing voluminous literature on this issue treats only certain superficial aspects of this choice. We also show how these initial specifications affect the performance of unit root tests and argue that Monte Carlo studies should include these preliminary decisions to arrive at a better yardstick for evaluating such tests.
    Keywords: model specification; trend stationary; difference stationary
    JEL: C15 C22 C01
    Date: 2009
  15. By: Schlüter, Stefan
    Abstract: The scaling function from multiresolution analysis can be used to construct a smoothing tool in the context of time series analysis. We give a time series smoothing function for which we show the properties of a quasilinear moving average. Furthermore, we discuss its features and in particular derive the distributional properties of our quasilinear moving average given some simple underlying stochastic processes. Finally, we compare it to existing smoothing methods in order to motivate its application.
    Keywords: Scaling function, Quasilinear moving average, Influence function
    Date: 2009
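To fix ideas, a moving average built from the simplest scaling function, the Haar indicator on [0, 1), collapses to the ordinary simple moving average; smoother scaling functions (e.g. Daubechies) yield overlapping, non-uniform weights. A minimal sketch of the Haar special case (not the paper's general construction):

```python
def haar_smooth(x, window):
    """Moving average built from the Haar scaling function phi = 1 on [0, 1):
    the weights are phi sampled on an equispaced grid and normalized to sum
    to one, so for Haar this is exactly the simple moving average."""
    weights = [1.0] * window          # phi(k / window) = 1 for k = 0..window-1
    total = sum(weights)
    weights = [w / total for w in weights]
    return [
        sum(w * xi for w, xi in zip(weights, x[t:t + window]))
        for t in range(len(x) - window + 1)
    ]

print(haar_smooth([1.0, 2.0, 3.0, 4.0], 2))  # → [1.5, 2.5, 3.5]
```

Swapping in another scaling function only changes the `weights` line, which is the sense in which the construction generalizes the moving average.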
  16. By: Herrmann, Klaus
    Abstract: Information-theoretic approaches still play a minor role in financial market analysis. Nonetheless, two very similar approaches have evolved in recent years, one in so-called econophysics and the other in econometrics. Both generalize the notion of GARCH processes in an information-theoretic sense and are able to capture skewness and kurtosis better than traditional models. In this article we present both approaches in a more general framework and compare their performance on some illustrative data sets.
    Keywords: Entropy density, Skewness, Kurtosis, GARCH
    JEL: C22
    Date: 2009
  17. By: Joan Paredes (European Central Bank); Diego J. Pedregal (Universidad de Castilla-La Mancha); Javier J. Pérez (Banco de España)
    Abstract: The analysis of the macroeconomic impact of fiscal policies in the euro area has been traditionally limited by the absence of quarterly fiscal data. To overcome this problem, we provide two new databases in this paper. Firstly, we construct a quarterly database of euro area fiscal variables for the period 1980-2008 for a quite disaggregated set of fiscal variables; secondly, we present a real-time fiscal database for a subset of fiscal variables, composed of bi-annual vintages of data for the euro area (2000-2009). All models are multivariate, state-space mixed-frequencies models estimated with available national accounts fiscal data (mostly annual) and, more importantly, monthly and quarterly information taken from the cash accounts of the governments. We provide both non-seasonally-adjusted and seasonally adjusted data. Focusing solely on intra-annual fiscal information for interpolation purposes allows us to capture genuine intra-annual "fiscal" dynamics in the data. Thus, we provide fiscal data that avoid some problems likely to appear in studies using fiscal time series interpolated on the basis of general macroeconomic indicators, namely the well-known decoupling of tax collection from the evolution of standard macroeconomic tax bases (revenue windfalls/shortfalls).
    Keywords: Euro area, Fiscal policies, Interpolation, Unobserved Components models, Mixed-frequencies
    JEL: C53 E6 H6
    Date: 2009–12
  18. By: Gordon Anderson; Ian Crawford; Andrew Leicester
    Abstract: Economic and social welfare is inherently multidimensional. However, choosing a measure which combines several indicators is difficult and may have unintended and undesirable effects on the incentives for policymakers. We develop a nonparametric empirical method for deriving welfare rankings based on data envelopment which avoids the need to specify a weighting scheme. The results are valid for all possible social welfare functions which share certain canonical properties. We apply this method to data on human development.
    Keywords: Welfare Rankings, Data Envelopment, Human development
    JEL: I3
    Date: 2010–01–14
  19. By: Schlüter, Stephan; Fischer, Matthias
    Abstract: Calculating a large number of tail probabilities or tail quantiles for a given distribution family becomes very challenging if both the cumulative and the inverse distribution function are not available in closed form. In the case of the Gaussian and Student t distributions, quantile approximations are already available. This is not the case for the (symmetric) generalized hyperbolic distribution (GHD), whose popularity is steadily increasing and which includes both the Gaussian and Student t as limiting cases. Within this paper we close this gap and derive one possible tail approximation formula for the GHD as well as for the Student t distribution.
    Keywords: Generalized hyperbolic distribution, Quantile approximation, Student t distribution
    Date: 2009
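The flavor of such a formula can be seen in the Student t special case: the density has a power-law tail, so inverting the first-order tail probability gives a closed-form quantile approximation. The sketch below uses the generic power-law expansion, not the formula derived in the paper:

```python
import math

def t_tail_quantile(p, nu):
    """First-order upper-tail quantile for the Student t distribution.

    The density behaves like c * x^(-nu-1) in the tail, with
    c = Gamma((nu+1)/2) / (sqrt(nu*pi) * Gamma(nu/2)) * nu^((nu+1)/2),
    so P(T > x) ~ c * x^(-nu) / nu, and inverting this gives the quantile.
    This is the generic power-law expansion, not the paper's formula."""
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    c *= nu ** ((nu + 1) / 2)
    return (c / (nu * p)) ** (1.0 / nu)

# For nu = 4 this reduces to q(p) ~ (3 / p)^(1/4); the exact 0.1%
# upper-tail quantile of t(4) is about 7.173, so the approximation
# overshoots by only a few percent, and the error shrinks deeper in the tail.
q = t_tail_quantile(0.001, 4)
print(q)
```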
  20. By: Cohen-Zada, Danny (Ben Gurion University); Gradstein, Mark (Ben Gurion University); Reuven, Ehud (Ben Gurion University)
    Abstract: Using a rich individual-level dataset on secondary public schools in Israel, we find strong evidence for discontinuities in the relationship between enrollment and household characteristics at cutoff points induced by a maximum class size rule. Our findings extend existing work that documents such discontinuities only among private schools (Urquiola and Verhoogen, 2009). These discontinuities violate the assumptions underlying the regression discontinuity design, which are crucial for identification. Consequently, IV estimates of class size effects are likely to be seriously biased. Potential manipulation of the treatment assignment rule by public schools warrants caution in applying a regression discontinuity design to estimate class size effects and indicates that institutional context is crucial for its scope of applicability.
    Keywords: regression discontinuity design, class size
    JEL: I20
    Date: 2009–12
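The assignment rule at issue is of the Angrist-Lavy (Maimonides' rule) type: predicted class size drops discontinuously whenever enrollment crosses a multiple of the cap, and these jumps serve as the instrument in the regression discontinuity design. A sketch with an illustrative cap of 40:

```python
import math

def predicted_class_size(enrollment, cap=40):
    """Maimonides-type rule: a school must split classes once enrollment
    exceeds a multiple of the cap, so the predicted class size is
    enrollment divided by the implied number of classes."""
    return enrollment / math.ceil(enrollment / cap)

# The instrument jumps at 41, 81, 121, ... students:
print(predicted_class_size(40))   # one class of 40
print(predicted_class_size(41))   # two classes of 20.5 on average
```

The paper's point is that if schools can manipulate enrollment around these cutoffs, household characteristics also jump there, invalidating the design.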

This nep-ecm issue is ©2010 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.