nep-ecm New Economics Papers
on Econometrics
Issue of 2009‒04‒25
seventeen papers chosen by
Sune Karlsson
Orebro University

  1. Generalized Methods of Trimmed Moments By Cizek, P.
  2. Exact Maximum Likelihood estimation for the BL-GARCH model under elliptical distributed innovations By Abdou Kâ Diongue; Dominique Guegan; Rodney C. Wolff
  3. Wavelet Method for Locally Stationary Seasonal Long Memory Processes By Dominique Guegan; Zhiping Lu
  4. Optimal Rank-Based Testing for Principal Components By Marc Hallin; Davy Paindaveine; Thomas Verdebout
  5. Do Local Projections Solve the Bias Problem in Impulse Response Inference? By Kilian, Lutz; Kim, Yun Jung
  6. Volatility and realized quadratic variation of differenced returns: A wavelet method approach By Høg, Esben
  7. Spectral estimation of the fractional order of a Lévy process By Denis Belomestny
  8. p-Value Adjustments for Asymptotic Control of the Generalized Familywise Error Rate By Christopher J. Bennett
  9. A State Space Approach to Estimating the Integrated Variance and Microstructure Noise Component By Daisuke Nagakura; Toshiaki Watanabe
  10. More Reliable Inference for Segregation Indices By Rebecca Allen; Simon Burgess; Frank Windmeijer
  11. Change analysis of dynamic copula for measuring dependence in multivariate financial data By Dominique Guegan; Jing Zhang
  12. Breaks or Long Memory Behaviour: An empirical Investigation By Lanouar Charfeddine; Dominique Guegan
  13. Technology shocks and aggregate fluctuations in an estimated hybrid RBC model By Jim Malley; Ulrich Woitek
  14. A trivariate non-Gaussian copula having 2-dimensional Gaussian copulas as margins By Stéphane Loisel
  15. Multi-Factor Gegenbauer Processes and European Inflation Rates By Guglielmo Maria Caporale; Luis A. Gil-Alana
  16. Chaos in Economics and Finance By Dominique Guegan
  17. Predicting Betas: Two new methods. By Mª Victoria Esteban González; Fernando Tusell Palmer

  1. By: Cizek, P. (Tilburg University, Center for Economic Research)
    Abstract: High breakdown-point regression estimators protect against large errors and data contamination. We adapt and generalize the concept of trimming used by many of these robust estimators so that it can be employed in the context of the generalized method of moments. The proposed generalized method of trimmed moments (GMTM) offers a globally robust estimation approach (in contrast to existing estimators, which are only locally robust) applicable in econometric models identified and estimated using moment conditions. We derive the consistency and asymptotic distribution of GMTM in a general setting, propose a robust test of overidentifying conditions, and demonstrate the application of GMTM in instrumental variable regression. We also compare the finite-sample performance of GMTM and existing estimators by means of Monte Carlo simulation.
    Keywords: asymptotic normality;generalized method of moments;instrumental variables regression;robust estimation;trimming
    JEL: C13 C20 C30 C12
    Date: 2009
  2. By: Abdou Kâ Diongue (UFR SAT - Université Gaston Berger - Université Gaston Berger de Saint-Louis); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Rodney C. Wolff (School of Mathematical Sciences - Queensland University of Technology)
    Abstract: In this paper, we discuss the class of Bilinear GARCH (BL-GARCH) models, which are capable of capturing simultaneously two key properties of non-linear time series: volatility clustering and leverage effects. It has often been observed that the marginal distributions of such time series have heavy tails; thus we examine the BL-GARCH model in a general setting under some non-Normal distributions. We investigate some probabilistic properties of this model and we propose and implement a maximum likelihood estimation (MLE) methodology. To evaluate the small-sample performance of this method for the various models, a Monte Carlo study is conducted. Finally, within-sample estimation properties are studied using S&P 500 daily returns, where the features of interest manifest as volatility clustering and leverage effects.
    Keywords: BL-GARCH process - elliptical distribution - leverage effects - Maximum Likelihood - Monte Carlo method - volatility clustering
    Date: 2009
  3. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Zhiping Lu (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, ECNU - East China Normal University)
    Abstract: Long memory processes have been extensively studied over the past decades. When dealing with financial and economic data, seasonality and time-varying long-range dependence can often be observed, and thus some kind of non-stationarity may exist inside financial data sets. To account for this kind of phenomenon, we propose a new class of stochastic process: the locally stationary k-factor Gegenbauer process. We describe a procedure for consistently estimating the time-varying parameters by applying the discrete wavelet packet transform (DWPT). The robustness of the algorithm is investigated through a simulation study. An application based on the error correction term of a fractional cointegration analysis of the Nikkei Stock Average 225 index is proposed.
    Keywords: Discrete wavelet packet transform ; Gegenbauer process ; Nikkei Stock Average 225 index ; non-stationarity ; ordinary least square estimation
    Date: 2009–03
  4. By: Marc Hallin; Davy Paindaveine; Thomas Verdebout
    Abstract: This paper provides parametric and rank-based optimal tests for eigenvectors and eigenvalues of covariance or scatter matrices in elliptical families. The parametric tests extend the Gaussian likelihood ratio tests of Anderson (1963) and their pseudo-Gaussian robustifications by Tyler (1981, 1983) and Davis (1977), with which their Gaussian versions are shown to coincide, asymptotically, under Gaussian or finite fourth-order moment assumptions, respectively. Such assumptions, however, restrict the scope to covariance-based principal component analysis. The rank-based tests we are proposing remain valid without such assumptions. Hence, they address a much broader class of problems, where covariance matrices need not exist and principal components are associated with more general scatter matrices. Asymptotic relative efficiencies moreover show that those rank-based tests are quite powerful; when based on van der Waerden or normal scores, they even uniformly dominate the pseudo-Gaussian versions of Anderson’s procedures. The tests we are proposing thus outperform daily practice both in terms of validity and in terms of efficiency. The main methodological tool throughout is Le Cam’s theory of locally asymptotically normal experiments, in the nonstandard context, however, of a curved parametrization. The results we derive for curved experiments are of independent interest, and likely to apply in other setups.
    Keywords: Panel data, temporal aggregation, model specification, efficiency.
    JEL: C23 C51 C52
    Date: 2009
  5. By: Kilian, Lutz; Kim, Yun Jung
    Abstract: It is well documented that the small-sample accuracy of asymptotic and bootstrap approximations to the pointwise distribution of VAR impulse response estimators is undermined by the estimator’s bias. A natural conjecture is that impulse response estimators based on the local projection (LP) method of Jordà (2005, 2007) are less susceptible to this problem and hence potentially more reliable in small samples than VAR-based estimators. We show that - contrary to this conjecture - LP estimators tend to have both higher bias and higher variance, resulting in pointwise impulse response confidence intervals that are typically less accurate and wider on average than suitably constructed VAR-based intervals. Bootstrapping the LP estimator only worsens its finite-sample accuracy. We also evaluate recently proposed joint asymptotic intervals for VAR and LP impulse response functions. Our analysis suggests that the accuracy of joint intervals can be erratic in practice, and neither joint interval is uniformly preferred over the other.
    Keywords: Bias; Confidence interval; Impulse response function; Joint interval; Local projection; Vector autoregression
    JEL: C32 C52 C53
    Date: 2009–04
  6. By: Høg, Esben (Department of Business Studies, Aarhus School of Business)
    Abstract: This paper analyzes some asymptotic results for an alternative estimator of integrated volatility in a continuous-time diffusion process of high frequency data (used in asset pricing finance). The estimator, which is computationally efficient, is based on the quadratic variation of second order log-price differences, in contrast to the well known realized quadratic variation of intra-daily returns (which is based on first order log-price differences), known as realized volatility. Analytically, the asymptotics of the proposed estimator are compared to those of the usual realized volatility estimators. Lastly, we provide some simulation experiments to illustrate the results.
    Keywords: continuous-time methods; quadratic variation; realized volatility; second order quadratic variation
    Date: 2008–08–01
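    The contrast between the two estimators in this abstract can be sketched numerically. The snippet below is an illustration, not the paper's implementation: the standard realized variance built from first-order log-price differences alongside a second-order-difference analogue, with an assumed factor of 1/2 so the two agree for i.i.d. returns (the paper's exact normalisation may differ).

```python
import numpy as np

def realized_volatility(log_prices):
    # Realized variance: sum of squared first-order log-price differences.
    returns = np.diff(log_prices)
    return np.sum(returns ** 2)

def second_order_rv(log_prices):
    # Quadratic variation of second-order differences p_t - 2 p_{t-1} + p_{t-2}.
    # Under i.i.d. returns each second difference has twice the return variance,
    # so we divide by 2 to put the estimate on the same scale (assumed scaling,
    # for illustration only).
    d2 = np.diff(log_prices, n=2)
    return np.sum(d2 ** 2) / 2.0
```

    On a simulated log-price path with i.i.d. Gaussian returns, both estimators recover the cumulative return variance.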
  7. By: Denis Belomestny
    Abstract: We consider the problem of estimating the fractional order of a Lévy process from low frequency historical and options data. An estimation methodology is developed which allows us to treat both estimation and calibration problems in a unified way. The corresponding procedure consists of two steps: the estimation of a conditional characteristic function and the weighted least squares estimation of the fractional order in the spectral domain. While the second step is identical for both calibration and estimation, the first one depends on the problem at hand. Minimax rates of convergence for the fractional order estimate are derived, asymptotic normality is proved, and a data-driven algorithm based on aggregation is proposed. The performance of the estimator in both estimation and calibration setups is illustrated by a simulation study.
    Keywords: regular Lévy processes, Blumenthal-Getoor index, semiparametric estimation
    JEL: C12 C13
    Date: 2009–04
  8. By: Christopher J. Bennett (Department of Economics, Vanderbilt University)
    Abstract: This paper introduces a computationally efficient bootstrap procedure for obtaining multiplicity-adjusted p-values in situations where multiple hypotheses are tested simultaneously. This new testing procedure accounts for the mutual dependence of the individual statistics, and is shown under weak conditions to maintain asymptotic control of the generalized familywise error rate. Moreover, the estimated critical values (p-values) obtained via our procedure are less sensitive to the inclusion of true hypotheses and, as a result, our test has greater power to identify false hypotheses even as the collection of hypotheses under test increases in size. Another attractive feature of our test is that it leads naturally to balance among the individual hypotheses under test. This feature is especially attractive in settings where balance is desired but alternative approaches, such as those based on studentization, are difficult or infeasible.
    Keywords: Bootstrap, familywise error, multiple testing, step-down, balanced testing
    JEL: C12 C14 C52
    Date: 2009–04
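    The single-step bootstrap idea behind generalized familywise error control can be sketched as follows. This is a simplified Romano-Wolf-style k-max adjustment, not Bennett's step-down procedure; the function name and interface are assumptions for illustration.

```python
import numpy as np

def kfwe_adjusted_pvalues(t_obs, t_boot, k=1):
    # Single-step bootstrap adjustment aiming at control of the k-FWE
    # (the probability of k or more false rejections).
    #   t_obs:  (m,)   observed test statistics
    #   t_boot: (B, m) statistics recomputed on bootstrap samples under the null
    # With k=1 this reduces to the classical Westfall-Young max-T adjustment.
    t_boot = np.abs(np.asarray(t_boot, dtype=float))
    t_obs = np.abs(np.asarray(t_obs, dtype=float))
    # k-th largest statistic within each bootstrap replication
    kmax = np.sort(t_boot, axis=1)[:, -k]
    # adjusted p-value: share of replications whose k-max exceeds |t_j|
    return np.mean(kmax[:, None] >= t_obs[None, :], axis=0)
```

    Using the k-th largest bootstrap statistic instead of the maximum is what relaxes familywise control to k-FWE: the adjusted p-values for k = 2 are never larger than those for k = 1, which is the power gain the abstract alludes to.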
  9. By: Daisuke Nagakura; Toshiaki Watanabe
    Abstract: We call the realized variance (RV) calculated with observed prices contaminated by microstructure noise (MN) the noise-contaminated RV (NCRV), and refer to the component of the NCRV associated with the MN as the MN component. This paper develops a state space method for estimating the integrated variance (IV) and the MN component simultaneously. We represent the NCRV in state space form and show that the state space parameters are not identifiable; however, they can be expressed as functions of fewer identifiable parameters. We illustrate how to estimate these parameters. The proposed method is applied to yen/dollar exchange rate data.
    Keywords: Realized Variance, Integrated Variance, Microstructure Noise, State Space, Identification, Exchange Rate
    Date: 2009–03
  10. By: Rebecca Allen; Simon Burgess; Frank Windmeijer
    Abstract: The most widely used measure of segregation is the dissimilarity index, D. It is now well understood that this measure also reflects randomness in the allocation of individuals to units; that is, it measures deviations from evenness not deviations from randomness. This leads to potentially large values of the segregation index when unit sizes and/or minority proportions are small, even if there is no underlying systematic segregation. Our response to this is to produce an adjustment to the index, based on an underlying statistical model. We specify the assignment problem in a very general way, with differences in conditional assignment probabilities underlying the resulting segregation. From this we derive a likelihood ratio test for the presence of any systematic segregation and a bootstrap bias adjustment to the dissimilarity index. We further develop the asymptotic distribution theory for testing hypotheses concerning the magnitude of the segregation index and show that use of bootstrap methods can improve the size and power properties of test procedures considerably. We illustrate these methods by comparing dissimilarity indices across school districts in England to measure social segregation.
    Keywords: segregation, dissimilarity index, bootstrap methods, hypothesis testing
    JEL: C12 C13 C15 C46 I21
    Date: 2009–04
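    The dissimilarity index D and the randomness problem the authors address are easy to illustrate. In this sketch (function names and interface assumed, not from the paper) D is computed directly, and the expected D under purely random binomial allocation shows how a positive index arises with no systematic segregation at all; subtracting that expectation is a crude version of a bias adjustment, whereas the paper's bootstrap adjustment is more careful.

```python
import numpy as np

def dissimilarity_index(minority, majority):
    # D = 0.5 * sum_i | m_i / M - n_i / N |, where m_i (n_i) is the minority
    # (majority) count in unit i and M (N) the corresponding totals.
    m = np.asarray(minority, dtype=float)
    n = np.asarray(majority, dtype=float)
    return 0.5 * np.sum(np.abs(m / m.sum() - n / n.sum()))

def randomness_bias(unit_sizes, minority_share, reps=2000, seed=0):
    # Expected D when individuals are allocated to units at random
    # (binomial allocation with a common minority share): deviations from
    # evenness occur by chance alone, so the expected D is positive.
    rng = np.random.default_rng(seed)
    sizes = np.asarray(unit_sizes)
    ds = []
    for _ in range(reps):
        m = rng.binomial(sizes, minority_share)
        ds.append(dissimilarity_index(m, sizes - m))
    return float(np.mean(ds))
```

    Complete segregation gives D = 1, perfect evenness gives D = 0, and random allocation across small units gives a strictly positive D, which is exactly the deviation-from-randomness issue the abstract describes.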
  11. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Jing Zhang (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, ECNU - East China Normal University)
    Abstract: This paper proposes a new approach to measuring dependence in multivariate financial data. Data in finance and insurance often cover a long time period; therefore, economic factors may induce changes inside the dependence structure. Recently, two methods using copulas have been proposed to analyze such changes. The first approach investigates changes in the copula's parameters. The second tests for changes of copula by determining the best-fitting copula over moving windows. In this paper we take into account the non-stationarity of the data and analyze: (1) changes of parameters while the copula family remains static; (2) changes of copula family. We propose a series of tests based on conditional copulas and goodness-of-fit (GOF) tests to decide the type of change, and further give the corresponding change analysis. We illustrate our approach with the Standard & Poor's 500 and Nasdaq indices, and provide dynamic risk measures.
    Keywords: Dynamic copula - goodness-of-fit test - change-point - time-varying parameter - VaR - ES
    Date: 2009
  12. By: Lanouar Charfeddine (OEP - Université de Marne-la-Vallée); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: Are structural break models true switching models or long memory processes? The answer to this question remains ambiguous, and many papers in recent years have dealt with the problem. For instance, Diebold and Inoue (2001) and Granger and Hyung (2004) show, under specific conditions, that switching models and long memory processes can easily be confused. In this paper, using several generating models, such as the mean-plus-noise model, the STOchastic Permanent BREAK model, the Markov switching model, the TAR model, the sign model and the Structural CHange (SCH) model, and several estimation techniques, such as the GPH technique, the Exact Local Whittle (ELW) estimator and wavelet methods, we show that, while the answer is quite simple in some cases, it is less clear-cut in others. Using French and American inflation rates, we show that these series cannot be characterized by the same class of models. The main result of this study suggests that estimating the long memory parameter without taking into account the existence of breaks in the data may lead to misspecification and to overestimation of the true parameter.
    Keywords: Structural breaks models, spurious long memory behavior, inflation series.
    Date: 2009–04
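    One of the estimation techniques compared in this paper, the GPH log-periodogram regression, is simple enough to sketch. This is a textbook version with an assumed default bandwidth m = sqrt(n), not the authors' code: the slope of log I(lambda_j) on -log(4 sin^2(lambda_j/2)) over the first m Fourier frequencies estimates the long memory parameter d.

```python
import numpy as np

def gph_estimate(x, m=None):
    # Geweke-Porter-Hudak estimator of the long memory parameter d:
    # regress the log-periodogram on -log(4 sin^2(lambda_j / 2)) over the
    # first m Fourier frequencies. m = sqrt(n) is a common (assumed) default.
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(np.sqrt(n))
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    # periodogram I(lambda_j) at the first m Fourier frequencies
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    periodogram = np.abs(dft) ** 2 / (2.0 * np.pi * n)
    regressor = -np.log(4.0 * np.sin(freqs / 2.0) ** 2)
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
    return slope
```

    On white noise the estimate is near d = 0, while on a simulated ARFIMA(0, d, 0) series (built from the fractional-differencing weights) it is near the true d, which is the benchmark behaviour against which spurious long memory from breaks is judged.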
  13. By: Jim Malley; Ulrich Woitek
    Abstract: This paper contributes to the on-going empirical debate regarding the role of the RBC model and in particular of technology shocks in explaining aggregate fluctuations. To this end we estimate the model’s posterior density using Markov-Chain Monte-Carlo (MCMC) methods. Within this framework we extend Ireland’s (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model’s errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model’s fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
    Keywords: Real Business Cycle, Bayesian estimation, VARMA errors
    JEL: C11 C52 E32
    Date: 2009–04
  14. By: Stéphane Loisel (SAF - EA2429 - Laboratoire de Science Actuarielle et Financière - Université Claude Bernard - Lyon I)
    Abstract: Arthur Charpentier (see Arthur's blog) was recently contacted by some researchers wishing to test whether a multivariate copula is Gaussian. They use a test proposed in Malevergne and Sornette (2003) stating that one should simply test for pairwise normality. Such a test may be of importance in finance, in actuarial science, and in risk management in general: for example, given 120 financial assets, in order to test whether or not some 120-dimensional random vector of interest admits a Gaussian copula, can one restrict the Gaussian copula hypothesis test to pairs of assets? This short note proves that one cannot, and provides a simple counter-example based on a multivariate EFGM copula. This confirms the intuition that considering all pairs of the studied random variables is not enough, and that one cannot avoid studying the full vector to test whether it admits a Gaussian copula. An earlier counter-example, discovered after this note was written, is also mentioned.
    Keywords: Gaussian copula; trivariate copulas with fixed bivariate copulas; pairwise and global normality
    Date: 2009–04–16
  15. By: Guglielmo Maria Caporale; Luis A. Gil-Alana
    Abstract: In this paper we specify a multi-factor long-memory process that enables us to estimate the fractional differencing parameters at each frequency separately, and adopt this framework to model quarterly prices in three European countries (France, Italy and the UK). The empirical results suggest that inflation in France and Italy is nonstationary. However, while for the former country this applies both to the zero and the seasonal frequencies, in the case of Italy the nonstationarity comes exclusively from the long-run or zero frequency. In the UK, inflation seems to be stationary with a component of long memory at both the zero and the semi-annual frequencies, especially at the former.
    Keywords: Fractional Integration, Long Memory, Inflation
    JEL: C22 O40
    Date: 2009
  16. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: This paper focuses on the use of dynamical chaotic systems in Economics and Finance. In these fields, researchers employ methods different from those used by mathematicians and physicists; we discuss this point. We then present innovative statistical tools and problems that can be useful in practice for detecting chaotic behavior inside real data sets.
    Keywords: Chaos ; Deterministic dynamical system ; Economics ; Estimation theory ; Finance ; Forecasting
    Date: 2009
  17. By: Mª Victoria Esteban González (Facultad de CC. EE. y Empresariales, UPV/EHU); Fernando Tusell Palmer (Facultad de CC. EE. y Empresariales, UPV/EHU)
    Abstract: Betas play a central role in modern finance. The estimation of betas from historical data and their extrapolation into the future is of considerable practical interest. We propose two new methods: the first is a direct generalization of the method in Blume (1975), and the second is based on Procrustes rotation in phase space. We compare their performance with various competitors and draw some conclusions.
    Keywords: risk prediction, systematic risk, beta coefficients, Procrustes rotation
    JEL: G11 G12
    Date: 2009–04–21
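    Blume's (1975) original adjustment, which the paper's first method generalizes, regresses one period's estimated betas cross-sectionally on the previous period's and uses the fitted line to extrapolate current betas forward. A minimal sketch (function name and interface assumed, not the authors' generalization):

```python
import numpy as np

def blume_adjust(betas_prev, betas_curr):
    # Fit betas_curr = a + b * betas_prev across stocks, then apply the
    # fitted line to the current betas to predict next-period betas.
    # In Blume's data this shrinks extreme betas toward the cross-sectional
    # mean (roughly toward 1).
    prev = np.asarray(betas_prev, dtype=float)
    curr = np.asarray(betas_curr, dtype=float)
    b, a = np.polyfit(prev, curr, 1)
    return a + b * curr
```

    When betas mean-revert between periods, the fitted slope is below one, so the extrapolated betas sit closer to 1 than the raw estimates.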

This nep-ecm issue is ©2009 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.