nep-ecm New Economics Papers
on Econometrics
Issue of 2007‒02‒17
nineteen papers chosen by
Sune Karlsson
Orebro University

  1. General trimmed estimation: robust approach to nonlinear and limited dependent variable models By Cizek,Pavel
  2. Semiparametric estimation of the dependence parameter of the error terms in multivariate regression By Gunky Kim; Mervyn J. Silvapulle; Paramsothy Silvapulle
  3. Estimation of a TIMA model with contemporaneous asymmetry by indirect inference By Catherine Bruneau; Amine Lahiani
  4. Some new bivariate IG and NIG-distributions for modelling covariate financial returns By Lillestøl, Jostein
  5. Evaluating Forecasts from Factor Models for Canadian GDP Growth and Core Inflation By Calista Cheung; Frédérick Demers
  6. The empirical process of autoregressive residuals By Bent Nielsen; Eric Engler
  7. Bootstrapping long memory tests: some Monte Carlo results By Anthony Murphy; Marwan Izzeldin
  8. Simulation experiments in practice : statistical design and regression analysis By Kleijnen,Jack P.C.
  9. Nonlinear Filtering for Stochastic Volatility Models with Heavy Tails and Leverage By Adam Clements; Scott White
  10. Inflation persistence in the euro-area, US, and new members of the EU: Evidence from time-varying coefficient models By Zsolt Darvas; Balázs Varga
  11. Testing I(1) against I(d) alternatives with Wald Tests in the presence of deterministic components By Juan José Dolado; Jesús Gonzalo; Laura Mayoral
  12. Convergence to Stochastic Integrals with Non-linear integrands By Bent Nielsen; Carlos Caceres
  13. The Instrumental Weighted Variables. Part I. Consistency By Jan Ámos Víšek
  14. The Instrumental Weighted Variables. Part II. Square root of n-consistency By Jan Ámos Víšek
  15. The Instrumental Weighted Variables. Part III. Asymptotic Representation By Jan Ámos Víšek
  16. Testing the Power of Leading Indicators to Predict Business Cycle Phase Changes By Allan Layton; Daniel R. Smith
  17. An Economic Analysis of Exclusion Restrictions for Instrumental Variable Estimation By Gerard J. van den Berg
  18. Simulation of Gegenbauer processes using wavelet packets By Collet J.J.; Fadili J.M.
  19. Non-linear filtering with state dependent transition probabilities: A threshold (size effect) SV model By Adam Clements; Scott White

  1. By: Cizek,Pavel (Tilburg University, Center for Economic Research)
    Abstract: High breakdown-point regression estimators protect against large errors and data contamination. Motivated by some of these, namely the least trimmed squares and maximum trimmed likelihood estimators, we propose a general trimmed estimator, which unifies and extends many existing robust procedures. We derive here the consistency and asymptotic distribution of the proposed general trimmed estimator under mild β-mixing conditions and demonstrate its applicability in nonlinear regression, time series, and limited dependent variable models.
    Keywords: asymptotic normality; regression; robust estimation; trimming
    JEL: C13 C20 C24 C25
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:20071&r=ecm
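    Sketch: the least trimmed squares criterion that motivates the paper, in minimal Python (illustrative only, not the authors' general trimmed estimator; the simulated data, trimming fraction and optimizer are assumptions):
      import numpy as np
      from scipy.optimize import minimize

      def lts_objective(beta, X, y, h):
          # Trimmed criterion: sum of the h smallest squared residuals.
          r2 = (y - X @ beta) ** 2
          return np.sort(r2)[:h].sum()

      rng = np.random.default_rng(0)
      n = 200
      X = np.column_stack([np.ones(n), rng.normal(size=n)])
      y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
      y[:20] += 15.0                       # contaminate 10% of the sample
      h = int(0.75 * n)                    # keep the 75% best-fitting points
      ols = np.linalg.lstsq(X, y, rcond=None)[0]
      fit = minimize(lts_objective, x0=ols, args=(X, y, h), method="Nelder-Mead")
      print(fit.x)                         # close to (1, 2) despite the outliers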
  2. By: Gunky Kim; Mervyn J. Silvapulle; Paramsothy Silvapulle
    Abstract: A semiparametric method is developed for estimating the dependence parameter and the joint distribution of the error term in the multivariate linear regression model. The nonparametric part of the method treats the marginal distributions of the error term as unknown, and estimates them by suitable empirical distribution functions. Then a pseudolikelihood is maximized to estimate the dependence parameter. It is shown that this estimator is asymptotically normal, and a consistent estimator of its large sample variance is given. A simulation study shows that the proposed semiparametric estimator is better than the parametric methods available when the error distribution is unknown, which is almost always the case in practice. It turns out that there is no loss of asymptotic efficiency due to the estimation of the regression parameters. An empirical example on portfolio management is used to illustrate the method. This is an extension of earlier work by Oakes (1994) and Genest et al. (1995) for the case when the observations are independent and identically distributed, and Oakes and Ritz (2000) for the multivariate regression model.
    Keywords: Copula; Pseudo-likelihood; Robustness.
    JEL: C01 C12 C13 C14
    Date: 2007–02
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2007-1&r=ecm
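    Sketch: the two-step semiparametric idea in minimal Python for a Gaussian copula (a sketch under assumed simulated errors; in the paper the pseudo-observations come from regression residuals, and the copula family is not restricted to the Gaussian):
      import numpy as np
      from scipy.stats import norm, rankdata
      from scipy.optimize import minimize_scalar

      def neg_pseudo_loglik(rho, u, v):
          # Negative log pseudo-likelihood of a bivariate Gaussian copula.
          z1, z2 = norm.ppf(u), norm.ppf(v)
          q = rho**2 * (z1**2 + z2**2) - 2 * rho * z1 * z2
          return (0.5 * np.log(1 - rho**2) + q / (2 * (1 - rho**2))).sum()

      rng = np.random.default_rng(1)
      n = 500
      e = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
      # Step 1: estimate the margins by empirical distribution functions
      # (rescaled ranks give the pseudo-observations).
      u, v = rankdata(e[:, 0]) / (n + 1), rankdata(e[:, 1]) / (n + 1)
      # Step 2: maximize the pseudo-likelihood over the dependence parameter.
      res = minimize_scalar(neg_pseudo_loglik, bounds=(-0.99, 0.99),
                            args=(u, v), method="bounded")
      print(res.x)                         # roughly the true value 0.6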
  3. By: Catherine Bruneau; Amine Lahiani
    Abstract: This paper implements a simulation-based method for estimating the parameters of Threshold Integrated Moving Average (TIMA) models with contemporaneous asymmetry. Among the many simulation-based methods available, we use the Indirect Inference (II) method with an autoregressive model as the auxiliary model. To investigate the finite-sample properties of the estimator, we conduct Monte Carlo experiments. We apply our framework to the daily CAC40 index return series and find that it exhibits an asymmetric response to shocks around a threshold.
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:drm:wpaper:2006-17&r=ecm
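    Sketch: the indirect inference loop with an AR auxiliary model, in minimal Python (a simple MA(1) stands in for the TIMA model, so every structural detail here is an assumption; only the match-the-auxiliary-estimates logic carries over):
      import numpy as np
      from scipy.optimize import minimize_scalar

      def ar_fit(y, p=3):
          # Auxiliary model: AR(p) estimated by OLS.
          Y = y[p:]
          X = np.column_stack([y[p - k: len(y) - k] for k in range(1, p + 1)])
          return np.linalg.lstsq(X, Y, rcond=None)[0]

      def simulate(theta, n, rng):
          # Structural model; an MA(1) stands in for the TIMA model.
          e = rng.normal(size=n + 1)
          return e[1:] + theta * e[:-1]

      rng = np.random.default_rng(2)
      y_obs = simulate(0.5, 1000, rng)     # pretend these are the data
      beta_obs = ar_fit(y_obs)

      def ii_distance(theta):
          # Binding function approximated by simulation with frozen draws.
          sim_rng = np.random.default_rng(42)
          sims = [ar_fit(simulate(theta, 1000, sim_rng)) for _ in range(10)]
          return np.sum((np.mean(sims, axis=0) - beta_obs) ** 2)

      print(minimize_scalar(ii_distance, bounds=(-0.95, 0.95),
                            method="bounded").x)   # near the true 0.5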
  4. By: Lillestøl, Jostein (Dept. of Finance and Management Science, Norwegian School of Economics and Business Administration)
    Abstract: The univariate Normal Inverse Gaussian (NIG) distribution is found useful for modelling financial return data exhibiting skewness and fat tails. Multivariate versions exist, but may be impractical to implement in finance. This work explores some possibilities linked to the mixing representation of the NIG distribution by the IG distribution. We present two approaches for constructing bivariate NIG distributions that take advantage of the correlation between the univariate latent IG variables characterizing the marginal NIG distributions. These are readily available from the marginal estimation, either by maximum likelihood via the EM algorithm or by Bayesian estimation via Markov chain Monte Carlo methods. A context for implementation in finance is given.
    Keywords: Financial returns; bivariate distribution; NIG distribution; mixture representation; inverse Gaussian distribution; bivariate simulation
    JEL: C10 C11 C13 C15 C16
    Date: 2007–01–08
    URL: http://d.repec.org/n?u=RePEc:hhs:nhhfms:2007_001&r=ecm
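    Sketch: one way to induce the dependence through the latent IG variables, in minimal Python (a sketch, not the authors' construction; the Gaussian-copula coupling of the IG margins and all parameter values are assumptions):
      import numpy as np
      from scipy.stats import invgauss, norm

      def ig_ppf(u, delta, gamma):
          # IG(delta, gamma) has mean delta/gamma and shape delta**2;
          # map to scipy's (mu, scale) parameterization.
          lam = delta ** 2
          return invgauss.ppf(u, mu=(delta / gamma) / lam, scale=lam)

      rng = np.random.default_rng(3)
      n, rho = 10000, 0.7
      alpha, beta, delta, mu = 2.0, 1.0, 1.0, 0.0
      gamma = np.sqrt(alpha**2 - beta**2)
      # Correlated latent IG mixing variables via a Gaussian copula.
      g = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
      z1 = ig_ppf(norm.cdf(g[:, 0]), delta, gamma)
      z2 = ig_ppf(norm.cdf(g[:, 1]), delta, gamma)
      # Each margin is NIG through the mixing: X | Z ~ N(mu + beta*Z, Z).
      x1 = mu + beta * z1 + np.sqrt(z1) * rng.normal(size=n)
      x2 = mu + beta * z2 + np.sqrt(z2) * rng.normal(size=n)
      print(np.corrcoef(x1, x2)[0, 1])     # dependence inherited from (z1, z2)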
  5. By: Calista Cheung; Frédérick Demers
    Abstract: This paper evaluates the performance of static and dynamic factor models for forecasting Canadian real output growth and core inflation on a quarterly basis. We extract the common component from a large number of macroeconomic indicators, and use the estimates to compute out-of-sample forecasts under a recursive and a rolling scheme with different window sizes. Factor-based forecasts are compared with AR(p) models as well as IS- and Phillips-curve models. We find that factor models can improve forecast accuracy relative to standard benchmark models for horizons of up to 8 quarters. Forecasts from our proposed factor models are also less prone to large errors, in particular as the horizon increases. We further show that the choice of sampling scheme has a large influence on overall forecast accuracy, with the smallest rolling-window samples generating superior results to larger samples, implying that the use of "limited-memory" estimators helps improve the quality of the forecasts.
    Keywords: Econometric and statistical methods
    JEL: C32 E37
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:07-8&r=ecm
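    Sketch: the bare bones of a static-factor forecast in minimal Python (principal components on a standardized panel, then a projection of the h-step-ahead target on the factors; the simulated panel and the number of factors are assumptions):
      import numpy as np

      def factor_forecast(X, y, h, r=2):
          # Extract r static factors (the common component) by principal
          # components, then regress y_{t+h} on the factors plus intercept.
          Xs = (X - X.mean(0)) / X.std(0)
          w, v = np.linalg.eigh(Xs.T @ Xs)  # eigenvalues ascending
          F = Xs @ v[:, -r:]                # top-r principal components
          Z = np.column_stack([np.ones(len(F)), F])
          b = np.linalg.lstsq(Z[:-h], y[h:], rcond=None)[0]
          return Z[-1] @ b                  # h-step-ahead forecast

      rng = np.random.default_rng(4)
      T, N = 120, 40                        # quarterly sample, many indicators
      f = rng.normal(size=T).cumsum() / 10  # one persistent common factor
      X = np.outer(f, rng.normal(size=N)) + rng.normal(size=(T, N))
      y = f + 0.5 * rng.normal(size=T)      # target loads on the factor
      print(factor_forecast(X, y, h=4))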
  6. By: Bent Nielsen (Nuffield College, Oxford University); Eric Engler (Dept of Economics, Oxford University)
    Abstract: The empirical process of the residuals from general autoregressions is investigated. If an intercept is included in the regression, the empirical process is asymptotically Gaussian and free of nuisance parameters. This contrasts with the known result that in the unit root case without an intercept the empirical process is asymptotically non-Gaussian. The result is used to establish asymptotic theory for the Kolmogorov-Smirnov test, Probability-Probability plots, and Quantile-Quantile plots. The link between sample moments and the empirical process of the residuals is derived and used to establish the properties of cumulant-based tests for normality such as the Jarque-Bera test.
    Keywords: Autoregression, Empirical process, Kolmogorov-Smirnov test, Probability-Probability plots, Quantile-Quantile plots, Test for normality.
    Date: 2007–01–17
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0701&r=ecm
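    Sketch: the object the paper studies, in minimal Python: a Kolmogorov-Smirnov test applied to AR residuals (illustrative; the naive test below ignores the effect of parameter estimation, which is exactly what the paper's asymptotic theory accounts for):
      import numpy as np
      from scipy.stats import kstest

      rng = np.random.default_rng(5)
      T = 500
      y = np.zeros(T)
      for t in range(1, T):
          y[t] = 0.2 + 0.9 * y[t - 1] + rng.normal()
      # AR(1) with intercept by OLS; the intercept is what makes the residual
      # empirical process free of nuisance parameters in the paper's result.
      X = np.column_stack([np.ones(T - 1), y[:-1]])
      b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
      e = y[1:] - X @ b
      e = (e - e.mean()) / e.std()
      print(kstest(e, "norm"))              # KS statistic and naive p-value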
  7. By: Anthony Murphy; Marwan Izzeldin
    Abstract: We investigate the bootstrapped size and power properties of five long memory tests, including the modified R/S, KPSS and GPH tests. In small samples, the moving block bootstrap controls the empirical size of the tests. However, for these sample sizes, the power of the bootstrapped tests against fractionally integrated alternatives is often a good deal less than that of the asymptotic tests. In larger samples, the power of the five tests is good against common fractionally integrated alternatives: the pure FI case and the FI case with a stochastic volatility error.
    Keywords: Moving block bootstrap; fractional integration
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:lan:wpaper:003091&r=ecm
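    Sketch: a moving block bootstrap of a long memory statistic, in minimal Python (the classical rescaled-range statistic and the block length are assumptions; the paper works with the modified R/S, KPSS and GPH tests):
      import numpy as np

      def moving_block_bootstrap(x, block_len, rng):
          # Resample overlapping blocks with replacement and concatenate,
          # preserving short-run dependence within blocks.
          n = len(x)
          n_blocks = int(np.ceil(n / block_len))
          starts = rng.integers(0, n - block_len + 1, size=n_blocks)
          return np.concatenate([x[s:s + block_len] for s in starts])[:n]

      def rs_statistic(x):
          # Classical (unmodified) rescaled-range statistic.
          z = np.cumsum(x - x.mean())
          return (z.max() - z.min()) / x.std()

      rng = np.random.default_rng(6)
      x = rng.normal(size=500)               # a short-memory null series
      stat = rs_statistic(x)
      boot = [rs_statistic(moving_block_bootstrap(x, 25, rng))
              for _ in range(999)]
      print(np.mean(np.array(boot) >= stat)) # bootstrap p-value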
  8. By: Kleijnen,Jack P.C. (Tilburg University, Center for Economic Research)
    Abstract: In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic theory assumes a single simulation response that is normally and independently distributed with a constant variance; moreover, the regression (meta)model of the simulation model's I/O behaviour is assumed to have residuals with zero means. This article addresses the following questions: (i) How realistic are these assumptions, in practice? (ii) How can these assumptions be tested? (iii) If assumptions are violated, can the simulation's I/O data be transformed such that the assumptions do hold? (iv) If not, which alternative statistical methods can then be applied?
    Keywords: metamodels; experimental designs; generalized least squares; multivariate analysis; normality; jackknife; bootstrap; heteroscedasticity; common random numbers; validation
    JEL: C0 C1 C9
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:20079&r=ecm
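    Sketch: the contrast the paper draws, in minimal Python: a 2^3 factorial design with a first-order-plus-interaction regression metamodel, instead of changing one factor at a time (the response function is a hypothetical stand-in for a simulation model):
      import numpy as np
      from itertools import product

      # Full 2^3 factorial design in coded units (-1, +1).
      design = np.array(list(product([-1, 1], repeat=3)), dtype=float)

      def run_simulation(x, rng):
          # Hypothetical stand-in for one simulation run at design point x.
          return 5 + 2 * x[0] - x[1] + 0.5 * x[0] * x[1] + rng.normal(scale=0.3)

      rng = np.random.default_rng(7)
      y = np.array([run_simulation(x, rng) for x in design])
      # Regression metamodel of the I/O behaviour: intercept, main effects,
      # and one two-factor interaction.
      X = np.column_stack([np.ones(len(design)), design,
                           design[:, 0] * design[:, 1]])
      print(np.linalg.lstsq(X, y, rcond=None)[0])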
  9. By: Adam Clements; Scott White (School of Economics and Finance, Queensland University of Technology)
    Abstract: This paper develops a computationally efficient filtering based procedure for the estimation of the heavy tailed SV model with leverage. While there are many accepted techniques for the estimation of standard SV models, incorporating these effects into an SV framework is difficult. Simulation evidence provided in this paper indicates that the proposed procedure outperforms competing approaches in terms of the accuracy of parameter estimation. In an empirical setting, it is shown how the individual effects of heavy tails and leverage can be isolated using standard likelihood ratio tests.
    URL: http://d.repec.org/n?u=RePEc:qut:dpaper:192&r=ecm
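    Sketch: a generic bootstrap particle filter for a heavy-tailed SV model, in minimal Python (not the authors' algorithm; leverage is omitted and all parameter values are assumptions):
      import numpy as np
      from scipy.stats import t as student_t

      def sv_particle_filter(y, phi, sigma_eta, nu, n_part, rng):
          # SV model: h_t = phi*h_{t-1} + sigma_eta*eta_t,
          #           y_t = exp(h_t/2)*eps_t with Student-t eps_t.
          h = rng.normal(size=n_part)
          loglik = 0.0
          for yt in y:
              h = phi * h + sigma_eta * rng.normal(size=n_part)  # propagate
              w = student_t.pdf(yt * np.exp(-h / 2), df=nu) * np.exp(-h / 2)
              loglik += np.log(w.mean())
              h = rng.choice(h, size=n_part, p=w / w.sum())      # resample
          return loglik

      rng = np.random.default_rng(8)
      T, phi, sig = 300, 0.95, 0.2
      h = np.zeros(T)
      for t in range(1, T):
          h[t] = phi * h[t - 1] + sig * rng.normal()
      y = np.exp(h / 2) * rng.standard_t(df=8, size=T)
      print(sv_particle_filter(y, phi, sig, nu=8, n_part=2000, rng=rng))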
  10. By: Zsolt Darvas (Corvinus University of Budapest); Balázs Varga (Corvinus University of Budapest)
    Abstract: This paper studies inflation persistence with time-varying-coefficient autoregressions, in response to recently discovered structural breaks in the historical inflation time series of the euro-area and the US. To this end, we compare the statistical properties of the well-known maximum likelihood estimation using the Kalman filter and the lesser-known Flexible Least Squares (FLS) estimator by Monte Carlo simulation. We also suggest a procedure for selecting the FLS weight, based on an iterative Monte Carlo simulation technique calibrated to the time series in question. We apply the methods to study the inflation persistence of the US, the euro-area, and the new members of the EU.
    Keywords: flexible least squares, inflation persistence, Kalman-filter, time-varying coefficient models
    JEL: C22 E31
    Date: 2007–02–02
    URL: http://d.repec.org/n?u=RePEc:mmf:mmfc06:137&r=ecm
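    Sketch: the Flexible Least Squares estimator for one time-varying coefficient, in minimal Python (FLS trades off fit against coefficient smoothness with weight mu; the simulated data and the weight are assumptions):
      import numpy as np

      def fls(y, x, mu):
          # Minimize sum (y_t - x_t*b_t)^2 + mu * sum (b_t - b_{t-1})^2 by
          # stacking measurement and smoothness equations into one
          # least-squares problem.
          T = len(y)
          A = np.zeros((2 * T - 1, T))
          rhs = np.zeros(2 * T - 1)
          A[np.arange(T), np.arange(T)] = x           # measurement rows
          rhs[:T] = y
          rows = np.arange(T, 2 * T - 1)              # smoothness rows
          A[rows, np.arange(T - 1)] = -np.sqrt(mu)
          A[rows, np.arange(1, T)] = np.sqrt(mu)
          return np.linalg.lstsq(A, rhs, rcond=None)[0]

      rng = np.random.default_rng(9)
      T = 200
      beta = np.r_[np.full(100, 0.9), np.full(100, 0.4)]  # a structural break
      x = rng.normal(size=T)
      y = beta * x + 0.3 * rng.normal(size=T)
      print(fls(y, x, mu=50.0)[[50, 150]])            # near 0.9, then near 0.4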
  11. By: Juan José Dolado; Jesús Gonzalo; Laura Mayoral
    Abstract: This paper analyses how to test I(1) against I(d), d<1, in the presence of deterministic components in the DGP, by extending a Wald-type test, the (Efficient) Fractional Dickey-Fuller (EFDF) test, to this case. Tests of these hypotheses are important in many economic applications where it is crucial to distinguish between permanent and transitory shocks, because I(d) processes with d<1 are mean-reverting. Moreover, the inclusion of deterministic components is a necessary addition for analyzing most macroeconomic variables. We show how simple the implementation of the EFDF test is in these situations and argue that, in general, it has better properties than LM tests. Finally, an empirical application is provided in which the EFDF approach allowing for deterministic components is used to test for long memory in the per capita GDP of several OECD countries, an issue that has important consequences for discriminating between growth theories, and on which there has been some controversy.
    Date: 2006–12
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:we20061221&r=ecm
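    Sketch: an FDF-type regression in minimal Python (a sketch of the test's basic mechanics without the efficiency refinement or the deterministic components the paper handles; d under the alternative is an assumption):
      import numpy as np

      def frac_diff(y, d):
          # (1 - L)^d y via the binomial expansion,
          # pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k.
          T = len(y)
          pi = np.empty(T)
          pi[0] = 1.0
          for k in range(1, T):
              pi[k] = pi[k - 1] * (k - 1 - d) / k
          return np.array([pi[:t + 1][::-1] @ y[:t + 1] for t in range(T)])

      rng = np.random.default_rng(10)
      y = rng.normal(size=300).cumsum()     # an I(1) series: the null is true
      dy = np.diff(y)                       # regress Delta y_t on
      z = frac_diff(y, 0.7)[:-1]            #   Delta^d y_{t-1}, with d = 0.7
      phi = (z @ dy) / (z @ z)
      resid = dy - phi * z
      se = np.sqrt(resid @ resid / (len(dy) - 1) / (z @ z))
      print(phi / se)                       # t-ratio; near zero under I(1)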
  12. By: Bent Nielsen (Nuffield College, Oxford University); Carlos Caceres (Nuffield College, Oxford University)
    Abstract: In this paper we present a general result concerning the convergence to stochastic integrals with non-linear integrands. The key finding is a generalization of Chan and Wei's (1988) Theorem 2.4 and of Ibragimov and Phillips' (2004) Theorem 8.2. This result is necessary for analysing the asymptotic properties of mis-specification tests when applied to a unit root process, for which Wooldridge (1999) noted that the existing results in the literature were not sufficient.
    Keywords: non-stationarity, unit roots, convergence, autoregressive processes, martingales, stochastic integrals, non-linearity.
    Date: 2007–02–12
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0702&r=ecm
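    Schematically (not the paper's exact statement), results of this type assert that for a random walk x_t with martingale-difference innovations epsilon_t and a smooth, possibly non-linear function g,
      \frac{1}{\sqrt{T}} \sum_{t=1}^{T} g\!\left(\frac{x_{t-1}}{\sqrt{T}}\right) \epsilon_t \;\Rightarrow\; \int_0^1 g\big(W(u)\big)\, dW(u),
    where W is a Brownian motion; Chan and Wei (1988) and Ibragimov and Phillips (2004) prove versions of this under increasingly general conditions, and the paper generalizes further.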
  13. By: Jan Ámos Víšek (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic)
    Abstract: A robust version of the method of Instrumental Variables, accommodating the idea of implicitly weighting the residuals, is proposed and its properties are studied. (The idea of implicitly down-weighting "suspicious" residuals was first employed by the method of the Least Weighted Squares; see Víšek (2000c).) First, it is shown that all solutions of the corresponding normal equations are bounded in probability. Then their weak consistency is proved.
    Keywords: Robustness; instrumental variables; implicit weighting; consistency of estimate by instrumental weighted variables
    Date: 2007–01
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2007_05&r=ecm
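    Sketch: the implicit-weighting idea grafted onto instrumental variables, in minimal Python (an illustrative iterative scheme, not the paper's estimator; the weight function, data and contamination are assumptions, and convergence of the iteration is heuristic):
      import numpy as np

      def iwv(y, X, Z, n_iter=50):
          # Weights decrease with the rank of the squared residual, so
          # "suspicious" observations are implicitly down-weighted.
          n = len(y)
          w = np.ones(n)
          for _ in range(n_iter):
              Zw = Z * w[:, None]
              beta = np.linalg.solve(Zw.T @ X, Zw.T @ y)
              ranks = np.argsort(np.argsort((y - X @ beta) ** 2))
              w = np.clip(1.5 - 2.0 * ranks / n, 0.0, 1.0)  # a simple choice
          return beta

      rng = np.random.default_rng(11)
      n = 400
      z = rng.normal(size=n)                 # instrument
      u = rng.normal(size=n)                 # confounder
      x = z + 0.8 * u + rng.normal(size=n)   # endogenous regressor
      y = 2.0 * x + u
      y[:40] += 20.0                         # contaminate 10% of the sample
      print(iwv(y, x[:, None], z[:, None]))  # near 2 despite the outliers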
  14. By: Jan Ámos Víšek (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic)
    Abstract: The definition of Instrumental Weighted Variables (IWV), a robust version of the classical Instrumental Variables, and the conditions for its weak consistency given in Part I of this paper are recalled. The reasons why the classical Instrumental Variables were introduced, as well as the idea of implicitly weighting the residuals (first employed by the Least Weighted Squares; see Víšek (2000)), are also recalled. Then the square root of n-consistency of all solutions of the corresponding normal equations is proved.
    Keywords: Robustness; instrumental variables; implicit weighting; square root of n-consistency of estimate by means of instrumental weighted variables
    Date: 2007–01
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2007_06&r=ecm
  15. By: Jan Ámos Víšek (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic)
    Abstract: The robust version of the classical instrumental variables, called Instrumental Weighted Variables (IWV), and the conditions for its square root of n-consistency given in Parts I and II of this paper are recalled. The reasons why the classical instrumental variables and IWV were introduced, and the idea of implicitly weighting the residuals (first employed by the Least Weighted Squares; see Víšek (2000)), are also briefly recalled (details were discussed in Part I of this paper). Then an asymptotic representation and the asymptotic normality of all solutions of the corresponding normal equations are established.
    Keywords: Robustness; instrumental variables; implicit weighting; square root of n-consistency of estimate by means of instrumental weighted variables; asymptotic representation of the estimate and its normality
    Date: 2007–01
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2007_07&r=ecm
  16. By: Allan Layton; Daniel R. Smith (School of Economics and Finance, Queensland University of Technology)
    Abstract: In the business cycle literature researchers often want to determine the extent to which models of the business cycle reproduce broad characteristics of the real world business cycle they purport to represent. Of considerable interest is whether a model’s implied cycle chronology is consistent with the actual business cycle chronology. In the US, a very widely accepted business cycle chronology is that compiled by the National Bureau of Economic Research (NBER), and the vast majority of US business cycle scholars have, for many years, proceeded to test their models for consistency with the NBER dates. In doing this, one of the most prevalent metrics in use since its introduction into the business cycle literature by Diebold and Rudebusch (1989) is the so-called quadratic probability score, or QPS. However, an important limitation to the use of the QPS statistic is that its sampling distribution is unknown, so that rigorous statistical inference is not feasible. We suggest circumventing this by bootstrapping the distribution. This analysis yields some interesting insights into the relationship between statistical measures of goodness of fit of a model and the ability of the model to predict some underlying set of regimes of interest. Furthermore, in modeling the business cycle, a popular approach in recent years has been to use some variant of the so-called Markov regime switching (MRS) model first introduced by Hamilton (1989), and we therefore use MRS models as the framework for the paper. Of course, the approach could be applied to any US business cycle model.
    Keywords: Markov Regime Switching, Business Cycle, Quadratic Probability Score
    URL: http://d.repec.org/n?u=RePEc:qut:dpaper:200&r=ecm
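    Sketch: the QPS and a simple resampling of its null distribution, in minimal Python (the permutation scheme below is a simplified stand-in for the paper's bootstrap within MRS models; the simulated probabilities and regimes are assumptions):
      import numpy as np

      def qps(p, r):
          # Quadratic probability score (Diebold and Rudebusch 1989):
          # QPS = (2/T) * sum (P_t - R_t)^2, in [0, 2]; smaller is better.
          return 2.0 * np.mean((p - r) ** 2)

      rng = np.random.default_rng(12)
      T = 200
      r = (rng.random(T) < 0.3).astype(float)   # stand-in regime chronology
      p = np.clip(0.3 + 0.4 * (r - 0.3) + 0.2 * rng.normal(size=T), 0, 1)
      stat = qps(p, r)
      # Resample under "no predictive content" by permuting the
      # probabilities, breaking any link with the regimes.
      boot = [qps(rng.permutation(p), r) for _ in range(999)]
      print(stat, np.mean(np.array(boot) <= stat))  # QPS and its p-value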
  17. By: Gerard J. van den Berg (Princeton University, Free University Amsterdam, IFAU-Uppsala, IFS, CREST, CEPR and IZA)
    Abstract: Instrumental variable estimation requires untestable exclusion restrictions. With policy effects on individual outcomes, there is typically a time interval between the moment the agent realizes that he may be exposed to the policy and the actual exposure or the announcement of the actual treatment status. In such cases the agent has an incentive to acquire information on the value of the IV. This leads to violation of the exclusion restriction. We analyze this in a dynamic economic model framework. This provides a foundation for exclusion restrictions in terms of economic behavior. The results are used to describe policy evaluation settings in which instrumental variables are likely or unlikely to make sense. For the latter cases we analyze the asymptotic bias. The exclusion restriction is more likely to be violated if the outcome of interest strongly depends on interactions between the agent’s effort before the outcome is realized and the actual treatment status. The bias has the same sign as this interaction effect. Violation does not causally depend on the weakness of the candidate instrument or the size of the average treatment effect. With experiments, violation is more likely if the treatment and control groups are to be of similar size. We also address side-effects. We develop a novel economic interpretation of placebo effects and provide some empirical evidence for the relevance of the analysis.
    Keywords: treatment, policy evaluation, information, selection effects, randomization, placebo effect
    JEL: C31 C21 D81 J68 D82 D83 D84 C35 C51
    Date: 2007–01
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp2585&r=ecm
  18. By: Collet J.J.; Fadili J.M. (School of Economics and Finance, Queensland University of Technology)
    Abstract: In this paper, we study the synthesis of Gegenbauer processes using the wavelet packet transform. In order to simulate a 1-factor Gegenbauer process, we introduce an original algorithm, inspired by the one proposed by Coifman and Wickerhauser [CW92], to adaptively search the wavelet packet library for the best orthogonal basis, in which the covariance matrix of the transformed process is nearly diagonal. Our method clearly outperforms the one recently proposed by [Whi01], is very fast, does not depend on the choice of wavelet, and is not very sensitive to the length of the time series. Building on these first results, we propose an algorithm to construct bases for simulating k-factor Gegenbauer processes. Given the simplicity of programming and running, we feel the general practitioner will be attracted to our simulator. Finally, we evaluate the approximation error induced by treating the wavelet packet coefficients as uncorrelated. An empirical study is carried out which supports our results.
    Keywords: Gegenbauer process, Wavelet packet transform, Best-basis, Autocovariance
    URL: http://d.repec.org/n?u=RePEc:qut:dpaper:190&r=ecm
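    Sketch: a brute-force benchmark simulation of a 1-factor Gegenbauer process in minimal Python, the slow alternative that wavelet-packet methods are designed to beat (the truncation length and parameter values are assumptions):
      import numpy as np

      def gegenbauer_sim(T, d, u, rng, trunc=2000):
          # Truncated MA representation of (1 - 2uL + L^2)^d x_t = e_t, with
          # Gegenbauer coefficients psi_0 = 1, psi_1 = 2du,
          # j*psi_j = 2u(j + d - 1)*psi_{j-1} - (j + 2d - 2)*psi_{j-2}.
          psi = np.empty(trunc)
          psi[0], psi[1] = 1.0, 2.0 * d * u
          for j in range(2, trunc):
              psi[j] = (2 * u * (j + d - 1) * psi[j - 1]
                        - (j + 2 * d - 2) * psi[j - 2]) / j
          e = rng.normal(size=T + trunc)
          return np.array([psi @ e[t + trunc:t:-1] for t in range(T)])

      rng = np.random.default_rng(13)
      x = gegenbauer_sim(500, d=0.3, u=np.cos(np.pi / 5), rng=rng)
      print(x[:5])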
  19. By: Adam Clements; Scott White (School of Economics and Finance, Queensland University of Technology)
    Abstract: This paper considers the size effect, where volatility dynamics are dependent upon the current level of volatility, within a stochastic volatility framework. A non-linear filtering algorithm is proposed in which the dynamics of the latent variable are conditioned on its current level. This allows for the estimation of a stochastic volatility model whose dynamics depend on the level of volatility. Empirical results suggest that volatility dynamics are in fact influenced by the level of prevailing volatility: when volatility is relatively low it is extremely persistent with little noise, and when volatility is high it is much less persistent and considerably noisier.
    Keywords: Non-linear filtering, stochastic volatility, size effect, threshold
    URL: http://d.repec.org/n?u=RePEc:qut:dpaper:191&r=ecm
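    Sketch: simulating the size effect in minimal Python: an SV model whose persistence and noise depend on the current volatility level (all regime parameters are hypothetical, chosen to mimic the pattern the abstract describes):
      import numpy as np

      rng = np.random.default_rng(14)
      T = 1000
      h = np.zeros(T)                       # latent log-volatility
      for t in range(1, T):
          if h[t - 1] < 0.0:                # low-volatility regime:
              phi, sigma = 0.98, 0.05       #   very persistent, little noise
          else:                             # high-volatility regime:
              phi, sigma = 0.80, 0.40       #   less persistent, much noisier
          h[t] = phi * h[t - 1] + sigma * rng.normal()
      y = np.exp(h / 2) * rng.normal(size=T)        # returns
      print(np.std(y[h < 0]), np.std(y[h >= 0]))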

This nep-ecm issue is ©2007 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.