
on Econometrics 
By:  Cizek, Pavel (Tilburg University, Center for Economic Research) 
Abstract:  High breakdown point regression estimators protect against large errors and data contamination. Motivated by some existing procedures, namely the least trimmed squares and maximum trimmed likelihood estimators, we propose a general trimmed estimator, which unifies and extends many existing robust procedures. We derive here the consistency and asymptotic distribution of the proposed general trimmed estimator under mild β-mixing conditions and demonstrate its applicability in nonlinear regression, time series, and limited dependent variable models. 
Keywords:  asymptotic normality; regression; robust estimation; trimming 
JEL:  C13 C20 C24 C25 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:20071&r=ecm 
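The least trimmed squares idea that motivates the paper minimizes the sum of the h smallest squared residuals. A minimal sketch for a no-intercept scalar regression, using random starts and concentration (C-) steps; the model, tuning constants and function names here are illustrative assumptions, not the paper's general trimmed estimator:

```python
import numpy as np

def lts_slope(x, y, h, n_starts=50, seed=0):
    """Least trimmed squares slope for y ~ b*x (no intercept):
    minimize the sum of the h smallest squared residuals,
    via random starts followed by concentration (C-) steps."""
    rng = np.random.default_rng(seed)
    n = len(x)
    best_b, best_obj = 0.0, np.inf
    for _ in range(n_starts):
        # start from an OLS fit on a random size-h subset
        idx = rng.choice(n, size=h, replace=False)
        b = np.dot(x[idx], y[idx]) / np.dot(x[idx], x[idx])
        for _ in range(20):  # C-steps: refit on the h best-fitting points
            r2 = (y - b * x) ** 2
            keep = np.argsort(r2)[:h]
            b_new = np.dot(x[keep], y[keep]) / np.dot(x[keep], x[keep])
            if abs(b_new - b) < 1e-12:
                break
            b = b_new
        obj = np.sort((y - b * x) ** 2)[:h].sum()
        if obj < best_obj:
            best_b, best_obj = b, obj
    return best_b

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 2.0 * x + 0.1 * rng.normal(size=200)
y[:40] += 10.0                     # contaminate 20% of the sample
print(lts_slope(x, y, h=150))      # close to the true slope 2 despite contamination
```

Because the trimming set is re-chosen at each C-step, the 40 shifted observations end up excluded and the fit is essentially unaffected by them.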
By:  Gunky Kim; Mervyn J. Silvapulle; and Paramsothy Silvapulle 
Abstract:  A semiparametric method is developed for estimating the dependence parameter and the joint distribution of the error term in the multivariate linear regression model. The nonparametric part of the method treats the marginal distributions of the error term as unknown, and estimates them by suitable empirical distribution functions. Then a pseudolikelihood is maximized to estimate the dependence parameter. It is shown that this estimator is asymptotically normal, and a consistent estimator of its large sample variance is given. A simulation study shows that the proposed semiparametric estimator is better than the parametric methods available when the error distribution is unknown, which is almost always the case in practice. It turns out that there is no loss of asymptotic efficiency due to the estimation of the regression parameters. An empirical example on portfolio management is used to illustrate the method. This is an extension of earlier work by Oakes (1994) and Genest et al. (1995) for the case when the observations are independent and identically distributed, and Oakes and Ritz (2000) for the multivariate regression model. 
Keywords:  Copula; Pseudolikelihood; Robustness. 
JEL:  C01 C12 C13 C14 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20071&r=ecm 
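The pseudolikelihood step described above can be sketched for a Gaussian copula: form rank-based pseudo-observations, map them through the normal quantile function, and maximize the copula log-likelihood over the dependence parameter. The grid search and the simulated non-normal margins below are this note's illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from scipy.stats import norm

def pseudo_mle_gaussian_rho(u, v):
    """Pseudolikelihood estimate of the Gaussian-copula dependence
    parameter rho from pseudo-observations u, v in (0,1)."""
    x, y = norm.ppf(u), norm.ppf(v)
    def negll(r):
        # negative bivariate Gaussian copula log-density, summed over the sample
        return -np.sum(-0.5 * np.log(1 - r**2)
                       - (r**2 * (x**2 + y**2) - 2 * r * x * y)
                       / (2 * (1 - r**2)))
    grid = np.linspace(-0.99, 0.99, 397)
    return grid[np.argmin([negll(r) for r in grid])]

# simulate errors with Gaussian dependence but unknown, non-normal margins
rng = np.random.default_rng(0)
n, rho = 2000, 0.6
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
e1, e2 = np.exp(z[:, 0]), z[:, 1] ** 3
# rank-based empirical margins, rescaled to avoid 0 and 1
u = (np.argsort(np.argsort(e1)) + 1) / (n + 1)
v = (np.argsort(np.argsort(e2)) + 1) / (n + 1)
print(pseudo_mle_gaussian_rho(u, v))   # close to the true value 0.6
```

The margins enter only through ranks, which is the sense in which the method is semiparametric: the monotone transforms of the errors do not affect the estimate.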
By:  Catherine Bruneau; Amine Lahiani 
Abstract:  This paper implements a simulation-based method for estimating the parameters of Threshold Integrated Moving Average models with contemporaneous asymmetry. Among the many simulation-based methods, we use the Indirect Inference (II) method with an autoregressive model as the auxiliary model. To investigate the properties of the estimator in finite samples we use Monte Carlo methods. We apply our framework to the daily CAC40 index return series and find that this series exhibits an asymmetric response to shocks around a threshold. 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:drm:wpaper:200617&r=ecm 
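Indirect inference with an autoregressive auxiliary model can be sketched as follows: fit the auxiliary AR on the observed data, then choose the structural parameter so that the same AR fit on simulated data matches. The MA(1) stand-in process below is purely illustrative (the paper's TIMA model is more involved):

```python
import numpy as np

def ar_coeffs(y, p=3):
    """OLS fit of the auxiliary AR(p) model; returns the coefficient vector."""
    Y = y[p:]
    X = np.column_stack([y[p - k:-k] for k in range(1, p + 1)])
    return np.linalg.lstsq(X, Y, rcond=None)[0]

def simulate_ma1(theta, eps):
    """MA(1) path y_t = eps_t + theta * eps_{t-1} from given innovations."""
    return eps[1:] + theta * eps[:-1]

# indirect inference: match auxiliary AR coefficients of observed
# and simulated data by grid search over the MA(1) parameter
rng = np.random.default_rng(0)
n = 5000
y_obs = simulate_ma1(0.5, rng.normal(size=n + 1))
beta_obs = ar_coeffs(y_obs)
eps_sim = rng.normal(size=10 * n + 1)   # one long, fixed simulation path
grid = np.linspace(-0.9, 0.9, 181)
dist = [np.sum((ar_coeffs(simulate_ma1(t, eps_sim)) - beta_obs) ** 2)
        for t in grid]
print(grid[np.argmin(dist)])            # close to the true theta = 0.5
```

Keeping the simulated innovations fixed across the grid is the usual device that makes the II criterion smooth in the structural parameter.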
By:  Lillestøl, Jostein (Dept. of Finance and Management Science, Norwegian School of Economics and Business Administration) 
Abstract:  The univariate Normal Inverse Gaussian (NIG) distribution is found useful for modelling financial return data exhibiting skewness and fat tails. Multivariate versions exist, but may be impractical to implement in finance. This work explores some possibilities with links to the mixing representation of the NIG distribution by the IG distribution. We present two approaches for constructing a bivariate NIG distribution that take advantage of the correlation between the univariate latent IG variables that characterize the marginal NIG distributions. These are readily available from the marginal estimation, either by maximum likelihood via the EM algorithm or by Bayesian estimation via Markov chain Monte Carlo methods. A context for implementation in finance is given. 
Keywords:  Financial returns; bivariate distribution; NIG distribution; mixture representation; inverse Gaussian distribution; bivariate simulation 
JEL:  C10 C11 C13 C15 C16 
Date:  2007–01–08 
URL:  http://d.repec.org/n?u=RePEc:hhs:nhhfms:2007_001&r=ecm 
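The mixing representation referred to above writes a NIG variable as a normal mean-variance mixture over a latent inverse Gaussian variable, which makes simulation straightforward. A sketch under the common (alpha, beta, mu, delta) parameterization (the parameterization is an assumption of this note):

```python
import numpy as np

def simulate_nig(alpha, beta, mu, delta, size, rng):
    """Draw from the univariate NIG distribution via its mixture
    representation: Z ~ IG with mean delta/gamma and shape delta^2,
    then X | Z ~ N(mu + beta*Z, Z), where gamma = sqrt(alpha^2 - beta^2)."""
    gamma = np.sqrt(alpha**2 - beta**2)
    z = rng.wald(delta / gamma, delta**2, size=size)  # latent IG variables
    return mu + beta * z + np.sqrt(z) * rng.normal(size=size)

rng = np.random.default_rng(0)
x = simulate_nig(alpha=2.0, beta=-0.5, mu=0.0, delta=1.0, size=100_000, rng=rng)
gamma = np.sqrt(2.0**2 - 0.5**2)
# sample mean vs the theoretical NIG mean mu + delta*beta/gamma
print(x.mean(), 0.0 + 1.0 * (-0.5) / gamma)
```

It is exactly this latent IG layer that the paper exploits: correlating the latent IG variables of two margins induces dependence between the two NIG-distributed returns.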
By:  Calista Cheung; Frédérick Demers 
Abstract:  This paper evaluates the performance of static and dynamic factor models for forecasting Canadian real output growth and core inflation on a quarterly basis. We extract the common component from a large number of macroeconomic indicators, and use the estimates to compute out-of-sample forecasts under a recursive and a rolling scheme with different window sizes. Factor-based forecasts are compared with AR(p) models as well as IS and Phillips-curve models. We find that factor models can improve forecast accuracy relative to standard benchmark models for horizons of up to 8 quarters. Forecasts from our proposed factor models are also less prone to large errors, in particular as the horizon increases. We further show that the choice of sampling scheme has a large influence on overall forecast accuracy, with the smallest rolling-window samples generating superior results to larger samples, implying that using "limited-memory" estimators contributes to improving the quality of the forecasts. 
Keywords:  Econometric and statistical methods 
JEL:  C32 E37 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:bca:bocawp:078&r=ecm 
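A diffusion-index forecast of the kind evaluated here can be sketched as: extract principal-component factors from a standardized panel, then regress the target one step ahead on the factors. The toy data-generating process below is illustrative only, not the Canadian data set:

```python
import numpy as np

def factor_forecast(X, y, n_factors=2):
    """Extract principal-component factors from a standardized T x N
    panel X and use them in an OLS regression to forecast y one step
    ahead (a static-factor, diffusion-index style forecast)."""
    Xs = (X - X.mean(0)) / X.std(0)
    # principal components via SVD: factors are scaled left singular vectors
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
    F = U[:, :n_factors] * s[:n_factors]
    # regress y_{t+1} on the factors at t (plus a constant)
    Z = np.column_stack([np.ones(len(F) - 1), F[:-1]])
    b = np.linalg.lstsq(Z, y[1:], rcond=None)[0]
    return b @ np.concatenate([[1.0], F[-1]])   # forecast of y_{T+1}

rng = np.random.default_rng(0)
T, N = 200, 30
f = np.zeros(T)
for t in range(1, T):                    # one persistent common factor
    f[t] = 0.8 * f[t - 1] + rng.normal()
X = np.outer(f, rng.normal(size=N)) + 0.1 * rng.normal(size=(T, N))
y = np.concatenate([[0.0], f[:-1]])      # target led by the factor
print(factor_forecast(X, y), f[-1])      # forecast tracks the final factor value
```

With 30 noisy indicators loading on one factor, the first principal component recovers the common component well and the forecast inherits that accuracy.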
By:  Bent Nielsen (Nuffield College, Oxford University); Eric Engler (Dept of Economics, Oxford University) 
Abstract:  The empirical process of the residuals from general autoregressions is investigated. If an intercept is included in the regression, the empirical process is asymptotically Gaussian and free of nuisance parameters. This contrasts with the known result that in the unit root case without intercept the empirical process is asymptotically non-Gaussian. The result is used to establish asymptotic theory for the Kolmogorov-Smirnov test, Probability-Probability plots, and Quantile-Quantile plots. The link between sample moments and the empirical process of the residuals is established and used to derive the properties of the cumulant-based tests for normality referred to as the Jarque-Bera test. 
Keywords:  Autoregression, Empirical process, Kolmogorov-Smirnov test, Probability-Probability plots, Quantile-Quantile plots, Test for normality. 
Date:  2007–01–17 
URL:  http://d.repec.org/n?u=RePEc:nuf:econwp:0701&r=ecm 
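In practice the result supports a familiar recipe: fit an autoregression with an intercept by OLS and apply the Kolmogorov-Smirnov test to the standardized residuals. A sketch with simulated AR(1) data (not the paper's code; the standardization step is this note's choice):

```python
import numpy as np
from scipy import stats

# Fit an AR(1) with intercept by OLS and apply the Kolmogorov-Smirnov
# test to the standardized residuals
rng = np.random.default_rng(0)
n = 500
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.2 + 0.9 * y[t - 1] + rng.normal()
X = np.column_stack([np.ones(n - 1), y[:-1]])       # intercept and lag
b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
res = y[1:] - X @ b
res = (res - res.mean()) / res.std()                # standardize residuals
ks_stat, p_value = stats.kstest(res, "norm")
print(ks_stat, p_value)
```

Since the errors here really are Gaussian, one expects no evidence against normality; the paper's contribution is that, with the intercept included, such residual-based tests have standard nuisance-parameter-free asymptotics even in the unit root case.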
By:  Anthony Murphy; Marwan Izzeldin 
Abstract:  We investigate the bootstrapped size and power properties of five long memory tests, including the modified R/S, KPSS and GPH tests. In small samples, the moving block bootstrap controls the empirical size of the tests. However, for these sample sizes, the power of the bootstrapped tests against fractionally integrated alternatives is often a good deal less than that of the asymptotic tests. In larger samples, the power of the five tests is good against common fractionally integrated alternatives: the FI case and the FI with a stochastic volatility error case. 
Keywords:  Moving block bootstrap; fractional integration 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:lan:wpaper:003091&r=ecm 
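The moving block bootstrap used here resamples overlapping blocks with replacement and concatenates them to the original length, preserving short-range dependence within blocks. A minimal sketch; the statistic bootstrapped below (lag-1 autocorrelation) is just an illustration, not one of the five long memory tests:

```python
import numpy as np

def moving_block_bootstrap(y, block_len, rng):
    """One moving-block-bootstrap resample of a series: concatenate
    randomly chosen overlapping blocks until the original length is
    reached, then truncate."""
    n = len(y)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    pieces = [y[s:s + block_len] for s in starts]
    return np.concatenate(pieces)[:n]

rng = np.random.default_rng(0)
y = rng.normal(size=500).cumsum() * 0.01 + rng.normal(size=500)
# bootstrap distribution of an illustrative statistic: lag-1 autocorrelation
stats_bs = []
for _ in range(200):
    yb = moving_block_bootstrap(y, block_len=25, rng=rng)
    stats_bs.append(np.corrcoef(yb[:-1], yb[1:])[0, 1])
print(np.percentile(stats_bs, [2.5, 97.5]))   # bootstrap interval
```

The block length trades off dependence preservation (longer blocks) against resampling variety (shorter blocks); the paper's size results hinge on this choice in small samples.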
By:  Kleijnen, Jack P.C. (Tilburg University, Center for Economic Research) 
Abstract:  In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic theory assumes a single simulation response that is normally and independently distributed with a constant variance; moreover, the regression (meta)model of the simulation model's I/O behaviour is assumed to have residuals with zero means. This article addresses the following questions: (i) How realistic are these assumptions, in practice? (ii) How can these assumptions be tested? (iii) If assumptions are violated, can the simulation's I/O data be transformed such that the assumptions do hold? (iv) If not, which alternative statistical methods can then be applied? 
Keywords:  metamodels; experimental designs; generalized least squares; multivariate analysis; normality; jackknife; bootstrap; heteroscedasticity; common random numbers; validation 
JEL:  C0 C1 C9 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:20079&r=ecm 
By:  Adam Clements; Scott White (School of Economics and Finance, Queensland University of Technology) 
Abstract:  This paper develops a computationally efficient filtering-based procedure for the estimation of the heavy-tailed SV model with leverage. While there are many accepted techniques for the estimation of standard SV models, incorporating heavy tails and leverage into an SV framework is difficult. Simulation evidence provided in this paper indicates that the proposed procedure outperforms competing approaches in terms of the accuracy of parameter estimation. In an empirical setting, it is shown how the individual effects of heavy tails and leverage can be isolated using standard likelihood ratio tests. 
URL:  http://d.repec.org/n?u=RePEc:qut:dpaper:192&r=ecm 
By:  Zsolt Darvas (Corvinus University of Budapest); Balázs Varga (Corvinus University of Budapest) 
Abstract:  This paper studies inflation persistence with time-varying-coefficient autoregressions, in response to recently discovered structural breaks in the historical inflation time series of the euro area and the US. To this end, we compare the statistical properties of the well-known ML estimator using the Kalman filter and the less known Flexible Least Squares (FLS) estimator by Monte Carlo simulation. We also suggest a procedure for selecting the weight for FLS based on an iterative Monte Carlo simulation technique calibrated to the time series in question. We apply the methods to study the inflation persistence of the US, the euro area and the new members of the EU. 
Keywords:  flexible least squares, inflation persistence, Kalman filter, time-varying coefficient models 
JEL:  C22 E31 
Date:  2007–02–02 
URL:  http://d.repec.org/n?u=RePEc:mmf:mmfc06:137&r=ecm 
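The Flexible Least Squares estimator compared here minimizes the sum of squared residuals plus a weighted penalty on coefficient changes, which reduces to solving one linear system. A sketch for a single time-varying coefficient; the penalty weight mu plays the role of the FLS weight whose selection the paper addresses, and the data below are illustrative:

```python
import numpy as np

def fls_scalar(y, x, mu):
    """Flexible Least Squares for a scalar time-varying coefficient:
    minimize sum_t (y_t - b_t x_t)^2 + mu * sum_t (b_{t+1} - b_t)^2.
    The first-order conditions give the linear system
    (diag(x^2) + mu D'D) b = x*y with D the first-difference matrix."""
    T = len(y)
    D = np.diff(np.eye(T), axis=0)        # (T-1) x T difference matrix
    A = np.diag(x**2) + mu * D.T @ D
    return np.linalg.solve(A, x * y)

rng = np.random.default_rng(0)
T = 300
b_true = np.sin(np.linspace(0, np.pi, T))  # smoothly drifting coefficient
x = rng.normal(size=T)
y = b_true * x + 0.1 * rng.normal(size=T)
b_hat = fls_scalar(y, x, mu=50.0)
print(np.max(np.abs(b_hat - b_true)))      # small tracking error
```

Larger mu forces the coefficient path toward a constant (the OLS limit), smaller mu lets it chase the data; that is exactly the trade-off the paper's weight-selection procedure calibrates.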
By:  Juan José Dolado; Jesús Gonzalo; Laura Mayoral 
Abstract:  This paper analyses how to test I(1) against I(d), d<1, in the presence of deterministic components in the DGP, by extending a Wald-type test, i.e., the (Efficient) Fractional Dickey-Fuller (EFDF) test, to this case. Tests of these hypotheses are important in many economic applications where it is crucial to distinguish between permanent and transitory shocks, because I(d) processes with d<1 are mean-reverting. Moreover, the inclusion of deterministic components is a necessary addition for analyzing most macroeconomic variables. We show how simple the implementation of the EFDF test is in these situations and argue that, in general, it has better properties than LM tests. Finally, an empirical application is provided in which the EFDF approach allowing for deterministic components is used to test for long memory in the per capita GDP of several OECD countries, an issue that has important consequences for discriminating between growth theories, and on which there has been some controversy. 
Date:  2006–12 
URL:  http://d.repec.org/n?u=RePEc:cte:werepe:we20061221&r=ecm 
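Fractional Dickey-Fuller-type tests are built on the fractional difference filter (1-L)^d. A sketch of the truncated filter via its binomial-expansion coefficients (the filter only, not the EFDF test statistic itself):

```python
import numpy as np

def frac_diff(y, d):
    """Apply the truncated fractional difference filter (1-L)^d using
    the binomial-expansion coefficients
    pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j."""
    n = len(y)
    pi = np.empty(n)
    pi[0] = 1.0
    for j in range(1, n):
        pi[j] = pi[j - 1] * (j - 1 - d) / j
    # truncated convolution: sum_{j <= t} pi_j * y_{t-j}
    return np.array([pi[: t + 1] @ y[t::-1] for t in range(n)])

y = np.cumsum(np.ones(5))      # the ramp 1, 2, 3, 4, 5
print(frac_diff(y, 1.0))       # d = 1 reduces to ordinary first differencing
```

For d = 1 the coefficients collapse to (1, -1, 0, 0, ...), and for d = 0 the filter is the identity; intermediate d values deliver the mean-reverting I(d) behaviour the test discriminates from I(1).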
By:  Bent Nielsen (Nuffield College, Oxford University); Carlos Caceres (Nuffield College, Oxford University) 
Abstract:  In this paper we present a general result concerning the convergence to stochastic integrals with nonlinear integrands. The key finding generalizes Chan and Wei's (1988) Theorem 2.4 and Ibragimov and Phillips' (2004) Theorem 8.2. This result is necessary for analysing the asymptotic properties of misspecification tests applied to a unit root process, for which Wooldridge (1999) noted that the existing results in the literature were not sufficient. 
Keywords:  nonstationarity, unit roots, convergence, autoregressive processes, martingales, stochastic integrals, nonlinearity. 
Date:  2007–02–12 
URL:  http://d.repec.org/n?u=RePEc:nuf:econwp:0702&r=ecm 
By:  Jan Ámos Víšek (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic) 
Abstract:  A robust version of the method of Instrumental Variables that implicitly weights the residuals is proposed and its properties are studied. (The idea of implicitly down-weighting “suspicious” residuals was first employed by the method of Least Weighted Squares; see Víšek (2000c).) First, all solutions of the corresponding normal equations are shown to be bounded in probability; their weak consistency is then proved. 
Keywords:  Robustness; instrumental variables; implicit weighting; consistency of the instrumental weighted variables estimator 
Date:  2007–01 
URL:  http://d.repec.org/n?u=RePEc:fau:wpaper:wp2007_05&r=ecm 
By:  Jan Ámos Víšek (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic) 
Abstract:  The definition of Instrumental Weighted Variables (IWV), a robust version of the classical Instrumental Variables, and the conditions for its weak consistency as given in Part I of this paper are recalled. The reasons why the classical Instrumental Variables were introduced, as well as the idea of implicitly weighting the residuals (first employed by the Least Weighted Squares; see Víšek (2000)), are also recalled. Then the √n-consistency of all solutions of the corresponding normal equations is proved. 
Keywords:  Robustness; instrumental variables; implicit weighting; √n-consistency of the instrumental weighted variables estimator 
Date:  2007–01 
URL:  http://d.repec.org/n?u=RePEc:fau:wpaper:wp2007_06&r=ecm 
By:  Jan Ámos Víšek (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic) 
Abstract:  The robust version of the classical instrumental variables, called Instrumental Weighted Variables (IWV), and the conditions for its √n-consistency, as given in Parts I and II of this paper, are recalled. The reasons why the classical instrumental variables as well as IWV were introduced, and the idea of implicitly weighting the residuals (first employed by the Least Weighted Squares; see Víšek (2000)), are also briefly recalled (details were discussed in Part I of this paper). Then the asymptotic representation and normality of all solutions of the corresponding normal equations are proved. 
Keywords:  Robustness; instrumental variables; implicit weighting; √n-consistency of the instrumental weighted variables estimator; asymptotic representation of the estimate and its normality 
Date:  2007–01 
URL:  http://d.repec.org/n?u=RePEc:fau:wpaper:wp2007_07&r=ecm 
By:  Allan Layton; Daniel R. Smith (School of Economics and Finance, Queensland University of Technology) 
Abstract:  In the business cycle literature, researchers often want to determine the extent to which models of the business cycle reproduce broad characteristics of the real-world business cycle they purport to represent. Of considerable interest is whether a model’s implied cycle chronology is consistent with the actual business cycle chronology. In the US, a very widely accepted business cycle chronology is that compiled by the National Bureau of Economic Research (NBER), and the vast majority of US business cycle scholars have, for many years, tested their models for consistency with the NBER dates. In doing this, one of the most prevalent metrics in use since its introduction into the business cycle literature by Diebold and Rudebusch (1989) is the so-called quadratic probability score, or QPS. However, an important limitation of the QPS statistic is that its sampling distribution is unknown, so that rigorous statistical inference is not feasible. We suggest circumventing this by bootstrapping the distribution. This analysis yields some interesting insights into the relationship between statistical measures of goodness of fit of a model and the ability of the model to predict some underlying set of regimes of interest. Furthermore, in modeling the business cycle, a popular approach in recent years has been to use some variant of the so-called Markov regime switching (MRS) model first introduced by Hamilton (1989), and we therefore use MRS models as the framework for the paper. Of course, the approach could be applied to any US business cycle model. 
Keywords:  Markov Regime Switching, Business Cycle, Quadratic Probability Score 
URL:  http://d.repec.org/n?u=RePEc:qut:dpaper:200&r=ecm 
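The QPS and its bootstrap distribution can be sketched directly: compute 2/T times the average squared gap between regime probabilities and the realized regime indicator, then resample blocks of (probability, regime) pairs jointly to preserve serial dependence. The data below are synthetic placeholders, not NBER dates:

```python
import numpy as np

def qps(p, r):
    """Quadratic probability score: 2/T * sum_t (p_t - r_t)^2, where p_t
    is the model's regime probability and r_t the realized 0/1 regime."""
    return 2.0 * np.mean((p - r) ** 2)

def bootstrap_qps(p, r, block_len, n_boot, rng):
    """Moving-block-bootstrap distribution of the QPS, resampling
    (p_t, r_t) pairs jointly to preserve serial dependence."""
    n = len(p)
    out = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, n - block_len + 1,
                              size=int(np.ceil(n / block_len)))
        idx = np.concatenate([np.arange(s, s + block_len)
                              for s in starts])[:n]
        out[b] = qps(p[idx], r[idx])
    return out

rng = np.random.default_rng(0)
r = (rng.random(400) < 0.2).astype(float)           # toy regime indicator
p = np.clip(r + 0.2 * rng.normal(size=400), 0, 1)   # toy model probabilities
dist = bootstrap_qps(p, r, block_len=20, n_boot=500, rng=rng)
print(qps(p, r), np.percentile(dist, [5, 95]))
```

The QPS lies in [0, 2], with 0 for a perfect regime classifier; the bootstrap percentiles supply the sampling-distribution benchmark that the asymptotic theory does not.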
By:  Gerard J. van den Berg (Princeton University, Free University Amsterdam, IFAUUppsala, IFS, CREST, CEPR and IZA) 
Abstract:  Instrumental variable estimation requires untestable exclusion restrictions. With policy effects on individual outcomes, there is typically a time interval between the moment the agent realizes that he may be exposed to the policy and the actual exposure or the announcement of the actual treatment status. In such cases there is an incentive for the agent to acquire information on the value of the IV. This leads to violation of the exclusion restriction. We analyze this in a dynamic economic model framework. This provides a foundation for exclusion restrictions in terms of economic behavior. The results are used to describe policy evaluation settings in which instrumental variables are likely or unlikely to make sense. For the latter cases we analyze the asymptotic bias. The exclusion restriction is more likely to be violated if the outcome of interest strongly depends on interactions between the agent’s effort before the outcome is realized and the actual treatment status. The bias has the same sign as this interaction effect. Violation does not causally depend on the weakness of the candidate instrument or the size of the average treatment effect. With experiments, violation is more likely if the treatment and control groups are to be of similar size. We also address side-effects. We develop a novel economic interpretation of placebo effects and provide some empirical evidence for the relevance of the analysis. 
Keywords:  treatment, policy evaluation, information, selection effects, randomization, placebo effect 
JEL:  C31 C21 D81 J68 D82 D83 D84 C35 C51 
Date:  2007–01 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp2585&r=ecm 
By:  Collet J.J.; Fadili J.M. (School of Economics and Finance, Queensland University of Technology) 
Abstract:  In this paper, we propose to study the synthesis of Gegenbauer processes using the wavelet packet transform. In order to simulate a 1-factor Gegenbauer process, we introduce an original algorithm, inspired by the one proposed by Coifman and Wickerhauser [CW92], to adaptively search for the best orthobasis in the wavelet packet library, in which the covariance matrix of the transformed process is nearly diagonal. Our method clearly outperforms the one recently proposed by [Whi01], is very fast, does not depend on the wavelet choice, and is not very sensitive to the length of the time series. From these first results we propose an algorithm to build bases to simulate k-factor Gegenbauer processes. Given its simplicity to program and run, we believe the general practitioner will find our simulator attractive. Finally, we evaluate the approximation that arises from treating the wavelet packet coefficients as uncorrelated. An empirical study is carried out which supports our results. 
Keywords:  Gegenbauer process, Wavelet packet transform, Best basis, Autocovariance 
URL:  http://d.repec.org/n?u=RePEc:qut:dpaper:190&r=ecm 
By:  Adam Clements; Scott White (School of Economics and Finance, Queensland University of Technology) 
Abstract:  This paper considers the size effect, where volatility dynamics are dependent upon the current level of volatility, within a stochastic volatility framework. A nonlinear filtering algorithm is proposed in which the dynamics of the latent variable are conditioned on its current level. This allows for the estimation of a stochastic volatility model whose dynamics depend on the level of volatility. Empirical results suggest that volatility dynamics are in fact influenced by the level of prevailing volatility. When volatility is relatively low (high), volatility is extremely (not very) persistent with little (a great deal of) noise. 
Keywords:  Nonlinear filtering, stochastic volatility, size effect, threshold 
URL:  http://d.repec.org/n?u=RePEc:qut:dpaper:191&r=ecm 