nep-ecm New Economics Papers
on Econometrics
Issue of 2010‒09‒18
sixteen papers chosen by
Sune Karlsson
Orebro University

  1. Consistent Density Deconvolution under Partially Known Error Distribution By Schwarz, Maik; Van Bellegem, Sébastien
  2. Iterative Regularization in Nonparametric Instrumental Regression By Johannes, Jan; Van Bellegem, Sébastien; Vanhems, Anne
  3. Nonparametric Frontier Estimation from Noisy Data By Florens, Jean-Pierre; Schwarz, Maik; Van Bellegem, Sébastien
  4. Modelling Conditional Heteroscedasticity in Nonstationary Series By Cizek, P.
  5. Testing for Structural Breaks at Unknown Time: A Steeplechase By Makram El-Shagi; Sebastian Giesen
  6. A Simple and Efficient (Parametric Conditional) Test for the Pareto Law By Goerlich Gisbert Francisco J.
  7. A Cholesky-MIDAS model for predicting stock portfolio volatility By Ralf Becker; Adam Clements; Robert O'Neill
  8. Analysis of coexplosive processes By Nielsen, Bent
  9. Using Dynamic Copulae for Modeling Dependency in Currency Denominations of a Diversified World Stock Index By Katja Ignatieva; Eckhard Platen; Renata Rendek
  10. Archimedean Copulas and Temporal Dependence By Beare, Brendan K.
  11. Looking behind Granger causality By Chen, Pu; Hsiao, Chih-Ying
  12. Hidden Regular Variation: Detection and Estimation By Abhimanyu Mitra; Sidney I. Resnick
  13. Parametric estimation of risk neutral density functions By Maria Grith; Volker Krätschmer
  14. Modelling income processes with lots of heterogeneity By Browning, Martin; Ejrnæs, Mette; Alvarez, Javier
  15. A time series causal model By Chen, Pu
  16. Using "Shares" vs. "Log of Shares" in Fixed-Effect Estimations By Gerdes, Christer

  1. By: Schwarz, Maik; Van Bellegem, Sébastien
    Abstract: We estimate the distribution of a real-valued random variable from contaminated observations. The additive error is assumed to be normally distributed, but with unknown variance. The distribution is identifiable from the observations if we restrict the class of considered distributions by a simple condition in the time domain. A minimum distance estimator is shown to be consistent under only a slightly stronger assumption than the identification condition.
    Keywords: deconvolution, error measurement, density estimation
    Date: 2009–10–06
    URL: http://d.repec.org/n?u=RePEc:ide:wpaper:23156&r=ecm
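The setting of this paper can be illustrated with a classical Fourier-inversion deconvolution estimator. The sketch below is not the authors' estimator: it assumes the normal error variance is known, whereas the paper's contribution is precisely to handle the unknown-variance case; the sample size, frequency cutoff, and N(0,1) target are illustrative choices.

```python
import numpy as np

# Contaminated sample: Y = X + eps, with X ~ N(0,1) and normal noise.
rng = np.random.default_rng(0)
n = 50_000
sigma_eps = 0.3                    # noise sd, treated as known in this sketch
x_latent = rng.standard_normal(n)
y = x_latent + sigma_eps * rng.standard_normal(n)

# Empirical characteristic function of Y on a truncated frequency grid;
# the cutoff |t| <= 3 regularizes the division by the error cf.
t = np.linspace(-3.0, 3.0, 601)
dt = t[1] - t[0]
phi_y = np.array([np.exp(1j * ti * y).mean() for ti in t])
phi_eps = np.exp(-0.5 * sigma_eps ** 2 * t ** 2)   # cf of N(0, sigma_eps^2)

def density_estimate(x):
    """Fourier inversion of phi_Y / phi_eps, the estimated cf of X."""
    integrand = np.exp(-1j * t * x) * phi_y / phi_eps
    return float((integrand.sum() * dt).real / (2 * np.pi))

f0 = density_estimate(0.0)
print(f0)   # close to the N(0,1) density at zero, about 0.3989
```

Without the frequency cutoff the division by the rapidly decaying normal characteristic function would blow up the sampling noise, which is what makes the problem ill-posed.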
  2. By: Johannes, Jan; Van Bellegem, Sébastien; Vanhems, Anne
    Abstract: We consider the nonparametric regression model with an additive error that is correlated with the explanatory variables. We suppose the existence of instrumental variables that are considered in this model for the identification and the estimation of the regression function. The nonparametric estimation by instrumental variables is an ill-posed linear inverse problem with an unknown but estimable operator. We provide a new estimator of the regression function using an iterative regularization method (the Landweber-Fridman method). The optimal number of iterations and the convergence of the mean square error of the resulting estimator are derived under both mild and severe degrees of ill-posedness. A Monte-Carlo exercise shows the impact of some parameters on the estimator and confirms the reasonable finite-sample performance of the new estimator.
    Keywords: Nonparametric estimation; Instrumental variable; Ill-posed inverse problem
    JEL: C14 C30
    Date: 2010–07
    URL: http://d.repec.org/n?u=RePEc:ide:wpaper:23149&r=ecm
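For readers unfamiliar with the method named above, a minimal sketch of the plain Landweber iteration on a toy finite-dimensional problem (not the authors' nonparametric instrumental-variable estimator, where the operator itself must be estimated) looks as follows; the matrix, right-hand side, and step size are illustrative.

```python
import numpy as np

def landweber(A, b, mu, n_iter):
    """Landweber iteration: x_{k+1} = x_k + mu * A.T @ (b - A @ x_k).

    For 0 < mu < 2 / ||A||_2^2 the iterates converge to a least-squares
    solution of A x = b; stopping early acts as regularization in
    ill-posed problems, which is the role the iteration count plays here.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + mu * A.T @ (b - A @ x)
    return x

# Toy linear problem with a known solution.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true

mu = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step size
x_hat = landweber(A, b, mu, n_iter=500)
print(x_hat)   # close to [1. 2.]
```

In the paper's setting the "optimal number of iterations" balances the regularization bias of early stopping against the variance coming from the estimated operator.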
  3. By: Florens, Jean-Pierre; Schwarz, Maik; Van Bellegem, Sébastien
    Abstract: A new nonparametric estimator of a production frontier is defined and studied when the data set of production units is contaminated by measurement error. The measurement error is assumed to be an additive normal random variable on the input variable, but its variance is unknown. The estimator is a modification of the m-frontier, which necessitates the computation of a consistent estimator of the conditional survival function of the input variable given the output variable. In this paper, the identification and the consistency of a new estimator of the survival function are proved in the presence of additive noise with unknown variance. The performance of the estimator is also studied through simulated data.
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:ide:wpaper:22801&r=ecm
  4. By: Cizek, P. (Tilburg University, Center for Economic Research)
    Abstract: To accommodate the inhomogeneous character of financial time series over longer time periods, standard parametric models can be extended by allowing their coefficients to vary over time. Focusing on conditional heteroscedasticity models, we discuss various strategies to identify and estimate varying-coefficient models and compare all methods by means of a real-data application.
    Keywords: adaptive estimation; conditional heteroscedasticity; varying-coefficient models; time series
    JEL: C14 C22 C53
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:201084&r=ecm
  5. By: Makram El-Shagi; Sebastian Giesen
    Abstract: This paper analyzes the role of common data problems when identifying structural breaks in small samples. Most notably, we survey small sample properties of the most commonly applied endogenous break tests developed by Brown, Durbin, and Evans (1975) and Zeileis (2004), Nyblom (1989) and Hansen (1992), and Andrews, Lee, and Ploberger (1996). Power and size properties are derived using Monte Carlo simulations. Results emphasize that mainly the CUSUM-type tests are affected by the presence of heteroscedasticity, whereas the individual-parameter Nyblom test and the AvgLM test prove highly robust. However, each test is significantly affected by leptokurtosis. In contrast to other settings, where skewness is far more problematic than kurtosis, skewness has no additional effect for any of the endogenous break tests we analyze. Concerning overall robustness the Nyblom test performs best, while being almost on par with more recently developed tests in terms of power.
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:iwh:dispap:19-10&r=ecm
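As a minimal illustration of the CUSUM-type tests surveyed here, the sketch below computes an OLS-CUSUM statistic for a break in the mean; the simulated series, break size, and the approximate 5% critical value 1.358 for the supremum of a Brownian bridge are standard textbook ingredients, not taken from the paper.

```python
import numpy as np

def ols_cusum(y):
    """OLS-CUSUM statistic for a break in the mean of y.

    Under the no-break null the statistic converges in distribution to
    the supremum of the absolute value of a Brownian bridge, whose 5%
    critical value is roughly 1.358.
    """
    n = len(y)
    resid = y - y.mean()
    sigma = resid.std(ddof=1)
    path = np.cumsum(resid) / (sigma * np.sqrt(n))
    return np.max(np.abs(path))

rng = np.random.default_rng(42)
stable = rng.standard_normal(500)                   # no break
broken = np.concatenate([rng.standard_normal(250),
                         3.0 + rng.standard_normal(250)])  # mean shift

print(ols_cusum(stable))   # typically below the 5% critical value 1.358
print(ols_cusum(broken))   # far above 1.358: the break is detected
```

The paper's point is how heteroscedasticity and leptokurtosis distort the size and power of tests of exactly this type in small samples.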
  6. By: Goerlich Gisbert Francisco J. (Ivie)
    Abstract: This working paper presents a simple and locally optimal test statistic for the Pareto law. The test is based on the Lagrange multiplier (LM) principle and can be computed easily once the maximum likelihood estimator of the scale parameter of the Pareto density has been obtained. A Monte Carlo exercise shows the good small sample properties of the test under the null hypothesis of the Pareto law and also its power against some sensible alternatives. Finally, a simple application to urban economics is performed. An appendix presents derivations and proofs.
    Keywords: LM test, Pareto law, statistical distributions
    Date: 2010–02–01
    URL: http://d.repec.org/n?u=RePEc:fbb:wpaper:20101&r=ecm
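The LM statistic described above builds on the maximum likelihood estimator of the Pareto parameters, which is simple enough to sketch; the exact form of the paper's test statistic is not reproduced here, and the simulated sample is illustrative.

```python
import numpy as np

def pareto_mle(x):
    """ML estimates of the Pareto scale x_m and shape alpha.

    For the density f(x) = alpha * x_m**alpha / x**(alpha + 1), x >= x_m:
    x_m_hat = min(x) and alpha_hat = n / sum(log(x / x_m_hat)).
    """
    x = np.asarray(x, dtype=float)
    xm = x.min()
    alpha = len(x) / np.sum(np.log(x / xm))
    return xm, alpha

# Simulate Pareto(alpha = 2.5, x_m = 1) by inverse transform:
# X = x_m * U**(-1/alpha) for U uniform on (0, 1).
rng = np.random.default_rng(1)
u = rng.uniform(size=100_000)
x = 1.0 * u ** (-1.0 / 2.5)

xm_hat, alpha_hat = pareto_mle(x)
print(xm_hat, alpha_hat)   # close to 1 and 2.5
```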
  7. By: Ralf Becker; Adam Clements; Robert O'Neill
    Abstract: This paper presents a simple forecasting technique for variance covariance matrices. It relies significantly on the contribution of Chiriac and Voev (2010), who propose to forecast elements of the Cholesky decomposition, which recombine to form a positive definite forecast for the variance covariance matrix. The method proposed here combines this methodology with advances made in the MIDAS literature to produce a forecasting methodology that is flexible, scales easily with the size of the portfolio and produces superior forecasts in simulation experiments and an empirical application.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:man:cgbcrp:149&r=ecm
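The key algebraic point, that forecasting elements of the Cholesky factor and recombining them always yields a valid (positive semi-definite) covariance forecast, can be sketched as follows; the random perturbation merely stands in for the MIDAS-based element forecasts and is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.standard_normal((500, 4))
S = np.cov(X, rowvar=False)          # "today's" 4x4 covariance matrix

L = np.linalg.cholesky(S)            # lower-triangular factor, S = L @ L.T
# Stand-in for forecasts of the factor's elements (in the paper these come
# from MIDAS regressions); here just a small lower-triangular perturbation.
L_fc = L + 0.05 * np.tril(rng.standard_normal((4, 4)))
S_fc = L_fc @ L_fc.T                 # recombined covariance forecast

# Any matrix of the form M @ M.T is positive semi-definite by construction,
# so the forecast is a valid covariance matrix regardless of the element
# forecasts; forecasting S's entries directly carries no such guarantee.
eigvals = np.linalg.eigvalsh(S_fc)
print(eigvals.min())                 # nonnegative
```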
  8. By: Nielsen, Bent
    Abstract: A vector autoregressive model allowing for unit roots as well as an explosive characteristic root is developed. The Granger-Johansen representation shows that this results in processes with two common features: a random walk and an explosively growing process. Cointegrating and coexplosive vectors can be found that eliminate these common factors. The likelihood ratio test for a simple hypothesis on the coexplosive vectors is analyzed. The method is illustrated using data from the extreme Yugoslavian hyperinflation of the 1990s.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:ner:oxford:http://economics.ouls.ox.ac.uk/14854/&r=ecm
  9. By: Katja Ignatieva (School of Finance and Economics, University of Technology, Sydney); Eckhard Platen (School of Finance and Economics, University of Technology, Sydney); Renata Rendek (School of Finance and Economics, University of Technology, Sydney)
    Abstract: The aim of this paper is to model the dependency among log-returns when security account prices are expressed in units of a well diversified world stock index. The paper uses the equi-weighted index EWI104s, calculated as the average of 104 world industry sector indices. The log-returns of its denominations in different currencies appear to be Student-t distributed with about four degrees of freedom. Motivated by these findings, the dependency in log-returns of currency denominations of the EWI104s is modeled using time-varying copulae, aiming to identify the best-fitting copula family. The Student-t copula generally turns out to be superior to, e.g., the Gaussian copula, where the dependence structure relates to the multivariate normal distribution. It is shown that merely changing the distributional assumption for the log-returns of the marginals from normal to Student-t leads to a significantly better fit. Furthermore, the Student-t copula with Student-t marginals is able to better capture dependent extreme values than the other models considered. Finally, the paper applies copulae to the estimation of the Value-at-Risk and the expected shortfall of a portfolio constructed of savings accounts in different currencies. The proposed copula-based approach makes it possible to split market risk into general and specific market risk, as defined in regulatory documents. The paper demonstrates that the approach performs clearly better than the RiskMetrics approach.
    Keywords: diversified world stock index; Student-t distribution; time-varying copula; Value-at-Risk; expected shortfall
    Date: 2010–09–01
    URL: http://d.repec.org/n?u=RePEc:uts:rpaper:284&r=ecm
  10. By: Beare, Brendan K.
    Abstract: We study the dependence properties of stationary Markov chains generated by Archimedean copulas. Under some simple regularity conditions, we show that regular variation of the Archimedean generator at zero and one implies geometric ergodicity of the associated Markov chain. We verify our assumptions for a range of Archimedean copulas used in applications.
    Keywords: Archimedean copula, geometric ergodicity, Markov chain, mixing, regular variation, tail dependence
    Date: 2010–09–09
    URL: http://d.repec.org/n?u=RePEc:cdl:ucsdec:1549539&r=ecm
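A Markov chain of the kind studied here is easy to simulate: draw U_t from the copula's conditional distribution given U_{t-1} by inversion. The sketch below uses the Clayton copula (a standard Archimedean family) with an illustrative parameter; the paper's ergodicity results are theoretical and not reproduced here.

```python
import numpy as np

def clayton_markov_chain(theta, n, rng):
    """Stationary Markov chain on (0, 1) whose consecutive pairs
    (U_{t-1}, U_t) have a Clayton copula with parameter theta > 0 and
    uniform marginals.

    Uses the closed-form inverse of the conditional copula C(v | u):
    v = (u**(-theta) * (w**(-theta/(theta+1)) - 1) + 1)**(-1/theta),
    with w uniform on (0, 1).
    """
    u = np.empty(n)
    u[0] = rng.uniform()
    for t in range(1, n):
        w = rng.uniform()
        u[t] = (u[t - 1] ** (-theta)
                * (w ** (-theta / (theta + 1)) - 1) + 1) ** (-1.0 / theta)
    return u

rng = np.random.default_rng(3)
u = clayton_markov_chain(theta=2.0, n=20_000, rng=rng)
print(u.mean())                          # near 0.5: uniform marginals
print(np.corrcoef(u[:-1], u[1:])[0, 1])  # clearly positive serial dependence
```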
  11. By: Chen, Pu; Hsiao, Chih-Ying
    Abstract: Granger causality, a popular concept in time series analysis, is widely applied in empirical research. The interpretation of Granger causality tests in a cause-effect context is, however, often unclear or even controversial, so that the causality label has faded away. Textbooks carefully warn that Granger causality does not imply true causality and prefer to present the Granger causality test as a forecasting technique. Applying the theory of inferred causation, we develop in this paper a method to uncover causal structures behind Granger causality. In this way we re-substantialize the causal attribution in Granger causality by providing a causal explanation of the conditional dependence manifested in Granger causality.
    Keywords: Granger Causality; Time Series Causal Model; Graphical Model
    JEL: C1 E3
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:24859&r=ecm
  12. By: Abhimanyu Mitra; Sidney I. Resnick
    Abstract: Hidden regular variation defines a subfamily of distributions satisfying multivariate regular variation on $\mathbb{E} = [0, \infty]^d \backslash \{(0,0, ..., 0) \} $ and models another regular variation on the sub-cone $\mathbb{E}^{(2)} = \mathbb{E} \backslash \cup_{i=1}^d \mathbb{L}_i$, where $\mathbb{L}_i$ is the $i$-th axis. We extend the concept of hidden regular variation to sub-cones of $\mathbb{E}^{(2)}$ as well. We suggest a procedure for detecting the presence of hidden regular variation, and if it exists, propose a method of estimating the limit measure exploiting its semi-parametric structure. We exhibit examples where hidden regular variation yields better estimates of probabilities of risk sets.
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1001.5058&r=ecm
  13. By: Maria Grith; Volker Krätschmer
    Abstract: This chapter deals with the estimation of risk neutral distributions for pricing index options resulting from the hypothesis of the risk neutral valuation principle. After justifying this hypothesis, we shall focus on parametric estimation methods for the risk neutral density functions determining the risk neutral distributions. We shall differentiate between the direct and the indirect way. Following the direct way, parameter vectors are estimated which characterize distributions from selected statistical families used to model the risk neutral distributions. The idea of the indirect approach is to calibrate characteristic parameter vectors for stochastic models of the asset price processes, and then to extract the risk neutral density function via Fourier methods. For each of the reviewed methods the calculation of option prices under hypothetically true risk neutral distributions is a building block. We shall give explicit formulas for call and put prices w.r.t. the reviewed parametric statistical families used for direct estimation. Additionally, we shall introduce the Fast Fourier Transform method of call option pricing developed in [6]. It is intended to compare the reviewed estimation methods empirically.
    Keywords: Risk neutral valuation principle, risk neutral distribution, logprice risk neutral distribution, risk neutral density function, Black Scholes formula, Fast Fourier Transform method, log-normal distributions, mixtures of log-normal distributions, generalized gamma distributions, model calibration, Merton’s jump diffusion model, Heston’s volatility model
    JEL: C13 C16 G12 G13
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2010-045&r=ecm
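The simplest instance of the "explicit formula" building block mentioned above is the Black-Scholes case, where the risk neutral density is log-normal; a minimal sketch (standard textbook formulas, with illustrative inputs) is:

```python
from math import log, sqrt, exp
from statistics import NormalDist

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call under a log-normal
    risk neutral distribution for the terminal price."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

def bs_put(S, K, T, r, sigma):
    """Put price via put-call parity: P = C - S + K * exp(-r * T)."""
    return bs_call(S, K, T, r, sigma) - S + K * exp(-r * T)

c = bs_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
p = bs_put(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
print(round(c, 4), round(p, 4))   # about 10.45 and 5.57
```

The parametric families reviewed in the chapter (mixtures of log-normals, generalized gamma) replace this log-normal density while keeping the same pricing-by-integration structure.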
  14. By: Browning, Martin; Ejrnæs, Mette; Alvarez, Javier
    Abstract: All empirical models of earnings processes in the literature assume a good deal of homogeneity. In contrast to this we model earnings processes allowing for lots of heterogeneity between agents. We also introduce an extension to the linear ARMA model that allows the initial convergence to the long run to differ from that implied by the conventional ARMA model. This is particularly important for unit root tests, which are actually tests of a composite of two independent hypotheses. We fit our models to a variety of statistics including most of those considered by previous investigators. We use a sample drawn from the PSID, and focus on white males with a high school degree. Despite this observable homogeneity we find much greater latent heterogeneity than previous investigators.
    JEL: J30 C23
    Date: 2010–10
    URL: http://d.repec.org/n?u=RePEc:ner:oxford:http://economics.ouls.ox.ac.uk/14853/&r=ecm
  15. By: Chen, Pu
    Abstract: Cause-effect relations are central in economic analysis. Uncovering empirical cause-effect relations is one of the main research activities of empirical economics. In this paper we develop a time series causal model to explore causal relations among economic time series. The time series causal model is grounded in the theory of inferred causation, a probabilistic and graph-theoretic approach to causality featuring automated learning algorithms. Applying our model we are able to infer cause-effect relations that are implied by the observed time series data. The empirically inferred causal relations can then be used to test economic theoretical hypotheses, to provide evidence for the formulation of theoretical hypotheses, and to carry out policy analysis. Time series causal models are closely related to the popular vector autoregressive (VAR) models in time series analysis. They can be viewed as restricted structural VAR models identified by the inferred causal relations.
    Keywords: Inferred Causation; Automated Learning; VAR; Granger Causality; Wage-Price Spiral
    JEL: E31 C01
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:24841&r=ecm
  16. By: Gerdes, Christer (SOFI, Stockholm University)
    Abstract: This paper looks at potential implications emerging from including "shares" as a control variable in fixed effect estimations. By shares I refer to the ratio of one sum of units over another, such as the share of immigrants in a city or school. As will be shown in this paper, a logarithmic transformation of shares has some methodological merits as compared to the use of shares defined as mere ratios. In certain empirical settings the use of the latter might result in coefficient estimates that, spuriously, are statistically significant more often than they should be.
    Keywords: consistency, Törnqvist index, symmetry, spurious significance
    JEL: C23 C29 J10
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp5171&r=ecm

This nep-ecm issue is ©2010 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.