
on Econometrics 
By:  Schwarz, Maik; Van Bellegem, Sébastien 
Abstract:  We estimate the distribution of a real-valued random variable from contaminated observations. The additive error is assumed to be normally distributed, but with unknown variance. The distribution is identifiable from the observations if we restrict the class of considered distributions by a simple condition in the time domain. A minimum distance estimator is shown to be consistent under only a slightly stronger assumption than the identification condition. 
Keywords:  deconvolution, error measurement, density estimation 
Date:  2009–10–06 
URL:  http://d.repec.org/n?u=RePEc:ide:wpaper:23156&r=ecm 
By:  Johannes, Jan; Van Bellegem, Sébastien; Vanhems, Anne 
Abstract:  We consider the nonparametric regression model with an additive error that is correlated with the explanatory variables. We suppose the existence of instrumental variables, which are used in this model for the identification and estimation of the regression function. Nonparametric estimation by instrumental variables is an ill-posed linear inverse problem with an unknown but estimable operator. We provide a new estimator of the regression function using an iterative regularization method (the Landweber-Fridman method). The optimal number of iterations and the convergence of the mean square error of the resulting estimator are derived under both mild and severe degrees of ill-posedness. A Monte Carlo exercise shows the impact of some parameters on the estimator and demonstrates the reasonable finite-sample performance of the new estimator. 
Keywords:  Nonparametric estimation; Instrumental variable; Ill-posed inverse problem 
JEL:  C14 C30 
Date:  2010–07 
URL:  http://d.repec.org/n?u=RePEc:ide:wpaper:23149&r=ecm 
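The Landweber-Fridman scheme mentioned in the abstract above reduces, in the simplest finite-dimensional setting, to a fixed-point iteration. The sketch below is only an illustration on a toy matrix operator, not the authors' nonparametric estimator; the matrix `K`, the relaxation constant, and the stopping rule are all assumptions.

```python
import numpy as np

def landweber_fridman(K, g, n_iter, c=None):
    """Landweber-Fridman iteration for the linear inverse problem K f = g.

    The update f_{m+1} = f_m + c * K'(g - K f_m) converges to a regularized
    solution when 0 < c < 2 / ||K||_2^2; stopping after n_iter steps plays
    the role of the regularization parameter.
    """
    if c is None:
        c = 1.0 / np.linalg.norm(K, 2) ** 2  # safe relaxation constant
    f = np.zeros(K.shape[1])
    for _ in range(n_iter):
        f = f + c * K.T @ (g - K @ f)  # gradient step on ||g - K f||^2
    return f

# toy operator standing in for the estimated (ill-posed) operator
K = np.array([[2.0, 1.0], [1.0, 2.0]])
f_true = np.array([1.0, -1.0])
f_hat = landweber_fridman(K, K @ f_true, n_iter=200)
```

In the paper's setting the operator is unknown and estimated from data, and the optimal number of iterations is derived rather than fixed in advance.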
By:  Florens, Jean-Pierre; Schwarz, Maik; Van Bellegem, Sébastien 
Abstract:  A new nonparametric estimator of a production frontier is defined and studied when the data set of production units is contaminated by measurement error. The measurement error is assumed to be an additive normal random variable on the input variable, but its variance is unknown. The estimator is a modification of the m-frontier, which necessitates the computation of a consistent estimator of the conditional survival function of the input variable given the output variable. In this paper, the identification and consistency of a new estimator of the survival function are proved in the presence of additive noise with unknown variance. The performance of the estimator is also studied through simulated data. 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:ide:wpaper:22801&r=ecm 
By:  Cizek, P. (Tilburg University, Center for Economic Research) 
Abstract:  To accommodate the inhomogeneous character of financial time series over longer time periods, standard parametric models can be extended by allowing their coefficients to vary over time. Focusing on conditional heteroscedasticity models, we discuss various strategies to identify and estimate varying-coefficient models and compare all methods by means of a real-data application. 
Keywords:  adaptive estimation; conditional heteroscedasticity; varying-coefficient models; time series 
JEL:  C14 C22 C53 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:201084&r=ecm 
By:  Makram ElShagi; Sebastian Giesen 
Abstract:  This paper analyzes the role of common data problems when identifying structural breaks in small samples. Most notably, we survey the small-sample properties of the most commonly applied endogenous break tests, developed by Brown, Durbin, and Evans (1975) and Zeileis (2004), Nyblom (1989) and Hansen (1992), and Andrews, Lee, and Ploberger (1996). Power and size properties are derived using Monte Carlo simulations. The results emphasize that mostly the CUSUM-type tests are affected by the presence of heteroscedasticity, whereas the individual-parameter Nyblom test and the AvgLM test prove to be highly robust. However, each test is significantly affected by leptokurtosis. In contrast to other settings, where skewness is far more problematic than kurtosis, skewness has no additional effect for any of the endogenous break tests we analyze. Concerning overall robustness, the Nyblom test performs best, while being almost on par with more recently developed tests in terms of power. 
Date:  2010–09 
URL:  http://d.repec.org/n?u=RePEc:iwh:dispap:1910&r=ecm 
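Of the tests surveyed in the abstract above, the CUSUM idea is simple enough to sketch. Below is a minimal OLS-based CUSUM statistic in the spirit of those tests (not any one paper's exact implementation); the 5% boundary value of roughly 1.36 is the standard Brownian-bridge critical value, and the deterministic data are purely illustrative.

```python
import numpy as np

def ols_cusum_statistic(y, X):
    """Sup-norm of the OLS-CUSUM path for testing parameter stability.

    Under the null of constant coefficients the scaled cumulative sum of
    OLS residuals behaves like a Brownian bridge; its supremum is compared
    with boundary critical values (about 1.36 at the 5% level).
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    u = y - X @ beta                       # OLS residuals
    sigma = u.std(ddof=X.shape[1])         # residual standard deviation
    path = np.cumsum(u) / (sigma * np.sqrt(len(u)))
    return np.max(np.abs(path))

# deterministic illustration: stable intercept vs. a level shift at mid-sample
n = 100
e = np.tile([1.0, -1.0], n // 2)           # zero-mean "noise"
X = np.ones((n, 1))
stat_stable = ols_cusum_statistic(e, X)
stat_break = ols_cusum_statistic(e + np.where(np.arange(n) >= n // 2, 5.0, 0.0), X)
```

The abstract's point is that size and power of such statistics can deteriorate badly under heteroscedasticity and leptokurtosis, which this stylized example does not capture.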
By:  Goerlich Gisbert Francisco J. (Ivie) 
Abstract:  This working paper presents a simple and locally optimal test statistic for the Pareto law. The test is based on the Lagrange multiplier (LM) principle and can be computed easily once the maximum likelihood estimator of the scale parameter of the Pareto density has been obtained. A Monte Carlo exercise shows the good small-sample properties of the test under the null hypothesis of the Pareto law, as well as its power against some sensible alternatives. Finally, a simple application to urban economics is performed. An appendix presents derivations and proofs. 
Keywords:  LM test, Pareto law, statistical distributions 
Date:  2010–02–01 
URL:  http://d.repec.org/n?u=RePEc:fbb:wpaper:20101&r=ecm 
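The maximum-likelihood step the abstract refers to has a well-known closed form for the Pareto exponent; the sketch below shows only that standard estimator on simulated data (the paper's parametrization may differ, and the LM statistic itself is derived in its appendix and not reproduced here).

```python
import numpy as np

def pareto_mle_exponent(x, x_min):
    """Closed-form MLE of the Pareto exponent alpha for observations x >= x_min."""
    x = np.asarray(x, dtype=float)
    return len(x) / np.sum(np.log(x / x_min))

# inverse-CDF simulation: if U ~ Uniform(0,1), x_min * U**(-1/alpha) is Pareto
rng = np.random.default_rng(0)
alpha_true, x_min = 2.5, 1.0
x = x_min * rng.uniform(size=50_000) ** (-1.0 / alpha_true)
alpha_hat = pareto_mle_exponent(x, x_min)
```

An LM test built on this estimator is attractive precisely because no estimation under the alternative is required.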
By:  Ralf Becker; Adam Clements; Robert O'Neill 
Abstract:  This paper presents a simple forecasting technique for variance-covariance matrices. It relies significantly on the contribution of Chiriac and Voev (2010), who propose forecasting elements of the Cholesky decomposition, which recombine to form a positive definite forecast for the variance-covariance matrix. The method proposed here combines this methodology with advances made in the MIDAS literature to produce a forecasting methodology that is flexible, scales easily with the size of the portfolio, and produces superior forecasts in simulation experiments and an empirical application. 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:man:cgbcrp:149&r=ecm 
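The positive-definiteness trick at the heart of the approach described above is easy to illustrate: forecast the elements of the Cholesky factor rather than of the covariance matrix itself, then recombine. In the sketch below the EWMA is only a placeholder for the authors' MIDAS dynamics, and `lam` and the toy matrices are assumptions.

```python
import numpy as np

def cholesky_forecast(cov_history, lam=0.9):
    """Forecast a covariance matrix by smoothing its Cholesky factors.

    Each realized matrix is decomposed as C_t = L_t L_t'; the elements of
    L are forecast (here by a simple EWMA) and recombined, so the forecast
    L L' is positive semi-definite by construction.
    """
    L = np.linalg.cholesky(cov_history[0])
    for cov in cov_history[1:]:
        L = lam * L + (1.0 - lam) * np.linalg.cholesky(cov)
    return L @ L.T

# two toy realized covariance matrices
history = [np.array([[1.0, 0.3], [0.3, 1.0]]),
           np.array([[1.2, 0.4], [0.4, 0.9]])]
forecast = cholesky_forecast(history)
```

Because each factor is lower triangular with positive diagonal, any convex combination of them is too, so no post-hoc correction of the forecast is ever needed.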
By:  Nielsen, Bent 
Abstract:  A vector autoregressive model allowing for unit roots as well as an explosive characteristic root is developed. The Granger-Johansen representation shows that this results in processes with two common features: a random walk and an explosively growing process. Cointegrating and co-explosive vectors can be found that eliminate these common factors. The likelihood ratio test for a simple hypothesis on the co-explosive vectors is analyzed. The method is illustrated using data from the extreme Yugoslavian hyperinflation of the 1990s. 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ner:oxford:http://economics.ouls.ox.ac.uk/14854/&r=ecm 
By:  Katja Ignatieva (School of Finance and Economics, University of Technology, Sydney); Eckhard Platen (School of Finance and Economics, University of Technology, Sydney); Renata Rendek (School of Finance and Economics, University of Technology, Sydney) 
Abstract:  The aim of this paper is to model the dependency among log-returns when security account prices are expressed in units of a well-diversified world stock index. The paper uses the equi-weighted index EWI104s, calculated as the average of 104 world industry sector indices. The log-returns of its denominations in different currencies appear to be Student-t distributed with about four degrees of freedom. Motivated by these findings, the dependency in log-returns of currency denominations of the EWI104s is modeled using time-varying copulae, aiming to identify the best-fitting copula family. The Student-t copula generally turns out to be superior to, e.g., the Gaussian copula, whose dependence structure relates to the multivariate normal distribution. It is shown that merely changing the distributional assumption for the log-returns of the marginals from normal to Student-t leads to a significantly better fit. Furthermore, the Student-t copula with Student-t marginals is able to better capture dependent extreme values than the other models considered. Finally, the paper applies copulae to the estimation of the Value-at-Risk and the expected shortfall of a portfolio constructed of savings accounts in different currencies. The proposed copula-based approach allows market risk to be split into general and specific market risk, as defined in regulatory documents. The paper demonstrates that the approach performs clearly better than the RiskMetrics approach. 
Keywords:  diversified world stock index; Student-t distribution; time-varying copula; Value-at-Risk; expected shortfall 
Date:  2010–09–01 
URL:  http://d.repec.org/n?u=RePEc:uts:rpaper:284&r=ecm 
By:  Beare, Brendan K. 
Abstract:  We study the dependence properties of stationary Markov chains generated by Archimedean copulas. Under some simple regularity conditions, we show that regular variation of the Archimedean generator at zero and one implies geometric ergodicity of the associated Markov chain. We verify our assumptions for a range of Archimedean copulas used in applications. 
Keywords:  Archimedean copula, geometric ergodicity, Markov chain, mixing, regular variation, tail dependence 
Date:  2010–09–09 
URL:  http://d.repec.org/n?u=RePEc:cdl:ucsdec:1549539&r=ecm 
By:  Chen, Pu; Hsiao, ChihYing 
Abstract:  Granger causality, a popular concept in time series analysis, is widely applied in empirical research. The interpretation of Granger causality tests in a cause-effect context is, however, often unclear or even controversial, so that the causality label has faded away. Textbooks carefully warn that Granger causality does not imply true causality and prefer to present the Granger causality test as a forecasting technique. Applying the theory of inferred causation, we develop in this paper a method to uncover causal structures behind Granger causality. In this way we re-substantialize the causal attribution in Granger causality by providing a causal explanation for the conditional dependence manifested in Granger causality. 
Keywords:  Granger Causality; Time Series Causal Model; Graphical Model 
JEL:  C1 E3 
Date:  2010–09 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:24859&r=ecm 
By:  Abhimanyu Mitra; Sidney I. Resnick 
Abstract:  Hidden regular variation defines a subfamily of distributions satisfying multivariate regular variation on $\mathbb{E} = [0, \infty]^d \backslash \{(0,0, ..., 0) \} $ and models another regular variation on the subcone $\mathbb{E}^{(2)} = \mathbb{E} \backslash \cup_{i=1}^d \mathbb{L}_i$, where $\mathbb{L}_i$ is the $i$th axis. We extend the concept of hidden regular variation to subcones of $\mathbb{E}^{(2)}$ as well. We suggest a procedure for detecting the presence of hidden regular variation, and if it exists, propose a method of estimating the limit measure exploiting its semiparametric structure. We exhibit examples where hidden regular variation yields better estimates of probabilities of risk sets. 
Date:  2010–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1001.5058&r=ecm 
By:  Maria Grith; Volker Krätschmer 
Abstract:  This chapter deals with the estimation of risk neutral distributions for pricing index options resulting from the hypothesis of the risk neutral valuation principle. After justifying this hypothesis, we shall focus on parametric estimation methods for the risk neutral density functions determining the risk neutral distributions. We shall differentiate between the direct and the indirect way. Following the direct way, parameter vectors are estimated which characterize the distributions from selected statistical families used to model the risk neutral distributions. The idea of the indirect approach is to calibrate characteristic parameter vectors for stochastic models of the asset price processes, and then to extract the risk neutral density function via Fourier methods. For each of the reviewed methods the calculation of option prices under hypothetically true risk neutral distributions is a building block. We shall give explicit formulas for call and put prices with respect to the reviewed parametric statistical families used for direct estimation. Additionally, we shall introduce the Fast Fourier Transform method of call option pricing developed in [6]. It is intended to compare the reviewed estimation methods empirically. 
Keywords:  Risk neutral valuation principle, risk neutral distribution, log-price risk neutral distribution, risk neutral density function, Black-Scholes formula, Fast Fourier Transform method, lognormal distributions, mixtures of lognormal distributions, generalized gamma distributions, model calibration, Merton’s jump diffusion model, Heston’s volatility model 
JEL:  C13 C16 G12 G13 
Date:  2010–09 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2010045&r=ecm 
By:  Browning, Martin; Ejrnæs, Mette; Alvarez, Javier 
Abstract:  All empirical models of earnings processes in the literature assume a good deal of homogeneity. In contrast to this, we model earnings processes allowing for a great deal of heterogeneity between agents. We also introduce an extension to the linear ARMA model that allows the initial convergence to the long run to differ from that implied by the conventional ARMA model. This is particularly important for unit root tests, which are actually tests of a composite of two independent hypotheses. We fit our models to a variety of statistics, including most of those considered by previous investigators. We use a sample drawn from the PSID and focus on white males with a high school degree. Despite this observable homogeneity, we find much greater latent heterogeneity than previous investigators. 
JEL:  J30 C23 
Date:  2010–10 
URL:  http://d.repec.org/n?u=RePEc:ner:oxford:http://economics.ouls.ox.ac.uk/14853/&r=ecm 
By:  Chen, Pu 
Abstract:  Cause-effect relations are central in economic analysis. Uncovering empirical cause-effect relations is one of the main research activities of empirical economics. In this paper we develop a time series causal model to explore causal relations among economic time series. The time series causal model is grounded in the theory of inferred causation, which is a probabilistic and graph-theoretic approach to causality featuring automated learning algorithms. Applying our model, we are able to infer cause-effect relations that are implied by the observed time series data. The empirically inferred causal relations can then be used to test economic theoretical hypotheses, to provide evidence for the formulation of theoretical hypotheses, and to carry out policy analysis. Time series causal models are closely related to the popular vector autoregressive (VAR) models in time series analysis. They can be viewed as restricted structural VAR models identified by the inferred causal relations. 
Keywords:  Inferred Causation; Automated Learning; VAR; Granger Causality; Wage-Price Spiral 
JEL:  E31 C01 
Date:  2010–09 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:24841&r=ecm 
By:  Gerdes, Christer (SOFI, Stockholm University) 
Abstract:  This paper looks at potential implications emerging from including "shares" as a control variable in fixed effect estimations. By shares I refer to the ratio of one sum of units to another, such as the share of immigrants in a city or school. As will be shown in this paper, a logarithmic transformation of shares has some methodological merits compared to the use of shares defined as mere ratios. In certain empirical settings the use of the latter might result in coefficient estimates that are, spuriously, statistically significant more often than they should be. 
Keywords:  consistency, Törnqvist index, symmetry, spurious significance 
JEL:  C23 C29 J10 
Date:  2010–09 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp5171&r=ecm 