NEP: New Economics Papers
on Econometrics
By: | Caiado, Jorge; Crato, Nuno; Peña, Daniel |
Abstract: | In statistical data analysis it is often important to compare, classify, and cluster different time series. For these purposes various methods have been proposed in the literature, but they usually assume time series with the same sample size. In this paper, we propose a spectral domain method for handling time series of unequal length. The method makes the spectral estimates comparable by producing statistics at the same frequencies. The procedure is compared with other methods proposed in the literature by a Monte Carlo simulation study. As an illustrative example, the proposed spectral method is applied to cluster industrial production series of several developed countries. |
Keywords: | Autocorrelation function; Cluster analysis; Interpolated periodogram; Reduced periodogram; Spectral analysis; Time series; Zero-padding. |
JEL: | C32 C0 |
Date: | 2009–04 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:15310&r=ecm |
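For readers who want to experiment with the general idea in the entry above, here is a minimal sketch of the zero-padding route: each series is padded to a common length so that all periodograms are evaluated at the same Fourier frequencies, and the log-periodograms are then clustered. The paper's interpolated and reduced periodograms are specific estimators not reproduced here; data and settings below are purely illustrative.

```python
# Sketch only: zero-padding to make periodograms of unequal-length
# series comparable, then hierarchical clustering on the features.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def padded_log_periodogram(x, n_common):
    """Zero-pad x to n_common so all series share the same Fourier frequencies."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    padded = np.zeros(n_common)
    padded[:len(x)] = x
    pgram = np.abs(np.fft.rfft(padded)) ** 2 / len(x)  # normalise by true length
    return np.log(pgram[1:] + 1e-12)                   # drop frequency zero

# Toy data: three random walks of unequal length.
rng = np.random.default_rng(0)
series = [rng.standard_normal(n).cumsum() for n in (180, 240, 300)]
n_common = max(len(s) for s in series)
feats = np.array([padded_log_periodogram(s, n_common) for s in series])

# Cluster on Euclidean distances between the log-periodograms.
labels = fcluster(linkage(feats, method="ward"), t=2, criterion="maxclust")
print(labels)
```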
By: | Aurea Grané
Abstract: | The statistic introduced in Fortiana and Grané (2003) is modified so that it can be used to test the goodness-of-fit of a censored sample, when the distribution function is fully specified. Exact and asymptotic distributions of three modified versions of this statistic are obtained and exact critical values are given for different sample sizes. Empirical power studies show the good performance of these statistics in detecting symmetrical alternatives. |
Keywords: | Goodness-of-fit, Censored Samples, Maximum Correlation, Exact Distribution, L-statistics |
Date: | 2009–05 |
URL: | http://d.repec.org/n?u=RePEc:cte:wsrepe:ws093010&r=ecm |
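The statistic of Fortiana and Grané (2003) is not reproduced here; the sketch below only shows the generic Monte Carlo workflow behind tables of critical values for a goodness-of-fit statistic computed from a type-II censored sample under a fully specified null. The statistic used is a placeholder L-statistic, not the paper's.

```python
# Sketch: simulating null critical values for a goodness-of-fit statistic
# computed from a type-II censored sample. censored_stat is a placeholder
# (a simple L-statistic), not the Fortiana-Grané statistic.
import numpy as np

def censored_stat(u_sorted):
    """Placeholder L-statistic: weighted sum of the censored order statistics."""
    r = len(u_sorted)
    weights = np.arange(1, r + 1) / r
    return float(np.sum(weights * u_sorted))

def mc_critical_value(n, r, alpha=0.05, reps=20_000, seed=0):
    """Simulate the null distribution with n observations, r smallest retained."""
    rng = np.random.default_rng(seed)
    stats = np.empty(reps)
    for i in range(reps):
        u = np.sort(rng.uniform(size=n))[:r]  # type-II censored uniform sample
        stats[i] = censored_stat(u)
    return np.quantile(stats, 1 - alpha)

print(mc_critical_value(n=50, r=30))
```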
By: | Ringle, Christian M.; Götz, Oliver; Wetzels, Martin; Wilson, Bradley |
Abstract: | The broader goal of this paper is to provide social researchers with some analytical guidelines when investigating structural equation models (SEM) with a predominantly formative specification. This research is the first to investigate the robustness and precision of parameter estimates of a formative SEM specification. Two distinctive scenarios (normal and non-normal data scenarios) are compared with the aid of a Monte Carlo simulation study for various covariance-based structural equation modeling (CBSEM) estimators and various partial least squares path modeling (PLS-PM) weighting schemes. Thus, this research is also one of the first to compare CBSEM and PLS-PM within the same simulation study. We establish that the maximum likelihood (ML) covariance-based discrepancy function provides accurate and robust parameter estimates for the formative SEM model under investigation when the methodological assumptions are met (e.g., adequate sample size, distributional assumptions, etc.). Under these conditions, ML-CBSEM outperforms PLS-PM. We also demonstrate that the accuracy and robustness of CBSEM decreases considerably when methodological requirements are violated, whereas PLS-PM results remain comparatively robust, e.g., irrespective of the data distribution. These findings are important for researchers and practitioners when choosing between CBSEM and PLS-PM methodologies to estimate formative SEM in their particular research situation. |
Keywords: | PLS; path modeling; covariance structure analysis; structural equation modeling; formative measurement; simulation study |
JEL: | C10 C51 C30 C15 |
Date: | 2009 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:15390&r=ecm |
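Estimating CBSEM or PLS-PM requires dedicated software, so the sketch below shows only the skeleton of a simulation study of the kind described above: an estimator (OLS as a stand-in, not either SEM method) is evaluated for bias and RMSE under a normal and a heavy-tailed error scenario.

```python
# Sketch of a Monte Carlo study skeleton: same estimator, two data scenarios,
# bias and RMSE reported per scenario. OLS stands in for the SEM estimators.
import numpy as np

def simulate_once(rng, n, normal=True, beta=0.5):
    x = rng.standard_normal(n)
    eps = rng.standard_normal(n) if normal else rng.standard_t(df=3, size=n)
    y = beta * x + eps
    return np.sum(x * y) / np.sum(x * x)   # OLS slope as a stand-in estimator

def run_scenario(normal, n=200, reps=5_000, beta=0.5, seed=1):
    rng = np.random.default_rng(seed)
    est = np.array([simulate_once(rng, n, normal, beta) for _ in range(reps)])
    return est.mean() - beta, np.sqrt(np.mean((est - beta) ** 2))

for scenario, normal in (("normal errors", True), ("t(3) errors", False)):
    bias, rmse = run_scenario(normal)
    print(f"{scenario}: bias={bias:+.4f}, RMSE={rmse:.4f}")
```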
By: | Michael Clemens; Samuel Bazzi |
Abstract: | Despite intense concern that many instrumental variables used in growth regressions may be weak, invalid, or both, top journals continue to publish studies of economic growth based on problematic instruments. Doing so risks pushing the entire literature closer to irrelevance. We illustrate hidden problems with identification in recent prominently published and widely cited growth studies using their original data. We urge researchers to take three steps to overcome the shortcomings: grounding research in somewhat more generalized theoretical models, deploying the latest methods to test sensitivity to violations of the exclusion restriction, and opening the “black box” of the Generalized Method of Moments (GMM) with supportive evidence of instrument strength. |
Keywords: | IV, instrumental variables, 2SLS, two-stage least squares, Generalized Method of Moments, GMM, Blundell-Bond, Arellano-Bond, exclusion restriction, economic growth, regression, weak instruments, valid instruments, overidentification, underidentification, identification problem, growth determinants, foreign aid, institutions, geography, legal origins, too many instruments. |
JEL: | F35 C12 O4 |
Date: | 2009–05 |
URL: | http://d.repec.org/n?u=RePEc:cgd:wpaper:171&r=ecm |
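As a minimal illustration of one diagnostic in the spirit of the recommendations above, the sketch below simulates a single-instrument setting, reports the first-stage F statistic as a gauge of instrument strength, and computes the 2SLS estimate. All parameter values are toy choices, not the studies' data.

```python
# Sketch: first-stage F statistic for instrument strength, then 2SLS
# (simulated data; single endogenous regressor, single instrument).
import numpy as np

rng = np.random.default_rng(42)
n, pi = 500, 0.15                       # smallish pi -> weak-ish instrument
z = rng.standard_normal(n)              # instrument
u = rng.standard_normal(n)
x = pi * z + u                          # endogenous regressor (correlated with error)
y = 1.0 * x + u + rng.standard_normal(n)

# First stage: regress x on z, report the F statistic for H0: pi = 0.
Z = np.column_stack([np.ones(n), z])
pi_hat, *_ = np.linalg.lstsq(Z, x, rcond=None)
resid = x - Z @ pi_hat
se = np.sqrt(np.sum(resid**2) / (n - 2) * np.linalg.inv(Z.T @ Z)[1, 1])
F = (pi_hat[1] / se) ** 2
print(f"first-stage F = {F:.1f}  (rule of thumb: F < 10 signals a weak instrument)")

# 2SLS: replace x with its first-stage fitted values.
x_hat = Z @ pi_hat
X2 = np.column_stack([np.ones(n), x_hat])
beta_2sls, *_ = np.linalg.lstsq(X2, y, rcond=None)
print(f"2SLS slope = {beta_2sls[1]:.3f}  (true value 1.0)")
```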
By: | Proietti, Tommaso |
Abstract: | The Beveridge-Nelson decomposition defines the trend component in terms of the eventual forecast function, as the value the series would take if it were on its long-run path. The paper introduces the multistep Beveridge-Nelson decomposition, which arises when the forecast function is obtained by the direct autoregressive approach, which optimizes the predictive ability of the AR model at forecast horizons greater than one. We compare our proposal with the standard Beveridge-Nelson decomposition, for which the forecast function is obtained by iterating the one-step-ahead predictions via the chain rule. We illustrate that the multistep Beveridge-Nelson trend is more efficient than the standard one in the presence of model misspecification, and we subsequently assess the predictive validity of the extracted transitory component with respect to future growth. |
Keywords: | Trend and Cycle; Forecasting; Filtering. |
JEL: | E32 E31 C52 C22 |
Date: | 2009–04–02 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:15345&r=ecm |
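For orientation, the sketch below computes the standard (iterated) Beveridge-Nelson trend from an AR(1) fitted to first differences; the paper's multistep variant would instead fit direct h-step autoregressions, which is not implemented here.

```python
# Sketch: iterated Beveridge-Nelson decomposition with an AR(1) on growth
# rates. BN trend = y_t plus all expected future growth in excess of the
# long-run mean, which for an AR(1) sums to phi/(1-phi) * (dy_t - mu).
import numpy as np

rng = np.random.default_rng(7)
n, mu, phi = 400, 0.2, 0.6
dy = np.empty(n); dy[0] = mu
for t in range(1, n):                            # simulate AR(1) growth rates
    dy[t] = mu + phi * (dy[t - 1] - mu) + rng.standard_normal()
y = np.cumsum(dy)

# Estimate the AR(1) by OLS: dy_t = c + phi * dy_{t-1} + e_t.
X = np.column_stack([np.ones(n - 1), dy[:-1]])
c_hat, phi_hat = np.linalg.lstsq(X, dy[1:], rcond=None)[0]
mu_hat = c_hat / (1 - phi_hat)

bn_trend = y + (phi_hat / (1 - phi_hat)) * (dy - mu_hat)
bn_cycle = y - bn_trend
print(f"phi_hat={phi_hat:.3f}, cycle std={bn_cycle.std():.3f}")
```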
By: | Ciuiu, Daniel |
Abstract: | In this paper we classify patterns using an algorithm analogous to the k-means algorithm together with principal components regression (PCR). We also present a financial application in which we apply PCR to points representing the interest rates for accounts with different terms. |
Keywords: | Principal components regression; pattern classification; k-means |
JEL: | C51 E51 C45 |
Date: | 2008–01 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:15360&r=ecm |
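A minimal sketch of the PCR building block used above: project the regressors onto their leading principal components, then regress the response on those component scores. The paper's k-means-style classification loop around this step is not reproduced, and the data below are synthetic.

```python
# Sketch: principal components regression (PCR) via SVD of the centred design.
import numpy as np

def pcr_fit(X, y, k):
    """Return a prediction function using the first k principal components."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt = directions
    V_k = Vt[:k].T
    scores = Xc @ V_k                                  # component scores
    coef = np.linalg.lstsq(scores, y - y.mean(), rcond=None)[0]
    return lambda Xnew: (Xnew - X.mean(axis=0)) @ V_k @ coef + y.mean()

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 5))
y = X @ np.array([1.0, 0.5, 0.0, 0.0, 0.0]) + 0.1 * rng.standard_normal(100)
predict = pcr_fit(X, y, k=2)
print(predict(X[:3]), y[:3])
```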
By: | Almut E. D. Veraart (School of Economics and Management, Aarhus University and CREATES); Luitgard A. M. Veraart (Institut für Stochastik, Universität Karlsruhe) |
Abstract: | This paper proposes the new concept of stochastic leverage in stochastic volatility models. Stochastic leverage refers to a stochastic process which replaces the classical constant correlation parameter between the asset return and the stochastic volatility process. We provide a systematic treatment of stochastic leverage and propose to model the stochastic leverage effect explicitly, e.g. by means of a linear transformation of a Jacobi process. Such models are both analytically tractable and allow for a direct economic interpretation. In particular, we propose two new stochastic volatility models which allow for a stochastic leverage effect: the generalised Heston model and the generalised Barndorff-Nielsen & Shephard model. We investigate the impact of a stochastic leverage effect in the risk neutral world by focusing on implied volatilities generated by option prices derived from our new models. Furthermore, we give a detailed account of the statistical properties of the new models. |
Keywords: | Stochastic volatility · volatility of volatility · stochastic correlation · leverage effect · Jacobi process · Ornstein–Uhlenbeck process · square root diffusion · Lévy process · Heston model · Barndorff-Nielsen & Shephard model |
JEL: | C1 C5 G0 G1 |
Date: | 2009–05–19 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2009-20&r=ecm |
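A minimal Euler-scheme sketch of the stochastic-leverage idea described above: a Heston-type variance process combined with a return-volatility correlation that itself follows a Jacobi process confined to (-1, 1). The parameter values and the naive discretisation are illustrative only, not the paper's specification or calibration.

```python
# Sketch: Euler simulation of a Heston-type model with a Jacobi-process
# correlation rho_t in place of the usual constant leverage parameter.
import numpy as np

rng = np.random.default_rng(11)
T, n = 1.0, 2_000
dt = T / n
kappa_v, theta_v, xi = 2.0, 0.04, 0.3       # variance (CIR) parameters
kappa_r, theta_r, sigma_r = 5.0, -0.5, 0.8  # Jacobi correlation parameters

s, v, rho = np.log(100.0), 0.04, -0.5       # log-price, variance, correlation
for _ in range(n):
    dw_v, dw_perp, dw_rho = rng.standard_normal(3) * np.sqrt(dt)
    dw_s = rho * dw_v + np.sqrt(1 - rho**2) * dw_perp   # correlate via rho_t
    s += -0.5 * v * dt + np.sqrt(max(v, 0.0)) * dw_s
    v += kappa_v * (theta_v - v) * dt + xi * np.sqrt(max(v, 0.0)) * dw_v
    rho += kappa_r * (theta_r - rho) * dt + sigma_r * np.sqrt(max(1 - rho**2, 0.0)) * dw_rho
    rho = np.clip(rho, -0.999, 0.999)       # keep the Euler step inside (-1, 1)
print(f"terminal log-price {s:.3f}, variance {v:.4f}, correlation {rho:.3f}")
```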
By: | Giovanni Caggiano (Department of Economics, University of Padua, Via del Santo 33, 35123 Padova, Italy.); George Kapetanios (Department of Economics, Queen Mary University of London, Mile End Road, London E1 4NS, United Kingdom.); Vincent Labhard (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.) |
Abstract: | Factor-based forecasting has been at the forefront of developments in the macroeconometric forecasting literature in the recent past. Despite the flurry of activity in the area, a number of specification issues such as the choice of the number of factors in the forecasting regression, the benefits of combining factor-based forecasts and the choice of the dataset from which to extract the factors remain partly unaddressed. This paper provides a comprehensive empirical investigation of these issues using data for the euro area, the six largest euro area countries, and the UK. |
Keywords: | Factors, Large Datasets, Forecast Combinations. |
JEL: | C100 C150 C530 |
Date: | 2009–05 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:200901051&r=ecm |
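A compact sketch of the factor-forecasting recipe on simulated data: extract principal-component factors from a standardised panel, then use them in a one-step-ahead forecasting regression. The panel size, factor count, and lag structure below are placeholders, not the paper's choices.

```python
# Sketch: factor-based forecasting — PCA factors from a large panel,
# then a forecasting regression of the target on lagged factors.
import numpy as np

rng = np.random.default_rng(5)
T, N, r = 200, 60, 3
F_true = rng.standard_normal((T, r)).cumsum(axis=0) * 0.1
panel = F_true @ rng.standard_normal((r, N)) + rng.standard_normal((T, N))
target = F_true[:, 0] + 0.2 * rng.standard_normal(T)

# Estimate factors as principal components of the standardised panel.
Z = (panel - panel.mean(0)) / panel.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
factors = U[:, :r] * s[:r]                     # T x r factor estimates

# One-step-ahead forecasting regression: target_{t+1} on factors_t.
X = np.column_stack([np.ones(T - 1), factors[:-1]])
beta = np.linalg.lstsq(X, target[1:], rcond=None)[0]
forecast = np.array([1.0, *factors[-1]]) @ beta
print(f"one-step-ahead forecast: {forecast:.3f}")
```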
By: | Olivier J. Blanchard; Jean-Paul L'Huillier; Guido Lorenzoni |
Abstract: | We explore empirically models of aggregate fluctuations with two basic ingredients: agents form anticipations about the future based on noisy sources of information; these anticipations affect spending and output in the short run. Our objective is to separate fluctuations due to actual changes in fundamentals (news) from those due to temporary errors in the private sector's estimates of these fundamentals (noise). Using a simple model where the consumption random walk hypothesis holds exactly, we address some basic methodological issues and take a first pass at the data. First, we show that if the econometrician has no informational advantage over the agents in the model, structural VARs cannot be used to identify news and noise shocks. Next, we develop a structural Maximum Likelihood approach which allows us to identify the model's parameters and to evaluate the role of news and noise shocks. Applied to postwar U.S. data, this approach suggests that noise shocks play an important role in short-run fluctuations. |
JEL: | C32 D83 E32 |
Date: | 2009–05 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:15015&r=ecm |
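A stripped-down illustration of the signal-extraction logic in the entry above: a random-walk fundamental (news shocks) observed through a noisy signal (noise shocks), with the two shock variances estimated by maximum likelihood via a scalar Kalman filter. This toy local-level model omits the paper's consumption block and structural identification arguments entirely.

```python
# Sketch: ML estimation of news and noise variances in a local-level model,
# with the likelihood computed by a scalar Kalman filter.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, s):
    q, r = np.exp(params)                # variances of news and noise shocks
    x, P, ll = s[0], 1.0, 0.0
    for obs in s[1:]:
        P_pred = P + q                   # predict: random-walk fundamental
        S = P_pred + r                   # innovation variance
        innov = obs - x
        ll += -0.5 * (np.log(2 * np.pi * S) + innov**2 / S)
        K = P_pred / S                   # Kalman gain
        x += K * innov
        P = (1 - K) * P_pred
    return -ll

rng = np.random.default_rng(8)
T, q_true, r_true = 500, 0.5, 1.0
fundamental = np.cumsum(rng.normal(0, np.sqrt(q_true), T))   # news
signal = fundamental + rng.normal(0, np.sqrt(r_true), T)     # plus noise
res = minimize(neg_loglik, x0=np.log([1.0, 1.0]), args=(signal,))
print("estimated (news, noise) variances:", np.exp(res.x).round(3))
```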
By: | Filippo Domma (Dipartimento di Economia e Statistica, Università della Calabria) |
Abstract: | In this paper we study the bivariate Rodriguez-Burr III distribution from a reliability point of view. In particular, we derive, for the conditional distributions, various functions used in reliability theory, viz. the hazard rate, the reversed hazard rate, the mean residual life and the mean reversed residual life, and we discuss their monotonicity using some notions of dependence. Finally, some measures of dependence based on the distribution function and on the mean reversed residual life are investigated. |
Keywords: | Conditional Distribution, Reversed Hazard Rate, TP2, Dependence Measures |
Date: | 2009–05 |
URL: | http://d.repec.org/n?u=RePEc:clb:wpaper:200907&r=ecm |
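For reference, these are the standard univariate definitions of the four reliability functions named in the abstract, for a random variable X with density f and distribution function F; the paper develops their conditional, bivariate analogues for the Rodriguez-Burr III distribution.

```latex
h(x) = \frac{f(x)}{1 - F(x)}, \qquad
\bar h(x) = \frac{f(x)}{F(x)}, \qquad
m(x) = \mathbb{E}\left[X - x \mid X > x\right], \qquad
\bar m(x) = \mathbb{E}\left[x - X \mid X \le x\right].
```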
By: | Sinha, Pankaj; Jayaraman, Prabha |
Abstract: | This paper aims to study the sensitivity of the Bayes estimate of the location parameter of an Inverse Gaussian (IG) distribution to misspecification in the prior distribution. It also studies the effect of misspecification of the prior distribution on two-sided predictive limits for a future observation from an IG population. Two classes of prior distributions, ML-II ε-contaminated priors and Edgeworth Series Distributions (ESD), are employed for the location parameter of an IG distribution, to investigate the effect of misspecification in the priors. The numerical illustrations suggest that a moderate amount of misspecification in prior distributions belonging to the ML-II ε-contaminated and ESD classes does not affect the Bayesian results. |
Keywords: | Bayesian results; Inverse Gaussian distribution; ML-II ε-contaminated prior; Edgeworth Series Distributions |
JEL: | C44 C02 C46 A10 C01 C11 |
Date: | 2009–05–17 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:15396&r=ecm |
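For reference, the ML-II ε-contaminated class mentioned in the abstract has the standard form below, where π₀ is the elicited base prior, Q a set of allowed contaminations, and the ML-II prior selects the contamination q that maximizes the marginal likelihood m(x | π) of the data.

```latex
\Gamma = \left\{ \pi : \pi = (1 - \varepsilon)\,\pi_0 + \varepsilon\, q, \; q \in \mathcal{Q} \right\},
\qquad
m(x \mid \pi) = \int f(x \mid \theta)\, \pi(\theta)\, \mathrm{d}\theta .
```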
By: | Einmahl, J.H.J.; Haan, L.F.M. de (Tilburg University, Center for Economic Research) |
Abstract: | AMS 2000 subject classifications. Primary 62G32, 62G05; secondary 60G70, 60F05. |
Date: | 2009 |
URL: | http://d.repec.org/n?u=RePEc:dgr:kubcen:200929&r=ecm |