nep-ecm New Economics Papers
on Econometrics
Issue of 2005‒04‒16
forty-four papers chosen by
Sune Karlsson
Orebro University

  2. A HALF-GRAPH DEPTH FOR FUNCTIONAL DATA By Sara Lopez-Pintado; Juan Romo
  3. Nonparametric Estimation of Conditional Expected Shortfall By Olivier SCAILLET
  4. A Kolmogorov-Smirnov Type Test for Positive Quadrant Dependence By Olivier Scaillet
  5. Indirect Robust Estimation of the Short-term interest Rate Process By Veronika Czellar; G. Andrew Karolyi; Elvezio Ronchetti
  6. Bivariate Time Series Modelling of Financial Count Data By Quoreshi, Shahiduzzaman
  7. Interest Rate Smoothing versus Serially Correlated Errors in Taylor Rules: Testing the Tests By Welz, Peter; Österholm, Pär
  8. Small Sample Bias Properties of the System GMM Estimator in Dynamic Panel Data Models By Kazuhiko Hayakawa
  9. Bayesian Inference via Classes of Normalized Random Measures. By Lancelot F. James; Antonio Lijoi; Igor Pruenster
  10. The latent factor VAR model: Testing for a common component in the intraday trading process By Nikolaus Hautsch
  11. Exploring the Use of a Nonparametrically Generated Instrumental Variable in the Estimation of a Linear Parametric Equation By Frank T. Denton
  12. A Pedant's Approach to Exponential Smoothing By Ralph D Snyder
  13. Exponential Smoothing Model Selection for Forecasting By Baki Billah; Maxwell L King; Ralph D Snyder; Anne B Koehler
  14. Time Series Forecasting: The Case for the Single Source of Error State Space By J Keith Ord; Ralph D Snyder; Anne B Koehler; Rob J Hyndman; Mark Leeds
  15. Monte Carlo Tests with Nuisance Parameters: A General Approach to Finite-Sample Inference and Nonstandard Asymptotics By DUFOUR, Jean-Marie
  16. Exact Multivariate Tests of Asset Pricing Models with Stable Asymmetric Distributions By BEAULIEU, Marie-Claude; DUFOUR, Jean-Marie; KHALAF, Lynda
  17. Distribution-Free Bounds for Serial Correlation Coefficients in Heteroskedastic Symmetric Time Series By DUFOUR, Jean-Marie; FARHAT, Abdeljelil; HALLIN, Marc
  18. Asymptotic Distribution of a Simple Linear Estimator for VARMA Models in Echelon Form By DUFOUR, Jean-Marie; JOUINI, Tarek
  19. Are Exchange Rates Really Random Walks? Some Evidence Robust to Parameter Instability By Barbara Rossi
  20. Measurement Error in Access to Markets By Javier Escobal; Sonia Laszlo
  21. Grocer 1.0, an Econometric Toolbox for Scilab: an Econometrician Point of View By Dubois
  22. Overlaying Time Scales in Financial Volatility Data By Eric Hillebrand
  24. The Long-Run Forecasting of Energy Prices Using the Model of Shifting Trend By Stanislav Radchenko
  25. Modeling and forecasting electricity loads: A comparison By Rafal Weron; Adam Misiorek
  26. On detecting and modeling periodic correlation in financial data By Ewa Broszkiewicz-Suwaj; Andrzej Makagon; Rafal Weron; Agnieszka Wylomanska
  27. Assessing Forecast Performance in a VEC Model: An Empirical Examination By Bragoudakis Zacharias
  28. Nonparametric Slope Estimators for Fixed-Effect Panel Data By Kusum Mundra
  29. Multivariate STAR Unemployment Rate Forecasts By Costas Milas; Phil Rothman
  30. Extraction of Common Signal from Series with Different Frequency By Edoardo Otranto
  31. A Genetic Algorithm for the Structural Estimation of Games with Multiple Equilibria By Victor Aguirregabiria; Pedro Mira
  32. Nonlinearity, Nonstationarity and Spurious Forecasts By Vadim Marmer
  33. Dynamic Conditional Correlation with Elliptical Distributions By Matteo M. Pelagatti; Stefania Rondena
  34. Time Series Modeling with Duration Dependent Markov-Switching Vector Autoregressions: MCMC Inference, Software and Applications By Matteo M. Pelagatti
  36. Powerful and Serial Correlation Robust Tests of the Economic Convergence Hypothesis By Ozgen Sayginsoy
  37. Causation Delays and Causal Neutralization up to Three Steps Ahead: The Money-Output Relationship Revisited By Jonathan B. Hill
  38. An intuitive guide to wavelets for economists By Patrick Crowley
  39. What causes the forecasting failure of Markov-Switching models? A Monte Carlo study By Marie Bessec; Othman Bouabdallah
  41. Testing Cointegration Rank in Large Systems By Chen Pu; Hsiao Chihying
  42. The Available Information for Invariant Tests of a Unit Root By Patrick Marsh
  43. Goodness of Fit Tests for Moment Condition Models By Joaquim J.S. Ramalho; Richard J. Smith
  44. Empirical Likelihood Estimation under Alternative Stratified Sampling Schemes when Aggregate Data are Available By Joaquim J.S. Ramalho; Esmeralda A. Ramalho

  1. By: Jose Olmo
    Abstract: This paper introduces an estimator for the extremal index as the ratio of the number of elements of two point processes defined by threshold sequences u_n, v_n and a partition of the sequence into blocks of the same size. The first point process is defined by the sequence of block maxima that exceed u_n. The paper introduces a thinning of this point process, defined by a threshold v_n with v_n > u_n, with the appealing property that under some mild conditions the ratio of the number of elements of the two point processes is a consistent estimator of the extremal index. The method supports a hypothesis test for the extremal index, and hence for testing the existence of clustering in the extreme values. Other advantages are that it allows some freedom in the choice of u_n, and it is not very sensitive to the choice of the partition. Finally, the stylized facts found in financial returns (clustering, skewness, heavy tails) are tested via the extremal index, in this case for the DAX returns.
    Date: 2005–04
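The counting logic behind such a ratio estimator can be sketched in a few lines. The block size, thresholds, and iid Gaussian data below are illustrative assumptions for exposition only; the paper works with general threshold sequences u_n, v_n and dependent data:

```python
import numpy as np

def extremal_index_ratio(x, u, v, block_size):
    """Count block maxima exceeding the lower threshold u, then the
    thinned count exceeding the higher threshold v; return their ratio."""
    assert v > u, "the thinning threshold v must exceed u"
    n_blocks = len(x) // block_size
    maxima = np.asarray(x[:n_blocks * block_size]).reshape(n_blocks, block_size).max(axis=1)
    n_u = np.sum(maxima > u)                # elements of the first point process
    n_v = np.sum(maxima > v)                # elements of the thinned process
    return n_v / n_u if n_u > 0 else float("nan")

rng = np.random.default_rng(0)
returns = rng.standard_normal(50_000)       # iid Gaussian stand-in for a return series
theta_hat = extremal_index_ratio(returns, u=2.0, v=2.5, block_size=50)
```

Since the thinned process counts a subset of the first, the ratio always lies in [0, 1], matching the range of the extremal index.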
  2. By: Sara Lopez-Pintado; Juan Romo
    Abstract: A recent and highly attractive area of research in statistics is the analysis of functional data. In this paper a new definition of depth for functional observations is introduced based on the notion of “half-graph” of a curve. It has computational advantages with respect to other concepts of depth previously proposed. The half-graph depth provides a natural criterion to measure the centrality of a function within a sample of curves. Based on this depth a sample of curves can be ordered from the center outward and L-statistics are defined. The properties of the half-graph depth, such as the consistency and uniform convergence, are established. A simulation study shows the robustness of this new definition of depth when the curves are contaminated. Finally real data examples are analyzed.
    Date: 2005–04
  3. By: Olivier SCAILLET (HEC-University of Geneva and FAME)
    Abstract: We consider a nonparametric method to estimate conditional expected shortfalls, i.e. conditional expected losses knowing that losses are larger than a given loss quantile. We derive the asymptotic properties of kernel estimators of conditional expected shortfalls in the context of a stationary process satisfying strong mixing conditions. An empirical illustration is given for several stock index returns, namely CAC40, DAX30, S&P500, DJI, and Nikkei225.
    Keywords: Nonparametric; Kernel; Time series; Conditional VaR; Conditional expected shortfall; Risk management; Loss severity distribution
    JEL: C14 D81 G10 G21 G22 G28
    Date: 2004–05
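As a rough illustration of the kernel weighting idea (not the paper's estimator, which is developed for strongly mixing time series with asymptotic theory), one can form a Nadaraya-Watson-style conditional expected shortfall; the bandwidth, VaR level, and simulated data below are arbitrary choices:

```python
import numpy as np

def kernel_expected_shortfall(x, y, x0, var_level, h):
    """Weighted average of losses -y beyond the VaR threshold, with
    Gaussian kernel weights centred on the conditioning value x0."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)   # kernel weights in x
    tail = y < -var_level                    # losses exceeding the quantile
    denom = np.sum(w * tail)
    return np.sum(w * tail * (-y)) / denom if denom > 0 else float("nan")

rng = np.random.default_rng(1)
x = rng.standard_normal(10_000)              # conditioning variable (e.g. lagged return)
y = rng.standard_normal(10_000)              # returns
es_hat = kernel_expected_shortfall(x, y, x0=0.0, var_level=1.0, h=0.5)
```

By construction the estimate exceeds the VaR level whenever any tail observation receives weight, since every contributing loss is larger than the threshold.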
  4. By: Olivier Scaillet (HEC, University of Geneva and FAME)
    Abstract: We consider a consistent test, similar to a Kolmogorov-Smirnov test, of the complete set of restrictions that relate to the copula representation of positive quadrant dependence. For such a test we propose and justify inference relying on a simulation-based multiplier method and a bootstrap method. We also explore the finite sample behaviour of both methods with Monte Carlo experiments. A first empirical illustration is given for US insurance claim data. A second one examines the presence of positive quadrant dependence in life expectancies at birth of males and females among countries.
    Keywords: Nonparametric; Positive Quadrant Dependence; Copula; Risk Management; Loss Severity Distribution; Bootstrap; Multiplier Method; Empirical Process
    JEL: C12 D81 G10 G21 G22
  5. By: Veronika Czellar (Dept. of Econometrics, University of Geneva); G. Andrew Karolyi (Fisher College of Business, Ohio State University); Elvezio Ronchetti (Dept. Econometrics, University of Geneva)
    Abstract: We introduce Indirect Robust Generalized Method of Moments (IRGMM), a new simulation-based estimation methodology, to model short-term interest rate processes. The primary advantage of IRGMM relative to classical estimators of the continuous-time short-rate diffusion processes is that it corrects both errors due to discretization and the errors due to model misspecification. We apply this new approach to various monthly and weekly Eurocurrency interest rate series.
    Keywords: GMM and RGMM estimators; CKLS one factor model; indirect inference
    JEL: G10 G12 C10 C22 C15 C53
    Date: 2005–03
  6. By: Quoreshi, Shahiduzzaman (Department of Economics, Umeå University)
    Abstract: A bivariate integer-valued moving average (BINMA) model is proposed. The BINMA model allows for both positive and negative correlation between the counts. This model can be seen as an inverse of the conditional duration model in the sense that short durations in a time interval correspond to a large count and vice versa. The conditional mean, variance and covariance of the BINMA model are given. Model extensions to include explanatory variables are suggested. Using the BINMA model for AstraZeneca and Ericsson B, it is found that there is positive correlation between the stock transaction series. Empirically, we find support for the use of long-lag bivariate moving average models for the two series.
    Keywords: Count data; Intra-day; High frequency; Time series; Estimation; Long memory; Finance
    JEL: C13 C22 C25 C51 G12 G14
    Date: 2005–04–14
  7. By: Welz, Peter (Department of Economics); Österholm, Pär (Department of Economics)
    Abstract: This paper contributes to the recent debate about the estimated high partial adjustment coefficient in dynamic Taylor rules, commonly interpreted as deliberate interest rate smoothing on the part of the monetary authority. We argue that a high coefficient on the lagged interest rate term may be a consequence of an incorrectly specified central bank reaction function. Focusing on omitted variables, our Monte Carlo study first generates the well-known fact that all coefficients in the misspecified equation are biased in such cases. In particular, if relevant variables are left out from the estimated equation, a high partial adjustment coefficient is obtained even when it is in fact zero in the data generating process. Misspecification also leads to considerable size distortions in two tests that were recently proposed by English, Nelson, and Sack (2003) in order to distinguish between interest rate smoothing and serially correlated disturbances. Our results question the common interpretation of very slow partial adjustment as interest rate smoothing in estimated dynamic Taylor rules.
    Keywords: Monetary policy; Taylor rule; Interest rate smoothing; Serially correlated error term; Omitted variables
    JEL: C12 C15 E52
    Date: 2005–03–31
  8. By: Kazuhiko Hayakawa
    Abstract: This paper examines analytically and experimentally why the system GMM estimator in dynamic panel data models is less biased than the first differencing or the level estimators even though the former uses more instruments. We find that the bias of the system GMM estimator is a weighted sum of the biases in opposite directions of the first differencing and the level estimator. We also find that an important condition for the system GMM estimator to have small bias is that the variances of the individual effects and the disturbances are almost of the same magnitude. If the variance of individual effects is much larger than that of disturbances, then all GMM estimators are heavily biased. To reduce such biases, we propose bias-corrected GMM estimators. On the other hand, if the variance of individual effects is smaller than that of disturbances, the system estimator has a more severe downward bias than the level estimator.
    Date: 2005–04
  9. By: Lancelot F. James; Antonio Lijoi; Igor Pruenster
    Abstract: One of the main research areas in Bayesian Nonparametrics is the proposal and study of priors which generalize the Dirichlet process. Here we exploit theoretical properties of Poisson random measures in order to provide a comprehensive Bayesian analysis of random probabilities which are obtained by an appropriate normalization. Specifically we achieve explicit and tractable forms of the posterior and the marginal distributions, including an explicit and easily used description of generalizations of the important Blackwell-MacQueen Pólya urn distribution. Such simplifications are achieved by the use of a latent variable which admits interesting interpretations that allow one to gain a better understanding of the behaviour of these random probability measures. It is noteworthy that these models are generalizations of models considered by Kingman (1975) in a non-Bayesian context. Such models are known to play a significant role in a variety of applications including genetics, physics, and work involving random mappings and assemblies. Hence our analysis is of utility in those contexts as well. We also show how our results may be applied to Bayesian mixture models and describe computational schemes which are generalizations of known efficient methods for the case of the Dirichlet process. We illustrate new examples of processes which can play the role of priors for Bayesian nonparametric inference and finally point out some interesting connections with the theory of generalized gamma convolutions initiated by Thorin and further developed by Bondesson.
    Keywords: Bayesian Nonparametrics; Chinese restaurant process; Generalized gamma convolutions; Gibbs partitions; Poisson random measure
    Date: 2005–04
  10. By: Nikolaus Hautsch (Institute of Economics, University of Copenhagen)
    Abstract: In this paper, we propose a framework for the modelling of multivariate dynamic processes which are driven by an unobservable common autoregressive component. Economically motivated by the mixture-of-distribution hypothesis, we model the multivariate intraday trading process of return volatility, volume and trading intensity by a VAR model that is augmented by a joint latent factor serving as a proxy for the unobserved information flow. The model is estimated by simulated maximum likelihood using efficient importance sampling techniques. Analyzing intraday data from the NYSE, we find strong empirical evidence for the existence of an underlying persistent component as an important driving force of the trading process. It is shown that the inclusion of the latent factor clearly improves the goodness-of-fit of the model as well as its dynamical and distributional properties.
    Keywords: observation vs. parameter driven dynamics; mixture-of-distribution hypothesis; VAR model; efficient importance sampling
    JEL: C15 C32 C52
    Date: 2005–03
  11. By: Frank T. Denton
    Abstract: The use of a nonparametrically generated instrumental variable in estimating a single-equation linear parametric model is explored, using kernel and other smoothing functions. The method, termed IVOS (Instrumental Variables Obtained by Smoothing), is applied in the estimation of measurement error and endogenous regressor models. Asymptotic and small-sample properties are investigated by simulation, using artificial data sets. IVOS is easy to apply and the simulation results exhibit good statistical properties. It can be used in situations in which standard IV cannot because suitable instruments are not available.
    Keywords: single equation models; nonparametric; instrumental variables
    JEL: C13 C14 C21
    Date: 2005–01
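The two-step idea — smooth the endogenous regressor nonparametrically, then use the fitted values as an instrument — can be sketched as follows. The data-generating process, bandwidth, and function names are hypothetical illustrations, not taken from the paper:

```python
import numpy as np

def nw_smooth(z, x, h):
    """Nadaraya-Watson kernel smooth of x on z, evaluated at every sample
    point; the fitted values act as the generated instrument."""
    d = (z[:, None] - z[None, :]) / h
    w = np.exp(-0.5 * d ** 2)               # Gaussian kernel weights
    return (w @ x) / w.sum(axis=1)

rng = np.random.default_rng(2)
n = 2000
z = rng.standard_normal(n)                  # exogenous variable
u = rng.standard_normal(n)                  # common shock inducing endogeneity
x = z + u                                   # endogenous regressor
y = 2.0 * x + u + rng.standard_normal(n)    # true slope is 2.0

x_hat = nw_smooth(z, x, h=0.3)              # smoothed (generated) instrument
xc, yc, wc = x - x.mean(), y - y.mean(), x_hat - x_hat.mean()
beta_iv = np.dot(wc, yc) / np.dot(wc, xc)   # IV slope using the generated instrument
beta_ols = np.dot(xc, yc) / np.dot(xc, xc)  # OLS slope, biased upward by construction
```

In this simulated design OLS is inconsistent because the regressor and the error share the shock u, while the smoothed instrument depends (approximately) only on z.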
  12. By: Ralph D Snyder
    Abstract: An approach to exponential smoothing that relies on a linear single source of error state space model is outlined. A maximum likelihood method for the estimation of associated smoothing parameters is developed. Commonly used restrictions on the smoothing parameters are rationalised. Issues surrounding model identification and selection are also considered. It is argued that the proposed revised version of exponential smoothing provides a better framework for forecasting than either the Box-Jenkins or the traditional multi-disturbance state space approaches.
    Keywords: Time Series Analysis, Prediction, Exponential Smoothing, ARIMA Models, Kalman Filter, State Space Models
    JEL: C22
    Date: 2005–03
  13. By: Baki Billah; Maxwell L King; Ralph D Snyder; Anne B Koehler
    Abstract: Applications of exponential smoothing to forecast time series usually rely on three basic methods: simple exponential smoothing, trend-corrected exponential smoothing and a seasonal variation thereof. A common approach to selecting the method appropriate to a particular time series is based on prediction validation on a withheld part of the sample, using criteria such as the mean absolute percentage error. A second approach is to rely on the most general of the three methods: for annual series this is trend-corrected exponential smoothing; for sub-annual series it is the seasonal adaptation of trend-corrected exponential smoothing. The rationale for this approach is that a general method automatically collapses to its nested counterparts when the pertinent conditions hold in the data. A third approach may be based on an information criterion when maximum likelihood methods are used in conjunction with exponential smoothing to estimate the smoothing parameters. In this paper, these approaches to selecting the appropriate forecasting method are compared in a simulation study. They are also compared on real time series from the M3 forecasting competition. The results indicate that the information criterion approach provides the best basis for an automated approach to method selection, provided that it is based on Akaike's information criterion.
    Keywords: Model Selection; Exponential Smoothing; Information Criteria; Prediction; Forecast Validation
    JEL: C22
    Date: 2005–03
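A minimal version of the information-criterion approach can be sketched by scoring simple versus trend-corrected exponential smoothing with a Gaussian AIC built from one-step-ahead errors. The grid search, recursions, and simulated series below are simplifications assumed for illustration; the paper works with full maximum likelihood in a state space framework:

```python
import numpy as np

def ses_sse(y, alpha):
    """One-step-ahead squared errors for simple exponential smoothing."""
    level, sse = y[0], 0.0
    for obs in y[1:]:
        err = obs - level
        sse += err * err
        level += alpha * err
    return sse

def holt_sse(y, alpha, beta):
    """One-step-ahead squared errors for Holt's trend-corrected smoothing
    (error-correction form)."""
    level, trend, sse = y[0], y[1] - y[0], 0.0
    for obs in y[2:]:
        err = obs - (level + trend)
        sse += err * err
        level = level + trend + alpha * err
        trend = trend + beta * err
    return sse

def gaussian_aic(sse, n, n_params):
    """Gaussian AIC up to an additive constant: n*log(sse/n) + 2k."""
    return n * np.log(sse / n) + 2 * n_params

rng = np.random.default_rng(3)
y = 0.5 * np.arange(200) + rng.normal(0.0, 0.5, 200)   # strongly trending series

grid = np.linspace(0.05, 0.95, 19)
aic_ses = min(gaussian_aic(ses_sse(y, a), len(y) - 1, 2) for a in grid)
aic_holt = min(gaussian_aic(holt_sse(y, a, b), len(y) - 2, 3)
               for a in grid for b in grid)
```

On a trending series the criterion should favour the trend-corrected method despite its extra parameter, which is the collapsing behaviour the abstract describes in reverse.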
  14. By: J Keith Ord; Ralph D Snyder; Anne B Koehler; Rob J Hyndman; Mark Leeds
    Abstract: The state space approach to modelling univariate time series is now widely used both in theory and in applications. However, the very richness of the framework means that quite different model formulations are possible, even when they purport to describe the same phenomena. In this paper, we examine the single source of error [SSOE] scheme, which has perfectly correlated error components. We then proceed to compare SSOE to the more common version of the state space models, for which all the error terms are independent; we refer to this as the multiple source of error [MSOE] scheme. As expected, there are many similarities between the MSOE and SSOE schemes, but also some important differences. Both have ARIMA models as their reduced forms, although the mapping is more transparent for SSOE. Further, SSOE does not require a canonical form to complete its specification. An appealing feature of SSOE is that the estimates of the state variables converge in probability to their true values, thereby leading to a formal inferential structure for the ad-hoc exponential smoothing methods for forecasting. The parameter space for SSOE models may be specified to match that of the corresponding ARIMA scheme, or it may be restricted to meaningful sub-spaces, as for MSOE but with somewhat different outcomes. The SSOE formulation enables straightforward extensions to certain classes of non-linear models, including a linear trend with multiplicative seasonals version that underlies the Holt-Winters forecasting method. Conditionally heteroscedastic models may be developed in a similar manner. Finally we note that smoothing and decomposition, two crucial practical issues, may be performed within the SSOE framework.
    Keywords: ARIMA, Dynamic Linear Models, Equivalence, Exponential Smoothing, Forecasting, GARCH, Holt's Method, Holt-Winters Method, Kalman Filter, Prediction Intervals.
    JEL: C22 C53 C51
    Date: 2005–04
  15. By: DUFOUR, Jean-Marie
    Abstract: The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable possibly discrete test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified asymptotically justified versions of the MMC method are also proposed and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter).
    Keywords: Monte Carlo test; maximized Monte Carlo test; finite sample test; exact test; nuisance parameter; bounds; bootstrap; parametric bootstrap; simulated annealing; asymptotics; nonstandard asymptotic distribution.
    JEL: C12 C15 C2 C52 C22
    Date: 2005
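The core of the Monte Carlo test technique, for the nuisance-parameter-free case extended here, fits in a few lines: simulate the statistic under the null and rank the observed value among the replicates. The toy statistic below is a hypothetical example, not drawn from the paper:

```python
import numpy as np

def mc_p_value(stat_obs, simulate_stat, n_rep, rng):
    """Rank the observed statistic among n_rep replicates simulated under
    the null; (1 + #{sims >= obs}) / (n_rep + 1) gives an exact p-value."""
    sims = np.array([simulate_stat(rng) for _ in range(n_rep)])
    return (1 + np.sum(sims >= stat_obs)) / (n_rep + 1)

def sim_stat(rng):
    """Toy statistic: |sample mean| of 30 standard normal draws."""
    return abs(rng.standard_normal(30).mean())

rng = np.random.default_rng(4)
p_extreme = mc_p_value(10.0, sim_stat, n_rep=99, rng=rng)   # far outside the null
p_null = mc_p_value(sim_stat(rng), sim_stat, n_rep=99, rng=rng)
```

The maximized MC extension in the paper replaces the single simulated null with a maximization of this p-value over the nuisance parameter space, which the sketch does not attempt.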
  16. By: BEAULIEU, Marie-Claude; DUFOUR, Jean-Marie; KHALAF, Lynda
    Abstract: In this paper, we propose exact inference procedures for asset pricing models that can be formulated in the framework of a multivariate linear regression (CAPM), allowing for stable error distributions. The normality assumption on the distribution of stock returns is usually rejected in empirical studies, due to excess kurtosis and asymmetry. To model such data, we propose a comprehensive statistical approach which allows for alternative - possibly asymmetric - heavy tailed distributions without the use of large-sample approximations. The methods suggested are based on Monte Carlo test techniques. Goodness-of-fit tests are formally incorporated to ensure that the error distributions considered are empirically sustainable, from which exact confidence sets for the unknown tail area and asymmetry parameters of the stable error distribution are derived. Tests for the efficiency of the market portfolio (zero intercepts) which explicitly allow for the presence of (unknown) nuisance parameters in the stable error distribution are derived. The methods proposed are applied to monthly returns on 12 portfolios of the New York Stock Exchange over the period 1926-1995 (5 year subperiods). We find that stable possibly skewed distributions provide statistically significant improvement in goodness-of-fit and lead to fewer rejections of the efficiency hypothesis.
    Keywords: capital asset pricing model; mean-variance efficiency; non-normality; multivariate linear regression; stable distribution; skewness; kurtosis; asymmetry; uniform linear hypothesis; exact test; Monte Carlo test; nuisance parameter; specification test; diagnostics.
    JEL: C3 C12 C33 C15 G1 G12 G14
    Date: 2005
  17. By: DUFOUR, Jean-Marie; FARHAT, Abdeljelil; HALLIN, Marc
    Abstract: We consider the problem of testing whether the observations X1, ..., Xn of a time series are independent with unspecified (possibly nonidentical) distributions symmetric about a common known median. Various bounds on the distributions of serial correlation coefficients are proposed: exponential bounds, Eaton-type bounds, Chebyshev bounds and Berry-Esséen-Zolotarev bounds. The bounds are exact in finite samples, distribution-free and easy to compute. The performance of the bounds is evaluated and compared with traditional serial dependence tests in a simulation experiment. The procedures proposed are applied to U.S. data on interest rates (commercial paper rate).
    Keywords: autocorrelation; serial dependence; nonparametric test; distribution-free test; heterogeneity; heteroskedasticity; symmetric distribution; robustness; exact test; bound; exponential bound; large deviations; Chebyshev inequality; Berry-Esséen; interest rates.
    JEL: C14 C22 C12 C32 E4
    Date: 2005
  18. By: DUFOUR, Jean-Marie; JOUINI, Tarek
    Abstract: In this paper, we study the asymptotic distribution of a simple two-stage (Hannan-Rissanen-type) linear estimator for stationary invertible vector autoregressive moving average (VARMA) models in the echelon form representation. General conditions for consistency and asymptotic normality are given. A consistent estimator of the asymptotic covariance matrix of the estimator is also provided, so that tests and confidence intervals can easily be constructed.
    Keywords: Time series; VARMA; stationary; invertible; echelon form; estimation; asymptotic normality; bootstrap; Hannan-Rissanen
    JEL: C3 C32 C53
    Date: 2005
  19. By: Barbara Rossi (Duke University)
    Abstract: Many authors have documented that it is challenging to explain exchange rate fluctuations with macroeconomic fundamentals: a random walk forecasts future exchange rates better than existing macroeconomic models. This paper applies newly developed tests for nested models that are robust to the presence of parameter instability. The empirical evidence shows that for some countries we can reject the hypothesis that exchange rates are random walks. This raises the possibility that economic models were previously rejected not because the fundamentals are completely unrelated to exchange rate fluctuations, but because the relationship is unstable over time and, thus, difficult to capture by Granger Causality tests or by forecast comparisons. We also analyze forecasts that exploit the time variation in the parameters and find that, in some cases, they can improve over the random walk.
    Keywords: forecasting, exchange rates, parameter instability, random walks
    JEL: C52 C53 F3
    Date: 2005–03–19
  20. By: Javier Escobal (GRADE); Sonia Laszlo (McGill University)
    Abstract: Studies in the microeconometric literature increasingly utilize distance to or time to reach markets or social services as determinants of economic issues. These studies typically use self-reported measures from survey data, often characterized by non-classical measurement error. This paper is the first validation study of access to markets data. New and unique data from Peru allow comparison of self-reported variables with scientifically calculated variables. We investigate the determinants of the deviation between imputed and self-reported data and show that it is non-classical and dependent on observable socio-economic variables. Our results suggest that studies using self-reported measures of access may be estimating biased effects.
    JEL: O P
    Date: 2005–03–29
  21. By: Dubois (Ministère de l'Economie, des Finances et de l'Industrie, Paris, France)
    Abstract: Grocer is an econometric toolbox for Scilab, a free open-source matrix-oriented package similar to Matlab and Gauss. It contains more than 50 different econometric methods, with many variants and extensions; most standard econometric tools are available. Grocer also contains two original econometric tools: a function allowing the 'automatic' estimation of the 'true model' starting from a more general and bigger one, a method that provides the most thorough expression of the so-called LSE econometric methodology; and a function calculating the contributions of exogenous variables to an endogenous one.
    Keywords: econometric software, estimation, general to specific, contributions
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–01–21
  22. By: Eric Hillebrand (Louisiana State University, Department of Economics)
    Abstract: Apart from the well-known, high persistence of daily financial volatility data, there is also a short correlation structure that reverts to the mean in less than a month. We find this short correlation time scale in six different daily financial time series and use it to improve the short-term forecasts from GARCH models. We study different generalizations of GARCH that allow for several time scales. On our holding sample, none of the considered models can fully exploit the information contained in the short scale. Wavelet analysis shows a correlation between fluctuations on long and on short scales. Models accounting for this correlation as well as long memory models for absolute returns appear to be promising.
    Keywords: GARCH, volatility persistence, spurious high persistence, long memory, fractional integration, change-points, wavelets, time scales
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–01–31
  23. By: Edgar L. Feige (University of Wisconsin-Madison); Harold W. Watts (University of Wisconsin-Madison)
    Abstract: A proposal for maintaining privacy protection in large databases through the use of partially aggregated data instead of the original individual data. Proper micro-aggregation techniques can serve to protect the confidential nature of the individual data with minimal information loss. Reference: Data Bases, Computers and the Social Sciences, R. Bisco (ed.), Wiley, 1970, pp. 261-272.
    Keywords: Privacy, data protection, micro-aggregation
    JEL: C43 C82 C88
    Date: 2005–02–03
  24. By: Stanislav Radchenko (UNC at Charlotte)
    Abstract: This paper constructs long-term forecasts of energy prices using a reduced-form model of shifting trend developed by Pindyck (1999). A Gibbs sampling algorithm is developed to estimate models with a shifting trend line, which are used to construct 10-period-ahead and 15-period-ahead forecasts. An advantage of forecasts from this model is that they are not strongly influenced by the presence of large, long-lived increases and decreases in energy prices. The forecasts from the shifting trends model are combined with forecasts from the random walk model and the autoregressive model to substantially decrease the mean squared forecast error relative to each individual model.
    Keywords: energy forecasting, oil price, coal price, natural gas price, shifting trends model, long term forecasting
    JEL: C53
    Date: 2005–02–04
  25. By: Rafal Weron (Hugo Steinhaus Center); Adam Misiorek (Institute of Power Systems Automation)
    Abstract: In this paper we study two statistical approaches to load forecasting. Both of them model electricity load as a sum of two components: a deterministic component (representing seasonalities) and a stochastic component (representing noise). They differ in the choice of the seasonality reduction method. Model A utilizes differencing, while Model B uses a recently developed seasonal volatility technique. In both models the stochastic component is described by an ARMA time series. The models are tested on a time series of system-wide loads from the California power market and compared with the official forecast of the California System Operator (CAISO).
    Keywords: Electricity, load forecasting, ARMA model, seasonal component
    JEL: C22 C53 L94 Q40
    Date: 2005–02–07
  26. By: Ewa Broszkiewicz-Suwaj (Wroclaw University of Technology); Andrzej Makagon (Hampton University); Rafal Weron (Hugo Steinhaus Center); Agnieszka Wylomanska (Wroclaw University of Technology)
    Abstract: For many economic problems standard statistical analysis, based on the notion of stationarity, is not adequate. These include modeling seasonal decisions of consumers, forecasting business cycles and - as we show in the present article - modeling wholesale power market prices. We apply standard methods and a novel spectral domain technique to conclude that electricity price returns exhibit periodic correlation with daily and weekly periods. As such they should be modeled with periodically correlated processes. We propose to apply periodic autoregression (PAR) models which are closely related to the standard instruments in econometric analysis - vector autoregression (VAR) models.
    Keywords: periodic correlation, sample coherence, electricity price, periodic autoregression, vector autoregression
    JEL: C22 C32 L94 Q40
    Date: 2005–02–07
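The periodic-correlation idea above can be illustrated with a minimal PAR(1) sketch, in which the autoregressive coefficient depends on the season s = t mod S and each seasonal coefficient is fitted by least squares. This is an illustrative toy (noiseless, so the coefficients are recovered almost exactly), not the authors' implementation.

```python
def fit_par1(x, S):
    """Least-squares AR(1) coefficient for each season s = t mod S."""
    phis = []
    for s in range(S):
        num = sum(x[t] * x[t - 1] for t in range(1, len(x)) if t % S == s)
        den = sum(x[t - 1] ** 2 for t in range(1, len(x)) if t % S == s)
        phis.append(num / den)
    return phis

# Noiseless PAR(1) with known seasonal coefficients: x[t] = phi_{t mod 3} * x[t-1].
true_phis = [0.9, 0.2, 0.5]
x = [1.0]
for t in range(1, 91):
    x.append(true_phis[t % 3] * x[t - 1])
phis = fit_par1(x, 3)  # recovers true_phis up to rounding
```

Stacking the S seasonal equations into one system is what connects PAR models to the VAR representation mentioned in the abstract.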
  27. By: Bragoudakis Zacharias (Bank of Greece)
    Abstract: This paper is an exercise in applied macroeconomic forecasting. We examine the forecasting power of a vector error-correction model (VECM) that is anchored by a long-run equilibrium relationship between Greek national income and productive public expenditure, as suggested by economic theory. We compare the forecast values of the endogenous variables to the actual historical values using a stochastic simulation analysis. The simulation results provide new evidence supporting the ability of the model to forecast not only one period ahead but also many periods into the future.
    Keywords: Cointegration, Forecasting, Simulation Analysis, Vector error-correction models
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–02–09
  28. By: Kusum Mundra (San Diego State University)
    Abstract: In panel data the interest is often in slope estimation while taking account of unobserved cross-sectional heterogeneity. This paper proposes two nonparametric slope estimators in which the unobserved effect is treated as fixed across the cross section. The first estimator uses the first-differencing transformation and the second uses the mean-deviation transformation. The asymptotic properties of the two estimators are established, and their finite-sample Monte Carlo properties are investigated allowing for systematic dependence between the cross-sectional effect and the independent variable. Simulation results suggest that the new nonparametric estimators perform better than their parametric counterparts. We also investigate the finite-sample properties of the parametric within and first-differencing estimators. A very common practice in estimating earnings functions is to assume earnings are quadratic in age and tenure, but that specification might be wrong. In this paper we estimate the nonparametric slopes of age and tenure on earnings using NLSY data and compare them to the parametric (quadratic) effects.
    Keywords: Nonparametric, Fixed-effect, Kernel, Monte Carlo
    JEL: C1 C14 C23 C15
    Date: 2005–02–09
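The first-differencing idea behind the first estimator above can be sketched in a simple linear special case: in y_it = β·x_it + a_i + e_it, differencing within each cross-sectional unit removes the unobserved effect a_i. The paper's estimators are nonparametric; this toy, with hypothetical data and a linear slope, only illustrates why differencing eliminates the fixed effect.

```python
def first_difference_slope(panel):
    """Pooled least-squares slope on within-unit first differences.

    panel: list of units, each a time-ordered list of (x, y) pairs.
    Differencing y[i,t] - y[i,t-1] cancels the unit fixed effect a_i.
    """
    num = den = 0.0
    for unit in panel:
        for (x0, y0), (x1, y1) in zip(unit, unit[1:]):
            dx, dy = x1 - x0, y1 - y0
            num += dx * dy
            den += dx * dx
    return num / den

# Fixed effects differ wildly across units, yet the common slope 2.0 is
# recovered exactly, because differencing removes a_i (no noise in this toy).
beta, effects = 2.0, [10.0, -5.0, 100.0]
panel = [[(t, beta * t + a) for t in range(5)] for a in effects]
slope = first_difference_slope(panel)
```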
  29. By: Costas Milas (City University); Phil Rothman (East Carolina University)
    Abstract: In this paper we use smooth transition vector error-correction models (STVECMs) in a simulated out-of-sample forecasting experiment for the unemployment rates of the four non-Euro G-7 countries: the U.S., the U.K., Canada, and Japan. For the U.S., pooled forecasts constructed by taking the median across the point forecasts generated by the STVECMs perform better than the linear VECM benchmark, particularly during business cycle expansions. Pooling across the linear and nonlinear forecasts tends to lead to statistically significant forecast improvement during business cycle expansions for Canada, while the opposite is the case for the U.K.
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–02–18
  30. By: Edoardo Otranto (DEIR-Università di Sassari)
    Abstract: The extraction of a common signal from a group of time series is generally obtained using variables recorded with the same frequency or transformed to have the same frequency (monthly, quarterly, etc.). The statistical literature has not paid a great deal of attention to this topic. In this paper we extend an approach based on the use of dummy variables to the well known trend plus cycle model, in a multivariate context, using both quarterly and monthly data. This procedure is applied to the Italian economy, using the variables suggested by an Italian Institution (ISAE) to provide a national dating.
    Keywords: Business cycle; State-space; Time Series; Trend; Turning Points
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–02–18
  31. By: Victor Aguirregabiria (Boston University); Pedro Mira (CEMFI)
    Abstract: This paper proposes an algorithm to obtain maximum likelihood estimates of structural parameters in discrete games with multiple equilibria. The method combines a genetic algorithm (GA) with a pseudo maximum likelihood (PML) procedure. The GA searches efficiently over the huge space of possible combinations of equilibria in the data. The PML procedure avoids the repeated computation of equilibria for each trial value of the parameters of interest. To test the ability of this method to get maximum likelihood estimates, we present a Monte Carlo experiment in the context of a game of price competition and collusion.
    Keywords: Empirical games; Maximum likelihood estimation; Multiple equilibria; Genetic algorithms.
    JEL: C13 C35
    Date: 2005–02–28
  32. By: Vadim Marmer (Yale University)
    Abstract: Various implications of nonlinearity, nonstationarity and misspecification are considered from a forecasting perspective. My model allows for small departures from the martingale difference sequence hypothesis by including an additive nonlinear component, formulated as a general, integrable transformation of the predictor, which is assumed to be I(1). Such a generating mechanism provides for predictability only in the extremely short run. In the stock market example, this formulation corresponds to a situation where some relevant information may escape the attention of market participants only for very short periods of time. I assume that the true generating mechanism involving the nonlinear dependency is unknown to the econometrician and he is therefore forced to use some approximating functions. I show that the usual regression techniques lead to spurious forecasts. Improvements of the forecast accuracy are possible with properly chosen integrable approximating functions. This paper derives the limiting distribution of the forecast MSE. In the case of square integrable approximants, it depends on the $L_{2}$-distance between the nonlinear component and the approximating function. Optimal forecasts are available for a given class of approximants. Finally, I present a Monte Carlo simulation study and an empirical example in support of the theoretical findings.
    Keywords: forecasting, integrated time series, misspecified models, nonlinear transformations, stock returns, dividend-price ratio.
    JEL: C22 C53 G14
    Date: 2005–03–05
  33. By: Matteo M. Pelagatti (University of Milan-Bicocca); Stefania Rondena (University of Milan-Bicocca)
    Abstract: The Dynamic Conditional Correlation (DCC) model of Engle has made the estimation of multivariate GARCH models feasible for reasonably large vectors of securities’ returns. In the present paper we show how Engle’s two-step estimation of the model can be easily extended to elliptical conditional distributions, and we apply different leptokurtic DCC models to some stocks listed on the Milan Stock Exchange. Free software written by the authors to carry out all the required computations is presented as well.
    Keywords: Multivariate GARCH, Dynamic conditional correlation, Generalized method of moments
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–03–11
  34. By: Matteo M. Pelagatti (University of Milan-Bicocca)
    Abstract: Duration-dependent Markov-switching VAR (DDMS-VAR) models are time series models whose data generating process consists of a mixture of two VAR processes, which switches according to a two-state Markov chain with transition probabilities depending on how long the process has been in a state. In the present paper I propose an MCMC-based methodology to carry out inference on the model's parameters and introduce DDMSVAR for Ox, software written by the author for the analysis of time series by means of DDMS-VAR models. An application of the methodology to the U.S. business cycle concludes the article.
    Keywords: Markov-switching, Business cycle, Gibbs sampling, Duration dependence, Vector autoregression
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–03–11
  35. By: Amjad D. Al-Nasser
    Abstract: This paper presents the methodology of the Generalised Maximum Entropy (GME) approach for estimating linear models that contain latent variables, such as customer satisfaction measurement models. The GME approach is a distribution-free method and provides a better alternative to the conventional method used in the context of customer satisfaction measurement, namely Partial Least Squares (PLS). A simplified model used for the Swedish customer satisfaction index (CSI) was employed to generate simulated data in order to study the performance of GME and PLS. The results show that GME outperforms PLS in terms of mean squared error (MSE). Simulated data are also used to compute the CSI using the GME approach.
    Keywords: Generalised Maximum Entropy, Partial Least Squares, Customer Satisfaction Models.
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–03–10
  36. By: Ozgen Sayginsoy (University at Albany--SUNY)
    Abstract: In this paper, a likelihood ratio approach is taken to derive a test of the economic convergence hypothesis in the context of the linear deterministic trend model. The test is designed to directly address the nonstandard nature of the hypothesis and is a systematic improvement over existing methods for testing convergence in the same context. The test is first derived under the assumption of Gaussian errors with known serial correlation. The normality assumption is then relaxed, and the results extend naturally to the case of covariance stationary errors with unknown serial correlation. The test statistic is a continuous function of individual t-statistics on the intercept and slope parameters of the linear deterministic trend model, and therefore standard heteroskedasticity and autocorrelation consistent estimators of the long-run variance can be directly implemented. Building upon the likelihood ratio framework, concrete and specific tests are recommended for use in practice. The recommended tests do not require knowledge of the form of serial correlation in the data, and they are robust to highly persistent serial correlation, including the case of a unit root in the errors. They utilize nonparametric kernel variance estimators, which are analyzed in the fixed bandwidth (fixed-b) asymptotic framework recently proposed by Kiefer and Vogelsang (2003). The fixed-b framework makes possible the choice of kernel and bandwidth that deliver tests with maximal asymptotic power within a specific class of tests. It is shown that when the Daniell kernel variance estimator is implemented with specific bandwidth choices, the recommended tests have asymptotic power close to that of the known-variance case, as well as good finite-sample size and power properties. Finally, the newly developed tests are used to investigate economic convergence among the eight regions of the United States (as defined by the Bureau of Economic Analysis) in the post-World-War-II period. Empirical evidence of convergence is found for three of the eight regions.
    Keywords: Likelihood Ratio, Joint Inequality, HAC Estimator, Fixed-b Asymptotics, Power Envelope, Unit Root, Linear Trend, BEA Regions.
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–03–11
  37. By: Jonathan B. Hill (Florida International University)
    Abstract: In this paper, we develop a parametric test procedure for multiple-horizon "Granger" causality and apply the procedure to the well-established problem of determining causal patterns in aggregate monthly U.S. money and output. As opposed to most papers in the parametric causality literature, we are interested in whether money ever "causes" (can ever be used to forecast) output, when causation occurs, and how (through which causal chains). For brevity, we consider only causal patterns up to horizon h = 3. Our tests are based on new recursive parametric characterizations of causality chains which help to distinguish between mere noncausation (the total absence of indirect causal routes) and causal neutralization, in which several causal routes exist that cancel each other out such that noncausation occurs. In many cases the recursive characterizations imply greatly simplified linear compound hypotheses for multi-step-ahead causation, and permit Wald tests with the usual asymptotic χ²-distribution. A simulation study demonstrates that a sequential test method does not generate the type of size distortions typically reported in the literature, and that null rejection frequencies depend entirely on how we define the "null hypothesis" of non-causality (at which horizon, if any). Using monthly data employed in Stock and Watson (1989) and others, we demonstrate that while Friedman and Kuttner's (1993) result that detrended money growth fails to cause output one month ahead continues into the third quarter of 2003, a significant causal lag may exist through a variety of short-term interest rates: money appears to cause output after at least one month passes, although in some cases using recent data conflicting evidence suggests money may never cause output and may be truly irrelevant in matters of real decisions.
    Keywords: multiple horizon causation; multivariate time series; sequential tests.
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–03–15
  38. By: Patrick Crowley (Texas A&M University - Corpus Christi)
    Abstract: Wavelet analysis, although used extensively in disciplines such as signal processing, engineering, medical sciences, physics and astronomy, has not yet fully entered the economics discipline. In this discussion paper, wavelet analysis is introduced in an intuitive manner, and the existing economics and finance literature that utilises wavelets is explored. Extensive examples of exploratory wavelet analysis are given, many using Canadian, US and Finnish industrial production data. Finally, potential future applications for wavelet analysis in economics are also discussed and explored.
    Keywords: statistical methodology, multiresolution analysis, wavelets, business cycles, economic growth
    JEL: C19 C87 E32
    Date: 2005–03–17
  39. By: Marie Bessec (EURIsCO - University Paris Dauphine); Othman Bouabdallah (EUREQua - University Paris Panthéon Sorbonne)
    Abstract: This paper explores the forecasting abilities of Markov-switching models. Although MS models generally display a superior in-sample fit relative to linear models, the gain in prediction remains small. We confirm this result using simulated data for a wide range of specifications by applying several tests of forecast accuracy and forecast encompassing that are robust to nested models. In order to explain this poor performance, we use a forecasting error decomposition. We identify four components and derive their analytical expressions in different MS specifications. The relative contribution of each source is assessed through Monte Carlo simulations. We find that the main source of error is the misclassification of future regimes.
    Keywords: Forecasting, Regime Shifts, Markov-Switching.
    JEL: C22 C32 C53
    Date: 2005–03–22
  40. By: Ching-Kang Ing (Institute of Statistical Science, Academia Sinica)
    Abstract: The predictive capability of a modification of Rissanen's accumulated prediction error (APE) criterion, APE$_{\delta_{n}}$, is investigated in infinite-order autoregressive (AR($\infty$)) models. Instead of accumulating squares of sequential prediction errors from the beginning, APE$_{\delta_{n}}$ is obtained by summing these squared errors from stage $n\delta_{n}$, where $n$ is the sample size and $0 < \delta_{n} < 1$ may depend on $n$. Under certain regularity conditions, an asymptotic expression is derived for the mean-squared prediction error (MSPE) of an AR predictor with order determined by APE$_{\delta_{n}}$. This expression shows that the prediction performances of APE$_{\delta_{n}}$ can vary dramatically depending on the choice of $\delta_{n}$. Another interesting finding is that when $\delta_{n}$ approaches 1 at a certain rate, APE$_{\delta_{n}}$ can achieve asymptotic efficiency in most practical situations. An asymptotic equivalence between APE$_{\delta_{n}}$ and an information criterion with a suitable penalty term is also established from the MSPE point of view. It offers a new perspective for comparing the information- and prediction-based model selection criteria in AR($\infty$) models. Finally, we provide the first asymptotic efficiency result for the case when the underlying AR($\infty$) model is allowed to degenerate to a finite autoregression.
    Keywords: Accumulated prediction errors, Asymptotic equivalence, Asymptotic efficiency, Information criterion, Order selection, Optimal forecasting
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–03–23
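The accumulated prediction error idea above can be sketched as follows: for each candidate AR order, one-step forecasts are made sequentially using only past observations, and the squared errors are summed from stage n·δ_n onward; the order minimizing the accumulated error is selected. This is an illustrative toy (orders 1 and 2 only, with hypothetical data), not the paper's estimator.

```python
import math

def ar_lstsq(x, k):
    """Least-squares AR(k) coefficients via normal equations (k = 1 or 2 here)."""
    n = len(x)
    if k == 1:
        num = sum(x[t] * x[t - 1] for t in range(1, n))
        den = sum(x[t - 1] ** 2 for t in range(1, n))
        return [num / den]
    # k == 2: solve the 2x2 normal equations by Cramer's rule.
    a11 = sum(x[t - 1] ** 2 for t in range(2, n))
    a12 = sum(x[t - 1] * x[t - 2] for t in range(2, n))
    a22 = sum(x[t - 2] ** 2 for t in range(2, n))
    b1 = sum(x[t] * x[t - 1] for t in range(2, n))
    b2 = sum(x[t] * x[t - 2] for t in range(2, n))
    det = a11 * a22 - a12 * a12
    return [(b1 * a22 - b2 * a12) / det, (a11 * b2 - a12 * b1) / det]

def ape(x, k, delta):
    """Accumulate squared one-step prediction errors from stage n*delta on."""
    n = len(x)
    start = max(int(n * delta), k + 5)  # leave room to fit the first model
    total = 0.0
    for t in range(start, n):
        coefs = ar_lstsq(x[:t], k)      # each forecast uses the past only
        pred = sum(c * x[t - 1 - j] for j, c in enumerate(coefs))
        total += (x[t] - pred) ** 2
    return total

# Hypothetical AR(2) data; the criterion should prefer order 2 over order 1.
x = [0.0, 1.0]
for t in range(2, 120):
    x.append(0.6 * x[t - 1] - 0.3 * x[t - 2] + math.sin(1.7 * t))
scores = {k: ape(x, k, 0.5) for k in (1, 2)}
best = min(scores, key=scores.get)
```

Choosing δ_n close to 1 means only late-stage errors are accumulated, which is the regime the abstract identifies as delivering asymptotic efficiency.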
  41. By: Chen Pu (Universität Bielefeld, Fakultät für Wirtschaftswissenschaften); Hsiao Chihying (Universität Bielefeld, Fakultät für Wirtschaftswissenschaften)
    Abstract: In this paper we investigate the possibility of applying a subsampling procedure to test cointegration relations in large multivariate systems. The subsampling technique is applied to overcome the difficulties of nonstandard distributions and nuisance parameters in testing for cointegration rank without an explicitly formulated structural model. The contribution of this paper is twofold: theoretically, it shows that the subsampling testing procedure is consistent and asymptotically most powerful; practically, it demonstrates that the subsampling procedure can be applied to determine the cointegration rank in large-scale models, where the standard procedures already hit their limits. Especially in cases with few stochastic trends in a system, the subsampling procedure yields robust and reliable results.
    Keywords: Cointegration, Large System, Nonparametric Tests, Subsampling, PPP
    JEL: C19 C40 C50
    Date: 2005–04–08
  42. By: Patrick Marsh
    Abstract: This paper considers the information available to invariant unit root tests at and near the unit root. Since all invariant tests will be functions of the maximal invariant, the Fisher information in this statistic will be the available information. The main finding of the paper is that the available information for all tests invariant to a linear trend is zero at the unit root. This result applies for any sample size, over a variety of distributions and correlation structures and is robust to the inclusion of any other deterministic component. In addition, an explicit bound upon the power of all invariant unit root tests is shown to depend solely upon the information. This bound is illustrated via comparison with the local-to-unity power envelope and a brief simulation study illustrates the impact that the requirements of invariance have on power.
  43. By: Joaquim J.S. Ramalho (Department of Economics, University of Évora); Richard J. Smith (Department of Economics, University of Warwick)
    Abstract: This paper proposes novel methods for the construction of tests for models specified by unconditional moment restrictions. It exploits the classical-like nature of generalized empirical likelihood (GEL) to define Pearson-type statistics for over-identifying moment conditions and parametric constraints based on contrasts of GEL implied probabilities, which are natural by-products of GEL estimation. As is increasingly recognized, GEL can possess both theoretical and empirical advantages over the more standard generalized method of moments (GMM). Monte Carlo evidence comparing GMM, GEL and Pearson-type statistics for over-identifying moment conditions indicates that the size properties of a particular Pearson-type statistic are competitive in most circumstances and an improvement over other statistics in many.
    Keywords: GMM, Generalized Empirical Likelihood, Overidentifying Moments, Parametric Restrictions, Pearson-Type Tests
    JEL: C13 C30
    Date: 2005
  44. By: Joaquim J.S. Ramalho (Department of Economics, University of Évora); Esmeralda A. Ramalho (Department of Economics, University of Évora)
    Abstract: Empirical likelihood (EL) is appropriate for estimating moment condition models when a random sample from the target population is available. However, many economic surveys are subject to some form of stratification, with different subsets of the underlying population of interest being sampled with different frequencies. In this setting, the available data provide a related but distorted picture of the features of the target population, and direct application of EL will produce inconsistent estimators. In this paper we propose some adaptations of EL to deal with stratified samples in models defined by unconditional moment restrictions. We develop two distinct modified EL estimators: the weighted EL estimator, which requires knowledge of the marginal strata probabilities; and the two-step EL estimator, which assumes the availability of some auxiliary aggregate information on the target population. A Monte Carlo simulation study reveals promising results both for the weighted and some versions of the two-step EL estimators.
    Keywords: Stratified Sampling, Empirical Likelihood, Weighted Estimation, Auxiliary Information
    JEL: C13 C30
    Date: 2005

This nep-ecm issue is ©2005 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.