nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒07‒28
fifteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Issues in the Estimation of Mis-Specified Models of Fractionally Integrated Processes By K. Nadarajah; Gael M. Martin; D.S. Poskitt
  2. Inference on the Long-Memory Properties of Time Series with Non-Stationary Volatility By Demetrescu, Matei; Sibbertsen, Philipp
  3. Consistent Pretesting for Jumps By Valentina Corradi; Mervyn J. Silvapulle; Norman Swanson
  4. Semiparametric Estimation of First-Price Auction Models By Aryal, Gaurab; Gabrielli, Maria F.; Vuong, Quang
  5. A Laplace Stochastic Frontier Model By William C. Horrace; Christopher F. Parmeter
  6. Identification and Estimation of Outcome Response with Heterogeneous Treatment Externalities By Eleonora Patacchini; Tiziano Arduini; Edoardo Rainone
  7. Prediction and Simulation Using Simple Models Characterized by Nonstationarity and Seasonality By Norman Swanson; Richard Urbach
  8. Bayesian Exploratory Factor Analysis By Gabriella Conti; Sylvia Fruehwirth-Schnatter; James J. Heckman; Remi Piatek
  9. A control chart using copula-based Markov chain models By Long, Ting-Hsuan; Emura, Takeshi
  10. Identification of cross and autocorrelations in time series within an approach based on Wigner eigenspectrum of random matrices By Michal Sawa; Dariusz Grech
  11. A Poisson Stochastic Frontier Model with Finite Mixture Structure By Drivas, Kyriakos; Economidou, Claire; Tsionas, Efthymios G.
  12. Structural Estimation of Sequential Games of Complete Information By Jason R. Blevins
  13. Maximum entropy estimator for the predictability of energy commodity market time series By Francesco Benedetto; Gaetano Giunta; Loretta Mastroeni
  14. Generalized Random Utility Models with Multiple Types By Azari Soufiani, Hossein; Diao, Hansheng; Lai, Zhenyu; Parkes, David C.
  15. A fractionally cointegrated VAR model with deterministic trends and application to commodity futures markets By Sepideh Dolatabadi; Morten Ø. Nielsen; Ke Xu

  1. By: K. Nadarajah; Gael M. Martin; D.S. Poskitt
    Abstract: In this paper we quantify the impact of model mis-specification on the properties of parameter estimators applied to fractionally integrated processes. We demonstrate the asymptotic equivalence of four alternative parametric methods: frequency domain maximum likelihood, Whittle estimation, time domain maximum likelihood and conditional sum of squares. We show that all four estimators converge to the same pseudo-true value and provide an analytical representation of their (common) asymptotic distribution. As well as providing theoretical insights, we explore the finite sample properties of the alternative estimators when used to fit mis-specified models. In particular we demonstrate that when the difference between the true and pseudo-true values of the long memory parameter is sufficiently large, a clear distinction between the frequency domain and time domain estimators can be observed - in terms of the accuracy with which the finite sample distributions replicate the common asymptotic distribution - with the time domain estimators exhibiting a closer match overall. Simulation experiments also demonstrate that the two time-domain estimators have the smallest bias and mean squared error as estimators of the pseudo-true value of the long memory parameter, with conditional sum of squares being the most accurate estimator overall and having a relative efficiency that is approximately double that of frequency domain maximum likelihood, across a range of mis-specification designs.
    Keywords: bias, conditional sum of squares, frequency domain, long memory models, maximum likelihood, mean squared error, pseudo-true parameter, time domain, Whittle
    JEL: C18 C22 C52
    Date: 2014
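The conditional-sum-of-squares estimator singled out in this abstract is compact enough to reproduce. The sketch below is an illustration under textbook assumptions (correctly specified ARFIMA(0,d,0), grid search in place of a numerical optimizer), not the authors' code:

```python
import numpy as np

def frac_diff(x, d):
    """Apply the truncated fractional-difference filter (1-L)^d to x."""
    n = len(x)
    pi = np.empty(n)
    pi[0] = 1.0
    for j in range(1, n):
        pi[j] = pi[j - 1] * (j - 1 - d) / j   # binomial-expansion recursion
    return np.array([pi[: t + 1] @ x[t::-1] for t in range(n)])

def simulate_arfima(n, d, rng):
    """Simulate ARFIMA(0,d,0): x = (1-L)^{-d} e with Gaussian shocks."""
    e = rng.standard_normal(n)
    return frac_diff(e, -d)

def css_estimate(x, grid=np.linspace(0.0, 0.49, 50)):
    """Conditional-sum-of-squares estimate of d via grid search:
    minimize the sum of squared fractional-differencing residuals."""
    sse = [np.sum(frac_diff(x, d) ** 2) for d in grid]
    return float(grid[int(np.argmin(sse))])

rng = np.random.default_rng(0)
x = simulate_arfima(2000, 0.3, rng)
print(css_estimate(x))
```

With 2000 observations the grid minimizer typically lands within a few hundredths of the true d = 0.3.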
  2. By: Demetrescu, Matei; Sibbertsen, Philipp
    Abstract: Many time series exhibit unconditional heteroskedasticity, often in addition to conditional heteroskedasticity. Such time-varying volatility of the data generating process can have rather adverse effects when inferring about its persistence; e.g. unit root and stationarity tests possess null distributions that depend on the so-called variance profile. Valid inference is, however, guaranteed by protective actions as simple as using White standard errors (which are employed anyway to deal with conditional heteroskedasticity). The paper explores the influence of time-varying volatility on fractionally integrated processes. Concretely, we discuss how to model long memory in the presence of time-varying volatility, and analyze the effects of such nonstationarity on several existing inferential procedures for the fractional integration parameter. Based on asymptotic arguments and Monte Carlo simulations, we show that periodogram-based estimators, such as the local Whittle or the log-periodogram regression estimator, remain consistent, but have asymptotic distributions whose variance depends on the variance profile. Time-domain, regression-based tests for fractional integration retain their validity if White standard errors are used. Finally, the modified range-scale statistic is only affected if the series requires adjustment for deterministic components.
    Keywords: Time-varying variance, Heteroskedasticity, Persistence, Fractional integration, Modulated process
    JEL: C12 C22
    Date: 2014–07
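The local Whittle estimator discussed in the abstract has a short closed-form objective. A minimal sketch (grid search instead of a numerical optimizer; not the authors' implementation):

```python
import numpy as np

def local_whittle(x, m):
    """Local Whittle estimate of the memory parameter d from the
    first m periodogram ordinates of the series x."""
    n = len(x)
    w = np.fft.fft(x - x.mean())
    I = np.abs(w[1 : m + 1]) ** 2 / (2 * np.pi * n)   # periodogram
    lam = 2 * np.pi * np.arange(1, m + 1) / n         # Fourier frequencies

    def R(d):  # concentrated Whittle objective
        return np.log(np.mean(lam ** (2 * d) * I)) - 2 * d * np.mean(np.log(lam))

    grid = np.linspace(-0.49, 0.99, 149)
    return float(grid[np.argmin([R(d) for d in grid])])

rng = np.random.default_rng(1)
e = rng.standard_normal(4096)
print(local_whittle(e, 128))             # short memory: estimate near 0
print(local_whittle(np.cumsum(e), 128))  # unit root: estimate near 1
```

The bandwidth m governs the usual bias-variance trade-off; the abstract's point is that under time-varying volatility the estimator stays consistent but its asymptotic variance depends on the variance profile.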
  3. By: Valentina Corradi (University of Surrey); Mervyn J. Silvapulle (Monash University); Norman Swanson (Rutgers University)
    Abstract: If the intensity parameter in a jump diffusion model is identically zero, then parameters characterizing the jump size density cannot be identified. In general, this lack of identification precludes consistent estimation of identified parameters. Hence, it should be standard practice to consistently pretest for jumps, prior to estimating jump diffusions. Many currently available tests have power against the presence of jumps over a finite time span (typically a day or a week); and, as already noted by various authors, jumps may not be observed over finite time spans, even if the intensity parameter is strictly positive. Such tests cannot be consistent against non-zero intensity. Moreover, sequential application of finite time span tests usually leads to sequential testing bias, which in turn leads to jump discovery with probability one, in the limit, even if the true intensity is identically zero. This paper introduces tests for jump intensity, based on both in-fill and long-span asymptotics, which solve both the test consistency and the sequential testing bias problems discussed above, in turn facilitating consistent estimation of jump diffusion models. A "self-excitement" test is also introduced, which is designed to have power against path-dependent intensity, thus providing a direct test for the Hawkes diffusion model of Ait-Sahalia, Cacho-Diaz and Laeven (2013). In a series of Monte Carlo experiments, the proposed tests are evaluated, and are found to perform adequately in finite samples.
    Keywords: diffusion model, jump intensity
    JEL: C12 C22 C52
    Date: 2014–06–09
  4. By: Aryal, Gaurab; Gabrielli, Maria F.; Vuong, Quang
    Abstract: We propose a semiparametric estimator within the class of indirect methods. Specifically, we model private valuations through a set of conditional moment restrictions. Our econometric model calls for a two-step procedure. In the first step we recover a sample of pseudo private values using a local polynomial estimator. In the second step we use a GMM procedure to obtain an estimate of the parameter of interest. The proposed semiparametric estimator is shown to have desirable statistical properties: it is consistent and asymptotically normal. Moreover, the estimator attains the parametric rate of convergence.
    Keywords: Auctions, Structural Approach, Semiparametric Estimator, Local Polynomial, GMM.
    JEL: C14 C72 D44
    Date: 2014–07–12
  5. By: William C. Horrace (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244-1020); Christopher F. Parmeter (Department of Economics, University of Miami)
    Abstract: We propose a Laplace stochastic frontier model as an alternative to the traditional model with normal errors. An interesting feature of the Laplace model is that the distribution of inefficiency conditional on the composed error is constant for positive values of the composed error, but varies for negative values. Therefore, it may be ideally suited for analyzing industries with many firms on or close to the efficient frontier. A simulation study suggests that the model performs well relative to the normal-exponential model when the two-sided error is misspecified. A brief application to US Airlines is provided.
    Keywords: Stochastic frontier, efficient estimation
    JEL: C12 C16 C44 D24
    Date: 2014–04
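The composed-error structure (two-sided Laplace noise minus one-sided inefficiency) is easy to simulate. The sketch below recovers the frontier by a corrected-OLS moment argument rather than the maximum-likelihood estimator the paper develops; the exponential inefficiency distribution and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
x = rng.uniform(1, 10, n)                  # single regressor (log input)
beta0, beta1 = 2.0, 0.5
v = rng.laplace(scale=0.2, size=n)         # symmetric two-sided Laplace noise
u = rng.exponential(scale=0.3, size=n)     # one-sided inefficiency
y = beta0 + beta1 * x + v - u              # log output

# COLS: the OLS slope is consistent, while the intercept absorbs -E[u]
X = np.column_stack([np.ones(n), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ b

# for symmetric v and exponential u, E[(e - Ee)^3] = -2 * sigma_u^3,
# so the residual skewness identifies the inefficiency scale
m3 = np.mean((resid - resid.mean()) ** 3)
sigma_u = (-m3 / 2) ** (1 / 3) if m3 < 0 else 0.0
frontier_intercept = b[0] + sigma_u        # shift intercept up by E[u] = sigma_u
print(round(b[1], 2), round(sigma_u, 2), round(frontier_intercept, 2))
```

The third-moment correction is the standard COLS device for any symmetric noise plus exponential inefficiency; the paper's contribution is the Laplace likelihood itself, which this sketch does not implement.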
  6. By: Eleonora Patacchini (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244); Tiziano Arduini (University of Rome "Tor Vergata"); Edoardo Rainone (LaSapienza University)
    Abstract: This paper studies the identification and estimation of treatment response with heterogeneous spillovers in a network model. We generalize the standard linear-in-means model to allow for multiple groups with between and within-group interactions. We provide a set of identification conditions of peer effects and consider a 2SLS estimation approach. Large sample properties of the proposed estimators are derived. Simulation experiments show that the estimators perform well in finite samples. The model is used to study the effectiveness of policies where peer effects are seen as a mechanism through which the treatments could propagate through the network. When interactions among groups are at work, a shock on a treated group has effects on the non-treated. Our framework allows for quantifying how much of the indirect treatment effect is due to variations in the characteristics of treated peers (treatment contextual effects) and how much is because of variations in peer outcomes (peer effects).
    Keywords: Networks, Heterogeneous Peer Effects, Spatial Autoregressive Model, Two-Stage Least Squares, Efficiency, Policy Evaluation, Treatment Response, Indirect Treatment Effect
    JEL: C13 C21 D62
    Date: 2014–04
  7. By: Norman Swanson (Rutgers University); Richard Urbach (Conning Germany GmbH)
    Abstract: In this paper, we provide new evidence on the empirical usefulness of various simple seasonal models, and underscore the importance of carefully designing criteria by which one judges alternative models. In particular, we underscore the importance of both choice of forecast or simulation horizon and choice between minimizing point or distribution based loss measures. Our empirical analysis centers around the implementation of a series of simulation and prediction experiments, as well as a discussion of the stochastic properties of seasonal unit root models. Our prediction experiments are based on analysis of a group of 14 variables that have been chosen to closely mimic the set of indicators used by the Federal Reserve to help in setting U.S. monetary policy, and our simulation experiments are based on a comparison of simulated and historical distributions of said variables using the testing approach of Corradi and Swanson (2007a).
    Keywords: seasonal unit root, periodic autoregression, difference stationary
    JEL: C13 C22 C52
    Date: 2013–08–10
  8. By: Gabriella Conti (University College London); Sylvia Fruehwirth-Schnatter (University of Vienna); James J. Heckman (The University of Chicago); Remi Piatek (Kobenhavns Universitet)
    Abstract: This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on \emph{ad hoc} classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements.
    Keywords: Bayesian factor modeling, exploratory factor analysis, identifiability, marginal data augmentation, model expansion, model selection
    JEL: C11 C38 C63
    Date: 2014–07
  9. By: Long, Ting-Hsuan; Emura, Takeshi
    Abstract: Statistical process control is an important and convenient tool to stabilize the quality of manufactured goods and service operations. The traditional Shewhart control chart has been used extensively for process control, which is valid under the independence assumption of consecutive observations. In real world applications, there are many types of dependent observations in which the traditional control chart cannot be used. In this paper, we propose to apply a copula-based Markov chain to perform statistical process control for correlated observations. In particular, we consider three methods to obtain the estimates of upper control limit (UCL) and lower control limit (LCL) for the control chart. It is shown by simulations that Joe’s parametric maximum likelihood method provides the most reliable estimates of the UCL and LCL compared to the other methods. We also propose simulation techniques to compute the average run length (ARL) of the proposed charts, which can be used to set the UCL and LCL for a given value of ARL. The piston rings data are analyzed for illustration.
    Keywords: Average run length, Clayton model, correlated data, Kendall’s tau, Markov chain
    JEL: C13 C15
    Date: 2014–07–19
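A Clayton-copula Markov chain of the kind used in this paper can be simulated by conditional inversion. The sketch below uses illustrative parameters (it is not the authors' estimation procedure) to show the key point: control limits set at marginal quantiles keep the nominal per-point false-alarm rate, but serial dependence makes alarms cluster, which is what alters the chart's run-length behavior:

```python
import numpy as np

def clayton_markov_chain(n, theta, rng):
    """Stationary Markov chain with Uniform(0,1) marginals whose
    (u_{t-1}, u_t) joint distribution is a Clayton copula with
    parameter theta > 0, simulated by conditional inversion."""
    u = np.empty(n)
    u[0] = rng.uniform()
    for t in range(1, n):
        w = rng.uniform()
        u[t] = (u[t - 1] ** -theta * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
    return u

rng = np.random.default_rng(3)
u = clayton_markov_chain(100_000, theta=2.0, rng=rng)

# Shewhart-style limits placed at marginal quantiles: the per-point
# false-alarm probability is unchanged by the dependence, but the
# alarms arrive in clusters rather than independently.
lcl, ucl = 0.005, 0.995
alarm_rate = np.mean((u < lcl) | (u > ucl))
lag1_corr = np.corrcoef(u[:-1], u[1:])[0, 1]
print(round(float(alarm_rate), 4), round(float(lag1_corr), 2))
```

For Clayton dependence with theta = 2, Kendall's tau is theta/(theta+2) = 0.5, so the lag-1 association is substantial even though the marginal exceedance rate matches the iid chart.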
  10. By: Michal Sawa; Dariusz Grech
    Abstract: We present a novel method, based on a random matrix approach, that makes it possible to distinguish the respective roles of temporal autocorrelations within a given time series and cross-correlations between different time series. The proposed algorithm is based on properties of the Wigner eigenspectrum of random matrices instead of the commonly used Wishart eigenspectrum methodology. The approach is then applied, qualitatively and quantitatively, to financial data for the stocks composing the WIG (Warsaw Stock Exchange Index).
    Date: 2014–07
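The Wigner benchmark this method relies on is easy to check numerically: for a pure-noise symmetric (GOE) matrix the eigenvalues fill the semicircle support [-2, 2], so eigenvalues escaping that bulk in an empirical correlation-type matrix signal genuine auto- or cross-correlations. A minimal sketch of the null case:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 400
a = rng.standard_normal((n, n))
w = (a + a.T) / np.sqrt(2 * n)   # GOE matrix, scaled so off-diagonal entries have variance 1/n
eig = np.linalg.eigvalsh(w)

# Wigner's semicircle law: the spectrum of this pure-noise matrix
# concentrates on [-2, 2] as n grows
print(round(float(eig.min()), 2), round(float(eig.max()), 2))
```

Comparing an empirical eigenspectrum against this noise-only support is the basic detection device; the paper's contribution is how to attribute bulk-escaping structure to autocorrelations versus cross-correlations.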
  11. By: Drivas, Kyriakos; Economidou, Claire; Tsionas, Efthymios G.
    Abstract: Standard stochastic frontier models estimate log-linear specifications of production technology, represented mostly by production, cost, profit, revenue, and distance frontiers. We develop a methodology for stochastic frontier models of count data allowing for technology- and inefficiency-induced heterogeneity in the data and endogenous regressors. We derive the corresponding log-likelihood function and conditional mean of inefficiency to estimate technology regime-specific inefficiency. We further provide empirical evidence that demonstrates the applicability of the proposed model.
    Keywords: efficiency, Poisson stochastic frontier, mixture, innovation, states
    JEL: C13 C24 C33 C51
    Date: 2014–07–20
  12. By: Jason R. Blevins (Department of Economics, Ohio State University)
    Abstract: In models of strategic interaction, there may be important order of entry effects if one player can credibly commit to an action (e.g., entry) before other players. If one estimates a simultaneous-move model, then the move-order effects will be confounded with the payoffs. This paper considers nonparametric identification and simulation-based estimation of sequential games of complete information. Relative to simultaneous-move games, these models avoid the problem of multiple equilibria and require fewer payoff normalizations. We apply the estimator in several Monte Carlo experiments and to study entry-order effects using data from the airline industry.
    Keywords: static games, sequential games, identification, simulation-based estimation, airline industry
    JEL: C15 C35 C72 L13 L93
    Date: 2014–07
  13. By: Francesco Benedetto; Gaetano Giunta; Loretta Mastroeni
    Abstract: This paper proposes a novel method for assessing the predictability of energy market time series by predicting the entropy of the series. In conventional entropy-based analysis (where the entropy is always estimated ex-post), high entropy values characterize unpredictable series, while more stable series exhibit lower entropy values. Here, we predict ex-ante the entropy of the future behavior of a series, based on the observation of historical data. Our prediction is performed according to the optimum least squares minimization algorithm. Preliminary results, applied to energy commodities, show the efficacy of the proposed method for application to energy market time series.
    Keywords: Entropy analysis, market efficiency, energy commodity, energy time
    JEL: C53 C63 G17 Q47
    Date: 2014–07
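The link between entropy and predictability that motivates the paper can be illustrated on a binarized series. This sketch only reproduces the ex-post entropy measure, not the paper's ex-ante prediction step, and the AR(1) persistence parameter is an illustrative assumption:

```python
import numpy as np

def entropy_bits(counts):
    """Shannon entropy in bits of an empirical count vector."""
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def conditional_entropy(s):
    """H(s_t | s_{t-1}) in bits for a binary up/down sequence:
    near 1 means unpredictable, near 0 means highly predictable."""
    joint = np.zeros((2, 2))
    for a, b in zip(s[:-1], s[1:]):
        joint[a, b] += 1
    return entropy_bits(joint.ravel()) - entropy_bits(joint.sum(axis=1))

rng = np.random.default_rng(4)
# iid returns: the direction of the next move carries no memory
iid = (rng.standard_normal(20_000) > 0).astype(int)
# persistent series (AR(1) with phi = 0.9): the direction is sticky
x = np.zeros(20_000)
for t in range(1, 20_000):
    x[t] = 0.9 * x[t - 1] + rng.standard_normal()
persistent = (x > 0).astype(int)

# iid is close to the 1-bit maximum; the persistent series is well below it
print(round(conditional_entropy(iid), 2), round(conditional_entropy(persistent), 2))
```

The lower conditional entropy of the persistent series is exactly the sense in which "lower entropy means more predictable" in entropy-based market-efficiency analysis.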
  14. By: Azari Soufiani, Hossein; Diao, Hansheng; Lai, Zhenyu; Parkes, David C.
    Abstract: We propose a model for demand estimation in multi-agent, differentiated product settings and present an estimation algorithm that uses reversible jump MCMC techniques to classify agents' types. Our model extends the popular setup in Berry, Levinsohn and Pakes (1995) to allow for the data-driven classification of agents' types using agent-level data. We focus on applications involving data on agents' ranking over alternatives, and present theoretical conditions that establish the identifiability of the model and uni-modality of the likelihood/posterior. Results on both real and simulated data provide support for the scalability of our approach.
    Date: 2013
  15. By: Sepideh Dolatabadi (Queen's University); Morten Ø. Nielsen (Queen's University and CREATES); Ke Xu (Queen's University)
    Abstract: We apply the fractionally cointegrated vector autoregressive (FCVAR) model to analyze the relationship between spot and futures prices in five commodity markets (aluminium, copper, lead, nickel, and zinc). To this end, we first extend the FCVAR model to accommodate deterministic trends in the levels of the processes. The methodological contribution is to provide representation theory for the FCVAR model with deterministic trends, where we show that the presence of the deterministic trend in the process induces both restricted and unrestricted constant terms in the vector error correction model. The consequences for the cointegration rank test are also briefly discussed. In our empirical application we use the data from Figuerola-Ferretti and Gonzalo (2010), who conduct a similar analysis using the usual (non-fractional) cointegrated VAR model. The main conclusion from the empirical analysis is that, when using the FCVAR model, there is more support for the cointegration vector (1,-1)' in the long-run equilibrium relationship between spot and futures prices, and hence less evidence of long-run backwardation, compared to the results from the non-fractional model. Specifically, we reject the hypothesis that the cointegration vector is (1,-1) using standard likelihood ratio tests only for the lead and nickel markets.
    Keywords: backwardation, contango, deterministic trend, fractional cointegration, futures markets, vector error correction model
    JEL: C32 G14
    Date: 2014–07

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.