nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒02‒21
twenty-one papers chosen by
Sune Karlsson
Orebro University

  1. A Two-Stage Estimator for Heterogeneous Panel Models with Common Factors By Carolina Castagnetti; Eduardo Rossi; Lorenzo Trapani
  2. Specification Testing in Structural Nonparametric Cointegration By Chaohua Dong; Jiti Gao
  3. Econometric Time Series Specification Testing in a Class of Multiplicative Error Models By Patrick W Saart; Jiti Gao; Nam Hyun Kim
  4. Inference for functions of partially identified parameters in moment inequality models By Federico Bugni; Ivan Canay; Xiaoxia Shi
  5. WAVELET ANALYSIS OF SPATIO-TEMPORAL DATA By Yasumasa Matsuda
  6. "Benchmarked Empirical Bayes Methods in Multiplicative Area-level Models with Risk Evaluation" By Malay Ghosh; Tatsuya Kubokawa; Yuki Kawakubo
  7. Semiparametric Generalized Long Memory Modelling of GCC Stock Market Returns: A Wavelet Approach By Heni Boubaker; Nadia Sghaier
  8. A Within Estimator for Three-Level Data: An Application to the WTO Effect on Trade Flows By Juyoung Cheong; Do Won Kwak; Kam Ki Tang
  9. Random coefficients on endogenous variables in simultaneous equations models By Matthew Masten
  10. Measurement of causality change between multiple time series By Ryo Kinoshita; Kosuke Oya
  11. Asymptotic Efficiency in Factor Models and Dynamic Panel Data Models By Haruo Iwakura; Ryo Okui
  12. Consistent estimation of breakpoints in time series, with application to wavelet analysis of Citigroup returns By Roberts, Leigh
  13. Generalized instrumental variable models By Andrew Chesher; Adam Rosen
  14. On The Theory and Practice of Singular Spectrum Analysis Forecasting By M. Atikur Rahman Khan; D.S. Poskitt
  15. Trends Cycles and Seasons: Econometric Methods of Signal Extraction By Stephen Pollock
  16. Testing for near I(2) trends when the signal to noise ratio is small By Juselius, Katarina
  17. Seeking Ergodicity in Dynamic Economies By Takashi Kamihigashi; John Stachurski
  18. Volatility persistence in crude oil markets By Amélie Charles; Olivier Darné
  19. Econometrics: An Historical Guide for the Uninitiated By Stephen Pollock
  20. How to estimate the probability of rare long-distance trips By Plötz, Patrick
  21. "A New Improvement Scheme for Approximation Methods of Probability Density Functions" By Akihiko Takahashi; Yukihiro Tsuzuki

  1. By: Carolina Castagnetti (Department of Economics and Management, University of Pavia); Eduardo Rossi (Department of Economics and Management, University of Pavia); Lorenzo Trapani (Faculty of Finance,Cass Business School, City University, London (UK))
    Abstract: This paper considers estimation in a stationary heterogeneous panel model where common unknown factors are present. A two-stage estimator is proposed, based on the CCE estimator (Pesaran, 2006) in the first stage and on an approach similar to the Interactive Effects estimator (Bai, 2009) in the second stage. The asymptotic properties of this estimator are provided, and its finite-sample properties are compared with those of a range of estimators by means of Monte Carlo experiments.
    Keywords: Large panels; Factor error structure; Principal components; Common regressors; Cross-section dependence
    JEL: C33 C38
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:pav:demwpp:demwp0066&r=ecm
  2. By: Chaohua Dong; Jiti Gao
    Abstract: This paper proposes two simple new specification tests, based on the use of an orthogonal series, for a considerable class of cointegrated time series models with endogeneity and nonstationarity. The paper then establishes an asymptotic theory for each of the proposed tests. The first test is initially proposed for the case where the regression function involved is integrable, which fills a gap in the literature, and the second test extends the first to cover a class of non-integrable functions. Endogeneity in two general forms is allowed in the models to be tested. A potential global departure in the alternative hypothesis, which has been overlooked in the literature, is investigated. The finite sample performance of the proposed tests is examined using several simulated examples. The second test is naturally applicable to the case where there is a type of endogeneity inherent in the relationship between the United States aggregate consumers' consumption expenditure and disposable income over the period 1960-2009. Our experience generally shows that the proposed tests are easily implementable and also have stable sizes and good power properties even when the 'distance' between the null hypothesis and a sequence of local alternatives is asymptotically negligible.
    Keywords: Consumption-income model; Endogeneity; Integrated time series; Linear process; Orthogonal series estimation; Parametric specification
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2014-2&r=ecm
  3. By: Patrick W Saart; Jiti Gao; Nam Hyun Kim
    Abstract: In recent years, analysis of financial time series has focused largely on data related to market trading activity. Apart from modelling the conditional variance of returns within the GARCH family of models, attention has recently also been devoted to other market variables, especially volumes, numbers of trades and durations. The financial econometrics literature has focused on Multiplicative Error Models (MEMs), which are considered particularly suited to modelling certain financial variables. The paper establishes an econometric specification approach for MEMs. Several procedures are available in the literature to perform specification testing for MEMs, but the proposed specification testing method is particularly useful within the context of MEMs of financial duration. The paper makes a number of important theoretical contributions. Both the proposed specification testing method and the associated theory are established and evaluated through simulations and real data examples.
    Keywords: Financial duration process; Nonnegative time series; Nonparametric kernel estimation; Semiparametric mixture model
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2014-1&r=ecm
  4. By: Federico Bugni (Institute for Fiscal Studies and Duke University); Ivan Canay (Institute for Fiscal Studies and Northwestern University); Xiaoxia Shi
    Abstract: This paper introduces a new hypothesis test for the null hypothesis H0 : f(θ) = ϒ0, where f(.) is a known function, ϒ0 is a known constant, and θ is a parameter that is partially identified by a moment (in)equality model. The main application of our test is sub-vector inference in moment inequality models, that is, for a multidimensional θ, the function f(θ) = θk selects the kth coordinate of θ. Our test controls asymptotic size uniformly over a large class of distributions of the data and has better asymptotic power properties than currently available methods. In particular, we show that the new test has asymptotic power that dominates the one corresponding to two existing competitors in the literature: subsampling and projection-based tests.
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:05/14&r=ecm
  5. By: Yasumasa Matsuda
    Abstract: This paper aims to provide a wavelet analysis for spatio-temporal data observed at irregularly spaced stations at discrete time points, where the spatial covariances show serious non-stationarity caused by local dependency. A specific example used for the demonstration is US precipitation data observed at about ten thousand stations each month. By a reinterpretation of the Whittle likelihood function for stationary time series, we propose a kind of Bayesian regression model for spatial data whose regressors are given by modified Haar wavelets, and try a spatio-temporal extension by a state space approach. We also propose an empirical Bayes estimation for the parameters, which is regarded as a spatio-temporal extension of the Whittle likelihood estimation originally defined for stationary time series. We compute the extended Whittle estimates and compare the mean square errors of the forecasts with those of some benchmarks to evaluate their goodness for the US precipitation data in August from 1987 to 1997.
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:toh:tergaa:311&r=ecm
  6. By: Malay Ghosh (Department of Statistics, University of Florida,); Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo); Yuki Kawakubo (Graduate School of Economics, The University of Tokyo)
    Abstract: The paper develops empirical Bayes and benchmarked empirical Bayes estimators of positive small area means under multiplicative models. A simple example is estimation of per capita income for small areas. It is now well-understood that small area estimation needs explicit, or at least implicit, use of models. One potential difficulty with model-based estimators is that the overall estimator for a larger geographical area based on the (weighted) sum of the model-based estimators is not necessarily identical to the corresponding direct estimator, such as the overall sample mean. One way to fix such a problem is the so-called benchmarking approach, which modifies the model-based estimators to match the aggregate direct estimator. Benchmarked hierarchical and empirical Bayes estimators have proved to be particularly useful in this regard. However, when estimating positive small area parameters, the conventional squared error or weighted squared error loss subject to the usual benchmark constraint does not necessarily produce positive estimators. Hence, it is necessary to seek other meaningful losses to alleviate this problem. In this paper, we consider the transformed Fay-Herriot model as a multiplicative model for estimating positive small area means, and suggest a weighted Kullback-Leibler divergence as a loss function. We find that the resulting Bayes estimator is the posterior mean and that the corresponding benchmarked Bayes and empirical Bayes estimators retain the positivity constraint. The prediction errors of the suggested empirical Bayes estimators are investigated asymptotically, and their second-order unbiased estimators are provided. In addition, bootstrapped estimators of these prediction errors are also provided. The performance of the suggested procedures is investigated through simulation as well as an empirical study.
    Date: 2014–02
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2014cf918&r=ecm
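The benchmarking constraint described in the abstract can be illustrated with a simple multiplicative (ratio) adjustment. This is only a sketch: it is not the weighted Kullback-Leibler scheme the paper proposes, and the estimates and weights below are hypothetical.

```python
def ratio_benchmark(eb_estimates, weights, direct_overall):
    """Scale area-level estimates so that their weighted sum matches the
    direct overall estimator. Being multiplicative, the adjustment keeps
    positive estimates positive, which is the property the abstract
    emphasizes."""
    aggregate = sum(w * e for w, e in zip(weights, eb_estimates))
    factor = direct_overall / aggregate
    return [e * factor for e in eb_estimates]

# hypothetical per-area empirical Bayes estimates and area weights
eb = [10.0, 20.0, 30.0]
w = [0.2, 0.3, 0.5]
adjusted = ratio_benchmark(eb, w, direct_overall=25.0)
```

After the adjustment the weighted sum of the area estimates equals the direct overall estimate exactly, while every area estimate stays positive.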
  7. By: Heni Boubaker; Nadia Sghaier
    Abstract: This paper proposes a new class of semiparametric generalized long memory models with FIAPARCH errors (SEMIGARMA-FIAPARCH model) that extends the conventional GARMA model to incorporate a nonlinear deterministic trend in the mean equation and to allow for time-varying volatility in the conditional variance equation. The parameters of this model are estimated in a wavelet domain. We provide an empirical application of this model to examine the dynamics of stock market returns in six GCC countries. The empirical results show that the proposed model offers an interesting framework to describe the seasonal long range dependence and the nonlinear deterministic trend in the returns, as well as persistence to shocks in the conditional volatility. We also compare its predictive performance to the traditional long memory model with FIAPARCH errors (FARMA-FIAPARCH model). The predictive results indicate that the proposed model outperforms the FARMA-FIAPARCH model.
    Keywords: semiparametric generalized long memory process, FIAPARCH errors, wavelet domain, stock market returns.
    JEL: C13 C22 C32 G15
    Date: 2014–01–06
    URL: http://d.repec.org/n?u=RePEc:ipg:wpaper:2014-066&r=ecm
  8. By: Juyoung Cheong (School of Economics, The University of Queensland); Do Won Kwak (School of Economics, The University of Queensland); Kam Ki Tang (School of Economics, The University of Queensland)
    Abstract: This paper proposes a within estimator for three-level data, such as time-variant bilateral trade flows. The estimator helps to address the computational difficulties in estimating, for instance, the gravity model of bilateral trade that needs to control for unobserved country-pair and country-time heterogeneity using fixed effects. Unlike the traditional within transformation that removes cross-sectional heterogeneity, the proposed transformation method removes all three types of heterogeneity, each of which varies in two dimensions of three-level data. We demonstrate the properties of the estimator using empirical examples and Monte Carlo simulations. Simulation results show that the proposed estimators with the adjusted standard errors consistently estimate coefficient parameters and perform correct inference when no data are missing or the missing data are random. When missing data are not random, the proposed within transformation can reduce bias to various degrees, depending on the order of demeaning and sources of bias. As an empirical application, we investigate the WTO effect puzzle by applying the proposed estimator to show that WTO membership has a definite positive effect on bilateral trade flows.
    Date: 2014–02–11
    URL: http://d.repec.org/n?u=RePEc:qld:uq2004:501&r=ecm
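For orientation, the familiar two-dimensional within transformation, which the paper generalizes to three-level data, can be sketched as follows; the array below is hypothetical illustration data.

```python
def two_way_demean(y):
    """Two-way within transformation: y[i][j] - row mean - column mean +
    grand mean. This removes additive entity and time fixed effects; the
    paper's estimator extends this idea to the three types of
    heterogeneity present in three-level data."""
    n, t = len(y), len(y[0])
    row = [sum(r) / t for r in y]
    col = [sum(y[i][j] for i in range(n)) / n for j in range(t)]
    grand = sum(row) / n
    return [[y[i][j] - row[i] - col[j] + grand for j in range(t)]
            for i in range(n)]

# data generated purely by additive fixed effects is wiped out entirely
y = [[1 + 3, 1 + 5], [2 + 3, 2 + 5]]
z = two_way_demean(y)
```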
  9. By: Matthew Masten
    Abstract: This paper considers a classical linear simultaneous equations model with random coefficients on the endogenous variables. Simultaneous equations models are used to study social interactions, strategic interactions between firms, and market equilibrium. Random coefficient models allow for heterogeneous marginal effects. For two-equation systems, I give two sets of sufficient conditions for point identification of the coefficients' marginal distributions conditional on exogenous covariates. The first requires full support instruments, but allows for nearly arbitrary distributions of unobservables. The second allows for continuous instruments without full support, but places tail restrictions on the distributions of unobservables. I show that a nonparametric sieve maximum likelihood estimator for these distributions is consistent. I apply my results to the Add Health data to analyze the social determinants of obesity.
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:01/14&r=ecm
  10. By: Ryo Kinoshita (Graduate School of Economics, Osaka University); Kosuke Oya (Graduate School of Economics & Center for the Study of Finance and Insurance, Osaka University)
    Abstract: Structural change is gauged by the change of parameters in the model. In the case of a multiple time series model, the causality between the time series also changes when there is a structural change. However, the magnitude of the change in causality is not clear in the case of structural change. We explore a measure of causality change between time series and propose a test statistic for whether there is any significant change in the causal relationship, using the frequency domain causality measures given by Geweke (1982) and Hosoya (1991). These procedures can be applied to error correction models for non-stationary time series. The properties of the measure and the test statistic are examined through Monte Carlo simulation. As an example of application, the change in causality between United States and Japanese stock indexes is tested.
    Keywords: Causality, Frequency domain, Error correction model, Structural breaks
    JEL: C01 C19
    Date: 2014–02
    URL: http://d.repec.org/n?u=RePEc:osk:wpaper:1409&r=ecm
  11. By: Haruo Iwakura (Graduate School of Economics, Kyoto University); Ryo Okui (Institute of Economic Research, Kyoto University)
    Abstract: This paper studies asymptotic efficiency in factor models with serially correlated errors and dynamic panel data models with interactive effects. We derive the efficiency bound for the estimation of factors, factor loadings and common parameters that describe the dynamic structure. We use double asymptotics under which both the cross-sectional sample size and the length of the time series tend to infinity. The results show that the efficiency bound for factors is not affected by the presence of unknown factor loadings and common parameters, and analogous results hold for the bounds for factor loadings and common parameters. The efficiency bound is derived by using an infinite-dimensional convolution theorem. Perturbation of the infinite-dimensional parameters, which constitutes an important step in the derivation of the efficiency bound, is nontrivial and is discussed in detail.
    Keywords: asymptotic efficiency; convolution theorem; double asymptotics; dynamic panel data model; factor model; interactive effects.
    JEL: C13 C23
    Date: 2014–02
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:887&r=ecm
  12. By: Roberts, Leigh
    Abstract: Simple and intuitive non-parametric methods are provided for estimating variance change points for time series data. Only slight alterations to existing open-source computer code applying CUSUM methods for estimating breakpoints are required to apply our proposed techniques. Our approach, apparently new in this context, is first to define two artificial time series of double the length of the original by reflective continuations of the original. We then search for breakpoints forwards and backwards through each of these symmetric extensions to the original time series. A novel feature of this paper is that we are able to identify common breakpoints for multiple time series, even when they collect data at different frequencies. In particular, our methods facilitate the reconciliation of breakpoint outputs from the two standard wavelet filters. Simulation results in this paper indicate that our methods produce accurate results for time series exhibiting both long and short term correlation; and we illustrate by an application to Citigroup stock returns for the last thirty years.
    Keywords: Breakpoint, Variance change point, Model-free, Non-parametric, R programming suite, R package waveslim, Wavelets, DWT (discrete wavelet transform), MODWT (maximal overlap discrete wavelet transform), MRA (multiresolution analysis), CUSUM (cumulative sum of squares), Cluster analysis, Change point
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:vuw:vuwecf:3169&r=ecm
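A minimal sketch of the two ingredients the abstract combines: a CUSUM-of-squares (Inclan-Tiao style) variance change point estimator and a reflective continuation of the series. The simulated data and tolerance are illustrative assumptions, not taken from the paper.

```python
import random

def variance_breakpoint(x):
    """Estimate a single variance change point with the CUSUM-of-squares
    statistic D_k = C_k / C_n - k / n, where C_k is the cumulative sum of
    squares; the breakpoint estimate is the k maximizing |D_k|."""
    n = len(x)
    sq = [v * v for v in x]
    total = sum(sq)
    c, best_k, best_d = 0.0, 0, -1.0
    for k in range(1, n):
        c += sq[k - 1]
        d = abs(c / total - k / n)
        if d > best_d:
            best_d, best_k = d, k
    return best_k

def reflective_extension(x):
    """One of the two symmetric continuations: append the series reversed,
    giving a doubled-length series through which breakpoints can then be
    searched forwards and backwards."""
    return x + x[::-1]

random.seed(0)
series = ([random.gauss(0.0, 1.0) for _ in range(200)]
          + [random.gauss(0.0, 3.0) for _ in range(200)])
k_hat = variance_breakpoint(series)  # typically lands near 200
```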
  13. By: Andrew Chesher (Institute for Fiscal Studies and University College London); Adam Rosen (Institute for Fiscal Studies and University College London)
    Abstract: The ability to allow for flexible forms of unobserved heterogeneity is an essential ingredient in modern microeconometrics. In this paper we extend the application of instrumental variable (IV) methods to a wide class of problems in which multiple values of unobservable variables can be associated with particular combinations of observed endogenous and exogenous variables. In our Generalized Instrumental Variable (GIV) models, in contrast to traditional IV models, the mapping from unobserved heterogeneity to endogenous variables need not admit a unique inverse. The class of GIV models allows unobservables to be multivariate and to enter non-separably into the determination of endogenous variables, thereby removing strong practical limitations on the role of unobserved heterogeneity. Important examples include models with discrete or mixed continuous/discrete outcomes and continuous unobservables, and models with excess heterogeneity where many combinations of different values of multiple unobserved variables, such as random coefficients, can deliver the same realizations of outcomes. We use tools from random set theory to study identification in such models and provide a sharp characterization of the identified set of structures admitted. We demonstrate the application of our analysis to a continuous outcome model with an interval-censored endogenous explanatory variable.
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:04/14&r=ecm
  14. By: M. Atikur Rahman Khan; D.S. Poskitt
    Abstract: Theoretical results on the properties of forecasts obtained using singular spectrum analysis are presented in this paper. The mean squared forecast error is derived under broad regularity conditions, and it is shown that the forecasts obtained in practice will converge to their population ensemble counterparts. The theoretical results are illustrated by examining the performance of singular spectrum analysis forecasts when applied to autoregressive processes and a random walk process. Simulation experiments suggest that the asymptotic properties developed are reflected in observed finite sample behaviour. Empirical applications using real world data sets indicate that forecasts based on singular spectrum analysis are competitive with other methods currently in vogue.
    Keywords: Linear recurrent formula, Mean squared forecast error, Signal dimension, Window length.
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2014-3&r=ecm
  15. By: Stephen Pollock
    Abstract: Alternative methods of trend extraction and of seasonal adjustment are described that operate in the time domain and in the frequency domain. The time-domain methods that are implemented in the TRAMO–SEATS and the STAMP programs are described and compared. An abbreviated time-domain method of seasonal adjustment that is implemented in the IDEOLOG program is also described. Finite-sample versions of the Wiener–Kolmogorov filter are described that can be used to implement the methods in a common way. The frequency-domain method, which is also implemented in the IDEOLOG program, employs an ideal frequency-selective filter that depends on identifying the ordinates of the Fourier transform of a detrended data sequence that should lie in the pass band of the filter and those that should lie in its stop band. Filters of this nature can be used both for extracting a low-frequency cyclical component of the data and for extracting the seasonal component.
    Keywords: Signal extraction, Linear filtering, Frequency-domain analysis, Seasonal adjustment
    JEL: E32 C22
    Date: 2014–02
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:14/04&r=ecm
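The frequency-domain idea described above, zeroing the Fourier ordinates that fall in the stop band, can be sketched with a brute-force DFT. The cutoff convention here is an assumption for illustration, not the IDEOLOG implementation.

```python
import cmath
import math

def ideal_lowpass(x, keep):
    """Ideal frequency-selective filter: take the DFT of the sequence, zero
    every ordinate whose frequency index exceeds `keep`, and invert.
    The O(n^2) DFT is fine for a short illustration."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if min(k, n - k) > keep:  # frequency of ordinate k
            X[k] = 0.0
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n
            for t in range(n)]

n = 64
low = [math.cos(2 * math.pi * 2 * t / n) for t in range(n)]   # slow cycle
noisy = [low[t] + math.cos(2 * math.pi * 20 * t / n) for t in range(n)]
smooth = ideal_lowpass(noisy, keep=5)  # recovers the slow cycle
```

Because both components sit exactly on Fourier frequencies, the filtered series matches the slow cycle up to floating-point error.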
  16. By: Juselius, Katarina
    Abstract: Researchers seldom find evidence of I(2) in exchange rates, prices, and other macroeconomic time series when they test the order of integration using univariate Dickey-Fuller tests. In contrast, when using the multivariate ML trace test we frequently find double unit roots in the data. Our paper demonstrates by simulations that this often happens when the signal-to-noise ratio is small.
    Keywords: univariate and multivariate unit root tests, double unit roots, near I(2)
    JEL: C1 C18 C22 C32 C52
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:20148&r=ecm
  17. By: Takashi Kamihigashi; John Stachurski
    Abstract: In both estimation and calibration studies, the notion of ergodicity plays a fundamental role, permitting time series averages to be regarded as approximations to population means. As it turns out, many economic models routinely used for quantitative modeling do not satisfy the classical ergodicity conditions. In this paper we develop a new set of ergodicity conditions oriented towards economic dynamics. We also provide sufficient conditions suitable for a variety of applications. Notably, the classical ergodicity results can be recovered as a special case of our main theorem.
    Keywords: Ergodicity, consistency, calibration
    JEL: C62 C63
    Date: 2014–02–12
    URL: http://d.repec.org/n?u=RePEc:ipg:wpaper:2014-086&r=ecm
  18. By: Amélie Charles (Audencia Recherche - Audencia); Olivier Darné (LEMNA - Laboratoire d'économie et de management de Nantes Atlantique - Université de Nantes : EA4272)
    Abstract: Financial market participants and policy-makers can benefit from a better understanding of how shocks can affect volatility over time. This study assesses the impact of structural changes and outliers on volatility persistence of three crude oil markets - Brent, West Texas Intermediate (WTI) and Organization of Petroleum Exporting Countries (OPEC) - between January 2, 1985 and June 17, 2011. We identify outliers using a new semi-parametric test based on conditional heteroscedasticity models. These large shocks can be associated with particular events, such as the invasion of Kuwait by Iraq, Operation Desert Storm, Operation Desert Fox, and the Global Financial Crisis, as well as OPEC announcements on production reduction or US announcements on crude inventories. We show that outliers can bias (i) the estimates of the parameters of the equation governing volatility dynamics; (ii) the regularity and non-negativity conditions of GARCH-type models (GARCH, IGARCH, FIGARCH and HYGARCH); and (iii) the detection of structural breaks in volatility, and thus the estimation of the persistence of the volatility. Therefore, taking the outliers into account in the volatility modelling process may improve the understanding of volatility in crude oil markets.
    Keywords: Crude oil ; Volatility persistence ; Structural breaks
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-00940312&r=ecm
  19. By: Stephen Pollock
    Abstract: This essay was written to accompany a lecture to beginning students of the course in Economic Analytics, which is taught in the Institute of Econometrics of the University of Lodz in Poland. It provides, within a few pages, a broad historical account of the development of econometrics. It begins by describing the origin of regression analysis and concludes with an account of cointegration analysis. The purpose of the essay is to provide a context in which students can locate various aspects of econometric analysis. A distinction must be made between the means by which new ideas were propagated and the manner and circumstances in which they originated. This account is concerned primarily with the propagation of the ideas.
    Keywords: econometrics, regression analysis, cointegration analysis, statistical analysis
    JEL: B16 B23
    Date: 2014–02
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:14/05&r=ecm
  20. By: Plötz, Patrick
    Abstract: The vehicle distances travelled by individual users can vary strongly between days. This is particularly problematic for electric vehicles, since trips longer than the electric range clearly reduce the vehicle's utility. Here we estimate the number of days with driving distance larger than a given threshold for individual users, based on their observed driving behaviour. The general formalism is developed, and estimates for the main observable and standard errors are derived under the assumption of individually log-normally distributed daily vehicle kilometres travelled. Numerical simulations of driving profiles demonstrate the validity and accuracy of the analytical results.
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:zbw:fisisi:s12014&r=ecm
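Under the log-normal assumption in the abstract, the probability that a day's driving exceeds a threshold, and hence the expected number of such days per year, has a closed form in the normal CDF. The parameter values below are hypothetical, not taken from the paper.

```python
import math

def prob_day_exceeds(threshold_km, mu, sigma):
    """P(daily distance > threshold) when log distance is Normal(mu, sigma):
    the log-normal tail 1 - Phi((ln s - mu) / sigma), computed via erfc."""
    z = (math.log(threshold_km) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def expected_days_per_year(threshold_km, mu, sigma, driving_days=365):
    """Expected number of days per year on which driving exceeds the
    threshold, under the same log-normal assumption."""
    return driving_days * prob_day_exceeds(threshold_km, mu, sigma)

# hypothetical driver: median 40 km/day (mu = ln 40), log-sd 0.8
p = prob_day_exceeds(150.0, math.log(40.0), 0.8)
days = expected_days_per_year(150.0, math.log(40.0), 0.8)
```

For this hypothetical driver, trips beyond 150 km occur on roughly five percent of days, i.e. a few dozen times a year at most.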
  21. By: Akihiko Takahashi (Faculty of Economics, The University of Tokyo); Yukihiro Tsuzuki (Graduate School of Economics, The University of Tokyo)
    Abstract: This paper develops a new scheme for improving an approximation method for a probability density function, inspired by the idea of best approximation in an inner product space. Moreover, we apply "Dykstra's cyclic projections algorithm" for its implementation. Numerical examples for application to an asymptotic expansion method in option pricing demonstrate the effectiveness of our scheme under the SABR model.
    Date: 2014–02
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2014cf917&r=ecm

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.