
on Econometrics 
By:  Carolina Castagnetti (Department of Economics and Management, University of Pavia); Eduardo Rossi (Department of Economics and Management, University of Pavia); Lorenzo Trapani (Faculty of Finance, Cass Business School, City University, London (UK)) 
Abstract:  This paper considers estimation in a stationary heterogeneous panel model in which unknown common factors are present. A two-stage estimator is proposed, based on the CCE estimator (Pesaran, 2006) in the first stage and on an approach similar to the Interactive Effects estimator (Bai, 2009) in the second stage. The asymptotic properties of this estimator are derived, and the finite-sample properties of a range of estimators are compared by means of Monte Carlo experiments. 
Keywords:  Large panels; Factor error structure; Principal components; Common regressors; Cross-section dependence 
JEL:  C33 C38 
Date:  2014–01 
URL:  http://d.repec.org/n?u=RePEc:pav:demwpp:demwp0066&r=ecm 
By:  Chaohua Dong; Jiti Gao 
Abstract:  This paper proposes two simple new specification tests, based on orthogonal series, for a broad class of cointegrated time series models with endogeneity and nonstationarity. The paper then establishes an asymptotic theory for each of the proposed tests. The first test is proposed for the case where the regression function involved is integrable, which fills a gap in the literature; the second test extends the first to cover a class of non-integrable functions. Endogeneity in two general forms is allowed in the models to be tested. A potential global departure in the alternative hypothesis, which has been overlooked in the literature, is investigated. The finite-sample performance of the proposed tests is examined using several simulated examples. The second test is also applied to the case where a form of endogeneity is inherent in the relationship between United States aggregate consumption expenditure and disposable income over the period 1960–2009. Our experience shows that the proposed tests are easy to implement and have stable sizes and good power properties even when the 'distance' between the null hypothesis and a sequence of local alternatives is asymptotically negligible. 
Keywords:  Consumption-income model; Endogeneity; Integrated time series; Linear process; Orthogonal series estimation; Parametric specification 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20142&r=ecm 
By:  Patrick W Saart; Jiti Gao; Nam Hyun Kim 
Abstract:  In recent years, the analysis of financial time series has focused largely on data related to market trading activity. Beyond modelling the conditional variance of returns within the GARCH family of models, attention has also been devoted to other market variables, especially volumes, numbers of trades and durations. The financial econometrics literature has focused on Multiplicative Error Models (MEMs), which are considered particularly suited to modelling certain financial variables. This paper establishes an econometric specification approach for MEMs. Several procedures are available in the literature for specification testing of MEMs, but the method proposed here is particularly useful in the context of MEMs of financial duration. The paper makes a number of theoretical contributions. Both the proposed specification testing method and the associated theory are established and evaluated through simulations and real-data examples. 
Keywords:  Financial duration process; Nonnegative time series; Nonparametric kernel estimation; Semiparametric mixture model 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20141&r=ecm 
By:  Federico Bugni (Institute for Fiscal Studies and Duke University); Ivan Canay (Institute for Fiscal Studies and Northwestern University); Xiaoxia Shi 
Abstract:  This paper introduces a new hypothesis test for the null hypothesis H0: f(θ) = γ0, where f(.) is a known function, γ0 is a known constant, and θ is a parameter that is partially identified by a moment (in)equality model. The main application of our test is subvector inference in moment inequality models: for a multidimensional θ, the function f(θ) = θk selects the k-th coordinate of θ. Our test controls asymptotic size uniformly over a large class of distributions of the data and has better asymptotic power properties than currently available methods. In particular, we show that the new test has asymptotic power that dominates that of two existing competitors in the literature: subsampling and projection-based tests. 
Date:  2014–01 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:05/14&r=ecm 
By:  Yasumasa Matsuda 
Abstract:  This paper provides a wavelet analysis for spatio-temporal data observed at irregularly spaced stations at discrete time points, where the spatial covariances show serious nonstationarity caused by local dependency. A specific example used for the demonstration is US precipitation data observed at about ten thousand stations each month. By reinterpreting the Whittle likelihood function for stationary time series, we propose a Bayesian regression model for spatial data whose regressors are given by modified Haar wavelets, and develop a spatio-temporal extension via a state space approach. We also propose an empirical Bayes estimation for the parameters, which may be regarded as a spatio-temporal extension of Whittle likelihood estimation, originally defined for stationary time series. We compute the extended Whittle estimates and compare the mean square errors of the resulting forecasts with those of some benchmarks to evaluate their performance on the US precipitation data in August from 1987 to 1997. 
Date:  2014–01 
URL:  http://d.repec.org/n?u=RePEc:toh:tergaa:311&r=ecm 
By:  Malay Ghosh (Department of Statistics, University of Florida,); Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo); Yuki Kawakubo (Graduate School of Economics, The University of Tokyo) 
Abstract:  The paper develops empirical Bayes and benchmarked empirical Bayes estimators of positive small area means under multiplicative models. A simple example is the estimation of per capita income for small areas. It is now well understood that small area estimation needs explicit, or at least implicit, use of models. One potential difficulty with model-based estimators is that the overall estimator for a larger geographical area, based on a (weighted) sum of the model-based estimators, is not necessarily identical to the corresponding direct estimator, such as the overall sample mean. One way to fix this problem is the so-called benchmarking approach, which modifies the model-based estimators to match the aggregate direct estimator. Benchmarked hierarchical and empirical Bayes estimators have proved particularly useful in this regard. However, when estimating positive small area parameters, the conventional squared error or weighted squared error loss subject to the usual benchmark constraint does not necessarily produce positive estimators. Hence, it is necessary to seek other meaningful losses to alleviate this problem. In this paper, we consider the transformed Fay–Herriot model as a multiplicative model for estimating positive small area means, and suggest a weighted Kullback–Leibler divergence as a loss function. We find that the resulting Bayes estimator is the posterior mean and that the corresponding benchmarked Bayes and empirical Bayes estimators retain positivity. The prediction errors of the suggested empirical Bayes estimators are investigated asymptotically, and their second-order unbiased estimators are provided. In addition, bootstrapped estimators of these prediction errors are also provided. The performance of the suggested procedures is investigated through simulation as well as an empirical study. 
Date:  2014–02 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2014cf918&r=ecm 
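The benchmarking constraint described in the abstract above can be illustrated with a minimal sketch. The ratio form below is one standard benchmarking rule that happens to preserve positivity; it is an illustrative assumption, not the weighted Kullback–Leibler solution derived in the paper, and the function name and toy numbers are hypothetical:

```python
def ratio_benchmark(model_est, weights, direct_agg):
    # Ratio benchmarking: rescale the model-based small area estimates by
    # a common factor so that their weighted aggregate matches the direct
    # estimator. A positive common factor preserves positivity.
    model_agg = sum(w * m for w, m in zip(weights, model_est))
    factor = direct_agg / model_agg
    return [m * factor for m in model_est]

# Toy example: three areas whose model-based aggregate (23.0) must be
# benchmarked to the direct aggregate estimate 25.0.
adjusted = ratio_benchmark([10.0, 20.0, 30.0], [0.2, 0.3, 0.5], 25.0)
print(adjusted)
```

Because every estimate is rescaled by the same positive factor, positivity is retained, which is exactly the property the abstract notes can fail under squared error loss with the usual benchmark constraint.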
By:  Heni Boubaker; Nadia Sghaier 
Abstract:  This paper proposes a new class of semiparametric generalized long memory models with FIAPARCH errors (SEMIGARMA-FIAPARCH model) that extends the conventional GARMA model to incorporate a nonlinear deterministic trend in the mean equation and to allow for time-varying volatility in the conditional variance equation. The parameters of this model are estimated in the wavelet domain. We provide an empirical application of this model to the dynamics of stock market returns in six GCC countries. The empirical results show that the proposed model offers an interesting framework for describing the seasonal long-range dependence and the nonlinear deterministic trend in returns, as well as persistence to shocks in the conditional volatility. We also compare its predictive performance to that of the traditional long memory model with FIAPARCH errors (FARMA-FIAPARCH model). The predictive results indicate that the proposed model outperforms the FARMA-FIAPARCH model. 
Keywords:  semiparametric generalized long memory process, FIAPARCH errors, wavelet domain, stock market returns 
JEL:  C13 C22 C32 G15 
Date:  2014–01–06 
URL:  http://d.repec.org/n?u=RePEc:ipg:wpaper:2014066&r=ecm 
By:  Juyoung Cheong (School of Economics, The University of Queensland); Do Won Kwak (School of Economics, The University of Queensland); Kam Ki Tang (School of Economics, The University of Queensland) 
Abstract:  This paper proposes a within estimator for three-level data, such as time-variant bilateral trade flows. The estimator helps to address the computational difficulties of estimating, for instance, the gravity model of bilateral trade, which needs to control for unobserved country-pair and country-time heterogeneity using fixed effects. Unlike the traditional within transformation, which removes cross-sectional heterogeneity, the proposed transformation removes all three types of heterogeneity, each of which varies in two dimensions of the three-level data. We demonstrate the properties of the estimator using empirical examples and Monte Carlo simulations. Simulation results show that the proposed estimators with the adjusted standard errors consistently estimate the coefficient parameters and deliver correct inference when no data are missing or data are missing at random. When missing data are not random, the proposed within transformation can reduce bias to varying degrees, depending on the order of demeaning and the sources of bias. As an empirical application, we investigate the WTO effect puzzle, applying the proposed estimator to show that WTO membership has a definite positive effect on bilateral trade flows. 
Date:  2014–02–11 
URL:  http://d.repec.org/n?u=RePEc:qld:uq2004:501&r=ecm 
By:  Matthew Masten 
Abstract:  This paper considers a classical linear simultaneous equations model with random coefficients on the endogenous variables. Simultaneous equations models are used to study social interactions, strategic interactions between firms, and market equilibrium. Random coefficient models allow for heterogeneous marginal effects. For two-equation systems, I give two sets of sufficient conditions for point identification of the coefficients' marginal distributions conditional on exogenous covariates. The first requires full support instruments, but allows for nearly arbitrary distributions of unobservables. The second allows for continuous instruments without full support, but places tail restrictions on the distributions of unobservables. I show that a nonparametric sieve maximum likelihood estimator for these distributions is consistent. I apply my results to the Add Health data to analyze the social determinants of obesity. 
Date:  2014–01 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:01/14&r=ecm 
By:  Ryo Kinoshita (Graduate School of Economics, Osaka University); Kosuke Oya (Graduate School of Economics & Center for the Study of Finance and Insurance, Osaka University) 
Abstract:  Structural change is gauged by the change of parameters in a model. In a multiple time series model, the causality between the series also changes when there is a structural change; however, the magnitude of the change in causality is not obvious from the structural change itself. We explore a measure of causality change between time series and propose a test statistic for whether there is a significant change in the causal relationship, using the frequency domain causality measures of Geweke (1982) and Hosoya (1991). These procedures can also be applied to the error correction model for nonstationary time series. The properties of the measure and the test statistic are examined through Monte Carlo simulation. As an empirical application, we test for a change in causality between United States and Japanese stock indexes. 
Keywords:  Causality, Frequency domain, Error correction model, Structural breaks 
JEL:  C01 C19 
Date:  2014–02 
URL:  http://d.repec.org/n?u=RePEc:osk:wpaper:1409&r=ecm 
By:  Haruo Iwakura (Graduate School of Economics, Kyoto University); Ryo Okui (Institute of Economic Research, Kyoto University) 
Abstract:  This paper studies asymptotic efficiency in factor models with serially correlated errors and dynamic panel data models with interactive effects. We derive the efficiency bound for the estimation of factors, factor loadings and the common parameters that describe the dynamic structure. We use double asymptotics under which both the cross-sectional sample size and the length of the time series tend to infinity. The results show that the efficiency bound for factors is not affected by the presence of unknown factor loadings and common parameters, and analogous results hold for the bounds for factor loadings and common parameters. The efficiency bound is derived using an infinite-dimensional convolution theorem. Perturbation of the infinite-dimensional parameters, which constitutes an important step in the derivation of the efficiency bound, is nontrivial and is discussed in detail. 
Keywords:  asymptotic efficiency; convolution theorem; double asymptotics; dynamic panel data model; factor model; interactive effects. 
JEL:  C13 C23 
Date:  2014–02 
URL:  http://d.repec.org/n?u=RePEc:kyo:wpaper:887&r=ecm 
By:  Roberts, Leigh 
Abstract:  Simple and intuitive nonparametric methods are provided for estimating variance change points in time series data. Only slight alterations to existing open-source computer code applying CUSUM methods for estimating breakpoints are required to apply the proposed techniques. Our approach, apparently new in this context, is first to define two artificial time series of double the length of the original by reflective continuations of the original. We then search for breakpoints forwards and backwards through each of these symmetric extensions of the original time series. A novel feature of this paper is that we are able to identify common breakpoints for multiple time series, even when they collect data at different frequencies. In particular, our methods facilitate the reconciliation of breakpoint outputs from the two standard wavelet filters. Simulation results indicate that our methods produce accurate results for time series exhibiting both long- and short-term correlation, and we illustrate with an application to Citigroup stock returns over the last thirty years. 
Keywords:  Breakpoint, Variance change point, Model-free, Nonparametric, R programming suite, R package waveslim, Wavelets, DWT (discrete wavelet transform), MODWT (maximal overlap discrete wavelet transform), MRA (multiresolution analysis), CUSUM (cumulative sum of squares), Cluster analysis, Change point 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:vuw:vuwecf:3169&r=ecm 
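A minimal sketch of the two ingredients described above, reflective continuation and a CUSUM-of-squares breakpoint search, might look as follows. The centred Inclán–Tiao statistic is used here as a generic CUSUM-of-squares variant, and the function names are illustrative, not the paper's or waveslim's API:

```python
def reflect(x):
    # Reflective continuation: append the series to its own mirror image,
    # doubling the length while keeping the variance profile symmetric.
    return x + x[::-1]

def cusum_sq_breakpoint(x):
    # Centred CUSUM of squares: the most likely variance change point
    # maximises |C_k / C_n - k / n|, where C_k is the cumulative sum of
    # squared observations up to index k.
    sq = [v * v for v in x]
    total = sum(sq)
    c, best_k, best_d = 0.0, 0, 0.0
    for k, s in enumerate(sq, start=1):
        c += s
        d = abs(c / total - k / len(x))
        if d > best_d:
            best_d, best_k = d, k
    return best_k

# Deterministic toy series whose variance jumps at t = 100.
x = [0.1, -0.1] * 50 + [1.0, -1.0] * 50
ext = reflect(x)  # doubled, symmetric series to search through
print(cusum_sq_breakpoint(x))  # -> 100
```

Searching forwards and backwards through `ext` (and through the other reflection, prepending the mirror image) is then a matter of running the same breakpoint scan on the extended series, as the abstract describes.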
By:  Andrew Chesher (Institute for Fiscal Studies and University College London); Adam Rosen (Institute for Fiscal Studies and University College London) 
Abstract:  The ability to allow for flexible forms of unobserved heterogeneity is an essential ingredient in modern microeconometrics. In this paper we extend the application of instrumental variable (IV) methods to a wide class of problems in which multiple values of unobservable variables can be associated with particular combinations of observed endogenous and exogenous variables. In our Generalized Instrumental Variable (GIV) models, in contrast to traditional IV models, the mapping from unobserved heterogeneity to endogenous variables need not admit a unique inverse. The class of GIV models allows unobservables to be multivariate and to enter nonseparably into the determination of endogenous variables, thereby removing strong practical limitations on the role of unobserved heterogeneity. Important examples include models with discrete or mixed continuous/discrete outcomes and continuous unobservables, and models with excess heterogeneity where many combinations of different values of multiple unobserved variables, such as random coefficients, can deliver the same realizations of outcomes. We use tools from random set theory to study identification in such models and provide a sharp characterization of the identified set of structures admitted. We demonstrate the application of our analysis to a continuous outcome model with an interval-censored endogenous explanatory variable. 
Date:  2014–01 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:04/14&r=ecm 
By:  M. Atikur Rahman Khan; D.S. Poskitt 
Abstract:  Theoretical results on the properties of forecasts obtained using singular spectrum analysis are presented in this paper. The mean squared forecast error is derived under broad regularity conditions, and it is shown that the forecasts obtained in practice will converge to their population ensemble counterparts. The theoretical results are illustrated by examining the performance of singular spectrum analysis forecasts when applied to autoregressive processes and a random walk process. Simulation experiments suggest that the asymptotic properties developed are reflected in observed finite sample behaviour. Empirical applications using real world data sets indicate that forecasts based on singular spectrum analysis are competitive with other methods currently in vogue. 
Keywords:  Linear recurrent formula, Mean squared forecast error, Signal dimension, Window length. 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20143&r=ecm 
By:  Stephen Pollock 
Abstract:  Alternative methods of trend extraction and seasonal adjustment are described that operate in the time domain and in the frequency domain. The time-domain methods implemented in the TRAMO–SEATS and STAMP programs are described and compared. An abbreviated time-domain method of seasonal adjustment implemented in the IDEOLOG program is also described. Finite-sample versions of the Wiener–Kolmogorov filter are described that can be used to implement the methods in a common way. The frequency-domain method, which is also implemented in the IDEOLOG program, employs an ideal frequency-selective filter that depends on identifying the ordinates of the Fourier transform of a detrended data sequence that should lie in the pass band of the filter and those that should lie in its stop band. Filters of this nature can be used both for extracting a low-frequency cyclical component of the data and for extracting the seasonal component. 
Keywords:  Signal extraction, Linear filtering, Frequency-domain analysis, Seasonal adjustment 
JEL:  E32 C22 
Date:  2014–02 
URL:  http://d.repec.org/n?u=RePEc:lec:leecon:14/04&r=ecm 
By:  Juselius, Katarina 
Abstract:  Researchers seldom find evidence of I(2) in exchange rates, prices, and other macroeconomic time series when they test the order of integration using univariate Dickey–Fuller tests. In contrast, when using the multivariate ML trace test we frequently find double unit roots in the data. Our paper demonstrates by simulations that this often happens when the signal-to-noise ratio is small. 
Keywords:  univariate and multivariate unit root tests, double unit roots, near I(2) 
JEL:  C1 C18 C22 C32 C52 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:zbw:ifwedp:20148&r=ecm 
By:  Takashi Kamihigashi; John Stachurski 
Abstract:  In both estimation and calibration studies, the notion of ergodicity plays a fundamental role, permitting time series averages to be regarded as approximations to population means. As it turns out, many economic models routinely used for quantitative work do not satisfy the classical ergodicity conditions. In this paper we develop a new set of ergodicity conditions oriented towards economic dynamics. We also provide sufficient conditions suitable for a variety of applications. Notably, the classical ergodicity results can be recovered as a special case of our main theorem. 
Keywords:  Ergodicity, consistency, calibration 
JEL:  C62 C63 
Date:  2014–02–12 
URL:  http://d.repec.org/n?u=RePEc:ipg:wpaper:2014086&r=ecm 
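The role ergodicity plays here, licensing time averages as approximations to population means, can be illustrated with a stationary AR(1), a textbook ergodic process. The parameter values and function name below are illustrative, not from the paper:

```python
import random

def ar1_time_average(n, c=2.0, phi=0.5, seed=0):
    # Simulate a stationary Gaussian AR(1), x_{t+1} = c + phi * x_t + eps,
    # and return the time average. For an ergodic process this converges
    # to the population mean c / (1 - phi), here 4.0.
    rng = random.Random(seed)
    x, total = c / (1 - phi), 0.0  # start at the stationary mean
    for _ in range(n):
        x = c + phi * x + rng.gauss(0.0, 1.0)
        total += x
    return total / n

print(ar1_time_average(200000))  # close to the population mean 4.0
```

The point of the paper is precisely that many quantitative economic models fail the classical conditions under which this convergence is guaranteed, motivating weaker conditions tailored to economic dynamics.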
By:  Amélie Charles (Audencia Recherche  Audencia); Olivier Darné (LEMNA  Laboratoire d'économie et de management de Nantes Atlantique  Université de Nantes : EA4272) 
Abstract:  Financial market participants and policymakers can benefit from a better understanding of how shocks affect volatility over time. This study assesses the impact of structural changes and outliers on the volatility persistence of three crude oil markets, Brent, West Texas Intermediate (WTI) and the Organization of Petroleum Exporting Countries (OPEC), between January 2, 1985 and June 17, 2011. We identify outliers using a new semiparametric test based on conditional heteroscedasticity models. These large shocks can be associated with particular events, such as the invasion of Kuwait by Iraq, Operation Desert Storm, Operation Desert Fox, and the Global Financial Crisis, as well as OPEC announcements on production cuts and US announcements on crude inventories. We show that outliers can bias (i) the estimates of the parameters of the equation governing volatility dynamics; (ii) the regularity and non-negativity conditions of GARCH-type models (GARCH, IGARCH, FIGARCH and HYGARCH); and (iii) the detection of structural breaks in volatility, and thus the estimation of volatility persistence. Therefore, taking outliers into account in the volatility modelling process may improve the understanding of volatility in crude oil markets. 
Keywords:  Crude oil ; Volatility persistence ; Structural breaks 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal00940312&r=ecm 
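The biasing effect of a single large outlier can be illustrated without committing to any particular GARCH estimator: below, a simulated GARCH(1,1) series is contaminated with one additive outlier, and the lag-1 autocorrelation of squared returns, a crude proxy for measured volatility persistence, collapses towards zero. The parameter values, seed and function names are illustrative assumptions, not the paper's outlier test:

```python
import random

def simulate_garch(n, omega=0.1, alpha=0.1, beta=0.8, seed=1):
    # Simulate a Gaussian GARCH(1,1) return series; with these defaults
    # the unconditional variance is omega / (1 - alpha - beta) = 1.
    rng = random.Random(seed)
    var, returns = omega / (1.0 - alpha - beta), []
    for _ in range(n):
        r = rng.gauss(0.0, 1.0) * var ** 0.5
        returns.append(r)
        var = omega + alpha * r * r + beta * var
    return returns

def acf1_squares(r):
    # Lag-1 sample autocorrelation of squared returns: a rough proxy for
    # the persistence visible in the volatility dynamics.
    s = [v * v for v in r]
    m = sum(s) / len(s)
    num = sum((s[t] - m) * (s[t + 1] - m) for t in range(len(s) - 1))
    den = sum((v - m) ** 2 for v in s)
    return num / den

clean = simulate_garch(8000)
dirty = clean[:]
dirty[4000] += 30.0  # one large additive outlier, e.g. a crisis-day shock
print(acf1_squares(clean), acf1_squares(dirty))
```

The single contaminated observation dominates the sample variance of the squares, dragging the measured persistence towards zero, which is one concrete mechanism behind the biases the abstract lists.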
By:  Stephen Pollock 
Abstract:  This essay was written to accompany a lecture to beginning students of the course in Economic Analytics taught in the Institute of Econometrics of the University of Lodz in Poland. It provides, within a few pages, a broad historical account of the development of econometrics. It begins by describing the origin of regression analysis and concludes with an account of cointegration analysis. The purpose of the essay is to provide a context in which students can locate various aspects of econometric analysis. A distinction must be made between the means by which new ideas were propagated and the manner and circumstances in which they originated. This account is concerned primarily with the propagation of the ideas. 
Keywords:  econometrics, regression analysis, cointegration analysis, statistical analysis 
JEL:  B16 B23 
Date:  2014–02 
URL:  http://d.repec.org/n?u=RePEc:lec:leecon:14/05&r=ecm 
By:  Plötz, Patrick 
Abstract:  The distances travelled by individual vehicle users can vary strongly between days. This is particularly problematic for electric vehicles, since trips longer than the electric range clearly reduce the vehicle's utility. Here we estimate the number of days on which the driving distance exceeds a given threshold for individual users, based on their observed driving behaviour. The general formalism is developed, and estimates for the main observable and its standard errors are derived under the assumption of individually log-normally distributed daily vehicle kilometres travelled. Numerical simulations of driving profiles demonstrate the validity and accuracy of the analytical results. 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:zbw:fisisi:s12014&r=ecm 
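Under the log-normal assumption stated in the abstract, the expected number of days above a range threshold has a simple closed form, P(D > r) = 1 - Phi((ln r - mu) / sigma), which can be sketched directly. The parameter values in the example are invented for illustration, not taken from the paper:

```python
import math

def days_above(threshold_km, mu, sigma, driving_days=365):
    # Expected number of driving days whose distance exceeds the
    # threshold, assuming log-normal daily distances with parameters
    # mu and sigma on the log scale: P(D > r) = 1 - Phi((ln r - mu)/sigma).
    z = (math.log(threshold_km) - mu) / sigma
    p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))
    return driving_days * p_exceed

# With a median daily distance of 40 km (mu = ln 40), exactly half of
# all driving days exceed 40 km.
print(days_above(40.0, math.log(40.0), 0.9))  # -> 182.5
```

The exceedance count falls monotonically as the threshold (e.g. the electric range) grows, which is the quantity of interest when assessing how often a given range would be insufficient.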
By:  Akihiko Takahashi (Faculty of Economics, The University of Tokyo); Yukihiro Tsuzuki (Graduate School of Economics, The University of Tokyo) 
Abstract:  This paper develops a new scheme for improving an approximation to a probability density function, inspired by the idea of best approximation in an inner product space. Moreover, we apply Dykstra's cyclic projections algorithm for its implementation. Numerical examples of its application to an asymptotic expansion method in option pricing demonstrate the effectiveness of our scheme under the SABR model. 
Date:  2014–02 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2014cf917&r=ecm 