
NEP: New Economics Papers on Econometrics 
By:  Giuseppe Cavaliere; David I. Harvey; Stephen J. Leybourne; A. M. Robert Taylor 
Abstract:  In this paper we analyse the impact of nonstationary volatility on the recently developed unit root tests which allow for a possible break in trend occurring at an unknown point in the sample, considered in Harris, Harvey, Leybourne and Taylor (2009) [HHLT]. HHLT's analysis hinges on a new break fraction estimator which, when a break in trend occurs, is consistent for the true break fraction at rate Op(T^-1). Unlike other available estimators, however, when there is no trend break HHLT's estimator converges to zero at rate Op(T^-1/2). In their analysis HHLT assume the shocks follow a linear process driven by IID innovations. Our first contribution is to show that HHLT's break fraction estimator retains the same consistency properties as demonstrated by HHLT for the IID case when the innovations display nonstationary behaviour of a quite general form, including, for example, the case of a single break in the volatility of the innovations which may or may not occur at the same time as a break in trend. However, as we subsequently demonstrate, the limiting null distributions of unit root statistics based around this estimator are not pivotal in the presence of nonstationary volatility. Associated Monte Carlo evidence is presented to quantify the impact of a one-time change in volatility on both the asymptotic and finite sample behaviour of such tests. A solution to the identified inference problem is then provided by considering wild bootstrap-based implementations of the HHLT tests, using the trend break estimator from the original sample data. The proposed bootstrap method does not require the practitioner to specify a parametric model for volatility, and is shown to perform very well in practice across a range of models. 
Keywords:  Unit root tests; quasi-difference detrending; trend break; nonstationary volatility; wild bootstrap 
Date:  2009–12 
URL:  http://d.repec.org/n?u=RePEc:not:notgts:09/05&r=ecm 
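The wild bootstrap device the abstract refers to can be sketched in a few lines. This is a generic Python illustration, not the authors' implementation: Rademacher (+/-1) multipliers are one common choice of wild bootstrap weights, and their key property is that the bootstrap shocks inherit the volatility pattern of the original residuals without any parametric volatility model.

```python
import numpy as np

def wild_bootstrap_sample(residuals, rng):
    """One wild bootstrap draw: multiply each residual by an independent
    Rademacher (+/-1) variate, so the bootstrap shocks inherit any
    (possibly nonstationary) volatility pattern of the original
    residuals without a parametric volatility model."""
    signs = rng.choice([-1.0, 1.0], size=residuals.shape)
    return residuals * signs

rng = np.random.default_rng(0)
e = np.array([0.5, -1.2, 3.0, -2.8])    # residuals with a volatility shift
e_star = wild_bootstrap_sample(e, rng)
# |e_star| equals |e| elementwise, so the heteroskedasticity is preserved
```

In a full bootstrap unit root test, such draws would be fed back through the fitted null model to rebuild pseudo-samples and recompute the test statistic on each.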
By:  Muni S. Srivastava (Department of Statistics, University of Toronto); Tatsuya Kubokawa (Faculty of Economics, University of Tokyo) 
Abstract:  The Akaike information criterion, AIC, and Mallows' Cp statistic have been proposed for selecting a smaller number of regressor variables in multivariate regression models with a fully unknown covariance matrix. Both criteria are, however, based on the implicit assumption that the sample size is substantially larger than the dimension of the covariance matrix. To obtain a stable estimator of the covariance matrix, the dimension of the covariance matrix must be much smaller than the sample size. When the dimension is close to the sample size, it is necessary to use ridge-type estimators of the covariance matrix. In this paper, we use a ridge-type estimator of the covariance matrix and obtain the modified AIC and modified Cp statistic under an asymptotic theory in which both the sample size and the dimension go to infinity. It is shown numerically that these modified procedures perform very well in the sense of selecting the true model in large-dimensional cases. 
Date:  2010–01 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2010cf709&r=ecm 
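The ridge-type covariance estimator underlying the modified criteria can be sketched as the sample covariance plus a multiple of the identity. A minimal numpy sketch, assuming a simple illustrative default for the ridge constant (the choice lam = sqrt(p/n) below is not the authors' rule):

```python
import numpy as np

def ridge_covariance(X, lam=None):
    """Ridge-type covariance estimator: add a multiple of the identity
    to the sample covariance so it stays well-conditioned when the
    dimension p is close to the sample size n. lam = sqrt(p/n) is an
    illustrative default, not the rule proposed in the paper."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n
    if lam is None:
        lam = np.sqrt(p / n)
    return S + lam * np.eye(p)

rng = np.random.default_rng(1)
X = rng.standard_normal((30, 25))        # dimension close to sample size
Sig = ridge_covariance(X)
# eigenvalues are bounded away from zero, so Sig is safely invertible
```

The modified AIC and Cp then replace the unstable inverse of the sample covariance with the inverse of such an estimator inside the criterion.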
By:  Markus Reiß 
Abstract:  The basic model for high-frequency data in finance is considered, where an efficient price process is observed under microstructure noise. It is shown that this nonparametric model is in Le Cam's sense asymptotically equivalent to a Gaussian shift experiment in terms of the square root of the volatility function $\sigma$. As an application, simple rate-optimal estimators of the volatility and efficient estimators of the integrated volatility are constructed. 
Date:  2010–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1001.3006&r=ecm 
By:  Jennifer L. Castle; David F. Hendry 
Abstract:  Our strategy for automatic selection in potentially nonlinear processes is: test for nonlinearity in the unrestricted linear formulation; if that test rejects, specify a general model using polynomials, to be simplified to a minimal congruent representation; finally, select by encompassing tests of specific nonlinear forms against the selected model. Nonlinearity poses many problems: extreme observations leading to non-normal (fat-tailed) distributions; collinearity between nonlinear functions; usually more variables than observations when approximating the nonlinearity; and excess retention of irrelevant variables; but solutions are proposed. A returns-to-education empirical application demonstrates the feasibility of the nonlinear automatic model selection algorithm Autometrics. 
Keywords:  Econometric methodology, Model selection, Autometrics, Nonlinearity, Outlier, Returns to education 
JEL:  C51 C22 C87 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:473&r=ecm 
By:  Jesús Crespo Cuaresma (Department of Economics, University of Innsbruck; Universitätstrasse 15, 6020 Innsbruck, Austria); Martin Feldkircher (Oesterreichische Nationalbank, Foreign Research Division, Otto-Wagner-Platz 3, POB 61, A-1011 Vienna) 
Abstract:  In this paper we put forward a Bayesian Model Averaging method dealing with model uncertainty in the presence of potential spatial autocorrelation. The method uses spatial filtering in order to account for different types of spatial links. We contribute to existing methods that handle spatial dependence among observations by explicitly taking care of uncertainty stemming from the choice of a particular spatial structure. Our method is applied to estimate the conditional speed of income convergence across 255 NUTS-2 European regions for the period from 1995 to 2005. We show that the choice of a spatial weight matrix, and in particular the choice of a class thereof, can have an important effect on the estimates of the parameters attached to the model covariates. We also show that estimates of the speed of income convergence across European regions depend strongly on the form of the spatial patterns which are assumed to underlie the dataset. When we take into account this dimension of model uncertainty, the posterior distribution of the speed of convergence parameter has a large probability mass around a rate of convergence of 1%, approximately half of the value which is usually reported in the literature. 
Keywords:  Model uncertainty, spatial filtering, determinants of economic growth, European regions 
JEL:  C11 C15 C21 R11 O52 
Date:  2010–01–11 
URL:  http://d.repec.org/n?u=RePEc:onb:oenbwp:160&r=ecm 
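The spatial filtering step can be sketched via eigenvector filtering in the Griffith style that this literature builds on (the paper additionally averages over candidate weight matrices and classes thereof, which this minimal numpy sketch does not do):

```python
import numpy as np

def spatial_filter_evecs(W, k):
    """Eigenvector spatial filtering sketch: the leading eigenvectors of
    the doubly centred, symmetrised spatial weight matrix M W M, with
    M = I - 11'/n, serve as extra regressors that absorb spatial
    autocorrelation. Illustrative only, not the paper's full BMA setup."""
    n = W.shape[0]
    M = np.eye(n) - np.ones((n, n)) / n
    S = M @ ((W + W.T) / 2) @ M
    vals, vecs = np.linalg.eigh(S)
    order = np.argsort(vals)[::-1]       # largest eigenvalues first
    return vecs[:, order[:k]]

# illustrative contiguity weights for 20 regions arranged on a line
n = 20
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
E = spatial_filter_evecs(W, 2)           # two filter regressors
```

The extracted columns are orthonormal and orthogonal to the constant, so they can be appended to any growth regression without disturbing the intercept.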
By:  Jennifer L. Castle; Jurgen A. Doornik; David F. Hendry 
Abstract:  We consider model selection when there is uncertainty over the choice of variables and the occurrence and timing of multiple location shifts. General-to-simple selection is extended using Autometrics by adding an impulse indicator for every observation to the set of candidate regressors (see Hendry, Johansen and Santos, 2008, and Johansen and Nielsen, 2009). We apply that approach to a fat-tailed distribution and processes with breaks: Monte Carlo experiments show its capability of detecting up to 20 shifts in 100 observations, while jointly selecting variables. An illustration to U.S. real interest rates compares impulse-indicator saturation with the procedure in Bai and Perron (1998). 
Keywords:  Impulse-indicator saturation, Location shifts, Model selection, Autometrics 
JEL:  C51 C22 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:471&r=ecm 
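The impulse-indicator saturation idea (one dummy per observation, selected in blocks) can be sketched in split-half form. This is a simplified numpy illustration of the principle, not the Autometrics algorithm; the 2.33 critical value is an illustrative choice:

```python
import numpy as np

def iis_split_half(y, X, crit=2.33):
    """Split-half impulse-indicator saturation sketch: add an impulse
    dummy for every observation in each half of the sample in turn,
    estimate by OLS, and flag the dummies whose t-statistics exceed
    `crit`. A simplified illustration, not the Autometrics procedure."""
    n = len(y)
    flagged = []
    for idx in (np.arange(n // 2), np.arange(n // 2, n)):
        D = np.zeros((n, len(idx)))
        D[idx, np.arange(len(idx))] = 1.0    # one dummy per observation
        Z = np.column_stack([X, D])
        ZtZi = np.linalg.pinv(Z.T @ Z)
        beta = ZtZi @ Z.T @ y
        resid = y - Z @ beta
        s2 = resid @ resid / (n - Z.shape[1])
        t = beta / np.sqrt(s2 * np.diag(ZtZi))
        k = X.shape[1]
        flagged += [int(idx[j]) for j in range(len(idx))
                    if abs(t[k + j]) > crit]
    return sorted(flagged)

rng = np.random.default_rng(2)
y = 0.1 * rng.standard_normal(100)
y[10] += 5.0                      # one large outlier / location shift
X = np.ones((100, 1))             # intercept-only model
outliers = iis_split_half(y, X)
# observation 10 is among the flagged indices
```

Because each half's dummies fit their own observations exactly, the t-statistics measure outlyingness relative to the model estimated on the other half, which is the intuition behind block-wise saturation.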
By:  Ulrike Schneider; Martin Wagner 
Abstract:  This paper uses the adaptive Lasso estimator to determine the variables important for economic growth. The adaptive Lasso estimator is a computationally very simple procedure that can perform model selection and consistent parameter estimation at the same time. The methodology is applied to three data sets: the data used in Sala-i-Martin et al. (2004), in Fernandez et al. (2001), and a data set for the regions of the European Union. The results for the former two data sets are similar in several respects to those found in the published papers, yet are obtained at a negligible fraction of the computational cost. Furthermore, the results for the European regional data highlight the importance of human capital for economic growth. 
Keywords:  adaptive Lasso, economic convergence, growth regressions, model selection 
JEL:  C31 C52 O11 O18 O47 
Date:  2009–06 
URL:  http://d.repec.org/n?u=RePEc:wii:wpaper:55&r=ecm 
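The two-step adaptive Lasso can be sketched with a plain coordinate-descent solver. This is a generic numpy illustration, not the authors' code; the penalty level lam and weight exponent gamma are illustrative defaults:

```python
import numpy as np

def soft(z, g):
    """Soft-thresholding operator used by the coordinate updates."""
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

def adaptive_lasso(X, y, lam=0.1, gamma=1.0, n_iter=200):
    """Adaptive Lasso sketch: first-stage OLS coefficients give weights
    w_j = 1/|b_ols_j|^gamma, then a weighted l1 penalty is minimised by
    coordinate descent, so large first-stage coefficients are penalised
    less and irrelevant ones are shrunk exactly to zero."""
    n, p = X.shape
    b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    w = 1.0 / (np.abs(b_ols) ** gamma + 1e-8)
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]        # partial residual
            b[j] = soft(X[:, j] @ r, n * lam * w[j]) / col_ss[j]
    return b

rng = np.random.default_rng(3)
X = rng.standard_normal((200, 5))
beta_true = np.array([2.0, 0.0, 0.0, 1.0, 0.0])   # sparse truth
y = X @ beta_true + 0.1 * rng.standard_normal(200)
b = adaptive_lasso(X, y)
# the three irrelevant coefficients are shrunk exactly to zero
```

The exact zeros are what make the procedure a model-selection device: the retained variables are read directly off the nonzero coefficients.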
By:  Thomas Flury; Neil Shephard 
Abstract:  A key ingredient of many particle filters is the use of the sampling importance resampling algorithm (SIR), which transforms a sample of weighted draws from a prior distribution into equally weighted draws from a posterior distribution. We give a novel analysis of the SIR algorithm and analyse the jittered generalisation of SIR, showing that existing implementations of jittering lead to markedly inferior behaviour relative to the base SIR algorithm. We show how jittering can be designed to improve the performance of the SIR algorithm. We illustrate its performance in practice in the context of three filtering problems. 
Keywords:  Importance sampling, Particle filter, Random numbers, Sampling importance resampling, State space models 
JEL:  C14 C32 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:469&r=ecm 
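The basic SIR step the paper analyses is compact enough to sketch directly; the jittered variants it studies add a perturbation after this resampling step. A minimal numpy illustration on a conjugate normal example (not taken from the paper):

```python
import numpy as np

def sir(particles, log_weights, rng):
    """Sampling importance resampling: turn weighted prior draws into
    equally weighted posterior draws by multinomial resampling on the
    normalised importance weights."""
    w = np.exp(log_weights - log_weights.max())   # stabilised weights
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

rng = np.random.default_rng(4)
prior = rng.standard_normal(5000)                 # draws from N(0,1)
obs = 1.0
loglik = -0.5 * (obs - prior) ** 2                # y | x ~ N(x, 1)
post = sir(prior, loglik, rng)
# the exact posterior here is N(0.5, 0.5), so the resampled mean is near 0.5
```

The degeneracy the paper is concerned with shows up when the weights concentrate on few particles, so that the resampled set contains many duplicates; jittering is the attempt to spread those duplicates out.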
By:  Felipe Vásquez (Departamento de Economía, Universidad de Concepción); Michael Hanemann (Department of Agricultural and Resource Economics, University of California, Berkeley) 
Abstract:  In this paper we present a new utility model that serves as the basis for modeling discrete/continuous consumer choices with a general corner solution. The new model involves a more flexible representation of preferences than what has been used in the previous literature and, unlike most of this literature, it is not additively separable. This functional form can handle richer substitution patterns such as complementarity as well as substitution among goods. We focus in part on the Quadratic Box-Cox utility function and examine its properties from both theoretical and empirical perspectives. We identify the significance of the various parameters of the utility function, and demonstrate an estimation strategy that can be applied to demand systems involving both small and large numbers of commodities. 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:cnc:wpaper:082009&r=ecm 
By:  Antonios Antypas (Department of Banking and Financial Management, University of Piraeus); Phoebe Koundouri (Dept. of International and European Economic Studies, Athens University of Economics and Business); Nikolaos Kourogenis (Department of Banking and Financial Management, University of Piraeus) 
Abstract:  This paper aims at reconciling two apparently contradictory empirical regularities of financial returns, namely the fact that the empirical distribution of returns tends to normality as the frequency of observation decreases (aggregational Gaussianity) combined with the fact that the conditional variance of high frequency returns seems to have a unit root, in which case the unconditional variance is infinite. We show that aggregational Gaussianity and infinite variance can coexist, provided that all the moments of the unconditional distribution whose order is less than two exist. The latter characterises the case of Integrated GARCH (IGARCH) processes. Finally, we discuss testing for aggregational Gaussianity under barely infinite variance. 
Keywords:  aggregational Gaussianity, infinite variance, IGARCH, crop prices 
JEL:  C10 G12 Q14 
Date:  2010–01–23 
URL:  http://d.repec.org/n?u=RePEc:aue:wpaper:1001&r=ecm 
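The IGARCH setting the abstract discusses is easy to simulate, which makes the two regularities concrete. A numpy sketch with illustrative parameter values (alpha = 0.1 is an assumption, not taken from the paper):

```python
import numpy as np

def simulate_igarch(n, alpha=0.1, rng=None):
    """Simulate an IGARCH(1,1) process: alpha + beta = 1, so the
    conditional variance has a unit root. Parameter values are
    illustrative only."""
    rng = rng or np.random.default_rng()
    beta = 1.0 - alpha
    h = 1.0                                   # conditional variance
    r = np.empty(n)
    for t in range(n):
        r[t] = np.sqrt(h) * rng.standard_normal()
        h = alpha * r[t] ** 2 + beta * h      # variance unit root
    return r

def kurtosis(x):
    z = x - x.mean()
    return (z ** 4).mean() / (z ** 2).mean() ** 2

rng = np.random.default_rng(6)
r = simulate_igarch(50_000, rng=rng)
high_freq_kurt = kurtosis(r)                  # well above the Gaussian 3
monthly = r.reshape(-1, 20).sum(axis=1)       # 20-period aggregated returns
# aggregational Gaussianity predicts kurtosis of the sums moves toward 3
```

High-frequency IGARCH returns are strongly leptokurtic while temporal aggregation pulls the distribution toward normality, which is exactly the coexistence the paper formalises.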
By:  Bence Toth; Fabrizio Lillo; J. Doyne Farmer 
Abstract:  We introduce an algorithm for the segmentation of a class of regime switching processes. The segmentation algorithm is a non-parametric statistical method able to identify the regimes (patches) of the time series. The process is composed of consecutive patches of variable length, each patch being described by a stationary compound Poisson process, i.e. a Poisson process where each count is associated with a fluctuating signal. The parameters of the process are different in each patch and therefore the time series is non-stationary. Our method is a generalization of the algorithm introduced by Bernaola-Galvan et al., Phys. Rev. Lett. 87, 168105 (2001). We show that the new algorithm outperforms the original one for regime switching compound Poisson processes. As an application we use the algorithm to segment the time series of the inventory of market members of the London Stock Exchange and we observe that our method finds almost three times more patches than the original one. 
Date:  2010–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1001.2549&r=ecm 
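The recursive segmentation idea can be sketched as a single likelihood-ratio split step, here for a plain Poisson count series (the paper's compound Poisson setting attaches a fluctuating signal to each count, and this is not the authors' exact statistic):

```python
import numpy as np

def pois_ll(x):
    """Poisson log-likelihood of a count segment at its MLE rate,
    dropping the log-factorial terms, which cancel in comparisons."""
    lam = x.mean()
    return 0.0 if lam == 0 else x.sum() * np.log(lam) - len(x) * lam

def best_split(counts):
    """One step of a binary-segmentation sketch in the spirit of the
    Bernaola-Galvan approach: find the split point maximising the
    log-likelihood gain of a two-rate Poisson model over a single-rate
    model. Recursing on each side would segment the full series."""
    gains = [pois_ll(counts[:t]) + pois_ll(counts[t:]) - pois_ll(counts)
             for t in range(1, len(counts))]
    t = int(np.argmax(gains)) + 1
    return t, gains[t - 1]

rng = np.random.default_rng(5)
counts = np.concatenate([rng.poisson(1.0, 100), rng.poisson(5.0, 100)])
t_hat, gain = best_split(counts)
# the estimated change point lands near observation 100
```

A full segmenter would accept the split only if the gain exceeds a significance threshold and then recurse on the two patches, which is where the methods in this literature differ.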
By:  David F. Hendry; Grayham E. Mizon 
Abstract:  We model expenditure on food in the USA, using an extended time series. Even when a theory is essentially ‘correct’, it can manifest serious misspecification if just fitted to data, ignoring its observed characteristics and major external events such as wars, recessions and policy changes. When the same theory is embedded in a general framework embracing dynamics and structural breaks, it performs well even over an extended data period, as shown using Autometrics with impulse-indicator saturation. Although this particular illustration involves a simple theory, the point made is generic, and applies no matter how sophisticated the theory. 
Keywords:  Econometric modelling, Food expenditure, Structural breaks, Impulse-indicator saturation, Autometrics 
JEL:  C51 C22 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:475&r=ecm 
By:  Jennifer L. Castle; Nicholas W.P. Fawcett; David F. Hendry 
Abstract:  When location shifts occur, cointegration-based equilibrium-correction models (EqCMs) face forecasting problems. We consider alleviating such forecast failure by updating, intercept corrections, differencing, and estimating the future progress of an ‘internal’ break. Updating leads to a loss of cointegration when an EqCM suffers an equilibrium-mean shift, but helps when collinearities are changed by an ‘external’ break with the EqCM staying constant. Both mechanistic corrections help compared to retaining a pre-break estimated model, but an estimated model of the break process could outperform. We apply the approaches to EqCMs for UK M1, compared with updating a learning function as the break evolves. 
Keywords:  Cointegration, Equilibrium-correction, Forecasting, Location shifts, Collinearity, M1 
JEL:  C1 C53 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:470&r=ecm 
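Of the devices the abstract lists, the intercept correction is the simplest to sketch: shift the model forecast by recent residuals so a location shift is offset. A generic static-regression illustration (the abstract's setting is an equilibrium-correction model; the window length of 4 is an assumption):

```python
import numpy as np

def intercept_corrected_forecast(y, X, x_new, window=4):
    """Intercept correction sketch: fit by OLS, then shift the model
    forecast by the mean of the last `window` in-sample residuals,
    which offsets a recent location shift the model has not absorbed."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    return x_new @ b + resid[-window:].mean()

rng = np.random.default_rng(7)
y = 0.1 * rng.standard_normal(50)
y[40:] += 2.0                           # location shift near the sample end
X = np.ones((50, 1))                    # intercept-only model
fc = intercept_corrected_forecast(y, X, np.array([1.0]))
# the corrected forecast tracks the post-shift level of about 2
```

The uncorrected forecast here would sit near the full-sample mean of about 0.4, which is the forecast failure the correction is designed to repair.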
By:  B. Kaulakys; M. Alaburda; V. Gontis 
Abstract:  We consider stochastic point processes generating time series that exhibit power laws in the spectrum and distribution density (Phys. Rev. E 71, 051105 (2005)), and apply them to model trading activity in financial markets and the frequencies of word occurrences in language. 
Date:  2010–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1001.2639&r=ecm 
By:  Bhattacharjee, A.; Holly, S. 
Abstract:  While much of the literature on cross-section dependence has focused mainly on estimation of the regression coefficients in the underlying model, estimation of and inference on the magnitude and strength of spillovers and interactions have been largely ignored. At the same time, such inferences are important in many applications, not least because they have structural interpretations and help explain the strength of any interactions. In this paper we propose GMM methods designed to uncover underlying (hidden) interactions in social networks and committees. Special attention is paid to the interval censored regression model. Our methods are applied to a study of committee decision making within the Bank of England’s monetary policy committee. 
Keywords:  Committee decision making; Social networks; Cross section and spatial interaction; Generalised method of moments; Censored regression model; Expectation-Maximisation Algorithm; Monetary policy; Interest rates 
JEL:  D71 D85 E43 E52 C31 C34 
Date:  2010–01–22 
URL:  http://d.repec.org/n?u=RePEc:cam:camdae:1003&r=ecm 
By:  Ba Chu; Roman Kozhan 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:wbs:wpaper:wp0904&r=ecm 
By:  Ingmar Nolte; Valeri Voev 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:wbs:wpaper:wp0902&r=ecm 
By:  Bhattacharjee, A.; Holly, S. 
Abstract:  Traditionally, research has been devoted almost exclusively to estimation of underlying structural models without adequate attention to the drivers of diffusion and interaction across cross-section and spatial units. We review some new methodologies in this emerging area and demonstrate their use for measurement of and inference on cross-section and spatial interactions. Limitations and potential enhancements of the existing methods are discussed, and several directions for new research are highlighted. 
Keywords:  Cross-Sectional and Spatial Dependence; Spatial Weights Matrix; Interactions and Diffusion; Monetary Policy Committee; Generalised Method of Moments 
JEL:  E42 E43 E50 E58 
Date:  2010–01–22 
URL:  http://d.repec.org/n?u=RePEc:cam:camdae:1004&r=ecm 