
New Economics Papers on Econometrics 
By:  Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020) 
Abstract:  This paper gives a brief survey of forecasting with panel data. It starts with a simple error component regression model and surveys best linear unbiased prediction (BLUP) under various assumptions on the disturbance term, including various ARMA models as well as spatial autoregressive models. The paper also surveys how these forecasts have been used in panel data applications, running horse races between heterogeneous and homogeneous panel data models using out-of-sample forecasts. 
Keywords:  forecasting; BLUP; panel data; spatial dependence; serial correlation; heterogeneous panels. 
JEL:  C33 
URL:  http://d.repec.org/n?u=RePEc:max:cprwps:91&r=ecm 
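For the baseline case surveyed here, the random-effects error component model, the best linear unbiased predictor shifts the GLS forecast for a unit by a shrunken average of that unit's GLS residuals, which predicts the individual effect. A minimal sketch in Python; function and variable names are illustrative, not taken from the paper:

```python
import numpy as np

def re_blup_forecast(x_future, beta_gls, resid_i, T, sigma2_mu, sigma2_nu):
    """BLUP forecast for unit i in a random-effects error component model.

    x_future:  regressors at the forecast period
    beta_gls:  GLS slope estimates
    resid_i:   the T GLS residuals for unit i
    sigma2_mu: variance of the individual effect
    sigma2_nu: variance of the idiosyncratic error
    """
    # shrinkage weight on unit i's average residual: T*s2_mu / (T*s2_mu + s2_nu)
    theta = (T * sigma2_mu) / (T * sigma2_mu + sigma2_nu)
    return float(x_future @ beta_gls + theta * resid_i.mean())
```

With `sigma2_mu = 0` the correction vanishes and the forecast reduces to the ordinary GLS fitted value.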
By:  Frank Windmeijer (Institute for Fiscal Studies and University of Bristol) 
Abstract:  This paper gives an account of the recent literature on estimating models for panel count data. Specifically, the treatment of unobserved individual heterogeneity that is correlated with the explanatory variables and the presence of explanatory variables that are not strictly exogenous are central. Moment conditions are discussed for these types of problems that enable estimation of the parameters by GMM. As standard Wald tests based on efficient two-step GMM estimation results are known to have poor finite sample behaviour, alternative test procedures that have recently been proposed in the literature are evaluated by means of a Monte Carlo study. 
Keywords:  GMM, exponential models, hypothesis testing 
JEL:  C12 C13 C23 
Date:  2006–10 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:21/06&r=ecm 
By:  Adam Rosen (Institute for Fiscal Studies and University College London) 
Abstract:  This paper proposes a new way to construct confidence sets for a parameter of interest in models comprised of finitely many moment inequalities. Building on results from the literature on multivariate one-sided tests, I show how to test the hypothesis that any particular parameter value is logically consistent with the maintained moment inequalities. The associated test statistic has an asymptotic chi-bar-square distribution, and can be inverted to construct an asymptotic confidence set for the parameter of interest, even if that parameter is only partially identified. The confidence sets are easily computed, and Monte Carlo simulations demonstrate good finite sample performance. 
Keywords:  Partial identification, inference, moment inequalities 
JEL:  C3 C12 
Date:  2006–12 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:25/06&r=ecm 
By:  Alex Maynard (Wilfrid Laurier University); Katsumi Shimotsu (Queen's University) 
Abstract:  This paper develops a new test of orthogonality based on a zero restriction on the covariance between the dependent variable and the predictor. The test provides a useful alternative to regression-based tests when conditioning variables have roots close or equal to unity. In this case standard predictive regression tests can suffer from well-documented size distortion. Moreover, under the alternative hypothesis, they force the dependent variable to share the same order of integration as the predictor, whereas in practice the dependent variable often appears stationary while the predictor may be near-nonstationary. By contrast, the new test does not enforce the same orders of integration and is therefore capable of detecting alternatives to orthogonality that are excluded by the standard predictive regression model. Moreover, the test statistic has a standard normal limit distribution for both unit root and local-to-unity conditioning variables, without prior knowledge of the local-to-unity parameter. If the conditioning variable is stationary, the test remains conservative and consistent. Thus the new test requires neither size correction nor unit root pretest. Simulations suggest good small sample performance. As an empirical application, we test for the predictability of stock returns using two persistent predictors, the dividend-price ratio and the short-term interest rate. 
Keywords:  orthogonality test, covariance estimation, local-to-unity, unit roots, market efficiency, predictive regression, regression imbalance 
JEL:  C12 C22 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:qed:wpaper:1122&r=ecm 
By:  Rodney C Wolff; Peter Hall; Qiwei Yao (School of Economics and Finance, Queensland University of Technology) 
Abstract:  Motivated by the problem of setting prediction intervals in time series analysis, we suggest two new methods for conditional distribution estimation. The first method is based on locally fitting a logistic model and is in the spirit of recent work on locally parametric techniques in density estimation. It produces distribution estimators that may be of arbitrarily high order but nevertheless always lie between 0 and 1. The second method involves an adjusted form of the Nadaraya-Watson estimator. It preserves the bias and variance properties of a class of second-order estimators introduced by Yu and Jones but has the added advantage of always being a distribution itself. Our methods also have application outside the time series setting; for example, to quantile estimation for independent data. This problem motivated the work of Yu and Jones. 
Keywords:  Absolutely regular; bandwidth; biased bootstrap; conditional distribution; kernel methods; local linear methods; local logistic methods; Nadaraya-Watson estimator; prediction; quantile estimation; time series analysis; weighted bootstrap 
Date:  2006–06–15 
URL:  http://d.repec.org/n?u=RePEc:qut:rwolff:200611&r=ecm 
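The unadjusted Nadaraya-Watson conditional distribution estimator that the second method modifies can be sketched as a kernel-weighted empirical cdf. The Gaussian kernel, bandwidth, and toy data below are illustrative choices, not the paper's; the paper's adjusted version reweights the observations so that the estimate is always itself a distribution function.

```python
import numpy as np

def nw_conditional_cdf(x0, y, X, Y, h):
    """Unadjusted Nadaraya-Watson estimate of F(y | X = x0):
    a kernel-weighted proportion of observations with Y_i <= y."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)  # Gaussian kernel weights
    w = w / w.sum()
    return float(np.sum(w * (Y <= y)))

# toy data: Y = X + noise, so F(0.5 | x0 = 0.5) should be near 1/2
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 500)
Y = X + rng.normal(0.0, 0.1, 500)
p = nw_conditional_cdf(0.5, 0.5, X, Y, h=0.1)
```

Evaluated at fixed x0 this estimator is monotone in y and bounded in [0, 1], but across x0 it need not be a smooth distribution of the required order, which is what motivates the adjustment.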
By:  Rodney C Wolff; Jiti Gao; Howell Tong (School of Economics and Finance, Queensland University of Technology) 
Abstract:  In this paper, we consider additive stochastic nonparametric regression models. By approximating the nonparametric components by a class of orthogonal series and using a generalized cross-validation criterion, an adaptive and simultaneous estimation procedure for the nonparametric components is constructed. We illustrate the adaptive and simultaneous estimation procedure by a number of simulated and real examples. 
Keywords:  Adaptive estimation; additive model; dependent process; mixing condition; nonlinear time series; nonparametric regression; orthogonal series; strict stationarity; truncation parameter 
Date:  2006–06–15 
URL:  http://d.repec.org/n?u=RePEc:qut:rwolff:200610&r=ecm 
By:  Rodney C Wolff; Adrian G Barnett (School of Economics and Finance, Queensland University of Technology) 
Abstract:  The bispectrum and third-order moment can be viewed as equivalent tools for testing for the presence of nonlinearity in stationary time series. This is because the bispectrum is the Fourier transform of the third-order moment. An advantage of the bispectrum is that its estimator comprises terms that are asymptotically independent at distinct bifrequencies under the null hypothesis of linearity. An advantage of the third-order moment is that its values in any subset of joint lags can be used in the test, whereas when using the bispectrum the entire (or truncated) third-order moment is required to construct the Fourier transform. In this paper, we propose a test for nonlinearity based upon the estimated third-order moment. We use the phase scrambling bootstrap method to give a nonparametric estimate of the variance of our test statistic under the null hypothesis. Using a simulation study, we demonstrate that the test obtains its target significance level, with large power, when compared to an existing standard parametric test that uses the bispectrum. Further we show how the proposed test can be used to identify the source of nonlinearity due to interactions at specific frequencies. We also investigate implications for heuristic diagnosis of nonstationarity. 
Keywords:  Third-order moment; bispectrum; nonlinear; nonstationary; time series; bootstrap; phase scrambling 
Date:  2006–06–15 
URL:  http://d.repec.org/n?u=RePEc:qut:rwolff:20065&r=ecm 
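The phase scrambling resampling scheme the test relies on keeps a series' periodogram, and hence its second-order structure, while randomising the Fourier phases, destroying third- and higher-order structure as required under the null of linearity. A minimal sketch; the handling of the zero-frequency and Nyquist bins below is one common convention, not necessarily the authors' exact implementation:

```python
import numpy as np

def phase_scramble(x, rng):
    """Phase-randomised surrogate: same amplitude spectrum as x,
    independent uniform random phases."""
    z = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0.0, 2.0 * np.pi, z.size)
    phases[0] = 0.0           # zero-frequency component stays real
    if x.size % 2 == 0:
        phases[-1] = 0.0      # Nyquist component stays real for even length
    surrogate = np.fft.irfft(np.abs(z) * np.exp(1j * phases), n=x.size)
    return surrogate + x.mean()
```

Because the amplitude spectrum is untouched, each surrogate has (up to rounding) the same mean and variance as the original series; only phase-dependent features such as third-order moments are scrambled.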
By:  Giuseppe Ragusa (Department of Economics, University of CaliforniaIrvine) 
Abstract:  Bayesian inference in moment condition models is difficult to implement. For these models, a posterior distribution cannot be calculated because the likelihood function has not been fully specified. In this paper, we obtain a class of likelihoods by formal Bayesian calculations that take into account the semiparametric nature of the problem. The likelihoods are derived by integrating out the nuisance parameters with respect to a maximum entropy tilted prior on the space of distributions. The result is a unification that uncovers a mapping between priors and likelihood functions. We show that there exist priors such that the likelihoods are closely connected to Generalized Empirical Likelihood (GEL) methods. 
Keywords:  Moment condition; GMM; GEL; Likelihood functions; Bayesian inference 
JEL:  C1 C11 C14 C21 
Date:  2007–01 
URL:  http://d.repec.org/n?u=RePEc:irv:wpaper:060714&r=ecm 
By:  Matteo Manera (University of Milan-Bicocca); Chiara Longo (Fondazione Eni Enrico Mattei); Anil Markandya (University of Bath and Fondazione Eni Enrico Mattei); Elisa Scarpa (Risk Management Department, Intesa Sanpaolo) 
Abstract:  The relevance of oil in the world economy explains why considerable effort has been devoted to the development of different types of econometric models for oil price forecasting. Several specifications have been proposed in the economic literature. Some are based on financial theory and concentrate on the relationship between spot and futures prices (“financial” models). Others assign a key role to variables explaining the characteristics of the physical oil market (“structural” models). The empirical literature is very far from any consensus about the appropriate model for oil price forecasting that should be implemented. Relative to the previous literature, this paper is novel in several respects. First of all, we test and systematically evaluate the ability of several alternative econometric specifications proposed in the literature to capture the dynamics of oil prices. Second, we analyse the effects of different data frequencies on the coefficient estimates and forecasts obtained using each selected econometric specification. Third, we compare different models at different data frequencies on a common sample and common data. Fourth, we evaluate the forecasting performance of each selected model using static and dynamic forecasts, as well as different measures of forecast errors. Finally, we propose a new class of models which combine the relevant aspects of the financial and structural specifications proposed in the literature (“mixed” models). Our empirical findings can be summarized as follows. Financial models in levels do not produce satisfactory forecasts for the WTI spot price. The financial error correction model yields accurate in-sample forecasts. Real and strategic variables alone are insufficient to capture the oil spot price dynamics in the forecasting sample. Our proposed mixed models are statistically adequate and exhibit accurate forecasts. Different data frequencies seem to affect the forecasting ability of the models under analysis. 
Keywords:  Oil Price, WTI Spot And Futures Prices, Forecasting, Econometric Models 
JEL:  C52 C53 Q32 Q43 
Date:  2007–01 
URL:  http://d.repec.org/n?u=RePEc:fem:femwpa:2007.4&r=ecm 
By:  Rodney C Wolff; Darfiana Nur; Kerrie L Mengersen (School of Economics and Finance, Queensland University of Technology) 
Abstract:  Most MCMC users address the convergence problem by applying diagnostic tools to the output produced by running their samplers. Potentially useful diagnostics may be borrowed from diverse areas such as time series. One such method is phase randomisation. The aim of this paper is to describe this method in the context of MCMC, summarise its characteristics, and contrast its performance with those of the more common diagnostic tests for MCMC. It is observed that the new tool contributes information about third and higher order cumulant behaviour which is important in characterising certain forms of nonlinearity and nonstationarity. 
Keywords:  Convergence diagnostics; higher cumulants; Markov Chain Monte Carlo; nonlinear time series; stationarity; surrogate series 
Date:  2006–06–15 
URL:  http://d.repec.org/n?u=RePEc:qut:rwolff:20064&r=ecm 
By:  Grant Hillier (Institute for Fiscal Studies and University of Southampton) 
Abstract:  For a simplified structural equation/IV regression model with one right-hand-side endogenous variable, we obtain the exact conditional distribution function for Moreira's (2003) conditional likelihood ratio (CLR) test. This is then used to obtain the critical value function needed to implement the CLR test, and reasonably comprehensive graphical versions of the function are provided for practical use. The analogous functions are also obtained for the case of testing more than one right-hand-side endogenous coefficient, but only for an approximation to the true likelihood ratio test. We then go on to provide an exact analysis of the power functions of the CLR test, the Anderson-Rubin test, and the LM test suggested by Kleibergen (2002). The CLR test is shown to clearly conditionally dominate the other two tests for virtually all parameter configurations, but none of these tests is either inadmissible or uniformly superior to the other two. 
Date:  2006–10 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:23/06&r=ecm 
By:  Rodney C Wolff; Qiwei Yao; Howell Tong (School of Economics and Finance, Queensland University of Technology) 
Abstract:  In order to develop statistical tests for the Lyapunov exponents of deterministic dynamical systems, we develop bootstrap tests based on empirical likelihood for percentiles and expectiles of strictly stationary processes. The percentiles and expectiles are estimated in terms of asymmetric least deviations and asymmetric least squares methods. Asymptotic distributional properties of the estimators are established. 
Keywords:  Bootstrap; chaos; empirical likelihood; expectile; percentile 
Date:  2006–06–15 
URL:  http://d.repec.org/n?u=RePEc:qut:rwolff:20068&r=ecm 
By:  Rodney C Wolff; Peter Hall (School of Economics and Finance, Queensland University of Technology) 
Abstract:  Simple kerneltype estimators of integrals of general powers of general derivatives of probability densities are proposed. They are based on two simple properties, and in many circumstances enjoy optimal convergence rate. 
Keywords:  Kernel estimators; nonparametric density estimation; wavelets 
Date:  2006–06–15 
URL:  http://d.repec.org/n?u=RePEc:qut:rwolff:200614&r=ecm 
By:  Rodney C Wolff; Peter Hall (School of Economics and Finance, Queensland University of Technology) 
Abstract:  We study a generalised version of the logistic map of the unit interval $(0,1)$, in which the point $x$ is taken to $1-|2x-1|^{\nu}$. Here, $\nu >0$ is a parameter of the map, which has received attention only when $\nu =1$ and 2. We obtain the invariant density when $\nu = \frac12$, and derive properties of invariant distributions in all other cases. These are obtained by a mixture of analytic and numerical argument. In particular, we develop a technique for combining "parametric" information, available from the functional form of the map, with "nonparametric" information, from a Monte Carlo study. Properties of the correlation integral under the invariant distribution are also derived. It is shown that classical behaviour of this test statistic, which demands that the logarithm of the integral have slope equal to the lag, is valid if and only if $\nu \leq 2$. 
Keywords:  Chaos; correlation integral; invariant distribution; logistic map 
Date:  2006–06–15 
URL:  http://d.repec.org/n?u=RePEc:qut:rwolff:200612&r=ecm 
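Assuming the generalised map is $x \mapsto 1-|2x-1|^{\nu}$, so that $\nu = 2$ recovers the classical logistic map $4x(1-x)$ and $\nu = 1$ the tent map, its orbits can be iterated directly to sample the invariant distribution for a Monte Carlo study of the kind described. A minimal sketch:

```python
import numpy as np

def glog_map(x, nu):
    """Generalised logistic map x -> 1 - |2x - 1|**nu on (0, 1);
    nu = 2 gives 4x(1 - x), nu = 1 the tent map."""
    return 1.0 - abs(2.0 * x - 1.0) ** nu

def orbit(x0, nu, n, burn_in=1000):
    """Iterate the map burn_in + n times and return the last n points,
    an approximate sample from the invariant distribution."""
    x = x0
    for _ in range(burn_in):
        x = glog_map(x, nu)
    xs = np.empty(n)
    for i in range(n):
        x = glog_map(x, nu)
        xs[i] = x
    return xs

xs = orbit(0.3, 0.5, 10000)  # sample for nu = 1/2
```

A histogram of `xs` then provides the "nonparametric" side of the analysis, to be combined with the analytic behaviour near poles of the density.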
By:  Christian Hansen (Institute for Fiscal Studies and Chicago GSB); Jerry Hausman (Institute for Fiscal Studies and Massachusetts Institute of Technology); Whitney Newey (Institute for Fiscal Studies and Massachusetts Institute of Technology) 
Abstract:  Using many valid instrumental variables has the potential to improve efficiency but makes the usual inference procedures inaccurate. We give corrected standard errors, an extension of Bekker (1994) to nonnormal disturbances, that adjust for many instruments. We find that this adjustment is useful in empirical work, simulations, and in the asymptotic theory. Use of the corrected standard errors in t-ratios leads to an asymptotic approximation order that is the same when the number of instrumental variables grows as when the number of instruments is fixed. We also give a version of the Kleibergen (2002) weak instrument statistic that is robust to many instruments. 
Date:  2006–09 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:19/06&r=ecm 
By:  Del Negro, Marco; Schorfheide, Frank 
Abstract:  In Bayesian analysis of dynamic stochastic general equilibrium (DSGE) models, prior distributions for some of the taste-and-technology parameters can be obtained from microeconometric or presample evidence, but it is difficult to elicit priors for the parameters that govern the law of motion of unobservable exogenous processes. Moreover, since it is challenging to formulate beliefs about the correlation of parameters, most researchers assume that all model parameters are independent of each other. We provide a simple method of constructing prior distributions for (a subset of) DSGE model parameters from beliefs about the moments of the endogenous variables. We use our approach to investigate the importance of nominal rigidities and show how the specification of prior distributions affects our assessment of the relative importance of different frictions. 
Keywords:  Bayesian analysis; DSGE models; model comparisons; nominal rigidities; prior elicitation 
JEL:  C32 E3 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:6119&r=ecm 
By:  Matthew Harding (Institute for Fiscal Studies and MIT); Jerry Hausman (Institute for Fiscal Studies and Massachusetts Institute of Technology) 
Abstract:  Current methods of estimating the random coefficients logit model employ simulations of the distribution of the taste parameters through pseudorandom sequences. These methods suffer from difficulties in estimating correlations between parameters and computational limitations such as the curse of dimensionality. This paper provides a solution to these problems by approximating the integral expression of the expected choice probability using a multivariate extension of the Laplace approximation. Simulation results reveal that our method performs very well, both in terms of accuracy and computational time. This paper is a revised version of CWP01/06. 
Date:  2006–10 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:20/06&r=ecm 
By:  Mauro Costantini (ISAE - Institute for Studies and Economic Analyses); Roy Cerqueti (Università degli Studi di Roma “La Sapienza”, Italy) 
Abstract:  This paper provides a theoretical fractional cointegration analysis in a nonparametric framework. We solve a generalized eigenvalue problem. To this end, a pair of random matrices is constructed taking into account the stationarity properties of the differences of a fractional p-variate integrated process. These difference orders are assumed to vary in a continuous and discrete range. The random matrices are defined by some weight functions. Asymptotic behaviors of these random matrices are obtained by stating some conditions on the weight functions, and by using the results of Bierens (1997) and Andersen et al. (1983). In this way, a nonparametric analysis is provided. Starting from the solution of the generalized eigenvalue problem, a fractional nonparametric VAR model for cointegration is also presented. 
Keywords:  Fractional integrated process, Nonparametric methods, Cointegration, Asymptotic distribution, Generalized eigenvalues problem. 
JEL:  C14 C22 C65 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:isa:wpaper:78&r=ecm 
By:  Juncal Cuñado (Universidad de Navarra); Luis A. Gil-Alaña (Universidad de Navarra) 
Abstract:  This paper analyses the number of tourists travelling to the Canary Islands using different seasonal statistical models. Both deterministic and stochastic seasonality are considered. For the latter case, we employ seasonal unit roots and seasonally fractionally integrated models. As a final approach, we also employ a model with possibly different orders of integration at zero and the seasonal frequencies. All these models are compared in terms of their forecasting ability in an out-of-sample experiment. The results in the paper show that a simple deterministic model with seasonal dummy variables and AR(1) disturbances produces better results over short horizons than other approaches based on seasonal fractional and integer differentiation. However, as the time horizon increases, the results cannot distinguish between the model based on seasonal dummies and another using fractional integration at zero and the seasonal frequencies. 
URL:  http://d.repec.org/n?u=RePEc:una:unccee:wp0207&r=ecm 
By:  Rodney C Wolff; Peter Hall (School of Economics and Finance, Queensland University of Technology) 
Abstract:  Statistical scientists have recently focused sharp attention on properties of iterated chaotic maps, with a view to employing such processes to model naturally occurring phenomena. In the present paper we treat the logistic map, which has earlier been studied in the context of modelling biological systems. We derive theory describing properties of the 'invariant' or 'stationary' distribution under logistic maps and apply those results in conjunction with numerical work to develop further properties of invariant distributions and Lyapunov exponents. We describe the role that poles play in determining properties of the densities of iterated distributions and show how poles arise from iterated mappings of the centre of the interval to which the map is applied. Particular attention is paid to the shape of the invariant distribution in the tails or in the neighbourhood of a pole of its density. A new technique is developed for this application. It enables us to combine 'parametric' information, available from the structure of the map, with 'nonparametric' information obtainable from numerical experiments. 
Keywords:  Bandwidth; chaos; density estimation; invariant distribution; kernel method; logistic map; Lyapunov exponent; pole; stationary distribution 
Date:  2006–06–15 
URL:  http://d.repec.org/n?u=RePEc:qut:rwolff:200613&r=ecm 
By:  Grant Hillier (Institute for Fiscal Studies and University of Southampton) 
Abstract:  For the problem of testing the hypothesis that all <i>m</i> coefficients of the RHS endogenous variables in an IV regression are zero, the likelihood ratio (LR) test can, if the reduced form covariance matrix is known, be rendered similar by a conditioning argument. To exploit this fact requires knowledge of the relevant conditional <i>cdf</i> of the LR statistic, but the statistic is a function of the smallest characteristic root of an (<i>m</i> + 1)-square matrix, and is therefore analytically difficult to deal with when <i>m</i> > 1. We show in this paper that an iterative conditioning argument used by Hillier (2006) and Andrews, Moreira, and Stock (2007) to evaluate the cdf in the case <i>m</i> = 1 can be generalized to the case of arbitrary <i>m</i>. This means that we can completely bypass the difficulty of dealing with the smallest characteristic root. Analytic results are obtained for the case <i>m</i> = 2, and a simple and efficient simulation approach to evaluating the <i>cdf</i> is suggested for larger values of <i>m</i>. 
Date:  2006–12 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:26/06&r=ecm 
By:  Mark B. Stewart 
URL:  http://d.repec.org/n?u=RePEc:pri:indrel:159&r=ecm 
By:  Marco Fioramanti (ISAE - Institute for Studies and Economic Analyses; University of Pescara, Faculty of Economics) 
Abstract:  Recent episodes of financial crises have revived interest in developing models able to signal their occurrence in a timely manner. The literature has developed both parametric and nonparametric models to predict these crises, the so-called Early Warning Systems. Using data on sovereign debt crises that occurred in developing countries from 1980 to 2004, this paper shows that further progress can be made by applying a less widely used nonparametric method, Artificial Neural Networks (ANN). Thanks to the high flexibility of neural networks and to the Universal Approximation Theorem, an ANN-based early warning system can, under certain conditions, outperform more consolidated methods. 
Keywords:  Early Warning System; Financial Crisis; Sovereign Debt Crises; Artificial Neural Network. 
JEL:  F34 F37 C45 C14 
Date:  2006–10 
URL:  http://d.repec.org/n?u=RePEc:isa:wpaper:72&r=ecm 
By:  Pástor, Lubos; Stambaugh, Robert F 
Abstract:  The standard regression approach to modeling return predictability seems too restrictive in one way but too lax in another. A predictive regression models expected returns as an exact linear function of a given set of predictors but does not exploit the likely economic property that innovations in expected returns are negatively correlated with unexpected returns. We develop an alternative framework, a predictive system, that accommodates imperfect predictors and beliefs about that negative correlation. In this framework, the predictive ability of imperfect predictors is supplemented by information in lagged returns as well as lags of the predictors. Compared to predictive regressions, predictive systems deliver different and substantially more precise estimates of expected returns as well as different assessments of a given predictor's usefulness. 
Keywords:  expected stock return; predictability; predictive regression; predictive system; state space model 
JEL:  G1 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:6076&r=ecm 
By:  Guillermina Jasso (New York University and IZA); Samuel Kotz (George Washington University) 
Abstract:  Recent work on social status led to derivation of a new continuous distribution based on the exponential. The new variate, termed the ring(2)-exponential, in turn leads to derivation of two closely related new families of continuous distributions, which we call the mirror-exponential and the ring-exponential. Both the standard exponential and the ring(2)-exponential are special cases of both the new families. In this paper, we first focus on the ring(2)-exponential, describing its derivation and examining its properties, and next introduce the two new families, describing their derivation and initiating exploration of their properties. The mirror-exponential arises naturally in the study of status; the ring-exponential arises from the mathematical structure of the ring(2)-exponential. Both have potential for broad application in diverse contexts across science and engineering, including the physical and social sciences as well as finance, information processing, and communication. Within sociobehavioral contexts, the new mirror-exponential may have application to the problem of approximating the form and inequality of the wage distribution. 
Keywords:  continuous univariate distributions, Erlang distribution, general Erlang distribution, gamma distribution, general gamma distribution, folded distributions, Gini coefficient, social status, social inequality, wage function, wage distribution, wage inequality 
JEL:  C02 C16 D31 D6 I3 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp2598&r=ecm 
By:  Ricardo Gimeno (Banco de España); Juan M. Nave (Universidad CEU Cardenal Herrera) 
Abstract:  The term structure of interest rates is an instrument that gives us the necessary information for valuing deterministic financial cash flows, measuring market expectations and testing the effectiveness of monetary policy decisions. However, it is not directly observable and needs to be measured by smoothing data obtained from asset prices through statistical techniques. Adjusting parsimonious functional forms - as proposed by Nelson and Siegel (1987) and Svensson (1994) - is the most popular technique. This method is based on bond yields to maturity, and the high degree of nonlinearity of the functions to be optimised makes it very sensitive to the initial values employed. In this context, this paper proposes the use of genetic algorithms to find these values and reduce the risk of false convergence, showing that stable time series of parameters are obtained without the need to impose any kind of restrictions. 
Keywords:  forward and spot interest rates, nelson and siegel model, nonlinear optimization, numerical methods, svensson model, yield curve estimation 
JEL:  G12 C51 C63 
Date:  2006–12 
URL:  http://d.repec.org/n?u=RePEc:bde:wpaper:0634&r=ecm 
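The Nelson and Siegel (1987) curve at the heart of this fitting problem is a four-parameter nonlinear function of maturity; the decay parameter enters the exponential loadings, which is where the sensitivity to starting values arises. A minimal sketch of evaluating the spot curve; the parameter values below are purely illustrative, not estimates from any data set:

```python
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel spot rate at maturities tau (> 0):
    beta0 = long-run level, beta1 = slope, beta2 = curvature,
    lam = decay parameter of the exponential loadings."""
    x = tau / lam
    load1 = (1.0 - np.exp(-x)) / x   # loading on the slope factor
    load2 = load1 - np.exp(-x)       # loading on the curvature factor
    return beta0 + beta1 * load1 + beta2 * load2

taus = np.array([0.25, 1.0, 5.0, 10.0, 30.0])
curve = nelson_siegel(taus, beta0=0.05, beta1=-0.02, beta2=0.01, lam=2.0)
```

As maturity grows both loadings vanish and the rate tends to beta0; at the short end it tends to beta0 + beta1. Because the loadings are nonlinear in lam, least-squares fits of this curve depend heavily on the initial lam supplied to the optimiser, the issue the genetic-algorithm search addresses.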