
on Econometrics 
By:  Maurice J.G. Bun; Frank Windmeijer 
Abstract:  The system GMM estimator for dynamic panel data models combines moment conditions for the model in first differences with moment conditions for the model in levels. It has been shown to improve on the GMM estimator in the first-differenced model in terms of bias and root mean squared error. However, we show in this paper that in the covariance-stationary panel data AR(1) model the expected values of the concentration parameters in the differenced and levels equations for the cross-section at time t are the same when the variances of the individual heterogeneity and idiosyncratic errors are the same. This indicates a weak instrument problem for the equation in levels as well. We show that the 2SLS biases relative to the OLS biases are then similar for the equations in differences and levels, as are the size distortions of the Wald tests. A Monte Carlo study shows that these results extend to the panel data system GMM estimator. 
Keywords:  Dynamic Panel Data, System GMM, Weak Instruments 
JEL:  C12 C13 C23 
URL:  http://d.repec.org/n?u=RePEc:bri:uobdis:06/595&r=ecm 
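The equal-strength result for the two sets of instruments is easy to see in simulation. The sketch below is not the authors' code; the parameter values (gamma = 0.8, N = 20000, T = 6) are illustrative. It generates a covariance-stationary panel AR(1) with equal heterogeneity and error variances and compares the first-stage strength of the differenced equation's instrument (the lagged level) with that of the levels equation's instrument (the lagged difference):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, gamma = 20000, 6, 0.8   # illustrative values, not from the paper
s_eta = s_u = 1.0             # equal variances: the case the paper highlights

# covariance-stationary start for each unit
eta = rng.normal(0.0, s_eta, N)
y = np.empty((N, T))
y[:, 0] = eta/(1 - gamma) + rng.normal(0.0, s_u/np.sqrt(1 - gamma**2), N)
for t in range(1, T):
    y[:, t] = gamma*y[:, t-1] + eta + rng.normal(0.0, s_u, N)

# cross-section at the last period
t = T - 1
dy_lag = y[:, t-1] - y[:, t-2]
# DIF equation instruments dy_{t-1} with the level y_{t-2};
# LEV equation instruments the level y_{t-1} with the difference dy_{t-1}
r2_dif = np.corrcoef(dy_lag, y[:, t-2])[0, 1]**2
r2_lev = np.corrcoef(y[:, t-1], dy_lag)[0, 1]**2
print(f"first-stage R2, differenced eq: {r2_dif:.4f}, levels eq: {r2_lev:.4f}")
```

Both first-stage R-squareds come out small and of comparable magnitude, which is the weak-instrument symmetry the abstract describes.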
By:  Javier Gonzalez; Daniel Pena; Rosario Romera 
Abstract:  Partial least squares regression (PLS) is a linear regression technique developed to relate many regressors to one or several response variables. Robust methods are introduced to reduce or remove the effect of outlying data points. In this paper we show that if the sample covariance matrix is properly robustified, further robustification of the linear regression steps of the PLS algorithm becomes unnecessary. The robust estimate of the covariance matrix is computed by searching for outliers in univariate projections of the data on a combination of random directions (Stahel-Donoho) and specific directions obtained by maximizing and minimizing the kurtosis coefficient of the projected data, as proposed by Peña and Prieto (2006). It is shown that this procedure is fast to apply and provides better results than other procedures proposed in the literature. Its performance is illustrated by Monte Carlo simulations and by an example in which the algorithm reveals features of the data that were undetected by previous methods. 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:cte:wsrepe:ws071304&r=ecm 
By:  Fulvio Corsi; Francesco Audrino 
Abstract:  We propose the Heterogeneous Autoregressive (HAR) model for the estimation and prediction of realized correlations. We construct a realized correlation measure where both the volatilities and the covariances are computed from tick-by-tick data. As with realized volatility, the presence of market microstructure noise can induce significant bias in standard realized covariance measures computed with artificially regularly spaced returns. In contrast to these standard approaches, we analyse a simple and unbiased realized covariance estimator that does not resort to the construction of a regular grid, but directly and efficiently employs the raw tick-by-tick returns of the two series. Monte Carlo simulations calibrated on realistic market microstructure conditions show that this simple tick-by-tick covariance estimator possesses no bias and the smallest dispersion among the covariance estimators considered in the study. In an empirical analysis of S&P 500 and US bond data we find that realized correlations show significant regime changes in reaction to financial crises. Such regimes must be taken into account to obtain reliable estimates and forecasts. 
Keywords:  High frequency data, Realized Correlation, Market Microstructure, Bias correction, HAR, Regimes 
JEL:  C13 C22 C51 C53 
Date:  2007–01 
URL:  http://d.repec.org/n?u=RePEc:usg:dp2007:200702&r=ecm 
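The abstract does not spell out the estimator's formula. A well-known estimator in the same spirit, which sums cross-products of raw tick-by-tick returns over overlapping intervals instead of synchronizing onto a regular grid, is the Hayashi-Yoshida cumulative covariance estimator; the sketch below uses it on simulated asynchronous data (all parameter values are illustrative, and this is not claimed to be the paper's exact estimator):

```python
import numpy as np

rng = np.random.default_rng(1)

def hayashi_yoshida(t1, p1, t2, p2):
    """Cumulative covariance: sum r_i*s_j over all pairs of observation
    intervals of the two series that overlap in time."""
    r, s = np.diff(p1), np.diff(p2)
    cov = 0.0
    for i in range(len(r)):
        a, b = t1[i], t1[i+1]
        for j in range(len(s)):
            c, d = t2[j], t2[j+1]
            if min(b, d) > max(a, c):   # intervals overlap
                cov += r[i]*s[j]
    return cov

# illustrative setup: two correlated Brownian log-prices on a fine grid,
# then observed asynchronously at independent random times
n, rho, sig = 20000, 0.6, 1.0
dt = 1.0/n
dW = rng.normal(0.0, np.sqrt(dt), (2, n))
dP1 = sig*dW[0]
dP2 = sig*(rho*dW[0] + np.sqrt(1 - rho**2)*dW[1])
grid = np.linspace(0.0, 1.0, n + 1)
P1 = np.concatenate([[0.0], np.cumsum(dP1)])
P2 = np.concatenate([[0.0], np.cumsum(dP2)])

idx1 = np.sort(rng.choice(n + 1, 400, replace=False)); idx1[0], idx1[-1] = 0, n
idx2 = np.sort(rng.choice(n + 1, 400, replace=False)); idx2[0], idx2[-1] = 0, n
hy = hayashi_yoshida(grid[idx1], P1[idx1], grid[idx2], P2[idx2])
print(f"tick-by-tick covariance estimate: {hy:.3f} (true value {rho*sig**2:.3f})")
```

Because every raw return is used exactly once against each overlapping return of the other series, no synchronization bias is introduced, which is the property the abstract emphasizes.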
By:  Abbring, Jaap H; van den Berg, Gerard J 
Abstract:  In a large class of hazard models with proportional unobserved heterogeneity, the distribution of the heterogeneity among survivors converges to a gamma distribution. This convergence is often rapid. We derive this result as a general result for exponential mixtures and explore its implications for the specification and empirical analysis of univariate and multivariate duration models. 
Keywords:  duration analysis; exponential mixture; gamma distribution; limit distribution; mixed proportional hazard 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:6219&r=ecm 
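The gamma limit can be checked numerically. In the sketch below (illustrative, not the authors' code), the baseline hazard is set to one so that durations are exponential given the frailty; a Beta(2,2) frailty, whose density behaves like v to the power k-1 with k = 2 near zero, yields a survivor frailty distribution close to Gamma(2, t) for large t:

```python
import numpy as np

rng = np.random.default_rng(2)

# Mixed proportional hazard with unit baseline: theta(t|V) = V, so T|V ~ Exp(V).
# If the frailty density behaves like c*v^(k-1) near 0, the frailty
# distribution among survivors {T > t} converges to Gamma(k, t) as t grows.
N, t = 1_000_000, 20.0          # illustrative sample size and horizon
V = rng.beta(2, 2, N)           # density ~ 6v near 0, so k = 2
T = rng.exponential(1.0/V)      # duration given frailty
surv = V[T > t]                 # frailty among survivors

m, v = surv.mean(), surv.var()
# Gamma(k, t) has mean k/t and variance k/t^2, so var/mean^2 = 1/k = 0.5
print(f"survivors: {len(surv)}, mean {m:.4f} (gamma limit: {2/t:.4f}), "
      f"var/mean^2 {v/m**2:.3f} (gamma limit: 0.5)")
```

The ratio var/mean^2 is a convenient shape check because it equals 1/k for any Gamma(k, rate) distribution, regardless of the rate.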
By:  Raymond Kan; Cesare Robotti 
Abstract:  Although it is of interest to empirical researchers to test whether or not a particular asset-pricing model is true, a more useful task is to determine how wrong a model is and to compare the performance of competing asset-pricing models. In this paper, we propose a new methodology to test whether two competing linear asset-pricing models have the same Hansen-Jagannathan distance. We show that the asymptotic distribution of the test statistic depends on whether the competing models are correctly specified or misspecified, and on whether they are nested or non-nested. In addition, given the increasing interest in misspecified models, we propose a simple methodology for computing the standard errors of the estimated stochastic discount factor parameters that are robust to model misspecification. Using the same data as Hodrick and Zhang (2001), we show that the commonly used returns and factors are, for the most part, too noisy to conclude that one model is superior to the others in terms of the Hansen-Jagannathan distance. In addition, we show that many of the macroeconomic factors commonly used in the literature are no longer priced once potential model misspecification is taken into account. 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:200704&r=ecm 
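For a linear SDF m_t = f_t'b, the sample Hansen-Jagannathan distance has a standard closed form: minimize the pricing-error quadratic form (Db - q)'G^{-1}(Db - q), where G is the second-moment matrix of returns, D the cross-moment of returns and factors, and q a vector of ones. A minimal sketch on simulated data (not the paper's code; the toy return process and parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def hj_distance(R, F):
    """Sample Hansen-Jagannathan distance of the linear SDF m_t = f_t' b.

    R: T x N gross returns, F: T x K factors (include a constant column).
    Returns (delta, b): the distance and the minimizing SDF coefficients.
    """
    T, N = R.shape
    q = np.ones(N)                      # price of a gross return is 1
    G = R.T @ R / T                     # second-moment matrix of returns
    D = R.T @ F / T                     # cross-moment E[R f']
    Gi = np.linalg.inv(G)
    b = np.linalg.solve(D.T @ Gi @ D, D.T @ Gi @ q)
    delta2 = q @ Gi @ q - q @ Gi @ D @ b
    return np.sqrt(max(delta2, 0.0)), b

# illustrative toy data: 5 assets, one-factor candidate SDF
T, N = 2000, 5
f = rng.normal(0.0, 0.1, T)
R = 1.01 + rng.normal(0.0, 0.05, (T, N)) + np.outer(f, rng.uniform(0.5, 1.5, N))
F = np.column_stack([np.ones(T), f])
delta, b = hj_distance(R, F)
print(f"HJ distance: {delta:.4f}, SDF coefficients: {b.round(3)}")
```

The distance is a maximum pricing error over normalized payoffs, which is what makes it usable as a common yardstick for comparing misspecified models as the paper proposes.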
By:  Barry E. Jones; Travis D. Nesmith 
Abstract:  We derive a definition of linear cointegration for nonlinear stochastic processes using a martingale representation theorem. The result shows that stationary linear cointegrations can exhibit nonlinear dynamics, in contrast with the usual assumption of linearity. We propose a sequential nonparametric method to test first for cointegration and second for nonlinear dynamics in the cointegrated system. We apply this method to weekly US interest rates constructed using a multirate filter rather than averaging. The Treasury Bill, Commercial Paper and Federal Funds rates are cointegrated, with two cointegrating vectors. Both cointegrating relations behave nonlinearly. Consequently, linear models will not fully replicate the dynamics of monetary policy transmission. 
Keywords:  Time-series analysis ; Cointegration ; Interest rates 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:200703&r=ecm 
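The paper's test is nonparametric; as a point of reference, the classic parametric check is the Engle-Granger two-step procedure: regress one series on the other, then run a Dickey-Fuller regression on the residuals. A hand-rolled sketch on simulated data (illustrative only, not the authors' method):

```python
import numpy as np

rng = np.random.default_rng(4)

def df_tstat(e):
    """t-statistic on rho in the Dickey-Fuller regression de_t = rho*e_{t-1} + u_t."""
    de, lag = np.diff(e), e[:-1]
    rho = lag @ de / (lag @ lag)
    resid = de - rho*lag
    se = np.sqrt(resid @ resid / (len(de) - 1) / (lag @ lag))
    return rho / se

T = 2000   # illustrative sample size
# cointegrated pair: x is a random walk, y = 2x + stationary error
x = np.cumsum(rng.normal(0, 1, T))
y = 2*x + rng.normal(0, 1, T)
beta = (x @ y) / (x @ x)              # step 1: cointegrating regression
t_coint = df_tstat(y - beta*x)        # step 2: DF test on the residuals

# contrast: two independent random walks (no cointegration)
z = np.cumsum(rng.normal(0, 1, T))
gamma_ = (x @ z) / (x @ x)
t_spur = df_tstat(z - gamma_*x)
print(f"DF t-stat, cointegrated pair: {t_coint:.2f}; independent walks: {t_spur:.2f}")
```

A strongly negative t-statistic on the cointegrated pair and a much milder one on the independent walks is the expected pattern; note that residual-based DF tests require their own critical values, which this sketch omits.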
By:  Favero, Carlo A; Niu, Linlin; Sala, Luca 
Abstract:  This paper addresses the issue of forecasting the term structure. We provide a unified state-space modelling framework that encompasses different existing discrete-time yield curve models. Within this framework we analyze the impact on forecasting performance of two crucial modelling choices, i.e. the imposition of no-arbitrage restrictions and the size of the information set used to extract factors. Using US yield curve data, we find that: a. macro factors are very useful for forecasting at medium/long horizons; b. financial factors are useful for short-run forecasting; c. no-arbitrage models are effective in shrinking the dimensionality of the parameter space and, when supplemented with additional macro information, are very effective in forecasting; d. within no-arbitrage models, assuming a time-varying risk price is more favourable than assuming a constant risk price for medium-horizon/maturity forecasts when yield factors dominate the information set, and for short-horizon and long-maturity forecasts when macro factors dominate the information set; e. however, given the complexity and the highly nonlinear parameterization of no-arbitrage models, it is very difficult to exploit within this class of models the additional information offered by large macroeconomic datasets. 
Keywords:  factor models; forecasting; large data set; term structure of interest rates; Yield curve 
JEL:  C33 C53 E43 E44 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:6206&r=ecm 
By:  Gregory H. Bauer; Keith Vorkink 
Abstract:  We present a new matrix-logarithm model of the realized covariance matrix of stock returns. The model uses latent factors which are functions of both lagged volatility and returns. The model has several advantages: it is parsimonious; it does not require imposing parameter restrictions; and it results in a positive-definite covariance matrix. We apply the model to the covariance matrix of size-sorted stock returns and find that two factors are sufficient to capture most of the dynamics. We also introduce a new method to track an index using our model of the realized volatility covariance matrix. 
Keywords:  Econometric and statistical methods; Financial markets 
JEL:  G14 C53 C32 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:bca:bocawp:0720&r=ecm 
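The positive-definiteness advantage comes for free with the matrix logarithm: latent dynamics can move an unconstrained symmetric matrix around freely, and the matrix exponential always maps it back to a symmetric positive-definite covariance matrix. A sketch of the round trip using an eigendecomposition (illustrative; SciPy's expm/logm would serve equally well):

```python
import numpy as np

rng = np.random.default_rng(5)

def expm_sym(A):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, Q = np.linalg.eigh(A)
    return (Q * np.exp(w)) @ Q.T

def logm_spd(S):
    """Matrix logarithm of a symmetric positive-definite matrix."""
    w, Q = np.linalg.eigh(S)
    return (Q * np.log(w)) @ Q.T

# any symmetric A (illustrative 4x4 here) maps to a symmetric
# positive-definite exp(A), so no parameter restrictions are needed
A = rng.normal(0, 1, (4, 4)); A = (A + A.T)/2
S = expm_sym(A)
print("eigenvalues of exp(A):", np.linalg.eigvalsh(S).round(4))
print("round-trip max error:", np.abs(logm_spd(S) - A).max())
```

The eigenvalues of exp(A) are the exponentials of A's (real) eigenvalues, hence strictly positive whatever the factor dynamics do to A.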
By:  Tobias, Justin 
Abstract:  We describe estimation, learning and prediction in a treatment-response model with two outcomes. The introduction of potential outcomes in this model introduces four cross-regime correlation parameters that are not contained in the likelihood for the observed data and thus are not identified. Despite this inescapable identification problem, we build upon the results of Koop and Poirier (1997) to describe how learning takes place about the four non-identified correlations through the imposed positive definiteness of the covariance matrix. We then derive bivariate distributions associated with commonly estimated "treatment parameters" (including the Average Treatment Effect and the effect of Treatment on the Treated), and use the learning that takes place about the non-identified correlations to calculate these densities. We illustrate our points in several generated-data experiments and apply our methods to estimate the joint impact of child labor on achievement scores in language and mathematics. 
Keywords:  Bayesian econometrics, Treatment effects 
Date:  2005–11–30 
URL:  http://d.repec.org/n?u=RePEc:isu:genres:12480&r=ecm 
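The mechanism by which positive definiteness alone generates learning about a non-identified correlation can be sketched in the simplest case: in a 3x3 correlation matrix, two identified entries bound the third. The paper's model has four non-identified cross-regime correlations; this two-outcome toy case, with illustrative values, only shows the principle:

```python
import numpy as np

def rho12_bounds(r01, r02):
    """Range of the non-identified correlation rho12 consistent with a
    positive-definite 3x3 correlation matrix with identified r01, r02."""
    c = r01*r02
    h = np.sqrt((1 - r01**2)*(1 - r02**2))
    return c - h, c + h

def is_pd(r01, r02, r12):
    C = np.array([[1, r01, r02], [r01, 1, r12], [r02, r12, 1]])
    return np.linalg.eigvalsh(C).min() > 0

# illustrative identified correlations
lo, hi = rho12_bounds(0.7, -0.4)
print(f"rho12 must lie in ({lo:.3f}, {hi:.3f})")

# inside the interval the matrix is PD, outside it is not
mid = (lo + hi)/2
print("midpoint PD:", is_pd(0.7, -0.4, mid))
print("outside PD:", is_pd(0.7, -0.4, hi + 0.05))
```

The tighter the identified correlations are pinned down by the data, the narrower this feasible interval, which is exactly how a posterior for the non-identified parameter updates despite its absence from the likelihood.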
By:  Manuel Gomez (School of Economics, Universidad de Guanajuato); Daniel VentosaSantaularia (School of Economics, Universidad de Guanajuato) 
Abstract:  We investigate the efficiency of the Dickey-Fuller (DF) test as a tool to examine the convergence hypothesis. In doing so, we first describe two possible outcomes, overlooked in previous studies, namely Loose Catching-up and Loose Lagging-behind. Results suggest that this test is useful when the intention is to discriminate between a unit root process and a trend stationary process, though unreliable when used to differentiate between a unit root process and a process with both deterministic and stochastic trends. This issue may explain the lack of support for the convergence hypothesis in the convergence literature. 
Keywords:  Divergence, Loose Catching-up/Lagging-behind, Convergence, Deterministic and Stochastic trends 
JEL:  C32 O40 
URL:  http://d.repec.org/n?u=RePEc:gua:wpaper:em200703&r=ecm 
By:  David N. DeJong; Hariharan Dharmarajan; Roman Liesenfeld; JeanFrancois Richard 
Abstract:  . . . 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:pit:wpaper:300&r=ecm 
By:  Yann Bramoullé (CIRPÉE, Université Laval); Habiba Djebbari (CIRPÉE, Université Laval and IZA); Bernard Fortin (CIRPÉE, Université Laval) 
Abstract:  We provide new results regarding the identification of peer effects. We consider an extended version of the linear-in-means model where each individual has his own specific reference group. Interactions are thus structured through a social network. We assume that correlated unobservables are either absent, or treated as fixed effects at the component level. In both cases, we provide easy-to-check necessary and sufficient conditions for identification. We show that endogenous and exogenous effects are generally identified under network interaction, although identification may fail for some particular structures. Monte Carlo simulations provide an analysis of the effects of some crucial characteristics of a network (i.e., density, intransitivity) on the estimates of social effects. Our approach generalizes a number of previous results due to Manski (1993), Moffitt (2001), and Lee (2006). 
Keywords:  peer effects, social networks, identification 
JEL:  D85 L14 Z13 C3 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp2652&r=ecm 
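The identification condition can be checked mechanically: endogenous and exogenous peer effects are (generically) identified when the identity matrix, the interaction matrix G and its square G^2 are linearly independent, which fails for the classic complete-group linear-in-means setting but holds for intransitive structures such as a directed ring. A minimal sketch with illustrative matrices (not the authors' code):

```python
import numpy as np

def ig_g2_independent(G):
    """Check whether I, G and G^2 are linearly independent, the
    (generic) identification condition under network interaction."""
    n = G.shape[0]
    M = np.column_stack([np.eye(n).ravel(), G.ravel(), (G @ G).ravel()])
    return np.linalg.matrix_rank(M) == 3

n = 6  # illustrative network size
# complete group (classic linear-in-means): everyone averages everyone else;
# here G^2 is a linear combination of I and G, so identification fails
G_group = (np.ones((n, n)) - np.eye(n)) / (n - 1)

# directed ring: each individual's reference group is the next individual;
# G^2 shifts by two steps and is independent of I and G
G_ring = np.roll(np.eye(n), 1, axis=1)

print("complete group identified:", ig_g2_independent(G_group))
print("directed ring identified:", ig_g2_independent(G_ring))
```

Intuitively, when G^2 is not spanned by I and G, peers-of-peers provide variation that separates the endogenous effect from the exogenous one, which is why intransitivity helps.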
By:  Lars Vilhuber 
Abstract:  Research users of large administrative datasets have to adjust their data for quirks, problems, and issues that are inevitable when working with these kinds of datasets. Not all solutions to these problems are identical, and how they differ may affect how the data are to be interpreted. Some elements of the data, such as the unit of observation, remain fundamentally different, and it is important to keep that in mind when comparing data across countries. In this paper (written for Lazear and Shaw, 2007), we focus on the differences in the underlying data for a selection of country datasets. We describe two data elements that remain fundamentally different across countries, namely the sampling or data collection methodology and the basic unit of analysis (establishment or firm), and the extent to which they differ. We then document some of the problems that affect longitudinally linked administrative data in general, describe some of the solutions analysts and statistical agencies have implemented, and explore, through a select set of case studies, how each adjustment, or the absence thereof, might affect the data. 
JEL:  C81 C82 J0 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:12977&r=ecm 
By:  Travis D. Nesmith 
Abstract:  Seasonal adjustment usually relies on statistical models of seasonality that treat seasonal fluctuations as noise corrupting the 'true' data. But seasonality in economic series often stems from economic behavior such as Christmastime spending. Such economic seasonality invalidates the separability assumptions that justify the construction of aggregate economic indexes. To solve this problem, Diewert (1980, 1983, 1998, 1999) incorporates seasonal behavior into aggregation theory. Using duality theory, I extend these results to a larger class of decision problems. I also relax Diewert's assumption of homotheticity. I provide support for Diewert's preferred seasonally adjusted economic index using weak separability assumptions that are shown to be sufficient. 
Keywords:  Seasonal variations (Economics) ; Consumer behavior 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:200704&r=ecm 