
on Econometrics 
By:  Nikolaus Hautsch 
Abstract:  We suggest a robust form of conditional moment test as a constructive test for functional misspecification in multiplicative error models. The proposed test has power solely against violations of the conditional mean restriction but is not affected by any other type of model misspecification. Monte Carlo investigations show that an appropriate choice of weighting function induces high power against various alternatives. We illustrate how to adapt the framework to also test out-of-sample moment restrictions, such as orthogonalities of prediction errors. 
Keywords:  Robust Conditional Moment Tests, Finite Sample Properties, Multiplicative Error Models, Prediction Errors 
JEL:  C12 C22 C52 
Date:  2008–11 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2008067&r=ecm 
By:  Leandro M. Magnusson (Department of Economics, Tulane University) 
Abstract:  This paper presents tests for the structural parameters of a censored regression model with endogenous explanatory variables. These tests have the correct size even when the identification condition for the structural parameter is invalid. My approach starts from the estimation of the unrestricted parameters, which does not depend on the identification of the structural parameter. Next, I set up the optimal minimum distance objective function, from which I derive the tests. The proposed robust tests can be implemented in many statistical software packages, since they demand only the Tobit and ordinary least squares estimation routines. By simulating their power curves, I compare the robust tests to the Wald and likelihood ratio tests. A case study of the labor supply of married women illustrates the use of the robust tests for the construction of confidence intervals. 
Keywords:  Endogenous Tobit, weak instruments, minimum distance estimation, female labor supply 
JEL:  C12 C34 
Date:  2008–12 
URL:  http://d.repec.org/n?u=RePEc:tul:wpaper:0802&r=ecm 
By:  Thomas Flury; Neil Shephard 
Abstract:  Suppose we wish to carry out likelihood-based inference but we have only an unbiased simulation-based estimator of the likelihood. We note that unbiasedness is enough when the estimated likelihood is used inside a Metropolis-Hastings algorithm. This result was recently introduced in the statistics literature by Andrieu, Doucet, and Holenstein (2007) and is perhaps surprising given the celebrated results on maximum simulated likelihood estimation. Bayesian inference based on simulated likelihood can be widely applied in microeconomics, macroeconomics and financial econometrics. One way of generating unbiased estimates of the likelihood is the use of a particle filter. We illustrate these methods on four problems in econometrics, producing rather generic methods. Taken together, these methods imply that if we can simulate from an economic model, we can carry out likelihood-based inference using its simulations. 
Keywords:  Dynamic stochastic general equilibrium models, inference, likelihood, MCMC, Metropolis-Hastings, particle filter, state space models, stochastic volatility 
JEL:  C11 C13 C15 C32 E32 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:sbs:wpsefe:2008fe32&r=ecm 
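The key idea of the Flury and Shephard abstract — an unbiased likelihood estimate plugged into Metropolis-Hastings still targets the exact posterior — can be sketched in a few lines. This is a toy illustration, not the authors' code: the model (y = θ + u + e with latent u ~ N(0,1), noise e ~ N(0,1), flat prior on θ), the number of simulations, and the step size are all illustrative assumptions.

```python
import math
import random

random.seed(0)

def unbiased_lik(theta, y, n_sims=50):
    """Unbiased Monte Carlo estimate of p(y | theta) in the toy model
    y = theta + u + e: average the conditional density p(y | theta, u)
    over simulated draws of the latent u."""
    total = 0.0
    for _ in range(n_sims):
        u = random.gauss(0.0, 1.0)
        total += math.exp(-0.5 * (y - theta - u) ** 2) / math.sqrt(2.0 * math.pi)
    return total / n_sims

def pseudo_marginal_mh(y, n_iter=2000, step=0.8):
    """Random-walk Metropolis-Hastings with the true likelihood replaced
    by an unbiased estimate; the chain still targets the exact posterior
    (here under a flat prior on theta)."""
    theta = 0.0
    lik = unbiased_lik(theta, y)
    draws = []
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)
        lik_prop = unbiased_lik(prop, y)      # fresh estimate at the proposal
        if random.random() < lik_prop / max(lik, 1e-300):
            theta, lik = prop, lik_prop       # carry the estimate with the state
        draws.append(theta)
    return draws

draws = pseudo_marginal_mh(y=1.0)
post_mean = sum(draws[500:]) / len(draws[500:])
```

The crucial detail is that the likelihood estimate is stored with the current state and reused, not recomputed, when evaluating the acceptance ratio; recomputing it each iteration would break the exactness argument.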
By:  Leandro M. Magnusson (Department of Economics, Tulane University) 
Abstract:  We propose tests for structural parameters in limited dependent variable models with endogenous explanatory variables using the classical minimum distance framework. These tests have the correct size whether or not the structural parameters are identified. Compared with existing tests, ours are especially appropriate for models whose moment conditions are nonlinear in the parameters. Moreover, our tests are simple to compute, allowing their implementation in a large number of statistical software packages. We compare our tests with Wald tests in simulation experiments. We use our tests to analyze female labor supply and the demand for cigarettes. 
Keywords:  Weak identification, minimum chi-square estimation, hypothesis testing, limited dependent variable models 
JEL:  C12 C30 C34 
Date:  2008–09 
URL:  http://d.repec.org/n?u=RePEc:tul:wpaper:0801&r=ecm 
By:  John Geweke (Departments of Statistics and Economics, University of Iowa, 430 N. Clinton St., Iowa City, IA 52242-2020, USA.); Gianni Amisano (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.) 
Abstract:  Bayesian inference in a time series model provides exact, out-of-sample predictive distributions that fully and coherently incorporate parameter uncertainty. This study compares and evaluates Bayesian predictive distributions from alternative models, using as an illustration five alternative models of asset returns applied to daily S&P 500 returns from 1976 through 2005. The comparison exercise uses predictive likelihoods and is inherently Bayesian. The evaluation exercise uses the probability integral transform and is inherently frequentist. The illustration shows that the two approaches can be complementary, each identifying strengths and weaknesses in models that are not evident using the other. JEL Classification: C11, C53. 
Keywords:  Forecasting, GARCH, inverse probability transform, Markov mixture, predictive likelihood, S&P 500 returns, stochastic volatility. 
Date:  2008–11 
URL:  http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080969&r=ecm 
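The frequentist evaluation tool Geweke and Amisano use, the probability integral transform (PIT), is easy to demonstrate: if the forecast distribution is correct, transforming each realization by the forecast CDF yields an i.i.d. Uniform(0,1) sample (mean 1/2, variance 1/12). A minimal sketch with simulated "returns" and an assumed Gaussian forecast model, not the paper's data or models:

```python
import math
import random

random.seed(1)

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# simulate returns from the "true" model N(0,1), then transform each
# observation by a forecast model's CDF
returns = [random.gauss(0.0, 1.0) for _ in range(5000)]
pit_ok = [norm_cdf(r, 0.0, 1.0) for r in returns]    # correct forecast model
pit_bad = [norm_cdf(r, 0.0, 2.0) for r in returns]   # variance overstated

def mean_var(u):
    m = sum(u) / len(u)
    return m, sum((x - m) ** 2 for x in u) / len(u)

mean_ok, var_ok = mean_var(pit_ok)     # should be near 0.5 and 1/12
mean_bad, var_bad = mean_var(pit_bad)  # PITs pile up near 0.5: variance too small
```

An overstated forecast variance compresses the PITs toward 0.5, so their sample variance falls visibly below 1/12; that departure from uniformity is exactly the kind of misspecification the evaluation exercise detects.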
By:  Laurent Lamy 
Abstract:  We consider standard auction models when bidders' identities are not observed by the econometrician. First, we adapt the definition of identifiability to a framework with anonymous bids and explore the extent to which anonymity reduces the possibility of identifying private value auction models. Second, in the asymmetric independent private value model, which is nonparametrically identified, we generalize Guerre, Perrigne and Vuong's estimation procedure [Optimal Nonparametric Estimation of First-Price Auctions, Econometrica 68 (2000) 525-574] and study the asymptotic properties of our multistep kernel-based estimator. Third, a test for symmetry is proposed. Monte Carlo simulations illustrate the practical relevance of our estimation and testing procedures for small data sets. 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:pse:psecon:200864&r=ecm 
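The Guerre, Perrigne and Vuong procedure that Lamy generalizes recovers each bidder's private value from the first-order condition v = b + G(b) / ((n-1) g(b)), where G and g are the bid CDF and density estimated by kernel methods. A toy sketch of that step in the symmetric case, not the paper's anonymous-bids estimator: the uniform-values design, sample size, and bandwidth are illustrative assumptions.

```python
import math
import random

random.seed(5)

def kernel_cdf_pdf(b, bids, h):
    """Gaussian-kernel estimates of the bid CDF G and density g at b."""
    n = len(bids)
    G = sum(0.5 * (1.0 + math.erf((b - x) / (h * math.sqrt(2.0)))) for x in bids) / n
    g = sum(math.exp(-0.5 * ((b - x) / h) ** 2) for x in bids) / (n * h * math.sqrt(2.0 * math.pi))
    return G, g

# first-price auction, n symmetric bidders, Uniform(0,1) values:
# the equilibrium bid is b = (n-1)/n * v, so inverting the first-order
# condition should recover the true private values
n_bidders = 3
values = [random.random() for _ in range(2000)]
bids = [(n_bidders - 1) / n_bidders * v for v in values]

h = 0.05  # illustrative bandwidth choice
pseudo = []
for b, v in zip(bids[:300], values[:300]):
    if 0.1 < b < 0.55:  # trim near the support boundary, as GPV recommend
        G, g = kernel_cdf_pdf(b, bids, h)
        pseudo.append((b + G / ((n_bidders - 1) * g), v))

mae = sum(abs(vh - v) for vh, v in pseudo) / len(pseudo)
```

The recovered pseudo-values then serve as data for a second-stage kernel estimate of the value density; trimming near the boundary is needed because kernel estimates of G and g are badly biased there.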
By:  Daniel O. Beltran; David Draper 
Abstract:  This paper estimates the parameters of a stylized dynamic stochastic general equilibrium model using maximum likelihood and Bayesian methods, paying special attention to the issue of weak parameter identification. Given the model and the available data, the posterior estimates of the weakly identified parameters are very sensitive to the choice of priors. We provide a set of tools to diagnose weak identification, which include surface plots of the log-likelihood as a function of two parameters, heat plots of the log-likelihood as a function of three parameters, Monte Carlo simulations using artificial data, and Bayesian estimation using three sets of priors. We find that the policy coefficients and the parameter governing the elasticity of labor supply are weakly identified by the data, and posterior predictive distributions remind us that DSGE models may make poor forecasts even when they fit the data well. Although parameter identification is model- and data-specific, the lack of identification of some key structural parameters in a small-scale DSGE model such as the one we examine should raise a red flag to researchers trying to estimate, and draw valid inferences from, large-scale models featuring many more parameters. 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgif:955&r=ecm 
By:  John K. Dagsvik, Torbjørn Hægeland and Arvid Raknerud (Statistics Norway) 
Abstract:  In this paper we develop likelihood-based methods for statistical inference in a joint system of equations for the choice of length of schooling and earnings. The model for schooling choice is assumed to be an ordered probit model, whereas the earnings equation contains variables that are flexible transformations of schooling and experience, with corresponding coefficients that are allowed to be heterogeneous across individuals. Under the assumption that the distribution of the random terms of the model can be expressed as a particular finite mixture of multinormal distributions, we show that the joint probability distribution for schooling and earnings can be expressed in closed form. In an application of our method to Norwegian data, we find that the mixed Gaussian model offers a substantial improvement in fit to the (heavy-tailed) empirical distribution of log-earnings compared to a multinormal benchmark model. 
Keywords:  Schooling choice; earnings equation; normal mixtures; treatment effects; selfselection; random coefficients; full information maximum likelihood 
JEL:  C31 I20 J30 
Date:  2008–12 
URL:  http://d.repec.org/n?u=RePEc:ssb:dispap:567&r=ecm 
By:  Gregor Bäurle 
Abstract:  We propose a method to incorporate information from Dynamic Stochastic General Equilibrium (DSGE) models into Dynamic Factor Analysis. The method combines a procedure previously applied for Bayesian Vector Autoregressions and a Gibbs Sampling approach for Dynamic Factor Models. The factors in the model are rotated such that they can be interpreted as variables from a DSGE model. In contrast to standard Dynamic Factor Analysis, a direct economic interpretation of the factors is given. We evaluate the forecast performance of the model with respect to the amount of information from the DSGE model included in the estimation. We conclude that using prior information from a standard New Keynesian DSGE model improves the forecast performance. We also analyze the impact of identified monetary shocks on both the factors and selected series. The interpretation of the factors as variables from the DSGE model allows us to use an identification scheme which is directly linked to the DSGE model. The responses of the factors in our application resemble responses found using VARs. However, there are deviations from standard results when looking at the responses of specific series to common shocks. 
Keywords:  Dynamic Factor Model; DSGE Model; Bayesian Analysis; Forecasting; Transmission of Shocks 
JEL:  C11 C32 E0 
Date:  2008–08 
URL:  http://d.repec.org/n?u=RePEc:ube:dpvwib:dp0803&r=ecm 
By:  Marc Hallin 
Abstract:  The likelihood ratio test for m-sample homogeneity of covariance is notoriously sensitive to violations of Gaussian assumptions. Its asymptotic behavior under non-Gaussian densities has been the subject of an abundant literature. In a recent paper, Yanagihara et al. (2005) show that the asymptotic distribution of the likelihood ratio test statistic, under arbitrary elliptical densities with finite fourth-order moments, is that of a linear combination of two mutually independent chi-square variables. Their proof is based on characteristic function methods and only allows for convergence-in-distribution conclusions. Moreover, they require homokurticity among the m populations. Exploiting the findings of Hallin and Paindaveine (2008a), we reinforce that convergence-in-distribution result into a convergence-in-probability one: we explicitly decompose the likelihood ratio test statistic into a linear combination of two variables that are asymptotically independent chi-square, and moreover extend the result to the heterokurtic case. 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:eca:wpaper:2008_039&r=ecm 
By:  Strid, Ingvar (Dept. of Economic Statistics, Stockholm School of Economics) 
Abstract:  Prefetching is a simple and general method for single-chain parallelisation of the Metropolis-Hastings algorithm based on the idea of evaluating the posterior in parallel and ahead of time. In this paper improved Metropolis-Hastings prefetching algorithms are presented and evaluated. It is shown how to use available information to make better predictions of the future states of the chain and increase the efficiency of prefetching considerably. The optimal acceptance rate for the prefetching random walk Metropolis-Hastings algorithm is obtained for a special case and it is shown to decrease in the number of processors employed. The performance of the algorithms is illustrated using a well-known macroeconomic model. Bayesian estimation of DSGE models, linearly or nonlinearly approximated, is identified as a potential area of application for prefetching methods. The generality of the proposed method, however, suggests that it could be applied in many other contexts as well. 
Keywords:  Prefetching; Metropolis-Hastings; Parallel Computing; DSGE models; Optimal acceptance rate 
JEL:  C11 C13 C63 
Date:  2008–12–02 
URL:  http://d.repec.org/n?u=RePEc:hhs:hastef:0706&r=ecm 
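The prefetching idea Strid studies can be illustrated with the simplest two-step-ahead variant: from the current state, the step-1 proposal and both possible step-2 proposals (one for each branch of the accept/reject decision) are drawn up front, and all three posterior evaluations run in parallel, so two MH steps cost one parallel round. A toy sketch, not the paper's improved algorithms; the Gaussian target, step size, and chain length are illustrative assumptions, and threads stand in for the processors a real implementation would use.

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

random.seed(2)

def log_post(theta):
    # toy target: standard normal log-density, up to an additive constant
    return -0.5 * theta * theta

def prefetch_mh(n_rounds=500, step=1.0):
    """Two-steps-ahead prefetching random-walk Metropolis-Hastings:
    each round evaluates the step-1 proposal and both possible step-2
    proposals ahead of time, in parallel."""
    theta = 0.0
    lp = log_post(theta)
    draws = []
    with ThreadPoolExecutor(max_workers=3) as pool:
        for _ in range(n_rounds):
            p1 = theta + random.gauss(0.0, step)      # step-1 proposal
            p2_rej = theta + random.gauss(0.0, step)  # step-2 proposal if p1 rejected
            p2_acc = p1 + random.gauss(0.0, step)     # step-2 proposal if p1 accepted
            # all three posterior evaluations are prefetched in parallel
            futures = [pool.submit(log_post, x) for x in (p1, p2_rej, p2_acc)]
            lp1, lp2r, lp2a = (f.result() for f in futures)
            # step 1: ordinary accept/reject
            if math.log(random.random()) < lp1 - lp:
                theta, lp = p1, lp1
                p2, lp2 = p2_acc, lp2a
            else:
                p2, lp2 = p2_rej, lp2r
            draws.append(theta)
            # step 2 reuses the prefetched evaluation: no new posterior call
            if math.log(random.random()) < lp2 - lp:
                theta, lp = p2, lp2
            draws.append(theta)
    return draws

draws = prefetch_mh()
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Only one of the two prefetched step-2 evaluations is ever used, which is why naive prefetching needs 2^h - 1 evaluations for h steps; the paper's contribution is predicting which branch the chain will take so fewer speculative evaluations are wasted.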
By:  Les Oxley (University of Canterbury); Marco Reale; Granville Tunnicliffe Wilson 
Abstract:  In this paper graphical modelling is used to select a sparse structure for a multivariate time series model of New Zealand interest rates. In particular, we consider a recursive structural vector autoregression that can subsequently be described parsimoniously by a directed acyclic graph, which could be given a causal interpretation. A comparison between competing models is then made on the basis of likelihood and economic theory. 
Keywords:  Graphical models; directed acyclic graphs; term structure; causality. 
JEL:  E43 E44 C01 C32 
Date:  2008–11–28 
URL:  http://d.repec.org/n?u=RePEc:cbt:econwp:08/19&r=ecm 
By:  Claude Lopez; David H. Papell 
Abstract:  While panel unit root tests have been used to investigate a wide range of macroeconomic issues, the tests suffer from low power to reject the unit root null in panels of stationary series if the panels consist of highly persistent series, contain a small number of series, and/or have series of limited length. We propose a new procedure to increase the power of panel unit root tests when used to study convergence, by testing for stationarity of the differentials between a group of series and their cross-sectional means. Although each differential has nonzero mean, the group of differentials has a cross-sectional average of zero for each time period by construction, and we incorporate this constraint in estimation and when generating finite sample critical values. This procedure leads to significant power gains for the panel unit root test. We apply our new approach to study inflation convergence among the Euro Area countries for the post-1979 period. The results show strong evidence of convergence soon after the implementation of the Maastricht Treaty. Furthermore, median unbiased estimates of the half-life of the differential for the periods before and after the Euro show a dramatic decrease in persistence after the introduction of the single currency. 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:cin:ucecwp:200810&r=ecm 
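The construction at the heart of the Lopez-Papell procedure, differentials from the cross-sectional mean whose cross-sectional average is zero in every period by construction, is easy to verify on a toy panel. A minimal sketch with simulated persistent series standing in for the inflation data; panel dimensions and the AR coefficient are illustrative assumptions.

```python
import random

random.seed(3)

N, T = 6, 40  # small panel: N series (e.g. countries), T periods

# toy panel of highly persistent series around a common level
panel = []
for i in range(N):
    x, series = random.gauss(0.0, 1.0), []
    for t in range(T):
        x = 0.9 * x + random.gauss(0.0, 0.1)  # AR(1) with persistence 0.9
        series.append(2.0 + x)                # common mean level
    panel.append(series)

# differentials from the cross-sectional mean: the series actually tested
cross_mean = [sum(panel[i][t] for i in range(N)) / N for t in range(T)]
diffs = [[panel[i][t] - cross_mean[t] for t in range(T)] for i in range(N)]

# by construction the differentials sum to zero in every period; this is
# the constraint imposed in estimation and when simulating critical values
period_sums = [sum(diffs[i][t] for i in range(N)) for t in range(T)]
```

Because the adding-up constraint holds exactly in every period, imposing it when estimating and when generating finite-sample critical values is what delivers the power gains the abstract reports.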
By:  Jian Wang; Jason J. Wu 
Abstract:  This paper attacks the Meese-Rogoff (exchange rate disconnect) puzzle from a different perspective: out-of-sample interval forecasting. Most studies in the literature focus on point forecasts. In this paper, we apply Robust Semiparametric (RS) interval forecasting to a group of Taylor rule models. Forecast intervals for twelve OECD exchange rates are generated, and modified tests of Giacomini and White (2006) are conducted to compare the performance of Taylor rule models and the random walk. Our contribution is twofold. First, we find that in general, Taylor rule models generate tighter forecast intervals than the random walk, given that their intervals cover out-of-sample exchange rate realizations equally well. This result is more pronounced at longer horizons. Our results suggest a connection between exchange rates and economic fundamentals: economic variables contain information useful in forecasting the distributions of exchange rates. The benchmark Taylor rule model is also found to perform better than the monetary and PPP models. Second, the inference framework proposed in this paper for forecast-interval evaluation can be applied in a broader context, such as inflation forecasting, not just to the models and interval forecasting methods used in this paper. 
Keywords:  Foreign exchange; Forecasting; Taylor's rule; Econometric models - Evaluation 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:fip:feddgw:22&r=ecm 
By:  Meredith Beechey; Erik Hjalmarsson; Pär Österholm 
Abstract:  Nominal interest rates are unlikely to be generated by unit-root processes. Using data on short and long interest rates from eight developed and six emerging economies, we test the expectations hypothesis using cointegration methods under the assumption that interest rates are near-integrated. If the null hypothesis of no cointegration is rejected, we then test whether the estimated cointegrating vector is consistent with that suggested by the expectations hypothesis. The results show support for cointegration in ten of the fourteen countries we consider, and the cointegrating vector is similar across countries. However, the parameters differ from those suggested by theory. We relate our findings to the existing literature on the failure of the expectations hypothesis and to the role of term premia. 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgif:953&r=ecm 
By:  Arthur Lewbel (Boston College); Krishna Pendakur (Simon Fraser University) 
Abstract:  The structural consumer demand methods used to estimate the parameters of collective household models are typically either very restrictive and easy to implement or very general and difficult to estimate. In this paper, we provide a middle ground. We adapt the very general framework of Browning, Chiappori and Lewbel (2007) by adding a simple restriction that recasts the empirical model from a highly nonlinear demand system with price variation to a slightly nonlinear Engel curve system. Our restriction has an interpretation in terms of the behaviour of household scale economies and is testable. Our method identifies the levels of (not just changes in) household resource shares, and a variant of equivalence scales called indifference scales. We apply our methodology to Canadian expenditure data. 
Keywords:  Consumer Demand, Collective Model, Sharing rule, Household Bargaining, Bargaining Power, Indifference Scales, Adult Equivalence Scales, Demand Systems, Barten Scales, Nonparametric Identification. 
JEL:  D12 D11 C30 I31 J12 
Date:  2008–05–01 
URL:  http://d.repec.org/n?u=RePEc:boc:bocoec:694&r=ecm 
By:  Crépon, Bruno (CREST-INSEE); Ferracci, Marc (CREST-INSEE); Jolivet, Grégory (University of Bristol); van den Berg, Gerard J. (Free University of Amsterdam) 
Abstract:  This paper implements a method to identify and estimate treatment effects in a dynamic setting where treatments may occur at any point in time. By relating the standard matching approach to the timing-of-events approach, it demonstrates that effects of the treatment on the treated at a given date can be identified even though the non-treated may be treated later in time. The approach builds on a "no anticipation" assumption and the assumption of conditional independence between the duration until treatment and the counterfactual durations until exit. To illustrate the approach, the paper studies the effect of training for unemployed workers in France, using a rich register data set. Training is found to have little impact on unemployment duration. The contamination of the standard matching estimator due to later entries into treatment is large if the treatment probability is high. 
Keywords:  propensity score, training, unemployment duration, program participation, treatment, matching, contamination bias 
JEL:  J64 C21 C31 C41 C14 
Date:  2008–11 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp3848&r=ecm 
By:  Lutz Kilian; Clara Vega 
Abstract:  Models that treat innovations to the price of energy as predetermined with respect to U.S. macroeconomic aggregates are widely used in the literature. For example, it is common to order energy prices first in recursively identified VAR models of the transmission of energy price shocks. Since exactly identifying assumptions are inherently untestable, this approach in practice has required an act of faith in the empirical plausibility of the delay restriction used for identification. An alternative view that would invalidate such models is that energy prices respond instantaneously to macroeconomic news, implying that energy prices should be ordered last in recursively identified VAR models. In this paper, we propose a formal test of the identifying assumption that energy prices are predetermined with respect to U.S. macroeconomic aggregates. Our test is based on regressing cumulative changes in daily energy prices on daily news from U.S. macroeconomic data releases. Using a wide range of macroeconomic news, we find no compelling evidence of feedback at daily or monthly horizons, contradicting the view that energy prices respond instantaneously to macroeconomic news and supporting the use of delay restrictions for identification. 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgif:957&r=ecm 
By:  Andersson, Eva (Statistical Research Unit, Department of Economics, School of Business, Economics and Law, Göteborg University) 
Abstract:  A system for detecting changes in an ongoing process is needed in many situations. Online monitoring (surveillance) is used in the early detection of disease outbreaks, of patients at risk and of financial instability. By continually monitoring one or several indicators, we can detect a change in the processes of interest early. There are several suggested methods for multivariate surveillance, one of which is Hotelling's T2. Since one aim in surveillance is quick detection of a change, it is important to use evaluation measures that reflect the timeliness of an alarm. One suggested measure is the expected delay of an alarm, in relation to the time of change (τ) in the process. Here we investigate a delay measure for the bivariate situation. Generally, the measure depends on both change times (i.e. τ1 and τ2). We show that, for a bivariate situation using the T2 method, the delay depends on τ1 and τ2 only through the distance between τ1 and τ2. 
Keywords:  Monitoring; Online; Surveillance; T2; Timeliness 
JEL:  C10 
Date:  2008–11–28 
URL:  http://d.repec.org/n?u=RePEc:hhs:gunsru:2008_003&r=ecm 
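The surveillance setting Andersson describes can be sketched as a simple online T2 monitor: each new bivariate observation is reduced to Hotelling's T2 statistic against known in-control parameters, and an alarm is raised the first time it crosses a limit; the delay is the gap between the change time and the alarm. A toy sketch, not the paper's delay analysis: the in-control parameters, alarm limit, shift size, and the choice of a common change time for both components (τ1 = τ2, whereas the paper studies the general case) are illustrative assumptions.

```python
import random

random.seed(4)

# assumed known in-control parameters of the bivariate process:
# zero means, unit variances, correlation rho
rho = 0.3
det = 1.0 - rho ** 2
inv = ((1.0 / det, -rho / det), (-rho / det, 1.0 / det))  # inverse covariance

def t2(x):
    """Hotelling's T2 statistic for one bivariate observation."""
    d0, d1 = x
    return d0 * (inv[0][0] * d0 + inv[0][1] * d1) + d1 * (inv[1][0] * d0 + inv[1][1] * d1)

def monitor(stream, limit=10.6):  # limit near the chi2(2) 0.995 quantile
    """Return the first time the T2 statistic crosses the alarm limit."""
    for t, x in enumerate(stream, start=1):
        if t2(x) > limit:
            return t
    return None

def sim_stream(tau, shift, n=200):
    """Bivariate correlated observations whose means shift at time tau."""
    out = []
    for t in range(1, n + 1):
        z0, z1 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
        x0, x1 = z0, rho * z0 + (1.0 - rho ** 2) ** 0.5 * z1
        if t >= tau:  # here both components change at the same time
            x0, x1 = x0 + shift, x1 + shift
        out.append((x0, x1))
    return out

alarm_time = monitor(sim_stream(tau=20, shift=2.0))
```

Repeating the simulation over many streams and averaging alarm_time minus τ would estimate the expected-delay measure the abstract discusses.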