nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒12‒07
nineteen papers chosen by
Sune Karlsson
Orebro University

  1. Testing Multiplicative Error Models Using Conditional Moment Tests By Nikolaus Hautsch
  2. Tests in Censored Models when the Structural Parameters Are Not Identified By Leandro M. Magnusson
  3. Bayesian inference based only on simulated likelihood: particle filter analysis of dynamic economic models By Thomas Flury; Neil Shephard
  4. Inference in Limited Dependent Variable Models Robust to Weak Identification By Leandro M. Magnusson
  5. Comparing and evaluating Bayesian predictive distributions of asset returns. By John Geweke; Gianni Amisano
  6. The econometrics of auctions with asymmetric anonymous bidders By Laurent Lamy
  7. Estimating the parameters of a small open economy DSGE model: identifiability and inferential validity By Daniel O. Beltran; David Draper
  8. Estimating the Returns to Schooling: A Likelihood Approach Based on Normal Mixtures By John K. Dagsvik, Torbjørn Hægeland and Arvid Raknerud
  9. Priors from DSGE Models for Dynamic Factor Analysis By Gregor Bäurle
  10. On the Non Gaussian Asymptotics of the Likelihood Ratio Test Statistic for Homogeneity of Covariance By Marc Hallin
  11. Metropolis-Hastings prefetching algorithms By Strid, Ingvar
  12. Constructing Structural VAR Models with Conditional Independence Graphs By Les Oxley; Marco Reale; Granville Tunnicliffe Wilson
  13. Testing for Group-Wise Convergence with an Application to Euro Area Inflation By Claude Lopez; David H. Papell
  14. The Taylor rule and forecast intervals for exchange rates By Jian Wang; Jason J. Wu
  15. Testing the expectations hypothesis when interest rates are near integrated By Meredith Beechey; Erik Hjalmarsson; Pär Österholm
  16. Estimation of Collective Household Models With Engel Curves By Arthur Lewbel; Krishna Pendakur
  17. Active Labor Market Policy Effects in a Dynamic Setting By Crépon, Bruno; Ferracci, Marc; Jolivet, Grégory; van den Berg, Gerard J.
  18. Do energy prices respond to U.S. macroeconomic news? a test of the hypothesis of predetermined energy prices By Lutz Kilian; Clara Vega
  19. Hotelling's T2 Method in Multivariate On-line Surveillance. On the Delay of an Alarm By Andersson, Eva

  1. By: Nikolaus Hautsch
    Abstract: We suggest a robust form of conditional moment test as a constructive test for functional misspecification in multiplicative error models. The proposed test has power solely against violations of the conditional mean restriction but is not affected by any other type of model misspecification. Monte Carlo investigations show that an appropriate choice of weighting function induces high power against various alternatives. We illustrate how to adapt the framework also to test out-of-sample moment restrictions, such as orthogonalities of prediction errors.
    Keywords: Robust Conditional Moment Tests, Finite Sample Properties, Multiplicative Error Models, Prediction Errors
    JEL: C12 C22 C52
    Date: 2008–11
  2. By: Leandro M. Magnusson (Department of Economics, Tulane University)
    Abstract: This paper presents tests for the structural parameters of a censored regression model with endogenous explanatory variables. These tests have the correct size even when the identification condition for the structural parameter is invalid. My approach starts from the estimation of the unrestricted parameters, which does not depend on the identification of the structural parameter. Next, I set up the optimal minimum distance objective function, from which I derive the tests. The proposed robust tests can be implemented in many statistical software packages since they demand only the Tobit and the ordinary least squares estimation functions. By simulating their power curves, I compare the robust tests to the Wald and the likelihood ratio tests. A case study of the labor supply of married women illustrates the use of the robust tests for the construction of confidence intervals.
    Keywords: Endogenous Tobit, weak instruments, minimum distance estimation, female labor supply
    JEL: C12 C34
    Date: 2008–12
  3. By: Thomas Flury; Neil Shephard
    Abstract: Suppose we wish to carry out likelihood based inference but we solely have an unbiased simulation based estimator of the likelihood. We note that unbiasedness is enough when the estimated likelihood is used inside a Metropolis-Hastings algorithm. This result has recently been introduced in the statistics literature by Andrieu, Doucet, and Holenstein (2007) and is perhaps surprising given the celebrated results on maximum simulated likelihood estimation. Bayesian inference based on simulated likelihood can be widely applied in microeconomics, macroeconomics and financial econometrics. One way of generating unbiased estimates of the likelihood is by the use of a particle filter. We illustrate these methods on four problems in econometrics, producing rather generic methods. Taken together, these methods imply that if we can simulate from an economic model we can carry out likelihood based inference using its simulations.
    Keywords: Dynamic stochastic general equilibrium models, inference, likelihood, MCMC, Metropolis-Hastings, particle filter, state space models, stochastic volatility
    JEL: C11 C13 C15 C32 E32
    Date: 2008
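    The mechanism this abstract describes, plugging an unbiased simulation-based estimate of the likelihood into a Metropolis-Hastings accept/reject step, can be sketched on a toy model. Everything below is illustrative: the data, the flat prior, and the noisy estimator (which mimics a particle filter by perturbing the exact likelihood with a mean-one multiplicative factor) are assumptions, not the authors' setup. The crucial detail is that the likelihood estimate travels with the current state and is never re-computed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y_i ~ N(theta, 1), flat prior on theta (illustrative only).
y = rng.normal(1.5, 1.0, size=200)

def exact_loglik(theta):
    return -0.5 * np.sum((y - theta) ** 2)

def noisy_loglik(theta, noise_sd=0.3):
    # Unbiased estimate of the *likelihood* (not the log-likelihood):
    # exp(eps - noise_sd^2 / 2) has expectation 1, standing in for the
    # Monte Carlo noise a particle filter would introduce.
    eps = rng.normal(0.0, noise_sd)
    return exact_loglik(theta) + eps - 0.5 * noise_sd ** 2

def pseudo_marginal_mh(n_iter=20000, step=0.2):
    theta = 0.0
    ll = noisy_loglik(theta)  # keep the estimate attached to the state
    draws = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + rng.normal(0.0, step)
        ll_prop = noisy_loglik(prop)
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop  # never refresh ll for the current state
        draws[i] = theta
    return draws

draws = pseudo_marginal_mh()
```

    Despite the noise in the likelihood estimate, the chain targets the exact posterior, so the post-burn-in mean of the draws settles near the sample mean of y.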
  4. By: Leandro M. Magnusson (Department of Economics, Tulane University)
    Abstract: We propose tests for structural parameters in limited dependent variable models with endogenous explanatory variables using the classical minimum distance framework. These tests have the correct size whether the structural parameters are identified or not. Compared with existing tests, ours are especially suited to models whose moment conditions are nonlinear in the parameters. Moreover, our tests are simple to compute, allowing their implementation in a large number of statistical software packages. We compare our tests with Wald tests in simulation experiments. We use our tests to analyze female labor supply and the demand for cigarettes.
    Keywords: Weak identification, minimum chi-square estimation, hypothesis testing, limited dependent variable models
    JEL: C12 C30 C34
    Date: 2008–09
  5. By: John Geweke (Departments of Statistics and Economics, University of Iowa, 430 N. Clinton St., Iowa City, IA 52242-2020, USA.); Gianni Amisano (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.)
    Abstract: Bayesian inference in a time series model provides exact, out-of-sample predictive distributions that fully and coherently incorporate parameter uncertainty. This study compares and evaluates Bayesian predictive distributions from alternative models, using as an illustration five alternative models of asset returns applied to daily S&P 500 returns from 1976 through 2005. The comparison exercise uses predictive likelihoods and is inherently Bayesian. The evaluation exercise uses the probability integral transform and is inherently frequentist. The illustration shows that the two approaches can be complementary, each identifying strengths and weaknesses in models that are not evident using the other.
    Keywords: Forecasting, GARCH, inverse probability transform, Markov mixture, predictive likelihood, S&P 500 returns, stochastic volatility.
    JEL: C11 C53
    Date: 2008–11
  6. By: Laurent Lamy
    Abstract: We consider standard auction models when bidders' identities are not observed by the econometrician. First, we adapt the definition of identifiability to a framework with anonymous bids and explore the extent to which anonymity reduces the possibility of identifying private value auction models. Second, in the asymmetric independent private value model, which is nonparametrically identified, we generalize Guerre, Perrigne and Vuong's estimation procedure [Optimal Nonparametric Estimation of First-Price Auctions, Econometrica 68 (2000) 525-574] and study the asymptotic properties of our multi-step kernel-based estimator. Third, a test for symmetry is proposed. Monte Carlo simulations illustrate the practical relevance of our estimation and testing procedures for small data sets.
    Date: 2008
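    The Guerre, Perrigne and Vuong first step that this paper generalizes recovers each private value from the observed bid via v = b + G(b)/((n-1)g(b)), where G and g are the bid distribution and density. Below is a minimal symmetric-IPV sketch of that step with simulated placeholder bids and a plain Gaussian kernel; the paper's multi-step estimator for asymmetric anonymous bidders is considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(4)

# Placeholder data: 1000 observed bids from 3-bidder auctions.
n_bidders = 3
bids = rng.uniform(0.0, 1.0, size=1000)

def gpv_pseudo_values(b, n, h=0.05):
    """First-step GPV pseudo private values v = b + G(b) / ((n - 1) g(b)),
    with G the empirical CDF of bids and g a Gaussian kernel density."""
    b = np.asarray(b)
    G = np.searchsorted(np.sort(b), b, side="right") / b.size
    u = (b[:, None] - b[None, :]) / h
    g = np.exp(-0.5 * u ** 2).mean(axis=1) / (h * np.sqrt(2.0 * np.pi))
    return b + G / ((n - 1) * g)

values = gpv_pseudo_values(bids, n_bidders)
```

    Since G(b) and g(b) are positive, each recovered value lies above the corresponding bid, reflecting optimal bid shading in a first-price auction.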
  7. By: Daniel O. Beltran; David Draper
    Abstract: This paper estimates the parameters of a stylized dynamic stochastic general equilibrium model using maximum likelihood and Bayesian methods, paying special attention to the issue of weak parameter identification. Given the model and the available data, the posterior estimates of the weakly identified parameters are very sensitive to the choice of priors. We provide a set of tools to diagnose weak identification, which include surface plots of the log-likelihood as a function of two parameters, heat plots of the log-likelihood as a function of three parameters, Monte Carlo simulations using artificial data, and Bayesian estimation using three sets of priors. We find that the policy coefficients and the parameter governing the elasticity of labor supply are weakly identified by the data, and posterior predictive distributions remind us that DSGE models may make poor forecasts even when they fit the data well. Although parameter identification is model- and data-specific, the lack of identification of some key structural parameters in a small-scale DSGE model such as the one we examine should raise a red flag to researchers trying to estimate, and draw valid inferences from, large-scale models featuring many more parameters.
    Date: 2008
  8. By: John K. Dagsvik, Torbjørn Hægeland and Arvid Raknerud (Statistics Norway)
    Abstract: In this paper we develop likelihood based methods for statistical inference in a joint system of equations for the choice of length of schooling and earnings. The model for schooling choice is assumed to be an ordered probit model, whereas the earnings equation contains variables that are flexible transformations of schooling and experience, with corresponding coefficients that are allowed to be heterogeneous across individuals. Under the assumption that the distribution of the random terms of the model can be expressed as a particular finite mixture of multinormal distributions, we show that the joint probability distribution for schooling and earnings can be expressed in closed form. In an application of our method to Norwegian data, we find that the mixed Gaussian model offers a substantial improvement in fit to the (heavy-tailed) empirical distribution of log-earnings compared to a multinormal benchmark model.
    Keywords: Schooling choice; earnings equation; normal mixtures; treatment effects; self-selection; random coefficients; full information maximum likelihood
    JEL: C31 I20 J30
    Date: 2008–12
  9. By: Gregor Bäurle
    Abstract: We propose a method to incorporate information from Dynamic Stochastic General Equilibrium (DSGE) models into Dynamic Factor Analysis. The method combines a procedure previously applied for Bayesian Vector Autoregressions and a Gibbs Sampling approach for Dynamic Factor Models. The factors in the model are rotated such that they can be interpreted as variables from a DSGE model. In contrast to standard Dynamic Factor Analysis, a direct economic interpretation of the factors is given. We evaluate the forecast performance of the model with respect to the amount of information from the DSGE model included in the estimation. We conclude that using prior information from a standard New Keynesian DSGE model improves the forecast performance. We also analyze the impact of identified monetary shocks on both the factors and selected series. The interpretation of the factors as variables from the DSGE model allows us to use an identification scheme which is directly linked to the DSGE model. The responses of the factors in our application resemble responses found using VARs. However, there are deviations from standard results when looking at the responses of specific series to common shocks.
    Keywords: Dynamic Factor Model; DSGE Model; Bayesian Analysis; Forecasting; Transmission of Shocks
    JEL: C11 C32 E0
    Date: 2008–08
  10. By: Marc Hallin
    Abstract: The likelihood ratio test for m-sample homogeneity of covariance is notoriously sensitive to violations of Gaussian assumptions. Its asymptotic behavior under non-Gaussian densities has been the subject of an abundant literature. In a recent paper, Yanagihara et al. (2005) show that the asymptotic distribution of the likelihood ratio test statistic, under arbitrary elliptical densities with finite fourth-order moments, is that of a linear combination of two mutually independent chi-square variables. Their proof is based on characteristic function methods, and only allows for convergence-in-distribution conclusions. Moreover, they require homokurticity among the m populations. Exploiting the findings of Hallin and Paindaveine (2008a), we strengthen that convergence-in-distribution result into a convergence-in-probability one; that is, we explicitly decompose the likelihood ratio test statistic into a linear combination of two variables that are asymptotically independent chi-square. Moreover, we extend the result to the heterokurtic case.
    Date: 2008
  11. By: Strid, Ingvar (Dept. of Economic Statistics, Stockholm School of Economics)
    Abstract: Prefetching is a simple and general method for single-chain parallelisation of the Metropolis-Hastings algorithm based on the idea of evaluating the posterior in parallel and ahead of time. In this paper improved Metropolis-Hastings prefetching algorithms are presented and evaluated. It is shown how to use available information to make better predictions of the future states of the chain and increase the efficiency of prefetching considerably. The optimal acceptance rate for the prefetching random walk Metropolis-Hastings algorithm is obtained for a special case and it is shown to decrease in the number of processors employed. The performance of the algorithms is illustrated using a well-known macroeconomic model. Bayesian estimation of DSGE models, linearly or nonlinearly approximated, is identified as a potential area of application for prefetching methods. The generality of the proposed method, however, suggests that it could be applied in many other contexts as well.
    Keywords: Prefetching; Metropolis-Hastings; Parallel Computing; DSGE models; Optimal acceptance rate
    JEL: C11 C13 C63
    Date: 2008–12–02
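    The prefetching idea can be illustrated with a depth-two tree: before any accept/reject decision is made, the posterior is evaluated at the level-one proposal and at both possible level-two proposals, so two MH steps consume one batch of three concurrent evaluations. In the sketch below the target is a stand-in standard normal and a thread pool simulates multiple processors; both are assumptions for illustration, not the paper's DSGE application.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(1)

def log_post(theta):
    # Stand-in posterior: standard normal. In a real application this
    # would be the expensive likelihood evaluation worth parallelising.
    return -0.5 * theta ** 2

def prefetch_mh(n_pairs=10000, step=1.0):
    """Random-walk MH taking two steps per batch: the depth-2 tree of
    possible future states (3 posterior evaluations) is computed
    concurrently before the accept/reject decisions are drawn."""
    x = 0.0
    lp_x = log_post(x)
    draws = []
    with ThreadPoolExecutor(max_workers=3) as pool:
        for _ in range(n_pairs):
            y1 = x + step * rng.normal()   # level-1 proposal
            ya = y1 + step * rng.normal()  # level-2 proposal if y1 accepted
            yb = x + step * rng.normal()   # level-2 proposal if y1 rejected
            lp_y1, lp_ya, lp_yb = pool.map(log_post, (y1, ya, yb))
            # First MH step, using the prefetched value lp_y1.
            if np.log(rng.uniform()) < lp_y1 - lp_x:
                x, lp_x, y2, lp_y2 = y1, lp_y1, ya, lp_ya
            else:
                y2, lp_y2 = yb, lp_yb
            draws.append(x)
            # Second MH step, reusing the prefetched branch.
            if np.log(rng.uniform()) < lp_y2 - lp_x:
                x, lp_x = y2, lp_y2
            draws.append(x)
    return np.array(draws)

draws = prefetch_mh()
```

    Half of the prefetched evaluations on the unused branch are wasted, which is why, as the abstract notes, better prediction of the chain's path (and a lower acceptance rate) raises prefetching efficiency.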
  12. By: Les Oxley (University of Canterbury); Marco Reale; Granville Tunnicliffe Wilson
    Abstract: In this paper graphical modelling is used to select a sparse structure for a multivariate time series model of New Zealand interest rates. In particular, we consider a recursive structural vector autoregression that can subsequently be described parsimoniously by a directed acyclic graph, which could be given a causal interpretation. A comparison between competing models is then made by considering likelihood and economic theory.
    Keywords: Graphical models; directed acyclic graphs; term structure; causality.
    JEL: E43 E44 C01 C32
    Date: 2008–11–28
  13. By: Claude Lopez; David H. Papell
    Abstract: While panel unit root tests have been used to investigate a wide range of macroeconomic issues, the tests suffer from low power to reject the unit root null in panels of stationary series if the panels consist of highly persistent series, contain a small number of series, and/or have series with a limited length. We propose a new procedure to increase the power of panel unit root tests when used to study convergence by testing for stationarity of the differentials between a group of series and their cross-sectional means. Although each differential has non-zero mean, the group of differentials has a cross-sectional average of zero for each time period by construction, and we incorporate this constraint for estimation and when generating finite sample critical values. This procedure leads to significant power gains for the panel unit root test. We apply our new approach to study inflation convergence among the Euro Area countries for the post-1979 period. The results show strong evidence of convergence soon after the implementation of the Maastricht Treaty. Furthermore, median unbiased estimates of the half-life of the differential for the periods before and after the Euro show a dramatic decrease in its persistence after the introduction of the single currency.
    Date: 2008
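    The constraint the authors exploit, that differentials from the cross-sectional mean sum to zero in every period by construction, is easy to see in a simulated panel. The AR(1) data below are purely illustrative; the paper's contribution is imposing this constraint in estimation and in the finite-sample critical values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical panel of N = 5 persistent series over T = 200 periods.
N, T = 5, 200
y = np.zeros((N, T))
for t in range(1, T):
    y[:, t] = 0.9 * y[:, t - 1] + rng.normal(size=N)

# Differentials from the cross-sectional mean: these are the series whose
# joint stationarity the panel unit root test examines. In every period
# they sum to zero across the group.
d = y - y.mean(axis=0)
```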
  14. By: Jian Wang; Jason J. Wu
    Abstract: This paper attacks the Meese-Rogoff (exchange rate disconnect) puzzle from a different perspective: out-of-sample interval forecasting. Most studies in the literature focus on point forecasts. In this paper, we apply Robust Semi-parametric (RS) interval forecasting to a group of Taylor rule models. Forecast intervals for twelve OECD exchange rates are generated and modified tests of Giacomini and White (2006) are conducted to compare the performance of Taylor rule models and the random walk. Our contribution is twofold. First, we find that in general, Taylor rule models generate tighter forecast intervals than the random walk, given that their intervals cover out-of-sample exchange rate realizations equally well. This result is more pronounced at longer horizons. Our results suggest a connection between exchange rates and economic fundamentals: economic variables contain information useful in forecasting the distributions of exchange rates. The benchmark Taylor rule model is also found to perform better than the monetary and PPP models. Second, the inference framework proposed in this paper for forecast-interval evaluation can be applied in a broader context, such as inflation forecasting, not just to the models and interval forecasting methods used in this paper.
    Keywords: Foreign exchange ; Forecasting ; Taylor's rule ; Econometric models - Evaluation
    Date: 2008
  15. By: Meredith Beechey; Erik Hjalmarsson; Pär Österholm
    Abstract: Nominal interest rates are unlikely to be generated by unit-root processes. Using data on short and long interest rates from eight developed and six emerging economies, we test the expectations hypothesis using cointegration methods under the assumption that interest rates are near integrated. If the null hypothesis of no cointegration is rejected, we then test whether the estimated cointegrating vector is consistent with that suggested by the expectations hypothesis. The results show support for cointegration in ten of the fourteen countries we consider, and the cointegrating vector is similar across countries. However, the parameters differ from those suggested by theory. We relate our findings to existing literature on the failure of the expectations hypothesis and to the role of term premia.
    Date: 2008
  16. By: Arthur Lewbel (Boston College); Krishna Pendakur (Simon Fraser University)
    Abstract: The structural consumer demand methods used to estimate the parameters of collective household models are typically either very restrictive and easy to implement or very general and difficult to estimate. In this paper, we provide a middle ground. We adapt the very general framework of Browning, Chiappori and Lewbel (2007) by adding a simple restriction that recasts the empirical model from a highly nonlinear demand system with price variation to a slightly nonlinear Engel curve system. Our restriction has an interpretation in terms of the behaviour of household scale economies and is testable. Our method identifies the levels of (not just changes in) household resource shares, and a variant of equivalence scales called indifference scales. We apply our methodology to Canadian expenditure data.
    Keywords: Consumer Demand, Collective Model, Sharing rule, Household Bargaining, Bargaining Power, Indifference Scales, Adult Equivalence Scales, Demand Systems, Barten Scales, Nonparametric Identification.
    JEL: D12 D11 C30 I31 J12
    Date: 2008–05–01
  17. By: Crépon, Bruno (CREST-INSEE); Ferracci, Marc (CREST-INSEE); Jolivet, Grégory (University of Bristol); van den Berg, Gerard J. (Free University of Amsterdam)
    Abstract: This paper implements a method to identify and estimate treatment effects in a dynamic setting where treatments may occur at any point in time. By relating the standard matching approach to the timing-of-events approach, it demonstrates that effects of the treatment on the treated at a given date can be identified even though the non-treated may be treated later in time. The approach builds on a "no anticipation" assumption and the assumption of conditional independence between the duration until treatment and the counterfactual durations until exit. To illustrate the approach, the paper studies the effect of training for unemployed workers in France, using a rich register data set. Training has little impact on unemployment duration. The contamination of the standard matching estimator due to later entries into treatment is large if the treatment probability is high.
    Keywords: propensity score, training, unemployment duration, program participation, treatment, matching, contamination bias
    JEL: J64 C21 C31 C41 C14
    Date: 2008–11
  18. By: Lutz Kilian; Clara Vega
    Abstract: Models that treat innovations to the price of energy as predetermined with respect to U.S. macroeconomic aggregates are widely used in the literature. For example, it is common to order energy prices first in recursively identified VAR models of the transmission of energy price shocks. Since exactly identifying assumptions are inherently untestable, this approach in practice has required an act of faith in the empirical plausibility of the delay restriction used for identification. An alternative view that would invalidate such models is that energy prices respond instantaneously to macroeconomic news, implying that energy prices should be ordered last in recursively identified VAR models. In this paper, we propose a formal test of the identifying assumption that energy prices are predetermined with respect to U.S. macroeconomic aggregates. Our test is based on regressing cumulative changes in daily energy prices on daily news from U.S. macroeconomic data releases. Using a wide range of macroeconomic news, we find no compelling evidence of feedback at daily or monthly horizons, contradicting the view that energy prices respond instantaneously to macroeconomic news and supporting the use of delay restrictions for identification.
    Date: 2008
  19. By: Andersson, Eva (Statistical Research Unit, Department of Economics, School of Business, Economics and Law, Göteborg University)
    Abstract: A system for detecting changes in an on-going process is needed in many situations. On-line monitoring (surveillance) is used in the early detection of disease outbreaks, of patients at risk and of financial instability. By continually monitoring one or several indicators, we can detect a change in the processes of interest at an early stage. There are several suggested methods for multivariate surveillance, one of which is Hotelling's T2. Since one aim in surveillance is quick detection of a change, it is important to use evaluation measures that reflect the timeliness of an alarm. One suggested measure is the expected delay of an alarm, in relation to the time of change (τ) in the process. Here we investigate a delay measure for the bivariate situation. Generally, the measure depends on both change times (i.e. τ1 and τ2). We show that, for a bivariate situation using the T2 method, the delay depends on τ1 and τ2 only through the distance τ1 − τ2.
    Keywords: Monitoring; On-line; Surveillance; T2; Timeliness
    JEL: C10
    Date: 2008–11–28
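    As a sketch of the monitoring scheme evaluated here: at each time point the Hotelling T2 statistic is compared with a chi-square threshold, and the delay is the time between the change point and the alarm. The in-control parameters, the size of the shift, and the 99% threshold below are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def t2_monitor(x, mean, cov_inv, threshold):
    """Return the first time the Hotelling T2 statistic exceeds the
    threshold (the alarm time), or None if it never does."""
    for t, xt in enumerate(x):
        d = xt - mean
        t2 = d @ cov_inv @ d
        if t2 > threshold:
            return t
    return None

# In-control target for a bivariate process with identity covariance.
mean = np.zeros(2)
cov_inv = np.eye(2)
# The chi-square(2) quantile has the closed form -2*log(1 - p); p = 0.99.
threshold = -2.0 * np.log(0.01)

# Simulate: in control for 50 periods, then a shift of size 3 in both series.
x = rng.normal(size=(100, 2))
x[50:] += 3.0

alarm = t2_monitor(x, mean, cov_inv, threshold)
```

    With a change at period 50 in both components, the delay in this scheme would be alarm − 50; the paper's point is that for the T2 method such delays depend only on the distance between the two change times.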

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.