nep-ecm New Economics Papers
on Econometrics
Issue of 2011‒04‒23
twenty-one papers chosen by
Sune Karlsson
Orebro University

  1. A Monte Carlo Analysis of the VAR-Based Indirect Inference Estimation of DSGE Models By David Dubois
  2. Conditional Value-at-Risk and Average Value-at-Risk: Estimation and Asymptotics By Chun, So Yeon; Shapiro, Alexander; Uryasev, Stan
  3. Multiplicative Error Models By Christian T. Brownlees; Fabrizio Cipollini; Giampiero M. Gallo
  4. A statistical test for forecast evaluation under a discrete loss function By Francisco J. Eransus; Alfonso Novales Cinca
  5. Estimation of long memory in integrated variance By Eduardo Rossi; Paolo Santucci de Magistris
  6. Identification Using Stability Restrictions By Leandro M. Magnusson; Sophocles Mavroeidis
  7. On Identification of Bayesian DSGE Models By Koop, Gary; Pesaran, Hashem; Smith, Ron P.
  8. Modelling Regime Switching and Structural Breaks with an Infinite Dimension Markov Switching Model By Yong Song
  9. Inference with Imperfect Randomization: The Case of the Perry Preschool Program By Heckman, James J.; Pinto, Rodrigo; Shaikh, Azeem M.; Yavitz, Adam
  10. Running and Jumping Variables in RD Designs By Alan Barreca; Melanie Guldi; Jason M. Lindo; Glen R. Waddell
  11. The predictive accuracy of credit ratings: Measurement and statistical inference By Orth, Walter
  12. A Bunch of Models, a Bunch of Nulls and Inference About Predictive Ability. By Pablo Pincheira
  13. On some problems in discrete wavelet analysis of bivariate spectra with an application to business cycle synchronization in the euro zone By Bruzda, Joanna
  14. Classical time-varying FAVAR models - estimation, forecasting and structural analysis By Eickmeier, Sandra; Lemke, Wolfgang; Marcellino, Massimiliano
  15. Identification and Estimation of Stochastic Bargaining Models, Third Version By Antonio Merlo; Xun Tang
  16. Generalized Cointegration: A New Concept with an Application to Health Expenditure and Health Outcomes By Stephen Hall; P. A. V. B. Swamy; George S. Tavlas
  17. Nowcasting inflation using high frequency data By Michele Modugno
  18. On stochastic properties between some ordered random variables By Nuria Torrado; Rosa E. Lillo; Michael P. Wiper
  19. Distress Dependence and Financial Stability By Miguel A. Segoviano; Charles Goodhart
  20. Methodische Aspekte linearer Strukturgleichungsmodelle. Ein Vergleich von kovarianz- und varianzbasierten Kausalanalyseverfahren (Methodological Aspects of Linear Structural Equation Models: A Comparison of Covariance-Based and Variance-Based Causal Analysis Methods) By Fuchs, Andreas
  21. How Prediction Markets Can Save Event Studies By Snowberg, Erik; Wolfers, Justin; Zitzewitz, Eric

  1. By: David Dubois
    Abstract: In this paper we study estimation of DSGE models. More specifically, within the indirect inference framework, we analyze how critical the choice of the reduced-form model is for estimation purposes. As it turns out, simple VAR parameters perform better than the commonly used impulse response functions. This can be attributed to the fact that IRFs worsen identification issues in models that are already plagued by that phenomenon.
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:rpp:wpaper:1104&r=ecm
  2. By: Chun, So Yeon; Shapiro, Alexander; Uryasev, Stan
    Abstract: We discuss linear regression approaches to the conditional Value-at-Risk and Average Value-at-Risk (Conditional Value-at-Risk, Expected Shortfall) risk measures. Two estimation procedures are considered for each conditional risk measure: one is direct and the other is based on residual analysis of the standard least squares method. Large-sample statistical inference for the resulting estimators is derived. Furthermore, finite-sample properties of the proposed estimators are investigated and compared with the theoretical derivations in an extensive Monte Carlo study. Empirical results on real data (different financial asset classes) are also provided to illustrate the performance of the estimators. A stylized sketch of the two estimation routes follows this entry.
    Keywords: Value-at-Risk; Average Value-at-Risk; Conditional Value-at-Risk; Expected Shortfall; linear regression; least squares residual; quantile regression; conditional risk measures; statistical inference
    JEL: C53 D81 G32 C15 C1
    Date: 2011–04–03
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:30132&r=ecm
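    A minimal numpy sketch of the two estimation routes mentioned above, under simplifying assumptions (a linear location model with i.i.d. errors): a direct empirical estimate of VaR/AVaR at level alpha, and an estimate built from least-squares residuals. The function names and simulated data are illustrative, not the authors' code.

        import numpy as np

        def var_avar_direct(losses, alpha=0.95):
            """Empirical VaR (alpha-quantile) and AVaR (mean loss beyond VaR)."""
            var = np.quantile(losses, alpha)
            avar = losses[losses >= var].mean()
            return var, avar

        def var_avar_from_ols_residuals(y, x, alpha=0.95):
            """Conditional VaR/AVaR: fit the conditional mean by least squares,
            then shift the fitted values by the residual quantile / tail mean."""
            X = np.column_stack([np.ones(len(y)), x])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            q_res, es_res = var_avar_direct(resid, alpha)
            return X @ beta + q_res, X @ beta + es_res

        rng = np.random.default_rng(0)
        x = rng.normal(size=2000)
        y = 0.5 * x + rng.standard_t(df=5, size=2000)   # heavy-tailed "losses"
        print(var_avar_direct(y))                       # unconditional estimates
        cvar, cavar = var_avar_from_ols_residuals(y, x)
        print(cvar[:3], cavar[:3])                      # conditional estimates
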
  3. By: Christian T. Brownlees (Stern School of Business, New York University); Fabrizio Cipollini (Dipartimento di Statistica, Università di Firenze); Giampiero M. Gallo (Dipartimento di Statistica, Università di Firenze)
    Abstract: Financial time series analysis has focused on data related to market trading activity. Next to the modeling of the conditional variance of returns within the GARCH family of models, recent attention has been devoted to other variables: first and foremost, volatility measured on the basis of ultra-high frequency data, but also volumes, number of trades and durations. In this paper, we examine a class of models, named Multiplicative Error Models, which are particularly suited to model such non-negative time series. We discuss the univariate specification, considering the base choices for the conditional expectation and the error term; a base specification is sketched after this entry. We also provide a general framework allowing for richer specifications of the conditional mean. The outcome is a novel MEM (called Composite MEM) which is reminiscent of the short- and long-run component GARCH model by Engle and Lee (1999). Inference issues are discussed relative to Maximum Likelihood and Generalized Method of Moments estimation. In the application, we show the regularity in parameter estimates and forecasting performance obtainable by applying the MEM to the realized kernel volatility of components of the S&P100 index. We suggest extensions of the base model by enlarging the information set and adopting a multivariate specification.
    Keywords: Multiplicative Error Models, Realized Volatility, Financial Time Series, Composite MEM
    JEL: C22 C51 C52
    Date: 2011–02
    URL: http://d.repec.org/n?u=RePEc:fir:econom:wp2011_03&r=ecm
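    For readers new to the class, a base MEM(1,1) for a non-negative series x_t (realized volatility, volume, durations) can be written as below; the notation is generic and not necessarily the authors' exact parameterization.

        \begin{aligned}
        x_t   &= \mu_t \,\varepsilon_t, \qquad \varepsilon_t \mid \mathcal{F}_{t-1} \sim D^{+}(1,\sigma^2),\\
        \mu_t &= \omega + \alpha\, x_{t-1} + \beta\, \mu_{t-1},
        \end{aligned}

    so the conditional mean evolves GARCH-like while the multiplicative error is a positive random variable with unit conditional mean; quasi-maximum likelihood based on a unit-mean exponential or gamma density is the standard estimation route for such models.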
  4. By: Francisco J. Eransus (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid); Alfonso Novales Cinca (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid)
    Abstract: We propose a new approach to evaluating the usefulness of a set of forecasts, based on the use of a discrete loss function defined on the space of data and forecasts. Existing procedures for such an evaluation either do not allow for formal testing, or use test statistics based just on the frequency distribution of (data, forecast) pairs. They can easily lead to misleading conclusions in some reasonable situations, because of the way they formalize the underlying null hypothesis that the set of forecasts is not useful. Even though the ambiguity of the underlying null hypothesis precludes us from performing a standard analysis of the size and power of the tests, we obtain results suggesting that the proposed DISC test performs better than its competitors.
    Keywords: Forecasting Evaluation, Loss Function.
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1107&r=ecm
  5. By: Eduardo Rossi (University of Pavia); Paolo Santucci de Magistris (University of Padova and CREATES)
    Abstract: A stylized fact is that realized variance has long memory. We show that, when the instantaneous volatility is driven by a fractional Brownian motion, the integrated variance is characterized by long-range dependence. As a consequence, the realized variance inherits this property when prices are observed continuously and without microstructure noise, and the spectral densities of integrated and realized variance coincide. However, prices are not observed continuously, so the realized variance is affected by a measurement error. Discrete sampling and market microstructure noise induce a finite-sample bias in semiparametric estimates of the fractional integration parameter. A Monte Carlo simulation analysis provides evidence of such a bias for common sampling frequencies. An illustrative memory-estimation sketch follows this entry.
    Keywords: Realized variance, Long memory, fractional Brownian Motion, Measurement error, Whittle estimator.
    JEL: C10 C22 C80
    Date: 2011–04–12
    URL: http://d.repec.org/n?u=RePEc:aah:create:2011-11&r=ecm
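    A small illustrative sketch of the kind of semiparametric memory estimation the abstract alludes to, using the log-periodogram (GPH) regression rather than the Whittle estimator employed in the paper; the bandwidth rule and inputs are placeholders.

        import numpy as np

        def realized_variance(intraday_returns):
            """Daily realized variance: sum of squared intraday returns (one row per day)."""
            return np.sum(np.asarray(intraday_returns) ** 2, axis=1)

        def gph_memory(x, m=None):
            """Log-periodogram (GPH) estimate of the long-memory parameter d."""
            x = np.asarray(x, dtype=float)
            x = x - x.mean()
            n = len(x)
            m = m or int(np.sqrt(n))                       # common bandwidth choice
            lam = 2 * np.pi * np.arange(1, m + 1) / n      # first m Fourier frequencies
            I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
            X = np.column_stack([np.ones(m), -2 * np.log(2 * np.sin(lam / 2))])
            coef, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
            return coef[1]                                 # slope estimates d

        # e.g. rv = realized_variance(returns_matrix); d_hat = gph_memory(rv)
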
  6. By: Leandro M. Magnusson (Department of Economics, Tulane University); Sophocles Mavroeidis (Brown University)
    Abstract: Structural change, typically induced by policy regime shifts, is a common feature of dynamic economic models. We show that structural change can be used constructively to improve the identification of structural parameters that are stable over time. A leading example is models that are immune to the well-known Lucas (1976) critique. This insight is used to develop novel econometric methods that extend the widely used generalized method of moments (GMM). The proposed methods yield improved inference in a leading macroeconomic policy model.
    Keywords: GMM, identification, structural stability, Lucas critique, new Keynesian models
    JEL: C22 E31
    Date: 2011–01
    URL: http://d.repec.org/n?u=RePEc:tul:wpaper:1116&r=ecm
  7. By: Koop, Gary (University of Strathclyde); Pesaran, Hashem (University of Cambridge); Smith, Ron P. (Birkbeck College, University of London)
    Abstract: In recent years there has been increasing concern about the identification of parameters in dynamic stochastic general equilibrium (DSGE) models. Given the structure of DSGE models it may be difficult to determine whether a parameter is identified. For the researcher using Bayesian methods, a lack of identification may not be evident, since the posterior of a parameter of interest may differ from its prior even if the parameter is unidentified. We show that this can be the case even if the priors assumed on the structural parameters are independent. We suggest two Bayesian identification indicators that do not suffer from this difficulty and are relatively easy to compute. The first applies to DSGE models where the parameters can be partitioned into those that are known to be identified and the rest, for which it is not known whether they are identified. In such cases the marginal posterior of an unidentified parameter will equal the posterior expectation of the prior for that parameter conditional on the identified parameters. The second indicator is more generally applicable and considers the rate at which the posterior precision is updated as the sample size (T) increases. For identified parameters the posterior precision rises with T, whilst for an unidentified parameter the posterior precision may be updated, but at a rate slower than T. This result assumes that the identified parameters are √T-consistent, but similar differential rates of update for identified and unidentified parameters can be established in the case of super-consistent estimators. These results are illustrated by means of simple DSGE models; a stylized numerical illustration of the precision-rate indicator follows this entry.
    Keywords: Bayesian identification, DSGE models, posterior updating, New Keynesian Phillips Curve
    JEL: C11 C15 E17
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp5638&r=ecm
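    A stylized check of the second indicator in a toy Gaussian model (not the authors' DSGE setting): with y_t ~ N(theta1 + theta2, 1) and independent N(0,1) priors, only the sum theta1 + theta2 is identified, so its posterior precision grows with T while the precision of theta1 alone plateaus.

        import numpy as np

        a = np.array([1.0, 1.0])                          # data inform only a'theta
        for T in (10, 100, 1000, 10000):
            post_cov = np.linalg.inv(np.eye(2) + T * np.outer(a, a))
            prec_unident = 1.0 / post_cov[0, 0]           # theta1 alone
            prec_ident = 1.0 / (a @ post_cov @ a)         # theta1 + theta2
            print(T, round(prec_unident, 3), round(prec_ident, 1))
        # the first precision converges to 2 (a bounded update away from the prior),
        # while the second grows linearly in T, mirroring the slower-than-T update
        # rate flagged for unidentified parameters.
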
  8. By: Yong Song
    Abstract: This paper proposes an infinite dimension Markov switching model to accommodate regime switching and structural break dynamics or a combination of both in a Bayesian framework. Two parallel hierarchical structures, one governing the transition probabilities and another governing the parameters of the conditional data density, keep the model parsimonious and improve forecasts. This nonparametric approach allows for regime persistence and estimates the number of states automatically. A global identification algorithm for structural changes versus regime switching is presented. Applications to U.S. real interest rates and inflation compare the new model to existing parametric alternatives. Besides identifying episodes of regime switching and structural breaks, the hierarchical distribution governing the parameters of the conditional data density provides significant gains to forecasting precision.
    Keywords: hidden Markov model; Bayesian nonparametrics; Dirichlet process
    JEL: C51 C53 C22 C11
    Date: 2011–04–15
    URL: http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa-427&r=ecm
  9. By: Heckman, James J. (University of Chicago); Pinto, Rodrigo (University of Chicago); Shaikh, Azeem M. (University of Chicago); Yavitz, Adam (University of Chicago)
    Abstract: This paper considers the problem of making inferences about the effects of a program on multiple outcomes when the assignment of treatment status is imperfectly randomized. By imperfect randomization we mean that treatment status is reassigned after an initial randomization on the basis of characteristics that may be observed or unobserved by the analyst. We develop a partial identification approach to this problem that makes use of information limiting the extent to which randomization is imperfect to show that it is still possible to make nontrivial inferences about the effects of the program in such settings. We consider a family of null hypotheses in which each null hypothesis specifies that the program has no effect on one of several outcomes of interest. Under weak assumptions, we construct a procedure for testing this family of null hypotheses in a way that controls the familywise error rate – the probability of even one false rejection – in finite samples. We develop our methodology in the context of a reanalysis of the HighScope Perry Preschool program. We find statistically significant effects of the program on a number of different outcomes of interest, including outcomes related to criminal activity for males and females, even after accounting for the imperfectness of the randomization and the multiplicity of null hypotheses. A stylized multiple-testing sketch follows this entry.
    Keywords: multiple testing, multiple outcomes, randomized trial, randomization tests, imperfect randomization, Perry Preschool Program, program evaluation, familywise error rate, exact inference, partial identification
    JEL: C31 I21 J13
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp5625&r=ecm
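    For intuition only, and not the partial-identification procedure developed in the paper: under perfect randomization, a permutation test per outcome combined with Holm's step-down correction is one standard way to control the familywise error rate across multiple outcomes. All names and inputs below are illustrative.

        import numpy as np

        def permutation_pvalue(y, treat, n_perm=5000, seed=0):
            """Two-sided permutation p-value for a difference in means."""
            rng = np.random.default_rng(seed)
            obs = y[treat == 1].mean() - y[treat == 0].mean()
            perm = np.empty(n_perm)
            for b in range(n_perm):
                t = rng.permutation(treat)              # re-randomize treatment labels
                perm[b] = y[t == 1].mean() - y[t == 0].mean()
            return np.mean(np.abs(perm) >= abs(obs))

        def holm_reject(pvals, alpha=0.05):
            """Holm step-down: reject the k-th smallest p-value while it is below
            alpha / (m - k + 1); stop at the first failure."""
            pvals = np.asarray(pvals)
            order = np.argsort(pvals)
            reject = np.zeros(len(pvals), dtype=bool)
            for k, idx in enumerate(order):             # k = 0, 1, ...
                if pvals[idx] <= alpha / (len(pvals) - k):
                    reject[idx] = True
                else:
                    break
            return reject
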
  10. By: Alan Barreca; Melanie Guldi; Jason M. Lindo; Glen R. Waddell (Department of Economics, Tulane University)
    Abstract: This study demonstrates that regression discontinuity designs will arrive at biased estimates when attributes related to outcomes predict heaping in the running variable. We discuss several approaches to diagnosing and correcting for this type of problem; a simple correction of this kind is sketched after this entry. Our primary example focuses on the use of birth weights as a running variable. We begin by showing that birth weights are measured most precisely for children of white and highly educated mothers. As a result, less healthy children, who are more likely to be of low socioeconomic status, are disproportionately represented at multiples of round numbers. For this reason, RD estimates using birth weight as the running variable will be biased in a manner that leads one to conclude that it is "good" to be strictly less than any 100-gram cutoff. As such, prior estimates of the effects of very low birth weight classification (Almond, Doyle, Kowalski, and Williams 2010) have been overstated and appear to be zero. We also demonstrate potential problems using days of birth or grade point averages as running variables.
    Keywords: regression discontinuity, birth weight, infant mortality
    JEL: C21 C14 I12
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:tul:wpaper:1001&r=ecm
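    A hedged sketch of one correction in the spirit of the abstract: drop observations that sit exactly on heap points (here, 100-gram multiples) before fitting a local linear regression on each side of the cutoff. The cutoff, bandwidth and heap spacing are placeholders, not the paper's preferred specification.

        import numpy as np

        def donut_rd(y, running, cutoff=1500.0, bandwidth=150.0, heap=100.0):
            """Local linear RD jump estimate after removing heaped observations."""
            keep = (np.abs(running - cutoff) <= bandwidth) & (running % heap != 0)
            yy, r = y[keep], running[keep] - cutoff
            slope_lo, int_lo = np.polyfit(r[r < 0], yy[r < 0], 1)   # left of cutoff
            slope_hi, int_hi = np.polyfit(r[r >= 0], yy[r >= 0], 1) # right of cutoff
            return int_hi - int_lo    # discontinuity in the conditional mean at the cutoff
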
  11. By: Orth, Walter
    Abstract: Credit ratings are ordinal predictions of the default risk of an obligor. Commonly used measures for evaluating the accuracy of such predictions are the Accuracy Ratio or, equivalently, the area under the ROC curve. The disadvantage of these measures is that they treat default as a binary variable, thereby neglecting the timing of default events and discarding information from censored observations. We present an alternative measure that is related to the Accuracy Ratio but does not suffer from these drawbacks. As a second contribution, we study statistical inference for the Accuracy Ratio and the proposed measure in the case of multiple cohorts of obligors with overlapping lifetimes. We derive methods that use more sample information and lead to more powerful tests than alternatives that use just the independent part of the dataset. All procedures are illustrated in the empirical section using a dataset of S&P Long Term Credit Ratings. An illustrative contrast of the two kinds of measures follows this entry.
    Keywords: Ratings; predictive accuracy; Accuracy Ratio; Harrell's C; overlapping lifetimes
    JEL: C41 G32 G24
    Date: 2010–03–22
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:30148&r=ecm
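    An illustrative contrast (not the paper's exact estimator) between the binary Accuracy Ratio and a Harrell's-C-type concordance index that uses default timing and censored observations; 'risk' is any score that is higher for riskier ratings.

        import numpy as np

        def accuracy_ratio(risk, default):
            """AR = 2*AUC - 1 for a binary default indicator."""
            risk, default = np.asarray(risk), np.asarray(default)
            pos, neg = risk[default == 1], risk[default == 0]
            auc = (np.mean(pos[:, None] > neg[None, :])
                   + 0.5 * np.mean(pos[:, None] == neg[None, :]))
            return 2 * auc - 1

        def harrell_c(risk, time, event):
            """Concordance over comparable pairs: an obligor defaulting at time t
            should carry higher risk than any obligor still under observation at t."""
            conc = ties = comparable = 0
            for i in range(len(time)):
                if not event[i]:
                    continue                     # censored spells never anchor a pair
                for j in range(len(time)):
                    if time[j] > time[i]:
                        comparable += 1
                        conc += risk[i] > risk[j]
                        ties += risk[i] == risk[j]
            return (conc + 0.5 * ties) / comparable
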
  12. By: Pablo Pincheira
    Abstract: Inference about predictive ability is usually carried out in the form of pairwise comparisons between two competing forecasting methods. Nevertheless, some interesting questions concern families of models and not just a couple of forecasting strategies. An example is the question of the predictive accuracy of pure time-series models versus models based on economic fundamentals. It is clear that an appropriate answer to this question requires comparing families of models, which may include a number of different forecasting strategies. Another usual approach in the literature consists of comparing the accuracy of a new forecasting method with a natural benchmark. Nevertheless, unless the econometrician is completely sure about the superiority of the benchmark over the rest of the methods available in the literature, he or she may want to compare the accuracy of the new forecasting model, and its extensions, against a broader set of methods. In this article we present a simple methodology to test the null hypothesis of equal predictive ability between two families of forecasting methods. Our approach corresponds to a natural extension of the White (2000) reality check in which we allow the families being compared to be populated by a large number of forecasting methods. We illustrate our testing approach with an empirical application comparing the ability of two families of models to predict headline inflation in Chile, the US, Sweden and Mexico. With this illustration we show that comparing families of models using the usual approach, based on pairwise comparisons of the best ex-post performing models in each family, may lead to conclusions that are at odds with those suggested by our approach.
    Date: 2011–01
    URL: http://d.repec.org/n?u=RePEc:chb:bcchwp:607&r=ecm
  13. By: Bruzda, Joanna
    Abstract: The paper considers some of the problems emerging from discrete wavelet analysis of popular bivariate spectral quantities such as the coherence and phase spectra and the frequency-dependent time delay. The approach taken here, introduced by Whitcher and Craigmile (2004), is based on the maximal overlap discrete Hilbert wavelet transform (MODHWT). First, we point to a deficiency in the implementation of the MODHWT and suggest a modified implementation scheme resembling the one applied in the context of the dual-tree complex wavelet transform of Kingsbury (see Selesnick et al., 2005). Second, via a broad set of simulation experiments we examine the small- and large-sample properties of two wavelet estimators of the scale-dependent time delay: the wavelet cross-correlator and the wavelet phase angle-based estimator. Our results provide some practical guidelines for empirical examination of short- and medium-term lead-lag relations for octave frequency bands. Besides, we show how the MODHWT-based wavelet quantities can serve to approximate the Fourier bivariate spectra and discuss certain issues connected with building confidence intervals for them. The discrete wavelet analysis of coherence and phase angle is illustrated with a scale-dependent examination of business cycle synchronization between 11 euro zone member countries. The study is supplemented with a wavelet analysis of variance and covariance of the euro zone business cycles. The empirical examination underlines the good localization properties and high computational efficiency of the wavelet transformations applied, and provides new arguments in favour of the endogeneity hypothesis of the optimum currency area criteria as well as wavelet evidence on dating the Great Moderation in the euro zone.
    Keywords: Hilbert wavelet pair, MODHWT, wavelet coherence, wavelet phase angle, business cycle synchronization
    JEL: C19 E32 E58 O52
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:20115&r=ecm
  14. By: Eickmeier, Sandra; Lemke, Wolfgang; Marcellino, Massimiliano
    Abstract: We propose a classical approach to estimate factor-augmented vector autoregressive (FAVAR) models with time variation in the factor loadings, in the factor dynamics, and in the variance-covariance matrix of innovations. When the time-varying FAVAR is estimated using a large quarterly dataset of US variables from 1972 to 2007, the results indicate some changes in the factor dynamics, and more marked variation in the factors' shock volatility and their loading parameters. Forecasts from the time-varying FAVAR are more accurate than those from a constant-parameter FAVAR for most variables and horizons when computed in-sample, and for some variables, mostly financial indicators, in pseudo real time. Finally, we use the time-varying FAVAR to assess how monetary transmission to the economy has changed. We find substantial time variation in the volatility of monetary policy shocks, and we observe that the reaction of GDP, the GDP deflator, inflation expectations and long-term interest rates to an equally-sized monetary policy shock has decreased since the early 1980s.
    Keywords: FAVAR, time-varying parameters, monetary transmission, forecasting
    JEL: C3 C53 E52
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp1:201104&r=ecm
  15. By: Antonio Merlo (Department of Economics, University of Pennsylvania); Xun Tang (Department of Economics, University of Pennsylvania)
    Abstract: Stochastic sequential bargaining models (Merlo and Wilson (1995, 1998)) have found wide applications in different fields including political economy and macroeconomics due to their flexibility in explaining delays in reaching an agreement. This paper presents new results in nonparametric identification and estimation of such models under different data scenarios.
    Keywords: Nonparametric identification, non-cooperative bargaining, stochastic sequential bargaining, rationalizable counterfactual outcomes
    JEL: C14 C35 C73 C78
    Date: 2010–03–01
    URL: http://d.repec.org/n?u=RePEc:pen:papers:11-008&r=ecm
  16. By: Stephen Hall; P. A. V. B. Swamy; George S. Tavlas
    Abstract: We propose a new generalization of the concept of cointegration that allows for the possibility that a set of variables is involved in an unknown nonlinear relationship. Although these variables may be unit-root non-stationary, there exists a nonlinear combination of them that takes account of such non-stationarity. We then introduce an estimation technique that allows us to test for the presence of this generalized cointegration without knowledge of the true nonlinear functional form or the full set of regressors. We outline the basic stages of the technique and discuss how the issue of unit-root non-stationarity and cointegration affects each stage of the estimation procedure. We then apply this technique to the relationship between health expenditure and health outcomes, which is an important but controversial issue. A number of studies have found very little or no relationship between the level of health expenditure and outcomes. In econometric terms, if there is such a relationship then there should exist a cointegrating relationship between these two variables and possibly many others. The problem that arises is that we may either be unable to measure these other variables or be unaware of them, in which case we may incorrectly find no relationship between health expenditures and outcomes. Applying the concept of generalized cointegration, we obtain a highly significant relationship between health expenditure and health outcomes.
    Keywords: Generalized cointegration; non-stationarity; time-varying coefficient model; coefficient driver
    JEL: C13 C19 C22
    Date: 2011–03
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:11/22&r=ecm
  17. By: Michele Modugno (European Central Bank, DG-R/EMO, Kaiserstrasse 29, D-60311, Frankfurt am Main, Germany.)
    Abstract: This paper proposes a methodology to nowcast and forecast inflation using data sampled at a frequency higher than monthly. The nowcasting literature has focused on GDP, typically using monthly indicators to produce an accurate estimate for the current and next quarter. This paper exploits data at weekly and daily frequency to produce more accurate estimates of inflation for the current and following months. In particular, it uses the Weekly Oil Bulletin Price Statistics for the euro area, the Weekly Retail Gasoline and Diesel Prices for the US, and daily World Market Prices of Raw Materials. The data are modeled as a trading-day-frequency factor model with missing observations in a state space representation. For the estimation we adopt the methodology proposed in Banbura and Modugno (2010). In contrast to other existing approaches, the methodology used in this paper has the advantage of modeling all data within a single unified framework that nevertheless allows one to produce forecasts of all variables involved. This makes it possible to disentangle a model-based measure of "news" from each data release and subsequently to assess its impact on the forecast revision. The paper provides an illustrative example of this procedure. Overall, the results show that these data improve forecast accuracy over models that exploit only monthly data, for both economies. A stylized sketch of the missing-data filtering idea follows this entry.
    Keywords: Factor Models, Forecasting, Inflation, Mixed Frequencies.
    JEL: C53 E31 E37
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20111324&r=ecm
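    A stylized single-factor sketch of the missing-data filtering idea behind mixed-frequency nowcasting: all series are stacked at the highest (daily) frequency, entries not yet released are NaN, and the Kalman update simply skips them. This is a generic textbook filter, not the Banbura-Modugno estimation routine; Z, H, Q and the AR coefficient are placeholders.

        import numpy as np

        def kalman_filter_missing(y, Z, T_ar, H, Q):
            """y: (n_periods, n_series) with NaN for unreleased data;
            scalar factor f_t = T_ar * f_{t-1} + e_t and y_t = Z * f_t + u_t."""
            n = y.shape[0]
            f, P = 0.0, 10.0                              # vague initial state
            filtered = np.empty(n)
            for t in range(n):
                f, P = T_ar * f, T_ar * P * T_ar + Q      # prediction step
                obs = ~np.isnan(y[t])                     # series released at t
                if obs.any():
                    Zt, Ht = Z[obs], H[obs][:, obs]
                    v = y[t, obs] - Zt * f                # innovations
                    F = np.outer(Zt, Zt) * P + Ht         # innovation variance
                    K = P * Zt @ np.linalg.inv(F)         # Kalman gain
                    f = f + K @ v
                    P = P - (K @ Zt) * P
                filtered[t] = f
            return filtered

        # e.g. f_hat = kalman_filter_missing(y_panel, loadings, 0.9, np.eye(k), 1.0)
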
  18. By: Nuria Torrado; Rosa E. Lillo; Michael P. Wiper
    Abstract: A great number of articles have dealt with stochastic comparisons of ordered random variables in the last decades. In particular, distributional and stochastic properties of ordinary order statistics have been studied extensively in the literature. Sequential order statistics have been proposed as an extension of ordinary order statistics. Since the sequential order statistics model unifies various models of ordered random variables, it is interesting to study its distributional and stochastic properties. In this work, we consider the problem of comparing sequential order statistics according to magnitude and location orders.
    Keywords: Stochastic orderings, Reliability, Order statistics
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws110603&r=ecm
  19. By: Miguel A. Segoviano; Charles Goodhart
    Abstract: This paper defines a set of systemic financial stability indicators which measure distress dependence among the financial institutions in a system, thereby making it possible to analyze stability from three complementary perspectives: common distress in the system, distress between specific banks, and cascade effects associated with a specific bank. Our approach defines the banking system as a portfolio of banks and infers the system's multivariate density (BSMD), from which the proposed measures are estimated. The BSMD embeds the banks' default inter-dependence structure, which captures linear and non-linear distress dependencies among the banks in the system and their changes at different times of the economic cycle. The BSMD is recovered using the CIMDO approach, a novel approach that improves density specification in the presence of restricted data without explicitly imposing parametric forms that are difficult to model under such data limitations. Thus, the proposed measures can be constructed from a very limited set of publicly available data and can be provided for a wide range of both developing and developed countries.
    Date: 2010–04
    URL: http://d.repec.org/n?u=RePEc:chb:bcchwp:569&r=ecm
  20. By: Fuchs, Andreas
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:zbw:wuerpm:22011&r=ecm
  21. By: Snowberg, Erik (California Institute of Technology); Wolfers, Justin (Wharton School, University of Pennsylvania); Zitzewitz, Eric (Stanford University)
    Abstract: This review paper articulates the relationship between prediction market data and event studies, with a special focus on applications in political economy. Event studies have been used to address a variety of political economy questions – from the economic effects of party control of government to the importance of complex rules in congressional committees. However, the results of event studies are notoriously sensitive to both choices made by researchers and external events. Specifically, event studies will generally produce different results depending on three interrelated things: which event window is chosen, the prior probability assigned to an event at the beginning of the event window, and the presence or absence of other events during the event window. In this paper we show how each of these may bias the results of event studies, and how prediction markets can mitigate these biases.
    Keywords: prediction markets, event studies, political economy
    JEL: A2 D72 H50 G14
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp5640&r=ecm

This nep-ecm issue is ©2011 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.