nep-ecm New Economics Papers
on Econometrics
Issue of 2009‒07‒11
25 papers chosen by
Sune Karlsson
Örebro University

  1. Forecasting with Spatial Panel Data By Baltagi, Badi H.; Bresson, Georges; Pirotte, Alain
  2. Adaptive Rate-optimal Detection of Small Autocorrelation Coefficients By Alain Guay; Emmanuel Guerre; Štěpána Lazarová
  3. Forecasting Levels of log Variables in Vector Autoregressions By Gunnar Bardsen; Helmut Luetkepohl
  4. Robust estimation of zero-inflated count models By Kevin E. Staub; Rainer Winkelmann
  5. Econometric Issues and Methods in the Estimation of Production Functions By Aguirregabiria, Victor
  6. Equidispersion and moment conditions for count panel data model By Yoshitsugu Kitazawa
  7. Some Notes on Sample Selection Models By Aguirregabiria, Victor
  8. Bayesian Analysis of Time-Varying Parameter Vector Autoregressive Model for the Japanese Economy and Monetary Policy By Jouchi Nakajima; Munehisa Kasuya; Toshiaki Watanabe
  9. A negative binomial model and moment conditions for count panel data By Yoshitsugu Kitazawa
  10. Forecasting with Factor-augmented Error Correction Models By Igor Masten; Massimiliano Marcellino; Anindya Banerjee
  11. Impossibility Results for Nondifferentiable Functionals By Hirano, Keisuke; Porter, Jack
  12. Spatial Filtering, Model Uncertainty and the Speed of Income Convergence in Europe By Jesus Crespo Cuaresma; Martin Feldkircher
  13. Introducing the Euro-STING: Short-Term Indicator of Euro Area Growth By Camacho, Maximo; Pérez-Quirós, Gabriel
  14. Long memory in stock market volatility and the volatility-in-mean effect: the FIEGARCH-M model By Bent Jesper Christensen; Morten Ørregaard Nielsen; Jie Zhu
  15. Cluster sample inference using sensitivity analysis: the case with few groups By Vikström, Johan
  16. Identifying Heterogeneity in Economic Choice Models By Jeremy T. Fox; Amit Gandhi
  17. An I(2) Cointegration Model with Piecewise Linear Trends: Likelihood Analysis and Application By Takamitsu Kurita; Heino Bohn Nielsen; Anders Rahbek
  18. Foundations of Non-Commutative Probability Theory By Daniel Lehmann
  19. Real-time density forecasts from VARs with stochastic volatility By Todd E. Clark
  20. Sequential Methodology for Signaling Business Cycle Turning Points By Vasyl Golosnoy; Jens Hogrefe
  21. Combining discrete and continuous representations of preference heterogeneity: a latent class approach By Angel Bujosa Bestard; Antoni Riera Font; Robert L. Hicks
  22. Estimating censored and non homothetic demand systems: the generalized maximum entropy approach By Fabienne Féménia; Alexandre Gohin
  23. Assessing the Transmission of Monetary Policy Shocks Using Dynamic Factor Models By Dimitris Korobilis
  24. Dynamics in systematic liquidity By Björn Hagströmer; Richard G. Anderson; Jane M. Binner; Birger Nilsson
  25. Stochastic Volatility and DSGE Models By Martin M. Andreasen

  1. By: Baltagi, Badi H. (Syracuse University); Bresson, Georges (University of Paris 2); Pirotte, Alain (University of Paris 2)
    Abstract: This paper compares various forecasts using panel data with spatial error correlation. The true data generating process is assumed to be a simple error component regression model with spatial remainder disturbances of the autoregressive or moving average type. The best linear unbiased predictor is compared with other forecasts ignoring spatial correlation, or ignoring heterogeneity due to the individual effects, using Monte Carlo experiments. In addition, we check the performance of these forecasts under misspecification of the spatial error process, various spatial weight matrices, and heterogeneous rather than homogeneous panel data models.
    Keywords: forecasting, BLUP, panel data, spatial dependence, heterogeneity
    JEL: C33
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp4242&r=ecm
  2. By: Alain Guay (Université du Québec); Emmanuel Guerre (Queen Mary, University of London); Štěpána Lazarová (Queen Mary, University of London)
    Abstract: A new test is proposed for the null of absence of serial correlation. The test uses a data-driven smoothing parameter. The resulting test statistic has a standard limit distribution under the null. The smoothing parameter is calibrated to achieve rate-optimality against several classes of alternatives. The test can detect alternatives with many small correlation coefficients that can go to zero with an optimal adaptive rate which is faster than the parametric rate. The adaptive rate-optimality of the new test against smooth alternatives is established as well. The test can also detect ARMA and local Pitman alternatives converging to the null with a rate close to or equal to the parametric one. A simulation experiment and an application to monthly financial squared returns illustrate the usefulness of the proposed approach.
    Keywords: Absence of serial correlation; Data-driven nonparametric tests; Adaptive rate-optimality; Small alternatives; Time series
    JEL: C12 C32
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp645&r=ecm
  3. By: Gunnar Bardsen; Helmut Luetkepohl
    Abstract: When a variable appears in logarithms (logs) in a system of time series, forecasts of the original variable are sometimes of interest. In that case, converting the forecast of the log of the variable into a naive forecast of the original variable by simply applying the exponential transformation is not theoretically optimal. A simple expression for the optimal forecast under normality assumptions is derived. Despite its theoretical advantages, the optimal forecast is shown to be inferior to the naive forecast if specification and estimation uncertainty are taken into account. Hence, in practice, using the exponential of the log forecast is preferable to using the optimal forecast.
    Keywords: Vector autoregressive model, cointegration, forecast root mean square error
    JEL: C32
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2009/24&r=ecm
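    The optimal forecast referred to in item 3 is, under normality, the familiar lognormal correction; a sketch of the standard result (not necessarily the authors' exact expression): if x_{T+h} = \log y_{T+h} and the forecast error x_{T+h} - \hat{x}_{T+h|T} is Gaussian with variance \sigma^2_h, then
        \hat{y}^{\text{naive}}_{T+h} = \exp(\hat{x}_{T+h|T}), \qquad
        \hat{y}^{\text{opt}}_{T+h} = E[y_{T+h} \mid \mathcal{F}_T] = \exp\!\big(\hat{x}_{T+h|T} + \tfrac{1}{2}\sigma^2_h\big),
    so the naive forecast understates the conditional mean by the factor \exp(\sigma^2_h/2). The abstract's practical point is that once \sigma^2_h must itself be estimated, this theoretical advantage disappears.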
  4. By: Kevin E. Staub (Socioeconomic Institute, University of Zurich); Rainer Winkelmann (Socioeconomic Institute, University of Zurich)
    Abstract: Applications of zero-inflated count data models have proliferated in empirical economic research. There is a downside to this development, as zero-inflated Poisson or zero-inflated Negative Binomial Maximum Likelihood estimators are not robust to misspecification. In contrast, simple Poisson regression provides consistent parameter estimates even in the presence of excess zeros. The advantages of the Poisson approach are illustrated in a series of Monte Carlo simulations.
    Keywords: excess zeros, Poisson, overdispersion, Negative Binomial regression.
    JEL: C12 C25
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:soz:wpaper:0908&r=ecm
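    The consistency claim in item 4 can be illustrated with a minimal Monte Carlo sketch (illustrative only, not the authors' design): with a constant zero-inflation probability p, the conditional mean (1-p)exp(b0 + b1*x) = exp(b0 + log(1-p) + b1*x) still has the Poisson exponential-mean form, so Poisson pseudo-maximum likelihood recovers the slope b1 while the intercept absorbs log(1-p).

        # Hypothetical DGP and parameter values; requires numpy and statsmodels.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        b0, b1, p = 0.5, 1.0, 0.3
        slopes = []
        for _ in range(200):
            x = rng.normal(size=2000)
            y = rng.poisson(np.exp(b0 + b1 * x))
            y[rng.random(2000) < p] = 0          # excess zeros from the inflation process
            fit = sm.GLM(y, sm.add_constant(x), family=sm.families.Poisson()).fit()
            slopes.append(fit.params[1])
        print(np.mean(slopes))                   # close to b1 = 1.0 despite the excess zeros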
  5. By: Aguirregabiria, Victor
    Abstract: This paper discusses the main econometric issues in the identification and estimation of production functions, and reviews recent methods. The main emphasis of the paper is on explaining the role of different identifying assumptions used in alternative estimation methods.
    Keywords: Production Function Estimation. Dynamic Panel Data Models. Endogeneity. Sample Selection.
    JEL: L11 C01
    Date: 2009–06–25
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15973&r=ecm
  6. By: Yoshitsugu Kitazawa (Faculty of Economics, Kyushu Sangyo University)
    Abstract: This paper proposes some new moment conditions under the assumption of equidispersion in the count panel data model. These are obtained by using the association between variances and covariances in the disturbance. Some Monte Carlo experiments configured for the Poisson model show that the GMM estimators using the new moment conditions perform better than the conventional quasi-differenced GMM estimator, and that some gains are realized from using the new moment conditions.
    Keywords: count panel data, linear feedback model, equidispersion, implicit operation, crosslinkage moment conditions, GMM, Monte Carlo experiments
    JEL: C23
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:kyu:dpaper:33&r=ecm
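    For reference, the conventional quasi-differenced moment condition that item 6 benchmarks against is usually written, for a multiplicative-effect count model with predetermined regressors (the Chamberlain/Wooldridge form; the exact variant used in the paper may differ), as
        E\big[\, y_{it} - y_{i,t+1} \exp\{(x_{it} - x_{i,t+1})'\beta\} \,\big|\, x_{i1}, \dots, x_{it} \,\big] = 0, \qquad t = 1, \dots, T-1,
    which conditions only on regressors dated t and earlier; the new conditions proposed in the paper additionally exploit the equidispersion (variance equals mean) restriction.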
  7. By: Aguirregabiria, Victor
    Abstract: Sample selection problems are pervasive when working with microeconomic models and datasets of individuals, households or firms. During the last three decades, there have been very significant developments in this area of econometrics. Different types of models have been proposed and used in empirical applications, and new estimation and inference methods, both parametric and semiparametric, have been developed. These notes provide a brief introduction to this large literature.
    Keywords: Sample selection. Censored regression model. Truncated regression model. Treatment effects. Semiparametric methods.
    JEL: C14 C24 C01
    Date: 2009–05–20
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15974&r=ecm
  8. By: Jouchi Nakajima; Munehisa Kasuya; Toshiaki Watanabe
    Abstract: This paper analyzes the time-varying parameter vector autoregressive (TVP-VAR) model for the Japanese economy and monetary policy. The time-varying parameters are estimated via the Markov chain Monte Carlo method and the posterior estimates of parameters reveal the time-varying structure of the Japanese economy and monetary policy during the period from 1981 to 2008. The marginal likelihoods of the TVP-VAR model and other VAR models are also estimated. The estimated marginal likelihoods indicate that the TVP-VAR model best fits the Japanese economic data.
    Keywords: Bayesian inference, Markov chain Monte Carlo, Monetary policy, State space model, Structural vector autoregressive model, Stochastic volatility, Time-varying parameter
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:hst:ghsdps:gd09-072&r=ecm
  9. By: Yoshitsugu Kitazawa (Faculty of Economics, Kyushu Sangyo University)
    Abstract: This paper proposes some moment conditions associated with an appropriate specification of the negative binomial model for count panel data, which was proposed by Hausman et al. (1984). The newly proposed moment conditions enable researchers to conduct consistent estimation of the model under much weaker assumptions than those configured by Hausman et al. (1984). Some Monte Carlo experiments show that the GMM estimators using the new moment conditions perform well in DGP configurations conforming to the specification above.
    Keywords: count panel data, predetermined explanatory variable, linear feedback model, overdispersion, negative binomial model, implicit operation, cross-linkage moment conditions, GMM, Monte Carlo experiments
    JEL: C23
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:kyu:dpaper:34&r=ecm
  10. By: Igor Masten; Massimiliano Marcellino; Anindya Banerjee
    Abstract: As a generalization of the factor-augmented VAR (FAVAR) and of the Error Correction Model (ECM), Banerjee and Marcellino (2009) introduced the Factor-augmented Error Correction Model (FECM). The FECM combines error-correction, cointegration and dynamic factor models, and has several conceptual advantages over standard ECM and FAVAR models. In particular, it uses a larger dataset compared to the ECM and incorporates the long-run information lacking from the FAVAR because of the latter's specification in differences. In this paper we examine the forecasting performance of the FECM by means of an analytical example, Monte Carlo simulations and several empirical applications. We show that relative to the FAVAR, the FECM generally offers higher forecasting precision and in general marks a very useful step forward for forecasting with large datasets.
    Keywords: Forecasting, Dynamic Factor Models, Error Correction Models, Cointegration, Factor-augmented Error Correction Models, FAVAR
    Date: 2009–06–25
    URL: http://d.repec.org/n?u=RePEc:rsc:rsceui:2009/32&r=ecm
  11. By: Hirano, Keisuke; Porter, Jack
    Abstract: We examine challenges to estimation and inference when the objects of interest are nondifferentiable functionals of the underlying data distribution. This situation arises in a number of applications of bounds analysis and moment inequality models, and in recent work on estimating optimal dynamic treatment regimes. Drawing on earlier work relating differentiability to the existence of unbiased and regular estimators, we show that if the target object is not continuously differentiable in the parameters of the data distribution, there exist no locally asymptotically unbiased estimators and no regular estimators. This places strong limits on estimators, bias correction methods, and inference procedures.
    Keywords: bounds analysis; moment inequality models; treatment effects; limits of experiments
    JEL: C13 C14 C1
    Date: 2009–06–29
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15990&r=ecm
  12. By: Jesus Crespo Cuaresma; Martin Feldkircher
    Abstract: In this paper we put forward a Bayesian Model Averaging method dealing with model uncertainty in the presence of potential spatial autocorrelation. The method uses spatial filtering in order to account for different types of spatial links. We contribute to existing methods that handle spatial dependence among observations by explicitly taking care of uncertainty stemming from the choice of a particular spatial structure. Our method is applied to estimate the conditional speed of income convergence across 255 NUTS-2 European regions for the period 1995-2005. We show that the choice of a spatial weight matrix - and in particular the choice of a class thereof - can have an important effect on the estimates of the parameters attached to the model covariates. We also show that estimates of the speed of income convergence across European regions depend strongly on the form of the spatial patterns which are assumed to underlie the dataset. When we take this dimension of model uncertainty into account, the posterior distribution of the speed of convergence parameter appears bimodal, with probability mass concentrated around no convergence (a 0% speed of convergence) and around a convergence rate of 1%, approximately half the value usually reported in the literature.
    Keywords: Model uncertainty, spatial filtering, determinants of economic growth, European regions.
    JEL: C11 C15 C21 R11 O52
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:inn:wpaper:2009-17&r=ecm
  13. By: Camacho, Maximo; Pérez-Quirós, Gabriel
    Abstract: We set out a model to compute short-term forecasts of euro area GDP growth in real time. To allow for forecast evaluation, we construct a real-time data set that changes for each vintage date and includes the exact information that was available at the time of each forecast. With this data set, we show that our simple factor model algorithm, which uses a clear, easy-to-replicate methodology, is able to forecast euro area GDP growth as well as professional forecasters, who can combine the best forecasting tools with the possibility of incorporating their own judgement. In this context, we provide examples showing how data revisions and data availability affect point forecasts and forecast uncertainty.
    Keywords: Business cycle; Forecasting; Time Series
    JEL: C22 E27 E32
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:7343&r=ecm
  14. By: Bent Jesper Christensen (University of Aarhus and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES); Jie Zhu (University of Aarhus and CREATES)
    Abstract: We extend the fractionally integrated exponential GARCH (FIEGARCH) model of Bollerslev and Mikkelsen (1996) for daily stock return data with long memory in return volatility by introducing a possible volatility-in-mean effect. To prevent the long memory property of volatility from carrying over to returns, we consider a filtered FIEGARCH-in-mean (FIEGARCH-M) effect in the return equation. The filtering of the volatility-in-mean component thus allows the co-existence of long memory in volatility and short memory in returns. We present an application to the daily CRSP value-weighted cum-dividend stock index return series from 1926 through 2006, which documents the empirical relevance of our model. The volatility-in-mean effect is significant, and the FIEGARCH-M model outperforms the original FIEGARCH model and alternative GARCH-type specifications according to standard criteria.
    Keywords: FIEGARCH, financial leverage, GARCH, long memory, risk-return tradeoff, stock returns, volatility feedback
    JEL: C22
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1207&r=ecm
  15. By: Vikström, Johan (IFAU - Institute for Labour Market Policy Evaluation)
    Abstract: This paper re-examines inference for cluster samples. Sensitivity analysis is proposed as a new method to perform inference when the number of groups is small. Based on estimations using disaggregated data, the sensitivity of the standard errors with respect to the variance of the cluster effects can be examined in order to distinguish a causal effect from random shocks. The method even handles just-identified models. One important example of a just-identified model is the two groups and two time periods difference-in-differences setting. The method allows for different types of correlation over time and between groups in the cluster effects.
    Keywords: Cluster-correlation; difference-in-difference; sensitivity analysis
    JEL: C12 C21 C23
    Date: 2009–06–11
    URL: http://d.repec.org/n?u=RePEc:hhs:ifauwp:2009_015&r=ecm
  16. By: Jeremy T. Fox; Amit Gandhi
    Abstract: We show how to nonparametrically identify the distribution that characterizes heterogeneity among agents in a general class of structural choice models. We introduce an axiom that we term separability and prove that separability of a structural model ensures identification. The main strength of separability is that it makes verifying the identification of nonadditive models a tractable task because it is a condition that is stated directly in terms of the choice behavior of agents in the model. We use separability to prove several new results. We prove the identification of the distribution of random functions and marginal effects in a nonadditive regression model. We also identify the distribution of utility functions in the multinomial choice model. Finally, we extend 2SLS to have random functions in both the first and second stages. This instrumental variables strategy applies equally to multinomial choice models with endogeneity.
    JEL: C14 C25 L0
    Date: 2009–07
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:15147&r=ecm
  17. By: Takamitsu Kurita (Faculty of Economics, Fukuoka University); Heino Bohn Nielsen (Department of Economics, University of Copenhagen); Anders Rahbek (Department of Economics, University of Copenhagen and CREATES)
    Abstract: This paper presents a likelihood analysis of the I(2) cointegrated vector autoregression with piecewise linear deterministic terms. The limiting behavior of the maximum likelihood estimators is derived and used to further derive the limiting distribution of the likelihood ratio statistic for the cointegration ranks, extending the result for I(2) models with a linear trend in Nielsen and Rahbek (2007) and for I(1) models with piecewise linear trends in Johansen, Mosconi, and Nielsen (2000). The asymptotic theory provided also extends the results in Johansen, Juselius, Frydman, and Goldberg (2009), where asymptotic inference is discussed in detail for one of the cointegration parameters. To illustrate, an empirical analysis of US consumption, income and wealth, 1965-2008, is performed, emphasizing the importance of a change in nominal price trends after 1980.
    Keywords: Cointegration, I(2), Piecewise linear trends, Likelihood analysis, US consumption
    JEL: C32
    Date: 2009–07–06
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-28&r=ecm
  18. By: Daniel Lehmann
    Abstract: Kolmogorov's setting for probability theory is given an original generalization to account for probabilities arising from Quantum Mechanics. The sample space has a central role in this presentation, and random variables, i.e., observables, are defined in a natural way. The mystery presented by the algebraic equations satisfied by (non-commuting) observables that cannot be observed in the same states is elucidated.
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:huj:dispap:dp514&r=ecm
  19. By: Todd E. Clark
    Abstract: Central banks and other forecasters have become increasingly interested in various aspects of density forecasts. However, recent sharp changes in macroeconomic volatility such as the Great Moderation and the more recent sharp rise in volatility associated with greater variation in energy prices and the deep global recession pose significant challenges to density forecasting. Accordingly, this paper examines, with real-time data, density forecasts of U.S. GDP growth, unemployment, inflation, and the federal funds rate from VAR models with stochastic volatility. The model of interest extends the steady state prior BVAR of Villani (2009) to include stochastic volatility, because, as found in some prior work and this paper, incorporating informative priors on the steady states of the model variables often improves the accuracy of point forecasts. The evidence presented in the paper shows that adding stochastic volatility to the BVAR with a steady state prior materially improves the real-time accuracy of point and density forecasts.
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:fip:fedkrw:rwp09-08&r=ecm
  20. By: Vasyl Golosnoy; Jens Hogrefe
    Abstract: The dates of U.S. business cycle turning points are reported by the NBER with a considerable delay, so an early indication of turning points is of particular interest. This paper proposes a novel sequential approach designed for timely signaling of these turning points. A directional cumulated sum decision rule is adapted for the purpose of on-line monitoring of transitions between subsequent phases of economic activity. The introduced procedure shows a sound detection ability for business cycle peaks and troughs compared to the established dynamic factor Markov switching methodology. It exhibits a range of theoretical optimality properties for early signaling; moreover, it is transparent and easy to implement.
    Keywords: Business cycle; CUSUM control chart; Dynamic Factor Markov switching models; Early signaling; NBER dating
    JEL: C44 C50 E32
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:kie:kieliw:1527&r=ecm
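    The cumulated sum idea used in item 20 can be illustrated with a minimal one-sided CUSUM detector (illustrative only, with hypothetical data and tuning constants; not the authors' decision rule): accumulate evidence of a downward shift in a monthly activity indicator and signal a peak once the cumulated deviations exceed a threshold.

        # Minimal CUSUM sketch in Python; mu0 is the in-control mean, k the
        # reference value, h the decision threshold (all hypothetical here).
        import numpy as np

        def cusum_down(x, mu0=0.5, k=0.5, h=4.0):
            s = 0.0
            for t, xt in enumerate(x):
                s = max(0.0, s + (mu0 - xt) - k)   # evidence of a downward shift
                if s > h:
                    return t                       # first alarm date
            return None

        rng = np.random.default_rng(1)
        x = np.concatenate([rng.normal(0.5, 1.0, 60),     # expansion regime
                            rng.normal(-0.5, 1.0, 24)])   # downturn after t = 60
        print(cusum_down(x))   # typically signals a few periods after the shift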
  21. By: Angel Bujosa Bestard (Centre de Recerca Econòmica (UIB · Sa Nostra)); Antoni Riera Font (Centre de Recerca Econòmica (UIB · Sa Nostra)); Robert L. Hicks (The College of William and Mary)
    Abstract: This paper investigates heterogeneity in preferences for forest recreators in Mallorca, Spain. We develop a latent class approach combining discrete and continuous representations of tastes and compare it with the conventional latent class and random parameter logit approaches. We investigate the performance of the discrete-continuous model by comparing welfare estimates and predictive accuracy. The discrete-continuous model outperforms latent class and mixed logit approaches when comparing goodness-of-fit and in-sample site-choice forecasts. We find that the discrete-continuous model for preference heterogeneity reveals variation among individuals' preferences and WTP, and for some policy changes our results reveal striking differences in means and distributions of WTP.
    Keywords: Travel Cost Method, latent class model, random parameter model, recreation demand, forests
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:pdm:wpaper:2009/2&r=ecm
  22. By: Fabienne Féménia; Alexandre Gohin
    Abstract: The econometric estimation of zero-censored demand systems faces major difficulties. The virtual price approach pioneered by Lee and Pitt (1986) in an econometric framework is theoretically consistent but empirically feasible only for homothetic demand systems. It may even fail to converge depending on initial conditions. In this paper we propose to expand on this approach by relying on the generalized maximum entropy concept instead of the Maximum Likelihood paradigm. The former is robust to the error distribution while the latter must stick with a normality assumption. Accordingly, the econometric specification of censored demand systems with virtual prices is made easier, even with non-homothetic preferences defined over several goods. Illustrative Monte Carlo sampling results show its relative performance.
    Keywords: censored demand system, virtual prices, generalised maximum entropy
    JEL: C34 C51 D12
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:rae:wpaper:200912&r=ecm
  23. By: Dimitris Korobilis (Department of Economics, University of Strathclyde)
    Abstract: The evolution of monetary policy in the U.S. is examined based on structural dynamic factor models. I extend the current literature which questions the stability of the monetary transmission mechanism, by proposing and studying time-varying parameter factor-augmented vector autoregressions (TVP-FAVAR), which allow for fast and efficient inference based on hundreds of explanatory variables. Different specifications are compared where the factor loadings, VAR coefficients and error covariances, or combinations of those, may change gradually in every period or be subject to small breaks. The model is applied to 157 post-World War II U.S. quarterly macroeconomic variables. The results clearly suggest that the propagation of the monetary and non-monetary (exogenous) shocks has altered its behavior, and specifically in a fashion which supports smooth evolution rather than abrupt change. The most notable changes were in the responses of real activity measures, prices and monetary aggregates, while other key indicators of the economy remained relatively unaffected.
    Keywords: Structural FAVAR, time varying parameter model, monetary policy
    JEL: C11 C32 E52
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:0914&r=ecm
  24. By: Björn Hagströmer; Richard G. Anderson; Jane M. Binner; Birger Nilsson
    Abstract: We develop the principal component analysis (PCA) approach to systematic liquidity measurement by introducing moving and expanding estimation windows. We evaluate these methods along with traditional estimation techniques (full sample PCA and market average) in terms of ability to explain (1) cross-sectional stock liquidity and (2) cross-sectional stock returns. For several traditional liquidity measures our results suggest an expanding window specification for systematic liquidity estimation. However, for price impact liquidity measures we find support for a moving window specification. The market average proxy of systematic liquidity produces the same degree of commonality, but does not have the same ability to explain stock returns as the PCA-based estimates.
    Keywords: Liquidity (Economics)
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2009-25&r=ecm
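    The expanding-window PCA estimator of systematic liquidity discussed in item 24 can be sketched as follows (illustrative only, with simulated data; the authors' exact standardization and window choices may differ): at each period t, extract the first principal component of the standardized stock-level liquidity panel using observations up to t only, and take its period-t score as the systematic liquidity factor.

        # Hypothetical liquidity panel; requires numpy. The sign of a principal
        # component is arbitrary, so the correlation is reported in absolute value.
        import numpy as np

        rng = np.random.default_rng(2)
        T, N = 120, 50
        common = rng.normal(size=T)                             # latent systematic liquidity
        liq = 0.7 * common[:, None] + rng.normal(size=(T, N))   # stock-level liquidity measures

        factor = np.full(T, np.nan)
        for t in range(24, T):                                  # 24-period burn-in window
            panel = liq[:t + 1]
            z = (panel - panel.mean(0)) / panel.std(0)          # standardize each stock
            _, _, vt = np.linalg.svd(z, full_matrices=False)
            factor[t] = z[-1] @ vt[0]                           # first-PC score at time t
        print(abs(np.corrcoef(factor[24:], common[24:])[0, 1])) # high by construction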
  25. By: Martin M. Andreasen (Bank of England and CREATES)
    Abstract: This paper argues that a specification of stochastic volatility commonly used to analyze the Great Moderation in DSGE models may not be appropriate, because the level of a process with this specification does not have conditional or unconditional moments. This is unfortunate because agents may as a result expect productivity, and hence consumption, to be infinite in all future periods. This observation is followed by three ways to overcome the problem.
    Keywords: Great Moderation, Productivity shocks, and Time-varying coefficients
    JEL: E10 E30
    Date: 2009–07–07
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-29&r=ecm

This nep-ecm issue is ©2009 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.