nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒05‒10
thirteen papers chosen by
Sune Karlsson
Orebro University

  1. Simulated Maximum Likelihood using Tilted Importance Sampling By Christian N. Brinch
  2. Semiparametric Deconvolution with Unknown Error Variance By William C. Horrace; Christopher F. Parmeter
  3. Nonparametric Identification of Dynamic Models with Unobserved State Variables By Yingyao Hu; Matthew Shum
  4. Simple nonparametric estimators for unemployment duration analysis By Wichert, Laura; Wilke, Ralf A.
  5. Multivariate tests of asset pricing: Simulation evidence from an emerging market By Javed Iqbal; Robert Brooks; Don U.A. Galagedera
  6. Bayesian Model Averaging and Identification of Structural Breaks in Time Series By Fraser, Iain; Balcombe, Kelvin; Sharma, Abhijit
  7. Nonlinearities in Exchange rates: Double EGARCH Threshold Models for Forecasting Volatility By Sitzia, Bruno; Iovino, Doriana
  8. Bounds analysis of competing risks : a nonparametric evaluation of the effect of unemployment benefits on migration in Germany By Arntz, Melanie; Lo, Simon M. S.; Wilke, Ralf A.
  9. Testing a DSGE model of the EU using indirect inference By David Meenagh; Patrick Minford; Michael Wickens
  10. The matching method for treatment evaluation with selective participation and ineligibles By Costa Dias, Monica; Ichimura, Hidehiko; van den Berg, Gerard J.
  11. The Treatment Effect, the Cross Difference, and the Interaction Term in Nonlinear “Difference-in-Differences” Models By Puhani, Patrick A.
  12. Evaluating the New Keynesian Phillips Curve under VAR-Based Learning By Fanelli, Luca
  13. Limit Theorems for Functionals of Sums that Converge to Fractional Brownian and Stable Motions By P. Jeganathan

  1. By: Christian N. Brinch (Statistics Norway)
    Abstract: This paper develops the important distinction between tilted and simple importance sampling as methods for simulating likelihood functions for use in simulated maximum likelihood. It is shown that tilted importance sampling removes a lower bound on simulation error, for a given importance sample size, that is inherent in simulated maximum likelihood using simple importance sampling, the main method for simulating likelihood functions in the statistics literature. In addition, a new importance sampling technique, generalized Laplace importance sampling, easily combined with tilted importance sampling, is introduced. A number of applications and Monte Carlo experiments demonstrate the power and applicability of the methods. As an example, simulated maximum likelihood estimates from the infamous salamander mating model of McCullagh and Nelder (1989) can be computed easily to satisfactory precision with an importance sample size of 100.
    Keywords: Simulation based estimation; importance sampling.
    JEL: C13 C15
    Date: 2008–04
    URL: http://d.repec.org/n?u=RePEc:ssb:dispap:540&r=ecm
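The tilted-versus-simple distinction is the paper's contribution; as background, here is a minimal importance-sampling sketch in the spirit of Laplace importance sampling. The Gaussian target integral and the mode-matched proposal are illustrative choices, not taken from the paper; for this particular integrand the proposal matches the target exactly, so the weights are constant and the estimate is exact.

```python
import numpy as np

def likelihood_contribution_is(y, n_draws=10_000, seed=0):
    """Estimate L(y) = integral of phi(y - b) * phi(b) db by importance
    sampling, where phi is the standard normal density. The true value
    is the N(0, 2) density at y, which lets us check the estimate."""
    rng = np.random.default_rng(seed)

    def phi(z, s2=1.0):
        return np.exp(-z ** 2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

    # Laplace-type proposal: normal centred at the integrand's mode (y/2)
    # with variance matching its curvature (1/2).
    mode, var = y / 2.0, 0.5
    b = rng.normal(mode, np.sqrt(var), n_draws)
    weights = phi(y - b) * phi(b) / phi(b - mode, var)
    return weights.mean()

est = likelihood_contribution_is(1.5)
true_val = np.exp(-1.5 ** 2 / 4) / np.sqrt(4 * np.pi)
```

With a non-Gaussian integrand the weights would vary across draws, and the paper's point is how tilting the proposal controls that simulation error.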
  2. By: William C. Horrace (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Christopher F. Parmeter (Department of Agricultural and Applied Economics, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061-0401)
    Abstract: Deconvolution is a useful statistical technique for recovering an unknown density in the presence of measurement error. Typically, the method hinges on stringent assumptions about the nature of the measurement error, more specifically, that the distribution is *entirely* known. We relax this assumption in the context of a regression error component model and develop an estimator for the unknown density. We show semi-uniform consistency of the estimator and provide Monte Carlo evidence that demonstrates the merits of the method.
    Keywords: Error component, ordinary smooth, semi-uniform consistency
    JEL: C14 C21
    Date: 2008–04
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:104&r=ecm
  3. By: Yingyao Hu; Matthew Shum
    Abstract: We consider the identification of a Markov process {W_t, X*_t} for t = 1, 2, ..., T when only {W_t} for t = 1, 2, ..., T is observed. In structural dynamic models, W_t denotes the sequence of choice variables and observed state variables of an optimizing agent, while X*_t denotes the sequence of unobserved state variables. The Markov setting allows the distribution of the unobserved state variable X*_t to depend on W_{t-1} and X*_{t-1}. We show that the joint distribution f(W_t, X*_t, W_{t-1}, X*_{t-1}) is identified from the observed distribution f(W_{t+1}, W_t, W_{t-1}, W_{t-2}, W_{t-3}) under reasonable assumptions. Identification of f(W_t, X*_t, W_{t-1}, X*_{t-1}) is a crucial input in methodologies for estimating dynamic models based on the "conditional-choice-probability (CCP)" approach pioneered by Hotz and Miller.
    Date: 2007–12
    URL: http://d.repec.org/n?u=RePEc:jhu:papers:543&r=ecm
  4. By: Wichert, Laura; Wilke, Ralf A.
    Abstract: "We consider an extension of conventional univariate Kaplan-Meier type estimators for the hazard rate and the survivor function to multivariate censored data with a censored random regressor. It is an Akritas (1994) type estimator which adapts the nonparametric conditional hazard rate estimator of Beran (1981) to more typical data situations in applied analysis. We show with simulations that the estimator has nice finite sample properties and our implementation appears to be fast. As an application we estimate nonparametric conditional quantile functions with German administrative unemployment duration data." Additional information: appendix programs for FDZ Methodenreport No. 09/2007 at http://doku.iab.de/fdz/reporte/2007/MR_09-07_Programme.zip
    Keywords: unemployment duration, estimation methods, IAB-Beschäftigtenstichprobe
    Date: 2007–10–16
    URL: http://d.repec.org/n?u=RePEc:iab:iabfme:200709_en&r=ecm
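As background to Kaplan-Meier type estimators (the paper's conditional, multivariate extension is more involved), a minimal unconditional product-limit sketch; the toy durations below are invented:

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit estimate of the survivor function S(t).
    times: observed durations; events: 1 = exit observed, 0 = censored."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    grid = np.unique(times[events == 1])   # distinct exit times
    surv, s = [], 1.0
    for t in grid:
        at_risk = int(np.sum(times >= t))                  # still at risk just before t
        exits = int(np.sum((times == t) & (events == 1)))  # observed exits at t
        s *= 1.0 - exits / at_risk
        surv.append(s)
    return grid, np.array(surv)

# toy unemployment durations in months, with right censoring
t_grid, surv = kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 0, 1, 0])
```

The censored observations leave the risk set without contributing an exit, which is exactly the mechanism the product-limit form exploits.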
  5. By: Javed Iqbal; Robert Brooks; Don U.A. Galagedera
    Abstract: The finite sample performance of the Wald, GMM and Likelihood Ratio (LR) tests of multivariate asset pricing models has been investigated in several studies of the US financial markets. This paper extends that analysis in two important ways. Firstly, since the Wald test is not invariant to alternative nonlinear formulations of the null hypothesis, the paper investigates whether alternative forms of the Wald and GMM tests result in considerable differences in size and power. Secondly, the paper extends the analysis to emerging market data. Emerging markets provide an interesting practical laboratory for testing asset pricing models, as their characteristics differ from those of the well developed markets of the US, Japan and Europe. It is found that the asymptotic Wald and GMM tests based on chi-square critical values result in considerable size distortions, whereas the bootstrap tests yield the correct sizes. The multiplicative form of the bootstrap GMM test appears to outperform the LR test when returns deviate from normality and when the deviations from the asset pricing model are smaller. Application of the bootstrap tests to data from the Karachi Stock Exchange strongly supports the zero-beta CAPM. However, the low power of the multivariate tests warrants a careful interpretation of the results.
    Keywords: Zero-beta CAPM, Multivariate Test, Wald, LR, Emerging Markets
    Date: 2008–04
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2008-2&r=ecm
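The bootstrap idea can be illustrated in a deliberately simplified single-asset setting (the paper's tests are multivariate, and the simulated data below are invented): bootstrap the Wald statistic under the null rather than relying on chi-square critical values.

```python
import numpy as np

rng = np.random.default_rng(0)

def wald_alpha_zero(r, m):
    """Wald statistic for H0: alpha = 0 in r = alpha + beta*m + e
    (single asset; a stand-in for the multivariate tests)."""
    X = np.column_stack([np.ones_like(m), m])
    coef, *_ = np.linalg.lstsq(X, r, rcond=None)
    e = r - X @ coef
    s2 = e @ e / (len(r) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return coef[0] ** 2 / cov[0, 0], coef, e

# simulate excess returns under the null (alpha = 0) with fat-tailed errors
T = 60
m = rng.normal(0.01, 0.05, T)
r = 1.2 * m + 0.03 * rng.standard_t(df=5, size=T)
w_obs, coef, e = wald_alpha_zero(r, m)

# bootstrap the null: resample residuals, rebuild returns with alpha = 0,
# and compare w_obs with the bootstrap distribution of the statistic
boot = np.array([
    wald_alpha_zero(coef[1] * m + rng.choice(e, T, replace=True), m)[0]
    for _ in range(499)
])
p_boot = float(np.mean(boot >= w_obs))
```

With fat-tailed errors and short samples, the bootstrap distribution of the statistic can differ noticeably from chi-square(1), which is the size-distortion point the abstract makes.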
  6. By: Fraser, Iain; Balcombe, Kelvin; Sharma, Abhijit
    Abstract: Bayesian model averaging is used for testing for multiple break points in univariate series using conjugate normal-gamma priors. This approach can test for the number of structural breaks and produce posterior probabilities for a break at each point in time. Results are averaged over specifications including: stationary; stationary around trend; and unit root models, each containing different types and numbers of breaks and different lag lengths. The procedures are used to test for structural breaks on 14 annual macroeconomic series and 11 natural resource price series. The results indicate that there are structural breaks in all of the natural resource series and most of the macroeconomic series. Many of the series had multiple breaks. Our findings regarding the existence of unit roots, having allowed for structural breaks in the data, are largely consistent with previous work.
    Keywords: Bayesian Model Averaging; Structural Breaks; Unit Root; Macroeconomic Data; Natural Resource data
    JEL: C01 C11
    Date: 2007–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:8676&r=ecm
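A rough flavour of model-averaged break detection, using BIC weights as a crude stand-in for the paper's conjugate normal-gamma posteriors, a single mean-shift break rather than the paper's full model set, and simulated data:

```python
import numpy as np

def break_posterior(y, min_seg=3):
    """Posterior over single-break locations in a mean-shift model, using
    BIC weights with a flat model prior; the no-break model has key None."""
    y = np.asarray(y, float)
    n = len(y)

    def bic(segments, k):
        rss = sum(float(((s - s.mean()) ** 2).sum()) for s in segments)
        return n * np.log(rss / n) + k * np.log(n)

    models = {None: bic([y], 1)}
    for tau in range(min_seg, n - min_seg + 1):
        models[tau] = bic([y[:tau], y[tau:]], 2)
    b = np.array(list(models.values()))
    w = np.exp(-0.5 * (b - b.min()))   # BIC-approximated model weights
    w /= w.sum()
    return dict(zip(models.keys(), w))

rng = np.random.default_rng(1)
y = np.concatenate([np.zeros(20), np.full(20, 3.0)]) + rng.normal(0, 0.5, 40)
post = break_posterior(y)
best = max(post, key=post.get)   # most probable break point
```

Averaging over candidate break dates this way yields a probability for a break at each point in time, which is the output the abstract describes, though here from a much smaller model space.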
  7. By: Sitzia, Bruno; Iovino, Doriana
    Abstract: This paper illustrates how to specify and test a double threshold EGARCH model for some important exchange rates. The analysis is monthly and refers to the period 1990.01-2007.06. The procedure involves testing the residuals of a linear autoregressive model of the exchange rate, taken as the starting point, for threshold effects. If this preliminary testing favours the hypothesis of nonlinearity, one then specifies and estimates a threshold model using Tong's (1983, 1990) algorithm, which allows the two AR regimes to be specified separately and locates both the delay and the regime parameters using a search procedure based on the AIC. Residuals from the SETAR model are then further tested for conditional heteroskedasticity. If it is present, a double symmetric EGARCH is fitted to the data by maximum likelihood. The result is compared with an AR GARCH model both in sample and out of sample to assess whether the more complex model has any forecasting superiority. Reported results favour this outcome. In the text of the paper we report explicitly the results for the Japanese yen and the British pound exchange rates vis-à-vis the US dollar, but the same procedure has been applied to many other exchange rate series, with results favourable to the double variance model in more than 50% of the cases. We report the complete results in the appendix. We conclude that the proposed model is both feasible and widely applicable to the analysis of exchange rate volatility. We add two provisos: data are monthly, and the period of estimation reflects only the most recent experience.
    Keywords: nonlinearity; forecasting volatility; exchange rates
    JEL: C22
    Date: 2008–01–18
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:8661&r=ecm
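The Tong-style AIC grid search can be sketched for a two-regime SETAR(1); the simulated series, grid, and AIC formula below are illustrative, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# simulate a two-regime SETAR(1): intercepts differ across regimes,
# true threshold 0, delay 1
y = np.zeros(300)
for t in range(1, 300):
    if y[t - 1] <= 0.0:
        y[t] = 1.0 + 0.4 * y[t - 1] + rng.normal(0.0, 0.5)
    else:
        y[t] = -1.0 + 0.4 * y[t - 1] + rng.normal(0.0, 0.5)

def setar_aic(y, d, thresholds):
    """AIC of a two-regime SETAR(1) with delay d for each candidate
    threshold c: OLS fit per regime, AIC summed across regimes."""
    p = max(1, d)
    yt, ylag = y[p:], y[p - 1:len(y) - 1]   # y_t and its AR lag y_{t-1}
    ythr = y[p - d:len(y) - d]              # threshold variable y_{t-d}
    aics = {}
    for c in thresholds:
        total = 0.0
        for mask in (ythr <= c, ythr > c):
            n = int(mask.sum())
            if n < 10:                       # regime too thin to estimate
                total = np.inf
                break
            X = np.column_stack([np.ones(n), ylag[mask]])
            coef, *_ = np.linalg.lstsq(X, yt[mask], rcond=None)
            e = yt[mask] - X @ coef
            total += n * np.log(e @ e / n) + 2 * 2   # 2 params per regime
        aics[c] = total
    return aics

aics = setar_aic(y, d=1, thresholds=np.linspace(-1.0, 1.0, 41))
c_hat = min(aics, key=aics.get)   # AIC-selected threshold
```

In practice the search would also run over candidate delays d, exactly as the abstract describes, before the EGARCH stage is fitted to the SETAR residuals.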
  8. By: Arntz, Melanie; Lo, Simon M. S.; Wilke, Ralf A.
    Abstract: "In this paper we derive nonparametric bounds for the cumulative incidence curve within a competing risks model with partly identified interval data. As an advantage over earlier attempts our approach also gives valid results in case of dependent competing risks. We apply our framework to empirically evaluate the effect of unemployment benefits on observed migration of unemployed workers in Germany. Our findings weakly indicate that reducing the entitlement length for unemployment benefits increases migration among high-skilled individuals."
    Keywords: unemployment benefits, benefit entitlement duration, internal migration, regional mobility, migration motives, willingness to move, unemployed persons, highly qualified workers, IAB-Beschäftigtenstichprobe
    JEL: C41 C14 J61
    Date: 2007–08–13
    URL: http://d.repec.org/n?u=RePEc:iab:iabfme:200704_en&r=ecm
  9. By: David Meenagh; Patrick Minford; Michael Wickens
    Abstract: We use the method of indirect inference, using the bootstrap, to test the Smets and Wouters model of the EU against a VAR auxiliary equation describing their data; the test is based on the Wald statistic. We find that their model generates excessive variance compared with the data. If the errors are scaled down, then the original model marginally passes the Wald test. We compare a New Classical version of the model, which passes the test but generates a combination of excessive inflation variance and inadequate output variance. If the large consumption and investment errors are removed as possibly due to low frequency events, then the New Classical version passes easily while the original version is strongly rejected.
    Keywords: Bootstrap, DSGE Model, VAR model, Model of EU, indirect inference, Wald statistic.
    JEL: C12 C32
    Date: 2008–03
    URL: http://d.repec.org/n?u=RePEc:san:cdmacp:0709&r=ecm
  10. By: Costa Dias, Monica (Institute for Fiscal Studies); Ichimura, Hidehiko (University of Tokyo); van den Berg, Gerard J. (Department of Economics, University Amsterdam)
    Abstract: The matching method for treatment evaluation does not balance selective unobserved differences between treated and non-treated. We derive a simple correction term if there is an instrument that shifts the treatment probability to zero in specific cases. Policies with eligibility restrictions, where treatment is impossible if some variable exceeds a certain value, provide a natural application. In an empirical analysis, we first examine the performance of matching versus regression-discontinuity estimation in the sharp age-discontinuity design of the NDYP job search assistance program for young unemployed in the UK. Next, we exploit the age eligibility restriction in the Swedish Youth Practice subsidized work program for young unemployed, where compliance is imperfect among the young. Adjusting the matching estimator for selectivity changes the results towards ineffectiveness of subsidized work in moving individuals into employment.
    Keywords: Propensity score; policy evaluation; treatment effect; regression discontinuity; selection; job search assistance; subsidized work; youth unemployment
    JEL: C14 C25 J64
    Date: 2008–04–21
    URL: http://d.repec.org/n?u=RePEc:hhs:ifauwp:2008_006&r=ecm
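As background, a plain nearest-neighbour matching estimator of the treatment effect on the treated, without the paper's correction term for selective unobservables; the toy data are constructed so the effect is exactly 2:

```python
import numpy as np

def nn_match_att(y, d, x):
    """ATT by nearest-neighbour matching on a scalar covariate x: for each
    treated unit, take the closest control's outcome (with replacement)
    and average the differences."""
    y, d, x = (np.asarray(a, float) for a in (y, d, x))
    controls = np.where(d == 0)[0]
    diffs = [y[i] - y[controls[np.argmin(np.abs(x[controls] - x[i]))]]
             for i in np.where(d == 1)[0]]
    return float(np.mean(diffs))

# toy data: outcome equals x for controls and x + 2 for treated,
# with an exact control match available for every treated unit
x = np.array([1.0, 2.0, 3.0, 1.0, 2.0, 3.0])
d = np.array([1, 1, 1, 0, 0, 0])
y = np.where(d == 1, x + 2.0, x)
att = nn_match_att(y, d, x)
```

This balances only the observed covariate; the paper's contribution is precisely the correction needed when treated and matched controls still differ in unobservables.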
  11. By: Puhani, Patrick A. (University of Hannover)
    Abstract: I demonstrate that Ai and Norton’s (2003) point about cross differences is not relevant for the estimation of the treatment effect in nonlinear “difference-in-differences” models such as probit, logit or tobit, because the cross difference is not equal to the treatment effect, which is the parameter of interest. In a nonlinear “difference-in-differences” model, the treatment effect is the cross difference of the conditional expectation of the observed outcome minus the cross difference of the conditional expectation of the potential outcome without treatment. Unlike in the linear model, the latter cross difference is not zero in the nonlinear model. It follows that the sign of the treatment effect in a nonlinear “difference-in-differences” model with a strictly monotonic transformation function is equal to the sign of the coefficient of the interaction term of the time and treatment group indicators. The treatment effect is simply the incremental effect of the coefficient of the interaction term.
    Keywords: identification, nonlinear models, limited dependent variable, probit, logit, tobit, difference-in-differences, interaction effect
    JEL: C21 C25 H0 I0 J0
    Date: 2008–04
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp3478&r=ecm
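Puhani's distinction can be checked numerically with hypothetical probit index coefficients (the values below are invented for illustration):

```python
import math

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# hypothetical probit index coefficients: constant, treatment-group
# indicator, after-period indicator, and the time x group interaction
b0, b_g, b_t, b_gt = -0.5, 0.4, 0.2, 0.6

# Treatment effect on the treated (Puhani): the incremental effect of
# the interaction coefficient at the treated, after-period index.
effect = Phi(b0 + b_g + b_t + b_gt) - Phi(b0 + b_g + b_t)

# Ai-Norton cross difference of the observed outcome -- not the
# treatment effect in this setting.
cross_diff = (Phi(b0 + b_g + b_t + b_gt) - Phi(b0 + b_g)) \
           - (Phi(b0 + b_t) - Phi(b0))
```

Here the two quantities differ, while the sign of the effect equals the sign of the interaction coefficient b_gt, matching the abstract's claim.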
  12. By: Fanelli, Luca
    Abstract: This paper proposes the econometric evaluation of the New Keynesian Phillips Curve (NKPC) in the euro area, under a particular specification of the adaptive learning hypothesis. The key assumption is that agents’ perceived law of motion is a Vector Autoregressive (VAR) model, whose coefficients are updated by maximum likelihood estimation as the information set increases over time. Each time new data are available, likelihood ratio tests for the cross-equation restrictions that the NKPC imposes on the VAR are computed and compared with a proper set of critical values which take the sequential nature of the test into account. The analysis is developed by focusing on the case where the variables entering the NKPC can be approximated as nonstationary cointegrated processes, assuming that the agents’ recursive estimation algorithm involves only the parameters associated with the short run transient dynamics of the system. Results on quarterly data for the period 1981–2006 show that: (i) the euro area inflation rate and the wage share are cointegrated; (ii) the cointegrated version of the ‘hybrid’ NKPC is sharply rejected under the rational expectations hypothesis; (iii) the model is supported by the data over relevant fractions of the chosen monitoring period, 1986–2006, under the adaptive learning hypothesis, although this evidence does not appear compelling.
    Keywords: Adaptive learning, cointegration, cross-equation restrictions, forward-looking model of inflation dynamics, New Keynesian Phillips Curve
    JEL: C32 C52 D83 E10
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:7257&r=ecm
  13. By: P. Jeganathan (Indian Statistical Institute)
    Abstract: Too technical to post, see paper.
    Keywords: Fractional ARIMA, Sums of linear process, Nonlinear functionals, Limit theorems, Local time, Fractional Brownian and Stable motions
    JEL: C22 C23
    Date: 2008–04
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1649&r=ecm

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.