nep-ecm New Economics Papers
on Econometrics
Issue of 2011‒10‒15
twenty-one papers chosen by
Sune Karlsson
Orebro University

  1. Improved estimation in generalized threshold regression models By Friederike Greb; Tatyana Krivobokova; Axel Munk; Stephan von Cramon-Taubadel
  2. Distribution Theory for the Studentized Mean for Long, Short, and Negative Memory Time Series By McElroy, Tucker S; Politis, D N
  3. Bootstrap Confidence Sets with Weak Instruments By Russell Davidson; James G. MacKinnon
  4. Limited Information Bayesian Model Averaging for Dynamic Panels with an Application to a Trade Gravity Model By Charalambos G. Tsangarides; Alin Mirestean; Huigang Chen
  5. Estimation Issues in Single Commodity Gravity Trade Models By Prehn, Soren; Brummer, Bernhard
  6. Copula bivariate probit models: with an application to medical expenditures By Rainer Winkelmann
  7. Identification through heteroskedasticity: A likelihood-based approach By Emanuele Bacchiocchi
  8. Adapting Johansen's Estimation Method for Flexible Regime-dependent Cointegration Modelling By Ihle, Rico; Amikuzuno, Joseph; Cramon-Taubadel, Stephan von
  9. Time-series Modelling, Stationarity and Bayesian Nonparametric Methods By Juan Carlos Martínez-Ovando; Stephen G. Walker
  10. A new method for approximating vector autoregressive processes by finite-state Markov chains By Gospodinov, Nikolay; Lkhagvasuren, Damba
  11. Cointegrated VARMA models and forecasting US interest rates By Christian Kascha; Carsten Trenkler
  12. Some Computational Aspects of Gaussian CARMA Modelling By Tómasson, Helgi
  13. Monetary policy indeterminacy in the U.S.: results from a classical test By Efrem Castelnuovo; Luca Fanelli
  14. Filtering Short Term Fluctuations in Inflation Analysis By H. Cagri Akkoyun; Oguz Atuk; N. Alpay Kocak; M. Utku Ozmen
  15. Weak convergence to the t-distribution By Christian Schluter; Mark Trede
  16. Chaotic Time Series Analysis in Economics: Balance and Perspectives By Marisa Faggini
  17. Trend-cycle decomposition of output and euro area inflation forecasts: a real-time approach based on model combination By Pierre Guérin; Laurent Maurin; Matthias Mohr
  18. On threshold estimation in threshold vector error correction models By Greb, Friederike; Krivobokova, Tatyana; von Cramon-Taubadel, Stephan; Munk, Axel
  19. Do Experts incorporate Statistical Model Forecasts and should they? By Rianne Legerstee; Philip Hans Franses; Richard Paap
  20. Asymmetric Price Transmission in Food Supply Chains: Impulse Response Analysis by Local Projections By Kuiper, Erno; Bunte, Frank
  21. Data-Rich DSGE and Dynamic Factor Models By Maxym Kryshko

  1. By: Friederike Greb (Georg-August-University Göttingen); Tatyana Krivobokova (Georg-August-University Göttingen); Axel Munk (Georg-August-University Göttingen); Stephan von Cramon-Taubadel (Georg-August-University Göttingen)
    Abstract: Estimation of threshold parameters in (generalized) threshold regression models is typically performed by maximizing the corresponding profile likelihood function. In addition, certain Bayesian techniques based on non-informative priors have been developed and are widely used. This article draws attention to settings (not rare in practice) in which these standard estimators either perform poorly or even fail. In particular, if estimation of the regression coefficients is associated with high uncertainty, the profile likelihood for the threshold parameters, and thus the corresponding estimator, can be highly affected. We suggest an alternative estimation method employing the empirical Bayes paradigm, which allows us to circumvent the deficiencies of standard estimators. The new estimator is completely data-driven and requires little additional numerical effort compared with the standard one. Simulation results show that our estimator outperforms commonly used estimators and produces excellent results even when the latter perform poorly. The practical relevance of our approach is illustrated by a real-data example; we follow up on the analysis of cross-country growth behavior detailed in Hansen (2000).
    Keywords: threshold estimation; nuisance parameters; empirical Bayes
    Date: 2011–10–07
    URL: http://d.repec.org/n?u=RePEc:got:gotcrc:099&r=ecm
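    Illustration: the following is a minimal sketch of the standard profile-likelihood (concentrated least-squares) threshold estimator that the abstract takes as its starting point, run on simulated data with a single threshold; the empirical Bayes estimator proposed in the paper is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
n = 200
q = rng.uniform(0, 1, n)                      # threshold variable
x = rng.normal(size=n)                        # regressor
gamma_true = 0.5
y = np.where(q <= gamma_true, 1.0 * x, -1.0 * x) + rng.normal(scale=0.5, size=n)

def profile_ssr(gamma):
    # Concentrate out the regime-specific slopes by OLS and return the residual sum of squares.
    ssr = 0.0
    for mask in (q <= gamma, q > gamma):
        X = x[mask].reshape(-1, 1)
        beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        ssr += float(np.sum((y[mask] - X @ beta) ** 2))
    return ssr

# Grid restricted so that each regime keeps at least 10% of the observations.
grid = np.quantile(q, np.linspace(0.10, 0.90, 81))
gamma_hat = grid[int(np.argmin([profile_ssr(g) for g in grid]))]
print("profile-likelihood threshold estimate:", round(gamma_hat, 3))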
  2. By: McElroy, Tucker S; Politis, D N
    Abstract: We consider the problem of estimating the variance of the partial sums of a stationary time series that has either long memory, short memory, negative/intermediate memory, or is the first difference of such a process. The rate of growth of this variance depends crucially on the type of memory, and we present results on the behavior of tapered sums of sample autocovariances in this context when the bandwidth vanishes asymptotically. We also present asymptotic results for the case that the bandwidth is a fixed proportion of sample size, extending known results to the case of flat-top tapers. We adopt the fixed-proportion bandwidth perspective in our empirical section, presenting two methods for estimating the limiting critical values: the subsampling method and a plug-in approach. Extensive simulation studies compare the size and power of both approaches as applied to hypothesis testing for the mean. Both methods perform well, although the subsampling method appears to be better sized, and provide a viable framework for conducting inference for the mean. In summary, we supply a unified asymptotic theory that covers all different types of memory under a single umbrella.
    Keywords: kernel, lag-windows, overdifferencing, spectral estimation, subsampling, tapers, unit-root problem, Econometrics
    Date: 2011–09–01
    URL: http://d.repec.org/n?u=RePEc:cdl:ucsdec:2265346&r=ecm
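    Illustration: the following is a minimal sketch of studentizing the sample mean with a tapered sum of sample autocovariances, using a generic Bartlett taper and a simulated short-memory AR(1) series; the flat-top tapers, fixed-bandwidth asymptotics and subsampling critical values studied in the paper are not implemented here.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):                          # hypothetical short-memory AR(1) series
    x[t] = 0.5 * x[t - 1] + rng.normal()

def tapered_lrv(series, bandwidth):
    # Long-run variance estimate: Bartlett-tapered sum of sample autocovariances.
    xc = series - series.mean()
    m = len(xc)
    s = np.dot(xc, xc) / m                     # lag-0 autocovariance
    for k in range(1, bandwidth + 1):
        gamma_k = np.dot(xc[k:], xc[:-k]) / m
        s += 2.0 * (1.0 - k / (bandwidth + 1)) * gamma_k
    return s

b = int(round(n ** (1 / 3)))                   # a bandwidth that vanishes relative to n
t_stat = np.sqrt(n) * x.mean() / np.sqrt(tapered_lrv(x, b))
print("studentized mean:", round(t_stat, 3))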
  3. By: Russell Davidson (McGill University); James G. MacKinnon (Queen's University)
    Abstract: We study several methods of constructing confidence sets for the coefficient of the single right-hand-side endogenous variable in a linear equation with weak instruments. Two of these are based on conditional likelihood ratio (CLR) tests, and the others are based on inverting t statistics or the bootstrap P values associated with them. We propose a new method for constructing bootstrap confidence sets based on t statistics. In large samples, the procedures that generally work best are CLR confidence sets using asymptotic critical values and bootstrap confidence sets based on LIML estimates.
    Keywords: weak instruments, bootstrap, confidence sets, CLR test, LIML
    JEL: C10 C15
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1278&r=ecm
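    Illustration: the following is a minimal sketch of inverting bootstrap t statistics to obtain a confidence set, shown for an ordinary OLS slope on simulated data; the weak-instrument setting, the CLR tests and the LIML-based bootstrap examined in the paper are considerably more involved and are not reproduced here.
import numpy as np

rng = np.random.default_rng(2)
n, B = 100, 999
x = rng.normal(size=n)
y = 1.0 + 0.3 * x + rng.normal(size=n)

def slope_and_se(x, y):
    # OLS slope and its conventional standard error.
    xc, yc = x - x.mean(), y - y.mean()
    b = np.dot(xc, yc) / np.dot(xc, xc)
    resid = yc - b * xc
    se = np.sqrt(np.sum(resid ** 2) / (len(x) - 2) / np.dot(xc, xc))
    return b, se

b_hat, se_hat = slope_and_se(x, y)
t_boot = np.empty(B)
for j in range(B):
    idx = rng.integers(0, n, n)                # nonparametric (pairs) bootstrap resample
    b_star, se_star = slope_and_se(x[idx], y[idx])
    t_boot[j] = (b_star - b_hat) / se_star
lo, hi = np.quantile(t_boot, [0.025, 0.975])
print("95% bootstrap-t interval:", round(b_hat - hi * se_hat, 3), round(b_hat - lo * se_hat, 3))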
  4. By: Charalambos G. Tsangarides; Alin Mirestean; Huigang Chen
    Abstract: This paper extends the Bayesian Model Averaging framework to panel data models where the lagged dependent variable as well as endogenous variables appear as regressors. We propose a Limited Information Bayesian Model Averaging (LIBMA) methodology and then test it using simulated data. Simulation results suggest that, asymptotically, our methodology performs well in both Bayesian model averaging and model selection. In particular, LIBMA recovers the data generating process well, with high posterior inclusion probabilities for all the relevant regressors, and parameter estimates very close to their true values. These findings suggest that our methodology is well suited for inference in short dynamic panel data models with endogenous regressors in the context of model uncertainty. We illustrate the use of LIBMA in an application to the estimation of a dynamic gravity model for bilateral trade.
    Keywords: Bilateral trade, Economic models
    Date: 2011–10–03
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:11/230&r=ecm
  5. By: Prehn, Soren; Brummer, Bernhard
    Abstract: Gravity trade models have recently been applied to disaggregated trade data, in which many zeros are characteristic. In the presence of excess zeros the usual Poisson Pseudo Maximum Likelihood (PPML) estimator is still consistent, but the variance-covariance matrix is invalid; correct economic interpretation, however, also requires the latter, so alternative estimators are sought. Staub & Winkelmann [2010] argue that zero-inflated count data models (i.e. zero-inflated Poisson / Negative Binomial Pseudo Maximum Likelihood, ZIPPML / ZINBPML) are not an alternative, since under model misspecification these estimators are inconsistent. Yet zero-inflated Poisson Quasi-Likelihood (PQL) is a reliable alternative: it is consistent even under model misspecification and, beyond that, robust against unobserved heterogeneity. Another alternative is a log-skew-normal Two-Part Model (G2PM), which generalizes the standard log-normal Two-Part Model (2PM). It is advantageous in that it adjusts for (negative) skewness while regression coefficients retain their usual interpretation as in log-normal models. PQL is useful for multiplicative gravity model estimation and G2PM for log-linear gravity model estimation. As an example, the estimators are applied to intra-European piglet trade to assess their empirical performance and applicability for single-commodity trade flow analysis. The empirical part favours PQL, but G2PM is a reliable alternative for other trade flow analyses. PQL and G2PM should become standard tools for single-commodity trade flow analysis.
    Keywords: Gravity Model, Homogeneous Firm Trade Model, Excess Zeros, Overdispersion, Negatively Skewed Distribution, Poisson Quasi Likelihood, Two Part Model, International Relations/Trade,
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:ags:eaae11:114776&r=ecm
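    Illustration: the following is a minimal sketch of Poisson pseudo-maximum-likelihood (PPML) with a heteroskedasticity-robust (sandwich) covariance matrix, fitted by Newton steps on simulated gravity-style data with many zeros; the PQL and G2PM estimators advocated in the paper are not implemented here.
import numpy as np

rng = np.random.default_rng(3)
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n)])    # constant plus one regressor (e.g. log distance)
beta_true = np.array([0.5, -1.0])
y = rng.poisson(np.exp(X @ beta_true))                   # count-like trade flows with many zeros

beta = np.zeros(2)
for _ in range(50):                                      # Newton-Raphson for the Poisson quasi-likelihood
    mu = np.exp(X @ beta)
    score = X.T @ (y - mu)
    hess = X.T @ (X * mu[:, None])
    step = np.linalg.solve(hess, score)
    beta += step
    if np.max(np.abs(step)) < 1e-10:
        break

# Sandwich covariance: valid even if the Poisson variance assumption fails,
# which is the usual justification for interpreting PPML output.
mu = np.exp(X @ beta)
bread = np.linalg.inv(X.T @ (X * mu[:, None]))
meat = X.T @ (X * ((y - mu) ** 2)[:, None])
cov = bread @ meat @ bread
print("PPML estimates:", beta.round(3), "robust s.e.:", np.sqrt(np.diag(cov)).round(3))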
  6. By: Rainer Winkelmann
    Abstract: The bivariate probit model is frequently used for estimating the effect of an endogenous binary regressor (the "treatment") on a binary health outcome variable. This paper discusses simple modifications that maintain the probit assumption for the marginal distributions while introducing non-normal dependence using copulas. In an application of the copula bivariate probit model to the effect of insurance status on the absence of ambulatory health care expenditure, a model based on the Frank copula outperforms the standard bivariate probit model.
    Keywords: Bivariate probit, binary endogenous regressor, Frank copula, Clayton copula
    JEL: C35 I12
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:zur:econwp:029&r=ecm
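    Illustration: the following is a minimal sketch of the joint probability implied by a bivariate probit whose dependence is generated by a Frank copula, P(y1=1, y2=1) = C_theta(Phi(x'b1), Phi(x'b2)); the coefficient values and the copula parameter below are made up, and estimation of the full model is not shown.
import numpy as np
from scipy.stats import norm

def frank_copula(u, v, theta):
    # Frank copula C_theta(u, v); theta -> 0 corresponds to independence.
    num = np.expm1(-theta * u) * np.expm1(-theta * v)
    return -np.log1p(num / np.expm1(-theta)) / theta

x = np.array([1.0, 0.5])             # hypothetical covariate vector (constant plus one regressor)
b1 = np.array([0.2, 0.8])            # treatment-equation coefficients (made up)
b2 = np.array([-0.1, 0.4])           # outcome-equation coefficients (made up)
u, v = norm.cdf(x @ b1), norm.cdf(x @ b2)
print("P(y1=1, y2=1) under the Frank copula:", round(frank_copula(u, v, theta=3.0), 4))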
  7. By: Emanuele Bacchiocchi (University of Milan)
    Abstract: In this paper we show how the analysis of identification of simultaneous systems of equations with different volatility regimes can be addressed in a conventional likelihood-based setup, generalizing previous works in different directions. We discuss general conditions for identification and one of the results shows that an adequate number of different levels of heteroskedasticity is sufficient to identify the parameters of the structural form without the inclusion of any kind of restriction. A Full Information Maximum Likelihood (FIML) algorithm is discussed and the small sample performances of estimators and tests on the parameters are studied through Monte Carlo simulations. Finally, this methodology is used to investigate the relationships between sovereign bond yields for some highly indebted EU countries.
    Keywords: simultaneous equations model, heteroskedasticity, identification, FIML, contagion,
    Date: 2011–06–13
    URL: http://d.repec.org/n?u=RePEc:bep:unimip:unimi-1111&r=ecm
  8. By: Ihle, Rico; Amikuzuno, Joseph; Cramon-Taubadel, Stephan von
    Keywords: Research Methods/ Statistical Methods,
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:ags:eaae11:114461&r=ecm
  9. By: Juan Carlos Martínez-Ovando; Stephen G. Walker
    Abstract: In this paper we introduce two general non-parametric first-order stationary time-series models for which the marginal (invariant) and transition distributions are expressed as infinite-dimensional mixtures. That feature makes them the first fully non-parametric Bayesian stationary models developed so far. We draw on the discussion of using stationary models in practice as a motivation, and advocate the view that flexible (non-parametric) stationary models may be a source of reliable inferences and predictions. Our models fit naturally within the Bayesian inference framework thanks to a suitable representation theorem. A stationary scale-mixture model is developed as a particular case, along with a computational strategy for posterior inference and prediction. The usefulness of that model is illustrated with the analysis of Euro/USD exchange rate log-returns.
    Keywords: Stationarity, Markov processes, Dynamic mixture models, Random probability measures, Conditional random probability measures, Latent processes.
    JEL: C11 C14 C15 C22 C51
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:bdm:wpaper:2011-08&r=ecm
  10. By: Gospodinov, Nikolay; Lkhagvasuren, Damba
    Abstract: This paper proposes a new method for approximating vector autoregressions by a finite-state Markov chain. The method is more robust to the number of discrete values and tends to outperform the existing methods over a wide range of the parameter space, especially for highly persistent vector autoregressions with roots near the unit circle.
    Keywords: Markov Chain; Vector Autoregressive Processes; Functional Equation; Numerical Methods; Moment Matching; Numerical Integration
    JEL: C10 C15 C60
    Date: 2011–06–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:33827&r=ecm
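    Illustration: the following is a minimal sketch of the classic Tauchen-type discretization of a scalar AR(1) onto a finite-state Markov chain, included only to show what such an approximation involves; the method proposed in the paper targets vector autoregressions and is designed to behave better near the unit circle, and is not reproduced here.
import numpy as np
from scipy.stats import norm

def tauchen(rho, sigma, n_states=9, m=3.0):
    # Grid spans +/- m unconditional standard deviations of x' = rho*x + eps, eps ~ N(0, sigma^2).
    std_x = sigma / np.sqrt(1.0 - rho ** 2)
    grid = np.linspace(-m * std_x, m * std_x, n_states)
    step = grid[1] - grid[0]
    P = np.zeros((n_states, n_states))
    for i in range(n_states):
        mean_next = rho * grid[i]
        upper = norm.cdf((grid + step / 2 - mean_next) / sigma)
        lower = norm.cdf((grid - step / 2 - mean_next) / sigma)
        P[i] = upper - lower
        P[i, 0] = upper[0]               # lowest state absorbs the left tail
        P[i, -1] = 1.0 - lower[-1]       # highest state absorbs the right tail
    return grid, P

grid, P = tauchen(rho=0.95, sigma=0.1)
print("row sums (should all equal 1):", P.sum(axis=1).round(6))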
  11. By: Christian Kascha; Carsten Trenkler
    Abstract: We bring together some recent advances in the literature on vector autoregressive moving-average models, creating a relatively simple specification and estimation strategy for the cointegrated case. We show that in the cointegrated case with fixed initial values there exists a so-called final moving representation which is usually simpler but less parsimonious than the usual Echelon form. Furthermore, we prove that our specification strategy is consistent also in the case of cointegrated series. In order to show the potential usefulness of the method, we apply it to US interest rates and find that it generates forecasts superior to those of methods which do not allow for moving-average terms.
    Keywords: Cointegration, VARMA models, forecasting
    JEL: C32 C53 E43 E47
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:zur:econwp:033&r=ecm
  12. By: Tómasson, Helgi (Faculty of Economics, University of Iceland, Reykjavik, Iceland)
    Abstract: The representation of continuous-time ARMA (CARMA) models is reviewed. Computational aspects of simulating CARMA processes and of calculating their likelihood function are summarized. Some numerical properties are illustrated by simulations, and some real-data applications are shown.
    Keywords: CARMA, maximum-likelihood, spectrum, Kalman filter, computation
    JEL: C01 C10 C22 C53 C63
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:ihs:ihsesp:274&r=ecm
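    Illustration: the following is a minimal sketch of the exact Gaussian likelihood of a first-order CARMA process (an Ornstein-Uhlenbeck, or CAR(1), model) observed at irregular times, exploiting its exact discretization on simulated data; general CARMA(p,q) likelihood evaluation via the state-space/Kalman-filter representation discussed in the paper is not implemented here.
import numpy as np

def car1_loglik(times, y, kappa, sigma):
    # Exact Gaussian log-likelihood of dX = -kappa*X dt + sigma dW observed at the given times.
    var_stat = sigma ** 2 / (2.0 * kappa)            # stationary variance
    ll = -0.5 * (np.log(2 * np.pi * var_stat) + y[0] ** 2 / var_stat)
    for t in range(1, len(y)):
        dt = times[t] - times[t - 1]
        phi = np.exp(-kappa * dt)                    # exact AR(1) coefficient over the gap dt
        v = var_stat * (1.0 - phi ** 2)              # conditional variance over the gap dt
        e = y[t] - phi * y[t - 1]
        ll += -0.5 * (np.log(2 * np.pi * v) + e ** 2 / v)
    return ll

# Simulate the process at irregularly spaced observation times (made-up parameter values).
rng = np.random.default_rng(4)
kappa, sigma = 0.8, 0.5
times = np.cumsum(rng.exponential(1.0, size=300))
y = np.zeros(300)
y[0] = rng.normal(scale=np.sqrt(sigma ** 2 / (2 * kappa)))
for t in range(1, 300):
    phi = np.exp(-kappa * (times[t] - times[t - 1]))
    y[t] = phi * y[t - 1] + rng.normal(scale=np.sqrt(sigma ** 2 / (2 * kappa) * (1 - phi ** 2)))
print("log-likelihood at the true parameters:", round(car1_loglik(times, y, kappa, sigma), 2))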
  13. By: Efrem Castelnuovo (Università di Padova); Luca Fanelli (Università di Bologna)
    Abstract: We work with a newly developed method to empirically assess whether a specified new-Keynesian business cycle monetary model estimated with U.S. quarterly data is consistent with a unique equilibrium or multiple equilibria under rational expectations. We conduct classical tests to verify whether the structural model is correctly specified. Conditional on a positive answer, we formally assess whether such a model is consistent with either a unique equilibrium or indeterminacy. Importantly, our full-system approach requires neither the use of prior distributions nor that of nonstandard inference. The case of an indeterminate equilibrium in the pre-1984 sample and of a determinate equilibrium in the post-1984 sample is favored by the data. The long-run coefficients on inflation and the output gap in the monetary policy rule are found to be weakly identified. However, our results are further supported by a proposed identification-robust indicator of indeterminacy.
    Keywords: GMM, Indeterminacy, Maximum Likelihood, Misspecification, new-Keynesian business cycle model, VAR, Weak identification.
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:bot:quadip:112&r=ecm
  14. By: H. Cagri Akkoyun; Oguz Atuk; N. Alpay Kocak; M. Utku Ozmen
    Abstract: Many economic time series, inflation in particular, are inherently subject to seasonal fluctuations which obscure the real changes in the series. In this respect, seasonal adjustment is a powerful tool for removing such fluctuations. On the other hand, seasonal adjustment may still leave highly volatile series, making it difficult to interpret their movements. The reason is that seasonal adjustment deals with certain types of movements that are completed at specific seasonal frequencies, whereas other short-term fluctuations may occur at non-seasonal frequencies. Starting from this observation and in the context of inflation, this study proposes an improved methodology aiming to deal with all short-term fluctuations that are completed within a year. The two-step approach combines wavelet filters and band pass filters. This method yields much smoother time series than seasonal adjustment does. Moreover, the filtered series capture the dynamics of inflation in sub-groups well. Hence, this two-step procedure provides a useful tool for improved short-term inflation analysis.
    Keywords: Consumer prices, inflation, seasonal adjustment, wavelet filter, band pass filter
    JEL: E31
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:tcb:wpaper:1120&r=ecm
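    Illustration: the following is a minimal sketch of the general idea of removing all fluctuations completed within a year, using a crude FFT-based cut-off on a simulated monthly series; the paper's two-step wavelet-plus-band-pass procedure is more refined and is not reproduced here.
import numpy as np

rng = np.random.default_rng(5)
n = 240                                              # 20 years of monthly observations
t = np.arange(n)
x = 0.02 * t + 1.5 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=0.5, size=n)

freqs = np.fft.rfftfreq(n, d=1.0)                    # frequencies in cycles per month
xf = np.fft.rfft(x)
xf[freqs >= 1.0 / 12.0] = 0.0                        # drop components with periods of 12 months or less
x_smooth = np.fft.irfft(xf, n)
print("std of raw vs. filtered series:", round(x.std(), 2), round(x_smooth.std(), 2))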
  15. By: Christian Schluter; Mark Trede
    Abstract: We present a new limit theorem for random means: if the sample size is not deterministic but has a negative binomial or geometric distribution, the limit distribution of the normalised random mean is a t-distribution with degrees of freedom depending on the shape parameter of the negative binomial distribution. Thus the limit distribution exhibits heavy tails, whereas limit laws for random sums do not achieve this unless the summands have infinite variance. The limit law may help explain several empirical regularities. We consider two such examples: first, a simple model is used to explain why city size growth rates are approximately t-distributed. Second, a random averaging argument can account for the heavy tails of high-frequency returns. Our empirical investigations demonstrate that these predictions are borne out by the data.
    Keywords: convergence, t-distribution, limit theorem
    JEL: A
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:cqe:wpaper:2111&r=ecm
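    Illustration: the following is a minimal Monte Carlo sketch of the phenomenon described in the abstract, with standard-normal summands and a negative-binomial sample size; under the particular scaling used here (normalising by the expected rather than the realised sample size) the approximating t-distribution has 2r degrees of freedom, which should be read as an assumption of this sketch rather than as the paper's exact statement.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
r, p = 3, 0.01                                   # negative-binomial shape and success probability
mean_n = r * (1 - p) / p                         # expected sample size
reps = 5000
z = np.empty(reps)
for i in range(reps):
    n = rng.negative_binomial(r, p) + 1          # random sample size, kept at least 1
    x = rng.normal(size=n)                       # iid summands with unit variance
    z[i] = np.sqrt(mean_n) * x.mean()            # mean normalised by the expected sample size
print("KS distance to t with 2r df:", round(stats.kstest(z, "t", args=(2 * r,)).statistic, 3))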
  16. By: Marisa Faggini (Department of Economics and Statistics, University of Salerno)
    Abstract: To show that a mathematical model exhibits chaotic behaviour does not prove that chaos is also present in the corresponding data. To convincingly show that a system behaves chaotically, chaos has to be identified directly from the data. From an empirical point of view, it is difficult to distinguish between fluctuations provoked by random shocks and endogenous fluctuations determined by the nonlinear nature of the relation between economic aggregates. For this purpose, chaos tests have been developed to investigate the basic features of chaotic phenomena: nonlinearity, fractal attractor, and sensitivity to initial conditions. The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in the data. More specifically, our attention will be devoted to reviewing the results reached by the application of these techniques to economic and financial time series and to understanding why chaos theory, after a period of growing interest, now appears not to be such an interesting and promising research area.
    Keywords: Economic dynamics, nonlinearity, tests for chaos, chaos
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:tur:wpaper:25&r=ecm
  17. By: Pierre Guérin (International Economic Analysis Department, Bank of Canada, 234 Wellington Street, Ottawa, Canada, K1A 0G9 and European University Institute); Laurent Maurin (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt, Germany.); Matthias Mohr (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt, Germany.)
    Abstract: The paper focuses on the estimation of the euro area output gap. We construct model-averaged measures of the output gap in order to cope with both model uncertainty and parameter instability that are inherent to trend-cycle decomposition models of GDP. We first estimate nine models of trend-cycle decomposition of euro area GDP, both univariate and multivariate, some of them allowing for changes in the slope of trend GDP and/or its error variance using Markov-switching specifications, or including a Phillips curve. We then pool the estimates using three weighting schemes. We compute both ex-post and real-time estimates to check the stability of the estimates to GDP revisions. We finally run a forecasting experiment to evaluate the predictive power of the output gap for inflation in the euro area. We find evidence of changes in trend growth around the recessions. We also find support for model averaging techniques in order to improve the reliability of the potential output estimates in real time. Our measures help forecasting inflation over most of our evaluation sample (2001-2010) but fail dramatically over the last recession.
    Keywords: Trend-cycle decomposition, Phillips curve, unobserved components model, Kalman Filter, Markov-switching, auxiliary information, model averaging, inflation forecast, real-time analysis.
    JEL: C53 E32 E37
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20111384&r=ecm
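    Illustration: the following is a minimal sketch of the basic ingredients, namely a trend-cycle decomposition of simulated log GDP via the HP filter and an equal-weight pooling of two such decompositions; the nine state-space, Markov-switching and Phillips-curve models estimated in the paper, and its data-driven weighting schemes, are not reproduced here.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(8)
T = 120
log_gdp = np.cumsum(0.005 + rng.normal(scale=0.01, size=T))    # simulated quarterly log GDP

cycle_a, trend_a = hpfilter(log_gdp, lamb=1600)                # standard quarterly smoothing parameter
cycle_b, trend_b = hpfilter(log_gdp, lamb=10 * 1600)           # a smoother trend as a second "model"
output_gap = 0.5 * cycle_a + 0.5 * cycle_b                     # naive equal-weight combination
print("latest output-gap estimate (% of GDP):", round(100 * output_gap[-1], 2))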
  18. By: Greb, Friederike; Krivobokova, Tatyana; von Cramon-Taubadel, Stephan; Munk, Axel
    Keywords: Resource /Energy Economics and Policy,
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:ags:eaae11:114599&r=ecm
  19. By: Rianne Legerstee (Erasmus University Rotterdam); Philip Hans Franses (Erasmus University Rotterdam); Richard Paap (Erasmus University Rotterdam)
    Abstract: Experts can rely on statistical model forecasts when creating their own forecasts. Usually it is not known what experts actually do. In this paper we focus on three questions, which we try to answer given the availability of expert forecasts and model forecasts. First, is the expert forecast related to the model forecast and how? Second, how is this potential relation influenced by other factors? Third, how does this relation influence forecast accuracy? We propose a new and innovative two-level Hierarchical Bayes model to answer these questions. We apply our proposed methodology to a large data set of forecasts and realizations of SKU-level sales data from a pharmaceutical company. We find that expert forecasts can depend on model forecasts in a variety of ways. Average sales levels, sales volatility, and the forecast horizon influence this dependence. We also demonstrate that theoretical implications of expert behavior on forecast accuracy are reflected in the empirical data.
    Keywords: model forecasts; expert forecasts; forecast adjustment; Bayesian analysis; endogeneity
    JEL: C11 C53
    Date: 2011–10–04
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20110141&r=ecm
  20. By: Kuiper, Erno; Bunte, Frank
    Abstract: In this paper we set out Jordà's (2005) method of local projections, by which nonlinear impulse responses can be computed without the need to specify and estimate the underlying nonlinear dynamic system. The method is used to compute price reaction functions that show how the prices of the different stages in the supply chain dynamically respond to one another and whether or not these responses reveal any asymmetric patterns. Empirical applications for the US pork-meat and broiler composite chains illustrate the convenience of the method.
    Keywords: Agribusiness,
    Date: 2011–09–02
    URL: http://d.repec.org/n?u=RePEc:ags:eaae11:114679&r=ecm
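    Illustration: the following is a minimal sketch of impulse responses by local projections on a simulated AR(1) driven by an observed iid shock: for each horizon h, y_{t+h} is regressed on the period-t shock and the slope traces out the response; the asymmetric price-transmission application in the paper involves additional controls and nonlinear terms that are not shown here.
import numpy as np

rng = np.random.default_rng(7)
T = 500
shock = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + shock[t]                 # true response at horizon h is 0.6**h

def local_projection_irf(y, shock, horizons=8):
    irf = []
    for h in range(horizons + 1):
        yh = y[h:]                                   # left-hand side: y_{t+h}
        Z = np.column_stack([np.ones(len(yh)), shock[:len(yh)]])
        coef, *_ = np.linalg.lstsq(Z, yh, rcond=None)
        irf.append(coef[1])                          # slope on the period-t shock
    return np.array(irf)

print("local-projection IRF:", local_projection_irf(y, shock).round(2))
print("true IRF:            ", np.round(0.6 ** np.arange(9), 2))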
  21. By: Maxym Kryshko
    Abstract: Dynamic factor models and dynamic stochastic general equilibrium (DSGE) models are widely used for empirical research in macroeconomics. The empirical factor literature argues that the co-movement of large panels of macroeconomic and financial data can be captured by relatively few common unobserved factors. Similarly, the dynamics in DSGE models are often governed by a handful of state variables and exogenous processes such as preference and/or technology shocks. Boivin and Giannoni (2006) combine a DSGE and a factor model into a data-rich DSGE model, in which DSGE states are factors and factor dynamics are subject to DSGE model implied restrictions. We compare a data-rich DSGE model with a standard New Keynesian core to an empirical dynamic factor model by estimating both on a rich panel of U.S. macroeconomic and financial data compiled by Stock and Watson (2008). We find that the spaces spanned by the empirical factors and by the data-rich DSGE model states are very close. This proximity allows us to propagate monetary policy and technology innovations in an otherwise non-structural dynamic factor model to obtain predictions for many more series than just a handful of traditional macro variables, including measures of real activity, price indices, labor market indicators, interest rate spreads, money and credit stocks, and exchange rates.
    Keywords: Economic models, Monetary policy
    Date: 2011–09–16
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:11/216&r=ecm
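    Illustration: the following is a minimal sketch of extracting principal-components factors from a simulated macro panel, the standard first step of the empirical dynamic factor model referred to in the abstract; the data-rich DSGE model and its cross-equation restrictions are well beyond this snippet and are not reproduced here.
import numpy as np

rng = np.random.default_rng(9)
T, N, k = 200, 60, 3                                  # periods, series, true number of factors
F = rng.normal(size=(T, k))                           # latent common factors
Lam = rng.normal(size=(N, k))                         # factor loadings
X = F @ Lam.T + rng.normal(scale=0.5, size=(T, N))    # observed panel

Z = (X - X.mean(axis=0)) / X.std(axis=0)              # standardise each series
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
factors_hat = U[:, :k] * np.sqrt(T)                   # principal-components factor estimates
var_share = (s[:k] ** 2).sum() / (s ** 2).sum()
print("share of panel variance explained by the first 3 PCs:", round(var_share, 3))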

This nep-ecm issue is ©2011 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.