nep-ecm New Economics Papers
on Econometrics
Issue of 2006‒05‒27
twenty-one papers chosen by
Sune Karlsson
Orebro University

  1. Local asymptotic normality and efficient estimation for INAR(p) models By Drost,Feike C.; Akker,Ramon van den; Werker,Bas J.M.
  2. Calculation of Multivariate Normal Probabilities by Simulation, with Applications to Maximum Simulated Likelihood Estimation By Lorenzo Cappellari; Stephen P. Jenkins
  3. Efficient Bayesian Inference for Multiple Change-Point and Mixture Innovation Models By Giordani, Paolo; Kohn, Robert
  4. Local Linear Multivariate Regression with Variable Bandwidth in the Presence of Heteroscedasticity By Azhong Ye; Rob J Hyndman; Zinai Li
  5. Duration Dependent Markov-Switching Vector Autoregression: Properties, Bayesian Inference, Software and Application By Matteo Pelagatti
  6. An asymptotic analysis of nearly unstable INAR(1) models By Drost,Feike C.; Akker,Ramon van den; Werker,Bas J.M.
  7. Maximum Likelihood Estimation of Endogenous Switching And Sample Selection Models for Binary, Count, And Ordinal Variables By Alfonso Miranda; Sophia Rabe-Hesketh
  8. Dynamic Conditional Correlation with Elliptical Distributions By Matteo Pelagatti; Stefania Rondena
  9. Forecasting US bond yields at weekly frequency By Riccardo LUCCHETTI; Giulio PALOMBA
  10. An Anisotropic Model For Spatial Processes By Minfeng Deng
  11. IMPLEMENTING PLS FOR DISTANCE-BASED REGRESSION: COMPUTATIONAL ISSUES By Eva Boj; Aurea Grane; Josep Fortiana; M. Merce Claramunt
  12. Convergences of prices and rates of inflation By Fabio Busetti; Silvia Fabiani; Andrew Harvey
  13. Do macro variables, asset markets, or surveys forecast inflation better? By Andrew Ang; Geert Bekaert; Min Wei
  14. Timing transitions between determinate and indeterminate equilibria in an empirical DSGE model: benefits and implications By Anatoliy Belaygorod; Michael J. Dueker
  15. Law and Statistical Disorder: Statistical Hypothesis Test Procedures And the Criminal Trial Analogy By Tung Liu; Courtenay Cliff Stone
  16. The Data Quality Concept of Accuracy in the Context of Public Use Data Sets By Carsten Kuchler; Martin Spieß
  17. A Further Examination of the Expectations Hypothesis for the Term Structure By E Bataa; D R Osborn; D H Kim
  18. The Stability of Electricity Prices: Estimation and Inference of the Lyapunov Exponent By Mikael Bask; Tung Liu; Anna Widerberg
  19. Une synthèse des tests de cointégration sur données de panel By Christophe Hurlin; Valérie Mignon
  20. Design of web questionnaires : the effect of layout in rating scales By Toepoel,Vera; Das,Marcel; Soest,Arthur van
  21. Method to Find the VARs Easily By Angela Birk

  1. By: Drost,Feike C.; Akker,Ramon van den; Werker,Bas J.M. (Tilburg University, Center for Economic Research)
    Abstract: Integer-valued autoregressive (INAR) processes have been introduced to model nonnegative integer-valued phenomena that evolve in time. The distribution of an INAR(p) process is determined by two parameters: a vector of survival probabilities and a probability distribution on the nonnegative integers, called an immigration or innovation distribution. This paper provides an efficient estimator of the parameters, and in particular, shows that the INAR(p) model has the Local Asymptotic Normality property.
    Keywords: count data;integer-valued time series;information loss structure
    JEL: C12 C13 C19
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200645&r=ecm
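    Illustration: a minimal simulation sketch (not from the paper) of the INAR(p) data generating process described above. Binomial thinning with survival probabilities alphas is standard; the Poisson immigration distribution and all parameter values are assumptions made only for this sketch.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_inar(alphas, n, lam=1.0, burn=200):
          # X_t = a_1 o X_{t-1} + ... + a_p o X_{t-p} + eps_t, where 'o' is binomial
          # thinning (each of the X_{t-j} counts survives with probability a_j) and
          # eps_t ~ Poisson(lam) is the immigration term assumed for this sketch.
          p = len(alphas)
          x = np.zeros(n + burn + p, dtype=int)
          for t in range(p, len(x)):
              survivors = sum(rng.binomial(x[t - j - 1], alphas[j]) for j in range(p))
              x[t] = survivors + rng.poisson(lam)
          return x[-n:]

      path = simulate_inar(alphas=[0.4, 0.2], n=500, lam=1.5)
      print(path[:20], path.mean())   # stationary mean is near lam / (1 - sum(alphas))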
  2. By: Lorenzo Cappellari; Stephen P. Jenkins
    Abstract: We discuss methods for calculating multivariate normal probabilities by simulation and two new Stata programs for this purpose: mvdraws for deriving draws from the standard uniform density using either Halton or pseudo-random sequences, and an egen function mvnp() for calculating the probabilities themselves. Several illustrations show how the programs may be used for maximum simulated likelihood estimation.
    Keywords: Simulation estimation, maximum simulated likelihood, multivariate probit, Halton sequences, pseudo-random sequences, multivariate normal, GHK simulator
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp584&r=ecm
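    Illustration: multivariate normal probabilities of this kind are typically simulated with the GHK recursion. The sketch below is an illustrative Python implementation of that recursion with pseudo-random uniforms (Halton draws are a common alternative); it is not the Stata code described in the paper, and the covariance matrix is an arbitrary example.

      import numpy as np
      from scipy.stats import norm

      def ghk_probability(a, sigma, n_draws=10000, seed=0):
          # Estimates P(Y_1 < a_1, ..., Y_k < a_k) for Y ~ N(0, sigma) by drawing
          # sequentially from truncated normals along the Cholesky factor of sigma.
          rng = np.random.default_rng(seed)
          L = np.linalg.cholesky(sigma)           # lower-triangular factor, Y = L @ eta
          k = len(a)
          u = rng.uniform(size=(n_draws, k))      # pseudo-random uniforms
          w = np.ones(n_draws)                    # simulated probability weights
          e = np.zeros((n_draws, k))              # truncated standard-normal draws
          for j in range(k):
              upper = (a[j] - e[:, :j] @ L[j, :j]) / L[j, j]
              p_j = norm.cdf(upper)               # prob. component j stays below its bound
              w *= p_j
              e[:, j] = norm.ppf(u[:, j] * p_j)   # inverse-CDF draw from truncated normal
          return w.mean()

      sigma = np.array([[1.0, 0.5, 0.3],
                        [0.5, 1.0, 0.4],
                        [0.3, 0.4, 1.0]])
      print(ghk_probability(np.zeros(3), sigma))  # trivariate orthant probability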
  3. By: Giordani, Paolo (Research Department, Central Bank of Sweden); Kohn, Robert (School of Economics, School of Banking and Finance)
    Abstract: Time series subject to parameter shifts of random magnitude and timing are commonly modeled with a change-point approach using Chib's (1998) algorithm to draw the break dates. We outline some advantages of an alternative approach in which breaks come through mixture distributions in state innovations, and for which the sampler of Gerlach, Carter and Kohn (2000) allows reliable and efficient inference. We show how this approach can be used to (i) model shifts in variance that occur independently of shifts in other parameters and (ii) draw the break dates efficiently in change-point and regime-switching models with either Markov or non-Markov transition probabilities. We extend the proofs given in Carter and Kohn (1994) and in Gerlach, Carter and Kohn (2000) to state-space models with system matrices that are functions of lags of the dependent variables, and we further improve the algorithms of Gerlach, Carter and Kohn by introducing to the time series literature the concept of adaptive Metropolis-Hastings sampling for discrete latent variable models. We develop an easily implemented adaptive algorithm that promises to sizably reduce computing time in a variety of problems, including mixture innovation, change-point, regime-switching, and outlier detection.
    Keywords: Structural breaks; Parameter instability; Change-point; State-space; Mixtures; Discrete latent variables; Adaptive Metropolis-Hastings
    JEL: C11 C15 C22
    Date: 2006–05–01
    URL: http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0196&r=ecm
  4. By: Azhong Ye; Rob J Hyndman; Zinai Li
    Abstract: We present a local linear estimator with variable bandwidth for multivariate nonparametric regression. We prove its consistency and asymptotic normality in the interior of the observed data and obtain its rates of convergence. This result is used to obtain practical direct plug-in bandwidth selectors for heteroscedastic regression in one and two dimensions. We show that the local linear estimator with variable bandwidth has better goodness-of-fit properties than the local linear estimator with constant bandwidth, in the presence of heteroscedasticity.
    Keywords: Heteroscedasticity; kernel smoothing; local linear regression; plug-in bandwidth; variable bandwidth.
    JEL: C12 C15 C52
    Date: 2006–05
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2006-8&r=ecm
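    Illustration: a one-dimensional sketch of a local linear fit in which the bandwidth varies across observations. The paper treats the multivariate case and derives plug-in rules for choosing such bandwidths; the bandwidth schedule h_var below is an arbitrary assumption made for illustration, not the paper's selector.

      import numpy as np

      def local_linear(x, y, x0, h):
          # Local linear fit at x0; h may be a scalar or per-observation bandwidths.
          x, y = np.asarray(x, float), np.asarray(y, float)
          h = np.broadcast_to(h, x.shape)
          w = np.exp(-0.5 * ((x - x0) / h) ** 2) / h       # Gaussian kernel weights
          X = np.column_stack([np.ones_like(x), x - x0])
          beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
          return beta[0]                                    # intercept = estimate of m(x0)

      rng = np.random.default_rng(1)
      x = rng.uniform(0, 1, 400)
      y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1 + 0.4 * x)   # heteroscedastic noise
      h_var = 0.05 + 0.15 * x     # ad hoc: wider bandwidth where the noise is larger
      grid = np.linspace(0.05, 0.95, 10)
      print(np.round([local_linear(x, y, g, h_var) for g in grid], 2))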
  5. By: Matteo Pelagatti
    Abstract: Duration dependent Markov-switching VAR (DDMS-VAR) models are time series models whose data generating process is a mixture of two VAR processes. The switching between the two VAR processes is governed by a two-state Markov chain with transition probabilities that depend on how long the chain has been in a state. In the present paper we analyze the second-order properties of such models and propose a Markov chain Monte Carlo algorithm to carry out Bayesian inference on the model’s unknowns. Furthermore, a freeware program written by the author for the analysis of time series by means of DDMS-VAR models is illustrated. The methodology and the software are applied to the analysis of the U.S. business cycle.
    Keywords: Markov-switching, business cycle, Gibbs sampler, duration dependence, vector autoregression
    JEL: C11 C15 C32 C41 E32
    Date: 2003–08
    URL: http://d.repec.org/n?u=RePEc:mis:wpaper:20061101&r=ecm
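    Illustration: a toy simulation of duration-dependent regime switching, reduced to a univariate switching AR(1) for brevity (the paper works with a VAR and Bayesian MCMC estimation). The logistic form of stay_prob and all parameter values are assumptions made only for this sketch.

      import numpy as np

      rng = np.random.default_rng(2)

      def stay_prob(duration, beta0=2.0, beta1=-0.15):
          # Logistic duration dependence: the probability of remaining in the current
          # regime falls the longer the chain has already stayed there (beta1 < 0).
          return 1.0 / (1.0 + np.exp(-(beta0 + beta1 * duration)))

      def simulate_ddms(n):
          # Two regimes with different intercepts; a switching AR(1) stands in for the VAR.
          mu, phi, sigma = {0: 0.8, 1: -0.5}, 0.3, 1.0
          s, dur = 0, 1
          states, y = [s], [0.0]
          for _ in range(n - 1):
              if rng.uniform() < stay_prob(dur):
                  dur += 1
              else:
                  s, dur = 1 - s, 1
              states.append(s)
              y.append(mu[s] + phi * y[-1] + sigma * rng.normal())
          return np.array(states), np.array(y)

      states, y = simulate_ddms(300)
      print("fraction of time in regime 1:", states.mean().round(2))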
  6. By: Drost,Feike C.; Akker,Ramon van den; Werker,Bas J.M. (Tilburg University, Center for Economic Research)
    Abstract: This paper considers integer-valued autoregressive processes where the autoregression parameter is close to unity. We consider the asymptotics of this `near unit root' situation. The local asymptotic structure of the likelihood ratios of the model is obtained, showing that the limit experiment is Poissonian. This Poisson limit experiment is used to construct efficient estimators and tests.
    Keywords: integer-valued time series;Poisson limit experiment;local-to-unity asymptotics
    JEL: C12 C13 C19
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200644&r=ecm
  7. By: Alfonso Miranda (Department of Economics, Keele); Sophia Rabe-Hesketh (Graduate School of Education)
    Abstract: Studying behaviour in economics, sociology, and statistics often involves fitting models in which the response variable depends on a dummy variable (also known as a regime switch variable) or in which the response variable is observed only if a particular selection condition is met. In either case, standard regression techniques deliver inconsistent estimators if unobserved factors that affect the response are correlated with unobserved factors that affect the switching or selection variable. Consistent estimators can be obtained by maximum likelihood estimation of a joint model of the outcome and switching or selection variable. This paper describes a ‘wrapper’ program, ssm, that calls gllamm (Rabe-Hesketh et al. 2004a) to fit such models. The wrapper accepts data in a simple structure, has a straightforward syntax, and reports output in a manner that is easily interpretable. One important feature of ssm is that the log-likelihood can be evaluated using adaptive quadrature (Rabe-Hesketh and Skrondal 2002; Rabe-Hesketh et al. 2005).
    Keywords: Endogenous switching, sample selection, binary variable, count data, ordinal variable, probit, poisson regression, adaptive quadrature, gllamm, wrapper, ssm.
    JEL: C13 C31 C35 C87 I12
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:kee:kerpuk:2005/14&r=ecm
  8. By: Matteo Pelagatti; Stefania Rondena
    Abstract: The Dynamic Conditional Correlation (DCC) model of Engle has made the estimation of multivariate GARCH models feasible for reasonably large vectors of securities’ returns. In the present paper we show how Engle’s multi-step estimation of the model can easily be extended to elliptical conditional distributions, and we apply different leptokurtic DCC models to twenty shares listed on the Milan Stock Exchange.
    Keywords: Multivariate GARCH, Correlation, Elliptical distributions, Fat Tails
    JEL: C32 C51 C87
    Date: 2004–06
    URL: http://d.repec.org/n?u=RePEc:mis:wpaper:20060508&r=ecm
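    Illustration: the core of any DCC model is the correlation recursion Q_t = (1 - a - b) Qbar + a z_{t-1} z_{t-1}' + b Q_{t-1}, with R_t obtained by rescaling Q_t to a correlation matrix. The sketch below applies this recursion to Gaussian placeholders for GARCH-standardized returns with fixed illustrative parameters; it does not implement the paper's multi-step estimation under elliptical distributions.

      import numpy as np

      def dcc_correlations(z, a=0.05, b=0.90):
          # z: T x k matrix of standardized residuals; a, b: DCC parameters (a + b < 1).
          T, k = z.shape
          Qbar = np.cov(z, rowvar=False)          # unconditional correlation target
          Q = Qbar.copy()
          R = np.empty((T, k, k))
          for t in range(T):
              if t > 0:
                  Q = (1 - a - b) * Qbar + a * np.outer(z[t - 1], z[t - 1]) + b * Q
              d = 1.0 / np.sqrt(np.diag(Q))
              R[t] = Q * np.outer(d, d)           # rescale Q_t to a correlation matrix
          return R

      rng = np.random.default_rng(3)
      z = rng.standard_normal((500, 3))           # stand-ins for standardized returns
      R = dcc_correlations(z)
      print(R[-1].round(2))                       # conditional correlations at the last date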
  9. By: Riccardo LUCCHETTI (Universita' Politecnica delle Marche, Dipartimento di Economia); Giulio PALOMBA ([n.d.])
    Abstract: Forecasting models for bond yields often use macro data to improve their properties. Unfortunately, macro data are not available at frequencies higher than monthly. To mitigate this problem, we propose a nonlinear VEC model with conditional heteroskedasticity (NECH) and find that it has better in-sample performance than models that fail to encompass nonlinearities and/or GARCH-type effects. Out-of-sample forecasts from our model are marginally superior to those of competing models; however, the data points used to evaluate the forecasts refer to a period of relative tranquillity in the financial markets, whereas we argue that our model should display superior performance under "unusual" circumstances.
    Keywords: conditional heteroskedasticity, forecasting, interest rates, nonlinear cointegration
    JEL: C32 C53 E43
    Date: 2006–05
    URL: http://d.repec.org/n?u=RePEc:anc:wpaper:261&r=ecm
  10. By: Minfeng Deng
    Abstract: One of the key assumptions in spatial econometric modelling is that the spatial process is isotropic, which means that direction is irrelevant in the specification of the spatial structure. On the one hand, this assumption greatly reduces the complexity of spatial models and facilitates estimation and interpretation; on the other hand, it appears rather restrictive and hard to justify in many empirical applications. In this paper a very general anisotropic spatial model, which allows for a high level of flexibility in the spatial structure, is proposed. This new model can be estimated using maximum likelihood and its asymptotic properties are well understood. When the model is applied to the well-known 1970 Boston housing prices data, it significantly outperforms the isotropic spatial lag model. It also provides interesting additional insights into the price determination process in the property market.
    Keywords: Anisotropy, spatial econometrics, maximum likelihood estimation, housing prices.
    JEL: C21 R15 R31
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2006-7&r=ecm
  11. By: Eva Boj; Aurea Grane; Josep Fortiana; M. Merce Claramunt
    Abstract: Distance-based regression allows for a neat implementation of the Partial Least Squares recurrence. In this paper we address practical issues arising when dealing with moderately large datasets (n ~ 10^4) such as those typical of automobile insurance premium calculations.
    Date: 2006–05
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws063514&r=ecm
  12. By: Fabio Busetti (Bank of Italy); Silvia Fabiani (Bank of Italy); Andrew Harvey (Cambridge University)
    Abstract: We consider how unit root and stationarity tests can be used to study the convergence properties of prices and rates of inflation. Special attention is paid to the issue of whether a mean should be extracted in carrying out unit root and stationarity tests and whether there is an advantage to adopting a new (Dickey-Fuller) unit root test based on deviations from the last observation. The asymptotic distribution of the new test statistic is given and Monte Carlo simulation experiments show that the test yields considerable power gains for highly persistent autoregressive processes with relatively large initial conditions, the case of primary interest for analysing convergence. We argue that the joint use of unit root and stationarity tests in levels and first differences allows the researcher to distinguish between series that are converging and series that have already converged, and we set out a strategy to establish whether convergence occurs in relative prices or just in rates of inflation. The tests are applied to the monthly series of the Consumer Price Index in the Italian regional capitals over the period 1970-2003. It is found that all pairwise contrasts of inflation rates have converged or are in the process of converging. Only 24% of price level contrasts appear to be converging, but a multivariate test provides strong evidence of overall convergence.
    Keywords: Dickey-Fuller test, initial condition, law of one price, stationarity test
    JEL: C22 C32
    Date: 2006–02
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_575_06&r=ecm
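    Illustration: a rough numerical sketch of a Dickey-Fuller-type statistic computed on deviations from the last observation, the idea behind the new test described above. The exact form of the statistic, its treatment of deterministic terms, and its critical values should be taken from the paper; the function below is only an assumption-laden illustration.

      import numpy as np

      def df_stat_last_obs(y):
          # DF-style t-statistic on d_t = y_t - y_T: regress diff(d_t) on d_{t-1},
          # no constant. The null distribution is non-standard, so critical values
          # must come from the paper or from simulation, not the usual DF tables.
          d = y - y[-1]
          dy = np.diff(d)                 # equals np.diff(y)
          x = d[:-1]
          rho = (x @ dy) / (x @ x)
          resid = dy - rho * x
          s2 = resid @ resid / (len(dy) - 1)
          return rho / np.sqrt(s2 / (x @ x))

      rng = np.random.default_rng(4)
      y_rw = np.cumsum(rng.standard_normal(300))                 # random walk: no convergence
      y_conv = 5.0 * 0.97 ** np.arange(300) + 0.2 * rng.standard_normal(300)  # closing gap
      print(df_stat_last_obs(y_rw).round(2), df_stat_last_obs(y_conv).round(2))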
  13. By: Andrew Ang; Geert Bekaert; Min Wei
    Abstract: Surveys do! We examine the forecasting power of four alternative methods of forecasting U.S. inflation out-of-sample: time series ARIMA models; regressions using real activity measures motivated from the Phillips curve; term structure models that include linear, non-linear, and arbitrage-free specifications; and survey-based measures. We also investigate several methods of combining forecasts. Our results show that surveys outperform the other forecasting methods and that the term structure specifications perform relatively poorly. We find little evidence that combining forecasts produces superior forecasts to survey information alone. When combining forecasts, the data consistently places the highest weights on survey information.
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2006-15&r=ecm
  14. By: Anatoliy Belaygorod; Michael J. Dueker
    Abstract: We extend Lubik and Schorfheide's (2004) likelihood-based estimation of dynamic stochastic general equilibrium (DSGE) models under indeterminacy to encompass a sample period including both determinacy and indeterminacy by implementing the change-point methodology (Chib, 1998). This feature is useful because DSGE models generally are estimated with data sets that include the Great Inflation of the 1970s and the surrounding low inflation periods. Timing the transitions between determinate and indeterminate equilibria is one of the key contributions of this paper. Moreover, by letting the data provide estimates of the state transition dates and allowing the estimated structural parameters to be the same across determinacy states, we obtain more precise estimates of the differences in characteristics, such as the impulse responses, across the states. In particular, we find that positive interest rate shocks were inflationary under indeterminacy. While the change-point treatment of indeterminacy is applicable to all estimated linear DSGE models, we demonstrate our methodology by estimating the canonical Woodford model with a time-varying inflation target. Implementation of the change-point methodology coupled with Tailored Metropolis-Hastings provides a highly efficient Bayesian MCMC algorithm. Our prior-posterior updates indicate substantially lower sensitivity to hyperparameters of the prior relative to other estimated DSGE models.
    Keywords: Equilibrium (Economics) - Mathematical models ; Econometric models - Evaluation
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2006-025&r=ecm
  15. By: Tung Liu (Department of Economics, Ball State University); Courtenay Cliff Stone (Department of Economics, Ball State University)
    Abstract: Virtually all business and economics statistics texts start their discussion of hypothesis tests with some more-or-less detailed reference to criminal trials. Apparently, these authors believe that students are better able to understand the relevance and usefulness of hypothesis test procedures by introducing them first via the dramatic analogy of the criminal justice system. In this paper, we argue that using the criminal trial analogy to motivate and introduce hypothesis test procedures represents bad statistics and bad pedagogy. First, we show that statistical hypothesis test procedures cannot be applied to criminal trials. Thus, the criminal trial analogy is invalid. Second, we propose that students can better understand the simplicity and validity of statistical hypothesis test procedures if these procedures are carefully contrasted with the difficulties of decision-making in the context of criminal trials. The criminal trial discussion provides a bad analogy but an excellent counter-example for teaching statistical hypothesis test procedures and the nature of statistical decision-making.
    Keywords: hypothesis tests, criminal trials, Neyman-Pearson hypothesis test procedures
    JEL: A22 C12 K14
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:bsu:wpaper:200601&r=ecm
  16. By: Carsten Kuchler; Martin Spieß
    Abstract: Like other data quality dimensions, the concept of accuracy is often adopted to characterise a particular data set. However, its common specification basically refers to statistical properties of estimators, which can hardly be verified by means of a single survey at hand. This ambiguity can be resolved by assigning 'accuracy' to survey processes that are known to affect these properties. In this contribution, we consider the sub-process of imputation as one important step in setting up a data set and argue that the so-called 'hit-rate' criterion, which is intended to measure the accuracy of a data set by some distance function between the 'true' but unobserved values and the imputed values, is neither required nor desirable. In contrast, the so-called 'inference' criterion allows for valid inferences based on a suitably completed data set under rather general conditions. The underlying theoretical concepts are illustrated by means of a simulation study. It is emphasised that the same principal arguments apply to other survey processes that introduce uncertainty into an edited data set.
    Keywords: Survey Quality, Survey Processes, Accuracy, Assessment of Imputation Methods, Multiple Imputation
    JEL: C42 C81 C11 C13
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp586&r=ecm
  17. By: E Bataa; D R Osborn; D H Kim
    Abstract: We extend the vector autoregression (VAR) based expectations hypothesis tests of the term structure using recent developments in the bootstrap literature. First, we use a wild bootstrap to allow for conditional heteroskedasticity in the VAR residuals without imposing any parameterization on this heteroskedasticity. Second, we endogenize the model selection procedure in the bootstrap replications to reflect true uncertainty. Finally, a stationarity correction is introduced, designed to prevent finite-sample bias-adjusted VAR parameters from becoming explosive. When the new methodology is applied to extensive US zero-coupon term structure data with maturities ranging from 1 month to 10 years, we find fewer rejections of the theory in the Jan 1982-Dec 2003 subsample than in Jan 1952-Dec 1978, and when the theory is rejected this occurs only at the very short and long ends of the maturity spectrum, in contrast to the U-shaped pattern observed in some of the previous literature.
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:man:cgbcrp:72&r=ecm
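    Illustration: a compact sketch of a wild bootstrap for VAR residuals with Rademacher weights, the first of the three ingredients described above, applied to a VAR(1) with intercept. It omits the endogenized lag selection and the stationarity correction that the paper also implements, and the simulated data are purely illustrative.

      import numpy as np

      def fit_var1(y):
          # OLS fit of y_t = c + A y_{t-1} + u_t; returns stacked coefficients and residuals.
          Y, X = y[1:], np.column_stack([np.ones(len(y) - 1), y[:-1]])
          B = np.linalg.lstsq(X, Y, rcond=None)[0]       # rows: [c; A']
          return B, Y - X @ B

      def wild_bootstrap_var1(y, n_boot=999, rng=None):
          # Residuals are multiplied by Rademacher draws, preserving their conditional
          # heteroskedasticity, and bootstrap samples are rebuilt recursively.
          rng = rng or np.random.default_rng(5)
          B, U = fit_var1(y)
          c, A = B[0], B[1:].T
          boot_A = []
          for _ in range(n_boot):
              eta = rng.choice([-1.0, 1.0], size=len(U))  # Rademacher weights
              Ub = U * eta[:, None]
              yb = np.empty_like(y)
              yb[0] = y[0]
              for t in range(1, len(y)):
                  yb[t] = c + A @ yb[t - 1] + Ub[t - 1]
              boot_A.append(fit_var1(yb)[0][1:].T)
          return np.array(boot_A)

      rng = np.random.default_rng(5)
      y = np.zeros((200, 2))
      for t in range(1, 200):
          y[t] = np.array([0.5, 0.2]) * y[t - 1] + rng.standard_normal(2) * (1 + 0.5 * (t > 100))
      draws = wild_bootstrap_var1(y, n_boot=200)
      print(draws.mean(axis=0).round(2))                  # bootstrap mean of the VAR(1) matrix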
  18. By: Mikael Bask (Monetary Policy and Research Department, Bank of Finland); Tung Liu (Department of Economics, Ball State University); Anna Widerberg (Department of Economics)
    Abstract: The aim of this paper is to illustrate how the stability of a stochastic dynamic system is measured using Lyapunov exponents. Specifically, we use a feedforward neural network to estimate these exponents, and asymptotic results for this estimator to test for unstable (chaotic) dynamics. The data set used is spot electricity prices from the Nordic power exchange, Nord Pool, and the dynamic system that generates these prices appears to be chaotic in one case.
    Keywords: Feedforward Neural Network; Nord Pool; Lyapunov Exponents; Spot Electricity Prices; Stochastic Dynamic System
    JEL: C12 C14 C22
    Date: 2006–04
    URL: http://d.repec.org/n?u=RePEc:bsu:wpaper:200603&r=ecm
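    Illustration: for readers unfamiliar with Lyapunov exponents, the sketch below computes the exponent of the (deterministic) logistic map as the time average of log|f'(x_t)|. It illustrates only the idea that a positive exponent signals chaotic dynamics; it has nothing to do with the paper's neural-network estimator for noisy systems.

      import numpy as np

      def lyapunov_logistic(r, n=100000, burn=1000, x0=0.3):
          # Largest Lyapunov exponent of x_{t+1} = r x_t (1 - x_t), estimated as the
          # time average of log |f'(x_t)| = log |r (1 - 2 x_t)| along the orbit.
          x = x0
          for _ in range(burn):
              x = r * x * (1 - x)
          acc = 0.0
          for _ in range(n):
              acc += np.log(abs(r * (1 - 2 * x)))
              x = r * x * (1 - x)
          return acc / n

      print(lyapunov_logistic(3.5))   # negative: stable (periodic) dynamics
      print(lyapunov_logistic(4.0))   # positive (about log 2): chaotic dynamics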
  19. By: Christophe Hurlin (LEO - Laboratoire d'économie d'Orleans - [CNRS : UMR6221] - [Université d'Orléans]); Valérie Mignon (CEPII - Centre d'études prospectives et d'informations internationales - [Université de Paris X - Nanterre])
    Abstract: The purpose of this paper is to provide a complete overview of the literature on cointegration tests for panel data. After presenting the concepts specific to panel cointegration, we review the tests of the null hypothesis of no cointegration (the tests of Pedroni (1995, 1997, 1999, 2003), Kao (1999), Bai and Ng (2001) and Groen and Kleibergen (2003)) as well as the test of McCoskey and Kao (1998), which rests on the null hypothesis of cointegration. Some elements on inference and estimation in cointegrated systems are also provided.
    Keywords: Non-stationary panel data; unit root; cointegration
    Date: 2006–05–22
    URL: http://d.repec.org/n?u=RePEc:hal:papers:halshs-00070887_v1&r=ecm
  20. By: Toepoel,Vera; Das,Marcel; Soest,Arthur van (Tilburg University, Center for Economic Research)
    Abstract: This article shows that respondents gain meaning from visual cues in a web survey as well as from verbal cues (words). We manipulated the layout of a five-point rating scale using verbal, graphical, numerical, and symbolic language. This paper extends the existing literature in four directions: (1) all languages (verbal, graphical, numeric, and symbolic) are individually manipulated on the same rating scale, (2) a heterogeneous sample is used, (3) we analyze how personal characteristics and a respondent's need to think and evaluate account for variance in survey responses, and (4) a web survey is used. Our experiments show differences due to verbal and graphical language, but no effects of numeric or symbolic language are found. Respondents with a high need for cognition and a high need to evaluate are affected more by layout than respondents with a low need to think or evaluate. Furthermore, men, the elderly, and the highly educated are the most sensitive to layout effects.
    Keywords: web survey;questionnaire layout;context effects;need for cognition;need to evaluate
    JEL: C42 C81 C93
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200630&r=ecm
  21. By: Angela Birk
    Abstract: The paper presents a simple method for obtaining the impulse responses of the VARs implied by a stochastic recursive dynamic macro model: the transition matrix and the stationary distribution function are derived from the model itself, i.e. from economic theory.
    URL: http://d.repec.org/n?u=RePEc:lsu:lsuwpp:2006-11&r=ecm
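    Illustration: once a model delivers a VAR representation, its impulse responses follow mechanically from the companion matrix, Psi_h = J A^h J'. The sketch below shows only that final step, with arbitrary illustrative coefficient matrices rather than ones derived from an economic model as in the paper.

      import numpy as np

      def var_impulse_responses(coef_matrices, horizons=10):
          # Impulse responses of a VAR(p) via its companion form: A is the companion
          # matrix, J selects the first k rows, and Psi_h = J A^h J'.
          k, p = coef_matrices[0].shape[0], len(coef_matrices)
          A = np.zeros((k * p, k * p))
          A[:k, :] = np.hstack(coef_matrices)
          A[k:, :-k] = np.eye(k * (p - 1))
          J = np.hstack([np.eye(k), np.zeros((k, k * (p - 1)))])
          Ah, psi = np.eye(k * p), []
          for _ in range(horizons + 1):
              psi.append(J @ Ah @ J.T)
              Ah = A @ Ah
          return np.array(psi)        # psi[h] maps a unit shock at t into y_{t+h}

      A1 = np.array([[0.5, 0.1], [0.0, 0.3]])
      A2 = np.array([[0.2, 0.0], [0.1, 0.1]])
      print(var_impulse_responses([A1, A2], horizons=4).round(3))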

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.