nep-ecm New Economics Papers
on Econometrics
Issue of 2012‒04‒03
eighteen papers chosen by
Sune Karlsson
Orebro University

  1. Identification, Estimation and Specification in a Class of Semiparametric Time Series Models By Jiti Gao
  2. A Note on the Finite Sample Properties of the CLS Method of TAR Models By Marian Vavra
  3. Robustness of Power Properties of Non-linearity Tests By Marian Vavra
  4. A Monte Carlo Study of Bias Corrections for Panel Probit Models By Blair Alexander; Robert Breunig
  5. Generalized Tests of Investment Fund Performance By Márcio Laurini
  6. Which Spatial Weights Matrix? An Approach to Model Selection By Herrera Gómez, Marcos; Mur Lacambra, Jesús; Ruiz Marín, Manuel
  7. Martingale approximation for common factor representation By Bystrov, Victor; di Salvatore, Antonietta
  8. Model Discovery and Trygve Haavelmo's Legacy By David F. Hendry; Soren Johansen
  9. New methods to estimate models with large sets of fixed effects with an application to matched employer-employee data from Germany By Mittag, Nikolas
  10. Bimodality & the performance of PPML By Prehn, Sören; Brümmer, Bernhard
  11. Testing Non-linearity Using a Modified Q Test By Marian Vavra
  12. Evidence on Features of a DSGE Business Cycle Model from Bayesian Model Averaging By Rodney Strachan; Herman K. van Dijk
  13. Backward and forward closed solutions of multivariate ARMA models. By Ludlow-Wiechers, Jorge
  14. Empirical analysis of the forecast error impact of classical and bayesian beta adjustment techniques By Sinha, Pankaj; Jayaraman, Prabha
  15. Heavy-Tail Distribution from Correlation of Discrete Stochastic Process By Jongwook Kim; Teppei Okumura
  16. The Dual of the Least-Squares Method By Paris, Quirino
  17. Nested logit or random coefficients logit? A comparison of alternative discrete models of product differentiation By Laura GRIGOLON; Frank VERBOVEN
  18. Forecasting from Structural Econometric Models By David F. Hendry; Grayham E. Mizon

  1. By: Jiti Gao
    Abstract: In this paper, we consider some identification, estimation and specification problems in a class of semiparametric time series models. Existing studies for the stationary time series case have been reviewed and discussed. We also consider the case where new studies for the integrated nonstationary case are established. In the meantime, we propose some new estimation methods and establish some new results for a new class of semiparametric autoregressive models. In addition, we discuss certain directions for further research.
    Keywords: Asymptotic theory, departure function, kernel method, nonlinearity, nonstationarity, semiparametric model, stationarity, time series
    JEL: C13 C14 C22
    Date: 2012–03
  2. By: Marian Vavra (Department of Economics, Mathematics & Statistics, Birkbeck)
    Abstract: In this paper we focus on the finite sample properties of the conditional least squares (CLS) method for threshold autoregressive (TAR) parameters under the following conditions: (a) non-Gaussian model innovations; (b) two types of asymmetry (i.e. deepness and steepness) captured by TAR models. It is clearly demonstrated that the finite sample properties of the CLS estimator differ significantly depending on the type of asymmetry. The estimator behaves very well for steepness-based models compared to deepness-based models. Therefore, extreme caution must be exercised in preliminary modelling steps, such as testing the type of asymmetry, before estimating TAR models in practice. A mistake in this phase of modelling can, in turn, produce very problematic results.
    Keywords: threshold autoregressive model, Monte Carlo method, bias, asymmetry
    JEL: C15 C22 C46
    Date: 2012–03
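For readers unfamiliar with the estimator under study, the CLS idea for a two-regime threshold autoregression can be sketched in a few lines: grid-search the threshold over the trimmed order statistics of the lagged series, fit OLS within each regime, and keep the threshold minimising the pooled sum of squared residuals. The sketch below is illustrative only; the simulation design and parameter values are not the paper's:

```python
import random

def simulate_setar(n, a=(0.5, -0.5), b=(0.6, -0.4), r=0.0, seed=1):
    """Simulate a two-regime SETAR(1); the regime depends on y[t-1] <= r."""
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(n):
        lag = y[-1]
        i = 0 if lag <= r else 1
        y.append(a[i] + b[i] * lag + rng.gauss(0.0, 1.0))
    return y[1:]

def _ols_ssr(pairs):
    """Regress y on a constant and x by least squares; return (c, b, SSR)."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    sxx = sum((x - mx) ** 2 for x, _ in pairs)
    sxy = sum((x - mx) * (y - my) for x, y in pairs)
    b = sxy / sxx if sxx > 0 else 0.0
    c = my - b * mx
    ssr = sum((y - c - b * x) ** 2 for x, y in pairs)
    return c, b, ssr

def cls_setar(y, trim=0.15):
    """Conditional least squares for a SETAR(1): grid-search the threshold
    over the trimmed order statistics of the lagged series, fitting OLS in
    each regime and minimising the pooled sum of squared residuals."""
    data = list(zip(y[:-1], y[1:]))              # (y[t-1], y[t]) pairs
    grid = sorted(x for x, _ in data)
    lo, hi = int(trim * len(grid)), int((1 - trim) * len(grid))
    best = None
    for r in grid[lo:hi]:
        low = [p for p in data if p[0] <= r]
        high = [p for p in data if p[0] > r]
        if len(low) < 5 or len(high) < 5:
            continue
        c1, b1, s1 = _ols_ssr(low)
        c2, b2, s2 = _ols_ssr(high)
        if best is None or s1 + s2 < best[0]:
            best = (s1 + s2, r, (c1, b1), (c2, b2))
    return best  # (SSR, threshold, low-regime coefs, high-regime coefs)
```

The paper's point is precisely that the finite-sample behaviour of such estimates depends on the type of asymmetry in the data generating process, which a sketch like this makes easy to explore by Monte Carlo.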
  3. By: Marian Vavra (Department of Economics, Mathematics & Statistics, Birkbeck)
    Abstract: The paper examines the robustness of the size and power properties of the standard non-linearity tests under different conditions such as moment failure and asymmetry of innovations. Our results reveal the following. First, there appears to be no direct link between moment condition failure and the power variation of non-linearity tests. Second, the power of the tests is far more sensitive to asymmetry of innovations than to moment condition failure. Third, although we evaluate 9 non-linear time series models using 8 standard non-linearity tests, some non-linear models remain completely undetected.
    Keywords: non-linearity testing, Monte Carlo experiments
    JEL: C15 C22
    Date: 2012–03
  4. By: Blair Alexander; Robert Breunig
    Abstract: We examine bias corrections that have been proposed for the fixed effects panel probit model with exogenous regressors, using several different data generating processes to evaluate the performance of the estimators in different situations. We identify a single best estimator across all cases for coefficient estimates, but when marginal effects are the quantity of interest, no analytical correction outperforms the uncorrected maximum likelihood estimator (MLE).
    Keywords: bias correction; panel probit; marginal effects
    JEL: C13 C23
    Date: 2012–03
  5. By: Márcio Laurini (IBMEC Business School)
    Abstract: The paper discusses the use of statistical methods in the comparison of investment fund performance indicators. The analysis is based on the robust statistics proposed by Ledoit and Wolf (2008), for the pairwise comparison of funds and two generalizations for sets of multiple investment funds. The multiple investment fund tests use the Wald and Distance Metric statistics, based on estimation by Generalized Method of Moments using HAC matrices. In order to correct power limitations in the GMM estimation in the case of a large number of moment conditions, the test distributions are obtained through block-bootstrap procedures. We applied the proposed procedures to daily return data for the largest 97 actively managed equity funds in the Brazilian market, covering the period from July 2006 to July 2008. The results indicate that there are no significant differences in the performances of the 97 funds in the sample, both in pairwise and joint comparisons, thus providing what is believed to be the first Brazilian market evidence for the so-called herding hypothesis.
    Keywords: Sharpe Ratio, GMM, Investment Analysis
    JEL: G11 G14
    Date: 2012–03–22
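The pairwise comparison idea can be illustrated with a circular block bootstrap of the Sharpe-ratio difference. This is a simplified sketch, not Ledoit and Wolf's studentized procedure: the centring scheme, block length and replication count here are illustrative assumptions:

```python
import math
import random

def sharpe(returns):
    """Sample Sharpe ratio (per period, zero risk-free rate)."""
    n = len(returns)
    mu = sum(returns) / n
    var = sum((r - mu) ** 2 for r in returns) / (n - 1)
    return mu / math.sqrt(var)

def block_bootstrap_sharpe_diff(ret_a, ret_b, block=10, reps=500, seed=0):
    """Two-sided bootstrap p-value for H0: equal Sharpe ratios. A circular
    block bootstrap resamples the two return series jointly, so both their
    serial and cross-sectional dependence are preserved; the bootstrap
    distribution is centred at the observed difference to mimic the null."""
    rng = random.Random(seed)
    n = len(ret_a)
    observed = sharpe(ret_a) - sharpe(ret_b)
    exceed = 0
    for _ in range(reps):
        idx = []
        while len(idx) < n:                      # paste circular blocks
            start = rng.randrange(n)
            idx.extend((start + j) % n for j in range(block))
        idx = idx[:n]
        d = sharpe([ret_a[i] for i in idx]) - sharpe([ret_b[i] for i in idx])
        if abs(d - observed) >= abs(observed):
            exceed += 1
    return observed, exceed / reps
```

The paper's joint tests over 97 funds replace this pairwise difference with Wald and Distance Metric statistics under GMM, but the block-bootstrap logic for obtaining the test distribution is the same in spirit.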
  6. By: Herrera Gómez, Marcos; Mur Lacambra, Jesús; Ruiz Marín, Manuel
    Abstract: In spatial econometrics, it is customary to specify a weighting matrix, the so-called W matrix. The decision is important because the choice of W matrix determines the rest of the analysis. However, the procedure is not well defined and, usually, reflects the priors of the user. In the paper, we revise the literature looking for criteria to help with this problem. Also, a new nonparametric procedure is introduced. Our proposal is based on a measure of the information, conditional entropy, that uses information present in the data. We compare these alternatives by means of a Monte Carlo experiment.
    Keywords: Spatial econometrics; Model selection; Symbolic entropy
    JEL: C52 C12 C21 C01
    Date: 2011
  7. By: Bystrov, Victor; di Salvatore, Antonietta
    Abstract: In this paper a martingale approximation is used to derive the limiting distribution of simple positive eigenvalues of the sample covariance matrix for a stationary linear process. The derived distribution can be used to study stability of the common factor representation based on the principal component analysis of the covariance matrix.
    Keywords: martingale approximation; dynamic factor model; eigenvalue; stability
    JEL: C10 C32
    Date: 2012–03–26
  8. By: David F. Hendry; Soren Johansen
    Abstract: Trygve Haavelmo’s Probability Approach aimed to implement economic theories, but he later recognized their incompleteness. Although he did not explicitly consider model selection, we apply it when theory-relevant variables, {xt}, are retained without selection while other candidate variables, {wt}, are selected over. Under the null that the {wt} are irrelevant, orthogonalizing with respect to the {xt} leaves the estimator distributions of the parameters on the {xt} unaffected by selection, even with more candidate variables than observations and with endogenous variables. Under the alternative, when the joint model nests the generating process, selection delivers an improved outcome. This implements Haavelmo’s program relatively costlessly.
    Keywords: Trygve Haavelmo, Model discovery, Theory retention, Impulse-indicator saturation, Autometrics
    JEL: C51 C22
    Date: 2012
  9. By: Mittag, Nikolas
    Abstract: This paper introduces new methods to estimate the two-way fixed effects model and the match effects model in datasets where the number of fixed effects makes standard estimation techniques infeasible. The methods work for balanced and unbalanced panels and increase the speed of estimation without imposing excessive computational demands. I apply the methods to a new and unusually detailed matched employer-employee dataset from Germany. The analysis shows that the omission of match effects leads to biased inference, particularly concerning the effects of individual characteristics, and underlines the importance of accurate biographic data.
    Keywords: labour market research, IAB-Linked-Employer-Employee-Datensatz, labour market model, estimation, matching, empirical research, panel, career history, longitudinal study, algorithm
    Date: 2012–03–12
  10. By: Prehn, Sören; Brümmer, Bernhard
    Abstract: There has been an extensive discussion of the applicability of Poisson Pseudo Maximum Likelihood (PPML) to trade. Here, we re-analyse the performance of PPML in the light of a bimodal distribution and, in addition, explicitly account for excess zeros. Simulations are based on a Bernoulli-Gamma distribution (a zero-inflated Gamma distribution). Again, our results confirm how well-behaved PPML is in general.
    Keywords: Poisson Pseudo Maximum Likelihood, excess zeros, zero-inflated Gamma distribution, simulation
    Date: 2012
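As background, PPML fits the conditional mean E[y|x] = exp(x'b) by solving the Poisson score equations, and remains consistent when y is continuous, overdispersed and contains zeros, so long as the mean is correctly specified. A minimal one-regressor sketch with zero-inflated Gamma data (the parameter values and zero-inflation rate are illustrative, not the authors' Bernoulli-Gamma design):

```python
import math
import random

def ppml(x, y, iters=50, tol=1e-10):
    """Poisson pseudo-maximum likelihood for E[y|x] = exp(a + b*x), fitted
    by Newton-Raphson on the Poisson score equations. Only the conditional
    mean must be correct: y may be continuous and may contain zeros."""
    a, b = math.log(sum(y) / len(y)), 0.0         # standard GLM start
    for _ in range(iters):
        mu = [math.exp(a + b * xi) for xi in x]
        s1 = sum(yi - mi for yi, mi in zip(y, mu))
        s2 = sum(xi * (yi - mi) for xi, yi, mi in zip(x, y, mu))
        h11 = sum(mu)                             # negative Hessian entries
        h12 = sum(xi * mi for xi, mi in zip(x, mu))
        h22 = sum(xi * xi * mi for xi, mi in zip(x, mu))
        det = h11 * h22 - h12 * h12
        da = (h22 * s1 - h12 * s2) / det
        db = (h11 * s2 - h12 * s1) / det
        a, b = a + da, b + db
        if abs(da) + abs(db) < tol:
            break
    return a, b

# Zero-inflated Gamma data: a 30% point mass at zero, with the non-zero
# part scaled so the overall mean stays exp(0.5 + x); PPML should then
# recover roughly (a, b) = (0.5, 1.0).
rng = random.Random(7)
x = [rng.gauss(0.0, 1.0) for _ in range(5000)]
y = []
for xi in x:
    mean = math.exp(0.5 + xi)
    if rng.random() < 0.3:
        y.append(0.0)                             # excess zero
    else:
        y.append(rng.gammavariate(2.0, mean / (0.7 * 2.0)))
a_hat, b_hat = ppml(x, y)
```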
  11. By: Marian Vavra (Department of Economics, Mathematics & Statistics, Birkbeck)
    Abstract: A new version of the Q test, based on generalized residual correlations (i.e. auto-correlations and cross-correlations), is developed in this paper. The new Q test fixes two main shortcomings of the McLeod and Li Q (MLQ) test often used in the literature: (i) the test is capable of detecting some interesting non-linear models for which the original MLQ test completely fails (e.g. a non-linear moving average model), and it also significantly improves power for some other non-linear models (e.g. a threshold moving average model) for which the original MLQ test does not work very well; (ii) the new Q test can also be used to discriminate between simple and more complicated (non-linear/asymmetric) GARCH models.
    Keywords: non-linearity testing, portmanteau Q test, auto-correlation, cross-correlation
    JEL: C12 C15 C32 C46
    Date: 2012–03
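The original McLeod-Li test that the paper modifies is itself easy to state: compute the Ljung-Box Q statistic on the squared residuals and compare with a chi-square(m) distribution. A minimal sketch (the paper's generalized version with cross-correlations is not reproduced here):

```python
import math
import random

def autocorr(x, k):
    """Lag-k sample autocorrelation."""
    n = len(x)
    mu = sum(x) / n
    denom = sum((v - mu) ** 2 for v in x)
    return sum((x[t] - mu) * (x[t - k] - mu) for t in range(k, n)) / denom

def chi2_sf_even(x, df):
    """Chi-square survival function, closed form for EVEN df (Erlang tail)."""
    k = df // 2
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= (x / 2.0) / i
        total += term
    return math.exp(-x / 2.0) * total

def mcleod_li(resid, m=10):
    """McLeod-Li portmanteau test: the Ljung-Box Q statistic computed on
    the SQUARED residuals. Under the null of no neglected non-linearity,
    Q is asymptotically chi-square with m degrees of freedom."""
    sq = [e * e for e in resid]
    n = len(sq)
    q = n * (n + 2) * sum(autocorr(sq, k) ** 2 / (n - k) for k in range(1, m + 1))
    return q, chi2_sf_even(q, m)
```

Because the statistic only looks at autocorrelations of the squares, models whose non-linearity leaves the squared residuals serially uncorrelated slip through, which is the gap the generalized residual correlations are designed to close.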
  12. By: Rodney Strachan (Australian National University); Herman K. van Dijk (Erasmus University Rotterdam, and VU University Amsterdam.)
    Abstract: The empirical support for features of a Dynamic Stochastic General Equilibrium model with two technology shocks is evaluated using Bayesian model averaging over vector autoregressions. The model features include equilibria, restrictions on long-run responses, a structural break of unknown date and a range of lags and deterministic processes. We find support for a number of features implied by the economic model and the evidence suggests a break in the entire model structure around 1984 after which technology shocks appear to account for all stochastic trends. Business cycle volatility seems more due to investment specific technology shocks than neutral technology shocks.
    Keywords: Posterior probability; Dynamic stochastic general equilibrium model; Cointegration; Model averaging; Stochastic trend; Impulse response; Vector autoregressive model
    JEL: C11 C32 C52
    Date: 2012–03–20
  13. By: Ludlow-Wiechers, Jorge
    Abstract: Some of the most widely used models in economics involve variables not yet observed, so their specification depends on future observations; the theory that underpins these models delivers the backward/forward solution. We present a unified construction that starts from a more general specification of an ARMA model and delivers closed-form solutions in both the backward and forward cases, leading to an alternative presentation of the causal/non-causal and invertible/non-invertible cases.
    Keywords: Causal models; non-causal models; invertible models; non-invertible models; backward solution; forward solution
    JEL: C32 C50 C22 C01
    Date: 2012–03–25
  14. By: Sinha, Pankaj; Jayaraman, Prabha
    Abstract: The paper presents a comparative study of conventional beta adjustment techniques and suggests an improved Bayesian model for beta forecasting. The seminal papers of Blume (1971) and Levy (1971) showed that, for both single securities and portfolios, relatively high and low beta coefficients tend to over-predict and under-predict, respectively, the corresponding betas of the subsequent time period. We exploit this regularity to derive a Bayesian adjustment technique under a bilinear loss function, in which the overestimation and underestimation of future betas are corrected so as to yield improved beta forecasts. The accuracy and efficiency of our methodology relative to existing procedures is demonstrated by computing the mean square forecast error.
    Keywords: Bayesian Beta adjustment technique; bi-linear loss function; portfolio risk measure
    JEL: G12 G32 C1 C11
    Date: 2012–02–06
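As background, the classical corrections such a study benchmarks against are easy to state: Blume's (1971) regression-to-the-mean correction and Vasicek's (1973) Bayesian shrinkage toward the cross-sectional mean beta, the standard Bayesian benchmark in this literature. A minimal sketch (the paper's own bilinear-loss estimator is not reproduced here):

```python
def blume_adjust(beta, a=0.371, b=0.635):
    """Blume (1971) regression-to-the-mean correction; the default (a, b)
    are Blume's original period estimates."""
    return a + b * beta

def vasicek_adjust(betas, ses):
    """Vasicek (1973) Bayesian shrinkage: each OLS beta is pulled toward
    the cross-sectional mean, weighting the individual estimate by its
    precision relative to the cross-sectional dispersion of betas."""
    n = len(betas)
    prior_mean = sum(betas) / n
    prior_var = sum((b - prior_mean) ** 2 for b in betas) / (n - 1)
    adjusted = []
    for b, se in zip(betas, ses):
        w = prior_var / (prior_var + se ** 2)    # weight on the sample beta
        adjusted.append(w * b + (1.0 - w) * prior_mean)
    return adjusted
```

Both corrections pull extreme betas toward the centre, which is exactly the over/under-prediction pattern of Blume and Levy that the abstract describes; the Bayesian version does so security by security, in proportion to estimation uncertainty.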
  15. By: Jongwook Kim; Teppei Okumura
    Abstract: We propose a stochastic process driven by a memory effect, with novel distributions that include both exponential and leptokurtic heavy-tailed distributions. One class of the distributions is analytically derived from the continuum limit of a discrete binary process with renormalized auto-correlation. The moment generating function is obtained in closed form, so the cumulants can be calculated and shown to be convergent. The other class of the distributions is investigated numerically. Combining the two stochastic processes of memory with different signs under a regime-switching mechanism results in power-law decay. We therefore argue that memory is an alternative origin of heavy tails.
    Date: 2012–03
  16. By: Paris, Quirino
    Keywords: least squares, primal, dual, Pythagoras theorem, noise, value of sample information, Research Methods/Statistical Methods
    JEL: C20 C30
    Date: 2012
  17. By: Laura GRIGOLON; Frank VERBOVEN
    Abstract: We start from an aggregate random coefficients nested logit (RCNL) model to provide a systematic comparison between the tractable logit and nested logit (NL) models with the computationally more complex random coefficients logit (RC) model. We first use simulated data to assess possible parameter biases when the true model is a RCNL model. We then use data on the automobile market to estimate the different models, and as an illustration assess what they imply for competition policy analysis. As expected, the simple logit model is rejected against the NL and RC model, but both of these models are in turn rejected against the more general RCNL model. While the NL and RC models result in quite different substitution patterns, they give robust policy conclusions on the predicted price effects from mergers. In contrast, the conclusions for market definition are not robust across different demand models. In general, our findings suggest that it is important to account for sources of market segmentation that are not captured by continuous characteristics in the RC model.
    Date: 2011–09
  18. By: David F. Hendry; Grayham E. Mizon
    Abstract: Understanding the workings of whole economies is essential for sound policy advice - but not necessarily for accurate forecasts. Structural models play a major role at most central banks and many other governmental agencies, yet almost none forecast the financial crisis and ensuing recession. We focus on the problem of forecast failure that has become prominent during and after that crisis, and illustrate its sources and many surprising implications using a simple model. An application to ‘forecasting’ UK GDP over 2008(1)-2011(2) is consistent with our interpretation.
    Keywords: Structural models, Location shifts, Economic forecasting, Autometrics
    JEL: C52 C22
    Date: 2012

This nep-ecm issue is ©2012 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.