nep-ecm New Economics Papers
on Econometrics
Issue of 2013‒01‒19
thirteen papers chosen by
Sune Karlsson
Orebro University

  1. The Estimation and Testing of a Linear Regression with Near Unit Root in the Spatial Autoregressive Error Term By Badi H. Baltagi; Chihwa Kao; Long Liu
  2. Limiting experiments for panel-data and jump-diffusion models. By Becheri, I.G.
  3. Survival prediction based on compound covariate under cox proportional hazard models By Emura, Takeshi; Chen, Yi-Hau; Chen, Hsuan-Yu
  4. The Generalised Autocovariance Function By Proietti, Tommaso; Luati, Alessandra
  5. The Design Effect: Bias and Variance Estimation By Alberto Padilla
  6. Bayesian inference and data cloning in population projection matrices By J. de la Horra Navarro; J. Miguel Marín; M. T. Rodríguez Bernal
  7. The impact of sampling variation on peer measures: a comment on a proposal to adjust estimates for measurement error By Pedro N. Silva; John Micklewright; Sylke V. Schnepf
  8. Forecasting extreme electricity spot prices By Volodymyr Korniichuk
  9. Leverage and Feedback Effects on Multifactor Wishart Stochastic Volatility for Option Pricing By Manabu Asai; Michael McAleer
  10. Monetary Transmission Mechanism and Time Variation in the Euro Area By Kemal Bagzibagli
  11. Coupling between time series: a network view By Saeed Mehraban; Amirhossein Shirazi; Maryam Zamani; Gholamreza Jafari
  12. On the Choice of Optimization Routine in Estimation of Parsimonious Term Structure Models: Results from the Svensson Model By Virmani, Vineet
  13. Forecasting and nowcasting real GDP: Comparing statistical models and subjective forecasts By Jos Jansen; Xiaowen Jin; Jasper de Winter

  1. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244-1020); Chihwa Kao (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244-1020); Long Liu (Department of Economics, College of Business, University of Texas at San Antonio)
    Abstract: This paper considers the estimation of a linear regression with a spatial autoregressive (SAR) error term that is nearly nonstationary. The asymptotic properties of the ordinary least squares (OLS), true generalized least squares (GLS) and feasible generalized least squares (FGLS) estimators, as well as the corresponding Wald test statistics, are derived. Monte Carlo experiments are conducted to study the sampling behavior of the proposed estimators and test statistics.
    Keywords: Spatial Autocorrelation; Ordinary Least Squares; Generalized Least Squares; Two-stage Least Squares; Maximum Likelihood Estimation
    JEL: C23 C33
    Date: 2012–12
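The GLS logic studied in the paper can be illustrated numerically. The sketch below is a minimal toy example, not the authors' estimator: it assumes a known, row-normalised chain-neighbour weight matrix W and a known ρ (FGLS would plug in an estimate), and exploits the fact that GLS under SAR(1) errors is OLS on data transformed by A = I − ρW.

```python
import numpy as np

def sar_gls(y, X, W, rho):
    """GLS for y = X b + u with SAR(1) errors u = rho*W*u + eps.

    Transforming by A = I - rho*W whitens the errors, so GLS is
    just OLS on (A y, A X)."""
    n = len(y)
    A = np.eye(n) - rho * W
    beta, *_ = np.linalg.lstsq(A @ X, A @ y, rcond=None)
    return beta

# Toy setup: chain of neighbours, row-normalised weights.
rng = np.random.default_rng(1)
n = 200
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
W /= W.sum(axis=1, keepdims=True)

rho = 0.9                                   # near unit root in the SAR error
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
u = np.linalg.solve(np.eye(n) - rho * W, rng.standard_normal(n))
y = X @ np.array([1.0, 2.0]) + u

beta_gls = sar_gls(y, X, W, rho)            # slope estimate near 2.0
```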
  2. By: Becheri, I.G. (Tilburg University)
    Abstract: This work concerns the theory of limiting experiments and its use in econometrics. In Chapter 2, we consider jump-diffusion models and, by means of the limiting experiment, compare the statistical information contained in continuous-time observations with that contained in discrete-time observations sampled at high frequency. In Chapter 3, we establish the Local Asymptotic Quadratic condition for bivariate hidden Ornstein-Uhlenbeck models using continuous-time observations. We assume that the hidden process is highly persistent and, using the limiting experiment, discuss some inference procedures. Chapter 4 provides the power envelope for tests of the unit root hypothesis in Gaussian panel data models with cross-sectional dependence, and proposes a test statistic that attains the power envelope.
    Date: 2012
  3. By: Emura, Takeshi; Chen, Yi-Hau; Chen, Hsuan-Yu
    Abstract: Survival prediction from a large number of covariates is a current focus of statistical and medical research. In this paper, we study a methodology known as compound covariate prediction, performed under univariate Cox proportional hazard models. We demonstrate via simulations and real data analysis that the compound covariate method generally competes well with ridge regression and Lasso methods, both already well-studied methods for predicting survival outcomes with a large number of covariates. Furthermore, we develop a refinement of the compound covariate method by incorporating likelihood information from multivariate Cox models. The new proposal is an adaptive method that borrows information contained in both the univariate and multivariate Cox regression estimators. We show that the new proposal has a theoretical justification from statistical large-sample theory and is naturally interpreted as a shrinkage-type estimator, a popular class of estimators in the statistical literature. Two datasets, the primary biliary cirrhosis of the liver data and the non-small-cell lung cancer data, are used for illustration. The proposed method is implemented in the R package “compound.Cox”, available on CRAN.
    Keywords: Cox proportional hazard model; Prediction; Survival analysis
    JEL: C13 C14 C34 C24 C4
    Date: 2012
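The core idea is easy to sketch: fit a separate univariate Cox model to each covariate, then combine the covariates using the fitted coefficients as weights. The code below is a simplified illustration (no ties, no censoring in the simulated data, plain Newton-Raphson), not the authors' R package.

```python
import numpy as np

def cox_univariate(time, event, z, iters=30):
    """Newton-Raphson MLE for a single-covariate Cox partial likelihood
    (assumes no tied event times). Returns the fitted log hazard ratio."""
    order = np.argsort(time)
    d, z = event[order].astype(float), z[order].astype(float)
    b = 0.0
    for _ in range(iters):
        w = np.exp(b * z)
        # Risk-set sums over subjects still at risk (times sorted ascending).
        S0 = np.cumsum(w[::-1])[::-1]
        S1 = np.cumsum((w * z)[::-1])[::-1]
        S2 = np.cumsum((w * z * z)[::-1])[::-1]
        score = np.sum(d * (z - S1 / S0))
        info = np.sum(d * (S2 / S0 - (S1 / S0) ** 2))
        b += score / info
    return b

def compound_covariate_score(time, event, Z):
    """Compound covariate: covariates weighted by univariate Cox fits."""
    betas = np.array([cox_univariate(time, event, Z[:, j])
                      for j in range(Z.shape[1])])
    return Z @ betas

# Simulated example: the hazard rises with z1; z2 is pure noise.
rng = np.random.default_rng(0)
n = 300
Z = rng.standard_normal((n, 2))
time = rng.exponential(np.exp(-Z[:, 0]))    # true log hazard ratio of z1 is 1.0
event = np.ones(n)
b1 = cox_univariate(time, event, Z[:, 0])   # close to 1.0
risk = compound_covariate_score(time, event, Z)
```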
  4. By: Proietti, Tommaso; Luati, Alessandra
    Abstract: The generalised autocovariance function is defined for a stationary stochastic process as the inverse Fourier transform of the power transformation of the spectral density function. Depending on the value of the transformation parameter, this function nests the inverse and the traditional autocovariance functions. A frequency-domain non-parametric estimator based on the power transformation of the pooled periodogram is considered and its asymptotic distribution is derived. The results are employed to construct classes of tests of the white noise hypothesis, to cluster and discriminate between stochastic processes, and to introduce a novel feature-matching estimator of the spectrum.
    Keywords: Stationary Gaussian processes; Non-parametric spectral estimation; White noise tests; Feature matching; Discriminant analysis
    JEL: C14 C22
    Date: 2012–06–06
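The definition is directly computable. The sketch below uses a single raw periodogram rather than the pooled periodogram the paper analyses, so it only illustrates the object itself: the power transformation of the spectrum, inverted back to the "covariance" domain.

```python
import numpy as np

def gacv(x, p):
    """Generalised autocovariance: inverse DFT of the periodogram raised
    to the power p (simplified sketch, no pooling).
    p = 1 recovers the circular sample autocovariance;
    p = 0 yields the white-noise sequence (1, 0, 0, ...)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    I = np.abs(np.fft.fft(x)) ** 2 / n   # periodogram at Fourier frequencies
    return np.fft.ifft(I ** p).real

rng = np.random.default_rng(0)
x = rng.standard_normal(512)
c1 = gacv(x, 1.0)   # ordinary autocovariance; c1[0] is the sample variance
c0 = gacv(x, 0.0)   # flat transformed spectrum -> identity autocovariance
```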
  5. By: Alberto Padilla
    Abstract: The estimation of the sample size is a crucial part of the planning process of a survey. It can be accomplished in different ways, some of which require information that is unavailable or can only be obtained at substantial cost. The sample size can be estimated using the design effect estimator proposed by Kish. This estimator is also used as an efficiency measure for a probability sampling plan and to build confidence intervals. Even though the design effect estimator is widely used in practice, little is known about its statistical properties, and no variance estimators are available. In this paper we show that the design effect estimator is biased, give an expression for an upper bound on the ratio of the bias to the standard error, and propose a method to estimate the variance. With these elements it is possible to improve the precision of the estimators during the planning and estimation stages of a survey, which also allows for better resource allocation at the planning stage.
    Keywords: Ratio estimator, Design effect, Variance of variances, Sample size, Coefficient of variation, Resampling method, Confidence interval.
    JEL: C80 C83
    Date: 2012–12
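For concreteness, the most common textbook form of Kish's design effect, the one due to unequal weighting, is deff = n Σw² / (Σw)² = 1 + cv²(w). This is only one component of the general design effect the paper studies (which also reflects clustering), but it shows the kind of quantity whose bias and variance are at issue.

```python
def kish_design_effect(weights):
    """Kish's design effect due to unequal weighting:
    deff = n * sum(w^2) / (sum(w))^2, i.e. 1 + cv^2 of the weights."""
    n = len(weights)
    return n * sum(w * w for w in weights) / sum(weights) ** 2

def effective_sample_size(weights):
    """n_eff = n / deff: the equal-weight sample size giving the same variance."""
    return len(weights) / kish_design_effect(weights)

deff_equal = kish_design_effect([2.0, 2.0, 2.0, 2.0])   # equal weights -> 1.0
deff_skew = kish_design_effect([1.0, 1.0, 1.0, 3.0])    # 4 * 12 / 36 = 1.333...
```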
  6. By: J. de la Horra Navarro; J. Miguel Marín; M. T. Rodríguez Bernal
    Abstract: Discrete-time models are used in ecology for describing the evolution of an age-structured population. Usually they are considered from a deterministic viewpoint but, in practice, this is not very realistic. The statistical model we propose in this article is a reasonable model for the case in which the evolution of the population is described by means of a projection matrix. In this statistical model, fertility rates and survival rates are unknown parameters, and they are estimated using a Bayesian approach. Standard Bayesian and data cloning methods (based on Bayesian methodology) are applied to real data on the population of Steller sea lions located on the Alaska coast from 1978 to 2004. The estimates obtained from these methods perform well when compared to the actual values.
    Keywords: Population projection matrices, Data cloning, Age-structured population, Leslie matrix, Bayesian MCMC algorithm
    Date: 2013–01
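Data cloning works by raising the likelihood to the K-th power (equivalently, replicating the data K times) so that, as K grows, the posterior collapses onto the maximum likelihood estimate. The conjugate Beta-Binomial toy below shows the mechanism in closed form; it is only an illustration of the principle, not the paper's projection-matrix model.

```python
# Data cloning in a conjugate Beta-Binomial model: s successes in n trials
# with a Beta(a, b) prior. Cloning the data K times gives the posterior
# Beta(a + K*s, b + K*(n - s)); as K grows, the posterior mean tends to
# the MLE s/n and the posterior variance shrinks at rate 1/K.
def cloned_posterior(s, n, K, a=1.0, b=1.0):
    A, B = a + K * s, b + K * (n - s)
    mean = A / (A + B)
    var = A * B / ((A + B) ** 2 * (A + B + 1))
    return mean, var

mean1, var1 = cloned_posterior(s=7, n=10, K=1)       # ordinary posterior
mean100, var100 = cloned_posterior(s=7, n=10, K=100)  # concentrated near 0.7
```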
  7. By: Pedro N. Silva (Instituto Brasileiro de Geografia e Estatística, Rio de Janeiro); John Micklewright (Department of Quantitative Social Science, Institute of Education, University of London); Sylke V. Schnepf (Southampton Statistical Sciences Research Institute and School of Social Sciences, University of Southampton)
    Abstract: Investigating peer effects on pupils' achievement with survey data on samples of schools and pupils within schools may mean that only a random sample of peers is observed for each individual pupil. This generates classical measurement error in the peer variables. Hence, under OLS fitting, the estimated peer-group effects in a regression model are biased towards zero (attenuation). A simple adjustment for this kind of measurement error was proposed by Neidell and Waldfogel (2008). We review the derivation of this adjustment and argue that it is not properly justified.
    Keywords: Peer effects, measurement error, school surveys, sampling variation.
    JEL: C21 C81 I21
    Date: 2012–12–20
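The attenuation mechanism at issue is the classical one: measurement error multiplies the OLS slope by the reliability ratio λ = var(x) / (var(x) + var(e)), and a reliability-ratio adjustment divides the estimate by λ. The simulation below sketches that generic mechanism, not Neidell and Waldfogel's specific formula (which is what the comment disputes).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
x = rng.standard_normal(n)                 # true peer variable
e = rng.standard_normal(n)                 # sampling noise in the peer measure
x_obs = x + e                              # observed, error-ridden peer measure
y = 2.0 * x + rng.standard_normal(n)       # true slope is 2.0

def ols_slope(x, y):
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

naive = ols_slope(x_obs, y)                # attenuated towards zero (~ 1.0 here)
# Reliability ratio; computed from the true x here, which in practice
# is unknown and must itself be estimated.
reliability = np.var(x) / np.var(x_obs)    # ~ 0.5 in this setup
adjusted = naive / reliability             # recovers ~ 2.0
```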
  8. By: Volodymyr Korniichuk
    Abstract: We propose a model for forecasting extreme electricity prices in real-time (high-frequency) settings. The unique feature of our model is its ability to forecast electricity price exceedances over very high thresholds, where only a few (if any) observations are available. The model can also be applied to simulate the times of occurrence and magnitudes of the extreme prices. We employ a copula with a changing dependence parameter to capture serial dependence in the extreme prices, and the censored GPD to model their marginal distributions. For modelling the times of the extreme price occurrences we propose an approach based on a negative binomial distribution. The model is applied to electricity spot prices from Australia's national electricity market.
    Keywords: electricity spot prices, copula, GPD, negative binomial distribution
    JEL: C53 C51 C32
    Date: 2012–12–27
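The peaks-over-threshold building block can be sketched briefly. The paper fits a censored GPD by likelihood methods; the version below is a plain method-of-moments GPD fit to exceedances, shown only to illustrate the threshold-exceedance idea.

```python
import numpy as np

def gpd_moment_fit(exceedances):
    """Method-of-moments fit of the generalised Pareto distribution to
    threshold exceedances (valid for shape xi < 1/2):
        xi = (1 - m^2/s^2) / 2,   sigma = m * (1 - xi),
    with m and s^2 the sample mean and variance of the exceedances."""
    m = np.mean(exceedances)
    s2 = np.var(exceedances, ddof=1)
    xi = 0.5 * (1.0 - m * m / s2)
    sigma = m * (1.0 - xi)
    return xi, sigma

# Simulated spot-price stand-in: exponential data, whose exceedances over
# any threshold are again exponential, i.e. GPD with xi = 0, sigma = 1.
rng = np.random.default_rng(0)
prices = rng.exponential(1.0, size=20_000)
u = np.quantile(prices, 0.95)            # high threshold
exc = prices[prices > u] - u
xi_hat, sigma_hat = gpd_moment_fit(exc)  # near (0, 1)
```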
  9. By: Manabu Asai (Soka University, Japan); Michael McAleer (Erasmus University Rotterdam, The Netherlands; Kyoto University, Japan; Complutense University of Madrid, Spain)
    Abstract: This paper proposes a general asymmetric multifactor Wishart stochastic volatility (AMWSV) diffusion process which accommodates leverage, feedback effects and multiple factors in the covariance process. It gives the closed-form solution for the conditional and unconditional Laplace transforms of the AMWSV models, and suggests estimating the AMWSV model by the generalized method of moments, using information not only on stock prices but also on realized volatilities and co-volatilities. The empirical results for bivariate data on the NASDAQ 100 and S&P 500 indices show that the general AMWSV model is preferred among several nested models.
    Keywords: Multivariate Stochastic Volatility; Wishart Process; Leverage Effects; Feedback Effects; Multifactor Model; Option Pricing
    JEL: C32 C51 G13
    Date: 2013–01–07
  10. By: Kemal Bagzibagli
    Abstract: This paper examines the monetary transmission mechanism in the euro area during the period of the single monetary policy using factor-augmented vector autoregressive (FAVAR) techniques. The contributions of the paper are fourfold. First, a novel dataset consisting of 120 disaggregated macroeconomic time series spanning the period 1999:M1 through 2011:M12 is gathered for the euro area as an aggregate. Second, a Bayesian joint estimation technique for FAVARs is applied to the European data. Third, time variation in the transmission mechanism and the impact of the global financial crisis are investigated in the FAVAR context using a rolling-windows technique. Fourth, we contribute to the question of whether more data are always better for factor analysis and for the estimation of structural FAVAR models. We find that there are considerable gains from the Bayesian technique, such as smoother impulse response functions and statistical significance of the estimates. According to our rolling estimations, consumer prices and monetary aggregates display the most time-variant responses to monetary policy shocks. Under the pre-screening technique considered, eliminating almost half of the dataset does no worse, and in some cases better, in a structural context.
    Keywords: Monetary Policy Shocks, FAVAR, Bayesian Methods, Rolling Windows, Euro Area
    JEL: C11 C32 C33 E5
    Date: 2012–11
  11. By: Saeed Mehraban; Amirhossein Shirazi; Maryam Zamani; Gholamreza Jafari
    Abstract: Recently, the visibility graph has been introduced as a novel way of analyzing a time series by mapping it to a complex network. In this paper, we introduce a new visibility algorithm, "cross-visibility", which reveals the conjugation of two coupled time series. The correspondence between the two time series is mapped to a network, the "cross-visibility graph", to demonstrate the correlation between them. We applied the algorithm to several correlated and uncorrelated time series generated by the linear stationary ARFIMA process. The results demonstrate that the cross-visibility graph associated with correlated time series with power-law auto-correlation is scale-free. If the time series are uncorrelated, the degree distribution of their cross-visibility network deviates from a power law. To clarify the process further, we applied the algorithm to real-world data from the financial trades of two companies and observed significant small-scale coupling in their dynamics.
    Date: 2013–01
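The single-series construction the paper builds on is the natural visibility graph of Lacasa et al.: each time point is a node, and two points are linked if the straight line between them passes above every intermediate point. A minimal sketch of that standard algorithm (the paper's cross-visibility variant, which couples two series, is not reproduced here):

```python
def visibility_graph(series):
    """Natural visibility graph: nodes are time indices; (a, b) is an edge
    if every intermediate point lies strictly below the straight line
    joining (a, series[a]) and (b, series[b])."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[b]
                + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

e_valley = visibility_graph([3.0, 1.0, 3.0])  # the two peaks see each other
e_ramp = visibility_graph([1.0, 2.0, 3.0])    # collinear middle point blocks (0, 2)
```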
  12. By: Virmani, Vineet
    Abstract: The objective function in term structure estimation with price errors is not only non-linear but also non-convex in the parameters. This makes the final results sensitive both to the choice of optimization routine and to the starting guess. This study looks at the impact of the choice of optimization routine on the final parameter estimates for the Svensson model. While results are expected to differ numerically across routines, what is of interest is the economic impact. Using eleven different routines over a range of starting parameter values, we find that while there is significant variation in the final objective function value across routines, for the most part the implied short rates and long rates have low standard deviations. Also, while grid search seems unavoidable, popular quasi-Newton methods allowing for linear constraints seem quite adequate for the task at hand.
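For reference, the function whose parameters the routines compete to estimate is the Svensson zero-coupon yield curve. A sketch of the curve and its two anchoring limits (the parameter values below are illustrative, not estimates from the study):

```python
import math

def svensson_yield(tau, b0, b1, b2, b3, lam1, lam2):
    """Svensson zero-coupon yield at maturity tau. As tau -> 0 the yield
    tends to b0 + b1; as tau -> infinity it tends to b0."""
    g1 = (1.0 - math.exp(-tau / lam1)) / (tau / lam1)
    g2 = g1 - math.exp(-tau / lam1)
    h1 = (1.0 - math.exp(-tau / lam2)) / (tau / lam2)
    h2 = h1 - math.exp(-tau / lam2)
    return b0 + b1 * g1 + b2 * g2 + b3 * h2

# Illustrative parameter values (hypothetical, chosen only for the demo).
params = dict(b0=0.04, b1=-0.02, b2=0.01, b3=0.005, lam1=1.5, lam2=8.0)
short = svensson_yield(0.01, **params)    # close to b0 + b1 = 0.02
long = svensson_yield(100.0, **params)    # close to b0 = 0.04
```

The short-rate and long-rate limits are exactly the "economic" quantities the study reports as stable across optimization routines, even when the fitted parameter vectors differ.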
  13. By: Jos Jansen; Xiaowen Jin; Jasper de Winter
    Abstract: We conduct a systematic comparison of the short-term forecasting abilities of eleven statistical models and professional analysts in a pseudo-real-time setting, using a large set of monthly indicators. Our analysis covers the euro area and its five largest countries over the years 1996-2011. We find that summarizing the available monthly information in a few factors is a more promising forecasting strategy than averaging a large number of indicator-based forecasts. The dynamic and static factor models outperform the other models, especially during the crisis period. Judgmental forecasts by professional analysts often embody valuable information that could be used to enhance forecasts derived from purely mechanical procedures.
    Keywords: nowcasting; professional forecasters; factor model; judgment; forecasting
    JEL: E52 C53 C33
    Date: 2012–12
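The factor-summarizing step favoured by the comparison can be sketched briefly: extract a few principal-components factors from the standardised panel of monthly indicators, then regress GDP growth on them. The toy below shows only the extraction step, on simulated data with one common factor (it is a generic static-factor sketch, not the authors' models).

```python
import numpy as np

def extract_factors(panel, k):
    """Principal-components factor estimates from a standardised panel
    (rows = periods, columns = indicators), up to scale and sign."""
    Z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    return U[:, :k] * S[:k]

# Toy panel: 40 indicators all driven by one common factor plus noise.
rng = np.random.default_rng(0)
T, N = 240, 40
f = rng.standard_normal(T)                       # the common factor
loadings = rng.standard_normal(N)
panel = np.outer(f, loadings) + 0.5 * rng.standard_normal((T, N))

fhat = extract_factors(panel, k=1)[:, 0]
corr = abs(np.corrcoef(fhat, f)[0, 1])           # first PC tracks f closely
```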

This nep-ecm issue is ©2013 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.