nep-ecm New Economics Papers
on Econometrics
Issue of 2012‒07‒08
fourteen papers chosen by
Sune Karlsson
Orebro University

  1. Bayesian Adaptively Updated Hamiltonian Monte Carlo with an Application to High-Dimensional BEKK GARCH Models By Martin Burda; John M. Maheu
  2. Generated Covariates in Nonparametric Estimation: A Short Review. By Enno Mammen; Christoph Rothe; Melanie Schienle
  3. A simple bootstrap method for constructing nonparametric confidence bands for functions By Peter Hall; Joel Horowitz
  4. A Joint Chow Test for Structural Instability By Bent Nielsen; Andrew Whitby
  5. Testing for time-varying fractional cointegration using the bootstrap approach By Simwaka, Kisu
  6. Bayesian Semiparametric Multivariate GARCH Modeling By Mark J. Jensen; John M. Maheu
  7. Reducing bias due to missing values of the response variable by joint modeling with an auxiliary variable By Alfonso Miranda; Sophia Rabe-Hesketh; John W. McDonald
  8. Identification and shape restrictions in nonparametric instrumental variables estimation By Joachim Freyberger; Joel Horowitz
  9. Evaluating alternative frequentist inferential approaches for optimal order quantities in the newsvendor model under exponential demand By Halkos, George; Kevork, Ilias
  10. Maximum likelihood estimation of a stochastic frontier model with residual covariance By Simwaka, Kisu
  11. New Evidence on Linear Regression and Treatment Effect Heterogeneity By Słoczyński, Tymon
  12. Moment Condition Models in Empirical Economics. By [no author]
  13. A p-median problem with distance selection By Stefano Benati; Sergio García
  14. Modeling financial contagion: approach-based on asymmetric cointegration By Lazeni Fofana; Françoise SEYTE

  1. By: Martin Burda (Department of Economics, University of Toronto, Canada; IES, Charles University, Czech Republic); John M. Maheu (Department of Economics, University of Toronto, Canada; RCEA, Italy)
    Abstract: Hamiltonian Monte Carlo (HMC) is a recent statistical procedure to sample from complex distributions. Distant proposal draws are taken in a sequence of steps following the Hamiltonian dynamics of the underlying parameter space, often yielding superior mixing properties of the resulting Markov chain. However, its performance can deteriorate sharply with the degree of irregularity of the underlying likelihood due to its lack of local adaptability in the parameter space. Riemann Manifold HMC (RMHMC), a locally adaptive version of HMC, alleviates this problem, but at a substantially increased computational cost that can become prohibitive in high-dimensional scenarios. In this paper we propose the Adaptively Updated HMC (AUHMC), an alternative inferential method based on HMC that is both fast and locally adaptive, combining the advantages of both HMC and RMHMC. The benefits become more pronounced with higher dimensionality of the parameter space and with the degree of irregularity of the underlying likelihood surface. We show that AUHMC satisfies detailed balance for a valid MCMC scheme and provide a comparison with RMHMC in terms of effective sample size, highlighting substantial efficiency gains of AUHMC. Simulation examples and an application to the BEKK GARCH model show the practical usefulness of the new posterior sampler.
    Keywords: High-dimensional joint sampling, Markov chain Monte Carlo
    JEL: C01 C11 C15 C32
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:46_12&r=ecm
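    As a rough illustration of the baseline algorithm (not the paper's AUHMC), a single standard HMC update with leapfrog integration might look as follows; the function names, step size and number of leapfrog steps are our own illustrative choices, with the log-posterior and its gradient supplied by the user.
      import numpy as np

      def hmc_step(theta, log_post, grad_log_post, eps=0.01, n_leapfrog=20, rng=None):
          # One HMC update: draw a momentum, follow the Hamiltonian dynamics with a
          # leapfrog integrator, then accept or reject (theta is a 1-D float array).
          rng = rng or np.random.default_rng()
          p = rng.standard_normal(theta.shape)              # auxiliary momentum
          theta_new, p_new = theta.copy(), p.copy()
          p_new += 0.5 * eps * grad_log_post(theta_new)     # half step for momentum
          for _ in range(n_leapfrog - 1):
              theta_new += eps * p_new                      # full step for parameters
              p_new += eps * grad_log_post(theta_new)       # full step for momentum
          theta_new += eps * p_new
          p_new += 0.5 * eps * grad_log_post(theta_new)     # final half step
          # Metropolis accept/reject keeps the chain reversible (detailed balance)
          log_accept = (log_post(theta_new) - 0.5 * p_new @ p_new) \
                       - (log_post(theta) - 0.5 * p @ p)
          return theta_new if np.log(rng.uniform()) < log_accept else theta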
  2. By: Enno Mammen; Christoph Rothe; Melanie Schienle
    Abstract: In many applications, covariates are not observed but have to be estimated from data. We outline some regression-type models where such a situation occurs and discuss estimation of the regression function in this context. We review theoretical results on how asymptotic properties of nonparametric estimators differ in the presence of generated covariates from the standard case where all covariates are observed. These results also extend to settings where the focus of interest is on average functionals of the regression function.
    Keywords: Nonparametric estimation, generated covariates
    JEL: C14 C31
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2012-042&r=ecm
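    As a rough illustration of the generated-covariates setting (not the authors' estimators), the control-function example below first estimates the covariate V = X - E[X|Z] nonparametrically and then plugs the estimate into a second-stage regression; the data generating process, kernels and bandwidths are our own choices.
      import numpy as np

      def nw1(grid, x, y, h):
          # univariate Nadaraya-Watson regression with a Gaussian kernel
          k = np.exp(-0.5 * ((np.asarray(grid)[:, None] - x[None, :]) / h) ** 2)
          return (k @ y) / k.sum(axis=1)

      def nw2(x0, v0, x, v, y, hx, hv):
          # bivariate Nadaraya-Watson fit at the single point (x0, v0)
          k = np.exp(-0.5 * (((x0 - x) / hx) ** 2 + ((v0 - v) / hv) ** 2))
          return (k * y).sum() / k.sum()

      rng = np.random.default_rng(0)
      n = 500
      z = rng.normal(size=n)
      v = rng.normal(size=n)
      x = z + v                                   # endogenous regressor
      y = np.sin(x) + v + rng.normal(scale=0.3, size=n)

      v_hat = x - nw1(z, z, x, h=0.3)             # stage 1: generated covariate V-hat
      grid = np.linspace(-2, 2, 21)
      m_hat = [nw2(x0, 0.0, x, v_hat, y, hx=0.4, hv=0.4) for x0 in grid]   # stage 2 fit at V = 0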
  3. By: Peter Hall; Joel Horowitz (Institute for Fiscal Studies and Northwestern University)
    Abstract: Standard approaches to constructing nonparametric confidence bands for functions are frustrated by the impact of bias, which generally is not estimated consistently when using the bootstrap and conventionally smoothed function estimators. To overcome this problem it is common practice to either undersmooth, so as to reduce the impact of bias, or oversmooth, and thereby introduce an explicit or implicit bias estimator. However, these approaches, and others based on nonstandard smoothing methods, complicate the process of inference, for example by requiring the choice of new, unconventional smoothing parameters and, in the case of undersmoothing, producing relatively wide bands. In this paper we suggest a new approach, which exploits to our advantage one of the difficulties that, in the past, has prevented an attractive solution to this problem - the fact that the standard bootstrap bias estimator suffers from relatively high-frequency stochastic error. The high frequency, together with a technique based on quantiles, can be exploited to dampen down the stochastic error term, leading to relatively narrow, simple-to-construct confidence bands.
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:14/12&r=ecm
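    For orientation, the naive residual-bootstrap band for a kernel regression, the kind of construction whose bias problem the paper addresses, can be sketched as follows; the data, bandwidth and number of bootstrap draws are our own illustrative choices, not the authors' method.
      import numpy as np

      def nw(grid, x, y, h):
          # Nadaraya-Watson regression with a Gaussian kernel
          k = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
          return (k @ y) / k.sum(axis=1)

      rng = np.random.default_rng(1)
      n, h, B = 300, 0.15, 500
      x = rng.uniform(0, 1, n)
      y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=n)

      grid = np.linspace(0.05, 0.95, 50)
      fhat = nw(grid, x, y, h)
      fitted = nw(x, x, y, h)
      resid = y - fitted

      boots = np.empty((B, grid.size))
      for b in range(B):
          y_star = fitted + rng.choice(resid, size=n, replace=True)   # residual bootstrap
          boots[b] = nw(grid, x, y_star, h)

      lower, upper = np.percentile(boots, [2.5, 97.5], axis=0)        # pointwise 95% band
      # Note: this naive band ignores the smoothing bias of fhat, which is exactly the
      # difficulty the quantile-based approach in the paper is designed to overcome.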
  4. By: Bent Nielsen (Dept of Economics, University of Oxford); Andrew Whitby (Dept of Economics, University of Oxford)
    Abstract: The classical Chow (1960) test for structural instability requires strictly exogenous regressors and a break-point specified in advance. In this paper we consider two generalisations, the 1-step recursive Chow test (based on the sequence of studentized recursive residuals) and its supremum counterpart, which relax these requirements. We use results on strong consistency of regression estimators to show that the 1-step test is appropriate for stationary, unit root or explosive processes modelled in the autoregressive distributed lags (ADL) framework. We then use results in extreme value theory to develop a new supremum version of the test, suitable for formal testing of structural instability with an unknown break-point. The test assumes normality of errors, and is intended to be used in situations where this can either be assumed or established empirically.
    Date: 2012–06–25
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:1207&r=ecm
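    A minimal sketch of the 1-step recursive (forecast) Chow statistics built from studentized recursive residuals, assuming normal errors, is given below; the function name and the starting index t0 are our own, and this is only the building block for the supremum test developed in the paper.
      import numpy as np
      from scipy import stats

      def one_step_chow(y, X, t0):
          # y: (n,) response, X: (n, k) regressors; t0 > k is the first forecast period
          n, k = X.shape
          stat, pval = np.full(n, np.nan), np.full(n, np.nan)
          for t in range(t0, n):
              Xp, yp = X[:t], y[:t]                        # estimation sample up to t-1
              beta = np.linalg.lstsq(Xp, yp, rcond=None)[0]
              resid = yp - Xp @ beta
              s2 = resid @ resid / (t - k)
              xt = X[t]
              fvar = s2 * (1.0 + xt @ np.linalg.inv(Xp.T @ Xp) @ xt)
              u = (y[t] - xt @ beta) / np.sqrt(fvar)       # studentized recursive residual
              stat[t] = u ** 2                             # 1-step Chow statistic, ~ F(1, t-k) under the null
              pval[t] = 1.0 - stats.f.cdf(stat[t], 1, t - k)
          return stat, pval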
  5. By: Simwaka, Kisu
    Abstract: Fractional cointegration has attracted interest in time series econometrics in recent years (see, among others, Dittmann 2004). The concept of fractional cointegration generalizes the traditional cointegration framework of Engle and Granger (1987) to the long-memory setting. Although cointegration tests have been developed for the traditional framework, these tests do not take fractional cointegration into account. This paper proposes a bootstrap procedure to test for time-varying fractional cointegration.
    Keywords: Time-varying fractional cointegration; bootstrap procedure
    JEL: C52 C15 C22
    Date: 2012–06–26
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:39698&r=ecm
  6. By: Mark J. Jensen (Federal Reserve Bank of Atlanta, USA); John M. Maheu (Department of Economics, University of Toronto, Canada; RCEA, Italy)
    Abstract: This paper proposes a Bayesian nonparametric modeling approach for the return distribution in multivariate GARCH models. In contrast to the parametric literature, the return distribution can display general forms of asymmetry and thick tails. An infinite mixture of multivariate normals is given a flexible Dirichlet process prior. The GARCH functional form enters into each of the components of this mixture. We discuss conjugate methods that allow for scale mixtures and nonconjugate methods which provide mixing over both the location and scale of the normal components. MCMC methods are introduced for posterior simulation and computation of the predictive density. Bayes factors and density forecasts with comparisons to GARCH models with Student-t innovations demonstrate the gains from our flexible modeling approach.
    JEL: C11 C14 C32 C58
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:48_12&r=ecm
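    Schematically, and in our own notation rather than the paper's, an infinite mixture of multivariate normals with a Dirichlet process prior can be written through its stick-breaking representation as
      \epsilon_t \sim \sum_{j=1}^{\infty} \pi_j \, N(\mu_j, \Sigma_j), \qquad \pi_j = v_j \prod_{k<j} (1 - v_k), \quad v_j \sim \mathrm{Beta}(1, \alpha), \quad (\mu_j, \Sigma_j) \sim G_0,
    with the conditional covariance of the returns following a multivariate GARCH recursion that enters each component of the mixture.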
  7. By: Alfonso Miranda (Department of Quantitative Social Science, Institute of Education, University of London. 20 Bedford Way, London WC1H 0AL, UK.); Sophia Rabe-Hesketh (Graduate School of Education and Graduate Group in Biostatistics, University of California, Berkeley, USA. Institute of Education, University of London, London, UK.); John W. McDonald (Department of Quantitative Social Science, Institute of Education, University of London. 20 Bedford Way, London WC1H 0AL, UK.)
    Abstract: In this paper, we consider the problem of missing values of a continuous response variable that cannot be assumed to be missing at random. The example considered here is an analysis of pupils' subjective engagement at school using longitudinal survey data, where the engagement score from wave 3 of the survey is missing due to a combination of attrition and item non-response. If less engaged students are more likely to drop out and less likely to respond to questions regarding their engagement, then missingness is not ignorable and can lead to inconsistent estimates. We suggest alleviating this problem by modelling the response variable jointly with an auxiliary variable that is correlated with the response variable and not subject to non-response. Such auxiliary variables can be found in administrative data, in our example, the National Pupil Database containing test scores from national achievement tests. We estimate a joint model for engagement and achievement to reduce the bias due to missing values of engagement. A Monte Carlo study is performed to compare our proposed multivariate response approach with alternative approaches such as the Heckman selection model and inverse probability of selection weighting.
    Keywords: Auxiliary variable, joint model, multivariate regression, not missing at random, sample selection bias, seemingly-unrelated regressions, selection model, SUR
    JEL: C13 C33 I21
    Date: 2012–06–29
    URL: http://d.repec.org/n?u=RePEc:qss:dqsswp:1205&r=ecm
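    A highly simplified sketch of the idea, not the authors' model: treat the response y1 (engagement) and the fully observed auxiliary variable y2 (achievement) as a bivariate normal regression and let cases with missing y1 contribute only the marginal likelihood of y2; all names and the parameterisation below are our own.
      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import multivariate_normal, norm

      def negloglik(par, X, y1, y2, miss):
          # par = (beta1, beta2, log sigma1, log sigma2, atanh rho); miss flags missing y1
          k = X.shape[1]
          b1, b2 = par[:k], par[k:2 * k]
          s1, s2 = np.exp(par[2 * k]), np.exp(par[2 * k + 1])
          rho = np.tanh(par[2 * k + 2])
          m1, m2 = X @ b1, X @ b2
          cov = np.array([[s1 ** 2, rho * s1 * s2], [rho * s1 * s2, s2 ** 2]])
          obs = ~miss
          ll = multivariate_normal.logpdf(np.c_[y1[obs] - m1[obs], y2[obs] - m2[obs]],
                                          mean=[0.0, 0.0], cov=cov).sum()
          ll += norm.logpdf(y2[miss], loc=m2[miss], scale=s2).sum()   # missing y1 integrated out
          return -ll

      # usage sketch: minimize(negloglik, np.zeros(2 * X.shape[1] + 3), args=(X, y1, y2, miss))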
  8. By: Joachim Freyberger; Joel Horowitz (Institute for Fiscal Studies and Northwestern University)
    Abstract: This paper is concerned with inference about an unidentified linear functional, L(g), where the function g satisfies the relation Y = g(X) + U; E(U | W) = 0. In this relation, Y is the dependent variable, X is a possibly endogenous explanatory variable, W is an instrument for X, and U is an unobserved random variable. The data are an independent random sample of (Y,X,W). In much applied research, X and W are discrete, and W has fewer points of support than X. Consequently, neither g nor L(g) is nonparametrically identified. Indeed, L(g) can have any value in (-∞, ∞). In applied research, this problem is typically overcome and point identification is achieved by assuming that g is a linear function of X. However, the assumption of linearity is arbitrary. It is untestable if W is binary, as is the case in many applications. This paper explores the use of shape restrictions, such as monotonicity or convexity, for achieving interval identification of L(g). Economic theory often provides such shape restrictions. This paper shows that they restrict L(g) to an interval whose upper and lower bounds can be obtained by solving linear programming problems. Inference about the identified interval and the functional L(g) can be carried out by using the bootstrap. An empirical application illustrates the usefulness of shape restrictions for carrying out nonparametric inference about L(g).
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:15/12&r=ecm
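    To make the linear-programming construction concrete with made-up numbers (ours, not the paper's application): suppose X has four support points, W is binary, g is restricted to be monotone, and the functional of interest is L(g) = g(x4) - g(x1). The bounds of the identified interval are the optimal values of two linear programs; with the hypothetical inputs below the interval works out to [1, 4/3].
      import numpy as np
      from scipy.optimize import linprog

      # p[k, j] = P(X = x_j | W = w_k), m[k] = E[Y | W = w_k]  (hypothetical numbers)
      p = np.array([[0.4, 0.3, 0.2, 0.1],
                    [0.1, 0.2, 0.3, 0.4]])
      m = np.array([1.3, 1.7])

      c = np.array([-1.0, 0.0, 0.0, 1.0])           # L(g) = g4 - g1

      # monotonicity g1 <= g2 <= g3 <= g4, written as A_ub @ g <= 0
      A_ub = np.array([[1.0, -1.0, 0.0, 0.0],
                       [0.0, 1.0, -1.0, 0.0],
                       [0.0, 0.0, 1.0, -1.0]])
      b_ub = np.zeros(3)
      bounds = [(-10, 10)] * 4                      # box keeps the programs bounded

      lo = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=p, b_eq=m, bounds=bounds)
      hi = linprog(-c, A_ub=A_ub, b_ub=b_ub, A_eq=p, b_eq=m, bounds=bounds)
      print(lo.fun, -hi.fun)                        # lower and upper bound of the interval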
  9. By: Halkos, George; Kevork, Ilias
    Abstract: Three estimation policies for the optimal order quantity of the classical newsvendor model under exponential demand are evaluated in this paper. Under the first estimation policy, the estimator is obtained by replacing the parameter of the exponential distribution in the theoretical formula for the optimal order quantity with its maximum likelihood estimator. The estimator of the second estimation policy is derived so as to ensure that the requested critical fractile is attained. For the third estimation policy, the estimator is obtained by maximizing the a-priori expected profit with respect to a constant included in the form of the estimator. Three statistical measures are used for the evaluation: the actual critical fractile attained by each estimator, the mean square error, and the range of deviation of estimates from the optimal order quantity when the probability of falling in that range is the same for the three estimation policies. The behavior of these measures is explored under different combinations of sample sizes and critical fractiles. With small sample sizes, no estimation policy dominates the others: the estimator that attains the actual critical fractile closest to the requested one also has the largest mean square error and the largest range of deviation of estimates from the optimal order quantity. With samples of more than 40 observations, by contrast, the choice is restricted to the estimators of the first and third estimation policies, and to facilitate this choice we provide, for different sample sizes, the values of the critical fractile that determine which estimation policy should be applied.
    Keywords: Classical newsvendor model; Exponential distribution; Demand estimation; Actual critical fractile; Mean square error of estimators
    JEL: C13 M11 C44 D24
    Date: 2012–06–25
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:39650&r=ecm
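    A minimal sketch of the first estimation policy and of the "actual critical fractile" measure, with our own illustrative numbers: under exponential demand with mean theta, the optimal order quantity at critical fractile cf is q* = theta * ln(1/(1-cf)), and the plug-in estimator replaces theta by the sample mean.
      import numpy as np

      theta, cf, n, reps = 10.0, 0.90, 20, 20000        # true mean demand, target fractile
      q_opt = theta * np.log(1.0 / (1.0 - cf))          # optimal order quantity

      rng = np.random.default_rng(2)
      attained = np.empty(reps)
      for r in range(reps):
          sample = rng.exponential(theta, size=n)
          q_hat = sample.mean() * np.log(1.0 / (1.0 - cf))   # MLE plug-in estimator
          attained[r] = 1.0 - np.exp(-q_hat / theta)         # P(D <= q_hat) actually achieved
      print(q_opt, attained.mean())   # the actual fractile differs from the requested cf in finite samples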
  10. By: Simwaka, Kisu
    Abstract: In the theoretical literature on productivity, the disturbance terms of the stochastic frontier model are assumed to be independent random variables. In this paper, we consider a stochastic production frontier model with residuals that are both spatially and time-wise correlated. We introduce generalizations of the Maximum Likelihood Estimation procedure suggested in Cliff and Ord (1973) and Kapoor (2003). We assume the usual error component specification, but allow for possible correlation between individual-specific error components. The model combines specifications usually considered in the spatial literature with those in the error components literature. Our specifications are such that the model’s disturbances are potentially spatially correlated due to geographical proximity or economic activity. For instance, for agricultural farmers, spatial correlations can represent productivity shock spillovers, based on geographical proximity and weather. These spillovers affect the estimation of efficiency.
    Keywords: spatial stochastic production frontier models; correlated errors
    JEL: C23 C24 C21
    Date: 2012–06–27
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:39726&r=ecm
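    In our own notation (not necessarily the paper's), the class of models described combines a stochastic frontier with a spatially autoregressive error, for example
      y_{it} = x_{it}'\beta + v_{it} - u_{it}, \qquad v_t = \rho W v_t + \varepsilon_t, \quad \varepsilon_t \sim N(0, \sigma_\varepsilon^2 I_n), \quad u_{it} \ge 0,
    so that v_t = (I_n - \rho W)^{-1} \varepsilon_t is spatially correlated across units through the weights matrix W, and the likelihood is built from the distribution of the composed error v_{it} - u_{it}.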
  11. By: Słoczyński, Tymon
    Abstract: In this paper I provide new evidence on the implications of treatment effect heterogeneity for least squares estimation when the effects are inappropriately assumed to be homogenous. I prove that under a set of benchmark assumptions linear regression provides a consistent estimator of the population average treatment effect on the treated times the population proportion of the nontreated individuals plus the population average treatment effect on the nontreated times the population proportion of the treated individuals. Consequently, in many empirical applications the linear regression estimates might not be close to any of the standard average treatment effects of interest.
    Keywords: treatment effects; linear regression; ordinary least squares; decomposition methods
    JEL: C31 C21 C01
    Date: 2012–06–27
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:39524&r=ecm
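    In symbols (our notation), the result described in the abstract is that, under the benchmark assumptions, the linear regression estimand satisfies
      \tau_{OLS} = (1 - \pi)\,\tau_{ATT} + \pi\,\tau_{ATU}, \qquad \pi = P(D = 1),
    so each group's average effect is weighted by the other group's population share, which is why the estimate need not be close to the ATE, the ATT or the ATU.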
  12. By: [no author]
    Abstract: In the first chapter of this dissertation, we approach the estimation of dynamic stochastic general equilibrium models through a moments-based estimator, the empirical likelihood. We try to show that this inference process can be a valid alternative to maximum likelihood. The empirical likelihood estimator only requires knowledge about the moments of the data generating process of the model. In this context, we exploit the fact that these economies can be formulated as a set of moment conditions to conduct inference on their parameters through this technique. For illustrative purposes, we consider the standard real business cycle model with a constant relative risk aversion utility function and indivisible labour, driven by a normal technology shock. In the second chapter, we explore further aspects of the estimation of dynamic stochastic general equilibrium models using the empirical likelihood family of estimators. In particular, we propose possible ways of tackling the main problems identified in the first chapter. These problems amount to: (i) the possible existence of dependence between the random variables; (ii) the definition of moment conditions in the dynamic stochastic general equilibrium model setup; (iii) the alternatives to the data generating process used in the first chapter. In the third chapter, we investigate the short run effects of macroeconomic and fiscal volatility on the decision of the policy maker on how much to consume and how much to invest. To that end, we analyse a panel of 10 EU countries during 1991-2007. Our results suggest that increases in the volatility of regularly collected and cyclical revenues such as the VAT and income taxes tend to tilt the expenditure composition in favour of public investment. In contrast, increases in the volatility of ad hoc-type taxes such as capital taxes tend to favour public consumption spending, albeit only a little.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ner:euiflo:urn:hdl:1814/22454&r=ecm
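    As a toy illustration of moment-condition inference by empirical likelihood (far simpler than the DSGE setting of the dissertation; the data and all names are our own), the profile empirical log-likelihood ratio for the single moment condition E[x - theta] = 0 can be computed and maximised as follows.
      import numpy as np
      from scipy.optimize import brentq, minimize_scalar

      def el_logratio(theta, x):
          # profile empirical log-likelihood ratio sum_i log(n * p_i) at a candidate theta
          g = x - theta
          def foc(lam):                              # first-order condition for the multiplier
              return np.sum(g / (1.0 + lam * g))
          # lambda must keep every implied weight positive: 1 + lam * g_i > 0
          lo = (-1.0 / g.max()) + 1e-8 if g.max() > 0 else -10.0
          hi = (-1.0 / g.min()) - 1e-8 if g.min() < 0 else 10.0
          lam = brentq(foc, lo, hi)
          return -np.sum(np.log(1.0 + lam * g))

      rng = np.random.default_rng(3)
      x = rng.exponential(2.0, size=200)
      res = minimize_scalar(lambda t: -el_logratio(t, x),
                            bounds=(np.quantile(x, 0.05), np.quantile(x, 0.95)),
                            method="bounded")
      print(res.x, x.mean())   # with one just-identified moment the EL estimate is the sample mean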
  13. By: Stefano Benati; Sergio García
    Abstract: This paper introduces an extension of the p-median problem and its application to clustering, in which the distance/dissimilarity function between units is calculated as the distance sum on the q most important variables. These variables are to be chosen from a set of m elements, so a new combinatorial feature is added to the problem, which we call the p-median model with distance selection. This problem has its origin in cluster analysis, often applied to sociological surveys, where it is common practice for a researcher to select the q statistical variables they predict will be the most important in discriminating the statistical units before applying the clustering algorithm. Here we show how this selection can be formulated as a non-linear mixed integer optimization model, and we show how this model can be linearized in several different ways. These linearizations are compared in a computational study, and the results show that the radius formulation of the p-median is the most efficient model for solving this problem.
    Keywords: p-median problem, Distance selection, Radius formulation
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws121913&r=ecm
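    Schematically, in our own notation, the p-median problem with distance selection can be written as the nonlinear binary program
      \min_{x,y,z} \sum_i \sum_j x_{ij} \sum_{l=1}^{m} z_l d_{ijl}
      \text{s.t. } \sum_j x_{ij} = 1 \ \forall i, \quad x_{ij} \le y_j, \quad \sum_j y_j = p, \quad \sum_l z_l = q, \quad x_{ij}, y_j, z_l \in \{0,1\},
    where d_{ijl} is the dissimilarity between units i and j on variable l, y_j indicates that unit j is chosen as a median, x_{ij} assigns unit i to median j, and z_l selects variable l; the products x_{ij} z_l in the objective are what the paper linearises in several different ways.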
  14. By: Lazeni Fofana; Françoise SEYTE
    Abstract: We analyze financial contagion using an approach based on cointegration with asymmetric TAR and M-TAR adjustment. To capture the contagion effect, we allow for regime change in the adjustment of the error correction term: Threshold Autoregressive (TAR) and Momentum Threshold Autoregressive (M-TAR) specifications are introduced into the adjustment mechanism of the error correction model, under the assumption that the error term exhibits self-exciting jumps. Our empirical study uses four market indices, the CAC40, the FTSE 100, the S&P500 and the NIKKEI225, to study the mechanism of shock propagation during the 2007 crisis. The results show transmission of shocks by pure contagion from the S&P500 to the FTSE100 and the CAC40. In contrast, we find that shocks are transmitted through interdependence from the S&P500 to the NIKKEI225.
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:lam:wpaper:12-21&r=ecm
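    A minimal sketch of the regime-dependent adjustment step in the spirit of TAR and M-TAR cointegration tests (our own code and names, applied to the residuals u from a cointegrating regression of one index on another; not the authors' full procedure).
      import numpy as np

      def threshold_adjustment(u, momentum=False, tau=0.0):
          # Regime-dependent adjustment of the equilibrium error u_t:
          #   du_t = rho1 * I_t * u_{t-1} + rho2 * (1 - I_t) * u_{t-1} + e_t,
          # with I_t based on u_{t-1} (TAR) or on du_{t-1} (M-TAR, momentum=True).
          du = np.diff(u)
          lag = u[:-1]
          dlag = np.concatenate([[0.0], du[:-1]])        # lagged change, first entry padded
          ind = (dlag >= tau) if momentum else (lag >= tau)
          X = np.column_stack([ind * lag, (1 - ind) * lag])
          rho, *_ = np.linalg.lstsq(X, du, rcond=None)
          return rho                                     # (rho1, rho2): adjustment above/below the threshold

      # usage sketch: u = OLS residuals from regressing, say, the CAC40 on the S&P500;
      # error correction requires rho1, rho2 < 0, and rho1 != rho2 indicates asymmetric adjustment.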

This nep-ecm issue is ©2012 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.