nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒03‒22
fourteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Beyond location and dispersion models: The Generalized Structural Time Series Model with Applications By Djennad, Abdelmajid; Rigby, Robert; Stasinopoulos, Dimitrios; Voudouris, Vlasios; Eilers, Paul
  2. MSE Performance of the Weighted Average Estimators Consisting of Shrinkage Estimators By Akio Namba; Kazuhiro Ohtani
  3. Pitfalls of Estimating the Marginal Likelihood Using the Modified Harmonic Mean By Joshua C.C. Chan; Angelia L. Grant
  4. Outlier robust small area estimation under spatial correlation By Schmid, Timo; Tzavidis, Nikos; Münnich, Ralf; Chambers, Ray
  5. The Stochastic Volatility in Mean Model with Time-Varying Parameters: An Application to Inflation Modeling By Joshua C.C. Chan
  6. Estimating the density of ethnic minorities and aged people in Berlin: Multivariate kernel density estimation applied to sensitive geo-referenced administrative data protected via measurement error By Groß, Marcus; Rendtel, Ulrich; Schmid, Timo; Schmon, Sebastian; Tzavidis, Nikos
  7. Pooling data across markets in dynamic Markov games By Taisuke Otsu; Martin Pesendorfer; Yuya Takahashi
  8. GTL Regression: A Linear Model with Skewed and Thick-Tailed Disturbances By Vijverberg, Wim P.; Hasebe, Takuya
  9. The Econometrics Approach to the Measurement of Efficiency: A Survey By Martín Rossi
  10. Forecasting U.S. Recessions with a Large Set of Predictors By Fornaro, Paolo
  11. Short-term forecasting with mixed-frequency data: A MIDASSO approach By Boriss Siliverstovs
  12. Principal Components Analysis for Semi-Martingales and Stochastic PDE By Alberto Ohashi; Alexandre B Simas
  13. Who should be Treated? Empirical Welfare Maximization Methods for Treatment Choice By Toru Kitagawa; Aleksey Tetenov
  14. Large sample properties of an optimization-based matching estimator By Roberto Cominetti; Juan Diaz; Jorge Rivera

  1. By: Djennad, Abdelmajid; Rigby, Robert; Stasinopoulos, Dimitrios; Voudouris, Vlasios; Eilers, Paul
    Abstract: In many settings of empirical interest, time variation in the distribution parameters is important for capturing the dynamic behaviour of time series processes. Although the fitting of heavy tail distributions has become easier due to computational advances, the joint and explicit modelling of time-varying conditional skewness and kurtosis is a challenging task. We propose a class of parameter-driven time series models referred to as the generalized structural time series (GEST) model. The GEST model extends Gaussian structural time series models by a) allowing the distribution of the dependent variable to come from any parametric distribution, including highly skewed and kurtotic distributions (and mixed distributions) and b) expanding the systematic part of parameter-driven time series models to allow the joint and explicit modelling of all the distribution parameters as structural terms and (smoothed) functions of independent variables. The paper makes an applied contribution in the development of a fast local estimation algorithm for the evaluation of a penalised likelihood function to update the distribution parameters over time without the need for evaluation of a high-dimensional integral based on simulation methods.
    Keywords: non-Gaussian parameter-driven time series, fast local estimation algorithm, time-varying skewness, time-varying kurtosis
    JEL: C14 C46 C53 C58 G17
    Date: 2015–03–12
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:62807&r=ecm
  2. By: Akio Namba (Graduate School of Economics, Kobe University); Kazuhiro Ohtani (Graduate School of Economics, Kobe University)
    Abstract: In this paper we consider a regression model and propose estimators that are weighted averages of two estimators among three: the Stein-rule (SR), minimum mean squared error (MMSE) and adjusted minimum mean squared error (AMMSE) estimators. We derive the formula for the mean squared error (MSE) of the proposed estimators. Numerical evaluations show that one of the proposed estimators has smaller MSE than the positive-part Stein-rule (PSR) estimator over a moderate region of the parameter space when the number of regression coefficients is small (i.e., 3). Its MSE performance is also comparable to that of the PSR estimator even when the number of regression coefficients is not so small.
    Keywords: Mean squared error, Stein-rule estimator, Minimum mean squared error estimator, Adjusted minimum mean squared error estimator, weighted average estimator
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:koe:wpaper:1513&r=ecm
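As a rough illustration of the shrinkage idea behind the estimators in item 2 (not the authors' code; this is the classic positive-part James–Stein estimator for a normal-means problem, the simplest analogue of the positive-part Stein-rule, with hypothetical parameter values), a small Monte Carlo shows the dominance over the maximum likelihood estimator in total MSE:

```python
import numpy as np

rng = np.random.default_rng(1)
k, reps = 10, 5000
theta = np.full(k, 0.5)  # hypothetical true mean vector

mse_mle = 0.0
mse_js = 0.0
for _ in range(reps):
    x = rng.normal(theta, 1.0)                  # x ~ N(theta, I_k), sigma^2 = 1
    shrink = max(0.0, 1.0 - (k - 2) / (x @ x))  # positive-part shrinkage factor
    mse_mle += ((x - theta) ** 2).sum()
    mse_js += ((shrink * x - theta) ** 2).sum()

print(mse_js / reps, mse_mle / reps)  # JS+ risk is below the MLE risk (= k) for k >= 3
```

The gain is largest when the true coefficient vector is close to the shrinkage target, which mirrors the "moderate region of the parameter space" finding in the abstract.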
  3. By: Joshua C.C. Chan; Angelia L. Grant
    Abstract: The modified harmonic mean is widely used for estimating the marginal likelihood. We investigate the empirical performance of two versions of this estimator: one based on the observed-data likelihood and the other on the complete-data likelihood. Through an empirical example using US and UK inflation, we show that the version based on the complete-data likelihood has a substantial finite sample bias and tends to select the wrong model, whereas the version based on the observed-data likelihood works well.
    Keywords: Bayesian model comparison, state space, unobserved components, inflation
    JEL: C11 C15 C32 C52
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:een:camaaa:2015-08&r=ecm
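To make the estimator in item 3 concrete, here is a minimal sketch (a toy conjugate normal model, not the authors' setup; all values hypothetical) of the modified harmonic mean: the reciprocal marginal likelihood is estimated as the posterior average of f(theta) / [p(y|theta) p(theta)] for a weighting density f concentrated on the posterior. In this toy model the posterior is known exactly, so using it as f recovers the analytic answer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y_i ~ N(theta, 1), i = 1..n, with prior theta ~ N(0, 1).
n = 50
y = rng.normal(0.3, 1.0, size=n)

# Conjugate posterior: theta | y ~ N(m, v).
v = 1.0 / (n + 1.0)
m = v * y.sum()

def log_norm_pdf(x, mu, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def log_lik(theta):
    return log_norm_pdf(y, theta, 1.0).sum()

# Analytic log marginal likelihood via the identity
# log p(y) = log p(y|theta) + log p(theta) - log p(theta|y), at theta = m.
log_ml_exact = log_lik(m) + log_norm_pdf(m, 0.0, 1.0) - log_norm_pdf(m, m, v)

# Modified harmonic mean over posterior draws theta_s:
# 1/p(y) ~ (1/S) * sum_s f(theta_s) / [p(y|theta_s) p(theta_s)].
S = 5000
draws = rng.normal(m, np.sqrt(v), size=S)
log_w = (log_norm_pdf(draws, m, v)                    # f: the (known) posterior
         - np.array([log_lik(t) for t in draws])
         - log_norm_pdf(draws, 0.0, 1.0))
log_ml_mhm = -(np.logaddexp.reduce(log_w) - np.log(S))

print(log_ml_exact, log_ml_mhm)
```

The pitfall discussed in the abstract arises in latent-variable models, where replacing the observed-data likelihood p(y|theta) above with the complete-data likelihood changes the estimand's finite-sample behaviour.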
  4. By: Schmid, Timo; Tzavidis, Nikos; Münnich, Ralf; Chambers, Ray
    Abstract: Modern systems of official statistics require the estimation and publication of business statistics for disaggregated domains, for example, industry domains and geographical regions. Outlier robust methods have proven to be useful for small area estimation. Recently proposed outlier robust modelbased small area methods assume, however, uncorrelated random effects. Spatial dependencies, resulting from similar industry domains or geographic regions, often occur. In this paper we propose outlier robust small area methodology that allows for the presence of spatial correlation in the data. In particular, we present a robust predictive methodology that incorporates the potential spatial impact from other areas (domains) on the small area (domain) of interest. We further propose two parametric bootstrap methods for estimating the mean-squared error. Simulations indicate that the proposed methodology may lead to efficiency gains. The paper concludes with an illustrative application by using business data for estimating average labour costs in Italian provinces.
    Keywords: bias correction, projective and predictive estimators, spatial correlation, business surveys
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:zbw:fubsbe:20158&r=ecm
  5. By: Joshua C.C. Chan
    Abstract: This paper generalizes the popular stochastic volatility in mean model of Koopman and Hol Uspensky (2002) to allow for time-varying parameters in the conditional mean. The estimation of this extension is nontrivial since the volatility appears in both the conditional mean and the conditional variance, and its coefficient in the former is time-varying. We develop an efficient Markov chain Monte Carlo algorithm based on band and sparse matrix algorithms instead of the Kalman filter to estimate this more general variant. We illustrate the methodology with an application that involves US, UK and German inflation. The estimation results show substantial time-variation in the coefficient associated with the volatility, highlighting the empirical relevance of the proposed extension. Moreover, in a pseudo out-of-sample forecasting exercise, the proposed variant also forecasts better than various standard benchmarks.
    Keywords: nonlinear, state space, inflation forecasting, inflation uncertainty
    JEL: C11 C15 C53 C58 E31
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:een:camaaa:2015-07&r=ecm
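To fix notation for item 5 (the parameterization below is assumed for illustration, not taken from the paper), a minimal simulation of a stochastic-volatility-in-mean process with random-walk time-varying parameters might look like:

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed form: y_t = mu_t + alpha_t * exp(h_t) + exp(h_t / 2) * eps_t,
# where exp(h_t) is the variance, h_t follows an AR(1), and the
# conditional-mean parameters mu_t, alpha_t follow random walks.
T, phi, sig_h = 500, 0.95, 0.2
h = np.zeros(T)
mu = np.zeros(T)
alpha = np.zeros(T)
for t in range(1, T):
    h[t] = phi * h[t - 1] + sig_h * rng.normal()   # log-volatility AR(1)
    mu[t] = mu[t - 1] + 0.02 * rng.normal()        # time-varying intercept
    alpha[t] = alpha[t - 1] + 0.02 * rng.normal()  # time-varying vol-in-mean coefficient

y = mu + alpha * np.exp(h) + np.exp(h / 2) * rng.normal(size=T)
```

The estimation difficulty flagged in the abstract is visible here: h enters both the mean (through alpha_t * exp(h_t)) and the variance, so the model is nonlinear in the latent state.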
  6. By: Groß, Marcus; Rendtel, Ulrich; Schmid, Timo; Schmon, Sebastian; Tzavidis, Nikos
    Abstract: Modern systems of official statistics require the timely estimation of area-specific densities of sub-populations. Ideally, estimates should be based on precise geo-coded information, which is not available due to confidentiality constraints. One approach to ensuring confidentiality is to round the geo-coordinates. We propose multivariate non-parametric kernel density estimation that reverses the rounding process by using a Bayesian measurement error model. The methodology is applied to the Berlin register of residents for deriving density estimates of ethnic minorities and aged people. Estimates are used for identifying areas with a need for new advisory centres for migrants and infrastructure for older people.
    Keywords: ageing, binned data, ethnic segregation, non-parametric estimation, official statistics
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:zbw:fubsbe:20157&r=ecm
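A rough 1-D sketch of the de-rounding idea in item 6 (all settings hypothetical; the paper works with 2-D geo-coordinates and a full Bayesian measurement error model): alternate between a kernel density estimate and redrawing each rounded observation within its rounding interval in proportion to that estimate:

```python
import numpy as np

rng = np.random.default_rng(2)

# True (confidential) 1-D locations; only values rounded to width-1 bins are released.
x_true = rng.normal(5.0, 1.0, size=2000)
width = 1.0
x_obs = np.floor(x_true / width) * width + width / 2   # bin midpoints

def gauss_kde(pts, grid, bw):
    z = (grid[:, None] - pts[None, :]) / bw
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(pts) * bw * np.sqrt(2 * np.pi))

grid = np.linspace(0.0, 10.0, 201)
bw = 0.4
x_cur = x_obs.copy()
for _ in range(20):
    dens = gauss_kde(x_cur, grid, bw)
    # Redraw each point inside its rounding interval, with probability
    # proportional to the current density estimate (an SEM-type step).
    cand = (x_obs - width / 2)[:, None] + width * rng.random((len(x_obs), 10))
    w = np.interp(cand, grid, dens)
    w /= w.sum(axis=1, keepdims=True)
    pick = (w.cumsum(axis=1) > rng.random((len(x_obs), 1))).argmax(axis=1)
    x_cur = cand[np.arange(len(x_obs)), pick]

dens = gauss_kde(x_cur, grid, bw)
print(grid[dens.argmax()])   # mode recovered near the true centre of 5.0
```

The redrawn points smooth out the artificial spikes at bin midpoints that a naive KDE of the rounded data would produce.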
  7. By: Taisuke Otsu; Martin Pesendorfer; Yuya Takahashi
    Abstract: This paper proposes several statistical tests for finite state Markov games to examine the null hypothesis that data from distinct markets can be pooled. We formulate tests of (i) the conditional choice and state transition probabilities, (ii) the steady-state distribution, and (iii) the conditional state distribution given an initial state. If the null cannot be rejected, then the data across markets can be pooled. A rejection of the null implies that the data cannot be pooled across markets. In a Monte Carlo study we find that the test based on the steady-state distribution performs well and has high power even with small numbers of markets and time periods. We apply the tests to the empirical study of Ryan (2012) that analyzes dynamics of the U.S. Portland Cement industry and assess whether the single-equilibrium assumption is supported by the data.
    Keywords: Dynamic Markov game, Poolability, Multiplicity of equilibria, Hypothesis testing
    JEL: C12 C72 D44
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2015/582&r=ecm
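A simplified sketch in the spirit of test (i) from item 7 (not the paper's statistic; choice probabilities and sample sizes below are hypothetical): compare conditional choice frequencies across two markets, state by state, with a Pearson chi-square statistic:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical common conditional choice probabilities (CCPs) under the null:
# 4 states, 3 actions, identical across the two markets.
p = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.4, 0.4, 0.2],
              [0.3, 0.3, 0.4]])
n_obs = 200
countsA = np.vstack([rng.multinomial(n_obs, row) for row in p])
countsB = np.vstack([rng.multinomial(n_obs, row) for row in p])

stat = 0.0
for a, b in zip(countsA, countsB):
    pooled = (a + b) / (2 * n_obs)
    for c in (a, b):
        expected = n_obs * pooled
        stat += ((c - expected) ** 2 / expected).sum()

# Under the null, stat is asymptotically chi-square with
# states * (actions - 1) = 4 * 2 = 8 degrees of freedom.
print(stat)
```

If the statistic exceeds the chi-square critical value, pooling the two markets (e.g., under a single-equilibrium assumption) is rejected.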
  8. By: Vijverberg, Wim P. (CUNY Graduate Center); Hasebe, Takuya (Sophia University)
    Abstract: If the disturbances of a linear regression model are skewed and/or thick-tailed, a maximum likelihood estimator is efficient relative to the customary Ordinary Least Squares (OLS) estimator. In this paper, we specify a highly flexible Generalized Tukey Lambda (GTL) distribution to model skewed and thick-tailed disturbances. The GTL-regression estimator is consistent and asymptotically normal. We demonstrate the potential gains of the GTL estimator over the OLS estimator in a Monte Carlo study and in five applications that are typical of applied economics research problems: log-wage equations, hedonic housing price equations, an analysis of speeding tickets, the issue of trade creation and trade diversion that result from preferential trade agreements, and the familiar CAPM model in financial economics.
    Keywords: linear regression, robust estimation, Generalized Tukey Lambda distribution
    JEL: C16 C21
    Date: 2015–02
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp8898&r=ecm
  9. By: Martín Rossi (Department of Economics, Universidad de San Andres)
    Abstract: I present a survey on the econometric approach to the measurement of efficiency, focusing on the models used in empirical applications. I describe both models for cross sectional data and models for panel data. Finally, I survey the recent literature on models with time varying technical efficiency.
    Keywords: stochastic frontiers, productivity, technical change
    JEL: O3
    Date: 2015–02
    URL: http://d.repec.org/n?u=RePEc:sad:wpaper:117&r=ecm
  10. By: Fornaro, Paolo
    Abstract: In this paper, I use a large set of macroeconomic and financial predictors to forecast U.S. recession periods. I adopt Bayesian methodology with shrinkage in the parameters of the probit model for the binary time series tracking the state of the economy. The in-sample and out-of-sample results show that utilizing a large cross-section of indicators yields superior U.S. recession forecasts in comparison to a number of parsimonious benchmark models. Moreover, data-rich models with shrinkage manage to beat the forecasts obtained with the factor-augmented probit model employed in past research.
    Keywords: Bayesian shrinkage, Business Cycles, Probit model, large cross-sections
    JEL: C11 C25 E32 E37
    Date: 2015–03–15
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:62973&r=ecm
  11. By: Boriss Siliverstovs (KOF Swiss Economic Institute, ETH Zurich, Switzerland)
    Abstract: In this paper we extend the targeted-regressor approach suggested in Bai and Ng (2008) for variables sampled at the same frequency to mixed-frequency data. Our MIDASSO approach is a combination of the unrestricted MIxed-frequency DAta-Sampling approach (U-MIDAS) (see Foroni et al., 2015; Castle et al., 2009; Bec and Mogliani, 2013), and the LASSO-type penalised regression used in Bai and Ng (2008), called the elastic net (Zou and Hastie, 2005). We illustrate our approach by forecasting the quarterly real GDP growth rate in Switzerland.
    Keywords: LASSO, Switzerland, Forecasting, Real-time data, MIDAS
    JEL: C22 C53
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:kof:wpskof:15-375&r=ecm
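The two ingredients of the MIDASSO approach in item 11 can be sketched as follows (hypothetical simulated data; plain LASSO via coordinate descent stands in for the elastic net actually used in the paper): U-MIDAS enters a monthly indicator into a quarterly regression through one unrestricted column per within-quarter month, and the penalised regression then selects among those columns:

```python
import numpy as np

rng = np.random.default_rng(3)

# U-MIDAS design: one column per within-quarter month, no lag-polynomial restriction.
T = 80                                   # quarters
monthly = rng.normal(size=3 * T)         # hypothetical monthly indicator
X = monthly.reshape(T, 3)                # columns: months 1, 2, 3 of each quarter
X = (X - X.mean(0)) / X.std(0)
beta_true = np.array([1.0, 0.0, -0.5])   # hypothetical: month 2 is irrelevant
y = X @ beta_true + 0.3 * rng.normal(size=T)

def lasso_cd(X, y, lam, iters=500):
    """Coordinate-descent LASSO via soft-thresholding for
    (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, k = X.shape
    b = np.zeros(k)
    for _ in range(iters):
        for j in range(k):
            r = y - X @ b + X[:, j] * b[j]          # partial residual excluding j
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / (X[:, j] @ X[:, j] / n)
    return b

b = lasso_cd(X, y, lam=0.1)
print(b)   # the irrelevant month-2 coefficient is shrunk toward zero
```

In the paper's setting the regressand would be quarterly GDP growth and the penalty an elastic net, which adds a ridge term to the LASSO objective.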
  12. By: Alberto Ohashi; Alexandre B Simas
    Abstract: In this work, we develop a novel principal component analysis (PCA) for semimartingales by introducing a suitable spectral analysis for the quadratic variation operator. Motivated by high-dimensional complex systems typically found in interest rate markets, we investigate correlation in high-dimensional high-frequency data generated by continuous semimartingales. In contrast to the traditional PCA methodology, the directions of large variations are not deterministic, but rather they are bounded variation adapted processes which maximize quadratic variation almost surely. This allows us to reduce dimensionality from high-dimensional semimartingale systems in terms of covariation rather than the usual covariance concept. The proposed methodology allows us to investigate space-time data driven by multi-dimensional latent semimartingale state processes. The theory is applied to discretely-observed stochastic PDEs which admit finite-dimensional realizations. In particular, we provide consistent estimators for finite-dimensional invariant manifolds for Heath-Jarrow-Morton models. More importantly, components of the invariant manifold induced by volatility and drift dynamics are consistently estimated and identified.
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1503.05909&r=ecm
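A discretized stand-in for the construction in item 12 (hypothetical data; the paper's objects are genuinely stochastic, pathwise-defined directions, whereas this sketch uses fixed loadings): estimate the quadratic covariation of a d-dimensional continuous semimartingale from high-frequency increments and diagonalize it, revealing the low-dimensional factor structure:

```python
import numpy as np

rng = np.random.default_rng(7)

# X_t = L' W_t for a 2-dimensional Brownian motion W and fixed 2 x 5 loadings L,
# observed through n high-frequency increments on [0, 1].
d, n = 5, 10000
dW = rng.normal(size=(n, 2)) / np.sqrt(n)   # Brownian increments
L = rng.normal(size=(2, d))                 # hypothetical loadings
dX = dW @ L                                 # increments of the 5-dim process
QV = dX.T @ dX                              # realized quadratic covariation matrix
eigvals = np.linalg.eigvalsh(QV)            # ascending eigenvalues
print(eigvals)   # three eigenvalues are (numerically) zero: the system has rank 2
```

The eigenvectors attached to the non-zero eigenvalues span the directions of maximal quadratic variation, which is the covariation-based analogue of classical PCA's leading principal components.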
  13. By: Toru Kitagawa; Aleksey Tetenov
    Abstract: One of the main objectives of empirical analysis of experiments and quasi-experiments is to inform policy decisions that determine the allocation of treatments to individuals with different observable covariates. We propose the Empirical Welfare Maximization (EWM) method, which estimates a treatment assignment policy by maximizing the sample analog of average social welfare over a class of candidate treatment policies. The EWM approach is attractive in terms of both statistical performance and practical implementation in realistic settings of policy design. Common features of these settings include: (i) feasible treatment assignment rules are constrained exogenously for ethical, legislative, or political reasons, (ii) a policy maker wants a simple treatment assignment rule based on one or more eligibility scores in order to reduce the dimensionality of individual observable characteristics, and/or (iii) the proportion of individuals who can receive the treatment is a priori limited due to a budget or a capacity constraint. We show that when the propensity score is known, the average social welfare attained by EWM rules converges at least at n^(-1/2) rate to the maximum obtainable welfare uniformly over a minimally constrained class of data distributions, and this uniform convergence rate is minimax optimal. In comparison with this benchmark rate, we examine how the uniform convergence rate of the average welfare improves or deteriorates depending on the richness of the class of candidate decision rules, the distribution of conditional treatment effects, and the lack of knowledge of the propensity score. We provide an asymptotically valid inference procedure for the population welfare gain obtained by exercising the EWM rule. We offer easily implementable algorithms for computing the EWM rule and an application using experimental data from the National JTPA Study.
    Keywords: randomized experiments, statistical treatment rules, minimax rate optimality, VC-dimension
    JEL: C21 C44 C14
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:cca:wpaper:402&r=ecm
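A toy version of the EWM idea from item 13, for the class of single-threshold ("eligibility score") rules with a known propensity score as in the benchmark case (all data simulated; the functional forms are hypothetical): maximize the sample analog of the welfare gain from treatment over candidate thresholds:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical experiment: treatment helps only when the covariate x is positive.
n = 4000
x = rng.normal(size=n)
e = 0.5                                   # known propensity score (an RCT)
T = rng.random(n) < e                     # treatment indicator
tau = np.where(x > 0, 1.0, -1.0)          # true conditional treatment effect
Y = x + T * tau + rng.normal(size=n)

# Inverse-propensity-weighted signal: g_i has conditional mean tau(x_i), so the
# sample mean of g_i over those treated by a rule estimates its welfare gain.
g = Y * (T / e - (1 - T) / (1 - e))

# EWM over threshold rules d(x) = 1{x >= c}: pick the sample-welfare maximizer.
cands = np.quantile(x, np.linspace(0.01, 0.99, 99))
welfare = [(g * (x >= c)).mean() for c in cands]
c_star = cands[int(np.argmax(welfare))]
print(c_star)   # close to the true break at x = 0
```

Richer rule classes (e.g., multiple scores or budget-capped rules) fit the same template; the abstract's rate results describe how the welfare of the maximizer converges as the class grows richer.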
  14. By: Roberto Cominetti; Juan Diaz; Jorge Rivera
    Date: 2014–11
    URL: http://d.repec.org/n?u=RePEc:udc:wpaper:wp389&r=ecm

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.