nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒03‒08
fourteen papers chosen by
Sune Karlsson
Örebro University

  1. Joint Confidence Sets for Structural Impulse Responses By Atsushi Inoue; Lutz Kilian
  3. Positive Semidefinite Integrated Covariance Estimation, Factorizations and Asynchronicity By Kris Boudt; Sébastien Laurent; Asger Lunde; Rogier Quaedvlieg
  4. Quasi-Bayesian Model Selection By Atsushi Inoue; Mototsugu Shintani
  5. Statistics of Heteroscedastic Extremes By Einmahl, J.H.J.; Haan, L.F.M. de; Zhou, C.
  6. Capturing the time-varying drivers of an epidemic using stochastic dynamical systems By Joseph Dureau; Konstantinos Kalogeropoulos; Marc Baguelin
  7. "Volatility and Quantile Forecasts by Realized Stochastic Volatility Models with Generalized Hyperbolic Distribution" By Makoto Takahashi; Toshiaki Watanabe; Yasuhiro Omori
  8. Go with the Flow: A GAS model for Predicting Intra-daily Volume Shares By Francesco Calvori; Fabrizio Cipollini; Giampiero M. Gallo
  9. Efficient Modeling and Forecasting of the Electricity Spot Price By Florian Ziel; Rick Steinert
  10. Modelling Inflation Volatility By Eric Eisenstat; Rodney W. Strachan
  11. Forecasting Equity Premia using Bayesian Dynamic Model Averaging By Joscha Beckmann; Rainer Schüssler
  12. Response Surface Methodology By Kleijnen, Jack P.C.
  13. Forecasting Realized Volatility with Changes of Regimes By Giampiero M. Gallo; Edoardo Otranto
  14. Robust Implementation of a Parsimonious Dynamic Factor Model to Nowcast GDP By Pablo Duarte; Bernd Süssmuth

  1. By: Atsushi Inoue (Southern Methodist University); Lutz Kilian (University of Michigan)
    Abstract: Many users of structural VAR models are primarily interested in learning about the shape of structural impulse response functions. This requires joint inference about sets of structural impulse responses, allowing for dependencies across time as well as across response functions. Such joint inference is complicated by the fact that the joint distribution of the structural impulse responses becomes degenerate when the number of structural impulse responses of interest exceeds the number of model parameters, as is often the case in applied work. This degeneracy may be overcome by transforming the estimator appropriately. We show that the joint Wald test is invariant to this transformation and converges to a nonstandard distribution, which can be approximated by the bootstrap, allowing the construction of asymptotically valid joint confidence sets for any subset of structural impulse responses, regardless of whether the joint distribution of the structural impulse responses is degenerate or not. We demonstrate by simulation the coverage accuracy of these sets in finite samples under realistic conditions. We make the case for representing these joint confidence sets in the form of "shotgun plots" rather than joint confidence bands for impulse response functions. Several empirical examples demonstrate that this approach not only conveys the same information as confidence bands about the statistical significance of response functions, but provides economically relevant additional information about the shape of response functions that is lost when reducing the joint confidence set to two-dimensional bands.
    Keywords: Confidence Bands, Simultaneous Inference, Singular Covariance Matrix
    JEL: C32 C52 C53
    Date: 2014–02
  2. By: Sokbae Lee (Department of Economics, Seoul National University); Kyungchul Song (Vancouver School of Economics, University of British Columbia); Yoon-Jae Whang (Department of Economics, Seoul National University and Institute of Economic Research at Kyoto University)
    Abstract: In this paper, we propose a general method for testing inequality restrictions on nonparametric functions. Our framework covers many nonparametric testing problems in a unified way, with a number of possible applications in auction models, game-theoretic models, wage inequality, and revealed preferences. Our test involves a one-sided version of Lp functionals of kernel-type estimators (1 ≤ p < ∞).
    Keywords: Bootstrap, conditional moment inequalities, kernel estimation, local polynomial estimation, Lp norm, nonparametric testing, partial identification, Poissonization, quantile regression, uniform asymptotics
    JEL: C12 C14
    Date: 2014–02
  3. By: Kris Boudt (Department of Business, Vrije Universiteit Brussel, Belgium and VU University Amsterdam, Netherlands); Sébastien Laurent (Aix-Marseille University, Aix-Marseille School of Economics, CNRS & EHESS, France); Asger Lunde (Aarhus University and CREATES); Rogier Quaedvlieg (Department of Finance, Maastricht University, Netherlands)
    Abstract: An estimator of the ex-post covariation of log-prices under asynchronicity and microstructure noise is proposed. It uses the Cholesky factorization of the correlation matrix in order to exploit the heterogeneity in trading intensity, estimating the different parameters sequentially with as many observations as possible. The estimator is guaranteed to be positive semidefinite. Monte Carlo simulations confirm its good finite-sample properties. In the application we forecast portfolio Value-at-Risk and sector risk exposures for a portfolio of 52 stocks. We find that forecasts obtained from dynamic models utilizing the proposed high-frequency estimator are statistically and economically superior to forecasts from models using daily returns.
    Keywords: Cholesky decomposition, Integrated covariance, Non-synchronous trading, Positive semidefinite, Realized covariance
    JEL: C10 C58
    Date: 2014–02–24
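The paper's sequential estimation scheme is beyond the scope of this digest, but the guarantee it exploits can be illustrated: any matrix assembled as L L' from a Cholesky-type factor L is positive semidefinite by construction. The simulated returns and variable names below are illustrative assumptions, not the authors' estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated 5-minute log-returns for 3 assets (78 intervals in a trading day).
true_chol = np.array([[0.020, 0.000, 0.000],
                      [0.010, 0.015, 0.000],
                      [0.005, 0.004, 0.010]])
returns = rng.standard_normal((78, 3)) @ true_chol.T

# Plain realized covariance: the sum of outer products of intra-day returns.
rc = returns.T @ returns

# Re-expressing the estimate through its Cholesky factor L (rc = L @ L.T)
# makes positive semidefiniteness explicit: any matrix of the form L L'
# has only nonnegative eigenvalues.
L = np.linalg.cholesky(rc)
reconstructed = L @ L.T

assert np.allclose(reconstructed, rc)
assert np.all(np.linalg.eigvalsh(reconstructed) >= -1e-10)
```

The paper's contribution is to estimate the entries of such a factorization sequentially, asset by asset, so that each parameter uses as many (asynchronous) observations as possible while the PSD property is preserved.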
  4. By: Atsushi Inoue (Southern Methodist University); Mototsugu Shintani (Vanderbilt University)
    Abstract: In this paper we establish the consistency of the model selection criterion based on the quasi-marginal likelihood obtained from Laplace-type estimators (LTE). We consider cases in which parameters are strongly identified, weakly identified and partially identified. Our Monte Carlo results confirm our consistency results. Our proposed procedure is applied to select among monetary macroeconomic models using US data.
    Keywords: Consistent Model Selection, Laplace-Type Estimators, Marginal Likelihood.
    JEL: C32 C36 C52
    Date: 2014–02
  5. By: Einmahl, J.H.J.; Haan, L.F.M. de; Zhou, C. (Tilburg University, Center for Economic Research)
    Abstract: We extend classical extreme value theory to non-identically distributed observations. When the distribution tails are proportional, much of extreme value statistics remains valid. The proportionality function for the tails can be estimated nonparametrically along with the (common) extreme value index. Joint asymptotic normality of both estimators is shown; they are asymptotically independent. We develop tests for the proportionality function and for the validity of the model. We show through simulations the good performance of the tests for tail homoscedasticity. The results are applied to stock market returns. A main tool is the weak convergence of a weighted sequential tail empirical process.
    Date: 2014
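The paper's joint estimators are specialized, but the (common) extreme value index they build on is classically estimated, in the heavy-tailed case, with the Hill estimator. A minimal sketch on an exact Pareto sample; the function name, sample size, and choice of k are illustrative assumptions, not from the paper.

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the extreme value (tail) index:
    the mean log-excess of the k largest observations over the (k+1)-th."""
    xs = np.sort(x)[::-1]      # descending order statistics
    return np.mean(np.log(xs[:k]) - np.log(xs[k]))

rng = np.random.default_rng(1)
# Classical Pareto sample with alpha = 2, i.e. true tail index gamma = 1/2.
alpha = 2.0
sample = rng.pareto(alpha, size=100_000) + 1.0

gamma_hat = hill_estimator(sample, k=2_000)
assert abs(gamma_hat - 0.5) < 0.05   # close to the true index 1/alpha
```

Under heteroscedastic extremes, the paper shows that a proportionality function for the tails can be estimated alongside such an index, with the two estimators asymptotically independent.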
  6. By: Joseph Dureau; Konstantinos Kalogeropoulos; Marc Baguelin
    Abstract: Epidemics are often modeled using non-linear dynamical systems observed through partial and noisy data. In this paper, we consider stochastic extensions in order to capture unknown influences (changing behaviors, public interventions, seasonal effects, etc.). These models assign diffusion processes to the time-varying parameters, and our inferential procedure is based on a suitably adjusted adaptive particle Markov chain Monte Carlo algorithm. The performance of the proposed computational methods is validated on simulated data, and the adopted model is applied to the 2009 H1N1 pandemic in England. In addition to estimating the effective contact rate trajectories, the methodology is applied in real time to provide evidence for related public health decisions. Diffusion-driven susceptible-exposed-infected-removed (SEIR)-type models with age structure are also introduced.
    Keywords: Bayesian inference; particle MCMC; population epidemic model; time-varying parameters
    JEL: I1
    Date: 2013–07
  7. By: Makoto Takahashi (Center for the Study of Finance and Insurance, Osaka University and Department of Finance, Kellogg School of Management, Northwestern University); Toshiaki Watanabe (Institute of Economic Research, Hitotsubashi University); Yasuhiro Omori (Faculty of Economics, The University of Tokyo)
    Abstract: The realized stochastic volatility model of Takahashi, Omori, and Watanabe (2009), which incorporates the asymmetric stochastic volatility model with realized volatility, is extended with a more general form of bias correction in realized volatility and a wider class of return distributions, the generalized hyperbolic skew Student's t-distribution. The extensions make it possible to adjust for the bias due to market microstructure noise and non-trading hours, which possibly depends on the level of volatility, and to accommodate the heavy tails and skewness of financial returns. With a Bayesian estimation scheme via Markov chain Monte Carlo methods, the model enables us to estimate the parameters of the return distribution and of the model jointly. It also makes it possible to forecast volatility and return quantiles by sampling from their joint posterior distribution. The model is applied to quantile forecasts of financial returns, such as value-at-risk and expected shortfall, as well as volatility forecasts, and those forecasts are evaluated by several backtesting procedures. Empirical results with SPDR, the S&P 500 exchange-traded fund, show that the heavy tails and skewness of daily returns are important for the model fit and the quantile forecasts but not for the volatility forecasts, and that the additional bias correction improves the quantile forecasts but does not substantially improve the model fit or the volatility forecasts.
    Date: 2014–02
  8. By: Francesco Calvori (Dipartimento di Statistica, Informatica, Applicazioni "G.Parenti", Università di Firenze); Fabrizio Cipollini (Dipartimento di Statistica, Informatica, Applicazioni "G.Parenti", Università di Firenze); Giampiero M. Gallo (Dipartimento di Statistica, Informatica, Applicazioni "G.Parenti", Università di Firenze)
    Abstract: The Volume Weighted Average Price (VWAP) mixes volumes and prices at intra-daily intervals and is a benchmark measure frequently used to evaluate a trader's performance. Under suitable assumptions, splitting a daily order according to ex-ante volume predictions is a good strategy to replicate the VWAP. To bypass possible problems generated by local trends in volumes, we propose a novel Generalized Autoregressive Score (GAS) model for predicting volume shares (relative to the daily total), inspired by the empirical regularities of the observed series (intra-daily periodicity pattern, residual serial dependence). An application to six NYSE tickers confirms the suitability of the model proposed in capturing the features of intra-daily dynamics of volume shares.
    Keywords: High Frequency Financial Data, Prediction, Trading Volumes, Volume Shares, VWAP, GAS, Dirichlet Distribution
    JEL: C22 C53 C58
    Date: 2014–02
  9. By: Florian Ziel; Rick Steinert
    Abstract: The rising importance of renewable energy, especially solar and wind power, has had new impacts on the formation of electricity prices. Hence, this paper introduces an econometric model for the hourly time series of electricity prices on the EEX which incorporates specific features such as renewable energy. The model combines several sophisticated and established approaches and can be regarded as a periodic VAR-TARCH with wind power, solar power and load as influencing time series. It is able to capture the distinct and well-known features of electricity prices in Germany. An efficient iteratively reweighted lasso approach is used for estimation. Moreover, it is shown that several existing models are outperformed by the procedure developed in this paper.
    Date: 2014–02
  10. By: Eric Eisenstat; Rodney W. Strachan
    Abstract: This paper discusses estimation of US inflation volatility using time-varying parameter models, in particular whether volatility should be modelled as a stationary or a random walk stochastic process. Specifying inflation volatility as an unbounded process, as implied by the random walk, conflicts with prior beliefs, yet a stationary process cannot capture the low-frequency behaviour commonly observed in estimates of volatility. We therefore propose an alternative model with a change-point process in the volatility that allows for switches between stationary models to capture changes in the level and dynamics over the past forty years. To accommodate the stationarity restriction, we develop a new representation that is equivalent to our model but computationally more efficient. All models produce effectively identical estimates of volatility, but the change-point model provides more information on the level and persistence of volatility and the probabilities of changes. For example, we find a few well-defined switches in the volatility process and, interestingly, these switches line up well with economic slowdowns or changes of the Federal Reserve Chair.
    Keywords: Inflation volatility, monetary policy, time varying parameter model, Bayesian estimation, Change-point model
    JEL: C11 C32 E52
    Date: 2014–02
  11. By: Joscha Beckmann; Rainer Schüssler
    Abstract: This paper introduces a Bayesian version of Dynamic Model Averaging for predicting aggregate stock returns. Our suggested approach simultaneously accounts for many sources of uncertainty. It is designed to handle (i) parameter instability, (ii) time-varying volatility, (iii) model uncertainty and (iv) time-varying model weights. We use our method to analyze the predictability of S&P500 returns over the 1927–2012 period. The flexibility of the econometric setup enables us to disentangle the multitude of effects at work when generating (point and density) forecasts. A key point of our analysis is to assess which components of the forecast models pay off in terms of statistical accuracy and economic value. We document that statistical and economic evaluation metrics can be in sharp contrast. While stochastic volatility emerges as important both for density forecast accuracy and for economic gains, return prediction models that use economic covariates turn out to be helpful for timing the market only within very limited periods.
    Keywords: Asset allocation; Density forecasting; Model averaging
    JEL: C11 G11
    Date: 2014–02
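The time-varying model weights in (iv) are commonly propagated with a forgetting factor, flattening last period's weights before multiplying by each model's predictive likelihood. A stylized two-model sketch; the Gaussian models, forgetting factor, and simulated data are invented for illustration and are not the paper's setup.

```python
import numpy as np

def normal_pdf(y, mean, var):
    return np.exp(-(y - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

rng = np.random.default_rng(3)
y = rng.standard_normal(300)           # data actually generated by model 0

alpha = 0.95                           # forgetting factor: discounts old evidence
means = np.array([0.0, 0.0])
variances = np.array([1.0, 4.0])       # model 0 is correctly specified
w = np.array([0.5, 0.5])               # initial model weights

for yt in y:
    w_pred = w ** alpha
    w_pred /= w_pred.sum()             # predicted (flattened) weights
    w = w_pred * normal_pdf(yt, means, variances)
    w /= w.sum()                       # posterior model weights

assert w[0] > 0.5                      # the correct model dominates
```

The forgetting factor keeps the weights responsive: because old predictive evidence is exponentially discounted, a model that loses its edge can be de-weighted quickly rather than being locked in by its past performance.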
  12. By: Kleijnen, Jack P.C. (Tilburg University, Center for Economic Research)
    Abstract: This chapter first summarizes Response Surface Methodology (RSM), which started with Box and Wilson's 1951 article on RSM for real, non-simulated systems. RSM is a stepwise heuristic that uses first-order polynomials to approximate the response surface locally. An estimated polynomial metamodel gives an estimated local gradient, which RSM uses in steepest ascent (or descent) to decide on the next local experiment. When RSM approaches the optimum, the latest first-order polynomial is replaced by a second-order polynomial. The fitted second-order polynomial enables the estimation of the optimum. Furthermore, this chapter focuses on simulated systems, which may violate the assumptions of constant variance and independence. The chapter also summarizes a variant of RSM that is proven to converge to the true optimum, under specific conditions. The chapter presents an adapted steepest ascent that is scale-independent. Moreover, the chapter generalizes RSM to multiple random responses, selecting one response as the goal variable and the other responses as the constrained variables. This generalized RSM is combined with mathematical programming to estimate a better search direction than the steepest ascent direction. To test whether the estimated solution is indeed optimal, bootstrapping may be used. Finally, the chapter discusses robust optimization of the decision variables, while accounting for uncertainties in the environmental variables.
    Keywords: simulation;optimization;regression;robustness;risk
    JEL: C0 C1 C9
    Date: 2014
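The first-order stage of RSM described in the abstract (local factorial experiment, first-order polynomial fit, steepest-ascent step) can be sketched on a toy simulated response. The quadratic response function, design radius, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def response(x):
    # Toy noisy simulation output with its maximum at (3, -1).
    return -(x[0] - 3.0) ** 2 - (x[1] + 1.0) ** 2 + 0.01 * rng.standard_normal()

center = np.array([0.0, 0.0])          # starting point of the search
step = 0.5                             # steepest-ascent step length

for _ in range(30):
    # Local experiment: a 2^2 factorial design around the current center.
    design = 0.1 * np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
    y = np.array([response(center + d) for d in design])

    # Fit a first-order polynomial (intercept + linear terms) by least squares.
    X = np.column_stack([np.ones(len(design)), design])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Steepest ascent: move along the estimated local gradient.
    gradient = beta[1:]
    center = center + step * gradient / np.linalg.norm(gradient)

# The search ends up in the neighborhood of the true optimum (3, -1);
# full RSM would now fit a second-order polynomial to pin down the optimum.
assert np.linalg.norm(center - np.array([3.0, -1.0])) < 1.0
```

With a fixed step length the iterates eventually oscillate around the optimum, which is exactly the signal RSM uses to switch from the first-order to the second-order (optimum-estimation) stage.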
  13. By: Giampiero M. Gallo (Dipartimento di Statistica, Informatica, Applicazioni "G.Parenti", Università di Firenze); Edoardo Otranto (Dipartimento di Scienze Cognitive e della Formazione, Università degli Studi di Messina)
    Abstract: Realized volatility of financial time series generally shows a slow-moving average level from the early 2000s to recent times, with alternating periods of turmoil and quiet. Modeling such a pattern has been variously tackled in the literature, with solutions ranging from long memory to Markov switching and spline interpolation. In this paper, we explore the extension of Multiplicative Error Models to include Markovian dynamics (MS-MEM). Such a model is able to capture sudden changes in volatility following an abrupt crisis and to accommodate different dynamic responses within each regime. The model is applied to the realized volatility of the S&P500 index: next to an interesting interpretation of the regimes in terms of market events, the MS-MEM has better in-sample fitting capability and achieves good out-of-sample forecasting performance relative to alternative specifications.
    Keywords: MEM, regime switching, realized volatility, volatility persistence, volatility forecasting
    JEL: C22 C24 C58
    Date: 2014–02
  14. By: Pablo Duarte; Bernd Süssmuth
    Abstract: Quarterly GDP figures are usually published with a delay of some weeks. A common way to generate GDP series of higher frequency, i.e. to nowcast GDP, is to use available indicators to calculate a single index by means of a common factor derived from a dynamic factor model (DFM). This paper deals with the implementation stage of this practice. We propose a two-tiered mechanism consisting of the identification of variables highly correlated with GDP as “core” indicators and a check of the robustness of these variables in the sense of extreme bounds analysis. The indicators thus selected are used in an approximate DFM framework to nowcast, as an example, Spanish GDP growth. We show that our implementation produces more accurate nowcasts than both a benchmark stochastic process and an implementation based on the total set of core indicators.
    Keywords: small-scale nowcasting models, Kalman Filter, extreme bounds analysis
    JEL: C38 C53
    Date: 2014

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.