nep-ets New Economics Papers
on Econometric Time Series
Issue of 2015‒12‒08
ten papers chosen by
Yong Yin
SUNY at Buffalo

  1. Full and fast calibration of the Heston stochastic volatility model By Yiran Cui; Sebastian del Bano Rollin; Guido Germano
  2. Applying Flexible Parameter Restrictions in Markov-Switching Vector Autoregression Models By Andrew Binning; Junior Maih
  3. On Estimating Long-Run Effects in Models with Lagged Dependent Variables By W. Robert Reed; Min Zhu
  4. Macro-Driven VaR Forecasts: From Very High to Very-Low Frequency Data By Yves Dominicy; Harry-Paul Vander Elst
  5. lCARE – localizing Conditional AutoRegressive Expectiles By Xiu Xu; Andrija Mihoci; Wolfgang Karl Härdle
  6. A Simple Estimator for Short Panels with Common Factors By Juodis, Arturas; Sarafidis, Vasilis
  7. Large Vector Autoregressions with Asymmetric Priors By Andrea Carriero; Todd E. Clark; Massimiliano Marcellino
  8. Model Uncertainty in Panel Vector Autoregressive Models By Gary Koop; Dimitris Korobilis
  9. Large Bayesian VARMAs By Joshua Chan; Eric Eisenstat; Gary Koop
  10. Improving the Finite Sample Performance of Autoregression Estimators in Dynamic Factor Models: A Bootstrap Approach By Mototsugu Shintani; Zi-yi Guo

  1. By: Yiran Cui; Sebastian del Bano Rollin; Guido Germano
    Abstract: This paper presents an algorithm for a complete and efficient calibration of the Heston stochastic volatility model. We express the calibration as a nonlinear least-squares problem; exploiting a suitable expression of the characteristic function, we give the analytical gradient of the price of a vanilla option with respect to the model parameters, which is the key element of all variants of the objective function. The interdependency between the components of the gradient enables an efficient implementation that is around ten times faster than a numerical gradient. We calibrate the model with the Levenberg-Marquardt method and do not observe the multiple local minima reported in previous research. Two-dimensional sections show that the objective function is shaped as a narrow valley with a flat bottom. Our method is the fastest calibration of the Heston model developed so far and meets the speed requirement of practical trading.
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1511.08718&r=ets
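The calibration strategy the abstract describes — nonlinear least squares solved by Levenberg-Marquardt with an analytical gradient — can be illustrated on a toy two-parameter model. This is not the Heston pricer itself: the model y = a·exp(−b·t) and all parameter values below are placeholders chosen only so the sketch is self-contained.

```python
import math

def residuals_and_jacobian(params, ts, ys):
    # Toy model y = a * exp(-b * t); the analytic Jacobian plays the role of
    # the paper's analytical gradient of the vanilla option price.
    a, b = params
    r, J = [], []
    for t, y in zip(ts, ys):
        e = math.exp(-b * t)
        r.append(a * e - y)
        J.append((e, -a * t * e))  # d r / d a,  d r / d b
    return r, J

def levenberg_marquardt(start, ts, ys, iters=100, lam=1e-3):
    a, b = start
    for _ in range(iters):
        r, J = residuals_and_jacobian((a, b), ts, ys)
        # Damped normal equations (J'J + lam*I) delta = -J'r, solved for the
        # 2-parameter case by Cramer's rule.
        g11 = sum(ja * ja for ja, _ in J) + lam
        g22 = sum(jb * jb for _, jb in J) + lam
        g12 = sum(ja * jb for ja, jb in J)
        h1 = sum(ja * ri for (ja, _), ri in zip(J, r))
        h2 = sum(jb * ri for (_, jb), ri in zip(J, r))
        det = g11 * g22 - g12 * g12
        da = (-h1 * g22 + h2 * g12) / det
        db = (-g11 * h2 + g12 * h1) / det
        trial = (a + da, b + db)
        r_new, _ = residuals_and_jacobian(trial, ts, ys)
        if sum(x * x for x in r_new) < sum(x * x for x in r):
            (a, b), lam = trial, lam * 0.5   # accept step, trust the model more
        else:
            lam *= 2.0                       # reject step, increase damping
    return a, b

# Noise-free "market" data generated from the true parameters (2.0, 1.3).
ts = [0.1 * k for k in range(1, 21)]
ys = [2.0 * math.exp(-1.3 * t) for t in ts]
a_hat, b_hat = levenberg_marquardt((1.0, 0.5), ts, ys)
```

In the Heston setting the residuals would be differences between model and market vanilla prices and the Jacobian would come from the analytical gradient of the pricing formula; the damping logic is unchanged.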
  2. By: Andrew Binning (Norges Bank); Junior Maih (Norges Bank and BI Norwegian Business School)
    Abstract: We present a new method for imposing parameter restrictions in Markov-Switching Vector Autoregression (MS-VAR) models. Our method is more flexible than competing methodologies and easily handles a range of parameter restrictions over different equations, regimes and parameter types. We also expand the range of priors used in the MS-VAR literature. We demonstrate the versatility of our approach using three appropriate examples.
    Keywords: Parameter Restrictions, MS-VAR estimation, Block Exogeneity, Zero Restrictions, Bayesian estimation
    Date: 2015–12–01
    URL: http://d.repec.org/n?u=RePEc:bno:worpap:2015_17&r=ets
  3. By: W. Robert Reed (University of Canterbury); Min Zhu
    Abstract: This note points out the hazards of estimating long-run effects from models with lagged dependent variables. We use Monte Carlo experiments to demonstrate that this practice often fails to produce reliable estimates. Biases can be substantial, sample ranges very wide, and hypothesis tests can be rendered useless in realistic data environments. There are three reasons for this poor performance. First, OLS estimates of the coefficient of a lagged dependent variable are downwardly biased in finite samples. Second, small biases in the estimate of the lagged dependent variable coefficient are magnified in the calculation of long-run effects. Third, and perhaps most importantly, the statistical distribution associated with estimates of the long-run propensity (LRP) is complicated, heavy-tailed, and difficult to use for hypothesis testing. While alternative procedures such as jackknifing and indirect inference address the first issue, the associated estimates of long-run effects remain unreliable.
    Keywords: Hurwicz bias, Auto-Regressive Distributed-Lag models, ARDL, Dynamic Panel Data models, DPD, Anderson-Hsiao, Arellano-Bond, Difference GMM, System GMM, indirect inference, jackknifing, long-run impact, long-run propensity
    JEL: C22 C23
    Date: 2015–11–30
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:15/18&r=ets
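The first two sources of unreliability the abstract lists — the downward small-sample (Hurwicz) bias in the OLS estimate of the lagged coefficient, and its magnification in the long-run effect 1/(1−ρ) — can be reproduced in a small Monte Carlo sketch. The AR(1) design without covariates and the values ρ = 0.8, T = 30 are illustrative choices, not values taken from the paper.

```python
import random

def simulate_ar1(rho, T, rng):
    # Generate T observations of y_t = rho * y_{t-1} + e_t, starting at zero.
    y, prev = [], 0.0
    for _ in range(T):
        prev = rho * prev + rng.gauss(0.0, 1.0)
        y.append(prev)
    return y

def ols_ar1(y):
    # OLS of y_t on y_{t-1} without intercept, to keep the illustration minimal.
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    return num / den

rng = random.Random(42)
rho, T, reps = 0.8, 30, 5000
rho_hats = [ols_ar1(simulate_ar1(rho, T, rng)) for _ in range(reps)]
mean_rho = sum(rho_hats) / reps        # falls below 0.8: Hurwicz bias

# The long-run effect of a unit shock is 1/(1 - rho); evaluating it at the
# (downward-biased) mean estimate shows how the small bias is magnified.
lrp_true = 1.0 / (1.0 - rho)           # 5.0
lrp_at_mean = 1.0 / (1.0 - mean_rho)
rel_bias_rho = (rho - mean_rho) / rho
rel_bias_lrp = (lrp_true - lrp_at_mean) / lrp_true
```

The relative bias of the long-run effect comes out several times larger than the relative bias of the autoregressive coefficient itself, which is the magnification mechanism the note describes; the heavy right tail of 1/(1−ρ̂) for draws near unity is the third problem it raises.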
  4. By: Yves Dominicy; Harry-Paul Vander Elst
    Abstract: This paper studies in some detail the joint use of high-frequency data and economic variables to model financial returns and volatility. We extend the Realized LGARCH model by allowing for a time-varying intercept, which responds to changes in macroeconomic variables in a MIDAS framework and allows macroeconomic information to be included directly into the estimation and forecast procedure. Using more than 10 years of high-frequency transactions for 55 U.S. stocks, we argue that the combination of low-frequency exogenous economic indicators with high-frequency financial data improves our ability to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. We document that nominal corporate profits and term spreads generate accurate risk-measure forecasts at horizons beyond two business weeks.
    Keywords: realized LGARCH; value-at-risk; density forecasts; realized measures of volatility
    JEL: C22 C53
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/220550&r=ets
  5. By: Xiu Xu; Andrija Mihoci; Wolfgang Karl Härdle
    Abstract: We account for time-varying parameters in the conditional expectile-based Value at Risk (EVaR) model. EVaR appears more sensitive to the magnitude of portfolio losses than the quantile-based Value at Risk (QVaR); nevertheless, by fitting the models over relatively long, ad-hoc fixed time intervals, existing research ignores potentially time-varying parameters. Our work addresses this issue by exploiting the local parametric approach to quantify tail risk dynamics. By achieving a balance between parameter variability and modelling bias, one can safely fit a parametric expectile model over a stable interval of homogeneity. Empirical evidence from three stock markets over 2005–2014 shows that the parameter homogeneity intervals span approximately 1–6 months of daily observations. Our method outperforms models with one-year fixed intervals, as well as quantile-based candidates, when employing a time invariant portfolio protection (TIPP) strategy for the DAX portfolio. The tail risk measure implied by our model thus provides valuable insights for asset allocation and portfolio insurance.
    Keywords: expectiles, tail risk, local parametric approach, risk management
    JEL: C32 C51 G17
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2015-052&r=ets
  6. By: Juodis, Arturas; Sarafidis, Vasilis
    Abstract: By now there is a substantial theoretical literature on the estimation of short panel data models with common factors. Nevertheless, these advances appear to have remained largely unnoticed by empirical practitioners. A major reason might be that existing approaches are computationally burdensome and difficult to program. This paper puts forward a simple methodology for estimating panels with multiple factors based on the method of moments approach. The underlying idea involves substituting the unobserved factors with time-specific weighted averages of the variables included in the model. The estimation procedure is easy to implement because unobserved variables are superseded with observed data. Furthermore, since the model is effectively parameterized in a more parsimonious way, the resulting estimator can be asymptotically more efficient than existing ones. Notably, our methodology can easily accommodate observed common factors and unbalanced panels, both of which are important empirical scenarios. We apply our approach to a data set involving a large panel of 4,500 households in New South Wales (Australia), and estimate the price elasticity of urban water demand.
    Keywords: Dynamic Panel Data, Factor Model, Fixed T Consistency, Monte Carlo Simulation, Urban Water Management.
    JEL: C13 C15 C23
    Date: 2015–11–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:68164&r=ets
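The core idea the abstract describes — substituting the unobserved factors with time-specific cross-sectional averages of the observed variables — can be sketched in a stylized one-factor simulation. This is a simplified illustration in the spirit of that idea, using homogeneous loadings and a plain pooled regression rather than the authors' method-of-moments estimator; the DGP and all parameter values are made up.

```python
import random

rng = random.Random(7)
N, T, beta = 200, 50, 1.0

# Stylized DGP: y_it = beta * x_it + f_t + eps_it and x_it = f_t + v_it,
# so the unobserved common factor f_t makes pooled OLS of y on x inconsistent.
f = [rng.gauss(0, 1) for _ in range(T)]
x = [[f[t] + rng.gauss(0, 1) for t in range(T)] for _ in range(N)]
y = [[beta * x[i][t] + f[t] + rng.gauss(0, 1) for t in range(T)] for i in range(N)]

# Time-specific cross-sectional average of x: an observable proxy for f_t.
xbar = [sum(x[i][t] for i in range(N)) / N for t in range(T)]

# Stack the panel; all variables are (approximately) mean zero by
# construction, so the regressions below omit an intercept.
xs = [x[i][t] for i in range(N) for t in range(T)]
ys = [y[i][t] for i in range(N) for t in range(T)]
zs = [xbar[t] for i in range(N) for t in range(T)]

# Naive pooled OLS of y on x: inconsistent (plim 1.5 under this DGP).
sxx = sum(v * v for v in xs)
sxy = sum(a * b for a, b in zip(xs, ys))
beta_naive = sxy / sxx

# Factor-augmented regression of y on (x, xbar): 2x2 system by Cramer's rule.
szz = sum(v * v for v in zs)
sxz = sum(a * b for a, b in zip(xs, zs))
szy = sum(a * b for a, b in zip(zs, ys))
det = sxx * szz - sxz * sxz
beta_aug = (sxy * szz - szy * sxz) / det
```

The augmented estimate lands near the true beta because the observed average absorbs the factor, which is why replacing unobserved variables with observed data makes the procedure easy to implement.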
  7. By: Andrea Carriero (Queen Mary University of London); Todd E. Clark (Federal Reserve Bank of Cleveland); Massimiliano Marcellino (Bocconi University, IGIER and CEPR)
    Abstract: We propose a new algorithm which allows easy estimation of Vector Autoregressions (VARs) featuring asymmetric priors and time-varying volatilities, even when the cross-sectional dimension of the system N is particularly large. The algorithm is based on a simple triangularisation which allows the conditional mean coefficients of the VAR to be simulated by drawing them equation by equation. This strategy reduces the computational complexity by a factor of N² with respect to the existing algorithms routinely used in the literature and by practitioners. Importantly, this new algorithm can be easily obtained by modifying just one of the steps of the existing algorithms. We illustrate the benefits of the algorithm with numerical and empirical applications.
    Keywords: Bayesian VARs, Stochastic volatility, Large datasets, Forecasting, Impulse response functions
    JEL: C11 C13 C33 C53
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp759&r=ets
  8. By: Gary Koop (Department of Economics, University of Strathclyde, UK; The Rimini Centre for Economic Analysis, Italy); Dimitris Korobilis (Adam Smith Business School, University of Glasgow, UK; The Rimini Centre for Economic Analysis, Italy)
    Abstract: We develop methods for Bayesian model averaging (BMA) or selection (BMS) in Panel Vector Autoregressions (PVARs). Our approach allows us to select between, or average over, all possible combinations of restricted PVARs, where the restrictions involve interdependencies between and heterogeneities across cross-sectional units. The resulting BMA framework can find a parsimonious PVAR specification, thus addressing overparameterization concerns. We use these methods in an application involving the euro area sovereign debt crisis and show that they perform better than alternatives. Our findings contradict a simple view of the sovereign debt crisis which divides the euro zone into groups of core and peripheral countries and worries about financial contagion within the latter group.
    Date: 2015–09
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:15-35&r=ets
  9. By: Joshua Chan (Research School of Economics, Australian National University, Australia); Eric Eisenstat (Faculty of Business Administration, University of Bucharest, Romania); Gary Koop (Department of Economics, University of Strathclyde, UK; The Rimini Centre for Economic Analysis, Italy)
    Abstract: Vector Autoregressive Moving Average (VARMA) models have many theoretical properties which should make them popular among empirical macroeconomists. However, they are rarely used in practice due to over-parameterization concerns, difficulties in ensuring identification and computational challenges. With the growing interest in multivariate time series models of high dimension, these problems with VARMAs become even more acute, accounting for the dominance of VARs in this field. In this paper, we develop a Bayesian approach for inference in VARMAs which surmounts these problems. It jointly ensures identification and parsimony in the context of an efficient Markov Chain Monte Carlo (MCMC) algorithm. We use this approach in a macroeconomic application involving up to twelve dependent variables. We find that our algorithm works successfully and provides insights beyond those provided by VARs.
    Date: 2015–05
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:15-36&r=ets
  10. By: Mototsugu Shintani (University of Tokyo and Vanderbilt University); Zi-yi Guo (Vanderbilt University)
    Abstract: We investigate the finite sample properties of the estimator of a persistence parameter of an unobservable common factor when the factor is estimated by the principal components method. When the number of cross-sectional observations is not sufficiently large, relative to the number of time series observations, the autoregressive coefficient estimator of a positively autocorrelated factor is biased downward and the bias becomes larger for a more persistent factor. Based on theoretical and simulation analyses, we show that bootstrap procedures are effective in reducing the bias, and bootstrap confidence intervals outperform naive asymptotic confidence intervals in terms of the coverage probability.
    Keywords: Bias Correction; Bootstrap; Dynamic Factor Model; Principal Components
    JEL: C1 C5
    Date: 2015–12–02
    URL: http://d.repec.org/n?u=RePEc:van:wpaper:vuecon-sub-15-00015&r=ets
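The bias-reduction step the abstract studies can be sketched for a simple AR(1): estimate the persistence parameter by OLS, regenerate bootstrap series from the fitted model, and subtract the bootstrap estimate of the downward bias. This is a minimal residual-bootstrap illustration — ρ = 0.8, T = 40 and the replication counts are arbitrary, and the paper's principal-components factor estimation and confidence intervals are omitted.

```python
import random

def simulate_ar1(rho, shocks):
    # y_t = rho * y_{t-1} + e_t, starting at zero.
    y, prev = [], 0.0
    for e in shocks:
        prev = rho * prev + e
        y.append(prev)
    return y

def ols_ar1(y):
    # OLS of y_t on y_{t-1} without intercept.
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    return num / den

def bootstrap_corrected(y, B, rng):
    rho_hat = ols_ar1(y)
    resid = [y[t] - rho_hat * y[t - 1] for t in range(1, len(y))]
    boot = []
    for _ in range(B):
        # Residual bootstrap: resample shocks, regenerate at the fitted rho.
        shocks = [rng.choice(resid) for _ in range(len(y))]
        boot.append(ols_ar1(simulate_ar1(rho_hat, shocks)))
    bias_hat = sum(boot) / B - rho_hat     # estimated small-sample bias
    return rho_hat - bias_hat              # i.e. 2 * rho_hat - mean(boot)

rng = random.Random(1)
rho, T, reps, B = 0.8, 40, 300, 100
raw, corrected = [], []
for _ in range(reps):
    y = simulate_ar1(rho, [rng.gauss(0.0, 1.0) for _ in range(T)])
    raw.append(ols_ar1(y))
    corrected.append(bootstrap_corrected(y, B, rng))
mean_raw = sum(raw) / reps
mean_corrected = sum(corrected) / reps
```

Averaged over the Monte Carlo replications, the raw OLS estimate sits noticeably below the true 0.8 while the bootstrap-corrected estimate recovers most of that gap, which is the effect the paper documents for estimated factors.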

This nep-ets issue is ©2015 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.