nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒04‒19
twenty-two papers chosen by
Sune Karlsson
Örebro universitet

  1. Identification and estimation of non-Gaussian structural vector autoregressions By Markku Lanne; Mika Meitz; Pentti Saikkonen
  2. Adaptive LASSO estimation for ARDL models with GARCH innovations By Marcelo C. Medeiros; Eduardo F. Mendes
  3. Poor (Wo)man’s Bootstrap By Honore, Bo E.; Hu, Luojia
  4. Bridging Centrality and Extremity: Refining Empirical Data Depth using Extreme Value Statistics By Einmahl, J.H.J.; Li, Jun; Liu, Regina
  5. l1-Regularization of High-Dimensional Time-Series Models with Flexible Innovations By Marcelo C. Medeiros; Eduardo F. Mendes
  6. Spatial Panel Data Model with error dependence: a Bayesian Separable Covariance Approach By Samantha Leorato; Maura Mezzetti
  7. Model Equivalence Tests for Overidentifying Restrictions By Lavergne, Pascal
  8. Theoretical Aspects of Modeling of the SVAR By Skrobotov, Anton; Turuntseva, Marina
  9. "The SIML Estimation of Integrated Covariance and Hedging Coefficient under Round-off Errors, Micro-market Price Adjustments and Random Sampling" By Naoto Kunitomo; Hiroumi Misaki; Seisho Sato
  10. Flexible statistical models: Methods for the ordering and comparison of theoretical distributions By Rigby, Robert; Stasinopoulos, Dimitrios; Voudouris, Vlasios
  11. Empirical Relevance of Ambiguity in First Price Auction Models By Gaurab Aryal; Dong-Hyuk Kim
  12. Identification and Estimation of Incomplete Information Games with Multiple Equilibria By Ruli Xiao
  13. Consistent Tests for Poverty Dominance Relations By Garry F. Barrett; Stephen G. Donald; Yu-Chin Hsu
  14. On the Trend, the Shift and the Initial Value in Testing the Unit Root Hypothesis By Skrobotov, Anton
  15. Bringing an elementary agent-based model to the data: Estimation via GMM and an application to forecasting of asset price volatility By Ghonghadze, Jaba; Lux, Thomas
  16. A Bayesian Analysis of Racial Differences in Treatment among Breast-cancer Patients By Nandram, B.; Bhadra, Dhiman; Liu, Yiwei
  17. Estimation of sentiment effects in financial markets: A simulated method of moments approach By Zhenxi, Chen; Lux, Thomas
  18. Forecasting in nonstationary environments: What works and what doesn't in reduced-form and structural models By Raffaella Giacomini; Barbara Rossi
  19. Identification of Affine Term Structure Models with Observed Factors: Economic Shocks on Brazilian Yield Curves By Marco S. Matsumura; Ajax R. B. Moreira
  20. Forecasting trends with asset prices By Ahmed Bel Hadj Ayed; Grégoire Loeper; Frédéric Abergel
  21. Detrended partial cross-correlation analysis of two time series influenced by common external forces By Xi-Yuan Qian; Ya-Min Liu; Zhi-Qiang Jiang; Boris Podobnik; Wei-Xing Zhou; H. Eugene Stanley
  22. Switching-GAS Copula Models for Systemic Risk Assessment By Mauro Bernardi; Leopoldo Catania

  1. By: Markku Lanne (University of Helsinki and CREATES); Mika Meitz (University of Helsinki); Pentti Saikkonen (University of Helsinki)
    Abstract: Conventional structural vector autoregressive (SVAR) models with Gaussian errors are not identified, and additional identifying restrictions are typically imposed in applied work. We show that the Gaussian case is an exception in that an SVAR model whose error vector consists of independent non-Gaussian components is, without any additional restrictions, identified and leads to (essentially) unique impulse responses. We also introduce an identification scheme under which the maximum likelihood estimator of the non-Gaussian SVAR model is consistent and asymptotically normally distributed. As a consequence, additional economic identifying restrictions can be tested. In an empirical application, we find a negative impact of a contractionary monetary policy shock on financial markets, and clearly reject the commonly employed recursive identifying restrictions.
    Keywords: Structural vector autoregressive model, identification, impulse responses, non-Gaussianity
    JEL: C13 C32 C53
    Date: 2015–03–30
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-16&r=ecm
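    The identification result is closely related to independent component analysis (ICA): with independent structural shocks of which at most one is Gaussian, the impact matrix is identified up to permutation, sign and scale. A minimal sketch of that idea on simulated data, using FastICA on VAR residuals rather than the authors' maximum likelihood estimator:

```python
# Sketch: recover structural shocks of a VAR by ICA on the reduced-form
# residuals. Illustrates the identification idea only, not the paper's MLE.
import numpy as np
from statsmodels.tsa.api import VAR
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
T, A0 = 500, np.array([[1.0, 0.5], [0.0, 1.0]])   # true impact matrix
eps = rng.laplace(size=(T, 2))                     # independent non-Gaussian shocks
u = eps @ A0.T                                     # reduced-form errors
y = np.zeros((T, 2))
for t in range(1, T):                              # simulate a VAR(1)
    y[t] = 0.5 * y[t - 1] + u[t]

resid = VAR(y).fit(1).resid                        # reduced-form residuals
ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
shocks = ica.fit_transform(resid)                  # recovered structural shocks
print(ica.mixing_)                                 # impact matrix up to column
                                                   # permutation, sign and scale
```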
  2. By: Marcelo C. Medeiros (Department of Economics PUC-Rio); Eduardo F. Mendes (Department of Economics Australian School of Business)
    Abstract: In this paper we show the validity of the adaptive LASSO procedure in estimating stationary ARDL(p,q) models with GARCH innovations. We show that, given a set of initial weights, the adaptive LASSO selects the relevant variables with probability converging to one. We then show that the estimator is oracle, meaning that its distribution converges to that of the oracle-assisted least squares estimator, i.e., the least squares estimator computed as if the set of relevant variables were known beforehand. Finally, we show that the LASSO estimator can be used to construct the initial weights. The performance of the method in finite samples is illustrated using Monte Carlo simulation.
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:rio:texdis:637&r=ecm
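    For reference, the adaptive LASSO can be implemented with any plain LASSO solver by rescaling the regressors with first-stage weights. A minimal sketch on toy i.i.d. data (ignoring the ARDL/GARCH structure the paper treats; the penalty level below is illustrative, not tuned):

```python
# Sketch of the adaptive LASSO: first-stage LASSO coefficients supply the
# penalty weights; the weighted problem is solved by rescaling columns.
import numpy as np
from sklearn.linear_model import Lasso, LassoCV

def adaptive_lasso(X, y, gamma=1.0, eps=1e-6):
    beta0 = LassoCV(cv=5).fit(X, y).coef_          # first-stage estimates
    w = 1.0 / (np.abs(beta0) + eps) ** gamma       # adaptive weights
    Xs = X / w                                     # rescale column j by 1/w_j
    fit = Lasso(alpha=0.1).fit(Xs, y)              # alpha would be tuned in practice
    return fit.coef_ / w                           # map back to original scale

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 20))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(size=200)   # only two relevant regressors
print(np.nonzero(adaptive_lasso(X, y))[0])         # ideally selects {0, 1}
```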
  3. By: Honore, Bo E. (Princeton University); Hu, Luojia (Federal Reserve Bank of Chicago)
    Abstract: The bootstrap is a convenient tool for calculating standard errors of the parameters of complicated econometric models. Unfortunately, the fact that these models are complicated often makes the bootstrap extremely slow or even practically infeasible. This paper proposes an alternative to the bootstrap that relies only on the estimation of one-dimensional parameters. The paper contains no new difficult math. But we believe that it can be useful.
    Keywords: bootstrap; standard error; inference; structural models; parametric estimation
    JEL: C10 C18
    Date: 2015–03–04
    URL: http://d.repec.org/n?u=RePEc:fip:fedhwp:wp-2015-01&r=ecm
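    For contrast, the conventional nonparametric (pairs) bootstrap that the paper seeks to avoid re-estimates the full parameter vector on every resample. A minimal sketch with OLS as a stand-in estimator:

```python
# Sketch of the standard pairs bootstrap for standard errors -- the costly
# procedure the paper replaces with one-dimensional re-estimations.
import numpy as np

def bootstrap_se(estimator, X, y, B=500, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    draws = []
    for _ in range(B):
        idx = rng.integers(0, n, size=n)           # resample observations
        draws.append(estimator(X[idx], y[idx]))    # full re-estimation each time
    return np.std(draws, axis=0, ddof=1)

ols = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(300), rng.normal(size=300)])
y = X @ np.array([1.0, 0.5]) + rng.normal(size=300)
print(bootstrap_se(ols, X, y))
```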
  4. By: Einmahl, J.H.J. (Tilburg University, Center For Economic Research); Li, Jun; Liu, Regina
    Abstract: Data depth measures the centrality of a point with respect to a given distribution or data cloud. It provides a natural center-outward ordering of multivariate data points and yields a systematic nonparametric multivariate analysis scheme. In particular, the halfspace depth is shown to have many desirable properties and broad applicability. However, the empirical halfspace depth is zero outside the convex hull of the data. This property has rendered the empirical halfspace depth useless outside the data cloud, and limited its utility in applications where the extreme outlying probability mass is the focal point, such as in classification problems and control charts with very small false alarm rates. To address this issue, we apply extreme value statistics to refine the empirical halfspace depth in “the tail”. This provides an important linkage between data depth, which is useful for inference on centrality, and extreme value statistics, which is useful for inference on extremity. The refined empirical halfspace depth can thus extend all its utilities beyond the data cloud, and hence greatly broaden its applicability. The refined estimator is shown to substantially improve upon the empirical estimator in theory and simulations. The benefit of this improvement is also demonstrated through applications in classification and statistical process control.
    Keywords: depth; extremes; nonparametric classification; nonparametric multivariate SPC; tail
    JEL: C13 C14
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:tiu:tiucen:bcd9783a-e07e-4da2-bc47-bb96d816c0d8&r=ecm
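    The empirical halfspace (Tukey) depth of a point is the smallest fraction of observations lying in any closed halfspace through that point. A minimal Monte Carlo approximation over random directions, illustrating the zero-depth problem outside the data cloud:

```python
# Sketch: Monte Carlo approximation of the empirical halfspace (Tukey) depth.
# Outside the convex hull of the data the depth is exactly zero, which is the
# limitation the paper's extreme-value refinement addresses.
import numpy as np

def halfspace_depth(x, data, n_dir=2000, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.normal(size=(n_dir, data.shape[1]))
    u /= np.linalg.norm(u, axis=1, keepdims=True)      # random unit directions
    proj = (data - x) @ u.T                            # project data around x
    frac = (proj >= 0).mean(axis=0)                    # mass in each halfspace
    return np.minimum(frac, 1 - frac).min()

data = np.random.default_rng(3).normal(size=(500, 2))
print(halfspace_depth(np.zeros(2), data))              # deep central point
print(halfspace_depth(np.array([5.0, 5.0]), data))     # ~0 outside the hull
```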
  5. By: Marcelo C. Medeiros (Department of Economics PUC-Rio); Eduardo F. Mendes (Department of Economics Australian School of Business)
    Abstract: We study the asymptotic properties of the Adaptive LASSO (adaLASSO) in sparse, high-dimensional, linear time-series models. We assume that both the number of covariates in the model and the number of candidate variables can increase with the sample size (polynomially or geometrically). In other words, we allow the number of candidate variables to be larger than the number of observations. We show that the adaLASSO consistently chooses the relevant variables as the number of observations increases (model selection consistency) and has the oracle property, even when the errors are non-Gaussian and conditionally heteroskedastic. This allows the adaLASSO to be applied to a myriad of applications in empirical finance and macroeconomics. A simulation study shows that the method performs well in very general settings with t-distributed and heteroskedastic errors as well as with highly correlated regressors. Finally, we consider an application to forecasting monthly US inflation with many predictors. The model estimated by the adaLASSO delivers superior forecasts relative to traditional benchmark competitors such as autoregressive and factor models.
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:rio:texdis:636&r=ecm
  6. By: Samantha Leorato (DEF and CEIS University of Rome Tor Vergata); Maura Mezzetti (DEF and CEIS University of Rome Tor Vergata)
    Abstract: A hierarchical Bayesian model for spatial panel data is proposed. The idea behind the proposed method is to analyze spatially dependent panel data by means of a separable covariance matrix. Denoting the observations y_it, for regions i = 1,...,N and time periods t = 1,...,T, the covariance matrix var(y) is written as the Kronecker product of a purely spatial and a purely temporal covariance matrix. On the one hand, the structure of separable covariances dramatically reduces the number of parameters, while on the other, the lack of a structured pattern for the spatial and temporal covariances makes it possible to capture possible unknown dependencies (both in time and space). The Bayesian approach allows us to overcome some of the difficulties of the classical (MLE or GMM based) approach. We present two illustrative examples: the estimation of cigarette price elasticity and of the determinants of house prices in 120 municipalities in the Province of Rome.
    Keywords: Bayesian Inference, Kronecker Product, Separable Covariance Matrix, Inverted Wishart Distribution, Spatial-Temporal Dependence
    JEL: C11 C23
    Date: 2015–04–09
    URL: http://d.repec.org/n?u=RePEc:rtv:ceisrp:338&r=ecm
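    A separable covariance writes the NT x NT matrix var(y) as the Kronecker product of a T x T temporal and an N x N spatial block, cutting the free parameters from order (NT)^2 to order N^2 + T^2. A minimal sketch (arbitrary positive definite blocks, not the paper's priors):

```python
# Sketch: building a separable spatio-temporal covariance as a Kronecker
# product and sampling one panel from it.
import numpy as np

rng = np.random.default_rng(4)
N, T = 5, 8
A = rng.normal(size=(N, N)); Sigma_S = A @ A.T + N * np.eye(N)  # spatial cov
B = rng.normal(size=(T, T)); Sigma_T = B @ B.T + T * np.eye(T)  # temporal cov

V = np.kron(Sigma_T, Sigma_S)       # var(y), spatial index fastest in the stack
y = rng.multivariate_normal(np.zeros(N * T), V).reshape(T, N)
print(V.shape)                      # (40, 40) from a 5x5 and an 8x8 block
```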
  7. By: Lavergne, Pascal
    Abstract: I propose a new theoretical framework to assess the approximate validity of overidentifying moment restrictions. Their approximate validity is evaluated by the divergence between the true probability measure and the closest measure that imposes the moment restrictions of interest. The divergence can be chosen from the Cressie-Read family. The alternative hypothesis under consideration states that the divergence is smaller than some user-chosen tolerance. Model equivalence tests are constructed for this hypothesis based on the minimum empirical divergence. These tests attain the local semiparametric power envelope of invariant tests. Three empirical applications illustrate their practical usefulness for providing evidence on the potential extent of misspecification.
    Keywords: Hypothesis testing, Semiparametric models.
    JEL: C12 C14 C52
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:29157&r=ecm
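    The Cressie-Read family indexed by lambda nests the familiar divergences: lambda -> 0 gives Kullback-Leibler, lambda = 1 gives half of Pearson's chi-squared, and lambda = -1/2 is proportional to the squared Hellinger distance. A minimal sketch of the discrete-case formula:

```python
# Sketch: the Cressie-Read power divergence between discrete distributions
# p and q: CR_lam = sum_i p_i * ((p_i/q_i)**lam - 1) / (lam * (lam + 1)).
import numpy as np

def cressie_read(p, q, lam):
    p, q = np.asarray(p, float), np.asarray(q, float)
    if abs(lam) < 1e-12:                       # lam -> 0: Kullback-Leibler
        return np.sum(p * np.log(p / q))
    if abs(lam + 1) < 1e-12:                   # lam -> -1: reverse KL
        return np.sum(q * np.log(q / p))
    return np.sum(p * ((p / q) ** lam - 1)) / (lam * (lam + 1))

p, q = [0.2, 0.3, 0.5], [0.25, 0.25, 0.5]
print(cressie_read(p, q, 1.0))                 # half of Pearson chi-squared
print(cressie_read(p, q, 0.0))                 # KL divergence
```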
  8. By: Skrobotov, Anton (Russian Presidential Academy of National Economy and Public Administration (RANEPA)); Turuntseva, Marina (Russian Presidential Academy of National Economy and Public Administration (RANEPA))
    Abstract: This paper provides an overview of methods for the analysis of structural VAR models. The fundamental properties of SVAR models and of the estimated parameters, as well as various methods of identifying shocks and principles for constructing confidence intervals for impulse responses, are discussed. The paper also discusses the problems associated with non-stationary variables.
    Keywords: structural VAR models (SVAR), structural VECM (SVECM), impulse responses, decomposition of the forecast error variances, the identification of shocks
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:rnp:ppaper:mak8&r=ecm
  9. By: Naoto Kunitomo (Faculty of Economics, The University of Tokyo); Hiroumi Misaki (Research Center for Advanced Science and Technology, The University of Tokyo); Seisho Sato (Faculty of Economics, The University of Tokyo)
    Abstract: For estimating the integrated volatility and covariance from high frequency data, Kunitomo and Sato (2011, 2013) have proposed the Separating Information Maximum Likelihood (SIML) method in the presence of micro-market noise. The SIML estimator has reasonable finite sample properties and asymptotic properties when the sample size is large and the hidden efficient price process follows a Brownian semi-martingale. We show that SIML estimation is useful for estimating the integrated covariance and the hedging coefficient in the presence of round-off errors, micro-market price adjustments and noise, and when the high-frequency data are randomly sampled. The SIML estimator is consistent and asymptotically normal in the stable convergence sense under a set of reasonable assumptions, and it has reasonable finite sample properties in the presence of these effects.
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2015cf965&r=ecm
  10. By: Rigby, Robert; Stasinopoulos, Dimitrios; Voudouris, Vlasios
    Abstract: Statistical models usually rely on the assumption that the shape of the distribution is fixed and that it is only the mean and volatility that vary. Although the fitting of heavy tail distributions has become easier due to computational advances, fitting the appropriate heavy tail distribution requires knowledge of the properties of the different theoretical distributions. The selection of the appropriate theoretical distribution is not trivial. Therefore, this paper provides methods for the ordering and comparison of continuous distributions by making a threefold contribution. Firstly, it provides an ordering of the heaviness of the tails of continuous distributions; the resulting classification of over 30 important distributions is given. Secondly, it provides guidance on choosing the appropriate tail for a given variable. As an example, we use USA box-office revenues, an industry characterised by extreme events affecting the supply schedule of the films, to illustrate how the theoretical distribution can be selected. Finally, since moment-based measures may not exist or may be unreliable, the paper uses centile-based measures of skewness and kurtosis to compare distributions. The paper therefore makes a substantial methodological contribution towards the development of conditional densities for statistical models in the presence of heavy tails.
    Keywords: centile measures, heavy tails, distributions
    JEL: C1 C46
    Date: 2015–04–13
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:63620&r=ecm
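    Centile-based shape measures remain defined even when moments do not exist. A minimal sketch of two standard choices (Galton's quartile skewness and Moors' octile kurtosis; possibly not the exact measures used in the paper):

```python
# Sketch: quantile-based skewness (Galton) and kurtosis (Moors), which exist
# even for heavy-tailed distributions with no finite moments.
import numpy as np

def galton_skew(x):
    q1, q2, q3 = np.quantile(x, [0.25, 0.50, 0.75])
    return (q3 + q1 - 2 * q2) / (q3 - q1)

def moors_kurt(x):
    e = np.quantile(x, np.arange(1, 8) / 8)        # octiles E1..E7
    return ((e[6] - e[4]) + (e[2] - e[0])) / (e[5] - e[1])

x = np.random.default_rng(5).standard_cauchy(100000)   # no finite moments
print(galton_skew(x), moors_kurt(x))                    # still well-defined
```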
  11. By: Gaurab Aryal; Dong-Hyuk Kim
    Abstract: We study the identification and estimation of first-price auction models where bidders have ambiguity about the valuation distribution and their preferences are represented by maxmin expected utility. When entry is exogenous, the distribution and ambiguity structure are nonparametrically identified, separately from risk aversion (CRRA). We propose a flexible Bayesian method based on Bernstein polynomials. Monte Carlo experiments show that our method estimates parameters precisely, and chooses reserve prices with (nearly) optimal revenues, whether there is ambiguity or not. Furthermore, if the model is misspecified -- incorrectly assuming no ambiguity among bidders -- it may induce estimation bias with a substantial revenue loss.
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1504.02516&r=ecm
  12. By: Ruli Xiao (Indiana University)
    Abstract: Multiple equilibria in games pose a big challenge for identification and estimation. The existing literature typically abstracts from multiplicity by assuming that the data are generated by the same equilibrium. Instead of imposing such restrictions, this paper provides a nonparametric identification methodology for finite action games with incomplete information that allows for (possibly) multiple equilibria. The method is applicable to both cross-sectional and panel data. Upon observing players' actions, identification is achieved in two steps. First, I identify the equilibrium-specific components, such as the number of equilibria, the equilibrium selection probabilities, and the individual players' strategies associated with each equilibrium. This identification is feasible by treating the underlying equilibrium as a latent variable and using results from the measurement error literature. Next, I identify the payoff functions nonparametrically with conventional exclusion restrictions. A two-step estimator is then proposed based on this identification method, which performs well based on Monte Carlo evidence. I apply the proposed methodology to study the strategic interaction among radio stations when choosing different time slots to air commercials. The empirical results support the claim that multiple equilibria do exist across different cities. Moreover, I find that each city exhibits the same equilibrium over time.
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:inu:caeprp:2015007&r=ecm
  13. By: Garry F. Barrett (School of Economics, University of Sydney); Stephen G. Donald (Department of Economics, University of Texas at Austin); Yu-Chin Hsu (Institute of Economics, Academia Sinica, Taipei, Taiwan)
    Abstract: This paper considers methods for comparing poverty in two income distributions. We first discuss the concept and usefulness of the Poverty Gap Profile (PGP) for comparing poverty in two populations. Dominance of one PGP over another suggests poverty dominance for a wide class of indices which may be expressed as functionals of the PGP. We then discuss hypotheses that can be used to test poverty dominance in terms of the PGP, and introduce and justify a test statistic based on empirical PGPs where we allow for the poverty line to be estimated. A method for obtaining critical values by simulation is proposed that takes account of the estimation of the poverty line. The finite sample properties of the methods are examined in a Monte Carlo simulation study, and the methods are illustrated in an assessment of relative consumption poverty in Australia over the period 1988/89-2009/10.
    JEL: C01 C12 C21
    Keywords: Poverty gap profile, poverty gap profile dominance, hypothesis testing, poverty line
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:sin:wpaper:15-a002&r=ecm
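    The basic ingredient is the average poverty gap E[max(z - y, 0)] at poverty line z; comparing two samples' gap curves over a grid of lines gives a crude pointwise dominance check. A minimal sketch only (fixed lines and no inference, unlike the paper's tests with an estimated poverty line and simulated critical values):

```python
# Sketch: empirical average poverty gap curves for two income samples.
# The paper compares such curves formally; here we only check pointwise
# dominance on a fixed grid of poverty lines.
import numpy as np

def mean_poverty_gap(y, z):
    return np.maximum(z - y, 0.0).mean()           # E[max(z - y, 0)]

rng = np.random.default_rng(6)
y_a = rng.lognormal(mean=3.0, sigma=0.6, size=4000)
y_b = rng.lognormal(mean=3.1, sigma=0.6, size=4000)
lines = np.linspace(5, 40, 50)                     # grid of poverty lines
gap_a = np.array([mean_poverty_gap(y_a, z) for z in lines])
gap_b = np.array([mean_poverty_gap(y_b, z) for z in lines])
print((gap_a >= gap_b).all())   # True if A shows weakly more poverty everywhere
```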
  14. By: Skrobotov, Anton (Russian Presidential Academy of National Economy and Public Administration (RANEPA))
    Abstract: Recent approaches to testing the unit root hypothesis take into account the effects of the initial value, the trend and shifts in the data by pre-testing for the initial value, the trend and the shifts, and on this basis apply a strategy of combining the rejections of several tests. This allows more powerful tests to be used when there is some uncertainty about the model parameters. In this paper we propose a generalization of the approach of Harvey et al. (2012b) to the case of uncertainty about the initial value. It is shown that this approach has low power when the initial value is large, because it includes tests based on GLS detrending. We therefore investigate the effectiveness of several ADF-type unit root tests allowing for a shift, for different values of the initial value, and propose a decision rule based on additional pre-testing of the initial value and the simultaneous use of tests based on GLS and OLS detrending. In addition, modifications of the proposed algorithm are discussed: the use of a coefficient-based pre-test for the trend, the possible presence of multiple structural breaks in the trend, and the presence of partial information about the location of the shift. The asymptotic behaviour of all the tests is analyzed under local parameterizations of the autoregressive parameter and of the parameters of the trend and the shifts. The proposed modifications show good properties, both asymptotically and in finite samples, for different values of the nuisance parameters.
    Keywords: Dickey-Fuller test, local trend, local shift in trend, asymptotic local power, union of rejections, pre-testing, multiple breaks in trend
    JEL: C12 C22
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:rnp:ppaper:mak6&r=ecm
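    The union-of-rejections device rejects the unit root if either the GLS-detrended or the OLS-detrended test rejects, with critical values scaled so the combined procedure keeps its nominal size. A minimal sketch of the decision rule only; the statistics must be computed elsewhere, and the scaling constant and critical values below are placeholders, not tabulated values:

```python
# Sketch of a union-of-rejections decision rule in the spirit of Harvey et
# al.: reject the unit root if either left-tailed test rejects against a
# scaled critical value. `scale` must come from simulated tables; the value
# below is a placeholder, not a tabulated constant.
def union_of_rejections(stat_gls, stat_ols, cv_gls, cv_ols, scale=1.05):
    reject_gls = stat_gls < scale * cv_gls     # scaling makes rejection harder,
    reject_ols = stat_ols < scale * cv_ols     # controlling the joint size
    return reject_gls or reject_ols

# e.g. DF-GLS and ADF-OLS statistics computed with standard packages
print(union_of_rejections(-3.2, -2.1, cv_gls=-2.86, cv_ols=-3.41))
```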
  15. By: Ghonghadze, Jaba; Lux, Thomas
    Abstract: We explore the issue of estimating a simple agent-based model of price formation in an asset market, using the approach of Alfarano et al. (2008) as an example. Since we are able to derive various moment conditions for this model, we can apply generalized method of moments (GMM) estimation. We find that we can obtain relatively accurate parameter estimates with an appropriate choice of moment conditions and an initialization of the iterative GMM estimates that reduces the biases arising from strong autocorrelations of the estimates of certain parameters. We apply our estimator to a sample of long records of returns of various stock and foreign exchange markets as well as the price of gold. Using the estimated parameters to form the best linear forecasts for future volatility, we find that the behavioral model generates sensible forecasts that get close to those of a standard GARCH(1,1) model in their overall performance, and often provide useful information on top of the information incorporated in the GARCH forecasts.
    Keywords: sentiment dynamics, GMM estimation, volatility forecasting
    JEL: G12 C22 C53
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:zbw:fmpwps:38&r=ecm
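    Generic GMM minimises the quadratic form m(theta)' W m(theta) in the sample moment conditions. A minimal sketch with stand-in mean/variance moments (not the Alfarano et al. moment conditions derived in the paper):

```python
# Sketch of GMM: minimise m(theta)' W m(theta). The moment function below is
# a toy stand-in, not the paper's analytical moments for the herding model.
import numpy as np
from scipy.optimize import minimize

data = np.random.default_rng(7).standard_t(df=5, size=5000) * 0.01  # returns

def moments(theta, x):
    mu, sigma = theta
    m1 = x - mu                                       # mean condition
    m2 = (x - mu) ** 2 - sigma ** 2                   # variance condition
    m3 = np.abs(x - mu) - sigma * np.sqrt(2 / np.pi)  # overidentifying condition
    return np.column_stack([m1, m2, m3]).mean(axis=0)

def gmm_objective(theta, x, W):
    m = moments(theta, x)
    return m @ W @ m

W = np.eye(3)                                         # first step: identity weight
res = minimize(gmm_objective, x0=[0.0, 0.02], args=(data, W), method="Nelder-Mead")
print(res.x)
```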
  16. By: Nandram, B.; Bhadra, Dhiman; Liu, Yiwei
    Abstract: It is a well known fact that race- and ethnicity-specific variations exist in the treatment and survival of cancer patients. Studies based on breast cancer patients admitted to community hospitals in the U.S. showed that there are significant differences in patterns of care between black and white breast cancer patients, with blacks receiving a lower quality and quantity of care. In this study, we look at this problem from a different perspective, treating the hospitals as small areas and employing Bayesian techniques for parameter estimation. Two separate models are constructed to estimate the odds ratio of receiving a liver scan (a pattern of care) for blacks and whites. The first model uses hospital-specific information, while the second one uses pooled hospital data, borrowing strength from neighbouring hospitals. We use the noncentral hypergeometric distribution as the basis for constructing the likelihood, while estimation is carried out using the griddy Metropolis-Hastings sampler. We apply our methodology to a National Cancer Institute (NCI) database. Although our results corroborate some of the observations from previous studies, the methodology offers a computationally attractive alternative to the established procedures for formulating and analyzing this problem.
    URL: http://d.repec.org/n?u=RePEc:iim:iimawp:13338&r=ecm
  17. By: Zhenxi, Chen; Lux, Thomas
    Abstract: We take the model of Alfarano et al. (Journal of Economic Dynamics & Control 32, 2008, 101-136) as a prototype agent-based model that allows reproducing the main stylized facts of financial returns. The model does so by combining fundamental news driven by Brownian motion with a minimalistic mechanism for generating boundedly rational sentiment dynamics. Since we can approximate the herding component among an ensemble of agents in the aggregate by a Langevin equation, we can either simulate the model in full at the micro level, or investigate the impact of sentiment formation in an aggregate asset pricing equation. In the simplest version of our model, only three parameters need to be estimated. We estimate this model using a simulated method of moments (SMM) approach. As it turns out, sensible parameter estimates can only be obtained if one first provides a rough "mapping" of the objective function via an extensive grid search. Due to the high correlations of the estimated parameters, uninformed choices will often lead to a convergence to any one of a large number of local minima. We also find that even for large data sets and simulated samples, the efficiency of SMM remains distinctly inferior to that of GMM based on the same set of moments. We believe that this feature is due to the limited range of moments available in univariate asset pricing models, and that the sensitivity of the present model to the specification of the SMM estimator could carry over to many related agent-based models of financial markets as well as to similar diffusion processes in mathematical finance.
    Keywords: simulation-based estimation, herding, agent-based model, model validation
    JEL: C14 C15 F31
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:zbw:fmpwps:37&r=ecm
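    SMM replaces analytical moments by moments of simulated paths; as the abstract stresses, a coarse grid "mapping" of the objective before local optimisation helps avoid spurious local minima. A minimal sketch with an AR(1) toy simulator standing in for the agent-based model:

```python
# Sketch of SMM with a preliminary grid search: map the objective on a coarse
# grid, then refine from the best grid point. The AR(1) simulator is a toy
# stand-in for the agent-based model.
import numpy as np
from scipy.optimize import minimize

def simulate(phi, T=20000, seed=0):
    rng = np.random.default_rng(seed)              # fixed seed: common random
    x = np.zeros(T)                                # numbers across evaluations
    for t in range(1, T):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

def emp_moments(x):
    return np.array([x.var(), np.corrcoef(x[1:], x[:-1])[0, 1]])

target = emp_moments(simulate(0.7, seed=42))       # pretend these are the data

def objective(theta):
    phi = np.atleast_1d(theta)[0]
    m = emp_moments(simulate(phi)) - target
    return m @ m

grid = np.linspace(-0.9, 0.9, 19)                  # coarse map of the surface
phi0 = grid[np.argmin([objective(p) for p in grid])]
res = minimize(objective, x0=[phi0], method="Nelder-Mead")
print(res.x)                                       # close to 0.7
```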
  18. By: Raffaella Giacomini; Barbara Rossi
    Abstract: This review provides an overview of forecasting methods that can help researchers forecast in the presence of non-stationarities caused by instabilities. The review is both theoretical and applied, and provides several examples of interest to economists. We show that modeling instabilities can help, but that it depends on how they are modeled. We also show how to robustify a model against instabilities.
    Keywords: Forecasting, instabilities, structural breaks.
    Date: 2014–12
    URL: http://d.repec.org/n?u=RePEc:upf:upfgen:1476&r=ecm
  19. By: Marco S. Matsumura; Ajax R. B. Moreira
    Abstract: We propose different exactly identified specifications of affine models with observed macro factors. The models are compared by estimating Brazilian domestic and sovereign yield curves.
    Date: 2015–01
    URL: http://d.repec.org/n?u=RePEc:ipe:ipetds:0178&r=ecm
  20. By: Ahmed Bel Hadj Ayed; Grégoire Loeper; Frédéric Abergel
    Abstract: In this paper, we consider a stochastic asset price model where the trend is an unobservable Ornstein-Uhlenbeck process. We first review some classical results from Kalman filtering. As expected, the choice of the parameters is crucial for putting it into practice. For this purpose, we obtain the likelihood in closed form, and provide two online computations of this function. Then, we investigate the asymptotic behaviour of statistical estimators. Finally, we quantify the effect of a bad calibration with the continuous-time mis-specified Kalman filter. Numerical examples illustrate the difficulty of trend forecasting in financial time series.
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1504.03934&r=ecm
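    With the trend discretised to an AR(1) state observed in noisy returns, the Kalman recursions and the Gaussian likelihood take a few lines. A minimal sketch (a simplified scalar setup, not the paper's exact parameterisation):

```python
# Sketch: Kalman filter for a latent AR(1) trend observed in noisy returns,
# y_t = x_t + eps_t,  x_t = a * x_{t-1} + eta_t, with Gaussian log-likelihood.
import numpy as np

def kalman_loglik(y, a, q, r):
    x, P, ll = 0.0, q / (1 - a ** 2), 0.0          # stationary initialisation
    for obs in y:
        x, P = a * x, a * a * P + q                # predict
        S = P + r                                  # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * S) + (obs - x) ** 2 / S)
        K = P / S                                  # Kalman gain
        x, P = x + K * (obs - x), (1 - K) * P      # update
    return ll

rng = np.random.default_rng(8)
T, a, q, r = 1000, 0.95, 0.1, 1.0
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(scale=np.sqrt(q))
y = x + rng.normal(scale=np.sqrt(r), size=T)
print(kalman_loglik(y, 0.95, 0.1, 1.0) > kalman_loglik(y, 0.5, 0.1, 1.0))
```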
  21. By: Xi-Yuan Qian (ECUST); Ya-Min Liu (ECUST); Zhi-Qiang Jiang (ECUST); Boris Podobnik (BU and ZSEM); Wei-Xing Zhou (ECUST); H. Eugene Stanley (BU)
    Abstract: We propose a new method, detrended partial cross-correlation analysis (DPXA), to uncover the intrinsic power-law cross-correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis by taking into account the partial correlation analysis. We illustrate the performance of the method using bivariate fractional Brownian motions and multifractal binomial measures with analytical expressions and apply it to extract the intrinsic cross-correlation between crude oil and gold futures by considering the impact of the US dollar index.
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1504.02435&r=ecm
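    The partial-correlation step removes the common external force from both series before measuring their dependence; in the static (non-detrended) case it reduces to the textbook partial correlation coefficient, sketched below:

```python
# Sketch: the plain partial correlation of x and y given z -- the static
# analogue of the step DPXA applies scale-by-scale to detrended profiles.
import numpy as np

def partial_corr(x, y, z):
    r = np.corrcoef(np.vstack([x, y, z]))
    rxy, rxz, ryz = r[0, 1], r[0, 2], r[1, 2]
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

rng = np.random.default_rng(9)
z = rng.normal(size=5000)                    # common external force
x = 0.8 * z + rng.normal(size=5000)
y = 0.8 * z + rng.normal(size=5000)
print(np.corrcoef(x, y)[0, 1])               # spuriously large
print(partial_corr(x, y, z))                 # near zero once z is removed
```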
  22. By: Mauro Bernardi; Leopoldo Catania
    Abstract: Recent financial disasters have emphasised the need to accurately predict extreme financial losses and their consequences for the institutions belonging to a given financial market. The ability of econometric models to predict extreme events strongly relies on their flexibility in accounting for the highly nonlinear and asymmetric dependence observed in financial returns. We develop a new class of flexible Copula models where the evolution of the dependence parameters follows Markov-Switching Generalised Autoregressive Score (SGASC) dynamics. Maximum Likelihood estimation is consistently performed using the Inference Functions for Margins (IFM) approach and a version of the Expectation-Maximisation (EM) algorithm specifically tailored to this class of models. The SGASC models are then used to estimate the Conditional Value-at-Risk (CoVaR), which is defined as the VaR of a given asset conditional on another asset (or portfolio) being in financial distress, and the Conditional Expected Shortfall (CoES). Our empirical investigation shows that the proposed SGASC models are able to explain and predict the systemic risk contribution of several European countries. Moreover, we also find that the SGASC models outperform competitors under several CoVaR backtesting procedures.
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1504.03733&r=ecm
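    Given simulated paths from any estimated joint model, CoVaR can be read off directly: the alpha-quantile of one series conditional on the other sitting at or below its own VaR. A minimal sketch on bivariate normal draws standing in for simulations from an estimated (e.g. switching-copula) model:

```python
# Sketch: Monte Carlo CoVaR -- the alpha-quantile of asset X conditional on
# asset Y being at or below its own VaR.
import numpy as np

def covar(x, y, alpha=0.05):
    var_y = np.quantile(y, alpha)            # VaR of the conditioning asset
    distress = y <= var_y                    # Y in financial distress
    return np.quantile(x[distress], alpha)   # CoVaR of X given Y distressed

rng = np.random.default_rng(10)
cov = [[1.0, 0.6], [0.6, 1.0]]               # positive dependence
x, y = rng.multivariate_normal([0, 0], cov, size=100000).T
print(np.quantile(x, 0.05))                  # unconditional VaR of X
print(covar(x, y))                           # deeper loss under contagion
```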

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.