nep-ecm New Economics Papers
on Econometrics
Issue of 2007‒10‒20
twenty-six papers chosen by
Sune Karlsson
Orebro University

  1. A Specification Test For Nonparametric Instrumental Variable Regression By Patrick Gagliardini; Olivier Scaillet
  2. Tikhonov Regularization for Functional Minimum Distance Estimators By P. Gagliardini; O. Scaillet
  3. Robust Value at Risk Prediction By Loriano Mancini; Fabio Trojani
  4. Local Transformation Kernel Density Estimation of Loss Distributions By J. Gustafsson; M. Hagmann; J.P. Nielsen; O. Scaillet
  5. A Test for Serial Dependence Using Neural Networks By George Kapetanios
  6. Inference for Parameters Defined by Moment Inequalities Using Generalized Moment Selection By Donald W.K. Andrews; Gustavo Soares
  7. An Objective Function for Simulation Based Inference on Exchange Rate Data By Peter Winker; Manfred Gilli; Vahidin Jeleskovic
  8. Robust Subsampling By Lorenzo Camponovo; Olivier Scaillet; Fabio Trojani
  9. Analyzing Strongly Periodic Series in the Frequency Domain: A Comparison of Alternative Approaches with Applications By Artis, Michael J; Clavel, Jose Garcia; Hoffmann, Mathias; Nachane, Dilip M
  10. Forecasting the Yield Curve Using Priors from No Arbitrage Affine Term Structure Models By Andrea Carriero
  11. Modelling and Forecasting Oil Prices: The Role of Asymmetric Cycles By Jesus Crespo Cuaresma; Adusei Jumah; Sohbet Karbuz
  12. Testing for Breaks in Cointegrated Panels - with an Application to the Feldstein-Horioka Puzzle By Di Iorio, Francesca; Fachin, Stefano
  13. A Method of Moments Estimator of Tail Dependence By Einmahl, J.H.J.; Krajina, A.; Segers, J.J.J.
  14. Mixtures of t-distributions for Finance and Forecasting By Giacomini, Raffaella; Gottschling, Andreas; Haefke, Christian; White, Halbert
  15. Detecting long memory co-movements in macroeconomic time series By Gianluca Moretti
  16. A Note on the Relation of Weighting and Matching Estimators By Michael Lechner
  17. The Stability and Volatility of Electricity Prices: An Illustration of (lambda, sigma-2) Analysis By Bask, Mikael; Widerberg, Anna
  18. New Eurocoin: Tracking Economic Growth in Real Time By Filippo Altissimo; Riccardo Cristadoro; Mario Forni; Marco Lippi; Giovanni Veronese
  19. A variable-neighbourhood search algorithm for finding optimal run orders in the presence of serial correlation and time trends By Garroi J.-J.; Goos P.; Sörensen K.
  20. A test for monotone comparative statics By Echenique, Federico; Komunjer, Ivana
  21. Learning in Real Time: Theory and Empirical Evidence from the Term Structure of Survey Forecasts By Patton, Andrew J; Timmermann, Allan G
  22. Forecasting elections using expert surveys: an application to U.S. presidential elections By Jones, Randall J.; Armstrong, J. Scott; Cuzan, Alfred G.
  23. The comonotonicity coefficient: a new measure of positive dependence in a multivariate setting By Koch I.; De Schepper A.
  24. Prices and Portfolio Choices in Financial Markets: Theory, Econometrics, Experiments By Peter Bossaerts; Charles Plott; William R. Zame
  25. The Impact of News on Higher Moments By Eric Jondeau; Michael Rockinger
  26. A GARCH Option Pricing Model in Incomplete Markets By Giovanni Barone-Adesi; Robert F. Engle; Loriano Mancini

  1. By: Patrick Gagliardini (University of Lugano and Swiss Finance Institute); Olivier Scaillet (University of Geneva, HEC and Swiss Finance Institute)
    Abstract: We consider testing for correct specification of a nonparametric instrumental variable regression. In this ill-posed inverse problem setting, the test statistic is based on the empirical minimum distance criterion corresponding to the conditional moment restriction evaluated with a Tikhonov Regularized estimator of the functional parameter. Its asymptotic distribution is normal under the null hypothesis, and a consistent bootstrap is available to get simulation based critical values. We explore the finite sample behavior with Monte Carlo experiments. Finally, we provide an empirical application for an estimated Engel curve.
    Keywords: Specification Test, Nonparametric Regression, Instrumental Variables, Minimum Distance, Tikhonov Regularization, Ill-posed Inverse Problems, Generalized Method of Moments, Bootstrap, Engel Curve
    JEL: C13 C14 C15 D12
    Date: 2007–04
  2. By: P. Gagliardini (University of Lugano and Swiss Finance Institute); O. Scaillet (University of Geneva and Swiss Finance Institute)
    Abstract: We study the asymptotic properties of a Tikhonov Regularized (TiR) estimator of a functional parameter based on a minimum distance principle for nonparametric conditional moment restrictions. The estimator is computationally tractable and takes a closed form in the linear case. We derive its asymptotic Mean Integrated Squared Error (MISE), its rate of convergence and its pointwise asymptotic normality under a regularization parameter depending on sample size. The optimal value of the regularization parameter is characterized. We illustrate our theoretical findings and the small sample properties with simulation results for two numerical examples. We also discuss two data driven selection procedures of the regularization parameter via a spectral representation and a subsampling approximation of the MISE. Finally, we provide an empirical application to nonparametric estimation of an Engel curve.
    Keywords: Minimum Distance, Nonparametric Estimation, Ill-posed Inverse Problems, Tikhonov Regularization, Endogeneity, Instrumental Variable, Generalized Method of Moments, Subsampling, Engel curve.
    JEL: C13 C14 C15 D12
    Date: 2006–05
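In the linear case the TiR estimator's closed form reduces, in a finite-dimensional analogue, to ridge-type regularization: solve min ||Ax - b||^2 + lam ||x||^2, giving x = (A'A + lam I)^{-1} A'b. A minimal sketch under that analogy (the paper works in function spaces, not R^2; the operator and data below are toy illustrations):

```python
# Finite-dimensional illustration of the closed-form Tikhonov-regularized
# solution for the linear case: x = (A'A + lam I)^{-1} A'b.

def mat_t(A):
    return [list(row) for row in zip(*A)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def mat_vec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def solve2(M, v):
    # Solve a 2x2 linear system by Cramer's rule.
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - M[0][1] * v[1]) / det,
            (M[0][0] * v[1] - v[0] * M[1][0]) / det]

def tikhonov(A, b, lam):
    At = mat_t(A)
    AtA = mat_mul(At, A)
    for i in range(len(AtA)):
        AtA[i][i] += lam          # add lam * I: the regularization term
    return solve2(AtA, mat_vec(At, b))

# A nearly singular toy operator: without the lam*I term the inversion
# is unstable; with it the solution stays well behaved.
A = [[1.0, 1.0], [1.0, 1.0001]]
b = [2.0, 2.0001]
x_reg = tikhonov(A, b, lam=0.1)
```

The regularization parameter trades bias against variance, which is why the paper's data-driven selection of lam via the MISE matters.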
  3. By: Loriano Mancini; Fabio Trojani
    Abstract: We propose a general robust semiparametric bootstrap method to estimate conditional predictive distributions of GARCH-type models. Our approach is based on a robust estimator for the parameters in GARCH-type models and a robustified resampling method for standardized GARCH residuals, which controls the bootstrap instability due to influential observations in the tails of standardized GARCH residuals. Monte Carlo simulation shows that our method consistently provides lower VaR forecast errors, often to a large extent, and in contrast to classical methods never fails validation tests at usual significance levels. We test our approach extensively in the context of real data applications to VaR prediction for market risk, and find that only our robust procedure passes all validation tests at usual confidence levels. Moreover, the smaller tail estimation risk of robust VaR forecasts implies VaR prediction intervals that can be nearly 20% narrower and 50% less volatile over time. This is a further desirable property of our method, which allows risky positions to be adapted to VaR limits more smoothly and thus more efficiently.
    Keywords: Backtesting, M-estimator, Extreme Value Theory, Breakdown Point
    JEL: C14 C15 C23 C59
    Date: 2007–09
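A heavily simplified sketch of the general idea: resample standardized residuals after curbing the influence of extreme tail observations, then read VaR off the bootstrap predictive distribution. This is not the authors' estimator; the volatility filter (a plain sample standard deviation) and the winsorization rule below are crude stand-ins for their robust GARCH estimation and robust resampling:

```python
import random
import statistics

random.seed(0)
returns = [random.gauss(0, 1) * 0.01 for _ in range(500)]   # synthetic returns

sigma = statistics.pstdev(returns)          # stand-in for a GARCH volatility
z = sorted(r / sigma for r in returns)      # standardized residuals

# Robustification step: winsorize the most extreme 1% on each tail so a few
# outliers cannot destabilize the resampling.
k = max(1, len(z) // 100)
z_rob = [z[k]] * k + z[k:-k] + [z[-k - 1]] * k

# Bootstrap predictive distribution of next-period returns, then 99% VaR
# as a positive loss number.
boot = [sigma * random.choice(z_rob) for _ in range(10000)]
var_99 = -sorted(boot)[int(0.01 * len(boot))]
```

In the paper the resampling is applied to residuals filtered by a robustly estimated GARCH model, which is what keeps the VaR forecasts both accurate and stable.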
  4. By: J. Gustafsson (Codan Insurance and University of Copenhagen, Copenhagen, Denmark); M. Hagmann (University of Geneva and Concordia Advisors, London, United Kingdom); J.P. Nielsen (Festina Lente and University of Copenhagen, Copenhagen, Denmark); O. Scaillet (University of Geneva and Swiss Finance Institute)
    Abstract: We develop a tailor made semiparametric asymmetric kernel density estimator for the estimation of actuarial loss distributions. The estimator is obtained by transforming the data with the generalized Champernowne distribution initially fitted to the data. Then the density of the transformed data is estimated by use of local asymmetric kernel methods to obtain superior estimation properties in the tails. We find in a vast simulation study that the proposed semiparametric estimation procedure performs well relative to alternative estimators. An application to operational loss data illustrates the proposed method.
    Keywords: Actuarial loss models, Transformation, Champernowne distribution, asymmetric kernels, local likelihood estimation
    JEL: C13 C14
    Date: 2006–11
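The transformation idea can be sketched as follows: map the losses through a fitted parametric CDF T, estimate the density g of the transformed (roughly uniform) data with a kernel, and back-transform via the Jacobian, f(x) = g(T(x)) T'(x). For brevity this sketch uses an exponential CDF and a plain Gaussian kernel, not the generalized Champernowne transform and local asymmetric kernels of the paper:

```python
import math
import random

random.seed(1)
losses = [random.expovariate(0.5) for _ in range(400)]   # synthetic loss data

lam = 1.0 / (sum(losses) / len(losses))   # MLE of the exponential rate

def T(x):        # fitted transform: exponential CDF, maps losses into (0, 1)
    return 1.0 - math.exp(-lam * x)

def T_prime(x):  # its density (the Jacobian of the transform)
    return lam * math.exp(-lam * x)

u = [T(x) for x in losses]
h = 0.05         # kernel bandwidth on the transformed scale

def g_hat(v):    # Gaussian kernel density estimate of the transformed data
    c = 1.0 / (len(u) * h * math.sqrt(2 * math.pi))
    return c * sum(math.exp(-0.5 * ((v - ui) / h) ** 2) for ui in u)

def f_hat(x):    # back-transformed loss density estimate
    return g_hat(T(x)) * T_prime(x)
```

The paper's asymmetric kernels replace the Gaussian kernel precisely to avoid the boundary bias this naive version suffers near 0 and 1, which is what yields the superior tail behaviour.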
  5. By: George Kapetanios (Queen Mary, University of London)
    Abstract: Testing serial dependence is central to much of time series econometrics. A number of tests have been developed and used to explore the dependence properties of various processes. This paper builds on recent work on nonparametric tests of independence. We consider a fact that characterises serially dependent processes using a generalisation of the autocorrelation function. Using this fact we build dependence tests that make use of neural network based approximations. We derive the theoretical properties of our tests and show that they have superior power properties. Our Monte Carlo evaluation supports the theoretical findings. An application to a large dataset of stock returns illustrates the usefulness of the proposed tests.
    Keywords: Independence, Neural networks, Strict stationarity, Bootstrap, S&P500
    JEL: C32 C33 G12
    Date: 2007–10
  6. By: Donald W.K. Andrews (Cowles Foundation, Yale University); Gustavo Soares (Dept. of Economics, Yale University)
    Abstract: The topic of this paper is inference in models in which parameters are defined by moment inequalities and/or equalities. The parameters may or may not be identified. This paper introduces a new class of confidence sets and tests based on generalized moment selection (GMS). GMS procedures are shown to have correct asymptotic size in a uniform sense and are shown not to be asymptotically conservative. The power of GMS tests is compared to that of subsampling, m out of n bootstrap, and "plug-in asymptotic" (PA) tests. The latter three procedures are the only general procedures in the literature that have been shown to have correct asymptotic size in a uniform sense for the moment inequality/equality model. GMS tests are shown to have asymptotic power that dominates that of subsampling, m out of n bootstrap, and PA tests. Subsampling and m out of n bootstrap tests are shown to have asymptotic power that dominates that of PA tests.
    Keywords: Asymptotic size, Asymptotic power, Confidence set, Exact size, Generalized moment selection, m out of n bootstrap, Subsampling, Moment inequalities, Moment selection, Test
    JEL: C12 C15
    Date: 2007–10
  7. By: Peter Winker (Department of Economics, University of Giessen); Manfred Gilli (University of Geneva and Swiss Finance Institute); Vahidin Jeleskovic (Department of Economics, University of Giessen)
    Abstract: The assessment of models of financial market behavior requires evaluation tools. When complexity hinders a direct estimation approach, e.g., for agent-based microsimulation models or complex multifractal models, simulation based estimators might provide an alternative. In order to apply such techniques, an objective function is required, which should be based on robust statistics of the time series under consideration. Based on the identification of robust statistics of foreign exchange rate time series in previous research, an objective function is derived. This function takes into account stylized facts about the unconditional distribution of exchange rate returns and properties of the conditional distribution, in particular, autoregressive conditional heteroscedasticity and long memory. A bootstrap procedure is used to obtain an estimate of the variance-covariance matrix of the different moments included in the objective function, which is used as the basis for the weighting matrix. Finally, the properties of the objective function are analyzed for two different agent-based models of the foreign exchange market, a simple GARCH-model and a stochastic volatility model using the DM/US-$ exchange rate as a benchmark. It is also discussed how the results might be used for inference purposes.
    Keywords: Indirect estimation; simulation based estimation; exchange rate returns
    JEL: C14 C15 F31
    Date: 2007–02
  8. By: Lorenzo Camponovo (University of Lugano); Olivier Scaillet (University of Geneva and Swiss Finance Institute); Fabio Trojani (University of St. Gallen)
    Abstract: We compute the breakdown point of the subsampling quantile of a general statistic, and show that it is increasing in the subsampling block size and the breakdown point of the statistic. These results imply fragile subsampling quantiles for moderate block sizes, even when subsampling procedures are applied to robust statistics. This instability is inherited by data driven block size selection procedures based on the minimum confidence interval volatility (MCIV) index. To overcome these problems, we propose for the linear regression setting a robust subsampling method, which implies a sufficiently high breakdown point and is consistent under standard conditions. Monte Carlo simulations and sensitivity analysis in the linear regression setting show that the robust subsampling with block size selection based on the MCIV index outperforms the subsampling, the classical bootstrap and the robust bootstrap, in terms of accuracy and robustness. These results show that robustness is a key aspect in selecting data driven subsampling block sizes.
    Keywords: Subsampling, bootstrap, breakdown point, robustness, regression
    JEL: C12 C13 C15
    Date: 2006–11
  9. By: Artis, Michael J; Clavel, Jose Garcia; Hoffmann, Mathias; Nachane, Dilip M
    Abstract: Strongly periodic series occur frequently in many disciplines. This paper reviews one specific approach to analyzing such series viz. the harmonic regression approach. In this paper, the five major methods suggested under this approach are critically reviewed and compared, and their empirical potential highlighted via two applications. The out-of-sample forecast comparisons are made using the Superior Predictive Ability test, which specifically guards against the perils of data snooping. Certain tentative conclusions are drawn regarding the relative forecasting ability of the different methods.
    Keywords: autoregressive methods; data snooping; dynamic harmonic regression; eigenvalue methods; mixed spectrum; multiple forecast comparisons
    JEL: C22 C53
    Date: 2007–10
  10. By: Andrea Carriero (Queen Mary, University of London)
    Abstract: In this paper we propose a strategy for forecasting the term structure of interest rates which may produce significant gains in predictive accuracy. The key idea is to use the restrictions implied by Affine Term Structure Models (ATSM) on a vector autoregression (VAR) as prior information rather than imposing them dogmatically. This allows us to account for possible model misspecification. We apply the method to a system of five US yields, and we find that the gains in predictive accuracy can be substantial. In particular, for horizons longer than 1-step ahead, our proposed method produces systematically better forecasts than those obtained by using a pure ATSM or an unrestricted VAR, and it also outperforms very competitive benchmarks such as the Minnesota prior, the Diebold-Li (2006) model, and the random walk.
    Keywords: Bayesian methods, Forecasting, Term structure
    JEL: C11 C53 E43 E47
    Date: 2007–10
  11. By: Jesus Crespo Cuaresma; Adusei Jumah; Sohbet Karbuz
    Abstract: We propose a new time series model aimed at forecasting crude oil prices. The proposed specification is an unobserved components model with an asymmetric cyclical component. The asymmetric cycle is defined as a sine-cosine wave where the frequency of the cycle depends on past oil price observations. We show that oil price forecasts improve significantly when this asymmetry is explicitly modelled.
    Keywords: Oil price, forecasting, nonlinear time series analysis, asymmetric cycles.
    JEL: C22 O13 C53
  12. By: Di Iorio, Francesca; Fachin, Stefano
    Abstract: Stability tests for cointegrating coefficients are known to have very low power with small to medium sample sizes. In this paper we propose to solve this problem by extending the tests to dependent cointegrated panels through the stationary bootstrap. Simulation evidence shows that the proposed panel tests improve considerably on asymptotic tests applied to individual series. As an empirical illustration we examined investment and saving for a panel of 14 European countries over the 1960-2002 period. While the individual stability tests, contrary to expectations and graphical evidence, in almost all cases do not reject the null of stability, the bootstrap panel tests lead to the more plausible conclusion that the long-run relationship between these two variables is likely to have undergone a break.
    Keywords: Panel cointegration, stationary bootstrap, parameter stability tests, FM-OLS
    JEL: C15 C23
    Date: 2007
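The stationary bootstrap of Politis and Romano, on which the panel tests rest, resamples blocks that start at random positions and have geometric lengths with mean 1/p, wrapping around the end of the series so that the resampled series remains stationary. A sketch of the resampling step:

```python
import random

def stationary_bootstrap(series, p, rng):
    # Resample a dependent series: geometric block lengths, wrap-around.
    n = len(series)
    out = []
    i = rng.randrange(n)
    while len(out) < n:
        out.append(series[i])
        if rng.random() < p:      # with probability p, start a new block
            i = rng.randrange(n)
        else:                     # otherwise continue the block (wrapping)
            i = (i + 1) % n
    return out

rng = random.Random(42)
x = [float(t) for t in range(100)]
xb = stationary_bootstrap(x, p=0.1, rng=rng)   # expected block length 10
```

Preserving blocks of consecutive observations is what lets the bootstrap mimic the serial dependence within each panel unit when simulating the distribution of the stability tests.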
  13. By: Einmahl, J.H.J.; Krajina, A.; Segers, J.J.J. (Tilburg University, Center for Economic Research)
    Abstract: AMS 2000 subject classifications: 60G70, 62H12, 62H15, 62F05, 62F12, 62F25.
    Keywords: asymptotic properties;confidence regions;goodness-of-fit test;meta-elliptical distribution;method of moments;multivariate extremes;tail dependence.
    JEL: C12 C13 C14
    Date: 2007
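For orientation, the quantity being estimated is the (upper) tail dependence coefficient, lambda = lim_{q→1} P(Y > G⁻¹(q) | X > F⁻¹(q)). The naive nonparametric counterpart simply counts joint exceedances of the k-th largest order statistics; the paper's method-of-moments estimator for parametric tail dependence models is more refined than this sketch:

```python
import random

def tail_dependence(xs, ys, k):
    # Fraction of the k largest X-observations whose Y-partner is also
    # among the k largest: a crude estimate of upper tail dependence.
    n = len(xs)
    x_thr = sorted(xs)[n - k]
    y_thr = sorted(ys)[n - k]
    joint = sum(1 for x, y in zip(xs, ys) if x > x_thr and y > y_thr)
    return joint / k

rng = random.Random(7)
# Perfectly dependent pair: the estimate should be essentially 1.
xs = [rng.gauss(0, 1) for _ in range(1000)]
lam_hat = tail_dependence(xs, xs, k=50)
```

(With continuous data and identical margins the count is k - 1 strict joint exceedances, so the estimate is (k - 1)/k rather than exactly 1; the bias vanishes as k grows.)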
  14. By: Giacomini, Raffaella (University College London); Gottschling, Andreas (Deutsche Bank AG, Credit RiskManagement); Haefke, Christian (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria); White, Halbert (Department of Economics, University of California, San Diego)
    Abstract: We explore convenient analytic properties of distributions constructed as mixtures of scaled and shifted t-distributions. A feature that makes this family particularly desirable for econometric applications is that it possesses closed-form expressions for its anti-derivatives (e.g., the cumulative density function). We illustrate the usefulness of these distributions in two applications. In the first application, we use a scaled and shifted t-distribution to produce density forecasts of U.S. inflation and show that these forecasts are more accurate, out-of-sample, than density forecasts obtained using normal or standard t-distributions. In the second application, we replicate the option-pricing exercise of Abadir and Rockinger (2003) using a mixture of scaled and shifted t-distributions and obtain comparably good results, while gaining analytical tractability.
    Keywords: ARMA-GARCH models, neural networks, nonparametric density estimation, forecast accuracy, option pricing, risk neutral density
    JEL: C63 C53 C45
    Date: 2007–10
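The building block of the family is a scaled and shifted Student-t density; a mixture of such densities is just a weighted sum. This sketch shows the density only; the closed-form antiderivatives that the paper exploits for forecasting and option pricing are not reproduced here:

```python
import math

def t_pdf(x, nu, mu=0.0, s=1.0):
    # Density of a Student-t with nu degrees of freedom, shifted by mu
    # and scaled by s.
    z = (x - mu) / s
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1 + z * z / nu) ** (-(nu + 1) / 2) / s

def mixture_pdf(x, weights, params):
    # params: list of (nu, mu, s) triples; weights should sum to one.
    return sum(w * t_pdf(x, *p) for w, p in zip(weights, params))

# A fat-tailed, mildly skewed density built from two t components
# (illustrative weights and parameters).
density_at_zero = mixture_pdf(0.0, [0.7, 0.3], [(5, 0.0, 1.0), (3, 0.5, 2.0)])
```

Mixing components with different locations and scales produces skewness and excess kurtosis while each component, and hence the mixture, keeps a tractable CDF.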
  15. By: Gianluca Moretti (Banca d'Italia, Research Department)
    Abstract: Cointegration analysis tests for the existence of a significant long-run equilibrium among some economic variables. Standard econometric procedures to test for cointegration have proven unreliable when the long-run relation among the variables is characterized by non-linearities and persistent fluctuations around the equilibrium. As a consequence, many intuitive economic relations are empirically rejected. In this paper we propose a simple approach to account for non-linearities in the cointegrating equilibrium and possible long memory fluctuations from such equilibrium. We show that our correction allows us to test robustly for the presence of cointegration both under the null and alternative hypotheses. We apply our procedure to the Johansen-Juselius PPP-UIP database, and unlike the standard case, we do not fail to reject the null of no cointegration.
    Keywords: Cointegration analysis, long memory
    JEL: C22 C51
    Date: 2007–09
  16. By: Michael Lechner
    Abstract: This paper compares the inverse-probability-of-selection-weighting estimation principle with the matching principle and derives conditions for weighting and matching to identify the same and the true distribution, respectively. This comparison improves the understanding of the relation of these estimation principles and allows constructing new estimators.
    Keywords: Matching, inverse-of-selection-probability weighting, treatment evaluation, unconfoundedness
    JEL: C21 C13 C14
    Date: 2007–09
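The weighting principle compared in the note can be illustrated with a toy average treatment effect estimator: weight treated outcomes by 1/p(x) and control outcomes by 1/(1 - p(x)) so that each group represents the full covariate distribution. Propensities are taken as known here; in practice they are estimated:

```python
def ipw_ate(data):
    # data: list of (outcome y, treatment d in {0, 1}, propensity p).
    # Inverse-probability-of-selection weighting for the ATE.
    n = len(data)
    treated = sum(y * d / p for y, d, p in data) / n
    control = sum(y * (1 - d) / (1 - p) for y, d, p in data) / n
    return treated - control

# Treatment adds +2 to everyone; the estimator should recover 2 exactly
# in this balanced toy sample.
data = [(1.0, 0, 0.5), (3.0, 1, 0.5), (2.0, 0, 0.5), (4.0, 1, 0.5)]
ate = ipw_ate(data)
```

Matching estimators instead pair each treated unit with comparable controls; the note's point is that both principles can be written as weighted averages, which makes their identifying conditions directly comparable.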
  17. By: Bask, Mikael (Monetary Policy and Research Department, Bank of Finland); Widerberg, Anna (Department of Economics, School of Business, Economics and Law, Göteborg University)
    Abstract: The aim of this letter is to discuss and illustrate what we call (lambda, sigma-2) analysis, which is a method to distinguish between the stability of a stochastic dynamic system and the volatility of a variable generated by this system. It is also emphasized that this method is able to generate new research questions for economic theory. The data set used in an empirical illustration is spot electricity prices from Nord Pool.
    Keywords: Smooth Lyapunov Exponents; Stability; Stochastic Dynamic System; Volatility
    JEL: C14 C22
    Date: 2007–10–09
  18. By: Filippo Altissimo (Brevan Howard Asset Management); Riccardo Cristadoro (Banca d'Italia); Mario Forni (Università di Modena); Marco Lippi (Università La Sapienza di Roma); Giovanni Veronese (Banca d'Italia)
    Abstract: This paper presents ideas and methods underlying the construction of an indicator that tracks euro area GDP growth, but, unlike GDP growth, (i) is updated monthly and almost in real time; (ii) is free from short-run dynamics. Removal of short-run dynamics from a time series, to isolate the medium-to-long-run component, can be obtained by a band-pass filter. However, it is well known that band-pass filters, being two-sided, perform very poorly at the end of the sample. New Eurocoin is an estimator of the medium-to-long-run component of GDP that only uses contemporaneous values of a large panel of macroeconomic time series, so that no end-of-sample deterioration occurs. Moreover, as our dataset is monthly, New Eurocoin can be updated each month and with a very short delay. Our method is based on generalized principal components that are designed to use leading variables in the dataset as proxies for future values of GDP growth. As the medium-to-long-run component of GDP is observable, although with delay, the performance of New Eurocoin at the end of the sample can be measured.
    Keywords: coincident indicator, band-pass filter, large-dataset factor models, generalized principal components
    JEL: C51 E32 O30
    Date: 2007–06
  19. By: Garroi J.-J.; Goos P.; Sörensen K.
    Abstract: The responses obtained from response surface designs that are run sequentially often exhibit serial correlation or time trends. The order in which the runs of the design are performed then has an impact on the precision of the parameter estimators. This article proposes the use of a variable-neighbourhood search algorithm to compute run orders that guarantee a precise estimation of the effects of the experimental factors. The importance of using good run orders is demonstrated by seeking D-optimal run orders for a central composite design in the presence of an AR(1) autocorrelation pattern.
    Date: 2006–10
  20. By: Echenique, Federico; Komunjer, Ivana
    Date: 2007–09
  21. By: Patton, Andrew J; Timmermann, Allan G
    Abstract: We develop a theoretical framework for understanding how agents form expectations about economic variables with a partially predictable component. Our model incorporates the effect of measurement errors and heterogeneity in individual forecasters' prior beliefs and their information signals and also accounts for agents' learning in real time about past, current and future values of economic variables. We use the model to develop insights into the term structure of forecast errors, and test its implications on a data set comprising survey forecasts of annual GDP growth and inflation with horizons ranging from 1 to 24 months. The model is found to closely match the term structure of forecast errors for consensus beliefs and is able to replicate the cross-sectional dispersion in forecasts of GDP growth but not for inflation - the latter appearing to be too high in the data at short horizons. Our analysis also suggests that agents systematically underestimated the persistent component of GDP growth but overestimated it for inflation during most of the 1990s.
    Keywords: real time learning; survey forecasts; term structure of forecasts
    JEL: C53 E37
    Date: 2007–10
  22. By: Jones, Randall J.; Armstrong, J. Scott; Cuzan, Alfred G.
    Abstract: Prior research offers a mixed view of the value of expert surveys for long-term election forecasts. On the positive side, experts have more information about the candidates and issues than voters do. On the negative side, experts all have access to the same information. Based on prior literature and on our experiences with the 2004 presidential election and the 2008 campaign so far, we have reason to believe that a simple expert survey (the Nominal Group Technique) is preferable to Delphi. Our survey of experts in American politics was quite accurate in the 2004 election. Following the same procedure, we have assembled a new panel of experts to forecast the 2008 presidential election. Here we report the results of the first survey, and compare our experts’ forecasts with predictions by the Iowa Electronic Markets.
    Keywords: forecasting; elections; expert surveys; Delphi
    JEL: Y80
    Date: 2007–10–02
  23. By: Koch I.; De Schepper A.
    Abstract: In financial and actuarial sciences, knowledge about the dependence structure is of great importance. Unfortunately this kind of information is often scarce. Much research has already been done in this field, e.g. through the theory of comonotonicity. It turned out that a comonotonic dependence structure provides a very useful tool when approximating an unknown but (preferably strongly) positive dependence structure. As a consequence of this evolution, there is a need for a measure which reflects how closely a given dependence structure approaches the comonotonic one. In this contribution, we design a measure of (positive) association between n variables (X1, X2, ..., Xn) which is useful in this context. The proposed measure, the comonotonicity coefficient of X, takes values in the range [0, 1]. As we want to quantify the degree of comonotonicity, the coefficient is defined in such a way that it equals 1 in case (X1, X2, ..., Xn) is comonotonic and 0 in case (X1, X2, ..., Xn) is independent. It should be mentioned that both the marginal distributions and the dependence structure of the vector (X1, X2, ..., Xn) will have an effect on the resulting value of this comonotonicity coefficient. In the first part, we show how the coefficient can be designed analytically, by making use of copulas for modeling the dependence structure. In the particular case where n = 2, we compare our measure with the classic dependence measures and find some remarkable relations between our measure and the Pearson and Spearman correlation coefficients. In the second part, we focus on the case of a discounting Gaussian process and we investigate the performance of our comonotonicity coefficient in such an environment. This gives us insight into the reason why the comonotonic structure is a good approximation for the dependence structure.
    Date: 2006–12
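One simple way to make the idea concrete for n = 2 (an illustration of the normalization principle, not the paper's exact definition): divide the sample covariance of (X, Y) by the covariance the pair would have under the comonotonic dependence structure, i.e. with both marginals sorted together. The ratio is 1 under comonotonicity and near 0 under independence:

```python
import random

def cov(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

def comonotonicity_coefficient(xs, ys):
    # Normalize by the comonotonic (sorted-together) covariance; by the
    # rearrangement inequality the ratio cannot exceed 1.
    return cov(xs, ys) / cov(sorted(xs), sorted(ys))

rng = random.Random(3)
x = [rng.gauss(0, 1) for _ in range(2000)]
y = [xi + 0.1 * rng.gauss(0, 1) for xi in x]   # strongly positively dependent
kappa = comonotonicity_coefficient(x, y)        # should be close to 1
```

As the abstract notes, both the marginals and the copula enter such a coefficient: sorting both samples fixes the marginals while replacing the copula by the comonotonic one.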
  24. By: Peter Bossaerts (California Institute of Technology Centre for Economic Policy Research and Swiss Finance Institute); Charles Plott (California Institute of Technology); William R. Zame (UCLA and California Institute of Technology)
    Abstract: Many tests of asset pricing models address only the pricing predictions — but these pricing predictions rest on portfolio choice predictions which seem obviously wrong. This paper suggests a new approach to asset pricing and portfolio choices, based on unobserved heterogeneity. This approach yields the standard pricing conclusions of classical models but is consistent with very different portfolio choices. Novel econometric tests link the price and portfolio predictions and take account of the general equilibrium effects of sample-size bias. The paper works through the approach in detail for the case of the classical CAPM, producing a model called CAPM+ε. When these econometric tests are applied to data generated by large-scale laboratory asset markets which reveal both prices and portfolio choices, CAPM+ε is not rejected.
    Keywords: experimental finance, experimental asset markets, risk aversion
    JEL: C91 C92 D51 G11 G12
    Date: 2003–07
  25. By: Eric Jondeau (University of Lausanne and Swiss Finance Institute); Michael Rockinger (University of Lausanne and Swiss Finance Institute)
    Abstract: In this paper, we extend the concept of News Impact Curve developed by Engle and Ng (1993) to the higher moments of the multivariate returns' distribution, thereby providing a tool to investigate the impact of shocks on the characteristics of the subsequent distribution. For this purpose, we present a new methodology to describe the joint distribution of returns in a non-normal setting. This methodology allows us to gain a better understanding of the temporal evolution of the returns' distribution and can be used to analyze the behavior of the optimal portfolio distribution. We apply our methodology to provide stylized facts on the four largest international stock markets. In particular, we document the persistence in large (positive or negative) daily returns. In a multivariate setting, we find that foreign holdings provide a good hedge against changes in domestic volatility after good shocks but a bad hedge after crashes.
    Keywords: Volatility, Skewness, Kurtosis, GARCH model, Multivariate skewed Student t distribution, Stock returns
    JEL: C22 C51 G12
    Date: 2006–11
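For reference, the original (univariate, second-moment) news impact curve of Engle and Ng (1993), which the paper extends to higher moments, plots next-period variance against today's shock with past variance held at its unconditional level. A GJR-type asymmetric version with illustrative parameter values:

```python
def news_impact_curve(eps, omega=0.05, alpha=0.05, gamma=0.1, beta=0.85,
                      sigma2_bar=1.0):
    # Next-period conditional variance as a function of today's shock eps,
    # holding lagged variance at its unconditional level sigma2_bar.
    # Negative shocks add the gamma (leverage) term.
    leverage = gamma * eps * eps if eps < 0.0 else 0.0
    return omega + alpha * eps * eps + leverage + beta * sigma2_bar

down = news_impact_curve(-2.0)
up = news_impact_curve(2.0)   # same-size positive shock: smaller response
```

The asymmetry (down > up for equal |eps|) is the second-moment counterpart of the effects the paper traces through skewness and kurtosis.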
  26. By: Giovanni Barone-Adesi (University of Lugano and Swiss Finance Institute); Robert F. Engle (New York University, Leonard Stern School of Business); Loriano Mancini (University of Zurich and Swiss Banking Institute)
    Abstract: We propose a new method for pricing options based on GARCH models with filtered historical innovations. In an incomplete market framework we allow for different distributions of the historical and the pricing return dynamics, enhancing the model's flexibility to fit market option prices. An extensive empirical analysis based on S&P 500 index options shows that our model outperforms other competing GARCH pricing models and ad hoc Black-Scholes models. Using our GARCH model and a nonparametric approach we obtain decreasing state price densities per unit probability as suggested by economic theory, validating our GARCH pricing model. Implied volatility smiles appear to be explained by the negative asymmetry of the filtered historical innovations. A new simplified delta hedging scheme is presented based on conditions usually found in option markets, namely the local homogeneity of the pricing function. We provide empirical evidence and we quantify the deterioration of delta hedging in the presence of large volatility shocks.
    Date: 2004–10
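The filtered-historical-simulation pricing step can be sketched as follows: simulate return paths with a GARCH(1,1) recursion whose innovations are drawn from the empirical standardized residuals, then price a call as the discounted mean payoff. The residuals and parameters below are synthetic stand-ins; a real application would use residuals filtered from market returns and the paper's separately calibrated pricing dynamics:

```python
import math
import random

random.seed(0)
# Stand-in for filtered standardized residuals from a fitted GARCH model.
residuals = [random.gauss(0, 1) for _ in range(500)]

def fhs_call_price(s0, strike, days, r, omega, alpha, beta, n_paths=2000):
    disc = math.exp(-r * days / 252)
    payoffs = []
    for _ in range(n_paths):
        s, sig2 = s0, omega / (1 - alpha - beta)   # start at long-run variance
        for _ in range(days):
            z = random.choice(residuals)            # historical innovation
            ret = math.sqrt(sig2) * z
            sig2 = omega + alpha * ret * ret + beta * sig2  # GARCH(1,1) update
            s *= math.exp(ret)
        payoffs.append(max(s - strike, 0.0))
    return disc * sum(payoffs) / n_paths

price = fhs_call_price(100.0, 100.0, days=21, r=0.02, omega=1e-5,
                       alpha=0.08, beta=0.9)
```

Because the innovations are resampled rather than assumed Gaussian, their negative asymmetry propagates into simulated terminal prices, which is the mechanism the paper credits for reproducing implied volatility smiles.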

This nep-ecm issue is ©2007 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.