New Economics Papers
on Market Microstructure
Issue of 2009‒10‒31
four papers chosen by
Thanos Verousis

  1. Are market makers uninformed and passive? Signing trades in the absence of quotes By Michel van der Wel; Albert J. Menkveld; Asani Sarkar
  2. The Market Impact of a Limit Order By Nikolaus Hautsch; Ruihong Huang
  3. High Watermarks of Market Risks By Bertrand Maillet; Jean-Philippe Médecin; Thierry Michel
  4. A blocking and regularization approach to high dimensional realized covariance estimation By Nikolaus Hautsch; Lada M. Kyj; Roel C.A. Oomen

  1. By: Michel van der Wel; Albert J. Menkveld; Asani Sarkar
    Abstract: We develop a new likelihood-based approach to signing trades in the absence of quotes. This approach is as efficient as existing Markov-chain Monte Carlo methods, but more than ten times faster. It can handle multiple trades occurring at the same time and allows for the analysis of settings in which trade times are observed with noise. We apply this method to a high-frequency data set of thirty-year U.S. Treasury futures to investigate the role of the market maker. Most theory characterizes the market maker as an uninformed, passive supplier of liquidity. Our findings suggest, however, that some market makers actively demand liquidity for a substantial part of the day and that they are informed speculators.
    Keywords: Electronic trading of securities ; Liquidity (Economics) ; Speculation ; Futures
    Date: 2009
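The likelihood idea behind the trade-signing method can be illustrated with a toy version (this is not the paper's state-space model, just a minimal sketch): assume each trade price equals an efficient-price proxy plus a signed half-spread and Gaussian noise, and give each trade the sign with the higher likelihood. The prices, half-spread, and smoothing weight below are all invented for illustration.

```python
def sign_trades(prices, half_spread=0.5, sigma=1.0):
    """Sign trades without quotes via per-trade Gaussian likelihoods.

    Toy model: p_t = m_t + q_t * half_spread + noise, with q_t = +1 (buy)
    or -1 (sell) and m_t a crude running efficient-price proxy. Each
    trade gets the sign with the higher likelihood; ties go to "buy".
    """
    m = prices[0]                       # initialize the proxy at the first price
    signs = []
    for p in prices:
        ll_buy = -((p - m - half_spread) ** 2) / (2 * sigma ** 2)
        ll_sell = -((p - m + half_spread) ** 2) / (2 * sigma ** 2)
        q = 1 if ll_buy >= ll_sell else -1
        signs.append(q)
        m = 0.9 * m + 0.1 * (p - q * half_spread)  # smooth the proxy forward
    return signs

signs = sign_trades([100.0, 101.0, 99.0])  # made-up trade prices
```

Under this simplification the rule reduces to comparing each trade price with the running efficient-price proxy; the paper's contribution is doing this jointly over all trades, with noisy trade times and simultaneous trades.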
  2. By: Nikolaus Hautsch; Ruihong Huang
    Abstract: Despite their importance in modern electronic trading, there is virtually no systematic empirical evidence on the market impact of incoming orders. We quantify the short-run and long-run price effect of posting a limit order by proposing a high-frequency cointegrated VAR model for ask and bid quotes and several levels of order book depth. Price impacts are estimated by means of appropriate impulse response functions. Analyzing order book data of 30 stocks traded at Euronext Amsterdam, we show that limit orders have significant market impacts and cause a dynamic (and typically asymmetric) rebalancing of the book. The strength and direction of quote and spread responses depend on the incoming orders' aggressiveness, their size and the state of the book. We show that the effects are qualitatively quite stable across the market. Cross-sectional variations in the magnitudes of price impacts are well explained by the underlying trading frequency and relative tick size.
    Keywords: price impact, limit order, impulse response function, cointegration
    JEL: G14 C32
    Date: 2009–10
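How impulse responses fall out of a cointegrated quote system can be sketched with a bare-bones bivariate VECM in numpy (this is not the paper's full order-book model, and the error-correction coefficients below are hypothetical):

```python
import numpy as np

def vecm_irf(alpha, beta, gamma, horizons=10):
    """Impulse responses from a bivariate VECM (illustrative sketch).

    VECM:  dy_t = alpha @ beta.T @ y_{t-1} + gamma @ dy_{t-1} + e_t.
    Rewritten as a VAR(2) in levels,
        y_t = A1 @ y_{t-1} + A2 @ y_{t-2} + e_t,
        A1 = I + alpha @ beta.T + gamma,   A2 = -gamma,
    the h-step response to a unit shock is the top-left block of the
    h-th power of the companion matrix.
    """
    k = alpha.shape[0]
    A1 = np.eye(k) + alpha @ beta.T + gamma
    A2 = -gamma
    C = np.zeros((2 * k, 2 * k))        # companion matrix
    C[:k, :k] = A1
    C[:k, k:] = A2
    C[k:, :k] = np.eye(k)
    irfs, P = [], np.eye(2 * k)
    for _ in range(horizons + 1):
        irfs.append(P[:k, :k].copy())   # response of y_t to shocks at h steps
        P = C @ P
    return irfs

# hypothetical ask/bid system: quotes error-correct toward each other
alpha = np.array([[-0.5], [0.5]])       # adjustment speeds
beta = np.array([[1.0], [-1.0]])        # cointegrating vector (the spread)
irfs = vecm_irf(alpha, beta, np.zeros((2, 2)), horizons=5)
```

In this stylized system a shock to one quote is split between both quotes in the long run (the permanent component), which is the kind of long-run price impact the paper reads off its impulse response functions.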
  3. By: Bertrand Maillet (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, A.A.Advisors-QCG - ABN AMRO, EIF - Europlace Institute of Finance); Jean-Philippe Médecin (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Thierry Michel (LODH - Banque)
    Abstract: We present estimates of several of the best-known risk measures, using both high- and low-frequency data. The aim of the article is to show which lower-frequency measures can be an acceptable substitute for the high-precision measures when transaction data are unavailable for a long history. We also study the distribution of volatility, focusing more precisely on the slope of the tail of the various risk-measure distributions, in order to define the high watermarks of market risks. Based on estimates of the tail index of a Generalized Extreme Value density backed out from the high-frequency CAC 40 series over the period 1997-2006, using both Maximum Likelihood and L-moment methods, we finally find no evidence for the need of a specification with heavier tails than under the traditional log-normal hypothesis.
    Keywords: Financial crisis, volatility estimators distributions, range-based volatility, extreme value, high frequency data.
    Date: 2009–08
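The generic block-maxima/GEV step can be sketched as follows, assuming scipy's genextreme parameterization (its shape c is the negative of the usual tail index, so c = 0 is the thin-tailed Gumbel case consistent with log-normal-type tails). The synthetic "volatility" series is made up; the paper instead backs out the tail index from high-frequency CAC 40 data.

```python
import numpy as np
from scipy import stats

def fit_gev_tail(series, block=20):
    """Fit a GEV to block maxima of a volatility series (generic sketch,
    not the paper's CAC 40 estimation).

    scipy's genextreme shape c is the negative of the usual xi:
    c = 0 is the Gumbel limit (log-normal-type thin tails).
    """
    n_blocks = len(series) // block
    maxima = series[: n_blocks * block].reshape(n_blocks, block).max(axis=1)
    c, loc, scale = stats.genextreme.fit(maxima)   # maximum likelihood fit
    return c, loc, scale

# synthetic stand-in for a daily volatility series (made up)
rng = np.random.default_rng(0)
vol = rng.lognormal(mean=-1.0, sigma=0.5, size=4000)
c, loc, scale = fit_gev_tail(vol)
```

A fitted shape close to zero would point toward the Gumbel domain of attraction, i.e. no need for heavier tails than the log-normal hypothesis implies, which is the paper's finding for the CAC 40 series.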
  4. By: Nikolaus Hautsch; Lada M. Kyj; Roel C.A. Oomen
    Abstract: We introduce a regularization and blocking estimator for well-conditioned high-dimensional daily covariances using high-frequency data. Using the Barndorff-Nielsen, Hansen, Lunde, and Shephard (2008a) kernel estimator, we estimate the covariance matrix block-wise and regularize it. A data-driven grouping of assets of similar trading frequency ensures the reduction of data loss due to refresh time sampling. In an extensive simulation study mimicking the empirical features of the S&P 1500 universe we show that the 'RnB' estimator yields efficiency gains and outperforms competing kernel estimators for varying liquidity settings, noise-to-signal ratios, and dimensions. An empirical application of forecasting daily covariances of the S&P 500 index confirms the simulation results.
    Keywords: covariance estimation, blocking, realized kernel, regularization, microstructure, asynchronous trading
    JEL: C14 C22
    Date: 2009–10
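A stylized version of the blocking-and-regularization idea, with a plain sample cross-covariance standing in for the paper's realized-kernel block estimates and an eigenvalue floor standing in for its regularization (both are simplifications of the 'RnB' estimator; the assets, groups, and simulated returns are hypothetical):

```python
import numpy as np

def blocked_regularized_cov(returns, groups, eig_floor=1e-4):
    """Block-wise covariance assembly plus eigenvalue regularization.

    Simplified stand-in for the 'RnB' estimator: in the paper, each
    group x group block uses a realized-kernel estimate on refresh-time
    data for those liquidity groups; here a plain sample cross-covariance
    fills each block. Flooring the eigenvalues keeps the assembled
    matrix well-conditioned and positive definite.
    """
    n, k = returns.shape
    x = returns - returns.mean(axis=0)
    cov = np.empty((k, k))
    for ga in np.unique(groups):
        for gb in np.unique(groups):
            ia = np.where(groups == ga)[0]
            ib = np.where(groups == gb)[0]
            cov[np.ix_(ia, ib)] = x[:, ia].T @ x[:, ib] / (n - 1)
    w, v = np.linalg.eigh((cov + cov.T) / 2)       # symmetrize, decompose
    return v @ np.diag(np.clip(w, eig_floor, None)) @ v.T

# six hypothetical assets in three liquidity groups, simulated returns
rng = np.random.default_rng(1)
rets = rng.standard_normal((50, 6))
grp = np.array([0, 0, 1, 1, 2, 2])
C = blocked_regularized_cov(rets, grp)
```

The payoff of blocking is that liquid-asset blocks can be estimated on their own finer refresh-time grid instead of being throttled by the least liquid asset; the regularization step then repairs any loss of positive definiteness from assembling blocks estimated on different samples.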

This issue is ©2009 by Thanos Verousis. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.