
on Econometrics 
By:  Taoufik Bouezmarni; Jeroen V.K. Rombouts; Abderrahim Taamouti 
Abstract:  This paper proposes a new nonparametric test for conditional independence, which is based on the comparison of Bernstein copula densities using the Hellinger distance. The test is easy to implement because it does not involve a weighting function in the test statistic, and it can be applied in general settings since there is no restriction on the dimension of the data. In fact, to apply the test, only a bandwidth is needed for the nonparametric copula. We prove that the test statistic is asymptotically pivotal under the null hypothesis, establish local power properties, and motivate the validity of the bootstrap technique that we use in finite sample settings. A simulation study illustrates the good size and power properties of the test. We illustrate the empirical relevance of our test by focusing on Granger causality using financial time series data to test for nonlinear leverage versus volatility feedback effects and to test for causality between stock returns and trading volume. In a third application, we investigate Granger causality between macroeconomic variables. 
Keywords:  Nonparametric tests, conditional independence, Granger non-causality, Bernstein density copula, bootstrap, finance, volatility asymmetry, leverage effect, volatility feedback effect, macroeconomics 
JEL:  C12 C14 C15 C19 G1 G12 E3 E4 E52 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:lvl:lacicr:0927&r=ecm 
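A minimal sketch of the basic building block above, the Bernstein density copula estimator, in the bivariate case: empirical copula cell frequencies computed from ranks are smoothed with products of Beta densities, with the Bernstein order m playing the role of the bandwidth. This is only the generic estimator, not the authors' Hellinger-distance conditional-independence statistic, and the data are simulated for illustration.

```python
import numpy as np
from scipy.stats import beta, rankdata

def bernstein_copula_density(x, y, u, v, m=15):
    """Bivariate Bernstein copula density at (u, v): empirical copula cell
    frequencies smoothed with products of Beta densities of order m."""
    n = len(x)
    pu = rankdata(x) / (n + 1)          # pseudo-observations in (0, 1)
    pv = rankdata(y) / (n + 1)
    w, _, _ = np.histogram2d(pu, pv, bins=m, range=[[0, 1], [0, 1]])
    w /= n                              # cell weights sum to one
    dens = 0.0
    for k in range(m):
        for l in range(m):
            dens += w[k, l] * beta.pdf(u, k + 1, m - k) * beta.pdf(v, l + 1, m - l)
    return dens

# toy usage on positively dependent Gaussian data
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=500)
print(bernstein_copula_density(z[:, 0], z[:, 1], 0.5, 0.5))
```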
By:  Han Lin Shang; Rob J Hyndman 
Abstract:  We present a nonparametric method to forecast a seasonal univariate time series, and propose four dynamic updating methods to improve point forecast accuracy. Our methods treat a seasonal univariate time series as a functional time series. We propose first to reduce the dimensionality by applying functional principal component analysis to the historical observations, and then to use univariate time series forecasting and functional principal component regression techniques. When data in the most recent year are partially observed, we improve point forecast accuracy using dynamic updating methods. We also introduce a nonparametric approach to construct prediction intervals of updated forecasts, and compare the empirical coverage probability with an existing parametric method. Our approaches are data-driven and computationally fast, and hence feasible to apply in real-time, high-frequency dynamic updating. The methods are demonstrated using monthly sea surface temperatures from 1950 to 2008. 
Keywords:  Functional time series, Functional principal component analysis, Ordinary least squares, Penalized least squares, Ridge regression, Sea surface temperatures, Seasonal time series. 
JEL:  C14 C23 
Date:  2009–08 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20098&r=ecm 
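A rough illustration of the functional approach described above: treat each year of monthly observations as one curve, extract principal components by SVD, forecast the component scores with a simple AR(1), and rebuild next year's curve. The simulated data and the AR(1) score forecasts are placeholders; the paper's four dynamic updating methods and interval construction are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
years, months = 59, 12
seasonal = 10 + 3 * np.sin(2 * np.pi * np.arange(months) / 12)
# toy data: one seasonal curve per year plus a common year-level shift and noise
X = seasonal + 0.3 * rng.standard_normal((years, 1)) + 0.5 * rng.standard_normal((years, months))

mu = X.mean(axis=0)                                    # mean curve
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)  # functional PCA via SVD
K = 2                                                  # retained components
scores = U[:, :K] * s[:K]                              # year-by-year scores

next_scores = np.empty(K)
for k in range(K):                                     # forecast each score series with an AR(1)
    y = scores[:, k]
    phi = np.dot(y[1:], y[:-1]) / np.dot(y[:-1], y[:-1])
    next_scores[k] = phi * y[-1]

forecast_curve = mu + next_scores @ Vt[:K]             # 12 monthly point forecasts
print(np.round(forecast_curve, 2))
```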
By:  Jing Li (Department of Economics, South Dakota State University); Junsoo Lee (Department of Economics, Finance, and Legal Studies, University of Alabama) 
Abstract:  In this paper, we propose new tests for threshold cointegration in the autoregressive distributed lag (ADL) model. The indicators in the threshold model are based on either a nonstationary or a stationary threshold variable. The cointegrating vector in this paper is not prespecified. We adopt a supremum Wald-type test to account for the so-called Davies problem. The asymptotic null distributions of the proposed tests are free of nuisance parameters. As such, a bootstrap procedure is not required, and critical values of the proposed tests are tabulated. A Monte Carlo experiment shows good finite-sample performance of the proposed tests. 
Keywords:  Econometric Theory, Time Series 
JEL:  C12 C15 C32 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:sda:workpa:22009&r=ecm 
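The sup-Wald idea above can be sketched as follows: for every candidate threshold in a trimmed grid, estimate a two-regime regression and compute the Wald statistic for equality of the regime coefficients, then take the supremum. The regression below is a generic two-regime specification, not the paper's ADL model, and inference would rely on the tabulated critical values rather than a chi-squared limit.

```python
import numpy as np

def sup_wald_threshold(y, x, q, trim=0.15):
    """Sup-Wald statistic over candidate thresholds for the two-regime model
    y_t = x_t'b1 * 1{q_t <= g} + x_t'b2 * 1{q_t > g} + e_t, H0: b1 = b2."""
    n, k = x.shape
    grid = np.sort(q)[int(trim * n): int((1 - trim) * n)]
    best = -np.inf
    for g in grid:
        d = (q <= g).astype(float)[:, None]
        X = np.hstack([x * d, x * (1 - d)])            # regime-specific regressors
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        e = y - X @ b
        s2 = e @ e / (n - 2 * k)
        R = np.hstack([np.eye(k), -np.eye(k)])         # restriction b1 - b2 = 0
        V = R @ np.linalg.inv(X.T @ X) @ R.T * s2
        diff = R @ b
        best = max(best, diff @ np.linalg.solve(V, diff))
    return best

rng = np.random.default_rng(2)
n = 300
x = np.column_stack([np.ones(n), rng.standard_normal(n)])
q = rng.standard_normal(n)                             # threshold variable
y = x @ np.array([0.5, 1.0]) + 0.8 * x[:, 1] * (q > 0.3) + rng.standard_normal(n)
print(round(sup_wald_threshold(y, x, q), 2))
```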
By:  Manabu Asai (Faculty of Economics, Soka University); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute and Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics, University of Tokyo); Marcelo C. Medeiros (Department of Economics, Pontifical Catholic University of Rio de Janeiro) 
Abstract:  Several methods have recently been proposed in the ultra high frequency financial literature to remove the effects of microstructure noise and to obtain consistent estimates of the integrated volatility (IV) as a measure of ex post daily volatility. Even bias-corrected and consistent (modified) realized volatility (RV) estimates of the integrated volatility can contain residual microstructure noise and other measurement errors. Such noise is called "realized volatility error". As such measurement errors are usually ignored, we need to take account of them in estimating and forecasting IV. This paper investigates through Monte Carlo simulations the effects of RV errors on estimating and forecasting IV with RV data. It is found that: (i) neglecting RV errors can lead to serious bias in estimators due to model misspecification; (ii) the effects of RV errors on one-step-ahead forecasts are minor when consistent estimators are used and when the number of intraday observations is large; and (iii) even the partially corrected R2 recently proposed in the literature should be fully corrected for evaluating forecasts. This paper proposes a full correction of R2, which can be applied to linear and nonlinear, short and long memory models. An empirical example for S&P 500 data is used to demonstrate that neglecting RV errors can lead to serious bias in estimating the model of integrated volatility, and that the new method proposed here can eliminate the effects of the RV noise. The empirical results also show that the full correction of R2 is necessary for an accurate description of goodness-of-fit. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2009cf669&r=ecm 
By:  Andrea Carriero; George Kapetanios; Massimiliano Marcellino 
Abstract:  The paper addresses the issue of forecasting a large set of variables using multivariate models. In particular, we propose three alternative reduced rank forecasting models and compare their predictive performance for US time series with the most promising existing alternatives, namely, factor models, large-scale Bayesian VARs, and multivariate boosting. Specifically, we focus on classical reduced rank regression, a two-step procedure that applies, in turn, shrinkage and reduced rank restrictions, and the reduced rank Bayesian VAR of Geweke (1996). We find that using shrinkage and rank reduction in combination rather than separately substantially improves the accuracy of forecasts, both when the whole set of variables is to be forecast and for key variables such as industrial production growth, inflation, and the federal funds rate. The robustness of this finding is confirmed by a Monte Carlo experiment based on bootstrapped data. We also provide a consistency result for the reduced rank regression, valid when the dimension of the system tends to infinity, which opens the way to using large-scale reduced rank models for empirical analysis. 
Keywords:  Bayesian VARs, factor models, forecasting, reduced rank 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2009/31&r=ecm 
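Classical reduced rank regression, one of the three approaches compared above, has a simple closed form under an identity weight matrix: fit OLS and project the fitted values onto their leading right singular directions. A minimal sketch with simulated data; the chosen rank and the data-generating process are illustrative only.

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Rank-constrained least squares: min ||Y - X B||_F s.t. rank(B) <= rank.
    With an identity weight matrix, the solution projects the OLS coefficients
    onto the top right singular vectors of the fitted values."""
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]          # projection onto the leading directions
    return B_ols @ P

rng = np.random.default_rng(3)
T, n, r = 200, 20, 3
X = rng.standard_normal((T, n))
B_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r coefficients
Y = X @ B_true + rng.standard_normal((T, n))
B_rr = reduced_rank_regression(X, Y, rank=3)
print("estimated rank:", np.linalg.matrix_rank(B_rr, tol=1e-8))
```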
By:  Tetsuya Takaishi 
Abstract:  We perform Markov chain Monte Carlo simulations for Bayesian inference of the GJR-GARCH model, which is one of the asymmetric GARCH models. The adaptive construction scheme is used to build the proposal density in the Metropolis-Hastings algorithm, and the parameters of the proposal density are determined adaptively using the data sampled by the Markov chain Monte Carlo simulation itself. We study the performance of the scheme with artificial GJR-GARCH data. We find that the adaptive construction scheme samples the GJR-GARCH parameters effectively and conclude that the Metropolis-Hastings algorithm with the adaptive construction scheme is an efficient method for Bayesian inference of the GJR-GARCH model. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0909.1478&r=ecm 
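A rough sketch of the adaptive construction idea described above: Metropolis-Hastings with a multivariate normal independence proposal whose mean and covariance are periodically refitted from the draws collected so far. For brevity the target is a plain GARCH(1,1) likelihood under a flat prior rather than the GJR-GARCH model, and all tuning constants are arbitrary.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(4)

def garch11_loglik(theta, r):
    """Gaussian GARCH(1,1) log-likelihood; theta = (omega, alpha, beta).
    Returns -inf outside the positivity/stationarity region (flat prior there)."""
    omega, alpha, beta = theta
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return -np.inf
    h, ll = np.var(r), 0.0
    for x in r:
        ll += -0.5 * (np.log(2 * np.pi * h) + x * x / h)
        h = omega + alpha * x * x + beta * h
    return ll

# simulate toy returns from a GARCH(1,1) process
T, (omega0, alpha0, beta0) = 1000, (0.05, 0.10, 0.85)
r, h = np.empty(T), 1.0
for t in range(T):
    r[t] = np.sqrt(h) * rng.standard_normal()
    h = omega0 + alpha0 * r[t] ** 2 + beta0 * h

theta = np.array([0.10, 0.10, 0.80])
mean, cov = theta.copy(), np.diag([0.01, 0.01, 0.01])    # initial proposal density
ll_cur, draws = garch11_loglik(theta, r), []
for it in range(3000):
    prop = rng.multivariate_normal(mean, cov)            # independence proposal
    ll_prop = garch11_loglik(prop, r)
    log_acc = (ll_prop - ll_cur                           # MH ratio for an independence proposal
               + multivariate_normal.logpdf(theta, mean, cov)
               - multivariate_normal.logpdf(prop, mean, cov))
    if np.log(rng.uniform()) < log_acc:
        theta, ll_cur = prop, ll_prop
    draws.append(theta.copy())
    if it >= 600 and it % 300 == 0:                      # adapt: refit proposal from past draws
        hist = np.array(draws[it // 2:])
        mean, cov = hist.mean(axis=0), np.cov(hist.T) + 1e-6 * np.eye(3)

print("posterior means:", np.array(draws[1500:]).mean(axis=0).round(3))
```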
By:  Jan J. J. Groen; George Kapetanios 
Abstract:  We suggest a way to perform parsimonious instrumental variables estimation in the presence of many, and potentially weak, instruments. In contrast to standard methods, our approach yields consistent estimates when the set of instrumental variables complies with a factor structure. In this sense, our method is equivalent to instrumental variables estimation that is based on principal components. However, even if the factor structure is weak or nonexistent, our method, unlike the principal components approach, still yields consistent estimates. Indeed, simulations indicate that our approach always dominates standard instrumental variables estimation, regardless of whether the factor relationship underlying the set of instruments is strong, weak, or absent. 
Keywords:  Regression analysis ; Statistical methods ; Econometrics 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:fip:fednsr:386&r=ecm 
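The principal-components benchmark mentioned above (not the authors' parsimonious estimator itself) can be sketched as two-stage least squares using a few principal components of a large instrument set as instruments. Dimensions, coefficients, and data below are made up.

```python
import numpy as np

rng = np.random.default_rng(5)
n, n_instr, k_pc = 500, 40, 2

f = rng.standard_normal((n, k_pc))                      # latent factors driving the instruments
Z = f @ rng.standard_normal((k_pc, n_instr)) + 0.5 * rng.standard_normal((n, n_instr))
u = rng.standard_normal(n)                              # structural error
x = f @ np.array([1.0, -0.5]) + 0.8 * u + rng.standard_normal(n)   # endogenous regressor
y = 1.5 * x + u                                         # true beta = 1.5

Zs = (Z - Z.mean(axis=0)) / Z.std(axis=0)               # standardize the instrument set
_, _, Vt = np.linalg.svd(Zs, full_matrices=False)
pcs = Zs @ Vt[:k_pc].T                                  # principal components = factor estimates

W = np.column_stack([np.ones(n), pcs])                  # first stage: x on the components
x_hat = W @ np.linalg.lstsq(W, x, rcond=None)[0]
X2 = np.column_stack([np.ones(n), x_hat])               # second stage: y on fitted x
beta_iv = np.linalg.lstsq(X2, y, rcond=None)[0]
print("2SLS-with-PC estimate of beta:", round(beta_iv[1], 3))
```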
By:  Alain Guay; Emmanuel Guerre; Stepana Lazarova 
Abstract:  A new test is proposed for the null of absence of serial correlation. The test uses a data-driven smoothing parameter. The resulting test statistic has a standard limit distribution under the null. The smoothing parameter is calibrated to achieve rate-optimality against several classes of alternatives. The test can detect alternatives with many small correlation coefficients that can go to zero with an optimal adaptive rate which is faster than the parametric rate. The adaptive rate-optimality of the new test against smooth alternatives is established as well. The test can also detect ARMA and local Pitman alternatives converging to the null at a rate close to, or equal to, the parametric one. A simulation experiment and an application to monthly squared financial returns illustrate the usefulness of the proposed approach. 
Keywords:  Absence of serial correlation, data-driven nonparametric test, adaptive rate-optimality, small alternatives, time series 
JEL:  C12 C32 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:lvl:lacicr:0925&r=ecm 
By:  Alexander Kriwoluzky 
Abstract:  This paper shows how to identify the structural shocks of a Vector Autoregression (VAR) while simultaneously estimating a dynamic stochastic general equilibrium (DSGE) model that is not assumed to replicate the data-generating process. It proposes a framework for estimating the parameters of the VAR model and the DSGE model jointly: the VAR model is identified by sign restrictions derived from the DSGE model; the DSGE model is estimated by matching the corresponding impulse response functions. 
Keywords:  Bayesian Model Estimation, Vector Autoregression, Identification 
JEL:  C51 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2009/29&r=ecm 
By:  Shiqing Ling (Department of Mathematics, Hong Kong University of Science and Technology); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute and Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics, University of Tokyo) 
Abstract:  This paper develops a general asymptotic theory for the estimation of strictly stationary and ergodic time series models. Under simple conditions that are straightforward to check, we establish the strong consistency, the rate of strong convergence, and the asymptotic normality of a general class of estimators that includes the LSE, MLE, and some M-type estimators. As an application, we verify the assumptions for the long-memory fractional ARIMA model. Other examples include the GARCH(1,1) model, the random coefficient AR(1) model, and the threshold MA(1) model. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2009cf670&r=ecm 
By:  Tetsuya Takaishi 
Abstract:  We study the performance of the adaptive construction scheme for Bayesian inference on the Quadratic GARCH (QGARCH) model, which introduces asymmetry into the time series dynamics. In the adaptive construction scheme, a proposal density in the Metropolis-Hastings algorithm is constructed adaptively by changing the parameters of the density to fit the posterior density. Using artificial QGARCH data, we infer the QGARCH parameters by applying the adaptive construction scheme to the Bayesian inference of the QGARCH model. We find that the adaptive construction scheme samples the QGARCH parameters effectively, i.e. correlations between the sampled data are very small. We conclude that the adaptive construction scheme is an efficient method for the Bayesian estimation of the QGARCH model. 
Date:  2009–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0907.5276&r=ecm 
By:  Gilles Zumbach 
Abstract:  The covariance matrix is formulated in the framework of a linear multivariate ARCH process with long memory, where the natural cross-product structure of the covariance is generalized by adding two linear terms with their respective parameters. The residuals of the linear ARCH process are computed using historical data and the (inverse square root of the) covariance matrix. Simple quality measures assessing the independence and unit magnitude of the residual distributions are proposed. The salient properties of the computed residuals are studied for three data sets of size 54, 55 and 330. Both new terms introduced in the covariance help in producing uncorrelated residuals, but the residual magnitudes are very different from unity. The large sizes of the inferred residuals are due to the limited information that can be extracted from the empirical data when the number of time series is large, and denote a fundamental limitation to the inference that can be achieved. 
Date:  2009–03 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0903.1531&r=ecm 
By:  Michael Dueker; Zacharias Psaradakis; Martin Sola; Fabio Spagnolo 
Abstract:  In this paper we propose a contemporaneous threshold multivariate smooth transition autoregressive (C-MSTAR) model in which the regime weights depend on the ex ante probabilities that latent regime-specific variables exceed certain threshold values. A key feature of the model is that the transition function depends on all the parameters of the model as well as on the data. Since the mixing weights are a function of the regime-specific contemporaneous variance-covariance matrix, the model can account for contemporaneous regime-specific comovements of the variables. The stability and distributional properties of the proposed model are discussed, as well as issues of estimation, testing and forecasting. The practical usefulness of the C-MSTAR model is illustrated by examining the relationship between US stock prices and interest rates and discussing the regime-specific Granger causality relationships. 
Keywords:  Nonlinear autoregressive models; Smooth transition; Stability; Threshold. 
JEL:  C32 G12 
Date:  2009–03 
URL:  http://d.repec.org/n?u=RePEc:udt:wpecon:200903&r=ecm 
By:  Tetsuya Takaishi 
Abstract:  We propose a method to construct a proposal density for the Metropolis-Hastings algorithm in Markov Chain Monte Carlo (MCMC) simulations of the GARCH model. The proposal density is constructed adaptively by using the data sampled by the MCMC method itself. It turns out that autocorrelations between the data generated with our adaptive proposal density are greatly reduced. Thus it is concluded that the adaptive construction method is very efficient and works well for MCMC simulations of the GARCH model. 
Date:  2009–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0901.0992&r=ecm 
By:  Westerlund, Joakim (Department of Economics, School of Business, Economics and Law, Göteborg University); Narayan, Paresh (Deakin University) 
Abstract:  In the search for more efficient unit root tests in the presence of GARCH, some researchers have recently turned their attention to estimation by maximum likelihood. However, although theoretically appealing, the new test is difficult to implement, which has made it quite uncommon in the empirical literature. The current paper offers a panel data based solution to this problem. 
Keywords:  Panel Data; Unit Root Tests; GARCH 
JEL:  C23 G00 
Date:  2009–09–11 
URL:  http://d.repec.org/n?u=RePEc:hhs:gunwpe:0379&r=ecm 
By:  William T. Shaw; Jonathan McCabe 
Abstract:  In mathematical finance and other applications of stochastic processes, it is frequently the case that the characteristic function may be known but explicit forms for density functions are not available. The simulation of any distribution is greatly facilitated by knowledge of the quantile function, by which uniformly distributed samples may be converted to samples of the given distribution. This article analyzes the calculation of a quantile function directly from the characteristic function of a probability distribution, without explicit knowledge of the density. We form a nonlinear integro-differential equation that, despite its complexity, admits an iterative solution for the power series of the quantile about the median. We give some examples, including tail models, and show how to generate C code for the examples. 
Date:  2009–03 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0903.1592&r=ecm 
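The paper builds a power series for the quantile; a cruder numerical route from the same starting point (the characteristic function alone, with no explicit density) is to recover the CDF by Gil-Pelaez inversion and invert it with a root finder. The sketch below uses the standard normal characteristic function purely as a test case and is not the authors' series method.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def cf_std_normal(t):
    """Characteristic function of the standard normal, used as a test case."""
    return np.exp(-0.5 * t ** 2)

def cdf_from_cf(x, cf):
    """Gil-Pelaez inversion: F(x) = 1/2 - (1/pi) * int_0^inf Im(e^{-itx} cf(t)) / t dt."""
    integrand = lambda t: np.imag(np.exp(-1j * t * x) * cf(t)) / t
    val, _ = quad(integrand, 1e-10, 50.0, limit=200)
    return 0.5 - val / np.pi

def quantile_from_cf(u, cf, lo=-10.0, hi=10.0):
    """Quantile function obtained by inverting the CF-based CDF with a root finder."""
    return brentq(lambda x: cdf_from_cf(x, cf) - u, lo, hi)

for p in (0.5, 0.95, 0.99):
    print(p, round(quantile_from_cf(p, cf_std_normal), 4))   # ~0, 1.6449, 2.3263
```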
By:  Tommaso Proietti 
Abstract:  The Beveridge-Nelson decomposition defines the trend component in terms of the eventual forecast function, as the value the series would take if it were on its long-run path. The paper introduces the multistep Beveridge-Nelson decomposition, which arises when the forecast function is obtained by the direct autoregressive approach, which optimizes the predictive ability of the AR model at forecast horizons greater than one. We compare our proposal with the standard Beveridge-Nelson decomposition, for which the forecast function is obtained by iterating the one-step-ahead predictions via the chain rule. We illustrate that the multistep Beveridge-Nelson trend is more efficient than the standard one in the presence of model misspecification, and we subsequently assess the predictive validity of the extracted transitory component with respect to future growth. 
Keywords:  Trend and Cycle, Forecasting, Filtering. 
JEL:  C22 C52 E32 
Date:  2009–09–24 
URL:  http://d.repec.org/n?u=RePEc:eei:rpaper:eeri_rp_2009_24&r=ecm 
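For reference, the standard one-step Beveridge-Nelson trend that the paper generalizes has a simple closed form when the differenced series follows an AR(1): tau_t = y_t + phi/(1-phi) * (Delta y_t - mu). A minimal sketch with simulated data; the paper's multistep (direct) variant is not implemented here.

```python
import numpy as np

def bn_trend_ar1(y):
    """Standard Beveridge-Nelson decomposition assuming Delta y_t follows an AR(1):
    trend_t = y_t + phi/(1-phi) * (Delta y_t - mu), the long-run forecast level."""
    dy = np.diff(y)
    mu = dy.mean()
    z = dy - mu
    phi = np.dot(z[1:], z[:-1]) / np.dot(z[:-1], z[:-1])   # least-squares AR(1) coefficient
    trend = y[1:] + phi / (1 - phi) * z
    cycle = y[1:] - trend
    return trend, cycle

rng = np.random.default_rng(6)
# simulate an ARIMA(1,1,0) series: Delta y_t = 0.1 + 0.4 * (Delta y_{t-1} - 0.1) + eps_t
T, eps = 300, rng.standard_normal(300)
dy = np.empty(T)
dy[0] = 0.1 + eps[0]
for t in range(1, T):
    dy[t] = 0.1 + 0.4 * (dy[t - 1] - 0.1) + eps[t]
y = np.cumsum(dy)
trend, cycle = bn_trend_ar1(y)
print("cycle std:", cycle.std().round(3))
```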
By:  Jeroen V.K. Rombouts; Lars Stentoft 
Abstract:  While stochastic volatility models improve on the option pricing error when compared to the Black-Scholes-Merton model, mispricings remain. This paper uses mixed normal heteroskedasticity models to price options. Our model allows for significant negative skewness and time-varying higher order moments of the risk neutral distribution. Parameter inference using Gibbs sampling is explained, and we detail how to compute risk neutral predictive densities taking into account parameter uncertainty. When forecasting out-of-sample options on the S&P 500 index, substantial improvements are found compared to a benchmark model in terms of dollar losses and the ability to explain the smirk in implied volatilities. 
Keywords:  Bayesian inference, option pricing, finite mixture models, out-of-sample prediction, GARCH models 
JEL:  C11 C15 C22 G13 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:lvl:lacicr:0926&r=ecm 
By:  Pavel V. Shevchenko; Grigory Temnov 
Abstract:  Typically, operational risk losses are reported above a threshold. Fitting data reported above a constant threshold is a well known and studied problem. However, in practice, the losses are scaled for business and other factors before the fitting, and thus the threshold varies across the scaled data sample. A reporting level may also change when a bank changes its reporting policy. We present both maximum likelihood and Bayesian Markov chain Monte Carlo approaches to fitting the frequency and severity loss distributions using data reported above a time-varying threshold. Estimation of the annual loss distribution accounting for parameter uncertainty is also presented. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.4075&r=ecm 
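The maximum likelihood part of the severity fit described above can be sketched as follows: each recorded loss contributes its Lognormal log-density minus the log-probability of exceeding its own reporting threshold, so the threshold is allowed to vary across observations. The simulated losses, threshold path, and starting values are illustrative; the frequency model and the Bayesian MCMC treatment are omitted.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def negloglik(params, losses, thresholds):
    """Lognormal severity left-truncated at observation-specific thresholds."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    z = (np.log(losses) - mu) / sigma
    logpdf = norm.logpdf(z) - np.log(sigma * losses)          # lognormal log-density
    logsurv = norm.logsf((np.log(thresholds) - mu) / sigma)   # log P(loss > threshold)
    return -np.sum(logpdf - logsurv)

rng = np.random.default_rng(7)
# simulate losses and keep only those above a time-varying reporting level
mu_true, sigma_true = 2.0, 1.5
raw = np.exp(mu_true + sigma_true * rng.standard_normal(5000))
thresholds_all = np.where(np.arange(5000) < 2500, 5.0, 10.0)  # reporting level changes over time
keep = raw > thresholds_all
losses, thresholds = raw[keep], thresholds_all[keep]

res = minimize(negloglik, x0=[1.0, 1.0], args=(losses, thresholds), method="Nelder-Mead")
print("MLE (mu, sigma):", np.round(res.x, 3))
```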
By:  Romuald Elie (CREST  Centre de Recherche en Économie et Statistique  INSEE  École Nationale de la Statistique et de l'Administration Économique, CEREMADE  CEntre de REcherches en MAthématiques de la DEcision  CNRS : UMR7534  Université Paris Dauphine  Paris IX) 
Abstract:  This paper addresses the general issue of estimating the sensitivity of the expectation of a random variable with respect to a parameter characterizing its evolution. In finance, for example, the sensitivities of the price of a contingent claim are called the Greeks. A new way of estimating the Greeks has recently been introduced by Elie, Fermanian and Touzi through a randomization of the parameter of interest combined with nonparametric estimation techniques. This paper studies another estimator of this type, whose interest is that it is closely related to the score function, which is well known to be the optimal Greek weight. This estimator relies on the use of two distinct kernel functions, and the main contribution of this paper is to provide its asymptotic properties. Under a slightly more stringent condition, its rate of convergence equals that of the estimators introduced by Elie, Fermanian and Touzi and outperforms the finite differences estimator. Beyond the technical interest of the proofs, this result is encouraging for the development of new types of estimators for sensitivities. 
Keywords:  Sensitivity estimation, Monte Carlo simulation, Nonparametric regression. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal00416449_v1&r=ecm 
By:  Hlouskova, Jaroslava (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria); Wagner, Martin (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria) 
Abstract:  In this paper we present finite T mean and variance correction factors and corresponding response surface regressions for the panel cointegration tests presented in Pedroni (1999, 2004), Westerlund (2005), Larsson et al. (2001), and Breitung (2005). For the single equation tests we consider up to 12 regressors and for the system tests vector autoregression dimensions up to 12 variables. All commonly used specifications for the deterministic components are considered. The time dimension sample sizes are 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 500. 
Keywords:  Panel cointegration test, correction factor, response surface, simulation 
JEL:  C12 C15 C23 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:ihs:ihsesp:244&r=ecm 
By:  Kiefer, Nicholas M. (Cornell University) 
Abstract:  Stochastically ordered random variables with given marginal distributions are combined into a joint distribution preserving the ordering and the marginals using a maximum entropy formulation. A closed-form expression is obtained. An application is in default estimation for different portfolio segments, where priors on the individual default probabilities are available and the stochastic ordering is agreeable to separate experts. The ME formulation allows an efficiency improvement over separate analyses. 
Date:  2009–01 
URL:  http://d.repec.org/n?u=RePEc:ecl:corcae:0901&r=ecm 
By:  Gareth W. Peters; Pavel V. Shevchenko; Mario V. Wüthrich 
Abstract:  In this paper we examine the claims reserving problem using Tweedie's compound Poisson model. We develop the maximum likelihood and Bayesian Markov chain Monte Carlo simulation approaches to fit the model and then compare the estimated models under different scenarios. The key point we demonstrate relates to the comparison of reserving quantities with and without model uncertainty incorporated into the prediction. We consider both the model selection problem and the model averaging solutions for the predicted reserves. As part of this process we also consider the subproblem of variable selection to obtain a parsimonious representation of the model being fitted. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.1483&r=ecm 
By:  Adam Clements (QUT); Annastiina Silvennoinen (QUT) 
Abstract:  Forecasts of asset return volatility are necessary for many financial applications, including portfolio allocation. Traditionally, the parameters of econometric models used to generate volatility forecasts are estimated in a statistical setting and subsequently used in an economic setting such as portfolio allocation. Differences in the criteria under which the model is estimated and applied may reduce the overall economic benefit of a model in the context of portfolio allocation. This paper investigates the economic benefit of direct utility-based estimation of the parameters of a volatility model and allows practical issues such as transaction costs to be incorporated within the estimation scheme. In doing so, we compare the benefits stemming from various estimators of historical volatility in the context of portfolio allocation. It is found that maximal utility-based estimation of a simple volatility model, taking transaction costs into account, is preferred on the basis of greater realized utility. Estimation of models using historical daily returns is preferred over historical realized volatility. 
Keywords:  Volatility, utility, portfolio allocation, realized volatility, MIDAS 
JEL:  C10 C22 G11 
Date:  2009–07–21 
URL:  http://d.repec.org/n?u=RePEc:qut:auncer:2009_57&r=ecm 
By:  Michael Dueker; Zacharias Psaradakis; Martin Sola; Fabio Spagnolo 
Abstract:  This paper proposes a contemporaneous-threshold smooth transition GARCH (CST-GARCH) model for dynamic conditional heteroskedasticity. The CST-GARCH model is a generalization to second conditional moments of the contemporaneous smooth transition threshold autoregressive model of Dueker et al. (2007), in which the regime weights depend on the ex ante probability that a contemporaneous latent regime-specific variable exceeds a threshold value. A key feature of the CST-GARCH model is that its transition function depends on all the parameters of the model as well as on the data. These characteristics allow the model to account for the large persistence and regime shifts that are often observed in the conditional second moments of economic and financial time series. 
Keywords:  Conditional heteroskedasticity; Smooth transition GARCH; Threshold; Stock returns. 
JEL:  C22 E31 G12 
Date:  2009–06 
URL:  http://d.repec.org/n?u=RePEc:udt:wpecon:200906&r=ecm 
By:  Romuald Elie (CREST, Ceremade) 
Abstract:  This paper addresses the general issue of estimating the sensitivity of the expectation of a random variable with respect to a parameter characterizing its evolution. In finance, for example, the sensitivities of the price of a contingent claim are called the Greeks. A new way of estimating the Greeks has recently been introduced by Elie, Fermanian and Touzi through a randomization of the parameter of interest combined with nonparametric estimation techniques. This paper studies another estimator of this type, whose interest is that it is closely related to the score function, which is well known to be the optimal Greek weight. This estimator relies on the use of two distinct kernel functions, and the main contribution of this paper is to provide its asymptotic properties. Under a slightly more stringent condition, its rate of convergence equals that of the estimators introduced by Elie, Fermanian and Touzi and outperforms the finite differences estimator. Beyond the technical interest of the proofs, this result is encouraging for the development of new types of estimators for sensitivities. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0909.2624&r=ecm 
By:  Dilem Yildirim; Ralf Becker; Denise R Osborn 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:man:sespap:0915&r=ecm 
By:  Groen, J.J.J.; Paap, R. (Erasmus Econometric Institute) 
Abstract:  This paper revisits inflation forecasting using reduced-form Phillips curve forecasts, i.e., inflation forecasts using activity and expectations variables. We propose a Phillips curve-type model that results from averaging across different regression specifications selected from a set of potential predictors. The set of predictors includes lagged values of inflation, a host of real activity data, term structure data, nominal data and surveys. In each of the individual specifications we allow for stochastic breaks in regression parameters, where the breaks are described as occasional shocks of random magnitude. As such, our framework simultaneously addresses structural change and model uncertainty, both of which unavoidably affect Phillips curve forecasts. We use this framework to describe PCE deflator and GDP deflator inflation rates for the United States across the post-WWII period. Over the full 1960-2008 sample the framework indicates several structural breaks across different combinations of activity measures. These breaks often coincide with, among others, policy regime changes and oil price shocks. In contrast to many previous studies, we find less evidence for autonomous variance breaks and inflation gap persistence. Through a real-time out-of-sample forecasting exercise we show that our model specification generally provides superior one-quarter and one-year ahead forecasts for quarterly inflation relative to a whole range of forecasting models that are typically used in the literature. 
Keywords:  inflation forecasting; Phillips correlations; real-time data; structural breaks; model uncertainty; Bayesian model averaging 
Date:  2009–09–10 
URL:  http://d.repec.org/n?u=RePEc:dgr:eureir:1765016709&r=ecm 
By:  Pavel V. Shevchenko 
Abstract:  To quantify the operational risk capital charge under the current regulatory framework for banking supervision, referred to as Basel II, many banks adopt the Loss Distribution Approach. There are many modeling issues that should be resolved to use the approach in practice. In this paper we review the quantitative methods suggested in the literature for implementation of the approach. In particular, we discuss the use of the Bayesian inference method that allows expert judgement and parameter uncertainty to be taken into account, the modeling of dependence, and the inclusion of insurance. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.1805&r=ecm 
By:  Marc Barthelemy; JeanPierre Nadal; Henri Berestycki 
Abstract:  A single social phenomenon (such as crime, unemployment or birth rate) can be observed through temporal series corresponding to units at different levels (cities, regions, countries...). Units at a given local level may follow a collective trend imposed by external conditions, but may also display fluctuations of purely local origin. The local behavior is usually computed as the difference between the local data and a global average (e.g. a national average), a viewpoint which can be very misleading. In this article, we propose a method for separating the local dynamics from the global trend in a collection of correlated time series. We take an independent component analysis approach in which, in contrast with previously proposed methods, we do not assume a small average local contribution. We first test our method on financial time series, for which various data analysis tools have already been used. For the S&P 500 stocks, our method is able to identify two classes of stocks with markedly different behaviors: the 'followers' (stocks driven by the collective trend) and the 'leaders' (stocks for which local fluctuations dominate). Furthermore, as a by-product contributing to its validation, the method also allows us to classify stocks into several groups consistent with industrial sectors. We then consider crime rate series, a domain where the separation between global and local policies is still a major subject of debate. We apply our method to the states in the US and the regions in France. In the case of the US data, we observe large fluctuations in the transition period of the mid-70s, during which crime rates increased significantly, whereas since the 80s the state crime rates have been governed by external factors, with the importance of local specificities decreasing. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0909.1490&r=ecm 
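A toy illustration of the independent component idea described above, using scikit-learn's FastICA as a generic stand-in for the authors' method: a panel of series sharing one common driver plus purely local fluctuations is decomposed, and the source most correlated with the cross-sectional average is read as the collective trend.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(8)
T, n_units = 1000, 8
d_global = rng.laplace(size=T)                     # increments of a common driver
local = rng.laplace(size=(T, n_units))             # purely local fluctuations
loadings = 0.5 + rng.random(n_units)
dX = d_global[:, None] * loadings + local          # observed increments (T x units)

ica = FastICA(n_components=3, random_state=0, max_iter=1000)
S = ica.fit_transform(dX)                          # estimated independent sources

# the source most correlated with the cross-sectional average plays the role of
# the collective trend; the remainder is read as local dynamics
avg = dX.mean(axis=1)
corrs = [abs(np.corrcoef(S[:, k], avg)[0, 1]) for k in range(S.shape[1])]
print("|corr| of each source with the cross-sectional average:", np.round(corrs, 2))
```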
By:  Choi, Hwansik (Texas A&M University); Kiefer, Nicholas M. (Cornell University) 
Abstract:  We show that the asymptotic mean of the log-likelihood ratio in a misspecified model is a differential geometric quantity that is related to the exponential curvature of Efron (1978), Amari (1982), and the preferred point geometry of Critchley et al. (1993, 1994). The mean is invariant with respect to reparametrization, which leads to the differential geometrical approach where coordinate-system invariant quantities like statistical curvatures play an important role. When models are misspecified, the likelihood ratios do not have the chi-squared asymptotic limit, and the asymptotic mean of the likelihood ratio depends on two geometric factors: the departure of models from exponential families (i.e. the exponential curvature) and the departure of embedding spaces from being totally flat in the sense of Critchley et al. (1994). As a special case, the mean becomes the mean of the usual chi-squared limit (i.e. half the degrees of freedom) when these two curvatures vanish. The effect of curvatures is shown in the non-nested hypothesis testing approach of Vuong (1989), and we correct the numerator of the test statistic with an estimated asymptotic mean of the log-likelihood ratio to improve the asymptotic approximation to the sampling distribution of the test statistic. 
Date:  2009–05 
URL:  http://d.repec.org/n?u=RePEc:ecl:corcae:0908&r=ecm 
By:  Dirk Tasche 
Abstract:  The intention of this paper is to provide all the estimation concepts and techniques that are needed to implement a two-phase approach to the parametric estimation of probability of default (PD) curves. In the first phase of this approach, a raw PD curve is estimated based on parameters that reflect discriminatory power. In the second phase, the raw PD curve is calibrated to fit a target unconditional PD. The concepts and techniques presented include a discussion of different definitions of area under the curve (AUC) and accuracy ratio (AR), a simulation study on the performance of confidence interval estimators for the AUC, a discussion of the one-parameter approach to the estimation of PD curves by van der Burgt (2008) and alternative approaches, as well as a simulation study on the performance of the presented PD curve estimators. The topics are treated in depth in order to provide the full rationale behind them and to produce results that can be implemented immediately. 
Date:  2009–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0905.3928&r=ecm 
By:  L. Lin; Ren R. E; D. Sornette 
Abstract:  We present a self-consistent model for explosive financial bubbles, which combines a mean-reverting volatility process and a stochastic conditional return which reflects nonlinear positive feedbacks and continuous updates of the investors' beliefs and sentiments. The conditional expected returns exhibit faster-than-exponential acceleration decorated by accelerating oscillations, called "log-periodic power law." Tests on residuals show a remarkably low rate (0.2%) of false positives when applied to a GARCH benchmark. When tested on the S&P 500 US index from Jan. 3, 1950 to Nov. 21, 2008, the model correctly identifies the bubbles ending in Oct. 1987, in Oct. 1997, in Aug. 1998, and the ITC bubble ending in the first quarter of 2000. Different unit-root tests confirm the high relevance of the model specification. Our model also provides a diagnostic for the duration of bubbles: applied to the period before the Oct. 1987 crash, there is clear evidence that the bubble started at least 4 years earlier. We confirm the validity and universality of the volatility-confined LPPL model on seven other major bubbles that have occurred in the world in the last two decades. Using Bayesian inference, we find a very strong statistical preference for our model compared with a standard benchmark, in contradiction with Chang and Feigenbaum (2006), which used a unit-root model for residuals. 
Date:  2009–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0905.0128&r=ecm 
By:  Kei Takeuchi; Akimichi Takemura; Masayuki Kumon 
Abstract:  We propose procedures for testing whether stock price processes are martingales, based on limit order type betting strategies. We first show that the null hypothesis of the martingale property of a stock price process can be tested based on the capital process of a betting strategy. In particular, with high-frequency Markov-type strategies we find that the martingale null hypothesis is rejected for many stock price processes. 
Date:  2009–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0907.3273&r=ecm 
By:  Westerlund, Joakim (Department of Economics, School of Business, Economics and Law, Göteborg University); Costantini, Mauro (University of Vienna); Narayan, Paresh (Deakin University); Popp, Stephan (University of Duisburg–Essen) 
Abstract:  Some unit root testing situations are more difficult than others. In the case of quarterly industrial production, there is not only the seasonal variation that needs to be considered but also the occasionally breaking linear trend. In the current paper we take this as our starting point to develop three new seasonal unit root tests that allow for a break in both the seasonal mean and the linear trend of a quarterly time series. The asymptotic properties of the tests are derived and their small-sample behavior is investigated using simulations. In the empirical part of the paper we consider as an example the industrial production of 13 European countries. The results suggest that for most of the series there is evidence of stationary seasonality around an otherwise nonseasonal unit root. 
Keywords:  Seasonal unit root tests; Structural breaks; Linear time trend; Industrial production 
JEL:  C12 C22 
Date:  2009–09–11 
URL:  http://d.repec.org/n?u=RePEc:hhs:gunwpe:0377&r=ecm 
By:  T.D. Stanley; Stephen B. Jarrell; Hristos Doucouliagos 
Abstract:  Conventional practice is to draw inferences from all available data and research results, even though there is ample evidence that empirical literatures suffer from publication selection bias. When a scientific literature is plagued by such bias, simply discarding the vast majority of empirical results can actually improve statistical inference and estimation. Simulations demonstrate that, if the majority of researchers, reviewers, and editors use statistical significance as a criterion for reporting or publishing an estimate, discarding 90% of the published findings greatly reduces publication selection bias and is often more efficient than conventional summary statistics. Improving statistical estimation and inference by removing so much data goes against statistical theory and practice; hence, it is paradoxical. We investigate a very simple method to reduce the effects of publication bias and to improve the efficiency of summary estimates of accumulated empirical research results, which averages the most precise ten percent of the reported estimates (i.e., ‘Top10’). In the process, the critical importance of precision (the inverse of an estimate’s standard error) as a measure of a study’s quality is brought to light. Reviewers and journal editors should use precision as one objective measure of a study’s quality. 
Keywords:  Publication Selection, Meta-analysis, Precision, Simulations, Meta-Regression. 
Date:  2009–09–16 
URL:  http://d.repec.org/n?u=RePEc:dkn:econwp:eco_2009_13&r=ecm 
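The ‘Top10’ idea is easy to state in code: rank reported estimates by precision (the inverse standard error), keep the most precise ten percent, and average them, then compare with the usual inverse-variance-weighted mean of everything. The simulated ‘literature’ and its selection rule below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)
true_effect, n_studies = 0.1, 2000

# simulate a literature with publication selection: noisy studies are reported
# only if their estimate happens to be "statistically significant"
se = rng.uniform(0.02, 0.5, n_studies)
est = true_effect + se * rng.standard_normal(n_studies)
reported = (np.abs(est / se) > 1.96) | (se < 0.1)     # illustrative selection rule
est, se = est[reported], se[reported]

# conventional fixed-effect (inverse-variance-weighted) summary of all reported estimates
fe_mean = np.sum(est / se**2) / np.sum(1 / se**2)

# 'Top10': average only the most precise 10% of reported estimates
k = max(1, len(est) // 10)
top = np.argsort(se)[:k]
top10_mean = est[top].mean()

print(f"fixed-effect mean: {fe_mean:.3f}  Top10 mean: {top10_mean:.3f}  truth: {true_effect}")
```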
By:  Suarez, Ronny 
Abstract:  In this paper an alternative nonparametric historical simulation approach, the Mixing Unconditional Disturbances model with constant volatility, where price paths are generated by reshuffling disturbances of S&P 500 Index returns over the period 1950-1998, is used to estimate a Generalized Extreme Value distribution and a Generalized Pareto distribution. An ordinary backtest over the period 1999-2008 was performed to verify this technique, providing more accurate return levels under the upper bound of the confidence interval for the Block Maxima and Peak-Over-Threshold approaches with Mixing Unconditional Disturbances. This method can be an effective tool for stress-testing valuation. 
Keywords:  Extreme Values; Block Maxima; Peak-Over-Threshold; Mixing Unconditional Disturbances 
JEL:  C0 C1 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:17482&r=ecm 
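The two extreme-value fits mentioned above can be sketched with scipy's built-in distributions: a GEV fitted to block maxima and a Generalized Pareto fitted to exceedances over a high threshold. The simulated returns stand in for the reshuffled S&P 500 paths of the paper, and the quantile levels are arbitrary.

```python
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(10)
returns = rng.standard_t(df=4, size=250 * 49) * 0.01        # ~49 "years" of daily returns
losses = -returns                                            # work with losses

# Block Maxima: GEV fitted to annual maxima of losses
block_max = losses.reshape(49, 250).max(axis=1)
c_gev, loc_gev, scale_gev = genextreme.fit(block_max)
print("GEV 99% annual-maximum loss:", round(genextreme.ppf(0.99, c_gev, loc_gev, scale_gev), 4))

# Peak-Over-Threshold: GPD fitted to exceedances over a high threshold
u = np.quantile(losses, 0.95)
exceed = losses[losses > u] - u
c_gpd, _, scale_gpd = genpareto.fit(exceed, floc=0)
p_exceed = exceed.size / losses.size
q = 0.999
# unconditional q-quantile implied by the POT model: P(L > x) = p_exceed * (1 - GPD(x - u))
var999 = u + genpareto.ppf(1 - (1 - q) / p_exceed, c_gpd, 0, scale_gpd)
print("POT 99.9% daily loss quantile:", round(var999, 4))
```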
By:  Xiaolin Luo; Pavel V. Shevchenko; John B. Donnelly 
Abstract:  Typically, operational risk losses are reported above some threshold. This paper studies the impact of ignoring data truncation on the 0.999 quantile of the annual loss distribution for operational risk, for a broad range of distribution parameters and truncation levels. Loss frequency and severity are modelled by the Poisson and Lognormal distributions respectively. Two ways of ignoring data truncation are studied: the "naive model", fitting a Lognormal distribution with support on the positive semi-infinite interval, and the "shifted model", fitting a Lognormal distribution shifted to the truncation level. For all practical cases, the "naive model" leads to underestimation (which can be severe) of the 0.999 quantile. The "shifted model" overestimates the 0.999 quantile except in some cases of small underestimation for large truncation levels. Conservative estimation of the capital charge is usually acceptable, so the use of the "shifted model" can be justified, while the "naive model" should not be allowed. However, if parameter uncertainty is taken into account (in practice it is often ignored), the "shifted model" can lead to considerable underestimation of the capital charge. This is demonstrated with a practical example. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.2910&r=ecm 
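The quantity compared throughout the study above, the 0.999 quantile of the compound Poisson-Lognormal annual loss, can be approximated by plain Monte Carlo once frequency and severity parameters are fixed. The parameter values below are made up; in the paper's setting one would plug in the truncation-corrected and the "naive"/"shifted" severity fits and compare the resulting quantiles.

```python
import numpy as np

def annual_loss_quantile(lam, mu, sigma, q=0.999, n_sims=100_000, seed=0):
    """Monte Carlo q-quantile of the annual loss S = sum of N ~ Poisson(lam)
    independent Lognormal(mu, sigma) severities."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, n_sims)
    totals = np.array([np.exp(mu + sigma * rng.standard_normal(n)).sum() for n in counts])
    return np.quantile(totals, q)

# made-up parameters: compare the quantile implied by two severity fits,
# e.g. a truncation-corrected fit versus a hypothetical "naive" fit
print(round(annual_loss_quantile(lam=20, mu=2.0, sigma=1.5), 1))
print(round(annual_loss_quantile(lam=20, mu=2.3, sigma=1.3), 1))
```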
By:  Fulvio Baldovin; Dario Bovina; Attilio L. Stella 
Abstract:  Financial data give an opportunity to uncover the nonstationarity which may be hidden in many single time series. Five years of daily Euro/Dollar trading records, in the roughly three hours following the New York opening session, are shown to give an accurate ensemble representation of the self-similar, non-Markovian stochastic process with nonstationary increments recently conjectured to generally underlie financial asset dynamics [PNAS 104, 19741 (2007)]. Introducing novel quantitative tools for the analysis of non-Markovian time series, we show that empirical nonlinear correlators are in remarkable agreement with model predictions based only on the anomalous scaling form of the logarithmic return distribution. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0909.3244&r=ecm 
By:  Jiří Witzany (University of Economics, Prague, Czech Republic) 
Abstract:  The paper proposes a new method to estimate the correlation of account-level Basle II Loss Given Default (LGD). The correlation determines the probability distribution of portfolio-level LGD in the context of a copula model, which is used to stress the LGD parameter as well as to estimate the LGD discount rate and other parameters. Given historical LGD observations, we apply the maximum likelihood method to estimate the best correlation parameter. The method is applied and analyzed on a real large data set of unsecured retail account-level LGDs and the corresponding monthly series of average LGDs. The correlation estimate comes relatively close to the PD regulatory correlation. It is also tested for stability using the bootstrapping method and used in an efficient formula to estimate ex ante one-year stressed LGD, i.e. one-year LGD quantiles at any reasonable probability level. 
Keywords:  credit risk, recovery rate, loss given default, correlation, regulatory capital 
JEL:  G21 G28 C14 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:fau:wpaper:wp2009_21&r=ecm 
By:  Debdulal Mallick 
Abstract:  In discrete choice models, the marginal effect of a variable of interest that is interacted with another variable differs from the marginal effect of a variable that is not interacted with any variable. The magnitude of the interaction effect is also not equal to the marginal effect of the interaction term. I present consistent estimators of both marginal and interaction effects in ordered response models. This procedure is general and can easily be extended to other discrete choice models. I also provide an example using household survey data on food security in Bangladesh. Results show that marginal effects of interaction terms are estimated by standard statistical software (STATA® 10) with very large error and even with the wrong sign. 
Keywords:  Marginal effect, interaction effect, ordered probit, discrete choice. 
JEL:  C12 C25 
Date:  2009–09–22 
URL:  http://d.repec.org/n?u=RePEc:eei:rpaper:eeri_rp_2009_22&r=ecm 
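The point is easiest to see in the binary probit special case (a simplification of the paper's ordered-response setting): the interaction effect is the cross-partial derivative of the choice probability, which differs from the "marginal effect of the product term" that software typically reports. The coefficients below are made up.

```python
import numpy as np
from scipy.stats import norm

# made-up probit coefficients: P(y=1|x) = Phi(b0 + b1*x1 + b2*x2 + b12*x1*x2)
b0, b1, b2, b12 = -0.5, 0.8, 0.6, -0.4

def interaction_effect(x1, x2):
    """True interaction effect: cross-partial of Phi(u) with respect to x1 and x2."""
    u = b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2
    phi = norm.pdf(u)
    return b12 * phi + (b1 + b12 * x2) * (b2 + b12 * x1) * (-u) * phi

def naive_effect(x1, x2):
    """What is often (incorrectly) read off: the 'marginal effect' of the product term."""
    u = b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2
    return b12 * norm.pdf(u)

for x1, x2 in [(0.0, 0.0), (1.0, 1.0), (2.0, -1.0)]:
    print(x1, x2, round(interaction_effect(x1, x2), 4), round(naive_effect(x1, x2), 4))
```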
By:  Wolfgang Karl Härdle (Humboldt Universität zu Berlin and National Central University, Taiwan); Nikolaus Hautsch (Humboldt Universität zu Berlin, Quantitative Products Laboratory, Berlin, and CFS); Andrija Mihoci (Humboldt Universität zu Berlin and University of Zagreb, Croatia) 
Abstract:  We model the dynamics of ask and bid curves in a limit order book market using a dynamic semiparametric factor model. The shape of the curves is captured by a factor structure which is estimated nonparametrically. Corresponding factor loadings are assumed to follow multivariate dynamics and are modelled using a vector autoregressive model. Applying the framework to four stocks traded at the Australian Stock Exchange (ASX) in 2002, we show that the suggested model captures the spatial and temporal dependencies of the limit order book. Relating the shape of the curves to variables reflecting the current state of the market, we show that the recent liquidity demand has the strongest impact. In an extensive forecasting analysis we show that the model is successful in forecasting the liquidity supply over various time horizons during a trading day. Moreover, it is shown that the model’s forecasting power can be used to improve optimal order execution strategies. 
Keywords:  Limit Order Book, Liquidity Risk, Semiparametric Model, Factor Structure, Prediction 
JEL:  C14 C32 C53 G1 
Date:  2009–09–15 
URL:  http://d.repec.org/n?u=RePEc:cfs:cfswop:wp200918&r=ecm 
By:  Westerlund, Joakim (Department of Economics, School of Business, Economics and Law, Göteborg University); Breitung, Jörg (University of Bonn) 
Abstract:  This paper points to some of the common myths and facts that have emerged from 20 years of research into the analysis of unit roots in panel data. Some of these are well known, others are not. But they all have in common that, if ignored, the effects can be very serious. This is demonstrated using both simulations and theoretical reasoning. 
Keywords:  Nonstationary panel data; Unit root tests; Cross-section dependence; Multidimensional limits 
JEL:  C13 C33 
Date:  2009–09–11 
URL:  http://d.repec.org/n?u=RePEc:hhs:gunwpe:0380&r=ecm 