
New Economics Papers on Econometrics 
By:  Md Atikur Rahman Khan; D.S. Poskitt 
Abstract:  In this paper we propose a new methodology for selecting the window length in Singular Spectrum Analysis in which the window length is determined from the data prior to the commencement of modeling. The selection procedure is based on statistical tests designed to test the convergence of the autocovariance function. A classical time series portmanteau-type statistic and two test statistics derived using a conditional moment principle are considered. The first two are applicable to short-memory processes, and the third is applicable to both short- and long-memory processes. We derive the asymptotic distribution of the statistics under fairly general regularity conditions and show that the criteria will identify true convergence with a finite window length with probability one as the sample size increases. Results obtained using Monte Carlo simulation indicate the relevance of the asymptotic theory, even in relatively small samples, and that the conditional moment tests will choose a window length consistent with the Whitney embedding theorem. Application to observations on the Southern Oscillation Index shows how observed experimental behaviour can be reflected in features seen with real-world data sets. 
Keywords:  Portmanteau type test, Conditional moment test, Asymptotic distribution, Linear regular process, Singular spectrum analysis, Embedding 
JEL:  C12 C22 C52 
Date:  2011–09 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201122&r=ecm 
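The window-length selection idea above lends itself to a simple sketch: pick the smallest window beyond which the sample autocorrelations look like those of a converged remainder, judged by a Ljung-Box-style portmanteau statistic. This is a generic illustration in Python, not the authors' exact statistic; the AR(1) example, the lag count m, and the chi-squared cutoff are all illustrative assumptions.

```python
import numpy as np

def sample_autocov(x, max_lag):
    """Biased sample autocovariances gamma_hat(0..max_lag)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    return np.array([np.dot(xc[:n - k], xc[k:]) / n for k in range(max_lag + 1)])

def portmanteau_window(x, max_window=50, m=10, crit=18.31):
    """Pick the smallest window L such that a Ljung-Box-type statistic on
    autocorrelations at lags L+1..L+m falls below a chi-squared critical
    value (18.31 ~ chi2_{0.95,10}); returns max_window if none qualifies."""
    gamma = sample_autocov(x, max_window + m)
    rho = gamma / gamma[0]
    n = len(x)
    for L in range(1, max_window + 1):
        lags = np.arange(L + 1, L + m + 1)
        Q = n * (n + 2) * np.sum(rho[lags] ** 2 / (n - lags))
        if Q < crit:
            return L
    return max_window

# AR(1) example: autocovariances decay geometrically, so a moderate
# window should be selected once the remaining correlation is negligible.
rng = np.random.default_rng(0)
e = rng.standard_normal(2000)
x = np.empty(2000)
x[0] = e[0]
for t in range(1, 2000):
    x[t] = 0.5 * x[t - 1] + e[t]
L = portmanteau_window(x)
```

For an AR(1) with coefficient 0.5, the population autocorrelation at lag k is 0.5^k, so the statistic drops below typical chi-squared critical values after only a handful of lags.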
By:  Marmer, Vadim; Sakata, Shinichi 
Abstract:  Extending the L1-IV approach proposed by Sakata (1997, 2007), we develop a new method, named the $\rho_{\tau}$-IV estimation, to estimate structural equations based on the conditional quantile restriction imposed on the error terms. We study the asymptotic behavior of the proposed estimator and show how to make statistical inferences on the regression parameters. Given the practical importance of weak identification, a highlight of the paper is a test robust to weak identification. The statistic used in our method can be viewed as a natural counterpart of Anderson and Rubin's (1949) statistic in the $\rho_{\tau}$-IV estimation. 
Keywords:  quantile regression; instrumental variables; weak identification 
JEL:  C21 
Date:  2011–09–28 
URL:  http://d.repec.org/n?u=RePEc:ubc:pmicro:vadim_marmer201126&r=ecm 
By:  Arnold Zellner ((posthumous) Booth School of Business, University of Chicago, USA); Tomohiro Ando (Graduate School of Business Administration, Keio University, Japan); Nalan Basturk (Econometric Institute, Erasmus University Rotterdam, The Netherlands; The Rimini Centre for Economic Analysis, Rimini, Italy); Lennart Hoogerheide (VU University Amsterdam, The Netherlands); Herman K. van Dijk (Econometric Institute, Erasmus University Rotterdam, and VU University Amsterdam) 
Abstract:  A Direct Monte Carlo (DMC) approach is introduced for posterior simulation in the Instrumental Variables (IV) model with one possibly endogenous regressor, multiple instruments and Gaussian errors under a flat prior. This DMC method can also be applied in an IV model (with one or multiple instruments) under an informative prior for the endogenous regressor's effect. This DMC approach cannot be applied to more complex IV models or Simultaneous Equations Models with multiple endogenous regressors. An Approximate DMC (ADMC) approach is introduced that makes use of the proposed Hybrid Mixture Sampling (HMS) method, which facilitates Metropolis-Hastings (MH) or Importance Sampling from a proper marginal posterior density with highly non-elliptical shapes that tend to infinity at a point of singularity. After one has simulated from the irregularly shaped marginal distribution using the HMS method, one easily samples the other parameters from their conditional Student-t and Inverse-Wishart posteriors. An example illustrates the close approximation and high MH acceptance rate, whereas using a simple candidate distribution such as the Student-t may lead to an infinite variance of the Importance Sampling weights. The choice between the IV model and a simple linear model under the restriction of exogeneity may be based on predictive likelihoods, for which the efficient simulation of all model parameters may be quite useful. In future work the ADMC approach may be extended to more extensive IV models such as IV with non-Gaussian errors, panel IV, or probit/logit IV. 
Keywords:  Instrumental Variables; Errors in Variables; Simultaneous Equations Model; Bayesian estimation; Direct Monte Carlo; Hybrid Mixture Sampling 
Date:  2011–09–27 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20110137&r=ecm 
By:  Wolfgang Rinnergschwentner; Gottfried Tappeiner; Janette Walde 
Abstract:  This paper picks up on a model developed by Philipov and Glickman (2006) for modeling multivariate stochastic volatility via Wishart processes. MCMC simulation from the posterior distribution is employed to fit the model. However, erroneous mathematical transformations in the full conditionals cause a faulty implementation of the approach. We adjust the model, upgrade the analysis and investigate the statistical properties of the estimators using an extensive Monte Carlo study. Employing a Gibbs sampler in combination with a Metropolis-Hastings algorithm, inference for the time-dependent covariance matrix is feasible with appropriate statistical properties. 
Keywords:  Bayesian time series; Stochastic covariance; Time-varying correlation; Markov Chain Monte Carlo 
JEL:  C01 C11 C63 
Date:  2011–08 
URL:  http://d.repec.org/n?u=RePEc:inn:wpaper:201119&r=ecm 
By:  Rasmus Tangsgaard Varneskov (Aarhus University and CREATES) 
Abstract:  This paper extends the class of generalized flat-top realized kernels, introduced in Varneskov (2011), to the multivariate case, where the quadratic covariation of non-synchronously observed asset prices is estimated in the presence of market microstructure noise that is allowed to exhibit serial dependence and to be correlated with the efficient price process. Estimators in this class are shown to possess desirable statistical properties such as consistency, asymptotic normality, and asymptotic unbiasedness at an optimal n^(1/4) convergence rate. A finite sample correction based on projections of symmetric matrices ensures positive (semi-)definiteness without altering the asymptotic properties of the class of estimators. The finite sample correction admits non-linear transformations of the estimated covariance matrix such as correlations and realized betas, and it can be used in portfolio optimization problems. These transformations are all shown to inherit the desirable asymptotic properties of the generalized flat-top realized kernels. A simulation study shows that the class of estimators has a superior finite sample trade-off between bias and root mean squared error relative to competing estimators. Lastly, two small empirical applications to high frequency stock market data illustrate the bias reduction relative to competing estimators in estimating correlations, realized betas, and mean-variance frontiers, as well as the use of the new estimators in dynamic hedging. 
Keywords:  Bias Reduction, Nonparametric Estimation, Market Microstructure Noise, Portfolio Optimization, Quadratic Covariation, Realized Beta. 
JEL:  C14 C15 G11 
Date:  2011–09–27 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201135&r=ecm 
By:  Todd Clark; Michael W. McCracken 
Abstract:  This paper examines the asymptotic and finite-sample properties of tests of equal forecast accuracy when the models being compared are overlapping in the sense of Vuong (1989). Two models are overlapping when the true model contains just a subset of variables common to the larger sets of variables included in the competing forecasting models. We consider an out-of-sample version of the two-step testing procedure recommended by Vuong but also show that an exact one-step procedure is sometimes applicable. When the models are overlapping, we provide a simple-to-use fixed regressor wild bootstrap that can be used to conduct valid inference. Monte Carlo simulations generally support the theoretical results: the two-step procedure is conservative while the one-step procedure can be accurately sized when appropriate. We conclude with an empirical application comparing the predictive content of credit spreads to growth in real stock prices for forecasting U.S. real GDP growth. 
Keywords:  Forecasting 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:fip:fedcwp:1121&r=ecm 
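A fixed regressor wild bootstrap of the kind mentioned above holds the regressors fixed and perturbs only the residuals with random signs, which preserves conditional heteroskedasticity. The sketch below applies the idea to OLS standard errors rather than to the paper's forecast-accuracy statistics; the data-generating process and all settings are illustrative assumptions.

```python
import numpy as np

def wild_bootstrap_se(X, y, n_boot=500, seed=0):
    """Fixed-regressor wild bootstrap for OLS standard errors: regressors
    are held fixed and residuals are reweighted by Rademacher draws."""
    rng = np.random.default_rng(seed)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    boot = np.empty((n_boot, X.shape[1]))
    for b in range(n_boot):
        w = rng.choice([-1.0, 1.0], size=len(y))  # Rademacher weights
        y_star = X @ beta + resid * w             # regressors kept fixed
        boot[b], *_ = np.linalg.lstsq(X, y_star, rcond=None)
    return beta, boot.std(axis=0, ddof=1)

# Heteroskedastic illustration: noise scale grows with |x|.
rng = np.random.default_rng(1)
n = 400
x = rng.standard_normal(n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.standard_normal(n) * (1 + np.abs(x))
beta_hat, se = wild_bootstrap_se(X, y)
```

Because the bootstrap resamples signs of the actual residuals at each design point, the resulting standard errors reflect the heteroskedasticity without modeling it.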
By:  Orth, Walter 
Abstract:  In small samples and especially in the case of small true default probabilities, standard approaches to credit default probability estimation have certain drawbacks. Most importantly, standard estimators tend to underestimate the true default probability which is of course an undesirable property from the perspective of prudent risk management. As an alternative, we present an empirical Bayes approach to default probability estimation and apply the estimator to a comprehensive sample of Standard & Poor's rated sovereign bonds. We further investigate the properties of a standard estimator and the empirical Bayes estimator by means of a simulation study. We show that the empirical Bayes estimator is more conservative and more precise under realistic data generating processes. 
Keywords:  Low-default portfolios; empirical Bayes; sovereign default risk; Basel II 
JEL:  C41 G15 G28 C11 
Date:  2011–09–28 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:33778&r=ecm 
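The low-default problem above is easy to see in a toy example: the maximum-likelihood default rate for a rating class with zero observed defaults is exactly zero. A beta-binomial shrinkage sketch (a generic empirical Bayes device, not Orth's estimator; the prior_strength tuning constant is an assumption) pulls each class toward the pooled rate and stays strictly positive.

```python
import numpy as np

def empirical_bayes_pd(defaults, exposures, prior_strength=50.0):
    """Beta-binomial shrinkage sketch: each class's default rate is pulled
    toward the pooled rate; prior_strength plays the role of a prior
    sample size (a tuning assumption, not estimated from the data here)."""
    defaults = np.asarray(defaults, dtype=float)
    exposures = np.asarray(exposures, dtype=float)
    pooled = defaults.sum() / exposures.sum()
    return (defaults + prior_strength * pooled) / (exposures + prior_strength)

# Low-default portfolio: the MLE for class 0 is exactly zero, which
# understates risk; the shrunken estimate stays strictly positive.
defaults = [0, 1, 4]
exposures = [200, 300, 250]
mle = np.array(defaults) / np.array(exposures)
eb = empirical_bayes_pd(defaults, exposures)
```

Shrinkage also moves the highest observed rate down toward the pooled rate, which is the usual bias-variance trade-off behind the precision gains reported in the simulation study.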
By:  Markus Jochmann (Newcastle University, UK; The Rimini Centre for Economic Analysis (RCEA), Italy); Gary Koop (University of Strathclyde, UK; The Rimini Centre for Economic Analysis (RCEA), Italy) 
Abstract:  We develop methods for Bayesian inference in vector error correction models which are subject to a variety of switches in regime (e.g. Markov switches in regime or structural breaks). An important aspect of our approach is that we allow both the cointegrating vectors and the number of cointegrating relationships to change when the regime changes. We show how Bayesian model averaging or model selection methods can be used to deal with the high-dimensional model space that results. Our methods are used in an empirical study of the Fisher effect. 
Keywords:  Bayesian, Markov switching, structural breaks, cointegration, model averaging 
JEL:  C11 C32 C52 
Date:  2011–09 
URL:  http://d.repec.org/n?u=RePEc:rim:rimwps:40_11&r=ecm 
By:  Andrew Chesher (Institute for Fiscal Studies and University College London) 
Abstract:  The paper studies the partial identifying power of structural single equation threshold crossing models for binary responses when explanatory variables may be endogenous. The paper derives the sharp identified set of threshold functions for the case in which explanatory variables are discrete and provides a constructive proof of sharpness. There is special attention to a widely employed semiparametric shape restriction which requires the threshold crossing function to be a monotone function of a linear index involving the observable explanatory variables. It is shown that the restriction brings great computational benefits, allowing direct calculation of the identified set of index coefficients without calculating the nonparametrically specified threshold function. With the restriction in place the methods of the paper can be applied to produce identified sets in a class of binary response models with mismeasured explanatory variables. This is a revised version of CWP23/09 "Single equation endogenous binary response models". 
Date:  2011–09 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:31/11&r=ecm 
By:  Roger Klein; Chan Shen; Francis Vella 
Abstract:  This paper addresses the estimation of a semiparametric sample selection index model where both the selection rule and the outcome variable are binary. Since the marginal effects are often of primary interest and are difficult to recover in a semiparametric setting, we develop estimators for both the marginal effects and the underlying model parameters. The marginal effect estimator only uses observations which are members of a high probability set in which the selection problem is not present. A key innovation is that this high probability set is data dependent. The model parameter estimator is a quasi-likelihood estimator based on regular kernels with bias corrections. We establish their large sample properties and provide simulation evidence confirming that these estimators perform well in finite samples. 
Date:  2011–09 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:30/11&r=ecm 
By:  Joanna Janczura; Rafal Weron 
Abstract:  In this paper we discuss the calibration of models built on mean-reverting processes combined with Markov regime-switching (MRS). We propose a method that greatly reduces the computational burden induced by the introduction of independent regimes and perform a simulation study to test its efficiency. Our method allows for a 100 to over 1000 times faster calibration than in the case of a competing approach utilizing probabilities of the last 10 observations. It is also more general and admits any value of gamma in the base regime dynamics. Since the motivation for this research comes from a recent stream of literature in energy economics, we apply the new method to sample series of electricity spot prices from the German EEX and Australian NSW markets. The proposed MRS models fit these datasets well and replicate the major stylized facts of electricity spot price dynamics. 
Keywords:  Markov regime-switching; Energy economics; Electricity spot price; EM algorithm; Independent regimes 
JEL:  C13 C51 Q40 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:wuu:wpaper:hsc1102&r=ecm 
By:  Li, Minqiang; Peng, Liang; Qi, Yongcheng 
Abstract:  Since its introduction by Owen in [29, 30], the empirical likelihood method has been extensively investigated and widely used to construct confidence regions and to test hypotheses in the literature. For a large class of statistics that can be obtained via solving estimating equations, the empirical likelihood function can be formulated from these estimating equations as proposed by [35]. If only a small subset of the parameters is of interest, a profile empirical likelihood method has to be employed to construct confidence regions, which could be computationally costly. In this paper we propose a jackknife empirical likelihood method to overcome this computational burden. The proposed method is easy to implement and works well in practice. 
Keywords:  profile empirical likelihood; estimating equation; Jackknife 
JEL:  C13 C00 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:33744&r=ecm 
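The mechanics can be sketched as follows: replace the original statistic with its jackknife pseudo-values, then apply standard empirical likelihood for a mean to those pseudo-values, so no profiling over nuisance parameters is needed. This is a generic illustration (the variance as the statistic of interest and the Newton solver settings are assumptions), not the authors' exact construction.

```python
import numpy as np

def jackknife_pseudovalues(x, stat):
    """Pseudo-values z_i = n*T(all data) - (n-1)*T(leave-one-out)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    full = stat(x)
    loo = np.array([stat(np.delete(x, i)) for i in range(n)])
    return n * full - (n - 1) * loo

def el_log_ratio(z, mu):
    """-2 log empirical likelihood ratio for E[z] = mu, with the Lagrange
    multiplier found by Newton's method."""
    d = z - mu
    lam = 0.0
    for _ in range(100):
        w = 1.0 + lam * d
        grad = np.sum(d / w)
        hess = -np.sum(d ** 2 / w ** 2)
        step = grad / hess
        lam -= step
        if abs(step) < 1e-12:
            break
    return 2.0 * np.sum(np.log1p(lam * d))

rng = np.random.default_rng(2)
x = rng.standard_normal(300)
z = jackknife_pseudovalues(x, np.var)  # pseudo-values for the variance
stat_at_truth = el_log_ratio(z, 1.0)   # true variance is 1
```

The statistic is zero at the sample mean of the pseudo-values and, by the jackknife empirical likelihood theory, approximately chi-squared with one degree of freedom at the true value.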
By:  Shakeeb Khan; Denis Nekipelov 
Abstract:  Discrete response models are of high interest in economics and econometrics as they encompass treatment effects, social interaction and peer effect models, and discrete games. We study the impact of the structure of information sets of economic agents on the Fisher information of (strategic) interaction parameters in such models. While in complete information models the information sets of participating economic agents coincide, in incomplete information models each agent has a type, which we model as a payoff shock, that is not observed by other agents. We allow for the presence of a payoff component that is common knowledge to economic agents but is not observed by the econometrician (representing unobserved heterogeneity) and have the agents' payoffs in the incomplete information model approach their payoff in the complete information model as the heterogeneity term approaches 0. We find that in the complete information models, there is zero Fisher information for interaction parameters, implying that estimation and inference become nonstandard. In contrast, positive Fisher information can be attained in the incomplete information models with any nonzero variance of player types, and for those we can also find the semiparametric efficiency bound with unknown distribution of unobserved heterogeneity. The contrast in Fisher information is illustrated in two important cases: treatment effect models, which we model as a triangular system of equations, and static game models. In static game models we show this result is not due to equilibrium refinement with an increase in incomplete information, as our model has a fixed equilibrium selection mechanism. We find that the key factor in these models is the relative tail behavior of the unobserved component in the economic agents' payoffs and that of the observable covariates. 
Keywords:  endogeneity, semiparametric efficiency, optimal convergence rate, strategic response 
JEL:  C35 C14 C25 C13 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:duk:dukeec:1119&r=ecm 
By:  Ronny Nilsson; Gyorgy Gyomai 
Abstract:  This paper reports on the revision properties of different detrending and smoothing methods (cycle estimation methods), including PAT with MCD smoothing, a double Hodrick-Prescott (HP) filter and the Christiano-Fitzgerald (CF) filter. The different cycle estimation methods are rated on their revision performance in a simulated real time experiment. Our goal is to find a robust method that gives early and steady turning point signals. The revision performance of the methods has been evaluated according to bias, overall revision size and signal stability measures. In a second phase, we investigate whether revision performance is improved using stabilizing forecasts or by changing the cycle estimation window from the baseline 6 and 96 months (i.e. filtering out high frequency noise with a cycle length shorter than 6 months and removing trend components with a cycle length longer than 96 months) to 12 and 120 months. The results show that, for all tested time series, the PAT detrending method is outperformed by both the HP and CF filters. In addition, the results indicate that the HP filter outperforms the CF filter in turning point signal stability but has a weaker performance in absolute numerical precision. Short horizon stabilizing forecasts tend to improve the revision characteristics of both methods, and the changed filter window also delivers more robust turning point estimates. 
Date:  2011–05–27 
URL:  http://d.repec.org/n?u=RePEc:oec:stdaaa:2011/4en&r=ecm 
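Of the filters compared above, the HP trend is the simplest to reproduce: it solves a penalized least-squares problem with an exact linear-algebra solution. The sketch below is a single-pass HP filter in numpy (the paper's double HP applies the filter twice with two smoothing parameters to isolate a frequency band); the monthly smoothing parameter and the toy series are assumptions.

```python
import numpy as np

def hp_filter(y, lam=129600.0):
    """Hodrick-Prescott trend: minimize ||y - tau||^2 + lam * ||D2 tau||^2,
    solved exactly via (I + lam * D'D) tau = y; returns (trend, cycle)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.zeros((n - 2, n))            # second-difference operator
    for i in range(n - 2):
        D[i, i], D[i, i + 1], D[i, i + 2] = 1.0, -2.0, 1.0
    trend = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
    return trend, y - trend

# Monthly toy series: linear trend plus a 5-year (60-month) cycle.
t = np.arange(240)
series = 0.05 * t + np.sin(2 * np.pi * t / 60)
trend, cycle = hp_filter(series, lam=129600.0)
```

A purely linear series is its own HP trend (its second differences vanish), and the extracted trend is always smoother than the input in the second-difference sense, which is the property the penalty enforces.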
By:  Joanna Janczura; Sebastian Orzel; Agnieszka Wylomanska 
Abstract:  The classical financial models are based on standard Brownian diffusion-type processes. However, some real market data (such as interest or exchange rates) exhibit characteristic periods of constant values. Moreover, in the case of financial data, the assumption of normality is often unsatisfied. In such cases the popular Vasicek model, a mathematical system describing the evolution of interest rates based on the Ornstein-Uhlenbeck process, seems not to be applicable. Therefore we propose an alternative approach based on a combination of the popular Ornstein-Uhlenbeck process with a stable distribution and subdiffusion systems that demonstrate such characteristic behavior. The probability density function of the proposed process can be described by a Fokker-Planck type equation and therefore it can be examined as an extension of the basic Ornstein-Uhlenbeck model. In this paper we propose a parameter estimation method and calibrate the subordinated Vasicek model to interest rate data. 
Keywords:  Vasicek model; Ornstein-Uhlenbeck process; Alpha-stable distribution; Subdiffusion; Estimation; Calibration; Interest rates 
JEL:  C16 C51 E43 E47 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:wuu:wpaper:hsc1103&r=ecm 
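As a baseline for the subordinated model above, the classical Gaussian Vasicek/Ornstein-Uhlenbeck process can be simulated with its exact discretization and calibrated by an AR(1) regression. This sketch covers only the ordinary Gaussian case, not the paper's alpha-stable subordinated extension; all parameter values are illustrative.

```python
import numpy as np

def simulate_vasicek(r0, kappa, theta, sigma, dt, n, seed=0):
    """Exact discretization of dr = kappa*(theta - r) dt + sigma dW."""
    rng = np.random.default_rng(seed)
    a = np.exp(-kappa * dt)
    sd = sigma * np.sqrt((1 - a ** 2) / (2 * kappa))
    r = np.empty(n)
    r[0] = r0
    for t in range(1, n):
        r[t] = theta + a * (r[t - 1] - theta) + sd * rng.standard_normal()
    return r

def calibrate_vasicek(r, dt):
    """AR(1) least-squares calibration: r_t = c + a * r_{t-1} + e."""
    x, y = r[:-1], r[1:]
    A = np.column_stack([np.ones(len(x)), x])
    (c, a), *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - c - a * x
    kappa = -np.log(a) / dt
    theta = c / (1 - a)
    sigma = resid.std(ddof=2) * np.sqrt(2 * kappa / (1 - a ** 2))
    return kappa, theta, sigma

# Daily data for roughly 20 years at illustrative parameter values.
r = simulate_vasicek(r0=0.03, kappa=2.0, theta=0.05, sigma=0.02,
                     dt=1 / 252, n=5000)
kappa_hat, theta_hat, sigma_hat = calibrate_vasicek(r, 1 / 252)
```

The long-run level and volatility are recovered precisely, while the mean-reversion speed is the hardest parameter to pin down in finite samples, a well-known feature of OU calibration.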
By:  Todd Clark; Michael W. McCracken 
Abstract:  This paper surveys recent developments in the evaluation of point forecasts. Taking West’s (2006) survey as a starting point, we briefly cover the state of the literature as of the time of West’s writing. We then focus on recent developments, including advancements in the evaluation of forecasts at the population level (based on true, unknown model coefficients), the evaluation of forecasts in the finite sample (based on estimated model coefficients), and the evaluation of conditional versus unconditional forecasts. We present original results in a few subject areas: the optimization of power in determining the split of a sample into in-sample and out-of-sample portions; whether the accuracy of inference in evaluation of multi-step forecasts can be improved with the judicious choice of HAC estimator (it can); and the extension of West’s (1996) theory results for population-level, unconditional forecast evaluation to the case of conditional forecast evaluation. 
Keywords:  Forecasting; Time-series analysis 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:fip:fedcwp:1120&r=ecm 
By:  Dean P. Foster; Robert Stine; H. Peyton Young 
Abstract:  Alpha is the amount by which the returns from a given asset exceed the returns from the wider market. The standard way of estimating alpha is to correct for correlation with the market by regressing the asset’s returns against the market returns over an extended period of time and then applying the t-test to the intercept. The difficulty is that the residuals often fail to satisfy independence and normality; in fact, portfolio managers may have an incentive to employ strategies whose residuals depart by design from independence and normality. To address these problems we propose a robust test for alpha based on the Markov inequality. Since it is based on the compound value of the estimated excess returns, we call it the compound alpha test (CAT). Unlike the t-test, our test places no restrictions on returns while retaining substantial statistical power. The method is illustrated on three assets: a stock, a hedge fund, and a fabricated fund that is deliberately designed to fool standard tests of significance. 
Keywords:  Alpha, Markov inequality, Hypothesis test 
JEL:  G32 D86 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:568&r=ecm 
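The Markov-inequality idea behind the CAT can be sketched in a few lines: compound one unit of wealth through the excess-return series; under the null of no alpha the compounded value V is (approximately) a nonnegative martingale with expectation at most one, so P(V >= c) <= 1/c and min(1, 1/V) serves as a conservative p-value. This sketch ignores the paper's market-correlation adjustment and is an illustrative simplification, not the exact CAT.

```python
import numpy as np

def compound_alpha_pvalue(excess_returns):
    """Markov-inequality p-value sketch: compound 1 unit of wealth through
    the excess-return series; min(1, 1/V) is a conservative p-value under
    the null that the compounded value is a supermartingale with E[V] <= 1."""
    v = np.prod(1.0 + np.asarray(excess_returns, dtype=float))
    return min(1.0, 1.0 / v), v

# Illustrative comparison: a fund with no alpha versus one with a
# 20-basis-point per-period alpha (both with 1% per-period volatility).
rng = np.random.default_rng(3)
null_returns = rng.normal(0.0, 0.01, size=1000)
skill_returns = rng.normal(0.002, 0.01, size=1000)
p_null, v_null = compound_alpha_pvalue(null_returns)
p_skill, v_skill = compound_alpha_pvalue(skill_returns)
```

The test rewards sustained compounding: a constant 5% excess return over 100 periods compounds to about 131.5, giving a p-value below 0.01 with no distributional assumptions at all.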
By:  Dean P. Foster; H. Peyton Young 
Abstract:  Traditional methods for analyzing portfolio returns often rely on multifactor risk assessment, and tests of significance are typically based on variants of the t-test. This approach has serious limitations when analyzing the returns from dynamically traded portfolios that include derivative positions, because standard tests of significance can be ‘gamed’ using options trading strategies. To deal with this problem we propose a test that assumes nothing about the structure of returns except that they form a martingale difference. Although the test is conservative and corrects for unrealized tail risk, the loss in power is small at high levels of significance. 
Keywords:  Excess returns, Martingale maximal inequality, Hypothesis test 
JEL:  G32 D86 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:567&r=ecm 
By:  Barbara Rossi 
Abstract:  The forecasting literature has identified two important, broad issues. The first stylized fact is that predictive content is unstable over time; the second is that in-sample predictive content does not necessarily translate into out-of-sample predictive ability, nor does it ensure the stability of the predictive relation over time. The objective of this chapter is to understand what we have learned about forecasting in the presence of instabilities, especially regarding the two questions above. The empirical evidence raises a multitude of questions. If in-sample tests provide poor guidance to out-of-sample forecasting ability, what should researchers do? If there are statistically significant instabilities in Granger-causality relationships, how do researchers establish whether there is any Granger-causality at all? If there is substantial instability in predictive relationships, how do researchers establish which model is the "best" forecasting model? And finally, if a model forecasts poorly, why is that, and how should researchers proceed to improve their forecasting models? In this chapter, we answer these questions by discussing various methodologies for inference as well as estimation that have recently been proposed in the literature. We also provide an empirical analysis of the usefulness of the existing methodologies using an extensive database of macroeconomic predictors of output growth and inflation. 
JEL:  C53 C22 C01 E2 E27 E37 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:duk:dukeec:1120&r=ecm 
By:  Arne Risa Hole (University of Sheffield); Andy Dickerson; Luke Munford 
Abstract:  It is well-known that the dummy variable estimator for the fixed-effects ordered logit model is inconsistent when T, the dimension of the panel, is fixed. This talk will review a range of alternative fixed-effects ordered logit estimators that are based on Chamberlain's fixed-effects estimator for the binary logit model. The talk will present Stata code for the estimators and discuss the available evidence on their finite-sample performance. We will conclude by presenting an empirical example in which the estimators are used to model the relationship between commuting and life satisfaction. 
Date:  2011–09–26 
URL:  http://d.repec.org/n?u=RePEc:boc:usug11:05&r=ecm 
By:  Martin Fukac; Vladimir Havlena 
Abstract:  This paper is written by authors from technical and economic fields, motivated to find a common language and views on the problem of the optimal use of information in model estimation. The center of our interest is the natural condition of control, a common assumption in Bayesian estimation in the technical sciences, which may be violated in economic applications. In estimating dynamic stochastic general equilibrium (DSGE) models, typically only a subset of endogenous variables is treated as measured, even if additional data sets are available. The natural condition of control dictates the exploitation of all available information, which improves model adaptability and estimation efficiency. We illustrate our points on a basic RBC model. 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:fip:fedkrw:rwp1103&r=ecm 
By:  Nolan Ritter; Colin Vance 
Abstract:  This note demonstrates that in applied regression analysis, the variance of a coefficient of interest may decrease from the inclusion of a control variable, contrasting with Clarke's assertion (2005, 2009) that the variance can only increase or stay the same. Practitioners may thus be well-advised to include a relevant control variable on this basis alone, particularly when it is weakly correlated with the variable of interest. 
Keywords:  Control variables; variance; model specification 
JEL:  C12 C15 
Date:  2011–09 
URL:  http://d.repec.org/n?u=RePEc:rwi:repape:0282&r=ecm 
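The note's point is quickly verified numerically: with homoskedastic errors, Var(beta1_hat) = sigma^2 [(X'X)^{-1}]_{11}, and omitting a control that strongly explains y while being nearly uncorrelated with the variable of interest inflates the error variance by far more than the control costs in collinearity. The simulated design below is an illustrative assumption.

```python
import numpy as np

def slope_variance(X, sigma2):
    """Exact OLS variance of the first slope coefficient (after the
    intercept) under homoskedastic errors with variance sigma2."""
    XtX_inv = np.linalg.inv(X.T @ X)
    return sigma2 * XtX_inv[1, 1]

rng = np.random.default_rng(4)
n = 500
x1 = rng.standard_normal(n)
x2 = 0.05 * x1 + rng.standard_normal(n)  # control, nearly uncorrelated with x1
beta2 = 2.0                              # control matters strongly for y
sigma2_eps = 1.0

# Long regression: the error variance is just sigma2_eps.
X_long = np.column_stack([np.ones(n), x1, x2])
var_long = slope_variance(X_long, sigma2_eps)

# Short regression: the omitted beta2*x2 is absorbed into the error term,
# inflating its variance by roughly beta2^2 * Var(x2 | x1).
resid_x2 = x2 - x1 * (x1 @ x2) / (x1 @ x1)
sigma2_short = sigma2_eps + beta2 ** 2 * resid_x2.var()
X_short = np.column_stack([np.ones(n), x1])
var_short = slope_variance(X_short, sigma2_short)
```

Here the inclusion of x2 shrinks the variance of the x1 coefficient roughly fivefold, exactly the situation the note describes: a strong, nearly orthogonal control is cheap collinearity-wise and expensive to omit.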
By:  Peter Fuleky (UHERO and Department of Economics, University of Hawaii); Carl S. Bonham (UHERO and Department of Economics, University of Hawaii) 
Abstract:  We extend the existing literature on small mixed frequency single factor models by allowing for multiple factors, considering indicators in levels, and allowing for cointegration among the indicators. We capture the cointegrating relationships among the indicators by common factors modeled as stochastic trends. We show that the stationary single-factor model frequently used in the literature is misspecified if the data set contains common stochastic trends. We find that taking advantage of common stochastic trends improves forecasting performance over a stationary single-factor model. The common-trends factor model outperforms the stationary single-factor model at all analyzed forecast horizons on a root mean squared error basis. Our results suggest that when the constituent indicators are integrated and cointegrated, modeling common stochastic trends, as opposed to eliminating them, will improve forecasts. 
Keywords:  Dynamic Factor Model, Mixed Frequency Samples, Common Trends, Forecasting, Tourism Industry 
JEL:  E37 C32 C53 L83 
Date:  2011–06–13 
URL:  http://d.repec.org/n?u=RePEc:hai:wpaper:201110&r=ecm 
By:  Audrino, Francesco; Hu, Yujia 
Abstract:  We provide new empirical evidence on volatility forecasting in relation to asymmetries present in the dynamics of both return and volatility processes. Leverage and volatility feedback effects among continuous and jump components of the S&P500 price and volatility dynamics are examined using recently developed methodologies to detect jumps and to disentangle their size from continuous return and continuous volatility. Granted that jumps in both return and volatility are important components for generating the two effects, we find that jumps in return can improve forecasts of volatility, while jumps in volatility improve volatility forecasts to a lesser extent. Moreover, disentangling jump and continuous variations into signed semivariances further improves the out-of-sample performance of volatility forecasting models, with negative jump semivariance being markedly more informative than positive jump semivariance. The model proposed is able to capture many empirical stylized facts while still remaining parsimonious in terms of the number of parameters to be estimated. 
Keywords:  High frequency data, Realized volatility forecasting, Downside risk, Leverage effect 
JEL:  C13 C22 C51 C53 
Date:  2011–09 
URL:  http://d.repec.org/n?u=RePEc:usg:econwp:2011:38&r=ecm 
By:  S. Boragan Aruoba; Francis X. Diebold; Jeremy Nalewaik; Frank Schorfheide; Dongho Song 
Abstract:  Two often-divergent U.S. GDP estimates are available: a widely used expenditure-side version, GDP_E, and a much less widely used income-side version, GDP_I. The authors propose and explore a "forecast combination" approach to combining them. They then put the theory to work, producing a superior combined estimate of GDP growth for the U.S., GDP_C. The authors compare GDP_C to GDP_E and GDP_I, with particular attention to behavior over the business cycle. They discuss several variations and extensions. 
Keywords:  Business cycles ; Recessions ; Expenditures, Public 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:fip:fedpwp:1141&r=ecm 