
on Econometrics 
By:  Jean-Marie Dufour; Abderrahim Taamouti 
Abstract:  In this paper, we derive simple point-optimal sign-based tests in the context of linear and nonlinear regression models with fixed regressors. These tests are exact, distribution-free, robust against heteroskedasticity of unknown form, and they may be inverted to obtain confidence regions for the vector of unknown parameters. Since the point-optimal sign tests depend on the alternative hypothesis, we propose an adaptive approach based on split-sample techniques in order to choose an alternative such that the power of point-optimal sign tests is close to the power envelope. The simulation results show that when using approximately 10% of the sample to estimate the alternative and the rest to calculate the test statistic, the power of the point-optimal sign test is typically close to the power envelope. We present a Monte Carlo study to assess the performance of the proposed "quasi"-point-optimal sign test by comparing its size and power to those of some common tests which are supposed to be robust against heteroskedasticity. The results show that our procedures are superior. 
Keywords:  Sign test, Point-optimal test, Nonlinear model, Heteroskedasticity, Exact inference, Distribution-free, Power envelope, Split-sample, Adaptive method, Projection 
JEL:  C1 C12 C14 C15 C51 
Date:  2008–11 
URL:  http://d.repec.org/n?u=RePEc:cte:werepe:we086027&r=ecm 
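The exactness claimed above rests on a simple fact: under the null that the disturbances have median zero given the regressors, the residual signs are i.i.d. Bernoulli(1/2), whatever the (possibly heteroskedastic) error distribution. Below is a minimal sketch of the classical sign test built on that fact; it is not the paper's point-optimal version, whose statistic depends on the chosen alternative, and the function name is illustrative.

```python
from math import comb

def sign_test_pvalue(residuals):
    """Exact two-sided sign test of H0: median = 0.

    Under H0 the number of positive observations is Binomial(n, 1/2)
    regardless of the error distribution, so the test is exact and
    distribution-free.  Zeros are dropped, as is conventional.
    """
    r = [x for x in residuals if x != 0]
    n = len(r)
    s = sum(1 for x in r if x > 0)
    pmf = [comb(n, k) * 0.5 ** n for k in range(n + 1)]
    lower = sum(pmf[: s + 1])   # P(S <= s) under H0
    upper = sum(pmf[s:])        # P(S >= s) under H0
    return min(1.0, 2 * min(lower, upper))
```

With ten positive residuals the exact two-sided p-value is 2/1024, while a perfectly balanced sample gives p = 1.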
By:  Hanck, Christoph 
Abstract:  This paper proposes a new testing approach for panel unit roots that is, unlike previously suggested tests, robust to nonstationarity in the volatility process of the innovations of the time series in the panel. Nonstationary volatility arises for instance when there are structural breaks in the innovation variances. A prominent example is the reduction in GDP growth variances enjoyed by many industrialized countries, known as the `Great Moderation.' The panel test is based on Simes' [Biometrika 1986, "An Improved Bonferroni Procedure for Multiple Tests of Significance"] classical multiple test, which combines evidence from time series unit root tests of the series in the panel. As time series unit root tests, we employ recently proposed tests of Cavaliere and Taylor [Journal of Time Series Analysis, "Time-Transformed Unit Root Tests for Models with Non-Stationary Volatility"]. The panel test is robust to general patterns of cross-sectional dependence and yet straightforward to implement, only requiring valid p-values of time series unit root tests, and no resampling. Monte Carlo experiments show that other panel unit root tests suffer from sometimes severe size distortions in the presence of nonstationary volatility, and that this defect can be remedied using the test proposed here. The new test is applied to test for a unit root in an OECD panel of gross domestic products, yielding inference robust to the `Great Moderation.' We find little evidence of trend stationarity. 
Keywords:  Nonstationary Volatility; Multiple Testing; Panel Unit Root Test; Cross-Sectional Dependence 
JEL:  C12 C23 
Date:  2008–11–30 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:11988&r=ecm 
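The Simes (1986) step described above is easy to state: sort the N time-series p-values and reject the joint unit-root null if p_(i) <= i*alpha/N for at least one i. A minimal sketch (the function name is illustrative):

```python
def simes_reject(pvalues, alpha=0.05):
    """Simes (1986) global test: given p-values from the N individual
    unit root tests, reject the joint null (all series have a unit root)
    if the i-th smallest p-value is at most i * alpha / N for some i."""
    p = sorted(pvalues)
    n = len(p)
    return any(p[i] <= (i + 1) * alpha / n for i in range(n))
```

Note that only valid p-values are needed, which is why the panel test requires no resampling and no estimate of the cross-sectional dependence structure.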
By:  Drew Creal (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam); André Lucas (VU University Amsterdam) 
Abstract:  We propose a new class of observation driven time series models referred to as Generalized Autoregressive Score (GAS) models. The driving mechanism of the GAS model is the scaled score of the likelihood function. This approach provides a unified and consistent framework for introducing time-varying parameters in a wide class of nonlinear models. The GAS model encompasses other well-known models such as the generalized autoregressive conditional heteroskedasticity, the autoregressive conditional duration, the autoregressive conditional intensity, and the single source of error models. In addition, the GAS specification provides a wide range of new observation driven models. Examples include nonlinear regression models with time-varying parameters, observation driven analogues of unobserved components time series models, multivariate point process models with time-varying parameters and pooling restrictions, new models for time-varying copula functions, and models for time-varying higher order moments. We study the properties of GAS models and provide several nontrivial examples of their application. 
Keywords:  dynamic models; time-varying parameters; nonlinearity; exponential family; marked point processes; copulas 
JEL:  C10 C22 C32 C51 
Date:  2008–11–06 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20080108&r=ecm 
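As a concrete illustration of the score-driven updating: with a Gaussian observation density, f_t equal to the conditional variance, and inverse-Fisher-information scaling, the scaled score reduces to s_t = y_t^2 - f_t, so the GAS(1,1) recursion f_{t+1} = omega + alpha*s_t + beta*f_t reproduces a GARCH(1,1)-type filter, one of the special cases named above. A sketch under those assumptions, with illustrative parameter values left to the caller:

```python
def gas_volatility_filter(y, omega, alpha, beta, f0):
    """Gaussian GAS(1,1) volatility filter.

    f_t is the conditional variance; with inverse-information scaling the
    scaled score of the Gaussian log-density is s_t = y_t^2 - f_t, and the
    time-varying parameter is updated as f_{t+1} = omega + alpha*s_t + beta*f_t.
    Returns the filtered variance path.
    """
    f, path = f0, []
    for yt in y:
        path.append(f)
        s = yt * yt - f                 # scaled score at time t
        f = omega + alpha * s + beta * f
    return path
```

For other observation densities (durations, intensities, copulas) the same recursion applies with the appropriate scaled score, which is the sense in which the framework is unified.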
By:  Hafner Christian M.; Manner Hans (METEOR) 
Abstract:  We propose a new dynamic copula model where the parameter characterizing dependence follows an autoregressive process. As this model class includes the Gaussian copula with stochastic correlation process, it can be viewed as a generalization of multivariate stochastic volatility models. Despite the complexity of the model, the decoupling of marginals and dependence parameters facilitates estimation. We propose estimation in two steps, where first the parameters of the marginal distributions are estimated, and then those of the copula. Parameters of the latent processes (volatilities and dependence) are estimated using efficient importance sampling (EIS). We discuss goodness-of-fit tests and ways to forecast the dependence parameter. For two bivariate stock index series, we show that the proposed model outperforms standard competing models. 
Keywords:  econometrics; 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:dgr:umamet:2008043&r=ecm 
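The two-step decoupling of marginals and dependence can be illustrated in miniature. Assuming Gaussian marginals (so the probability transform reduces to standardization) and a static Gaussian copula, step 1 fits each marginal and step 2 estimates the copula correlation from the standardized scores. This is an illustrative sketch of the decoupling only, not the paper's EIS-based estimator for the latent dependence process:

```python
import statistics

def two_step_gauss_copula(x, y):
    """Two-step estimation sketch: step 1 fits each marginal (here a
    Gaussian, so the probability transform reduces to standardization);
    step 2 estimates the Gaussian-copula correlation parameter from the
    standardized scores."""
    def standardize(v):                      # step 1: marginal fit
        m, s = statistics.fmean(v), statistics.pstdev(v)
        return [(u - m) / s for u in v]
    zx, zy = standardize(x), standardize(y)
    # step 2: dependence parameter given the fitted marginals
    return statistics.fmean([a * b for a, b in zip(zx, zy)])
```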
By:  Giuseppe Cavaliere; David I. Harvey; Stephen J. Leybourne; A.M. Robert Taylor (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  In this paper we analyse the impact of nonstationary volatility on the recently developed unit root tests which allow for a possible break in trend occurring at an unknown point in the sample, considered in Harris, Harvey, Leybourne and Taylor (2008) [HHLT]. HHLT's analysis hinges on a new break fraction estimator which, when a break in trend occurs, is consistent for the true break fraction at rate Op(T^(-1)). Unlike other available estimators, however, when there is no trend break HHLT's estimator converges to zero at rate Op(T^(-1/2)). In their analysis HHLT assume the shocks to follow a linear process driven by IID innovations. Our first contribution is to show that HHLT's break fraction estimator retains the same consistency properties as demonstrated by HHLT for the IID case when the innovations display nonstationary behaviour of a quite general form, including, for example, the case of a single break in the volatility of the innovations which may or may not occur at the same time as a break in trend. However, as we subsequently demonstrate, the limiting null distributions of unit root statistics based around this estimator are not pivotal in the presence of nonstationary volatility. Associated Monte Carlo evidence is presented to quantify the impact of various models of nonstationary volatility on both the asymptotic and finite sample behaviour of such tests. A solution to the identified inference problem is then provided by considering wild bootstrap-based implementations of the HHLT tests, using the trend break estimator from the original sample data. The proposed bootstrap method does not require the practitioner to specify a parametric model for volatility, and is shown to perform very well in practice across a range of models. 
Keywords:  Unit root tests, quasi difference detrending, trend break, nonstationary volatility, wild bootstrap 
JEL:  C22 
Date:  2008–12–02 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200862&r=ecm 
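The wild bootstrap step works by flipping the sign of each residual with an independent Rademacher draw, so every bootstrap sample inherits the original (possibly time-varying) variance pattern; this is what delivers robustness to nonstationary volatility without a parametric volatility model. A minimal sketch of the resampling step only (function name illustrative):

```python
import random

def wild_bootstrap_samples(residuals, n_boot, seed=0):
    """Wild bootstrap shocks: e_t* = w_t * e_t with i.i.d. Rademacher
    weights w_t in {-1, +1}.  Each draw preserves |e_t|, and hence any
    time variation in the residual variance."""
    rng = random.Random(seed)
    return [[e * (1 if rng.random() < 0.5 else -1) for e in residuals]
            for _ in range(n_boot)]
```

In a full implementation these shocks would be cumulated into bootstrap samples on which the HHLT statistics are recomputed to form the bootstrap null distribution.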
By:  Palm Franz C.; Smeekes Stephan; Urbain Jean-Pierre (METEOR) 
Abstract:  In this paper we consider the issue of unit root testing in cross-sectionally dependent panels. We consider panels that may be characterized by various forms of cross-sectional dependence, including (but not exclusive to) the popular common factor framework. We consider block bootstrap versions of the group-mean Im, Pesaran, and Shin (2003) and the pooled Levin, Lin, and Chu (2002) unit root coefficient DF tests for panel data, originally proposed for a setting of no cross-sectional dependence beyond a common time effect. The tests, suited for testing for unit roots in the observed data, can be easily implemented as no specification or estimation of the dependence structure is required. Asymptotic properties of the tests are derived for T going to infinity and N finite. Asymptotic validity of the bootstrap tests is established in very general settings, including the presence of common factors and even cointegration across units. Properties under the alternative hypothesis are also considered. In a Monte Carlo simulation, the bootstrap tests are found to have rejection frequencies that are much closer to nominal size than the rejection frequencies for the corresponding asymptotic tests. The power properties of the bootstrap tests appear to be similar to those of the asymptotic tests. 
Keywords:  Economics (Jel: A) 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:dgr:umamet:2008048&r=ecm 
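A moving-block bootstrap resample, the building block of the tests above, draws overlapping blocks of consecutive observations with replacement and concatenates them, preserving short-run dependence within blocks; drawing the same block indices for every unit of the panel would also preserve cross-sectional dependence, which is why no estimate of the dependence structure is needed. A single-series sketch (illustrative, not the authors' exact scheme):

```python
import random

def moving_block_bootstrap(series, block_len, seed=0):
    """One moving-block bootstrap resample: blocks of block_len
    consecutive observations are drawn with replacement from all
    overlapping positions and concatenated, then truncated to the
    original length."""
    rng = random.Random(seed)
    t = len(series)
    out = []
    while len(out) < t:
        start = rng.randrange(t - block_len + 1)
        out.extend(series[start:start + block_len])
    return out[:t]
```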
By:  Kaddour Hadri; Eiji Kurozumi 
Abstract:  This paper develops a simple test for the null hypothesis of stationarity in heterogeneous panel data with cross-sectional dependence in the form of a common factor in the disturbance. We do not estimate the common factor but mop up its effect by employing the same method as the one proposed in Pesaran (2007) in the unit root testing context. Our test is basically the same as the KPSS test, but the regression is augmented by the cross-sectional average of the observations. We also develop a Lagrange multiplier (LM) test allowing for cross-sectional dependence and, under restrictive assumptions, compare our augmented KPSS test with the extended LM test under the null of stationarity, under the local alternative and under the fixed alternative, and discuss the differences between these two tests. We also extend our test to the more realistic case where the shocks are serially correlated. We use Monte Carlo simulations to examine the finite sample properties of the augmented KPSS test. 
Keywords:  Panel data, stationarity, KPSS test, cross-sectional dependence, LM test, locally best test 
JEL:  C12 C33 
Date:  2008–10 
URL:  http://d.repec.org/n?u=RePEc:hst:ghsdps:gd08016&r=ecm 
By:  Kleppe, Tore Selland; Skaug, Hans J. 
Abstract:  Maximum likelihood has proved to be a valuable tool for fitting the log-normal stochastic volatility model to financial returns time series. Using a sequential change of variable framework, we are able to cast more general stochastic volatility models into a form appropriate for importance samplers based on the Laplace approximation. We apply the methodology to two example models, showing that efficient importance samplers can be constructed even for highly non-Gaussian latent processes such as square-root diffusions. 
Keywords:  Change of Variable; Heston Model; Laplace Importance Sampler; Simulated Maximum Likelihood; Stochastic Volatility 
JEL:  C13 C22 
Date:  2008–07–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:12022&r=ecm 
By:  Òscar Jordà; Massimiliano Marcellino 
Abstract:  A path forecast refers to the sequence of forecasts 1 to H periods into the future. A summary of the range of possible paths the predicted variable may follow for a given confidence level requires construction of simultaneous confidence regions that adjust for any covariance between the elements of the path forecast. This paper shows how to construct such regions with the joint predictive density and Scheffé’s (1953) S-method. In addition, the joint predictive density can be used to construct simple statistics to evaluate the local internal consistency of a forecasting exercise of a system of variables. Monte Carlo simulations demonstrate that these simultaneous confidence regions provide approximately correct coverage in situations where traditional error bands, based on the collection of marginal predictive densities for each horizon, are vastly off mark. The paper showcases these methods with an application to the most recent monetary episode of interest rate hikes in the U.S. macroeconomy. 
Keywords:  path forecast, simultaneous confidence region, error bands 
JEL:  C32 C52 C53 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2008/34&r=ecm 
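In the Scheffé construction, the simultaneous region for the H-step path is the ellipsoid {y : (y - yhat)' Cov^(-1) (y - yhat) <= c}, where Cov is the covariance of the path forecast errors and c is a chi-square critical value with H degrees of freedom. A sketch of the membership check under those assumptions (the critical value is supplied by the caller):

```python
import numpy as np

def in_scheffe_region(path, forecast, cov, chi2_crit):
    """Check whether a candidate path lies inside the Scheffé
    simultaneous confidence region: the quadratic form of the forecast
    error in the metric of the path-forecast covariance must not exceed
    the chi-square(H) critical value for the chosen coverage level."""
    d = np.asarray(path, dtype=float) - np.asarray(forecast, dtype=float)
    return float(d @ np.linalg.solve(cov, d)) <= chi2_crit
```

Because the quadratic form accounts for the covariance across horizons, the resulting region corrects the undercoverage of horizon-by-horizon marginal error bands.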
By:  Avarucci Marco; Velasco Carlos (METEOR) 
Abstract:  This paper develops new methods for determining the cointegration rank in a nonstationary fractionally integrated system, extending univariate optimal methods for testing the degree of integration. We propose a simple Wald test based on the singular value decomposition of the unrestricted estimate of the long run multiplier matrix. When the "strength" of the cointegrating relationship is less than 1/2, the test statistic has a standard asymptotic distribution, like Lagrange Multiplier tests exploiting local properties. We consider the behavior of our test under estimation of short run parameters and local alternatives. We compare our procedure with other cointegration tests based on different principles and find that the new method has better properties in a range of situations by using information on the alternative obtained through a preliminary estimate of the cointegration strength. 
Keywords:  Economics (Jel: A) 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:dgr:umamet:2008049&r=ecm 
By:  Gao, Jiti; Gijbels, Irene 
Abstract:  We propose a sound approach to bandwidth selection in nonparametric kernel testing. The main idea is to find an Edgeworth expansion of the asymptotic distribution of the test concerned. Due to the involvement of a kernel bandwidth in the leading term of the Edgeworth expansion, we are able to establish closed-form expressions to explicitly represent the leading terms of both the size and power functions and then determine how the bandwidth should be chosen according to certain requirements for both the size and power functions. For example, when a significance level is given, we can choose the bandwidth such that the power function is maximized while the size function is controlled by the significance level. Both asymptotic theory and methodology are established. In addition, we develop an easy implementation procedure for the practical realization of the established methodology and illustrate this on two simulated examples and a real data example. 
Keywords:  Choice of bandwidth parameter; Edgeworth expansion; nonparametric kernel testing; power function; size function 
JEL:  C14 
Date:  2005–12 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:11982&r=ecm 
By:  Gianluca, MORETTI; Giulio, NICOLETTI 
Abstract:  Recent literature claims that key variables such as aggregate productivity and inflation display long memory dynamics. We study the implications of this high degree of persistence for the estimation of Dynamic Stochastic General Equilibrium (DSGE) models. We show that long memory data produce substantial bias in the deep parameter estimates when a standard Kalman filter-MLE procedure is used. We propose a modification of the Kalman filter procedure, mainly augmenting the state space, which deals with this problem. By means of the augmented state space we can consistently estimate the model parameters as well as produce more accurate out-of-sample forecasts compared to the standard Kalman filter. 
Date:  2008–12–04 
URL:  http://d.repec.org/n?u=RePEc:ctl:louvec:2008037&r=ecm 
By:  Lambert, Dayton M.; Florax, Raymond J.G.M.; Cho, Seong-Hoon 
Abstract:  This research note documents estimation procedures and results for an empirical investigation of the performance of the recently developed spatial heteroskedasticity and autocorrelation consistent (HAC) covariance estimator calibrated with different kernel bandwidths. The empirical example is concerned with a hedonic price model for residential property values. The first bandwidth approach varies an a priori determined plug-in bandwidth criterion. The second method is a data-driven cross-validation approach to determine the optimal neighborhood. The third approach uses a robust semivariogram to determine the range over which residuals are spatially correlated. Inference becomes more conservative as the plug-in bandwidth is increased. The data-driven approaches prove valuable because they are capable of identifying the optimal spatial range, which can subsequently be used to inform the choice of an appropriate bandwidth value. In our empirical example, pertaining to a standard spatial model and a standard dataset, the results of the data-driven procedures can only be reconciled with relatively high plug-in values (n^0.65 or n^0.75). The results for the semivariogram and the cross-validation approaches are very similar, which, given its computational simplicity, gives the semivariogram approach an edge over the more flexible cross-validation approach. 
Keywords:  spatial HAC, semivariogram, bandwidth, hedonic model, Community/Rural/Urban Development, Demand and Price Analysis, Land Economics/Use, Research Methods/ Statistical Methods, C13, C31, R21, 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:ags:puaewp:44258&r=ecm 
By:  Jeroen Hinloopen (University of Amsterdam); Rien Wagenvoort (European Investment Bank, Luxembourg); Charles van Marrewijk (Erasmus University Rotterdam) 
Abstract:  We propose a quantification of the p-p plot that assigns equal weight to all distances between the respective distributions: the surface between the p-p plot and the diagonal. This surface is labelled the Harmonic Weighted Mass (HWM) index. We introduce the diagonal-deviation (d-d) plot that allows the index to be computed exactly under all circumstances. For two balanced samples absent ties, the finite sample distribution of the HWM index is derived. Simulations show that in most cases unbalanced samples and ties have little effect on this distribution. The d-d plot allows for a straightforward extension to the K-sample HWM index. As we have not been able to derive the distribution of the index for K>2, we simulate significance tables for K=3,...,15. An example involving economic growth rates of the G7 countries illustrates that the HWM test can have better power than alternative Empirical Distribution Function tests. 
Keywords:  EDF test; p-p plot; power; d-d plot 
JEL:  C12 C14 
Date:  2008–10–20 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20080100&r=ecm 
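As a rough numerical proxy for the surface the abstract defines, one can trace the empirical p-p plot, the curve of points (F1(x), F2(x)) over the pooled observations, and integrate the absolute deviation from the diagonal; the paper's exact computation goes through the d-d plot instead. An illustrative sketch:

```python
def pp_plot_points(sample1, sample2):
    """Empirical p-p plot: at every pooled observation x, the point
    (F1(x), F2(x)) of the two empirical CDFs."""
    pooled = sorted(set(sample1) | set(sample2))
    n1, n2 = len(sample1), len(sample2)
    return [(sum(v <= x for v in sample1) / n1,
             sum(v <= x for v in sample2) / n2) for x in pooled]

def pp_surface(sample1, sample2):
    """Trapezoid approximation of the area between the p-p plot and the
    45-degree diagonal -- a simplified proxy for the HWM index."""
    pts = [(0.0, 0.0)] + pp_plot_points(sample1, sample2)
    area = 0.0
    for (u0, v0), (u1, v1) in zip(pts, pts[1:]):
        # average deviation from the diagonal over the step from u0 to u1
        area += 0.5 * (abs(v0 - u0) + abs(v1 - u1)) * (u1 - u0)
    return area
```

Identical samples put the plot on the diagonal (area 0), while completely separated samples push the area toward its maximum of 1/2.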
By:  Christophe Ley; Davy Paindaveine 
Abstract:  When testing symmetry of a univariate density, (parametric classes of) densities skewed by means of the general probability transform introduced in [7] are appealing alternatives. This paper first proposes parametric tests of symmetry that are locally and asymptotically optimal (in the Le Cam sense) against such alternatives. To improve on these parametric tests, which are valid under well-specified density types only, we turn them into semiparametric tests, either by using a standard studentization approach or by resorting to the invariance principle. The second approach leads to robust yet efficient signed-rank tests, which include the celebrated sign and Wilcoxon tests as special cases, and turn out to be Le Cam optimal irrespective of the underlying original symmetric density. Optimality, however, is only achieved under well-specified “skewing mechanisms”, and we therefore evaluate the overall performances of our tests by deriving their asymptotic relative efficiencies with respect to the classical test of skewness. A Monte Carlo study confirms the asymptotic results. 
Keywords:  Rank-based inference; tests of symmetry; asymmetry models; location tests; local asymptotic normality 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:eca:wpaper:2008_041&r=ecm 
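For reference, the Wilcoxon statistic mentioned above as a special case sums the ranks of the absolute values over the positive observations; under symmetry about zero its mean is n(n+1)/4. A minimal sketch of the statistic only, not of the paper's optimal signed-rank tests (ties in absolute value are broken arbitrarily here):

```python
def wilcoxon_signed_rank(x):
    """Wilcoxon signed-rank statistic for symmetry about zero: rank the
    observations by absolute value, then sum the ranks belonging to the
    positive observations."""
    ranked = sorted(range(len(x)), key=lambda i: abs(x[i]))
    return sum(r + 1 for r, i in enumerate(ranked) if x[i] > 0)
```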
By:  Di Iorio, Francesca; Fachin, Stefano 
Abstract:  We address the issue of estimation and inference in dependent nonstationary panels of small cross-section dimensions. The main conclusion is that the best results are obtained applying bootstrap inference to single-equation estimators. SUR estimators perform badly, or are even unfeasible, when the time dimension is not very large compared to the cross-section dimension. 
Keywords:  Panel cointegration; FMOLS; FMSUR. 
JEL:  C13 C15 C33 
Date:  2008–09–01 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:12053&r=ecm 
By:  Rodney W. Strachan (The University of Queensland, Australia); Herman K. van Dijk (Erasmus University Rotterdam, the Netherlands) 
Abstract:  A Bayesian model averaging procedure is presented that makes use of a finite mixture of many model structures within the class of vector autoregressive (VAR) processes. It is applied to two empirical issues. First, stability of the Great Ratios in U.S. macroeconomic time series is investigated, together with the effect of permanent shocks on business cycles. Second, the linear VAR model is extended to include a smooth transition function in a (monetary) equation and stochastic volatility in the disturbances. The risk of a liquidity trap in the U.S.A. and Japan is evaluated. Although this risk is found to be reasonably high, we find only mild evidence that the monetary policy transmission mechanism is different and that central banks consider the expected cost of a liquidity trap in policy setting. Posterior probabilities of different models are evaluated using Markov chain Monte Carlo techniques. 
Keywords:  Posterior probability; Grassmann manifold; Orthogonal group; Cointegration; Model averaging; Stochastic trend; Impulse response; Vector autoregressive model; Great Ratios; Liquidity trap 
JEL:  C11 C32 C52 
Date:  2008–10–10 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20080096&r=ecm 
By:  Siem Jan Koopman (VU University Amsterdam); Soon Yip Wong (VU University Amsterdam) 
Abstract:  We consider the problem of smoothing data on two-dimensional grids with holes or gaps. Such grids are often referred to as difficult regions. Since the data is not observed on these locations, the gap is not part of the domain. We cannot apply standard smoothing methods since they smooth over and across difficult regions. More unfavorable properties of standard smoothers become visible when the data is observed on an irregular grid in a non-rectangular domain. In this paper, we adopt smoothing spline methods within a state space framework to smooth data on one- or two-dimensional grids with difficult regions. We make a distinction between two types of missing observations to handle the irregularity of the grid and to ensure that no smoothing takes place over and across the difficult region. For smoothing on two-dimensional grids, we introduce a two-step spline smoothing method. The proposed solution applies to all smoothing methods that can be represented in a state space framework. We illustrate our methods for three different cases of interest. 
Keywords:  Bivariate smoothing; Geostatistics; Missing observations; Smoothing spline model; State space methods 
JEL:  C13 C22 C32 
Date:  2008–11–18 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20080114&r=ecm 
By:  Alberto Holly; Alain Monfort; Michael Rockinger 
Abstract:  The objective of this paper is to extend the results on Pseudo Maximum Likelihood (PML) theory derived in Gourieroux, Monfort, and Trognon (GMT) (1984) to a situation where the first four conditional moments are specified. Such an extension is relevant in light of pervasive evidence that conditional distributions are non-Gaussian in many economic situations. The key statistical tool here is the quartic exponential family, which allows us to generalize the PML2 and QGPML1 methods proposed in GMT (1984) to PML4 and QGPML2 methods, respectively. An asymptotic theory is developed which shows, in particular, that the QGPML2 method reaches the semiparametric bound. The key numerical tool that we use is the Gauss-Freud integration scheme, which solves a computational problem that has previously been raised in several econometric fields. Simulation exercises show the feasibility and robustness of the methods. 
Keywords:  Quartic Exponential Family, Pseudo Maximum Likelihood, Skewness, Kurtosis 
JEL:  C01 C13 C16 C22 
Date:  2008–08 
URL:  http://d.repec.org/n?u=RePEc:hem:wpaper:0802&r=ecm 
By:  Laurent Ferrara (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, Banque de France - Business Conditions and Macroeconomic Forecasting Directorate); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics); Patrick Rakotomarolahy (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I) 
Abstract:  This paper formalizes the process of forecasting unbalanced monthly data sets in order to obtain robust nowcasts and forecasts of the quarterly GDP growth rate through a semiparametric modelling. This innovative approach relies on the use of nonparametric methods, based on nearest neighbors and on radial basis function approaches, to forecast the monthly variables involved in the parametric modelling of GDP using bridge equations. A real-time experience is carried out on Euro area vintage data in order to anticipate, with an advance ranging from six to one months, the GDP flash estimate for the whole zone. 
Keywords:  Euro area GDP, real-time nowcasting, forecasting, nonparametric models. 
Date:  2008–11 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:halshs00344839_v1&r=ecm 
By:  Ihle, Rico; Cramon-Taubadel, Stephan von 
Abstract:  We compare two regime-dependent econometric models for price transmission analysis, namely the threshold vector error correction model and the Markov-switching vector error correction model. We first provide a detailed characterization of each of the models, which is followed by a comprehensive comparison. We find that the assumptions regarding the nature of their regime-switching mechanisms are fundamentally different, so that each model is suitable for a certain type of nonlinear price transmission. Furthermore, we conduct a Monte Carlo experiment in order to study the performance of the estimation techniques of both models for simulated data. We find that both models are adequate for studying price transmission since their characteristics match the underlying economic theory and hence allow for an easy interpretation. Nevertheless, the results of the corresponding estimation techniques do not reproduce the true parameters and are not robust against nuisance parameters. The comparison is supplemented by a review of empirical studies in price transmission analysis, in which mostly the threshold vector error correction model is applied. 
Keywords:  price transmission, market integration, threshold vector error correction model, Markov-switching vector error correction model, comparison, nonlinear time series analysis, Agricultural Finance 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:ags:nccest:37603&r=ecm 
By:  Dae-Jin Lee; Maria Durban 
Abstract:  Penalized splines (P-splines) and individual random effects are used for the analysis of spatial count data. P-splines are represented as mixed models to give a unified approach to the model estimation procedure. First, a model where the spatial variation is modelled by a two-dimensional P-spline at the centroids of the areas or regions is considered. In addition, individual area effects are incorporated as random effects to account for individual variation among regions. Finally, the model is extended by considering a conditional autoregressive (CAR) structure for the random effects; these are the so-called “Smooth-CAR” models, with the aim of separating the large-scale geographical trend and local spatial correlation. The methodology proposed is applied to the analysis of lip cancer incidence rates in Scotland. 
Keywords:  Mixed models, P-splines, Overdispersion, Negative Binomial, PQL, CAR models, Scottish lip cancer data 
Date:  2008–11 
URL:  http://d.repec.org/n?u=RePEc:cte:wsrepe:ws085820&r=ecm 
By:  Jean Jacod; Mark Podolskij; Mathias Vetter (School of Economics and Management, University of Aarhus, Denmark) 
Abstract:  This paper presents some limit theorems for certain functionals of moving averages of semimartingales plus noise, which are observed at high frequency. Our method generalizes the pre-averaging approach (see [13], [11]) and provides consistent estimates for various characteristics of general semimartingales. Furthermore, we prove the associated multidimensional (stable) central limit theorems. As expected, we find central limit theorems with a convergence rate n^(-1/4), where n is the number of observations. 
Keywords:  central limit theorem, high frequency observations, microstructure noise, quadratic variation, semimartingale, stable convergence. 
JEL:  C10 C13 C14 
Date:  2008–12–01 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200861&r=ecm 
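The pre-averaging idea is to replace each noisy increment by a weighted average of k consecutive increments, using a weight function such as g(s) = min(s, 1 - s); i.i.d. microstructure noise is averaged away at the cost of the slower n^(-1/4) rate. A sketch of the averaging step only, under that choice of g (the bias correction and the choice of k are omitted):

```python
def preaverage(y, k):
    """Pre-averaged increments of a noisy price series y.

    Each output is a weighted sum of k - 1 consecutive increments with
    weights g(j/k), g(s) = min(s, 1 - s), which damps i.i.d. observation
    noise while retaining the signal of the underlying semimartingale."""
    g = [min((j + 1) / k, 1 - (j + 1) / k) for j in range(k - 1)]
    inc = [b - a for a, b in zip(y, y[1:])]
    return [sum(w * inc[i + j] for j, w in enumerate(g))
            for i in range(len(inc) - k + 2)]
```

Estimators of quadratic variation and other characteristics are then built from these pre-averaged statistics rather than the raw increments.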
By:  Lennart Hoogerheide (Erasmus University Rotterdam); Herman K. van Dijk (Erasmus University Rotterdam) 
Abstract:  An efficient and accurate approach is proposed for forecasting Value at Risk [VaR] and Expected Shortfall [ES] measures in a Bayesian framework. This consists of a new adaptive importance sampling method for Quantile Estimation via Rapid Mixture of t approximations [QERMit]. As a first step the optimal importance density is approximated, after which multi-step `high loss' scenarios are efficiently generated. Numerical standard errors are compared in simple illustrations and in an empirical GARCH model with Student-t errors for daily S&P 500 returns. The results indicate that the proposed QERMit approach outperforms several alternative approaches in the sense of more accurate VaR and ES estimates given the same amount of computing time, or equivalently requiring less computing time for the same numerical accuracy. 
Keywords:  Value at Risk; Expected Shortfall; numerical accuracy; numerical standard error; importance sampling; mixture of Student-t distributions; variance reduction technique 
JEL:  C11 C15 C53 D81 
Date:  2008–10–02 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20080092&r=ecm 
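The core importance-sampling idea, generating `high loss' scenarios directly by centering the proposal in the tail, can be shown in one dimension. The sketch below assumes a standard normal target and a normal proposal shifted into the loss region, a deliberate simplification of QERMit's adaptive mixture-of-t proposal:

```python
import math
import random

def is_tail_prob(threshold, shift, n, seed=0):
    """Importance-sampling estimate of P(X < threshold) for X ~ N(0,1),
    drawing from the shifted proposal N(shift, 1).  The weight is the
    density ratio phi(x) / phi(x - shift); centering the proposal in the
    tail puts most draws where the losses are, reducing variance."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if x < threshold:
            # phi(x)/phi(x - shift) = exp(-x^2/2 + (x - shift)^2/2)
            total += math.exp(-x * x / 2 + (x - shift) ** 2 / 2)
    return total / n
```

With the proposal centered near the 5% quantile, a few thousand draws already pin down the tail probability far more precisely than naive simulation, which wastes most draws outside the loss region.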
By:  Ran, Tao; Zapata, Hector 
Abstract:  Using Japanese economic data and a Monte Carlo simulation, this study analyzes the consequences of ignoring deterministic trends in mixed unit-root data for Granger noncausality tests. Results from an augmented VAR suggest overrejection in certain empirically relevant cases at various sample sizes. 
Keywords:  Research Methods/ Statistical Methods, 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:ags:saeaed:6745&r=ecm 
By:  Cecilia Frale; Massimiliano Marcellino; Gian Luigi Mazzi; Tommaso Proietti 
Abstract:  A continuous monitoring of the evolution of the economy is fundamental for the decisions of public and private decision makers. This paper proposes a new monthly indicator of the euro area real Gross Domestic Product (GDP), with several original features. First, it considers both the output side (six branches of the NACE classification) and the expenditure side (the main GDP components) and combines the two estimates with optimal weights reflecting their relative precision. Second, the indicator is based on information at both the monthly and quarterly level, modelled with a dynamic factor specification cast in state-space form. Third, since estimation of the multivariate dynamic factor model can be numerically complex, computational efficiency is achieved by implementing univariate filtering and smoothing procedures. Finally, special attention is paid to chain-linking and its implications, via a multi-step procedure that exploits the additivity of the volume measures expressed at the prices of the previous year. 
Keywords:  Temporal Disaggregation, Multivariate State Space Models, Dynamic Factor Models, Kalman filter and smoother, Chain-linking 
JEL:  E32 E37 C53 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2008/32&r=ecm 
By:  Xu, Zhiwei 
Abstract:  In this note, we revisit the univariate unobserved-component (UC) model of US GDP by relaxing the traditional random-walk assumption of the permanent component. Since our general UC model is unidentified, we investigate the upper bound of the contribution of the transitory component, and find it is dominated by the permanent component. 
Keywords:  Unobserved-Component Model; Random-Walk Assumption; Permanent and Transitory Shocks 
JEL:  E32 C22 C49 
Date:  2008–11–11 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:12038&r=ecm 