
on Econometrics 
By:  Ardia, David; Hoogerheide, Lennart F. 
Abstract:  This chapter proposes an up-to-date review of estimation strategies available for the Bayesian inference of GARCH-type models. The emphasis is put on a novel efficient procedure named AdMit-IS. The methodology automatically constructs a mixture of Student-t distributions as an approximation to the posterior density of the model parameters. This density is then used in importance sampling for model estimation, model selection and model combination. The procedure is fully automatic, which avoids the difficult and time-consuming tuning of MCMC strategies. The AdMit-IS methodology is illustrated with an empirical application to S&P index log-returns, where non-nested GARCH-type models are estimated and combined to predict the distribution of next-day log-returns. 
Keywords:  GARCH; Bayesian inference; MCMC; marginal likelihood; Bayesian model averaging; adaptive mixture of Student-t distributions; importance sampling. 
JEL:  C51 C22 C15 C11 
Date:  2010–02–08 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:22919&r=ecm 
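The importance sampling step described in the abstract above can be sketched in a few lines. The following stdlib-only Python snippet illustrates the general idea, not the AdMit-IS procedure itself: it uses a single Student-t proposal (AdMit-IS builds an adaptive mixture) and a standard normal stand-in for the GARCH posterior; all function names are ours.

```python
import math
import random

def normal_logpdf(x):
    # Log density of the standard normal "posterior" stand-in
    return -0.5 * x * x - 0.5 * math.log(2 * math.pi)

def student_t_logpdf(x, df):
    # Log density of the Student-t proposal with df degrees of freedom
    return (math.lgamma((df + 1) / 2) - math.lgamma(df / 2)
            - 0.5 * math.log(df * math.pi)
            - (df + 1) / 2 * math.log(1 + x * x / df))

def draw_student_t(df, rng):
    # t variate = N(0,1) / sqrt(ChiSq(df)/df); ChiSq built from squared normals
    z = rng.gauss(0.0, 1.0)
    chi2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

def importance_sampling_mean(n, df=5, seed=42):
    # Self-normalised importance sampling estimate of the target mean;
    # the heavier-tailed proposal keeps the weights bounded.
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x = draw_student_t(df, rng)
        w = math.exp(normal_logpdf(x) - student_t_logpdf(x, df))
        num += w * x
        den += w
    return num / den

est = importance_sampling_mean(20000)  # target mean is 0
```

In the actual methodology the proposal is an adaptively fitted mixture of such t densities and the same weights feed model selection and combination via the marginal likelihood.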
By:  Yamaguchi, Keiko 
Abstract:  We propose an estimator of the change point in the long memory parameter d of an ARFIMA(p, d, q) process using the sup-Wald test. We derive the consistency and the rate of convergence of the estimator. The convergence rate of our change point estimator depends on the magnitude of the shift. Furthermore, we obtain the limiting distribution of our change point estimator without depending on the distribution of the process. Therefore, we can construct a confidence interval for the change point. Simulations show the validity of the asymptotic theory of our estimator when the sample size is large enough. We apply our change point estimator to the yearly Nile river minimum time series. 
Keywords:  Break in persistence, long memory, change point 
JEL:  C22 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:hit:econdp:201007&r=ecm 
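The paper estimates a break in the long memory parameter d; as a simpler self-contained illustration of the sup-Wald idea, the sketch below locates a break in the mean of a series by maximising a Wald-type statistic over candidate break dates. The mean-shift setting and all names are our simplification, not the authors' ARFIMA procedure.

```python
def sup_wald_changepoint(x, trim=0.15):
    # Argmax over candidate break dates k of a Wald-type statistic for a
    # shift in mean: W(k) proportional to k*(n-k)/n * (mean_before - mean_after)^2.
    # Candidates are trimmed away from the sample edges, as is standard.
    n = len(x)
    lo, hi = int(n * trim), int(n * (1 - trim))
    left = sum(x[:lo])                  # running sum of the first k observations
    total = sum(x)
    best_k, best_w = lo, -1.0
    for k in range(lo, hi):
        m1 = left / k
        m2 = (total - left) / (n - k)
        w = k * (n - k) / n * (m1 - m2) ** 2
        if w > best_w:
            best_w, best_k = w, k
        left += x[k]
    return best_k

# Deterministic step series: level 0 for 100 points, then level 1.
series = [0.0] * 100 + [1.0] * 100
k_hat = sup_wald_changepoint(series)   # the statistic peaks at the true break
```

For this noiseless step the statistic is maximised exactly at the true break date; with noise, the abstract's point is that the estimator's convergence rate depends on the magnitude of the shift.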
By:  Yingyao Hu, Matthew Shum and Wei Tan 
Abstract:  We present a method for estimating Markov dynamic models with unobserved state variables which can be serially correlated over time. We focus on the case where all the model variables have discrete support. Our estimator is simple to compute because it is non-iterative and involves only elementary matrix manipulations. Our estimation method is nonparametric, in that no parametric assumptions on the distributions of the unobserved state variables or the laws of motion of the state variables are required. Monte Carlo simulations show that the estimator performs well in practice, and we illustrate its use with a dataset of doctors' prescriptions of pharmaceutical drugs. 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:jhu:papers:558&r=ecm 
By:  María Rosa Nieto; Esther Ruiz 
Abstract:  In this paper, we propose a new bootstrap procedure to obtain prediction intervals of future Value at Risk (VaR) and Expected Shortfall (ES) in the context of univariate GARCH models. These intervals incorporate the parameter uncertainty associated with the estimation of the conditional variance of returns. Furthermore, they do not depend on any particular assumption on the error distribution. Alternative bootstrap intervals previously proposed in the literature incorporate the first but not the second source of uncertainty when computing the VaR and ES. We also consider an iterated smoothed bootstrap with better properties than traditional ones when computing prediction intervals for quantiles. However, this latter procedure depends on parameters that have to be arbitrarily chosen and is computationally very demanding. We analyze the finite sample performance of the proposed procedure and show that its coverage is closer to the nominal level than that of the alternatives. All the results are illustrated by obtaining one-step-ahead prediction intervals of the VaR and ES of several real time series of financial returns. 
Keywords:  Expected Shortfall, Feasible Historical Simulation, Hill estimator, Parameter uncertainty, Quantile intervals, Value at Risk 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:cte:wsrepe:ws102814&r=ecm 
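For intuition, a plain i.i.d. percentile bootstrap for a VaR quantile can be written as follows. This is only a baseline sketch: the authors' procedure additionally resamples through the fitted GARCH recursion so that parameter uncertainty in the conditional variance enters the intervals; the function names are ours.

```python
import random

def empirical_var(returns, alpha=0.05):
    # Value at Risk read off as the alpha-quantile of the return distribution
    s = sorted(returns)
    return s[max(0, int(alpha * len(s)) - 1)]

def bootstrap_var_interval(returns, alpha=0.05, n_boot=500, level=0.90, seed=1):
    # Percentile bootstrap interval for VaR under i.i.d. resampling of returns
    rng = random.Random(seed)
    n = len(returns)
    stats = sorted(
        empirical_var([rng.choice(returns) for _ in range(n)], alpha)
        for _ in range(n_boot)
    )
    lo = stats[int((1.0 - level) / 2.0 * n_boot)]
    hi = stats[int((1.0 + level) / 2.0 * n_boot) - 1]
    return lo, hi

# Illustrative i.i.d. Gaussian returns; the true 5% quantile is about -0.0164.
rng = random.Random(7)
rets = [rng.gauss(0.0, 0.01) for _ in range(1000)]
lo, hi = bootstrap_var_interval(rets)
```

The interval brackets the true quantile here only because the data are i.i.d.; with GARCH dynamics the resampling scheme must reproduce the conditional variance process, which is the paper's contribution.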
By:  Parker, Thomas 
Abstract:  Goodness of fit tests based on sup-norm statistics of empirical processes have nonstandard limiting distributions when the null hypothesis is composite — that is, when parameters of the null model are estimated. Several solutions to this problem have been suggested, including the calculation of adjusted critical values for these nonstandard distributions and the transformation of the empirical process such that statistics based on the transformed process are asymptotically distribution-free. The approximation methods proposed by Durbin (1985) can be applied to compute appropriate critical values for tests based on sup-norm statistics. The resulting tests have quite accurate size, a fact which has gone unrecognized in the econometrics literature. Some justification for this accuracy lies in the similar features that Durbin’s approximation methods share with the theory of extrema for Gaussian random fields and for Gauss-Markov processes. These adjustment techniques are also related to the transformation methodology proposed by Khmaladze (1981) through the score function of the parametric model. Monte Carlo experiments suggest that these two testing strategies are roughly comparable to one another and more powerful than a simple bootstrap procedure. 
Keywords:  Goodness of fit test; Estimated parameters; Gaussian process; Gauss-Markov process; Boundary crossing probability; Martingale transformation 
JEL:  C14 C12 C46 
Date:  2010–05–26 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:22926&r=ecm 
By:  Janczura, Joanna; Weron, Rafal 
Abstract:  In this paper we propose a novel goodness-of-fit testing scheme for regime-switching models. We consider models with an observable as well as a latent state process. The test is based on the Kolmogorov-Smirnov supremum-distance statistic and the concept of the weighted empirical distribution function. We apply the proposed scheme to test whether a two-state Markov regime-switching model fits electricity spot price data. 
Keywords:  Regime-switching; Goodness-of-fit; Weighted empirical distribution function; Kolmogorov-Smirnov test 
JEL:  C52 C12 Q40 
Date:  2010–05–24 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:22871&r=ecm 
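The supremum-distance statistic over a weighted empirical distribution function is straightforward to compute. The sketch below is a generic illustration: with unit weights it reduces to the classical Kolmogorov-Smirnov statistic, while in the regime-switching application the weights would be, e.g., smoothed regime probabilities. The normal benchmark and all names are our choices, not the authors' implementation.

```python
import math
import random

def std_normal_cdf(x):
    # Benchmark CDF used here purely as an example null distribution
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def weighted_ks_statistic(data, weights, cdf):
    # Supremum distance between the weighted empirical distribution function
    # and a candidate CDF; weights (e.g. regime probabilities) are normalised.
    total = sum(weights)
    cum = 0.0
    d = 0.0
    for x, w in sorted(zip(data, weights)):
        d = max(d, abs(cum / total - cdf(x)))   # just before the jump at x
        cum += w
        d = max(d, abs(cum / total - cdf(x)))   # just after the jump at x
    return d

# With unit weights this reduces to the classical Kolmogorov-Smirnov statistic.
rng = random.Random(3)
sample = [rng.gauss(0.0, 1.0) for _ in range(500)]
d = weighted_ks_statistic(sample, [1.0] * 500, std_normal_cdf)
```

Checking the distance both before and after each jump of the step function is what makes the maximum a true supremum over the real line.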
By:  Russell Cooper; John Haltiwanger; Jonathan L. Willis 
Abstract:  This paper studies capital adjustment at the establishment level. Our goal is to characterize capital adjustment costs, which are important for understanding both the dynamics of aggregate investment and the impact of various policies on capital accumulation. Our estimation strategy searches for parameters that minimize ex post errors in an Euler equation. This strategy is quite common in models for which adjustment occurs in each period. Here, we extend that logic to the estimation of parameters of dynamic optimization problems in which non-convexities lead to extended periods of investment inactivity. In doing so, we create a method to take into account censored observations stemming from intermittent investment. This methodology allows us to take the structural model directly to the data, avoiding time-consuming simulation-based methods. To study the effectiveness of this methodology, we first undertake several Monte Carlo exercises using data generated by the structural model. We then estimate capital adjustment costs for U.S. manufacturing establishments in two sectors. 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2010/21&r=ecm 
By:  Pesaran, M.H.; Chudik, A. 
Abstract:  This paper extends the analysis of infinite-dimensional vector autoregressive models (IVAR) proposed in Chudik and Pesaran (2010) to the case where one of the variables or the cross section units in the IVAR model is dominant or pervasive. This extension is not straightforward and involves several technical difficulties. The dominant unit influences the rest of the variables in the IVAR model both directly and indirectly, and its effects do not vanish even as the dimension of the model (N) tends to infinity. The dominant unit acts as a dynamic factor in the regressions of the non-dominant units and yields an infinite order distributed lag relationship between the two types of units. Despite this, it is shown that the effects of the dominant unit, as well as those of the neighborhood units, can be consistently estimated by running augmented least squares regressions that include distributed lag functions of the dominant unit. The asymptotic distribution of the estimators is derived and their small sample properties are investigated by means of Monte Carlo experiments. 
Keywords:  IVAR Models, Dominant Units, Large Panels, Weak and Strong Cross Section Dependence, Factor Model 
JEL:  C10 C33 C51 
Date:  2010–05–29 
URL:  http://d.repec.org/n?u=RePEc:cam:camdae:1024&r=ecm 
By:  Javier González; Alberto Muñoz 
Abstract:  Functional data are difficult to manage for many traditional statistical techniques given their very high (or intrinsically infinite) dimensionality. The reason is that functional data are essentially functions and most algorithms are designed to work with (low) finite-dimensional vectors. Within this context we propose techniques to obtain finite-dimensional representations of functional data. The key idea is to consider each functional curve as a point in a general function space and then project these points onto a Reproducing Kernel Hilbert Space with the aid of Regularization theory. In this work we describe the projection method, analyze its theoretical properties and propose a model selection procedure to select appropriate Reproducing Kernel Hilbert Spaces onto which to project the functional data. 
Keywords:  Functional data, Reproducing Kernel Hilbert Spaces, Regularization theory 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:cte:wsrepe:ws102713&r=ecm 
By:  Cameron, A. Colin (University of California, Davis); Miller, Douglas L. (University of California, Davis) 
Abstract:  In this paper we survey methods to control for regression model error that is correlated within groups or clusters, but is uncorrelated across groups or clusters. Failure to control for the clustering can lead to understatement of standard errors and overstatement of statistical significance, as emphasized most notably in empirical studies by Moulton (1990) and Bertrand, Duflo and Mullainathan (2004). We emphasize OLS estimation with statistical inference based on minimal assumptions regarding the error correlation process. Complications we consider include cluster-specific fixed effects, few clusters, multi-way clustering, more efficient feasible GLS estimation, and adaptation to nonlinear and instrumental variables estimators. 
JEL:  C12 C21 C23 
Date:  2010–02 
URL:  http://d.repec.org/n?u=RePEc:ecl:ucdeco:106&r=ecm 
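For the one-regressor case, the cluster-robust (sandwich) variance surveyed above amounts to summing score contributions within clusters before squaring, which allows arbitrary error correlation inside each cluster. A minimal stdlib-only sketch with illustrative simulated data; names and data design are ours:

```python
import random
from collections import defaultdict

def ols_cluster_se(x, y, cluster):
    # Slope of y on x (with intercept) and its cluster-robust standard error:
    # a sandwich variance whose "meat" sums score contributions within each
    # cluster before squaring, allowing arbitrary within-cluster correlation.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    alpha = my - beta * mx
    resid = [yi - alpha - beta * xi for xi, yi in zip(x, y)]
    scores = defaultdict(float)
    for xi, ei, g in zip(x, resid, cluster):
        scores[g] += (xi - mx) * ei          # within-cluster score sum
    meat = sum(s ** 2 for s in scores.values())
    return beta, (meat / sxx ** 2) ** 0.5

# Illustrative data: a common shock within each cluster induces correlated errors.
rng = random.Random(11)
x, y, g = [], [], []
for c in range(20):                          # 20 clusters of 20 observations
    shock = rng.gauss(0.0, 1.0)
    for _ in range(20):
        xi = rng.gauss(0.0, 1.0)
        x.append(xi)
        y.append(1.0 + 2.0 * xi + shock + rng.gauss(0.0, 0.5))
        g.append(c)
beta, se = ols_cluster_se(x, y, g)
```

The survey's "few clusters" complication shows up here directly: with only 20 clusters, the meat is an average of 20 squared score sums and small-sample corrections to it matter.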
By:  Hilde Bjørnland (Norwegian School of Management (BI) and Norges Bank (Central Bank of Norway)); Karsten Gerdrup (Norges Bank (Central Bank of Norway)); Christie Smith (Reserve Bank of New Zealand); Anne Sofie Jore (Norges Bank (Central Bank of Norway)); Leif Anders Thorsrud (Norges Bank (Central Bank of Norway)) 
Abstract:  We apply a suite of models to produce quasi-real-time density forecasts of Norwegian GDP and inflation, and evaluate different combination and selection methods using the Kullback-Leibler information criterion (KLIC). We use linear and logarithmic opinion pools in conjunction with various weighting schemes, and we compare these combinations to two different selection methods. In our application, logarithmic opinion pools were better than linear opinion pools, and score-based weights were generally superior to other weighting schemes. Model selection generally yielded poor density forecasts, as evaluated by the KLIC. 
Keywords:  Model combination; evaluation; density forecasting; KLIC 
JEL:  C32 C52 C53 E52 
Date:  2010–05–19 
URL:  http://d.repec.org/n?u=RePEc:bno:worpap:2010_06&r=ecm 
By:  Michael McAleer (University of Canterbury); Massimiliano Caporin 
Abstract:  DAMGARCH is a new model that extends the VARMA-GARCH model of Ling and McAleer (2003) by introducing multiple thresholds and a time-dependent structure in the asymmetry of the conditional variances. Analytical expressions for the news impact surface implied by the new model are also presented. DAMGARCH models the shocks affecting the conditional variances on the basis of an underlying multivariate distribution. It is possible to model explicitly asset-specific shocks and common innovations by partitioning the multivariate density support. This paper presents the model structure, describes the implementation issues, and provides the conditions for the existence of a unique stationary solution, and for consistency and asymptotic normality of the quasi-maximum likelihood estimators. The paper also presents an empirical example to highlight the usefulness of the new model. 
Keywords:  multivariate asymmetry; conditional variance; stationarity conditions; asymptotic theory; multivariate news impact curve 
JEL:  C32 C51 C52 
Date:  2010–04–01 
URL:  http://d.repec.org/n?u=RePEc:cbt:econwp:10/32&r=ecm 
By:  Hartmann, Wesley (Stanford University); Nair, Harikesh (Stanford University); Narayanan, Sridhar (Stanford University) 
Abstract:  We discuss how regression discontinuity designs arise naturally in settings where firms target marketing activity at consumers, and discuss how this aspect may be exploited for econometric inference of causal effects of marketing effort. Our main insight is to use commonly observed discreteness and kinks in the heuristics by which firms target such marketing activity to consumers for nonparametric identification. Such kinks, along with continuity restrictions that are typically satisfied in marketing and industrial organization applications, are sufficient for identification of local treatment effects. We review the theory of regression discontinuity estimation in the context of targeting, and explore its applicability to several marketing settings. We discuss identifiability of causal marketing effects using the design, and illustrate theoretically the conditions under which the RD estimator may be valid. Specifically, we argue that consideration of an underlying model of strategic consumer behavior reveals how identification hinges on model features such as the specification and value of structural parameters as well as belief structures. We present two empirical applications: the first, to measuring the effect of casino email promotions targeted to customers based on ranges of their expected profitability; and the second, to measuring the effect of direct mail targeted by a B2C company to zip codes based on thresholds of expected response. In both cases, we illustrate that exploiting the regression discontinuity design reveals negative effects of the marketing campaigns that would not have been uncovered using other approaches. Our results are nonparametric, easy to compute, and fully control for the endogeneity induced by the targeting rule. 
Date:  2009–11 
URL:  http://d.repec.org/n?u=RePEc:ecl:stabus:2039&r=ecm 
By:  Guglielmo Maria Caporale; Luis A. Gil-Alana 
Abstract:  This paper examines the degree of persistence in the volatility of financial time series using a Long Memory Stochastic Volatility (LMSV) model. Specifically, it employs a Gaussian semiparametric (or local Whittle) estimator of the memory parameter, based on the frequency domain, proposed by Robinson (1995a), and shown by Arteche (2004) to be consistent and asymptotically normal in the context of signal plus noise models. Daily data on the NASDAQ index are analysed. The results suggest that volatility has a component of long memory behaviour, with the order of integration ranging between 0.3 and 0.5; the series is therefore stationary and mean-reverting. 
Keywords:  Fractional integration, long memory, stochastic volatility, asset returns 
JEL:  C13 C22 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1006&r=ecm 
By:  Andree Ehlert (GeorgAugustUniversity Göttingen); Martin Schlather (GeorgAugustUniversity Göttingen) 
Abstract:  The extremal coefficient function has been discussed as an analog of the autocovariance function for extreme values. However, as to the behavior of valid extremal coefficient functions little is known apart from their positive definite type. In particular, the reconstruction of valid processes from given extremal coefficient functions has not been considered before. We show, for the one-dimensional case, the equivalence of the set correlation functions and the extremal coefficient functions with finite range on a grid, and study an analogy to Bochner’s theorem, namely that any such extremal coefficient function is representable as a convex combination of a finite set of positive definite functions. This allows for the construction of simple max-stable processes complying with a given extremal coefficient function and, in addition, highlights further properties of the latter. We will include an application of this approach and discuss several examples. As to processes with infinite range, we will consider a natural extension to max-stable processes of the term “long memory” that is well known in the Gaussian framework. 
Keywords:  Extreme value theory; max-stable process; extremal dependence; extremal coefficient function; set covariance function; set correlation function; homometric; long memory; summability 
Date:  2010–05–25 
URL:  http://d.repec.org/n?u=RePEc:got:gotcrc:030&r=ecm 
By:  Md Atikur Rahman Khan; D.S. Poskitt 
Abstract:  This paper provides an information theoretic analysis of the signal-noise separation problem in Singular Spectrum Analysis. We present a signal-plus-noise model based on the Karhunen-Loève expansion and use this model to motivate the construction of a minimum description length criterion that can be employed to select both the window length and the signal. We show that under very general regularity conditions the criterion will identify the true signal dimension with probability one as the sample size increases, and will choose the smallest window length consistent with the Whitney embedding theorem. Empirical results obtained using simulated and real world data sets indicate that the asymptotic theory is reflected in observed behaviour, even in relatively small samples. 
Keywords:  Karhunen-Loève expansion, minimum description length, signal-plus-noise model, Singular Spectrum Analysis, embedding 
JEL:  C14 C22 C52 
Date:  2010–05–24 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201013&r=ecm 
By:  Volf Frishling; David G Maher 
Abstract:  We examine three methods of constructing correlated Student-t random variables. Our motivation arises from simulations that utilise heavy-tailed distributions for the purposes of stress testing and economic capital calculations for financial institutions. We make several observations regarding the suitability of the three methods for this purpose. 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1005.4456&r=ecm 
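One standard construction of correlated Student-t variates divides correlated normals by the square root of a scaled chi-square. Whether the chi-square denominator is shared or drawn independently changes the joint (tail) dependence, which is the kind of distinction among constructions the paper examines; the sketch below shows both variants under parameter names of our choosing, and is not taken from the paper.

```python
import math
import random

def correlated_student_t(rho, df, n, shared_chi2=True, seed=0):
    # Pairs of correlated t variates: correlate two standard normals through
    # the Cholesky factor of [[1, rho], [rho, 1]], then divide each by
    # sqrt(ChiSq(df)/df). Sharing one chi-square denominator yields the
    # standard bivariate t; independent denominators yield a different
    # construction with weaker joint tail behaviour.
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        c1 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(df))
        c2 = c1 if shared_chi2 else sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(df))
        pairs.append((z1 / math.sqrt(c1 / df), z2 / math.sqrt(c2 / df)))
    return pairs

pairs = correlated_student_t(0.8, 5, 4000)
```

For stress-testing purposes the shared-denominator variant is the relevant one when joint extreme moves must be reproduced, since the common chi-square makes both margins large simultaneously.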
By:  Corrado, L.; Weeks, M. 
Abstract:  In this paper we explore solutions to a particular type of heterogeneity in survey data which is manifest in the presence of individual-specific response scales. We consider this problem in the context of existing evidence on cross-country differences in subjective life satisfaction, and in particular the extent of cross-country comparability. In this instance observed responses are not directly comparable, and inference is compromised. We utilise two broad identification strategies to account for scale heterogeneity. Keeping the data fixed, we consider a number of estimators based on alternative generalisations of the ordered response model. We also examine a number of alternative approaches based on the use of additional information in the form of responses to one or more additional questions with the same response categories as the self-assessment question. These additional questions, referred to as anchoring vignettes, can, under certain conditions, be used to correct for the resultant biases in the model parameters. 
Keywords:  Vignettes, ordered response, generalised ordered response, stochastic thresholds, attitudinal surveys. 
Date:  2010–05–29 
URL:  http://d.repec.org/n?u=RePEc:cam:camdae:1031&r=ecm 
By:  Andrew J. Buck (Department of Economics, Temple University); George M. Lady (Department of Economics, Temple University) 
Abstract:  As currently practiced, the analysis of an economic model’s qualitative properties is very restricted and rarely productive. This paper provides an approach for conducting an expanded qualitative analysis that can be applied to any economic model. The method proposed will enable the qualitative properties of all economic models to be critically assessed. 
Keywords:  Qualitative Analysis, Comparative Statics, Falsification 
JEL:  C12 C14 C52 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:tem:wpaper:1007&r=ecm 
By:  Hyunsok Kim; Ronald MacDonald 
Abstract:  The large appreciation and depreciation of the US dollar in the 1980s stimulated an important debate on the usefulness of unit root tests in the presence of structural breaks. In this paper, we propose a simple model to describe the evolution of the real exchange rate. We then propose a more general smooth transition (STR) function than has hitherto been employed, which is able to capture structural changes along the (long-run) equilibrium path, and show that this is consistent with our economic model. Our framework allows for a gradual adjustment between regimes and allows for under- and/or over-valued exchange rate adjustments. Using monthly and quarterly data for up to twenty OECD countries, we apply our methodology to investigate the univariate time series properties of CPI-based real exchange rates with both the U.S. dollar and German mark as the numeraire currencies. The empirical results show that, for more than half of the quarterly series, the evidence in favour of the stationarity of the real exchange rate was clearer in the subsample period post-1980. 
Keywords:  Unit root tests, structural breaks, purchasing power parity 
JEL:  C16 C22 F31 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:gla:glaewp:2010_14&r=ecm 
By:  D.S.G. Pollock (University of Leicester) 
Abstract:  Discrete-time ARMA processes can be placed in a one-to-one correspondence with a set of continuous-time processes that are bounded in frequency by the Nyquist value of π radians per sample period. It is well known that, if data are sampled from a continuous process of which the maximum frequency exceeds the Nyquist value, then there will be a problem of aliasing. However, if the sampling is too rapid, then other problems will arise that will cause the ARMA estimates to be severely biased. The paper reveals the nature of these problems and shows how they may be overcome. 
Keywords:  Stochastic Differential Equations, Band-Limited Stochastic Processes, Oversampling 
Date:  2010–05–25 
URL:  http://d.repec.org/n?u=RePEc:wse:wpaper:44&r=ecm 