
New Economics Papers on Econometrics 
By:  Lei, J. (Tilburg University, Center for Economic Research) 
Abstract:  This paper considers spatial autoregressive (SAR) binary choice models in the context of panel data with fixed effects, where the latent dependent variables are spatially correlated. Without imposing any parametric structure on the error terms, the paper proposes a smoothed spatial maximum score (SSMS) estimator that consistently estimates the model parameters up to scale. Identification of the parameters is obtained when the disturbances are time-stationary and the explanatory variables vary sufficiently over time, given an exogenous and time-invariant spatial weight matrix. Consistency and the asymptotic distribution of the proposed estimator are also derived. Finally, a Monte Carlo study indicates that the SSMS estimator performs quite well in finite samples. 
Keywords:  Spatial Autoregressive Models;Binary Choice;Fixed Effects;Maximum Score Estimation 
JEL:  C14 C21 C23 C25 R15 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:2013061&r=ecm 
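The smoothed maximum score idea underlying the SSMS estimator can be illustrated in its basic cross-sectional form (Horowitz-style smoothing of Manski's indicator objective), not the spatial panel version of the paper. A minimal sketch on simulated data; the normal-CDF smoother, the bandwidth `h`, and fixing the first coefficient at 1 for scale normalisation are all illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 2000
# Simulated binary-choice data: y = 1{x'beta + u > 0}, median(u | x) = 0
X = rng.normal(size=(n, 2))
u = rng.logistic(size=n)
beta_true = np.array([1.0, 0.5])
y = (X @ beta_true + u > 0).astype(float)

def neg_smoothed_score(b2, h=0.3):
    # Parameters are identified only up to scale: fix the first coefficient at 1
    b = np.array([1.0, b2])
    # Smoothed maximum score objective: replace 1{index > 0} by Phi(index / h)
    return -np.mean((2.0 * y - 1.0) * norm.cdf((X @ b) / h))

res = minimize(lambda p: neg_smoothed_score(p[0]), x0=[0.0], method="Nelder-Mead")
b2_hat = res.x[0]   # estimate of beta_2 / beta_1 (true ratio here is 0.5)
```

In the spatial panel setting of the paper the index additionally involves the spatial lag and differenced covariates, but the same smoothed-objective optimisation applies.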
By:  Cizek, P.; Lei, J. (Tilburg University, Center for Economic Research) 
Abstract:  The identification of parameters in nonseparable single-index models with correlated random effects is considered in the context of panel data with a fixed number of time periods. The identification assumption is based on the correlated random-effects structure: the distribution of the individual effects depends on the explanatory variables only through their time averages. Under this assumption, the parameters of interest are identified up to scale and can be estimated by an average derivative estimator based on local polynomial smoothing. The rate of convergence and asymptotic distribution of the proposed estimator are derived, along with a test of whether pooled estimation using all available time periods is possible. Finally, a Monte Carlo study indicates that the estimator performs quite well in finite samples. 
Keywords:  average derivative estimation;correlated random effects;local polynomial smoothing;nonlinear panel data 
JEL:  C14 C23 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:2013062&r=ecm 
By:  Monica Billio (Department of Economics, University of Venice Ca' Foscari, Italy); Maddalena Cavicchioli (Department of Economics, University of Venice Ca' Foscari, Italy) 
Abstract:  This paper shows a duality in the estimation of Markov switching (MS) processes for volatility. It is well known that MS-GARCH models suffer from path dependence, which makes estimation infeasible with the usual maximum likelihood procedure. However, by rewriting the MS-GARCH model in a suitable linear state-space representation, we provide a unified framework that reconciles estimation via the Kalman filter with some auxiliary models proposed in the literature. Along the same lines, we present a linear filter for MS stochastic volatility (MS-SV) models in which different conditioning sets yield more flexibility in the estimation. Estimation on simulated data and on short-term interest rates shows the feasibility of the proposed approach. 
Keywords:  Markov Switching, MS-GARCH model, MS-SV model, estimation, auxiliary model, Kalman Filter. 
JEL:  C01 C13 C58 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:ven:wpaper:2013:24&r=ecm 
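The linear state-space representation invoked above is handled by the standard Kalman filter. As a generic reference point (a toy AR(1)-plus-noise example, not the MS-GARCH mapping of the paper; all matrices and parameter values are illustrative):

```python
import numpy as np

def kalman_filter(y, T, Z, Q, H, a0, P0):
    # Filter for: a_t = T a_{t-1} + eta_t, eta ~ N(0, Q)
    #             y_t = Z a_t + eps_t,    eps ~ N(0, H)
    a, P = a0, P0
    loglik = 0.0
    filtered = []
    for yt in y:
        # Prediction step
        a = T @ a
        P = T @ P @ T.T + Q
        # Update step
        F = (Z @ P @ Z.T + H).item()   # prediction-error variance
        v = (yt - Z @ a).item()        # prediction error
        K = (P @ Z.T) / F              # Kalman gain
        a = a + (K * v).ravel()
        P = P - K @ (Z @ P)
        loglik += -0.5 * (np.log(2.0 * np.pi * F) + v**2 / F)
        filtered.append(a.copy())
    return np.array(filtered), loglik

# Toy example: AR(1) state observed with noise
rng = np.random.default_rng(1)
n, phi = 300, 0.8
alpha = np.zeros(n)
for t in range(1, n):
    alpha[t] = phi * alpha[t - 1] + rng.normal()
y = alpha + 0.5 * rng.normal(size=n)

T = np.array([[phi]]); Z = np.array([[1.0]])
Q = np.array([[1.0]]); H = np.array([[0.25]])
filt, ll = kalman_filter(y, T, Z, Q, H, np.zeros(1), np.eye(1))
```

The loglikelihood returned by the filter is what a maximum likelihood routine would optimise over the model parameters once the MS-GARCH model has been cast into this form.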
By:  Nadja Klein; Thomas Kneib; Stephan Klasen; Stefan Lang 
Abstract:  In this paper, we propose a unified Bayesian approach to multivariate structured additive distributional regression, where inference applies to a huge class of multivariate response distributions, comprising continuous, discrete and latent models, and where each parameter of these potentially complex distributions is modelled by a structured additive predictor. The latter is an additive composition of different types of covariate effects, e.g., nonlinear effects of continuous variables, random effects, spatial variation, or interaction effects. Inference is realised by a generic, efficient Markov chain Monte Carlo algorithm based on iteratively weighted least squares approximations, with multivariate Gaussian priors to enforce specific properties of the functional effects. The approach is illustrated by a joint model of risk factors for chronic and acute childhood malnutrition in India and by an ecological regression of German election results. 
Keywords:  correlated responses; iteratively weighted least squares proposal; Markov chain Monte Carlo simulation; penalised splines; semiparametric regression; Dirichlet regression; seemingly unrelated regression 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:inn:wpaper:201335&r=ecm 
By:  Fernández Macho, Francisco Javier 
Abstract:  In a recent paper, Leong and Huang (2010, Journal of Applied Statistics 37, 215–233) proposed a wavelet-correlation-based approach to test for cointegration between two time series. However, correlation and cointegration are two different concepts, even when wavelet analysis is used. It is known that statistics based on nonstationary integrated variables have nonstandard asymptotic distributions. However, wavelet analysis offsets the integration order of nonstationary series, so that traditional asymptotics for stationary variables suffice to ascertain the statistical properties of wavelet-based statistics. On this basis, this note shows that wavelet correlations cannot be used as a test of cointegration. 
Keywords:  econometric methods, spectral analysis, integrated process, time series models, unit roots, wavelet analysis. 
JEL:  C22 C12 
URL:  http://d.repec.org/n?u=RePEc:ehu:biltok:10862&r=ecm 
By:  Xiangjin B. Chen; Jiti Gao; Degui Li; Param Silvapulle 
Abstract:  This paper introduces a new specification of the heterogeneous autoregressive (HAR) model for the realized volatility of S&P 500 index returns. In this new model, the coefficients of the HAR are allowed to be time-varying with unknown functional forms. We propose a local linear method for estimating this TVC-HAR model, as well as a bootstrap method for constructing confidence intervals for the time-varying coefficient functions. In addition, the estimated nonparametric TVC-HAR model is calibrated by fitting parametric polynomial functions that minimise an L2-type criterion. The calibrated TVC-HAR and the simple HAR models are then tested separately against the nonparametric TVC-HAR model. Test statistics based on the generalised likelihood ratio method, augmented with the bootstrap, provide evidence in favour of the calibrated TVC-HAR model. More importantly, the results of the conditional predictive ability test developed by Giacomini and White (2006) indicate that the nonparametric TVC-HAR model consistently outperforms its calibrated counterpart, as well as the simple HAR and HAR-GARCH models, in out-of-sample forecasting. 
Keywords:  Bootstrap method, heterogeneous autoregressive model, locally stationary process, nonparametric method 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201321&r=ecm 
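The baseline HAR model that the TVC-HAR generalises regresses realized volatility on its lagged daily, weekly, and monthly averages. A minimal constant-coefficient version fit by OLS on simulated data standing in for realized variance; the window lengths 1/5/22 are the conventional choices, and the data-generating process is illustrative:

```python
import numpy as np

def har_design(rv):
    # Build HAR regressors: lagged daily, weekly (5-day), monthly (22-day) mean RV
    n = len(rv)
    rows, target = [], []
    for t in range(22, n):
        d = rv[t - 1]
        w = rv[t - 5:t].mean()
        m = rv[t - 22:t].mean()
        rows.append([1.0, d, w, m])
        target.append(rv[t])
    return np.array(rows), np.array(target)

# Simulated persistent positive series standing in for realized variance
rng = np.random.default_rng(2)
n = 1000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.95 * x[t - 1] + 0.1 * rng.normal()
rv = np.exp(x)

X, y = har_design(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS: constant, daily, weekly, monthly
```

In the TVC-HAR of the paper, the four coefficients in `beta` become unknown functions of time, estimated by local linear smoothing rather than a single OLS fit.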
By:  Ting Wang; Edgar C. Merkle; Achim Zeileis 
Abstract:  In this paper, we consider a family of recently proposed measurement invariance tests that are based on the scores of a fitted model. This family can be used to test for measurement invariance with respect to a continuous auxiliary variable, without pre-specification of subgroups. Moreover, the family can be used to test for measurement invariance with respect to an ordinal auxiliary variable, yielding test statistics that are sensitive to violations that are monotonically related to the ordinal variable (and less sensitive to non-monotonic violations). The paper is specifically aimed at potential users of the tests who may wish to know (i) how the tests can be employed for their data, and (ii) whether the tests can accurately identify the specific model parameters that violate measurement invariance (possibly in the presence of model misspecification). After providing an overview of the tests, we illustrate their general use via the R packages lavaan and strucchange. We then describe two novel simulations that provide evidence of the tests' practical abilities. As a whole, the paper provides researchers with the tools and knowledge needed to apply these tests to general measurement invariance scenarios. 
Keywords:  measurement invariance, parameter stability, ordinal variable, factor analysis, structural equation models 
JEL:  C30 C52 C87 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:inn:wpaper:201333&r=ecm 
By:  Bent Jesper Christensen (Aarhus University and CREATES); Robinson Kruse (Leibniz University Hannover and CREATES); Philipp Sibbertsen (Leibniz University Hannover) 
Abstract:  We consider hypothesis testing in a general linear time series regression framework when the possibly fractional order of integration of the error term is unknown. We show that the approach suggested by Vogelsang (1998a) for the case of integer integration does not apply to the case of fractional integration. We propose a Lagrange Multiplier-type test whose limiting distribution is independent of the order of integration of the errors. Different testing scenarios for the case of deterministic and stochastic regressors are considered. Simulations demonstrate that the proposed test works well for a variety of different cases, thereby emphasizing its generality. 
Keywords:  Long memory, linear time series regression, Lagrange Multiplier test 
JEL:  C12 C22 
Date:  2013–05–24 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201335&r=ecm 
By:  Kim, Chang-Jin; Kim, Jaeho 
Abstract:  In the case of a flat prior, the conventional wisdom is that Bayesian inference may not be very different from classical inference, as the likelihood dominates the posterior density. This paper shows that there are cases in which this conventional wisdom does not apply. An ARMA model of real GDP growth estimated by Perron and Wada (2009) is an example. While their maximum likelihood estimation of the model implies that real GDP may be a trend-stationary process, Bayesian estimation of the same model implies that most of the variation in real GDP can be explained by the stochastic trend component, as in Nelson and Plosser (1982) and Morley et al. (2003). We show that such dramatically different results stem from differences in how the nuisance parameters are handled between the two approaches, especially when the parameter estimate of interest depends upon the estimates of the nuisance parameters in small samples. For the maximum likelihood approach, as the number of nuisance parameters increases, there is a higher probability that the moving-average root is estimated to be one even when its true value is less than one, spuriously indicating that the data is 'over-differenced.' However, the Bayesian approach is relatively free from this pile-up problem, as the posterior distribution is not dependent upon the nuisance parameters. 
Keywords:  pile-up problem, ARMA model, unobserved-components model, profile likelihood, marginal posterior density, trend-cycle decomposition 
JEL:  C11 E32 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:51118&r=ecm 
By:  Donnet, Sophie; Samson, Adeline 
Abstract:  This paper surveys existing estimation methods for pharmacokinetic/pharmacodynamic (PK/PD) models based on stochastic differential equations (SDEs). Most parametric estimation methods proposed for SDEs require high-frequency data and are often poorly suited to PK/PD data, which are usually sparse. Moreover, PK/PD experiments generally include not a single individual but a group of subjects, leading to a population estimation approach. This review concentrates on estimation methods that have been applied to PK/PD data, for SDEs observed with and without measurement noise, with a standard or a population approach. The methodologies adopted differ substantially depending on whether an explicit transition density of the SDE solution exists. 
Keywords:  Stochastic differential equations; Pharmacokinetic; Pharmacodynamic; population approach; maximum likelihood estimation; Kalman Filter; EM algorithm; Hermite expansion; Gauss quadrature; Bayesian estimation; 
JEL:  C11 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:dau:papers:123456789/11429&r=ecm 
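When no explicit transition density is available, a common workaround in this literature is the Euler(-Maruyama) pseudo-likelihood, which treats each increment as conditionally Gaussian. A sketch for an Ornstein-Uhlenbeck process (a single path, no measurement noise, no population structure; all parameter values are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

# Simulate an Ornstein-Uhlenbeck path dX = theta*(mu - X)dt + sigma dW (Euler scheme)
rng = np.random.default_rng(3)
theta, mu, sigma, dt, n = 2.0, 1.0, 0.5, 0.01, 5000
x = np.empty(n); x[0] = mu
for t in range(n - 1):
    x[t + 1] = x[t] + theta * (mu - x[t]) * dt + sigma * np.sqrt(dt) * rng.normal()

def neg_euler_loglik(params):
    # Euler pseudo-likelihood: X_{t+1} | X_t ~ N(X_t + th*(m - X_t)*dt, s^2*dt)
    th, m, s = params
    if th <= 0 or s <= 0:
        return np.inf
    mean = x[:-1] + th * (m - x[:-1]) * dt
    var = s**2 * dt
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2.0 * np.pi * var) + resid**2 / var)

res = minimize(neg_euler_loglik, x0=[1.0, 0.0, 1.0], method="Nelder-Mead")
th_hat, mu_hat, s_hat = res.x
```

For the OU process the exact transition density is actually known, which makes it a convenient benchmark for checking the Euler approximation at a given sampling interval.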
By:  Fengler, Matthias R.; Mammen, Enno; Vogt, Michael 
Abstract:  For an additive autoregression model, we study two types of testing problems. First, a parametric specification of a component function is compared against a nonparametric fit. Second, two nonparametric fits from two different time periods are tested for equality. We apply the theory to a nonparametric extension of the linear heterogeneous autoregressive (HAR) model, which is widely employed to describe realized variance data. We find that the linearity assumption is often rejected, in particular for equity, fixed income, and currency futures data; in the presence of a structural break, nonlinearity appears to prevail in the sample before the outbreak of the financial crisis in mid-2007. 
Keywords:  Additive models; Backfitting; Nonparametric time series analysis; Specification tests; Realized variance; Heterogeneous autoregressive model. 
JEL:  C14 C58 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:usg:econwp:2013:32&r=ecm 
By:  Jia Chen; Degui Li; Jiti Gao 
Abstract:  This article provides a selective review of recent developments in nonlinear nonparametric and semiparametric panel data models. In particular, we focus on two types of modelling frameworks: nonparametric and semiparametric panel data models with deterministic trends, and semiparametric single-index panel data models with individual effects. We also review various estimation methodologies that can consistently estimate both the parametric and nonparametric components in these models. Both the time series length and the cross-sectional size are allowed to be very large, in which case the panel data are called "large-dimensional panels". 
Keywords:  Deterministic trends, local linear fitting, panel data, semiparametric estimation, single-index models 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201318&r=ecm 
By:  Juan Carlos Escanciano; Lin Zhu 
Abstract:  This paper provides tools for partial identification inference and sensitivity analysis in a general class of semiparametric models. The main working assumption is that the finite-dimensional parameter of interest and the possibly infinite-dimensional nuisance parameter are identified conditionally on other nuisance parameters being known. This structure arises in numerous applications and leads to relatively simple inference procedures. The paper develops uniform convergence for a set of semiparametric two-step GMM estimators, and it uses the uniformity to establish set inference, including confidence regions for the identified set and the true parameter. Sensitivity analysis considers a domain of variation for the unidentified parameter that can be well outside its identified set, which requires inference to be established under misspecification. The paper also introduces new measures of sensitivity. Inference is implemented with new bootstrap methods. Several example applications illustrate the wide applicability of our results. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:55/13&r=ecm 
By:  Xibin Zhang; Maxwell L. King 
Abstract:  This paper investigates a Bayesian sampling approach to parameter estimation in the GARCH model with an unknown conditional error density, which we approximate by a mixture of Gaussian densities centered at the individual errors and scaled by a common standard deviation. This mixture density has the form of a kernel density estimator of the errors, with its bandwidth being the standard deviation. The study is motivated by the lack of robustness in GARCH models with a parametric assumption on the error density when used for error-density-based inference such as value-at-risk (VaR) estimation. A contribution of the paper is to construct the likelihood and posterior of the model and bandwidth parameters under the kernel-form error density, and to derive the one-step-ahead posterior predictive density of asset returns. We also investigate the use and benefit of localized bandwidths in the kernel-form error density. A Monte Carlo simulation study reveals that the robustness of the kernel-form error density compensates for the loss of accuracy when using this density. Applying this GARCH model to daily return series of 42 assets in stock, commodity and currency markets, we find that it is favored over the GARCH model with a skewed Student t error density for all stock indices, two out of 11 currencies and nearly half of the commodities. This provides an empirical justification for the value of the proposed GARCH model. 
Keywords:  Bayes factors, Gaussian kernel error density, localized bandwidths, Markov chain Monte Carlo, value-at-risk 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201319&r=ecm 
By:  Xibin Zhang; Maxwell L. King; Han Lin Shang 
Abstract:  We propose to approximate the unknown error density of a nonparametric regression model by a mixture of Gaussian densities with means equal to the individual error realizations and variance a constant parameter. This mixture density has the form of a kernel density estimator of the error realizations. We derive an approximate likelihood and posterior for the bandwidth parameters in the kernel-form error density and the Nadaraya-Watson regression estimator, and develop a sampling algorithm. A simulation study shows that when the true error density is non-Gaussian, the kernel-form error density is often favored over its parametric counterparts, including the correct error density assumption. Our approach is demonstrated through a nonparametric regression model of the Australian All Ordinaries daily return on the overnight FTSE and S&P 500 returns. Using the estimated bandwidths, we derive the one-day-ahead density forecast of the All Ordinaries return, and a distribution-free value-at-risk is obtained. The proposed algorithm is also applied to a nonparametric regression model involved in state-price density estimation based on S&P 500 options data. 
Keywords:  Bayes factors, kernel-form error density, Metropolis-Hastings algorithm, state-price density, value-at-risk 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201320&r=ecm 
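The two ingredients above, a Nadaraya-Watson regression estimator and a kernel-form error density (a Gaussian mixture centered at the residuals with a common standard deviation), can be sketched in a few lines; the bandwidths are fixed by hand here rather than sampled from a posterior as in the paper, and the data are simulated:

```python
import numpy as np

def nadaraya_watson(x_grid, x, y, h):
    # Nadaraya-Watson estimator with a Gaussian kernel and bandwidth h
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def kernel_form_density(e_grid, resid, b):
    # Mixture of Gaussians centered at the residuals with common std b,
    # i.e. a kernel density estimator of the errors
    z = (e_grid[:, None] - resid[None, :]) / b
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(resid) * b * np.sqrt(2 * np.pi))

rng = np.random.default_rng(4)
n = 500
x = rng.uniform(-2, 2, n)
err = rng.standard_t(df=4, size=n)          # non-Gaussian errors
y = np.sin(x) + 0.3 * err

m_hat = nadaraya_watson(x, x, y, h=0.3)     # fitted regression at the sample points
resid = y - m_hat
grid = np.linspace(-3, 3, 101)
dens = kernel_form_density(grid, resid, b=0.2)
```

In the paper, `h` and `b` are treated as parameters with a posterior distribution and sampled jointly, rather than plugged in.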
By:  Cinzia Daraio (Department of Computer, Control and Management Engineering, Universita' degli Studi di Roma "La Sapienza"); Léopold Simar (Institute of Statistics, Biostatistics and Actuarial Sciences, Université Catholique de Louvain, Belgium) 
Abstract:  Directional distance functions provide very flexible tools for investigating the performance of Decision Making Units (DMUs). Their flexibility relies on their ability to handle undesirable outputs and to account for non-discretionary inputs and/or outputs by fixing some elements of the directional vector at zero. Simar and Vanhems (2012) and Simar et al. (2012) indicate how the statistical properties of Farrell-Debreu-type radial efficiency measures can be transferred to directional distances. Moreover, robust versions of these distances are also available, for conditional and unconditional measures. Bădin et al. (2012) have shown how conditional radial distances are useful for investigating the effect of environmental factors on the production process. In this paper we develop the operational aspects of computing conditional and unconditional directional distances and their robust versions, in particular when some of the elements of the directional vector are fixed at zero. We then show how the approach of Bădin et al. (2012) can be adapted to a directional distance framework, including bandwidth selection and two-stage regression of conditional efficiency scores. Finally, we suggest a procedure, based on bootstrap techniques, for testing the significance of environmental factors on directional efficiency scores. The procedure is illustrated with simulated and real data. 
Keywords:  Directional Distances, Data Envelopment Analysis (DEA), Free Disposal Hull (FDH), Conditional efficiency measures, Nonparametric frontiers, Bootstrap 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:aeg:report:201311&r=ecm 
By:  Ulrich Hounyo (Oxford-Man Institute of Quantitative Finance and CREATES) 
Abstract:  The main contribution of this paper is to propose a new bootstrap method for statistics based on high-frequency returns. The new method exploits the local Gaussianity and the local constancy of volatility of high-frequency returns, two assumptions that can simplify inference in the high-frequency context, as recently explained by Mykland and Zhang (2009). Our main contributions are as follows. First, we show that the local Gaussian bootstrap is first-order consistent when used to estimate the distributions of realized volatility and realized betas. Second, we show that the local Gaussian bootstrap matches accurately the first four cumulants of realized volatility, implying that this method provides third-order refinements. This is in contrast with the wild bootstrap of Gonçalves and Meddahi (2009), which is only second-order correct. Third, we show that the local Gaussian bootstrap is able to provide second-order refinements for the realized beta, which is also an improvement on the existing bootstrap results in Dovonon, Gonçalves and Meddahi (2013) (where the pairs bootstrap was shown not to be second-order correct under general stochastic volatility). Lastly, we provide Monte Carlo simulations and use empirical data to compare the finite-sample accuracy of our new bootstrap confidence intervals for integrated volatility and integrated beta with existing results. 
Keywords:  High frequency data, realized volatility, realized beta, bootstrap, Edgeworth expansions 
JEL:  C15 C22 C58 
Date:  2013–09–16 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201330&r=ecm 
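The local Gaussian idea can be sketched as follows: volatility is treated as constant within small blocks of high-frequency returns, and bootstrap returns are drawn as the locally estimated volatility times fresh standard normal draws. This is an illustrative reading of the method, not the paper's exact algorithm (block size, studentization, and centering details are omitted):

```python
import numpy as np

rng = np.random.default_rng(5)
n, K = 780, 10            # number of intraday returns, block size
sigma = 0.02 * np.exp(0.5 * np.sin(np.linspace(0, 3, n)))   # slowly varying volatility
r = sigma / np.sqrt(n) * rng.normal(size=n)                 # high-frequency returns
rv = np.sum(r**2)                                           # realized volatility

def local_gaussian_bootstrap(r, K, B=999, rng=rng):
    # Within each block of K returns, volatility is treated as constant and
    # estimated by the block's mean squared return; bootstrap returns are
    # the local variance estimate times fresh squared N(0, 1) draws
    blocks = r[: len(r) - len(r) % K].reshape(-1, K)
    local_var = (blocks**2).mean(axis=1, keepdims=True)
    stats = np.empty(B)
    for b in range(B):
        z = rng.normal(size=blocks.shape)
        stats[b] = np.sum(local_var * z**2)
    return stats

boot = local_gaussian_bootstrap(r, K)
ci = np.percentile(boot, [2.5, 97.5])   # bootstrap interval for realized volatility
```

By construction, the bootstrap statistics are centered at the observed realized volatility, which is what makes percentile-type intervals meaningful here.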
By:  Niels S. Hansen (Aarhus University and CREATES); Asger Lunde (Aarhus University and CREATES) 
Abstract:  In this paper we are interested in the term structure of futures contracts on oil. The objective is to specify a relatively parsimonious model that explains the data well and performs well in real-time out-of-sample forecasting. The dynamic Nelson-Siegel model is normally used to analyze and forecast interest rates of different maturities. The structure of oil futures resembles the structure of interest rates, and this motivates the use of the model for our purposes. The data set is vast, and the dynamic Nelson-Siegel model allows for a significant dimension reduction by introducing three factors. By performing a series of cross-section regressions we obtain time series for these factors, and we focus on modeling their joint distribution. Using a copula decomposition, we can set up a model for each factor individually along with a model for their dependence structure. Once a reasonable model for the factors has been specified, it can be used to forecast prices of futures contracts with different maturities. The outcome of this exercise is a class of models that describes the observed futures contracts well and forecasts better than conventional benchmarks. We carry out a real-time value-at-risk analysis and show that our class of models performs well. 
Keywords:  Oil futures, Nelson-Siegel, Normal Inverse Gaussian, GARCH, Copula. 
JEL:  G17 C32 C53 
Date:  2013–10–25 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201336&r=ecm 
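The cross-section regressions mentioned above are ordinary least squares fits of each date's curve on the three Nelson-Siegel factor loadings. A single-date sketch on a toy curve; the decay parameter `lam`, the maturities, and the factor values are all illustrative:

```python
import numpy as np

def ns_loadings(tau, lam=1.5):
    # Nelson-Siegel loadings at maturities tau: level, slope, curvature
    x = lam * tau
    slope = (1.0 - np.exp(-x)) / x
    return np.column_stack([np.ones_like(tau), slope, slope - np.exp(-x)])

rng = np.random.default_rng(6)
tau = np.array([0.25, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])   # maturities in years
true_f = np.array([100.0, -5.0, 2.0])                       # level, slope, curvature
L = ns_loadings(tau)
prices = L @ true_f + 0.02 * rng.normal(size=len(tau))      # toy futures curve

# One cross-section OLS regression recovers the three factors for this date
f_hat, *_ = np.linalg.lstsq(L, prices, rcond=None)
```

Repeating this regression date by date yields the three factor time series whose joint dynamics the paper then models via copulas.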
By:  Nalan Basturk (Erasmus University Rotterdam Econometric Institute, Tinbergen Institute); Cem Cakmakli (University of Amsterdam Department of Quantitative Economics, Koç University); Pinar Ceyhan (Erasmus University Rotterdam Econometric Institute, Tinbergen Institute); Herman K. van Dijk (Erasmus University Rotterdam Econometric Institute, Tinbergen Institute, VU University Amsterdam Department of Econometrics) 
Abstract:  Changing time series properties of US inflation and economic activity, measured as marginal costs, are modeled within a set of extended Phillips Curve (PC) models. It is shown that mechanical removal or modeling of simple low-frequency movements in the data may yield poor predictive results, depending on the model specification used. Basic PC models are extended to include structural time series models that describe typical time-varying patterns in levels and volatilities. Forward- and backward-looking expectation components for inflation are incorporated and their relative importance is evaluated. Survey data on expected inflation are introduced to strengthen the information in the likelihood. Simulation-based Bayesian techniques are used for the empirical analysis. No credible evidence is found of endogeneity or long-run stability between inflation and marginal costs. The backward-looking inflation component appears stronger than the forward-looking one. Levels and volatilities of inflation are estimated more precisely using rich PC models. The extended PC structures compare favorably with existing basic Bayesian vector autoregressive and stochastic volatility models in terms of fit and prediction. The tails of the complete predictive distributions indicate an increase in the probability of deflation in recent years. 
Keywords:  New Keynesian Phillips curve, unobserved components, time-varying parameters, level shifts, inflation expectations, survey data 
JEL:  C11 C32 E31 E37 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:koc:wpaper:1321&r=ecm 
By:  Richard Blundell (Institute for Fiscal Studies and University College London); Joel Horowitz (Institute for Fiscal Studies and Northwestern University); Matthias Parey (Institute for Fiscal Studies) 
Abstract:  Economic theory rarely provides a parametric specification for a model, but it often provides shape restrictions. We consider nonparametric estimation of the heterogeneous demand for gasoline in the U.S. subject to the Slutsky inequality restriction of consumer choice theory. We derive conditions under which the demand function can be estimated consistently by nonparametric quantile regression subject to the Slutsky restriction. The estimated function reveals systematic variation in price responsiveness across the income distribution. A new method for estimating quantile instrumental variables models is also developed to allow for the endogeneity of prices. In our application, shape-constrained quantile IV estimates show similar patterns of demand as shape-constrained estimates under exogeneity. The results illustrate the improvements in the finite-sample performance of a nonparametric estimator that can be achieved by imposing shape restrictions based on economic theory. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:54/13&r=ecm 
By:  Yuta Kurose (Center for the Study of Finance and Insurance, Osaka University); Yasuhiro Omori (Faculty of Economics, University of Tokyo) 
Abstract:  A multivariate stochastic volatility model with dynamic equicorrelation and cross-leverage effect is proposed and estimated. Using a Bayesian approach, an efficient Markov chain Monte Carlo algorithm is described in which we use the multi-move sampler, which generates multiple latent variables simultaneously. Numerical examples are provided to show its sampling efficiency in comparison with the simple algorithm that generates one latent variable at a time given the other latent variables. Furthermore, the proposed model is applied to multivariate daily stock price index data. The empirical study shows that our novel model provides a substantial improvement in forecasting with respect to out-of-sample hedging performance. 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2013cf907&r=ecm 
By:  Mark Podolskij (Heidelberg University and CREATES); Nakahiro Yoshida (Graduate School of Mathematical Science) 
Abstract:  This paper presents new results on the Edgeworth expansion for high frequency functionals of continuous diffusion processes. We derive asymptotic expansions for weighted functionals of the Brownian motion and apply them to provide the Edgeworth expansion for power variation of diffusion processes. Our methodology relies on martingale embedding, Malliavin calculus and stable central limit theorems for semimartingales. Finally, we demonstrate the density expansion for studentized statistics of power variations. 
Keywords:  diffusion processes, Edgeworth expansion, high frequency observations, power variation. 
JEL:  C10 C13 C14 
Date:  2013–10–21 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201333&r=ecm 
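Power variation itself is straightforward to compute: for a diffusion observed on a fixed interval, the normalized sum of p-th absolute powers of the increments converges to a known Gaussian moment constant times the integrated p-th power of volatility. A sketch for the constant-volatility case (grid size, p, and sigma are illustrative):

```python
import numpy as np
from math import gamma, sqrt, pi

rng = np.random.default_rng(7)
n, sigma, p = 100_000, 0.8, 1.5
# Increments of X_t = sigma * W_t observed on [0, 1] at n points
dx = sigma / sqrt(n) * rng.normal(size=n)

# Normalized power variation: n^{p/2 - 1} * sum |dX_i|^p -> m_p * sigma^p
V = n ** (p / 2 - 1) * np.sum(np.abs(dx) ** p)
m_p = 2 ** (p / 2) * gamma((p + 1) / 2) / sqrt(pi)   # m_p = E|N(0,1)|^p
sigma_hat = (V / m_p) ** (1 / p)                     # recovers sigma
```

The Edgeworth expansions of the paper refine the Gaussian limit of (studentized versions of) statistics like `V`, which is relevant for bootstrap accuracy at moderate sample sizes.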
By:  Kim, Chang-Jin; Kim, Jaeho 
Abstract:  One goal of this paper is to develop an efficient Markov chain Monte Carlo (MCMC) algorithm for estimating an ARMA model with a regime-switching mean, based on a multi-move sampler. Unlike the existing algorithm of Billio et al. (1999) based on a single-move sampler, our algorithm can achieve reasonably fast convergence to the posterior distribution even when the latent regime indicator variable is highly persistent or when there exist absorbing states. Another goal is to investigate the dynamics of the latent ex-ante real interest rate (EARR) in the presence of structural breaks, employing the econometric tool developed. We argue that Garcia and Perron's (1996) conclusion that the EARR is a constant subject to occasional jumps may be sample-specific. For an extended sample that includes recent data, Garcia and Perron's (1996) AR(2) model of the EPRR may be misspecified, and we show that excluding the theory-implied moving-average terms may understate the persistence of the observed ex-post real interest rate (EPRR) dynamics. Our empirical results suggest that, even though we rule out the possibility of a unit root in the EARR, it may be more persistent and volatile than has been documented in some of the literature, including Garcia and Perron (1996). 
Keywords:  ARMA model with Regime Switching, Multi-move Sampler, Single-move Sampler, Metropolis-Hastings Algorithm, Absorbing State, Ex-ante Real Interest Rate. 
JEL:  C11 E4 
Date:  2013–08 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:51117&r=ecm 
By:  Lorenzo Frattarolo (Centre d'Economie de la Sorbonne and University Ca' Foscari, Department of Economics); Dominique Guegan (Centre d'Economie de la Sorbonne, Paris School of Economics) 
Abstract:  Conditional dependence is expressed as a projection map in the trivariate copula space. The projected copula, its sample counterpart and the related process are defined. The weak convergence of the projected copula process to a tight centered Gaussian process is obtained under weak assumptions on the copula derivatives. 
Keywords:  Conditional independence, empirical process, weak convergence, copula. 
JEL:  D81 C10 C40 C52 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:13068&r=ecm 
By:  Wang, W. (Tilburg University) 
Abstract:  This thesis first investigates various issues related to model averaging, and then evaluates two policies, the West Development Drive in China and fiscal decentralization in the U.S., using econometric tools. Chapter 2 proposes a hierarchical weighted least squares (HWALS) method to address multiple sources of uncertainty generated by model specification, estimation, and measurement choices. It examines the effects of different growth theories, taking into account the measurement problem in the growth regression. Chapter 3 addresses the issue of prediction under model uncertainty and proposes a weighted-average least squares (WALS) prediction procedure that is not conditional on the selected model. Taking both model and error uncertainty into account, it also proposes an appropriate estimate of the variance of the WALS predictor. Chapter 4 focuses on the interplay among resource abundance, institutional quality, and economic growth in China, using two different measures of resource abundance. It employs a functional-coefficient model to capture the nonlinear interaction effect of institutional quality, and a panel-data time-varying coefficient model to describe the dynamic effect of natural resources. Chapter 5 considers a dark side of fiscal decentralization: it models and empirically tests a dress-up contest caused by fiscal decentralization, and shows that the dress-up contest can lead to a social welfare loss. 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:ner:tilbur:urn:nbn:nl:ui:125928130&r=ecm 
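As background on model-averaged prediction of the kind discussed in Chapter 3, the sketch below averages candidate models' predictions with simple Akaike weights. These are not the WALS weights of the thesis; the AIC values and predictions are hypothetical.

```python
import numpy as np

def akaike_weights(aic_values):
    """Turn AIC scores of candidate models into averaging weights:
    exp(-delta/2) with delta the AIC gap to the best model,
    normalized to sum to one."""
    delta = np.asarray(aic_values, dtype=float) - np.min(aic_values)
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Three hypothetical candidate models; lower AIC is better.
w = akaike_weights([102.0, 100.0, 110.0])
preds = np.array([2.1, 2.4, 1.7])   # each model's point prediction
averaged = float(w @ preds)          # model-averaged prediction
```

The averaged prediction is not conditional on a single selected model: every candidate contributes, with better-fitting models weighted more heavily.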
By:  Andrew Gelman; Guido Imbens 
Abstract:  The statistical and econometrics literature on causality is more focused on "effects of causes" than on "causes of effects." That is, in the standard approach it is natural to study the effect of a treatment, but it is not in general possible to define the causes of any particular outcome. This has led some researchers to dismiss the search for causes as "cocktail party chatter" that is outside the realm of science. We argue here that the search for causes can be understood within traditional statistical frameworks as a part of model checking and hypothesis generation. We argue that it can make sense to ask questions about the causes of effects, but the answers to these questions will be in terms of effects of causes. 
JEL:  C01 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:19614&r=ecm 
By:  Yoosoon Chang; Yongok Choi; Chang Sik Kim; Joon Y. Park; J. Isaac Miller (Department of Economics, University of Missouri-Columbia) 
Abstract:  We introduce a panel model with a nonparametric functional coefficient of multiple arguments. The coefficient is a function both of time, allowing temporal changes in an otherwise linear model, and of the regressor itself, allowing nonlinearity. In contrast to a time series model, the effects of the two arguments can be identified using a panel model. We apply the model to the relationship between real GDP and electricity consumption. Our results suggest that the corresponding elasticities have decreased over time in developed countries, but that this decrease cannot be entirely explained by changes in GDP itself or by sectoral shifts. 
Keywords:  semiparametric panel regression, partially linear functional coefficient model, elasticity of electricity demand 
JEL:  C33 C51 C53 Q41 
Date:  2013–11–08 
URL:  http://d.repec.org/n?u=RePEc:umc:wpaper:1320&r=ecm 
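A stylized version of such a coefficient beta(t, x), varying in both time and the regressor, can be recovered by kernel smoothing. The sketch below is a generic local-constant estimator on simulated data, not the authors' exact semiparametric procedure.

```python
import numpy as np

def local_coef(y, x, t, t0, x0, h_t, h_x):
    """Local-constant kernel estimate of beta(t0, x0) in the stylized
    model y = beta(t, x) * x + error, using a Gaussian product kernel
    with bandwidths h_t (time) and h_x (regressor)."""
    w = np.exp(-0.5 * (((t - t0) / h_t) ** 2 + ((x - x0) / h_x) ** 2))
    return np.sum(w * x * y) / np.sum(w * x * x)

rng = np.random.default_rng(0)
n = 5000
t = rng.uniform(0.0, 1.0, n)              # time index, scaled to [0, 1]
x = rng.uniform(0.5, 2.0, n)              # regressor
y = (1.0 + t) * x + 0.05 * rng.standard_normal(n)  # true beta(t, x) = 1 + t
b_mid = local_coef(y, x, t, 0.5, 1.0, 0.1, 0.3)    # should be near 1.5
```

Evaluating the estimator on a grid of (t0, x0) points traces out how the coefficient drifts over time separately from its nonlinearity in x, which is the kind of decomposition the panel structure identifies.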
By:  Jarociński, Marek; Maćkowiak, Bartosz 
Abstract:  A researcher is interested in a set of variables that he wants to model with a vector autoregression, and he has a dataset with more variables. Which variables from the dataset should be included in the VAR, in addition to the variables of interest? This question arises in many applications of VARs, in prediction and impulse response analysis. We develop a Bayesian methodology to answer this question. We rely on the idea of Granger-causal-priority, related to the well-known concept of Granger-noncausality. The methodology is simple to use, because we provide closed-form expressions for the relevant posterior probabilities. Applying the methodology to the case when the variables of interest are output, the price level, and the short-term interest rate, we find remarkably similar results for the United States and the euro area. 
Keywords:  Bayesian model choice, Granger-causal-priority, Granger-noncausality, structural vector autoregression, vector autoregression 
JEL:  C32 C52 E32 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:ecb:ecbwps:20131600&r=ecm 
By:  Mittnik, Stefan 
Abstract:  Empirical evidence suggests that asset returns correlate more strongly in bear markets than conventional correlation estimates imply. We propose a method for determining complete tail-correlation matrices based on Value-at-Risk (VaR) estimates. We demonstrate how to obtain more efficient tail-correlation estimates by use of overidentification strategies and how to guarantee positive semidefiniteness, a property required for valid risk aggregation and Markowitz-type portfolio optimization. An empirical application to a 30-asset universe illustrates the practical applicability and relevance of the approach in portfolio management. 
Keywords:  Downside risk, Estimation efficiency, Portfolio optimization, Positive semidefiniteness, Solvency II, Value-at-Risk 
JEL:  C1 G11 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:zbw:cfswop:201305&r=ecm 
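One standard way to repair an estimated correlation matrix that fails to be positive semidefinite is spectral clipping: project onto the PSD cone by zeroing negative eigenvalues, then rescale so the diagonal is exactly one. This is a generic illustration of the PSD requirement the abstract mentions, not necessarily the paper's overidentification-based construction; the input matrix is hypothetical.

```python
import numpy as np

def nearest_psd_corr(corr):
    """Clip negative eigenvalues of a symmetric matrix at zero, then
    rescale by the square-root diagonal so the result is a valid
    (unit-diagonal, PSD) correlation matrix."""
    sym = (corr + corr.T) / 2.0
    eigval, eigvec = np.linalg.eigh(sym)
    clipped = eigvec @ np.diag(np.clip(eigval, 0.0, None)) @ eigvec.T
    d = np.sqrt(np.diag(clipped))
    return clipped / np.outer(d, d)

# A hypothetical 'estimated' tail-correlation matrix; pairwise entries
# are each plausible, but jointly the matrix is not PSD.
est = np.array([[1.0, 0.9, 0.2],
                [0.9, 1.0, 0.9],
                [0.2, 0.9, 1.0]])
fixed = nearest_psd_corr(est)
```

Without such a repair, a non-PSD matrix can produce negative "portfolio variances" in Markowitz-type optimization, which is why the property matters for risk aggregation.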