nep-ecm New Economics Papers
on Econometrics
Issue of 2013‒11‒16
29 papers chosen by
Sune Karlsson
Orebro University

  1. Smoothed Spatial Maximum Score Estimation of Spatial Autoregressive Binary Choice Panel Models By Lei, J.
  2. Identification and Estimation of Nonseparable Single-Index Models in Panel Data with Correlated Random Effects By Cizek, P.; Lei, J.
  3. “Markov Switching Models for Volatility: Filtering, Approximation and Duality” By Monica Billio; Maddalena Cavicchioli
  4. Bayesian Structured Additive Distributional Regression for Multivariate Responses By Nadja Klein; Thomas Kneib; Stephan Klasen; Stefan Lang
  5. A Note on Wavelet Correlation and Cointegration By Fernández Macho, Francisco Javier
  6. Nonparametric Estimation and Parametric Calibration of Time-Varying Coefficient Realized Volatility Models By Xiangjin B. Chen; Jiti Gao; Degui Li; Param Silvapulle
  7. Score-Based Tests of Measurement Invariance: Use in Practice By Ting Wang; Edgar C. Merkle; Achim Zeileis
  8. A unified framework for testing in the linear regression model under unknown order of fractional integration By Bent Jesper Christensen; Robinson Kruse; Philipp Sibbertsen
  9. The `Pile-up Problem' in Trend-Cycle Decomposition of Real GDP: Classical and Bayesian Perspectives By Kim, Chang-Jin; Kim, Jaeho
  10. A review on estimation of stochastic differential equations for pharmacokinetic/pharmacodynamic models By Donnet, Sophie; Samson, Adeline
  11. Additive modeling of realized variance: tests for parametric specifications and structural breaks By Fengler, Matthias R.; Mammen, Enno; Vogt, Michael
  12. Non- and Semi-Parametric Panel Data Models: A Selective Review By Jia Chen; Degui Li; Jiti Gao
  13. Set inferences and sensitivity analysis in semiparametric conditionally identified models By Juan Carlos Escanciano; Lin Zhu
  14. Gaussian kernel GARCH models By Xibin Zhang; Maxwell L. King
  15. A sampling algorithm for bandwidth estimation in a nonparametric regression model with a flexible error density By Xibin Zhang; Maxwell L. King; Han Lin Shang
  16. Directional Distances and their Robust versions. Computational and Testing Issues By Cinzia Daraio; Léopold Simar
  17. Bootstrapping realized volatility and realized beta under a local Gaussianity assumption By Ulrich Hounyo
  18. Analyzing Oil Futures with a Dynamic Nelson-Siegel Model By Niels S. Hansen; Asger Lunde
  19. Posterior-Predictive Evidence on US Inflation using Extended Phillips Curve Models with non-filtered Data By Nalan Basturk; Cem Cakmakli; Pinar Ceyhan; Herman K. van Dijk
  20. Nonparametric estimation of a heterogeneous demand function under the Slutsky inequality restriction By Richard Blundell; Joel Horowitz; Matthias Parey
  21. "Dynamic Equicorrelation Stochastic Volatility" By Yuta Kurose; Yasuhiro Omori
  22. Edgeworth expansion for functionals of continuous diffusion processes By Mark Podolskij; Nakahiro Yoshida
  23. Bayesian Inference in Regime-Switching ARMA Models with Absorbing States: The Dynamics of the Ex-Ante Real Interest Rate Under Structural Breaks By Kim, Chang-Jin; Kim, Jaeho
  24. Empirical Projected Copula Process and Conditional Independence: An Extended Version By Lorenzo Frattarolo; Dominique Guegan
  25. Essays on model averaging and political economics. By Wang, W.
  26. Why ask Why? Forward Causal Inference and Reverse Causal Questions By Andrew Gelman; Guido Imbens
  27. Disentangling Temporal Patterns in Elasticities: A Functional Coefficient Panel Analysis of Electricity Demand By Yoosoon Chang; Yongok Choi; Chang Sik Kim; Joon Y. Park; J. Isaac Miller
  28. Granger-causal-priority and choice of variables in vector autoregressions By Jarociński, Marek; Maćkowiak, Bartosz
  29. VaR-implied tail-correlation matrices By Mittnik, Stefan

  1. By: Lei, J. (Tilburg University, Center for Economic Research)
    Abstract: This paper considers spatial autoregressive (SAR) binary choice models in the context of panel data with fixed effects, where the latent dependent variables are spatially correlated. Without imposing any parametric structure on the error terms, the paper proposes a smoothed spatial maximum score (SSMS) estimator that consistently estimates the model parameters up to scale. Identification of the parameters is obtained when the disturbances are time-stationary and the explanatory variables vary sufficiently over time, given an exogenous and time-invariant spatial weight matrix. Consistency and the asymptotic distribution of the proposed estimator are also derived. Finally, a Monte Carlo study indicates that the SSMS estimator performs well in finite samples.
    Keywords: Spatial Autoregressive Models;Binary Choice;Fixed Effects;Maximum Score Estimation
    JEL: C14 C21 C23 C25 R15
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:2013061&r=ecm
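    The SSMS estimator in item 1 builds on the smoothed maximum score idea of Horowitz (1992). A generic cross-sectional, non-spatial version of that objective — given here only as an illustrative sketch, not the paper's own spatial panel criterion — maximizes
        S_n(b) = \frac{1}{n} \sum_{i=1}^{n} (2 y_i - 1)\, K\!\left(\frac{x_i' b}{h_n}\right),
    where K is a smooth, distribution-function-like kernel and h_n \to 0 is a bandwidth; the paper adapts this type of construction to the fixed-effects spatial panel setting.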
  2. By: Cizek, P.; Lei, J. (Tilburg University, Center for Economic Research)
    Abstract: The identification of parameters in nonseparable single-index models with correlated random effects is considered in the context of panel data with a fixed number of time periods. The identification assumption is based on the correlated random-effects structure: the distribution of the individual effects depends on the explanatory variables only through their time averages. Under this assumption, the parameters of interest are identified up to scale and can be estimated by an average derivative estimator based on local polynomial smoothing. The rate of convergence and asymptotic distribution of the proposed estimator are derived, along with a test of whether pooled estimation using all available time periods is possible. Finally, a Monte Carlo study indicates that the estimator performs well in finite samples.
    Keywords: average derivative estimation;correlated random effects;local polynomial smoothing;nonlinear panel data
    JEL: C14 C23
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:2013062&r=ecm
  3. By: Monica Billio (Department of Economics, University Of Venice Cà Foscari, Italy); Maddalena Cavicchioli (Department of Economics, University Of Venice Cà Foscari, Italy)
    Abstract: This paper establishes a duality in the estimation of Markov Switching (MS) processes for volatility. It is well known that MS-GARCH models suffer from path dependence, which makes estimation by the usual maximum likelihood procedure infeasible. However, by rewriting the MS-GARCH model in a suitable linear state space representation, we provide a unified framework that reconciles estimation via the Kalman filter with some auxiliary models proposed in the literature. Reasoning in the same way, we present a linear filter for MS stochastic volatility (MS-SV) models in which different conditioning sets yield greater flexibility in estimation. Estimation on simulated data and on short-term interest rates shows the feasibility of the proposed approach.
    Keywords: Markov Switching, MS-GARCH model, MS-SV model, estimation, auxiliary model, Kalman Filter.
    JEL: C01 C13 C58
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:ven:wpaper:2013:24&r=ecm
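    To see where the path dependence mentioned in item 3 comes from, consider a standard MS-GARCH(1,1) specification (a textbook form, not necessarily the exact parametrisation used by the authors):
        \sigma_t^2 = \omega_{S_t} + \alpha_{S_t} \varepsilon_{t-1}^2 + \beta_{S_t} \sigma_{t-1}^2 .
    Because \sigma_{t-1}^2 itself depends on the entire regime history S_1, \dots, S_{t-1}, the likelihood at time t involves a sum over all K^t regime paths, which is what makes direct maximum likelihood infeasible and motivates state space or auxiliary-model approximations.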
  4. By: Nadja Klein; Thomas Kneib; Stephan Klasen; Stefan Lang
    Abstract: In this paper, we propose a unified Bayesian approach to multivariate structured additive distributional regression, where inference applies to a huge class of multivariate response distributions, comprising continuous, discrete and latent models, and where each parameter of these potentially complex distributions is modelled by a structured additive predictor. The latter is an additive composition of different types of covariate effects, e.g. nonlinear effects of continuous variables, random effects, spatial variation, or interaction effects. Inference is realised by a generic, efficient Markov chain Monte Carlo algorithm based on iteratively weighted least squares approximations and on multivariate Gaussian priors that enforce specific properties of the functional effects. The approach is illustrated with a joint model of risk factors for chronic and acute childhood malnutrition in India and with an ecological regression of German election results.
    Keywords: correlated responses; iteratively weighted least squares proposal; Markov chain Monte Carlo simulation; penalised splines; semiparametric regression; Dirichlet regression; seemingly unrelated regression
    Date: 2013–11
    URL: http://d.repec.org/n?u=RePEc:inn:wpaper:2013-35&r=ecm
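    The structured additive predictor referred to in item 4 typically takes the generic form (a standard representation in this literature, shown here only for orientation)
        \eta = \beta_0 + f_1(\nu_1) + \dots + f_J(\nu_J),
    where each f_j may be a penalised spline of a continuous covariate, a spatial effect, a random effect, or an interaction surface; in the distributional regression setting one such predictor is specified for every parameter of the multivariate response distribution.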
  5. By: Fernández Macho, Francisco Javier
    Abstract: In a recent paper, Leong and Huang (2010, Journal of Applied Statistics 37, 215–233) proposed a wavelet-correlation-based approach to test for cointegration between two time series. However, correlation and cointegration are two different concepts, even when wavelet analysis is used. It is known that statistics based on nonstationary integrated variables have non-standard asymptotic distributions. However, wavelet analysis offsets the integrating order of nonstationary series, so that traditional asymptotics for stationary variables suffice to ascertain the statistical properties of wavelet-based statistics. On this basis, this note shows that wavelet correlations cannot be used as a test of cointegration.
    Keywords: econometric methods, spectral analysis, integrated process, time series models, unit roots, wavelet analysis.
    JEL: C22 C12
    URL: http://d.repec.org/n?u=RePEc:ehu:biltok:10862&r=ecm
  6. By: Xiangjin B. Chen; Jiti Gao; Degui Li; Param Silvapulle
    Abstract: This paper introduces a new specification for the heterogeneous autoregressive (HAR) model for the realized volatility of S&P500 index returns. In this new model, the coefficients of the HAR are allowed to be time-varying with unknown functional forms. We propose a local linear method for estimating this TVC-HAR model, as well as a bootstrap method for constructing confidence intervals for the time-varying coefficient functions. In addition, the estimated nonparametric TVC-HAR model is calibrated by fitting parametric polynomial functions that minimise an L2-type criterion. The calibrated TVC-HAR and the simple HAR models are then tested separately against the nonparametric TVC-HAR model. Test statistics based on the generalised likelihood ratio method, augmented with a bootstrap procedure, provide evidence in favour of the calibrated TVC-HAR model. More importantly, the results of the conditional predictive ability test developed by Giacomini and White (2006) indicate that the nonparametric TVC-HAR model consistently outperforms its calibrated counterpart, as well as the simple HAR and HAR-GARCH models, in out-of-sample forecasting.
    Keywords: Bootstrap method, heterogeneous autoregressive model, locally stationary process, nonparametric method
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2013-21&r=ecm
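    For reference, the constant-coefficient HAR model that item 6 generalises is usually written as (standard notation, not taken verbatim from the paper)
        RV_t = \beta_0 + \beta_d RV_{t-1}^{(d)} + \beta_w RV_{t-1}^{(w)} + \beta_m RV_{t-1}^{(m)} + \varepsilon_t,
    where RV^{(d)}, RV^{(w)} and RV^{(m)} denote daily, weekly and monthly averages of realized volatility; the TVC-HAR model replaces the constant \beta's with unknown smooth functions of (rescaled) time, estimated by local linear smoothing.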
  7. By: Ting Wang; Edgar C. Merkle; Achim Zeileis
    Abstract: In this paper, we consider a family of recently-proposed measurement invariance tests that are based on the scores of a fitted model. This family can be used to test for measurement invariance w.r.t. a continuous auxiliary variable, without pre-specification of subgroups. Moreover, the family can be used when one wishes to test for measurement invariance w.r.t. an ordinal auxiliary variable, yielding test statistics that are sensitive to violations that are monotonically related to the ordinal variable (and less sensitive to non-monotonic violations). The paper is specifically aimed at potential users of the tests who may wish to know (i) how the tests can be employed for their data, and (ii) whether the tests can accurately identify specific model parameters that violate measurement invariance (possibly in the presence of model misspecification). After providing an overview of the tests, we illustrate their general use via the R packages lavaan and strucchange. We then describe two novel simulations that provide evidence of the tests' practical abilities. As a whole, the paper provides researchers with the tools and knowledge needed to apply these tests to general measurement invariance scenarios.
    Keywords: measurement invariance, parameter stability, ordinal variable, factor analysis, structural equation models
    JEL: C30 C52 C87
    Date: 2013–10
    URL: http://d.repec.org/n?u=RePEc:inn:wpaper:2013-33&r=ecm
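    The score-based tests in item 7 are built from a cumulative, decorrelated score process; a common form, following Merkle and Zeileis (2013) and stated here only as a sketch, is
        B(t; \hat\theta) = \hat I^{-1/2}\, n^{-1/2} \sum_{i=1}^{\lfloor n t \rfloor} s(\hat\theta; x_{(i)}), \qquad t \in [0, 1],
    where the observations x_{(i)} are ordered by the auxiliary variable, s(\cdot) denotes the casewise scores of the fitted model, and \hat I estimates their covariance; different scalar functionals of B (double maximum, Cramér-von Mises type, maximum LM type) yield tests with different sensitivities to monotonic versus non-monotonic violations.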
  8. By: Bent Jesper Christensen (Aarhus University and CREATES); Robinson Kruse (Leibniz University Hannover and CREATES); Philipp Sibbertsen (Leibniz University Hannover)
    Abstract: We consider hypothesis testing in a general linear time series regression framework when the possibly fractional order of integration of the error term is unknown. We show that the approach suggested by Vogelsang (1998a) for the case of integer integration does not apply to the case of fractional integration. We propose a Lagrange Multiplier-type test whose limiting distribution is independent of the order of integration of the errors. Different testing scenarios for the case of deterministic and stochastic regressors are considered. Simulations demonstrate that the proposed test works well for a variety of different cases, thereby emphasizing its generality.
    Keywords: Long memory, linear time series regression, Lagrange Multiplier test
    JEL: C12 C22
    Date: 2013–05–24
    URL: http://d.repec.org/n?u=RePEc:aah:create:2013-35&r=ecm
  9. By: Kim, Chang-Jin; Kim, Jaeho
    Abstract: In the case of a flat prior, the conventional wisdom is that Bayesian inference may not be very different from classical inference, as the likelihood dominates the posterior density. This paper shows that there are cases in which this conventional wisdom does not apply. An ARMA model of real GDP growth estimated by Perron and Wada (2009) is an example. While their maximum likelihood estimation of the model implies that real GDP may be a trend-stationary process, Bayesian estimation of the same model implies that most of the variation in real GDP can be explained by the stochastic trend component, as in Nelson and Plosser (1982) and Morley et al. (2003). We show that such dramatically different results stem from differences in how the nuisance parameters are handled by the two approaches, especially when the parameter estimate of interest depends upon the estimates of the nuisance parameters in small samples. For the maximum likelihood approach, as the number of nuisance parameters increases, the probability increases that the moving-average root is estimated to be one even when its true value is less than one, spuriously indicating that the data are `over-differenced.' The Bayesian approach, however, is relatively free from this pile-up problem, as the posterior distribution is not dependent upon the nuisance parameters.
    Keywords: pile-up problem, ARMA model, Unobserved-Components Model, Profile likelihood, marginal posterior density, Trend-Cycle decomposition
    JEL: C11 E32
    Date: 2013–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:51118&r=ecm
  10. By: Donnet, Sophie; Samson, Adeline
    Abstract: This paper is a survey of existing estimation methods for pharmacokinetic/pharmacodynamic (PK/PD) models based on stochastic differential equations (SDEs). Most parametric estimation methods proposed for SDEs require high frequency data and are often poorly suited for PK/PD data, which are usually sparse. Moreover, PK/PD experiments generally include not a single individual but a group of subjects, leading to a population estimation approach. This review concentrates on estimation methods which have been applied to PK/PD data, for SDEs observed with and without measurement noise, with a standard or a population approach. The methodologies adopted differ substantially depending on whether or not an explicit transition density of the SDE solution is available.
    Keywords: Stochastic differential equations; Pharmacokinetic; Pharmacodynamic; population approach; maximum likelihood estimation; Kalman Filter; EM algorithm; Hermite expansion; Gauss quadrature; Bayesian estimation;
    JEL: C11
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:dau:papers:123456789/11429&r=ecm
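    The generic modelling framework surveyed in item 10 can be summarised as (an illustrative state-space formulation, with notation not taken from the paper)
        dX_t = a(X_t, t, \phi)\, dt + \sigma(X_t, t, \phi)\, dW_t, \qquad y_{ij} = h\big(X(t_{ij})\big) + \varepsilon_{ij},
    where X is the latent concentration or effect process, y_{ij} is the j-th (possibly noisy) observation on subject i, and, in the population approach, the individual parameters \phi_i are in turn given a distribution across subjects.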
  11. By: Fengler, Matthias R.; Mammen, Enno; Vogt, Michael
    Abstract: For an additive autoregression model, we study two types of testing problems. First, a parametric specification of a component function is compared against a nonparametric fit. Second, two nonparametric fits of two different time periods are tested for equality. We apply the theory to a nonparametric extension of the linear heterogeneous autoregressive (HAR) model. The linear HAR model is widely employed to describe realized variance data. We find that the linearity assumption is often rejected, in particular on equity, fixed income, and currency futures data; in the presence of a structural break, nonlinearity appears to prevail on the sample before the outbreak of the financial crisis in mid-2007.
    Keywords: Additive models; Backfitting; Nonparametric time series analysis; Specification tests; Realized variance; Heterogeneous autoregressive model.
    JEL: C14 C58
    Date: 2013–11
    URL: http://d.repec.org/n?u=RePEc:usg:econwp:2013:32&r=ecm
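    The additive autoregression underlying item 11 can be sketched as (illustrative notation, not the authors' exact equation)
        RV_t = m_0 + m_1\big(RV_{t-1}\big) + m_2\big(RV_{t-1}^{(w)}\big) + m_3\big(RV_{t-1}^{(m)}\big) + \varepsilon_t,
    i.e. the linear HAR terms are replaced by unknown component functions m_j estimated by backfitting; the first test compares a parametric (e.g. linear) specification of a component against its nonparametric fit, and the second compares the nonparametric fits over two different time periods.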
  12. By: Jia Chen; Degui Li; Jiti Gao
    Abstract: This article provides a selective review on the recent developments of some nonlinear nonparametric and semiparametric panel data models. In particular, we focus on two types of modelling frameworks: nonparametric and semiparametric panel data models with deterministic trends, and semiparametric single-index panel data models with individual effects. We also review various estimation methodologies which can consistently estimate both the parametric and nonparametric components in these models. The time series length and cross-sectional size in this article are allowed to be very large, in which case the panel data are called "large dimensional panels".
    Keywords: Deterministic trends, local linear fitting, panel data, semiparametric estimation, single-index models
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2013-18&r=ecm
  13. By: Juan Carlos Escanciano; Lin Zhu
    Abstract: This paper provides tools for partial identification inference and sensitivity analysis in a general class of semiparametric models. The main working assumption is that the finite-dimensional parameter of interest and the possibly infinite-dimensional nuisance parameter are identified conditionally on other nuisance parameters being known. This structure arises in numerous applications and leads to relatively simple inference procedures. The paper develops uniform convergence for a set of semiparametric two-step GMM estimators, and it uses the uniformity to establish set inferences, including confidence regions for the identified set and the true parameter. Sensitivity analysis considers a domain of variation for the unidentified parameter that can be well outside its identified set, which demands that inference be established under misspecification. The paper also introduces new measures of sensitivity. Inferences are implemented with new bootstrap methods. Several example applications illustrate the wide applicability of our results.
    Date: 2013–10
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:55/13&r=ecm
  14. By: Xibin Zhang; Maxwell L. King
    Abstract: This paper aims to investigate a Bayesian sampling approach to parameter estimation in the GARCH model with an unknown conditional error density, which we approximate by a mixture of Gaussian densities centered at individual errors and scaled by a common standard deviation. This mixture density has the form of a kernel density estimator of the errors with its bandwidth being the standard deviation. This study is motivated by the lack of robustness in GARCH models with a parametric assumption for the error density when used for error-density based inference such as value-at-risk (VaR) estimation. A contribution of the paper is to construct the likelihood and posterior of the model and bandwidth parameters under the kernel-form error density, and to derive the one-step-ahead posterior predictive density of asset returns. We also investigate the use and benefit of localized bandwidths in the kernel-form error density. A Monte Carlo simulation study reveals that the robustness of the kernel-form error density compensates for the loss of accuracy when using this density. Applying this GARCH model to daily return series of 42 assets in stock, commodity and currency markets, we find that this GARCH model is favored against the GARCH model with a skewed Student t error density for all stock indices, two out of 11 currencies and nearly half of the commodities. This provides an empirical justification for the value of the proposed GARCH model.
    Keywords: Bayes factors, Gaussian kernel error density, localized bandwidths, Markov chain Monte Carlo, value-at-risk
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2013-19&r=ecm
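    The kernel-form error density described in item 14 is concrete enough to sketch in code. The fragment below (Python, illustrative only: a plain GARCH(1,1) recursion with a single global bandwidth, ignoring the localized-bandwidth refinements the paper investigates) evaluates the resulting log-likelihood:
        import numpy as np
        from scipy.stats import norm

        def kernel_form_loglik(returns, omega, alpha, beta, h):
            """Log-likelihood of a GARCH(1,1) model whose standardized errors follow
            a kernel-form density: an equally weighted mixture of Gaussians centred
            at the individual errors, with common standard deviation (bandwidth) h.
            Illustrative sketch only, not the authors' exact construction."""
            n = len(returns)
            sigma2 = np.empty(n)
            sigma2[0] = returns.var()                      # simple initialisation
            for t in range(1, n):
                sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
            eps = returns / np.sqrt(sigma2)                # standardized errors
            # mixture/kernel density of each error, using all errors as centres
            dens = norm.pdf((eps[:, None] - eps[None, :]) / h).mean(axis=1) / h
            # change of variables from returns to errors adds -0.5*log(sigma2)
            return np.sum(np.log(dens) - 0.5 * np.log(sigma2))

        # example call on simulated heavy-tailed returns
        r = np.random.default_rng(0).standard_t(df=5, size=500) * 0.01
        print(kernel_form_loglik(r, omega=1e-6, alpha=0.05, beta=0.90, h=0.5))
    In the paper, the bandwidth h is treated as a parameter and sampled jointly with the GARCH parameters by Markov chain Monte Carlo rather than fixed in advance.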
  15. By: Xibin Zhang; Maxwell L. King; Han Lin Shang
    Abstract: We propose to approximate the unknown error density of a nonparametric regression model by a mixture of Gaussian densities with means being the individual error realizations and variance a constant parameter. This mixture density has the form of a kernel density estimator of error realizations. We derive an approximate likelihood and posterior for bandwidth parameters in the kernel-form error density and the Nadaraya-Watson regression estimator and develop a sampling algorithm. A simulation study shows that when the true error density is non-Gaussian, the kernel-form error density is often favored against its parametric counterparts including the correct error density assumption. Our approach is demonstrated through a nonparametric regression model of the Australian All Ordinaries daily return on the overnight FTSE and S&P 500 returns. Using the estimated bandwidths, we derive the one-day-ahead density forecast of the All Ordinaries return, and a distribution-free value-at-risk is obtained. The proposed algorithm is also applied to a nonparametric regression model involved in state–price density estimation based on S&P 500 options data.
    Keywords: Bayes factors, kernel-form error density, Metropolis-Hastings algorithm, state–price density, value-at-risk
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2013-20&r=ecm
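    For context, the Nadaraya-Watson regression estimator that enters the likelihood in item 15 is the standard kernel smoother
        \hat m(x) = \frac{\sum_{i=1}^{n} K\!\big((x - x_i)/b\big)\, y_i}{\sum_{i=1}^{n} K\!\big((x - x_i)/b\big)},
    and the residuals y_i - \hat m(x_i) play the role of the "error realizations" whose kernel-form density (as in item 14) is used to build the approximate likelihood and posterior for the regression bandwidth b and the error-density bandwidth.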
  16. By: Cinzia Daraio (Department of Computer, Control and Management Engineering, Universita' degli Studi di Roma "La Sapienza"); Léopold Simar (Institute of Statistics, Biostatistics and Actuarial Sciences, Université Catholique de Louvain, Belgium)
    Abstract: Directional distance functions provide very flexible tools for investigating the performance of Decision Making Units (DMUs). Their flexibility relies on their ability to handle undesirable outputs and to account for non-discretionary inputs and/or outputs by fixing zero values in some elements of the directional vector. Simar and Vanhems (2012) and Simar et al. (2012) indicate how the statistical properties of Farrell-Debreu type of radial efficiency measures can be transferred to directional distances. Moreover, robust versions of these distances are also available, for conditional and unconditional measures. Bădin et al. (2012) have shown how conditional radial distances are useful to investigate the effect of environmental factors on the production process. In this paper we develop the operational aspects for computing conditional and unconditional directional distances and their robust versions, in particular when some of the elements of the directional vector are fixed at zero. After that, we show how the approach of Bădin et al. (2012) can be adapted in a directional distance framework, including bandwidth selection and two-stage regression of conditional efficiency scores. Finally, we suggest a procedure, based on bootstrap techniques, for testing the significance of environmental factors on directional efficiency scores. The procedure is illustrated through simulated and real data.
    Keywords: Directional Distances, Data Envelopment Analysis (DEA), Free Disposal Hull (FDH), Conditional efficiency measures, Nonparametric frontiers, Bootstrap
    Date: 2013–11
    URL: http://d.repec.org/n?u=RePEc:aeg:report:2013-11&r=ecm
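    The directional distance function at the core of item 16 is conventionally defined, for an input-output point (x, y), a direction (d_x, d_y) and attainable set \Psi, as
        D(x, y; d_x, d_y) = \sup\{\beta \ge 0 : (x - \beta d_x,\; y + \beta d_y) \in \Psi\};
    setting some components of d_x or d_y to zero is exactly the device, discussed in the paper, that lets the measure ignore non-discretionary inputs or outputs.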
  17. By: Ulrich Hounyo (Oxford-Man Institute of Quantitative Finance and CREATES)
    Abstract: The main contribution of this paper is to propose a new bootstrap method for statistics based on high frequency returns. The new method exploits the local Gaussianity and the local constancy of volatility of high frequency returns, two assumptions that can simplify inference in the high frequency context, as recently explained by Mykland and Zhang (2009). Our main contributions are as follows. First, we show that the local Gaussian bootstrap is first-order consistent when used to estimate the distributions of realized volatility and realized betas. Second, we show that the local Gaussian bootstrap matches accurately the first four cumulants of realized volatility, implying that this method provides third-order refinements. This is in contrast with the wild bootstrap of Gonçalves and Meddahi (2009), which is only second-order correct. Third, we show that the local Gaussian bootstrap is able to provide second-order refinements for the realized beta, which is also an improvement over the existing bootstrap results in Dovonon, Gonçalves and Meddahi (2013) (where the pairs bootstrap was shown not to be second-order correct under general stochastic volatility). Lastly, we provide Monte Carlo simulations and use empirical data to compare the finite sample accuracy of our new bootstrap confidence intervals for integrated volatility and integrated beta with existing results.
    Keywords: High frequency data, realized volatility, realized beta, bootstrap, Edgeworth expansions
    JEL: C15 C22 C58
    Date: 2013–09–16
    URL: http://d.repec.org/n?u=RePEc:aah:create:2013-30&r=ecm
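    The two statistics being bootstrapped in item 17 are, in standard notation (stated here only for orientation),
        RV = \sum_{i=1}^{n} r_i^2, \qquad \hat\beta = \frac{\sum_{i=1}^{n} r_i^{(1)} r_i^{(2)}}{\sum_{i=1}^{n} \big(r_i^{(2)}\big)^2},
    where the r_i are intraday high frequency returns; the local Gaussian bootstrap resamples such returns block by block as if they were conditionally Gaussian with locally constant volatility, in line with the two assumptions highlighted in the abstract.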
  18. By: Niels S. Hansen (Aarhus University and CREATES); Asger Lunde (Aarhus University and CREATES)
    Abstract: In this paper we are interested in the term structure of futures contracts on oil. The objective is to specify a relatively parsimonious model which explains the data well and performs well in real-time out-of-sample forecasting. The dynamic Nelson-Siegel model is normally used to analyze and forecast interest rates of different maturities. The structure of oil futures resembles the structure of interest rates, and this motivates the use of the model for our purposes. The data set is vast, and the dynamic Nelson-Siegel model allows for a significant dimension reduction by introducing three factors. By performing a series of cross-section regressions we obtain time series for these factors, and we focus on modeling their joint distribution. Using a copula decomposition, we can set up a model for each factor individually along with a model for their dependence structure. Once a reasonable model for the factors has been specified, it can be used to forecast prices of futures contracts with different maturities. The outcome of this exercise is a class of models which describes the observed futures contracts well and forecasts better than conventional benchmarks. We carry out a real-time value-at-risk analysis and show that our class of models performs well.
    Keywords: Oil futures, Nelson-Siegel, Normal Inverse Gaussian, GARCH, Copula.
    JEL: G17 C32 C53
    Date: 2013–10–25
    URL: http://d.repec.org/n?u=RePEc:aah:create:2013-36&r=ecm
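    The Nelson-Siegel factor structure exploited in item 18 writes the curve at maturity \tau as (standard dynamic Nelson-Siegel notation)
        y_t(\tau) = \beta_{1t} + \beta_{2t} \frac{1 - e^{-\lambda \tau}}{\lambda \tau} + \beta_{3t} \left( \frac{1 - e^{-\lambda \tau}}{\lambda \tau} - e^{-\lambda \tau} \right),
    so that the cross-section of contracts at each date is compressed into three factors (level, slope, curvature) whose time series are then modelled jointly, here via marginal models tied together by a copula.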
  19. By: Nalan Basturk (Erasmus University Rotterdam Econometric Institute, Tinbergen Institute); Cem Cakmakli (University of Amsterdam Department of Quantitative Economics, Koç University); Pinar Ceyhan (Erasmus University Rotterdam Econometric Institute, Tinbergen Institute); Herman K. van Dijk (Erasmus University Rotterdam Econometric Institute, Tinbergen Institute, VU University Amsterdam Department of Econometrics)
    Abstract: Changing time series properties of US inflation and economic activity, measured as marginal costs, are modeled within a set of extended Phillips Curve (PC) models. It is shown that mechanical removal or modeling of simple low frequency movements in the data may yield poor predictive results which depend on the model specification used. Basic PC models are extended to include structural time series models that describe typical time-varying patterns in levels and volatilities. Forward- and backward-looking expectation components for inflation are incorporated and their relative importance is evaluated. Survey data on expected inflation are introduced to strengthen the information in the likelihood. Use is made of simulation-based Bayesian techniques for the empirical analysis. No credible evidence is found of endogeneity and long-run stability between inflation and marginal costs. Backward-looking inflation expectations appear stronger than forward-looking ones. Levels and volatilities of inflation are estimated more precisely using rich PC models. The extended PC structures compare favorably with existing basic Bayesian vector autoregressive and stochastic volatility models in terms of fit and prediction. Tails of the complete predictive distributions indicate an increase in the probability of deflation in recent years.
    Keywords: New Keynesian Phillips curve, unobserved components, time varying parameters, level shifts, inflation expectations, survey data
    JEL: C11 C32 E31 E37
    Date: 2013–11
    URL: http://d.repec.org/n?u=RePEc:koc:wpaper:1321&r=ecm
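    The basic Phillips Curve specification being extended in item 19 is of the hybrid New Keynesian form (a textbook version, given only to fix ideas)
        \pi_t = \gamma_b \pi_{t-1} + \gamma_f E_t \pi_{t+1} + \kappa\, mc_t + \varepsilon_t,
    with \pi_t inflation and mc_t real marginal costs; the paper adds unobserved-components structure for time-varying levels and volatilities and brings in survey data to inform the expectation term.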
  20. By: Richard Blundell (Institute for Fiscal Studies and University College London); Joel Horowitz (Institute for Fiscal Studies and Northwestern University); Matthias Parey (Institute for Fiscal Studies)
    Abstract: Economic theory rarely provides a parametric specification for a model, but it often provides shape restrictions. We consider nonparametric estimation of the heterogeneous demand for gasoline in the U.S. subject to the Slutsky inequality restriction of consumer choice theory. We derive conditions under which the demand function can be estimated consistently by nonparametric quantile regression subject to the Slutsky restriction. The estimated function reveals systematic variation in price responsiveness across the income distribution. A new method for estimating quantile instrumental variables models is also developed to allow for the endogeneity of prices. In our application, shape-constrained quantile IV estimates show similar patterns of demand as shape-constrained estimates under exogeneity. The results illustrate the improvements in the finite-sample performance of a nonparametric estimator that can be achieved by imposing shape restrictions based on economic theory.
    Date: 2013–10
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:54/13&r=ecm
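    For a single good with demand function g(p, y) in price p and income y, the Slutsky inequality imposed in item 20 is the shape restriction
        \frac{\partial g(p, y)}{\partial p} + g(p, y)\, \frac{\partial g(p, y)}{\partial y} \le 0,
    i.e. the compensated own-price effect is non-positive; the estimator constrains the nonparametric quantile regression fit to satisfy this inequality.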
  21. By: Yuta Kurose (Center for the Study of Finance and Insurance, Osaka University,); Yasuhiro Omori (Faculty of Economics, University of Tokyo)
    Abstract: A multivariate stochastic volatility model with dynamic equicorrelation and cross leverage effect is proposed and estimated. Using a Bayesian approach, an efficient Markov chain Monte Carlo algorithm is described where we use the multi-move sampler, which generates multiple latent variables simultaneously. Numerical examples are provided to show its sampling efficiency in comparison with the simple algorithm that generates one latent variable at a time given the other latent variables. Furthermore, the proposed model is applied to multivariate daily stock price index data. The empirical study shows that our novel model provides a substantial improvement in forecasting with respect to out-of-sample hedging performance.
    Date: 2013–11
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2013cf907&r=ecm
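    "Dynamic equicorrelation" in item 21 refers to a correlation matrix in which all pairs share one time-varying correlation, i.e. (in the notation of Engle and Kelly's DECO model, used here for illustration)
        R_t = (1 - \rho_t) I_p + \rho_t\, \iota \iota', \qquad \rho_t \in \left(-\tfrac{1}{p-1},\, 1\right),
    where I_p is the identity matrix and \iota a vector of ones; the paper embeds this structure in a stochastic volatility model with cross leverage and estimates it by MCMC with a multi-move sampler.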
  22. By: Mark Podolskij (Heidelberg University and CREATES); Nakahiro Yoshida (Graduate School of Mathematical Science)
    Abstract: This paper presents new results on the Edgeworth expansion for high frequency functionals of continuous diffusion processes. We derive asymptotic expansions for weighted functionals of the Brownian motion and apply them to provide the Edgeworth expansion for power variation of diffusion processes. Our methodology relies on martingale embedding, Malliavin calculus and stable central limit theorems for semimartingales. Finally, we demonstrate the density expansion for studentized statistics of power variations.
    Keywords: diffusion processes, Edgeworth expansion, high frequency observations, power variation.
    JEL: C10 C13 C14
    Date: 2013–10–21
    URL: http://d.repec.org/n?u=RePEc:aah:create:2013-33&r=ecm
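    The power variation statistics for which item 22 develops Edgeworth expansions are, in common high-frequency notation,
        V(X, p)_t^n = n^{p/2 - 1} \sum_{i=1}^{\lfloor n t \rfloor} \big| \Delta_i^n X \big|^p, \qquad \Delta_i^n X = X_{i/n} - X_{(i-1)/n},
    with p = 2 giving realized volatility; the expansions refine the usual stable central limit theorems for such functionals and can be used, for example, to sharpen studentized confidence intervals.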
  23. By: Kim, Chang-Jin; Kim, Jaeho
    Abstract: One goal of this paper is to develop an efficient Markov chain Monte Carlo (MCMC) algorithm for estimating an ARMA model with a regime-switching mean, based on a multi-move sampler. Unlike the existing algorithm of Billio et al. (1999) based on a single-move sampler, our algorithm can achieve reasonably fast convergence to the posterior distribution even when the latent regime indicator variable is highly persistent or when there exist absorbing states. Another goal is to investigate the dynamics of the latent ex-ante real interest rate (EARR) in the presence of structural breaks, employing the econometric tool developed here. We argue that Garcia and Perron's (1996) conclusion that the EARR is a constant subject to occasional jumps may be sample-specific. For an extended sample that includes recent data, Garcia and Perron's (1996) AR(2) model of the ex-post real interest rate (EPRR) may be misspecified, and we show that excluding the theory-implied moving-average terms may understate the persistence of the observed EPRR dynamics. Our empirical results suggest that, even though we rule out the possibility of a unit root in the EARR, it may be more persistent and volatile than has been documented in some of the literature, including Garcia and Perron (1996).
    Keywords: ARMA model with Regime Switching, Multi-move Sampler, Single-Move Sampler, Metropolis-Hastings Algorithm, Absorbing State, Ex-Ante Real Interest Rate.
    JEL: C11 E4
    Date: 2013–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:51117&r=ecm
  24. By: Lorenzo Frattarolo (Centre d'Economie de la Sorbonne and University Ca Foscari - Department of Economics); Dominique Guegan (Centre d'Economie de la Sorbonne - Paris School of Economics)
    Abstract: Conditional dependence is expressed as a projection map in the trivariate copula space. The projected copula, its sample counterpart and the related process are defined. The weak convergence of the projected copula process to a tight centered Gaussian Process is obtained under weak assumptions on copula derivatives.
    Keywords: Conditional independence, empirical process, weak convergence, copula.
    JEL: D81 C10 C40 C52
    Date: 2013–10
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:13068&r=ecm
  25. By: Wang, W. (Tilburg University)
    Abstract: This thesis first investigates various issues related to model averaging, and then evaluates two policies, namely the West Development Drive in China and fiscal decentralization in the U.S., using econometric tools. Chapter 2 proposes a hierarchical weighted least squares (HWALS) method to address multiple sources of uncertainty generated by model specification, estimation, and measurement choices. It examines the effects of different growth theories taking into account the measurement problem in the growth regression. Chapter 3 addresses the issue of prediction under model uncertainty, and proposes a weighted average least squares (WALS) prediction procedure that is not conditional on the selected model. Taking both model and error uncertainty into account, it also proposes an appropriate estimate of the variance of the WALS predictor. Chapter 4 focuses on the interplay among resource abundance, institutional quality, and economic growth in China, using two different measures of resource abundance. It employs a functional-coefficient model to capture the nonlinear interaction effect of institutional quality, and a panel-data time-varying coefficient model to describe the dynamic effect of natural resources. Chapter 5 considers a dark side of fiscal decentralization. It models and empirically tests a dress-up contest caused by fiscal decentralization, and shows that the dress-up contest can lead to a social welfare loss.
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:ner:tilbur:urn:nbn:nl:ui:12-5928130&r=ecm
  26. By: Andrew Gelman; Guido Imbens
    Abstract: The statistical and econometrics literature on causality is more focused on "effects of causes" than on "causes of effects." That is, in the standard approach it is natural to study the effect of a treatment, but it is not in general possible to define the causes of any particular outcome. This has led some researchers to dismiss the search for causes as "cocktail party chatter" that is outside the realm of science. We argue here that the search for causes can be understood within traditional statistical frameworks as a part of model checking and hypothesis generation. We argue that it can make sense to ask questions about the causes of effects, but the answers to these questions will be in terms of effects of causes.
    JEL: C01
    Date: 2013–11
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:19614&r=ecm
  27. By: Yoosoon Chang; Yongok Choi; Chang Sik Kim; Joon Y. Park; J. Isaac Miller (Department of Economics, University of Missouri-Columbia)
    Abstract: We introduce a panel model with a nonparametric functional coefficient of multiple arguments. The coefficient is a function both of time, allowing temporal changes in an otherwise linear model, and of the regressor itself, allowing nonlinearity. In contrast to a time series model, the effects of the two arguments can be identified using a panel model. We apply the model to the relationship between real GDP and electricity consumption. Our results suggest that the corresponding elasticities have decreased over time in developed countries, but that this decrease cannot be entirely explained by changes in GDP itself or by sectoral shifts.
    Keywords: semiparametric panel regression, partially linear functional coefficient model, elasticity of electricity demand
    JEL: C33 C51 C53 Q41
    Date: 2013–11–08
    URL: http://d.repec.org/n?u=RePEc:umc:wpaper:1320&r=ecm
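    A partially linear functional-coefficient panel specification consistent with the description in item 27 (a schematic form inferred from the abstract and keywords, not the authors' exact equation) would be
        y_{it} = \alpha_i + z_{it}' \gamma + \beta\!\left(\tfrac{t}{T},\, x_{it}\right) x_{it} + \varepsilon_{it},
    where the elasticity-type coefficient \beta(\cdot, \cdot) varies both with (rescaled) time and with the level of the regressor x_{it}; the panel dimension is what allows the two arguments of \beta to be separately identified.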
  28. By: Jarociński, Marek; Maćkowiak, Bartosz
    Abstract: A researcher is interested in a set of variables that he wants to model with a vector autoregression, but his dataset contains more variables. Which variables from the dataset should be included in the VAR, in addition to the variables of interest? This question arises in many applications of VARs, in prediction and impulse response analysis. We develop a Bayesian methodology to answer this question. We rely on the idea of Granger-causal-priority, related to the well-known concept of Granger-non-causality. The methodology is simple to use, because we provide closed-form expressions for the relevant posterior probabilities. Applying the methodology to the case where the variables of interest are output, the price level, and the short-term interest rate, we find remarkably similar results for the United States and the euro area.
    Keywords: Bayesian model choice, Granger-causal-priority, Granger-noncausality, structural vector autoregression, vector autoregression
    JEL: C32 C52 E32
    Date: 2013–10
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20131600&r=ecm
  29. By: Mittnik, Stefan
    Abstract: Empirical evidence suggests that asset returns correlate more strongly in bear markets than conventional correlation estimates imply. We propose a method for determining complete tail-correlation matrices based on Value-at-Risk (VaR) estimates. We demonstrate how to obtain more efficient tail-correlation estimates by use of overidentification strategies and how to guarantee positive semidefiniteness, a property required for valid risk aggregation and Markowitz-type portfolio optimization. An empirical application to a 30-asset universe illustrates the practical applicability and relevance of the approach in portfolio management.
    Keywords: Downside risk, Estimation efficiency, Portfolio optimization, Positive semidefiniteness, Solvency II, Value-at-Risk
    JEL: C1 G11
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:zbw:cfswop:201305&r=ecm
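    The basic identity behind VaR-implied correlations in item 29 can be sketched in a stylised two-asset version under an elliptical-type approximation (the paper works with full matrices and overidentification strategies): for portfolio weights w_1, w_2,
        VaR_P^2 \approx w_1^2 VaR_1^2 + w_2^2 VaR_2^2 + 2 \rho\, w_1 w_2\, VaR_1 VaR_2,
    so that an estimate of the portfolio VaR together with the component VaRs can be solved for an implied correlation \rho at the chosen tail probability; repeating this across asset pairs yields a tail-correlation matrix, which must then be adjusted to guarantee positive semidefiniteness.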

This nep-ecm issue is ©2013 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.