nep-ecm New Economics Papers
on Econometrics
Issue of 2016‒12‒18
twenty-two papers chosen by
Sune Karlsson
Örebro universitet

  1. IV and GMM Estimation and Testing of Multivariate Stochastic Unit Root Models By Offer Lieberman; Peter C.B. Phillips
  2. Non-parametric estimation of conditional densities: A new method By Otneim, Håkon; Tjøstheim, Dag
  3. Multidimensional Parameter Heterogeneity in Panel Data Models By Timothy Neal
  4. Macroeconomic Forecasting Using Penalized Regression Methods By Smeekes, Stephan; Wijler, Etiënne
  5. Testing part of a DSGE model by Indirect Inference By Minford, Patrick; Wickens, Michael; Xu, Yongdeng
  6. The Identification and Estimation of a Large Factor Model with Structural Instability By Badi H. Baltagi; Chihwa Kao; Fa Wang
  7. Estimation of the global regularity of a multifractional Brownian motion By Joachim Lebovits; Mark Podolskij
  8. Structural Inference from Reduced Forms with Many Instruments By Wayne Yuan Gao; Peter C.B. Phillips
  9. A large deviations approach to the statistics of extreme events By de Valk, Cees
  10. "Change Detection and the Causal Impact of the Yield Curve By Stan Hurn; Peter C. B. Phillips; Shu-Ping Shi
  11. Multivariate extreme value statistics for risk assessment By He, Yi
  12. Variance targeting estimation of the BEKK-X model By Thieu, Le Quyen
  13. Quantile methods for first-price auction: A signal approach By Nathalie Gimenes; Emmanuel Guerre
  14. Optimal Kernel Estimation of Spot Volatility of Stochastic Differential Equations By José E. Figueroa-López; Cheng Li
  15. When is Nonfundamentalness in SVARs A Real Problem? By Beaudry, Paul; Fève, Patrick; Guay, Alain; Portier, Franck
  16. A diagnostic criterion for approximate factor structure By Patrick Gagliardini; Elisa Ossola; Olivier Scaillet
  17. Testing for Symmetry in Weakly Dependent Time Series By Luke Hartigan
  18. Equation by equation estimation of the semi-diagonal BEKK model with covariates By Thieu, Le Quyen
  19. Stationary Points for Parametric Stochastic Frontier Models By William C. Horrace; Ian A. Wright
  20. Nonparametric Identification and Estimation of Productivity Distributions and Trade Costs By Quang Vuong; Ayse Pehlivan
  21. Predictability Hidden by Anomalous Observations By Lorenzo Camponovo; Olivier Scaillet; Fabio Trojani
  22. Adaptive models and heavy tails with an application to inflation forecasting By Davide Delle Monache; Ivan Petrella

  1. By: Offer Lieberman (Bar-Ilan University); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: Lieberman and Phillips (2016, Journal of Econometrics; LP) introduced a multivariate stochastic unit root (STUR) model, which allows for random, time-varying local departures from a unit root (UR) model and in which nonlinear least squares (NLLS) may be used for estimation and inference on the STUR coefficient. In a structural version of this model where the driver variables of the STUR coefficient are endogenous, the NLLS estimate of the STUR parameter is inconsistent, as are the corresponding estimates of the associated covariance parameters. This paper develops nonlinear instrumental variable (NLIV) and GMM estimators of the STUR parameter which conveniently address endogeneity. We derive the asymptotic distributions of the NLIV and GMM estimators and establish consistency under orthogonality and relevance conditions similar to those used in the linear model. An overidentification test and its asymptotic distribution are also developed. The results enable inference about structural STUR models and provide a mechanism for testing the local STUR model against a simple UR null, which complements the usual UR tests. Simulations reveal that the asymptotic approximations for the NLIV and GMM estimators of the STUR parameter, as well as for the test of overidentifying restrictions, perform well in small samples, and that the distribution of the NLIV estimator is heavily leptokurtic with a limit theory that has Cauchy-like tails. Comparisons of the STUR coefficient test with a standard UR coefficient test show that the one-sided UR test performs poorly relative to the one-sided STUR coefficient test as both the sample size and the departures from the null increase.
    Keywords: Autoregression, Diffusion, Similarity, Stochastic unit root, Time-varying coefficients
    JEL: C22
    Date: 2016–06
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2061&r=ecm
  2. By: Otneim, Håkon (Dept. of Business and Management Science, Norwegian School of Economics); Tjøstheim, Dag (Dept. of Mathematics, University of Bergen)
    Abstract: Let X = (X_1,...,X_p) be a stochastic vector with joint density function f_X(x), partitioned into sub-vectors X^(1) = (X_1,...,X_k) and X^(2) = (X_{k+1},...,X_p). A new method for estimating the conditional density function of X^(1) given X^(2) is presented. It is based on locally Gaussian approximations, but simplified in order to tackle the curse of dimensionality in multivariate applications, where both response and explanatory variables can be vectors. We compare our method to some available competitors; the error of approximation is shown to be small in a series of examples using real and simulated data, and the estimator is shown to be particularly robust against noise caused by independent variables. We also present examples of practical applications of our conditional density estimator in the analysis of time series. Typical values for k in our examples are 1 and 2, and we include simulation experiments with values of p up to 6. Large sample theory is established under a strong mixing condition.
    Keywords: Conditional density estimation; local likelihood; multivariate data; crossvalidation
    JEL: C13 C14 C22
    Date: 2016–12–07
    URL: http://d.repec.org/n?u=RePEc:hhs:nhhfms:2016_022&r=ecm
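    Illustrative sketch (Python): the classical kernel conditional density estimator, shown here for a scalar response and a scalar conditioning variable, is the natural benchmark for the locally Gaussian method in item 2 above; this is a generic textbook construction, not the authors' estimator, and the bandwidths and toy data are placeholders.

      import numpy as np

      def gaussian_kernel(u):
          # standard Gaussian kernel
          return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

      def conditional_density(y_grid, x0, Y, X, hy, hx):
          # f(y | X = x0) estimated as f_hat(y, x0) / f_hat(x0) with product kernels
          wx = gaussian_kernel((X - x0) / hx) / hx
          den = np.mean(wx)
          num = np.array([np.mean(gaussian_kernel((y - Y) / hy) / hy * wx) for y in y_grid])
          return num / den

      # toy example: Y depends linearly on X plus noise
      rng = np.random.default_rng(0)
      X = rng.normal(size=500)
      Y = X + 0.5 * rng.normal(size=500)
      density = conditional_density(np.linspace(-3, 3, 61), x0=1.0, Y=Y, X=X, hy=0.3, hx=0.3)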
  3. By: Timothy Neal (School of Economics, UNSW Business School, UNSW)
    Abstract: This article introduces an approach to estimation for static or dynamic panel data models that feature intercept and slope heterogeneity across individuals and over time. It estimates the coefficient for each individual observation as well as the average coefficient over the sample, and it allows for correlation between the heterogeneity and the regressors. Asymptotic theory establishes the consistency and asymptotic normality of the estimates as N and T jointly go to infinity. Finally, Monte Carlo simulations demonstrate that the estimator performs well in environments where fixed effects and mean group estimators are inconsistent and severely biased.
    Keywords: Panel Data, parameter heterogeneity, dynamic panels, estimation
    JEL: C13 C22 C23 C33
    Date: 2016–10
    URL: http://d.repec.org/n?u=RePEc:swe:wpaper:2016-15&r=ecm
  4. By: Smeekes, Stephan (QE / Econometrics); Wijler, Etiënne (QE / Econometrics)
    Abstract: We study the suitability of lasso-type penalized regression techniques when applied to macroeconomic forecasting with high-dimensional datasets. We consider the performance of the lasso-type methods when the true DGP is a factor model, contradicting the sparsity assumption underlying penalized regression methods. We also investigate how the methods handle unit roots and cointegration in the data. In an extensive simulation study we find that penalized regression methods are more robust to mis-specification than factor models estimated by principal components, even if the underlying DGP is a factor model. Furthermore, the penalized regression methods are demonstrated to deliver forecast improvements over traditional approaches when applied to non-stationary data containing cointegrated variables, despite a deterioration of the selective capabilities. Finally, we also consider an empirical application to a large macroeconomic U.S. dataset and demonstrate that, in line with our simulations, penalized regression methods attain the best forecast accuracy most frequently.
    Keywords: Forecasting, Lasso, Factor Models, High-Dimensional Data, Cointegration
    JEL: C22 C53 E17
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:unm:umagsb:2016039&r=ecm
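    Illustrative sketch (Python): a minimal lasso-based one-step-ahead forecast of the kind evaluated in item 4 above, using scikit-learn with cross-validated penalty selection; the data are simulated placeholders, and a serious exercise would use an expanding or rolling window rather than plain k-fold cross-validation.

      import numpy as np
      from sklearn.linear_model import LassoCV

      # simulated stand-in for a (T x N) panel of lagged macro predictors and a target series
      rng = np.random.default_rng(0)
      T, N = 200, 100
      X = rng.normal(size=(T, N))
      y = X[:, :3] @ np.array([0.5, -0.3, 0.2]) + 0.5 * rng.normal(size=T)

      # fit the lasso on all but the last observation, then forecast the final period
      model = LassoCV(cv=5).fit(X[:-1], y[:-1])
      forecast = model.predict(X[-1:])
      selected = np.flatnonzero(model.coef_)   # indices of predictors kept by the lasso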
  5. By: Minford, Patrick (Cardiff Business School); Wickens, Michael (Cardiff Business School); Xu, Yongdeng (Cardiff Business School)
    Abstract: We propose a new type of test. Its aim is to test subsets of the structural equations of a DSGE model. The test draws on the statistical inference for limited information models and the use of indirect inference to test DSGE models. Using Monte Carlo experiments on two subsets of equations of the Smets-Wouters model, we show that the test has accurate size and good power in small samples. In a test of the Smets-Wouters model on US Great Moderation data we reject the specification of the wage-price sector but not the expenditure sector, pointing to the first as the source of overall model rejection.
    Keywords: sub sectors of models, limited information, indirect inference, testing DSGE models equations, Monte Carlo, power, test size
    JEL: C12 C32 C52 E1
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:cdf:wpaper:2016/12&r=ecm
  6. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244); Chihwa Kao (Department of Economics, University of Connecticut); Fa Wang (School of Economics, Shanghai University of Finance and Economics)
    Abstract: This paper tackles the identification and estimation of a high-dimensional factor model with an unknown number of latent factors and a single break in the number of factors and/or the factor loadings occurring at an unknown common date. First, we propose a least squares estimator of the change point based on the second moments of the estimated pseudo factors and show that the estimation error of the proposed estimator is Op(1). We also show that the proposed estimator has some degree of robustness to misspecification of the number of pseudo factors. With the estimated change point plugged in, consistency of the estimated number of pre- and post-break factors and the convergence rate of the estimated pre- and post-break factor spaces are then established under fairly general assumptions. The finite sample performance of our estimators is investigated using Monte Carlo experiments.
    Keywords: High Dimensional Factor Model, Structural Change, Rate of Convergence, Number of Factors, Model Selection, Factor Space, Panel Data
    JEL: C13 C33
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:194&r=ecm
  7. By: Joachim Lebovits (University Paris 13); Mark Podolskij (Aarhus University and CREATES)
    Abstract: This paper presents a new estimator of the global regularity index of a multifractional Brownian motion. Our estimation method is based upon a ratio statistic, which compares the realized global quadratic variation of a multifractional Brownian motion at two different frequencies. We show that a logarithmic transformation of this statistic converges in probability to the minimum of the Hurst functional parameter, which is, under weak assumptions, identical to the global regularity index of the path.
    Keywords: consistency, Hurst parameter, multifractional Brownian motion, power variation
    JEL: C10 C13 C14
    Date: 2016–12–06
    URL: http://d.repec.org/n?u=RePEc:aah:create:2016-33&r=ecm
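    Illustrative sketch (Python): a basic change-of-frequency estimate of the Hurst/regularity index, built from realized quadratic variations at two sampling frequencies in the spirit of the ratio statistic described in item 7 above; this is a generic construction with ad hoc frequencies, not the paper's exact statistic or its asymptotic theory.

      import numpy as np

      def hurst_ratio_estimate(x):
          # realized quadratic variation at the full sampling frequency and at half that frequency
          qv_fine = np.sum(np.diff(x) ** 2)
          qv_coarse = np.sum(np.diff(x[::2]) ** 2)
          # for (multi)fractional Brownian motion the ratio behaves like 2**(2H - 1),
          # so a log transform recovers H (the smallest Hurst value along the path)
          return 0.5 * (np.log2(qv_coarse / qv_fine) + 1.0)

      # sanity check on standard Brownian motion, where H = 0.5
      rng = np.random.default_rng(0)
      bm = np.cumsum(rng.normal(scale=np.sqrt(1.0 / 10000), size=10000))
      print(hurst_ratio_estimate(bm))   # approximately 0.5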
  8. By: Wayne Yuan Gao (Yale University); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: This paper develops exact finite sample and asymptotic distributions for structural equation tests based on partially restricted reduced form estimates. Particular attention is given to models with large numbers of instruments, wherein the use of partially restricted reduced form estimates is shown to be especially advantageous in statistical testing even in cases of uniformly weak instruments and reduced forms. Comparisons are made with methods based on unrestricted reduced forms, and numerical computations showing finite sample performance of the tests are reported. Some new results are obtained on inequalities between noncentral chi-squared distributions with different degrees of freedom that assist in analytic power comparisons.
    Keywords: Exact distributions, Partial identification, Partially restricted reduced form, Structural inference, Unidentified structure, Weak reduced form
    JEL: C23 C32
    Date: 2016–10
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2062&r=ecm
  9. By: de Valk, Cees (Tilburg University, School of Economics and Management)
    Abstract: A large deviations approach to the statistics of extreme events addresses the statistical analysis of extreme events with very low probabilities: much smaller than 1/n, given a random sample of size n. In particular, it takes a close look at the regularity assumptions on the tail of the (univariate or multivariate) distribution function. The classical assumptions, cast in the form of limits of ratios of probabilities of extreme events, are not directly applicable in this setting. Therefore, additional assumptions are commonly imposed. Because these may be very restrictive, this thesis proposes an alternative regularity assumption, taking the form of asymptotic bounds on ratios of logarithms of probabilities of extreme events, i.e., a large deviation principle (LDP). In the univariate case, this tail LDP is equivalent to the log-Generalised Weibull (log-GW) tail limit, which generalises the Weibull tail limit and the classical Pareto tail limit, amongst others. Its application to the estimation of high quantiles is discussed. In the multivariate case, the tail LDP implies marginal log-GW tail limits together with a standardised tail LDP describing tail dependence. Its application to the estimation of very low probabilities of multivariate extreme events is discussed, and a connection is established to hidden regular variation (residual tail dependence) and similar models.
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:tiu:tiutis:117b3ba0-0e40-4277-b25e-d7fe711863a5&r=ecm
  10. By: Stan Hurn (Queensland University of Technology - School of Economics and Finance); Peter C. B. Phillips (Cowles Foundation, Yale University); Shu-Ping Shi (Macquarie University)
    Abstract: Causal relationships in econometrics are typically based on the concept of predictability and are established in terms of tests for Granger causality. These causal relationships are susceptible to change, especially during times of financial turbulence, making the real-time detection of instability an important practical issue. This paper develops a test for detecting changes in causal relationships based on a recursive rolling window, which is analogous to the procedure used in recent work on financial bubble detection. The limiting distribution of the test takes a simple form under the null hypothesis and is easy to implement in conditions of homoskedasticity, conditional heteroskedasticity and unconditional heteroskedasticity. Simulation experiments compare the efficacy of the proposed test with two other commonly used tests, the forward recursive and the rolling window tests. The results indicate that both the rolling and the recursive rolling approaches offer good finite sample performance in situations where there are one or two changes in the causal relationship over the sample period, although the performance of the rolling window algorithm seems to be the best. The testing strategies are illustrated in an empirical application that explores the causal impact of the slope of the yield curve on real economic activity in the United States over the period 1985–2013.
    Keywords: Causality, Forward recursion, Hypothesis testing, Output, Recursive rolling test, Rolling window, Yield curve
    JEL: C12 C15 C32 G17
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2058&r=ecm
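    Illustrative sketch (Python): repeated Granger-causality F-tests over a rolling window, which conveys the basic idea behind the change-detection procedures compared in item 10 above; the paper's recursive rolling test, its date-stamping, and its limit theory are considerably more involved than this naive loop, and the data below are simulated placeholders.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.stattools import grangercausalitytests

      def rolling_granger_pvalues(y, x, window=80, maxlag=4):
          # p-values of the F-test that x Granger-causes y, computed on rolling subsamples
          data = pd.DataFrame({"y": y, "x": x})
          pvals = []
          for end in range(window, len(data) + 1):
              sub = data.iloc[end - window:end].to_numpy()
              res = grangercausalitytests(sub, maxlag=maxlag, verbose=False)
              pvals.append(res[maxlag][0]["ssr_ftest"][1])
          return np.array(pvals)

      # toy data in which x starts to lead y halfway through the sample
      rng = np.random.default_rng(0)
      x = rng.normal(size=300)
      y = rng.normal(size=300)
      y[150:] += 0.8 * x[149:299]
      print(rolling_granger_pvalues(y, x).round(3))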
  11. By: He, Yi (Tilburg University, School of Economics and Management)
    Abstract: This dissertation consists of three essays about statistical estimation and inference methods concerning extremal events and tail risks. The first essay establishes a natural, semi-parametric estimation procedure for the multivariate half-space depth-based extreme quantile region in arbitrary dimensions. In contrast to the failure of fully non-parametric approaches due to the scarcity of extremal observations, the good finite-sample performance of our extreme estimator is clearly demonstrated in simulation studies. The second essay extends this method to various depth functions and, furthermore, establishes an asymptotic approximation theory for what we call the (directed) logarithmic distance between the estimated and true quantile regions. Confidence sets that asymptotically cover the quantile region or its complement, or both simultaneously, with a pre-specified probability can therefore be constructed under weak regular variation conditions. The finite-sample coverage probabilities of our asymptotic confidence sets are evaluated in a simulation study for the half-space depth and the projection depth. We use the procedures in both chapters for risk management by applying them to stock market returns. The third essay develops a statistical inference theory for a recently proposed tail risk measure. For regulators who are interested in monitoring these relative risks, as we call them, of individual banks, we provide a jackknife empirical likelihood inference procedure based on smoothed nonparametric estimation and a Wilks-type result. A simulation study and real-life data analysis show that the proposed relative risk measure is useful in monitoring systemic risk.
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:tiu:tiutis:119cc8b9-5198-41d6-a648-f72501cd4229&r=ecm
  12. By: Thieu, Le Quyen
    Abstract: This paper studies the BEKK model with exogenous variables (BEKK-X), which is intended to take into account the influence of explanatory variables on the conditional covariance of asset returns. Strong consistency and asymptotic normality of a variance targeting estimator (VTE) are proved. Monte Carlo experiments and an application to financial series illustrate the asymptotic results.
    Keywords: BEKK model augmented with exogenous variables, BEKK-X model, Variance targeting estimation (VTE)
    JEL: C13
    Date: 2016–08–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:75572&r=ecm
  13. By: Nathalie Gimenes; Emmanuel Guerre
    Abstract: This paper considers a quantile signal framework for first-price auctions. Under the independent private value paradigm, a key stability property is that a linear specification for the private value conditional quantile function generates a linear specification for the bid quantile function, from which the former can easily be identified. This applies in particular to standard quantile regression models, but also to more flexible additive sieve specifications, which are not affected by the curse of dimensionality. A combination of local polynomial and sieve methods allows the private value quantile function to be estimated at a fast optimal rate and for all quantile levels in [0,1] without boundary effects. The choice of the smoothing parameters is also discussed. Extensions to interdependent values, including bidder-specific variables, are also possible under some functional restrictions which tie the signal to the bidder covariate. The identification of this new model is established and some estimation methods are suggested.
    Keywords: First-price auction; independent private value; quantile regression; local polynomial estimation; sieve estimation; dimension reduction; boundary correction; interdependent values
    JEL: C14 L70
    Date: 2016–10–21
    URL: http://d.repec.org/n?u=RePEc:spa:wpaper:2016wpecon23&r=ecm
  14. By: José E. Figueroa-López; Cheng Li
    Abstract: Kernel estimation is one of the most widely used estimation methods in non-parametric statistics, with a wide range of applications, including spot volatility estimation of stochastic processes. The selection of bandwidth and kernel function is of great importance, especially for the finite sample settings commonly encountered in econometric applications. In the context of spot volatility estimation, most of the proposed selection methods are either largely heuristic or just formally stated without any feasible implementation. In this work, an objective method of bandwidth and kernel selection is proposed, under some mild conditions on the volatility, which not only cover classical Brownian motion driven dynamics but also some processes driven by long-memory fractional Brownian motions or other Gaussian processes. We characterize the leading order terms of the mean squared error, which are also corroborated by central limit theorems for the estimation error. As a byproduct, an approximate optimal bandwidth is then obtained in closed form. This result allows us to develop a feasible plug-in type bandwidth selection procedure, for which, as a sub-problem, we propose a new estimator of the volatility of volatility. The optimal selection of the kernel function is also discussed. For Brownian motion type volatilities, the optimal kernel function is proved to be the exponential kernel. For fractional Brownian motion type volatilities, numerical procedures to compute the optimal kernel are devised and, for the deterministic volatility case, explicit optimal kernel functions of different orders are derived. Simulation studies further confirm the good performance of the proposed methods.
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1612.04507&r=ecm
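    Illustrative sketch (Python): a plain kernel spot volatility estimator from high-frequency log prices, using the two-sided exponential kernel mentioned in item 14 above; the bandwidth here is a fixed placeholder, whereas the paper's contribution is precisely the data-driven choice of bandwidth and kernel.

      import numpy as np

      def spot_variance(times, prices, t0, bandwidth):
          # kernel-weighted sum of squared log returns around time t0
          returns_sq = np.diff(np.log(prices)) ** 2
          grid = times[:-1]
          weights = 0.5 * np.exp(-np.abs(grid - t0) / bandwidth) / bandwidth
          return np.sum(weights * returns_sq)

      # toy example: geometric Brownian motion with true spot variance 0.2**2 = 0.04 on [0, 1]
      rng = np.random.default_rng(0)
      n = 23400
      times = np.linspace(0.0, 1.0, n + 1)
      increments = 0.2 * rng.normal(scale=np.sqrt(1.0 / n), size=n)
      prices = 100.0 * np.exp(np.cumsum(np.r_[0.0, increments]))
      print(spot_variance(times, prices, t0=0.5, bandwidth=0.05))   # close to 0.04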
  15. By: Beaudry, Paul; Fève, Patrick; Guay, Alain; Portier, Franck
    Abstract: Identification of structural shocks can be subject to nonfundamentalness, as the econometrician may have an information set smaller than that of the economic agents. How serious is that problem from a quantitative point of view? In this work we propose a simple diagnostic statistic for the quantitative importance of nonfundamentalness in structural VARs. The diagnostic is of interest because nonfundamentalness is not an either/or question, but a quantitative issue which can be more or less severe. Using our preferred strategy for identifying news shocks, we find that nonfundamentalness is quantitatively unimportant and that news shocks continue to generate significant business cycle type fluctuations when we adjust the estimation procedure to take the potential nonfundamentalness issue into account.
    Keywords: Non-Fundamentalness, Business Cycles, SVARs, News.
    JEL: C32 E32
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:31229&r=ecm
  16. By: Patrick Gagliardini; Elisa Ossola; Olivier Scaillet
    Abstract: We build a simple diagnostic criterion for approximate factor structure in large cross-sectional equity datasets. Given a model for asset returns with observable factors, the criterion checks whether the error terms are weakly cross-sectionally correlated or share at least one unobservable common factor. It only requires computing the largest eigenvalue of the empirical cross-sectional covariance matrix of the residuals of a large unbalanced panel. A general version of this criterion allows us to determine the number of omitted common factors. The panel data model accommodates both time-invariant and time-varying factor structures. The theory applies to random coefficient panel models with interactive fixed effects under large cross-section and time-series dimensions. The empirical analysis runs on monthly and quarterly returns for about ten thousand US stocks from January 1968 to December 2011 for several time-invariant and time-varying specifications. For monthly returns, we can choose either time-invariant specifications with at least four financial factors or a scaled three-factor specification. For quarterly returns, we cannot select macroeconomic models without the market factor.
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1612.04990&r=ecm
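    Illustrative sketch (Python): the basic ingredient of the criterion in item 16 above, namely the largest eigenvalue of the cross-sectional covariance matrix of residuals from a regression of returns on observed factors; the actual criterion involves a specific penalisation and handles unbalanced panels, neither of which is reproduced in this toy balanced-panel version.

      import numpy as np

      def largest_residual_eigenvalue(returns, factors):
          # returns: (T x n) balanced panel of asset returns; factors: (T x K) observed factors
          T = returns.shape[0]
          X = np.column_stack([np.ones(T), factors])
          beta, *_ = np.linalg.lstsq(X, returns, rcond=None)
          resid = returns - X @ beta
          cov = resid.T @ resid / T
          return np.linalg.eigvalsh(cov)[-1]   # eigvalsh returns eigenvalues in ascending order

      # toy panel with one omitted factor driving the residuals
      rng = np.random.default_rng(0)
      T, n = 240, 50
      factors = rng.normal(size=(T, 3))
      omitted = rng.normal(size=T)
      returns = factors @ rng.normal(size=(3, n)) + np.outer(omitted, rng.normal(size=n)) + rng.normal(size=(T, n))
      print(largest_residual_eigenvalue(returns, factors))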
  17. By: Luke Hartigan (School of Economics, UNSW Business School, UNSW)
    Abstract: I propose a test of symmetry for a stationary time series based on the difference between the dispersion above the central tendency of the series and that below it. The test has many attractive features: it is applicable to dependent processes, it has a familiar form, it can be implemented using regression, and it has a standard Gaussian limiting distribution under the null of symmetry. The finite sample properties of the test are examined via Monte Carlo simulation and suggest that it is more powerful than competing tests in the literature for the DGPs considered. I apply the test to investigate business cycle asymmetry in sectoral data and confirm previous findings that asymmetry is more often detected in goods-producing sectors than in service-related sectors.
    Keywords: Symmetry; Weak dependence; Hypothesis testing; Monte Carlo simulation; Business cycle asymmetry
    JEL: C12 C15 C22 C52 E32
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:swe:wpaper:2016-18&r=ecm
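    Illustrative sketch (Python): one simple regression-based symmetry check in the spirit of item 17 above, in which signed squared deviations from the sample mean are regressed on a constant and the HAC (Newey-West) t-statistic is asymptotically standard normal under symmetry; this is a stylised stand-in rather than the paper's exact statistic.

      import numpy as np
      import statsmodels.api as sm

      def symmetry_tstat(y, maxlags=4):
          # positive contributions measure dispersion above the mean, negative ones below it
          dev = y - y.mean()
          signed_sq = np.sign(dev) * dev ** 2
          ols = sm.OLS(signed_sq, np.ones_like(signed_sq)).fit(cov_type="HAC", cov_kwds={"maxlags": maxlags})
          return float(ols.tvalues[0])

      # symmetric AR(1) versus a skewed series built from chi-squared draws
      rng = np.random.default_rng(0)
      e = rng.normal(size=1000)
      sym = np.zeros(1000)
      for t in range(1, 1000):
          sym[t] = 0.5 * sym[t - 1] + e[t]
      skewed = rng.chisquare(3, size=1000)
      print(symmetry_tstat(sym), symmetry_tstat(skewed))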
  18. By: Thieu, Le Quyen
    Abstract: This paper provides the asymptotic normality of the equation-by-equation estimator for semi-diagonal BEKK models augmented with exogenous variables. The results are obtained without assuming that the innovations are independent, which allows different additional explanatory variables to be included in the information set.
    Keywords: BEKK-X, Equation by equation estimation, exogenous variables, covariates, semi-diagonal BEKK-X
    JEL: C10
    Date: 2016–09–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:75582&r=ecm
  19. By: William C. Horrace (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244); Ian A. Wright (Department of Economics, Northeastern University)
    Abstract: The results of Waldman (1982) on the Normal-Half Normal stochastic frontier model are generalized using the theory of the Dirac delta (Dirac, 1930), and distribution-free conditions are established to ensure a stationary point in the likelihood as the variance of the inefficiency distribution goes to zero. Stability of the stationary point and "wrong skew" results are derived or simulated for common parametric assumptions on the model. Identification is discussed.
    Keywords: Stochastic Frontier; Stationary Point; Wrong Skew; Identification
    JEL: C21 D24
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:196&r=ecm
  20. By: Quang Vuong (New York University); Ayse Pehlivan (Bilkent University)
    Abstract: This paper studies the nonparametric identification and estimation of productivity distributions and trade costs in an Eaton and Kortum (2002) type model. Our identification and estimation strategy draws insights from the empirical auction literature; however, our methodology is novel since we face additional problems resulting from the nature of the trade data. Our methodology does not require data on prices, which are usually quite hard to obtain, and manages to identify the underlying structure by using simple disaggregated bilateral trade data consisting only of trade values and traded quantities. We recover destination-source-sector specific productivity distributions and trade costs nonparametrically. The fact that these productivity distributions and trade costs are both country and sector specific provides important insights not only about cross-country differences but also about differences across sectors. Moreover, it has become common in models of international trade to use either Fréchet or Pareto distributions to represent the distribution of productivities. They provide great analytical convenience; however, recent studies show that gains-from-trade estimates are very sensitive to these parametrizations. To quantify the welfare gains from trade and answer related policy questions, checking the validity of these parametrizations and analyzing how productivity distributions behave is very important.
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:red:sed016:1618&r=ecm
  21. By: Lorenzo Camponovo; Olivier Scaillet; Fabio Trojani
    Abstract: Testing procedures for predictive regressions with lagged autoregressive variables yield suboptimal inference in the presence of small violations of ideal assumptions. We propose a novel testing framework resistant to such violations, which is consistent with nearly integrated regressors and applicable to multi-predictor settings, when the data may only approximately follow a predictive regression model. The Monte Carlo evidence demonstrates large improvements from our approach, while the empirical analysis produces strong robust evidence of market return predictability hidden by anomalous observations, both in- and out-of-sample, using predictive variables such as the dividend yield or the volatility risk premium.
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1612.05072&r=ecm
  22. By: Davide Delle Monache (Bank of Italy); Ivan Petrella (WBS; CEPR)
    Abstract: This paper introduces an adaptive algorithm for time-varying autoregressive models in the presence of heavy tails. The evolution of the parameters is determined by the score of the conditional distribution; the resulting model is observation-driven and is estimated by classical methods. In particular, we consider time variation in both the coefficients and the volatility, emphasizing how the two interact with each other. Meaningful restrictions are imposed on the model parameters so as to attain local stationarity and bounded mean values. The model is applied to the analysis of inflation dynamics with the following results: allowing for heavy tails leads to significant improvements in terms of fit and forecast, and the adoption of the Student-t distribution proves to be crucial in order to obtain well calibrated density forecasts. These results are obtained using the US CPI inflation rate and are confirmed by other inflation indicators, as well as by CPI inflation for the other G7 countries.
    Keywords: adaptive algorithms, inflation, score-driven models, student-t, time-varying parameters.
    JEL: C22 C51 C53 E31
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:bbk:bbkcam:1603&r=ecm
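    Illustrative sketch (Python): a toy score-driven update for a time-varying level with Student-t errors, showing the outlier-downweighting mechanism behind observation-driven models of the kind described in item 22 above; the paper's model additionally features time-varying autoregressive coefficients and volatility, and all parameter values below are arbitrary placeholders.

      import numpy as np

      def score_driven_level(y, nu=5.0, sigma2=1.0, alpha=0.1, mu0=0.0):
          # the Student-t score weight shrinks toward zero for large deviations,
          # so outliers move the filtered level less than under Gaussian updating
          mu = np.empty(len(y) + 1)
          mu[0] = mu0
          for t, yt in enumerate(y):
              e = yt - mu[t]
              weight = (1.0 + 1.0 / nu) / (1.0 + e ** 2 / (nu * sigma2))
              mu[t + 1] = mu[t] + alpha * weight * e
          return mu[1:]

      # level shift plus a single large outlier
      rng = np.random.default_rng(0)
      y = np.r_[rng.normal(0, 1, 100), rng.normal(2, 1, 100)]
      y[50] = 15.0
      filtered = score_driven_level(y)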

This nep-ecm issue is ©2016 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.