nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒12‒08
twenty-one papers chosen by
Sune Karlsson
Örebro universitet

  1. On Estimating Long-Run Effects in Models with Lagged Dependent Variables By W. Robert Reed; Min Zhu
  2. A New Regression-Based Tail Index Estimator: An Application to Exchange Rates By João Nicolau; Paulo M.M. Rodrigues
  3. Improving the Finite Sample Performance of Autoregression Estimators in Dynamic Factor Models: A Bootstrap Approach By Mototsugu Shintani; Zi-yi Guo
  4. Dynamic conditional score patent count panel data models By Szabolcs Blazsek; Álvaro Escribano
  5. Estimation of short dynamic panels in the presence of cross-sectional dependence and dynamic heterogeneity By Gilhooly, Robert; Weale, Martin; Wieladek, Tomasz
  6. Dynamic hierarchical models for monetary transmission By Paolo Giudici; Laura Parisi
  7. Parameter bias in an estimated DSGE model: does nonlinearity matter? By Yasuo Hirose; Takeki Sunakawa
  8. lCARE – localizing Conditional AutoRegressive Expectiles By Xiu Xu; Andrija Mihoci; Wolfgang Karl Härdle
  9. A Simple Estimator for Short Panels with Common Factors By Juodis, Arturas; Sarafidis, Vasilis
  10. Convergence of the risk for nonparametric IV quantile regression and nonparametric IV regression with full independence By Fabian Dunker
  11. Large Vector Autoregressions with Asymmetric Priors By Andrea Carriero; Todd E. Clark; Massimiliano Marcellino
  12. About the Categorization of Latent Variables in Hybrid Choice Models By Francisco J. Bahamonde-Birke; Juan de Dios Ortúzar
  13. Applying Flexible Parameter Restrictions in Markov-Switching Vector Autoregression Models By Andrew Binning; Junior Maih
  14. Simultaneous Edit-Imputation for Continuous Microdata By Hang J. Kim; Lawrence H. Cox; Alan F. Karr; Jerome P. Reiter; Quanli Wang
  15. Asymptotic Bias of OLS in the Presence of Reverse Causality By Basu, Deepankar
  16. Testing Alternative Multi-Factor Models By Soederlind, Paul
  17. Building a Structural Model: Parameterization and Structurality By M. Mouchart; R. Orsi
  18. Elliptical Multiple Output Quantile Regression and Convex Optimization By Marc Hallin; Miroslav Šiman
  19. Macro-Driven VaR Forecasts: From Very High to Very-Low Frequency Data By Yves Dominicy; Harry-Paul Vander Elst
  20. Measurement Errors and Monetary Policy: Then and Now By Amir-Ahmadi, Pooyan; Matthes, Christian; Wang, Mu-Chun
  21. Multivariate Moment Based Extreme Value Index Estimators By Matias Heikkila; Yves Dominicy; Sirkku Pauliina Ilmonen

  1. By: W. Robert Reed (University of Canterbury); Min Zhu
    Abstract: This note points out the hazards of estimating long-run effects from models with lagged dependent variables. We use Monte Carlo experiments to demonstrate that this practice often fails to produce reliable estimates. Biases can be substantial, sample ranges very wide, and hypothesis tests can be rendered useless in realistic data environments. There are three reasons for this poor performance. First, OLS estimates of the coefficient of a lagged dependent variable are downwardly biased in finite samples. Second, small biases in the estimate of the lagged dependent variable coefficient are magnified in the calculation of long-run effects. And third, and perhaps most importantly, the statistical distribution associated with estimates of the long-run propensity (LRP) is complicated, heavy-tailed, and difficult to use for hypothesis testing. While alternative procedures such as jackknifing and indirect inference address the first issue, associated estimates of long-run effects remain unreliable.
    Keywords: Hurwicz bias, Auto-Regressive Distributed-Lag models, ARDL, Dynamic Panel Data models, DPD, Anderson-Hsiao, Arellano-Bond, Difference GMM, System GMM, indirect inference, jackknifing, long-run impact, long-run propensity
    JEL: C22 C23
    Date: 2015–11–30
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:15/18&r=ecm
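The mechanism the abstract describes is easy to reproduce. Below is a minimal Monte Carlo sketch (illustrative Python, not the authors' code; the ARDL(1,0) design and all parameter values are arbitrary choices) showing that OLS underestimates the lagged-dependent-variable coefficient and that the long-run propensity LRP = beta/(1 - rho) magnifies that error:

```python
import numpy as np

def simulate_lrp(rho=0.8, beta=1.0, T=50, reps=2000, seed=0):
    """ARDL(1,0): y_t = rho*y_{t-1} + beta*x_t + e_t; true LRP = beta/(1-rho)."""
    rng = np.random.default_rng(seed)
    rho_hats, lrp_hats = [], []
    for _ in range(reps):
        x = rng.standard_normal(T)
        e = rng.standard_normal(T)
        y = np.zeros(T)
        for t in range(1, T):
            y[t] = rho * y[t - 1] + beta * x[t] + e[t]
        # OLS of y_t on an intercept, y_{t-1} and x_t
        X = np.column_stack([np.ones(T - 1), y[:-1], x[1:]])
        b, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
        rho_hats.append(b[1])
        lrp_hats.append(b[2] / (1.0 - b[1]))
    # The LRP distribution is heavy-tailed, so summarize it by the median
    return np.mean(rho_hats), np.median(lrp_hats)

rho_bar, lrp_med = simulate_lrp()
true_lrp = 1.0 / (1.0 - 0.8)
print(rho_bar, lrp_med, true_lrp)  # rho_bar below 0.8; lrp_med below 5
```

Reporting the median LRP rather than the mean already hints at the note's third point: individual draws with rho estimated near one produce explosive LRP values, so the estimator's distribution has very heavy tails.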
  2. By: João Nicolau; Paulo M.M. Rodrigues
    Abstract: In this paper, a new regression-based approach for the estimation of the tail index of heavy-tailed distributions is introduced. Compared to many procedures currently available in the literature, our method does not involve order statistics and can be applied in more general contexts than just the Pareto distribution. The procedure is in line with approaches used in experimental data analysis with fixed explanatory variables, and has several important features which are worth highlighting. First, it provides a bias reduction when compared to available regression-based methods and a fortiori over standard least-squares based estimators of the tail index. Second, it is more resilient to the choice of the tail length used in the estimation of the index than the widely used Hill estimator. Third, when the effect of the slowly varying function at infinity of the Pareto distribution (the so-called second-order behaviour of the Taylor expansion) vanishes slowly, our estimator continues to perform satisfactorily, whereas the Hill estimator rapidly deteriorates. Fourth, our estimator performs well under dependence of unknown form. For inference purposes, we also provide a way to compute the asymptotic variance of the proposed estimator under time dependence and conditional heteroscedasticity. An empirical application of the procedure to exchange rates is also provided.
    JEL: C16 C58
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:ptu:wpaper:w201514&r=ecm
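For reference, the Hill estimator that the abstract benchmarks against can be sketched in a few lines (a standard textbook version in Python; the authors' new regression-based estimator is specified in the paper itself, not here):

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the tail index alpha from the k largest observations."""
    x = np.sort(np.asarray(x, dtype=float))
    # Mean log-excess of the k largest points over the (k+1)-th largest
    gamma = np.mean(np.log(x[-k:]) - np.log(x[-k - 1]))  # extreme value index
    return 1.0 / gamma                                   # tail index alpha

rng = np.random.default_rng(1)
alpha_true = 3.0
sample = rng.pareto(alpha_true, size=100_000) + 1.0   # standard Pareto(alpha=3)
alpha_hat = hill_estimator(sample, k=2_000)
print(alpha_hat)   # roughly 3
```

The sensitivity to the tail length k that the abstract mentions is visible by rerunning the last call with different k on data that are only approximately Pareto in the tail.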
  3. By: Mototsugu Shintani (University of Tokyo and Vanderbilt University); Zi-yi Guo (Vanderbilt University)
    Abstract: We investigate the finite sample properties of the estimator of a persistence parameter of an unobservable common factor when the factor is estimated by the principal components method. When the number of cross-sectional observations is not sufficiently large, relative to the number of time series observations, the autoregressive coefficient estimator of a positively autocorrelated factor is biased downward and the bias becomes larger for a more persistent factor. Based on theoretical and simulation analyses, we show that bootstrap procedures are effective in reducing the bias, and bootstrap confidence intervals outperform naive asymptotic confidence intervals in terms of the coverage probability.
    Keywords: Bias Correction; Bootstrap; Dynamic Factor Model; Principal Components
    JEL: C1 C5
    Date: 2015–12–02
    URL: http://d.repec.org/n?u=RePEc:van:wpaper:vuecon-sub-15-00015&r=ecm
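The bias-reduction idea can be illustrated on a plain AR(1). The sketch below (hedged, illustrative Python; the paper applies this logic to a factor first extracted by principal components, which is not reproduced here) uses a residual bootstrap to estimate and subtract the downward bias:

```python
import numpy as np

def ols_ar1(y):
    """OLS slope of y_t on y_{t-1} (no intercept; series is mean zero)."""
    return np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])

def bias_corrected_ar1(y, B=500, seed=0):
    """Residual-bootstrap bias correction for the AR(1) coefficient."""
    rng = np.random.default_rng(seed)
    rho_hat = ols_ar1(y)
    resid = y[1:] - rho_hat * y[:-1]
    resid = resid - resid.mean()
    boot = np.empty(B)
    for b in range(B):
        e = rng.choice(resid, size=len(y))     # resample residuals
        ystar = np.empty(len(y))
        ystar[0] = y[0]
        for t in range(1, len(y)):
            ystar[t] = rho_hat * ystar[t - 1] + e[t]
        boot[b] = ols_ar1(ystar)
    return rho_hat, rho_hat - (boot.mean() - rho_hat)   # subtract estimated bias

rng = np.random.default_rng(2)
T, rho = 80, 0.9                 # short series, persistent process
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + rng.standard_normal()
raw, corrected = bias_corrected_ar1(y)
print(raw, corrected)   # corrected lies above the downward-biased raw estimate
```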
  4. By: Szabolcs Blazsek; Álvaro Escribano
    Abstract: We propose a new class of dynamic patent count panel data models that is based on dynamic conditional score (DCS) models. We estimate multiplicative and additive DCS models, MDCS and ADCS respectively, with quasi-ARMA (QARMA) dynamics, and compare them with the finite distributed lag, exponential feedback and linear feedback models. We use a large panel of 4,476 United States (US) firms for the period 1979 to 2000. Regarding statistical inference, we discuss the advantages and disadvantages of alternative estimation methods: the maximum likelihood estimator (MLE), the pooled negative binomial quasi-MLE (QMLE) and the generalized method of moments (GMM). For the count panel data models of this paper, the MLE assumption of strict exogeneity of the explanatory variables fails and GMM is not feasible. However, interesting results are obtained for the pooled negative binomial QMLE. The empirical evidence shows that the new class of MDCS models with QARMA dynamics outperforms all other models considered.
    Keywords: patent count panel data models, dynamic conditional score models, quasi-ARMA model, research and development, patent applications
    JEL: C33 C35 C51 C52 O3
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:we1510&r=ecm
  5. By: Gilhooly, Robert (Monetary Policy Committee Unit, Bank of England); Weale, Martin (Monetary Policy Committee Unit, Bank of England); Wieladek, Tomasz (Monetary Policy Committee Unit, Bank of England)
    Abstract: We propose a Bayesian approach to dynamic panel estimation in the presence of cross-sectional dependence and dynamic heterogeneity which is suitable for inference in short panels, unlike alternative estimators. Monte Carlo simulations indicate that our estimator produces less bias, and a lower root mean squared error, than existing estimators. The method is illustrated by estimating a panel VAR on sector level data for labour productivity and hours worked growth for Canada, Germany, France, Italy, the UK and the US from 1992 Q1 to 2011 Q3. We use historical decompositions to examine the determinants of recent output growth in each country. This exercise demonstrates that failure to take cross-sectional dependence into account leads to highly misleading results.
    Keywords: Bayesian dynamic panel estimator; dynamic heterogeneity; cross-sectional dependence; labour productivity.
    JEL: C11 C31 C33
    Date: 2015–12–01
    URL: http://d.repec.org/n?u=RePEc:mpc:wpaper:0038&r=ecm
  6. By: Paolo Giudici (Department of Economics and Management, University of Pavia); Laura Parisi (Department of Economics and Management, University of Pavia)
    Abstract: Monetary policies, either actual or perceived, cause changes in monetary interest rates. These changes impact the economy through financial institutions, which react to changes in the monetary rates with changes in their administered rates, on both deposits and lending. The dynamics of administered bank interest rates in response to changes in money market rates is thus essential to examine the impact of monetary policies on the economy. Chong et al. (2006) proposed an error correction model to study such impact, using data prior to the recent financial crisis. Parisi et al. (2015) analyzed the Chong error correction model, extended it, proposed an alternative one-equation model that is simpler to interpret, and applied it to the recent time period, characterized by close-to-zero monetary rates. In this paper we extend the previous models in a dynamic sense, modelling monetary transmission effects by means of dynamic linear models. The main contribution of this work consists in a novel methodology that provides a mechanism to identify the time dynamics of interest rates, linking them to monetary rates and to macroeconomic, country-specific variables. In addition, it introduces a predictive performance assessment methodology, which allows one to compare the proposed models on a fair basis. From an applied viewpoint, the paper applies the proposed models to interest rates on different loans, showing how the monetary policy and the specific situation of each country differently impact lending, not only across countries but also across time.
    Keywords: Forecasting Bank Interest Rates, Dynamic Time Series Models, Hierarchical Models
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:pav:demwpp:demwp0112&r=ecm
  7. By: Yasuo Hirose; Takeki Sunakawa
    Abstract: How can parameter estimates be biased in a dynamic stochastic general equilibrium model that omits nonlinearity in the economy? To answer this question, we simulate data from a fully nonlinear New Keynesian model with the zero lower bound constraint and estimate a linearized version of the model. Monte Carlo experiments show that significant biases are detected in the estimates of monetary policy parameters and the steady-state inflation and real interest rates. These biases arise mainly from neglecting the zero lower bound constraint rather than linearizing equilibrium conditions. With fixed parameters, the variance-covariance matrix and impulse response functions of observed variables implied by the linearized model substantially differ from those implied by its nonlinear counterpart. However, we find that the biased estimates of parameters in the estimated linear model can make most of the differences small.
    Keywords: Nonlinearity, Zero lower bound, DSGE model, Parameter bias, Bayesian estimation
    JEL: C32 E30 E52
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:een:camaaa:2015-46&r=ecm
  8. By: Xiu Xu; Andrija Mihoci; Wolfgang Karl Härdle
    Abstract: We account for time-varying parameters in the conditional expectile-based value at risk (EVaR) model. EVaR appears more sensitive to the magnitude of portfolio losses than the quantile-based Value at Risk (QVaR); nevertheless, by fitting the models over relatively long, ad-hoc fixed time intervals, existing research ignores potential time-varying parameter properties. Our work focuses on this issue by exploiting the local parametric approach to quantify tail risk dynamics. By achieving a balance between parameter variability and modelling bias, one can safely fit a parametric expectile model over a stable interval of homogeneity. Empirical evidence from three stock markets over 2005–2014 shows that the parameter homogeneity interval lengths account for approximately 1–6 months of daily observations. Our method outperforms models with one-year fixed intervals, as well as quantile-based candidates, when employing a time invariant portfolio protection (TIPP) strategy for the DAX portfolio. The tail risk measure implied by our model thus provides valuable insights for asset allocation and portfolio insurance.
    Keywords: expectiles, tail risk, local parametric approach, risk management
    JEL: C32 C51 G17
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2015-052&r=ecm
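For readers unfamiliar with expectiles: the tau-expectile underlying EVaR minimizes an asymmetrically weighted squared-error loss and can be computed by a simple reweighting iteration. The sketch below is a generic illustration of the expectile itself (not the paper's local parametric procedure), on simulated stand-in returns:

```python
import numpy as np

def expectile(x, tau=0.05, tol=1e-10, max_iter=1000):
    """tau-expectile: argmin_e of sum |tau - 1{x<=e}|*(x-e)^2, via reweighted means."""
    x = np.asarray(x, dtype=float)
    e = x.mean()
    for _ in range(max_iter):
        w = np.where(x <= e, 1.0 - tau, tau)   # asymmetric squared-error weights
        e_new = np.sum(w * x) / np.sum(w)      # first-order condition as a fixed point
        if abs(e_new - e) < tol:
            break
        e = e_new
    return e

rng = np.random.default_rng(3)
returns = rng.standard_normal(100_000)   # stand-in for daily returns
evar = expectile(returns, tau=0.05)      # lower-tail expectile (EVaR level)
print(evar)   # negative, but less extreme than the 5% quantile of about -1.64
```

Because the squared loss weights all losses below the expectile, not just their frequency, the expectile responds to the magnitude of tail losses, which is the sensitivity property the abstract refers to.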
  9. By: Juodis, Arturas; Sarafidis, Vasilis
    Abstract: There is by now a substantial theoretical literature on the estimation of short panel data models with common factors. Nevertheless, these advances appear to have remained largely unnoticed by empirical practitioners. A major reason may be that existing approaches are computationally burdensome and difficult to program. This paper puts forward a simple methodology for estimating panels with multiple factors based on the method of moments. The underlying idea involves substituting the unobserved factors with time-specific weighted averages of the variables included in the model. The estimation procedure is easy to implement because unobserved variables are replaced with observed data. Furthermore, since the model is effectively parameterized in a more parsimonious way, the resulting estimator can be asymptotically more efficient than existing ones. Notably, our methodology can easily accommodate observed common factors and unbalanced panels, both of which are important empirical scenarios. We apply our approach to a data set involving a large panel of 4,500 households in New South Wales (Australia), and estimate the price elasticity of urban water demand.
    Keywords: Dynamic Panel Data, Factor Model, Fixed T Consistency, Monte Carlo Simulation, Urban Water Management.
    JEL: C13 C15 C23
    Date: 2015–11–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:68164&r=ecm
  10. By: Fabian Dunker (Georg-August-University Göttingen)
    Abstract: In econometrics, some nonparametric instrumental regression models and nonparametric demand models with endogeneity lead to nonlinear integral equations with unknown integral kernels. We prove convergence rates of the risk for the iteratively regularized Newton method applied to these problems. Compared to related results, we rely on a weaker non-linearity condition and obtain stronger convergence results. We demonstrate by numerical simulations for a nonparametric IV regression problem with continuous instrument and regressor that the method produces better results than the standard method.
    Keywords: Nonparametric regression; instrumental variables; nonlinear inverse problems; iterative regularization
    JEL: C13 C14 C31 C36
    Date: 2015–12–03
    URL: http://d.repec.org/n?u=RePEc:got:gotcrc:192&r=ecm
  11. By: Andrea Carriero (Queen Mary University of London); Todd E. Clark (Federal Reserve Bank of Cleveland); Massimiliano Marcellino (Bocconi University, IGIER and CEPR)
    Abstract: We propose a new algorithm which allows easy estimation of Vector Autoregressions (VARs) featuring asymmetric priors and time-varying volatilities, even when the cross-sectional dimension of the system <i>N</i> is particularly large. The algorithm is based on a simple triangularisation which allows one to simulate the conditional mean coefficients of the VAR by drawing them equation by equation. This strategy reduces the computational complexity by a factor of <i>N<sup>2</sup></i> with respect to the existing algorithms routinely used in the literature and by practitioners. Importantly, this new algorithm can be easily obtained by modifying just one of the steps of the existing algorithms. We illustrate the benefits of the algorithm with numerical and empirical applications.
    Keywords: Bayesian VARs, Stochastic volatility, Large datasets, Forecasting, Impulse response functions
    JEL: C11 C13 C33 C53
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp759&r=ecm
  12. By: Francisco J. Bahamonde-Birke; Juan de Dios Ortúzar
    Abstract: Although hybrid choice models are fairly popular nowadays, the way in which different types of latent variables enter the utility function has not been extensively analysed. Latent variables accounting for attitudes resemble socioeconomic characteristics and, therefore, systematic taste variations and categorizations of the latent variables should be considered. Nevertheless, categorizing a latent variable is not an easy task, as these variables are not observed and consequently exhibit an intrinsic variability. Under these circumstances it is not possible to assign an individual to a specific group, but only to establish a probability with which an individual should be categorized in a given way. In this paper we explore different ways to categorize individuals based on latent characteristics, focusing on the categorization of latent variables. The main advantage of this approach (over latent classes, for instance) is a clear interpretation of the function used in the categorization process, as well as the ability to take exogenous information into account. Unfortunately, technical issues (associated with the simulation-based estimation technique) arise when attempting a direct categorization. We propose an alternative for the direct categorization of latent variables (based on an auxiliary variable) and conduct theoretical and empirical analyses (two case studies), contrasting this alternative with other approaches (the latent variable-latent class approach and latent classes with perceptual indicators). Based on this analysis, we conclude that direct categorization is the superior approach, as it offers a consistent treatment of the error term, in accordance with the underlying theories, and a better goodness-of-fit.
    Keywords: hybrid choice models, latent variables, latent classes, categorization
    JEL: C35 C50
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1527&r=ecm
  13. By: Andrew Binning (Norges Bank); Junior Maih (Norges Bank and BI Norwegian Business School)
    Abstract: We present a new method for imposing parameter restrictions in Markov-Switching Vector Autoregression (MS-VAR) models. Our method is more flexible than competing methodologies and easily handles a range of parameter restrictions over different equations, regimes and parameter types. We also expand the range of priors used in the MS-VAR literature. We demonstrate the versatility of our approach using three appropriate examples.
    Keywords: Parameter Restrictions, MS-VAR estimation, Block Exogeneity, Zero Restrictions, Bayesian estimation
    Date: 2015–12–01
    URL: http://d.repec.org/n?u=RePEc:bno:worpap:2015_17&r=ecm
  14. By: Hang J. Kim; Lawrence H. Cox; Alan F. Karr; Jerome P. Reiter; Quanli Wang
    Abstract: Many statistical organizations collect data that are expected to satisfy linear constraints; as examples, component variables should sum to total variables, and ratios of pairs of variables should be bounded by expert-specified constants. When reported data violate constraints, organizations identify and replace values potentially in error in a process known as edit-imputation. To date, most approaches separate the error localization and imputation steps, typically using optimization methods to identify the variables to change followed by hot deck imputation. We present an approach that fully integrates editing and imputation for continuous microdata under linear constraints. Our approach relies on a Bayesian hierarchical model that includes (i) a flexible joint probability model for the underlying true values of the data with support only on the set of values that satisfy all editing constraints, (ii) a model for latent indicators of the variables that are in error, and (iii) a model for the reported responses for variables in error. We illustrate the potential advantages of the Bayesian editing approach over existing approaches using simulation studies. We apply the model to edit faulty data from the 2007 U.S. Census of Manufactures. Supplementary materials for this article are available online.
    Keywords: Bayesian; Economic; Editing; Missing; Mixture; Survey
    Date: 2015–12
    URL: http://d.repec.org/n?u=RePEc:cen:wpaper:15-44&r=ecm
  15. By: Basu, Deepankar (Department of Economics, University of Massachusetts, Amherst)
    Abstract: In this paper, I derive an expression for the asymptotic bias in the OLS estimator of the partial effect of a regressor on the dependent variable when there is reverse causality and all variables in the model are covariance stationary. I show that the sign of the asymptotic bias depends only on the signs of the bi-directional causal effects.
    Keywords: Reverse causality, simultaneity bias.
    JEL: C10 C30
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:ums:papers:2015-18&r=ecm
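The result is straightforward to check numerically. In the hedged sketch below (illustrative Python; the structural coefficients and unit error variances are arbitrary choices, not from the paper), y = beta*x + u while x = gamma*y + v, and the OLS slope of y on x converges to (beta*s_v^2 + gamma*s_u^2)/(s_v^2 + gamma^2*s_u^2), which with both causal effects positive lies above beta:

```python
import numpy as np

def ols_slope(beta=0.5, gamma=0.4, n=200_000, seed=4):
    """y = beta*x + u and x = gamma*y + v: OLS of y on x is inconsistent."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(n)
    v = rng.standard_normal(n)
    d = 1.0 - beta * gamma            # solve the simultaneous system
    y = (beta * v + u) / d            # reduced form for y
    x = (v + gamma * u) / d           # reduced form for x
    return np.dot(x, y) / np.dot(x, x)   # OLS slope, no intercept

b = ols_slope()
plim = (0.5 + 0.4) / (1.0 + 0.4 ** 2)   # (beta + gamma)/(1 + gamma^2) with unit variances
print(b, plim)   # b close to plim of about 0.776, well above beta = 0.5
```

Flipping the sign of gamma flips the sign of the bias, consistent with the abstract's claim that the sign of the asymptotic bias depends only on the signs of the bi-directional effects.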
  16. By: Soederlind, Paul
    Abstract: A GMM-based system for two alternative linear factor models can be used to test if the pricing errors (the intercepts) differ, with a bootstrap approach to find the appropriate critical values in finite samples. As an illustration, the test is applied to the Fama-French model.
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:usg:sfwpfi:2015:24&r=ecm
  17. By: M. Mouchart; R. Orsi
    Abstract: A specific concept of structural model is used as a background for discussing the structurality of its parameterization. Conditions for a structural model to also be causal are examined. Difficulties and pitfalls arising from the parameterization are analyzed. In particular, pitfalls in considering alternative parameterizations of the same model are shown to have led to ungrounded conclusions in the literature. Discussions of observationally equivalent models related to different economic mechanisms are used to clarify the connection between an economically meaningful parameterization and an economically meaningful decomposition of a complex model. The design of economic policy is used to draw some practical implications of the proposed analysis.
    JEL: C10 C18 C50 C51 C54
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:bol:bodewp:wp1039&r=ecm
  18. By: Marc Hallin; Miroslav Šiman
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/221191&r=ecm
  19. By: Yves Dominicy; Harry-Paul Vander Elst
    Abstract: This paper studies in some detail the joint use of high-frequency data and economic variables to model financial returns and volatility. We extend the Realized LGARCH model by allowing for a time-varying intercept, which responds to changes in macroeconomic variables in a MIDAS framework and allows macroeconomic information to be included directly in the estimation and forecast procedure. Using more than 10 years of high-frequency transactions for 55 U.S. stocks, we argue that the combination of low-frequency exogenous economic indicators with high-frequency financial data improves our ability to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. We document that nominal corporate profits and term spreads generate accurate risk measure forecasts at horizons beyond two business weeks.
    Keywords: realized LGARCH; value-at-risk; density forecasts; realized measures of volatility
    JEL: C22 C53
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/220550&r=ecm
  20. By: Amir-Ahmadi, Pooyan (Goethe University Frankfurt); Matthes, Christian (Federal Reserve Bank of Richmond); Wang, Mu-Chun (University of Hamburg)
    Abstract: Should policymakers and applied macroeconomists worry about the difference between real-time and final data? We tackle this question by using a VAR with time-varying parameters and stochastic volatility to show that the distinction between real-time data and final data matters for the impact of monetary policy shocks: the impact on final data is substantially and systematically different (in particular, larger in magnitude for different measures of real activity) from the impact on real-time data. These differences have persisted over the last 40 years and should be taken into account when conducting or studying monetary policy.
    Keywords: real-time data; time-varying parameters; stochastic volatility; impulse responses
    Date: 2015–11–05
    URL: http://d.repec.org/n?u=RePEc:fip:fedrwp:15-13&r=ecm
  21. By: Matias Heikkila; Yves Dominicy; Sirkku Pauliina Ilmonen
    Abstract: Modeling extreme events is of paramount importance in various areas of science: biostatistics, climatology, finance, geology, and telecommunications, to name a few. Most of these application areas involve multivariate data. Estimation of the extreme value index plays a crucial role in modeling rare events. There is an affine invariant multivariate generalization of the well-known Hill estimator, the separating Hill estimator. However, the Hill estimator is only suitable for heavy-tailed distributions. As in the case of the separating multivariate Hill estimator, we consider estimation of the extreme value index under the assumption of multivariate ellipticity. We provide affine invariant multivariate generalizations of the moment estimator and the mixed moment estimator. These estimators are suitable for both light- and heavy-tailed distributions. Asymptotic properties of the new extreme value index estimators are derived under a multivariate elliptical distribution with known location and scatter. The effect of replacing the true location and scatter by estimates is examined in a thorough simulation study.
    Keywords: extreme value index; elliptical distribution; moment estimator; mixed moment estimator
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/220551&r=ecm
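The univariate moment estimator being generalized here is the Dekkers-Einmahl-de Haan estimator. A minimal sketch of its standard textbook form (illustrative Python; not the authors' affine invariant multivariate version):

```python
import numpy as np

def moment_estimator(x, k):
    """Dekkers-Einmahl-de Haan moment estimator of the extreme value index gamma."""
    x = np.sort(np.asarray(x, dtype=float))
    logs = np.log(x[-k:]) - np.log(x[-k - 1])   # log excesses over the (k+1)-th largest
    m1 = logs.mean()                            # Hill-type first log-moment
    m2 = np.mean(logs ** 2)                     # second log-moment
    return m1 + 1.0 - 0.5 / (1.0 - m1 ** 2 / m2)

rng = np.random.default_rng(5)
sample = rng.pareto(2.0, size=200_000) + 1.0    # Pareto(alpha=2): gamma = 1/2
gamma_hat = moment_estimator(sample, k=5_000)
print(gamma_hat)   # roughly 0.5
```

Unlike the Hill estimator, whose first log-moment term only applies to heavy tails (gamma > 0), the extra second-moment correction lets this estimator handle light-tailed cases (gamma <= 0) as well, which is the property the abstract highlights.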

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.