nep-ecm New Economics Papers
on Econometrics
Issue of 2016‒01‒29
twelve papers chosen by
Sune Karlsson
Örebro universitet

  1. A Continuous Updating Weighted Least Squares Estimator of Tail Dependence in High Dimensions By Einmahl, John; Kiriliouk, A.; Segers, J.J.J.
  2. Maximum Likelihood Estimation of Time-Varying Loadings in High-Dimensional Factor Models By Jakob Guldbæk Mikkelsen; Eric Hillebrand; Giovanni Urga
  3. Decomposition of Time Series Data of Stock Markets and its Implications for Prediction: An Application for the Indian Auto Sector By Jaydip Sen; Tamal Datta Chaudhuri
  4. Forecasting with EC-VARMA models By Athanasopoulos, George; Poskitt, Don; Vahid, Farshid; Yao, Wenying
  5. Unit root inference for non-stationary linear processes driven by infinite variance innovations By Giuseppe Cavaliere; Iliyan Georgiev; Robert Taylor
  6. Dual regression By Richard H. Spady; Sami Stouli
  7. Monitoring Parameter Constancy with Endogenous Regressors By KUROZUMI, Eiji
  8. The varying coefficient Bayesian panel VAR model By Wieladek, Tomasz
  9. Portfolio Optimisation Under Flexible Dynamic Dependence Modelling By Mauro Bernardi; Leopoldo Catania
  10. Multiple-Output Quantile Regression By Marc Hallin; Miroslav Šiman
  11. Tests of the Co-integration Rank in VAR Models in the Presence of a Possible Break in Trend at an Unknown Point By Harris, David; Leybourne, Stephen J; Taylor, A M Robert
  12. Testing for Causality in Continuous Time Bayesian Network Models of High-Frequency Data By Jonas Hallgren; Timo Koski

  1. By: Einmahl, John (Tilburg University, Center For Economic Research); Kiriliouk, A.; Segers, J.J.J. (Tilburg University, Center For Economic Research)
    Abstract: Likelihood-based procedures are a common way to estimate tail dependence parameters. They are not applicable, however, in non-differentiable models such as those arising from recent max-linear structural equation models. Moreover, they can be hard to compute in higher dimensions. An adaptive weighted least-squares procedure matching nonparametric estimates of the stable tail dependence function with the corresponding values of a parametrically specified proposal yields a novel minimum-distance estimator. The estimator is easy to calculate and applies to a wide range of sampling schemes and tail dependence models. In large samples, it is asymptotically normal with an explicit and estimable covariance matrix. The minimum distance obtained forms the basis of a goodness-of-fit statistic whose asymptotic distribution is chi-square. Extensive Monte Carlo simulations confirm the excellent finite-sample performance of the estimator and demonstrate that it is a strong competitor to currently available methods. The estimator is then applied to disentangle sources of tail dependence in European stock markets.
    Keywords: Brown-Resnick process; extremal coefficient; max-linear model; multivariate extremes; stable tail dependence function
    JEL: C13 C14
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:tiu:tiucen:a3e7350b-4773-4bd8-9c3c-6bc485b83f4d&r=ecm
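    A minimal bivariate sketch of the matching idea described above, assuming a logistic (Gumbel) tail dependence model and plain unweighted least squares rather than the paper's continuous-updating weights; the simulated data, evaluation grid and tail fraction k are illustrative choices, not the paper's.

      import numpy as np
      from scipy.optimize import minimize_scalar
      from scipy.stats import rankdata

      rng = np.random.default_rng(0)
      n, k = 2000, 100                        # sample size and number of tail observations (assumed)
      x = rng.standard_t(df=3, size=(n, 2))   # placeholder data; replace with, e.g., stock returns
      x[:, 1] += 0.5 * x[:, 0]                # induce some cross-sectional dependence
      r1, r2 = rankdata(x[:, 0]), rankdata(x[:, 1])

      def l_emp(s, t):
          # empirical stable tail dependence function at (s, t)
          return np.mean((r1 > n + 0.5 - k * s) | (r2 > n + 0.5 - k * t)) * n / k

      def l_logistic(s, t, theta):
          # logistic (Gumbel) parametric model, theta in (0, 1]
          return (s ** (1.0 / theta) + t ** (1.0 / theta)) ** theta

      grid = [(s, t) for s in (0.2, 0.4, 0.6, 0.8, 1.0) for t in (0.2, 0.4, 0.6, 0.8, 1.0)]
      emp = np.array([l_emp(s, t) for s, t in grid])

      def objective(theta):
          par = np.array([l_logistic(s, t, theta) for s, t in grid])
          return np.sum((emp - par) ** 2)     # unweighted distance, not the CUWLS weighting

      res = minimize_scalar(objective, bounds=(0.05, 1.0), method="bounded")
      print("estimated dependence parameter theta:", res.x)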
  2. By: Jakob Guldbæk Mikkelsen (Aarhus University and CREATES); Eric Hillebrand (Aarhus University and CREATES); Giovanni Urga (Cass Business School)
    Abstract: In this paper, we develop a maximum likelihood estimator of time-varying loadings in high-dimensional factor models. We specify the loadings to evolve as stationary vector autoregressions (VAR) and show that consistent estimates of the loadings parameters can be obtained by a two-step maximum likelihood estimation procedure. In the first step, principal components are extracted from the data to form factor estimates. In the second step, the parameters of the loadings VARs are estimated as a set of univariate regression models with time-varying coefficients. We document the finite-sample properties of the maximum likelihood estimator through an extensive simulation study and illustrate the empirical relevance of the time-varying loadings structure using a large quarterly dataset for the US economy.
    Keywords: High-dimensional factor models, dynamic factor loadings, maximum likelihood, principal components
    JEL: C33 C55 C13
    Date: 2015–12–15
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-61&r=ecm
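    A minimal sketch of the two-step logic on simulated data: step one extracts principal-components factor estimates; step two is replaced here by a crude rolling-window OLS of each series on the factors to visualise loading variation, rather than the paper's maximum likelihood estimation of the loadings VARs. The dimensions and window length are arbitrary.

      import numpy as np

      rng = np.random.default_rng(1)
      T, N, r = 200, 50, 2                    # periods, series, factors (assumed)
      X = rng.standard_normal((T, N))         # placeholder panel; replace with a real dataset

      # Step 1: principal-components factor estimates from the standardised panel
      Xs = (X - X.mean(0)) / X.std(0)
      eigval, eigvec = np.linalg.eigh(Xs.T @ Xs / T)
      Lam0 = np.sqrt(N) * eigvec[:, -r:]      # static loading estimate (Lam0'Lam0 / N = I)
      F = Xs @ Lam0 / N                       # factor estimates, T x r

      # Step 2 (simplified): rolling-window OLS loadings as a proxy for time variation
      window = 40
      lam_path = np.empty((T - window + 1, N, r))
      for t in range(window, T + 1):
          Fw, Xw = F[t - window:t], Xs[t - window:t]
          lam_path[t - window] = np.linalg.lstsq(Fw, Xw, rcond=None)[0].T
      print(lam_path.shape)                   # rolling loading estimates per series and factor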
  3. By: Jaydip Sen; Tamal Datta Chaudhuri
    Abstract: With the rapid development and evolution of sophisticated algorithms for statistical analysis of time series data, the research community has started spending considerable effort on technical analysis of such data. Forecasting is also an area which has witnessed a paradigm shift in its approach. In this work, we have used the time series of the index values of the Auto sector in India during January 2010 to December 2015 for a deeper understanding of the behavior of its three constituent components, namely the trend, the seasonal component, and the random component. Based on this structural analysis, we have designed three approaches for forecasting and computed their accuracy in prediction using suitably chosen training and test data sets. The results clearly demonstrate the accuracy of our decomposition results and the efficiency of our forecasting techniques, even in the presence of a dominant random component in the time series.
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1601.02407&r=ecm
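    A minimal sketch of a classical trend/seasonal/random decomposition and one simple component-based forecast, assuming a simulated monthly index with 12-month seasonality and a 2010-2014 training window; neither the data nor the paper's three specific forecasting approaches are reproduced.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.seasonal import seasonal_decompose
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      # Placeholder monthly index for 2010-2015; replace with the actual auto-sector index
      idx = pd.date_range("2010-01-31", "2015-12-31", freq="M")
      rng = np.random.default_rng(2)
      y = pd.Series(100 + np.arange(len(idx))
                    + 5 * np.sin(2 * np.pi * np.arange(len(idx)) / 12)
                    + rng.normal(0, 2, len(idx)), index=idx)

      # Decompose into trend, seasonal and residual (random) components
      dec = seasonal_decompose(y, model="additive", period=12)
      print(dec.trend.dropna().tail(3), dec.seasonal.tail(3), dec.resid.dropna().tail(3), sep="\n")

      # One simple forecasting approach: Holt-Winters on the training window, tested on 2015
      train, test = y[:"2014-12-31"], y["2015-01-31":]
      fit = ExponentialSmoothing(train, trend="add", seasonal="add", seasonal_periods=12).fit()
      fcst = fit.forecast(len(test))
      print("MAPE (%):", float(np.mean(np.abs((test - fcst) / test)) * 100))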
  4. By: Athanasopoulos, George (Monash University); Poskitt, Don (Monash University); Vahid, Farshid (Monash University); Yao, Wenying (School of Business and Economics, University of Tasmania)
    Abstract: This article studies error correction vector autoregressive moving average (EC-VARMA) models. A complete procedure for identifying and estimating EC-VARMA models is proposed. The cointegrating rank is estimated in the first stage using an extension of the non-parametric method of Poskitt (2000). Then, the structure of the VARMA model for the variables in levels is identified using the scalar component model (SCM) methodology developed in Athanasopoulos and Vahid (2008), which leads to a uniquely identifiable VARMA model. In the last stage, the VARMA model is estimated in its error correction form. A Monte Carlo simulation is conducted using a 3-dimensional VARMA(1,1) DGP with cointegrating rank 1 in order to evaluate the forecasting performance of EC-VARMA models. The algorithm is illustrated further using an empirical example of the term structure of U.S. interest rates. The results reveal that the out-of-sample forecasts of the EC-VARMA model are superior to those produced by error correction vector autoregressions (VARs) of finite order, especially at short horizons.
    Keywords: cointegration, VARMA model, iterative OLS, scalar component model
    JEL: C1 C32 C53
    Date: 2014–02–22
    URL: http://d.repec.org/n?u=RePEc:tas:wpaper:17835&r=ecm
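    The SCM identification and EC-VARMA estimation stages are beyond a short sketch, so the code below only sets up the finite-order error-correction VAR (VECM) benchmark against which the paper's forecasts are compared, using statsmodels; the simulated data, lag order and trace-test rank selection are placeholders (the paper selects the rank non-parametrically).

      import numpy as np
      from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

      rng = np.random.default_rng(3)
      # Placeholder 3-variable cointegrated system; replace with, e.g., U.S. interest rates
      T = 300
      common = np.cumsum(rng.standard_normal(T))
      data = np.column_stack([common + rng.standard_normal(T) for _ in range(3)])

      # Select the cointegrating rank (here by Johansen's trace test)
      rank = select_coint_rank(data, det_order=0, k_ar_diff=2, method="trace", signif=0.05)
      print("selected cointegrating rank:", rank.rank)

      # Finite-order error-correction VAR benchmark and its out-of-sample forecasts
      res = VECM(data, k_ar_diff=2, coint_rank=rank.rank, deterministic="ci").fit()
      print(res.predict(steps=4))             # 4-step-ahead forecasts of the levels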
  5. By: Giuseppe Cavaliere (Università di Bologna); Iliyan Georgiev (Università di Bologna); Robert Taylor (University of Essex)
    Abstract: The contribution of this paper is two-fold. First, we derive the asymptotic null distribution of the familiar augmented Dickey-Fuller [ADF] statistics in the case where the shocks follow a linear process driven by infinite variance innovations. We show that these distributions are free of serial correlation nuisance parameters but depend on the tail index of the infinite variance process. These distributions are shown to coincide with the corresponding results for the case where the shocks follow a finite autoregression, provided the lag length in the ADF regression satisfies the same o(T^{1/3}) rate condition as is required in the finite variance case. In addition, we establish the rates of consistency and (where they exist) the asymptotic distributions of the ordinary least squares sieve estimates from the ADF regression. Given the dependence of their null distributions on the unknown tail index, our second contribution is to explore sieve wild bootstrap implementations of the ADF tests. Under the assumption of symmetry, we demonstrate the asymptotic validity (bootstrap consistency) of the wild bootstrap ADF tests. This is done by establishing that (conditional on the data) the wild bootstrap ADF statistics attain the same limiting distribution as that of the original ADF statistics taken conditional on the magnitude of the innovations.
    Keywords: Bootstrap, Unit roots, Sieve autoregression, Infinite variance, Time Series
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:bot:quadip:wpaper:130&r=ecm
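    A minimal univariate sketch of a wild bootstrap ADF test under symmetry: fit the ADF regression, impose the unit-root null, and regenerate the series with Rademacher-multiplied residuals. For brevity the bootstrap innovations are not recoloured through the estimated sieve filter, and the lag length and replication count are arbitrary, so this is a simplification of the paper's procedure rather than a faithful implementation.

      import numpy as np

      def adf_stat(y, p):
          # ADF t-statistic from regressing dy_t on a constant, y_{t-1} and p lagged differences
          dy = np.diff(y)
          cols = [y[p:-1]]                                             # y_{t-1}
          cols += [dy[p - j - 1:len(dy) - j - 1] for j in range(p)]    # dy_{t-1}, ..., dy_{t-p}
          X = np.column_stack([np.ones(len(y) - p - 1)] + cols)
          yy = dy[p:]
          beta, *_ = np.linalg.lstsq(X, yy, rcond=None)
          resid = yy - X @ beta
          s2 = resid @ resid / (len(yy) - X.shape[1])
          se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
          return beta[1] / se, resid

      rng = np.random.default_rng(4)
      y = np.cumsum(rng.standard_t(df=1.5, size=400))   # unit root with heavy-tailed shocks
      p, B = 4, 499
      stat, resid = adf_stat(y, p)

      # Wild bootstrap: rebuild the series under the unit-root null with sign-flipped residuals
      boot = np.empty(B)
      for b in range(B):
          eps = resid * rng.choice([-1.0, 1.0], size=len(resid))
          yb = np.concatenate([y[:p + 1], y[p] + np.cumsum(eps)])
          boot[b], _ = adf_stat(yb, p)

      pval = np.mean(boot <= stat)                      # left-tailed test
      print("ADF statistic:", round(stat, 3), "bootstrap p-value:", round(pval, 3))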
  6. By: Richard H. Spady; Sami Stouli
    Abstract: We propose an alternative (‘dual regression’) to the quantile regression process for the global estimation of conditional distribution functions under minimal assumptions. Dual regression provides all the interpretational power of the quantile regression process while largely avoiding the need for ‘rearrangement’ to repair the intersecting conditional quantile surfaces that quantile regression often produces in practice. Our approach relies on a mathematical programming characterization of conditional distribution functions which, in its simplest form, provides a simultaneous estimator of location and scale parameters in a linear heteroscedastic model. The statistical properties of this estimator are derived.
    Keywords: Conditional distribution, Stochastic representation, Duality, Convexity, Quantile regression, Heteroscedasticity, Method of moments, Mathematical programming, Monotone approximation.
    Date: 2016–01–13
    URL: http://d.repec.org/n?u=RePEc:bri:uobdis:16/669&r=ecm
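    The dual program itself is not reproduced here; as a rough illustration of estimating location and scale parameters simultaneously in a linear heteroscedastic model, the sketch below maximises a Gaussian likelihood with a log-linear scale. Both the likelihood and the exponential link (used to keep the scale positive) are assumptions of this toy version, not the authors' method.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(5)
      n = 1000
      X = np.column_stack([np.ones(n), rng.uniform(1, 3, n)])
      beta_true, gam_true = np.array([1.0, 2.0]), np.array([-1.0, 0.5])
      y = X @ beta_true + np.exp(X @ gam_true) * rng.standard_normal(n)

      def negloglik(theta):
          # joint Gaussian negative log-likelihood in location (beta) and log-scale (gamma)
          beta, gam = theta[:2], theta[2:]
          mu, log_s = X @ beta, X @ gam
          return np.sum(log_s + 0.5 * ((y - mu) / np.exp(log_s)) ** 2)

      res = minimize(negloglik, np.zeros(4), method="BFGS")
      print("location coefficients:", res.x[:2], "log-scale coefficients:", res.x[2:])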
  7. By: KUROZUMI, Eiji
    Abstract: This paper proposes monitoring tests for parameter change in linear regression models with endogenous regressors. We consider a CUSUM-type test based on instrumental variable (IV) estimation, as the IV method is standard for models with endogenous regressors. In addition, we propose a test based on the residuals from least squares (LS) estimation. We show that for a given boundary function, both tests have the same limiting distribution under the null hypothesis, whereas their powers differ. In particular, when a structural change occurs early in the monitoring period, the test based on the LS method tends to detect it more rapidly than that based on the IV method. We apply our methods to investigate the Japanese Phillips curve and show that the LS-based test performs well in detecting a change in 2007, while neither test finds evidence of a change after 2013.
    Keywords: structural change, CUSUM test, instrumental variable, Phillips curve
    JEL: C12 C22 C26
    Date: 2016–01–20
    URL: http://d.repec.org/n?u=RePEc:hit:econdp:2016-01&r=ecm
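    A stylised sketch of residual-CUSUM monitoring in the LS case: estimate the model on the historical sample, then track the scaled cumulative sum of post-sample residuals against a boundary function. The boundary constant below is a placeholder rather than a tabulated critical value, and the IV-based detector is omitted.

      import numpy as np

      rng = np.random.default_rng(6)
      T, m = 200, 200                         # historical and monitoring sample sizes (assumed)
      x = rng.standard_normal(T + m)
      slope = np.where(np.arange(T + m) < T + 50, 1.0, 1.5)   # break 50 periods into monitoring
      y = slope * x + rng.standard_normal(T + m)

      # Estimate on the historical sample by least squares
      X_hist = np.column_stack([np.ones(T), x[:T]])
      b_hat, *_ = np.linalg.lstsq(X_hist, y[:T], rcond=None)
      sigma = np.std(y[:T] - X_hist @ b_hat, ddof=2)

      # Monitoring: scaled CUSUM of post-sample LS residuals against a linear boundary
      resid_mon = y[T:] - (b_hat[0] + b_hat[1] * x[T:])
      detector = np.abs(np.cumsum(resid_mon)) / (sigma * np.sqrt(T))
      k = np.arange(1, m + 1)
      boundary = 2.0 * (1 + k / T)            # placeholder boundary constant, not a critical value
      alarm = int(np.argmax(detector > boundary)) if np.any(detector > boundary) else None
      print("first boundary crossing at monitoring period:", alarm)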
  8. By: Wieladek, Tomasz (Bank of England)
    Abstract: Interacted panel VAR (IPVAR) models allow coefficients to vary as a deterministic function of observable country characteristics. The varying coefficient Bayesian panel VAR generalises this to the stochastic case. As an application of this framework, I examine whether the impact of commodity price shocks on consumption and the CPI varies with the degree of exchange rate, financial, product and labour market liberalisation on data from 1976 Q1 to 2006 Q4 for 18 OECD countries. The confidence bands are smaller in the deterministic case and, as a result, most of the characteristics affect the transmission mechanism in a statistically significant way. In the stochastic case, however, only financial liberalisation is an important determinant of the transmission of commodity price shocks. This suggests that results from IPVAR models should be interpreted with caution.
    Keywords: Bayesian panel VAR; commodity price shocks
    JEL: C33 E30
    Date: 2016–01–08
    URL: http://d.repec.org/n?u=RePEc:boe:boeewp:0578&r=ecm
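    The contrast between the interacted (deterministic) and the varying-coefficient (stochastic) specification can be seen in a single-equation toy panel, where the slope depends on an observable country characteristic with or without a country-specific random deviation. The estimation below is plain OLS on an interaction term, not the paper's Bayesian panel VAR, and all numbers are simulated.

      import numpy as np

      rng = np.random.default_rng(10)
      G, T = 18, 120                          # countries and quarters (assumed)
      z = rng.uniform(0, 1, G)                # observable characteristic, e.g. liberalisation index
      x = rng.standard_normal((G, T))         # a single "shock" regressor per country

      # Deterministic (interacted) slope: b_i = b0 + b1 * z_i
      # Stochastic (varying-coefficient) slope adds a country-specific deviation eta_i
      b_det = 0.5 + 1.0 * z
      b_sto = b_det + rng.normal(0, 0.3, G)
      y = b_sto[:, None] * x + rng.standard_normal((G, T))

      # OLS on the interaction recovers (b0, b1) but pushes eta_i into the error term;
      # the stochastic specification instead models that deviation explicitly
      X = np.column_stack([x.ravel(), (z[:, None] * x).ravel()])
      coef, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
      print("estimated (b0, b1):", coef)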
  9. By: Mauro Bernardi; Leopoldo Catania
    Abstract: Signals coming from multivariate higher-order conditional moments, as well as the information contained in exogenous covariates, can be effectively exploited by rational investors to allocate their wealth among different risky investment opportunities. This paper proposes a new, flexible dynamic copula model able to explain and forecast the time-varying shape of large-dimensional asset return distributions. Moreover, we let the univariate marginal distributions be driven by an updating mechanism based on the scaled score of the conditional distribution. This framework allows us to introduce time variation in up to the fourth moment of the conditional distribution. The time-varying dependence pattern is then modelled as a function of a latent Markov switching process, also allowing for the inclusion of exogenous covariates in the dynamic updating equation. We show empirically that the proposed model substantially improves the optimal portfolio allocation of rational investors maximising their expected utility.
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1601.05199&r=ecm
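    A univariate Gaussian score-driven (GAS) volatility recursion illustrating the scaled-score updating used for the marginals; the copula, higher-moment dynamics, Markov-switching dependence and covariates of the full model are not attempted, and the parameter values are fixed rather than estimated.

      import numpy as np

      rng = np.random.default_rng(7)
      T = 1000
      ret = rng.standard_normal(T) * np.where(np.arange(T) < 500, 1.0, 2.0)   # toy returns

      # Gaussian GAS(1,1) for the log-variance f_t: f_{t+1} = omega + beta * f_t + alpha * s_t,
      # where s_t = ret_t^2 * exp(-f_t) - 1 is the score scaled by the inverse information
      omega, alpha, beta = 0.0, 0.10, 0.95    # illustrative parameters (assumed)
      f = np.empty(T + 1)
      f[0] = np.log(np.var(ret))
      for t in range(T):
          s = ret[t] ** 2 * np.exp(-f[t]) - 1.0
          f[t + 1] = omega + beta * f[t] + alpha * s

      vol = np.exp(0.5 * f[:T])
      print("mean filtered volatility, first vs second half:",
            float(vol[:500].mean()), float(vol[500:].mean()))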
  10. By: Marc Hallin; Miroslav Šiman
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/224753&r=ecm
  11. By: Harris, David; Leybourne, Stephen J; Taylor, A M Robert
    Abstract: In this paper we consider the problem of testing for the co-integration rank of a vector autoregressive process in the case where a trend break may potentially be present in the data. It is known that un-modelled trend breaks can result in tests which are incorrectly sized under the null hypothesis and inconsistent under the alternative hypothesis. Extant procedures in this literature have attempted to solve this inference problem but require the practitioner either to assume that the trend break date is known or to assume that any trend break cannot occur under the co-integration rank null hypothesis being tested. These procedures also assume that the autoregressive lag length is known to the practitioner. All of these assumptions would seem unreasonable in practice. Moreover, in each of these strands of the literature there is also a presumption in calculating the tests that a trend break is known to have happened. This can lead to a substantial loss in finite sample power in the case where a trend break does not in fact occur. Using information criterion based methods both to select the autoregressive lag order and to choose between the trend break and no trend break models, with the break date in the former based on a consistent estimate of the break fraction, we develop a number of procedures which deliver asymptotically correctly sized and consistent tests of the co-integration rank regardless of whether a trend break is present in the data or not. By selecting the no-break model when no trend break is present, these procedures also avoid the potentially large power losses associated with the extant procedures in such cases.
    Keywords: Co-integration rank; vector autoregression; error-correction model; trend break; break point estimation; information criteria
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:esy:uefcwp:15847&r=ecm
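    The rank tests themselves are beyond a short sketch, but the break-handling step, estimating the break date by minimising the residual sum of squares and then letting an information criterion choose between the trend-break and no-break deterministic specifications, can be illustrated on a single simulated series; the trimming region and the use of BIC are illustrative choices.

      import numpy as np

      rng = np.random.default_rng(8)
      T = 200
      t = np.arange(1, T + 1)
      y = 0.05 * t + 0.10 * np.maximum(t - 120, 0) + 0.5 * np.cumsum(rng.standard_normal(T))

      def bic(y, X):
          u = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
          return T * np.log(u @ u / T) + X.shape[1] * np.log(T)

      # No-break model: constant and linear trend
      bic_nobreak = bic(y, np.column_stack([np.ones(T), t]))

      # Break model: estimate the break date by minimising the SSR over trimmed candidates
      best = None
      for tau in range(int(0.15 * T), int(0.85 * T)):
          Xb = np.column_stack([np.ones(T), t, np.maximum(t - tau, 0)])
          u = y - Xb @ np.linalg.lstsq(Xb, y, rcond=None)[0]
          if best is None or u @ u < best[0]:
              best = (u @ u, tau, Xb)
      bic_break = bic(y, best[2])

      print("estimated break date:", best[1],
            "| BIC selects:", "break model" if bic_break < bic_nobreak else "no-break model")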
  12. By: Jonas Hallgren; Timo Koski
    Abstract: Continuous time Bayesian networks are investigated with a special focus on their ability to express causality. A framework is presented for doing inference in these networks. The central contributions are a representation of the intensity matrices for the networks and the introduction of a causality measure. A new model for high-frequency financial data is presented. It is calibrated to market data and, according to the new causality measure, it performs better than older models.
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1601.06651&r=ecm
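    The building block of a continuous time Bayesian network is an intensity (rate) matrix governing each node's transitions. The sketch below simulates a single two-state node from an assumed intensity matrix and recovers the rates from the simulated path; the paper's causality measure and calibration to market data are not attempted.

      import numpy as np

      rng = np.random.default_rng(9)
      Q = np.array([[-0.5, 0.5],              # assumed intensity matrix: rows sum to zero,
                    [2.0, -2.0]])             # off-diagonal entries are transition rates

      # Simulate the two-state continuous-time Markov chain up to a fixed horizon
      horizon, state, t = 500.0, 0, 0.0
      hold_time, jumps = np.zeros(2), np.zeros(2)
      while t < horizon:
          dwell = rng.exponential(1.0 / -Q[state, state])
          hold_time[state] += min(dwell, horizon - t)
          t += dwell
          if t < horizon:
              jumps[state] += 1
              state = 1 - state               # with two states, always jump to the other one

      # Estimated rates: departures divided by total holding time in each state
      print("estimated rates:", jumps / hold_time, "true rates:", [0.5, 2.0])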

This nep-ecm issue is ©2016 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.