on Econometric Time Series
By: | Eric Eisenstat (University of Queensland); Joshua C.C. Chan (University of Technology Sydney); Rodney W. Strachan (University of Queensland) |
Abstract: | This paper proposes a new approach to estimating high dimensional time varying parameter structural vector autoregressive models (TVP-SVARs) by taking advantage of an empirical feature of TVP-(S)VARs. TVP-(S)VAR models are rarely used with more than 4-5 variables. However, recent work has shown the advantages of modelling VARs with large numbers of variables, and interest has naturally increased in modelling large dimensional TVP-VARs. A feature that has not yet been utilized is that the covariance matrix for the state equation, when estimated freely, is often near singular. We propose a specification that uses this singularity to develop a factor-like structure to estimate a TVP-SVAR for 15 variables. Using a generalization of the re-centering approach, a rank-reduced state covariance matrix and judicious parameter expansions, we obtain efficient and simple computation of a high dimensional TVP-SVAR. An advantage of our approach is that we retain a formal inferential framework, so that we can conduct formal inference on impulse responses, variance decompositions and, important for our model, the rank of the state equation covariance matrix. We show clear empirical evidence in favour of our model and improvements in estimates of impulse responses. |
Keywords: | Large VAR; time varying parameter; reduced rank covariance matrix |
JEL: | C11 C22 E31 |
Date: | 2018–03–16 |
URL: | http://d.repec.org/n?u=RePEc:uts:ecowps:43&r=ets |
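A minimal numpy sketch of the kind of rank-reduced state equation the abstract above describes: the VAR coefficients follow a random walk driven by only r factors, so the state covariance AA' is singular. The dimensions, loading matrix and scales below are illustrative assumptions, not the authors' specification.

    import numpy as np

    rng = np.random.default_rng(0)
    n, r, T = 60, 3, 200                        # many VAR coefficients, few driving factors
    A = rng.normal(scale=0.05, size=(n, r))     # hypothetical n x r loading matrix

    beta = np.zeros((T, n))
    for t in range(1, T):
        eta = rng.normal(size=r)                # r-dimensional factor innovations
        beta[t] = beta[t - 1] + A @ eta         # time-varying coefficients with a factor-like structure

    state_cov = A @ A.T                         # implied state covariance
    print(np.linalg.matrix_rank(state_cov))     # rank 3, i.e. singular in the n = 60 dimensional state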
By: | Barassi, Marco; Horvath, Lajos; Zhao, Yuqian |
Abstract: | We propose semi-parametric CUSUM tests to detect a change point in the correlation structures of non-linear multivariate models with dynamically evolving volatilities. The asymptotic distributions of the proposed statistics are derived under mild conditions. We discuss the applicability of our method to the most commonly used models, including constant conditional correlation (CCC), dynamic conditional correlation (DCC), BEKK, corrected DCC and factor models. Our simulations show that our tests have good size and power properties. Also, even though the near-unit root property distorts the size and power of the tests, de-volatilizing the data by means of appropriate multivariate volatility models can correct such distortions. We apply the semi-parametric CUSUM tests in an attempt to date the occurrence of financial contagion from the U.S. to emerging markets worldwide during the Great Recession. |
Keywords: | Change point detection, Time varying correlation structure, Volatility processes, Monte Carlo simulation, Contagion effect |
JEL: | C12 C14 C32 G10 G15 |
Date: | 2018–07–11 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:87837&r=ets |
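A rough illustration of the CUSUM idea applied to the correlation of two de-volatilized series, assuming simulated data and a simple product-moment statistic; the paper's semi-parametric statistics and asymptotic critical values are not reproduced here.

    import numpy as np

    def cusum_corr_break(x, y):
        """Locate the largest shift in the running correlation of two
        (already de-volatilized) series via a crude CUSUM-style scan."""
        z = (x - x.mean()) / x.std() * (y - y.mean()) / y.std()   # correlation contributions
        csum = np.cumsum(z - z.mean())
        k = int(np.argmax(np.abs(csum)))
        return k, np.abs(csum[k]) / np.sqrt(len(x))

    rng = np.random.default_rng(1)
    e = rng.normal(size=(300, 2))
    x = e[:, 0]
    rho = np.where(np.arange(300) < 150, 0.1, 0.8)                # correlation jumps at t = 150
    y = rho * x + np.sqrt(1 - rho ** 2) * e[:, 1]
    print(cusum_corr_break(x, y))                                 # estimated break should fall near t = 150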
By: | Costantini, Mauro (Department of Economics and Finance, Brunel University, London); Kunst, Robert M. (Institute for Advanced Studies, Vienna, and University of Vienna) |
Abstract: | Comparative ex-ante prediction experiments over expanding subsamples are a popular tool for selecting the best forecasting model class in finite samples of practical relevance. Flanking such a horse race with predictive-accuracy tests, such as the test by Diebold and Mariano (1995), tends to increase support for the simpler structure. We are concerned with the question of whether such simplicity boosting actually benefits predictive accuracy in finite samples. We consider two variants of the DM test, one with naive normal critical values and one with bootstrapped critical values; the predictive-ability test by Giacomini and White (2006), which remains valid in nested problems; the F test by Clark and McCracken (2001); and model selection via the AIC as a benchmark strategy. Our Monte Carlo simulations focus on basic univariate time-series specifications, such as linear (ARMA) and nonlinear (SETAR) generating processes. |
Keywords: | Forecasting, time series, predictive accuracy, model selection |
JEL: | C22 C52 C53 |
Date: | 2018–07 |
URL: | http://d.repec.org/n?u=RePEc:ihs:ihsesp:341&r=ets |
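For reference, a bare-bones version of the Diebold-Mariano statistic with naive normal critical values, one of the variants the abstract above compares; the forecast errors are simulated, and no HAC correction is applied, so the sketch only covers one-step-ahead forecasts.

    import numpy as np
    from scipy import stats

    def diebold_mariano(e1, e2):
        """DM statistic for equal squared-error loss at horizon h = 1."""
        d = e1 ** 2 - e2 ** 2                             # loss differential
        dm = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
        return dm, 2 * (1 - stats.norm.cdf(abs(dm)))      # naive normal p-value

    rng = np.random.default_rng(2)
    e_simple = rng.normal(scale=1.0, size=200)            # errors from a simpler model
    e_rich = rng.normal(scale=1.1, size=200)              # errors from a richer model
    print(diebold_mariano(e_simple, e_rich))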
By: | Puwasala Gamakumara; Anastasios Panagiotelis; George Athanasopoulos; Rob J Hyndman |
Abstract: | Forecast reconciliation involves adjusting forecasts to ensure coherence with aggregation constraints. We extend this concept from point forecasts to probabilistic forecasts by redefining forecast reconciliation in terms of linear functions in general, and projections more specifically. New theorems establish that the true predictive distribution can be recovered in the elliptical case by linear reconciliation, and general conditions are derived for when this is a projection. A geometric interpretation is also used to prove two new theoretical results for point forecasting; that reconciliation via projection both preserves unbiasedness and dominates unreconciled forecasts in a mean squared error sense. Strategies for forecast evaluation based on scoring rules are discussed, and it is shown that the popular log score is an improper scoring rule with respect to the class of unreconciled forecasts when the true predictive distribution coheres with aggregation constraints. Finally, evidence from a simulation study shows that reconciliation based on an oblique projection, derived from the MinT method of Wickramasuriya, Athanasopoulos and Hyndman (2018) for point forecasting, outperforms both reconciled and unreconciled alternatives. |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2018-11&r=ets |
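A small numerical sketch of reconciliation as a projection onto the coherent subspace, using the orthogonal (OLS) projection for a two-series hierarchy; the MinT method cited in the abstract above is the oblique version that replaces (S'S)^{-1}S' with (S'W^{-1}S)^{-1}S'W^{-1} for a forecast error covariance W. The base forecasts below are made-up numbers.

    import numpy as np

    # Hierarchy: total = A + B.  Coherent forecasts lie in the column space of S.
    S = np.array([[1, 1],      # total
                  [1, 0],      # series A
                  [0, 1]])     # series B

    y_hat = np.array([10.0, 6.5, 4.1])            # incoherent base forecasts (6.5 + 4.1 != 10)
    P = S @ np.linalg.inv(S.T @ S) @ S.T          # orthogonal projection onto col(S)
    y_tilde = P @ y_hat                           # reconciled forecasts
    print(y_tilde, y_tilde[0] - (y_tilde[1] + y_tilde[2]))   # aggregation error is now ~0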
By: | Valerio Scalone |
Abstract: | Estimation of non-linear DSGE models is still very limited due to high computational costs and identification issues arising from the non-linear solution of the models. Moreover, the use of small samples amplifies these issues. This paper advocates the use of Approximate Bayesian Computation (ABC), a set of Bayesian techniques based on moment matching. First, through Monte Carlo exercises, I assess the small sample performance of ABC estimators and compare them with the Limited Information Method (Kim, 2002), the state-of-the-art Bayesian method of moments used in the DSGE literature. I find that ABC has a better small sample performance, due to the more efficient way in which the information provided by the moments is used to update the prior distribution. Second, ABC is tested on the estimation of a new-Keynesian model with a zero lower bound, a real-life application where the occasionally binding constraint complicates the use of traditional methods of moments. |
Keywords: | Monte Carlo analysis; Method of moments, Bayesian, Zero Lower Bound, DSGE estimation |
JEL: | C15 C11 E2 |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:bfr:banfra:688&r=ets |
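A toy ABC rejection sampler showing the moment-matching logic the abstract above refers to: draw from the prior, simulate, and keep draws whose simulated moment is close to the observed one. The univariate model, prior and tolerance are illustrative assumptions, far simpler than the paper's DSGE application.

    import numpy as np

    rng = np.random.default_rng(3)
    data = rng.normal(loc=1.5, scale=1.0, size=100)   # "observed" data
    obs_moment = data.mean()

    accepted = []
    for _ in range(20000):
        theta = rng.normal(0, 2)                      # draw from the prior
        sim = rng.normal(theta, 1.0, size=100)        # simulate data from the model
        if abs(sim.mean() - obs_moment) < 0.05:       # keep draws whose moment matches
            accepted.append(theta)

    print(len(accepted), np.mean(accepted))           # approximate posterior mean, close to 1.5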
By: | Rangan Gupta (Department of Economics, University of Pretoria, Pretoria, South Africa); Zhihui Lv (School of Mathematics and Statistics, Northeast Normal University, China); Wing-Keung Wong (Department of Finance, Fintech Center, and Big Data Research Center, Asia University; Department of Medical Research, China Medical University Hospital, Taiwan; Department of Economics and Finance, Hang Seng Management College, Hong Kong, China; Department of Economics, Lingnan University, Hong Kong, China.) |
Abstract: | This paper develops a change-point vector autoregressive (VAR) model and then analyzes the regime-specific impact of demand, supply, monetary policy, and spread yield shocks, identified using sign restrictions, on real estate investment trust (REIT) returns. The model first isolates four major macroeconomic regimes in the US since the 1970s, and discloses important changes in the statistical properties of REIT returns and their responses to the identified shocks. A variance decomposition analysis reveals that aggregate supply shocks dominated in the early part of the sample period, and monetary policy spread shocks at the end. |
Keywords: | Change-point VAR Model, Macroeconomic Shocks, US REITs Sector |
JEL: | C32 E32 E42 R30 |
Date: | 2018–08 |
URL: | http://d.repec.org/n?u=RePEc:pre:wpaper:201849&r=ets |
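A compact sketch of sign-restriction identification in a bivariate setting: random rotations of the Cholesky factor are kept only if the impact matrix matches a postulated sign pattern. The covariance matrix and sign pattern below are hypothetical and unrelated to the four-shock scheme used in the paper.

    import numpy as np

    rng = np.random.default_rng(7)
    Sigma = np.array([[1.0, 0.3],
                      [0.3, 0.5]])                  # reduced-form error covariance (made up)
    C = np.linalg.cholesky(Sigma)

    def draw_sign_identified_impact(signs, n_draws=1000):
        """Accept rotations of the Cholesky factor whose impact matrix matches 'signs'."""
        for _ in range(n_draws):
            Q, R = np.linalg.qr(rng.normal(size=(2, 2)))
            Q = Q @ np.diag(np.sign(np.diag(R)))    # normalize the random rotation
            B0 = C @ Q                              # candidate impact matrix
            if np.all(np.sign(B0) == signs):
                return B0
        return None

    # Hypothetical pattern: shock 1 raises both variables; shock 2 lowers the first and raises the second.
    print(draw_sign_identified_impact(np.array([[1, -1], [1, 1]])))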
By: | Trofimov, Ivan D. |
Abstract: | The recent global financial crisis and the ongoing turbulence in the global economy have revived interest in the classical hypothesis of declining profit rates and vanishing profit opportunities as one of the causes of economic instability. This paper, while not joining the theoretical debate on the driving factors of the decline in profit rates, empirically reconsiders the hypothesis of a secular decline in economy-wide profit rates. A panel of unit root tests is used, and deterministic and stochastic trend models (with or without structural breaks) are estimated. It is shown that, instead of a continuous downward trend, profit rates exhibit diverse dynamics: random walk behaviour, deterioration with breaks, reversals, or the absence of trend. Likewise, an exploratory analysis shows that a variety of factors determined profit rates, with capital productivity and competitive dynamics in the economy likely being the most salient. |
Keywords: | Profit rates, time series, unit root, trend estimation, classical political economy |
JEL: | B51 C22 P17 |
Date: | 2018–06–08 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:88248&r=ets |
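To illustrate the distinction the abstract above draws between random-walk and trend-stationary profit-rate dynamics, a short statsmodels sketch runs the augmented Dickey-Fuller test (with constant and trend) on two simulated series; the paper itself uses a panel of unit root tests and break-augmented trend models on real data.

    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(8)
    profit_rw = 10 + np.cumsum(rng.normal(size=200))                               # random-walk profit rate
    profit_trend = 10 - 0.02 * np.arange(200) + rng.normal(scale=0.5, size=200)    # deterministic decline

    for name, series in [("random walk", profit_rw), ("trend stationary", profit_trend)]:
        stat, pval, *_ = adfuller(series, regression="ct")   # ADF with constant and linear trend
        print(name, round(stat, 2), round(pval, 3))          # unit root typically rejected only for the second series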
By: | Prüser, Jan; Schlösser, Alexander |
Abstract: | We study the time-varying impact of Economic Policy Uncertainty (EPU) on the US economy using a VAR with time-varying coefficients. The coefficients are allowed to evolve gradually over time, which allows us to uncover structural changes without imposing them a priori. We find three different regimes which match three major episodes of the US economy, namely the Great Inflation, the Great Moderation and the Great Recession. This finding contrasts with previous literature, which typically imposes two regimes a priori. Furthermore, we distinguish the effect of EPU on real economic activity from its effect on financial markets. |
Keywords: | TVP-FAVAR, economic policy uncertainty, fat data |
JEL: | C11 C32 E20 E60 |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:zbw:rwirep:761&r=ets |
By: | Kevin Larcher (Department of Economics, Oklahoma State University); Jaebeom Kim (Department of Economics, Oklahoma State University); Youngju Kim (Economic Research Institute, The Bank of Korea) |
Abstract: | This study investigates the impact of uncertainty shocks on macroeconomic activity in Korea. For this purpose, a smooth transition VAR model is employed to document the state-dependent dynamics of two distinct types of uncertainty shocks, namely financial market based and news-based shocks. When nonlinearity is allowed to play a role in our model, quantitatively very different asymmetric dynamics are observed. Following the adoption of inflation targeting, the responses tend to be smoother and less pronounced. Our empirical results support the view that the link between uncertainty and macroeconomic activity is clear over both recessions and expansions. Furthermore, the impact of uncertainty shocks is more pronounced when economic activity is depressed, especially when shocks originate from the financial market rather than from news-based policy uncertainty in Korea. |
Keywords: | Uncertainty shocks, Smooth transition vector autoregression, Asymmetric dynamics, Recessions |
JEL: | C32 E32 E52 |
Date: | 2018–04–26 |
URL: | http://d.repec.org/n?u=RePEc:bok:wpaper:1812&r=ets |
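The smooth transition mechanism behind the ST-VAR in the abstract above can be summarized by a logistic weighting function; a minimal sketch with made-up smoothness and threshold parameters:

    import numpy as np

    def logistic_transition(s, gamma, c):
        """Smooth transition weight G(s) in [0, 1]; gamma controls smoothness, c is the threshold."""
        return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

    s = np.linspace(-3, 3, 7)                       # transition variable, e.g. an activity indicator
    G = logistic_transition(s, gamma=2.5, c=0.0)
    print(G)                                        # near 0 in the recession regime, near 1 in the expansion regime

In an ST-VAR the regime-specific coefficient matrices are combined with these weights, so responses to shocks vary smoothly with the state of the economy.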
By: | NYONI, THABANI |
Abstract: | Of utmost importance is the fact that forecasting macroeconomic variables provides a clear picture of the future state of the economy (Sultana et al., 2013). Nothing is more important to the conduct of monetary policy than understanding and predicting inflation (Kohn, 2005). Inflation is the scourge of the modern economy; it is feared by central bankers globally and forces the execution of unpopular monetary policies. Inflation usually makes some people unfairly rich and impoverishes others, and it is therefore an economic pathology that stands in the way of sustainable economic growth and development. Models that make use of GARCH, as highlighted by Ruzgar & Kale (2007), range from predicting the spread of toxic gases in the atmosphere to simulating neural activity, but financial econometrics remains the leading discipline and apparently dominates research on GARCH. The main objective of this study is to model monthly inflation rate volatility in Zimbabwe over the period July 2009 to July 2018. Our diagnostic tests indicate that the sample has the characteristics of a financial time series, so a GARCH-type model can be employed to model and forecast conditional volatility. The results of the study indicate that the estimated model, the AR(1)-GARCH(1,1) model, is in fact an AR(1)-IGARCH(1,1) process and is not only appropriate but also the best. Since the study provides evidence of volatility persistence in Zimbabwe's monthly inflation data, monetary authorities ought to take cognisance of the IGARCH behaviour of monthly inflation rates in order to design appropriate monetary policy. |
Keywords: | ARCH, Forecasting, GARCH, IGARCH, Inflation Rate Volatility, Zimbabwe |
JEL: | C1 C6 E52 G0 |
Date: | 2018–07–22 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:88132&r=ets |
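A minimal sketch, using the arch package, of fitting an AR(1)-GARCH(1,1) model of the kind estimated in the abstract above; the series below is a random placeholder, not the Zimbabwe inflation data, and the IGARCH restriction is not imposed.

    import numpy as np
    from arch import arch_model

    rng = np.random.default_rng(4)
    infl = rng.normal(loc=0.5, scale=2.0, size=109)    # placeholder monthly inflation, July 2009 to July 2018

    am = arch_model(infl, mean='AR', lags=1, vol='GARCH', p=1, q=1)
    res = am.fit(disp='off')
    print(res.params)                                  # alpha[1] + beta[1] near 1 would indicate IGARCH-type persistence
    print(res.forecast(horizon=3).variance)            # three-step-ahead conditional variance forecasts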
By: | Soumya Bhadury (National Council of Applied Economic Research); Taniya Ghosh (Indira Gandhi Institute of Development Research (IGIDR), Mumbai) |
Abstract: | We investigate the predictive power of Divisia monetary aggregates in explaining exchange rate variations for India, Israel, Poland, the UK and the US in the years leading up to and following the 2007-08 recessions. One valid concern for the chosen sample period is that the interest rate was stuck at or near the zero lower bound (ZLB) for some major economies, so that the interest rate became uninformative about the monetary policy stance. An important innovation in our research is to adopt the Divisia monetary aggregate as an alternative policy indicator variable. We apply a bootstrap Granger causality method which is robust to the presence of non-stationarity in our data. Additionally, we use bootstrap rolling window estimates to account for parameter non-constancy and structural breaks in our sample covering the Great Recession. We find strong causality from Divisia money to exchange rates. By capturing the time-varying link between Divisia money and exchange rates, the importance of Divisia is further established at the ZLB. |
Keywords: | Monetary Policy; Divisia Monetary Aggregates; Simple Sum; Nominal Exchange Rate; Real Effective Exchange Rate; Bootstrap Granger Causality |
JEL: | C32 C43 E41 E51 E52 F31 F41 |
Date: | 2018–02 |
URL: | http://d.repec.org/n?u=RePEc:nca:ncaerw:113&r=ets |
By: | Jonathan H. Wright |
Abstract: | In the 2018 comprehensive update of the national income and product accounts, the Bureau of Economic Analysis released not seasonally adjusted data and modified its seasonal adjustment procedures. I find some indication of residual seasonality in the seasonally adjusted data as published before this update; the evidence for residual seasonality is weaker in the seasonally adjusted data after the update. I also directly seasonally adjust the aggregate not seasonally adjusted data, which entirely avoids residual seasonality. The average absolute difference between my seasonally adjusted real GDP data and the current official published version is 1.1 percentage points in quarter-over-quarter annualized growth rates. |
JEL: | C32 E01 |
Date: | 2018–08 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:24895&r=ets |
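One simple diagnostic for residual seasonality, in the spirit of the abstract above though not necessarily Wright's procedure, is to regress seasonally adjusted quarter-over-quarter growth rates on quarter dummies and test their joint significance; a sketch with simulated data:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    T = 80                                             # 20 years of quarterly observations
    growth = rng.normal(2.0, 2.5, size=T)              # placeholder SA annualized growth rates
    growth[np.arange(T) % 4 == 0] += 1.0               # inject residual first-quarter seasonality

    quarters = np.arange(T) % 4
    D = np.column_stack([(quarters == q).astype(float) for q in range(1, 4)])
    res = sm.OLS(growth, sm.add_constant(D)).fit()
    print(res.f_pvalue)                                # a small p-value flags residual seasonality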