nep-ecm New Economics Papers
on Econometrics
Issue of 2012‒03‒28
twenty papers chosen by
Sune Karlsson
Örebro University

  1. Model Adequacy Checks for Discrete Choice Dynamic Models By Igor Kheifets; Carlos Velasco
  2. Testing for Skewness in AR Conditional Volatility Models for Financial Return Series By Mantalos, Panagiotis; Karagrigoriou, Alex
  3. Efficient bootstrap with weakly dependent processes By Francesco Bravo; Federico Crudu
  4. Distribution Theory for the Studentized Mean for Long, Short, and Negative Memory Time Series By McElroy, Tucker S; Politis, D N
  5. A Hybrid Data Cloning Maximum Likelihood Estimator for Stochastic Volatility Models By Márcio Laurini
  6. Identifying observed factors in approximate factor models: estimation and hypothesis testing By Chen, Liang
  7. Estimation in Non-Linear Non-Gaussian State Space Models with Precision-Based Methods By Joshua Chan; Rodney Strachan
  8. Common Drifting Volatility in Large Bayesian VARs By Carriero, Andrea; Clark, Todd; Marcellino, Massimiliano
  9. Dynamic Functional Data Analysis with Nonparametric State Space Models. By Márcio Laurini
  10. Adaptive Minimax Estimation over Sparse lq-Hulls By Zhan Wang; Sandra Paterlini; Fuchang Gao; Yuhong Yang
  11. Forecasting adoption of ultra-low-emission vehicles using the GHK simulator and Bayes estimates of a multinomial probit model By Daziano, Ricardo A.; Achtnicht, Martin
  12. Pitfalls in Backtesting Historical Simulation VaR Models By Juan Carlos Escanciano; Pei Pei
  13. Haavelmo's Probability Approach and the Cointegrated VAR By Katarina Juselius
  14. Markov Regime-Switching Tests: Asymptotic Critical Values By Steigerwald, Douglas; Carter, Andrew
  15. Adaptive Forecasting in the Presence of Recent and Ongoing Structural Change By Liudas Giraitis; George Kapetanios; Simon Price
  16. On detection of volatility spillovers in simultaneously open stock markets By Kohonen, Anssi
  17. Extracting non-linear signals from several economic indicators By Maximo Camacho; Gabriel Perez-Quiros; Pilar Poncela
  18. Estimating Idiosyncratic Volatility and Its Effects on a Cross-Section of Returns By Serguey Khovansky; Zhylyevskyy, Oleksandr
  19. Finite sample performance of small versus large scale dynamic factor models By Alvarez, Rocio; Camacho, Maximo; Pérez-Quirós, Gabriel
  20. Markov-switching dynamic factor models in real time By Camacho, Maximo; Pérez-Quirós, Gabriel; Poncela, Pilar

  1. By: Igor Kheifets (New Economic School, Moscow); Carlos Velasco (Department of Economics, Universidad Carlos III de Madrid)
    Abstract: This paper proposes new parametric model adequacy tests for possibly nonlinear and nonstationary time series models with a noncontinuous data distribution, which is often the case in applied work. In particular, we consider the correct specification of parametric conditional distributions in dynamic discrete choice models, not only of particular conditional characteristics such as moments or symmetry. Knowing the true distribution is important in many circumstances, in particular for applying efficient maximum likelihood methods and obtaining consistent estimates of partial effects and appropriate predictions of the probability of future events. We propose a transformation of the data which, under the true conditional distribution, leads to a continuous uniform iid series. The uniformity and serial independence of the new series are then examined simultaneously. The transformation can be considered an extension of the integral transform tool to noncontinuous data. We derive asymptotic properties of the tests taking into account the parameter estimation effect. Since the transformed series are iid, we do not require any mixing conditions, and the asymptotic results reflect the double, simultaneous checking nature of our test. The test statistic converges under the null at a parametric rate to an asymptotic distribution that is case dependent, so we justify a parametric bootstrap approximation. The test has power against local alternatives and is consistent. The performance of the new tests is compared with classical specification checks for discrete choice models.
    Keywords: Goodness of fit, diagnostic test, parametric conditional distribution, discrete choice models, parameter estimation effect, bootstrap
    JEL: C12 C22 C52
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:cfr:cefirw:w0170&r=ecm
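    A rough illustration of the transform this abstract describes: each discrete observation is mapped to a uniform draw on the interval between the left limit and the value of its conditional CDF, so that under the true model the transformed series is iid Uniform(0,1). The sketch below uses an illustrative iid Poisson model rather than the authors' dynamic discrete choice setting, and all names are illustrative.

      import numpy as np
      from scipy import stats

      def randomized_pit(y, cdf, seed=0):
          # Randomized probability integral transform for discrete data:
          # draw U_t uniformly on [F(y_t - 1), F(y_t)]; under the true
          # conditional distribution the U_t are iid Uniform(0, 1).
          rng = np.random.default_rng(seed)
          lo = cdf(y - 1)                  # left limit of the CDF jump
          hi = cdf(y)
          return lo + rng.uniform(size=len(y)) * (hi - lo)

      # Illustration: iid Poisson(3) data with the correct CDF should give
      # uniform, serially independent transforms.
      y = np.random.default_rng(1).poisson(3.0, size=1000)
      u = randomized_pit(y, lambda k: stats.poisson.cdf(k, 3.0))
      print(stats.kstest(u, "uniform"))    # uniformity should not be rejected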
  2. By: Mantalos, Panagiotis (Department of Business, Economics, Statistics and Informatics); Karagrigoriou, Alex (Department of Mathematics and Statistics, University of Cyprus)
    Abstract: In this paper a test procedure is proposed for skewness in autoregressive conditional volatility models. The size and the power of the test are investigated through a series of Monte Carlo simulations with various models. Furthermore, applications with financial data are analyzed in order to explore the applicability and the capabilities of the proposed testing procedure.
    Keywords: ARCH/GARCH model; kurtosis; NoVaS; skewness
    JEL: C01 C12 C15
    Date: 2012–03–21
    URL: http://d.repec.org/n?u=RePEc:hhs:oruesi:2012_004&r=ecm
  3. By: Francesco Bravo; Federico Crudu
    Abstract: The efficient bootstrap methodology is developed for overidentified moment condition models with weakly dependent observations. The resulting bootstrap procedure is shown to be asymptotically valid and can be used to approximate the distributions of t-statistics, the J statistic for overidentifying restrictions, and Wald, Lagrange multiplier and distance statistics for nonlinear hypotheses. The asymptotic validity of the efficient bootstrap based on a computationally less demanding approximate k-step estimator is also shown. The finite sample performance of the proposed bootstrap is assessed using simulations in an intertemporal consumption-based asset pricing model.
    Keywords: α-mixing, Consumption CAPM, GEL, GMM, Hypothesis testing
    JEL: C12 C13 C58
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:yor:yorken:12/08&r=ecm
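    The efficient bootstrap developed here builds on blocking schemes for weakly dependent data. As background, a plain moving-block bootstrap sketch (deliberately simpler than the authors' efficient version) for the sampling distribution of the sample mean:

      import numpy as np

      def moving_block_bootstrap(x, block_len, n_boot=999, seed=0):
          # Resample overlapping blocks of length block_len to preserve
          # weak dependence, concatenate to the original length, and
          # record the statistic of interest (here, the mean).
          rng = np.random.default_rng(seed)
          n = len(x)
          blocks = np.lib.stride_tricks.sliding_window_view(x, block_len)
          n_blocks = int(np.ceil(n / block_len))
          reps = np.empty(n_boot)
          for b in range(n_boot):
              idx = rng.integers(0, len(blocks), size=n_blocks)
              reps[b] = np.concatenate(blocks[idx])[:n].mean()
          return reps

      rng = np.random.default_rng(1)
      x = np.zeros(500)                      # AR(1) example series
      for t in range(1, 500):
          x[t] = 0.5 * x[t - 1] + rng.standard_normal()
      reps = moving_block_bootstrap(x, block_len=10)
      print("bootstrap s.e. of the mean:", reps.std(ddof=1))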
  4. By: McElroy, Tucker S; Politis, D N
    Abstract: We consider the problem of estimating the variance of the partial sums of a stationary time series that has either long memory, short memory, negative/intermediate memory, or is the first-difference of such a process. The rate of growth of this variance depends crucially on the type of memory, and we present results on the behavior of tapered sums of sample autocovariances in this context when the bandwidth vanishes asymptotically. We also present asymptotic results for the case that the bandwidth is a fixed proportion of sample size, extending known results to the case of flat-top tapers. We adopt the fixed-proportion bandwidth perspective in our empirical section, presenting two methods for estimating the limiting critical values: the subsampling method and a plug-in approach. Extensive simulation studies compare the size and power of both approaches as applied to hypothesis testing for the mean. Both methods perform well, although the subsampling method appears to be better sized, and provide a viable framework for conducting inference for the mean. In summary, we supply a unified asymptotic theory that covers all different types of memory under a single umbrella.
    Keywords: kernel, lag-windows, overdifferencing, spectral estimation, subsampling, tapers, unit-root problem, Econometrics
    Date: 2011–09–01
    URL: http://d.repec.org/n?u=RePEc:cdl:ucsdec:qt0dr145dt&r=ecm
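    A schematic version of the tapered sums of sample autocovariances studied in the paper, for the short-memory case: a flat-top lag-window estimate of the long-run variance that governs the variance of partial sums. The trapezoidal taper, bandwidth and AR(1) example are illustrative choices, not the paper's exact specification.

      import numpy as np

      def flat_top_taper(u):
          # Trapezoidal flat-top window: 1 on [0, 1/2], linear down to 0 at 1.
          return np.clip(2.0 * (1.0 - abs(u)), 0.0, 1.0)

      def lrv_estimate(x, bandwidth):
          # Tapered sum of sample autocovariances, estimating the long-run
          # variance sum_k gamma(k) that enters Var(sqrt(n) * xbar).
          n = len(x)
          xc = x - x.mean()
          lrv = xc @ xc / n                          # gamma_hat(0)
          for k in range(1, bandwidth + 1):
              gamma_k = xc[k:] @ xc[:-k] / n         # gamma_hat(k)
              lrv += 2.0 * flat_top_taper(k / bandwidth) * gamma_k
          return lrv

      rng = np.random.default_rng(0)
      x = np.zeros(2000)                             # short-memory AR(1)
      for t in range(1, 2000):
          x[t] = 0.5 * x[t - 1] + rng.standard_normal()
      # true long-run variance here: sigma^2 / (1 - phi)^2 = 4
      print(lrv_estimate(x, bandwidth=40))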
  5. By: Márcio Laurini (IBMEC Business School)
    Abstract: In this paper we analyze a maximum likelihood estimator using data cloning for stochastic volatility models. This estimator is constructed using a hybrid methodology based on Integrated Nested Laplace Approximations to calculate the auxiliary Bayesian estimators analytically, with great accuracy and computational efficiency, without requiring simulation methods such as Markov Chain Monte Carlo. We analyze the performance of this estimator compared to methods based on Monte Carlo simulations (Simulated Maximum Likelihood, MCMC Maximum Likelihood) and approximate maximum likelihood estimators using Laplace Approximations. The results indicate that this data cloning methodology achieves results superior to those of methods based on MCMC, and comparable to those obtained by the Simulated Maximum Likelihood estimator.
    Keywords: Stochastic Volatility, Data Cloning, Maximum Likelihood, MCMC, Laplace Approximations
    JEL: C53 E43 G17
    Date: 2012–03–16
    URL: http://d.repec.org/n?u=RePEc:ibr:dpaper:2012-02&r=ecm
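    The data cloning idea itself is compact: raise the likelihood to a power K (equivalently, replicate the sample K times), run any Bayesian sampler, and the posterior collapses onto the MLE, with K times the posterior variance approximating the inverse Fisher information. A toy sketch for a normal location model follows; the paper's stochastic volatility model and INLA machinery are, of course, far richer.

      import numpy as np

      def data_cloning_mle(y, n_clones=20, n_iter=20000, seed=0):
          # Toy data-cloning MLE for the mean of N(mu, 1), flat prior.
          rng = np.random.default_rng(seed)

          def log_post(mu):                  # cloned log likelihood
              return -0.5 * n_clones * np.sum((y - mu) ** 2)

          draws = np.empty(n_iter)
          mu, lp = 0.0, log_post(0.0)
          for i in range(n_iter):            # random-walk Metropolis
              prop = mu + 0.1 * rng.standard_normal()
              lp_prop = log_post(prop)
              if np.log(rng.uniform()) < lp_prop - lp:
                  mu, lp = prop, lp_prop
              draws[i] = mu
          keep = draws[n_iter // 2:]
          return keep.mean(), n_clones * keep.var()   # MLE, asymptotic var.

      y = np.random.default_rng(1).normal(2.0, 1.0, size=100)
      print(data_cloning_mle(y))             # approx. (ybar, 1/n)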
  6. By: Chen, Liang
    Abstract: Despite their popularity in recent years, factor models have long been criticized for a lack of identification: even when a large number of variables are available, the factors can only be consistently estimated up to a rotation. In this paper, we try to identify the underlying factors by associating them with a set of observed variables, and thus give interpretations to the orthogonal factors estimated by the method of Principal Components. We first propose an estimation procedure to select a set of observed variables, and then test the hypothesis that the true factors are exact linear combinations of the selected variables. Our estimation method is shown to correctly identify the true observed factors even in the presence of mild measurement errors, and our test statistics are shown to be more general than those of Bai and Ng (2006). The applicability of our methods in finite samples and the advantages of our tests are confirmed by simulations. Our methods are also applied to the returns of portfolios to identify the underlying risk factors.
    Keywords: factor models; observed factors; estimation; hypothesis testing; Fama-French three factors
    JEL: C13 C12 C01
    Date: 2012–03–20
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:37514&r=ecm
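    To make the identification problem concrete: principal-components estimates recover the factor space only up to rotation, so one can ask whether a candidate observed variable lies in that span, e.g. via the R^2 of a regression on the estimated factors. The sketch below is a schematic version of this idea and does not reproduce the paper's selection procedure or test statistics.

      import numpy as np

      def pc_factors(X, r):
          # Principal-components factor estimates for X = F L' + e;
          # the columns of F are identified only up to rotation.
          X = X - X.mean(axis=0)
          vals, vecs = np.linalg.eigh(X @ X.T)
          return np.sqrt(X.shape[0]) * vecs[:, -r:][:, ::-1]

      rng = np.random.default_rng(0)
      T, N, r = 200, 50, 2
      F = rng.standard_normal((T, r))
      L = rng.standard_normal((N, r))
      X = F @ L.T + 0.5 * rng.standard_normal((T, N))
      Fhat = pc_factors(X, r)
      # A mildly mismeasured version of the first true factor: R^2 close
      # to one suggests it is (a rotation of) an underlying factor.
      candidate = F[:, 0] + 0.1 * rng.standard_normal(T)
      beta, *_ = np.linalg.lstsq(Fhat, candidate, rcond=None)
      r2 = 1 - np.var(candidate - Fhat @ beta) / np.var(candidate)
      print("R^2 of candidate on estimated factors:", round(r2, 3))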
  7. By: Joshua Chan; Rodney Strachan
    Abstract: In recent years state space models, particularly the linear Gaussian version, have become the standard framework for analyzing macroeconomic and financial data. However, many theoretically motivated models imply non-linear or non-Gaussian specifications, or both. Existing methods for estimating such models are computationally intensive, and often cannot be applied to models with more than a few states. Building upon recent developments in precision-based algorithms, we propose a general approach to estimating high-dimensional non-linear non-Gaussian state space models. The baseline algorithm approximates the conditional distribution of the states by a multivariate Gaussian or t density, which is then used for posterior simulation. We further develop this baseline algorithm to construct more sophisticated samplers with attractive properties: one based on the accept-reject Metropolis-Hastings (ARMH) algorithm, and another an adaptive collapsed sampler inspired by the cross-entropy method. To illustrate the proposed approach, we investigate the effect of the zero lower bound on interest rates on the monetary transmission mechanism.
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:acb:camaaa:2012-13&r=ecm
  8. By: Carriero, Andrea; Clark, Todd; Marcellino, Massimiliano
    Abstract: The estimation of large Vector Autoregressions with stochastic volatility using standard methods is computationally very demanding. In this paper we propose to model conditional volatilities as driven by a single common unobserved factor. This is justified by the observation that the pattern of estimated volatilities in empirical analyses is often very similar across variables. Using a combination of a standard natural conjugate prior for the VAR coefficients, and an independent prior on a common stochastic volatility factor, we derive the posterior densities for the parameters of the resulting BVAR with common stochastic volatility (BVAR-CSV). Under the chosen prior the conditional posterior of the VAR coefficients features a Kronecker structure that allows for fast estimation, even in a large system. Using US and UK data, we show that, compared to a model with constant volatilities, our proposed common volatility model significantly improves model fit and forecast accuracy. The gains are comparable to or as great as the gains achieved with a conventional stochastic volatility specification that allows independent volatility processes for each variable. But our common volatility specification greatly speeds computations.
    Keywords: Bayesian VARs; forecasting; prior specification; stochastic volatility
    JEL: C11 C13 C33 C53
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:8894&r=ecm
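    The BVAR-CSV structure is easiest to see generatively: a single scalar log-volatility factor scales the innovation covariance of every equation. A small simulation sketch with illustrative parameter values (not the authors' estimation code):

      import numpy as np

      def simulate_bvar_csv(T=300, n=4, rho=0.95, sig_eta=0.2, seed=0):
          # y_t = A y_{t-1} + exp(h_t / 2) * L e_t,  e_t ~ N(0, I)
          # h_t = rho h_{t-1} + eta_t,               eta_t ~ N(0, sig_eta^2)
          rng = np.random.default_rng(seed)
          A = 0.5 * np.eye(n)                        # illustrative VAR(1)
          L = np.linalg.cholesky(0.5 * np.eye(n) + 0.5 * np.ones((n, n)))
          y, h = np.zeros((T, n)), np.zeros(T)
          for t in range(1, T):
              h[t] = rho * h[t - 1] + sig_eta * rng.standard_normal()
              y[t] = A @ y[t - 1] + np.exp(h[t] / 2) * (L @ rng.standard_normal(n))
          return y, h

      y, h = simulate_bvar_csv()
      print(y.shape, "common log-vol s.d.:", h.std().round(2))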
  9. By: Márcio Laurini (IBMEC Business School)
    Abstract: In this article we introduce a new methodology for modeling curves with a dynamic structure, using a non-parametric approach formulated as a state space model. The non-parametric approach is based on penalized splines, represented as a dynamic mixed model. This formulation can capture the dynamic evolution of curves using a limited number of latent factors, allowing an accurate fit with a limited number of parameters. We also present a new method to determine the optimal smoothing parameter through an adaptive procedure, using a formulation analogous to a stochastic volatility model. This methodology makes it possible to unify different approaches applied to data with a functional structure in finance. We present the advantages and limitations of this methodology through a simulation study, and also compare its predictive performance with other parametric and non-parametric methods used in financial applications, using data on the term structure of interest rates.
    Keywords: Functional Data, Penalized Splines, MCMC, Bayesian non-parametric methods
    JEL: C11 C15 G12
    Date: 2012–03–16
    URL: http://d.repec.org/n?u=RePEc:ibr:dpaper:2012-01&r=ecm
  10. By: Zhan Wang; Sandra Paterlini; Fuchang Gao; Yuhong Yang
    Abstract: Given a dictionary of Mn initial estimates of the unknown true regression function, we aim to construct linearly aggregated estimators that target the best performance among all linear combinations under a sparse q-norm (0 <= q <= 1) constraint on the linear coefficients. Besides identifying the optimal rates of aggregation for these lq-aggregation problems, our multi-directional (or universal) aggregation strategies by model mixing or model selection achieve the optimal rates simultaneously over the full range of 0 <= q <= 1 for general Mn and upper bound tn of the q-norm. Both random and fixed designs, with known or unknown error variance, are handled, and the lq-aggregations examined in this work cover the major types of aggregation problems previously studied in the literature. Consequences for minimax-rate adaptive regression under lq-constrained true coefficients (0 <= q <= 1) are also provided. Our results show that the minimax rate of lq-aggregation (0 <= q <= 1) is basically determined by an effective model size, a sparsity index that depends on q, tn, Mn, and the sample size n in an easily interpretable way, based on a classical model selection theory that deals with a large number of models. In addition, in the fixed design case, the model selection approach is seen to yield optimal rates of convergence not only in expectation but also with exponential decay of deviation probability. In contrast, the model mixing approach can have leading constant one in front of the target risk in the oracle inequality, while not offering optimality in deviation probability.
    Keywords: minimax risk, adaptive estimation, sparse lq-constraint, linear combining, aggregation, model mixing, model selection
    Date: 2012–01
    URL: http://d.repec.org/n?u=RePEc:mod:recent:078&r=ecm
  11. By: Daziano, Ricardo A.; Achtnicht, Martin
    Abstract: In this paper we use Bayes estimates of a multinomial probit model with fully flexible substitution patterns to forecast consumer response to ultra-low-emission vehicles. In this empirical application of the probit Gibbs sampler, we use stated-preference data on vehicle choice from a Germany-wide survey of potential light-duty-vehicle buyers using computer-assisted personal interviewing. We show that Bayesian estimation of a multinomial probit model with a full covariance matrix is feasible for this medium-scale problem. Using the posterior distribution of the parameters of the vehicle choice model as well as the GHK simulator, we derive the choice probabilities of the different alternatives. We first show that the Bayes point estimates of the market shares reproduce the observed values. Then, we define a base scenario of vehicle attributes that aims at representing an average of the current vehicle choice situation in Germany. Consumer response to qualitative changes in the base scenario is subsequently studied. In particular, we analyze the effect of increasing the network of service stations for charging electric vehicles as well as for refueling hydrogen. The result is the posterior distribution of the choice probabilities that represent adoption of the energy-efficient technologies.
    Keywords: Discrete choice models, Bayesian econometrics, Low emission vehicles, Charging infrastructure
    JEL: C25 D12 Q42
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:zbw:zewdip:12017&r=ecm
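    The GHK simulator evaluates the multivariate normal rectangle probabilities behind probit choice probabilities by drawing truncated normals sequentially along a Cholesky factorization. A minimal sketch for P(Z_1 < a_1, ..., Z_k < a_k) with Z ~ N(0, Sigma); dimensions and values are illustrative.

      import numpy as np
      from scipy.stats import norm

      def ghk_probability(a, Sigma, n_draws=5000, seed=0):
          # In a multinomial probit, this is the probability that all
          # utility differences against the chosen alternative are negative.
          rng = np.random.default_rng(seed)
          L = np.linalg.cholesky(Sigma)
          k = len(a)
          prob = np.ones(n_draws)
          eta = np.zeros((n_draws, k))
          for j in range(k):
              # Upper truncation point for eta_j given the earlier draws
              upper = (a[j] - eta[:, :j] @ L[j, :j]) / L[j, j]
              p_j = norm.cdf(upper)
              prob *= p_j
              # Draw eta_j from the truncated standard normal (inverse CDF)
              eta[:, j] = norm.ppf(rng.uniform(size=n_draws) * p_j)
          return prob.mean()

      Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
      print(ghk_probability(np.array([0.3, -0.2]), Sigma))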
  12. By: Juan Carlos Escanciano (Indiana University); Pei Pei (Indiana University and Chinese Academy of Finance and Development, Central University of Finance and Economics)
    Abstract: Historical Simulation (HS) and its variant, the Filtered Historical Simulation (FHS), are the most widely used Value-at-Risk forecast methods at commercial banks. These forecast methods are traditionally evaluated by means of the unconditional backtest. This paper formally shows that the unconditional backtest is always inconsistent for backtesting HS and FHS models, with a power function that can be even smaller than the nominal level in large samples. Our findings have fundamental implications for the determination of market risk capital requirements, and also explain Monte Carlo and empirical findings in previous studies. We also propose a data-driven weighted backtest with good power properties to evaluate HS and FHS forecasts. Finally, our theoretical findings are confirmed in a Monte Carlo simulation study and an empirical application with three U.S. stocks. The empirical application shows that multiplication factors computed under the current regulatory framework are downward biased, as they inherit the inconsistency of the unconditional backtest.
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:inu:caeprp:2012-003&r=ecm
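    The unconditional backtest at issue is commonly implemented as a Kupiec-style proportion-of-failures likelihood-ratio test, which only compares the empirical VaR violation rate with the nominal level; the paper's point is that this test is inconsistent for HS and FHS models, not that it is hard to compute. A sketch:

      import numpy as np
      from scipy.stats import chi2

      def unconditional_backtest(violations, alpha):
          # LR test of the unconditional coverage hypothesis
          # P(loss > VaR) = alpha; violations is a boolean array.
          n, x = len(violations), int(np.sum(violations))
          pi_hat = x / n
          if pi_hat in (0.0, 1.0):            # degenerate edge cases
              lr = -2.0 * (x * np.log(alpha) + (n - x) * np.log(1 - alpha))
          else:
              lr = -2.0 * ((n - x) * np.log((1 - alpha) / (1 - pi_hat))
                           + x * np.log(alpha / pi_hat))
          return lr, chi2.sf(lr, df=1)        # asymptotically chi2(1)

      rng = np.random.default_rng(0)
      hits = rng.uniform(size=1000) < 0.01    # 1% VaR, correct coverage
      print(unconditional_backtest(hits, alpha=0.01))   # should not reject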
  13. By: Katarina Juselius (Department of Economics)
    Abstract: Some key econometric concepts and problems addressed by Trygve Haavelmo and Ragnar Frisch are discussed within the general framework of a cointegrated VAR. The focus is on problems typical of time series data, such as multicollinearity, spurious correlation and regression results, time dependent residuals, normalization, reduced rank, model selection, missing variables, simultaneity, autonomy and identification. Specifically, the paper discusses (1) the conditions under which the VAR model represents a full probability formulation of a sample of time-series observations, (2) the plausibility of the multivariate normality assumption underlying the VAR, (3) cointegration as a solution to the problem of spurious correlation and multicollinearity when data contain deterministic and stochastic trends, (4) the existence of a universe, (5) the association between Frisch's confluence analysis and cointegrated VAR analysis, (6) simultaneity and identification when data are nonstationary, (7) conditions under which identified cointegration relations can be considered structural or autonomous, and finally (8) a formulation of a design of experiment for passive observations based on theory-consistent CVAR scenarios, illustrated with a monetary model for inflation.
    Keywords: Haavelmo, CVAR, autonomy, identification, passive observations
    JEL: B16 B31 B41 C32 C82
    Date: 2012–03–01
    URL: http://d.repec.org/n?u=RePEc:kud:kuiedp:1201&r=ecm
  14. By: Steigerwald, Douglas; Carter, Andrew
    Keywords: Econometrics and Quantitative Economics, mixture model, regime switching, numeric approximation
    Date: 2011–08–12
    URL: http://d.repec.org/n?u=RePEc:cdl:ucsbec:qt5rn986z6&r=ecm
  15. By: Liudas Giraitis (Queen Mary, University of London); George Kapetanios (Queen Mary, University of London); Simon Price (Bank of England and City University)
    Abstract: We consider time series forecasting in the presence of ongoing structural change where both the time series dependence and the nature of the structural change are unknown. Methods that downweight older data, such as rolling regressions, forecast averaging over different windows and exponentially weighted moving averages, known to be robust to historical structural change, are found to be also useful in the presence of ongoing structural change in the forecast period. A crucial issue is how to select the degree of downweighting, usually defined by an arbitrary tuning parameter. We make this choice data dependent by minimizing forecast mean square error, and provide a detailed theoretical analysis of our proposal. Monte Carlo results illustrate the methods. We examine their performance on 191 UK and US macro series. Forecasts using data-based tuning of the data discount rate are shown to perform well.
    Keywords: Recent and ongoing structural change, Forecast combination, Robust forecasts
    JEL: C10 C59
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp691&r=ecm
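    The data-dependent choice of downweighting can be illustrated with the simplest of the robust schemes: an exponentially weighted moving average whose discount rate is chosen by minimizing one-step-ahead forecast MSE over a grid. This is only a cartoon of the paper's proposal; the grid, burn-in and test series are illustrative.

      import numpy as np

      def ewma_forecasts(x, lam):
          # One-step-ahead EWMA forecasts: f_t = lam f_{t-1} + (1-lam) x_{t-1}
          f = np.empty(len(x))
          f[0] = x[0]
          for t in range(1, len(x)):
              f[t] = lam * f[t - 1] + (1 - lam) * x[t - 1]
          return f

      def tune_discount(x, grid=np.linspace(0.5, 0.99, 50), burn=20):
          # Pick the discount rate that minimizes one-step forecast MSE
          mses = [np.mean((x[burn:] - ewma_forecasts(x, lam)[burn:]) ** 2)
                  for lam in grid]
          return grid[int(np.argmin(mses))]

      rng = np.random.default_rng(0)
      # Series with a drifting mean: heavier downweighting (a smaller
      # discount rate) should typically be selected.
      x = np.cumsum(0.05 * rng.standard_normal(500)) + rng.standard_normal(500)
      print("chosen discount rate:", tune_discount(x).round(3))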
  16. By: Kohonen, Anssi
    Abstract: Empirical research confirms the existence of volatility spillovers across national stock markets. However, the models in use are mostly statistical ones, and much less is known about the actual transmission mechanisms; theoretical literature is scarce, and so is empirical work trying to estimate specific theoretical models. Some tests for such spillovers founded on economic theory have been developed for non-overlapping markets; this institutional setup provides a way around the problems of estimating a system of simultaneous equations. However, volatility spillovers across overlapping markets might be as important a phenomenon as those across non-overlapping markets. Building on recent advances in the econometrics of identifying structural vector autoregressive models, this paper proposes a way to estimate an existing signal-extraction model that explains volatility spillovers across simultaneously open stock markets. Furthermore, a new empirical test for the detection of such spillovers is derived. As an empirical application, the theoretical model is fitted to daily data on eurozone stock markets in 2010-2011. Evidence of volatility spillovers across the countries is found.
    Keywords: Volatility transmission; financial contagion; SVAR identification; hypothesis testing; stock markets; euro debt crisis
    JEL: G14 C12 G15 C30 D82
    Date: 2012–03–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:37504&r=ecm
  17. By: Maximo Camacho (Universidad de Murcia); Gabriel Perez-Quiros (Banco de España); Pilar Poncela (Universidad Autónoma de Madrid)
    Abstract: We develop a twofold analysis of how the information provided by several economic indicators can be used in Markov-switching dynamic factor models to identify the business cycle turning points. First, we compare the performance of a fully non-linear multivariate specification (one-step approach) with the “shortcut” of using a linear factor model to obtain a coincident indicator which is then used to compute the Markov-switching probabilities (two-step approach). Second, we examine the role of increasing the number of indicators. Our results suggest that one step is generally preferred to two steps, although its marginal gains diminish as the quality of the indicators increases and as more indicators are used to identify the non-linear signal. Using the four constituent series of the Stock-Watson coincident index, we illustrate these results for US data.
    Keywords: Business cycles, output growth, time series
    JEL: E32 C22 E27
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:bde:wpaper:1202&r=ecm
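    The "two-step approach" ends with a univariate Markov-switching model for the coincident indicator, whose filtered regime probabilities come from the Hamilton filter. A minimal two-regime sketch with illustrative, not estimated, parameters:

      import numpy as np
      from scipy.stats import norm

      def hamilton_filter(x, mu, sigma, P, pi0=(0.5, 0.5)):
          # Filtered probabilities P(s_t = j | x_1..x_t) for a two-regime
          # Gaussian model x_t ~ N(mu[s_t], sigma[s_t]^2);
          # P[i, j] = Pr(s_t = j | s_{t-1} = i).
          filt = np.empty((len(x), 2))
          pred = np.asarray(pi0, dtype=float)
          for t in range(len(x)):
              post = pred * norm.pdf(x[t], loc=mu, scale=sigma)
              filt[t] = post / post.sum()
              pred = filt[t] @ P              # one-step-ahead prediction
          return filt

      mu = np.array([0.5, -0.5])              # expansion / recession means
      sigma = np.array([0.5, 0.8])
      P = np.array([[0.95, 0.05], [0.10, 0.90]])
      rng = np.random.default_rng(0)
      s, x = 0, []
      for _ in range(200):                    # simulate the switching model
          s = rng.choice(2, p=P[s])
          x.append(rng.normal(mu[s], sigma[s]))
      probs = hamilton_filter(np.array(x), mu, sigma, P)
      print("recession probability at the last observation:", probs[-1, 1])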
  18. By: Serguey Khovansky; Zhylyevskyy, Oleksandr
    Abstract: We apply a new econometric method, the generalized method of moments under a common shock, to estimate the idiosyncratic volatility premium and average idiosyncratic stock volatility. In contrast to the popular two-pass estimation approach of Fama and MacBeth (1973), the method requires only a cross-section of return observations. We apply it to cross-sections of weekly U.S. stock returns in January and October 2008 and find that during these months, the idiosyncratic volatility premium is nearly always negative and statistically significant. The results also indicate that average idiosyncratic stock volatility increased by at least 50% between January and October.
    Keywords: Generalized method of moments; Idiosyncratic volatility; Cross-section of stock returns; Idiosyncratic volatility premium
    JEL: C21 C51 G12
    Date: 2012–01–31
    URL: http://d.repec.org/n?u=RePEc:isu:genres:34990&r=ecm
  19. By: Alvarez, Rocio; Camacho, Maximo; Pérez-Quirós, Gabriel
    Abstract: We examine the finite-sample performance of small versus large scale dynamic factor models. Our Monte Carlo analysis reveals that small scale factor models outperform large scale models in factor estimation and forecasting for high levels of cross-correlation across the idiosyncratic errors of series belonging to the same category, for oversampled categories and, especially, for high persistence in either the common factor series or the idiosyncratic errors. Using a panel of 147 US economic indicators, which are classified into 13 economic categories, we show that a small scale dynamic factor model that uses one representative indicator of each category yields satisfactory or even better forecasting results than a large scale dynamic factor model that uses all the economic indicators.
    Keywords: business cycles; output growth; time series
    JEL: C22 E27 E32
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:8867&r=ecm
  20. By: Camacho, Maximo; Pérez-Quirós, Gabriel; Poncela, Pilar
    Abstract: We extend the Markov-switching dynamic factor model to account for some of the specificities of the day-to-day monitoring of economic developments from macroeconomic indicators, such as ragged edges and mixed frequencies. We examine the theoretical benefits of this extension and corroborate the results through several Monte Carlo simulations. Finally, we assess its empirical reliability to compute real-time inferences of the US business cycle.
    Keywords: Business Cycles; Output Growth; Time Series
    JEL: C22 E27 E32
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:8866&r=ecm

This nep-ecm issue is ©2012 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.