nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒11‒22
twenty papers chosen by
Sune Karlsson
Örebro universitet

  1. High Dimensional Generalized Empirical Likelihood for Moment Restrictions with Dependent Data By Chang, Jinyuan; Chen, Song Xi; Chen, Xiaohong
  2. Efficient estimation of heterogeneous coefficients in panel data models with common shock By Li, Kunpeng; Lu, Lina
  3. Particle learning for Bayesian non-parametric Markov Switching Stochastic Volatility model By Audrone Virbickaite; Hedibert F. Lopes; Concepcion Ausín; Pedro Galeano
  4. Optimal Formulations for Nonlinear Autoregressive Processes By Francisco Blasques; Siem Jan Koopman; André Lucas
  5. Band Width Selection for High Dimensional Covariance Matrix Estimation By Qiu, Yumou; Chen, Song Xi
  6. Interval-valued Time Series: Model Estimation based on Order Statistics By Gloria Gonzalez-Rivera; Wei Lin
  7. Low Frequency and Weighted Likelihood Solutions for Mixed Frequency Dynamic Factor Models By Francisco Blasques; Siem Jan Koopman; Max Mallee
  8. Minimum Distance Estimation of Dynamic Models with Errors-In-Variables By Gospodinov, Nikolay; Komunjer, Ivana; Ng, Serena
  9. Dynamic Panels with Threshold Effect and Endogeneity By Myung Hwan Seo; Yongcheol Shin
  10. Dynamic Selection and Distributional Bounds on Search Costs in Dynamic Unit-Demand Models By Jason R. Blevins; Garrett T. Senney
  11. Score Driven Exponentially Weighted Moving Average and Value-at-Risk Forecasting By André Lucas; Xin Zhang
  12. Detrended fluctuation analysis as a regression framework: Estimating dependence at different scales By Ladislav Kristoufek
  13. Spurious Dependence By Chollete, Loran; Pena, Victor de la; Segers, Johan
  14. Nonparametric Identification of Endogenous and Heterogeneous Aggregate Demand Models: Complements, Bundles and the Market Level By Dunker, Fabian; Hoderlein, Stefan; Kaido, Hiroaki
  15. Evaluating Option Pricing Model Performance Using Model Uncertainty By Thorsten Lehnert; Gildas Blanchard; Dennis Bams
  16. Evaluating a Structural Model Forecast: Decomposition Approach By Frantisek Brazdik; Zuzana Humplova; Frantisek Kopriva
  17. The Cult of statistical significance - A Review By Sripad Motiram
  18. Density Forecast Evaluation in Unstable Environments By Gloria Gonzalez-Rivera; Yingying Sun
  19. Modeling Systematic Risk and Point-in-Time Probability of Default under the Vasicek Asymptotic Single Risk Factor Model Framework By Yang, Bill Huajian
  20. Dynamic Model Averaging in Large Model Spaces Using Dynamic Occam's Window By Luca Onorante; Adrian E. Raftery

  1. By: Chang, Jinyuan; Chen, Song Xi; Chen, Xiaohong
    Abstract: This paper considers maximum generalized empirical likelihood (GEL) estimation and inference on parameters identified by high dimensional moment restrictions with weakly dependent data, when the dimensions of the moment restrictions and the parameters diverge along with the sample size. The consistency with rates and the asymptotic normality of the GEL estimator are obtained by properly restricting the growth rates of the dimensions of the parameters and the moment restrictions, as well as the degree of data dependence. It is shown that even in the high dimensional time series setting, the GEL ratio can still behave like a chi-square random variable asymptotically. A consistent test for over-identification is proposed. A penalized GEL method is also provided for estimation under a sparsity setting.
    Keywords: Generalized empirical likelihood; High dimensionality; Penalized likelihood; Variable selection; Over-identification test; Weak dependence.
    JEL: C1 C13 C14
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:59640&r=ecm
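    Sketch: For readers less familiar with GEL, the familiar fixed-dimensional form of the objective (Newey and Smith, 2004) is the saddle-point problem below; the paper studies the case where the number of moments r and the dimension of the parameter grow with n and the data are weakly dependent, which this display does not capture.

      \hat{\theta}_{GEL} = \arg\min_{\theta \in \Theta} \; \sup_{\lambda \in \hat{\Lambda}_n(\theta)} \; \frac{1}{n} \sum_{t=1}^{n} \rho\big( \lambda' g(X_t; \theta) \big),

    where g(X_t; \theta) collects the r moment functions with E[g(X_t; \theta_0)] = 0 and \rho is a concave carrier function, e.g. \rho(v) = \log(1 - v) for empirical likelihood and \rho(v) = -\exp(v) for exponential tilting.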
  2. By: Li, Kunpeng; Lu, Lina
    Abstract: This paper investigates efficient estimation of heterogeneous coefficients in panel data models with common shocks, which have been a particular focus of the recent theoretical and empirical literature. We propose a new two-step method to estimate the heterogeneous coefficients. In the first step, the maximum likelihood (ML) method is used to estimate the loadings and idiosyncratic variances. The second step estimates the heterogeneous coefficients by using the structural relations implied by the model and replacing the unknown parameters with their ML estimates. We establish the asymptotic theory of our estimator, including consistency, asymptotic representation, and limiting distribution. The two-step estimator is asymptotically efficient in the sense that it has the same limiting distribution as the infeasible generalized least squares (GLS) estimator. Intensive Monte Carlo simulations show that the proposed estimator performs robustly in a variety of data setups.
    Keywords: Factor analysis; Block diagonal covariance; Panel data models; Common shocks; Maximum likelihood estimation; Heterogeneous coefficients; Inferential theory
    JEL: C33
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:59312&r=ecm
  3. By: Audrone Virbickaite; Hedibert F. Lopes; Concepcion Ausín; Pedro Galeano
    Abstract: This paper designs a Particle Learning (PL) algorithm for the estimation of Bayesian nonparametric Stochastic Volatility (SV) models for financial data. The performance of this particle method is then compared with standard Markov Chain Monte Carlo (MCMC) methods for non-parametric SV models. PL performs as well as MCMC while allowing for on-line inference: the posterior distributions are updated as new data are observed, which is prohibitively costly using MCMC. Further, a new non-parametric SV model is proposed that incorporates Markov switching jumps. The proposed model is estimated by using PL and tested on simulated data. Finally, the performance of the two non-parametric SV models, with and without Markov switching, is compared by using real financial time series. The results show that including a Markov switching specification provides higher predictive power in the tails of the distribution.
    Keywords: Dirichlet Process Mixture, Markov Switching, MCMC, Particle Learning, Stochastic Volatility, Sequential Monte Carlo
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws142819&r=ecm
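    Sketch: As a point of reference for the sequential Monte Carlo machinery involved, the following is a generic bootstrap particle filter for a basic parametric SV model, written in Python (numpy); it is only an illustration of particle filtering for stochastic volatility and is not the Particle Learning algorithm or the Bayesian nonparametric model developed in the paper.

      import numpy as np

      def sv_bootstrap_filter(y, mu=-1.0, phi=0.95, sigma_eta=0.2, n_particles=2000, seed=0):
          # Illustrative model: h_t = mu + phi*(h_{t-1} - mu) + sigma_eta*eps_t,  y_t = exp(h_t/2)*e_t
          rng = np.random.default_rng(seed)
          T = len(y)
          h = rng.normal(mu, sigma_eta / np.sqrt(1.0 - phi**2), n_particles)  # draw from the stationary law
          h_filt = np.empty(T)
          for t in range(T):
              # propagate particles through the state transition
              h = mu + phi * (h - mu) + sigma_eta * rng.standard_normal(n_particles)
              # weight by the measurement density y_t | h_t ~ N(0, exp(h_t))
              logw = -0.5 * (np.log(2.0 * np.pi) + h + y[t] ** 2 * np.exp(-h))
              w = np.exp(logw - logw.max())
              w /= w.sum()
              h_filt[t] = np.dot(w, h)                          # filtered mean of the log-volatility
              h = h[rng.choice(n_particles, n_particles, p=w)]  # multinomial resampling
          return h_filt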
  4. By: Francisco Blasques; Siem Jan Koopman; André Lucas (VU University Amsterdam, the Netherlands)
    Abstract: We develop optimal formulations for nonlinear autoregressive models by representing them as linear autoregressive models with time-varying temporal dependence coefficients. We propose a parameter updating scheme based on the score of the predictive likelihood function at each time point. The resulting time-varying autoregressive model is formulated as a nonlinear autoregressive model and is compared with threshold and smooth-transition autoregressive models. We establish the information theoretic optimality of the score driven nonlinear autoregressive process and the asymptotic theory for maximum likelihood parameter estimation. The performance of our model in extracting the time-varying or the nonlinear dependence for finite samples is studied in a Monte Carlo exercise. In our empirical study we present the in-sample and out-of-sample performances of our model for a weekly time series of unemployment insurance claims.
    Keywords: Asymptotic theory; Dynamic models; Observation driven time series models; Smooth-transition model; Time-Varying Parameters; Threshold autoregressive model
    JEL: C13 C22 C32
    Date: 2014–08–11
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20140103&r=ecm
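    Sketch: A generic score-driven (generalized autoregressive score) time-varying AR(1) of the kind the abstract refers to can be written as follows; the exact parameterization, score scaling and error distribution used in the paper may differ.

      y_t = f_t y_{t-1} + u_t,
      f_{t+1} = \omega + \alpha s_t + \beta f_t,
      s_t = S_t \, \partial \log p(y_t \mid f_t, \mathcal{F}_{t-1}; \psi) / \partial f_t,

    where f_t is the time-varying autoregressive coefficient and S_t a scaling term. For Gaussian errors u_t ~ N(0, \sigma_u^2) and unit scaling, the score reduces to s_t = y_{t-1}(y_t - f_t y_{t-1}) / \sigma_u^2, so the coefficient is pushed in the direction that improves the one-step-ahead predictive likelihood.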
  5. By: Qiu, Yumou; Chen, Song Xi
    Abstract: The banding estimator of Bickel and Levina (2008a) and its tapering version of Cai, Zhang and Zhou (2010) are important high dimensional covariance estimators. Both estimators require choosing a band width parameter. We propose a band width selector for the banding covariance estimator by minimizing an empirical estimate of the expected squared Frobenius norm of the estimation error matrix. The ratio consistency of the band width selector to the underlying band width is established. We also provide a lower bound for the coverage probability of the underlying band width being contained in an interval around the band width estimate. Extensions to band width selection for the tapering estimator and threshold level selection for the thresholding covariance estimator are made. Numerical simulations and a case study on sonar spectrum data are conducted to confirm and demonstrate the proposed band width and threshold estimation approaches.
    Keywords: Bandable covariance; Banding estimator; Large $p$, small $n$; Ratio-consistency; Tapering estimator; Thresholding estimator.
    JEL: C1 C13 C14 C5
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:59641&r=ecm
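    Sketch: The band width choice problem can be made concrete with a simplified Frobenius-risk criterion estimated by sample splitting (in the spirit of the cross-validation selector of Bickel and Levina); the Python code below is a stand-in illustration, not the selector proposed in the paper.

      import numpy as np

      def band(S, k):
          # zero out all entries of S more than k positions off the diagonal
          p = S.shape[0]
          return S * (np.abs(np.subtract.outer(np.arange(p), np.arange(p))) <= k)

      def select_band_width(X, k_grid, n_splits=50, seed=0):
          rng = np.random.default_rng(seed)
          n = X.shape[0]
          n1 = n // 2
          risk = np.zeros(len(k_grid))
          for _ in range(n_splits):
              idx = rng.permutation(n)
              S1 = np.cov(X[idx[:n1]], rowvar=False)   # band this half...
              S2 = np.cov(X[idx[n1:]], rowvar=False)   # ...and validate against the other half
              for j, k in enumerate(k_grid):
                  risk[j] += np.linalg.norm(band(S1, k) - S2, "fro") ** 2
          return k_grid[int(np.argmin(risk))]          # band width with the smallest estimated risk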
  6. By: Gloria Gonzalez-Rivera (Department of Economics, University of California Riverside); Wei Lin
    Date: 2014–09
    URL: http://d.repec.org/n?u=RePEc:ucr:wpaper:201429&r=ecm
  7. By: Francisco Blasques; Siem Jan Koopman; Max Mallee (VU University Amsterdam, the Netherlands)
    Abstract: The multivariate analysis of a panel of economic and financial time series with mixed frequencies is a challenging problem. The standard solution is to analyze the mix of monthly and quarterly time series jointly by means of a multivariate dynamic model with a monthly time index: artificial missing values are inserted for the intermediate months of the quarterly time series. In this paper we explore an alternative solution for a class of dynamic factor models that is specified by means of a low frequency quarterly time index. We show that there is no need to introduce artificial missing values, while the high frequency (monthly) information is preserved and can still be analyzed. We also provide evidence that the analysis based on a low frequency specification can be carried out in a computationally more efficient way. A comparison study with existing mixed frequency procedures is presented and discussed. Furthermore, we modify the method of maximum likelihood in the context of a dynamic factor model. We introduce variable-specific weights in the likelihood function to give some variable equations more weight during the estimation process. We derive the asymptotic properties of the weighted maximum likelihood estimator and we show that the estimator is consistent and asymptotically normal. We also verify the weighted estimation method in a Monte Carlo study to investigate the effect of different choices for the weights in different scenarios. Finally, we empirically illustrate the new developments for the extraction of a coincident economic indicator from a small panel of mixed frequency economic time series.
    Keywords: Asymptotic theory, Forecasting, Kalman filter, Nowcasting, State space
    JEL: C13 C32 C53 E17
    Date: 2014–08–11
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20140105&r=ecm
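    Sketch: The weighted maximum likelihood idea described in the abstract can be summarized, in stylized notation, as

      \hat{\psi}_W = \arg\max_{\psi} \sum_{i=1}^{N} w_i \, \ell_i(\psi),

    where \ell_i(\psi) is the log-likelihood contribution of the i-th variable (equation) in the dynamic factor model and the w_i >= 0 are user-chosen, variable-specific weights; setting w_i = 1 for all i recovers standard maximum likelihood. The paper's precise construction of the weights and of the likelihood decomposition may differ from this display.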
  8. By: Gospodinov, Nikolay (Federal Reserve Bank of Atlanta); Komunjer, Ivana (University of California-San Diego); Ng, Serena (Columbia University)
    Abstract: Empirical analysis often involves using inexact measures of desired predictors. The bias created by the correlation between the problematic regressors and the error term motivates the need for instrumental variables estimation. This paper considers a class of estimators that can be used when external instruments may not be available or are weak. The idea is to exploit the relation between the parameters of the model and the least squares biases. In cases when this mapping is not analytically tractable, a special algorithm is designed to simulate the latent predictors without completely specifying the processes that induce the biases. The estimators perform well in simulations of the autoregressive distributed lag model and the dynamic panel model. The methodology is used to re-examine the Phillips curve, in which the real activity gap is latent.
    Keywords: measurement error; minimum distance; simulation estimation; dynamic panel
    JEL: C1 C3
    Date: 2014–08–01
    URL: http://d.repec.org/n?u=RePEc:fip:fedawp:2014-11&r=ecm
  9. By: Myung Hwan Seo; Yongcheol Shin
    Abstract: This paper addresses the important and challenging issue of how best to model nonlinear asymmetric dynamics and cross-sectional heterogeneity simultaneously in a dynamic threshold panel data framework in which both the threshold variable and the regressors are allowed to be endogenous. Depending on whether the threshold variable is strictly exogenous or not, we propose two different estimation methods: first-differenced two-step least squares and first-differenced GMM. The former exploits the strict exogeneity of the threshold variable to achieve super-consistency of the threshold estimator. We provide asymptotic distributions of both estimators. A bootstrap-based test for the presence of a threshold effect as well as an exogeneity test of the threshold variable are also developed. Monte Carlo studies provide support for our theoretical predictions. Finally, using UK and US company panel data, we provide two empirical applications investigating an asymmetric sensitivity of investment to cash flows and asymmetric dividend smoothing.
    Keywords: Dynamic Panel Threshold Models, Endogenous Threshold Effects and Regressors, FD-GMM and FD-2SLS Estimation, Linearity Test, Exogeneity Test, Investment and Dividend Smoothing.
    JEL: C13 C33 G31 G35
    Date: 2014–09
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2014/577&r=ecm
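    Sketch: A generic dynamic panel threshold specification of the kind discussed above is (notation illustrative, not necessarily the paper's)

      y_{it} = \mu_i + (1, x_{it}') \phi_1 \, 1\{ q_{it} \le \gamma \} + (1, x_{it}') \phi_2 \, 1\{ q_{it} > \gamma \} + \varepsilon_{it},

    where x_{it} may include the lagged dependent variable y_{i,t-1}, q_{it} is the threshold variable and \gamma the unknown threshold. First-differencing removes the fixed effect \mu_i, after which (\phi_1, \phi_2, \gamma) can be estimated by FD-2SLS when q_{it} is strictly exogenous and by FD-GMM otherwise, corresponding to the two methods the authors propose.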
  10. By: Jason R. Blevins (Department of Economics, Ohio State University); Garrett T. Senney (Department of Economics, Ohio State University)
    Abstract: This paper develops a dynamic model of consumer search that, despite placing very little structure on the dynamic problem faced by consumers, allows us to exploit intertemporal variation in within-period price and search cost distributions to estimate the population distribution from which consumers' search costs are initially drawn. We show that static approaches to estimating this distribution generally suffer from a dynamic sample selection bias because forward-looking consumers with unit demand for a good may delay their purchase in a way that depends on their individual search cost. We analyze identification of the population search cost distribution using only price data and develop estimable nonparametric upper and lower bounds on the distribution function and a nonlinear least squares estimator for parametric models. We also consider the additional identifying power of weak assumptions such as monotonicity of purchase probabilities in search costs. We apply our estimators to analyze the online market for two widely used econometrics textbooks. Our results suggest that static estimates of the search cost distribution are biased upwards, in a distributional sense, relative to the true population distribution. In a small-scale simulation study, we show that this is typical in a dynamic setting where consumers with high search costs are more likely to delay purchase than those with lower search costs.
    Keywords: nonsequential search, consumer search, dynamic selection, nonparametric bounds
    JEL: C14 D83 D43
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:osu:osuewp:14-02&r=ecm
  11. By: André Lucas (VU University Amsterdam); Xin Zhang (Sveriges Riksbank, Sweden)
    Abstract: We present a simple new methodology to allow for time variation in volatilities using a recursive updating scheme similar to the familiar RiskMetrics approach. We update parameters using the score of the forecasting distribution rather than squared lagged observations. This allows the parameter dynamics to adapt automatically to any non-normal data features and robustifies the subsequent volatility estimates. Our new approach nests several extensions to the exponentially weighted moving average (EWMA) scheme proposed earlier. Our approach also easily handles extensions to dynamic higher-order moments or other choices of the preferred forecasting distribution. We apply our method to Value-at-Risk forecasting with Student's t distributions and a time-varying degrees of freedom parameter, and show that the new method is competitive with or better than earlier methods for volatility forecasting of individual stock returns and exchange rates.
    Keywords: dynamic volatilities, time varying higher order moments, integrated generalized autoregressive score models, Exponential Weighted Moving Average (EWMA), Value-at-Risk (VaR)
    JEL: C51 C52 C53 G15
    Date: 2014–07–22
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20140092&r=ecm
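    Sketch: The contrast with the standard RiskMetrics recursion can be illustrated as follows (one common Student-t parameterization; the constants and scaling used in the paper may differ). The usual EWMA update is

      \sigma_{t+1}^2 = \lambda \sigma_t^2 + (1 - \lambda) y_t^2,

    whereas a score-driven update built from a Student-t forecasting density with \nu degrees of freedom takes the form

      \sigma_{t+1}^2 = \sigma_t^2 + \alpha ( w_t y_t^2 - \sigma_t^2 ),   with   w_t = (\nu + 1) / ( (\nu - 2) + y_t^2 / \sigma_t^2 ),

    so that large shocks receive a bounded weight w_t rather than entering through y_t^2 directly; as \nu \to \infty, w_t \to 1 and the Gaussian case reduces to the EWMA recursion with \lambda = 1 - \alpha.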
  12. By: Ladislav Kristoufek
    Abstract: We propose a novel framework combining detrended fluctuation analysis with standard regression methodology. The method is built on detrended variances and covariances and it is designed to estimate regression parameters at different scales and under potential non-stationarity and power-law correlations. Selected examples from physics, finance and environmental sciences illustrate usefulness of the framework.
    Date: 2014–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1411.0496&r=ecm
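    Sketch: In DFA/DCCA-based regression frameworks of this kind, a scale-specific regression coefficient is typically formed from detrended variances and covariances, schematically

      \hat{\beta}(s) = F_{xy}^2(s) / F_{xx}^2(s),

    where F_{xx}^2(s) is the detrended fluctuation (variance) of the regressor at scale s and F_{xy}^2(s) the corresponding detrended covariance with the dependent variable, each obtained by profiling the series, splitting the profiles into boxes of length s, removing a polynomial trend within each box and averaging the residual products over boxes. The notation here is generic; the paper's exact estimator and its properties under non-stationarity and power-law correlations are developed in the text.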
  13. By: Chollete, Loran (UiS); Pena, Victor de la (Columbia University); Segers, Johan (Universite Catholique de Louvain)
    Abstract: We study the problem of potentially spurious attribution of dependence in moderate to large samples, where both the number of variables and length of variable observations are growing. We approach this question of double asymptotics from both theoretical and empirical perspectives. For theoretical characterization, we consider a combination of poissonization and large deviation techniques. For empirics, we simulate a large dataset of i.i.d. variables and estimate dependence as both sample size and the number of iterations grow. We represent the different effects of sample size versus length of variables, via an empirical dependence surface. Finally, we apply the empirical method to a panel of financial data, comprising daily stock returns for 60 companies. For both simulated and financial data, increasing sample size reduces dependence estimates after a certain point. However, increasing the number of variables does not appear to attenuate the potential for spurious dependence, as measured by maximal Kendall's tau.
    Keywords: Double Asymptotics; Empirical Dependence Surface; Financial Data; Poissonization; Simulation; Spurious Dependence
    JEL: A10
    Date: 2014–08–31
    URL: http://d.repec.org/n?u=RePEc:hhs:stavef:2014_010&r=ecm
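    Sketch: The flavor of the simulation exercise can be reproduced with a few lines of Python (numpy/scipy); the grid of sizes and the use of standard normal variables below are illustrative choices, not the settings of the paper.

      import itertools
      import numpy as np
      from scipy.stats import kendalltau

      def max_abs_tau(X):
          # largest absolute pairwise Kendall's tau among the columns of X
          return max(abs(kendalltau(X[:, i], X[:, j])[0])
                     for i, j in itertools.combinations(range(X.shape[1]), 2))

      rng = np.random.default_rng(0)
      for N in (5, 20, 60):                      # number of variables
          for T in (50, 250, 1000):              # length of each variable
              X = rng.standard_normal((T, N))    # i.i.d. draws, so any measured dependence is spurious
              print(f"N={N:3d}  T={T:5d}  max |tau| = {max_abs_tau(X):.3f}")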
  14. By: Dunker, Fabian (University of Goettingen and Boston College); Hoderlein, Stefan (Boston College); Kaido, Hiroaki (Boston University)
    Abstract: This paper studies nonparametric identification in market level demand models for differentiated products. We generalize common models by allowing the heterogeneity parameters (random coefficients) to have a nonparametric distribution across the population and give conditions under which the density of the random coefficients is identified. We show that key identifying restrictions are provided by (i) a set of moment conditions generated by instrumental variables together with an inversion of aggregate demand in unobserved product characteristics; and (ii) an integral transform (Radon transform) that maps the random coefficient density to the aggregate demand. This feature is shown to be common across a wide class of models, and we illustrate this by studying leading demand models. Our examples include demand models based on multinomial choice (Berry, Levinsohn, Pakes, 1995), the choice of bundles of goods that can be substitutes or complements, and the choice of goods consumed in multiple units.
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:ihs:ihsesp:307&r=ecm
  15. By: Thorsten Lehnert; Gildas Blanchard; Dennis Bams (LSF)
    Abstract: The objective of this paper is to evaluate option pricing performance at the cross-sectional level. For this purpose, we propose a statistical framework in which we, in particular, account for the uncertainty associated with the reported pricing performance. Instead of a single figure, we determine an entire probability distribution function for the loss function that is used to measure option pricing performance. This methodology enables us to visualize the effect of parameter uncertainty on the reported pricing performance. Using a data driven approach, we confirm previous evidence that standard volatility models with clustering and leverage effects are sufficient for the option pricing purpose. In addition, we demonstrate that there is short-term persistence but long-term heterogeneity in cross-sectional option pricing information. This finding has two important implications. First, it justifies the practitioner's routine of refraining from time series approaches and instead estimating option pricing models on a cross-section by cross-section basis. Second, the long-term heterogeneity in option prices pinpoints the importance of measuring, comparing and testing option pricing models for each cross-section separately. To our knowledge, no statistical testing framework has been applied to a single cross-section of option prices before. We propose a methodology that addresses this need. The proposed framework can be applied to a broad set of models and data. In the empirical part of the paper, we show, by means of an example, an application that uses a discrete time volatility model on S&P 500 European options.
    Keywords: option pricing, cross-section, estimation risk, parameter uncertainty, specification test, bootstrapping
    JEL: G12 C15
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:crf:wpaper:14-06&r=ecm
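    Sketch: The general idea of attaching a distribution, rather than a single number, to a cross-sectional pricing-error loss can be illustrated with a plain bootstrap over the cross-section; the RMSE loss and resampling scheme below are illustrative assumptions and not the specific testing framework proposed in the paper.

      import numpy as np

      def bootstrap_rmse(market_prices, model_prices, n_boot=5000, seed=0):
          rng = np.random.default_rng(seed)
          errors = np.asarray(market_prices) - np.asarray(model_prices)
          n = len(errors)
          draws = rng.integers(0, n, size=(n_boot, n))        # resample option contracts with replacement
          return np.sqrt((errors[draws] ** 2).mean(axis=1))   # one RMSE per bootstrap cross-section

    Comparing the resulting RMSE distributions of two candidate models on the same cross-section then gives a sense of whether an observed difference in pricing performance exceeds what sampling uncertainty alone would produce.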
  16. By: Frantisek Brazdik; Zuzana Humplova; Frantisek Kopriva
    Abstract: Macroeconomic forecasters are often criticized for a lack of transparency when presenting their forecasts. To deter such criticism, the transparency of the forecasting process should be enhanced by tracing and explaining the effects of data revisions and expert judgment updates on variations in the forecasts. This paper presents a forecast decomposition analysis framework designed to examine the differences between two forecasts generated by a linear structural model. The differences between the forecasts considered can be decomposed into the contributions of various forecast elements, such as the effect of new data or expert judgment. The framework allows us to evaluate the contributions of forecast assumptions in the presence of expert judgment applied in the expected way. The simplest application of this framework examines alternative forecast scenarios with different forecast assumptions. Next, a one-period difference between the forecasts’ initial periods is added to the examination. Finally, a replication of the Inflation Forecast Evaluation presented in Inflation Report III/2013 is created to illustrate the full capabilities of the decomposition framework.
    Keywords: Data revisions, DSGE models, forecasting, forecast revisions
    JEL: C53 E01 E47
    Date: 2014–08
    URL: http://d.repec.org/n?u=RePEc:cnb:rpnrpn:2014/02&r=ecm
  17. By: Sripad Motiram (Indira Gandhi Institute of Development Research)
    Abstract: I present a review and extended discussion of The Cult of Statistical Significance: How the Standard Error Costs Us Jobs, Justice and Lives by Deirdre McCloskey and Stephen Ziliak, a work that raises important issues related to the practice of statistics and that has been widely commented upon. For this review, I draw upon several other works on statistics and my personal experiences as a teacher of undergraduate econometrics.
    Keywords: Significance, Standard Error, Application of Statistics, Methodology
    JEL: C1 C12
    Date: 2014–09
    URL: http://d.repec.org/n?u=RePEc:ind:igiwpp:2014-038&r=ecm
  18. By: Gloria Gonzalez-Rivera (Department of Economics, University of California Riverside); Yingying Sun
    Date: 2014–08
    URL: http://d.repec.org/n?u=RePEc:ucr:wpaper:201428&r=ecm
  19. By: Yang, Bill Huajian
    Abstract: Systematic risk has been a focus for stress testing and risk capital assessment. Under the Vasicek asymptotic single risk factor model framework, entity default risk for a risk-homogeneous portfolio divides into two parts: systematic and entity specific. While entity specific risk can be modeled by a probit or logistic model using a relatively short period of portfolio historical data, modeling of systematic risk is more challenging. In practice, most default risk models do not fully or dynamically capture systematic risk. In this paper, we propose an approach to modeling systematic and entity specific risks separately and then aggregating them analytically. Systematic risk is quantified and modeled by a multifactor Vasicek model with a latent residual, a factor accounting for default contagion and feedback effects. The asymptotic maximum likelihood approach for parameter estimation for this model is equivalent to least squares linear regression. Conditional entity PDs for scenario tests and the through-the-cycle entity PD all have analytical solutions. For validation, we model the point-in-time entity PD for a commercial portfolio, and stress the portfolio default risk by shocking the systematic risk factors. Rating migration and portfolio loss are assessed.
    Keywords: point-in-time PD, through-the-cycle PD, Vasicek model, systematic risk, entity specific risk, stress testing, rating migration, scenario loss
    JEL: B4 C1 C5 E6 G18 G3
    Date: 2014–03–18
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:59025&r=ecm
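    Sketch: For reference, the textbook single-factor version of the Vasicek/ASRF relation between the through-the-cycle PD and the point-in-time PD conditional on the systematic factor is

      r_i = \sqrt{\rho} \, z + \sqrt{1 - \rho} \, \varepsilon_i,   default if  r_i < \Phi^{-1}(p),
      p(z) = \Phi\!\left( \frac{ \Phi^{-1}(p) - \sqrt{\rho} \, z }{ \sqrt{1 - \rho} } \right),

    where z is the standard normal systematic factor, \varepsilon_i the idiosyncratic shock, \rho the asset correlation and p the unconditional (through-the-cycle) PD. The paper extends this one-factor setting to a multifactor Vasicek model with a latent residual capturing contagion and feedback effects.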
  20. By: Luca Onorante; Adrian E. Raftery
    Abstract: Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method for this situation which allows us to perform DMA without considering the whole model space, but using a subset of models and dynamically optimizing the choice of models at each point in time. This yields a dynamic form of Occam's window. We evaluate the method in the context of the problem of nowcasting GDP in the Euro area. We find that its forecasting performance compares well with that of other methods.
    Keywords: Bayesian model averaging; Model uncertainty; Nowcasting; Occam's window
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1410.7799&r=ecm
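    Sketch: The standard DMA recursion that a dynamic Occam's window builds on (Raftery et al.) updates model probabilities by a forgetting step followed by a Bayes step,

      \pi_{t|t-1,k} = \pi_{t-1|t-1,k}^{\alpha} / \sum_{l} \pi_{t-1|t-1,l}^{\alpha},
      \pi_{t|t,k} = \pi_{t|t-1,k} \, p_k(y_t \mid y^{t-1}) / \sum_{l} \pi_{t|t-1,l} \, p_l(y_t \mid y^{t-1}),

    with forgetting factor \alpha in (0, 1] and p_k the one-step-ahead predictive density of model k. A dynamic Occam's window, loosely speaking, carries forward at each t only the models whose probability is within a chosen fraction of the current best and then expands the candidate set around the survivors; the precise expansion and selection rules are those developed in the paper.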

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.