nep-ecm New Economics Papers
on Econometrics
Issue of 2018‒10‒22
Eighteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Asymptotic Theory for Rotated Multivariate GARCH Models By Manabu Asai; Chia-Lin Chang; Michael McAleer; Laurent Pauwels
  2. Testing Identification via Heteroskedasticity in Structural Vector Autoregressive Models By Helmut Lütkepohl; Mika Meitz; Aleksei Netšunajev; Pentti Saikkonen
  3. Assessing Tail Risk Using Expectile Regressions with Partially Varying Coefficients By Zongwu Cai; Ying Fang; Dingshi Tian
  4. Trending Mixture Copula Models with Copula Selection By Bingduo Yang; Zongwu Cai; Christian M. Hafner; Guannan Liu
  5. Unified Tests for a Dynamic Predictive Regression By Bingduo Yang; Xiaohui Liu; Liang Peng; Zongwu Cai
  6. Econometric Modeling of Risk Measures: A Selective Review of the Recent Literature By Dingshi Tian; Zongwu Cai; Ying Fang
  7. Inefficiency and Bank Failures: A Joint Bayesian Estimation of a Stochastic Frontier Model and a Hazards Model By Jim Sánchez González; Diego Restrepo Tobón; Andrés Ramírez Hassan
  8. A General Weighted Average Representation of the Ordinary and Two-Stage Least Squares Estimands By Tymon Słoczyński
  9. Are the hypothesis tests used in time-on-market studies powerful enough? By Paul Anglin
  10. Constructing Joint Confidence Bands for Impulse Response Functions of VAR Models: A Review By Helmut Lütkepohl; Anna Staszewska-Bystrova; Peter Winker
  11. Using the GB2 Income Distribution: A Review By Duangkamon Chotikapanich; William E. Griffiths; Gholamreza Hajargasht; Wasana Karunarathne; D.S. Prasada Rao
  12. Stochastic Revealed Preferences with Measurement Error By Victor H. Aguiar; Nail Kashaev
  13. The Incidental Parameters Problem in Testing for Remaining Cross-section Correlation By Arturas Juodis; Simon Reese
  14. A structural model of firm collaborations with unobserved heterogeneity By Shweta Gaonkar; Angelo Mele
  15. A framework for early-warning modeling with an application to banks By Lang, Jan Hannes; Peltonen, Tuomas A.; Sarlin, Peter
  16. Rehabilitating the Random Utility Model. A comment on Apesteguia and Ballester (2018) By Anna Conte; John D. Hey
  17. Gaussian Copulas for Imposing Structure on VAR By Dallakyan, Aramayis; Bessler, David A.
  18. Granger causality on horizontal sum of Boolean algebras By M. Bohdalová; M. Kalina; O. Nánásiová

  1. By: Manabu Asai (Faculty of Economics, Soka University, Japan.); Chia-Lin Chang (Department of Applied Economics & Department of Finance, National Chung Hsing University, Taiwan.); Michael McAleer (Department of Quantitative Finance, National Tsing Hua University, Taiwan; Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam, The Netherlands; Department of Quantitative Economics, Complutense University of Madrid, Spain; and Institute of Advanced Sciences, Yokohama National University, Japan.); Laurent Pauwels (Discipline of Business Analytics, University of Sydney Business School, Australia.)
    Abstract: In this paper, we derive the statistical properties of a two-step approach to estimating multivariate GARCH rotated BEKK (RBEKK) models. By the definition of rotated BEKK, we estimate the unconditional covariance matrix in the first step in order to rotate the observed variables so that their sample covariance matrix is the identity matrix. In the second step, we estimate the remaining parameters by maximizing the quasi-likelihood function. For this two-step quasi-maximum likelihood (2sQML) estimator, we show consistency and asymptotic normality under weak conditions. While second-order moments are needed for consistency of the estimated unconditional covariance matrix, the existence of finite sixth-order moments is required for convergence of the second-order derivatives of the quasi-log-likelihood function. We also show the relationship between the asymptotic distributions of the 2sQML estimator for the RBEKK model and the variance targeting (VT) QML estimator for the VT-BEKK model. Monte Carlo experiments show that the bias of the 2sQML estimator is negligible, and that the appropriateness of the diagonal specification depends on how close the data-generating process is to the diagonal BEKK or the diagonal RBEKK model.
    Keywords: BEKK, Rotated BEKK, Diagonal BEKK, Variance targeting, Multivariate GARCH, Consistency, Asymptotic normality.
    JEL: C13 C32
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1827&r=ecm
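    The rotation in the first step is simply a standardization by the inverse square root of the sample covariance matrix. A minimal Python/NumPy sketch of that step (illustrative only, not code from the paper; the second-step BEKK quasi-likelihood maximization is omitted):
      import numpy as np

      def rotate_returns(y):
          """Rotate a (T x k) return matrix so that its sample covariance is the identity."""
          y = y - y.mean(axis=0)                       # demean
          H = y.T @ y / y.shape[0]                     # unconditional covariance estimate
          vals, vecs = np.linalg.eigh(H)               # spectral decomposition of H
          H_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
          return y @ H_inv_sqrt, H                     # rotated series and estimated covariance

      rng = np.random.default_rng(0)
      y = rng.multivariate_normal([0.0, 0.0], [[2.0, 0.8], [0.8, 1.0]], size=500)
      e, H = rotate_returns(y)
      print(np.cov(e, rowvar=False, bias=True))        # approximately the 2x2 identity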
  2. By: Helmut Lütkepohl; Mika Meitz; Aleksei Netšunajev; Pentti Saikkonen
    Abstract: Tests for identification through heteroskedasticity in structural vector autoregressive analysis are developed for models with two volatility states in which the time point of the volatility change is known. The tests are Wald-type tests for which only the unrestricted model, including the covariance matrices of the two volatility states, has to be estimated. The residuals of the model are assumed to come from the class of elliptical distributions, which includes Gaussian models. The asymptotic null distributions of the test statistics are derived, and simulations are used to explore their small-sample properties. Two empirical examples illustrate the usefulness of the tests.
    Keywords: Heteroskedasticity, structural identification, vector autoregressive process
    JEL: C32
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1764&r=ecm
  3. By: Zongwu Cai (Department of Economics, The University of Kansas; Center for Financial Stability, New York City; IC2 Institute, University of Texas at Austin); Ying Fang (The Wang Yanan Institute for Studies in Economics, Xiamen University, China); Dingshi Tian (The Wang Yanan Institute for Studies in Economics, Xiamen University, China)
    Abstract: To characterize heteroskedasticity, nonlinearity, and asymmetry in tail risk, this paper investigates a class of conditional (dynamic) expectile models with partially varying coefficients, in which some coefficients are allowed to be constants while others are allowed to be unknown functions of random variables. A three-stage estimation procedure is proposed to estimate both the parametric constant coefficients and the nonparametric functional coefficients, and their asymptotic properties are investigated in a time series context, together with a new, simple, and easily implemented test of model goodness of fit and a bandwidth selector based on a newly defined cross-validatory estimate of the expected expectile forecasting error. The proposed methodology is data-analytic and flexible enough to analyze complex and multivariate nonlinear structures without suffering from the curse of dimensionality. Finally, the proposed model is illustrated with simulated data and applied to the daily S&P500 return series.
    Keywords: Expectile; Heteroskedasticity; Nonlinearity; Varying Coefficients; Tail Risk
    JEL: C58 C14
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:kan:wpaper:201804&r=ecm
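    For readers unfamiliar with expectiles: they minimize an asymmetric squared-error loss, and a constant-coefficient linear expectile regression can be fitted by iteratively reweighted least squares. A minimal sketch of that simpler case (illustrative only; it is not the paper's three-stage partially varying-coefficient estimator):
      import numpy as np

      def expectile_regression(X, y, tau=0.95, n_iter=100, tol=1e-10):
          """Minimize sum_i |tau - 1{y_i - x_i'b < 0}| * (y_i - x_i'b)^2 over b."""
          beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS start (the tau = 0.5 case)
          for _ in range(n_iter):
              w = np.where(y - X @ beta >= 0, tau, 1.0 - tau)  # asymmetric weights
              beta_new = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
              if np.max(np.abs(beta_new - beta)) < tol:
                  return beta_new
              beta = beta_new
          return beta

      rng = np.random.default_rng(1)
      x = rng.normal(size=1000)
      y = 0.5 * x + (1 + 0.5 * np.abs(x)) * rng.normal(size=1000)   # heteroskedastic errors
      X = np.column_stack([np.ones_like(x), x])
      print(expectile_regression(X, y, tau=0.95))                   # upper-tail expectile fit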
  4. By: Bingduo Yang (Lingnan (University) College, Sun Yat-sen University, Guangzhou, China); Zongwu Cai (Department of Economics, The University of Kansas); Christian M. Hafner (Institut de statistique and CORE, Université catholique de Louvain, Louvain-la-Neuve, Belgium.); Guannan Liu (The Wang Yanan Institute for Studies in Economics, Xiamen University, Xiamen, China)
    Abstract: Modeling the joint tails of multiple financial time series has important implications for risk management. Classical models for dependence often encounter a lack of fit in the joint tails, calling for additional flexibility. In this paper we introduce a new nonparametric time-varying mixture copula model, in which both weights and dependence parameters are deterministic functions of time. We propose penalized trending mixture copula models with group smoothly clipped absolute deviation (SCAD) penalty functions to perform estimation and copula selection simultaneously. Monte Carlo simulation results suggest that the shrinkage estimation procedure performs well in selecting and estimating both constant and trending mixture copula models. Using the proposed model and method, we analyze the evolution of the dependence among four international stock markets, and find substantial changes in the levels and patterns of the dependence, in particular around crisis periods.
    Keywords: Copula, Time-Varying Copula, Mixture Copula, Copula Selection
    JEL: C31 C32 C51
    Date: 2018–09
    URL: http://d.repec.org/n?u=RePEc:kan:wpaper:201809&r=ecm
  5. By: Bingduo Yang (School of Finance, Jiangxi University of Finance and Economics, Nanchang, China); Xiaohui Liu (School of Finance, Jiangxi University of Finance and Economics, Nanchang, China); Liang Peng (Department of Risk Management and Insurance, Georgia State University); Zongwu Cai (Department of Economics, University of Kansas)
    Abstract: Testing for the predictability of asset returns has a long history in economics and finance. Recently, based on a simple predictive regression, Kostakis, Magdalinos and Stamatogiannis (2015, Review of Financial Studies) derived a Wald-type test within the extended instrumental variable (IVX) framework for testing the predictability of stock returns, and Demetrescu (2014) showed that the local power of the standard IVX-based test can be improved in some cases when a lagged predicted variable is deliberately added to the predictive regression. This raises the general and important question of whether a lagged predicted variable should be included in the model. This paper proposes novel robust procedures for testing both the presence of a lagged predicted variable and the predictability of asset returns in a predictive regression, regardless of whether the regressors are stationary, nearly integrated, or unit-root processes, and regardless of whether the AR model for the regressors includes an intercept. A simulation study confirms the good finite-sample performance of the proposed tests, which are then applied to several real financial datasets to illustrate their practical usefulness.
    Keywords: Autoregressive Errors; Empirical Likelihood; Predictive Regression; Weighted Score.
    JEL: C12 C22
    Date: 2018–09
    URL: http://d.repec.org/n?u=RePEc:kan:wpaper:201808&r=ecm
  6. By: Dingshi Tian (The Wang Yanan Institute for Studies in Economics, Xiamen University, Xiamen, China); Zongwu Cai (Department of Economics, The University of Kansas); Ying Fang (The Wang Yanan Institute for Studies in Economics, Xiamen University, Xiamen, China)
    Abstract: Since the financial crisis of 2008, risk measures, which are the core of risk management, have received increasing attention among economists and practitioners. In this review, we concentrate on recent developments in the estimation of the most popular risk measures, namely value at risk (VaR), expected shortfall (ES), and the expectile. After introducing the concept of risk measures, we focus on discussing and comparing their econometric modeling. We then investigate parametric and nonparametric estimation of tail dependence. Finally, we conclude with insights into future research directions.
    Keywords: Expectile; Expected Shortfall; Network; Nonparametric Estimation; Tail Dependence; Value at Risk.
    JEL: C58 C14
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:kan:wpaper:201807&r=ecm
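    As background for the measures surveyed, a minimal sketch of their simplest nonparametric (historical-simulation) estimates from a sample of returns; the return series and levels below are hypothetical:
      import numpy as np

      def historical_var_es(returns, alpha=0.05):
          losses = -np.asarray(returns)
          var = np.quantile(losses, 1 - alpha)         # value at risk at level alpha
          es = losses[losses >= var].mean()            # expected shortfall beyond the VaR
          return var, es

      def expectile(returns, tau=0.975, n_iter=200):
          """Scalar tau-expectile via fixed-point iteration on its first-order condition."""
          y = np.asarray(returns)
          e = y.mean()
          for _ in range(n_iter):
              w = np.where(y > e, tau, 1.0 - tau)
              e = np.sum(w * y) / np.sum(w)
          return e

      rng = np.random.default_rng(2)
      r = 0.01 * rng.standard_t(df=5, size=5000)       # fat-tailed "daily returns"
      print(historical_var_es(r, alpha=0.05))
      print(expectile(r, tau=0.975))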
  7. By: Jim Sánchez González; Diego Restrepo Tobón; Andrés Ramírez Hassan
    Abstract: In modeling bank failure, estimating inefficiency separately from the hazards model results in inefficient, biased, and inconsistent estimators. We develop a method to simultaneously estimate a stochastic frontier model and a hazards model using Bayesian techniques. This method overcomes issues related to two-stage estimation methods, allows for computing the marginal effects of inefficiency on the probability of failure, and facilitates statistical inference on functions of the parameters such as elasticities, returns to scale, and individual inefficiencies. Simulation exercises show that our proposed method performs better than two-stage maximum likelihood, especially in small samples. In addition, we find that inefficiency plays a statistically and economically significant role in determining the time to failure of U.S. commercial banks during 2001 to 2010.
    Keywords: Technical Inefficiency; Proportional Hazards Model; Bank Failures
    JEL: C11 G21 G33
    Date: 2018–08–27
    URL: http://d.repec.org/n?u=RePEc:col:000122:016788&r=ecm
  8. By: Tymon S{\l}oczy\'nski
    Abstract: It is standard practice in applied work to study the effect of a binary variable ("treatment") on an outcome of interest using linear models with additive effects. In this paper I study the interpretation of the ordinary and two-stage least squares estimands in such models when treatment effects are in fact heterogeneous. I show that in both cases the coefficient on treatment is identical to a convex combination of two other parameters (different for OLS and 2SLS), which can be interpreted as the average treatment effects on the treated and controls under additional assumptions. Importantly, the OLS and 2SLS weights on these parameters are inversely related to the proportion of each group. The more units get treatment, the less weight is placed on the effect on the treated. Consequently, reliance on these implicit weights can have serious consequences for applied work. I illustrate some of these issues in four empirical applications from different fields of economics. I also develop a weighted least squares correction and simple diagnostic tools that applied researchers can use to avoid potential biases. In an important special case, my diagnostics require only knowledge of the proportion of treated units.
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1810.01576&r=ecm
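    A small simulation (illustrative only, not the paper's derivation) of the phenomenon described in the abstract: with heterogeneous effects, the OLS coefficient on the treatment dummy is a weighted average of the effects on the treated and on the untreated, and the implied weight on the treated-group effect falls as the treated share rises. All numbers are hypothetical.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 400_000
      x = rng.binomial(1, 0.5, size=n)                      # single binary covariate
      for p1 in (0.3, 0.95):                                # treatment probability when x = 1
          p = np.where(x == 1, p1, 0.1)
          d = rng.binomial(1, p)
          effect = np.where(x == 1, 2.0, 0.5)               # heterogeneous treatment effect
          y = 1.0 + 0.5 * x + effect * d + rng.normal(size=n)
          Z = np.column_stack([np.ones(n), d, x])
          b_ols = np.linalg.lstsq(Z, y, rcond=None)[0][1]   # OLS coefficient on treatment
          att, atu = effect[d == 1].mean(), effect[d == 0].mean()
          w_att = (b_ols - atu) / (att - atu)               # implied weight on the ATT
          print(f"treated share={d.mean():.2f}  OLS={b_ols:.2f}  "
                f"ATT={att:.2f}  ATU={atu:.2f}  weight on ATT={w_att:.2f}")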
  9. By: Paul Anglin
    Abstract: Two approaches are widely used when estimating how long it takes to sell a property: one approach uses least squares and the other is based on a hazard model. Despite many years of study, the field does not seem to have reached a consensus on the “typical” effect of important variables. This paper uses numerical simulations and some simple reasoning to study the power of tests to reject hypotheses which are false. The numerical simulations offer three key insights. First, they quantify the effect of a difference in the hypothesis. Small differences are hard to detect while the power to reject a big difference varies with the characteristics of the data. Second, since the simulation can impose a true distribution, it is possible to study the effect of using a knowingly misspecified model with less statistical theory and more focus on practical implications. I show that analysis based on estimating a hazard model seems to be more robust. Third, the simulations are used to explore the conceptual difference between testing the hypothesis that an effect exists vs. tests aimed at measuring the magnitude of an effect. This difference is very important when the different approaches use different parameters which affect the estimates without being studied directly. I discuss the implications for practitioners and for researchers.
    Keywords: hypothesis testing; market clearing; numerical simulations; Power; Time on Market
    JEL: R3
    Date: 2018–01–01
    URL: http://d.repec.org/n?u=RePEc:arz:wpaper:eres2018_242&r=ecm
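    A minimal sketch of the kind of power calculation discussed: simulate time-on-market data with a known effect, run a log-linear least-squares regression, and record how often the (false) null of no effect is rejected. The model, effect sizes, and sample size are hypothetical.
      import numpy as np
      from scipy import stats

      def rejection_rate(effect, n=200, n_sims=2000, alpha=0.05, seed=0):
          rng = np.random.default_rng(seed)
          rejections = 0
          for _ in range(n_sims):
              x = rng.normal(size=n)                                  # e.g. a list-price premium
              log_tom = 3.0 + effect * x + rng.normal(size=n)         # log time on market
              X = np.column_stack([np.ones(n), x])
              beta = np.linalg.lstsq(X, log_tom, rcond=None)[0]
              resid = log_tom - X @ beta
              se = np.sqrt(resid @ resid / (n - 2) * np.linalg.inv(X.T @ X)[1, 1])
              t_stat = beta[1] / se
              rejections += abs(t_stat) > stats.t.ppf(1 - alpha / 2, df=n - 2)
          return rejections / n_sims

      for effect in (0.05, 0.2, 0.5):
          print(effect, rejection_rate(effect))                       # power rises with effect size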
  10. By: Helmut Lütkepohl; Anna Staszewska-Bystrova; Peter Winker
    Abstract: Methods for constructing joint confidence bands for impulse response functions which are commonly used in vector autoregressive analysis are reviewed. While considering separate intervals for each horizon individually still seems to be the most common approach, a substantial number of methods have been proposed for making joint inferences about the complete impulse response paths up to a given horizon. A structured presentation of these methods is provided. Furthermore, existing evidence on the small-sample performance of the methods is gathered. The collected information can help practitioners to decide on a suitable confidence band for a structural VAR analysis.
    Keywords: Impulse responses, vector autoregressive model, joint confidence bands
    JEL: C32
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1762&r=ecm
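    One of the joint-band constructions reviews of this kind cover is the bootstrap "sup-t" band, which widens pointwise intervals by the (1 - alpha) quantile of the maximum standardized deviation across horizons. A minimal sketch on toy bootstrap draws (illustrative only; the review compares several such methods):
      import numpy as np

      def sup_t_band(irf_hat, irf_boot, alpha=0.1):
          """irf_hat: (H,) point estimates; irf_boot: (B, H) bootstrap draws of the response path."""
          se = irf_boot.std(axis=0, ddof=1)                    # pointwise standard errors
          max_dev = np.max(np.abs(irf_boot - irf_hat) / se, axis=1)
          c = np.quantile(max_dev, 1 - alpha)                  # joint critical value
          return irf_hat - c * se, irf_hat + c * se

      rng = np.random.default_rng(4)
      H, B = 12, 1000
      irf_hat = 0.8 ** np.arange(H)                            # toy decaying impulse response
      irf_boot = irf_hat + (1 + 0.1 * np.arange(H)) * rng.normal(scale=0.1, size=(B, H))
      lower, upper = sup_t_band(irf_hat, irf_boot, alpha=0.1)
      print(np.round(lower, 2)); print(np.round(upper, 2))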
  11. By: Duangkamon Chotikapanich (Monash University); William E. Griffiths (Department of Economics, University of Melbourne); Gholamreza Hajargasht (Swinburne University); Wasana Karunarathne (Department of Economics, University of Melbourne); D.S. Prasada Rao (University of Queensland)
    Abstract: To use the GB2 distribution for the analysis of income and other positively-skewed distributions, knowledge of estimation methods and the ability to compute quantities of interest from the estimated parameters are required. We review estimation methodology that has appeared in the literature, and summarise expressions for inequality, poverty, and pro-poor growth that can be used to compute these measures from GB2 parameter estimates. An application to data from China and Indonesia is provided.
    JEL: I32 O15 C13
    Date: 2018–02
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:2036&r=ecm
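    For reference, the GB2 density the review works with (the generalized beta distribution of the second kind), coded directly; the parameter values are arbitrary illustrations, not estimates from the paper:
      import numpy as np
      from scipy.integrate import quad
      from scipy.special import beta as beta_fn

      def gb2_pdf(y, a, b, p, q):
          """f(y) = a y^(ap-1) / ( b^(ap) B(p,q) (1 + (y/b)^a)^(p+q) ) for y > 0."""
          y = np.asarray(y, dtype=float)
          return a * y ** (a * p - 1) / (b ** (a * p) * beta_fn(p, q) * (1 + (y / b) ** a) ** (p + q))

      print(gb2_pdf(np.linspace(0.5, 10.0, 5), a=2.0, b=3.0, p=1.5, q=2.0))
      # sanity check: the density integrates to one over (0, infinity)
      print(quad(lambda t: float(gb2_pdf(t, 2.0, 3.0, 1.5, 2.0)), 0.0, np.inf)[0])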
  12. By: Victor H. Aguiar; Nail Kashaev
    Abstract: A long-standing question about consumer behavior is whether individuals' observed purchase decisions satisfy the revealed preference (RP) axioms of the utility maximization theory (UMT). Researchers using survey or experimental panel data sets on prices and consumption to answer this question face the well-known problem of measurement error. We show that ignoring measurement error in the RP approach may lead to overrejection of the UMT. To solve this problem, this paper proposes a new statistical RP framework for consumption panel data sets that allows for testing the UMT in the presence of measurement error. Our test is applicable to all consumer models that can be characterized by their first-order conditions. Our approach is nonparametric, allows for unrestricted heterogeneity in preferences, and requires only a centering condition on measurement error. We develop two applications that provide new evidence about the UMT. First, we find support in a survey dataset for the dynamic and time-consistent UMT in single-individual households, in the presence of nonclassical measurement error in consumption. In the second application, we cannot reject the static UMT in a widely used experimental dataset where measurement error in prices is assumed to be the result of price misperception due to the experimental design. The first finding stands in contrast to the conclusions drawn from the deterministic RP test of Browning (1989). The second finding reverses the conclusions drawn from the deterministic RP test of Afriat (1967) and Varian (1982).
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1810.05287&r=ecm
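    As background, a sketch of the deterministic revealed-preference (GARP) check of Afriat and Varian that the paper's measurement-error-robust framework builds on, with hypothetical prices and bundles:
      import numpy as np

      def satisfies_garp(prices, quantities):
          """Check Varian's GARP for T observations of prices and chosen bundles, both (T x k)."""
          E = prices @ quantities.T                    # E[t, s] = cost of bundle s at prices t
          own = np.diag(E)                             # cost of each chosen bundle at its own prices
          R = own[:, None] >= E                        # direct revealed preference: x_t R0 x_s
          for k in range(len(R)):                      # Warshall transitive closure
              R = R | (R[:, [k]] & R[[k], :])
          strict = own[:, None] > E                    # strict direct revealed preference
          return not np.any(R & strict.T)              # violation: x_t R x_s and x_s P0 x_t

      p = np.array([[1.0, 2.0], [2.0, 1.0]])
      x = np.array([[3.0, 1.0], [1.0, 3.0]])
      print(satisfies_garp(p, x))                      # True: choices are consistent
      p = np.array([[1.0, 1.0], [2.0, 0.1]])
      x = np.array([[2.0, 2.0], [3.0, 0.0]])
      print(satisfies_garp(p, x))                      # False: a GARP violation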
  13. By: Arturas Juodis; Simon Reese
    Abstract: In this paper we consider the properties of the Pesaran (2004, 2015a) CD test for cross-section correlation when applied to residuals obtained from panel data models with many estimated parameters. We show that the presence of period-specific parameters leads the CD test statistic to diverge as the length of the time dimension of the sample grows. This result holds even if cross-section dependence is correctly accounted for and hence constitutes an example of the Incidental Parameters Problem. The relevance of this problem is investigated both for the classical Time Fixed Effects estimator as well as the Common Correlated Effects estimator of Pesaran (2006). We discuss approaches for re-establishing standard normal inference under the null hypothesis. Given the widespread use of the CD test statistic to test for remaining cross-section correlation, our results have far reaching implications for empirical researchers.
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1810.03715&r=ecm
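    For reference, the Pesaran CD statistic the paper studies, computed from an N x T array of residuals; under the null of no cross-section correlation it is asymptotically standard normal, and the paper's point is that it can diverge once many period-specific parameters have been estimated:
      import numpy as np

      def pesaran_cd(resid):
          """resid: (N, T) array of within-unit residuals."""
          N, T = resid.shape
          rho = np.corrcoef(resid)                         # N x N pairwise correlations
          iu = np.triu_indices(N, k=1)
          return np.sqrt(2.0 * T / (N * (N - 1))) * rho[iu].sum()

      rng = np.random.default_rng(5)
      print(pesaran_cd(rng.normal(size=(30, 100))))        # independent units: close to zero
      common = rng.normal(size=100)
      print(pesaran_cd(rng.normal(size=(30, 100)) + common))   # common factor: large value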
  14. By: Shweta Gaonkar (Johns Hopkins Carey Business School, 100 International Dr, Baltimore, MD 21202 - USA); Angelo Mele (Johns Hopkins Carey Business School, 100 International Dr, Baltimore, MD 21202 - USA)
    Abstract: We develop and estimate a structural model of strategic network formation to study the determinants of firms' collaborations for patenting new technology in the medical device industry. Our aim is to bridge the strategy literature on interorganizational networks and the economic literature on structural estimation of network models. In our model, firms have payoffs that depend on linking costs and benefits, as well as externalities from common partners and popular partners. Firms are characterized by observed and unobserved characteristics that affect both their opportunity and their willingness to form links. The equilibrium networks are sparse and match the aggregate clustering levels observed in the data. We use the network of patent collaborations among medical device firms to estimate the structural parameters using a Bayesian approach. Our results show that firms tend to partner domestically and to collaborate with companies in similar markets, perhaps due to technological complementarities or regulation effects. Unobserved heterogeneity matters: we find that firms' payoffs vary by type. Finally, we show that the estimated model including unobserved heterogeneity provides a better fit to crucial features of the data.
    Keywords: firm networks, strategic alliances, exponential random graphs, weak dependence, homophily, clustering, sparse networks
    JEL: C13 C31 L14 D85
    Date: 2018–09
    URL: http://d.repec.org/n?u=RePEc:net:wpaper:1807&r=ecm
  15. By: Lang, Jan Hannes; Peltonen, Tuomas A.; Sarlin, Peter
    Abstract: This paper proposes a framework for deriving early-warning models with optimal out-of-sample forecasting properties and applies it to predicting distress in European banks. The main contributions of the paper are threefold. First, the paper introduces a conceptual framework to guide the process of building early-warning models, which highlights and structures the numerous complex choices that the modeler needs to make. Second, the paper proposes a flexible modeling solution to the conceptual framework that supports model selection in real time. Specifically, our proposed solution is to combine the loss function approach to evaluating early-warning models with regularized logistic regression and cross-validation to find a model specification with optimal real-time out-of-sample forecasting properties. Third, the paper illustrates how the modeling framework can be used in analysis supporting both micro- and macro-prudential policy by applying it to a large dataset of EU banks and showing some examples of early-warning model visualizations.
    Keywords: bank distress, early-warning models, financial crises, micro- and macro-prudential analysis, regularization
    JEL: G01 G17 G21 G33 C52 C54
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20182182&r=ecm
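    A stylized sketch of the modeling recipe the abstract describes (not the authors' code): an L1-regularized logistic regression whose penalty is chosen by cross-validation against a policymaker loss that weights missed distress events against false alarms. The data, the preference weight mu, and the candidate penalties are placeholders, and a real-time application would use time-ordered rather than random splits.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import StratifiedKFold

      def policy_loss(y_true, prob, threshold=0.5, mu=0.7):
          """Weighted share of missed distress events (weight mu) and false alarms (1 - mu)."""
          pred = prob >= threshold
          miss_rate = np.mean(pred[y_true == 1] == 0) if np.any(y_true == 1) else 0.0
          false_alarm_rate = np.mean(pred[y_true == 0] == 1) if np.any(y_true == 0) else 0.0
          return mu * miss_rate + (1 - mu) * false_alarm_rate

      def select_model(X, y, Cs=(0.01, 0.1, 1.0, 10.0), mu=0.7):
          cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
          losses = []
          for C in Cs:
              fold_losses = []
              for train, test in cv.split(X, y):
                  model = LogisticRegression(penalty="l1", solver="liblinear", C=C)
                  model.fit(X[train], y[train])
                  prob = model.predict_proba(X[test])[:, 1]
                  fold_losses.append(policy_loss(y[test], prob, mu=mu))
              losses.append(np.mean(fold_losses))
          return Cs[int(np.argmin(losses))], losses

      # toy data standing in for bank-level early-warning indicators
      rng = np.random.default_rng(6)
      X = rng.normal(size=(2000, 10))
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 1.5).astype(int)
      print(select_model(X, y))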
  16. By: Anna Conte; John D. Hey
    Abstract: The Random Utility Model (RUM) and the Random Preference Model (RPM) are important tools in the economist’s toolbox when estimating preference functionals from experimental data. In an important recent paper in this journal, Apesteguia and Ballester (2018) cautioned decision theorists against using the RUM, suggesting that the RPM may be preferable. This short note comments on this paper, and concludes that RUM does not suffer from the drawbacks suggested in their paper.
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:yor:yorken:18/12&r=ecm
  17. By: Dallakyan, Aramayis; Bessler, David A.
    Keywords: Research Methods/Econometrics/Stats, Demand and Price Analysis, Food and Agricultural Policy Analysis
    Date: 2018–06–20
    URL: http://d.repec.org/n?u=RePEc:ags:aaea18:274401&r=ecm
  18. By: M. Bohdalov\'a; M. Kalina; O. N\'an\'asiov\'a
    Abstract: The intention of this paper is to discuss the mathematical model of causality introduced by C.W.J. Granger in 1969. Granger's model of causality has become well known and is often used in econometric models describing causal systems, e.g., between commodity prices and exchange rates. Our paper presents a new mathematical model of causality between two measured objects. We slightly modify the well-known Kolmogorovian probability model: in particular, we use the horizontal sum of set σ-algebras instead of their direct product.
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1810.01654&r=ecm
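    As background, the standard Granger-causality test (in the usual Kolmogorovian setting) that the paper's horizontal-sum construction modifies, run with statsmodels on simulated series in which x genuinely helps predict y:
      import numpy as np
      from statsmodels.tsa.stattools import grangercausalitytests

      rng = np.random.default_rng(7)
      T = 500
      x = np.zeros(T)
      y = np.zeros(T)
      for t in range(1, T):
          x[t] = 0.5 * x[t - 1] + rng.normal()
          y[t] = 0.3 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

      # the test asks whether the second column Granger-causes the first
      results = grangercausalitytests(np.column_stack([y, x]), maxlag=2)
      print(results[1][0]["ssr_ftest"])                # (F statistic, p-value, df_denom, df_num)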

This nep-ecm issue is ©2018 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.