nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒01‒24
fifteen papers chosen by
Sune Karlsson
Orebro University

  1. Testing for non-linearity in multivariate stochastic processes By Marian Vavra
  2. Functional stable limit theorems for efficient spectral covolatility estimators By Randolf Altmeyer; Markus Bibinger
  3. Testing for linear and Markov switching DSGE models By Marian Vavra
  4. Confidence Bands for Impulse Responses: Bonferroni versus Wald By Helmut Lütkepohl; Anna Staszewska-Bystrova; Peter Winker
  5. IPW estimation and related estimators for evaluation of active labor market policies in a dynamic setting By Vikström, Johan
  6. Simultaneous Confidence Corridors and Variable Selection for Generalized Additive Models By Shuzhuan Zheng; Rong Liu; Lijian Yang; Wolfgang Karl Härdle
  7. Structural Vector Autoregressive Analysis in a Data Rich Environment: A Survey By Helmut Lütkepohl
  8. Mostly Harmless Simulations? On the Internal Validity of Empirical Monte Carlo Studies By Advani, Arun; Sloczynski, Tymon
  9. Is Peace a Missing Value or a Zero? By Colin Vance; Nolan Ritter
  10. Non-linear externalities: A computational estimation method By Giulio Bottazzi; Ugo Gragnolati; Fabio Vanni
  11. Quasi-Hadamard differentiability of general risk functionals and its application By Volker Krätschmer; Alexander Schied; Henryk Zähle
  12. Reducing the Excess Variability of the Hodrick-Prescott Filter by Flexible Penalization By Blöchl, Andreas
  13. Multifractal Diffusion Entropy Analysis: Optimal Bin Width of Probability Histograms By Petr Jizba; Jan Korbel
  14. Un modelo TGARCH con una distribución t de Student asimétrica y las hipotesis de racionalidad de los inversionistas bursátiles en Latinoamérica By Lorenzo-Valdes, Arturo; Ruiz-Porras, Antonio
  15. On the Measurement of Economic Tail Risk By Steven Kou; Xianhua Peng

  1. By: Marian Vavra (National Bank of Slovakia)
    Abstract: Two well-known multivariate non-linearity tests are modified using principal component analysis. The Monte Carlo results show that the proposed principal component-based tests provide a remarkable dimensionality reduction without any systematic power loss. Our empirical findings stand in sharp contrast with the common practice of using linear dynamic economic models.
    Keywords: non-linearity testing, principal component analysis, Monte Carlo method
    JEL: C12 C15 C32
    Date: 2013–09
    URL: http://d.repec.org/n?u=RePEc:svk:wpaper:1023&r=ecm
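    Code sketch (Python): the abstract does not name the two multivariate tests that are modified, so this is only a hedged illustration of the general idea of testing principal components instead of the raw system; the RESET-type F-test on an AR regression used below, and all function and parameter names, are the editor's assumptions, not the paper's procedure.
      import numpy as np
      import statsmodels.api as sm

      def pc_nonlinearity_test(Y, n_pc=2, p=2):
          """Illustrative PCA-based non-linearity check on a (T, k) array of stationary series."""
          Yc = Y - Y.mean(axis=0)
          _, _, Vt = np.linalg.svd(Yc, full_matrices=False)   # principal components via SVD
          pcs = Yc @ Vt[:n_pc].T
          results = []
          for j in range(n_pc):
              y = pcs[:, j]
              X = np.column_stack([y[p - i - 1:-i - 1] for i in range(p)])   # lags 1..p
              yt = y[p:]
              X1 = sm.add_constant(X)
              lin = sm.OLS(yt, X1).fit()
              # augment with squared and cubed fitted values (RESET-style non-linearity terms)
              X2 = np.column_stack([X1, lin.fittedvalues ** 2, lin.fittedvalues ** 3])
              f_stat, p_val, _ = sm.OLS(yt, X2).fit().compare_f_test(lin)
              results.append((j + 1, f_stat, p_val))
          return results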
  2. By: Randolf Altmeyer; Markus Bibinger
    Abstract: We consider noisy non-synchronous discrete observations of a continuous semimartingale. Functional stable central limit theorems are established under high-frequency asymptotics in three setups: one-dimensional for the spectral estimator of integrated volatility, from two-dimensional asynchronous observations for a bivariate spectral covolatility estimator, and multivariate for a local method of moments. The results demonstrate that local adaptivity and smoothing noise dilution in the Fourier domain facilitate substantial efficiency gains compared to previous approaches. In particular, the derived asymptotic variances coincide with the benchmarks of semiparametric Cramér-Rao lower bounds and the considered estimators are thus asymptotically efficient in idealized sub-experiments. Feasible central limit theorems allowing for confidence intervals are provided.
    Keywords: adaptive estimation, asymptotic efficiency, local parametric estimation, microstructure noise, integrated volatility, non-synchronous observations, spectral estimation, stable limit theorem
    JEL: C14 C32
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2014-005&r=ecm
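    Code sketch (Python): this does not implement the spectral estimators; it only simulates the observation scheme described in the abstract (a bivariate semimartingale observed non-synchronously with microstructure noise) and computes the naive previous-tick realized covariance that efficient estimators are designed to improve upon; all parameter values and names are illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(0)
      n, rho, sigma1, sigma2, eta = 23400, 0.5, 0.01, 0.012, 5e-4
      # latent bivariate semimartingale: correlated Brownian motions with constant volatility,
      # so the integrated covolatility over [0, 1] equals rho * sigma1 * sigma2
      dW = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n) * np.sqrt(1.0 / n)
      X = np.cumsum(np.column_stack([sigma1 * dW[:, 0], sigma2 * dW[:, 1]]), axis=0)
      # noisy, non-synchronous observations: each asset is observed on its own random grid
      idx1 = np.sort(rng.choice(n, size=n // 2, replace=False))
      idx2 = np.sort(rng.choice(n, size=n // 3, replace=False))
      Y1 = X[idx1, 0] + eta * rng.standard_normal(idx1.size)
      Y2 = X[idx2, 1] + eta * rng.standard_normal(idx2.size)
      # previous-tick synchronisation on a coarse grid, then naive realized covariance
      grid = np.linspace(0, n - 1, 391).astype(int)
      s1 = Y1[np.maximum(np.searchsorted(idx1, grid, side='right') - 1, 0)]
      s2 = Y2[np.maximum(np.searchsorted(idx2, grid, side='right') - 1, 0)]
      print("true covolatility:", rho * sigma1 * sigma2,
            "naive previous-tick estimate:", np.sum(np.diff(s1) * np.diff(s2)))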
  3. By: Marian Vavra (National Bank of Slovakia)
    Abstract: This paper addresses testing for non-linearity in economic models using new principal component-based multivariate non-linearity tests. Monte Carlo results suggest that the new multivariate tests have good size and power properties even in the small samples usually available in practice. The empirical results indicate that the use of linear economic models is unsuitable for policy recommendations.
    Keywords: DSGE model, Markov-switching, Monte Carlo method, principal components, nonlinearity testing
    JEL: C12 C15 C32
    Date: 2013–12
    URL: http://d.repec.org/n?u=RePEc:svk:wpaper:1024&r=ecm
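    Code sketch (Python): a toy two-regime Markov-switching AR(1), the kind of non-linearity the proposed tests target; the parameter values are arbitrary and the data-generating processes of the paper's Monte Carlo study are not reproduced.
      import numpy as np

      def simulate_ms_ar1(T, phi=(0.2, 0.9), sigma=(1.0, 0.5), p_stay=(0.95, 0.9), seed=0):
          """Simulate y_t = phi[s_t] * y_{t-1} + sigma[s_t] * e_t with a two-state Markov chain s_t."""
          rng = np.random.default_rng(seed)
          y = np.zeros(T)
          s = 0
          for t in range(1, T):
              if rng.random() > p_stay[s]:   # leave the current regime with probability 1 - p_stay[s]
                  s = 1 - s
              y[t] = phi[s] * y[t - 1] + sigma[s] * rng.standard_normal()
          return y

      y = simulate_ms_ar1(500)   # a series that a linear AR model would misspecify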
  4. By: Helmut Lütkepohl; Anna Staszewska-Bystrova; Peter Winker
    Abstract: In impulse response analysis, estimation uncertainty is typically displayed by constructing bands around estimated impulse response functions. These bands may be based on frequentist or Bayesian methods. Whether they are based on the joint posterior distribution in the Bayesian framework or on the joint asymptotic distribution, possibly approximated by bootstrap methods, in the frequentist framework, the bands are often obtained by simply connecting individual confidence intervals or credibility sets. Such bands are known to be too narrow and to have a joint confidence content lower than the desired one. If instead the joint distribution of the impulse response coefficients is taken into account and mapped into a band, it is shown that such a band is typically rather conservative. It is argued that a smaller band can often be obtained by using the Bonferroni method. While these considerations are equally important for constructing forecast bands, we focus on the case of impulse responses in this study.
    Keywords: Impulse responses, Bayesian error bands, frequentist confidence bands, Wald statistic, vector autoregressive process
    JEL: C32
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2014-007&r=ecm
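    Code sketch (Python): an illustration of the Bonferroni adjustment only; irf_draws is a hypothetical (n_draws, H+1) array of bootstrap or posterior draws of a single impulse response function, and the Wald-statistic-based band discussed in the paper is not reproduced here.
      import numpy as np

      def pointwise_band(irf_draws, alpha=0.10):
          """Naive band: connect pointwise (1 - alpha) intervals horizon by horizon (joint coverage too low)."""
          return (np.quantile(irf_draws, alpha / 2, axis=0),
                  np.quantile(irf_draws, 1 - alpha / 2, axis=0))

      def bonferroni_band(irf_draws, alpha=0.10):
          """Bonferroni band: split alpha across the H+1 horizons so joint coverage is at least 1 - alpha."""
          a = alpha / irf_draws.shape[1]
          return (np.quantile(irf_draws, a / 2, axis=0),
                  np.quantile(irf_draws, 1 - a / 2, axis=0))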
  5. By: Vikström, Johan (IFAU - Institute for Evaluation of Labour Market and Education Policy)
    Abstract: This paper considers treatment evaluation in a discrete-time setting in which treatment could start at any point in time. A typical application is an active labor market policy program that could start after any elapsed unemployment duration. It is shown that various average effects on survival time are identified under unconfoundedness and no-anticipation assumptions, and inverse probability weighting (IPW) estimators are provided for these effects. The estimators are applied to a Swedish work practice program. The IPW estimator is compared with related estimators. One conclusion is that the matching estimator proposed by Fredriksson and Johansson (2008) overlooks a selective censoring problem.
    Keywords: Treatment effects; dynamic treatment assignment; program evaluation; work practice
    JEL: C14 C40
    Date: 2014–01–10
    URL: http://d.repec.org/n?u=RePEc:hhs:ifauwp:2014_001&r=ecm
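    Code sketch (Python): a static simplification assuming unconfoundedness given covariates X, with hypothetical variable names; the paper's estimators additionally condition on elapsed unemployment duration and handle treatments that can start at any point in time, which this sketch ignores.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def ipw_ate(y, d, X):
          """Inverse probability weighting estimate of an average treatment effect (static case)."""
          p = LogisticRegression(max_iter=1000).fit(X, d).predict_proba(X)[:, 1]
          w1, w0 = d / p, (1 - d) / (1 - p)
          # Hajek-type normalisation of the weights
          return np.sum(w1 * y) / np.sum(w1) - np.sum(w0 * y) / np.sum(w0)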
  6. By: Shuzhuan Zheng; Rong Liu; Lijian Yang; Wolfgang Karl Härdle
    Abstract: In spite of the widespread use of generalized additive models (GAMs), there is no well established methodology for simultaneous inference and variable selection for the components of a GAM. Both inference on the marginal component functions and their selection are essential in these additive statistical models. To this end, we establish simultaneous confidence corridors (SCCs) and a variable selection criterion using spline-backfitted kernel smoothing techniques. To characterize the global features of each component, SCCs are constructed for testing their shapes. By extending the BIC to additive models with identity/trivial link, an asymptotically consistent BIC approach for variable selection is proposed. Our procedures are examined in simulations for their theoretical accuracy and performance, and used to forecast the default probability of listed Japanese companies.
    Keywords: BIC, Confidence corridor, Extreme value, Generalized additive model, Spline-backfitted kernel
    JEL: C35 C52 C53 G33
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2014-008&r=ecm
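    Code sketch (Python): not the spline-backfitted kernel SCC of the paper, whose critical values come from extreme value theory; this is a generic residual-bootstrap maximal-deviation band around a Nadaraya-Watson fit, shown only to illustrate what simultaneous (rather than pointwise) coverage means, and it ignores bias correction and standardisation.
      import numpy as np

      def nw_fit(x, y, grid, h):
          """Nadaraya-Watson smoother with a Gaussian kernel, evaluated on `grid`."""
          K = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
          return (K @ y) / K.sum(axis=1)

      def simultaneous_band(x, y, grid, h, alpha=0.05, B=500, seed=0):
          """Residual-bootstrap band with approximate joint coverage 1 - alpha over the whole grid."""
          rng = np.random.default_rng(seed)
          fhat, fitted = nw_fit(x, y, grid, h), nw_fit(x, y, x, h)
          resid = y - fitted
          sups = np.empty(B)
          for b in range(B):
              yb = fitted + rng.choice(resid, size=resid.size, replace=True)
              sups[b] = np.max(np.abs(nw_fit(x, yb, grid, h) - fhat))
          c = np.quantile(sups, 1 - alpha)
          return fhat - c, fhat + c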
  7. By: Helmut Lütkepohl
    Abstract: Large panels of variables are used by policy makers in deciding on policy actions. It is therefore desirable to include large information sets in models for economic analysis. This survey reviews methods for accounting for the information in large sets of variables in vector autoregressive (VAR) models. This can be done by aggregating the variables or by reducing the parameter space to a manageable dimension. Factor models reduce the space of variables, whereas large Bayesian VAR models and panel VARs reduce the parameter space. Global VARs use a mixed approach: they aggregate the variables and use a parsimonious parametrisation. All these methods are discussed in this survey, although the main emphasis is on factor models.
    Keywords: factor models, structural vector autoregressive model, global vector autoregression, panel data, Bayesian vector autoregression
    JEL: C32
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2014-004&r=ecm
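    Code sketch (Python): a minimal factor-augmented VAR in the spirit of the factor models emphasised in the survey: compress a large standardised panel into a few principal components and estimate a small VAR on the variables of interest plus the factors; structural identification is not addressed, and the function and argument names are the editor's.
      import numpy as np
      from statsmodels.tsa.api import VAR

      def favar(panel, y, n_factors=3, lags=2):
          """Estimate a VAR on [y, factors], with factors extracted from a large (T, N) panel by PCA."""
          Z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
          _, _, Vt = np.linalg.svd(Z, full_matrices=False)
          factors = Z @ Vt[:n_factors].T
          return VAR(np.column_stack([y, factors])).fit(lags)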
  8. By: Advani, Arun (Institute for Fiscal Studies, London); Sloczynski, Tymon (Warsaw School of Economics)
    Abstract: In this paper we evaluate the premise from the recent literature on Monte Carlo studies that an empirically motivated simulation exercise is informative about the actual ranking of various estimators when applied to a particular problem. We consider two alternative designs and provide an empirical test for both of them. We conclude that a necessary condition for the simulations to be informative about the true ranking is that the treatment effect in simulations must be equal to the (unknown) true effect. This severely limits the usefulness of such procedures, since were the effect known, the procedure would not be necessary.
    Keywords: empirical Monte Carlo studies, programme evaluation, treatment effects
    JEL: C15 C21 C25 C52
    Date: 2013–12
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp7874&r=ecm
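    Code sketch (Python): one stylised "placebo" empirical Monte Carlo design of the kind examined in this literature, as understood by the editor: fit a propensity model on the real data, repeatedly assign fake treatments among the non-treated (so the true effect is zero by construction), and record an estimator's simulated bias and spread; the names and design details are illustrative assumptions, not the authors' exact setups.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def placebo_emcs(X_all, d_all, y_untreated, X_untreated, estimator, n_rep=200, seed=0):
          """`estimator(y, d, X)` is any treatment-effect estimator, e.g. the IPW sketch above."""
          rng = np.random.default_rng(seed)
          p = LogisticRegression(max_iter=1000).fit(X_all, d_all).predict_proba(X_untreated)[:, 1]
          est = [estimator(y_untreated, rng.binomial(1, p), X_untreated) for _ in range(n_rep)]
          return np.mean(est), np.std(est)   # mean = simulated bias, since the true placebo effect is 0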
  9. By: Colin Vance; Nolan Ritter
    Abstract: Sample selection models, variants of which are the Heckman and Heckit models, are increasingly used by political scientists to accommodate data in which censoring of the dependent variable raises concerns of sample selectivity bias. Beyond demonstrating several pitfalls in the calculation of marginal effects and associated levels of statistical significance derived from these models, we argue that many of the empirical questions addressed by political scientists would – for both substantive and statistical reasons – be more appropriately addressed using an alternative but closely related procedure referred to as the two-part model (2PM). Aside from being simple to estimate, one key advantage of the 2PM is its less onerous identification requirements. Specifically, the model does not require the specification of so-called exclusion restrictions, variables that are included in the selection equation of the Heckit model but omitted from the outcome equation. Moreover, we argue that the interpretation of the marginal effects from the 2PM, which are in terms of actual outcomes, is more appropriate for the questions typically addressed by political scientists than the potential outcomes ascribed to the Heckit results. Drawing on data compiled by Sweeney (2003) from the Correlates of War database, we present an empirical analysis of conflict intensity illustrating that the choice between the sample selection model and the 2PM can bear fundamentally on the conclusions drawn.
    Keywords: Conflict; Heckit model; two-part model; potential effects; actual effects; identification
    JEL: C24
    Date: 2013–12
    URL: http://d.repec.org/n?u=RePEc:rwi:repape:0466&r=ecm
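    Code sketch (Python): a minimal two-part model for an outcome censored at zero, with hypothetical inputs: a probit for participation and a linear regression on the positive observations, combined to give expected actual outcomes; the Heckit comparison in the paper, which additionally needs an exclusion restriction, is not shown.
      import numpy as np
      import statsmodels.api as sm

      def two_part_model(y, X):
          """Part 1: probit for P(y > 0 | x). Part 2: OLS for E[y | y > 0, x] on positive observations."""
          X1 = sm.add_constant(X)
          part1 = sm.Probit((y > 0).astype(int), X1).fit(disp=0)
          pos = y > 0
          part2 = sm.OLS(y[pos], X1[pos]).fit()
          yhat = part1.predict(X1) * part2.predict(X1)   # expected actual outcome
          return part1, part2, yhat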
  10. By: Giulio Bottazzi; Ugo Gragnolati; Fabio Vanni
    Abstract: A stochastic discrete choice model and a related estimation method are presented which make it possible to disentangle non-linear externalities from the intrinsic features of the objects of choice and from the idiosyncratic preferences of agents. Having verified the ergodicity of the underlying stochastic process, parameter estimates and their statistical significance are obtained through numerical methods. In particular, the optimization rests on successive parabolic interpolation. Finally, the model and its estimation method are applied to the case of firm localization using Italian sectoral census data.
    Keywords: Externalities, Heterogeneity, Computational methods, Firm localization
    Date: 2014–01–15
    URL: http://d.repec.org/n?u=RePEc:ssa:lemwps:2014/01&r=ecm
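    Code sketch (Python): a bare-bones successive parabolic interpolation routine of the kind named for the optimization step; it is unsafeguarded (Brent's method adds golden-section safeguards) and the paper's discrete choice likelihood is not reproduced.
      def parabolic_minimize(f, x0, x1, x2, tol=1e-8, max_iter=100):
          """Repeatedly fit a parabola through the current three points and jump to its vertex."""
          pts = [(x, f(x)) for x in (x0, x1, x2)]
          for _ in range(max_iter):
              (a, fa), (b, fb), (c, fc) = pts
              den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
              if abs(den) < 1e-15:
                  break
              x_new = b - 0.5 * ((b - a) ** 2 * (fb - fc) - (b - c) ** 2 * (fb - fa)) / den
              if min(abs(x_new - p[0]) for p in pts) < tol:
                  break
              pts = sorted(pts + [(x_new, f(x_new))], key=lambda p: p[1])[:3]   # keep the three best points
          return min(pts, key=lambda p: p[1])[0]
      # e.g. maximise a one-parameter log-likelihood ll by minimising its negative:
      # b_hat = parabolic_minimize(lambda b: -ll(b), 0.1, 1.0, 2.0)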
  11. By: Volker Krätschmer; Alexander Schied; Henryk Zähle
    Abstract: We apply a suitable modification of the functional delta method to statistical functionals that arise from law-invariant coherent risk measures. To this end we establish differentiability of the statistical functional in a relaxed Hadamard sense, namely with respect to a suitably chosen norm and in the directions of a specifically chosen "tangent space". We show that this notion of quasi-Hadamard differentiability yields both strong laws and limit theorems for the asymptotic distribution of the plug-in estimators. Our results can be regarded as a contribution to the statistics and numerics of risk measurement and as a case study for possible refinements of the functional delta method through fine-tuning the underlying notion of differentiability.
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1401.3167&r=ecm
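    Code sketch (Python): the paper's contribution is the asymptotic theory; this only shows the kind of object it applies to, namely a plug-in estimator of a law-invariant risk functional (here the empirical Expected Shortfall) together with a bootstrap approximation of its sampling distribution.
      import numpy as np

      def expected_shortfall(losses, alpha=0.95):
          """Plug-in Expected Shortfall: mean loss beyond the empirical alpha-quantile (VaR)."""
          var = np.quantile(losses, alpha)
          return losses[losses >= var].mean()

      def bootstrap_distribution(losses, stat, B=2000, seed=0):
          """Bootstrap draws of a plug-in risk functional, e.g. stat=expected_shortfall."""
          rng = np.random.default_rng(seed)
          n = losses.size
          return np.array([stat(losses[rng.integers(0, n, n)]) for _ in range(B)])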
  12. By: Blöchl, Andreas
    Abstract: The Hodrick-Prescott filter is probably the most popular tool for trend estimation in economics. Compared to other frequently used methods, such as the Baxter-King filter, it allows the trend to be estimated for the most recent periods of a time series. However, the Hodrick-Prescott filter suffers from increasing excess variability at the margins of the series, inducing an overly flexible trend function at the margins compared to the middle. This paper tackles the problem using spectral analysis and a flexible penalization. It shows that the excess variability can be reduced considerably by a flexible penalization, with the gain function for the middle of the time series used as a measure to determine the degree of the flexible penalization.
    Keywords: Hodrick-Prescott filter; spectral analysis; trend estimation; gain function; flexible penalization
    JEL: C22 C52
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:lmu:muenec:17940&r=ecm
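    Code sketch (Python): the mechanics of a Hodrick-Prescott trend with an observation-specific penalty; the paper derives the penalty profile from the filter's gain function, whereas the margin weights used below are arbitrary placeholders.
      import numpy as np

      def hp_filter_flexible(y, lam):
          """Trend = argmin sum (y - tau)^2 + sum lam_t (second difference of tau)^2, lam of length T - 2."""
          T = y.size
          D = np.zeros((T - 2, T))                 # second-difference matrix
          for t in range(T - 2):
              D[t, t:t + 3] = [1.0, -2.0, 1.0]
          return np.linalg.solve(np.eye(T) + D.T @ np.diag(lam) @ D, y)

      T = 200
      y = np.cumsum(np.random.default_rng(0).standard_normal(T))    # artificial series
      lam = np.full(T - 2, 1600.0)
      lam[:10] *= 4.0                              # hypothetical extra penalty near the margins to damp
      lam[-10:] *= 4.0                             # the excess trend variability there
      trend = hp_filter_flexible(y, lam)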
  13. By: Petr Jizba; Jan Korbel
    Abstract: In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin width in histograms generated from underlying probability distributions of interest. The presented method uses Rényi entropy and mean square error analysis to determine the conditions under which the error in the Rényi entropy estimate is minimal. We illustrate the utility of our method by focusing on the scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the period 1950-2013. To demonstrate the strength of the optimal bin width, we compare the δ-spectrum for various bin widths. Implications for the multifractal δ-spectrum as a function of Rényi's q parameter are also discussed and graphically represented.
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1401.3316&r=ecm
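    Code sketch (Python): a histogram-based Rényi entropy estimate as a function of the bin width, which is the quantity whose estimation error the paper's mean-square-error analysis minimises; the optimal-bin-width criterion itself is not reproduced.
      import numpy as np

      def renyi_entropy_histogram(x, bin_width, q=2.0):
          """Rényi entropy of order q of the histogram probabilities for a given bin width."""
          edges = np.arange(x.min(), x.max() + bin_width, bin_width)
          counts, _ = np.histogram(x, bins=edges)
          p = counts[counts > 0] / counts.sum()
          if np.isclose(q, 1.0):
              return -np.sum(p * np.log(p))                     # Shannon limit
          return np.log(np.sum(p ** q)) / (1.0 - q)

      x = np.random.default_rng(0).standard_normal(10000)       # stand-in for a return series
      for h in (0.05, 0.1, 0.2, 0.5):
          print(h, renyi_entropy_histogram(x, h))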
  14. By: Lorenzo-Valdes, Arturo; Ruiz-Porras, Antonio
    Abstract: We propose an ARCH model of the TGARCH type with an asymmetric Student's t distribution. It is built using the methodology of Fernandez and Steel (1998) and the traditional TGARCH model developed by Zakoian (1994). The model is used to describe series of stock market returns and to assess the validity of rationality hypotheses in Latin America. The results suggest that: (1) the series can be described adequately with the proposed model; (2) Samuelson's rationality hypothesis is consistent with the evidence for the markets of Argentina, Brazil, Chile, Colombia and Mexico; (3) the traditional rationality hypothesis is consistent with the evidence for Peru; and (4) the volatilities estimated with the proposed model are higher than those estimated with the traditional TGARCH model over the period 2008-2009.
    Keywords: Density Distribution; Asymmetric t-Student; TGARCH; Stock Market Returns; Latin America
    JEL: C22 F30 G10
    Date: 2014–01–17
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:53019&r=ecm
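    Code sketch (Python): a related specification fitted with the `arch` package: a Zakoian-type TARCH (power = 1, one asymmetry term) with a skewed Student's t innovation distribution; note that `arch` implements Hansen's skew-t, which is related to but not identical with the Fernandez-Steel construction used by the authors, and `prices` is a hypothetical input series.
      import numpy as np
      from arch import arch_model

      r = 100 * np.diff(np.log(prices))       # `prices`: hypothetical array of index levels
      # conditional standard deviation model with leverage term and skewed Student's t errors
      am = arch_model(r, mean='Constant', vol='GARCH', p=1, o=1, q=1, power=1.0, dist='skewt')
      res = am.fit(disp='off')
      print(res.summary())
      cond_vol = res.conditional_volatility   # fitted conditional volatility path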
  15. By: Steven Kou; Xianhua Peng
    Abstract: This paper attempts to provide a decision theoretical foundation for the measurement of economic tail risk, which is not only closely related to utility theory but also relevant to statistical model uncertainty. The main result of the paper is that the only tail risk measure that satisfies both a set of economic axioms proposed by Schmeidler (1989, Econometrica) and the statistical property of elicitability (i.e. there exists an objective function such that minimizing the expected objective function yields the risk measure; see Gneiting, 2011, J. Amer. Stat. Assoc.) is median shortfall, which is the median of the tail loss distribution. As an application, we argue that median shortfall is a better alternative than expected shortfall as a risk measure for setting capital requirements in Basel Accords.
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1401.4787&r=ecm
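    Code sketch (Python): median shortfall computed directly as the median of the losses beyond VaR; for a continuous loss distribution this coincides with VaR at level (1 + alpha)/2, which the example checks numerically on simulated heavy-tailed losses.
      import numpy as np

      def median_shortfall(losses, alpha=0.99):
          """Median of the tail loss distribution beyond the empirical VaR at level alpha."""
          var = np.quantile(losses, alpha)
          return np.median(losses[losses >= var])

      losses = np.random.default_rng(0).standard_t(df=3, size=100000)
      print(median_shortfall(losses), np.quantile(losses, (1 + 0.99) / 2))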

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.