nep-ecm New Economics Papers
on Econometrics
Issue of 2016–11–27
fifteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Time-varying Combinations of Bayesian Dynamic Models and Equity Momentum Strategies By Nalan Basturk; Stefano Grassi; Lennart Hoogerheide; Herman K. van Dijk
  2. Topics in nonparametric identification and estimation By Hubner, Stefan
  3. Uncertainty Through the Lenses of A Mixed-Frequency Bayesian Panel Markov Switching Model By Roberto Casarin; Claudia Foroni; Massimiliano Marcellino; Francesco Ravazzolo
  4. You can't always get what you want? Estimator choice and the speed of convergence By Kufenko, Vadim; Prettner, Klaus
  5. The Lila distribution and its applications in risk modelling By Bertrand K. Hassani; Wei Yang
  6. Random matrix approach to estimation of high-dimensional factor models By Joongyeub Yeo; George Papanicolaou
  7. A New Nonlinearity Test to Circumvent the Limitation of Volterra Expansion with Applications By Hui, Yongchang; Wong, Wing-Keung; Bai, Zhidong; Zhu, Zhenzhen
  8. Spatio-temporal statistical assessment of anthropogenic CO2 emissions from satellite data By Patrick Vetter; Wolfgang Schmid; Reimund Schwarze
  9. Estimation of financial agent-based models with simulated maximum likelihood By Kukacka, Jiri; Barunik, Jozef
  10. Monetary policy shocks, set-identifying restrictions, and asset prices: A benchmarking approach for analyzing set-identified models By Uhrin, Gábor B.; Herwartz, Helmut
  11. Beyond the Stars: a New Method for Assessing the Economic Importance of Variables in Regressions By Olivier Sterck
  12. How to Better Measure Hedonic Residential Property Price Indexes By Mick Silver
  13. Multiple imputation for demographic hazard models with left-censored predictor variables: Application to employment duration and fertility in the EU-SILC. By Michael Rendall; Angela Greulich
  14. Selection of an Estimation Window in the Presence of Data Revisions and Recent Structural Breaks By Jari Hännikäinen
  15. Economic Forecasting in Theory and Practice: An Interview with David F. Hendry By Neil R. Ericsson

  1. By: Nalan Basturk (Maastricht University, The Netherlands); Stefano Grassi (University of Kent, United Kingdom); Lennart Hoogerheide (VU University Amsterdam, The Netherlands); Herman K. van Dijk (Erasmus University Rotterdam, The Netherlands)
    Abstract: A novel dynamic asset-allocation approach is proposed where portfolios as well as portfolio strategies are updated at every decision period based on their past performance. For modeling, a general class of models is specified that combines a dynamic factor and a vector autoregressive model and includes stochastic volatility, denoted by FAVAR-SV. Next, a Bayesian strategy combination is introduced in order to deal with a set of strategies. Our approach extends the mixture-of-experts analysis by allowing the strategy weights to be dependent between strategies as well as over time, and by further allowing for strategy incompleteness. Our approach results in a combination of different portfolio strategies: a model-based and a residual momentum strategy. The estimation of this modeling and strategy approach can be done using an extended and modified version of the forecast combination methodology of Casarin, Grassi, Ravazzolo and Van Dijk (2016). Given the complexity of the non-linear and non-Gaussian model used, a new and efficient filter is introduced based on the MitISEM approach by Hoogerheide, Opschoor and Van Dijk (2013). Using US industry portfolios between 1926M7 and 2015M6 as data, our empirical results indicate that time-varying combinations of flexible models in the FAVAR-SV class and two momentum strategies lead to better return and risk features than very simple and very complex models. Combinations of two strategies help, in particular, to reduce risk features like volatility and largest loss, which indicates that complete densities provide useful information for risk. [A toy weight-updating sketch follows this entry.]
    Keywords: Nonlinear; non-Gaussian state space; filters; density combinations; Bayesian modeling; equity momentum
    JEL: C11 C15 G11 G17
    Date: 2016–11–17
    URL: http://d.repec.org/n?u=RePEc:tin:wpaper:20160099&r=ecm
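    As a rough, hypothetical illustration of performance-based strategy combination (not the authors' FAVAR-SV model or MitISEM-based filter; the learning rate and return process below are invented), consider updating the weights on two return streams from their past performance:

      # Toy time-varying combination of two strategy return streams: weights are
      # tilted toward strategies that performed well recently. The paper combines
      # full predictive densities; this only mimics the weight dynamics.
      import numpy as np

      rng = np.random.default_rng(0)
      T = 240
      r = rng.normal(0.005, 0.04, size=(T, 2))   # hypothetical monthly returns
      eta = 5.0                                  # learning rate (assumed)
      w = np.full(2, 0.5)                        # start from equal weights
      portfolio = np.empty(T)
      for t in range(T):
          portfolio[t] = w @ r[t]                # combine with current weights
          w = w * np.exp(eta * r[t])             # reward recent performance
          w = w / w.sum()                        # renormalise to probabilities
      print("final weights:", w.round(3))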
  2. By: Hubner, Stefan (Tilburg University, School of Economics and Management)
    Abstract: This dissertation consists of three chapters in which nonparametric methods are developed to estimate econometric models in different contexts. Chapters one and two focus on the collective household consumption model, whereas Chapter three considers the estimation of a smooth transition conditional quantile model in a financial time series context. All three have in common that the distribution of the model's unobserved components, as well as some or all of the model's primitives, is identified nonparametrically. Chapter one establishes conditions for nonparametric identification of structural components of the collective household model, such as the conditional sharing rule. In particular, it deals with the nonseparable nature of observed demands with respect to unobserved heterogeneity, which arises as a consequence of the bargaining structure of the model. As a result, this allows researchers to answer welfare-related questions on an individual level for a heterogeneous population. Chapter two deals with the Collective Axiom of Revealed Preference, also in the context of unobserved heterogeneity, and shows how one can exploit data from single households in a nonparametric setting to study the empirical validity of the collective axiom. This approach makes use of a finite-dimensional characterization of demands and shows how one can test the collective model or the assumption of preference stability with respect to household composition using a partial-identification approach. Chapter three treats the estimation of Value at Risk in the context of financial time series. More precisely, it is shown how one can directly estimate a smooth transition generalized conditional quantile model which allows for asymmetric responses to past innovations, such as different dynamic behaviour following negative and positive news. The model is generalized in the sense that it may depend on past conditional volatilities, for which an auxiliary estimator is developed based on composite quantile regression.
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:tiu:tiutis:08fce56b-3193-46e0-871b-0fa4402832b5&r=ecm
  3. By: Roberto Casarin; Claudia Foroni; Massimiliano Marcellino; Francesco Ravazzolo
    Abstract: We propose a Bayesian panel model for mixed frequency data whose parameters can change over time according to a Markov process. Our model allows for both structural instability and random effects. We develop a proper Markov Chain Monte Carlo algorithm for sampling from the joint posterior distribution of the model parameters and test its properties in simulation experiments. We use the model to study the effects of macroeconomic uncertainty and financial uncertainty on a set of variables in a multi-country context including the US, several European countries and Japan. We find that for most of the variables, financial uncertainty dominates macroeconomic uncertainty. Furthermore, we show that uncertainty coefficients differ depending on whether the economy is in a contraction regime or an expansion regime.
    Keywords: dynamic panel model, mixed-frequency, Markov switching, Bayesian inference, MCMC
    JEL: C13 C14 C51 C53
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:igi:igierp:585&r=ecm
  4. By: Kufenko, Vadim; Prettner, Klaus
    Abstract: We propose theory-based Monte Carlo simulations to quantify the extent to which the estimated speed of convergence depends on the underlying econometric techniques. Based on a theoretical growth model as the data generating process, we find that, given a true speed of convergence of around 5%, the estimated values range from 0.2% to 7.72%. This corresponds to half-lives of a given gap ranging from around 9 years up to several hundred years. [The half-life arithmetic is reproduced in a short snippet after this entry.] With the exception of the (very inefficient) system GMM estimator with the collapsed matrix of instruments, the true speed of convergence lies outside the 95% confidence intervals of all investigated state-of-the-art estimators. In terms of the squared percent error, the between estimator and the system GMM estimator with the non-collapsed matrix of instruments perform worst, while the system GMM estimator with the collapsed matrix of instruments and the corrected least squares dummy variable estimator perform best. Based on these results, we argue that it is not a good strategy to rely on only one or two different estimators when assessing the speed of convergence, even if these estimators are seen as suitable for the given sources of biases and inefficiencies. Instead, one should compare the outcomes of different estimators carefully in light of the results of Monte Carlo simulation studies.
    Keywords: Speed of Convergence, Panel Data, Monte-Carlo Simulation, Estimator Bias, Estimator Efficiency, Economic Growth
    JEL: C13 C23 O47
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:zbw:hohdps:202016&r=ecm
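    The half-life arithmetic behind the quoted ranges is standard: a gap decaying at rate lambda halves after ln(2)/lambda periods. A two-line check, using only numbers from the abstract:

      import math

      for lam in (0.05, 0.002, 0.0772):   # true speed; slowest and fastest estimates
          print(f"lambda = {lam:.4f} -> half-life = {math.log(2)/lam:6.1f} years")
      # 0.0500 -> 13.9 years; 0.0020 -> 346.6 years; 0.0772 -> 9.0 years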
  5. By: Bertrand K. Hassani (Centre d'Economie de la Sorbonne, Grupo Santander); Wei Yang (Risk methodology - Santander UK plc)
    Abstract: Risk data sets tend to have heavy-tailed, sometimes bi-modal, empirical distributions, especially in operational risk, market risk and customer behaviour data sets. To capture these observed "unusual" features, we construct a new probability distribution and call it the lowered-inside-leveraged-aside (Lila) distribution, as it transfers the embedded weight of data from the body to the tail. This newly constructed distribution can be viewed as a parametric distribution with two peaks. It is constructed through the composition of a sigmoid-shaped continuous increasing differentiable function with cumulative distribution functions of random variables. Examples and some basic properties of the Lila distribution are illustrated. As an application, we fit a Lila distribution to a set of generated data by using the quantile distance minimisation method (alternative methodologies, such as maximum likelihood estimation, have been tested as well). [An illustrative construction follows this entry.]
    Keywords: probability distribution, parametric distribution, multimodal distribution, operational risk, market risk, pseudo bi-modal distribution
    JEL: G21 C16 C13 G32
    Date: 2016–10
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:16068&r=ecm
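    A schematic guess at the body-to-tail construction (the authors' exact sigmoid-shaped function is not reproduced; the map s below is a stand-in with the right qualitative behaviour):

      # Compose a baseline CDF F with an increasing map s on [0,1] that is flat
      # in the middle and steep near the ends; the transformed density
      # s'(F(x)) * f(x) is then lowered inside and leveraged aside (two-peaked).
      import numpy as np
      from scipy import stats

      def s(u, p=3.0):
          # increasing, differentiable; s(0)=0, s(1/2)=1/2, s(1)=1, s'(1/2)=0
          return 0.5 * (1.0 + np.sign(2*u - 1) * np.abs(2*u - 1)**p)

      x = np.linspace(-4, 4, 1001)
      G = s(stats.norm.cdf(x))                   # transformed CDF
      g = np.gradient(G, x)                      # its density, now bimodal
      tail = (g[np.abs(x) > 2] * (x[1] - x[0])).sum()
      print(f"mass beyond |x|>2: normal {2*stats.norm.sf(2):.3f} vs Lila-style {tail:.3f}")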
  6. By: Joongyeub Yeo; George Papanicolaou
    Abstract: In dealing with high-dimensional data sets, factor models are often useful for dimension reduction. The estimation of factor models has been actively studied in various fields. In the first part of this paper, we present a new approach to estimating high-dimensional factor models, using the empirical spectral density of residuals. The spectrum of covariance matrices from financial data typically exhibits two characteristic aspects: a few spikes and a bulk. The former represent factors that mainly drive the features, while the latter arises from idiosyncratic noise. Motivated by these two aspects, we consider a minimum distance between two spectra: one from a covariance structure model and the other from the real residuals of financial data, obtained by subtracting principal components. Our method simultaneously provides estimators of the number of factors and information about the correlation structure of the residuals. Using free random variable techniques, the proposed algorithm can be implemented and controlled effectively. Monte Carlo simulations confirm that our method is robust to noise and to the presence of weak factors. Furthermore, the application to financial time series shows that our estimators capture essential aspects of market dynamics. [A schematic sketch follows this entry.]
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1611.05571&r=ecm
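    The skeleton of the spectral idea, stripped of the free-probability machinery (toy data; the paper's distance between spectral densities is crudely replaced by comparing the top residual eigenvalue with the Marchenko-Pastur bulk edge):

      import numpy as np

      rng = np.random.default_rng(1)
      T, N, k_true = 500, 100, 3
      X = rng.normal(size=(T, k_true)) @ (2.0 * rng.normal(size=(k_true, N))) \
          + rng.normal(size=(T, N))              # k_true factors plus unit noise
      U, sv, Vt = np.linalg.svd(X, full_matrices=False)
      mp_edge = (1 + np.sqrt(N / T))**2          # Marchenko-Pastur upper edge
      for k in range(6):                         # strip k principal components
          resid = X - (U[:, :k] * sv[:k]) @ Vt[:k]
          eig = np.linalg.eigvalsh(resid.T @ resid / T)
          print(f"k={k}: top residual eigenvalue {eig[-1]:7.2f} (MP edge ~ {mp_edge:.2f})")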
  7. By: Hui, Yongchang; Wong, Wing-Keung; Bai, Zhidong; Zhu, Zhenzhen
    Abstract: In this paper, we propose a quick, efficient, and easy method to examine whether a time series Yt possesses any nonlinear feature. The advantage of our proposed nonlinearity test is that one need not know the exact nonlinear features or the detailed nonlinear forms of Yt. We find that our proposed test can be used to detect any nonlinearity in the variable being examined and to detect GARCH effects in the innovations. It can also be used to test whether a hypothesized model, linear or nonlinear, fitted to the variable being examined is appropriate, as long as the residuals of the model being used can be estimated. Our simulation study shows that our proposed test is stable and powerful. We apply our proposed statistic to test whether there is any nonlinear feature in the sunspot data and whether the S&P 500 index follows a random walk model. The conclusions drawn from our proposed test are consistent with those from other tests.
    Keywords: Nonlinearity, U-statistics, Volterra expansion, sunspots, efficient market
    JEL: C01 C12 G10
    Date: 2016–11–22
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:75216&r=ecm
  8. By: Patrick Vetter (Faculty of Business Administration and Economics, European University Viadrina, Frankfurt (Oder)); Wolfgang Schmid (Faculty of Business Administration and Economics, European University Viadrina, Frankfurt (Oder)); Reimund Schwarze (Europa University Viadrina and Helmholtz Centre for Environmental Research (UFZ))
    Abstract: The analysis of sources and sinks of CO2 is a dominant topic in diverse research fields and in political debates these days. The threat of climate change fosters research efforts in the natural sciences to quantify the carbon sequestration potential of the terrestrial ecosystem, and CO2 mitigation negotiations strengthen the need for a transparent, consistent and verifiable Monitoring, Verification and Reporting infrastructure. This paper provides a spatio-temporal statistical modeling framework which allows for a quantification of the Net Ecosystem Production and of anthropogenic sources, based on satellite data for surface CO2 concentrations and source- and sink-connected covariates. Using spatial and temporal latent random effects that act as space-time varying coefficients, the complex dependence structure can be modeled adequately. Spatio-temporally smoothed estimates of the sources and sinks are then used to provide dynamic maps on a 0.5 × 0.5 grid for the Eurasian area in intervals of 16 days between September 2009 and August 2012. Finally, the self-reported CO2 emissions within the UNFCCC can be compared with the model results.
    Keywords: Anthropogenic CO2 emissions, Net Ecosystem Production, Linear mixed effects, Spatio-temporal model
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:euv:dpaper:24&r=ecm
  9. By: Kukacka, Jiri; Barunik, Jozef
    Abstract: This paper proposes a general computational framework for the empirical estimation of financial agent-based models whose criterion functions have no known analytical form. For this purpose, we adapt a nonparametric simulated maximum likelihood estimator based on kernel methods. Employing one of the most widely analysed heterogeneous agent models in the literature, developed by Brock and Hommes (1998), we extensively test the properties of the proposed estimator and its ability to recover parameters consistently and efficiently using simulations. Key empirical findings point to the statistical insignificance of the switching coefficient but markedly significant belief parameters defining heterogeneous trading regimes, with superiority of trend-following over contrarian strategies. In addition, we document a slight proportional dominance of fundamentalists over trend-following chartists in the main world markets. [A bare-bones sketch of the estimator follows this entry.]
    Keywords: heterogeneous agent model, simulated maximum likelihood, estimation, intensity of choice, switching
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:zbw:fmpwps:63&r=ecm
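    A bare-bones version of the kernel-based simulated likelihood: simulate the model many times at candidate parameters, smooth the simulated outcomes with a kernel, and evaluate each observation under that density. The simulator below is a placeholder AR(1), not the Brock-Hommes switching model, and the bandwidth h is an arbitrary choice:

      import numpy as np

      def simulate_step(y_prev, theta, n_sim, rng):
          # placeholder one-step simulator; swap in the structural model here
          return theta[0] * y_prev + rng.normal(0.0, theta[1], size=n_sim)

      def npsml_loglik(y, theta, n_sim=1000, h=0.05, seed=0):
          rng = np.random.default_rng(seed)
          ll = 0.0
          for t in range(1, len(y)):
              sims = simulate_step(y[t-1], theta, n_sim, rng)
              u = (y[t] - sims) / h              # Gaussian kernel density estimate
              dens = np.exp(-0.5 * u**2).mean() / (h * np.sqrt(2 * np.pi))
              ll += np.log(max(dens, 1e-300))    # guard against log(0)
          return ll

      rng = np.random.default_rng(42)
      y = np.zeros(200)
      for t in range(1, 200):
          y[t] = 0.7 * y[t-1] + rng.normal(0.0, 0.1)
      print(npsml_loglik(y, (0.7, 0.1)), npsml_loglik(y, (0.2, 0.1)))  # true vs wrong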
  10. By: Uhrin, Gábor B.; Herwartz, Helmut
    Abstract: A central question for monetary policy is how asset prices respond to a monetary policy shock. We provide evidence on this issue by augmenting a monetary SVAR for US data with an asset price index, using set-identifying structural restrictions. The impulse responses show a positive asset price response to a contractionary monetary policy shock. The resulting monetary policy shocks correlate only weakly with the Romer and Romer (2004, RR) shocks, which matters greatly when analyzing impulse responses. Considering only models whose shocks are highly correlated with the RR series uncovers a negative, but near-zero, response of asset prices. [A minimal accept-reject sketch follows this entry.]
    Keywords: monetary policy shocks, asset prices, sign restrictions, zero restrictions, set identification, structural VAR models
    JEL: C32 E44 E52
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:zbw:cegedp:295&r=ecm
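    A minimal accept-reject loop for sign-based set identification (impact matrix only; the VAR estimation and the paper's Romer-Romer benchmarking step are omitted, and the covariance matrix and sign pattern below are invented):

      import numpy as np

      rng = np.random.default_rng(7)
      Sigma = np.array([[1.0, 0.3, 0.2],         # stand-in reduced-form covariance
                        [0.3, 1.0, 0.1],
                        [0.2, 0.1, 1.0]])
      P = np.linalg.cholesky(Sigma)
      accepted = []
      while len(accepted) < 100:
          Q, R = np.linalg.qr(rng.normal(size=(3, 3)))
          Q *= np.sign(np.diag(R))               # sign fix for a uniform rotation
          shock = (P @ Q)[:, 0]                  # candidate monetary policy shock
          if shock[0] > 0 and shock[1] < 0 and shock[2] < 0:  # rate +, output -, prices -
              accepted.append(shock)
      print(np.mean(accepted, axis=0).round(3))  # pointwise mean over the identified set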
  11. By: Olivier Sterck
    Abstract: Economists lack a systematic method for assessing the economic importance of effects in regressions. In this article, I use experimental evidence to show that, for a large majority of economists, the economic importance of an explanatory variable refers to its contribution to deviations in the level of the dependent variable. Existing statistics, such as standardized beta coefficients and the partial or semi-partial r2 and r, are only imperfect measures of the economic importance of explanatory variables: these statistics do not match economists' common understanding of economic importance and are difficult to interpret. I therefore develop a new method, which consists in rescaling standardized beta coefficients so as to obtain the percentage contribution of each explanatory variable to deviations in the dependent variable. As an illustration, the method is applied to the study of the causes of long-run economic development. [One possible reading of the rescaling is sketched after this entry.]
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:csa:wpaper:2016-31&r=ecm
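    One plausible reading of the rescaling, offered as an illustrative reconstruction rather than the paper's exact formula: each standardized coefficient's share of the summed absolute standardized coefficients.

      import numpy as np

      def importance_shares(X, y):
          Xs = (X - X.mean(0)) / X.std(0)        # standardize regressors
          ys = (y - y.mean()) / y.std()          # standardize outcome
          beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
          return 100 * np.abs(beta) / np.abs(beta).sum()  # percentage contributions

      rng = np.random.default_rng(3)
      X = rng.normal(size=(500, 3))
      y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(size=500)
      print(importance_shares(X, y).round(1))    # roughly [77, 19, 4] percent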
  12. By: Mick Silver
    Abstract: Hedonic regressions are used in property price index measurement to control for changes in the quality-mix of properties transacted. The paper consolidates the hedonic time dummy approach, the characteristics approach, and the imputation approaches. A practical hedonic methodology is proposed that (i) is weighted at a basic level; (ii) has a new (quasi-)superlative form and thus mitigates substitution bias; (iii) is suitable for sparse data in thin markets; and (iv) only requires the periodic estimation of hedonic regressions for reference periods and is not subject to the vagaries of misspecification and estimation issues. [A textbook time-dummy example follows this entry.]
    Date: 2016–11–08
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:16/213&r=ecm
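    For reference, the textbook time-dummy variant that the paper refines (simulated data; the weighting and (quasi-)superlative adjustments proposed above are not reproduced): regress log price on characteristics plus period dummies, and exponentiate the period coefficients.

      import numpy as np

      rng = np.random.default_rng(5)
      n, periods = 400, 4
      t = rng.integers(0, periods, size=n)       # sale period of each property
      size = rng.normal(100, 20, size=n)         # one characteristic: floor area
      true_idx = np.array([0.00, 0.03, 0.05, 0.08])
      logp = 10 + 0.004 * size + true_idx[t] + rng.normal(0, 0.05, size=n)
      D = (t[:, None] == np.arange(1, periods)).astype(float)  # period dummies
      Z = np.column_stack([np.ones(n), size, D])
      coef, *_ = np.linalg.lstsq(Z, logp, rcond=None)
      print("index:", np.exp(np.r_[0.0, coef[2:]]).round(3))   # ~ [1.0, 1.03, 1.05, 1.08]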
  13. By: Michael Rendall (University of Maryland [College Park]); Angela Greulich (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique)
    Abstract: OBJECTIVE A common problem when using panel data is that individuals’ histories are incompletely known at the first wave. We demonstrate the use of multiple imputation as a method to handle this partial information, and thereby increase statistical power without compromising the model specification. METHODS Using EU-SILC panel data to investigate full-time employment as a predictor of partnered women’s risk of first birth in Poland, we first multiply imputed employment status two years earlier for cases in which employment status is observed only in the most recent year. We then derived regression estimates from the full, multiply imputed sample, and compared the coefficient and standard error estimates to those from complete-case estimation with employment status observed both one and two years earlier. RESULTS Relative to not being full-time employed, having been full-time employed for two or more years was a positive and statistically significant predictor of childbearing in the multiply imputed sample, but was not significant when using complete-case estimation. The variance of the ‘two or more years’ coefficient was one third lower in the multiply imputed sample than in the complete-case sample. CONTRIBUTION By using multiple imputation for left-censored observations, researchers using panel data may specify a model that includes characteristics of state or event histories without discarding observations for which that information is only partially available. Using conventional methods, either the analysis model must be simplified to ignore potentially important information about the state or event history (risking biased estimation), or cases with partial information must be dropped from the analytical sample (resulting in inefficient estimation). [A skeleton of the imputation-and-pooling workflow follows this entry.]
    Keywords: fertility, Europe, multiple imputation
    Date: 2016–10–01
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:hal-01396298&r=ecm
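    The workflow in skeleton form: impute the left-censored predictor m times, refit the analysis model on each completed data set, and pool with Rubin's rules. The imputation and analysis models below are placeholders, not the EU-SILC hazard model; a proper imputation draw would condition on the outcome and observed covariates rather than on a marginal distribution.

      import numpy as np

      def rubin_pool(est, var):
          m = len(est)
          total = var.mean() + (1 + 1/m) * est.var(ddof=1)    # within + between
          return est.mean(), np.sqrt(total)                   # pooled estimate, SE

      rng = np.random.default_rng(11)
      n = 300
      x_full = rng.normal(size=n)
      y = 1.5 * x_full + rng.normal(size=n)
      censored = rng.random(n) < 0.3             # 30% of the predictor unobserved
      ests, vars_ = [], []
      for _ in range(20):                        # m = 20 imputations
          x = x_full.copy()
          x[censored] = rng.normal(size=censored.sum())       # placeholder draw
          b = (x @ y) / (x @ x)                  # refit analysis model per data set
          vars_.append(((y - b*x)**2).sum() / ((n - 2) * (x @ x)))
          ests.append(b)
      print(rubin_pool(np.array(ests), np.array(vars_)))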
  14. By: Jari Hännikäinen (School of Management, University of Tampere)
    Abstract: In this paper, we analyze the forecasting performance of a set of widely used window selection methods in the presence of data revisions and recent structural breaks. Our Monte Carlo and empirical results for U.S. real GDP and inflation show that the expanding window estimator often yields the most accurate forecasts after a recent break. It performs well regardless of whether the revisions are news or noise, or whether we forecast first-release or final values. We find that the differences in forecasting accuracy are large in practice, especially when we forecast inflation after the break of the early 1980s. [A small simulation in this spirit follows this entry.]
    Keywords: Recent structural break, choice of estimation window, forecasting, real-time data
    JEL: C22 C53 C82
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:tam:wpaper:1692&r=ecm
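    A small Monte Carlo in the paper's spirit, comparing an expanding with a rolling estimation window when the mean shifts shortly before the forecast origin (data revisions and the real-time dimension are ignored; all settings below are invented):

      import numpy as np

      rng = np.random.default_rng(9)
      reps, T, roll = 500, 200, 40
      mse = {"expanding": 0.0, "rolling": 0.0}
      for _ in range(reps):
          mu = np.where(np.arange(T) < T - 20, 0.0, 1.0)      # break near the end
          y = np.zeros(T)
          for t in range(1, T):
              y[t] = mu[t] + 0.5 * (y[t-1] - mu[t-1]) + rng.normal()
          for window, ys in (("expanding", y[:-1]), ("rolling", y[-roll-1:-1])):
              X = np.column_stack([np.ones(len(ys) - 1), ys[:-1]])
              b, *_ = np.linalg.lstsq(X, ys[1:], rcond=None)  # AR(1) by OLS
              mse[window] += (y[-1] - (b[0] + b[1] * y[-2]))**2 / reps
      print({k: round(v, 3) for k, v in mse.items()})         # mean squared forecast errors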
  15. By: Neil R. Ericsson
    Abstract: David Hendry has made major contributions to many areas of economic forecasting. He has developed a taxonomy of forecast errors and a theory of unpredictability that have yielded valuable insights into the nature of forecasting. He has also provided new perspectives on many existing forecast techniques, including mean square forecast errors, add factors, leading indicators, pooling of forecasts, and multi-step estimation. In addition, David has developed new forecast tools, such as forecast encompassing; and he has improved existing ones, such as nowcasting and robustification to breaks. This interview for the International Journal of Forecasting explores David Hendry’s research on forecasting.
    Keywords: Encompassing ; Equilibrium correction models ; Error correction ; Evaluation ; Exogeneity ; Forecasting ; Modeling ; Nowcasting ; Parameter constancy ; Robustification ; Structural breaks
    JEL: C53
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:fip:fedgif:1184&r=ecm

This nep-ecm issue is ©2016 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.