nep-ecm New Economics Papers
on Econometrics
Issue of 2012‒11‒03
seventeen papers chosen by
Sune Karlsson
Örebro University

  1. Estimation and Properties of a Time-Varying EGARCH(1,1) in Mean Model By Sofia Anyfantaki; Antonis Demos
  2. Estimation of Multivariate Stochastic Volatility Models: A Comparative Monte Carlo Study By Mustafa Hakan Eratalay
  3. Realized mixed-frequency factor models for vast dimensional covariance estimation By Bannouh, K.; Martens, M.P.E.; Oomen, R.C.A.; Dijk, D.J.C. van
  4. The Directional Identification Problem in Bayesian Factor Analysis: An Ex-Post Approach By Christian Aßmann; Jens Boysen-Hogrefe; Markus Pape
  5. A nonparametric Bayesian approach for counterfactual prediction with an application to the Japanese private nursing home market By Sugawara, Shinya
  6. Testing for Measurement Invariance with Respect to an Ordinal Variable By Edgar C. Merkle; Jinyan Fan; Achim Zeileis
  7. Estimating GARCH volatility in the presence of outliers. By Carnero, María Ángeles; Peña, Daniel; Ruiz, Esther
  8. Signal Detection in High Dimension: The Multispiked Case By Alexei Onatski; Marcelo J. Moreira; Marc Hallin
  9. The comparison of optimization algorithms on unit root testing with smooth transition By Omay, Tolga
  10. Moving Average Stochastic Volatility Models with Application to Inflation Forecast By Joshua C C Chan
  11. The Volatility-Return Relationship: Insights from Linear and Non-Linear Quantile Regressions By D.E. Allen; Abhay K Singh; R. Powell; Michael McAleer; James Taylor; Lyn Thomas
  12. Estimating bank loans loss given default by generalized additive models By Raffaella Calabrese
  13. A Forty Year Assessment of Forecasting the Boat Race By Geert Mesters; Siem Jan Koopman
  14. Modelling Primary Health Care Use: A Panel Zero Inflated Interval Regression Approach By Sarah Brown; Mark N. Harris; Jennifer Roberts; Karl Taylor
  15. Canonical Correlation and Assortative Matching: A Remark By Dupuy, Arnaud; Galichon, Alfred
  16. A Model of Market Limit Orders By Stochastic PDE's, Parameter Estimation, and Investment Optimization By Zhi Zheng; Richard B. Sowers
  17. A Bounded Model of Time Variation in Trend Inflation, NAIRU and the Phillips Curve By Joshua C C Chan; Gary Koop; Simon M Potter

  1. By: Sofia Anyfantaki; Antonis Demos (www.aueb.gr/users/demos)
    Abstract: Time-varying GARCH-M models are commonly employed in econometrics and financial economics. Yet the recursive nature of the conditional variance makes exact likelihood analysis of these models computationally infeasible. This paper outlines the issues and suggests employing a Markov chain Monte Carlo algorithm which allows the calculation of a classical estimator via the simulated EM algorithm, or of a simulated Bayesian solution, in only O(T) computational operations, where T is the sample size. Furthermore, the theoretical dynamic properties of a time-varying-parameter EGARCH(1,1)-M model are derived and discussed, and the suggested Bayesian estimation is applied to three major stock markets.
    Keywords: Dynamic heteroskedasticity, in mean models, time varying parameter, Markov chain Monte Carlo, simulated EM algorithm, Bayesian inference
    JEL: C13 C15 C63
    Date: 2012–07–30
    URL: http://d.repec.org/n?u=RePEc:aue:wpaper:1228&r=ecm
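    A constant-parameter EGARCH(1,1)-M baseline helps fix ideas; this is a minimal sketch with the in-mean term entering as \sigma_t^2 (other variants use \sigma_t or \ln\sigma_t^2), and the paper's time-varying version lets the coefficients below evolve over time:
      y_t = \mu + \lambda \sigma_t^2 + \varepsilon_t, \qquad \varepsilon_t = \sigma_t z_t, \quad z_t \sim \text{i.i.d.}(0,1),
      \ln \sigma_t^2 = \omega + \beta \ln \sigma_{t-1}^2 + \alpha \left( |z_{t-1}| - E|z_{t-1}| \right) + \gamma z_{t-1}.
    The recursion in \ln \sigma_t^2 is what makes exact likelihood evaluation expensive once the parameters themselves become latent processes, which is the case the MCMC and simulated EM machinery is designed to handle in O(T) operations.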
  2. By: Mustafa Hakan Eratalay
    Abstract: In this paper, we make two contributions to the MSV literature. First, we propose two new MSV models that account for leverage effects. Second, we compare the small sample performances of Quasi Maximum Likelihood (QML) and Monte Carlo Likelihood (MCL) methods through Monte Carlo studies for Constant Correlations MSV and Time Varying Correlations MSV models and for the two MSV models with leverage that we propose. We also provide the specific transformations necessary for the MCL estimation of the proposed MSV models with leverage. Our results confirm that the MCL estimator has better small sample performance than the QML estimator. In terms of parameter estimation, both estimators perform better when the series are highly correlated. In estimating the underlying volatilities and correlations, the QML estimator's performance comes closer to that of the MCL estimator when the SV process has higher variance or when the correlations are time varying, while it performs relatively worse in MSV models with leverage. Finally, we include an empirical illustration by estimating one of the proposed MSV models with leverage on trivariate data from the major European stock markets.
    Keywords: Multivariate Stochastic Volatility, Estimation, Constant Correlations, Time Varying Correlations, Leverage
    JEL: C32
    Date: 2012–10–15
    URL: http://d.repec.org/n?u=RePEc:eus:wpaper:ec0412&r=ecm
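    For orientation, a minimal constant-correlation MSV specification with leverage (the paper's exact models and the transformations needed for MCL estimation differ) is
      y_{it} = \exp(h_{it}/2)\, \varepsilon_{it}, \qquad h_{i,t+1} = \mu_i + \phi_i (h_{it} - \mu_i) + \eta_{it}, \quad i = 1, \dots, N,
    where the return innovations \varepsilon_t have a constant correlation matrix R (time-varying in the TVC variant) and leverage is captured by a negative correlation between \varepsilon_{it} and the volatility innovation \eta_{it}.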
  3. By: Bannouh, K.; Martens, M.P.E.; Oomen, R.C.A.; Dijk, D.J.C. van
    Abstract: We introduce a Mixed-Frequency Factor Model (MFFM) to estimate vast dimensional covariance matrices of asset returns. The MFFM uses high-frequency (intraday) data to estimate factor (co)variances and idiosyncratic risk and low-frequency (daily) data to estimate the factor loadings. We propose the use of highly liquid assets such as exchange traded funds (ETFs) as factors. Prices for these contracts are observed essentially free of microstructure noise at high frequencies, allowing us to obtain precise estimates of the factor covariances. The factor loadings instead are estimated from daily data to avoid biases due to market microstructure effects such as the relative illiquidity of individual stocks and non-synchronicity between the returns on factors and stocks. Our theoretical, simulation and empirical results illustrate that the performance of the MFFM is excellent, both compared to conventional factor models based solely on low-frequency data and to popular realized covariance estimators based on high-frequency data.
    Keywords: vast dimensional covariance estimation; mixed-frequency factor models
    Date: 2012–10–23
    URL: http://d.repec.org/n?u=RePEc:dgr:eureri:1765037470&r=ecm
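    The mechanics of the MFFM can be illustrated with a small numerical sketch; the function and variable names below are illustrative rather than taken from the paper, and the alignment of the realized factor covariance with the daily horizon is assumed to be handled upstream.
      import numpy as np

      def mffm_covariance(daily_assets, daily_factors, intraday_factors):
          """Illustrative mixed-frequency factor covariance estimate.
          daily_assets: (T, N) daily asset returns; daily_factors: (T, K) daily
          factor returns; intraday_factors: (M, K) intraday factor returns."""
          # Factor loadings from daily OLS of demeaned asset returns on factors.
          Xd = daily_factors - daily_factors.mean(axis=0)
          Yd = daily_assets - daily_assets.mean(axis=0)
          B = np.linalg.lstsq(Xd, Yd, rcond=None)[0].T      # (N, K)
          # Realized factor covariance from high-frequency factor returns.
          F = intraday_factors.T @ intraday_factors          # (K, K)
          # Idiosyncratic variances kept on the diagonal, from daily residuals.
          D = np.diag((Yd - Xd @ B.T).var(axis=0))
          return B @ F @ B.T + D                             # (N, N)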
  4. By: Christian Aßmann; Jens Boysen-Hogrefe; Markus Pape
    Abstract: Due to their well-known indeterminacies, factor models require identifying assumptions to guarantee unique parameter estimates. For Bayesian estimation, these identifying assumptions are usually implemented by imposing constraints on certain model parameters. This strategy, however, may result in posterior distributions with shapes that depend on the ordering of cross-sections in the data set. We propose an alternative approach, which relies on a sampler without the usual identifying constraints. Identification is reached ex-post based on a Procrustes transformation. The resulting posterior estimates are ordering invariant and show favorable properties with respect to convergence and statistical as well as numerical accuracy.
    Keywords: Bayesian Estimation; Factor Models; Multimodality; Ordering; Orthogonal Transformation
    JEL: C11 C31 C38 C51 C52
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:kie:kieliw:1799&r=ecm
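    The ex-post identification step can be sketched with an orthogonal Procrustes transformation; the code below is a minimal illustration of that building block (the choice of reference matrix, sign and permutation handling, and the treatment of the full posterior sample are left out), not the paper's implementation.
      import numpy as np

      def procrustes_align(draw, target):
          """Rotate one posterior draw of the (N, K) loading matrix toward a
          reference matrix: R = argmin ||draw @ R - target||_F over orthogonal R,
          obtained from the SVD of draw.T @ target."""
          U, _, Vt = np.linalg.svd(draw.T @ target)
          return draw @ (U @ Vt)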
  5. By: Sugawara, Shinya
    Abstract: This paper proposes a new inferential framework for structural econometric models using a nonparametric Bayesian approach. Although estimation methods based on moment conditions permit flexible estimation without distributional assumptions, they have difficulty supporting prediction analysis. I propose a nonparametric Bayesian methodology that handles both estimation and prediction. The methodology is applied to an empirical analysis of the Japanese private nursing home market. This market is characterized by a sticky economic circumstance, and my prediction simulates an intervention that removes it. The results imply that this outdated circumstance harms consumers in the market today.
    Keywords: Nonparametric Bayes; Nonlinear simultaneous equation model; Prediction; Industrial organization; Nursing home; Long-term care in Japan
    JEL: J14 L11 C11
    Date: 2012–10–23
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:42154&r=ecm
  6. By: Edgar C. Merkle; Jinyan Fan; Achim Zeileis
    Abstract: Researchers are often interested in testing for measurement invariance with respect to an ordinal auxiliary variable such as age group, income class, or school grade. In a factor-analytic context, these tests are traditionally carried out via a likelihood ratio test statistic comparing a model where parameters differ across groups to a model where parameters are equal across groups. This test neglects the fact that the auxiliary variable is ordinal, and it is also known to be overly sensitive at large sample sizes. In this paper, we propose test statistics that explicitly account for the ordinality of the auxiliary variable, resulting in higher power against "monotonic" violations of measurement invariance and lower power against "non-monotonic" ones. The statistics are derived from a family of tests based on stochastic processes that have recently received attention in the psychometric literature. The statistics are illustrated via an application involving real data, and their performance is studied via simulation.
    Keywords: measurement invariance, ordinal variable, parameter stability, factor analysis, structural equation models
    JEL: C30 C38 C52
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:inn:wpaper:2012-24&r=ecm
  7. By: Carnero, María Ángeles; Peña, Daniel; Ruiz, Esther
    Abstract: GARCH volatilities depend on the unconditional variance, which is a non-linear function of the parameters. Consequently, they can have larger biases than estimated parameters. Using robust methods to estimate both parameters and volatilities is shown to outperform Maximum Likelihood procedures.
    Keywords: Financial markets; Heteroscedasticity; QML estimator; Robustness
    JEL: C22
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ner:carlos:info:hdl:10016/15744&r=ecm
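    The nonlinearity referred to in the abstract is easy to see in the standard GARCH(1,1) case:
      \sigma_t^2 = \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2, \qquad \text{Var}(\varepsilon_t) = \frac{\omega}{1 - \alpha - \beta} \quad (\alpha + \beta < 1),
    so modest biases in the estimates of \alpha and \beta can translate into much larger biases in the implied unconditional variance when \alpha + \beta is close to one.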
  8. By: Alexei Onatski; Marcelo Moreira J.; Marc Hallin
    Abstract: This paper deals with the local asymptotic structure, in the sense of Le Cam's asymptotic theory of statistical experiments, of the signal detection problem in high dimension. More precisely, we consider the problem of testing the null hypothesis of sphericity of a high-dimensional covariance matrix against an alternative of (unspecified) multiple symmetry-breaking directions (multispiked alternatives). Simple analytical expressions for the asymptotic power envelope and the asymptotic powers of previously proposed tests are derived. These asymptotic powers are shown to lie very substantially below the envelope, at least for relatively small values of the number of symmetry-breaking directions under the alternative. In contrast, the asymptotic power of the likelihood ratio test based on the eigenvalues of the sample covariance matrix is shown to be close to that envelope. These results extend to the case of multispiked alternatives the findings of an earlier study (Onatski, Moreira and Hallin, 2011) of the single-spiked case. The methods we are using here, however, are entirely new, as the Laplace approximations considered in the single-spiked context do not extend to the multispiked case.
    Keywords: sphericity tests; large dimensionality; asymptotic power; spiked covariance; contiguity; power envelope
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/130318&r=ecm
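    In this setting the covariance matrix under a multispiked alternative takes (up to normalization) the form
      \Sigma = \sigma^2 \left( I_p + \sum_{i=1}^{k} h_i\, v_i v_i' \right),
    with k symmetry-breaking directions v_i and spike sizes h_i > 0; the null of sphericity corresponds to h_1 = \dots = h_k = 0. This is the standard formulation of the spiked alternative, and the paper's exact parametrization and asymptotic regime may differ in detail.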
  9. By: Omay, Tolga
    Abstract: The aim of this study is to search for a better optimization algorithm in applying unit root tests that involve nonlinear models in the testing process. The algorithms analyzed include Broyden, Fletcher, Goldfarb and Shanno (BFGS), Gauss-Jordan, Simplex, Genetic, and Extensive Grid-Search. The simulation results indicate that the derivative-free methods, such as Genetic and Simplex, have advantages over hill-climbing methods, such as BFGS and Gauss-Jordan, in obtaining accurate critical values for the Leybourne, Newbold and Vougas (1996, 1998) (LNV) and Sollis (2004) unit root tests. Moreover, when parameters are estimated under the alternative hypothesis of the LNV type of unit root tests, the derivative-free methods lead to unbiased and efficient estimators, as opposed to those obtained from the other algorithms. Finally, the empirical analyses show that the derivative-free methods, hill climbing and simple grid search can be used interchangeably when testing for a unit root, since all three optimization methods lead to the same empirical test results.
    Keywords: Nonlinear trend; Deterministic smooth transition; Structural change; Estimation methods
    JEL: C15 C22 C01
    Date: 2012–10–22
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:42129&r=ecm
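    For reference, the simplest LNV specification replaces a discrete structural break with a logistic smooth transition in the deterministic component,
      y_t = \alpha_1 + \alpha_2 S_t(\gamma, \tau) + \varepsilon_t, \qquad S_t(\gamma, \tau) = \left[ 1 + \exp\{ -\gamma (t - \tau T) \} \right]^{-1},
    which is nonlinear in the speed and location parameters (\gamma, \tau) and therefore has to be fitted by the numerical optimizers whose performance the paper compares.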
  10. By: Joshua C C Chan
    Abstract: Moving average and stochastic volatility are two important components for modeling and forecasting macroeconomic and financial time series. The former aims to capture short-run dynamics, whereas the latter allows for volatility clustering and time-varying volatility. We introduce a new class of models that includes both of these useful features. The new models allow the conditional mean process to have a state space form. As such, this general framework includes a wide variety of popular specifications, including the unobserved components and time-varying parameter models. Having a moving average process, however, means that the errors in the measurement equation are no longer serially independent, and estimation becomes more difficult. We develop a posterior simulator that builds upon recent advances in precision-based algorithms for estimating this new class of models. In an empirical application involving U.S. inflation we find that these moving average stochastic volatility models provide better in-sample fit and out-of-sample forecast performance than the standard variants with only stochastic volatility.
    JEL: C11 C51 C53
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2012-591&r=ecm
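    A minimal member of this class, with an MA(1) error and a stationary log-volatility process (the paper nests this in a general state space form for the conditional mean \mu_t), is
      y_t = \mu_t + u_t + \psi u_{t-1}, \qquad u_t \sim N(0, e^{h_t}),
      h_t = \mu_h + \phi (h_{t-1} - \mu_h) + \eta_t, \qquad \eta_t \sim N(0, \sigma_h^2),
    so the measurement errors are serially correlated through \psi while their scale evolves through h_t, which is why the standard samplers need the precision-based modifications developed in the paper.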
  11. By: D.E. Allen (School of Accounting Finance and Economics Edith Cowan University Joondalup Drive Joondalup Western Australia 6027); Abhay K Singh (School of Accounting Finance & Economics, Edith Cowan University, Australia); R. Powell (School of Accounting Finance and Economics Edith Cowan University Joondalup Drive Joondalup Western Australia 6027); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam.); James Taylor (Said Business School, University of Oxford, Oxford); Lyn Thomas (Southampton Management School, University of Southampton, Southampton)
    Abstract: This paper examines the asymmetric relationship between price and implied volatility and the associated extreme quantile dependence using linear and non-linear quantile regression approaches. Our goal is to demonstrate that the relationship between volatility and market return, as quantified by Ordinary Least Squares (OLS) regression, is not uniform across the distribution of the volatility-price return pairs when examined with quantile regressions. We examine the bivariate relationships of six volatility-return pairs, namely: CBOE VIX and S&P 500, FTSE 100 Volatility and FTSE 100, NASDAQ 100 Volatility (VXN) and NASDAQ, DAX Volatility (VDAX) and DAX 30, CAC Volatility (VCAC) and CAC 40, and STOXX Volatility (VSTOXX) and STOXX. The assumption of a normal distribution in the return series is not appropriate when the distribution is skewed, and hence OLS may not capture a complete picture of the relationship. Quantile regression, on the other hand, can be set up with various loss functions, both parametric and non-parametric (linear case), and can be evaluated with skewed marginal-based copulas (for the non-linear case), which is helpful in evaluating the non-normal and non-linear nature of the relationship between price and volatility. In the empirical analysis we compare the results from linear quantile regression (LQR) and copula-based non-linear quantile regression, known as copula quantile regression (CQR). The discussion of the properties of the volatility series and the empirical findings in this paper have significance for portfolio optimization, hedging strategies, trading strategies and risk management in general.
    Keywords: Return Volatility relationship, quantile regression, copula, copula quantile regression, volatility index, tail dependence.
    JEL: C14 C58 G11
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1224&r=ecm
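    The linear quantile regression leg of the analysis can be sketched in a few lines; the data below are simulated stand-ins (not the VIX/S&P 500 series used in the paper), and the copula-based non-linear quantile regression is not shown.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      spx_returns = rng.standard_normal(1000)                        # stand-in market returns
      vix_changes = -0.5 * spx_returns + rng.standard_normal(1000)   # stand-in volatility changes

      X = sm.add_constant(spx_returns)
      for q in (0.05, 0.50, 0.95):
          fit = sm.QuantReg(vix_changes, X).fit(q=q)
          # The slope typically differs across quantiles, which is the asymmetry
          # the paper documents for volatility-return pairs.
          print(q, fit.params)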
  12. By: Raffaella Calabrese (University of Milano-Bicocca)
    Abstract: With the implementation of the Basel II accord, the development of accurate loss given default models is becoming increasingly important. The main objective of this paper is to propose a new model to estimate Loss Given Default (LGD) for bank loans by applying generalized additive models. Our proposal makes it possible to represent the high concentration of LGDs at the boundaries. The model is useful in uncovering nonlinear covariate effects and in estimating the mean and the variance of LGDs. The suggested model is applied to a comprehensive survey on the loan recovery process of Italian banks. To model LGD in downturn conditions, we include macroeconomic variables in the model. Out-of-time validation shows that our model outperforms popular models such as Tobit, decision tree and linear regression models over different time horizons.
    Keywords: downturn LGD, generalized additive model, Basel II
    Date: 2012–10–22
    URL: http://d.repec.org/n?u=RePEc:ucd:wpaper:201224&r=ecm
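    In generic form, the generalized additive model replaces linear covariate effects with smooth functions,
      g\left( E[\mathrm{LGD}_i] \right) = \beta_0 + \sum_j f_j(x_{ij}),
    where g is a link suited to a response concentrated on [0, 1] with mass at the boundaries and the f_j are smooth (e.g. spline) components; the paper estimates both the mean and the variance of LGD within this framework, with macroeconomic variables among the x_{ij} to capture downturn conditions.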
  13. By: Geert Mesters (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam)
    Abstract: We study the forecasting of the yearly outcome of the Boat Race between Cambridge and Oxford. We compare the relative performance of different dynamic models over forty years of forecasting. Each model is defined by a binary density conditional on a latent signal that is specified as a dynamic stochastic process with fixed predictors. The models' out-of-sample predictive ability is compared using a variety of loss functions and predictive ability tests. We find that the model with its latent signal specified as an autoregressive process cannot be outperformed by the other specifications. This model correctly forecasts 30 out of 40 outcomes of the Boat Race.
    Keywords: Binary time series; Predictive ability; Non-Gaussian state space model
    JEL: C32 C35
    Date: 2012–10–23
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20120110&r=ecm
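    A representative member of the model class being compared couples a binary density with a latent dynamic signal; a minimal sketch with a probit link (the link and the exact signal dynamics vary across the specifications considered) is
      y_t \sim \text{Bernoulli}(\pi_t), \qquad \pi_t = \Phi(\theta_t), \qquad \theta_t = x_t'\beta + \alpha_t, \qquad \alpha_{t+1} = \phi\, \alpha_t + \eta_t, \quad \eta_t \sim N(0, \sigma_\eta^2),
    where y_t indicates the race outcome, x_t collects fixed predictors, and the autoregressive \alpha_t is the latent signal that the paper finds hardest to outperform.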
  14. By: Sarah Brown (Department of Economics, The University of Sheffield); Mark N. Harris (Department of Econometrics and Quantitative Modelling, Curtin University, Australia); Jennifer Roberts (Department of Economics, The University of Sheffield); Karl Taylor (Department of Economics, The University of Sheffield)
    Abstract: We introduce the (panel) zero-inflated interval regression (ZIIR) model to investigate GP visits using individual-level data from the British Household Panel Survey. The ZIIR model is particularly suitable for this application as it jointly estimates the probability of visiting the GP and then, conditional on visiting, the frequency of visits (defined by given numerical intervals in the data). The results show that different socio-economic factors influence the probability of visiting the GP and the frequency of visits.
    Keywords: GP visits; panel data; zero-inflated interval regression
    JEL: I10 C24 C25
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:shf:wpaper:2012026&r=ecm
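    A sketch of the likelihood contribution in a zero-inflated interval regression (notation illustrative, not taken from the paper): with participation probability p_i = \Lambda(z_i'\gamma) and a latent visit propensity y_i^* = x_i'\beta + u_i, u_i \sim N(0, \sigma^2), observed in intervals with cut-points c_1 < c_2 < \dots,
      P(\text{no visits}) = (1 - p_i) + p_i\, \Phi\!\left( \frac{c_1 - x_i'\beta}{\sigma} \right), \qquad P\left( \text{visits in } (c_{j-1}, c_j] \right) = p_i \left[ \Phi\!\left( \frac{c_j - x_i'\beta}{\sigma} \right) - \Phi\!\left( \frac{c_{j-1} - x_i'\beta}{\sigma} \right) \right],
    so zeros can arise both from non-participation and from the visit process itself, and the two parts may depend on different socio-economic covariates, which is the feature highlighted in the abstract.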
  15. By: Dupuy, Arnaud (Reims Management School); Galichon, Alfred (Sciences Po, Paris)
    Abstract: In the context of the Beckerian theory of marriage, when men and women match on a single-dimensional index that is the weighted sum of their respective multivariate attributes, many papers in the literature have used linear canonical correlation, and related techniques, in order to estimate these weights. We argue that this estimation technique is inconsistent and suggest some solutions.
    Keywords: matching, marriage, assignment, assortative matching, canonical correlation
    JEL: C78 D61 C13
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp6942&r=ecm
  16. By: Zhi Zheng; Richard B. Sowers
    Abstract: In this paper we introduce a completely continuous, time-variate model of the evolution of market limit orders, based on the existence, uniqueness, and regularity of the solutions to a type of stochastic partial differential equation obtained in Zheng and Sowers (2012). In contrast to several models proposed and studied in the literature, this model is fully continuous in both time and price, a property inherited from the stochastic PDE, and is thus particularly suitable for cases where transactions happen at an extremely fast pace, such as those generated by high-frequency traders (HFTs). We first give the precise definition of the model and its associated parameters, and establish existence and uniqueness from the related mathematical results for a fixed set of parameters. We then derive parameter estimation schemes for the model using maximum likelihood and least mean-square-error methods, with criteria such as the AIC used to accommodate a varying number of parameters. Finally, as a typical economics and finance use case of the model, we solve the investment optimization problem in both the static and the dynamic sense by analysing the stochastic (Itô) evolution of the utility function of an investor or trader who takes the model and its parameters as exogenous. Two theorems are proved which provide criteria for determining the best (limit) price and time point at which to make the transaction.
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1210.7230&r=ecm
  17. By: Joshua C C Chan; Gary Koop; Simon M Potter
    Abstract: In this paper, we develop a bivariate unobserved components model for inflation and unemployment. The unobserved components are trend inflation and the non-accelerating inflation rate of unemployment (NAIRU). Our model also incorporates a time-varying Phillips curve and time-varying inflation persistence. What sets this paper apart from the existing literature is that we do not use unbounded random walks for the unobserved components, but rather use bounded random walks. For instance, trend inflation is assumed to evolve within bounds. Our empirical work shows the importance of bounding. We find that our bounded bivariate model forecasts better than many alternatives, including a version of our model with unbounded unobserved components. Our model also yields sensible estimates of trend inflation, NAIRU, inflation persistence and the slope of the Phillips curve.
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2012-590&r=ecm
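    The key departure from the standard setup can be illustrated for trend inflation \tau_t: instead of an unbounded random walk, the innovation is drawn from a truncated distribution so the trend stays inside a prespecified interval,
      \tau_t = \tau_{t-1} + \varepsilon_t^{\tau}, \qquad \varepsilon_t^{\tau} \sim TN\left( a - \tau_{t-1},\; b - \tau_{t-1};\; 0, \sigma_\tau^2 \right),
    which keeps \tau_t \in [a, b] at every date; analogous bounds can be imposed on the other unobserved components, such as the NAIRU.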

This nep-ecm issue is ©2012 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.