
on Econometrics 
By:  Sofia Anyfantaki; Antonis Demos (www.aueb.gr/users/demos) 
Abstract:  Time-varying GARCH-M models are commonly employed in econometrics and financial economics. Yet the recursive nature of the conditional variance makes exact likelihood analysis of these models computationally infeasible. This paper outlines the issues and suggests employing a Markov chain Monte Carlo algorithm which allows the calculation of a classical estimator via the simulated EM algorithm, or a simulated Bayesian solution, in only O(T) computational operations, where T is the sample size. Furthermore, the theoretical dynamic properties of a time-varying-parameter EGARCH(1,1)-M model are derived. We discuss them and apply the suggested Bayesian estimation to three major stock markets. 
Keywords:  Dynamic heteroskedasticity, in-mean models, time-varying parameter, Markov chain Monte Carlo, simulated EM algorithm, Bayesian inference 
JEL:  C13 C15 C63 
Date:  2012–07–30 
URL:  http://d.repec.org/n?u=RePEc:aue:wpaper:1228&r=ecm 
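The O(T) claim is easiest to see from the variance recursion itself: given the parameters, the conditional variance is a single forward pass through the data. As a minimal sketch (with constant, arbitrary illustrative parameters; in the paper the parameters are themselves time-varying, which is what breaks exact likelihood analysis), an EGARCH(1,1)-M path can be generated in one O(T) loop:

```python
import numpy as np

def simulate_egarch_m(T, omega=-0.1, beta=0.95, alpha=0.2, gamma=-0.1,
                      delta=0.05, seed=0):
    """Simulate a constant-parameter EGARCH(1,1)-M path in a single O(T) pass.

    Illustrative only: parameter values are arbitrary, and the paper's model
    lets them vary over time.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(T)
    Ez = np.sqrt(2.0 / np.pi)          # E|z| for a standard normal shock
    logh = np.empty(T)
    r = np.empty(T)
    logh[0] = omega / (1.0 - beta)     # start at the unconditional mean of log h
    for t in range(T):
        if t > 0:
            logh[t] = (omega + beta * logh[t - 1]
                       + alpha * (abs(z[t - 1]) - Ez) + gamma * z[t - 1])
        sigma = np.exp(0.5 * logh[t])
        r[t] = delta * sigma + sigma * z[t]   # "in-mean" risk-premium term
    return r, logh
```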
By:  Mustafa Hakan Eratalay 
Abstract:  In this paper, we make two contributions to the MSV literature. First, we propose two new MSV models that account for leverage effects. Second, we compare the small-sample performance of the Quasi-Maximum Likelihood (QML) and Monte Carlo Likelihood (MCL) methods through Monte Carlo studies for Constant Correlations MSV and Time Varying Correlations MSV and for the two MSV models with leverage we propose. We also provide the specific transformations necessary for the MCL estimation of the proposed MSV models with leverage. Our results confirm that the MCL estimator has better small-sample performance than the QML estimator. In terms of parameter estimation, both estimators perform better when the series are highly correlated. In estimating the underlying volatilities and correlations, the QML estimator's performance comes closer to that of the MCL estimator when the SV process has higher variance or when the correlations are time varying, while it performs relatively worse in MSV models with leverage. Finally, we include an empirical illustration by estimating one of our proposed MSV models with leverage on trivariate data from the major European stock markets. 
Keywords:  Multivariate Stochastic Volatility, Estimation, Constant Correlations, Time Varying Correlations, Leverage 
JEL:  C32 
Date:  2012–10–15 
URL:  http://d.repec.org/n?u=RePEc:eus:wpaper:ec0412&r=ecm 
By:  Bannouh, K.; Martens, M.P.E.; Oomen, R.C.A.; Dijk, D.J.C. van 
Abstract:  We introduce a Mixed-Frequency Factor Model (MFFM) to estimate vast-dimensional covariance matrices of asset returns. The MFFM uses high-frequency (intraday) data to estimate factor (co)variances and idiosyncratic risk and low-frequency (daily) data to estimate the factor loadings. We propose the use of highly liquid assets such as exchange-traded funds (ETFs) as factors. Prices for these contracts are observed essentially free of microstructure noise at high frequencies, allowing us to obtain precise estimates of the factor covariances. The factor loadings instead are estimated from daily data to avoid biases due to market microstructure effects such as the relative illiquidity of individual stocks and non-synchronicity between the returns on factors and stocks. Our theoretical, simulation and empirical results illustrate that the performance of the MFFM is excellent, both compared to conventional factor models based solely on low-frequency data and to popular realized covariance estimators based on high-frequency data. 
Keywords:  vast-dimensional covariance estimation; mixed-frequency factor models 
Date:  2012–10–23 
URL:  http://d.repec.org/n?u=RePEc:dgr:eureri:1765037470&r=ecm 
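The MFFM covariance assembly can be sketched as Sigma = B Sigma_F B' + D: the factor covariance Sigma_F from high-frequency factor returns (a realized covariance), the loadings B from a daily regression, and a diagonal idiosyncratic block D. A simplified sketch, assuming pre-aligned return matrices and idiosyncratic variances supplied directly rather than estimated (the function name and inputs are illustrative, not the authors' implementation):

```python
import numpy as np

def mffm_covariance(daily_stock, daily_factor, intraday_factor, idio_var):
    """Assemble Sigma = B * Sigma_F * B' + D from mixed-frequency inputs.

    Illustrative sketch: Sigma_F comes from intraday factor returns, the
    loadings B from a daily OLS regression, and the idiosyncratic variances
    are taken as given here.
    """
    # Realized factor covariance from intraday factor returns
    Sigma_F = intraday_factor.T @ intraday_factor
    # Loadings from daily data: regress stock returns on factor returns
    coef, *_ = np.linalg.lstsq(daily_factor, daily_stock, rcond=None)
    B = coef.T                              # shape (n_stocks, n_factors)
    return B @ Sigma_F @ B.T + np.diag(idio_var)
```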
By:  Christian Aßmann; Jens Boysen-Hogrefe; Markus Pape 
Abstract:  Due to their well-known indeterminacies, factor models require identifying assumptions to guarantee unique parameter estimates. For Bayesian estimation, these identifying assumptions are usually implemented by imposing constraints on certain model parameters. This strategy, however, may result in posterior distributions with shapes that depend on the ordering of cross-sections in the data set. We propose an alternative approach, which relies on a sampler without the usual identifying constraints. Identification is reached ex post based on a Procrustes transformation. The resulting posterior estimates are ordering invariant and show favorable properties with respect to convergence and statistical as well as numerical accuracy. 
Keywords:  Bayesian Estimation; Factor Models; Multimodality; Ordering; Orthogonal Transformation 
JEL:  C11 C31 C38 C51 C52 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:kie:kieliw:1799&r=ecm 
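The ex-post identification step relies on the classical orthogonal Procrustes problem: rotate each unconstrained posterior draw of the factor loadings toward a common reference, removing the rotational indeterminacy after sampling rather than via constraints. A minimal numpy sketch of that rotation (the paper's choice of reference and any sign or permutation handling are not reproduced here):

```python
import numpy as np

def procrustes_rotation(L, L_ref):
    """Rotate a loadings draw L toward a reference L_ref.

    Orthogonal Procrustes: the minimizer of ||L Q - L_ref||_F over
    orthogonal Q is Q = U V' from the SVD of L' L_ref.
    """
    U, _, Vt = np.linalg.svd(L.T @ L_ref)
    Q = U @ Vt                 # orthogonal rotation matrix
    return L @ Q
```

Applying this to every posterior draw makes the draws comparable, so posterior means and quantiles of the loadings become well defined.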
By:  Sugawara, Shinya 
Abstract:  This paper proposes a new inferential framework for structural econometric models using a nonparametric Bayesian approach. Although estimation methods based on moment conditions allow flexible estimation without distributional assumptions, they have difficulty conducting a prediction analysis. I propose a nonparametric Bayesian methodology for estimation and prediction analysis. My methodology is applied to an empirical analysis of the Japanese private nursing home market. This market operates under a sticky economic circumstance, and my prediction simulates an intervention that removes it. The prediction result implies that this outdated circumstance is harmful for consumers today. 
Keywords:  Nonparametric Bayes; Nonlinear simultaneous equation model; Prediction; Industrial organization; Nursing home; Long-term care in Japan 
JEL:  J14 L11 C11 
Date:  2012–10–23 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:42154&r=ecm 
By:  Edgar C. Merkle; Jinyan Fan; Achim Zeileis 
Abstract:  Researchers are often interested in testing for measurement invariance with respect to an ordinal auxiliary variable such as age group, income class, or school grade. In a factoranalytic context, these tests are traditionally carried out via a likelihood ratio test statistic comparing a model where parameters differ across groups to a model where parameters are equal across groups. This test neglects the fact that the auxiliary variable is ordinal, and it is also known to be overly sensitive at large sample sizes. In this paper, we propose test statistics that explicitly account for the ordinality of the auxiliary variable, resulting in higher power against "monotonic" violations of measurement invariance and lower power against "nonmonotonic" ones. The statistics are derived from a family of tests based on stochastic processes that have recently received attention in the psychometric literature. The statistics are illustrated via an application involving real data, and their performance is studied via simulation. 
Keywords:  measurement invariance, ordinal variable, parameter stability, factor analysis, structural equation models 
JEL:  C30 C38 C52 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:inn:wpaper:201224&r=ecm 
By:  Carnero, María Ángeles; Peña, Daniel; Ruiz, Esther 
Abstract:  GARCH volatilities depend on the unconditional variance, which is a nonlinear function of the parameters. Consequently, they can have larger biases than estimated parameters. Using robust methods to estimate both parameters and volatilities is shown to outperform Maximum Likelihood procedures. 
Keywords:  Financial markets; Heteroscedasticity; QML estimator; Robustness; 
JEL:  C22 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:ner:carlos:info:hdl:10016/15744&r=ecm 
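The mechanism is visible in the GARCH(1,1) unconditional variance sigma^2 = omega / (1 - alpha - beta), which blows up as the persistence alpha + beta approaches one, so small parameter biases translate into much larger variance biases. A tiny numerical illustration (the parameter values are made up, not taken from the paper):

```python
# The unconditional GARCH(1,1) variance sigma^2 = omega / (1 - alpha - beta)
# is highly nonlinear near alpha + beta = 1, so small parameter biases can
# produce much larger variance biases. Parameter values are illustrative.
def uncond_var(omega, alpha, beta):
    return omega / (1.0 - alpha - beta)

true_var = uncond_var(0.05, 0.09, 0.900)   # persistence 0.990: variance 5
biased   = uncond_var(0.05, 0.09, 0.905)   # a 0.005 bias in beta doubles it to 10
```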
By:  Alexei Onatski; Marcelo Moreira J.; Marc Hallin 
Abstract:  This paper deals with the local asymptotic structure, in the sense of Le Cam’s asymptotic theory of statistical experiments, of the signal detection problem in high dimension. More precisely, we consider the problem of testing the null hypothesis of sphericity of a high-dimensional covariance matrix against an alternative of (unspecified) multiple symmetry-breaking directions (multi-spiked alternatives). Simple analytical expressions for the asymptotic power envelope and the asymptotic powers of previously proposed tests are derived. These asymptotic powers are shown to lie very substantially below the envelope, at least for relatively small values of the number of symmetry-breaking directions under the alternative. In contrast, the asymptotic power of the likelihood ratio test based on the eigenvalues of the sample covariance matrix is shown to be close to that envelope. These results extend to the case of multi-spiked alternatives the findings of an earlier study (Onatski, Moreira and Hallin, 2011) of the single-spiked case. The methods we use here, however, are entirely new, as the Laplace approximations considered in the single-spiked context do not extend to the multi-spiked case. 
Keywords:  sphericity tests; large dimensionality; asymptotic power; spiked covariance; contiguity; power envelope 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:eca:wpaper:2013/130318&r=ecm 
By:  Omay, Tolga 
Abstract:  The aim of this study is to search for a better optimization algorithm for applying unit root tests that involve estimating nonlinear models in the testing process. The algorithms analyzed include Broyden, Fletcher, Goldfarb and Shanno (BFGS), Gauss-Jordan, Simplex, Genetic, and Extensive Grid-Search. The simulation results indicate that the derivative-free methods, such as Genetic and Simplex, have advantages over hill-climbing methods, such as BFGS and Gauss-Jordan, in obtaining accurate critical values for the Leybourne, Newbold and Vougas (1996, 1998) (LNV) and Sollis (2004) unit root tests. Moreover, when parameters are estimated under the alternative hypothesis of the LNV type of unit root tests, the derivative-free methods lead to an unbiased and efficient estimator, as opposed to those obtained from other algorithms. Finally, the empirical analyses show that the derivative-free, hill-climbing and simple grid-search methods can be used interchangeably when testing for a unit root, since all three lead to the same empirical test results. 
Keywords:  Nonlinear trend; Deterministic smooth transition; Structural change; Estimation methods 
JEL:  C15 C22 C01 
Date:  2012–10–22 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:42129&r=ecm 
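The derivative-free idea can be illustrated with an extensive grid search for an LNV-style logistic smooth-transition trend: for each candidate (gamma, tau) the transition regressor is fixed, the linear coefficients follow from OLS, and no gradients of the nonlinear objective are needed. A sketch under illustrative grids and model form (not the grids or test statistics used in the paper):

```python
import numpy as np

def fit_lnv_trend_grid(y, gammas, taus):
    """Derivative-free (extensive grid search) fit of an LNV-style trend.

    Sketch: y_t ~ a + b * S(gamma, tau; t/T) with a logistic transition S.
    For each (gamma, tau) on the grid the transition regressor is fixed and
    (a, b) follow from OLS. Grids and model form are illustrative.
    """
    T = len(y)
    t = np.arange(1, T + 1) / T
    best_ssr, best_params = np.inf, None
    for g in gammas:
        for tau in taus:
            s = 1.0 / (1.0 + np.exp(-g * (t - tau)))   # logistic transition
            X = np.column_stack([np.ones(T), s])
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            ssr = np.sum((y - X @ coef) ** 2)
            if ssr < best_ssr:
                best_ssr, best_params = ssr, (g, tau, *coef)
    return best_params
```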
By:  Joshua C C Chan 
Abstract:  Moving average and stochastic volatility are two important components for modeling and forecasting macroeconomic and financial time series. The former aims to capture short-run dynamics, whereas the latter allows for volatility clustering and time-varying volatility. We introduce a new class of models that includes both of these useful features. The new models allow the conditional mean process to have a state space form. As such, this general framework includes a wide variety of popular specifications, including the unobserved components and time-varying parameter models. Having a moving average process, however, means that the errors in the measurement equation are no longer serially independent, and estimation becomes more difficult. We develop a posterior simulator that builds upon recent advances in precision-based algorithms for estimating this new class of models. In an empirical application involving U.S. inflation, we find that these moving average stochastic volatility models provide better in-sample fit and out-of-sample forecast performance than the standard variants with only stochastic volatility. 
JEL:  C11 C51 C53 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:acb:cbeeco:2012591&r=ecm 
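The computational difficulty, and the precision-based remedy, come from the MA errors making the measurement-equation covariance non-diagonal. That matrix is, however, banded, which is what keeps likelihood evaluation and sampling fast. A sketch for MA(1) errors (dense storage here only for clarity; a production implementation would use sparse or banded routines):

```python
import numpy as np

def ma1_covariance(T, psi, sigma2=1.0):
    """Covariance matrix of MA(1) errors u_t = e_t + psi * e_{t-1}.

    Sketch of the computational point: the matrix is banded (tridiagonal),
    so its Cholesky factor, and hence likelihood evaluation and
    precision-based sampling, costs O(T) rather than O(T^3).
    """
    Omega = np.zeros((T, T))
    i = np.arange(T)
    Omega[i, i] = sigma2 * (1.0 + psi ** 2)       # Var(u_t)
    Omega[i[:-1], i[:-1] + 1] = sigma2 * psi      # Cov(u_t, u_{t+1})
    Omega[i[:-1] + 1, i[:-1]] = sigma2 * psi
    return Omega
```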
By:  D.E. Allen (School of Accounting Finance and Economics Edith Cowan University Joondalup Drive Joondalup Western Australia 6027); Abhay K Singh (School of Accounting Finance & Economics, Edith Cowan University, Australia); R. Powell (School of Accounting Finance and Economics Edith Cowan University Joondalup Drive Joondalup Western Australia 6027); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam.); James Taylor (Said Business School, University of Oxford, Oxford); Lyn Thomas (Southampton Management School, University of Southampton, Southampton) 
Abstract:  This paper examines the asymmetric relationship between price and implied volatility and the associated extreme quantile dependence using linear and nonlinear quantile regression approaches. Our goal is to demonstrate that the relationship between volatility and market return, as quantified by Ordinary Least Squares (OLS) regression, is not uniform across the distribution of the volatility-price return pairs using quantile regressions. We examine the bivariate relationships of six volatility-return pairs, namely: CBOE VIX and S&P 500, FTSE 100 Volatility and FTSE 100, NASDAQ 100 Volatility (VXN) and NASDAQ, DAX Volatility (VDAX) and DAX 30, CAC Volatility (VCAC) and CAC 40, and STOXX Volatility (VSTOXX) and STOXX. The assumption of a normal distribution in the return series is not appropriate when the distribution is skewed, and hence OLS may not capture a complete picture of the relationship. Quantile regression, on the other hand, can be set up with various loss functions, both parametric and nonparametric (linear case), and can be evaluated with skewed marginal-based copulas (for the nonlinear case), which is helpful in evaluating the non-normal and nonlinear nature of the relationship between price and volatility. In the empirical analysis we compare the results from linear quantile regression (LQR) and copula-based nonlinear quantile regression, known as copula quantile regression (CQR). The discussion of the properties of the volatility series and the empirical findings in this paper have significance for portfolio optimization, hedging strategies, trading strategies and risk management in general. 
Keywords:  Return-volatility relationship, quantile regression, copula, copula quantile regression, volatility index, tail dependence. 
JEL:  C14 C58 G11 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:ucm:doicae:1224&r=ecm 
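Both LQR and CQR build on the check (pinball) loss, which replaces OLS's squared loss and whose minimizer is the conditional tau-quantile. A minimal sketch of the loss itself (a building block only, not the paper's copula-based conditional quantile construction):

```python
import numpy as np

def pinball_loss(y, q_hat, tau):
    """Average check (pinball) loss rho_tau at a candidate quantile q_hat.

    The tau-th quantile minimizes E[rho_tau(y - q)], which is the loss that
    both linear and copula quantile regression build on.
    """
    u = y - q_hat
    return np.mean(np.where(u >= 0, tau * u, (tau - 1.0) * u))
```

The empirical tau-quantile attains a lower loss than the mean whenever the two differ, which is what lets quantile regression trace out tail behaviour that a single OLS line averages away.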
By:  Raffaella Calabrese (University of MilanoBicocca) 
Abstract:  With the implementation of the Basel II accord, the development of accurate loss given default models is becoming increasingly important. The main objective of this paper is to propose a new model to estimate Loss Given Default (LGD) for bank loans by applying generalized additive models. Our proposal allows us to represent the high concentration of LGDs at the boundaries. The model is useful in uncovering nonlinear covariate effects and in estimating the mean and the variance of LGDs. The suggested model is applied to a comprehensive survey on the loan recovery process of Italian banks. To model LGD in downturn conditions, we include macroeconomic variables in the model. Out-of-time validation shows that our model outperforms popular models such as Tobit, decision tree and linear regression models over different time horizons. 
Keywords:  downturn LGD, generalized additive model, Basel II 
Date:  2012–10–22 
URL:  http://d.repec.org/n?u=RePEc:ucd:wpaper:201224&r=ecm 
By:  Geert Mesters (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam) 
Abstract:  We study the forecasting of the yearly outcome of the Boat Race between Cambridge and Oxford. We compare the relative performance of different dynamic models for forty years of forecasting. Each model is defined by a binary density conditional on a latent signal that is specified as a dynamic stochastic process with fixed predictors. The outofsample predictive ability of the models is compared between each other by using a variety of loss functions and predictive ability tests. We find that the model with its latent signal specified as an autoregressive process cannot be outperformed by the other specifications. This model is able to correctly forecast 30 out of 40 outcomes of the Boat Race. 
Keywords:  Binary time series; Predictive ability; NonGaussian state space model 
JEL:  C32 C35 
Date:  2012–10–23 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20120110&r=ecm 
By:  Sarah Brown (Department of Economics, The University of Sheffield); Mark N. Harris (Department of Econometrics and Quantitative Modelling, Curtin University, Australia); Jennifer Roberts (Department of Economics, The University of Sheffield); Karl Taylor (Department of Economics, The University of Sheffield) 
Abstract:  We introduce the (panel) zero-inflated interval regression (ZIIR) model to investigate GP visits using individual-level data from the British Household Panel Survey. The ZIIR is particularly suitable for this application as it jointly estimates the probability of visiting the GP and then, conditional on visiting, the frequency of visits (defined by given numerical intervals in the data). The results show that different socioeconomic factors influence the probability of visiting the GP and the frequency of visits. 
Keywords:  GP visits; panel data; zero-inflated interval regression 
JEL:  I10 C24 C25 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:shf:wpaper:2012026&r=ecm 
By:  Dupuy, Arnaud (Reims Management School); Galichon, Alfred (Sciences Po, Paris) 
Abstract:  In the context of the Beckerian theory of marriage, when men and women match on a single-dimensional index that is the weighted sum of their respective multivariate attributes, many papers in the literature have used linear canonical correlation, and related techniques, in order to estimate these weights. We argue that this estimation technique is inconsistent and suggest some solutions. 
Keywords:  matching, marriage, assignment, assortative matching, canonical correlation 
JEL:  C78 D61 C13 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp6942&r=ecm 
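For reference, the estimator being critiqued is linear canonical correlation: find index weights a and b maximizing the correlation between Xa and Yb, computable from the SVD of the whitened cross-covariance. A numpy sketch of the first canonical pair (illustrative of the technique only; the paper's point is that it is inconsistent in the matching context):

```python
import numpy as np

def first_canonical_weights(X, Y):
    """First canonical correlation pair via SVD of the whitened cross-covariance.

    Sketch of the linear canonical correlation technique the paper critiques:
    weights (a, b) maximize the correlation between the indices X a and Y b.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Lx = np.linalg.cholesky(X.T @ X)
    Ly = np.linalg.cholesky(Y.T @ Y)
    Wx, Wy = np.linalg.inv(Lx), np.linalg.inv(Ly)
    U, s, Vt = np.linalg.svd(Wx @ (X.T @ Y) @ Wy.T)
    a = Wx.T @ U[:, 0]          # weights for the X-side index
    b = Wy.T @ Vt[0]            # weights for the Y-side index
    return a, b, s[0]           # s[0] is the first canonical correlation
```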
By:  Zhi Zheng; Richard B. Sowers 
Abstract:  In this paper we introduce a completely continuous and time-varying model of the evolution of market limit orders, based on the existence, uniqueness and regularity of the solutions to a type of stochastic partial differential equation obtained in Zheng and Sowers (2012). In contrast to several models proposed and researched in the literature, this model provides complete continuity in both time and price, inherited from the stochastic PDE, and is thus particularly suitable for cases where transactions happen at an extremely fast pace, such as those delivered by high-frequency traders (HFTs). We first elaborate the precise definition of the model with its associated parameters, and show its existence and uniqueness from the related mathematical results given a fixed set of parameters. We then statistically derive parameter estimation schemes for the model using maximum likelihood and least mean-square-error estimation methods, under criteria such as AIC to accommodate a varying number of parameters. Finally, as a typical economics and finance use case of the model, we settle the investment optimization problem in both the static and dynamic sense by analysing the stochastic (Itô) evolution of the utility function of an investor or trader who takes the model and its parameters as exogenous. Two theorems are proved which provide criteria for determining the best (limit) price and time point at which to make the transaction. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1210.7230&r=ecm 
By:  Joshua C C Chan; Gary Koop; Simon M Potter 
Abstract:  In this paper, we develop a bivariate unobserved components model for inflation and unemployment. The unobserved components are trend inflation and the non-accelerating inflation rate of unemployment (NAIRU). Our model also incorporates a time-varying Phillips curve and time-varying inflation persistence. What sets this paper apart from the existing literature is that we do not use unbounded random walks for the unobserved components, but rather bounded random walks. For instance, trend inflation is assumed to evolve within bounds. Our empirical work shows the importance of bounding. We find that our bounded bivariate model forecasts better than many alternatives, including a version of our model with unbounded unobserved components. Our model also yields sensible estimates of trend inflation, the NAIRU, inflation persistence and the slope of the Phillips curve. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:acb:cbeeco:2012590&r=ecm 
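The bounding idea can be sketched as a random walk whose innovations are effectively drawn from a truncated distribution, so the process stays inside a band, e.g. trend inflation confined to a plausible range. A simple rejection-sampling sketch (the bounds and step size are illustrative, not the paper's estimates):

```python
import numpy as np

def bounded_random_walk(T, lo=0.0, hi=5.0, sigma=0.2, start=2.0, seed=0):
    """Random walk constrained to remain inside [lo, hi].

    Sketch of the bounding idea: each innovation is accepted only if the
    next value stays in the band, which is equivalent to drawing from a
    truncated normal. Bounds and step size are illustrative.
    """
    rng = np.random.default_rng(seed)
    x = np.empty(T)
    x[0] = start
    for t in range(1, T):
        while True:                     # simple rejection sampler
            candidate = x[t - 1] + sigma * rng.standard_normal()
            if lo <= candidate <= hi:
                x[t] = candidate
                break
    return x
```

Unlike an unbounded random walk, long simulated paths of this process never drift to implausible values, which is the property the forecasting comparison in the paper exploits.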