
on Econometrics 
By:  Cagnone, Silvia; Bartolucci, Francesco 
Abstract:  Maximum likelihood estimation of dynamic latent variable models requires solving integrals that are not analytically tractable. Numerical approximations represent a possible solution to this problem. We propose to use the adaptive Gauss-Hermite (AGH) numerical quadrature approximation for a class of dynamic latent variable models for time series and panel data. These models are based on continuous time-varying latent variables which follow an autoregressive process of order 1, AR(1). Two examples of such models are the stochastic volatility models for the analysis of financial time series and the limited dependent variable models for the analysis of panel data. A comparison between the performance of AGH methods and alternative approximation methods proposed in the literature is carried out by simulation. Examples on real data are also used to illustrate the proposed approach. 
Keywords:  AR(1); categorical longitudinal data; Gauss-Hermite quadrature; limited dependent variable models; stochastic volatility model 
JEL:  C13 C32 C33 
Date:  2013–10–29 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:51037&r=ecm 
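As a minimal illustration of the quadrature building block behind the AGH approach (the adaptive version additionally recenters and rescales the nodes around the mode of each integrand, which the sketch below does not do), the following approximates a standard-normal expectation with plain Gauss-Hermite nodes; the function name is my own:

```python
import numpy as np

def gh_expectation(g, n_nodes=20):
    # plain (non-adaptive) Gauss-Hermite rule for the weight exp(-x^2);
    # AGH would additionally shift/scale the nodes toward the integrand's mode
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    # change of variables x = sqrt(2) * t maps the rule to the N(0,1) measure
    return np.sum(weights * g(np.sqrt(2.0) * nodes)) / np.sqrt(np.pi)

# E[exp(Z)] for Z ~ N(0,1); the exact value is exp(1/2)
approx = gh_expectation(np.exp)
```

With 20 nodes the rule is essentially exact for this smooth integrand; in the latent AR(1) models of the paper, one such integral appears per time point, which is what motivates the adaptive placement of nodes.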
By:  El Ghourabi, Mohamed; Francq, Christian; Telmoudi, Fedya 
Abstract:  A two-step approach for conditional Value at Risk (VaR) estimation is considered. In the first step, a generalized quasi-maximum likelihood estimator (gQMLE) is employed to estimate the volatility parameter, and in the second step the empirical quantile of the residuals serves to estimate the theoretical quantile of the innovations. When the instrumental density $h$ of the gQMLE is neither the Gaussian density utilized in the standard QMLE nor the true distribution of the innovations, both the estimates of the volatility and of the quantile are asymptotically biased. The two errors however counterbalance each other, and we finally obtain a consistent estimator of the conditional VaR. For a wide class of GARCH models, we derive the asymptotic distribution of the VaR estimator based on the gQMLE. We show that the optimal instrumental density $h$ depends neither on the GARCH parameter nor on the risk level, but only on the distribution of the innovations. A simple adaptive method based on empirical moments of the residuals makes it possible to infer an optimal element within a class of potential instrumental densities. Important asymptotic efficiency gains are achieved by using the gQMLE instead of the usual Gaussian QMLE when the innovations are heavy-tailed. We extend our approach to Distortion Risk Measure parameter estimation, where consistency of the gQMLE-based method is also proved. Numerical illustrations are provided through simulation experiments and an application to financial stock indexes. 
Keywords:  APARCH, Conditional VaR, Distortion Risk Measures, GARCH, Generalized Quasi Maximum Likelihood Estimation, Instrumental density. 
JEL:  C22 C58 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:51150&r=ecm 
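The two-step procedure can be sketched on simulated data: a volatility fit in step one, then the empirical quantile of the standardized residuals in step two. This is only an illustration with made-up parameter values, and it uses the standard Gaussian QMLE rather than the paper's generalized instrumental density $h$:

```python
import numpy as np
from scipy.optimize import minimize

# --- simulate a GARCH(1,1) with heavy-tailed innovations (made-up parameters) ---
rng = np.random.default_rng(0)
w, a, b, n = 0.1, 0.1, 0.8, 2000
z = rng.standard_t(df=5, size=n) / np.sqrt(5.0 / 3.0)  # unit-variance t(5)
eps = np.empty(n)
sigma2 = np.empty(n)
sigma2[0] = w / (1.0 - a - b)
eps[0] = np.sqrt(sigma2[0]) * z[0]
for t in range(1, n):
    sigma2[t] = w + a * eps[t - 1] ** 2 + b * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * z[t]

# --- step 1: Gaussian QMLE of the volatility parameters ---
def neg_qml(theta):
    w_, a_, b_ = theta
    s2 = np.empty(n)
    s2[0] = eps.var()
    for t in range(1, n):
        s2[t] = w_ + a_ * eps[t - 1] ** 2 + b_ * s2[t - 1]
    return 0.5 * np.sum(np.log(s2) + eps ** 2 / s2)

res = minimize(neg_qml, x0=[0.05, 0.05, 0.85],
               bounds=[(1e-6, None), (1e-6, 0.999), (1e-6, 0.999)])
w_h, a_h, b_h = res.x

# --- step 2: empirical quantile of the standardized residuals ---
s2_h = np.empty(n)
s2_h[0] = eps.var()
for t in range(1, n):
    s2_h[t] = w_h + a_h * eps[t - 1] ** 2 + b_h * s2_h[t - 1]
resid = eps / np.sqrt(s2_h)
alpha = 0.05
s2_next = w_h + a_h * eps[-1] ** 2 + b_h * s2_h[-1]
var_next = -np.sqrt(s2_next) * np.quantile(resid, alpha)  # one-step-ahead 5% VaR
```

The paper's point is that replacing the Gaussian likelihood in step one with a well-chosen instrumental density leaves the final VaR consistent while improving efficiency under heavy tails.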
By:  Igor Kheifets (New Economic School, Moscow); Carlos Velasco (Dept. of Economics, Universidad Carlos III de Madrid) 
Abstract:  This paper proposes new specification tests for conditional models with discrete responses. In particular, we can test the static and dynamic ordered choice model specifications, which is key to applying efficient maximum likelihood methods, obtaining consistent estimates of partial effects, and producing appropriate predictions of the probability of future events. The traditional approach is based on probability integral transforms of jittered discrete data, which leads to continuous uniform iid series under the true conditional distribution. We investigate in this paper an alternative transformation based only on the original discrete data. We show analytically and in simulations that our approach dominates the traditional approach in terms of power. We apply the new tests to models of the monetary policy conducted by the Federal Reserve. 
Keywords:  Specification tests, Count data, Dynamic discrete choice models, Conditional probability integral transform 
JEL:  C12 C22 C52 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1924&r=ecm 
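The traditional benchmark the authors compare against, the randomized probability integral transform of jittered discrete data, can be sketched as follows (a Poisson model stands in here for the ordered choice model; the setup is hypothetical, not the paper's):

```python
import numpy as np
from scipy import stats

# hypothetical discrete model: Poisson(3) standing in for an ordered choice model
rng = np.random.default_rng(1)
lam = 3.0
y = rng.poisson(lam, size=5000)

# randomized PIT: u = F(y - 1) + v * p(y) with v ~ U(0, 1);
# under the true conditional distribution, u is an iid uniform series
u = stats.poisson.cdf(y - 1, lam) + rng.uniform(size=y.size) * stats.poisson.pmf(y, lam)

# a uniformity check (e.g. Kolmogorov-Smirnov) then serves as a specification test
ks = stats.kstest(u, "uniform")
```

The jittering noise `v` is what makes the transform continuous; the paper's contribution is a transformation that avoids this added randomness and thereby gains power.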
By:  Xiaohong Chen (Cowles Foundation, Yale University); Timothy Christensen (Dept. of Economics, Yale University) 
Abstract:  We study the problem of nonparametric regression when the regressor is endogenous, which is an important nonparametric instrumental variables (NPIV) regression in econometrics and a difficult ill-posed inverse problem with unknown operator in statistics. We first establish a general upper bound on the sup-norm (uniform) convergence rate of a sieve estimator, allowing for endogenous regressors and weakly dependent data. This result leads to the optimal sup-norm convergence rates for spline and wavelet least squares regression estimators under weakly dependent data and heavy-tailed error terms. This upper bound also yields the sup-norm convergence rates for sieve NPIV estimators under i.i.d. data: the rates coincide with the known optimal L^2-norm rates for severely ill-posed problems, and are a power of log(n) slower than the optimal L^2-norm rates for mildly ill-posed problems. We then establish the minimax risk lower bound in sup-norm loss, which coincides with our upper bounds on sup-norm rates for the spline and wavelet sieve NPIV estimators. This sup-norm rate optimality provides another justification for the wide application of sieve NPIV estimators. Useful results on weakly dependent random matrices are also provided. 
Keywords:  Nonparametric instrumental variables; Statistical ill-posed inverse problems; Optimal uniform convergence rates; Weak dependence; Random matrices; Splines; Wavelets 
JEL:  C13 C14 C32 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1923&r=ecm 
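In its simplest i.i.d. form, a sieve NPIV estimator of the kind studied here is a two-stage least squares fit on basis expansions. The sketch below uses a hypothetical polynomial sieve; the data-generating process, basis choices, and dimensions are my own illustrative choices, not the paper's (which works with splines and wavelets):

```python
import numpy as np

# hypothetical data-generating process: y = h0(x) + u with E[u | w] = 0,
# x endogenous (correlated with u), w an instrument; h0(x) = sin(pi * x)
rng = np.random.default_rng(4)
n = 2000
w = rng.uniform(-1.0, 1.0, n)
u = rng.standard_normal(n)
x = 0.8 * w + 0.3 * u
y = np.sin(np.pi * x) + u + 0.1 * rng.standard_normal(n)

def poly_basis(v, degree):
    # simple power-series sieve; splines or wavelets would be used in the paper
    return np.vander(v, degree + 1, increasing=True)

Psi = poly_basis(x, 5)   # sieve space for h0
B = poly_basis(w, 9)     # instrument space, richer than the sieve for h0
Pw = B @ np.linalg.pinv(B.T @ B) @ B.T           # projection onto the instrument space
c = np.linalg.pinv(Psi.T @ Pw @ Psi) @ (Psi.T @ Pw @ y)

grid = np.linspace(-0.8, 0.8, 101)
h_hat = poly_basis(grid, 5) @ c                  # estimated structural function on a grid
```

An ordinary least squares fit of `y` on `Psi` would be biased here because `x` is correlated with `u`; the projection onto the instrument space is what restores consistency.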
By:  Wolfgang Karl Härdle; Ya'acov Ritov; Weining Wang 
Abstract:  We consider theoretical bootstrap "coupling" techniques for nonparametric robust smoothers and quantile regression, and verify the bootstrap improvement. To cope with the curse of dimensionality, a variant of "coupling" bootstrap techniques is developed for additive models with symmetric error distributions, with a further extension to the quantile regression framework. Our bootstrap method can be used in many situations, such as constructing confidence intervals and bands. We demonstrate the bootstrap improvement over the asymptotic band theoretically, and also in simulations and in applications to firm expenditures and the interaction of economic sectors and the stock market. 
Keywords:  Nonparametric Regression, Bootstrap, Quantile Regression, Confidence Bands, Additive Model, Robust Statistics 
JEL:  C00 C14 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2013047&r=ecm 
By:  Anna Freni Sterrantino (Università di Bologna) 
Abstract:  Nonlinear relationships are accommodated in a regression model using smoothing functions. Interactions may occur between continuous variables; in this case, an interaction between a nonlinear and a linear covariate leads to a varying coefficient model (VCM), a subclass of generalized additive model. Additive models can be estimated as generalized linear mixed models after being reparametrized. In this article we show three different types of design matrix for the mixed-model formulation of the VCM, applying B-spline smoothing functions. An application to real data is provided and model estimates are computed with a Bayesian approach. 
Keywords:  Varying coefficient models, Generalized linear mixed models, reparametrization, B-splines 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:bot:quadip:122&r=ecm 
By:  Greene, William H.; Gillman, Max; Harris, Mark N.; Spencer, Christopher 
Abstract:  We propose a Tempered Ordered Probit (TOP) model. Our contribution lies not only in explicitly accounting for an excessive number of observations in a given choice category, as is the case in the standard literature on inflated models; rather, we introduce a new econometric model which nests the recently developed Middle Inflated Ordered Probit (MIOP) models of Bagozzi and Mukherjee (2012) and Brooks, Harris, and Spencer (2012) as a special case and, further, can be used as a specification test of the MIOP, where the implicit test is described as being one of symmetry versus asymmetry. In our application, which exploits a panel dataset containing the votes of Bank of England Monetary Policy Committee (MPC) members, we show that the TOP model affords the econometrician considerable flexibility with respect to modeling the impact of different forms of uncertainty on interest rate decisions. Our findings, we argue, reveal MPC members' asymmetric attitudes towards uncertainty and the changeability of interest rates. 
Keywords:  Monetary policy committee, voting, discrete data, uncertainty, tempered equations 
JEL:  C3 E50 
Date:  2013–09 
URL:  http://d.repec.org/n?u=RePEc:hit:hitcei:201304&r=ecm 
By:  Moreira, Humberto; Moreira, Marcelo J. 
Abstract:  This paper considers tests which maximize the weighted average power (WAP). The focus is on determining WAP tests subject to an uncountable number of equalities and/or inequalities. The unifying theory allows us to obtain tests with correct size, similar tests, and unbiased tests, among others. A WAP test may be randomized and its characterization is not always possible. We show how to approximate the power of the optimal test by sequences of nonrandomized tests. Two alternative approximations are considered. The first approach considers a sequence of similar tests for an increasing number of boundary conditions. This discretization allows us to implement the WAP tests in practice. The second method finds a sequence of tests which approximate the WAP test uniformly. This approximation allows us to show that WAP similar tests are admissible. The theoretical framework is readily applicable to several econometric models, including the important class of the curved exponential family. In this paper, we consider the instrumental variable model with heteroskedastic and autocorrelated errors (HAC-IV) and the nearly integrated regressor model. In both models, we find WAP similar and (locally) unbiased tests which dominate other available tests. 
Date:  2013–10–28 
URL:  http://d.repec.org/n?u=RePEc:fgv:epgewp:747&r=ecm 
By:  Ladislav Kristoufek 
Abstract:  In the paper, we introduce a new measure of correlation between possibly nonstationary series. As the measure is based on the detrending moving-average cross-correlation analysis (DMCA), we label it as the DMCA coefficient $\rho_{DMCA}(\lambda)$ with a moving average window length $\lambda$. We analytically show that the coefficient ranges between -1 and 1, as a standard correlation does. In the simulation study, we show that the values of $\rho_{DMCA}(\lambda)$ correspond very well to the true correlation between the analyzed series regardless of the level of (non-)stationarity. The dependence of the newly proposed measure on other parameters (correlation level, moving average window length, and time series length) is discussed as well. 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1311.0657&r=ecm 
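A possible implementation of the DMCA coefficient, following the description in the abstract (the centered placement of the moving-average window is my assumption; the paper may position it differently):

```python
import numpy as np

def dmca_coefficient(x, y, window):
    # integrate the series into profiles, as in detrending moving-average methods
    X, Y = np.cumsum(x), np.cumsum(y)
    kernel = np.ones(window) / window
    mX = np.convolve(X, kernel, mode="valid")   # moving averages of the profiles
    mY = np.convolve(Y, kernel, mode="valid")
    start = (window - 1) // 2                   # assumed: centered window placement
    eX = X[start:start + mX.size] - mX          # residuals after detrending
    eY = Y[start:start + mY.size] - mY
    f2_xy = np.mean(eX * eY)                    # detrended cross-covariance
    # by Cauchy-Schwarz, the ratio is guaranteed to lie in [-1, 1]
    return f2_xy / np.sqrt(np.mean(eX * eX) * np.mean(eY * eY))

# two noisy copies of a common signal; the true correlation is 1 / 1.25 = 0.8
rng = np.random.default_rng(2)
z = rng.standard_normal(5000)
x = z + 0.5 * rng.standard_normal(5000)
y = z + 0.5 * rng.standard_normal(5000)
rho = dmca_coefficient(x, y, window=21)
```

The normalization by the two detrended variances is what bounds the coefficient in [-1, 1], mirroring the analytical result stated in the abstract.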
By:  Christophe Hurlin (LEO - Laboratoire d'Économie d'Orléans - CNRS : UMR6221 - Université d'Orléans); Sebastien Laurent (IAE Aix-en-Provence - Institut d'Administration des Entreprises - Université Paul Cézanne - Aix-Marseille III, GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - École des Hautes Études en Sciences Sociales [EHESS] - CNRS : UMR7316); Rogier Quaedvlieg (Maastricht University); Stephan Smeekes (Maastricht University) 
Abstract:  We propose a widely applicable bootstrap-based test of the null hypothesis of equality of two firms' Risk Measures (RMs) at a single point in time. The test can be applied to any market-based measure. In an iterative procedure, we can identify a complete grouped ranking of the RMs, with particular application to finding buckets of firms of equal systemic risk. An extensive Monte Carlo simulation shows desirable properties. We provide an application on a sample of 94 U.S. financial institutions using the ΔCoVaR, MES, and %SRISK, and conclude that only the %SRISK can be estimated with enough precision to allow for a meaningful ranking. 
Keywords:  Bootstrap; Grouped Ranking; Risk Measures; Uncertainty 
Date:  2013–10–28 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:halshs00877279&r=ecm 
By:  Nikolay Gospodinov; Raymond Kan; Cesare Robotti 
Abstract:  We show that in misspecified models with useless factors (for example, factors that are independent of the returns on the test assets), the standard inference procedures tend to erroneously conclude, with high probability, that these irrelevant factors are priced and the restrictions of the model hold. Our proposed model selection procedure, which is robust to useless factors and potential model misspecification, restores the standard inference and proves to be effective in eliminating factors that do not improve the model's pricing ability. The practical relevance of our analysis is illustrated using simulations and empirical applications. 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:201309&r=ecm 
By:  Kliem, Martin; Uhlig, Harald 
Abstract:  This paper presents a novel Bayesian method for estimating dynamic stochastic general equilibrium (DSGE) models subject to a constrained posterior distribution of the implied Sharpe ratio. We apply our methodology to a DSGE model with habit formation in consumption and leisure, using an estimate of the Sharpe ratio to construct the constraint. We show that the constrained estimation produces a quantitative model with reasonable asset-pricing as well as business-cycle implications. 
Keywords:  Bayesian estimation, stochastic steady state, prior choice, Sharpe ratio 
JEL:  C11 E32 E44 G12 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:zbw:bubdps:372013&r=ecm 
By:  Areski Cousin (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429); Elena Di Bernardino (IMATH - Département Ingénierie Mathématique - Conservatoire National des Arts et Métiers (CNAM)) 
Abstract:  In this paper, we introduce two alternative extensions of the classical univariate Conditional Tail Expectation (CTE) to a multivariate setting. Contrary to allocation measures or systemic risk measures, these measures are also suitable for multivariate risk problems where risks are heterogeneous in nature and cannot be aggregated together. 
Keywords:  Multivariate risk measures, Level sets of distribution functions, Multivariate probability integral transformation, Stochastic orders, Copulas and dependence. 
Date:  2013–10–28 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00877386&r=ecm 
By:  Rodrigue Oeuvray; Pascal Junod 
Abstract:  The aim of this paper is to examine the time scaling of the semivariance when returns are modeled by various types of jump-diffusion processes, including stochastic volatility models with jumps in returns and in volatility. In particular, we derive an exact formula for the semivariance when the volatility is kept constant, explaining how it should be scaled when considering a lower frequency. We also provide and justify the use of a generalization of the Ball-Torous approximation of a jump-diffusion process, this new model appearing to deliver a more accurate estimation of the downside risk. We use Markov Chain Monte Carlo (MCMC) methods to fit our stochastic volatility model. For the tests, we apply our methodology to a highly skewed set of returns based on the Barclays US High Yield Index, where we compare different time scalings for the semivariance. Our work shows that the square root of the time horizon seems to be a poor approximation in the context of semivariance and that our methodology based on jump-diffusion processes gives much better results. 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1311.1122&r=ecm 
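As a baseline for the scaling question the paper studies, the sketch below computes the semivariance of simulated returns and compares linear-in-horizon scaling against a direct estimate from aggregated returns. For symmetric i.i.d. normal returns (as simulated here, with made-up volatility) the two agree; the paper's point is that for skewed jump-diffusion returns such naive scaling breaks down:

```python
import numpy as np

def semivariance(returns, target=0.0):
    # average squared deviation below the target (downside risk)
    d = np.minimum(returns - target, 0.0)
    return np.mean(d ** 2)

# symmetric i.i.d. normal daily returns (made-up volatility)
rng = np.random.default_rng(3)
daily = rng.normal(0.0, 0.01, size=252 * 40)

sv_daily = semivariance(daily)
sv_scaled = 21 * sv_daily                        # scale semivariance linearly in the horizon
monthly = daily.reshape(-1, 21).sum(axis=1)      # non-overlapping 21-day returns
sv_monthly = semivariance(monthly)               # direct estimate at the lower frequency
```

Scaling the semivariance by the horizon is equivalent to scaling the semi-deviation by the square root of time, the rule the paper finds inaccurate once returns are skewed.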
By:  Xiangrong Yu (Hong Kong Institute for Monetary Research) 
Abstract:  This paper explores frequency-specific implications of measurement error for the design of stabilization policy rules. Policy evaluation in the frequency domain is interesting because the characterization of policy effects frequency by frequency gives the policymaker additional information about the effects of a given policy. Further, some important aspects of policy analysis can be better understood in the frequency domain than in the time domain. In this paper, I develop a rich set of design limits that describe fundamental restrictions on how a policymaker can alter variance at different frequencies. I also examine the interaction of measurement error and model uncertainty to understand the effects of different sources of informational limits on optimal policymaking. In a linear feedback model with noisy state observations, measurement error seriously distorts the performance of the policy rule that is optimal for the noise-free system. Adjusting the policy to appropriately account for measurement error means that the policymaker becomes less responsive to the raw data. For a parameterized example which corresponds to the choice of monetary policy rules in a simple AR(1) environment, I show that an additive white noise process of measurement error has little impact at low frequencies but induces less active control at high frequencies, and may even lead to more aggressive policy actions at medium frequencies. Local robustness analysis indicates that measurement error reduces the policymaker's reaction to model uncertainty, especially at medium and high frequencies. 
Keywords:  Policy Evaluation, Measurement Error, Spectral Analysis, Design Limits, Model Uncertainty, Monetary Policy Rules 
JEL:  C52 E52 E58 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:hkm:wpaper:172013&r=ecm 