
New Economics Papers on Econometrics 
By:  David E. Giles (Department of Economics, University of Victoria) 
Abstract:  The Topp-Leone distribution is attractive for reliability studies as it has finite support and a bathtub-shaped hazard function. We compare some properties of the method of moments, maximum likelihood, and bias-adjusted maximum likelihood estimators of its shape parameter. The last of these estimators is very simple to apply and dominates the method of moments estimator in terms of relative bias and mean squared error. 
Keywords:  J-shaped distribution; maximum likelihood; method of moments; unbiased estimation; mean squared error; bathtub hazard; finite support 
JEL:  C13 C16 C41 C46 
Date:  2012–09–12 
URL:  http://d.repec.org/n?u=RePEc:vic:vicewp:1203&r=ecm 
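As a rough illustration of the estimation problem this abstract discusses: for the Topp-Leone distribution with CDF F(x) = (2x - x²)^ν on [0, 1], the quantity -log(2X - X²) is exponentially distributed with rate ν, which yields a closed-form MLE whose exact bias is known, so a simple rescaling removes it. This is a sketch of that idea, not the paper's code; Giles uses a Cox-Snell-type correction, which for this estimator has the same simple flavor.

```python
import math
import random

def rtopp_leone(nu, n, rng):
    # Inverse-CDF sampling: F(x) = (2x - x^2)^nu on [0, 1]
    return [1.0 - math.sqrt(1.0 - rng.random() ** (1.0 / nu)) for _ in range(n)]

def mle(sample):
    # -log(2x - x^2) ~ Exponential(nu), so the MLE is n / sum of those terms
    s = sum(-math.log(2.0 * x - x * x) for x in sample)
    return len(sample) / s

def mle_bias_adjusted(sample):
    # E[MLE] = n*nu/(n-1) exactly, so scaling by (n-1)/n removes the bias
    return (len(sample) - 1) / len(sample) * mle(sample)

rng = random.Random(42)
sample = rtopp_leone(2.0, 5000, rng)
print(mle(sample), mle_bias_adjusted(sample))  # both near the true nu = 2
```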
By:  Christian N. Brinch, Magne Mogstad and Matthew Wiswall (Statistics Norway) 
Abstract:  The interpretation of instrumental variables (IV) estimates as local average treatment effects (LATE) of instrument-induced shifts in treatment raises concerns about their external validity and policy relevance. We examine how to move beyond LATE in situations where the instrument is discrete, as it often is in applied research. Discrete instruments do not give sufficient support to identify the full range of marginal treatment effects (MTE) in the usual local instrumental variable approach. We show how an alternative estimation approach allows identification of richer specifications of the MTE when the instrument is discrete. One result is that the alternative approach identifies a linear MTE model even with a single binary instrument. Although restrictive, the linear MTE model nests the standard IV estimator: The model gives the exact same estimate of LATE while at the same time providing a simple test for its external validity and a linear extrapolation. Another result is that the alternative approach allows identification of a general MTE model under the auxiliary assumption of additive separability between observed and unobserved heterogeneity in treatment effects. We apply these identification results to empirically assess the interaction between the quantity and quality of children. Motivated by the seminal quantity-quality model of fertility, a large and growing body of empirical research has used binary instruments to estimate LATEs of family size on child outcomes. We show that the effects of family size are both more varied and more extensive than what the LATEs suggest. Our MTE estimates reveal that the family size effects vary in magnitude and even sign, and that families act as if they possess some knowledge of the idiosyncratic effects in the fertility decision. 
Keywords:  Local average treatment effects; marginal treatment effects; discrete instrument; quantity-quality; fertility 
JEL:  C26 J13 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:ssb:dispap:703&r=ecm 
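For reference, the standard IV estimator of the LATE with a single binary instrument (the benchmark the abstract's linear MTE model nests) is the Wald estimator. A minimal sketch on synthetic data, with a homogeneous treatment effect so the LATE equals the ATE:

```python
import random

def wald_late(y, d, z):
    # Wald/IV estimator with a binary instrument:
    # (E[Y|Z=1] - E[Y|Z=0]) / (E[D|Z=1] - E[D|Z=0])
    def cmean(v, zval):
        sel = [vi for vi, zi in zip(v, z) if zi == zval]
        return sum(sel) / len(sel)
    return (cmean(y, 1) - cmean(y, 0)) / (cmean(d, 1) - cmean(d, 0))

# Synthetic data: the instrument raises take-up by 50 points,
# and treatment shifts the outcome by 2 for everyone (so LATE = 2)
rng = random.Random(0)
z = [rng.randrange(2) for _ in range(20000)]
d = [1 if rng.random() < 0.2 + 0.5 * zi else 0 for zi in z]
y = [2.0 * di + rng.gauss(0.0, 1.0) for di in d]
print(wald_late(y, d, z))  # close to 2
```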
By:  Hisayuki Tsukuma (Faculty of Medicine, Toho University); Tatsuya Kubokawa (Faculty of Economics, University of Tokyo) 
Abstract:  In estimation of the normal covariance matrix, finding a least favorable sequence of prior distributions has been an open question for a long time. In this paper, we address this classical problem and succeed in constructing such a sequence, which establishes minimaxity of the best equivariant estimator. We also derive unified conditions for a sequence of prior distributions to be least favorable in the general estimation problem with an invariance structure. These unified conditions are applied to both restricted and non-restricted cases of parameters, and we give a couple of examples which show minimaxity of the best equivariant estimators under restrictions on the covariance matrix. 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2012cf858&r=ecm 
By:  Matteo Luciani (Université Libre de Bruxelles); David Veredas (Université Libre de Bruxelles) 
Abstract:  Realized volatilities, when observed over time, share the following stylised facts: comovements, clustering, long-memory, dynamic volatility, skewness and heavy tails. We propose a dynamic factor model that captures these stylised facts and that can be applied to vast panels of volatilities as it does not suffer from the curse of dimensionality. It is an enhanced version of Bai and Ng (2004) in the following respects: i) we allow for long-memory in both the idiosyncratic and the common components, ii) the common shocks are conditionally heteroskedastic, and iii) the idiosyncratic and common shocks are skewed and heavy-tailed. Estimation of the factors, the idiosyncratic components and the parameters is simple: principal components and low-dimensional maximum likelihood estimation. A Monte Carlo study shows the usefulness of the approach, and an application to 90 daily realized volatilities, pertaining to the S&P100, from January 2001 to December 2008, evinces, among others, the following findings: i) All the volatilities have long-memory, more than half in the nonstationary range, which increases during financial turmoil. ii) Tests and criteria point towards one dynamic common factor driving the comovements. iii) The factor has longer memory than the assets' volatilities, suggesting that long-memory is a market characteristic. iv) The volatility of the realized volatility is not constant and is common to all. v) A forecasting horse race against 8 competing models shows that our model outperforms, in particular in periods of stress. 
Keywords:  Realized volatilities, vast dimensions, factor models, long-memory, forecasting 
JEL:  C32 C51 G01 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:bde:wpaper:1230&r=ecm 
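The principal-components step the abstract mentions can be sketched as follows. This is a generic Stock-Watson-style factor extraction on a synthetic panel, not the paper's estimator (which adds long-memory, heteroskedastic and heavy-tailed components on top):

```python
import numpy as np

def pc_factors(X, r):
    """Estimate r common factors by principal components:
    factors are sqrt(T) times the top eigenvectors of X X'."""
    T, N = X.shape
    Xc = X - X.mean(axis=0)
    eigval, eigvec = np.linalg.eigh(Xc @ Xc.T)   # ascending eigenvalues
    F = np.sqrt(T) * eigvec[:, -r:][:, ::-1]     # T x r factor estimates
    L = Xc.T @ F / T                             # N x r loadings
    return F, L

# Synthetic panel: one common factor plus idiosyncratic noise
rng = np.random.default_rng(1)
T, N = 500, 90
f = rng.standard_normal(T)
lam = rng.standard_normal(N)
X = np.outer(f, lam) + 0.5 * rng.standard_normal((T, N))
F, L = pc_factors(X, 1)
```

The estimated factor is identified only up to sign and scale, so its quality is judged by correlation with the true factor.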
By:  Tatsuya Kubokawa (Faculty of Economics, University of Tokyo); Mana Hasukawa (Graduate School of Economics, University of Tokyo); Kunihiko Takahashi (National Institute of Public Health) 
Abstract:  Empirical Bayes (EB) estimates in general linear mixed models are useful for small area estimation in the sense of increasing the precision of estimation of small area means. However, one potential difficulty of EB is that the overall estimate for a larger geographical area based on a (weighted) sum of EB estimates is not necessarily identical to the corresponding direct estimate, such as the overall sample mean. Another difficulty is that EB estimates yield overshrinking, which results in a sampling variance smaller than the posterior variance. One way to fix these problems is the benchmarking approach based on constrained empirical Bayes (CEB) estimators, which satisfy the constraints that the aggregated mean and variance are identical to the requested values of mean and variance. In this paper, we treat general mixed models, derive asymptotic approximations of the mean squared error (MSE) of CEB and provide second-order unbiased estimators of MSE based on the parametric bootstrap method. These results are applied to natural exponential families with quadratic variance functions (NEFQVF). As a specific example, the Poisson-gamma model is dealt with, and it is illustrated that the CEB estimates and their MSE estimates work well using real mortality data. 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2012cf861&r=ecm 
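The benchmarking constraints described in this abstract (aggregated mean and variance pinned to requested values) can be met by a simple recenter-and-rescale of the EB estimates. This sketch shows only that adjustment step; the paper's CEB estimators and their MSE approximations are richer:

```python
def constrained_eb(theta, w, target_mean, target_var):
    """Adjust estimates so their weighted mean equals target_mean and
    their weighted variance equals target_var (a two-constraint benchmark)."""
    tbar = sum(wi * ti for wi, ti in zip(w, theta))
    spread = sum(wi * (ti - tbar) ** 2 for wi, ti in zip(w, theta))
    c = (target_var / spread) ** 0.5
    # Recenter to the benchmark mean, rescale to the benchmark variance
    return [target_mean + c * (ti - tbar) for ti in theta]

# Three small areas with equal weights, benchmarked to mean 2.5 and variance 1
adj = constrained_eb([1.0, 2.0, 3.0], [1/3, 1/3, 1/3], 2.5, 1.0)
```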
By:  Tsionas, Mike 
Abstract:  In this paper we consider a variety of procedures for numerical statistical inference in the family of univariate and multivariate stable distributions. In connection with univariate distributions (i) we provide approximations by finite location-scale mixtures and (ii) versions of approximate Bayesian computation (ABC) using the characteristic function and the asymptotic form of the likelihood function. In the context of multivariate stable distributions we propose several ways to perform statistical inference and obtain the spectral measure associated with the distributions, a quantity that has been a major impediment to their use in applied work. We extend the techniques to handle univariate and multivariate stochastic volatility models; static and dynamic factor models with disturbances and factors from general stable distributions; a novel way to model multivariate stochastic volatility through time-varying spectral measures; and a novel way to build multivariate stable distributions through copulae. The new techniques are applied to artificial as well as real data (ten major currencies, the S&P100 and individual returns). In connection with ABC, special attention is paid to crafting well-performing proposal distributions for MCMC, and extensive numerical experiments are conducted to provide critical values of the “closeness” parameter that can be useful for further applied econometric work. 
Keywords:  Univariate and multivariate stable distributions; MCMC; Approximate Bayesian Computation; Characteristic function 
JEL:  C13 C11 
Date:  2012–05–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:40966&r=ecm 
By:  Jesus Fernandez-Villaverde; Pablo A. Guerrón-Quintana; Juan Rubio-Ramírez 
Abstract:  We propose a novel method to estimate dynamic equilibrium models with stochastic volatility. First, we characterize the properties of the solution to this class of models. Second, we take advantage of the results about the structure of the solution to build a sequential Monte Carlo algorithm to evaluate the likelihood function of the model. The approach, which exploits the profusion of shocks in stochastic volatility models, is versatile and computationally tractable even in large-scale models, such as those often employed by policymaking institutions. As an application, we use our algorithm and Bayesian methods to estimate a business cycle model of the U.S. economy with both stochastic volatility and parameter drifting in monetary policy. Our application shows the importance of stochastic volatility in accounting for the dynamics of the data. 
JEL:  C1 E30 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:18399&r=ecm 
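The sequential Monte Carlo likelihood evaluation referred to in this abstract can be illustrated on a toy stochastic-volatility model. This is a plain bootstrap particle filter, not the authors' algorithm (which exploits the structure of the model's solution); the model, parameter values and particle count here are illustrative choices:

```python
import math
import random

def pf_loglik(y, phi, sig, n_part=500, seed=1):
    """Bootstrap particle filter log-likelihood for a toy SV model:
    h_t = phi*h_{t-1} + sig*eta_t,  y_t ~ N(0, exp(h_t))."""
    rng = random.Random(seed)
    h = [0.0] * n_part
    ll = 0.0
    for yt in y:
        h = [phi * hi + sig * rng.gauss(0.0, 1.0) for hi in h]       # propagate
        w = [math.exp(-0.5 * (math.log(2.0 * math.pi) + hi
                              + yt * yt * math.exp(-hi)))
             for hi in h]                                            # N(0, e^h) density
        ll += math.log(sum(w) / n_part)                              # likelihood increment
        h = rng.choices(h, weights=w, k=n_part)                      # resample
    return ll

# Simulate from the model, then compare log-likelihoods at two parameter values
rng = random.Random(7)
hs, y = 0.0, []
for _ in range(200):
    hs = 0.95 * hs + 0.2 * rng.gauss(0.0, 1.0)
    y.append(math.exp(hs / 2.0) * rng.gauss(0.0, 1.0))
```

At the true parameters (0.95, 0.2) the filter should return a noticeably higher log-likelihood than at badly misspecified ones.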
By:  Nassim N. Taleb 
Abstract:  Ex ante forecast outcomes should be interpreted as counterfactuals (potential histories), with errors as the spread between outcomes. Reapplying measurements of uncertainty about the estimation errors of the estimation errors of an estimation leads to branching counterfactuals. Such recursions of epistemic uncertainty have markedly different distributional properties from conventional sampling error. Nested counterfactuals of error rates invariably lead to fat tails, regardless of the probability distribution used, and to power laws under some conditions. A mere 0.01% branching error rate about the standard deviation (itself an error rate), and a 0.01% branching error rate about that error rate, and so on (recursing all the way), results in explosive, and eventually infinite, moments higher than the first. Missing any degree of regress leads to the underestimation of small probabilities and of concave payoffs (a standard example of which is Fukushima). The paper states the conditions under which higher-order rates of uncertainty (expressed in spreads of counterfactuals) alter the shape of the final distribution, and shows which a priori beliefs about counterfactuals are needed to accept the reliability of conventional probabilistic methods (thin tails or mildly fat tails). 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1209.2298&r=ecm 
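The tail-fattening mechanism described in this abstract can be seen in a small Monte Carlo: layering proportional uncertainty on the scale of a Gaussian, level upon level, drives the kurtosis far above the Gaussian value of 3. This sketch uses an illustratively large branching rate (50%, not the paper's 0.01%) so the effect is visible in a modest sample:

```python
import math
import random

def nested_sample(k, a, n, seed=0):
    """Draw n Gaussians whose scale carries k nested layers of proportional
    uncertainty: each layer multiplies sigma by (1+a) or (1-a) at random."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        sigma = 1.0
        for _ in range(k):
            sigma *= 1.0 + a if rng.random() < 0.5 else 1.0 - a
        out.append(rng.gauss(0.0, sigma))
    return out

def kurtosis(xs):
    m = sum(xs) / len(xs)
    m2 = sum((x - m) ** 2 for x in xs) / len(xs)
    m4 = sum((x - m) ** 4 for x in xs) / len(xs)
    return m4 / m2 ** 2

thin = kurtosis(nested_sample(0, 0.5, 50000, seed=3))   # plain Gaussian, ~3
fat = kurtosis(nested_sample(10, 0.5, 50000, seed=4))   # ten layers, far above 3
```

Analytically, each layer multiplies the kurtosis by (1 + 6a² + a⁴)/(1 + a²)² > 1, so it grows geometrically in the recursion depth.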
By:  Roberto Leon-Gonzalez (National Graduate Institute for Policy Studies); Daniel Montolio (University of Barcelona and Barcelona Institute of Economics) 
Abstract:  Bayesian model averaging (BMA) has been successfully applied in the empirical growth literature as a way to overcome the sensitivity of results to different model specifications. In this paper, we develop a BMA technique to analyze models that differ in the set of instruments, exogeneity restrictions, or the set of controlling regressors. Our framework allows for both cross-section regressions with instrumental variables and for the commonly used panel data model with fixed effects and endogenous or predetermined regressors. The large model space that typically arises can be effectively analyzed using a Markov Chain Monte Carlo algorithm. We apply our technique to the dataset used by Burnside and Dollar (2000), who investigated the effect of international aid on GDP growth. We show that BMA is an effective tool for the analysis of panel data growth regressions in cases where the number of models is large and results are sensitive to model assumptions. 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:ngi:dpaper:1208&r=ecm 
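As a rough illustration of the model-averaging idea in this abstract: posterior model probabilities can be approximated by BIC weights over a candidate model space. This sketch averages only over regressor sets in a linear regression; the paper's framework (instrument sets, exogeneity restrictions, MCMC over the model space) is substantially richer:

```python
import numpy as np

def bma_weights(y, X, model_space):
    """Approximate posterior model probabilities with BIC weights:
    P(M|y) proportional to exp(-BIC_M / 2). Each model is a tuple of columns."""
    n = len(y)
    bics = []
    for cols in model_space:
        Xm = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
        beta, *_ = np.linalg.lstsq(Xm, y, rcond=None)
        rss = np.sum((y - Xm @ beta) ** 2)
        bics.append(n * np.log(rss / n) + Xm.shape[1] * np.log(n))
    w = np.exp(-0.5 * (np.array(bics) - min(bics)))
    return w / w.sum()

# Synthetic data: y depends on regressor 0 only
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 2))
y = 1.0 + 2.0 * X[:, 0] + rng.standard_normal(300)
models = [(), (0,), (1,), (0, 1)]
w = bma_weights(y, X, models)
```

The weight on the true model, (0,), should dominate the others.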
By:  Vargas, Jose P Mauricio 
Abstract:  The paper presents estimates of the size of the informal economy in Bolivia obtained from a dynamic general equilibrium model. The parameters are estimated by maximum likelihood, yielding, as an intermediate result, a latent-variable estimate of the size of the informal economy. This procedure is new: estimating the size of the informal economy with a dynamic structural model offers an alternative to latent-variable approaches that assume relationships without strong theoretical support (MIMIC models). The results suggest that the informal economy represented 60% of Bolivian GDP in 2010 and that its share has been decreasing over the last decade. In addition, we simulate four alternative policies to reduce the size of the underground economy. Some of them reveal surprising response mechanisms, allowing us to analyze the flow of workers from the informal sector into the formal sector and vice versa. Beyond quantifying the size of the informal economy, the research provides a tool and methodology for evaluating alternative policy scenarios related to fiscal policy and labor mobility in an economy with a large informal sector and evasion. 
Keywords:  Informal Economy; Kalman filter; Structural Model; Underground Economy 
JEL:  E62 O43 C61 E26 
Date:  2012–05–18 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:41290&r=ecm 
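The Kalman filter named in this entry's keywords is the workhorse for extracting a latent series like the unobserved informal-economy share. A minimal scalar sketch, unrelated to the paper's specific state-space model:

```python
import math
import random

def kalman_filter(y, a, c, q, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter for x_t = a x_{t-1} + w_t, y_t = c x_t + v_t,
    with w ~ N(0, q), v ~ N(0, r). Returns filtered means of the latent state."""
    x, p, out = x0, p0, []
    for yt in y:
        # Predict
        x, p = a * x, a * a * p + q
        # Update with the new observation
        k = p * c / (c * c * p + r)
        x = x + k * (yt - c * x)
        p = (1.0 - k * c) * p
        out.append(x)
    return out

# Simulate a latent AR(1) state observed with noise, then filter it
rng = random.Random(2)
xs, ys, x = [], [], 0.0
for _ in range(500):
    x = 0.9 * x + math.sqrt(0.1) * rng.gauss(0.0, 1.0)
    xs.append(x)
    ys.append(x + math.sqrt(0.5) * rng.gauss(0.0, 1.0))
filt = kalman_filter(ys, 0.9, 1.0, 0.1, 0.5)
```

The filtered path should track the latent state more closely than the raw observations do.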
By:  William Greene; Colin McKenzie 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:ste:nystbu:1214&r=ecm 
By:  Chaudourne, Jeremy (UQAM); Fève, Patrick (TSE (GREMAQ, IUF, IDEI and Banque de France)); Gay, Alain (UQAM) 
Abstract:  This paper studies the statistical properties of impulse response functions in structural vector autoregressions (SVARs) that include a highly persistent variable, such as hours worked, and use long-run identifying restrictions. The highly persistent variable is specified as a nearly stationary persistent process. Such a process appears particularly well suited to characterizing the dynamics of hours worked because it implies a unit root in finite samples but is asymptotically stationary and persistent. This is typically the case for per capita hours worked, which are included in SVARs. Theoretical results derived from this specification explain most of the empirical findings from SVARs that include U.S. hours worked. Simulation experiments from an estimated DSGE model confirm the theoretical results. 
Keywords:  SVARs, long-run restrictions, locally nonstationary process, technology shocks, hours worked 
JEL:  C32 E32 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:tse:wpaper:26112&r=ecm 
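The long-run identifying restrictions this abstract refers to are the classic Blanchard-Quah construction: choose the impact matrix so the matrix of long-run responses is lower triangular. A minimal sketch for a VAR(1) with known coefficients (in practice A and Sigma are estimated by OLS):

```python
import numpy as np

def long_run_impact(A, Sigma):
    """Impact matrix B for y_t = A y_{t-1} + u_t with u_t = B e_t, chosen so
    the long-run response matrix (I - A)^{-1} B is lower triangular."""
    Psi = np.linalg.inv(np.eye(A.shape[0]) - A)       # long-run multiplier
    Theta = np.linalg.cholesky(Psi @ Sigma @ Psi.T)   # lower-triangular long-run responses
    return np.linalg.solve(Psi, Theta)

# Illustrative bivariate VAR(1) coefficients and innovation covariance
A = np.array([[0.5, 0.1], [0.2, 0.3]])
Sigma = np.array([[1.0, 0.3], [0.3, 0.8]])
B = long_run_impact(A, Sigma)
```

By construction B B' recovers Sigma, and only the first structural shock has a long-run effect on the first variable.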
By:  Ceylan, Ozcan (Galatasaray University Economic Research Center) 
Abstract:  Building on recent developments in high-frequency econometrics and the asymmetric GARCH modeling literature, I develop a novel model that accounts for the volatility feedback and leverage effects, effectively incorporating signed continuous and jump components of the realized variance in the variance specification through an HAR forecasting model. I then condition the variance specification on the lagged realized variance and on risk aversion (proxied by the level of the variance risk premium) to analyze possible state-dependent variations in the volatility asymmetry. I find that the volatility asymmetry is clearly more pronounced in periods of market stress marked by high levels of volatility and risk aversion. In addition, I reveal a further asymmetry in the reaction of volatility to good and bad news: as the market moves through periods of higher volatility and risk aversion, the impact of bad news increases much more heavily than that of good news, indicating that investors become more sensitive to bad news in market downturns. 
Keywords:  Time-varying volatility asymmetry; High-frequency econometrics; EGARCH-M; HAR models; Volatility components; Variance risk premium 
JEL:  C13 C14 C32 C58 G12 
Date:  2012–09–05 
URL:  http://d.repec.org/n?u=RePEc:ris:giamwp:2012_004&r=ecm 
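The HAR forecasting model this abstract builds on regresses realized variance on its daily, weekly and monthly averages. A minimal sketch on a synthetic persistent series, without the signed continuous/jump components the paper adds:

```python
import numpy as np

def har_design(rv):
    """HAR regressors: yesterday's RV, the past-week mean, the past-month mean."""
    rv = np.asarray(rv)
    rows, targets = [], []
    for t in range(22, len(rv)):
        rows.append([1.0, rv[t - 1], rv[t - 5:t].mean(), rv[t - 22:t].mean()])
        targets.append(rv[t])
    return np.array(rows), np.array(targets)

def har_fit(rv):
    # OLS on the HAR design; returns intercept, daily, weekly, monthly coefficients
    X, y = har_design(rv)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic persistent realized-variance series
rng = np.random.default_rng(5)
rv = np.empty(600)
rv[0] = 1.0
for t in range(1, 600):
    rv[t] = 0.1 + 0.9 * rv[t - 1] + 0.1 * rng.standard_normal()
beta = har_fit(rv)
```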
By:  Hall, Jamie 
Abstract:  This article describes a new approximation method for dynamic stochastic general equilibrium (DSGE) models. The method allows nonlinear models to be estimated efficiently and relatively quickly with the fully-adapted particle filter. The article demonstrates the method by estimating, on US data, a nonlinear New Keynesian model with a zero lower bound on the nominal interest rate. 
Keywords:  DSGE; nonlinear; particle filter 
JEL:  E0 C1 
Date:  2012–09–11 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:41218&r=ecm 
By:  Francis X. Diebold (Department of Economics, University of Pennsylvania) 
Abstract:  The Diebold-Mariano (DM) test was intended for comparing forecasts; it has been, and remains, useful in that regard. The DM test was not intended for comparing models. Unfortunately, however, much of the large subsequent literature uses DM-type tests for comparing models, in (pseudo) out-of-sample environments. In that case, much simpler yet more compelling full-sample model comparison procedures exist; they have been, and should continue to be, widely used. The hunch that (pseudo) out-of-sample analysis is somehow the “only”, or “best”, or even a “good” way to provide insurance against in-sample overfitting in model comparisons proves largely false. On the other hand, (pseudo) out-of-sample analysis may be useful for learning about comparative historical predictive performance. 
Keywords:  Forecasting, model comparison, model selection, out-of-sample tests 
JEL:  C01 C53 
Date:  2012–09–07 
URL:  http://d.repec.org/n?u=RePEc:pen:papers:12035&r=ecm 
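For reference, the DM statistic compares two forecast error sequences through the mean loss differential scaled by a HAC estimate of its variance. A minimal sketch with squared-error loss (the loss function and lag choice are illustrative):

```python
import math
import random

def dm_statistic(e1, e2, h=1):
    """Diebold-Mariano statistic on squared-error loss differentials, with a
    rectangular HAC variance using h-1 autocovariance lags (h = forecast horizon)."""
    d = [a * a - b * b for a, b in zip(e1, e2)]   # loss differential per period
    n = len(d)
    dbar = sum(d) / n
    def acov(k):
        return sum((d[t] - dbar) * (d[t - k] - dbar) for t in range(k, n)) / n
    var = acov(0) + 2.0 * sum(acov(k) for k in range(1, h))
    return dbar / math.sqrt(var / n)

# Forecast 1 has smaller errors than forecast 2, so the statistic is negative
rng = random.Random(3)
e1 = [rng.gauss(0.0, 1.0) for _ in range(500)]
e2 = [rng.gauss(0.0, math.sqrt(2.0)) for _ in range(500)]
stat = dm_statistic(e1, e2)
```

Under the null of equal forecast accuracy the statistic is approximately standard normal, so a value far below -2 favors the first forecast.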