
on Econometrics 
By:  Tao Chen (University of Connecticut); Gautam Tripathi (University of Connecticut) 
Abstract:  We test the assumption of conditional symmetry used to identify and estimate parameters in regression models with endogenous regressors without making any distributional assumptions. The specification test proposed here is computationally tractable, does not require nonparametric smoothing, and can detect n^(-1/2) deviations from the null. Since the limiting distribution of the test statistic turns out to be a nonpivotal Gaussian process, the critical values for implementing the test are obtained by simulation. In a Monte Carlo study we use the approach proposed here to test the assumption of conditional symmetry maintained in the seminal paper of Powell (1986b). Results from this finite-sample experiment suggest that our test can work very well in moderately sized samples. 
JEL:  C12 C14 
Date:  2011–01 
URL:  http://d.repec.org/n?u=RePEc:uct:uconnp:201101&r=ecm 
By:  Monica Billio (University of Venice, GRETA Assoc. and School for Advanced Studies in Venice); Roberto Casarin (University of Brescia and GRETA Assoc.); Francesco Ravazzolo (Norges Bank (Central Bank of Norway)); Herman K. van Dijk (Econometric and Tinbergen Institutes, Erasmus University Rotterdam) 
Abstract:  Using a Bayesian framework, this paper provides a multivariate combination approach to prediction based on a distributional state space representation of predictive densities from alternative models. In the proposed approach the model set can be incomplete. Several multivariate time-varying combination strategies are introduced; in particular, weight dynamics driven by the past performance of the predictive densities are considered, as well as the use of learning mechanisms. The approach is assessed using statistical and utility-based performance measures for evaluating density forecasts of US macroeconomic time series and of surveys of stock market prices. 
Keywords:  Density Forecast Combination, Survey Forecast, Bayesian Filtering, Sequential Monte Carlo 
JEL:  C11 C15 C53 E37 
Date:  2010–12–21 
URL:  http://d.repec.org/n?u=RePEc:bno:worpap:2010_29&r=ecm 
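The performance-driven weighting described in the abstract can be illustrated with a minimal sketch. Everything here (the two normal forecast densities, the short history, the product-of-past-densities weight rule) is a hypothetical stand-in, not the authors' state space and Sequential Monte Carlo implementation:

```python
import math

def npdf(y, mu, sig):
    # normal density, used as each model's predictive density
    return math.exp(-0.5 * ((y - mu) / sig) ** 2) / (sig * math.sqrt(2 * math.pi))

# two rival forecast densities and a short history of realizations (hypothetical)
history = [0.1, -0.2, 0.05]
models = [(0.0, 1.0), (2.0, 1.0)]   # (mean, sd) of each predictive density

# performance-driven weights: proportional to the product of past density
# values, so models that tracked the data recently receive more weight
scores = [math.prod(npdf(y, mu, s) for y in history) for mu, s in models]
w = [sc / sum(scores) for sc in scores]
print([round(x, 3) for x in w])
```

With the history centered near zero, almost all weight goes to the first model; the combined predictive density would then be the weighted mixture of the two.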
By:  Naoto Kunitomo (Faculty of Economics, University of Tokyo); Kentaro Akashi (Institute of Statistical Mathematics) 
Abstract:  We consider the estimation of coefficients of a dynamic panel structural equation in simultaneous equation models. As a semiparametric method, we introduce a class of modifications of the limited information maximum likelihood (LIML) estimator to improve its asymptotic properties as well as its small-sample properties when we have individual heteroscedasticities. We show that an asymptotically optimal modification of the LIML estimator, called AOM-LIML, removes the asymptotic bias caused by forward-filtering and improves upon the LIML and other estimation methods under individual heteroscedasticities. 
Date:  2010–12 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2010cf780&r=ecm 
By:  Russell Davidson (McGill University); James G. MacKinnon (Queen's University) 
Abstract:  Economists are often interested in the coefficient of a single endogenous explanatory variable in a linear simultaneous equations model. One way to obtain a confidence set for this coefficient is to invert the Anderson-Rubin test. The "AR confidence sets" that result have correct coverage under classical assumptions. In this paper, however, we show that AR confidence sets also have many undesirable properties. Their coverage conditional on quantities that the investigator can observe, notably the Sargan statistic, can be far from correct. It is well known that they can be unbounded when the instruments are weak. Even when they are bounded, their length may be very misleading. We argue that, at least when the instruments are not so weak that inference is hopeless, it is much better to obtain confidence intervals by bootstrapping either the IV or LIML t statistic on the coefficient of interest in a particular way that we propose. 
Keywords:  bootstrap, confidence interval, instrumental variables, LIML, Sargan test, weak instruments 
JEL:  C15 
Date:  2011–01 
URL:  http://d.repec.org/n?u=RePEc:qed:wpaper:1257&r=ecm 
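Inverting the Anderson-Rubin test to obtain an AR confidence set can be sketched in a minimal single-instrument simulation. All parameter values below are hypothetical, and this is the plain test-inversion step only, not the bootstrap procedure the authors advocate:

```python
import random

random.seed(0)
n = 500
beta_true = 1.0   # hypothetical true coefficient
pi = 0.8          # hypothetical instrument strength

z = [random.gauss(0, 1) for _ in range(n)]
v = [random.gauss(0, 1) for _ in range(n)]
u = [0.5 * vi + random.gauss(0, 1) for vi in v]   # error correlated with x
x = [pi * zi + vi for zi, vi in zip(z, v)]
y = [beta_true * xi + ui for xi, ui in zip(x, u)]

def ar_stat(beta0):
    # regress y - beta0*x on z (no intercept); the squared t statistic of the
    # slope is asymptotically chi2(1) under H0: beta = beta0
    e = [yi - beta0 * xi for yi, xi in zip(y, x)]
    szz = sum(zi * zi for zi in z)
    g = sum(zi * ei for zi, ei in zip(z, e)) / szz
    resid = [ei - g * zi for ei, zi in zip(e, z)]
    s2 = sum(r * r for r in resid) / (n - 1)
    return g * g * szz / s2

# the AR confidence set collects every beta0 the test fails to reject at 5%
grid = [b / 100 for b in range(-100, 301)]
ci = [b for b in grid if ar_stat(b) < 3.84]
print(min(ci), max(ci))
```

With a strong instrument the acceptance region is a bounded interval around the IV estimate; with a weak instrument it can become unbounded, which is one of the pathologies the paper documents.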
By:  Oberhofer, Harald (University of Salzburg); Pfaffermayr, Michael (Department of Economics and Statistics, University of Innsbruck) 
Abstract:  This note proposes a generalized two-part model for fractional response variables that nests the one-part model proposed by Papke and Wooldridge (1996). Consequently, a Wald test makes it possible to discriminate between these two competing models. A small-scale Monte Carlo simulation demonstrates that the proposed Wald test is properly sized and has higher power than an alternative non-nested P-test. 
Keywords:  Fractional response models; two-part model; Wald test; P-test 
JEL:  C12 C15 C21 C25 
Date:  2011–01–05 
URL:  http://d.repec.org/n?u=RePEc:ris:sbgwpe:2011_001&r=ecm 
By:  Ingo Geishecker; Maximilian Riedl 
Abstract:  The paper compares different estimation strategies for ordered response models in the presence of non-random unobserved heterogeneity. By running Monte Carlo simulations with a range of randomly generated panel data of differing cross-sectional and longitudinal dimension sizes, we assess the consistency and efficiency of standard models such as linear fixed effects, ordered and conditional logit, and several different binary recoding procedures. Among the analyzed binary recoding procedures is the conditional ordered logit estimator proposed by Ferrer-i-Carbonell and Frijters (2004) that has recently gained some popularity in the analysis of individual well-being. The Ferrer-i-Carbonell and Frijters (FCF) estimator performs best if the number of observations is large and the number of categories on the ordered scale is small. However, a much simpler binary recoding scheme based on individual means performs similarly well and even outperforms the FCF estimator if the number of categories on the ordered scale becomes large. If the researcher is only interested in the relative size of coefficients with respect to a baseline, the easy-to-compute linear fixed effects model essentially delivers the same results as the more elaborate binary recoding schemes. 
Keywords:  fixed effects ordered logit, ordered responses, happiness 
JEL:  C23 C25 I31 
Date:  2010–11–30 
URL:  http://d.repec.org/n?u=RePEc:got:cegedp:116&r=ecm 
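The individual-mean-based binary recoding scheme mentioned in the abstract is easy to sketch: each person's ordered responses are dichotomized around that person's own mean, after which a conditional (fixed effects) logit can be run on the binary outcome. The panel below is a hypothetical two-person example:

```python
from statistics import mean

# panel of ordered responses, one list per individual (hypothetical data)
panel = {
    "id1": [3, 4, 2, 5, 3],
    "id2": [1, 1, 2, 1, 3],
}

def recode(series):
    # 1 if the response exceeds the individual's own mean, else 0
    m = mean(series)
    return [1 if y > m else 0 for y in series]

coded = {i: recode(s) for i, s in panel.items()}
print(coded)  # {'id1': [0, 1, 0, 1, 0], 'id2': [0, 0, 1, 0, 1]}
```

The FCF estimator instead picks an individual-specific threshold that maximizes the within-individual likelihood; the point of the paper is that this simpler mean-based cut often performs comparably.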
By:  Ana Paula Martins 
Abstract:  This research contrasts three econometric alternatives for stochastic efficiency frontier analysis: order (interquantile) and inverse order regression under the assumption of a truncated error term distribution, and replicated moment estimation. The demonstration departs from a simple linear regression form of the effective frontier; truncated (at zero) errors are then added to it for simulation purposes. For order regression, experiments with standard normal, uniform, exponential, Cauchy and logistic error terms are provided. For complex error structures we rely on normal distributions only. The three alternatives would perform satisfactorily for simple error disturbances, especially if they are normal. With more than one residual added to the dependent variable, the weight of the unrestricted-range one can blur the conclusions regarding observation efficiency. 
Keywords:  Stochastic Frontier Model, Generalized Method of Order Statistics, Minimum Distance Method of Order Statistics, Inverse Order Regression, Replicated Moments, Linear Models. 
JEL:  C24 C10 
Date:  2010–08–10 
URL:  http://d.repec.org/n?u=RePEc:eei:rpaper:eeri_rp_2010_37&r=ecm 
By:  Nikolay Iskrev 
Abstract:  This paper presents a new approach to parameter identification analysis in DSGE models wherein the strength of identification is treated as a property of the underlying model and studied prior to estimation. The strength of identification reflects the empirical importance of the economic features represented by the parameters. Identification problems arise when some parameters are either nearly irrelevant or nearly redundant with respect to the aspects of reality the model is designed to explain. The strength of identification is therefore not only crucial for the estimation of models, but also has important implications for model development. The proposed measure of identification strength is based on the Fisher information matrix of DSGE models and depends on three factors: the parameter values, the set of observed variables and the sample size. By applying the proposed methodology, researchers can determine the effect of each factor on the strength of identification of individual parameters, and study how it is related to structural and statistical characteristics of the economic model. The methodology is illustrated using the medium-scale DSGE model estimated in Smets and Wouters (2007). 
JEL:  C32 C51 C52 E32 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ptu:wpaper:w201032&r=ecm 
By:  Antonio Merlo (Department of Economics, University of Pennsylvania); Aureo de Paula (Department of Economics, University of Pennsylvania) 
Abstract:  This paper studies the nonparametric identification and estimation of voters' preferences when voters are ideological. We build on the methods introduced by Degan and Merlo (2009), representing elections as Voronoi tessellations of the ideological space. We exploit the properties of this geometric structure to establish that voter preference distributions and other parameters of interest can be identified from aggregate electoral data. We also show that these objects can be consistently estimated using the methodology proposed by Ai and Chen (2003), and we illustrate our analysis by performing an actual estimation using data from the 1999 European Parliament elections. 
Keywords:  Voting, Voronoi tessellation, identification, nonparametric 
JEL:  D72 C14 
Date:  2010–12–31 
URL:  http://d.repec.org/n?u=RePEc:pen:papers:11001&r=ecm 
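The Voronoi structure comes from ideological voting: each candidate's support region is the set of ideological positions closer to that candidate than to any rival, so aggregate vote shares are the preference distribution integrated over Voronoi cells. A one-dimensional toy version, with hypothetical candidate positions and a standard normal voter distribution (not the paper's estimator):

```python
import random

random.seed(1)

# candidate positions in a one-dimensional ideological space (hypothetical)
candidates = [-1.0, 0.0, 1.5]

# ideological voters choose the nearest candidate, so each candidate's
# support region is a Voronoi cell; vote shares are cell probabilities
voters = [random.gauss(0, 1) for _ in range(10000)]
shares = [0.0] * len(candidates)
for v in voters:
    k = min(range(len(candidates)), key=lambda j: abs(v - candidates[j]))
    shares[k] += 1 / len(voters)
print([round(s, 3) for s in shares])
```

Here the cell boundaries are the midpoints -0.5 and 0.75, so the centrist candidate gets the largest share; the identification argument runs this mapping in reverse, recovering the voter distribution from observed shares across many elections.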
By:  Moauro, Filippo 
Abstract:  The paper presents the results of an extensive real-time analysis of alternative model-based approaches to deriving a monthly indicator of employment for the euro area. In the experiment, the Eurostat quarterly national accounts series of employment is temporally disaggregated using the information coming from the monthly series of unemployment. The strategy benefits from the contribution of the information sets of the euro area and its six largest member states, as well as from the split into six sections of economic activity. The models under comparison include univariate regressions of the Chow and Lin type, where the euro area aggregate is derived both directly and indirectly, as well as multivariate structural time series models of small and medium size. The specification in logarithms is also systematically assessed. The largest multivariate setups, with up to 49 series, are estimated through the EM algorithm. The main conclusions are the following: mean revision errors of disaggregated estimates of employment are overall small; a gain is obtained when the model strategy takes into account the information by both sector and member state; and the largest multivariate setups outperform those of small size as well as the strategies based on classical disaggregation methods. 
Keywords:  temporal disaggregation methods; multivariate structural time series models; mixed-frequency models; EM algorithm; Kalman filter and smoother 
JEL:  C51 C32 C52 C22 
Date:  2010–12–30 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:27797&r=ecm 
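The core idea of temporal disaggregation is to spread a quarterly total over months guided by a related monthly indicator. The pro-rata sketch below (hypothetical numbers) is the crudest version of this step; Chow-Lin and the paper's state space models replace the fixed proportions with a regression link and serially correlated residuals:

```python
# quarterly employment total and a monthly indicator series (hypothetical);
# the pro-rata step distributes the quarter in proportion to the indicator,
# a crude stand-in for the regression-based Chow-Lin disaggregation
quarter_total = 300.0
monthly_indicator = [98.0, 100.0, 102.0]

w = sum(monthly_indicator)
monthly_estimate = [quarter_total * m / w for m in monthly_indicator]
print(monthly_estimate)  # [98.0, 100.0, 102.0], sums back to the quarter
```

Whatever the method, the temporal aggregation constraint is the same: the three monthly estimates must add back to the published quarterly figure.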
By:  Matthias Fengler; Helmut Herwartz; Christian Werner 
Abstract:  Equity index implied volatility functions are known to be excessively skewed in comparison with implied volatility at the single-stock level. We study this stylized fact for the case of a major German stock index, the DAX, by recovering index implied volatility from simulating the 30-dimensional return system of all DAX constituents. Option prices are computed after risk neutralization of the multivariate process, which is estimated under the physical probability measure. The multivariate models belong to the class of copula asymmetric dynamic conditional correlation models. We show that moderate tail dependence coupled with asymmetric correlation response to negative news is essential to explain the index implied volatility skew. Standard dynamic correlation models with zero tail dependence fail to generate a sufficiently steep implied volatility skew. 
Keywords:  Copula Dynamic Conditional Correlation, Basket Options, Multivariate GARCH Models, Change of Measure, Esscher Transform 
JEL:  C32 C15 G13 G14 
Date:  2010–12 
URL:  http://d.repec.org/n?u=RePEc:usg:dp2010:201033&r=ecm 
By:  Nidhiya Menon (Department of Economics, Brandeis University); Mark M. Pitt (Brown University) 
Abstract:  This paper proposes a novel instrumental variable method for program evaluation that requires only a single cross-section of data on the spatial intensity of programs and outcomes. The instruments are derived from a simple theoretical model of government decision-making in which governments are responsive to the attributes of places and their populations, rather than to the attributes of individuals, in making allocation decisions across space, and have a social welfare function that is spatially weakly separable, that is, the budgeting process is multi-stage with respect to administrative districts and subdistricts. The spatial instrumental variables model is then estimated and tested by GMM with a single cross-section of Indonesian census data. The results support the proposed identification strategy. 
Keywords:  Spatial Decentralization, Program Evaluation, Instrumental Variables, Indonesia 
JEL:  C21 H44 O12 C50 
Date:  2010–09 
URL:  http://d.repec.org/n?u=RePEc:brd:wpaper:16&r=ecm 
By:  Marco Buchmann (European Central Bank, DG Financial Stability, Financial Stability Assessment Division, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.) 
Abstract:  This paper provides a detailed analysis of the leading-indicator properties of corporate bond spreads for real economic activity in the euro area. The in- and out-of-sample predictive content of corporate bond spreads is examined along three dimensions: the bonds' quality, their term to maturity, and the forecast horizon at which one intends to predict a change in real activity. Numerous alternative leading indicators capturing macroeconomic and financial conditions are included in the analysis. Along with standard time series forecast models, the Least Angle Regression (LAR) technique is used to build multivariate models recursively. Models built via LAR can be used to produce forecasts and allow one to analyze how the composition and the number of relevant model variables evolve over time. Corporate bond spreads turn out to be valuable predictors of real activity, in particular at forecast horizons beyond one year; medium-risk bond spreads with maturities between 5 and 10 years appear particularly rich in content. The spreads also belong to the group of indicators that implied the highest probability of a recession occurring from a pre-crisis perspective. 
Keywords:  Corporate bond spreads, point and density forecasting, automatic model building, least angle regression 
JEL:  E32 E37 E44 G32 
Date:  2011–01 
URL:  http://d.repec.org/n?u=RePEc:ecb:ecbwps:20111286&r=ecm 
By:  Sriram Shankar (School of Economics, University of Queensland); Chris O'Donnell (School of Economics, University of Queensland); John Quiggin (School of Economics, University of Queensland) 
Abstract:  In this article we model production technology in a state-contingent framework. Our model analyzes production under uncertainty without being explicit about the nature of producer risk preferences. In our model, producers' risk preferences are captured by the risk-neutral probabilities they assign to the different states of nature. Using a state-general state-contingent specification of technology, we show that rational producers who encounter the same stochastic technology can make significantly different production choices. Further, we develop an econometric methodology to estimate the risk-neutral probabilities and the parameters of the stochastic technology when there are two states of nature, only one of which is observed. Finally, we simulate data based on our state-general state-contingent specification of technology. Biased estimates of the technology parameters are obtained when the conventional ordinary least squares (OLS) estimator is applied to the simulated data. 
Keywords:  CES, Cobb-Douglas, OLS, output-cubical, risk-neutral, state-allocable, state-contingent 
JEL:  C15 C63 D21 D81 Q10 
Date:  2010–12 
URL:  http://d.repec.org/n?u=RePEc:rsm:riskun:r10_3&r=ecm 
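The OLS bias the abstract reports can be reproduced in miniature. The setup below is a hypothetical illustration, not the paper's technology or estimator: when the unobserved state of nature is correlated with producers' input choices, pooling the data and regressing log output on log input conflates the state effect with the input effect:

```python
import math
import random

random.seed(2)

# two states of nature: output is x**0.5 in state 1 and half that in state 2;
# only realized output is observed.  Producers expecting state 1 choose more
# input, so the realized state is correlated with input use, and pooled OLS
# of log output on log input is biased (hypothetical decision rule).
n = 5000
logx, logy = [], []
for _ in range(n):
    optimistic = random.random() < 0.5
    x = 4.0 if optimistic else 1.0                            # input choice
    state1 = random.random() < (0.9 if optimistic else 0.1)   # realized state
    y = (1.0 if state1 else 0.5) * x ** 0.5
    logx.append(math.log(x))
    logy.append(math.log(y))

mx, my = sum(logx) / n, sum(logy) / n
b = sum((a - mx) * (c - my) for a, c in zip(logx, logy)) / \
    sum((a - mx) ** 2 for a in logx)
print(round(b, 2))  # the true output elasticity of the input is 0.5
```

The estimated slope lands well above the true elasticity of 0.5, because high-input producers also disproportionately realize the favorable state.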
By:  Giacomo Sbrana (BETA/CNRS, Université de Strasbourg, France.) 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:afc:wpaper:1008&r=ecm 
By:  Diewert, Erwin 
Abstract:  The paper uses data on sales of detached houses in a small Dutch town over 14 quarters, starting in the first quarter of 2005, in order to compare various methods for constructing a house price index over this period. Four classes of methods are considered: (i) stratification techniques plus normal index number theory; (ii) time dummy hedonic regression models; (iii) hedonic imputation techniques; and (iv) hedonic regression models that are additive in land and structures. The last approach is used to decompose the price of a house into land and structure components, and it relies on the imposition of some monotonicity constraints or on exogenous information on price movements for structures. The problems associated with constructing an index for the stock of houses using information on the sales of houses are also considered. 
Keywords:  Property price indexes, hedonic regressions, stratification techniques, rolling year indexes, Fisher ideal indexes 
JEL:  C2 C23 C43 D12 E31 R21 
Date:  2011–01–07 
URL:  http://d.repec.org/n?u=RePEc:ubc:bricol:erwin_diewert20111&r=ecm 
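A toy version of the time-dummy hedonic idea: in a regression of log price on characteristics plus a period dummy, exp(delta) - 1 is the quality-adjusted price change. With a balanced mix of characteristics across periods, the dummy coefficient reduces to the mean log price change at fixed characteristics, which the sketch below computes directly. All prices are hypothetical, and the paper's regressions use many characteristics, not just matched floor areas:

```python
import math

# matched sales: the same floor area (m2) sold in period 0 and period 1
# (all prices hypothetical)
sales = {80: (200_000, 210_000), 120: (290_000, 305_000)}

# mean log price change at fixed characteristics = time-dummy coefficient
# when the characteristics mix is balanced across the two periods
deltas = [math.log(p1) - math.log(p0) for p0, p1 in sales.values()]
delta = sum(deltas) / len(deltas)
pct = 100 * (math.exp(delta) - 1)   # quality-adjusted % price change
print(round(pct, 2))
```

Here both matched pairs rose by roughly 5 percent, so the index shows a quality-adjusted increase of about 5 percent between the two periods, unaffected by any shift in the size mix of houses actually sold.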