
on Econometrics 
By:  Månsson, Kristofer (Jönköping University); Kibria, B. M. Golam (Florida International University); Sjölander, Pär (Jönköping University); Shukur, Ghazi (Linnaeus University) 
Abstract:  A new shrinkage estimator for the Poisson model is introduced in this paper. The method generalises the Liu (1993) estimator, originally developed for the linear regression model, so that it can be used instead of the classical maximum likelihood (ML) method in the presence of multicollinearity, a situation in which the mean squared error (MSE) of ML becomes inflated. Furthermore, the paper derives the optimal value of the shrinkage parameter and, based on this value, suggests some methods for estimating it. Monte Carlo simulations, in which the MSE and the mean absolute error (MAE) are calculated, show that the Liu estimator combined with these proposed estimators of the shrinkage parameter always outperforms ML. Finally, an empirical application illustrates the usefulness of the new Liu estimators. 
Keywords:  Estimation; MSE; MAE; Multicollinearity; Poisson; Liu; Simulation 
JEL:  C53 
Date:  2011–06–30 
URL:  http://d.repec.org/n?u=RePEc:hhs:huiwps:0051&r=ecm 
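As context for the generalisation above, here is a minimal sketch of the original Liu (1993) estimator in the linear regression model (not the paper's Poisson version, which replaces the OLS ingredients with ML quantities; the function name and interface are illustrative):

```python
import numpy as np

def liu_estimator(X, y, d):
    """Liu (1993) shrinkage estimator for linear regression:
    beta_d = (X'X + I)^{-1} (X'X + d I) beta_OLS, with 0 <= d <= 1.
    d = 1 recovers OLS; smaller d shrinks the estimate to combat
    MSE inflation under multicollinearity."""
    XtX = X.T @ X
    beta_ols = np.linalg.solve(XtX, X.T @ y)
    I = np.eye(X.shape[1])
    return np.linalg.solve(XtX + I, (XtX + d * I) @ beta_ols)
```

The shrinkage matrix shares eigenvectors with X'X and has eigenvalues (lam + d)/(lam + 1) <= 1, so the Liu estimate is never longer than the OLS estimate.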
By:  Francesca Bruno (Università di Bologna); Daniela Cocchi (Università di Bologna); Alessandro Vagheggini (Università di Bologna) 
Abstract:  When statistical inference is used for spatial prediction, the model-based framework known as kriging is commonly used. The predictor for an unsampled element of a population is a weighted combination of sampled values, in which the weights are obtained by estimating the spatial covariance function. This solution can be affected by model misspecification and can be influenced by sampling design properties. In classical design-based finite population inference, these problems can be overcome; nevertheless, spatial solutions are still seldom used for this purpose. Through the efficient use of spatial information, a conceptual framework for design-based estimation has been developed in this study. We propose a standardized weighted predictor for unsampled spatial data, using the population information regarding spatial locations directly in the weighting system. Our procedure does not require model estimation of the spatial pattern, because the spatial relationship is captured exclusively through the Euclidean distances between locations (which are fixed and do not require assessment after sample selection). The individual predictor is a design-based ratio estimator, and we illustrate its properties for simple random sampling. 
Keywords:  spatial sampling; ratio estimator; design-based inference; model-based inference; spatial information in finite population inference 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:bot:quadip:107&r=ecm 
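A minimal sketch of the classical design-based ratio estimator in the non-spatial case (the paper's spatial, distance-weighted predictor is more elaborate; names here are illustrative):

```python
import numpy as np

def ratio_estimator_total(y_sample, x_sample, x_population_total):
    """Classical design-based ratio estimator of the population total of y,
    exploiting an auxiliary variable x known for the whole population."""
    r = np.sum(y_sample) / np.sum(x_sample)  # sample ratio y/x
    return r * x_population_total
```

When y is exactly proportional to x, the estimator reproduces the true total regardless of which units were sampled, which is the intuition behind its efficiency gain.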
By:  Miguel Artiach (Universidad de Alicante); Josu Arteche (UPV/EHU) 
Abstract:  Strong persistence is a common phenomenon that has been documented not only in the levels but also in the volatility of many time series. The class of doubly fractional models is extended to include the possibility of long memory at cyclical (non-zero) frequencies in both the levels and the volatility, and a new model, the GARMA-GARMASV (Gegenbauer AutoRegressive Moving Average – Id. Stochastic Volatility), is introduced. A sequential estimation strategy, based on the Whittle approximation to maximum likelihood, is proposed and its finite sample performance is evaluated with a Monte Carlo analysis. Finally, a version of the model that is trifactorial in the mean and bifactorial in the volatility is shown to successfully fit the well-known sunspot index. 
Keywords:  Stochastic volatility; cycles; long memory; QML estimation; sunspot index. 
JEL:  C22 C13 
Date:  2011–07–14 
URL:  http://d.repec.org/n?u=RePEc:ehu:biltok:201103&r=ecm 
By:  Zeebari , Zangin (Departments of Economics and Statistics); Shukur , Ghazi (Departments of Economics and Statistics); Kibria, B. M. Golam (Florida International University) 
Abstract:  In this paper, we modify a number of new biased estimators of seemingly unrelated regression (SUR) parameters developed by Alkhamisi and Shukur (2008), AS, for the case in which the explanatory variables are affected by multicollinearity. Nine ridge parameters have been modified and compared in terms of the trace mean squared error (TMSE) and the (PR) criterion. The results from this extended study are also compared with those found by AS. A simulation study has been conducted to compare the performance of the modified ridge parameters. The results show that, under certain conditions, the multivariate ridge regression estimator based on the SUR ridge RMSmax is superior to the other estimators in terms of the TMSE and PR criteria. In large samples, and when the collinearity between the explanatory variables is not high, the unbiased SUR estimator produces smaller TMSEs. 
Keywords:  Multicollinearity; modified SUR ridge regression; Monte Carlo simulations; TMSE 
JEL:  C30 
Date:  2010–10–01 
URL:  http://d.repec.org/n?u=RePEc:hhs:huiwps:0043&r=ecm 
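For reference, a minimal single-equation ridge estimator (the paper's modified SUR ridge estimators operate on the full system; this sketch only illustrates the basic shrinkage idea):

```python
import numpy as np

def ridge_estimator(X, y, k):
    """Ordinary ridge regression: (X'X + k I)^{-1} X'y.
    k = 0 recovers OLS; k > 0 trades bias for a variance reduction,
    which pays off when the columns of X are nearly collinear."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
```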
By:  Arnaud Maurel; Xavier D'Haultfoeuille 
Abstract:  This paper considers the identification and estimation of an extension of Roy's (1951) model of sectoral choice, which includes a non-pecuniary component in the selection equation and allows for uncertainty about potential earnings. We focus on the identification of the non-pecuniary component, which is key to disentangling the relative importance of monetary incentives versus preferences in the context of sorting across sectors. By exploiting the structure of the selection equation, we show that this component is point identified from knowledge of the covariate effects on earnings, as soon as one covariate is continuous. Notably, and in contrast to most results on the identification of Roy models, this implies that identification can be achieved without any exclusion restriction or large support condition on the covariates. As a by-product, bounds are obtained on the distribution of the ex ante monetary returns. We also propose a three-stage semiparametric estimation procedure for this model, which yields root-n consistent and asymptotically normal estimators. Finally, we apply our results to the educational context, providing new evidence from French data that non-pecuniary factors are a key determinant of higher education attendance decisions. 
Keywords:  Roy model, nonparametric identification, schooling choices, ex ante returns to schooling 
JEL:  C14 C25 J24 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:duk:dukeec:1110&r=ecm 
By:  BIA Michela; FLORES Carlos A.; MATTEI Alessandra 
Abstract:  We propose two semiparametric estimators of the dose-response function based on spline techniques. Under unconfoundedness, the generalized propensity score can be used to estimate dose-response functions (DRF) and marginal treatment effect functions. In many observational studies the treatment is neither binary nor categorical; in such cases, one may be interested in estimating the dose-response function in a setting with a continuous treatment. We evaluate the performance of the proposed estimators using Monte Carlo simulation methods. The simulation results suggest that the estimated DRF is robust to the specific semiparametric estimator used, while parametric estimates of the DRF are sensitive to model misspecification. We apply our approach to the problem of evaluating the effect on innovation sales of the Research and Development (R&D) financial aid received by Luxembourgish firms in 2004 and 2005. 
Keywords:  Continuous treatment; Dose-response function; Generalized Propensity Score; Nonparametric methods; R&D investment 
JEL:  C13 J31 J70 
Date:  2011–07 
URL:  http://d.repec.org/n?u=RePEc:irs:cepswp:201140&r=ecm 
By:  Proietti, Tommaso; Luetkepohl, Helmut 
Abstract:  The paper investigates whether transforming a time series leads to an improvement in forecasting accuracy. The class of transformations considered is the Box-Cox power transformation, which applies to series measured on a ratio scale. We propose a nonparametric approach for estimating the optimal transformation parameter based on the frequency domain estimation of the prediction error variance, and also conduct an extensive recursive forecast experiment on a large set of seasonal monthly macroeconomic time series related to industrial production and retail turnover. In about one fifth of the series considered, the Box-Cox transformation produces forecasts significantly better than the untransformed data at the one-step-ahead horizon; in most of these cases the logarithmic transformation is the relevant one. As the forecast horizon increases, the evidence in favour of a transformation becomes less strong. Typically, the naïve predictor that just reverses the transformation leads to a lower mean square error than the optimal predictor at short forecast leads. We also discuss whether the preliminary in-sample frequency domain assessment provides reliable guidance as to which series should be transformed in order to significantly improve predictive performance. 
Keywords:  Forecasts comparisons; Multistep forecasting; Rolling forecasts; Nonparametric estimation of prediction error variance. 
JEL:  C53 C14 C52 C22 
Date:  2011–07–18 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:32294&r=ecm 
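A minimal sketch of the Box-Cox transformation and of the "naïve" predictor that just reverses it (the paper's optimal predictor adds a bias correction that this sketch omits; names are illustrative):

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox power transformation for positive y (lam = 0 gives the log)."""
    y = np.asarray(y, dtype=float)
    return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

def naive_inverse(z, lam):
    """'Naive' predictor: simply reverse the transformation,
    ignoring the variance adjustment an optimal predictor would apply."""
    z = np.asarray(z, dtype=float)
    return np.exp(z) if lam == 0 else (lam * z + 1.0) ** (1.0 / lam)
```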
By:  Giuseppe Cavaliere (Università di Bologna); Iliyan Georgiev (Faculdade de Economia, Universidade Nova de Lisboa); A. M. Robert Taylor (School of Economics, University of Nottingham) 
Abstract:  It is well known that the standard i.i.d. bootstrap of the mean is inconsistent in a location model with infinite variance (α-stable) innovations. This occurs because the bootstrap distribution of a normalised sum of infinite variance random variables tends to a random distribution. Consistent bootstrap algorithms based on subsampling methods have been proposed, but they have the drawback of delivering much wider confidence sets than those generated by the i.i.d. bootstrap, owing to the fact that they eliminate the dependence of the bootstrap distribution on the sample extremes. In this paper we propose sufficient conditions that allow a simple modification of the bootstrap (Wu, 1986, Ann. Stat.) to be consistent (in a conditional sense) while also reproducing the narrower confidence sets of the i.i.d. bootstrap. Numerical results demonstrate that our proposed bootstrap method works very well in practice, delivering coverage rates very close to the nominal level and significantly narrower confidence sets than other consistent methods. 
Keywords:  Bootstrap, stable distributions, random probability measures, weak convergence 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:bot:quadip:108&r=ecm 
By:  Takashi Kamihigashi (Research Institute for Economics and Business Administration, Kobe University); John Stachurski (Research School of Economics, Australian National University, ACT, Australia) 
Abstract:  We discuss stability of stationary distributions for discretetime Markov chains satisfying monotonicity and an ordertheoretic mixing condition that can be seen as an alternative to irreducibility. A process satisfying these conditions has at most one stationary distribution, and any such distribution must be globally stable. 
Date:  2011–07 
URL:  http://d.repec.org/n?u=RePEc:kob:dpaper:dp201124&r=ecm 
By:  Fujimoto, S.; Ishikawa, A.; Mizuno, T.; Watanabe, T. 
Abstract:  We propose a new method for estimating the power-law exponent of a firm size variable, such as annual sales. Our focus is on how to empirically identify a range in which a firm size variable follows a power-law distribution. As is well known, a firm size variable follows a power-law distribution only beyond some threshold. On the other hand, in almost all empirical exercises, the right end of the distribution deviates from a power-law due to the finite size effect. We modify the method proposed by Malevergne et al. (2011) so that both the lower and the upper thresholds can be identified, and then estimate the power-law exponent using only the observations in the range defined by the two thresholds. We apply this new method to various firm size variables, including annual sales, the number of workers, and tangible fixed assets, for firms in more than thirty countries. 
Keywords:  Econophysics, power-law distributions, power-law exponents, firm size variables, finite size effect 
JEL:  C16 D20 E23 
Date:  2011–07 
URL:  http://d.repec.org/n?u=RePEc:hit:cinwps:7&r=ecm 
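A minimal Hill-type sketch of exponent estimation on a thresholded range. This is not the paper's modified Malevergne et al. (2011) procedure, which also selects the thresholds; here they are taken as given, and no correction for upper truncation is applied:

```python
import numpy as np

def hill_exponent(x, lower, upper):
    """Hill-type estimate of the power-law exponent using only
    observations in [lower, upper]; with upper = inf this is the
    classic Hill estimator relative to the lower threshold."""
    x = np.asarray(x, dtype=float)
    tail = x[(x >= lower) & (x <= upper)]
    return tail.size / np.sum(np.log(tail / lower))
```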
By:  Patrick Bayer; Robert McMillan; Alvin Murphy; Christopher Timmins 
Abstract:  We develop a tractable model of neighborhood choice in a dynamic setting along with a computationally straightforward estimation approach. This approach uses information about neighborhood choices and the timing of moves to recover moving costs and preferences for dynamically evolving housing and neighborhood attributes. The model and estimator are potentially applicable to the study of a wide range of dynamic phenomena in housing markets and cities. We focus here on estimating the marginal willingness to pay for non-marketed amenities – neighborhood racial composition, air pollution, and violent crime – using rich dynamic data. Consistent with the time-series properties of each amenity, we find that a static demand model understates willingness to pay to avoid pollution and crime but overstates willingness to pay to live near neighbors of one’s own race. These findings have important implications for the class of static housing demand models typically used to value urban amenities. 
Keywords:  Neighborhood Choice, Housing Demand, Hedonic Valuation, Dynamic Discrete Choice 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:duk:dukeec:1116&r=ecm 
By:  Stephen Pollock 
Abstract:  The algebra of the Kronecker products of matrices is recapitulated using a notation that reveals the tensor structures of the matrices. It is claimed that many of the difficulties encountered in working with the algebra can be alleviated by paying close attention to the indices that are concealed beneath the conventional matrix notation. The vectorisation operations and the commutation transformations that are common in multivariate statistical analysis alter the positional relationship of the matrix elements. These elements correspond to numbers that are liable to be stored in contiguous memory cells of a computer, which should remain undisturbed. It is suggested that, in the absence of an adequate index notation that enables the manipulations to be performed without disturbing the data, even the most clear-headed of computer programmers is liable to perform wholly unnecessary and time-wasting operations that shift data between memory cells. 
Date:  2011–07 
URL:  http://d.repec.org/n?u=RePEc:lec:leecon:11/34&r=ecm 
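The vectorisation and Kronecker-product machinery the abstract refers to centres on the identity vec(AXB) = (B' ⊗ A) vec(X). A small numerical check, which also illustrates the storage-order point (NumPy is row-major by default, so column-stacking vectorisation must request Fortran order explicitly):

```python
import numpy as np

def vec(M):
    # Column-stacking vectorisation; order="F" gives the conventional
    # (column-major) vec operator despite NumPy's row-major storage.
    return M.reshape(-1, order="F")

rng = np.random.default_rng(0)
A = rng.normal(size=(2, 3))
X = rng.normal(size=(3, 4))
B = rng.normal(size=(4, 5))

# vec(A X B) = (B' kron A) vec(X)
lhs = vec(A @ X @ B)
rhs = np.kron(B.T, A) @ vec(X)
```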
By:  Raghav, Manu; Barreto, Humberto 
Abstract:  This paper focuses on econometrics pedagogy. It demonstrates the importance of including probability weights in regression analysis using data from surveys that do not use simple random samples (SRS). We use concrete, numerical examples and simulation to show how to effectively teach this difficult material to a student audience. We relax the assumption of simple random sampling and show how unequal probability of selection can lead to biased, inconsistent OLS slope estimates. We then explain and apply probability-weighted least squares, showing how weighting the observations by the reciprocal of the probability of inclusion in the sample improves performance. The exposition is non-mathematical and relies heavily on intuitive, visual displays to make the content accessible to students. This paper will enable professors to incorporate unequal probability of selection into their courses and allow students to use best-practice techniques in analyzing data from complex surveys. The primary delivery vehicle is Microsoft Excel®. Two user-defined array functions, SAMPLE and LINESTW, are included in a prepared Excel workbook. We replicate all results in Stata® and offer a do file for easy analysis in Stata. Documented code in Excel and Stata allows users to see each step in the sampling and probability-weighted least squares algorithms. All files and code are available at www.depauw.edu/learn/stata. 
Keywords:  unequal probability; complex survey; simulation; weighted regression 
JEL:  A22 A23 C8 C01 
Date:  2011–06–30 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:32334&r=ecm 
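The weighting step the abstract describes can be sketched in a few lines (the paper's reference implementations are the Excel LINESTW function and Stata; this Python version is illustrative only):

```python
import numpy as np

def pwls(X, y, p_incl):
    """Probability-weighted least squares: weight each observation by the
    reciprocal of its probability of inclusion in the sample, then solve
    the weighted normal equations X'WX b = X'Wy."""
    w = 1.0 / np.asarray(p_incl, dtype=float)
    Xw = X * w[:, None]                      # each row scaled by its weight
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)
```

When all inclusion probabilities are equal, the weights cancel and the estimate coincides with OLS, which is why weighting matters only for non-SRS designs.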
By:  Javier FernándezMacho (EA3  UPV/EHU) 
Abstract:  Statistical studies that consider multiscale relationships among several variables use wavelet correlations and cross-correlations between pairs of variables. This procedure requires calculating and comparing a large number of wavelet statistics. The analysis can then be rather confusing and even frustrating, since it may fail to indicate clearly the multiscale overall relationship that might exist among the variables. This paper presents two new statistical tools that help to determine the overall correlation for the whole multivariate set on a scale-by-scale basis. This is illustrated in the analysis of a multivariate set of daily Eurozone stock market returns during a recent period. Wavelet multiple correlation analysis reveals the existence of a nearly exact linear relationship for periods longer than a year, which can be interpreted as perfect integration of these Euro stock markets at the longest time scales. It also shows that small inconsistencies between Euro markets seem to be just short within-year discrepancies, possibly due to the interaction of different agents with different trading horizons. On the other hand, multiple cross-correlation analysis shows that the French CAC40 may lead the rest of the Euro markets at those short time scales. 
Keywords:  Euro zone, MODWT, multiscale analysis, multivariate analysis, stock markets, returns, wavelet transform. 
JEL:  C32 C87 G15 
Date:  2011–07–14 
URL:  http://d.repec.org/n?u=RePEc:ehu:biltok:201104&r=ecm 
By:  Alessandro Gnoatto; Martino Grasselli 
Abstract:  We derive the explicit formula for the joint Laplace transform of the Wishart process and its time integral, which extends the original approach of Bru. We compare our methodology with the alternative results given by the variation of constants method, the linearization of the matrix Riccati ODEs and the Runge-Kutta algorithm. The new formula turns out to be fast, accurate and very useful for applications when dealing with stochastic volatility and stochastic correlation modelling. 
Date:  2011–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1107.2748&r=ecm 
By:  Matteo Fragetta (University of Salerno); Giovanni Melina (University of Surrey) 
Abstract:  This paper applies graphical modelling theory to recover identifying restrictions for the analysis of monetary policy shocks in a VAR of the US economy. Results are in line with the view that only high-frequency data should be assumed to be in the information set of the monetary authority when the interest rate decision is taken. 
Keywords:  Monetary policy; SVAR; Graphical modelling 
JEL:  E43 E52 
Date:  2011–07 
URL:  http://d.repec.org/n?u=RePEc:sur:surrec:0811&r=ecm 