
on Econometrics 
By:  Cizek, P.; Haerdle, W.; Spokoiny, V. (Tilburg University, Center for Economic Research) 
Abstract:  This paper offers a new method for estimation and forecasting of linear and nonlinear time series when the stationarity assumption is violated. Our general local parametric approach applies in particular to varying-coefficient parametric models, such as AR or GARCH, whose coefficients may vary arbitrarily with time. Global parametric, smooth transition, and change-point models are special cases. The method is based on an adaptive pointwise selection of the largest interval of homogeneity with a given right-end point by a local change-point analysis. We construct locally adaptive estimates that can perform this task and investigate them both from the theoretical point of view and by Monte Carlo simulations. In the particular case of GARCH estimation, the proposed method is applied to stock-index series and is shown to outperform the standard parametric GARCH model. 
Keywords:  adaptive pointwise estimation; autoregressive models; conditional heteroscedasticity models; local time-homogeneity 
JEL:  C13 C14 C22 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200735&r=ecm 
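As an illustration of the interval-of-homogeneity idea only (a crude sketch, not the authors' estimator: the local constant-volatility model, the variance-ratio statistic, the threshold `crit`, and the step size are all ad-hoc choices made here):

```python
import numpy as np

def largest_homogeneous_interval(x, min_len=40, max_len=400, step=5, crit=1.2):
    """Grow a window ending at the last observation; stop as soon as a crude
    change-point statistic (log variance ratio between the older and newer
    half of the candidate window) exceeds `crit`."""
    n = len(x)
    best = min_len
    for m in range(min_len, min(max_len, n) + 1, step):
        w = x[n - m:]
        half = m // 2
        v_old = np.var(w[:half]) + 1e-12
        v_new = np.var(w[half:]) + 1e-12
        if abs(np.log(v_old / v_new)) > crit:
            break                      # homogeneity rejected: keep last length
        best = m
    return best

rng = np.random.default_rng(0)
# variance regime switch: noisy regime, then 150 calm observations at the end
x = np.concatenate([rng.normal(0, 3.0, 250), rng.normal(0, 1.0, 150)])
m_sel = largest_homogeneous_interval(x)
```

The selected window should roughly cover the calm final regime and stop before absorbing much of the high-variance past.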
By:  Douglas Miller (Department of Economics, University of Missouri-Columbia) 
Abstract:  Parametric stochastic frontier models have a long history in applied production economics, but the class of tractable parametric models is relatively small. Consequently, researchers have recently considered nonparametric alternatives such as kernel density estimators, functional approximations, and data envelopment analysis (DEA). The purpose of this paper is to present an information-theoretic approach to constructing more flexible classes of parametric stochastic frontier models. Further, the proposed class of models nests all of the commonly used parametric methods as special cases, and the proposed modeling framework provides a comprehensive means to conduct model specification tests. The modeling framework is also extended to develop information-theoretic measures of mean technical efficiency and to construct a profile likelihood estimator of the stochastic frontier model. 
Keywords:  Kullback-Leibler information criterion, output distance function, profile likelihood, stochastic frontier, technical efficiency 
JEL:  C13 C21 C51 
Date:  2007–07–16 
URL:  http://d.repec.org/n?u=RePEc:umc:wpaper:0717&r=ecm 
By:  Drost, F.C.; Akker, R. van den; Werker, B.J.M. (Tilburg University, Center for Economic Research) 
Abstract:  This note reconsiders the nonnegative integer-valued bilinear processes introduced by Doukhan, Latour, and Oraichi (2006). Using a hidden Markov argument, we extend their result on the existence of a stationary solution for the INBL(1,0,1,1) process to the class of superdiagonal INBL(p, q, m, n) models. Our approach also yields improved parameter restrictions for several moment conditions compared to those in Doukhan, Latour, and Oraichi (2006). 
Keywords:  count data; integer-valued time series; bilinear model 
JEL:  C22 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200747&r=ecm 
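The binomial thinning operator underlying integer-valued models of this kind is straightforward to simulate. The recursion below is a hypothetical INBL(1,0,1,1)-style specification with Poisson innovations; its exact form is an assumption and may differ from the parameterization in Doukhan, Latour, and Oraichi (2006):

```python
import numpy as np

rng = np.random.default_rng(1)

def thin(alpha, x):
    """Binomial thinning a∘X: the sum of X independent Bernoulli(alpha) draws."""
    return rng.binomial(x, alpha)

def simulate_inbl(n, a=0.4, b=0.02, lam=1.0):
    """Hypothetical INBL(1,0,1,1)-type recursion
    X_t = a∘X_{t-1} + b∘(X_{t-1} * e_{t-1}) + e_t, e_t ~ Poisson(lam)."""
    x = np.zeros(n, dtype=int)
    e = rng.poisson(lam, n)
    for t in range(1, n):
        x[t] = thin(a, x[t - 1]) + thin(b, x[t - 1] * e[t - 1]) + e[t]
    return x

x = simulate_inbl(5000)
```

With a + b*lam well below 1, the simulated path stays integer-valued, nonnegative, and mean-stationary.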
By:  Jean-Pierre Florens (Toulouse School of Economics, Institut Universitaire de France); Denis Fougère (CNRS, CREST-INSEE, CEPR and IZA); Michel Mouchart (Université Catholique de Louvain) 
Abstract:  This survey is devoted to the statistical analysis of duration models and point processes. The first section introduces specific concepts and definitions for single-spell duration models. Section two is devoted to the presentation of conditional duration models which incorporate the effects of explanatory variables. Competing risks models are presented in the third section. The fourth section is concerned with statistical inference, with a special emphasis on non- and semiparametric estimation of single-spell duration models. Section 5 sets forth the main definitions for point and counting processes. Section 6 presents important elementary examples of point processes, namely Poisson, Markov and semi-Markov processes. The last section presents a general semiparametric framework for studying point processes with explanatory variables. 
Keywords:  duration models, hazard function, point processes, Markov chains, semi-Markovian processes 
JEL:  C41 C33 C44 C51 
Date:  2007–08 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp2971&r=ecm 
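A standard nonparametric estimator for the single-spell, right-censored duration data surveyed here is the Kaplan-Meier product-limit estimator of the survivor function; a minimal sketch on toy spell data:

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit estimate of the survivor function for single-spell,
    right-censored durations (events = 1 for observed exits, 0 for censored).
    Returns a list of (event time, estimated survival) pairs."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                      # still in the spell
        exits = np.sum((times == t) & (events == 1))      # exits at time t
        s *= 1.0 - exits / at_risk
        surv.append((t, s))
    return surv

# toy spell data: durations in months, with two right-censored observations
est = kaplan_meier([2, 3, 3, 5, 7, 8, 8], [1, 1, 0, 1, 0, 1, 1])
```

The estimated hazard at each event time is the per-step drop factor `exits / at_risk`, which ties the estimator directly to the hazard-function formulation used in the survey.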
By:  A. Craig Burnside 
Abstract:  The risk factors in many consumption-based asset pricing models display statistically weak correlation with the returns being priced. Some GMM-based procedures used to test these models have very low power to reject proposed stochastic discount factors (SDFs) when they are misspecified and the covariance matrix of the asset returns with the risk factors has less than full column rank. Consequently, these estimators provide potentially misleading positive assessments of the SDFs. Working with SDFs specified in terms of demeaned risk factors improves the performance of GMM, but the power to reject misspecified SDFs may remain low. Two summary tests for failure of the rank condition have reasonable power, and lead to no Type I errors in Monte Carlo experiments. 
JEL:  C33 F31 G12 
Date:  2007–08 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:13357&r=ecm 
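The rank condition at issue can be illustrated by inspecting the singular values of the sample covariance between returns and factors. The simulation below uses hypothetical data (not one of the paper's tests) to show how a pure-noise factor produces a near-zero singular value, signaling a failure of full column rank:

```python
import numpy as np

rng = np.random.default_rng(2)
T, n_assets = 2000, 5

# one factor that genuinely drives the returns, one pure-noise factor
f_good = rng.normal(size=T)
f_noise = rng.normal(size=T)
betas = rng.uniform(0.5, 1.5, n_assets)
returns = np.outer(f_good, betas) + rng.normal(0, 1, (T, n_assets))

def cross_cov_singvals(R, F):
    """Singular values of Cov(returns, factors); values near zero indicate
    the covariance matrix has less than full column rank."""
    Rc = R - R.mean(0)
    Fc = F - F.mean(0)
    C = Rc.T @ Fc / len(R)          # n_assets x n_factors
    return np.linalg.svd(C, compute_uv=False)

s = cross_cov_singvals(returns, np.column_stack([f_good, f_noise]))
```

The first singular value reflects the priced factor's loadings; the second is only sampling noise of order 1/sqrt(T), the pattern the paper's summary rank tests are designed to detect.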
By:  Kleijnen, J.P.C. (Tilburg University, Center for Economic Research) 
Abstract:  In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naive methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independently distributed with a constant variance; moreover, the regression (meta)model of the simulation model's I/O behaviour is assumed to have residuals with zero means. This article addresses the following practical questions: (i) How realistic are these assumptions, in practice? (ii) How can these assumptions be tested? (iii) If assumptions are violated, can the simulation's I/O data be transformed such that the assumptions do hold? (iv) If not, which alternative statistical methods can then be applied? 
Keywords:  metamodel; experimental design; jackknife; bootstrap; common random numbers; validation 
JEL:  C0 C1 C9 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200730&r=ecm 
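A minimal example of the DOE-plus-regression approach advocated here: a replicated 2^2 factorial design in coded units, with a first-order regression metamodel fitted by least squares. The "simulation model" is a stand-in toy function, not anything from the article:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)

def simulate(x1, x2):
    """Stand-in 'simulation model': two factors plus normal noise."""
    return 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(0, 0.1)

# 2^2 full factorial design in coded units (-1, +1), replicated 10 times
design = np.array(list(product([-1.0, 1.0], repeat=2)))
X, y = [], []
for _ in range(10):
    for x1, x2 in design:
        X.append([1.0, x1, x2])          # intercept + both factors
        y.append(simulate(x1, x2))
X, y = np.array(X), np.array(y)

# first-order regression metamodel of the I/O behaviour
beta = np.linalg.lstsq(X, y, rcond=None)[0]
```

Unlike changing one factor at a time, the factorial design estimates both factor effects from every run, which is the efficiency gain the article's statistical argument rests on.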
By:  William C. Horrace (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Seth O. Richards 
Abstract:  Parametric stochastic frontier models yield firm-level conditional distributions of inefficiency that are truncated normal. Given these distributions, how should one assess and rank firm-level efficiency? This study compares the techniques of estimating (a) the conditional means of inefficiency and (b) the probabilities that firms are most or least efficient. Monte Carlo experiments suggest that the efficiency probabilities are more reliable in terms of mean absolute percent error when inefficiency has large variation across firms. Along the way we tackle some interesting problems associated with simulating and assessing estimator performance in the stochastic frontier environment. 
Keywords:  Truncated normal, stochastic frontier, efficiency, multivariate probabilities. 
JEL:  C12 C16 C44 D24 
Date:  2007–08 
URL:  http://d.repec.org/n?u=RePEc:max:cprwps:97&r=ecm 
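Technique (b), the probability that a firm is most efficient, can be approximated by simulating from each firm's conditional inefficiency distribution. The sketch below assumes three firms with independent truncated-normal inefficiencies and hypothetical parameters; it illustrates the idea, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(4)

def truncnorm_pos(mu, sigma, size):
    """Draw from N(mu, sigma^2) truncated to (0, inf) by simple rejection."""
    out = np.empty(0)
    while out.size < size:
        d = rng.normal(mu, sigma, size * 4)
        out = np.concatenate([out, d[d > 0]])
    return out[:size]

# hypothetical conditional inefficiency distributions for three firms
mus = np.array([0.2, 0.5, 0.9])
draws = np.stack([truncnorm_pos(m, 0.3, 20000) for m in mus])

# probability each firm has the smallest inefficiency (i.e. is most efficient)
p_best = np.bincount(draws.argmin(axis=0), minlength=3) / draws.shape[1]
```

The firm with the smallest inefficiency location gets the largest "most efficient" probability, and the probabilities sum to one by construction.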
By:  Bettonvil, B.W.M.; Castillo, E. del; Kleijnen, J.P.C. (Tilburg University, Center for Economic Research) 
Abstract:  This paper studies simulation-based optimization with multiple outputs. It assumes that the simulation model has one random objective function and must satisfy given constraints on the other random outputs. It presents a statistical procedure for testing whether a specific input combination (proposed by some optimization heuristic) satisfies the Karush-Kuhn-Tucker (KKT) first-order optimality conditions. The paper focuses on "expensive" simulations, which have small sample sizes. The paper applies the classic t test to check whether the specific input combination is feasible, and whether any constraints are binding; it applies bootstrapping (resampling) to test the estimated gradients in the KKT conditions. The new methodology is applied to three examples, which yield encouraging empirical results. 
Keywords:  Stopping rule; metaheuristics; response surface methodology; design of experiments 
JEL:  C0 C1 C9 C15 C44 C61 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200745&r=ecm 
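A rough sketch of the bootstrap component of such a procedure, under strong simplifying assumptions that are mine and not the paper's (one input, a toy noisy objective, central finite differences for the gradient):

```python
import numpy as np

rng = np.random.default_rng(5)

def sim_objective(x, n_rep):
    """Stand-in 'expensive' simulation: a quadratic objective plus noise,
    observed only through a small number of replicates."""
    return (x - 1.0) ** 2 + rng.normal(0, 0.05, n_rep)

x0, h, n_rep = 1.02, 0.05, 15          # candidate point from some heuristic
f_plus = sim_objective(x0 + h, n_rep)
f_minus = sim_objective(x0 - h, n_rep)
grad_reps = (f_plus - f_minus) / (2 * h)   # per-replicate gradient estimates

# bootstrap a confidence interval for the gradient; if it covers zero we
# cannot reject stationarity (the unconstrained KKT condition) at x0
boot = np.array([rng.choice(grad_reps, n_rep).mean() for _ in range(2000)])
lo_ci, hi_ci = np.percentile(boot, [2.5, 97.5])
covers_zero = lo_ci <= 0.0 <= hi_ci
```

The small replicate count is the point: with "expensive" simulations, resampling the few available gradient estimates substitutes for distributional assumptions the classic t test would need.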
By:  Dennis Philip (Cass Business School, City University, 106 Bunhill Row, London EC1Y 8TZ, UK); Chihwa Kao (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Giovanni Urga (Cass Business School, City University, 106 Bunhill Row, London EC1Y 8TZ, UK) 
Abstract:  A widely relied-upon but formally untested consideration is the stability of the factors underlying the term structure of interest rates. In testing for stability, practitioners as well as academics have employed ad hoc techniques such as splitting the sample into a few subperiods and checking whether the factor loadings appear similar over all subperiods. Various authors have found mixed evidence on the stability of the factors. In this paper we develop a formal testing procedure to evaluate the factor-structure stability of the US zero-coupon yield term structure. We find the factor structure of the level to be unstable over the sample period considered; the slope and curvature factor structures, however, are found to be stable. Common structural changes affecting all interest-rate maturities have fostered instability in the level factor. We corroborate the literature finding that the variances (volatility) explained by the level, slope, and curvature factors are unstable over time. We find that the volatility of the slope factor is sensitive to shocks affecting the short rates, and the volatility of the curvature factor is sensitive to shocks affecting the medium and long rates. Finally, we find evidence of common economic shocks affecting the level and slope factors, unlike the slope and curvature factors, which responded differently to economic shocks and were unaffected by any common instabilities. 
Keywords:  Stability, factor structure, principal component analysis, term structure of interest rates. 
JEL:  C12 C13 C14 C51 
Date:  2007–07 
URL:  http://d.repec.org/n?u=RePEc:max:cprwps:96&r=ecm 
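One simple way to probe loading stability, in the spirit of (but far cruder than) the formal test developed in the paper: extract principal-component loadings from two subsamples of a toy yield panel and compare them up to sign. All data and parameters below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(6)
T, n_mat = 600, 8
mats = np.linspace(1, 10, n_mat)       # maturities in years

# toy yield panel driven by persistent level and slope factors
level = np.cumsum(rng.normal(0, 0.1, T))
slope = np.cumsum(rng.normal(0, 0.05, T))
yields = (level[:, None] + slope[:, None] * (mats / 10)
          + rng.normal(0, 0.02, (T, n_mat)))

def pc_loadings(Y, k=2):
    """First k principal-component loadings (eigenvectors of the sample
    covariance matrix, largest eigenvalues first)."""
    vals, vecs = np.linalg.eigh(np.cov(Y, rowvar=False))
    return vecs[:, ::-1][:, :k]

load_a = pc_loadings(yields[:T // 2])
load_b = pc_loadings(yields[T // 2:])

# cosine similarity of subsample loadings, up to sign: near 1 means stable
cos = np.abs(np.sum(load_a * load_b, axis=0))
```

This is exactly the informal split-sample comparison the paper criticizes as ad hoc, which is why a formal test with a proper null distribution is needed.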
By:  A. PRINZIE; D. VAN DEN POEL 
Abstract:  Random Forests (RF) is a successful classifier exhibiting performance comparable to AdaBoost, but more robust. The exploitation of two sources of randomness, random inputs (bagging) and random features, makes RF an accurate classifier in several domains. We hypothesize that methods other than classification or regression trees could also benefit from injecting randomness. This paper generalizes the RF framework to other multiclass classification algorithms, like the well-established MultiNomial Logit (MNL) and Naive Bayes (NB). We propose Random MNL (RMNL) as a new bagged classifier combining a forest of MNLs estimated with randomly selected features. Analogously, we introduce Random Naive Bayes (RNB). We benchmark the predictive performance of RF, RMNL and RNB against state-of-the-art SVM classifiers. RF, RMNL and RNB outperform SVM. Moreover, generalizing RF seems promising, as reflected by the improved predictive performance of RMNL. 
Date:  2007–06 
URL:  http://d.repec.org/n?u=RePEc:rug:rugwps:07/469&r=ecm 
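The bagging-plus-random-features idea can be sketched with a Gaussian naive Bayes base learner (an assumed stand-in; the paper's RNB details may differ): each base model sees a bootstrap sample and a random feature subset, and predictions are combined by majority vote:

```python
import numpy as np

rng = np.random.default_rng(7)

def gnb_fit(X, y):
    """Gaussian naive Bayes: per-class feature means, variances, priors."""
    classes = np.unique(y)
    mu = np.array([X[y == c].mean(0) for c in classes])
    var = np.array([X[y == c].var(0) + 1e-6 for c in classes])
    prior = np.array([np.mean(y == c) for c in classes])
    return classes, mu, var, prior

def gnb_predict(model, X):
    classes, mu, var, prior = model
    ll = (-0.5 * ((X[:, None, :] - mu) ** 2 / var + np.log(var)).sum(-1)
          + np.log(prior))
    return classes[np.argmax(ll, axis=1)]

def random_nb(X, y, X_test, n_estimators=25, n_feat=2):
    """Random-subspace bagging: each base NB gets a bootstrap sample and a
    random feature subset; class labels 0..K-1 double as vote columns."""
    votes = np.zeros((len(X_test), len(np.unique(y))), dtype=int)
    for _ in range(n_estimators):
        rows = rng.integers(0, len(X), len(X))                 # bagging
        cols = rng.choice(X.shape[1], n_feat, replace=False)   # random features
        model = gnb_fit(X[rows][:, cols], y[rows])
        pred = gnb_predict(model, X_test[:, cols])
        votes[np.arange(len(X_test)), pred] += 1
    return votes.argmax(1)

# toy two-class data: two informative features, two pure-noise features
n = 400
y = rng.integers(0, 2, n)
X = np.column_stack([y + rng.normal(0, 0.7, n), y - rng.normal(0, 0.7, n),
                     rng.normal(size=n), rng.normal(size=n)])
acc = np.mean(random_nb(X[:300], y[:300], X[300:]) == y[300:])
```

Some base learners see only noise features, but the majority vote washes their errors out, which is the robustness argument for injecting randomness.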
By:  Ryan R. Brady (United States Naval Academy) 
Abstract:  How fast, how long, and to what magnitude does a change in housing prices in one region affect its neighbors? In this paper, I apply a time series technique for measuring impulse response functions from linear projections to a spatial autoregressive model of housing prices. For a dynamic panel of California counties, the data reveal that spatial autocorrelation between regional housing prices is highly persistent over time, lasting up to two and a half years. This result, and the econometric techniques employed, should be of interest not only to housing and regional economists, but to a variety of applied econometricians as well. 
Date:  2007–08 
URL:  http://d.repec.org/n?u=RePEc:usn:usnawp:19&r=ecm 
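A univariate sketch of the impulse-responses-from-linear-projections technique (the paper applies it to a spatial autoregressive panel; here a toy AR(1) with an observed shock stands in, so the true response at horizon h is rho^h):

```python
import numpy as np

rng = np.random.default_rng(8)
T, rho = 3000, 0.8
e = rng.normal(size=T)                 # observed shock series
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + e[t]       # AR(1) data-generating process

def local_projection_irf(y, shock, H):
    """Local projections: for each horizon h, regress y_{t+h} on the
    period-t shock; the slope coefficient traces out the impulse response."""
    n = len(y)
    irf = []
    for h in range(H + 1):
        Y = y[h:]
        X = np.column_stack([np.ones(n - h), shock[:n - h]])
        b = np.linalg.lstsq(X, Y, rcond=None)[0]
        irf.append(b[1])
    return np.array(irf)

irf = local_projection_irf(y, e, H=6)
```

Each horizon is a separate regression, so the response is not forced to follow the recursive decay an estimated AR model would impose, the flexibility that makes the approach attractive for spatial dynamics.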
By:  Mishra, SK 
Abstract:  No foolproof method exists to fit nonlinear curves to data or estimate the parameters of an intrinsically nonlinear function. Some methods succeed at solving one set of problems but fail at others. The Differential Evolution (DE) method of global optimization is an emerging method that has shown its power to solve difficult nonlinear optimization problems. In this study we use DE to solve some nonlinear least squares problems given by the US National Institute of Standards and Technology (NIST), Department of Commerce, and some other challenge problems posed by CPCX Software (the makers of the AUTO2FIT software). DE solves the test problems given by the NIST and most of the challenge problems posed by CPCX, doing marginally better than the AUTO2FIT software in a few cases. 
Keywords:  Nonlinear least squares; curve fitting; Differential Evolution; global optimization; AUTO2FIT; CPCX Software; NIST; National Institute of Standards and Technology; test problems 
JEL:  C61 C13 C63 C20 
Date:  2007–08–29 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:4634&r=ecm 
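A bare-bones DE/rand/1/bin implementation applied to a small NIST-style exponential fit (toy data generated here, not one of the actual NIST test problems):

```python
import numpy as np

rng = np.random.default_rng(9)

def de_minimize(obj, bounds, n_pop=40, n_gen=300, F=0.7, CR=0.9):
    """DE/rand/1/bin: mutate with a scaled difference of two random members,
    binomial crossover against the target, greedy selection."""
    lo, hi = np.array(bounds, float).T
    d = len(lo)
    pop = lo + rng.random((n_pop, d)) * (hi - lo)
    cost = np.array([obj(p) for p in pop])
    for _ in range(n_gen):
        for i in range(n_pop):
            a, b, c = pop[rng.choice(n_pop, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(d) < CR
            cross[rng.integers(d)] = True      # ensure at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            tc = obj(trial)
            if tc <= cost[i]:                  # greedy selection
                pop[i], cost[i] = trial, tc
    return pop[cost.argmin()], cost.min()

# fit y = b1 * exp(b2 * x) to noisy data by nonlinear least squares
x = np.linspace(0, 2, 50)
y = 2.5 * np.exp(0.9 * x) + rng.normal(0, 0.05, 50)
sse = lambda p: np.sum((y - p[0] * np.exp(p[1] * x)) ** 2)
best, best_sse = de_minimize(sse, [(0.1, 10.0), (-2.0, 2.0)])
```

Because DE only compares objective values, it needs no derivatives and no good starting point, which is precisely what makes it attractive for the hard NIST problems where gradient-based least squares solvers stall.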