nep-ecm New Economics Papers
on Econometrics
Issue of 2007‒09‒02
twelve papers chosen by
Sune Karlsson
Örebro University

  1. Adaptive Pointwise Estimation in Time-Inhomogeneous Time-Series Models By Cizek, P.; Haerdle, W.; Spokoiny, V.
  2. An Information Theoretic Approach to Flexible Stochastic Frontier Models By Douglas Miller
  3. Note on Integer-Valued Bilinear Time Series Models By Drost, F.C.; Akker, R. van den; Werker, B.J.M.
  4. Duration Models and Point Processes By Jean-Pierre Florens; Denis Fougère; Michel Mouchart
  5. Empirical Asset Pricing and Statistical Power in the Presence of Weak Risk Factors By A. Craig Burnside
  6. Simulation Experiments in Practice: Statistical Design and Regression Analysis By Kleijnen, J.P.C.
  7. A Monte Carlo Study of Efficiency Estimates from Frontier Models By William C. Horrace; Seth O. Richards
  8. Statistical Testing of Optimality Conditions in Multiresponse Simulation-based Optimization (Revision of 2005-81) By Bettonvil, B.W.M.; Castillo, E. del; Kleijnen, J.P.C.
  9. Testing for Instability in Factor Structure of Yield Curves By Dennis Philip; Chihwa Kao; Giovanni Urga
  10. Random Multiclass Classification: Generalizing Random Forests to Random MNL and Random NB By A. PRINZIE; D. VAN DEN POEL
  11. Measuring the persistence of spatial autocorrelation: How long does the spatial connection between housing markets last? By Ryan R. Brady
  12. Performance of Differential Evolution Method in Least Squares Fitting of Some Typical Nonlinear Curves By Mishra, SK

  1. By: Cizek, P.; Haerdle, W.; Spokoiny, V. (Tilburg University, Center for Economic Research)
    Abstract: This paper offers a new method for estimating and forecasting linear and nonlinear time series when the stationarity assumption is violated. Our general local parametric approach applies in particular to general varying-coefficient parametric models, such as AR or GARCH, whose coefficients may vary arbitrarily with time. Global parametric, smooth transition, and change-point models are special cases. The method is based on an adaptive pointwise selection of the largest interval of homogeneity with a given right endpoint by a local change-point analysis. We construct locally adaptive estimates that can perform this task and investigate them both theoretically and by Monte Carlo simulations. In the particular case of GARCH estimation, the proposed method is applied to stock-index series and is shown to outperform the standard parametric GARCH model.
    Keywords: adaptive pointwise estimation;autoregressive models;conditional heteroscedasticity models;local time-homogeneity
    JEL: C13 C14 C22
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200735&r=ecm
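    Illustration (paper 1): a minimal sketch of the adaptive idea described in the abstract, assuming a local-constant model and a crude two-sample statistic in place of the authors' likelihood-based local change-point test; the function name, candidate window lengths, and critical value are illustrative, not taken from the paper.

      import numpy as np

      def largest_homogeneous_interval(y, t, candidates=(20, 40, 80, 160), crit=2.0):
          """Adaptive window choice at right endpoint t: grow the window
          over nested candidate lengths and keep the largest one whose two
          halves look homogeneous (a stand-in for the paper's test)."""
          chosen = candidates[0]            # assumes t >= candidates[0]
          for n in candidates:
              if n > t:
                  break
              w = y[t - n:t]
              a, b = w[: n // 2], w[n // 2:]
              # two-sample statistic for equal means of the two halves
              s = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
              if abs(a.mean() - b.mean()) / s > crit:
                  break                     # homogeneity rejected: keep previous window
              chosen = n
          return y[t - chosen:t].mean()     # local parameter estimate

    In the paper the local model is an AR or GARCH specification and the test is a local change-point statistic, but the structure (nested windows, sequential testing, keep the last accepted window) is the same.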
  2. By: Douglas Miller (Department of Economics, University of Missouri-Columbia)
    Abstract: Parametric stochastic frontier models have a long history in applied production economics, but the class of tractable parametric models is relatively small. Consequently, researchers have recently considered nonparametric alternatives such as kernel density estimators, functional approximations, and data envelopment analysis (DEA). The purpose of this paper is to present an information theoretic approach to constructing more flexible classes of parametric stochastic frontier models. Further, the proposed class of models nests all of the commonly used parametric methods as special cases, and the proposed modeling framework provides a comprehensive means to conduct model specification tests. The modeling framework is also extended to develop information theoretic measures of mean technical efficiency and to construct a profile likelihood estimator of the stochastic frontier model.
    Keywords: Kullback-Leibler information criterion, output distance function, profile likelihood, stochastic frontier, technical efficiency
    JEL: C13 C21 C51
    Date: 2007–07–16
    URL: http://d.repec.org/n?u=RePEc:umc:wpaper:0717&r=ecm
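    Illustration (paper 2): the paper's information theoretic class is not reproduced here; as a point of reference, a sketch of the normal/half-normal stochastic frontier log-likelihood (Aigner-Lovell-Schmidt), one of the commonly used parametric models the abstract says is nested as a special case. Parameter names are illustrative.

      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import minimize

      def neg_loglik(theta, y, X):
          """Normal/half-normal stochastic frontier y = Xb + v - u,
          with v ~ N(0, sv^2) and u ~ |N(0, su^2)|."""
          b, ls_v, ls_u = theta[:-2], theta[-2], theta[-1]
          sv, su = np.exp(ls_v), np.exp(ls_u)   # log-parametrized, so positive
          sig = np.hypot(sv, su)                # sqrt(sv^2 + su^2)
          lam = su / sv
          eps = y - X @ b                       # composed error
          ll = (np.log(2) - np.log(sig) + norm.logpdf(eps / sig)
                + norm.logcdf(-eps * lam / sig))
          return -ll.sum()

      # e.g. minimize(neg_loglik, x0, args=(y, X), method="BFGS")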
  3. By: Drost, F.C.; Akker, R. van den; Werker, B.J.M. (Tilburg University, Center for Economic Research)
    Abstract: This note reconsiders the nonnegative integer-valued bilinear processes introduced by Doukhan, Latour, and Oraichi (2006). Using a hidden Markov argument, we extend their result on the existence of a stationary solution for the INBL(1,0,1,1) process to the class of superdiagonal INBL(p,q,m,n) models. Our approach also yields improved parameter restrictions for several moment conditions compared to the ones in Doukhan, Latour, and Oraichi (2006).
    Keywords: count data;integer-valued time series;bilinear model
    JEL: C22
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200747&r=ecm
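    Illustration (paper 3): a simulation sketch of an INBL(1,0,1,1) process, assuming the recursion X_t = a∘X_{t-1} + b∘(X_{t-1}·eps_{t-1}) + eps_t with Poisson innovations, where a∘N denotes binomial thinning (a Binomial(N, a) draw); the exact recursion should be checked against Doukhan, Latour, and Oraichi (2006). The note's parameter restrictions concern when such a recursion admits a stationary solution with finite moments.

      import numpy as np

      rng = np.random.default_rng(0)

      def thin(a, n):
          """Binomial thinning: a ∘ n is a Binomial(n, a) draw."""
          return rng.binomial(int(n), a)

      def simulate_inbl(T, a=0.3, b=0.2, lam=1.0, burn=200):
          """Simulate the assumed INBL(1,0,1,1) recursion with
          Poisson(lam) innovations; the first `burn` draws are dropped."""
          x = np.zeros(T + burn, dtype=int)
          eps = rng.poisson(lam, T + burn)
          for t in range(1, T + burn):
              x[t] = thin(a, x[t - 1]) + thin(b, x[t - 1] * eps[t - 1]) + eps[t]
          return x[burn:]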
  4. By: Jean-Pierre Florens (Toulouse School of Economics, Institut Universitaire de France); Denis Fougère (CNRS, CREST-INSEE, CEPR and IZA); Michel Mouchart (Université Catholique de Louvain)
    Abstract: This survey is devoted to the statistical analysis of duration models and point processes. Section 1 introduces specific concepts and definitions for single-spell duration models. Section 2 is devoted to the presentation of conditional duration models which incorporate the effects of explanatory variables. Competing risks models are presented in Section 3. Section 4 is concerned with statistical inference, with a special emphasis on non- and semi-parametric estimation of single-spell duration models. Section 5 sets forth the main definitions for point and counting processes. Section 6 presents important elementary examples of point processes, namely Poisson, Markov and semi-Markov processes. The last section presents a general semi-parametric framework for studying point processes with explanatory variables.
    Keywords: duration models, hazard function, point processes, Markov chains, semi-Markovian processes
    JEL: C41 C33 C44 C51
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp2971&r=ecm
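    Illustration (paper 4): the survey's basic objects are the survivor function S(t) and the hazard h(t) = f(t)/S(t). A standard nonparametric ingredient of such analyses, not specific to this survey, is the Kaplan-Meier survivor estimate from censored spells; a minimal sketch:

      import numpy as np

      def kaplan_meier(t, d):
          """Kaplan-Meier survivor estimate from durations t and censoring
          indicators d (1 = exit observed, 0 = right-censored):
          S(s) = prod over exit times u <= s of (1 - deaths(u) / at_risk(u))."""
          t, d = np.asarray(t, float), np.asarray(d, int)
          times = np.unique(t[d == 1])      # observed exit times
          S = np.ones_like(times)
          surv = 1.0
          for i, s in enumerate(times):
              at_risk = np.sum(t >= s)      # spells still at risk just before s
              deaths = np.sum((t == s) & (d == 1))
              surv *= 1.0 - deaths / at_risk
              S[i] = surv
          return times, S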
  5. By: A. Craig Burnside
    Abstract: The risk factors in many consumption-based asset pricing models display statistically weak correlation with the returns being priced. Some GMM-based procedures used to test these models have very low power to reject proposed stochastic discount factors (SDFs) when they are misspecified and the covariance matrix of the asset returns with the risk factors has less than full column rank. Consequently, these estimators provide potentially misleading positive assessments of the SDFs. Working with SDFs specified in terms of demeaned risk factors improves the performance of GMM but the power to reject misspecified SDFs may remain low. Two summary tests for failure of the rank condition have reasonable power, and lead to no Type I errors in Monte Carlo experiments.
    JEL: C33 F31 G12
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:13357&r=ecm
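    Illustration (paper 5): the failure mode flagged in the abstract is reduced column rank of the covariance matrix between returns and factors. An informal diagnostic (not the paper's formal summary tests) is to inspect the singular values of the sample covariance matrix:

      import numpy as np

      def rank_diagnostic(R, F, tol=1e-8):
          """Singular values of the sample covariance between asset returns
          R (T x N) and risk factors F (T x K); near-zero values signal the
          reduced-rank problem associated with weak factors."""
          Rc, Fc = R - R.mean(axis=0), F - F.mean(axis=0)
          C = Rc.T @ Fc / len(R)                    # N x K covariance matrix
          s = np.linalg.svd(C, compute_uv=False)
          return s, int(np.sum(s > tol * s.max()))  # values and numerical rank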
  6. By: Kleijnen, J.P.C. (Tilburg University, Center for Economic Research)
    Abstract: In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naive methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independently distributed with a constant variance; moreover, the regression (meta)model of the simulation model's I/O behaviour is assumed to have residuals with zero means. This article addresses the following practical questions: (i) How realistic are these assumptions, in practice? (ii) How can these assumptions be tested? (iii) If assumptions are violated, can the simulation's I/O data be transformed such that the assumptions do hold? (iv) If not, which alternative statistical methods can then be applied?
    Keywords: metamodel;experimental design;jackknife;bootstrap;common random numbers;validation
    JEL: C0 C1 C9
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200730&r=ecm
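    Illustration (paper 6): the article's recommended alternative to changing one factor at a time is a designed experiment plus a regression metamodel. A minimal sketch, with a 2^k full factorial in coded units and a first-order polynomial fit (design choices are illustrative):

      import itertools
      import numpy as np

      def factorial_metamodel(simulate, k, reps=5):
          """Fit a first-order regression metamodel to a simulation model
          over a 2^k full factorial design in coded units (-1, +1).
          `simulate` maps a k-vector of factor settings to a random response."""
          design = np.array(list(itertools.product([-1, 1], repeat=k)), float)
          X = np.column_stack([np.ones(len(design)), design])  # intercept + main effects
          y = np.array([np.mean([simulate(x) for _ in range(reps)])
                        for x in design])                      # replicated runs
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          return beta, y - X @ beta         # estimates and residuals

    The residuals are exactly what the article's questions (i)-(iv) concern: are they roughly normal, independent, homoscedastic, and zero-mean, and what should be done if not?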
  7. By: William C. Horrace (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Seth O. Richards
    Abstract: Parametric stochastic frontier models yield firm-level conditional distributions of inefficiency that are truncated normal. Given these distributions, how should one assess and rank firm-level efficiency? This study compares the techniques of estimating (a) the conditional means of inefficiency and (b) the probabilities that firms are most or least efficient. Monte Carlo experiments suggest that the efficiency probabilities are more reliable in terms of mean absolute percent error when inefficiency has large variation across firms. Along the way we tackle some interesting problems associated with simulating and assessing estimator performance in the stochastic frontier environment.
    Keywords: Truncated normal, stochastic frontier, efficiency, multivariate probabilities.
    JEL: C12 C16 C44 D24
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:97&r=ecm
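    Illustration (paper 7): technique (a) in the abstract is the conditional mean of inefficiency of Jondrow, Lovell, Materov and Schmidt (1982) for the normal/half-normal model; a sketch of that standard formula (variable names illustrative):

      import numpy as np
      from scipy.stats import norm

      def jlms(eps, sv, su):
          """E[u | eps] for the normal/half-normal frontier with composed
          error eps = v - u: u | eps is N(mu*, s*^2) truncated at zero."""
          s2 = sv**2 + su**2
          mu_star = -eps * su**2 / s2
          s_star = sv * su / np.sqrt(s2)
          z = mu_star / s_star
          return mu_star + s_star * norm.pdf(z) / norm.cdf(z)

    Technique (b) instead ranks firms by the probability that each is the most (or least) efficient, which the paper evaluates as multivariate probabilities over the firms' conditional inefficiency distributions.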
  8. By: Bettonvil, B.W.M.; Castillo, E. del; Kleijnen, J.P.C. (Tilburg University, Center for Economic Research)
    Abstract: This paper studies simulation-based optimization with multiple outputs. It assumes that the simulation model has one random objective function and must satisfy given constraints on the other random outputs. It presents a statistical procedure for testing whether a specific input combination (proposed by some optimization heuristic) satisfies the Karush-Kuhn-Tucker (KKT) first-order optimality conditions. The paper focuses on "expensive" simulations, which have small sample sizes. The paper applies the classic t test to check whether the specific input combination is feasible, and whether any constraints are binding; it applies bootstrapping (resampling) to test the estimated gradients in the KKT conditions. The new methodology is applied to three examples, with encouraging empirical results.
    Keywords: Stopping rule; metaheuristics; response surface methodology; design of experiments
    JEL: C0 C1 C9 C15 C44 C61
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200745&r=ecm
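    Illustration (paper 8): a stripped-down version of the bootstrap step, assuming replicated gradient estimates of the random objective at the candidate point and testing the unconstrained stationarity condition E[gradient] = 0; the paper's actual procedure tests the full KKT system and uses classic t tests for feasibility and binding constraints.

      import numpy as np

      def bootstrap_zero_gradient_test(G, B=2000, seed=0):
          """Bootstrap p-value for H0: the expected gradient is zero.
          G is an (m, k) array of m replicated gradient estimates."""
          rng = np.random.default_rng(seed)
          m = len(G)
          obs = np.linalg.norm(G.mean(axis=0))
          Gc = G - G.mean(axis=0)           # center rows to impose H0
          stats = np.array([np.linalg.norm(Gc[rng.integers(0, m, m)].mean(axis=0))
                            for _ in range(B)])
          return float((stats >= obs).mean())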
  9. By: Dennis Philip (Cass Business School, City University, 106 Bunhill Road, London EC1Y 8TZ, UK); Chihwa Kao (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Giovanni Urga (Cass Business School, City University, 106 Bunhill Row, London EC1Y 8TZ, U.K.)
    Abstract: A widely relied upon but formally untested consideration is the issue of stability in the factors underlying the term structure of interest rates. In testing for stability, practitioners as well as academics have employed ad hoc techniques such as splitting the sample into a few sub-periods and determining whether the factor loadings appear similar over all sub-periods. Various authors have found mixed evidence on stability in the factors. In this paper we develop a formal testing procedure to evaluate the factor structure stability of the US zero coupon yield term structure. We find the factor structure of level to be unstable over the sample period considered. The slope and curvature factor structures are however found to be stable. Common structural changes affecting all interest rate maturities have fostered instability in the level factor. We corroborate the literature finding that the variances (volatility) explained by the level, slope, and curvature factors are unstable over time. We find that the volatility of the slope factor is sensitive to shocks affecting the short rates and the volatility of the curvature factor is sensitive to shocks affecting the medium and long rates. Finally, we find evidence of common economic shocks affecting the level and slope factors, unlike the slope and curvature factors, which responded differently to economic shocks and were unaffected by any common instabilities.
    Keywords: Stability, factor structure, principal component analysis, term structure of interest rates.
    JEL: C12 C13 C14 C51
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:96&r=ecm
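    Illustration (paper 9): the informal practice the paper replaces with a formal test can be sketched as follows: extract principal-component loadings (level, slope, curvature) in sub-periods and compare them by eye. The split and the comparison metric below are illustrative.

      import numpy as np

      def pc_loadings(Y, k=3):
          """First k principal-component loadings of a T x m yield panel."""
          Yc = Y - Y.mean(axis=0)
          _, _, Vt = np.linalg.svd(Yc, full_matrices=False)
          return Vt[:k].T                   # m x k loading matrix

      def split_sample_loadings(Y, k=3):
          """Ad hoc stability check: compare loadings across two halves."""
          T = len(Y)
          L1, L2 = pc_loadings(Y[: T // 2], k), pc_loadings(Y[T // 2 :], k)
          for j in range(k):                # PC signs are arbitrary: align them
              if L1[:, j] @ L2[:, j] < 0:
                  L2[:, j] = -L2[:, j]
          return np.abs(L1 - L2).max(axis=0)  # max loading change per factor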
  10. By: A. PRINZIE; D. VAN DEN POEL
    Abstract: Random Forests (RF) is a successful classifier exhibiting performance comparable to Adaboost, but it is more robust. The exploitation of two sources of randomness, random inputs (bagging) and random features, makes RF an accurate classifier in several domains. We hypothesize that methods other than classification or regression trees could also benefit from injecting randomness. This paper generalizes the RF framework to other multiclass classification algorithms, like the well-established MultiNomial Logit (MNL) and Naive Bayes (NB). We propose Random MNL (RMNL) as a new bagged classifier combining a forest of MNLs estimated with randomly selected features. Analogously, we introduce Random Naive Bayes (RNB). We benchmark the predictive performance of RF, RMNL and RNB against state-of-the-art SVM classifiers. RF, RMNL and RNB outperform SVM. Moreover, generalizing RF seems promising, as reflected by the improved predictive performance of RMNL.
    Date: 2007–06
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:07/469&r=ecm
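    Illustration (paper 10): a sketch of the Random MNL idea, i.e. a forest of multinomial logits, each fit on a bootstrap sample with a random feature subset, aggregated by averaging predicted class probabilities. The hyperparameters and the use of scikit-learn are illustrative assumptions, not the authors' implementation.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      class RandomMNL:
          """Bagged multinomial logits on random feature subsets."""

          def __init__(self, n_estimators=50, max_features=0.5, seed=0):
              self.n_estimators, self.max_features = n_estimators, max_features
              self.rng = np.random.default_rng(seed)
              self.models = []

          def fit(self, X, y):
              n, p = X.shape
              k = max(1, int(self.max_features * p))
              for _ in range(self.n_estimators):
                  rows = self.rng.integers(0, n, n)            # bootstrap sample
                  cols = self.rng.choice(p, k, replace=False)  # random features
                  m = LogisticRegression(max_iter=1000)
                  m.fit(X[rows][:, cols], y[rows])             # assumes every class
                  self.models.append((m, cols))                # appears in each sample
              return self

          def predict_proba(self, X):
              # average the forest's class-probability forecasts
              return np.mean([m.predict_proba(X[:, c]) for m, c in self.models],
                             axis=0)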
  11. By: Ryan R. Brady (United States Naval Academy)
    Abstract: How fast, how long, and to what magnitude does a change in housing prices in one region affect its neighbors? In this paper, I apply a time series technique for measuring impulse response functions from linear projections to a spatial autoregressive model of housing prices. For a dynamic panel of California counties, the data reveal that spatial autocorrelation between regional housing prices is highly persistent over time, lasting up to two and a half years. This result, and the econometric techniques employed, should be of interest not only to housing and regional economists, but to a variety of applied econometricians as well.
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:usn:usnawp:19&r=ecm
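    Illustration (paper 11): the time series technique referred to is estimating impulse responses by linear projections: for each horizon h, regress the outcome at t+h on the impulse variable at t plus controls, and read the response off the impulse coefficient. A minimal sketch with lagged outcomes as the only controls (in the paper the impulse involves a spatial lag of neighboring counties' prices):

      import numpy as np

      def local_projection_irf(y, x, H=10, lags=4):
          """Impulse response of y to x by linear projections: one OLS
          regression of y[t+h] on x[t] and lagged y per horizon h."""
          y, x = np.asarray(y, float), np.asarray(x, float)
          irf = []
          for h in range(H + 1):
              t = np.arange(lags, len(y) - h)
              X = np.column_stack([np.ones(len(t)), x[t]]
                                  + [y[t - l] for l in range(1, lags + 1)])
              beta, *_ = np.linalg.lstsq(X, y[t + h], rcond=None)
              irf.append(beta[1])           # coefficient on x[t] at horizon h
          return np.array(irf)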
  12. By: Mishra, SK
    Abstract: No foolproof method exists to fit nonlinear curves to data or estimate the parameters of an intrinsically nonlinear function. Some methods succeed at solving a set of problems but fail at others. The Differential Evolution (DE) method of global optimization is a relatively recent method that has shown its power to solve difficult nonlinear optimization problems. In this study we use DE to solve some nonlinear least squares problems given by the National Institute of Standards and Technology (NIST), US Department of Commerce, and some other challenge problems posed by CPC-X Software (the makers of the AUTO2FIT software). DE solves the test problems given by NIST and most of the challenge problems posed by CPC-X, doing marginally better than the AUTO2FIT software in a few cases.
    Keywords: Nonlinear least squares; curve fitting; Differential Evolution; global optimization; AUTO2FIT; CPC-X Software; NIST; National Institute of Standards and Technology; test problems
    JEL: C61 C13 C63 C20
    Date: 2007–08–29
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:4634&r=ecm
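    Illustration (paper 12): a sketch of DE-based least squares fitting using SciPy's differential_evolution on synthetic data from a Misra1a-type curve y = b1*(1 - exp(-b2*x)); the NIST certified datasets themselves are not reproduced here, and the true parameter values below are made up for the example.

      import numpy as np
      from scipy.optimize import differential_evolution

      rng = np.random.default_rng(1)
      x = np.linspace(1, 400, 30)
      y = 240.0 * (1 - np.exp(-5e-4 * x)) + rng.normal(0, 0.5, x.size)

      def sse(b):
          """Residual sum of squares: the objective DE minimizes."""
          return np.sum((y - b[0] * (1 - np.exp(-b[1] * x))) ** 2)

      result = differential_evolution(sse, bounds=[(1, 1000), (1e-6, 1e-2)],
                                      seed=2)
      print(result.x)   # should recover roughly (240, 5e-4)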

This nep-ecm issue is ©2007 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.