nep-ecm New Economics Papers
on Econometrics
Issue of 2005–05–07
28 papers chosen by
Sune Karlsson
Örebro University

  1. Forecasting Using Time Varying Meta-Elliptical Distributions with a Study of Commodity Futures Prices By Alessio Sancetta; Arina Nikanrova
  2. Estimating the Link Function in Multinomial Response Models under Endogeneity and Quadratic Loss By George Judge; Ron Mittelhammer
  3. Testing for Separability in Household Models with Heterogeneous Behavior: A Mixture Model Approach By Renos Vakis; Elisabeth Sadoulet; Alain de Janvry; Carlo Cafiero
  4. A Simple Lagrange Multiplier F-Test for Multivariate Regression Models By Timothy Beatty; Jeffrey LaFrance; Muzhe Yang
  5. Estimation of threshold time series models using efficient jump MCMC By Terence D. Agbeyegbe; Elena Goldman
  6. Quantile regression methods for recursive structural equation models By Lingjie Ma; Roger Koenker
  7. Generalized empirical likelihood tests in time series models with potential identification failure By Patrik Guggenberger; Richard Smith
  8. Estimating average partial effects under conditional moment independence assumptions By Jeffrey M. Wooldridge
  9. Nonparametric estimation of nonadditive hedonic models By James Heckman; Rosa Matzkin; Lars Nesheim
  10. On the robustness of fixed effects and related estimators in correlated random coefficient panel data models. By Jeffrey M. Wooldridge
  11. Ill-conditioned problems, Fisher information and weak instruments By Giovanni Forchini; Grant Hillier
  12. Inverse probability weighted estimation for general missing data problems By Jeffrey M. Wooldridge
  13. Nonparametric inference for unbalanced time series data By Oliver Linton
  14. Nonparametric estimation of an additive quantile regression model By Joel Horowitz; Simon Lee
  15. Endogeneity in quantile regression models: a control function approach By Simon Lee
  16. Identification of sensitivity to variation in endogenous variables By Andrew Chesher
  17. Identification in additive error models with discrete endogenous variables By Andrew Chesher
  18. The Bootstrap and the Edgeworth Correction for Semiparametric Averaged Derivatives By Y. Nishiyama; Peter Robinson
  19. Testing a parametric model against a nonparametric alternative with identification through instrumental variables By Joel Horowitz
  20. A nonparametric test of exogeneity By Richard Blundell; Joel Horowitz
  21. Automatic positive semi-definite HAC covariance matrix and GMM estimation By Richard Smith
  22. Nonparametric methods for the characteristic model By Laura Blow; Martin Browning; Ian Crawford
  23. GEL Criteria for Moment Condition Models By Richard Smith
  24. Structural Equations, Treatment Effects and Econometric Policy Evaluation By James J. Heckman; Edward Vytlacil
  25. Estimating Standard Errors in Finance Panel Data Sets: Comparing Approaches By Mitchell A. Petersen
  26. Understanding and Comparing Factor-Based Forecasts By Jean Boivin; Serena Ng
  27. A Hardware Generator of Multi-point Distributed Random Numbers for Monte Carlo Simulation By Nicola Bruti-Liberati; Filippo Martini; Massimo Piccardi; Eckhard Platen
  28. Testing for a Unit Root against Transitional Autoregressive Models By Joon Y. Park; Mototsugu Shintani

  1. By: Alessio Sancetta; Arina Nikanrova
    Abstract: We propose a methodological approach to the forecasting and evaluation of multivariate distributions with time-varying parameters. For reasons related to feasible inference, attention is restricted to meta-elliptical distributions. We use our approach to study a large data set of 16 commodity prices. Our approach leads to a theory for model validation that avoids common problems caused by discontinuities, time variation of parameters and nuisance parameters.
    Keywords: Commodity Prices, Copula Function, Meta-Elliptical Distribution, Nonparametric Estimation, Weibull Distribution.
    JEL: C14 C16 C31 C32
    Date: 2005–05
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:0516&r=ecm
  2. By: George Judge (University of California, Berkeley and Giannini Foundation); Ron Mittelhammer (Washington State University)
    Abstract: This paper considers estimation and inference for the multinomial response model in the case where endogenous variables are arguments of the unknown link function. Semiparametric estimators are proposed that avoid the parametric assumptions underlying the likelihood approach as well as the loss of precision when using nonparametric estimation. A data-based shrinkage estimator that seeks an optimal combination of estimators and results in superior risk performance under quadratic loss is also developed.
    Keywords: multinomial process, endogeneity, empirical likelihood procedures, quadratic loss, semiparametric estimation and inference, data dependent shrinkage, asymptotic and finite sample risk
    Date: 2004–02–03
    URL: http://d.repec.org/n?u=RePEc:cdl:agrebk:970&r=ecm
  3. By: Renos Vakis (World Bank); Elisabeth Sadoulet (University of California, Berkeley); Alain de Janvry (University of California, Berkeley); Carlo Cafiero (Universita degli Studi di Napoli Federico II)
    Abstract: Knowing whether a household behaves according to separability or non-separability is needed for the correct modeling of production decisions. We propose a test superior to those found in the separability literature: we use a mixture distribution approach to estimate the probability that a farm household behaves according to non-separability, and we test whether the determinants of consumption affect production decisions for households categorized as non-separable. With non-separability attributed to labor market constraints, the switcher equation shows that Peruvian farm households that are indigenous and young, with low levels of education and a lack of local employment opportunities, are more likely to be constrained on the labor market.
    Keywords: labor, separability, mixture distributions, Peru
    Date: 2004–08–01
    URL: http://d.repec.org/n?u=RePEc:cdl:agrebk:990&r=ecm
  4. By: Timothy Beatty (University of British Columbia); Jeffrey LaFrance (University of California, Berkeley); Muzhe Yang (University of California, Berkeley)
    Abstract: This paper proposes a straightforward, easy-to-implement approximate F-test that is useful for testing restrictions in multivariate regression models. We derive the asymptotics for our test statistic and investigate its finite sample properties through a series of Monte Carlo experiments. Theory suggests, and the simulations confirm, that our approach results in strictly better inference than the leading alternative.
    Keywords: econometric models, Monte Carlo analysis, multivariate analysis, regression models
    Date: 2005–02–01
    URL: http://d.repec.org/n?u=RePEc:cdl:agrebk:996&r=ecm
  5. By: Terence D. Agbeyegbe (Hunter College, CUNY); Elena Goldman (Lubin School of Business, Pace University)
    Abstract: This paper shows how a Metropolis-Hastings algorithm with efficient jump can be constructed for the estimation of multiple-threshold time series models of U.S. short-term interest rates. The results show that interest rates are persistent in the lower regime and exhibit weak mean reversion in the upper regime. For model selection and specification, several techniques are used, such as marginal likelihood and information criteria, as well as estimation with and without truncation restrictions imposed on the thresholds.
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:htr:hcecon:406&r=ecm
  6. By: Lingjie Ma; Roger Koenker (Institute for Fiscal Studies and University of Illinois)
    Abstract: Two classes of quantile regression estimation methods for the recursive structural equation models of Chesher (2003) are investigated. A class of weighted average derivative estimators based directly on the identification strategy of Chesher is contrasted with a new control variate estimation method. The latter imposes stronger restrictions, achieving an asymptotic efficiency bound with respect to the former class. An application of the methods to the study of the effect of class size on the performance of Dutch primary school students shows that (i) reductions in class size are beneficial for good students in language and for weaker students in mathematics, (ii) larger classes appear beneficial for weaker language students, and (iii) the impact of class size on both mean and median performance is negligible.
    Date: 2004–02
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp01/04&r=ecm
  7. By: Patrik Guggenberger; Richard Smith (Institute for Fiscal Studies and University of Warwick)
    Abstract: We introduce test statistics based on generalized empirical likelihood methods that can be used to test simple hypotheses involving the unknown parameter vector in moment condition time series models. The test statistics generalize those in Guggenberger and Smith (2005) from the i.i.d. to the time series context and are alternatives to those in Kleibergen (2001) and Otsu (2003). The main feature of these tests is that their empirical null rejection probabilities are not affected much by the strength or weakness of identification. More precisely, we show that the statistics are asymptotically distributed as chi-square under both classical asymptotic theory and weak instrument asymptotics of Stock and Wright (2000). A Monte Carlo study reveals that the finite-sample performance of the suggested tests is very competitive.
    Keywords: Generalized Empirical Likelihood, Nonlinear Moment Conditions, Similar Tests, Size Distortion, Weak Identification
    JEL: C12 C31
    Date: 2005–04
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp01/05&r=ecm
  8. By: Jeffrey M. Wooldridge
    Abstract: I show how to identify and estimate the average partial effect of explanatory variables in a model where unobserved heterogeneity interacts with the explanatory variables and may be unconditionally correlated with the explanatory variables. To identify the population-averaged effects, I use extensions of ignorability assumptions that are used for estimating linear models with additive heterogeneity and for estimating average treatment effects. New estimators are obtained for estimating the unconditional average partial effect as well as the average partial effect conditional on functions of observed covariates.
    Date: 2004–03
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp03/04&r=ecm
  9. By: James Heckman (Institute for Fiscal Studies and University of Chicago); Rosa Matzkin; Lars Nesheim (Institute for Fiscal Studies)
    Abstract: We analyze equilibria in hedonic economies and study conditions that lead to identification of structural preference parameters in hedonic economies with both additive and nonadditive marginal utility and marginal product functions. The latter class is more general, allows for heterogeneity in the curvature of consumer utility, and can result in conditions that lead to bunching. Such bunching has been largely ignored in the previous literature. We then present methods to estimate marginal utility and marginal product functions that are nonadditive in the unobservable random terms, using observations from a single hedonic equilibrium market. These methods are important when statistical tests reject additive specifications or when prior information suggests that consumer or firm heterogeneity in the curvature of utility or production functions is likely to be significant. We provide conditions under which these types of utility and production functions are nonparametrically identified, and we propose nonparametric estimators for them. The estimators are shown to be consistent and asymptotically normal. When the assumptions required to use single market methods are unjustified, we show how multimarket data can be used to estimate the structural functions.
    Date: 2005–03
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp03/05&r=ecm
  10. By: Jeffrey M. Wooldridge
    Abstract: I show that a class of fixed effects estimators is reasonably robust for estimating the population-averaged slope coefficients in panel data models with individual-specific slopes, where the slopes are allowed to be correlated with the covariates. In addition to including the usual fixed effects estimator, the results apply to estimators that eliminate individual-specific trends. Further, asymptotic variance matrices are straightforward to estimate. I apply the results, and propose alternative estimators, to the estimation of average treatment effects in a general class of unobserved effects models.
    Date: 2004–06
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp04/04&r=ecm
  11. By: Giovanni Forchini; Grant Hillier
    Abstract: The existence of a uniformly consistent estimator for a particular parameter is well-known to depend on the uniform continuity of the functional that defines the parameter in terms of the model. Recently, Pötscher (Econometrica, 70, pp 1035 - 1065) showed that estimator risk may be bounded below by a term that depends on the oscillation (osc) of the functional, thus making the connection between continuity and risk quite explicit. However, osc has no direct statistical interpretation. In this paper we slightly modify the definition of osc so that it reflects a (generalized) derivative (der) of the functional. We show that der can be directly related to the familiar statistical concepts of Fisher information and identification, and also to the condition numbers that are used to measure ‘distance from an ill-posed problem’ in other branches of applied mathematics. We begin the analysis assuming a fully parametric setting, but then generalize to the nonparametric case, where the inverse of the Fisher information matrix is replaced by the covariance matrix of the efficient influence function. The results are applied to a number of examples, including the structural equation model, spectral density estimation, and estimation of variance and precision.
    Keywords: Continuity, Derivative, Divergence, Fisher Information, Ill-conditioned problem, Ill-posed problem, Interest-functional, Oscillation, Precision
    Date: 2005–04
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp04/05&r=ecm
  12. By: Jeffrey M. Wooldridge
    Abstract: I study inverse probability weighted M-estimation under a general missing data scheme. The cases covered that do not previously appear in the literature include M-estimation with missing data due to a censored survival time, propensity score estimation of the average treatment effect for linear exponential family quasi-log-likelihood functions, and variable probability sampling with observed retainment frequencies. I extend an important result known to hold in special cases: estimating the selection probabilities is generally more efficient than if the known selection probabilities could be used in estimation. For the treatment effect case, the setup allows for a simple characterization of a “double robustness” result due to Scharfstein, Rotnitzky, and Robins (1999): given appropriate choices for the conditional mean function and quasi-log-likelihood function, only one of the conditional mean or selection probability needs to be correctly specified in order to consistently estimate the average treatment effect.
    JEL: C13 C21 C23
    Date: 2004–04
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp05/04&r=ecm
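    A minimal Python sketch of the inverse probability weighting idea described above, for the average treatment effect case (the simulated data, the logit propensity-score model and all variable names are illustrative assumptions, not details taken from the paper):

      # Inverse probability weighted estimation of an average treatment
      # effect on simulated data (illustrative sketch only).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 5000
      x = rng.normal(size=n)                      # observed covariate
      p = 1.0 / (1.0 + np.exp(-(0.5 + x)))        # true selection probability
      d = rng.binomial(1, p)                      # treatment indicator
      y = 1.0 + 2.0 * d + x + rng.normal(size=n)  # outcome, true ATE = 2

      # Step 1: estimate the selection probability with a simple logit fit
      # by Newton-Raphson (any consistent estimator would do here).
      def logit_fit(X, d, iters=50):
          beta = np.zeros(X.shape[1])
          for _ in range(iters):
              mu = 1.0 / (1.0 + np.exp(-(X @ beta)))
              grad = X.T @ (d - mu)
              hess = (X * (mu * (1.0 - mu))[:, None]).T @ X
              beta += np.linalg.solve(hess, grad)
          return beta

      X = np.column_stack([np.ones(n), x])
      p_hat = 1.0 / (1.0 + np.exp(-(X @ logit_fit(X, d))))

      # Step 2: weight each observation by the inverse of its estimated
      # selection probability and average.
      ate_ipw = np.mean(d * y / p_hat - (1 - d) * y / (1 - p_hat))
      print("IPW estimate of the ATE:", ate_ipw)  # close to 2 in large samples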
  13. By: Oliver Linton (Institute for Fiscal Studies and London School of Economics)
    Abstract: Estimation of heteroskedasticity and autocorrelation consistent covariance matrices (HACs) is a well established problem in time series. Results have been established under a variety of weak conditions on temporal dependence and heterogeneity that allow one to conduct inference on a variety of statistics; see Newey and West (1987), Hansen (1992), de Jong and Davidson (2000), and Robinson (2004). Indeed there is an extensive literature on automating these procedures, starting with Andrews (1991). Alternative methods for conducting inference include the bootstrap, for which there is also now a very active research program in time series; see Lahiri (2003) for an overview. One convenient method for time series is the subsampling approach of Politis, Romano, and Wolf (1999). This method was used by Linton, Maasoumi, and Whang (2003) (henceforth LMW) in the context of testing for stochastic dominance. This paper is concerned with the practical problem of conducting inference in a vector time series setting when the data are unbalanced or incomplete. In this case, one can work only with the common sample, to which a standard HAC/bootstrap theory applies, but at the expense of throwing away data and perhaps losing efficiency. An alternative is to use some sort of imputation method, but this requires additional modelling assumptions, which we would rather avoid. We show how the sampling theory changes and how to modify the resampling algorithms to accommodate the problem of missing data. We also discuss efficiency and power. Unbalanced data of the type we consider are quite common in financial panel data; see for example Connor and Korajczyk (1993). These data also occur in cross-country studies.
    Date: 2004–04
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp06/04&r=ecm
  14. By: Joel Horowitz (Institute for Fiscal Studies and Northwestern University); Simon Lee (Institute for Fiscal Studies and University College London)
    Abstract: This paper is concerned with estimating the additive components of a nonparametric additive quantile regression model. We develop an estimator that is asymptotically normally distributed with a rate of convergence in probability of n^{-r/(2r+1)} when the additive components are r-times continuously differentiable for some r ≥ 2. This result holds regardless of the dimension of the covariates and, therefore, the new estimator has no curse of dimensionality. In addition, the estimator has an oracle property and is easily extended to a generalized additive quantile regression model with a link function. The numerical performance and usefulness of the estimator are illustrated by Monte Carlo experiments and an empirical example.
    Date: 2004–04
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp07/04&r=ecm
  15. By: Simon Lee (Institute for Fiscal Studies and University College London)
    Abstract: This paper considers a linear triangular simultaneous equations model with conditional quantile restrictions. The paper adjusts for endogeneity by adopting a control function approach and presents a simple two-step estimator that exploits the partially linear structure of the model. The first step consists of estimation of the residuals of the reduced-form equation for the endogenous explanatory variable. The second step is series estimation of the primary equation with the reduced-form residual included nonparametrically as an additional explanatory variable. This paper imposes no functional form restrictions on the stochastic relationship between the reduced-form residual and the disturbance term in the primary equation conditional on observable explanatory variables. The paper presents regularity conditions for consistency and asymptotic normality of the two-step estimator. In addition, the paper provides some discussion of related estimation methods in the literature and of possible extensions and limitations of the estimation approach. Finally, the numerical performance and usefulness of the estimator are illustrated by the results of Monte Carlo experiments and two empirical examples, demand for fish and returns to schooling.
    Date: 2004–10
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp08/04&r=ecm
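    The two-step control function logic can be sketched in Python as follows (the simulated design, the cubic polynomial in the first-step residual and the least-squares second step are simplifying assumptions made for illustration; the paper itself works with a quantile objective and series estimation):

      # Two-step control function estimation with one endogenous regressor
      # (illustrative sketch on simulated data).
      import numpy as np

      rng = np.random.default_rng(1)
      n = 5000
      z = rng.normal(size=n)                  # instrument
      v = rng.normal(size=n)                  # reduced-form disturbance
      x = 0.8 * z + v                         # endogenous regressor
      u = 0.5 * v + rng.normal(size=n)        # structural error correlated with x
      y = 1.0 + 1.5 * x + u                   # structural equation, true slope 1.5

      # Step 1: reduced-form regression of x on the instrument; keep the
      # residuals as the control function.
      Z = np.column_stack([np.ones(n), z])
      v_hat = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

      # Step 2: regress y on x and a flexible function of the first-step
      # residual (a cubic polynomial stands in for the series terms).
      W = np.column_stack([np.ones(n), x, v_hat, v_hat**2, v_hat**3])
      b_cf = np.linalg.lstsq(W, y, rcond=None)[0]
      b_ols = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)[0]
      print("control-function slope:", b_cf[1])   # close to 1.5
      print("naive OLS slope:", b_ols[1])         # biased upward in this design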
  16. By: Andrew Chesher (Institute for Fiscal Studies and University College London)
    Abstract: This lecture explores conditions under which there is identification of the impact on an outcome of exogenous variation in a variable which is endogenous when data are gathered. The starting point is the Cowles Commission linear simultaneous equations model. The parametric and additive error restrictions of that model are successively relaxed, and modifications to covariation, order and rank conditions that maintain identifiability are presented. Eventually a just-identifying, non-falsifiable model permitting nonseparability of latent variates and devoid of parametric restrictions is obtained. The model requires the endogenous variable to be continuously distributed. It is shown that relaxing this restriction results in loss of point identification, but set identification is possible if an additional covariation restriction is introduced. Relaxing other restrictions presents significant challenges.
    Date: 2004–07
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp10/04&r=ecm
  17. By: Andrew Chesher (Institute for Fiscal Studies and University College London)
    Abstract: In additive error models with a discrete endogenous variable, identification cannot be achieved under a marginal covariation condition when the support of the instruments is sparse relative to the support of the endogenous variable. An iterated covariation condition with a weak monotonicity restriction is shown to have set-identifying power.
    Date: 2004–09
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp11/04&r=ecm
  18. By: Y. Nishiyama; Peter Robinson (Institute for Fiscal Studies and London School of Economics)
    Abstract: In a number of semiparametric models, smoothing seems necessary in order to obtain estimates of the parametric component which are asymptotically normal and converge at the parametric rate. However, smoothing can inflate the error in the normal approximation, so that refined approximations are of interest, especially in sample sizes that are not enormous. We show that a bootstrap distribution achieves a valid Edgeworth correction in the case of density-weighted averaged derivative estimates of semiparametric index models. Approaches to bias reduction are discussed. We also develop a higher order expansion to show that the bootstrap achieves a further reduction in size distortion in the case of two-sided testing. The finite sample performance of the methods is investigated by means of Monte Carlo simulations from a Tobit model.
    JEL: C23
    Date: 2004–10
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp12/04&r=ecm
  19. By: Joel Horowitz (Institute for Fiscal Studies and Northwestern University)
    Abstract: This paper is concerned with inference about a function g that is identified by a conditional moment restriction involving instrumental variables. The paper presents a test of the hypothesis that g belongs to a finite-dimensional parametric family against a nonparametric alternative. The test does not require nonparametric estimation of g and is not subject to the ill-posed inverse problem of nonparametric instrumental variables estimation. Under mild conditions, the test is consistent against any alternative model and has asymptotic power advantages over existing tests. Moreover, it has power arbitrarily close to 1 uniformly over a class of alternatives whose distance from the null hypothesis is O(n^{-1/2}), where n is the sample size.
    Date: 2004–09
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp14/04&r=ecm
  20. By: Richard Blundell (Institute for Fiscal Studies and University College London); Joel Horowitz (Institute for Fiscal Studies and Northwestern University)
    Abstract: This paper is concerned with inference about a function g that is identified by a conditional moment restriction involving instrumental variables. The function is nonparametric. It satisfies mild regularity conditions but is otherwise unknown. The paper presents a test of the hypothesis that g is the mean of a random variable Y conditional on a covariate X. The need to test this hypothesis arises frequently in economics. The test does not require nonparametric instrumental-variables (IV) estimation of g and is not subject to the ill-posed inverse problem that nonparametric IV estimation entails. The test is consistent whenever g differs from the conditional mean function of Y on a set of non-zero probability. Moreover, the power of the test is arbitrarily close to 1 uniformly over a set of functions g whose distance from the conditional mean function is O(n^{-1/2}), where n is the sample size.
    Keywords: Hypothesis test, instrumental variables, specification testing, consistent testing
    Date: 2004–12
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp15/04&r=ecm
  21. By: Richard Smith (Institute for Fiscal Studies and University of Warwick)
    Abstract: This paper proposes a new class of HAC covariance matrix estimators. The standard HAC estimation method re-weights estimators of the autocovariances. Here we initially smooth the data observations themselves using kernel function based weights. The resultant HAC covariance matrix estimator is the normalised outer product of the smoothed random vectors and is therefore automatically positive semi-definite. A corresponding efficient GMM criterion may also be defined as a quadratic form in the smoothed moment indicators whose normalised minimand provides a test statistic for the over-identifying moment conditions.
    Keywords: GMM, HAC Covariance Matrix Estimation, Overidentifying Moments
    JEL: C13 C30
    Date: 2004–12
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp17/04&r=ecm
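    The "smooth the observations, then take an outer product" construction can be illustrated with a short Python sketch (the flat kernel, the bandwidth and the AR(1) example are assumptions made for this sketch, not the paper's recommended choices):

      # HAC estimation by smoothing a scalar moment indicator with kernel
      # weights and taking the normalised outer product (illustrative sketch).
      import numpy as np

      rng = np.random.default_rng(2)
      T = 2000
      g = np.zeros(T)                 # AR(1) moment indicator, long-run variance 4
      for t in range(1, T):
          g[t] = 0.5 * g[t - 1] + rng.normal()
      g -= g.mean()

      S = 15                                   # bandwidth
      w = np.ones(2 * S + 1)                   # flat kernel weights k(s / S)

      # Smoothed indicators: a kernel-weighted moving average of the data.
      g_tilde = np.convolve(g, w, mode="same")

      # The normalised outer product of the smoothed series is non-negative
      # by construction, i.e. automatically positive semi-definite.
      omega_hat = (g_tilde @ g_tilde) / (T * np.sum(w ** 2))
      print("smoothed outer-product HAC estimate:", omega_hat)  # roughly 4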
  22. By: Laura Blow (Institute for Fiscal Studies); Martin Browning (Institute for Fiscal Studies and University of Copenhagen); Ian Crawford (Institute for Fiscal Studies and University of Surrey)
    Abstract: Characteristics models have been found to be useful in many areas of economics. However, their empirical implementation tends to rely heavily on functional form assumptions. In this paper we develop a revealed preference-based nonparametric approach to characteristics models. We derive the minimal necessary and sufficient empirical conditions under which data on the market behaviour of individual, heterogeneous, price-taking consumers are nonparametrically consistent with the consumer characteristics model. Where these conditions hold, we show how information may be recovered on individual consumers' marginal valuations of product attributes. In some cases marginal valuations are point identified and in other cases we can only recover bounds. Where the conditions fail, we highlight the role which the introduction of unobserved product attributes can play in rationalising the data. We implement these ideas using consumer panel data on the Danish milk market.
    Keywords: Product characteristics, revealed preference
    JEL: C43 D11
    Date: 2004–12
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp18/04&r=ecm
  23. By: Richard Smith (Institute for Fiscal Studies and University of Warwick)
    Abstract: GEL methods which generalize and extend previous contributions are defined and analysed for moment condition models specified in terms of weakly dependent data. These procedures offer alternative one-step estimators and tests that are asymptotically equivalent to their efficient two-step GMM counterparts. The basis for GEL estimation is via a smoothed version of the moment indicators using kernel function weights which incorporate a bandwidth parameter. Examples for the choice of bandwidth parameter and kernel function are provided. Efficient moment estimators based on implied probabilities derived from the GEL method are proposed, a special case of which is estimation of the stationary distribution of the data. The paper also presents a unified set of test statistics for over-identifying moment restrictions and combinations of parametric and moment restriction hypotheses.
    Keywords: GMM, Generalized Empirical Likelihood, Efficient Moment Estimation
    JEL: C13 C30
    Date: 2004–12
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:wp19/04&r=ecm
  24. By: James J. Heckman; Edward Vytlacil
    Abstract: This paper uses the marginal treatment effect (MTE) to unify the nonparametric literature on treatment effects with the econometric literature on structural estimation using a nonparametric analog of a policy invariant parameter; to generate a variety of treatment effects from a common semiparametric functional form; to organize the literature on alternative estimators; and to explore what policy questions commonly used estimators in the treatment effect literature answer. A fundamental asymmetry intrinsic to the method of instrumental variables is noted. Recent advances in IV estimation allow for heterogeneity in responses but not in choices, and the method breaks down when both choice and response equations are heterogeneous in a general way.
    JEL: C1
    Date: 2005–04
    URL: http://d.repec.org/n?u=RePEc:nbr:nberte:0306&r=ecm
  25. By: Mitchell A. Petersen
    Abstract: In both corporate finance and asset pricing empirical work, researchers are often confronted with panel data. In these data sets, the residuals may be correlated across firms and across time, and OLS standard errors can be biased. Historically, the two literatures have used different solutions to this problem. Corporate finance has relied on Rogers standard errors, while asset pricing has used the Fama-MacBeth procedure to estimate standard errors. This paper examines the different methods used in the literature and explains when the different methods yield the same (and correct) standard errors and when they diverge. The intent is to provide intuition as to why the different approaches sometimes give different answers and to give researchers guidance for their use.
    JEL: G1 G3 C1
    Date: 2005–04
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:11280&r=ecm
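    A compact Python sketch of the kind of comparison the paper has in mind, contrasting OLS, firm-clustered (Rogers) and Fama-MacBeth standard errors on simulated panel data with a firm effect in the residuals (all numbers and names here are illustrative assumptions):

      import numpy as np

      rng = np.random.default_rng(3)
      n_firms, n_years = 500, 10
      firm = np.repeat(np.arange(n_firms), n_years)
      year = np.tile(np.arange(n_years), n_firms)
      x = rng.normal(size=firm.size)
      y = x + rng.normal(size=n_firms)[firm] + rng.normal(size=firm.size)

      X = np.column_stack([np.ones(firm.size), x])
      XtX_inv = np.linalg.inv(X.T @ X)
      beta = XtX_inv @ X.T @ y
      u = y - X @ beta

      # OLS standard errors (ignore the within-firm correlation).
      se_ols = np.sqrt(np.diag(XtX_inv) * (u @ u) / (u.size - 2))

      # Clustered (Rogers) standard errors: sum scores within each firm first.
      meat = sum(np.outer(X[firm == f].T @ u[firm == f],
                          X[firm == f].T @ u[firm == f]) for f in range(n_firms))
      se_cluster = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))

      # Fama-MacBeth: one cross-sectional regression per year, then use the
      # time-series variability of the yearly slope estimates.
      betas = np.array([np.linalg.lstsq(X[year == t], y[year == t], rcond=None)[0]
                        for t in range(n_years)])
      se_fm = betas.std(axis=0, ddof=1) / np.sqrt(n_years)

      # With a firm effect in the residuals, the clustered standard error is
      # the appropriate one, while OLS and Fama-MacBeth understate it.
      print("OLS:", se_ols[1], "clustered:", se_cluster[1], "FM:", se_fm[1])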
  26. By: Jean Boivin; Serena Ng
    Abstract: Forecasting using `diffusion indices' has received a good deal of attention in recent years. The idea is to use the common factors estimated from a large panel of data to help forecast the series of interest. This paper assesses the extent to which the forecasts are influenced by (i) how the factors are estimated, and/or (ii) how the forecasts are formulated. We find that for simple data generating processes and when the dynamic structure of the data is known, no single method stands out as systematically good or bad. All five methods considered have rather similar properties, though some methods are better in long horizon forecasts, especially when the number of time series observations is small. However, when the dynamic structure is unknown and for more complex dynamics and error structures such as the ones encountered in practice, one method stands out as having smaller forecast errors. This method forecasts the series of interest directly, rather than the common and idiosyncratic components separately, and it leaves the dynamics of the factors unspecified. By imposing fewer constraints, and having to estimate a smaller number of auxiliary parameters, the method appears to be less vulnerable to misspecification, leading to improved forecasts.
    JEL: E37 E47 C3 C53
    Date: 2005–05
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:11285&r=ecm
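    A minimal Python sketch of the diffusion index approach being compared: extract common factors from a large panel by principal components, then forecast the target series directly with the estimated factors (the one-factor design, the one-step horizon and all names are illustrative assumptions, not the paper's setups):

      import numpy as np

      rng = np.random.default_rng(4)
      T, N, h, k = 200, 100, 1, 1
      f = np.zeros(T)                               # one common AR(1) factor
      for t in range(1, T):
          f[t] = 0.8 * f[t - 1] + rng.normal()
      X = np.outer(f, rng.normal(size=N)) + rng.normal(size=(T, N))  # large panel
      y = f + rng.normal(size=T)                    # target series

      # Step 1: principal-components estimate of the factors from the
      # standardised panel.
      Xs = (X - X.mean(0)) / X.std(0)
      U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
      F_hat = U[:, :k] * S[:k]                      # factors up to sign and scale

      # Step 2: direct h-step forecasting regression of y_{t+h} on the
      # estimated factors and a lag of y.
      Z = np.column_stack([np.ones(T - h), F_hat[:-h], y[:-h]])
      coef = np.linalg.lstsq(Z, y[h:], rcond=None)[0]
      z_T = np.concatenate(([1.0], F_hat[-1], [y[-1]]))
      print("direct factor-based forecast of y at T + h:", z_T @ coef)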
  27. By: Nicola Bruti-Liberati (School of Finance and Economics, University of Technology, Sydney); Filippo Martini (Faculty of Information Technology, University of Technology, Sydney); Massimo Piccardi (Faculty of Information Technology, University of Technology, Sydney); Eckhard Platen (School of Finance and Economics, University of Technology, Sydney)
    Abstract: Monte Carlo simulation of weak approximations of stochastic differential equations constitutes an intensive computational task. In applications such as finance, for instance, to achieve "real time" execution, as often required, one needs highly efficient implementations of the multi-point distributed random number generator underlying the simulations. In this paper a fast and flexible dedicated hardware solution on a field programmable gate array is presented. A comparative performance analysis between a software-only and the proposed hardware solution demonstrates that the hardware solution is bottleneck-free, retains the flexibility of the software solution and significantly increases the computational efficiency. Moreover, simulations in applications such as economics, insurance, physics, population dynamics, epidemiology, structural mechanics, chemistry and biotechnology can benefit from the obtained speedup.
    Keywords: random number generators; random bit generators; hardware implementation; field programmable gate arrays (FPGAs); Monte Carlo simulation; weak Taylor schemes; multi-point distributed random variables
    JEL: G10 G13
    Date: 2005–04–01
    URL: http://d.repec.org/n?u=RePEc:uts:rpaper:156&r=ecm
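    The multi-point distributed random variables in question are easy to reproduce in software, which helps clarify what the hardware generator must deliver (this Python sketch only mimics the distributions; it says nothing about the FPGA design itself, and the simulation parameters are illustrative):

      import numpy as np

      rng = np.random.default_rng(5)

      def two_point(n):
          # +/-1 with probability 1/2 each: matches the mean and variance of a
          # Gaussian increment, enough for weak order 1 (Euler) schemes.
          return rng.choice([-1.0, 1.0], size=n)

      def three_point(n):
          # +/-sqrt(3) with probability 1/6 and 0 with probability 2/3: matches
          # normal moments up to order five, as used in weak order 2 schemes.
          return rng.choice([-np.sqrt(3.0), 0.0, np.sqrt(3.0)],
                            size=n, p=[1 / 6, 2 / 3, 1 / 6])

      xi = three_point(1_000_000)
      print("three-point 2nd and 4th moments:", (xi**2).mean(), (xi**4).mean())

      # Weak Euler simulation of geometric Brownian motion with two-point
      # increments in place of Gaussian ones.
      mu, sigma, T, steps, paths = 0.05, 0.2, 1.0, 50, 100_000
      dt = T / steps
      x = np.ones(paths)
      for _ in range(steps):
          x *= 1.0 + mu * dt + sigma * np.sqrt(dt) * two_point(paths)
      print("simulated mean:", x.mean(), " exact mean:", np.exp(mu * T))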
  28. By: Joon Y. Park (Department of Economics, Rice University and SKKU); Mototsugu Shintani (Department of Economics, Vanderbilt University)
    Abstract: This paper considers the test of a unit root in transitional autoregressive models. In particular, we develop the asymptotic theory of the inf-t test for the null hypothesis of a unit root in a wide class of nonlinear autoregressive models having parameters that are identified only under the alternative of stationarity. Our framework is very general and allows for virtually all potentially interesting models with threshold, discrete and smooth transition functions. The specifications of short-run dynamics used in the paper are also fully general, and comparable to those used in linear unit root models. Most importantly, our asymptotics take into consideration that the parameter space has a random limit. This is an essential feature of the unit root test in transitional autoregressive models which has been ignored in the literature. For this very general class of transitional autoregressive models, we show that the inf-t test has a well-defined limit distribution depending only upon the transition function and the limit parameter space. The critical values of the test are provided for some of the commonly used models under the conventional specification of the parameter space. Our simulation study shows that the test has good size, with power significantly higher than that of the usual ADF test even for samples of relatively small size. We apply the test to various economic time series and find strong evidence for the rejection of random walks in favor of stationary transitional autoregressive models.
    Keywords: unit root test, threshold autoregressive models (TAR), logistic and exponential smooth transition autoregressive models (LSTAR and ESTAR)
    JEL: C12 C16 C22
    Date: 2005–04
    URL: http://d.repec.org/n?u=RePEc:van:wpaper:0510&r=ecm
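    The mechanics of an inf-t statistic can be sketched briefly in Python: because the transition parameter is not identified under the null, a Dickey-Fuller-type t-ratio is computed over a grid of transition parameters and its infimum is taken (the exponential transition function, the grid and the simulated data are illustrative assumptions; critical values for the actual test are tabulated in the paper):

      import numpy as np

      rng = np.random.default_rng(6)
      y = np.cumsum(rng.normal(size=300))      # random walk under the null
      dy, y_lag = np.diff(y), y[:-1]

      def t_stat(theta):
          # t-ratio on rho in  dy_t = rho * y_{t-1} * G(y_{t-1}; theta) + e_t,
          # with exponential transition G(y; theta) = 1 - exp(-theta * y**2).
          g = y_lag * (1.0 - np.exp(-theta * y_lag ** 2))
          X = np.column_stack([np.ones(g.size), g])
          b = np.linalg.lstsq(X, dy, rcond=None)[0]
          u = dy - X @ b
          s2 = (u @ u) / (u.size - 2)
          cov = s2 * np.linalg.inv(X.T @ X)
          return b[1] / np.sqrt(cov[1, 1])

      theta_grid = np.linspace(0.01, 1.0, 100)  # grid for unidentified parameter
      inf_t = min(t_stat(th) for th in theta_grid)
      print("inf-t statistic:", inf_t)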

This nep-ecm issue is ©2005 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.