
New Economics Papers on Econometrics
By:  Jeff Racine (McMaster University); James MacKinnon (Queen's University) 
Abstract:  Resampling methods such as the bootstrap are routinely used to estimate the finite-sample null distributions of a range of test statistics. We present a simple and tractable way to perform classical hypothesis tests based upon a kernel estimate of the CDF of the bootstrap statistics. This approach has a number of appealing features: i) it can perform well when the number of bootstraps is extremely small, ii) it is approximately exact, and iii) it can yield substantial power gains relative to the conventional approach. The proposed approach is likely to be useful when the statistic being bootstrapped is computationally expensive.
Keywords:  resampling, Monte Carlo test, bootstrap test, percentiles 
JEL:  C12 C14 C15 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:qed:wpaper:1054&r=ecm 
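The kernel-CDF idea described in the abstract above can be sketched in a few lines: instead of counting how many bootstrap statistics exceed the observed one, average Gaussian CDFs centred at each bootstrap draw. This is an illustrative sketch, not the authors' code; the function name, the Gaussian kernel, and the Silverman rule-of-thumb bandwidth are all assumptions for the example.

```python
import math, random

def smoothed_bootstrap_pvalue(t_obs, boot_stats, bandwidth=None):
    """Upper-tail p-value from a Gaussian-kernel estimate of the CDF
    of the bootstrap statistics: p = 1 - F_hat(t_obs)."""
    B = len(boot_stats)
    if bandwidth is None:
        # Silverman's rule-of-thumb bandwidth (an illustrative choice)
        mean = sum(boot_stats) / B
        sd = math.sqrt(sum((x - mean) ** 2 for x in boot_stats) / (B - 1))
        bandwidth = 1.06 * sd * B ** (-1 / 5)
    # Kernel CDF estimate: average of Normal CDFs centred at each t*_b
    F = sum(0.5 * (1 + math.erf((t_obs - t) / (bandwidth * math.sqrt(2))))
            for t in boot_stats) / B
    return 1.0 - F

random.seed(1)
boot = [random.gauss(0, 1) for _ in range(25)]   # deliberately few bootstraps
p = smoothed_bootstrap_pvalue(1.96, boot)
```

With only 25 draws the empirical bootstrap CDF moves in coarse steps of 0.04, while the smoothed estimate interpolates between them, which is exactly the regime the paper targets.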
By:  Donald W.K. Andrews (Cowles Foundation, Yale University); Gustavo Soares (Department of Economics, Yale University) 
Abstract:  This paper considers tests in an instrumental variables (IVs) regression model with IVs that may be weak. Tests that have near-optimal asymptotic power properties with Gaussian errors for weak and strong IVs have been determined in Andrews, Moreira, and Stock (2006a). In this paper, we seek tests that have near-optimal asymptotic power with Gaussian errors and improved power with non-Gaussian errors relative to existing tests. Tests with such properties are obtained by introducing rank tests that are analogous to the conditional likelihood ratio test of Moreira (2003). We also introduce a rank test that is analogous to the Lagrange multiplier test of Kleibergen (2002) and Moreira (2001).
Keywords:  Asymptotically similar tests, Conditional likelihood ratio test, Instrumental variables regression, Lagrange multiplier test, Power of test, Rank tests, Thick-tailed distribution, Weak instruments
JEL:  C12 C30 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1564&r=ecm 
By:  Cizek, P.; Tamine, J.; Haerdle, W. (Tilburg University, Center for Economic Research)
Abstract:  The Nadaraya-Watson nonparametric estimator of regression is known to be highly sensitive to the presence of outliers in data. This sensitivity can be reduced, for example, by using local L-estimates of regression. Whereas local L-estimation is traditionally done using an empirical conditional distribution function, we propose to use instead a smoothed conditional distribution function. The asymptotic distribution of the proposed estimator is derived under mild mixing conditions, and additionally, we show that the smoothed L-estimation approach provides computational as well as statistical finite-sample improvements. Finally, the proposed method is applied to the modelling of implied volatility.
Keywords:  nonparametric regression;L-estimation;smoothed cumulative distribution function
JEL:  C13 C14 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200620&r=ecm 
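The robustness motivation in the abstract above is easy to demonstrate: a kernel-weighted mean (Nadaraya-Watson) is dragged toward a single gross outlier, while a simple local L-estimate, here a kernel-weighted median, is barely affected. This is a minimal sketch of the general idea only; the paper's smoothed-CDF estimator is more refined than this empirical-weight version, and all function names and tuning values are illustrative.

```python
import math

def _kern(u):                      # Gaussian kernel
    return math.exp(-0.5 * u * u)

def nadaraya_watson(x0, xs, ys, h):
    """Local kernel-weighted mean: sensitive to outliers."""
    w = [_kern((x - x0) / h) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def local_median(x0, xs, ys, h):
    """A simple local L-estimate: the kernel-weighted median."""
    pairs = sorted(zip(ys, (_kern((x - x0) / h) for x in xs)))
    total = sum(w for _, w in pairs)
    acc = 0.0
    for y, w in pairs:
        acc += w
        if acc >= total / 2:
            return y

xs = [i / 10 for i in range(21)]           # grid on [0, 2]
ys = [math.sin(x) for x in xs]             # smooth signal, no noise
ys[10] = 25.0                              # gross outlier planted at x = 1
nw  = nadaraya_watson(1.0, xs, ys, h=0.3)  # pulled far above sin(1) = 0.84
med = local_median(1.0, xs, ys, h=0.3)     # stays near sin(1)
```

The contrast at x = 1 (true value sin(1) ≈ 0.84) makes the sensitivity claim concrete.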
By:  Todd E. Clark; Michael W. McCracken 
Abstract:  Motivated by the common finding that linear autoregressive models forecast better than models that incorporate additional information, this paper presents analytical, Monte Carlo, and empirical evidence on the effectiveness of combining forecasts from nested models. In our analytics, the unrestricted model is true, but as the sample size grows, the DGP converges to the restricted model. This approach captures the practical reality that the predictive content of variables of interest is often low. We derive MSE-minimizing weights for combining the restricted and unrestricted forecasts. In the Monte Carlo and empirical analysis, we compare the effectiveness of our combination approach against related alternatives, such as Bayesian estimation.
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:fip:fedkrw:rwp0602&r=ecm 
By:  Stan Hurn; J. Jeisman; K.A. Lindsay (School of Economics and Finance, Queensland University of Technology)
Abstract:  Maximum likelihood (ML) estimates of the parameters of stochastic differential equations (SDEs) are consistent and asymptotically efficient, but unfortunately difficult to obtain if a closed form expression for the transitional density of the process is not available. As a result, a large number of competing estimation procedures have been proposed. This paper provides a critical evaluation of the various estimation techniques. Special attention is given to the ease of implementation and comparative performance of the procedures when estimating the parameters of the Cox-Ingersoll-Ross and Ornstein-Uhlenbeck equations respectively.
Keywords:  stochastic differential equations, parameter estimation, maximum likelihood, simulation, moments 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:qut:sthurn:2006&r=ecm 
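For one of the two benchmark models named in the abstract above, the Ornstein-Uhlenbeck process, the transitional density is Gaussian and known in closed form, so exact ML estimation is straightforward; it is the models without this property that need the competing procedures the paper evaluates. The sketch below is illustrative only (not the paper's code): it simulates an OU path exactly and recovers the parameters via the conditional regression implied by the transitional density. Parameter values and function names are assumptions for the example.

```python
import math, random

# Ornstein-Uhlenbeck process: dX_t = kappa*(theta - X_t) dt + sigma dW_t.
# Its transition is Gaussian: X_{t+dt} | X_t ~ N(theta + (X_t - theta)*b, s^2)
# with b = exp(-kappa*dt) and s^2 = sigma^2 * (1 - b^2) / (2*kappa).

def simulate_ou(kappa, theta, sigma, x0, dt, n, rng):
    b = math.exp(-kappa * dt)
    sd = sigma * math.sqrt((1 - b * b) / (2 * kappa))
    x = [x0]
    for _ in range(n):
        x.append(theta + (x[-1] - theta) * b + sd * rng.gauss(0, 1))
    return x

def ml_ou(x, dt):
    """Closed-form MLE via the conditional regression X_{t+1} = a + b*X_t + e."""
    n = len(x) - 1
    x0s, x1s = x[:-1], x[1:]
    mx, my = sum(x0s) / n, sum(x1s) / n
    b = (sum(u * v for u, v in zip(x0s, x1s)) - n * mx * my) / \
        (sum(u * u for u in x0s) - n * mx * mx)
    a = my - b * mx
    resid_var = sum((y - a - b * u) ** 2 for u, y in zip(x0s, x1s)) / n
    kappa = -math.log(b) / dt
    theta = a / (1 - b)
    sigma = math.sqrt(resid_var * 2 * kappa / (1 - b * b))
    return kappa, theta, sigma

rng = random.Random(42)
path = simulate_ou(kappa=1.0, theta=0.5, sigma=0.3, x0=0.5, dt=0.1, n=2000, rng=rng)
kappa_hat, theta_hat, sigma_hat = ml_ou(path, dt=0.1)
```

With 2000 observations the estimates land close to the true (1.0, 0.5, 0.3); the mean-reversion speed kappa is, as usual, the hardest parameter to pin down.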
By:  Clive G. Bowsher (Nuffield College, Oxford University) 
Abstract:  A continuous time econometric modelling framework for multivariate financial market event (or 'transactions') data is developed in which the model is specified via the vector conditional intensity. This has the advantage that the conditioning information set is updated continuously in time as new information arrives. Generalised Hawkes (g-Hawkes) models are introduced that are sufficiently flexible to incorporate 'inhibitory' events and dependence between trading days. Novel omnibus specification tests for parametric models based on a multivariate random time change theorem are proposed. A computationally efficient thinning algorithm for simulation of g-Hawkes processes is also developed. A continuous time, bivariate point process model of the timing of trades and mid-quote changes is presented for a New York Stock Exchange stock and the empirical findings are related to the market microstructure literature. The two-way interaction of trades and quote changes is found to be important empirically. Furthermore, the model delivers a continuous record of instantaneous volatility that is conditional on the timing of trades and quote changes.
Keywords:  Point process, conditional intensity, Hawkes process, specification test, random time change, transactions data, market microstructure. 
JEL:  C32 C51 C52 G10 
Date:  2005–10–01 
URL:  http://d.repec.org/n?u=RePEc:nuf:econwp:0526&r=ecm 
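The thinning idea mentioned in the abstract above goes back to Ogata's algorithm, which the sketch below illustrates for the plain univariate Hawkes process with exponential decay. This is a minimal sketch of the general technique only; the paper's g-Hawkes models and its multivariate thinning algorithm are considerably richer. All parameter values are illustrative.

```python
import math, random

# Ogata-style thinning for a univariate Hawkes process with intensity
#   lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).

def simulate_hawkes(mu, alpha, beta, horizon, rng):
    assert alpha < beta              # branching ratio alpha/beta < 1: stationary
    events, t = [], 0.0
    while True:
        # Between events the intensity only decays, so its current value
        # is a valid upper bound for the thinning step.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)          # candidate event time
        if t > horizon:
            return events
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:    # accept with prob lambda(t)/lam_bar
            events.append(t)

rng = random.Random(7)
times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, horizon=100.0, rng=rng)
```

With these parameters the stationary event rate is mu / (1 - alpha/beta) ≈ 1.07 per unit time, so roughly a hundred events over the horizon, arriving in self-exciting clusters.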
By:  George Kapetanios and Massimiliano Marcellino 
Abstract:  The estimation of structural dynamic factor models (DFMs) for large sets of variables is attracting considerable attention. In this paper we briefly review the underlying theory and then compare the impulse response functions resulting from two alternative estimation methods for the DFM. Finally, as an example, we reconsider the issue of the identification of the driving forces of the US economy, using data for about 150 macroeconomic variables. 
URL:  http://d.repec.org/n?u=RePEc:igi:igierp:306&r=ecm 
By:  Wayne E. Ferson; Andrew F. Siegel 
Abstract:  We develop asset pricing models' implications for portfolio efficiency when there is conditioning information in the form of a set of lagged instruments. A model of expected returns identifies a portfolio that should be minimum variance efficient with respect to the conditioning information. Our tests refine previous tests of portfolio efficiency, using the conditioning information optimally. We reject the efficiency of all static or time-varying combinations of the three Fama-French (1996) factors with respect to the conditioning information and also the conditional efficiency of time-varying combinations of the factors, given standard lagged instruments.
JEL:  C12 C51 C52 G12 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:12098&r=ecm 
By:  Andersson, Martin (CESIS - Centre of Excellence for Science and Innovation Studies, Royal Institute of Technology); Gråsjö, Urban (CESIS - Centre of Excellence for Science and Innovation Studies, Royal Institute of Technology)
Abstract:  Using the taxonomy of Anselin (2003), this paper investigates how the inclusion of spatially discounted variables on the 'right-hand side' (RHS) of empirical spatial models affects the extent of spatial autocorrelation. The basic proposition is that including inputs external to the spatial observation in question as a separate variable reveals spatial dependence via the parameter estimate. One advantage of this method is that it allows for a direct interpretation. The paper also tests to what extent significance of the estimated parameters of the spatially discounted explanatory variables can be interpreted as evidence of spatial dependence. Additionally, the paper advocates the use of the accessibility concept for spatial weights. Accessibility is related to spatial interaction theory and can be motivated theoretically by adhering to the preference structure in random choice theory. Monte Carlo simulations show that the coefficient estimates of the accessibility variables are significantly different from zero when such effects are modelled. The rejection frequency of the three typical tests (Moran's I, LM-lag and LM-err) is significantly reduced when these additional variables are included in the model. When the coefficient estimates of the accessibility variables are statistically significant, problems of spatial autocorrelation are significantly reduced, so significance of the accessibility variables can be interpreted as evidence of spatial dependence.
Keywords:  accessibility; spatial dependence; spatial econometrics; Monte Carlo Simulations; spatial spillovers 
JEL:  C31 C51 R15 
Date:  2006–03–28 
URL:  http://d.repec.org/n?u=RePEc:hhs:cesisp:0051&r=ecm 
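Moran's I, the first of the three spatial-autocorrelation tests named in the abstract above, is simple enough to compute by hand on a toy lattice. The sketch below uses plain rook contiguity for the weights purely for illustration; the paper argues for accessibility-based weights instead, and none of this is the authors' code.

```python
# Moran's I: I = (n / S0) * sum_ij w_ij z_i z_j / sum_i z_i^2, z_i = x_i - mean.

def morans_i(values, weights):
    """values: n observations; weights: n x n spatial weight matrix."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]
    s0 = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * z[i] * z[j] for i in range(n) for j in range(n))
    den = sum(zi * zi for zi in z)
    return (n / s0) * (num / den)

# Binary rook-contiguity weights on a 4x4 lattice
side = 4
n = side * side
W = [[0.0] * n for _ in range(n)]
for r in range(side):
    for c in range(side):
        i = r * side + c
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < side and 0 <= cc < side:
                W[i][rr * side + cc] = 1.0

# Clustered pattern (left half high) vs checkerboard pattern
clustered = [1.0 if c < side // 2 else 0.0 for r in range(side) for c in range(side)]
alternating = [float((r + c) % 2) for r in range(side) for c in range(side)]
i_clustered = morans_i(clustered, W)       # positive: neighbours alike
i_alternating = morans_i(alternating, W)   # -1: neighbours maximally unalike
```

The two patterns bracket the statistic's behaviour: spatial clustering pushes I above zero, while the checkerboard attains exactly -1 under rook contiguity.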
By:  Stan Hurn; J. Jeisman; K.A. Lindsay (School of Economics and Finance, Queensland University of Technology)
Abstract:  Many stochastic differential equations (SDEs) do not have readily available closed-form expressions for their transitional probability density functions (PDFs). As a result, a large number of competing estimation approaches have been proposed in order to obtain maximum-likelihood estimates of their parameters. Arguably the most straightforward of these is one in which the required estimates of the transitional PDF are obtained by numerical solution of the Fokker-Planck (or forward Kolmogorov) partial differential equation. Despite the fact that this method produces accurate estimates and is completely generic, it has not proved popular in the applied literature. Perhaps this is attributable to the fact that this approach requires repeated solution of a parabolic partial differential equation to obtain the transitional PDF and is therefore computationally quite expensive. In this paper, three avenues for improving the reliability and speed of this estimation method are introduced and explored in the context of estimating the parameters of the popular Cox-Ingersoll-Ross and Ornstein-Uhlenbeck models. The recommended algorithm that emerges from this investigation is seen to offer substantial gains in reliability and computational time.
Keywords:  stochastic differential equations, maximum likelihood, finite difference, finite element, cumulative distribution function, interpolation. 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:qut:sthurn:200601&r=ecm 
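The Fokker-Planck route described in the abstract above can be illustrated with the crudest possible scheme: an explicit finite-difference solve for the Ornstein-Uhlenbeck transitional density. This is a minimal sketch of the approach only; the paper develops far more refined (and faster) finite-difference and finite-element schemes. Grid, time step, and parameter values are illustrative assumptions.

```python
import math

# Fokker-Planck equation for dX = kappa*(theta - X) dt + sigma dW:
#   dp/dt = -d/dx[ kappa*(theta - x) * p ] + (sigma^2 / 2) * d^2 p / dx^2.
kappa, theta, sigma = 1.0, 0.5, 0.5
x_lo, x_hi, nx = -2.0, 3.0, 101
dx = (x_hi - x_lo) / (nx - 1)
xs = [x_lo + i * dx for i in range(nx)]
D = 0.5 * sigma ** 2
dt = 0.002                       # satisfies the explicit stability bound dt < dx^2/(2D)

# Initial condition: narrow Gaussian at x0 = 1.5 (approximating a point mass)
x0, s0 = 1.5, 0.1
p = [math.exp(-0.5 * ((x - x0) / s0) ** 2) for x in xs]
norm = sum(p) * dx
p = [pi / norm for pi in p]

drift = [kappa * (theta - x) for x in xs]
for _ in range(250):             # evolve to t = 250 * dt = 0.5
    q = p[:]
    for i in range(1, nx - 1):
        adv = (drift[i + 1] * p[i + 1] - drift[i - 1] * p[i - 1]) / (2 * dx)
        dif = D * (p[i + 1] - 2 * p[i] + p[i - 1]) / dx ** 2
        q[i] = p[i] + dt * (-adv + dif)
    p = q                        # boundary values stay ~0 on this wide domain

mass = sum(p) * dx                               # should remain ~1
mean = sum(x * pi for x, pi in zip(xs, p)) * dx  # should track the exact mean
```

The exact conditional mean at t = 0.5 is theta + (x0 - theta)*exp(-kappa*t) ≈ 1.107, which the computed density reproduces; the stability restriction on dt is exactly the kind of cost that motivates the paper's better schemes.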
By:  Stinstra, Erwin; Rennen, Gijs; Teeuwen, Geert (Tilburg University, Center for Economic Research)
Abstract:  The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming; this paper describes an alternative method based on Pareto Simulated Annealing. Our method uses linear regression to estimate constants, and interval arithmetic is applied to ensure the consistency of a model. In order to prevent overfitting, we assess a model not only on its predictions at the data points, but also on its complexity, for which we introduce a new measure. We compare our new method with the Kriging metamodel and with a Symbolic Regression metamodel based on Genetic Programming. We conclude that Symbolic Regression based on Pareto Simulated Annealing is very competitive with the other metamodel approaches.
Keywords:  approximation;metamodeling;pareto simulated annealing;symbolic regression 
JEL:  C14 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200615&r=ecm 
By:  Jean-Philippe Bouchaud (Science & Finance, Capital Fund Management; CEA Saclay); Laurent Laloux (Science & Finance, Capital Fund Management); M. Augusta Miceli; Marc Potters (Science & Finance, Capital Fund Management)
Abstract:  We present a general method to detect and extract statistically meaningful correlations between input and output variables of large dimensionality from a finite time sample. Our central result is derived from the theory of free random matrices, and gives an explicit expression for the interval where singular values are expected in the absence of any true correlations between the variables under study. Our result can be seen as the natural generalization of the Marčenko-Pastur distribution to the case of rectangular correlation matrices. We illustrate the usefulness of our method on a set of macroeconomic time series.
Date:  2005–12 
URL:  http://d.repec.org/n?u=RePEc:sfi:sfiwpa:500066&r=ecm 
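The classical square-case benchmark that the result above generalises is easy to exhibit numerically: for T observations of N independent standardized variables, the largest eigenvalue of the sample covariance matrix concentrates near the Marčenko-Pastur upper edge (1 + sqrt(N/T))^2 even though the true correlations are all zero. This pure-Python sketch (power iteration instead of a linear-algebra library) is illustrative only and is not the paper's method, which concerns singular values of rectangular cross-correlation matrices.

```python
import math, random

def top_eigenvalue(C, iters=300, seed=0):
    """Largest eigenvalue of a symmetric matrix via power iteration."""
    rng = random.Random(seed)
    n = len(C)
    v = [rng.gauss(0, 1) for _ in range(n)]
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient of the converged direction
    return sum(v[i] * sum(C[i][j] * v[j] for j in range(n)) for i in range(n))

rng = random.Random(123)
N, T = 50, 500
X = [[rng.gauss(0, 1) for _ in range(N)] for _ in range(T)]   # pure iid noise
# Sample covariance matrix C = X'X / T (true covariance is the identity)
C = [[sum(X[t][i] * X[t][j] for t in range(T)) / T for j in range(N)]
     for i in range(N)]
lam_max = top_eigenvalue(C)
mp_edge = (1 + math.sqrt(N / T)) ** 2      # = (1 + sqrt(0.1))^2, about 1.73
```

Any eigenvalue (or, in the paper's rectangular setting, singular value) found well outside this null interval is the signature of a genuine correlation rather than noise.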
By:  Lisa Borland (Evnine-Vaughan Associates, Inc.); Jean-Philippe Bouchaud (Science & Finance, Capital Fund Management; CEA Saclay)
Abstract:  We study, both analytically and numerically, an ARCH-like, multiscale model of volatility, which assumes that the volatility is governed by the observed past price changes on different time scales. With a power-law distribution of time horizons, we obtain a model that captures most stylized facts of financial time series: Student-like distribution of returns with a power-law tail, long memory of the volatility, slow convergence of the distribution of returns towards the Gaussian distribution, multifractality and anomalous volatility relaxation after shocks. Unlike recent multifractal models that are strictly time-reversal invariant, the model also reproduces the time asymmetry of financial time series: past large-scale volatility influences future small-scale volatility. In order to reproduce all empirical observations quantitatively, the parameters must be chosen such that our model is close to an instability, meaning that (a) the feedback effect is important and substantially increases the volatility, and (b) the model is intrinsically difficult to calibrate because of the very long-range nature of the correlations. By imposing the consistency of the model predictions with a large set of different empirical observations, a reasonable range of parameter values can be determined. The model can easily be generalized to account for jumps, skewness and multi-asset correlations.
JEL:  G10 
Date:  2005–07 
URL:  http://d.repec.org/n?u=RePEc:sfi:sfiwpa:500059&r=ecm 
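A toy version of the feedback mechanism described in the abstract above can be simulated directly: today's variance is fed by squared past returns over many horizons with power-law weights, and the resulting returns show fat tails and volatility clustering. This is an illustrative caricature with made-up parameter values, not the authors' model or calibration.

```python
import math, random

# Multiscale ARCH-like feedback:
#   sigma_t^2 = sigma0^2 + g * sum_{tau=1}^{L} tau^(-alpha) * r_{t-tau}^2,
#   r_t = sigma_t * eps_t,  eps_t ~ N(0, 1).
def simulate(n, sigma0=0.01, g=0.18, alpha=1.2, lmax=100, seed=3):
    rng = random.Random(seed)
    r = [0.0] * lmax                       # warm-up buffer of past returns
    for _ in range(n):
        var = sigma0 ** 2 + g * sum(
            (tau ** -alpha) * r[-tau] ** 2 for tau in range(1, lmax + 1))
        r.append(math.sqrt(var) * rng.gauss(0, 1))
    return r[lmax:]

def excess_kurtosis(x):
    n = len(x)
    m = sum(x) / n
    v = sum((xi - m) ** 2 for xi in x) / n
    k = sum((xi - m) ** 4 for xi in x) / n
    return k / v ** 2 - 3.0

def acf1_of_squares(x):
    """First-order autocorrelation of squared returns (volatility clustering)."""
    s = [xi * xi for xi in x]
    n = len(s)
    m = sum(s) / n
    num = sum((s[t] - m) * (s[t - 1] - m) for t in range(1, n))
    den = sum((si - m) ** 2 for si in s)
    return num / den

rets = simulate(5000)
ek = excess_kurtosis(rets)         # fat tails: positive excess kurtosis
clustering = acf1_of_squares(rets) # long memory shows up as positive acf
```

Pushing g toward the stability boundary (where the summed feedback weights approach one) strengthens both effects, which is the "close to an instability" regime the abstract emphasises.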
By:  James Andreoni; William T. Harbaugh 
Date:  2006–03–27 
URL:  http://d.repec.org/n?u=RePEc:cla:levrem:122247000000001257&r=ecm 
By:  J. Isaac Miller (Department of Economics, University of Missouri-Columbia)
Abstract:  We show that typical tests for purchasing power parity (PPP) using exchange rates governed by a target zone regime are inherently misspecified. Regardless of whether or not long-run PPP holds, the real exchange rate cannot be mean-reverting in the usual sense, since the nominal exchange rate is generated by a nonlinear transformation of a nonstationary economic fundamental. As an alternative, we propose basing the real exchange rate (and thus a PPP test) on conditional expectations of this unobservable fundamental. As an illustration, we test for long-run PPP between Denmark and the Euro area.
Keywords:  Target Zone Exchange Rates, Purchasing Power Parity, Nonlinear Transformations, Extended Kalman Filter 
JEL:  C13 C22 C32 
Date:  2006–03–02 
URL:  http://d.repec.org/n?u=RePEc:umc:wpaper:0604&r=ecm 
By:  Toepoel, Vera; Vis, Corrie; Das, Marcel; Soest, Arthur van (Tilburg University, Center for Economic Research)
Abstract:  In this study we use an information-processing perspective to explore the impact of response scales on respondents' answers in a web survey. This paper has four innovations compared to the existing literature: the research is based on a different mode of administration (web), we use an open-ended format as a benchmark, four different question types are used, and the study is conducted on a representative sample of the population. We find strong effects of response scales. Questions requiring estimation strategies are more affected by the choice of response format than questions for which direct recall is used. Respondents with a low need for cognition and respondents with a low need to form opinions are more affected by the response categories than respondents with a high need for cognition and a high need to evaluate. The sensitivity to contextual clues is also significantly related to gender, age and education.
Keywords:  web survey;questionnaire design;measurement error;context effects;response categories;need for cognition;need to evaluate 
JEL:  C42 C81 C93 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200619&r=ecm 
By:  Neil Shephard (Nuffield College, Oxford University) 
Date:  2005–07–01 
URL:  http://d.repec.org/n?u=RePEc:nuf:econwp:0517&r=ecm 
By:  Miguel Segoviano 
Abstract:  The estimation of the profit and loss distribution of a loan portfolio requires the modelling of the portfolio's multivariate distribution, which describes the joint likelihood of changes in the credit-risk quality of the loans that make up the portfolio. A significant problem for portfolio credit risk measurement is that the data available for its modelling are greatly restricted. Under these circumstances, however, convenient parametric assumptions usually do not appropriately describe the behaviour of the assets that are the subject of our interest: loans granted to small and medium enterprises (SMEs), unlisted and arm's-length firms. This paper proposes the Consistent Information Multivariate Density Optimizing Methodology (CIMDO), based on the cross-entropy approach, as an alternative way to generate multivariate probability densities from partial information and without making parametric assumptions. Using the probability integral transformation criterion, we show that the distributions recovered by CIMDO outperform the distributions commonly used for measuring the portfolio credit risk of loans granted to SMEs, unlisted and arm's-length firms.
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:fmg:fmgdps:dp557&r=ecm 
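A one-dimensional sketch of the cross-entropy idea behind the methodology described above: start from a parametric prior for the latent return and tilt it minimally, in the Kullback-Leibler sense, so that it reproduces an observed probability of default P(X < d). With a single indicator-moment constraint the minimizer has the closed form posterior proportional to prior times exp(lambda * 1[x < d]). This is illustrative only; CIMDO itself is multivariate and this is not the paper's code. The standard-normal prior, the default threshold, and the target probability are all made-up inputs.

```python
import math

def tilt_to_default_prob(xs, prior, d, pod):
    """Exponentially tilt a discretized prior so that P(X < d) = pod.

    For an indicator constraint, the cross-entropy-minimizing tilt
    exp(lambda * 1[x < d]) can be solved for lambda in closed form.
    """
    p_prior = sum(pi for x, pi in zip(xs, prior) if x < d)
    lam = math.log(pod * (1 - p_prior) / (p_prior * (1 - pod)))
    post = [pi * math.exp(lam) if x < d else pi for x, pi in zip(xs, prior)]
    z = sum(post)                         # normalizing constant
    return [pi / z for pi in post]

# Discretized standard-normal prior on a grid over [-5, 5]
xs = [-5 + i * 0.01 for i in range(1001)]
prior = [math.exp(-0.5 * x * x) for x in xs]
s = sum(prior)
prior = [p / s for p in prior]

# Prior implies P(X < -2) ~ 0.023; tilt it to match an observed 5% default rate
posterior = tilt_to_default_prob(xs, prior, d=-2.0, pod=0.05)
p_default = sum(pi for x, pi in zip(xs, posterior) if x < -2.0)
```

The posterior matches the observed default probability exactly while staying as close as possible to the prior elsewhere, which is the sense in which such recovered densities use the scarce data without imposing a full parametric shape.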