nep-ecm New Economics Papers
on Econometrics
Issue of 2006‒10‒28
thirty-two papers chosen by
Sune Karlsson
Orebro University

  1. Forecasting Using a Large Number of Predictors: Is Bayesian Regression a Valid Alternative to Principal Components? By De Mol, Christine; Giannone, Domenico; Reichlin, Lucrezia
  2. Nonparametric Transformation to White Noise By Oliver Linton; Enno Mammen
  3. Adaptive Estimation of Autoregressive Models with Time-Varying Variances By Ke-Li Xu; Peter C.B. Phillips
  4. A Quasi Maximum Likelihood Approach for Large Approximate Dynamic Factor Models By Doz, Catherine; Giannone, Domenico; Reichlin, Lucrezia
  5. A Modified Information Criterion for Cointegration Tests based on a VAR Approximation By Zhongjun Qu; Pierre Perron
  6. UNCERTAINTY UNDER A MULTIVARIATE NESTED-ERROR REGRESSION MODEL WITH LOGARITHMIC TRANSFORMATION By Isabel Molina
  7. The Limit Distribution of the CUSUM of Squares Test Under General Mixing Conditions By Ai Deng; Pierre Perron
  8. Log Periodogram Regression: The Nonstationary Case By Chang Sik Kim; Peter C.B. Phillips
  9. Understanding Spurious Regression in Financial Economics By Ai Deng
  10. Conditional-Sum-of-Squares Estimation of Models for Stationary Time Series with Long Memory By Peter M Robinson
  11. Modelling and forecasting Australian domestic tourism By George Athanasopoulos; Rob J. Hyndman
  12. Estimation of Structural Parameters and Marginal Effects in Binary Choice Panel Data Models with Fixed Effects By Ivan Fernandez-Val
  13. TESTING FOR STOCHASTIC MONOTONICITY By Sokbae Lee; Oliver Linton; Yoon-Jae Whang
  14. Optimal Instruments in Time Series: A Survey By Stanislav Anatolyev
  15. A Complete Asymptotic Series for the Autocovariance Function of a Long Memory Process By Offer Lieberman; Peter C.B. Phillips
  16. Estimating a Class of Triangular Simultaneous Equations Models Without Exclusion Restrictions By Roger Klein; Francis Vella
  17. Moving the Goalposts: Addressing Limited Overlap in the Estimation of Average Treatment Effects by Changing the Estimand By Richard K. Crump; V. Joseph Hotz; Guido W. Imbens; Oscar A. Mitnik
  18. Nonparametric retrospection and monitoring of predictability of financial returns By Stanislav Anatolyev
  19. Semiparametric Estimation of a Characteristic-based Factor Model of Common Stock Returns By Gregory Connor; Oliver Linton
  20. Estimating ATT Effects with Non-Experimental Data and Low Compliance By Manuela Angelucci; Orazio Attanasio
  21. The econometric analysis of microscopic simulation models By Li,Youwei; Donkers,Bas; Melenberg,Bertrand
  22. Estimating Quadratic Variation Consistently in the Presence of Correlated Measurement Error By Ilze Kalnina; Oliver Linton
  23. The non- and semiparametric analysis of MS models: some applications By Li,Youwei; Donkers,Bas; Melenberg,Bertrand
  24. An Analytical Evaluation of the Log-periodogram Estimate in the Presence of Level Shifts and its Implications for Stock Returns Volatility By Pierre Perron; Zhongjun Qu
  25. Exploiting Randomness for Feature Selection in Multinomial Logit: a CRM Cross-Sell Application By A. PRINZIE; D. VAN DEN POEL
  26. Matching Estimators and the Data from the National Supported Work Demonstration Again By Zhong Zhao
  27. MODELLING LONG-MEMORY VOLATILITIES WITH LEVERAGE EFFECT: A-LMSV VERSUS FIEGARCH By Esther Ruiz; Helena Veiga
  28. Does Information Help Recovering Structural Shocks from Past Observations? By Giannone, Domenico; Reichlin, Lucrezia
  29. Money Growth, Output Gaps and Inflation at Low and High Frequency: Spectral Estimates for Switzerland By Assenmacher-Wesche, Katrin; Gerlach, Stefan
  30. Inequality: Measurement By Frank A Cowell
  31. Researcher Incentives and Empirical Methods By Edward L. Glaeser
  32. An Alternative Trend-Cycle Decomposition using a State Space Model with Mixtures of Normals: Specifications and Applications to International Data By Tatsuma Wada; Pierre Perron

  1. By: De Mol, Christine; Giannone, Domenico; Reichlin, Lucrezia
    Abstract: This paper considers Bayesian regression with normal and double exponential priors as forecasting methods based on large panels of time series. We show that, empirically, these forecasts are highly correlated with principal component forecasts and that they perform equally well for a wide range of prior choices. Moreover, we study the asymptotic properties of the Bayesian regression under a Gaussian prior, under the assumption that the data are quasi collinear, to establish a criterion for setting parameters in a large cross-section. (A schematic comparison of the two forecasts is sketched below.)
    Keywords: Bayesian VAR; large cross-sections; Lasso regression; principal components; ridge regressions
    JEL: C11 C13 C33 C53
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5829&r=ecm
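    The following is a minimal illustrative sketch (my construction on simulated data, not the authors' code) of the comparison above: a ridge fit, which is the posterior mean under a Gaussian prior, against a principal-components fit from the same large panel. The sizes T, n, r and the penalty nu are arbitrary choices.
```python
import numpy as np

rng = np.random.default_rng(0)
T, n, r = 200, 100, 3                       # sample size, predictors, latent factors
F = rng.standard_normal((T, r))             # common factors
X = F @ rng.standard_normal((r, n)) + 0.5 * rng.standard_normal((T, n))
y = F[:, 0] + 0.3 * rng.standard_normal(T)  # target loads on the first factor

Xs = (X - X.mean(0)) / X.std(0)             # standardize the panel
nu = 10.0                                   # ridge penalty = Gaussian prior tightness

# Bayesian regression under a Gaussian prior (ridge): (X'X + nu I)^{-1} X'y
beta = np.linalg.solve(Xs.T @ Xs + nu * np.eye(n), Xs.T @ y)
fit_ridge = Xs @ beta

# Principal-components alternative: regress y on the first r PCs of the panel
U, s, _ = np.linalg.svd(Xs, full_matrices=False)
pcs = U[:, :r] * s[:r]
fit_pc = pcs @ np.linalg.lstsq(pcs, y, rcond=None)[0]

print("correlation of the two fits:", np.corrcoef(fit_ridge, fit_pc)[0, 1])
```
    On factor-structured data such as this, the two fits are typically very highly correlated, which is the empirical point of the paper.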
  2. By: Oliver Linton; Enno Mammen
    Abstract: We consider a semiparametric distributed lag model in which the "news impact curve" m is nonparametric but the response is dynamic through some linear filters. A special case of this is a nonparametric regression with serially correlated errors. We propose an estimator of the news impact curve based on a dynamic transformation that produces white noise errors. This yields an estimating equation for m that is a type two linear integral equation. We investigate both the stationary case and the case where the error has a unit root. In the stationary case we establish pointwise asymptotic normality. In the special case of a nonparametric regression subject to time series errors, our estimator achieves efficiency improvements over the usual estimators; see Xiao, Linton, Carroll, and Mammen (2003). In the unit root case our procedure is consistent and asymptotically normal, unlike the standard regression smoother. We also present the distribution theory for the parameter estimates, which is non-standard in the unit root case, and investigate the finite sample performance of the method through simulation experiments.
    Keywords: Efficiency, Inverse Problem, Kernel Estimation, Nonparametric regression, Time Series, Unit Roots.
    JEL: C14
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/503&r=ecm
  3. By: Ke-Li Xu (Dept. of Economics, Yale University); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: Stable autoregressive models of known finite order are considered with martingale difference errors scaled by an unknown nonparametric time-varying function generating heterogeneity. An important special case involves structural change in the error variance, but in most practical cases the pattern of variance change over time is unknown and may involve shifts at unknown discrete points in time, continuous evolution, or combinations of the two. This paper develops kernel-based estimators of the residual variances and associated adaptive least squares (ALS) estimators of the autoregressive coefficients. These are shown to be asymptotically efficient, having the same limit distribution as the infeasible generalized least squares (GLS) estimator. Comparisons of the efficient procedure and ordinary least squares (OLS) reveal that least squares can be extremely inefficient in some cases while nearly optimal in others. Simulations show that, when least squares works well, the adaptive estimators perform comparably well, whereas when least squares works poorly, major efficiency gains are achieved by the new estimators. (A two-step version of the procedure is sketched below.)
    Keywords: Adaptive estimation, Autoregression, Heterogeneity, Weighted regression
    JEL: C14 C22
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1585&r=ecm
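    A minimal sketch of the two-step idea for an AR(1), under assumed simulation settings: estimate by OLS, kernel-smooth the squared residuals to estimate the variance path, then reweight. The Gaussian kernel and bandwidth are illustrative choices, not the paper's.
```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
sigma = 0.5 + 1.5 * np.arange(T) / T          # smoothly increasing error volatility
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + sigma[t] * rng.standard_normal()

x, z = y[:-1], y[1:]
rho_ols = (x @ z) / (x @ x)                   # step 1: ordinary least squares
u2 = (z - rho_ols * x) ** 2                   # squared OLS residuals

h = 0.1                                       # illustrative bandwidth (rescaled time)
grid = np.arange(1, T) / T
w = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / h) ** 2)
var_hat = (w @ u2) / w.sum(1)                 # step 2: kernel estimate of the variance path

rho_als = (x * z / var_hat).sum() / (x * x / var_hat).sum()  # step 3: weighted (adaptive) LS
print(f"OLS: {rho_ols:.3f}   ALS: {rho_als:.3f}   (true coefficient 0.6)")
```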
  4. By: Doz, Catherine; Giannone, Domenico; Reichlin, Lucrezia
    Abstract: This paper considers quasi-maximum likelihood estimation of a dynamic approximate factor model when the panel of time series is large. Maximum likelihood is analyzed under different sources of misspecification: omitted serial correlation of the observations and cross-sectional correlation of the idiosyncratic components. It is shown that the effects of misspecification on the estimation of the common factors are negligible when the sample size (T) and the cross-sectional dimension (n) are large. The estimator is feasible when n is large and is easily implemented using the Kalman smoother and the EM algorithm, as in traditional factor analysis. Simulation results illustrate the empirical conditions under which we can expect improvements over the simple principal components considered by Bai (2003), Bai and Ng (2002), Forni, Hallin, Lippi, and Reichlin (2000, 2005b), and Stock and Watson (2002a,b).
    Keywords: factor model; large cross-sections; Quasi Maximum Likelihood
    JEL: C32 C33 C51
    Date: 2006–06
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5724&r=ecm
  5. By: Zhongjun Qu (Department of Economics, University of Illinois at Urbana-Champaign); Pierre Perron (Department of Economics, Boston University)
    Abstract: We consider Johansen’s (1988, 1991) cointegration tests when a Vector AutoRegressive (VAR) process of order k is used to approximate a more general linear process with an infinite VAR representation. In this case, and in particular when a moving average component is present, traditional methods to select the lag order, such as Akaike’s information criterion (AIC) or the Bayesian information criterion (BIC), lead to too parsimonious a model, with the implication that the cointegration tests suffer from substantial size distortions in finite samples. We extend the analysis of Ng and Perron (2001) to derive a Modified Akaike Information Criterion (MAIC) in this multivariate setting. The idea is to use the information specified by the null hypothesis, as it relates to restrictions on the parameters of the model, to keep an extra term in the penalty function of the AIC. The MAIC takes a very simple form in which this extra term is simply the likelihood ratio statistic for testing the null hypothesis of r against more than r cointegrating vectors. We provide theoretical analyses of its validity and of the fact that cointegration tests constructed from a VAR whose lag order is selected using the MAIC have the same limit distribution as when the order is finite and known. We also provide theoretical and simulation analyses showing how the MAIC leads to VAR approximations that yield tests with drastically improved size properties with little loss of power.
    Date: 2006–02
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2006-011&r=ecm
  6. By: Isabel Molina
    Abstract: Assuming a multivariate linear regression model with one random factor, we consider the parameters defined as exponentials of mixed effects, i.e., linear combinations of fixed and random effects. Such parameters are of particular interest in prediction problems where the dependent variable is the logarithm of the variable that is the object of inference. We derive bias-corrected empirical predictors of such parameters. A second order approximation for the mean crossed product error of the predictors of two of these parameters is obtained, and an estimator is derived from it. The mean squared error is obtained as a particular case.
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws066117&r=ecm
  7. By: Ai Deng (Department of Economics, Boston University); Pierre Perron (Department of Economics, Boston University)
    Abstract: We consider the CUSUM of squares test in a linear regression model with general mixing assumptions on the regressors and the errors. We derive its limit distribution and show how it depends on the nature of the error process. We suggest a corrected version that has a limit distribution free of nuisance parameters. We also discuss how it provides an improvement over the standard approach to testing for a change in the variance of a univariate time series, and present simulation evidence to support this. We illustrate the usefulness of our method by analyzing changes in the variance of stock returns and a variety of macroeconomic time series, as well as by testing for a change in the variance of the residuals in a typical four-variable VAR model. Our results show the widespread prevalence of changes in the variance of such series and indicate that the variability of shocks affecting the U.S. economy has decreased. (A schematic version of such a corrected statistic is sketched below.)
    Keywords: Change-point, Variance shift, Recursive residuals, Dynamic models, Conditional heteroskedasticity.
    JEL: D80 D91 G11 E21
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-046&r=ecm
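    A schematic version of such a test (my sketch, not the paper's exact corrected statistic): the CUSUM of squared residuals is scaled by an estimated long-run variance of the squared series, so that serial dependence is absorbed and the limit is nuisance-parameter free.
```python
import numpy as np

def cusum_of_squares(e):
    """sup_k |sum_{t<=k} e_t^2 - (k/T) sum_t e_t^2| / sqrt(T * omega_hat)."""
    T = len(e)
    s2 = e ** 2
    partial = np.cumsum(s2) - np.arange(1, T + 1) / T * s2.sum()
    # Bartlett-kernel long-run variance of e_t^2 absorbs serial dependence
    v = s2 - s2.mean()
    L = int(4 * (T / 100) ** (2 / 9))             # illustrative bandwidth rule
    omega = v @ v / T
    for j in range(1, L + 1):
        omega += 2 * (1 - j / (L + 1)) * (v[j:] @ v[:-j]) / T
    return np.abs(partial).max() / np.sqrt(T * omega)

rng = np.random.default_rng(2)
e = np.concatenate([rng.standard_normal(250), 2 * rng.standard_normal(250)])
# the limit under the null is the sup of a Brownian bridge; the 5% critical
# value is roughly 1.36, so the variance doubling here is clearly detected
print("statistic:", round(cusum_of_squares(e), 2))
```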
  8. By: Chang Sik Kim (Dept. of Economics, Ewha Women's University); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: Estimation of the memory parameter (d) is considered for models of nonstationary fractionally integrated time series with d > 1/2. It is shown that the log periodogram regression estimator of d is inconsistent when 1 < d < 2 and is consistent when 1/2 < d ≤ 1. For d > 1, the estimator is shown to converge in probability to unity. (The estimator itself is sketched below.)
    Keywords: Discrete Fourier transform, Fractional Brownian motion, Fractional integration, Inconsistency, Log periodogram regression, Long memory parameter, Nonstationarity, Semiparametric estimation
    JEL: C22
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1587&r=ecm
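    For reference, a minimal sketch of the log periodogram regression itself; the bandwidth m = sqrt(T) is an illustrative choice.
```python
import numpy as np

def gph(x, m=None):
    """Slope of log I(lambda_j) on -2 log(lambda_j), j = 1..m (GPH estimate of d)."""
    T = len(x)
    m = m or int(np.sqrt(T))
    lam = 2 * np.pi * np.arange(1, m + 1) / T
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * T)  # periodogram
    X = -2 * np.log(lam)
    Xc = X - X.mean()
    return Xc @ (np.log(I) - np.log(I).mean()) / (Xc @ Xc)

rng = np.random.default_rng(3)
x = np.cumsum(rng.standard_normal(1000))  # random walk: true d = 1
print("d-hat:", round(gph(x), 2))         # the boundary of the consistent range above
```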
  9. By: Ai Deng (Department of Economics, Boston University)
    Abstract: This paper provides an asymptotic theory for the spurious regression analyzed by Ferson, Sarkissian and Simin (2003). The asymptotic framework developed by Nabeya and Perron (1994) is used to provide approximations for the various estimates and statistics. Also, using a fixed-bandwidth asymptotic framework, a convergent t test is constructed, following Sun (2005). These are shown to be accurate and to explain the simulation findings in Ferson et al. (2003). Monte Carlo studies show that our asymptotic distribution provides a very good finite sample approximation for sample sizes often encountered in finance. Our analysis also reveals an important potential problem in the theoretical hypothesis testing literature on predictability. A possible reconciling interpretation is provided.
    Keywords: spurious regression, observational equivalence, Nabeya-Perron asymptotics, fixed-b asymptotics, data mining, nearly integrated, nearly white noise (NINW)
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-048&r=ecm
  10. By: Peter M Robinson
    Abstract: Employing recent results of Robinson (2005) we consider the asymptotic properties of conditional-sum-of-squares (CSS) estimates of parametric models for stationary time series with long memory. CSS estimation has been considered as a rival to Gaussian maximum likelihood and Whittle estimation of time series models. The latter kinds of estimate have been rigorously shown to be asymptotically normally distributed in the case of long memory. However, CSS estimates, which should have the same asymptotic distributional properties under similar conditions, have not received comparable treatment: the truncation of the infinite autoregressive representation inherent in CSS estimation has been essentially ignored in proofs of asymptotic normality. Unlike in short memory models, it is not straightforward to show that the truncation has a negligible effect. (The truncated representation is sketched below.)
    Keywords: Long memory, conditional-sum-of-squares estimation, central limit theorem, almost sure convergence.
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/505&r=ecm
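    A minimal sketch of CSS estimation in the simplest ARFIMA(0,d,0) case, making the truncated autoregressive representation explicit (simulated data; a grid search stands in for a numerical optimizer):
```python
import numpy as np

def css_resid(x, d):
    """Residuals from the truncated AR(inf) expansion of (1-L)^d x_t = e_t."""
    T = len(x)
    pi = np.ones(T)
    for j in range(1, T):
        pi[j] = pi[j - 1] * (j - 1 - d) / j   # AR coefficients of (1-L)^d
    return np.array([pi[:t + 1][::-1] @ x[:t + 1] for t in range(T)])

rng = np.random.default_rng(4)
T, d0 = 500, 0.3
e = rng.standard_normal(T)
psi = np.ones(T)
for j in range(1, T):
    psi[j] = psi[j - 1] * (j - 1 + d0) / j    # MA(inf) weights of (1-L)^{-d0}
x = np.array([psi[:t + 1] @ e[t::-1] for t in range(T)])

grid = np.linspace(0.01, 0.49, 49)
dhat = grid[np.argmin([(css_resid(x, d) ** 2).sum() for d in grid])]
print("CSS estimate of d:", round(float(dhat), 2))  # close to the true 0.3
```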
  11. By: George Athanasopoulos; Rob J. Hyndman
    Abstract: In this paper, we model and forecast Australian domestic tourism demand. We use a regression framework to estimate important economic relationships for domestic tourism demand. We also identify the impact of world events such as the 2000 Sydney Olympics and the 2002 Bali bombings on Australian domestic tourism. To explore the time series nature of the data, we use innovation state space models to forecast the domestic tourism demand. Combining these two frameworks, we build innovation state space models with exogenous variables. These models are able to capture the time series dynamics in the data, as well as economic and other relationships. We show that these models outperform alternative approaches for short-term forecasting and also produce sensible long-term forecasts. The forecasts are compared with the official Australian government forecasts, which are found to be more optimistic than our forecasts.
    Keywords: Australia, domestic tourism, exponential smoothing, forecasting, innovation state space models.
    JEL: C13 C22 C53
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2006-19&r=ecm
  12. By: Ivan Fernandez-Val (Department of Economics, Boston University)
    Abstract: Fixed effects estimates of structural parameters in nonlinear panel models can be severely biased due to the incidental parameters problem. In this paper I show that the most important component of this incidental parameters bias for probit fixed effects estimators of index coefficients is proportional to the true parameter value, using a large-T expansion of the bias. This result allows me to derive a lower bound for this bias, and to show that fixed effects estimates of ratios of coefficients and average marginal effects have zero bias in the absence of heterogeneity and have negligible bias relative to their true values for a wide range of distributions of regressors and individual effects. Numerical examples suggest that this small bias property also holds for logit and linear probability models, and for exogenous variables in dynamic binary choice models. An empirical analysis of female labor force participation using data from the PSID shows that whereas the significant biases in fixed effects estimates of model parameters do not contaminate the estimates of marginal effects in static models, estimates of both index coefficients and marginal effects can be severely biased in dynamic models. Improved bias-corrected estimators for index coefficients and marginal effects are also proposed for both static and dynamic models.
    JEL: C23 C25 J22
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-38&r=ecm
  13. By: Sokbae Lee; Oliver Linton; Yoon-Jae Whang
    Abstract: We propose a test of the hypothesis of stochastic monotonicity. This hypothesis is of interest in many applications. Our test is based on the supremum of a rescaled U-statistic. We show that its asymptotic distribution is Gumbel. The proof is difficult because the approximating Gaussian stochastic process contains both a stationary and a nonstationary part, and so we have to extend existing results that apply only to either one case or the other.
    Keywords: Distribution function, Extreme Value Theory, Gaussian Process, Monotonicity.
    JEL: C14 C15
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/504&r=ecm
  14. By: Stanislav Anatolyev (NES)
    Abstract: This article surveys estimation in stationary time series models using the approach of optimal instrumentation. We review tools that allow construction and implementation of optimal instrumental variables estimators in various circumstances: in single- and multiperiod models, in the absence and presence of conditional heteroskedasticity, and with linear and nonlinear instruments. We also discuss issues adjacent to the theme of optimal instruments. The article is directed primarily towards practitioners, but may also prove useful to econometric theorists and teachers of graduate econometrics.
    Keywords: Instrumental variables estimation; Moment restrictions; Optimal instrument; Efficiency bounds; Stationary time series.
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:cfr:cefirw:w0069&r=ecm
  15. By: Offer Lieberman (Technion-Israel Institute of Technology); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: An infinite-order asymptotic expansion is given for the autocovariance function of a general stationary long-memory process with memory parameter d in (-1/2, 1/2). The class of spectral densities considered includes as a special case the stationary and invertible ARFIMA(p,d,q) model. The leading term of the expansion is of the order O(1/k^{1-2d}), where k is the autocovariance order, consistent with the well known power law decay for such processes, and is shown to be accurate to an error of O(1/k^{3-2d}). The derivation uses Erdélyi's (1956) expansion for Fourier-type integrals when there are critical points at the boundaries of the range of integration, here the frequencies {0, 2π}. Numerical evaluations show that the expansion is accurate even for small k in cases where the autocovariance sequence decays monotonically, and in other cases for moderate to large k. The approximations are easy to compute across a variety of parameter values and models. (A numerical check of the leading term is given below.)
    Keywords: Autocovariance, Asymptotic expansion, Critical point, Fourier integral, Long memory
    JEL: C13 C22
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1586&r=ecm
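    A quick numerical check for the ARFIMA(0,d,0) special case mentioned above (unit innovation variance), comparing the exact autocovariance with the leading power-law term c k^{2d-1}:
```python
import numpy as np
from scipy.special import gammaln

d = 0.3

def acov_exact(k):
    """gamma(k) = Gamma(1-2d) Gamma(k+d) / [Gamma(d) Gamma(1-d) Gamma(k+1-d)]."""
    return np.exp(gammaln(1 - 2 * d) + gammaln(k + d)
                  - gammaln(d) - gammaln(1 - d) - gammaln(k + 1 - d))

c = np.exp(gammaln(1 - 2 * d) - gammaln(d) - gammaln(1 - d))  # leading constant
for k in [5, 50, 500]:
    # the two columns converge as k grows, at the O(1/k^{3-2d}) error rate
    print(k, round(float(acov_exact(k)), 5), round(c * k ** (2 * d - 1), 5))
```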
  16. By: Roger Klein (Rutgers University); Francis Vella (Georgetown University and IZA Bonn)
    Abstract: This paper provides a control function estimator to adjust for endogeneity in the triangular simultaneous equations model where there are no available exclusion restrictions to generate suitable instruments. Our approach is to exploit the dependence of the errors on exogenous variables (e.g. heteroscedasticity) to adjust the conventional control function estimator. The form of the error dependence on the exogenous variables is subject to restrictions, but is not parametrically specified. In addition to providing the estimator and deriving its large-sample properties, we present simulation evidence which indicates the estimator works well.
    Keywords: endogeneity, heteroskedasticity, control function
    JEL: C14 C30
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp2378&r=ecm
  17. By: Richard K. Crump; V. Joseph Hotz; Guido W. Imbens; Oscar A. Mitnik
    Abstract: Estimation of average treatment effects under unconfoundedness or exogenous treatment assignment is often hampered by lack of overlap in the covariate distributions. This lack of overlap can lead to imprecise estimates and can make commonly used estimators sensitive to the choice of specification. In such cases researchers have often used informal methods for trimming the sample. In this paper we develop a systematic approach to addressing such lack of overlap. We characterize optimal subsamples for which the average treatment effect can be estimated most precisely, as well as optimally weighted average treatment effects. Under some conditions the optimal selection rules depend solely on the propensity score. For a wide range of distributions a good approximation to the optimal rule is provided by the simple selection rule to drop all units with estimated propensity scores outside the range [0.1, 0.9]. (This rule is sketched below.)
    JEL: C1 C13 C14 C2 C21
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:nbr:nberte:0330&r=ecm
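    A minimal sketch of the simple rule quoted above, applied to an illustrative inverse-propensity-weighted estimate on simulated data (the propensity scores are taken as known here; this is not the authors' estimator):
```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
pscore = rng.beta(2, 2, n)                   # propensity scores, some near 0 or 1
treat = rng.random(n) < pscore
y = 1.0 * treat + pscore + rng.standard_normal(n)   # true treatment effect: 1.0

keep = (pscore >= 0.1) & (pscore <= 0.9)     # the [0.1, 0.9] trimming rule
t, p, yk = treat[keep], pscore[keep], y[keep]
ate = (np.average(yk[t], weights=1 / p[t])
       - np.average(yk[~t], weights=1 / (1 - p[~t])))   # normalized IPW on the trimmed sample
print(f"kept {keep.mean():.0%} of the sample, estimate {ate:.2f}")
```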
  18. By: Stanislav Anatolyev (NES)
    Abstract: We develop and evaluate sequential testing tools for a class of nonparametric tests for predictability of financial returns that includes, in particular, the directional accuracy and excess profitability tests. We consider both the retrospective context, where a researcher wants to track predictability over time in a historical sample, and the monitoring context, where a researcher conducts testing as new observations arrive. Throughout, we elaborate on both two-sided and one-sided testing, focusing on linear monitoring boundaries that are continuations of horizontal lines corresponding to retrospective critical values. We illustrate our methodology by testing for directional and mean predictability of returns in a dozen young stock markets in Eastern Europe.
    Keywords: Testing, monitoring, predictability, stock returns
    JEL: C12 C22 C52 C53
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:cfr:cefirw:w0071&r=ecm
  19. By: Gregory Connor; Oliver Linton
    Abstract: We introduce an alternative version of the Fama-French three-factor model of stock returns together with a new estimation methodology. We assume that the factor betas in the model are smooth nonlinear functions of observed security characteristics. We develop an estimation procedure that combines nonparametric kernel methods for constructing mimicking portfolios with parametric nonlinear regression to estimate factor returns and factor betas simultaneously. The methodology is applied to US common stocks and the empirical findings compared to those of Fama and French.
    Keywords: characteristic-based factor model, arbitrage pricing theory, kernel estimation, nonparametric estimation.
    JEL: G12 C14
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/506&r=ecm
  20. By: Manuela Angelucci (University of Arizona and IZA Bonn); Orazio Attanasio (University College London, NBER, BREAD and CEPR)
    Abstract: In this paper we discuss several methodological issues related to the identification and estimation of Average Treatment on the Treated (ATT) effects in the presence of low compliance. We consider non-experimental data consisting of a treatment group, where a program is implemented, and of a non-randomly drawn control group, where the program is not offered. Estimating the ATT involves tackling both the non-random assignment of the program and the non-random participation among treated individuals. We argue against standard matching approaches to deal with the latter issue because they are based on the assumption that we observe all variables that determine both participation and outcome. Instead, we propose an IV-type estimator which exploits the fact that, in the absence of spillover effects, the ATT can be expressed as the Average Intent to Treat divided by the participation share. We propose a semi-parametric estimator that couples the flexibility of matching estimators with a standard Instrumental Variable approach. We discuss the different assumptions necessary for the identification of the ATT with each of the two approaches, and we provide an empirical application by estimating the effect of the Mexican conditional cash transfer program, Oportunidades, on food consumption. (The identity underlying the estimator is sketched below.)
    Keywords: program evaluation, treatment effects
    JEL: C31
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp2368&r=ecm
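    A minimal sketch of the identity the estimator builds on: absent spillovers, the ATT equals the average intent-to-treat effect divided by the participation share. Simulated data; the paper's semiparametric implementation is considerably richer.
```python
import numpy as np

rng = np.random.default_rng(6)
n = 2000
offered = rng.random(n) < 0.5                 # program offered (treatment group)
takeup = offered & (rng.random(n) < 0.6)      # 60% participation among the offered
y = 2.0 * takeup + rng.standard_normal(n)     # true effect on participants: 2.0

itt = y[offered].mean() - y[~offered].mean()  # average intent-to-treat effect
share = takeup[offered].mean()                # participation share among the offered
print(f"ATT estimate: {itt / share:.2f}")     # close to 2.0
```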
  21. By: Li,Youwei; Donkers,Bas; Melenberg,Bertrand (Tilburg University, Center for Economic Research)
    Abstract: Microscopic simulation models are often evaluated based on visual inspection of the results. This paper presents formal econometric techniques to compare microscopic simulation (MS) models with real-life data. A related result is a methodology to compare different MS models with each other. For this purpose, possible parameters of interest, such as mean returns or autocorrelation patterns, are classified and characterized. For each class of characteristics, the appropriate techniques are presented. We illustrate the methodology by comparing the MS model developed by Levy, Levy, and Solomon (2000) and the market fraction model developed by He and Li (2005a, b) with actual data.
    Keywords: Microscopic simulation models; Econometric analysis
    JEL: C10 G12
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200699&r=ecm
  22. By: Ilze Kalnina; Oliver Linton
    Abstract: We propose an econometric model that captures the effects of market microstructure on a latent price process. In particular, we allow for correlation between the measurement error and the return process, and we allow the measurement error process to have diurnal heteroskedasticity. We propose a modification of the TSRV estimator of quadratic variation. We show that this estimator is consistent, with a rate of convergence that depends on the size of the measurement error, but is no worse than n^{1/6}. We investigate in simulation experiments the finite sample performance of various proposed implementations. (A baseline TSRV construction is sketched below.)
    Keywords: Endogenous noise, Market Microstructure, Realised Volatility, Semimartingale
    JEL: C12
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/509&r=ecm
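    A minimal sketch of a baseline two-scale realized variance (TSRV-type) construction under i.i.d. noise; the modification in the paper, which this sketch does not implement, allows the noise to be correlated with returns and diurnally heteroskedastic.
```python
import numpy as np

def tsrv(p, K=30):
    """Averaged sparse-scale realized variance minus a full-scale bias correction."""
    n = len(p) - 1
    rv_full = np.sum(np.diff(p) ** 2)          # noise-dominated at the finest scale
    rv_K = np.mean([np.sum(np.diff(p[k::K]) ** 2) for k in range(K)])
    nbar = (n - K + 1) / K
    return rv_K - (nbar / n) * rv_full         # de-biased estimate of quadratic variation

rng = np.random.default_rng(7)
n = 23400                                      # one-second grid over 6.5 hours
x = np.cumsum(0.2 / np.sqrt(n) * rng.standard_normal(n + 1))  # efficient log-price
p = x + 5e-4 * rng.standard_normal(n + 1)      # i.i.d. microstructure noise
print(f"TSRV: {tsrv(p):.4f}  (true quadratic variation 0.04)")
```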
  23. By: Li,Youwei; Donkers,Bas; Melenberg,Bertrand (Tilburg University, Center for Economic Research)
    Abstract: This paper illustrates how to compare different microscopic simulation (MS) models, and how to compare a MS model with real data, when the parameters of interest are estimated non- or semiparametrically. As examples we investigate the marginal single-period probability density function of stock returns, and the corresponding spectral density function and memory parameters. We illustrate the methodology with the MS model developed by Levy, Levy, and Solomon (2000) and the market fraction model developed by He and Li (2005a, b), and confront the resulting return data with the S&P 500 stock index data.
    Keywords: Microscopic simulation models;Probability density function;Spectral density function;Memory parameters
    JEL: C14 G12
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200695&r=ecm
  24. By: Pierre Perron (Department of Economics, Boston University); Zhongjun Qu (Department of Economics, Boston University)
    Abstract: Recently, there has been an upsurge of interest in the possibility of confusing long memory and structural changes in level. Many studies have documented the fact that when a stationary short memory process is contaminated by level shifts, the estimate of the fractional differencing parameter is biased away from zero and the autocovariance function exhibits a slow rate of decay, akin to a long memory process. Yet, no theoretical results are available pertaining to the distributions of the estimates. We fill this gap by analyzing the properties of the log periodogram estimate when the jump component is specified by a simple mixture model. Our theoretical results explain many findings reported and uncover new features. Simulations are presented to highlight the properties of the distributions and to assess the adequacy of our limit results as approximations to the finite sample distributions. We also explain how the limit distribution changes as the number of frequencies used varies, a feature that differs from the case of a pure fractionally integrated model. We confront this practical implication with daily S&P 500 absolute returns and their square roots over the period 1928-2002. Our findings are remarkable: the path of the log periodogram estimates clearly follows a pattern that would obtain if the true underlying process were short memory contaminated by level shifts rather than a pure fractionally integrated process. A simple testing procedure is also proposed, which reinforces this conclusion.
    Keywords: structural change, jumps, long memory processes, fractional integration, Poisson process, frequency domain estimates.
    JEL: C22
    Date: 2004–06
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2006-015&r=ecm
  25. By: A. PRINZIE; D. VAN DEN POEL
    Abstract: Data mining applications addressing classification problems must master two key tasks: feature selection and model selection. This paper proposes a random feature selection procedure integrated within the multinomial logit (MNL) classifier to perform both tasks simultaneously. We assess the potential of the random feature selection procedure (exploiting randomness) as compared to an expert feature selection method (exploiting domain knowledge) on a CRM cross-sell application. The results show great promise, as the predictive accuracy of the integrated random feature selection in the MNL algorithm is substantially higher than that of the expert feature selection method. (One plausible reading of the procedure is sketched below.)
    Date: 2006–05
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:06/390&r=ecm
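    One plausible reading of such a procedure, sketched under assumptions since the abstract does not spell out the algorithm: draw random feature subsets, fit a multinomial logit on each, and retain the subset with the best validation accuracy.
```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
X = rng.standard_normal((600, 40))                    # 40 candidate features
beta = np.zeros((40, 3))
beta[:5] = rng.standard_normal((5, 3))                # only 5 are informative
y = (X @ beta + rng.gumbel(size=(600, 3))).argmax(1)  # 3-class MNL outcomes

Xtr, Xva, ytr, yva = train_test_split(X, y, random_state=0)
best_subset, best_acc = None, -np.inf
for _ in range(50):                                   # 50 random subsets of 8 features
    S = rng.choice(40, size=8, replace=False)
    acc = LogisticRegression(max_iter=1000).fit(Xtr[:, S], ytr).score(Xva[:, S], yva)
    if acc > best_acc:
        best_subset, best_acc = S, acc
print("selected features:", sorted(best_subset.tolist()),
      "validation accuracy:", round(best_acc, 3))
```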
  26. By: Zhong Zhao (IZA Bonn)
    Abstract: We use the data from the National Supported Work Demonstration to study the performance of non-propensity-score matching estimators, and to compare them with propensity score matching. We find that all matching estimators studied here are sensitive to the choice of data set. Propensity score methods are sensitive to smoothing parameters and usually have larger standard errors. Difference-in-differences and bias-corrected matching improve the performance of the matching estimators considered here. Our results suggest that the 1974 earnings are important for Dehejia and Wahba’s PSID data but not for their CPS data in replicating the experimental results. After decomposing the selection bias, we find that a sizable selection bias on unobservables is present in all data sets. (A basic covariate-matching estimator is sketched below.)
    Keywords: treatment effect, matching estimators, NSW data, selection bias
    JEL: C14 C21 I38
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp2375&r=ecm
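    A minimal sketch of one member of the class of estimators the paper evaluates, a one-nearest-neighbour covariate-matching estimate of the effect on the treated; the implementations in the paper add refinements such as bias correction.
```python
import numpy as np

rng = np.random.default_rng(9)
n = 500
Xc = rng.standard_normal((n, 3))               # covariates
treat = rng.random(n) < 0.3
y = Xc.sum(1) + 1.5 * treat + rng.standard_normal(n)   # true effect 1.5

Xt, Xco = Xc[treat], Xc[~treat]
yt, yco = y[treat], y[~treat]
# match each treated unit to its nearest control in Euclidean distance
dist = ((Xt[:, None, :] - Xco[None, :, :]) ** 2).sum(-1)
att = (yt - yco[dist.argmin(1)]).mean()
print(f"matching estimate of the effect on the treated: {att:.2f}")
```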
  27. By: Esther Ruiz; Helena Veiga
    Abstract: In this paper, we propose a new stochastic volatility model, called A-LMSV, to cope simultaneously with the leverage effect and long memory. We derive its statistical properties and compare them with the properties of the FIEGARCH model. We show that the dependence of the autocorrelations of squares on the parameters measuring asymmetry and persistence is different in the two models. The kurtosis and autocorrelations of squares do not depend on the asymmetry in the A-LMSV model, while they increase with the asymmetry in the FIEGARCH model. Furthermore, the autocorrelations of squares increase with the persistence in the A-LMSV model and decrease in the FIEGARCH model. On the other hand, the autocorrelations of absolute returns increase with the magnitude of the asymmetry in the FIEGARCH model, while they can increase or decrease depending on the sign of the asymmetry in the A-LMSV model. Finally, the cross-correlations between squares and original observations are, in general, larger in the FIEGARCH model than in the A-LMSV model. The results are illustrated by fitting both models to the dynamic evolution of volatilities of daily returns of the S&P 500 and DAX indexes.
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws066016&r=ecm
  28. By: Giannone, Domenico; Reichlin, Lucrezia
    Abstract: This paper asks two questions. First, can we detect empirically whether the shocks recovered from the estimates of a structural VAR are truly structural? Second, can the problem of non-fundamentalness be solved by considering additional information? The answer to the first question is 'yes' and that to the second is 'under some conditions'.
    Keywords: identification; information; invertibility; structural VAR
    JEL: C32 C33 E00 E32 O3
    Date: 2006–06
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5725&r=ecm
  29. By: Assenmacher-Wesche, Katrin; Gerlach, Stefan
    Abstract: While monetary targeting has become increasingly rare, many central banks attach weight to money growth in setting interest rates. This raises the issue of how money can be combined with other variables, in particular the output gap, when analysing inflation. The Swiss National Bank emphasises that the indicators it uses to do so vary across forecasting horizons. While real indicators are employed for short-run forecasts, money growth is more important at longer horizons. Using band spectral regressions and causality tests in the frequency domain, we show that this interpretation of the inflation process fits the data well.
    Keywords: frequency domain; Phillips curve; quantity theory; spectral regression
    JEL: C22 E3 E5
    Date: 2006–06
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5723&r=ecm
  30. By: Frank A Cowell
    Abstract: This article provides a brief overview of the key issues in inequality measurement and has been prepared for inclusion in the second edition of The New Palgrave.
    Keywords: inequality, social welfare, ranking.
    JEL: C13 D63
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:cep:stidar:86&r=ecm
  31. By: Edward L. Glaeser
    Abstract: Economists are quick to assume opportunistic behavior in almost every walk of life other than our own. Our empirical methods are based on assumptions of human behavior that would not pass muster in any of our models. The solution to this problem is not to expect a mass renunciation of data mining, selective data cleaning or opportunistic methodology selection, but rather to follow Leamer's lead in designing and using techniques that anticipate the behavior of optimizing researchers. In this essay, I make ten points about a more economic approach to empirical methods and suggest paths for methodological progress.
    JEL: A11 B4
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:nbr:nberte:0329&r=ecm
  32. By: Tatsuma Wada (Department of Economics, Boston University); Pierre Perron (Department of Economics, Boston University)
    Abstract: This paper first generalizes the trend-cycle decomposition framework of Perron and Wada (2005), based on an unobserved components model with innovations drawn from a mixture of normals distribution, which can handle sudden level and slope changes to the trend function as well as outliers. Second, we investigate how important the differences in the implied trend and cycle are compared to the popular decomposition based on the Hodrick and Prescott (HP) (1997) filter. Our results show important qualitative and quantitative differences in the implied cycles for both real GDP and consumption series for the G7 countries. Most of the differences can be ascribed to the fact that the HP filter does not handle well slope changes, level shifts and outliers, while our method does. Third, we assess how such different cycles affect some so-called “stylized facts” about the relative variability of consumption and output across countries. Our results again show important differences. In particular, the cross-country consumption correlations are generally higher than the output correlations, except for the period from 1975 to 1985, provided Canada is excluded. Our results therefore provide a partial solution to this puzzle. The evidence is particularly strong for the most recent period.
    Keywords: Trend-Cycle Decomposition, Unobserved Components Model, International Business Cycle, Non Gaussian Filter.
    JEL: C22 E32
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-44&r=ecm

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.