nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒01‒26
fourteen papers chosen by
Sune Karlsson
Orebro University

  2. The Stambaugh bias in panel predictive regressions By Erik Hjalmarsson
  3. Testing for cointegration using the Johansen methodology when variables are near-integrated By Erik Hjalmarsson; Pär Österholm
  4. Quantile-Based Nonparametric Inference for First-Price Auctions By Marmer, Vadim; Shneyerov, Artyom
  5. Forming Priors for DSGE Models (and How it Affects the Assessment of Nominal Rigidities) By Marco Del Negro; Frank Schorfheide
  6. Multivariate Markov switching with weighted regime determination: giving France more weight than Finland By Michael J. Dueker; Martin Sola
  7. Testing for Purchasing Power Parity in Cointegrated Panels By Pär Österholm; Mikael Carlsson; Johan Lyhagen
  8. Stochastic Volatilities and Correlations, Extreme Values and Modeling the Macroeconomic Environment, Under Which Brazilian Banks Operate By Marcos Souto; Theodore M. Barnhill
  9. Should we Care for Structural Breaks When Assessing Fiscal Sustainability? By António Afonso; Christophe Rault
  10. Fitting Event-History Models to Uneventful Data By Douglas A. Wolf; Thomas M. Gill
  11. Estimating average marginal effects in nonseparable structural systems By Susanne Schennach; Halbert White; Karim Chalak
  12. Visualizing exploratory factor analysis models By Sigbert Klinke; Cornelia Wagner
  13. The Widest Cleft in Statistics - How and Why Fisher opposed Neyman and Pearson By Francisco Louçã
  14. The Default Risk of Firms Examined with Smooth Support Vector Machines By Wolfgang Härdle; Yuh-Jye Lee; Dorothea Schäfer; Yi-Ren Yeh

  1. By: Victoria Zinde-Walsh; Dongming Zhu
    Abstract: The new distribution class proposed in this paper, the Asymmetric Exponential Power Distribution (AEPD), generalizes the class of Skewed Exponential Power Distributions (SEPD) in a way that, in addition to skewness, introduces different decay rates of density in the left and right tails. Our parametrization provides an interpretable role for each parameter. We derive moments and moment-based measures: skewness, kurtosis, expected shortfall. It is demonstrated that a maximum entropy property holds for the AEPD distributions. We establish consistency, asymptotic normality and efficiency of the maximum likelihood estimators over a large part of the parameter space by dealing with the problems created by the non-smooth likelihood function, and derive explicit analytical expressions of the asymptotic covariance matrix; where the results apply to the SEPD class they enlarge on the current literature. Finally, we give a convenient stochastic representation of the distribution; our Monte Carlo study illustrates the theoretical results.
    JEL: C13 C16
    Date: 2007–10
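The asymmetric tail behaviour can be illustrated numerically. The sketch below builds a schematic two-piece exponential-power density with a heavier left tail (exponent 1) and a lighter right tail (exponent 2), normalized on a grid; the scale values and the two-piece form are illustrative assumptions, not the authors' AEPD parametrization.

```python
import numpy as np

def two_piece_ep(x, p1=1.0, s1=0.6, p2=2.0, s2=1.4):
    """Schematic two-piece exponential-power density: exponent p1 and scale s1
    left of zero, exponent p2 and scale s2 to the right. Illustrative only --
    not the AEPD parametrization of the paper."""
    left = np.exp(-(np.abs(x) / s1) ** p1)
    right = np.exp(-(np.abs(x) / s2) ** p2)
    return np.where(x < 0, left, right)

grid = np.linspace(-20.0, 20.0, 40001)
dx = grid[1] - grid[0]
u = two_piece_ep(grid)
f = u / (u.sum() * dx)          # normalize numerically (Riemann sum)

# With p1 < p2 the left tail decays more slowly, so far out in the tails
# the density is heavier on the left:
left_tail = f[np.searchsorted(grid, -5.0)]
right_tail = f[np.searchsorted(grid, 5.0)]
```

Different exponents on the two sides are exactly what distinguishes the AEPD from the SEPD, where only the scales (skewness) differ.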
  2. By: Erik Hjalmarsson
    Abstract: This paper analyzes predictive regressions in a panel data setting. The standard fixed effects estimator suffers from a small sample bias, which is the analogue of the Stambaugh bias in time-series predictive regressions. Monte Carlo evidence shows that the bias and resulting size distortions can be severe. A new bias-corrected estimator is proposed, which is shown to work well in finite samples and to lead to approximately normally distributed t-statistics. Overall, the results show that the econometric issues associated with predictive regressions when using time-series data to a large extent also carry over to the panel case. The results are illustrated with an application to predictability in international stock indices.
    Date: 2007
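The mechanism behind the time-series Stambaugh bias is easy to reproduce. The Monte Carlo below is purely illustrative (parameter values chosen for exposition, not taken from the paper): a return-like series is regressed on a persistent lagged predictor whose innovations are negatively correlated with the return innovations, and although the true slope is zero, the average OLS estimate is noticeably positive.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_sim = 50, 2000
rho, corr_uv = 0.95, -0.9        # persistent predictor, negative innovation correlation

slopes = []
for _ in range(n_sim):
    z = rng.standard_normal((T, 2))
    v = z[:, 0]                                          # predictor innovations
    u = corr_uv * v + np.sqrt(1 - corr_uv**2) * z[:, 1]  # correlated return innovations
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = rho * x[t - 1] + v[t]                     # AR(1) predictor
    X, Y = x[:-1], u[1:]                                 # true predictive slope is zero
    xd = X - X.mean()
    slopes.append(xd @ (Y - Y.mean()) / (xd @ xd))       # OLS slope with intercept

mean_bias = float(np.mean(slopes))
# Stambaugh's approximation: E[bias] ~ (sigma_uv/sigma_v^2) * (-(1+3*rho)/T), here ~ +0.07
```

The bias inherits its sign from the innovation correlation and shrinks with T, which is why the panel analogue and its correction matter in short panels.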
  3. By: Erik Hjalmarsson; Pär Österholm
    Abstract: We investigate the properties of Johansen's (1988, 1991) maximum eigenvalue and trace tests for cointegration under the empirically relevant situation of near-integrated variables. Using Monte Carlo techniques, we show that in a system with near-integrated variables, the probability of reaching an erroneous conclusion regarding the cointegrating rank of the system is generally substantially higher than the nominal size. The risk of concluding that completely unrelated series are cointegrated is therefore non-negligible. The spurious rejection rate can be reduced by performing additional tests of restrictions on the cointegrating vector(s), although it is still substantially larger than the nominal size.
    Date: 2007
  4. By: Marmer, Vadim; Shneyerov, Artyom
    Abstract: We propose a quantile-based nonparametric approach to inference on the probability density function (PDF) of the private values in first-price sealed-bid auctions with independent private values. Our method of inference is based on a fully nonparametric kernel-based estimator of the quantiles and PDF of observable bids. Our estimator attains the optimal rate of Guerre, Perrigne, and Vuong (2000), and is also asymptotically normal with the appropriate choice of the bandwidth. As an application, we consider the problem of inference on the optimal reserve price.
    JEL: C14 D44
    Date: 2008–01–17
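A minimal version of the quantile idea can be sketched as follows (a stylized sketch, not the authors' estimator): in a symmetric independent-private-values auction the value quantile function satisfies Q_v(t) = Q_b(t) + t Q_b'(t)/(n-1), so value quantiles can be recovered from empirical bid quantiles and a finite-difference derivative. With uniform values the truth is Q_v(t) = t, which the sketch recovers.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bidders = 2

# Uniform[0,1] private values; the symmetric equilibrium bid is b = v*(n-1)/n,
# so with two bidders each participant bids half of her value.
values = rng.uniform(0.0, 1.0, 200_000)
bids = values * (n_bidders - 1) / n_bidders

def bid_quantile(tau):
    return np.quantile(bids, tau)

def value_quantile(tau, h=0.01):
    # Q_v(tau) = Q_b(tau) + tau * Q_b'(tau) / (n-1);
    # the derivative is approximated by a central difference with bandwidth h.
    dq = (bid_quantile(tau + h) - bid_quantile(tau - h)) / (2 * h)
    return bid_quantile(tau) + tau * dq / (n_bidders - 1)

est = value_quantile(0.5)   # truth for uniform values: 0.5
```

The paper's kernel-smoothed version replaces the raw empirical quantiles and crude difference quotient with kernel estimates, which is what delivers the optimal rate and asymptotic normality.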
  5. By: Marco Del Negro; Frank Schorfheide
    Abstract: The paper discusses prior elicitation for the parameters of dynamic stochastic general equilibrium (DSGE) models, and provides a method for constructing prior distributions for a subset of these parameters from beliefs about the moments of the endogenous variables. The empirical application studies the role of price and wage rigidities in a New Keynesian DSGE model and finds that standard macro time series cannot discriminate among theories that differ in the quantitative importance of nominal frictions.
    JEL: C11 C32 E3
    Date: 2008–01
  6. By: Michael J. Dueker; Martin Sola
    Abstract: This article deals with using panel data to infer regime changes that are common to the entire cross section. The methods presented here apply to Markov switching vector autoregressions, dynamic factor models with Markov switching and other multivariate Markov switching models. The key feature we seek to add to these models is to permit cross-sectional units to have different weights in the calculation of regime probabilities. We apply our approach to estimating a business cycle chronology for the 50 U.S. States and the Euro area, and we compare results between country-specific weights and the usual case of equal weights. The model with weighted regime determination suggests that Europe experienced a recession in 2002-03, whereas the usual model with equal weights does not.
    Keywords: Business cycles ; France ; Finland
    Date: 2008
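The weighting idea can be sketched with a two-series, two-regime Hamilton filter in which each cross-sectional unit's log-density enters the regime likelihood with its own weight. All numbers below (means, weights, transition probabilities) are illustrative assumptions, not the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 100
state = np.array([0] * 50 + [1] * 50)            # common (unobserved) regime path
mus = np.array([[0.0, 2.0], [0.0, 2.0]])          # per-series means in regimes 0 and 1
sigma = 0.5
y = np.stack([mus[i, state] + sigma * rng.standard_normal(T) for i in range(2)])

weights = np.array([0.8, 0.2])                    # give series 0 more say than series 1
P = np.array([[0.95, 0.05], [0.05, 0.95]])        # regime transition matrix

def weighted_hamilton_filter(y, weights, P, mus, sigma):
    T = y.shape[1]
    filt = np.zeros((T, 2))
    prob = np.array([0.5, 0.5])
    for t in range(T):
        pred = P.T @ prob                         # one-step-ahead regime probabilities
        loglik = np.zeros(2)
        for s in range(2):
            resid = (y[:, t] - mus[:, s]) / sigma
            logdens = -0.5 * resid**2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)
            loglik[s] = weights @ logdens         # weighted cross-section contribution
        post = pred * np.exp(loglik - loglik.max())
        prob = post / post.sum()
        filt[t] = prob
    return filt

filt = weighted_hamilton_filter(y, weights, P, mus, sigma)
```

With equal weights this collapses to the standard multivariate filter; unequal weights let a large unit (France) dominate a small one (Finland) in the regime call.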
  7. By: Pär Österholm; Mikael Carlsson; Johan Lyhagen
    Abstract: This paper applies the maximum likelihood panel cointegration method of Larsson and Lyhagen (2007) to test the strong PPP hypothesis using data for the G7 countries. This method is robust in several important dimensions relative to previous methods, including the well-known issue of cross-sectional dependence of error terms. The findings using this new method are contrasted to those from the Pedroni (1995) cointegration tests and fully modified OLS and dynamic OLS estimators of the cointegrating vectors. Our overall results are the same across all approaches: The strong PPP hypothesis is rejected in favour of weak PPP with heterogeneous cointegrating vectors.
    Date: 2007–12–20
  8. By: Marcos Souto; Theodore M. Barnhill
    Abstract: Using monthly data for a set of variables, we examine the out-of-sample performance of various variance/covariance models and find that no model has consistently outperformed the others. We also show that it is possible to increase the probability mass toward the tails and to match reasonably well the historical evolution of volatilities by changing a decay factor appropriately. Finally, we implement a simple stochastic volatility model and simulate the credit transition matrix (CTM) for two large Brazilian banks and show that this methodology has the potential to improve simulated transition probabilities as compared to the constant volatility case. In particular, it can shift CTM probabilities towards lower credit risk categories.
    Keywords: Emerging markets, Brazil, Banks, Interest rates, Credit risk
    Date: 2007–12–21
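The role of the decay factor can be seen in a standard EWMA (RiskMetrics-style) variance recursion, one plausible reading of the "decay factor" mentioned above; the break point and parameter values below are illustrative. A lower decay factor discounts old observations faster and therefore adapts more quickly when volatility shifts.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 2000
true_sigma = np.where(np.arange(T) < 1000, 1.0, 2.0)   # volatility doubles mid-sample
r = true_sigma * rng.standard_normal(T)

def ewma_variance(r, lam):
    # sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2
    s2 = np.empty(len(r))
    s2[0] = r[:50].var()                               # simple initialization
    for t in range(1, len(r)):
        s2[t] = lam * s2[t - 1] + (1 - lam) * r[t - 1] ** 2
    return s2

s2_fast = ewma_variance(r, lam=0.90)   # low decay factor: short memory, fast reaction
s2_slow = ewma_variance(r, lam=0.99)   # high decay factor: long memory, slow reaction

# Steps needed after the break for each filter to register the higher variance:
lag_fast = int(np.argmax(s2_fast[1000:] > 2.5))
lag_slow = int(np.argmax(s2_slow[1000:] > 2.5))
```

The faster filter also produces more dispersed variance estimates, which is how a suitably chosen decay factor pushes probability mass toward the tails of the return distribution.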
  9. By: António Afonso; Christophe Rault
    Abstract: We apply recent panel cointegration methods to a structural equation between government expenditure and revenue. Allowing for multiple endogenous breaks and after computing appropriate bootstrap critical values, we find evidence of fiscal sustainability in the overall EU15 panel.
    Keywords: fiscal sustainability; EU; panel cointegration.
    JEL: C23 E62 H62
    Date: 2008–01
  10. By: Douglas A. Wolf (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Thomas M. Gill (Yale University School of Medicine; Dorothy Adler Geriatric Assessment Center, New Haven, CT 08504)
    Abstract: Data with which to study disability dynamics usually take the form of successive current-status measures of disability rather than a record of events or spell durations. One recent paper presented a semi-Markov model of disability dynamics in which spell durations were inferred from sequences of current-status measures taken at 12-month intervals. In that analysis, it was assumed that no unobserved disablement transitions occurred between annual interviews. We use data from a longitudinal survey in which participants' disability was measured at monthly intervals, and simulate the survival curves for remaining disabled that would be obtained with 1- and 12-month follow-up intervals. The median length of an episode of disability based on the 12-month interval data is over 22 months, while the "true" median, based on the 1-month interval data, is only one month.
    Keywords: Disability; semi-Markov process; duration analysis
    JEL: C41 C81 I19
    Date: 2008–01
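The measurement effect is easy to reproduce in a toy simulation (the monthly onset and recovery hazards below are illustrative assumptions, not the study's estimates): spells reconstructed from annual current-status snapshots cannot be shorter than 12 months, while most true spells last only a month or two.

```python
import numpy as np

rng = np.random.default_rng(4)
n, months = 2000, 120
p_onset, p_recover = 0.02, 0.5          # monthly hazards: rare onset, quick recovery

disabled = np.zeros((n, months), dtype=bool)
for t in range(1, months):
    prev = disabled[:, t - 1]
    disabled[:, t] = np.where(prev,
                              rng.random(n) > p_recover,   # still disabled
                              rng.random(n) < p_onset)     # new onset

def spell_lengths(panel):
    # lengths of runs of consecutive "disabled" observations, per person
    lengths = []
    for row in panel:
        run = 0
        for obs in row:
            if obs:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
        if run:
            lengths.append(run)
    return np.array(lengths)

true_spells = spell_lengths(disabled)            # monthly observation (in months)
annual = disabled[:, ::12]                       # current status every 12 months
apparent = spell_lengths(annual) * 12            # naive: consecutive waves -> months
```

Annual snapshots also miss every spell that begins and ends between interviews, so short spells are both lengthened and undercounted.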
  11. By: Susanne Schennach (University of Chicago); Halbert White (University of California-San Diego); Karim Chalak (Boston College)
    JEL: C13 C14 C31
    Date: 2007–12–03
  12. By: Sigbert Klinke; Cornelia Wagner
    Abstract: Exploratory factor analysis (EFA) is an important tool in data analysis, particularly in the social sciences. Usually four steps are carried out, each containing a large number of options. One important option is the number of factors and the association of variables with factors. Our tools aim to visualize, in parallel, models with different numbers of factors and to analyze the consequences of a specific choice. We apply our method to data collected at the School of Business and Economics for the evaluation of lectures by students. These data were analyzed by Zhou (2004) and Reichelt (2007).
    Keywords: Factor analysis, visualization, questionnaire, evaluation of teaching
    JEL: C39 C45 C63
    Date: 2008–01
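The "number of factors" choice the tools visualize can be illustrated with the simplest device, eigenvalues of the correlation matrix and the Kaiser (eigenvalue > 1) rule, on simulated data with a known two-factor structure; the loading and noise values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 500
f = rng.standard_normal((N, 2))                  # two latent factors
loadings = np.zeros((6, 2))
loadings[:3, 0] = 0.8                            # variables 1-3 load on factor 1
loadings[3:, 1] = 0.8                            # variables 4-6 load on factor 2
X = f @ loadings.T + 0.6 * rng.standard_normal((N, 6))   # unit-variance observables

R = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]   # eigenvalues, descending
n_factors = int((eigvals > 1.0).sum())           # Kaiser criterion
```

Plotting such eigenvalue profiles (scree plots) side by side for competing factor counts is one natural way to visualize the consequences of this option.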
  13. By: Francisco Louçã
    Abstract: The paper investigates the “widest cleft”, as Savage put it, between frequentists in the foundation of modern statistics: that opposing R.A. Fisher to Jerzy Neyman and Egon Pearson. Apart from deep personal confrontation throughout their lives, these scientists could not agree on methodology, on definitions, on concepts or on tools. Their premises and their conclusions differed widely, and the two groups they inspired ferociously opposed each other in all arenas of scientific debate. As the abyss widened, economists, with rare exceptions, remained innocent of this confrontation. The introduction of probability into economics in fact occurred after these ravaging battles began, even if they were not as public as they became in the 1950s. In any case, when Haavelmo, in the 1940s, suggested a reinterpretation of economics in terms of probability concepts, he chose sides and inscribed his concepts in the Neyman-Pearson tradition. But the majority of the profession indifferently used tools developed by each of the opposed groups of statisticians, and many puzzled economists chose to ignore the debate. Economics consequently became one of the experimental fields for “hybridization”, a synthesis between Fisherian and Neyman-Pearsonian precepts, defined as a number of practical procedures for statistical testing and inference that were developed notwithstanding the original authors, as an eventual convergence between what they considered to be radically irreconcilable.
    Date: 2008–01
  14. By: Wolfgang Härdle; Yuh-Jye Lee; Dorothea Schäfer; Yi-Ren Yeh
    Abstract: In the era of Basel II a powerful tool for bankruptcy prognosis is vital for banks. The tool must be precise but also easily adaptable to the bank's objectives regarding the relation of false acceptances (Type I error) and false rejections (Type II error). We explore the suitability of Smooth Support Vector Machines (SSVM), and investigate how important factors such as the selection of appropriate accounting ratios (predictors), the length of the training period and the structure of the training sample influence the precision of prediction. Furthermore, we show that oversampling can be employed to gear the tradeoff between error types. Finally, we illustrate graphically how different variants of SSVM can be used jointly to support the decision task of loan officers.
    Keywords: Insolvency Prognosis, SVMs, Statistical Learning Theory, Non-parametric Classification
    JEL: G30 C14 G33 C45
    Date: 2007
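A minimal numpy sketch of the smooth-SVM idea: the standard SSVM formulation replaces the hinge's plus function with the smooth approximation p(x, a) = x + log(1 + exp(-a*x))/a and minimizes a squared smoothed-slack objective by plain gradient descent. The toy data, learning rate and the oversampling remark below are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    # smooth approximation of max(x, 0): x + log(1 + exp(-alpha*x)) / alpha,
    # computed via logaddexp for numerical stability
    return np.logaddexp(0.0, alpha * x) / alpha

def fit_ssvm(X, y, C=1.0, alpha=5.0, lr=0.1, iters=2000):
    # Gradient descent on 0.5*(||w||^2 + b^2) + (C/(2n)) * sum_i p(1 - y_i(w.x_i + b))^2
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(iters):
        margin = 1.0 - y * (X @ w + b)
        p = smooth_plus(margin, alpha)
        sig = np.exp(-np.logaddexp(0.0, -alpha * margin))  # p'(margin) = sigmoid(alpha*margin)
        g = (C / n) * p * sig                              # per-sample chain-rule factor
        w -= lr * (w - X.T @ (g * y))
        b -= lr * (b - np.sum(g * y))
    return w, b

rng = np.random.default_rng(6)
# toy "defaulting" (-1) vs "solvent" (+1) firms in two ratio dimensions;
# duplicating one class before fitting (oversampling) is one way to tilt
# the Type I / Type II error tradeoff
X = np.vstack([rng.normal(-1.5, 1.0, (100, 2)), rng.normal(1.5, 1.0, (100, 2))])
y = np.r_[-np.ones(100), np.ones(100)]

w, b = fit_ssvm(X, y)
accuracy = float(np.mean(np.sign(X @ w + b) == y))
```

Smoothing makes the objective twice differentiable, which is what allows the fast Newton-type solvers used in the SSVM literature; plain gradient descent is used here only to keep the sketch short.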

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.