nep-ecm New Economics Papers
on Econometrics
Issue of 2012‒09‒09
23 papers chosen by
Sune Karlsson
Orebro University

  1. An Improved Nonparametric Unit-Root Test By Jiti Gao; Maxwell King
  2. Nonparametric Kernel Regression with Multiple Predictors and Multiple Shape Constraints By Pang Du; Christopher F. Parmeter; Jeffrey S. Racine
  3. A simple two-step method for testing moment inequalities with an application to inference in partially identified models By Joseph P. Romano; Azeem M. Shaikh; Michael Wolf
  4. Bootstrap Determination of the Co-integration Rank in Heteroskedastic VAR Models By Giuseppe Cavaliere; Anders Rahbek; A.M.Robert Taylor
  5. Variance Ratio Testing for Fractional Cointegration in Presence of Trends and Trend Breaks By Dechert, Andreas
  6. Estimating High-Dimensional Time Series Models By Marcelo C. Medeiros; Eduardo F. Mendes
  7. BAYESIAN UNIT ROOT TESTING: THE EFFECT OF CHOICE OF PRIOR ON TEST OUTCOMES By Charley Xia and William Griffiths
  8. Spline Regression in the Presence of Categorical Predictors By Shujie Ma; Jeffrey S. Racine; Lijian Yang
  9. Granger-causal analysis of VARMA-GARCH models By Tomasz Wozniaka
  10. Measuring financial risk and portfolio optimization with a non-Gaussian multivariate model By Kim, Young Shin; Giacometti, Rosella; Rachev, Svetlozar T.; Fabozzi, Frank J.; Mignacca, Domenico
  11. Statistical analysis of the Lognormal-Pareto distribution using Probability Weighted Moments and Maximum Likelihood By Marco Bee
  12. A Discrete Choice Approach to Estimating Armed Conflicts’ Casualties: Revisiting the Numbers of a ‘Truth Commission’ By Silvio Rendon
  13. The StoNED age: The departure into a new era of efficiency analysis? An MC study comparing StoNED and the "oldies" (SFA and DEA) By Andor, Mark; Hesse, Frederik
  14. Additive Regression Splines With Irrelevant Categorical and Continuous Regressors By Shujie Ma; Jeffrey S. Racine
  15. Analysis of Numerical Errors By Manuel S. Santos; Adrian Peralta-Alva
  16. Statistical verification of a natural "natural experiment": Tests and sensitivity checks for the sibling sex ratio instrument By Huber, Martin
  17. Target Fitting and Robustness Analysis in CGE Models By Gabriel Garber; Eduardo A. Haddad
  18. WHY IT IS OK TO USE THE HAR-RV(1,5,21) MODEL By Mihaela Craioveanu; Eric Hillebrand
  19. Calculating Poverty Measures from the Generalized Beta Income Distribution By DUANGKAMON CHOTIKAPANICH, WILLIAM GRIFFITHS, WASANA KARUNARATHNE, D.S. PRASADA RAO
  20. Methodological mistakes and econometric consequences By Zaman, Asad
  21. Gravity or Dummies? The Limits of Identification in Gravity Estimations By Cecília Hornok
  22. An Early-warning and Dynamic Forecasting Framework of Default Probabilities for the Macroprudential Policy Indicators Arsenal By Xisong Jin; Francisco Nadal De Simone
  23. Bayesian Semiparametric Dynamic Nelson-Siegel Model By Cem Çakmakli

  1. By: Jiti Gao; Maxwell King
    Abstract: This paper proposes a simple and improved nonparametric unit-root test. An asymptotic distribution of the proposed test is established. Finite sample comparisons with an existing nonparametric test are discussed. Some issues about possible extensions are outlined.
    Keywords: Autoregression, nonparametric unit-root test, nonstationary time series, specification testing.
    JEL: C12 C14 C22
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2012-16&r=ecm
  2. By: Pang Du; Christopher F. Parmeter; Jeffrey S. Racine
    Abstract: Nonparametric smoothing under shape constraints has recently received much well-deserved attention. Powerful methods have been proposed for imposing a single shape constraint such as monotonicity and concavity on univariate functions. In this paper, we extend the monotone kernel regression method in Hall and Huang (2001) to the multivariate and multi-constraint setting. We impose equality and/or inequality constraints on a nonparametric kernel regression model and its derivatives. A bootstrap procedure is also proposed for testing the validity of the constraints. Consistency of our constrained kernel estimator is provided through an asymptotic analysis of its relationship with the unconstrained estimator. Theoretical underpinnings for the bootstrap procedure are also provided. Illustrative Monte Carlo results are presented and an application is considered.
    Keywords: shape restrictions, nonparametric regression, multivariate kernel estimation, hypothesis testing
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:mcm:deptwp:2012-08&r=ecm
  3. By: Joseph P. Romano; Azeem M. Shaikh; Michael Wolf
    Abstract: This paper considers the problem of testing a finite number of moment inequalities. We propose a two-step approach. In the first step, a confidence region for the moments is constructed. In the second step, this set is used to provide information about which moments are “negative.” A Bonferroni-type correction is used to account for the fact that with some probability the moments may not lie in the confidence region. It is shown that the test controls size uniformly over a large class of distributions for the observed data. An important feature of the proposal is that it remains computationally feasible, even when the number of moments is very large. The finite-sample properties of the procedure are examined via a simulation study, which demonstrates, among other things, that the proposal remains competitive with existing procedures while being computationally more attractive.
    Keywords: Bonferroni inequality, bootstrap, moment inequalities, partial identification, uniform validity
    JEL: C12 C14
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:zur:econwp:090&r=ecm
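    Illustration: a stylized two-step moment-selection test with a Bonferroni split of the level, in the spirit of the abstract but not the authors' exact construction; the data-generating process, the split of alpha, and the bootstrap size below are arbitrary choices.

      import numpy as np

      rng = np.random.default_rng(9)
      n, k = 250, 20
      data = rng.standard_normal((n, k)) - 0.1        # H0: all moment means <= 0 (true here)

      alpha, beta_bf = 0.05, 0.005                    # overall level and Bonferroni split
      mbar, s = data.mean(0), data.std(0, ddof=1)
      t_stat = np.sqrt(n) * mbar / s                  # studentized sample moments
      T = t_stat.max()

      # Bootstrap distribution of the studentized, recentered moments.
      B = 999
      boot = np.empty((B, k))
      for b in range(B):
          db = data[rng.integers(0, n, n)]
          boot[b] = np.sqrt(n) * (db.mean(0) - mbar) / db.std(0, ddof=1)

      # Step 1: keep moments that are not clearly slack at confidence level 1 - beta_bf.
      c1 = np.quantile(boot.max(1), 1 - beta_bf)
      keep = t_stat > -c1
      if not keep.any():
          keep[t_stat.argmax()] = True                # degenerate case: keep the largest moment

      # Step 2: critical value over the kept moments only, at level alpha - beta_bf.
      c2 = np.quantile(boot[:, keep].max(1), 1 - (alpha - beta_bf))
      print(f"T = {T:.3f}, critical value = {c2:.3f}, reject H0: {T > c2}")
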
  4. By: Giuseppe Cavaliere (Department of Statistical Sciences, University of Bologna); Anders Rahbek (Department of Economics, University of Copenhagen and CREATES); A.M.Robert Taylor (School of Economics and Granger Centre for Time Series Econometrics, University of Nottingham)
    Abstract: In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio [PLR] co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates of the underlying VAR model which obtain under the reduced rank null hypothesis. They propose methods based on an i.i.d. bootstrap re-sampling scheme and establish the validity of their proposed bootstrap procedures in the context of a co-integrated VAR model with i.i.d. innovations. In this paper we investigate the properties of their bootstrap procedures, together with analogous procedures based on a wild bootstrap re-sampling scheme, when time-varying behaviour is present in either the conditional or unconditional variance of the innovations. We show that the bootstrap PLR tests are asymptotically correctly sized and, moreover, that the probability that the associated bootstrap sequential procedures select a rank smaller than the true rank converges to zero. This result is shown to hold for both the i.i.d. and wild bootstrap variants under conditional heteroskedasticity but only for the latter under unconditional heteroskedasticity. Monte Carlo evidence is reported which suggests that the bootstrap approach of Cavaliere et al. (2012) significantly improves upon the finite sample performance of corresponding procedures based on either the asymptotic PLR test or an alternative bootstrap method (where the short run dynamics in the VAR model are estimated unrestrictedly) for a variety of conditionally and unconditionally heteroskedastic innovation processes.
    Keywords: Bootstrap, Co-integration, Trace statistic, Rank determination, heteroskedasticity.
    JEL: C30 C32
    Date: 2012–08–31
    URL: http://d.repec.org/n?u=RePEc:aah:create:2012-36&r=ecm
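    Illustration: the wild bootstrap re-sampling step at the heart of the procedures compared above, shown for a simple AR(1) fitted by OLS rather than for the full co-integrated VAR and PLR statistic; all numbers are simulated placeholders.

      import numpy as np

      rng = np.random.default_rng(0)

      # Illustrative data: an AR(1) series with unconditionally heteroskedastic innovations.
      n = 200
      e = rng.standard_normal(n) * np.linspace(0.5, 2.0, n)
      y = np.zeros(n)
      for t in range(1, n):
          y[t] = 0.8 * y[t - 1] + e[t]

      # Fit AR(1) by OLS and collect residuals.
      x, z = y[:-1], y[1:]
      rho_hat = x @ z / (x @ x)
      resid = z - rho_hat * x

      # One wild bootstrap replication: multiply each residual by an independent
      # Rademacher draw, then rebuild the series recursively under the fitted model.
      w = rng.choice([-1.0, 1.0], size=resid.size)
      e_star = resid * w
      y_star = np.zeros(n)
      for t in range(1, n):
          y_star[t] = rho_hat * y_star[t - 1] + e_star[t - 1]

      # In the full procedure this replication is repeated B times and the test
      # statistic is recomputed on each bootstrap sample to obtain critical values.
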
  5. By: Dechert, Andreas
    Abstract: Modeling fractional cointegration relationships has become a major topic in applied time series analysis as it steps back from the traditional rigid I(1)/I(0) methodology. Hence, the number of proposed tests and approaches has grown over the last decade. The aim of this paper is to study the nonparametric variance ratio approach suggested by Nielsen for the case of fractional cointegration in the presence of a linear trend and trend breaks. The consideration of trend breaks is very important in order to avoid spurious fractional integration, so this possibility should be taken into account by practitioners. This paper proposes to calculate p-values by means of gamma distributions and provides response regression parameters for their asymptotic moments. In Monte Carlo simulations this work compares the power of the approach with that of a Johansen-type rank test, which is robust to trend breaks but not to fractional (co-)integration. As the approach also obtains an estimator of the cointegration space, the paper compares it with OLS estimates in simulations. As an empirical example, the validity of the market expectation hypothesis is tested for monthly Treasury bill rates from 1958 to 2011, which might have a trend break around September 1979 due to a change in American monetary policy.
    Keywords: fractional integration; fractional cointegration; long memory; variance ratio; nonparametric; trend breaks; market expectation hypothesis
    JEL: C32 E43 C14
    Date: 2012–09–04
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:41044&r=ecm
  6. By: Marcelo C. Medeiros (Pontifical Catholic University of Rio de Janeiro); Eduardo F. Mendes (Pontifical Catholic University of Rio de Janeiro)
    Abstract: We study the asymptotic properties of the Adaptive LASSO (adaLASSO) in sparse, high-dimensional, linear time-series models. We allow both the number of covariates in the model and the number of candidate variables to increase with the number of observations, and the number of candidate variables may be larger than the number of observations. We show that the adaLASSO consistently chooses the relevant variables as the number of observations increases (model selection consistency) and has the oracle property, even when the errors are non-Gaussian and conditionally heteroskedastic. A simulation study shows that the method performs well in very general settings. Finally, we consider two applications: in the first the goal is to forecast quarterly US inflation one step ahead, and in the second we are interested in the excess return of the S&P 500 index. The method outperforms the usual benchmarks in the literature.
    Keywords: sparse models, shrinkage, LASSO, adaLASSO, time series, forecasting.
    JEL: C22
    Date: 2012–09–04
    URL: http://d.repec.org/n?u=RePEc:aah:create:2012-37&r=ecm
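    Illustration: a minimal two-step adaptive LASSO on simulated data, using a ridge pilot fit for the penalty weights and a rescaled-design trick for the weighted L1 penalty; this is a generic sketch, not the authors' implementation or tuning scheme.

      import numpy as np
      from sklearn.linear_model import Ridge, LassoCV

      rng = np.random.default_rng(1)
      n, p = 200, 50
      X = rng.standard_normal((n, p))
      beta = np.zeros(p)
      beta[:3] = [1.5, -1.0, 0.5]                     # sparse truth
      y = X @ beta + rng.standard_normal(n)

      # Step 1: pilot estimate (ridge) to build adaptive weights w_j = 1 / |b_j|^gamma.
      pilot = Ridge(alpha=1.0).fit(X, y)
      gamma = 1.0
      w = 1.0 / (np.abs(pilot.coef_) ** gamma + 1e-8)

      # Step 2: weighted L1 penalty via rescaling: run the lasso on X_j / w_j,
      # then undo the scaling on the coefficients.
      fit = LassoCV(cv=5).fit(X / w, y)
      beta_hat = fit.coef_ / w

      print("selected variables:", np.flatnonzero(np.abs(beta_hat) > 1e-8))
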
  7. By: Charley Xia and William Griffiths
    Abstract: A Monte Carlo experiment is used to examine the size and power properties of alternative Bayesian tests for unit roots. Four different prior distributions for the root that is potentially unity – a uniform prior and priors attributable to Jeffreys, Lubrano, and Berger and Yang – are used in conjunction with two testing procedures: a credible interval test and a Bayes factor test. Two extensions are also considered: a test based on model averaging with different priors and a test with a hierarchical prior for a hyperparameter. The tests are applied to both trending and non-trending series. Our results favor the use of a prior suggested by Lubrano. Outcomes from applying the tests to some Australian macroeconomic time series are presented.
    Keywords: N/A
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:1152&r=ecm
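    Illustration: a stylized credible-interval unit-root check for an AR(1) coefficient under a flat prior, conditioning on an OLS estimate of the innovation variance; the informative priors compared in the paper (Jeffreys, Lubrano, Berger and Yang) would replace the flat prior below.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 200
      y = np.cumsum(rng.standard_normal(n))      # a random walk, so the true root is 1

      x, z = y[:-1], y[1:]
      rho_ols = x @ z / (x @ x)
      sigma2 = np.mean((z - rho_ols * x) ** 2)

      # Grid posterior for rho under a flat prior, conditional on sigma2:
      # p(rho | data) is proportional to exp(-sum (z_t - rho * x_t)^2 / (2 * sigma2)).
      grid = np.linspace(0.8, 1.1, 2001)
      loglik = np.array([-np.sum((z - r * x) ** 2) / (2 * sigma2) for r in grid])
      post = np.exp(loglik - loglik.max())
      post /= post.sum()

      # 95% equal-tailed credible interval; "reject a unit root" if 1 lies outside it.
      cdf = np.cumsum(post)
      low, high = grid[np.searchsorted(cdf, 0.025)], grid[np.searchsorted(cdf, 0.975)]
      print(f"95% credible interval for rho: [{low:.3f}, {high:.3f}]; contains 1: {low <= 1 <= high}")
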
  8. By: Shujie Ma; Jeffrey S. Racine; Lijian Yang
    Abstract: We consider the problem of estimating a relationship nonparametrically using regression splines when there exist both continuous and categorical predictors. We combine the global properties of regression splines with the local properties of categorical kernel functions to handle the presence of categorical predictors rather than resorting to sample splitting as is typically done to accommodate their presence. The resulting estimator possesses substantially better finite-sample performance than either its frequency-based peer or cross-validated local linear kernel regression or even additive regression splines (when additivity does not hold). Theoretical underpinnings are provided and Monte Carlo simulations are undertaken to assess finite-sample behavior, and two illustrative applications are provided. An implementation in R (R Core Team (2012)) is available; see the R package 'crs' for details (Racine & Nie (2012)).
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:mcm:deptwp:2012-06&r=ecm
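    Illustration: the core idea of smoothing in the continuous direction while kernel-weighting, rather than splitting, over categories; a cubic polynomial basis stands in for the B-spline basis and the categorical bandwidth is fixed rather than cross-validated. The R package 'crs' implements the actual estimator.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 300
      x = rng.uniform(0, 1, n)                   # continuous predictor
      z = rng.integers(0, 2, n)                  # binary categorical predictor
      y = np.sin(2 * np.pi * x) + 0.5 * z + 0.2 * rng.standard_normal(n)

      B = np.vander(x, 4)                        # cubic polynomial basis (spline stand-in)
      lam = 0.3                                  # categorical kernel bandwidth in [0, 1]

      def fit_at_category(z0):
          # Aitchison-Aitken-type weights: 1 for the matching category, lam otherwise,
          # so observations from other categories are down-weighted, not discarded.
          sw = np.sqrt(np.where(z == z0, 1.0, lam))
          coef, *_ = np.linalg.lstsq(B * sw[:, None], y * sw, rcond=None)
          return coef

      for z0 in (0, 1):
          print(f"category {z0}: basis coefficients {fit_at_category(z0).round(3)}")
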
  9. By: Tomasz Wozniaka
    Abstract: Recent economic developments have shown the importance of spillover and contagion effects in financial markets. Such effects are not limited to relations between the levels of financial variables but also affect their volatility. I investigate Granger causality in the conditional mean and conditional variances of time series. For this purpose a VARMA-GARCH model is used. I derive parametric restrictions for the hypothesis of noncausality in conditional variances between two groups of variables, when there are other variables in the system as well. These novel conditions are convenient for the analysis of potentially large systems of economic variables. Such systems should be considered in order to avoid the problem of omitted variable bias. Further, I propose a Bayesian Lindley-type testing procedure in order to evaluate hypotheses of noncausality. It avoids the singularity problem that may appear in the Wald test. Also, it relaxes the assumption of the existence of higher-order moments of the residuals required for the derivation of asymptotic results of the classical tests. In the empirical example, I find that the dollar-to-Euro exchange rate does not second-order cause the pound-to-Euro exchange rate in a system of variables that also contains the Swiss franc-to-Euro exchange rate, which confirms the meteor shower hypothesis of Engle, Ito & Lin (1990).
    Keywords: Granger causality, second-order noncausality, VARMA-GARCH models, Bayesian testing
    JEL: C11 C12 C32 C53
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2012/19&r=ecm
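    Illustration: the paper's contribution concerns noncausality in conditional variances within a Bayesian VARMA-GARCH framework; as a much simpler classical point of reference, the snippet below runs a standard Granger-causality-in-mean test on simulated series with statsmodels.

      import numpy as np
      from statsmodels.tsa.stattools import grangercausalitytests

      rng = np.random.default_rng(4)
      n = 500
      x = rng.standard_normal(n)
      y = np.zeros(n)
      for t in range(1, n):
          # x Granger-causes y in mean through its first lag
          y[t] = 0.4 * y[t - 1] + 0.5 * x[t - 1] + rng.standard_normal()

      # Column order matters: the test asks whether the second column helps predict
      # the first beyond the first column's own lags. F and chi-square versions are
      # printed for each lag and also returned as a dictionary of results.
      results = grangercausalitytests(np.column_stack([y, x]), maxlag=2)
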
  10. By: Kim, Young Shin; Giacometti, Rosella; Rachev, Svetlozar T.; Fabozzi, Frank J.; Mignacca, Domenico
    Abstract: In this paper, we propose a multivariate market model with returns assumed to follow a multivariate normal tempered stable distribution. This distribution, defined by a mixture of the multivariate normal distribution and the tempered stable subordinator, is consistent with two stylized facts that have been observed for asset distributions: fat-tails and an asymmetric dependence structure. Assuming infinitely divisible distributions, we derive closed-form solutions for two important measures used by portfolio managers in portfolio construction: the marginal VaR and the marginal AVaR. We illustrate the proposed model using stocks comprising the Dow Jones Industrial Average, first statistically validating the model based on goodness-of-fit tests and then demonstrating how the marginal VaR and marginal AVaR can be used for portfolio optimization using the model. Based on the empirical evidence presented in this paper, our framework offers more realistic portfolio risk measures and a more tractable method for portfolio optimization.
    Keywords: portfolio risk, portfolio optimization, portfolio budgeting, marginal contribution, fat-tailed distribution, multivariate normal tempered stable distribution
    JEL: C58 C61 G11 G32
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:zbw:kitwps:44&r=ecm
  11. By: Marco Bee
    Abstract: This paper deals with the estimation of the lognormal-Pareto and the lognormal-Generalized Pareto mixture distributions. The log-likelihood function is discontinuous, so that Maximum Likelihood Estimation is not asymptotically optimal. For this reason, we develop an alternative method based on Probability Weighted Moments. We show that the standard version of the method can be applied to the former distribution, but not to the latter. Thus, in the lognormal-Generalized Pareto case, we work out the details of a mixed approach combining Maximum Likelihood Estimation and Probability Weighted Moments. Extensive simulations give precise indications about the relative efficiencies of the methods in various setups. Finally, we apply the techniques to two real datasets in the actuarial and operational risk management fields.
    Keywords: Probability Weighted Moments; Mixed Estimation Method; Lognormal-Pareto Distribution; Loss Models
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:trn:utwpde:1208&r=ecm
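    Illustration: the sample counterparts of the probability weighted moments b_r = E[X F(X)^r] that PWM estimation matches to their theoretical expressions; the lognormal-Pareto-specific formulas from the paper are not reproduced here, and the lognormal sample is a placeholder.

      import numpy as np

      def sample_pwm(x, r):
          """Unbiased estimator of b_r = E[X * F(X)^r] from an i.i.d. sample."""
          x = np.sort(np.asarray(x, dtype=float))
          n = x.size
          j = np.arange(1, n + 1)
          # weight_j = (j-1)(j-2)...(j-r) / ((n-1)(n-2)...(n-r)); zero whenever j <= r
          num = np.ones(n)
          den = 1.0
          for k in range(1, r + 1):
              num *= (j - k)
              den *= (n - k)
          return np.mean(x * num / den)

      rng = np.random.default_rng(5)
      sample = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)
      print([round(sample_pwm(sample, r), 4) for r in range(3)])   # b_0, b_1, b_2
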
  12. By: Silvio Rendon (Department of Economics, Stony Brook University)
    Abstract: I discuss the application of capture-recapture methods to estimating the total number of deaths in armed conflicts, and propose an alternative method based on a trivariate discrete choice model. Data come from the ‘Truth and Reconciliation Commission’ (TRC) of Peru, around 25000 deaths, classified by three sources of information, geographical strata, and perpetrator: the State and the Shining Path. In these data many killings have been documented by only one source, which makes a projection of killings unfeasible. TRC consultants Ball et al. (2003) tried to overcome this problem by means of a ‘residual estimation,’ consisting of merging data for different perpetrators. I show theoretically and empirically that this method over-estimates the number of deaths. Using a conditional trivariate Probit, I estimate the total number of deaths at around 28000, 60% by the State and 40% by the Shining Path. This number is substantially lower, and has a different composition, than the roughly 69000 deaths (30% by the State, 46% by the Shining Path, and 24% by ‘other perpetrators’) calculated by Ball et al.
    Keywords: Armed Conflict, Capture-Recapture, Count Data, Discrete Choice, Human Rights, Maximum-Likelihood Estimation, Poisson Regression.
    JEL: D74 C35 C4 O54 P16
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:nys:sunysb:12-03&r=ecm
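    Illustration: the basic two-list capture-recapture logic underlying such projections, via the classical Lincoln-Petersen estimator and Chapman's small-sample correction; the counts are invented, and the paper itself replaces this with a trivariate discrete choice model for three documentation sources.

      # Two documentation sources record n1 and n2 deaths respectively,
      # m of which appear on both lists (illustrative numbers, not TRC data).
      n1, n2, m = 4_000, 3_500, 1_200

      lincoln_petersen = n1 * n2 / m
      chapman = (n1 + 1) * (n2 + 1) / (m + 1) - 1     # small-sample correction

      print(f"Lincoln-Petersen estimate of total deaths: {lincoln_petersen:,.0f}")
      print(f"Chapman-corrected estimate:                {chapman:,.0f}")
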
  13. By: Andor, Mark; Hesse, Frederik
    Abstract: Based on the seminal paper of Farrell (1957), researchers have developed several methods for measuring efficiency. Nowadays, the most prominent representatives are nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA), both introduced in the late 1970s. For decades, researchers have been attempting to develop a method which combines the virtues - both nonparametric and stochastic - of these oldies. The recently introduced Stochastic non-smooth envelopment of data (StoNED) by Kuosmanen and Kortelainen (2010) is a promising method. This paper compares the StoNED method with the two oldies, DEA and SFA, and extends the initial Monte Carlo simulation of Kuosmanen and Kortelainen (2010) in two directions. Firstly, we consider a wider range of conditions. Secondly, we also consider the maximum likelihood estimator (ML) and the pseudolikelihood estimator (PL) for SFA and StoNED, respectively. We show that, in scenarios without noise, the rivalry is still between the oldies, while in noisy scenarios, the nonparametric StoNED PL now constitutes a promising alternative to the SFA ML.
    Keywords: efficiency, stochastic non-smooth envelopment of data (StoNED), data envelopment analysis (DEA), stochastic frontier analysis (SFA), Monte Carlo simulation
    JEL: C1 C5 D2 L5 Q4
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:zbw:cawmdp:60&r=ecm
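    Illustration: none of DEA, SFA or StoNED fits in a few lines, but the shared idea of shifting a regression towards a frontier can be shown with corrected OLS (COLS) on simulated Cobb-Douglas data; efficiency scores are residual-based, as in the deterministic limit of these methods.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 100
      x = rng.uniform(1, 10, n)                       # single input
      u = rng.exponential(0.3, n)                     # inefficiency term, u >= 0
      y = np.exp(0.5) * x ** 0.6 * np.exp(-u)         # Cobb-Douglas output

      # COLS: OLS in logs; shifting the fitted line up by the largest residual gives a
      # deterministic frontier, and efficiency is the gap to that frontier.
      X = np.column_stack([np.ones(n), np.log(x)])
      coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
      resid = np.log(y) - X @ coef

      efficiency = np.exp(resid - resid.max())        # in (0, 1], 1 = on the frontier
      print(f"mean technical efficiency: {efficiency.mean():.3f}")
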
  14. By: Shujie Ma; Jeffrey S. Racine
    Abstract: We consider the problem of estimating a relationship using semiparametric additive regression splines when there exist both continuous and categorical regressors, some of which are irrelevant but this is not known a priori. We show that choosing the spline degree, number of subintervals, and bandwidths via cross-validation can automatically remove irrelevant regressors, thereby delivering 'automatic dimension reduction' without the need for pre-testing. Theoretical underpinnings are provided, finite-sample performance is studied, and an illustrative application demonstrates the efficacy of the proposed approach in finite-sample settings. An R package implementing the methods is available from the Comprehensive R Archive Network (Racine and Nie (2011)).
    Keywords: B-spline, discrete, kernel
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:mcm:deptwp:2012-07&r=ecm
  15. By: Manuel S. Santos (Department of Economics, University of Miami); Adrian Peralta-Alva (Research Department, Federal Reserve Bank of Saint Louis)
    Abstract: This paper provides a general framework for the quantitative analysis of stochastic dynamic models. We review convergence properties of some numerical algorithms and available methods to bound approximation errors. We then address convergence and accuracy properties of the simulated moments. Our purpose is to provide an asymptotic theory for the computation, simulation-based estimation, and testing of dynamic economies. The theoretical analysis is complemented with several illustrative examples. We study both optimal and non-optimal economies. Optimal economies generate smooth laws of motion defining Markov equilibria, and can be approximated by recursive methods with contractive properties. Non-optimal economies, however, lack existence of continuous Markov equilibria, and need to be computed by other algorithms with weaker approximation properties.
    Keywords: Stochastic Dynamic Model, Markov Equilibrium, Numerical Solution, Approximation Error, Accuracy, Simulation-Based Estimation, Consistency
    JEL: C63 C60
    Date: 2012–08–19
    URL: http://d.repec.org/n?u=RePEc:mia:wpaper:2012-6&r=ecm
  16. By: Huber, Martin
    Abstract: This paper presents statistical evidence about the validity of the sibling sex ratio instrument proposed by Angrist and Evans (1998), a prominent natural “natural experiment” in the sense of Rosenzweig and Wolpin (2000). The sex ratio of the first two siblings is arguably randomly assigned and influences the probability of having a third child, which makes it a candidate instrument for fertility when estimating the effect of fertility on female labor supply. However, identification hinges on the satisfaction of the instrumental exclusion restriction and the monotonicity of fertility in the instrument; see Imbens and Angrist (1994). Using the methods of Kitagawa (2008), Huber and Mellace (2011a), and Huber and Mellace (2012), we verify, for the first time, the validity of the sibling sex ratio instrument by means of statistical hypothesis tests, which suggest that violations are small, if not close to nonexistent. We also provide novel sensitivity checks to assess deviations from the exclusion restriction and/or monotonicity in the nonparametric local average treatment effect framework and find the negative labor supply effect of fertility to be robust to a plausible range of violations.
    Keywords: instrumental variable, treatment effects, LATE, tests, sensitivity analysis.
    JEL: C12 C21 C26 J13 J22
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:usg:econwp:2012:19&r=ecm
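    Illustration: in the LATE framework being tested, the instrument enters through the usual Wald ratio; the snippet computes it on simulated data (Z standing in for the same-sex indicator, D for having a third child, Y for labor supply), purely to fix ideas. The paper's contribution is the validity tests and sensitivity checks, not this estimator.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 10_000
      z = rng.integers(0, 2, n)                                         # instrument
      d = (0.3 + 0.1 * z + rng.uniform(0, 1, n) > 0.9).astype(float)    # fertility (third child)
      y = 1.0 - 0.2 * d + rng.standard_normal(n)                        # labor-supply outcome

      late = (y[z == 1].mean() - y[z == 0].mean()) / (d[z == 1].mean() - d[z == 0].mean())
      print(f"Wald/LATE estimate of the fertility effect: {late:.3f}")
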
  17. By: Gabriel Garber; Eduardo A. Haddad
    Abstract: This paper proposes a methodology to integrate econometric models with Johansen-type computable general equilibrium (CGE) models in instances when it is necessary to generate results consistent with a subset of variables that are endogenous to both models. Results for a subset of the CGE endogenous variables are generated from econometric models, and set as targets to be replicated by the CGE model. The methodology is further extended for robustness testing of the outcomes in cases in which the targeted scenarios are random. The proposed methodology is illustrated by simulating the impacts of a monetary shock in Brazil.
    Keywords: Model integration, target fitting, sensitivity analysis, CGE models, monetary
    JEL: C63 C68 R13 R15
    Date: 2012–08–12
    URL: http://d.repec.org/n?u=RePEc:spa:wpaper:2012wpecon14&r=ecm
  18. By: Mihaela Craioveanu (University of Central Missouri); Eric Hillebrand (Aarhus University)
    Abstract: The lag structure (1,5,21) is most commonly used for the HAR-RV model for realized volatility (Corsi 2009), where the terms are thought to represent a daily, a weekly, and a monthly time scale. The aggregation of the three scales approximates long memory. We explore flexible lag selection for the model on realized volatility constructed from tick-level data of the thirty constituent stocks of the Dow Jones Industrial Average between 1995 and 2007. The computational costs for flexible lag selection are substantial, and we use a parallel computing environment. We find that flexible lags do not improve in-sample or out-of-sample fit. Our results therefore confirm the standard practice in a large-scale data application.
    Keywords: Time Series, Financial Econometrics, HAR-RV Model
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:umn:wpaper:1201&r=ecm
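    Illustration: the HAR-RV(1,5,21) regression itself, realized variance regressed on its own daily value and its 5-day and 21-day moving averages, estimated by OLS; the series rv below is a simulated placeholder for realized volatility.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(8)
      rv = np.abs(rng.standard_normal(1_000)) + 0.1       # placeholder realized-variance series

      def moving_average(x, k):
          return np.convolve(x, np.ones(k) / k, mode="valid")

      # Align regressors: RV_t, its 5-day mean, and its 21-day mean predict RV_{t+1}.
      d = rv[20:-1]                                       # daily term
      w = moving_average(rv, 5)[16:-1]                    # weekly (5-day) term
      m = moving_average(rv, 21)[:-1]                     # monthly (21-day) term
      y = rv[21:]

      X = sm.add_constant(np.column_stack([d, w, m]))
      print(sm.OLS(y, X).fit().params.round(4))           # [const, beta_d, beta_w, beta_m]
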
  19. By: DUANGKAMON CHOTIKAPANICH, WILLIAM GRIFFITHS, WASANA KARUNARATHNE, D.S. PRASADA RAO
    Abstract: Data for measuring poverty and income inequality are frequently available in a summary form that describes the proportion of income or expenditure for each of a number of population proportions. While various discrete measures can be applied directly to data in this limited form, these discrete measures typically ignore inequality within each group. This problem can be overcome by fitting a parametric income distribution to the grouped data and computing required quantities from the estimated parameters of this distribution. In this paper we show how to calculate several poverty measures from parameters of the generalized beta distribution of the second kind, and its popular special cases. An analysis of poverty changes in ten countries from South and Southeast Asia is used to illustrate the methodology.
    JEL: I32 O15 C13
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:1154&r=ecm
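    Illustration: once the four GB2 parameters (a, b, p, q) have been estimated from grouped data, poverty measures follow from the distribution: the headcount ratio is the GB2 CDF at the poverty line, and FGT-type measures can be obtained by numerical integration of the density. The parameter values and poverty line below are arbitrary, not estimates.

      import numpy as np
      from scipy.special import betainc, beta as beta_fn
      from scipy.integrate import quad

      # GB2 (generalized beta of the second kind) density and CDF with parameters a, b, p, q.
      def gb2_pdf(y, a, b, p, q):
          return a * y ** (a * p - 1) / (b ** (a * p) * beta_fn(p, q) * (1 + (y / b) ** a) ** (p + q))

      def gb2_cdf(y, a, b, p, q):
          z = (y / b) ** a
          return betainc(p, q, z / (1 + z))               # regularized incomplete beta function

      a, b, p, q = 2.0, 15.0, 1.2, 1.5                    # illustrative, not estimated, parameters
      z_line = 10.0                                       # poverty line in the same units as income

      headcount = gb2_cdf(z_line, a, b, p, q)                                                 # FGT(0)
      poverty_gap = quad(lambda y: (1 - y / z_line) * gb2_pdf(y, a, b, p, q), 0, z_line)[0]   # FGT(1)
      print(f"headcount ratio: {headcount:.3f}, poverty gap: {poverty_gap:.3f}")
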
  20. By: Zaman, Asad
    Abstract: Econometric Methodology is based on logical positivist principles. Since logical positivism has collapsed, it is necessary to re-think these foundations. We show that positivist methodology has led econometricians to a meaningless search for patterns in the data. An alternative methodology that relates observed patterns to real causal structures is proposed.
    Keywords: Econometric Methodology; logical positivism; realism; causality; VAR models; Forecasting; surprise; goodness of fit
    JEL: B16 C19
    Date: 2012–08–30
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:41032&r=ecm
  21. By: Cecília Hornok
    Abstract: Trade economists often estimate gravity equations of international trade with fixed effects. Anderson and van Wincoop (2003, American Economic Review 93, 170–192) have shown the importance of controlling for multilateral trade resistances when estimating a gravity equation. This can be done by including exporter-time and importer-time fixed effects in a panel or exporter and importer fixed effects in a cross section estimation. I argue that this approach limits the identifiability of policy parameters that capture the effect of certain “club memberships” (EU, NAFTA, euro area, WTO, etc.) on trade flows. I show that, in the baseline case, only one effect can be identified, which precludes, for example, the estimation of separate effects on the exporter and the importer side. The magnitude, and even the sign, of the estimated club effect is very sensitive to the precise identification assumptions, which are often left unspecified in empirical studies. The underlying problem is that club membership provides some, but very little, bilateral variation. When heterogeneous club effects are to be identified, the membership dummies can become perfectly collinear with the fixed effects. Empirical researchers may not be aware of the lack of identification, because standard estimation techniques often permit them to run perfectly collinear regressions. I illustrate the findings by estimating the effect of the 2004 EU enlargement on the trade flows of new and old members. Finally, I discuss potential solutions.
    Date: 2012–05–20
    URL: http://d.repec.org/n?u=RePEc:ceu:econwp:2012_11&r=ecm
  22. By: Xisong Jin; Francisco Nadal De Simone
    Abstract: The estimation of banks' marginal probabilities of default using structural credit risk models can be enriched incorporating macro-financial variables readily available to economic agents. By combining Delianedis and Geske's model with a Generalized Dynamic Factor Model into a dynamic t-copula as a mechanism for obtaining banks' dependence, this paper develops a framework that generates an early warning indicator and robust out-of-sample forecasts of banks' probabilities of default. The database comprises both a set of Luxembourg banks and the European banking groups to which they belong. The main results of this study are, first, that the common component of the forward probability of banks' defaulting on their long-term debt, conditional on not defaulting on their short-term debt, contains a significant early warning feature of interest for an operational macroprudential framework driven by economic activity, credit and interbank activity. Second, incorporating the common and the idiosyncratic components of macro-financial variables improves the analytical features and the out-of-sample forecasting performance of the framework proposed.
    Keywords: financial stability, macroprudential policy, credit risk, early warning indicators, default probability, Generalized Dynamic Factor Model, dynamic copulas, GARCH
    JEL: C30 E44 G1
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:bcl:bclwop:bclwp075&r=ecm
  23. By: Cem Çakmakli (Department of Quantitative Economics, University of Amsterdam, The Netherlands)
    Abstract: This paper proposes the Bayesian semiparametric dynamic Nelson-Siegel model, where the density of the yield curve factors and thereby the density of the yields are estimated along with other model parameters. This is accomplished by modeling the error distributions of the factors according to a Dirichlet process mixture. An efficient and computationally tractable algorithm is implemented to obtain Bayesian inference. The semiparametric structure of the factors enables us to capture various forms of non-normalities, including fat tails, skewness and nonlinear dependence between factors, using a unified approach. The potential of the proposed framework is examined using US bond yields data. The results show that the model can identify two different periods with distinct characteristics. While the relatively stable years of the late 1980s and 1990s comprise the first period, the second period captures the years of severe recessions, including the recessions of the 1970s and 1980s and the recent recession of 2007-9, together with the highly volatile period of the Federal Reserve’s monetary policy experiments in the first half of the 1980s. Interestingly, the results point to a nonlinear dependence structure between the factors, in contrast to existing evidence.
    Keywords: Dynamic factor model, Yield curve, Nelson-Siegel model, Dirichlet process mixture, Bayesian inference
    JEL: C14 C33 C38 G12
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:59_12&r=ecm
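    Illustration: the Nelson-Siegel layer of the model, yields written as linear combinations of level, slope and curvature factors with loadings governed by a decay parameter; given the decay, the factors for one cross-section of yields are recovered by OLS. The yields below are made up, and the Dirichlet process treatment of the factor dynamics is not sketched.

      import numpy as np

      def ns_loadings(tau, lam):
          """Nelson-Siegel factor loadings at maturities tau (months) with decay lam."""
          x = lam * tau
          slope = (1 - np.exp(-x)) / x
          return np.column_stack([np.ones_like(tau), slope, slope - np.exp(-x)])

      maturities = np.array([3, 6, 12, 24, 36, 60, 84, 120], dtype=float)    # months
      yields = np.array([4.1, 4.2, 4.4, 4.7, 4.9, 5.2, 5.3, 5.4])            # made-up yields, in %
      lam = 0.0609                                   # a value often used with monthly maturities

      X = ns_loadings(maturities, lam)
      level, slope, curvature = np.linalg.lstsq(X, yields, rcond=None)[0]
      print(f"level={level:.3f}, slope={slope:.3f}, curvature={curvature:.3f}")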

This nep-ecm issue is ©2012 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.