nep-ecm New Economics Papers
on Econometrics
Issue of 2009‒07‒28
27 papers chosen by
Sune Karlsson
Orebro University

  1. Errors-in-Variables Estimation with No Instruments By Ramazan Gencay; Nikola Gradojevic
  2. Bayesian semiparametric stochastic volatility modeling By Mark J. Jensen; John M. Maheu
  3. Nonparametric Estimation of a Polarization Measure By Gordon Anderson; Oliver Linton; Yoon-Jae Whang
  4. Structural Threshold Regression By Andros Kourtellos; Thanasis Stengos; Chih Ming Tan
  5. GMM estimation of spatial panels By Moscone, Francesco; Tosetti, Elisa
  6. An Improved Bootstrap Test of Stochastic Dominance By Oliver Linton; Kyungchul Song; Yoon-Jae Whang
  7. Estimation of DSGE Models When the Data are Persistent By Yuriy Gorodnichenko; Serena Ng
  8. Realising the future: forecasting with high frequency based volatility (HEAVY) models By Neil Shephard; Kevin Sheppard
  9. Weak and Strong Cross Section Dependence and Estimation of Large Panels By Chudik, A.; Pesaran, M.H.; Tosetti, E.
  10. Testing Changing Harmonic Regressors By Franses, Ph.H.B.F.
  11. Forecasting Volatility under Fractality, Regime-Switching, Long Memory and Student-t Innovations By Thomas Lux; Leonardo Morales-Arias
  12. The Tobit model with feedback and random effects: A Monte-Carlo study By Eva Poen
  13. Rationalizable Counterfactual Choice Probabilities in Dynamic Binary Choice Processes By Xun Tang
  14. Forecasting Inflation Using Dynamic Model Averaging By Gary Koop; Dimitris Korobilis
  15. Estimating Simultaneous Games with Incomplete Information under Median Restrictions By Xun Tang
  16. A Factor Analysis of Bond Risk Premia By Sydney C. Ludvigson; Serena Ng
  17. A Nonlinear Approach to Testing the Unit Root Null Hypothesis: An Application to International Health Expenditures By Paresh Kumar Narayan; Stephan Popp
  18. Asymmetry of Information Flow Between Volatilities Across Time Scales By Ramazan Gencay; Nikola Gradojevic; Faruk Selcuk; Brandon Whitcher
  19. A Note on Adapting Propensity Score Matching and Selection Models to Choice Based Samples By James J. Heckman; Petra E. Todd
  20. Sequential Methodology for Signaling Business Cycle Turning Points By Vasyl Golosnoy; Jens Hogrefe
  21. ARE EU BUDGET DEFICITS STATIONARY? By Mark J. Holmes; Jesús Otero; Theodore Panagiotidis
  22. An (almost) unbiased estimator for the S-Gini index By T. DEMUYNCK
  23. Do high-frequency measures of volatility improve forecasts of return distributions? By John M. Maheu; Thomas H. McCurdy
  24. Nonparametric estimation in binary choice models By Eric Gautier; Yuichi Kitamura
  25. On the Sensitivity of Kernel-based Tests of Conditional Moment Restrictions By Eduardo Fé-Rodríguez; Chris D. Orme
  26. The Asian Crisis Contagion: A Dynamic Correlation Approach Analysis By Essahbi Essaadi; Jamel Jouini; Wajih Khallouli
  27. Time-Varying Autoregressive Conditional Duration Model By Bortoluzzo, Adriana B.; Morettin, Pedro A.; Toloi, Clelia M. C.

  1. By: Ramazan Gencay (Department of Economics, Simon Fraser University); Nikola Gradojevic (Faculty of Business Administration, Lakehead University)
    Abstract: This paper develops a wavelet (spectral) approach to estimate the parameters of a linear regression model where the regressand and the regressors are persistent processes and contain a measurement error. We propose a wavelet filtering approach which does not require instruments and yields unbiased and consistent estimates for the intercept and the slope parameters. Our Monte Carlo results also show that the wavelet approach is particularly effective when measurement errors for the regressand and the regressor are serially correlated. With this paper, we hope to bring a fresh perspective and stimulate further theoretical research in this area.
    Keywords: Cointegration, discrete wavelet transformation, maximum overlap wavelet transformation, energy decomposition, errors-in-variables, persistence
    JEL: C1 C2 C12 C22 F31 G0 G1
    Date: 2009–01
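The core idea — that i.i.d. measurement error lives mostly at the fine scales, while a persistent signal dominates the coarse scales — can be illustrated with a toy simulation. This is a hypothetical sketch, not the authors' estimator: one-level Haar averaging stands in for the maximum overlap wavelet transform, and all parameter values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = 0.95 * x[t - 1] + e[t]            # persistent "true" regressor
y = 1.0 + 0.5 * x                            # true intercept 1.0, true slope 0.5
x_obs = x + rng.normal(scale=3.0, size=n)    # both sides observed with error
y_obs = y + rng.normal(scale=3.0, size=n)

def haar_smooth(z):
    """One-level Haar scaling coefficients: pairwise averages (coarse scale)."""
    return 0.5 * (z[0::2] + z[1::2])

def ols_slope(u, v):
    u, v = u - u.mean(), v - v.mean()
    return (u @ v) / (u @ u)

naive = ols_slope(x_obs, y_obs)              # attenuated towards zero by the noise
sx, sy = x_obs, y_obs
for _ in range(4):                           # keep only coarse scales, where the
    sx, sy = haar_smooth(sx), haar_smooth(sy)  # persistent signal dominates
wavelet = ols_slope(sx, sy)                  # much less attenuated
```

With these simulated settings the naive slope is biased well below the true value 0.5, while the slope computed from the coarsened series recovers most of it, without any instrument.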
  2. By: Mark J. Jensen (Federal Reserve Bank of Atlanta); John M. Maheu (University of Toronto and RCEA)
    Abstract: This paper extends the existing fully parametric Bayesian literature on stochastic volatility to allow for more general return distributions. Instead of specifying a particular distribution for the return innovation, nonparametric Bayesian methods are used to flexibly model the skewness and kurtosis of the distribution while the dynamics of volatility continue to be modeled with a parametric structure. Our semiparametric Bayesian approach provides a full characterization of parametric and distributional uncertainty. A Markov chain Monte Carlo sampling approach to estimation is presented with theoretical and computational issues for simulation from the posterior predictive distributions. An empirical example compares the new model to standard parametric stochastic volatility models.
    Date: 2009–01
  3. By: Gordon Anderson (University of Toronto); Oliver Linton (London School of Economics); Yoon-Jae Whang (Seoul National University)
    Abstract: This paper develops methodology for nonparametric estimation of a polarization measure due to Anderson (2004) and Anderson, Ge, and Leo (2006) based on kernel estimation techniques. We give the asymptotic distribution theory of our estimator, which in some cases is nonstandard due to a boundary value problem. We also propose a method for conducting inference based on estimation of unknown quantities in the limiting distribution and show that our method yields consistent inference in all cases we consider. We investigate the finite sample properties of our methods by simulation methods. We give an application to the study of polarization within China in recent years.
    Keywords: Kernel estimation, Inequality, Overlap coefficient, Poissonization
    JEL: C12 C13 C14
    Date: 2009–07
  4. By: Andros Kourtellos (Department of Economics, University of Cyprus); Thanasis Stengos (Department of Economics, University of Guelph); Chih Ming Tan (Department of Economics, Tufts University)
    Abstract: This paper extends the simple threshold regression framework of Hansen (2000) and Caner and Hansen (2004) to allow for endogeneity of the threshold variable. We develop a concentrated least squares estimator of the threshold parameter based on an inverse Mills ratio bias correction. We show that our estimator is consistent and investigate its performance using a Monte Carlo simulation that indicates the applicability of the method in finite samples.
    JEL: C13 C51
    Date: 2009–01
  5. By: Moscone, Francesco; Tosetti, Elisa
    Abstract: We consider Generalized Method of Moments (GMM) estimation of a regression model with spatially correlated errors. We propose some new moment conditions, and derive the asymptotic distribution of the GMM estimator based on them. The analysis is supported by a small Monte Carlo exercise.
    Keywords: Generalized Method of Moments; spatial econometrics
    JEL: C13 A19
    Date: 2009–04–17
  6. By: Oliver Linton (London School of Economics); Kyungchul Song (University of Pennsylvania); Yoon-Jae Whang (Seoul National University)
    Abstract: We propose a new method of testing stochastic dominance that improves on existing tests based on the standard bootstrap or subsampling. The method admits prospects involving infinite as well as finite dimensional unknown parameters, so that the variables are allowed to be residuals from nonparametric and semiparametric models. The proposed bootstrap tests have asymptotic sizes that are less than or equal to the nominal level uniformly over probabilities in the null hypothesis under regularity conditions. This paper also characterizes the set of probabilities for which the asymptotic size is exactly equal to the nominal level uniformly. As our simulation results show, these characteristics of our tests lead to an improved power property in general. The improvement stems from the design of the bootstrap test, whose limiting behavior mimics the discontinuity of the original test's limiting distribution.
    Keywords: Set estimation, Size of test, Similarity, Bootstrap, Subsampling
    JEL: C12 C14 C52
    Date: 2009–07
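As a point of reference, the simple scheme the paper improves upon can be sketched: a first-order stochastic dominance statistic based on the gap between empirical CDFs, with a naive pooled bootstrap under the least-favourable null. Sample sizes, distributions and the evaluation grid below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.3, 1.0, 300)                # prospect claimed to dominate
y = rng.normal(0.0, 1.0, 300)
grid = np.linspace(-3.0, 3.0, 201)

def sd1_stat(a, b, grid):
    """sqrt(n) * sup_t [F_a(t) - F_b(t)]: large when a fails to dominate b."""
    Fa = (a[:, None] <= grid).mean(axis=0)
    Fb = (b[:, None] <= grid).mean(axis=0)
    return np.sqrt(len(a)) * np.max(Fa - Fb)

stat = sd1_stat(x, y, grid)
pooled = np.concatenate([x, y])              # least-favourable null: F_x = F_y
boot = np.array([sd1_stat(rng.choice(pooled, size=300),
                          rng.choice(pooled, size=300), grid)
                 for _ in range(500)])
pval = (boot >= stat).mean()                 # here: no evidence against dominance
```

The paper's contribution is precisely to replace this crude least-favourable recentring with a bootstrap whose limit mimics the discontinuity of the statistic's limiting distribution, tightening size and improving power.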
  7. By: Yuriy Gorodnichenko; Serena Ng
    Abstract: Dynamic Stochastic General Equilibrium (DSGE) models are often solved and estimated under specific assumptions as to whether the exogenous variables are difference or trend stationary. However, even mild departures of the data generating process from these assumptions can severely bias the estimates of the model parameters. This paper proposes new estimators that do not require researchers to take a stand on whether shocks have permanent or transitory effects. These procedures have two key features. First, the same filter is applied to both the data and the model variables. Second, the filtered variables are stationary when evaluated at the true parameter vector. The estimators are approximately normally distributed not only when the shocks are mildly persistent, but also when they have near or exact unit roots. Simulations show that these robust estimators perform well especially when the shocks are highly persistent yet stationary. In such cases, linear detrending and first differencing are shown to yield biased or imprecise estimates.
    JEL: E3 F4 O4
    Date: 2009–07
  8. By: Neil Shephard; Kevin Sheppard
    Abstract: This paper studies in some detail a class of high frequency based volatility (HEAVY) models. These models are direct models of daily asset return volatility based on realized measures constructed from high frequency data. Our analysis shows that the models have momentum and mean reversion effects, and that they adjust quickly to structural breaks in the level of the volatility process. We study how to estimate the models and how they perform through the credit crunch, comparing their fit to more traditional GARCH models. We analyse a model-based bootstrap which allows us to estimate the entire predictive distribution of returns. We also provide an analysis of missing data in the context of these models.
    Keywords: ARCH models; bootstrap; missing data; multiplicative error model; multistep ahead prediction; non-nested likelihood ratio test; realised kernel; realised volatility.
    Date: 2009
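The return-variance equation of a HEAVY model has a GARCH-like recursion, h_t = omega + alpha * RM_{t-1} + beta * h_{t-1}, driven by the lagged realized measure rather than the lagged squared return. A minimal filter sketch — parameter values and data are illustrative, not estimates from the paper:

```python
import numpy as np

def heavy_filter(rm, omega, alpha, beta, h0):
    """HEAVY-type recursion: conditional variance of daily returns driven by
    yesterday's realized measure rather than yesterday's squared return."""
    h = np.empty(len(rm) + 1)
    h[0] = h0
    for t in range(len(rm)):
        h[t + 1] = omega + alpha * rm[t] + beta * h[t]
    return h

rm = np.array([1.0, 4.0, 1.0, 1.0])    # realized measures with a one-day spike
h = heavy_filter(rm, omega=0.1, alpha=0.4, beta=0.5, h0=1.0)
# variance jumps right after the spike, then mean-reverts:
# h ≈ [1.0, 1.0, 2.2, 1.6, 1.3]
```

Because the informative realized measure enters with a large loading, the filtered variance reacts to the spike within one day and then decays — the momentum and fast-adjustment behaviour the abstract describes.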
  9. By: Chudik, A.; Pesaran, M.H.; Tosetti, E.
    Abstract: This paper introduces the concepts of time-specific weak and strong cross section dependence. A double-indexed process is said to be cross sectionally weakly dependent at a given point in time, t, if its weighted average along the cross section dimension (N) converges to its expectation in quadratic mean as N increases without bound, for all weights that satisfy certain 'granularity' conditions. The relationship with the notions of weak and strong common factors is investigated, and an application to the estimation of panel data models with an infinite number of weak factors and a finite number of strong factors is also considered. The paper concludes with a set of Monte Carlo experiments in which the small sample properties of estimators based on principal components and CCE estimators are investigated and compared under various assumptions on the nature of the unobserved common effects.
    Keywords: Panels, Strong and Weak Cross Section Dependence, Weak and Strong Factors
    JEL: C10 C31 C33
    Date: 2009–06–09
  10. By: Franses, Ph.H.B.F. (Erasmus Econometric Institute)
    Abstract: Econometric models for economic time series may include harmonic regressors to describe cyclical patterns in the data. This paper focuses on the possibility that the cycle periods in these regressors change over time. To this end, a smooth regime-switching harmonic regression model is introduced, together with a diagnostic test for changing cycle periods. An application to annual GDP growth in the Netherlands (for 1969-2007) shows that around 1975 the business cycle period shifted from about 3 years to about 11 years.
    Keywords: harmonic regressors;smooth regime-switching model
    Date: 2009–07–13
  11. By: Thomas Lux; Leonardo Morales-Arias
    Abstract: We examine the performance of volatility models that incorporate features such as long (short) memory, regime-switching and multifractality, along with two competing distributional assumptions for the error component, i.e. Normal vs Student-t. Our precise contribution is twofold. First, we introduce a new model to the family of Markov-Switching Multifractal models of asset returns (MSM), namely, the Markov-Switching Multifractal model of asset returns with Student-t innovations (MSM-t). Second, we perform a comprehensive panel forecasting analysis of the MSM models as well as other competing volatility models of the Generalized Autoregressive Conditional Heteroskedasticity (GARCH) legacy. Our cross-sections consist of all-share equity indices, bond indices and real estate security indices at the country level. Furthermore, we investigate complementarities between models via combined forecasts. We find that: (i) Maximum Likelihood (ML) and Generalized Method of Moments (GMM) estimation are both suitable for MSM-t models; (ii) empirical panel forecasts of MSM-t models show an improvement over the alternative volatility models in terms of mean absolute forecast errors; and (iii) forecast combinations obtained from the different MSM and (FI)GARCH models considered appear to provide some improvement upon forecasts from single models.
    Keywords: Multiplicative volatility models, long memory, Student-t innovations, international volatility forecasting
    JEL: C20 G12
    Date: 2009–07
  12. By: Eva Poen (CeDEx, School of Economics, University of Nottingham)
    Abstract: We study a random effects censored regression model in the context of repeated games. Introducing a feedback variable into the model leads to violation of the strict exogeneity assumption, thus rendering the random effects estimator inconsistent. Using the example of contributions to a public good, we investigate the size of this bias in a Monte-Carlo study. We find that the magnitude of the bias is around one per cent when initial values and individual effects are correlated. The rate of censoring, as well as the size of the groups in which subjects interact, both have an effect on the magnitude of the bias. The coefficients of strictly exogenous, continuous regressors remain unaffected by the endogeneity bias. The size of the endogeneity bias in our model is very small compared to the size of the heterogeneity bias, which occurs when individual heterogeneity is not accounted for in estimation of nonlinear models.
    Keywords: Monte-Carlo, Simulation, Random Effects, Censored Regression Model, Public Goods, Heterogeneity, Endogeneity
    JEL: C15 C24 C92
    Date: 2009–07
  13. By: Xun Tang (Department of Economics, University of Pennsylvania)
    Abstract: We address two issues in nonparametric structural analyses of dynamic binary choice processes (DBCP). First, the DBCP is not testable and decision makers’ single-period payoffs (SPP) cannot be identified even when the distribution of unobservable states (USV) is known. Numerical examples show that setting the SPP of one choice to an arbitrary utility level in order to identify that of the other can lead to errors in predicting choice probabilities under counterfactual state transitions. We propose two solutions. First, if a data generating process (DGP) has exogenous variations in observable state transitions, the DBCP becomes testable and the SPP is identified. Second, exogenous economic restrictions on the SPP (such as a ranking of states by SPP, or shape restrictions) can be used to recover the identified set of rationalizable counterfactual choice probabilities (RCCP) that are consistent with the model restrictions. The other (more challenging) motivating issue is that when the USV distribution is not known, misspecification of the distribution in structural estimation leads to errors in counterfactual predictions. We introduce a simple algorithm based on linear programming to recover sharp bounds on the RCCP. This approach exploits the fact that some stochastic restrictions on the USV (such as independence from observable states) and economic restrictions on the SPP can be represented, without loss of information for counterfactual analyses, as linear restrictions on the SPP and the distributional parameters of the USV. We use numerical examples to illustrate the algorithm and show that the identified sets of RCCP can be quite small relative to the outcome space.
    Keywords: Dynamic discrete choice models, counterfactual outcomes, rationalizability, non-parametric and semiparametric identification
    JEL: C13 C14 C25
    Date: 2009–06–20
  14. By: Gary Koop (Department of Economics, University of Strathclyde and RCEA); Dimitris Korobilis (Department of Economics, University of Strathclyde and RCEA)
    Abstract: There is a large literature on forecasting inflation using the generalized Phillips curve (i.e. using forecasting models where inflation depends on past inflation, the unemployment rate and other predictors). The present paper extends this literature through the use of econometric methods which incorporate dynamic model averaging. These not only allow the coefficients to change over time (i.e. the marginal effect of a predictor for inflation can change), but also allow the entire forecasting model to change over time (i.e. different sets of predictors can be relevant at different points in time). In an empirical exercise involving quarterly US inflation, we find that dynamic model averaging leads to substantial forecasting improvements over simple benchmark approaches (e.g. random walk or recursive OLS forecasts) and more sophisticated approaches such as those using time varying coefficient models.
    JEL: E31 E37 C11 C53
    Date: 2009–01
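A stripped-down version of dynamic model averaging conveys the mechanism: model probabilities are flattened each period by a forgetting factor, then updated by each model's one-step predictive likelihood, so the weights can migrate as the relevant predictor set changes. This sketch uses hypothetical fixed unit coefficients and a known forecast variance; the paper's implementation additionally lets coefficients drift via Kalman-filter recursions.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 200
x1, x2 = rng.normal(size=(2, T))
# predictor relevance switches halfway through the sample
y = np.where(np.arange(T) < 100, x1, x2) + 0.3 * rng.normal(size=T)

forecasts = np.vstack([x1, x2])       # model 1 forecasts with x1, model 2 with x2
w = np.array([0.5, 0.5])              # current model probabilities
alpha, s2 = 0.95, 0.5                 # forgetting factor, forecast variance
path = []
for t in range(T):
    w = w ** alpha                    # forgetting: flatten towards uniform
    w /= w.sum()
    like = (np.exp(-0.5 * (y[t] - forecasts[:, t]) ** 2 / s2)
            / np.sqrt(2 * np.pi * s2))
    w = w * like                      # Bayes update with predictive likelihoods
    w /= w.sum()
    path.append(w.copy())
path = np.array(path)                 # model 1 dominates early, model 2 late
```

The forgetting step keeps old evidence from locking in a single model forever, which is what lets the weight path track the regime change instead of averaging over the whole sample.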
  15. By: Xun Tang (Department of Economics, University of Pennsylvania)
    Abstract: I estimate a simultaneous discrete game with incomplete information where players’ private information is only required to be median independent of observed states and can be correlated with observable states. This median restriction is weaker than other assumptions on players’ private information in the literature (e.g. perfect knowledge of its distribution or its independence of the observable states). I show that index coefficients in players’ utility functions are point-identified under an exclusion restriction and fairly weak conditions on the support of states. This identification strategy is fundamentally different from that in single-agent binary response models with median restrictions, and does not involve any parametric assumption on equilibrium selection in the presence of multiple Bayesian Nash equilibria. I then propose a two-step extreme estimator for the linear coefficients, and prove its consistency.
    Keywords: Games with incomplete information, semiparametric identification, median restrictions, consistent estimation
    JEL: C14 C35 C51
    Date: 2009–04–30
  16. By: Sydney C. Ludvigson; Serena Ng
    Abstract: This paper uses the factor augmented regression framework to analyze the relation between bond excess returns and the macroeconomy. Using a panel of 131 monthly macroeconomic time series for the sample 1964:1-2007:12, we estimate 8 static factors by the method of asymptotic principal components. We also use Gibbs sampling to estimate dynamic factors from the 131 series reorganized into 8 blocks. Regardless of how the factors are estimated, macroeconomic factors are found to have statistically significant predictive power for excess bond returns. We show how a bias correction to the parameter estimates of factor augmented regressions can be obtained. This bias is numerically trivial in our application. The predictive power of real activity for excess bond returns is robust even after accounting for finite sample inference problems. Forecasts of excess bond returns (or bond risk premia) are countercyclical. This implies that investors are compensated for risks associated with recessions.
    JEL: G12
    Date: 2009–07
  17. By: Paresh Kumar Narayan (School of Accounting, Economics and Finance, Deakin University); Stephan Popp (University of Duisburg-Essen)
    Abstract: In this paper, we examine the unit root null hypothesis for per capita total health expenditures, per capita private health expenditures, and per capita public health expenditures for 29 OECD countries. The novelty of our work is that we use a new nonlinear unit root test that allows for one structural break in the data series. We find that for around 45 per cent of the countries we are able to reject the unit root hypothesis for each of the three health expenditure series. Moreover, using Monte Carlo simulations, we show that our proposed unit root model has better size and power properties than the widely used ADF and LM type tests.
    Date: 2009–06–23
  18. By: Ramazan Gencay (Department of Economics, Simon Fraser University); Nikola Gradojevic (Faculty of Business Administration, Lakehead University); Faruk Selcuk (Department of Economics, Bilkent University); Brandon Whitcher (GlaxoSmithKline Clinical Imaging Centre, Hammersmith Hospital London, United Kingdom)
    Abstract: Conventional time series analysis, focusing exclusively on a time series at a given scale, lacks the ability to explain the nature of the data generating process. A process equation that successfully explains daily price changes, for example, is unable to characterize the nature of hourly price changes. On the other hand, statistical properties of monthly price changes are often not fully covered by a model based on daily price changes. In this paper, we simultaneously model regimes of volatilities at multiple time scales through wavelet-domain hidden Markov models. We establish an important stylized property of volatility across different time scales. We call this property asymmetric vertical dependence. It is asymmetric in the sense that a low volatility state (regime) at a long time horizon is most likely followed by low volatility states at shorter time horizons. On the other hand, a high volatility state at long time horizons does not necessarily imply a high volatility state at shorter time horizons. Our analysis provides evidence that volatility is a mixture of high and low volatility regimes, resulting in a distribution that is non-Gaussian. This result has important implications regarding the scaling behavior of volatility, and consequently, the calculation of risk at different time scales.
    Keywords: Discrete wavelet transform, wavelet-domain hidden Markov trees, foreign exchange markets, stock markets, multiresolution analysis, scaling
    JEL: G0 G1 C1
    Date: 2009–01
  19. By: James J. Heckman; Petra E. Todd
    Abstract: The probability of selection into treatment plays an important role in matching and selection models. However, this probability often cannot be consistently estimated because of choice-based sampling designs with unknown sampling weights. This note establishes that the selection and matching procedures can be implemented using propensity scores fit on choice-based samples with misspecified weights, because the odds ratio of the propensity score fit on the choice-based sample is monotonically related to the odds ratio of the true propensity score.
    JEL: C13 C51
    Date: 2009–07
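The note's key fact is easy to verify numerically: under choice-based sampling, the odds of the misweighted propensity score equal the true odds scaled by the constant ratio of the sampling rates, so the ordering that matching relies on is preserved. A toy check — the sampling rates w1 and w0 below are hypothetical:

```python
import numpy as np

p = np.linspace(0.05, 0.95, 19)            # true propensity scores P(D=1|X)
w1, w0 = 0.7, 0.3                          # hypothetical sampling rates for D=1, D=0
p_cb = w1 * p / (w1 * p + w0 * (1 - p))    # propensity fit on the choice-based sample
odds = p / (1 - p)
odds_cb = p_cb / (1 - p_cb)                # equals (w1 / w0) * odds: same ordering
```

Because odds_cb is a fixed positive multiple of odds, ranking units by the misweighted score gives exactly the same nearest-neighbour matches as ranking by the true score, even though w1 and w0 are unknown.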
  20. By: Vasyl Golosnoy; Jens Hogrefe
    Abstract: The dates of U.S. business cycle turning points are reported by the NBER with a considerable delay, so an early indication of turning points is of particular interest. This paper proposes a novel sequential approach designed for timely signaling of these turning points. A directional cumulated sum decision rule is adapted for the purpose of on-line monitoring of transitions between subsequent phases of economic activity. The introduced procedure shows sound detection ability for business cycle peaks and troughs compared to the established dynamic factor Markov switching methodology. It exhibits a range of theoretical optimality properties for early signaling; moreover, it is transparent and easy to implement.
    Keywords: Business cycle; CUSUM control chart; Dynamic Factor Markov switching models; Early signaling; NBER dating
    JEL: C44 C50 E32
    Date: 2009–06
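The monitoring logic can be illustrated with a generic one-sided CUSUM rule. The paper's directional rule and its calibration are more elaborate; the reference value k, threshold h, and the stylized growth series below are illustrative only.

```python
def cusum_signal(series, k, h):
    """One-sided CUSUM: accumulate shortfalls of growth below reference k and
    signal a turning point when the cumulated sum crosses threshold h."""
    s, alarms = 0.0, []
    for t, g in enumerate(series):
        s = max(0.0, s + (k - g))      # grows while growth stays below k
        if s > h:
            alarms.append(t)           # signal, then restart monitoring
            s = 0.0
    return alarms

growth = [0.8, 0.7, 0.9, -0.2, -0.5, -0.4, 0.6, 0.8]   # stylized growth rates
print(cusum_signal(growth, k=0.2, h=1.0))               # → [4]
```

The chart stays at zero through the expansion, starts accumulating as soon as growth drops below the reference value, and raises the alarm one period into the sustained downturn — the on-line, sequential character the abstract emphasizes.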
  21. By: Mark J. Holmes (Waikato University Management School, New Zealand); Jesús Otero (Universidad del Rosario, Colombia); Theodore Panagiotidis (University of Macedonia, Greece and The Rimini Center for Economic Analysis, Italy)
    Abstract: In this paper, we test for the stationarity of European Union budget deficits over the period 1971 to 2006, using a panel of thirteen member countries. Our testing strategy addresses two key concerns with regard to unit root panel data testing, namely (i) the presence of cross-sectional dependence among the countries in the panel and (ii) the identification of potential structural breaks that might have occurred at different points in time. To address these concerns, we employ an AR-based bootstrap approach that allows us to test the null hypothesis of joint stationarity with endogenously determined structural breaks. In contrast to the existing literature, we find that the EU countries considered are characterised by fiscal stationarity over the full sample period, irrespective of whether we allow for structural breaks. This conclusion also holds when analysing sub-periods before and after the Maastricht Treaty.
    Keywords: Heterogeneous dynamic panels, fiscal sustainability, mean reversion, panel stationarity test.
    JEL: C33 F32 F41
    Date: 2009–01
  22. By: T. DEMUYNCK
    Abstract: tba
    Date: 2009–03
  23. By: John M. Maheu (Department of Economics, University of Toronto and RCEA); Thomas H. McCurdy (Rotman School of Management, University of Toronto, and CIRANO)
    Abstract: Many finance questions require the predictive distribution of returns. We propose a bivariate model of returns and realized volatility (RV), and explore which features of that time-series model contribute to superior density forecasts over horizons of 1 to 60 days out of sample. This term structure of density forecasts is used to investigate the importance of: the intraday information embodied in the daily RV estimates; the functional form for log(RV) dynamics; the timing of information availability; and the assumed distributions of both return and log(RV) innovations. We find that a joint model of returns and volatility that features two components for log(RV) provides a good fit to S&P 500 and IBM data, and is a significant improvement over an EGARCH model estimated from daily returns.
    Keywords: Realized Volatility, multiperiod out-of-sample prediction, term structure of density forecasts, Stochastic Volatility
    Date: 2009–01
  24. By: Eric Gautier (CREST - Centre de Recherche en Économie et Statistique - INSEE - École Nationale de la Statistique et de l'Administration Économique); Yuichi Kitamura (Cowles Foundation for Research in Economics - Université Yale - New Haven)
    Abstract: We consider binary choice models with random coefficients. The aim is to estimate the density of the random coefficients nonparametrically. This is an ill-posed inverse problem characterized by an integral transform. We propose a new estimator of the density of the random coefficients, based on Fourier-Laplace series expansions on the sphere. This approach permits a detailed study of the identification problem and also yields a plug-in estimator with an explicit expression that requires no numerical optimization. The new estimator is therefore very easy to compute, while remaining flexible in its treatment of unobserved heterogeneity. We present extensions, including the treatment of non-random coefficients and models with endogeneity.
    Date: 2009–07–14
  25. By: Eduardo Fé-Rodríguez; Chris D. Orme
    Date: 2009
  26. By: Essahbi Essaadi (GATE - Groupe d'analyse et de théorie économique - CNRS : UMR5824 - Université Lumière - Lyon II - Ecole Normale Supérieure Lettres et Sciences Humaines); Jamel Jouini (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales - CNRS : UMR6579, Université 7 Novembre de Carthage - université 7 Novembre de Carthage); Wajih Khallouli (Ecole Supérieure des Sciences Economiques et Commerciales de Tunis - Université de Tunis)
    Abstract: In this paper we test for contagion caused by the Thai baht collapse of July 1997. In line with earlier work, shift-contagion is defined as a structural change within the international propagation mechanisms of financial shocks. We adopt Bai and Perron's (1998) structural break approach in order to detect the endogenous break points of the pair-wise time-varying correlations between Thailand and seven Asian stock market returns. Our approach enables us to solve the misspecification problem of the crisis window. Our results illustrate the existence of shift-contagion in the Asian crisis caused by the crisis in Thailand.
    Keywords: Shift-contagion; time-varying correlation; sequential selection procedure
    Date: 2009
  27. By: Bortoluzzo, Adriana B.; Morettin, Pedro A.; Toloi, Clelia M. C.
    Date: 2008–10

This nep-ecm issue is ©2009 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.