nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒11‒04
27 papers chosen by
Sune Karlsson
Orebro University

  1. Exact Maximum Likelihood estimation for the BL-GARCH model under elliptically distributed innovations By Abdou Kâ Diongue; Dominique Guegan; Rodney C. Wolff
  2. The Information Content of Implied Probabilities to Detect Structural Change By Alain Guay; Jean-François Lamarche
  3. Estimation of k-factor GIGARCH process : a Monte Carlo study By Abdou Kâ Diongue; Dominique Guegan
  4. Forecasting with Dynamic Models using Shrinkage-based Estimation By Andrea Carriero; George Kapetanios; Massimiliano Marcellino
  5. A Powerful Test of the Autoregressive Unit Root Hypothesis Based on a Tuning Parameter Free Statistic By Morten Ørregaard Nielsen
  6. Estimating and Forecasting GARCH Volatility in the Presence of Outliers By M. Angeles Carnero; Daniel Peña; Esther Ruiz
  7. Changing regime volatility: A fractionally integrated SETAR model By Gilles Dufrenot; Dominique Guegan; Anne Peguin-Feissolle
  8. A non-parametric method to nowcast the Euro Area IPI By Laurent Ferrara; Thomas Raffinot
  9. Non-stationarity and meta-distribution By Dominique Guegan
  10. Forecasting chaotic systems : the role of local Lyapunov exponents By Dominique Guegan; Justin Leroux
  11. Effect of noise filtering on predictions : on the routes of chaos By Dominique Guegan
  12. Is forecasting with large models informative? Assessing the role of judgement in macro-economic forecasts. By Ricardo Mestre; Peter McAdam
  13. Statistical Fourier Analysis: Clarifications and Interpretations By D.S.G. Pollock
  14. Short-term forecasts of euro area GDP growth. By Elena Angelini; Gonzalo Camba-Méndez; Domenico Giannone; Gerhard Rünstler; Lucrezia Reichlin
  15. Wavelet unit root test vs. DF test: A further investigation based on Monte Carlo experiments By Ibrahim Ahamada; Philippe Jolivaldt
  16. Estimating and forecasting the euro area monthly national accounts from a dynamic factor model. By Elena Angelini; Marta Bańbura; Gerhard Rünstler
  17. Seeing inside the black box: Using diffusion index methodology to construct factor proxies in large scale macroeconomic time series environments By Nii Ayi Armah; Norman R. Swanson
  18. Analysis of the dependence structure in econometric time series By Aurélien Hazan; Vincent Vigneron
  19. Markov-chain approximations of vector autoregressions: application of general multivariate-normal integration techniques By Edward S. Knotek II; Stephen Terry
  20. Self-selection and Subjective Well-being : Copula Models with an Application to Public and Private Sector Work By Simon Luechinger; Alois Stutzer; Rainer Winkelmann
  21. Business surveys modelling with Seasonal-Cyclical Long Memory models By Laurent Ferrara; Dominique Guegan
  22. Opening the Black-Box of Intrahousehold Decision-Making: Theory and Nonparametric Empirical Tests of General Collective Consumption Models By Laurens Cherchye; Bram De Rock; Frederic Vermeulen
  23. Standard and Shuffled Halton Sequences in a Mixed Logit Model By Alexander Staus
  24. Clustering techniques applied to outlier detection of financial market series using a moving window filtering algorithm. By Josep Maria Puigvert Gutiérrez; Josep Fortiana Gregori
  25. The Evaluation of Public Program Effect Using Regression Discontinuity Method : An introduction By Santarossa, Gino
  26. A Measure of Variability in Comovement for Economic Variables : a Time-Varying Coherence Function Approach By Essahbi Essaadi; Mohamed Boutahar
  27. Forecasting VaR and Expected Shortfall Using Dynamical Systems: A Risk Management Strategy By Dominique Guegan; Cyril Caillault

  1. By: Abdou Kâ Diongue (UFR SAT - Université Gaston Berger - Université Gaston Berger de Saint-Louis); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Rodney C. Wolff (School of Mathematical Sciences - Queensland University of Technology)
    Abstract: In this paper, we discuss the class of Bilinear GARCH (BL-GARCH) models, which are capable of capturing simultaneously two key properties of non-linear time series: volatility clustering and leverage effects. It has often been observed that the marginal distributions of such time series have heavy tails; thus we examine the BL-GARCH model in a general setting under some non-Normal distributions. We investigate some probabilistic properties of this model and we propose and implement a maximum likelihood estimation (MLE) methodology. To evaluate the small-sample performance of this method for the various models, a Monte Carlo study is conducted. Finally, within-sample estimation properties are studied using S&P 500 daily returns, where the features of interest manifest as volatility clustering and leverage effects.
    Keywords: BL-GARCH process, elliptical distribution, leverage effects, Maximum Likelihood, Monte Carlo method, volatility clustering.
    Date: 2008–03
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00270719_v1&r=ecm
  2. By: Alain Guay; Jean-François Lamarche
    Abstract: This paper proposes Pearson-type statistics based on implied probabilities to detect structural change. The class of generalized empirical likelihood estimators (see Smith (1997)) assigns a set of probabilities to each observation such that moment conditions are satisfied. These restricted probabilities are called implied probabilities. Implied probabilities may also be constructed for standard GMM (see Back and Brown (1993)). The proposed test statistics for structural change are based on the information content of these implied probabilities. We consider cases of structural change with an unknown breakpoint, which can occur in the parameters of interest or in the overidentifying restrictions used to estimate these parameters. The test statistics considered here have good size and power properties.
    Keywords: Generalized empirical likelihood, generalized method of moments, parameter instability, structural change
    JEL: C12 C32
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:0833&r=ecm
  3. By: Abdou Kâ Diongue (UFR SAT - Université Gaston Berger - Université Gaston Berger de Saint-Louis, School of Economics and Finance - Queensland University of Technology); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: In this paper, we discuss the parameter estimation for a k-factor generalized long memory process with conditionally heteroskedastic noise. Two estimation methods are proposed. The first method is based on the conditional distribution of the process and the second is obtained as an extension of Whittle's estimation approach. For comparison purposes, Monte Carlo simulations are used to evaluate the finite sample performance of these estimation techniques.
    Keywords: Long memory, Gegenbauer polynomial, heteroskedasticity, conditional sum of squares, Whittle estimation.
    Date: 2008–01
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00235179_v1&r=ecm
  4. By: Andrea Carriero (Queen Mary, University of London); George Kapetanios (Queen Mary, University of London); Massimiliano Marcellino (Bocconi University and EUI)
    Abstract: The paper provides a proof of consistency of the ridge estimator for regressions where the number of regressors tends to infinity. This result is obtained without assuming a factor structure. A Monte Carlo study suggests that shrinkage autoregressive models can lead to very substantial advantages compared to standard autoregressive models. An empirical application focusing on forecasting inflation and GDP growth in a panel of countries confirms this finding.
    Keywords: Shrinkage, Forecasting
    JEL: C13 C22 C53
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp635&r=ecm
  5. By: Morten Ørregaard Nielsen (Queen's University)
    Abstract: This paper presents a family of simple nonparametric unit root tests indexed by one parameter, d, and containing Breitung's (2002) test as the special case d=1. It is shown that (i) each member of the family with d>0 is consistent, (ii) the asymptotic distribution depends on d, and thus reflects the parameter chosen to implement the test, and (iii) since the asymptotic distribution depends on d and the test remains consistent for all d>0, it is possible to analyze the power of the test for different values of d. The usual Phillips-Perron or Dickey-Fuller type tests are indexed by bandwidth, lag length, etc., but have none of these three properties. It is shown that members of the family with d<1 have higher asymptotic local power than the Breitung (2002) test, and when d is small the asymptotic local power of the proposed nonparametric test is relatively close to the parametric power envelope, particularly in the case with a linear time-trend. Furthermore, GLS detrending is shown to improve power when d is small, which is not the case for Breitung's (2002) test. Simulations demonstrate that when applying a sieve bootstrap procedure, the proposed variance ratio test has very good size properties, with finite sample power that is higher than that of Breitung's (2002) test and even rivals the (nearly) optimal parametric GLS detrended augmented Dickey-Fuller test with lag length chosen by an information criterion.
    Keywords: Augmented Dickey-Fuller test, fractional integration, GLS detrending, nonparametric, nuisance parameter, tuning parameter, power envelope, unit root test, variance ratio
    JEL: C22
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1185&r=ecm
  6. By: M. Angeles Carnero (Universidad de Alicante); Daniel Peña (Universidad Carlos III de Madrid); Esther Ruiz (Universidad Carlos III de Madrid)
    Abstract: The main goal when fitting GARCH models to conditionally heteroscedastic time series is to estimate the underlying volatilities. It is well known that outliers affect the estimation of the GARCH parameters. However, little is known about their effects when estimating volatilities. In this paper, we show that when estimating the volatility by using Maximum Likelihood estimates of the parameters, the biases incurred can be very large even if the estimated parameters have small biases. Consequently, we propose to use robust procedures. In particular, a simple robust estimator of the parameters is proposed, and its properties are shown to be comparable with those of other, more complicated ones available in the literature. The properties of the estimated and predicted volatilities obtained by using robust filters based on robust parameter estimates are analyzed. All the results are illustrated using daily S&P500 and IBEX35 returns.
    Keywords: Heteroscedasticity, M-estimator, QML estimator, Robustness, Financial Markets
    JEL: C22
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:ivi:wpasad:2008-13&r=ecm
  7. By: Gilles Dufrenot (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales - CNRS : UMR6579); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Anne Peguin-Feissolle (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales - CNRS : UMR6579)
    Abstract: This paper presents a 2-regime SETAR model with different long-memory processes in the two regimes. We briefly present the memory properties of this model and propose an estimation method. Such a process is applied to the absolute and squared returns of five stock indices. A comparison with simple FARIMA models is made using some forecastability criteria. Our empirical results suggest that our model offers an interesting competing framework for describing the persistent dynamics of the returns.
    Keywords: SETAR - Long-memory - Stock indices - Forecasting
    Date: 2008–04
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00185369_v1&r=ecm
  8. By: Laurent Ferrara (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, DGEI-DAMEP - Banque de France); Thomas Raffinot (CPR-Asset Management - CPR Asset Management)
    Abstract: Non-parametric methods have empirically proved to be of great interest in the statistical literature for forecasting stationary time series, but very few applications have been proposed in the econometrics literature. In this paper, our aim is to test whether non-parametric statistical procedures based on a kernel method can improve on classical linear models in nowcasting the Euro area manufacturing industrial production index (IPI) using business surveys released by the European Commission. Moreover, we consider a methodology based on bootstrap replications to estimate confidence intervals for the nowcasts.
    Keywords: Non-parametric, Kernel, nowcasting, bootstrap, Euro area IPI.
    Date: 2008–04
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00275769_v1&r=ecm
  9. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: In this paper we deal with the problem of non-stationarity encountered in many data sets, mainly in the financial and economic domains, arising from the presence of multiple seasonalities, jumps, volatility, distortion, aggregation, etc. The existence of non-stationarity induces spurious behaviors in estimated statistics as soon as we work with finite samples. We illustrate this fact using Markov switching processes, Stopbreak models and SETAR processes. Thus, working within a theoretical framework based on the existence of an invariant measure for the whole sample is not satisfactory. Empirically, alternative strategies have been developed that introduce dynamics into the modelling, mainly through the parameters, with the use of rolling windows. A specific framework has not yet been proposed to study such non-invariant data sets. The question is difficult. Here, we open a discussion on this topic by proposing the concept of a meta-distribution, which can be used to improve risk management strategies or forecasts.
    Keywords: Non-stationarity, switching processes, SETAR processes, jumps, forecast, risk management, copula, probability distribution function.
    Date: 2008–03
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00270708_v1&r=ecm
  10. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Justin Leroux (Institute for Applied Economics - HEC MONTRÉAL, CIRPEE - Centre Interuniversitaire sur le Risque, les Politiques Economiques et l'Emploi)
    Abstract: We propose a novel methodology for forecasting chaotic systems which is based on the nearest-neighbor predictor and improves upon it by incorporating local Lyapunov exponents to correct for its inevitable bias. Using simulated data, we show that gains in prediction accuracy can be substantial. The general intuition behind the proposed method can readily be applied to other non-parametric predictors.
    Keywords: Chaos theory, Lyapunov exponent, logistic map, Monte Carlo simulations.
    Date: 2008–02
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00259238_v1&r=ecm
  11. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: The detection of chaotic behaviors in commodities, stock markets and weather data is usually complicated by the large noise perturbation inherent to the underlying system. It is well known that predictions from purely deterministic chaotic systems can be accurate mainly in the short term. Thus, it is important to be able to reconstruct, in a robust way, the attractor in which the data evolve, if this attractor exists. In chaos theory, deconvolution methods have been studied extensively, and the different existing approaches are competitive and complementary. In this work, we apply two such methods: the singular value method and the wavelet approach. The latter has not been investigated much for filtering chaotic systems. Using very large Monte Carlo simulations, we demonstrate the ability of this deconvolution method. We then use the de-noised data set to forecast, and discuss in depth the possibility of long-term forecasts with chaotic systems.
    Keywords: Deconvolution, chaos, SVD, state space method, wavelets method.
    Date: 2008–01
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00235448_v1&r=ecm
  12. By: Ricardo Mestre (Corresponding author: European Central Bank, DG Research, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Peter McAdam (European Central Bank, DG Research, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.)
    Abstract: We evaluate residual projection strategies in the context of a large-scale macro model of the euro area and smaller benchmark time-series models. The exercises attempt to measure the accuracy of model-based forecasts simulated both out-of-sample and in-sample. Both exercises incorporate alternative residual-projection methods, to assess the importance of unaccounted-for breaks in forecast accuracy and off-model judgment. The conclusions reached are that simple mechanical residual adjustments have a significant impact on forecasting accuracy irrespective of the model in use, ostensibly due to the presence of breaks in trends in the data. The testing procedure and conclusions are applicable to a wide class of models and thus of general interest. JEL Classification: C52, E30, E32, E37.
    Keywords: Macro-model, Forecast Projections, Out-of-Sample, In-Sample, Forecast Accuracy, Structural Break.
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080950&r=ecm
  13. By: D.S.G. Pollock
    Abstract: This paper expounds some of the results of Fourier theory that are essential to the statistical analysis of time series. It employs the algebra of circulant matrices to expose the structure of the discrete Fourier transform and to elucidate the filtering operations that may be applied to finite data sequences. An ideal filter with a gain of unity throughout the pass band and a gain of zero throughout the stop band is commonly regarded as incapable of being realised in finite samples. It is shown here that, to the contrary, such a filter can be realised both in the time domain and in the frequency domain. The algebra of circulant matrices is also helpful in revealing the nature of statistical processes that are band-limited in the frequency domain. In order to apply the conventional techniques of autoregressive moving-average modelling, the data generated by such processes must be subjected to antialiasing filtering and subsampling. These techniques are also described. It is argued that band-limited processes are more prevalent in statistical and econometric time series than is commonly recognised.
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:08/36&r=ecm
  14. By: Elena Angelini (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Gonzalo Camba-Méndez (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Domenico Giannone (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Gerhard Rünstler (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Lucrezia Reichlin (London Business School, Regent’s Park, London NW1 4SA, United Kingdom.)
    Abstract: This paper evaluates models that exploit timely monthly releases to compute early estimates of current quarter GDP (now-casting) in the euro area. We compare traditional methods used at institutions with a new method proposed by Giannone, Reichlin, and Small (2005). The method consists in bridging quarterly GDP with monthly data via a regression on factors extracted from a large panel of monthly series with different publication lags. We show that bridging via factors produces more accurate estimates than traditional bridge equations. We also show that survey data and other ‘soft’ information are valuable for now-casting. JEL Classification: E52, C33, C53.
    Keywords: Forecasting, Monetary Policy, Factor Model, Real Time Data, Large datasets, News.
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080949&r=ecm
  15. By: Ibrahim Ahamada (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Philippe Jolivaldt (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: A unit root test based on wavelet theory was recently proposed (Gençay and Fan, 2007). While the new test is supposed to be robust to the initial value, we show, by contrast, that the initial value has significant effects on its size and power. We also find that the wavelet unit root test and the ADF test are equally efficient once the data are corrected for the initial value. Our approach is based on Monte Carlo experiments.
    Keywords: Unit root tests, wavelets, Monte Carlo experiments, size-power curve.
    Date: 2008–03
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00275767_v1&r=ecm
  16. By: Elena Angelini (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Marta Bańbura (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Gerhard Rünstler (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.)
    Abstract: We estimate and forecast growth in euro area monthly GDP and its components from a dynamic factor model due to Doz et al. (2005), which handles unbalanced data sets in an efficient way. We extend the model to integrate interpolation and forecasting together with cross-equation accounting identities. A pseudo real-time forecasting exercise indicates that the model outperforms various benchmarks, such as quarterly time series models and bridge equations in forecasting growth in quarterly GDP and its components. JEL Classification: E37, C53.
    Keywords: Dynamic factor models, interpolation, nowcasting.
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080953&r=ecm
  17. By: Nii Ayi Armah; Norman R. Swanson
    Abstract: In economics, common factors are often assumed to underlie the co-movements of a set of macroeconomic variables. For this reason, many authors have used estimated factors in the construction of prediction models. In this paper, we begin by surveying the extant literature on diffusion indexes. We then outline a number of approaches to the selection of factor proxies (observed variables that proxy unobserved estimated factors) using the statistics developed in Bai and Ng (2006a,b). Our approach to factor proxy selection is examined via a small Monte Carlo experiment, where evidence supporting our proposed methodology is presented, and via a large set of prediction experiments using the panel dataset of Stock and Watson (2005). One of our main empirical findings is that our “smoothed” approaches to factor proxy selection appear to yield predictions that are often superior not only to a benchmark factor model, but also to simple linear time series models which are generally difficult to beat in forecasting competitions. In some sense, by using our approach to predictive factor proxy selection, one is able to open up the “black box” often associated with factor analysis, and to identify actual variables that can serve as primitive building blocks for (prediction) models of a host of macroeconomic variables, and that can also serve as policy instruments, for example. Our findings suggest that important observable variables include various S&P500 variables, including stock price indices and dividend series; a 1-year Treasury bond rate; various housing activity variables; industrial production; and exchange rates.
    Keywords: Macroeconomics
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:fip:fedpwp:08-25&r=ecm
  18. By: Aurélien Hazan (IBISC - Informatique, Biologie Intégrative et Systèmes Complexes - CNRS : FRE2873 - Université d'Evry-Val d'Essonne); Vincent Vigneron (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, SAMOS - Statistique Appliquée et MOdélisation Stochastique - Université Panthéon-Sorbonne - Paris I)
    Abstract: The various scales of a signal maintain dependence relations with one another. These relations can vary in time and reveal speed changes in the studied phenomenon. To establish such changes, one first computes the wavelet transform of the signal at various scales. One then studies the statistical dependences between these transforms by means of a mutual information estimator. We propose to summarize the resulting network of dependences by a dependence graph, obtained by thresholding or quantizing the mutual information values. The method can be applied to several types of signals, such as fluctuations of market indexes, for instance the S&P 500, or high-frequency foreign exchange (FX) rates.
    Keywords: wavelets; dependence; mutual information; financial time series; FX
    Date: 2008–06–05
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:hal-00287463_v1&r=ecm
  19. By: Edward S. Knotek II; Stephen Terry
    Abstract: Discrete Markov chains can be useful to approximate vector autoregressive processes for economists doing computational work. One such approximation method first presented by Tauchen (1986) operates under the general theoretical assumption of a transformed VAR with diagonal covariance structure for the process error term. We demonstrate one simple method of more conveniently treating this approximation problem in practice using readily available multivariate-normal integration techniques to allow for arbitrary positive-semidefinite covariance structures. Examples are provided using processes with non-diagonal and singular non-diagonal error covariances.
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:fip:fedkrw:rwp08-02&r=ecm
  20. By: Simon Luechinger; Alois Stutzer; Rainer Winkelmann
    Abstract: We discuss a new approach to specifying and estimating ordered probit models with endogenous switching, or with a binary endogenous regressor, based on copula functions. These models provide a framework of analysis for self-selection in economic well-being equations, where assignment of regressors may be choice based, resulting from well-being maximization, rather than random. In an application to public and private sector job satisfaction, and using data on male workers from the German Socio-Economic Panel, we find that a model based on Frank's copula is preferred over two alternative models with independence and normal copula, respectively. The results suggest that public sector workers are negatively selected.
    Keywords: Ordered probit, switching regression, Frank copula, job satisfaction, German Socio-Economic Panel
    JEL: I31 C23
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:diw:diwsop:diw_sp135&r=ecm
  21. By: Laurent Ferrara (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, DGEI-DAMEP - Banque de France); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: Business surveys are an important element in the analysis of the short-term economic situation because of the timeliness and nature of the information they convey. In particular, surveys often enter econometric models to provide an early assessment of the current state of the economy, which is of great interest to policy-makers. In this paper, we focus on non-seasonally adjusted business surveys released by the European Commission. We introduce an innovative way of modelling those series that takes the persistence of the seasonal roots into account through seasonal-cyclical long memory models. We empirically show that such models produce more accurate forecasts than classical seasonal linear models.
    Keywords: Euro area, nowcasting, business surveys, seasonal, long memory.
    Date: 2008–05
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00277379_v1&r=ecm
  22. By: Laurens Cherchye; Bram De Rock; Frederic Vermeulen
    Abstract: We provide nonparametric 'revealed preference' tests of general collective consumption models that account for public consumption and externalities within the household. We further propose a novel approach to model special cases of the general collective model, which imply alternative assumptions regarding the sharing rule that underlies the observed collective consumption behavior. Our application uses data from the Russia Longitudinal Monitoring Survey (RLMS); the panel structure of this data set allows nonparametric testing of the behavioral models without relying on preference homogeneity assumptions across similar individuals. Our application includes test results but also a power analysis for the models under study. Our main finding is that the most general collective model, together with a large class of special cases of the general model, cannot be rejected for the couples in our data set. By contrast, our tests do reject the standard unitary model for these couples. Since our tests are entirely nonparametric, this provides strong evidence in favor of models focusing on intrahousehold decision-making.
    Keywords: collective household models, intrahousehold allocation, revealed preferences, nonparametric analysis
    JEL: D11 D12 C14
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2008_030&r=ecm
  23. By: Alexander Staus (Institute for Agricultural Policy and Agricultural Markets, University of Hohenheim)
    Abstract: Modeling consumer choice in different areas has led to an increased use of discrete choice models. Probit or multinomial logit models are often the basis of further empirical research on consumer choice. In some of these models the equations to solve have no closed-form expression: they include multi-dimensional integrals which cannot be solved analytically. Simulation methods have been developed to approximate a solution for these integrals. This paper describes the standard Halton sequence and a modification of it, the shuffled Halton sequence. Both are simulation methods which can reduce computational effort compared to a random sequence. We compare the simulation methods in their coverage of the multi-dimensional area and in their estimation results using data on consumer choice of grocery store formats.
    Keywords: simulation, mixed logit, Halton sequence
    JEL: C15 C25
    Date: 2008–09
    URL: http://d.repec.org/n?u=RePEc:hoh:hoh420:17&r=ecm
  24. By: Josep Maria Puigvert Gutiérrez (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Josep Fortiana Gregori (Universitat de Barcelona, Departament de Probabilitat, Lògica i Estadística, Gran Via de les Corts Catalanes, 585, 08007 Barcelona, Spain.)
    Abstract: In this study we combine clustering techniques with a moving window algorithm in order to filter financial market data outliers. We apply the algorithm to a set of financial market data which consists of 25 series selected from a larger dataset using a cluster analysis technique taking into account the daily behaviour of the market; each of these series is an element of a cluster that represents a different segment of the market. We set up a framework of possible algorithm parameter combinations that detect most of the outliers by market segment. In addition, the algorithm parameters that have been found can also be used to detect outliers in other series with similar economic behaviour in the same cluster. Moreover, the crosschecking of the behaviour of different series within each cluster reduces the possibility of observations being misclassified as outliers. JEL Classification: C19, C49, G19.
    Keywords: Outliers, financial market, cluster analysis, moving filtering window algorithm.
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080948&r=ecm
  25. By: Santarossa, Gino
    Abstract: This note briefly describes the regression discontinuity method, which estimates program impact when participation is a function of a selection variable crossing a predetermined value (threshold). The program impact evaluation may be done in two steps. First, one graphically analyses the relation between the dependent variable and the selection variable in order to get some information about the functional form and the nature of the discontinuity at the threshold. Then, an econometric analysis is formally done in order to estimate the program impact.
    Keywords: regression discontinuity, program evaluation, impact, econometrics
    JEL: C01
    Date: 2008–10–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:11268&r=ecm
  26. By: Essahbi Essaadi (GATE - Groupe d'analyse et de théorie économique - CNRS : UMR5824 - Université Lumière - Lyon II - Ecole Normale Supérieure Lettres et Sciences Humaines); Mohamed Boutahar (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales - CNRS : UMR6579)
    Abstract: In this paper, we test the instability of comovement, in the time and frequency domains, for the GDP growth rates of the US and the UK. We use a frequency-domain approach based on evolutionary spectral analysis (Priestley, 1965, 1996). A graphical analysis of the Time-Varying Coherence Function (TVCF) reveals variability in the correlation between the two series. Our goal is first to estimate the TVCF of the two series, then to test stability in both the cross-spectral density and the TVCF by detecting breakpoints in each function.
    Keywords: comovement ; spectral analysis ; time-varying coherence function ; structural change
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:hal:journl:halshs-00333582_v1&r=ecm
  27. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Cyril Caillault (FORTIS Investments - Fortis investments)
    Abstract: Using non-parametric (copula) and parametric models, we show that the bivariate distribution of an Asian portfolio is not stable over the period under study. We suggest several dynamic models to compute two market risk measures, the Value at Risk and the Expected Shortfall: the RiskMetrics methodology, multivariate GARCH models, multivariate Markov-switching models, the empirical histogram and dynamic copulas. We discuss the choice of the best method with respect to the policy management of bank supervisors. The copula approach seems to be a good compromise between all these models: it permits taking financial crises into account and obtaining a low capital requirement during the most important crises.
    Keywords: Value at Risk - Expected Shortfall - Copula - RiskMetrics - Risk management - GARCH models - Switching models.
    Date: 2008–03–06
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00185374_v1&r=ecm

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.