on Econometrics
By: | Abdou Kâ Diongue (UFR SAT - Université Gaston Berger - Université Gaston Berger de Saint-Louis); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Rodney C. Wolff (School of Mathematical Sciences - Queensland University of Technology) |
Abstract: | In this paper, we discuss the class of Bilinear GARCH (BL-GARCH) models, which are capable of capturing simultaneously two key properties of non-linear time series: volatility clustering and leverage effects. It has often been observed that the marginal distributions of such time series have heavy tails; thus we examine the BL-GARCH model in a general setting under some non-Normal distributions. We investigate some probabilistic properties of this model and we propose and implement a maximum likelihood estimation (MLE) methodology. To evaluate the small-sample performance of this method for the various models, a Monte Carlo study is conducted. Finally, within-sample estimation properties are studied using S&P 500 daily returns, when the features of interest manifest as volatility clustering and leverage effects. |
Keywords: | BL-GARCH process, elliptical distribution, leverage effects, Maximum Likelihood, Monte Carlo method, volatility clustering. |
Date: | 2008–03 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00270719_v1&r=ecm |
By: | Alain Guay; Jean-François Lamarche |
Abstract: | This paper proposes Pearson-type statistics based on implied probabilities to detect structural change. The class of generalized empirical likelihood estimators (see Smith (1997)) assigns a set of probabilities to each observation such that moment conditions are satisfied. These restricted probabilities are called implied probabilities. Implied probabilities may also be constructed for standard GMM (see Back and Brown (1993)). The proposed test statistics for structural change are based on the information content of these implied probabilities. We consider cases of structural change with an unknown breakpoint, which can occur in the parameters of interest or in the overidentifying restrictions used to estimate these parameters. The test statistics considered here have good size and power properties. |
Keywords: | Generalized empirical likelihood, generalized method of moments, parameter instability, structural change |
JEL: | C12 C32 |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:lvl:lacicr:0833&r=ecm |
By: | Abdou Kâ Diongue (UFR SAT - Université Gaston Berger - Université Gaston Berger de Saint-Louis, School of Economics and Finance - Queensland University of Technology); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris) |
Abstract: | In this paper, we discuss the parameter estimation for a k-factor generalized long memory process with conditionally heteroskedastic noise. Two estimation methods are proposed. The first method is based on the conditional distribution of the process and the second is obtained as an extension of Whittle's estimation approach. For comparison purposes, Monte Carlo simulations are used to evaluate the finite sample performance of these estimation techniques. |
Keywords: | Long memory, Gegenbauer polynomial, heteroskedasticity, conditional sum of squares, Whittle estimation. |
Date: | 2008–01 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00235179_v1&r=ecm |
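As a rough illustration of the building block behind such k-factor generalized (Gegenbauer) long-memory processes, the sketch below computes the coefficients of a single Gegenbauer filter (1 - 2uz + z^2)^(-d) via the standard three-term recurrence. This is not the authors' estimation code; the parameter values are arbitrary examples.

```python
import numpy as np

def gegenbauer_coefficients(d, u, n):
    """Coefficients c_j of (1 - 2u*z + z^2)^(-d) = sum_j c_j z^j.

    These are the Gegenbauer polynomials C_j^d(u), obtained from the
    classical three-term recurrence.  They give the MA(infinity)
    weights of one Gegenbauer long-memory factor; a k-factor model
    composes several such filters.
    """
    c = np.empty(n)
    c[0] = 1.0
    if n > 1:
        c[1] = 2.0 * d * u
    for j in range(2, n):
        c[j] = (2.0 * u * (j + d - 1.0) * c[j - 1]
                - (j + 2.0 * d - 2.0) * c[j - 2]) / j
    return c

# illustrative parameter choices, not taken from the paper
coef = gegenbauer_coefficients(d=0.3, u=0.5, n=50)
```

For a quick sanity check, C_2^d(u) = 2d(d+1)u^2 - d, which the recurrence reproduces.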
By: | Andrea Carriero (Queen Mary, University of London); George Kapetanios (Queen Mary, University of London); Massimiliano Marcellino (Bocconi University and EUI) |
Abstract: | The paper provides a proof of consistency of the ridge estimator for regressions where the number of regressors tends to infinity. This result is obtained without assuming a factor structure. A Monte Carlo study suggests that shrinkage autoregressive models can lead to very substantial advantages compared to standard autoregressive models. An empirical application focusing on forecasting inflation and GDP growth in a panel of countries confirms this finding. |
Keywords: | Shrinkage, Forecasting |
JEL: | C13 C22 C53 |
Date: | 2008–10 |
URL: | http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp635&r=ecm |
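The shrinkage autoregression studied in the paper can be sketched in a few lines: fit an AR(p) by ridge regression and forecast one step ahead. The lag order, penalty, and synthetic AR(1) data below are illustrative choices, not the paper's specification.

```python
import numpy as np

def ridge_ar_forecast(y, p=4, lam=10.0):
    """One-step-ahead forecast from an AR(p) fitted by ridge regression.

    The ridge penalty shrinks the lag coefficients toward zero, which
    can beat unpenalised OLS when many lags are included.
    """
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Lag matrix: row t holds (y_{t-1}, ..., y_{t-p}) for t = p..T-1
    X = np.column_stack([y[p - k - 1:T - k - 1] for k in range(p)])
    target = y[p:]
    # Ridge normal equations: (X'X + lam*I) b = X'y
    b = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ target)
    # Forecast uses the most recent p observations, newest first
    return float(np.dot(b, y[-1:-p - 1:-1]))

# synthetic AR(1) data for demonstration
rng = np.random.default_rng(0)
e = rng.standard_normal(200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.7 * y[t - 1] + e[t]
fc = ridge_ar_forecast(y, p=4, lam=10.0)
```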
By: | Morten Ørregaard Nielsen (Queen's University) |
Abstract: | This paper presents a family of simple nonparametric unit root tests indexed by one parameter, d, and containing Breitung's (2002) test as the special case d=1. It is shown that (i) each member of the family with d>0 is consistent, (ii) the asymptotic distribution depends on d, and thus reflects the parameter chosen to implement the test, and (iii) since the asymptotic distribution depends on d and the test remains consistent for all d>0, it is possible to analyze the power of the test for different values of d. The usual Phillips-Perron or Dickey-Fuller type tests are indexed by bandwidth, lag length, etc., but have none of these three properties. It is shown that members of the family with d<1 have higher asymptotic local power than the Breitung (2002) test, and when d is small the asymptotic local power of the proposed nonparametric test is relatively close to the parametric power envelope, particularly in the case with a linear time-trend. Furthermore, GLS detrending is shown to improve power when d is small, which is not the case for Breitung's (2002) test. Simulations demonstrate that when applying a sieve bootstrap procedure, the proposed variance ratio test has very good size properties, with finite sample power that is higher than that of Breitung's (2002) test and even rivals the (nearly) optimal parametric GLS detrended augmented Dickey-Fuller test with lag length chosen by an information criterion. |
Keywords: | Augmented Dickey-Fuller test, fractional integration, GLS detrending, nonparametric, nuisance parameter, tuning parameter, power envelope, unit root test, variance ratio |
JEL: | C22 |
Date: | 2008–10 |
URL: | http://d.repec.org/n?u=RePEc:qed:wpaper:1185&r=ecm |
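The d=1 member of the family, Breitung's (2002) variance-ratio statistic, is simple enough to sketch. The version below omits detrending and uses no critical values; it only shows the mechanics of the partial-sum ratio.

```python
import numpy as np

def breitung_variance_ratio(y):
    """Breitung's (2002) nonparametric variance-ratio statistic.

    rho_T = T^{-2} * sum_t U_t^2 / sum_t y_t^2, where U_t is the
    partial sum of y up to t.  Small values of rho_T speak against a
    unit root.  The paper's family generalises the order of partial
    summation through the parameter d (this is the d=1 case).
    Detrending/demeaning is deliberately omitted here.
    """
    y = np.asarray(y, dtype=float)
    T = len(y)
    U = np.cumsum(y)
    return float(np.sum(U ** 2) / (T ** 2 * np.sum(y ** 2)))

rng = np.random.default_rng(1)
eps = rng.standard_normal(500)
stat_stationary = breitung_variance_ratio(eps)           # white noise
stat_unit_root = breitung_variance_ratio(np.cumsum(eps)) # random walk
```

Under stationarity the statistic collapses toward zero at rate 1/T, while under a unit root it converges to a nondegenerate limit, which is what makes the test consistent.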
By: | M. Angeles Carnero (Universidad de Alicante); Daniel Peña (Universidad Carlos III de Madrid); Esther Ruiz (Universidad Carlos III de Madrid) |
Abstract: | The main goal when fitting GARCH models to conditionally heteroscedastic time series is to estimate the underlying volatilities. It is well known that outliers affect the estimation of the GARCH parameters. However, little is known about their effects when estimating volatilities. In this paper, we show that when estimating the volatility using Maximum Likelihood estimates of the parameters, the biases incurred can be very large even if the estimated parameters have small biases. Consequently, we propose to use robust procedures. In particular, a simple robust estimator of the parameters is proposed, and its properties are shown to be comparable with those of other, more complicated, estimators available in the literature. The properties of the estimated and predicted volatilities obtained by using robust filters based on robust parameter estimates are analyzed. All the results are illustrated using daily S&P500 and IBEX35 returns. |
Keywords: | Heteroscedasticity, M-estimator, QML estimator, Robustness, Financial Markets |
JEL: | C22 |
Date: | 2008–10 |
URL: | http://d.repec.org/n?u=RePEc:ivi:wpasad:2008-13&r=ecm |
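The outlier-propagation problem the paper studies is easy to see in a GARCH(1,1) volatility filter. The sketch below contrasts the plain filter with a simple truncation-based robustification; this is only in the spirit of the robust filters analysed in the paper, not their exact construction, and all parameter values are illustrative.

```python
import numpy as np

def garch_volatility(returns, omega, alpha, beta, clip=None):
    """Filter conditional variances from a GARCH(1,1) recursion.

    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
    With `clip` set, the squared shock entering the recursion is capped
    at (clip standard deviations)^2 -- a crude robustification that
    stops one outlier from inflating many subsequent variances.
    """
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty_like(r)
    sigma2[0] = np.var(r)                 # a common initialisation choice
    for t in range(1, len(r)):
        shock = r[t - 1] ** 2
        if clip is not None:
            shock = min(shock, clip ** 2 * sigma2[t - 1])
        sigma2[t] = omega + alpha * shock + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(2)
r = 0.01 * rng.standard_normal(300)
r[150] = 0.2                              # one planted additive outlier
vol_plain = garch_volatility(r, 1e-6, 0.05, 0.9)
vol_robust = garch_volatility(r, 1e-6, 0.05, 0.9, clip=3.0)
```

After the outlier, the plain filter's variance estimate jumps by roughly alpha times the squared outlier and decays only at rate beta, while the clipped filter barely moves.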
By: | Gilles Dufrenot (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales - CNRS : UMR6579); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Anne Peguin-Feissolle (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales - CNRS : UMR6579) |
Abstract: | This paper presents a 2-regime SETAR model with different long-memory processes in both regimes. We briefly present the memory properties of this model and propose an estimation method. Such a process is applied to the absolute and squared returns of five stock indices. A comparison with simple FARIMA models is made using some forecastibility criteria. Our empirical results suggest that our model offers an interesting alternative competing framework to describe the persistent dynamics in modeling the returns. |
Keywords: | SETAR - Long-memory - Stock indices - Forecasting |
Date: | 2008–04 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00185369_v1&r=ecm |
By: | Laurent Ferrara (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, DGEI-DAMEP - Banque de France); Thomas Raffinot (CPR-Asset Management - CPR Asset Management) |
Abstract: | Non-parametric methods have proved to be of great empirical interest in the statistical literature for forecasting stationary time series, but very few applications have been proposed in the econometrics literature. In this paper, our aim is to test whether non-parametric statistical procedures based on a kernel method can improve on classical linear models for nowcasting the Euro area manufacturing industrial production index (IPI) using business surveys released by the European Commission. Moreover, we consider a methodology based on bootstrap replications to estimate confidence intervals for the nowcasts. |
Keywords: | Non-parametric, Kernel, nowcasting, bootstrap, Euro area IPI. |
Date: | 2008–04 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00275769_v1&r=ecm |
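The generic kernel smoother underlying such a nowcasting exercise is the Nadaraya-Watson estimator: a locally weighted average of past outcomes, with weights set by a kernel around the current predictor value. The Gaussian kernel, bandwidth, and synthetic data below are illustrative, not the authors' specification.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x0, h):
    """Nadaraya-Watson kernel regression estimate of E[y | x = x0].

    A Gaussian kernel with bandwidth h weights the training points;
    predictions are the weighted average of their y values.
    """
    x = np.asarray(x_train, dtype=float)
    y = np.asarray(y_train, dtype=float)
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)   # Gaussian kernel weights
    return float(np.sum(w * y) / np.sum(w))

# synthetic example: nowcast a smooth target from a noisy predictor
rng = np.random.default_rng(8)
x_train = rng.uniform(0.0, 2.0, 300)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(300)
est = nadaraya_watson(x_train, y_train, x0=1.0, h=0.2)
```

In the paper's setting, x would be a survey balance and y the IPI growth rate; bootstrap replications of the training pairs would then give a confidence interval for `est`.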
By: | Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris) |
Abstract: | In this paper we deal with the problem of non-stationarity encountered in many data sets, mainly in the financial and economic domains, arising from the presence of multiple seasonalities, jumps, volatility, distortion, aggregation, etc. The existence of non-stationarity gives rise to spurious behaviors in estimated statistics as soon as we work with finite samples. We illustrate this fact using Markov switching processes, Stopbreak models and SETAR processes. Thus, working with a theoretical framework based on the existence of an invariant measure for the whole sample is not satisfactory. Empirically, alternative strategies have been developed that introduce dynamics into the modelling, mainly through the parameters, with the use of rolling windows. A specific framework has not yet been proposed to study such non-invariant data sets, and the question is a difficult one. Here, we open a discussion on this topic by proposing the concept of a meta-distribution, which can be used to improve risk management strategies or forecasts. |
Keywords: | Non-stationarity, switching processes, SETAR processes, jumps, forecast, risk management, copula, probability distribution function. |
Date: | 2008–03 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00270708_v1&r=ecm |
By: | Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Justin Leroux (Institute for Applied Economics - HEC MONTRÉAL, CIRPEE - Centre Interuniversitaire sur le Risque, les Politiques Economiques et l'Emploi) |
Abstract: | We propose a novel methodology for forecasting chaotic systems which is based on the nearest-neighbor predictor and improves upon it by incorporating local Lyapunov exponents to correct for its inevitable bias. Using simulated data, we show that gains in prediction accuracy can be substantial. The general intuition behind the proposed method can readily be applied to other non-parametric predictors. |
Keywords: | Chaos theory, Lyapunov exponent, logistic map, Monte Carlo simulations. |
Date: | 2008–02 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00259238_v1&r=ecm |
By: | Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris) |
Abstract: | The detection of chaotic behaviors in commodities, stock markets and weather data is usually complicated by the large noise perturbation inherent to the underlying system. It is well known that predictions from purely deterministic chaotic systems can be accurate mainly in the short term. Thus, it is important to be able to reconstruct, in a robust way, the attractor in which the data evolve, if this attractor exists. In chaos theory, deconvolution methods have been studied extensively, and there exist different approaches which are competitive and complementary. In this work, we apply two such methods: the singular value method and the wavelet approach. The latter has not been investigated much for filtering chaotic systems. Using very large Monte Carlo simulations, we show the ability of this deconvolution method. We then use the de-noised data set to forecast, and we discuss in depth the possibility of making long-term forecasts with chaotic systems. |
Keywords: | Deconvolution, chaos, SVD, state space method, wavelets method. |
Date: | 2008–01 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00235448_v1&r=ecm |
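The singular value approach the paper compares against can be sketched as trajectory-matrix (Hankel) SVD truncation: embed the noisy series in a delay matrix, keep the leading singular components, and average the anti-diagonals back into a series. The window length, rank, and sine-wave test signal below are illustrative choices.

```python
import numpy as np

def svd_denoise(signal, window=20, rank=2):
    """Denoise a series by truncating the SVD of its trajectory matrix.

    Builds the Hankel (delay) matrix H[i, j] = x[i + j], keeps the
    leading `rank` singular components, and re-Hankelises by averaging
    entries along each anti-diagonal.
    """
    x = np.asarray(signal, dtype=float)
    n = len(x)
    K = n - window + 1
    H = np.column_stack([x[i:i + window] for i in range(K)])  # window x K
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    H_r = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # average entries with the same x-index (i + j constant)
    out = np.zeros(n)
    cnt = np.zeros(n)
    for j in range(K):
        out[j:j + window] += H_r[:, j]
        cnt[j:j + window] += 1.0
    return out / cnt

rng = np.random.default_rng(3)
t = np.arange(200)
clean = np.sin(2 * np.pi * t / 25)        # rank-2 in trajectory space
noisy = clean + 0.3 * rng.standard_normal(200)
den = svd_denoise(noisy, window=20, rank=2)
```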
By: | Ricardo Mestre (Corresponding author: European Central Bank, DG Research, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Peter McAdam (European Central Bank, DG Research, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.) |
Abstract: | We evaluate residual projection strategies in the context of a large-scale macro model of the euro area and smaller benchmark time-series models. The exercises attempt to measure the accuracy of model-based forecasts simulated both out-of-sample and in-sample. Both exercises incorporate alternative residual-projection methods, to assess the importance of unaccounted-for breaks for forecast accuracy and off-model judgment. We conclude that simple mechanical residual adjustments have a significant impact on forecasting accuracy irrespective of the model in use, ostensibly due to the presence of breaks in trends in the data. The testing procedure and conclusions are applicable to a wide class of models and thus of general interest. JEL Classification: C52, E30, E32, E37. |
Keywords: | Macro-model, Forecast Projections, Out-of-Sample, In-Sample, Forecast Accuracy, Structural Break. |
Date: | 2008–10 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080950&r=ecm |
By: | D.S.G. Pollock |
Abstract: | This paper expounds some of the results of Fourier theory that are essential to the statistical analysis of time series. It employs the algebra of circulant matrices to expose the structure of the discrete Fourier transform and to elucidate the filtering operations that may be applied to finite data sequences. An ideal filter with a gain of unity throughout the pass band and a gain of zero throughout the stop band is commonly regarded as incapable of being realised in finite samples. It is shown here that, to the contrary, such a filter can be realised both in the time domain and in the frequency domain. The algebra of circulant matrices is also helpful in revealing the nature of statistical processes that are band-limited in the frequency domain. In order to apply the conventional techniques of autoregressive moving-average modelling, the data generated by such processes must be subjected to anti-aliasing filtering and subsampling. These techniques are also described. It is argued that band-limited processes are more prevalent in statistical and econometric time series than is commonly recognised. |
Date: | 2008–10 |
URL: | http://d.repec.org/n?u=RePEc:lec:leecon:08/36&r=ecm |
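The frequency-domain realisation of the ideal filter is straightforward to sketch: zeroing DFT coefficients above a cutoff is exactly multiplication by a circulant matrix, with the usual circular-wrapping caveat. The cutoff and test frequencies below are illustrative.

```python
import numpy as np

def ideal_lowpass(x, cutoff):
    """Ideal (brick-wall) lowpass filter realised in the frequency domain.

    Unit gain in the pass band, zero gain in the stop band, applied by
    zeroing DFT coefficients -- equivalent to multiplying the data by a
    circulant filter matrix on the finite sample.
    """
    X = np.fft.rfft(np.asarray(x, dtype=float))
    freqs = np.fft.rfftfreq(len(x))       # cycles per sample, in [0, 0.5]
    X[freqs > cutoff] = 0.0
    return np.fft.irfft(X, n=len(x))

# two sinusoids on exact DFT frequencies: the filter separates them exactly
n = 256
t = np.arange(n)
low = np.sin(2 * np.pi * 4 * t / n)       # frequency 4/256 ~ 0.016
high = np.sin(2 * np.pi * 60 * t / n)     # frequency 60/256 ~ 0.234
filtered = ideal_lowpass(low + high, cutoff=0.1)
```

On components that sit exactly on DFT frequencies the separation is exact up to floating-point error, which illustrates the paper's point that the "unrealisable" ideal filter is realisable in finite samples.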
By: | Elena Angelini (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Gonzalo Camba-Méndez (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Domenico Giannone (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Gerhard Rünstler (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Lucrezia Reichlin (London Business School, Regent’s Park, London NW1 4SA, United Kingdom.) |
Abstract: | This paper evaluates models that exploit timely monthly releases to compute early estimates of current quarter GDP (now-casting) in the euro area. We compare traditional methods used at institutions with a new method proposed by Giannone, Reichlin, and Small (2005). The method consists in bridging quarterly GDP with monthly data via a regression on factors extracted from a large panel of monthly series with different publication lags. We show that bridging via factors produces more accurate estimates than traditional bridge equations. We also show that survey data and other ‘soft’ information are valuable for now-casting. JEL Classification: E52, C33, C53. |
Keywords: | Forecasting, Monetary Policy, Factor Model, Real Time Data, Large datasets, News. |
Date: | 2008–10 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080949&r=ecm |
By: | Ibrahim Ahamada (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Philippe Jolivaldt (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris) |
Abstract: | A test for a unit root based on wavelet theory was recently proposed (Gençay and Fan, 2007). While the new test is supposed to be robust to the initial value, we bring out, by contrast, the significant effects of the initial value on its size and power. We also find that the wavelet unit root test and the ADF test are equally efficient once the data are corrected for the initial value. Our approach is based on Monte Carlo experiments. |
Keywords: | Unit root tests, wavelets, Monte Carlo experiments, size-power curve. |
Date: | 2008–03 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00275767_v1&r=ecm |
By: | Elena Angelini (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Marta Bańbura (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Gerhard Rünstler (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.) |
Abstract: | We estimate and forecast growth in euro area monthly GDP and its components from a dynamic factor model due to Doz et al. (2005), which handles unbalanced data sets in an efficient way. We extend the model to integrate interpolation and forecasting together with cross-equation accounting identities. A pseudo real-time forecasting exercise indicates that the model outperforms various benchmarks, such as quarterly time series models and bridge equations in forecasting growth in quarterly GDP and its components. JEL Classification: E37, C53. |
Keywords: | Dynamic factor models, interpolation, nowcasting. |
Date: | 2008–10 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080953&r=ecm |
By: | Nii Ayi Armah; Norman R. Swanson |
Abstract: | In economics, common factors are often assumed to underlie the co-movements of a set of macroeconomic variables. For this reason, many authors have used estimated factors in the construction of prediction models. In this paper, we begin by surveying the extant literature on diffusion indexes. We then outline a number of approaches to the selection of factor proxies (observed variables that proxy unobserved estimated factors) using the statistics developed in Bai and Ng (2006a,b). Our approach to factor proxy selection is examined via a small Monte Carlo experiment, where evidence supporting our proposed methodology is presented, and via a large set of prediction experiments using the panel dataset of Stock and Watson (2005). One of our main empirical findings is that our “smoothed” approaches to factor proxy selection appear to yield predictions that are often superior not only to a benchmark factor model, but also to simple linear time series models which are generally difficult to beat in forecasting competitions. In some sense, by using our approach to predictive factor proxy selection, one is able to open up the “black box” often associated with factor analysis, and to identify actual variables that can serve as primitive building blocks for (prediction) models of a host of macroeconomic variables, and that can also serve as policy instruments, for example. Our findings suggest that important observable variables include various S&P500 variables, including stock price indices and dividend series; a 1-year Treasury bond rate; various housing activity variables; industrial production; and exchange rates. |
Keywords: | Macroeconomics |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedpwp:08-25&r=ecm |
By: | Aurélien Hazan (IBISC - Informatique, Biologie Intégrative et Systèmes Complexes - CNRS : FRE2873 - Université d'Evry-Val d'Essonne); Vincent Vigneron (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, SAMOS - Statistique Appliquée et MOdélisation Stochastique - Université Panthéon-Sorbonne - Paris I) |
Abstract: | The various scales of a signal maintain relations of dependence with one another. These relations can vary in time and reveal speed changes in the studied phenomenon. To detect such changes, one first computes the wavelet transform of the signal at various scales. One then studies the statistical dependences between these transforms by means of an estimator of mutual information. We propose to summarize the resulting network of dependences in a dependence graph, obtained by thresholding the values of the mutual information or by quantizing them. The method can be applied to several types of signals, such as fluctuations of market indexes, for instance the S&P 500, or high-frequency foreign exchange (FX) rates. |
Keywords: | wavelet, dependence, mutual information, financial time series, FX |
Date: | 2008–06–05 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:hal-00287463_v1&r=ecm |
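The mutual-information estimator at the heart of such a dependence graph can be sketched with a simple histogram (plug-in) estimate. The bin count and the synthetic dependent/independent pairs below are illustrative; the paper applies the estimator to wavelet coefficients at different scales.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram (plug-in) estimate of mutual information, in nats.

    I(X;Y) = sum_{x,y} p(x,y) * log[ p(x,y) / (p(x) p(y)) ], with all
    probabilities replaced by empirical cell frequencies.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

rng = np.random.default_rng(7)
a = rng.standard_normal(5000)
b = rng.standard_normal(5000)
mi_dep = mutual_information(a, a + 0.2 * b)   # strongly dependent pair
mi_ind = mutual_information(a, b)             # independent pair
```

Thresholding such MI values across scale pairs, as the abstract describes, yields the edges of the dependence graph; note the plug-in estimator has a small positive bias on independent data.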
By: | Edward S. Knotek II; Stephen Terry |
Abstract: | Discrete Markov chains can be useful to approximate vector autoregressive processes for economists doing computational work. One such approximation method first presented by Tauchen (1986) operates under the general theoretical assumption of a transformed VAR with diagonal covariance structure for the process error term. We demonstrate one simple method of more conveniently treating this approximation problem in practice using readily available multivariate-normal integration techniques to allow for arbitrary positive-semidefinite covariance structures. Examples are provided using processes with non-diagonal and singular non-diagonal error covariances. |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedkrw:rwp08-02&r=ecm |
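The scalar case of the Tauchen (1986) method the paper builds on is compact enough to sketch: discretise an AR(1) onto a grid and fill the transition matrix with normal-CDF differences. The grid size and parameters below are illustrative; the paper's contribution is the extension to VARs with arbitrary positive-semidefinite error covariances.

```python
import numpy as np

def tauchen(n, rho, sigma, m=3.0):
    """Tauchen (1986) discretisation of the AR(1) y' = rho*y + eps.

    Returns n grid points spanning +/- m unconditional standard
    deviations and an n x n transition matrix whose entries are
    normal-CDF differences over the grid cells.
    """
    from math import erf, sqrt

    def norm_cdf(z):
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))

    std_y = sigma / np.sqrt(1.0 - rho ** 2)   # unconditional std dev
    grid = np.linspace(-m * std_y, m * std_y, n)
    step = grid[1] - grid[0]
    P = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            lo = (grid[j] - rho * grid[i] - step / 2) / sigma
            hi = (grid[j] - rho * grid[i] + step / 2) / sigma
            if j == 0:
                P[i, j] = norm_cdf(hi)          # open lower cell
            elif j == n - 1:
                P[i, j] = 1.0 - norm_cdf(lo)    # open upper cell
            else:
                P[i, j] = norm_cdf(hi) - norm_cdf(lo)
    return grid, P

grid, P = tauchen(7, 0.9, 0.1)
```

The multivariate extension discussed in the paper replaces the scalar normal CDF differences with multivariate-normal rectangle probabilities.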
By: | Simon Luechinger; Alois Stutzer; Rainer Winkelmann |
Abstract: | We discuss a new approach to specifying and estimating ordered probit models with endogenous switching, or with a binary endogenous regressor, based on copula functions. These models provide a framework of analysis for self-selection in economic well-being equations, where assignment of regressors may be choice-based, resulting from well-being maximization, rather than random. In an application to public and private sector job satisfaction, using data on male workers from the German Socio-Economic Panel, we find that a model based on Frank's copula is preferred over two alternative models with the independence and normal copulas, respectively. The results suggest that public sector workers are negatively selected. |
Keywords: | Ordered probit, switching regression, Frank copula, job satisfaction, German Socio-Economic Panel |
JEL: | I31 C23 |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:diw:diwsop:diw_sp135&r=ecm |
By: | Laurent Ferrara (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, DGEI-DAMEP - Banque de France); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris) |
Abstract: | Business surveys are an important element in the analysis of the short-term economic situation because of the timeliness and nature of the information they convey. Especially, surveys are often involved in econometric models in order to provide an early assessment of the current state of the economy, which is of great interest for policy-makers. In this paper, we focus on non-seasonally adjusted business surveys released by the European Commission. We introduce an innovative way for modelling those series taking the persistence of the seasonal roots into account through seasonal-cyclical long memory models. We empirically prove that such models produce more accurate forecasts than classical seasonal linear models. |
Keywords: | Euro area, nowcasting, business surveys, seasonal, long memory. |
Date: | 2008–05 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00277379_v1&r=ecm |
By: | Laurens Cherchye; Bram De Rock; Frederic Vermeulen |
Abstract: | We provide nonparametric 'revealed preference' tests of general collective consumption models that account for public consumption and externalities within the household. We further propose a novel approach to model special cases of the general collective model, which imply alternative assumptions regarding the sharing rule that underlies the observed collective consumption behavior. Our application uses data from the Russia Longitudinal Monitoring Survey (RLMS); the panel structure of this data set allows nonparametric testing of the behavioral models without relying on preference homogeneity assumptions across similar individuals. Our application includes test results but also a power analysis for the models under study. Our main finding is that the most general collective model, together with a large class of special cases of the general model, cannot be rejected for the couples in our data set. By contrast, our tests do reject the standard unitary model for these couples. Since our tests are entirely nonparametric, this provides strong evidence in favor of models focusing on intrahousehold decision-making. |
Keywords: | collective household models, intrahousehold allocation, revealed preferences, nonparametric analysis |
JEL: | D11 D12 C14 |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:eca:wpaper:2008_030&r=ecm |
By: | Alexander Staus (Institute for Agricultural Policy and Agricultural Markets, University of Hohenheim) |
Abstract: | Modeling consumer choice in different areas has led to an increased use of discrete choice models. Probit or Multinomial Logit models are often the basis of further empirical research on consumer choice. In some of these models the equations to solve have no closed-form expression: they include multi-dimensional integrals which cannot be solved analytically. Simulation methods have been developed to approximate a solution for these integrals. This paper describes the Standard Halton sequence and a modification of it, the Shuffled Halton sequence. Both are simulation methods which can reduce computational effort compared to a random sequence. We compare the simulation methods in their coverage of the multi-dimensional area and in their estimation results using data on consumer choice of grocery store formats. |
Keywords: | simulation, mixed logit, halton sequence |
JEL: | C15 C25 |
Date: | 2008–09 |
URL: | http://d.repec.org/n?u=RePEc:hoh:hoh420:17&r=ecm |
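The Standard Halton sequence the paper describes is a short deterministic algorithm: reverse the digits of the integer index in a given base and read them as a fraction. Multi-dimensional draws pair sequences with distinct prime bases; the shuffled variant additionally permutes digits, which is not shown here.

```python
def halton(n, base):
    """First n points of the Halton (van der Corput) sequence in a base.

    Point i is obtained by writing i in the given base and mirroring
    its digits across the radix point, which spreads successive points
    evenly over (0, 1).
    """
    seq = []
    for i in range(1, n + 1):
        f, r, k = 1.0, 0.0, i
        while k > 0:
            f /= base
            r += f * (k % base)
            k //= base
        seq.append(r)
    return seq

# a 2-D quasi-random design: pair the base-2 and base-3 sequences
points_2d = list(zip(halton(100, 2), halton(100, 3)))
```

In a mixed logit, these points replace pseudo-random draws when simulating the choice-probability integrals, typically cutting the number of draws needed for a given accuracy.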
By: | Josep Maria Puigvert Gutiérrez (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Josep Fortiana Gregori (Universitat de Barcelona, Departament de Probabilitat, Lògica i Estadística, Gran Via de les Corts Catalanes, 585, 08007 Barcelona, Spain.) |
Abstract: | In this study we combine clustering techniques with a moving window algorithm in order to filter financial market data outliers. We apply the algorithm to a set of financial market data which consists of 25 series selected from a larger dataset using a cluster analysis technique taking into account the daily behaviour of the market; each of these series is an element of a cluster that represents a different segment of the market. We set up a framework of possible algorithm parameter combinations that detect most of the outliers by market segment. In addition, the algorithm parameters that have been found can also be used to detect outliers in other series with similar economic behaviour in the same cluster. Moreover, the crosschecking of the behaviour of different series within each cluster reduces the possibility of observations being misclassified as outliers. JEL Classification: C19, C49, G19. |
Keywords: | Outliers, financial market, cluster analysis, moving filtering window algorithm. |
Date: | 2008–10 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080948&r=ecm |
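A generic moving-filtering-window outlier detector of the kind the paper parameterises can be sketched with a median/MAD rule: flag a point when it sits too many robust standard deviations from the median of its centred window. The window length, threshold, and planted outlier below are illustrative; the paper tunes such parameters per market-segment cluster.

```python
import numpy as np

def window_outliers(x, window=21, k=5.0):
    """Flag outliers with a moving-window median/MAD rule.

    A point is flagged when |x_t - median(window)| exceeds k robust
    standard deviations, where the MAD is rescaled by 1.4826 to match
    the standard deviation under normality.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    half = window // 2
    flags = np.zeros(n, dtype=bool)
    for t in range(n):
        lo, hi = max(0, t - half), min(n, t + half + 1)
        w = x[lo:hi]
        med = np.median(w)
        mad = np.median(np.abs(w - med))
        scale = 1.4826 * mad
        if scale > 0 and abs(x[t] - med) > k * scale:
            flags[t] = True
    return flags

rng = np.random.default_rng(4)
series = rng.standard_normal(300)
series[100] += 12.0                       # one planted outlier
flags = window_outliers(series)
```

Using the median and MAD rather than the mean and standard deviation keeps the outlier itself from masking its own detection, which is the point of robust window statistics.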
By: | Santarossa, Gino |
Abstract: | This note briefly describes the regression discontinuity method, which estimates a program's impact when participation is a function of a selection variable crossing a predetermined value (threshold). The evaluation of program impact may be done in two steps. First, one graphically analyses the relation between the dependent variable and the selection factor in order to get some information about the functional form and the nature of the discontinuity at the threshold. Next, an econometric analysis is formally carried out in order to estimate the program's impact. |
Keywords: | regression discontinuity, program evaluation, impact, econometrics |
JEL: | C01 |
Date: | 2008–10–10 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:11268&r=ecm |
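The second, econometric step the note describes is commonly a local-linear fit on each side of the threshold, with the treatment effect read off as the jump between the two intercepts. The sketch below uses synthetic data with a known jump; bandwidth and functional form are illustrative choices.

```python
import numpy as np

def rd_effect(x, y, threshold, bandwidth):
    """Sharp regression-discontinuity estimate of a treatment effect.

    Fits separate linear regressions of y on (x - threshold) within a
    bandwidth on each side of the cutoff; the effect is the difference
    between the two fitted values at the threshold.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc = x - threshold

    def side_intercept(mask):
        X = np.column_stack([np.ones(mask.sum()), xc[mask]])
        coef, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        return coef[0]                    # fitted value at the threshold

    left = (xc >= -bandwidth) & (xc < 0)
    right = (xc >= 0) & (xc <= bandwidth)
    return side_intercept(right) - side_intercept(left)

# synthetic data: linear trend plus a jump of 2.0 at the cutoff x = 5
rng = np.random.default_rng(5)
x = rng.uniform(0, 10, 2000)
y = 0.5 * x + 2.0 * (x >= 5) + 0.3 * rng.standard_normal(2000)
est = rd_effect(x, y, threshold=5.0, bandwidth=1.5)
```

The graphical first step of the note corresponds to plotting binned means of y against x around the cutoff before committing to this linear specification.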
By: | Essahbi Essaadi (GATE - Groupe d'analyse et de théorie économique - CNRS : UMR5824 - Université Lumière - Lyon II - Ecole Normale Supérieure Lettres et Sciences Humaines); Mohamed Boutahar (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales - CNRS : UMR6579) |
Abstract: | In this paper, we test the instability of comovement, in the time and frequency domains, for the GDP growth rates of the US and the UK. We use the frequency approach, which is based on evolutionary spectral analysis (Priestley, 1965-1996). A graphical analysis of the Time-Varying Coherence Function (TVCF) reveals variability in the correlation between the two series. Our goal is first to estimate the TVCF of the two series, and then to test stability of both the cross-spectral density and the TVCF by detecting various breakpoints in each function. |
Keywords: | comovement ; spectral analysis ; time-varying coherence function ; structural change |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:halshs-00333582_v1&r=ecm |
By: | Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Cyril Caillault (FORTIS Investments - Fortis investments) |
Abstract: | Using non-parametric (copula) and parametric models, we show that the bivariate distribution of an Asian portfolio is not stable over the whole period under study. We suggest several dynamic models to compute two market risk measures, the Value at Risk and the Expected Shortfall: the RiskMetrics methodology, multivariate GARCH models, multivariate Markov-switching models, the empirical histogram and dynamic copulas. We discuss the choice of the best method with respect to the policy management of bank supervisors. The copula approach seems to be a good compromise between all these models. It permits taking financial crises into account and obtaining a low capital requirement during the most important crises. |
Keywords: | Value at Risk - Expected Shortfall - Copula - RiskMetrics - Risk management -GARCH models - Switching models. |
Date: | 2008–03–06 |
URL: | http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00185374_v1&r=ecm |
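The empirical-histogram benchmark among the methods the paper compares reduces to quantiles of the loss distribution: VaR is the alpha-quantile of losses and ES the mean loss beyond it. The confidence level and simulated return series below are illustrative; the paper's dynamic copula approach models the joint return distribution instead of the portfolio losses directly.

```python
import numpy as np

def var_es(returns, alpha=0.99):
    """Empirical (historical) Value at Risk and Expected Shortfall.

    Losses are negated returns; VaR is their alpha-quantile and ES the
    average of losses at or beyond the VaR.
    """
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, alpha)
    tail = losses[losses >= var]
    return float(var), float(tail.mean())

# simulated daily returns with 1% volatility, for illustration only
rng = np.random.default_rng(6)
r = 0.01 * rng.standard_normal(5000)
var99, es99 = var_es(r, alpha=0.99)
```

By construction ES is at least as large as VaR, which is one reason supervisors treat it as the more conservative capital-requirement measure.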