NEP: New Economics Papers on Econometrics |
By: | Dikaios Tserkezos (Department of Economics, University of Crete, Greece); Konstantinos Tsagarakis (Department of Environmental Engineering, Democritus University of Thrace) |
Abstract: | This short paper demonstrates the effects of missing data on the power of the well-known Hausman (1978) test for simultaneity in structural econometric models. The test is reliable and widely used for testing simultaneity in linear and nonlinear structural models. Using Monte Carlo techniques, we find that the presence of missing data can seriously affect the power of the test: as the number of missing observations grows, the probability of rejecting simultaneity with the Hausman test increases significantly, especially in small samples. A Full Information Maximum Likelihood missing-data correction technique is used to overcome the problem, and we find that the test is more effective once these data are retrieved and included in the sample. (An illustrative sketch follows this entry.) |
Keywords: | Hausman (1978) simultaneity test, structural econometric models, FIML, missing data, simulation |
JEL: | C01 C12 C15 |
Date: | 2008–06–03 |
URL: | http://d.repec.org/n?u=RePEc:crt:wpaper:0821&r=ecm |
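The power loss described above is easy to reproduce. Below is a minimal Python sketch, assuming a textbook single-equation setup with one endogenous regressor, one instrument, listwise deletion of missing rows, and a Durbin-Wu-Hausman contrast between OLS and IV; the data-generating process, the sample size and the omission of the FIML correction step are all illustrative simplifications, not the authors' design.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def hausman_power(n, miss_frac, reps=2000, alpha=0.05):
    """Share of replications in which the Hausman contrast detects simultaneity."""
    rejections = 0
    for _ in range(reps):
        z = rng.normal(size=n)                        # instrument
        u = rng.normal(size=n)                        # structural error
        x = 0.8 * z + 0.5 * u + rng.normal(size=n)    # endogenous regressor
        y = x + u
        keep = rng.random(n) > miss_frac              # listwise deletion of missing rows
        x, y, z = x[keep], y[keep], z[keep]
        b_ols = (x @ y) / (x @ x)                     # OLS slope
        b_iv = (z @ y) / (z @ x)                      # IV (2SLS) slope
        s2 = np.mean((y - b_iv * x) ** 2)             # error variance estimate
        v_ols = s2 / (x @ x)
        v_iv = s2 * (z @ z) / (z @ x) ** 2
        h = (b_iv - b_ols) ** 2 / max(v_iv - v_ols, 1e-12)
        rejections += h > stats.chi2.ppf(1 - alpha, df=1)
    return rejections / reps

for frac in (0.0, 0.2, 0.4):                          # growing share of missing rows
    print(f"missing {frac:.0%}: power ~ {hausman_power(50, frac):.3f}")
```

With the regressor endogenous by construction, the reported rejection rate is the power of the test, and it falls as the share of deleted rows grows.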
By: | Alicia Pérez Alonso (Universidad de Alicante); Juan Mora (Universidad de Alicante) |
Abstract: | We discuss how to test whether the distribution of regression errors belongs to a parametric family of continuous distribution functions, making no parametric assumption about the conditional mean or the conditional variance in the regression model. We propose test statistics based on a martingale transform of the estimated empirical process. We prove that these statistics are asymptotically distribution-free, and two Monte Carlo experiments show that they work reasonably well in practice. (An illustrative sketch follows this entry.) |
Keywords: | Specification Tests; Nonparametric Regression; Empirical Processes. |
Date: | 2008–06 |
URL: | http://d.repec.org/n?u=RePEc:ivi:wpasad:2008-11&r=ecm |
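The sketch below illustrates the testing problem, not the authors' solution: it computes a Kolmogorov-Smirnov statistic on standardised residuals from a Nadaraya-Watson regression and calibrates it by parametric bootstrap, precisely because the untransformed statistic is not distribution-free once the mean is estimated; the martingale transform that removes this estimation effect is the paper's contribution and is omitted here. All tuning constants are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def nw_fit(x, y, h):
    """Nadaraya-Watson estimate of E[y|x] at the sample points."""
    w = stats.norm.pdf((x[:, None] - x[None, :]) / h)
    return (w @ y) / w.sum(axis=1)

# H0: the regression errors are Gaussian (true here by construction)
n = 200
x = rng.uniform(-2, 2, n)
y = np.sin(x) + 0.5 * rng.normal(size=n)

h = 1.06 * x.std() * n ** (-1 / 5)             # rule-of-thumb bandwidth
resid = y - nw_fit(x, y, h)
z = (resid - resid.mean()) / resid.std()       # standardised residuals
ks_obs = stats.kstest(z, "norm").statistic     # raw empirical-process statistic

# the raw statistic is NOT distribution-free under estimated nuisance
# parameters, so calibrate it by parametric bootstrap under the null
boot = []
for _ in range(500):
    yb = nw_fit(x, y, h) + resid.std() * rng.normal(size=n)
    rb = yb - nw_fit(x, yb, h)
    boot.append(stats.kstest((rb - rb.mean()) / rb.std(), "norm").statistic)

print(f"KS = {ks_obs:.3f}, bootstrap 95% critical value = {np.quantile(boot, 0.95):.3f}")
```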
By: | Hall, Alastair R.; Han, Sanggohn; Boldea, Otilia |
Abstract: | In this paper, we present a limiting distribution theory for the break point estimator in a linear regression model estimated via Two Stage Least Squares, under two different scenarios regarding the magnitude of the parameter change between regimes. First, we consider the case where the parameter change is of fixed magnitude; here the resulting distribution depends on the distribution of the data and is of little practical use for inference. Second, we consider the case where the magnitude of the parameter change shrinks with the sample size; here the resulting distribution can be used to construct approximate large-sample confidence intervals for the break point. The finite-sample performance of these intervals is analyzed in a small simulation study, and the intervals are illustrated via an application to the New Keynesian Phillips curve. (An illustrative sketch follows this entry.) |
JEL: | C13 C32 C12 |
Date: | 2008–07–04 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:9472&r=ecm |
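A hedged sketch of the break point estimator itself (not the confidence intervals): the candidate date minimising the pooled 2SLS residual sum of squares over a trimmed grid. The scalar-regressor design, the break magnitude and the 15% trimming fraction are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

n, k0 = 400, 250                                # sample size, true break date
z = rng.normal(size=n)                          # instrument
u = rng.normal(size=n)
x = z + 0.5 * u + rng.normal(size=n)            # endogenous regressor
beta = np.where(np.arange(n) < k0, 1.0, 1.6)    # slope shifts at k0
y = beta * x + u

def tsls_ssr(xs, ys, zs):
    """2SLS residual sum of squares (one regressor, one instrument)."""
    xhat = zs * (zs @ xs) / (zs @ zs)           # first-stage fitted values
    b = (xhat @ ys) / (xhat @ xs)
    return np.sum((ys - b * xs) ** 2)

# break point estimator: date minimising the pooled 2SLS SSR on a trimmed grid
trim = int(0.15 * n)
ssr = [tsls_ssr(x[:k], y[:k], z[:k]) + tsls_ssr(x[k:], y[k:], z[k:])
       for k in range(trim, n - trim)]
k_hat = trim + int(np.argmin(ssr))
print(f"true break: {k0}, estimated break: {k_hat}")
```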
By: | Franco Peracchi (Faculty of Economics, University of Rome "Tor Vergata"); Andrei V. Tanase (Faculty of Economics, University of Rome "Tor Vergata") |
Abstract: | Unlike the value at risk, the expected shortfall is a coherent measure of risk. In this paper, we discuss estimation of the expected shortfall of a random variable Yt, with special reference to the case where auxiliary information is available in the form of a set of predictors Xt. We consider three classes of estimators of the conditional expected shortfall of Yt given Xt: a class of fully non-parametric estimators and two classes of analog estimators based, respectively, on the empirical conditional quantile function and the empirical conditional distribution function. We study their sampling properties by means of a set of Monte Carlo experiments and analyze their performance in an empirical application to financial data. (An illustrative sketch follows this entry.) |
Keywords: | risk measures, quantile regression, logistic regression |
JEL: | C13 E44 G11 |
Date: | 2008–07–14 |
URL: | http://d.repec.org/n?u=RePEc:rtv:ceisrp:122&r=ecm |
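A rough sketch of one analog estimator, assuming a linear quantile-regression model: the conditional expected shortfall ES_alpha(Y|X=x) = (1/alpha) * integral of Q_{Y|X}(u|x) over u in (0, alpha] is approximated by averaging quantile-regression fits over a grid. The data-generating process and the grid are illustrative; the fully non-parametric estimators are not shown.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# heteroskedastic location-scale data: Y = X + (1 + 0.5|X|) * t5 noise
n = 1000
x = rng.normal(size=n)
y = x + (1 + 0.5 * np.abs(x)) * rng.standard_t(df=5, size=n)
X = sm.add_constant(x)

alpha = 0.05
x0 = np.array([1.0, 1.0])                       # evaluate at x = 1 (with constant)

# crude Riemann approximation of (1/alpha) * int_0^alpha Q_{Y|X}(u|x) du
grid = np.linspace(0.01, alpha, 5)
q_fits = [sm.QuantReg(y, X).fit(q=u).params @ x0 for u in grid]
print(f"estimated 5% conditional expected shortfall at x=1: {np.mean(q_fits):.3f}")
```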
By: | Gianluca Cubadda (Dipartimento SEFEMEQ - Università di Roma "Tor Vergata"); Alain Hecq (University of Maastricht); Franz C. Palm (University of Maastricht) |
Abstract: | For non-stationary vector autoregressive models (VAR hereafter, or VARMA when a moving average part is present), we show that the presence of common cyclical features or cointegration leads to a reduction of the order of the implied univariate autoregressive integrated moving average (ARIMA hereafter) models. This finding can explain why parsimonious univariate ARIMA models are identified in applied research, although VAR models of the order and dimension typical in macroeconometrics imply non-parsimonious univariate ARIMA representations. Next, we develop a strategy for studying interactions between variables prior to possibly modelling them in a multivariate setting. Indeed, the similarity of the autoregressive roots is informative about the presence of co-movements in a set of multiple time series. Our results justify both the use of a panel setup with a homogeneous autoregression, heterogeneous cross-correlated vector moving average errors and a factor structure, and the use of cross-sectional aggregates of ARIMA series to estimate the homogeneous autoregression. (A worked equation follows this entry.) |
Keywords: | Interactions, multiple time series, co-movements, ARIMA, cointegration, common cycles, dynamic panel data. |
JEL: | C32 |
Date: | 2008–07–14 |
URL: | http://d.repec.org/n?u=RePEc:rtv:ceisrp:125&r=ecm |
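The order-reduction argument rests on the classical final-equation algebra for a VAR (a standard result, stated here for orientation rather than as a reproduction of the authors' derivation). For an $n$-dimensional VAR($p$), $A(L)\,y_t = \varepsilon_t$, premultiplying by the adjoint of $A(L)$ gives

$$\det\!\big(A(L)\big)\, y_{it} \;=\; \big[\operatorname{adj} A(L)\big]_{i\cdot}\,\varepsilon_t, \qquad i = 1,\dots,n,$$

so each component follows at most an ARMA$(np,\,(n-1)p)$ process; cointegration and common cyclical features introduce common factors that cancel on both sides and reduce these orders, which is the source of the parsimony the abstract refers to.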
By: | Ole E. Barndorff-Nielsen; Peter Reinhard Hansen; Asger Lunde; Neil Shephard |
Abstract: | We propose a multivariate realised kernel to estimate the ex-post covariation of log-prices. We show that this new consistent estimator is guaranteed to be positive semi-definite, is robust to measurement noise of certain types, and can handle non-synchronous trading; it is the first estimator with all three of these properties, each of which is essential for empirical work in this area. We derive the large-sample asymptotics of the estimator and assess its accuracy in a Monte Carlo study. We implement the estimator on some US equity data, comparing our results to previous work based on returns measured over 5- or 10-minute intervals, and show that the new estimator is substantially more precise. (An illustrative sketch follows this entry.) |
Keywords: | HAC estimator; Long-run variance estimator; Market frictions; Quadratic variation; Realised variance |
JEL: | C01 C14 C32 |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:sbs:wpsefe:2008fe29&r=ecm |
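A stripped-down sketch of the kernel computation, assuming already-synchronised returns: K = Gamma_0 + sum over h of k(h/(H+1)) * (Gamma_h + Gamma_h') with Parzen weights. Refresh-time synchronisation, end-point treatment and the paper's bandwidth rule are omitted; the bandwidth H below is ad hoc.

```python
import numpy as np

def parzen(u):
    """Parzen weight function."""
    u = abs(u)
    if u <= 0.5:
        return 1 - 6 * u**2 + 6 * u**3
    if u <= 1.0:
        return 2 * (1 - u) ** 3
    return 0.0

def realised_kernel(r, H):
    """K = Gamma_0 + sum_{h=1..H} k(h/(H+1)) * (Gamma_h + Gamma_h'),
    where r is an (n, d) array of synchronised high-frequency returns."""
    K = r.T @ r                                 # Gamma_0
    for h in range(1, H + 1):
        gam = r[h:].T @ r[:-h]                  # h-th autocovariance matrix
        K += parzen(h / (H + 1)) * (gam + gam.T)
    return K

# toy example: two correlated efficient prices observed with microstructure noise
rng = np.random.default_rng(4)
eff = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 1.0]], 2000) * 0.01
prices = np.cumsum(eff, axis=0) + 0.02 * rng.normal(size=(2000, 2))
print(realised_kernel(np.diff(prices, axis=0), H=30))
```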
By: | Martin, Will; Pham, Cong S. |
Abstract: | In this paper we estimate the gravity model allowing for the pervasive issues of heteroscedasticity and zero bilateral trade flows identified in an influential paper by Santos Silva and Tenreyro. We use Monte Carlo simulations with data generated by a heteroscedastic, limited-dependent-variable process to investigate the extent to which different estimators can deal with the resulting parameter biases. While the Poisson Pseudo-Maximum Likelihood estimator recommended by Santos Silva and Tenreyro solves the heteroscedasticity-bias problem when this is the only problem, it appears to yield severely biased estimates when zero trade values are frequent. Standard threshold-Tobit estimators perform better as long as the heteroscedasticity problem is satisfactorily dealt with. The Heckman Maximum Likelihood estimators appear to perform well if true identifying restrictions are available. (An illustrative sketch follows this entry.) |
Keywords: | International trade; Gravity equation; Limited dependent variable regression; Poisson pseudo-maximum likelihood; Zero trade flows |
JEL: | C13 P45 C01 |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:9453&r=ecm |
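A minimal sketch of the PPML-versus-log-OLS comparison on a made-up heteroskedastic, zero-inflated data-generating process; this is not the authors' simulation design, and the threshold-Tobit and Heckman alternatives are not shown.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)

# toy gravity data: heteroskedastic, zero-inflated bilateral trade flows
n = 2000
log_gdp = rng.normal(size=n)
log_dist = rng.normal(size=n)
mu = np.exp(0.5 + 1.0 * log_gdp - 1.0 * log_dist)   # true elasticities: +1, -1
trade = rng.poisson(mu * rng.exponential(size=n))   # overdispersed, many zeros

X = sm.add_constant(np.column_stack([log_gdp, log_dist]))

# PPML: Poisson pseudo-maximum likelihood on levels keeps the zero flows
ppml = sm.GLM(trade, X, family=sm.families.Poisson()).fit()
print("PPML:   ", np.round(ppml.params, 3))

# log-linear OLS must drop zeros and is biased under heteroskedasticity
pos = trade > 0
ols = sm.OLS(np.log(trade[pos]), X[pos]).fit()
print("log-OLS:", np.round(ols.params, 3))
```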
By: | Karl Schlag |
Abstract: | Small sample properties are of fundamental interest when only limited data is available. Exact inference is limited by constraints imposed by specific nonrandomized tests and, of course, also by lack of more data. These effects can be separated, as we propose to evaluate a test by comparing its type II error to the minimal type II error among all tests for the given sample. Game theory is used to establish this minimal type II error; the associated randomized test is characterized as part of a Nash equilibrium of a fictitious game against nature. We use this method to investigate sequential tests for the difference between two means when outcomes are constrained to belong to a given bounded set. Tests of inequality and of noninferiority are included. We find that inference in terms of type II error based on a balanced sample cannot be improved by sequential sampling, or even by observing counterfactual evidence, provided there is a reasonable gap between the hypotheses. |
Keywords: | Exact, distribution-free, nonparametric, independent samples, matched pairs, Z test, unavoidable type II error, noninferiority |
JEL: | C12 C14 |
Date: | 2008–06 |
URL: | http://d.repec.org/n?u=RePEc:upf:upfgen:1099&r=ecm |
By: | Felix Chan (School of Economics and Finance, Curtin University of Technology); Tommaso Mancini-Griffoli (Swiss National Bank, Paris School of Economics (PSE), CEPREMAP); Laurent L. Pauwels (Hong Kong Monetary Authority, Graduate Institute of International Studies, Geneva) |
Abstract: | This paper proposes a new test for structural stability in panels by extending the testing procedure proposed in the seminal work of Andrews (2003), originally developed for time series. The test is robust to non-normal, heteroskedastic and serially correlated errors and, importantly, allows the number of post-break observations to be small. Moreover, the test accommodates the possibility of a break affecting only some, and not all, individuals of the panel. Under mild assumptions the test statistic is shown to be asymptotically normal, thanks to the cross-sectional dimension of panel data. This greatly facilitates the calculation of critical values relative to the test's time-series counterpart. Monte Carlo experiments show that the test has good size and power under a wide range of circumstances. Finally, the test is illustrated in practice, in a brief study of the euro's effect on trade. (An illustrative sketch follows this entry.) |
Keywords: | Structural Change, Instability, Cross Sectionally Dependent Errors, Heterogeneous Panels, Monte Carlo, Euro Effect on Trade |
JEL: | C23 C52 |
Date: | 2008–09 |
URL: | http://d.repec.org/n?u=RePEc:hkm:wpaper:092008&r=ecm |
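A stylized sketch of the cross-sectional aggregation idea only, not the authors' statistic: each individual contributes a post-break prediction-error statistic built from pre-break estimates, and the centred, scaled cross-sectional average is compared with a normal critical value. The per-individual statistic, the break design and the centring constant are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

N, T, m = 100, 60, 5        # individuals, pre-break sample, post-break sample
break_share = 0.3           # only some individuals experience the break

s = np.empty(N)
for i in range(N):
    x = rng.normal(size=T + m)
    slope = np.r_[np.ones(T), (1.8 if i < break_share * N else 1.0) * np.ones(m)]
    y = slope * x + rng.normal(size=T + m)
    b = (x[:T] @ y[:T]) / (x[:T] @ x[:T])         # pre-break OLS slope
    s2 = np.mean((y[:T] - b * x[:T]) ** 2)        # pre-break error variance
    s[i] = np.sum((y[T:] - b * x[T:]) ** 2) / s2  # post-break prediction errors

# centred and scaled cross-sectional average is approximately N(0,1) under H0
stat = np.sqrt(N) * (s.mean() - m) / s.std(ddof=1)
print(f"statistic = {stat:.2f}, one-sided 5% normal cv = {stats.norm.ppf(0.95):.2f}")
```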
By: | Mishra, SK |
Abstract: | This paper elaborates on the deleterious effects of outliers and data corruption on the estimation of linear regression coefficients by Ordinary Least Squares. To ameliorate the estimation procedure, we introduce robust regression estimators based on Campbell's robust covariance estimation method. We investigate two possibilities: first, when the weights are obtained strictly as suggested by Campbell, and second, when weights are assigned using Hampel's median absolute deviation measure of dispersion. Both types of weights are obtained iteratively, yielding two different weighted least squares procedures. These procedures are applied to detect outliers in, and estimate regression coefficients from, some widely used datasets such as the stackloss, water salinity, Hawkins-Bradu-Kass, Hertzsprung-Russell star and pilot-plant datasets. We observe that Campbell-II in particular detects outlying data points quite well (although it occasionally signals false positives for very mild outliers). Subsequently, Monte Carlo experiments are carried out to assess the properties of these estimators; their findings indicate that for a larger number and size of outliers, the Campbell-II procedure outperforms Campbell-I. Unless the perturbations introduced to the dataset are numerous and very large in magnitude, the coefficients estimated by the Campbell-II method are also nearly unbiased. A Fortran program for the proposed method is appended. (An illustrative sketch follows this entry.) |
Keywords: | Robust regression; Campbell's robust covariance; outliers; stackloss; water salinity; Hawkins-Bradu-Kass; Hertzsprung-Russell Star; pilot-plant; datasets; Monte Carlo experiments; Fortran computer program |
JEL: | C13 C14 C63 C15 C01 |
Date: | 2008–07–04 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:9445&r=ecm |
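A sketch of the iterative reweighting mechanism under simplifying assumptions: the weights below follow a simple Huber-type rule built on Hampel's MAD scale for regression residuals, not Campbell's Mahalanobis-distance covariance weights, so this illustrates the idea rather than the paper's Campbell-I/II procedures.

```python
import numpy as np

def robust_wls(X, y, c=2.5, tol=1e-8, max_iter=50):
    """Iteratively reweighted least squares: residuals beyond c times the
    MAD-based scale receive weight c/d < 1 (Huber-type rule, MAD scale)."""
    w = np.ones(len(y))
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        r = y - X @ beta
        scale = 1.4826 * np.median(np.abs(r - np.median(r)))  # MAD scale
        d = np.abs(r) / max(scale, 1e-12)
        w_new = np.where(d <= c, 1.0, c / d)      # downweight gross residuals
        if np.max(np.abs(w_new - w)) < tol:
            break
        w = w_new
    return beta, w

# toy check: clean line plus four gross outliers (stackloss-style contamination)
rng = np.random.default_rng(7)
x = rng.uniform(0, 10, 40)
y = 2 + 3 * x + rng.normal(size=40)
y[:4] += 25
beta, w = robust_wls(np.column_stack([np.ones(40), x]), y)
print("robust fit:", np.round(beta, 3), "| flagged:", np.where(w < 0.5)[0])
```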
By: | Tommaso Proietti (Faculty of Economics, University of Rome "Tor Vergata"); Alessandra Luati (University of Bologna) |
Abstract: | The paper focuses on the adaptation of local polynomial filters at the end of the sample period. We show that for real-time estimation of signals (i.e. exactly at the boundary of the time support) we cannot rely on the automatic adaptation of local polynomial smoothers, since the direct real-time filter turns out to be strongly localised and thereby yields extremely volatile estimates. As an alternative, we evaluate a general family of asymmetric filters that minimises the mean square revision error subject to polynomial reproduction constraints; in the case of the Henderson filter it nests the well-known Musgrave surrogate filters. The class of filters depends on unknown features of the series, such as the slope and the curvature of the underlying signal, which can be estimated from the data. Several empirical examples illustrate the effectiveness of our proposal. We also discuss the merits of using a nearest-neighbour bandwidth, as opposed to a fixed bandwidth, for improving the quality of the approximation. (An illustrative sketch follows this entry.) |
Keywords: | Henderson filter; Trend estimation; Nearest neighbour bandwidth; Musgrave asymmetric filters |
Date: | 2008–07–14 |
URL: | http://d.repec.org/n?u=RePEc:rtv:ceisrp:112&r=ecm |
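A sketch of why the direct real-time filter is "strongly localised", using a generic local cubic filter rather than the Henderson kernel or the authors' revision-minimising asymmetric class: the fitted value at the target point is a linear combination of the observations, and at the boundary the weights pile up on the most recent points. The kernel and bandwidth are illustrative.

```python
import numpy as np

def local_poly_weights(offsets, degree=3, h=7):
    """Weights of the local polynomial (WLS) fit evaluated at offset 0;
    offsets lists the positions of available observations around the target."""
    t = np.asarray(offsets, dtype=float)
    X = np.vander(t, degree + 1, increasing=True)   # columns 1, t, t^2, t^3
    k = np.maximum(1 - (t / h) ** 2, 1e-6)          # Epanechnikov-type kernel
    XtK = X.T * k
    # the fitted value at t=0 is the first row of (X'KX)^{-1} X'K
    return np.linalg.solve(XtK @ X, XtK)[0]

sym = local_poly_weights(range(-6, 7))   # two-sided historical filter
rt = local_poly_weights(range(-6, 1))    # direct real-time (boundary) filter
print("symmetric:", np.round(sym, 3))
print("real-time:", np.round(rt, 3))     # weight mass piles up on recent points
```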
By: | Enzo Weber |
Abstract: | Information flows across international financial markets typically occur within hours, making volatility spillover appear contemporaneous in daily data. Such simultaneous transmission of variances is featured by the stochastic volatility model developed in this paper, in contrast to the usually employed multivariate ARCH processes. The identification problem is solved by considering heteroscedasticity of the structural volatility innovations, and estimation takes place in an appropriately specified state space setup. In the empirical application, unidirectional volatility spillovers from the US stock market to three American countries are revealed. The impact is strongest for Canada, followed by Mexico and Brazil, which are subject to idiosyncratic crisis effects. |
Keywords: | Stochastic Volatility, Identification, Variance Transmission |
JEL: | C32 G15 |
Date: | 2008–07 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2008-049&r=ecm |
By: | Schanne, Norbert (Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany]); Wapler, Rüdiger (Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany]); Weyh, Antje (Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany]) |
Abstract: | "We forecast unemployment for the 176 German labour-market districts on a monthly basis. Because of their small size, strong spatial interdependencies exist between these regional units. To account for these as well as for the heterogeneity in the regional development over time, we apply different versions of an univariate spatial GVAR model. When comparing the forecast precision with univariate time-series methods, we find that the spatial model does indeed perform better or at least as well. Hence, the GVAR model provides an alternative or complementary approach to commonly used methods in regional forecasting which do not consider regional interdependencies." (author's abstract, IAB-Doku) ((en)) |
JEL: | C31 C53 E24 O18 |
Date: | 2008–07–10 |
URL: | http://d.repec.org/n?u=RePEc:iab:iabdpa:200828&r=ecm |
By: | Nikolaus Hautsch; Vahidin Jeleskovic |
Abstract: | In this paper, we study the dynamic interdependencies between high-frequency volatility, liquidity demand as well as trading costs in an electronic limit order book market. Using data from the Australian Stock Exchange we model 1-min squared mid-quote returns, average trade sizes, number of trades and average (excess) trading costs per time interval in terms of a four-dimensional multiplicative error model. The latter is augmented to account also for zero observations. We find evidence for significant contemporaneous relationships and dynamic interdependencies between the individual variables. Liquidity is causal for future volatility but not vice versa. Furthermore, trade sizes are negatively driven by past trading intensities and trading costs. Finally, excess trading costs mainly depend on their own history. |
Keywords: | Multiplicative error models, volatility, liquidity, high-frequency data |
JEL: | C13 C32 C52 |
Date: | 2008–07 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2008-047&r=ecm |
By: | Lux, Thomas |
Abstract: | In this paper we consider daily financial data from various sources (stock market indices, foreign exchange rates and bonds) and analyze their multi-scaling properties by estimating the parameters of a Markov-switching multifractal model (MSM) with Lognormal volatility components. In order to see how well the estimated models capture the temporal dependency of the empirical data, we estimate and compare (generalized) Hurst exponents for both the empirical data and simulated MSM models. In general, the Lognormal MSM models generate 'apparent' long memory in good agreement with empirical scaling, provided one uses sufficiently many volatility components. In comparison with a Binomial MSM specification [7], results are almost identical. This suggests that a parsimonious discrete specification is flexible enough and that the gain from adopting the continuous Lognormal distribution is very limited. (An illustrative sketch follows this entry.) |
Keywords: | Markov-switching multifractal, scaling, return volatility |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:zbw:cauewp:7329&r=ecm |
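A sketch of the generalized Hurst exponent estimation step, applied to a crude two-state volatility toy series rather than a Lognormal MSM, and not the authors' estimator: H(q) is read off the log-log slope of the q-th absolute moment of increments, E|x_{t+tau} - x_t|^q ~ tau^(q*H(q)).

```python
import numpy as np

def generalized_hurst(x, q, taus=range(1, 20)):
    """Estimate H(q) from the scaling E|x_{t+tau} - x_t|^q ~ tau^(q*H(q))."""
    logm = [np.log(np.mean(np.abs(x[t:] - x[:-t]) ** q)) for t in taus]
    slope = np.polyfit(np.log(list(taus)), logm, 1)[0]   # slope = q * H(q)
    return slope / q

# toy series: random walk with crude two-state regime-switching volatility
rng = np.random.default_rng(8)
n = 10_000
regime = np.cumsum(rng.random(n) < 0.005) % 2            # infrequent switches
vol = np.where(regime == 0, 0.5, 2.0)
x = np.cumsum(vol * rng.normal(size=n))

for q in (1.0, 2.0, 3.0):                                # q-dependence of H(q)
    print(f"H({q:.0f}) ~ {generalized_hurst(x, q):.3f}")
```

Variation of the estimate across q is the 'apparent' multi-scaling the abstract refers to.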
By: | Antonio Falcó (Universidad CEU Cardenal Herrera); Juan Nave (Universidad de Castilla-La Mancha); Lluís Navarro (Universidad CEU Cardenal Herrera) |
Abstract: | In this paper we propose an alternative calibration algorithm that uses a consistent family of yield curves to fit a Gaussian Heath-Jarrow-Morton model jointly to the implied volatilities of caps and to zero-coupon bond prices. The algorithm is capable of finding several Pareto optimal points, as is expected for a general nonlinear multicriteria optimization problem. The calibration approach is evaluated in terms of in-sample data fitting as well as stability of parameter estimates. Furthermore, its efficiency is tested against a non-consistent traditional method using simulated and US market data. |
Keywords: | HJM models, consistent forward rate curves, multiobjective calibration |
JEL: | E43 C13 |
Date: | 2008–04 |
URL: | http://d.repec.org/n?u=RePEc:ivi:wpasad:2008-09&r=ecm |
By: | Marco Centoni (Dipartimento SEGES - Università del Molise); Gianluca Cubadda (Dipartimento SEFEMEQ - Università di Roma "Tor Vergata"); Alain Hecq (University of Maastricht) |
Abstract: | This paper proposes an econometric framework to assess the importance of common shocks and common transmission mechanisms in generating international business cycles. We then show how to decompose the cyclical effects of permanent-transitory shocks into those due to their domestic components and those due to foreign components. Our empirical analysis reveals that the business cycles of the US, Japan, and Canada are clearly dominated by their domestic components. The Euro area is more sensitive to foreign shocks than the other three countries in our analysis. |
Keywords: | International business cycles; Permanent-transitory decomposition; Serial correlation common features; Frequency domain analysis. |
JEL: | C32 E32 |
Date: | 2008–07–07 |
URL: | http://d.repec.org/n?u=RePEc:rtv:ceisrp:106&r=ecm |
By: | A. Talha Yalta; Olaf Jenal |
Date: | 2008–07 |
URL: | http://d.repec.org/n?u=RePEc:tob:wpaper:0804&r=ecm |