NEP: New Economics Papers on Econometrics
By: Nedeljkovic, Milan (Department of Economics, University of Warwick)
Abstract: This paper studies testing for the presence of smooth transition nonlinearity in the adjustment parameters of the vector error correction model. We specify a generalized model with multiple cointegrating vectors and different transition functions across equations. Given that the nonlinear model is highly complex, this paper proposes an optimal LM test based only on estimation of the linear model. The null asymptotic distribution is derived using empirical process theory, and since the transition parameters of the model cannot be identified under the null hypothesis, bootstrap procedures are used to approximate the limit. Monte Carlo simulations indicate good performance of the test.
Keywords: Nonlinearity, Cointegration, Empirical process theory, Bootstrap
JEL: C12 C32
Date: 2008
URL: http://d.repec.org/n?u=RePEc:wrk:warwec:876&r=ecm
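A note on the identification problem this abstract mentions: in smooth transition models the regime weight is typically a logistic function G(s; gamma, c) of a transition variable s, and under the null of linear adjustment the coefficients multiplying G are zero, so gamma and c drop out of the likelihood (the Davies problem) and the null limit must be bootstrapped. A minimal Python illustration of the transition function, with variable names that are ours rather than the paper's:

```python
import numpy as np

def logistic_transition(s, gamma, c):
    """Logistic transition function G(s; gamma, c), taking values in (0, 1).

    Under the null of linear adjustment the coefficients that multiply G
    are zero, so gamma (slope) and c (location) vanish from the model and
    are unidentified -- the reason the paper bootstraps the null limit."""
    return 1.0 / (1.0 + np.exp(-gamma * (np.asarray(s) - c)))
```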
By: Mutl, Jan (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria); Pfaffermayr, Michael (Department of Economics, University of Innsbruck, Innsbruck, Austria; Austrian Institute of Economic Research, Vienna, Austria; and CESIFO, Germany)
Abstract: This paper studies the spatial random effects and spatial fixed effects model. The model includes a Cliff and Ord type spatial lag of the dependent variable as well as a spatially lagged one-way error component structure, accounting for both heterogeneity and spatial correlation across units. We discuss instrumental variable estimation under both the fixed and the random effects specification and propose a spatial Hausman test which compares these two models, accounting for spatial autocorrelation in the disturbances. We derive the large sample properties of our estimation procedures and show that the test statistic is asymptotically chi-square distributed. A small Monte Carlo study demonstrates that this test works well even in small panels.
Keywords: Spatial econometrics, Panel data, Random effects estimator, Within estimator, Hausman test
JEL: C21 C23
Date: 2008–10
URL: http://d.repec.org/n?u=RePEc:ihs:ihsesp:229&r=ecm
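For orientation, the generic Hausman statistic underlying the proposed test is H = (b_FE - b_RE)' (V_FE - V_RE)^(-1) (b_FE - b_RE), asymptotically chi-square with dim(b) degrees of freedom under the random effects null. The sketch below computes only this generic form; the paper's contribution is the instrumental-variable estimators and covariance matrices that remain valid under spatial lags and spatially correlated disturbances, which the sketch takes as given inputs.

```python
import numpy as np
from scipy.stats import chi2

def hausman(b_fe, V_fe, b_re, V_re):
    """Generic Hausman statistic comparing two estimators of the same
    coefficient vector. Assumes V_fe - V_re is positive definite, which
    can fail in finite samples (a standard caveat for Hausman tests)."""
    d = np.asarray(b_fe) - np.asarray(b_re)
    H = float(d @ np.linalg.solve(np.asarray(V_fe) - np.asarray(V_re), d))
    return H, chi2.sf(H, df=d.size)   # statistic and asymptotic p-value
```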
By: Kuswanto, Heri; Sibbertsen, Philipp
Abstract: This paper discusses the existence of spurious long memory in common nonlinear time series models, namely Markov switching and threshold models. We describe the asymptotic behavior of the processes in terms of the autocovariance and autocorrelation functions and support the theoretical evidence with Monte Carlo simulations. Under certain conditions, the very nature of these nonlinear processes induces apparent long memory. In addition, the GPH estimator itself introduces bias.
Keywords: long memory, nonlinear time series, regime switching
JEL: C12 C22
Date: 2008–11
URL: http://d.repec.org/n?u=RePEc:han:dpaper:dp-410&r=ecm
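The GPH estimator referenced here is the log-periodogram regression of Geweke and Porter-Hudak (1983): regress the log periodogram at the first m Fourier frequencies on log(4 sin^2(lambda/2)); minus the slope estimates the memory parameter d. A minimal numpy sketch, with the common default bandwidth m = sqrt(n) as our assumption rather than the paper's choice, followed by a toy Markov-switching series illustrating the spurious long memory the abstract describes:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) estimate of the long-memory parameter d."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(np.sqrt(n))                   # common bandwidth choice n^0.5
    # periodogram at Fourier frequencies lambda_j = 2*pi*j/n, j = 1..m
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    fx = np.fft.fft(x - x.mean())
    I = (np.abs(fx[1:m + 1]) ** 2) / (2 * np.pi * n)
    # GPH regression: log I_j = c - d * log(4 sin^2(lam_j / 2)) + u_j
    Z = np.column_stack([np.ones(m), np.log(4 * np.sin(lam / 2) ** 2)])
    beta, *_ = np.linalg.lstsq(Z, np.log(I), rcond=None)
    return -beta[1]                           # estimated d

# a two-state Markov-switching mean with no true long memory
rng = np.random.default_rng(1)
states = np.cumsum(rng.random(4000) < 0.01) % 2   # rare regime switches
x = 0.5 * states + rng.standard_normal(4000)
print(gph_estimate(x))   # typically well above 0: spurious long memory
```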
By: Costantini, Mauro (Department of Economics, University of Vienna BWZ, Vienna, Austria); Pappalardo, Carmine (Institute for Studies and Economic Analysis (ISAE), Rome, Italy)
Abstract: This paper proposes a strategy to increase the efficiency of forecast combining methods. Given the availability of a wide range of forecasting models for the same variable of interest, our goal is to apply combining methods to a restricted set of models. To this end, an algorithmic procedure based on the widely used encompassing test of Harvey, Leybourne, and Newbold (1998) is developed. First, forecasting models are ranked according to a measure of predictive accuracy (RMSFE); in a subsequent step, each prediction is chosen for combining only if it is not encompassed by the competing models. To assess the robustness of this procedure, an empirical application to Italian monthly industrial production using ISAE short-term forecasting models is provided.
Keywords: Combining forecasts, Econometric models, Evaluating forecasts, Model selection, Time series
JEL: C32 C53
Date: 2008–11
URL: http://d.repec.org/n?u=RePEc:ihs:ihsesp:228&r=ecm
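The selection step can be sketched directly from the abstract: rank models by RMSFE, then retain a forecast only if every retained competitor fails to encompass it. The code below uses the one-step-ahead Harvey-Leybourne-Newbold statistic, a t-test on d_t = (e_jt - e_it) e_jt with their small-sample scaling; the single pass through the ranked list is our simplification of the paper's algorithm, not a quotation of it.

```python
import numpy as np
from scipy import stats

def hln_encompassing_pvalue(e_j, e_i):
    """HLN (1998) test of the null that forecast j encompasses forecast i.
    e_j, e_i: one-step-ahead forecast-error series (numpy arrays)."""
    d = (e_j - e_i) * e_j                 # E[d] = 0 under the null
    n = len(d)
    stat = np.sqrt((n - 1) / n) * d.mean() / np.sqrt(d.var(ddof=1) / n)
    return 2 * stats.t.sf(abs(stat), df=n - 1)

def select_for_combination(errors, alpha=0.05):
    """Keep a forecast only if no retained competitor encompasses it."""
    rmsfe = [np.sqrt(np.mean(e ** 2)) for e in errors]
    order = np.argsort(rmsfe)             # best (lowest RMSFE) first
    kept = [order[0]]
    for i in order[1:]:
        # reject "j encompasses i" for every kept j => i adds information
        if all(hln_encompassing_pvalue(errors[j], errors[i]) < alpha
               for j in kept):
            kept.append(i)
    return kept                           # indices of forecasts to combine
```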
By: BOUEZMARNI, Taoufik; ROMBOUTS, Jeroen V.K.; TAAMOUTI, Abderrahim
Abstract: Copulas are extensively used for dependence modeling. In many cases the data do not reveal how the dependence can be modeled using a particular parametric copula. Nonparametric copulas do not share this problem since they are entirely data based. This paper proposes nonparametric estimation of the density copula for α-mixing data using Bernstein polynomials. We study the asymptotic properties of the Bernstein density copula, i.e., we provide the exact asymptotic bias and variance and establish uniform strong consistency and asymptotic normality.
Keywords: nonparametric estimation, copula, Bernstein polynomial, α-mixing, asymptotic properties, boundary bias
JEL: C13 C14
Date: 2008–07
URL: http://d.repec.org/n?u=RePEc:ctl:louvco:2008045&r=ecm
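One standard way to construct a Bernstein copula density estimate, consistent with the abstract's description, is to smooth the cell frequencies of the pseudo-observations (scaled ranks) with products of Beta densities: with weights w_kl on an m x m grid, the estimate at (u, v) is the double sum of w_kl times the Beta(k+1, m-k) density at u times the Beta(l+1, m-l) density at v. The sketch below implements that generic form; the order m and the rank convention are our assumptions, not necessarily the authors' exact estimator.

```python
import numpy as np
from scipy.stats import beta, rankdata

def bernstein_copula_density(x, y, m, u, v):
    """Bernstein copula density estimate at the point (u, v), order m."""
    n = len(x)
    U = rankdata(x) / (n + 1)          # pseudo-observations in (0, 1)
    V = rankdata(y) / (n + 1)
    # cell weights: share of pseudo-obs in each (k/m, (k+1)/m] x (l/m, (l+1)/m]
    ku = np.minimum((U * m).astype(int), m - 1)
    kv = np.minimum((V * m).astype(int), m - 1)
    w = np.zeros((m, m))
    np.add.at(w, (ku, kv), 1.0 / n)
    k = np.arange(m)
    bu = beta.pdf(u, k + 1, m - k)     # Beta(k+1, m-k) densities at u
    bv = beta.pdf(v, k + 1, m - k)     # Beta(l+1, m-l) densities at v
    return bu @ w @ bv
```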
By: WANG, Shin-Huei (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE)); HSIAO, Cheng
Abstract: This paper proposes an easy test of whether two stationary autoregressive fractionally integrated moving average (ARFIMA) processes are uncorrelated, via AR approximations. We prove that an ARFIMA process can be approximated well by an autoregressive (AR) model and establish the theoretical foundation for using Haugh's (1976) statistic to test whether two ARFIMA processes are uncorrelated. Using the AIC or Mallows's Cp criterion as a guide, we demonstrate through Monte Carlo studies that a low-order AR(k) model is sufficient to prewhiten an ARFIMA process and that the Haugh test statistics perform very well in finite samples. We illustrate the methodology by investigating the independence between the volatilities of two daily nominal dollar exchange rates - the euro and the Japanese yen - and find "strongly simultaneous correlation" between the volatilities of the euro and the yen within 25 days.
Keywords: forecasting, long memory process, structural break
JEL: C22 C53
Date: 2008–08
URL: http://d.repec.org/n?u=RePEc:ctl:louvco:2008047&r=ecm
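The test the abstract builds on is straightforward to sketch: prewhiten each series with an AR(k) regression, then combine the residual cross-correlations into Haugh's (1976) statistic S = n * sum over j from -M to M of r_uv(j)^2, asymptotically chi-square with 2M + 1 degrees of freedom under independence. In the sketch below the AR order k and lag window M are user inputs, whereas the paper selects k by AIC or Mallows's Cp:

```python
import numpy as np
from scipy.stats import chi2

def ar_residuals(x, k):
    """Residuals from an OLS AR(k) fit, used to prewhiten the series."""
    X = np.column_stack([x[k - j - 1:len(x) - j - 1] for j in range(k)])
    X = np.column_stack([np.ones(len(X)), X])
    y = x[k:]
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ b

def haugh_statistic(x, y, k, M):
    """Haugh (1976) test of zero cross-correlation at lags |j| <= M."""
    u, v = ar_residuals(np.asarray(x, float), k), ar_residuals(np.asarray(y, float), k)
    n = min(len(u), len(v))
    u, v = u[-n:], v[-n:]
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()
    r = [np.mean(u[j:] * v[:n - j]) for j in range(M + 1)]       # corr(u_t, v_{t-j})
    r += [np.mean(v[j:] * u[:n - j]) for j in range(1, M + 1)]   # corr(u_{t-j}, v_t)
    S = n * np.sum(np.square(r))
    return S, chi2.sf(S, df=2 * M + 1)
```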
By: Aurea Grané; Anna V. Tchirina
Abstract: We study the efficiency properties of the goodness-of-fit test based on the Qn statistic introduced in Fortiana and Grané (2003) using the concepts of Bahadur asymptotic relative efficiency and Bahadur asymptotic optimality. We compare the test based on this statistic with those based on the Kolmogorov-Smirnov, the Cramér-von Mises and the Anderson-Darling statistics. We also describe the distribution families for which the test based on Qn is asymptotically optimal in the Bahadur sense and, as an application, we use this test to detect the presence of hidden periodicities in a stationary time series.
Keywords: Bahadur asymptotic relative efficiency, Goodness-of-fit, Local asymptotic optimality, L-statistics, Maximum correlation
Date: 2008–09
URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws084211&r=ecm
By: Bryan S. Graham; James Powell
Abstract: In this paper we study identification and estimation of the causal effect of a small change in an endogenous regressor on a continuously valued outcome of interest using panel data. We focus on the average partial effect (APE) over the full population distribution of unobserved heterogeneity (e.g., Chamberlain, 1984; Blundell and Powell, 2003; Wooldridge, 2005a). In our basic model the outcome of interest varies linearly with a (scalar) regressor, but with an intercept and slope coefficient that may vary across units and over time in a way which depends on the regressor. This model is a special case of Chamberlain's (1980b, 1982, 1992a) correlated random coefficients (CRC) model, but does not satisfy the regularity conditions he imposes. Irregularity, while precluding estimation at parametric rates, does not result in a loss of identification under mild smoothness conditions. We show how two measures of the outcome and regressor for each unit are sufficient for identification of the APE as well as aggregate time trends. We identify aggregate trends using units with a zero first difference in the regressor, or, in the language of Chamberlain (1980b, 1982), 'stayers', and the average partial effect using units with non-zero first differences, or 'movers'. We discuss extensions of our approach to models with multiple regressors and more than two time periods. We use our methods to estimate the average elasticity of calorie consumption with respect to total outlay for a sample of poor Nicaraguan households (cf. Strauss and Thomas, 1995; Subramanian and Deaton, 1996). Our CRC average elasticity estimate declines with total outlay more sharply than its parametric counterpart.
JEL: C14 C23 I1 O1 O15
Date: 2008–11
URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:14469&r=ecm
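To fix ideas about the stayers/movers argument, here is a deliberately simplified two-period sketch under a common additive time trend: the trend is identified by the average outcome change among stayers (zero first difference in the regressor), and a movers-based average effect is the trend-adjusted outcome change per unit change in the regressor. This is only a heuristic illustration of the identification logic; the paper's actual estimator handles regressor-dependent intercepts and slopes and the irregular (slower-than-parametric) rates, which this sketch ignores.

```python
import numpy as np

def stayers_movers_ape(dy, dx, tol=0.0):
    """Heuristic two-period decomposition: aggregate time trend from
    'stayers' (dx == 0), trend-adjusted effect per unit of dx from
    'movers' (dx != 0)."""
    stayer = np.abs(dx) <= tol
    trend = dy[stayer].mean()                     # identified from stayers
    ape = ((dy[~stayer] - trend) / dx[~stayer]).mean()
    return trend, ape
```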
By: Ole E. Barndorff-Nielsen (Department of Mathematical Sciences, University of Aarhus); Peter Reinhard Hansen (Department of Economics, Stanford University); Asger Lunde (Aarhus School of Business, University of Aarhus); Neil Shephard (Oxford-Man Institute, University of Oxford)
Abstract: We propose a multivariate realised kernel to estimate the ex-post covariation of log-prices. We show that this new consistent estimator is guaranteed to be positive semi-definite, is robust to measurement noise of certain types, and can also handle non-synchronous trading. It is the first estimator with all three of these properties, each of which is essential for empirical work in this area. We derive the large sample asymptotics of this estimator and assess its accuracy using a Monte Carlo study. We implement the estimator on some US equity data, comparing our results to previous work which has used returns measured over 5- or 10-minute intervals. We show the new estimator is substantially more precise.
Keywords: HAC estimator, Long run variance estimator, Market frictions, Quadratic variation, Realised variance
JEL: C01 C14 C32
Date: 2008–01–07
URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0810&r=ecm
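The basic form of the estimator is a kernel-weighted sum of return autocovariance matrices, K(X) = sum over h from -H to H of k(h/(H+1)) Gamma_h with Gamma_h = sum_t x_t x'_{t-h}, where a weight function such as the Parzen kernel guarantees positive semi-definiteness. The sketch below assumes the returns are already synchronized (the paper handles non-synchronous trading via refresh-time sampling) and leaves the bandwidth H to the user instead of the paper's data-driven choice:

```python
import numpy as np

def parzen(x):
    """Parzen weight function; with it the realised kernel is PSD."""
    x = abs(x)
    if x <= 0.5:
        return 1 - 6 * x**2 + 6 * x**3
    if x <= 1.0:
        return 2 * (1 - x)**3
    return 0.0

def realised_kernel(returns, H):
    """K(X) = sum_{h=-H}^{H} k(h/(H+1)) Gamma_h on synchronized returns.

    returns: (n, d) array of high-frequency return vectors, assumed
    already synchronized (e.g. by refresh-time sampling)."""
    n, d = returns.shape
    K = np.zeros((d, d))
    for h in range(-H, H + 1):
        w = parzen(h / (H + 1))
        if h >= 0:
            G = returns[h:].T @ returns[:n - h]    # Gamma_h
        else:
            G = returns[:n + h].T @ returns[-h:]   # Gamma_{-h} = Gamma_h'
        K += w * G
    return K
```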
By: Valentina Corradi; Andres Fernandez; Norman Swanson
Abstract: Rationality of early release data is typically tested using linear regressions. Thus, failure to reject the null does not rule out the possibility of nonlinear dependence. This paper proposes two tests which instead have power against generic nonlinear alternatives. A Monte Carlo study shows that the suggested tests have good finite sample properties. Additionally, we carry out an empirical illustration using a real-time dataset for money, output, and prices. Overall, we find strong evidence against data rationality. Interestingly, for money stock the null is not rejected by linear tests but is rejected by our tests.
Keywords: Real-time data
Date: 2008
URL: http://d.repec.org/n?u=RePEc:fip:fedpwp:08-27&r=ecm
By: Jesús P. Colino
Abstract: In this paper, a new kind of additive process is proposed. Our main goal is to define, characterize and prove the existence of the LIBOR additive process as a new stochastic process. This process is defined as a piecewise stationary process with independent increments, continuous in probability but with discontinuous trajectories, and having càdlàg sample paths. The proposed process is specifically designed for interest-rate modelling because it allows us to introduce a jump-term structure as an increasing sequence of Lévy measures. In this paper we characterize this process as a Markovian process with an infinitely divisible, self-similar, stable and self-decomposable distribution. Also, we prove that the Lévy-Khintchine characteristic function and the Lévy-Itô decomposition apply to this process. Additionally, we develop a basic framework for density transformations. Finally, we show some examples of LIBOR additive processes.
Keywords: Jump processes, Processes with independent increments
Date: 2008–11
URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws085316&r=ecm
By: Özer Karagedikli (Bank of England); Troy Matheson (Reserve Bank of New Zealand); Christie Smith (Norges Bank (Central Bank of Norway)); Shaun Vahey (Melbourne Business School, Norges Bank (Central Bank of Norway), and Reserve Bank of New Zealand)
Abstract: Real Business Cycle (RBC) and Dynamic Stochastic General Equilibrium (DSGE) methods have become essential components of the macroeconomist's toolkit. This literature review stresses recently developed techniques for computation and inference, providing a supplement to the Romer (2006) textbook, which stresses theoretical issues. Many computational aspects are illustrated with reference to the simple divisible labour RBC model. Code and US data to replicate the computations are provided on the Internet, together with a number of appendices providing background details.
Keywords: RBC, DSGE, Computation, Bayesian Analysis, Simulation
JEL: C11 C50 E30
Date: 2008–10–24
URL: http://d.repec.org/n?u=RePEc:bno:worpap:2008_17&r=ecm
By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS: UMR8174 - Université Panthéon-Sorbonne - Paris I); Cyril Caillault (FORTIS Investments)
Abstract: Using non-parametric (copula) and parametric models, we show that the bivariate distribution of an Asian portfolio is not stable over the period under study. We suggest several dynamic models for computing two market risk measures, the Value at Risk and the Expected Shortfall: the RiskMetrics methodology, multivariate GARCH models, multivariate Markov-switching models, the empirical histogram, and dynamic copulas. We discuss the choice of the best method with respect to the policy management of bank supervisors. The copula approach appears to be a good compromise between all these models. It permits taking financial crises into account and obtaining a low capital requirement during the most important crises.
Keywords: Value at Risk, Expected Shortfall, Copula, RiskMetrics, Risk management, GARCH models, Switching models
Date: 2008–03–06
URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00185374_v1&r=ecm
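Of the methods compared in this abstract, the empirical-histogram benchmark is the simplest to state: the Value at Risk at level alpha is the (1 - alpha) quantile of the loss distribution, and the Expected Shortfall is the average loss beyond it. A minimal sketch of that benchmark; the dynamic copula, GARCH and Markov-switching alternatives in the paper would replace the empirical distribution with a model-implied one:

```python
import numpy as np

def var_es(returns, alpha=0.01):
    """Empirical Value at Risk and Expected Shortfall at level alpha."""
    losses = -np.asarray(returns)           # losses = negative returns
    var = np.quantile(losses, 1 - alpha)    # exceeded with probability alpha
    es = losses[losses >= var].mean()       # average loss in the tail
    return var, es
```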
By: CHOLLETE, Loran (Norwegian School of Economics and Business Administration (NHH)); HEINEN, Andréas (Universidad Carlos III de Madrid); VALDESOGO, Alfonso (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE))
Abstract: In order to capture observed asymmetric dependence in international financial returns, we construct a multivariate regime-switching model of copulas. We model dependence with one Gaussian and one canonical vine copula regime. Canonical vines are constructed from bivariate conditional copulas and provide a very flexible way of characterizing dependence in multivariate settings. We apply the model to returns from the G5 and Latin American regions, and document two main findings. First, we discover that models with canonical vines generally dominate alternative dependence structures. Second, the choice of copula is important for risk management, because it modifies the Value at Risk (VaR) of international portfolio returns.
Keywords: asymmetric dependence, canonical vine copula, international returns, regime-switching, risk management, Value-at-Risk
JEL: C32 C35 G10
Date: 2008–03
URL: http://d.repec.org/n?u=RePEc:ctl:louvco:2008013&r=ecm
By: Emilio Leton; Pilar Zuluaga
Abstract: It is fairly common to find medical examples with survival data in which sample sizes are unequal across groups. Several tests exist for such cases, but in practice the choice of one test over another is rarely justified. Because different tests can lead to different conclusions, guidelines are needed for choosing a suitable test with unbalanced groups. The tests can be computed with standard statistical software (BMDP, SAS, SPSS, Stata, Statgraphics, and S-Plus). However, commercial software covers only the family of weighted tests, none of the score tests, and the nomenclature is not unified, with different names used for the same test. We perform several simulations to give guidance on picking the right test. Because there are situations in which a test from the score family is preferable to a weighted one, we have developed new web-based JavaScript software that computes both score and weighted test versions (10 tests) under a unified nomenclature (this software is available from the authors upon request). We include real examples in which we apply, using the new JavaScript programs, the recommendations suggested by the simulations.
Keywords: Comparison of several survival curves, Score tests, Weighted tests
Date: 2008–10
URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws085215&r=ecm
By: Debopam Bhattacharya; Pascaline Dupas
Abstract: We consider the problem of efficiently allocating a binary treatment among a target population based on a set of discrete and continuous observed characteristics. The goal is to maximize the population mean of an eventual outcome when a budget constraint limits what fraction of the population can be treated. Using sample data resulting from randomized treatment allocation, the average treatment effect conditional on covariates (CATE) is nonparametrically estimated in a first step. The optimal treatment threshold and the resulting value function, which are non-smooth functionals of the CATE, are estimated from sample realizations of the estimated CATE. We derive large-sample distribution theory for these estimates and for the estimated dual value, i.e., the minimum resources needed to attain a specific average outcome via efficient treatment assignment. These inferential methods are applied to the optimal provision of anti-malaria bed nets, using data from a randomized experiment conducted in western Kenya. We find that a government which can afford to distribute subsidized bed nets to only 50% of its target population can, with an efficient allocation rule based on multiple covariates, increase bed-net use by 8 percentage points (25 percent) relative to random allocation and by 4 percentage points (11 percent) relative to a rule based on wealth only. Our methods can be extended to infer the optimal design of eligibility in conditional cash transfer programs.
JEL: C01 C14 I38
Date: 2008–10
URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:14447&r=ecm
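The plug-in rule behind these results is simple to state: estimate the CATE in a first step, then treat the units with the largest estimated CATE until the budget binds, i.e., set the treatment threshold at the (1 - c) quantile of the estimated CATE when a fraction c of the population can be treated. A minimal sketch with a hypothetical estimated CATE vector; the first-step nonparametric estimation, the inference theory that is the paper's contribution, and the corner case of negative estimated effects are all outside this sketch:

```python
import numpy as np

def allocation_rule(cate_hat, budget_share):
    """Treat the units with the largest estimated CATE, subject to
    treating at most a fraction `budget_share` of the population."""
    threshold = np.quantile(cate_hat, 1 - budget_share)
    return cate_hat >= threshold, threshold

# usage with a hypothetical estimated CATE vector
cate_hat = np.random.default_rng(0).normal(0.05, 0.10, size=5000)
treat, thr = allocation_rule(cate_hat, budget_share=0.5)
# average gain over random allocation of the same budget
gain = 0.5 * (cate_hat[treat].mean() - cate_hat.mean())
```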
By: Santos Monteiro, Paulo (University of Warwick)
Abstract: Full consumption insurance implies that consumers are able to share risk perfectly, equalizing, state by state, their intertemporal marginal rates of substitution in the presence of idiosyncratic endowment shocks. In this paper I test the implications of full consumption insurance using band spectrum regression methods. I argue that moving to the frequency domain provides a possible solution to many difficulties tied to tests of perfect risk sharing. In particular, it provides a unifying framework to test consumption smoothing both over time and across states of nature. Full consumption insurance is soundly rejected at business cycle frequencies.
Keywords: Consumption insurance, Idiosyncratic risk, Frequency domain
JEL: D1 E21
Date: 2008
URL: http://d.repec.org/n?u=RePEc:wrk:warwec:874&r=ecm
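Band spectrum regression in the sense used here (going back to Engle, 1974) transforms the regression to the frequency domain and runs least squares using only the Fourier frequencies in a chosen band, such as business cycle frequencies. A minimal sketch; the band limits, in cycles per observation, are the user's choice (for quarterly data, periods of 6 to 32 quarters correspond to low = 1/32 and high = 1/6):

```python
import numpy as np

def band_spectrum_ols(y, X, low, high):
    """OLS restricted to Fourier frequencies in [low, high].

    y: (n,) outcome; X: (n, k) regressors. Note the zero frequency
    carries the constant term, so excluding it drops the intercept."""
    n = len(y)
    freqs = np.fft.rfftfreq(n)                 # in cycles per observation
    keep = (freqs >= low) & (freqs <= high)
    fy = np.fft.rfft(y)[keep]
    fX = np.fft.rfft(X, axis=0)[keep]
    # stack real and imaginary parts so ordinary real-valued OLS applies
    Y = np.concatenate([fy.real, fy.imag])
    Z = np.vstack([fX.real, fX.imag])
    b, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    return b
```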
By: Sibbertsen, Philipp; Stahl, Gerhard; Luedtke, Corinna
Abstract: Model risk, as part of operational risk, is a serious problem for financial institutions. Since the pricing of derivatives and the computation of an institution's market or credit risk depend on statistical models, applying a wrong model can lead to serious over- or underestimation of the institution's risk. Because the underlying data-generating process is unknown in practice, evaluating model risk is a challenge. So far, definitions of model risk are either application-oriented, including risk induced by the statistician rather than by the statistical model, or research-oriented and too abstract to be used in practice; in particular, they are not data-driven. We introduce a data-driven notion of model risk that retains the features of the research-oriented approach while extending it with a statistical model-building procedure, and thus compromises between the two definitions at hand. We furthermore suggest using robust estimates to reduce model risk and advocate stress tests with respect to the valuation of the portfolio.
Keywords: risk evaluation, model risk, robust estimation, stress tests
JEL: C50 G32
Date: 2008–11
URL: http://d.repec.org/n?u=RePEc:han:dpaper:dp-409&r=ecm
By: Hu, Jian
Abstract: In this paper, we use a time-varying conditional copula approach (TVCC) to model the dependence structures of Chinese and U.S. stock markets with other financial markets. An AR-GARCH-t model is used for the marginals, while Normal and Generalized Joe-Clayton copula models are employed to analyze the joint distributions. In this pairwise analysis, both constant and time-varying conditional dependence parameters are estimated by a two-step maximum likelihood method. A comparative analysis of dependence structures in the Chinese versus U.S. stock markets is also provided. There are three main findings. First, the time-varying-dependence model does not always perform better than the constant-dependence model; this result has not previously been reported in the literature. Second, although previous research extensively reports that lower tail dependence between stock markets tends to be higher than upper tail dependence, we find a counterexample in which the upper tail dependence is much higher than the lower tail dependence over some short periods. Last, the Chinese financial market is relatively separate from other international financial markets, in contrast to the U.S. market: its tail dependence with other financial markets is much lower than that of the U.S.
Keywords: AR-GARCH-t model, Time-varying conditional copula, Dependence structure, Stock market
JEL: C51 G15 F36 P52
Date: 2008–10–31
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:11401&r=ecm
By: Asher A. Blass; Saul Lach; Charles F. Manski
Abstract: When data on actual choices are not available, researchers studying preferences sometimes pose choice scenarios and ask respondents to state the actions they would choose if they were to face these scenarios. The data on stated choices are then used to estimate random utility models, as if they were data on actual choices. Stated choices may differ from actual ones because researchers typically provide respondents with less information than they would have when facing actual choice problems. Elicitation of choice probabilities overcomes this problem by permitting respondents to express uncertainty about their behavior. This paper shows how to use elicited choice probabilities to estimate random utility models with random coefficients and applies the methodology to estimate preferences for electricity reliability in Israel.
JEL: C25 C42 D12 L51 L94
Date: 2008–10
URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:14451&r=ecm
By: Stefano Iezzi (Bank of Italy, Economics, Research and International Relations area)
Abstract: This paper analyzes the role of risk aversion in a choice model of risky assets. The measure of risk preference comes from a direct subjective survey question and is treated as imperfect information about the true risk attitude of investors. Misclassification between true and observed risk aversion is explicitly taken into account in the empirical model. A data augmentation approach, a Bayesian procedure for incomplete-data problems, is applied to data from the 2006 Survey of Household Income and Wealth by the Bank of Italy. Results indicate that, once misclassification of investors is taken into account, the subjective question performs well as a control in portfolio choice models. Moreover, risk aversion emerges as a strong predictor of the probability of holding risky assets. The analysis also shows that the probability of misclassification decreases as latent risk aversion increases, meaning that more risk-tolerant investors tend to be misclassified more often than less risk-tolerant ones.
Keywords: Portfolio choice, risk attitude, misclassification error, Bayesian analysis
JEL: I31 I32 D63 D31
Date: 2008–09
URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_692_08&r=ecm
By: Grassi, Stefano; Proietti, Tommaso
Abstract: The local level model with stochastic volatility, recently proposed for U.S. inflation by Stock and Watson ("Why Has U.S. Inflation Become Harder to Forecast?", Journal of Money, Credit and Banking, Supplement to Vol. 39, No. 1, February 2007), provides a simple yet sufficiently rich framework for characterizing the evolution of the main stylized facts concerning U.S. inflation. The model decomposes inflation into a core component, evolving as a random walk, and a transitory component. The volatility of the disturbances driving both components is allowed to vary over time. This paper provides a full Bayesian analysis of the model and readdresses some of the main issues raised in the literature concerning the evolution of persistence and predictability and the extent and timing of the Great Moderation. The assessment of various nested models of inflation volatility and systematic model selection provide strong evidence in favor of a model with heteroscedastic disturbances in the core component, whereas the transitory component has time-invariant size. The main evidence is that the Great Moderation is over, and that the volatility, persistence and predictability of inflation underwent a turning point in the late 1990s. During the last decade volatility and persistence have been increasing and predictability has been going down.
Keywords: Marginal Likelihood, Bayesian Model Comparison, Stochastic Volatility, Great Moderation, Inflation Persistence
JEL: E31 C22
Date: 2008–11–07
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:11453&r=ecm
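The model in question (the UC-SV model of Stock and Watson, 2007) decomposes inflation into a random-walk core plus transitory noise, with the log variance of each disturbance itself following a random walk. Below is a simulation sketch of that data generating process; the volatility-smoothness parameter gamma is our illustrative choice, and the paper's actual contribution, the full Bayesian estimation and model comparison, is outside the sketch.

```python
import numpy as np

def simulate_ucsv(n, gamma=0.2, seed=0):
    """Simulate pi_t = tau_t + eta_t, tau_t = tau_{t-1} + eps_t, where the
    log variances of eta (transitory) and eps (core) are random walks."""
    rng = np.random.default_rng(seed)
    h_eta = np.cumsum(gamma * rng.standard_normal(n))  # log var, transitory
    h_eps = np.cumsum(gamma * rng.standard_normal(n))  # log var, core
    eps = np.exp(h_eps / 2) * rng.standard_normal(n)
    tau = np.cumsum(eps)                               # random-walk core
    pi = tau + np.exp(h_eta / 2) * rng.standard_normal(n)
    return pi, tau
```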