On Econometrics
By: | Waldyr Dutra Areosa (Department of Economics PUC-Rio and Banco Central do Brasil); Michael McAleer (Erasmus School of Economics, Tinbergen Institute, and Center for International Research on the Japanese Economy (CIRJE)); Marcelo Cunha Medeiros (Department of Economics PUC-Rio)
Abstract: | Nonlinear regression models have been widely used in practice for a variety of time series and cross-section datasets. For purposes of analyzing univariate and multivariate time series data, in particular, Smooth Transition Regression (STR) models have been shown to be very useful for representing and capturing asymmetric behavior. Most STR models have been applied to univariate processes, and have made a variety of assumptions, including stationary or cointegrated processes, uncorrelated, homoskedastic or conditionally heteroskedastic errors, and weakly exogenous regressors. Under the assumption of exogeneity, the standard method of estimation is nonlinear least squares. The primary purpose of this paper is to relax the assumption of weakly exogenous regressors and to discuss moment-based methods for estimating STR models. The paper analyzes the properties of the STR model with endogenous variables by providing a diagnostic test of linearity of the underlying process under endogeneity, developing an estimation procedure and a misspecification test for the STR model, presenting the results of Monte Carlo simulations to show the usefulness of the model and estimation method, and providing an empirical application for inflation rate targeting in Brazil. We show that STR models with endogenous variables can be specified and estimated by a straightforward application of existing results in the literature.
Keywords: | Smooth transition, nonlinear models, nonlinear instrumental variables, generalized method of moments, endogeneity, inflation targeting. |
Date: | 2010–03 |
URL: | http://d.repec.org/n?u=RePEc:rio:texdis:571&r=ecm |
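A minimal sketch of the idea, assuming a two-regime logistic STR specification estimated by one-step GMM, with the instruments and their interactions with the transition variable as moment conditions; the design, instrument choices, and starting values are illustrative, not the authors':

```python
import numpy as np
from scipy.optimize import minimize

def logistic_transition(s, gamma, c):
    """Logistic smooth transition function G(s; gamma, c) in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def gmm_objective(theta, y, x, s, Z):
    """One-step GMM: quadratic form in the sample moments E[Z_t e_t] = 0
    with identity weighting."""
    b0, b1, gamma, c = theta
    e = y - b0 * x - b1 * x * logistic_transition(s, gamma, c)
    gbar = Z.T @ e / len(e)
    return gbar @ gbar

# Simulated data: x is endogenous, Z holds instruments and interactions.
rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=(n, 4))                     # instruments
v = rng.normal(size=n)
x = z @ np.array([1.0, 0.5, 0.5, 0.2]) + v      # endogenous regressor
s = rng.normal(size=n)                          # transition variable
e = 0.5 * v + rng.normal(size=n)                # error correlated with x
y = 1.0 * x + 2.0 * x * logistic_transition(s, 5.0, 0.0) + e

Z = np.column_stack([z, z * s[:, None]])        # 8 moments, 4 parameters
res = minimize(gmm_objective, x0=[0.5, 0.5, 1.0, 0.0],
               args=(y, x, s, Z), method="Nelder-Mead")
print(res.x)                                    # estimates of (b0, b1, gamma, c)
```

A two-step estimator would replace the identity weight with the inverse of an estimated moment covariance matrix.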
By: | Jia Chen (School of Economics, University of Adelaide); Jiti Gao (School of Economics, University of Adelaide); Degui Li (School of Economics, University of Adelaide) |
Abstract: | In this paper, we study semiparametric estimation for a single-index panel data model where the nonlinear link function varies among the individuals. We propose using the so-called refined minimum average variance estimation based on a local linear smoothing method to estimate both the parameters in the single-index and the average link function. As the cross-section dimension N and the time series dimension T tend to infinity simultaneously, we establish asymptotic distributions for the proposed parametric and nonparametric estimates. In addition, we provide two real-data examples to illustrate the finite sample behavior of the proposed estimation method.
Keywords: | Asymptotic distribution, local linear smoother, minimum average variance estimation, panel data, semiparametric estimation, single-index models |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:adl:wpaper:2010-09&r=ecm |
By: | Stefano Iacus (Department of Economics, Business and Statistics, University of Milan, IT) |
Abstract: | We consider a dynamical system with small noise where the drift is parametrized by a finite dimensional parameter. For this model we consider minimum distance estimation from continuous time observations under some penalty imposed on the parameters in the spirit of the Lasso approach. This approach allows for simultaneous estimation and model selection for this model. |
Keywords: | dynamical systems, lasso estimation, model selection, inference for stochastic processes, diffusion processes
Date: | 2010–02–05 |
URL: | http://d.repec.org/n?u=RePEc:bep:unimip:1101&r=ecm |
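A minimal sketch of penalized drift estimation, using an Euler-discretized mean-reverting path as a stand-in for the paper's continuous-time observations and penalized minimum-distance contrast; the basis and penalty level are illustrative:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulate a mean-reverting path dX = -b*X dt + sigma dW on a fine grid
# (a discretized stand-in for continuous-time observations).
rng = np.random.default_rng(1)
b, sigma, dt, n = 1.0, 0.5, 0.01, 20000
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):
    x[i + 1] = x[i] - b * x[i] * dt + sigma * np.sqrt(dt) * rng.normal()

# Drift parametrized over a deliberately rich basis theta' f(x);
# the L1 penalty should prune the redundant terms.
F = np.column_stack([np.ones(n - 1), x[:-1], x[:-1] ** 2, x[:-1] ** 3])
dy = np.diff(x) / dt                    # Euler proxy for the drift value

# Penalty level is illustrative; in practice it would be chosen by
# cross-validation or by adaptive weights.
fit = Lasso(alpha=0.02, fit_intercept=False).fit(F, dy)
print(fit.coef_)                        # sparse: mass on the linear term
```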
By: | Francesco Audrino (University of St. Gallen); Marcelo Cunha Medeiros (Department of Economics PUC-Rio) |
Abstract: | In this paper we propose a smooth transition tree model for both the conditional mean and variance of the short-term interest rate process. The estimation of such models is addressed and the asymptotic properties of the quasi-maximum likelihood estimator are derived. Model specification is also discussed. When the model is applied to the US short-term interest rate we find that (1) leading indicators for inflation and real activity are the most relevant predictors in characterizing the multiple regimes’ structure; and (2) the optimal model has three limiting regimes. Moreover, we provide empirical evidence of the power of the model in forecasting the first two conditional moments when it is used in connection with bootstrap aggregation (bagging).
Keywords: | short-term interest rate, regression tree, smooth transition, conditional variance, bagging, asymptotic theory |
Date: | 2010–03 |
URL: | http://d.repec.org/n?u=RePEc:rio:texdis:570&r=ecm |
By: | Matias D. Cattaneo; Richard K. Crump; Michael Jansson |
Abstract: | Employing the "small-bandwidth" asymptotic framework of Cattaneo, Crump, and Jansson (2009), this paper studies the properties of several bootstrap-based inference procedures associated with a kernel-based estimator of density-weighted average derivatives proposed by Powell, Stock, and Stoker (1989). In many cases, the validity of bootstrap-based inference procedures is found to depend crucially on whether the bandwidth sequence satisfies a particular (asymptotic linearity) condition. An exception to this rule occurs for inference procedures involving a studentized estimator that employs a "robust" variance estimator derived from the "small-bandwidth" asymptotic framework. The results of a small-scale Monte Carlo experiment are found to be consistent with the theory and indicate in particular that sensitivity with respect to the bandwidth choice can be ameliorated by using the "robust" variance estimator. |
Keywords: | Statistical methods ; Econometrics ; Econometrics - Asymptotic theory ; Econometric models |
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:fip:fednsr:452&r=ecm |
By: | Søren Johansen (University of Copenhagen and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES)
Abstract: | We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model based on the conditional Gaussian likelihood. The model allows the process X_{t} to be fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β'X_{t} is fractional of order d-b. The parameters d and b satisfy either d ≥ b ≥ 1/2, d = b ≥ 1/2, or d = d_{0} ≥ b ≥ 1/2. Our main technical contribution is the proof of consistency of the maximum likelihood estimators on the set 1/2 ≤ b ≤ d ≤ d_{1} for any d_{1} ≥ d_{0}. To this end, we consider the conditional likelihood as a stochastic process in the parameters, and prove that it converges in distribution when errors are i.i.d. with suitable moment conditions and initial values are bounded. We then prove that the estimator of β is asymptotically mixed Gaussian and estimators of the remaining parameters are asymptotically Gaussian. We also find the asymptotic distribution of the likelihood ratio test for cointegration rank, which is a functional of fractional Brownian motion of type II.
Keywords: | Cofractional processes, cointegration rank, fractional cointegration, likelihood inference, vector autoregressive model |
JEL: | C32 |
Date: | 2010–05–18 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-24&r=ecm |
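For reference, the cofractional vector autoregressive model referred to here can be sketched (up to notational details) as

```latex
\Delta^{d} X_t \;=\; \alpha\beta'\,\Delta^{d-b} L_b X_t
  \;+\; \sum_{i=1}^{k} \Gamma_i\, \Delta^{d} L_b^{i} X_t \;+\; \varepsilon_t,
\qquad L_b \;:=\; 1-\Delta^{b} \;=\; 1-(1-L)^{b},
```

so that X_t is fractional of order d while β'X_t is fractional of order d-b.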
By: | Degui Li (School of Economics, University of Adelaide); Jia Chen (School of Economics, University of Adelaide); Jiti Gao (School of Economics, University of Adelaide) |
Abstract: | This paper is concerned with developing a nonparametric time-varying coefficient model with fixed effects to characterize nonstationarity and trending phenomena in nonlinear panel data analysis. We develop two methods to estimate the trend function and the coefficient function without taking the first difference to eliminate the fixed effects. The first one eliminates the fixed effects by taking cross-sectional averages, and then uses a nonparametric local linear approach to estimate the trend function and the coefficient function. The asymptotic theory for this approach reveals that although the estimates of both the trend function and the coefficient function are consistent, the estimate of the coefficient function has a rate of convergence that is slower than that of the trend function. To estimate the coefficient function more efficiently, we propose a pooled local linear dummy variable approach. This is motivated by a least squares dummy variable method proposed in parametric panel data analysis. This method removes the fixed effects by deducting a smoothed version of the cross-time average from each individual. The asymptotic distributions of both of the estimates are established when T tends to infinity and N is fixed or both T and N tend to infinity. Simulation results are provided to illustrate the finite sample behavior of the proposed estimation methods.
Keywords: | Fixed effects, local linear estimation, nonstationarity, panel data, specification testing, time-varying coefficient function
JEL: | C13 C14 C23 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:adl:wpaper:2010-08&r=ecm |
By: | Hyeongwoo Kim; Nazif Durmaz |
Abstract: | We evaluate the usefulness of bias-correction methods for autoregressive (AR) models in terms of out-of-sample forecast accuracy, employing two popular methods proposed by Hansen (1999) and So and Shin (1999). Our Monte Carlo simulations show that these methods do not necessarily achieve better forecasting performance than the bias-uncorrected Least Squares (LS) method, because bias correction tends to increase the variance of the estimator. There is a gain from correcting for bias only when the true data generating process is sufficiently persistent. Though the bias arises in finite samples, the sample size (N) is not a crucial factor in the gains from bias correction, because both the bias and the variance tend to decrease as N goes up. We also provide a real data application with seven commodity price indices, which confirms our findings.
Keywords: | Small-Sample Bias, Grid Bootstrap, Recursive Mean Adjustment, Out-of-Sample Forecast |
JEL: | C52 C53 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:abn:wpaper:auwp2010-02&r=ecm |
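The bias-variance trade-off described here is easy to reproduce. The sketch below uses the classic first-order bias correction for an AR(1) with intercept, E[rho_hat - rho] ≈ -(1 + 3*rho)/T, as a simple stand-in for the grid-bootstrap and recursive-mean-adjustment corrections the paper actually studies; all design values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def ar1_ols(y):
    """OLS of y_t on (1, y_{t-1}); returns (intercept, slope)."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    return np.linalg.lstsq(X, y[1:], rcond=None)[0]

def simulate_ar1(rho, T):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + rng.normal()
    return y

rho, T, reps = 0.95, 50, 2000       # persistent case: correction should pay off
mse_ls = mse_bc = 0.0
for _ in range(reps):
    y = simulate_ar1(rho, T + 1)
    c, r = ar1_ols(y[:T])
    # First-order bias correction, truncated to keep the root stationary
    # (intercept left uncorrected for simplicity).
    r_bc = min(r + (1 + 3 * r) / T, 0.999)
    mse_ls += (y[T] - (c + r * y[T - 1])) ** 2
    mse_bc += (y[T] - (c + r_bc * y[T - 1])) ** 2
print(mse_ls / reps, mse_bc / reps)  # one-step-ahead forecast MSEs
```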
By: | Tsunehiro Ishihara (Graduate School of Economics, University of Tokyo); Yasuhiro Omori (Faculty of Economics, University of Tokyo) |
Abstract: | An efficient Bayesian estimation method using Markov chain Monte Carlo is proposed for a multivariate stochastic volatility model, a natural extension of the univariate stochastic volatility model with leverage and heavy-tailed errors that further incorporates cross-leverage effects among stock returns. Our method is based on a multi-move sampler that samples a block of latent volatility vectors at once. Its high sampling efficiency is shown using numerical examples, in comparison with a single-move sampler that samples one latent volatility vector at a time, given the other latent vectors and parameters. To illustrate the method, empirical analyses are provided based on returns on five S&P500 sector indices.
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2010cf746&r=ecm |
By: | Alessandro De Gregorio; Stefano Iacus (Department of Economics, Business and Statistics, University of Milan, IT) |
Abstract: | The LASSO is a widely used statistical methodology for simultaneous estimation and variable selection. In recent years, many authors have analyzed this technique from both theoretical and applied points of view. We introduce and study the adaptive LASSO problem for discretely observed ergodic diffusion processes. We prove oracle properties and derive the asymptotic distribution of the LASSO estimator. Our theoretical framework is based on the random field approach and applies to more general families of regular statistical experiments in the sense of Ibragimov-Hasminskii (1981). Furthermore, we perform a simulation and real data analysis to provide some evidence on the applicability of this method.
Keywords: | discretely observed diffusion processes, model selection, oracle properties, random fields, stochastic differential equations
Date: | 2010–02–05 |
URL: | http://d.repec.org/n?u=RePEc:bep:unimip:1100&r=ecm |
By: | Guido Imbens (Institute for Fiscal Studies and Harvard University); Karthik Kalyanaraman (Institute for Fiscal Studies and UCL) |
Abstract: | We investigate the problem of optimal choice of the smoothing parameter (bandwidth) for the regression discontinuity estimator. We focus on estimation by local linear regression, which was shown to be rate optimal (Porter, 2003). We derive the optimal bandwidth. This optimal bandwidth depends on unknown functionals of the distribution of the data, and we propose specific, consistent estimators for these functionals to obtain a fully data-driven bandwidth choice that has the "asymptotic no-regret" property. We illustrate our proposed bandwidth, and the sensitivity to the choices made in this bandwidth proposal, using a data set previously analyzed by Lee (2008), as well as a small simulation study based on the Lee data set. The simulations suggest that the proposed rule performs well.
Date: | 2010–03 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:05/10&r=ecm |
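The estimator whose bandwidth the paper optimizes is local linear regression fit separately on each side of the cutoff. A minimal sketch with a triangular kernel and a hand-picked bandwidth; the fully data-driven choice of h is the paper's contribution and is not reproduced here:

```python
import numpy as np

def llr_rd(y, x, h, cutoff=0.0):
    """Sharp RD effect: local linear fit with a triangular kernel and
    bandwidth h on each side of the cutoff; the effect is the jump
    between the two boundary intercepts."""
    def boundary_intercept(mask):
        xs, ys = x[mask] - cutoff, y[mask]
        w = np.maximum(1 - np.abs(xs) / h, 0)       # triangular kernel weights
        X = np.column_stack([np.ones(xs.size), xs])
        XtW = X.T * w                               # weight each observation
        beta = np.linalg.solve(XtW @ X, XtW @ ys)
        return beta[0]                              # intercept at the boundary
    return boundary_intercept(x >= cutoff) - boundary_intercept(x < cutoff)

rng = np.random.default_rng(3)
n = 2000
x = rng.uniform(-1, 1, n)
y = 0.5 * x + 0.4 * (x >= 0) + rng.normal(0, 0.3, n)   # true jump = 0.4
print(llr_rd(y, x, h=0.3))
```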
By: | Jia Chen (School of Economics, University of Adelaide); Jiti Gao (School of Economics, University of Adelaide); Degui Li (School of Economics, University of Adelaide) |
Abstract: | A semiparametric fixed effects model is introduced to describe the nonlinear trending phenomenon in panel data analysis; it allows for cross-sectional dependence in both the regressors and the residuals. A semiparametric profile likelihood approach based on first-stage local linear fitting is developed to estimate both the parameter vector and the time trend function.
Keywords: | Cross-sectional dependence, nonlinear time trend, panel data, profile likelihood, semiparametric regression |
JEL: | C13 C14 C23 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:adl:wpaper:2010-10&r=ecm |
By: | James Heckman (Institute for Fiscal Studies and University of Chicago); Sergio Urzua |
Abstract: | This paper compares the economic questions addressed by instrumental variables estimators with those addressed by structural approaches. We discuss Marschak's Maxim: estimators should be selected on the basis of their ability to answer well-posed economic problems with minimal assumptions. A key identifying assumption that allows structural methods to be more informative than IV can be tested with data and does not have to be imposed.
Date: | 2010–04 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:08/10&r=ecm |
By: | James Heckman (Institute for Fiscal Studies and University of Chicago); Daniel Schmierer; Sergio Urzua |
Abstract: | The recent literature on instrumental variables (IV) features models in which agents sort into treatment status on the basis of gains from treatment as well as on baseline-pretreatment levels. Components of the gains known to the agents and acted on by them may not be known by the observing economist. Such models are called correlated random coefficient models. Sorting on unobserved components of gains complicates the interpretation of what IV estimates. This paper examines testable implications of the hypothesis that agents do not sort into treatment based on gains. We develop new tests to gauge the empirical relevance of the correlated random coefficient model and to examine whether the additional complications associated with it are required. We examine the power of the proposed tests. We derive a new representation of the variance of the instrumental variable estimator for the correlated random coefficient model. We apply the methods in this paper to the prototypical empirical problem of estimating the return to schooling and find evidence of sorting into schooling based on unobserved components of gains.
Date: | 2010–04 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:10/10&r=ecm |
By: | Andrew Chesher (Institute for Fiscal Studies and University College London); Konrad Smolinski (Institute for Fiscal Studies) |
Abstract: | Instrumental variable models for discrete outcomes are set, not point, identifying. The paper characterises identified sets of structural functions when endogenous variables are discrete. Identified sets are unions of large numbers of convex sets and may be neither convex nor connected. Each of the component sets is a projection of a convex set that resides in a much higher dimensional space onto the space in which a structural function resides. The paper develops a symbolic expression for this projection and gives a constructive demonstration that it is indeed the identified set. We provide a Mathematica™ notebook which computes the set symbolically. We derive properties of the set, suggest how the set can be used in practical econometric analysis when outcomes and endogenous variables are discrete, and propose a method for estimating identified sets under parametric or shape restrictions. We develop an expression for a set of structural functions for the case in which endogenous variables are continuous or mixed discrete-continuous, and show that this set contains all structural functions in the identified set in the non-discrete case.
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:11/10&r=ecm |
By: | Nadezhda V. Baryshnikova (School of Economics, University of Adelaide) |
Abstract: | In order to improve the small sample performance of the Generalized Empirical Likelihood Kleibergen-type tests (GELK), we propose to re-weight the variance matrix of the moments with GEL probabilities. Our modification improves GELK significantly, cutting the size distortion in half. Using simulations, we compare the performance of our modified tests with Kleibergen's K-test and the original GELK tests in a dynamic panel setting. As an empirical application, we use the Arellano and Bond dynamic panel data on 140 UK firms to estimate labor demand. We compare our results with the traditional Wald test to illustrate the practical importance of using tests which are robust to weak instruments in a dynamic panel setting.
Keywords: | GELK, weak instruments, dynamic panel, empirical likelihood |
Date: | 2010–04 |
URL: | http://d.repec.org/n?u=RePEc:adl:wpaper:2010-03&r=ecm |
By: | Òscar Jordà; Malte Knüppel; Massimiliano Marcellino |
Abstract: | Measuring and displaying uncertainty around path-forecasts, i.e., forecasts made in period T about the expected trajectory of a random variable in periods T+1 to T+H, is a key ingredient for decision making under uncertainty. The probabilistic assessment about the set of possible trajectories that the variable may follow over time is summarized by the simultaneous confidence region generated from its forecast generating distribution. However, if the null model is only approximate or altogether unavailable, one cannot derive analytic expressions for this confidence region, and its non-parametric estimation is impractical given commonly available predictive sample sizes. Instead, this paper derives the approximate rectangular confidence regions that control false discovery rate error, which are a function of the predictive sample covariance matrix and the empirical distribution of the Mahalanobis distance of the path-forecast errors. These rectangular regions are simple to construct and appear to work well in a variety of cases explored empirically and by simulation. The proposed techniques are applied to provide confidence bands around the Fed and Bank of England real-time path-forecasts of growth and inflation.
Keywords: | path forecast, forecast uncertainty, simultaneous confidence region, Scheffé’s S-method, Mahalanobis distance, false discovery rate
JEL: | C32 C52 C53 |
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:eui:euiwps:eco2010/18&r=ecm |
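A rough sketch of the kind of construction the abstract describes, under strong simplifying assumptions: the per-horizon half-widths are calibrated by the empirical quantile of the squared Mahalanobis distance of sampled error paths, split equally across horizons. The split rule below is an illustrative choice, not the authors' formula:

```python
import numpy as np

def rectangular_band(errors, coverage=0.95):
    """errors: R x H sample of path-forecast errors. Returns per-horizon
    half-widths of a rectangular band calibrated by the empirical
    quantile of the squared Mahalanobis distance of the error paths."""
    R, H = errors.shape
    S = np.cov(errors, rowvar=False)                 # predictive covariance (H x H)
    d2 = np.einsum('ij,jk,ik->i', errors, np.linalg.inv(S), errors)
    q2 = np.quantile(d2, coverage)                   # empirical quantile of d^2
    return np.sqrt(q2 * np.diag(S) / H)              # equal split across horizons

# Illustration with serially correlated path errors.
rng = np.random.default_rng(4)
H, R = 8, 500
C = 0.6 ** np.abs(np.subtract.outer(np.arange(H), np.arange(H)))
errors = rng.normal(size=(R, H)) @ np.linalg.cholesky(C).T
print(rectangular_band(errors))                      # band: forecast +/- widths
```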
By: | Charles Manski (Institute for Fiscal Studies and Northwestern University) |
Abstract: | This paper develops a formal language for study of treatment response with social interactions, and uses it to obtain new findings on identification of potential outcome distributions. Defining a person's treatment response to be a function of the entire vector of treatments received by the population, I study identification when shape restrictions and distributional assumptions are placed on response functions. An early key result is that the traditional assumption of individualistic treatment response (ITR) is a polar case within the broad class of constant treatment response (CTR) assumptions, the other pole being unrestricted interactions. Important non-polar cases are interactions within reference groups and distributional interactions. I show that established findings on identification under assumption ITR extend to assumption CTR. These include identification with assumption CTR alone and when this shape restriction is strengthened to semi-monotone response. I next study distributional assumptions using instrumental variables. Findings obtained previously under assumption ITR extend when assumptions of statistical independence (SI) are posed in settings with social interactions. However, I find that random assignment of realized treatments generically has no identifying power when some persons are leaders who may affect outcomes throughout the population. Finally, I consider use of models of endogenous social interactions to derive restrictions on response functions. I emphasize that identification of potential outcome distributions differs from the longstanding econometric concern with identification of structural functions. This paper is a revised version of CWP01/10.
Date: | 2010–02 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:01/10&r=ecm |
By: | Maurice Bun; Frank Windmeijer (Institute for Fiscal Studies and University of Bristol) |
Abstract: | We consider the bias of the 2SLS estimator in the linear instrumental variables regression with one endogenous regressor only. By using asymptotic expansion techniques we approximate the 2SLS coefficient estimation bias under various scenarios regarding the number and strength of instruments. The resulting approximation encompasses existing bias approximations, which are valid in particular cases only. Simulations show that the developed approximation gives an accurate description of the 2SLS bias in cases of weak instruments, many instruments, or both.
Date: | 2010–04 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:07/10&r=ecm |
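The bias being approximated is easy to exhibit by simulation. A minimal Monte Carlo sketch with one endogenous regressor and many weak instruments; all design values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

def tsls(y, x, Z):
    """2SLS with a single endogenous regressor: beta = x'Pz y / x'Pz x."""
    fitted = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]   # first-stage fit
    return (fitted @ y) / (fitted @ x)

n, k, beta, reps = 200, 20, 1.0, 2000
est = np.empty(reps)
for j in range(reps):
    Z = rng.normal(size=(n, k))
    u = rng.normal(size=n)
    v = 0.8 * u + rng.normal(size=n)       # endogeneity: corr(u, v) > 0
    x = Z @ np.full(k, 0.05) + v           # many instruments, all weak
    y = beta * x + u
    est[j] = tsls(y, x, Z)
print(est.mean() - beta)                   # bias toward the OLS plim
```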
By: | Chia-Lin Chang; Philip Hans Franses; Michael McAleer (University of Canterbury) |
Abstract: | Macro-economic forecasts are often based on the interaction between econometric models and experts. A forecast that is based only on an econometric model is replicable and may be unbiased, whereas a forecast that is not based only on an econometric model, but also incorporates an expert’s touch, is non-replicable and is typically biased. In this paper we propose a methodology to analyze the qualities of combined non-replicable forecasts. One part of the methodology seeks to retrieve a replicable component from the non-replicable forecasts, and compares this component against the actual data. A second part modifies the estimation routine to account for the fact that the difference between a replicable and a non-replicable forecast involves a measurement error. An empirical example to forecast economic fundamentals for Taiwan shows the relevance of the methodological approach.
Keywords: | Combined forecasts; efficient estimation; generated regressors; replicable forecasts; non-replicable forecasts; expert’s intuition |
JEL: | C53 C22 E27 E37 |
Date: | 2010–05–01 |
URL: | http://d.repec.org/n?u=RePEc:cbt:econwp:10/35&r=ecm |
By: | Seungmoon Choi (School of Economics, University of Adelaide) |
Abstract: | The aim of this paper is to find approximate log-transition density functions for multivariate time-inhomogeneous diffusions in closed form. There is considerable empirical evidence that the underlying data generating processes of many economic variables change over time. One possible way to explain the time-dependent behavior of state variables is to model the drift or volatility terms as functions of time t as well as of the state variables. Closed-form likelihood expansions for multivariate time-homogeneous diffusions have been obtained by Ait-Sahalia (2008). This research builds on his work and extends his results to time-inhomogeneous cases. A simulation study reveals that our method yields a very accurate approximate likelihood function that can be a good candidate when the true likelihood function is unavailable.
Keywords: | Likelihood function; multivariate time-inhomogeneous diffusion; reducible diffusions; irreducible diffusions
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:adl:wpaper:2010-11&r=ecm |
By: | Alexander Strasak; Nikolaus Umlauf; Ruth Pfeiffer; Stefan Lang |
Abstract: | P(enalized)-splines and fractional polynomials (FPs) have emerged as powerful smoothing techniques with increasing popularity in several fields of applied research. Both approaches provide considerable flexibility, but only limited comparative evaluations of the performance and properties of the two methods have been conducted to date. We thus performed extensive simulations to compare FPs of degree 2 (FP2) and degree 4 (FP4) and P-splines that used generalized cross validation (GCV) and restricted maximum likelihood (REML) for smoothing parameter selection. We evaluated the ability of P-splines and FPs to recover the “true” functional form of the association between continuous, binary and survival outcomes and exposure for linear, quadratic and more complex, non-linear functions, using different sample sizes and signal to noise ratios. We found that for more curved functions FP2, the current default implementation in standard software, showed considerable bias and consistently higher mean squared error (MSE) compared to spline-based estimators (REML, GCV) and FP4, which performed equally well in most simulation settings. FPs, however, are prone to artefacts due to the specific choice of the origin, while P-splines based on GCV sometimes produce wiggly estimates, in particular for small sample sizes. Finally, we highlight the specific features of the approaches in a real dataset.
Keywords: | generalized additive models; GAMs; simulation; smoothing |
JEL: | C14 |
Date: | 2010–05 |
URL: | http://d.repec.org/n?u=RePEc:inn:wpaper:2010-11&r=ecm |
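A minimal sketch of one of the two smoothers under comparison: a P-spline, i.e., a B-spline basis combined with a second-order difference penalty, with the smoothing parameter chosen by grid-search GCV (the REML variant and the FP competitor are not sketched):

```python
import numpy as np
from scipy.interpolate import BSpline

def pspline_gcv(x, y, n_knots=20, degree=3):
    """Penalized B-spline regression with a second-order difference
    penalty; the smoothing parameter is chosen by grid-search GCV."""
    knots = np.linspace(x.min() - 1e-6, x.max() + 1e-6, n_knots)
    t = np.r_[[knots[0]] * degree, knots, [knots[-1]] * degree]
    B = BSpline.design_matrix(x, t, degree).toarray()
    nb = B.shape[1]
    D = np.diff(np.eye(nb), n=2, axis=0)             # second differences
    best_gcv, best_fit = np.inf, None
    for lam in 10.0 ** np.arange(-4, 5):
        H = B @ np.linalg.solve(B.T @ B + lam * D.T @ D, B.T)
        resid = y - H @ y
        gcv = len(y) * (resid @ resid) / (len(y) - np.trace(H)) ** 2
        if gcv < best_gcv:
            best_gcv, best_fit = gcv, H @ y
    return best_fit

rng = np.random.default_rng(6)
x = np.sort(rng.uniform(0, 1, 300))
y = np.sin(4 * np.pi * x) + rng.normal(0, 0.3, 300)
fit = pspline_gcv(x, y)
print(float(np.mean((fit - np.sin(4 * np.pi * x)) ** 2)))  # recovery error
```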
By: | A. Carriero; G. Kapetanios; M. Marcellino |
Abstract: | We propose a new approach to forecasting the term structure of interest rates, which allows us to efficiently extract the information contained in a large panel of yields. In particular, we use a large Bayesian Vector Autoregression (BVAR) with an optimal amount of shrinkage towards univariate AR models. Focusing on the U.S., we provide an extensive study of the forecasting performance of our proposed model relative to most of the existing alternative specifications. While most of the existing evidence focuses on statistical measures of forecast accuracy, we also evaluate the performance of the alternative forecasts when used within trading schemes or as a basis for portfolio allocation. We extensively check the robustness of our results via subsample analysis and via a data-based Monte Carlo simulation. We find that: i) our proposed BVAR approach produces forecasts systematically more accurate than the random walk forecasts, though the gains are small; ii) some models beat the BVAR for a few selected maturities and forecast horizons, but they perform much worse than the BVAR in the remaining cases; iii) predictive gains with respect to the random walk have decreased over time; iv) different loss functions (i.e., "statistical" vs "economic") lead to different rankings of specific models; v) modelling time variation in term premia is important and useful for forecasting.
Keywords: | Bayesian methods, Forecasting, Term Structure. |
JEL: | C11 C53 E43 E47 |
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:eui:euiwps:eco2010/17&r=ecm |
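A minimal sketch of shrinkage towards univariate AR dynamics in the spirit of a Minnesota-type prior, for a VAR(1) with unit residual variances; the prior center, the scalar shrinkage parameter, and the simulated design are illustrative, and the paper's optimal shrinkage choice is not implemented:

```python
import numpy as np

rng = np.random.default_rng(8)

def bvar_posterior_mean(Y, lam):
    """Posterior mean of a VAR(1) under a normal prior centered on
    univariate AR(1) dynamics (own-lag coefficient 0.9, zero cross-lags);
    lam is the prior std dev: lam -> 0 shrinks fully to the AR prior,
    lam -> inf recovers equation-by-equation OLS."""
    X, Z = Y[1:], Y[:-1]
    n = Y.shape[1]
    B0 = 0.9 * np.eye(n)                       # prior mean: univariate ARs
    A = Z.T @ Z + np.eye(n) / lam**2           # assumes unit residual variance
    B = np.linalg.solve(A, Z.T @ X + B0 / lam**2)
    return B.T                                 # X_t = B X_{t-1} + e_t

# Simulated 3-variable system with weak cross-dynamics.
n, T = 3, 120
Btrue = 0.85 * np.eye(n) + 0.03
Y = np.zeros((T, n))
for t in range(1, T):
    Y[t] = Y[t - 1] @ Btrue.T + rng.normal(size=n)
print(np.round(bvar_posterior_mean(Y, lam=0.2), 2))
```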
By: | Aedin Doris (Economics, Finance and Accounting, National University of Ireland, Maynooth); Donal O'Neill (Economics, Finance and Accounting, National University of Ireland); Olive Sweetman (Economics, Finance and Accounting, National University of Ireland)
Abstract: | In this paper we study the performance of the GMM estimator in the context of the covariance structure of earnings. Using analytical and Monte Carlo techniques we examine the sensitivity of parameter identification to key features such as panel length, sample size, the degree of persistence of earnings shocks and the evolution of inequality over time. We show that the interaction of transitory persistence with the time pattern of inequality determines identification in these models and offer some practical recommendations that follow from our findings. |
JEL: | J31 D31 |
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:may:mayecw:n208-10.pdf&r=ecm |
By: | Grant Hillier (Institute for Fiscal Studies and University of Southampton); Federico Martellosio |
Abstract: | The cumulants of the quadratic forms associated with the so-called spatial design matrices are often needed for inference in the context of isotropic processes on uniform grids. Unfortunately, because the eigenvalues of the matrices involved are generally unknown, the computation of the cumulants may be very demanding if the grids are large. This paper constructs circular counterparts, with known eigenvalues, to the spatial design matrices. It then studies some of their properties, and analyzes their performance in a number of applications.
Date: | 2010–03 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:06/10&r=ecm |
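The computational payoff of circular counterparts rests on a standard fact: the eigenvalues of a circulant matrix are the discrete Fourier transform of its first row, and the cumulants of the associated Gaussian quadratic form follow as kappa_r = 2^(r-1) (r-1)! sum_i lambda_i^r. A sketch of this generic fact, not of the paper's specific matrices:

```python
import math
import numpy as np

def circulant(first_row):
    """Build the circulant matrix whose j-th row is first_row rolled by j."""
    c = np.asarray(first_row)
    return np.array([np.roll(c, j) for j in range(len(c))])

# Eigenvalues of a circulant matrix are the DFT of its first row.
c = np.array([2.0, -1.0, 0.0, 0.0, -1.0])   # symmetric: a circular difference op
lam_fft = np.fft.fft(c).real                # real because c is symmetric
lam_dir = np.linalg.eigvalsh(circulant(c))  # direct computation for comparison
print(np.allclose(np.sort(lam_fft), lam_dir))          # True

# Cumulants of Q = x'Ax with x ~ N(0, I) follow directly from the spectrum:
# kappa_r = 2^(r-1) * (r-1)! * sum_i lambda_i^r.
for r in (1, 2, 3):
    print(r, 2 ** (r - 1) * math.factorial(r - 1) * np.sum(lam_fft ** r))
```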
By: | Abidoye, Babatunde; Herriges, Joseph A.; Tobias, Justin |
Abstract: | Random Utility Maximization (RUM) models of recreation demand are typically plagued by limited information on environmental and other attributes characterizing the available sites in the choice set. To the extent that these unobserved site attributes are correlated with the observed characteristics and/or the key travel cost variable, the resulting parameter estimates and subsequent welfare calculations are likely to be biased. In this paper we develop a Bayesian approach to estimating a RUM model that incorporates a full set of alternative specific constants, insulating the key travel cost parameter from the influence of the unobserved site attributes. In contrast to estimation procedures recently outlined in Murdock (2006), the posterior simulator we propose (combining data augmentation and Gibbs sampling techniques) can be used in the more general mixed logit framework in which some parameters of the conditional utility function are random. Following a series of generated data experiments to illustrate the performance of the simulator, we apply the estimation procedures to data from the Iowa Lakes Project. In contrast to an earlier study using the same data (Egan et al.), we find that, with the addition of a full set of alternative specific constants, water quality attributes no longer appear to influence the choice of where to recreate.
Keywords: | nonmarket valuation; water quality; discrete choice |
JEL: | C25 Q25 Q51 |
Date: | 2010–05–31 |
URL: | http://d.repec.org/n?u=RePEc:isu:genres:31559&r=ecm |
By: | Calzolari, Giorgio; Di Pino, Antonino |
Abstract: | We consider a simultaneous equation model with two endogenous limited dependent variables (individual wage and reservation wage) characterized by a selection mechanism that determines endogenous switching between two regimes. We extend the FIML procedure proposed by Poirier and Ruud (1981) for a single-equation switching model, providing a stochastic specification for both equations and for the selection criterion. An accurate Monte Carlo experiment shows that the efficiency gain of the FIML estimator relative to the two-stage procedure is remarkably high in the presence of a high degree of endogeneity in the selection equation.
Keywords: | Selection bias; endogenous switching |
JEL: | C34 C31 |
Date: | 2009–09–23 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:22984&r=ecm |
By: | Alessandro Andreoli; Francesco Caravenna; Paolo Dai Pra; Gustavo Posta |
Abstract: | We propose a simple stochastic model for time series which is analytically tractable, easy to simulate and which captures some relevant stylized facts of financial indexes, including scaling properties. We show that the model fits the Dow Jones Industrial Average time series in the period 1935-2009 with a remarkable accuracy. Despite its simplicity, the model has several interesting features. The volatility is not constant and displays high peaks. The empirical distribution of the log-returns (increments of the logarithm of the index) is non-Gaussian and may exhibit heavy tails. Log-returns corresponding to disjoint time intervals are uncorrelated but not independent: the correlation of their absolute values decays exponentially fast in the distance between the time intervals for large distances, while it has a slower decay for moderate distances. Finally, the distribution of the log-returns obeys scaling relations that are detected on real time series, but are not satisfied by most available models. |
Date: | 2010–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1006.0155&r=ecm |
By: | ROMBOUTS, Jeroen J. K (Institute of Applied Economics at HEC Montréal, CIRANO, CIRPEE, Montréal (QC), Canada; Université catholique de Louvain, CORE, B-1348 Louvain-la-Neuve, Belgium); STENTOFT, Lars (Department of Finance at HEC Montréal, CIRANO, CIRPEE, CREATES, Montréal (QC), Canada)
Abstract: | In recent years multivariate models for asset returns have received much attention; this is particularly the case for models with time-varying volatility. In this paper we consider models of this class and examine their potential when it comes to option pricing. Specifically, we derive the risk-neutral dynamics for a general class of multivariate heteroskedastic models, and we provide a feasible way to price options in this framework. Our framework can be used irrespective of the assumed underlying distribution and dynamics, and it nests several important special cases. We provide an application to options on the minimum of two indices. Our results show that not only is correlation important for these options but so is allowing this correlation to be dynamic. Moreover, we show that for the general model exposure to correlation risk carries an important premium, and when this is neglected option prices are estimated with errors. Finally, we show that when neglecting the non-Gaussian features of the data, option prices are also estimated with large errors.
Keywords: | multivariate risk premia, option pricing, GARCH models |
JEL: | C11 C15 C22 G13 |
Date: | 2010–05–01 |
URL: | http://d.repec.org/n?u=RePEc:cor:louvco:2010020&r=ecm |
By: | Melvin. J. Hinich; Phillip Wild; John Foster (School of Economics, The University of Queensland) |
Abstract: | In this article, we present two nonparametric trispectrum-based tests for testing the hypothesis that an observed time series was generated by what we call a generalized Wiener process (GWP). Assuming the existence of a Wiener process for asset rates of return is critical to the Black-Scholes model and its extension by Merton (BSM). The Hinich trispectrum-based test of linearity and the trispectrum extension of the Hinich-Rothman bispectrum test for time reversibility are used to test the validity of BSM. We apply the tests to a selection of high frequency NYSE and Australian (ASX) stocks.
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:qld:uq2004:408&r=ecm |
By: | DECANCQ, Koen (Katholieke Universiteit Leuven, CES, B-3000 Leuven, Belgium; Université catholique de Louvain, CORE, B-1348 Louvain-la-Neuve, Belgium)
Abstract: | In this paper I investigate the problem of defining a multivariate dependence ordering. First, I provide a characterization of the concordance dependence ordering between multivariate random vectors with fixed margins. Central to the characterization is a multivariate generalization of a well-known bivariate elementary dependence increasing rearrangement. Second, to order multivariate random vectors with non-fixed margins, I impose a scale invariance principle which leads to a copula-based concordance dependence ordering. Finally, a wide family of copula-based measures of dependence is characterized, to which Spearman’s rank correlation coefficient belongs.
Keywords: | copula, concordance ordering, dependence measures, dependence orderings, multivariate stochastic dominance, supermodular ordering |
JEL: | C14 |
Date: | 2010–03–01 |
URL: | http://d.repec.org/n?u=RePEc:cor:louvco:2010012&r=ecm |
By: | Evans, Keith; Herriges, Joseph A. |
Abstract: | A commonly observed feature of visitation data, elicited via a survey instrument, is a greater propensity for individuals to report trip numbers that are multiples of 5, relative to other possible integers (such as 3 or 6). One explanation of this phenomenon is that some survey respondents have difficulty recalling the exact number of trips taken and instead choose to round their responses. This paper examines the impact that rounding can have on the estimated demand for recreation and the bias that it may induce on subsequent welfare estimates. We propose the use of a latent class structure in which respondents are assumed to be members of either a nonrounding or a rounding class. A series of generated data experiments are provided to illustrate the range of possible impacts that ignoring rounding can have on the estimated parameters of the model and on the welfare implications of site closure. The results suggest that biases can be substantial, particularly when the unconditional mean number of trips is in the range from two to four. An illustrative application is provided using visitation data for Saylorville Lake in central Iowa.
Keywords: | recreation demand; count data; rounding |
JEL: | C25 Q51 |
Date: | 2010–06–02 |
URL: | http://d.repec.org/n?u=RePEc:isu:genres:31594&r=ecm |
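A minimal sketch of the latent class idea, assuming Poisson counts, a rounding class that reports the nearest multiple of 5, and a constant class probability; the paper's actual count-data demand specification is considerably richer:

```python
import numpy as np
from scipy.stats import poisson
from scipy.optimize import minimize

def neg_loglik(params, y):
    """Two classes: with prob (1 - p) the Poisson count is reported
    exactly; with prob p it is rounded to the nearest multiple of 5."""
    mu = np.exp(params[0])                    # Poisson mean, kept positive
    p = 1.0 / (1.0 + np.exp(-params[1]))      # rounding probability in (0, 1)
    ll = 0.0
    for yi in y:
        exact = poisson.pmf(yi, mu)
        if yi % 5 == 0:                       # counts yi-2 .. yi+2 round to yi
            lower = poisson.cdf(yi - 3, mu) if yi >= 3 else 0.0
            rounded = poisson.cdf(yi + 2, mu) - lower
        else:
            rounded = 0.0                     # rounders never report these
        ll += np.log((1 - p) * exact + p * rounded + 1e-300)
    return -ll

rng = np.random.default_rng(7)
latent = rng.poisson(3.0, 1000)
rounder = rng.random(1000) < 0.4
y = np.where(rounder, np.round(latent / 5) * 5, latent).astype(int)

res = minimize(neg_loglik, x0=[1.0, 0.0], args=(y,), method="Nelder-Mead")
print(np.exp(res.x[0]), 1.0 / (1.0 + np.exp(-res.x[1])))   # recover mu and p
```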