New Economics Papers on Econometrics
By: | Min Seong Kim (Department of Economics, Ryerson University, Toronto, Canada); Yixiao Sun (Department of Economics, UC San Diego) |
Abstract: | The paper develops an asymptotically valid F test that is robust to spatial autocorrelation in a GMM framework. The test is based on the class of series covariance matrix estimators and fixed-smoothing asymptotics. The fixed-smoothing asymptotics and F approximation are established under mild sufficient conditions for a central limit theorem. These conditions can accommodate a wide range of spatial processes. This is in contrast with the standard arguments, which often impose very restrictive assumptions so that a functional central limit theorem holds. The proposed F test is very easy to implement, as critical values are from a standard F distribution. To a great extent, the asymptotic F test achieves triple robustness: it is asymptotically valid regardless of the spatial autocorrelation, the sampling region, and the limiting behavior of the smoothing parameter. Simulation shows that the F test is more accurate in size than the conventional chi-square tests, and it has the same size accuracy and power properties as nonstandard tests that require computationally intensive simulation or bootstrap. |
Keywords: | F distribution, Fixed-smoothing asymptotics, Heteroskedasticity and Autocorrelation Robust, Robust Standard Error, Series Method, Spatial Analysis, Spatial Autocorrelation. |
JEL: | C12 C14 C18 C31 |
Date: | 2012–06 |
URL: | http://d.repec.org/n?u=RePEc:rye:wpaper:wp032&r=ecm |
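To make the series approach concrete, here is a minimal time-series sketch of the idea; the paper's setting is spatial, where the basis functions live on the sampling region, so treat this strictly as an illustration. The function names, the cosine basis, and the default K are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy import stats

def series_lrv(u, K=8):
    """Series (orthonormal-basis) long-run variance estimator: project the
    demeaned moment process onto K cosine basis functions and average the
    outer products of the projections."""
    n, p = u.shape
    r = (np.arange(1, n + 1) - 0.5) / n
    basis = np.sqrt(2) * np.cos(np.pi * np.outer(np.arange(1, K + 1), r))  # K x n
    proj = basis @ (u - u.mean(0)) / np.sqrt(n)   # K x p basis projections
    return proj.T @ proj / K                       # p x p LRV estimate

def f_test(theta_hat, theta0, u, K=8):
    """Wald statistic with the fixed-smoothing correction
    F = W * (K - p + 1) / (K * p), compared with an F(p, K - p + 1) law."""
    n, p = u.shape
    d = np.asarray(theta_hat) - np.asarray(theta0)
    w = n * d @ np.linalg.solve(series_lrv(u, K), d)
    f_stat = w * (K - p + 1) / (K * p)
    return f_stat, stats.f.sf(f_stat, p, K - p + 1)
```

The appeal described in the abstract is visible here: once K is chosen, the critical value comes from a standard F table rather than from simulation.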
By: | Dinghai Xu (Department of Economics, University of Waterloo) |
Abstract: | This paper develops a simple alternative estimation method for GARCH models based on the empirical characteristic function. A set of Monte Carlo experiments is carried out to assess the performance of the proposed estimator. The results reveal that the proposed estimator has good finite sample properties and is comparable to the conventional maximum likelihood estimator. The method is applied to foreign exchange data for empirical illustration. |
JEL: | C01 C58 |
Date: | 2012–05 |
URL: | http://d.repec.org/n?u=RePEc:wat:wpaper:1204&r=ecm |
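The empirical characteristic function (ECF) principle can be illustrated on a toy model: match the ECF of the data to a parametric characteristic function over a grid of arguments. The sketch below fits a simple normal location-scale model, not the paper's GARCH implementation; the objective, weight function, and grid are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def ecf(x, t):
    """Empirical characteristic function of the sample x at the points t."""
    return np.exp(1j * np.outer(t, x)).mean(axis=1)

def ecf_objective(params, x, t):
    """Weighted squared distance between the ECF and the characteristic
    function of a N(mu, sigma^2) model (purely illustrative target)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    model_cf = np.exp(1j * t * mu - 0.5 * (sigma * t) ** 2)
    w = np.exp(-t ** 2)          # exponential weight downplays large |t|
    return np.sum(w * np.abs(ecf(x, t) - model_cf) ** 2)

rng = np.random.default_rng(0)
x = rng.normal(1.0, 2.0, size=2000)
t = np.linspace(0.05, 3.0, 60)
res = minimize(ecf_objective, x0=[0.0, 0.0], args=(x, t), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```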
By: | Pierre Chausse (Department of Economics, University of Waterloo); Dinghai Xu (Department of Economics, University of Waterloo) |
Abstract: | This paper investigates alternative generalized method of moments (GMM) estimation procedures for a stochastic volatility model with realized volatility measures. The extended model can accommodate a more general correlation structure. General closed-form moment conditions are derived to examine the model properties and to evaluate the performance of various GMM estimation procedures in a Monte Carlo environment, including standard GMM, principal component GMM, robust GMM and regularized GMM. An application to five company stocks and one stock index is also provided as an empirical demonstration. |
JEL: | G17 G32 C58 C01 |
Date: | 2012–05 |
URL: | http://d.repec.org/n?u=RePEc:wat:wpaper:1203&r=ecm |
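The closed-form moment conditions are model-specific, but the generic two-step GMM machinery they feed into is short enough to sketch. Here `g` is assumed to return an (n x q) matrix of per-observation moment contributions; the optimizer choice is also an assumption.

```python
import numpy as np
from scipy.optimize import minimize

def two_step_gmm(g, theta0):
    """Generic two-step GMM. Step 1 uses the identity weighting matrix;
    step 2 weights by the inverse of the moment covariance from step 1."""
    def objective(theta, W):
        m = g(theta).mean(axis=0)      # q-vector of sample moments
        return m @ W @ m
    q = g(np.asarray(theta0)).shape[1]
    step1 = minimize(objective, theta0, args=(np.eye(q),), method="Nelder-Mead")
    S = np.cov(g(step1.x), rowvar=False)   # q x q covariance of the moments
    step2 = minimize(objective, step1.x, args=(np.linalg.inv(S),),
                     method="Nelder-Mead")
    return step2.x
```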
By: | Kaufmann, Hendrik; Kruse, Robinson; Sibbertsen, Philipp |
Abstract: | A simple procedure for specifying the transition function that describes the regime switch in nonlinear autoregressive models is proposed. The procedure is based on auxiliary regressions of unit root tests and is applicable to a variety of transition functions. In contrast to other procedures, complicated and computationally intensive estimation of the candidate models is not necessary; our approach relies entirely on OLS estimation of auxiliary regressions. We use standard information criteria to select the unknown transition function. Our Monte Carlo simulations reveal that the approach works well in practice. Empirical applications to the S&P 500 price-earnings ratio and the US interest rate spread highlight the merits of the suggested procedure (see the sketch below). |
Keywords: | Nonlinearity, Smooth transition, Threshold model, Model selection, Unit root |
JEL: | C15 C22 C52 |
Date: | 2012–07 |
URL: | http://d.repec.org/n?u=RePEc:han:dpaper:dp-500&r=ecm |
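A rough sketch of the selection logic under stated assumptions: the polynomial regressors stand in for Taylor expansions of candidate transition functions, in the spirit of nonlinear unit root tests, and BIC picks among them. The specific designs below are illustrative, not the authors' exact auxiliary regressions.

```python
import numpy as np

def bic(resid, k):
    n = resid.size
    return n * np.log(resid @ resid / n) + k * np.log(n)

def select_transition(y):
    """OLS auxiliary regressions of dy on polynomial terms that proxy
    different transition shapes; the smallest BIC wins."""
    dy, ylag = np.diff(y), y[:-1]
    designs = {
        "ESTAR-type (odd power)": np.column_stack([ylag ** 3]),
        "LSTAR-type (even and odd powers)": np.column_stack([ylag ** 2, ylag ** 3]),
    }
    scores = {}
    for name, X in designs.items():
        beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
        scores[name] = bic(dy - X @ beta, X.shape[1])
    return min(scores, key=scores.get), scores
```

Note that every candidate is evaluated by OLS only, which is the computational point the abstract emphasizes.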
By: | Harvey, A.; Sucarrat, G. |
Abstract: | An EGARCH model in which the conditional distribution is heavy-tailed and skewed is proposed. The properties of the model, including unconditional moments, autocorrelations and the asymptotic distribution of the maximum likelihood estimator, are obtained. Evidence for skewness in the conditional t-distribution is found for a range of returns series, and the model is shown to give a better fit than the corresponding skewed-t GARCH model. |
Keywords: | General error distribution; heteroskedasticity; leverage; score; Student's t; two components |
JEL: | C22 G17 |
Date: | 2012–08–17 |
URL: | http://d.repec.org/n?u=RePEc:cam:camdae:1236&r=ecm |
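A score-driven EGARCH recursion in the spirit of this model class: the log-scale is updated by the bounded score of a Student-t density, so extreme returns are discounted rather than amplified. Parameter names and the initialization are assumptions, and the skewness and leverage terms of the paper are omitted for brevity.

```python
import numpy as np

def t_score_egarch_filter(y, omega, phi, kappa, nu):
    """Filter the log-scale lambda_t of a Beta-t-EGARCH-style model:
    lambda_{t+1} = omega + phi * lambda_t + kappa * u_t, where u_t is the
    (bounded) conditional score of a Student-t density."""
    lam = np.empty(y.size)
    lam[0] = omega / (1 - phi)          # unconditional mean as start value
    for t in range(y.size - 1):
        b = y[t] ** 2 / (y[t] ** 2 + nu * np.exp(2 * lam[t]))   # in [0, 1)
        u = (nu + 1) * b - 1            # bounded score: robust to outliers
        lam[t + 1] = omega + phi * lam[t] + kappa * u
    return lam
```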
By: | Bontempi, Maria Elena; Mammi, Irene |
Abstract: | The problem of instrument proliferation and its consequences (overfitting of the endogenous variables, biased estimates, weakening of the Sargan/Hansen test) are well known. The literature provides little guidance on how many instruments are too many. It is common practice to report the instrument count and to test the sensitivity of results to the use of more or fewer instruments. Strategies to alleviate the instrument proliferation problem are lag-depth truncation and/or collapsing of the instrument set (the latter being a horizontal squeezing of the instrument matrix). However, such strategies involve either a degree of arbitrariness (depending on the ability and experience of the researcher) or trust in the untestable restrictions implicitly imposed on the instrument matrix. The aim of this paper is to introduce a new strategy to reduce the instrument count. The technique we propose is statistically founded and purely data-driven and, as such, can be considered a benchmark solution to the problem of instrument proliferation. We apply principal component analysis (PCA) to the instrument matrix and exploit the PCA scores as the instrument set for panel generalized method-of-moments (GMM) estimation. Through extensive Monte Carlo simulations, under alternative degrees of persistence of the endogenous variables, we compare the performance of the Difference, Level and System GMM estimators when lag truncation, collapsing and our principal component-based IV reduction (henceforth PCIVR) are applied to the instrument set. The same comparison is carried out in two empirical applications on real data: the first replicates the estimates of Blundell and Bond [1998]; the second exploits a new and large panel dataset to assess the role of tangible and intangible capital in productivity. Results show that PCIVR is a promising instrument-reduction strategy. |
Keywords: | Panel data; generalized method of moments; proliferation of instruments; principal component analysis; persistence |
JEL: | C13 C15 C33 D24 |
Date: | 2012–08–16 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:40720&r=ecm |
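The core of the proposed reduction is easy to sketch. The paper works with Difference/System panel GMM; this cross-sectional 2SLS version only illustrates the step of replacing a large instrument matrix by its leading principal-component scores, and the function name is hypothetical.

```python
import numpy as np

def pca_iv_2sls(y, X, Z, n_components):
    """Replace the instrument matrix Z by its leading principal component
    scores, then run standard 2SLS with the scores as instruments."""
    Zc = Z - Z.mean(0)
    U, s, Vt = np.linalg.svd(Zc / Zc.std(0), full_matrices=False)
    F = U[:, :n_components] * s[:n_components]        # n x k score matrix
    # 2SLS: project X onto the scores, then regress y on the fitted values
    X_hat = F @ np.linalg.lstsq(F, X, rcond=None)[0]
    beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]
    return beta
```

The instrument count is now `n_components`, chosen from the PCA spectrum rather than by ad hoc truncation or collapsing.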
By: | Helmut Lütkepohl |
Abstract: | The shocks in structural vector autoregressions are identified only up to sign, and this feature can result in very misleading confidence intervals for impulse responses when simulation methods such as Bayesian or bootstrap methods are used. The confidence intervals depend heavily on which variable is used for fixing the sign of the initial responses; the problem can be severe in particular when the shocks are identified via long-run restrictions. A suitable choice of the variable used for fixing the sign of the initial responses can substantially reduce the width of the confidence bands for impulse responses. |
Keywords: | Vector autoregressive process, impulse responses, bootstrap, Bayesian estimation |
JEL: | C32 |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1235&r=ecm |
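A minimal sketch of the normalization at issue: inside each bootstrap (or posterior) draw, the columns of the impact matrix are flipped so that a chosen variable's initial response is non-negative. The function name is hypothetical; the point of the paper is that interval widths can change markedly with `var_index`.

```python
import numpy as np

def fix_signs(B, var_index):
    """Normalize the columns of an impact matrix B so that the initial
    response of variable `var_index` to each shock is non-negative."""
    signs = np.sign(B[var_index, :])
    signs[signs == 0] = 1.0        # leave exactly-zero responses alone
    return B * signs               # flips each column with a negative response
```

In a bootstrap loop one would apply `fix_signs` to every draw of the impact matrix before accumulating impulse responses and forming percentile intervals.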
By: | John Knight (Department of Economics, University of Western Ontario); Stephen Satchell (Trinity College, Cambridge; Department of Finance, University of Sydney); Jessica Zhang (University of Greenwich) |
Abstract: | We examine a popular practitioner methodology used in the construction of linear factor models whereby particular factors are increased or decreased in relative importance within the model. This allows model builders to customise models so as to reflect the factors that the client/modeller considers important. We call this process Pragmatic Bayesianism (or prag-Bayes for short) and provide analysis showing when such a procedure is likely to be successful. |
Keywords: | linear factor models, Bayesian statistics, sequential regression |
JEL: | C13 C22 |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:bbk:bbkefp:1213&r=ecm |
By: | Dagsvik, John K. (Research Department, Statistics Norway and the Frisch Centre for Economic Research) |
Abstract: | This paper discusses how the specification of probabilistic models for multistate duration data generated by individual choices should be justified on a priori theoretical grounds. Preferences are assumed to be represented by random utilities, where the utilities are viewed as random also to the agent himself. First, the paper proposes a characterization of exogenous preferences (that is, the special case with no state-dependence effects). The main assumption asserts that when preferences are exogenous, current and future indirect utilities are uncorrelated with current and past choices, given unobservables that are perfectly known to the agent. It is demonstrated that under rather weak and general regularity conditions this characterization yields an explicit structure of the utility function as a so-called Extremal stochastic process. Furthermore, from this utility representation it follows that the choice process is a Markov chain (in continuous or discrete time), with a particular functional form of the transition probabilities as explicit functions of the parameters of the utility function and the choice set. Subsequently, we show how the model can be extended to allow for structural state-dependence effects and how such effects can be identified. Moreover, it is discussed how a version of Chamberlain's conditional estimation method applies in the presence of fixed effects. Finally, we discuss two examples of applications. |
Keywords: | Duration models; Random utility models; Habit persistence; True state dependence; Extremal process; Markov chain |
JEL: | C23 C25 C41 C51 D01 |
Date: | 2012–05–20 |
URL: | http://d.repec.org/n?u=RePEc:hhs:osloec:2012_017&r=ecm |
By: | Andrew Chesher (Institute for Fiscal Studies and University College London); Adam Rosen (Institute for Fiscal Studies and University College London) |
Abstract: | This paper studies simultaneous equations models for two or more discrete outcomes. These models may be incoherent, delivering no values of the outcomes at certain values of the latent variables and covariates, and they may be incomplete, delivering more than one value of the outcomes at certain values of the covariates and latent variables. We revisit previous approaches to the problems of incompleteness and incoherence in such models and propose a new approach for dealing with them. For each approach, we use random set theory to characterize sharp identification regions for the marginal distribution of latent variables and the structural function relating outcomes to covariates, illustrating the relative identifying power and tradeoffs of the different approaches. We show that these identified sets are characterized by systems of conditional moment equalities and inequalities, and we provide a generically applicable algorithm for constructing them. We demonstrate these results for the simultaneous equations model for binary outcomes studied in, for example, Heckman (1978) and Tamer (2003), and for the triangular model with a discrete endogenous variable studied in Chesher (2005) and Jun, Pinkse, and Xu (2011), as illustrative examples. |
JEL: | C10 C14 C50 C51 |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:21/12&r=ecm |
By: | Joel Horowitz (Institute for Fiscal Studies and Northwestern University) |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:20/12&r=ecm |
By: | Tziogkidis, Panagiotis |
Abstract: | Bootstrapping non-parametric models is a fairly complicated exercise associated with implicit assumptions or requirements that are not always obvious to the non-expert user. Bootstrap DEA is a significant development of the past decade; however, some of its assumptions and properties are still quite unclear, which may lead to mistakes in implementation and hypothesis testing. This paper clarifies these issues, proposes a hypothesis-testing procedure, and discusses its limitations; the procedure could be extended to test almost any hypothesis in bootstrap DEA. Moreover, the paper enhances the intuition behind bootstrap DEA and highlights logical and theoretical pitfalls that should be avoided. |
Keywords: | Data Envelopment Analysis; Efficiency; Bootstrap; Bootstrap DEA; Hypothesis Testing |
JEL: | C12 C14 C15 C61 C67 |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:cdf:wpaper:2012/18&r=ecm |
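For orientation, here is a minimal input-oriented CCR DEA efficiency score computed by linear programming; bootstrap DEA then resamples (with smoothing) from the estimated efficiencies and recomputes such scores. The envelopment formulation is standard, but the function itself is only a sketch.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, j):
    """Input-oriented CCR efficiency of unit j:
    minimize theta subject to X'lam <= theta * x_j, Y'lam >= y_j, lam >= 0,
    where X is (units x inputs) and Y is (units x outputs)."""
    n, m = X.shape
    _, s = Y.shape
    c = np.r_[1.0, np.zeros(n)]                      # objective: theta
    A_ub = np.block([[-X[j][:, None], X.T],          # input envelopment rows
                     [np.zeros((s, 1)), -Y.T]])      # output envelopment rows
    b_ub = np.r_[np.zeros(m), -Y[j]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub)           # default bounds: vars >= 0
    return res.x[0]                                  # theta* in (0, 1]
```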
By: | Maican, Florin G. (Department of Economics, School of Business, Economics and Law, Göteborg University); Sweeney, Richard J. (McDonough School of Business, Georgetown University) |
Abstract: | This paper examines power issues for the ADF test and four break models (Perron 1989, Zivot and Andrews 1992) when the DGP corresponds to one of the break models. Choosing to test an incorrect break model can, but need not, greatly reduce the probability of rejecting the null. Break points that fall relatively early in the sample period substantially increase power. For modest shifts in time trends, simply including a time trend without a shift in the model preserves power, but not for large time-trend shifts. |
Keywords: | Unit root; Monte Carlo; Break models |
JEL: | C15 C22 C32 C33 E31 F31 |
Date: | 2012–08–27 |
URL: | http://d.repec.org/n?u=RePEc:hhs:gunwpe:0536&r=ecm |
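A compact Monte Carlo of the kind the paper runs: the rejection frequency of a no-break ADF regression when the DGP is stationary around a trend with a shift. All DGP constants (slopes, break fraction, noise scale) are arbitrary illustrative choices.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def power_under_trend_break(n=200, break_frac=0.3, shift=0.5, reps=500, seed=1):
    """Fraction of 5%-level ADF rejections (constant + trend, no break
    dummy) when the truth is trend-stationary with a slope shift."""
    rng = np.random.default_rng(seed)
    tb = int(break_frac * n)
    t = np.arange(n)
    rejections = 0
    for _ in range(reps):
        trend = 0.01 * t + shift * np.maximum(t - tb, 0) / n
        y = trend + rng.normal(size=n)      # stationary around a broken trend
        pval = adfuller(y, regression="ct", autolag="AIC")[1]
        rejections += pval < 0.05
    return rejections / reps
```

Varying `break_frac` in such a loop reproduces the flavor of the paper's finding that early breaks matter differently from late ones.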
By: | Jozef Barunik; Michaela Barunikova |
Abstract: | This paper revisits the fractional cointegrating relationship between ex-ante implied volatility and ex-post realized volatility. We argue that the concept of corridor implied volatility (CIV) should be used instead of the popular model-free option-implied volatility (MFIV) when assessing the fractional cointegrating relation, as the latter may bias the estimation. For the realized volatility, we use recently proposed estimators that are robust to noise as well as jumps and, interestingly, we find that this choice does not affect the implied-realized volatility relation. In addition, we develop a new tool for estimating the fractional cointegrating relation between implied and realized volatility based on wavelets: wavelet band least squares (WBLS). The main advantage of WBLS over other frequency-domain methods is that the properties of wavelets allow us to work conveniently with potentially non-stationary volatility. We study the dynamics of the relationship in the time-frequency domain with the wavelet coherence, confirming that the dependence comes solely from the lower frequencies of the spectra. Motivated by this result, we estimate the relationship on this part of the spectra only, using WBLS, and compare our results to the fully modified narrow-band least squares (FMNBLS) based on the Fourier frequencies. In the estimation we use S&P 500 and DAX monthly and bi-weekly option prices covering the recent financial crisis, and we conclude that in the long run, volatility inferred from option prices using corridor implied volatility provides an unbiased forecast of realized volatility. |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1208.4831&r=ecm |
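A crude stand-in for the band-limited regression idea: strip the detail (high-frequency) scales of both series with a discrete wavelet transform and run OLS on the remaining low-frequency parts. This is not the authors' WBLS estimator; the wavelet, decomposition level, and projection scheme are all assumptions.

```python
import numpy as np
import pywt

def low_frequency_ols(y, x, wavelet="db4", level=6):
    """Regress the coarse-scale component of y on that of x: a rough
    proxy for estimating a long-run relation on low frequencies only."""
    def smooth(z):
        coeffs = pywt.wavedec(z, wavelet, level=level)
        coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]   # kill detail scales
        return pywt.waverec(coeffs, wavelet)[: len(z)]
    yb, xb = smooth(np.asarray(y, float)), smooth(np.asarray(x, float))
    X = np.column_stack([np.ones_like(xb), xb])
    return np.linalg.lstsq(X, yb, rcond=None)[0]    # (intercept, slope)
```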
By: | Dirk G Baur (Finance Discipline Group, UTS Business School, University of Technology, Sydney) |
Abstract: | The copula function defines the degree of dependence and the structure of dependence. This paper proposes an alternative framework to decompose the dependence using quantile regression. It is demonstrated that the methodology provides a detailed picture of dependence including asymmetric and non-linear relationships. In addition, changes in the degree or structure of dependence can be modelled and tested for each quantile of the distribution. The empirical part applies the framework to three different sets of financial time-series and demonstrates substantial differences in dependence patterns among asset classes and through time. The analysis of 54 global equity markets shows that detailed information about the structure of dependence is crucial to adequately assess the benefits of diversification in normal times and crisis times. |
Keywords: | quantile regression; copula; dependence modelling; tail dependence; contagion; financial crises |
JEL: | C22 G14 |
Date: | 2012–08–01 |
URL: | http://d.repec.org/n?u=RePEc:uts:wpaper:170&r=ecm |
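The quantile-regression decomposition of dependence can be tried in a few lines with statsmodels: estimate the slope of one return series on another at several quantiles and inspect how it varies across the distribution (large differences between tail and median slopes point to asymmetric or tail dependence). The function name is hypothetical.

```python
import numpy as np
import statsmodels.api as sm

def quantile_dependence(x, y, quantiles=(0.05, 0.25, 0.5, 0.75, 0.95)):
    """Slope of y on x at several quantiles of the conditional
    distribution of y; variation across q reveals the dependence structure."""
    X = sm.add_constant(np.asarray(x, float))
    model = sm.QuantReg(np.asarray(y, float), X)
    return {q: model.fit(q=q).params[1] for q in quantiles}
```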
By: | Mehmke, Fabian; Cremers, Heinz; Packham, Natalie |
Abstract: | Market risk management is one of the key factors to success in managing financial institutions. Underestimated risk can have disastrous consequences for individual companies and even whole economies, as could be seen during the recent crises. Overestimated risk, on the other hand, may have negative effects on a company's capital requirements. Companies as well as national authorities thus have a strong interest in developing market risk models that correctly quantify key figures such as Value at Risk or Expected Shortfall. This paper presents several state-of-the-art methods to evaluate the adequacy of almost any given market risk model. Existing models are enhanced by in-depth analysis and simulations of statistical properties, revealing some previously unknown effects, most notably inconsistent behaviour of alpha and beta errors. Furthermore, some new market risk validation models are introduced. Finally, a simulation with various market patterns demonstrates strengths and weaknesses of each of the models presented under realistic conditions. |
Keywords: | Backtesting, Market Risk, Value at Risk, Expected Shortfall, Validation, Alpha Error, Beta Error, Time Until First Failure, Proportion of Failures, Traffic Light Approach, Magnitude of Loss Function, Markov Test, Gauss Test, Rosenblatt, Kuiper, Kolmogorov-Smirnov, Jarque-Bera, Regression, Likelihood Ratio, Truncated Distribution, Censored Distribution, Simulation |
JEL: | C01 C02 C12 C13 C14 C15 C32 G32 G38 |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:zbw:fsfmwp:192&r=ecm |
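One of the classic backtests in the keyword list, the Kupiec proportion-of-failures (POF) test, is short enough to sketch: a likelihood ratio checks whether the observed VaR violation rate matches the nominal rate alpha. This is the textbook test, not the paper's enhanced variants.

```python
import numpy as np
from scipy import stats

def kupiec_pof(violations, n, alpha=0.01):
    """Kupiec POF likelihood-ratio test: under the null that violations
    are Bernoulli(alpha), LR is asymptotically chi-square with 1 df."""
    x = int(violations)
    if x == 0:
        lr = -2.0 * n * np.log(1 - alpha)
    elif x == n:
        lr = -2.0 * n * np.log(alpha)
    else:
        p_hat = x / n
        lr = -2.0 * ((n - x) * np.log((1 - alpha) / (1 - p_hat))
                     + x * np.log(alpha / p_hat))
    return lr, stats.chi2.sf(lr, df=1)
```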
By: | Ying-Hui Shao (ECUST); Gao-Feng Gu (ECUST); Zhi-Qiang Jiang (ECUST); Wei-Xing Zhou (ECUST); Didier Sornette (ETH Zurich) |
Abstract: | Notwithstanding the significant efforts to develop estimators of long-range correlations (LRC) and to compare their performance, no clear consensus exists on what is the best method and under which conditions. In addition, synthetic tests suggest that the performance of LRC estimators varies when using different generators of LRC time series. Here, we compare the performances of four estimators [Fluctuation Analysis (FA), Detrended Fluctuation Analysis (DFA), Backward Detrending Moving Average (BDMA), and Centred Detrending Moving Average (CDMA)]. We use three different generators [Fractional Gaussian Noises, and two ways of generating Fractional Brownian Motions]. We find that CDMA has the best performance and DFA is only slightly worse in some situations, while FA performs the worst. In addition, CDMA and DFA are less sensitive to the scaling range than FA. Hence, CDMA and DFA remain "The Methods of Choice" in determining the Hurst index of time series. |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1208.4158&r=ecm |
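For reference, a plain-numpy DFA along the lines of the estimators compared above; the scale grid and the detrending polynomial order are illustrative defaults.

```python
import numpy as np

def dfa(x, scales=None, order=1):
    """Detrended fluctuation analysis: the slope of log F(s) versus log s
    estimates the Hurst index of the increments."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    n = y.size
    if scales is None:
        scales = np.unique(np.logspace(np.log10(8), np.log10(n // 4),
                                       20).astype(int))
    F = []
    for s in scales:
        nseg = n // s
        segs = y[: nseg * s].reshape(nseg, s)
        t = np.arange(s)
        resid = [seg - np.polyval(np.polyfit(t, seg, order), t) for seg in segs]
        F.append(np.sqrt(np.mean(np.concatenate(resid) ** 2)))
    hurst = np.polyfit(np.log(scales), np.log(np.asarray(F)), 1)[0]
    return hurst, scales, np.asarray(F)
```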
By: | Carlos León; Karen Leiton; Alejandro Reveiz |
Abstract: | Financial basics and intuition stress the importance of the investment horizon for risk management and asset allocation. However, the beta parameter of the Capital Asset Pricing Model (CAPM) is invariant to the holding period. This contradiction is due to the assumption of long-term independence of financial returns, an assumption that has been proven erroneous. Following concerns regarding the impact of the long-term dependence assumption on risk (Holton, 1992), this paper quantifies and fixes the CAPM's bias resulting from this abiding but flawed assumption. The proposed procedure is based on Greene and Fielitz's (1980) seminal work on the application of fractional Brownian motion to the CAPM, and on a revised technique for estimating a time series' fractal dimension with the Hurst exponent (León and Vivas, 2010; León and Reveiz, 2011a). Using a set of 85 stocks from the S&P100, this paper finds that relaxing the long-term independence assumption results in significantly different estimations of beta. According to three tests herein implemented at the 99% confidence level, more than 60% of the stocks exhibit significantly different beta parameters. Hence, expected returns are biased; on average, the bias is about ±60bps for a contemporary one-year investment horizon. Thus, as emphasized by Holton (1992), risk is a two-dimensional quantity, with the holding period almost as important as the asset class. The procedure herein proposed is valuable since it parsimoniously achieves an investment |
Keywords: | CAPM, Hurst exponent, long-term dependence, fractional Brownian motion, asset allocation, investment horizon |
JEL: | G12 G14 G32 G20 C14 |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:bdr:borrec:730&r=ecm |
By: | Peter Huber (WIFO); Harald Oberhofer; Michael Pfaffermayr (WIFO) |
Abstract: | This paper shows that applying simple employment-weighted OLS estimation to Davis-Haltiwanger-Schuh (1996) firm-level job creation rates taking the values 2 and -2 for entering and exiting firms, respectively, provides biased and inconsistent parameter estimates. Consequently, we argue that entries and exits should be analysed separately and propose an alternative, consistent estimation procedure assuming that the size of continuing firms follows a lognormal distribution. A small-scale Monte Carlo analysis confirms the analytical results. Using a sample of Austrian firms, we demonstrate that the impact of small firms on net job creation is substantially underestimated when applying employment-weighted OLS estimation. |
Keywords: | Job creation, DHS growth rate, firm size, firm age, Monte Carlo simulation |
Date: | 2012–08–27 |
URL: | http://d.repec.org/n?u=RePEc:wfo:wpaper:y:2012:i:435&r=ecm |
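The DHS rate and the weighted regression under scrutiny are one-liners, which makes the ±2 endpoint mass for entrants and exiters easy to see. `weighted_ols` sketches the common practice whose bias the paper documents, not the authors' proposed alternative estimator.

```python
import numpy as np

def dhs_growth(e_now, e_prev):
    """Davis-Haltiwanger-Schuh job creation rate:
    g = 2 (e_t - e_{t-1}) / (e_t + e_{t-1}); equals +2 for entrants
    (e_prev = 0) and -2 for exiters (e_now = 0)."""
    e_now, e_prev = np.asarray(e_now, float), np.asarray(e_prev, float)
    return 2.0 * (e_now - e_prev) / (e_now + e_prev)

def weighted_ols(g, X, e_now, e_prev):
    """Employment-weighted OLS of DHS rates on firm characteristics,
    with the usual weights 0.5 * (e_t + e_{t-1})."""
    w = np.sqrt(0.5 * (np.asarray(e_now, float) + np.asarray(e_prev, float)))
    return np.linalg.lstsq(X * w[:, None], g * w, rcond=None)[0]
```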
By: | Kleijnen, Jack P.C.; Mehdad, E.; Beers, W.C.M. van (Tilburg University, Center for Economic Research) |
Abstract: | Distribution-free bootstrapping of the replicated responses of a given discrete-event simulation model gives bootstrapped Kriging (Gaussian process) metamodels; we require these metamodels to be either convex or monotonic. To illustrate monotonic Kriging, we use an M/M/1 queueing simulation with the traffic rate as input and either the mean or the 90% quantile of the transient-state waiting times as output. In this example, monotonic bootstrapped Kriging enables better sensitivity analysis than classic Kriging; i.e., bootstrapping gives lower MSE and confidence intervals with higher coverage and the same length. To illustrate convex Kriging, we start with simulation optimization of an (s, S) inventory model, but we next switch to a Monte Carlo experiment with a second-order polynomial inspired by this inventory simulation. We could not find truly convex Kriging metamodels, either classic or bootstrapped; nevertheless, our bootstrapped "nearly convex" Kriging does give a confidence interval for the optimal input combination. |
Keywords: | Distribution-free bootstrapping; Gaussian process; random simulation; sensitivity analysis; optimization; confidence intervals |
JEL: | C0 C1 C9 C15 C44 |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:dgr:kubcen:2012066&r=ecm |
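A rough sketch of distribution-free bootstrapped Kriging with scikit-learn, under stated assumptions: resample the replicated outputs at each input point, refit a Gaussian-process metamodel per bootstrap sample, and read off pointwise percentile bands. The monotonicity/convexity acceptance step of the paper is omitted, and the kernel choice is an assumption.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def bootstrap_kriging(x, reps, n_boot=200, seed=0):
    """x: 1-D array of design points; reps: list of 1-D arrays of
    replicated simulation outputs per design point. Returns a prediction
    grid and pointwise 5%/95% bootstrap bands for the GP metamodel."""
    rng = np.random.default_rng(seed)
    X = x[:, None]
    grid = np.linspace(x.min(), x.max(), 50)[:, None]
    preds = []
    for _ in range(n_boot):
        # resample replicates with replacement at each design point
        yb = np.array([rng.choice(r, size=len(r)).mean() for r in reps])
        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                      normalize_y=True).fit(X, yb)
        preds.append(gp.predict(grid))
    return grid.ravel(), np.percentile(preds, [5, 95], axis=0)
```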
By: | Stefan Gerhold; Max Kleinert; Piet Porkert; Mykhaylo Shkolnikov |
Abstract: | We give conditions under which the normalized marginal distribution of a semimartingale converges to a Gaussian limit law as time tends to zero. In particular, our result is applicable to solutions of stochastic differential equations with locally bounded and continuous coefficients. The limit theorems are subsequently extended to functional central limit theorems on the process level. We present two applications of the results in mathematical finance: the pricing of at-the-money digital options with short maturities and short-time implied volatility skews. |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1208.4282&r=ecm |
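In the special case of a one-dimensional SDE with continuous, locally bounded coefficients, the small-time Gaussian limit the abstract refers to can be stated compactly; this is a sketch of that special case, not the paper's general semimartingale result.

```latex
% dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t, with b, sigma continuous and
% locally bounded; then the normalized increment is asymptotically Gaussian:
\[
  \frac{X_t - X_0}{\sqrt{t}} \;\xrightarrow{\,d\,}\;
  \mathcal{N}\bigl(0, \sigma(X_0)^2\bigr)
  \qquad \text{as } t \downarrow 0 .
\]
% Consequence for the digital-option application: if sigma(X_0) > 0, then
% P(X_t > X_0) -> 1/2, so the value of an at-the-money digital call tends
% to half the discounted payoff as the maturity shrinks.
```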
By: | Martha Banbura; Domenico Giannone; Michèle Modugno; Lucrezia Reichlin |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:eca:wpaper:2013/125192&r=ecm |