
on Econometrics 
By:  Stefano Favaro (University of Turin and Collegio Carlo Alberto); Antonio Lijoi (Department of Economics and Quantitative Methods, University of Pavia, and Collegio Carlo Alberto); Igor Prunster (University of Turin and Collegio Carlo Alberto) 
Abstract:  In Bayesian nonparametric inference, random discrete probability measures are commonly used as priors within hierarchical mixture models for density estimation and for inference on the clustering of the data. Recently it has been shown that they can also be exploited in species sampling problems: indeed they are natural tools for modeling the random proportions of species within a population thus allowing for inference on various quantities of statistical interest. For applications that involve large samples, the exact evaluation of the corresponding estimators becomes impracticable and, therefore, asymptotic approximations are sought. In the present paper we study the limiting behaviour of the number of new species to be observed from further sampling, conditional on observed data, assuming the observations are exchangeable and directed by a normalized generalized gamma process prior. Such an asymptotic study highlights a connection between the normalized generalized gamma process and the two–parameter Poisson–Dirichlet process that was previously known only in the unconditional case. 
Keywords:  Bayesian Nonparametrics; Species sampling models; Asymptotics; σ-diversity; Polynomially and exponentially tilted random variables; Completely random measures; Normalized generalized gamma process; Two-parameter Poisson–Dirichlet process. 
Date:  2011–05 
URL:  http://d.repec.org/n?u=RePEc:pav:wpaper:144&r=ecm 
By:  Rothe, Christoph (Toulouse School of Economics); Wied, Dominik (TU Dortmund) 
Abstract:  We propose a specification test for a wide range of parametric models for the conditional distribution function of an outcome variable given a vector of covariates. The test is based on the Cramér–von Mises distance between an unrestricted estimate of the joint distribution function of the data and a restricted estimate that imposes the structure implied by the model. The procedure is straightforward to implement, is consistent against fixed alternatives, has nontrivial power against local deviations of order n^{-1/2} from the null hypothesis, and does not require the choice of smoothing parameters. In an empirical application, we use our test to study the validity of various models for the conditional distribution of wages in the US. 
Keywords:  Cramér–von Mises distance, quantile regression, distributional regression, location-scale model, bootstrap, wage distribution 
JEL:  C12 C14 C31 C52 J31 
Date:  2012–02 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp6364&r=ecm 
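The Cramér–von Mises construction above can be sketched in a few lines. This is only an illustrative toy under a hypothetical Gaussian location model with made-up parameter values, not the authors' procedure, and it omits their bootstrap critical values:

```python
import numpy as np
from scipy.stats import norm

def cvm_distance(y, x, cond_cdf):
    """Cramér–von Mises-type distance between the empirical joint CDF of
    (y, x) and the joint CDF implied by a parametric model for F(y | x).
    cond_cdf(t, xs) must return the model conditional CDF at t given each x in xs."""
    n = len(y)
    # Unrestricted estimate: empirical joint CDF at each data point.
    F_hat = np.array([np.mean((y <= yi) & (x <= xi)) for yi, xi in zip(y, x)])
    # Restricted estimate: average the model conditional CDF over x_j <= x_i,
    # imposing the parametric structure in the y-direction.
    F_tilde = np.array([np.sum(cond_cdf(yi, x[x <= xi])) / n
                        for yi, xi in zip(y, x)])
    # Integrate the squared difference against the empirical measure.
    return n * np.mean((F_hat - F_tilde) ** 2)

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)  # true model: y | x ~ N(1 + 2x, 1)
stat_true = cvm_distance(y, x, lambda t, xs: norm.cdf(t - 1.0 - 2.0 * xs))
stat_wrong = cvm_distance(y, x, lambda t, xs: norm.cdf(t - 1.0))  # omits the slope
```

A misspecified model (here, one that drops the slope) inflates the distance, which is what the test detects.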
By:  Maheu, John; Song, Yong 
Abstract:  This paper develops an efficient approach to modeling and forecasting time-series data with an unknown number of change-points. Using a conjugate prior and conditional on time-invariant parameters, the predictive density and the posterior distribution of the change-points have closed forms. The conjugate prior is further modeled as hierarchical to exploit the information across regimes. This framework allows breaks in the variance, the regression coefficients or both. Regime duration can be modeled as a Poisson distribution. A new efficient Markov chain Monte Carlo sampler draws the parameters as one block from the posterior distribution. An application to Canadian inflation time series shows the gains in forecasting precision that our model provides. 
Keywords:  multiple change-points; regime duration; inflation targeting; predictive density; MCMC 
JEL:  C51 C22 C11 
Date:  2012–02–22 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:36870&r=ecm 
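The closed-form posterior over break dates can be illustrated in a stripped-down setting. The sketch below enumerates a single break in the mean of Gaussian data with known variance and a flat prior on the break location, profiling the means at their MLEs; the paper instead integrates parameters out under a hierarchical conjugate prior and handles an unknown number of breaks:

```python
import numpy as np

def changepoint_posterior(y, sigma=1.0):
    """Posterior over a single break date for a Gaussian mean-shift model,
    flat prior on the break location, means profiled out at their MLEs."""
    n = len(y)
    logliks = np.full(n, -np.inf)
    for k in range(2, n - 1):  # break after observation k
        m1, m2 = y[:k].mean(), y[k:].mean()
        rss = np.sum((y[:k] - m1) ** 2) + np.sum((y[k:] - m2) ** 2)
        logliks[k] = -rss / (2 * sigma ** 2)
    w = np.exp(logliks - logliks.max())  # normalize in a numerically safe way
    return w / w.sum()

rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(2.0, 1.0, 100)])
post = changepoint_posterior(y)  # posterior mass should pile up near k = 100
```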
By:  G. Pan; J. Gao; Y. Yang; M. Guo 
Abstract:  This paper proposes a new mutual independence test for a large number of high dimensional random vectors. The test statistic is based on the characteristic function of the empirical spectral distribution of the sample covariance matrix. The asymptotic distributions of the test statistic under the null and local alternative hypotheses are established as the dimensionality and the sample size of the data grow at comparable rates. We apply this test to examine multiple MA(1) and AR(1) models, as well as panel data models with spatial cross-sectional structures. In addition, the proposed test can flexibly capture some dependent but uncorrelated structures, for example, nonlinear MA(1) models, multiple ARCH(1) models and Vandermonde matrices. Simulation results are provided for detecting these dependent structures. An empirical study of dependence between closing stock prices of several companies from the New York Stock Exchange (NYSE) demonstrates that cross-sectional dependence is widespread in stock markets. 
Keywords:  Independence test, cross-sectional dependence, empirical spectral distribution, characteristic function, Marčenko–Pastur law 
JEL:  C12 C21 C22 
Date:  2012–01–20 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20121&r=ecm 
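The building block of the statistic, the characteristic function of the empirical spectral distribution (ESD) of the sample covariance matrix, is easy to compute; the sketch below shows only that ingredient, with arbitrary dimensions, not the paper's test:

```python
import numpy as np

def esd_charfn(X, t):
    """Characteristic function of the empirical spectral distribution of the
    sample covariance matrix S = X X' / n, evaluated at the points in t.
    X is p x n (dimension p, sample size n); the ESD puts mass 1/p on each
    eigenvalue of S."""
    p, n = X.shape
    S = X @ X.T / n
    lam = np.linalg.eigvalsh(S)
    return np.exp(1j * np.outer(t, lam)).mean(axis=1)

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 200))  # p = 100, n = 200, so p/n = 0.5 is non-negligible
phi = esd_charfn(X, np.array([0.0, 0.5, 1.0]))
```

At t = 0 the characteristic function is identically 1, which gives a quick sanity check.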
By:  Song Li; Mervyn J. Silvapulle; Param Silvapulle; Xibin Zhang 
Abstract:  This paper investigates nonparametric estimation of density on [0,1]. The kernel estimator of density on [0,1] has been found to be sensitive to both bandwidth and kernel. This paper proposes a unified Bayesian framework for choosing both the bandwidth and kernel function. In a simulation study, the Bayesian bandwidth estimator performed better than others, and kernel estimators were sensitive to the choice of the kernel and the shapes of the population densities on [0,1]. The simulation and empirical results demonstrate that the methods proposed in this paper can improve the way the probability densities on [0,1] are presently estimated. 
Keywords:  Asymmetric kernel, Bayes factor, boundary bias, kernel selection, marginal likelihood, recovery-rate density 
JEL:  C11 C14 C15 
Date:  2012–01 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20123&r=ecm 
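A common asymmetric kernel for densities on [0,1] is the beta kernel, which is free of boundary bias. The sketch below pairs it with a leave-one-out likelihood criterion for the bandwidth as a simple stand-in for the paper's Bayesian (marginal-likelihood) treatment; the kernel form, grid, and data are all illustrative:

```python
import numpy as np
from scipy.stats import beta

def beta_kde(x_eval, data, b):
    """Beta-kernel density estimate on [0, 1]: the kernel at evaluation point
    x is the Beta(x/b + 1, (1 - x)/b + 1) density, evaluated at the data."""
    dens = beta.pdf(data[None, :], x_eval[:, None] / b + 1,
                    (1 - x_eval[:, None]) / b + 1)
    return dens.mean(axis=1)

def loo_bandwidth(data, grid):
    """Pick b by leave-one-out log-likelihood over a candidate grid."""
    n = len(data)
    scores = []
    for b in grid:
        K = beta.pdf(data[None, :], data[:, None] / b + 1,
                     (1 - data[:, None]) / b + 1)
        np.fill_diagonal(K, 0.0)  # leave each observation out of its own fit
        scores.append(np.log(K.sum(axis=1) / (n - 1)).sum())
    return grid[int(np.argmax(scores))]

rng = np.random.default_rng(3)
data = rng.beta(2, 5, size=400)
b = loo_bandwidth(data, np.array([0.01, 0.02, 0.05, 0.1, 0.2]))
xg = np.linspace(0.0, 1.0, 401)
fhat = beta_kde(xg, data, b)
```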
By:  Mantalos, Panagiotis (Department of Business, Economics, Statistics and Informatics) 
Abstract:  In this paper, we introduce a set of critical values for unit root tests that are robust in the presence of conditionally heteroskedastic errors, using the normalizing and variance-stabilizing transformation (NoVaS) of Politis (2007), and examine their properties using Monte Carlo methods. In terms of the size of the test, our analysis reveals that unit root tests with NoVaS-modified critical values have actual sizes close to the nominal size. For the power of the test, we find that unit root tests with NoVaS-modified critical values either have the same power as, or slightly better power than, tests using conventional Dickey–Fuller critical values across the sample range considered. 
Keywords:  Critical values; normalizing and variance-stabilizing transformation; unit root tests 
JEL:  C01 C12 C15 
Date:  2012–02–02 
URL:  http://d.repec.org/n?u=RePEc:hhs:oruesi:2012_002&r=ecm 
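The Monte Carlo machinery for tabulating unit-root critical values is simple to sketch. The snippet simulates the Dickey–Fuller t-statistic under a pure random-walk null (no constant, no trend) with plain i.i.d. normal errors; the paper's contribution is precisely to replace this error scheme with NoVaS-transformed errors, which is not reproduced here:

```python
import numpy as np

def df_critical_value(T=100, reps=20000, alpha=0.05, seed=0):
    """Simulated alpha-quantile of the Dickey-Fuller t-statistic under the
    random-walk null, for the regression dy_t = rho * y_{t-1} + e_t."""
    rng = np.random.default_rng(seed)
    stats = np.empty(reps)
    for r in range(reps):
        y = np.cumsum(rng.normal(size=T + 1))  # random walk of length T+1
        dy, ylag = np.diff(y), y[:-1]
        rho = (ylag @ dy) / (ylag @ ylag)      # OLS slope
        resid = dy - rho * ylag
        se = np.sqrt(resid @ resid / (T - 1) / (ylag @ ylag))
        stats[r] = rho / se                    # DF t-statistic
    return np.quantile(stats, alpha)

cv = df_critical_value()  # should land near the tabulated value of about -1.95
```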
By:  Gimeno, Ricardo; Gonzalez, Clara I. 
Abstract:  Extreme value theory is increasingly used in the modelling of financial time series. The non-normality of stock returns leads to the search for alternative distributions that allow for skewness and leptokurtic behavior. One of the most widely used is the Pareto distribution, because it allows non-normal behavior, though it requires the estimation of a tail index. This paper provides a new method for estimating the tail index. We propose an automatic procedure based on the computation of successive normality tests over the whole of the distribution, in order to fit a Gaussian distribution to the central returns and two Pareto distributions to the tails. The resulting procedure is fully automatic, requiring no decision by an external agent, so it is clearly objective. 
Keywords:  Tail Index; Hill estimator; Normality Test 
JEL:  C10 C15 G19 G00 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:37023&r=ecm 
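For context, the classical benchmark the paper improves upon is the Hill estimator of the tail index. The sketch below uses a fixed number of order statistics k chosen by hand; the paper's point is to automate the threshold choice (via sequential normality tests), which is not attempted here:

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimator of the Pareto tail index from the k largest order
    statistics: the reciprocal of the mean log-excess over the threshold."""
    xs = np.sort(x)[::-1]
    return 1.0 / np.mean(np.log(xs[:k] / xs[k]))

rng = np.random.default_rng(4)
x = rng.pareto(3.0, size=10000) + 1.0  # classical Pareto with tail index 3
alpha_hat = hill_estimator(x, k=500)   # should be near 3
```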
By:  Wan, Lei (Maastricht University) 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:ner:maastr:urn:nbn:nl:ui:2728190&r=ecm 
By:  Eiji Kurozumi 
Abstract:  This paper investigates tests for multiple structural changes with non-homogeneous regressors, such as polynomial trends. We consider exponential-type, supremum-type and average-type tests, as well as the corresponding weighted-type tests suggested in the literature. We show that the limiting distributions depend on the regressors in general, so critical values must be tabulated accordingly. We then focus on the linear trend case and obtain the critical values of the test statistics. Monte Carlo simulations are conducted to investigate the finite sample properties of the proposed tests, and we find that the specification of the number of breaks is an important factor for their finite sample performance. Since it is often the case that we cannot pre-specify the number of breaks under the alternative but can only posit their maximum number, the weighted-type tests are useful in practice. 
Keywords:  Multiple Breaks, Exp-type Test, Sup-type Test, Avg-type Test, Mean-type Test 
JEL:  C12 C22 
Date:  2012–02 
URL:  http://d.repec.org/n?u=RePEc:hst:ghsdps:gd11227&r=ecm 
By:  Ioannis Kasparis; Peter C.B. Phillips; Tassos Magdalinos 
Abstract:  In regressions involving integrable functions we examine the limit properties of IV estimators that utilise integrable transformations of lagged regressors as instruments. The regressors can be either I(0) or I(1) processes. We show that this kind of nonlinearity in the regression function can significantly affect the relevance of the instruments. In particular, such instruments become weak when the signal of the regressor is strong, as it is in the I(1) case. Instruments based on integrable functions of lagged I(1) regressors display long range dependence and so remain relevant even at long lags, continuing to contribute to variance reduction in IV estimation. However, simulations show that OLS is generally superior to IV estimation in terms of MSE, even in the presence of endogeneity. Estimation precision is also reduced when the regressor is nonstationary. 
Keywords:  Instrumental variables, Integrable function, Integrated process, Invariance principle, Local time, Mixed normality, Stationarity, Nonlinear cointegration, Unit roots, Weak Instruments. 
Date:  2012–01 
URL:  http://d.repec.org/n?u=RePEc:ucy:cypeua:022012&r=ecm 
By:  Rong Zhang; Brett A. Inder; Xibin Zhang 
Abstract:  We present a Bayesian sampling approach to parameter estimation in a discrete-response model with double rules of selectivity, where the dependent variables contain two layers of binary choices and one ordered response. Our investigation is motivated by an empirical study using such a double-selection rule for three labor-market outcomes, namely labor force participation, employment and occupational skill level. Full information maximum likelihood (FIML) estimation often encounters convergence problems in numerical optimization. The contribution of our investigation is to present a sampling algorithm based on a new reparameterization strategy. We conduct Monte Carlo simulation studies and find that the numerical optimization of FIML fails for more than half of the simulated samples. Our Bayesian method performs as well as FIML on the simulated samples where FIML works, and it performs just as well on the samples where FIML fails. We apply the proposed sampling algorithm to the double-selection model of labor-force participation, employment and occupational skill level, and derive 95% Bayesian credible intervals for the marginal effects of the explanatory variables on the three labor-force outcomes. In particular, the marginal effects of mental health factors on these three outcomes are discussed. 
Keywords:  Bayesian sampling; conditional posterior; marginal effects; mental illness; reparameterization. 
JEL:  C35 C11 
Date:  2012–02 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20125&r=ecm 
By:  Diaa Noureldin; Neil Shephard; Kevin Sheppard 
Abstract:  This paper introduces a new class of multivariate volatility models which is easy to estimate using covariance targeting, even with rich dynamics. We call them rotated ARCH (RARCH) models. The basic structure is to rotate the returns and then fit them using a BEKK-type parameterization of the time-varying covariance whose long-run covariance is the identity matrix. The extension to DCC-type parameterizations is given, introducing the rotated conditional correlation (RCC) model. Inference for these models is computationally attractive, and the asymptotics are standard. The techniques are illustrated using data on some DJIA stocks. 
Keywords:  RCC, Multivariate volatility, Covariance targeting, Common persistence, Empirical Bayes, Predictive likelihood 
JEL:  C32 C52 C58 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:594&r=ecm 
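The rotation step that gives the model its name is straightforward: standardize the returns by the inverse symmetric square root of their sample covariance, so that the rotated series has identity sample covariance (the covariance-targeting device). The sketch below shows only this step, on simulated data, not the subsequent BEKK-type fit:

```python
import numpy as np

def rotate_returns(R):
    """Rotate a T x N return matrix by the inverse symmetric square root of
    its sample covariance; the rotated returns have identity sample covariance."""
    R = R - R.mean(axis=0)
    S = R.T @ R / len(R)
    vals, vecs = np.linalg.eigh(S)
    S_inv_half = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return R @ S_inv_half

rng = np.random.default_rng(5)
# Correlated "returns" via an arbitrary upper-triangular mixing matrix.
mix = np.array([[1.0, 0.5, 0.0, 0.0],
                [0.0, 1.0, 0.3, 0.0],
                [0.0, 0.0, 1.0, 0.2],
                [0.0, 0.0, 0.0, 1.0]])
R = rng.normal(size=(1000, 4)) @ mix
E = rotate_returns(R)
cov_E = E.T @ E / len(E)  # should be the 4 x 4 identity up to rounding
```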
By:  Chaohua Dong; Jiti Gao 
Abstract:  In this paper, expansions of functionals of Lévy processes are established under some Hilbert spaces and their orthogonal bases. From a practical standpoint, both time-homogeneous and time-inhomogeneous functionals of Lévy processes are considered. Several expansions and rates of convergence are established. In order to state asymptotic distributions for statistical estimators of unknown parameters involved in a general regression model, we develop a general asymptotic theory for partial sums of functionals of Lévy processes. The results show that these estimators of the unknown parameters in different situations converge to quite different random variables. In addition, the rates of convergence depend on various factors rather than just the sample size. 
Keywords:  Expansion, Lévy Process, Orthogonal Series, Statistical Estimation. 
JEL:  C13 C14 C22 
Date:  2012–01 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20122&r=ecm 
By:  Elena-Ivona Dumitrescu (LEO - Laboratoire d'économie d'Orléans - CNRS : UMR6221 - Université d'Orléans); Christophe Hurlin (LEO - Laboratoire d'économie d'Orléans - CNRS : UMR6221 - Université d'Orléans); Vinson Pham (UCSC - University of California at Santa Cruz) 
Abstract:  In this paper we propose a new tool for backtesting that examines the quality of Value-at-Risk (VaR) forecasts. To date, the most distinguished regression-based backtest, proposed by Engle and Manganelli (2004), relies on a linear model. However, in view of the dichotomous character of the series of violations, a nonlinear model seems more appropriate. We thus propose a new backtest (denoted DB) based on a dynamic binary regression model. Our discrete-choice model, e.g. probit or logit, links the sequence of violations to a set of explanatory variables including, in particular, the lagged VaR and the lagged violations. It allows us to separately test the unconditional coverage, independence and conditional coverage hypotheses, and it is easy to implement. Monte Carlo experiments show that the DB test exhibits good small-sample properties in realistic sample settings (5% coverage rate with estimation risk). An application to a portfolio composed of three assets included in the CAC40 market index is finally proposed. 
Keywords:  Value-at-Risk; Risk Management; Dynamic Binary Choice Models 
Date:  2012–02–07 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:halshs00671658&r=ecm 
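The simplest hypothesis nested in a backtest of this kind is unconditional coverage. The sketch below implements the classical Kupiec likelihood-ratio test on a 0/1 violation sequence; the DB test generalizes this by regressing the violations on lagged VaR and lagged violations in a probit/logit, which is not reproduced here:

```python
import numpy as np
from scipy.stats import chi2

def kupiec_uc(violations, p=0.05):
    """Unconditional-coverage LR test: compares the likelihood of the 0/1
    violation sequence at the nominal rate p with the likelihood at the
    empirical rate. Returns (LR statistic, chi-square(1) p-value)."""
    n = len(violations)
    n1 = int(np.sum(violations))
    n0 = n - n1
    pi = n1 / n
    # Guard the log(0) corner cases when all or no observations violate.
    ll_alt = (n0 * np.log(1 - pi) if n0 else 0.0) + (n1 * np.log(pi) if n1 else 0.0)
    ll_null = n0 * np.log(1 - p) + n1 * np.log(p)
    lr = -2 * (ll_null - ll_alt)
    return lr, chi2.sf(lr, df=1)

rng = np.random.default_rng(6)
good = rng.random(1000) < 0.05  # violations occur at the nominal 5% rate
bad = rng.random(1000) < 0.12   # systematically too many violations
lr_good, p_good = kupiec_uc(good)
lr_bad, p_bad = kupiec_uc(bad)
```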
By:  Jakob Söhl 
Abstract:  Confidence intervals and joint confidence sets are constructed for the nonparametric calibration of exponential Lévy models based on prices of European options. This is done by showing joint asymptotic normality for the estimation of the volatility, the drift, the intensity and the Lévy density at finitely many points in the spectral calibration method. Furthermore, the asymptotic normality result leads to a test on the value of the volatility in exponential Lévy models. 
Keywords:  European option, Jump diffusion, Confidence sets, Asymptotic normality, Nonlinear inverse problem 
JEL:  G13 C14 
Date:  2012–02 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2012012&r=ecm 
By:  Alain Hecq; João Victor Issler (METEOR) 
Abstract:  It is well known that cointegration between the levels of two variables (labeled Y_{t} and y_{t} in this paper) is a necessary condition to assess the empirical validity of a present-value model (PV and PVM, respectively, hereafter) linking them. The work on cointegration has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. The basis of this result is the use of rational expectations in forecasting future values of variables in the PVM. If this condition fails, the present-value equation will not be valid, since it will contain an additional term capturing the (non-zero) conditional expected value of future error terms. Our article has a few novel contributions, but two stand out. First, in testing for PVMs, we advise splitting the restrictions implied by PV relationships into orthogonality conditions (or reduced rank restrictions) before additional tests on the value of parameters. We show that PV relationships entail a weak-form common feature relationship as in Hecq, Palm, and Urbain (2006) and in Athanasopoulos, Guillén, Issler and Vahid (2011), and also a polynomial serial-correlation common feature relationship as in Cubadda and Hecq (2001), which represent restrictions on dynamic models that allow several tests for the existence of PV relationships to be used. Because these relationships occur mostly with financial data, we propose tests based on generalized method of moments (GMM) estimates, where it is straightforward to propose robust tests in the presence of heteroskedasticity. We also propose a robust Wald test developed to investigate the presence of reduced rank models. 
Their performance is evaluated in a Monte Carlo exercise. Second, in the context of asset pricing, we propose applying a permanent-transitory (PT) decomposition based on Beveridge and Nelson (1981), which focuses on extracting the long-run component of asset prices, a key concept in modern financial theory as discussed in Alvarez and Jermann (2005), Hansen and Scheinkman (2009), and Nieuwerburgh, Lustig, and Verdelhan (2010). Here again we can exploit the results developed in the common cycle literature to easily extract permanent and transitory components under both long- and short-run restrictions. The techniques discussed herein are applied to long-span annual data on long- and short-term interest rates and on prices and dividends for the U.S. economy. In both applications we do not reject the existence of a common cyclical feature vector linking these two series. Extracting the long-run component shows the usefulness of our approach and highlights the presence of asset-pricing bubbles. 
Keywords:  macroeconomics 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:dgr:umamet:2012006&r=ecm 
By:  Ola Løvsletten; Martin Rypdal 
Abstract:  We introduce tools for inference in the multifractal random walk introduced by Bacry et al. (2001). These tools include formulas for smoothing, filtering and volatility forecasting. In addition, we present methods for computing conditional densities for one- and multi-step returns. The inference techniques presented in this paper, including maximum likelihood estimation, are applied to data from the Oslo Stock Exchange, and it is observed that the volatility forecasts based on the multifractal random walk have a much richer structure than the forecasts obtained from a basic stochastic volatility model. 
Date:  2012–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1202.5376&r=ecm 
By:  Yingying Dong (Department of Economics, University of California-Irvine) 
Abstract:  Many empirical applications of regression discontinuity (RD) models use a running variable that is rounded and hence is discrete, e.g., age in years, or birth weight in ounces. This paper shows that standard RD estimation using a rounded discrete running variable leads to inconsistent estimates of treatment effects, even when the true functional form relating the outcome and the running variable is known and is correctly specified. This paper provides simple formulas to correct for this discretization bias. The proposed approach does not require instrumental variables, but instead uses information regarding the distribution of rounding errors, which is easily obtained and often close to uniform. The proposed approach is applied to estimate the effect of Medicare on insurance coverage in the US, and to investigate the retirementconsumption puzzle in China, utilizing the Chinese mandatory retirement policy. 
Keywords:  Regression discontinuity; Rounding; Rounding errors; Discrete running variable 
JEL:  C21 C26 I18 
Date:  2012–01 
URL:  http://d.repec.org/n?u=RePEc:irv:wpaper:111206&r=ecm 
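The inconsistency from a rounded running variable is easy to reproduce by simulation. The sketch below demonstrates only the bias (the paper derives the correction formulas, which are not reproduced here), in a hypothetical linear model with slopes that differ across the cutoff, where rounding biases the estimated jump:

```python
import numpy as np

def rd_estimate(running, outcome, cutoff=0.0):
    """Parametric RD: regress the outcome on a treatment dummy plus linear
    trends on each side of the cutoff; return the estimated jump."""
    d = (running >= cutoff).astype(float)
    X = np.column_stack([np.ones_like(running), d,
                         running - cutoff, d * (running - cutoff)])
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    return beta[1]

rng = np.random.default_rng(7)
n = 20000
z = rng.uniform(-5, 5, n)  # latent continuous running variable
# True jump = 2; the slope changes from 0.5 to 1.5 at the cutoff.
y = 1.0 + 0.5 * z + (2.0 + 1.0 * z) * (z >= 0) + rng.normal(0, 0.5, n)
tau_true = rd_estimate(z, y)                 # consistent: close to 2
tau_rounded = rd_estimate(np.floor(z), y)    # rounded data: biased upward
```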
By:  Darwin Ugarte Ontiveros (Center for Research in the Economics of Development, University of Namur); Vincenzo Verardi (Center for Research in the Economics of Development, University of Namur; European Center for Advanced Research in Economics and Statistics, Universite Libre de Bruxelles) 
Abstract:  In this paper, we warn against the over-optimistic conclusions that weak-instrument tests can yield when good leverage points are present in the first stage of an IV estimation. Some simulations and an empirical application are provided to illustrate the point raised. 
Keywords:  Instrumental variables, Weak instruments, Outliers, Robust statistics 
JEL:  C3 C12 O1 
Date:  2012–02 
URL:  http://d.repec.org/n?u=RePEc:nam:wpaper:1203&r=ecm 
By:  Joshua C C Chan; Gary Koop; Simon M Potter 
Abstract:  This paper introduces a new model of trend (or underlying) inflation. In contrast to many earlier approaches, which allow for trend inflation to evolve according to a random walk, ours is a bounded model which ensures that trend inflation is constrained to lie in an interval. The bounds of this interval can either be fixed or estimated from the data. Our model also allows for a timevarying degree of persistence in the transitory component of inflation. The bounds placed on trend inflation mean that standard econometric methods for estimating linear Gaussian state space models cannot be used and we develop a posterior simulation algorithm for estimating the bounded trend inflation model. In an empirical exercise with CPI inflation we find the model to work well, yielding more sensible measures of trend inflation and forecasting better than popular alternatives such as the unobserved components stochastic volatility model. 
Date:  2012–02 
URL:  http://d.repec.org/n?u=RePEc:acb:camaaa:201208&r=ecm 
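The bounding device at the heart of the model can be illustrated by simulating a random walk whose innovations are rejected whenever they would push the level outside the interval. This is only a simulation sketch with made-up parameter values, not the paper's posterior sampler:

```python
import numpy as np

def bounded_trend(T, lo=0.0, hi=5.0, sigma=0.3, start=2.0, seed=8):
    """Simulate a trend that evolves as a random walk constrained to [lo, hi]
    by rejecting innovations that would leave the interval."""
    rng = np.random.default_rng(seed)
    tau = np.empty(T)
    tau[0] = start
    for t in range(1, T):
        while True:  # rejection step keeps the level inside the bounds
            cand = tau[t - 1] + rng.normal(0.0, sigma)
            if lo <= cand <= hi:
                break
        tau[t] = cand
    return tau

tau = bounded_trend(500)  # a persistent path that never leaves [0, 5]
```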
By:  Gianluca Cubadda (Faculty of Economics, University of Rome "Tor Vergata"); Barbara Guardabascio (ISTAT); Alain Hecq (Maastricht University) 
Abstract:  Combining economic time series with the aim of obtaining an indicator for business cycle analyses is an important issue for policy makers. In this area, econometric techniques usually rely on systems with either a small number of series, N, (VAR or VECM) or, at the other extreme, a very large N (factor models). In this paper we propose tools to select the relevant business cycle indicators in a "medium" N framework, a situation that is likely to be the most frequent in empirical work. An example is provided by our empirical application, in which we study jointly the short-run comovements of 24 European countries. We show, under not too restrictive conditions, that parsimonious single-equation models can be used to split a set of N countries into three groups. The first group comprises countries that share a synchronous common cycle, a non-synchronous common cycle is present among the countries of the second group, and the third group collects countries that exhibit idiosyncratic cycles. Moreover, we offer a method for constructing a composite coincident indicator that explicitly takes into account the existence of these various forms of short-run comovements among variables. 
Keywords:  Comovements, common cycles, composite business cycle indicators, Euro area. 
JEL:  C32 
Date:  2012–02–27 
URL:  http://d.repec.org/n?u=RePEc:rtv:ceisrp:224&r=ecm 
By:  Jakob Söhl; Mathias Trabs 
Abstract:  Observing prices of European put and call options, we calibrate exponential Lévy models nonparametrically. We discuss the implementation of the spectral estimation procedures for Lévy models of finite jump activity as well as for self-decomposable Lévy models, and improve these methods. Confidence intervals are constructed for the estimators in the finite activity case. They allow inference on the behavior of the parameters when the option prices are observed in a sequence of trading days. We compare the performance of the procedures for finite and infinite jump activity based on real option data. 
Date:  2012–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1202.5983&r=ecm 
By:  Per A. Mykland; Neil Shephard; Kevin Sheppard 
Abstract:  High frequency financial data allow us to learn more about volatility, volatility of volatility and jumps. One of the key techniques developed in the literature in recent years has been bipower variation and its multipower extension, which estimates time-varying volatility robustly to jumps. We improve the scope and efficiency of multipower variation through a more sophisticated exploitation of high frequency data. This suggests very significant improvements in the power of jump tests. It also yields efficient estimates of the integrated variance of the continuous part of a semimartingale. The paper also shows how to extend the theory to the case where there is microstructure noise in the observations, and derives the first nonparametric high frequency estimator of the volatility of volatility. A fundamental device in the paper is a new type of result showing path-by-path (strong) approximation between multipower variation and the (unobserved) RV based on the continuous part of the process. 
Keywords:  Bipower variation, Jumps, Market microstructure noise, Multipower variation, Nonparametric analysis, Quadratic variations, Semimartingale, Volatility, Volatility of volatility 
JEL:  C01 C02 C13 C14 C22 D53 D82 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:593&r=ecm 
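The basic bipower variation estimator the paper builds on is only a few lines, and its jump-robustness is easy to see by simulation (the improved blocked/multipower machinery of the paper is not reproduced here; the sample path below is an illustrative constant-volatility toy):

```python
import numpy as np

def realized_var(r):
    """Realized variance: sum of squared returns (picks up jumps)."""
    return np.sum(r ** 2)

def bipower_var(r):
    """Bipower variation: (pi/2) * sum |r_t||r_{t-1}|, consistent for the
    integrated variance of the continuous part, robust to rare jumps."""
    return (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))

rng = np.random.default_rng(9)
n = 23400                             # one "day" of one-second returns
sigma = 0.2 / np.sqrt(252)            # daily vol from 20% annualized
r = rng.normal(0.0, sigma / np.sqrt(n), n)   # purely diffusive returns
r_jump = r.copy()
r_jump[n // 2] += 5 * sigma           # add one large jump mid-day

rv_cont, bv_cont = realized_var(r), bipower_var(r)
rv_jump, bv_jump = realized_var(r_jump), bipower_var(r_jump)
```

RV absorbs the jump while BV stays near the integrated variance of the continuous part.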
By:  Antonello Maruotti (University of Roma3); Valentina Raponi (University of Rome La Sapienza) 
Abstract:  We describe a mixed-effects hurdle model for zero-inflated longitudinal count data, where a baseline variable is included in the model specification. Association between the count data process and the endogenous baseline variable is modeled through a latent structure, assumed to be dependent across equations. We show how model parameters can be estimated in a finite mixture context, allowing for overdispersion, multivariate association and endogeneity of the baseline variable. The model's behavior is investigated through a large-scale simulation experiment. An empirical example on health care utilization data is provided. 
Keywords:  Hurdle model; Baseline conditions; Longitudinal count data; Zero-inflation 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:rcr:wpaper:02_12&r=ecm 
By:  Michael Wolf; Dan Wunderli 
Abstract:  Many economic and financial applications require the forecast of a random variable of interest over several periods into the future. The sequence of individual forecasts, one period at a time, is called a path forecast, where the term path refers to the sequence of individual future realizations of the random variable. The problem of constructing a corresponding joint prediction region has so far been rather neglected in the literature: such a region is supposed to contain the entire future path with a pre-specified probability. We develop bootstrap methods to construct joint prediction regions. The resulting regions are proven to be asymptotically consistent under a mild high-level assumption. We compare the finite-sample performance of our joint prediction regions to some previous proposals via Monte Carlo simulations. An empirical application to a real data set is also provided. 
Keywords:  Generalized error rates, path forecast, simultaneous prediction intervals 
JEL:  C14 C32 C53 
Date:  2012–02 
URL:  http://d.repec.org/n?u=RePEc:zur:econwp:064&r=ecm 
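The difference between pointwise intervals and a joint region for the whole path can be sketched with a residual-bootstrap AR(1) and a sup-t calibration. This is a simplified stand-in, not the authors' procedure, and the AR(1) model and all tuning values are illustrative:

```python
import numpy as np

def joint_prediction_region(y, H=6, B=2000, alpha=0.10, seed=10):
    """Bootstrap joint prediction region for an AR(1) path forecast: simulate
    B future paths by resampling residuals, then widen the pointwise bands by
    a sup-t constant so the whole H-step path is covered jointly."""
    rng = np.random.default_rng(seed)
    phi = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])   # AR(1) slope by least squares
    resid = y[1:] - phi * y[:-1]
    resid = resid - resid.mean()
    paths = np.empty((B, H))
    for b in range(B):
        last = y[-1]
        for h in range(H):
            last = phi * last + rng.choice(resid)  # residual bootstrap step
            paths[b, h] = last
    center, scale = paths.mean(axis=0), paths.std(axis=0)
    # sup-t calibration: quantile of the maximal standardized deviation.
    maxdev = np.max(np.abs(paths - center) / scale, axis=1)
    c = np.quantile(maxdev, 1 - alpha)
    return center - c * scale, center + c * scale, paths

rng = np.random.default_rng(11)
e = rng.normal(size=400)
y = np.empty(400)
y[0] = 0.0
for t in range(1, 400):
    y[t] = 0.7 * y[t - 1] + e[t]
lo, hi, paths = joint_prediction_region(y)
```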
By:  Peter Arcidiacono; Patrick Bayer; Federico A. Bugni; Jonathan James 
Abstract:  Many dynamic problems in economics are characterized by large state spaces which make both computing and estimating the model infeasible. We introduce a method for approximating the value function of high-dimensional dynamic models based on sieves, and establish results on its (a) consistency, (b) rates of convergence, and (c) bounds on the approximation error. We embed this method for approximating the solution to the dynamic problem within an estimation routine and prove that it provides consistent estimates of the model's parameters. We provide Monte Carlo evidence that our method can successfully be used to approximate models that would otherwise be infeasible to compute, suggesting that these techniques may substantially broaden the class of models that can be solved and estimated. 
JEL:  C13 C14 C54 C61 C63 C73 
Date:  2012–03 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:17890&r=ecm 
By:  Pesaran, M. H. 
Abstract:  This paper considers testing the hypothesis that errors in a panel data model are weakly cross-sectionally dependent, using the exponent of cross-sectional dependence α, introduced recently in Bailey, Kapetanios and Pesaran (2012). It is shown that the implicit null of the CD test depends on the relative expansion rates of N and T. When T = O(N^ε), for some 0 < ε ≤ 1, then the implicit null of the CD test is given by 0 ≤ α < (2 − ε)/4, which gives 0 ≤ α < 1/4 when N and T tend to infinity at the same rate such that T/N → κ, with κ being a finite positive constant. It is argued that in the case of large N panels, the null of weak dependence is more appropriate than the null of independence, which could be quite restrictive for large panels. Using Monte Carlo experiments, it is shown that the CD test has the correct size for values of α in the range [0, 1/4], for all combinations of N and T, and irrespective of whether the panel contains lagged values of the dependent variables, so long as there are no major asymmetries in the error distribution. 
Keywords:  Exponent of cross-sectional dependence, Diagnostic tests, Panel data models, Dynamic heterogeneous panels 
JEL:  C12 C13 C33 
Date:  2012–02–28 
URL:  http://d.repec.org/n?u=RePEc:cam:camdae:1208&r=ecm 
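The CD statistic itself is a scaled sum of all pairwise cross-sectional correlations of the residuals and is asymptotically standard normal under the null. A minimal sketch on simulated residuals (the panel sizes and factor design are arbitrary):

```python
import numpy as np

def cd_statistic(U):
    """Pesaran's CD statistic from a T x N residual matrix:
    sqrt(2T / (N(N-1))) times the sum of all pairwise correlations."""
    T, N = U.shape
    R = np.corrcoef(U, rowvar=False)
    iu = np.triu_indices(N, k=1)       # each pair (i, j), i < j, once
    return np.sqrt(2 * T / (N * (N - 1))) * R[iu].sum()

rng = np.random.default_rng(12)
cd_indep = cd_statistic(rng.normal(size=(100, 30)))  # roughly N(0, 1)
common = rng.normal(size=(100, 1))                   # strong common factor
cd_dep = cd_statistic(rng.normal(size=(100, 30)) + common)  # blows up
```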
By:  ShihKang Chao; Wolfgang Karl Härdle; Weining Wang 
Abstract:  Financial risk control has always been challenging and becomes an even harder problem now that joint extreme events occur more frequently. For decision makers and government regulators, it is therefore important to obtain accurate information on the interdependency of risk factors. Given a stressful situation for one market participant, one would like to measure how this stress affects other factors. The CoVaR (Conditional VaR) framework has been developed for this purpose. The basic technical elements of CoVaR estimation are two levels of quantile regression: one on market risk factors, the other on an individual risk factor. Tests on the functional form of the two-level quantile regression reject linearity. A flexible semiparametric modeling framework for CoVaR is therefore proposed, and a partial linear model (PLM) is analyzed. Applied to stock data covering the crisis period, the PLM outperforms during the crisis, as justified by backtesting procedures. Moreover, using data on global stock market indices, an analysis of the marginal contribution of risk (MCR), defined as the local first-order derivative of the quantile curve, sheds some light on the source of global market risk. 
Keywords:  CoVaR, ValueatRisk, quantile regression, locally linear quantile regression, partial linear model, semiparametric model 
JEL:  C14 C21 C22 C53 G01 G10 G20 G32 
Date:  2012–01 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2012006&r=ecm 
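The two-level quantile-regression structure of CoVaR can be sketched with a linear (rather than the paper's partial linear) specification. Quantile regression is implemented here by direct minimization of the check loss, which is adequate for a two-parameter toy; the data-generating process and all values are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def quantile_reg(y, X, tau):
    """Linear quantile regression by minimizing the check (pinball) loss.
    A linear-programming or statsmodels solver would be the production choice."""
    def loss(beta):
        u = y - X @ beta
        return np.sum(u * (tau - (u < 0)))
    res = minimize(loss, np.zeros(X.shape[1]), method="Nelder-Mead",
                   options={"xatol": 1e-6, "fatol": 1e-6, "maxiter": 5000})
    return res.x

rng = np.random.default_rng(13)
n = 2000
x_inst = rng.normal(size=n)                 # an institution's return
y_sys = 0.8 * x_inst + rng.normal(size=n)   # system return, exposed to it
tau = 0.05
# Level 1: the institution's VaR is its own tau-quantile.
var_inst = np.quantile(x_inst, tau)
# Level 2: quantile-regress the system on the institution, then evaluate the
# fitted tau-quantile at the institution's VaR -- that value is the CoVaR.
beta = quantile_reg(y_sys, np.column_stack([np.ones(n), x_inst]), tau)
covar = beta[0] + beta[1] * var_inst
```

With positive exposure, the system's CoVaR is more extreme than the institution's own VaR, which is the stress-propagation effect the framework measures.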