
on Econometrics 
By:  Jiti Gao; Xiao Han; Guangming Pan; Yanrong Yang 
Abstract:  Statistical inference for sample correlation matrices is important in high dimensional data analysis. Motivated by this, the paper establishes a new central limit theorem (CLT) for a linear spectral statistic (LSS) of high dimensional sample correlation matrices for the case where the dimension p and the sample size n are comparable. This result is of independent interest in large dimensional random matrix theory. We then apply the linear spectral statistic to an independence test for p random variables, and to an equivalence test for p factor loadings and n factors in a factor model. The finite sample performance of the proposed tests shows their applicability and effectiveness in practice. An empirical application to test the independence of household incomes from different cities in China is also conducted. 
Keywords:  Central limit theorem; equivalence test; high dimensional correlation matrix; independence test; linear spectral statistics. 
JEL:  C21 C32 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201426&r=ecm 
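As a rough illustration of the object this abstract studies (not the paper's actual test statistic, whose centering and scaling come from the CLT derived there), the sketch below simulates p independent variables with p and n comparable and evaluates one simple linear spectral statistic, the log-determinant, on the eigenvalues of the sample correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 400, 100                          # comparable dimension and sample size
X = rng.standard_normal((n, p))          # independent columns (the null of the test)
R = np.corrcoef(X, rowvar=False)         # p x p sample correlation matrix
lam = np.linalg.eigvalsh(R)              # its eigenvalue spectrum

# One linear spectral statistic: sum of f(eigenvalues) with f(x) = log x,
# i.e. the log-determinant of the sample correlation matrix
lss = np.sum(np.log(lam))
print(f"log-determinant LSS: {lss:.3f}")
```

An actual independence test would compare a centered and scaled version of `lss` against the limiting normal distribution established in the paper; the code only produces the raw statistic.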
By:  Peter C.B. Phillips (Cowles Foundation, Yale University); Ye Chen (Singapore Management University) 
Abstract:  Chen and Deo (2009a) proposed procedures based on restricted maximum likelihood (REML) for estimation and inference in the context of predictive regression. Their method achieves bias reduction in both estimation and inference which assists in overcoming size distortion in predictive hypothesis testing. This paper provides extensions of the REML approach to more general cases which allow for drift in the predictive regressor and multiple regressors. It is shown that without modification the REML approach is seriously oversized and can have unit rejection probability in the limit under the null when the drift in the regressor is dominant. A limit theory for the modified REML test is given under a localized drift specification that accommodates predictors with varying degrees of persistence. The extension is useful in empirical work where predictors typically involve stochastic trends with drift and where there are multiple regressors. Simulations show that with these modifications, the good performance of the restricted likelihood ratio test (RLRT) is preserved and that RLRT outperforms other predictive tests in terms of size and power even when there is no drift in the regressor. 
Keywords:  Localized drift, Predictive regression, Restricted likelihood ratio test, Size distortion 
JEL:  C12 C13 C58 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1968&r=ecm 
By:  Ping Yu (University of Hong Kong); Peter C.B. Phillips (Cowles Foundation, Yale University) 
Abstract:  This paper studies estimation and specification testing in threshold regression with endogeneity. Three key results differ from those in regular models. First, both the threshold point and the threshold effect parameters are shown to be identified without the need for instrumentation. Second, in partially linear threshold models, both parametric and nonparametric components rely on the same data, which prima facie suggests identification failure. But, as shown here, the discontinuity structure of the threshold itself supplies identifying information for the parametric coefficients without the need for extra randomness in the regressors. Third, instrumentation plays different roles in the estimation of the system parameters, delivering identification for the structural coefficients in the usual way, but raising convergence rates for the threshold effect parameters and improving efficiency for the threshold point. Specification tests are developed to test for the presence of endogeneity and threshold effects without relying on instrumentation of the covariates. The threshold effect test extends conventional parametric structural change tests to the nonparametric case. A wild bootstrap procedure is suggested to deliver finite sample critical values for both tests. Simulation studies corroborate the theory and the asymptotics. An empirical application is conducted to explore the effects of 401(k) retirement programs on savings, illustrating the relevance of threshold models in treatment effects evaluation in the presence of endogeneity. 
Keywords:  Threshold regression, Endogeneity, Local shifter, Identification, Efficiency, Integrated difference kernel estimator, Regression discontinuity design, Optimal rate of convergence, Partial linear model, Specification test, U-statistic, Wild bootstrap, Threshold treatment model, 401(k) plan 
JEL:  C12 C13 C14 C21 C26 
Date:  2014–12 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1966&r=ecm 
By:  Ryo Okui (Institute of Economic Research, Kyoto University); Takahide Yanagi (Graduate School of Economics, Kyoto University) 
Abstract:  This paper proposes the analysis of panel data whose dynamic structure is heterogeneous across individuals. Our aim is to estimate the cross-sectional distributions and/or some distributional features of the heterogeneous mean and autocovariances. We do not assume any specific model for the dynamics. Our proposed method is easy to implement. We first compute the sample mean and autocovariances for each individual and then estimate the parameter of interest based on the empirical distributions of the estimated mean and autocovariances. The asymptotic properties of the proposed estimators are investigated using double asymptotics under which both the cross-sectional sample size (N) and the length of the time series (T) tend to infinity. We prove the functional central limit theorem for the empirical process of the proposed distribution estimator. By using the functional delta method, we also derive the asymptotic distributions of the estimators for various parameters of interest. We show that the distribution estimator exhibits a bias whose order is proportional to 1/√T. Conversely, when the parameter of interest can be written as the expectation of a smooth function of the heterogeneous mean and/or autocovariances, the bias is of order 1/T and can be corrected by the jackknife method. The results of Monte Carlo simulations show that our asymptotic results are informative regarding the finite-sample properties of the estimators. They also demonstrate that the proposed jackknife bias correction is successful. 
Keywords:  Panel data; heterogeneity; functional central limit theorem; autocovariance; jackknife; long panel. 
JEL:  C13 C14 C23 
Date:  2014–11 
URL:  http://d.repec.org/n?u=RePEc:kyo:wpaper:906&r=ecm 
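The two-step procedure described in the abstract can be sketched directly. Everything below is illustrative: the data come from a simulated heterogeneous AR(1) panel, the parameter of interest is taken to be the cross-sectional mean of the first-order autocovariances, and a half-panel (split-sample) jackknife stands in for the bias correction the authors analyze:

```python
import numpy as np

rng = np.random.default_rng(0)

def autocov(x, k):
    """Sample autocovariance of order k for one individual's series."""
    xbar = x.mean()
    return np.mean((x[k:] - xbar) * (x[:-k] - xbar)) if k > 0 else np.var(x)

# Simulated heterogeneous AR(1) panel: N individuals, T periods
N, T = 500, 100
phi = rng.uniform(0.2, 0.8, N)          # heterogeneous AR coefficients
y = np.zeros((N, T))
for t in range(1, T):
    y[:, t] = phi * y[:, t - 1] + rng.standard_normal(N)

# Step 1: per-individual first-order sample autocovariances
gamma1 = np.array([autocov(y[i], 1) for i in range(N)])

# Step 2: parameter of interest, here the cross-sectional mean of gamma1
theta_full = gamma1.mean()

# Half-panel jackknife correction of the O(1/T) bias
g_a = np.array([autocov(y[i, : T // 2], 1) for i in range(N)]).mean()
g_b = np.array([autocov(y[i, T // 2 :], 1) for i in range(N)]).mean()
theta_jack = 2.0 * theta_full - 0.5 * (g_a + g_b)
```

The empirical distribution of `gamma1` is exactly the distribution estimator the abstract describes; smooth functionals of it, like `theta_full`, admit the 1/T bias correction shown.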
By:  CARPANTIER, Jean-François (CREA, Université du Luxembourg); DUFAYS, Arnaud (ENSAE-CREST, Paris) 
Abstract:  We propose an estimation method that circumvents the path dependence problem existing in Change-Point (CP) and Markov-Switching (MS) ARMA models. Our model embeds a sticky infinite hidden Markov-switching structure (sticky IHMM), which makes possible a self-determination of the number of regimes as well as of the specification: CP or MS. Furthermore, CP and MS frameworks usually assume that all the model parameters vary from one regime to another. We relax this restrictive assumption. As illustrated by simulations on moderate samples (300 observations), the sticky IHMM-ARMA algorithm detects which model parameters change over time. Applications to the U.S. GDP growth and the DJIA realized volatility highlight the relevance of estimating different structural breaks for the mean and variance parameters. 
Keywords:  Bayesian inference, Markov-switching model, ARMA model, infinite hidden Markov model, Dirichlet Process 
JEL:  C11 C15 C22 C58 
Date:  2014–06–11 
URL:  http://d.repec.org/n?u=RePEc:cor:louvco:2014014&r=ecm 
By:  Maurice J.G. Bun; Martin A. Carree; Arturas Juodis 
Abstract:  We analyze the finite sample properties of maximum likelihood estimators for dynamic panel data models. In particular, we consider Transformed Maximum Likelihood (TML) and Random effects Maximum Likelihood (RML) estimation. We show that TML and RML estimators are solutions to a cubic first-order condition in the autoregressive parameter. Furthermore, in finite samples both likelihood estimators might lead to a negative estimate of the variance of the individual-specific effects. We consider different approaches taking into account the non-negativity restriction for the variance. We show that these approaches may lead to a boundary solution different from the unique global unconstrained maximum. In an extensive Monte Carlo study we find that this boundary solution issue is non-negligible for small values of T and that different approaches might lead to substantially different finite sample properties. Furthermore, we find that the Likelihood Ratio statistic provides size control in small samples, albeit with low power due to the flatness of the log-likelihood function. We illustrate these issues by modeling U.S. state-level unemployment dynamics. 
Date:  2014–12–16 
URL:  http://d.repec.org/n?u=RePEc:ame:wpaper:1404&r=ecm 
By:  W. Robert Reed (University of Canterbury) 
Abstract:  This paper demonstrates that unit root tests can suffer from inflated Type I error rates when data are cointegrated. Results from Monte Carlo simulations show that three commonly used unit root tests (the ADF, Phillips-Perron, and DF-GLS tests) frequently overreject the true null of a unit root for at least one of the cointegrated variables. The findings extend previous research which reports size distortions for unit root tests when the associated error terms are serially correlated (Schwert, 1989; DeJong et al., 1992; Harris, 1992). While the addition to the Dickey-Fuller-type specification of the correct number of lagged differenced (LD) terms can eliminate the size distortion, I demonstrate that determining the correct number of LD terms is unachievable in practice. Standard diagnostics, such as testing for serial correlation in the residuals and using information criteria to compare different lag specifications, are unable to identify the required number of lags. A unique feature of this study is that it includes programs (an Excel spreadsheet and Stata .do files) that allow readers to simulate their own cointegrated data, using parameters of their own choosing, to confirm the findings reported in this paper. 
Keywords:  Unit root testing, cointegration, DF-GLS test, Augmented Dickey-Fuller test, Phillips-Perron test, simulation 
JEL:  C32 C22 C18 
Date:  2014–12–14 
URL:  http://d.repec.org/n?u=RePEc:cbt:econwp:14/28&r=ecm 
By:  Andrew Harvey; Stephen Thiele 
Abstract:  A test for time-varying correlation is developed within the framework of a dynamic conditional score (DCS) model for both Gaussian and Student t-distributions. The test may be interpreted as a Lagrange multiplier test and modified to allow for the estimation of models for time-varying volatility in the individual series. Unlike standard moment-based tests, the score-based test statistic includes information on the level of correlation under the null hypothesis and local power arguments indicate the benefits of doing so. A simulation study shows that the performance of the score-based test is strong relative to existing tests across a range of data generating processes. An application to the Hong Kong and South Korean equity markets shows that the new test reveals changes in correlation that are not detected by the standard moment-based test. 
Keywords:  Dynamic conditional score, EGARCH, Lagrange multiplier test, Portmanteau test, Time-varying covariance matrices. 
JEL:  C14 C22 F36 
Date:  2014–11–28 
URL:  http://d.repec.org/n?u=RePEc:cam:camdae:1439&r=ecm 
By:  Kaspar Wüthrich 
Abstract:  This paper analyzes estimators based on the instrumental variable quantile regression (IVQR) model (Chernozhukov and Hansen, 2004, 2005, 2006) under the local quantile treatment effects (LQTE) framework (Abadie et al., 2002). I show that the quantile treatment effect (QTE) estimators in the IVQR model are equivalent to LQTE for the compliers at transformed quantile levels. This transformation adjusts for differences between the subpopulation-specific potential outcome distributions that are identified in the LQTE model. Moreover, the IVQR estimator of the average treatment effect (ATE) corresponds to a convex combination of the local average treatment effect (LATE) and a weighted average of LQTE for the compliers. I extend the analysis to more general setups that allow for partial failures of the LQTE assumptions, non-binary instruments, and covariates. The results are illustrated with two empirical applications. 
Keywords:  Endogeneity; instrumental variables; quantile treatment effect; local quantile treatment effect; average treatment effect; local average treatment effect; rank similarity 
JEL:  C14 C21 C26 
Date:  2014–11 
URL:  http://d.repec.org/n?u=RePEc:ube:dpvwib:dp1408&r=ecm 
By:  Roberto Casarin (Department of Economics, University of Venice Cà Foscari); Fabrizio Leisen (Department of Economics, University of Kent); German Molina (Idalion Capital US LP); Enrique Ter Horst (CESA & IESA) 
Abstract:  We build on Fackler and King (1990) and propose a general calibration model for implied risk neutral densities. Our model allows for the joint calibration of a set of densities at different maturities and dates. The model is a Bayesian dynamic beta Markov random field which allows for possible time dependence between densities with the same maturity and for dependence across maturities at the same point in time. The assumptions on the prior distribution allow us to compound the needs of model flexibility, parameter parsimony and information pooling across densities. 
Keywords:  Bayesian inference, Beta random fields, Exchange Metropolis Hastings, Markov chain Monte Carlo, Risk neutral measure. 
JEL:  C11 C15 C33 C51 C58 G13 G17 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:ven:wpaper:2014:22&r=ecm 
By:  YABE, Ryota 
Abstract:  By using the empirical likelihood (EL), we consider the construction of pointwise confidence intervals (CIs) for nonparametric nonlinear nonstationary regression models with nonlinear nonstationary heterogeneous errors. It is well known that the EL-based CI has attractive properties such as data dependency and automatic studentization in cross-sectional and weak-dependence models. We extend EL theory to the nonparametric nonlinear nonstationary regression model and show that the log-EL ratio converges to a chi-squared random variable with one degree of freedom. This means that Wilks' theorem holds even if the covariate follows a nonstationary process. We also conduct empirical analysis of Japan's inverse money demand to demonstrate the data-dependency property of the EL-based CI. 
Date:  2014–12 
URL:  http://d.repec.org/n?u=RePEc:hit:econdp:201420&r=ecm 
By:  Timothy Christensen 
Abstract:  We introduce econometric methods to perform estimation and inference on the permanent and transitory components of the stochastic discount factor (SDF) in dynamic Markov environments. The approach is nonparametric in that it does not impose parametric restrictions on the law of motion of the state process. We propose sieve estimators of the eigenvalue-eigenfunction pair which are used to decompose the SDF into its permanent and transitory components, as well as estimators of the long-run yield and the entropy of the permanent component of the SDF, allowing for a wide variety of empirically relevant setups. Consistency and convergence rates are established. The estimators of the eigenvalue, yield and entropy are shown to be asymptotically normal and semiparametrically efficient when the SDF is observable. We also introduce nonparametric estimators of the continuation value under Epstein-Zin preferences, thereby extending the scope of our estimators to an important class of recursive preferences. The estimators are simple to implement, perform favorably in simulations, and may be used to numerically compute the eigenfunction and its eigenvalue in fully specified models when analytical solutions are not available. 
Date:  2014–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1412.4428&r=ecm 
By:  Gaure, Simen (The Ragnar Frisch Centre for Economic Research, Oslo, Norway) 
Abstract:  When doing two-way fixed effects OLS estimations, both the variances and covariance of the fixed effects are biased. A formula for a bias correction is known, but in large datasets it involves inverses of impractically large matrices. We detail how to compute the bias correction in this case. 
Keywords:  Limited mobility bias; Two-way fixed effects; Linear regression 
JEL:  A19 C13 C33 C50 C87 
Date:  2014–08–30 
URL:  http://d.repec.org/n?u=RePEc:hhs:osloec:2014_021&r=ecm 
By:  Liana Jacobi; Helga Wagner; Sylvia Frühwirth-Schnatter 
Abstract:  Child birth leads to a break in a woman's employment history and is considered one reason for the relatively poor labor market outcomes observed for women compared to men. However, the time spent at home after child birth varies significantly across mothers and is likely driven by observed and, more importantly, unobserved factors that also affect labor market outcomes directly. In this paper we propose two alternative Bayesian treatment modeling and inferential frameworks for panel outcomes to estimate dynamic earnings effects of a long maternity leave on mothers' earnings in the years following the return to the labor market. The frameworks differ in their modeling of the endogeneity of the treatment and the panel structure of the earnings, with the first framework based on the modeling tradition of the Roy switching regression model, and the second based on the shared factor approach. We show how stochastic variable selection can be implemented within both frameworks and can be used, for example, to test for the heterogeneity of the treatment effects. Our analysis is based on a large sample of mothers from the Austrian Social Security Register (ASSD) and exploits a recent change in the maternity leave policy to help identify the causal earnings effects. We find substantial negative earnings effects from long leave over a five-year period after mothers' return to the labor market, with the earnings gap between short and long leave mothers steadily narrowing over time. 
Keywords:  treatment effects models, switching regression model, shared factor model, factor analysis, spike and slab priors, variable selection, Markov Chain Monte Carlo method, earnings effects, maternity leave 
JEL:  C11 C31 C33 C38 C52 J31 J13 J16 
Date:  2014–02 
URL:  http://d.repec.org/n?u=RePEc:jku:nrnwps:2014_12&r=ecm 
By:  Kleijnen, Jack P.C. (Tilburg University, Center For Economic Research); Mehdad, E. (Tilburg University, Center For Economic Research) 
Abstract:  To analyze the input/output behavior of simulation models with multiple responses, we may apply either univariate or multivariate Kriging (Gaussian process) metamodels. In multivariate Kriging we face a major problem: the covariance matrix of all responses should remain positive-definite; we therefore use the recently proposed "nonseparable dependence" model. To evaluate the performance of univariate and multivariate Kriging, we perform several Monte Carlo experiments that simulate Gaussian processes. These Monte Carlo results suggest that the simpler univariate Kriging gives smaller mean square error. 
Keywords:  Simulation; Stochastic processes; Multivariate statistics 
JEL:  C0 C1 C9 C15 C44 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:tiu:tiucen:8a096696f7004cbe9474c6e93f5e323b&r=ecm 
By:  Ghysels, Eric 
Abstract:  We consider estimating volatility risk factors using large panels of filtered or realized volatilities. The data structure involves three types of asymptotic expansions. There is the cross-section of volatility estimates at each point in time, namely i = 1,…, N, observed at dates t = 1,…, T. In addition to expanding N and T, we also have the sampling frequency h of the data used to compute the volatility estimates, which rely on data collected at increasing frequency, h → 0. The continuous record or infill asymptotics (h → 0) allows us to control the cross-sectional and serial correlation among the idiosyncratic errors of the panel. A remarkable result emerges. Under suitable regularity conditions the traditional principal component analysis yields superconsistent estimates of the factors at each point in time. Namely, contrary to the root-N standard normal consistency, we find N-consistency, also standard normal, due to the fact that the high frequency sampling scheme is tied to the size of the cross-section, boosting the rate of convergence. We also show that standard cross-sectionally driven criteria suffice for consistent estimation of the number of factors, which is different from the traditional panel data results. Finally, we also show that the panel data estimates improve upon the individual volatility estimates. 
Keywords:  ARCH-type filters; Principal Component Analysis; realized volatility 
JEL:  C13 C33 
Date:  2014–06 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:10034&r=ecm 
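A toy version of this data structure, with one common volatility factor, Gaussian intraday returns, and purely illustrative choices of N, T and the intraday sampling frequency M = 1/h, shows the first principal component of the realized-variance panel recovering the factor path:

```python
import numpy as np

rng = np.random.default_rng(2)

# One common volatility factor driving N assets over T days,
# each day sampled at M intraday points (h = 1/M)
N, T, M = 50, 200, 78
f = np.abs(np.cumsum(rng.standard_normal(T)) / 10) + 1.0   # common factor path
loadings = rng.uniform(0.5, 1.5, N)
sigma2 = np.outer(f, loadings)                             # T x N spot variances

# Realized variance: sum of squared intraday returns
r = rng.standard_normal((T, N, M)) * np.sqrt(sigma2[:, :, None] / M)
rv = (r ** 2).sum(axis=2)                                  # T x N panel

# Principal component analysis on the demeaned panel
x = rv - rv.mean(axis=0)
eigval, eigvec = np.linalg.eigh(x.T @ x / T)
factor_hat = x @ eigvec[:, -1]                             # first PC

# The first PC should track the common volatility factor closely
corr = np.corrcoef(factor_hat, f - f.mean())[0, 1]
print(f"|corr(PC1, true factor)| = {abs(corr):.3f}")
```

The superconsistency result in the abstract is about the rate of this kind of PC estimator as M grows with N; the sketch only illustrates the panel construction and the extraction step.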
By:  Kerstin Gärtner (Vienna University); Mark Podolskij (Aarhus University and CREATES) 
Abstract:  In this paper we present some new asymptotic results for high frequency statistics of Brownian semistationary (BSS) processes. More precisely, we will show that singularities in the weight function, which is one of the ingredients of a BSS process, may lead to nonstandard limits of the realised quadratic variation. In this case the limiting process is a convex combination of shifted integrals of the intermittency function. Furthermore, we will demonstrate the corresponding stable central limit theorem. Finally, we apply the probabilistic theory to study the asymptotic properties of the realized ratio statistics, which estimates the smoothness parameter of a BSS process. 
Keywords:  Brownian semistationary processes, high frequency data, limit theorems, stable convergence 
JEL:  C10 C13 C14 
Date:  2014–12–10 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201450&r=ecm 
By:  Giovannelli, Alessandro; Proietti, Tommaso 
Abstract:  We address the problem of selecting the common factors that are relevant for forecasting macroeconomic variables. In economic forecasting using diffusion indexes the factors are ordered, according to their importance, in terms of relative variability, and are the same for each variable to predict, i.e. the process of selecting the factors is not supervised by the predictand. We propose a simple and operational supervised method, based on selecting the factors on the basis of their significance in the regression of the predictand on the predictors. Given a potentially large number of predictors, we consider linear transformations obtained by principal components analysis. The orthogonality of the components implies that the standard t-statistics for the inclusion of a particular component are independent, and thus applying a selection procedure that takes into account the multiplicity of the hypothesis tests is both correct and computationally feasible. We focus on three main multiple testing procedures: Holm's sequential method, controlling the familywise error rate; the Benjamini-Hochberg method, controlling the false discovery rate; and a procedure for incorporating prior information on the ordering of the components, based on weighting the p-values according to the eigenvalues associated with the components. We compare the empirical performance of these methods with the classical diffusion index (DI) approach proposed by Stock and Watson, conducting a pseudo-real-time forecasting exercise that assesses the predictions of 8 macroeconomic variables using factors extracted from a U.S. dataset consisting of 121 quarterly time series. The overall conclusion is that nature is tricky, but essentially benign: the information that is relevant for prediction is effectively condensed by the first few factors. However, variable selection, leading to the exclusion of some of the low order principal components, can lead to a sizable improvement in forecasting in specific cases. 
Only in one instance, real personal income, were we able to detect a significant contribution from high order components. 
Keywords:  Variable selection; Multiple testing; p-value weighting. 
JEL:  C22 C32 C38 C53 E3 E32 
Date:  2014–11–30 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:60673&r=ecm 
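The supervised selection step can be sketched as follows. The data-generating process, the number of candidate components k, and the use of statsmodels' `multipletests` for the Holm and Benjamini-Hochberg corrections are illustrative choices, not the authors' implementation:

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)

# Predictor panel with a three-factor structure; only factor 1 drives y
T, n = 200, 40
f_true = rng.standard_normal((T, 3))
X = f_true @ rng.standard_normal((3, n)) + 0.5 * rng.standard_normal((T, n))
y = 2.0 * f_true[:, 0] + rng.standard_normal(T)

# Principal components of the standardized predictors
Z = (X - X.mean(0)) / X.std(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
pcs = Z @ Vt.T                       # mutually orthogonal components

# Regress y on each component; orthogonality makes the t-tests independent
k = 10                               # number of candidate components
pvals = np.empty(k)
for j in range(k):
    b = pcs[:, j] @ y / (pcs[:, j] @ pcs[:, j])
    resid = y - b * pcs[:, j]
    se = np.sqrt(resid @ resid / (T - 1) / (pcs[:, j] @ pcs[:, j]))
    pvals[j] = 2 * stats.t.sf(abs(b / se), df=T - 1)

holm = multipletests(pvals, alpha=0.05, method="holm")[0]
bh = multipletests(pvals, alpha=0.05, method="fdr_bh")[0]
print("Holm selects components:", np.flatnonzero(holm))
print("BH selects components:  ", np.flatnonzero(bh))
```

Because the components are orthogonal, the multiple-testing corrections apply exactly as in the independent-tests case, which is the computational point the abstract emphasizes.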
By:  YABE, Ryota 
Abstract:  This paper considers the conditional sum of squares estimator (CSSE) for the moderate-deviation MA(1) process, in which the distance between the MA(1) parameter and unity is of larger order than O(T^{-1}). We show that the asymptotic distribution of the CSSE is normal, even though the process belongs to the local-to-unity class. The convergence rate changes continuously from an invertible order to a non-invertible one. In this sense, the moderate-deviation process in MA(1) has a continuous bridge property like the AR process. 
Keywords:  Moving average, Non-invertible moving average, Unit root, Local-to-unity, Moderate deviations, Conditional sum of squares estimation 
Date:  2014–12 
URL:  http://d.repec.org/n?u=RePEc:hit:econdp:201419&r=ecm 
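The conditional sum of squares estimator for MA(1) is concrete enough to sketch. Below, the near-unity parameter value and sample size are arbitrary, and the presample innovation is set to zero, which is the conditioning that gives the estimator its name:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def css(theta, y):
    """Conditional sum of squares for MA(1): y_t = e_t + theta * e_{t-1},
    with the presample innovation e_0 set to zero."""
    e = np.empty(len(y))
    e_prev = 0.0
    for t, yt in enumerate(y):
        e[t] = yt - theta * e_prev       # recover innovations recursively
        e_prev = e[t]
    return e @ e

rng = np.random.default_rng(6)
T, theta0 = 2000, 0.9                    # a near-unity MA parameter
eps = rng.standard_normal(T + 1)
y = eps[1:] + theta0 * eps[:-1]

res = minimize_scalar(css, bounds=(-0.99, 0.99), args=(y,), method="bounded")
print(f"CSSE of theta: {res.x:.4f}")
```

The paper's contribution concerns the asymptotic distribution of this estimator as the MA root drifts toward unity; the code only illustrates the objective function being minimized.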
By:  Fiorini, Mario; Stevens, Katrien 
Abstract:  Whenever treatment effects are heterogeneous and there is sorting into treatment based on the gain, monotonicity is a condition that both Instrumental Variable and fuzzy Regression Discontinuity designs have to satisfy for their estimate to be interpretable as a LATE. However, applied economic work rarely discusses this important assumption. This is in stark contrast to the lengthy discussions dedicated to the other IV and fuzzy RD conditions. We show that monotonicity can and should be investigated using a mix of economic insights, data patterns and formal tests. This is just an extra step to validate the results. We provide examples in a variety of settings as a guide to practice. 
Keywords:  essential heterogeneity, monotonicity assumption, LATE, instrumental variable, regression discontinuity 
Date:  2014–10 
URL:  http://d.repec.org/n?u=RePEc:syd:wpaper:201413&r=ecm 
By:  Stephen G. Hall; P. A. V. B. Swamy; George S. Tavlas 
Keywords:  Timevarying coefficient model, Coefficient driver, Specification Problem, Correct interpretation of coefficients 
JEL:  C13 C19 C22 
Date:  2014–12 
URL:  http://d.repec.org/n?u=RePEc:lec:leecon:14/18&r=ecm 
By:  Lei, J. (Tilburg University, School of Economics and Management) 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:tiu:tiutis:302d1ae7031043b0b2536e3da1413d35&r=ecm 
By:  Wagner Piazza Gaglianone; João Victor Issler 
Abstract:  In this paper, we propose a microfounded framework to investigate a panel of forecasts (e.g. model-driven or survey-based) and the possibility of improving their out-of-sample forecast performance by employing a bias-correction device. Following Patton and Timmermann (2007), we theoretically justify the modeling of forecasts as a function of the conditional expectation, based on the optimization problem of individual forecasters. This approach allows us to relax the standard assumption of a mean squared error (MSE) loss function and, thus, to obtain optimal forecasts under more general loss functions. However, different from these authors, we apply our results to a panel of forecasts in order to construct an optimal (combined) forecast. In this sense, a feasible GMM estimator is proposed to aggregate the information content of each individual forecast and optimally recover the conditional expectation. Our setup can be viewed as a generalization of the three-way forecast error decomposition of Davies and Lahiri (1995), and as an extension of the bias-corrected average forecast of Issler and Lima (2009). A real-time forecasting exercise using the Brazilian Focus survey illustrates the proposed methodology. 
Date:  2014–12 
URL:  http://d.repec.org/n?u=RePEc:bcb:wpaper:372&r=ecm 
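The bias-corrected average forecast idea that the abstract builds on can be illustrated in a few lines. The panel of forecasters, their additive biases, and the training split below are all simulated assumptions, and the feasible GMM aggregation proposed in the paper is replaced by the simplest pooled bias estimate:

```python
import numpy as np

rng = np.random.default_rng(4)

# Panel of N forecasters over T periods; each forecaster adds an
# individual bias and idiosyncratic noise to the conditional mean
T, N = 300, 20
cond_mean = np.cumsum(rng.standard_normal(T)) * 0.1
y = cond_mean + rng.standard_normal(T)                 # realized target
bias = rng.uniform(-1.0, 1.0, N)
fcst = cond_mean[:, None] + bias[None, :] + 0.3 * rng.standard_normal((T, N))

# Bias-corrected average forecast: subtract the pooled bias estimated
# on a training window from the cross-sectional average forecast
train = slice(0, 150)
mean_bias = (fcst[train] - y[train, None]).mean()
bcaf = fcst.mean(axis=1) - mean_bias

mse_plain = np.mean((fcst[150:].mean(axis=1) - y[150:]) ** 2)
mse_bcaf = np.mean((bcaf[150:] - y[150:]) ** 2)
print(f"MSE plain average: {mse_plain:.3f}  MSE bias-corrected: {mse_bcaf:.3f}")
```

Averaging across forecasters removes idiosyncratic noise but not the common component of the biases; the correction term targets exactly that residual bias.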
By:  Cerqueti, Roy; Lupi, Claudio 
Abstract:  This note contributes to the development of the theory of stochastic dependence by employing the general concept of copula. In particular, it deals with the construction of a new family of non-exchangeable copulas characterizing the multivariate total positivity of order 2 (MTP2) dependence. 
Keywords:  Copulas, MTP2 dependence, Non-exchangeability 
JEL:  C19 C39 
Date:  2014–11–23 
URL:  http://d.repec.org/n?u=RePEc:mol:ecsdps:esdp14075&r=ecm 
By:  Gustavo Fruet Dias (Aarhus University and CREATES); Fotis Papailias (Queen's University Belfast and quantf Research) 
Abstract:  A two-stage forecasting approach for long memory time series is introduced. In the first step we estimate the fractional exponent and, applying the fractional differencing operator, we obtain the underlying weakly dependent series. In the second step, we perform the multi-step ahead forecasts for the weakly dependent series and obtain their long memory counterparts by applying the fractional cumulation operator. The methodology applies to stationary and nonstationary cases. Simulations and an application to seven time series provide evidence that the new methodology is more robust to structural change and yields good forecasting results. 
Keywords:  Forecasting, Spurious Long Memory, Structural Change, Local Whittle 
JEL:  C22 C53 
Date:  2014–12–15 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201455&r=ecm 
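The two steps are mechanical once a fractional differencing operator is available. The sketch below takes the memory parameter d as known (the paper estimates it, e.g. by local Whittle) and uses an ARFIMA(0, d, 0) toy process, so the weakly dependent series in step 1 is white noise and its optimal forecast is zero:

```python
import numpy as np

def frac_diff_weights(d, n):
    """Binomial-expansion weights of the fractional difference (1 - L)^d."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply (1 - L)^d to a series, truncating at the sample start."""
    w = frac_diff_weights(d, len(x))
    return np.array([w[: t + 1][::-1] @ x[: t + 1] for t in range(len(x))])

rng = np.random.default_rng(5)
d, T = 0.4, 500

# Simulation: ARFIMA(0, d, 0) via x = (1 - L)^(-d) eps
eps = rng.standard_normal(T)
x = frac_diff(eps, -d)

# Step 1: fractional differencing recovers the weakly dependent series
u = frac_diff(x, d)                      # equals eps up to rounding error

# Step 2: forecast u (white noise, so the forecast is 0), then cumulate
# back with the fractional cumulation operator (1 - L)^(-d)
u_ext = np.append(u, 0.0)                # one-step-ahead forecast of u
x_fcst = frac_diff(u_ext, -d)[-1]
print(f"one-step-ahead long-memory forecast: {x_fcst:.4f}")
```

With a serially correlated differenced series, step 2 would fit a short-memory model (e.g. an AR) to `u` instead of forecasting zero; the cumulation step is unchanged.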
By:  Gonzalez-Astudillo, Manuel (Board of Governors of the Federal Reserve System (U.S.)) 
Abstract:  In this paper, I propose an econometric technique to estimate a Markov-switching Taylor rule subject to the zero lower bound of interest rates. I show that incorporating a Tobit-like specification allows one to obtain consistent estimators. More importantly, I show that linking the switching of the Taylor rule coefficients to the switching of the coefficients of an auxiliary uncensored Markov-switching regression improves the identification of an otherwise unidentifiable prevalent monetary regime. To illustrate the proposed estimation technique, I use U.S. quarterly data spanning 1960:1 to 2013:4. The chosen auxiliary Markov-switching regression is a fiscal policy rule where federal revenues react to debt and the output gap. Results show that there is evidence of policy comovements, with debt-stabilizing fiscal policy more likely accompanying active monetary policy, and vice versa. 
Keywords:  Markov-switching coefficients; zero lower bound; monetary-fiscal policy interactions 
JEL:  C34 E52 E63 
Date:  2014–09–19 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:201497&r=ecm 
By:  Subhash C. Ray (University of Connecticut) 
Abstract:  Over the past decades Data Envelopment Analysis (DEA) has emerged as an important nonparametric method of evaluating performance of decision making units through benchmarking. Although developed primarily for measuring technical efficiency, DEA is now applied extensively for measuring scale efficiency, cost efficiency, and profit efficiency as well. This paper integrates the different DEA models commonly applied in empirical research with their underlying theoretical foundations in neoclassical production economics. 
Keywords:  Linear Programming; Technical Efficiency; Returns to Scale; Distance Functions 
JEL:  C6 D2 
Date:  2014–11 
URL:  http://d.repec.org/n?u=RePEc:uct:uconnp:201433&r=ecm 
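The connection between DEA and linear programming is concrete enough to sketch. The following is a minimal input-oriented CCR (constant returns to scale) envelopment program, one of the standard DEA models the paper surveys, solved with scipy's `linprog` on a made-up three-unit example:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.
    X: (m, n) inputs and Y: (s, n) outputs for n decision-making units.
    Solves: min theta s.t. X @ lam <= theta * x0, Y @ lam >= y0, lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                      # decision vars: [theta, lam]
    A_in = np.hstack([-X[:, [j0]], X])               # X @ lam - theta * x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])        # -Y @ lam <= -y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    bounds = [(0, None)] * (n + 1)                   # theta >= 0, lam >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Three units, one input, one output; unit 1 has the best output/input ratio
X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[1.0, 4.0, 4.0]])
scores = [dea_ccr_input(X, Y, j) for j in range(3)]
print("efficiency scores:", np.round(scores, 3))
```

With one input and one output, the CCR score reduces to each unit's output/input ratio divided by the best ratio in the sample, so units 0 and 2 score 0.5 and unit 1 scores 1.0.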