nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒12‒24
27 papers chosen by
Sune Karlsson
Örebro universitet

  1. High Dimensional Correlation Matrices: CLT and Its Applications By Jiti Gao; Xiao Han; Guangming Pan; Yanrong Yang
  2. Restricted Likelihood Ratio Tests in Predictive Regression By Peter C.B. Phillips; Ye Chen
  3. Threshold Regression with Endogeneity By Ping Yu; Peter C.B. Phillips
  4. Panel Data Analysis with Heterogeneous Dynamics By Ryo Okui; Takahide Yanagi
  5. Specific Markov-switching behaviour for ARMA parameters By CARPANTIER, Jean-François; DUFAYS, Arnaud
  6. On Maximum Likelihood estimation of dynamic panel data models By Maurice J.G. Bun; Martin A. Carree; Arturas Juodis
  7. Unit Root Tests, Size Distortions, and Cointegrated Data By W. Robert Reed
  8. Testing against Changing Correlation By Andrew Harvey; Stephen Thiele
  9. A Comparison of two Quantile Models with Endogeneity By Kaspar Wüthrich
  10. A Bayesian Beta Markov Random Field calibration of the term structure of implied risk neutral densities By Roberto Casarin; Fabrizio Leisen; German Molina; Enrique Ter Horst
  11. Empirical Likelihood Confidence Intervals for Nonparametric Nonlinear Nonstationary Regression Models By YABE, Ryota
  12. Nonparametric Stochastic Discount Factor Decomposition By Timothy Christensen
  13. Practical Correlation Bias Correction in Two-way Fixed Effects Linear Regression By Gaure, Simen
  14. Bayesian Treatment Effects Models with Variable Selection for Panel Outcomes with an Application to Earnings Effects of Maternity Leave By Liana Jacobi; Helga Wagner; Sylvia Frühwirth-Schnatter
  15. Multivariate Versus Univariate Kriging Metamodels for Multi-Response Simulation Models (Revision of 2012-039) By Kleijnen, Jack P.C.; Mehdad, E.
  16. Factor Analysis with Large Panels of Volatility Proxies By Ghysels, Eric
  17. On non-standard limits of Brownian semi-stationary processes By Kerstin Gärtner; Mark Podolskij
  18. On the Selection of Common Factors for Macroeconomic Forecasting By Giovannelli, Alessandro; Proietti, Tommaso
  19. Asymptotic Distribution of the Conditional Sum of Squares Estimator Under Moderate Deviation From a Unit Root in MA(1) By YABE, Ryota
  20. Assessing the Monotonicity Assumption in IV and fuzzy RD designs By Fiorini, Mario; Katrien Stevens
  21. Time Varying Coefficient Models: A Proposal for selecting the Coefficient Driver Sets By Stephen G. Hall; P. A. V. B. Swamy; George S. Tavlas
  22. Essays on nonlinear panel data models By Lei, J.
  23. Microfounded Forecasting By Wagner Piazza Gaglianone; João Victor Issler
  24. A new family of nonexchangeable copulas for positive dependence By Cerqueti, Roy; Lupi, Claudio
  25. Forecasting Long Memory Series Subject to Structural Change: A Two-Stage Approach By Gustavo Fruet Dias; Fotis Papailias
  26. Identifying the Stance of Monetary Policy at the Zero Lower Bound: A Markov-switching Estimation Exploiting Monetary-Fiscal Policy Interdependence By Gonzalez-Astudillo, Manuel
  27. Data Envelopment Analysis: An Overview By Subhash C. Ray

  1. By: Jiti Gao; Xiao Han; Guangming Pan; Yanrong Yang
    Abstract: Statistical inference for sample correlation matrices is important in high dimensional data analysis. Motivated by this need, this paper establishes a new central limit theorem (CLT) for a linear spectral statistic (LSS) of high dimensional sample correlation matrices for the case where the dimension p and the sample size n are comparable. This result is of independent interest in large dimensional random matrix theory. We then apply the linear spectral statistic to an independence test for p random variables, and subsequently to an equivalence test for p factor loadings and n factors in a factor model. The finite-sample performance of the proposed tests shows their applicability and effectiveness in practice. An empirical application to test the independence of household incomes from different cities in China is also conducted.
    Keywords: Central limit theorem; equivalence test; high dimensional correlation matrix; independence test; linear spectral statistics.
    JEL: C21 C32
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2014-26&r=ecm
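    Code sketch: a minimal numpy illustration of the object studied above, namely the eigenvalues of the sample correlation matrix and a linear spectral statistic sum_i f(lambda_i). The choice f = log, the sample sizes, and the three-factor alternative are illustrative assumptions; the paper's centering terms and limiting variance are not reproduced.

      import numpy as np

      def linear_spectral_statistic(X, f=np.log):
          """Eigenvalues of the sample correlation matrix of X (n x p)
          and the linear spectral statistic sum_i f(lambda_i)."""
          R = np.corrcoef(X, rowvar=False)      # p x p sample correlation matrix
          eigvals = np.linalg.eigvalsh(R)       # real symmetric -> real eigenvalues
          return eigvals, np.sum(f(eigvals))

      rng = np.random.default_rng(0)
      n, p = 500, 100                           # comparable dimension and sample size
      X_indep = rng.standard_normal((n, p))     # independent coordinates (the null)
      F = rng.standard_normal((n, 3))
      X_factor = F @ rng.standard_normal((3, p)) + X_indep   # 3-factor alternative

      for X, label in [(X_indep, "independent"), (X_factor, "factor structure")]:
          lam, lss = linear_spectral_statistic(X)
          print(f"{label:16s}  LSS(log) = {lss:8.2f}")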
  2. By: Peter C.B. Phillips (Cowles Foundation, Yale University); Ye Chen (Singapore Management University)
    Abstract: Chen and Deo (2009a) proposed procedures based on restricted maximum likelihood (REML) for estimation and inference in the context of predictive regression. Their method achieves bias reduction in both estimation and inference, which assists in overcoming size distortion in predictive hypothesis testing. This paper provides extensions of the REML approach to more general cases which allow for drift in the predictive regressor and multiple regressors. It is shown that without modification the REML approach is seriously oversized and can have unit rejection probability in the limit under the null when the drift in the regressor is dominant. A limit theory for the modified REML test is given under a localized drift specification that accommodates predictors with varying degrees of persistence. The extension is useful in empirical work where predictors typically involve stochastic trends with drift and where there are multiple regressors. Simulations show that with these modifications, the good performance of the restricted likelihood ratio test (RLRT) is preserved and that RLRT outperforms other predictive tests in terms of size and power even when there is no drift in the regressor.
    Keywords: Localized drift, Predictive regression, Restricted likelihood ratio test, Size distortion
    JEL: C12 C13 C58
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1968&r=ecm
  3. By: Ping Yu (University of Hong Kong); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: This paper studies estimation and specification testing in threshold regression with endogeneity. Three key results differ from those in regular models. First, both the threshold point and the threshold effect parameters are shown to be identified without the need for instrumentation. Second, in partially linear threshold models, both parametric and nonparametric components rely on the same data, which prima facie suggests identification failure. But, as shown here, the discontinuity structure of the threshold itself supplies identifying information for the parametric coefficients without the need for extra randomness in the regressors. Third, instrumentation plays different roles in the estimation of the system parameters, delivering identification for the structural coefficients in the usual way, but raising convergence rates for the threshold effect parameters and improving efficiency for the threshold point. Specification tests are developed to test for the presence of endogeneity and threshold effects without relying on instrumentation of the covariates. The threshold effect test extends conventional parametric structural change tests to the nonparametric case. A wild bootstrap procedure is suggested to deliver finite sample critical values for both tests. Simulation studies corroborate the theory and the asymptotics. An empirical application is conducted to explore the effects of 401(k) retirement programs on savings, illustrating the relevance of threshold models in treatment effects evaluation in the presence of endogeneity.
    Keywords: Threshold regression, Endogeneity, Local shifter, Identification, Efficiency, Integrated difference kernel estimator, Regression discontinuity design, Optimal rate of convergence, Partial linear model, Specification test, U-statistic, Wild bootstrap, Threshold treatment model, 401(k) plan
    JEL: C12 C13 C14 C21 C26
    Date: 2014–12
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1966&r=ecm
  4. By: Ryo Okui (Institute of Economic Research, Kyoto University); Takahide Yanagi (Graduate School of Economics, Kyoto University)
    Abstract: This paper proposes the analysis of panel data whose dynamic structure is heterogeneous across individuals. Our aim is to estimate the cross-sectional distributions and/or some distributional features of the heterogeneous mean and autocovariances. We do not assume any specific model for the dynamics. Our proposed method is easy to implement. We first compute the sample mean and autocovariances for each individual and then estimate the parameter of interest based on the empirical distributions of the estimated mean and autocovariances. The asymptotic properties of the proposed estimators are investigated using double asymptotics under which both the cross-sectional sample size (N) and the length of the time series (T) tend to infinity. We prove the functional central limit theorem for the empirical process of the proposed distribution estimator. By using the functional delta method, we also derive the asymptotic distributions of the estimators for various parameters of interest. We show that the distribution estimator exhibits a bias whose order is proportional to 1/√T. In contrast, when the parameter of interest can be written as the expectation of a smooth function of the heterogeneous mean and/or autocovariances, the bias is of order 1/T and can be corrected by the jackknife method. The results of Monte Carlo simulations show that our asymptotic results are informative regarding the finite-sample properties of the estimators. They also demonstrate that the proposed jackknife bias correction is successful.
    Keywords: Panel data; heterogeneity; functional central limit theorem; autocovariance; jackknife; long panel.
    JEL: C13 C14 C23
    Date: 2014–11
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:906&r=ecm
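    Code sketch: a minimal version of the two-step procedure described in the abstract: compute each unit's sample mean and lag-1 autocovariance, then work with their cross-sectional empirical distribution. The AR(1) data-generating process and the half-panel form of the jackknife are my assumptions for illustration; see the paper for the exact bias correction.

      import numpy as np

      def unit_moments(y):
          """Sample mean and lag-1 autocovariance of one time series."""
          ybar = y.mean()
          gamma1 = np.mean((y[1:] - ybar) * (y[:-1] - ybar))
          return ybar, gamma1

      def estimate(Y):
          """N x 2 array of per-unit (mean, lag-1 autocovariance) estimates."""
          return np.array([unit_moments(y) for y in Y])

      rng = np.random.default_rng(1)
      N, T = 1000, 200
      phi = rng.uniform(0.1, 0.9, size=N)       # heterogeneous AR(1) dynamics
      Y = np.zeros((N, T))
      for t in range(1, T):
          Y[:, t] = phi * Y[:, t - 1] + rng.standard_normal(N)

      pairs = estimate(Y)
      theta_full = pairs[:, 1].mean()           # E[gamma_1(i)]: a smooth functional
      half1 = estimate(Y[:, : T // 2])[:, 1].mean()
      half2 = estimate(Y[:, T // 2 :])[:, 1].mean()
      theta_jack = 2 * theta_full - 0.5 * (half1 + half2)   # half-panel jackknife
      print(theta_full, theta_jack)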
  5. By: CARPANTIER, Jean-François (CREA, Université du Luxembourg); DUFAYS, Arnaud (ENSAE-CREST, Paris)
    Abstract: We propose an estimation method that circumvents the path dependence problem in Change-Point (CP) and Markov Switching (MS) ARMA models. Our model embeds a sticky infinite hidden Markov-switching structure (sticky IHMM), which allows the number of regimes, as well as the specification (CP or MS), to be determined by the data. Furthermore, CP and MS frameworks usually assume that all the model parameters vary from one regime to another. We relax this restrictive assumption. As illustrated by simulations on moderate samples (300 observations), the sticky IHMM-ARMA algorithm detects which model parameters change over time. Applications to the U.S. GDP growth and the DJIA realized volatility highlight the relevance of estimating different structural breaks for the mean and variance parameters.
    Keywords: Bayesian inference, Markov-switching model, ARMA model, infinite hidden Markov model, Dirichlet Process
    JEL: C11 C15 C22 C58
    Date: 2014–06–11
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2014014&r=ecm
  6. By: Maurice J.G. Bun; Martin A. Carree; Arturas Juodis
    Abstract: We analyze the finite sample properties of maximum likelihood estimators for dynamic panel data models. In particular, we consider Transformed Maximum Likelihood (TML) and Random effects Maximum Likelihood (RML) estimation. We show that TML and RML estimators are solutions to a cubic first-order condition in the autoregressive parameter. Furthermore, in finite samples both likelihood estimators might lead to a negative estimate of the variance of the individual specific effects. We consider different approaches taking into account the non-negativity restriction for the variance. We show that these approaches may lead to a boundary solution different from the unique global unconstrained maximum. In an extensive Monte Carlo study we find that this boundary solution issue is non-negligible for small values of T and that different approaches might lead to substantially different finite sample properties. Furthermore, we find that the Likelihood Ratio statistic provides size control in small samples, albeit with low power due to the flatness of the log-likelihood function. We illustrate these issues modeling U.S. state level unemployment dynamics.
    Date: 2014–12–16
    URL: http://d.repec.org/n?u=RePEc:ame:wpaper:1404&r=ecm
  7. By: W. Robert Reed (University of Canterbury)
    Abstract: This paper demonstrates that unit root tests can suffer from inflated Type I error rates when data are cointegrated. Results from Monte Carlo simulations show that three commonly used unit root tests – the ADF, Phillips-Perron, and DF-GLS tests – frequently overreject the true null of a unit root for at least one of the cointegrated variables. The findings extend previous research which reports size distortions for unit root tests when the associated error terms are serially correlated (Schwert, 1989; DeJong et al., 1992; Harris, 1992). While the addition to the Dickey-Fuller-type specification of the correct number of lagged differenced (LD) terms can eliminate the size distortion, I demonstrate that determining the correct number of LD terms is unachievable in practice. Standard diagnostics, such as testing for serial correlation in the residuals and using information criteria to compare different lag specifications, are unable to identify the required number of lags. A unique feature of this study is that it includes programs (an Excel spreadsheet and Stata .do files) that allow readers to simulate their own cointegrated data -- using parameters of their own choosing -- to confirm the findings reported in this paper.
    Keywords: Unit root testing, cointegration, DF-GLS test, Augmented Dickey-Fuller test, Phillips-Perron test, simulation
    JEL: C32 C22 C18
    Date: 2014–12–14
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:14/28&r=ecm
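    Code sketch: the paper ships its own Excel and Stata programs; the sketch below is an independent Python analogue that simulates a simple cointegrated pair and records how often the ADF test rejects the true null of a unit root. The DGP parameters are illustrative assumptions, not the paper's designs.

      import numpy as np
      from statsmodels.tsa.stattools import adfuller

      rng = np.random.default_rng(2)
      T, reps, rejections = 200, 500, 0
      for _ in range(reps):
          x = np.cumsum(rng.standard_normal(T))   # random walk (true unit root)
          y = x + 3 * rng.standard_normal(T)      # cointegrated with x
          pval = adfuller(y, autolag="AIC")[1]    # H0: y has a unit root (true here)
          rejections += pval < 0.05
      print("rejection rate:", rejections / reps) # typically far above the nominal 5%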
  8. By: Andrew Harvey; Stephen Thiele
    Abstract: A test for time-varying correlation is developed within the framework of a dynamic conditional score (DCS) model for both Gaussian and Student t-distributions. The test may be interpreted as a Lagrange multiplier test and modified to allow for the estimation of models for time-varying volatility in the individual series. Unlike standard moment-based tests, the score-based test statistic includes information on the level of correlation under the null hypothesis and local power arguments indicate the benefits of doing so. A simulation study shows that the performance of the score-based test is strong relative to existing tests across a range of data generating processes. An application to the Hong Kong and South Korean equity markets shows that the new test reveals changes in correlation that are not detected by the standard moment-based test.
    Keywords: Dynamic conditional score, EGARCH, Lagrange multiplier test, Portmanteau test, Time-varying covariance matrices.
    JEL: C14 C22 F36
    Date: 2014–11–28
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:1439&r=ecm
  9. By: Kaspar Wüthrich
    Abstract: This paper analyzes estimators based on the instrumental variable quantile regression (IVQR) model (Chernozhukov and Hansen, 2004, 2005, 2006) under the local quantile treatment effects (LQTE) framework (Abadie et al., 2002). I show that the quantile treatment effect (QTE) estimators in the IVQR model are equivalent to LQTE for the compliers at transformed quantile levels. This transformation adjusts for differences between the subpopulation-specific potential outcome distributions that are identified in the LQTE model. Moreover, the IVQR estimator of the average treatment effect (ATE) corresponds to a convex combination of the local average treatment effect (LATE) and a weighted average of LQTE for the compliers. I extend the analysis to more general setups that allow for partial failures of the LQTE assumptions, non-binary instruments, and covariates. The results are illustrated with two empirical applications.
    Keywords: Endogeneity; instrumental variables; quantile treatment effect; local quantile treatment effect; average treatment effect; local average treatment effect; rank similarity
    JEL: C14 C21 C26
    Date: 2014–11
    URL: http://d.repec.org/n?u=RePEc:ube:dpvwib:dp1408&r=ecm
  10. By: Roberto Casarin (Department of Economics, University of Venice Cà Foscari); Fabrizio Leisen (Department of Economics, University of Kent); German Molina (Idalion Capital US LP); Enrique Ter Horst (CESA & IESA)
    Abstract: We build on Fackler and King (1990) and propose a general calibration model for implied risk neutral densities. Our model allows for the joint calibration of a set of densities at different maturities and dates. The model is a Bayesian dynamic beta Markov random field which allows for possible time dependence between densities with the same maturity and for dependence across maturities at the same point in time. The assumptions on the prior distribution allow us to balance the needs of model flexibility, parameter parsimony and information pooling across densities.
    Keywords: Bayesian inference, Beta random fields, Exchange Metropolis Hastings, Markov chain Monte Carlo, Risk neutral measure.
    JEL: C11 C15 C33 C51 C58 G13 G17
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:ven:wpaper:2014:22&r=ecm
  11. By: YABE, Ryota
    Abstract: By using the empirical likelihood (EL), we consider the construction of pointwise confidence intervals (CIs) for nonparametric nonlinear nonstationary regression models with nonlinear nonstationary heterogeneous errors. It is well known that the EL-based CI has attractive properties such as data dependency and automatic studentization in cross-sectional and weak-dependence models. We extend EL theory to the nonparametric nonlinear nonstationary regression model and show that the log-EL ratio converges to a chi-squared random variable with one degree of freedom. This means that Wilks' theorem holds even if the covariate follows a nonstationary process. We also conduct empirical analysis of Japan's inverse money demand to demonstrate the data-dependency property of the EL-based CI.
    Date: 2014–12
    URL: http://d.repec.org/n?u=RePEc:hit:econdp:2014-20&r=ecm
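    Code sketch: the paper's contribution is the nonstationary asymptotics, but the EL mechanics it builds on can be shown in a simple i.i.d. setting (Owen-style empirical likelihood for a scalar mean, with Wilks' chi-squared calibration). The data and grid below are illustrative assumptions.

      import numpy as np
      from scipy.optimize import brentq
      from scipy.stats import chi2

      def neg2_log_el(x, mu):
          """-2 log empirical likelihood ratio for the mean of x at mu (Owen)."""
          z = x - mu
          # lambda solves sum z_i / (1 + lambda z_i) = 0 on the admissible interval
          lo = (-1 + 1e-10) / z.max()
          hi = (-1 + 1e-10) / z.min()
          lam = brentq(lambda l: np.sum(z / (1 + l * z)), lo, hi)
          return 2 * np.sum(np.log1p(lam * z))

      rng = np.random.default_rng(3)
      x = rng.exponential(size=100)
      crit = chi2.ppf(0.95, df=1)
      # EL 95% CI: all mu with -2 log R(mu) <= chi2_{1,0.95} (Wilks' theorem)
      grid = np.linspace(x.mean() - 0.5, x.mean() + 0.5, 401)
      ci = [mu for mu in grid if neg2_log_el(x, mu) <= crit]
      print("EL 95% CI for the mean: [%.3f, %.3f]" % (ci[0], ci[-1]))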
  12. By: Timothy Christensen
    Abstract: We introduce econometric methods to perform estimation and inference on the permanent and transitory components of the stochastic discount factor (SDF) in dynamic Markov environments. The approach is nonparametric in that it does not impose parametric restrictions on the law of motion of the state process. We propose sieve estimators of the eigenvalue-eigenfunction pair which are used to decompose the SDF into its permanent and transitory components, as well as estimators of the long-run yield and the entropy of the permanent component of the SDF, allowing for a wide variety of empirically relevant setups. Consistency and convergence rates are established. The estimators of the eigenvalue, yield and entropy are shown to be asymptotically normal and semiparametrically efficient when the SDF is observable. We also introduce nonparametric estimators of the continuation value under Epstein-Zin preferences, thereby extending the scope of our estimators to an important class of recursive preferences. The estimators are simple to implement, perform favorably in simulations, and may be used to numerically compute the eigenfunction and its eigenvalue in fully specified models when analytical solutions are not available.
    Date: 2014–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1412.4428&r=ecm
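    Code sketch: a finite-state analogue of the eigenproblem underlying the decomposition. With a transition matrix P and one-period SDF values m(x, x'), the permanent-transitory split comes from the Perron-Frobenius eigenpair of the pricing operator; the matrices below are illustrative assumptions, and the paper's sieve estimator targets the continuous-state version of this object.

      import numpy as np

      # Finite-state Markov analogue of the Hansen-Scheinkman decomposition: solve
      # sum_x' P[x, x'] m(x, x') phi(x') = rho * phi(x)  (Perron-Frobenius eigenpair).
      P = np.array([[0.9, 0.1],
                    [0.2, 0.8]])              # illustrative transition matrix
      M = np.array([[0.97, 1.02],
                    [0.99, 0.96]])            # illustrative one-period SDF m(x, x')
      Q = P * M                               # elementwise product: pricing operator

      eigvals, eigvecs = np.linalg.eig(Q)
      k = np.argmax(eigvals.real)             # largest (positive) eigenvalue
      rho, phi = eigvals[k].real, np.abs(eigvecs[:, k].real)

      print("long-run yield -log(rho) =", -np.log(rho))
      # transitory component: m^T_{t,t+1} = rho * phi(x_t) / phi(x_{t+1});
      # the permanent component m / m^T is the martingale part of the SDF.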
  13. By: Gaure, Simen (The Ragnar Frisch Centre for Economic Research, Oslo, Norway)
    Abstract: When estimating two-way fixed effects models by OLS, both the estimated variances and the covariance of the fixed effects are biased. A formula for a bias correction is known, but in large datasets it involves inverses of impractically large matrices. We detail how to compute the bias correction in this case.
    Keywords: Limited mobility bias; Two-way fixed effects; Linear regression
    JEL: A19 C13 C33 C50 C87
    Date: 2014–08–30
    URL: http://d.repec.org/n?u=RePEc:hhs:osloec:2014_021&r=ecm
  14. By: Liana Jacobi; Helga Wagner; Sylvia Frühwirth-Schnatter
    Abstract: Childbirth leads to a break in a woman's employment history and is considered one reason for the relatively poor labor market outcomes observed for women compared to men. However, the time spent at home after childbirth varies significantly across mothers and is likely driven by observed and, more importantly, unobserved factors that also affect labor market outcomes directly. In this paper we propose two alternative Bayesian treatment modeling and inferential frameworks for panel outcomes to estimate dynamic earnings effects of a long maternity leave on mothers' earnings in the years following the return to the labor market. The frameworks differ in their modeling of the endogeneity of the treatment and the panel structure of the earnings, with the first framework based on the modeling tradition of the Roy switching regression model, and the second based on the shared factor approach. We show how stochastic variable selection can be implemented within both frameworks and can be used, for example, to test for the heterogeneity of the treatment effects. Our analysis is based on a large sample of mothers from the Austrian Social Security Register (ASSD) and exploits a recent change in the maternity leave policy to help identify the causal earnings effects. We find substantial negative earnings effects from long leave over a 5-year period after mothers' return to the labor market, with the earnings gap between short- and long-leave mothers steadily narrowing over time.
    Keywords: treatment effects models, switching regression model, shared factor model, factor analysis, spike and slab priors, variable selection, Markov Chain Monte Carlo method, earnings effects, maternity leave
    JEL: C11 C31 C33 C38 C52 J31 J13 J16
    Date: 2014–02
    URL: http://d.repec.org/n?u=RePEc:jku:nrnwps:2014_12&r=ecm
  15. By: Kleijnen, Jack P.C. (Tilburg University, Center For Economic Research); Mehdad, E. (Tilburg University, Center For Economic Research)
    Abstract: To analyze the input/output behavior of simulation models with multiple responses, we may apply either univariate or multivariate Kriging (Gaussian process) metamodels. In multivariate Kriging we face a major problem: the covariance matrix of all responses should remain positive-definite; we therefore use the recently proposed "non-separable dependence" model. To evaluate the performance of univariate and multivariate Kriging, we perform several Monte Carlo experiments that simulate Gaussian processes. These Monte Carlo results suggest that the simpler univariate Kriging gives smaller mean squared error.
    Keywords: Simulation; Stochastic processes; Multivariate statistics
    JEL: C0 C1 C9 C15 C44
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:tiu:tiucen:8a096696-f700-4cbe-9474-c6e93f5e323b&r=ecm
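    Code sketch: the univariate alternative evaluated in the paper, i.e. one independent Gaussian-process (Kriging) metamodel per simulation response, here with scikit-learn. The test functions, kernel, and design points are illustrative assumptions, not the paper's experimental design.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      rng = np.random.default_rng(4)
      X = rng.uniform(0, 1, size=(30, 2))       # 30 simulation input points in 2-D
      # two correlated simulation responses (illustrative test functions)
      Y = np.column_stack([np.sin(4 * X[:, 0]) + X[:, 1],
                           np.sin(4 * X[:, 0]) - X[:, 1]])

      # univariate Kriging: one independent GP metamodel per response
      models = [GaussianProcessRegressor(kernel=ConstantKernel() * RBF([1.0, 1.0]),
                                         normalize_y=True).fit(X, Y[:, j])
                for j in range(Y.shape[1])]

      Xnew = rng.uniform(0, 1, size=(5, 2))
      preds = np.column_stack([m.predict(Xnew) for m in models])
      print(preds)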
  16. By: Ghysels, Eric
    Abstract: We consider estimating volatility risk factors using large panels of filtered or realized volatilities. The data structure involves three types of asymptotic expansions. There is the cross-section of volatility estimates at each point in time, namely i = 1,…, N, observed at dates t = 1,…, T. In addition to expanding N and T, we also have the sampling frequency h of the data used to compute the volatility estimates, which rely on data collected at increasing frequency, h → 0. The continuous record or in-fill asymptotics (h → 0) allows us to control the cross-sectional and serial correlation among the idiosyncratic errors of the panel. A remarkable result emerges. Under suitable regularity conditions the traditional principal component analysis yields super-consistent estimates of the factors at each point in time. Namely, in contrast to the usual root-N standard normal consistency, we find N-consistency, also standard normal, because the high frequency sampling scheme is tied to the size of the cross-section, boosting the rate of convergence. We also show that standard cross-sectional driven criteria suffice for consistent estimation of the number of factors, which is different from the traditional panel data results. Finally, we also show that the panel data estimates improve upon the individual volatility estimates.
    Keywords: ARCH-type filters; Principal Component Analysis; realized volatility
    JEL: C13 C33
    Date: 2014–06
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:10034&r=ecm
  17. By: Kerstin Gärtner (Vienna University); Mark Podolskij (Aarhus University and CREATES)
    Abstract: In this paper we present some new asymptotic results for high frequency statistics of Brownian semi-stationary (BSS) processes. More precisely, we will show that singularities in the weight function, which is one of the ingredients of a BSS process, may lead to non-standard limits of the realised quadratic variation. In this case the limiting process is a convex combination of shifted integrals of the intermittency function. Furthermore, we will demonstrate the corresponding stable central limit theorem. Finally, we apply the probabilistic theory to study the asymptotic properties of the realised ratio statistic, which estimates the smoothness parameter of a BSS process.
    Keywords: Brownian semi-stationary processes, high frequency data, limit theorems, stable convergence
    JEL: C10 C13 C14
    Date: 2014–12–10
    URL: http://d.repec.org/n?u=RePEc:aah:create:2014-50&r=ecm
  18. By: Giovannelli, Alessandro; Proietti, Tommaso
    Abstract: We address the problem of selecting the common factors that are relevant for forecasting macroeconomic variables. In economic forecasting using diffusion indexes the factors are ordered, according to their importance, in terms of relative variability, and are the same for each variable to predict, i.e. the process of selecting the factors is not supervised by the predictand. We propose a simple and operational supervised method, based on selecting the factors on the basis of their significance in the regression of the predictand on the predictors. Given a potentially large number of predictors, we consider linear transformations obtained by principal components analysis. The orthogonality of the components implies that the standard t-statistics for the inclusion of a particular component are independent, and thus applying a selection procedure that takes into account the multiplicity of the hypothesis tests is both correct and computationally feasible. We focus on three main multiple testing procedures: Holm’s sequential method, controlling the familywise error rate; the Benjamini-Hochberg method, controlling the false discovery rate; and a procedure for incorporating prior information on the ordering of the components, based on weighting the p-values according to the eigenvalues associated with the components. We compare the empirical performances of these methods with the classical diffusion index (DI) approach proposed by Stock and Watson, conducting a pseudo-real-time forecasting exercise that assesses the predictions of 8 macroeconomic variables using factors extracted from a U.S. dataset consisting of 121 quarterly time series. The overall conclusion is that nature is tricky, but essentially benign: the information that is relevant for prediction is effectively condensed by the first few factors. However, variable selection, leading to the exclusion of some of the low order principal components, can lead to a sizable improvement in forecasting in specific cases. Only in one instance, real personal income, were we able to detect a significant contribution from high order components.
    Keywords: Variable selection; Multiple testing; p-value weighting.
    JEL: C22 C32 C38 C53 E3 E32
    Date: 2014–11–30
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:60673&r=ecm
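    Code sketch: a minimal version of the supervised selection step: regress the predictand on principal components, then screen the (independent) t-test p-values with Holm and Benjamini-Hochberg corrections via statsmodels. The data are simulated, and the eigenvalue-based p-value weighting variant is omitted.

      import numpy as np
      import statsmodels.api as sm
      from statsmodels.stats.multitest import multipletests

      rng = np.random.default_rng(5)
      T, n = 200, 50
      X = rng.standard_normal((T, n))           # panel of standardized predictors
      U, s, Vt = np.linalg.svd(X, full_matrices=False)
      F = U * s                                 # principal component scores (orthogonal)
      y = 0.5 * F[:, 3] + rng.standard_normal(T)  # only the 4th component matters

      res = sm.OLS(y, sm.add_constant(F[:, :10])).fit()   # first 10 components
      pvals = res.pvalues[1:]                   # drop the constant
      for method in ("holm", "fdr_bh"):         # FWER and FDR control
          keep = multipletests(pvals, alpha=0.05, method=method)[0]
          print(method, np.flatnonzero(keep))   # should flag component 3 (0-based)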
  19. By: YABE, Ryota
    Abstract: This paper considers the conditional sum of squares estimator (CSSE) for the moderate-deviation MA(1) process, in which the distance between the MA parameter and unity is of larger order than O(1/T). We show that the asymptotic distribution of the CSSE is normal, even though the process belongs to the local-to-unity class. The convergence rate changes continuously from an invertible order to a noninvertible one. In this sense, the moderate deviation process in MA(1) has a continuous bridge property like the AR process.
    Keywords: Moving average, Noninvertible moving average, Unit root, local to unity, Moderate Deviations, Conditional sum of squares estimation
    Date: 2014–12
    URL: http://d.repec.org/n?u=RePEc:hit:econdp:2014-19&r=ecm
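    Code sketch: the CSS objective for an MA(1), i.e. the sum of squared residuals from the recursion e_t = y_t - theta*e_{t-1} with e_0 = 0, minimized numerically. This only illustrates the estimator's definition; the sample size and the near-unity value theta0 = 0.9 are my assumptions, and the paper's moderate-deviation asymptotics are not touched.

      import numpy as np
      from scipy.optimize import minimize_scalar

      def css(theta, y):
          """Conditional sum of squares: e_t = y_t - theta * e_{t-1}, e_0 = 0."""
          e = np.zeros_like(y)
          for t in range(len(y)):
              e[t] = y[t] - theta * (e[t - 1] if t > 0 else 0.0)
          return np.sum(e ** 2)

      rng = np.random.default_rng(6)
      T, theta0 = 500, 0.9                      # MA root approaching unity
      u = rng.standard_normal(T + 1)
      y = u[1:] + theta0 * u[:-1]               # y_t = u_t + theta0 * u_{t-1}

      fit = minimize_scalar(css, bounds=(-0.999, 0.999), args=(y,), method="bounded")
      print("CSS estimate:", fit.x)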
  20. By: Fiorini, Mario; Katrien Stevens
    Abstract: Whenever treatment effects are heterogeneous and there is sorting into treatment based on the gain, monotonicity is a condition that both Instrumental Variable and fuzzy Regression Discontinuity designs have to satisfy for their estimate to be interpretable as a LATE. However, applied economic work rarely discusses this important assumption. This is in stark contrast to the lengthy discussions dedicated to the other IV and fuzzy RD conditions. We show that monotonicity can and should be investigated using a mix of economic insights, data patterns and formal tests. This is just an extra step to validate the results. We provide examples in a variety of settings as a guide to practice.
    Keywords: essential heterogeneity, monotonicity assumption, LATE, instrumental variable, regression discontinuity
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:syd:wpaper:2014-13&r=ecm
  21. By: Stephen G. Hall; P. A. V. B. Swamy; George S. Tavlas
    Keywords: Time-varying coefficient model, Coefficient driver, Specification Problem, Correct interpretation of coefficients
    JEL: C13 C19 C22
    Date: 2014–12
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:14/18&r=ecm
  22. By: Lei, J. (Tilburg University, School of Economics and Management)
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:tiu:tiutis:302d1ae7-0310-43b0-b253-6e3da1413d35&r=ecm
  23. By: Wagner Piazza Gaglianone; João Victor Issler
    Abstract: In this paper, we propose a microfounded framework to investigate a panel of forecasts (e.g. model-driven or survey-based) and the possibility of improving their out-of-sample forecast performance by employing a bias-correction device. Following Patton and Timmermann (2007), we theoretically justify the modeling of forecasts as a function of the conditional expectation, based on the optimization problem of individual forecasters. This approach allows us to relax the standard assumption of a mean squared error (MSE) loss function and, thus, to obtain optimal forecasts under more general loss functions. However, different from these authors, we apply our results to a panel of forecasts in order to construct an optimal (combined) forecast. In this sense, a feasible GMM estimator is proposed to aggregate the information content of each individual forecast and optimally recover the conditional expectation. Our setup can be viewed as a generalization of the three-way forecast error decomposition of Davies and Lahiri (1995), and as an extension of the bias-corrected average forecast of Issler and Lima (2009). A real-time forecasting exercise using the Brazilian Focus survey illustrates the proposed methodology.
    Date: 2014–12
    URL: http://d.repec.org/n?u=RePEc:bcb:wpaper:372&r=ecm
  24. By: Cerqueti, Roy; Lupi, Claudio
    Abstract: This note contributes to the development of the theory of stochastic dependence by employing the general concept of copula. In particular, it deals with the construction of a new family of non-exchangeable copulas characterizing the multivariate total positivity of order 2 (MTP2) dependence.
    Keywords: Copulas, MTP2 dependence, Non-exchangeability
    JEL: C19 C39
    Date: 2014–11–23
    URL: http://d.repec.org/n?u=RePEc:mol:ecsdps:esdp14075&r=ecm
  25. By: Gustavo Fruet Dias (Aarhus University and CREATES); Fotis Papailias (Queen's University Belfast and quantf Research)
    Abstract: A two-stage forecasting approach for long memory time series is introduced. In the first step we estimate the fractional exponent and, applying the fractional differencing operator, we obtain the underlying weakly dependent series. In the second step, we perform the multi-step ahead forecasts for the weakly dependent series and obtain their long memory counterparts by applying the fractional cumulation operator. The methodology applies to stationary and nonstationary cases. Simulations and an application to seven time series provide evidence that the new methodology is more robust to structural change and yields good forecasting results.
    Keywords: Forecasting, Spurious Long Memory, Structural Change, Local Whittle
    JEL: C22 C53
    Date: 2014–12–15
    URL: http://d.repec.org/n?u=RePEc:aah:create:2014-55&r=ecm
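    Code sketch: the two-stage idea with the memory parameter d taken as known; the paper estimates it (e.g. by local Whittle) in stage one. Fractionally difference, forecast the resulting weakly dependent series with an AR(1) (an illustrative choice), and cumulate the forecasts back with the inverse filter.

      import numpy as np

      def frac_diff_weights(d, n):
          """Coefficients of (1 - L)^d: pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
          w = np.ones(n)
          for k in range(1, n):
              w[k] = w[k - 1] * (k - 1 - d) / k
          return w

      def frac_filter(x, d):
          """Apply (1 - L)^d to x, truncating at the sample start."""
          w = frac_diff_weights(d, len(x))
          return np.array([w[: t + 1] @ x[t::-1] for t in range(len(x))])

      rng = np.random.default_rng(7)
      d, T, h = 0.4, 1000, 10                   # memory parameter taken as known here
      e = rng.standard_normal(T)
      ar = np.zeros(T)
      for t in range(1, T):
          ar[t] = 0.5 * ar[t - 1] + e[t]        # short-memory AR(1) core
      y = frac_filter(ar, -d)                   # ARFIMA(1, d, 0) sample

      u = frac_filter(y, d)                     # stage 1: back to weak dependence
      phi = (u[1:] @ u[:-1]) / (u[:-1] @ u[:-1])  # stage 2: AR(1) fit (phi near 0.5)
      u_fc = [u[-1] * phi ** k for k in range(1, h + 1)]
      ext = frac_filter(np.concatenate([u, u_fc]), -d)  # cumulate back: (1 - L)^{-d}
      print("h-step forecasts:", ext[-h:])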
  26. By: Gonzalez-Astudillo, Manuel (Board of Governors of the Federal Reserve System (U.S.))
    Abstract: In this paper, I propose an econometric technique to estimate a Markov-switching Taylor rule subject to the zero lower bound of interest rates. I show that incorporating a Tobit-like specification makes it possible to obtain consistent estimators. More importantly, I show that linking the switching of the Taylor rule coefficients to the switching of the coefficients of an auxiliary uncensored Markov-switching regression improves the identification of an otherwise unidentifiable prevalent monetary regime. To illustrate the proposed estimation technique, I use U.S. quarterly data spanning 1960:1-2013:4. The chosen auxiliary Markov-switching regression is a fiscal policy rule where federal revenues react to debt and the output gap. Results show that there is evidence of policy co-movements, with debt-stabilizing fiscal policy more likely accompanying active monetary policy, and vice versa.
    Keywords: Markov-switching coefficients; zero lower bound; monetary-fiscal policy interactions
    JEL: C34 E52 E63
    Date: 2014–09–19
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2014-97&r=ecm
  27. By: Subhash C. Ray (University of Connecticut)
    Abstract: Over the past decades Data Envelopment Analysis (DEA) has emerged as an important nonparametric method of evaluating performance of decision making units through benchmarking. Although developed primarily for measuring technical efficiency, DEA is now applied extensively for measuring scale efficiency, cost efficiency, and profit efficiency as well. This paper integrates the different DEA models commonly applied in empirical research with their underlying theoretical foundations in neoclassical production economics.
    Keywords: Linear Programming; Technical Efficiency; Returns to Scale; Distance Functions
    JEL: C6 D2
    Date: 2014–11
    URL: http://d.repec.org/n?u=RePEc:uct:uconnp:2014-33&r=ecm
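    Code sketch: the core DEA computation referenced above: the input-oriented, constant-returns (CCR) efficiency score of one decision making unit, obtained as a linear program with scipy. The three-unit data set is an illustrative assumption.

      import numpy as np
      from scipy.optimize import linprog

      def ccr_input_efficiency(X, Y, j):
          """Input-oriented CRS (CCR) DEA score of unit j.
          X: n_units x n_inputs, Y: n_units x n_outputs.
          min theta  s.t.  lambda'X <= theta * x_j,  lambda'Y >= y_j,  lambda >= 0."""
          n, m = X.shape
          _, s = Y.shape
          c = np.r_[1.0, np.zeros(n)]                   # variables: [theta, lambda]
          A_ub = np.block([[-X[j].reshape(-1, 1), X.T], # lambda'X - theta*x_j <= 0
                           [np.zeros((s, 1)), -Y.T]])   # -lambda'Y <= -y_j
          b_ub = np.r_[np.zeros(m), -Y[j]]
          bounds = [(None, None)] + [(0, None)] * n
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
          return res.fun                                # theta in (0, 1]

      X = np.array([[2.0, 3.0], [4.0, 2.0], [4.0, 4.0]])  # inputs of 3 units
      Y = np.array([[1.0], [1.0], [1.0]])                 # single output
      print([round(ccr_input_efficiency(X, Y, j), 3) for j in range(3)])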

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.