nep-ecm New Economics Papers
on Econometrics
Issue of 2010‒01‒10
23 papers chosen by
Sune Karlsson
Orebro University

  1. Generalized Least Squares Estimation for Cointegration Parameters Under Conditional Heteroskedasticity By Helmut Herwartz; Helmut Luetkepohl
  2. A Consistent Test for Multivariate Conditional Distributions By Fuchun Li; Greg Tkacz
  3. Forecasting time series with complex seasonal patterns using exponential smoothing By Alysha M De Livera; Rob J Hyndman
  4. Fully Modified Narrow-Band Least Squares Estimation of Weak Fractional Cointegration By Morten Ørregaard Nielsen; Per Frederiksen
  5. Bootstrap Confidence Bands for Forecast Paths By Anna Staszewska-Bystrova
  6. Indirect inference methods for stochastic volatility models based on non-Gaussian Ornstein-Uhlenbeck processes By Arvid Raknerud and Øivind Skare
  7. Local polynomial regression with truncated or censored response By Karlsson, Maria; Cantoni, Eva; de Luna, Xavier
  8. Modelling the Volatility-Return Trade-off when Volatility may be Nonstationary By Christian M. Dahl; Emma M. Iglesias
  9. The Fragility of the KPSS Stationarity Test By Nunzio Cappuccio; Diego Lubian
  10. "Block Structure Multivariate Stochastic Volatility Models" By Manabu Asai; Massimiliano Caporin; Michael McAleer
  11. A Consistent Model of 'Explosive' Financial Bubbles With Mean-Reverting Residuals By D. Sornette; L. Lin; Ren R.E.
  12. Estimation of a transformation model with truncation, interval observation and time-varying covariates By Bo E. Honoré; Luojia Hu
  13. Time series segmentation by Cusum, AutoSLEX and AutoPARM methods By Ana Badagian; Regina Kaiser; Daniel Pena
  14. Non-parametric identification of the mixed proportional hazards model with interval-censored durations By Christian N. Brinch
  15. Nested models and model uncertainty By Alexander Kriwoluzky; Christian A. Stoltenberg
  16. A Volatility Targeting GARCH model with Time-Varying Coefficients By Thorsten Lehnert; Bart Frijns; Remco Zwinkels
  17. The Impact of the National School Lunch Program on Child Health: A Nonparametric Bounds Analysis By Gundersen, Craig; Kreider, Brent; Pepper, John V.
  18. Statistical Inference for Multidimensional Inequality Indices By Abul Naga, Ramses
  19. Profiling Poverty with Multivariate Adaptive Regression Splines By Mina, Christian D.; Barrios, Erniel B.
  20. Partial Identification of Discrete Counterfactual Distributions with Sequential Update of Information By Stefan Boes
  21. Persistent Disparities in Regional Unemployment: Application of a Spatial Filtering Approach to Local Labour Markets in Germany By Roberto Patuelli; Norbert Schanne; Daniel A. Griffith; Peter Nijkamp
  22. A quarterly fiscal database for the euro area based on intra-annual fiscal information. By Joan Paredes; Diego J. Pedregal; Javier J. Pérez
  23. The econometrics of randomly spaced financial data: a survey By Andre A. Monteiro

  1. By: Helmut Herwartz; Helmut Luetkepohl
    Abstract: In the presence of generalized autoregressive conditional heteroskedasticity (GARCH) in the residuals of a vector error correction model (VECM), maximum likelihood (ML) estimation of the cointegration parameters has been shown to be efficient. On the other hand, full ML estimation of VECMs with GARCH residuals is computationally difficult and may not be feasible for larger models. Moreover, ML estimation of VECMs with independently and identically distributed residuals is known to have potentially poor small sample properties and this problem also persists when there are GARCH residuals. A further disadvantage of the ML estimator is its sensitivity to misspecification of the GARCH process. We propose a feasible generalized least squares estimator which addresses all these problems. It is easy to compute and has superior small sample properties in the presence of GARCH residuals.
    Keywords: Vector autoregressive process, vector error correction model, cointegration, reduced rank estimation, maximum likelihood estimation, multivariate GARCH
    JEL: C32
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2009/42&r=ecm
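A minimal sketch of the two-step feasible GLS idea behind entry 1, reduced to a single cointegrating regression with ARCH-type errors. The paper's estimator operates on a full VECM with multivariate GARCH; the crude variance regression and simulated data below are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 1000
x = np.cumsum(rng.normal(size=n))              # I(1) regressor
e = np.empty(n)
e[0] = rng.normal()
for t in range(1, n):                          # ARCH(1) errors
    h_t = 0.2 + 0.7 * e[t - 1] ** 2
    e[t] = np.sqrt(h_t) * rng.normal()
y = 1.0 + 2.0 * x + e                          # cointegrating relation

X = np.column_stack([np.ones(n), x])
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ b_ols

# crude conditional-variance proxy: regress u_t^2 on a constant and u_{t-1}^2
Z = np.column_stack([np.ones(n - 1), u[:-1] ** 2])
a = np.linalg.lstsq(Z, u[1:] ** 2, rcond=None)[0]
h_hat = np.maximum(Z @ a, 1e-6)

w = 1.0 / np.sqrt(h_hat)                       # feasible GLS: weight by 1/sigma_t
b_fgls = np.linalg.lstsq(X[1:] * w[:, None], y[1:] * w, rcond=None)[0]
print("OLS:", b_ols.round(3), "FGLS:", b_fgls.round(3))
```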
  2. By: Fuchun Li; Greg Tkacz
    Abstract: We propose a new test for a multivariate parametric conditional distribution of a vector of variables y_t given a conditioning vector x_t. The proposed test is shown to have an asymptotic normal distribution under the null hypothesis, while being consistent for all fixed alternatives, and having non-trivial power against a sequence of local alternatives. Monte Carlo simulations show that our test has reasonable size and good power for both univariate and multivariate models, even for highly persistent dependent data with sample sizes often encountered in empirical finance.
    Keywords: Econometric and statistical methods
    JEL: C12 C22
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:09-34&r=ecm
  3. By: Alysha M De Livera; Rob J Hyndman
    Abstract: A new innovations state space modeling framework, incorporating Box-Cox transformations, Fourier series with time-varying coefficients and ARMA error correction, is introduced for forecasting complex seasonal time series that cannot be handled using existing forecasting models. Such complex time series include time series with multiple seasonal periods, high frequency seasonality, non-integer seasonality and dual-calendar effects. Our new modelling framework provides an alternative to existing exponential smoothing models, and is shown to have many advantages. The methods for initialization and estimation, including likelihood evaluation, are presented, and analytical expressions for point forecasts and interval predictions under the assumption of Gaussian errors are derived, leading to a simple, comprehensible approach to forecasting complex seasonal time series. Our trigonometric formulation is also presented as a means of decomposing complex seasonal time series, which cannot be decomposed using any of the existing decomposition methods. The approach is useful in a broad range of applications, and we illustrate its versatility in three empirical studies where it demonstrates excellent forecasting performance over a range of prediction horizons. In addition, we show that our trigonometric decomposition leads to the identification and extraction of seasonal components, which are otherwise not apparent in the time series plot itself.
    Keywords: Exponential smoothing, Fourier series, prediction intervals, seasonality, state space models, time series decomposition
    JEL: C22 C53
    Date: 2009–12–12
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2009-15&r=ecm
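The trigonometric formulation in entry 3 represents complex seasonality through Fourier terms with time-varying coefficients inside an innovations state space model. The sketch below, assuming fixed coefficients, simulated hourly data and an arbitrary number of harmonics, only illustrates how Fourier regressors capture multiple (possibly non-integer) seasonal periods; it is not the authors' model.

```python
import numpy as np

def fourier_terms(n, period, n_harmonics):
    """Sine/cosine regressors for one seasonal period (period may be non-integer)."""
    t = np.arange(n)
    cols = []
    for k in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * k * t / period))
        cols.append(np.cos(2 * np.pi * k * t / period))
    return np.column_stack(cols)

rng = np.random.default_rng(0)
n = 24 * 7 * 8                                  # eight weeks of hourly data
t = np.arange(n)
y = (10 + 2 * np.sin(2 * np.pi * t / 24)        # daily cycle
     + np.sin(2 * np.pi * t / (24 * 7))         # weekly cycle
     + rng.normal(scale=0.5, size=n))

X = np.column_stack([np.ones(n),
                     fourier_terms(n, 24, 3),   # three harmonics per period
                     fourier_terms(n, 24 * 7, 3)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("in-sample RMSE:", np.sqrt(np.mean((y - X @ beta) ** 2)).round(3))
```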
  4. By: Morten Ørregaard Nielsen (Queen's University and CREATES); Per Frederiksen (Nordea Markets)
    Abstract: We consider estimation of the cointegrating relation in the weak fractional cointegration model, where the strength of the cointegrating relation (difference in memory parameters) is less than one-half. A special case is the stationary fractional cointegration model, which has found important application recently, especially in financial economics. Previous research on this model has considered a semiparametric narrow-band least squares (NBLS) estimator in the frequency domain, but in the stationary case its asymptotic distribution has been derived only under a condition of non-coherence between regressors and errors at the zero frequency. We show that in the absence of this condition, the NBLS estimator is asymptotically biased, and also that the bias can be consistently estimated. Consequently, we introduce a fully modified NBLS estimator which eliminates the bias, and indeed enjoys a faster rate of convergence than NBLS in general. We also show that local Whittle estimation of the integration order of the errors can be conducted consistently based on NBLS residuals, but the estimator has the same asymptotic distribution as if the errors were observed only under the condition of non-coherence. Furthermore, compared to much previous research, the development of the asymptotic distribution theory is based on a different spectral density representation, which is relevant for multivariate fractionally integrated processes, and the use of this representation is shown to result in lower asymptotic bias and variance of the narrow-band estimators. We present simulation evidence and a series of empirical illustrations to demonstrate the feasibility and empirical relevance of our methodology.
    Keywords: Fractional cointegration, frequency domain, fully modified estimation, long memory, semiparametric
    JEL: C22
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1226&r=ecm
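A minimal sketch of the plain narrow-band least squares (NBLS) step that entry 4 builds on: the slope is obtained from (cross-)periodograms averaged over the first m Fourier frequencies. The fully modified bias correction of the paper is not implemented, and the toy data and bandwidth choice are assumptions for illustration.

```python
import numpy as np

def nbls(y, x, m):
    """Narrow-band least squares slope of y on x, Fourier frequencies j = 1..m."""
    wy = np.fft.fft(y - y.mean())
    wx = np.fft.fft(x - x.mean())
    j = np.arange(1, m + 1)                      # low frequencies only
    f_xy = np.real(wx[j] * np.conj(wy[j]))       # averaged cross-periodogram
    f_xx = np.abs(wx[j]) ** 2                    # averaged periodogram of x
    return f_xy.sum() / f_xx.sum()

# Toy data: y = 0.8 x + short-memory noise, with an I(1) regressor
rng = np.random.default_rng(1)
n = 1000
x = np.cumsum(rng.normal(size=n))
y = 0.8 * x + rng.normal(size=n)
print(nbls(y, x, m=int(n ** 0.6)).round(3))
```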
  5. By: Anna Staszewska-Bystrova
    Abstract: The problem of forecasting from vector autoregressive models has attracted considerable attention in the literature. The most popular non-Bayesian approaches use large sample normal theory or the bootstrap to evaluate the uncertainty associated with the forecast. The literature has concentrated on the problem of assessing the uncertainty of the prediction for a single period. This paper considers the problem of how to assess the uncertainty when the forecasts are made for a succession of periods. It describes and evaluates a bootstrap method for constructing confidence bands for forecast paths. The bands are constructed from forecast paths obtained in bootstrap replications, with an optimisation procedure used to find the envelope of the most concentrated paths. The method is shown to have good coverage properties in a Monte Carlo study.
    Keywords: vector autoregression, forecast path, bootstrapping, simultaneous statistical inference
    JEL: C15 C32 C53
    Date: 2009–12–07
    URL: http://d.repec.org/n?u=RePEc:com:wpaper:024&r=ecm
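A simplified illustration of the idea in entry 5: rank bootstrap forecast paths by how far they stray from a central path, keep the most concentrated (1 - alpha) fraction, and take their pointwise envelope as a joint band. The ranking rule and the AR(1) stand-in for bootstrap output are assumptions; the paper uses an explicit optimisation procedure.

```python
import numpy as np

def envelope_band(paths, alpha=0.05):
    """paths: (B, H) array of bootstrap forecast paths; returns a joint band."""
    center = np.median(paths, axis=0)
    spread = np.abs(paths - center).max(axis=1)       # worst deviation per path
    keep = spread <= np.quantile(spread, 1 - alpha)   # most concentrated paths
    retained = paths[keep]
    return retained.min(axis=0), retained.max(axis=0)

# AR(1) forecast paths simulated as a stand-in for bootstrap output
rng = np.random.default_rng(2)
B, H, phi = 999, 8, 0.7
paths = np.zeros((B, H))
prev = np.zeros(B)
for h in range(H):
    prev = phi * prev + rng.normal(size=B)
    paths[:, h] = prev
lower, upper = envelope_band(paths)
print(np.column_stack([lower, upper]).round(2))
```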
  6. By: Arvid Raknerud and Øivind Skare (Statistics Norway)
    Abstract: This paper aims to develop new methods for statistical inference in a class of stochastic volatility models for financial data based on non-Gaussian Ornstein-Uhlenbeck (OU) processes. Our approach uses indirect inference methods: First, a quasi-likelihood for the actual data is estimated. This quasi-likelihood is based on an approximate Gaussian state space representation of the OU-based model. Next, simulations are made from the data-generating OU model for given parameter values. The indirect inference estimator is the parameter value in the OU model which gives the best "match" between the quasi-likelihood estimator for the actual data and the quasi-likelihood estimator for the simulated data. Our method is applied to Euro/NOK and US Dollar/NOK daily exchange rates for the period 1 July 1989 to 15 December 2008. An accompanying R package, which interfaces with C++ code, is documented and can be downloaded.
    Keywords: stochastic volatility; financial econometrics; Ornstein-Uhlenbeck processes; indirect inference; state space models; exchange rates
    JEL: C13 C22 C51 G10
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:ssb:dispap:601&r=ecm
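A generic indirect-inference loop, sketched with a deliberately simple pair of models: an MA(1) structural model matched through an AR(1) auxiliary statistic, both chosen here only for brevity. In entry 6 the structural model is a non-Gaussian OU-based stochastic volatility model and the auxiliary criterion a Gaussian state-space quasi-likelihood, but the matching logic is the same.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ar1_coefficient(y):
    """Auxiliary statistic: lag-1 autoregression coefficient."""
    return (y[1:] @ y[:-1]) / (y[:-1] @ y[:-1])

def simulate_ma1(theta, n, rng):
    e = rng.normal(size=n + 1)
    return e[1:] + theta * e[:-1]

rng = np.random.default_rng(3)
y_obs = simulate_ma1(0.5, 2000, rng)            # pretend these are the observed data
beta_hat = ar1_coefficient(y_obs)               # auxiliary estimate on the data

def distance(theta):
    # common random numbers: a fixed seed keeps the objective smooth in theta
    sim = simulate_ma1(theta, 20000, np.random.default_rng(4))
    return (ar1_coefficient(sim) - beta_hat) ** 2

result = minimize_scalar(distance, bounds=(-0.95, 0.95), method="bounded")
print("indirect inference estimate of theta:", round(result.x, 3))
```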
  7. By: Karlsson, Maria (Department of Statistics, Umeå University); Cantoni, Eva (Department of Econometrics, University of Geneva); de Luna, Xavier (Department of Statistics, Umeå University)
    Abstract: Truncation or censoring of the response variable in a regression model is a problem in many applications, e.g. when the response is insurance claims or the durations of unemployment spells. We introduce a local polynomial regression estimator which can deal with such truncated or censored responses. For this purpose, we use local versions of the STLS and SCLS estimators of Powell (1986) and the QME estimator of Lee (1993) and Laitila (2001). The asymptotic properties of our estimators, and the conditions under which they are valid, are given. In addition, a simulation study is presented to investigate the finite sample properties of our proposals.
    Keywords: Non-parametric regression; truncation; censoring; asymptotic properties
    JEL: C14
    Date: 2009–12–14
    URL: http://d.repec.org/n?u=RePEc:hhs:ifauwp:2009_025&r=ecm
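For reference, a standard local linear (kernel-weighted least squares) estimator, the building block that entry 7 extends; the truncation/censoring corrections (local STLS, SCLS and QME) that constitute the paper's contribution are not included, and the bandwidth and simulated data are illustrative.

```python
import numpy as np

def local_linear(x, y, grid, h):
    """Local linear fit of y on x (Epanechnikov kernel), evaluated on `grid`."""
    fits = []
    for x0 in grid:
        u = (x - x0) / h
        w = np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)
        X = np.column_stack([np.ones_like(x), x - x0])
        XtW = X.T * w                               # weight each observation
        beta = np.linalg.pinv(XtW @ X) @ XtW @ y
        fits.append(beta[0])                        # intercept = fit at x0
    return np.array(fits)

rng = np.random.default_rng(5)
x = rng.uniform(0, 3, 400)
y = np.sin(2 * x) + rng.normal(scale=0.3, size=400)
grid = np.linspace(0.2, 2.8, 10)
print(local_linear(x, y, grid, h=0.3).round(2))
```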
  8. By: Christian M. Dahl (University of Aarhus and CREATES); Emma M. Iglesias (Department of Economics, Michigan State University and University of Essex)
    Abstract: In this paper a new GARCH–M type model, denoted the GARCH-AR, is proposed. In particular, it is shown that it is possible to generate a volatility-return trade-off in a regression model simply by introducing dynamics in the standardized disturbance process. Importantly, the volatility in the GARCH-AR model enters the return function in terms of relative volatility, implying that the risk term can be stationary even if the volatility process is nonstationary. We provide a complete characterization of the stationarity properties of the GARCH-AR process by generalizing the results of Bougerol and Picard (1992b). Furthermore, allowing for nonstationary volatility, the asymptotic properties of the quasi-maximum likelihood estimates of the GARCH-AR parameters are established. Finally, we stress the importance of being able to choose correctly between AR-GARCH and GARCH-AR processes: First, it is shown, by a small simulation study, that the estimators for the parameters in an AR-GARCH model will be seriously inconsistent if the data generating process actually is a GARCH-AR process. Second, we provide an LM test for neglected GARCH-AR effects and discuss its finite sample size properties. Third, we provide an empirical illustration showing the empirical relevance of the GARCH-AR model based on modelling a wide range of leading US stock return series.
    Keywords: Quasi-Maximum Likelihood, GARCH-M Model, Asymptotic Properties, Risk-return Relation.
    JEL: C12 C13 C22 G12
    Date: 2009–10–02
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-59&r=ecm
  9. By: Nunzio Cappuccio (Department of Economics and Management, University of Padova); Diego Lubian (Department of Economics (University of Verona))
    Abstract: Stationarity tests exhibit extreme size distortions if the observable process is stationary yet highly persistent. In this paper we provide a theoretical explanation for the size distortion of the KPSS test for DGPs with a broad range of first order autocorrelation coefficient. Considering a near-integrated, nearly stationary process we show that the asymptotic distribution of the test contains an additional term, which can potentially explain the amount of size distortion documented in previous simulation studies.
    Keywords: KPSS stationarity test, size distortion, nearly white noise nearly integrated model
    JEL: C01 C22
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:ver:wpaper:67/2009&r=ecm
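A minimal sketch of the level-stationarity KPSS statistic with a Bartlett long-run variance estimator, applied to a stationary but highly persistent AR(1); the typically inflated value illustrates the size distortion analysed in entry 9. Sample size, autoregressive coefficient and lag truncation are illustrative choices.

```python
import numpy as np

def kpss_stat(y, lags):
    """Level-stationarity KPSS statistic with Bartlett long-run variance."""
    e = y - y.mean()                       # residuals from a constant only
    n = len(e)
    s = np.cumsum(e)
    lrv = e @ e / n
    for k in range(1, lags + 1):
        lrv += 2 * (1 - k / (lags + 1)) * (e[k:] @ e[:-k]) / n
    return (s @ s) / (n ** 2 * lrv)

rng = np.random.default_rng(6)
n, rho = 200, 0.95                         # stationary but near-integrated AR(1)
y = np.zeros(n)
for t in range(1, n):
    y[t] = rho * y[t - 1] + rng.normal()
lags = int(4 * (n / 100) ** 0.25)
print(kpss_stat(y, lags).round(3))         # 5% critical value is about 0.463
```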
  10. By: Manabu Asai (Faculty of Economics, Soka University); Massimiliano Caporin (Department of Economics and Management "Marco Fanno", University of Padova); Michael McAleer (Econometric Institute, Erasmus University Rotterdam, Erasmus School of Economics and Tinbergen Institute)
    Abstract: Most multivariate variance models suffer from a common problem, the "curse of dimensionality". For this reason, most are fitted under strong parametric restrictions that reduce the interpretation and flexibility of the models. Recently, the literature has focused on multivariate models with milder restrictions, whose purpose is to balance the need for interpretability and efficiency faced by model users against the computational problems that may emerge when the number of assets is quite large. We contribute to this strand of the literature by proposing a block-type parameterization for multivariate stochastic volatility models.
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2009cf699&r=ecm
  11. By: D. Sornette; L. Lin; Ren R.E.
    Abstract: We present a self-consistent model for explosive financial bubbles, which combines a mean-reverting volatility process and a stochastic conditional return which reflects nonlinear positive feedbacks and continuous updates of the investors' beliefs and sentiments. The conditional expected returns exhibit faster-than-exponential acceleration decorated by accelerating oscillations, called the "log-periodic power law" (LPPL). Tests on residuals show a remarkably low rate (0.2%) of false positives when applied to a GARCH benchmark. When tested on the S&P500 US index from Jan. 3, 1950 to Nov. 21, 2008, the model correctly identifies the bubbles ending in Oct. 1987, in Oct. 1997, in Aug. 1998 and the ICT bubble ending in the first quarter of 2000. Different unit-root tests confirm the high relevance of the model specification. Our model also provides a diagnostic for the duration of bubbles: applied to the period before the Oct. 1987 crash, there is clear evidence that the bubble started at least 4 years earlier. We confirm the validity and universality of the volatility-confined LPPL model on seven other major bubbles that have occurred in the world in the last two decades. Using Bayesian inference, we find a very strong statistical preference for our model compared with a standard benchmark, in contradiction with Feigenbaum (2006), which used a unit-root model for residuals.
    Keywords: Rational bubbles, finite-time singularity, super-exponential growth, Bayesian analysis, log-periodic power law
    JEL: C11
    Date: 2009–05–01
    URL: http://d.repec.org/n?u=RePEc:stz:wpaper:ccss-09-00002&r=ecm
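The deterministic log-periodic power law trajectory referred to in entry 11 can be written as A + B(tc - t)^m + C(tc - t)^m cos(omega ln(tc - t) - phi). The sketch below evaluates this function with purely illustrative parameter values; the paper's mean-reverting volatility process and residual diagnostics are not reproduced.

```python
import numpy as np

def lppl(t, tc, A, B, C, m, omega, phi):
    """Expected log-price under the LPPL specification, for t < tc."""
    dt = tc - t
    return A + B * dt ** m + C * dt ** m * np.cos(omega * np.log(dt) - phi)

t = np.arange(0, 990)                      # trading days before the critical time tc
log_price = lppl(t, tc=1000, A=7.0, B=-0.5, C=0.05, m=0.4, omega=8.0, phi=1.0)
print(log_price[[0, 500, 900, 989]].round(3))
```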
  12. By: Bo E. Honoré; Luojia Hu
    Abstract: Abrevaya (1999b) considered estimation of a transformation model in the presence of left-truncation. This paper observes that a cross-sectional version of the statistical model considered in Frederiksen, Honoré, and Hu (2007) is a generalization of the model considered by Abrevaya (1999b) and the generalized model can be estimated by a pairwise comparison version of one of the estimators in Frederiksen, Honoré, and Hu (2007). Specifically, our generalization will allow for discretized observations of the dependent variable and for piecewise constant time-varying explanatory variables.
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:fip:fedhwp:wp-09-16&r=ecm
  13. By: Ana Badagian; Regina Kaiser; Daniel Pena
    Abstract: Time series segmentation has many applications in several disciplines, such as neurology, cardiology, speech and geology. Many time series in these fields are not stationary, and the usual transformations to linearity cannot be used. This paper describes and evaluates different methods for segmenting non-stationary time series. We propose a modification of the algorithm in Lee et al. (2003), which is designed to search for a single change in the parameters of a time series, in order to find more than one change using an iterative procedure. We evaluate the performance of three approaches for segmenting time series: AutoSLEX (Ombao et al., 2002), AutoPARM (Davis et al., 2006) and the iterative cusum method mentioned above, referred to as ICM. The evaluation of each methodology consists of two steps. First, we compute how many times each procedure fails in segmenting stationary processes properly. Second, we analyze the effect of different change patterns by counting how many times the corresponding methodology correctly segments a piecewise stationary process. The ICM method has a better performance than AutoSLEX for piecewise stationary processes. AutoPARM presents very satisfactory behaviour. The performance of the three methods is illustrated with time series datasets from neurology and speech.
    Keywords: Time series segmentation, AutoSLEX, AutoPARM, Cusum Methods
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws098025&r=ecm
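A toy version of the iterative (binary-segmentation) logic behind the ICM procedure in entry 13: compute a CUSUM-type statistic, split at the most significant point, and recurse on the sub-segments. This sketch detects mean shifts only, whereas the paper targets changes in time-series parameters, so it illustrates the recursion rather than the actual statistic; data and threshold are illustrative.

```python
import numpy as np

def cusum_split(y, threshold=1.358):       # ~5% critical value of a Brownian bridge
    """Recursive CUSUM segmentation for mean shifts; returns change-point indices."""
    n = len(y)
    if n < 20:
        return []
    s = np.cumsum(y - y.mean())
    k = int(np.argmax(np.abs(s[:-1]))) + 1
    stat = np.abs(s[k - 1]) / (y.std(ddof=1) * np.sqrt(n))
    if stat < threshold:
        return []
    left = cusum_split(y[:k], threshold)
    right = [k + cp for cp in cusum_split(y[k:], threshold)]
    return left + [k] + right

rng = np.random.default_rng(7)
y = np.concatenate([rng.normal(0, 1, 300),
                    rng.normal(2, 1, 300),
                    rng.normal(-1, 1, 300)])
print(cusum_split(y))                      # true change points at 300 and 600
```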
  14. By: Christian N. Brinch (Statistics Norway)
    Abstract: This note presents identification results for the mixed proportional hazards model when duration data are interval-censored. Earlier positive results on identification under interval-censoring require both a parametric specification of how covariates enter the hazard functions and assumptions of unbounded support for covariates. New results provided here show how one can dispense with both of these assumptions. The mixed proportional hazards model is non-parametrically identified with interval-censored duration data, provided covariates have support on an open set and the hazard function is a non-constant continuous function of covariates.
    Keywords: duration analysis; interval-censoring; non-parametric identification
    JEL: C41
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:ssb:dispap:600&r=ecm
  15. By: Alexander Kriwoluzky; Christian A. Stoltenberg
    Abstract: Uncertainty about the appropriate choice among nested models is a central concern for optimal policy when policy prescriptions from those models differ. The standard procedure is to specify a prior over the parameter space ignoring the special status of some sub-models, e.g. those resulting from zero restrictions. This is especially problematic if a model's generalization could be either true progress or the latest fad found to fit the data. We propose a procedure that ensures that the specified set of sub-models is not too easily discarded and thus left with no weight in determining optimal policy. We find that optimal policy based on our procedure leads to substantial welfare gains compared to the standard practice.
    Keywords: Optimal monetary policy, model uncertainty, Bayesian model estimation
    JEL: E32 C51 E52
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2009/37&r=ecm
  16. By: Thorsten Lehnert (Luxembourg School of Finance, University of Luxembourg); Bart Frijns (Department of Finance, Auckland University of Technology, New Zealand); Remco Zwinkels (Erasmus School of Economics, Erasmus University Rotterdam.)
    Abstract: GARCH-type models have been very successful in describing the volatility dynamics of financial return series for short periods of time. However, macroeconomic events, for example, may cause the structure of volatility to change, so that the assumption of stationarity is no longer plausible. In order to deal with this issue, the current paper proposes a conditional volatility model with time-varying coefficients based on a multinomial switching mechanism. By giving more weight to either the persistence or the shock term in a GARCH model, conditional on their relative ability to forecast a benchmark volatility measure, the switching reinforces the persistent nature of the GARCH model. Estimation of this volatility targeting or VT-GARCH model for Dow 30 stocks indicates that the switching model is able to outperform a number of relevant GARCH setups, both in- and out-of-sample, even without any informational advantage.
    Keywords: GARCH, time varying coefficients, multinomial logit
    JEL: C22
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:crf:wpaper:09-08&r=ecm
  17. By: Gundersen, Craig; Kreider, Brent; Pepper, John V.
    Abstract: Children in households reporting the receipt of free or reduced price school meals through the National School Lunch Program (NSLP) are more likely to have negative health outcomes than eligible nonparticipants. Assessing the causal effects of the program is made difficult, however, by the presence of endogenous selection into the program and systematic misreporting of participation status. Using data from the National Health and Nutrition Examination Survey (NHANES), we extend and apply partial identification methods to account for these two identification problems in a single unifying framework. Similar to a regression discontinuity design, we introduce a new way to conceptualize the monotone instrumental variable (MIV) assumption using eligibility criteria as monotone instruments. Under relatively weak assumptions, we find evidence that receipt of free and reduced price lunches through the NSLP improves the health outcomes of children.
    Keywords: partial identification, selection problem, classification error, monotone instrumental variable, regression discontinuity, National School Lunch Program, food insecurity, obesity
    JEL: C1 C2 I3
    Date: 2009–12–15
    URL: http://d.repec.org/n?u=RePEc:isu:genres:13148&r=ecm
  18. By: Abul Naga, Ramses
    Abstract: We use the delta method to derive the large sample distribution of multidimensional inequality indices. We also present a simple method for computing standard errors and obtain explicit formulas in the context of two families of indices.
    Keywords: multidimensional inequality indices; large sample distributions; standard errors
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:eid:wpaper:2/09&r=ecm
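A minimal sketch of the delta method used in entry 18, applied here to a simple univariate index (the squared coefficient of variation) rather than the multidimensional indices studied in the paper: the index is a smooth function g(m1, m2) of sample moments, so its large-sample variance is grad(g)' Sigma grad(g) / n. The simulated income data are an illustrative assumption.

```python
import numpy as np

def cv2_with_se(x):
    """Squared coefficient of variation and its delta-method standard error."""
    n = len(x)
    m1, m2 = x.mean(), (x ** 2).mean()
    index = m2 / m1 ** 2 - 1                       # Var(x) / mean(x)^2
    grad = np.array([-2 * m2 / m1 ** 3, 1 / m1 ** 2])
    sigma = np.cov(np.column_stack([x, x ** 2]), rowvar=False)  # cov of (x, x^2)
    se = np.sqrt(grad @ sigma @ grad / n)
    return index, se

rng = np.random.default_rng(8)
incomes = rng.lognormal(mean=3.0, sigma=0.6, size=2000)
estimate, se = cv2_with_se(incomes)
print(round(estimate, 3), "+/-", round(1.96 * se, 3))
```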
  19. By: Mina, Christian D.; Barrios, Erniel B.
    Abstract: Using data from the 2003 Family Income and Expenditure Survey and the 2005 Community-based Monitoring System for a city, Multivariate Adaptive Regression Splines (MARS) is used to identify household poverty correlates in the Philippines. Models produced by MARS are more parsimonious, yet contain a theoretically and empirically sound set of household poverty correlates and have high accuracy in identifying a poor household. MARS provides a better alternative to logistic regression for a more efficient and effective implementation of a proxy means test in the identification of potential beneficiaries of poverty alleviation programs.
    Keywords: community-based monitoring system, multivariate adaptive regression splines, logistic regression, poverty correlates, proxy means test
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:phd:dpaper:dp_2009-29&r=ecm
  20. By: Stefan Boes (Socioeconomic Institute, University of Zurich)
    Abstract: The credibility of standard instrumental variables assumptions is often under dispute. This paper imposes weak monotonicity in order to gain information on counterfactual outcomes, but avoids independence or exclusion restrictions. The outcome process is assumed to be sequentially ordered, building up and depending on the information level of agents. The potential outcome distribution is assumed to weakly increase (or decrease) with the instrument, conditional on the continuation up to a certain stage. As a general result, the counterfactual distributions can only be bounded, but the derived bounds are informative compared to the no-assumptions bounds thus justifying the instrumental variables terminology. The construction of bounds is illustrated in two data examples.
    Keywords: nonparametric bounds, treatment effects, endogeneity, binary choice, monotone instrumental variables, policy evaluation
    JEL: C14 C25 C35
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:soz:wpaper:0918&r=ecm
  21. By: Roberto Patuelli (Institute for Economic Research (IRE), University of Lugano, Switzerland; The Rimini Centre for Economic Analysis, Italy); Norbert Schanne (Institute for Employment Research (IAB), Nuremberg, Germany); Daniel A. Griffith (School of Economic, Political and Policy Sciences, University of Texas at Dallas, USA); Peter Nijkamp (Department of Spatial Economics, VU University Amsterdam, The Netherlands)
    Abstract: The geographical distribution and persistence of regional/local unemployment rates in heterogeneous economies (such as Germany) have been, in recent years, the subject of various theoretical and empirical studies. Several researchers have shown an interest in analysing the dynamic adjustment processes of unemployment and the average degree of dependence of current unemployment rates or gross domestic product on those observed in the past. In this paper, we present a new econometric approach to the study of regional unemployment persistence, in order to account for spatial heterogeneity and/or spatial autocorrelation in both the levels and the dynamics of unemployment. First, we propose an econometric procedure suggesting the use of spatial filtering techniques as a substitute for fixed effects in a panel estimation framework. The spatial filter computed here is a proxy for spatially distributed region-specific information (e.g., the endowment of natural resources, or the size of the ‘home market’) that is usually incorporated in the fixed effects coefficients. The advantages of our proposed procedure are that the spatial filter, by incorporating region-specific information that generates spatial autocorrelation, frees up degrees of freedom, simultaneously corrects for time-stable spatial autocorrelation in the residuals, and provides insights about the spatial patterns in regional adjustment processes. In the paper we present several experiments in order to investigate the spatial pattern of the heterogeneous autoregressive coefficients estimated for unemployment data for German NUTS-3 regions.
    Keywords: unemployment persistence, dynamic panel, hysteresis, spatial filtering, fixed effects
    JEL: C21 C23 R12
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:lug:wpaper:1001&r=ecm
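A minimal sketch of Moran-eigenvector spatial filtering, the class of techniques entry 21 proposes as a substitute for fixed effects: eigenvectors of the doubly centred spatial weights matrix enter the regression as extra regressors absorbing spatially structured heterogeneity. The toy lattice, number of retained eigenvectors and simulated outcome are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(9)
side = 10
n = side * side

# rook-contiguity weights on a square lattice (illustrative geography)
C = np.zeros((n, n))
for i in range(side):
    for j in range(side):
        k = i * side + j
        for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            ii, jj = i + di, j + dj
            if 0 <= ii < side and 0 <= jj < side:
                C[k, ii * side + jj] = 1.0

M = np.eye(n) - np.ones((n, n)) / n            # centring matrix
vals, vecs = np.linalg.eigh(M @ C @ M)
E = vecs[:, np.argsort(vals)[::-1][:10]]       # eigenvectors with strongest positive autocorrelation

x = rng.normal(size=n)
y = 1.0 + 0.5 * x + 3.0 * E[:, 0] + rng.normal(scale=0.5, size=n)  # spatially patterned outcome
X = np.column_stack([np.ones(n), x, E])        # spatial filter in place of fixed effects
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta[:2].round(3))                       # intercept and slope on x
```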
  22. By: Joan Paredes (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.); Diego J. Pedregal (Universidad de Castilla-La Mancha, Real Casa de la Misericordia C/ Altagracia 50, 13071 Ciudad Real, España.); Javier J. Pérez (Banco de España, Research Department, Alcalá 50, E-28014 Madrid, Spain.)
    Abstract: The analysis of the macroeconomic impact of fiscal policies in the euro area has traditionally been limited by the absence of quarterly fiscal data. To overcome this problem, we provide two new databases in this paper. Firstly, we construct a quarterly database of euro area fiscal variables for the period 1980-2008 for a quite disaggregated set of fiscal variables; secondly, we present a real-time fiscal database for a subset of fiscal variables, composed of biannual vintages of data for the euro area for the period 2000-2009. All models are multivariate, state space mixed-frequencies models estimated with available national accounts fiscal data (mostly annual) and, more importantly, monthly and quarterly information taken from the cash accounts of the governments. We provide both seasonally adjusted and non-seasonally adjusted data. Focusing solely on intra-annual fiscal information for interpolation purposes allows us to capture genuine intra-annual "fiscal" dynamics in the data. Thus, we provide fiscal data that avoid some problems likely to appear in studies using fiscal time series interpolated on the basis of general macroeconomic indicators, namely the well-known decoupling of tax collection from the evolution of standard macroeconomic tax bases (revenue windfalls/shortfalls).
    Keywords: Euro area, Fiscal policies, Interpolation, Unobserved Components models, Mixed frequencies
    JEL: C53 E6 H6
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20091132&r=ecm
  23. By: Andre A. Monteiro
    Abstract: This paper provides an introduction to the problem of modeling randomly spaced longitudinal data. Although point process theory was developed mostly in the sixties and early seventies, only in the nineties did this field of probability theory attract the attention of researchers working in financial econometrics. The large increase since then in the number of different classes of econometric models for dealing with financial duration data has been mostly due to the increased availability of both trade-by-trade data from equity markets and daily default and rating migration data from credit markets. This paper provides an overview of the main econometric models available in the literature for dealing with what is sometimes called tick data. Additionally, a synthesis of the basic theory underlying these models is also presented. Finally, a new theorem dealing with the identifiability of latent intensity factors from point process data, together with a heuristic proof, is introduced.
    Keywords: Tick data, Financial duration models, Point processes, Migration models
    JEL: C22 C32 C34 C41 G10
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws097924&r=ecm
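As a concrete example of the duration models surveyed in entry 23, a simulation of the ACD(1,1) model of Engle and Russell with unit-exponential innovations; parameter values are illustrative.

```python
import numpy as np

def simulate_acd(n, omega=0.1, alpha=0.1, beta=0.8, rng=None):
    """Simulate durations from an ACD(1,1) model with unit-exponential innovations."""
    rng = rng or np.random.default_rng()
    psi = np.empty(n)                          # conditional expected durations
    x = np.empty(n)                            # observed durations between events
    psi[0] = omega / (1 - alpha - beta)        # start at the unconditional mean
    x[0] = psi[0] * rng.exponential()
    for i in range(1, n):
        psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
        x[i] = psi[i] * rng.exponential()
    return x, psi

durations, _ = simulate_acd(5000, rng=np.random.default_rng(10))
print("mean duration:", durations.mean().round(3))   # close to omega / (1 - alpha - beta) = 1
```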

This nep-ecm issue is ©2010 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.