nep-ecm New Economics Papers
on Econometrics
Issue of 2009‒05‒09
eleven papers chosen by
Sune Karlsson
Orebro University

  1. Principal Components and Long Run Implications of Multivariate Diffusions By Xiaohong Chen; Lars Peter Hansen; Jose Scheinkman
  2. Smoothed Versions of Statistical Functionals from a Finite Population By Motoyama, Hitoshi; Takahashi, Hajime
  3. Sato Processes in Default Modeling By Kokholm, Thomas; Nicolato, Elisa
  4. Regression methods for stochastic control problems and their convergence analysis By Denis Belomestny; Anastasia Kolodko; John Schoenmakers
  5. A Joint Dynamic Bi-Factor Model of the Yield Curve and the Economy as a Predictor of Business Cycles By Chauvet, Marcelle; Senyuz, Zeynep
  6. On the Conjecture of Kochar and Korwar By Nuria Torrado; Rosa E. Lillo; Michael P. Wiper
  7. A Methodology to Compute Regional Housing Index Price using Matching Estimator Methods By Paredes-Araya, Dusan
  8. The forecasting power of international yield curve linkages By Michele Modugno; Kleopatra Nikolaou
  9. Benchmarking and Firm Heterogeneity in Electricity Distribution: A Latent Class Analysis of Germany By Astrid Cullmann
  10. Forecasting the Fragility of the Banking and Insurance Sector By Kerstin Bernoth; Andreas Pick
  11. Testing for Exceptional Bulls and Bears: a Non-Parametric Perspective By Candelon Bertrand; Metiu Norbert

  1. By: Xiaohong Chen (Cowles Foundation, Yale University); Lars Peter Hansen (Dept. of Economics, University of Chicago); Jose Scheinkman (Dept. of Economics, Princeton University)
    Abstract: We investigate a method for extracting nonlinear principal components. These principal components maximize variation subject to smoothness and orthogonality constraints; but we allow for a general class of constraints and multivariate densities, including densities without compact support and even densities with algebraic tails. We provide primitive sufficient conditions for the existence of these principal components. We characterize the limiting behavior of the associated eigenvalues, the objects used to quantify the incremental importance of the principal components. By exploiting the theory of continuous-time, reversible Markov processes, we give a different interpretation of the principal components and the smoothness constraints. When the diffusion matrix is used to enforce smoothness, the principal components maximize long-run variation relative to the overall variation subject to orthogonality constraints. Moreover, the principal components behave as scalar autoregressions with heteroskedastic innovations; this supports semiparametric identification of a multivariate reversible diffusion process and tests of the overidentifying restrictions implied by such a process from low frequency data. We also explore implications for stationary, possibly non-reversible diffusion processes.
    Keywords: Nonlinear principal components, Discrete spectrum, Eigenvalue decay rates, Multivariate diffusion, Quadratic form, Conditional expectations operator
    JEL: C12 C22
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1694&r=ecm
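    For intuition, a minimal numerical sketch of a finite-dimensional analogue of this idea: with a basis of smooth functions, components that maximize sample variance per unit of gradient energy solve a generalized eigenvalue problem. The simulated Ornstein-Uhlenbeck data, the polynomial basis and the penalty below are illustrative assumptions, not the paper's estimator.

      import numpy as np
      from scipy.linalg import eigh

      rng = np.random.default_rng(0)

      # Simulate a scalar Ornstein-Uhlenbeck process (a reversible diffusion).
      n, dt = 50_000, 0.01
      x = np.zeros(n)
      for t in range(1, n):
          x[t] = x[t - 1] - x[t - 1] * dt + np.sqrt(dt) * rng.standard_normal()

      # Polynomial basis and its derivatives (constants drop out of both forms).
      degs = np.arange(1, 5)
      phi = np.column_stack([x ** d for d in degs])
      dphi = np.column_stack([d * x ** (d - 1) for d in degs])

      cov = np.cov(phi, rowvar=False)     # overall variation, to be maximized
      dirichlet = dphi.T @ dphi / n       # smoothness penalty, E[f'(X)^2]

      # Generalized eigenproblem: Cov(phi) v = lambda * Dirichlet v.
      eigvals, eigvecs = eigh(cov, dirichlet)
      order = np.argsort(eigvals)[::-1]
      print("variance per unit of gradient energy:", eigvals[order].round(3))
      print("leading component coefficients:", eigvecs[:, order[0]].round(3))
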
  2. By: Motoyama, Hitoshi; Takahashi, Hajime
    Abstract: We consider the central limit theorem for smoothed versions of statistical functionals in a finite population. For the infinite population, Reeds and Fernholz discuss the problem under the condition of Hadamard differentiability of the statistical functionals and derive Taylor-type expansions: the Lindeberg-Feller central limit theorem is applied to the leading term, and by controlling the remainder terms, the central limit theorem for the statistical functionals is proved. We modify Fernholz's method and apply it to the finite population with smoothed empirical distribution functions, again obtaining Taylor-type expansions. We then apply the Erdős-Rényi central limit theorem to the leading linear term to obtain the central limit theorem. We also give sufficient conditions for the central limit theorem, both for the smoothed influence function and for the original non-smoothed version. Some Monte Carlo simulation results are included.
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:hit:econdp:2005-05&r=ecm
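    As a rough illustration of the objects involved (not the paper's construction), the sketch below builds a kernel-smoothed empirical distribution function and evaluates a plug-in functional, the median, at both the raw and the smoothed versions; the data and the bandwidth rule are assumptions.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(1)
      sample = rng.exponential(scale=2.0, size=200)      # "population" stand-in
      h = 1.06 * sample.std() * len(sample) ** (-1 / 5)  # rule-of-thumb bandwidth

      def edf(t):
          """Raw empirical distribution function."""
          return np.mean(sample <= t)

      def smoothed_edf(t):
          """Gaussian-kernel smoothed EDF: mean of normal CDFs at (t - X_i)/h."""
          return norm.cdf((t - sample) / h).mean()

      def median_functional(F, lo=-10.0, hi=50.0, tol=1e-8):
          """Plug-in median T(F): bisection for F(t) = 1/2."""
          while hi - lo > tol:
              mid = 0.5 * (lo + hi)
              lo, hi = (mid, hi) if F(mid) < 0.5 else (lo, mid)
          return 0.5 * (lo + hi)

      print("median of raw EDF:     ", median_functional(edf))
      print("median of smoothed EDF:", median_functional(smoothed_edf))
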
  3. By: Kokholm, Thomas (Department of Business Studies, Aarhus School of Business); Nicolato, Elisa (Department of Business Studies, Aarhus School of Business)
    Abstract: In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard, defined as the integrated intensity process. Instead, recent literature has shown a tendency towards specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore, we analyze specifications obtained via a simple deterministic time-change of a homogeneous Lévy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared with those of a recently proposed class of intensity models based on Ornstein-Uhlenbeck type processes. It is shown how the time-inhomogeneous Lévy models achieve comparable calibration errors with fewer parameters and more stable parameter estimates over time. However, the calibration performances of the Sato processes and the time-change specifications are practically indistinguishable.
    Keywords: credit default swap; reduced form model; Sato process; time-changed Lévy process; cumulative hazard
    Date: 2009–04–27
    URL: http://d.repec.org/n?u=RePEc:hhb:aarbfi:2009-01&r=ecm
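    The reduced-form mechanics can be sketched directly: survival probabilities are Laplace transforms of the cumulative hazard, so any specification with a known transform gives them in closed form. The snippet below checks this by Monte Carlo for a gamma subordinator under a deterministic time change g(t) = t^rho, a simple stand-in rather than the paper's Sato specification; all parameters are invented.

      import numpy as np

      rng = np.random.default_rng(2)
      a, b, rho = 2.0, 20.0, 1.3    # gamma shape/rate per unit time; time-change exponent
      horizons = np.array([1.0, 3.0, 5.0, 10.0])
      n_paths = 100_000

      for t in horizons:
          # Cumulative hazard Lambda_{g(t)} ~ Gamma(shape=a*g(t), rate=b): positive, increasing.
          lam = rng.gamma(shape=a * t ** rho, scale=1.0 / b, size=n_paths)
          surv_mc = np.exp(-lam).mean()        # P(tau > t) = E[exp(-Lambda)]
          # Closed form via the gamma Laplace transform: (b/(b+1))^(a*g(t)).
          surv_cf = (b / (b + 1.0)) ** (a * t ** rho)
          print(f"t={t:5.1f}  survival MC={surv_mc:.4f}  closed form={surv_cf:.4f}")
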
  4. By: Denis Belomestny; Anastasia Kolodko; John Schoenmakers
    Abstract: In this paper we develop several regression algorithms for solving general stochastic optimal control problems via Monte Carlo. This type of algorithm is particularly useful for problems with a high-dimensional state space and a complex dependence structure of the underlying Markov process with respect to the control. The main idea behind the algorithms is to simulate a set of trajectories under some reference measure and to use the Bellman principle combined with fast methods for approximating conditional expectations and functional optimization. Theoretical properties of the presented algorithms are investigated and convergence to the optimal solution is proved under some assumptions. Finally, the presented methods are applied to a numerical example of a high-dimensional controlled Bermudan basket option in a financial market with a large investor.
    Keywords: Optimal stochastic control, Regression methods, Convergence analysis
    JEL: C15 C61
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2009-026&r=ecm
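    The flavor of such regression algorithms can be seen in the optimal-stopping special case, where the approach reduces to Longstaff-Schwartz-style regression Monte Carlo. The sketch below prices a one-dimensional Bermudan put under assumed parameters, not the paper's high-dimensional basket example.

      import numpy as np

      rng = np.random.default_rng(3)
      s0, K, r, sigma, T, n_ex, n_paths = 100.0, 100.0, 0.05, 0.2, 1.0, 10, 50_000
      dt = T / n_ex

      # Simulate GBM paths under the reference (risk-neutral) measure.
      z = rng.standard_normal((n_paths, n_ex))
      s = s0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))

      payoff = np.maximum(K - s[:, -1], 0.0)      # cashflow if held to the last date
      for t in range(n_ex - 2, -1, -1):           # backward Bellman recursion
          payoff *= np.exp(-r * dt)               # discount continuation cashflows to date t
          exercise = np.maximum(K - s[:, t], 0.0)
          itm = exercise > 0                      # regress on in-the-money paths only
          basis = np.column_stack([np.ones(itm.sum()), s[itm, t], s[itm, t] ** 2])
          coef, *_ = np.linalg.lstsq(basis, payoff[itm], rcond=None)
          continuation = basis @ coef             # regression estimate of E[V_{t+1} | S_t]
          ex_now = exercise[itm] > continuation   # Bellman: exercise if immediate > continuation
          idx = np.where(itm)[0][ex_now]
          payoff[idx] = exercise[itm][ex_now]

      price = np.exp(-r * dt) * payoff.mean()
      print(f"Bermudan put value (regression MC): {price:.3f}")
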
  5. By: Chauvet, Marcelle; Senyuz, Zeynep
    Abstract: This paper proposes an econometric model of the joint dynamic relationship between the yield curve and the economy to predict business cycles. We examine the predictive value of the yield curve for forecasting both future economic growth and the beginning and end of economic recessions at the monthly frequency. The proposed multivariate dynamic factor model takes into account not only the popular term spread but also information extracted from the entire yield curve. The nonlinear model is used to investigate the interrelationship between the phases of the bond market and of the business cycle. The results indicate a strong interrelation between these two sectors. Although the popular term spread has a reasonable forecasting performance, the proposed factor model of the yield curve exhibits substantial incremental predictive value. This result holds in-sample and out-of-sample, using revised or real-time unrevised data.
    Keywords: Forecasting; Business Cycles; Yield Curve; Dynamic Factor Models; Markov Switching.
    JEL: C32 E32 E44
    Date: 2008–12
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15076&r=ecm
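    For reference, the "popular term spread" benchmark that the abstract compares against is often implemented as a simple recession probit. A toy version on synthetic data is sketched below; this is the baseline, not the authors' bi-factor Markov-switching model, and all numbers are invented.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      rng = np.random.default_rng(4)

      # Synthetic monthly data: a term spread and a recession indicator generated
      # so that low spreads 12 months earlier raise the odds of recession.
      n = 480
      spread = rng.normal(1.5, 1.2, size=n)
      lagged = np.roll(spread, 12)                # lagged[i] = spread[i - 12]
      recess = (rng.uniform(size=n) < norm.cdf(-1.2 - 0.9 * lagged)).astype(float)
      y, x = recess[12:], lagged[12:]             # drop the wrapped-around lags

      def neg_loglik(beta):
          p = np.clip(norm.cdf(beta[0] + beta[1] * x), 1e-10, 1 - 1e-10)
          return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

      res = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
      b0, b1 = res.x
      print(f"probit fit: P(recession in 12m) = Phi({b0:.2f} + {b1:.2f} * spread)")
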
  6. By: Nuria Torrado; Rosa E. Lillo; Michael P. Wiper
    Abstract: In this paper, we prove some cases of a conjecture by Kochar and Korwar (1996) concerning the normalized spacings of the order statistics of a sample of independent exponential random variables with different scale parameters. For a sample of size n = 3, they proved the ordering of the normalized spacings and conjectured that the result holds for all n. We prove this conjecture for n = 4, for both spacings and normalized spacings, and also generalize some results to n > 4.
    Keywords: Heterogeneous exponential distribution, Hazard rate order, Normalized spacings
    Date: 2009–03
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws092108&r=ecm
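    The conjectured ordering is easy to probe by simulation (the paper's contribution is the analytic proof). The snippet below compares survival functions of successive normalized spacings for n = 4 heterogeneous exponentials with invented rates; stochastic ordering is a weaker implication of the conjectured hazard-rate ordering.

      import numpy as np

      rng = np.random.default_rng(5)
      rates = np.array([0.5, 1.0, 2.0, 4.0])      # distinct scale parameters
      n, reps = len(rates), 200_000

      x = rng.exponential(1.0 / rates, size=(reps, n))
      x.sort(axis=1)                               # order statistics per replication
      spacings = np.diff(np.column_stack([np.zeros(reps), x]), axis=1)
      normalized = spacings * (n - np.arange(n))   # D*_i = (n - i + 1)(X_(i) - X_(i-1))

      for t in (0.5, 1.0, 2.0):
          surv = (normalized > t).mean(axis=0)
          print(f"t={t}:  P(D*_i > t) =", np.round(surv, 4))
      # Each row should be increasing in i if D*_1 <= D*_2 <= ... holds stochastically.
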
  7. By: Paredes-Araya, Dusan
    Abstract: This paper proposes a methodology for a spatial cost index of housing that accounts for spatial heterogeneity in properties across regions. The index is built by combining three techniques to reduce the spatial heterogeneity in housing: quasi-experimental matching methods, hedonic prices, and a Fisher spatial price index. Using microdata from the Chilean survey CASEN 2006, it is shown that the quasi-experimental method called Mahalanobis metric within propensity score calipers (MMWPS) leads to a significant reduction in the potential bias. The technique matches dwellings of a particular region with other properties of similar characteristics in the benchmark region (the Metropolitan region). Once the houses are matched, a hedonic price model is computed, and a regional housing price matrix is created using Fisher spatial price indices. The paper concludes that price differentials exist for homogeneous houses across regions in Chile.
    Keywords: Housing Cost Index; Hedonic Prices Index; Matching Estimator; Spatial Fisher Index.
    JEL: C43 C21 R21
    Date: 2009–05–04
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15016&r=ecm
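    Of the three ingredients, the final aggregation step is the most mechanical: a Fisher spatial index is the geometric mean of Laspeyres and Paasche indices computed on matched dwellings. The sketch below shows just that step on synthetic matched data; the MMWPS matching and the hedonic regression are not reproduced.

      import numpy as np

      rng = np.random.default_rng(6)

      # Matched sample: price p and quantity q of each dwelling type in the
      # comparison region (r) and in the benchmark Metropolitan region (m).
      k = 50                                   # matched dwelling types
      p_m = rng.lognormal(mean=10.0, sigma=0.3, size=k)
      p_r = p_m * rng.lognormal(mean=0.1, sigma=0.1, size=k)   # region r ~10% dearer
      q_m = rng.integers(1, 20, size=k).astype(float)
      q_r = rng.integers(1, 20, size=k).astype(float)

      laspeyres = (p_r * q_m).sum() / (p_m * q_m).sum()   # benchmark-region weights
      paasche = (p_r * q_r).sum() / (p_m * q_r).sum()     # comparison-region weights
      fisher = np.sqrt(laspeyres * paasche)               # geometric mean of the two
      print(f"Laspeyres={laspeyres:.3f}  Paasche={paasche:.3f}  Fisher={fisher:.3f}")
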
  8. By: Michele Modugno (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Kleopatra Nikolaou (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.)
    Abstract: This paper investigates whether information from foreign yield curves helps forecast domestic yield curves out-of-sample. A nested methodology to forecast yield curves in domestic and international settings is applied to three major countries (the US, Germany and the UK). This novel methodology is based on dynamic factor models, the EM algorithm and the Kalman filter. The domestic model is compared vis-à-vis an international one, in which information from foreign yield curves is allowed to enrich the information set of the domestic yield curve. The results have interesting and original implications. They reveal clear international dependency patterns, strong enough to improve forecasts for Germany and, to a lesser extent, the UK. The US yield curve exhibits a more independent behaviour. In this way, the paper also generalizes anecdotal evidence on international interest rate linkages to the whole yield curve.
    Keywords: Yield curve forecast, Dynamic factor model, EM algorithm, International linkages.
    JEL: F31
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:200901044&r=ecm
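    A toy version of the filtering machinery (not the paper's model): a Kalman filter extracting one common factor from a small panel of synthetic yields, with loadings, dynamics and noise levels as illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(7)
      T, n_yields = 300, 4
      phi, q = 0.95, 0.1                       # factor AR(1) coefficient, state noise variance
      lam = np.array([1.0, 0.8, 0.6, 0.5])     # factor loadings per maturity
      r = 0.05 * np.eye(n_yields)              # measurement noise covariance

      # Simulate the factor and the observed yields.
      f = np.zeros(T)
      for t in range(1, T):
          f[t] = phi * f[t - 1] + np.sqrt(q) * rng.standard_normal()
      y = f[:, None] * lam + rng.multivariate_normal(np.zeros(n_yields), r, size=T)

      # Kalman filter: scalar state, vector observation.
      f_hat, p = 0.0, 1.0
      est = np.zeros(T)
      for t in range(T):
          f_pred, p_pred = phi * f_hat, phi**2 * p + q          # predict
          s = np.outer(lam, lam) * p_pred + r                   # innovation covariance
          k = p_pred * lam @ np.linalg.inv(s)                   # Kalman gain
          f_hat = f_pred + k @ (y[t] - lam * f_pred)            # update state
          p = p_pred - k @ lam * p_pred                         # update variance
          est[t] = f_hat

      print("corr(filtered factor, true factor):", np.corrcoef(est, f)[0, 1].round(3))
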
  9. By: Astrid Cullmann
    Abstract: In January 2009 Germany introduced incentive regulation for the electricity distribution sector, based on results obtained from econometric and nonparametric benchmarking analysis. One main problem for the regulator in assigning relative efficiency scores is unobserved firm-specific factors such as network and technological differences. Comparing the efficiency of different firms usually assumes that they operate under the same production technology, so unobserved factors might be inappropriately interpreted as inefficiency. To avoid this type of misspecification, in regulatory practice estimation is carried out in two stages: in a first stage, observations are classified into two categories according to the size of the network operators; separate analyses are then conducted for each sub-group. This paper shows how to disentangle heterogeneity from inefficiency in one step, using a latent class model for stochastic frontiers. As the classification is not based on a priori sample separation criteria, it delivers more robust, statistically significant and testable results. Against this background we analyze the level of technical efficiency of a sample of 200 regional and local German electricity distribution companies for a balanced panel data set (2001-2005). Testing the hypothesis that larger distributors operate under a different technology than smaller ones, we assess whether a single-step latent class model provides new insights into the use of benchmarking approaches within incentive regulation schemes.
    Keywords: Stochastic frontiers, latent class model, electricity distribution, incentive regulation
    JEL: C24 C81 D24 L94
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp881&r=ecm
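    The one-step idea can be illustrated with a simplified stand-in: an EM-estimated mixture of two linear regressions recovers two "technologies" without an a priori sample split. The paper's latent class stochastic frontier additionally includes a one-sided inefficiency term, which the sketch below omits; all data are synthetic.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(8)

      # Synthetic cost data from two technologies with different intercepts/slopes.
      n = 400
      x = rng.uniform(0, 10, n)
      cls = rng.integers(0, 2, n)
      y = np.where(cls == 0, 1.0 + 0.5 * x, 3.0 + 1.5 * x) + rng.normal(0, 0.7, n)
      X = np.column_stack([np.ones(n), x])

      # EM for a two-component mixture of regressions.
      beta = np.array([[0.0, 1.0], [2.0, 2.0]])   # initial class coefficients
      sigma, pi = np.array([1.0, 1.0]), np.array([0.5, 0.5])
      for _ in range(200):
          # E step: posterior class probabilities per firm.
          dens = np.stack([pi[j] * norm.pdf(y, X @ beta[j], sigma[j]) for j in range(2)])
          w = dens / dens.sum(axis=0)
          # M step: weighted least squares and weighted residual variances.
          for j in range(2):
              wj = w[j]
              beta[j] = np.linalg.solve(X.T @ (wj[:, None] * X), X.T @ (wj * y))
              sigma[j] = np.sqrt(np.sum(wj * (y - X @ beta[j]) ** 2) / wj.sum())
          pi = w.mean(axis=1)

      print("class shares:", pi.round(2))
      print("class coefficients:\n", beta.round(2))
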
  10. By: Kerstin Bernoth; Andreas Pick
    Abstract: This paper considers the issue of forecasting the financial fragility of banks and insurers using a panel data set of performance indicators, namely distance-to-default, taking unobserved common factors into account. We show that common factors are important in the performance of banks and insurers, analyze the influence of a number of observable factors on banking and insurance performance, and evaluate the forecasts from our model. We find that taking unobserved common factors into account reduces the root mean square forecast error of firm-specific forecasts by up to 11% and of system forecasts by up to 29% relative to a model based only on observed variables. Estimates of the factor loadings suggest that the correlation of financial institutions has been relatively stable over the forecast period.
    Keywords: Financial stability, financial linkages, banking, insurances, unobserved common factors, forecasting
    JEL: C53 G21 G22
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp882&r=ecm
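    A stylized sketch of the mechanism (not the authors' estimator): extract a common factor from a synthetic panel of distance-to-default-like series by principal components and use it to augment firm-level AR(1) forecasts. The improvement mirrors the idea; all numbers are invented.

      import numpy as np

      rng = np.random.default_rng(9)
      T, n_firms = 200, 30

      # Panel: each firm loads on one persistent common factor plus noise.
      factor = np.zeros(T)
      for t in range(1, T):
          factor[t] = 0.9 * factor[t - 1] + rng.standard_normal()
      load = rng.uniform(0.5, 1.5, n_firms)
      panel = factor[:, None] * load + rng.normal(0, 1.0, (T, n_firms))

      train = panel[:-1]
      # First principal component of the training panel as the factor estimate.
      u, s_, vt = np.linalg.svd(train - train.mean(0), full_matrices=False)
      f_hat = u[:, 0] * s_[0]

      def ols(Xm, ym):
          return np.linalg.lstsq(Xm, ym, rcond=None)[0]

      sq_err_ar, sq_err_fac = [], []
      for i in range(n_firms):
          yi = train[1:, i]
          x_ar = np.column_stack([np.ones(len(yi)), train[:-1, i]])
          x_fac = np.column_stack([x_ar, f_hat[:-1]])
          b_ar, b_fac = ols(x_ar, yi), ols(x_fac, yi)
          truth = panel[-1, i]                       # held-out last observation
          pred_ar = np.array([1.0, panel[-2, i]]) @ b_ar
          pred_fac = np.array([1.0, panel[-2, i], f_hat[-1]]) @ b_fac
          sq_err_ar.append((truth - pred_ar) ** 2)
          sq_err_fac.append((truth - pred_fac) ** 2)

      print(f"RMSE, AR only:     {np.sqrt(np.mean(sq_err_ar)):.3f}")
      print(f"RMSE, AR + factor: {np.sqrt(np.mean(sq_err_fac)):.3f}")
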
  11. By: Candelon Bertrand; Metiu Norbert (METEOR)
    Abstract: This paper investigates exceptional phases of stock market cycles. Defined in Pagan and Sossounov (2003) as unusual, such phases are detected as outliers in the historical distribution. Moreover, this study extends the growing literature on stock market bulls and bears in several respects. First, it extends the description of financial cycles by going beyond the duration feature alone. Second, a new strategy to test for single and multiple outliers is presented. Based on this procedure, the exceptional bulls and bears that have occurred since 1973 are detected. A complementary analysis deals with the specific cross-country patterns of the current sub-prime crisis. Our results are mixed, in the sense that they do not support the idea that the ongoing bear market is exceptional for all the analyzed countries. Moreover, the results indicate that the stock market indices are still far from the thresholds beyond which the current bear phase would become exceptional worldwide.
    Keywords: monetary economics;
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:dgr:umamet:2009019&r=ecm
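    A toy sketch in the same spirit: flag exceptional phases as outliers of the historical distribution. The snippet uses a simple Tukey-fence rule rather than the paper's formal single/multiple outlier test, and the durations are invented.

      import numpy as np

      # Hypothetical bear-phase durations in months (the last entry is "ongoing").
      durations = np.array([14, 9, 21, 6, 11, 17, 8, 13, 19, 10, 35])

      q1, q3 = np.percentile(durations, [25, 75])
      upper_fence = q3 + 1.5 * (q3 - q1)           # Tukey upper fence
      flags = durations > upper_fence
      print(f"upper fence: {upper_fence:.1f} months")
      print("exceptional phases:", durations[flags])
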

This nep-ecm issue is ©2009 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.