on Econometrics
By: | Naoto Kunitomo (Faculty of Economics, University of Tokyo); Seisho Sato (Institute of Statistical Mathematics) |
Abstract: | For estimating realized volatility and covariance from high-frequency data in the possible presence of micro-market noise, we introduce the Separating Information Maximum Likelihood (SIML) method. The resulting estimator is simple and has a representation as a specific quadratic form of returns. The SIML estimator has reasonable asymptotic properties: it is consistent and asymptotically normal (with stable convergence in the general case) as the sample size grows, under general conditions that include non-Gaussian processes and volatility models. Based on simulations, we find that the SIML estimator has reasonable finite-sample properties, so it should be useful in practice. The limiting distribution of the SIML estimator can also be used to construct testing procedures and confidence intervals. |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2008cf581&r=ecm |
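A schematic of the setting for readers outside the realized-volatility literature (this is the standard additive micro-noise setup; the exact transformation defining the SIML estimator is given in the paper, so the details below are only indicative):

\[
Y_{t_i} = X_{t_i} + v_i \quad (i = 1,\dots,n), \qquad dX_t = \mu_t\,dt + \sigma_t\,dB_t,
\]

where \(Y\) is the observed log-price, \(X\) the efficient log-price, and \(v\) the micro-market noise, and the target is the integrated volatility \(\int_0^1 \sigma_t^2\,dt\). The "specific quadratic form of returns" mentioned in the abstract is of the type \(\hat{\sigma}^2 = m_n^{-1}\sum_{k=1}^{m_n} z_k^2\), where \(z\) is a fixed orthogonal transformation of the observed returns that separates the information about the signal from that about the noise, and \(m_n \to \infty\) with \(m_n/n \to 0\).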
By: | T. W. Anderson (Department of Statistics and Department of Economics, Stanford University); Naoto Kunitomo (Faculty of Economics, University of Tokyo); Yukitoshi Matsushita (Graduate School of Economics, University of Tokyo) |
Abstract: | We compare four estimation methods for the coefficients of a linear structural equation with instrumental variables. As the classical methods we consider the limited information maximum likelihood (LIML) estimator and the two-stage least squares (TSLS) estimator, and as the semi-parametric methods we consider the maximum empirical likelihood (MEL) estimator and the generalized method of moments (GMM) (or estimating equation) estimator. Tables and figures of the distribution functions of the four estimators are given for enough values of the parameters to cover most linear models of interest, including some heteroscedastic and nonlinear cases. We find that the LIML estimator performs well in terms of bounded loss functions and probabilities when the number of instruments is large, that is, in micro-econometric models with "many instruments" in the terminology of the recent econometric literature. |
Date: | 2008–07 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2008cf577&r=ecm |
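A minimal simulation sketch in Python of the "many instruments" comparison described above (the data-generating process and parameter values are my own assumptions, not the paper's design): with many weak instruments and positive endogeneity, TSLS is biased toward OLS while LIML stays approximately median-unbiased.

    import numpy as np

    rng = np.random.default_rng(0)

    def tsls(y, x, Z):
        # two-stage least squares: regress y on the projection of x onto Z
        xhat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
        return (xhat @ y) / (xhat @ x)

    def liml(y, x, Z):
        # LIML as a k-class estimator; kappa is the smallest eigenvalue of
        # (W' Mz W)^{-1} (W' W), with W = [y, x] and Mz the annihilator of Z
        n = len(y)
        W = np.column_stack([y, x])
        Mz = np.eye(n) - Z @ np.linalg.pinv(Z)
        kappa = np.linalg.eigvals(np.linalg.solve(W.T @ Mz @ W, W.T @ W)).real.min()
        Mzx = Mz @ x
        return (x @ y - kappa * (Mzx @ y)) / (x @ x - kappa * (Mzx @ x))

    n, K, beta = 200, 30, 1.0              # 30 weak instruments (assumed values)
    est_tsls, est_liml = [], []
    for _ in range(1000):
        Z = rng.standard_normal((n, K))
        u = rng.standard_normal(n)
        v = 0.8 * u + 0.6 * rng.standard_normal(n)   # endogeneity: corr(u, v) = 0.8
        x = Z @ np.full(K, 0.1) + v
        y = beta * x + u
        est_tsls.append(tsls(y, x, Z))
        est_liml.append(liml(y, x, Z))

    print("median TSLS:", np.median(est_tsls))   # typically biased upward, toward OLS
    print("median LIML:", np.median(est_liml))   # typically close to the true beta = 1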
By: | Christophe Planas (Joint Research Centre of the European Commission); Alessandro Rossi (Joint Research Centre of the European Commission); Gabriele Fiorentini (University of Florence, Italy and The Rimini Centre for Economic Analysis, Italy) |
Abstract: | We propose a simple procedure for evaluating the marginal likelihood in univariate Structural Time Series (STS) models. We exploit the statistical properties of STS models and the results in Dickey (1968) to obtain the likelihood function with the variance parameters integrated out. This strategy applies under normal-inverted gamma-2 prior distributions for the structural shocks and the associated variances. For trend plus noise models such as the local level and the local linear trend, it yields the marginal likelihood by simple or double integration over the (0,1)-support. For trend plus cycle models, we show that marginalizing out the variance parameters greatly improves the accuracy of the Laplace method. We apply this methodology to the analysis of the US and euro area NAIRU. |
Keywords: | Marginal likelihood, Markov Chain Monte Carlo, unobserved components, bridge sampling, Laplace method, NAIRU |
Date: | 2008–01 |
URL: | http://d.repec.org/n?u=RePEc:rim:rimwps:21-08&r=ecm |
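For reference, the local level model mentioned above is (standard notation, not quoted from the paper):

\[
y_t = \mu_t + \varepsilon_t, \qquad \mu_t = \mu_{t-1} + \eta_t, \qquad
\varepsilon_t \sim N(0, \sigma_\varepsilon^2), \quad \eta_t \sim N(0, \sigma_\eta^2).
\]

Once the two variances are integrated out under the conjugate normal-inverted gamma-2 prior, what remains is (per the abstract) a one-dimensional integral over (0,1) for this model, plausibly over a transformed signal-to-noise ratio; the local linear trend adds a slope equation and a second integral.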
By: | F Bravo |
Abstract: | This paper introduces a new class of M-estimators based on generalised empirical likelihood estimation with auxiliary information available in the sample. The resulting class of estimators is efficient in the sense that it achieves the same asymptotic lower bound as the efficient generalised method of moments-based M-estimator with the same auxiliary information. The results of the paper are quite general and apply to M-estimators defined by both smooth and nonsmooth estimating equations. Simulations show that the proposed estimators perform well in finite samples, and can be less biased and more precise than standard M-estimators. |
Keywords: | Asymptotic efficiency, generalised empirical likelihood, generalised method of moments, M-estimators. |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:yor:yorken:08/26&r=ecm |
By: | Gary Koop (University of Strathclyde, UK and The Rimini Centre for Economic Analysis, Italy); Roberto Leon-Gonzalez (National Graduate Institute for Policy Studies, Japan and The Rimini Centre for Economic Analysis, Italy); Rodney W. Strachan (University of Queensland, Australia and The Rimini Centre for Economic Analysis, Italy) |
Abstract: | There are both theoretical and empirical reasons for believing that the parameters of macroeconomic models may vary over time. However, work with time-varying parameter models has largely involved Vector autoregressions (VARs), ignoring cointegration. This is despite the fact that cointegration plays an important role in informing macroeconomists on a range of issues. In this paper we develop time varying parameter models which permit cointegration. Time-varying parameter VARs (TVP-VARs) typically use state space representations to model the evolution of parameters. In this paper, we show that it is not sensible to use straightforward extensions of TVP-VARs when allowing for cointegration. Instead we develop a specification which allows for the cointegrating space to evolve over time in a manner comparable to the random walk variation used with TVP-VARs. The properties of our approach are investigated before developing a method of posterior simulation. We use our methods in an empirical investigation involving a permanent/transitory variance decomposition for inflation. |
Keywords: | Bayesian, time varying cointegration, error correction model, reduced rank regression, Markov Chain Monte Carlo. |
JEL: | C11 C32 C33 |
Date: | 2008–01 |
URL: | http://d.repec.org/n?u=RePEc:rim:rimwps:23-08&r=ecm |
By: | Francesco Audrino; Marcelo C. Medeiros |
Abstract: | In this paper we propose a smooth transition tree model for both the conditional mean and the conditional variance of the short-term interest rate process. Our model incorporates the interpretability of regression trees and the flexibility of smooth transition models to describe regime switches in the short-term interest rate series. The estimation of such models is addressed and the asymptotic properties of the quasi-maximum likelihood estimator are derived. Model specification is also discussed. When the model is applied to the US short-term interest rate we find (1) leading indicators for inflation and real activity are the most relevant predictors in characterizing the multiple regimes' structure; (2) the optimal model has three limiting regimes, with significantly different local conditional mean and variance dynamics. Moreover, we provide empirical evidence of the strong power of the model in forecasting the first two conditional moments of the short rate process, in particular when it is used in connection with bootstrap aggregating (bagging). |
Keywords: | Short-term interest rate, Regression tree, Smooth transition, Conditional variance, Bagging, Asymptotic theory |
JEL: | C13 C22 C51 C53 |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:usg:dp2008:2008-16&r=ecm |
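For context, the smooth transition ingredient in the model above is the standard logistic weight (generic form; the paper combines such transitions along the branches of a tree):

\[
G(s_t; \gamma, c) = \frac{1}{1 + \exp\{-\gamma (s_t - c)\}},
\]

where \(s_t\) is the transition variable, \(c\) locates the regime switch, and \(\gamma\) controls its smoothness; each limiting regime receives a weight built from products of \(G\) and \(1-G\), and \(\gamma \to \infty\) recovers a hard regression-tree split.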
By: | Andrea Mercatanti (Bank of Italy, Economic and Financial Statistics Department.) |
Abstract: | This paper examines the problem of relaxing the exclusion restriction for the evaluation of causal effects in randomized experiments with imperfect compliance. The exclusion restriction is a key assumption for identifying causal effects by the nonparametric instrumental variables technique, for which the template of a randomized experiment with imperfect compliance represents a natural parametric extension. However, fully relaxing the exclusion restriction yields likelihood functions characterized by mixtures of distributions. This complicates a likelihood-based analysis because it implies partially identified models and multiple maximum likelihood points. We consider model identifiability when the outcome distributions of the various compliance states belong to the same parametric class. A two-step estimation procedure, based on detecting the root closest to the method of moments estimate of the parameter vector, is proposed and analyzed in detail under normally distributed outcomes. An economic example with real data on return to schooling concludes the paper. |
Keywords: | compliers, exclusion restriction, mixture distributions, return to schooling. |
JEL: | C13 C21 |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:bdi:wptemi:td_683_08&r=ecm |
By: | Frölich, Markus (University of Mannheim); Melly, Blaise (Brown University) |
Abstract: | Traditional instrumental variable estimators do not generally estimate effects for the treated population but for the unobserved population of compliers. They do identify effects for the treated when there is one-sided perfect non-compliance. However, this property is lost when covariates are included in the model. In this case, we show that the effects for the treated are still identified but require modified estimators. We consider both average and quantile treatment effects and allow the instrument to be discrete or continuous. |
Keywords: | treatment effects, instrumental variables, non-compliance, missing data |
JEL: | C13 C14 C21 |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp3671&r=ecm |
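The compliers-versus-treated distinction in the abstract above is easiest to see in the binary-instrument Wald formula (a standard result, not specific to this paper):

\[
\beta_{\mathrm{LATE}} = \frac{E[Y \mid Z=1] - E[Y \mid Z=0]}{E[D \mid Z=1] - E[D \mid Z=0]},
\]

which identifies the effect for compliers. Under one-sided perfect non-compliance (no unit with \(Z=0\) receives treatment), every treated unit is a complier, so the same ratio identifies the effect on the treated; it is this equivalence that breaks down once covariates enter the model, motivating the modified estimators the paper proposes.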
By: | Eo, Yunjong; Morley, James C. |
Abstract: | In this paper, we propose a new approach to constructing confidence sets for the timing of structural breaks. This approach involves using Markov-chain Monte Carlo methods to simulate marginal “fiducial” distributions of break dates from the likelihood function. We compare our proposed approach to asymptotic and bootstrap confidence sets and find that it performs best in terms of producing short confidence sets with accurate coverage rates. Our approach also has the advantages of i) being broadly applicable to different patterns of structural breaks, ii) being computationally efficient, and iii) requiring only the ability to evaluate the likelihood function over parameter values, thus allowing for many possible distributional assumptions for the data. In our application, we investigate the nature and timing of structural breaks in postwar U.S. real GDP. Based on marginal fiducial distributions, we find much tighter 95% confidence sets for the timing of the so-called “Great Moderation” than have been reported in previous studies. |
Keywords: | Fiducial Inference; Bootstrap Methods; Structural Breaks; Confidence Intervals and Sets; Coverage Accuracy and Expected Length; Markov-chain Monte Carlo |
JEL: | C15 C22 |
Date: | 2008–09–05 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:10372&r=ecm |
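A stylized illustration of the idea in Python (a one-break mean-shift toy example of my own construction; the paper uses MCMC to handle multiple breaks and richer models): normalize the profile likelihood over candidate break dates into a "fiducial" distribution and read off the smallest set carrying 95% of its mass.

    import numpy as np

    rng = np.random.default_rng(1)
    T, tau0 = 200, 120
    y = np.r_[rng.normal(0.0, 1.0, tau0), rng.normal(0.8, 1.0, T - tau0)]

    loglik = np.full(T, -np.inf)
    for tau in range(10, T - 10):                 # trim the sample ends
        s1, s2 = y[:tau], y[tau:]
        rss = ((s1 - s1.mean())**2).sum() + ((s2 - s2.mean())**2).sum()
        loglik[tau] = -0.5 * T * np.log(rss / T)  # concentrated Gaussian loglik

    w = np.exp(loglik - loglik.max())
    w /= w.sum()                                  # fiducial weights on break dates
    order = np.argsort(w)[::-1]
    k = int(np.searchsorted(w[order].cumsum(), 0.95)) + 1
    cset = np.sort(order[:k])                     # smallest set with >= 95% mass
    print("mode:", int(w.argmax()), "95% set:", cset.min(), "-", cset.max())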
By: | Carlos Santos (Faculdade de Economia e Gestão - Universidade Católica Portuguesa (Porto)) |
Abstract: | Monte Carlo evidence is provided on the efficiency of the impulse-saturation (IS) estimator in a location-scale model with heavy-tailed distributions. Comparisons show that the IS estimator is always more efficient than OLS and can even outperform the method of moments estimator in some instances. |
Keywords: | nonnormality; impulse saturation; robust estimation |
JEL: | C13 C16 |
Date: | 2008–09 |
URL: | http://d.repec.org/n?u=RePEc:cap:wpaper:052008&r=ecm |
By: | Wolfgang Brunauer; Stefan Lang; Peter Wechselberger; Sven Bienert |
Abstract: | We apply additive mixed regression models (AMM) to estimate hedonic price equations. Non-linear effects of continuous covariates as well as a smooth time trend are modeled non-parametrically through P-splines. Unobserved district-specific heterogeneity is modeled in two ways: First, by location specific intercepts with the postal code serving as a location variable. Second, in order to permit spatial variation in the nonlinear price gradients, we introduce multiplicative scaling factors for nonlinear covariates. This allows highly nonlinear implicit price functions to vary within a regularized framework, accounting for district-specific spatial heterogeneity. Using this model extension, we find substantial spatial variation in house price gradients, leading to a considerable improvement of model quality and predictive power. |
Keywords: | Hedonic regression, submarkets, multiplicative spatial scaling factors, semiparametric models, P-splines |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:inn:wpaper:2008-17&r=ecm |
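A minimal P-spline sketch in Python (an illustrative univariate smoother with assumed data; the paper's additive mixed models, location-specific intercepts, and multiplicative scaling factors go well beyond this): a cubic B-spline basis combined with a second-order difference penalty.

    import numpy as np
    from scipy.interpolate import BSpline

    rng = np.random.default_rng(2)
    x = np.sort(rng.uniform(0.0, 1.0, 300))
    y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.3, x.size)

    k, segments = 3, 20                              # cubic basis, 20 segments
    knots = np.r_[[0.0] * k, np.linspace(0.0, 1.0, segments + 1), [1.0] * k]
    B = BSpline.design_matrix(x, knots, k).toarray() # B-spline design matrix
    D = np.diff(np.eye(B.shape[1]), n=2, axis=0)     # 2nd-order difference penalty
    lam = 1.0                                        # smoothing parameter (fixed here)
    beta = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
    fit = B @ beta                                   # penalized spline fit

In the mixed-model formulation this class of papers builds on, the smoothing parameter corresponds to a ratio of variance components and is estimated (e.g. by REML) rather than fixed as above.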
By: | Kraay, Aart |
Abstract: | The validity of instrumental variables (IV) regression models depends crucially on fundamentally untestable exclusion restrictions. Typically, exclusion restrictions are assumed to hold exactly in the relevant population, yet in many empirical applications there are reasonable prior grounds to doubt their literal truth. In this paper I show how to incorporate prior uncertainty about the validity of the exclusion restriction into linear IV models, and explore the consequences for inference. In particular, I provide a mapping from prior uncertainty about the exclusion restriction into increased uncertainty about parameters of interest. Moderate prior uncertainty about exclusion restrictions can lead to a substantial loss of precision in estimates of structural parameters. This loss of precision is relatively more important in situations where IV estimates appear to be more precise, for example in larger samples or with stronger instruments. I illustrate these points using several prominent recent empirical papers that use linear IV models. |
Keywords: | Economic Theory & Research, Statistical & Mathematical Sciences, Currencies and Exchange Rates, Econometrics, Access to Finance |
Date: | 2008–05–01 |
URL: | http://d.repec.org/n?u=RePEc:wbk:wbrwps:4632&r=ecm |
By: | Gary Koop (University of Strathclyde, UK and The Rimini Centre for Economic Analysis, Italy); Roberto Leon-Gonzalez (National Graduate Institute for Policy Studies, Japan and The Rimini Centre for Economic Analysis, Italy); Rodney W. Strachan (University of Queensland, Australia and The Rimini Centre for Economic Analysis, Italy) |
Abstract: | Empirical macroeconomists are increasingly using models (e.g. regressions or Vector Autoregressions) where the parameters vary over time. State space methods are frequently used to specify the evolution of parameters in such models. In any application, there are typically restrictions on the parameters that a researcher might be interested in. This motivates the question of how to calculate the probability that a restriction holds at a point in time without assuming the restriction holds at all (or any other) points in time. This paper develops methods to answer this question. In particular, the principle of the Savage-Dickey density ratio is used to obtain the time-varying posterior probabilities of restrictions. We use our methods in a macroeconomic application involving the Phillips curve. Macroeconomists are interested in whether the long-run Phillips curve is vertical. This is a restriction for which we can calculate the posterior probability using our methods. Using U.S. data, the probability that this restriction holds tends to be fairly high, but decreases slightly over time (apart from a slight peak in the late 1970s). We also calculate the probability that another restriction, that the NAIRU is not identified, holds. The probability that it holds fluctuates over time with most evidence in favor of the restriction occurring after 1990. |
Keywords: | Bayesian, state space model, Savage-Dickey density ratio, time varying parameter model. |
JEL: | C11 C32 E52 |
Date: | 2008–01 |
URL: | http://d.repec.org/n?u=RePEc:rim:rimwps:26-08&r=ecm |
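For reference, the Savage-Dickey device referred to above computes the Bayes factor for a point restriction \(\theta = \theta_0\) as a ratio of densities under the unrestricted model (standard form, not quoted from the paper):

\[
BF_{01} = \frac{p(\theta_0 \mid y, M_1)}{p(\theta_0 \mid M_1)},
\]

that is, the posterior density at \(\theta_0\) divided by the prior density at \(\theta_0\), valid when the restricted model's prior equals the unrestricted prior conditioned on the restriction. Evaluating this ratio period by period is what yields the time-varying posterior probabilities of the restriction.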
By: | John Geweke (University of Iowa, USA); Gianni Amisano (University of Brescia, Italy, European Central Bank and The Rimini Centre for Economic Analysis, Italy) |
Abstract: | A prediction model is any statement of a probability distribution for an outcome not yet observed. This study considers the properties of weighted linear combinations of n prediction models, or linear pools, evaluated using the conventional log predictive scoring rule. The log score is a concave function of the weights and, in general, an optimal linear combination will include several models with positive weights despite the fact that exactly one model has limiting posterior probability one. The paper derives several interesting formal results: for example, a prediction model with positive weight in a pool may have zero weight if some other models are deleted from that pool. The results are illustrated using S&P 500 returns with prediction models from the ARCH, stochastic volatility and Markov mixture families. In this example models that are clearly inferior by the usual scoring criteria have positive weights in optimal linear pools, and these pools substantially outperform their best components. |
Keywords: | forecasting; GARCH; log scoring; Markov mixture; model combination; S&P 500 returns; stochastic volatility |
Date: | 2008–01 |
URL: | http://d.repec.org/n?u=RePEc:rim:rimwps:22-08&r=ecm |
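A minimal sketch in Python of choosing pool weights by maximizing the log predictive score (synthetic density values of my own; this is not the paper's S&P 500 application or its model set):

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    T, n = 500, 3
    P = rng.uniform(0.05, 2.0, size=(T, n))   # P[t, i] = p_i(y_t), predictive densities

    def neg_log_score(w):
        # log score of the pool: sum_t log( sum_i w_i * p_i(y_t) )
        return -np.log(P @ w).sum()

    res = minimize(neg_log_score, np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n,
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
    print("optimal pool weights:", res.x.round(3))

Because the log score is concave in the weights (as the abstract notes), this is a well-behaved optimization on the simplex, and interior solutions with several strictly positive weights are the typical outcome.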
By: | Kengo Kato (Graduate School of Economics, University of Tokyo); Naoto Kunitomo (Faculty of Economics, University of Tokyo); Satoshi Masuda (Chuo Mitsui Trust Holdings, Inc.) |
Abstract: | We summarize recent developments in the statistical method of Lasso-Quantile Regression and apply it to a non-life insurance problem. We discuss the asymptotic properties of the quantile regression estimator, the computational aspects of the associated linear programming problem, and the selection of quantile regressors. We illustrate the practical aspects of measuring risk factors using non-life insurance data. |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:tky:jseres:2008cj203&r=ecm |
By: | Gemechis D. Djira; Frank Schaarschmidt; Bichaka Fayissa |
Abstract: | The location quotient (LQ) is an index frequently used in geography and economics to measure the relative concentration of activities. The quotient is calculated in a variety of ways depending on which group is used as a reference. Here, we focus on simultaneous inference for the ratios of the individual proportions to the overall proportion based on binomial data. This is clearly a multiple comparison problem, and multiplicity-adjusted location quotients have not been addressed up to now. In fact, there is a negative correlation between the comparisons. The quotients can be simultaneously tested against unity, and simultaneous confidence intervals can be constructed for the LQs based on existing probability inequalities and by directly using the asymptotic joint distribution of the associated z-statistics. The proposed inferences are appropriate for analyses based on sample surveys. A real data set is used to demonstrate the application of multiplicity-adjusted LQs. A simulation study is also carried out to assess the performance of the proposed methods in terms of achieving the nominal coverage probability. We observe that the coverage of the simple Bonferroni-adjusted Fieller intervals for LQs is just as good as that of the method which directly takes the correlations into account. |
Keywords: | Location quotients, Fieller's theorem, Multiple comparison. |
JEL: | C12 R11 |
Date: | 2008–09 |
URL: | http://d.repec.org/n?u=RePEc:mts:wpaper:200809&r=ecm |
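In the binomial setting described above, the location quotients take the form (notation mine):

\[
LQ_i = \frac{x_i / n_i}{\sum_j x_j / \sum_j n_j}, \qquad i = 1, \dots, k,
\]

and the simultaneous inference problem is to test \(H_0\!: LQ_i = 1\) for all \(i\), or to build joint confidence intervals, adjusting for multiplicity either through probability inequalities (e.g. Bonferroni combined with Fieller-type intervals) or through the asymptotic joint distribution of the associated z-statistics.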
By: | Gautam Tripathi (University of Connecticut) |
Abstract: | We show how to do efficient moment based inference using the generalized method of moments (GMM) when data is collected by standard stratified sampling and the maintained assumption is that the aggregate shares are known. |
Keywords: | Generalized method of moments, GMM, standard stratified sampling. |
JEL: | C30 |
Date: | 2008–09 |
URL: | http://d.repec.org/n?u=RePEc:uct:uconnp:2008-31&r=ecm |
By: | William Greene |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:ste:nystbu:08-9&r=ecm |
By: | Tatevik Sekhposyan; Barbara Rossi |
Abstract: | We evaluate various models’ relative performance in forecasting future US output growth and inflation on a monthly basis. Our approach takes into account the possibility that the models’ relative performance may vary over time. We show that the models’ relative performance has, in fact, changed dramatically over time, both for revised and real-time data, and investigate possible factors that might explain such changes. In addition, this paper establishes two empirical stylized facts. First, most predictors for output growth lost their predictive ability in the mid-1970s and became essentially useless in the last two decades. Second, when forecasting inflation, fewer predictors are significant (notably capacity utilization and unemployment), and their predictive ability worsened significantly around the time of the Great Moderation. |
Keywords: | Output Forecasts, Inflation Forecasts, Model Selection, Structural Change, Forecast Evaluation, Real-time data. |
JEL: | C22 C52 C53 |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:duk:dukeec:08-5&r=ecm |
By: | Claude Lopez; Christian J. Murray; David H. Papell |
Abstract: | Using median-unbiased estimation, recent research has questioned the validity of Rogoff’s “remarkable consensus” of 3-5 year half-lives of deviations from PPP. These half-life estimates, however, are based on estimates from regressions where the resulting unit root test has low power. We extend median-unbiased estimation to the DF-GLS regression of Elliott, Rothenberg, and Stock (1996). We find that median-unbiased estimation based on this regression has the potential to tighten confidence intervals for half-lives. Using long horizon real exchange rate data, we find that the typical lower bound of the confidence intervals for median-unbiased half-lives is just under 3 years. Thus, while previous confidence intervals for half-lives are consistent with virtually anything, our tighter confidence intervals now rule out economic models with nominal rigidities as candidates for explaining the observed behavior of real exchange rates. Therefore, while we obtain more information using efficient unit root tests on longer term data, this information moves us away from solving the PPP puzzle. |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:cin:ucecwp:2008-05&r=ecm |
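For readers outside this literature, the half-life in question measures the persistence of real exchange rate deviations; in the simplest AR(1) case \(q_t = \alpha q_{t-1} + \varepsilon_t\) (a textbook relation, not a formula quoted from the paper):

\[
HL = \frac{\ln(1/2)}{\ln \alpha},
\]

so the 3-5 year consensus corresponds to roughly \(\alpha \in (0.79,\ 0.87)\) at an annual frequency. Because least-squares estimates of \(\alpha\) are biased downward near the unit root, median-unbiased corrections push \(\alpha\) up and half-lives out, which is why tighter confidence intervals for \(\alpha\) matter so much for the PPP puzzle.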
By: | Mark J. Holmes (Department of Economics, Waikato University Management School); Theodore Panagiotidis (Department of Economics, University of Macedonia); Jesus Otero (Facultad de Economia, Universidad del Rosario) |
Abstract: | In this paper, we test for the stationarity of European Union budget deficits over the period 1971 to 2006, using a panel of thirteen member countries. Our testing strategy addresses two key concerns with regard to unit root panel data testing, namely (i) the presence of cross-sectional dependence among the countries in the panel and (ii) the identification of potential structural breaks that might have occurred at different points in time. To address these concerns, we employ an AR-based bootstrap approach that allows us to test the null hypothesis of joint stationarity with endogenously determined structural breaks. In contrast to the existing literature, we find that the EU countries considered are characterised by fiscal stationarity over the full sample period, irrespective of whether we allow for structural breaks. This conclusion also holds when analysing sub-periods based on the periods before and after the Maastricht treaty. |
Keywords: | Heterogeneous dynamic panels, fiscal sustainability, mean reversion, panel stationarity test. |
JEL: | C33 F32 F41 |
Date: | 2008–09 |
URL: | http://d.repec.org/n?u=RePEc:mcd:mcddps:2008_07&r=ecm |
By: | Carlo Giovanni Camarda; Maria Durban |
Abstract: | Mortality data at the aggregate level are characterized by very large sample sizes, so common goodness-of-fit measures yield uninformative results. In this paper we propose a new measure that allows comparison of different mortality models even for large sample sizes. In particular, we develop a measure that uses a null model specifically designed for mortality data. Several simulation studies and actual applications demonstrate the performance of this new measure, with special emphasis on demographic models and the P-spline approach. |
Keywords: | Goodness of fit, P-splines, R2, Mortality |
Date: | 2009–09 |
URL: | http://d.repec.org/n?u=RePEc:cte:wsrepe:ws083909&r=ecm |
By: | Michael Greenacre |
Abstract: | Subcompositional coherence is a fundamental property of Aitchison’s approach to compositional data analysis, and is the principal justification for using ratios of components. We maintain, however, that lack of subcompositional coherence, that is, incoherence, can be measured in an attempt to evaluate whether any given technique is close enough, for all practical purposes, to being subcompositionally coherent. This opens up the field to alternative methods, which might be better suited to cope with problems such as data zeros and outliers, while being only slightly incoherent. The measure that we propose is based on the distance between components. We show that the two-part subcompositions, which are the most sensitive to subcompositional incoherence, can be used to establish a distance matrix which can be directly compared with the pairwise distances in the full composition. The closeness of these two matrices can be quantified using a stress measure that is common in multidimensional scaling, providing a measure of subcompositional incoherence. Furthermore, we strongly advocate introducing weights into this measure, where rarer components are weighted proportionally less than more abundant components. The approach is illustrated using power-transformed correspondence analysis, which has already been shown to converge to logratio analysis as the power transform tends to zero. |
Keywords: | Chi-square distance, correspondence analysis, logratio distance, multidimensional scaling, stress, subcompositional coherence |
JEL: | C19 C88 |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:upf:upfgen:1106&r=ecm |
By: | Lans, R.J.A. van der; Cote, J.A.; Cole, C.A.; Leong, S.M.; Smidts, A.; Henderson, P.W.; Bluemelhuber, C.; Bottomley, P.A.; Doyle, J.R.; Fedorikhin, A.S.; Janakiraman, M.; Ramaseshan, B.; Schmitt, B.H. (Erasmus Research Institute of Management (ERIM), RSM Erasmus University) |
Abstract: | The universality of design perception and response is tested using data collected from ten countries: Argentina, Australia, China, Germany, Great Britain, India, the Netherlands, Russia, Singapore, and the United States. A Bayesian, finite-mixture, structural-equation model is developed that identifies latent logo clusters while accounting for heterogeneity in evaluations. The concomitant variable approach allows cluster probabilities to be country specific. Rather than a priori defined clusters, our procedure provides a posteriori cross-national logo clusters based on consumer response similarity. To compare the a posteriori cross-national logo clusters, our approach is integrated with Steenkamp and Baumgartner’s (1998) measurement invariance methodology. Our model reduces the ten countries to three cross-national clusters that respond differently to logo design dimensions: the West, Asia, and Russia. The dimensions underlying design are found to be similar across countries, suggesting that elaborateness, naturalness, and harmony are universal design dimensions. Responses (affect, shared meaning, subjective familiarity, and true and false recognition) to logo design dimensions (elaborateness, naturalness, and harmony) and elements (repetition, proportion, and parallelism) are also relatively consistent, although we find minor differences across clusters. Our results suggest that managers can implement a global logo strategy, but they also can optimize logos for specific countries if desired. |
Keywords: | design;logos;international marketing;standardization;adaptation;structural equation models;Gibbs sampling;concomitant variable;Bayesian;mixture models |
Date: | 2008–09–02 |
URL: | http://d.repec.org/n?u=RePEc:dgr:eureri:1765013181&r=ecm |
By: | Gary Koop (University of Strathclyde, UK and The Rimini Centre for Economic Analysis, Italy); Roberto Leon-Gonzalez (National Graduate Institute for Policy Studies, Japan and The Rimini Centre for Economic Analysis, Italy); Rodney W. Strachan (University of Queensland, Australia and The Rimini Centre for Economic Analysis, Italy) |
Abstract: | This paper investigates the evolution of monetary policy in the U.S. using a standard set of macroeconomic variables. Many recent papers have addressed the issue of whether the monetary transmission mechanism has changed (e.g. due to the Fed taking a more aggressive stance against inflation) or whether apparent changes are simply due to changes in the volatility of exogenous shocks. A subsidiary question is whether any such changes have been gradual or abrupt. In this paper, we shed light on these issues using a mixture innovation model which extends the class of time varying Vector Autoregressive models with stochastic volatility which have been used in the past. The advantage of our extension is that it allows us to estimate whether, where, when and how parameter change is occurring (as opposed to assuming a particular form of parameter change). Our empirical results strongly indicate that the transmission mechanism, the volatility of exogenous shocks and the correlations between exogenous shocks are all changing (albeit at different times and to different extents). Furthermore, the evolution of the parameters is gradual. |
Keywords: | structural VAR, monetary policy, Bayesian, mixture innovation model, time varying parameter model |
JEL: | C11 C32 E52 |
Date: | 2008–01 |
URL: | http://d.repec.org/n?u=RePEc:rim:rimwps:24-08&r=ecm |
By: | Schrimpf, Andreas |
Abstract: | This paper examines return predictability when the investor is uncertain about the right state variables. A novel feature of the model averaging approach used in this paper is to account for finite-sample bias of the coefficients in the predictive regressions. Drawing on an extensive international dataset, we find that interest-rate related variables are usually among the most prominent predictive variables, whereas valuation ratios perform rather poorly. Yet, predictability of market excess returns weakens substantially once model uncertainty is accounted for. We document notable differences in the degree of in-sample and out-of-sample predictability across different stock markets. Overall, these findings suggest that return predictability is not a uniform and universal feature across international capital markets. |
Keywords: | Stock Return Predictability, Bayesian Model Averaging, Model Uncertainty, International Stock Markets |
JEL: | E44 G12 G14 G15 |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:zbw:zewdip:7358&r=ecm |
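A sketch of the model averaging machinery in Python using the BIC approximation to marginal likelihoods (an illustration with synthetic data; the paper's priors and its finite-sample bias adjustment for predictive regressions are not reproduced here):

    import numpy as np
    from itertools import combinations

    def bma_weights(y, X):
        # posterior model probabilities over all subsets of predictors,
        # via the BIC approximation to the marginal likelihood
        T, K = X.shape
        models, bics = [], []
        for k in range(K + 1):
            for idx in combinations(range(K), k):
                Xm = np.column_stack([np.ones(T)] + [X[:, j] for j in idx])
                beta, *_ = np.linalg.lstsq(Xm, y, rcond=None)
                rss = ((y - Xm @ beta) ** 2).sum()
                bics.append(T * np.log(rss / T) + Xm.shape[1] * np.log(T))
                models.append(idx)
        b = np.asarray(bics)
        w = np.exp(-0.5 * (b - b.min()))
        return models, w / w.sum()

    rng = np.random.default_rng(4)
    T, K = 200, 4
    X = rng.standard_normal((T, K))
    y = 0.5 * X[:, 0] + rng.standard_normal(T)   # only predictor 0 matters here
    models, w = bma_weights(y, X)
    pip = [sum(wi for m, wi in zip(models, w) if j in m) for j in range(K)]
    print("posterior inclusion probabilities:", np.round(pip, 2))

The posterior inclusion probabilities summarize how much weight each candidate state variable receives once model uncertainty is averaged over, which is the quantity this strand of the return-predictability literature reports.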
By: | Håvard Hungnes (Statistics Norway) |
Abstract: | In a system with n input factors there are n − 1 independent cost shares. An often-used approach in estimating factor demand systems is to (implicitly or explicitly) assume that there is an (independent) cointegrating relationship for each of the n − 1 independent cost shares. However, due to technological changes there might not be as many cointegrating relationships as there are (independent) cost shares. The paper presents a flexible demand system that allows for both factor-neutral technological changes and technological changes that affect the relative use of the different factors. The empirical tests indicate that there are fewer cointegrating relationships than conventional estimation approaches usually imply, a result consistent with technological change. I argue that since such unexplained technological changes are likely to affect input factor decisions, a demand system that allows for them should be preferred. |
Keywords: | Factor demand; technological changes; growth rates |
JEL: | C32 C52 D24 |
Date: | 2008–09 |
URL: | http://d.repec.org/n?u=RePEc:ssb:dispap:556&r=ecm |
By: | Beck, Thorsten |
Abstract: | This paper reviews different econometric methodologies to assess the relationship between financial development and growth. It illustrates the identification problem, which is at the center of the finance and growth literature, using the example of a simple Ordinary Least Squares estimation. It discusses cross-sectional and panel instrumental variable approaches to overcome the identification problem. It presents the time-series approach, which focuses on the forecast capacity of financial development for future growth rates, and differences-in-differences techniques that try to overcome the identification problem by assessing the differential effect of financial sector development across states with different policies or across industries with different needs for external finance. Finally, it discusses firm-level and household approaches that allow analysts to dig deeper into the channels and mechanisms through which financial development enhances growth and welfare, but pose their own methodological challenges. |
Keywords: | Access to Finance, Achieving Shared Growth, Economic Theory & Research, Debt Markets, Statistical & Mathematical Sciences |
Date: | 2008–04–01 |
URL: | http://d.repec.org/n?u=RePEc:wbk:wbrwps:4608&r=ecm |
By: | Kagie, M.; Loos, M. van der; Wezel, M.C. van (Erasmus Research Institute of Management (ERIM), RSM Erasmus University) |
Abstract: | We propose a new hybrid recommender system that combines some advantages of collaborative and content-based recommender systems. While it uses ratings data of all users, as do collaborative recommender systems, it is also able to recommend new items and provide an explanation of its recommendations, as do content-based systems. Our approach is based on the idea that there are communities of users that find the same characteristics important to like or dislike a product. This model is an extension of the probabilistic latent semantic model for collaborative filtering with ideas based on clusterwise linear regression. On a movie data set, we show that the model is competitive to other recommenders and can be used to explain the recommendations to the users. |
Keywords: | algorithms;recommender systems;probabilistic latent semantic analysis;hybrid recommender systems |
Date: | 2008–08–27 |
URL: | http://d.repec.org/n?u=RePEc:dgr:eureri:1765013180&r=ecm |
By: | William H. Greene; David A. Hensher |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:ste:nystbu:08-26&r=ecm |