NEP: New Economics Papers on Econometrics
By: | Loriano Mancini (University of Zurich); Fabio Trojani (University of St-Gallen) |
Abstract: | We propose a general robust semiparametric bootstrap method to estimate conditional predictive distributions of GARCH-type models. Our approach is based on a robust estimator for the parameters of GARCH-type models and a robustified resampling method for standardized GARCH residuals, which controls the bootstrap instability due to influential observations in the tails of the standardized residuals. Monte Carlo simulation shows that our method consistently provides lower VaR forecast errors, often to a large extent, and, in contrast to classical methods, never fails validation tests at usual significance levels. We test our approach extensively in real-data applications to VaR prediction for market risk, and find that only our robust procedure passes all validation tests at usual confidence levels. Moreover, the smaller tail estimation risk of robust VaR forecasts implies VaR prediction intervals that can be nearly 20% narrower and 50% less volatile over time. This is a further desirable property of our method, which allows risky positions to be adapted to VaR limits more smoothly and thus more efficiently. |
Keywords: | Backtesting, M-estimator, Extreme Value Theory, Breakdown Point. |
JEL: | C14 C15 C23 C59 |
Date: | 2005–10 |
URL: | http://d.repec.org/n?u=RePEc:chf:rpseri:rp0731&r=ecm |
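A minimal sketch of the resampling idea described in the entry above, under strong simplifying assumptions: the GARCH(1,1) parameters are treated as known rather than robustly estimated, and the robustification is mimicked by merely winsorizing extreme standardized residuals. This is not the authors' procedure, only an illustration of bootstrapping a one-step-ahead conditional predictive distribution and reading off its VaR.

```python
# Sketch: bootstrap one-step-ahead VaR from standardized GARCH residuals.
# Assumptions (not from the paper): GARCH(1,1) parameters taken as known;
# "robustification" approximated by winsorizing extreme residuals.
import numpy as np

rng = np.random.default_rng(0)

# simulate a GARCH(1,1) return series with heavy-tailed shocks
omega, alpha, beta = 0.05, 0.08, 0.90
T = 2000
r = np.empty(T)
sigma2 = np.empty(T)
sigma2[0] = omega / (1 - alpha - beta)
for t in range(T):
    if t > 0:
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_t(df=6) / np.sqrt(6 / 4)

# standardized residuals (true parameters stand in for robust estimates here)
z = r / np.sqrt(sigma2)
z_w = np.clip(z, np.quantile(z, 0.005), np.quantile(z, 0.995))  # crude robustification

# bootstrap the one-step-ahead predictive distribution and the 1% VaR
sigma2_next = omega + alpha * r[-1] ** 2 + beta * sigma2[-1]
B = 10000
r_next = np.sqrt(sigma2_next) * rng.choice(z_w, size=B, replace=True)
var_1pct = -np.quantile(r_next, 0.01)
print(f"one-step-ahead 1% VaR: {var_1pct:.3f}")
```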
By: | Andersson, Michael K (Monetary Policy Department, Central Bank of Sweden); Karlsson, Sune (Department of Business, Economics, Statistics and Informatics) |
Abstract: | We consider forecast combination and, indirectly, model selection for VAR models when there is uncertainty about which variables to include in the model in addition to the forecast variables. The key difference from traditional Bayesian variable selection is that we also allow for uncertainty regarding which endogenous variables to include in the model. That is, all models include the forecast variables but may otherwise have differing sets of endogenous variables. This is a difficult problem to tackle with a traditional Bayesian approach. Our solution is to focus on the forecasting performance of the variables of interest, and we construct model weights from the predictive likelihood of the forecast variables. The procedure is evaluated in a small simulation study and found to perform competitively in applications to real-world data. |
Keywords: | Bayesian model averaging; Predictive likelihood; GDP forecasts |
JEL: | C11 C15 C32 C52 C53 |
Date: | 2007–11–01 |
URL: | http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0216&r=ecm |
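A minimal sketch of the weighting principle in the entry above: each candidate model is scored by the predictive likelihood it assigns to hold-out observations of the forecast variable, and the scores are normalized into combination weights. The Gaussian predictive densities and the two toy "models" are illustrative assumptions, not the paper's VAR setup.

```python
# Sketch: combine forecasts with weights built from the predictive likelihood
# of the forecast variable over a hold-out window (illustrative Gaussian case).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
y_hold = rng.normal(0.5, 1.0, size=20)   # hold-out realizations of the forecast variable

# each model supplies predictive means and standard deviations for the hold-out window
models = {
    "small VAR": (np.full(20, 0.4), np.full(20, 1.0)),
    "large VAR": (np.full(20, 0.0), np.full(20, 1.5)),
}

log_pl = {name: norm.logpdf(y_hold, mu, sd).sum() for name, (mu, sd) in models.items()}
m = max(log_pl.values())
weights = {k: np.exp(v - m) for k, v in log_pl.items()}   # stabilized exponentiation
total = sum(weights.values())
weights = {k: w / total for k, w in weights.items()}
print(weights)   # predictive-likelihood combination weights
```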
By: | Gregory Connor (The London School of Economics); Matthias Hagmann (Concordia Advisors and Swiss Finance Institute); Oliver Linton (The London School of Economics) |
Abstract: | This paper develops a new estimation procedure for characteristic-based factor models of security returns. We treat the factor model as a weighted additive nonparametric regression model, with the factor returns serving as time-varying weights and a set of univariate nonparametric functions relating security characteristics to the associated factor betas. We use a time-series and cross-sectional pooled weighted additive nonparametric regression methodology to simultaneously estimate the factor returns and characteristic-beta functions. By avoiding the curse of dimensionality, our methodology allows for a larger number of factors than existing semiparametric methods. We apply the technique to the three-factor Fama-French model, Carhart’s four-factor extension of it, which adds a momentum factor, and a five-factor extension adding an own-volatility factor. We find that momentum and own-volatility factors are at least as important as, if not more important than, size and value in explaining equity return comovements. We test the multifactor beta pricing theory against the Capital Asset Pricing Model using a standard test, and against a general alternative using a new nonparametric test. |
Keywords: | Additive Models; Arbitrage pricing theory; Factor model; Fama-French; Kernel estimation; Nonparametric regression; Panel data. |
JEL: | G12 C14 |
Date: | 2007–09 |
URL: | http://d.repec.org/n?u=RePEc:chf:rpseri:rp0726&r=ecm |
By: | Douglas Miller (Department of Economics, University of Missouri-Columbia) |
Abstract: | Conditional Markov chain models of observed aggregate share-type data have been used by economic researchers for several years, but the classes of models commonly used in practice are often criticized as being purely ad hoc because they are not derived from microbehavioral foundations. The primary purpose of this paper is to show that the estimating equations commonly used to estimate these conditional Markov chain models may be derived from the assumed statistical properties of an agent-specific discrete decision process. Thus, any conditional Markov chain model estimated from these estimating equations may be compatible with some underlying agent-specific decision process. The secondary purpose of this paper is to use an information theoretic approach to derive a new class of conditional Markov chain models from this set of estimating equations. The proposed modeling framework is based on the behavioral foundations but does not require specific assumptions about the utility function or other components of the agent-specific discrete decision process. The asymptotic properties of the proposed estimators are developed to facilitate model selection procedures and classical tests of behavioral hypotheses. |
Keywords: | controlled stochastic process, Fréchet derivative, first-order Markov chain, Cressie-Read power divergence criterion |
JEL: | C40 C51 |
Date: | 2007–09–01 |
URL: | http://d.repec.org/n?u=RePEc:umc:wpaper:0718&r=ecm |
By: | Dominique Guégan (Centre d'Economie de la Sorbonne); Zhiping Lu (East China Normal University et Centre d'Economie de la Sorbonne) |
Abstract: | The purpose of this paper is to study the self-similar properties of discrete-time long memory processes. We apply our results to specific processes such as GARMA and GIGARCH processes, heteroscedastic models, and processes with switches and jumps. |
Keywords: | Covariance stationary, Long memory processes, short memory processes, self-similar, asymptotically second-order self-similar, autocorrelation function. |
JEL: | C02 C32 C40 C60 |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:mse:cesdoc:b07055&r=ecm |
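For reference, one standard formalization of the notions used in the entry above (the textbook definition, assumed here rather than quoted from the paper): a covariance-stationary process is exactly second-order self-similar with Hurst parameter H if its autocorrelation function is

\rho(k) = \frac{1}{2}\left[(k+1)^{2H} - 2k^{2H} + (k-1)^{2H}\right], \qquad \tfrac{1}{2} < H < 1,

and asymptotically second-order self-similar if the autocorrelations \rho^{(m)}(k) of its m-aggregated series converge to this \rho(k) as m \to \infty.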
By: | Dominique Guégan (Centre d'Economie de la Sorbonne); Florian Ielpo (Centre d'Economie de la Sorbonne) |
Abstract: | In this paper, we introduce a new approach to estimating the subjective distribution of the future short rate from the historical dynamics of futures, based on a model generated by a Normal Inverse Gaussian distribution with dynamic parameters. The model displays time-varying conditional volatility, skewness and kurtosis and provides a flexible framework for recovering the conditional distribution of future rates. For estimation, we use the maximum likelihood method. We then apply the model to Fed Funds futures and discuss its performance. |
Keywords: | Subjective distribution, autoregressive conditional density, generalized hyperbolic distribution, Fed Funds futures contracts. |
JEL: | C51 E44 |
Date: | 2007–10 |
URL: | http://d.repec.org/n?u=RePEc:mse:cesdoc:b07056&r=ecm |
By: | J. Hirschberg; J. Lye |
Abstract: | Regression specifications in applied econometrics frequently employ regressors defined as the product of two other regressors to form an interaction. Unfortunately, the interpretation of the results of these models is not as straightforward as in the linear case. In this paper, we present a method for drawing inferences from interaction models by defining the partial influence function. We present an example that demonstrates how one may draw new inferences by constructing confidence intervals for the partial influence functions based on traditional published findings for regressions with interaction terms. |
Keywords: | Interaction effects; dummy variables; linear transformation; Fieller method |
JEL: | C12 C51 |
Date: | 2007 |
URL: | http://d.repec.org/n?u=RePEc:mlb:wpaper:1015&r=ecm |
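A minimal numerical sketch of the inference problem described in the entry above: in y = b0 + b1 x1 + b2 x2 + b3 x1 x2 + e, the effect of x1 is b1 + b3 x2, and its confidence interval must combine the sampling covariances of b1 and b3. The delta-method interval below is the usual benchmark to which a Fieller-type interval would be compared; the data and the evaluation point are hypothetical.

```python
# Sketch: marginal effect of x1 in a model with an interaction term,
# with a delta-method confidence interval (hypothetical data).
import numpy as np

rng = np.random.default_rng(2)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 0.5 * x1 - 0.3 * x2 + 0.8 * x1 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
cov = np.linalg.inv(X.T @ X) * resid.var(ddof=X.shape[1])   # OLS covariance estimate

x2_eval = 1.0                                 # evaluate the effect of x1 at x2 = 1
effect = beta[1] + beta[3] * x2_eval          # dE[y]/dx1 = b1 + b3 * x2
se = np.sqrt(cov[1, 1] + x2_eval**2 * cov[3, 3] + 2 * x2_eval * cov[1, 3])
print(f"effect {effect:.3f}, 95% CI [{effect - 1.96*se:.3f}, {effect + 1.96*se:.3f}]")
```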
By: | Dominique Guégan (Centre d'Economie de la Sorbonne) |
Abstract: | In this paper we deal with the problem of non-stationarity encountered in many data sets, arising from the existence of multiple seasonalities, jumps, volatility, distortion, aggregation, etc. We study the problems that these non-stationarities cause for the estimation of the sample autocorrelation function and give several examples of models for which spurious behaviour is created in this way, namely Markov-switching processes, Stopbreak models and SETAR processes. New strategies are then suggested to study these data sets locally. We first propose a test based on the k-th cumulants and, above all, the construction of a meta-distribution for the data set based on copulas, which makes it possible to take all the non-stationarities into account. This approach suggests that we can carry out risk management for portfolios containing non-stationary assets and also obtain the distribution function of some specific models. |
Keywords: | Non-stationarity, distribution function, copula, long-memory, switching, SETAR, Stopbreak models, cumulants, estimation. |
JEL: | C32 C51 G12 |
Date: | 2007–04 |
URL: | http://d.repec.org/n?u=RePEc:mse:cesdoc:b07053&r=ecm |
By: | Xu, Pai; Shneyerov, Artyom; Marmer, Vadim |
Abstract: | We develop a nonparametric approach that allows one to discriminate among alternative models of entry in first-price auctions. Three models of entry are considered: Levin and Smith (1994), Samuelson (1985), and a new model in which the information received at the entry stage is imperfectly correlated with valuations. We derive testable restrictions that these three models impose on the quantiles of active bidders' valuations, and develop nonparametric tests of these restrictions. We implement the tests on a dataset of highway procurement auctions in Oklahoma. Depending on the project size, we find no support for the Samuelson model, some support for the Levin and Smith model, and somewhat more support for the new model. |
JEL: | C12 C14 D44 |
Date: | 2007–11–22 |
URL: | http://d.repec.org/n?u=RePEc:ubc:pmicro:marmer-07-11-22-02-26-44&r=ecm |
By: | Eduardo José Araújo Lima; Benjamin Miranda Tabak |
Abstract: | This paper compares different versions of the multiple variance ratio test based on bootstrap techniques for the construction of empirical distributions. It also analyzes the crucial issue of selecting the optimal block size when block bootstrap procedures are used, applying the methods developed by Hall et al. (1995) and by Politis and White (2004). Comparing the results of the different methods using Monte Carlo simulations, we conclude that methodologies based on block bootstrap methods perform better in constructing the empirical distribution of the variance ratio test. Moreover, the results are highly sensitive to the method employed to test the random walk null hypothesis. |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:bcb:wpaper:151&r=ecm |
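A minimal sketch of the two ingredients compared in the entry above: the variance ratio statistic VR(q) and a moving-block bootstrap of returns used to approximate its distribution under the random walk null. The fixed block length and the simple way the p-value is formed are illustrative assumptions; the cited methods choose the block size data-dependently.

```python
# Sketch: variance ratio statistic and a moving-block bootstrap null distribution
# (illustrative block length and p-value construction).
import numpy as np

def variance_ratio(r, q):
    rq = np.convolve(r, np.ones(q), mode="valid")      # overlapping q-period returns
    return rq.var(ddof=1) / (q * r.var(ddof=1))

def block_bootstrap(r, block, rng):
    starts = rng.integers(0, len(r) - block + 1, size=len(r) // block + 1)
    return np.concatenate([r[s:s + block] for s in starts])[:len(r)]

rng = np.random.default_rng(3)
r = rng.normal(0, 1, 1000)                             # returns under a random walk null
q, block, B = 5, 20, 2000
vr_obs = variance_ratio(r, q)
vr_boot = np.array([variance_ratio(block_bootstrap(r, block, rng), q) for _ in range(B)])
p_value = np.mean(np.abs(vr_boot - 1) >= np.abs(vr_obs - 1))
print(f"VR({q}) = {vr_obs:.3f}, bootstrap p-value = {p_value:.3f}")
```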
By: | Ricardo Schechtman |
Abstract: | The Basel Committee on Banking Supervision recognizes that one of the greatest technical challenges to the implementation of the new Basel II Accord lies in the validation of banks’ internal credit rating models (CRMs). This study investigates new proposals of statistical tests for validating the PDs (probabilities of default) of CRMs. It distinguishes between proposals aimed at checking calibration and those focused on discriminatory power. The proposed tests recognize the existence of default correlation, deal jointly with the default behaviour of all the ratings and, in contrast to the previous literature, control the error of validating incorrect CRMs. Power sensitivity analysis and strategies for power improvement are discussed, providing insights on the trade-offs and limitations pertaining to the calibration tests. An alternative goal is proposed for the tests of discriminatory power, and results of power dominance are shown for them, with direct practical consequences. Finally, as the proposed tests are asymptotic, Monte Carlo simulations investigate the small-sample bias for varying parameter scenarios. |
Date: | 2007–10 |
URL: | http://d.repec.org/n?u=RePEc:bcb:wpaper:149&r=ecm |
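A minimal sketch, in the spirit but not the detail of the tests proposed in the entry above, of a single-rating calibration check that acknowledges default correlation: under a one-factor Gaussian model, the observed default rate is compared with a quantile of the Vasicek large-portfolio distribution implied by the forecast PD and an assumed asset correlation. All numbers are illustrative.

```python
# Sketch: one-sided calibration check of a forecast PD against an observed
# default rate, using the one-factor (Vasicek) large-portfolio distribution.
# The asset correlation rho and all figures are illustrative assumptions.
from scipy.stats import norm

def vasicek_quantile(pd_forecast, rho, alpha):
    """alpha-quantile of the large-portfolio default-rate distribution."""
    return norm.cdf((norm.ppf(pd_forecast) + rho**0.5 * norm.ppf(alpha)) / (1 - rho)**0.5)

pd_forecast, rho = 0.02, 0.12
observed_default_rate = 0.035
critical = vasicek_quantile(pd_forecast, rho, 0.95)   # 95% critical default rate
print(f"critical default rate {critical:.4f}; "
      f"reject calibration: {observed_default_rate > critical}")
```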
By: | Frisén, Marianne (Statistical Research Unit, Department of Economics, School of Business, Economics and Law, Göteborg University) |
Abstract: | Multivariate surveillance is of interest in industrial production as it enables the monitoring of several components. Recently there has been an increased interest also in other areas such as detection of bioterrorism, spatial surveillance and transaction strategies in finance. Several types of multivariate counterparts to the univariate Shewhart, EWMA and CUSUM methods have been proposed. Here a review of general approaches to multivariate surveillance is given with respect to how suggested methods relate to general statistical inference principles. Suggestions are made on the special challenges of evaluating multivariate surveillance methods. |
Keywords: | Surveillance; Multivariate |
JEL: | C10 |
Date: | 2007–01–01 |
URL: | http://d.repec.org/n?u=RePEc:hhs:gunsru:2007_004&r=ecm |
By: | Andersson, Eva (Statistical Research Unit, Department of Economics, School of Business, Economics and Law, Göteborg University) |
Abstract: | In many situations we need a system for detecting changes early. Examples are early detection of disease outbreaks, of patients at risk and of financial instability. In influenza outbreaks, for example, we want to detect an increase in the number of cases. Important indicators might be the number of cases of influenza-like illness and pharmacy sales (e.g. aspirin). By continually monitoring these indicators, we can detect a change in the process of interest early. The methodology of statistical surveillance is used. Often, the conclusions about the process(es) of interest are improved if the surveillance is based on several indicators. Here three systems for multivariate surveillance are compared. One system, called LRpar, is based on parallel likelihood ratio methods, since the likelihood ratio has been shown to have several optimality properties. In LRpar, the marginal density of each indicator is monitored and an alarm is called as soon as one of the likelihood ratios exceeds its alarm limit. LRpar is compared to an optimal alarm system, called LRjoint, which is derived from the full likelihood ratio for the joint density. The performances of LRpar and LRjoint are compared to a system in which Hotelling's T2 is monitored. The evaluation is made using the delay of a motivated alarm as a function of the times of the changes. The effect of dependency is investigated: both dependency between the monitored processes and correlation between the time points at which the changes occur. When the first change occurs immediately, the three methods work rather similarly, for independent processes and zero correlation between the change times. But when all processes change later, the T2 has a much longer delay than LRjoint and LRpar. This holds both when the processes are independent and when they have a positive covariance. When we assume a positive correlation between the change times, LRjoint yields a shorter delay than LRpar when the changes actually occur simultaneously, whereas the opposite is true when the changes occur at different time points. |
Keywords: | Multivariate; Surveillance; Dependency; Optimal; Covariance; Likelihood ratio |
JEL: | C10 |
Date: | 2007–01–01 |
URL: | http://d.repec.org/n?u=RePEc:hhs:gunsru:2007_001&r=ecm |
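A minimal sketch of the simplest of the three systems compared in the entry above: monitoring Hotelling's T2 for a bivariate process against a fixed alarm limit. The in-control parameters, the size and timing of the shift, and the alarm probability are illustrative assumptions.

```python
# Sketch: on-line monitoring of a bivariate process with Hotelling's T2
# (illustrative baseline, shift and alarm limit).
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(4)
mu0 = np.zeros(2)
Sigma0 = np.array([[1.0, 0.4], [0.4, 1.0]])
Sigma0_inv = np.linalg.inv(Sigma0)
limit = chi2.ppf(0.999, df=2)            # alarm limit under the in-control distribution

change_time = 30                         # both processes shift upwards at this time
for t in range(1, 101):
    shift = np.array([1.0, 1.0]) if t >= change_time else np.zeros(2)
    x = rng.multivariate_normal(mu0 + shift, Sigma0)
    T2 = (x - mu0) @ Sigma0_inv @ (x - mu0)
    if T2 > limit:
        print(f"alarm at t = {t}, T2 = {T2:.2f}")
        break
```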
By: | D.S.G. Pollock |
Abstract: | Methods are described that are available for extracting the trend from an economic data sequence and for isolating the cycles that might surround it. The latter often consist of a business cycle of variable duration and a perennial seasonal cycle. There is no evident distinction that can serve unequivocally to determine a point in the frequency spectrum where the trend ends and the business cycle begins. Unless it can be represented by a simple analytic function, such as an exponential growth path, there is bound to be a degree of arbitrariness in the definition of the trend. The business cycle, however defined, is liable to have an upper limit to its frequency range that falls short of the Nyquist frequency, which is the maximum observable frequency in sampled data. Therefore, if it is required to fit a parametric model to the business cycle, this ought to describe a band-limited process. The appropriate method for estimating a band-limited process is described for the case where the band includes the zero frequency. |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:lec:leecon:07/17&r=ecm |
By: | Gregor W. Smith (Queen's University); James Yetman (University of Hong Kong) |
Abstract: | Dynamic Euler equations restrict multivariate forecasts. Thus a range of links between macroeconomic variables can be studied by seeing whether they hold within the multivariate predictions of professional forecasters. We illustrate this novel way of testing theory by studying the links between forecasts of U.S. nominal interest rates, inflation, and real consumption growth since 1981. Because we use forecast data for both returns and macroeconomic fundamentals, we can use the complete cross-section of forecasts rather than the median. The Survey of Professional Forecasters yields a three-dimensional panel, across quarters, forecasters, and forecast horizons. This approach yields 14,727 observations, far more than the 107 time-series observations. The resulting precision reveals a significant, negative relationship between consumption growth and interest rates. |
Keywords: | forecast survey, asset pricing, Fisher effect |
JEL: | E17 E21 E43 |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:qed:wpaper:1144&r=ecm |
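For concreteness, a standard consumption Euler equation of the type that restricts joint forecasts of nominal interest rates, inflation and consumption growth (CRRA utility is an assumption made here only for illustration; the paper's exact specification may differ):

1 = E_t\!\left[\beta \left(\frac{C_{t+1}}{C_t}\right)^{-\gamma} \frac{1+i_t}{1+\pi_{t+1}}\right],

whose log-linear form, up to constant and variance terms, is i_t \approx -\ln\beta + \gamma\, E_t \Delta \ln C_{t+1} + E_t \pi_{t+1}, so survey forecasts of the three variables can be checked against this restriction.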
By: | Dimitrios Thomakos; Tao Wang |
Abstract: | We examine the `relative optimality' of sign predictions for financial returns, extending the work of Christoffersen and Diebold (2006) on volatility dynamics and sign predictability. We show that there is a more general decomposition of financial returns than that implied by the sign decomposition and which depends on the choice of the threshold that defines direction. We then show that the choice of the threshold matters and that a threshold of zero (leading to sign predictions) is not necessarily `optimal'. We provide explicit conditions that allow for the choice of a threshold that has maximum responsiveness to changes in volatility dynamics and thus leads to `optimal' probabilistic predictions. Finally, we connect the evolution of volatility to probabilistic predictions and show that the volatility ratio is the crucial variable in this context. Our work strengthens the arguments in favor of accurate volatility measurement and prediction, as volatility dynamics are integrated into the `optimal' threshold. We provide an empirical illustration of our findings using monthly returns and realized volatility for the S&P500 index. |
Date: | 2007 |
URL: | http://d.repec.org/n?u=RePEc:uop:wpaper:0006&r=ecm |
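The key object in the entry above can be written explicitly under a conditional location-scale assumption (used here only for illustration): the probability of exceeding a threshold c is

\Pr\left(r_{t+1} > c \mid \Omega_t\right) = 1 - F\!\left(\frac{c - \mu_{t+1}}{\sigma_{t+1}}\right),

which responds to movements in the conditional volatility \sigma_{t+1} whenever c \neq \mu_{t+1}; setting c = 0 gives the sign-prediction case.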
By: | Till Dannewald; Henning Kreis; Nadja Silberhorn |
Abstract: | Traditional choice models assume that observable behavior results from an unspecified evaluation process of the observed individual. When it comes to revealing this process, mere choice models quickly reach their limits, as psychological factors (e.g., consumers’ perception of or attitudes towards products) are not directly measurable variables and therefore cannot readily be integrated into the model structure. The causal-analytic approach offers the possibility of specifying factors that are not directly measurable as latent variables, and can thus reasonably supplement choice models. So far, methodological approaches investigating latent variables and traditional choice models have been perceived and applied independently of one another. In this paper the possibilities of integrating latent variables into traditional choice models are pointed out, and an introduction to the modeling of hybrid choice models is provided. Furthermore, potential areas of application in marketing research are outlined. |
Keywords: | Hybrid choice model, latent variables, causal model, choice model |
JEL: | M30 C51 C10 |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2007-062&r=ecm |
By: | Frisén, Marianne (Statistical Research Unit, Department of Economics, School of Business, Economics and Law, Göteborg University) |
Abstract: | The aim of sequential surveillance is on-line detection of an important change in an underlying process as soon as possible after the change has occurred. Statistical methods suitable for surveillance differ from hypothesis testing methods. In addition, the criteria for optimality differ from those used in hypothesis testing. The need for sequential surveillance in industry, economics, medicine and for environmental purposes is described. Even though the methods have been developed under different scientific cultures, inferential similarities can be identified. Applications contain complexities such as autocorrelations, complex distributions, complex types of changes, and spatial as well as other multivariate settings. Approaches to handling these complexities are discussed. Expressing methods for surveillance through likelihood functions makes it possible to link the methods to various optimality criteria. This approach also facilitates the choice of an optimal surveillance method for each specific application and provides some directions for improving earlier suggested methods. |
Keywords: | Change point; Likelihood ratio; Monitoring; Multivariate surveillance; Minimum expected delay; Online detection |
JEL: | C10 |
Date: | 2007–01–01 |
URL: | http://d.repec.org/n?u=RePEc:hhs:gunsru:2007_002&r=ecm |
By: | Paul Levine (University of Surrey); Joseph Pearlman (London Metropolitan University); George Perendia (London Metropolitan University) |
Abstract: | Most DSGE models and methods make inappropriate asymmetric information assumptions. They assume that all economic agents have full access to measurement of all variables and past shocks, whereas the econometricians have no such access. An alternative assumption is that there is symmetry, in that the information set available to both agents and econometricians is incomplete. The reality lies somewhere between the two, because agents are likely to be subject to idiosyncratic shocks which they can observe, but are unable to observe other agents’ idiosyncratic shocks, as well as being unable to observe certain economy-wide shocks; however, such assumptions generally lead to models that have no closed-form solution. This research compares the two alternatives - the asymmetric case, as commonly used in the literature, and the symmetric case, which uses the partial information solution of Pearlman et al. (1986) - using standard EU datasets. We use Bayesian MCMC methods, with log-likelihoods accounting for partial information. The work then extends the data to allow for a greater variety of measurements and evaluates the effect on estimates, along the lines of work by Boivin and Giannoni (2005). |
Keywords: | partial information, DSGE models, Bayesian maximum likelihood |
JEL: | C11 C13 D58 D82 |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:sur:surrec:1607&r=ecm |
By: | Bock, David (Statistical Research Unit, Department of Economics, School of Business, Economics and Law, Göteborg University) |
Abstract: | In systems for on-line detection of regime shifts, a process is continually observed. Based on the data available, an alarm is given when there is enough evidence of a change. There is a risk of a false alarm, and here two different ways of controlling the false alarms are compared: a fixed average run length until the first false alarm and a fixed probability of any false alarm (fixed size). The two approaches are evaluated in terms of the timeliness of alarms. A system with a fixed size is found to have a drawback: the ability to detect a change deteriorates with the time of the change. Consequently, the probability of successful detection tends to zero and the expected delay of a motivated alarm tends to infinity. This drawback is present even when the size is set to be very large (close to 1). Utility measures expressing the costs of a false or a too-late alarm are used in the comparison. It is demonstrated how the choice of the best approach can be guided by the parameters of the process and the different costs of alarms. The technique is illustrated with financial transaction data on the Hang Seng Index. |
Keywords: | Monitoring; Surveillance; Repeated decisions; Moving average; Shewhart method |
JEL: | C10 |
Date: | 2007–01–01 |
URL: | http://d.repec.org/n?u=RePEc:hhs:gunsru:2007_003&r=ecm |
By: | Holzmann, Hajo; Vollmer, Sebastian; Weisbrod, Julian |
Abstract: | This paper contributes to the growing debate concerning the world distribution of income and its evolution over the past three to four decades. Our methodological approach is twofold. First, we formally test for the number of modes in a cross-sectional analysis in which each country is represented by one observation. We contribute to existing studies with technical improvements to the testing procedure, enabling us to draw new conclusions, and with an extension of the time horizon analyzed. Second, we estimate a global distribution of income from national log-normal distributions of income, as well as a global distribution of log-income as a mixture of national normal distributions of log-income. From this distribution we obtain measures of global inequality and poverty as well as global growth incidence curves. |
Keywords: | Convergence, Silverman's test, non-parametric statistics, bimodal, global income distribution, poverty, inequality, growth incidence curves |
JEL: | C5 F0 I3 O0 |
Date: | 2007 |
URL: | http://d.repec.org/n?u=RePEc:zbw:gdec07:6555&r=ecm |
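A minimal sketch of the second step described in the entry above: build a global distribution of log-income as a population-weighted mixture of national normal densities and read off a poverty headcount. The country figures and the poverty line are hypothetical.

```python
# Sketch: global distribution of log-income as a population-weighted mixture of
# national normal distributions (hypothetical countries and parameters).
import numpy as np
from scipy.stats import norm

countries = [                         # (population share, mean log-income, sd of log-income)
    (0.2, np.log(30000), 0.6),
    (0.5, np.log(4000), 0.8),
    (0.3, np.log(1200), 0.7),
]

poverty_line = np.log(1.25 * 365)     # illustrative annual poverty line, in log terms

# the global CDF of log-income is the population-weighted average of national CDFs
headcount = sum(w * norm.cdf(poverty_line, mu, sd) for w, mu, sd in countries)
print(f"global poverty headcount ratio: {headcount:.3f}")

# the global density can be evaluated the same way, e.g. to inspect modality
grid = np.linspace(4, 12, 400)
density = sum(w * norm.pdf(grid, mu, sd) for w, mu, sd in countries)
```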
By: | Dominique Guégan (Centre d'Economie de la Sorbonne) |
Abstract: | In this article, we describe the different approaches followed by economists and financial economists in using chaos theory. We explain the main differences between the use of this theory in economics and finance and its use in other research domains such as mathematics and physics. Finally, we present the tools necessary for economists and financial economists to explore this domain empirically. |
Keywords: | Chaos theory, attractor, Economy, Finance, estimation theory, forecasting. |
JEL: | C51 |
Date: | 2007–06 |
URL: | http://d.repec.org/n?u=RePEc:mse:cesdoc:b07054&r=ecm |
By: | Ricardo Mora; Javier Ruiz-Castillo |
Abstract: | This paper explores the statistical properties of an index of multigroup segregation based on the entropy concept, the Mutual Information (M) index. In the context of school segregation by ethnic group, the paper establishes that (i) the M index is a monotonic transformation of the likelihood-ratio test for the independence of school and ethnic status, while (ii) the within-group term of the M index for any partition of the set of schools (or ethnic groups) is a monotonic transformation of the likelihood-ratio test for the independence of schools and ethnic groups within the partition in question. This last result is applied to study whether the level of segregation differs significantly within any number of cities, countries or time periods. It is also shown how statistical tests for pairwise comparisons of segregation levels between schools, school districts, ethnic groups, supergroups, cities, countries or time periods can be performed. |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:cte:werepe:we077443&r=ecm |
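A minimal sketch of the link stated in the entry above: the M index is the mutual information between school and ethnic group computed from the cross-tabulated counts, and 2N·M (with M in natural logarithms) equals the likelihood-ratio (G) statistic for independence. The 3×3 count table is hypothetical.

```python
# Sketch: Mutual Information segregation index M from a school-by-ethnic-group
# count table, and its link to the likelihood-ratio test of independence.
import numpy as np
from scipy.stats import chi2

counts = np.array([[120, 30, 10],     # hypothetical schools (rows) x ethnic groups (cols)
                   [40, 80, 60],
                   [20, 25, 115]])
N = counts.sum()
p = counts / N
p_school = p.sum(axis=1, keepdims=True)
p_group = p.sum(axis=0, keepdims=True)

mask = p > 0
M = np.sum(p[mask] * np.log(p[mask] / (p_school @ p_group)[mask]))   # mutual information (nats)

G = 2 * N * M                                                        # likelihood-ratio statistic
df = (counts.shape[0] - 1) * (counts.shape[1] - 1)
print(f"M = {M:.4f}, G = {G:.2f}, p-value = {chi2.sf(G, df):.4g}")
```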
By: | Juan Manuel Julio |
Abstract: | Our current implementation of the Fan Chart follows the original Britton, Fisher and Whitley [2] and Blix and Sellin [1] proposal, in which the inputs enter at the fourth and ninth quarters and are distributed within the forecasting horizon according to pre-established weights. This procedure does not allow enough flexibility to control the shape of the distribution at the shorter end of the horizon when more reliable information is available. On the other hand, no published material presents the details of the Fan Chart computation. In this note all the technical details of the new implementation are described. This implementation provides more flexibility than the previous one since it permits the inputs to be entered on a quarterly basis instead of on a yearly basis. A Visual Basic for Excel program is available. |
Date: | 2007–11–19 |
URL: | http://d.repec.org/n?u=RePEc:col:000094:004294&r=ecm |
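A minimal sketch of the distributional building block commonly used for such fan charts: at each forecast horizon the forecast density is a two-piece normal, and the fan bands are its quantiles. The mode, uncertainty and skew inputs below are hypothetical, and the mapping from quarterly inputs to parameters is not the one documented in the note.

```python
# Sketch: fan-chart bands from a two-piece normal forecast density per quarter
# (hypothetical mode, downside and upside uncertainty inputs).
import numpy as np
from scipy.stats import norm

def two_piece_quantile(p, mode, s1, s2):
    """Quantile of a two-piece normal with lower sd s1 and upper sd s2."""
    p_mode = s1 / (s1 + s2)                       # probability mass below the mode
    if p <= p_mode:
        return mode + s1 * norm.ppf(p * (s1 + s2) / (2 * s1))
    return mode + s2 * norm.ppf(1 - (1 - p) * (s1 + s2) / (2 * s2))

horizons = range(1, 9)                            # quarters ahead
mode = [4.0 + 0.1 * h for h in horizons]          # central (mode) inflation path
s1 = [0.2 + 0.10 * h for h in horizons]           # downside uncertainty
s2 = [0.3 + 0.12 * h for h in horizons]           # upside uncertainty (skewed upwards)

for h, m, a, b in zip(horizons, mode, s1, s2):
    bands = [two_piece_quantile(p, m, a, b) for p in (0.05, 0.25, 0.75, 0.95)]
    print(h, [round(x, 2) for x in bands])
```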
By: | Barnett, William A.; Duzhak, Evgeniya A. |
Abstract: | Grandmont (1985) found that the parameter space of the most classical dynamic models is stratified into an infinite number of subsets supporting an infinite number of different kinds of dynamics, from monotonic stability at one extreme to chaos at the other, with many forms of multiperiodic dynamics in between. The econometric implications of Grandmont’s findings are particularly important if bifurcation boundaries cross the confidence regions surrounding parameter estimates in policy-relevant models. Stratification of a confidence region into bifurcated subsets seriously damages the robustness of dynamical inferences. Recently, interest in policy in some circles has moved to New Keynesian models. As a result, in this paper we explore bifurcation within the class of New Keynesian models. We develop the econometric theory needed to locate bifurcation boundaries in log-linearized New Keynesian models with Taylor policy rules or inflation-targeting policy rules. Central results needed in this research are our theorems on the existence and location of Hopf bifurcation boundaries in each of the cases that we consider. |
Keywords: | Bifurcation; Hopf bifurcation; Euler equations; New Keynesian macroeconometrics; Bergstrom-Wymer model |
JEL: | C3 C5 E3 |
Date: | 2007–11–27 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:6005&r=ecm |
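For reference, the textbook existence condition behind Hopf-type bifurcations in discrete-time log-linearized models x_{t+1} = A(\theta) x_t (stated here as background, not as the paper's theorem): a Neimark-Sacker (discrete-time Hopf) bifurcation occurs at \theta^* when a pair of complex-conjugate eigenvalues of A(\theta) crosses the unit circle,

\lambda_{1,2}(\theta^*) = \alpha(\theta^*) \pm i\,\beta(\theta^*), \qquad \left|\lambda_{1,2}(\theta^*)\right| = 1, \qquad \frac{d}{d\theta}\left|\lambda_{1,2}(\theta)\right|\Big|_{\theta=\theta^*} \neq 0,

with \lambda_{1,2}(\theta^*) not a low-order root of unity; the bifurcation boundary is the set of parameter values satisfying these conditions.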