nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒06‒21
twenty-two papers chosen by
Sune Karlsson
Örebro University

  1. Likelihood-based Analysis for Dynamic Factor Models By Borus Jungbacker; Siem Jan Koopman
  2. Garch Parameter Estimation Using High-Frequency Data By Visser, Marcel P.
  3. Parameter Driven Multi-state Duration Models: Simulated vs. Approximate Maximum Likelihood Estimation By André A. Monteiro
  4. Forecasting Random Walks Under Drift Instability By Pesaran, M.H.; Pick, A.
  5. An Hourly Periodic State Space Model for Modelling French National Electricity Load By V. Dordonnat; S.J. Koopman; M. Ooms; A. Dessertaine; J. Collet
  6. MDL Mean Function Selection in Semiparametric Kernel Regression Models By Jan G. De Gooijer; Ao Yuan
  7. Speed of Adjustment in Cointegrated Systems By Fanelli, Luca; Paruolo, Paolo
  8. Confidence sets based on penalized maximum likelihood estimators By Pötscher, Benedikt M.; Schneider, Ulrike
  9. Testing for Granger (non)-Causality in a Time Varying Coefficient VAR Model By Dimitris K. Christopoulos; Miguel Leon-Ledesma
  10. "Bayesian Estimation of Demand Functions under Block Rate Pricing" By Koji Miyawaki; Yasuihro Omori; Akira Hibiki
  11. Instrumental Variable Estimation for Duration Data By Govert E. Bijwaard
  12. Model-based Estimation of High Frequency Jump Diffusions with Microstructure Noise and Stochastic Volatility By Charles S. Bos
  13. Early Detection Techniques for Market Risk Failure By Jose Olmo; William Pouliot
  14. Optimal Asset Allocation with Factor Models for Large Portfolios By Pesaran, M.H.; Zaffaroni, P.
  15. A New Procedure to Monitor the Mean of a Quality Characteristic By Kiani, Mehdi; Panaretos, John; Psarakis, Stelios
  16. Old and new spectral techniques for economic time series By Sella Lisa
  17. Short-term forecasting of GDP using large monthly datasets - a pseudo real-time forecast evaluation exercise By Karim Barhoumi; Szilard Benk; Riccardo Cristadoro; Ard Den Reijer; Audrone Jakaitiene; Piotr Jelonek; António Rua; Gerhard Rünstler; Karsten Ruth; Christophe Van Nieuwenhuyze
  18. Analysis of the dependence structure in econometric time series By Aurélien Hazan; Vincent Vigneron
  19. Incorporating judgement with DSGE models By Jaromír Beneš; Andrew Binning; Kirdan Lees
  20. Rational Forecasts or Social Opinion Dynamics? Identification of Interaction Effects in a Business Climate Survey By Thomas Lux
  21. The Information Content of a Stated Choice Experiment By Jan Rouwendal; Arianne de Blaeij; Piet Rietveld; Erik Verhoef
  22. Bad Luck When Joining the Shortest Queue By Blanc, J.P.C.

  1. By: Borus Jungbacker (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam)
    Abstract: We present new results for the likelihood-based analysis of the dynamic factor model that possibly includes intercepts and explanatory variables. The latent factors are modelled by stochastic processes. The idiosyncratic disturbances are specified as autoregressive processes with mutually correlated innovations. The new results lead to computationally efficient procedures for the estimation of the factors and parameter estimation by maximum likelihood and Bayesian methods. An illustration is provided for the analysis of a large panel of macroeconomic time series.
    Keywords: EM algorithm; Kalman Filter; Forecasting; Latent Factors; Markov chain Monte Carlo; Principal Components; State Space
    JEL: C33 C43
    Date: 2008–01–17
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20080007&r=ecm
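    A quick illustration of the benchmark the paper improves on: extracting latent factors from a large panel by principal components (one of the listed keywords). This is a simulated toy, not the authors' likelihood-based state-space procedure:
      # Toy principal-components extraction of latent factors from a T x N panel.
      # PCA benchmark only, not the paper's likelihood-based estimator; all data
      # are simulated.
      import numpy as np

      rng = np.random.default_rng(0)
      T, N, r = 200, 50, 2
      F = rng.standard_normal((T, r))                  # latent factors
      L = rng.standard_normal((N, r))                  # loadings
      X = F @ L.T + 0.5 * rng.standard_normal((T, N))  # observed panel

      Xs = (X - X.mean(0)) / X.std(0)                  # standardize each series
      eigval, eigvec = np.linalg.eigh(Xs.T @ Xs / T)
      F_hat = Xs @ eigvec[:, np.argsort(eigval)[::-1][:r]]  # factors, up to rotation
      print(F_hat.shape)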
  2. By: Visser, Marcel P.
    Abstract: Estimation of the parameters of Garch models for financial data is typically based on daily close-to-close returns. This paper shows that the efficiency of the parameter estimators may be greatly improved by using volatility proxies based on intraday data. The paper develops a Garch quasi maximum likelihood estimator (QMLE) based on these proxies. Examples of such proxies are the realized volatility and the intraday high-low range. Empirical analysis of the S&P 500 index tick data shows that the use of a suitable proxy may reduce the variances of the estimators of the Garch autoregression parameters by a factor of 20.
    Keywords: volatility estimation; quasi maximum likelihood; volatility proxy; Gaussian QMLE; log-Gaussian QMLE; autoregressive conditional heteroscedasticity
    JEL: C51 G1 C14 C22
    Date: 2008–06–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:9076&r=ecm
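    A stylized version of the proxy idea above: a Gaussian QMLE for GARCH(1,1) in which the squared daily return is replaced by an intraday volatility proxy H_t in both the recursion and the quasi-likelihood. The simulated proxies and the exact form of the objective are illustrative assumptions, not Visser's estimator:
      import numpy as np
      from scipy.optimize import minimize

      def qmle_negloglik(params, H):
          omega, alpha, beta = params
          if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
              return np.inf                   # keep the recursion stationary
          s2 = np.empty_like(H)
          s2[0] = H.mean()                    # start at the unconditional level
          for t in range(1, len(H)):
              s2[t] = omega + alpha * H[t - 1] + beta * s2[t - 1]
          return 0.5 * np.sum(np.log(s2) + H / s2)

      rng = np.random.default_rng(1)
      H = 1e-4 * (1 + 0.5 * rng.standard_normal(1000) ** 2)  # stand-in proxies

      res = minimize(qmle_negloglik, x0=[1e-5, 0.1, 0.8], args=(H,),
                     method="Nelder-Mead")
      print(res.x)                            # (omega, alpha, beta)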
  3. By: André A. Monteiro (VU University Amsterdam, and University of Western Australia)
    Abstract: Likelihood based inference for multi-state latent factor intensity models is hindered by the fact that exact closed-form expressions for the implied data density are not available. This is a common and well-known problem for most parameter driven dynamic econometric models. This paper reviews, adapts and compares three different approaches for solving this problem. For evaluating the likelihood, two of the methods rely on Monte Carlo integration with importance sampling techniques. The third method, in contrast, is based on fully deterministic numerical procedures. A Monte Carlo study is conducted to illustrate the use of each method, and assess its corresponding finite sample performance.
    Keywords: Multi-state Duration models; Parameter Driven models; Simulated Maximum Likelihood; Importance Sampling
    JEL: C15 C32 C33 C41
    Date: 2008–02–27
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20080021&r=ecm
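    Two of the three methods compared rest on Monte Carlo integration with importance sampling. A self-checking toy, where the likelihood L = integral of p(y|x) p(x) dx has a known closed form:
      # Toy model: y | x ~ N(x, 1), x ~ N(0, 1), so the exact marginal is N(0, 2)
      # and the Monte Carlo answer can be verified.
      import numpy as np
      from scipy import stats

      y = 1.3
      rng = np.random.default_rng(2)
      q = stats.norm(loc=y / 2, scale=1.0)    # proposal near the integrand's mass
      x = q.rvs(size=100_000, random_state=rng)
      w = stats.norm.pdf(y, loc=x) * stats.norm.pdf(x) / q.pdf(x)
      print(w.mean(), stats.norm.pdf(y, scale=np.sqrt(2)))  # IS estimate vs exact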
  4. By: Pesaran, M.H.; Pick, A.
    Abstract: This paper considers forecast averaging when the same model is used but estimation is carried out over different estimation windows. It develops theoretical results for random walks when their drift and/or volatility are subject to one or more structural breaks. It is shown that compared to using forecasts based on a single estimation window, averaging over estimation windows leads to a lower bias and to a lower root mean square forecast error for all but the smallest of breaks. Similar results are also obtained when observations are exponentially down-weighted, although in this case the performance of forecasts based on exponential down-weighting critically depends on the choice of the weighting coefficient. The forecasting techniques are applied to monthly inflation series of 21 OECD countries and it is found that average forecasting methods in general perform better than using forecasts based on a single estimation window.
    Keywords: Forecast combinations, averaging over estimation windows, exponentially down-weighting observations, structural breaks.
    JEL: C22 C53
    Date: 2008–03
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:0814&r=ecm
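    The averaging-over-windows scheme in miniature, for a simulated random walk whose drift breaks mid-sample (break date and window grid are arbitrary choices):
      import numpy as np

      rng = np.random.default_rng(3)
      T, T_break = 200, 100
      drift = np.where(np.arange(T) < T_break, 0.0, 0.2)   # break in the drift
      y = np.cumsum(drift + rng.standard_normal(T))

      windows = [25, 50, 100, 200]                         # estimation windows
      forecasts = [y[-1] + np.diff(y[-w:]).mean() for w in windows]
      single = forecasts[-1]                               # longest-window forecast
      avew = np.mean(forecasts)                            # averaged forecast
      print(f"single-window: {single:.3f}  averaged: {avew:.3f}")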
  5. By: V. Dordonnat (VU University Amsterdam); S.J. Koopman (VU University Amsterdam); M. Ooms (VU University Amsterdam); A. Dessertaine (Electricité de France, Clamart, France); J. Collet (Electricité de France, Clamart, France)
    Abstract: We present a model for hourly electricity load forecasting based on stochastically time-varying processes that are designed to account for changes in customer behaviour and in utility production efficiencies. The model is periodic: it consists of different equations and different parameters for each hour of the day. Dependence between the equations is introduced by covariances between disturbances that drive the time-varying processes. The equations are estimated simultaneously. Our model consists of components that represent trends, seasons at different levels (yearly, weekly, daily, special days and holidays), short-term dynamics and weather regression effects including nonlinear functions for heating effects. The implementation of our forecasting procedure relies on the multivariate linear Gaussian state space framework and is applied to national French hourly electricity load. The analysis focuses on two hours, 9 AM and 12 AM, but forecasting results are presented for all twenty-four hours. Given the time series length of nine years of hourly observations, many features of our model can be readily estimated including yearly patterns and their time-varying nature. The empirical analysis involves an out-of-sample forecasting assessment up to seven days ahead. The one-day ahead forecasts from forty-eight bivariate models are compared with twenty-four univariate models for all hours of the day. We find that the implied forecasting function strongly depends on the hour of the day.
    Keywords: Kalman filter; Maximum likelihood estimation; Seemingly Unrelated Regression Equations; Unobserved Components; Time varying parameters; Heating effect
    JEL: C22 C32 C52 C53
    Date: 2008–01–17
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20080008&r=ecm
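    The paper's 24-equation periodic model lives in the linear Gaussian state-space framework; its simplest member, a univariate local-level model with its Kalman filter, looks as follows (variances are illustrative, not estimated):
      import numpy as np

      def local_level_filter(y, sigma2_eps=1.0, sigma2_eta=0.1):
          a, p = y[0], 1e6                    # diffuse-ish initialization
          filtered = np.empty_like(y)
          for t, obs in enumerate(y):
              f = p + sigma2_eps              # prediction-error variance
              k = p / f                       # Kalman gain
              a = a + k * (obs - a)           # filtered level
              p = p * (1 - k) + sigma2_eta    # predict next period's variance
              filtered[t] = a
          return filtered

      rng = np.random.default_rng(4)
      level = np.cumsum(0.3 * rng.standard_normal(300))
      y = level + rng.standard_normal(300)
      print(local_level_filter(y)[-5:])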
  6. By: Jan G. De Gooijer (University of Amsterdam); Ao Yuan (Howard University, Washington DC, USA)
    Abstract: We study the problem of selecting the optimal functional form among a set of non-nested nonlinear mean functions for a semiparametric kernel based regression model. To this end we consider Rissanen's minimum description length (MDL) principle. We prove the consistency of the proposed MDL criterion. Its performance is examined via simulated data sets of univariate and bivariate nonlinear regression models.
    Keywords: Kernel density estimator; Maximum likelihood estimator; Minimum description length; Nonlinear regression; Semiparametric model
    JEL: C14
    Date: 2008–05–07
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20080046&r=ecm
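    A parametric toy conveying the selection principle, using the familiar approximation MDL ~ -loglik + (k/2) log n to rank non-nested mean functions (the paper's criterion is built for the semiparametric kernel setting, which is not reproduced here):
      import numpy as np

      rng = np.random.default_rng(5)
      x = np.linspace(0.1, 3, 150)
      y = np.exp(0.8 * x) + rng.standard_normal(150)

      def gauss_negloglik(resid):
          n, s2 = len(resid), resid.var()
          return 0.5 * n * (np.log(2 * np.pi * s2) + 1)

      candidates = {
          "linear":    np.column_stack([np.ones_like(x), x]),
          "quadratic": np.column_stack([np.ones_like(x), x, x**2]),
          "log":       np.column_stack([np.ones_like(x), np.log(x)]),
      }
      for name, X in candidates.items():
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          mdl = gauss_negloglik(y - X @ beta) + 0.5 * X.shape[1] * np.log(len(y))
          print(f"{name:10s} MDL = {mdl:.1f}")   # smallest MDL wins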
  7. By: Fanelli, Luca; Paruolo, Paolo
    Abstract: This paper considers the speed of adjustment to long-run equilibria, in the context of cointegrated Vector Autoregressive Processes (VAR). We discuss the definition of multivariate p-lives for any indicator of predictive ability, concentrating on cumulated interim multipliers, which converge to impact factors as the forecasting horizon increases. Interim multipliers are related to autoregressive Granger-causality coefficients and to structural or generalized cumulative impulse responses. We discuss the relation of the present definition of multivariate p-lives with existing definitions for univariate time series and for nonlinear multivariate stationary processes. For multivariate (possibly cointegrated) VAR systems, p-lives are functions of the dynamics of the system only, and do not depend on the history path on which the forecast is based. Hence one can discuss inference on p-lives as (discrete) functions of parameters in the VAR model. We discuss a likelihood-based approach, both for point estimation and for confidence regions. An illustrative application to adjustment to purchasing-power parity (PPP) is presented.
    Keywords: p-life; speed of adjustment; impact factors; vector equilibrium correction; shock absorption.
    JEL: C32 C52 F31
    Date: 2007–06
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:9174&r=ecm
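    A numerical illustration of a multivariate p-life: in a toy cointegrated VAR(1) the impulse responses A^h converge to an impact-factor matrix, and the p-life is the first horizon at which the remaining gap falls within a fraction p of the initial one (all numbers invented):
      import numpy as np

      a = np.array([[-0.2], [0.1]])           # adjustment coefficients
      b = np.array([[1.0], [-1.0]])           # cointegrating vector
      A = np.eye(2) + a @ b.T                 # eigenvalues: 1 and 1 + b'a = 0.7

      A_inf = np.linalg.matrix_power(A, 500)  # numerical limit (impact factors)
      p = 0.5
      gap0 = np.abs(np.eye(2) - A_inf).max()  # gap at horizon 0
      h, P = 0, np.eye(2)
      while np.abs(P - A_inf).max() > p * gap0:
          P, h = P @ A, h + 1
      print(f"{p:.0%}-life: {h} periods")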
  8. By: Pötscher, Benedikt M.; Schneider, Ulrike
    Abstract: The finite-sample coverage properties of confidence intervals based on penalized maximum likelihood estimators like the LASSO, adaptive LASSO, and hard-thresholding are analyzed. It is shown that symmetric intervals are the shortest. The length of the shortest intervals based on the hard-thresholding estimator is larger than the length of the shortest interval based on the adaptive LASSO, which is larger than the length of the shortest interval based on the LASSO, which in turn is larger than the standard interval based on the maximum likelihood estimator. In the case where the penalized estimators are tuned to possess the `sparsity property', the intervals based on these estimators are larger than the standard interval by an order of magnitude. A simple asymptotic confidence interval construction in the `sparse' case, that also applies to the smoothly clipped absolute deviation estimator, is also discussed.
    Keywords: penalized maximum likelihood; Lasso; adaptive Lasso; hard-thresholding; confidence set; coverage probability; sparsity; model selection.
    JEL: C13 C01
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:9062&r=ecm
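    The undercoverage phenomenon can be reproduced in the Gaussian location model y ~ N(theta, 1/n), where the LASSO reduces to soft-thresholding; the naive symmetric interval around the estimator fails badly near the threshold (tuning and theta values illustrative):
      import numpy as np

      rng = np.random.default_rng(8)
      n, reps, z = 100, 20_000, 1.96
      c = np.sqrt(np.log(n) / n)              # a "sparsity-tuned" threshold
      for theta in (0.0, c, 0.5):
          y = theta + rng.standard_normal(reps) / np.sqrt(n)
          soft = np.sign(y) * np.maximum(np.abs(y) - c, 0)  # LASSO in this model
          cover = np.mean(np.abs(soft - theta) <= z / np.sqrt(n))
          print(f"theta = {theta:.3f}: naive coverage {cover:.3f}")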
  9. By: Dimitris K. Christopoulos; Miguel Leon-Ledesma
    Abstract: In this paper we propose Granger (non-)causality tests based on a VAR model allowing for time-varying coefficients. The functional form of the time-varying coefficients is a Logistic Smooth Transition Autoregressive (LSTAR) model using time as the transition variable. The model allows for testing Granger non-causality when the VAR is subject to a smooth break in the coefficients of the Granger causal variables. The proposed test is then applied to the money-output relationship using quarterly US data for the period 1952:2-2002:4. We find that causality from money to output becomes stronger after 1978:4, and the model is shown to have good out-of-sample forecasting performance for output relative to a linear VAR model.
    Keywords: Granger causality; Time-varying coefficients; LSTAR models
    JEL: C51 C52
    Date: 2008–01
    URL: http://d.repec.org/n?u=RePEc:ukc:ukcedp:0802&r=ecm
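    The core device is a coefficient that moves smoothly from b0 to b0 + b1 as (scaled) time crosses a location c; a minimal sketch with invented parameter values:
      import numpy as np

      def lstar_coef(t, b0, b1, gamma, c):
          """Time-varying coefficient b0 + b1 * G(t), with G logistic in time."""
          G = 1.0 / (1.0 + np.exp(-gamma * (t - c)))
          return b0 + b1 * G

      t = np.linspace(0, 1, 203)              # 1952:2-2002:4 scaled to [0, 1]
      print(lstar_coef(t[[0, 101, 202]], b0=0.1, b1=0.4, gamma=25, c=0.55))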
  10. By: Koji Miyawaki (Graduate School of Economics, University of Tokyo); Yasuhiro Omori (Faculty of Economics, University of Tokyo); Akira Hibiki (National Institute for Environmental Studies, and Department of Social Engineering, Tokyo Institute of Technology)
    Abstract: This article proposes a Bayesian estimation method for demand functions under block rate pricing, focusing on the increasing case. Under this pricing structure, the price changes when consumption exceeds a certain threshold, and the consumer faces a utility maximization problem subject to a piecewise-linear budget constraint. We apply the so-called discrete/continuous choice approach to derive the corresponding demand function. Taking a hierarchical Bayesian approach, we implement a Markov chain Monte Carlo simulation to estimate the demand function. Moreover, a separability condition is explicitly considered to obtain proper estimates. We find, however, that the convergence of the distribution of simulated samples to the posterior distribution is slow, requiring the addition of a scale transformation step for the parameters in the Gibbs sampler. The model is also extended to allow random coefficients for panel data and spatial correlation for spatial data. The proposed methods are applied to estimate Japanese residential water and electricity demand functions.
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2008cf568&r=ecm
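    The computational engine, stripped of the block-rate specifics: a two-block Gibbs sampler for Bayesian linear regression with conjugate conditionals (the paper's data augmentation and scale transformation step are not reproduced here):
      import numpy as np

      rng = np.random.default_rng(10)
      n = 200
      x = rng.standard_normal(n)
      y = 1.0 - 0.5 * x + 0.8 * rng.standard_normal(n)
      X = np.column_stack([np.ones(n), x])

      XtX_inv = np.linalg.inv(X.T @ X)
      beta_ols = XtX_inv @ X.T @ y
      sigma2, draws = 1.0, []
      for it in range(3000):
          # beta | sigma2: normal around OLS with variance sigma2 (X'X)^{-1}
          beta = rng.multivariate_normal(beta_ols, sigma2 * XtX_inv)
          # sigma2 | beta: inverse-gamma, drawn via a scaled chi-squared
          resid = y - X @ beta
          sigma2 = resid @ resid / rng.chisquare(n)
          draws.append([*beta, sigma2])
      print(np.mean(draws[500:], axis=0))     # posterior means after burn-in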
  11. By: Govert E. Bijwaard (Erasmus University Rotterdam)
    Abstract: In this article we develop an Instrumental Variable estimation procedure that corrects for possible endogeneity of a variable in a duration model. We assume a Generalized Accelerated Failure Time (GAFT) model, which is based on transforming the durations and assuming a distribution for these transformed durations. The GAFT model encompasses two competing approaches to duration data: the (Mixed) Proportional Hazard (MPH) model and the Accelerated Failure Time (AFT) model. The basis of the Instrumental Variable Linear Rank estimator (IVLR) is that, for the true GAFT model, the instrument does not influence the hazard of the transformed duration. The inverse of an extended rank test provides the estimation equations on which the IVLR estimation procedure is based. We discuss the large sample properties and the efficiency of this estimator, as well as practical issues of its implementation. We apply the IVLR estimation approach to the Illinois re-employment bonus experiment, in which individuals who became unemployed were divided at random into three groups: two bonus groups and a control group. Those in the bonus groups could refuse to participate in the experiment, and it is very likely that this decision is related to the unemployment duration. We use the IVLR estimator to obtain the effect of these endogenous claimant and employer bonuses on the re-employment hazard.
    Keywords: Endogenous Variable; Duration model; Censoring; Instrumental Variable
    JEL: C21 C41 J64
    Date: 2008–03–27
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20080032&r=ecm
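    A toy of the rank-based IV idea, without censoring or the general GAFT transformation: in an accelerated-failure-time model, choose the coefficient that makes the instrument uncorrelated with the ranks of the implied residuals (a grid search stands in for inverting the extended rank test; all data simulated):
      import numpy as np
      from scipy.stats import rankdata

      rng = np.random.default_rng(11)
      n, b_true = 2000, 0.8
      z = rng.standard_normal(n)              # instrument (e.g. bonus offer)
      v = rng.standard_normal(n)              # unobserved heterogeneity
      x = 0.7 * z + v                         # endogenous regressor
      logT = -b_true * x + v + 0.5 * rng.standard_normal(n)  # error correlated with x

      grid = np.linspace(-1, 3, 401)
      score = [abs(np.cov(z, rankdata(logT + b * x))[0, 1]) for b in grid]
      print("rank-IV estimate:", grid[int(np.argmin(score))])  # close to b_true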
  12. By: Charles S. Bos (VU University Amsterdam)
    Abstract: When analysing the volatility of high frequency financial data, mostly non-parametric approaches based on realised or bipower variation are applied. This article instead starts from a continuous time diffusion model and derives a parametric analog at high frequency, allowing simultaneously for microstructure effects, jumps, missing observations and stochastic volatility. Estimation of the model delivers measures of daily variation outperforming their non-parametric counterparts. The feasibility of this novel approach is shown with both simulated and actual exchange rate data. The parametric setting is used to estimate the intra-day trend in the Euro/U.S. Dollar exchange rate.
    Keywords: High frequency; integrated variation; intra-day; jump diffusions; microstructure noise; stochastic volatility; exchange rates
    JEL: C11 C14 D53 E44
    Date: 2008–01–22
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20080011&r=ecm
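    The data-generating process at issue, simulated by an Euler scheme: a stochastic-volatility diffusion with compound-Poisson jumps, observed with additive microstructure noise (all parameter values illustrative):
      import numpy as np

      rng = np.random.default_rng(12)
      n, dt = 23_400, 1 / 23_400              # one "day" of one-second returns
      kappa, theta, xi = 5.0, 1e-4, 0.5       # variance mean reversion
      lam, sig_j = 2.0, 3e-3                  # roughly 2 jumps per day
      v = np.empty(n); v[0] = theta           # spot variance
      x = np.empty(n); x[0] = 0.0             # efficient log-price
      for t in range(1, n):
          v[t] = abs(v[t-1] + kappa * (theta - v[t-1]) * dt
                     + xi * np.sqrt(v[t-1] * dt) * rng.standard_normal())
          jump = rng.poisson(lam * dt) * rng.normal(0.0, sig_j)
          x[t] = x[t-1] + np.sqrt(v[t-1] * dt) * rng.standard_normal() + jump
      y = x + 5e-5 * rng.standard_normal(n)   # microstructure noise on top
      print(np.sum(np.diff(x) ** 2), np.sum(np.diff(y) ** 2))  # RV without/with noise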
  13. By: Jose Olmo (Department of Economics, City University, London); William Pouliot (Department of Economics, City University, London)
    Abstract: The implementation of appropriate statistical techniques for monitoring conditional VaR models, i.e., backtesting, reported by institutions is fundamental to determine their exposure to market risk. Backtesting techniques are important since the severity of the departures of the VaR model from market results determines the penalties imposed for inadequate VaR models. In this paper we make six contributions to backtesting techniques. In particular, we show that the Kupiec test can be viewed as a combination of CUSUM change point tests; we detail the lack of power of CUSUM methods in detecting violations of VaR as soon as these occur; we develop an alternative technique based on weighted U-statistic type processes that has power against misspecification of the risk measure and allows early detection; we show these new backtesting techniques are robust to the presence of estimation risk; we construct a new class of weight functions that can be used to weight our processes; and our methods are applicable under both conditional and unconditional VaR settings.
    Keywords: Asymmetries, crises; Extreme values; Hypothesis testing; Leverage effect; Nonlinearities; Threshold models
    Date: 2008–05
    URL: http://d.repec.org/n?u=RePEc:cty:dpaper:0809&r=ecm
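    For reference, the Kupiec unconditional-coverage likelihood-ratio test that the paper reinterprets as a combination of CUSUM statistics: x VaR violations in n days against a nominal rate p, with the LR statistic chi-squared(1) under correct coverage (numbers below illustrative):
      import numpy as np
      from scipy import stats

      def kupiec_lr(x, n, p):
          phat = x / n
          loglik0 = (n - x) * np.log(1 - p) + x * np.log(p)
          loglik1 = (n - x) * np.log(1 - phat) + x * np.log(phat)
          return 2 * (loglik1 - loglik0)

      lr = kupiec_lr(x=9, n=250, p=0.01)      # 9 violations in a 250-day backtest
      print(lr, stats.chi2.sf(lr, df=1))      # statistic and p-value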
  14. By: Pesaran, M.H.; Zaffaroni, P.
    Abstract: This paper characterizes the asymptotic behaviour, as the number of assets gets arbitrarily large, of the portfolio weights for the class of tangency portfolios belonging to the Markowitz paradigm. It is assumed that the joint distribution of asset returns is characterized by a general factor model, with possibly heteroskedastic components. Under these conditions, we establish that a set of appealing properties, so far unnoticed, characterize traditional Markowitz portfolio trading strategies. First, we show that the tangency portfolios fully diversify the risk associated with the factor component of asset return innovations. Second, with respect to determination of the portfolio weights, the conditional distribution of the factors is of second-order importance as compared to the distribution of the factor loadings and that of the idiosyncratic components. Third, although of crucial importance in forecasting asset returns, current and lagged factors do not enter the limit portfolio returns. Our theoretical results also shed light on a number of issues discussed in the literature regarding the limiting properties of portfolio weights such as the diversifiability property and the number of dominant factors.
    Keywords: Asset allocation, Large Portfolios, Factor models, Diversification.
    JEL: C32 C52 C53 G1
    Date: 2008–03
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:0813&r=ecm
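    The object under study, computed for one simulated cross-section: tangency weights w proportional to Sigma^{-1} mu when the return covariance has the exact factor structure Sigma = B F B' + D (all inputs invented):
      import numpy as np

      rng = np.random.default_rng(14)
      N, k = 400, 3
      B = rng.standard_normal((N, k))              # factor loadings
      Fcov = np.diag([0.04, 0.02, 0.01])           # factor covariance
      D = np.diag(0.1 + 0.05 * rng.random(N))      # idiosyncratic variances
      Sigma = B @ Fcov @ B.T + D
      mu = 0.05 + 0.02 * rng.standard_normal(N)    # expected excess returns

      w = np.linalg.solve(Sigma, mu)
      w /= w.sum()                                 # normalize to full investment
      print(f"max |w_i| = {np.abs(w).max():.4f}  (granular for large N)")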
  15. By: Kiani, Mehdi; Panaretos, John; Psarakis, Stelios
    Abstract: The Shewhart, Bonferroni-adjustment and analysis of means (ANOM) control charts are typically applied to monitor the mean of a quality characteristic. The Shewhart and Bonferroni procedures are utilized to recognize special causes in the production process, where the control limits are constructed assuming a normal distribution when the parameters (mean and standard deviation) are known, and an approximately normal distribution when they are unknown. The ANOM method is an alternative to the analysis of variance method; it can be used to establish mean control charts by applying the equicorrelated multivariate non-central t distribution. In this paper, we establish new control charts, for phase I and phase II monitoring, based on the normal and t distributions for the cases of known and unknown standard deviation. Our proposed methods are at least as effective as the classical Shewhart methods and have some advantages.
    Keywords: Shewhart; Bonferroni-adjustment; Analysis of means; Average run length; False alarm probability
    JEL: C10
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:9066&r=ecm
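    The classical Shewhart x-bar chart that serves as the benchmark, for known in-control parameters: subgroup means are flagged when they leave mu +/- 3 sigma / sqrt(n) (numbers illustrative):
      import numpy as np

      rng = np.random.default_rng(15)
      mu, sigma, n = 10.0, 2.0, 5              # in-control parameters, subgroup size
      samples = mu + sigma * rng.standard_normal((40, n))
      samples[30:] += 2.0                      # a shift the chart should catch
      xbar = samples.mean(axis=1)
      ucl = mu + 3 * sigma / np.sqrt(n)
      lcl = mu - 3 * sigma / np.sqrt(n)
      alarms = np.flatnonzero((xbar > ucl) | (xbar < lcl))
      print("out-of-control subgroups:", alarms)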
  16. By: Sella Lisa (University of Turin)
    Abstract: This methodological paper reviews different spectral techniques well suited to the analysis of economic time series. While econometric time series analysis is generally carried out in the time domain, these techniques propose a complementary approach based on the frequency domain. Spectral decomposition and time series reconstruction provide a precise quantitative and formal description of the main oscillatory components of a series: thus, it is possible to formally identify trends, low-frequency components, business cycles, seasonalities, etc. Since recent developments in spectral techniques make it possible to handle even short and noisy datasets, nonstationary processes, and components that are not purely periodic, these tools could be applied to economic datasets more widely than they are nowadays.
    Date: 2008–05
    URL: http://d.repec.org/n?u=RePEc:uto:dipeco:200809&r=ecm
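    The frequency-domain starting point of every technique surveyed is the periodogram, the squared modulus of the FFT, which decomposes a series' variance by frequency; a simulated 40-period cycle shows up as a peak near frequency 1/40:
      import numpy as np

      rng = np.random.default_rng(16)
      T = 400
      t = np.arange(T)
      y = 0.02 * t + np.sin(2 * np.pi * t / 40) + 0.5 * rng.standard_normal(T)
      y = y - np.polyval(np.polyfit(t, y, 1), t)   # detrend before the FFT

      freq = np.fft.rfftfreq(T, d=1.0)
      pgram = np.abs(np.fft.rfft(y)) ** 2 / T
      print("peak at frequency", freq[np.argmax(pgram[1:]) + 1])  # ~ 0.025 = 1/40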
  17. By: Karim Barhoumi; Szilard Benk; Riccardo Cristadoro; Ard Den Reijer; Audrone Jakaitiene; Piotr Jelonek; António Rua; Gerhard Rünstler (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.); Karsten Ruth; Christophe Van Nieuwenhuyze
    Abstract: This paper evaluates different models for the short-term forecasting of real GDP growth in ten selected European countries and the euro area as a whole. Purely quarterly models are compared with models designed to exploit early releases of monthly indicators for the nowcast and forecast of quarterly GDP growth. Amongst the latter, we consider small bridge equations and forecast equations in which the bridging between monthly and quarterly data is achieved through a regression on factors extracted from large monthly datasets. The forecasting exercise is performed in a simulated real-time context, which takes account of publication lags in the individual series. In general, we find that models that exploit monthly information outperform models that use purely quarterly data and, amongst the former, factor models perform best.
    Keywords: Bridge models, Dynamic factor models, real-time data flow
    JEL: E37 C53
    Date: 2008–04
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbops:20080084&r=ecm
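    A skeleton bridge equation on simulated data: aggregate a monthly indicator to quarterly frequency and regress GDP growth on it, so that early monthly releases can be bridged into a nowcast (series names and values are placeholders):
      import numpy as np

      rng = np.random.default_rng(17)
      months = 240
      indicator = rng.standard_normal(months).cumsum() * 0.1       # monthly survey
      gdp_growth = (indicator.reshape(-1, 3).mean(axis=1)          # quarterly mean
                    + 0.3 * rng.standard_normal(months // 3))

      ind_q = indicator.reshape(-1, 3).mean(axis=1)
      X = np.column_stack([np.ones_like(ind_q), ind_q])
      beta, *_ = np.linalg.lstsq(X[:-1], gdp_growth[:-1], rcond=None)  # hold out last Q
      nowcast = X[-1] @ beta
      print(f"nowcast {nowcast:.3f} vs outturn {gdp_growth[-1]:.3f}")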
  18. By: Aurélien Hazan (IBISC - Informatique, Biologie Intégrative et Systèmes Complexes - CNRS : FRE2873 - Université d'Evry-Val d'Essonne); Vincent Vigneron (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, SAMOS - Statistique Appliquée et MOdélisation Stochastique - Université Panthéon-Sorbonne - Paris I)
    Abstract: The various scales of a signal maintain relations of dependence with one another. These can vary in time and reveal speed changes in the studied phenomenon. To detect such changes, one first computes the wavelet transform of the signal at various scales. One then studies the statistical dependence between these transforms by means of a mutual information estimator. We propose to summarize the resulting network of dependences in a dependence graph, obtained by thresholding or quantizing the values of the mutual information. The method can be applied to several types of signals, such as fluctuations of market indexes, for instance the S&P 500, or high frequency foreign exchange (FX) rates.
    Keywords: wavelet, dependence; mutual information; financial; time-series; FX
    Date: 2008–06–05
    URL: http://d.repec.org/n?u=RePEc:hal:papers:hal-00287463_v1&r=ecm
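    A toy version of the pipeline: a redundant (a trous, Haar-flavoured) wavelet decomposition, a histogram plug-in estimator of mutual information between scales (on absolute values), and a thresholded MI matrix as the dependence graph. The bin count, the threshold and the circular boundary handling are ad hoc shortcuts:
      import numpy as np

      def a_trous(x, levels):
          """Details d_j = s_{j-1} - s_j, with s_j a dyadic moving smooth."""
          smooth, details = x.astype(float), []
          for j in range(levels):
              new = 0.5 * (smooth + np.roll(smooth, 2 ** j))
              details.append(smooth - new)
              smooth = new
          return np.array(details)

      def mutual_info(u, v, bins=16):
          pxy, _, _ = np.histogram2d(u, v, bins=bins)
          pxy /= pxy.sum()
          px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
          nz = pxy > 0
          return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

      rng = np.random.default_rng(18)
      ret = rng.standard_normal(4096) * np.exp(0.02 * np.cumsum(rng.standard_normal(4096)))
      D = np.abs(a_trous(ret, levels=4))
      mi = np.array([[mutual_info(D[i], D[j]) for j in range(4)] for i in range(4)])
      print((mi > 0.05).astype(int))          # adjacency matrix (diagonal trivially 1)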
  19. By: Jaromír Beneš; Andrew Binning; Kirdan Lees (Reserve Bank of New Zealand)
    Abstract: Central bank policymakers often cast judgement about macroeconomic forecasts in reduced form terms, basing this on off-model information that is not easily mapped to a structural DSGE framework. We show how to compute forecasts conditioned on policymaker judgement that are the most likely conditional forecasts from the perspective of the DSGE model, thereby maximising the influence of the model structure on the forecasts. We suggest using a simple implausibility index, based on the structural shocks required to return the policymaker judgement, to track the magnitude and type of judgement applied. We show how the methods can be used in a practical policy environment and apply the techniques to condition DSGE model forecasts on: (i) the long history of published forecasts from the Reserve Bank of New Zealand; (ii) constant interest rate forecasts; and (iii) inflation forecasts from a Bayesian VAR currently used in the policy environment at the Reserve Bank of New Zealand.
    Keywords: DSGE models; monetary policy; conditional forecasts
    JEL: C51 C53
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:nzb:nzbdps:2008/10&r=ecm
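    Judgement-conditioning in its simplest linear form: for x_{t+1} = A x_t + B e_t (with x_0 = 0), find the minimum-norm shocks that make variable 0 hit a judgemental path, and use the sum of squared shocks as a crude implausibility index (matrices invented, not a DSGE model):
      import numpy as np

      A = np.array([[0.9, 0.1],
                    [0.0, 0.7]])              # transition of the linear model
      B = np.eye(2)                           # shock loadings
      H = 3                                   # conditioning horizon
      target = np.array([0.5, 0.4, 0.3])      # judgemental path for variable 0

      M = np.zeros((H, 2 * H))                # maps stacked shocks to variable 0
      for t in range(1, H + 1):
          for s in range(1, t + 1):           # a shock at s propagates A^{t-s} B
              M[t - 1, 2*(s-1):2*s] = (np.linalg.matrix_power(A, t - s) @ B)[0]

      e, *_ = np.linalg.lstsq(M, target, rcond=None)  # minimum-norm solution
      print("shocks:", e.round(3), " implausibility index:", round(e @ e, 3))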
  20. By: Thomas Lux
    Abstract: This paper develops a methodology for estimating the parameters of dynamic opinion or expectation formation processes with social interactions. We study a simple stochastic framework of a collective process of opinion formation by a group of agents who face a binary decision problem. The aggregate dynamics of the individuals' decisions can be analyzed via the stochastic process governing the ensemble average of choices. Numerical approximations to the transient density of this ensemble average allow the evaluation of the likelihood function on the basis of discrete observations of the social dynamics. This approach can be used to estimate the parameters of the opinion formation process from aggregate data on its average realization. Our application to a well-known business climate index provides strong indication of social interaction as an important element in respondents' assessment of the business climate.
    Keywords: business climate, business cycle forecasts, opinion formation, social interactions
    JEL: C42 D84 E37
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:kie:kieliw:1424&r=ecm
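    The kind of interaction model being estimated, simulated directly: N agents with binary opinions whose switching probabilities tilt with the current average opinion, so that a large interaction coefficient produces herding-like swings (all parameters illustrative):
      import numpy as np

      rng = np.random.default_rng(20)
      N, T, nu = 500, 400, 0.05
      alpha0, alpha1 = 0.0, 0.8               # bias and interaction strength
      state = rng.choice([-1, 1], size=N)
      path = np.empty(T)
      for t in range(T):
          x = state.mean()
          p_up = nu * np.exp(alpha0 + alpha1 * x)      # prob(-1 -> +1)
          p_dn = nu * np.exp(-(alpha0 + alpha1 * x))   # prob(+1 -> -1)
          u = rng.random(N)
          state = np.where(state == -1,
                           np.where(u < p_up, 1, -1),
                           np.where(u < p_dn, -1, 1))
          path[t] = x
      print(path[::80])                        # snapshots of the opinion index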
  21. By: Jan Rouwendal (VU University Amsterdam); Arianne de Blaeij (LEI, The Hague); Piet Rietveld (VU University Amsterdam); Erik Verhoef (VU University Amsterdam)
    Abstract: This paper presents a method to assess the distribution of values of time, and values of statistical life, over participants in a stated choice experiment that does not require the researcher to make an a priori assumption on the type of distribution, as is required, for example, for mixed logit models. The method requires a few assumptions to hold true, namely that the valuations to be determined are constant for each individual, and that respondents make choices according to their preferences. These assumptions allow the derivation of lower and upper bounds on the (cumulative) distribution of the values of interest over respondents, by deriving for each choice set the value(s) at which the respondent would be indifferent between the alternatives offered, and next deriving from the choice actually made the respondent's implied minimum or maximum value(s). We also provide an extension of the method that incorporates the possibility that errors are made. The method is illustrated using data from an experiment investigating the value of time and the value of statistical life. We discuss the possibility of improving the information content of stated choice experiments by optimizing the attribute levels shown to respondents, which is especially relevant because it would help in selecting the appropriate distribution for mixed logit estimates of the same data.
    Keywords: stated preferences; value of a statistical life
    JEL: C81 D12 D61 R41
    Date: 2008–05–22
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20080053&r=ecm
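    The bounding logic in miniature, with simulated error-free choices: each binary choice implies an indifference value of time equal to the extra cost divided by the time saved, and the chosen alternative reveals whether the respondent's value lies above or below it; stacking these inequalities bounds the distribution (all data invented):
      import numpy as np

      rng = np.random.default_rng(21)
      n = 1000
      true_vot = rng.lognormal(mean=2.3, sigma=0.5, size=n)  # euros per hour
      d_cost = rng.uniform(2, 15, size=n)                    # extra cost of fast option
      d_time = rng.uniform(0.1, 1.0, size=n)                 # hours saved
      cutoff = d_cost / d_time                               # indifference VOT
      chose_fast = true_vot > cutoff                         # error-free choices

      # bounds on P(VOT <= v): slow-choosers with cutoff <= v certainly lie
      # below v; fast-choosers with cutoff >= v certainly do not
      v = 10.0
      lower = np.mean(~chose_fast & (cutoff <= v))
      upper = 1 - np.mean(chose_fast & (cutoff >= v))
      print(f"P(VOT <= {v}): in [{lower:.2f}, {upper:.2f}],",
            f"true {np.mean(true_vot <= v):.2f}")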
  22. By: Blanc, J.P.C. (Tilburg University, Center for Economic Research)
    Abstract: A frequent observation in service systems with queues in parallel is that customers in other queues tend to be served faster than those in one's own queue. This paper quantifies the probability that one's service would have started earlier if one had joined another queue than the queue that was actually chosen, for exponential multiserver systems with queues in parallel in which customers join one of the shortest queues upon arrival and in which jockeying is not possible.
    Keywords: Queueing; Join-the-shortest-queue; Probability of bad luck; Power-series algorithm; Overtaking customers; Dedicated customers.
    JEL: C44 C60
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200854&r=ecm
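    A Monte Carlo companion to the paper's exact power-series calculations: two parallel exponential single-server queues with join-the-shortest-queue routing and no jockeying; each customer's actual service start is compared with the start the other queue would have offered, holding everything else fixed (a first-order counterfactual; rates illustrative):
      import numpy as np
      from bisect import bisect_right

      rng = np.random.default_rng(22)
      lam, mu, n = 1.4, 1.0, 100_000          # total arrival rate, service rate
      arrivals = np.cumsum(rng.exponential(1 / lam, n))
      avail = [0.0, 0.0]                      # time each server frees up
      finishes = ([], [])                     # increasing finish times per queue
      bad_luck = 0
      for t in arrivals:
          qlen = [len(f) - bisect_right(f, t) for f in finishes]  # jobs present
          q = 0 if qlen[0] <= qlen[1] else 1  # join the shortest (ties -> 0)
          start_own = max(t, avail[q])
          start_other = max(t, avail[1 - q])
          bad_luck += start_other < start_own
          avail[q] = start_own + rng.exponential(1 / mu)
          finishes[q].append(avail[q])
      print("P(bad luck) ~", bad_luck / n)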

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.