on Computational Economics
Issue of 2005‒10‒29
five papers chosen by
By: | Xavier Vilà
Abstract: | We analyze the classical Bertrand model when consumers exhibit some strategic behavior in deciding from which seller they will buy. We use two related but different tools. Both consider a probabilistic learning (or evolutionary) mechanism, and in both of them consumers' behavior influences the competition between the sellers. The results obtained show that, in general, developing some sort of loyalty is a good strategy for the buyers, as it works in their best interest. First, we consider a learning procedure described by a deterministic dynamic system and, using strong simplifying assumptions, we can produce a description of the behavior of the process. Second, we use finite automata to represent the strategies played by the agents and an adaptive process based on genetic algorithms to simulate the stochastic process of learning. By doing so we can relax some of the strong assumptions used in the first approach and still obtain the same basic results. It is suggested that the limitations of the first approach (analytical) provide a good motivation for the second approach (Agent-Based). Indeed, although both approaches address the same problem, the use of Agent-Based computational techniques allows us to relax hypotheses and overcome the limitations of the analytical approach.
Keywords: | Agent-Based Computational Economics, Evolutionary Game Theory, Imperfect Competition |
JEL: | C6 C7 D4 |
Date: | 2005–10–25 |
URL: | http://d.repec.org/n?u=RePEc:aub:autbar:654.05&r=cmp |
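The second approach in the abstract above evolves agent strategies with a genetic algorithm. As a minimal, hedged sketch of that machinery (the one-max bit-counting fitness below is a stand-in, not the paper's Bertrand payoff over finite-automaton strategies), a generic GA with tournament selection, one-point crossover, mutation, and elitism might look like:

```python
import random

def genetic_algorithm(fitness, length=20, pop_size=30,
                      generations=50, p_mut=0.02, seed=0):
    """Evolve bitstring strategies; elitism keeps the current best unchanged."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]

    def tournament():
        # Binary tournament: the fitter of two random individuals reproduces.
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        best = max(pop, key=fitness)
        children = [best[:]]                    # elitism
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, length)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]
            children.append(child)
        pop = children
    return max(pop, key=fitness)
```

With the stand-in fitness `sum` (count of ones), the best bitstring drifts toward all ones; substituting a payoff function over automaton encodings would recover the spirit of the paper's experiment.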
By: | Carl Chiarella (School of Finance and Economics, University of Technology, Sydney); Christina Nikitopoulos-Sklibosios (School of Finance and Economics, University of Technology, Sydney); Erik Schlogl (School of Finance and Economics, University of Technology, Sydney) |
Abstract: | This paper examines the pricing of interest rate derivatives when the interest rate dynamics experience infrequent jump shocks modelled as a Poisson process, within the Markovian HJM framework developed in Chiarella & Nikitopoulos (2003). Closed-form solutions for the price of a bond option under deterministic volatility specifications are derived, and a control variate numerical method is developed under a more general state-dependent volatility structure, a case in which closed-form solutions are generally not possible. In doing so, we provide a novel perspective on control variate methods by going outside a given complex model to a simpler, more tractable setting to provide the control variates.
Keywords: | HJM model; jump process; bond option prices; control variate; Monte Carlo simulation |
JEL: | E43 G33 G13 |
Date: | 2005–09–01 |
URL: | http://d.repec.org/n?u=RePEc:uts:rpaper:167&r=cmp |
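The control variate idea in the abstract above, correcting a Monte Carlo estimate with a quantity whose expectation is known in a simpler tractable setting, can be illustrated outside the HJM model. In this minimal sketch the payoff exp(U) and the control U are stand-ins for the bond-option payoff and its closed-form counterpart:

```python
import math
import random

def control_variate_estimate(n=100_000, seed=1):
    """Estimate E[exp(U)] for U ~ Uniform(0,1), using U as control variate.

    The control's mean is known exactly (0.5), mirroring the use of a
    simpler model with closed-form prices to correct the simulation.
    """
    rng = random.Random(seed)
    u = [rng.random() for _ in range(n)]
    y = [math.exp(x) for x in u]

    mean_u, mean_y = sum(u) / n, sum(y) / n
    cov = sum((a - mean_u) * (b - mean_y) for a, b in zip(u, y)) / n
    var = sum((a - mean_u) ** 2 for a in u) / n
    beta = cov / var                          # variance-minimizing coefficient

    plain = mean_y                            # crude Monte Carlo estimate
    corrected = mean_y - beta * (mean_u - 0.5)
    return plain, corrected
```

Because U and exp(U) are highly correlated, the corrected estimator has a much smaller standard error than the crude one; the true value here is e − 1.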
By: | Justin van de Ven |
Abstract: | A dynamic microsimulation model of cohort earnings based on the Australian population aged between 20 and 55 years is described. A highly parsimonious modular structure is adopted to facilitate sensitivity analysis and enable additional characteristics to be added, should they be desired. Despite the restrictive specifications used, the model closely reflects the data used for calibration. |
Date: | 2005–03 |
URL: | http://d.repec.org/n?u=RePEc:nsr:niesrd:254&r=cmp |
By: | Caron, E.; Daniels, H.A.M. (Erasmus Research Institute of Management (ERIM), RSM Erasmus University) |
Abstract: | In this paper, we describe an extension of the methodology for explanation generation in financial knowledge-based systems, offering the possibility to automatically generate explanations and diagnostics that support business decision tasks. The central goal is the identification of the specific knowledge structures and reasoning methods required to construct computerized explanations from financial data and business models. A multi-step look-ahead algorithm is proposed that deals with so-called cancelling-out effects, which are a common phenomenon in financial data sets. The extended methodology was tested in a case study conducted for Statistics Netherlands involving the comparison of financial figures of firms in the Dutch retail sector. The analyses are performed with a diagnostic software application that implements our theory of explanation. Comparison of the results of the classic explanation methodology with those of the extended methodology shows significant improvements in the analyses when cancelling-out effects are present in the data.
Keywords: | Decision support systems; Finance; Production statistics; Artificial intelligence; Explanation
Date: | 2005–10–14 |
URL: | http://d.repec.org/n?u=RePEc:dgr:eureri:30007655&r=cmp |
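A cancelling-out effect, as used in the abstract above, arises when large component changes offset each other so that the aggregate barely moves and a naive one-step explanation stops too early. The sketch below only illustrates detecting such an effect in an additively composed measure; it is not the paper's multi-step look-ahead algorithm, and the 0.5 ratio threshold and the retail figures are assumptions:

```python
def detect_cancelling_out(actual, reference, ratio=0.5):
    """Flag a cancelling-out effect in an additively composed measure.

    Compares component values against a reference (norm) and reports
    whether large opposite-signed deltas hide behind a small net change.
    """
    deltas = {k: actual[k] - reference[k] for k in actual}
    net = sum(deltas.values())
    gross = sum(abs(d) for d in deltas.values())
    cancelling = gross > 0 and abs(net) < ratio * gross
    return net, deltas, cancelling

# Hypothetical retail figures: regional sales changes nearly cancel, so an
# explanation of total sales must drill one level deeper.
actual = {"sales_north": 110.0, "sales_south": 91.0}
norm = {"sales_north": 100.0, "sales_south": 100.0}
net, deltas, flag = detect_cancelling_out(actual, norm)
```

Here the net change is only +1 while the gross movement is 19, so the effect is flagged and a diagnostic system would continue the look-ahead into the components.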
By: | Sigbert Klinke; Uwe Ziegenhagen; Yuval Guri |
Abstract: | Statistical research has always been at the edge of available computing power. Huge datasets, e.g. in data mining or quantitative finance, and computationally intensive techniques, e.g. bootstrap methods, always require a little more computing power than is currently available. But the most popular statistical programming language, R, like other statistical programming languages such as S or XploRe, is interpreted, which makes it slow in computation-intensive areas. The common solution is to implement these routines in low-level programming languages like C/C++ or Fortran and subsequently integrate them as dynamic link libraries (DLLs) or shared object libraries (SOs) into the statistical programming language.
Keywords: | statistical programming language, XploRe, Yxilon, Java, dynamic linked libraries, shared object libraries |
JEL: | C80 |
Date: | 2005–03 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2005-018&r=cmp |
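The common solution described in the abstract above, plugging compiled routines into an interpreted language via a shared library, can be sketched with Python standing in for the statistical language and the C math library standing in for the user's compiled routine. The `dlopen`/`LoadLibrary` mechanism shown is the same one R or XploRe use; the library name fallback is an assumption about the host platform:

```python
import ctypes
import ctypes.util

# Load the C math library as a shared object; on platforms where
# find_library fails, fall back to the common glibc soname (assumption).
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# Declare the C signature so the interpreter marshals arguments correctly:
# double cos(double)
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

# The interpreted language now calls straight into compiled code,
# e.g. libm.cos(0.0) evaluates cos at 0.
```

In R the analogous steps are compiling the routine into a `.so`/`.dll` and loading it with `dyn.load` before calling it through the foreign-function interface.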