nep-cmp New Economics Papers
on Computational Economics
Issue of 2020‒07‒20
25 papers chosen by



  1. A comparative study of forecasting Corporate Credit Ratings using Neural Networks, Support Vector Machines, and Decision Trees By Parisa Golbayani; Ionuț Florescu; Rupak Chatterjee
  2. An overall view of key problems in algorithmic trading and recent progress By Michaël Karpe
  3. Seasonal impacts of climate change on electricity production By Jacques Despres; Marko Adamovic
  4. Distributional effects of emission pricing in a carbon-intensive economy: the case of Poland By Marek Antosiewicz; Rodrigo Fuentes; Piotr Lewandowski; Jan Witajewski-Baltvilks
  5. Driving factors behind the changes in income distribution in the Baltics: income, policy, demography By Jekaterina Navicke
  6. Redistribution, Inequality, and Efficiency with Credit Constraints By Yoseph Y. Getachew; Stephen J. Turnovsky
  7. A Model of Global Beverage Markets By Anderson, Kym; Wittwer, Glyn
  8. Risk management of guaranteed minimum maturity benefits under stochastic mortality and regime-switching by Fourier space time-stepping framework By Wenlong Hu
  9. V-, U-, L-, or W-shaped recovery after COVID? Insights from an Agent Based Model By Dhruv Sharma; Jean-Philippe Bouchaud; Stanislao Gualdi; Marco Tarzia; Francesco Zamponi
  10. Machine Learning Econometrics: Bayesian algorithms and methods By Dimitris Korobilis; Davide Pettenuzzo
  11. Deep Investing in Kyle's Single Period Model By Paul Friedrich; Josef Teichmann
  12. 25 Years of Austria's EU Membership. Quantifying the Economic Benefits With a DSGE Model By Fritz Breuss
  13. Determining Secondary Attributes for Credit Evaluation in P2P Lending By Revathi Bhuvaneswari; Antonio Segalini
  14. Combining Microsimulation and Optimization to Identify Optimal Flexible Tax-Transfer Rules By Colombino, Ugo; Islam, Nizamul
  15. Global macroeconomic balances for mid-century climate analyses By REY LOS SANTOS Luis; WOJTOWICZ Krzysztof; TAMBA Marie; VANDYCK Toon; WEITZEL Matthias; SAVEYN Bert; TEMURSHO Umed
  16. Valuing the quality option in agricultural commodity futures: a Monte Carlo simulation based approach By Sanjay Mansabdar; Hussain C Yaganti
  17. Does the choice of balance-measure matter under Genetic Matching? By Adeola Oyenubi; Martin Wittenberg
  18. Nowcasting Industrial Production Using Unconventional Data Sources By Fornaro, Paolo
  19. Building(s and) cities: Delineating urban areas with a machine learning algorithm By Arribas-Bel, Daniel; Garcia-Lopez, Miquel-Angel; Viladecans-Marsal, Elisabet
  20. The Cost of Omitting the Credit Channel in DSGE Models: A Policy Mix Approach By Takeshi Yagihashi
  21. Approximate XVA for European claims By Fabio Antonelli; Alessandro Ramponi; Sergio Scarlatti
  22. What can be learned from the free destination option in the LNG imbroglio? By Amina Baba; Anna Cretti; Olivier Massol
  23. Deeply Equal-Weighted Subset Portfolios By Sang Il Lee
  24. Robust Product Markovian Quantization By Ralph Rudd; Thomas A. McWalter; Joerg Kienitz; Eckhard Platen
  25. Fast calibration of the LIBOR Market Model with Stochastic Volatility based on analytical gradient By Hervé Andres; Pierre-Edouard Arrouy; Paul Bonnefoy; Alexandre Boumezoued; Sophian Mehalla

  1. By: Parisa Golbayani; Ionuț Florescu; Rupak Chatterjee
    Abstract: Credit ratings are one of the primary indicators of a corporation's riskiness and of its reliability in meeting financial obligations. Rating agencies tend to take extended periods of time to provide new ratings and update older ones. Therefore, credit scoring assessments using artificial intelligence have gained a lot of interest in recent years. Successful machine learning methods can provide rapid analysis of credit scores while updating older ones on a daily time scale. Related studies have shown that neural networks and support vector machines outperform other techniques by providing better prediction accuracy. The purpose of this paper is twofold. First, we provide a survey and a comparative analysis of results from the literature applying machine learning techniques to predict credit ratings. Second, we ourselves apply four machine learning techniques deemed useful in previous studies (Bagged Decision Trees, Random Forest, Support Vector Machine and Multilayer Perceptron) to the same datasets. We evaluate the results using 10-fold cross-validation. For the datasets chosen, the results of the experiment show superior performance for the decision-tree-based models. In addition to the conventional accuracy measure of classifiers, we introduce a measure of accuracy based on notches, called "Notch Distance", to analyze the performance of the above classifiers in the specific context of credit rating. This measure tells us how far the predictions are from the true ratings. We further compare the performance of three major rating agencies, Standard & Poor's, Moody's and Fitch, and show that the differences among their ratings are comparable to the difference between the decision tree predictions and the actual ratings on the test dataset.
    Date: 2020–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2007.06617&r=all
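    Illustration: the "Notch Distance" described above is simple to compute once ratings are encoded as ordinal notches. A minimal sketch on synthetic data (the features, notch labels and model settings are invented stand-ins, not the paper's datasets):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 10))                    # hypothetical financial ratios
        y = np.clip((2 * X[:, 0] + rng.normal(size=500)).round().astype(int), -3, 3) + 5  # notch labels 2..8

        model = RandomForestClassifier(n_estimators=200, random_state=0)
        pred = cross_val_predict(model, X, y, cv=10)      # 10-fold cross-validation, as in the paper

        accuracy = (pred == y).mean()                     # conventional accuracy
        notch_distance = np.abs(pred - y).mean()          # mean number of notches off the true rating
        print(f"accuracy={accuracy:.3f}, mean notch distance={notch_distance:.2f}")
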
  2. By: Michaël Karpe
    Abstract: We summarize the fundamental issues at stake in algorithmic trading and the progress made in this field over the last twenty years. We first present the key problems of algorithmic trading, describing the concepts of optimal execution, optimal placement, and price impact. We then discuss the most recent advances in algorithmic trading through the use of Machine Learning, covering Deep Learning, Reinforcement Learning, and Generative Adversarial Networks.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.05515&r=all
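    Illustration: a classical starting point for the optimal-execution problem surveyed above is the Almgren-Chriss closed-form liquidation schedule (a textbook result, not this paper's contribution). A sketch with invented parameters:

        import numpy as np

        X0, T, N = 1_000_000, 1.0, 50            # shares to sell, horizon (days), intervals
        sigma, eta, lam = 0.95, 2.5e-6, 2e-6     # daily volatility, temporary impact, risk aversion
        kappa = np.sqrt(lam * sigma**2 / eta)    # urgency parameter (continuous-time approximation)

        t = np.linspace(0, T, N + 1)
        holdings = X0 * np.sinh(kappa * (T - t)) / np.sinh(kappa * T)
        trades = -np.diff(holdings)              # shares sold per interval; front-loaded when lam > 0
        print(trades[:5].round(), trades[-5:].round())
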
  3. By: Jacques Despres (European Commission - JRC); Marko Adamovic (European Commission - JRC)
    Abstract: PESETA IV assesses the impacts of climate change on electricity production by hydro, wind, solar, nuclear and other thermal power plants, including biomass, coal, gas and oil. We assess these impacts in the present power system and in 2050 for a dynamic scenario in line with 2°C mitigation efforts. Both scenarios show that, at EU-level, the production of hydropower plants increases with global warming thanks to higher water availability (although this does not imply substantial development of new hydro plants), while nuclear power decreases. However, there are regional differences in the impacts, such as increased hydro production in the North, and a decline in hydro- and nuclear power production in southern Europe due to lower water availability for direct production or for cooling river-based plants. In northern Europe, the increasing availability of cheaper hydro results in substitution effects and lower production costs, while in southern Europe production costs could increase. Based on the modelling methodology used and the latest available climate simulations, the direct impacts of climate change on wind and solar production are not significant at EU-level. However, in the 2050 power system their capacity would increase in southern regions to compensate for the lost hydro and nuclear production. Climate change impacts on energy in the rest of the world show a negligible spill-over effect on Europe. Improved cooling technologies have the potential to strongly reduce the negative effects of water scarcity, particularly for nuclear plants in southern Europe.
    Keywords: Climate change impacts, Water scarcity, Hydropower, Thermal plants, Wind, Solar, Climate change, Electricity production, Electricity supply
    Date: 2020–05
    URL: http://d.repec.org/n?u=RePEc:ipt:iptwpa:jrc118155&r=all
  4. By: Marek Antosiewicz; Rodrigo Fuentes; Piotr Lewandowski; Jan Witajewski-Baltvilks
    Abstract: In this paper, we assess the distributional impact of introducing a carbon tax in Poland. We apply a two-step simulation procedure. First, we evaluate the economy-wide effects with a dynamic general equilibrium model. Second, we use a microsimulation model based on household budget survey data to assess the effects on various income groups and on inequality. We introduce a new adjustment channel related to employment changes, which is qualitatively different from price and behavioural effects, and is quantitatively important. We find that the overall distributional effect of a carbon tax is largely driven by how the revenue is spent: distributing the revenues from a carbon tax as lump-sum transfers to households reduces income inequality, while spending the revenues on a reduction of labour taxation increases inequality. These results could be relevant for other coal-producing countries, such as South Africa, Germany, or Australia.
    Keywords: climate policy, distribution effect, microsimulation model, general equilibrium model, employment
    JEL: H23 P18
    Date: 2020–07
    URL: http://d.repec.org/n?u=RePEc:ibt:wpaper:wp072020&r=all
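    Illustration: the revenue-recycling result above can be seen in a toy calculation. A sketch with five invented household incomes and a made-up, mildly regressive carbon-tax burden; only the qualitative direction of the Gini changes matters:

        import numpy as np

        def gini(x):
            x = np.sort(x)
            n = len(x)
            return (2 * np.arange(1, n + 1) - n - 1) @ x / (n * x.sum())

        income = np.array([10_000, 20_000, 35_000, 60_000, 120_000], dtype=float)
        carbon_tax = 0.03 * income ** 0.8      # burden rises less than proportionally with income
        revenue = carbon_tax.sum()
        net = income - carbon_tax

        lump_sum = net + revenue / len(income)              # equal per-household transfer
        labour_cut = net + revenue * income / income.sum()  # rebate proportional to labour income

        print(f"baseline Gini    {gini(income):.4f}")
        print(f"lump-sum Gini    {gini(lump_sum):.4f}")     # more equal than baseline
        print(f"labour-cut Gini  {gini(labour_cut):.4f}")   # less equal than the lump-sum variant
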
  5. By: Jekaterina Navicke (Vilniaus Universitetas)
    Abstract: This paper aims at disentangling factors behind the changes in income inequality and relative poverty in the Baltics. The evaluation of the income, policy and demographic effects was based on counterfactual scenarios constructed using tax-benefit microsimulation and re-weighting techniques. Decomposition showed that income and policy effects were dominant for changes in inequality and relative poverty. The policy effects were inequality- and poverty-reducing before the crisis and after the EU accession as a whole. The income effects for the same periods were inequality- and poverty-increasing. Despite rapid demographic changes, the demographic effect on income inequality and relative poverty was marginal.
    Keywords: income inequality, poverty, demographic change, policy reform, Baltics
    JEL: J31
    Date: 2020
    URL: http://d.repec.org/n?u=RePEc:fme:wpaper:44&r=all
  6. By: Yoseph Y. Getachew; Stephen J. Turnovsky
    Abstract: We develop a model that characterizes the joint determination of income distribution and macroeconomic aggregate dynamics. We identify multiple channels through which alternative public policies such as transfers, consumption and income taxes, and public investment will affect the inequality-efficiency trade-off. Some policy changes can affect net income inequality both directly, and indirectly by inducing structural changes in the private-public capital ratio. This in turn influences market inequality and determines the distribution of the next period's investment and net income. Income tax and transfers have both a direct income effect and an indirect substitution effect, whereas the consumption tax has only the latter. After developing some theoretical propositions summarizing these policy tradeoffs, we present extensive numerical simulations motivated by the South African National Development Plan 2030, the objective of which is to tame soaring inequality and increase per capita GDP. Our numerical simulations illustrate how a judicious combination of these policies may help achieve these targets. The simulations also suggest that the sharp decline in the private-public capital ratio, coupled with a high degree of complementarity between public and private capital, could be behind the persistence of market inequality in South Africa during the last two decades.
    Keywords: Redistribution policies, Incomplete Capital Market, Idiosyncratic shocks, Efficiency, inequality
    JEL: D31 O41
    Date: 2020–04
    URL: http://d.repec.org/n?u=RePEc:rza:wpaper:817&r=all
  7. By: Anderson, Kym; Wittwer, Glyn
    Abstract: This paper describes a new empirical model of the world's markets for alcoholic beverages and, to illustrate its usefulness, reports results from projections of those markets from 2016-18 to 2025 under various scenarios. It not only revises and updates a model of the world's wine markets (Wittwer, Berger and Anderson, 2003) but also adds beer and spirits so as to capture the substitutability of those beverages among consumers. The model has some of the features of an economywide computable general equilibrium model, with international trade linking the markets of its 44 countries and seven residual regions. It is used to simulate prospects for these markets by 2025 (business-as-usual), which points to Asia's rise. Then two alternative scenarios to 2025 are explored: one simulates the withdrawal of the United Kingdom from the European Union (EU); the other simulates the effects of the recent imposition of additional 25% tariffs on selected beverages imported by the United States from several EU member countries. Future applications of the model are discussed in the concluding section.
    Keywords: beer; CGE modeling; changes in beverage preferences; international trade in beverages; spirits; wine
    JEL: C53 F11 F17 Q13
    Date: 2020–02
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:14387&r=all
  8. By: Wenlong Hu
    Abstract: This paper presents a novel framework for valuation and hedging of the insurer's net liability on a Guaranteed Minimum Maturity Benefit (GMMB) embedded in variable annuity (VA) contracts whose underlying mutual fund dynamics evolve under a regime-switching model. Numerical solutions for valuations and Greeks (i.e. valuation sensitivities with respect to model parameters) of the GMMB under stochastic mortality are derived. Valuation and hedging are performed using an accurate, fast and efficient Fourier Space Time-stepping (FST) algorithm. The mortality component of the model is calibrated to the American male population. Sensitivity analysis is performed with respect to various parameters. The hedge effectiveness is assessed by comparing profit-and-loss performances for an unhedged portfolio and three statically hedged portfolios. The results provide a comprehensive analysis of valuing and hedging the longevity risk, interest rate risk and equity risk for the GMMB embedded in VAs, and highlight the benefits to insurance providers who offer those products.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.15483&r=all
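    Illustration: the core of the FST algorithm is that stepping a value function backward in time amounts to one multiplication in Fourier space by the exponential of the model's characteristic exponent. A minimal sketch pricing a plain European put under Black-Scholes dynamics (the paper's regime switching, mortality and GMMB features are omitted, and all parameters are invented):

        import numpy as np

        S0, K, r, sigma, T = 100.0, 100.0, 0.02, 0.2, 1.0
        N, L = 2**12, 10.0                         # grid points, log-price half-width
        x = np.linspace(-L, L, N, endpoint=False)  # grid in log(S/S0)
        w = 2 * np.pi * np.fft.fftfreq(N, d=x[1] - x[0])   # angular frequencies

        # Black-Scholes characteristic exponent, including the discount term -r
        psi = 1j * w * (r - 0.5 * sigma**2) - 0.5 * (sigma * w)**2 - r
        payoff = np.maximum(K - S0 * np.exp(x), 0.0)       # European put at expiry

        value = np.real(np.fft.ifft(np.fft.fft(payoff) * np.exp(psi * T)))
        print(f"FST put price at S=S0: {np.interp(0.0, x, value):.4f}")
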
  9. By: Dhruv Sharma; Jean-Philippe Bouchaud; Stanislao Gualdi; Marco Tarzia; Francesco Zamponi
    Abstract: We discuss the impact of a Covid-like shock on a simple toy economy, described by the Mark-0 Agent-Based Model that we developed and discussed in a series of previous papers. We consider a mixed supply and demand shock, and show that depending on the shock parameters (amplitude and duration), our toy economy can display V-shaped, U-shaped or W-shaped recoveries, and even an L-shaped output curve with permanent output loss. This is due to the existence of a self-sustained "bad" state of the economy. We then discuss two policies that attempt to moderate the impact of the shock: giving easy credit to firms, and the so-called helicopter money, i.e. injecting new money into households' savings. We find that both policies are effective if strong enough, and we highlight the potential danger of terminating these policies too early. While we only discuss a limited number of scenarios, our model is flexible and versatile enough to allow for a much wider exploration, thus serving as a useful tool for the qualitative understanding of post-Covid recovery.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.08469&r=all
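    Illustration: a deliberately crude caricature of the mechanism described above (this is not the Mark-0 model): output is pulled toward a good anchor, but a deep enough shock pushes it into the basin of a self-sustained bad state, turning a V into a U or an L; a second shock during recovery would trace a W. All numbers are invented:

        import numpy as np

        def path(amplitude, duration, threshold=0.7, speed=0.15, T=120):
            y = np.ones(T)                        # output, normalized to 1 pre-shock
            for t in range(1, T):
                shock = amplitude if t < duration else 0.0
                anchor = 1.0 if y[t-1] > threshold else 0.6   # self-sustained "bad" state
                y[t] = y[t-1] + speed * (anchor - y[t-1]) - shock
            return y

        for amp, dur, label in [(0.05, 3, "V"), (0.05, 12, "U"), (0.05, 25, "L")]:
            y = path(amp, dur)
            print(f"{label}-shape: trough {y.min():.3f}, final output {y[-1]:.3f}")
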
  10. By: Dimitris Korobilis (University of Glasgow); Davide Pettenuzzo (Brandeis University)
    Abstract: As the amount of economic and other data generated worldwide vastly increases, a challenge for future generations of econometricians will be to master efficient algorithms for inference in empirical models with large information sets. This Chapter provides a review of popular estimation algorithms for Bayesian inference in econometrics and surveys alternative algorithms developed in machine learning and computer science that allow for efficient computation in high-dimensional settings. The focus is on the scalability and parallelizability of each algorithm, as well as their ability to be adopted in various empirical settings in economics and finance.
    Keywords: MCMC; approximate inference; scalability; parallel computation
    JEL: C11 C15 C49 C88
    Date: 2020–04
    URL: http://d.repec.org/n?u=RePEc:brd:wpaper:130&r=all
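    Illustration: the workhorse MCMC algorithm reviewed in surveys like this is the Gibbs sampler. A self-contained sketch for Bayesian linear regression with conjugate priors (all priors and data are invented):

        import numpy as np

        rng = np.random.default_rng(1)
        n, k = 200, 3
        X = rng.normal(size=(n, k))
        beta_true = np.array([1.0, -2.0, 0.5])
        y = X @ beta_true + rng.normal(scale=0.8, size=n)

        # Conjugate priors: beta ~ N(0, tau2 * I), sigma2 ~ Inverse-Gamma(a0, b0)
        tau2, a0, b0 = 10.0, 2.0, 2.0
        sigma2, draws = 1.0, []
        XtX, Xty = X.T @ X, X.T @ y
        for it in range(2000):
            V = np.linalg.inv(XtX / sigma2 + np.eye(k) / tau2)    # beta | sigma2, y
            beta = rng.multivariate_normal(V @ (Xty / sigma2), V)
            resid = y - X @ beta                                  # sigma2 | beta, y
            sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + resid @ resid / 2))
            if it >= 500:                                         # discard burn-in
                draws.append(beta)
        print("posterior means:", np.mean(draws, axis=0).round(2))
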
  11. By: Paul Friedrich; Josef Teichmann
    Abstract: The Kyle model describes how an equilibrium of order sizes and security prices naturally arises between a trader with insider information and the price-providing market maker as they interact through a series of auctions. Ever since being introduced by Albert S. Kyle in 1985, the model has been important in the study of market microstructure models with asymmetric information. Because it is well understood, it offers an excellent opportunity to study how modern deep learning technology can be used to replicate and better understand equilibria that occur in certain market learning problems. We model the agents in Kyle's single-period setting using deep neural networks. The networks are trained by interacting according to the rules and objectives defined by Kyle. We show how the right network architectures and training methods lead the agents' behaviour to converge to the theoretical equilibrium predicted by Kyle's model.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.13889&r=all
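    Illustration: a minimal PyTorch sketch of this kind of adversarial training (the architectures, batch sizes and learning rates are assumptions, not the paper's): an insider network maps the asset value v to an order x, a market-maker network maps total order flow x + u to a price, and each is trained against its own objective.

        import torch

        torch.manual_seed(0)

        def mlp():
            return torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.ReLU(),
                                       torch.nn.Linear(32, 1))

        insider, maker = mlp(), mlp()
        opt_i = torch.optim.Adam(insider.parameters(), lr=1e-3)
        opt_m = torch.optim.Adam(maker.parameters(), lr=1e-3)

        for step in range(5000):
            v = torch.randn(256, 1)   # asset value ~ N(0, 1)
            u = torch.randn(256, 1)   # noise-trader order flow ~ N(0, 1)
            # market maker: price the observed total order flow efficiently
            p = maker(insider(v).detach() + u)
            opt_m.zero_grad(); ((p - v) ** 2).mean().backward(); opt_m.step()
            # insider: maximize expected trading profit (v - p) * x
            x = insider(v)
            p = maker(x + u)
            opt_i.zero_grad(); (-(v - p) * x).mean().backward(); opt_i.step()

        # With these unit normals, Kyle's equilibrium is x = v and p = (x + u) / 2;
        # the trained networks should approximate this.
        print(insider(torch.tensor([[1.0]])).item(), maker(torch.tensor([[2.0]])).item())
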
  12. By: Fritz Breuss
    Abstract: Austria's EU accession 25 years ago, alongside Finland and Sweden, was preceded by an extended period of convergence toward the EU: via the free trade agreement concluded with the EC in 1973 and participation in the European Economic Area (EEA) in 1994. Although the COVID-19 crisis in 2020 seems to overshadow the overall positive balance of 25 years of EU membership, on average the real GDP growth dividend amounted to 0.8 percentage points per year since 1995. To check the robustness of this result, obtained with an integration macro model, a DSGE model for Austria is used here. Usually other methods are applied to estimate integration effects: trade gravity models, CGE models, macro models. Following in 't Veld's (2019) approach with a DSGE model for the EU, we adapt an earlier version of the two-country DSGE model for Austria and the Euro area (Breuss and Rabitsch, 2009) to evaluate the benefits of Austria's EU membership. It turns out that grosso modo the macro results can be confirmed with the DSGE model.
    Keywords: European Integration; Model simulations; country studies
    Date: 2020–06–26
    URL: http://d.repec.org/n?u=RePEc:wfo:wpaper:y:2020:i:603&r=all
  13. By: Revathi Bhuvaneswari; Antonio Segalini
    Abstract: There has been an increased need for secondary means of credit evaluation by both traditional banking organizations and peer-to-peer lending entities. This is especially important in the present technological era, where relying strictly on primary credit histories does not help distinguish between a 'good' and a 'bad' borrower, and ends up hurting both the individual borrower and the investor. We utilized machine learning classification and clustering algorithms to accurately predict a borrower's creditworthiness while identifying specific secondary attributes that contribute to this score. While extensive research has been done in predicting when a loan would be fully paid, the area of feature selection for lending is relatively new. We achieved 65% F1 and 73% AUC on the LendingClub data while identifying key secondary attributes.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.13921&r=all
  14. By: Colombino, Ugo (University of Turin); Islam, Nizamul (LISER (CEPS/INSTEAD))
    Abstract: We use a behavioural microsimulation model embedded in a numerical optimization procedure in order to identify optimal (social welfare maximizing) tax-transfer rules. We consider the class of tax-transfer rules consisting of a universal basic income and a tax defined by a 4th degree polynomial. The rule is applied to total taxable household income. A microeconometric model that simulates household labour supply decisions is embedded into a numerical routine in order to identify – within the class defined above – the tax-transfer rule that maximizes a social welfare function. We present the results for five European countries: France, Italy, Luxembourg, Spain and the United Kingdom. For most values of the inequality aversion parameter, the optimized rules provide a higher social welfare than the current rule, with the exception of Luxembourg. In France, Italy and Luxembourg the optimized rules are significantly different from the current ones and are close to a negative income tax or a universal basic income with a flat tax rate. In Spain and the UK, the optimized rules are instead close to the current rule. With the exception of Spain, the optimal rules are slightly disequalizing and the social welfare gains are due to efficiency gains. Nonetheless, the poverty gap index tends to be lower under the optimized regime.
    Keywords: empirical optimal taxation, microsimulation, microeconometrics, evaluation of tax-transfer rules
    JEL: H21 C18
    Date: 2020–05
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp13309&r=all
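    Illustration: a heavily simplified version of the optimization loop described above: a universal basic income B plus a 4th-degree polynomial tax is chosen to maximize a social welfare function, with a crude stand-in for the behavioural labour-supply response. Every functional form and number below is invented:

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(2)
        w = rng.lognormal(3.0, 0.5, 1000)       # hypothetical gross incomes
        rho = 1.5                               # inequality aversion of the planner

        def objective(params):
            B, t1, t2, t3, t4 = params
            s = w / w.mean()                            # scaled gross income
            mtr = t1 + 2*t2*s + 3*t3*s**2 + 4*t4*s**3   # marginal rate of the polynomial tax
            hours = np.clip(1.0 - 0.3 * mtr, 0.0, 1.0)  # crude labour-supply response (assumed)
            z = w * hours / w.mean()                    # scaled taxable income
            tax = w.mean() * (t1*z + t2*z**2 + t3*z**3 + t4*z**4)
            net = np.maximum(w * hours - tax + B, 1e-3)
            deficit = B * len(w) - tax.sum()            # the UBI must be financed by the tax
            welfare = (net ** (1 - rho)).sum() / (1 - rho)
            return -welfare + 1e-4 * max(deficit, 0.0) ** 2

        res = minimize(objective, x0=[2.0, 0.3, 0.0, 0.0, 0.0], method="Nelder-Mead",
                       options={"maxiter": 20000})
        print("UBI and tax polynomial coefficients:", res.x.round(3))
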
  15. By: REY LOS SANTOS Luis (European Commission - JRC); WOJTOWICZ Krzysztof (European Commission - JRC); TAMBA Marie (European Commission - JRC); VANDYCK Toon (European Commission - JRC); WEITZEL Matthias (European Commission - JRC); SAVEYN Bert (European Commission - JRC); TEMURSHO Umed
    Abstract: This document presents the economic balances for the Baseline scenario used in the Global Energy and Climate Outlook (GECO) 2018. The Baseline scenario is a projection of the world economy with corresponding energy demand and GHG emissions, under the assumption that current climate and energy policies continue and that the Nationally Determined Contributions (NDCs) under the Paris Agreement are realized. Because this scenario does not avert catastrophic climate change, it serves as a reference against which alternative scenarios with more stringent policy measures are compared. Economic balances are supplemented by energy balances, which come from energy models but are consistent with the economic data. Finally, we show the evolution of GHG emissions over time implied by the economic activity in the Baseline. The procedure used to generate the Baseline scenario is called PIRAMID: Platform to Integrate, Reconcile and Align Model-based Input-output Data. PIRAMID is a new methodology to project Multi-Regional Input-Output tables over time. This approach allows for integrating data from external models and databases. The result is a series of consistent and transparent IO tables.
    Keywords: Baseline, CGE, Input-Output tables, Macroeconomic projections
    Date: 2018–12
    URL: http://d.repec.org/n?u=RePEc:ipt:iptwpa:jrc113981&r=all
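    Illustration: the PIRAMID procedure itself is documented in the paper; a standard building block for projecting input-output tables toward new row and column totals is biproportional (RAS) scaling, sketched here on an invented 3-sector table:

        import numpy as np

        A = np.array([[10.0, 5.0, 2.0],
                      [4.0, 12.0, 6.0],
                      [3.0, 2.0, 9.0]])            # base-year flows (invented)
        row_targets = np.array([20.0, 25.0, 15.0])  # projected row sums
        col_targets = np.array([18.0, 22.0, 20.0])  # projected column sums (same grand total)

        X = A.copy()
        for _ in range(200):
            X *= (row_targets / X.sum(axis=1))[:, None]   # scale rows to targets
            X *= col_targets / X.sum(axis=0)              # scale columns to targets
        print(X.round(2))
        print(X.sum(axis=1).round(3), X.sum(axis=0).round(3))
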
  16. By: Sanjay Mansabdar; Hussain C Yaganti
    Abstract: Agricultural commodity futures are often settled by delivery. Quality options that allow the futures short to deliver one of several underlying assets are commonly used in such contracts to prevent manipulation. Inclusion of these options reduces the price of the futures contract and leads to degraded contract hedging performance. Valuation of these options is a first step in assessing the impact of the quality options embedded into a futures contract. This paper demonstrates a Monte Carlo simulation based approach to estimate the value of a quality option. In order to improve simulation efficiency, the technique of antithetic variables is used. This approach can help in the assessment of the impact of embedded quality options.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.11222&r=all
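    Illustration: a sketch of the simulation approach under simplifying assumptions (two correlated lognormal grades, invented parameters): the quality option's value is estimated as the expected discounted difference between a fixed reference grade and the cheapest deliverable grade, with antithetic variates to reduce variance:

        import numpy as np

        rng = np.random.default_rng(3)
        S0 = np.array([100.0, 102.0])          # spot prices of two deliverable grades
        sigma = np.array([0.25, 0.30])
        corr, r, T, n = 0.8, 0.03, 0.5, 100_000

        chol = np.linalg.cholesky(np.array([[1.0, corr], [corr, 1.0]]))
        Z = rng.standard_normal((n, 2))
        payoffs = []
        for z in (Z, -Z):                      # antithetic pair: reuse each draw negated
            eps = z @ chol.T
            ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * eps)
            payoffs.append(np.exp(-r * T) * (ST[:, 0] - ST.min(axis=1)))
        print(f"quality option value ~ {np.mean(np.concatenate(payoffs)):.3f}")
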
  17. By: Adeola Oyenubi; Martin Wittenberg
    Abstract: In applied studies, the influence of balance measures on the performance of matching estimators is often taken for granted. This paper considers the performance of different balance measures that have been used in the literature when balance is being optimized. We also propose the use of the entropy measure in assessing balance. To examine the effect of balance measures, we conduct a simulation study where we optimize balance using a Genetic Algorithm (GenMatch). We find that balance measures do influence matching estimates under the GenMatch algorithm. The bias and Root Mean Square Error (RMSE) of the estimated treatment effect vary with the choice of balance measure. In the artificial Data Generating Process (DGP) with one covariate considered in this study, the proposed entropy balance measure has the lowest RMSE. The implication of these results is that the sensitivity of matching estimates to the choice of balance measure should be given greater attention in empirical studies.
    Keywords: Genetic matching, balance measures, Information Theory, entropy metric
    JEL: I38 H53 C21 D13
    Date: 2020–05
    URL: http://d.repec.org/n?u=RePEc:rza:wpaper:819&r=all
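    Illustration: one way to operationalize an entropy-based balance measure (the paper's exact definition may differ) is the relative entropy between binned treated and control covariate distributions; smaller means better balance. A sketch on invented data:

        import numpy as np

        rng = np.random.default_rng(4)
        treated = rng.normal(0.2, 1.0, 300)
        control = rng.normal(0.0, 1.2, 300)

        bins = np.histogram_bin_edges(np.concatenate([treated, control]), bins=20)
        p, _ = np.histogram(treated, bins=bins)
        q, _ = np.histogram(control, bins=bins)
        p = (p + 1e-9) / p.sum()                   # smooth to avoid log(0)
        q = (q + 1e-9) / q.sum()
        kl = float(np.sum(p * np.log(p / q)))      # relative entropy D(p || q)
        print(f"entropy balance measure: {kl:.4f}  (0 = perfect balance)")
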
  18. By: Fornaro, Paolo
    Abstract: In this work, we rely on unconventional data sources to nowcast the year-on-year growth rate of Finnish industrial production, for different industries. As predictors, we use real-time truck traffic volumes measured automatically in different geographical locations around Finland, as well as electricity consumption data. In addition to standard time-series models, we look into the adoption of machine learning techniques to compute the predictions. We find that the use of non-typical data sources such as the volume of truck traffic is beneficial, in terms of predictive power, giving us substantial gains in nowcasting performance compared to an autoregressive model. Moreover, we find that the adoption of machine learning techniques substantially improves the accuracy of our predictions in comparison to standard linear models. While the average nowcasting errors we obtain are higher compared to the current revision errors of the official statistical institute, our nowcasts provide clear signals of the overall trend of the series and of sudden changes in growth.
    Keywords: Flash Estimates, Machine Learning, Big Data, Nowcasting
    JEL: C33 C55 E37
    Date: 2020–06–30
    URL: http://d.repec.org/n?u=RePEc:rif:wpaper:80&r=all
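    Illustration: the paper's comparison in miniature: an AR(1) benchmark against a random forest that also sees timely traffic and electricity indicators. All series below are simulated stand-ins for the Finnish data:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(5)
        T = 200
        truck = rng.normal(size=T)                  # truck-traffic growth (simulated)
        elec = 0.5 * truck + rng.normal(scale=0.5, size=T)
        ip = np.zeros(T)                            # industrial production growth
        for t in range(1, T):
            ip[t] = 0.5 * ip[t-1] + 0.4 * truck[t] + 0.2 * elec[t] + rng.normal(scale=0.3)

        y = ip[1:]                                  # target: current month's growth
        X_ar = ip[:-1].reshape(-1, 1)               # AR(1) uses only last month's value
        X_ml = np.column_stack([ip[:-1], truck[1:], elec[1:]])  # adds timely indicators

        split = 150
        ar = np.linalg.lstsq(X_ar[:split], y[:split], rcond=None)[0]
        rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_ml[:split], y[:split])

        rmse = lambda e: float(np.sqrt(np.mean(e ** 2)))
        print("AR(1) RMSE:", round(rmse(y[split:] - X_ar[split:] @ ar), 3))
        print("RF    RMSE:", round(rmse(y[split:] - rf.predict(X_ml[split:])), 3))
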
  19. By: Arribas-Bel, Daniel; Garcia-Lopez, Miquel-Angel; Viladecans-Marsal, Elisabet
    Abstract: This paper proposes a novel methodology for delineating urban areas based on a machine learning algorithm that groups buildings within portions of space of sufficient density. To do so, we use the precise geolocation of all 12 million buildings in Spain. We exploit building heights to create a new dimension for urban areas, namely, the vertical land, which provides a more accurate measure of their size. To better understand their internal structure and to illustrate an additional use for our algorithm, we also identify employment centers within the delineated urban areas. We test the robustness of our method and compare our urban areas to other delineations obtained using administrative borders and commuting-based patterns. We show that: 1) our urban areas are more similar to the commuting-based delineations than the administrative boundaries but that they are more precisely measured; 2) when analyzing the urban areas' size distribution, Zipf's law appears to hold for their population, surface and vertical land; and 3) the impact of transportation improvements on the size of the urban areas is not underestimated.
    Keywords: Buildings; City size; Machine Learning; Transportation; urban areas
    JEL: R12 R14 R2 R40
    Date: 2020–02
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:14450&r=all
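    Illustration: the paper describes its own algorithm; an off-the-shelf analogue of "grouping buildings within portions of space of sufficient density" is density-based clustering. A sketch with DBSCAN on synthetic building coordinates:

        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(6)
        city_a = rng.normal([0.0, 0.0], 0.5, size=(400, 2))   # dense cluster of buildings
        city_b = rng.normal([5.0, 5.0], 0.7, size=(300, 2))
        rural = rng.uniform(-3, 8, size=(100, 2))             # scattered buildings
        coords = np.vstack([city_a, city_b, rural])

        labels = DBSCAN(eps=0.3, min_samples=10).fit_predict(coords)
        n_urban = len(set(labels)) - (1 if -1 in labels else 0)
        print(f"{n_urban} urban areas; {int((labels == -1).sum())} buildings left rural")
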
  20. By: Takeshi Yagihashi (Senior Economist, Policy Research Institute)
    Abstract: This paper uses model simulations to examine whether the omission of the credit channel from the policy models used by both monetary and fiscal policymakers would lead to a noticeably "bad" policy outcome. First, we simulate a financial crisis in which the financial market friction grows and the risk premium becomes more volatile. Next, both monetary and fiscal policymakers readjust their policy to stabilize the economy using an approximating DSGE model that does not feature the credit channel. We show that while the model misspecification does not much affect how policymakers perceive the crisis, the newly adopted policy based on the approximating model would cause further destabilization of the economy. We also show that the destabilization of the economy could be prevented if the fiscal policymaker is equipped with the correctly-specified credit channel model and chooses its new policy while taking into account the decision-making of the monetary policymaker. Finally, under the scenario that the correctly-specified model is unknown, we show that the destabilization of the economy could still be prevented if both policymakers can apply judgement to unreasonable parameter estimates during the crisis period. In sum, prediction of policy outcomes and cautiousness in interpreting estimation results can help in mitigating the credit channel misspecification.
    Keywords: DSGE model, Lucas Critique, Bayesian estimation, Financial Accelerator model, monetary policy, fiscal policy, policy mix
    Date: 2020–03
    URL: http://d.repec.org/n?u=RePEc:mof:wpaper:ron324&r=all
  21. By: Fabio Antonelli; Alessandro Ramponi; Sergio Scarlatti
    Abstract: We consider the problem of computing the Value Adjustment of European contingent claims when default of either party is considered, possibly including also funding and collateralization requirements. As shown by Brigo et al., this leads to a richer variety of Value Adjustments (XVA) that introduce some nonlinear features. When exploiting a reduced-form approach for the default times, the adjusted price can be characterized as the solution to a possibly nonlinear Backward Stochastic Differential Equation (BSDE). The expectation representing the solution of the BSDE is usually quite hard to compute even in a Markovian setting, and one might resort either to the discretization of the Partial Differential Equation characterizing it or to Monte Carlo simulations. Both choices are computationally very expensive, and in this paper we suggest an approximation method based on an appropriate change of numeraire and on a Taylor polynomial expansion when intensities are represented by means of affine processes correlated with the asset's price. The numerical discussion at the end of this work shows that, at least in the case of the CIR intensity model, even the simple first-order approximation has a remarkable computational efficiency.
    Date: 2020–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2007.07701&r=all
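    Illustration: not the paper's change-of-numeraire expansion, but a plain Monte Carlo sketch of the kind of quantity being approximated: a first-order credit value adjustment for a European call with a CIR default intensity, using intrinsic value as a crude proxy for the forward exposure. All parameters are invented:

        import numpy as np

        rng = np.random.default_rng(7)
        S0, K, r, sigma, T = 100.0, 100.0, 0.02, 0.25, 1.0
        lam0, kappa, theta, xi, LGD = 0.02, 0.5, 0.03, 0.1, 0.6
        n_paths, n_steps = 20_000, 50
        dt = T / n_steps

        S = np.full(n_paths, S0)
        lam = np.full(n_paths, lam0)
        cva = np.zeros(n_paths)
        disc = 1.0
        for _ in range(n_steps):
            zs = rng.standard_normal(n_paths)
            zl = rng.standard_normal(n_paths)
            S *= np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * zs)
            lam = np.abs(lam + kappa * (theta - lam) * dt + xi * np.sqrt(lam * dt) * zl)  # CIR step
            disc *= np.exp(-r * dt)
            exposure = np.maximum(S - K, 0.0)      # crude proxy for the positive exposure V^+(t)
            cva += disc * lam * exposure * dt      # discounted intensity-weighted exposure
        print(f"first-order CVA ~ {LGD * cva.mean():.4f}")
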
  22. By: Amina Baba; Anna Cretti; Olivier Massol
    Abstract: We examine the profitability of flexible routing of LNG cargoes for a single supplier, taking into account uncertainty in the medium-term dynamics of gas markets. First, we model the trajectory of natural gas prices in Asia, North America, and Europe using a Threshold Vector AutoRegression (TVAR) representation in which the system's dynamics switch back and forth between high and low regimes of oil price volatility. We then use the generalized impulse response functions (GIRF) obtained from the estimated threshold model to analyze the effects of volatility shocks on regional gas market dynamics. Lastly, the valuation of destination flexibility in LNG supplies is conducted using a real option approach. We generate a sample of possible future regional price trajectories using Monte Carlo simulations of our empirical model and determine, for each trajectory, the optimal shipping decisions and their profitability. Our results point to a substantial source of profit for the industry and reveal future movements of vessels. We discuss the conditional impact of destination flexibility on the globalization of natural gas markets.
    Keywords: LNG arbitrage, Volatility, TVAR, Monte Carlo simulation
    JEL: C32 C15 Q40 M31
    Date: 2020
    URL: http://d.repec.org/n?u=RePEc:cec:wpaper:2004&r=all
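    Illustration: the switching mechanism at the heart of a TVAR is easy to sketch: the autoregressive matrix changes whenever the lagged threshold variable (here, oil-price volatility) crosses a fixed level. All coefficients below are invented:

        import numpy as np

        rng = np.random.default_rng(8)
        A_low = np.array([[0.5, 0.1],          # state: [gas-price growth, oil volatility]
                          [0.0, 0.8]])
        A_high = np.array([[0.2, 0.6],
                           [0.0, 0.9]])
        thresh, T = 1.0, 300

        y = np.zeros((T, 2))
        for t in range(1, T):
            A = A_high if y[t-1, 1] > thresh else A_low   # regime set by lagged volatility
            y[t] = A @ y[t-1] + rng.normal(scale=[0.2, 0.3])
        print("share of periods in the high-volatility regime:",
              round(float(np.mean(y[:, 1] > thresh)), 2))
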
  23. By: Sang Il Lee
    Abstract: The high sensitivity of optimized portfolios to estimation errors has prevented their practical application. To mitigate this sensitivity, we propose a new portfolio model called a Deeply Equal-Weighted Subset Portfolio (DEWSP). A DEWSP is a subset of the top-N ranked assets in an asset universe, the members of which are selected based on the returns predicted by deep learning algorithms and are equally weighted. Herein, we evaluate the performance of DEWSPs of different sizes N in comparison with the performance of other types of portfolios, such as optimized portfolios and historically equal-weighted subset portfolios (HEWSPs), which are subsets of the top-N ranked assets based on historical mean returns. We find the following advantages of DEWSPs: First, DEWSPs provide an improvement of 0.24% to 5.15% in monthly Sharpe ratio compared to the benchmark, HEWSPs. In addition, DEWSPs are built using a purely data-driven approach rather than relying on the efforts of experts. DEWSPs can also target the risk and return relative to the baseline EWP of an asset universe by adjusting the size N. Finally, the DEWSP allocation mechanism is transparent and intuitive. These advantages make DEWSP competitive in practice.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.14402&r=all
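    Illustration: once a return forecast exists, the DEWSP construction is simple: rank by forecast, keep the top N, weight equally. Random numbers stand in for the deep-learning forecasts below:

        import numpy as np

        rng = np.random.default_rng(9)
        n_assets, N = 100, 10
        predicted = rng.normal(size=n_assets)        # stand-in for deep-learning forecasts
        realized = 0.3 * predicted + rng.normal(scale=1.0, size=n_assets)  # monthly returns

        top = np.argsort(predicted)[-N:]             # top-N ranked assets
        dewsp_return = realized[top].mean()          # equal weights = simple mean
        ewp_return = realized.mean()                 # whole-universe equal-weight benchmark
        print(f"DEWSP {dewsp_return:.3f} vs EWP {ewp_return:.3f}")
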
  24. By: Ralph Rudd; Thomas A. McWalter; Joerg Kienitz; Eckhard Platen
    Abstract: Recursive marginal quantization (RMQ) allows the construction of optimal discrete grids for approximating solutions to stochastic differential equations in d dimensions. Product Markovian quantization (PMQ) reduces this problem to d one-dimensional quantization problems by recursively constructing product quantizers, as opposed to a truly optimal quantizer. However, the standard Newton-Raphson method used in the PMQ algorithm suffers from numerical instabilities, inhibiting widespread adoption, especially for use in calibration. By directly specifying the random variable to be quantized at each time step, we show that PMQ, and RMQ in one dimension, can be expressed as standard vector quantization. This reformulation allows the application of the accelerated Lloyd's algorithm in an adaptive and robust procedure. Furthermore, in the case of stochastic volatility models, we extend the PMQ algorithm by using higher-order updates for the volatility or variance process. We illustrate the technique for European options, using the Heston model, and for more exotic products, using the SABR model.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.15823&r=all
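    Illustration: the vector-quantization building block the paper reduces to is Lloyd's algorithm: alternate nearest-centroid assignment with centroid recomputation. A one-dimensional sketch quantizing a standard normal (grid size and sample count are arbitrary choices):

        import numpy as np

        rng = np.random.default_rng(10)
        samples = rng.standard_normal(100_000)     # random variable to be quantized
        grid = np.linspace(-2, 2, 20)              # initial 20-point quantizer

        for _ in range(50):
            cells = np.argmin(np.abs(samples[:, None] - grid[None, :]), axis=1)
            grid = np.array([samples[cells == j].mean() for j in range(len(grid))])
        print(grid.round(3))                       # near-optimal N(0,1) quantizer levels
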
  25. By: Herv\'e Andres; Pierre-Edouard Arrouy; Paul Bonnefoy; Alexandre Boumezoued; Sophian Mehalla
    Abstract: We propose to take advantage of the common knowledge of the characteristic function of the swap rate process, as modelled in the LIBOR Market Model with Stochastic Volatility and Displaced Diffusion (DDSVLMM), to derive analytical expressions for the gradient of swaption prices with respect to the model parameters. We use this result to derive an efficient calibration method for the DDSVLMM using gradient-based optimization algorithms. Our study relies on and extends the work of Cui et al. (2017), who developed the analytical gradient for fast calibration of the Heston model, based on an alternative formulation of the Heston moment generating function proposed by del Baño et al. (2010). Our main conclusion is that analytical gradient-based calibration is highly competitive for the DDSVLMM, as it significantly limits the number of steps in the optimization algorithm while improving its accuracy. The efficiency of this novel approach is compared to classical standard optimization procedures.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.13521&r=all
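    Illustration: the general point, that supplying an analytical Jacobian makes least-squares calibration faster and more stable, shown in miniature on a toy exponential model rather than the DDSVLMM swaption formulas:

        import numpy as np
        from scipy.optimize import least_squares

        t = np.linspace(0.1, 5.0, 30)
        rng = np.random.default_rng(11)
        obs = 2.0 * np.exp(-0.7 * t) + 0.01 * rng.normal(size=t.size)  # synthetic market data

        def resid(p):
            return p[0] * np.exp(-p[1] * t) - obs

        def jac(p):                                # analytical Jacobian of the residuals
            e = np.exp(-p[1] * t)
            return np.column_stack([e, -p[0] * t * e])

        fit = least_squares(resid, x0=[1.0, 1.0], jac=jac)
        print(fit.x.round(3), "in", fit.nfev, "residual evaluations")
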

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.