
on Risk Management 
By:  Belloni, Alexandre; Chen, Mingli; Chernozhukov, Victor (Department of Economics, University of Warwick & Department of Psychology, University of Warwick) 
Abstract:  We propose Quantile Graphical Models (QGMs) to characterize predictive and conditional independence relationships within a set of random variables of interest. This framework is intended to quantify dependence in non-Gaussian settings, which are ubiquitous in many econometric applications. We consider two distinct QGMs. First, Conditional Independence QGMs characterize conditional independence at each quantile index, revealing the distributional dependence structure. Second, Predictive QGMs characterize the best linear predictor under asymmetric loss functions. Under Gaussianity these notions essentially coincide, but non-Gaussian settings lead us to different models, as prediction and conditional independence are fundamentally different properties. Combined, the models complement methods based on normal and nonparanormal distributions that study mean predictability and use covariance and precision matrices for conditional independence. We also propose estimators for each QGM. The estimators are based on high-dimensional techniques, including (a continuum of) l1-penalized quantile regressions and low-biased equations, which allow us to handle a potentially large number of variables. We build upon recent results to obtain a valid choice of the penalty parameters and rates of convergence. These results are derived without any assumptions on separation from zero and are uniformly valid across a wide range of models. Under the additional assumption that the coefficients are well-separated from zero, we can consistently estimate the graph associated with the dependence structure by hard thresholding the proposed estimators. Further, we show how QGMs can be used to represent the tail interdependence of the variables, which plays an important role in applications concerned with extreme events, as opposed to average behavior. We show that the associated tail risk network can be used for measuring systemic risk contributions. 
We also apply the framework to study financial contagion and the impact of downside movements in the market on the dependence structure of assets' returns. Finally, we illustrate the properties of the proposed framework through simulated examples. 
Keywords:  High-dimensional sparse model, tail risk, conditional independence, nonlinear correlation, penalized quantile regression, systemic risk, financial contagion, downside movement 
JEL:  I30 I31 
Date:  2016 
URL:  http://d.repec.org/n?u=RePEc:wrk:warwec:1125&r=rmg 
By:  Takahiro Hattori (Faculty of Economics, Keio University) 
Abstract:  This is the first paper to analyze the predictability of swaption-based implied volatility for the major currencies: the US Dollar (USD), Euro (EUR), and Japanese Yen (JPY). Managing interest rate risk is of huge importance for risk management in financial institutions, and the swaption is an over-the-counter, widely used instrument that enables us to test whether the option contains the information required to predict future realized volatility. Our results show that implied volatility has greater power to predict future realized volatility than the GARCH prediction or historical volatility (HV) for the USD and EUR, which is consistent with the equity and futures options markets. However, the GARCH forecast and HV have stronger predictive power for the JPY because of the lack of liquidity. 
Keywords:  Implied volatility; Predictive power; GARCH; Swaption 
JEL:  G12 G13 G14 
Date:  2016–07–11 
URL:  http://d.repec.org/n?u=RePEc:keo:dpaper:2016018&r=rmg 
By:  Çağın Ararat; Birgit Rudloff 
Abstract:  The financial crisis showed the importance of measuring, allocating and regulating systemic risk. Recently, systemic risk measures that can be decomposed into an aggregation function and a scalar measure of risk have received a lot of attention. In this framework, capital allocations are added after aggregation and can represent bailout costs. More recently, a framework has been introduced in which institutions are supplied with capital allocations before aggregation. This yields an interpretation that is particularly useful for regulatory purposes. In each framework, the set of all feasible capital allocations leads to a multivariate risk measure. In this paper, we present dual representations for scalar systemic risk measures as well as for the corresponding multivariate risk measures concerning capital allocations. Our results cover both frameworks: aggregating after allocating and allocating after aggregation. Economic interpretations of the obtained results are provided. It turns out that the representations in both frameworks are closely related. 
Date:  2016–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1607.03430&r=rmg 
By:  Giovanni BARONE-ADESI (Università della Svizzera italiana and Swiss Finance Institute) 
Abstract:  VaR (Value at Risk) and CVaR (Conditional Value at Risk) are implied by option prices. Their relationships to option prices are derived initially under the pricing measure; the derivation does not require assumptions about the distribution of portfolio returns. The effects of a change of measure are discussed later. Some extensions and applications are also illustrated. 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1545&r=rmg 
By:  Eric JONDEAU (University of Lausanne and Swiss Finance Institute); Qunzi ZHANG (Shandong University) 
Abstract:  In this paper, we document evidence that downside betas tend to comove more than upside betas during a financial crisis, whereas upside betas tend to comove more than downside betas during financial booms. We find that the asymmetry between Downside-Beta Comovement and Upside-Beta Comovement is the main driving force behind market-level skewness. An indicator called "Systematic Downside Risk" (SDR) is defined to characterize this asymmetry in the comovement of betas. This indicator negatively predicts future market returns. The SDR effectively forecasts future monthly stock market movements with an out-of-sample R-squared above 2.27% relative to a strategy based on the historical mean. An investor who timed the market using SDR would have obtained a Sharpe ratio gain of 0.206. 
Keywords:  Systematic Risk, Skewness, Predictability, Trading Strategies 
JEL:  G11 G12 G14 G17 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1459&r=rmg 
By:  Paul Schneider (University of Lugano, EPFL, Swiss Finance Institute, and Boston University) 
Abstract:  This paper introduces a decomposition of the market return in terms of higher-order realized and option-implied risk aversion, connecting it to the level, slope, and curvature of the implied volatility surface. Empirically, second-order risk aversion (loss aversion) explains most of the market return. Signals revealed by this risk anatomy provide out-of-sample predictive power for realized returns, in particular for longer maturities. The decomposition also shows that compensation for disaster risk is not prominently featured in the market return. Furthermore, it highlights that models with identically and independently distributed state variables are ill-suited to represent longer-maturity returns in particular. 
Keywords:  equity premium, model-free, risk aversion, skewness 
JEL:  C02 C23 C52 C61 G11 G12 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1561&r=rmg 
By:  Jean-Paul Decamps (University of Toulouse 1 - Toulouse School of Economics (TSE)); Sebastian Gryglewicz (Erasmus University Rotterdam (EUR) - Erasmus School of Economics (ESE)); Erwan Morellec (Ecole Polytechnique Fédérale de Lausanne; Ecole Polytechnique Fédérale de Lausanne - Swiss Finance Institute); Stephane Villeneuve (University of Toulouse 1 - Toulouse School of Economics (TSE)) 
Abstract:  We model the financing, cash holdings, and hedging policies of a firm facing financing frictions and subject to permanent and transitory cash-flow shocks. We show that permanent and transitory shocks generate distinct, sometimes opposite, effects on corporate policies and use the model to develop a rich set of empirical predictions. In our model, correlated permanent and transitory shocks imply less risk, lower cash savings, and a drop in the value of credit lines. The composition of cash-flow shocks affects the cash-flow sensitivity of cash, which can be positive or negative. Optimal hedging of permanent and transitory shocks may involve opposite positions. 
Keywords:  Financing frictions; cash holdings; risk management; credit lines; permanent and transitory shocks 
JEL:  G31 G32 G35 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1618&r=rmg 
By:  Peter H. GRUBER (University of Lugano); Claudio TEBALDI (Bocconi University, IGIER and CAREFIN); Fabio TROJANI (University of Geneva and Swiss Finance Institute) 
Abstract:  In a tractable stochastic volatility model, we identify the price of the smile as the price of the unspanned risks traded in SPX option markets. The price of the smile reflects two persistent volatility and skewness risks, which imply a downward-sloping term structure of low-frequency variance risk premia in normal times. In periods of distress, the term structure is upward-sloping and dominated by a high-frequency premium for jump variance. This dichotomy is consistent with the puzzling skew sensitivities of option markets with credit-constrained intermediaries, and it poses a challenge for many reduced-form and structural models of stochastic volatility. 
Keywords:  Price of the Smile, Price of Volatility, Option Pricing, Stochastic Volatility, Unspanned Skewness, Financial Constraints, Financial Intermediation, Financial Crisis, Factor Models, Matrix Jump Diffusions, Variance Swaps, Skew Swaps 
JEL:  G10 G12 G13 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1536&r=rmg 
By:  Mathieu CAMBOU (Ecole Polytechnique Fédérale de Lausanne); Damir FILIPOVIC (Ecole Polytechnique Fédérale de Lausanne and Swiss Finance Institute) 
Abstract:  This paper provides a coherent method for scenario aggregation addressing model uncertainty. It is based on divergence minimization from a reference probability measure subject to scenario constraints. An example from regulatory practice motivates the definition of five fundamental criteria that serve as a basis for our method. Standard risk measures, such as value-at-risk and expected shortfall, are shown to be robust with respect to minimum-divergence scenario aggregation. Various examples illustrate the tractability of our method. 
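For the Kullback-Leibler divergence, one member of the family of divergences such a method can use, the minimizer subject to a single moment constraint has the classical exponential-tilting form. The sketch below reweights Monte Carlo scenarios so that a stress constraint E_Q[X] = c holds while staying as close as possible (in KL divergence) to the reference measure; the risk factor, constraint level and root-finding bracket are invented for illustration.

```python
# Minimum-KL scenario aggregation by exponential tilting: find weights
# q_i proportional to p_i * exp(theta * x_i) such that E_q[X] = c.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(1)
x = rng.standard_normal(10_000)          # scenario values of a risk factor X
p = np.full(x.size, 1.0 / x.size)        # reference (empirical) measure

def tilt(x, p, c):
    """KL-minimizing reweighting subject to the constraint E_q[X] = c."""
    def weights(theta):
        a = theta * x
        w = p * np.exp(a - a.max())      # log-sum-exp shift for stability
        return w / w.sum()
    theta = brentq(lambda t: weights(t) @ x - c, -50.0, 50.0)
    return weights(theta)

q = tilt(x, p, c=-0.5)                   # stress: mean shifted down to -0.5
print(q @ x)                             # approximately -0.5 by construction
```

Risk measures can then be re-evaluated under the tilted weights q instead of the uniform reference weights p.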
Keywords:  model uncertainty, scenario aggregation, expected shortfall, value-at-risk, statistical divergence, Swiss Solvency Test 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1438&r=rmg 
By:  Paul SCHNEIDER (University of Lugano and Swiss Finance Institute) 
Abstract:  This paper develops an optimal trading strategy explicitly linked to an agent's preferences and assessment of the distribution of asset returns. The price of this strategy is a portfolio of implied moments, and its expected excess returns naturally accommodate compensation for higher-order moment risk. Variance risk and the equity premium approximate it to first order, and it nests cross-sectional asset pricing models such as the CAPM. An empirical study in the US index market compares the investment behavior of an agent with recursive long-run risk preferences to one who merely uses an i.i.d. time series model and takes market prices as given. The two agents exhibit very similar behavior during crises and can be distinguished mostly during calm periods. 
Keywords:  Predictability, pricing kernel, model risk, trading strategy, model-free, variance premium, skew premium, kurtosis premium 
JEL:  C02 C23 C52 C61 G11 G12 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1429&r=rmg 
By:  James A. Giesecke; Peter B. Dixon; Maureen T. Rimmer 
Abstract:  Financial regulators are requiring banks to raise additional equity capital to finance their acquisition of physical assets (e.g. buildings) and financial assets (e.g. loans). The benefits of this are understood in terms of reducing the risk of incurring the significant costs of another financial crisis. But there are potential costs from securing these benefits, in the form of unanticipated macroeconomic impacts as banks reduce leverage ratios. In this paper, we explore the economic consequences of a 100 basis point increase in commercial bank capital adequacy ratios using a financial computable general equilibrium model of the Australian economy. We find that the macroeconomic consequences of the policy are small. Our results suggest that prudential regulators can move forward to secure the financial system stability benefits that they expect from higher capital adequacy requirements, without concern that significant costs will be imposed on the wider economy in the form of macroeconomic disruption. 
Keywords:  Capital adequacy ratio, financial stability, macroeconomic disruption 
JEL:  E17 E44 G21 C68 
Date:  2016–05 
URL:  http://d.repec.org/n?u=RePEc:cop:wpaper:g261&r=rmg 
By:  Bell, Peter N 
Abstract:  This paper presents a method to characterize the typical path of a stochastic process, which I refer to as the Median Path. The paper describes how to estimate the Median Path in simulation and compares it to a different estimate of the typical path that is defined as the median value at each time step, like an ensemble average. The Median Path is an actual path from the stochastic process, whereas the path of median values is not. Therefore, the two paths may have very different properties. The Median Path is a single path from a set of simulated paths and is identified using a Ranking Algorithm that calculates the rank of each path at each time and then averages the ranks over time, similar to a time average. The Median Path is potentially useful in simulation applications where it is important to characterize the actual behaviour of a path generated by the stochastic process rather than the behaviour of statistics of the process over time. 
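A minimal version of the Ranking Algorithm described above can be sketched as follows: rank every simulated path at every time step, average the ranks over time, and return the actual path whose time-averaged rank is closest to the median rank. Array shapes and the random-walk example are assumptions for illustration, not the paper's setup.

```python
# Sketch of a ranking-based Median Path selection among simulated paths.
import numpy as np

def median_path(paths):
    """paths: (n_paths, n_steps) array. Returns the index of the Median Path."""
    ranks = paths.argsort(axis=0).argsort(axis=0)  # rank of each path at each t
    mean_rank = ranks.mean(axis=1)                 # time-average of the ranks
    target = (paths.shape[0] - 1) / 2.0            # the median rank
    return int(np.argmin(np.abs(mean_rank - target)))

rng = np.random.default_rng(7)
steps = rng.standard_normal((101, 250))
paths = steps.cumsum(axis=1)                       # 101 random walks, 250 steps
i = median_path(paths)
print(i, paths[i, -1])
```

Unlike the pointwise median at each time step, `paths[i]` is an actual realization of the process, which is the distinction the abstract emphasizes.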
Keywords:  Simulation, Ensemble Average, Time Average. 
JEL:  C1 C14 C15 C6 C63 
Date:  2015–11–26 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:72680&r=rmg 
By:  Maximilian ADELMANN (University of Zurich); Lucio FERNANDEZ ARJONA (Zurich Insurance Group Ltd.); Janos MAYER (University of Zurich); Karl SCHMEDDERS (University of Zurich and Swiss Finance Institute) 
Abstract:  Replicating portfolios have recently emerged as an important tool in the life insurance industry, used for the valuation of companies' liabilities. This paper presents a replicating portfolio (RP) model for approximating life insurance liabilities as closely as possible. We minimize the L1 error between the discounted life insurance liability cash flows and the discounted RP cash flows over a multi-period time horizon for a broad range of different future economic scenarios. We apply two different linear reformulations of the L1 problem to solve large-scale RP optimization problems and also present several out-of-sample tests for assessing the quality of RPs. A numerical application of our RP model to empirical data sets demonstrates that the model delivers RPs that match the liabilities rather closely. The numerical analysis demonstrates that our model delivers RPs with excellent practical properties in a reasonable amount of time. We complete the paper with a description of an implementation of the RP model at a global insurance company. 
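One standard linear reformulation of an L1 matching problem introduces an auxiliary variable u_t bounding each absolute residual, turning min_w Σ_t |L_t - (Aw)_t| into a linear program. The toy sketch below solves it with `scipy.optimize.linprog`; it ignores discounting and multiple scenarios, and all cash flows are invented, so it only illustrates the reformulation idea, not the paper's model.

```python
# L1 cash-flow matching as a linear program: minimize sum(u) subject to
# -u <= L - A w <= u, with w the instrument weights and u the residual bounds.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
T, k = 60, 4                         # cash-flow dates, candidate instruments
A = rng.standard_normal((T, k))      # instrument cash-flow matrix
w_true = np.array([1.0, -0.5, 0.0, 2.0])
L = A @ w_true                       # liability cash flows (exactly replicable)

c = np.concatenate([np.zeros(k), np.ones(T)])        # objective: sum of u
A_ub = np.block([[-A, -np.eye(T)], [A, -np.eye(T)]]) # both residual signs
b_ub = np.concatenate([-L, L])
bounds = [(None, None)] * k + [(0, None)] * T        # w free, u >= 0
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
w = res.x[:k]
print(res.status, np.round(w, 6))
```

Because the toy liability is exactly replicable, the LP drives every residual bound to zero and recovers the generating weights.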
Keywords:  Insurance regulation; liability cash flows; linear programming; out-of-sample tests; replicating portfolios; Solvency II 
JEL:  C61 C65 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1604&r=rmg 
By:  Priyank Gandhi (University of Notre Dame); Patrick Christian Kiefer (UCLA Anderson School of Management); Alberto Plazzi (USI Lugano and Swiss Finance Institute) 
Abstract:  Modern U.S. banks engage in activities traditionally considered non-core for the banking sector. Consistent with extant models of financial intermediation, which suggest banks diversify to lower risk and improve profitability, we document that banks with a higher probability of financial distress and deadweight financial costs diversify more aggressively. Diversified banks appear to benefit from "coinsurance": they are more profitable, less financially constrained, and supply more credit. However, diversification does not lead to real reductions in risk, as its benefits are limited to "good" times. Diversified banks are more exposed to systematic risk, and their lending is more sensitive to macroeconomic conditions. They are also more prone to correlation risk, the risk that the diversification benefits provided by non-core activities may change unexpectedly, especially when they are most needed. Our study contributes to the current debate on the optimal scope of bank activities and highlights novel channels through which diversification affects banks' credit supply and therefore the real economy. 
Keywords:  Bank diversification, Non-interest income, Systemic risk, Financial crisis 
JEL:  G01 G21 G28 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1643&r=rmg 
By:  Kara, Gazi 
Abstract:  Despite the extensive attention that the Basel capital adequacy standards have received internationally, significant variation exists in the implementation of these standards across countries. Furthermore, a significant number of countries increase or decrease the stringency of capital regulations over time. The paper investigates the empirical determinants of the variation in the data based on the theories of bank capital regulation. The results show that countries with high average returns to investment and a high ratio of government ownership of banks choose less stringent capital regulation standards. Capital regulations may also be less stringent in countries with more concentrated banking sectors. 
Keywords:  Capital Requirements ; Basel Capital Accord ; Financial regulation ; International policy coordination 
JEL:  G21 G28 F33 
Date:  2016–07 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:201657&r=rmg 
By:  Qunzhi Zhang (ETH Zurich); Didier Sornette (Swiss Finance Institute; ETH Zürich - Department of Management, Technology, and Economics (DMTEC)); Mehmet Balcilar (Eastern Mediterranean University); Rangan Gupta (University of Pretoria - Department of Economics); Zeynel Abidin Ozdemir (Gazi University); I. Hakan Yetkiner (Izmir University of Economics) 
Abstract:  The aim of this paper is to present novel tests for the early causal diagnostic of positive and negative bubbles in the S&P 500 index and the detection of End-of-Bubble signals with their corresponding confidence levels. We use monthly S&P 500 data covering the period from August 1791 to August 2014. This study is the first work in the literature to show the possibility of developing reliable ex-ante diagnostics of the frequent regime shifts over two centuries of data. We show that the DS LPPLS (log-periodic power law singularity) approach successfully diagnoses positive and negative bubbles, constructs efficient End-of-Bubble signals for all of the well-documented bubbles, and obtains, for the first time, new statistical evidence of bubbles for some other events. We also compare the DS LPPLS method to the exponential curve fitting and generalized sup ADF test approaches and find that the DS LPPLS system is more accurate in identifying well-known bubble events, with significantly fewer false negatives and false positives. 
Keywords:  S&P 500, LPPL method, stock market bubble, forecast, bubble indicators 
JEL:  J16 O47 C32 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1605&r=rmg 
By:  Agata M. Lozinskaia (National Research University Higher School); Evgeniy M. Ozhegov (National Research University Higher School); Alexander M. Karminsky (National Research University Higher School) 
Abstract:  This paper investigates the distribution of relative credit losses given mortgage default for loans provided by a major government-sponsored creditor in a local area. We use borrowers' individual and loan-level data on residential mortgages originated in the period 2008–2012. Our numerical analysis indicates that mortgage bunching at certain Loan-to-Value (LTV) ratios led to a discontinuity in relative credit loss given mortgage default. Through regression analysis, we demonstrate discrete jumps in the approximated historical credit losses generated by loans with high LTV ratios and find thresholds allowing the segmentation of loans according to their credit risk. In addition, our results suggest that mortgage insurance is a potentially valuable instrument for compensating for expected loss in certain risk segments. 
Keywords:  discontinuity; credit risk; mortgage default; government mortgage lending programs; loss evaluation. 
JEL:  C21 G21 G32 R20 R58 
Date:  2016 
URL:  http://d.repec.org/n?u=RePEc:hig:wpaper:55/fe/2016&r=rmg 
By:  Jonathan YuMeng Li 
Abstract:  The theory of convex risk functions is now well established as the basis for identifying the families of risk functions that should be used in risk-averse optimization problems. Despite its theoretical appeal, the implementation of a convex risk function remains difficult, as there is little guidance regarding how a convex risk function should be chosen so that it also well represents one's own risk preferences. In this paper, we address this issue through the lens of inverse optimization. Specifically, given solution data from some (forward) risk-averse optimization problems, we develop an inverse optimization framework that generates a risk function that renders the solutions optimal for the forward problems. The framework incorporates the well-known properties of convex risk functions, namely monotonicity, convexity, translation invariance, and law invariance, as general information about candidate risk functions, along with feedback from individuals, which includes an initial estimate of the risk function and pairwise comparisons among random losses, as more specific information. Our framework is particularly novel in that, unlike classical inverse optimization, no parametric assumption is made about the risk function, i.e. it is nonparametric. We show how the resulting inverse optimization problems can be reformulated as convex programs and are polynomially solvable if the corresponding forward problems are polynomially solvable. We illustrate the imputed risk functions in a portfolio selection problem and demonstrate their practical value using real-life data. 
Date:  2016–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1607.07099&r=rmg 
By:  Chang, CL.; McAleer, M.J.; Wang, Y. 
Abstract:  There is substantial empirical evidence that energy and financial markets are closely connected. As one of the most widely used energy resources worldwide, natural gas has a large daily trading volume. In order to hedge the risk of natural gas spot markets, a large number of hedging strategies can be used, especially with the rapid development of natural gas derivatives markets. These hedging instruments include natural gas futures and options, as well as Exchange Traded Fund (ETF) prices that are related to natural gas stock prices. The volatility spillover effect is the delayed effect of a returns shock in one physical, biological or financial asset on the subsequent volatility or co-volatility of another physical, biological or financial asset. Investigating volatility spillovers within and across energy and financial markets is a crucial aspect of constructing optimal dynamic hedging strategies. The paper tests and calculates spillover effects among natural gas spot, futures and ETF markets using the multivariate conditional volatility diagonal BEKK model. The data used include natural gas spot and futures returns from two major international natural gas derivatives markets, namely NYMEX (USA) and ICE (UK), as well as ETF data of natural gas companies from the stock markets in the USA and UK. The empirical results show that there are significant spillover effects in natural gas spot, futures and ETF markets for both the USA and the UK. This result suggests that both natural gas futures and ETF products, within and beyond the country, might be considered when constructing optimal dynamic hedging strategies for natural gas spot prices. 
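The diagonal BEKK recursion itself is compact: H_t = C'C + A e_{t-1} e'_{t-1} A + B H_{t-1} B with diagonal A and B, so spillovers show up in the off-diagonal elements of H_t. The sketch below only filters the conditional covariances for three series (think spot, futures, ETF) given fixed parameters; the parameter values are illustrative assumptions, not the paper's estimates, and estimation itself (maximizing the likelihood over C, A, B) is not shown.

```python
# Filtering recursion of a diagonal BEKK model for n return series:
# H_t = C'C + (a a') * (e_{t-1} e'_{t-1}) + (b b') * H_{t-1}  (elementwise),
# which equals A e e' A + B H B when A = diag(a) and B = diag(b).
import numpy as np

def diagonal_bekk_covariances(eps, C, a, b):
    """eps: (T, n) shocks; C: (n, n) lower triangular; a, b: (n,) diagonals.
    Returns the (T, n, n) conditional covariance matrices H_t."""
    T, n = eps.shape
    CC = C @ C.T
    H = np.empty((T, n, n))
    H[0] = np.cov(eps, rowvar=False)         # initialize at the sample covariance
    for t in range(1, T):
        outer = np.outer(eps[t - 1], eps[t - 1])
        H[t] = CC + np.outer(a, a) * outer + np.outer(b, b) * H[t - 1]
    return H

rng = np.random.default_rng(5)
eps = rng.standard_normal((500, 3))          # placeholder return shocks
C = np.linalg.cholesky(0.1 * np.eye(3))
a = np.array([0.3, 0.3, 0.3])                # ARCH-type loadings
b = np.array([0.9, 0.9, 0.9])                # GARCH-type persistence
H = diagonal_bekk_covariances(eps, C, a, b)
print(H.shape, H[-1, 0, 0])
```

With a_i a_j + b_i b_j < 1 for all pairs, as here, the recursion is covariance stationary and every H_t stays symmetric positive definite.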
Keywords:  Energy, natural gas, spot, futures, ETF, NYMEX, ICE, optimal hedging strategy, co-volatility spillovers, diagonal BEKK 
JEL:  C58 D53 G13 G31 O13 
Date:  2016–06–03 
URL:  http://d.repec.org/n?u=RePEc:ems:eureir:93116&r=rmg 
By:  Pierre Matek (Croatian Financial Services Supervisory Agency); Marko Lukač (Effectus - University College for Law and Finance); Vedrana Repač (Effectus - University College for Law and Finance) 
Abstract:  The goal of this paper is to determine whether managers of Croatian mandatory pension funds displayed investment skill on a risk-adjusted basis during the 2005–2014 period. We calculated various risk-adjusted investment performance measures and then used a number of statistical tools to test the significance of the results. Evidence from our analysis suggests that Croatian mandatory pension funds have reached their investment targets in terms of risk-free rates or benchmarks. Evidence of investment skill was found in some of the funds analysed. 
Keywords:  pension funds, risk-adjusted return, performance appraisal 
JEL:  G11 G18 G19 G23 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:eff:wpaper:0004&r=rmg 
By:  Elisa Fusco ("Sapienza" University of Rome); Bernardo Maggi ("Sapienza" University of Rome) 
Abstract:  In light of the recent world financial crisis, it is crucial to investigate the responsibilities of the main actors in the credit sector, i.e. banks and local governments. In this framework, we propose a methodology to analyze the quality of banks' problem loans, their level of efficiency in risk management strategies, and governments' policy action in the supervision of the local banking system. Our approach is based on introducing the "Non-performing Loans" (NPL) variable as an undesirable output in an output distance function (as a stochastic frontier) in order to estimate bank efficiency and calculate the shadow price of NPLs (not normally observable) for each year, bank and country. We then compare the management of NPLs and their price across geographic areas and bank size over time in order to map responsibilities and draw some policy implications. From an econometric point of view, to our knowledge we are the first to adopt the semi-nonparametric Fourier specification which, among the flexible functional form alternatives, can guarantee the convergence of the estimated parameters and the related X-efficiency to the true ones. 
Keywords:  Commercial bank, Financial world crisis, Non-performing loans, Efficiency, Flexible forms, Distance function. 
JEL:  G21 D24 C33 C51 L23 
Date:  2016–07 
URL:  http://d.repec.org/n?u=RePEc:sas:wpaper:20162&r=rmg 
By:  Santiago MORENO-BROMBERG (University of Zurich - Department of Banking and Finance); Guillaume ROGER (University of Sydney - School of Economics) 
Abstract:  We study a dynamic contracting problem in which size is relevant. The agent may take on excessive risk to enhance short-term gains, which exposes the principal to large, infrequent losses. To preserve incentive compatibility, the optimal contract uses size as an instrument; there is downsizing on the equilibrium path. The contract may be implemented using the full array of financial securities or as a regulation contract with a leverage ratio. We show that holding equity is essential to curb risk taking. Firms that are less prone to risk taking can afford higher leverage. 
Keywords:  asymmetric information; dynamic contracts; moral hazard; risk taking 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1564&r=rmg 
By:  Vladimir Filimonov (Swiss Federal Institute of Technology Zurich (ETH Zurich)); Guilherme Demos (ETH Zurich); Didier Sornette (Swiss Finance Institute; ETH Zürich - Department of Management, Technology, and Economics (DMTEC)) 
Abstract:  We present a detailed methodological study of the application of the modified profile likelihood method to the calibration of nonlinear financial models characterised by a large number of parameters. We apply the general approach to the Log-Periodic Power Law Singularity (LPPLS) model of financial bubbles. This model is particularly relevant because one of its parameters, the critical time tc signalling the burst of the bubble, is arguably the target of choice for dynamical risk management. However, previous calibrations of the LPPLS model have shown that the estimation of tc is in general quite unstable. Here, we provide a rigorous likelihood inference approach to determine tc, which takes into account the impact of the other nonlinear (so-called "nuisance") parameters for the correct adjustment of the uncertainty on tc. This provides a rigorous interval estimation for the critical time, rather than the point estimation of previous approaches. As a bonus, interval estimates can also be obtained for the nuisance parameters (m, w, damping), which can be used to improve filtering of the calibration results. We show that the use of the modified profile likelihood method dramatically reduces the number of local extrema by constructing much simpler, smoother log-likelihood landscapes. The remaining distinct solutions can be interpreted as genuine scenarios that unfold as the time of the analysis flows, and they can be compared directly via their likelihood ratio. Finally, we develop a multi-scale profile likelihood analysis to visualize the structure of the financial data at different scales (typically from 100 to 750 days). We test the methodology successfully on synthetic price time series and on three well-known historical financial bubbles. 
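For reference, the LPPLS expected log-price underlying the calibration takes the standard form ln E[p(t)] = A + B (tc - t)^m + C (tc - t)^m cos(w ln(tc - t) - phi) for t < tc. The snippet below just evaluates this function; the parameter values are illustrative, and the modified profile likelihood machinery is not reproduced.

```python
# Evaluate the LPPLS log-price on a grid of times before the critical time tc.
import numpy as np

def lppls(t, tc, m, w, A, B, C, phi):
    """Log-Periodic Power Law Singularity log-price, defined for t < tc."""
    dt = tc - t
    return A + B * dt**m + C * dt**m * np.cos(w * np.log(dt) - phi)

t = np.linspace(0.0, 0.99, 500)                       # all times below tc = 1
y = lppls(t, tc=1.0, m=0.5, w=9.0, A=8.0, B=-1.0, C=0.05, phi=0.0)
print(y[0], y[-1])
```

With B < 0 and 0 < m < 1, the power-law term accelerates toward A as t approaches tc (a positive bubble), while the cosine term superimposes the log-periodic oscillations whose "nuisance" parameters (m, w, damping) the paper's interval estimation targets.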
Keywords:  financial bubbles; crashes; inference; nuisance parameters; modified profile likelihood; nonlinear regression; JLS model; log-periodic power law; finite-time singularity; nonlinear optimization 
JEL:  C13 C18 C53 G01 G17 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1612&r=rmg 