The second part of the “Portfolio Construction” series explores whether introducing parameter uncertainty into the model improves the out-of-sample performance of the optimal portfolio. Additionally, the article proposes and tests two adjustments to regular utility optimisation.
The rapid evolution of computational technologies has enabled businesses to leverage machine learning methods to tackle challenging, labour-intensive tasks involving various degrees of judgement and decision making. Financial markets are no exception. In this article, we present a case for using our AI-driven solution to tackle a common challenge in finance – the fair value measurement of illiquid financial instruments.
There are a number of challenges associated with portfolio construction based on historical data. This three-part article series explores some of the most common issues of model-based portfolio optimisation: sensitivity to changes in the data, large variations in portfolio weights and poor out-of-sample performance.
As machine learning methods grow in use and popularity, we explore yet another dimension of wealth management that our experts consider fit for applying such frameworks. In this article, we deploy hierarchical clustering to find more consistent ways of predicting the relative future performance of funds.
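To illustrate the general idea behind grouping funds hierarchically (a minimal sketch on purely hypothetical return data, not the methodology of the article itself), one can cluster funds by the correlation of their return series:

```python
import numpy as np

def single_linkage(dist, k):
    """Agglomerative (single-linkage) clustering down to k clusters.

    `dist` is a symmetric matrix of pairwise distances; the two closest
    clusters are merged repeatedly until only k remain.
    """
    clusters = [{i} for i in range(len(dist))]
    while len(clusters) > k:
        best, best_d = None, np.inf
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist[a, b] for a in clusters[i] for b in clusters[j])
                if d < best_d:
                    best, best_d = (i, j), d
        i, j = best
        clusters[i] |= clusters.pop(j)  # merge the closest pair
    return clusters

# Hypothetical monthly return histories for 12 funds (illustrative only);
# funds whose returns move together end up in the same cluster.
rng = np.random.default_rng(0)
returns = rng.normal(scale=0.02, size=(12, 24))
corr = np.corrcoef(returns)
# A common correlation-based distance; clipped to guard against tiny
# negative values from floating-point round-off on the diagonal.
dist = np.sqrt(np.clip(0.5 * (1.0 - corr), 0.0, None))
groups = single_linkage(dist, k=3)
print(len(groups))  # 3
```

Single linkage is used here only for brevity; other linkage rules (complete, average, Ward) follow the same merge loop with a different inter-cluster distance.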
The modern wealth management industry still relies on 50-year-old approaches to portfolio management, widely popularised by Markowitz's Modern Portfolio Theory (1952). Despite heavy criticism in academic circles, alternative methods remain undeservedly overlooked in practice. In the context of the industry's push towards hyper-customisation, we examine one such alternative to Modern Portfolio Theory in greater detail: the utility-based approach.
In the second part of the article series, we outline a framework utilising both Self-Normalizing Neural Networks (SNNs) and logistic regression for bond liquidity classification. The framework is subsequently applied to the Swedish bond market in an investigative case study.
Machine learning applications have become more prominent in the financial industry in recent years. Our new article series explores the benefits and challenges of using self-normalising neural networks (SNNs) for calculating liquidity risk. The first piece of the series introduces the main concepts used in the investigative case study for the Swedish bond market.
In the third and concluding article of the “Asset and Liability Management using LSMC” series, we focus on analysing the optimal asset allocations in the context of changing asset classes, as well as on finding the optimal allocation by maximising the risk-adjusted net asset value. The estimates based on the LSMC method are then compared to those obtained from the full nested Monte Carlo method.
The second part of the series exploring the use of Least Squares Monte Carlo (LSMC) in Asset and Liability Management focuses on evaluating the accuracy and performance of the method against full nested Monte Carlo simulation benchmarks.
In the first part of the “Asset and Liability Management using LSMC” article series, we outline an ALM framework based on a replicating portfolio approach along with a suitable financial objective. This ALM framework, albeit simplified, is constructed to provide a straightforward replication of the complex interactions between assets and liabilities. Moreover, a brief introduction to the LSMC method used to generate all underlying risk factors is presented.
In continuation of our discussion of cyber risk, this paper investigates the issues of cyber risk management within the financial industry. In particular, we look into the process of determining the optimal size of investments in cyber security, as well as the quantification of appropriate cyber insurance premiums.
In continuation of our discussion of cyber risk, this article reviews different methods and models that can be used to analyse and quantify the risks of information security breaches facing the contemporary financial industry.
This article addresses the topic of cyber risk and different aspects of the mitigation of its adverse effects on financial institutions.
This article will discuss why it is important to model credit indices and detail a number of different approaches to this problem.
In this article, we evaluate the rolling window procedure, which alleviates the problem of limited data by increasing the number of observations that can be extracted from a given dataset.
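The mechanics of the procedure can be sketched in a few lines (hypothetical data; the function name and parameters are illustrative, not taken from the article):

```python
import numpy as np

def rolling_windows(data, window):
    """Extract all overlapping windows of a given length from a 1-D series.

    A series of n points yields n - window + 1 overlapping windows,
    multiplying the number of usable observations compared to splitting
    the series into disjoint blocks.
    """
    n = len(data)
    return np.array([data[i:i + window] for i in range(n - window + 1)])

# Ten years of monthly returns (120 points) yield 61 overlapping
# five-year (60-month) windows instead of just two disjoint ones.
series = np.random.default_rng(0).normal(size=120)
windows = rolling_windows(series, window=60)
print(windows.shape)  # (61, 60)
```

The trade-off, of course, is that overlapping windows are strongly dependent, which is part of what the evaluation in the article addresses.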
This article discusses dynamic hedging and presents a case study investigating the impact of dynamic management actions on the Solvency Capital Requirement.
In this article, we will develop a framework for structuring operational risk within a company.
In this article, we conclude the CVA series by presenting a case study in which we price the CVA of a vanilla interest rate swap (IRS) using the four different frameworks.
In this article, we expand on the concept of CVA by presenting different cases in which the investor is treated as either risk-free or risky. We then present four different CVA pricing frameworks and discuss their levels of sophistication.
In this part, we evaluate the framework by performing simulations and discuss the implications of utilising such a dependence model.
This article focuses on explaining what CVA is, as well as the regulatory measures regarding it. In later parts of this series, we will describe and evaluate different methods for modelling CVA.
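As a rough illustration of what CVA captures, the standard unilateral approximation CVA ≈ (1 − R) Σᵢ DF(tᵢ) · EE(tᵢ) · PD(tᵢ₋₁, tᵢ) can be computed as follows (all inputs are hypothetical round numbers, not calibrated quantities):

```python
import numpy as np

# Hypothetical inputs on a yearly grid t = 1..5 (illustrative only).
recovery = 0.4                                   # assumed recovery rate R
ee = np.array([1.0, 1.2, 1.1, 0.8, 0.4])         # expected exposure EE(t_i)
df = np.exp(-0.02 * np.arange(1, 6))             # discount factors DF(t_i)
hazard = 0.03                                    # flat default intensity
surv = np.exp(-hazard * np.arange(0, 6))         # survival probabilities
pd_incr = surv[:-1] - surv[1:]                   # P(default in (t_{i-1}, t_i])

# Unilateral CVA: expected discounted loss on default, net of recovery.
cva = (1.0 - recovery) * np.sum(df * ee * pd_incr)
print(round(cva, 6))
```

In practice the expected exposure profile itself comes from simulating the instrument's future values, which is where the modelling effort discussed in this series lies.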
In this article, we will focus on how the management actions used in an internal SCR model can be evaluated and validated, from the perspective of both risk and return.
In this article, we will explain what management actions are. Our main focus will be the regulatory requirements on management actions under Solvency II.
In this article we seek to develop a model allowing for dependence between equity and credit risk.
Part I of III describing a framework for analysing dependency between equity and credit risk.
In this article we conduct a case study of the operational risk capital requirement, with the ambition of comparing it with the Solvency II Standard Formula.
In this article we investigate the performance of the LSMC approach on a stylised financial product.
In this article, we give an introduction to operational risk and explain the subject as it is defined in Basel II.
In this article, we introduce an efficient way of estimating and calibrating regression functions in an LSMC environment.
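The core regression step in any LSMC setting can be sketched as an ordinary least squares fit of simulated payoffs on basis functions of the risk factor (hypothetical simulated data; the polynomial basis is one common choice, not necessarily the one used in the article):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated risk-factor values at time t and discounted payoffs at T
# for a stylised product (illustrative lognormal draws).
n_paths = 10_000
s_t = 100.0 * np.exp(rng.normal(0.0, 0.2, n_paths))
payoff = np.maximum(s_t * np.exp(rng.normal(0.0, 0.2, n_paths)) - 100.0, 0.0)

# LSMC regression step: approximate E[payoff | S_t] with a polynomial
# in the risk factor, fitted by least squares across all paths.
degree = 3
X = np.vander(s_t, degree + 1)      # powers of S_t up to degree 3
coef, *_ = np.linalg.lstsq(X, payoff, rcond=None)
cond_exp = X @ coef                 # fitted conditional expectation per path

print(coef.shape)  # (4,)
```

The efficiency questions the article addresses concern how such regressions are estimated and calibrated, e.g. the choice and number of basis functions, rather than the least squares step itself.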