Advisory and financial modelling

"All models are wrong, but some are useful."
(George E. P. Box)

What is Risk Analysis for?
Risk analysis of a financial portfolio is the process of identifying, assessing and controlling threats to an investment portfolio.
What is Return Analysis for?
Return analysis is the calculation of the expected profits and losses of a portfolio of assets. Different methodologies allow us to estimate the fair value of single assets, to calculate the correct mark-to-market value of the portfolio and to determine the expected returns of the investments.
Calculating expected returns, rather than only observing what happened in the past, and analyzing risks properly are the two relevant elements of a capital allocation process.
The best allocation: Statistical Algorithms
Many statistical algorithms can find the best asset allocation simply by simulating tens of thousands of portfolios and maximizing the return/risk ratio. The "risk" is usually the standard deviation calculated on historical data, and the expected "return" is an average of the returns the portfolio made in the past. Most statistical tools perform the same calculation and use past data to predict future data, even though there is no guarantee that what happened in the past will repeat itself in the future.
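As a minimal sketch of this simulation approach, the loop below draws random portfolio weightings and keeps the one with the best return/risk (Sharpe) ratio. Random data stand in for a real price history, and all figures are illustrative:

```python
# Monte Carlo asset-allocation search: simulate thousands of random
# portfolio weightings and keep the one with the best Sharpe ratio.
import numpy as np

rng = np.random.default_rng(0)
n_assets, n_days = 4, 500
daily_returns = rng.normal(0.0005, 0.01, size=(n_days, n_assets))  # fake history

mean_ret = daily_returns.mean(axis=0) * 252        # annualised expected return
cov = np.cov(daily_returns, rowvar=False) * 252    # annualised covariance

best_sharpe, best_w = -np.inf, None
for _ in range(10_000):
    w = rng.random(n_assets)
    w /= w.sum()                                   # weights sum to 1
    port_ret = w @ mean_ret
    port_vol = np.sqrt(w @ cov @ w)                # standard deviation = "risk"
    sharpe = port_ret / port_vol
    if sharpe > best_sharpe:
        best_sharpe, best_w = sharpe, w

print(best_sharpe, best_w)
```

A real implementation would also subtract a risk-free rate from the return and enforce allocation constraints.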

Portfolio Enhancement

​

The data points are coloured according to their respective Sharpe ratios, with blue signifying a higher value and red a lower value. A red star marks the maximum Sharpe ratio portfolio and a green star the minimum variance portfolio.

The best allocation: Risk Analysis 

VaR calculation of a multi-asset portfolio


VaR tests

Since the 1990s, Value at Risk has been one of the most widely used methodologies for understanding the financial risks in a portfolio. The historical, parametric and Monte Carlo methods are easy to calculate, but the parametric and standard Monte Carlo methods assume that returns are normally distributed, something that doesn't always happen in real life.

Correlation Matrix
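A correlation matrix like the one shown can be computed directly from a return series; a minimal sketch with illustrative random data in place of real returns:

```python
# Correlation matrix of a multi-asset daily return series.
import numpy as np

rng = np.random.default_rng(1)
returns = rng.normal(0, 0.01, size=(250, 3))   # 250 days, 3 assets (illustrative)
corr = np.corrcoef(returns, rowvar=False)      # 3x3 correlation matrix

print(np.round(corr, 2))
```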

The best allocation:  Forecasting and Fair Values
In our view, the solution is a perfect understanding of the expected returns of the portfolio's assets. This means that for fixed-income instruments we need to calculate the fair value of each bond and derivative by importing yield and credit curves, while for equities we need to apply fundamental analysis to calculate fair values and follow technical indicators and news to discover the right time to invest. Besides, in recent years neural networks have greatly improved forecasting methodologies.

Forecast model with a recurrent neural network in Python

The best allocation:  Backtesting
The best strategies need to be backtested before going to market, in order to maximize their probability of success.
The best allocation:  Sensitivities
A deep knowledge of the risks associated with a portfolio leads to an exact understanding of what happens if something goes wrong. Past data are not indicative of the future, so this understanding can be achieved only through sensitivity analysis. Sensitivity analysis is the study of how the uncertainty in the output of a mathematical model or system can be divided and allocated to different sources of uncertainty in its inputs. In the financial markets, sensitivity analysis is a model that assesses how target variables are affected by changes in other variables, known as input variables. This model is also referred to as "what-if" or simulation analysis: it is a way to predict the outcome of a decision given a certain range of variables.
The combination of the calculation of expected returns and an accurate analysis of financial risks allows us to find the best portfolio allocation.
Although most statistical and mathematical models are freely available on the web, implementing them with the correct inputs is often not that easy, and this is what we do.

The company offers advisory services in the following sectors:

Risk Analysis:

VaR Model

Value at Risk (VaR) is defined as the maximum loss with a given probability, in a set time period, with an assumed distribution and under standard market conditions. It is a statistical measure of the risk of loss for an investment. VaR is calculated with the following methods: historical simulation, variance-covariance and Monte Carlo simulation.
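A minimal sketch of the three methods, assuming a 99% confidence level, a one-day horizon and illustrative random P&L data in place of a real portfolio:

```python
# Three VaR estimates (historical, variance-covariance, Monte Carlo)
# on a simulated daily P&L series. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(2)
pnl = rng.normal(0, 10_000, size=1000)     # simulated daily P&L of the portfolio
alpha = 0.99
z = 2.326                                  # ~99% quantile of the standard normal

var_hist = -np.percentile(pnl, (1 - alpha) * 100)       # historical simulation
var_param = -(pnl.mean() - z * pnl.std())               # variance-covariance
sims = rng.normal(pnl.mean(), pnl.std(), size=100_000)  # Monte Carlo resampling
var_mc = -np.percentile(sims, (1 - alpha) * 100)

print(var_hist, var_param, var_mc)
```

On normally distributed data the three estimates agree; on real, fat-tailed P&L they can diverge, which is exactly why the method choice matters.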


Sensitivities Analysis 

Sensitivity analysis is the study of how the uncertainty in the output of a mathematical model or system can be divided and allocated to different sources of uncertainty in its inputs. A related practice is uncertainty analysis, which has a greater focus on uncertainty quantification and the propagation of uncertainty. Recalculating outcomes under alternative assumptions to determine the impact of a variable can be useful for a range of purposes, such as testing the robustness of the results of a model or system in the presence of uncertainty.

In the financial markets, sensitivity analysis is a model that determines how target variables are affected by changes in other variables, known as input variables. This model is also referred to as "what-if" or simulation analysis: it is a way to predict the outcome of a decision given a certain range of variables.
Some indicators are:

DV01
Credit01
Inflation01
Greeks (delta, gamma, etc.)
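As an illustration of one of these indicators, DV01 (the price change for a one-basis-point move in yield) can be sketched by repricing a plain-vanilla bond after bumping its yield; the bond terms below are purely illustrative:

```python
# DV01 by bump-and-reprice on a plain-vanilla annual-coupon bond.
def bond_price(face, coupon_rate, yield_rate, years):
    """Price of an annual-coupon bond by discounting its cash flows."""
    coupon = face * coupon_rate
    pv = sum(coupon / (1 + yield_rate) ** t for t in range(1, years + 1))
    return pv + face / (1 + yield_rate) ** years

p0 = bond_price(100, 0.05, 0.04, 10)      # 5% coupon, 4% yield, 10 years
p1 = bond_price(100, 0.05, 0.0401, 10)    # yield bumped by one basis point
dv01 = p0 - p1                            # price drop per 1bp rise in yield

print(round(p0, 4), round(dv01, 6))
```

Credit01 and Inflation01 follow the same bump-and-reprice pattern, applied to the credit spread and the inflation curve respectively.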



Machine Learning 

Emulators are data-modelling / machine learning approaches that involve building a relatively simple mathematical function, known as an emulator, that approximates the input/output behaviour of the model itself. In other words, it is the concept of "modelling a model" (hence the name "metamodel"). The idea is that, although computer models may be a very complex series of equations that can take a long time to solve, they can always be regarded as a function of their inputs. By running the model at a number of points in the input space, it may be possible to fit a much simpler emulator η(X), such that η(X) ≈ f(X) to within an acceptable margin of error. Then, sensitivity measures can be calculated from the emulator (either with Monte Carlo or analytically), which will have a negligible additional computational cost. Importantly, the number of model runs required to fit the emulator can be orders of magnitude less than the number of runs required to directly estimate the sensitivity measures from the model.
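A minimal sketch of the emulator idea, with a cheap polynomial surrogate fitted to a stand-in "expensive" model; the function and input range are illustrative assumptions:

```python
# Fit a polynomial emulator eta(x) to a few runs of an "expensive" model f(x),
# then run cheap Monte Carlo on the emulator instead of the model.
import numpy as np

def expensive_model(x):                  # stand-in for a slow simulation
    return np.sin(x) + 0.5 * x

x_train = np.linspace(0, 3, 15)          # a handful of model runs
y_train = expensive_model(x_train)

coeffs = np.polyfit(x_train, y_train, 4) # fit the emulator eta(x)
emulator = np.poly1d(coeffs)

# Monte Carlo on the emulator at negligible cost
rng = np.random.default_rng(3)
xs = rng.uniform(0, 3, size=50_000)
output_var = emulator(xs).var()          # input to variance-based sensitivity

print(output_var)
```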


Scenario Analysis 

Scenario analysis is a method for predicting possible future outcomes or the consequences of a situation, assuming that a phenomenon or trend will continue in the future.


BackTesting 

Backtesting measures the accuracy of value-at-risk calculations or the probability of success of a strategy: it is the process of determining how well a strategy would have performed using historical data.
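A toy backtest sketch, assuming a simple moving-average crossover strategy evaluated on an illustrative random-walk price series (not real market data):

```python
# Backtest of a moving-average crossover: long when the fast MA is above
# the slow MA, flat otherwise. Signals are lagged one day to avoid lookahead.
import numpy as np

rng = np.random.default_rng(4)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, 500)))

fast = np.convolve(prices, np.ones(10) / 10, mode="valid")
slow = np.convolve(prices, np.ones(50) / 50, mode="valid")
fast = fast[-len(slow):]                     # align the two series

signal = (fast > slow).astype(float)         # 1 = long, 0 = flat
rets = np.diff(np.log(prices[-len(slow):]))
strat_rets = signal[:-1] * rets              # yesterday's signal, today's return

total_return = np.exp(strat_rets.sum()) - 1
hit_rate = (strat_rets > 0).mean()
print(total_return, hit_rate)
```

A production backtest would add transaction costs, slippage and out-of-sample validation on real data.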


Stress Test 

Stress testing is a computer-simulated technique to analyse how investment portfolios fare in drastic economic scenarios. Stress testing helps gauge investment risk and the adequacy of assets, as well as to help evaluate internal processes and controls.
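A minimal scenario-based stress test sketch; the holdings and shock sizes below are purely illustrative assumptions:

```python
# Apply drastic-but-plausible shocks to each asset class and revalue
# the portfolio under each scenario.
holdings = {"equities": 600_000, "bonds": 300_000, "commodities": 100_000}

scenarios = {
    "2008-style crash": {"equities": -0.40, "bonds": 0.05, "commodities": -0.30},
    "rates shock":      {"equities": -0.10, "bonds": -0.15, "commodities": 0.00},
}

for name, shocks in scenarios.items():
    pnl = sum(value * shocks[asset] for asset, value in holdings.items())
    print(f"{name}: P&L = {pnl:,.0f}")
```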


Liquidity Analysis   

Liquidity assessment of portfolios provides the ability to quantitatively evaluate market liquidity across multiple asset classes in normal and stressed market conditions.
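One simple quantitative liquidity measure is the Amihud illiquidity ratio, the average of |daily return| divided by daily dollar volume; a sketch with illustrative random data:

```python
# Amihud illiquidity ratio: higher values mean the price moves a lot
# per dollar traded, i.e. the asset is less liquid.
import numpy as np

rng = np.random.default_rng(5)
returns = rng.normal(0, 0.02, 250)             # illustrative daily returns
dollar_volume = rng.uniform(1e6, 5e6, 250)     # illustrative traded value per day

amihud = np.mean(np.abs(returns) / dollar_volume)
print(amihud)
```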


Forecasting

Predicting stock prices is an uncertain task that can be modelled with machine learning to predict the return on stocks. Many methods and tools are used for market prediction, since financial markets are very dynamic and complex in nature. An accurate prediction of future prices can lead to a higher yield for investors, who can pick the stocks likely to give a higher return.

Over the years, various machine learning techniques have been used in stock market prediction, but with the increased amount of data and the expectation of more accurate predictions, deep learning models are now preferred, having proven their advantage over traditional machine learning methods in terms of accuracy and speed of prediction.

The Long Short-Term Memory (LSTM) recurrent neural network belongs to the family of deep learning algorithms. It is a recurrent network because of the feedback connections in its architecture, and it has an advantage over traditional neural networks thanks to its ability to process entire sequences of data. Its architecture comprises the cell, an input gate, an output gate and a forget gate. The cell remembers values over arbitrary time intervals and keeps track of the dependencies between the elements in the input sequence, while the three gates regulate the flow of information into and out of the cell: the input gate controls the extent to which a new value flows into the cell, the forget gate controls the extent to which a value remains in the cell, and the output gate controls the extent to which the value in the cell is used to compute the output activation of the LSTM unit.
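The gate mechanics described above can be sketched as a single LSTM step in plain NumPy; the weights here are random and purely illustrative, not a trained forecasting model:

```python
# One LSTM time step, showing the input, forget and output gates.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """W, U, b each stack the parameter blocks of the 4 gates."""
    z = W @ x + U @ h_prev + b
    n = len(h_prev)
    i = sigmoid(z[:n])          # input gate: how much new value enters the cell
    f = sigmoid(z[n:2*n])       # forget gate: how much of the cell state stays
    o = sigmoid(z[2*n:3*n])     # output gate: how much of the cell is exposed
    g = np.tanh(z[3*n:])        # candidate cell value
    c = f * c_prev + i * g      # updated cell state
    h = o * np.tanh(c)          # output activation of the LSTM unit
    return h, c

rng = np.random.default_rng(6)
n_in, n_hid = 3, 5
W = rng.normal(size=(4 * n_hid, n_in))
U = rng.normal(size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(10, n_in)):   # process a sequence of 10 inputs
    h, c = lstm_step(x, h, c, W, U, b)
print(h)
```

In practice one would use a framework such as Keras or PyTorch rather than hand-rolled cells; this sketch only makes the gate equations concrete.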

  
Financial Engineering:

Expertise in pricing and hedging every kind of structured product; structuring and origination of complex strategies.


Portfolio Enhancement & Cost Analysis:

We find the optimal portfolios that offer the highest expected return for a defined level of risk. At the same time, cost analysis allows us to reduce the impact of transaction and management costs.


Sentiment Analysis:

Sentiment analysis (also known as emotion AI) is the use of natural language processing, text analysis, computational linguistics, and biometrics to systematically identify, extract, quantify, and study affective states and subjective information. In the financial world it allows portfolio managers to understand the mood of retail and institutional traders and to adopt appropriate strategies accordingly. The main difficulty is linking tickers to the news and understanding when a piece of news is fake or doesn't affect the price of the related stock.
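A toy lexicon-based scorer illustrates the basic idea; production systems use trained NLP models, and the word lists here are illustrative assumptions:

```python
# Naive lexicon-based sentiment score for financial headlines:
# count positive words minus negative words.
POSITIVE = {"beats", "surges", "upgrade", "record", "growth"}
NEGATIVE = {"misses", "plunges", "downgrade", "fraud", "losses"}

def sentiment(headline: str) -> int:
    words = headline.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment("ACME beats estimates, revenue growth at record pace"))  # positive
print(sentiment("ACME plunges after downgrade"))                         # negative
```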

 

Why we are different

Sophisticated pricing, hedging, forecasting and risk analysis models are often useless if the calculations are done by people who do not constantly follow the financial markets.
All our models are adopted and tested by traders and portfolio managers, for portfolio managers.


Excerpt of the code of a graph neural network, the preferred neural network architecture for processing data structured as graphs (for example, social networks or molecular structures), yielding better results than fully connected networks.


Backtesting of a long/short strategy

Stunning "white label" reports available for our Institutional Clients
 
