An interview with Prof. Marco Avellaneda, Professor of Mathematical Finance at New York University Courant Institute of Mathematical Sciences, as told to Irene Aldridge.
With Q1 2016 registering levels of volatility in the U.S. markets not seen in years, we sat down with volatility expert Prof. Marco Avellaneda of the NYU Courant Institute of Mathematical Sciences to discuss what is driving developments in the markets. Prof. Avellaneda is an internationally recognized figure in the field of computational volatility modeling. Some of his latest research will be presented at the Big Data Finance Conference, to be held at New York University on May 19-20, 2016.
IA: What causes the volatility we see in the markets today?
MA: The observed Q1 ‘16 volatility in the markets is related to the following factors:
1) Financials of companies exposed to energy volatility
2) Emerging markets in places like Brazil, Russia and China that are experiencing political and energy-related turmoil, and
3) U.S. equity prices near an all-time high, implying either a correction or a new high, each with roughly 50/50 probability.
Most worrisome to me, as a mathematical researcher studying market developments, is the strong correlation between oil and the S&P 500: a healthy economy should be independent of oil prices or, even better, depend negatively on them. Reactions to non-energy-related news also play a role in volatility, as these events are amplified by highly leveraged hedge funds and create bounces in the markets. As a result, many traders end up in a highly volatile sideways market, with one finger on the "buy" button and the other on the "sell" button, not knowing which to press.
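The oil/equity dependence Prof. Avellaneda describes is typically measured as a rolling correlation of daily returns. The following is a minimal sketch of that computation; the two return series here are synthetic, randomly generated with a built-in positive dependence, whereas a real analysis would use oil-futures and S&P 500 return histories.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500                       # number of trading days (synthetic sample)
oil = rng.normal(0, 0.02, n)  # hypothetical daily oil returns
# hypothetical S&P 500 returns, constructed to co-move with oil plus noise
spx = 0.5 * oil + rng.normal(0, 0.01, n)

def rolling_corr(x, y, window=60):
    """Pearson correlation of x and y over a trailing window."""
    out = np.full(len(x), np.nan)  # no estimate before the first full window
    for t in range(window, len(x) + 1):
        xs, ys = x[t - window:t], y[t - window:t]
        out[t - 1] = np.corrcoef(xs, ys)[0, 1]
    return out

corr = rolling_corr(oil, spx)
print(round(float(np.nanmean(corr)), 2))  # strongly positive for this synthetic pair
```

A persistently high value of this statistic for real oil and equity data is exactly the warning sign discussed above.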
Most of my research has focused on the risk management of such volatility. In today's age of Big Data, risk management is not only desirable, it is mandatory and even affordable. Since the markets lack a clear direction, and most money managers are paid to prudently grow and protect capital rather than to guess the market's direction, we need to be in the market while keeping strong control over portfolio swings. Advances in data processing and in the mathematics of volatility modeling allow us to model markets quickly and efficiently, in ways unthinkable even ten years ago.
Today, we can simultaneously simulate risk factors, volatilities and correlations in multi-asset markets with thousands of financial instruments, accurately tracing the effects of the slightest perturbations through full-blown multi-dimensional market risk scenarios. From there, we can efficiently estimate the required hedging and find the instruments to implement it in the most cost-effective way.
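The kind of scenario generation described above can be illustrated in miniature. The sketch below simulates correlated risk-factor shocks via a Cholesky factorization of an assumed covariance matrix and maps them to portfolio P&L; the three factors, their volatilities, correlations, and dollar exposures are all hypothetical, and a production system would handle thousands of factors and nonlinear instrument pricing.

```python
import numpy as np

rng = np.random.default_rng(0)

vols = np.array([0.15, 0.25, 0.40])      # assumed annualized factor vols
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.5],
                 [0.3, 0.5, 1.0]])       # assumed factor correlation matrix
cov = np.outer(vols, vols) * corr        # covariance of factor returns

L = np.linalg.cholesky(cov)              # maps iid normals to correlated shocks
n_scenarios = 100_000
z = rng.standard_normal((n_scenarios, 3))
shocks = z @ L.T                         # correlated factor-return scenarios

exposures = np.array([1_000_000, -250_000, 500_000])  # hypothetical dollar exposures
pnl = shocks @ exposures                 # linear P&L in each scenario

var_99 = -np.percentile(pnl, 1)          # 99% value-at-risk estimate
print(f"99% VaR estimate: ${var_99:,.0f}")
```

From such a scenario distribution one can then search for the cheapest set of hedging instruments that brings the tail risk within limits, which is the optimization step mentioned above.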
IA: How is Big Data changing business as usual in finance?
MA: Data keeps growing in size. Sell-side and buy-side firms need to take inspiration from Big Data companies such as Facebook and Google: they should feel comfortable using data as a source of ideas, and also keep records and data-mine their own business. The upcoming Fundamental Review of the Trading Book (FRTB), the latest addition to the post-crisis regulatory framework, requires that all desks in a bank report their risk and take initial margin for trades according to detailed procedures. The execution of FRTB will require massive warehousing of trading data, as well as price histories of everything
under the sun. These developments are among the primary reasons we decided to hold the annual Big Data Finance conference at NYU Courant, now in its fourth year and a tremendous success.
While quantitative analysis has been the job description for the past 15 years, quantitative analysis coupled with large-scale number-crunching will be the massive job of the future. Rather than considering this an annoyance, the sell side should recognize that the business is data-driven and embrace the new paradigm (or new normal). Firms that cannot respond to regulators' and clients' requirements will not stay in the business.
Another very interesting aspect of data is the Central Counterparty story. Data repositories and vendors are being acquired by financial institutions at great speed. I note in particular the acquisition of IDC by ICE (Intercontinental Exchange) for 2.5 billion USD and the recent merger of Markit and IHS. Basically, as markets become more like big risk warehouses and mark-to-market becomes mandatory for all sorts of products (swaps, CDS, OTC energy), the data required for computing prices and risk (usually based on historical data and mathematical-finance pricing models) becomes a precious commodity.
Some people say that water is the precious commodity of the future. I think data is. Seriously.
Prof. Marco Avellaneda is a Full Professor at the New York University Courant Institute of Mathematical Sciences. Prof. Avellaneda was named Quant of the Year by Risk magazine, and will speak at the fourth annual Big Data Finance 2016 conference at NYU Courant on May 19-20, 2016. He consults in the area of option pricing and related optimization, and has recently launched a new, efficient option-pricing streaming data platform. He can be reached at marco.avellaneda@