
The role of technology in driving better trading decisions

By Mike Woodacre

Bringing more insight to risk-based decision-making is critical in the financial trading space. And technology has a huge role to play, argues Mike Woodacre.

Data is transforming how finance firms and hedge funds operate. It informs their risk management strategies. It governs their trading algorithms. It influences their modelling.

And as such, a robust strategy for managing that data, one that gets the most out of the information at your disposal, is critical.

The first thing to consider is the sheer amount of data available these days. There’s more data than ever before, and it’s increasingly coming from multiple sources – whether it’s historic trading data, real-time updates and streaming data, or so-called alternative data such as social media sentiment, geospatial information, news and other sources that could potentially have an impact on a market or otherwise provide an advantage.

Approached in the right way, all those different datasets can be combined to build a richer, more complete picture of the environment you’re dealing with, and enable a better understanding of what’s going on in a particular market. Are there historic patterns of risk? What types of events precede those situations? And how does that information enable you to anticipate new risk events that may occur in the future?
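As a rough illustration of that kind of combination – with hypothetical file and column names, and assuming a Python environment with pandas available – the sketch below aligns historic trades with a news sentiment feed on a shared time index:

```python
# Illustrative sketch: aligning historic trading data with an
# alternative dataset (news sentiment) on a shared time index.
# File names and column names here are hypothetical.
import pandas as pd

# Historic trades: timestamp, symbol, price, volume
trades = pd.read_csv("trades.csv", parse_dates=["timestamp"])

# Alternative data: timestamp, symbol, sentiment score in [-1, 1]
sentiment = pd.read_csv("news_sentiment.csv", parse_dates=["timestamp"])

# merge_asof pairs each trade with the most recent sentiment
# reading at or before it, per symbol, giving a richer picture
# than either dataset alone.
combined = pd.merge_asof(
    trades.sort_values("timestamp"),
    sentiment.sort_values("timestamp"),
    on="timestamp",
    by="symbol",
    direction="backward",
)

# One simple question to ask of the combined picture: how do
# returns behave when sentiment was negative just beforehand?
combined["return"] = combined.groupby("symbol")["price"].pct_change()
print(combined.loc[combined["sentiment"] < 0, "return"].describe())
```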

In order to get more accurate answers to some of those questions, working with data in near real-time is increasingly critical. The nearer you can get to a real-time event, the more valuable that is. The more processing time it takes to make sense of the data and deliver an output to a decision-maker, the greater the chance of missing an opportunity or failing to mitigate a risk.

The good news is that we’ve now got the technology to start dealing with the amounts of data at the speeds needed to give those near real-time decisions. And that technology is evolving in a number of ways.

We’re seeing a proliferation of alternative processing technologies such as computational accelerators, which make it possible to run dense compute over patterns of activity rather than relying solely on traditional algorithms. We’re looking at using artificial intelligence and machine learning to process and make sense of that data. Unsupervised machine learning is particularly interesting in this space, because it looks for the ‘unknown unknowns’ in an increasingly huge stack of data – and it is just one of many techniques.
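As a minimal sketch of that unsupervised approach – assuming scikit-learn and a hypothetical, synthetic set of engineered market features – an isolation forest can flag the windows of activity that look least like everything else, with no labelled examples of ‘risk’ required:

```python
# Illustrative sketch: unsupervised anomaly detection over market
# features, in the spirit of hunting for 'unknown unknowns'.
# Uses scikit-learn's IsolationForest; the feature set is synthetic
# and hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic stand-in for engineered features per time window,
# e.g. [return, volatility, volume change, sentiment shift].
normal = rng.normal(0.0, 1.0, size=(10_000, 4))
shocks = rng.normal(0.0, 6.0, size=(20, 4))   # a few extreme windows
features = np.vstack([normal, shocks])

# No labels: the model learns what 'typical' looks like and flags
# the points that are easiest to isolate, i.e. the outliers.
model = IsolationForest(contamination=0.005, random_state=0)
flags = model.fit_predict(features)           # -1 = anomaly, 1 = normal

print(f"Flagged {int((flags == -1).sum())} of {len(features)} windows for review")
```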

How you connect all these different elements together is therefore increasingly important. You’ve got to be able to ingest data into your infrastructure at unprecedented rates. You’ve got to be able to move it around and apply the appropriate processing element to it. And then you’ve got to give the decision-maker a dashboard-like experience, making it easy for them to know what to do with the knowledge or information being presented to them.
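In miniature, that ingest-process-present flow might look something like the sketch below, which uses an in-process queue as a stand-in for a real streaming layer; every name and threshold in it is hypothetical:

```python
# Illustrative sketch of an ingest -> process -> present pipeline,
# with an in-process queue standing in for a real streaming layer.
# Symbols, thresholds and the alert rule are all hypothetical.
import queue
import random
import threading

ticks: queue.Queue = queue.Queue(maxsize=10_000)

def ingest(n: int = 100) -> None:
    """Push raw market events onto the pipeline as they arrive."""
    for _ in range(n):
        ticks.put({"symbol": "XYZ", "price": 100 + random.gauss(0, 1)})
    ticks.put(None)  # sentinel: stream finished

def process_and_present() -> None:
    """Apply the processing step, then surface a decision-ready signal."""
    prices = []
    while (event := ticks.get()) is not None:
        prices.append(event["price"])
        window = prices[-20:]                        # rolling view
        avg = sum(window) / len(window)
        if abs(event["price"] - avg) > 2.5:          # simple alert rule
            print(f"ALERT {event['symbol']}: {event['price']:.2f} vs avg {avg:.2f}")

threading.Thread(target=ingest, daemon=True).start()
process_and_present()
```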

There’s no one-size-fits-all solution. But it is increasingly possible to combine multiple techniques to give you the best answer.

A comparison with the life sciences industry provides an interesting illustration. A few years ago, a doctor would pore over medical journals to find whatever it was they needed to know. Today, however, there are so many journals, and so much information, that it’s impossible for an individual to retain all of that historic knowledge in their head at one time. By applying machine learning techniques to the wealth of data contained within all those medical journals, technology can surface the key information that the human in the loop – the physician – needs in order to make the correct real-time decision.

Ultimately, technology is about augmenting that decision-making process, and so having the human in the loop is critical. How do we use these tools to understand the data in order to make better quality decisions, whether they relate to risk management in the finance space, or which medical treatment path to go down in the life sciences space? That is the key question.

And increasingly, technology is providing some interesting answers.

Mike Woodacre is an HPE Fellow and Chief Technology Officer for the High Performance Computing (HPC) and Mission Critical Solutions (MCS) group at HPE. Mike joined HPE with the SGI acquisition in 2016, after 26 years at SGI where he was Chief Engineer for scalable systems. He has a BSc in Computer Systems Engineering from The University of Kent, Canterbury, and has been granted multiple US patents in the field of computer systems architecture.
