With reference to the HBO hit Westworld, capital markets and fintech research specialist Virginie O’Shea argues that a solid data foundation is critical to AI success
The industry may be abuzz with the potential of artificial intelligence to transform the way capital markets operate. But is rushing into an implementation without the necessary groundwork really the best approach?
After all, things didn’t work out too well for the denizens of HBO’s Westworld, and the movie RoboCop is a testament to the perils of automating compliance without the right security controls in place.
Westworld envisions a future of realistic, humanoid, AI-based robots that dominate a Wild West-style theme park. Things go vastly awry when these beings start to think on their own. In the business world, AI can similarly be perilous – or it can potentially help to save your enterprise.
Although it is easy to laugh off the lessons that can be learned from a TV series or a movie, they often have a recurring message that can be applied to the capital markets. AI is only as good as the data it has access to.
“Would you want your AI making decisions based on faulty data?” – Virginie O’Shea, capital markets and fintech research specialist
This year overall and the COVID-19 crisis in particular have proven that digitalization and transformation are essential to the future of this industry – not only in the front office, but in the middle and back office, as well.
I have had frequent conversations with operations teams over the last six months that indicate we’ve still got a long way to go before we’re fully efficient. Firms are still struggling with manual processes, dependent on fax machines, printers, scanners and paper-based documentation. A lot of information is hard to access, and automation is lacking within core post-trade functions.
Artificial intelligence has huge potential to augment and transform current market practices, but we need to lay the groundwork to ensure it is implemented successfully. Market leaders have learned that firms first need to build out their data foundation layers to make best use of this technology.
Faulty Data will ‘Corrupt’ Financial AI Systems
There’s been a lot of industry discussion about the potential of machine learning and AI to tackle challenges ranging from regulatory change management to improving the client experience over the last few years.
In 2018, Sibos, the annual mega-banking conference that travels the globe, signalled that AI had overtaken blockchain and other technologies on the industry’s agenda, with a record 25 sessions covering the many facets of machine learning’s potential.
Last year, AI continued to dominate the agenda, and the conference’s closing keynote saw Google Cloud CEO Thomas Kurian championing the use of AI by industry regulators to ease the burden of compliance.
However, another recent consideration is the importance of underlying data infrastructure in the effort to turn any of these technology ambitions into a reality.
As we creep closer to this year’s first-ever virtual Sibos in October, it’s important for the industry to keep data infrastructure improvements top of mind. After all, AI’s successful deployment for any task is dependent on the provision of high-quality, accurate data.
Think back to Westworld again: Would you want your AI making decisions based on faulty data? Of course, your compliance or trading system isn’t likely to shoot you in a Wild West gunfight if you get it wrong. But it could have significant negative impacts on your business and reputation.
Top-tier banks that have successfully deployed AI to support tasks such as market sentiment analysis or trade surveillance can tell you that, though the technology is great at pattern recognition, it doesn’t work well if the data inputs are inconsistent or low volume. AI requires a high volume of consistent and compatible data to deliver actionable insights to traders, compliance teams or any other functions and lines of business attempting to deploy the technology.
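The consistency requirement can be made concrete with a simple pre-flight check: before any batch of records reaches a model, verify that every record carries the same fields with sensible values and that the sample is large enough to be meaningful. A minimal sketch in Python, where the field names and the volume threshold are illustrative assumptions, not details from the article:

```python
# Hypothetical pre-flight check on trade records before they feed an AI model.
# REQUIRED_FIELDS and MIN_RECORDS are illustrative assumptions.

MIN_RECORDS = 1000  # below this, pattern recognition is unlikely to be reliable
REQUIRED_FIELDS = {"trade_id", "timestamp", "price", "quantity"}

def validate_feed(records):
    """Return (ok, reason) for a batch of trade records (list of dicts)."""
    if len(records) < MIN_RECORDS:
        return False, f"only {len(records)} records; need at least {MIN_RECORDS}"
    for i, rec in enumerate(records):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            return False, f"record {i} missing fields: {sorted(missing)}"
        if rec["price"] is None or rec["price"] <= 0:
            return False, f"record {i} has invalid price: {rec['price']}"
    return True, "ok"
```

A gate like this rejects the batch outright rather than letting inconsistent inputs silently degrade the model’s output.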
Building Intelligence into Financial Data Management
So, how does the industry tackle the thorny issue of data management when it must gather that data from a patchwork of systems, including decades-old legacy technologies and static, inflexible data lakes that are more like ‘data swamps’?
One approach is to deploy a next-generation, real-time intelligent data layer that can sit between existing applications and AI-enabled applications to allow seamless and real-time data access, integration and analysis.
These applications must scale out dynamically to accommodate increases in data volumes and workloads as market activity and volatility spike in times of crisis. This means firms can benefit from next-generation technology without an entire structural rebuild of every enterprise data store.
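As a toy illustration of such an intermediary layer, the sketch below uses small adapter functions to normalize records arriving from two hypothetical legacy feeds into one common schema that downstream consumers, AI-enabled or otherwise, can query. The feed formats and field names are invented for illustration:

```python
# Toy sketch of an intelligent data layer that harmonizes records from
# heterogeneous legacy feeds into one queryable view.
# Both feed formats and all field names are hypothetical.

from datetime import datetime, timezone

def from_legacy_csv(row):
    """Adapter for a legacy system that emits comma-separated strings."""
    trade_id, symbol, price, ts = row.split(",")
    return {"trade_id": trade_id, "symbol": symbol, "price": float(price),
            "ts": datetime.fromtimestamp(int(ts), tz=timezone.utc)}

def from_legacy_dict(msg):
    """Adapter for a legacy system that emits dicts with its own field names."""
    return {"trade_id": msg["id"], "symbol": msg["ticker"], "price": msg["px"],
            "ts": datetime.fromtimestamp(msg["epoch"], tz=timezone.utc)}

class DataLayer:
    """Harmonized store that existing and AI-enabled applications can share."""
    def __init__(self):
        self.records = []

    def ingest(self, record, adapter):
        # Every record lands in the same schema, whatever its source format.
        self.records.append(adapter(record))

    def latest_price(self, symbol):
        matches = [r for r in self.records if r["symbol"] == symbol]
        return matches[-1]["price"] if matches else None
```

The key design point is that the legacy systems stay untouched: each contributes an adapter, and everything downstream sees one consistent schema.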
“One approach is to deploy a next-generation, real-time intelligent data layer that can sit between existing applications” – Virginie O’Shea, capital markets and fintech research specialist
In contrast with the data lake initiatives of the past, these are ‘hot, smart data lakes’. ‘Hot’ refers to the ability to incorporate real-time data. ‘Smart’ refers to the ability to harmonize the data and run all manner of advanced analytics that transform it into value for the business.
As AI continues to bring value to capital markets firms by automating manual processes, improving compliance, reducing risk and increasing alpha, forward-looking firms are combining the latest AI and machine learning software with the newest advancements in real-time, intelligent data management technologies.
That’s how the industry’s data leaders are conquering the ‘Wild West’ of today’s harried environment and putting AI to work for good.
- Virginie O’Shea is a capital markets and fintech research specialist with two decades of experience tracking financial technology developments in the sector. This article is based on her paper, Through the Pandemic and Beyond: Overcoming Data Management Challenges in Capital Markets, produced in collaboration with InterSystems.