A revolution in Asset Management is coming. The current cost pressures, increased regulation and changing customer expectations cannot be sustainably served by the systems on offer today. Our belief is that technology will start this revolution and dictate how the future of the industry looks.
What has changed?
· Portfolios have become more complicated: multi-asset, multi-class and multi-time zone.
· Regulators have increased their demands for timeliness, accuracy and lineage of data.
· Institutional clients, especially pension funds, are being made acutely aware of their fiduciary responsibilities: not just to inform their members but to be seen to manage their asset managers. They want transparency not just of costs but also of performance.
In my view, it will become increasingly unacceptable for the industry not to use the infrastructure that enables a mutually beneficial connection between asset owner, manager and servicer.
What does the current landscape look like?
Our clients tell us that current systems architecture in investment management is holding back the industry. Typically, systems are heavily based on batch processing, producing end-of-day positions with limited real-time capability. When data issues arise, or when we want to look back and discover which investment decisions and activities got us to where we are now, doing so may be almost impossible.
It is easy to see why such system choices were made. We made these decisions ourselves, and with solid reasoning based on the best information then available. In the recent past there was no requirement for real-time data – or, more specifically, the cost-benefit trade-off of providing the capability weighed heavily on the cost side. Investment Managers were generally single-region and single-asset-class. Overnight downtime was a negligible issue.
Cast your mind back to some of the technology affecting these decisions in the 1990s and early 2000s:
· Memory was expensive and managed byte by byte. Compressing data to only essential pieces of information was the only correct choice.
· Storage was expensive. Keeping petabytes of information without a regulatory or client imperative was not sensible and indeed, it was not technically feasible to manage this quantity of data.
· Processing was slow and searching data was sub-optimal. Finding value in large amounts of data was still a twinkle in some research student's eye; the financial industry made most of the decisions governing the technical landscape, and Google didn't exist!
Times have changed, however, and now is the time to start again with the latest technology. While huge advances in financial technology have been made in banking and commerce (on the SellSide) by exploiting modern computing power, only recently have a few pioneers decided to bring the latest technology to the BuySide: to asset managers and their clients.
What can we do differently now?
· Create software that gives efficient real-time valuations and analytics 24/7. Elastic compute and virtually infinite storage, available through the cloud, mean we can run more calculations. We can also use, keep and search more data.
· Build systems that never forget a transaction – bi-temporal data stores that act as a time machine, allowing you to store your data in a way that means you never have to go to back-ups to find out what the system looked like in the past.
· Build out-of-the-box systems to support what-if scenarios and simulations of change.
· Make performance transparency a reality – we have all the data we need now.
· Through the capability of ‘never forgetting’, automatically retain data needed for compliance, allowing specialised ‘RegTech’ software to worry about the workflow rather than taking copies of the trading book to prove the state of the system at the point decisions were made.
· Put paid to payment for upgrades. If we build systems to be API first, simple and performant we can move to a model where large system upgrades are a thing of the past.
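The 'never forget' idea above rests on a bi-temporal data store: every fact carries two timestamps, when it was true in the world (effective time) and when the system learned it (as-at time), and corrections add rows rather than overwrite. The sketch below is a minimal, illustrative in-memory version – the class and field names are my own, not any particular product's API:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Entry:
    key: str             # e.g. an instrument identifier
    value: float         # e.g. a position size
    effective: datetime  # when the fact applied in the real world
    as_at: datetime      # when the system recorded it

class BiTemporalStore:
    """Append-only store: corrections add new rows, never overwrite old ones."""

    def __init__(self):
        self._entries = []

    def record(self, key, value, effective, as_at):
        self._entries.append(Entry(key, value, effective, as_at))

    def get(self, key, effective, as_at):
        """What did the system believe, as at `as_at`, the value was at `effective`?"""
        candidates = [e for e in self._entries
                      if e.key == key and e.effective <= effective and e.as_at <= as_at]
        if not candidates:
            return None
        # Latest effective time wins; ties broken by the latest recording.
        return max(candidates, key=lambda e: (e.effective, e.as_at)).value
```

Because nothing is ever deleted, querying with an earlier `as_at` reproduces exactly what the system believed at that moment – the 'time machine' view – even after later corrections have been booked.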
Through cloud hosting, and the enhanced, shared identity that even regulators have begun to endorse and use, we can:
· ‘Mutualise’ on a common platform the ‘plumbing’ of digital systems – the non-differentiating functions of infrastructure that are as logically shared as utilities such as water and electricity
· Create fully scalable systems reactive to fluctuations in demand and so eliminate capacity worries, and their capital implications – forever
· Dramatically cut the costs asset managers incur when changing their systems
· Enjoy the impressive physical security, close to military grade, that cloud hosts such as AWS and Azure can offer
· Almost inevitably, not just reduce operating costs, but slash them.
The combination of positive effects on the fund management industry of this approach is impressive. By mutualising infrastructure, management will be liberated from digital worries and be able to concentrate on its core competency. Operating costs will almost certainly reduce at a time when margins are threatened by regulation and by the growth of low cost passive funds. Through transaction memory and real-time data, portfolio managers will better understand current positions and trace the steps that got them there.
But for any business, the true measure has to be the gaining and retention of clients. Two things stand out in this respect. First, institutional clients experiencing new pressures to rigorously supervise their investments can now be better informed on the real performance of their investment managers than was hitherto possible. Second, where asset managers have taken advantage of the new cloud-based technology to free themselves from much of the admin load of old systems, they can devote freed-up cash and personnel to outperforming in their core function, ensuring success in client retention and recruitment.
By Thomas McHugh
Thomas McHugh is a former Head of Quantitative Development for RBS and is a founder director of FINBOURNE Technology, whose LUSID® asset management platform was announced in October.