Bank of America Merrill Lynch Focused on Building a Data Fabric to Support “Multiple Truths”, Says Dalglish

Bank of America Merrill Lynch has recalibrated the goal of its reference data project: the focus is no longer on merely establishing a single gold copy, but on building a robust data fabric to support all of its downstream users’ requirements, according to Tom Dalglish, director and chief information architect at the firm. The project has also secured the three Ms of a successful reference data project: management, money and mandate, he tells Reference Data Review.

“In some sense the single golden copy era has passed and now there is more of a focus on building a data fabric that is able to cope with the business requirement for global distribution and multiple different output formats for downstream systems,” explains Dalglish. “Individual business lines need different data sets and firms must be able to deal with these client-driven multiple truths and manage the raw data accordingly.”
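
To make the “multiple truths” idea concrete, here is a minimal sketch, in Python, with invented names throughout (it is not Bank of America Merrill Lynch’s actual data model), of one raw reference record projected into different views for different business lines:

```python
from dataclasses import dataclass

# Hypothetical raw reference record; field names are illustrative only.
@dataclass
class RawInstrument:
    isin: str
    cusip: str
    issuer: str
    prices: dict  # vendor -> price, kept side by side, not collapsed to one "truth"

def view_for_risk(inst: RawInstrument) -> dict:
    # Risk wants a conservative number: the lowest price across vendors.
    return {"id": inst.isin, "issuer": inst.issuer, "price": min(inst.prices.values())}

def view_for_trading(inst: RawInstrument) -> dict:
    # The desk keys instruments off CUSIP and prefers its chosen vendor.
    return {"id": inst.cusip, "price": inst.prices.get("vendor_a", 0.0)}

inst = RawInstrument("US0378331005", "037833100", "Apple Inc",
                     {"vendor_a": 170.10, "vendor_b": 170.05})
print(view_for_risk(inst))     # risk's truth: ISIN-keyed, lowest price
print(view_for_trading(inst))  # the desk's truth: CUSIP-keyed, preferred vendor
```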

Reference Data Review readers should be no strangers to this development, given comments made last year by Nick Murphy, data specialist for pricing and evaluations at ING Investment Management, on the death of “golden copy” as a term within his own institution. Meeting downstream users’ requirements, whether internal or external, is now paramount across the reference data industry, and this has significantly reshaped the approach to data management projects.

However, this development does not mean the death of golden copy as a concept (even if the term has fallen out of favour in some circles). Dalglish explains: “Golden copy engines surely still have their uses but individual data models may be somewhat less important: firms need a semantics-based repository and a discoverable metadata repository that allows users to navigate their reference data effectively.”
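
A “discoverable metadata repository” of the kind Dalglish describes could be as simple as a searchable catalogue of datasets and their semantic tags. The sketch below is purely illustrative, with a hypothetical registry and invented tags:

```python
# Hypothetical metadata registry: datasets registered with semantic tags
# so users can discover data without knowing each physical store.
catalog: dict[str, dict] = {}

def register(dataset: str, location: str, tags: set[str]) -> None:
    catalog[dataset] = {"location": location, "tags": tags}

def discover(*wanted: str) -> list[str]:
    # Return every dataset whose tags cover all of the requested concepts.
    return [name for name, meta in catalog.items() if set(wanted) <= meta["tags"]]

register("equity_master", "db://refdata/equity", {"instrument", "equity", "isin"})
register("counterparty_master", "db://refdata/cpty", {"counterparty", "lei", "hierarchy"})

print(discover("counterparty", "hierarchy"))  # ['counterparty_master']
```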

This focus still requires strong extract, transform and load (ETL) tools, a powerful database and rules engine, and some notion of enterprise data management (EDM), he explains, although the real key to success is still the adoption of enterprise data by internal clients. “It is not sufficient to have a good data model if nobody wants to use it. The need for all reference data to be ubiquitous simply translates into providing easier access to entitled data,” he elaborates.
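
“Easier access to entitled data” implies pushing entitlement checks into the data access layer itself, so every consumer uses the same simple call. The sketch below, with an invented role-to-dataset policy, shows one possible shape:

```python
# Hypothetical entitlement layer: the check lives in the data service,
# so every downstream client gets the same simple call.
ENTITLEMENTS = {
    "risk_analyst": {"equity_master", "counterparty_master"},
    "sales": {"equity_master"},
}

def fetch(dataset: str, role: str) -> dict:
    if dataset not in ENTITLEMENTS.get(role, set()):
        raise PermissionError(f"role '{role}' is not entitled to '{dataset}'")
    # A real service would now query the underlying golden/source store.
    return {"dataset": dataset, "rows": []}

print(fetch("equity_master", "sales"))   # allowed
# fetch("counterparty_master", "sales")  # would raise PermissionError
```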

To this end, Bank of America Merrill Lynch’s data management-focused team has received the funding and support it needs to start down this road in earnest. “There is a real focus from senior management on the importance of getting the data architecture right for the business. Funding has been strong and it is widely recognised that reference and market data management needs to be a true horizontal across all business lines, much like a utility that is run within the firm,” says Dalglish. And he reckons his own institution is not alone in this: “Finally, many firms have been handed the mandate to fix reference data in advance of anticipated regulations.”

Dalglish notes that spending on data management was pulled back after the 2008 crisis, but that 2010 brought a strong focus back to the space, driven by the requirement to keep closer track of counterparty risk exposure and to meet new regulatory requirements. The ongoing ripples from the fall of Lehman likely also played their part in raising the profile of counterparty risk, along with regulatory developments around Basel III and the like.

Dalglish reckons that the prevalent attitude is to treat data as a first-class construct and to address problems with legacy systems. “There are also a lot of new vendors getting on the data bandwagon and increasing the range of options out there; 2011 is likely to see this trend continue,” he adds.

On the flip side, the maturing of the EDM practice has meant that a number of the traditional EDM solution vendors are struggling for position, especially in the face of new competitors in the market. “It is likely that we will see further consolidation in this space,” contends Dalglish. “On the plus side, we have a much wider choice of vendors than ever before though it has become difficult to sort through so many products all claiming to be in the EDM space. There are some fascinating new vendors in the market including some which are providing corporate hierarchies along with strong data visualisation tools.”

It will certainly be interesting to see who emerges as the winners in the race for greater market share in the data management sector, as vendors up their game in light of the increasing number of regulatory-driven projects getting the green light this year. Buzzwords such as data caching, near-time and standards agnostic have been prevalent over recent months and are likely to attract more interest over 2011.
