Regulatory Pressure is Compelling Investment in Data Quality, Says Bank of America Merrill Lynch’s Dalglish

Regulatory pressure is compelling firms to plough investment into their data infrastructures to ensure the consistency of the basic reference data underlying their businesses, according to Tom Dalglish, director and chief information architect at Bank of America Merrill Lynch. Developments such as the Office of Financial Research also offer the potential for collaboration between regulators and other parties within the industry, he says.

“The pressing requirement on the regulatory front is the ability to provide consistent data across the firm, track data flows and usage, leverage consistent identifiers and provide an auditable chain of custody for data as it traverses the enterprise,” says Dalglish. “Firms need to focus on what they are doing across lines of business to guarantee that all users are looking at consistent data and at the same time reduce duplicate storage, improve the process of data entitlement and authentication and increase the auditability of data. There are anticipated regulatory requirements for providing evidence of a data governance policy and traceability of data.” Quite a list of requirements given the siloed nature of most firms’ data infrastructures; no wonder so many financial institutions are earmarking investment.
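
To make that chain-of-custody requirement concrete, the minimal Python sketch below shows one way a consistent identifier and an auditable lineage trail might travel with a reference data record as it moves between systems. The class names, fields and sample identifier are illustrative assumptions, not a description of Bank of America Merrill Lynch’s actual architecture.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class LineageEvent:
    """One hop in a record's chain of custody as it traverses the enterprise."""
    system: str   # system that touched the record, e.g. "golden-copy-store"
    action: str   # e.g. "sourced", "validated", "distributed"
    user: str     # principal entitled to perform the action
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class ReferenceRecord:
    """A reference data record carrying one firm-wide identifier and its audit trail."""
    identifier: str                     # consistent identifier, used everywhere
    payload: dict                       # the reference data attributes themselves
    lineage: List[LineageEvent] = field(default_factory=list)

    def record_hop(self, system: str, action: str, user: str) -> None:
        """Append an auditable entry every time the data moves or is changed."""
        self.lineage.append(LineageEvent(system, action, user))

# The same record is then traceable from vendor feed to downstream consumer.
rec = ReferenceRecord("XS0000000001", {"issuer": "ACME Corp", "currency": "USD"})
rec.record_hop("vendor-gateway", "sourced", "feed-handler")
rec.record_hop("golden-copy-store", "validated", "data-steward")
rec.record_hop("risk-engine", "distributed", "entitlement-service")
```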

Regulation is proving to be both a blessing and a curse to the data management function, according to Dalglish, who, during a panel discussion last year, pointed to regulators’ intense scrutiny of the function as both a challenge and an opportunity. The regulatory spotlight can expose underlying inaccuracies that could cause reputational damage to those caught out, all of which adds up to more pressure on the data management function to perform. With investment comes great responsibility, after all.

Dalglish’s own firm is investing in its data fabric to meet the requirements of its downstream users and various regulatory reports, and he also recommends opening up lines of communication with the regulatory community. He reckons that a healthy industry dialogue on these subjects will help matters: “Although we need to be able to second-guess the regulators, we also need to be able to engage with them directly. It is important to maintain good relationships with both the regulators and data vendors.”

As for the most pressing data management items on the priority list, he indicates that governance remains a challenge within the industry, as reference data teams in many firms are still fighting silos and compensation models across lines of business. “Senior management of individual business lines may be less inclined to sign off spending on group-wide projects; they don’t necessarily want to pay for data quality for other lines of business,” he says. “Mainly, though, we have seen a broad convergence of purpose in the wake of the market turmoil of the past few years.”

Dalglish reckons the trick is to increase the efficiency and simplicity of reference data management and bring it under a good governance umbrella. “It has taken a financial crisis to underscore the importance of this endeavour, however,” he continues. “The cost benefits of consolidating vendor data, for example, are more important in the current market than they used to be, as is improving data stewardship, which allows groups to be more business effective. Moreover, the ability to rapidly onboard new data sources centrally and distribute them ubiquitously is paramount for any enterprise capability.”
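
As a rough sketch of the central onboarding and ubiquitous distribution capability Dalglish describes, the toy Python hub below registers a source once and fans records out to every subscribing consumer. The design and all names are assumptions made for illustration, not any firm’s production architecture.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class DataHub:
    """Toy central hub: onboard a source once, distribute it to every subscriber."""

    def __init__(self) -> None:
        self._sources: Dict[str, dict] = {}
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def onboard_source(self, name: str, metadata: dict) -> None:
        """Register a new vendor feed centrally, once, for the whole firm."""
        self._sources[name] = metadata

    def subscribe(self, source: str, consumer: Callable[[dict], None]) -> None:
        """Any line of business can attach to an already-onboarded source."""
        self._subscribers[source].append(consumer)

    def publish(self, source: str, record: dict) -> None:
        """Fan a record out to every subscriber of the source."""
        if source not in self._sources:
            raise KeyError(f"source {source!r} has not been onboarded")
        for consumer in self._subscribers[source]:
            consumer(record)

hub = DataHub()
hub.onboard_source("vendor-a-equities", {"format": "csv", "region": "EMEA"})
hub.subscribe("vendor-a-equities", lambda rec: print("risk received:", rec))
hub.subscribe("vendor-a-equities", lambda rec: print("ops received:", rec))
hub.publish("vendor-a-equities", {"id": "XS0000000001", "price": 101.25})
```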

So, priorities are clearer as a result of regulatory pressure, but what of the standards discussions going on at the legislative table? For example, the industry has until the end of January to respond to the Office of Financial Research’s legal entity proposals, which were published in the Federal Register at the end of November.

Dalglish reckons the ongoing symbology initiatives – including the legal entity proposals from Swift, ISO and Avox, as well as Bloomberg’s Open Symbology discussions – are a step in the right direction, but he feels the passing of the Dodd-Frank Act, and the aggressive compliance timelines specified therein, is a more interesting development. “It is likely that we will have to adopt a hybrid (public and private sector) solution between the regulatory and vendor communities,” he contends.

“The plans for a regulator-backed reference data utility are likely to be thwarted by commercial concerns, intellectual property issues and warring over standards,” elaborates Dalglish. “If, for example, the ISIN is selected as an issue-level identifier, this may pose a challenge for the industry as it is not granular enough to track all the related financial instruments required by the business for electronic trading. On the other hand, a proprietary identifier might confer an unfair advantage on a particular vendor.” As frequently noted by Reference Data Review, when it comes to standards, the choice is not always clear cut.
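
Dalglish’s granularity point is easy to illustrate: an ISIN identifies an issue, but the same issue typically trades as several distinct listings across venues, and electronic trading needs to tell those listings apart. In the Python sketch below, the listing data is illustrative and the composite issue-plus-venue key is an assumption, not an industry standard.

```python
# One issue-level ISIN can correspond to several tradable listings, so an
# issue-level identifier alone cannot distinguish the instruments a trading
# system must route to. (Identifiers and venue details are purely illustrative.)
listings = {
    "US0000000001": [
        {"mic": "XNAS", "ticker": "ACME", "currency": "USD"},
        {"mic": "XETR", "ticker": "ACM",  "currency": "EUR"},
    ]
}

def listing_key(isin: str, mic: str) -> str:
    """Compose a hypothetical listing-level key from the ISIN plus venue MIC."""
    return f"{isin}@{mic}"

for isin, venues in listings.items():
    for venue in venues:
        print(listing_key(isin, venue["mic"]), venue["ticker"], venue["currency"])
```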

Dalglish reckons instrument data standardisation will likely prove easier than establishing standards for client and counterparty data, mainly because there are so many vendors from which to buy financial securities (issue) data. “For legal entities there needs to be a data hierarchy that is able to show all the parent/child linkages and internal groupings, and which must have immutable identifiers that never get re-used. We can also foresee a need to have flexible hierarchical containers that can be extended by institutions to support internal identifiers as well,” he adds.
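
A minimal sketch of the kind of legal entity hierarchy Dalglish describes might look like the Python fragment below, with immutable identifiers that are never reissued, parent/child linkage and a flexible container for firm-internal identifiers. All class names and identifiers here are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Set

@dataclass
class LegalEntity:
    """A node in an entity hierarchy with an immutable, never-reused identifier."""
    entity_id: str                     # immutable; retired IDs are never recycled
    name: str
    parent_id: Optional[str] = None    # parent/child linkage
    internal_ids: Dict[str, str] = field(default_factory=dict)  # extensible aliases

class EntityHierarchy:
    def __init__(self) -> None:
        self._entities: Dict[str, LegalEntity] = {}
        self._retired: Set[str] = set()   # identifiers that may never be reassigned

    def add(self, entity: LegalEntity) -> None:
        if entity.entity_id in self._entities or entity.entity_id in self._retired:
            raise ValueError(f"identifier {entity.entity_id} has already been issued")
        self._entities[entity.entity_id] = entity

    def retire(self, entity_id: str) -> None:
        """Remove an entity but keep its identifier reserved forever."""
        self._entities.pop(entity_id, None)
        self._retired.add(entity_id)

    def children(self, parent_id: str) -> List[LegalEntity]:
        """Return direct children, i.e. one level of the parent/child linkage."""
        return [e for e in self._entities.values() if e.parent_id == parent_id]

hierarchy = EntityHierarchy()
hierarchy.add(LegalEntity("ENT-0001", "ACME Holdings"))
hierarchy.add(LegalEntity("ENT-0002", "ACME Securities", parent_id="ENT-0001",
                          internal_ids={"crm": "CPTY-88123"}))
print([e.name for e in hierarchy.children("ENT-0001")])
```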

The first half of 2011 will likely see tremendous activity in the regulatory space as the implications of the various mandates become clearer, according to Dalglish. “There has been a confluence of pivotal events for us in that regulators are demanding transparency, vendor capabilities are increasing and advances in software technologies have presented firms with a wide variety of choice. We may not be entirely sure where we are going, but we are on our way,” he concludes.
