This week, US Federal Reserve governor Daniel Tarullo brought the issue of data standardisation to the attention of the US Senate during his testimony before the Subcommittee on Security and International Trade and Finance, following up on his comments last year about the data challenges posed by living wills reform. He is proposing to establish a new centralised system of data collection and monitoring, and to encourage greater data standardisation across the reference data space, especially in the areas of instrument and entity identification.
The regulatory community has become increasingly aware of the data management challenge within financial institutions, as it struggles with its own challenge of better tracking systemic risk across financial markets. As noted by Tarullo this week: “The recent financial crisis revealed important gaps in data collection and systematic analysis of institutions and markets.”
To rectify these inadequacies, the US regulator is seemingly keen to kick off a standardisation process: “The Federal Reserve believes that the goals of agency action and legislative change should be: to ensure that supervisory agencies have access to high quality and timely data that are organised and standardised so as to enhance their regulatory missions; and to make such data available in appropriately usable form to other government agencies and private analysts so that they can conduct their own analyses and raise their own concerns about financial trends and developments.”
Tarullo also wants the regulatory community to begin collecting additional data in order to better supervise systemically important large financial institutions. During his testimony, he discussed the investments the Fed has made thus far to better monitor the markets by evaluating existing data sources and adding new ones. This investment should be extended, he suggested, to the entire data arena by establishing a new standalone, independent data collection and analysis agency to serve the regulatory community.
He is also keen for legislation to mandate the publication of data by the private sector, such as trade data from OTC derivatives trade repositories. The provision of this data would also need to be more timely, said Tarullo: “This kind of approach will require data that are produced more frequently than the often quarterly data gathered in regulatory reports, although not necessarily real-time or intraday, and reported soon after the fact, without the current, often long, reporting lags. These efforts will need to actively seek international cooperation as financial firms increasingly operate globally.”
He is a strong advocate of a new and improved system of data collection and aggregation at the regulatory level, which he believes will also improve risk management practices within firms by requiring “standardised and efficient collection of relevant financial information”. However, Tarullo does seem to appreciate that this level of standardisation and the introduction of new data collection tools will not be free. “Data collection entails costs in collection, organisation, and utilisation for government agencies, reporting market participants, and other interested parties. Tradeoffs may need to be faced where, for example, a particular type of information would be very costly to collect and would have only limited benefits,” he told the Senate committee.
This endeavour must therefore take into account the fact that not all data is suitable for collection in this manner, he acknowledged, and that it does not necessarily need to be provided in real time, although timeliness is important. “What is considered to be ‘timely’ will depend on its purpose, and decisions about how timely the data should be should not ignore the costs of collecting and making the data usable,” he said.
The data collected should also be user-driven and reported to the particular regulatory bodies in charge of the markets concerned, added Tarullo. Standardisation is key to this endeavour, and he directly referred to the need for a standardised unique identifier for institutions and instruments to make “surveillance and reporting substantially more efficient”.
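Tarullo does not name a specific identification scheme, but existing instrument identifiers such as the ISO 6166 ISIN illustrate what a standardised, vendor-neutral code buys a regulator or aggregator: anyone can verify that a code is well formed without consulting a proprietary master file. The following is a minimal sketch, assuming the standard Luhn-style check digit calculation used for ISINs:

```python
def isin_is_valid(isin: str) -> bool:
    """Validate an ISIN's check digit (ISO 6166, Luhn-style check)."""
    if len(isin) != 12 or not isin[:2].isalpha() or not isin[-1].isdigit():
        return False
    # Expand letters to numbers (A=10 ... Z=35); digits stay as they are.
    expanded = "".join(str(int(c, 36)) for c in isin.upper())
    # Luhn check: double every second digit from the right and sum the digits.
    total = 0
    for i, ch in enumerate(reversed(expanded)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
        total += d // 10 + d % 10
    return total % 10 == 0


print(isin_is_valid("US0378331005"))  # True for a well-formed ISIN
```

The same property of being self-describing and independently verifiable is what a unique entity identifier would bring to counterparty and institution data.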
Tarullo also directly referred to the practical barrier of vendor data provision that currently exists in the market, which is timely given the current investigations into a number of data vendors’ pricing practices around proprietary codes. The Fed is a customer of these vendors but is concerned about the “strong limitations” that may be placed on the sharing of such data and on the manner in which it may be used, a concern that is seemingly shared by the private sector (see recent customer lobbying of Bloomberg for proof).
“They also create systems with private identifiers for securities and firms or proprietary formats that do not make it easy to link with other systems. Surely it is important that voluntary contributors of data be able to protect their interests, and that the investments and intellectual property of firms be protected. But the net effect has been a non-compatible web of data that is much less useful, and much more expensive, to both the private and the public sector, than it might otherwise be,” he said. Vendors, be warned.
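The linkage problem Tarullo describes is easy to picture: two feeds keyed on different proprietary security codes cannot be joined until someone buys or builds a cross-reference, whereas feeds that also carry a common standard identifier join directly. A hypothetical sketch (the vendor feeds, codes, field names and records below are invented for illustration):

```python
# Hypothetical records from two vendors, each keyed on its own proprietary code.
vendor_a = [
    {"a_code": "A-10291", "isin": "US000000AA11", "price": 101.25},
    {"a_code": "A-55213", "isin": "US000000BB29", "price": 98.40},
]
vendor_b = [
    {"b_code": "B-77452", "isin": "US000000AA11", "issuer": "Issuer One"},
    {"b_code": "B-90031", "isin": "US000000BB29", "issuer": "Issuer Two"},
]

# The proprietary codes alone cannot be joined: "A-10291" and "B-77452" share
# nothing. A shared standard identifier (here the ISIN) makes the link trivial.
by_isin = {row["isin"]: row for row in vendor_b}
linked = [{**a, **by_isin[a["isin"]]} for a in vendor_a if a["isin"] in by_isin]

for row in linked:
    print(row["isin"], row["price"], row["issuer"])
```

Without that shared key, every consumer ends up maintaining its own cross-reference tables, which is precisely the “non-compatible web of data” Tarullo describes.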