FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

LEI-Dealing with Reality – How to Ensure Data Quality in the Changing Entity Identifier Landscape

“The Global LEI will be a marathon, not a sprint” is a phrase heard more than once during our series of Hot Topic webinars that’s charted the emergence of a standard identifier for entity data. Doubtless, it will be heard again.

But if we’re not exactly sprinting, we are moving pretty swiftly. Every time I think there’s nothing more to say on the topic, there is – well – more to say. With the artifice of the March ‘launch date’ behind us, it’s time to deal with reality. And the reality practitioners are having to deal with is one that’s changing rapidly.

Download the full and detailed report.

LEI-Dealing_with_reality-how_to_ensure_data_quality with Entity Identifiers_06_13.pdf

Source: A-Team, 26.06.2013


Filed under: Data Management, Data Vendor, Library, Reference Data, Standards

LEI Development Embraces Change and Picks up Speed Ahead of G20 Meeting

The Financial Stability Board’s (FSB) third progress note on the legal entity identifier (LEI) initiative, released last week, has met with a positive response from those involved in shaping the system, potential infrastructure providers and market data vendors, despite changes to some proposals and the overturning of perceptions that had built up during debate on how the final system might take shape.

But while progress is positive, fundamental concerns remain around corporate hierarchies: without agreed reference data on legal entity parent and relationship information, the LEI will not fulfil the risk aggregation function at the heart of the global system’s development.

The decisions and to-do lists outlined in the FSB progress note are significant steps forward in developing a global LEI system and come ahead of another major milestone this week when G20 finance ministers and central bank governors meet in Mexico City and will be asked to endorse a draft charter for the system’s Regulatory Oversight Committee (ROC). The charter has been drawn up by the FSB Implementation Group (IG) and is expected to be approved by the G20 meeting, setting in motion the creation of the ROC and the global LEI foundation that will underpin the Central Operating Unit (COU) and secure a governance framework designed to sustain the public good of the system.

One of the late changes identified in the progress note is a shift away from perceptions that entity identifier codes would be 20-character random numbers. Instead, the note describes a part-structured, part-random character string resulting from an ‘urgent’ request made by the FSB IG in September for the FSB LEI Private Sector Preparatory Group (PSPG) to consider how identifiers could best be issued for the purposes of a federated, global LEI system. The PSPG’s views were considered at meetings of the PSPG and IG in Basel earlier this month and a technical specification has been endorsed by the FSB plenary.

The FSB states in the progress note: “The FSB decision is provided now to deliver clarity and certainty to the private sector on the approach to be taken by potential pre-LEI systems that will facilitate the integration of such local precursor solutions into the global LEI system.”

On the basis of the arguments presented and discussed by the PSPG, the FSB has selected a structured number as the best approach for the global LEI system, although the 20-character code, which complies with the existing ISO 17442 standard, will carry no permanent embedded meaning. Instead, the structure is designed to avoid any overlap of randomly generated numbers in a federated issuing system by prefixing each code with an identifier for the local operating unit (LOU) that assigns it.

The breakdown then looks like this:

· Characters 1-4: a four character prefix allocated uniquely to each LOU

· Characters 5-6: two reserved characters set to zero

· Characters 7-18: entity-specific part of the code generated and assigned by LOUs

· Characters 19-20: two check digits as described in ISO 17442.
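The check digits in characters 19-20 follow ISO 17442, which applies the ISO/IEC 7064 MOD 97-10 scheme (the same family used for IBANs). A minimal sketch in Python of how they are computed and verified; the LOU prefix and entity-specific part below are hypothetical, not real issued codes:

```python
def lei_check_digits(base18: str) -> str:
    """Compute the two ISO 17442 check digits for an 18-character LEI base."""
    # Map 0-9 to themselves and A-Z to 10-35, append "00", take mod 97.
    n = int("".join(str(int(c, 36)) for c in (base18 + "00").upper()))
    return f"{98 - n % 97:02d}"

def validate_lei(lei: str) -> bool:
    """A 20-character LEI is valid when its numeric form is 1 mod 97."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    n = int("".join(str(int(c, 36)) for c in lei.upper()))
    return n % 97 == 1

# Hypothetical example: LOU prefix "9999", two reserved zeros,
# a 12-character entity-specific part, then the computed check digits.
base = "9999" + "00" + "A1B2C3D4E5F6"
lei = base + lei_check_digits(base)
print(lei, validate_lei(lei))
```

Because the check digits cover the whole string, including the four-character LOU prefix, a code transferred between LOUs cannot have its prefix silently rewritten without recomputing the check digits.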

If this information has been a long time coming, the time to organise behind it is short: pre-LEI solutions wanting to transition into the global LEI system must adopt the numbering scheme no later than November 30, just a month away. The LEI will be portable within the global LEI system, meaning that a code can be transferred from one LOU to another and that each LOU must have the capacity to take responsibility for LEIs issued by other LOUs.

Following recommendations in the FSB’s June 2012 report that data quality be achieved through self-registration of legal entities, the FSB goes on to decree that pre-LEI services should be based on self-registration (although this can include third-party registration made with the permission of the entity being registered), and that from November 9 all pre-LEI systems must allow self-registration only.

No specific recommendations are made on how the Commodity Futures Trading Commission’s (CFTC) Interim Compliant Identifiers, or CICIs, which are entirely random numbers, will integrate with the LEI system, although the 27,000 or so already issued are expected to be grandfathered into the system without being restated.

Commenting on the LEI number structure, Peter Warms, global head of ID and symbology development at Bloomberg, says: “But for the prefix that identifies where the number was assigned from, the number is still random. This is good for data management practices as the number has no other data dependencies. I would question, however, whether the prefix of an identifier would be changed if it is moved to another LOU as this is not clear.”

Tim Lind, head of legal entity and corporate actions at Thomson Reuters, says: “We must put the debate on intelligent versus dumb numbers behind us and leave it as a milestone. Either solution could work and ongoing argument is not productive. The LEI principles are in place and we need to get on and get the work done.”

Both Warms and Lind applaud the advances made by the FSB and its working groups, but the need for speed remains if deadlines are to be met. And as the complex tasks of developing a legal foundation, ROC and governance framework for the LEI continue, Lind proposes a balance of perfection and pragmatism as the only way forward.

Another outcome of the Basel meetings that deflates earlier perceptions is a clear indication that the COU will not be located in one central place, but will instead be distributed across several locations. This likely stems from the FSB’s hard-fought and firmly held desire to ensure the LEI system is a collective development for the public good, with a governance and operational framework that will encourage all jurisdictions to join.

On the same basis, it has also become apparent that any suggestion that an LEI system could initially be based on a replica of the DTCC and Swift utility set up for the CFTC’s CICIs has been quashed. Instead, LOUs are expected to make their own technology choices to support the LEI – indeed they may already have systems in place – although they will, necessarily, have to conform with standards set by the COU.

If these are some of the recent gains in the LEI development, there is still much to be done ahead of having an ROC, COU and some LOUs in place by March 2013. Again sustaining a level playing field for the public good on a global basis, the FSB has asked the PSPG to build on initial work and consider the next phase of operational work that will focus on how the system can best address key issues in areas such as data quality, supporting local languages and characters, and drawing effectively on local infrastructure to deliver a truly global federated LEI system. The PSPG’s deadline to make proposals on these issues is the end of the year, generating the need for extremely swift action if the LEI system is to be up and running to any extent in March.

The final issue raised in the FSB’s progress note and one which has yet to be openly debated and resolved is ownership and hierarchy data associated with LEIs. The note states: “Addition of information on ownership and corporate hierarchies is essential to support effective risk aggregation, which is a key objective for the global LEI system. The IG is developing proposals for additional reference data on the direct and ultimate parent(s) of legal entities and on relationship (including ownership) data more generally and will prepare initial recommendations by the end of 2012. The IG is working closely with the PSPG to develop the proposals.”

This may be the final item in the FSB’s note, but the issue has to be a top priority. As one observer puts it: “The next big thing is hierarchies. They need to be nailed down and there needs to be transparency. Work is being done on this, but without a good solution there will be no meaning in the LEI.”

Source: Reference Data Review, 29.10.2012

Filed under: Data Management, Reference Data, Standards

News and updates on LEI standard progress and development

As a follow-up to the G20’s acceptance in Los Cabos in June 2012 and the Financial Stability Board’s guidelines and recommendations for the legal entity identifier (LEI), we will regularly update this post with news and articles to provide an overview of LEI standard progress and development.

 
First Published 13.07.2012, Last Updated 27.09.2012

Filed under: Data Management, Data Vendor, Reference Data, Standards

Whitepaper: Bloomberg to embrace emerging LEI

The industry initiative to develop and promote a standard global legal entity identifier (LEI) is expected to significantly reduce the opacity associated with complex financial instruments, widely acknowledged to be a major contributing factor in the 2008 credit crisis.

In this white paper, Bloomberg explains the implications of the emerging LEI for financial institutions, and outlines how it is embracing the new standard to help clients better understand the entities whose instruments they trade and hold (for example, mapping of the LEI to Bloomberg’s numeric BUID).

Download the White Paper Now

Source: A-TEAM 28.06.2012

Filed under: Data Management, Reference Data, Standards

Reference Data: LEI system Real and Ready for Use…or maybe not?

The morning after the G-20 leaders endorsed the Financial Stability Board’s recommendations for a global system of precisely identifying legal entities, the co-chairwoman of the LEI Trade Association Group said, “I think we have something that is real and ready for use.”

Robin Doyle, a senior vice president at JPMorgan Chase, noted that 20,000 ready-to-use “legal entity identifiers” have already been generated by a prototype jointly developed by the Depository Trust and Clearing Corporation and the Society for Worldwide Interbank Financial Telecommunication. A copy of that file can be downloaded here.

The online portal that would allow financial market participants to register and receive 20-character ID codes and to search for the codes of counterparties or other entities was demonstrated Wednesday morning at the 2012 Technology Leaders Forum of the Securities Industry and Financial Markets Association.

That portal can be turned live “within 24 hours” of being needed, said Mark Davies, Vice President, Business Development, at The Depository Trust & Clearing Corporation, during the demonstration.

The LEI Trade Association Group represents a group of firms and financial industry trade associations trying to develop a global and uniform legal entity identifier. The group is supported by the Global Financial Markets Association, which includes SIFMA.

SIFMA and a variety of other trade groups have recommended that DTCC and SWIFT operate a central authority for registering and issuing the codes that the leaders of the G-20 industrial nations Tuesday endorsed.

The G-20 endorsed the 35 recommendations of an international coordinator known as the Financial Stability Board.

The board’s recommendations differed in one significant aspect from the SIFMA and trade association recommendation. Where the trade groups recommended a centralized system for registering and issuing ID codes – a point reinforced Tuesday in opening remarks at SIFMA Tech by SIFMA president T. Timothy Ryan Jr. – the FSB recommended a “federated” registration model. Under that approach, local authorities, aka nations, could and theoretically would act as the agencies for registering, issuing and storing the codes.

The central authority would maintain a database that would be logically managed, but whose contents might be spread around the world, as on servers spread across the Internet.

“We think it can work,” but it has to be set up and maintained properly, Doyle said.

The federated model will only be as good as its adherence to the global standards set by the FSB and the International Organization for Standardization, which defined the 20-character code.

Doyle said a central authority under the FSB approach likely will need to conduct audits of local operating units, to ensure compliance with the overall standards. The challenge will be to make sure the codes are kept correctly and not, in some fashion, duplicated.

The local authorities will need to take on the expense of maintaining high standards. “It is an expensive, difficult process to validate data,” Doyle said.

“A public-facing system like this needs a huge amount of control,” Davies said.

The next shoe to drop on the development of the system will come within the next couple of weeks. That is when, according to Commodity Futures Trading Commission member Scott O’Malia, a decision will be announced on which organization or organizations will handle registration and issuance of ID codes for the swaps markets the CFTC will oversee. O’Malia said at SIFMA Tech Tuesday that the decision among what industry executives say are four competing proposals will come “very soon.”

Srinivas Bangarbale, the CFTC’s Chief Data Officer, said Wednesday that the regulator’s “interim compliant identifier” will support the ISO 17442 standard set out by the FSB and ISO.

Its decision to move ahead “presupposed the standard”, and the chosen implementing group would “adopt the standards as published.” The CFTC will not directly or indirectly create another set of reference data for the industry to keep track of.

“It’s important to use the standard as soon as possible,” he said, however.

O’Malia said the CFTC is likely to begin issuing IDs as early as September. That is so the commission can fulfill its mandate to oversee interest-rate and credit-default swap markets, as mandated by the 2010 Dodd-Frank Wall Street Reform Act.

The FSB’s implementation schedule calls for a functional system to be ready to use by March 2013.

Source: Securities Technology Monitor, 20.06.2012 by Tom Steinert-Threlkeld

Filed under: Data Management, Reference Data, Standards

LEIs – Increasing Usability & Benefits of the New Standardised Identifier – IDC

The development of the standardised legal entity identifier (LEI) is very much underway, but how can firms and market participants utilise this new identifier to improve internal data flows and risk monitoring processes while also meeting regulatory reporting requirements?

Listen to the Podcast here

Moderator/Speakers:
Julia Schieffer – Founder, DerivSource.com
Chris Johnson – Head of Product Management, Market Data Services, HSBC Securities Services
Darren Marsh – European Business Manager, Risk Management and Compliance, Interactive Data

Filed under: Data Vendor, Events, Market Data, Reference Data

LEI (Legal Entity Identifier) set to arrive in waves

A new system giving financial institutions standardized Legal Entity Identifiers (LEIs) will start to be phased in next year after an international organization finalizes new standards in January 2012.

LEI requirements for a Global Legal Entity Identifier (LEI) Solution, May 2011
LEI industry progress and recommendation, July 2011

The Geneva-based International Organization for Standardization (ISO) is expected to approve a plan for LEIs at the beginning of next year, calling for them to consist of 20 alphanumeric characters. After that happens, the infrastructure is already in place to start issuing the IDs early in 2012, according to officials with the Securities Industry and Financial Markets Association.

“Assuming the standard is approved by early January, our expectations are that legal entities will be able to register in short order for an LEI,” said Tom Price, managing director and head of SIFMA’s technology, operations and business continuity planning group.

During the financial crisis, both regulators and institutions realized they did not have the information available to quickly address issues of counterparty risk. LEIs aim to change that by using a universal code that would allow counterparties to be easily identified.

The United States has provided much of the leadership behind the push for LEIs, but the concept enjoys broad support around the globe. The registering authority for LEIs will not come from any government, but rather from the Society for Worldwide Interbank Financial Telecommunication (SWIFT).

After the ISO finalizes the standard, the next step will be rule writing, which is already underway at the Commodity Futures Trading Commission with respect to swaps. Price said LEIs will be used first for swaps participants and then gradually adopted for transactions involving other types of assets until they are required for all trades.

David Strongin, who is also a managing director at SIFMA, said the U.S. will be the first country to require LEIs, but Hong Kong and Canada will likely follow fairly quickly. The European Union has committed to adopting LEIs as well, though it is unclear whether Europe will adopt the system all at once or phase it in country by country.

Strongin stressed, however, that there is a global consensus to move forward, even if not every nation and region mandates LEIs at the same time.

“The G20, both the finance ministers and leaders, have all endorsed this,” Strongin said. “From a very high level, you don’t see disagreement that an LEI is needed. I think everyone agrees that it’s an important tool to build the foundation for risk management.”

Strongin said that while many traders might not see it right now, most firms are currently working hard to prepare for LEIs. Eventually, however, the changes will touch every facet of the industry. “There’s a lot of work going on, though there’s only so much you can do until you see the final rules,” Price added.

Source: Traders Magazine, 18.11.2011

Filed under: Data Management, Reference Data, Risk Management, Standards

Expanding Global Identifiers in Complex Assets and Other Areas

In the post-credit crisis financial services industry, risk management, compliance and transparency have emerged as focus points for review with provision of accurate and timely data recognised as a critical element of success. Fundamental to data provision is the accurate identification of both financial instruments and counterparties – without which you cannot truly measure your performance or exposure.

Rapid growth in derivatives and securitised debt instruments played a central role in the credit crisis. In the aftermath of the crisis, the use of alternative asset classes has continued to grow. In order to ensure that an individual firm’s exposure through such complex instruments can be accurately measured, and therefore, managed, that firm must be able to correctly identify the securities and the entities that they are investing in.

This can only be done through the use of unique identifiers. But it is well known that no single identifier is capable of uniquely identifying securities or entities globally. While some countries have identifier schemas, and certain asset classes such as equities are well covered, many regions, asset classes and markets lack a robust mechanism for identifying securities and entities. Despite significant effort, the industry has not been able to advance a standardised approach to this problem.

Is there another way? Can commercial initiatives and innovation through partnerships succeed where standards bodies have so far failed?

We examine the industry requirements and complexities inherent in the application of unique identifiers in three key areas: Business Entity Identifiers, US Listed Options and Syndicated Loans and review the collaborative approach taken by Standard & Poor’s CUSIP Global Services to develop innovative and comprehensive industry solutions.

Click below to download the free 12-page white paper from CUSIP Global Services now.

Source: A-TEAM, 08.09.2010

Filed under: Data Management, Reference Data, Standards

Thomson Reuters Faces EU Probe of RIC Data Code Issues

Nov. 10 (Bloomberg) — Thomson Reuters Corp., the news and data provider created in a merger last year, faces a European Union antitrust probe into possible restrictions on competitors’ use of identification codes for real-time market data feeds.

Bloomberg provided free access to its codes just a few days ago.

The probe will focus on whether Thomson Reuters prevents clients from translating Reuters Instrument Codes (RICs) to alternative identification codes of rival data-feed suppliers, a process known as “mapping,” the European Commission, the EU’s antitrust regulator, said in a statement today from Brussels.

“Without the possibility of such mapping, customers may potentially be ‘locked’-in to working with Thomson Reuters because replacing Reuters instrument codes by reconfiguring or by rewriting their software applications can be a long and costly procedure,” the commission said.
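The “mapping” at issue is essentially re-keying feed records from RICs to a competitor’s identifier scheme. A minimal sketch of what such a translation layer does; every symbol and code here is invented for illustration, not taken from any vendor’s actual symbology:

```python
def remap_record(record: dict, ric_to_alt: dict) -> dict:
    """Re-key a market data record from a RIC to an alternative identifier."""
    out = dict(record)
    ric = out.pop("ric")
    out["alt_id"] = ric_to_alt[ric]  # raises KeyError if no mapping is known
    return out

# Invented mapping table; building and maintaining this table for thousands
# of instruments is the long and costly procedure the Commission describes.
ric_to_alt = {"ABC.X": "XS0000000001", "DEF.Y": "XS0000000002"}

feed_record = {"ric": "ABC.X", "last_price": 101.25}
print(remap_record(feed_record, ric_to_alt))
```

Without a permitted mapping table, every application keyed on RICs would instead have to be reconfigured or rewritten, which is the lock-in concern quoted above.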

The probe is the EU’s second into financial information providers this year after the regulator said in January that it would review how Standard & Poor’s charges customers for the use of certain codes in databases. Thomson Reuters said last week that third-quarter profit dropped 59 percent on declining revenue at its sales and trading business and legal division.

Thomson Reuters said in a statement that it received an EU questionnaire Nov. 3 and is cooperating with the probe.

“Thomson Reuters data is reliably and consistently identified by a managed code, which we create and maintain to enable navigation of the company’s global content,” the New York-based company said in the e-mailed statement. “Our customers are at the heart of our business and we continue to work with them to explore how best to add value to our data services.”

The commission said it started the probe on its own initiative. Under EU rules, companies can be fined as much as 10 percent of annual sales for antitrust violations. Companies can appeal antitrust decisions at EU courts.

Bloomberg LP, the parent of Bloomberg News, competes with Thomson Reuters in selling financial and legal information and trading systems.

Source: Bloomberg, 10.11.2009, by Matthew Newman in Brussels

Filed under: Data Management, Data Vendor, Market Data, News, Reference Data, Risk Management, Standards

Avox launches Global Business Entity Directory: Wiki-Data – Initiative supports Open Global Standards for Business Entity Data

Avox, a subsidiary of Deutsche Börse, is opening up access to a subset of its content: over two hundred thousand verified and maintained business entity data records. On www.wiki-data.com, users will find basic information about those business entities, including their legal name, country of incorporation and operation, state/province/region and city of operation (where applicable), and an Avox identifier (AVID).

Avox is publishing this information for free usage in an effort to help facilitate a common standard for business entity data. Any data record with an AVID attached has been comprehensively verified and is maintained by Avox’s team of data experts. It is also continuously checked by clients and partners through the built-in feedback loop established between these user organisations and Avox. With this launch, the entire world can participate in and benefit from the increased level of maintenance, enhancement and consistency of the content.

“The industry has been demanding a low cost business entity data standard for years,” says Ken Price, CEO and co-founder of Avox. “We believe this is a big step toward achieving such a standard; however, we fully intend to collaborate with other key industry players who will add content and significant value to the offering. A successful standard must be a shared standard in our view.” Price points to partnerships with S&P, Markit, IDC and SWIFT as examples of this collaboration.

Julia Sutton, Global Head of Customer Accounts for the Institutional Client Group at Citigroup, commented: “Wiki-data is yet another example of the collaborative model at work. Anyone that uses data from wiki-data.com will be consistent with a number of major industry players, including Citi, Barclays and Nomura. We see this as a major efficiency play as usage expands.”

Richard Snookes, Director, Global Reference Data at Barclays Capital says “By taking the decision to allow the use of their AVID identifier as unlicensed content, Avox have removed one of the key obstacles to creating a genuine community for counterparty identification. The user experiencing a loss related to a bad piece of data quickly effects a transition to being the user with the most accurate data. The wiki-data model magnifies these transitions across multiple data fields and users to present an extremely powerful tool with real scalability.”

Users can purchase additional data attributes or regular updates of the complete file as an annual subscription. Third parties can license the content as the basis for their own directory and identification services. Internet access to the data for primary usage is free of restrictions and cost. The key objective is for all major industry participants to use the same underlying data.

Price notes that the data will be of extremely high quality, “That’s the nature of the beast. One of the great benefits of the Internet is that we can now leverage a global community of data checkers who can point out remaining potential errors, changes or omissions which our central team of data experts can address straight away. There is no better way of maximizing data quality.” Additional capabilities such as online record linking, user commentary and the facility to add new records which can be verified by Avox or other firms are planned for the future.

A blog has been established at http://avoxinfo.blogspot.com/ where anyone can openly comment, criticize and/or make suggestions for improvements to the platform and the content. There is also a LinkedIn group set up called the “Avox Business Entity Discussion Forum” for those who use that platform. Wiki-data and the blog can also be accessed directly from www.avox.info.

Source:MondoVisione, 22.06.2009

Filed under: Data Management, Data Vendor, Exchanges, Library, News, Reference Data, Risk Management, Standards

Operational Risk and Reference Data: Exploring Costs, Capital Requirements and Risk Mitigation

Financial transactions can be thought of as a set of computer-encoded data elements that collectively represent 1) standard reference data, identifying a transaction as a specific product bought by a specific counterparty, and 2) variable transaction data such as trade date, quantity and price. The reference data components of a financial transaction identify it as a specific financial product (security number, symbol, market, etc.), its unique type, terms and conditions (asset class, maturity date, conversion rate, etc.), its manufacturer or supply chain participant (counterparty, dealer, institution, exchange, etc.), its delivery point (delivery, settlement instructions and location), its delivery or inventory price (closing or settlement price) and its currency. Analogous to specifications for manufactured products, reference data also defines the product’s changing specifications (periodic or event-driven corporate actions), occasional changes to sub-components (calendar data, credit ratings, historical prices, betas, correlations, volatilities) and seasonal incentives or promotions (dividends, capital distributions and interest payments).
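The split the paper describes, static reference data versus per-trade variable data, can be pictured as a simple composite record. Field names below are illustrative, not taken from the study:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ReferenceData:
    """Static attributes shared by every trade in the same product."""
    security_id: str        # e.g. an ISIN or CUSIP
    asset_class: str
    maturity_date: str
    counterparty_id: str
    settlement_location: str
    currency: str

@dataclass
class TradeData:
    """Variable attributes unique to one transaction."""
    trade_date: str
    quantity: int
    price: float

@dataclass
class Transaction:
    reference: ReferenceData  # sourced once from a reference data store
    trade: TradeData          # captured per execution
```

The design point the paper’s cost argument rests on is visible here: the `ReferenceData` portion is duplicated across every firm that sources and maintains it independently, while only the `TradeData` portion is genuinely unique to each transaction.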

Download: Operational Risk and Reference Data, Nov 2005 Study

This paper documents the impact of increasingly costly and duplicative expenditures for the sourcing, maintenance and processing of reference data; the effect of recent regulatory mandates on reference data; and the role faulty reference data plays in operational risk and operational capital. While the paper quantifies this impact, reference data remains an area needing further research owing to the lack of granularity in the cost and loss data compiled to date. As such, this paper is the first to draw together the literatures of operational risk and reference data and present useful insights.

Source: November 2005, by Allan D. Grody, Fotios C. Harmantzis, PhD, and Gregory J. Kaple

Filed under: Corporate Action, Data Management, Library, Reference Data, Risk Management, Services, Standards

Risk & Compliance Report March 2009: Reference Data Review

Download: RDR  Risk_&_Compliance Report March 2009  A-TEAM

The current financial climate has meant that risk management and compliance requirements are never far from the minds of the boards of financial institutions. In order to meet the slew of regulations on the horizon, firms are being compelled to invest in their systems in order to cope with the new requirements.

Data management is an integral part of this endeavour, as it represents the building blocks of any form of risk management or regulatory reporting system. Only by first understanding the instruments being traded and the counterparties involved in these trades can an institution hope to be able to get a handle on its risk exposure. The fall of Lehman Brothers and the ensuing chaos was proof enough of the data quality issues still facing the industry in this respect.

Regulators are demanding more and more data transparency from all players in the financial markets and this has meant that the ability to access multiple data silos has become essential. A siloed mentality towards data will no longer be acceptable, as these regulators seek a holistic view of positions and the relationships between counterparties.

All of this represents a significant challenge to the data management community, given that there are standards lacking in many areas, for example business entity identification. But with great challenges come tremendous opportunities to solve data management issues that have been in the background for far too long.

Source: A-TEAM Group 13.03.2009

Filed under: Corporate Action, Data Management, Library, Market Data, News, Reference Data, Risk Management, Services, Standards