FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

LEI Development Embraces Change and Picks up Speed Ahead of G20 Meeting

The Financial Stability Board’s (FSB) third progress note on the legal entity identifier (LEI) initiative, released last week, has met with a positive response from those involved in shaping the system, potential infrastructure providers and market data vendors, despite changes to some proposals and the overturning of assumptions that had built up during debate about how the final system would take shape.

But while progress is positive, fundamental concerns remain around corporate hierarchies: without agreed reference data on legal entity parent and relationship information, the LEI will not fulfil the effective risk aggregation function at the heart of the global system’s development.

The decisions and to-do lists outlined in the FSB progress note are significant steps forward in developing a global LEI system. They come ahead of another major milestone this week, when G20 finance ministers and central bank governors meet in Mexico City and will be asked to endorse a draft charter for the system’s Regulatory Oversight Committee (ROC). The charter, drawn up by the FSB Implementation Group (IG), is expected to be approved at the G20 meeting, setting in motion the creation of the ROC and the global LEI foundation that will underpin the Central Operating Unit (COU), and securing a governance framework designed to sustain the public good of the system.

One of the late changes identified in the progress note is a shift away from the perception that entity identifier codes would be 20-character random numbers. Instead, the note describes a part-structured, part-random character string, the result of an ‘urgent’ request made by the FSB IG in September for the FSB LEI Private Sector Preparatory Group (PSPG) to consider how identifiers could best be issued for the purposes of a federated, global LEI system. The PSPG’s views were considered at meetings of the PSPG and IG in Basel earlier this month, and a technical specification has been endorsed by the FSB plenary.

The FSB states in the progress note: “The FSB decision is provided now to deliver clarity and certainty to the private sector on the approach to be taken by potential pre-LEI systems that will facilitate the integration of such local precursor solutions into the global LEI system.”

On the basis of the arguments presented and discussed by the PSPG, the FSB has selected a structured number as the best approach for the global LEI system, although the 20-character code, which complies with the existing ISO 17442 standard, will have no permanent embedded meaning. Instead, the structure is designed to avoid any overlap of random numbers in a federated issuing system by prefixing the numbers with a code for each local operating unit (LOU) assigning LEIs.

The breakdown then looks like this:

· Characters 1-4: a four-character prefix allocated uniquely to each LOU

· Characters 5-6: two reserved characters set to zero

· Characters 7-18: entity-specific part of the code generated and assigned by LOUs

· Characters 19-20: two check digits as described in ISO 17442.
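The check digits in the layout above follow the ISO 7064 MOD 97-10 scheme specified by ISO 17442, the same arithmetic used for IBAN check digits. The Python sketch below is illustrative only: the LOU prefix and entity part used are made-up values, not allocated codes.

```python
def _to_number(s: str) -> int:
    # ISO 7064 MOD 97-10 mapping: digits stay as-is, letters become 10..35
    # (A=10, ..., Z=35); the expanded digit strings are concatenated into one integer.
    return int("".join(str(int(c, 36)) for c in s))

def lei_check_digits(base18: str) -> str:
    # Append "00", take the value mod 97, subtract from 98, zero-pad to two characters.
    return f"{98 - _to_number(base18 + '00') % 97:02d}"

def make_lei(lou_prefix: str, entity_part: str) -> str:
    # Chars 1-4: LOU prefix; 5-6: reserved zeros; 7-18: entity-specific; 19-20: check digits.
    base = f"{lou_prefix}00{entity_part}"
    if len(base) != 18:
        raise ValueError("LOU prefix must be 4 characters and entity part 12 characters")
    return base + lei_check_digits(base)

def is_valid_lei(lei: str) -> bool:
    # A well-formed LEI's full 20-character value is congruent to 1 modulo 97.
    return len(lei) == 20 and lei.isalnum() and _to_number(lei) % 97 == 1

# Hypothetical prefix and entity values, for illustration only:
lei = make_lei("1234", "ABCDEF123456")
```

A useful property of this scheme is that any party can screen a 20-character code with the mod-97 test without knowing which LOU issued it, which matters in a federated system where codes are portable between LOUs.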

If this information has been a long time coming, the time to organise behind it is short: pre-LEI solutions wanting to transition into the global LEI system are required to adopt the numbering scheme no later than November 30, just a month away. The LEI will be portable within the global LEI system, meaning that an LEI code can be transferred from one LOU to another and that each LOU must have the capacity to take responsibility for LEIs issued by other LOUs.

Following the recommendations in the FSB’s June 2012 report that data quality be achieved through self-registration of legal entities, the FSB goes on to decree that pre-LEI services should be based on self-registration – although this can include third-party registration made with the permission of the entity to be registered – and that from November 9 all pre-LEI systems must allow self-registration only.

No specific recommendations are made on how the Commodity Futures Trading Commission’s Interim Compliant Identifiers, or CICIs, which are entirely random numbers, will integrate with the LEI system, although the 27,000 or so already issued are expected to be grandfathered into the system without being restated.

Commenting on the LEI number structure, Peter Warms, global head of ID and symbology development at Bloomberg, says: “But for the prefix that identifies where the number was assigned from, the number is still random. This is good for data management practices as the number has no other data dependencies. I would question, however, whether the prefix of an identifier would be changed if it is moved to another LOU as this is not clear.”

Tim Lind, head of legal entity and corporate actions at Thomson Reuters, says: “We must put the debate on intelligent versus dumb numbers behind us and leave it as a milestone. Either solution could work and ongoing argument is not productive. The LEI principles are in place and we need to get on and get the work done.”

Both Warms and Lind applaud the advances made by the FSB and its working groups, but the need for speed remains if deadlines are to be met. And as the complex tasks of developing a legal foundation, ROC and governance framework for the LEI continue, Lind proposes a balance of perfection and pragmatism as the only way forward.

Another outcome of the Basel meetings that deflates earlier perceptions is a clear indication that the COU will not be located in one central place, but will instead be distributed across several locations. This is likely to stem from the FSB’s hard-fought and firmly held desire to ensure the LEI system is a collective development for the public good, including a governance and operational framework that will encourage all jurisdictions to join in.

On the same basis, it has also become apparent that any suggestion that the LEI system could initially be based on a replica of the DTCC and Swift utility set up for the CFTC’s CICIs has been quashed. Instead, LOUs are expected to make their own technology choices to support the LEI – indeed, they may already have systems in place – although they will necessarily have to conform to standards set by the COU.

If these are some of the recent gains in the LEI development, there is still much to be done before an ROC, COU and some LOUs are in place by March 2013. Again sustaining a level playing field for the public good on a global basis, the FSB has asked the PSPG to build on its initial work and consider the next phase of operational work, focusing on how the system can best address key issues such as data quality, support for local languages and character sets, and drawing effectively on local infrastructure to deliver a truly global federated LEI system. The PSPG’s deadline for proposals on these issues is the end of the year, demanding extremely swift action if the LEI system is to be up and running to any extent in March.

The final issue raised in the FSB’s progress note, and one which has yet to be openly debated and resolved, is ownership and hierarchy data associated with LEIs. The note states: “Addition of information on ownership and corporate hierarchies is essential to support effective risk aggregation, which is a key objective for the global LEI system. The IG is developing proposals for additional reference data on the direct and ultimate parent(s) of legal entities and on relationship (including ownership) data more generally and will prepare initial recommendations by the end of 2012. The IG is working closely with the PSPG to develop the proposals.”

Hierarchy data may be the final item in the FSB’s note, but the issue has to be a top priority. As one observer puts it: “The next big thing is hierarchies. They need to be nailed down and there needs to be transparency. Work is being done on this, but without a good solution there will be no meaning in the LEI.”

Source: Reference Data Review, 29.10.2012

Filed under: Data Management, Reference Data, Standards

News and updates on LEI standard progress and development

As a follow-up to the G20’s endorsement at Los Cabos in June 2012 and the Financial Stability Board’s guidelines and recommendations on the legal entity identifier (LEI), we will regularly update this post with news and articles to provide an overview of LEI standard progress and development.

First published: 13.07.2012, Last update: 27.09.2012

Filed under: Data Management, Data Vendor, Reference Data, Standards

Symbology: EDI’s Corporate Actions Service Adopts Bloomberg Open Symbology

Free-use Data Tagging System Reduces Costs and Risks in Trading

Exchange Data International (EDI), a premier back-office financial data provider, today announced it has adopted Bloomberg’s Global Securities Identifiers (BBGID) to name and track all equity securities in its Worldwide Corporate Actions service.

EDI is the latest financial data provider to adopt Bloomberg’s Open Symbology (BSYM), an open, free-use system for naming global securities across all asset classes with a BBGID, a 12-character alphanumeric identifier for financial instruments. EDI has implemented BBGID numbers in its equities reference, pricing and corporate actions data feeds. Its Worldwide Corporate Actions service provides detailed information on 50 corporate action event types affecting equities listed on 160 exchanges.
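As a rough illustration of what such an identifier looks like, the sketch below screens a candidate string against the published BBGID shape: a ‘BBG’ prefix, eight characters drawn from digits and upper-case consonants (vowels are excluded), and a trailing numeric check digit. This layout is an assumption stated for illustration; the sketch is a format check only and does not implement Bloomberg’s check-digit computation.

```python
import re

# Structural screen for a BBGID: 'BBG' prefix, eight characters from digits
# and upper-case consonants, one numeric check digit (assumed layout; the
# real check-digit algorithm is not reproduced here).
_BBGID_RE = re.compile(r"BBG[0-9BCDFGHJKLMNPQRSTVWXYZ]{8}[0-9]")

def looks_like_bbgid(identifier: str) -> bool:
    """Cheap shape test for a 12-character BBGID; does not verify the check digit."""
    return bool(_BBGID_RE.fullmatch(identifier))
```

A screen like this is the kind of cheap first-pass validation a data feed consumer might run before accepting identifiers into a security master.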

“EDI decided to integrate Bloomberg’s Open Symbology, as it is easily accessible and has no license fee or restrictions on usage,” said Jonathan Bloch, the Chief Executive Officer of EDI. “Bloomberg’s Symbology also advances straight-through processing of equity orders, which aids reporting and compliance management.”

Peter Warms, Global Head of Bloomberg Open Symbology, said, “Existing identifiers that change due to underlying corporate actions introduce inefficiencies, increase costs and add complexity to the data management process. Bloomberg and EDI recognise the importance of comprehensive, open and unchanging identifiers, like the BBGID, in enabling customers to track unique securities consistently and to process corporate action data seamlessly. As BSYM grows in adoption, interoperability across market systems and software using BSYM will improve steadily and reduce operational costs.”

Source: Bobsguide, 24.09.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

LEIs – Increasing Usability & Benefits of the New Standardised Identifier – IDC

The development of the standardised legal entity identifier (LEI) is well underway, but how can firms and market participants use this new identifier to improve internal data flows and risk monitoring processes while also meeting regulatory reporting requirements?

Listen to the Podcast here

Moderator/Speakers:
Julia Schieffer – Founder
Chris Johnson – Head of Product Management, Market Data Services, HSBC Securities Services
Darren Marsh – European Business Manager, Risk Management and Compliance, Interactive Data

Filed under: Data Vendor, Events, Market Data, Reference Data

UK asset managers lack confidence in reference data quality – survey

Over a third of UK-based asset managers and banks are not confident in the quality of reference data they use to support trading activity, according to a survey from IT services firm Patni.

The survey of 100 company representatives found that 91% of asset managers do not have a single supplier of reference data, with the remainder admitting that they were not sure of their source at all. Respondents say that an average of six per cent of trades fail as a result of poor reference data.

Yet just half of those questioned say they have not considered outsourcing the management of their reference data to a third party, due to fears of a potential loss of control and security breaches. Meanwhile, the overwhelming reason cited for considering outsourcing is the potential for cost savings, followed by higher levels of accuracy.

Philip Filleul, product manager, reference data, Patni, says: “Many buy-side and sell-side firms are now uncomfortably aware of both the time and costs they devote to purchasing, cleansing and distributing reference data, as well as the risks that arise when these tasks are not performed effectively, among them failed trades and lost revenue opportunities.”

“The twin pressures of achieving regulatory compliance and straight-through processing have highlighted substantial redundancy and duplication of effort in the area of reference data management.

“One in ten trades fail on first settlement attempt – and of these, 60-70 per cent can be attributed to poor data management.”

Research from TowerGroup, which was cited by the report, showed that nearly two-thirds of trades that fail do so because of inaccurate data.

Source: Finextra, Bobsguide, 29.10.2010

Filed under: Corporate Action, Data Management, Market Data, Reference Data, Risk Management, Standards

Finextra and the FISD partner for Data webcasts in 2010

The Financial Information Services Division (FISD) and Finextra have forged a partnership to deliver a series of video webcasts for market data and risk management professionals worldwide with a focus on real-time data management and delivery, reference data, and standards.

The events will be hosted in-studio and broadcast via real-time or recorded streaming video to an invited audience of financial professionals. Participants will be able to interact with panelists during the live webcasts via real-time Q&As.

The FISD/Finextra partnership builds on Finextra’s established Finextra Live brand of webcast events. Past webcasts have been broadcast from London, Hong Kong and New York. Participants have included senior representatives from HSBC, Nomura, Citi, Morgan Stanley, Societe Generale, Credit Suisse, Bank of America, Barclays and Royal Bank of Scotland. Over 2,500 people working within the financial industry have registered to attend Finextra video webcasts since March 2009.

Tom Davin, managing director of FISD, says: “A survey of FISD members revealed that more of them would like to interact with FISD programs and thought leadership via more sophisticated technologies. Our partnership with Finextra aims to provide timely and relevant topics and discussion to our member base and beyond via interactive video content.”

Nick Hastings, managing director of Finextra, says: “This new service will provide the industry with a forum to discuss and learn about key issues affecting global data management via emerging and innovative communication mediums.”

Source: Finextra, 06.01.2010

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, News, Reference Data, Standards

Industry Briefing & Survey: Harnessing Data for Better Valuations – November 2009 A-TEAM

A new industry briefing and survey report from A-Team Group and GoldenSource

A-Team Group, a publishing and research company specialising in financial information technology, was commissioned by enterprise data management specialist GoldenSource to conduct research into the challenges of managing pricing and valuations data.

Throughout the course of October 2009, A-Team Group researchers interviewed senior-level specialists closely aligned to market data or valuations. Several spanned multiple responsibilities including oversight of client data, product information, and trading risk.

The interview sample was spread across asset managers (52%), Tier-1 and Tier-2 banks (32%), broker/dealers (11%) and custodians (5%).

Geographically, participants were dispersed across the United Kingdom (47%), Europe (21%) and the United States (32%). Over half of the respondents had global responsibility within their organizations.

Source: A-TEAM, 19.11.2009

Filed under: Data Management, Data Vendor, Library, Market Data, News, Reference Data, Risk Management

Corporate Actions Report Sept 2009 – Reference Data Review

Download: Reference Data Review Special Report Corporate Actions 2009 Edition

Rather than diverting attention away from corporate actions automation projects, the financial crisis appears to have accentuated the vital nature of this data. As a result of the increased focus on risk management across the market as a whole, financial institutions are more aware than ever before of the impact that inaccurate corporate actions data has on their bottom lines.

This renewed focus on the basics of data management has, in turn, spurred on vendors in the space to significantly up their game. The focus of this innovation has been on bringing prices down, making the implementation of these solutions easier and designing more intuitive user interfaces. This has manifested itself in a range of enhancements, not least of which are the deployment of web-based front ends and software as a service (SaaS) models.

Financial institutions and (surprisingly) issuers have also been doing their bit to improve the often complex muddle of corporate actions data via various working groups and standards initiatives. Earlier in 2009, the European issuer community agreed that a framework for shareholder communication and cross-border voting is needed in the market. This was followed by the release of the results of the Corporate Actions Joint Working Group’s standardisation initiative, which aims to define each category of corporate action in the market.

Corporate actions are most certainly back in the spotlight.

Filed under: Corporate Action, Data Management, Data Vendor, Library, News, Reference Data, Risk Management, Standards

Framework Approach to Governance, Risk Management, & Compliance

The landscape of governance, risk management, and compliance (GRC) initiatives is broad and littered with a variety of specific standards and frameworks. Each of these frameworks may be good at what it focuses on – but they fail to link GRC together and put everything in context with each other. Risk management, security, corporate governance, control, compliance, audit, quality, EH&S, sustainability – all have their respective islands of standards. This makes putting in place a GRC strategy that bridges these silos difficult, as the language, implementations, and approaches are quite different. In fact, organizations trying to get an enterprise view of risk and compliance desperately search for a GRC “Rosetta Stone.”

There is only one framework I see that brings this universe of GRC into a common language, process, and architecture: the OCEG Red Book (v2) and its GRC Capability Model™. Although various standards and guidance frameworks exist to address discrete portions of governance, risk management and compliance issues, the OCEG GRC Capability Model™ is the only one that provides comprehensive and detailed practices for an integrated and collaborative approach to GRC. These practices address the many elements that make up a complete GRC business architecture. Applying the elements of the GRC Capability Model™ and the practices within them enables an organization to:

Achieve business objectives
Enhance organizational culture
Increase stakeholder confidence
Prepare and protect the organization
Prevent, detect and reduce adversity
Motivate and inspire desired conduct
Improve responsiveness and efficiency
Optimize economic and social value

The GRC Capability Model™ describes key elements of an effective GRC architecture that integrate the principles of good corporate governance, risk management, compliance, ethics and internal control. It provides a comprehensive guide for anyone implementing and managing a GRC system or some aspect of that system. The OCEG GRC Capability Model™ is broken into eight components:

CULTURE & CONTEXT. Understand the current culture and the internal and external business contexts in which the organization operates, so that the GRC system can address current realities – and identify opportunities to affect the context to be more congruent with desired organizational outcomes.
ORGANIZE & OVERSEE. Organize and oversee the GRC system so that it is integrated with, and when appropriate modifies, the existing operating model of the business, and assign to management specific responsibility, decision-making authority, and accountability to achieve system goals.
ASSESS & ALIGN. Assess risks and optimize the organizational risk profile with a portfolio of initiatives, tactics, and activities.
PREVENT & PROMOTE. Promote and motivate desirable conduct, and prevent undesirable events and activities, using a mix of controls and incentives.
DETECT & DISCERN. Detect actual and potential undesirable conduct, events, GRC system weaknesses, and stakeholder concerns using a broad network of information gathering and analysis techniques.
RESPOND & RESOLVE. Respond to and recover from noncompliance and unethical conduct events, or GRC system failures, so that the organization resolves each immediate issue and prevents or resolves similar issues more effectively and efficiently in the future.
MONITOR & MEASURE. Monitor, measure and modify the GRC system on a periodic and ongoing basis to ensure it contributes to business objectives while being effective, efficient and responsive to the changing environment.
INFORM & INTEGRATE. Capture, document and manage GRC information so that it efficiently and accurately flows up, down and across the extended enterprise, and to external stakeholders.

OCEG’s GRC Capability Model™ is, in my opinion, the best umbrella framework to bring a holistic enterprise view of GRC together that works from the board of directors down into the management and process of an organization. Its goal is not to replace other frameworks and standards but to give them a common language and context to operate within and thus provide enterprise collaboration and communication across governance, risk, and compliance.

Source: Michael Rasmussen, 22.07.2009


Filed under: Library, News, Risk Management, Standards