FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

Chile: Comder to launch Central Counterparty next year with Calypso clearing solution

A consortium of Chilean banks is forming a new central counterparty (CCP) for over-the-counter (OTC) derivatives next year. The Comder CCP has selected Calypso to provide the core clearing platform for the launch, which will enable compliance with the post-crisis rules laid out at the Pittsburgh G20 summit mandating greater transparency and, effectively, ‘on-exchange’ clearing.

The new Comder CCP will begin clearing non-deliverable forwards (NDFs) in Q4 2014 and interest rate derivatives (IRD) in Q1 2015. The CCP will be powered by Calypso, with the platform providing legal novation, affirmation, registration, limits, initial and variation margin calculation, collateral management, default management, and centralised trade repository storage and reporting.

According to Felipe Ledermann, the chief executive of Comder, Calypso was chosen for its experience in OTC derivatives central clearing. Comder will receive on-going maintenance and support from the vendor after the platform is rolled out next year.

“We see Calypso as a strategic partner for one of the most important projects in the Chilean banking industry,” continued Ledermann. “This initiative allows us to build a best-in-class CCP with the highest standards and align with BIS-IOSCO principles for market infrastructures.”

Calypso already provides OTC derivatives clearing and processing infrastructure and technology to leading clearing houses, such as the Chicago Mercantile Exchange (CME), Eurex, BM&FBovespa, the Tokyo Stock Exchange (TSE), Singapore Exchange (SGX) and Hong Kong Exchanges and Clearing (HKEX). The Calypso clearing solution provides full cross-asset coverage, manages each step in the clearing process and delivers visibility into risk for cash and OTC derivatives products, claims the vendor. The single platform should also be scalable if Comder attracts significant volumes.

Commenting on the deal, Kishore Bopardikar, president and chief executive of Calypso Technology, said he was excited to provide a solution that will enable the Chilean market to move towards a centrally cleared derivatives environment, adding that “we are pleased to be supporting the development of such an important platform for the country”.

Source: Bobsguide, 23.07.2013

Filed under: Chile, Latin America, Standards

Thomson Reuters Outlines Plans to Lighten the Burden of Symbology

Thomson Reuters has set out its stall on symbology, saying it does not support the promotion of new identifiers as a means of improving data management, but is keen to support industry standards and plans to offer services such as symbology cross-referencing to ease the burden on data managers.

The company documents the development of symbology, its use and complexity in a white paper authored by Jason du Preez, head of symbology services at Thomson Reuters, and entitled ‘Solving for Symbology Discord, the Identity Challenge’.

Thomson Reuters set up a symbology business last year and published the white paper to acknowledge the importance of symbology and recognise its challenges. Du Preez says: “We don’t believe there is a silver bullet that will answer the problems of symbology. Innovative new products continue to exacerbate the problem and that is not going to change. We can, using our core competencies, create linkages, invest to take on the burden of linking data sets, and maintain code mapping. And we can allow the market to make more use of our intellectual property.”

Du Preez cites licences introduced last summer to extend the use of the company’s proprietary Reuters Instrument Codes (RICs) in non real-time content, as well as its agreement in response to a European Commission antitrust investigation to extend the use of RICs in real-time consolidated data feeds, as moves to open up how RICs are licensed and make them more accessible across all asset classes.

Integration of RICs with Proprietary Identifiers

He says: “As there is no silver bullet, we will invest more in cross-referencing services and tie in quality of information. We will have interesting things to offer over the next 18 months.” Among these he lists the integration of RICs and proprietary identifiers, with firms submitting their codes to Thomson Reuters and the company playing them back as part of its own codes. Other broad cross-referencing services will be tailored to allow clients to access only required cross references and linkages.

“Thomson Reuters doesn’t promote a new code, there are enough out there already. We will continue to use existing codes and extract value from them; the key is linkages between market vendor codes and proprietary structures. While clients face regulatory and cost drivers, we will take care of linkages and cross referencing to improve the breadth and quality of client content.”

Thomson Reuters’ white paper details the development of symbology and notes the company’s intent, as described by du Preez. It starts by mentioning irregular incidents in the market that remind the industry of the challenges involved when an aggregated or consolidated view across positions is needed, including the incompatibility of core data symbols. The paper states: “The core elements: security identification, counterparty identification and price discovery, were never developed to work efficiently and effectively on an enterprise/global scale.”

Looking at the current state of symbology, the paper flags the fragmented identification methods resulting from the market’s approach to symbology, including data providers’ and data aggregators’ different means of identifying the various parts of securities or counterparties, as well as firms’ creation of proprietary identifiers to fill gaps in vendor provision. The paper reports: “[Symbology] is still a ‘cottage industry’ where the identification schemes put in place by one group are locally focused and usually limited to a specific slice of the securities market. This consumes resources: in many cases the task of mapping multiple sets of disjointed or partially overlapping symbols can consume as much (or more) development time and computing resource as programming the business logic itself.”
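The cross-referencing workload the paper describes can be illustrated with a minimal sketch: a table that links one security's codes across schemes and can be entered through any of them. Every identifier below is a fabricated placeholder, and real symbology services add validity dates, corporate-action handling and far more schemes.

```python
# Toy cross-reference table linking each security across identifier schemes.
# Every code here is a fabricated placeholder, not a real instrument identifier.
XREF = [
    {"ric": "XYZ1.L", "isin": "ZZ0000000001", "sedol": "B000001", "internal": "EQ-000123"},
    {"ric": "ABC2.N", "isin": "ZZ0000000002", "sedol": "2000002", "internal": "EQ-000456"},
]

def build_index(records):
    """Index each record under every one of its identifiers, so a lookup can
    enter through any scheme and recover all linked codes for the security."""
    index = {}
    for record in records:
        for scheme, code in record.items():
            index[(scheme, code)] = record
    return index

INDEX = build_index(XREF)

def cross_reference(scheme, code):
    """Resolve one (scheme, code) pair to the full set of linked identifiers."""
    record = INDEX.get((scheme, code))
    if record is None:
        raise KeyError(f"no mapping for {scheme}={code}")
    return record

# Enter via a vendor code, leave with the firm's proprietary code:
print(cross_reference("ric", "XYZ1.L")["internal"])  # EQ-000123
```

The "cottage industry" complaint is that every firm maintains tables like this independently; a central cross-referencing service maintains the linkages once and lets clients query them.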

The paper reviews changes in the financial industry since 1993 that have complicated symbology and notes the increasing difficulty, yet increasing need, to integrate information across a firm’s complete range of trading businesses to achieve effective risk management. On the flip side, it points to the parallel need to analyse rapidly growing stores of information and connect increasingly diverse datasets to find relevant information in the quest for alpha. It states: “The sophistication of the methods we employ to aggregate, rationalise and navigate information bears a direct relationship to the size of the lead a firm can have in the financial marketplace.”

How to Unambiguously Identify Information

While the outcome of linking and navigating information can be positive, it presents significant challenges, as a lack of consistent and comprehensive global industry standards means firms must maintain symbology cross references, a difficult and often flawed task, particularly in banks with many different trade and compliance-related systems. Du Preez writes: “A popular approach is ‘we can build an adaptor’. Adaptors have become some of the most complex processes in banking technology. That is not data management. It is trying not to get eaten by the alligators.” He goes on to conclude: “Data managers do not want to deal with these problems – they ultimately want services they can reliably use to unambiguously identify information.”

Enter Thomson Reuters with its vision of how to resolve these problems. “We believe that these linkages are the key to enormous untapped value. Being able to enter the data model through any entity identifier (quote, security or legal entity) and easily navigate and explore all the linkages between related entities not only puts a firm in control of its risk position, but also creates a window into opportunities. Industry standards have a significant part to play as they provide a universal start and end point; Thomson Reuters is a strong supporter of symbology standards in the data industry and we will be first in line to adopt and link industry standard identifiers to our content sets.”

The report discusses the challenges propagated by the use of multiple symbologies and the workload associated with the maintenance of cross reference tables in local security master databases. It touches on Thomson Reuters’ plans to provide cross reference services centrally and leverage its core competencies and infrastructure to ease the burden on institutions that have traditionally solved the problems themselves.

It states: “Cross referencing is a reality that cannot be avoided – we aim to make this as accurate and cost-effective as possible for our customers. We also understand that while symbology is an important part of the picture, translation and synchronisation services will also play a critical part. The need for these services is evidenced by the burgeoning desire of the market to offload these onerous data management functions to specialist providers.” The report concludes: “Thomson Reuters is investing now to continue to expose the growing capabilities of its data management infrastructure and ensure that structured and unstructured data come together in a rich tapestry of knowledge with the aim of maximizing utility to trading algorithms, research, analysis and information discovery.”

Source: A-Team Reference Data Review, 26.03.2013

Filed under: Data Management, Data Vendor, Reference Data, Standards

ICE gets green light for Brazilian fixed income trading platform

IntercontinentalExchange (NYSE: ICE), a leading operator of global markets and clearing houses, and Cetip S.A., Latin America’s largest private fixed income depository, announced that the jointly developed fixed income trading platform Cetip | Trader is expected to launch on February 25, 2013.

This follows a successful beta test that started in August and final regulatory approval granted today by the Brazilian securities regulator Comissão de Valores Mobiliários (CVM).

“This platform reinforces Cetip’s commitment to develop an efficient and transparent secondary market, respecting and improving the practices of our local over-the-counter market,” said Cetip Managing Director Carlos Ratto.

Cetip | Trader offers market participants access to voice confirmation, electronic trading and historical data in a single platform. Over the past few months, it has been thoroughly tested in a simulated trading environment with approximately 80 institutions entering more than 100,000 mock trades. While the platform was initially developed for corporate and government bonds, its flexible architecture is adaptable for new products as driven by market demand.

“We are providing an innovative solution that brings the front office a more dynamic and intuitive language,” said Cetip Trading Solutions Manager Ricardo Vit. “Cetip | Trader is a flexible and simple tool that will help keep our customers ahead of the curve.”

“ICE was pleased to work with Cetip to build a product that is customized for the Brazilian market, available in Portuguese, and that provides a complete trading solution to the market for corporate and government bonds,” said ICE Senior Vice President and Chief Strategic Officer Dave Goone.

ICE Link, ICE’s post-trade processing service, will also be available to Cetip | Trader customers beginning February 25. ICE Link has developed straight-through-processing workflows customized for the Brazilian bond market. These workflows enable middle and back office operations to more efficiently and effectively allocate trades and submit them to Cetip for registration.

Said Vit, “In addition to easier access to liquidity and execution, the complete solution also aims for better operational risk mitigation and total cost reduction on the life of a trade. ICE Link allows standardization in the trade workflow and better data integrations with counterparties and internal systems. This provides market participants a level of automation never before experienced in the Brazilian over-the-counter market.”

Source: IntercontinentalExchange, 08.02.2013

Filed under: Brazil, Exchanges, Latin America, Trading Technology

Whitepaper: Bloomberg to embrace emerging LEI

The industry initiative to develop and promote a standard global legal entity identifier (LEI) is expected to significantly reduce the opacity associated with complex financial instruments, widely acknowledged to be a major contributing factor in the 2008 credit crisis.

In this white paper, Bloomberg explains the implications of the emerging LEI for financial institutions and outlines how it is embracing the new standard to help clients better understand the entities whose instruments they trade and hold (for example, mapping the LEI to Bloomberg’s numeric BUID).

Download the White Paper Now

Source: A-TEAM 28.06.2012

Filed under: Data Management, Reference Data, Standards

LEIs – Increasing Usability & Benefits of the New Standardised Identifier – IDC

The development of the standardised legal entity identifier (LEI) is well underway, but how can firms and market participants use this new identifier to improve internal data flows and risk monitoring processes while also meeting regulatory reporting requirements?

Listen to the Podcast here

Moderator/Speakers:
Julia Schieffer – Founder,
Chris Johnson – Head of Product Management, Market Data Services, HSBC Securities Services
Darren Marsh – European Business Manager, Risk Management and Compliance, Interactive Data

Filed under: Data Vendor, Events, Market Data, Reference Data

Expanding Global Identifiers in Complex Assets and Other Areas

In the post-credit crisis financial services industry, risk management, compliance and transparency have emerged as focus points for review with provision of accurate and timely data recognised as a critical element of success. Fundamental to data provision is the accurate identification of both financial instruments and counterparties – without which you cannot truly measure your performance or exposure.

Rapid growth in derivatives and securitised debt instruments played a central role in the credit crisis. In the aftermath of the crisis, the use of alternative asset classes has continued to grow. In order to ensure that an individual firm’s exposure through such complex instruments can be accurately measured, and therefore, managed, that firm must be able to correctly identify the securities and the entities that they are investing in.

This can only be done through the use of unique identifiers. But it is well known that no single identifier is capable of uniquely identifying securities or entities globally. While some countries have identifier schemes, and certain asset classes such as equities are well covered, many regions, asset classes and markets do not have a robust mechanism for identifying securities and entities. Despite significant effort, the industry has not been able to progress a standardised approach to this problem.

Is there another way? Can commercial initiatives and innovation through partnerships succeed where standards bodies have so far failed?

We examine the industry requirements and complexities inherent in the application of unique identifiers in three key areas: Business Entity Identifiers, US Listed Options and Syndicated Loans and review the collaborative approach taken by Standard & Poor’s CUSIP Global Services to develop innovative and comprehensive industry solutions.

Click below to download the free 12-page white paper from CUSIP Global Services now.

Source:A-TEAM 08.09.2010

Filed under: Data Management, Reference Data, Standards

Singapore’s private banks lacking back office automation in trade processing – study

Research on the post-trade processing practices of private banks in Singapore has revealed that nearly 60% of private banks in the region lack back office automation in trade processing.

The study was conducted by InsightAsia Banking & Finance Consulting, a division of InsightAsia Research Group that specialises in the Asia Pacific region, and commissioned by Omgeo, the global standard for post-trade efficiency.

Against a background of the growing importance of Singapore to the global private banking sector, Insight Asia surveyed a group of Singapore-based private banks regarding their post-trade processes. The study focused on a range of issues related to trade processing, including the effects of the recent financial crisis on the private banking sector and the current mechanisms that private banks are using to process trades.

The study showed that nearly a third of private banks continue to manually carry out trade allocation and confirmation, rather than processing their trades electronically. Manual processes can make a firm more vulnerable to trade failure and create a more risk-prone environment because there is more room for error in comparing trade details.
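The comparison step where those errors arise can be sketched as a field-by-field match between the two sides' records of the same trade. The field names and values below are illustrative assumptions; production matching services add tolerances, enrichment and exception workflows.

```python
def match_trade(ours, theirs, fields=("isin", "quantity", "price", "settle_date")):
    """Compare the economic fields of two trade records.
    Returns a list of (field, our_value, their_value) breaks; empty means matched."""
    breaks = []
    for field in fields:
        if ours.get(field) != theirs.get(field):
            breaks.append((field, ours.get(field), theirs.get(field)))
    return breaks

# A hypothetical bank-side record and the counterparty's version of the same trade:
bank_side = {"isin": "SG0000000001", "quantity": 100, "price": 10.25, "settle_date": "2009-07-09"}
cpty_side = {"isin": "SG0000000001", "quantity": 100, "price": 10.52, "settle_date": "2009-07-09"}

for field, a, b in match_trade(bank_side, cpty_side):
    print(f"BREAK on {field}: ours={a} theirs={b}")  # BREAK on price: ours=10.25 theirs=10.52
```

In a manual environment this comparison is done by eye, field by field, which is exactly where the re-keying and transcription errors the study warns about creep in.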

Many of the Singapore private bank executives surveyed highlighted the importance of having efficient and flexible banking and processing systems as a key area of development. There was general agreement that higher levels of automation in trade processing would result in a reduction in operational risk. In fact, of the executives interviewed from within private banks currently carrying out trade matching in Singapore, 59% said they either wanted to make improvements to their system or were in the process of doing so.

“This study suggests that Singapore private banks are becoming increasingly aware of the benefits of introducing automation into their back-offices,” said James Drumm, Executive Director, Asia Pacific for Omgeo. “At present, many private banks operate in a manual environment, but there is a growing consensus that introducing more automated processes will significantly decrease their operational and systemic risk.”

In addition to the findings on electronic trade processing, the study also found general agreement from the private bank executives interviewed that, while recent events in financial markets were unprecedented and posed some challenges to the sector, Asia, and in particular Singapore, remains a key element in their global expansion strategies.

Another key finding of the research was that there was almost universal agreement among executives that the focus on counterparty risk has increased substantially over the last 12 months, and is likely to continue in the foreseeable future.

“We conducted this study against the background of the global financial crisis,” Phillip King, Head, Banking & Finance Consulting for InsightAsia noted. “The impact of these events at a group level for many private banks is still ongoing; however the long term growth story in Asian wealth markets remains intact. The COOs and operations executives interviewed reveal that Singapore has a solid corps of seasoned and highly capable professionals in senior roles in its private banking sector. They are a strong collective asset to the ongoing development of Singapore as a private banking hub.”

Source: Finextra, 06.07.2009

Filed under: Asia, Banking, News, Risk Management, Services, Singapore, Wealth Management

Counterparty Credit Risk Study Links Credit and Market Risk

Credit and market risk are linked, and should be managed consistently, preferably in one system, according to a study of counterparty credit risk released jointly by SunGard Data Systems and the Professional Risk Managers International Association.

The report, based on a global survey of 436 risk professionals, was the topic of a June 23 presentation at SunGard’s New York City Day, a collection of events held prior to the opening of the Securities Industry and Financial Markets Association’s Technology Management Conference and Exhibit at the New York Hilton.

In a workshop on “What Happens Next in Counterparty Credit Risk,” Nawal Roy, head of the New York chapter of the Professional Risk Managers International Association (PRMIA) and managing partner at Shobhit Capital Group, said that the survey of risk professionals from sell-side firms, buy-side firms, consulting firms and government asked a series of questions about the characteristics of a credit risk monitoring system. In their responses, 67 percent of those surveyed said it is very important to have a combined market and credit risk system, while 61 percent said that issuer exposures should be monitored “under a hybrid framework.”

“Credit and market risk are linked, and should be managed consistently, preferably in one system,” said Marcus Cree, director for the Americas of SunGard’s Adaptiv business unit. However, he added, combining them is complicated, and “how this works in practice is an open question.”

Another finding from the survey, he said, is that in counterparty credit risk management, “perceived limitations are getting in the way of desired risk policy.” Systems should reflect exposure accurately, Cree said, and need to account for the whole portfolio effect and include risk mitigation such as netting. “Accuracy is important,” Cree said. “Simple proxy measures are not up to the job.”

At the same time, he said, the survey showed that there is no “one size fits all” approach to counterparty credit risk management: “Fragmented limit structures are a reality, even if unified global limits are an aspiration.”

SunGard’s offering in the space, SunGard Adaptiv Credit Risk, is an enterprise-scale transaction processing and portfolio management engine that allows users to perform credit inquiries in real time, around the globe. The system departs from the traditional “up-front license cost plus annual maintenance” model commonly associated with risk systems and is instead priced on a per-transaction basis.

Source: 24.06.2009, by Carol E. Curtis

Filed under: Banking, Data Management, News, Risk Management, Services

Avox launches Global Business Entity Directory: Wiki-Data – Initiative supports Open Global Standards for Business Entity Data

Avox, a subsidiary of Deutsche Börse, is opening up access to a subset of its content: over two hundred thousand verified and maintained business entity data records. On the site, users will find basic information about those business entities, including their legal name, country of incorporation and operation, state/province/region and city of operation (where applicable), and an Avox identifier (AVID).

Avox is publishing this information for free usage in an effort to help facilitate a common standard for business entity data. Any data record with an AVID attached has been comprehensively verified and is maintained by Avox’s team of data experts. It is also continuously checked by clients and partners because of the built in feedback loop established between these user organizations and Avox. With this launch, the entire world can participate in and benefit from the increased level of maintenance, enhancement and consistency of the content.
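The published attributes described above map naturally onto a small record type. This is only a sketch of the fields as the article lists them; the field names and the sample entity are assumptions, not Avox's actual schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class EntityRecord:
    """Basic business-entity attributes of the kind the free directory exposes."""
    avid: str                     # Avox identifier
    legal_name: str
    country_of_incorporation: str
    country_of_operation: str
    region: Optional[str] = None  # state/province/region, where applicable
    city: Optional[str] = None

# A hypothetical record; the AVID and the entity are invented for illustration.
sample = EntityRecord(
    avid="1000001",
    legal_name="Example Holdings plc",
    country_of_incorporation="GB",
    country_of_operation="GB",
    region="Greater London",
    city="London",
)
print(sample.legal_name, sample.avid)
```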

“The industry has been demanding a low cost business entity data standard for years” says Ken Price, CEO and co-founder of Avox. “We believe this is a big step toward achieving such a standard however we fully intend to collaborate with other key industry players who will add content and significant value to the offering. A successful standard must be a shared standard in our view.” Price points to partnerships with S&P, Markit, IDC and SWIFT as examples of this collaboration.

Julia Sutton, Global Head of Customer Accounts for the Institutional Client Group at Citigroup, commented: “Wiki-data is yet another example of the collaborative model at work. Anyone that uses the data will be consistent with a number of major industry players including Citi, Barclays and Nomura. We see this as a major efficiency play as usage expands.”

Richard Snookes, Director, Global Reference Data at Barclays Capital says “By taking the decision to allow the use of their AVID identifier as unlicensed content, Avox have removed one of the key obstacles to creating a genuine community for counterparty identification. The user experiencing a loss related to a bad piece of data quickly effects a transition to being the user with the most accurate data. The wiki-data model magnifies these transitions across multiple data fields and users to present an extremely powerful tool with real scalability.”

Users can purchase additional data attributes or regular updates of the complete file as an annual subscription. Third parties can license the content as the basis for their own directory and identification services. Internet access to the data for primary usage is restriction- and cost-free. The key objective is for all major industry participants to use the same underlying data.

Price notes that the data will be of extremely high quality, “That’s the nature of the beast. One of the great benefits of the Internet is that we can now leverage a global community of data checkers who can point out remaining potential errors, changes or omissions which our central team of data experts can address straight away. There is no better way of maximizing data quality.” Additional capabilities such as online record linking, user commentary and the facility to add new records which can be verified by Avox or other firms are planned for the future.

A blog has been established where anyone can openly comment, criticize and/or make suggestions for improvements to the platform and the content. There is also a LinkedIn group, the “Avox Business Entity Discussion Forum”, for those who use that platform. Wiki-data and the blog can also be accessed directly from the Avox website.

Source:MondoVisione, 22.06.2009

Filed under: Data Management, Data Vendor, Exchanges, Library, News, Reference Data, Risk Management, Standards

Operational Risk and Reference Data: Exploring Costs, Capital Requirements and Risk Mitigation

Financial transactions can be thought of as a set of computer-encoded data elements that collectively represent 1) standard reference data, identifying it as a specific product bought by a specific counterparty, and 2) variable transaction data such as trade date, quantity and price. The reference data components of a financial transaction identify it as a specific financial product (security number, symbol, market, etc.), its unique type, terms and conditions (asset class, maturity date, conversion rate, etc.), its manufacturer or supply chain participant (counterparty, dealer, institution, exchange, etc.), its delivery point (delivery, settlement instructions and location), its delivery or inventory price (closing or settlement price) and its currency. Analogous to specifications for manufactured products, reference data also defines the product’s changing specifications (periodic or event-driven corporate actions), occasional changes to sub-components (calendar data, credit ratings, historical prices, betas, correlations, volatilities) and seasonal incentives or promotions (dividends, capital distributions and interest payments).
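The split the authors draw, static reference data versus variable transaction data, can be sketched as two record types joined per trade. All field names and values here are illustrative only, not a schema from the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ReferenceData:
    """Static attributes shared by every trade in the same instrument."""
    security_id: str          # security number / symbol
    asset_class: str          # terms and conditions
    maturity_date: str
    counterparty: str         # supply-chain participant
    settlement_location: str  # delivery point
    currency: str

@dataclass(frozen=True)
class TransactionData:
    """Variable attributes specific to a single trade."""
    trade_date: str
    quantity: int
    price: float

@dataclass(frozen=True)
class Transaction:
    reference: ReferenceData
    details: TransactionData

# Many transactions share one reference-data record; only the details vary per trade,
# which is why faulty reference data propagates errors across every linked transaction.
ref = ReferenceData("XS0000000000", "bond", "2030-06-15", "Example Bank", "Euroclear", "EUR")
trade = Transaction(ref, TransactionData("2005-11-01", 1_000_000, 99.85))
print(trade.reference.currency, trade.details.price)
```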

Download: Operational Risk and Reference Data Nov 2005 Study

This paper documents the impact of increasingly costly and duplicative expenditure on the sourcing, maintenance and processing of reference data; the effect of recent regulatory mandates on reference data; and the role faulty reference data plays in operational risk and operational capital. While it quantifies this impact, reference data remains an area needing further research owing to the lack of granularity in the cost and loss data compiled to date. This paper is the first to draw together the literatures of operational risk and reference data and present useful insights.

Source: November 2005, by Allan D. Grody, Fotios C. Harmantzis, PhD, and Gregory J. Kaple

Filed under: Corporate Action, Data Management, Library, Reference Data, Risk Management, Services, Standards

Data Quality key to risk overhaul – survey

Just a third of financial services executives think risk management principles in their business remain sound, with over half conducting or planning a major overhaul of operations, according to a survey by the Economist Intelligence Unit.

The survey of 334 executives, conducted for SAS, shows the improvement of data quality and availability is likely to be the key area of focus in the management of risk over the next three years, cited by 41% of respondents.

Strengthening risk governance is a key area for 33%, developing a firm wide approach to risk is important for 29% and improved technology infrastructure is cited by 24%.

The research highlights a belief that all departments, not just lending, need a clearer picture of risk adjusted performance and the behaviours that influence it.

Virginia Garcia, senior research director, Tower Group, says: “Although technology is not to blame for the widespread financial crisis, rigid technology and business processes have undoubtedly made it difficult for many FSIs to respond rapidly and effectively to the financial crisis. This situation reinforces the business case for a more agile and intelligent enterprise architecture to mitigate risk by helping FSIs adjust to volatile business dynamics.”

Less than a third of those questioned feel regulators handled the financial crisis properly but respondents agree that transparency needs to be heavily emphasised within proposed reforms.

They point to greater disclosure of off-balance-sheet vehicles, stronger regulation of credit rating agencies, and the central clearing for over-the-counter derivatives as initiatives thought to be most beneficial to the financial services industry.

“Now more than ever, this survey confirms the need for the players in financial markets to make transparency a major part of a comprehensive overhaul of risk and performance management to make better business decisions,” says Allan Russell, head, global risk practice, SAS.

Source: Finextra, SAS, 06.05.2009

Filed under: Banking, Corporate Action, Data Management, Market Data, News, Reference Data, Risk Management

Risk & Compliance Report March 2009: Reference Data Review

Download: RDR Risk & Compliance Report March 2009, A-TEAM

The current financial climate has meant that risk management and compliance requirements are never far from the minds of the boards of financial institutions. In order to meet the slew of regulations on the horizon, firms are being compelled to invest in their systems in order to cope with the new requirements.

Data management is an integral part of this endeavour, as it represents the building blocks of any form of risk management or regulatory reporting system. Only by first understanding the instruments being traded and the counterparties involved in these trades can an institution hope to be able to get a handle on its risk exposure. The fall of Lehman Brothers and the ensuing chaos was proof enough of the data quality issues still facing the industry in this respect.

Regulators are demanding more and more data transparency from all players in the financial markets and this has meant that the ability to access multiple data silos has become essential. A siloed mentality towards data will no longer be acceptable, as these regulators seek a holistic view of positions and the relationships between counterparties.

All of this represents a significant challenge to the data management community, given that there are standards lacking in many areas, for example business entity identification. But with great challenges come tremendous opportunities to solve data management issues that have been in the background for far too long.

Source: A-TEAM Group 13.03.2009

Filed under: Corporate Action, Data Management, Library, Market Data, News, Reference Data, Risk Management, Services, Standards

Business Entity Identifiers White Paper – The Crucial Foundation for Accurate Risk Management

Download: Business Entity Identifiers – March 2009 White Paper, A-Team Group

There is an immediate and pressing requirement from financial institutions for a usable global enumeration standard for business entity identifiers. Getting risk management houses in order is a clear driver, but this also comes against the backdrop of the anticipated onslaught of new regulations that financial institutions will have to contend with, along with the ongoing need to improve operational efficiencies, reduce errors, and reduce costs.

So while industry bodies continue the lengthy process of agreeing on a standard format to bring to market, is there an immediate step that financial institutions can take to fill this void? We believe there is, and it is a step that a number of fund managers and investment banks are already piloting.

Source: A-TEAM Group,  27.02.2009

Filed under: Data Management, Library, News, Reference Data, Risk Management, Services, Standards

Corporate Actions Report April 2009 – Inside Reference Data

Download: Corporate Actions April 2009 – Inside Reference Data

This is at least what industry groups hope will happen. In fact, there is already a working group creating an XBRL taxonomy for corporate actions, and Swift and the Depository Trust & Clearing Corporation are both working with XBRL on the corporate actions initiative. So even though change is not expected overnight, the development is now a lot more encouraging than it was a year back.

Corporate actions and the issuer debate is now, in fact, becoming an important area to follow. And with this special report, which includes comments from industry experts and a news review, we hope to give readers the opportunity to keep on top of the latest developments in the corporate actions space.

Source: Inside Reference Data, April 2009

Filed under: Corporate Action, Data Management, Data Vendor, Exchanges, Library, News, Reference Data, Risk Management, Standards