FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

Reference Data: Tech Mahindra Details Global Data Utility Based on Acquired UBS Platform

Tech Mahindra, a business process outsourcing specialist and parent of London-based investment management technology consultancy Citisoft, has repositioned a reference data platform acquired from UBS Global Asset Management to offer an offshore reference data utility aimed at meeting market demand for lower-cost, high-quality data that can reduce risk and increase efficiency.

The global data utility has been introduced under the Tech Mahindra Managed Data Services brand and offers securities reference data across all asset types, reference data for corporate actions, tax information and end-of-day and intra-day validated pricing data. The utility handles data cleansing and validation, with clients buying licences to access the data.

Tech Mahindra suggests the utility differs from other offerings in the enterprise data management market in that it is owned by the company and can be developed further. It is also agnostic on data feeds, taking 20 feeds from vendors including SIX, Markit, Bloomberg, Thomson Reuters and DTCC.

The company’s first customer is UBS Fund Services in Luxembourg. Under the terms of a five-year services contract with UBS, Tech Mahindra will create and store golden copy data and provide multiple intra-day golden copies to the asset manager. As part of the acquisition and customer deal, Tech Mahindra, which is headquartered in Hyderabad, India, will take on some staff from UBS Global Asset Management who were working on the platform in Luxembourg, but most staff will be located in India.

As a repositioned platform, Tech Mahindra MDS already covers all time zones, markets and asset types, updates 2.5 million issuers on a daily basis, receives 200,000 customer price requests and validates 75,000 prices. Some 20,000 corporate actions are checked every day, along with 1,800 tax figures. Looking forward, Tech Mahindra plans to extend these metrics and add reference data around indices and benchmarks, legal entity identifiers and clients.

While Tech Mahindra will lead sales of the service to the banking, financial services and insurance sectors, Citisoft will be able to provide consultancy as necessary. Steve Young, CEO of Citisoft, says Tech Mahindra MDS has been designed to improve data quality and drive down the total cost of data ownership, in turn reducing risk and increasing efficiency. To manage clients’ cost issues, the company has built a toolkit into the data management system that allows users to analyse the cost of owning data, including people, processes and technology. Data quality will be underpinned by service level agreements, and key performance indicators will be added as more clients sign up for services and data volumes grow.

Reflecting on the data challenges faced by financial firms, Citisoft Group CEO Jonathan Clark concludes: “Outsourcing models have evolved over time and attitudes are changing as firms acknowledge that there is a big difference between outsourcing and offshoring, and that captive outsourcing is not an efficient approach. The need is for a commercial relationship with a centralised data utility that can deliver high-quality, accurate data and a lower total cost of ownership.”

Source: Reference Data Review, 24.07.2013

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Data Management – a Finance, Risk and Regulatory Perspective – White Paper

Download this white paper – Data Management – a Finance, Risk and Regulatory Perspective – about the challenges facing financial institutions operating across borders with respect to ensuring data consistency and usability across multiple user types.

The finance, risk and compliance operations of any financial institution need nimble access to business information for performance measurement, risk management, and client and regulatory reporting. But although the underlying data may be the same, their individual requirements differ, reflecting the group-level view required by senior management and regulators and the more operational view at the individual business level.

Where in the past risk managers were left to figure out what was most appropriate for their particular institutions, regulators today are adopting a more aggressive stance, challenging the assumptions underpinning banks’ approaches to risk management. As a result, today’s challenge is not only to understand the current regulatory, risk or finance requirements, but also to put in place the analytical framework that will help anticipate future requirements as they come on stream. To find out more, download the white paper now.

Source: A-Team, 02.07.2013

Filed under: Corporate Action, Data Management, Library, Reference Data, Risk Management, Standards

Outsourcing Reference Data Management: Cost Reduction and New Revenue Opportunities

The past 12 months have seen the emergence of new players offering Business Process Outsourcing (BPO) services for Reference Data Management. These new arrivals expand the range of options available to financial institutions for addressing the challenges of regulatory compliance, operational cost reduction and scalability.

But BPO has other benefits, and innovative adopters have benefited from using the model to create new value-added services. By catering to clients’ data management needs, these players have been able to transform what’s traditionally been considered a cost centre into a new and significant source of revenue.

This paper – from AIM Software – explores this exciting new trend, and describes how an established financial institution took advantage of BPO to turn its enterprise data management initiative into a new source of revenue and business growth.

Download the White Paper Now to Find Out More

Source: A-Team, March 2013

Filed under: Corporate Action, Data Management, Data Vendor, Library, Market Data, Reference Data, Standards

Capco Proposes the Creation of a Data Culture to Advance Data Management RDR

Financial firms are falling short on data management issues such as calculating the true cost of data, identifying the operational cost savings of improved data management and embracing social media data, but according to research by consultancy Capco, these issues can be resolved with a cross-organisational and practical approach to data management and the development of a data culture.

The business and technology consultancy’s report – ‘Why and how should you stop being an organisation that manages data and become a data management organisation’ – is based on interviews with close to 100 senior executives at European financial institutions. It considers the many approaches to data management across the industry and within individual enterprises, as well as the need to rethink data management. It states: “There is one certainty: data and its effective management can no longer be ignored.”

The report suggests an effective data management culture will include agreed best practices that are known to a whole organisation and leadership provided by a chief data officer (CDO) with a voice at board level and control of data management strategy and operational implementation.

For details on the report click here

Turning the situation around and attaching practical solutions to the data management vision of an all-encompassing data culture, Capco lists regulatory compliance, risk management, revenue increase, innovation and cost reduction as operational areas where good data management can have a measurable and positive effect on profit and loss.

Setting out how an organisation can create an effective data culture, Capco notes the need to change from being an organisation that is obliged to do a certain amount of data management, to a mandated and empowered data management organisation in which data has ongoing recognition as a key primary source. The report concludes: “Every organisation has the potential, as well as the need, to become a true data management organisation. However, the journey needs to begin now.”

Source: Reference Data Review, 24.10.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Symbology: EDI’s Corporate Actions Service Adopts Bloomberg Open Symbology

Free-use Data Tagging System Reduces Costs and Risks in Trading

Exchange Data International (EDI), a premier back office financial data provider, today announced it has adopted Bloomberg’s Global Securities Identifiers (‘BBGID’) to name and track all equity securities in its Worldwide Corporate Actions service.

EDI is the latest financial data provider to adopt Bloomberg’s Open Symbology (BSYM), an open and free-use system for naming global securities across all asset classes with a BBGID, a 12-character alphanumeric identifier for financial instruments. EDI has implemented BBGID numbers in its equities reference, pricing and corporate actions data feeds. Its Worldwide Corporate Actions service provides detailed information on 50 corporate action event types affecting equities listed on 160 exchanges.
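As a rough illustration of the identifier’s shape described above (this is not the official validation routine; the full check-digit algorithm is defined in the OMG FIGI specification and is omitted here), a minimal structural check might look like:

```python
import re

# Sketch of a BBGID's structure: 12 characters in total, a "BBG" prefix,
# eight characters drawn from a vowel-free alphanumeric alphabet, and a
# trailing check digit. The check-digit calculation itself is defined in
# the OMG FIGI specification and is not verified here.
_BBGID_RE = re.compile(r"^BBG[0-9BCDFGHJKLMNPQRSTVWXYZ]{8}\d$")

def looks_like_bbgid(identifier: str) -> bool:
    """Cheap structural check; does not compute the check digit."""
    return bool(_BBGID_RE.fullmatch(identifier))
```

For example, a well-formed identifier such as `BBG000BLNNH6` passes the structural test, while lowercase strings, 11-character strings or identifiers containing vowels are rejected.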

“EDI decided to integrate Bloomberg’s Open Symbology, as it is easily accessible and has no license fee or restrictions on usage,” said Jonathan Bloch, the Chief Executive Officer of EDI. “Bloomberg’s Symbology also advances straight-through processing of equity orders, which aids reporting and compliance management.”

Peter Warms, Global Head of Bloomberg Open Symbology, said, “Existing identifiers that change due to underlying corporate actions introduce inefficiencies, increase costs and add complexity to the data management process. Bloomberg and EDI recognise the importance of comprehensive, open and unchanging identifiers, like the BBGID, in enabling customers to track unique securities consistently and to process corporate action data seamlessly. As BSYM grows in adoption, interoperability across market systems and software using BSYM will improve steadily and reduce operational costs.”

Source: Bobsguide, 24.09.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Bloomberg unveils its NEXT terminal

On its 30th anniversary, Bloomberg officially launched an updated $100 million version of its core terminal yesterday, simultaneously in London and New York. The NEXT platform of the Bloomberg Professional Service is intended to give traders and financial services end users faster, deeper insight into the markets and, via an enhanced ‘natural language’ search function and ‘give me the answer’ front-end tool, to enable the terminal to answer questions more intuitively in future rather than merely present research and data.

According to Tom Secunda, the co-founder and vice chairman of Bloomberg speaking at the launch, “this is an evolutionary step” that helps order increasingly complex markets and aids productivity, while continuing the company’s mission to deliver on “Mike Bloomberg’s famous three-legged stool, consisting of news, data and analytics”. The company believes the NEXT platform consolidates and, crucially, integrates these feeds better than ever before, giving users easier access to the information on the terminal and enhancing the customer experience. “For example, you can ask what was US CPI in 2000 … and bang, there is the answer.” Users can then drill down into the answer for further research, added Jean-Paul Zammitt, global head of core product development at Bloomberg, pointing out that this is the key presentational change in the NEXT platform, one that required every help screen and back-end process to be rewritten and updated.

The platform has been under development for the last two years, and Bloomberg asserts that 3,000 technologists were involved in the overhaul of its core terminal, which is used by traders, analysts and even some large multinational corporate treasuries looking to hedge their foreign exchange exposure. A select group of existing clients, including OCBC Bank, Credit Agricole CIB and Glenhill Capital, were involved in the development phase, allowing Bloomberg to review common keystrokes and commands across an array of functions in order to improve the customer experience.

More than 100,000 clients have already converted to Bloomberg NEXT, at no extra cost beyond the £20,000-per-year outlay, since its ‘soft launch’ at the end of last year, with less than 1% reverting to the old terminal. The company said that two thirds of them are using the NEXT platform more than their old terminal, and that it wants to convert its entire 313,000-strong subscriber base for the Bloomberg Professional Service by the end of this year.

“Bloomberg NEXT saves me time by discovering functions and data more quickly,” said Seth Hoenig, head trader at one of the ‘soft launch’ development partners, Glenhill Capital. “The new help menus enable users to find the answer that they need fast. Stumbling upon the hidden gems within Bloomberg has always been revelatory; now it’s easier.”

According to Lars Hansen, senior portfolio manager at Denmark’s DIP, the Danish Pension Fund for Engineers: “Bloomberg NEXT is a major step forward. It is much more intuitive – you can see multiple pieces of information on one screen, which lets you see new interrelationships.”

Bloomberg highlighted what it sees as three key improvements in its updated terminal:

• Better discoverability: Bloomberg NEXT’s new discoverability features allow users to get quick, direct answers to their queries as well as pull together a wide variety of related details such as companies, research and charts. A more powerful search engine means users can type just a few words and go directly to the desired securities, functions, people and news. The streamlined menu listing puts the most relevant information and topics right up front.

• More uniformity: Every screen of the Bloomberg Professional Service has been redesigned to provide a common look and feel. This consistent interface across all asset classes, from FX to commodities and fixed income, and across all functions should allow expert users and generalists alike to more efficiently navigate often-used functions and discover new ones. An educational overview of each market segment for novices is also included in the update.

• Intuitive workflow: The functionality of the Bloomberg Professional service has been re-engineered so that a user should be able to quickly and seamlessly navigate through the series of questions and answers essential to making smart market decisions. The new workflow, with user prompts, in Bloomberg NEXT is intended to allow expert users to drill deeper into the data and to let occasional users discover new functions.

“The complexity and interconnectedness of the global financial marketplace has grown significantly. Business and financial professionals need to synthesize astounding amounts of information to make intelligent investment decisions,” explained co-founder Tom Secunda. The firm is still a big believer in a single-product approach, he stressed at the official launch of NEXT, but this “obviously gives us challenges as markets get more and more complex.”

NEXT is Bloomberg’s response. “The pace of change in financial markets will only accelerate and with it the need for more information,” added Secunda, before concluding that he believes, “Bloomberg is now positioned to quickly answer those evolving questions and ensure that our clients will always have the leading edge in making investment decisions.”

News Analysis 

Bloomberg’s new NEXT platform will go head-to-head with Thomson Reuters in the market data sector, which is increasing in value as financial markets become more and more complex and new post-crash regulations place fresh information demands on market participants. The two companies are running neck and neck in terms of market data share, with estimates of around 30% each at present.

One terminal is proprietary, of course, with Bloomberg maintaining its closed market data platform in the NEXT iteration, while Thomson Reuters now follows an open access model with its Eikon terminal, allowing users to add their own data and applications. The relative failure of the Thomson Reuters Eikon platform, which has sold only tens of thousands of units since launch rather than the hoped-for hundreds of thousands, is what prompted the open access model, although it does of course take time to build up a following. It will be interesting to see whether Thomson Reuters’ move allows the firm to win back lost market data share, or whether Bloomberg’s updated terminal can keep it on its recent upward curve. Thomson Reuters is still benefiting from the 2008 merger that united Thomson Financial with Reuters, giving it synergies in data collection and delivery, but the competition between the two has just heated up.

Source: Bobsguide, 28.02.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, News, Reference Data

Thomson Reuters Goes Live with Delta Data Factory

First Derivatives plc is pleased to announce that Thomson Reuters’ (TR) Pricing and Reference Data Group (P&RDG) has selected and implemented FD’s Delta Data Factory (DDF) for internal use as a component of its multi-faceted, forward-thinking data delivery strategy. This announcement follows FD’s recent launch of DDF, a hosted data factory service for reference data, and the formation of a dedicated data management division.

Thomson Reuters P&RDG’s client-centric focus and innovative approach make use of Delta Data Factory as one element of a strategy to meet its clients’ formatting and workflow requirements rapidly. TR selected DDF as a managed-service “data formatting factory” to assist in its strategy of offering TR clients speedy integration and adoption of reference and pricing data.

According to Tim Rice, MD of Global Pricing and Reference Data, “we selected FD’s Delta Data Factory because of the flexibility and rapid implementation speed, powerful data transformation engine, data knowledgeable team, reliable hosted infrastructure and global support model. Within TR’s data strategy, FD’s independence as a strong third party service provider supports and accelerates our plans allowing clients to leverage our data quickly. We’re now successfully live with a number of clients”.

For consumers of TR data, whether client-direct or third-party application vendors, FD’s Delta Data Factory transforms the data into readily consumable formats for TR clients, third-party application partners, security master environments or EDM platforms.

Dale Richards, President of FD US and Global Head of Data Management at FD commented, “We are very pleased to have TR as a client of DDF. The service is a powerful new model for the data industry and TR implementing and going live is a terrific endorsement of the capabilities”.

DDF is a managed service whose support model includes software, expert data staff, service-level management, infrastructure, customization tools, hosting and management. FD provides the factory, working with clients to implement the best strategy. FD has been hosting and operating systems on behalf of clients for 15 years, with ISO 27001/SAS 70-compliant operating centers.

In addition to data vendors and publishers, financial institutions use DDF to outsource the processing and normalization of multiple inbound reference data sources into EDM or proprietary security master environments. They also use DDF to produce customized outbound formats for their internal clients. Benefits include cost savings and shorter project timeframes.

Source: Bobsguide, 10.02.2012

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data

SIBOS Toronto Round Up: LEI, T2S, Standards – A-Team

Unsurprisingly, given the host’s recent market positioning, the dominant theme at last week’s Swift user conference, Sibos, in Toronto was standards, in many different flavours. The one at the top of the list in terms of reference data, however, had to be the legal entity identifier (LEI), and there was certainly no shortage of discussions on the subject (see the list from my preview here). A total of three sessions were dedicated to the topic, and the exhibition hall was abuzz with the potential the LEI holds for the vendor community in terms of revenue generation.

As I noted in my interview with Fabian Vandenreydt, Swift’s head of Securities and Treasury Markets, at the conference last week (see more here), the industry messaging network provider has made reference data a key part of its 2015 strategy, and its selection by the Sifma-led committee to act as the issuing body for the proposed ISO 17442 standard is a significant element in these endeavours. However, the US Office of Financial Research (OFR) has not made a final decision on whether the Swift, ISO and Depository Trust and Clearing Corporation (DTCC) team will get the gig (see more on which here), and there are many factors to be considered before a new data infrastructure is put in place, not least of which is governance.

Although it was not addressed at length during the three panel discussions on the LEI (in fact, it was only briefly noted), the notion of a privately run, public data utility is something of a challenge in terms of governance. Given that DTCC-owned Avox is currently run as a commercial operation with a number of large customers such as Citi, how will the vendor’s technology be deployed as the backbone of a data utility without granting the DTCC an unfair advantage over the market? Ditto with Swift, given that it plans to offer “value added” services on top of the basic reference data set provided by the OFR.

This was a subject I raised offline with a number of DTCC and Swift execs at the show, and the response was that the issue is due to be tackled over the coming months. Given the European Commission’s investigations of players such as Thomson Reuters and S&P’s Cusip Global Services (CGS) over potential anti-competitive issues in reference data, the Commission will likely be an active participant in this debate. That is, if Europe agrees to go down the utility route.

A lot seems to be predicated on this week’s Financial Stability Board (FSB) discussions in Basel, an event that nearly everybody I spoke to was planning to attend. The hope seems to be that the new global body will make a final recommendation on whether Europe and the rest of the world will adopt the LEI as proposed. Given that the FSB is working on tricky issues such as tackling the shadow banking sector in a coordinated fashion (see more on which here), it seems likely that there will be some pressure to adopt such a standard.

However, whether the FSB has the teeth to get the global regulatory community to listen and onto the same page is another matter entirely. After all, China has already indicated it will be developing its own entity identification standard. How many more can the industry expect? As UBS’ Daniel Maury, global lead for the firm’s Enterprise Client Data Programme (ECDP), noted during the LEI session, there could be 10 or more if regulators don’t agree on one, a development that could prove especially costly for the industry as a whole.

The party to which these developments will prove most beneficial, however, is the vendor community. Maury admitted that there is no appetite within the investment banking community to build a vast library of cross references to these new standards, so firms will turn to vendors for the solution. Thomson Reuters’ announcement on the first day of the conference (see more here) that it has expanded its legal entity data solution is a case in point of vendors scaling their capabilities ahead of the requirements. It follows a similar move by Bloomberg earlier this year and will certainly not be the last.

Turning away from the LEI for a second, the other main news from the conference from a post-trade perspective was the announcement by the European Central Bank (ECB) that the Target2-Securities (T2S) settlement infrastructure would be delayed by up to another year (see my guide to T2S from back in 2009 here). Rather than launching in September 2014, the pan-European settlement platform will be delayed until an unspecified date in 2015, according to T2S programme board chairman Jean-Michel Godeffroy.

Speaking during a panel debate on the Tuesday of the conference, Godeffroy said the delay was caused by a need to take additional user requirements into account and to extend user testing with central securities depositories (CSDs) beyond the originally scheduled nine-month period. However, the buzz from the conference and exhibition halls was that, given the loss of T2S champions at the ECB, Jean-Claude Trichet and Gertrude Tumpel-Gugerell, the central bank may back away from the project entirely and leave it to the industry to sort out.

What does all of this mean for the data standardisation space? T2S has been a driver of much of the work around corporate actions standardisation, so a delay, or even a complete reversal, will have an impact on these developments, as well as on more general data standardisation efforts (see more on which here). The main impact of the T2S developments relates to the fact that the platform would take settlement out of the hands of CSDs, forcing a complete re-evaluation of their business models and those of all the other players active in the European securities market. Taking this pressure away could therefore have a whole host of consequences.

Of course, a roundup of the Sibos week couldn’t go without a mention of my standards forum panel, during which Bob Masina, head of technology and operations for the Australian Payments Clearing Association (APCA), Dan Retzer, CTO at corporate actions solution vendor XSP, and I debated whether ‘standards innovation’ is an oxymoron (see my earlier blog here). Our conclusion was that being innovative with standards is all well and good, but it takes the big players adopting those standards (and thus bringing the rest of the market with them) to make a difference. Standards development is merely the first step.

Source: A-Team, 26.09.2011, Virginie O’Shea

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Standards

Bloomberg Pushes Benefits, Value of Data License New Commercial Model

Bloomberg is redoubling efforts to convince customers of the value of the new pricing model for Bloomberg Data License, its service for intraday and end-of-day market and reference data. Known as the New Commercial Model (NCM) and originally introduced in March, the model could see the cost of Data License increase by between 30 and 100 percent over three years.

The pricing model, which is part of the vendor’s new customer engagement model for enterprise Data License customers, came into effect at the start of June for existing contracts facing renewal, and from April 1 for new accounts, according to a letter sent to clients in March by Bloomberg president and chief executive Daniel Doctoroff. However, in recent weeks, sources say, the vendor’s sales management team has contacted Data License clients to obtain feedback on the structure of the NCM and has been visiting customers in person to re-explain the model.

Although Bloomberg declined to comment on why it was revisiting customers, banks and buy-side firms have criticized the model, which will lead to unbudgeted price rises of up to—and in some cases more than—100 percent. “Originally they gave us a detailed breakdown of every single security license, back-office license, estimated dollar spend, renewal dates and all the instruments that had been consumed on the feed,” says a source at one sell-side firm. “Then in the last two weeks they came back and said they want to re-present this…. Bloomberg is keen to make sure customers understand everything and show that it is not as bad as it first looks.”

Under the old commercial model, customers paid a monthly charge per security, with prices based on six categories of instrument type and three categories of data type—a security master incorporating corporate actions and prices; derived data; and issuer data—plus a sub-category of price-only data. Under the NCM, Bloomberg has retained the monthly charges and the link between prices and data/instrument type, but has replaced the existing categories with a greater number of new ones that result in higher fees overall. For example, the security master, corporate actions data and prices for a corporate security were previously bundled together for $1.50 per security per month, but are now sold separately at $1.70, $0.50 and $0.75 per security per month, respectively—a total of $2.95 per security per month.
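The unbundling arithmetic can be sanity-checked in a few lines; all figures are the per-security monthly fees quoted in this article:

```python
# Per-security monthly fees quoted for a corporate security.
OLD_BUNDLE = 1.50                       # security master + corporate actions + prices, bundled
NEW_PARTS = {"security_master": 1.70,   # the same content, now sold separately
             "corporate_actions": 0.50,
             "prices": 0.75}

new_total = sum(NEW_PARTS.values())                 # 2.95
increase_pct = (new_total / OLD_BUNDLE - 1) * 100   # ~96.7

print(f"New total: ${new_total:.2f}/security/month, "
      f"{increase_pct:.0f}% above the old ${OLD_BUNDLE:.2f} bundle")
```

That works out to roughly a 97 percent rise for a client that still needs all three components.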

Bloomberg has also expanded the six instrument categories to 11, by splitting out different asset types into new, individual categories, such as separate categories for funds, US government securities and syndicated loans. The original six comprised: corporate, government and money market assets; municipals; agency pools; collateralized mortgage obligations, commercial mortgage-backed securities, whole loans and asset-backed securities; equity options, futures, warrants, funds, indexes and currencies; and economic statistics.

Meanwhile, the vendor has divided issuer data into three component categories—credit risk data, fundamentals and estimates—meaning that monthly fees for a corporate security have more than doubled, from $2.50 to $6.50, under the NCM. The cost of derived data has risen by up to 50 percent, depending on the asset class, while the vendor now charges for accompanying corporate actions data regardless of whether a corporate action event actually occurred that month. Under the NCM, firms wishing to view the data more than once per month will also be charged between one and three cents per security per day for the additional requests, depending on the asset class and data type, whereas previously the first multi-request was free.
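To get a feel for how the new multi-request charge compounds, a rough sketch follows; the one-to-three-cent daily fee range comes from the article, while the portfolio size, the 22 business days, and the assumption that one extra pull per day triggers the fee are all hypothetical:

```python
def monthly_multirequest_charge(securities: int,
                                business_days: int = 22,
                                fee_cents: int = 1) -> float:
    """Monthly charge, in dollars, for re-requesting already-delivered data.

    The 1-3 cent per-security-per-day fee range is from the article;
    the assumption that one extra pull per business day triggers the
    daily fee, and the volumes used below, are hypothetical.
    """
    return securities * business_days * fee_cents / 100

# A hypothetical firm re-pulling 50,000 securities each business day, at the low end:
print(monthly_multirequest_charge(50_000))  # 11000.0
```

Even at the one-cent low end, a routine daily re-pull of a modest universe adds five figures a month, which is why the loss of the free first multi-request attracts attention.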

More Flexible
Bloomberg officials say the new model is intended to provide more flexibility and value, and to allow clients to “only pay for the data that they want and need.” But one market data manager at a European asset manager calls the change a “pure slicing and dicing” exercise, adding that if a business needs to subscribe to all the content, “You get nothing new or extra—you just have to pay a lot more for the same data.”

To soften the impact of the changes for existing clients, Bloomberg’s Data Solutions group will provide enterprise data license consultants to help clients manage their data usage, and is phasing in the increases: clients renewing their Data License contracts this year and early next year will see stepped cost increments, limited to a total increase of no more than 7 percent in the first year and a further 7 percent in the second. Some clients praise this softly-softly approach but are concerned about the impact after that initial two-year period.

“In our peer group, we are sharing knowledge on how much it will impact us. For some, it’s 2 percent, for others it’s 30 or 100 percent, depending on what data you take and how exposed you are to certain services,” says a market data vendor manager at a second European asset manager. “Seven percent in the first year, then another 7 percent in the second is fine, but after that, when it hits you fully—that’s what we’re worrying about.”

In addition to incremental rises, Bloomberg will also offer “optimization”: if a firm has multiple contracts with the vendor across different branches or business units and requests the same data on the same security in the same month via those contracts, the vendor will charge only between one and three cents for the second request (excluding intraday and derived data), rather than twice the full price, which it expects will deliver better value to clients.
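The arithmetic behind optimization can be sketched as follows. This is a hypothetical calculation using figures quoted in this article ($6.50 per month for a corporate security, three cents for a repeat request); actual tariffs vary by asset class and data type:

```python
def monthly_fee(contracts_requesting: int, base_fee: float = 6.50,
                repeat_fee: float = 0.03) -> float:
    """Monthly charge for one corporate security requested in the same
    month under several contracts: full price once, then the small
    optimization fee for each further request instead of full price again.
    (Illustrative figures only, taken from the article's examples.)"""
    if contracts_requesting <= 0:
        return 0.0
    return base_fee + repeat_fee * (contracts_requesting - 1)
```

Under this sketch, a firm requesting the same security via three contracts would pay $6.56 rather than $19.50, while a firm that has already consolidated to a single contract pays the full $6.50 and has no discount left to gain.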

However, Jean-Pierre Gottdiener, manager at Paris-based consultancy Lucidine Conseil, says firms who have made the biggest efforts so far to reduce costs and administration by consolidating multiple contracts across branches will not be eligible to take advantage of optimization, and will have to pay the most. “If you only have one contract because you have already rationalized your request to Bloomberg, there will be no optimization and you will support nearly the full increase of the prices,” he says. “Some firms have made no optimization on Bloomberg and their increase was only 30 percent, whereas those who have already made an investment to rationalize Bloomberg face a rise of 100 percent.”

Some acknowledge that the vendor’s prices are fair, given that data volumes have increased considerably since the last time the vendor increased prices—more than a decade ago, according to Bloomberg officials—but Gottdiener adds that Bloomberg’s leading position in the market means “the industry is facing a real issue from the policy, and will probably need to find alternative solutions.”

In fact, the NCM has prompted dissatisfied buy- and sell-side firms to reassess their data consumption. Some participants have even said they will look to alternative parties for cheaper data for some parts of the Data License, such as corporate actions, where plenty of alternative providers exist. “Often with Bloomberg, you just absorb the whole universe and pump it everywhere, so it’s good that we now have to look at what data do we use, where we use it, and why,” adds the source at the second asset manager.

Source: Waters Technology, 08.08.2011

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, News, Reference Data, Standards

Integration of Historical Reference Data

Historical data is becoming more crucial to managing risk, but to make it useful, data updates must be reconciled with the moments when the actual changes occurred, says Xenomorph’s Brian Sentance.

There has been much talk recently about integrated data management, as the post-crisis focus on risk management demands a more integrated approach to how the data needed by the business can be managed and accessed within one consistent data framework. Much of the debate has been around how different asset classes are integrated within one system, or how different types of data—such as market and reference data—should be managed together.

However, there has been little discussion on how historical components can be integrated into the data management infrastructure. This will have to change if the needs of regulators, clients, auditors and the business are to be met in the future.

Why are history and historical data becoming more important to data management? There are many reasons. First, data management for risk needs historical data in a way that simply was not necessary for the reference data origins of the industry over a decade ago.

Another reason is the increasing recognition that market data and reference data need to be more integrated, and that having one without the other limits the extent of the data validation that can be performed. For example, how can terms and conditions data for a bond be fully validated if the security is not valued by a model and its prices are not compared to the market?

As another example, how many data management staff were overloaded by the “false positives” of price movement exceptions during the highly volatile markets of the financial crisis? I would suggest many organizations would have saved hours of manual effort if the price validation thresholds used could have automatically adjusted to follow levels of market volatility derived from historical price data.
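As an illustration of that idea, a volatility-adjusted validation threshold might look like the following sketch (hypothetical names and parameters, not a description of any vendor's actual validation logic): instead of a fixed percentage band, the exception threshold scales with the volatility observed in recent historical prices.

```python
import statistics

def flag_price_exception(price_history, new_price, k=4.0, window=20):
    """Flag a price move as an exception only when it exceeds k times the
    recent volatility of daily returns, so the threshold widens
    automatically in turbulent markets and narrows in calm ones.
    (Illustrative sketch; k and window are assumed parameters.)"""
    tail = price_history[-(window + 1):]              # last window+1 prices
    returns = [(b - a) / a for a, b in zip(tail, tail[1:])]
    vol = statistics.stdev(returns)                   # recent daily volatility
    move = (new_price - price_history[-1]) / price_history[-1]
    return abs(move) > k * vol                        # True = raise an exception
```

In a crisis, daily volatility rises, the band widens, and routine large moves stop generating the “false positive” exceptions described above, while a genuinely anomalous price still trips the check.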

Regulators and other organizations in the financial markets now want to know more of the detail behind the headline risk and valuation reports. The post-crisis need for an increase in the granularity of data should be taken as a given. This is progressing to an extent where external and internal oversight bodies not only want to know what your data is now, but want the ability to see what the data was at the time of market or institutional stress. Put another way, can you easily reproduce all the data used to generate a given report at a specific point in time? Can you also describe how and why this data differs from the data you have today?

“But I already have an audit trail on all my data,” I hear you say. Yes, that is a necessary condition for being able to “rewind the tape” to where you were at a given time, but is it sufficient? An audit trail can be considered a sparse form of historical “time series” storage, but as we are all aware, few pieces of “static” data never change over time (corporate events being the main cause of such changes). The main issue with using an audit trail here is that it can only represent the times when a data value was updated in the database, which is not necessarily the same as when that value became valid in the real world.

Take, for example, a sovereign debt restructuring that forces a change in the maturity dates of its issued bonds. An audit trail can only capture when your data management team implemented the change in the database, not necessarily when the change actually took effect in the market. The two times may turn out to be the same if your data management team is efficient and your data suppliers are accurate and timely. But don’t count on it, and don’t be too surprised if a regulator, client or auditor is displeased with your explanation of what the data represents and why it was changed when it was. We are heading into times where not knowing the data detail beneath the headline numbers is no longer acceptable, and historic storage of any kind of data, not just market data, will necessarily become much more prevalent.
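The distinction drawn here, between when a change became valid in the market and when it was recorded in the database, is what bitemporal storage captures. A minimal sketch (illustrative field and function names, not any particular product's schema):

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Version:
    value: str         # e.g. the bond's maturity date
    valid_from: date   # when the change took effect in the real world
    recorded_at: date  # when the data team entered it in the database

def as_of(history, valid_date, known_date):
    """Return the value that was valid on `valid_date`, using only
    versions that had already been recorded by `known_date` - i.e.
    reproduce what a report run on `known_date` would have shown."""
    candidates = [
        v for v in history
        if v.valid_from <= valid_date and v.recorded_at <= known_date
    ]
    return max(candidates, key=lambda v: v.valid_from).value if candidates else None

# A restructuring moves a bond's maturity with effect from 1 March 2021,
# but the data team only books the change on 10 March 2021.
history = [
    Version("2030-06-15", date(2020, 1, 1), date(2020, 1, 1)),
    Version("2035-06-15", date(2021, 3, 1), date(2021, 3, 10)),
]
```

Querying along both time axes lets you answer the regulator's question precisely: a report run on 5 March would still have shown the old maturity, even though the new one was already valid in the market.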

Source: Xenomorph, 13.07.2011

Filed under: Corporate Action, Data Management, Data Vendor, Market Data, Reference Data, Risk Management, Standards

Managing Corporate Actions Risk – January 2010 – IRD – Insight Reference Data

Despite industry efforts to reduce the financial losses typically associated with corporate actions processing, managing risk remains one of the major challenges for the corporate actions industry. On November 18, Inside Reference Data gathered leading corporate actions professionals in a web forum to discuss what more could be done to help improve the situation.

Source: Insight Reference Data, 29.01.2010

IRD_Jan2010_ManagingCorporateAction_Report

Filed under: Corporate Action, Data Management, Library, News, Reference Data, Risk Management

UK asset managers lack confidence in reference data quality – survey

Over a third of UK-based asset managers and banks are not confident in the quality of reference data they use to support trading activity, according to a survey from IT services firm Patni.

The survey of 100 company representatives found that 91% of asset managers do not have a single supplier of reference data, with the remainder admitting that they were not sure of their source at all. Respondents say that an average of six per cent of trades fail as a result of poor reference data.

Yet half of those questioned say they have not considered outsourcing the management of their reference data to a third party, citing fears of a loss of control and of security breaches. Meanwhile, the overwhelming reason cited for considering outsourcing is the potential for cost savings, followed by higher levels of accuracy.

Philip Filleul, product manager, reference data, Patni, says: “Many buy-side and sell-side firms are now uncomfortably aware of both the time and costs they devote to purchasing, cleansing and distributing reference data, as well as the risks that arise when these tasks are not performed effectively, among them failed trades and lost revenue opportunities.”

“The twin pressures of achieving regulatory compliance and straight-through processing have highlighted substantial redundancy and duplication of effort in the area of reference data management.

“One in ten trades fail on first settlement attempt – and of these, 60 to 70 per cent can be attributed to poor data management.”

Research from TowerGroup, cited by the report, showed that nearly two thirds of trade failures were due to inaccurate data.

Source: Finextra, Bobsguide, 29.10.2010

Filed under: Corporate Action, Data Management, Market Data, Reference Data, Risk Management, Standards