FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

Coming to Grips With Big Data Challenges by Dan Watkins

The rate of data growth in financial markets has outstripped firms’ ability to manage it.

Debates have gone so far as to dismiss the idea that Big Data can be tamed and controlled in the near term with the computing architecture commonly accepted today as an adequate solution. I agree, but argue that conventional data transport – not management – is the real challenge in handling and utilizing Big Data effectively.

From exchange to trading machine, new ticks and market data depth arrive only as fast as the delivery pipes allow. The common market data feeds used in conventional exchange trading are but a fraction of the market information actually available.

Perhaps because of the high cost of roughly $100,000 per terabyte, many market participants deem the use of more data a bit too aggressive. Or they believe that high performance computing (HPC) is the next-generation solution to any Big Data issue. Firms, therefore, advance their information technology at a slow cadence, in tune with the old adage: “if it ain’t broke, don’t fix it.”

Over the last decade, Wall Street business heads have agreed with engineers that the immense complexity of Big Data is best categorized along the three V’s of Doug Laney’s 2001 META Group report: Big Volume, Big Velocity and Big Variety.

When looking at “Big Volume” 10 years ago, the markets had just fragmented under Regulation ATS. A flurry of new market centers arose in U.S. equities, as did dark liquidity pools, giving rise to a global “electronic trading reformation.” Straight-through processing (STP) advocates evangelized platforms such as BRASS, REDIPlus and Bloomberg order management systems (OMS), resulting in voluminous, fragmented market data streaming to 5,000 NASD/FINRA trading firms and 700,000 professional traders.

Today, the U.S. has 30+ Securities and Exchange Commission-recognized self-regulatory organizations (SROs), commonly known as exchanges and ECNs. Since 2002, full market depth feeds from NASDAQ have allowed firms to collect, cache, react to, store and retrieve data on six hours of trading for nearly 300 days a year, more transparently than ever. Big Data volume has grown 1,000 percent and now reaches three terabytes of market data depth per day.
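
Those two figures alone frame the storage problem. A back-of-envelope sketch in Python, using only the numbers cited in this article (3 TB per day, roughly 300 trading days, $100,000 per terabyte), shows the scale:

```python
# Back-of-envelope storage arithmetic using the figures quoted above:
# roughly 3 TB of depth-of-book data per trading day, ~300 trading days a year.
TB_PER_DAY = 3
TRADING_DAYS_PER_YEAR = 300

annual_tb = TB_PER_DAY * TRADING_DAYS_PER_YEAR   # 900 TB of raw depth per year
annual_cost = annual_tb * 100_000                # at the ~$100,000/TB cost cited earlier

print(f"~{annual_tb} TB/year of full-depth data, ~${annual_cost:,.0f} at $100k/TB")
```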

Billions of dollars are being spent on increasing “Big Velocity.” The pipes that wire exchanges through the STP chain to the trader have become 100 times faster and larger, but still not fast enough to funnel the bulk of the information lying idle back in the database. Through “proximity hosting,” the telco is eliminated and latency is lowered. This structure accommodates larger packets but not really more information, and Big Data remains the big, quiet elephant in the corner.

Five years after Reg ATS, markets are bursting at the seams with electronic trading that produces explosive volumes of market data, breaking new peak levels seemingly every day. The SEC’s Regulation National Market System (Reg NMS), which took effect in 2007, requires exchanges and firms to calculate the best price for execution in order to be compliant. Firms are also now mandated to sweep every exchange’s order book and process all of that data for a smart execution.
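
Conceptually, the sweep obligation boils down to walking every venue’s displayed book in price order until the order is filled. The sketch below is a deliberately simplified illustration of that logic, not any firm’s actual smart order router; the venues, prices and sizes are made up.

```python
# Minimal best-price sweep of the kind Reg NMS-style routing logic requires.
# venue -> ask levels as (price, size), best price first; numbers are illustrative.
books = {
    "NASDAQ": [(10.02, 300), (10.03, 500)],
    "NYSE":   [(10.01, 200), (10.04, 400)],
    "BATS":   [(10.02, 100), (10.05, 250)],
}

def sweep_best_ask(books, qty):
    """Walk every venue's ask levels in price order until qty is filled."""
    levels = sorted((px, sz, venue) for venue, book in books.items() for px, sz in book)
    fills, remaining = [], qty
    for px, sz, venue in levels:
        if remaining <= 0:
            break
        take = min(sz, remaining)
        fills.append((venue, px, take))
        remaining -= take
    return fills

print(sweep_best_ask(books, 450))   # fills 450 shares at the best available prices across venues
```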

After the execution, traders have to track the “order trail” from price to execution for every trade and store all of that information for seven years in the event of an audit recall of a transaction.

Under Reg NMS, subscribing to the full depth of all 30+ markets in “real time” would require roughly a terabit pipe for low latency. Since a terabit pipe is not realistic, data moves at about a gigabit per second, which is relatively slow when the queue of data is 50-100 terabytes deep. Multi-gigabit pipes, as fast as they seem, are still akin to driving five miles an hour on a 55 mph highway.
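
The arithmetic behind that analogy is simple. Assuming a 50-terabyte backlog, the transfer times at the two link speeds mentioned above work out roughly as follows:

```python
# Rough transfer-time arithmetic behind the "5 mph in a 55 mph zone" analogy:
# moving a 50 TB backlog over a 1 Gb/s link versus a (hypothetical) 1 Tb/s pipe.
QUEUE_TB = 50
BITS_PER_TB = 8 * 10**12

def transfer_hours(tb, link_bps):
    return tb * BITS_PER_TB / link_bps / 3600

print(f"1 Gb/s : {transfer_hours(QUEUE_TB, 1e9):,.0f} hours")   # ~111 hours, i.e. days of queueing
print(f"1 Tb/s : {transfer_hours(QUEUE_TB, 1e12):,.2f} hours")  # ~0.11 hours, a few minutes
```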

Analysts typically pull data from a database with R (Revolution Analytics) and SAS connectors. The process involves bringing data into an analytical environment in which the user runs models and computations on subsets of a larger store before moving on to the next data-crunching job. The R and SAS connectors between the file servers and the database run at 10/100BASE-T speeds, making the movement of a 50-terabyte environment like driving one mile per hour in a 55 mph zone.
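
The pattern being described is “pull a subset over the wire, crunch it locally, repeat.” The sketch below mimics that workflow, with sqlite3 and pandas standing in for the R/SAS connector stack; the table, columns and tick values are hypothetical.

```python
# Sketch of the "pull a subset, crunch it locally, repeat" workflow described above.
import sqlite3
import pandas as pd

# In-memory stand-in for the remote tick database (table/columns are illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, price REAL, size INTEGER)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                 [("ABC", 10.01, 200), ("ABC", 10.03, 100), ("XYZ", 55.20, 500)])

# Every analysis job drags its subset across the wire before any computation happens.
for symbol in ("ABC", "XYZ"):
    df = pd.read_sql_query("SELECT price, size FROM trades WHERE symbol = ?",
                           conn, params=(symbol,))
    vwap = (df["price"] * df["size"]).sum() / df["size"].sum()
    print(symbol, round(vwap, 4))
conn.close()
```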

We all hear the polemics about data formats, the jigsaw puzzle of unstructured data and the claim that “Big Variety” is the obstacle. Even with the standardization of SQL-based queries, where analysts can ask any “ad hoc” question, too many sources and too many pipes from analytic servers cause traffic jams. SQL databases are ideal for ad hoc queries but slow at compiling unstructured data. Aggregating market information is where much of the market’s processing technology is being evaluated today, to meet regulatory requirements, sweep for best execution and manage risk.

Comparing current stock prices against bids and asks across multiple exchanges, markets, sources, asset classes and clients is essentially the Big Data task of risk management. In addition to managing data changes, firms are also tasked with managing their trading accounts, client portfolios and trading limits, such as with the implementation of Credit Valuation Adjustments (CVAs) for counterparty risk.
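
For readers outside the counterparty-risk desk, a CVA charge is commonly approximated as (1 − recovery rate) × Σ expected exposure × marginal default probability × discount factor over the life of the trade. The toy calculation below applies that standard approximation with made-up numbers:

```python
# Simplified CVA calculation of the kind referenced above; illustrative numbers only.
# Standard approximation: CVA ≈ (1 - R) * Σ EE(t_i) * ΔPD_i * DF(t_i)
recovery = 0.4
expected_exposure = [1.2e6, 1.1e6, 0.9e6, 0.6e6]   # EE at each period end
default_prob      = [0.005, 0.006, 0.007, 0.008]   # marginal PD per period
discount_factor   = [0.99, 0.97, 0.95, 0.93]

cva = (1 - recovery) * sum(ee * pd_ * df
                           for ee, pd_, df in zip(expected_exposure, default_prob, discount_factor))
print(f"CVA charge ≈ {cva:,.0f}")
```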

So why are we still piping data around the enterprise when we just need more compute and memory power? Hardware-accelerated core processing in databases such as XtremeData’s dbX and IBM’s Netezza is powered by FPGAs (field programmable gate arrays), which can now process massive amounts of data at wire speed. Along with high performance computing, high-speed messaging technology provided by companies like TIBCO, Solace Systems and Informatica has redefined transport times into ultra-low-latency terms, moving data from one database to another, memory cache to memory cache, in single-digit microseconds and sometimes nanoseconds.

“In-database” analytics is the colloquial phrase for running analytics and computations as close as possible to, or inside, the database where the data is located. Fuzzy Logix, an algorithmic HPC vendor, replaces the need for the SAS and R connector analytics that stretch along the wire from the database to the analyst. With Fuzzy Logix, the need to call the database for small files is eliminated because computations can be done alongside the rest of the database in real time, turning jobs that took days into seconds.

With in-database or in-memory analytics, BI engineers can eliminate transport latency altogether and compute at server speed, with the work sitting inside the database or in memory so that tasks complete locally, not on the transport wire.
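
Continuing the earlier connector example, the same aggregation can instead be pushed into the database so that only the result crosses the wire. The sketch below again uses sqlite3 as a stand-in engine with illustrative table and column names; it shows the pattern, not any vendor’s product.

```python
# Contrast with the connector workflow above: push the computation to the database
# so only the aggregated result crosses the wire.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, price REAL, size INTEGER)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                 [("ABC", 10.01, 200), ("ABC", 10.03, 100), ("XYZ", 55.20, 500)])

# One in-database aggregation instead of shipping every tick to the analyst's machine.
rows = conn.execute("""
    SELECT symbol, SUM(price * size) / SUM(size) AS vwap
    FROM trades
    GROUP BY symbol
""").fetchall()
print(rows)   # only a handful of (symbol, vwap) rows leave the database
conn.close()
```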

Wall Street is as risk-averse as ever in today’s atmosphere, so the adoption of new technology or new vendors continues to present operational risk challenges. ParAccel appears to be addressing the operational risk of new technology adoption by helping firms harness the parallel processing of Big Data analytics on OEM hardware.

Since ParAccel is software, an IBM, HP or Dell shop could essentially keep relying on the reliability of its well-known, established vendor while using next-generation Big Data analytic processing an order of magnitude faster than what is currently in place. ParAccel allows firms to aggregate, load and assimilate different data sets faster than traditional platforms through its “columnar database” nodal system. The columns in a ParAccel environment provide firms with the flexibility to first run analytics in-database or in-memory, then bring massive amounts of data to a common plane and finally aggregate the unstructured data – all at lightning speed.
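
Why a columnar layout helps analytic workloads can be shown with a toy example: a row store must touch every field of every record to aggregate one column, while a column store scans only the column the query needs. The values below are purely illustrative.

```python
# Toy illustration of why a columnar layout speeds up analytic aggregation.
rows = [("ABC", 10.01, 200), ("ABC", 10.03, 100), ("XYZ", 55.20, 500)]  # row store

columns = {                                                              # column store
    "symbol": ["ABC", "ABC", "XYZ"],
    "price":  [10.01, 10.03, 55.20],
    "size":   [200, 100, 500],
}

# Row store: iterate whole records just to sum one field.
total_size_rows = sum(size for _symbol, _price, size in rows)

# Column store: the aggregation reads one contiguous column and nothing else.
total_size_cols = sum(columns["size"])

print(total_size_rows, total_size_cols)   # same answer, far less data touched per query
```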

Other companies, like NVIDIA, have been building graphics processing units (GPUs) for the video game industry for nearly two decades and are now swamped with customer requests to help build parallel computing environments, giving financial firms the ability to run trillions of algorithmic simulations in microseconds for less than $10,000 per card. A single NVIDIA Tesla card can have up to 2,000 processing cores embedded inside. A GPU appliance can be attached to a data warehouse for advanced complex computations. Low-latency processing can also be achieved because data moves only a minimal distance, analyzing most of what Wall Street calls Big Data in seconds rather than the days it takes now.
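
The workloads in question are embarrassingly parallel: each simulated path is independent, which is exactly what thousands of GPU cores exploit. The NumPy sketch below prices a European call by Monte Carlo as a CPU-side stand-in for that kind of job; the parameters are illustrative, and on a GPU each path would simply run on its own thread.

```python
# Monte Carlo option pricing: the kind of embarrassingly parallel simulation
# the GPU paragraph refers to, sketched with NumPy on the CPU.
import numpy as np

rng = np.random.default_rng(42)
n_paths, spot, strike, rate, vol, t = 1_000_000, 100.0, 105.0, 0.02, 0.25, 1.0

z = rng.standard_normal(n_paths)                      # one random draw per independent path
terminal = spot * np.exp((rate - 0.5 * vol**2) * t + vol * np.sqrt(t) * z)
payoff = np.maximum(terminal - strike, 0.0)
price = np.exp(-rate * t) * payoff.mean()
print(f"Monte Carlo call price ≈ {price:.2f} from {n_paths:,} simulated paths")
```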

The vendors and players are ready to get to work; there just needs to be some consensus that the Big Elephant in the room is there and it’s standing on a straw when it could be surfing a Big Wave!

Source: Tabb Forum, 02.05.2012, by Dan Watkins, President @ CC-Speed, dwatkins@cc-speed.com

Filed under: Data Management, Market Data, Risk Management, Trading Technology

Scotiabank Inverlat S.A. de México Automates Mutual Fund Processes with the Charles River Investment Management System

Simplifies workflow; ensures compliance across all local and international securities and debt instruments

October 21, 2010 – Charles River Development (Charles River), a provider of investment software solutions for portfolio management, trading, compliance, risk, performance measurement, attribution, risk analytics and information technology (front and middle office), today announced that Scotiabank Inverlat, S.A. (Scotiabank México), one of the largest banking groups in Mexico, has implemented the Charles River Investment Management System (Charles River IMS) through its subsidiary Scotia Fondos. The multi-phase project, delivered on time, is part of Scotia Fondos’ initiative to automate its local and international mutual fund processes, covering 16 different portfolio options, on a single consolidated platform.

Scotia Fondos’ users benefit from advanced decision-support and analytics tools, automated portfolio management and trading, and real-time pre-trade compliance monitoring across all asset classes, including equities, money markets and mutual funds, as well as Mexican corporate and government fixed-income instruments such as Bonos, CETES and UDIBONOS. During the initial project, Charles River automated Scotia Fondos’ equity portfolio management and trading processes, as well as compliance monitoring. The second phase consolidated those capabilities across the firm’s fixed-income processes.
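
To make “real-time pre-trade compliance” concrete, here is a purely hypothetical sketch of the kind of rule such a system evaluates before an order is released – not Charles River’s actual rule engine – using a simple per-instrument concentration limit with invented numbers:

```python
# Hypothetical pre-trade compliance check: reject an order if it would breach a
# simple per-instrument concentration limit for the fund. All figures are illustrative.
portfolio_value = 500_000_000                        # MXN, illustrative
holdings = {"CETES": 120_000_000, "UDIBONOS": 80_000_000}
max_instrument_weight = 0.30                         # illustrative 30% concentration limit

def pre_trade_check(instrument, notional):
    """Return True if the proposed order keeps the position within the limit."""
    new_position = holdings.get(instrument, 0.0) + notional
    return new_position / portfolio_value <= max_instrument_weight

print(pre_trade_check("CETES", 20_000_000))   # True: 140m / 500m = 28%
print(pre_trade_check("CETES", 40_000_000))   # False: 160m / 500m = 32%
```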

“We needed a state-of-the-art system and a vendor with proven experience supporting the needs of Mexican asset managers; Charles River delivered both,” said Ernesto Diez, Director General, Scotia Fondos. “Our portfolio managers can now stay ahead of the market by quickly analyzing and implementing portfolio changes. We can also validate that our portfolios meet all of their obligations, at any time and for any asset class.”

Support for the many requirements of Mexico’s local market was crucial to this project. Charles River IMS enables Scotia Fondos to manage and execute trades in all Mexican government and corporate debt instruments. The firm’s mutual fund traders can also execute securities lending, support repurchase agreements and rebalance against Mexican indices. In addition, Charles River’s open architecture makes it easy for Scotia Fondos to integrate with its proprietary accounting system as well as with back-office service providers such as Bloomberg, for real-time quotes, and Valmer, owned by the Mexican stock exchange, for risk information.

Charles River IMS supports region-specific security types and associated workflows, including Mexican corporate and government investment certificates. In the near future, Scotia Fondos will go on to implement Charles River IMS’s advanced derivatives coverage and exposure-calculation functionality, helping its clients comply with Mexican regulations, such as the rules of the Comisión Nacional Bancaria y de Valores (CNBV), by monitoring and managing pre- and post-trade exposure to derivative instruments. Charles River’s pre-built compliance libraries contain more than 1,700 general and regulatory example rules spanning 35 regulatory bodies in 20 countries, including a complete rule library for Mexico.

“Charles River provides asset managers in Mexico with sophisticated yet easy-to-use solutions for expanding their operations into new asset classes and markets, delivering a competitive advantage that supports business growth,” said Spiros Giannaros, Vice President of Sales, Americas, Charles River Development.

Charles River supports five client firms in Mexico and serves more than a dozen firms across Brazil, Chile and Panama.

Source: CRD, 21.10.2010

Filed under: Data Management, Latin America, Mexico, Risk Management

Finamex Introduces Mexican Exchange Trading Proximity for Direct Market Access (DMA) and Colocation

Finamex, the leading financial services broker-dealer for the Mexican Exchange marketplace, historically focused on providing premium professional trading products for high-performance, low-latency market access, today announced its new proximity DMA offering. Finamex-enabled trading systems now allow firms to access Mexican Exchange venues at the lowest latencies in the marketplace.

As an authorized broker-dealer, Finamex also offers all risk requirements, validations and processes fully in line with existing official regulations, certified on a yearly basis. Local as well as international trading firms have used Finamex’s FIX gateways for connectivity to the Bolsa Mexicana de Valores and the Mexican Derivatives Exchange. Today’s announcement of the technology roll-out within the exchanges’ datacenter introduces a new low-latency approach by Finamex for DMA.

FIX gateways are now within LAN proximity to the Mexican Exchange trading engines, allowing high-frequency trading strategies to perform optimally on the new Finamex DMA Gateway. Straight-through processing systems have also been deployed in this environment, with integration of Finamex Order Management, Risk Controls, Execution Routing and Algorithmic Trading, as well as network connectivity for clients and partners.
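
For context, the traffic these gateways carry is standard FIX tag=value messages. The snippet below builds a bare-bones FIX 4.2 NewOrderSingle to show the format; the CompIDs, symbol and prices are invented for illustration, and a real session would also carry sequence numbers, timestamps and logon handling.

```python
# Illustrative sketch of a FIX 4.2 NewOrderSingle of the kind a DMA gateway carries.
SOH = "\x01"  # FIX field delimiter

def fix_message(body_fields):
    """Assemble a FIX message with BodyLength (tag 9) and CheckSum (tag 10)."""
    body = "".join(f"{tag}={value}{SOH}" for tag, value in body_fields)
    header = f"8=FIX.4.2{SOH}9={len(body)}{SOH}"
    checksum = sum((header + body).encode()) % 256
    return f"{header}{body}10={checksum:03d}{SOH}"

new_order = fix_message([
    (35, "D"),          # MsgType = NewOrderSingle
    (49, "CLIENT1"),    # SenderCompID (hypothetical)
    (56, "FINAMEX"),    # TargetCompID (hypothetical)
    (11, "ORD-0001"),   # ClOrdID
    (55, "AMXL"),       # Symbol (illustrative BMV listing)
    (54, "1"),          # Side = Buy
    (38, "1000"),       # OrderQty
    (40, "2"),          # OrdType = Limit
    (44, "15.25"),      # Price
])
print(new_order.replace(SOH, "|"))   # print with visible delimiters
```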

The new Finamex DMA Gateway includes full support of:

* Ultra-thin and transparent FIX engines configurable for special requirements
* Pre-trade order validation optimized for high throughput execution
* Low-latency verifications modules for trading limits and other important checks
* Optimized order routing directly onto the Exchange matching engine LAN
* Neutral access and protected order flow, as Finamex does not operate a proprietary trading desk
* Zero-cost execution algorithms, including VWAP, TWAP, POV and others, as well as several synthetic order types (a minimal slicing example follows below).
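
As a rough illustration of what such scheduling algorithms do – a generic sketch, not Finamex’s actual implementation – the code below splits a parent order into equal child orders spread evenly across the Mexican cash-equity session; times and quantities are illustrative.

```python
# Minimal TWAP slicing sketch: work a parent order evenly over the trading window.
from datetime import datetime, timedelta

def twap_schedule(qty, start, end, slices):
    """Return (send_time, child_qty) pairs that evenly work qty over [start, end)."""
    step = (end - start) / slices
    base, remainder = divmod(qty, slices)
    return [(start + i * step, base + (1 if i < remainder else 0)) for i in range(slices)]

schedule = twap_schedule(10_000,
                         datetime(2012, 5, 2, 8, 30),   # illustrative session open
                         datetime(2012, 5, 2, 15, 0),   # illustrative session close
                         slices=13)
for when, child_qty in schedule[:3]:
    print(when.strftime("%H:%M"), child_qty)            # first few 30-minute child orders
```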

Finamex’s new infrastructure services also support general co-location needs, allowing customers and other market participants to deploy their own servers and network components and further minimize overall latency. For more than 20 years, Finamex has been a leader in the Mexican financial services industry, consistently ranking as one of the best independent broker-dealers in the country. Finamex’s commitment to technology excellence is one reason it is among the top ten most liquid firms in the equities market, top five in the fixed-income business, and the prime choice for HFT players and algo shops requiring execution services.

Source: A-TEAM 05.07.2010

Filed under: BMV - Mexico, Exchanges, FIX Connectivity, Latin America, Mexico, Trading Technology

Charles River Development Expands Global Reach with Beijing Office / 拓展全球业务 – 建立北京办事处

Charles River Development (Charles River), a front- and middle-office software solutions provider for investment firms, today announced the expansion of its global operations with a regional office in Beijing, China. Located in the Excel Centre in the Financial Street area of Beijing, the office is staffed with experienced Charles River employees and multi-lingual Chinese nationals who provide China-based investment managers with professional implementation, consulting and support services. Charles River’s client base in China includes China Life Asset Management Company, the country’s largest institutional investor.

“The opportunity to invest in international securities has increased Chinese asset managers’ demand for front- to middle-office systems that can support complex investment strategies including domestic Chinese instruments, products and workflows,” said Tom Driscoll, Managing Director, Global, Charles River Development. “The Charles River Investment Management System (Charles River IMS) can address any specific regulation, asset class, trading or language requirements our Chinese clients might have.”

Charles River IMS is available with full Chinese language capability and supports over 35 Chinese security types and associated workflows. Clients can use the system to trade, execute and manage complex Chinese domestic and international asset classes, portfolios and regulations. The system also offers direct Chinese exchange connectivity via a bi-directional real time interface to a widely-used domestic trading platform.

“Charles River has always grown its business organically,” said Cameron Field, Managing Director, Asia Pacific, Charles River Development. “We build local teams of Charles River experts who understand the local investment management market, language and culture. These teams support our clients from local offices. And we make a significant investment in localizing our solutions. For the past three years we’ve taken this approach in China.”

Continues Field, “We are pleased to be the first global investment management solutions provider in China, and are continually enhancing our local solution. Charles River is currently working on a number of strategic initiatives with key market bodies and closely monitoring how the adoption of NGTS and the STEP Protocol will enhance connectivity, data integration and STP for Chinese clients.”

Source: Charles River, 13.10.2009

Filed under: Asia, China, Data Management, News, Risk Management, Trading Technology

Straight-Through Processing (STP) is Key to Profitability

Closer integration between the front, middle and back offices, and between market participants, is the next important step in technology innovation within the securities lending industry, says SunGard’s Jane Milner

Until now, securities lending has often existed in isolation from many other departments within financial institutions. However, institutions are increasingly realising the benefits of enterprise-wide efficiency and improved workflow that come from a consistent approach to technology and systems.

One clear premise that arises from this realisation is that the more integration there is between systems, the greater the benefits these systems can offer.

Source: SunGard, 23.01.2009

Filed under: Data Management, Library, News, Reference Data, Risk Management, Services, Standards, Trading Technology

Corporate Actions Report Sept 2008 – Reference Data Review

Download: Reference Data Review: Corporate_Actions Report Sept 2008_A-TEAM

Corporate actions automation has languished on the STP to-do list for some time, and significant challenges remain to be tackled, not least the adoption of the same set of standards by all market participants.
Even though corporate actions data does ultimately reach the end investor, the process involves a high degree of manual intervention, which incurs operational risk and cost for all involved. Globalisation has not helped matters as investors are actively investing a large proportion of their business outside of their home markets and corporate actions are often categorised differently from market to market.
Furthermore, as market offerings become more complex, the ability to apply set standards to existing processes is often tricky. Based on the terms, conditions and options provided by the companies and their agents, custodians can find themselves falling into a more manually intensive review of processes they have already automated, as the offers do not fit well into event templates.

Source: A-TEAM Group 13.09.2008

Filed under: Corporate Action, Data Management, Library, Reference Data, Risk Management, Services, Standards