FiNETIK – Asia and Latin America – Market News Network

Asia and Latin America News Network focusing on Financial Markets, Energy, Environment, Commodity and Risk, Trading and Data Management

Finamex: It’s a Fine Time to Cross the Border – Mexico the Emerged Market of Growth

In January of this year, emerging markets became a primary investment theme rather than an alternative one. Many investors ventured toward countries that had posted sky-high growth over the last few years, with the BRIC countries of Brazil, Russia, India and China receiving the bulk of the excitement in the emerging-market approach.

Read the full article: Mexico the Growth Market

Today, the BRIC countries are struggling to maintain upward momentum. The simmering down of the American market crisis and the expanding concerns over the Eurozone present a dilemma, and the effects are showing. The Institute of International Finance (IIF), a global association of financial institutions, says that “net private capital flows to emerging market economies remain quite volatile and subject to disturbance from the euro area”. According to the IIF’s research, capital flows fell to $1.03 trillion in 2011 from $1.09 trillion in 2010 and are expected to fall again this year to $912 billion before rising to $994 billion in 2013.

The woes of the Eurozone monetary crisis have pushed investors to move money out of the region and seek safe haven in securities markets elsewhere. Brazil, Indonesia and China, among others, are no longer experiencing upward momentum; some markets are flat or even in decline.

Year after year, however, analysts continue to see strong signs of growth and long-term prosperity in Mexico. Many of the troubles plaguing other emerging markets are simply not seen there; in fact, quite the opposite.

Brazil, with its lucrative energy industry capitalized by the largest South American exchange, has attracted many investors seeking opportunities in Latin America. Brazil has enjoyed the influx of foreign investment and has gone further to encourage interest from the North, recently lowering some of its staggeringly high tax penalties on returns and allowing foreign instruments to take a larger share of domestic investors’ portfolios. “Investors are more cautious with Brazil,” Gustavo Mendonca, an economist with Oren Investimentos in Sao Paulo, said this week. “The country has slowed very sharply and the prospects for long-term growth have gone downhill.”

Policy adjustments invite and attract investment, but many of these actions come late and under pressure from issues developing in other countries, such as Spain. For even a novice Northern investor looking south of the border, on the other hand, the opportunities in Mexico remain solid.

A key factor is that Mexico offers some of the most definitive metrics, providing the level of transparency needed in a volatile global market. Unlike Brazil, Russia, India or China, Mexico is directly tied to American monetary policy, a correlation that does not exist in other emerging-market countries, and, not surprisingly, it is growing alongside the American economy.

Is Mexico beyond criticism and scrutiny? Of course not. But to understand the benefits of investing in Mexico for the short and the long term, we should begin with Mexico’s key role as a member of NAFTA (the North American Free Trade Agreement). The implementation of NAFTA, along with close inter-country relationships, ties Mexico’s trade and currency valuation to those of the US and Canada.

For example, in 2010 many believed the US would remain flat for the next two years, but we now see this was not the case. On the back of American performance, Mexico’s markets have also risen, working in parallel, a pattern portfolio managers find affirming. Mexico has also maintained a weak peso over the last ten years, keeping it priced at a competitive advantage against China.

Currency rates have helped Mexico realize an economic boom that has continued since the 1990s. The move to NAFTA in 1994 may be the key contributing factor in Mexico’s 600 percent increase in sales to the US. With inflation no longer under control in countries like China and Brazil, analysts are finding that Mexico’s policies have proven successful in weathering many global financial catastrophes.

…..

As opportunities within the developed markets diminish, the Mexican marketplace is standing strong. As a top emerging market for the global investing community, particularly in Latin America, Mexico represents a substantial alternative to Brazil, home of the leading Latin American stock market. Although not a BRIC country, Mexico certainly has more promising economic stability and growth potential than some of the most mature economies. With a clear goal in sight, the local markets in Mexico continue to take measures that enhance liquidity in equities and derivatives trading, providing surety to its financial institutions and reaching more investors abroad.

Source: FINAMEX / Dan Watkins, 01.08.2012, dwatkins@cc-speed.com


Filed under: Asia, BMV - Mexico, Brazil, China, Exchanges, Latin America, Mexico, News, Trading Technology

Coming to Grips With Big Data Challenges by Dan Watkins

The rate of data growth in financial markets has scaled beyond the means of manageability.

Debates have gone so far as to dismiss Big Data as something tamable and controllable in the near term, with the current computing architecture commonly accepted as an adequate solution. I agree, but argue that conventional data transport, not management, is the real challenge of handling and utilizing Big Data effectively.

From exchange to trading machine, new ticks and market-depth data are delivered only as fast as the delivery channel can carry them. The market data feeds commonly used in conventional exchange trading are but a fraction of the market information actually available.

Perhaps due to high costs of $100,000 per terabyte, many market participants deem the use of more data a bit too aggressive. Or they believe that high-performance computing (HPC) is the next-generation technology solution for any Big Data issue. Firms, therefore, advance their information technology sluggishly, in a slow cadence tuned to the old adage: “if it ain’t broke, don’t fix it.”

Over the last decade, Wall Street business heads have agreed with engineers that the immense perplexity of Big Data is best categorized by the three V’s of Doug Laney’s 2001 META Group report, rendered here as Big Volume, Big Velocity and Big Variety.

When looking at “Big Volume” 10 years ago, the U.S. equity markets had just fragmented under Regulation ATS. A flurry of new market centers arose, as did dark liquidity pools, giving rise to a global “electronic trading reformation.” Straight-through processing (STP) advocates evangelized platforms such as BRASS, REDIPlus and Bloomberg Order Management Systems (OMS), resulting in voluminous, fragmented market data streaming to 5,000 NASD/FINRA trading firms and 700,000 professional traders.

Today, the U.S. has more than 30 Securities and Exchange Commission-recognized self-regulatory organizations (SROs), commonly known as exchanges and ECNs. For the first time since 2002, full market-depth feeds from NASDAQ allow firms to collect, cache, react to, store and retrieve six hours of trading data for nearly 300 days a year more transparently than ever. Big Data volume has grown 1,000 percent, reaching three terabytes of market-depth data per day.
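As a back-of-the-envelope sketch of what those figures imply over a year, using only the numbers cited in this article:

```python
# Rough annual sizing from the article's figures: ~3 TB of depth-of-book
# data per day, captured on ~300 trading days a year.
tb_per_day = 3
days_per_year = 300

yearly_tb = tb_per_day * days_per_year           # 900 TB of raw depth per year
cost_per_tb = 100_000                            # the $/TB cost cited earlier
print(f"~{yearly_tb} TB per year")
print(f"At ${cost_per_tb:,}/TB, storing one year runs "
      f"~${yearly_tb * cost_per_tb / 1e6:.0f}M")
```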

Billions of dollars are being spent on increasing “Big Velocity.” The pipes wiring exchanges through the STP chain to the trader have become 100 times faster and larger, yet still not fast enough to funnel the bulk of the information lying idle back in the database. Through “proximity hosting,” the telco hop is eliminated and latency is lowered. The structure accommodates larger packets, but not really more information; Big Data remains the big, quiet elephant in the corner.

Five years after Reg ATS, markets were bursting at the seams with electronic trading, producing explosive market data that seemed to break new peak levels every day. The SEC’s Regulation National Market System (Reg NMS), which took effect in 2007, requires exchanges and firms to calculate the best price for execution in order to be compliant. Firms are also now mandated to sweep all exchanges’ order books and process all of that data for a smart execution.
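To make the best-execution sweep concrete, here is a minimal sketch of computing a national best bid and offer across venues; the venue names and quotes are invented for illustration:

```python
# Minimal best-bid/offer sweep across venues, in the spirit of Reg NMS.
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    bid: float
    ask: float

def best_bid_offer(quotes):
    """Return the venues showing the best bid and the best offer."""
    best_bid = max(quotes, key=lambda q: q.bid)   # highest price to sell into
    best_ask = min(quotes, key=lambda q: q.ask)   # lowest price to buy from
    return best_bid, best_ask

quotes = [
    Quote("NYSE", 10.01, 10.03),
    Quote("NASDAQ", 10.02, 10.04),
    Quote("BATS", 10.00, 10.02),
]
bid, ask = best_bid_offer(quotes)
print(f"NBB {bid.bid} on {bid.venue}, NBO {ask.ask} on {ask.venue}")
```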

After the execution, traders have to track the “order trail” from price to execution for every trade and store all of that information for seven years in the event of an audit recall of a transaction.

Under Reg NMS, subscribing in “real time” to the full depth of all 30-plus markets would require a terabit-scale pipe for low latency. Since such a pipe is not realistic, data moves at gigabit speeds, which is relatively slow with the data queue running 50-100 terabytes deep. Multi-Gbps pipes, as fast as they seem, are still akin to driving five miles an hour on a 55 mph highway.
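The arithmetic behind the highway analogy is easy to check; a sketch assuming ideal throughput and no protocol overhead:

```python
# Time to drain a market-data queue at a given line rate (ideal conditions).
def transfer_hours(terabytes: float, gigabits_per_sec: float) -> float:
    bits = terabytes * 8 * 10**12                 # decimal TB -> bits
    return bits / (gigabits_per_sec * 10**9) / 3600

for queue_tb in (50, 100):
    print(f"{queue_tb} TB at 1 Gbps:  {transfer_hours(queue_tb, 1):6.0f} hours")
    print(f"{queue_tb} TB at 10 Gbps: {transfer_hours(queue_tb, 10):6.0f} hours")
```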

Analysts typically call data from a database with R (Revolution Analytics) and SAS connectors. The process brings data into an analytical environment in which the user runs models and computations on subsets of a larger store before moving on to the next data-crunch job. The R and SAS connectors between the file servers and the database run at 10/100BASE-T, making the movement of a 50-terabyte environment like driving one mile per hour in a 55 mph zone.
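The connector pattern looks roughly like the sketch below, with an in-memory sqlite3 table standing in for the remote tick store (table and column names invented); the point is that every matching row crosses the wire before any computation begins:

```python
import random
import sqlite3
import statistics

# Toy in-memory tick store standing in for the remote database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, price REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("XYZ", 10 + random.random()) for _ in range(100_000)])

# The connector pattern: ship every matching row to the analyst's machine...
prices = [p for (p,) in conn.execute(
    "SELECT price FROM trades WHERE symbol = ?", ("XYZ",))]

# ...then model locally. With 50 TB behind the wire, transport dominates.
print("mean", statistics.mean(prices), "stdev", statistics.pstdev(prices))
```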

We all hear the polemics about data formats, the jigsaw puzzle of unstructured data, and the claim that “Big Variety” is the obstacle. Even with standardized SQL queries letting analysts ask any ad hoc question, too many sources and too many pipes from analytic servers cause traffic jams. SQL databases are ideal for ad hoc queries but slow at compiling unstructured data. Aggregating market information is where much of the market’s processing technology is being evaluated today, to meet the requirements of regulation, best-execution sweeps and risk management.

Comparing current stock prices against bids and asks across multiple exchanges, markets, sources, asset classes and clients is essentially the Big Data task of risk management. In addition to managing data changes, firms are also tasked with managing their trading accounts, client portfolios and trading limits, such as with the implementation of Credit Valuation Adjustments (CVAs) for counterparty risk.
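For a sense of the computation, here is a simplified unilateral CVA sketch using the standard discrete approximation, CVA ≈ (1 − R) × Σ DF(t) × EE(t) × ΔPD(t); all inputs are invented for illustration, and real desks derive the exposure profile from large-scale simulation:

```python
# Simplified unilateral CVA over four semiannual periods (illustrative inputs).
recovery = 0.40                                  # assumed recovery rate R
disc    = [0.99, 0.98, 0.97, 0.96]               # discount factors DF(t)
ee      = [1.2e6, 1.0e6, 0.8e6, 0.5e6]           # expected exposure EE(t), $
pd_marg = [0.010, 0.012, 0.013, 0.015]           # marginal default prob per period

cva = (1 - recovery) * sum(d * e * p for d, e, p in zip(disc, ee, pd_marg))
print(f"CVA ~ ${cva:,.0f}")
```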

So why are we still piping data around the enterprise when we just need more compute and memory power? Hardware-accelerated databases such as XtremeData’s dbX and IBM’s Netezza are powered by FPGAs (field-programmable gate arrays), which can now process massive amounts of data at wire speed. Alongside high-performance computing, high-speed messaging technology from companies like TIBCO, Solace Systems and Informatica has redefined transport times in ultra-low-latency terms: from one database to another in single-digit microseconds, sometimes nanoseconds, memory cache to memory cache.

The colloquial phrase “in-database” analytics describes running analytics and computations as close as possible to the data, inside the database where it lives. Fuzzy Logix, an algorithmic HPC vendor, replaces the SAS and R connector analytics that stretch along the wire from the database to the analyst. With Fuzzy Logix, the need to call a database for small files is eliminated because computations run alongside the rest of the database in real time, turning jobs that took days into seconds.

With in-database or in-memory analytics, BI engineers can eliminate transport latency altogether, computing at server speed with the computation sitting inside the database or in memory so that tasks complete locally, not on the transport wire.
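Continuing the earlier connector sketch, the in-database counterpart pushes the computation to the data, so only the aggregate crosses the wire (sqlite3 again as a stand-in):

```python
import random
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, price REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("XYZ", 10 + random.random()) for _ in range(100_000)])

# One row comes back, however large the table: in-database analytics in miniature.
mean, n = conn.execute(
    "SELECT AVG(price), COUNT(*) FROM trades WHERE symbol = ?",
    ("XYZ",)).fetchone()
print(f"mean {mean:.4f} over {n:,} rows, computed where the data lives")
```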

Wall Street is as risk-averse as ever in today’s atmosphere, so the adoption of new technology or new vendors continues to present operational risk challenges. ParAccel is a company that appears to address the operational risk of new-technology adoption by helping firms harness the power of parallel processing for Big Data analytics on OEM hardware.

Since ParAccel is software, an IBM, HP or Dell shop can keep relying on its well-known, established database vendor while gaining next-generation Big Data analytic processing an order of magnitude faster than what is currently in place. ParAccel lets firms aggregate, load and assimilate disparate data sets faster than traditional platforms through its “columnar database” nodal system. The columns in a ParAccel environment give firms the flexibility to first run analytics in-database or in-memory, then bring massive amounts of data to a common plane and, finally, aggregate the unstructured data, all at lightning speed.
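As a toy illustration of the storage idea behind columnar engines (not ParAccel’s actual implementation), an aggregate over a column touches one dense array instead of every field of every record:

```python
import random

n = 100_000
# Row-oriented layout: one record per trade, as a traditional database stores it.
rows = [{"symbol": "XYZ", "price": random.random(), "size": 100} for _ in range(n)]
# Column-oriented layout: one array per field, as columnar engines store it.
price_col = [r["price"] for r in rows]

row_sum = sum(r["price"] for r in rows)   # walks every whole record
col_sum = sum(price_col)                  # streams one contiguous column
assert abs(row_sum - col_sum) < 1e-9      # same answer, very different I/O pattern
```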

Other companies, like NVIDIA, have been building graphics processing units (GPUs) for the video game industry for decades and are now swamped with customer requests to help build parallel computing environments, giving financial firms the ability to run trillions of algorithmic simulations in microseconds for less than $10,000 per card. A single NVIDIA Tesla card can carry up to 2,000 processing cores. A GPU appliance can be attached to a data warehouse for advanced complex computations, and low-latency processing can be achieved by minimizing the movement of data over a short distance, analyzing most of what Wall Street claims is Big Data in seconds compared with the days it takes now.
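For a flavor of the embarrassingly parallel workload GPUs absorb, here is a vectorized Monte Carlo option-pricing sketch; NumPy stands in for a GPU array library such as CuPy, and every parameter is invented:

```python
import numpy as np

rng = np.random.default_rng(42)
paths, spot, strike = 1_000_000, 100.0, 105.0
vol, rate, T = 0.20, 0.01, 1.0

# Terminal prices under one-step geometric Brownian motion, all paths at once;
# on a GPU each path maps naturally onto one of thousands of cores.
z = rng.standard_normal(paths)
s_T = spot * np.exp((rate - 0.5 * vol**2) * T + vol * np.sqrt(T) * z)

# Discounted average payoff of a European call.
price = np.exp(-rate * T) * np.maximum(s_T - strike, 0.0).mean()
print(f"Monte Carlo call price ~ {price:.3f}")
```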

The vendors and players are ready to get to work; there just needs to be some consensus that the Big Elephant in the room is there and it’s standing on a straw when it could be surfing a Big Wave!

Source: Tabb Forum, 02.05.2012, by Dan Watkins, President @ CC-Speed, dwatkins@cc-speed.com

Filed under: Data Management, Market Data, Risk Management, Trading Technology