Financial Markets Are at Risk of a ‘Big Data’ Crash

May 22, 2013

Regulators and investors are struggling to meet the challenges posed by high-frequency trading (HFT). This ultra-fast, computerized segment of finance now accounts for most trades. HFT also contributed to the “flash crash,” the sudden, vertiginous fall in the Dow Jones Industrial Average in May 2010, according to U.S. regulators. However, the HFT of today is very different from that of three years ago. The reason is “big data.”

The term describes data sets that are so large or complex (or both) that they cannot be efficiently managed with standard software. Financial markets are significant producers of big data: trades, quotes, earnings statements, consumer research reports, official statistical releases, polls, news articles, etc.

Companies that relied on the first generation of HFT, in which raw speed alone was used to exploit price discrepancies, have had a tough few years. Profits at ultra-fast trading firms were 74 percent lower in 2012 than in 2009, according to Rosenblatt Securities. Being fast is not enough. We, along with Marcos Lopez de Prado of the Lawrence Berkeley National Laboratory, have argued that HFT companies increasingly rely on “strategic sequential trading.” This consists of algorithms that analyze financial big data in an attempt to recognize the footprints left by specific market participants.
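To make the idea of footprint recognition concrete, here is a minimal, hypothetical Python sketch; it is not the authors' actual method. It assumes each trade has been labeled +1 (buyer-initiated) or -1 (seller-initiated), and it flags rolling windows whose order flow is persistently one-sided, the kind of pattern a large order sliced into many small child orders might leave behind. The window length and imbalance threshold are illustrative assumptions.

    import random
    from collections import deque

    def detect_footprint(trade_signs, window=50, threshold=0.7):
        """Flag windows where order flow is persistently one-sided.

        trade_signs: iterable of +1 (buyer-initiated) or -1 (seller-initiated).
        A sustained imbalance above `threshold` can hint that a single
        participant is feeding a large parent order into the market.
        Window size and threshold are illustrative assumptions.
        """
        recent = deque(maxlen=window)
        alerts = []
        for i, sign in enumerate(trade_signs):
            recent.append(sign)
            if len(recent) == window:
                # 0.0 = perfectly balanced flow, 1.0 = entirely one-sided
                imbalance = abs(sum(recent)) / window
                if imbalance >= threshold:
                    alerts.append((i, imbalance))
        return alerts

    # Toy demonstration: random two-sided flow, then a simulated sliced buy order.
    random.seed(1)
    flow = [random.choice([1, -1]) for _ in range(200)]
    flow += [1 if random.random() < 0.9 else -1 for _ in range(100)]
    print(detect_footprint(flow)[:3])

In practice a strategic sequential trader would combine many such signals, but even this toy detector illustrates the key point: it is the sequence of trades, rather than any single trade, that can betray a participant's presence.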
