- Bill Gates has revealed what frightens him most: an epidemic so severe that it would rapidly wipe out millions worldwide
- South Korea, Saudi Arabia confirm more MERS cases
- Post Holdings says bird flu getting worse for its egg supply
- Birds shipped to incinerator and landfills as bird flu spreads to more counties in Iowa
- Pentagon chief urges end to island-building in South China Sea
- Russia masses heavy firepower on border with Ukraine
- U.S. officials say IRS hackers who stole personal tax info are Russians
- ‘They created these people': Rand Paul blames GOP hawks for rise of ISIS
Financial Markets Are at Risk of a ‘Big Data’ Crash
Regulators and investors are struggling to meet the challenges posed by high-frequency trading (HFT). This ultra-fast, computerized segment of finance now accounts for most trades. HFT also contributed to the “flash crash,” the sudden, vertiginous fall in the Dow Jones Industrial Average in May 2010, according to U.S. regulators. Yet the HFT of today is very different from that of three years ago, and the reason is “big data.”
The term describes data sets so large, so complex, or both that they cannot be managed efficiently with standard software. Financial markets are prolific producers of big data: trades, quotes, earnings statements, consumer research reports, official statistical releases, polls, news articles, etc.
Companies that have relied on the first generation of HFT, in which raw speed alone is used to exploit price discrepancies, have had a tough few years. Profits at ultra-fast trading firms were 74 percent lower in 2012 than in 2009, according to Rosenblatt Securities. Being fast is no longer enough. We, along with Marcos Lopez de Prado of the Lawrence Berkeley National Laboratory, have argued that HFT companies increasingly rely on “strategic sequential trading”: algorithms that analyze financial big data in an attempt to recognize the footprints left by specific market participants.
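To make the idea of footprint recognition concrete, here is a minimal, purely illustrative sketch in Python. It is not a reconstruction of any actual strategic-sequential-trading system; it simply scans a stream of (timestamp, size) trade records for same-sized orders arriving at suspiciously regular intervals, the kind of trace a naive scheduled execution algorithm (such as a simple TWAP slicer) can leave in the tape. The function name `find_repetitive_footprints` and all thresholds are hypothetical.

```python
# Purely illustrative sketch -- not the authors' actual algorithms.
# Assumes trades arrive as (timestamp_in_seconds, size) tuples;
# the thresholds below are hypothetical.
from collections import defaultdict
from statistics import pstdev
from typing import Iterable, List, Tuple


def find_repetitive_footprints(
    trades: Iterable[Tuple[float, float]],
    min_repeats: int = 5,
    max_interval_jitter: float = 0.05,
) -> List[dict]:
    """Flag groups of same-sized trades whose inter-arrival times are
    nearly constant -- the signature a scheduled order slicer can leave."""
    by_size = defaultdict(list)
    for timestamp, size in trades:
        by_size[size].append(timestamp)

    footprints = []
    for size, stamps in by_size.items():
        if len(stamps) < min_repeats:
            continue
        stamps.sort()
        gaps = [b - a for a, b in zip(stamps, stamps[1:])]
        mean_gap = sum(gaps) / len(gaps)
        # Low relative jitter in the gaps suggests a repeated, scheduled order.
        if mean_gap > 0 and pstdev(gaps) / mean_gap < max_interval_jitter:
            footprints.append(
                {"size": size, "repeats": len(stamps), "mean_gap_s": mean_gap}
            )
    return footprints


if __name__ == "__main__":
    # 20 child orders of 500 shares, one every 30 seconds, mixed with noise.
    scheduled = [(30.0 * i, 500.0) for i in range(20)]
    noise = [(7.3, 120.0), (95.1, 800.0), (411.7, 250.0)]
    print(find_repetitive_footprints(sorted(scheduled + noise)))
    # -> [{'size': 500.0, 'repeats': 20, 'mean_gap_s': 30.0}]
```

A real system would draw on far richer order-book data and statistical models, but the underlying logic is the same: mine the data for regularities that reveal who is trading and how.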