Financial Markets Are at Risk of a ‘Big Data’ Crash
Regulators and investors are struggling to meet the challenges posed by high-frequency trading (HFT). This ultra-fast, computerized segment of finance now accounts for most trades. According to U.S. regulators, HFT also contributed to the “flash crash,” the sudden, vertiginous fall in the Dow Jones Industrial Average in May 2010. The HFT of today, however, is very different from that of three years ago, and the reason is “big data.”
The term describes data sets that are so large or complex (or both) that they cannot be efficiently managed with standard software. Financial markets are significant producers of big data: trades, quotes, earnings statements, consumer research reports, official statistical releases, polls, news articles, etc.
Companies that relied on the first generation of HFT, in which sheer speed was used to exploit price discrepancies, have had a tough few years. Profits at ultra-fast trading firms were 74 percent lower in 2012 than in 2009, according to Rosenblatt Securities. Being fast is no longer enough. We, along with Marcos Lopez de Prado of the Lawrence Berkeley National Laboratory, have argued that HFT companies increasingly rely on “strategic sequential trading”: algorithms that analyze financial big data in an attempt to recognize the footprints left by specific market participants.
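To make the idea concrete, here is a minimal Python sketch of footprint detection: it flags a run of trades whose sizes and inter-arrival times are suspiciously uniform, the kind of signature a naive fixed-interval order slicer might leave on the tape. The `Trade` structure, the thresholds, and the synthetic tape are illustrative assumptions for this toy example, not a description of any actual firm’s strategy.

```python
# Toy illustration of the footprint-detection idea behind "strategic
# sequential trading". All names, fields, and thresholds here are
# illustrative assumptions, not any firm's actual method.

from dataclasses import dataclass
from statistics import mean, pstdev


@dataclass
class Trade:
    ts: float  # timestamp, seconds since some reference
    qty: int   # shares traded


def coefficient_of_variation(xs):
    """Std/mean; values near zero indicate near-perfect regularity."""
    m = mean(xs)
    return pstdev(xs) / m if m else float("inf")


def looks_like_slicer_footprint(trades, min_run=5, cv_limit=0.1):
    """Flag a run of trades whose sizes and inter-arrival times are
    suspiciously uniform -- the tell-tale signature of a mechanical,
    fixed-interval order slicer (e.g., a naive TWAP algorithm)."""
    if len(trades) < min_run:
        return False
    gaps = [b.ts - a.ts for a, b in zip(trades, trades[1:])]
    sizes = [t.qty for t in trades]
    return (coefficient_of_variation(gaps) < cv_limit
            and coefficient_of_variation(sizes) < cv_limit)


# Synthetic tape: a participant slicing 5,000 shares into ten 500-share
# child orders, one every 30 seconds -- an easily recognized footprint.
tape = [Trade(ts=30.0 * i, qty=500) for i in range(10)]
print(looks_like_slicer_footprint(tape))  # True
```

Real strategic sequential trading operates on far richer data than this, but the principle is the same: statistical regularity in the order flow can betray the participant who produced it.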