I’m at the SAS industry analyst day in Steamboat Springs, Colorado. There’s been a lot of emphasis here on big data, high-powered analytics, and in-memory computing. It got me thinking about an article last week in Wired about a new academic study on high-frequency trading. The study found that there have been over 18,000 flash events in capital markets trading over the last five years. There’s a strong rent-seeking incentive for traders to locate their systems as close as possible to an exchange so they can take advantage of these events, but detecting and responding to events that last less than a tenth of a second seems problematic, and such events can cascade out of control, as happened in the flash crash of May 2010.
Some companies, like Citigroup, get jerked around like crazy by high-frequency trading, so much so that there’s little correlation between the stock price and the company’s underlying value.
So with analytics vendors like SAS making quantum jumps in processing and analytic speed, could a company like Citigroup use that power to counter high-frequency trading effects? In other words, will it be possible, with a combination of big data analytics and the in-memory capabilities of modern hardware, to detect and respond to events that unfold in nanoseconds?
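To make the detection side of that question concrete, here’s a toy sketch of flagging flash events in a stream of timestamped ticks. Everything here is my own illustration, not anything from the study or from SAS: the `Tick` type, the `detect_flash_events` helper, and the 0.8% move / 100 ms window thresholds are all assumed for the example.

```python
from dataclasses import dataclass

@dataclass
class Tick:
    ts_ms: int     # timestamp in milliseconds (illustrative granularity)
    price: float   # trade or quote price

def detect_flash_events(ticks, window_ms=100, move_pct=0.8):
    """Flag any sliding window of at most window_ms milliseconds in which
    the price swings by at least move_pct percent -- a crude proxy for a
    'flash event'. Thresholds are illustrative assumptions, not values
    from the study."""
    events = []
    start = 0
    for end in range(len(ticks)):
        # Advance the window start until the window spans <= window_ms.
        while ticks[end].ts_ms - ticks[start].ts_ms > window_ms:
            start += 1
        prices = [t.price for t in ticks[start:end + 1]]
        lo, hi = min(prices), max(prices)
        # Record the window if the low-to-high swing exceeds the threshold.
        if lo > 0 and (hi - lo) / lo * 100 >= move_pct:
            events.append((ticks[start].ts_ms, ticks[end].ts_ms))
    return events
```

A real system would of course face nanosecond timestamps, out-of-order feeds, and throughput far beyond a Python loop; the point is only that the core detection logic is a windowed scan, which is exactly the kind of workload in-memory analytics engines are built to accelerate.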