Global financial markets increasingly rely on computers and algorithms.
Between 2:42pm and 3:07pm on 6 May 2010, the Dow Jones Industrial Average plunged 998.5 points before recovering. The ‘Flash Crash’ alerted United States regulators to how computers and algorithms can affect trading. The US Securities and Exchange Commission blamed high-frequency trading (HFT): millisecond, computer-driven arbitrage used primarily by quantitative hedge funds. The trial of ex-Goldman Sachs programmer Sergey Aleynikov also drew media attention to HFT. Today, supercomputers dominate financial market exchanges.
Computers and algorithms have a Wall Street prehistory. Michael Goodkin co-founded the Arbitrage Management Company in 1968 to pioneer computer-based statistical arbitrage strategies. Goodkin recruited the economics and corporate finance experts Harry Markowitz, Myron Scholes and Paul Samuelson as his academic brains trust. Bill Fouse used a Prime mini-computer to develop quantitative tactical asset allocation and index funds. In 1981, Michael Bloomberg founded the company that would sell his now-ubiquitous Bloomberg terminals to Wall Street; Bloomberg LP now competes with Thomson Reuters and Australia’s IRESS in market data. Artificial intelligence, genetic algorithms, machine learning and neural networks each had speculative bubbles as Wall Street experimented with them and marketed black box systems as client solutions.
This experimentation spawned a new generation of academic entrepreneurs.
Academics like Fischer Black and Emanuel Derman moved to Goldman Sachs and thrived in its market-driven environment. In the early 1980s, Wall Street hired physicists and created new sub-fields of knowledge: econophysics, computational finance, and financial engineering. In 1991, Doyne Farmer, Norman Packard and Jim McGill founded The Prediction Company (acquired in 2005 by UBS) to use complex adaptive systems theory to model financial markets. In 1996, Michael Goodkin co-founded Numerix to use Monte Carlo simulations to test trading strategies.
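A minimal sketch of the Monte Carlo idea, assuming a geometric Brownian motion price model and a simple moving-average rule (both illustrative choices, not Numerix’s actual methods):

    import numpy as np

    rng = np.random.default_rng(42)

    def simulate_gbm_paths(s0=100.0, mu=0.05, sigma=0.2, days=252, n_paths=10_000):
        """Simulate daily prices under geometric Brownian motion (an illustrative model)."""
        dt = 1.0 / days
        log_returns = rng.normal((mu - 0.5 * sigma**2) * dt, sigma * np.sqrt(dt),
                                 size=(n_paths, days))
        return s0 * np.exp(np.cumsum(log_returns, axis=1))

    def moving_average_strategy_pnl(prices, window=20):
        """Hold the asset when price is above its trailing moving average, stay flat otherwise."""
        days = prices.shape[1]
        pnl = np.zeros(prices.shape[0])
        for t in range(window, days - 1):
            ma = prices[:, t - window:t].mean(axis=1)
            long_signal = prices[:, t] > ma
            pnl += long_signal * (prices[:, t + 1] - prices[:, t])
        return pnl

    paths = simulate_gbm_paths()
    pnl = moving_average_strategy_pnl(paths)
    print(f"mean P&L: {pnl.mean():.2f}, 5th percentile: {np.percentile(pnl, 5):.2f}")

Running many simulated paths yields a distribution of strategy outcomes, rather than the single number a historical backtest would give.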
Collectively, their work identified new market anomalies and complex dynamics to trade. Their research anticipated a new generation of trading software. Today, US university programs in financial engineering use software platforms like Alphacet, Deltix and StreamBase to develop algorithms for HFT and complex event processing (CEP) systems. Yet these innovations remain largely unavailable in Australian university programs, with the exception of the Capital Markets CRC.
Two former academics offer one compelling vision of how computers and algorithms will reshape Wall Street in the next century. Stony Brook University mathematician Jim Simons formed the quantitative fund Renaissance Technologies, which now employs ex-IBM speech recognition scientists. Columbia supercomputer designer David Shaw founded D.E. Shaw & Company, which employed Jeff Bezos before he founded Amazon.com. Shaw rejects technical analysis (pattern recognition on price and volume) in favour of Karl Popper’s philosophy of falsifiability and event-based studies.
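The distinction can be made concrete. An event study asks a falsifiable question: do a stock’s returns around a defined event differ from what a market model predicts? A minimal sketch with synthetic data and an assumed five-day event window (a textbook illustration, not Shaw’s proprietary method):

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic daily returns for one stock and a market benchmark (illustrative data).
    market = rng.normal(0.0003, 0.01, 500)
    stock = 0.0001 + 1.2 * market + rng.normal(0, 0.008, 500)

    event_day = 250                              # hypothetical announcement date
    estimation = slice(0, 200)                   # window used to fit the market model
    event_window = list(range(event_day - 2, event_day + 3))  # days -2..+2 around the event

    # Fit a simple market model: stock = alpha + beta * market (np.polyfit returns the slope first).
    beta, alpha = np.polyfit(market[estimation], stock[estimation], 1)

    # Abnormal return = actual return minus the model's prediction.
    abnormal = stock - (alpha + beta * market)

    # Cumulative abnormal return (CAR): the quantity an event study tests against zero.
    car = abnormal[event_window].sum()
    print(f"CAR over the event window: {car:.4f}")

A chart pattern cannot be refuted, but a claim that the CAR differs from zero can be, which is why the event-study framing fits Popper’s falsifiability criterion.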
Shaw and Simons’ funds use terabytes of data daily: a forerunner of the current interest in Big Data research. As academic entrepreneurs, they abandoned journal publications and competitive government grants. Instead, they used market arbitrage, economies of scope, highly incentivised staff, private scientific knowledge, and walled gardens to protect their funds’ intellectual property.
Shaw and Simons have, in effect, already lived a decade in a different future from most investors and traders.
HFT and CEP systems are already changing how Australia’s financial markets operate. The Australian Securities Exchange (ASX) and the new Chi-X Exchange both have HFT capabilities, including direct market access: the exchanges now host the ‘co-located’, low-latency computing systems of market-makers, proprietary trading firms and hedge funds. Algorithmic and high-frequency trading now account for higher trading volumes and greater volatility in company share prices. HFT is also blamed for greater inter-market correlation, such as that between the ASX and the Shanghai Composite Index. These trends echo the volatility of commodities and futures markets in the 1970s. More subtly, HFT and CEP systems create knowledge decay, in which new knowledge and faster cycle times make existing knowledge and investment strategies obsolete. Such innovations are unlikely to diffuse to retail investors any time soon.
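The inter-market correlation claim can be illustrated with a rolling correlation of daily index returns; the sketch below uses synthetic series as stand-ins for ASX and Shanghai Composite data:

    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic daily returns standing in for two index series (illustrative only).
    common = rng.normal(0, 0.008, 750)           # shared global factor
    asx = common + rng.normal(0, 0.006, 750)
    shanghai = common + rng.normal(0, 0.009, 750)

    def rolling_correlation(x, y, window=60):
        """Pearson correlation of two return series over a trailing window."""
        out = np.full(len(x), np.nan)
        for t in range(window, len(x)):
            out[t] = np.corrcoef(x[t - window:t], y[t - window:t])[0, 1]
        return out

    corr = rolling_correlation(asx, shanghai)
    print(f"latest 60-day correlation: {corr[-1]:.2f}")

A sustained rise in such rolling correlations is the kind of evidence cited when HFT is blamed for tighter coupling between markets.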
The 2007-09 global financial crisis has prompted a backlash against Wall Street’s computers and algorithms. This backlash resembles the fall of the ‘Masters of the Universe’ traders after the 1980s merger wave and the demise of technology firms after the 1995-2000 dotcom bubble. The University of Edinburgh’s Donald MacKenzie exemplifies the new academic research programs that are emerging: how sociology, and science and technology studies, might contribute to our understanding of financial markets. Barnard College president Debora L. Spar and Columbia Law School’s Timothy Wu caution that regulatory actions can dramatically affect future industry trajectories. A financial world without computers would mean a return to mid-1960s trading: delays in back-office processing, brokerage, clearing and settlement, and lower trading volumes.
In the face of HFT technology, Wall Street traders emphasise craft. Arbitrage opportunities, psychology, and risk and money management are still vital for trading success, they contend. HFT has changed Wall Street in a way closer to Margin Call than to Boiler Room or Wall Street. Interactive Brokers has a more direct future in mind. After the Occupy Wall Street protests in New York’s Zuccotti Park, it ran a new television advertising campaign: “Join the 1%.”