Blog Archives

Capital as (Algorithmic) Process

I’ve just finished reading this great article by Donald MacKenzie in the London Review of Books. The article discusses the move towards the automation of financial transactions in the stocks and shares market and the implications that this has for the geography of finance and, more importantly, for the stability of financial systems. I found it a really fascinating insight into the functioning of financial markets today.

The key process which MacKenzie is discussing is the move from the trading floor to the rented server as the site of financial transaction.

Human beings can, and still do, send orders from their computers to the matching engines, but this accounts for less than half of all US share trading. The remainder is algorithmic: it results from share-trading computer programs.
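The matching engine MacKenzie mentions can be sketched in a few lines. What follows is a hypothetical, deliberately minimal illustration of the core idea — price-time priority matching of incoming orders against a resting order book — not a description of any real exchange’s implementation:

```python
# Minimal sketch of a matching engine's core logic: resting orders sit
# in a book, and incoming orders match against the best available price,
# with arrival order breaking price ties (price-time priority).
# Hypothetical illustration only; real engines are vastly more complex.
import heapq

class MatchingEngine:
    def __init__(self):
        self.bids = []  # max-heap via negated prices: (-price, seq, qty)
        self.asks = []  # min-heap: (price, seq, qty)
        self.seq = 0    # arrival counter: earlier orders match first

    def submit(self, side, price, qty):
        """Match an incoming order against the book; rest any remainder."""
        book = self.asks if side == "buy" else self.bids
        fills = []
        while qty > 0 and book:
            key, seq, rest_qty = book[0]
            best = key if side == "buy" else -key
            crosses = price >= best if side == "buy" else price <= best
            if not crosses:
                break  # no resting order at an acceptable price
            traded = min(qty, rest_qty)
            fills.append((best, traded))
            qty -= traded
            if traded == rest_qty:
                heapq.heappop(book)  # resting order fully consumed
            else:
                heapq.heapreplace(book, (key, seq, rest_qty - traded))
        if qty > 0:  # rest the unfilled remainder in the book
            self.seq += 1
            if side == "buy":
                heapq.heappush(self.bids, (-price, self.seq, qty))
            else:
                heapq.heappush(self.asks, (price, self.seq, qty))
        return fills

engine = MatchingEngine()
engine.submit("sell", 100.0, 50)        # no bids yet: rests in the book
print(engine.submit("buy", 100.5, 30))  # crosses the spread: [(100.0, 30)]
```

Whether the order arrives from a human at a terminal or from a share-trading program, this is the same mechanism it hits — which is partly why the location of the machine running it, and the latency to reach it, have become so economically significant.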

Many new recruits to financial firms are maths and physics PhD students who will join large programming teams to develop financial algorithms based on physical models previously used to understand processes such as the expansion of gases. Finance is yet another industry in which we are seeing the replacement of living labour with machines.

The bulk of the research also suggests that automated trading makes the buying and selling of shares cheaper and usually easier. Renting rack space in a data centre may be expensive, but not nearly as expensive as employing dozens of well-paid human traders.

This move to computers is helping to shape the new geography of financial transaction, with warehouses in the desert hosting vast arrays of servers becoming key sites of financial exchange. MacKenzie tells a fascinating story of the changing temporality of finance, as the millisecond and microsecond become the key units in which his tales of algorithms stalking data corridors in search of profit take place. This is all very interesting, but there are also, in my opinion, a few other things we can take from this:

  • This is certainly another argument against the whole ‘evil banker’ hypothesis so prevalent in many oppositional politics. It is not morally suspect humans who are to blame for the financial crash but the interplay of a variety of factors within a highly complex yet structured system. A critique of capital cannot stop at ‘hanging the bankers’ or a critique of the fictional or virtual financial economy.
  • Also, the concept of algorithms is useful when it comes to attempting to analyse the development of capitalism. We need to move away from previous understandings of capitalist development as the unfolding of immutable laws. This is a closed understanding of Capital, in which the possibilities of moving beyond capital, and our agency to affect this, are highly limited. By changing our vocabulary from the law of capital to that of algorithms, or processes, our perspective changes. Rather than a unified, unstoppable juggernaut we begin to see a more chaotic, open and ultimately more fragile set of processes which we can collectively label as capital(ism). A set of processes whose strength, hyper-flexibility, is also its major (internal) weakness. Recognising the inherent fragility of capital is vital if we wish to move beyond and against it.

MacKenzie then goes on to discuss a mini-crash which lasted for 20 minutes on the 6th May, 2010. A crash which saw the overall value of US shares fall 6% in 5 minutes, a crash in which Accenture’s shares fell from $40.50 to a cent, whilst Sotheby’s shares spiked to $99,999.99. Through MacKenzie’s exploration of this mini-crash we are given a brief glimpse into some of the processes helping to shape the current financial context we find ourselves moving within. MacKenzie ends with concerns over a financial system which is both highly complex and tightly coupled – a system with little decision time built into it. Whether operated by humans or computer programmes, the stocks and shares market is still highly volatile. Overall, a really great article and well worth a read.