Warren Ellis on the future as weather

To be absolutely clear upfront, I didn’t write the following post. Warren Ellis did. It’s a text companion piece to the How The Light Gets In festival, and was sent in the latest edition of his newsletter, Orbital Operations. I found it to be such a quietly wonderful and impressive piece that I decided to reprint it here, without his permission.


Thinking about the future isn’t rocket science, and neither is it archery.  Time’s Arrow is a seductive idea. The notion that time travels in a straight line, and that we rise into a single future. Or, if you’re feeling oppressed and cranky, that the future is a single cutting head diving down towards us. It simplifies things nicely, and makes passengers of us. It is convenient and sometimes comforting to believe we have no agency in the face of Time’s Arrow. The Future is coming and we’re trapped on the ride.

I stopped believing, a long time ago, that the future works in that way, no matter how many rocket scientists you throw at it.

It works like The Great Storm of 1987. If you weren’t in it, your friends and parents will have talked about it. I lived by the water’s edge on the estuary at the time, and woke up to a two-foot drift of sand piled against my front door. Walking to work was the full Walking Dead experience: totally alone, walking up the middle of the road through the debris and the shattered glass, the only real difference between that and a zombie show being the ancient trees laying across dual carriageways and sticking out of houses. You’d think that, as a bloody great 122mph meteorological convulsion that killed twenty-odd people and caused two billion pounds’ worth of damage, a Great Storm would be an unmissably large piece of the future smack dab in the middle of Time’s Arrow’s path. But no. The projections were off. Prediction will always get you into trouble, as anyone who ever talked about flying cars will tell you.

Living on the estuary, I have a different sense of the future. I watch weather systems bounce around all over the estuary. The path of any one of them is determined by a dozen different elements interacting and colliding. Out here on the water, you understand how meteorology is a nightmare discipline. Look at America’s recent experience with the “winter weather bomb” that was on course to destroy the East Coast, until it kind of didn’t. Part of that is down to the hyperbole that accompanies prediction. Part of it is down to the nature of single prediction, the stuff of futurism, itself.

From this perspective, the Great Storm was the mobile phone. It could be seen in the distance, along with a dozen other swirls of stormy weather, but we had no idea it would hit hard enough to change the shape of the world. It hit hard enough to break science fiction, one of our traditional early-warning stations, and it became interpolated into and interrogated by contemporary and popular fiction without science fiction ever getting to lay a finger on it. “Gene-editing,” as a phrase, has broken into the world without a science-fictional genesis. This will keep happening if we keep believing we’re on a single rail to the future and that single prediction is the way to see ahead.

The future is a weatherfront, and attempting to predict single lightning strikes is stupid and wasteful. Understand the future as weather, and yourself as standing on the shore looking out to the horizon. Breathe the air and watch the water. There are dozens of different systems acting on the approach of the future. In order to get a handle on what’s coming, you need to be talking to and working with and keeping an eye on many different fields. Not just “technology.” The future is also always social, and economic, and political, and many other things besides, and those things act on the path of the storm. And, if you’re standing on the shore, you know that there are a lot of storms out there, and any one of them could hit like a hurricane. If this sounds good to you, then, please, get to it. Because we’re running out of reliable early-warning stations.

The internet vernacular

I’m fascinated by this micro-trend of video that replicates the experience of using modern communication software, especially messaging and social media. For example, this music video by the Japanese artist Aimyon, which takes place almost exclusively in the LINE messenger:

And the video for the song Run and Run by the Japanese girl band lyrical school takes the idea even further, playing with the whole iPhone interface. More than that, it’s formatted for mobile, so it looks brilliant when viewed full screen on your phone.

This idea isn’t exclusive to music videos. There’s a short film from 2013 called Noah, a romantic drama that’s set on desktop and mobile, and uses not only the language of internet comms, but also its effects — the paranoia that Facebook can bring to relationships (NB this film is NSFW).

And the 2015 film Unfriended carries the conceit even further; it’s a full-length horror story that takes place on a single desktop across Skype, messaging, and the web browser, with a story that’s drawn from real online life.

These videos could not have been made ten years ago; they rely on a shared knowledge of technology (and the smartphone especially) that’s only been common since around 2008. They use the dialect of the globalised online population: the internet vernacular.

I’d love you to send me more examples if you know of any.

Twitter, Listening to Users, and Murder

I saw this cartoon gain a few thousand retweets on Twitter. In it, a Twitter executive asks three colleagues how they should grow the service. One colleague says “Algorithms”; another, “Moments”; a third says “Listen to users”. This third response angers the executive, to the point that he throws the man who suggested it out of a window (it’s at least a second floor window, so this is presumably murder).

What I infer from this cartoon is that the author believes that Twitter doesn’t listen to its users, but should.

So what do Twitter users want? The #RIPTwitter hashtag¹ reveals a few common demands.

One is that Twitter needs an edit button. That opens the door to so much potential abuse that I can’t believe it’s seriously being proposed, let alone considered:

If you get angry at people who retweet bigotry, abuse, or marketing material, just imagine how you’ll feel when you find out you’re the one retweeting it.

Another is that Twitter needs to concentrate on stopping abuse. Brianna Wu, who has more reason than most to want an end to Twitter abuse, says that this position is nonsense:

As someone that works with Twitter frequently on harassment, I feel uniquely qualified to say… [this] is bullshit. Twitter’s harassment outcome is improving. I have documented, statistical proof it’s improving.

So, given that two of the most popular user requests are rubbish, that many vocal Twitter users really just want to preserve the status quo, and that Twitter’s growth continues to stall, my take on the cartoon is that the executive was right to get angry at the person who suggested they just listen to users.

Although I don’t condone murder.


Yes, I write about Twitter quite a lot. That’s because it’s important to me; I use it every day. I want to see it succeed, and I want to see it improve. And, as M.G. Siegler notes in Tempest in a Tweetpot:

Change is always scary — especially on the internet. But time goes on, we move on, and everyone is often happier as a result.


¹ Ironically, Twitter search already ranks results with a non-chronological algorithm, and is better for it.

The Victorian computer pioneers ahead of their time

I’m reading Sydney Padua’s The Thrilling Adventures of Lovelace and Babbage. It’s a fun alternate history, told in comics, of the work of Charles Babbage and Ada, Countess of Lovelace—between them, the precursors of (respectively) automated computing and computer programming (for the unfamiliar, Stephen Wolfram’s Untangling the Tale of Ada Lovelace puts their work and relationship into perspective).

The stories themselves are charming, but the real highlights of the book for me are the extensively researched footnotes and endnotes. In them, I learned that Babbage, at the International Exhibition of 1862, met and conversed with George Boole, a logician famous for creating what we know today as Boolean logic—the three operations AND, OR, and NOT that power modern digital systems. Why is this meeting so important? As one bystander wrote:

As Boole had discovered that means of reasoning might be conducted by a mathematical process, and Babbage had invented a machine for the performance of mathematical work, the two great men together seemed to have taken steps towards the construction of that great prodigy, a Thinking Machine.

Boole’s ideas led, almost a century later, to the creation of logic gates, critical to digital systems. Logic gates need a fast electronic switch, and early ones used valves, or vacuum tubes. These were developed from the pioneering electrical research of Michael Faraday, scientist of electromagnetism—and friend of Babbage.
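To see why Boole’s three operations matter so much, here’s a minimal sketch (my own illustration, not from Padua’s book) of how AND, OR, and NOT compose into arithmetic—the kind of thing Babbage’s engines did with gears, and logic gates later did with valves:

```python
# Boole's three operations, modelled as functions on single bits (0 or 1).
def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def NOT(a):
    return 1 - a

# Composing them yields arithmetic: a half adder sums two bits,
# producing a sum bit and a carry bit.
def half_adder(a, b):
    # XOR built purely from AND, OR, and NOT
    total = AND(OR(a, b), NOT(AND(a, b)))
    carry = AND(a, b)
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} = carry {half_adder(a, b)[1]}, sum {half_adder(a, b)[0]}")
```

Chain enough half adders together and you can add numbers of any size—which is essentially what every digital computer still does.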

It’s incredible to think that Lovelace, Babbage, Faraday, and Boole were contemporaries; the developers of the engine, programs, power and logic of modern computers were all connected in the mid-19th century, but too far ahead of their time to see their concepts realized. Faraday’s valves could have enabled Boole’s logic gates, which could have made Babbage’s machine simpler and more affordable, which could have seen Lovelace’s programs be more than theoretical. But it would be almost another 100 years before technology caught up to their ideas.