At the age of 74, Werner Herzog has just made his 38th feature film, a documentary about the Internet titled “Lo and Behold: Reveries of the Connected World”. The German director shows no sign of the age-related decline that has affected so many of his peers, such as Martin Scorsese or Woody Allen. Of course, the fact that Allen hasn’t made a watchable film in over 35 years makes one suspect he never lived up to his accolades to begin with.
Unlike the mega-celebrity from Hollywood, Herzog belongs to that rarefied world of “foreign” or “independent” films that inevitably get screened in art houses and rarely get nominated for Academy Awards. In other words, he makes the sort of film I live for, starting with his narrative film “Aguirre, the Wrath of God” that I saw in 1977 and that made me a devoted fan. It starred Klaus Kinski, a member of Herzog’s repertory company at the time, as a deranged conquistador determined to find the lost city of El Dorado to seize control over its legendary riches, even if it cost the lives of every man in his expedition. In the final gripping scene, he is the sole survivor adrift on the Amazon River with monkeys overrunning his raft. It was not a stretch for a Marxist like me to see it as a critique of colonialism even though 7 years later I would be dismayed to discover that he had made a TV documentary taking up the cause of the Miskitos in Nicaragua. It was only 10 years later that I figured out that the Atlantic Coast Indians had legitimate grievances and that Herzog was right to make such a film.
If there is one thing you can predict about a Werner Herzog film, it is that it will be unpredictable. “Lo and Behold” is nominally a series of interviews with pioneering computer scientists like UCLA’s Leonard Kleinrock, but they are refracted through Herzog’s off-kilter sensibility, which runs through the film like a black thread. For example, upon completing his interview with Ted Nelson, who anticipated the rise of the World Wide Web, he allows Nelson to take his photo like a fan—a gesture that defies conventional documentary technique, to say the least.
Herzog is obviously fascinated by the computer scientists who come across as gee-whiz techno-optimists who clash with his own darkly absurdist vision of life even as he shares their breathless testimonies to the spectacular rise of the Internet. It is reminiscent of his near-obsession with the German-American jet fighter pilot Dieter Dengler who was shot down over Vietnam. In the 1997 documentary “Little Dieter Needs to Fly”, he describes how Dengler became obsessed with flying after seeing Allied fighter-bombers destroying his German village during WWII.
If you’ve seen that film, you will understand why Herzog seems just as fascinated with Elon Musk, whose SpaceX company is building rockets intended to establish a colony on Mars. Like Dengler, Musk finds in flight something close to eternity, or at least a taste of it. In explaining the need for colonizing Mars, Musk describes it as a hedge against something “going wrong” on Earth, whether through a manmade or a natural disaster. Will we have an Internet on Mars, Herzog playfully asks. With a cold smile, Musk says that we will, after sending up a few satellites to circle the planet. One can hardly escape the feeling that we are in the company of someone who would have made Aguirre blanch.
The film does not limit itself to the Internet. It is also devoted to displaying the latest in robotics and interviewing the geeks who work in the field. We meet Joydeep Biswas, a Carnegie Mellon engineer who displays six-inch-tall soccer-playing robots that dart about a miniature field scoring goals against each other. He has a particular fondness for robot number 8, which seems to be just a cut above the others. When Herzog asks Biswas if he loves that robot, the engineer grins sheepishly and admits that he does. It is a priceless moment.
If most of the film is devoted to the wonders of the Internet, Herzog makes sure to illustrate its dark side. He interviews the Catsouras family at their home in Orange County, near Los Angeles. The father, mother and three teenage daughters sit around their dining table as the father describes the trauma they faced after a fourth daughter died in an auto accident in 2006. When a photograph taken by state troopers showing her nearly decapitated body leaked onto the Internet, the family was horrified to watch it go viral and to find themselves unable to suppress it after a judge ruled that a dead person has no right to privacy. Mrs. Catsouras tells Herzog that, to her, the Internet is the Antichrist.
As the conversation with the Catsouras family over these grisly matters unfolds, your attention is fixed both on them and on the three trays of baked goods sitting on the table in front of them. The contrast between the muffins, cakes and cookies and their woeful experience could hardly be more striking. It makes you wonder whether they prepared the goodies for the film crew and Herzog decided to leave them on the table precisely for their macabre counterpoint to the matters under discussion. I am sure he did.
After the press screening, I chatted briefly with NY Times film critic A.O. Scott about “Lo and Behold”. He was a bit surprised that I did not have much to say about Herzog’s utter lack of attention to the frequently aired concerns about the political implications of the Internet’s explosive growth. There is nothing about the monopolistic tendencies of Jeff Bezos, the NSA’s ability to snoop on our emails or phone calls, the threat of cyberwarfare, and the like. Scott was right, of course, but I doubt that Herzog had much interest in making the kind of film that Laura Poitras would have made. Herzog is primarily interested in human psychology, and particularly what some might consider abnormal psychology. With his command of cinematic techniques gathered over nearly four decades and a sense of the absurd matched by very few filmmakers today, Werner Herzog marches to the tune of his own drummer. “Lo and Behold” opens at the Lincoln Center Film Society on August 19th and better theaters everywhere. Highly recommended.
On May 7th a man named Joshua Brown died when his Tesla smacked into a trailer truck that the autopilot system mistook for the sky. Brown was a Navy SEAL veteran who had served in the Special Warfare Development Group, the elite unit that killed Osama bin Laden. His specialty had been dismantling bombs in Iraq. Little did he realize that he would be killed by a bomb set to go off the first time his car’s onboard computer system malfunctioned.
Apparently Brown was obsessed with his car and its supposedly miraculous ability to forestall highway accidents. He made many YouTube videos about his passion, including a recent one that showcased the car’s uncanny ability to avoid collisions.
The Guardian reported that Brown was watching a Harry Potter video when his Tesla careened into the trailer truck, so we can conclude that magic did not come to his rescue. The paper described the circumstances of the collision:
According to Tesla’s account of the crash, the car’s sensor system, against a bright spring sky, failed to distinguish a large white 18-wheel truck and trailer crossing the highway. In a blogpost, Tesla said the self-driving car attempted to drive full speed under the trailer “with the bottom of the trailer impacting the windshield of the Model S”.
One imagines that Brown invested so much faith in the car and in his own invincibility because he ran a technology consulting company called Nexu Innovations, dedicated to “Making a Difference in Our Flattening World”. Of course, the concept of a “flattening” world comes straight out of the Thomas Friedman playbook. Friedman has been churning out columns on how outsourced tech-support help desks in Ghana and the like would be the answer to the world’s woes, and wherever that failed, the Navy SEALs could step in and straighten things out.
My immediate reaction to the news of his death was to tell my wife that we should be grateful that Ronald Reagan’s Strategic Defense Initiative, aka Star Wars, was never implemented. Back in 1983, when I was getting re-politicized around the Central American guerrilla struggles, I also decided to join Computer Professionals for Social Responsibility, a group that made blocking the implementation of SDI a high priority.
The technology of SDI and the Tesla autopilot system are both based on artificial intelligence, in effect giving computer systems the capability of a human eye matched to a functioning brain that follows certain pre-established rules. With Tesla, the goal is to avoid collisions. With SDI, the goal was to cause them—specifically, to smack into and blow to smithereens Soviet missiles that encroached upon American airspace. Reagan’s goal was to provide a nuclear shield that would give the USA a big advantage in a Cold War that might turn hot. Many people, including someone like me who took part in “duck and cover” drills in elementary school in the 1950s, were terrified by the notions being put forward by Reagan and his cohorts.
Reagan believed that missiles could be “recalled” as if they were remote-controlled model airplanes. Even more ghastly were the reassurances of Thomas K. Jones, Reagan’s Deputy Under Secretary of Defense for Research and Engineering, that the USA could recover from a nuclear war with Russia in two to four years. Jones once said, “If there are enough shovels to go around, everybody’s going to make it.” We were supposed to use the shovels to dig a hole in the ground (can you imagine New Yorkers running to Central Park with the H-bomb on the way?) that would be covered with a couple of doors and three feet of dirt on top of them. Jones said, “It’s the dirt that does it.”
As it happens, there is a morbid connection between this doomsday scenario and the capitalist who started Tesla. Elon Musk is not the only manufacturer pioneering such cars, but he is the only one who pushes the idea that an autopilot system capable of changing lanes already exists in his automobile. Others working in the field, such as Volvo, Mercedes and Toyota, never saw the technology as good for much more than parking assistance.
Mary “Missy” Cummings, a Duke University robotics professor and former military pilot, told the Guardian that Tesla should disable its autopilot system for navigating multilane expressways. “Either fix it or turn it off … The car was in a place where the computer was blind. The computer couldn’t see the environment for what it was.”
In addition to Tesla, Musk is investing in space travel. He is interviewed by Werner Herzog in “Lo and Behold”, a documentary on computers, the Internet and robotics that opens on August 19th. Herzog, who is much more interested in the “gee whiz” personalities of the men he interviews than in their political or social ambitions (a point A.O. Scott made to me that I had not even considered), was goggle-eyed as Musk spelled out the need for colonizing Mars if “something goes wrong” on Earth.
The company is called SpaceX, and it hopes to make its first Mars launch in 2022. In a 2013 interview with the Guardian, the man who made his billions from PayPal said he was inspired to shoot for colonizing Mars after reading Isaac Asimov’s “Foundation” science fiction series, whose main character Hari Seldon anticipates the collapse of the Galactic Empire, which encompasses the entire Milky Way. To save humanity, he creates a think tank that develops the technology to launch a new galactic empire.
Musk told the Guardian, “It’s sort of a futuristic version of Gibbon’s Decline and Fall of the Roman Empire. Let’s say you were at the peak of the Roman empire, what would you do, what action could you take, to minimise decline?”
The answer for Musk is technology.
“The lessons of history would suggest that civilisations move in cycles. You can track that back quite far – the Babylonians, the Sumerians, followed by the Egyptians, the Romans, China. We’re obviously in a very upward cycle right now and hopefully that remains the case. But it may not. There could be some series of events that cause that technology level to decline. Given that this is the first time in 4.5bn years where it’s been possible for humanity to extend life beyond Earth, it seems like we’d be wise to act while the window was open and not count on the fact it will be open a long time.”
In James Joyce’s “Ulysses”, Stephen Dedalus says “History is a nightmare from which I am trying to awake.”
This is our nightmare, comrades. We have a capitalist class that is planning to colonize Mars in order to escape from the disaster it is now creating on Earth. Musk says he expects his business to be profitable since there will certainly be 80,000 people willing to pay the big bucks to flee a planet that has been consumed by nuclear war, catastrophic Noah’s Ark type flooding because of climate change, epidemics caused by viruses unleashed by the penetration of rain forests, or some other unforeseen disaster.
Musk is not the only capitalist with “escape” plans. Jeff Bezos, the filthy predator who runs Amazon, is investing in Blue Origin, a space travel company that does not aim to colonize Mars—a place Bezos writes off as uninhabitable—but instead hopes to launch huge satellites that would orbit a post-apocalyptic planet Earth. In an interview with the Miami Herald conducted shortly after his high school graduation (he was class valedictorian), he said he wanted to build space hotels, amusement parks and orbiting colonies for two or three million people. We have no idea what Bezos’s plans are today, but one suspects they are much more in line with Musk’s: to create a sanctuary for 80,000 or so people who share his bourgeois values.
One thing we can be certain about: if people like Bezos inhabited a space station, they’d probably kill each other before the year was up, given what they are doing to the planet today.
The Rod Holt character in the 2013 narrative film “Jobs”. Note the SWP poster on the wall.
Last night I attended a press screening for Alex Gibney’s documentary “Steve Jobs: The Man in the Machine”, which opens in theaters and on VOD on Friday, September 4th. The film is a brilliant analysis of both the man and the company he built. Since Gibney’s last documentary was on Scientology, it was natural to wonder whether he had decided to take on another cult. When Jobs died, Gibney was struck by the mass grief that poured out for the CEO, after the fashion of Princess Di. What explained such devotion? Since Gibney owned and treasured his iPhone, this was a question that provoked him into making this film. As someone who likes but does not exactly love his MacBook, and who spent 44 years working as a systems analyst and programmer, I also find the question of Apple’s place in the American economy and society of great interest.
There’s another connection. Back in 1967 I met Rod Holt in the New York branch of the Socialist Workers Party, a wiry fellow with close-cropped hair whom I found more interesting than most party veterans because he was an engineer and had raced motorcycles—not the typical resume for a Trotskyist. Years later I learned that Holt had become one of the five founding members of Apple. That spurred me to watch the 2013 narrative film “Jobs” on Amazon streaming, which includes Holt as a minor character. This review will take up both films as a prelude to the new film about Steve Jobs by Danny Boyle that premieres at the Lincoln Center Film Festival next month.

It is understandable why three films have taken up the Steve Jobs story. Apple now enjoys the highest capitalization of any American corporation at 724 billion dollars, twice that of ExxonMobil. If films like “There Will Be Blood” and “Citizen Kane” documented the ugly character of previous generations of the bourgeoisie, the three films about Steve Jobs bring us up to date on how the computer revolution turns entrepreneurs into monsters—the latest report on Amazon’s treatment of white-collar workers bears this out. In many ways, Jobs was the prototypical Silicon Valley terror, anticipating Jeff Bezos, Mark Zuckerberg and Sergey Brin. Unlike these more recent avatars of the computer ruling class, Steve Jobs was shaped by the 1960s counterculture, as both Gibney and Joshua Michael Stern, the director of “Jobs”, make clear. A devotee of Eastern religion and Bob Dylan, long-haired, with a background in phone-phreaking escapades, Jobs seemed the least likely candidate to build a corporation twice as big as ExxonMobil—itself the product of a merger of two behemoths. Figuring out how that took place is exactly what drove Alex Gibney to make the most important documentary of 2015.
Born in 1955, Steve Jobs was in his late teens during the biggest shake-up in American society since the 1930s. Unlike me, ten years his senior and ten years Rod Holt’s junior, Jobs was far more interested in Enlightenment than in socialism. You have to remember that the thirst for spiritual transcendence ran very deep at the time, powerful enough to turn the antiwar leaders Rennie Davis and Jerry Rubin into seekers after transcendence, in the form of a Hindu guru’s cult and EST respectively. EST was a training program founded by Werner Erhard, designed to help yuppies solve problems after the fashion of Scientology; Erhard cobbled together techniques he had picked up from Zen Buddhism and psychotherapy. The CEO of the consulting company I worked for in the 1980s was an EST follower, although he never foisted his beliefs on me. The idea that Zen Buddhism could be a guide to business success for both Steve Jobs and my boss might seem strange at first, but one must never forget that Zen Buddhists were gung-ho for Japanese imperialism in WWII, as I mentioned to Gibney in the Q&A.
Like so many others of his generation, Jobs went to India on a pilgrimage to seek Wisdom with his friend Daniel Kottke, who would become one of Apple’s earliest employees. As both films point out, Jobs decided to allocate zero shares to Kottke when Apple was incorporated. He was very good at throwing people under the bus. When Jobs was at Atari in his first real job, the boss offered him a $5000 bonus if he could come up with a really great game. Needing hardware assistance, he recruited Steve Wozniak, who was told that he could have half the bonus if they succeeded. But Jobs lied and told Woz that the bonus was only $750.
That’s not the half of it. When Jobs’s girlfriend became pregnant, he retained a lawyer to help him avoid paying child support, claiming that she had screwed around so much that nobody could tell who the father was. Eventually a DNA test proved that he was the father. Even if he wasn’t, his millions could have easily helped to support the child of someone with whom he had been intimate.
Gibney gives the devil his due. In capturing Jobs’s single-minded devotion to crafting user-friendly and beautiful machines, he reminds you of why Apple became dominant. Unlike Detroit, Silicon Valley was always much more sensitive to marketing trends, since so much of personal computing was driven by taste. And once Apple embarked on making products like the iPod, the iPhone and the iPad, it became possible for consumers to feel that the computer really was an extension of themselves. Gibney wonders, however, whether this came at the expense of the social bonding that was so important in the 1960s. If you go into a restaurant nowadays, you will often find a family of four fixated on their iPhones as each course is delivered, with conversation going by the wayside. The phone becomes worry beads that you can’t keep your hands off.
The final fifteen minutes or so of Gibney’s film is a rather scathing summary of Jobs’s misdeeds, from avoiding taxes to screwing Chinese workers out of a living wage while polluting their rivers with industrial waste. Of particular interest is how Jobs used a special task force of Silicon Valley police to go after a Gizmodo reporter who had written about an early-release iPhone that a drunken Apple employee had left behind in a bar and that had come into his hands. Even after the phone had been returned, the cops raided the reporter’s home and carted off computers and other valuables. When asked by a TV interviewer why he had resorted to such repressive measures, Jobs replied that he was trying to uphold Apple’s “values”. In the Q&A, someone asked Gibney what question he would have asked Jobs if he had had the opportunity to interview him. He replied that he would have asked him to define his “values”.
I can’t recommend “Steve Jobs: The Man in the Machine” highly enough. For my money, Alex Gibney is the best documentary filmmaker working today, an equal to Werner Herzog. With 35 credits to his name, including “Taxi to the Dark Side” about the American torture regime, Gibney combines acute social analysis with fluid documentary techniques. As is always the case with documentaries, there is a need to tell a story just as much as there is with narrative films. Since Gibney described Jobs as someone who excelled in telling a story, this is a film that was a perfect match of filmmaker and subject matter.
Despite the fact that only 27 percent of critics found it “fresh” on rottentomatoes.com, I consider “Jobs” to be a compelling film with a remarkable fidelity to the facts, at least based on its close parallels with Gibney’s documentary. Of course, since 98 percent of critics found the wretched “Mad Max: Fury Road” to be “fresh”, there’s no accounting for aesthetic judgments among my peers.
Although I am by no means an Ashton Kutcher fan, he captured the essence of Jobs as a brilliant martinet with about as much warmth as a lamprey eel. Since the film does not try to deal with Apple’s disgusting behavior overseas, most of the negative side of the Jobs ledger is devoted to his treatment of his girlfriend and workmates, including Daniel Kottke.
Much of the drama is centered on boardroom fights with CEO John Sculley, who is depicted as a hidebound bureaucrat more concerned with quarterly earnings than with Apple’s mission as a corporation that “thinks different”. The most interesting scenes, however, involve Jobs’s interaction with the fellow designers and engineers who were on his wavelength. No matter how much of a prick he was, he appears to have been a very good judge of talent and an inspirer of those who chose to walk the same road with him.
One aspect of the narrative film that is passed over in the documentary is how Jobs and Wozniak presented their first computer—nothing but a circuit board—to the Homebrew Computer Club in the Bay Area, where it barely drew notice from those in attendance.
In my very first article on the Internet, which was a review of a book about the personal computer industry called “Hackers”, I referred to Homebrew:
So enamored of the idea of personal computing were Felsenstein and Halbrecht that they then launched something called the Homebrew Computer Club. The club drew together the initial corps of engineers and programmers who would launch the personal computer revolution. Among the participants were a couple of adolescents named Steven Jobs and Steve Wozniak who went on to form the Apple Corporation.
The hacker ethic that prevailed at the Homebrew Computer Club was decidedly anticapitalist, but not consciously pro-socialist. Software was freely exchanged at the club and the idea of proprietary software was anathema to the club members. There were 2 hackers who didn’t share these altruistic beliefs, namely Paul Allen and Bill Gates. When Allen and Gates discovered that their version of Basic that was written for the Altair was being distributed freely at the club, they raised hell. The 19-year-old Gates stated in a letter to the club: “Who can afford to do professional work for nothing?”
Those who have followed the personal computer and Internet revolutions for the past 25 years or so are aware of the tension between private and public that remains unresolved. For every scumbag like Zuckerberg anxious to enjoy the kind of monopoly that IBM once had in the mainframe business, there are others willing to work for free on Wikipedia, open source journals, free software, and the like. If capitalism creates the technology that allows the instant communication links that make runaway shops feasible, it also creates the networks that allow activists to build solidarity across national boundaries in opposition to capitalist exploitation. This is the contradiction that marks late capitalism more than any other.
Nobody better represents the intersection between public and private than Rod Holt, who was the lead engineer on the Apple II and who worked on the Macintosh as well. In the Q&A I told Gibney that Holt paid tribute to Jobs not long after he died, even though his values clashed with those of his fellow Apple pioneer. Here is what he told Marxmail subscribers:
Just a remark here:
Dear Comrades:
Concerning Steve Jobs:
I worked with Steve from the Summer of 1976 to his ouster. I was responsible for the Apple II hardware design and its manufacture. I was in charge of the Macintosh group until its launch in 1984. I was twice anointed with the title “Apple Fellow”. I’m sick and tired of people making judgements without the slightest idea of what they are talking about. They buy the official myths fabricated by various individuals around Apple (including the 2 Steves themselves). I have in my possession enough original documents to back up what I am saying.
There were 5 (five) founders of Apple Computer:
Mike Markkula, Chairman of the board of directors
Mike Scott, CEO and President
Steve Jobs, V.P. of Marketing
Steve Wozniak, V.P. Software
Rod Holt, V.P. Engineering
We were incorporated in the state of California effective Jan. 1, 1977 with the above 5 officers. Apple Computer had never been incorporated earlier.
I will just say here that the history of Apple in Wikipedia is seriously incorrect. Most other histories are also wildly wrong. Some of this was deliberately done by Steve Jobs, but most can be attributed to sloppy journalism. Some is due to bad memories.
Steve Jobs wanted products that he would buy and use. For the rest of Apple, the creators produced what they wanted to buy. The success stemmed from this simple set of motives.
Marxists should understand that the Apple products grew from the social environment of these times in silicon valley. There was a confluence here of what we, the designers, wanted and what the world wanted. I could go into more detail if there were room and time, but really, that’s the story.
Jobs was very, very bright, a genius perhaps. So was Woz. And Scotty too. We never lacked for brains. One of Steve’s remarkable abilities was that he listened. I would get into a fierce argument with him, go into the executive staff meeting and be floored when he would take my position exactly, understanding every bit of my arguments, re-phrase them and then convince everyone. I’ve never to this day met anyone that could dispute and at the same time listen so well.
But, for heavens sake, let’s remember that leaders of corporations have to make profits or else they are on the street looking for a job. Steve Jobs wanted a billion happy customers, a goal he could reach only as a super-capitalist. So that’s what he became. It wasn’t where he started, but that’s what happened. The fact that so much ink is expended by the press is embarrassing, but that’s just the byproduct. I’m sure he would be as embarrassed as I am now.
==================
If anyone wants particulars from me, he can ask.
Thanks,
–rod
Recent photo of Steve Wozniak and Rod Holt
* * * *
Update
Posted by Rod Holt to Marxmail on August 29th:
Remarks on Steve Jobs as a Phenomenon
[The producers of the first Jobs movie, “Jobs” kindly loaned a preprint to the Roxie Theater in San Francisco so that my old friends and Apple co-workers could have a party—which we did, wall to wall.
After the showing that Thursday afternoon, here and there, I offered my opinion on the movie and its social meaning. That raised a few eyebrows and more questions. I have since been asked to explain myself, a reasonable request. Since my outlook differs a lot from that of many of us, I thought it proper to clarify what I meant when I talked about Steve as being intrinsically anti-capitalist. By that I meant that Steve was opposed to the “alienation of labor”, while the alienation of labor is intrinsic to capitalist production.
The term “alienation of labor” is a technical term, and like many in philosophy and economics, doesn’t quite mean what one would think. The shortest explanation of the concept is found in Wikipedia. Of course, the concept is not the property of Marx but has been part of the thinking of many thinkers since the rise of capitalism.
In the Wikipedia article, there is a quotation where Marx imagines production with non-alienated labor:
“… In your enjoyment, or use, of my product I would have the direct enjoyment both of being conscious of having satisfied a human need by my work, that is, of having objectified man’s essential nature, and of having thus created an object corresponding to the need of another man’s essential nature. . . . Our products would be so many mirrors in which we saw reflected our essential nature.”
Steve Jobs wanted his products enjoyed as expressing his essential nature, and therefore in the general sense, he was an artist with the development team and its laboratories as his studio.
In the capitalist system, products are produced by workers paid in money and with tools owned by the capitalist. The sole purpose of the product is to be sold to realize a profit. This process eliminates the artist altogether. Wikipedia sums this up:
In a capitalist society, the worker’s alienation from his and her humanity occurs because the worker can only express labour — a fundamental social aspect of personal individuality — through a privately-owned system of industrial production in which each worker is an instrument, a thing, not a person.
So the product of labor under capitalism, the commodity, is not what Steve Jobs intended to sell. He was selling something better, something more. As far as he was concerned, profit was just fine, but not at the expense of that “something more.”
I wrote the few paragraphs below without a discussion of the alienation of labor, which is an unusual social-philosophical concept. As a result of this omission, there were some misunderstandings. For example, the alienation of labor does not mean the alienation of workers.
The fact that Steve was driven by his vision of beautiful products, “insanely great” as he would say, didn’t prevent us from glorying in our own contribution of non-alienated labor.
I do not believe Steve grasped the notion of alienated labor in and of itself. It is impossible to imagine the tens of thousands of Chinese laborers getting any whiff of the intoxicating perfume in the air we enjoyed in the early years at Apple.
=====================
Let me take on the task of explaining my view of Steve and the “First Five Years of Apple Computer.” Over the years, I’ve listened to lots of people with theories of how Apple succeeded, what was the magic ingredient, and whether the life of Steve Jobs verified the Great Man theory of history or not. I believe that the overwhelming majority of commentators miss the point completely. This is not surprising not only because they weren’t there, but also because what actually went on at Apple completely contradicts some central myths of Modern Capitalism.
I will state my thesis here as briefly as I can. I will not be writing a book proving every jot and tittle on the way to a grand conclusion. However, I feel competent to defend the thesis against any opponent. The first few years of Apple Computer were remarkable because labor was not alienated labor in the Marxist sense. We were not producing commodities for the sake of profit. In many respects, even as the company grew beyond all expectations, inertia carried this extraordinary characteristic forward until the Scully era.
The first three years at Apple were marked by a strong bond between all the participants, and between all of us and the product. We were building a product for ourselves and everybody throughout the world who were like us. (People tend to think everybody except the Other are like themselves in fundamental ways.) This was a product we wanted. And that was why we stayed up nights solving problems as they cropped up. Nobody in the early days was doing their job with the pay envelope in mind. Nobody. Even the production people putting Apples into boxes believed (correctly) they were sending their product to someone like themselves who would appreciate it, and more, marvel over it.
We made no shortcuts whatsoever. Not one. For example, Steve had the boxes carefully marked with our name and logo in red on the cleanest of clean white cardboard. Later, we got a shipment where the ink had smeared and the boxes “looked like shit,” as Jobs put it. So without regard for the fact that nearly 200 Apples were sitting in production ready to go, Steve shipped the boxes back. Both Markkula [Chairman of the Board] and Scotty [Mike Scott, President and CEO] screamed, but they were too late; the bad boxes were gone. And the whole factory silently applauded.
Again: We were in agony when the paint showed signs of peeling off the first cases, which (it turned out) were contaminated by the release compound from the molds. While orders piled up, we didn’t ship until we had stripped the paint, found a method for cleaning the cases and then repainted them. Everything that went wrong met a concentrated corrective effort. When it was clear that the cases made by the RIM (Reaction Injection Molding) method were not ever going to meet our standard, Steve and I took an airplane to Portland, Oregon to start an intensive program to make a new set of molds for an altogether new process that promised perfection (high pressure injection-molded foamed Noryl). Fortunately, our case design was suited to both the material and the process, and without dawdling we jumped right in and Steve wrote some big checks for the tooling. When quality of the product was considered, manufacturing cost was always second.
I worked with Steve (cheek to jowl at times) for the first 7 years and I think I came to know him at least as well as anybody. We never had a conflict over product quality as such. I did have arguments on “features.” Take for example one dispute over the Macintosh; Steve wanted stereo sound, and for Burrell Smith who was doing the logic board design, it would take some major design changes to accommodate stereo (adding an extra shift register, another D-A converter, and making changes in the ROMs and software). So I said No. Enough was enough. The engineering department had to stop changing things; we had to wrap up the design and go to production. I convinced Burrell Smith. I convinced Andy Hertzfeld, and demobilized Steve Jobs. Then I went home late, leaving the usual half dozen perfectionists (including Burrell) working away. But Steve wouldn’t leave well enough alone. He came back to the lab late that night and convinced Burrell that stereo was essential. So, the next morning, Burrell went home exhausted with the prototype boasting stereo, and me shaking my head in disgust. But so it came to be that the Macintosh had stereo even though there was no application program of any sort that could use it and only one speaker — at that time.
This sort of thing I understood, but it conflicted with my desire to get the product to the user promptly. Sometimes I could move things forward, and sometimes I couldn’t. However reluctantly I say this, more often than not, Steve’s last minute changes were the best thing for the customer.
I believe that Steve was dedicated to his audience, an imaginary audience who he would simply will into existence. He wanted commodities to be more than commodities. This desire was the base for the conflicts with Apple’s Board, etc. that forced the Board to fire Steve. But somehow, the vast millions of customers understood and applauded and Steve basked in the glow.
I talked with Ashton Kutcher [who played the part of Steve in the movie] at some length about Steve as the self-appointed representative of the customer, representing the people who could appreciate the quality, the thoughtfulness, and the product; that is, the product as the crystallization of what they wanted. Jobs’s perfectionism was not just a quirk, it was central; he wanted to be the leader of a new wave of products—products that were more than commodities. Products, I imagine, as we might have under socialism. To my surprise, Kutcher had come to roughly the same conclusion. He had read all available speeches by Steve, read memos and listened to those who had direct experience. He was the only one in the organization, which produced “Jobs”, who had thought through the story to the point of understanding it. This is key to his remarkable portrayal of Steve.
The movie clearly shows this conflict between a product made solely to be sold for a profit and a product made to “change the world”. At one point, the movie shows Art Rock, the dark side venture capitalist, explaining to Steve that the company had to make a profit, even at the expense of everything else. When Steve refuses to adapt to this edict, Scully, Rock and Markkula dethrone him and the Early Apple years end. In startling contrast, when Steve returns to Apple, the movie shows him with great intensity telling the new young designer (Ivy) “Design something beautiful that you love. I don’t care what it is.” (I believe one of Ivy’s designs became the iPod.) So Steve wins; we are left to imagine the evil capitalists slinking away.
Jobs’s failure to come to terms with capitalism (at least up through the first Macintosh) was due, I believe, to his willful ignorance of politics. His all-consuming idea of himself as a visionary made it impossible for him to see the contradictions. The failure of his own enterprise NEXT must have been a humbling experience. That, followed by the success of Pixar, which made him rich again, certainly must have changed him.
I have no direct experience of his last 25 years, but I suspect at least his obsession with his audience (the customers) stayed with him.
On August 18 I wrote an article in response to Joe Firestone, the author of an EBook titled “Austerity, Greece’s Debt Crisis and the Theft of Democracy” that had a chapter on the IT problems of a Grexit, which addressed earlier articles I had written.
Yesterday someone called my attention to a follow-up on his blog (http://neweconomicperspectives.org/2015/08/on-the-it-problem-of-grexit-a-reply.html) that once again tries to strike a balance between the Australian economist Billy Mitchell’s blithe assurance that the IT problems are minimal and my own insistence that it will take at least a three-year effort to modify the systems. This will be a brief response to Firestone’s latest.
Firestone maintains that he is only for studying and evaluating some approaches. He also favors a phased implementation, something that is put forward concisely in a comment he made under his article:
The mainframe application is undoubtedly very complex so there is a good possibility that Louis is right and the mainframe conversion to Drachma processing cannot be accomplished in the short time necessary for Grexit
So, if we want to support a Grexit that may be necessary in the short term, then we must find a way to get around the need to convert the mainframe application in the short-term
The two possibilities I suggest in my book deserve discussion as possible ways to avoid immediate conversion of the mainframe application and to have to deal with the complexities of the interaction between humans and the mainframe inherent in the operation of the application in the real world
This assumes that you can hold off converting “the mainframe application” until some later phase, but that’s not the way banking systems are put together. They are not Lego toys made up of discrete modules that can be assembled in stages.
Think of it this way. When you open a checking account, you sit at the desk of some bank officer who begins entering your information into a computer, starting with name, address, social security number, etc. He or she then issues you a temporary ATM card that can be used immediately for deposits and withdrawals.
In the ensuing months, customers might take out a credit card from the bank and afterwards a mortgage and/or an auto loan. And each month they expect a statement that will have an accurate record of their transactions, both debits and credits. I am sure everybody is accustomed to this unless they are used to keeping cash under a mattress.
The implicit assumption (bordering on explicit) in both Mitchell’s and Firestone’s presentation of the problem is that converting this customer-facing layer is the essential “phase” in moving to a drachma. I can certainly understand why someone might think in those terms, because that is generally how we relate to a bank—as a customer. I should add that the applications handling such relationships are generally referred to as belonging to the “front office”.
Unfortunately, most “back office” operations must be converted on the very day that you implement a new front office based on a drachma since they are designed to support the managers and clerks who are invisible to the customer but critical to bank operations.
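To make the distinction concrete, here is a minimal sketch, with invented function names and an invented data layout, of why the two sides cannot be converted in separate phases: the customer-facing code and the invisible nightly jobs read and write the same currency-denominated records.

```python
# Illustrative sketch only: invented names showing why "front office" and
# "back office" code must switch currencies together.
from decimal import Decimal

accounts = {}  # shared data store, a stand-in for the bank's master files

def open_account(customer_id, currency="EUR"):
    """Front office: what the bank officer's screen drives."""
    accounts[customer_id] = {"currency": currency, "balance": Decimal("0")}

def post_deposit(customer_id, amount):
    """Front office: a teller or ATM transaction."""
    accounts[customer_id]["balance"] += Decimal(amount)

def end_of_day_totals():
    """Back office: a nightly job the customer never sees, but which must agree
    with the front office on the currency of every record written that day."""
    return sum(acct["balance"] for acct in accounts.values()
               if acct["currency"] == "EUR")  # a drachma-only front office would leave this blind
```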
For example, the accounting department of a bank is fed data aggregated on a daily basis from various sources in order to populate a General Ledger, which is the source of profit-and-loss statements and other essential reports for treasurers, auditors and the like. Your deposits and withdrawals are lumped together with those of other customers and end up in buckets identified by a unique General Ledger account number, one of which might represent mortgages. Needless to say, knowing how much is owed to the bank in that category is essential, as the 2008 financial crisis made painfully clear.
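Here is a minimal sketch of the kind of daily roll-up a General Ledger feed performs. The account numbers and record layout are invented for illustration; a real bank’s batch feed would of course be far more elaborate.

```python
# Hypothetical sketch of a nightly General Ledger feed; account numbers
# and transaction records are invented for illustration only.
from collections import defaultdict
from decimal import Decimal

# Each transaction carries the GL account it rolls up into, e.g. "1200" for mortgages.
transactions = [
    {"gl_account": "1200", "amount": Decimal("150000.00")},  # new mortgage written
    {"gl_account": "1200", "amount": Decimal("-1200.00")},   # mortgage payment received
    {"gl_account": "1010", "amount": Decimal("500.00")},     # checking deposit
]

def post_to_general_ledger(txns):
    """Aggregate the day's transactions into one bucket per GL account number."""
    buckets = defaultdict(Decimal)
    for txn in txns:
        buckets[txn["gl_account"]] += txn["amount"]
    return dict(buckets)

print(post_to_general_ledger(transactions))
# {'1200': Decimal('148800.00'), '1010': Decimal('500.00')}
```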
So if the accounting software is still denominated in euros, what are you supposed to do? Keep running the books in euros for a couple of years until the next phase kicks in?
And this does not begin to address the problem of being able to rely on the accounting systems once they are converted to handle the drachma. Banks keep historical data that is used to generate reports reflecting financial trends, and since Greece adopted the euro that data has been captured in euro-denominated form. If you want to study how the mortgage business has fared over a ten-year period, you need to write conversion software to update computer files going back to the day Greece switched from the drachma to the euro. You also need to make sure that all back-office applications are checked for hard-coded tests against euro amounts, as I have pointed out a number of times.
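As a rough illustration of those two chores—re-denominating history and hunting for hard-coded euro comparisons—here is a minimal Python sketch. The record layout and the scanning heuristic are invented for the example; real conversion jobs would run against mainframe files and source libraries, not Python lists.

```python
# Minimal sketch of two conversion chores, with an invented record layout.
from decimal import Decimal

EUR_TO_GRD = Decimal("340.750")  # the old fixed euro/drachma rate, used here only as an example

def convert_history(records):
    """Re-denominate historical euro amounts so ten-year trend reports stay comparable."""
    for rec in records:
        if rec["currency"] == "EUR":
            rec["amount"] = (rec["amount"] * EUR_TO_GRD).quantize(Decimal("1"))
            rec["currency"] = "GRD"
    return records

def find_hardcoded_euro_tests(source_lines):
    """Crude heuristic: flag source lines that compare against literal euro amounts."""
    suspects = []
    for lineno, line in enumerate(source_lines, start=1):
        if "EUR" in line and any(op in line for op in ("==", ">", "<")):
            suspects.append((lineno, line.strip()))
    return suspects
```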
I know that most of my readers and those who have seen my posts on Naked Capitalism care little about the financial analysis conducted by bank officers in order to make business decisions but as long as Greece remains capitalist, that is the name of the game. This is not a problem limited to banks. It applies as well to insurance companies, brokerage houses, manufacturers, and any other large-scale capitalist enterprise.
Now it is entirely possible that at some point Greece might elect the candidates of the new Popular Unity party that is a leftwing split from Syriza and that is committed to a Grexit, at least if you take them at their word. They may consider the conversion to a drachma to be cost-justified even if it entails the wrenching IT modifications needed to make it work. While I am obviously sympathetic to resisting austerity, I cannot help but wonder if the answer lies solely in the type of currency used. I plan to write a series of articles about Greece that deals with the economic problems in general and hope that by that time the IT questions will no longer need to be discussed since in the final analysis they are secondary to the political ones.
Recently I learned that an EBook on Amazon.com titled “Austerity, Greece’s Debt Crisis and the Theft of Democracy” included a chapter titled “The Information Technology Problem” that discussed my articles on Naked Capitalism and those of Australian economist Billy Mitchell who has an unrealistic take on the amount of work required to modify Greek computer systems to handle a return to the drachma.
Joseph Firestone, the author of the EBook, has a PhD in Political Science from Michigan State, over 150 articles to his name, and an extensive background in IT but mostly at the management level. Right now he is the Chief Knowledge Officer of a company called Executive Information Systems, a title that most likely has something to do with Knowledge Management, his area of expertise. This is apparently a field that has emerged since 1991 but one that somehow managed to elude Columbia University where I worked from that year until my retirement in 2012. There will be something about it later in this article by another expert in the field.
Firestone tries to reconcile Mitchell’s views and my own, probably something that irritated the economist emeritus much more than it does me given his irascible reaction to my first article on Naked Capitalism. His tone reminded me of the one I take on issues such as when the Russian Revolution went off the rails but let’s leave that aside and move on to the substantive IT issues.
From Firestone I learned that Mitchell had written a short follow-up article that had escaped my attention. Invoking the authority of a friend who appears to be as high-powered as Firestone, a man who “owns a significant private firm in Europe which is at the forefront of delivering innovative card payment services to banks and corporations throughout the Eurozone”, Mitchell sought once again to buttress his “it’s not rocket science” understanding of the IT issues.
The friend confided to him that since “the Euro was integrated ‘on-top’ of the existing legacy IT payment systems”, ‘switching’ the Drachma back on would not be such a major task.” He added:
the Grexit should be accomplished by stealth. He would leave everything in place as it is for now. Then establish, in secret, a public bank (like the German KfW), procure the banking software out-of-the-box, sign a contract with a major card-scheme to use its network for transactions and hook the bank up with the official Bank of Greece, the nation’s central bank.
I wonder if this plagiarized or at least conveyed the madcap spirit of Varoufakis’s “Plan B”. If they ever made a movie about such a scheme, I’d cast Steve Carell in the leading role (only because Peter Sellers is dead.)
In terms of the Euro being integrated on top of the legacy systems, I have no way of assessing this. As someone who has taken part in at least a dozen feasibility studies over the years, I have learned that it is best to be cautious. Apparently the higher up you are in the IT food chain, the easier it is to throw caution to the wind.
In the late 90s I advised IT management at Columbia to avoid purchasing a Facilities Management System from American Management Systems (AMS). This was an outfit that Robert McNamara’s aides in the Pentagon founded in 1970. That should have been a warning from the outset to steer clear. Within six months after the system was implemented at the cost of millions of dollars, the users decided it did not meet their needs and dumped it. Just a few years later AMS went under, no doubt partly a result of Mississippi terminating an $11.2 million contract to modernize the state’s tax system. It would go on to sue the company for $985 million. Wikipedia states: “a jury awarded the state $474.5 million in actual and punitive damages in August 2000, causing a drop in stock price from 44 3/8 to 14. The company subsequently settled the suit for $185 million.” You can bet that if Greece ever needed consulting help to get them back into the drachma, there would be latter-day versions of AMS knocking at its doors.
Furthermore, with all due respect to Mitchell and his friend who “delivers innovative card payment services to banks and corporations throughout the Eurozone”, there is more to IT in Greece than banking and credit card processing. Greece has hospitals, universities, wholesale and retail companies selling furniture, yogurt, olive oil, tourist accommodations, and Zeus knows what else. Many of these companies do not have in-house staffs. Getting them up and running on a drachma will not be a piece of cake—trust me on that.
For Firestone to bridge the gap between Mitchell and myself, he invokes his own particular areas of expertise, which supposedly get us closer to “it’s not rocket science”. Naturally this requires some critical commentary.
In a section titled “Web-oriented Architecture Approach to a Drachma-based Transaction System”, he advises “web-enabling a legacy system”, something that might take a “few days, if that long”. Well, gosh, why hadn’t he brought that to Varoufakis’s attention? It would have saved him the trouble of lining up his pal at Columbia University to program a stealth-based “Plan B”. Firestone even offers up the names of some products that could serve as off-the-shelf solutions, such as the one marketed by the slyly named Kapow Software. While this software no doubt works as advertised in terms of integrating different systems under a web-based front end, it has little to do with the complexities of batch processing—the meat and potatoes of all banking applications, for which there is no user interface. Kapow might be of some use to a bank officer evaluating a loan application from a nervous customer sitting opposite him or her, but it is totally irrelevant to the stream of programs run at 3 a.m. that pump out customer statements—the kind you receive from your friendly banker at the end of the month, with a listing of your debits and credits followed by an account total. It is exactly programs such as these that will require onerous and time-consuming attention—nothing that Kapow can address.
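To give a sense of what such a 3 a.m. job looks like, here is a minimal sketch of a statement-producing batch step. The file names and column layout are invented; a real bank would be running scheduled COBOL against VSAM or DB2 rather than a Python script, but the shape of the work is the same: no screens, no web front end, just files in and files out.

```python
# Hypothetical sketch of an unattended month-end statement run;
# file names and record layout are invented for illustration.
import csv
from collections import defaultdict
from decimal import Decimal

def run_statement_batch(transactions_path, output_path):
    """Read the month's posted transactions and write one statement section per account."""
    totals = defaultdict(Decimal)
    lines = defaultdict(list)
    with open(transactions_path, newline="") as f:
        for row in csv.DictReader(f):  # columns: account, date, description, amount
            amt = Decimal(row["amount"])
            totals[row["account"]] += amt
            lines[row["account"]].append(f'{row["date"]}  {row["description"]:<30} {amt:>12}')
    with open(output_path, "w") as out:
        for account, items in lines.items():
            out.write(f"Statement for account {account}\n")
            out.write("\n".join(items) + "\n")
            out.write(f"Ending balance: {totals[account]}\n\n")
```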
Finally, returning to Firestone’s Knowledge Management, he starts off by wisely acknowledging that “people avoided mainframe applications wherever they could, because the chances of failure were so high”. He includes himself in that group. That being said, he regards the Kapow approach as an interim solution and concludes that a “better solution” would be to develop a new system written for the mainframe from scratch “using modern programming tools and techniques”—no doubt drawn from the Knowledge Management toolbox.
All I can say is that ever since the mid-1970s, I have heard about one new technique or another that would finally make developing large-scale systems less prone to failure. They were put forward as management, systems analysis, database or programming technologies in trade journals such as Datamation and Computerworld:
—programmerless programming: Languages such as MarkIV would supposedly allow an end user to build a system simply by specifying parameters that satisfied business requirements. In fact I used MarkIV to automate Salomon Brothers in London (SBIL) when I reported to Michael Bloomberg in 1977. Trust me, Michael couldn’t have done anything in MarkIV if his life depended on it.
—goto-less programming: The less said the better. I stopped using the “go to” around 1978, but deadlines were still missed because the user kept changing his or her mind—the real explanation for most software delays.
—structured design methodologies: I worked for a consulting company that employed SDM for a phone company project that would evaluate whether a customer would be charged for a phone call that they claimed that they didn’t make. When the consulting company demanded new funding because the project was delayed, negotiations broke down and we were escorted out of the building by security guards. SDM did not address user indecision, the cause of cost overruns.
—relational databases: This was supposedly a huge breakthrough because it organized data into rows and columns, just like a spreadsheet, that could be accessed through SQL, and it worked best when built on normalized data structures, which meant eliminating redundancies through a data analysis of the firm (a minimal sketch of what normalization looks like follows this list). I can only say that I worked with VSAM flat files, IBM’s IMS hierarchical database, and Cullinet’s IDMS network database before finally becoming the Sybase support person on my project team at Columbia University. All of them work just fine, even though Sybase (and Oracle) are best suited to client-server or web-based applications. But in the final analysis, it is the problem of nailing down user requirements that will always bite you in the ass. Given the economic chaos in Greece, this would be a thousand times worse than the normally chaotic situation.
—object orientation: I spent about five years developing Java programs in the STRUTS framework for Columbia University’s financial system. Anybody who sells OO as some kind of silver bullet should get one in the head.
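Since I referred to normalization above, here is the promised sketch: a tiny normalized schema, using SQLite purely so the example is self-contained. The table and column names are invented and have nothing to do with any system I worked on.

```python
# Illustrative sketch only: a minimal normalized schema queried through SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE account  (id INTEGER PRIMARY KEY,
                           customer_id INTEGER REFERENCES customer(id),
                           product TEXT, balance NUMERIC);
""")
conn.execute("INSERT INTO customer VALUES (1, 'A. Customer')")
conn.execute("INSERT INTO account  VALUES (10, 1, 'mortgage', 148800.00)")

# The normalization pay-off: customer details live in one place and are joined
# on demand, instead of being repeated in every account record as in a flat file.
for row in conn.execute("""
        SELECT c.name, a.product, a.balance
        FROM account a JOIN customer c ON c.id = a.customer_id"""):
    print(row)
```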
Since I have never gone near Knowledge Management, I won’t say a word about it although I would be remiss if I did not refer you to this:
Wall Street Journal, Jun 24, 2015
Whatever Happened to Knowledge Management?
By Thomas H. Davenport
I would never claim to have invented knowledge management, but I confess to an intimate involvement with it. I co-authored (with my friend Larry Prusak) one of the best selling books on the topic (in case you are into the classics, it was Working Knowledge: How Organizations Manage What They Know) and am supposedly the second-most cited researcher in the field (after the Japanese scholar Ikujiro Nonaka).
So I should know whereof I speak when I say that knowledge management isn’t dead, but it’s gasping for breath. First, the ongoing evidence of a pulse: academics still write about it, and some organizations (most notably APQC—a nonprofit research organization of which I am a board member and respect a lot) sells out its knowledge management conference every year. Professional services firms are still quite active and successful with the idea.
But there is plenty of evidence that it’s gasping as well. Google Trends suggests that “knowledge management” is a term rarely searched for anymore. Bain’s Management Tools and Trends survey doesn’t list it in the top 25 tools for the 2015 or 2013 surveys; it was included before that. More subjectively, although I am supposedly an expert on the topic, hardly anybody ever asks me to speak or consult about it.
What happened to this idea for improving organizations? I’m pretty sure that knowledge itself hasn’t become less important to companies and societies, so why did many organizations give up on managing it? Is there any chance it will return? And what does its near-demise tell us about the attributes of successful business ideas?
Although it’s impossible to know for sure why something rises or declines in popularity, here are some of my ideas for why knowledge management (KM) has faded:
It was too hard to change behavior. Some employees weren’t that interested in acquiring knowledge, others weren’t interested in sharing what they knew. Knowledge is tied up in politics and ego and culture. There were methods to improve its flow within organizations, but most didn’t bother to adopt them. Perhaps for this reason, the Bain survey (for example, the one from 2005) suggests that corporate satisfaction with KM was relatively low compared to some other management concepts.
Everything devolved to technology. KM is a complex idea, but most organizations just wanted to put in a system to manage knowledge, and that wasn’t enough to make knowledge flow and be applied.
The technology that organizations wanted to employ was Microsoft’s SharePoint. There were several generations of KM technology—remember Lotus Notes, for example?—but over time the dominant system became SharePoint. It’s not a bad technology by any means, but Microsoft didn’t market it very effectively and didn’t market KM at all.
It was too time-consuming to search for and digest stored knowledge. Even in organizations where a lot of knowledge was contributed to KM systems—consulting firms like Deloitte and Accenture come to mind—there was often too much knowledge to sort through. Many people didn’t have the patience or time to find everything they needed. Ironically, the greater the amount of knowledge, the more difficult it was to find and use.
Google also helped kill KM. When people saw how easy it was to search external knowledge, they were no longer interested in the more difficult process for searching out internal knowledge.
KM never incorporated knowledge derived from data and analytics. I tried to get my knowledge management friends to incorporate analytical insights into their worlds, but most had an antipathy to that topic. It seems that in this world you either like text or you like numbers, and few people like both. I shifted into focusing on analytics and Big Data, but few of the KM crowd joined me.
Any chance that this idea will come back? I don’t think so. The focus of knowledge-oriented projects has shifted to incorporating it into automated decision systems. The hot technology for managing knowledge is now IBM Corp.’s Watson—very different from the traditional KM model. Big Data and analytics are also much more a focus than KM within organizations. These concepts may be declining a bit in popularity too, but companies are still very focused on making them work.
If you believe in knowledge management—and you should—perhaps in your organization you can avoid the pitfalls I have listed and allow the idea to thrive. And if you favor a different idea and want it to survive over the long term, don’t hitch a complicated set of behaviors to technology alone. Don’t embrace a vendor for your concept that doesn’t care much about your idea. And if another notion that’s related to yours comes along and gains popularity, don’t shun it, embrace it.
Thomas H. Davenport is a Distinguished Professor at Babson College, a Research Fellow at the Center for Digital Business, Director of Research at the International Institute for Analytics, and a Senior Advisor to Deloitte Analytics.
Apparently my brief reference to Australian economics professor emeritus Bill Mitchell’s failure to mention the IT aspects of Grexit in a Naked Capitalism article touched a nerve. In a 3,500-word article that appeared on his blog on Friday, July 24th, he minimized the challenges and appealed to his own authority as an IT professional to drive his case home. He also took up some points in my article that weren’t really directed at him, particularly my brief remarks on a Grexit not being sufficient, by itself, to bring an end to austerity.
I did not have Dr. Mitchell in mind when I made that point. Furthermore, I don’t think there is much difference between us on the economic questions, but as I will now explain, we remain far apart on the IT implications of a Grexit.
To start with, he groups me with the sensationalistic media reports on Y2K that warned about Armageddon, as if I or any other seasoned professional really worried about such an outcome. He also alludes to the opportunistic sales pitches from consulting companies anxious to get their foot in the door by helping firms large and small avoid a Y2K catastrophe, at a steep price. If you were part of the permanent staff in any large organization like Columbia University, you had a very clear idea about how to do a Y2K conversion without tears.
Furthermore, I am quite sure that given sufficient time, funding and personnel, the conversion to the drachma is feasible. But the purpose of my article was not to argue that it was impossible. It was only to alert a lay audience to the kind of challenge it represented. For those who have not managed large-scale project implementations, it is easy to imagine that such a conversion could take place in a few months. But I am convinced that it would take no less than three years, based on my 44 years of experience managing, designing, programming and testing mission-critical applications in a variety of banks, brokerage houses, and insurance companies. That is about what it took to go from national currencies to the euro, and I would expect that it would take about the same amount of time to reverse engineer the process.
Perhaps nothing captures Dr. Mitchell’s unfamiliarity with the IT challenges of a euro-to-drachma conversion better than what he has to say about Y2K:
As the Naked Capitalism author notes it was really about software that had used two numbers to designate the year (MMDDYY) instead of four (MMDDYYYY). Several straightforward computer changes were made to resolve the possible problems depending on the situation (date expansion, date re-partitioning in overfull databases, windowing patches etc). Very trivial.
I did a double-take when I read this. Very trivial? Well, it is very trivial to expand the year from two digits to four digits, but that was never the challenge. In fact, Dr. Mitchell completely ignored what I wrote, namely that the task of finding the code was like looking for a needle in a haystack. At Columbia University we divided up thousands of programs and assigned programmers to search through thousands of lines within each program to track down every six-digit date and convert it to eight digits. It took ten seconds to modify each date once it was found, but it took the better part of a year to find them all. To repeat, a search for any field of data that had “date” in its name was straightforward, but what if a programmer labeled it “dt” or even “d”? Furthermore, what if a piece of data identified as “admission_date” was moved into a temporary field called “admission_temp”? You had to track the movement of data within the entire program to be sure that all bases were covered. It then took another year for IT to test all of the modified programs to make sure that the integrity of the data was preserved.
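To give the lay reader a feel for the problem, here is a toy COBOL fragment of my own invention (the field names are hypothetical, not anything from Columbia’s code, and it should compile under a modern compiler such as GnuCOBOL) showing how the same six-digit date ends up in fields that no simple search for “date” would ever catch:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. DATE-HUNT.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *> A six-digit MMDDYY date under an obvious name...
       01  ADMISSION-DATE      PIC 9(6) VALUE 072267.
      *> ...and the unobvious names the same value ends up in.
       01  ADMISSION-TEMP      PIC 9(6).
       01  DT                  PIC 9(6).
       PROCEDURE DIVISION.
           MOVE ADMISSION-DATE TO ADMISSION-TEMP
           MOVE ADMISSION-TEMP TO DT
      *> A search for "date" finds only the first field; every
      *> MOVE has to be traced before any PIC 9(6) can safely
      *> be widened to PIC 9(8).
           DISPLAY "Date in disguised field: " DT
           STOP RUN.

Widening ADMISSION-DATE to eight digits is the easy ten-second part; proving that you have also caught ADMISSION-TEMP and DT, in thousands of programs, is what eats up the year.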
Greece would run into the same challenges in a euro-to-drachma conversion but likely would not have the kind of infrastructure that a well-endowed Ivy League university was able to rely on. Given the economic desperation and chaotic conditions that Greek firms large and small operate within, it is a serious mistake to use one’s influence to persuade policy-makers to leap without looking first.
Continuing in his best-case-scenario vein, Dr. Mitchell dismisses the possibility that hard-coded values in a program constitute a major hurdle:
The issue is simple. Rules for determining eligibility for a service (mortgage etc) might have thresholds hard-coded into the computer system. So if your bank balance is above 1000 you qualify for a loan. Good programming clearly creates variable definitions (say, $threshold = 1000) in easy to find and edit part of the system and then uses symbolic references ($threshold) throughout the rest of the system so that when the threshold might require alteration there is one data entry required which feed the old system.
Yes, we are all for “good programming”, but my experience over the years is that there is enough space between “good programming” and the actual code in legacy systems to steer an ocean liner through. In an ideal world, a hard-coded value is never used. For example, as Dr. Mitchell points out, it is good practice to define an external variable such as $threshold, but in practice COBOL programmers (COBOL being the language of choice in most financial applications) tend to take shortcuts because they are always under the gun to meet a deadline. So instead of defining an external variable that can be modified in a single location, they will test for ’10000’ or whatever. Since the software in Greek banks is likely to be decades old, I doubt that the “good programming” practices hailed in computer science classes find much reflection within it. In fact, Mitchell expresses a surprising degree of naiveté when he writes:
So if there is a lot of ‘hard-coding’ in the Greek financial and business systems it would require some work. The reference the Naked Capitalism article uses was written in 1999 and relevant to rather dated practices and the big challenge of converting all the currencies into the euro and all the different national business systems into an integrated set of systems that could cope with the common currency.
I would suspect the assessment that there is a lot of ‘hard-coding’ now would be amiss. Business systems have become much more sophisticated and homogenised in the 16 years since that article was written.
But the point is that when Greece went from the drachma to the euro in 2002, it was practically preordained that the modifications would be made to existing software that might have been written in the 1980s or earlier. Why would Greek banks have written an entirely new demand deposit accounting system in that period? Yes, business systems have become more sophisticated since the year 2000, but you can be assured that those serving the mission-critical needs of Greek banks are decades old.
I should add that although I worked on mainframes for 23 years, the last 21 years of my career were spent at Columbia working with leading-edge technologies of the sort that he describes as “sophisticated” and “homogenised”. When I was hired by Columbia University in 1991, it was to make recommendations about exactly such technologies in my capacity as Development Technology Coordinator. Later on, once such technologies were adopted, I spent over 15 years designing and programming financial applications in Java using the Struts framework. Additionally, I supported that application’s Sybase backend using Perl and other Unix-based tools. Finally, part of my retirement contract involved being available on a contingency basis for technical support as the need arose. Even now I stay in touch with my colleagues to give them my take on future IT directions.
Dr. Mitchell also seems to have missed the point I was making about historical data:
These include the historical presentation of records, for example, bank statements. These problems were already encountered and solved in the transition to the euro. There is no reason to suspect that any new issues have arisen. The Bank of Greece knows how to do this and could easily issue a procedural manual to the commercial banks and other financial institutions.
But my point was that ad hoc software would have to be developed to modify historical data. For example, just to repeat myself, if the United States elected a Marxist president and adopted a new currency called the Rosa, pegged at 10 Rosas to the dollar, you would have to develop software that went through the databases and multiplied every occurrence of every cash-based data store by 10. (Let’s hope we’ll see that someday.)
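A deliberately simplified COBOL sketch of what such a pass looks like, with invented amounts and nothing resembling a real bank’s file layouts, might run as follows:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. REDENOM.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-RATE             PIC 9(3)V99 VALUE 10.00.
       01  WS-HISTORY.
           05  WS-AMOUNT       PIC 9(9)V99 OCCURS 3 TIMES.
       01  WS-IX               PIC 9(2).
       01  WS-AMOUNT-ED        PIC Z(8)9.99.
       PROCEDURE DIVISION.
      *> Three dollar-denominated history records, invented here.
           MOVE 100.50  TO WS-AMOUNT(1)
           MOVE 2500.00 TO WS-AMOUNT(2)
           MOVE 37.25   TO WS-AMOUNT(3)
      *> Restate each stored amount in Rosas at 10 to the dollar.
           PERFORM VARYING WS-IX FROM 1 BY 1 UNTIL WS-IX > 3
               COMPUTE WS-AMOUNT(WS-IX) = WS-AMOUNT(WS-IX) * WS-RATE
               MOVE WS-AMOUNT(WS-IX) TO WS-AMOUNT-ED
               DISPLAY "Converted amount: " WS-AMOUNT-ED
           END-PERFORM
           STOP RUN.

The arithmetic is the easy part; the hard part is knowing which of the thousands of stored fields scattered across the databases are “cash-based” in the first place.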
Finally, if I understand Mitchell correctly, he seems to be saying that you could dust off the pre-euro conversion software from 1999 or so and use it to replace current-day systems. That would be fine if no modifications had been made in the past 16 years to incorporate new business rules. But as we know, financial applications are highly dynamic, since the industry is always alert to opportunities that can boost corporate profits at the expense of the poor customer. Who knows? Maybe when the entire world converts to the Rosa, or even when money is no longer necessary, we will not have to face such problems, but in the meantime reality must govern all major policy decisions, including ones that revolve around information technology—the nervous system of any modern economy.
On July 14th I wrote an article titled “Convert to the drachma–piece of cake. Right…” that was a first take on the difficulties in implementing a Grexit from an IT standpoint. Since then I have tracked down a number of high-level strategic planning documents written in the late 90s that give me a much better handle on what those difficulties amount to. Except for the folks at Naked Capitalism who reposted my original article, there are very few people on the left who have any inkling of the problem. One of them is Robert Urie who alluded to it in a recent CounterPunch article:
A central difference between Argentina and Greece is that ‘all’ that Argentina had to do was to break the peg (fixed currency exchange ratio) with the USD while implementation of the Euro was a massive technological undertaking that replaced the Greek technology and institutions that supported the drachma. In the event of a forced Greek exit recovery of these technologies and institutions would take time that the Greeks don’t have. Breakdown of the supply-chain— the integrated economic relations that together facilitate economic production, causes a cascade effect where once lost, has to be rebuilt from the ground up.
Instead what I have mainly heard is that it is much more of a piece of cake than my article would suggest. For example, Canadian leftist Ken Hanley, who wrote an article titled “The German Grexit plan may have been the lesser of two evils”, commented: “The creditors were able to develop a Grexit plan. Schaeuble even presented a Grexit plan as an alternative to deal and many think that his whole plan was to force a Grexit.” He also referred me to an article by an Australian economist that assured his readers “A Greek exit is not rocket science”. Well, it might not be rocket science, but computer science is certainly relevant, notwithstanding the economist’s failure to refer to IT even once in his article.
The same shortcoming exists in an article that has been embraced by many on the left as a recipe for overcoming austerity. Titled “Greece: Alternatives and Exiting the Eurozone” and written by Eric Toussaint, who works with the Committee to Abolish Third World Debt, it makes very useful recommendations but once again neglects to mention anything about IT.
Now my point in referring to these difficulties was never to support staying in the Eurozone. It was primarily intended to alert the left to the dangers of thinking in terms of short-term solutions. Furthermore, my own position is that Greece’s difficulties have more to do with the underlying economy than with what currency it uses. Some Marxists, who have been sharply critical of Tsipras, appear to understand what this means. For example, In Defense of Marxism warned:
Some people have argued that if Greece is pushed out of the Euro this could eventually provide a solution to its economic problems. That is naïve in the extreme, not to say irresponsible. The question would still remain: what kind of an economy, run by whom and on the interests of whom?
Let us assume that the new currency is called the drachma. What will happen to it? It will fall like a stone because nobody will want to hold it. That will cause prices to rise steeply, even hyper-inflation, as in Germany in 1923. People’s savings will be wiped out. There will be a deep slump and even more unemployment.
Moreover, if Greece is forced out of the Euro, it will also find itself out of the European Union. The European bourgeois will not want to see its markets invaded by Greek goods made cheaper by the inevitable fall of the drachma (or whatever other currency is chosen). It will be necessary to take very drastic measures in order to avoid an economic catastrophe. Half measures will be useless. One cannot cure cancer with an aspirin.
I also thought that the Belgian Trotskyists of the LCR-SAP had good advice:
Leaving the Euro is not a sufficient condition to break with austerity (as the case of Britain proves) but, in the Greek case, for the countries of the periphery and those which are not in the heart of the euro zone, it is clearly a requirement.
The need to break with the euro does not imply making leaving the euro the central axis of an alternative programme. Even in Greece, where the question arises in a burning and immediate way, the axis of the alternative programme must be the rejection of any austerity and the implementation of social, ecological, anticapitalist and democratic policies, which directly improve the fate of workers, young people, women, the victims of racism, and the peasants.
To make leaving the euro the axis of the alternative would be to run up unnecessarily against the very generally-held idea that the currency is only “neutral” technical means of allowing trade, whereas it is in fact also the crystallization of a social relationship. To make leaving the euro (or the EU) the axis of the battle would be also to play the game of the hard-line and far right, by spreading the illusion that a harmonious socio-economic-ecological development would be possible within the national framework. This illusion harms internationalist solidarity. However, this is crucial not only for the fight in Greece, but also because the integration of the economies on the continent requires a European anticapitalist perspective to satisfy social needs and to answer the urgent ecological needs.
Before moving on to the technical aspects of a Grexit, I should say a few words about my background. My regular readers know that I worked in IT for 44 years, but it might be useful to mention something more specific about that experience.
To start with, before I began working at Columbia University in 1991, most of my work experience was in financial applications. I worked for five different banks: FNB of Boston, Texas Commerce Bank, Irving Trust, United Missouri Bank (where I programmed ATMs) and Chase Manhattan. I also worked for investment banks: Salomon Brothers and Goldman-Sachs. Finally, in the 21 years I was at Columbia University, most of my time was spent working on the financial system used for purchases, general ledger and the like. Back in 1998, part of my workload over a two-year period was to evaluate legacy software to identify changes needed to accommodate the arrival of 2000, a technical challenge that was dwarfed by the Eurozone conversion, as I will now explain.
The following documents were key to the observations I will be making:
Patrick O’Beirne, “Managing Risk in Euro Currency Conversion”, Cutter IT Journal, 1998 (http://www.sysmod.com/eurorisk.pdf). This is basically a shorter version of the Dekker article above with a useful bibliography referring to other material in this vein.
To start with, it would be useful to understand what took place in a Y2K migration. In many programs written in the ’60s and ’70s, when the year 2000 seemed like a long way off, dates were formatted as MMDDYY. This meant that if you were trying to establish whether a bond would mature in five years, you’d subtract something like 07/22/67 from 07/22/72, but when 2000 arrived, how could you determine whether 07/22/04 meant 1904 or 2004? The answer was to wade through millions of lines of code and expand MMDDYY to MMDDYYYY.
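For the lay reader, here is a toy COBOL program of my own devising (the field names and dates are invented; it should compile with GnuCOBOL) showing what goes wrong when a bond issued in 1999 matures in 2004 and the years are stored with only two digits:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. Y2K-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *> Two-digit years as stored in a 1970s-era file layout.
       01  WS-ISSUE-YY         PIC 9(2) VALUE 99.
       01  WS-MATURITY-YY      PIC 9(2) VALUE 04.
       01  WS-TERM             PIC S9(3) SIGN IS LEADING SEPARATE.
       PROCEDURE DIVISION.
      *> A bond issued in 1999 maturing in 2004: the term should
      *> be 5 years, but with two-digit years it comes out as
      *> 4 minus 99.
           COMPUTE WS-TERM = WS-MATURITY-YY - WS-ISSUE-YY
           DISPLAY "Computed bond term in years: " WS-TERM
           STOP RUN.

Instead of a five-year term, the subtraction yields minus 95, and every calculation downstream of it becomes garbage.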
In a computer program, every field of data is uniquely named. This means searching in a COBOL program for something like “date_today” is pretty simple. But what if a programmer called it dt_today instead? Of course, you might figure out that “dt” means date but some lazy programmer might have written it as “tdy”.
You will have the same problem, of course, with a euro to drachma conversion. Searching for the Greek equivalent of “amount” or “amt” becomes a drain on any IT staff.
A conversion from a local currency to the euro was an order of magnitude more difficult when it came to converting currency amounts, even once they were identified. For nations such as Spain, whose currency did not use decimal subunits the way the euro does, the rounding became a challenge. Since this did not apply to the drachma, a simple replacement might be in order and that would be the end of it.
However, the big problem was hard-coded amounts, as I tried to explain on Naked Capitalism underneath the crossposting of my original article:
For example, there might be tests to see if a customer has sufficient funds to be qualified for a mortgage. A program might conceivably mark it as eligible if there were 10,000 euros in the account. Switching to a drachma might make everybody eligible–not that there’s anything wrong with that obviously–but you can see that this is not a simple matter. Just being able to handle a drachma instead of a euro does not mean that software is meeting expectations. You have to do a BUSINESS analysis, which is the first stage in any systems implementation.
As it turned out, the Gimnich article listed above makes an identical point:
In many cases, amounts are hard-coded in the application programs. For instance, statements of the kind IF amt_1 < 1000 THEN … appear quite often. Here, the threshold value is simply used as a constant in the program: no symbolic constant, no variable declaration, no external amount table read.
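To make the point concrete for non-programmers, here is a minimal COBOL sketch of my own (the names and amounts are invented, and it should compile under GnuCOBOL) contrasting the deadline-driven shortcut with the “good programming” version:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. LOAN-CHECK.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-BALANCE          PIC 9(7)V99 VALUE 12500.00.
       01  WS-LOAN-THRESHOLD   PIC 9(7)V99 VALUE 10000.00.
       PROCEDURE DIVISION.
      *> The deadline-driven shortcut: a literal buried in logic.
           IF WS-BALANCE > 10000
              DISPLAY "Eligible (hard-coded test)"
           END-IF
      *> The maintainable version: one named threshold to change.
           IF WS-BALANCE > WS-LOAN-THRESHOLD
              DISPLAY "Eligible (symbolic threshold)"
           END-IF
           STOP RUN.

With the symbolic threshold, a redenomination means changing one VALUE clause; with the literal, somebody has to find every 10000 buried in the logic and decide whether it is money, a record count, or something else entirely.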
Assuming that Greece’s programmers could convert programs to handle the drachma rather than the euro, this would mean that you could start withdrawing the new currency from an ATM on day one. And at the end of the month, you’d get a bank statement with amounts designated in the new currency, with the proper currency symbol and so on. But that’s just the tip of the iceberg. Any bank maintains a history of transactions for all customers that is used for determining loan eligibility and the like. Your account might have the proper data from the day the drachma conversion took place going forward, but what about the ten years or so of prior transaction history denominated in euros? A suite of programs would have to be written to manage the conversion of historical data. This is not a minor task, since identifying which files contain such data requires plowing through an enormous IT inventory. Since documentation is always given short shrift in the corporate world, expect major technological hiccups or even heart attacks.
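What one member of such a suite might look like, in heavily simplified form, is a batch program that reads the euro-denominated history and writes a drachma-denominated copy. The file names and record layouts below are my own assumptions for illustration; the rate is the old official 340.75 drachmas to the euro:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. HIST-CONV.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT EURO-HIST ASSIGN TO "euro-hist.dat"
               ORGANIZATION IS LINE SEQUENTIAL.
           SELECT DRACHMA-HIST ASSIGN TO "drachma-hist.dat"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  EURO-HIST.
       01  EURO-REC.
           05  ER-ACCOUNT      PIC X(10).
           05  ER-AMOUNT       PIC 9(9)V99.
       FD  DRACHMA-HIST.
       01  DRACHMA-REC.
           05  DR-ACCOUNT      PIC X(10).
           05  DR-AMOUNT       PIC 9(11)V99.
       WORKING-STORAGE SECTION.
       01  WS-EOF              PIC X VALUE "N".
      *> Old official conversion rate: 340.75 drachmas per euro.
       01  WS-RATE             PIC 9(3)V99 VALUE 340.75.
       PROCEDURE DIVISION.
           OPEN INPUT EURO-HIST
                OUTPUT DRACHMA-HIST
           PERFORM UNTIL WS-EOF = "Y"
               READ EURO-HIST
                   AT END
                       MOVE "Y" TO WS-EOF
                   NOT AT END
                       MOVE ER-ACCOUNT TO DR-ACCOUNT
                       COMPUTE DR-AMOUNT ROUNDED =
                               ER-AMOUNT * WS-RATE
                       WRITE DRACHMA-REC
               END-READ
           END-PERFORM
           CLOSE EURO-HIST DRACHMA-HIST
           STOP RUN.

Multiply that by every transaction file, statement archive and interface feed in a bank’s inventory and you begin to see why I talk in years rather than months.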
The tasks described above properly belong to an IT department, which is centrally controlled, but that’s not the end of it. Ever since the advent of personal computers, huge amounts of mission-critical data have not been maintained by the IT staff at all. The finance department of any modern corporation is overflowing with PC-based spreadsheets used for budgeting and the like. All of these spreadsheets would have to be evaluated for their criticality and converted to the drachma if need be. Once again, a major task.
In August 2001, Computerworld, a trade magazine I read for many years before retiring, described the risks facing small and medium-sized businesses that had not gotten up to speed on the euro conversion:
Pollard said the unpreparedness of vendors and suppliers won’t create a catastrophe in the European marketplace, but it will cause supply chain slowdowns and force some small and medium-size businesses to revert to using paper invoices, bound ledgers and filing cabinets.
But Noel Hepworth, head of the euro conversion project at the European Federation of Accountants (FEE), an industry trade group in London, said companies that aren’t ready will quickly be forced out of business by large manufacturers that will refuse to deal with them.
Think about what this would mean for Greece as its businesses tried to do the same thing in reverse. The nation has a huge proportion of smaller firms, and it is exactly those that will be forced out of business if they can’t make the cut. If adopting the drachma leads to a sharp devaluation, as all the experts predict, those businesses will be ripe for buying up by foreign investors looking to make a killing.
Now in the long run, it might not matter that all these problems lie in store. It is probably the case that leaving the Eurozone is a necessary first step to escaping the clutches of the German bankers, the IMF and all the other predatory institutions. But the left does not look good by minimizing the technical challenges. Most of all, it is worth remembering what Lenin wrote in “State and Revolution”, which is just as applicable to a state embarking on an anti-austerity program based on neo-Keynesian principles as it was to the infant USSR:
We are not utopians, we do not “dream” of dispensing at once with all administration, with all subordination. These anarchist dreams, based upon incomprehension of the tasks of the proletarian dictatorship, are totally alien to Marxism, and, as a matter of fact, serve only to postpone the socialist revolution until people are different. No, we want the socialist revolution with people as they are now, with people who cannot dispense with subordination, control, and “foremen and accountants”.
I would only add programmers to the people Lenin identified above.
Yesterday morning I downloaded Microsoft Office 2016 to my new MacBook and spent three hours working on an article for which I was to earn $350. In the course of doing a ‘replace all’ to clean some things up when I had finished but not yet saved the article, Word froze on me. I called tech support and was told by a fellow named Benny (SRX1295917095ID – Microsoft Answer Desk) that Office 2016 was not stable and that I needed to use Office 2011 instead, which I then installed to replace the latest version. The consequence of all this is that I had to rewrite my article from scratch.
Before I retired in 2012, I worked in corporate IT for 44 years, including for some blue-chip firms like Goldman-Sachs. If we had delivered a nonfunctional product like this to investment bankers, heads would have rolled.
About a year ago my wife and I went to a trendy restaurant in NY called Buddokan and had a meal that cost us about $150, which we didn’t care for very much. When the waitress came by afterwards, she asked how we liked it and I told her the truth. A minute later the manager came by and told us that we only had to pay for our drinks.
When I told Benny that the right thing to do was to refund my payment for Office and allow me to use the 2011 version for free for my troubles, he said no deal. Apparently Buddokan cares more about its reputation than Microsoft.
If you wanted to do the right thing, you’d refund my money but I imagine that you won’t. It is a shame that your standards are so low that you would allow such a fiasco to take place. But then again I have memories of Windows ME, software that required me to buy a new Dell long before it was necessary.
Whenever you drive up to a McDonald’s window, or push your grocery cart to a Stop & Shop checkout line, or head to the register at Uniqlo with a blue lambswool sweater in hand, you, too, are about to be swept up into a detailed system of metrics. A point-of-sale (P.O.S.) system connected to the cash register captures the length of time between the end of the last customer’s transaction and the beginning of yours, how quickly the cashier rings up your order, and whether she has sold you on the new Jalapeño Double. It records how quickly a cashier scans each carton of milk and box of cereal, how many times she has to rescan an item, and how long it takes her to initiate the next sale. This data is being tracked at the employee level: some chains even post scan rates like scorecards in the break room; others have a cap on how many mistakes an employee can make before he or she is put on probation.
Until recently, most retail and fast-food schedules were handmade by managers who were familiar with the strengths of their staff and their scheduling needs. Now an algorithm takes the P.O.S. data and spits out schedules that are typically programmed to fit store traffic, not employees’ lives. Scheduling software systems, some built in-house, some by third-party firms, analyze historical data (how many sales there were on this day last year, how rain or a Yankees game affects revenue) as well as moment-by-moment updates on the number of customers in the store or the number of sweaters sold in the past hour or the pay rate of each employee on the clock—what Kronos, one of the leading suppliers of these systems, calls “oceans of valuable workforce data.” In the world of retail, all of this information points toward one killer K.P.I.: labor cost as a percentage of revenue.
In postwar America, many retailers sought to increase profits by maximizing sales, a strategy that pushed stores to overstaff so that every customer received assistance, and by offering generous bonuses to star salespeople with strong customer relationships. Now the trend is to keep staffing as lean as possible, to treat employees as temporary and replaceable, and to schedule them exactly and only when needed. Charles DeWitt, a vice president at Kronos, calls it “the era of cost.”
On March 24th Art Francisco posted a link on my Facebook timeline to a NY Times article about Facebook hosting news content, which read in part:
With 1.4 billion users, the social media site has become a vital source of traffic for publishers looking to reach an increasingly fragmented audience glued to smartphones. In recent months, Facebook has been quietly holding talks with at least half a dozen media companies about hosting their content inside Facebook rather than making users tap a link to go to an external site.
Such a plan would represent a leap of faith for news organizations accustomed to keeping their readers within their own ecosystems, as well as accumulating valuable data on them. Facebook has been trying to allay their fears, according to several of the people briefed on the talks, who spoke on condition of anonymity because they were bound by nondisclosure agreements.
This prompted Art to raise the following question:
Facebook is a 21st century social network and news medium owned and operated by our ruling class. Don’t we need a social network and news medium that is for the working class?
Louis N. Proyect, you’re a well known facebook pundit on the left, what do you think? Does facebook serve the needs of the movement, or can we do better?
I was glad to hear from Art since it reminded me that I had wanted to write about this matter ever since Greg Grandin’s article on “The Anti-Socialist Origins of Big Data” appeared in The Nation on October 23, 2014. Greg’s article took up in turn a New Yorker article by Evgeny Morozov on Salvador Allende’s planners making extensive use of computers for economic development as part of Project Cybersyn, the brainchild of cybernetics pioneer Stafford Beer, whose “Designing Freedom”—about his work in Chile—I read some twenty years ago. I was interested in what Beer had to say since my colleagues and I had been involved in a similar project, albeit on a much smaller scale, in Sandinista Nicaragua.
We learn from Greg that big corporations appropriated the technology but for contrary ends:
Morozov makes the case that, ironically, it is in Allende’s Project Cybersyn that one can trace the beginning of today’s use of computers by our hyper-linked, consumer-desire economy, by Amazon’s “anticipatory shipping,” Uber and the like, as well as new schemes of “algorithmic regulation” cooked up by neoliberal urban planners, who want to “replace rigid rules issued by out-of-touch politicians with fluid and personalized feedback loops generated by gadget-wielding customers.” Project Cybersyn looks like a “dispatch from the future.” “The socialist origins of big data,” runs a teaser for Morozov’s essay.
Greg supplements Morozov’s customary techno-pessimism by pointing out that computers were used by Pinochet to keep track of the left as part of his regime’s overall counterrevolutionary mission.
But there’s a part of the story that Morozov misses, concerning the darker side of the pervasiveness of “big data” in our daily lives. He writes that when Augusto Pinochet staged his Washington-backed coup on September 11, 1973, overthrowing Allende and installing his long dictatorship, he dismantled Project Cybersyn. “Pinochet,” Morozov writes, “had no need for real-time centralized planning.”
But he did have a need for computers, which, Cybersyn notwithstanding, were rare in Latin America in the early 1970s. Washington began to provide Latin America’s right-wing dictatorships with the latest in computer technology, as part of its larger campaign to “modernize” and “professionalize” their intelligence agencies.
Of course, this was not the first time fascists used electronic recordkeeping for repressive ends. Edwin Black’s “IBM and the Holocaust: The Strategic Alliance between Nazi Germany and America’s Most Powerful Corporation” demonstrates that the same tab machines being used by insurance companies and banks in the USA were put to use in the Third Reich’s census, which kept track of Jews.
For that matter, the Internet itself was Satan’s Spawn to begin with, when you stop and think about it. It evolved out of ARPANET, a Pentagon project designed to link remote computers into a network, which later adopted the TCP/IP protocols.
Morozov has become something of a prophet of doom when it comes to the Internet. In books such as “The Net Delusion: The Dark Side of Internet Freedom” and “To Save Everything, Click Here: The Folly of Technological Solutionism” and countless articles in Slate, the New Republic, and major media, he issues jeremiads that remind me a bit of the classic New Yorker cartoon with some guy in a long robe, wearing a beard, and carrying a sign—this time with the words “Forsake Twitter to Save Your Soul” or some such thing.
Of course, there’s plenty of grist for his mill with people like Mark Zuckerberg, Sergey Brin, and Jeff Bezos controlling much of the software we use to communicate and buy things, plus vultures like Time-Warner and Verizon looking after the infrastructure. In an article about FB banning anonymity, Morozov calls for something that sounds like Art Francisco’s “Don’t we need a social network and news medium that is for the working class?”
It’s time that citizens articulate a vision for a civic Internet that could compete with the dominant corporatist vision. Do we want to preserve anonymity to help dissidents or do we want to eliminate it so that corporations stop worrying about cyber-attacks? Do we want to build new infrastructure for surveillance—hoping it will lead to a better shopping experience—that would be abused by data-hungry governments? Do we want to enhance serendipitous discovery, to ensure exposure to new and controversial ideas, to maximize our ability to think critically about what we see and read on the Net?
Maybe because I was a software developer for 44 years and know what is involved in creating even a crappy little financial system for Goldman-Sachs or Columbia University, this sort of proposal strikes me as utterly utopian. As long as we live under capitalism, we are going to have to rely on technology that is a double-edged sword.
It is not only the Internet that is subject to government surveillance. Long before there was an Internet, the left was obsessed over wiretapping. In the SWP, our comrades used to joke about it when we called each other to discuss some antiwar demonstration we were organizing. We were so sure that the FBI was listening in on our conversations, we’d make wisecracks like “FBI, get off our phone call.”
It wasn’t just the phone that was problematic. There was also mail. We assumed that the FBI was opening our mail when it saw fit. But why would we stop using the telephone or the post office to help organize our activity? What would be the alternative? Carrier pigeon? Tin cans connected by waxed string?
I have a different take on these questions, influenced to a large extent by what Lenin wrote (as opposed to what Leninists write). In “What is to be Done”, he proposed organizational norms that conformed to changes in the mode of production. The “Economists”, who preferred struggles to be localized at the plant-gate level, were a reflection of the more primitive, handicrafts phase of Russian capitalism, when shops were smaller and more isolated. He noticed the great concentration of large factories in major cosmopolitan centers and concluded that a more professional and more generalized approach was needed, in line with the changed circumstances.
Economism belonged to Russia’s past; orthodox Marxism was the way forward. He saw modern social democracy as corresponding to the highly complex and specialized nature of modern mass production. He saw socialist parties as the working-class equivalent of large-scale industrial plants. A centrally-managed, large-scale division of labor was needed to move the struggle forward, just as it was necessary to construct steam locomotives. Lenin was no enemy of capitalist technology and mechanization. Rather he sought to appropriate its positive features whenever necessary.
If the Social Democracy of the early 20th century was a reflection of “Fordist” advances over earlier small-scale manufacturing, isn’t there a need to rethink how we are organized today in light of post-Fordist production, and networked technologies more specifically? If the bourgeoisie relies more and more on such advances for its own purposes, why should the working class be afraid of “being abused by data-hungry governments” as Morozov puts it?
In fact, the activists using iPhones to record police brutality for YouTube, or Facebook to organize protests, do not need to read Lenin to get a green light to build movements that take advantage of the Internet. Our task as Marxists is to help the scattered movements unite into a mighty force that is capable of transforming society—in essence the same task that existed in Czarist Russia in 1903, but within the context of less advanced tools.