In 1975, I took a job as a COBOL programmer at Salomon Brothers in New York, drawn mainly by the buzz generated by a N.Y. Times profile of its star block trader Michael Bloomberg. The November 9th article noted that “The single‐minded dedication of Mr. Bloomberg’s pressure‐cooker life goes hand in glove with the aggressive business style which has made Salomon Brothers one of the largest and most profitable firms on Wall Street.”
While at Salomon, I often stopped by Socialist Workers Party national headquarters after work to take part in systems design meetings with two other party members. We were automating The Militant and Pathfinder Press as part of an ambitious expansion program by the Trotskyist movement. The SWP had purchased an IBM System 32 minicomputer to generate mailing labels for The Militant and to keep track of Pathfinder’s financial records. Modernization also included the purchase of a web press located on the ground floor of a five-story building on West Street that we foolishly thought of as our Smolny Institute. (A web press had nothing to do with the Internet. It was just a high-powered technology for printing on continuous rolls of paper.)
Wreckage at the scene of an Ethiopian Airlines crash near Addis Ababa, Ethiopia, on Monday. (AP Photo/Mulugeta Ayene)
On October 29, 2018, a Boeing 737 Max 8 belonging to Lion Air in Indonesia crashed into the Java Sea 12 minutes after take-off. All 189 passengers and crew members were killed instantly. It is extremely unusual for planes to suffer such accidents in clear weather so soon after take-off. Flight experts concluded that the pilots were not adequately trained in the Maneuvering Characteristics Augmentation System (MCAS), an automated flight-control system that lowers the nose of a plane to prevent a stall. Although there is no definitive judgement on exactly what happened, it appears to have been a combination of inadequate training for the pilots and a malfunctioning MCAS.
On Sunday, another 737 Max 8, this one owned by Ethiopian Airlines, had the same kind of accident, resulting in the death of 157 passengers and crew members. In the aftermath of the tragedy, Australia, China, Germany, France, Indonesia, Ireland, Malaysia, Singapore, and the United Kingdom have grounded the planes.
Looking at these two horrible tragedies, which will make me think twice about getting on a plane again, I keep thinking of the title of Gabriel Garcia Márquez’s classic “Chronicle of a Death Foretold”. In essence, the use of MCAS is akin to an experimental, driverless car owned by Uber killing a pedestrian who was jaywalking on a dark road in Tempe, Arizona on March 18, 2018. The back-up driver, who was supposed to keep a sharp eye on the road to prevent such an accident, was watching reruns of the reality TV show “The Voice” at the time.
Despite such incidents (there have been 4 fatalities already), the bourgeoisie is determined to push ahead since the savings in labor costs will make up for the collateral damage of dead pedestrians. While I am skeptical that completely driverless cars will ever become the norm for Uber or Lyft, I can see people with little driving experience being paid minimum wage just to be a back-up to the computer system—as long as they don’t watch TV on the job. (Fat chance with such a boring job.)
This morning Donald Trump tweeted about the airline crash. “Airplanes are becoming far too complex to fly. Pilots are no longer needed, but rather computer scientists from MIT. I see it all the time in many products. Always seeking to go one unnecessary step further, when often old and simpler is far better. Split second decisions are….”
To begin with, the issue is not planes becoming too complex. It is rather that they are becoming too simple, given the amount of deskilling the airlines favor. As for the issue of replacing human labor with robots, he is all for it—reflecting the priorities of a ruling class bent on driving down wages.
The most optimistic analysts project that trucks with empty cabs and a computer at the wheel will travel on U.S. highways in as little as two years with no escort or safety driver in sight now that the Trump administration has signaled its willingness to let tractor-trailers become truly driverless.
The U.S. Department of Transportation this month announced that it will “no longer assume” that the driver of a commercial truck is human, and the agency will even “adapt the definitions of ‘driver’ and ‘operator’ to recognize that such terms do not refer exclusively to a human, but may in fact include an automated system.”
Already, automated truck developers such as Embark and TuSimple have made freight deliveries where the computer takes control on the highway, overseen by a human “safety driver.” Companies have also successfully tested “platooning,” where a truck with a human driver leads a convoy of as many as five computer-driven trucks following at close distance to reduce drag and save fuel.
The technologies promise big savings, with driverless trucks potentially slashing 40 percent from the cost of long-haul freight – much of it in saved labor expenses – and platooning cutting 10 to 15 percent in fuel costs.
If it is good for cars and trucks, why not airplanes?
Two years before the Indonesian 737 crash, the Guardian published an article titled “Crash: how computers are setting us up for disaster” that clearly anticipated it. Interestingly enough, it was not even a Boeing plane that was discussed in the article. It was an Airbus 330 that had the same kind of system as Boeing’s MCAS. With pilots much more used to relying on automation than manual control of the plane, they failed to override the system that was forcing the plane to plunge into the Atlantic Ocean on June 1, 2009 at about 125 miles an hour. Everyone on board, 228 passengers and crew, died instantly.
While pilots flying to major airports will continue to be highly paid, the wages of those working for regional airlines have fallen drastically. In 2010, the Guardian reported on “A pilot’s life: exhausting hours for meagre wages”. These pilots lead a decidedly unglamorous life:
Many are forced to fly half way around the country before they even begin work. Others sleep in trailers at the back of Los Angeles airport, in airline lounges across the country or even on the floors of their own planes. Some co-pilots, who typically take home about $20,000 (£12,500) a year, hold down second jobs to make ends meet.
All that will change when airplanes go the route of driverless cars as the NY Times reported last July in an article titled “Are You Ready to Fly Without a Human Pilot?” In the same fashion that Trump backed driverless trucks, the move toward pilotless planes seems inexorable:
Regulators are already taking steps toward downsizing the role of humans on the flight deck. The bill to reauthorize the Federal Aviation Administration included language to provide funding to study single-pilot operations for cargo planes, a move that the Air Line Pilots Association opposed. Captain Canoll said that a single-pilot aircraft must be safe to fly without anyone at the controls in case the pilot takes a bathroom break or becomes incapacitated.
At the recently concluded World Economic Forum, there was a big focus on artificial intelligence and robotics. On the website, you can find breathless articles about “Meet Stan: the robot valet that parks your car at the airport” and “US companies created a record number of robot workers in 2018”. In a Washington Post article on the WEF, the title betrayed a certain unease about the replacement of human beings by robots: ‘The aristocrats are out of touch’: Davos elites believe the answer to inequality is ‘upskilling’. It cited Blackstone CEO Stephen Schwarzman about how to keep the masses docile: “The lack of education in those areas in digital is absolutely shocking. That has to be changed. That will very much lessen the inequalities that people have in terms of job opportunities.”
What world are these people living in? Schwarzman has a 32-room penthouse at 740 Park Avenue and spent $5 million on his birthday party in 2017. He just made a $350 million gift to MIT to anchor a new $1 billion school for Artificial Intelligence. Is that supposed to create jobs? Maybe for someone with an MIT degree who will go to work writing software to replace the people working in Jeff Bezos’s slave labor-like warehouses with machines, but what is someone out of a job at an Amazon warehouse then supposed to do? Apply to MIT?
The handwriting is on the wall. The USA is moving into a two-tiered system. In places like NYC, Boston, San Francisco, Seattle and Portland, you get people working in high-tech industries that in contrast to the Fordist model of the 1930s employ far fewer bodies. Meanwhile, in Detroit, Cleveland, and other places where Fordism once held sway, the jobs are there if you are willing to work at Walmarts, at local hospitals emptying bedpans or as guards in a jail or prison. Class divisions between those with advanced technology skills and those left out will only increase, leading to the kind of showdown taking place in France between the neoliberal state and the Yellow Vests.
In cities like Oakland and Berkeley and San Francisco, millennials obsess over Alexandria Ocasio-Cortez’s Twitter and attend Democratic Socialists of America meetings. But the socialist passion doesn’t seem to have impacted the city’s zeal for I.P.O. parties, which the party planning community says are going to surpass past booms.
Jay Siegan, a former live music club owner who now curates private entertainment and music, is gearing up. He has worked on events for many of the I.P.O. hopefuls, including Uber, Airbnb, Slack, Postmates and Lyft.
“We see multiple parties per I.P.O. for the company that is I.P.O.ing, as well as firms that are associated to them,” Mr. Siegan said. Budgets for start-up parties, he said, can easily go above $10 million. “They’re wanting to bring in A-list celebrities to perform at the dinner tables for the executives. They want ballet performers.”
The only comment I would add to this tale of two cities is that it would not be surprising if some of these high-flying technology workers also plan to vote for Bernie Sanders. They probably don’t feel happy about living in a city where their wealth has driven up the cost of housing to the point that homelessness is an epidemic. Whether President Sanders can do much about these class divisions is open to debate.
The replacement of human labor by machinery has been described as “creative destruction”. The assumption is that the temporary pain is worth it since there will always be the growth of new jobs. As my seventh grade social studies teacher put it, the invention of the automobile put the blacksmith out of work but it created far more jobs in a Ford plant.
On May 12, 2010, the New York Times ran an article by economics editor Catherine Rampell titled “The New Poor: In Job Market Shift, Some Workers Are Left Behind” that focused on the largely middle-aged unemployed who will probably never work again. For example, 52-year-old administrative assistant Cynthia Norton has been working part-time at Walmart while sending resumes everywhere, but nobody gets back to her. She is part of a much bigger picture:
Ms. Norton is one of 1.7 million Americans who were employed in clerical and administrative positions when the recession began, but were no longer working in that occupation by the end of last year. There have also been outsize job losses in other occupation categories that seem unlikely to be revived during the economic recovery. The number of printing machine operators, for example, was nearly halved from the fourth quarter of 2007 to the fourth quarter of 2009. The number of people employed as travel agents fell by 40 percent.
But Ms. Rampell finds the silver lining in this dark cloud:
This “creative destruction” in the job market can benefit the economy.
Pruning relatively less-efficient employees like clerks and travel agents, whose work can be done more cheaply by computers or workers abroad, makes American businesses more efficient. Year over year, productivity growth was at its highest level in over 50 years last quarter, pushing corporate profits to record highs and helping the economy grow.
The term “creative destruction” might ring a bell. It was coined by Werner Sombart in his 1913 book “War and Capitalism”. When he was young, Sombart considered himself a Marxist. His notion of creative destruction was obviously drawn from Karl Marx, who, according to some, saw capitalism in terms of the business cycle. With busts following booms, like night follows day, a new round of capital accumulation can begin. This interpretation is particularly associated with Volume Two of Capital that examines this process in great detail. Looking at this material, some Marxists like Eduard Bernstein drew the conclusion that capitalism is an infinitely self-sustaining system.
By 1913, Sombart had dumped the Marxist commitment to social revolution but still retained the idea that there was a basis in Karl Marx for upholding the need for “creative destruction”, a view buttressed by an overly positive interpretation of this passage in the Communist Manifesto:
The bourgeoisie cannot exist without constantly revolutionizing the instruments of production, and thereby the relations of production, and with them the whole relations of society. Conservation of the old modes of production in unaltered form, was, on the contrary, the first condition of existence for all earlier industrial classes. Constant revolutionizing of production, uninterrupted disturbance of all social conditions, everlasting uncertainty and agitation distinguish the bourgeois epoch from all earlier ones.
By the 1930s, Sombart had adapted himself fairly well to the Nazi system although he was not gung-ho like Martin Heidegger or Carl Schmitt. The wiki on Sombart notes:
In 1934 he published Deutscher Sozialismus where he claimed a “new spirit” was beginning to “rule mankind”. The age of capitalism and proletarian socialism was over, with “German socialism” (National-Socialism) taking over.
But despite this, he remained critical. In 1938 he wrote an anthropology text that found fault with the Nazi system and many of his Jewish students remained fond of him.
I suspect, however, that Rampell is familiar with Joseph Schumpeter’s use of the term rather than Sombart’s since Schumpeter was an economist, her chosen discipline. In 1942, he wrote a book titled Capitalism, Socialism and Democracy that, like Sombart’s work, retained much of Karl Marx’s methodology but without the political imperative to destroy the system that utilized “creative destruction”. He wrote:
The opening up of new markets, foreign or domestic, and the organizational development from the craft shop and factory to such concerns as U.S. Steel illustrate the same process of industrial mutation–if I may use that biological term–that incessantly revolutionizes the economic structure from within, incessantly destroying the old one, incessantly creating a new one. This process of Creative Destruction is the essential fact about capitalism. It is what capitalism consists in and what every capitalist concern has got to live in. . . .
The wiki on Schumpeter claims that this theory is wedded to Nikolai Kondratiev’s “long wave” hypothesis that rests on the idea that there are 50 year cycles in which capitalism grows, decays and enters a crisis until a new round of capital accumulation opens up. Not only was the idea attractive to Schumpeter, it was a key part of Ernest Mandel’s economic theories. Unlike Schumpeter, Mandel was on the lookout for social agencies that could break the cycle and put development on a new footing, one based on human need rather than private profit.
Returning to Rampell’s article, there is one dimension entirely missing. She assumes that “creative destruction” will operate once again in order to foster a new upswing in the capitalist business cycle. But how exactly will that manifest itself? All the signs point to a general decline in business activity unless there is some kind of technological breakthrough equivalent to the computer revolution that fueled growth for decades. Does anybody believe that “green manufacturing” will play the same role? I don’t myself.
One thing does occur to me. Sombart’s book was written in 1913, one year before WWI, and was even titled, eerily enough, “War and Capitalism”. One wonders if the Great War would be seen as part and parcel of “creative destruction”. War, after all, does have a knack for clearing the playing field with even more finality than layoffs. Schumpeter wrote his book in 1942, one year into WWII. My guess is that he did not theorize war as the ultimate (and necessary?) instrument of creative destruction but history will record that WWII did introduce a whole raft of new technology, including aluminum, radar, nuclear power, etc., while bombing old modes of production into oblivion. What a great opportunity it was for capitalism to rebuild Japan, especially after firebombing and atomic bombs did their lovely work.
In my view, there’s something disgusting about this “creative destruction” business, especially when it is articulated by a young, pro-capitalist Princeton graduate like Catherine Rampell, who wrote for Slate, the Village Voice and other such b-list publications before crawling her way up into an editorial job at the NYT. She has clearly learned how to tailor her reporting to the ideological needs of the newspaper of record, growing more and more reactionary as the crisis of capitalism deepens.
In many ways, the best thing about the NY Times is the obituary since it amounts to a small-scale biography. If given a choice between a documentary and a narrative film, I generally lean toward the documentary because the real lives of people are far more interesting than what a screenwriter can think up. The same thing is true when it comes to a biography versus a novel. Why would I want to read something written by Jonathan Franzen when my time could be better spent on a biography of Ho Chi Minh or John Brown, just two that are sitting on my bookshelf right now?
On June 4th, the obituary for Jean Sammet appeared. Although I am pretty familiar with the lives of people who took part in the information revolution, I had never heard of her before. She was one of the six people who got together in 1959 to write the Common Business-Oriented Language, more familiarly known as COBOL. This was a language I used from 1970 until 1995 or so when I switched over to a Unix platform at Columbia University developing client-server systems in perl and java.
Jean Sammet, who brought computing into the business mainstream, at the University of Maryland in 1979 to deliver a lecture. Credit: Ben Shneiderman
From 1970 to 1978, I used my COBOL skills to facilitate moves from one city to another during my time as an SWP member. In those years, being a qualified COBOL programmer could usually land you a job within a week after moving to a new city. Furthermore, it enabled you to change jobs every 2 years or so with a 10 percent salary increase. And most importantly, it allowed you to exist in the corporate world without having to become part of the machine. Even after leaving the SWP, my computer skills continued to pay off, even at a place like Goldman-Sachs. The job served my ends just as much as it did my employer. This might have been obvious as indicated by a Newsday article about the Nicaragua solidarity movement in New York:
Lou Proyect works in a Wall Street investment bank, one of 25 “database administrators” who sit in a numbing row of fluorescent-blanched cubicles and stares at computers until the end of the day. It is the latest variation on the kind of job he has held for 19 years. Tacked to the wall of his cubicle is the latest article cut out from PC Week, a personal computer trade magazine: “IBM’s PS/2s aren’t all that revolutionary.” Neither, he says, is Lou Proyect.
It was in another computer magazine that he learned of the shortage of computer programmers in Nicaragua, because so many of the skilled middle class were leaving:
“This neon sign kept on in my head: ‘Nicaragua Needs You.’ This was using skills I had always taken for granted.”
He went to conduct a two-week workshop in computer programming as part of TecNica, a national organization which in the last two years has sent to Nicaragua about 50 New Yorkers, mostly computer programmers, some engineers, and one typesetter, one medical lab technician, one boiler mechanic, one travel agent. “Most are not hot radicals,” Proyect says. “They’re people very much like Ben Linder, taken up with the idea of helping the poor.”
I would go so far as to say that maybe half the people who went on TecNica delegations had more in common with Jean Sammet than they had with me. Born in 1928, she graduated from Mount Holyoke with a mathematics degree. Enrolled as a math grad student at the U. of Illinois in 1949, she ran into her first computer, which revolted her. She said, “I thought of a computer as some obscene piece of hardware that I wanted nothing to do with.”
It was only when she ran some punch cards through a computer that she was transformed. “To my utter astonishment, I loved it.”
This is exactly how I reacted when I started off as a programmer trainee at Metropolitan Life in 1968 using a COBOL-like language developed in-house called English Language. When I discovered that testing software was like doing puzzles, I couldn’t believe I was going to get paid to have fun. Here in a nutshell is what a COBOL programmer does. The code has been simplified but not by much.
P1.
Read employee_file into employee_record at end go to End_job.
Move First_Name to Pay_to_first_name.
Move Last_Name to Pay_to_last_name.
Compute check_amout = Wage_amount * Hours_worked.
Write check_record.
Go to P1.
End_job.
Close employee_file.
Close check_file.
Now all of this might seem quite mundane. The program reads through a file, calculates the wage and then writes a check. Except the program would not work. Take a minute to see why. Done? It won’t work because I spelled “amout” rather than “amount” under P1. You might understand what I meant but not a computer!
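For anyone curious about what the unsimplified version involves, here is a minimal sketch of the same payroll loop written as a complete program for a modern COBOL dialect such as GnuCOBOL, with the spelling corrected. The file names, record layouts and PICTURE clauses are my own assumptions for illustration, not anything from the actual Metropolitan Life or bank systems:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. PAYROLL.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT EMPLOYEE-FILE ASSIGN TO "EMPLOYEE.DAT"
               ORGANIZATION IS LINE SEQUENTIAL.
           SELECT CHECK-FILE ASSIGN TO "CHECKS.DAT"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  EMPLOYEE-FILE.
       01  EMPLOYEE-RECORD.
           05  FIRST-NAME         PIC X(20).
           05  LAST-NAME          PIC X(20).
           05  WAGE-AMOUNT        PIC 9(3)V99.
           05  HOURS-WORKED       PIC 9(3)V99.
       FD  CHECK-FILE.
       01  CHECK-RECORD.
           05  PAY-TO-FIRST-NAME  PIC X(20).
           05  PAY-TO-LAST-NAME   PIC X(20).
           05  CHECK-AMOUNT       PIC 9(7)V99.
       WORKING-STORAGE SECTION.
       01  WS-EOF                 PIC X VALUE "N".
       PROCEDURE DIVISION.
       MAIN-PARA.
           OPEN INPUT EMPLOYEE-FILE
           OPEN OUTPUT CHECK-FILE
           PERFORM UNTIL WS-EOF = "Y"
               READ EMPLOYEE-FILE
                   AT END MOVE "Y" TO WS-EOF
                   NOT AT END PERFORM WRITE-CHECK
               END-READ
           END-PERFORM
           CLOSE EMPLOYEE-FILE
           CLOSE CHECK-FILE
           STOP RUN.
       WRITE-CHECK.
           *> One wrong letter in a data name like CHECK-AMOUNT and the
           *> compiler rejects the whole program, which is the point of
           *> the example above.
           MOVE FIRST-NAME TO PAY-TO-FIRST-NAME
           MOVE LAST-NAME TO PAY-TO-LAST-NAME
           COMPUTE CHECK-AMOUNT = WAGE-AMOUNT * HOURS-WORKED
           WRITE CHECK-RECORD.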
American corporations have been running payroll applications like this since the 1960s but in a place like Nicaragua most businesses, many of which were owned by Somoza before 1979, did not have computers. Someone had to sit down with a calculator and do all of this manually, including the signature on the check. In the TecNica video, one volunteer recounts how the introduction of a modest computer cut the time for a state-owned enterprise dramatically. My time spent in Nicaragua convinced me that automation made socialism possible for the first time in human history, something that cybernetics expert Stafford Beer hoped would help transform Chile. It was only Nixon and Reagan’s intervention that showed how difficult it was to build socialism, even with the best of intentions on the government’s part and leading-edge technology at its disposal. A counter-revolutionary army supported by the most powerful capitalist nation in history was capable of stopping even the most determined movements for change.
The obit described the cultural environment for professional programmers in the early 50s:
In the early 1950s, the computer industry was in its infancy, with no settled culture or rigid career paths. Lois Haibt, a contemporary of Ms. Sammet’s at IBM, where Ms. Sammet worked for nearly three decades, observed, “They took anyone who seemed to have an aptitude for problem-solving skills — bridge players, chess players, even women.”
While by the time I entered the field it could no longer be described as in its “infancy”, it was nothing like today, when most programmers have little interest in anything except making big money. Goldman-Sachs is now looking for computer science graduates who more likely than not have never read a single novel in their life except “Atlas Shrugged”. In 1970, computer science was barely getting off the ground. Most programmers I worked with back then were people who fell into it like me. With a liberal arts degree, it was very tempting to take a job as a programmer trainee that expected very little from you except to be competent in your trade (it could hardly be called a profession).
When I went to work for the First National Bank of Boston in 1970, some of my co-workers had been affected by the student movement to some degree or another. There was a guy who had just graduated Dartmouth who had very poor work habits and spent most of the day talking about the Grateful Dead to anybody who listened. I sat next to a guy named Richard who worked there as a consultant. He was very knowledgeable about the arts and politics and someone I spent much of the day wasting time with chatting about socialism, 12-tone music and Godard films. Another consultant was a Harvard graduate who was about 5 years older than me and skeptical about radical politics to say the least. I dragged him once to see Camejo speak and he summed him up as “too febrile”. This was around the time I began to realize that not everybody was open to socialism.
I never met the guy at the bank who probably had more guts than me. There was a whiteboard in the cafeteria that was used for design sessions but someone had the brilliant idea to write something like this on it when nobody was looking: “The capitalist system is destroying the United States while it is killing countless Vietnamese peasants. Now is the time to demonstrate your opposition to such a monstrous system.” And beneath it in capital letters and underlined, you could read: “DO NOT ERASE”. People working for banks are so used to authority that the agitprop stood up until late in the morning when security guards got the okay from upper management to erase it.
From Boston I went down to Houston, Texas in 1973 and went to work for Texas Commerce Bank. I reported to Billy Penrod, a guy who looked and talked like a cowboy and who was a former Texas A&M running back from Gonzales, Texas. Like most people in Gonzales, Billy was a racist. He once described Gonzales as a sundown town, even though he didn’t use that term. He put it this way: “Colored people understood that they shouldn’t get caught in Gonzales after dark.” Despite his retrograde views, I learned to admire Billy as the consummate systems analyst. We were developing a personal trust system that kept track of the estates owned by the oil millionaires. Billy was from Jean Sammet’s generation and started out wiring IBM tab machines, which were used for accounting systems before there were computers. He got so good at managing them that he went to work as an IBM consultant implementing tab-machine based accounting systems around the country.
In 1975, I moved back to New York to work on automating the SWP headquarters, including the Militant newspaper and Pathfinder. This was the first time I began to suspect that I had joined a cult. The in-grown, zombie-like atmosphere at West Street made me feel ten times more alienated than I ever felt in a bank.
I went to work for Salomon Brothers during the day while doing West Street systems development by night. I didn’t stay long at Salomon but long enough to work with Michael Bloomberg who had me and a business analyst automating SBIL, their branch office in London. I rather liked Bloomberg even though he was an even bigger skunk than Billy. He was a sexist and racist pig who once yelled out “Look at the tits on that broad” when a Latina was delivering coffee on the trading floor.
When I got a mediocre review at Salomon, I went out and found another job in a week with ACI (Automated Concepts, Inc.), one of many “job shops” that hired programmers at very good salaries in the 1970s and 80s, before many of them went to work for themselves as contract programmers making even bigger money. One of my last jobs was as a self-employed contract programmer making $500 per day. This was in 1989, just when the job market was tightening up irrevocably and when those asshole libertarian smart-ass computer science majors were taking over.
ACI was a fun place to work. I got a big kick out of the CEO Fred Harris, who was into EST, a self-improvement cult not nearly so bad as Scientology but pretty bad. Fred didn’t quite know what to make of me but he appreciated the fact that I had a rather “elevated” mind as well as being a crackerjack COBOL programmer. I have no idea whatever happened to ACI and Fred Harris but I used to dream about going up to their offices at 386 Park Avenue South to get my next assignment.
One of my last consulting assignments with ACI introduced me to Gabriel Manfugas, the son of a former Batista soldier who had fled to the USA in the early 60s. Gabriel and I became fast friends even though he had no use for my politics. We used to smoke pot, even during lunch, as we walked around mid-town. I got to know his friends, who were upwardly mobile Latinos from Washington Heights and programmers like him. By this time, programmers—including them—were computer science graduates but nobody could mistake them for the jerks I used to run into at Goldman-Sachs. Mostly, they were looking for angles to make them wealthy, like starting their own consulting company. I was 20 years older than all of them, and they saw me as a father figure—subversive politics and all. I enjoyed plenty of cocaine binges with them in the 1980s and have fond memories of all of them, now in senior management positions at various corporations.
Two new documentaries will make you look differently at your electronic gadgets, especially the cool iPhone or other products from Apple whose logo might be changed to a skull-and-crossbones after seeing “Death by Design” and “Complicit”. They examine the damage done to both the workers who produce them and the environment, especially in China, as well as raise important questions about the meaning of “progress”. If being able to use an iPhone to pay for your Starbucks coffee comes at the expense of a leukemia epidemic for Foxconn workers and making 60 percent of China’s groundwater unsuitable for drinking, then the whole question of progress has to be thought through.
At the age of 74, Werner Herzog has just made his 38th feature film, a documentary about the Internet titled “Lo and Behold, Reveries of the Connected World”. The German director shows no sign of the age-related decline that has affected so many of his peers such as Martin Scorsese or Woody Allen. Of course, the fact that Allen hasn’t made a watchable film in over 35 years makes one speculate that he never lived up to his accolades to start with.
Unlike the mega-celebrity from Hollywood, Herzog belongs to that rarefied world of “foreign” or “independent” films that inevitably get screened in art houses and rarely get nominated for Academy Awards. In other words, he makes the sort of film I live for, starting with his narrative film “Aguirre, the Wrath of God” that I saw in 1977 and that made me a devoted fan. It starred Klaus Kinski, a member of Herzog’s repertory company at the time, as a deranged conquistador determined to find the lost city of El Dorado to seize control over its legendary riches, even if it cost the lives of every man in his expedition. In the final gripping scene, he is the sole survivor adrift on the Amazon River with monkeys overrunning his raft. It was not a stretch for a Marxist like me to see it as a critique of colonialism even though 7 years later I would be dismayed to discover that he had made a TV documentary taking up the cause of the Miskitos in Nicaragua. It was only 10 years later that I figured out that the Atlantic Coast Indians had legitimate grievances and that Herzog was right to make such a film.
If there is one thing you can predict about a Werner Herzog film, it is that it will be unpredictable. “Lo and Behold” is nominally a series of interviews with pioneer computer scientists like UCLA’s Leonard Kleinrock but refracted through Herzog’s off-kilter sensibility that runs through the film like a black thread. For example, upon the completion of his interview with Ted Nelson, who anticipated the rise of the World Wide Web, he allows Nelson to take his photo like a fan—a gesture that defies conventional documentary techniques to say the least.
Herzog is obviously fascinated by the computer scientists who come across as gee-whiz techno-optimists who clash with his own darkly absurdist vision of life even as he shares their breathless testimonies to the spectacular rise of the Internet. It is reminiscent of his near-obsession with the German-American jet fighter pilot Dieter Dengler who was shot down over Vietnam. In the 1997 documentary “Little Dieter Needs to Fly”, he describes how Dengler became obsessed with flying after seeing Allied fighter-bombers destroying his German village during WWII.
If you’ve seen that film, you will understand why Herzog seems just as fascinated with Elon Musk whose SpaceX company is building rockets that are intended to create a colony on Mars. Like Dengler, flight brings Musk closer to eternity or at least a taste of it. In explaining the need for colonizing Mars, Musk describes it as a hedge against something “going wrong” on Earth, the result of either a manmade or natural disaster. Will we have an Internet on Mars, Herzog playfully asks. With a cold smile, Musk says that we will after sending up a few satellites to circle the planet. One can hardly escape feeling that we are in the company of someone who would have made Aguirre blanch.
The film does not limit itself to the Internet. It is also devoted to displaying the latest in robotics and interviewing the geeks who work in the field. We meet Joydeep Biswas, a Carnegie Mellon engineer who displays six-inch-tall soccer-playing robots that dart about a miniature field scoring goals against each other. He has a particular fondness for robot number 8, which seems to be just a cut above the others. After Herzog asks Biswas if he loves that robot, the engineer grins sheepishly and admits that he does. It is a priceless moment.
If most of the film is devoted to the wonders of the Internet, Herzog makes sure to illustrate its dark side. He interviews the Catsouras family at their home in Orange County, near Los Angeles. The father, mother and three teen daughters sit around their dining table as the father describes the trauma they faced after a fourth daughter died in an auto accident in 2006. When a photograph taken by state troopers of her nearly decapitated head leaked out to the Internet and went viral, the family was horrified and unable to suppress it after a judge ruled that a dead person does not have the right to privacy. Mrs. Catsouras tells Herzog that the Internet was the anti-Christ to her.
As the conversation with the Catsouras family over these grisly matters transpires, your attention is fixated both on them and on the three trays of baked goods sitting on the table in front of them. The contrast between the muffins, cakes and cookies and their woeful experience could hardly be more striking. It makes you wonder if they prepared the goodies for the film crew and whether Herzog decided to leave them on the table just for their macabre counterpoint to the matters under discussion. I am sure he did.
After the press screening, I chatted briefly with NY Times film critic A.O. Scott about “Lo and Behold”. He was a bit surprised that I did not have much to say about Herzog’s utter lack of attention to the frequently aired concerns about the political implications of the Internet’s explosive growth. There is nothing about the monopolistic tendencies of Jeff Bezos, the NSA’s ability to snoop on our emails or phone calls, the threat of cyberwarfare, and the like. Scott was right, of course, but I doubt that Herzog had much interest in making the kind of film that Laura Poitras would have made. Herzog is primarily interested in human psychology, and particularly what some might consider abnormal psychology. With his command of cinematic techniques gathered over nearly four decades and a sense of the absurd matched by very few filmmakers today, Werner Herzog marches to the tune of his own drummer. “Lo and Behold” opens at the Lincoln Center Film Society on August 19th and better theaters everywhere. Highly recommended.
On May 7th a man named Joshua Brown died when his Tesla smacked into a trailer truck that the autopilot system mistook for the sky. Brown was a Navy Seal veteran who had worked in the Special Warfare Development Group, the elite unit that killed Osama Bin-Laden. His specialty was dismantling bombs in Iraq. Little did he realize that he would be killed by a bomb that was set to go off the first time its onboard computer system malfunctioned.
Apparently Brown was obsessed with his car and its supposedly miraculous ability to forestall highway accidents. He made many YouTube videos about his passion, including the most recent one, which illustrated its uncanny ability to avoid accidents.
The Guardian reported that Brown was watching a Harry Potter video when his Tesla careened into the trailer-truck so we can conclude that magic did not come to his rescue. It described the circumstances of the collision:
According to Tesla’s account of the crash, the car’s sensor system, against a bright spring sky, failed to distinguish a large white 18-wheel truck and trailer crossing the highway. In a blogpost, Tesla said the self-driving car attempted to drive full speed under the trailer “with the bottom of the trailer impacting the windshield of the Model S”.
One imagines that Brown must have invested so much faith in the car and his own invincibility because he ran a technology consulting company called Nexu Innovations that was devoted to “Making a Difference in Our Flattening World”. Of course, the concept of a “flattening” world is straight out of the Thomas Friedman playbook. Friedman has been churning out columns on how outsourced tech support help desks in Ghana and the like would be the answer to the world’s woes, and wherever that failed, the Navy Seals could step in and straighten things out.
My immediate reaction to the news of his death was to tell my wife that we should be grateful that Ronald Reagan’s Strategic Defense Initiative, aka Star Wars, was never implemented. Back in 1983 when I was getting re-politicized around the Central America guerrilla struggles, I also decided to join Computer Professionals for Social Responsibility, a group that made blocking the implementation of SDI a high priority.
The technology of SDI and the Tesla autopilot system are both based on artificial intelligence, in effect giving computer systems the same capability as a human eye matched to a functioning brain that follows certain pre-established rules. With Tesla, the goal is to avoid collisions. With SDI, the goal was to make them—specifically to smack into and blow to smithereens Soviet missiles that encroached upon American airspace. Reagan’s goal was to provide a nuclear shield that would give the USA a big advantage in a Cold War that might turn hot. Many people, including someone like me who used to take part in “duck and cover” drills in elementary school in the 1950s, were terrified by the notions being put forward by Reagan and his cohorts.
Reagan believed that missiles could be “recalled” as if they were like remote controlled model airplanes. Even more ghastly were the reassurances of Thomas K. Jones, Reagan’s Deputy Under Secretary of Defense for Research and Engineering, that the USA could recover from a nuclear war with Russia in 2 to 4 years. Jones once said, “If there are enough shovels to go around, everybody’s going to make it.” We were supposed to use the shovels to dig a hole in the ground (can you imagine New Yorkers running to Central Park with the H-Bomb on the way?) that would be covered with a couple of doors and three feet of dirt on top of them. Jones said, “It’s the dirt that does it.”
As it happens, there is a morbid connection between this doomsday scenario and the capitalist who started Tesla. Elon Musk is not the only manufacturer who is pioneering such cars but he is the only one who pushes the idea that an autopilot system capable of changing lanes now exists in his automobile. Others working in the field, such as Volvo, Mercedes and Toyota, never saw it as more than a technology good for parking assistance.
Mary “Missy” Cummings, a Duke University robotics professor and former military pilot, told the Guardian that Tesla should disable its autopilot system for navigating multilane expressways. “Either fix it or turn it off … The car was in a place where the computer was blind. The computer couldn’t see the environment for what it was.”
In addition to Tesla, Musk is investing in space travel. He is interviewed by Werner Herzog in “Lo and Behold”, a documentary on computers, the Internet and robotics that opens on August 19th. Herzog, who is much more interested in the “gee whiz” personalities of the men he interviews than in their political or social ambitions (a point that A.O. Scott made to me that I had not even considered), was goggle-eyed as Musk spelled out the need for colonizing Mars if “something goes wrong” on Earth.
The company is called SpaceX and it hopes to have its first Mars launch in 2022. In a 2013 interview with the Guardian, the man who made his billions from PayPal stated that he was inspired to shoot for colonizing Mars after reading Isaac Asimov’s “Foundation” science fiction series, whose main character Hari Seldon anticipates the collapse of the Galactic Empire, which encompasses the entire Milky Way. To save humanity, he creates a think-tank that develops the technology to launch a new galactic empire.
Musk told the Guardian, “It’s sort of a futuristic version of Gibbon’s Decline and Fall of the Roman Empire. Let’s say you were at the peak of the Roman empire, what would you do, what action could you take, to minimise decline?”
The answer for Musk is technology.
“The lessons of history would suggest that civilisations move in cycles. You can track that back quite far – the Babylonians, the Sumerians, followed by the Egyptians, the Romans, China. We’re obviously in a very upward cycle right now and hopefully that remains the case. But it may not. There could be some series of events that cause that technology level to decline. Given that this is the first time in 4.5bn years where it’s been possible for humanity to extend life beyond Earth, it seems like we’d be wise to act while the window was open and not count on the fact it will be open a long time.”
In James Joyce’s “Ulysses”, Stephen Dedalus says “History is a nightmare from which I am trying to awake.”
This is our nightmare, comrades. We have a capitalist class that is planning to colonize Mars in order to escape from the disaster it is now creating on Earth. Musk says he expects his business to be profitable since there will certainly be 80,000 people willing to pay the big bucks to flee a planet that has been consumed by nuclear war, catastrophic Noah’s Ark type flooding because of climate change, epidemics caused by viruses unleashed by the penetration of rain forests, or some other unforeseen disaster.
Musk is not the only capitalist who has “escape” plans. Jeff Bezos, the filthy predator who runs Amazon, is investing in Blue Origin, a space travel company that will not aim at colonizing Mars—a place that Bezos writes off as uninhabitable—but instead hopes to launch huge satellites that will orbit around a post-apocalyptic planet Earth. In an interview with the Miami Herald conducted shortly after his high school graduation (he was class valedictorian), he said he wanted to build space hotels, amusement parks and colonies for 2 million or 3 million people who would be in orbit. We have no idea what Bezos’s plans are today but one suspects that they are much more in line with Musk’s: to create a sanctuary for 80,000 or so people who share his bourgeois values.
One thing we can be certain about: if people like Bezos inhabited a space station, they’d probably kill each other before the year is up given what they are doing to the planet today.
The Rod Holt character in the 2013 narrative film “Jobs”. Note the SWP poster on the wall.
Last night I attended a press screening for Alex Gibney’s documentary “Steve Jobs: The Man in the Machine” that opens in theaters and VOD on Friday, September 4th. The film is a brilliant analysis of both the man and the company he built. Since Gibney’s last documentary was on Scientology, it was natural to wonder whether he decided to take on another cult. When Jobs died, Gibney was struck by the mass grief that poured out for the CEO after the fashion of Princess Di. What explained such devotion? Since Gibney owned and treasured his iPhone, this was a question that provoked him into making this film. As someone who likes but does not exactly love his MacBook, and who spent 44 years working as a systems analyst and a programmer, I also take great interest in the question of Apple’s place in the American economy and society.
There’s another connection. Back in 1967 I met Rod Holt in the New York branch of the Socialist Workers Party, a wiry fellow with close-cropped hair whom I found more interesting than most party veterans since he was an engineer and had raced motorcycles—not the typical resume for a Trotskyist. Years later I learned that Holt would become one of the five founding members of Apple. As such I was spurred to watch the 2013 narrative film “Jobs” on Amazon streaming, which includes Holt as a minor character. This review will take up both films as a prelude to the new film about Steve Jobs by Danny Boyle that will premiere at the Lincoln Center Film Festival next month.

It is understandable why three films have taken up the Steve Jobs story. Apple now enjoys the highest capitalization of any American corporation at 724 billion dollars, twice that of ExxonMobil. If a film like “There Will Be Blood” or “Citizen Kane” documented the ugly character of previous generations of the bourgeoisie, the three films about Steve Jobs bring us up to date on how the computer revolution turns entrepreneurs into monsters—the latest report on Amazon’s treatment of white collar workers bears this out. In many ways, Jobs was the prototypical Silicon Valley terror anticipating Jeff Bezos, Mark Zuckerberg and Sergey Brin. Unlike these more recent avatars of the computer ruling class, Jobs was shaped by the 1960s counter-culture, as both Gibney and Joshua Michael Stern, the director of “Jobs”, make clear. A devotee of Eastern religion and Bob Dylan, wearing long hair, and with a background in phone phreaking escapades, Jobs seemed the least likely candidate for building a corporation twice as big as ExxonMobil—itself the product of a merger of two behemoths. Figuring out how that took place was exactly what drove Alex Gibney into making the most important documentary of 2015.
Born in 1955, Steve Jobs was in his late teens during the biggest shake-up in American society since the 1930s. Unlike me, ten years his senior and ten years Rod Holt’s junior, Jobs was far more interested in Enlightenment than socialism. You have to remember that the thirst for spiritual transcendence was very deep at the time, powerful enough to turn antiwar leaders Rennie Davis and Jerry Rubin into searchers after Transcendence in the form of a Hindu guru’s cult and EST respectively. EST was a training program founded by Werner Erhard and designed to help yuppies solve problems after the fashion of Scientology. Erhard cobbled together some techniques that he had picked up from Zen Buddhism and psychotherapy. The CEO of the consulting company I worked for in the 1980s was an EST follower although he never foisted his beliefs on me. The idea that Zen Buddhism could be a guide to business success for both Steve Jobs and my boss might seem strange at first but one must never forget that Zen Buddhists were gung-ho for Japanese imperialism in WWII, as I mentioned to Gibney in the Q&A.
Like so many others from his generation, Jobs went to India on a pilgrimage to seek Wisdom with his friend Daniel Kottke, who would become one of Apple’s earliest employees. As both films point out, Jobs decided to allocate zero shares to Kottke when Apple was incorporated. He was very good at throwing people under the bus. When Jobs was at Atari in his first real job, the boss offered him a $5000 bonus if he could come up with a really great game. Needing hardware assistance, he recruited Steve Wozniak, who was told that he could get half the bonus if they succeeded. But Jobs lied and told Woz that the bonus was only for $750.
That’s not the half of it. When Jobs’s girlfriend became pregnant, he retained a lawyer to help him avoid paying child support, claiming that she had screwed around so much that nobody could tell who the father was. Eventually a DNA test proved that he was the father. Even if he wasn’t, his millions could have easily helped to support the child of someone with whom he had been intimate.
Gibney gives the devil his due. In capturing Jobs’s single-minded devotion to crafting user-friendly and beautiful machines, you are reminded of why Apple became dominant. Unlike Detroit, Silicon Valley was always much more sensitive to marketing trends since so much of personal computing was driven by taste. And once Apple embarked on making products like the iPod, the iPhone, and the iPad, it was possible for consumers to really feel like the computer was an extension of themselves. Gibney wonders, however, whether this is at the expense of the social bonding that was so important in the 1960s. If you go into a restaurant nowadays, you will often find a family of four fixated on their iPhones as each course is delivered, with conversation going by the wayside. The phone becomes worry beads that you can’t keep your hands off of.
The final fifteen minutes or so of Gibney’s film is a rather scathing summary of Jobs’s misdeeds, from avoiding taxes to screwing Chinese workers out of a living wage as he polluted their rivers with industrial waste. Of particular interest is how Steve Jobs used a special task force of Silicon Valley police to go after a reporter for Gizmodo who had reported on an early release of an iPhone that a drunken Apple employee had left behind in a bar and that had come into his hands. Even after the phone had been returned, the cops raided the reporter’s home and carted off computers and other valuables. When asked by a TV interviewer why he had resorted to such repressive measures, Jobs replied that he was trying to uphold Apple “values”. In the Q&A, someone asked Gibney what question he would have asked Jobs if he had had the opportunity to interview him. He replied that he would have asked him to define his “values”.
I can’t recommend “Steve Jobs: The Man in the Machine” highly enough. For my money, Alex Gibney is the best documentary filmmaker working today, an equal to Werner Herzog. With 35 credits to his name, including “Taxi to the Dark Side” about the American torture regime, Gibney combines acute social analysis with fluid documentary techniques. As is always the case with documentaries, there is a need to tell a story just as much as there is with narrative films. Since Gibney described Jobs as someone who excelled in telling a story, this is a film that was a perfect match of filmmaker and subject matter.
Despite the fact that only 27 percent of critics found it “fresh” on rottentomatoes.com, I consider “Jobs” to be a compelling film with a remarkable fidelity to the facts, at least based on its close parallels with Gibney’s documentary. Of course, since 98 percent of critics found the wretched “Mad Max: Fury Road” to be “fresh”, there’s no accounting for aesthetic judgments among my peers.
Although I am by no means an Ashton Kutcher fan, he captured the essence of Jobs as a brilliant martinet who had about as much warmth as a lamprey eel. Since the film does not try to deal with Apple Corporation’s disgusting behavior overseas, most of the negative side of the Jobs ledger is devoted to his treatment of his girlfriend and workmates, including Daniel Kottke.
Much of the drama is centered on fights in the boardroom with former CEO John Sculley, who is depicted as a hidebound bureaucrat who cares more about the quarterly earnings than Apple’s mission as a corporation that “thinks different”. The most interesting scenes, however, involve Jobs’s interaction with his fellow designers and engineers who are on his wavelength. No matter how much of a prick he was, he appeared to be a very good judge of talent and an inspirer of those who chose to walk the same road with him.
One aspect of the narrative film that is passed over in the documentary is how Jobs and Wozniak presented their first computer, which was nothing but a circuit board, to the Homebrew Computer Club in the Bay Area, where it was barely worthy of notice by those in attendance.
In my very first article on the Internet, which was a review of a book about the personal computer industry called “Hackers”, I referred to Homebrew:
So enamored of the idea of personal computing were Felsenstein and Halbrecht that they then launched something called the Homebrew Computer Club. The club drew together the initial corps of engineers and programmers who would launch the personal computer revolution. Among the participants were a couple of adolescents named Steven Jobs and Steve Wozniak who went on to form the Apple Corporation.
The hacker ethic that prevailed at the Homebrew Computer Club was decidedly anticapitalist, but not consciously pro-socialist. Software was freely exchanged at the club and the idea of proprietary software was anathema to the club members. There were 2 hackers who didn’t share these altruistic beliefs, namely Paul Allen and Bill Gates. When Allen and Gates discovered that their version of Basic that was written for the Altair was being distributed freely at the club, they raised hell. The 19-year-old Gates stated in a letter to the club: “Who can afford to do professional work for nothing?”
If you have followed the personal computer and Internet revolution for the past 25 years or so, you are aware of the tension between private and public that remains unresolved. For every scumbag like Zuckerberg anxious to enjoy the kind of monopoly that IBM once had in the mainframe business, there are others willing to work for free on Wikipedia, Open Source journals, free software, and the like. If capitalism creates the technology that allows the instant communication links that make runaway shops feasible, it also creates the networks that allow activists to build solidarity across national boundaries to oppose capitalist exploitation. In some ways, this is the contradiction that marks late capitalism more than any other.
Nobody better represents the intersection between public and private than Rod Holt, who was the lead engineer on the Apple II and who worked on the Macintosh as well. In the Q&A I told Gibney that Holt paid tribute to Jobs not long after he died, even if his values clashed with those of his fellow Apple pioneer. Here is what he told Marxmail subscribers:
Just a remark here:
Dear Comrades:
Concerning Steve Jobs:
I worked with Steve from the Summer of 1976 to his ouster. I was responsible for the Apple II hardware design and its manufacture. I was in charge of the Macintosh group until its launch in 1984. I was twice anointed with the title “Apple Fellow”. I’m sick and tired of people making judgements without the slightest idea of what they are talking about. They buy the official myths fabricated by various individuals around Apple (including the 2 Steves themselves). I have in my possession enough original documents to back up what I am saying.
There were 5 (five) founders of Apple Computer:
Mike Markkula, Chairman of the board of directors
Mike Scott, CEO and President
Steve Jobs, V.P. of Marketing
Steve Wozniak, V.P. Software
Rod Holt, V.P. Engineering
We were incorporated in the state of California effective Jan. 1, 1977 with the above 5 officers. Apple Computer had never been incorporated earlier.
I will just say here that the history of Apple in Wikipedia is seriously incorrect. Most other histories are also wildly wrong. Some of this was deliberately done by Steve Jobs, but most can be attributed to sloppy journalism. Some is due to bad memories.
Steve Jobs wanted products that he would buy and use. For the rest of Apple, the creators produced what they wanted to buy. The success stemmed from this simple set of motives.
Marxists should understand that the Apple products grew from the social environment of these times in silicon valley. There was a confluence here of what we, the designers, wanted and what the world wanted. I could go into more detail if there were room and time, but really, that’s the story.
Jobs was very, very bright, a genius perhaps. So was Woz. And Scotty too. We never lacked for brains. One of Steve’s remarkable abilities was that he listened. I would get into a fierce argument with him, go into the executive staff meeting and be floored when he would take my position exactly, understanding every bit of my arguments, re-phrase them and then convince everyone. I’ve never to this day met anyone that could dispute and at the same time listen so well.
But, for heavens sake, let’s remember that leaders of corporations have to make profits or else they are on the street looking for a job. Steve Jobs wanted a billion happy customers, a goal he could reach only as a super-capitalist. So that’s what he became. It wasn’t where he started, but that’s what happened. The fact that so much ink is expended by the press is embarrassing, but that’s just the byproduct. I’m sure he would be as embarrassed as I am now.
==================
If anyone wants particulars from me, he can ask.
Thanks,
–rod
Recent photo of Steve Wozniak and Rod Holt
* * * *
From Walter Isaacson’s biography:
Update
Posted by Rod Holt to Marxmail on August 29th:
Remarks on Steve Jobs as a Phenomenon
[The producers of the first Jobs movie, “Jobs” kindly loaned a preprint to the Roxie Theater in San Francisco so that my old friends and Apple co-workers could have a party—which we did, wall to wall.
After the showing that Thursday afternoon, here and there, I offered my opinion on the movie and its social meaning. That raised a few eyebrows and more questions. I have since been asked to explain myself, a reasonable request. Since my outlook differs a lot from that of many of us, I thought it proper to clarify what I meant when I talked about Steve as being intrinsically anti-capitalist. By that I meant that Steve was opposed to the “alienation of labor”, while the alienation of labor is intrinsic to capitalist production.
The term “alienation of labor” is a technical term, and like many in philosophy and economics, doesn’t quite mean what one would think. The shortest explanation of the concept is found in Wikipedia. Of course, the concept is not the property of Marx but has been part of the thinking of many thinkers since the rise of capitalism.
In the Wikipedia article, there is a quotation where Marx imagines production with non-alienated labor:
“… In your enjoyment, or use, of my product I would have the direct enjoyment both of being conscious of having satisfied a human need by my work, that is, of having objectified man’s essential nature, and of having thus created an object corresponding to the need of another man’s essential nature. . . . Our products would be so many mirrors in which we saw reflected our essential nature.”
Steve Jobs wanted his products enjoyed as expressing his essential nature, and therefore in the general sense, he was an artist with the development team and its laboratories as his studio.
In the capitalist system, products are produced by workers paid in money and with tools owned by the capitalist. The sole purpose of the product is to be sold to realize a profit. This process eliminates the artist altogether. Wikipedia sums this up:
In a capitalist society, the worker’s alienation from his and her humanity occurs because the worker can only express labour — a fundamental social aspect of personal individuality — through a privately-owned system of industrial production in which each worker is an instrument, a thing, not a person.
So the product of labor under capitalism, the commodity, is not what Steve Jobs intended to sell. He was selling something better, something more. As far as he was concerned, profit was just fine, but not at the expense of that “something more.”
I wrote the few paragraphs below without a discussion of the alienation of labor, which is an unusual social-philosophical concept. As a result of this omission, there were some misunderstandings. For example, the alienation of labor does not mean the alienation of workers.
The fact that Steve was driven by his vision of beautiful products, “insanely great” as he would say, didn’t prevent us from glorying in our own contribution of non-alienated labor.
I do not believe Steve grasped the notion of alienated labor in and of itself. It is impossible to imagine the tens of thousands of Chinese laborers getting any whiff of the intoxicating perfume in the air we enjoyed in the early years at Apple.]
=====================
Let me take on the task of explaining my view of Steve and the “First Five Years of Apple Computer.” Over the years, I’ve listened to lots of people with theories of how Apple succeeded, what was the magic ingredient, and whether the life of Steve Jobs verified the Great Man theory of history or not. I believe that the overwhelming majority of commentators miss the point completely. This is not surprising not only because they weren’t there, but also because what actually went on at Apple completely contradicts some central myths of Modern Capitalism.
I will state my thesis here as briefly as I can. I will not be writing a book proving every jot and tittle on the way to a grand conclusion. However, I feel competent to defend the thesis against any opponent. The first few years of Apple Computer were remarkable because labor was not alienated labor in the Marxist sense. We were not producing commodities for the sake of profit. In many respects, even as the company grew beyond all expectations, inertia carried this extraordinary characteristic forward until the Sculley era.
The first three years at Apple were marked by a strong bond between all the participants, and between all of us and the product. We were building a product for ourselves and everybody throughout the world who were like us. (People tend to think everybody except the Other are like themselves in fundamental ways.) This was a product we wanted. And that was why we stayed up nights solving problems as they cropped up. Nobody in the early days was doing their job with the pay envelope in mind. Nobody. Even the production people putting Apples into boxes believed (correctly) they were sending their product to someone like themselves who would appreciate it, and more, marvel over it.
We made no shortcuts whatsoever. Not one. For example, Steve had the boxes carefully marked with our name and logo in red on the cleanest of clean white cardboard. Later, we got a shipment where the ink had smeared and the boxes “looked like shit,” as Jobs put it. So without regard for the fact that nearly 200 Apples were sitting in production ready to go, Steve shipped the boxes back. Both Markkula [Chairman of the Board] and Scotty [Mike Scott, President and CEO] screamed, but they were too late; the bad boxes were gone. And the whole factory silently applauded.
Again: We were in agony when the paint showed signs of peeling off the first cases, which (it turned out) were contaminated by the release compound from the molds. While orders piled up, we didn’t ship until we had stripped the paint, found a method for cleaning the cases and then repainted them. Everything that went wrong met a concentrated corrective effort. When it was clear that the cases made by the RIM (Reaction Injection Molding) method were not ever going to meet our standard, Steve and I took an airplane to Portland, Oregon to start an intensive program to make a new set of molds for an altogether new process that promised perfection (high pressure injection-molded foamed Noryl). Fortunately, our case design was suited to both the material and the process, and without dawdling we jumped right in and Steve wrote some big checks for the tooling. When quality of the product was considered, manufacturing cost was always second.
I worked with Steve (cheek to jowl at times) for the first 7 years and I think I came to know him at least as well as anybody. We never had a conflict over product quality as such. I did have arguments on “features.” Take for example one dispute over the Macintosh; Steve wanted stereo sound, and for Burrell Smith who was doing the logic board design, it would take some major design changes to accommodate stereo (adding an extra shift register, another D-A converter, and making changes in the ROMs and software). So I said No. Enough was enough. The engineering department had to stop changing things; we had to wrap up the design and go to production. I convinced Burrell Smith. I convinced Andy Hertzfeld, and demobilized Steve Jobs. Then I went home late, leaving the usual half dozen perfectionists (including Burrell) working away. But Steve wouldn’t leave well enough alone. He came back to the lab late that night and convinced Burrell that stereo was essential. So, the next morning, Burrell went home exhausted with the prototype boasting stereo, and me shaking my head in disgust. But so it came to be that the Macintosh had stereo even though there was no application program of any sort that could use it and only one speaker — at that time.
This sort of thing I understood, but it conflicted with my desire to get the product to the user promptly. Sometimes I could move things forward, and sometimes I couldn’t. However reluctantly I say this, more often than not, Steve’s last minute changes were the best thing for the customer.
I believe that Steve was dedicated to his audience, an imaginary audience who he would simply will into existence. He wanted commodities to be more than commodities. This desire was the base for the conflicts with Apple’s Board, etc. that forced the Board to fire Steve. But somehow, the vast millions of customers understood and applauded and Steve basked in the glow.
I talked with Ashton Kutcher [who played the part of Steve in the movie] at some length about Steve as the self-appointed representative of the customer, representing the people who could appreciate the quality, the thoughtfulness, and the product; that is, the product as the crystallization of what they wanted. Jobs’s perfectionism was not just a quirk, it was central; he wanted to be the leader of a new wave of products—products that were more than commodities. Products, I imagine, as we might have under socialism. To my surprise, Kutcher had come to roughly the same conclusion. He had read all available speeches by Steve, read memos and listened to those who had direct experience. He was the only one in the organization that produced “Jobs” who had thought through the story to the point of understanding it. This is key to his remarkable portrayal of Steve.
The movie clearly shows this conflict between a product made solely to be sold for a profit and a product made to “change the world”. At one point, the movie shows Art Rock, the dark side venture capitalist, explaining to Steve that the company had to make a profit, even at the expense of everything else. When Steve refuses to adapt to this edict, Sculley, Rock and Markkula dethrone him and the Early Apple years end. In startling contrast, when Steve returns to Apple, the movie shows him with great intensity telling the new young designer (Ive) “Design something beautiful that you love. I don’t care what it is.” (I believe one of Ive’s designs became the iPod.) So Steve wins; we are left to imagine the evil capitalists slinking away.
Jobs’s failure to come to terms with capitalism (at least up through the first Macintosh) was due, I believe, to his willful ignorance of politics. His all-consuming idea of himself as a visionary made it impossible for him to see the contradictions. The failure of his own enterprise NeXT must have been a humbling experience. That, followed by the success of Pixar, which made him rich again, certainly must have changed him.
I have no direct experience of his last 25 years, but I suspect at least his obsession with his audience (the customers) stayed with him.
On August 18 I wrote an article in response to Joe Firestone, the author of an EBook titled “Austerity, Greece’s Debt Crisis and the Theft of Democracy”, which has a chapter on the IT problems of a Grexit that addressed earlier articles I had written.
Yesterday someone brought my attention to a follow-up on his blog (http://neweconomicperspectives.org/2015/08/on-the-it-problem-of-grexit-a-reply.html) that once again tries to strike a balance between Australian economist Billy Mitchell’s blithe assurance that the IT problems are minimal and my own insistence that it will be at least a three-year effort to modify the systems. This will be a brief response to Firestone’s latest.
Firestone maintains that he is only for studying and evaluating some approaches. He also favors a phased implementation, something that is put forward concisely in a comment he made under his article:
The mainframe application is undoubtedly very complex so there is a good possibility that Louis is right and the mainframe conversion to Drachma processing cannot be accomplished in the short time necessary for Grexit
So, if we want to support a Grexit that may be necessary in the short term, then we must find a way to get around the need to convert the mainframe application in the short-term
The two possibilities I suggest in my book deserve discussion as possible ways to avoid immediate conversion of the mainframe application and to have to deal with the complexities of the interaction between humans and the mainframe inherent in the operation of the application in the real world
This assumes that you can put off converting “the mainframe application” until some later phase, but banking systems are not put together like Lego toys, out of discrete modules that can be assembled in phases.
Think of it this way. When you open a checking account, you sit at the desk of some bank officer who begins entering your information into a computer, starting with name, address, social security number, etc. He or she then issues you a temporary ATM card that can be used immediately for deposits and withdrawals.
In the ensuing months, customers might take out a credit card from the bank and afterwards a mortgage and/or an auto loan. And each month they expect a statement that will have an accurate record of their transactions, both debits and credits. I am sure everybody is accustomed to this unless they are used to keeping cash under a mattress.
The implicit assumption (bordering on explicit) in both Mitchell’s and Firestone’s presentations of the problem is that this customer-facing “phase” is the essential part of moving to a drachma. I can certainly understand why someone might think in those terms, because that is generally how we relate to a bank—as a customer. I should add that the applications that handle such relationships are generally referred to as belonging to the “front office”.
Unfortunately, most “back office” operations must be converted on the very day that you implement a new front office based on a drachma since they are designed to support the managers and clerks who are invisible to the customer but critical to bank operations.
For example, the accounting department of a bank is fed data aggregated on a daily basis from various sources in order to populate a General Ledger, which is the source of profit and loss statements and other essential reports for treasurers, auditors and the like. Your deposits and withdrawals are lumped together with those of other customers and end up in buckets identified by a unique General Ledger Account Number, one of which might represent Mortgages. Needless to say, knowing how much is owed to the bank in this category is essential, as the 2008 financial crisis demonstrated.
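To make this concrete, here is a minimal sketch in Cobol of how individual transactions get lumped into a single General Ledger bucket; the account number 140100 for Mortgages, the record layout and the amounts are all invented for illustration. A real nightly feed runs against files with millions of records, but the principle is the same: if the amounts feeding the ledger are in euros, so is every report that comes out of it.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. GLPOST.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * The GL bucket we care about here: Mortgages. The account
      * number itself is invented for illustration.
       77  GL-MORTGAGES       PIC 9(6)      VALUE 140100.
       77  GL-MORTGAGE-TOTAL  PIC 9(11)V99  VALUE ZERO.
       77  GL-TOTAL-ED        PIC Z(10)9.99.
      * A toy version of the overnight feed from the front office:
      * three records, each a GL account number plus an amount.
       01  FEED-DATA.
           05  FILLER  PIC X(17)  VALUE "14010000001200000".
           05  FILLER  PIC X(17)  VALUE "25030000000045550".
           05  FILLER  PIC X(17)  VALUE "14010000008999925".
       01  FEED-TABLE REDEFINES FEED-DATA.
           05  FEED-REC OCCURS 3 TIMES INDEXED BY I.
               10  FEED-GL-ACCT  PIC 9(6).
               10  FEED-AMOUNT   PIC 9(9)V99.
       PROCEDURE DIVISION.
      * Lump every mortgage transaction into the one GL bucket,
      * the way the nightly accounting feed does it.
           PERFORM VARYING I FROM 1 BY 1 UNTIL I > 3
               IF FEED-GL-ACCT (I) = GL-MORTGAGES
                   ADD FEED-AMOUNT (I) TO GL-MORTGAGE-TOTAL
               END-IF
           END-PERFORM
           MOVE GL-MORTGAGE-TOTAL TO GL-TOTAL-ED
           DISPLAY "GL 140100 (MORTGAGES) TOTAL: " GL-TOTAL-ED
           STOP RUN.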
So if the accounting software is still denominated in euros, what are you supposed to do? Keep running it in euros for a couple of years until the next phase kicks in?
This does not begin to address the problem of being able to rely on accounting systems once they are converted to handle the drachma. Banks have historical data that is used to generate reports that reflect financial trends. Since the changeover in 2002, data has been captured in euros. If you want to study how the mortgage business has been faring over a ten-year period, you need to write conversion software to update computer files going back to the day Greece switched from the drachma to the euro. You also need to make sure that all back-office applications are checked for hard-coded tests on euro amounts, as I have pointed out a number of times.
I know that most of my readers and those who have seen my posts on Naked Capitalism care little about the financial analysis conducted by bank officers in order to make business decisions but as long as Greece remains capitalist, that is the name of the game. This is not a problem limited to banks. It applies as well to insurance companies, brokerage houses, manufacturers, and any other large-scale capitalist enterprise.
Now it is entirely possible that at some point Greece might elect the candidates of the new Popular Unity party, a leftwing split from Syriza that is committed to a Grexit, at least if you take them at their word. They may consider the conversion to a drachma to be cost-justified even if it entails the wrenching IT modifications needed to make it work. While I am obviously sympathetic to resisting austerity, I cannot help but wonder if the answer lies solely in the type of currency used. I plan to write a series of articles about Greece that deals with the economic problems in general and hope that by that time the IT questions will no longer need to be discussed, since in the final analysis they are secondary to the political ones.
Recently I learned that an EBook on Amazon.com titled “Austerity, Greece’s Debt Crisis and the Theft of Democracy” included a chapter titled “The Information Technology Problem” that discussed my articles on Naked Capitalism and those of Australian economist Billy Mitchell who has an unrealistic take on the amount of work required to modify Greek computer systems to handle a return to the drachma.
Joseph Firestone, the author of the EBook, has a PhD in Political Science from Michigan State, over 150 articles to his name, and an extensive background in IT but mostly at the management level. Right now he is the Chief Knowledge Officer of a company called Executive Information Systems, a title that most likely has something to do with Knowledge Management, his area of expertise. This is apparently a field that has emerged since 1991 but one that somehow managed to elude Columbia University where I worked from that year until my retirement in 2012. There will be something about it later in this article by another expert in the field.
Firestone tries to reconcile Mitchell’s views and my own, probably something that irritated the economist emeritus much more than it does me given his irascible reaction to my first article on Naked Capitalism. His tone reminded me of the one I take on issues such as when the Russian Revolution went off the rails but let’s leave that aside and move on to the substantive IT issues.
From Firestone I learned that Mitchell had a short follow-up article that somehow escaped my attention. Using the authority of a friend who appears to be as high-powered as Firestone, a man who “owns a significant private firm in Europe which is at the forefront of delivering innovative card payment services to banks and corporations throughout the Eurozone”, Mitchell sought once again to buttress his “it’s not rocket science” understanding of the IT issues.
The friend confided to him that since “the Euro was integrated ‘on-top’ of the existing legacy IT payment systems”, “‘switching’ the Drachma back on would not be such a major task.” He added:
the Grexit should be accomplished by stealth. He would leave everything in place as it is for now. Then establish, in secret, a public bank (like the German KfW), procure the banking software out-of-the-box, sign a contract with a major card-scheme to use its network for transactions and hook the bank up with the official Bank of Greece, the nation’s central bank.
I wonder if this plagiarized or at least conveyed the madcap spirit of Varoufakis’s “Plan B”. If they ever made a movie about such a scheme, I’d cast Steve Carell in the leading role (only because Peter Sellers is dead.)
In terms of the Euro being integrated on top of the legacy systems, I have no way of assessing this. As someone who has taken part in at least a dozen feasibility studies over the years, I have learned that it is best to be cautious. Apparently the higher up you are in the IT food chain, the easier it is to throw caution to the wind.
In the late 90s I advised IT management at Columbia to avoid purchasing a Facilities Management System from American Management Systems (AMS). This was an outfit that Robert McNamara’s aides in the Pentagon founded in 1970. That should have been a warning from the outset to steer clear. Within six months after the system was implemented at a cost of millions of dollars, the users decided it did not meet their needs and dumped it. Just a few years later AMS went under, no doubt partly as a result of Mississippi terminating an $11.2 million contract to modernize the state’s tax system. The state went on to sue the company for $985 million. Wikipedia states: “a jury awarded the state $474.5 million in actual and punitive damages in August 2000, causing a drop in stock price from 44 3/8 to 14. The company subsequently settled the suit for $185 million.” You can bet that if Greece ever needed consulting help to get back onto the drachma, there would be latter-day versions of AMS knocking at its doors.
Furthermore, with all due respect to Mitchell and his friend who “delivers innovative card payment services to banks and corporations throughout the Eurozone”, there is more to IT in Greece than banking and credit card processing. Greece has hospitals, universities, wholesale and retail companies selling furniture, yogurt, olive oil, tourist accommodations, and Zeus knows what else. Many of these companies do not have in-house staffs. Getting them up and running on a drachma will not be a piece of cake—trust me on that.
To bridge the gap between Mitchell and myself, Firestone invokes his own particular areas of expertise that supposedly get us closer to “it’s not rocket science”. Naturally this requires some critical commentary.
In a section titled “Web-oriented Architecture Approach to a Drachma-based Transaction System”, he advises “web-enabling a legacy system”, something that might take a “few days, if that long”. Well, gosh, why hadn’t he brought that to Varoufakis’s attention? That would have saved him the trouble of lining up his pal at Columbia University to program a stealth-based “Plan B”. Firestone even offers up the names of some products that could serve as off-the-shelf solutions, such as the one marketed by the slyly named Kapow Software. While this software no doubt works as advertised in terms of integrating different systems under a web-based front end, it has little to do with the complexities of batch processing—the meat and potatoes of all banking applications, for which there is no user interface. Kapow might be of some use to a bank officer evaluating a loan application from a nervous customer sitting opposite him or her, but it is totally irrelevant to the stream of programs run at 3 a.m. that pump out customer statements: the kind you receive from your friendly banker at the end of the month, with a listing of your debits and credits followed by an account total. It is exactly programs such as these that will require onerous and time-consuming attention—nothing that Kapow can address.
Finally, returning to Firestone’s Knowledge Management, he starts off by wisely acknowledging that “people avoided mainframe applications wherever they could, because the chances of failure were so high”. He includes himself in that group. That being said, he regards the Kapow approach as an interim solution and concludes that a “better solution” would be to develop a new system written for the mainframe from scratch “using modern programming tools and techniques”—no doubt drawn from the Knowledge Management toolbox.
All I can say is that ever since the mid-1970s, I have heard about one new technique or another that would finally make developing large-scale systems less prone to failure. They were put forward as management, systems analysis, database or programming technologies in trade journals such as Datamation or Computerworld:
—programmerless programming: Languages such as MarkIV would allow an end user to build a system by specifying parameters that satisfied business requirements. In fact I automated Salomon Brothers in London (SBIL) when I reported to Michael Bloomberg in 1977. Trust me, Michael couldn’t have done anything in MarkIV if his life depended on it.
—goto-less programming: The less said the better. I stopped using the “go to” in 1978 or so, but deadlines were still missed because the user kept changing his or her mind—the real explanation for most software delays.
—structured design methodologies: I worked for a consulting company that employed SDM for a phone company project meant to evaluate whether a customer should be charged for a phone call they claimed they didn’t make. When the consulting company demanded new funding because the project was delayed, negotiations broke down and we were escorted out of the building by security guards. SDM did not address user indecision, the cause of cost overruns.
—relational databases: This was supposedly a huge breakthrough because it organized data into rows and columns, just like a spreadsheet, that could be accessed through SQL, and it worked best when based on normalized data structures, which meant eliminating redundancies through a data analysis of the firm. I can only say that I have worked with VSAM flat files, IBM’s IMS hierarchical database, and Cullinet’s IDMS network database before finally becoming a Sybase support person on my project team at Columbia University. All of them work just fine, even though Sybase (and Oracle) are best suited for client-server or web-based applications. But in the final analysis, it is the problem of nailing down user requirements that will always bite you in the ass. Given the economic chaos in Greece, this will be a thousand times worse than the normal chaotic situation.
—object orientation: I spent about five years developing Java programs in the Struts framework for Columbia University’s financial system. Anybody who sells OO as some kind of silver bullet should get one in the head.
Since I have never gone near Knowledge Management, I won’t say a word about it although I would be remiss if I did not refer you to this:
Wall Street Journal, Jun 24, 2015
Whatever Happened to Knowledge Management?
By Thomas H. Davenport
I would never claim to have invented knowledge management, but I confess to an intimate involvement with it. I co-authored (with my friend Larry Prusak) one of the best selling books on the topic (in case you are into the classics, it was Working Knowledge: How Organizations Manage What They Know) and am supposedly the second-most cited researcher in the field (after the Japanese scholar Ikujiro Nonaka).
So I should know whereof I speak when I say that knowledge management isn’t dead, but it’s gasping for breath. First, the ongoing evidence of a pulse: academics still write about it, and some organizations (most notably APQC—a nonprofit research organization of which I am a board member and respect a lot) sells out its knowledge management conference every year. Professional services firms are still quite active and successful with the idea.
But there is plenty of evidence that it’s gasping as well. Google Trends suggests that “knowledge management” is a term rarely searched for anymore. Bain’s Management Tools and Trends survey doesn’t list it in the top 25 tools for the 2015 or 2013 surveys; it was included before that. More subjectively, although I am supposedly an expert on the topic, hardly anybody ever asks me to speak or consult about it.
What happened to this idea for improving organizations? I’m pretty sure that knowledge itself hasn’t become less important to companies and societies, so why did many organizations give up on managing it? Is there any chance it will return? And what does its near-demise tell us about the attributes of successful business ideas?
Although it’s impossible to know for sure why something rises or declines in popularity, here are some of my ideas for why knowledge management (KM) has faded:
It was too hard to change behavior. Some employees weren’t that interested in acquiring knowledge, others weren’t interested in sharing what they knew. Knowledge is tied up in politics and ego and culture. There were methods to improve its flow within organizations, but most didn’t bother to adopt them. Perhaps for this reason, the Bain survey (for example, the one from 2005) suggests that corporate satisfaction with KM was relatively low compared to some other management concepts.
Everything devolved to technology. KM is a complex idea, but most organizations just wanted to put in a system to manage knowledge, and that wasn’t enough to make knowledge flow and be applied.
The technology that organizations wanted to employ was Microsoft’s SharePoint. There were several generations of KM technology—remember Lotus Notes, for example?—but over time the dominant system became SharePoint. It’s not a bad technology by any means, but Microsoft didn’t market it very effectively and didn’t market KM at all.
It was too time-consuming to search for and digest stored knowledge. Even in organizations where a lot of knowledge was contributed to KM systems—consulting firms like Deloitte and Accenture come to mind—there was often too much knowledge to sort through. Many people didn’t have the patience or time to find everything they needed. Ironically, the greater the amount of knowledge, the more difficult it was to find and use.
Google also helped kill KM. When people saw how easy it was to search external knowledge, they were no longer interested in the more difficult process for searching out internal knowledge.
KM never incorporated knowledge derived from data and analytics. I tried to get my knowledge management friends to incorporate analytical insights into their worlds, but most had an antipathy to that topic. It seems that in this world you either like text or you like numbers, and few people like both. I shifted into focusing on analytics and Big Data, but few of the KM crowd joined me.
Any chance that this idea will come back? I don’t think so. The focus of knowledge-oriented projects has shifted to incorporating it into automated decision systems. The hot technology for managing knowledge is now IBM Corp.’s Watson—very different from the traditional KM model. Big Data and analytics are also much more a focus than KM within organizations. These concepts may be declining a bit in popularity too, but companies are still very focused on making them work.
If you believe in knowledge management—and you should—perhaps in your organization you can avoid the pitfalls I have listed and allow the idea to thrive. And if you favor a different idea and want it to survive over the long term, don’t hitch a complicated set of behaviors to technology alone. Don’t embrace a vendor for your concept that doesn’t care much about your idea. And if another notion that’s related to yours comes along and gains popularity, don’t shun it, embrace it.
Thomas H. Davenport is a Distinguished Professor at Babson College, a Research Fellow at the Center for Digital Business, Director of Research at the International Institute for Analytics, and a Senior Advisor to Deloitte Analytics.
Apparently my brief reference to Australian economics professor emeritus Bill Mitchell’s failure to mention the IT aspects of Grexit in a Naked Capitalism article touched a nerve. In a 3,500-word article that appeared on his blog on Friday, July 24th, he minimized the challenges and appealed to his own authority as an IT professional to drive his case home. He also took up some points in my article that weren’t really directed at him, particularly my brief remarks around the question of a Grexit not being sufficient to bring an end to austerity.
I did not have Dr. Mitchell in mind when I made that point. Furthermore, I don’t think that there is much difference between us on the economic questions, but as I will now explain, we are still far apart on the IT implications of a Grexit.
To start with, he groups me with the sensationalistic media reports on Y2K that warned about Armageddon as if I or any other seasoned professional really worried about such an outcome. He also alludes to the opportunistic sales pitches from consulting companies anxious to get their foot in the door to help firms large and small avoid a Y2K catastrophe but at a steep price. If you were part of the permanent staff in any large organization like Columbia University, you had a very clear idea about how to do a Y2K conversion without tears.
Furthermore, I am quite sure that given sufficient time, funding and personnel, the conversion to the drachma is feasible. But the purpose of my article was not to argue that it was impossible. It was only to alert a lay audience to what kind of challenge it represented. For those who have not managed large-scale project implementations, it is easy to imagine that such a conversion could take place in something like a few months. But I am convinced that it would probably take no less than three years, based on my 44 years of experience managing, designing, programming and testing mission-critical applications in a variety of banks, brokerage houses, and insurance companies. That was about what it took to go from national currencies to the euro, and I would expect it to take about the same amount of time to reverse the process.
Perhaps nothing captures Dr. Mitchell’s unfamiliarity with the IT challenges facing a euro-to-drachma conversion than what he has to say about Y2K:
As the Naked Capitalism author notes it was really about software that had used two numbers to designate the year (MMDDYY) instead of four (MMDDYYYY). Several straightforward computer changes were made to resolve the possible problems depending on the situation (date expansion, date re-partitioning in overfull databases, windowing patches etc). Very trivial.
I did a double-take when I read this. Very trivial? Well, it is very trivial to expand the year from two digits to four digits, but that was never the challenge. In fact Dr. Mitchell completely ignored what I wrote, namely that the task of finding the code was like looking for a needle in a haystack. At Columbia University we divided up thousands of programs and assigned programmers to search through thousands of lines within each program to track down every six-digit date and convert it to eight digits. It took 10 seconds to modify each date once it was found, but it took the better part of a year to find them all. A search for any field of data that had “date” in its name was straightforward, but what if a programmer labeled it “dt” or even “d”? Furthermore, what if a piece of data identified as “admission_date” is moved into a temporary field called “admission_temp”? You have to track the movement of data within the entire program to be sure that you have all bases covered. It then took another year for IT to test all of the modified programs to make sure that the integrity of the data was preserved.
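To give a flavor of what we were hunting for, here is a minimal Cobol sketch with invented field names. Nothing in it is called “date”, and the six-digit value is copied into a scratch field before the two-digit year is ever used, which is why a simple text search was never going to be enough.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. Y2KHUNT.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * An admission date stored as YYMMDD. Nothing in the name
      * says "date", so a text search for "date" never finds it.
       77  ADM-D          PIC 9(6)  VALUE 990315.
      * The value is copied into a scratch field for a report.
       77  WS-TEMP        PIC 9(6).
       77  WS-YY          PIC 99.
      * The "current" two-digit year, pre-fix: 00 means 2000.
       77  WS-CURR-YY     PIC 99    VALUE 0.
       77  WS-ELAPSED     PIC S9(4).
       77  WS-ELAPSED-ED  PIC -ZZZ9.
       PROCEDURE DIVISION.
           MOVE ADM-D TO WS-TEMP
           MOVE WS-TEMP (1:2) TO WS-YY
      * In 1999 this gave 0 elapsed years; in 2000 it gives -99,
      * and every report downstream of it is garbage.
           COMPUTE WS-ELAPSED = WS-CURR-YY - WS-YY
           MOVE WS-ELAPSED TO WS-ELAPSED-ED
           DISPLAY "YEARS SINCE ADMISSION: " WS-ELAPSED-ED
           STOP RUN.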
Greece would run into the same challenges in a euro to drachma conversion but likely would not have the kind of infrastructure that a well-endowed Ivy university was able to rely on. Given the economic desperation and chaotic conditions that Greek firms large and small operate within, it is a serious mistake to use one’s influence to persuade policy-makers to leap without looking first.
Continuing in his best case scenario vein, Dr. Mitchell dismisses the possibility that hard-coded values in a program constitute a major hurdle:
The issue is simple. Rules for determining eligibility for a service (mortgage etc) might have thresholds hard-coded into the computer system. So if your bank balance is above 1000 you qualify for a loan. Good programming clearly creates variable definitions (say, $threshold = 1000) in easy to find and edit part of the system and then uses symbolic references ($threshold) throughout the rest of the system so that when the threshold might require alteration there is one data entry required which feed the old system.
Yes, we are all for “good programming” but my experience over the years is that there is enough space between “good programming” and the actual code in legacy systems to steer an ocean liner through. In the ideal world, a hard-coded value is never used. For example, as Dr. Mitchell points out, it is good practice to define an external variable such as $threshold, but in practice Cobol programmers (Cobol being the language of choice in most financial applications) tend to take shortcuts because they are always under the gun to meet a deadline. So instead of defining a variable that can be modified in a single location, they will test for ‘10000’ or whatever. Since the software in Greek banks is likely to be decades old, I doubt that the “good programming” practices hailed in computer science classes find much reflection within it. In fact, Mitchell expresses a surprising degree of naiveté when he writes:
So if there is a lot of ‘hard-coding’ in the Greek financial and business systems it would require some work. The reference the Naked Capitalism article uses was written in 1999 and relevant to rather dated practices and the big challenge of converting all the currencies into the euro and all the different national business systems into an integrated set of systems that could cope with the common currency.
I would suspect the assessment that there is a lot of ‘hard-coding’ now would be amiss. Business systems have become much more sophisticated and homogenised in the 16 years since that article was written.
But the point is that when Greece went from the drachma to the euro in 2002, it was practically preordained that the modifications would be made to existing software that might have been written in the 1980s or earlier. Why would Greek banks have written an entirely new Demand Deposit Accounting system in that period? Yes, business systems have become more sophisticated since the year 2000, but you can be assured that those serving the mission-critical needs of Greek banks are decades old.
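To make the hard-coding point concrete, here is a minimal Cobol sketch, with the field names and the 1000 cutoff invented for illustration. The first test is the disciplined version Mitchell describes; the second is the shortcut that actually litters decades-old code, where the amount sits as a bare literal that no search for “threshold” will ever turn up.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. THRESH.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * The disciplined version: one named threshold, changed in
      * one place when the currency (or the rule) changes.
       77  WS-LOAN-THRESHOLD  PIC 9(7)V99  VALUE 1000.00.
       77  WS-BALANCE         PIC 9(7)V99  VALUE 2500.00.
       PROCEDURE DIVISION.
           IF WS-BALANCE > WS-LOAN-THRESHOLD
               DISPLAY "ELIGIBLE (NAMED THRESHOLD)"
           END-IF
      * The shortcut a programmer under deadline actually writes:
      * the euro amount is buried in the logic itself, and there
      * may be thousands of these scattered across the portfolio.
           IF WS-BALANCE > 1000.00
               DISPLAY "ELIGIBLE (HARD-CODED TEST)"
           END-IF
           STOP RUN.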
I should add that although the first 23 years of my career were spent on mainframes, the last 21 were spent at Columbia on leading-edge technologies of the sort that he describes as “sophisticated” and “homogenized”. When I was hired by Columbia University in 1991, it was to make recommendations about exactly such technologies in my capacity as Development Technology Coordinator. Later on, once such technologies were adopted, I spent over 15 years designing and programming financial applications in Java using the Struts framework. Additionally, I supported that application’s Sybase backend using Perl and other Unix-based tools. Finally, part of my retirement contract involved being available on a contingency basis for technical support as the need arose. Even now I stay in touch with my colleagues to give them my take on future IT directions.
Dr. Mitchell also seems to have missed the point I was making about historical data:
These include the historical presentation of records, for example, bank statements. These problems were already encountered and solved in the transition to the euro. There is no reason to suspect that any new issues have arisen. The Bank of Greece knows how to do this and could easily issue a procedural manual to the commercial banks and other financial institutions.
But my point was that ad hoc software would have to be developed to modify historical data. For example, just to repeat myself, if the United States elected a Marxist president and adopted a new currency called the Rosa that was pegged at 10 Rosas to the dollar, you would have to develop software that went through the databases and multiplied every occurrence of a cash-denominated field by 10. (Let’s hope we’ll see that someday.)
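Here is a minimal Cobol sketch of that kind of ad hoc conversion, again with invented field names and the imaginary peg. The multiply itself is the easy part; the real job is applying it to every cash-denominated field in every record of every historical file, and then proving that nothing was missed.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. ROSACONV.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * Imaginary peg: 10 Rosas to the dollar.
       77  WS-PEG-RATE      PIC 9(3)V99   VALUE 10.00.
      * One cash-denominated field out of the thousands that live
      * in historical records across the bank's files.
       77  WS-HIST-BALANCE  PIC 9(11)V99  VALUE 1234.56.
       77  WS-BALANCE-ED    PIC Z(10)9.99.
       PROCEDURE DIVISION.
      * The arithmetic is trivial. Finding every field that needs
      * it, and re-running years of history, is not.
           MULTIPLY WS-PEG-RATE BY WS-HIST-BALANCE
           MOVE WS-HIST-BALANCE TO WS-BALANCE-ED
           DISPLAY "CONVERTED BALANCE: " WS-BALANCE-ED
           STOP RUN.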
Finally, if I understand Mitchell correctly, he seems to be saying that you could dust off the pre-euro conversion software from 1999 or so and use it to replace current-day systems. That would be fine if there had been no modifications made in the past 16 years to incorporate new business rules. But as we know, financial applications are highly dynamic, since the industry is always alert to opportunities to boost corporate profits at the expense of the poor customer. Who knows? Maybe when the entire world converts to the Rosa, or even when money is no longer necessary, we will not have to face such problems, but in the meantime reality must govern all major policy decisions, including ones that revolve around information technology—the nervous system of any modern economy.