Essays from a "Life in Code"

Ellen Ullman's new book, a collection of essays drawn from more than 20 years as a software engineer and observer of technology, reflects on life as a coder through its evolution from the PC era to the cloud, and provides an uncanny preview of how technology has changed our lives for better and worse. A few of these essays are summarized here.

Ellen Ullman's prescient vision of the future, laid out in essays begun more than 20 years ago, earned my respect, while her eloquent writing captured my attention.

If the code creates programs that do useful work for regular human beings, it is called “higher.” Higher-level programs are called “applications.” Applications are things that people use. Although it would seem that usefulness by people would be a good thing, from a programmer’s point of view, direct people-use is bad. If regular people, called “users,” can understand the task accomplished by your program, you will be paid less and held in lower esteem. In the regular world, the term “higher” may be better, but in programming higher is worse. High is bad.

The above, written in a 1994 essay by Ullman, is part of her new book, Life in Code: A Personal History of Technology. The collection of essays, presented in chronological order, spans 23 years, up to 2017. Ullman worked as a programmer from 1978 through the 1990s. Her essays highlight some of the major changes and milestones of the 40 years since her entry into the profession. After the '90s she transitioned into writing, with several novels, non-fiction books, and essays in leading publications to her credit, acclaimed by both critics and readers.

She writes in a narrative form that any non-programmer like myself can understand and enjoy, while also writing from a personal perspective that may bring about a collective "Exactly!" from experienced programmers. In reading her early essays I was amazed at her ability to see where technology was headed. Either she was very prescient or I was stuck in the present with Pearl Jam and Friends. I think it was more of the former. My favorite essays, all from the '90s, are briefly summarized below using Ullman's own words in block quotes, as they are much better than mine.

Close to the Machine
Out of Time is a brilliant essay from 1994 that provides a glimpse into the psyche of the software engineer. She describes the hierarchy within programming, as in the excerpt at the beginning of this post, where lower-level programming, referred to as being "close to the machine," is really at the top. These are the programmers who talk to the machines, translating back and forth between our high-level, real-world languages and the lower-level machine language of chips, memory, and kernels. They live in a time asynchronous from the rest of us. The machine is ultra-logical and ultra-literal: it will do exactly, and only, what you tell it to do. As a result, bugs and errors are a norm the programmer must contend with constantly, and the full attention to detail that "mind-time" demands requires that real-time interactions and norms be ignored.

People imagine that programmers don’t like to talk because they prefer machines to people. This is not completely true. Programmers don’t talk because they must not be interrupted. This inability to be interrupted leads to a life that is strangely asynchronous with the one lived by other human beings. It’s better to send email than to call a programmer on the phone. It’s better to leave a note on the chair than to expect the programmer to come to a meeting. This is because the programmer must work in mind-time but the phone rings in real time. Similarly, meetings are supposed to take place in real time. It’s not just ego that prevents programmers from working in groups— it’s the synchrony problem. To synchronize with other people (or their representation in telephones, buzzers, and doorbells) can only mean interrupting the thought train. Interruptions mean certain bugs. You must not get off the train.



This closeness to the machine and life on asynchronous time can lead to a lifestyle that seems out of step with the rest of us.

Getting closer to the machine means midnight dinners of Diet Coke. It means unwashed clothes and bare feet on the desk. It means anxious rides through mind-time that have nothing to do with the clock. To work on things used only by machines or other programmers— that’s the key. Programs and machines don’t care how you live. They don’t care when you live. You can stay, come, go, sleep, or not. At the end of the project looms a deadline, the terrible place where you must get off the train. But in between, for years at a stretch, you are free: free from the obligations of time.

Ullman relays the story of Frank, a programmer on the "higher" side working on applications for users (normal people rather than the machine). As a result he was punished, and his punishment was to talk only to regular people.

Frank was thinking he had to get closer to the machine. Somehow, he’d floated up. Up from memory heaps and kernels. Up from file systems. Up through utilities. Up to where he was now: an end-user query tool. Next thing, he could find himself working on general ledgers, invoices— God— financial reports. Somehow, he had to get closer to the machine... Frank became a sales-support engineer. Ironically, working in sales and having a share in bonuses, he made more money. But he got no more stock options. And in the eyes of other engineers, Frank was as “high” as one could get. When asked, we said, “Frank is now in sales.” This was equivalent to saying he was dead.

Later in the essay, Ullman writes about the emergence of graphical interfaces and higher-level programming languages that begin to bring us all closer to the machine. But is this a good thing? Does being closer mean having more control? This is where she really sees the future - our present - with amazing clairvoyance.

In the workplace, home office, sales floor, we will be “talking” to programs that are beginning to look surprisingly alike: all full of animated little pictures we are supposed to pick, like push buttons on a toddler’s toy. The toy is supposed to please us. Somehow, it is supposed to replace the satisfactions of transacting meaning with a mature human being, in the confusion of a natural language, together, in a room, at a touching distance. As the computer’s pretty, helpfully waiting face (and contemptuous underlying code) penetrates deeply into daily life, the cult of the boy engineer comes with it. The engineer’s assumptions and presumptions are in the code. That’s the purpose of the program, after all: to sum up the intelligence and intentions of all the engineers who worked on the system over time, tens and hundreds of people who have learned an odd and highly specific way of doing things. The system contains them. It reproduces and re-enacts life as engineers know it. Soon we may all be living the programming life: alone, floating in mind-time, disdainful of anyone far from the machine.

Sound familiar?

The End of Programming
In Ullman's 1998 essay, The Dumbing Down of Programming, she looks further at the shift from "coding" to using programming tools. She starts with the purchase of a Linux OS after going in to buy Windows NT software. She strips down her machine, taking everything Microsoft off of it. The Linux install was bare-bones, and she would program the old-fashioned way. She contrasts that with Windows' use of wizards, tools, and hidden code, where she couldn't tell whether a problem was in her programming or in Microsoft's. From that experience she debates the changes that replace technical expertise with "easy." Are there costs to this change?

My programming tools were full of wizards. Little dialogue boxes waiting for me to click “Next” and “Next” and “Finish.” Click and drag, and— shazzam— thousands of lines of working code. No need to get into the “hassle” of remembering the language. No need even to learn it. It is a powerful siren-song lure: You can make your program do all these wonderful and complicated things, and you don’t really need to understand...In this programming world, the writing of code has moved away from being the central task to become a set of appendages to the entire Microsoft system structure. I’m a scrivener here, a filler-in of forms, a setter of properties. Why study the technical underbelly, since it’s already working— since my deadline is pressing, since the marketplace is not interested in programs that do not work well in the entire Microsoft structure, which AppWizard has so conveniently prebuilt for me?

The essay reminded me of challenges within development and system administration where there is a disconnect between how something is done and what is actually happening: clicks and selections become the process of doing, and we understand only the final result we expect to achieve, without realizing what happens "under the hood" to get there.

Yet, when we allow complexity to be hidden and handled for us, we should at least notice what we are giving up. We risk becoming users of components, handlers of black boxes that do not open or don’t seem worth opening. We risk becoming people who cannot really fix things, who can only swap components, work with mechanisms we can use but do not understand in crucial ways. This not-knowing is fine while everything works as we expected. But when something breaks or goes wrong or needs fundamental change, what will we do except stand helpless in the face of our own creations?


Y2K and the End of Days

Ullman's essay, What We Were Afraid Of As We Feared Y2K, corrected a few incorrect memories I had of Y2K. I was just six months or so into my professional transition from engineering into IT at the beginning of 1999. My memory of the event was that everything worked out at the start of 2000 and that the whole thing was way overblown. I was right about the first part, but after reading the essay I realized I was off base on the second.

The essay starts in February 1999 and closes just a few hours after midnight on New Year's Day, 2000. Over the course of the year, Ullman speaks with a variety of programmers in key industries and attends conferences set up to discuss Y2K. As a seasoned programmer, she was sincerely concerned about the issues that might arise when the year changed.

The problem can be summarized as follows: Computers have been handling dates with the years represented by two digits: 98 for 1998, 99 for 1999, and so on. Which brings us to 2000, when machines will see the year as 00. From the dawn of the modern computer era to this day, digital systems have never seen a “today’s date” in which the year was not in the range of 40 to 99. What will happen when they encounter 00? No one precisely knows.
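
The arithmetic behind the bug is easy to sketch. This minimal example (my own illustration, not code from the essay) shows how an elapsed-year calculation that stores only the last two digits of each year behaves before and after the rollover:

```python
def years_since(start_yy: int, today_yy: int) -> int:
    """Elapsed years, computed the way many legacy systems did it:
    using only the last two digits of each year."""
    return today_yy - start_yy

# An account opened in 1995, checked in 1999: works as expected.
print(years_since(95, 99))   # 4

# The same account checked in 2000, which the machine sees as year 00:
print(years_since(95, 0))    # -95, a nonsense negative span
```

Any program that went on to use that negative value, for interest, expiration, scheduling, or sorting, would misbehave in its own particular way, which is why no one could say precisely what would happen.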

Her travels and interviews reveal two different perspectives - perhaps insiders and outsiders would be the best, if not ideal, labels. The outsiders, including the media and leading pundits like economist Ed Yardeni, felt doom was inevitable and seemed to blame one segment exclusively.

“No more programmers working without adult supervision!” declaims Edward Yardeni, chief economist for Deutsche Morgan Grenfell and celebrity stock-market analyst...Yardeni tells the audience that all that Y2K code cannot possibly be fixed in time. The millennium bug, proliferating through the economy, will bring on a world recession on the order of the 1973– 74 downturn, after which came a decade of limping growth. All this will occur, he says, because the world’s systems “were put together over thirty, forty years without any adult supervision whatsoever.”

Ullman was angered by this perspective. She felt that technology moves step by step, and that at each step the programmer has to figure out how to meet the next one.

These systems did not see “it” coming because the future comes step-by-step. One day we are amazed that we can keep a general ledger without papers and pens, the next we want all financial data always available; then, in no time, we want to query great masses of data. We start with joy that a single person can experiment with a computer. Soon we want to connect that computer to another, and another, and another, until we are perpetually and ubiquitously connected. The human reaction to the technology itself— our using it, imagining what else we can do with it— determines what the future will be. Technology is not the driver of change; what drives technology is human desire.

The other perspective is provided by the many programmers and their leaders she talked with during the year. They had a sober view of the problem at hand: it was large, but it could be solved. They worked hard to develop tools to review the code of their systems, some ancient by technology standards, and then made the changes necessary to avert the doomsday scenario. For example, Texaco engineers and programmers were concerned about systems that monitor oil flow to customers and weather patterns in the Gulf of Mexico, information crucial to keeping the company, and many other companies, running. They worked their way through these challenges, and the process was repeated all over the world.

The close of the essay is a scene from the New Year's celebration Ullman is hosting at her home that evening. She and her guests celebrate and wait to see what happens after midnight. Midnight came and went all across the globe, and there was no doomsday. The recession that did eventually come had nothing to do with Y2K, or with the programmers who needed "adult supervision".

I think of all the good programmers and testers. Jim Fuller, Lawrence Bell, the guys from the railroad, Jay Abshier, Robert Martin, Fred Cook, my colleagues and friends, all the technical people blamed for the looming end of time. I am glad for the coming of the year 2000, almost sorry for the lack of a failure tonight. The outside world may not understand what peril we were under and what my colleagues have done to keep us from it. Yet I know, as the next weeks and months unfold, there will be some problems that are “locally severe,” in Abshier’s words. By then I hope society will remember its fear, and its relief, and feel a little fear again, and know that the danger was real.

Sometimes when good things happen instead of the dreaded, we look back and blame our initial perception for the fear. That may sometimes be justified, but an equal possibility is that we weren't wrong to fear - that using the fear to think clearly, understand the problem at hand, and work diligently toward a solution was the reason for the happy non-event. The first interpretation seems to be what happened with Y2K for most of us, except for a small group of unlikely heroes.



Fake News and the Loss of Faith in Expertise

A democracy, indeed a culture, needs some sustaining common mythos. Yet, in a world where “truth” is a variable concept— where any belief can find its adherents— how can a consensus be formed? How can we arrive at the compromises that must underlie the workings of any successful society?

The above isn't from yesterday's opinion section of the NY Times. It comes from Ullman's essay, The Museum of Me, written back in 1998, when the world had about 150 million internet users compared to today's 3.8 billion. A recurring theme in the Ullman essays I've read is a trend in our society from community and interdependence toward hyper-individualism. Not an individualism that provides variety within a community, but one that excludes community - at least a physical one. This happens when we all get close to the machine, as in Out of Time, and it happens even more through the enabling technology of the internet.

Walking down the street near her home in San Francisco, she sees a billboard with nothing on it but a few words against a background of brilliant sky - "now the world really does revolve around you". Coming closer, she sees a URL at the bottom indicating that it was an ad for a semiconductor equipment maker. It was the expression of a world where we, as individuals, can have anything we want, when we want it, without the old-fashioned intermediaries of the physical world.

Through a process known as “disintermediation,” producers are removing the expert intermediaries, the agents, brokers, middlemen, who until now have influenced our interactions with the commercial world...Removal of the intermediary. All those who stand in the middle of a transaction, whether financial or intellectual: out! Brokers and agents and middlemen of every description: goodbye! Travel agents, real-estate agents, insurance agents, stockbrokers, mortgage brokers, consolidators, and jobbers— all the scrappy percentniks who troll the bywaters of capitalist exchange— who needs you? ... Small retailers and store clerks, salespeople of every kind— a hindrance, idiots, not to be trusted. Even the professional handlers of intellectual goods, anyone who sifts through information, books, paintings, knowledge, selecting and summing up— librarians, book reviewers, curators, disk jockeys, teachers, editors, analysts— why trust anyone but yourself to make judgments about what is more or less interesting, valuable, authentic, or worthy of your attention? No one, no professional interloper, is supposed to come between you and your desires, which, according to this idea, are nuanced, difficult to communicate, irreducible, and, most of all, unique. 

The prophetic words she wrote 20 years ago are our current reality. Lawyers, doctors, drivers, and even programmers are the current intermediaries that AI may disintermediate. She goes on to discuss another current trend, the one we label Fake News.

Physical reality— the discomfort and difficulty of abandoning one’s normal life— put a natural break on the formation of cults, separatist colonies, underground groups, apocalyptic churches, and extreme political parties. But now, without leaving home, from the comfort of your easy chair, you can divorce yourself from the consensus on what constitutes “truth.” Each person can live in a private thought bubble, reading only those websites that reinforce his or her desired beliefs, joining only those online groups that give sustenance when the believer’s courage flags.

Final Remarks
I wasn't familiar with Ellen Ullman or her writing until I came across her latest book, containing the essays summarized here, a few weeks before writing this post. After just a few pages I was hooked. She is a talented writer who engages the reader with a narrative style that keeps you turning the page. She was also a programmer for 20 years, a span that ran from the era when PCs, minicomputers, and mainframes coexisted to one where the internet was exploding into our lives. With that experience, and the talent for communicating it, she presents a vision of the history of technology and of how it has shaped, and will continue to shape, our humanity, for better or worse.

That history of technology was also a part of my history, both professional and personal, and I am a willing participant in both realms. When I was a college student in the mid-'80s, using a computer meant going into a special building, and then into a special room inside that special building, to use it. There was even a room with Macs for word processing. There was a beginning and an end to my getting close to the machine (or as close as I could get) as I entered and left the room. Today my kids, who aren't yet in high school, exemplify the vision Ullman had over 20 years ago. They are close to the machine anytime, and most of the time, through an iPhone, iPad, or laptop. At school they use laptops and smart boards and take coding classes, so there is no escape from the machine. But they are not rebels like the hacker-programmers of the early days; they are marching to the same drumbeat as the rest of us. We encourage their getting close to the machine, and we fear their being left out if they stop. We see it as continued progress, and in many ways it has been and will continue to be, but we also see that there may be some harm being done.

When I read a book like Life in Code, I am reminded that with every gain there is a loss. As we get closer to the machine, we in many ways get further from our humanity, our community, and our interdependence within that community. Ullman captured this extremely well, and long before I saw it coming. You may not share or agree with all of her perspectives; I'm not sure that I do either, but her perspective is worth reading and understanding. One can be skeptical about technology without unplugging from it. Balancing the gain and the loss, as a professional, an individual, and a parent, can be a challenge.

We just have to hope the loss is less than the gain.