V. 4.0

DISCUSSED: Copyright Infringement, Self-Imposed Solitude, Repetitive Linguistic Acrobatics, “The Way The Business World Turns,” Programming Manuals as Life-Rafts, Clint Eastwood, Luddites, Techno-Masochism, Abstraction, Eternal Servitude, Geek-Lit, Peer Pressure, Teaching by Example, Precision and Order, The Turing Machine, Sharing, Consensual Hallucination, Pizza.


David Ng

It hasn’t been easy being a computer programmer these past few months, what with a half-dozen glossy weeklies screaming at you in hundred-point font: Is Your Job Going Abroad? Add to that SCO threatening to sue Linux users into the ground for copyright infringement (among a host of other bullshit charges), and you, Poor Programmer, have just received a double whammy in the jaw and the nuts.

Not four years ago, the IT bandwagon was the best ride in town. Then came the bust, of course, and now—well, let’s just say I know more than a few coders who’ve mused half-jokingly about moving to Bangalore. Alarmist hand-wringing? Programmers have always been a nervous bunch, prone to paralyzing social insecurities coupled with verbal over-expressiveness. Indeed, what other profession (besides writing) combines self-imposed solitude with repetitive linguistic acrobatics?

To be perpetually caught up in our own code, literally and figuratively, is the programmer’s fundamental dilemma. Job flight overseas may just be the latest manifestation of the “way the business world turns” (per Believer contributor Jyoti Thottam’s recent Time cover story) but it’s also true that a technologist’s job is to ruthlessly remove inefficiencies from everyday work—to streamline, to reduce. And a fundamental part of those inefficiencies is the technologist himself, a needy and expensive human being who will inevitably be replaced with workers who are less needy and less expensive.

Thus does a programmer seal his own fate by participating in a profession founded on the supremacy of best design. HAL kills his creator. The world of computer programming is filled with contradictions and ironies. To understand it fully, one must become part of it, because in the end, programming culture is computer code—syntax, logic, operating systems, telecommunications. A bitstream of pure energy. It’s not enough to stand outside looking in. The stream has to flow through you.

Much has been written about how cool it is to be a programmer. Much less has been written about the day-to-day reality of writing computer code—the tedium, the career-related anxieties. Increasingly expendable, programmers today are faceless foot soldiers in a war waged by their rapacious employers, whom the media like to both worship and vilify. (60 Minutes II dubbed Oracle’s Larry Ellison the World’s Most Competitive Man, alternating scenes of boyish gumption with moments of pure megalomania. Apparently, a benevolent dictatorship is still a dictatorship.)

Reclaiming computing culture from Fortune 500 hyperbole means going back to the source code, to a place where a semicolon can mean the difference between success and failure. It’s only at this hyper-granular level that we can hope to undo the webs we’ve woven for ourselves.


As any programmer will tell you—and I was a programmer once, so I write with a sense of (expired) authority—the only things you need to master a computer language are a good manual and a general willingness to fuck around with a machine. The latter may not be to everyone’s taste (computers resemble black boxes for a reason) but the former need not be the daunting nosedive into the unknown that many assume it is. Programming manuals are more than just their own hyper-specialized literary genre: they are the life-rafts, the external knowledge stores, the professors, the tutors, the cheat sheets—the sine qua non of even the most experienced coders.

A journey through the Computer Reference section of your local B&N can feel at times like wading through a glutinous alphabet soup: C, C++, C#, PHP, SQL, ASP, VB, J++, HTML, etc. A true appreciation of programming culture means immersing yourself in these elephantine tomes, absorbing their cryptic cryptology as if it were some strange and unknowable potion. Where does one begin? I started with Perl—a tireless workhorse of a language that, despite its age (nearly twenty years old, i.e., positively ancient), continues to surprise with its durability, like a late-career Clint Eastwood.

You, the nonprogrammer, may as well begin with a few basic definitions. According to the latest edition of The Penguin Concise Dictionary of Computing, a programming language is an invented language “designed to be more readable by humans than binary machine code instructions are.” Similarly, Wikipedia states that a “primary purpose of programming languages is to enable programmers to express their intent for a computation more easily than they could with a lower-level language or machine code.”

In other words, computer languages exist for the sole purpose of being translated (or, in comp-sci parlance, “compiled” or “interpreted”). They are an intermediary linguistic layer—the tomato sauce in between the mozzarella of English and the crust of binary code. (Flat food like pizza—as well as Kraft Singles and Doritos—being the coder’s preferred nourishment, as described by Douglas Coupland in his 1995 cult novel microserfs, more on which later.) It’s hard to imagine any language designed merely to serve other languages, but whether it be something as basic as, well, BASIC, or something as complex as Java, computer code serves as the crucial link between human creativity and cold machine circuitry. It helps, I find, to differentiate levels of human readability among languages—micro nuances that usually only programmers notice or care about. Take Visual Basic, a language that was developed by Microsoft and hence is despised by most self-respecting programmers.

(Microsoft is notorious for co-opting the names of long-established programming languages, like BASIC, and reconfiguring them as its own by adding the word Visual or the # symbol. Hence such bastard languages as Visual C++ and C# and Visual J++.) A snippet of code from a VB program might look something like this:

String = "Hello World"
Print String
End

Even a diehard Luddite could extract rudimentary meaning from this sliver of syntax. The programmer is merely telling the computer to print the words “Hello World.” Because VB looks so much like plain, written English, it places embarrassingly low on the coolness scale for programmers who prefer something less user-friendly. Something like Java. A similar piece of Java code would look like this:

public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello World!");
    }
}

(Source: Java in a Nutshell)

Much less readable, even though it performs the same function. To deconstruct it, we can look at the last line ending in a semicolon (all commands in Java must terminate with a semicolon), which tells the computer to print the words “Hello World” to the screen. That solitary command is encapsulated in the routine called “main,” which is itself encapsulated in the überprogram called HelloWorld. This explanation is egregiously simple—a more detailed one would require a lesson in object-oriented programming. Thus Java is très cool—a land beyond the grasp of mere mortals where the best programmers can play.

Above all, programmers prize brevity—the ability to write a program in the fewest lines possible, an undeniable sign of a highly evolved mind. Wussy languages like VB (which involve long, spaghetti syntax) aren’t conducive to sleek code-writing. Real programmers love compact code that often verges on the hieroglyphic. A slice of Perl syntax like

^[ \t]*([a-zA-Z][\w]*)[ \t]*=[ \t]*(([\w]+)|\"(.+)\")
(Source: OnLamp.com)

would trip up some of the best computer jocks. Given time, any reasonably trained Perl coder could decipher its meaning (it is a regular expression designed to parse name/value pairs in a long data string) but the point is aesthetic: it does a lot of shit in very little space. Plus it’s awesome to look at.
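For the curious nonprogrammer, the same parsing idea can be written out longhand. What follows is a minimal sketch, not the OnLamp original: a hypothetical Java class (the NameValue name and the simple key = value grammar are my own assumptions) that uses a regular expression to pull a name and its value out of a line of text.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// A sketch of name/value parsing in the spirit of the Perl snippet above:
// extract "key = value" pairs from a raw line of text.
public class NameValue {
    // Group 1: the name (a word starting with a letter).
    // Group 2: the value—either a bare word or a quoted string.
    private static final Pattern PAIR =
        Pattern.compile("^[ \\t]*([a-zA-Z]\\w*)[ \\t]*=[ \\t]*(\\w+|\"[^\"]*\")");

    // Returns {name, value}, or null if the line doesn't fit the grammar.
    public static String[] parse(String line) {
        Matcher m = PAIR.matcher(line);
        if (!m.find()) {
            return null;
        }
        return new String[] { m.group(1), m.group(2) };
    }
}
```

The Perl one-liner and this Java class do roughly the same work; the difference is that Perl compresses the whole affair into a single hieroglyphic line, which is precisely the aesthetic point.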

This fetish for all things indecipherable and abstruse extends to programming environments as well, especially programming editors, the text tools in which code is written and modified. True programmers—and, shamefully, I fail to qualify in this respect—wouldn’t in their wildest Mountain Dew–induced trances dream of being caught using anything GUI-based, i.e., any point-and-click monstrosity that, god forbid, color-codes your syntax for easier comprehension and debugging. No: the initiated will henceforth be restricted, under threat of eternal ridicule, to using “vi,” the Unix ASCII editor that not only turns your user-friendly cursor keys into shriveled, vestigial appendages, i.e., completely useless, but transforms every common operation like “save,” “delete,” and “page up/down” into convoluted combinations of keystrokes that require the phalangeal dexterity of a concert pianist. Call it techno-masochism: programmers are known to use vi for all of their text needs, including writing term papers—as some of my colleagues claim to have done. One wonders just how powerful vi must truly be. Admittedly, I’ve seen people write code at close to Mach 3 once they’ve mastered all its commands—trust me on this one.

So scoff if you must. Roll those eyes. This monastic self-denial of ordinary computing luxuries may be a kind of snobbery, but it’s also a kind of purity, a techno-transcendence.

In the end, what programmers do can best be described as “abstraction”—the process of “replacing the specific numbers and strings in a particular instance of a problem by variables and functions, so that the same program can solve many problems of the same kind,” per Penguin’s Dictionary. This is undoubtedly laborious work, not at all Fortune-worthy, let alone Time or Newsweek. Here are people whose sole purpose is to create something that never breaks, i.e., to spin logical perfection out of human irrationality, to transform the specific into something universal.
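To make the Penguin definition concrete, here is a deliberately trivial sketch in Java (the class and method names are invented for illustration): the same arithmetic written twice, once as a specific instance and once with its literals abstracted into parameters.

```java
// Abstraction as the Penguin dictionary defines it: the specific numbers
// in one instance of a problem (8% tax on a $25.00 sale) are replaced by
// variables, so the same code can solve many problems of the same kind.
public class Abstraction {
    // The specific instance: one answer, one use.
    public static double specific() {
        return 25.00 + 25.00 * 0.08;
    }

    // The abstraction: the literals are now parameters.
    public static double withTax(double price, double rate) {
        return price + price * rate;
    }
}
```

The second method subsumes the first: withTax(25.00, 0.08) reproduces the specific case, and every other price and rate comes along for free. That move, repeated at every scale, is the whole profession.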

But please allow this erstwhile coder to disabuse you of a prevalent notion: programmers, for all of their high-priest airs, are among the most error-prone of technical professionals. (Remember Y2K?) Every time your computer goes blue screen or your application crashes, it’s the result of bad coding—some lazy-ass programmer who maybe didn’t terminate a thread properly. Humans are fallible, and computer programs, as our brainchildren, necessarily inherit that flaw—though not in ways we might expect.


If any novel is ripe for revisiting, it’s Douglas Coupland’s microserfs. Today, it seems like a time capsule from a long-ago era of techno-libertarian free love—a mid-nineties ode to the infinite possibilities of digital culture. Come on, baby, light my fire: the quasi-stoner ’tude of Coupland’s programmers can feel almost embarrassing today, heading as they were toward an industry flame-out of multimillion-dollar proportions.

Optimistic they certainly were, but blind they most definitely were not. There’s a philosophically astute email exchange midway through that’s guaranteed to paralyze any programmer (from Java wunderkind to Visual Basic hack) with a (now) familiar jolt of occupational panic. Two Silicon Valley tech-heads (young, neurotic, and impossibly intelligent, natch) discuss their respective career-arcs and discern a trendlette sneaking up on their spoiled demographic. The older of the two, Abe, a Microsoft prole, summarizes it thus:

 The tech system feeds on bright, asocial kids from diveorced [sic] backgrounds who had pro-education parents… just think of the way high tech cultures purposefully protract out the adolescence of their employees well into their late 20s, if not their early 30s. I mean, all those NERF TOYS and FREE BEVERAGES! And the way tech firms don’t even call work “the office,” but instead,“the campus.”

Abe’s e-rant neatly encapsulates another dilemma of the programmer: our profession infantilizes even as it demands intellectual maturity, arresting its best and brightest in a state of eternal servitude.

Unlike the cyberpunk narratives of the eighties, whose hacker ethos evoked a sexy man-machine meld (a notion co-opted whole-hog by the Wachowski brothers at the crest of the dot-com wave) and whose literary styles exulted in too much information, today’s computer literature is adapting to more sober times and is thus more likely to contain a healthy dose of the mundane: a bug that won’t go away; strained relationships exacerbated by immovable deadlines; and, most tellingly, angry venture-capital investors knocking down doors to demand profits—and to confiscate the Nerf weaponry.

If computer science remains a stubborn enigma for many readers, digestible only if served with a heaping side of action, romance, or mysticism, then the challenge for tech-culture writers today is to ruthlessly wean them from those palliatives and toward a more honest, vérité aesthetic. Irreverent, regressive, and mildly profane, microserfs presciently provided a blast of childlike innocence tinged with middle-age hopelessness. As the twentysomething narrator Daniel puts it:

Nerds get what they want when they want it, and they go psycho if it’s not immediately available. Nerds overfocus. I guess that’s the problem. But it’s precisely this ability to narrow-focus that makes them so good at code writing: one line at a time, one line in a strand of millions.

“In the computer world, any book printed more than two years ago is a campy nostalgia item.”
—Neal Stephenson, Cryptonomicon


If the geek-speak in microserfs doesn’t turn you off, O Novice Programmer, then proceed to Brian Kernighan and Dennis Ritchie’s The C Programming Language. A classic in computer-science classrooms, it remains most notable for its brevity. “C is not a big language and it is not well served by a big book,” the authors write, apropos their manual’s 271 pages—a featherweight in physical bulk but the undisputed champ of nonfiction geek-lit.

Fat computer manuals still figure prominently in the comp-sci bookscape, though; and it’s a testament to the unstoppable proliferation of platforms, operating systems, and web functionality that new languages emerge with remarkable regularity. Unavoidable fact of technology: each language does a specific thing well—there is no universal language, no equivalent to English in the computer-science field. Thus, being a programmer means investing in a library of manuals, some of them big, some of them skinny. The more books you have, the more powerful you become—a digital-age Prospero, summoning technical knowledge from your biblio-trove with a magician’s facility and speed. In keeping various languages straight, I find it helps to either personify or analogize. Choose your metaphor: if C is a tugboat, exerting its brute power in compact ways, then Java is an aircraft carrier, loudly boasting ultra-expensive functionality and an unstoppable inertia. In his 1999 manifesto “In the Beginning Was the Command Line,” Neal Stephenson compares computer operating systems to cars: Windows 95 is a “colossal station wagon,” Windows NT is a “hulking off-road vehicle intended for industrial users,” while Linux is a chaotic bunch of RVs “set up in a field and organized by consensus.”

The prevailing metaphor is zoological, however, a tradition that began in the late eighties with O’Reilly & Associates, the most respected computer-book publisher today. Printed on the cover of each of its manuals is an animal intended to embody the spirit of the language. Java is to a Bengal tiger as Perl is to an Andean llama; SQL is to a chameleon as HTML is to a koala; VB is to a turtle (perfect!) as PHP is to an iguana. The serpentine languages Python and ASP are represented by a python and an asp, respectively.

These Darwinian associations are more appropriate than the people at O’Reilly may realize. Evolutionary in nature, computer languages typically start as informal academic tools, gradually finding their way into the public domain, proliferating at exponential rates (think PHP), and eventually dying out as more efficient languages inevitably emerge. Extinct languages like FORTRAN and COBOL (the dodo and the passenger pigeon?) can still be found on many systems, but there’s little new code written in them, and finding a manual on them can be difficult. Meanwhile, C has long since reached its evolutionary crest: the Kernighan/Ritchie book’s status as a classic bespeaks C’s long goodbye from live server farms to greener pastures in high-school classrooms across America.

O’Reilly currently publishes over three hundred manuals—a literary menagerie of languages and operating systems—almost all of them written by programmers for programmers. Call it snobbery but coders swear by O’Reilly and look down on anyone who would willingly associate themselves with an inferior book. (On my first programming gig, I showed up toting a five-hundred-page eyesore entitled Teach Yourself CGI Programming with Perl 5 in a Week; by week’s end, I was carrying O’Reilly’s Learning Perl—half as long and twice the nerd cachet.)

Implicit in this peer pressure is a disdain for books heavy with sample code. The maxim that it is easiest to teach by example holds in computer programming all too well, and those manuals that boast obscene girths are that way because they offer pages of code that lazy programmers can copy, with minor modifications, and insert directly into their own programs. The best coders, of course, don’t need this sort of crutch, this tool of the feebleminded. They want only the essential information—a program name and its parameters; a function and its arguments—and will gladly work out the rest. No unsolicited help, thank you very much.

Books that breach this aesthetic have no place on the programmer’s desk. That such a thing should matter at all confirms the microserfian duality of arrested emotional maturation and left-lobe overdevelopment. Computer books are our haute-couture accessories—tiny details that signify everything.

Any representation of computing culture, whether in a novel, a movie, or a literary magazine, must be able to evoke the often-overlooked misery that comes with the profession. In truth, a programmer lives and dies by a single, terrifying question: will my program work?

Like commoners beseeching a monarch, we offer up our code to the all-knowing compiler—that absolutist piece of software that turns our code into machine language—and wait in terror as it decides to accept or reject it. A successful compile rings like a church bell—your code is, for the time being, error-free. A rejection is like being cast into a purgatory where you must toil without rest until you’ve found the glitch in your program. The hours spent hunting down a bug have led many programmers (this one included) to states of near-insanity. The real world recedes; your code becomes your universe, and you are both its creator and its captive.

In many ways the antithesis to microserfs, Ellen Ullman’s The Bug (2003) recreates that sense of purgatorial huis clos with scary accuracy. (Ullman, after all, is a twenty-year industry veteran.) The nebbishy protagonist Ethan Levin is haunted by a bug that resists all means of isolation. Weeks, months go by, and Ethan grows by turns frustrated and humiliated, the giggling gremlin continuing to mock him from the other side of the cyber/real-world divide.

The implicit paradox in The Bug is that while programs never err (they just do what they’re told), their creators often do. “Programming: a dense, precise, and ordered conversation, an argument in the formal sense, with reason, without heat,” Ullman writes, almost ironically. Certainly Ethan’s universe is neither precise nor ordered. After his wife cheats on him and then abandons him, he regresses into a pathetic, neo-bachelor decrepitude. The bug seems to have infected him as much as his system, gnawing away at his insides like a voracious worm. In the end, this parasite even manages to outlive its host.

Or has Ethan been absorbed by his own program, shedding flesh for digital fantasy? If microserfs presents the programmer’s life as a grinning rictus, then The Bug provides the refreshing opposite by revealing a Munchian grimace of despair—a hostile world where the only haven for a misunderstood programmer is inside his own head, i.e., inside his own code. The novel’s Northern California locales seem almost irrelevant. Ethan exists exclusively in his “tangled trails of code calling code calling code.” Though he may have retreated fully into the machine, his demons have followed him there. In one passage, a simple UNIX memory-allocation routine, malloc, takes on horrific dimensions as Ethan forms a string of self-perpetuating pessimisms from its otherwise innocuous name: “Malloc. Malice. Malaise. Malign. Malinger. Malaria. Mala suerte. Malkin. Malcontent. Malevolent.”


Dense with C-language syntax, The Bug should have come with a Comp Sci 101 prerequisite. But there’s a salutary side effect to the code overload: Ullman is implicitly evoking coding’s multilingual mindset. The profession is among the few in the world that demands proficiency, if not fluency, in several languages at a time. (Ethan’s female counterpart is, like Ullman herself, a trained linguist.)

The one thing that all computer languages have in common, however, is that they are founded in English diction. Consider the following “reserved” programming words: for, loop, if, else, while, do, main, select. Even HTML (which is technically not a language since it doesn’t contain logic) is founded on such words as: table, row, column, layer, and division. Since there is no French version of Java or Japanese breed of SQL, all programmers must work in these English-descended tongues. Computer science may not have a universal language, but as in the spoken world, English is king.
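The point is easy to verify against any snippet of real code. The toy Java methods below are invented for illustration, but every keyword they lean on comes straight from that English-descended vocabulary.

```java
// Every keyword here—public, class, static, for, if, else, while,
// return—is English diction, per the reserved words discussed above.
public class Keywords {
    // "for" and "if" in action: count the even numbers in an array.
    public static int countEvens(int[] nums) {
        int count = 0;
        for (int n : nums) {
            if (n % 2 == 0) {
                count++;
            }
        }
        return count;
    }

    // "if"/"else": classify a number in plain English.
    public static String parity(int n) {
        if (n % 2 == 0) {
            return "even";
        } else {
            return "odd";
        }
    }

    // "while": count down to zero, one step at a time.
    public static int countdownSteps(int from) {
        int steps = 0;
        while (from > 0) {
            from--;
            steps++;
        }
        return steps;
    }
}
```

A French or Japanese programmer writes these same keywords; only the comments and variable names change language.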


At the end of The Bug, the humans win. The bug is vanquished. But Ullman, never one to simplify things, concludes her novel with intimations of a machine rebellion: “between the blinks of the machine’s shuttered eye—going on without pause or cease; simulated, imagined, but still not caught—was life.” Rare is the sci-fi novel that doesn’t toy with the notion of extra-human intelligence—from the confused robots of Philip K. Dick and Brian Aldiss to the celestial “beings” in the novels of Arthur C. Clarke. But how does computer intelligence look to programmers themselves, for whom life spent creating machine logic might yield some special insight, or merely incredulous cynicism?

If microserfs is any indication, they are no less bewildered or philosophically awed. As Daniel muses early in the novel:

What if machines do have a subconscious of their own? What if machines right now are like human babies, which have brains but no way of expressing themselves except screaming (crashing)? What would a machine’s subconscious look like? How does it feed off what we give it? If machines could talk to us, what would they say?

Such questions may not keep many real-life programmers up at night, but I can say that they do lurk in the back of our minds—a background process as opposed to a RAM process. To program is arguably to create a rudimentary form of intelligence. Are we not, therefore, playing God in some way? How responsible are we for our own creations? The novel Turing (2003) attempts to answer those very questions by going in two directions at once—into the past and toward the future. UC Berkeley comp-sci professor Christos Papadimitriou has baptized his first novel with the name of one of the field’s great geniuses, Alan Turing. Known primarily for his code-breaking work during World War II, Turing also worked in AI and developed the famous Turing Test, which states that if an interrogator is unable to tell the difference between human and computer responses (assuming all communication is textual), then the computer has achieved intelligence.

Seems almost too simple. In his novel, Papadimitriou puts the Test to the test by creating an interactive web-bot that may or may not be run by a real person. The eponymous bot appears erratically—a literal ghost in the machine—spilling out lectures about math, ancient history, and technology to those patient enough to sit through its screens of scrolling text. (Again, readers’ tolerance for things like binary arithmetic will be sorely tested.) Turing appears variously to a cast of romantically challenged loners who, like Coupland’s microserfs, languish in a teenage mindset of emotional promiscuity. There’s Alexandros, a fiftyish archaeologist vacationing on the island of Corfu; Ethel (named after Alan Turing’s mother), an American businesswoman; and Ian, a legendary jet-set computer hacker.

Set in that cradle of all things mathematical and logical, Greece, Turing name-drops a mini-pantheon’s worth of ancient philosophers (from Archimedes to Pythagoras) to lend itself an air of classical brilliance. Papadimitriou clearly wants us to regard computer science not as a bastard discipline but as a pure study, one with its own set of natural laws, theorems, and godlike celebrities. (In terms of Olympian status, Alan Turing ranks only behind Charles Babbage, whose difference engine is regarded as the world’s first mechanical computer.)

But Papadimitriou doesn’t luxuriate in antiquity: his novel unfolds in a vague near-future where human activity is triangulated between Greece, the United States, and Hong Kong. The novel’s compacted globality already feels scarily within reach. Everyone converses in more or less fluent English. The dollar is the currency of choice. And forgotten economic backwaters flare up with sudden political importance. (As in William Gibson’s similarly time-zone-hopping Pattern Recognition, the hub of the networked world is located in the former Soviet Union—the geographic origin of so many of today’s computer viruses.) In the end, Turing’s only true nationality (or religion) is a pan-technophilia that serves as a cross-cultural adhesive and the source of an imperfect universalism. We are all equal in the eyes of the ubiquitous Net, the novel implies, even if someone—or is it something?—knows our darkest secrets.


Perhaps more significant than anything it has to say about artificial intelligence, Turing explicitly answers the question: what if a novel about computer programmers were written using the actual methods of programming itself, i.e., long-distance collaboration with experts, liberal sharing of source code, iterative improvement, constant testing and retesting?

It was only a matter of time. In true open-source fashion, Papadimitriou posted his manuscript for Turing on a newsgroup and invited public feedback that would eventually be incorporated into the finished text. (A selection of that feedback is included in the novel as an appendix.) As with Linux—the operating system that made the open-source movement famous—Papadimitriou’s original manuscript acted as a kernel—a bare-bones structure that anyone could criticize and improve. (Unlike the case with Linux, however, contributors were not allowed to redistribute their own versions of the novel. Turing remains the proprietary product of its author.) Still, the book’s pseudocollaborative genesis exemplifies the programmer’s mindset: share and share alike.

For veteran coders, the open-source movement is hardly news. We’ve been doing it for years, frankly. (The GNU C compiler and many other Unix tools were born from the Free Software Foundation’s GNU Project, which was intended to create an entire operating system for free.) Sharing programs—and more importantly, making their source code publicly available—is as fundamental to programming as cheese doodles and high-caffeine beverages. Linux may not be the first operating system to be distributed in such a way, but it nonetheless represents a potent shot to the face of Microsoft’s goliath, that all-purpose devil who, we learned a few months ago, will go to any length to keep its source code out of public hands.

Of course, Linux would not be a true operating system without a bestial doppelgänger, and Linux’s creator and namesake, the Finnish computer scientist Linus Torvalds, has done the honors. “I was looking for something fun and sympathetic to associate with Linux,” he once said. “A slightly fat penguin that sits down after having had a great meal fits the bill perfectly…. Don’t take the penguin too seriously. It’s supposed to be kind of goofy and fun, that’s the whole point.” Indeed, Tux the Linux penguin resembles more a plush toy—smiling and squeezable—than a cutthroat O’Reilly jungle dweller.

Turing might have appreciated this nod to childhood. He himself was a bit of an eternal boy, and like so many programmers, fictional and nonfictional, never fully made the transition to adulthood. In 1936, at the age of twenty-four, he developed the theory of the Turing Machine, which many regard to be the foundation of modern computer science. He spent World War II as a codebreaker for the British government. After the war, he held many and various academic positions, working on such mathematical puzzles as the Riemann zeta function. At the age of forty, he was arrested for homosexual acts, and two years later, after being subjected to a humiliating regime of estrogen injections to control his libido, was found dead in his house, an apparent suicide.


It’s telling that so many sci-fi writers have reincarnated Alan Turing in their novels, both as human characters and as intelligent computer systems. (In Neuromancer, Turing is an Orwellian registry designed to track AI units; in Neal Stephenson’s Cryptonomicon, Turing appears as himself, first as a callow Princeton grad student, then as an elusive member of the Bletchley Park dream team.) History weighs heavily on this future-looking literary genre, and in the most recent crop of computer novels, the achievements of geniuses long-dead continue to reverberate through space-time. I haven’t read Mike Heppner’s The Egg Code, or Richard K. Morgan’s Altered Carbon. But as long as computers remain a generational technology, based upon iterative improvements on past accomplishments, so too will computer fiction reach backward to propel itself forward.

Perhaps the latest evolutionary step in computer literature isn’t in fiction at all. Real-life accounts of time in the tech-bubble are, by definition, low on manufactured drama and high on unflattering honesty. This new breed of non-fiction arguably offers the most direct path into a programmer’s head (short of reading programming manuals, of course). Andrew Ross’s No-Collar (2003) provides one of the most detailed plunges into the techno-mentality as it follows the life of the Internet start-up firm Razorfish from inception to launch to spectacular burnout. In its evocation of a failed techno-utopia, No-Collar suggests the frontier spirit of microserfs fatally wedded to the ruthless corporate mentality in The Bug.

Indeed, intimations of science fiction haunt Ross’s book, literally and figuratively. On the wall of one of Razorfish’s offices, a poster of The Matrix presides, a totem of tech-coolness, the messianic Keanu Reeves blessing the masses with his shaded stare. The Matrix’s cyberpunk inflections add an unexpected edge to Razorfish’s story, casting its naïve denizens as a group of rebels intent on taking down the demonic powers that be by using their own technology against them.

Not coincidentally, Ross alludes to his story as being like “a bad science fiction film.” Maybe the more appropriate analogue would be a Gibsonian “consensual hallucination,” in which the collective rah-rah of Razorfish employees concealed a surprising lack of substance behind the business model (a “Potemkin industry,” as Ross puts it). No-Collar also shares more than a few similarities with the world of Philip K. Dick insofar as paranoia and disorder run rampant, and nothing at all seems real. In one telling scene, the detritus that Ross describes as littering the office floors sounds eerily like Dickian “kipple,” the word he coined for the nameless but ubiquitous junk that accumulates in our daily lives.

Far more dystopian in tone and outlook, Ellen Ullman’s memoir Close to the Machine (published in 1997, six years before The Bug) embodies the menopausal malaise of an entire life spent in IT enslavement. With far more persuasion and insight than in her novel, Ullman invokes the paradox of age in an industry obsessed with newness: “The preciousness of an old system is axiomatic. The longer the system has been running, the greater the number of programmers who have worked on it, the less any one person understands it. As years pass and untold numbers of programmers and analysts come and go, the system takes on a life of its own.”

But what about the programmer? The eternal adolescent eventually gets old. In any other profession, age confers respect. In programming, it’s the opposite. What do we do with old programmers? They can’t all go on to teach or write novels. Obsolescence affects man and machine equally. “The corollary of constant change is ignorance,” Ullman writes in Close to the Machine. “A programmer who denies this is probably lying, or else is densely unaware of himself.”


Is it ironic that the world’s first programmer never had that problem? After all, she died at the age of thirty-six. Ada, Countess of Lovelace, née Byron, daughter of Lord Byron, worked with Charles Babbage on the analytical engine in an intense spurt of collaboration from 1841 to 1843. She’s generally considered to have written the first computer program, essentially a list of instructions, or an algorithm, to perform certain mathematical calculations. Her single publication in her lifetime was a set of notes documenting the analytical engine—published pseudo-anonymously under the initials A.A.L.

If Ada’s contributions to Babbage’s work have been greatly exaggerated through the years (as Doron Swade suggests in his non-fiction book The Difference Engine), her mystique remains as alluring as ever—a welcome gust of femininity in a profession historically rank with maleness. Übermuse and intellectual pinup rolled into one, Ada enchants precisely because she embodies both the rational and the irrational—the former is the result of her rigorous mathematical training; the latter, her unstable father’s genetic legacy. Though she and Babbage were not lovers, her impassioned letters to him signify a rapturous, almost erotic attachment to their work. She also possessed a mean egotistical streak. “Owing to some peculiarity in my nervous system,” she wrote in 1841, “I have perceptions of some things that no one else has; or at least very few, if any.”

Her precocity (and Turing’s as well) can remind one of the narcissistic musings in microserfs. (Ada was nearly twenty-six when she wrote that letter, the same age as Coupland’s protagonist.) Clearly technology needs young people—their drive, their heedlessness, and their arrogance. Thanks to them, the past fifteen years have seen warp-speed progress in computing, particularly when compared to the previous 150. (Babbage’s difference engine was built to calculate financial tables; a hundred years later, ENIAC, the first electronic computer, was built to calculate artillery tables. Not a great leap, functionality-wise.)

“The construction of hardware and software is where the species is investing its very survival,” says one character in microserfs. Of all the declarations of machine love in Coupland’s novel, this steadfast belief in the salvational power of computing seems the most dated. As more undergrads flee the confines of engineering halls for the spacious atria of business schools, the future of computer science appears increasingly sparse and anemic. And yet, attempts by Bill Gates and the like to promote the profession as fun and exciting ultimately belie a hard truth. Programming devours you. It maims, it scars, it cripples. It feeds vampirically on the minds of those who have dedicated themselves to its maintenance. If computing is in fact mankind’s savior, it demands sacrifice in return. ✯
