In Defense of Ada

by
Kirby Urner
March 7, 2001

Jim Holt has an article in the recent New Yorker ("The Ada Perplex," March 5, 2001, pg. 88) wherein he takes aim at Ada Byron's reputation as the first computer programmer. He also spends some words debunking the myth that she invented the computer itself -- a rumor I'd not heard before, though it's what the magazine cover advertises as the central question to be addressed inside (clever marketing?).

The cartoon that goes with the piece helps underscore what we're dealing with here: polemics & propaganda. It shows the opium case and "Delusions of Grandeur" potion open at table-side, along with a scrap of paper proving she's a ditz-brain, because she carried forward a printer's typo that sent a cosine off towards infinity -- whereas we all know the cosine function maxes out at one.

[image: "Ada Byron, First Computer Nerd?" The New Yorker, 3/5/2001, pg. 89]

I found the article somewhat mean-spirited, busying itself with the details of her personal life much as a lawyer would attempt to sway a jury by calling character into question in ways rather tangential to the case. Since I'd seen Erin Brockovich lately on video, it called to mind the trial scene wherein Julia Roberts loses out to the emergency room doctor who was clearly in the wrong -- we know because, as quasi-omniscient spectators, we're all eyewitnesses to the collision. Because she's had two husbands already, and cusses enough to make this an R-rated film (tsk), she's clearly undeserving of any compensation. Likewise, Ada had all these lovers and husbands and nervous breakdowns and other dysfunctions, and so couldn't possibly be the kind of person we'd want to dignify in the literature with the august title of "first computer programmer".

More to the point, perhaps, is whether she understood much math. She wasn't a math whiz, Holt makes clear. She was the daughter of a celebrity poet, and she worked hard to bridge what C.P. Snow would later identify as a chasm between two cultures: the humanities and the math-sciences. In her writings, she worked the shuttle between these two worlds, weaving a seamless tapestry (never finished), thereby exciting a lay audience about the possibility of computers, while likewise addressing some of the philosophical questions that would surely occur to the everyday mind: could these machines really think, become intelligent creatures eventually?

Holt considers her answer to the above question ("no") rather facile, saying she neglected to consider how stored-program computers could rewrite their own programs and therefore get out from under their human Frankenstein-creators to some degree, and beat them at chess. This somewhat betrays Holt's own naiveté, I think. Deep Blue relied a lot on brute force -- advances in hardware that made the same algorithms execute much faster. Deep Blue was not some breakthrough in self-modifying neural nets, and to this day you can fool a chess-playing computer with chess situations a child (not a prodigy) wouldn't bungle.

Sir Roger Penrose brings up such chess cases when describing the space of non-computable problems, where the human mind seems to venture and shine in ways that no rule-following artificial intelligence (self-modifying or not) has ever managed. Whether or not Penrose is correct regarding his version of "no" to the same question, the point is that this is not a closed issue, settled long ago in the minds of all truly well-informed thinkers. Ada's position is still tenable.

Does a person need to be a math whiz to be the first computer programmer? By Holt's own argument, if we take the title from Ada, we should bestow it on that French programmable-loom guy (Jacquard). The loom was for weaving fabrics, not for computing Bernoulli numbers or navigation tables. There's no mention of how much calculus went into this loom design -- also no mention, while we're on the subject of the computer's forebears, of programmable musical instruments (like player pianos and such). Anyway, by Holt's own criteria, the answer is clearly "no, you don't have to be a math whiz," as his own candidate for first computer programmer wasn't either (or even if he was, this isn't documented, nor is it proffered as relevant to his case).

Ada's chief sin, and why she's so undeserving of the title apparently, is not that she failed to demonstrate a high level of math skill, but that she seemed to think herself some kind of genius -- this is what's so irritating (to her mother certainly). Her sin was pride, hubris, fancying herself the equal of her betters, even when she clearly had a weak grasp of the cosine function. The purpose of this New Yorker review is to put this upstart tart in her place, to take back this recognition and title from an unworthy, a pretender, an impostor. That so many websites celebrate her contribution to the literature is regarded askance, as a scam perpetrated by know-nothing New Agey types who likewise celebrate crop circles, healing crystals and feminist causes (that was my impression anyway -- perhaps I'm projecting).

But what Ada did is what Babbage wanted her to do (by Holt's own analysis), which was to make the vision of the Analytical Engine compelling and "the talk of the town" in liberal-minded, high-society circles. To accomplish this, she needed enough credibility with the literati to appear well-informed, and she got this by doing her homework, learning enough about the Analytical Engine and how it worked on paper to attempt a program. True, she got bogged down in the details of how to break the Bernoulli number calculations into smaller steps, and Babbage helped her with the algebra. She wasn't a math whiz, as we've already discovered.
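To give a feel for what breaking down those calculations entails, here's a minimal sketch in present-day Python (my own illustration, obviously nothing like Ada's actual table of operations for the Analytical Engine), generating Bernoulli numbers step by step from the standard recurrence:

    # A minimal modern sketch, not Ada's Note G diagram: Bernoulli numbers
    # B_0..B_n as exact fractions, using the recurrence
    #   sum over k = 0..m of C(m+1, k) * B_k = 0   (for m > 0)
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
        B = [Fraction(0)] * (n + 1)
        B[0] = Fraction(1)
        for m in range(1, n + 1):
            # Solve the recurrence for B_m in terms of the earlier values.
            acc = sum(comb(m + 1, k) * B[k] for k in range(m))
            B[m] = -acc / (m + 1)
        return B

    for i, b in enumerate(bernoulli(8)):
        print(f"B_{i} = {b}")

Running it prints B_0 = 1, B_1 = -1/2, B_2 = 1/6, B_4 = -1/30 and so on (indexing and sign conventions vary). Even this small recurrence takes real effort to decompose into the discrete, machine-sized operations the Engine would need -- which is exactly where she leaned on Babbage.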

Ada was inspired by the Babbage engine and what it might portend. She wrapped a lifeless machine (or paper blueprints for same -- it was never actually built) in a mix of fantasy, numbers and charts, and breathed life into it by creating a blend of science fiction and metaphysics that was credible both to her readers and to herself. Disbelief was suspended and dispelled, such that to this day we remember Ada and her antics, and the impact she made on the thinking of her day. She wasn't trying to hoodwink her audience. The exercise with the Bernoulli numbers is proof of her integrity -- she struggled to comprehend the Analytical Engine for exactly what it was, even as she speculated about the future.

Ada got out on stage and played the attractive young daughter of a Romantic poet (which she was), imbued with dawning comprehension of times ahead (which she had). In so doing, she did what many poets have done: used her intuition (perhaps at times opium-enhanced -- tsk) and creative writing skills to anticipate TomorrowLand. True, Babbage wasn't suddenly awash in funding as a result, but wheels of this magnitude turn slowly, as many a poet-futurist has found out.

So we might argue that Ada was merely the first popularizer of computer programming, a celebrity who brought attention to a nascent discipline in ways a more stodgy gentleman never could. But to be a "first popularizer" of something that doesn't really exist yet in the public mind is to achieve more of a breakthrough than simply popularizing. And she not only popularized the idea of programming, but anticipated the genre of speculative literature that would grow up around it: science fiction laced with philosophy. And in so doing, she projected an image of herself as a kind of genius and enchantress, adding color and imagination to what had hitherto been colorless and dryly empirical.

It's no accident, then, that the "myth of Ada" is still around to be debunked to this day. She set her name on a trajectory which has only gathered momentum over time. Again, this is something poets are oftentimes good at, and it's a testament to the power of their magic when they manage to make a lasting imprint without indulging in carnage or increasing the need for new war memorials. Of course, sometimes a magnum opus sets in motion, deliberately or inadvertently, a chain reaction with diabolical consequences. Ada, though, seems to have had a largely benign influence, and those who worship in her temple don't seem especially prone to violence or mayhem -- given the number of personality cults which are, that's another point in her favor.

So I, for one, am in favor of keeping the title of "first computer programmer" right where it belongs, with Ada, because when she jumped so willfully into the deep end, she made an excellent splash with lasting precessional effects. In giving her the title, you might say I'm taking a rather light-hearted approach to the game of "me first" (of priority), which men especially seem anxious to play with grim and loveless fascination. Giving the title to someone as clearly "unmeritorious" (as the jealous zealots might see it) seems to detract from the deadly seriousness of the entire enterprise, turning the earnest business of proving priority into marketing, show business, a dramatic production, a circus, a game touched by comedy, caprice and hype. Yep, that's the spirit. It's a casting decision, and in my judgement this Ada Byron character has done a creditable job, and should keep developing in this role. It's her niche and she's earned it. Others may (and do) disagree, but I'm unmoved to alter my aesthetic judgement at this time.

I also think that articles such as this one in The New Yorker, aimed at cutting her off at this late date from her rightful inheritance, provide all the more reason to keep her entrenched right where she is. Thank you, Ada Byron, and may you live long and prosper in our shared memory.


A draft version of this article first appeared at the Swarthmore Math Forum on March 6, 2001.

