03 April 2013

They Died With Their Boots On

The title of this missive is also the title of an Errol Flynn western from 1941. That much I recalled, and it's the reason I wanted it for a title. What I didn't recall, but Wikipedia reminds me, is that the movie is a fictionalized account of George Armstrong Custer. Yes, that one. So, as a Dr. Codd title, it's a bit inapt. Or not, depending on your place in the RM/SQL/NoSQL spectrum.

When I was in school, I recall being told that the main difference between a job and a career was one's attachment to the work itself, rather than to the ability to pay some bills. A career lasts a lifetime, while a job is a necessity, much like sitting on the toilet in the morning: make a bunch of money, so that one can spend as much time as possible doing something else.

The father of a high school friend was a general surgeon (the father, not the friend) who, owing to the disruption of WWII and certain digressions in his youth, had been practicing for only about 20 years, even though when I met him he was a bit more than 60. This was before the attack of the vampire squid HMOs, so he was in private practice. As such, he depended, as specialists did then, on referrals from general practitioners to continue "cutting stomachs", as he ofttimes described his work. He said that a surgeon was retired by his colleagues: as they gave up their practices, the surgeon lost referrals; younger GPs referred to their age-appropriate peers, by and large. A surgeon didn't stop working because he wanted to, but because he was forced out by circumstance.

Some time later, whilst working in DC, I had the opportunity to take a couple of seminars with W. Edwards Deming. He was still well known within the OR/QA/stats world. Not so much now. He was in his 80s then. According to Wikipedia:
In 1993, Deming published his final book, The New Economics for Industry, Government, Education, which included the System of Profound Knowledge and the 14 Points for Management. It also contained educational concepts involving group-based teaching without grades, as well as management without individual merit or performance reviews.
He died in 1993. With his boots on.

Three examples of the principle of advancement: we accrue knowledge as a people, and don't purposely turn back the clock. The Dark Ages happened because the baser cultures successfully attacked (think about that in the Sandy Hook context of gun control). As attributed to Newton: "If I have seen further it is by standing on the shoulders of giants". I wonder where the IT world stands today?

Which brings us to the question: how is it that IT generally, and database applications specifically, have shown such retrograde/reactionary tendencies over the last couple of decades? That is to say, the embrace of data technologies from the 1960s (and even, one might point out, the 1950s)? Why is iteration/looping over sequential data the sine qua non of coding? Is it pure ignorance? "Those that ignore the past are doomed to repeat it"? By way of contrast, other professions, such as medicine (I was tempted to say that physicians don't revert to using leeches, but they do, a bit), accrue learning and move forward. So, I'll point out that they no longer consider blood-soaked clothing a mark of expertise.
Although even some Greek surgeons had advised washing the hands before dealing with patients, this aspect was overlooked and the doctors strode around in blood-stained coats. The bloodier the coat, the higher the reputation of the surgeon.
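To put the looping fetish in concrete terms, here's a minimal sketch (JDBC, against a hypothetical orders table; the connection string and column names are made up for illustration). The first method drags every row across the wire and totals it in a client-side loop, which is 1960s master-file processing in modern dress; the second states the question as a single set-based expression and lets the engine answer it in-situ.

```java
import java.math.BigDecimal;
import java.sql.*;

public class TotalsSketch {
    // Hypothetical connection URL and schema: orders(customer_id, amount).
    static final String URL = "jdbc:postgresql://localhost/shop";

    // The retrograde way: iterate over sequential data in the client,
    // exactly what a COBOL read-loop did against a flat file.
    static BigDecimal totalByLoop(Connection con, int customerId) throws SQLException {
        BigDecimal total = BigDecimal.ZERO;
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT amount FROM orders WHERE customer_id = ?")) {
            ps.setInt(1, customerId);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {                        // row-at-a-time
                    total = total.add(rs.getBigDecimal("amount"));
                }
            }
        }
        return total;
    }

    // The set-based way: say what you want, let the engine do the work where the data lives.
    static BigDecimal totalBySet(Connection con, int customerId) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE customer_id = ?")) {
            ps.setInt(1, customerId);
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                return rs.getBigDecimal(1);
            }
        }
    }
}
```

Same answer either way; the difference is where the work, and the understanding of the data, lives.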

But the younger set are willfully embracing siloed, non-transactional, client-side-driven, flatfile data applications in java and such that, save for the syntax and CAPITALIZATION SCREAMING of COBOL, are semantically the same as those applications tapped out on 029 keypunches. Why? Part of the explanation lies with the residue of early web technology. By the mid-1990s, commercial computing was divided between mainframes running COBOL with mostly DB2 and a bit of Oracle (notoriously ill-suited to the 370 architecture) over SNA to 3270 character terminals, and AS/400 or *nix mini-computers running some RDBMS (or 4GL/database) over RS-232 to VT-X00 character terminals. This was the era of client/server in a box. All the data be ours, sayeth the box. Such machines had processors an order of magnitude (or more) slower, and memories and disks that much smaller, than today's. But they could populate their screens in real time. While the 3270 did/does have some local coding capability, not unlike a hobbled javascript, the point was to manage the data in-situ and leave just the screen painting to the client. Real-time data population and editing against the datastore were the norm. One had to be careful with transaction scope, but one has to anyway.
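As for transaction scope: whatever the client, 3270 or browser, the careful shape looks the same. A minimal sketch, again JDBC against a hypothetical accounts table:

```java
import java.math.BigDecimal;
import java.sql.*;

class TransferSketch {
    // Hypothetical schema: accounts(id, balance). The point is the scope, not the schema.
    static void transfer(Connection con, int fromId, int toId, BigDecimal amount)
            throws SQLException {
        con.setAutoCommit(false);                 // the unit of work starts here
        try (PreparedStatement debit = con.prepareStatement(
                 "UPDATE accounts SET balance = balance - ? WHERE id = ?");
             PreparedStatement credit = con.prepareStatement(
                 "UPDATE accounts SET balance = balance + ? WHERE id = ?")) {
            debit.setBigDecimal(1, amount);
            debit.setInt(2, fromId);
            debit.executeUpdate();
            credit.setBigDecimal(1, amount);
            credit.setInt(2, toId);
            credit.executeUpdate();
            con.commit();                         // the scope ends here, not per statement
        } catch (SQLException e) {
            con.rollback();                       // any failure undoes the whole unit
            throw e;
        }
    }
}
```

Everything between setAutoCommit(false) and commit() either all happens or none of it does; that was true against DB2 over SNA, and it's still true behind a browser.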

With the young web, where bandwidth to the browser was effectively far lower than what a VT-X00 would see over RS-232, javascript increasingly available, and young studs eager to make money (rather than build a career)... well, we have the "software problem". Not to mention that the math ability of US kids has declined.
Unfortunately, the percentage of students in the U.S. Class of 2009 who were highly accomplished in math is well below that of most countries with which the U.S. generally compares itself. No less than 30 of the 56 other countries that participated in the Program for International Student Assessment (PISA) math test had a larger percentage of students who scored at the international equivalent of the advanced level on our National Assessment of Educational Progress (NAEP) tests.

The relational model, which demands nothing more than elementary set theory, still doesn't fit the bill: even that little math is, apparently, more than this cohort cares to master.
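And "elementary" means elementary: the core operators are just set comprehension over tuples. Written out (my notation, nothing more than the usual textbook definitions):

```latex
\sigma_{\theta}(R) = \{\, t \in R \mid \theta(t) \,\}        % restriction is set comprehension
R \cup S, \quad R \setminus S                                % relations are just sets of tuples
R \bowtie S = \{\, t \cup u \mid t \in R,\ u \in S,\ t,u \text{ agree on shared attributes} \,\}
```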

Making a quick buck on Wall Street, selling bogus securities and such, doesn't require knowing algebra. So, they don't. Making a quick buck with some social app doesn't, either. So, they don't.

I recall the process of class selection as an undergraduate. When it came to electives, even for those with a math-y major, the absolute preference was for classes from sociology, poli sci, and maybe psych. Why? Because there were seldom "right" answers, and if one was an accomplished bullshit artist (and even engineers could manage that), then a high grade was assured. The lower the rigour, the higher the GPA. And so it is with the applications they gravitate towards. Unstructured data. Fuzzy logic. And so forth.

Will it be possible for Chris Date, to cite an example, to be well regarded in his 80s (assuming he can get that far), in the way Deming was? I hope so. On the other hand, Deming was gone before the vampire squid assault of the Bayesians, so if he'd been born in 1930 rather than 1900, he could have been ignored once he reached 60.

The infrastructure to support client/server in a box on the web only gets deeper and stronger. At some point it will win out. And we can all die with our boots on.
