16 April 2012

Yogi and Booboo [UPDATED]

For those who like to gamble in stocks, Seeking Alpha is a pumper's delight, balanced by the occasional hit-job on Chinese reverse mergers. Yesterday brought an interesting piece on SAP, which has a reasonably complete backstory, though a somewhat uninformed title. The useful part is the description of recent history. What's wrong is that SAP, from inception, had its own "database" to support the various applications, and no other datastore could be used. It took Oracle some considerable effort to get access, here and here. And then there's ABAP, the barbarous 4GL source language, in German, of course. That may have changed in recent years.

The main subject of the piece is HANA, SAP's in-memory database, which it intends to market as a separate product. Both Oracle and IBM, in the last few years, have acquired in-memory database companies, although there's not been a whole lot of press about them since the initial acquisition stories. SAP isn't doing anything new here.

A quote from the piece:
"... it's the apps that count, and increasingly, it's the database that makes the apps. Once again."

I couldn't agree more.

But here's what doesn't get said. In-memory databases will have to be compact; DRAM storage will always be more expensive per byte than SSD or HDD. The relational model (RM) to the rescue: there is no other data model which is more parsimonious with data (if any of the alternatives can even be called models). High normal form (NF) structures will be key to efficient implementation of in-memory databases. For those with long memories, Texas Memory began in the SSD business with DRAM devices, and held to that for a long time, perhaps too long. With high NF schemas, in-memory databases aren't penalized for joins or CTEs. The data is purely orthogonal, thus robust, and with 64-bit memory addressing all of it ought to be equally accessible. I've gone on about that before, so just insert the usual argument here.
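To make the point concrete, here's a minimal sketch (the tables are hypothetical, mine, not anything from the article or HANA): a normalized schema stores each fact exactly once, which keeps the DRAM footprint small, and the joins and CTEs that reassemble wide rows are cheap when the whole structure is resident in memory.

```sql
-- Hypothetical high-NF schema: each fact stored once, no repeating groups.
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        VARCHAR(100) NOT NULL
);

CREATE TABLE product (
    product_id  INTEGER PRIMARY KEY,
    name        VARCHAR(100) NOT NULL,
    unit_price  DECIMAL(10,2) NOT NULL
);

CREATE TABLE sale (
    customer_id INTEGER NOT NULL REFERENCES customer,
    product_id  INTEGER NOT NULL REFERENCES product,
    sale_date   DATE    NOT NULL,
    quantity    INTEGER NOT NULL,
    PRIMARY KEY (customer_id, product_id, sale_date)
);

-- A CTE plus joins rebuilds the "wide" record on demand; with the schema
-- resident in memory, the joins cost pointer chasing, not disk seeks.
WITH totals AS (
    SELECT s.customer_id,
           SUM(s.quantity * p.unit_price) AS spend
    FROM   sale s
    JOIN   product p ON p.product_id = s.product_id
    GROUP  BY s.customer_id
)
SELECT c.name, t.spend
FROM   totals t
JOIN   customer c ON c.customer_id = t.customer_id;
```

The denormalized alternative would repeat customer and product attributes on every sale row, which is exactly the bloat an in-memory engine can't afford.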


[UPDATE]

Found an additional article, which had this to say:
"The longer term benefits of HANA will require new software to be written -- software that takes advantage of objects managed in main memory, and with logic pushed down into the HANA layer."

Yet another implication that data is logic, and logic is data.
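One way to read "logic pushed down" in plain relational terms (my illustration, generic SQL, not HANA syntax): declare the rule in the schema, as a constraint or a view, and every application inherits it instead of re-implementing it in code.

```sql
-- Hypothetical example: business rules expressed as data definitions.
-- The logic lives in the database layer; no application repeats it.
CREATE TABLE order_line (
    order_id   INTEGER NOT NULL,
    line_no    INTEGER NOT NULL,
    quantity   INTEGER NOT NULL CHECK (quantity > 0),
    unit_price DECIMAL(10,2) NOT NULL CHECK (unit_price >= 0),
    PRIMARY KEY (order_id, line_no)
);

-- The "logic" of an order total is just a derivable view over the data.
CREATE VIEW order_total AS
SELECT order_id, SUM(quantity * unit_price) AS total
FROM   order_line
GROUP  BY order_id;
```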
