31 January 2014

Bobbing For Apple, Again

Could Apple be scraping the bottom of the barrel? An exercise watch?? Why pay Apple prices for one, when they can be had for $14? How much more can such a watch do that would make it worth the Apple premium? Again, form factor. Samsung's Gear isn't doing well, and it is tied to the phone. Can one stuff enough capability into an independent device the size of a normal watch?? Not yet.

30 January 2014

Under the micRoScope

The R community's insistence on layering (for mindshare purposes) "functional" and "object oriented" tattoos on the language has always struck me as something between self-aggrandizement and self-delusion. R is FORTRAN, pure and simple; functions eating data. That the data can be given an identifier is meaningless. One of the results of R's design/heritage is that variables have somewhat squishy provenance.

Now comes another treatment of the pasta bowl via R-bloggers. The comments are a stitch.

If you want the full monty, read up on 'environments' in any R docs; "R in a Nutshell" (mine is first edition) has a chapter on it.

Here are two conjoined footnotes from the chapter (8, page 100), with a small sketch of the point after them:

- If you're familiar with other languages and language lingo, you could say that R is a lexically scoped language.

- This allows symbols to be accessed as though R were dynamically scoped.
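
For the scoping-challenged, here's a minimal sketch of what those two footnotes are getting at (my toy example, not the book's). Lexical scoping means a function's free variables resolve in the environment where the function was defined; but since environments are first-class objects, you can reach into the caller's frame and get something that smells dynamically scoped:

    ## lexical scoping: 'count' lives in the environment where the inner
    ## function was defined, and survives between calls
    make_counter <- function() {
      count <- 0
      function() {
        count <<- count + 1   # <<- walks up the enclosing environments
        count
      }
    }
    tick <- make_counter()
    tick()   # 1
    tick()   # 2

    ## but environments are first-class, so a function can rummage through
    ## its caller's frame -- the "as though dynamically scoped" business
    peek <- function(name) get(name, envir = parent.frame())
    f <- function() { x <- 42; peek("x") }
    f()      # 42, found in the calling environment, not the defining one

Squishy provenance, indeed.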

Reminds me of that penultimate scene from "Chinatown".

One of the comments links to this post with a comment by Ross Ihaka. That post is from 2010, by the way. Not a new realization.

If you're from the world of "normal" programming languages (C family or Algol family), none of this makes sense. Pick a fork in the road, and stay there. Just because one uses the word 'function' in defining a block of code isn't sufficient to claim that the language is 'functional'. Similarly, attaching labels to data structures, and calling said structures 'objects' isn't sufficient to claim that the language is 'object oriented'.

28 January 2014

Balloon, Meet Pin

Watching Keith Olbermann voice-over sports clips is to see an utter waste. I had no concept of Olbermann before his MSNBC stint, but I do catch bits of him whilst waiting for the local weather to come on. And thus I discovered "bye, Felicia". Who the hell is she? Near as I can tell, it's a young-un's expurgated version of 'perform a non-procreative auto-intercourse'. Or thereabouts.

Apple reported earnings after Mr. Market went home yesterday (he didn't really go until 8:00 pm, but ...), and the share dropped to $500 betwixt 4 and 8. As I type it's a tad over $500 in pre-market. The pundits have already spewed much typing (to which I am adding, alas) over the meaning of it all. Tally ho!!!

This endeavor exists in multiple versions, some with a preamble of quotations appropriate to the version. This missive will appear in all, but the quote I need is this:

As mass production has to be accompanied by mass consumption; mass consumption, in turn, implies a distribution of wealth -- not of existing wealth, but of wealth as it is currently produced -- to provide men with buying power equal to the amount of goods and services offered by the nation's economic machinery.
-- Marriner Eccles

Eccles was an advisor to FDR, so the time of the quote was during the Great Depression. Recent times have delivered more than a few "bye, Felicia" moments, all tied, one way or another, to Eccles's observation.

- Perkins' assertion that Progressives are Nazis.

- Data that 85 individuals hold as much wealth as the lower half of the planet's population.

- Chris Christie and The Bridge.

- Revelation that all that manufacturing growth is at burger flipping wages.

- Boeing, among others, carving out subsidies orders of magnitude greater than the per capita cost of jobs "saved".

- Papa John bemoaning that ACA will make him poor.

- Evisceration of SNAP/Food stamps.

- The realization that John McCain is a liberal.

- The continued burgeoning of corporate cash piles.

- All that QE money, yet total employment hasn't moved.


In terms of Apple, and all those companies focusing on the top X% of the population, eventually there's no more growth to be had, simply because the headcount of your X% of income/wealth is stagnant, at best. Sometimes, and some places, concentration leads to smaller market opportunity.

In Apple's case, when we (if there is a we in the future) look back, the iTrashCan (aka, new Mac Pro) will be the inflection point; and not a good one. Apple, perhaps in a "bye, Felicia" moment of great irony, introduced the Mac with the "1984" commercial. What has gone largely unnoticed over the years is that, as a device purveyor, Apple is completely fascist in its relationship to its users: Jobs/Ive/whoever tell the user what they should do, how they should do it, and provide no way to alter the device. Contrast with the PC clone world, where customization is standard; oxymoron though that may be.

With the iTrashCan, Apple has told its diminishing professional/prosumer clients that a computer is really just a toaster, "no user serviceable parts inside".

So, tonight is the State of the Union, which will bring a multi-hundred point drop in Mr. Market. Mr. Market doesn't like to hear anything which contradicts his "I am the Job Creator" self-image. The "let them eat cake" crowd never gets that capital is worthless without consumers. They just never get it. The frog and the scorpion. The corporations, and the 85, are clearly at the limit of their tolerance. They've accumulated all that moolah, but just can't seem to find ways to get 10% per annum in free money. Consumers, damn them, just don't have the demand for goods and services they used to. There just doesn't seem to be any guaranteed source of unearned income anymore. Ingrates!!!

Balloon, meet pin. With all that moolah sitting with the 85, deflation is the only sure way to make that moolah more valuable. Unearned income on steroids. They're going to give us a Depression such that we'll never, ever, again get so uppity.

26 January 2014

It's a Catastrophe!

How would you like to be a P/C actuary for one of the Bermuda reinsurers these days? (In what follows, the discussion is around so-called CAT bonds, which will be explained. While the events discussed were covered by regular liability insurance, such events should be covered privately, and CAT bonds are the logical vehicle. The difference isn't directly important to the argument.) Moving all that shale oil in antiquated railcars? No problem. Well, not so much, it turns out. Although no post on this specific genre has come through R-bloggers that I've seen, when Black Swan (outlier) events become part of the core distribution, life gets bitter.
Since March there have been no fewer than 10 large crude spills in the United States and Canada because of rail accidents. The number of gallons spilled in the United States last year, federal records show, far outpaced the total amount spilled by railroads from 1975 to 2012.

Again, just like "all those mortgages won't go underwater at once", "oil trains ran fine for all those years without a problem". Right.
... railroads and car owners can no longer ignore the liabilities associated with oil trains, which could reach $1 billion in the Quebec accident.

And, it turns out, not all the science and engineering were known beforehand (oops):
The accidents have brought another problem to light. Crude oil produced in the Bakken appears to be a lot more volatile than other grades of oil, something that could explain why the oil trains have had huge explosions.

Here too, the warnings came too late.

Federal regulators started analyzing samples from a few Bakken wells last year to test their flammability. In an alert issued on Jan. 2, P.H.M.S.A. said the crude posed a "significant fire risk" in an accident.

According to this Wall Street Journal report, the Canadian carrier had "normal" liability insurance and is already in bankruptcy. Gummint is paying for recovery. Damned socialists. Read this quote, closely:
They say initial data indicated the oil ranged from the most hazardous to some that wasn't classified at all. Truck shipping data, however, indicated all was the medium-hazardous type. In any case, when it reached the train, its classification changed to least hazardous - less prone to ignite - the Canadian investigators say in initial reports.

Sound a bit like the mortgage rating agency scam? "Do you feel lucky, punk? Well, do ya?"

One of these Bermuda companies, CatVest, is into such insuring, according to this piece from a Bermuda industry organ. The piece is from two years ago, so the views might be somewhat different today.
We analyse a whole suite of oil and gas risks, including physical damage to facilities, business interruption, third party liability and operators' extra expenses, which includes control and well issues and losses due to pollution damage. We've done hundreds of calculations over the years and act as both a calculation agent and a claims investigator for the energy and chemical sectors.

Hmm.

By the way, in this article you'll see a couple of acronyms that are likely foreign: ILS and ILW; insurance-linked securities and industry loss warranties. Again, they end up being open bets by third parties on events. Sound like a CDS to you? Does to me. As it happens, there are many esoteric swaps available. The Wiki lists, at least, some of them.
That is where the capital markets and insurance-linked securities meet, through derivative or securities markets. CAT [catastrophic event] bonds are grouped by their level of risk and sold in portfolios in security markets.

Sound even a bit familiar? The volume of ILS/ILW isn't at the level of MBSs or CDSs and the like in the run-up to The Great Recession, but the quant folks involved in assessing risk must be experiencing ever tightening sphincters.

(Aside: if you follow no other link, you have to read this.
"from a legal perspective, all jolly interesting. As lawyers, we get to look at wording from ILS products, which are very often very novel, very inventive, very clever products."
)

If you want to pursue further, here's an industry newsletter. Better yet, perhaps, is this piece in 'The Economist'.
Pension funds and other institutional investors are on the hunt for assets generating decent yields, particularly if the returns are uncorrelated to stockmarkets. As recently as last year cat bonds paid up to 11 percentage points over Treasuries, for risks equivalent (at least according to the ratings agencies) to holding speculative-grade corporate debt.
...
Britain's financial regulator this week warned that the influx of new money into cat bonds could push insurers towards underwriting dicier business to keep profits up. Man-made disasters can be just as frightening as natural ones, after all.

Note the content of those parentheses. They're doing it again. By the way, if you don't read the whole thing, it was written in October, 2013, before Casselton happened. After Quebec, though.

Finally, this is an academic paper hot off the presses. Give it a scan. If nothing else, you'll see that this area of quant is ridiculously data poor.
To generate more interest in exchange-traded insurance-linked derivatives, we suggest that exchanges design their derivatives contracts in a simpler manner, by first selecting an index that is easy for market participants to grasp as a trigger for payoffs, and then settling the contracts very shortly after a catastrophe occurs...

If that doesn't sound a bit like a call for another Li's copula, you aren't paying attention. As ever, human events aren't like Brownian motion, and can be traced back to: incentive, incentive, incentive. The Times piece makes clear that those involved, oil companies, railroads, and car manufacturers, were intent (still are, I expect) on socializing cost (letting the Gummint pick up the tab) while privatizing profit. As they always do.
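
For the uninitiated, a toy sketch of the copula problem (my invented numbers, nothing from the paper or the Times): a Gaussian copula with the correlation dialed too low makes joint catastrophes look impossible, which is exactly the Li trick that blew up the mortgage books. Assumes base R plus the MASS package:

    ## ten exposures, each with a 2% marginal chance of blowing up;
    ## the only question is how often they blow up *together*
    library(MASS)   # for mvrnorm
    set.seed(1)
    n_sims  <- 1e5
    n_risks <- 10
    p_event <- 0.02

    joint_tail <- function(rho) {
      sigma <- matrix(rho, n_risks, n_risks)
      diag(sigma) <- 1
      z <- mvrnorm(n_sims, mu = rep(0, n_risks), Sigma = sigma)
      events <- pnorm(z) < p_event      # uniform marginals -> event indicators
      mean(rowSums(events) >= 5)        # P(five or more blow up at once)
    }

    joint_tail(0.0)   # what the model assumes: effectively never
    joint_tail(0.6)   # what correlated, outlier-turned-core risk delivers

Get the dependence wrong and the 'rare' joint loss stops being rare. Incentive decides which rho goes into the model.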

24 January 2014

We All Scream For Ice Cream

All these limp-wristed whiners need to shut up. Yes, it might be a tad chilly in the Meadowlands for the Super Bowl (and on Groundhog Day, in case you forgot; census day on Block Island, too). So what? Here's some important data.

Here's the rundown on the NFL Championship games. Almost all north of the Mason-Dixon line. At the end of December, mostly. Here's a fact from meteorology: seasons aren't determined by solstice/equinox dates, but by actual weather. So, summer is June/July/August. And winter is December/January/February. January is the coldest month. February is the end of winter, not the middle.

I suppose that most of the whiners are too young or stupid to know about The Ice Bowl (NOAA's page). Just look it up; lots of places have more detail.
The 1967 NFL Championship Game between the Dallas Cowboys and the Green Bay Packers, played on December 31 at Lambeau Field, is known as the Ice Bowl, arguably one of the greatest games in NFL history.

Yes, yes it was.

Treading Water

The quant/bankster types took something of a hit during/following The Great Recession (it is over, right?) for being the enablers of The Giant Pool of Money to bubble up through the housing markets. Yes, plural; Spain and Ireland to name just two others. I've gushed untold keystrokes in these endeavors on the notion that The Giant Pool has not gone away. If anything, with corporate profits in the ionosphere and the Fed gifting cash to large holders, the Pool has gotten deeper. Anecdotally, no question.

I use NBC News as my home page since Yahoo!/Mayer screwed that pooch. Well, today brings this "news".
Insatiable demand from hedge funds, private equity investors and foreign buyers, all armed with ready cash, are elbowing first-time buyers out of the housing market.

First-time buyers tend to purchase lower-priced homes, but all-cash investors have cornered the market on those, leaving little behind. All-cash purchases accounted for 42.1 percent of all U.S. residential sales in December, up from 38.1 percent in November, and up from 18.0 percent in December 2012, according to a new report from RealtyTrac.

Well, it isn't all that new. Way back in August, these more extensive data were reported. The comments to this piece are refreshingly Darwinist.
The low rates promoted by the Fed were cast under the umbrella of helping out regular families but in reality, they have turned into the next hot money play for banks, hedge funds, and Wall Street. The fact that 60 percent of all purchases in 2013 are being driven by the cash crowd is crazy (a 200 percent increase from the 20 percent pre-crash levels).

At this point, it's difficult to follow a breadcrumb trail from quants to crash. This appears to be a case of what you get when you try to push a string. Greenspan did it first, beginning in late 2001, and gave us The Great Recession as his going away present. The gift that keeps on giving. But it is further evidence of the few exploiting the many. Contrary to some comments to the August piece, Joe Sixpack didn't go into Countrywide and dictate the structure of a liar loan, rather, Joe was told that such a loan was not only feasible but in his best interest. It was found that mortgage companies and banks, some more than others, steered Joe into higher profit exotics even when Joe could have gotten a conventional, but less profitable, mortgage. Incentive, incentive, incentive.

Remember: you can't push a string, and that incentives matter more than data when the two disagree.

23 January 2014

Buffett Billions [update]

For some reason, unknown to me, Corey Chivers (Buffett Billions) finds the following comment objectionable (as I type, at least). Check out his post, anyway, flawed though it is.

Reviewing reporting doesn't answer the important question: is the $1 billion per perfect bracket, or shared among all perfect brackets? There appear to have been none reported in history; that bit of folklore likely led to the idea. History shows that no 16 seed has won in the first round. 15/14/13 seeds not much better. Few Black Swans in the early rounds. And so on. Later rounds yield more "upsets" (lower seed winning), since the remaining teams are more evenly matched (assuming the seeding committee is smart). So that really big number doesn't really apply. Those who are college basketball junkies have a reasonable shot at getting it right.

IOW, this really isn't a prob/stat problem, but more OR based.
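
A back-of-envelope, with hit rates I made up, shows why the 2^63 coin-flip figure doesn't apply to an informed picker:

    ## 63 games; assume a basketball junkie's per-game hit rate falls off
    ## by round (early rounds are chalk, the Final Four is a coin flip)
    hit_rate <- c(0.85, 0.75, 0.70, 0.65, 0.60, 0.55)  # round of 64 .. final
    games    <- c(32, 16, 8, 4, 2, 1)

    p_perfect <- prod(hit_rate ^ games)
    1 / p_perfect   # on the order of ten million to one
    2^63            # the coin-flip number behind the headline: ~9.2 quintillion

Still long odds, but a different universe from the headline number. The real work is in the hit rates, which is an OR/modeling problem, not a counting problem.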

[update]
Some numbers that add perspective.
Some more numbers that add more perspective.

22 January 2014

Don't Touch Me, I'm a Real Live Wire

Every now and again, research into nature's laws can be applied to human behavior and actually make sense. Eye of the beholder and all that is assumed.

There was reporting a while back that Facebook had lost some teenage users. Now comes a study of Facebook as organism. A human virus, specifically. I can hear sphincters snapping shut all over Wall Street. Such analysis can also apply to, say for instance, iPhone infatuation. Or, for that matter, any activity which is discretionary and/or mob-like.
The SIR [Susceptible, Infectious, Recovered] model can be applied to OSNs [online social network] by drawing the correct OSN analogues to the SIR model parameters. When applying the SIR model to OSNs, the susceptible population compartment is equivalent to all users that could potentially join the OSN. The infected population compartment is analogous to OSN users: potential OSN users are susceptible to joining the OSN through "infection" by contact with a current OSN users. Finally, the recovered population compartment is analogous to the population of people who are opposed to joining the OSN. In the case of an OSN, this could be comprised of people who have left the OSN with no intention of returning or people who resist joining the OSN in the first place.

Have a look at Figure 2 (it doesn't want to copy, so I'll vent my frustration elsewhere). There's a fairly well-known analysis of cholera in London which serves as a paradigm for disease analysis. Not exactly the same as Facebook, although one might opine that having cholera is better than having a Facebook account.
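
Since the figure won't copy, here's a minimal sketch of the same SIR machinery (my parameter guesses, not the paper's; assumes the deSolve package):

    ## susceptible -> infected (joins the OSN) -> recovered (quits for good)
    library(deSolve)

    sir <- function(t, state, parms) {
      with(as.list(c(state, parms)), {
        dS <- -beta * S * I
        dI <-  beta * S * I - gamma * I
        dR <-  gamma * I
        list(c(dS, dI, dR))
      })
    }

    out <- ode(y = c(S = 0.999, I = 0.001, R = 0),
               times = seq(0, 100, by = 0.5),
               func  = sir,
               parms = c(beta = 0.4, gamma = 0.05))

    ## active users (I) rise, peak, then bleed into the never-again
    ## compartment (R) -- the curve Wall Street doesn't want to see
    matplot(out[, "time"], out[, c("S", "I", "R")], type = "l",
            xlab = "time", ylab = "fraction of population")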

Note that the authors are neither biostats nor epidemiologists, but real engineers. (Gad it felt so good to type 'real engineers'!!)

21 January 2014

And What About Candy Land?

Just when you thought things couldn't get any stupider, candy is now trademarked. I don't play the game, so I suppose I don't care, but come on?
Developers have not been too happy about the news - especially given that Candy Crush Saga is itself essentially a finessed version of the decade-old game Bejeweled - but with the game producing so much money for its developers and for Apple it's likely that the 'candy' trademark will be enforced enthusiastically from now on.

19 January 2014

Stockholm Syndrome [update]

Shiller gets to go to Stockholm and bloviate, while I'm stuck here in Arctic New England. Life ain't fair. At least, he's provided a precis of the proceedings. And, in the process, elevated my view of Left Wingnuts.

There are two major points: first, quants mostly get it wrong; and second, homo economicus is a myth. The first occurs because the second is true.
Does it make sense to suppose that economic decisions or market prices can be modeled in the precise way that mathematical economists have traditionally favored? Or is there some emotionality in all of us that defies such modeling?

Of course. The field was originally named 'Political Economics', and there's little in human existence that's more attached to emotion than politics. How else to explain Kansas? All those gun-loving, God-fearing folks voting for the corporate manipulators eager to keep them poor. Well, except for those few who get fat farm subsidies. Not that such moolah is welfare, of course.
It is hard to sum up all this discussion, however, because of a basic problem: defining "rational."

Aye, matey, thar's the rub. A frequent quote from the housing bubble amounted to: "we all have to keep dancing as long as the band plays; all the others are dancing". Another way to put it (from J.K. Galbraith, variously phrased): "financial genius is a rising market". Thus Li was able to foist a mechanistic formula on human behavior, which is to say, a domain where humans make the rules rather than Mother Nature. The ultimate basis of quants is that the rules are, more or less, stable and *not under the control of the analyst*. Such analysis works well with microarray data, for example. This is the crux of Shiller's point, too.

He closes:
The question is not simply whether people are rational. It's about how best to describe their complex behavior. A broader notion of irrationality may someday be reconciled with one of rationality, and account for actual human behavior. My bet is that real progress will come from outside economics -- from other social sciences, and even from information sciences and computer engineering.

In other words, don't assume that one can model human decisions using only "normal" data. If policy (those musical chairs during the housing bubble) trumps data whenever enough people decide that the policy is more fun, then data loses. The best that financial quants can hope to do is front run changes in money flows. Which is not to assert that data is useless in bubbles; those who noted that house prices had come massively unstuck from incomes made money going up and coming down. But not so many. And that means Joe Sixpack with his PC and standard data and R doesn't stand much of a chance to beat the pros with their HPC machines and proprietary data. You're better off tracking Briefing.com, with an eye to spotting grey swans; watch for events.

[update]
I've spent the last few hours looking for more views on the subject, and found this one. Very interesting. Could have been written by me (well, in a dream, maybe).

From page 20:
The very complexity of the mathematics used to measure and manage risk, moreover, made it increasingly difficult for top management and boards to assess and exercise judgement over the risks being taken. Mathematical sophistication ended up not containing risk, but providing false assurance that other prima facie indicators of increasing risk (e.g. rapid credit extension and balance sheet growth) could be safely ignored.

And, from page 24 (the punch line, in the gut):
New generations of students will have to use the tools and techniques of QRM [quantitative risk management] wisely in a world where the rules of the game will have been changed.

16 January 2014

Protect the Children: End Graphic Violence

In the beginning was Bachman
He looked upon the Firmament of Data, and saw Chaos
And lo unto Him, he created IDS, which begat IDMS
These holies were as the spider's Web, one scion goeth to all kith and kin
And they to the scion, each to many, as the tendrils of the spider
It was good

Thence came the Business Machine of all nations, and lo, it would not pay tribute to Bachman
The Machine pruned the spider's web, each from one and only one, which was called IMS
Lo, the Machine was prideful, having banished the spider's web from its machines
It was good for the Business Machine of all nations

Darkly, a spawn in the Business Machine looked upon this new firmament of data and saw tyranny
He preached of relations and their power to erect any monument, whether like the spider or the serpent
No longer shackled to the rack of pointers and hard links, the people were free to expand their data as need be
This prophet was called Codd, lo many spake of him as God, yet many in the Business Machine were fearful of him
Mighty lords of the Business Machine took heed of this fear and consecrated the Chamberpot to write the text
Only when the Oracle child did propagate boldly would the Business Machine acknowledge its patrimony

So went the years, for many of those years, when the apostate youth, innumerate as wide as the heavens, looked blindly at themselves
We do not understand the Set and our relations are carnal, verily we must have a data supine before our code
Yea, they consorted with satanic harlots, producing an evil spawn which they called Graph, which was only Bachman in sackcloth

And so, the great ouroboros has deceived the youth, who must wait for the second coming of the Codd
May they not bow before a false Codd as lo they have done for these many years

15 January 2014

Major Kong Falls Over Switzerland

Net neutrality is dead!!! Long live Robo Cop!!! Or, what does the future hold now that net neutrality is gone?? The problem for quants is that, while data driven analysis makes perfect sense in the real world, i.e., the one driven by forces of nature, the process gets squishy when the rules of the game change at the whim of humans. Most often, those rule changes are made by those who benefit from the change, and the change is often hidden from the public at large. And that group includes many of the quants.

When there is massive policy change such as this, the job of analysis is to consider how behavioural incentives have changed. What behaviours, previously forbidden, and which can turn a profit, are now legal? Who wins and who loses? The swan may not be black, but it's gray enough that the difference doesn't matter.

Verizon, the entity which initiated the court action, says:
"Verizon has been and remains committed to the open Internet, which provides consumers with competitive choices and unblocked access to lawful websites and content when, where and how they want," the company said in a statement. "This will not change in light of the court's decision."

By way of comparison, after the Voting Rights Act ruling, those who made the ruling claimed that the law had worked, and nothing would change, because, well, the law had worked. Within 48 hours, those purely democratic Southern states, which would never, ever return to past bad behaviour, set about undoing voting rights. You can look it up. Follow the incentive.

Anyone who actually believes that is a fool. The whole point of killing neutrality is to segment the market, and dump any segments which aren't "profitable enough".
In 2002, the agency said Internet service should not be subject to the same rules as highly regulated utilities, which are governed by regulations on matters like how much they can charge customers and what content they can agree to carry.

So, who was running the FCC in 2002?
Michael K. Powell, who was F.C.C. chairman in 2002 when the agency set up its Internet governance structure, said, "Today's historic court decision means that the F.C.C. has been granted jurisdiction over the Internet."

Then, who is Powell? Well, he's Colin's son, was appointed by Clinton, and toadied for Bush. He now heads up the cable lobby.

Who, in the current world of Darwinist Capitalism, is the master of market segmentation? That would be Apple. Market segmentation means not only varying price to capture consumers' surplus (the notional version of the gambit) but also removing market segments from supply. There has been in the common lexicon for some years the term 'digital divide', wherein the less wealthy have less of the digital domain. If the less wealthy don't have access to the innterTubes on a level playing field, this becomes yet another case of not wearing an Old School Tie (the Brits will get that; Yanks maybe not so much). To the extent that current affairs reporting becomes beholden to the innterTubes for dissemination, we can expect a few changes. First, the ever more concentrated control over the innterTubes will provide only 'good news'. And, second, the less wealthy will get 'good news' which portrays them as the cause of all that is bad with society. Fascism, as defined by Mussolini, is government by and for capitalists. As the information superhighway becomes the only path for information, and is controlled by a handful of capitalists, what's the incentive to not mold the information?

Remember: the 'free' parts of the innterTubes run on adverts (Wikipedia being an exception), and Apple has shown that segregating out the non-buyers from its sphere is more profitable. With the carriers now free to segregate their customers, why would they not? Why would they not make deals with Netflix, et al, to provide highspeed connections to wealthy neighborhoods/towns/cities, but not to South Buttfuck? Of course they will. There are sites already which catch my use of Adblock Plus, which is more to preserve bandwidth than to avoid ads (although I've never clicked on one and never will). Some won't let me in, others can be fooled. Market segmentation at work.

Of course, there is no such incentive. The incentive is to crush the weak under the boot heel, Darwinist/Rand fashion. And there is no punishment for being a bad actor.

The allure of innterTube ads is that they're more focused than print ads. But net neutrality limited the ability to segment aggressively. The time will come when sites will block not only those who don't view or click ads, but also those who don't buy. Who better to enforce this segmentation than the carriers? Rather than each advert owner having to keep track, the carriers offer up a throttle: they'll keep track of those who buy and those who don't. Those who don't get blocked from some sites, and get reduced bandwidth in the bargain. In due time, the innterTubes will become like the Apple ecosystem: of, for, and by the top 20% of the wealth curve. That Old School Tie will be adorned with the Verizon Hyperspeed tie tack.

09 January 2014

Listen to Henry

I've not attempted to verify the following:
Samsung is in a unique position in the SSD market. It's the only company in the consumer SSD business with a fully vertically integrated business model and zero reliance on other companies. The controller, NAND, DRAM, firmware and software are all designed and produced by Samsung...

but I've certainly seen similar versions of that statement over the years. The immediate response: that's what Henry Ford built. The Ford Motor Company for many decades had a simple structure: raw materials in one end, Fords out the other. Keep all the profit from each step. Exactly the opposite of today's outsourcing meme.

A signature which waxes and wanes in the prologue of these endeavors:

A business that makes nothing but money is a poor business.
-- Henry Ford

Big Dig, Big Data, Big Deal?

Among the largest old city rehab efforts in the history of the country was The Big Dig in Boston. It finally finished, late and over budget. But it includes one of the prettiest bridges on this side of The Pond. Why is it that any European country manages to do civil engineering with greater beauty in its homeliest structures than the USofA does in its best? Why is it that virtually every "innovation" in automobiles since Henry Ford was created by some European company? Just asking.

Recently, this endeavor mused on the Death of Big Data. Or, perhaps, high morbidity. Watson has been getting ink recently on blogs, so it's not a surprise that IBM would take the opportunity to discuss the machine. What I can't tell is whether Watson is sui generis, or a model shippable in quantity. From the wiki description, it's built from off-the-shelf parts. Except, of course, for the software. What's even more interesting: Watson doesn't make it to the top 500 of supercomputers, and appears to be I/O bound by *hard drives*:
According to John Rennie, Watson can process 500 gigabytes, the equivalent of a million books, per second. IBM's master inventor and senior consultant Tony Pearson estimated Watson's hardware cost at about $3 million. Its performance stands at 80 TeraFLOPs which is unfortunately not enough to place it at Top 500 Supercomputers list. According to Rennie, the content was stored in Watson's RAM for the game because data stored on hard drives are too slow to access.

I guess these smart folks never heard of SSD!!!

What's even more interesting: some of that software is Prolog:
We required a language in which we could conveniently express pattern matching rules over the parse trees and other annotations (such as named entity recognition results), and a technology that could execute these rules very efficiently. We found that Prolog was the ideal choice for the language due to its simplicity and expressiveness.

Some background, some my own, on Prolog.
- it was created within months of Codd's relational model paper
- it uses what amounts to a normalized, in-memory database. most Prologs refer to this data as the "database".
- while at OMS in the early 90s, a couple of my colleagues attempted to build an AI sub-system for the main product, medical pre-qualification, in its database/4GL (Progress). never got very far, if only because Progress has never been particularly relational or normal in application
- while at CSC I had to endure a Prolog mutant called GraphTalk, which CSC had bought up a few years before from France. of course. my colleagues at CSC turned this mutant into COBOL/VSAM coding. Yum.
- one of the current uses of Watson is in medical diagnosis. hmm. twenty years too soon was I.

There's still a commercial version of Prolog/datastore called Amzi! (yes, the ! is part of the name, just as with Yahoo!). And guess what? Its major market is business rule and decision support implementations. As it happens, Prolog syntax/semantics is more alien to C inculcated coders than even R or SQL. But, according to its zealots, Prolog systems are orders of magnitude more compact than imperative (e.g., java/C/FORTRAN) equivalents. Kind of like what relational zealots say about RM/SQL databases versus flat files.

So, today's Times has a puff piece from IBM on the use and future of Watson. As others have concluded, but with suspicion, IBM sees Watson as central to its commercial success.
IBM's elevation of Watson is the biggest illustration yet of the technology industry's faith that so-called Big Data holds promise for the economy -- and the failure so far to meet that promise.

Big data is just descriptive statistics, since one has all the numbers. Look at any Baby Stat book, and an early chapter (and likely the shortest in the book) will cover all one needs to know about descriptive statistics. I know, I know. Big Data is really about correlation and finding the correct distribution. Mostly, for commercial uses, it's about finding a few golden correlation needles in a haystack of choices by millions of people. So you can spit more enticing ads at a few of them. I wonder how many of these Big Data projects were ever subjected, a priori that is, to a rigorous cost/benefit analysis? While Watson is a multi-million dollar machine, most Big Data can be handled using R or PL/R on a pumped up Dell. So, in such cases, one need only find a few silver needles.
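
To put numbers on the claim, a toy sketch (invented data, mine alone): the descriptive chapter plus a brute-force correlation scan is most of what "Big Data" amounts to, and it fits on that pumped up Dell:

    ## a hundred thousand "customers", fifty candidate signals, one needle
    set.seed(42)
    n <- 1e5
    purchases <- rpois(n, lambda = 2)
    clicks    <- matrix(rnorm(n * 50), ncol = 50)
    clicks[, 7] <- clicks[, 7] + 0.1 * purchases   # bury one golden needle

    summary(purchases)              # descriptive stats: the short chapter
    cors <- cor(clicks, purchases)  # scan the haystack
    which.max(abs(cors))            # the needle pops out: column 7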

The apostates are beginning to crawl out:
Likewise, IBM will have to sharpen its focus and what it delivers, said Henry D. Morris, an analyst at the consulting company IDC. "Big Data by itself isn't value, it has to deliver recommendations about what to do," he said. "They have to show people not just analysis, but action. They understand that there are challenges ahead."

By the way, I'd love to get invited to the Watson party (fat chance, of course): the staff will be located in the East Village. If you have to ask where the East Village is, you're so uncool.

07 January 2014

R, How to Write It, and Some Bad Quant

The stream of R books continues. Amazon sent this one along this morning. This is the Chapter 10 title:
Loops, The Un-R Way to Iterate

A step in the right direction. Which brings us to a couple of posts that popped up on R-bloggers, also this morning.

"R as a second language".
In most languages if one wants to do something many times the obvious way is using a loop (coded like, for() or while()). It is possible to use a for() loop in R but many times is the wrong tool for the job
...
There are some complications with some of the design decisions in R, especially when we get down to consistency which begets memorability. A glaring example is the apply family of functions and here is where master opportunist (in the positive sense of expert at finding good opportunities) Hadley Wickham made sense out of confusion in his package plyr.

Both statements ring true: the apply() functions are set-oriented (fits the RDBMS mind-set), and their lineage makes the syntax a Google-able construct which in turn makes the plyr package so much better.

Hot on that post's heels is this one, with a similar point to make.
... in R it is much simpler to take advantage of the R idioms to get there a lot faster. With this approach there is no need for loops or conditional branches. There is also no need for iterative array construction. Instead everything is done in one shot using a set-theoretic approach combined with function transformations.

Again, we read about sets. While I've always been irritated by the fanboi need to cloak R in OOD/OOP/functional mystique (rather than actual syntax/semantics/structure), the (nearly) declarative approach to syntax (and, thence, semantics) is comforting. R is just FORTRAN's function/data, and folks should just let it go at that. Iteration belongs on the metal, modulo true array processors (which don't explicitly need it); the continuing bottom-of-the-brain-stem memory of assembler (and, face it, C) likely accounts for loops in so-called higher level languages. Bah.
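
The point, in one toy snippet (mine): the same job done loop-wise, apply-wise, and set-wise.

    x <- rnorm(1e5)

    ## the brain-stem way: explicit loop over a preallocated result
    r1 <- numeric(length(x))
    for (i in seq_along(x)) r1[i] <- x[i]^2

    ## the apply-family way: still element-at-a-time, but declared over the set
    r2 <- sapply(x, function(v) v^2)

    ## the R way: the whole set in one shot
    r3 <- x^2

    identical(r1, r3)   # TRUE -- same answer, rather different idiom (and speed)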

So, now we're on to concerns of quants. It seems that Big Data is Dead?
Google Trends shows searches of the term "Big Data" peaked in October, ending a nearly ceaseless climb that began three years earlier.

Huge flatfiles of un-normalized bytes may finally be seen as The Emperor's New Clothes by The Deciders out there:
In the case of Big Data, this probably means less focus on back-end technologies like new types of storage or database frameworks, and a rethinking about how best to integrate human knowledge, algorithms and diverse sets of data.

Can you say: "I want my PL/R"? And have I mentioned: the free-as-in-beer DB2/LUW can accommodate 15 terabytes? You don't get all the really cool bells and whistles, but that's pretty big data.

And now, for a little night music. Jenny Lind was the "Swedish Nightingale", which is close enough to a canary for government work. "We have our canary!" cried some. I'd noticed that house prices had been on a run, but not enough to put fingers to keyboard. This writer is from the AEI, so he blames The Damn Gummint, of course:
Both this bubble and the last one were caused by the government's housing policies, which made it possible for many people to purchase homes with very little or no money down.

Which, of course, is baloney. The bubble was driven, as is widely known by anyone not hanging out with Mussolini's ghost, by the financial sector seeking high (but risk-free) returns. So, they invented 'interesting' mortgages in order to sell more houses/mortgages, which could be packaged together to make high-yielding, but low-risk (housing is always safe, isn't it?), securities. It wasn't the $10/hour bus drivers who went to Countrywide and said, "make me a mortgage that looks like this". Not the way it happened. The author is clearly confused, because he says:
In 1997, housing prices began to diverge substantially from rental costs. Between 1997 and 2002, the average compound rate of growth in housing prices was 6 percent, exceeding the average compound growth rate in rentals of 3.34 percent. This, incidentally, contradicts the widely held idea that the last housing bubble was caused by the Federal Reserve's monetary policy. Between 1997 and 2000, the Fed raised interest rates, and they stayed relatively high until almost 2002 with no apparent effect on the bubble, which continued to maintain an average compound growth rate of 6 percent until 2007, when it collapsed.

But, later on:
They claim that people will not be able to buy homes. What they really mean is that people won't be able to buy expensive homes. When down payments were 10 to 20 percent before 1992, the homeownership rate was a steady 64 percent -- slightly below where it is today -- and the housing market was not frothy. People simply bought less expensive homes.

Ya can't have it both ways, Jake. The moolah, whether Chinese, German, or American, flooded the mortgage process, pushing up housing costs, while the Banksters went about making mortgages for this housing stock available to the larger number of buyers needed to "clear the market" (cute econ speak, what?). Affordability was swindled in order to conjure the securities. In the short term (and we're definitely in the short term with the Bubble/Recession), only so many units could be built, yet to accommodate the Giant Pool's valuation (Banksters chasing mortgages with ever more bulging pockets) prices had to rise to "clear" the Pool's value. The builders were the ultimate winners:
When anyone suggests that down payments should be raised to the once traditional 10 or 20 percent, the outcry in Congress and from brokers and homebuilders is deafening.

Couldn't agree more; said it before.

I disagree with the use of rental payments as the measure of bubbles, however. There's no material difference between rent and (full load) mortgage; just go read up on what's going on in the Oil Plains States to see the effect of localized inflation of housing. The measure that matters is (local, for some definition of local) median income. No, what drove the Bubble/Recession was the flood of moolah. Without that flood, no amount of fiddling with mortgage lending rules would make a difference; there'd be no reason to generate ever more mortgages made possible by fiddling the rules, since there'd be no increased demand for the resulting securities; no demand means no supply, Laffer be damned. One could argue, even, that had the chronology happened the other way (relaxation of rules, first), no flood of money would have been conjured, since such mortgages would be clearly wonky to those who *need convincing* to part with more moolah, which they either didn't have at all or would have to move from other instruments. It was at the behest of the Giant Pool that the mortgages were conjured. Causation makes a difference, both in assessing blame and determining regulation from here.

In the end, it wasn't The Gummint which drove The Great Recession. Without the flood of moolah seeking better than Fed rates (so, I guess the author should be pointing the j'accuse finger at Greenspan?), the housing bubble couldn't have happened. Without the quants figuring out how to abuse copulas, it wouldn't have happened. Without the collapse of demand for technical brains in technical professions (thus being 'freed up' to pursue finance, in the sense of automation 'freeing' farmers to work in Detroit after the turn of the 20th century), it wouldn't have happened. Without the decline in median income, and thus the willingness to spend unearned (and unexpected, largely) equity on consumption, it wouldn't have happened.

The author's most egregious lie:
By 1994, Fannie was accepting down payments of 3 percent and, by 2000, mortgages with zero-down payments. Although these lenient standards were intended to help low-income and minority borrowers, they couldn't be confined to those buyers. Even buyers who could afford down payments of 10 to 20 percent were attracted to mortgages with 3 percent or zero down. By 2006, the National Association of Realtors reported that 45 percent of first-time buyers put down no money. The leverage in that case is infinite.

Yet, he contradicts himself with regard to the driving enemy:
In effect, then, borrowing was constrained only by appraisals, which were ratcheted upward by the exclusive use of comparables in setting housing values.

And who drove the appraisals? I guess it must have been The Gummint? For what it's worth: Fairfax County Virginia was using stat analysis of home sales (as opposed to on-site inspection) to do appraisals by, at least, the mid-1970s (I was there). Such software is a big seller, and has been for a long time. Here's one example:
Based on a 20-year proven relational data model and common value approaches (Cost, Market, Comparable Sales and Income), the CAMA module produces reliable and accurate results which also apply if or when an assessment must be defended through the appeal process. Comprehensive comp sheets, ratio analysis and a CAMA valuation sheet that lists all items on a property and the dollar results of each component help automate the document management process.

For the finance types, infinity is (sorta, kinda) accurate. But not for a bus driver earning $10/hour facing an ARM reset. Not even close. The mortgagor is just a homeowner, not a titan of Wall Street. There's no 'real return' to be earned by the homeowner; the homeowner doesn't use the house to build either more or better (or both) 'psychic return' home widgets which he then sells to the market (a la Herbalife). Doesn't work that way. A house isn't an assembly line, or one robot, or even a simple milling machine; it generates no real return to anyone, not even the finance guys (they only get moolah). Leverage is just a red herring.

So, clearly there's more moolah flowing (not, yet, flooding) to the mortgage market. Recent stories have shown that hedge funds (at least) have gotten into the landlord business, buying up large tracts all at once. A more cogent analysis (assuming there's sufficiently granular data) would measure separately for owner-occupied and investor-owned units. It's pretty obvious that median income measures (which are still stagnant, at best) don't support increasing prices. I'd bet on deep pockets looking to monopolize SMAs fighting amongst themselves. Lots of money to be made being the major/sole source of shelter in an SMA.

02 January 2014

It's Not a Joke, It's a Pun

It's now 2014, and the (amateur) punditocracy finally shows some signs of understanding Amdahl. Here's exhibit A. It's been a bit less than a decade since Intel started with multi-core (distinct from multi-chip package) chips. One would think that's plenty of time to have figured out the issue. It's not a new problem, what with the likes of Thinking Machines having found the market wanting when it tried to build and sell such machines as servers back in the 1980s.
However, there's a catch. While there are some "embarrassingly parallel" workloads out there, they're usually not all that common in the consumer space. Why? The truth is that many applications and algorithms aren't well suited to parallelization (i.e. being coded in a fashion that allows the work to be distributed across a huge number of chips), so at the end of the day the vast majority of consumer applications (whether it's PC games or mobile apps) benefit far more from having more performance on 1 or 2 cores than less performance per core but more cores.

Gee, ya think?

Single core with ever increasing clocks was the key to superior performance for decades, until Intel and the rest discovered that physics made it clear that The End of the Road (a great unknown novel, by the way; if you don't know Barth, you've missed out majorly) had been reached. So far as meaningful new performance was concerned (i.e., linear with respect to historical increases)? Not so much.
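
Amdahl's point fits in one line of R (standard formula, my example numbers): the speedup from n cores when only a fraction p of the work parallelizes.

    amdahl <- function(p, n) 1 / ((1 - p) + p / n)

    amdahl(p = 0.50, n = 8)    # half the code parallel: 8 cores buy ~1.8x
    amdahl(p = 0.95, n = 8)    # nearly embarrassingly parallel: ~5.9x
    amdahl(p = 0.50, n = 1e6)  # a million cores still top out below 2x

No amount of core count rescues serial code, which is why those 1 or 2 fast cores still win in the consumer space.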

Which brings us to New Year's Predictions. No Cringely am I (haven't bothered to follow him, either one, for years), but what the hell. I've been goaded, so here are two.

From the Revolutionary guys, I was moved to comment:
Nope. The big thing in 2014 will be embracing of in-database analytics with PL/R-ish functionality in all the major databases. SAP/HANA is the prototype. Oracle is a bit behind, but does claim to have something that sounds like PL/R. DB2 is still doing SPSS from its side, alas; although they do have a Netezza (nee, Postgres) implementation for the Z machines.

As corps begin to realize that Big Data is a limited phenomenon, they'll also see the advantage of providing analytics in their applications, and that is most easily done from within the database engine. While it's been a bit more than two years since the Triage piece, and I despair every now and again that the deciders (in the C suites and pundit thrones) will ever get it, reality appears to be bopping dem boys upside the head. Good for reality.
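
For the curious, a minimal PL/R sketch of what "from within the database engine" looks like on the Postgres side (the sales table and its price column are hypothetical; assumes the plr extension is installed):

    -- CREATE EXTENSION plr;   -- once per database

    CREATE OR REPLACE FUNCTION price_quartiles(float8[]) RETURNS float8[] AS '
        quantile(arg1, probs = c(0.25, 0.5, 0.75), names = FALSE)
    ' LANGUAGE 'plr';

    SELECT price_quartiles(array_agg(price)) FROM sales;

The R runs where the data lives; no extract, no client-side munging.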

And the second: the return of very old-fashioned host/terminal computing. One might argue that The Cloud is already a manifestation of same, but the point of the prediction is a turning away from client-centric coding (including the "managing transactions in the client" mentality) to server-centric data management, relegating the client device to display/input duties.

So, come back in a year (but keep those cards and letters coming), and we'll see if intelligent life in IT has poked its spiny little head above the grass.