30 June 2014

Look Ma, No Hands [update 2]

I got into a bit of a cat fight over on Seeking Alpha over the nature and value of the iWatch, if that's what it's going to be named. On my walk back from coffee and the newspaper, I mulled over the issue of what Apple could do to top current smart wrist devices. It's legendary that Apple had cornered the market on touch screens (or, at least, a critical component) and aluminum extrusion in the run-up to the iPhone. Although, at the time, it looked to my eyes that Apple had merely stuffed a phone into the then-current iPod chassis.

Whatever. The point being, what could Apple have up its sleeve? I've seen no rumoring of Apple cornering any hardware this time, but if not, then that leaves power and a gimmick. The gimmick is a flip-watch; this provides twice the screen real estate (or thereabouts) to play with. At the same time, clearly, the form factor limits how much battery can be housed. Which sent a Gyro Gearloose bulb off in my noggin. One thing Apple could corner, and one that's sourced in Asia anyway, is novel battery packaging. There seem to be two options, the hard and the easy. Well, -er.

Hard: a fully flexible chemistry in a neoprene (or similar) band. Would need to be pretty thoroughly tested; wouldn't want to have one break and melt off the user's hand.

Easy: a small battery cell, with one in each link of a bracelet. The technically nasty bit is the connection among the cells.

Well, today AnandTech looks at a couple of new smartwatches. This is the ending:
One thing is for sure: those batteries are going to have to get thinner, or find a new place to live. Perhaps split up and distributed into a watch band?

I hate it when these rich famous dudes Vulcan mind meld me. So intrusive.

[update]
Well, that's too bad. Figuring that there's nothing new under the sun, two dead simple searches yielded, oops!

Google's patented the flip watch.
and
Nokia the flexible battery.

[update 2]
Years ago, actually a couple of decades, a guy where I worked was something of an avid runner, and had a runner's heart rate (and other parameters?) monitor. I recall it had two parts, the display on a wrist band and a shoulder holster kind of rig which had the sensor on the chest. Which got me to thinking: is it possible to *accurately* monitor heart function solely from the wrist pulse? Turns out not so much.
I put five leading smart devices with heart rate monitors to the test, measuring their accuracy with an EKG and the help of Dr. Zaroff, a cardiologist at Kaiser Permanente medical center in San Francisco. You can find my results below, but it seems the optical sensing technology used in many of today's new, wrist-based mobile heart rate monitors is sometimes inaccurate. That's in comparison to time-tested EKG machines (or the heart rate monitors that emulate them), which sense the electrical impulses that trigger your heartbeats.

If these toys can't even get heart rate right? And you're going to rely on them for what else???

26 June 2014

Carnac Is Ignorant

I've had "Applied Predictive Modeling" for a while and read through it at least a couple of times. Mentioned it a couple of times, too.

Now, the Revolution folks have posted a review, which is largely positive. But, Mr. Rickert pushes the button which irritates me about the book, and I'd assume, the cabal:
They emphasize that predictive modeling is primarily concerned with making accurate predictions and not necessarily building models that are easily interpreted.
The text contains nearly that sentence, verbatim. And it's the reason I've avoided saying much. As Your Good Mother used to teach, "if you've nothing nice to say, say nothing". Predicting from data up to time t works only if the real world is *exactly* the same at time t+1, and so on. It never is. The authors are bio-stats, more or less, so in that world God makes the rules, more or less. Not so out in finance.

But it has to be said: the reason we got The Great Recession is that quants bought the Markov principle (not the real one, of course), which is that all information about series X at time t is embedded in all observations up to time t. Which, of course, is bullshit. While it is somewhat useful in physical processes, it's bullshit when applied to human processes. Data series on human stuff are determined by policy changes over time. Decisions, and data, follow incentives. Not too surprisingly, incentives are the product of policy. Change the policy, change the incentives, and presto-chango the data takes a right turn. Thus, the whole sub-prime mess derived from loophole spelunking in law and regulation, which displayed astounding fungibility. And it was thus not predictable by the quants (and wasn't predicted, in fact). They didn't want to see that the rules were being gamed, so they didn't.
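For contrast, the real Markov property is a much narrower claim: the next value depends only on the present one, not on the whole history. In symbols (the standard textbook statement, not anything from the quants' playbook):

P(X[t+1] | X[t], X[t-1], ..., X[0]) = P(X[t+1] | X[t])

What the quants actually leaned on is closer to the notion that the history of the series contains everything needed to know about tomorrow; either way, the assumption doing all the work is that the rules generating the data stay fixed.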

Read a good newspaper if you want to know where the world, or your specific concern, is headed.

25 June 2014

The End of The Road

John Barth, for those reading fiction in the 60s and 70s, was a minor cult hero. Kind of the Pynchon a half a generation before Pynchon. Not as prolific, and most fun in his first three books, late 50s: "The Floating Opera", "The End of The Road", and "The Sot-Weed Factor". And, of course, "Giles Goat-Boy", his best-known work.

But, while I recommend the first three, we're here not to praise Barth, but to segue the title of his second book. Some have questioned the notion that real innovation, discovery of new science, is on its last legs. The usual response is very Rumsfeldian, "we don't know what we don't know". But in science, that's never been true. Scientists have always known what they don't know. They just ascribed the events or conditions to God. The structure of the universe is finite and knowable. The extent of the universe is another matter, so to speak. But the rules of engagement are fixed and finite, and as we approach the point of exhausting our ignorance, we must needs question the twin notions of innovation/discovery and of growth.

In the past, growth was driven by having more mouths to feed, which led to improvements in agriculture, and with cities, primitive forms of industrialization. In order to fight for resources, armies were needed, so gay times were banned. You could only stick it where a soldier, or more brood stock, might emerge. We now use machines, more or less autonomous, to kill the folks we don't like, so massed armies aren't as important. Not to mention, The Bomb.

Economic growth driven by more humans is an anachronism. As more and more of our stuff is generated by fewer and fewer humans, creating yet more of them in privation is foolish. Not to mention evil. We need to find a way to distribute ever more stuff made by ever fewer humans to a more or less stable population. This isn't Iron Age Mesopotamia.

The 19th century filled out much of our knowledge of physics and chemistry, of the macro-world, and led to all manner of devices which we now assume as part and parcel of Modernity. I wonder whether most of us realize how little our lives differ, in kind, from those of 1900. The main difference is more stuff, yet that stuff is largely miniaturized versions of what Grand Pappy had.

Do we really need yet more non-productive jobs, as we have in the financial services sector? In fact, yes. Finance, and gummint, have been sopping up excess man-hours for some decades. Finance creates its own myth of production, but really only serves one purpose: to marry those with excess moolah with those having a dearth of same. That large wages are accrued for this simple task is a puzzlement. But, what, exactly, will all those STEM students be figuring out in twenty years? For what phenomena of today's world do scientists name God (or, equivalently, "we've no freaking idea!") as the cause? Should we be spending our time writing apps such as "Yo" (look it up)? Yes, it seems so.

24 June 2014

Who's Your Big Brother?

To answer the question, in general it ain't the damn Gummint.

23 June 2014

A Capital Idea

This posting, yet another attempt to game the market, asserts that QE is, to some degree, responsible for the mess we're in:
I am open to the idea we have entered a period of structurally low volatility due to increased regulatory burden and flow on effects from the decline of institutional FICC trading. Or it may just be a function of QE, and post-tapering we will see a return to higher levels.

FICC trading is the evil spawn of the whole financial services sector. Were it to disappear, we'd all be better off. But what finally motivated me to track down some specific data was the whack at the QE pinata. It's been my sense, just watching the market news, that corporations as a whole are just not investing in productive activities. That the Great Recession was triggered by rogue fiduciary investing, on a mammoth scale, is the elephant in the room. Corporations simply would rather engineer moolah than build productive capacity. The simultaneous destruction of middle class demand for goods just might have something to do with it.

But, is there some quantitative measure of physical investment over some period of time? Well, Virginia, yes there is.

Here's a posting showing core capital goods, the kind we're interested in. It's the sixth graph. From this graph we get:
1990 - $36 billion (appr.)
2013 - $68 billion (appr.)

About double, if we assume that these are price adjusted. Rather less, if not.

But absolute value isn't what we should care about. Rather, physical investment as a proportion of the USofA economy makes for a more accurate picture of what the 1% are doing with all that moolah they've squirreled away. So, then, what are the GDP numbers?

Here's one graph of them.
1990 - $ 5,979 billion (appr.)
2013 - $16,799 billion (appr.)

percents:
1990 - 0.6%
2013 - 0.4%
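For anyone who wants to rerun the arithmetic with better figures than my eyeballing of the graphs, here's a minimal sketch (the dollar values are just the approximations above; Python only does the division):

import sys  # not strictly needed; shown so the snippet stands alone

# Approximate figures read off the graphs cited above, in billions of dollars.
core_capex = {1990: 36, 2013: 68}
gdp = {1990: 5979, 2013: 16799}

for year in (1990, 2013):
    share = core_capex[year] / gdp[year] * 100
    print(f"{year}: core capital goods spending = {share:.2f}% of GDP")

# 1990: core capital goods spending = 0.60% of GDP
# 2013: core capital goods spending = 0.40% of GDP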

So, there's your answer. To answer the poster's question: QE hasn't much to do with it. Real investment, relative to the size of the economy, has cratered, thus driving the 1% to fiduciary loophole mining to get their well-deserved 10% return on their moolah. And, since it's clear that physical investment opportunities are either scarce as hens' teeth or too poor in return to be viable at the desired 10% return, the monetary interest rate must fall to meet the real return on real investment. IOW, with or without QE, we're facing (or, more likely, already amidst) a long period of investment stagnation brought on by both stagnant middle class demand for goods and ever diminishing real innovation in science and engineering. A Brave New World, so to speak.

21 June 2014

A Firm Grasp of the Obvious. Not.

A tip of the Hatlo Hat to the good folks at simple-talk for this gem of wisdom. If only the Kiddies would listen and shape up.

18 June 2014

Tomayto? Tomahto?

What began as a minor, though significant, theme of these endeavors, that Kiddie Koders' infatuation with xml/NoSql/etc. is a sign of both emotional immaturity and pure dumbness, seems to impel me to type away more than most other concerns do. This sort of infatuation, which derives from folks who insist they've invented something Really Neat and New, morphed into The Great Recession. The operative word has been Innovation (always, of course, capitalized).

There have been more than a few missives in these endeavors intended to eviscerate such nonsense. Well. Today I came across this Forbes piece by Mark Rogowsky. I recommend you read it, but it's not the subject of these musings. Rather, a piece from The New Yorker which he cites early on is our jumping off point. Why The New Yorker didn't send me the assignment is lost to history.

Lepore starts her narrative in the general vicinity of my beginnings: she in Cambridge, while I was in Government Center in Boston. She in the 1980s, I the 1970s. Not that there was a huge difference. Well, mainframe for me and PC for her. So, she must know her chops? Mostly.

OK. She starts, and perhaps not fully knowingly ends, with Harvard Business School. We meet the main protagonist, Clayton M. Christensen, author.
Christensen was interested in why companies fail. In his 1997 book, "The Innovator's Dilemma," he argued that, very often, it isn't because their executives made bad decisions but because they made good decisions, the same kind of good decisions that had made those companies successful for decades. (The "innovator's dilemma" is that "doing the right thing is the wrong thing.")

I confess to not knowing, or at least not remembering, Christensen, although the notion, and the book titles, are familiar. It all sounded like snake oil when I first met them. Lepore's prose doesn't change that. No surprise in that.

The thesis:
Manufacturers of mainframe computers made good decisions about making and selling mainframe computers and devising important refinements to them in their R. & D. departments--"sustaining innovations," Christensen called them--but, busy pleasing their mainframe customers, one tinker at a time, they missed what an entirely untapped customer wanted, personal computers, the market for which was created by what Christensen called "disruptive innovation": the selling of a cheaper, poorer-quality product that initially reaches less profitable customers but eventually takes over and devours an entire industry.

I'll interject right here the falsehood being perpetrated: Apple, by today's lights, is the innovating disrupter. Notice any disconnect? Here it is: Apple takes a cheap BOM, uses very clever marketing, and sells at preposterous margin to the top 20% or so of the market. So, then, what characterizes innovation? Is it Good Enough But Cheap? Or is it Cheap But Chic? Clearly Christensen has his head up his sphincter and spies the world through his umbilicus. Prose cleaned up for the Kiddies.

Lepore spends her text picking apart the Christensen examples which support his thesis. In particular, she takes him to task:
The handpicked case study, which is Christensen's method, is a notoriously weak foundation on which to build a theory.

One such, which Lepore spends much text on, is the hard drive industry. Both of them are fundamentally wrong. What neither gets to is the science and engineering of hard drives. It was new science and engineering that made possible multiple terabytes in a 3.5 inch form factor with predictable performance and longevity. For nearly two decades, IBM has been shipping mainframes with 3.5 inch drives. They're not just cheaper, they're better. IBM sold off its HDD segment to Hitachi in 2002, and Western Digital now owns it. It's worth mentioning that IBM, while not the discoverer of GMR, was first to ship it. I guess startups aren't the only place that figures stuff out.

But, of course, the hand-picked case study is the signature method of The B-School. And most of those studies are written by B-Schoolers. What's the punchline to the fable of the frog and the scorpion? "It's my nature." Why she wouldn't know that is unsaid.

Also left unsaid, by my reading, is the explicit accusation that "disruptive innovation" is just word salad. Small salad, being only two words, but bereft of any intrinsic meaning.

She does make passing reference to business change from the 19th century through the 20th, but makes no explicit mention of what it was that drove change (call it innovation) in manufacturing over that period of time. And the answer is: scientific discovery. By 1800, Franklin had been dead for ten years, and Newton didn't die until after Franklin was born. Thermodynamics was codified about 1850, give or take. Petroleum by Drake is 1859. Steel making, open hearth and Bessemer, same time. The last natural element of the periodic table, 1939 (depending on how one measures). The point, as asserted before, is that up to Hiroshima (or thereabouts), humans were figuring out the real world and innovating through new discovery. Today, not so much. We know most of what there is to know about the real world. We don't yet know what dark matter and dark energy really are, and there's not much likelihood we'd be able to exploit them the way we have petroleum and fission.

Today, innovation is mostly old wine in new bottles. With fancy labels, aka marketing. All that financial innovation which gave us the Great Recession? Just a repeat of bucket shops from the early 1900s.

In the end, is there any there, there? Is "disruptive innovation" redundant? Isn't any true innovation disruptive by definition? Could we just be many, many steps along Zeno's dichotomy paradox, making ever more inconsequential modifications to all our widgets? Given the paucity of new science applicable to normal commerce, is the reality of "disruptive innovation" just another way to say corruption of the rules of engagement? Certainly, financial innovation as executed in the last couple of decades qualifies. Apple's version is just slick marketing of cheap goods sold dear. That's hardly innovation; Swiss watches have been that for a couple of hundred years. Google and such are, in the end, just advert pushing platforms. Adverts are hardly innovation; likely the second oldest profession.

17 June 2014

Brawn Over Brain

Well, as you know, another shoe dropped with SanDisk inhaling Fusion-io. What does it all mean, Mr. Natural? More than just "shit", it would seem.

In the early days of SSD, which is only a couple of years ago, it was common wisdom that the intellectual property embedded in the controller was the differentiating factor (The Brain). Those manufacturers who were smarter would prevail. Not so much, here at the end of the Yellow Brick Road. Turns out, how to use NAND as persistent storage (in SATA/SAS protocol) isn't all that arcane. Turns out, there isn't much difference among controllers, skipping the likes of JMicron, of course.

Depending on how you count, there are between two and five NAND producers (overlapping JVs and such; and IBM may have to, for internal use by Texas Memory). The Brawn. The likes of Fusion-io and Violin and the boutique set are in a bind: as node size has shrunk, inherent performance has degraded, but so has the unit cost of a bit. SSD performance is near, if not arrived at, Close Enough For Government Work. They're all hostage to the fab holders. Who appear to be picking them off one by one. Once we arrive at directly attached flash persistent storage, and the smart guys are getting close, SSD-style controller IP is valueless.

The Fusion-io price, right now, is way below the IPO price. If you bought at the very bottom (a few pennies under $8 at the beginning of the month), you did OK, but nothing to retire on (unless you already had enough and put it in Fusion-io). Otherwise, not. One might wonder why Nimble is valued at $1.98 billion as I type.

Driven to Data Distraction, Part The Second

There wasn't intended to be a Part The Second, but in reviewing (after hitting "Save", naturally) the text, the punch line and the trail of breadcrumbs to same are less than obvious. Hindsight, and all that.

Since the point of Part The First was that the quants took a beating on this one, a clearer road to that end might be warranted.

Let's start with Nate. While I acknowledge that Nate is a saint, the title of his piece, "Eric Cantor's Loss Was Like an Earthquake", and the substance of it, are so far up his ass that it's hard to overstate his error. Nate makes the signature mistake of quants: analogizing from natural processes to human processes. Natural processes obey external (God or Nature or the Stay Puft Man, as one prefers) rules which such processes can't alter, while human processes are always subject to the dominant actor changing the rules to suit. And while natural processes appear, to the instant human eye, to exhibit black swan events, they're not. They're just the result of insufficient data. Weather forecasting is the prime example. At one time, weather events could be predicted only grossly, and mostly if one could communicate with others upstream of the weather highway. With more data, and very big computers, weather and climate models turn black swans into Daffy Ducks. To the creatures of the time, the Yucatan asteroid was a black swan event. Today, not at all. We have the data. We might even be able to avoid the collision with available tech.

In the human sphere, data, in the sense of not being anecdotal information (which quants sneer at, of course), will never trump knowing what the puppet masters are up to.

As the local reporter from Carr's piece says:
Jim McConnell kind of saw it coming. As a staff reporter at The Chesterfield Observer, a large weekly that serves the suburbs of Richmond, he wrote several articles in the spring suggesting Mr. Cantor was in for a fight. And on June 4, he suggested that "Brat's campaign is gathering steam as it hurtles toward the finish line."

Those quants who look only at historical election data, polls, and current public polling will never see that the rules have changed. Change the rules, change the outcome. The presence of Ingraham and the other astroturfers in the field was the "data" that mattered. Carr is right that national media didn't especially notice (none has put up his hand and said, "I told you so", that I've read; I can wait). The over-vote was what mattered. Could it have been known before election day? Probably not, but the media, following the vote totals as precincts reported (assuming that they reported serially, which I don't know), would have known something was up.

The notion that yesterday looked like the day before, today looks like yesterday, so tomorrow will look like today (mostly) is the Achilles heel of time series analysis of human processes. There's no guarantee that the rules controlling the data are stable. Fact is, I'd assert that the rules will be changed by the dominant actor once their pain reaches some Δ.
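To make the Achilles heel concrete, here's a toy sketch (entirely invented numbers, Python only because it's handy) of what happens to the "tomorrow looks like today" forecast when the rule generating the series changes midstream:

import numpy as np

rng = np.random.default_rng(0)
before = 100 + rng.normal(0, 1, 50)   # old regime: the series hovers around 100
after = 80 + rng.normal(0, 1, 50)     # dominant actor changes the rules: level drops to 80

# "Tomorrow will look like today": forecast the future as the average of the past.
forecast = before.mean()
print(f"average miss, old regime: {abs(before - forecast).mean():.1f}")   # roughly 0.8
print(f"average miss, new regime: {abs(after - forecast).mean():.1f}")    # roughly 20

The fitted model isn't wrong about the old data; it's wrong about the world, which is the whole point.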

Now that wage arbitrage has been played out to its end (too little labor left in most production to make it worth the effort), we see corporations running like the bulls of Pamplona to tax arbitrage. They have found their Δ. Alito, et al, have declared corporations people, but Leona made it clear, "only little people pay taxes". Eventually, they'll kill off the middle class golden goose consumer across the planet. Good on them.

16 June 2014

Driven to Data Distraction

OK, twist my arm. It's Cantor time.

David Carr titles today's piece, "Eric Cantor's Defeat Exposed a Beltway Journalism Blind Spot" (my dead trees version has a slightly different headline). Much of his writing is about the evisceration of newspaper staffs, but he also, obliquely, takes aim at the data bits of elections. Specifically, campaign internal polls. Only good news can be had.
Data-driven news sites are all the rage, but what happens when newspapers no longer have the money to commission comprehensive, legitimate polls? The quants took a beating on this one, partly because journalists are left to read the same partisan surveys and spotty local reporting as Mr. Cantor's campaign staff, whose own polling had him up by more than 30 points.
[emphasis added]

There may well have been nothing wrong with Cantor's polling per se, in the sense that Cantor's operatives didn't purposely skew the data to please Cantor and campaign management. What a candidate tells the public is a different story, of course; but serious ones want serious, accurate polling data. Otherwise, he's sailing blind. As with all stratified random sampling, getting the strata right is the most important task. Only then can you draw proper inferences. I'll go out on a limb and postulate that Cantor's campaign didn't spend much time polling in the truly redneck counties (too many miles for too few responses, and it's been reported in the post mortem that he didn't campaign much), and that Brat (and the Right Wingnut surrogates like Ingraham) did. They all got in their pickup trucks with gun rack and plastic Jesus and headed off to vote that Communist Cantor off the ballot. I mean, isn't that a Jewish word?
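To put some (entirely invented) numbers on the strata problem: if the rural counties are barely sampled, the raw poll number and the properly weighted one can tell opposite stories.

# Hypothetical illustration of why strata weights matter; every number here is invented.
# Say the district is 40% suburban and 60% rural, but the pollsters mostly reach suburbia.
strata = {
    "suburban": {"share": 0.40, "sampled": 300, "cantor": 0.65},
    "rural":    {"share": 0.60, "sampled": 50,  "cantor": 0.35},
}

total = sum(s["sampled"] for s in strata.values())
raw = sum(s["sampled"] * s["cantor"] for s in strata.values()) / total   # whoever answered the phone
weighted = sum(s["share"] * s["cantor"] for s in strata.values())        # weighted by the electorate

print(f"raw poll estimate:    {raw:.0%}")       # about 61% -- looks like a romp
print(f"population-weighted:  {weighted:.0%}")  # about 47% -- a dead heat, or worse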

As asserted here much of the time, data is subservient to policy whenever they disagree. Believing one's own propaganda is fraught with terror. In Cantor's case, one might chuckle that he was hung by the lynch mob he created. One could also conclude that there's a scary bit of Nazi out there in the South. As if it ever left. The real question will be whether the 65,000 or so who voted in the primary represent the thoughts (I use the word with trepidation) of most of 7th/VA. Or could it be Something Else? Read on.

In a previous missive I asked, plaintively, what Nate might have thought. Well he did have some thoughts. He takes the earthquake analogy to the ends of the earth. An interesting read, but I didn't see that he did any better than any other pundit in explaining the defeat. It clearly wasn't in any of the available data before the election, as Carr points out. But what was Ingraham doing out there?

If there were ever a case where policy beat the crap out of data, this is it. No rational Republican would cut off his nose to spite his face by giving up a seat of such power. But the yahoos in 7th/VA were happy to do so. Or, as Will put it, Cantor was "hoisted with his own petard". Or not?

Some data.

According to the Cook PVI, 7th/VA isn't even the most right wing district in VA, at R+10. But, anyone who's lived in the DC environs knows, there's two Virginias, and the Old South version is increasingly paranoid, even as it gathers up moolah from the socialist bureaucrats living in the DC commuting counties.

Let your fingers do the walking through the Google pages. In an unsuccessful attempt to find the precinct-level vote cast (did Brat really win by getting out the redneck vote?), I did find this bit of irony:
Virginia has an open primary process, in which registered voters do not have to be members of a party to vote in that party's primary. With the Democrats in the 7th District having already nominated their candidate at a convention on June 7, they were free to vote in the Republican primary on Tuesday. The 17,900 additional voters casting a ballot in this year's Republican primary relative to in 2012 could be a result of Democrats voting in an attempt to unseat Cantor.

Even so, only about 10% of the district's voters made the decision. About time we had mandatory voting?

Could it be? Democrats can play dirty? Or did Ingraham, et al, persuade the gun rack brigade? We won't know with any certainty until the general election, if then. A Democrat false flag operation would certainly explain the discrepancy between polling and outcome. But so would the gun rack brigade. An educated surmise could be drawn if one knew which precincts accounted for those "extra" 17,900 votes. Did they happen in Blue Leaning precincts or Red Leaning ones? Were these votes spread across all precincts? Didn't find the numbers, alas. I've been polled (well, attempts have been made), and the first question is always "what's your party?". Cantor's pollsters would likely have skipped those that said "Democrat". Or the long-hair with tattoos who said, "fuck off". And so forth.

Finally, The Times has this graph (thanks R?), which appears to show Cantor getting the DC ex-urbanite vote, while Brat took the downstate ruralists and some suburban Richmond. Could be a mix of Democratic counterinsurgency and Tea Party get out the vote. This Fairfax County FAQ says affiliation is not tracked.

Cantor's defeat is open to much speculation. Love it.

15 June 2014

An Artist's Creative Response to Just Criticism

In a comment to "Apps Uber Alles" in one of its myriad incarnations, ONL remarked "And the IT industry is the least labor intensive industry ever -- in the history of mankind." I further commented that this was, sorta kinda, true but that it depends. And that it would require (or I wanted to write) another post.

This is that post.

In my memory, it was Paul Graham who set the leverage standard nearly two decades ago. Fact is, though, most software hasn't been developed that way (LISP and the innterTubes). While WhatsApp and Uber are designed for smartphones, they actually do very little, from a systems' point of view. The valuations paid are due mostly to the expectation that each will become the de facto advert platform of preference; thus sucking up advert dollars from other platforms. It's a zero sum game in the web/app world. In due time, web app developers will be making advert platforms blurbing other advert platforms. And yet another level soon thereafter.

Most other IT development is geared toward corporate apps, both internal to the corporation and external to the customers. Such apps rely on reliably storing massive amounts of financial data (and personal data, etc.) both securely and forevermore. These apps take tens of thousands of man-hours just to maintain. Few new ones are being built, since most of the functions have been computerized since at least the IBM S/360 (aka, the 1960s). The revenge of COBOL and VSAM. Been there, done that. The Ninth Circle of Hell.

On the third side of productivity is office automation, i.e. making the users of IT's spawn more productive. Not so much. Studies over the years have found that word processing, spreadsheets, and desktop databases in cubicles have done little to nothing to improve productivity. Which is not surprising, since most of what goes on in those apps is increasingly (certainly since the GUI/Windows/Mac versions) aimed at the sizzle rather than the steak.

On the fourth side is games. Nothing I've ever done, but following the publicly traded companies in the space, it seems clear that a game requires many thousands of man-hours, if not man-years, and only a few break even. Publishers die or are acquired for pennies on the dollar.

On the whole, then, IT has been where excess man-hours got sucked up. All those callow young men eager to get into software (hardware requires real engineering, e.g. EE, and that's too tough a curriculum) and get rich very quick. Whether that will continue to be true is an open question. If applications going forward amount to novel advert platforms on the web, then IT's ability to absorb excess labor ends. There's not much functionality in any one such app, so there's not much effort to make such. What matters is the novelty of the application. Think: toy. As six-year-olds' attention span for a toy can be measured in days or even hours, one might expect these sorts of IT efforts to face similar fates. We'll see.

14 June 2014

The Office of Central Planning

I've long since forgotten where I read it, or who said it, but the gist of it was that what made Larry Bird special (and was never recorded in his stats) was his ability to "make the pass before the assist". In other words, he saw where the ball had to go before the ball knew where it had to be for its next stop on the way to the basket. It is a rare talent. While trite, there is no I in team. Few actually play that way.

In a couple of stories (here and again here) today, writers make much the same argument for the Spurs as a team, without mentioning Bird, alas.
Told that Tim Duncan, the Spurs' ageless and brilliant forward, had broken the record for career double-doubles in the playoffs, Popovich shrugged. "I can assure you he doesn't care," he said.

and then this:
The Spurs do not run these pick-and-rolls to score immediately, but rather to create easy baskets a few passes later.
...
When Diaw shares the court with Manu Ginobili, another ace passer, San Antonio has been unstoppable. In 87 minutes together this finals, lineups with Diaw and Ginobili have outscored the Heat by an unfathomable 79 points.

I guess one Bird is worth a Ginobili plus a Diaw.

What has American hoops got to do with the RM and such? Well, only that doing what you're doing better than anyone generally works out better. Dr. Codd was right, the RBAR zealots be damned. Aspire to be Bird.

13 June 2014

What's The Word

Reali:
Data geeks' demand for easy answers to difficult problems by just looking at easy data is _______.

Kornheiser:
The word is copula-itis. Now, careful kiddies, it's spelled a little differently. But it's about the same thing. Cheap thrills now, and great pain later. You pay for cheap pleasure today, and tomorrow you gotta go get the shot. Don't do it!! Just don't do it!!!

12 June 2014

Apps Uber Alles

Let's review. WhatsApp had about 50 bodies and sold for $19 billion, and likely in the end, more. Uber is valued at $18 billion, and has, according to reports, 550 bodies right now. Given how much longer it's been around, I'd assert they're both in the same ballpark vis-a-vis development and business plan.

Both those of the Left and Right are drinking from the same bucket of Flavor Aid: education and tech will solve the employment problem. Not. The balance of this missive explains why.

In the 19th century, the USofA imported (not always willingly on their part) lots of folks from other countries and continents to, mostly, support agriculture and construction projects. Africans raising rice and cotton, Chinese building railroads, and Irish canals. And so on.

Come the early 20th century, and manufacturing enters the assembly line (mass production) era. There are multiple waves of displaced agricultural workers going to cities to fill said factories. Displaced by machines coming out of cities, mostly Up North. The skill level of assembly line work is, arguably, lower than that needed to farm successfully. There was, in any case, sufficient demand to employ both the displaced and further immigrations from Europe (Irish, then Italian). Labor productivity was still low, as automation was minimal. Capital generally replaced craftsmanship, not low skill assembly labor.

There is no evidence that the "tech economy" is on pace to absorb displaced workers this time. Productive labor has been sent to autocratic countries wherever they can be found. Labor productivity (widgets per man-hour), due to increasingly sophisticated machinery, continues to increase. Kind of like dark energy exploding the universe until no celestial object can see any other. That's some billions of years away, thankfully.

As WhatsApp and Uber make clear, a bit of code goes a long way. It's nowhere near comparable to 19th and 20th century farm workers taking jobs on the line at Ford, GM, RCA. Not even close. To make matters yet worse, from a macro survival perspective, so long as the USPTO and courts allow anyone to patent a ham sandwich recipe, there will be no competition within a newly defined market. But, given the leverage displayed by WhatsApp and Uber, we'll need literally millions such newly minted companies to absorb all those highly educated techies.

Guess where most cybercrime comes from? Yeah, you guessed it: countries with lots of idle highly educated techies. Be careful what you wish for.

The Origin of Specie

Some may recall these endeavors' reporting a few months back on the so-called myth of Dr Copper: it appeared, according to reports, that Chinese businesses were using copper stores as surrogate specie. Some saw the fall in price as an omen of larger collapse. Well. The reasoning is that copper has actual industrial uses, and its price is therefore a simple surrogate for manufacturing. Kind of like an anecdotal copula; skip the hard data collection, and just watch the price of copper. The correlation is tight as a drum, don't you know?

Turns out the Chinese may have a general commodities issue. Could be Ponzi, could be pyramid, could be simple fraud. "Beautiful Redhead in the third at Suffolk. Can't lose!!"

Oh, and by the way, while I only read the headline various places, Steve Forbes is on his gold standard LSD trip again. Why? Well, as one of those .01%-ers with mucho moolah, with no idea how to invest it productively, and demanding the God-given right to 15% return on his moolah, deflation looks like the best deal available. Go read the economic history of 19th century USofA, and you'll see a hundred years' war of the money holders against everyone else. Deflation and wealth concentration were the course. Rand would be giddy, naturally.

Regular readers can see a rotating (and a few permanent, so far) set of quotes that top these endeavors. Here's the one apropos to this missive:

If you like what OPEC means for oil prices, you'd love what the gold standard would do to financial markets.
-- Michael Feroli/2011

10 June 2014

Square Dancing [update]

More often than not (and, likely, more than I should) I'll expound, "inferential stats is just about squared differences, and all the other stuff is just... stuff". Or words to that effect.

Well, I came across this post, which contains a turn of phrase nearly as dismissive.
...[there] may be a cognitive bias in me that prefers old-skool solutions along the lines of "just log everything and do a t-test, it'll be about right and then you can go home early" (a strange attitude in one who spends a lot of time doing Bayesian structural equation models).
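For the curious, the "old-skool" recipe in that quote really is about two lines of work (a sketch with invented data; scipy's ttest_ind and numpy's log are the only machinery):

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Invented, right-skewed data: say, response times from two versions of a page.
control = rng.lognormal(mean=3.0, sigma=0.5, size=200)
variant = rng.lognormal(mean=2.9, sigma=0.5, size=200)

# "Just log everything and do a t-test": the log tames the skew,
# then an ordinary two-sample t-test compares the (log) means.
t_stat, p_value = stats.ttest_ind(np.log(control), np.log(variant))
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

Then go home early, as the man says.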

One might wonder, might one, whether Nate takes such issues into consideration?

[update]
Eric Cantor, come on down!!

08 June 2014

Chrome Dome

Bald headed guys like Celko (and, nearly, Your 'Umble Servant) get called chrome dome in jest or anger. California Chrome's owner (well, one of them) has been ranting that the other owners of horses in the races, specifically the Belmont, were cowards. Chrome Dumb, in the view of "traditionalists". I hadn't given it any thought, since I follow horse racing not at all, having seen the nonsense at Suffolk Downs in the 70's. But the Triple Crown is made for casual fans.

As Coburn was ranting (I saw the first one live), it occurred to me that he may be right. That is, there was a time when horses did run all three races, and few if any just sat out some in favor of others. The issue this year was the number that sat out both earlier races. Since I'm not up on horse racing data (the full field from each race run by a Triple Crown winner), I let my fingers do the walking through the Google pages: some one of those touts had to have reviewed the history of Triple Crown winners, and the horses faced in all three races. Among the complaints from "traditionalists": the Triple Crown races have always been this way, so leave 'em alone. Not so, it turns out. So, changing the rule, as Coburn insists, so that only those who run (enter?) the Derby can run the following races wouldn't be the first time the Triple Crown races have veered from previous structure. What about requiring that they run all races?

And so it was for the horses in the 70's.
There's nothing illegal or unethical about the losing Kentucky Derby horses who skipped the Preakness in order to take fresh aim at the Belmont. But the last three Triple Crown winners didn't face that kind of strategy from their top rivals.

Looks like history, at least the most recent, is on Coburn's side.

It ain't a lot of data, but it's still data.

05 June 2014

Serial Killers

In the bad old days, Bill Gates would try to parry the monopolist meme by saying that MicroSoft needed to be left alone so it could "innovate". Never mind that from day one it co-opted and/or stole everything it made. BASIC came from Dartmouth. DOS came from Seattle Computer. Office was made under contract to Apple. Windows came from Apple, which stole it from PARC. Apple's operating systems are cadged from BSD, in the main (BSD, unlike linux, permits its own theft). So much innovation, yet so few places to steal it. The hypocrite's dilemma.

What's worser, sometimes, is to read a turn of phrase that impels one to slap one's face with vigor. "I wish I'd said that!!" It gets worser, still, if the kernel of the statement has been in one's lexicon and musings for some time, just not as elegantly. Oh, the horror. Much more heft in that slap. Damn, my face is sore.

Today is such a day. Sigh.

Jesse Eisinger weighs in on Valeant/Allergan, and opens with this sentence:
On Wall Street, financial engineering masquerades as vision.

Boy, do I wish I'd said that. Of course, I have. Just not quite.

The story of Valeant's assault on Allergan is worth understanding in its own right, so please go read the story. If not, here's the crux of the matter:
J. Edward Ketz, an accounting professor at Penn State University, took the company's cash flow and adjusted it for all the spending from acquiring companies and paying the restructuring costs. By Professor Ketz's reckoning, Valeant has sent more than $1 billion in cash out the door each of the last four years and was negative $5.4 billion in 2013 alone.

This strategy isn't unique to Valeant or bio-pharma generally. But it does highlight a recently repeated lament: these corporate CEOs are paid the big bucks, supposedly, for their ability to allocate capital productively. Fiduciary manipulation, which is what financial engineering really is, can have short term micro benefit, but doesn't increase either micro or macro productivity, since it's just a case of moving moolah from Peter to pay Paul. But, as your Good Mother used to yell at you when you did something stupid and foolish, "What would the world be like if everybody behaved like you??!!". An economy based on finance and tax evasion and big fish eating little fish (and then spitting out the bones and guts) fails spectacularly soon enough.

03 June 2014

Is It a Bull, Or Is It a Steer?

Bull or bullshit??? Of late, the market pundits have been making noise with two assertions:
1) longest bull market in history
and
2) active investing is stupid

Well, let's see. Here's a graph of the Dow from January, 2003 to June, 2014. Notice anything interesting?
(I got this one from 5yearcharts.com)

Look closer. You'll note that the pre-Great Recession trend was re-intersected only last year. Not exactly a raging bull market, especially from the perspective of the passive pundits. And even more especially true of the passive pundits who advise "individual investors", i.e. the 401(k) crew, to just buy Mr. Market. And all will be well.

Not so much. If such a passive investor had retired in early 2008, some/much/all of his nest egg disappeared during those four years. And given that such an investor needed moolah to house, clothe, and feed himself, just leaving it there to re-build itself wouldn't have been an option. Unless he had massively over-saved, he's now living in an SRO in Watts. Those who wanted to retire in 2009 to 2014 could have avoided, nay had to avoid, doing so, in order to rebuild that nest egg. But now they know that they've no control over when or how well they can retire. Further, given that capital is fleeing from real investment, pyramiding of fiduciary investment is the name of the game. Hello, housing!

The whole point of defined benefit retirement plans is countervailing power. Professional, active management keeps track of Mr. Market. The advantage is that booms build the capital base, while avoiding busts. The notion that Mr. Market, long term, always gets bigger is based on a much smaller planet. Smaller in absolute population, but also, and more importantly, vastly smaller in middle class demands. Not to mention that we're running out of resources, water and arable land in particular.

The 401(k) scam is simply that the financial sector found a way to extract ever more baksheesh from workers. With managed plans covering thousands, and even tens of thousands, of individuals, the opportunity is far smaller. Divide and conquer.

From dictionary.com:
steer
noun, plural steers (especially collectively) steer.
a male bovine that is castrated before sexual maturity, especially one raised for beef.