IoT: an ideal example of “recombinant innovation”!

I’m currently reading Erik Brynjolfsson (say that one fast three times…) and Andy McAfee’s brilliant The Second Machine Age, which I highly recommend as an overview of the opportunities and pitfalls of what they call “brilliant technologies.”

While they don’t specifically mention the IoT, I was riveted by one section in which they contrasted current digital innovation with past technologies, using economist Paul Romer’s term “recombinant innovation”:

“Economic growth occurs whenever people take resources and rearrange them in ways that make them more valuable…. Every generation has perceived the limits to growth that finite resources and undesirable side effects would pose if no new … ideas were discovered. And every generation has underestimated the potential for finding new … ideas. We consistently fail to grasp how many ideas remain to be discovered… Possibilities do not merely add up, they multiply.” (my emphasis)

I felt like Molière’s Monsieur Jourdain, who was surprised to learn he’d been speaking prose all his life: I realized Romer’s term and definition were a more elegant version of something I’ve written before, especially about IFTTT, as an Essential Truth of the IoT, namely that sharing data is critical to achieving the IoT’s full potential. IFTTT is a great example of Romer’s argument in practice: individuals are “tak(ing) resources and rearrang(ing) them in ways that make them more valuable.” As Brynjolfsson and McAfee write:

“… digital innovation is recombinant innovation in its purest form. Each development becomes a building block for future innovations. Progress doesn’t run out; it accumulates. And the digital world doesn’t respect any boundaries. It extends into the physical one, leading to cars and planes that drive themselves, printers that make parts, and so on…. We’ll call this the ‘innovation-as-building-block’ view of the world….” (again, my emphasis)

This is such a powerful concept. Think of Legos: not the specialized kits that dominate today, whose pieces can only build one specific model, but the good ol’ basic bricks that could be recombined in countless ways. It’s why I believe that even the well-thought-out projections of the IoT’s potential size are probably on the low side: there’s simply no way to predict now all the creative, life-saving, money-saving, or quality-of-life-enhancing ways the IoT will manifest itself until people within and outside of organizations take new IoT devices and combine them in IFTTT-like “Recipes” that would never have occurred to the devices’ creators (a toy sketch of such a Recipe follows below). But beware: none of this will happen if companies cling to proprietary standards or don’t open their APIs and other tools to all those who can benefit.
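
To make the recombination point concrete, here’s a minimal sketch of what an IFTTT-style trigger/action “Recipe” can look like in code. This is my own illustrative Python, not IFTTT’s actual platform; the sensor, sprinkler, and threshold are all hypothetical stand-ins for any vendor that exposes an open API.

```python
# A minimal sketch of an IFTTT-style "Recipe": a trigger read from one
# vendor's device drives an action on a completely different vendor's
# device. Everything here is hypothetical; the point is that neither
# vendor had to anticipate the pairing, as long as both expose open APIs.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Recipe:
    name: str
    trigger: Callable[[], bool]   # polls a device/API; True means "fire"
    action: Callable[[], None]    # side effect on some other device/API

def run_once(recipes: list[Recipe]) -> None:
    """A single polling pass: run every recipe whose trigger is satisfied."""
    for recipe in recipes:
        if recipe.trigger():
            print(f"[{recipe.name}] trigger fired -> running action")
            recipe.action()

# --- Hypothetical devices standing in for real open APIs ---
def read_soil_moisture() -> float:
    return 12.0   # percent; imagine a call to a garden sensor's API

def sprinkler_on() -> None:
    print("sprinkler: ON for 10 minutes")   # imagine a sprinkler API call

recipes = [
    Recipe(name="dry soil -> water the garden",
           trigger=lambda: read_soil_moisture() < 20.0,
           action=sprinkler_on),
]

run_once(recipes)
```

Nothing about the sensor’s maker required it to know sprinklers exist; the recombination happens entirely in the Recipe layer, which is exactly why open APIs matter.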

How exciting!


Crucially important cautionary note about data’s limits!

Posted on 4th February 2014 in Internet of Things, open data, US government

I yield to no one in my passion for liberating data, and for its potential role in improving decision-making. It’s essential to the full realization of the Internet of Things, and yes, it can even save lives (not to mention baseball teams; witness Michael Lewis’ wonderful Moneyball!). However, I implore you to read “Why Quants Don’t Know Everything,” a gem by Felix Salmon tucked into the current Wired issue. It documents a disturbing pattern: decision-making in everything from baseball to, yes, the NSA can be distorted, with serious consequences, when the “quants” take over completely and data is followed blindly. Salmon begins with the NSA’s insatiable appetite for data:

“Once it was clear that the NSA could do something, it seemed inarguable that the agency should do it—even after the bounds of information overload (billions of records added to bulging databases every day) or basic decency (spying on allied heads of state, for example) had long since been surpassed. The value of every marginal gigabyte of high tech signals intelligence was, at least in theory, quantifiable. The downside—the inability to prioritize essential intelligence and act on it; the damage to America’s democratic legitimacy—was not. As a result, during the past couple of decades spycraft went from being a pursuit driven by human judgment calls to one driven by technical capability.”

Let me emphasize: technical capability came to trump human judgment calls. I suspect there’s not much question among you, dear readers, that the NSA went too far. But Salmon sees a broader problem with unchecked faith in data:

“The reason the quants win is that they’re almost always right—at least at first. They find numerical patterns or invent ingenious algorithms that increase profits or solve problems in ways that no amount of subjective experience can match. But what happens after the quants win is not always the data-driven paradise that they and their boosters expected. The more a field is run by a system, the more that system creates incentives for everyone (employees, customers, competitors) to change their behavior in perverse ways—providing more of whatever the system is designed to measure and produce, whether that actually creates any value or not. It’s a problem that can’t be solved until the quants learn a little bit from the old-fashioned ways of thinking they’ve displaced.” (my emphasis)

Salmon goes on to show parallel stages in a wide range of fields where data is in the ascendancy:

  1. “pre-disruption.” The Neanderthal period, before data is applied to big problems.
  2. “disruption.” The example Salmon uses is the 2012 Obama campaign, where the technologists held sway and targeted voters down to the individual level based on data. You know what happened.
  3. “overshoot.” Here’s where things go off the track: “The most common problem is that all these new systems—metrics, algorithms, automated decisionmaking processes—result in humans gaming the system in rational but often unpredictable ways. (my emphasis) Sociologist Donald T. Campbell noted this dynamic back in the ’70s, when he articulated what’s come to be known as Campbell’s law: ‘The more any quantitative social indicator is used for social decision-making,’ he wrote, ‘the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.’ On a managerial level, once the quants come into an industry and disrupt it, they often don’t know when to stop. They tend not to have decades of institutional knowledge about the field in which they have found themselves. And once they’re empowered, quants tend to create systems that favor something pretty close to cheating. (again, my emphasis) As soon as managers pick a numerical metric as a way to measure whether they’re achieving their desired outcome, everybody starts maximizing that metric rather than doing the rest of their job—just as Campbell’s law predicts.”

    He then gives a number of illustrations, including “teaching to the test” and, most infamously, the bank meltdown, that can come as a result of preoccupation with data (I was particularly struck by the one dealing with serious problems in policing: um, it can kill…). Have you seen this in your field?

  4. “synthesis.” My father used to say that there was an inverse relationship between the amount of education you had and your amount of common sense (he was a little too intimidating for me to point out that he had a Ph.D.…). Here’s where the smart guys and gals learn to put data in perspective: “It’s increasingly clear that for smart organizations, living by numbers alone simply won’t work. That’s why they arrive at stage four: synthesis—the practice of marrying quantitative insights with old-fashioned subjective experience. Nate Silver himself has written thoughtfully about examples of this in his book, The Signal and the Noise. He cites baseball, which in the post-Moneyball era adopted a ‘fusion approach’ that leans on both statistics and scouting. Silver credits it with delivering the Boston Red Sox’s first World Series title in 86 years. (LOL: my emphasis!) Or consider weather forecasting: The National Weather Service employs meteorologists who, understanding the dynamics of weather systems, can improve forecasts by as much as 25 percent compared with computers alone. A similar synthesis holds in economic forecasting: Adding human judgment to statistical methods makes results roughly 15 percent more accurate. And it’s even true in chess: While the best computers can now easily beat the best humans, they can in turn be beaten by humans aided by computers.”
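
As a toy illustration of that stage-four fusion (my own sketch, not anything from Salmon or Silver), you can think of synthesis as inverse-error weighting: blend the model’s number with the human’s number, giving more say to whichever source has historically been more accurate. All the figures below are made up.

```python
# Toy sketch of "synthesis": blend a statistical forecast with a human
# expert's estimate, weighting each by its historical accuracy.
# All numbers are invented for illustration.

def fused_forecast(model_value: float, expert_value: float,
                   model_error: float, expert_error: float) -> float:
    """Inverse-error weighting: the historically more accurate source
    gets proportionally more influence over the blended estimate."""
    w_model = 1.0 / model_error
    w_expert = 1.0 / expert_error
    return (w_model * model_value + w_expert * expert_value) / (w_model + w_expert)

# Hypothetical example: tomorrow's rainfall in millimeters.
# The model says 10 mm; a meteorologist who knows the local system says 6 mm.
# The model's typical error is 4 mm, the meteorologist's is 6 mm.
print(fused_forecast(model_value=10.0, expert_value=6.0,
                     model_error=4.0, expert_error=6.0))   # -> 8.4
```

The formula itself isn’t the point; it’s that neither source gets discarded. The quants’ output and the gray-haired judgment both get a vote, sized by their track records.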

I’ve been concerned for a while that the downside of vast quantities of real-time data is that decision-makers may ignore time-honored perspective (horse sense, call it what you will) and simply get whipsawed by constantly changing data.

So yes, there will be a need for living, breathing managers in the era of the Internet of Things, even ones with grey hair! It will take time, and probably a lot of trial and error, but smart companies will attain that synthesis of quantitative insights and “old-fashioned subjective experience.”

I beg you: please read this entire article, save it, and share it. It’s a bit of critical insight that may just get drowned out by people like me calling for more, and more rapid, sharing of data.

Whew. My conscience feels redeemed!
