McKinsey IoT Report Nails It: Interoperability is Key!

I’ll be posting on various aspects of McKinsey’s new “The Internet of Things: Mapping the Value Beyond the Hype” report for quite some time.

First of all, it’s big: 148 pages in the online edition, making it the longest IoT analysis I’ve seen! Second, it’s exhaustive and insightful. Third, as with several other IoT landmarks, such as Google’s purchase of Nest and GE’s divestiture of its non-industrial internet division, the fact that a leading consulting firm would put such an emphasis on the IoT has tremendous symbolic importance.

McKinsey report — The IoT: Mapping the Value Beyond the Hype

My favorite finding:

“Interoperability is critical to maximizing the value of the Internet of Things. On average, 40 percent of the total value that can be unlocked requires different IoT systems to work together. Without these benefits, the maximum value of the applications we size would be only about $7 trillion per year in 2025, rather than $11.1 trillion.” (my emphasis)

This goes along with my most basic IoT Essential Truth, “share data.”  I’ve been preaching this mantra since my 2011 book, Data Dynamite (which, if I may toot my own horn, I believe remains the only book to focus on the sweeping benefits of a paradigm shift from hoarding data to sharing it).

I was excited to see that the specific example they zeroed in on was offshore oil rigs, which I focused on in my op-ed on “real-time regulations,” because sharing the data from a rig’s sensors could both boost operating efficiency and reduce the chance of catastrophic failure. The paper points out that there can be 30,000 sensors on a rig, but most of them function in isolation, monitoring a single machine or system:

“Interoperability would significantly improve performance by combining sensor data from different machines and systems to provide decision makers with an integrated view of performance across an entire factory or oil rig. Our research shows that more than half of the potential issues that can be identified by predictive analysis in such environments require data from multiple IoT systems. Oil and gas experts interviewed for this research estimate that interoperability could improve the effectiveness of equipment maintenance in their industry by 100 to 200 percent.”

Yet the researchers found that only about 1 percent of the rig data was being used, because it was rarely shared off the rig with others in the company and its ecosystem!
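To make the interoperability point concrete, here is a minimal sketch of the kind of cross-system rule the report describes: a condition that only becomes visible when readings from two independently monitored systems are combined. The sensor names, units, and thresholds are my own illustration, not anything from the report.

```python
# Hypothetical example: a pump's vibration sensor and a pipeline's
# pressure sensor each look fine in isolation, but the combination
# of elevated vibration AND low pressure suggests cavitation.

def combined_alerts(pump_readings, pipeline_readings):
    """Join two independently monitored systems by timestamp and flag
    conditions that neither system would catch on its own."""
    pressure_by_time = {r["t"]: r["psi"] for r in pipeline_readings}
    alerts = []
    for r in pump_readings:
        psi = pressure_by_time.get(r["t"])
        # Each reading alone is within tolerance; together they are not.
        if psi is not None and r["vibration_mm_s"] > 6.0 and psi < 40.0:
            alerts.append((r["t"], "possible cavitation"))
    return alerts

pump = [{"t": 0, "vibration_mm_s": 6.5}, {"t": 1, "vibration_mm_s": 3.0}]
pipe = [{"t": 0, "psi": 35.0}, {"t": 1, "psi": 55.0}]
print(combined_alerts(pump, pipe))  # only t=0 trips the combined rule
```

The point of the sketch is simply that the rule cannot be evaluated inside either silo alone, which is why McKinsey estimates that more than half of predictive-maintenance issues require data from multiple IoT systems.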

The section on interoperability goes on to talk about the benefits — and challenges — of linking sensor systems in examples such as urban traffic regulation, which could link not only data from stationary sensors and cameras, but also thousands of real-time feeds from individual cars and trucks and from parking meters, plus non-traffic data that could have a huge impact on performance, such as weather forecasts.

While more work needs to be done on the technical side to increase the ease of interoperability, either through the growing number of interface standards or middleware, it seems to me that a shift in management mindset is as critical as sensor and analysis technology to take advantage of this huge increase in data:

“A critical challenge is to use the flood of big data generated by IoT devices for prediction and optimization. Where IoT data are being used, they are often used only for anomaly detection or real-time control, rather than for optimization or prediction, which we know from our study of big data is where much additional value can be derived. For example, in manufacturing, an increasing number of machines are ‘wired,’ but this instrumentation is used primarily to control the tools or to send alarms when it detects something out of tolerance. The data from these tools are often not analyzed (or even collected in a place where they could be analyzed), even though the data could be used to optimize processes and head off disruptions.”

I urge you to download the whole report. I’ll blog more about it in coming weeks.

Incredible example of rethinking “things” with Internet of Things

Ladies and gentlemen, I give you the epitome of the IoT-enabled product: the trash can!

My reader statistics do not indicate this blog has a heavy readership among trash cans, but let me apologize in advance to them for what I’m about to write: it’s not personal, just factual.

I’m sorry, but you municipal trash cans are pathetic!

Dented. Chipping paint. Trash overflowing. Smelly. Pests (ever seen one of those prize city rats? Big!!!). Sometimes even knocked over. And, worst of all, you are so… DUMB. You just sit there and don’t do anything.

BigBelly trash compactor and recycling center

But that was then, and this is now.

I have seen the future of trash cans, and, equally important, perhaps the best example I’ve seen of how smart designers and company strategists can — and must — totally rethink products’ design and use because of the Internet of Things!

At last week’s Re-Work Internet of Things Summit there were many exciting new IoT examples (I’ll blog others in coming weeks), but perhaps the one that got the most people talking was the BigBelly trash compactor & recycling system, the high-tech successor to the lowly trash can.

The company’s motto is that they are “transforming waste management practices and contributing to the Smart Cities of tomorrow.” Indeed!

I was first attracted to the BigBelly systems because of my alternative-energy and environmental passions: they feature PV-powered trash compactors, which can quintuple the amount a trash container can hold, eliminating overflowing containers and reducing how frequently trucks must be sent to empty them. Because the containers are closed, there are no more ugly banana peels and McDonald’s wrappers assaulting your delicate eyes — or nose! Equally important, each is paired with a recycling container, something almost never seen on city streets, dramatically reducing the amount of recyclables that go into regular trash simply because no recycling containers are accessible downtown. These features alone would be a noteworthy advance over conventional trash cans.

But BigBelly wasn’t content to just improve the efficiency of trash and recyclable collection: they decided to make the containers smart.

The company worked with Digi to add wireless communications to the bins. This is a critical part of BigBelly’s broader significance: when the IoT first started to creep into corporate consciousness, designers naturally thought about smart versions of high-value products such as cars — but lowly trash cans? That deserves real praise, because BigBelly not only fundamentally re-examined the product as it existed, but also realized that an IoT-based version that could communicate real-time data would become much more versatile and much more valuable.

Here’s what has resulted so far (and I suspect that, as the BigBellys are more widely deployed and city administrators and others become aware of their increased functionality, other features will be added; I see them as “Smart City Hubs!”):

  • Instead of traditional pickup routes and schedules that were probably based on sheer proximity (or, as BigBelly puts it a little more colorfully, “muscle memory and gut instincts”), the company now offers a real-time way to monitor actual waste generation through the “CLEAN Management Console,” which lets DPW personnel monitor and evaluate bins’ fullness, trends, and historical analysis for perspective (the console can even produce a heatmap of trash generation in, say, Lower Manhattan from the BigBellys’ real-time data). Collections can now be dynamic and driven by current needs, not historical patterns.

  • For those cities that opt for it, the company offers a Managed Services option where it does the analysis and management of the devices — not unlike the way jet turbine manufacturers now offer their customers value-added data that allows them to optimize performance — and generates new revenue streams for the manufacturers.
  • You may remember that I blogged a while ago about the “Collective Blindness” analogy: that, until the IoT, we humans simply couldn’t visualize much about the inner workings of the material world, so we were forced to do klugy work-arounds.  That’s not, strictly speaking, the case here, since trash in a conventional can is obviously visible, but the actual volume of trash was certainly invisible to those at headquarters. Now they can see — and really manage it.
  • They can dramatically increase recycling programs’ participation rates and efficiency. As BigBelly says, the system provides “intelligent infrastructure to support ongoing operations and free up staffing and resources to support new and expanded recycling programs. Monitoring each separate stream volumes, days to fullness, and other activities in CLEAN enables you to make changes where needed to create a more effective public recycling program. Leverage the stations’ valuable sidewalk real estate to add messaging of encouraging words to change your users’ recycling behaviors.”

    Philadelphia is perhaps the best example of how effective the system can be. The city bought 210 of the recycling containers in 2009. On average, each collected 225 pounds of recyclables monthly, resulting in 23.5 tons of material diverted from landfills. Philly gets $50 per ton from the recycling and avoids $63 per ton in landfill tipping fees, for a total benefit to the city of $113 per ton, or $2599 per month.
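The “dynamic, needs-driven collection” idea above can be sketched in a few lines. This is purely illustrative: the field names and the 80%-full threshold are my assumptions, not BigBelly’s actual CLEAN console logic.

```python
def bins_needing_pickup(bins, fullness_threshold=0.8):
    """Return the IDs of bins to collect today, fullest first,
    instead of driving a fixed 'muscle memory' route."""
    due = [b for b in bins if b["fullness"] >= fullness_threshold]
    return [b["id"] for b in sorted(due, key=lambda b: -b["fullness"])]

bins = [
    {"id": "BB-101", "fullness": 0.95},
    {"id": "BB-102", "fullness": 0.40},  # skipped: saves a truck stop
    {"id": "BB-103", "fullness": 0.82},
]
print(bins_needing_pickup(bins))  # ['BB-101', 'BB-103']
```

The half-empty bin simply drops off the route, which is where the fuel and labor savings come from.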

Here’s where it really gets neat, in my estimation.

Because the BigBellys are connected in real time, the devices can serve a number of real-time communication functions as well (enabled by an open API and an emphasis by BigBelly on finding collaborative uses). That includes making them hubs for a “mesh network” municipal wi-fi system (which, by the way, means that your local trash container/communications hub could actually save your life in a disaster or terror attack, when stationary networks may be disrupted, as I explained years ago in this YouTube video).

The list of benefits goes on (BigBelly lists all of them, right down to “Happy Cities,” on its web site). Trust me: if my premise is right that we can’t predict all of the benefits of the IoT at this point because we simply aren’t accustomed to thinking expansively about all the ways connected devices can be used, there will be more!

So here’s my take-away from the BigBelly:

If something as humble and ubiquitous as a municipal trash can can be transformed into a waste-reduction, recycling-collection, and municipal-communications hub, then to exploit the Internet of Things’ full potential we need to take a new, creative look at every material thing we interact with, no longer making assumptions about its limited role, and instead looking at it creatively as part of an interconnected network whose utility grows the more things (and people!) it’s connected with!

Let me know your ideas on how to capitalize on this new world of possibilities!

GE & IBM make it official: IoT is here & now & you ignore it at your own risk!

Pardon my absence while doing the annual IRS dance.

While I was preoccupied, GE and IBM put the last nail in the coffin of those who are waiting to launch IoT initiatives and revise their strategies until the Internet of Things is more … (supply your favorite dismissive, wishy-washy adjective here).

It’s official: the IoT is here, substantive, and profitable.

Deal with it.

To wit:

The two blue-chips’ moves were decisive and unambiguous. If you aren’t following suit, you’re in trouble.

The companies accompanied these bold strategic moves with targeted ones that illustrate how they plan to transform their companies and services based on the IoT and related technologies such as 3-D printing and Big Data:

  • GE, which has become a leader in 3-D printing, announced its first FAA-approved 3-D jet engine part, housing a jet’s compressor inlet temperature sensor. Sensors and 3-D printing: a killer combination.
  • IBM, commercializing its gee-whiz Watson big-data processing system, launched Watson Health in conjunction with Apple and Johnson & Johnson, calling it “our moonshot” in health care, hoping to transform the industry. Chair Ginni Rometty says the Watson Health Cloud platform will “enable secure access to individualized insights and a more complete picture of the many factors that can affect people’s health.” IBM says each person generates one million gigabytes of health-related data across his or her lifetime, the equivalent of more than 300 million books.

There can no longer be any doubt that the Internet of Things is a here-and-now reality. What is your company doing to catch up to the leaders and share in the benefits?


Deloitte’s IoT “Information Value Loop”: critical attitudinal shift

Every so often it’s good to step back from the day-to-day minutiae of current Internet of Things projects and get some perspective on the long-term prospects and challenges.

That’s what Deloitte did last December, when it held an “Internet of Things Grand Challenge Workshop,” with a focus on the all-important “forging the path to revenue generation.”

The attendees included two of my idols: John Seely Brown and John Hagel, of Deloitte’s “Center for the Edge” (love the pun in that title!).

The results were recently released, and bear close examination, especially the concept of how to foster what they call the “Information Value Loop”:

Deloitte IoT Information Value Loop


“The underlying asset that the IoT creates and exploits is information, yet we lack a well-developed, practical guide to understand how information creates value and how companies can effectively capture value. The ‘Information Value Loop’ describes how information creates value, how to increase that value, and how understanding the relevant technology is central to positioning an organization to capture value. The Information Value Loop is one way to begin making sense of the changes we face. The Loop consists of three interconnected elements: stages, value drivers, and technologies. Where the stages and value drivers are general principles defining if and how information creates value under any circumstances, it is the specifics of today’s technology that connect the Loop to the challenges and opportunities created by the IoT.”

This fits nicely with one of my IoT Essential Truths: that we need to turn linear information flows into cyclical ones to fully capitalize on the IoT. No pussy-footin’ about this for these guys: “For information to create any value at all, it must pass through all the stages of the Loop. This is a binary outcome: should the flow of information be blocked completely at any stage, no value is created by that information.”
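That “binary outcome” claim is simple enough to express as a toy model. The stage names below follow the report’s create/communicate/aggregate/analyze/act framing; the pass/fail flags are my own illustration.

```python
# Conceptual sketch of Deloitte's point: information creates value
# only if it traverses EVERY stage of the loop; a single blocked
# stage yields zero value.

STAGES = ["create", "communicate", "aggregate", "analyze", "act"]

def value_created(stage_open):
    """stage_open maps each stage to True (flows) or False (blocked).
    Value is created only when all stages are open."""
    return all(stage_open.get(s, False) for s in STAGES)

flow = {s: True for s in STAGES}
print(value_created(flow))   # True: the loop is complete

flow["analyze"] = False      # one information silo blocks the loop
print(value_created(flow))   # False: no value is created at all
```

It’s the `all()` that captures the essential management lesson: partial sharing, like a silo that collects data but never analyzes or acts on it, is worth nothing.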

IMHO, this is also going to be one of the biggest management challenges of the IoT: in the days when it was sooo difficult to gather and disseminate information, it made sense for those in the C-suite to control it, parceling out what they felt was relevant, to whom and when they felt it was relevant. More often than not, the flow was linear and hierarchical, with one information silo in the company handing the results on to the next after processing it. That ruled out one of the critical advantages the IoT brings: allowing everyone who needs it to share real-time data instantly. But saying we need to change those information-management practices is one thing; actually getting senior management to give up their gatekeeper functions is another, and that challenge shouldn’t be underestimated.

So here are some of the other key points in the conference proceedings:

  • In line with the multi-step strategy I outlined in Managing the Internet of Things Revolution, they concluded that incremental improvements to existing processes and products are important, but will only take you so far, at which point radical innovation will be crucial: “At first blush, the early IoT emphasis on sustaining innovation seems reasonable. Performance and cost improvement are seldom absent from the priorities of stakeholders; they are relatively easy to measure and their impact is likely more immediate than any investment that is truly disruptive. Put simply, the business case for an IoT application that focuses on operational efficiencies is relatively easy to make. Many decision makers are hard-wired to prefer the path of less resistance and, for many, truly innovative IoT applications seem too far-flung and abstract to risk pursuing. Still, organizations cannot innovate from the cost side forever.”
  • Melding the public and private: “Cities have inherent societal challenges in place to serve as natural incubators of IoT solutions.” Yeah!
  • As in everything else, those contrarian Millennials (who aren’t so hung up on buying stuff and often prefer to just use it)  are likely to save us when it comes to the IoT:  “From an innovation perspective … some of the new technologies are first marketed at the consumers. Thus, many believe that near-term innovation in IoT applications will come out of the consumer sector – spurred by the emergence of the tech-savvy Millennial consumers as a driving economic force.”
  • As I’ve written before, while some customers will still prefer to buy products outright, the IoT will probably bring a shift from selling products to marketing services based on those products, creating new revenue streams and long-term relationships with customers: “As IoT makes successful forays into the world of consumer and industrial products, it may radically change the producer—buyer transactional model from one based on capital expenditure to one based on operating expenditure. Specifically, in a widely adopted IoT world, buyers may be more apt to purchase product service outcomes on some kind of “per unit” basis, rather than the product itself and in so doing, render the physical product as something more of an afterthought. The manufacturer would then gradually transform into a service provider, operating on a complete awareness of each product’s need for replenishment, repair, replacement, etc.”

    Or, a hybrid model may emerge: “What may ultimately happen in a relatively connected product world is that many may accept the notion of the smartly connected product, but in a limited way. Such people will want to own the smartly connected product outright, but will also accept the idea of sharing the usage data to the limited extent that the sellers use such data in relatively benign ways, such as providing advice on more efficient usage, etc. The outcome here will also rely upon a long term total cost of ownership (TCO) perspective. With any fundamental purchasing model changes (as is taking place in owned vs. cloud resources in the network / IT world), not all suppliers will be able to reap additional economic benefit under the service model. Buyers will eventually recognize the increase in TCO and revert back to the more economical business model if the economic rents are too high.”

  • It’s likely that those players in the IoT ecosystem who create value-added data interpretation will be the most valuable and profitable: “…are certain building blocks of the IoT network “more equal” than others?

    “Some have argued that the holy grail of the IoT value loop resides in the data and that those in the IoT ecosystem who aggregate and transform massive amounts of raw data into commercially useful intelligence capture the real value in the IoT environment. This notion holds that commercially useful data provide insights that drive action and ultimately represent the reason that the end user pursues a smart solution in the first place. Put another way, the end customer is more apt to pay for a more comprehensive treatment of raw data than for a better sensor. Indeed, some even believe that as time passes, the gap in relative value captured by those who curate and analyze the data and the rest of the IoT ecosystem will only widen and that, on a long-term basis, players within the “non-data” part of the IoT ecosystem will need to develop some data analytics capabilities simply to differentiate themselves as something more than commodity providers. Of course, some think that the emphasis on data is overblown and argue that where the real value in the IoT ecosystem is captured depends on application. Time will tell of course. But there can be little doubt that the collection and enhancement of data is highly coveted, and analytics and the ability to make use of the vast quantities of information that is captured will serve as critical elements to virtually any IoT solution.”

I urge you to download and closely analyze the entire report. It’s one of the most thoughtful and visionary pieces of IoT theory I’ve seen (no doubt because of its roundtable origins: in keeping with the above-mentioned need for cyclical information flow for the IoT [and, IMHO, creativity in general], the more insights you can bring together on a real-time basis, the richer the outcome). Bravo!


The Internet of Things’ Essential Truths

I’ve been writing about what I call the Internet of Things’ “Essential Truths” for three years now, and decided the time was long overdue to codify them and present them in a single post to make them easy to refer to.

As I’ve said, the IoT really will bring about a total paradigm shift because, for the first time, it will be possible for everyone who needs it to share real-time information instantly. That really does change everything, obliterating the “Collective Blindness” that has hampered both daily operations and long-term strategy in the past. As a result, we must rethink a wide range of management shibboleths (OK, OK, that was gratuitous, but I’ve always wanted to use the word, and it seemed relevant here, LOL):

  1. First, we must share data. Tesla leads the way with its patent sharing. In the past, proprietary knowledge led to wealth: your win was my loss. Now, we must automatically ask “who else can use this information?” and, even in the case of competitors, “can we mutually profit from sharing this information?” Closed systems and proprietary standards are the biggest obstacle to the IoT.
  2. Second, we must use the Internet of Things to empower workers. With the IoT, it is technically possible for everyone who could do their job better because of access to real-time information to share it instantly, so management must begin with a new premise: information should be shared with the entire workforce. Limiting access must be justified.
  3. Third, we must close the loop. We must redesign our data management processes to capitalize on new information, creating continuous feedback loops.
  4. Fourth, we must rethink products’ roles. Rolls-Royce jet engines feed back a constant stream of real-time data on their operations. Real-time field data lets companies have a sustained dialogue with products and their customers, increasingly allowing them to market products as services, with benefits including new revenue streams.
  5. Fifth, we must develop new skills to listen to products and understand their signals. IBM scientists and medical experts jointly analyzed data from sick preemies’ bassinets and realized they could diagnose infections a day before there was any visible sign. It’s not enough to have vast data streams: we need to understand them.
  6. Sixth, we must democratize innovation. The wildly popular IFTTT web site allows anyone to create new “recipes” to exploit unforeseen aspects of IoT products – and doesn’t require any tech skills to use. By sharing IoT data, we empower everyone who has access to develop new ways to capitalize on that data, speeding the IoT’s development.
  7. Seventh, and perhaps most important, we must take privacy and security seriously. What responsible parent would put an IoT baby monitor in their baby’s room after the highly-publicized incident when a hacker exploited the manufacturer’s disregard for privacy and spewed a string of obscenities at the baby? Unless everyone in the field takes privacy and security seriously, the public may lose faith in the IoT.

There you have ’em: my best analysis of how the Internet of Things will require a revolution not just in technology, but also management strategy and practices. What do you think?

Remember: The IoT Is Primarily About Small Data, Not Big

Posted on 16th March 2015 in data, Internet of Things, M2M, management, manufacturing, open data

In one of my fav examples of how the IoT can actually save lives, sensors on only eight preemies’ incubators at Toronto’s Hospital for Sick Children yield an eye-popping 90 million data points a day! If all 90 million data points were relayed on to the “data pool,” the docs would be drowning in data, not saving sick preemies.

Enter “small data.”

Writing in Forbes, Mike Kavis has a worthwhile reminder that the essence of much of the Internet of Things isn’t big data, but small. By that, he means:

“a dataset that contains very specific attributes. Small data is used to determine current states and conditions, or may be generated by analyzing larger data sets.

“When we talk about smart devices being deployed on wind turbines, small packages, on valves and pipes, or attached to drones, we are talking about collecting small datasets. Small data tell us about location, temperature, wetness, pressure, vibration, or even whether an item has been opened or not. Sensors give us small datasets in real time that we ingest into big data sets which provide a historical view.”

Usually, instead of aggregating ALL of the data from all of the sensors (think about what that would mean for GE’s Durathon battery plant, where 10,000 sensors dot the assembly line!), the data is first analyzed at “the edge,” i.e., at or near the point where it is collected. Then only the data that deviates from the norm (i.e., is significant) is passed on to the centralized databases and processing. That’s why I’m so excited about Egburt and its “fog computing” sensors.
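A toy version of that edge (“fog”) filtering idea: analyze readings where they are collected and forward only significant deviations to the central store. The expected value and tolerance here are illustrative, not drawn from any real deployment.

```python
def edge_filter(readings, expected=70.0, tolerance=5.0):
    """Keep only readings that deviate meaningfully from the norm;
    the rest never leave the edge device."""
    return [r for r in readings if abs(r - expected) > tolerance]

# e.g. temperature samples from a single sensor
raw = [70.1, 69.8, 70.3, 82.6, 70.0, 55.2]
print(edge_filter(raw))  # [82.6, 55.2] are forwarded; the rest stay local
```

Six samples in, two out: that reduction, multiplied across thousands of sensors, is what keeps the central “data pool” from drowning the analysts.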

As with sooo many aspects of the IoT, it’s the real-time aspect of small data that makes it so valuable, and so different from past practices, in which much of the potentially useful data was never collected at all or, if it was, was only collected, analyzed and acted upon historically. Hence the “Collective Blindness” that I’ve written about before, which limited our decision-making abilities in the past. Again, Kavis:

“Small data can trigger events based on what is happening now. Those events can be merged with behavioral or trending information derived from machine learning algorithms run against big data datasets.”

As examples of the interplay of small and large data, he cites:

  • real-time data from wind turbines that is used immediately to adjust the blades for maximum efficiency. The relevant data is then passed along to the data lake, “..where machine-learning algorithms begin to understand patterns. These patterns can reveal performance of certain mechanisms based on their historical maintenance record, like how wind and weather conditions effect wear and tear on various components, and what the life expectancy is of a particular part.”
  • medicine containers with smart labels. “Small data can be used to determine where the medicine is located, its remaining shelf life, if the seal of the bottle has been broken, and the current temperature conditions in an effort to prevent spoilage. Big data can be used to look at this information over time to examine root cause analysis of why drugs are expiring or spoiling. Is it due to a certain shipping company or a certain retailer? Are there re-occurring patterns that can point to problems in the supply chain that can help determine how to minimize these events?”

Big data is often irrelevant in IoT systems’ functioning: all that’s needed is the real-time small data to trigger an action:

“In many instances, knowing the current state of a handful of attributes is all that is required to trigger a desired event. Are the patient’s blood sugar levels too high? Are the containers in the refrigerated truck at the optimal temperature? Does the soil have the right mixture of nutrients? Is the valve leaking?”
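Kavis’s point fits in a few lines of code: triggering an event requires only the current values of a handful of attributes, no historical big data at all. The rules, attribute names, and thresholds below are my own illustration of his examples.

```python
# Each rule: (attribute, predicate on its current value, action to fire).
RULES = [
    ("blood_sugar", lambda v: v > 180, "alert care team"),
    ("truck_temp_c", lambda v: not (2.0 <= v <= 8.0), "adjust refrigeration"),
    ("valve_flow_delta", lambda v: abs(v) > 0.1, "dispatch inspection"),
]

def triggered_events(state):
    """Check the current state (small data only) against each rule
    and return the actions that should fire right now."""
    return [action for attr, check, action in RULES
            if attr in state and check(state[attr])]

now = {"blood_sugar": 192, "truck_temp_c": 5.0, "valve_flow_delta": 0.02}
print(triggered_events(now))  # ['alert care team']
```

Note that nothing here consults a historical dataset; as the post argues, big data enters later, when machine-learning models refine what these thresholds should be.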

In a future post, I’ll address the growing role of data scientists in the IoT — and the need to educate workers on all levels on how to deal effectively with data. For now, just remember that E.F. Schumacher was right: “small is beautiful.”


Apple ResearchKit will launch medical research paradigm shift to crowd-sourcing

Amidst the hoopla about the new MacBook and much-anticipated Apple Watch, Apple snuck something into Monday’s event that blew me away (obligatory disclaimer: I work part-time at The Apple Store, but the opinions expressed here are mine).

My Heart Counts app

Four years after I proselytized about the virtues of democratizing data in my book Data Dynamite: How Liberating Data Will Transform Our World (BTW: pardon the hubris, but I still think it’s the best thing out there about the attitudinal shift needed to capitalize on sharing data), I was so excited to learn about the new ResearchKit.

Tag line? “Now everybody can do their part to advance medical research.”

The other new announcements might improve your quality of life. This one might save it!

As Senior VP of Operations Jeff Williams said in announcing the kit, the process of medical research “hasn’t changed in decades.” That’s not really true: as I wrote in my book, the Quantified Self movement has been sharing data for several years, as have groups such as CureTogether and PatientsLikeMe. However, what is definitely true is that no one has harnessed the incredible power of the smartphone for this common goal until now, and that’s really incredible. It’s a great example of my IoT Essential Truth of asking “who else could use this data?”

A range of factors cast a pall over traditional medical research.

Researchers have had to cast a broad net even to get 50-100 volunteers for a clinical trial (and may have to pay them, to boot, placing the results’ validity in doubt when applied to the general population). The data has often been subjective (in the example Williams mentioned, Parkinson’s patients are classified by a doctor simply on the basis of walking a few feet). Also, communication about the project has been almost exclusively one-way, from the researcher to the patient, and limited at best.

What if, instead, you just had to turn on your phone and open a simple app to participate? As the website says, “Each one [smartphone] is equipped with powerful processors and advanced sensors that can track movement, take measurements, and record information — functions that are perfect for medical studies.” Suddenly research can be worldwide and involve millions of diverse participants, increasing the data’s amount and validity. (There’s a crowdsourcing research precedent: a lot of us have been participating in scientific crowdsourcing for almost 20 years by installing the SETI@Home software that runs in the background on our computers, analyzing data from deep space to see if ET is trying to check in!)

Polymath/medical data guru John Halamka, MD wrote me that:

“Enabling patients to donate data for clinical research will accelerate the ‘learning healthcare system’ envisioned by the Institute of Medicine.   I look forward to testing out Research Kit myself!”

The new apps developed using ResearchKit harvest information from the Health app that Apple introduced as part of iOS8. According to Apple:

“When granted permission by the user, apps can access data from the Health app such as weight, blood pressure, glucose levels and asthma inhaler use, which are measured by third-party devices and apps…. ResearchKit can also request from a user, access to the accelerometer, microphone, gyroscope and GPS sensors in iPhone to gain insight into a patient’s gait, motor impairment, fitness, speech and memory.”

Apple announced that it has already collaborated with some of the world’s most prestigious medical institutions, including Mass General, Dana-Farber, Stanford Medicine, Cornell, and many others, to develop apps using ResearchKit. The first five apps target asthma, breast cancer, cardiovascular disease, diabetes, and Parkinson’s disease. My favorite, because it addresses the condition affecting the largest number of people, is My Heart Counts. It uses the iPhone’s built-in motion sensors to track participants’ activity, collecting data during a 6-minute walk test from those who are able to walk that long. If participants also have a wearable activity device that connects with the Health app (aside: I still don’t know why my Jawbone UP data doesn’t flow to the Health app, even though I made the link), they are encouraged to use that as well. Participants will also enter data about their heart disease risk factors and their lab test readings to get feedback on their chances of developing heart disease and their “heart age.” Imagine the treasure trove of cardiac data it will yield!
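The 6-minute walk test boils down to simple arithmetic once the phone’s motion sensors have done their work. Here’s a minimal sketch of the kind of derivation such an app might perform; the rule-of-thumb stride-length constant and the function itself are my own illustration, not Apple’s or the study’s actual algorithm:

```python
# Hypothetical sketch: estimating distance covered in a 6-minute walk test
# from a step count. NOT the actual My Heart Counts algorithm; the
# 0.415-times-height stride-length heuristic is a common rule of thumb.

def walk_test_distance(steps: int, height_cm: float) -> float:
    """Estimate meters walked from a step count and the walker's height."""
    stride_m = 0.415 * height_cm / 100.0  # approximate stride length in meters
    return steps * stride_m

# e.g., a 170 cm participant who takes 700 steps in 6 minutes
distance_m = walk_test_distance(700, 170)  # roughly 494 meters
```

The interesting part, of course, is not the arithmetic but that the phone collects the raw step data passively, for millions of participants at once.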

A critical aspect of why I think ResearchKit will have a significant impact is that Apple decided to make it open source, so that anyone can tinker with the code and improve it (aside: has Apple EVER made ANYTHING open source? Doubt it! That alone is noteworthy). Also, it’s important to note, in light of the extreme sensitivity of any personal health data, that Apple guarantees it will not have access to any of the personal data.

Because of my preoccupation with “Smart Aging,” I’m really interested in whether any researchers will specifically target seniors with ResearchKit apps. I’ll be watching carefully when the Apple Watch comes out April 24th to see whether seniors buy it, since a watch is a familiar form factor for them (I haven’t worn one since I got my first cell phone, and most young people I know have never owned one), and whether they’ll be willing to use it to participate in these projects. I’m not terribly optimistic, I must admit, because of both the cost and the large number of seniors I help at The Apple Store who are befuddled by even Apple’s user-friendly technology.

Now, if you’ll excuse me, I just downloaded the My Heart Counts app, and must find out my “heart age!”


Doh!  Just after I posted this, I saw a really important post on Ars Technica pointing out that this brave new world of medical research won’t go anywhere unless the FDA approves:

“As much as Silicon Valley likes to think of itself as a force for good, disrupting this and pivoting that, it sometimes forgets that there’s a wider world out there. And when it comes to using devices in the practice of medicine, that world contains three very important letters: FDA. That’s right, the US Food and Drug Administration, which Congress has empowered to regulate the marketing and research uses of medical devices.

“Oddly, not once in any of the announcement of ResearchKit did we see mention of premarket approval, 510k submission, or even investigational device exemptions. Which is odd, because several of the uses touted in the announcement aren’t going to be possible without getting the FDA to say yes.”

I remember reading that Apple reached out to the FDA during development of the Apple Watch, so I’m sure none of this comes as a surprise to them, and any medical researcher worth his or her salt is also aware of that factor. However, the FDA is definitely going to have a role in this issue going forward, and that’s as it should be — as I’ve said before, with any aspect of the IoT, privacy and security are Job One.



FTC report provides good checklist to design in IoT security and privacy

FTC report on IoT


FTC Chair Edith Ramirez has been pretty clear that the FTC plans to look closely at the IoT and takes IoT security and privacy seriously: most famously in the commission’s action against IoT marketer TRENDnet over the non-existent security of its nanny cam.

Companies that want to avoid such actions — and avoid undermining fragile public trust in their products and the IoT as a whole — would do well to clip and refer to this checklist, which I’ve prepared from the FTC’s recent report, Privacy and Security in a Connected World. The report was compiled from a workshop the FTC held in 2013 and highlights best practices shared there.

  1. Most important, “companies should build security into their devices at the outset, rather than as an afterthought.” I’ve referred before to the bright young things at the Wearables + Things conference who used their startup status as an excuse for deferring security and privacy until a later date. WRONG: both must be a priority from Day One.

  2. Conduct a privacy or security risk assessment during the design phase.

  3. Minimize the data you collect and retain. This is a tough one, because there’s always a chance that some retained data could be mashed up with other data in the future, yielding a dazzling insight that helps company and customer alike, BUT the more data floating out there in a “data lake,” the greater the chance it will be misused.

  4. Test your security measures before launching your products. … then test them again…

  5. “… train all employees about good security, and ensure that security issues are addressed at the appropriate level of responsibility within the organization.” This one is sooo important and so often overlooked: how many times have we found that someone far down the corporate ladder was at fault in a data breach because s/he wasn’t adequately trained and/or empowered? Privacy and security are everyone’s job.

  6. “… retain service providers that are capable of maintaining reasonable security and provide reasonable oversight for these service providers.”

  7. “… when companies identify significant risks within their systems, they should implement a defense-in-depth approach, in which they consider implementing security measures at several levels.”

  8. “… consider implementing reasonable access control measures to limit the ability of an unauthorized person to access a consumer’s device, data, or even the consumer’s network.” Don’t forget: with the Target data breach, the bad guys got access to the corporate data through a local HVAC dealer. Everything’s linked — for better or worse!

  9. “… companies should continue to monitor products throughout the life cycle and, to the extent feasible, patch known vulnerabilities.” Privacy and security are moving targets, and require constant vigilance.

  10. Avoid enabling unauthorized access and misuse of personal information.

  11. Don’t facilitate attacks on other systems. The very strength of the IoT in creating linkages and synergies between various data sources can also allow backdoor attacks if one source has poor security.

  12. Don’t create risks to personal safety. If you doubt that’s an issue, look at Ed Markey’s recent report on connected car safety.

  13. Avoid creating a situation where companies might use this data to make credit, insurance, and employment decisions. That’s the downside of cool tools like Progressive’s “Snapshot,” which can save us safe drivers money on premiums: reporting the same data on your actual driving behavior might someday become compulsory, and the data might be used to deny you coverage or increase your premium.

  14. Realize that the FTC’s Fair Information Practice Principles will be extended to the IoT. These “FIPPs,” including “notice, choice, access, accuracy, data minimization, security, and accountability,” have been around for a long time, so it’s understandable that the FTC will apply them to the IoT. The most important ones? Security, data minimization, notice, and choice.

Not all of these issues will apply to all companies, but it’s better to keep all of them in mind, because your situation may change. I hope you’ll share these guidelines with your entire workforce: they’re all part of the solution — or the problem.
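Item 3 on the checklist, data minimization, is easy to sketch in code: decide up front exactly which fields the product needs, and drop everything else at the moment of collection. A minimal illustration (the field names and the allowlist are hypothetical, not from any particular product):

```python
# Illustrative sketch of data minimization (checklist item 3): only an
# explicit allowlist of fields survives ingestion. Field names are
# hypothetical, not drawn from any real device's schema.

ALLOWED_FIELDS = {"device_id", "temperature", "timestamp"}

def minimize(reading: dict) -> dict:
    """Return only the fields the product actually needs to retain."""
    return {k: v for k, v in reading.items() if k in ALLOWED_FIELDS}

raw = {
    "device_id": "thermostat-17",
    "temperature": 21.5,
    "timestamp": "2015-01-27T10:00:00Z",
    "owner_email": "resident@example.com",  # never needed, so never stored
}
stored = minimize(raw)  # owner_email does not reach the "data lake"
```

The design point is that the allowlist is explicit and reviewable: adding a new field to retention becomes a deliberate decision rather than a side effect of collecting everything.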

Real-time data sharing critical to “Smart Aging” and collaborative health care

Posted on 25th February 2015 in health, Internet of Things, open data, SmartAging

It’s hard to describe to someone who hasn’t encountered the phenomenon first hand, but there’s something really exciting (and perhaps transformative) when data is shared rather than hoarded. When data becomes the focus of discussions, different perspectives reveal different aspects of the data that even the brightest person couldn’t discover working in isolation.

That transformative aspect is very exciting when it involves health care.

I’ve written before about the life-saving discoveries made when doctors and data scientists from Toronto’s Hospital for Sick Children and IBM collaboratively analyzed data from newborns in the NICU and discovered early signs of infections, allowing them to begin treatment a day before there was any outward manifestation of the infection. Now the always-informative SAP Innovation blog (I don’t just say that because they’re kind enough to reprint many of my posts: I find it an eclectic and consistently informative source on all things dealing with innovation!) has an interesting piece about how Dartmouth-Hitchcock is sharing real-time data with patients considering knee-replacement surgery.

In some cases that data leads patients to decide — sigh of relief — that their condition doesn’t warrant surgery at this point; in others, it confirms the need. In both cases there’s a subtle but important shift in the doctor-patient relationship that’s at the heart of my proposed “Smart Aging” paradigm shift: away from the omnipotent doctor telling the patient what’s needed, and toward empowering the patient to be an active partner in his or her care.

The key is using the data to predict outcomes:

“‘Prior to anyone ever getting surgery, we want to try to predict how they’re going to do,’ Dartmouth-Hitchcock orthopedic surgeon Michael Sparks said in an SAP video. ‘But we’ve never had that missing tool, which is real-time data.’

“D-H recently began using real-time data analytics and predictive technologies to help people suffering from chronic knee pain to choose wisely and improve their outcomes. ‘It is actually a partnership to help people get through this,’ Sparks said. ‘And it’s the analysis of data that adds to their ability to make a decision.’”

For the first time, the patient’s choice really becomes informed consent.


IFTTT DO apps: neat extension of my fav #IoT crowdsourcing tool!

Have I told you lately how much I love IFTTT? Of course!  As I’ve said, I think they are a phenomenal example of my IoT “Essential Truth” question: who else can use this data?

Now they’ve come up with three new apps, the “DO Button,” “DO Camera,” and “DO Note,” that make this great tool even more versatile!

With a DO “recipe,” you simply tap on the appropriate app, and the “recipe” runs. Presto! Change-o!

As a consultant who must bill for his time, I particularly like the recipe that lets you “Track Your Work Hours” in Google Drive, but you’re sure to find your own favorites in categories such as play, work, home, families, and essentials. Some are just fun, and some will increase your productivity or help you manage your household more easily (hmm: not sure where “post a note to your dog’s timeline” fits in; aside to my sons: feel free to “send notes to your dad via email”). If past experience is any indication, there should be many, many more helpful DO recipes as soon as users are familiar with how to create them.
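Under the hood, every DO recipe is just a trigger paired with an action. A toy sketch of that trigger-action pattern (the class names, event shape, and sample recipe are my own illustration, not IFTTT’s actual API):

```python
# Toy sketch of the "if this, then that" recipe pattern IFTTT popularized.
# Everything here is hypothetical illustration, not IFTTT's real interface.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Recipe:
    name: str
    trigger: Callable[[Dict], bool]  # "if this..."
    action: Callable[[Dict], None]   # "...then that"

def fire(recipes: List[Recipe], event: Dict) -> List[str]:
    """Run every recipe whose trigger matches the event; return their names."""
    ran = []
    for recipe in recipes:
        if recipe.trigger(event):
            recipe.action(event)
            ran.append(recipe.name)
    return ran

work_log: List[str] = []
recipes = [
    Recipe(
        name="Track Your Work Hours",
        trigger=lambda e: e.get("type") == "do_button_tap",
        action=lambda e: work_log.append(f"logged {e['timestamp']}"),
    ),
]
ran = fire(recipes, {"type": "do_button_tap", "timestamp": "09:00"})
```

The power of the model is that users, not the device makers, write the recipes: any trigger can be paired with any action, which is exactly the crowdsourcing leverage discussed below.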

As I’ve said before, it’s no reflection on the talented engineers at Hue, Nest, et al., but there’s simply no way they could possibly visualize all the ways their devices could be used and/or combined with others. That’s why IFTTT, by adding a crowdsourcing component and democratizing data, is so important to speeding the IoT’s deployment.