Previewing “The Future Is Smart”: 1) Collective Blindness and the IoT

This is the first of an occasional series of posts preceding the August 1st publication of The Future Is Smart. The book will introduce the Internet of Things to business audiences and help them create affordable, profitable strategies to revolutionize their products, services, and even their very way of doing business through the IoT.  Each post will excerpt part of the book, giving you enough detail to be informative, but not — LOL — complete enough that you’ll be able to skip buying the book itself!

The critical point the book makes about revising your products and services to capitalize on the IoT is that it’s not enough to simply install sensors and beef up your data analysis: equally important are fundamental attitudinal shifts to break free from the limits of past technology and realize the IoT’s full capability.

A critical component is what I call “Collective Blindness,” my term for how limited our understanding of how products actually ran was in the era when we had almost no data about their operations — let alone real-time data that we (or other machines, through M2M controls) could act on instantly to create feedback loops, improve operating precision, and facilitate upgrades.

Let me know what you think (after a horrific hack, I’ve decided to scrap comments on the blog — if I think it’s merited, I’ll feature your feedback in future posts)!


Technologist Jeffrey Conklin has written of “wicked problems” that are so complex they aren’t even known or detailed until solutions to them are found.

What if there had been a wicked problem, a universal human malady that we’ll call “Collective Blindness,” whose symptoms were that we humans simply could not see much of what was happening in the material world? We could only see the surface of these things, while their interiors and actual operations were impenetrable to us. For millennia we just came up with coping mechanisms to work around the problem of not being able to peer inside things, which we accepted as reality.

Collective Blindness was a stupendous obstacle to full realization of a whole range of human and business activities. But, of course, we couldn’t quantify the problem’s impact because we weren’t even aware that it existed.

In fact, Collective Blindness has been our condition: vast areas of daily life have simply been unknowable, and we accepted those limits as facts of life.

For example, in a business context:

  • We couldn’t tell when a key piece of machinery was going to fail due to metal fatigue.
  • We couldn’t tell how efficiently an assembly line was operating, or how to fully optimize its performance by having changes in one machine trigger adjustments in the next one.
  • We couldn’t tell whether or when a delivery truck would be stuck in traffic, or for how long.
  • We couldn’t tell exactly when we’d need a parts resupply shipment from a supplier. (Let’s be honest: What we’ve called “just-in-time” in the past was hopelessly inexact compared to what we’ll be able to do in the future.) Nor would the supplier know exactly when to do a new production run in order to be ready.
  • We couldn’t tell how customers actually used our products once they were in the field, or help those customers adjust operations to make them more efficient.

That’s all changing now.

The wicked problem of Collective Blindness is ending, because the Internet of Things solves it, giving us real-time information about what’s happening inside things.

The Internet of Things will affect and improve every aspect of business, because it will allow us to eliminate all of those blind spots resulting from Collective Blindness, achieve efficiency, and derive insights that were impossible before.

Cisco, which focuses not only on the IoT’s enabling technologies but also on the management issues it will address, understands the Collective Blindness concept. It refers to previously opaque and unconnected things as “dark assets,” and says that, “The challenge is to know which dark assets (unconnected things) to light up (connect) and then capture, analyze, and use the data generated to improve efficiency while working smarter.”

Vuforia “sees” inside Caterpillar device

PTC has created the most literal cure for Collective Blindness: Vuforia, an AR system that lets an operator or repair person wearing an AR headset or using a tablet go from looking at the exterior of a Caterpillar front-end loader to “seeing” an exploded view of the system that shows each part and how the parts connect, while also monitoring each component’s real-time performance data, gathered by sensors on the machinery. That insight can also be shared, in real time, with others who need it.


You may quibble with my choice of the “Collective Blindness” metaphor for the obstacles we, and businesses in general, faced before the IoT, but I do think we need some sweeping description of exactly how limited we used to be, because our acceptance of those limits, and our inability to “see” how things really worked, restricted our ability to fine-tune products and their operation — and even now may keep us from re-examining everything now that we have gained this ability. Let me know your thoughts — and I hope you’ll stay tuned for more excerpts from The Future Is Smart in coming months.

 

OtoSense: the next level in sound-based IoT

It sounds (pardon the pun) as if the IoT may really be taking off as an important diagnostic repair tool.

I wrote a while ago about the Auguscope, which represents a great way to begin an incremental approach to the IoT because it’s a hand-held device to monitor equipment’s sounds and diagnose possible problems based on abnormalities.

Now NPR reports on a local (Cambridge) firm, OtoSense, that is expanding on this concept on the software end. Its tagline is “First software platform turning real-time machine sounds and vibrations into actionable meaning at the edge.”

Love the platform’s origins: it grows out of founder Sebastien Christian’s research on deafness (as I wrote in my earlier post, I view suddenly being able to interpret things’ sounds as a variation on how the IoT eliminates the “Collective Blindness” I’ve used to describe our inability to monitor things before the IoT’s advent):

“[Christian] … is a quantum physicist and neuroscientist who spent much of his career studying deaf children. He modeled how human hearing works. And then he realized, hey, I could use this model to help other deaf things, like, say, almost all machines.”

(Aside: I see this as another important application of my favorite IoT question: learning to automatically ask “who else can use this data?” How does that apply to YOUR work? But I digress.)

According to Technology Review, the company is concentrating primarily on analyzing car sounds from IoT detectors on the vehicle at this point (working with a number of car manufacturers) although they believe the concept can be applied to a wide range of sound-emitting machinery:

“… OtoSense is working with major automakers on software that could give cars their own sense of hearing to diagnose themselves before any problem gets too expensive. The technology could also help human-driven and automated vehicles stay safe, for example by listening for emergency sirens or sounds indicating road surface quality.

OtoSense has developed machine-learning software that can be trained to identify specific noises, including subtle changes in an engine or a vehicle’s brakes. French automaker PSA Group, owner of brands including Citroen and Peugeot, is testing a version of the software trained using thousands of sounds from its different vehicle models.

Under a project dubbed AudioHound, OtoSense has developed a prototype tablet app that a technician or even car owner could use to record audio for automated diagnosis, says Guillaume Catusseau, who works on vehicle noise in PSA’s R&D department.”

According to NPR, the company is working to apply the same approach to a wide range of other types of machines, from assembly lines to DIY drills. As always with IoT data, handling massive amounts of data will be a challenge, so they will emphasize edge processing.
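The core idea — train on a machine’s “healthy” sounds, then flag recordings whose frequency content deviates sharply — can be sketched in a few lines. This is my own minimal illustration of sound-based anomaly detection, not OtoSense’s actual software; the signals, band count, and threshold are all invented for the example:

```python
import numpy as np

def spectral_signature(samples, bands=8):
    """Average magnitude in a few frequency bands -- a crude acoustic 'fingerprint'."""
    spectrum = np.abs(np.fft.rfft(samples))
    return np.array([chunk.mean() for chunk in np.array_split(spectrum, bands)])

def is_anomalous(samples, baseline, threshold=4.5):
    """Flag a sound whose band energies deviate sharply from the healthy baseline."""
    sig = spectral_signature(samples)
    # Normalized deviation per band vs. the machine's known-good signatures
    deviation = np.abs(sig - baseline.mean(axis=0)) / (baseline.std(axis=0) + 1e-9)
    return bool(deviation.max() > threshold)

# "Train" on recordings of a healthy machine (a 120 Hz hum plus a little noise)...
np.random.seed(0)
rate = 44100
t = np.linspace(0, 1, rate, endpoint=False)
healthy = [np.sin(2*np.pi*120*t) + 0.05*np.random.randn(rate) for _ in range(20)]
baseline = np.array([spectral_signature(h) for h in healthy])

# ...then test a recording with an added high-frequency rattle.
rattle = np.sin(2*np.pi*120*t) + 0.8*np.sin(2*np.pi*5000*t)
print(is_anomalous(healthy[0], baseline))  # False
print(is_anomalous(rattle, baseline))      # True
```

A real system would need far richer features (and the huge labeled datasets discussed below), but the shape of the problem — baseline vs. deviation — is the same.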

OtoSense has a “design factory” on its site, where potential customers answer a variety of questions about the sounds they must monitor (such as whether the software will be used indoors or out, whether it is to detect anomalies, etc.) that allow the company to choose the appropriate version of the program.

TechCrunch did a great article on the concept, which underscores that making sound detection truly precise will take a lot of time and refinement, in part because — guess what — sounds from a variety of sources are often mingled, so the relevant ones must be identified and isolated:

“We have loads of audio data, but lack critical labels. In the case of deep learning models, ‘black box’ problems make it hard to determine why an acoustical anomaly was flagged in the first place. We are still working the kinks out of real-time machine learning at the edge. And sounds often come packaged with more noise than signal, limiting the features that can be extracted from audio data.”

In part, as with other forms of pattern recognition such as voice, this is because it will require accumulating huge data files:

“Behind many of the greatest breakthroughs in machine learning lies a painstakingly assembled dataset: ImageNet for object recognition, and things like the Linguistic Data Consortium and GOOG-411 in the case of speech recognition. But finding an adequate dataset to juxtapose the sound of a car-door shutting and a bedroom-door shutting is quite challenging.

“’Deep learning can do a lot if you build the model correctly, you just need a lot of machine data,’ says Scott Stephenson, CEO of Deepgram, a startup helping companies search through their audio data. ‘Speech recognition 15 years ago wasn’t that great without datasets.’

“Crowdsourced labeling of dogs and cats on Amazon Mechanical Turk is one thing. Collecting 100,000 sounds of ball bearings and labeling the loose ones is something entirely different.

“And while these problems plague even single-purpose acoustical classifiers, the holy grail of the space is a generalizable tool for identifying all sounds, not simply building a model to differentiate the sounds of those doors.

…”A lack of source separation can further complicate matters. This is one that even humans struggle with. If you’ve ever tried to pick out a single table conversation at a loud restaurant, you have an appreciation for how difficult it can be to make sense of overlapping sounds.

Bottom line: there’s still a lot of theoretical and product-specific testing to be done before IoT-based sound detection becomes a reliable diagnostic tool for predictive maintenance, but clearly there’s precedent for the concept, and the potential payoffs are great!

 


LOL: as the NPR story pointed out, this science may owe its origins to two MIT grads of an earlier era, “Click” and “Clack” of Car Talk, who frequently got listeners to contribute their own hilarious descriptions of the sounds they heard from their malfunctioning cars.   BRTTTTphssssBRTTTT…..


Incredible example of rethinking “things” with Internet of Things

Ladies and gentlemen, I give you the epitome of the IoT-enabled product: the trash can!

My reader statistics do not indicate this blog has a heavy readership among trash cans, but let me apologize in advance to them for what I’m about to write: it’s not personal, just factual.

I’m sorry, but you municipal trash cans are pathetic!

Dented. Chipping paint. Trash overflowing. Smelly. Pests (ever seen any of those prize city rats? Big!!!). Sometimes even knocked over. And, worst of all, you are so…. DUMB. You just sit there and don’t do anything.

BigBelly trash compactor and recycling center

But that was then, and this is now.

I have seen the future of trash cans, and, equally important, perhaps the best example I’ve seen of how smart designers and company strategists can — and must — totally rethink products’ design and use because of the Internet of Things!

At last week’s Re-Work Internet of Things Summit there were many exciting new IoT examples (I’ll blog about others in coming weeks), but perhaps the one that got the most people talking was the BigBelly trash compactor & recycling system, high-tech successor to the lowly trash can.

The company’s motto is that they are “transforming waste management practices and contributing to the Smart Cities of tomorrow.” Indeed!

I was first attracted to the BigBelly systems because of my alternative energy and environmental passions: they featured PV-powered trash compactors, which can quintuple the amount a trash container can hold, eliminating overflowing containers and the need to send trucks to empty them as frequently. Because the containers are closed, there are no more ugly banana peels and McDonald’s wrappers assaulting your delicate eyes — or nose! Equally important, each is paired with a recycling container, something almost never seen on city streets, dramatically reducing the amount of recyclables that goes into regular trash simply because no recycling containers are accessible downtown. These features alone would be a noteworthy advance over conventional trash cans.

But BigBelly wasn’t content to just improve the efficiency of trash and recyclable collection: they decided to make the containers smart.

The company worked with Digi to add wireless communications to the bins. This is a critical part of BigBelly’s broader significance: when the IoT first started to creep into corporate consciousness, designers naturally thought about smart versions of high-value products such as cars — but lowly trash cans? BigBelly deserves real praise, because it fundamentally re-examined not only the product as it existed, but also realized that an IoT-based version that could communicate real-time data would become much more versatile and much more valuable.

Here’s what has resulted so far (and I suspect that as the BigBellys are more widely deployed and both city administrators and others become aware of their increased functionality, other features will be added: I see them as “Smart City Hubs!”):

  • Instead of traditional pickup routes and schedules that were probably based on sheer proximity (or, as BigBelly puts it a little more colorfully, “muscle memory and gut instincts”), the bins now offer a real-time way to monitor actual waste generation through the “CLEAN Management Console,” which allows DPW personnel to monitor and evaluate bins’ fullness, trends, and history for perspective. Collections can now be dynamic and driven by current needs, not historical patterns. (One CLEAN dashboard heatmap maps trash generation in Lower Manhattan using the BigBellys’ real-time data.)

  • For those cities that opt for it, the company offers a Managed Services option where it does the analysis and management of the devices — not unlike the way jet turbine manufacturers now offer their customers value-added data that allows them to optimize performance — and generates new revenue streams for the manufacturers.
  • You may remember that I blogged a while ago about the “Collective Blindness” analogy: that, until the IoT, we humans simply couldn’t visualize much about the inner workings of the material world, so we were forced to resort to kludgy work-arounds. That’s not, strictly speaking, the case here, since trash in a conventional can is obviously visible, but the actual volume of trash was certainly invisible to those at headquarters. Now they can see it — and really manage it.
  • They can dramatically increase recycling programs’ participation rate and efficiency. As BigBelly says, the system provides “intelligent infrastructure to support ongoing operations and free up staffing and resources to support new and expanded recycling programs. Monitoring each separate stream volumes, days to fullness, and other activities in CLEAN enables you to make changes where needed to create a more effective public recycling program. Leverage the stations’ valuable sidewalk real estate to add messaging of encouraging words to change your users’ recycling behaviors.”

    Philadelphia is perhaps the best example of how effective the system can be. The city bought 210 of the recycling containers in 2009. On average, each collected 225 pounds of recyclables monthly, resulting in 23.5 tons of material diverted from landfills. Philly gets $50 per ton from the recycling — and avoids $63 per ton in landfill tipping fees — for a total benefit to the city of $113 per ton, or $2599 per month.
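The dynamic, needs-driven collection described above boils down to a simple rule: dispatch trucks only to stations nearing capacity, worst first. Here’s a toy sketch of that logic — the station names, fullness values, and threshold are invented for illustration, not BigBelly’s actual CLEAN API:

```python
# Fraction full, as each station might report it over its wireless link.
bins = {
    "Main St & 1st": 0.92,
    "City Hall":     0.35,
    "Harbor Walk":   0.81,
    "Elm Park":      0.12,
}

COLLECT_AT = 0.75  # only dispatch a truck to stations nearing capacity

# Today's route: the stations over threshold, fullest first.
route = sorted((name for name, full in bins.items() if full >= COLLECT_AT),
               key=lambda name: -bins[name])
print(route)  # ['Main St & 1st', 'Harbor Walk']
```

Half-empty bins simply drop off the route — which is exactly how the system cuts truck rolls compared to fixed schedules.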

Here’s where it really gets neat, in my estimation.

Because the BigBellys are connected in real time, the devices can serve a number of real-time communication functions as well (enabled by an open API and an emphasis by BigBelly on finding collaborative uses). That includes making them hubs for a “mesh network” municipal wi-fi system (which, by the way, means that your local trash container/communications hub could actually save your life in a disaster or terror attack, when stationary networks may be disrupted, as I explained years ago in this YouTube video).

The list of benefits goes on (BigBelly lists all of them, right down to “Happy Cities,” on its web site). Trust me: if my premise is right that we can’t predict all of the benefits of the IoT at this point because we simply aren’t accustomed to thinking expansively about all the ways connected devices can be used, there will be more!

So here’s my take-away from the BigBelly:

If something as humble and ubiquitous as a municipal trash can can be transformed into a waste-reduction, recycling-collection, municipal-communication hub, then to fully exploit the Internet of Things’ potential we need to take a new, creative look at every material thing we interact with — no longer making assumptions about its limited role, but instead seeing it as part of an interconnected network whose utility grows with the number of things (and people!) it’s connected to!

Let me know your ideas on how to capitalize on this new world of possibilities!


Remember: The IoT Is Primarily About Small Data, Not Big

Posted on 16th March 2015 in data, Internet of Things, M2M, management, manufacturing, open data

In one of my favorite examples of how the IoT can actually save lives, sensors on only eight preemies’ incubators at Toronto’s Hospital for Sick Children yield an eye-popping 90 million data points a day! If all 90 million data points were relayed on to the “data pool,” the docs would be drowning in data, not saving sick preemies.

Enter “small data.”

Writing in Forbes, Mike Kavis has a worthwhile reminder that the essence of much of the Internet of Things isn’t big data, but small. By that, he means:

a dataset that contains very specific attributes. Small data is used to determine current states and conditions, or may be generated by analyzing larger data sets.

“When we talk about smart devices being deployed on wind turbines, small packages, on valves and pipes, or attached to drones, we are talking about collecting small datasets. Small data tell us about location, temperature, wetness, pressure, vibration, or even whether an item has been opened or not. Sensors give us small datasets in real time that we ingest into big data sets which provide a historical view.”

Usually, instead of aggregating ALL of the data from all of the sensors (think about what that would mean for GE’s Durathon battery plant, where 10,000 sensors dot the assembly line!), the data is first analyzed at “the edge,” i.e., at or near the point where it is collected. Then only the data that deviates from the norm (i.e., is significant) is passed on to the centralized databases and processing. That’s why I’m so excited about Egburt and its “fog computing” sensors.
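That edge-filtering pattern — report only significant deviations, keep the routine readings local — can be sketched in a few lines of Python. This is my own illustration of the general idea, not Egburt’s actual interface; the tolerance and smoothing values are invented:

```python
from dataclasses import dataclass

@dataclass
class EdgeFilter:
    """Forward only readings that deviate from a rolling baseline ('fog computing' style)."""
    mean: float               # current baseline estimate
    tolerance: float          # how far a reading may drift before it's reported
    alpha: float = 0.05       # smoothing factor for updating the baseline

    def process(self, reading: float):
        """Return the reading if it's significant, else None; always update the baseline."""
        significant = abs(reading - self.mean) > self.tolerance
        self.mean += self.alpha * (reading - self.mean)  # keep the baseline current
        return reading if significant else None

# Each of those 10,000 sensors could run a filter like this;
# only the exceptions travel upstream to the central database.
f = EdgeFilter(mean=70.0, tolerance=5.0)
readings = [70.2, 69.8, 71.0, 70.5, 82.3, 70.1]  # one spike amid normal data
forwarded = [r for r in readings if f.process(r) is not None]
print(forwarded)  # [82.3]
```

Six readings in, one reading out: that ratio, multiplied across thousands of sensors, is the whole case for analyzing at the edge.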

As with so many aspects of the IoT, it’s the real-time aspect of small data that makes it so valuable, and so different from past practices, when much of the potentially valuable data was never collected at all or, if it was, was only collected, analyzed, and acted upon historically. Hence the “Collective Blindness” that I’ve written about before, which limited our decision-making abilities in the past. Again, Kavis:

“Small data can trigger events based on what is happening now. Those events can be merged with behavioral or trending information derived from machine learning algorithms run against big data datasets.”

As examples of the interplay of small and large data, he cites:

  • real-time data from wind turbines that is used immediately to adjust the blades for maximum efficiency. The relevant data is then passed along to the data lake, “..where machine-learning algorithms begin to understand patterns. These patterns can reveal performance of certain mechanisms based on their historical maintenance record, like how wind and weather conditions effect wear and tear on various components, and what the life expectancy is of a particular part.”
  • medicine containers with smart labels. “Small data can be used to determine where the medicine is located, its remaining shelf life, if the seal of the bottle has been broken, and the current temperature conditions in an effort to prevent spoilage. Big data can be used to look at this information over time to examine root cause analysis of why drugs are expiring or spoiling. Is it due to a certain shipping company or a certain retailer? Are there re-occurring patterns that can point to problems in the supply chain that can help determine how to minimize these events?”

Big data is often irrelevant in IoT systems’ functioning: all that’s needed is the real-time small data to trigger an action:

“In many instances, knowing the current state of a handful of attributes is all that is required to trigger a desired event. Are the patient’s blood sugar levels too high? Are the containers in the refrigerated truck at the optimal temperature? Does the soil have the right mixture of nutrients? Is the valve leaking?”
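Those current-state checks are just a handful of rules over a handful of attributes. Here’s a minimal sketch, with made-up attribute names and thresholds chosen only to mirror Kavis’s examples:

```python
def check(state: dict) -> list:
    """Evaluate a few current-state attributes and return any triggered alerts."""
    rules = {
        "blood sugar high": state.get("glucose_mg_dl", 0) > 180,
        "truck too warm":   state.get("truck_temp_c", 0) > 4.0,
        "valve leaking":    state.get("valve_flow_delta", 0) > 0.1,
    }
    return [name for name, fired in rules.items() if fired]

alerts = check({"glucose_mg_dl": 210, "truck_temp_c": 3.2, "valve_flow_delta": 0.02})
print(alerts)  # ['blood sugar high']
```

No data lake required: the small data alone triggers the event, and the big-data analysis happens later, offline.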

In a future post, I’ll address the growing role of data scientists in the IoT — and the need to educate workers on all levels on how to deal effectively with data. For now, just remember that E.F. Schumacher was right: “small is beautiful.”

 


Interview w/ Echelon for its IoT blog

Just finished a delightful interview with three Echelon staffers for a forthcoming piece on its blog about my prognostications for the Industrial Internet of Things (AKA the “Industrial Internet” in GE marketing-speak). They’ve been around in this field since the dark ages (1988) and are now focusing on industrial applications.

My main point to them was the one I made in the SAP “Managing the Internet of Things Revolution” e-guide: even though the IoT hasn’t realized its full potential yet, smart companies should begin creating and executing an IoT strategy now, “to connect their existing infrastructure and enhance key foundational IoT technologies,” optimizing their operating efficiency. Then they can build on that experience to make more fundamental transformations.

We touched on several other examples of how the IoT could increase operating efficiency or enable fundamental transformations.

At any rate, a fun time was had by all, and I’ll let you know when their blog post is up!
