My Latest Industry Week Column: why the edge is critical for IoT

As is so often the case, technological success can result in unintended consequences that, left unremedied, could negate the benefits.

In my latest Industry Week column I looked at one of those issues — the explosion of real-time sensor data collected by the IoT — and a solution that adds many other benefits in the process: shifting at least part of the data processing from the cloud to the “edge” of the system, preferably at the point of collection.

As I pointed out, if the data must first be moved to the cloud for processing (no mean feat, BTW, because it can overwhelm the transmission networks) and then sent back to the collection point for action, that round trip negates the IoT’s major benefit: being able to collect and then act on data in near-real time, allowing precise regulation of things.

Of course edge processing adds costs for distributed processing hardware and software, and can add risk if the device is easily tampered with, but, overall, it seems to me the edge should not replace the cloud in robust IoT systems, but definitely supplement it.
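To make that concrete, here is a minimal sketch (my own illustration, not anything from the column or the report) of the edge pattern: the device acts on each reading locally, in near-real time, and only bothers the cloud with the readings worth analyzing further. All the names, thresholds, and latencies here are hypothetical.

```python
# Hypothetical latencies: an on-device decision vs. a cloud round trip.
CLOUD_ROUND_TRIP_S = 0.25   # assumed network latency to the cloud and back
EDGE_LATENCY_S = 0.005      # assumed latency for a local decision

def edge_controller(temperature_c, setpoint_c=70.0, tolerance_c=2.0):
    """Act locally, in near-real time, without waiting for the cloud."""
    if temperature_c > setpoint_c + tolerance_c:
        return "throttle_down"          # immediate local action
    if temperature_c < setpoint_c - tolerance_c:
        return "throttle_up"
    return "hold"

readings = [68.9, 69.5, 73.2, 70.1, 66.4]
for r in readings:
    action = edge_controller(r)
    # Only out-of-tolerance readings are worth shipping upstream for analysis.
    forward_to_cloud = action != "hold"
    print(f"{r:5.1f} C -> {action:13s} forward_to_cloud={forward_to_cloud}")
```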

I based the column on a comprehensive report that stops short of over-promotion, Data at the Edge, created by an industry consortium, State of the Edge. It’s a quick read, and I recommend it!

Read it and let me know what you think.


Hippo: IoT-based paradigm shift from passive to active insurance companies

I’m a big advocate of incremental IoT strategies (check out my recent webinar with Mendix on this approach) for existing companies that want to test the waters first. However, I’m enough of a rabble-rouser to also applaud those who jump right in with paradigm-busting IoT (and big data) startups.

Enter, stage left, a nimble (LOL) new home insurance company: Hippo!

IMHO, Hippo’s important both in its own right and as a harbinger of other startups that will exploit the IoT and big data to break with years of tradition in the insurance industry as a whole: no longer sitting passively by to pay out claims when something bad happens, but seizing the initiative to reduce risk, which is what insurance started out to do.

After all, when a Mr. B. Franklin (I’ll tell you: plunk that guy down in 2017 and he’d create a start-up addressing an unmet need within a week!) and his fellow firefighters launched the Philadelphia Contributionship in 1752, one of the first things they did was to send out appraisers to determine the risk of a house burning and suggest ways to make it safer.

Left to right: Eyal Navon, CTO and cofounder, and Assaf Wand, CEO and cofounder of Hippo

In fact, there’s actually a term for this kind of web-based insurance, coined by McKinsey: “insuretech” (fittingly, one of Hippo’s founders had been at McKinsey, and what intrigued the founders about insurance as a target was that it’s a huge industry, hasn’t really innovated for years, and didn’t focus on the customer experience).

I talked recently to two key staffers, Head of Product Aviad Pinkovezky and Head of Marketing, Growth and Product Innovation Jason White.  They outlined a radically new strategy “with focused attention on loss reduction”:

  • sell directly to consumers instead of using agents
  • cut out legacy coverage leftovers (such as fur coats, silverware & stock certificates in a home safe) and instead cover laptops, water leaks, etc.
  • leverage data to inform customers about appliances they own that might be more likely to cause problems, and communicate with them on a continuous basis about steps, such as cleaning gutters, that could reduce problems.

According to Pinkovezky, the current companies “are reactive, responding to something that takes place. Consumer-to-company interaction is non-continuous, with almost nothing between paying premiums and filing a claim.  Hippo wants to build much more of a continuous relationship, providing value added,” such as an IoT-based water-leak detection device that new customers receive.

At the same time, White said that the company is still somewhat limited in what it can do to reduce risk, because so much of the risk doesn’t really come from factors such as theft (data speaks: he said thefts actually account for only a small share of claims) but from one that, measured by frequency and amount of damage (according to their analysis), is beyond their control: weather. As I pointed out, that’s probably going to constitute even more of a risk in the foreseeable future due to global warming.

Hippo also plans a high-tech, high-touch strategy that would couple technology with the human aspect that’s needed in a stressful situation such as a house fire or flood. According to Forbes:

“The company acknowledges that its customers rely on Hippo to protect their largest assets, and that insurance claims often derive from stressful experiences. In light of this, Hippo offers comprehensive, compassionate concierge services to help home owners find hotels when a home becomes unlivable, and to supervise repair contractors when damage occurs.”

While offering new services, the company has firm roots in the non-insuretech world, because its policies are owned and covered by Topa, which was founded more than 30 years ago.

Bottom line: if you’re casting about for an IoT-based startup opportunity, you’d do well to use the lens McKinsey applied to insurance: look for an industry that’s tradition-bound and tends to react to change rather than initiate it (REMEMBER: a key element of the IoT paradigm shift is that, for the first time, we can pierce “universal blindness” and really see inside things to gauge how they are working [or not] — the challenge is to capitalize on that new-found data).


Blockchain might be answer to IoT security woes

Could blockchain be the answer to IoT security woes?

I hope so, because I’d like to get away from my recent fixation on IoT security breaches and their consequences, especially the Mirai botnet attack that brought a large part of the Internet to its knees this fall and the even scarier (because it involved Philips, a company that takes security seriously) white-hat hack of Hue bulbs.  As I’ve written, unless IoT security is improved, the public and corporations will lose faith in it and the IoT will never develop to its full potential.

Now, there’s growing discussion that blockchain (which makes bitcoin possible) might offer a good IoT security platform.

Ironically — for something dealing with security — blockchain’s value in IoT may stem from the fact that the data is shared and no one person owns it or can alter it unilaterally. (BTW, this is one more example of my IoT “Essential Truth” that with the IoT, data should be shared, rather than hoarded as in the past.)

If you’re not familiar with blockchain, here’s an IBM video, using an example from the highly security-conscious diamond industry, that gives a nice summary of how it works and why:

The key aspects of blockchain are that it:

  • is transparent
  • can trace all aspects of actions or transactions (critical for complex sequences of actions in an IoT process)
  • is distributed: there’s a shared form of record keeping that everyone in the process can access.
  • requires permission — everyone has permission for every step
  • is secure: no one person — even a system administrator — can alter it without group approval.

Of these, perhaps the most important for IoT security is that no one can change the blockchain unilaterally: nothing (think malware) can be added without the action being permanently recorded and without every participant’s permission.  To add a new transaction to the blockchain, all the members must validate it by applying an algorithm to confirm its validity.
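For readers who like to see the mechanics, here is a bare-bones sketch of that idea: a toy hash chain in which a transaction is appended only if every participant’s validator approves it, and any unilateral alteration afterward breaks the chain and is immediately detectable. This is my own simplification, not any particular blockchain’s implementation; the validators here merely stand in for the real consensus algorithm.

```python
import hashlib, json

def block_hash(block):
    """Hash the block's contents (excluding its own hash field)."""
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def add_block(chain, transaction, validators):
    """Append a transaction only if every validator approves it."""
    if not all(v(transaction) for v in validators):
        raise ValueError("transaction rejected by the group")
    block = {"index": len(chain),
             "prev_hash": chain[-1]["hash"] if chain else "0" * 64,
             "transaction": transaction}
    block["hash"] = block_hash(block)
    chain.append(block)

def chain_is_valid(chain):
    """Any unilateral edit breaks the hash links and is immediately visible."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Hypothetical validators standing in for the group's consensus check.
validators = [lambda tx: "device_id" in tx, lambda tx: tx.get("reading") is not None]

chain = []
add_block(chain, {"device_id": "sensor-17", "reading": 21.4}, validators)
add_block(chain, {"device_id": "sensor-17", "reading": 21.6}, validators)
print(chain_is_valid(chain))                 # True

chain[0]["transaction"]["reading"] = 99.9    # someone tampers unilaterally...
print(chain_is_valid(chain))                 # False: the alteration is detected
```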

The blockchain can also increase efficiency by reducing the need for intermediaries, and it’s a much better way to handle the massive flood of data that will be generated by the IoT.

The Chain of Things think tank and consortium is taking the lead on exploring blockchain’s application to the IoT. The group describes itself as “technologists at the nexus of IoT hardware manufacturing and alternative blockchain applications.” They’ve run several blockchain hackathons, and are working on open standards for IoT blockchains.

Contrast blockchain with the current prevailing IoT security paradigm.  As Datafloq points out, it’s based on the old client-server approach, which really doesn’t work with the IoT’s complexity and variety of connections: “Connection between devices will have to exclusively go through the internet, even if they happen to be a few feet apart.”  It doesn’t make sense to try to funnel the massive amounts of data that will result from widespread deployment of billions of IoT devices and sensors through a centralized model when a decentralized, peer-to-peer alternative would be more economical and efficient.

Datafloq concludes:

“Blockchain technology is the missing link to settle scalability, privacy, and reliability concerns in the Internet of Things. Blockchain technologies could perhaps be the silver bullet needed by the IoT industry. Blockchain technology can be used in tracking billions of connected devices, enable the processing of transactions and coordination between devices; allow for significant savings to IoT industry manufacturers. This decentralized approach would eliminate single points of failure, creating a more resilient ecosystem for devices to run on. The cryptographic algorithms used by blockchains, would make consumer data more private.”

I love it: paradoxically, sharing data makes it more secure!  Until something better comes along and/or the nature of IoT strategy challenges changes, it seems to me this should be the basis for secure IoT data transmission!

 

 

 

Live Blogging from SAP’s HANA IoT event

Hmm. Never been to Vegas before: seems designed to bring out the New England Puritan in me. I’ll pass on opulence, thank you very much…

SAP HANA/ IoT Conference

Up front, I’m very interested in a handout from Deloitte, “Beyond Linear,” which really is in line with the speech I’ll give here tomorrow on the IoT “Essential Truths,” in which one of my four key points will be that we need to abandon the old, linear flow of data for a continuous, cyclical one.  According to Deloitte’s Jag Bandla,

“A complete, 360-degree view of relevant data for each specific process can help users avoid missed opportunities. The ‘all data’ approach means relevant data can and should come from anywhere — any application, any system, any process — not just the traditional channels associated with the process.”

Bravo!

First speaker: SAP Global Customer Operations CTO Irfan Khan:

  • “digital disruption”: catalyst for change & imperative to go digital.
  • digression about running going digital (I put in my 30 minutes this morning!!!), creating a totally new way of exercising (fits beautifully with “Smart Aging“!)
  • new macro tech trends are enabling digitalizations: hyper-connectivity, super computing, cloud computing, smart world, and cybersecurity (horrifying stat about how many USB sticks were left in dry cleaning!)
  • those who don’t go digital will go under…. (like John Chambers’ warning about IoT).
  • new opportunities in wide range of industries
  • need new digital architectures — “driving locality of data, integrated as deep as possible into the engine.”
  • HOLY COW! He starts talking about a circular, digitally-centered concept, with a buckyball visual.  Yikes: great minds think alike.
  • sez HANA allows a single platform for all digital enterprise computing.
  • running things in real-time, with no latency — music to my ears!

Jayne Landry, SAP:

  • too few in enterprise have real-time access to analytics — oh yeah!
  • “analytics for everyone”
  • “own the outcome”
  • “be the one to know”
  • SAP Cloud for Analytics — “all analytics capabilities in one product.” real-time, embedded, consumer-grade user experience, cloud-based. Looking forward to seeing this one!
  • “Digital Boardroom” — instant insight. Same info available to board also available to shopfloor — oh yeah — democratizing data!

Very funny bit by Ty Miller on using SAP Cloud for Analytics to analyze Area 51 data. Woo Woo!

Irfan Khan again:

  • how to bring it to the masses? Because it’s expensive and difficult to maintain on the premises, extend and build in cloud! Add new “micro services” to SAP HANA cloud platform: SAP Application Integration, Tax Service, Procurement, Customer Engagement, Predictive, and, ta da, IoT.
  • video of Hamburg Port Authority. Absolutely love that and what they’re doing with construction sites!

Jan Jackman, IBM:

  • customers want speed. Cloud is essential. IBM & HANA are partners in cloud…

This guy is sooo neat: Michael Lynch, IoT Extended Supply Chain for SAP (and former opera student!):

  • “Connecting information, people, and things is greatest resource ever to drive insightful action.”
  • “big deal is the big data processing potential is real & chips are cheaper, so you can build actual business solutions”
  • STILL GmbH (forklifts) great example!
  • phase 1: connect w/ billions of internet-enabled things to gain new insights
  • phase II: transform the way you make decisions and take action
  • phase III: re-imagine your customer’s experience.
  • they do design thinking workshops — would luv one of those!
  • great paradigm shift: Hagleitner commercial bathroom supplies
  • Kaeser compressors: re-imagining customer service
  • working with several German car companies on enabling connected driving
  • once again, the  Hamburg Port Authority!!

SAP’s strategy:

  • offers IoT apps. platforms, and facilitates extensions of IoT solutions
  • work closely with Siemens: he’s talked with them about turbine business.
  • SAP has several solutions for IoT
  • Cloud-based predictive maintenance!
  • “social network for assets”: Asset Intelligence Network
  • They did the Harley York, PA plant! — one line, from 21 days per bike down to 6 hrs.  (displays all around the plant with KPIs)
  • 5 layers of connectivity in manufacturing “shop floor to top floor”  SAP Connected Manufacturing
  • They have an IoT Starter Kit — neat
  • SAP Manufacturing Integration and Intelligence
  • SAP Plant Connectivity
  • SAP Event Stream Processor
  • SAP MobiLink
  • SAP SQL Anywhere/SAP ultralite
  • 3rd Party IoT Device Cloud (had never heard of “device cloud” concept — specialize in various industry verticals).

“Becoming an Insight-Driven Organization”  Speakers: Jag Bandla and Chris Dinkel of Deloitte.

  • Deloitte is using these techniques internally to make Deloitte “insight-driven”
  • “an insight-driven organization (IDO) is one which embeds analysis, data, and reasoning into every step of the decision-making process.” music to my ears!
  • emphasis on actionable insight
  • “when humans rely on their own experiences and knowledge, augmented by a stream of analytics-driven insights, the impact on value can be exponential”
  • benefits to becoming an IDO:
    • faster decisions
    • increased revenue
    • decreased cost of decision making
  • challenges:
    • lack of proper tech to capture
    • oooh: leaders who don’t understand the data…
  • 5 enabling capabilities:
    • strategy
    • people
    • process
    • data
    • tech
  • developing vision for analytics
  • Key questions: (only get a few..)
    • what are key purchase drivers for our customers?
    • how should we promote customer loyalty?
    • what customer sentiments are being expressed on social media?
    • how much should we invest in innovation?
  • Value drivers:
    • strategic alignment
    • revenue growth
    • cost reduction
    • margin improvement
    • tech
    • regulation/compliance
  • Organize for success (hmm: I don’t agree with any of these: want to decentralize while everyone is linked on a real-time basis):
    • centralized (don’t like this one, with all analyzed in one central group.. decentralize and empower!)
    • consulting: analysts are centralized, but act as internal consultants
    • center of excellence: central entity coordinates community of analysts across company
    • functional: analysts in functions such as marketing & supply chain
    • dispersed: analysts scattered across organization, little coordination
  • Hire right people! “Professionals who can deliver data-backed insights that create business value — and not just crunch numbers — are the lifeblood of an Insight-Driven Organization”
    • strong quantitative skills
    • strong biz & content skills (understand content and context)
    • strong data modeling & management skills
    • strong IT skills
    • strong creative design skills (yea: techies often overlook the cool design guys & gals)
  • Change the mindset (critical, IMHO!):
    • Communicate: build compelling picture of future to steer people in right direction.
    • Advocate: develop cohort of leaders to advocate for program.
    • Active Engagement: engage key figures to create pull for the program
    • Mobilize: mobilize right team across the organization.
  • How do you actually do it? 
    • improve insight-to-impact with “Exponential Biz Processes” — must rebuild existing business processes!  Involves digital user experience, biz process management, enterprise science, all data, and IT modernization.
      • re-engineer processes from ground up
      • develop intuitive, smart processes
      • enable exception-based management
  • Data:
    • “dark data:” digital exhaust, etc. might be hidden somewhere, but still actionable.
      • they use it for IoT: predictive personalization (not sure I get that straight…).
    • want to have well-defined data governance organization: standards, data quality, etc.
  • Technology: digital core (workforce engagement, big data & IoT, supplier collaboration, customer experience)
    • HANA
  • Switch to digital delivery: visualizations are key!
    • allow for faster observations of trends & patterns
    • improve understanding & retention of info
    • empower embedded feeds and user engagement

 

IoT and the Data-Driven Enterprise: Bob Mahoney, Red Hat & Sid Sipes, Sr. Director of Edge Computing, SAP

  • What’s driving enterprise IoT?
    • more connected devices
    • non-traditional interactions such as M2M and H2M
    • ubiquitous internet connectivity
    • affordable bandwidth
    • cloud computing
    • standards-based and open-source software
  • Biz benefits:
    • economic gains
    • new revenue streams (such as sale of jet turbine data)
    • regulatory compliance
    • efficiencies and productivity
    • ecological impact
    • customer satisfaction
  • example of Positive Train Control systems to avert collisions. Now, that can be replaced by “smarter train tech”
  • SAP and edge computing (can’t move all of HANA to edge, but..)
    • improve security in transmission
    • reduce bandwidth need
    • what if connection goes down
    • actual analysis at the edge
    • allows much quicker response than sending it to corporate, analyzing it & sending it back
    • keep it simple
    • focused on, but not limited to, IoT
  • they can run SQL anywhere on IoT, including edge: SQL Anywhere
  • Red Hat & SAP doing interesting combination for retail, with iBeacons, video heat map & location tracking: yields real insights into consumer behavior.

McKinsey IoT Report Nails It: Interoperability is Key!

I’ll be posting on various aspects of McKinsey’s new “The Internet of Things: Mapping the Value Beyond the Hype” report for quite some time.

First of all, it’s big: 148 pages in the online edition, making it the longest IoT analysis I’ve seen! Second, it’s exhaustive and insightful. Third, as with several other IoT landmarks, such as Google’s purchase of Nest and GE’s divestiture of its non-industrial internet division, the fact that a leading consulting firm would put such an emphasis on the IoT has tremendous symbolic importance.

McKinsey report — The IoT: Mapping the Value Beyond the Hype

My favorite finding:

“Interoperability is critical to maximizing the value of the Internet of Things. On average, 40 percent of the total value that can be unlocked requires different IoT systems to work together. Without these benefits, the maximum value of the applications we size would be only about $7 trillion per year in 2025, rather than $11.1 trillion.” (my emphasis)

This goes along with my most basic IoT Essential Truth, “share data.”  I’ve been preaching this mantra since my 2011 book, Data Dynamite (which, if I may toot my own horn, I believe remains the only book to focus on the sweeping benefits of a paradigm shift from hoarding data to sharing it).

I was excited to see that the specific example they zeroed in on was offshore oil rigs, which I focused on in my op-ed on “real-time regulations,” because sharing the data from a rig’s sensors could both boost operating efficiency and reduce the chance of catastrophic failure. The paper points out that there can be 30,000 sensors on a rig, but most of them function in isolation, to monitor a single machine or system:

“Interoperability would significantly improve performance by combining sensor data from different machines and systems to provide decision makers with an integrated view of performance across an entire factory or oil rig. Our research shows that more than half of the potential issues that can be identified by predictive analysis in such environments require data from multiple IoT systems. Oil and gas experts interviewed for this research estimate that interoperability could improve the effectiveness of equipment maintenance in their industry by 100 to 200 percent.”

Yet the researchers found that only about 1% of the rig data was being used, because it rarely was shared off the rig with others in the company and its ecosystem!

The section on interoperability goes on to talk about the benefits — and challenges — of linking sensor systems in examples such as urban traffic regulation, which could link not only data from stationary sensors and cameras, but also thousands of real-time feeds from individual cars and trucks, parking meters — and even non-traffic data that could have a huge impact on performance, such as weather forecasts.

While more work needs to be done on the technical side to increase the ease of interoperability, either through the growing number of interface standards or middleware, it seems to me that a shift in management mindset is as critical as sensor and analysis technology to take advantage of this huge increase in data:

“A critical challenge is to use the flood of big data generated by IoT devices for prediction and optimization. Where IoT data are being used, they are often used only for anomaly detection or real-time control, rather than for optimization or prediction, which we know from our study of big data is where much additional value can be derived. For example, in manufacturing, an increasing number of machines are ‘wired,’ but this instrumentation is used primarily to control the tools or to send alarms when it detects something out of tolerance. The data from these tools are often not analyzed (or even collected in a place where they could be analyzed), even though the data could be used to optimize processes and head off disruptions.”
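Here is a toy illustration of the distinction the report draws (my own, not McKinsey’s): the same machine data used merely for an out-of-tolerance alarm versus combined across two systems to flag a developing problem earlier. The signals and thresholds are made up.

```python
# One stream from the machine's own monitor, one from a separate plant system.
vibration = [0.20, 0.22, 0.25, 0.29, 0.34, 0.40]
bearing_temp = [61, 62, 64, 67, 71, 76]

ALARM_VIBRATION = 0.50   # the traditional approach: alarm only at this point

for day, (v, t) in enumerate(zip(vibration, bearing_temp), start=1):
    alarm = v >= ALARM_VIBRATION                  # anomaly detection: still silent
    # "Prediction" here is just a joint rising trend across the two systems.
    rising_together = day >= 3 and v > vibration[day - 3] and t > bearing_temp[day - 3]
    print(f"day {day}: vibration={v:.2f} temp={t} alarm={alarm} "
          f"predictive_flag={rising_together}")
```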

I urge you to download the whole report. I’ll blog more about it in coming weeks.


Remember: The IoT Is Primarily About Small Data, Not Big

Posted on 16th March 2015 in data, Internet of Things, M2M, management, manufacturing, open data

In one of my fav examples of how the IoT can actually save lives, sensors on only eight preemies’ incubators at Toronto’s Hospital for Sick Children yield an eye-popping 90 million data points a day!  If all 90 million data points were relayed to the “data pool,” the docs would be drowning in data, not saving sick preemies.

Enter “small data.”

Writing in Forbes, Mike Kavis has a worthwhile reminder that the essence of much of the Internet of Things isn’t big data, but small. By that, he means:

a dataset that contains very specific attributes. Small data is used to determine current states and conditions  or may be generated by analyzing larger data sets.

“When we talk about smart devices being deployed on wind turbines, small packages, on valves and pipes, or attached to drones, we are talking about collecting small datasets. Small data tell us about location, temperature, wetness, pressure, vibration, or even whether an item has been opened or not. Sensors give us small datasets in real time that we ingest into big data sets which provide a historical view.”

Usually, instead of aggregating ALL of the data from all of the sensors (think about what that would mean for GE’s Durathon battery plant, where 10,000 sensors dot the assembly line!), the data is first analyzed at “the edge,” i.e., at or near the point where it is collected. Then only the data that deviates from the norm (i.e., is significant) is passed on to the centralized databases and processing.  That’s why I’m so excited about Egburt, and its “fog computing” sensors.
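Here is a bare-bones sketch of that pattern (my illustration, not Egburt’s or GE’s actual pipeline): analyze every reading at the edge, but forward only the readings that deviate from a running norm.

```python
from statistics import mean, stdev

def significant(reading, history, sigmas=3.0):
    """Flag a reading that falls outside +/- `sigmas` of the recent norm."""
    if len(history) < 10:                # not enough history to judge yet
        return False
    mu, sd = mean(history), stdev(history)
    return abs(reading - mu) > sigmas * sd

history, forwarded = [], []
for reading in [20.1, 20.3, 19.9, 20.0, 20.2, 20.1, 19.8, 20.0, 20.2, 20.1,
                20.0, 27.5, 20.1]:       # one obvious outlier
    if significant(reading, history):
        forwarded.append(reading)        # only this goes to the central database
    history.append(reading)

print(forwarded)                         # [27.5]
```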

As with sooo many aspects of the IoT, it’s the real-time aspect of small data that makes it so valuable, and so different from past practices, where much of the potential data was never collected at all, or, if it was, was only collected, analyzed and acted upon historically. Hence the “Collective Blindness” that I’ve written about before, which limited our decision-making abilities in the past. Again, Kavis:

“Small data can trigger events based on what is happening now. Those events can be merged with behavioral or trending information derived from machine learning algorithms run against big data datasets.”

As examples of the interplay of small and large data, he cites:

  • real-time data from wind turbines that is used immediately to adjust the blades for maximum efficiency. The relevant data is then passed along to the data lake, “..where machine-learning algorithms begin to understand patterns. These patterns can reveal performance of certain mechanisms based on their historical maintenance record, like how wind and weather conditions effect wear and tear on various components, and what the life expectancy is of a particular part.”
  • medicine containers with smart labels. “Small data can be used to determine where the medicine is located, its remaining shelf life, if the seal of the bottle has been broken, and the current temperature conditions in an effort to prevent spoilage. Big data can be used to look at this information over time to examine root cause analysis of why drugs are expiring or spoiling. Is it due to a certain shipping company or a certain retailer? Are there re-occurring patterns that can point to problems in the supply chain that can help determine how to minimize these events?”

Big data is often irrelevant in IoT systems’ functioning: all that’s needed is the real-time small data to trigger an action:

“In many instances, knowing the current state of a handful of attributes is all that is required to trigger a desired event. Are the patient’s blood sugar levels too high? Are the containers in the refrigerated truck at the optimal temperature? Does the soil have the right mixture of nutrients? Is the valve leaking?”
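In code, those checks are almost trivially small. Here is a hypothetical set of small-data triggers in the spirit of Kavis’s examples: each rule needs only the current state of a handful of attributes, with no big-data analysis in the loop. All the attribute names and limits are invented for illustration.

```python
# Each rule pairs an event name with a condition on the current state.
RULES = [
    ("blood sugar too high",  lambda s: s.get("blood_sugar_mgdl", 0) > 180),
    ("reefer truck too warm", lambda s: s.get("container_temp_c", 0) > 4.0),
    ("soil nutrients low",    lambda s: s.get("soil_nitrogen_ppm", 999) < 10),
    ("valve leaking",         lambda s: s.get("valve_flow_lpm", 0) > 0
                                        and not s.get("valve_open", False)),
]

def triggered_events(state):
    """Return every event whose condition the current state satisfies."""
    return [name for name, condition in RULES if condition(state)]

print(triggered_events({"container_temp_c": 6.2,
                        "valve_open": False,
                        "valve_flow_lpm": 0.8}))
# ['reefer truck too warm', 'valve leaking']
```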

In a future post, I’ll address the growing role of data scientists in the IoT — and the need to educate workers on all levels on how to deal effectively with data. For now, just remember that E.F. Schumacher was right: “small is beautiful.”

 


IBM picks for IoT trends to watch this year emphasize privacy & security

Last month Bill Chamberlin, the principal analyst for Emerging Tech Trends and Horizon Watch Community Leader for IBM Market Development (hmmm, must have an oversized biz card..) published a list of 20 IoT trends to watch this year that I think provide a pretty good checklist for evaluating what promises to be an important period in which the IoT becomes more mainstream.

It’s interesting to me, especially in light of my recent focus on the topics (and I’ll blog on the recent FTC report on the issue in several days), that he put privacy and security number one on the list, commenting that “Trust and authentication become critical across all elements of the IoT, including devices, the networks, the cloud and software apps.” Amen.

Most of the rest of the list was no surprise, with standards, hardware, software, and edge analytics rounding out the top five (even though it hasn’t gotten a lot of attention, I agree edge analytics are going to be crucial as the volume of sensor data increases dramatically: why pass along the vast majority of the data, which is probably redundant, to the cloud, versus just what deviates from the norm and is probably more important?).

Two dealing with sensors did strike my eye:

“9.  Sensor fusion: Combining data from different sources can improve accuracy. Data from two sensors is better than data from one. Data from lots of sensors is even better.

10.  Sensor hubs: Developers will increasingly experiment with sensor hubs for IoT devices, which will be used to offload tasks from the application processor, cutting down on power consumption and improving battery life in the devices”

Both make a lot of sense.
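If you want to see why more sensors help, here is a toy sensor-fusion sketch (mine, not IBM’s): combine readings of the same quantity, weighting each by its assumed precision, and watch the uncertainty of the fused estimate shrink as sensors are added. The values and variances are made up.

```python
def fuse(readings):
    """readings: list of (value, variance). Returns (fused value, fused variance)."""
    weights = [1.0 / var for _, var in readings]     # inverse-variance weighting
    total = sum(weights)
    value = sum(w * v for (v, _), w in zip(readings, weights)) / total
    return value, 1.0 / total

one_sensor = fuse([(21.3, 0.5)])
two_sensors = fuse([(21.3, 0.5), (20.8, 0.5)])
many_sensors = fuse([(21.3, 0.5), (20.8, 0.5), (21.1, 0.2), (21.0, 0.2)])

for label, (value, var) in [("one", one_sensor), ("two", two_sensors),
                            ("many", many_sensors)]:
    print(f"{label:5s} sensors -> estimate {value:.2f}, variance {var:.3f}")
```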

One was particularly noteworthy in light of my last post, about the Gartner survey showing most companies were ill-prepared to plan and launch IoT strategies: “14.  Chief IoT Officer: Expect more senior level execs to be put in place to build the enterprise-wide IoT strategy.” Couldn’t agree more that this is vital!

Check out the whole list: I think you’ll find it helpful in tracking this year’s major IoT developments.


I’ll be on SAP Radio Again Today: the IoT and Big Data

I’ll be on SAP’s “Coffee Breaks With Game Changers” radio again today, live @ 2 EST, appearing again with SAP’s David Jonker, again talking about the IoT and Big Data.  This time I plan to speak about:

  • Integrating real-time and historic data in decision-making: in the past, it was so hard to glean real-time operating data that we had to infer how to manage the future from analysis of past data.  Now we have a more difficult challenge: learning to balance past and real-time data (see the sketch after this list).
  • Sharing data in real-time: In the past, data trickled down from top management and might (or might not) eventually get to operators on the shop floor.  Now, everyone can get immediate access to it. Will senior managers continue to be the gatekeepers, or will everyone have real-time access to the data that might allow them to do their jobs more effectively (for example, fine-tuning production processes)?

  • Revolutionizing decision-making: Decision-making will also change, because everyone will be able to have simultaneous access to the data. Does sequential decision-making by various siloed departments really make sense any more, when they might all benefit from making the decisions simultaneously and collaboratively, based on the data?
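Here is one simple way to picture the first point, balancing past and real-time data (my sketch, not SAP’s): keep a historical baseline, blend each live reading into it, and let both views drive the decision rather than either one alone. The numbers and the threshold are invented.

```python
def update_baseline(baseline, live_reading, alpha=0.1):
    """Exponentially weighted blend: recent data counts, history isn't discarded."""
    return (1 - alpha) * baseline + alpha * live_reading

baseline = 100.0                     # e.g., historical throughput, units/hour
for live in [102, 98, 95, 90, 88]:   # hypothetical real-time readings
    decision = "investigate line" if live < 0.9 * baseline else "keep running"
    print(f"live={live:5.1f} baseline={baseline:6.2f} -> {decision}")
    baseline = update_baseline(baseline, live)
```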

Tune in!


Perhaps Most Important Internet of Things Essential Truth: Everything’s Linked

PROCEED WITH CAUTION!

You see, I’m thinking out loud (that accounts for that sound of gears grinding….) — I really am writing this post as I mull over the subject for the first time, so you’re forewarned that the result may be a disaster — or insightful. Bear with me…

I’m working on a book outline expanding on “Managing the Internet of Things Revolution,” the introduction to IoT strategy for C-level executives that I wrote for SAP. One of the things I’ve been looking for is a theme that would bring together all of the book’s parts, which include product design, manufacturing, marketing and corporate organization, among other topics.

I think I’ve got that theme, and I think it may be the most Essential Truth of all the ones I’ve written about regarding the IoT:

Everything’s Linked!

When you think about it, there have been a lot of dead-ends in business in the past:

  • we haven’t been able to know how customers used our products. We’ve actually got a lot more information about the ones that failed, because of warranty claims or complaints, than we have about the ones that worked well, because that information was impossible to gather.
  • data that could help workers do their work better has always come from top down, filtered by various levels of management and only delivered after the fact.
  • customers can’t get the full value of our products because the products operate in isolation from each other, and are often slow to react to changing conditions.
  • assembly-line machinery has frequently been hard to optimize, because we really didn’t know how it was operating — until it broke down.
  • key parts of the operation, such as supply chain, manufacturing, and distribution, have been largely independent, without simultaneous access to each other’s status.

With the Internet of Things, by contrast, everything will be linked, and that will change everything:

  • we’ll get real-time data about how customers are using our products. Most radically, that data may even allow us, instead of selling products and then severing our ties to the customer as in the past, to lease them the products, with the pricing dependent on how they actually use the products and the value they obtain from them.
  • everyone in the company can (if your management practices allow!) have real-time access to data that will help them improve their decision making and daily operations (hmm: still looking for an example of this one: know any companies that are sharing data on a real-time basis??).
  • products will work together, with synergistic results (as with the Jawbone UP turning on the Nest), with their operation automatically triggered and coordinated by services such as IFTTT.
  • the assembly line can be optimized because we’ll be able to “see” into massive equipment to learn how it is operating — or if it needs repairs in time to avoid catastrophic failure.
  • access to that same data may even be shared with your supply chain and distribution network — or even with customers (again, looking for a good example of that transformation).

There won’t be dead ends or one-way streets where information only flows one way. Instead, they’ll be replaced by loops (in fact, I thought loops might be an alternative theme): in many cases, data will be fed back through M2M systems so things can be optimized.

If that’s the case, we’ll be able to increase the use and value of tools such as systems dynamics software, which would help us model and act on these links and loops. Instead of massive oscillations, where we’re forced to make sudden, major corrections when data finally becomes available, machinery will be largely self-regulating, based on continuous feedback. We’ll delight customers because products will be more dependable and we’ll be able to fine-tune them by adding features based on actual knowledge of how the products work.  Workers will be more efficient, and happier, because they’ll be empowered. We’ll tread lightly on the earth, because we’ll use only what we need, precisely when we need it.
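To show what I mean by loops and continuous feedback, here is a tiny sketch (illustrative only, with made-up numbers): a closed loop in which each cycle’s small correction, driven by immediate feedback, walks the system smoothly back to its setpoint instead of lurching after delayed reports.

```python
def run_loop(setpoint, disturbance, gain=0.5, cycles=8):
    """Closed loop: each cycle we measure, correct a little, and the process responds."""
    value, trace = setpoint + disturbance, []
    for _ in range(cycles):
        error = setpoint - value
        value += gain * error             # small continuous correction from feedback
        trace.append(round(value, 2))
    return trace

print(run_loop(setpoint=70.0, disturbance=8.0))
# [74.0, 72.0, 71.0, 70.5, 70.25, 70.12, 70.06, 70.03] -- converges smoothly
```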

By George, I think I’ve got it! I’m excited about this vision of the Internet of Things linking everything. What do you think?? Please let me know! 
