Will some smart home device makers ever grow souls??

(Please cut me a little slack on this post, dripping with sarcasm: these latest examples of some smart home device makers’ contempt/obliviousness toward customers’ privacy and security shoved me over the edge!).

Once upon a time two smart boys in their dorm room thought up a new service that really made a new technology hum. When they turned it into a tiny company, they even adopted a cute motto: “don’t be evil.” Neat!

Then their little service got very, very big and very, very profitable. The motto? It kinda withered away. Last year it was even dropped from the company’s code of conduct.

Which, conveniently, allowed that once tiny company to produce this abomination: the Google Nest Guard (the alarm, keypad, and motion sensor portion of Nest’s Secure home protection system) featuring a mic.

Oh, did I point out that Nest didn’t mention the mic’s presence? No, that fact only emerged when it announced the Guard’s integration with Google’s Assistant voice device (Sample command: “OK, Google, surveil my family.”) and Business Insider ferreted out the mic’s presence:

“The existence of a microphone on the Nest Guard, which is the alarm, keypad, and motion-sensor component in the Nest Secure offering, was never disclosed in any of the product material for the device.”

On Tuesday, a Google spokesperson told Business Insider the company had made an “error.”

“The on-device microphone was never intended to be a secret and should have been listed in the tech specs,” the spokesperson said. “That was an error on our part.”

Oh. All is forgiven. It was just an “error on our part.”

Except that, to put it as politely as I can, that’s utter baloney. As if the mic just sorta got there on its own: no engineer suggested adding it, no executive reviewing the design ever noticed it.

Nope, that mic was there intentionally, and Google is so morally corrupt and/or amoral that it simply chose not to tell the public.

And, while we’re at it, let’s not heap all the opprobrium on Google. Amazon subsidiary Ring actually let its employees view videos shot with its doorbell device:

“These videos were unencrypted, and could be easily downloaded and shared. The team was also given a database that linked each video to the Ring customer it belonged to.”

As I’ve said many times before, my perspective on the issues of privacy and security is informed by my prior work in corporate crisis management, which taught me that far too many engineers (I have many friends in the profession, but if the shoe fits, wear it) are simply oblivious to privacy and security issues, viewing them as something to be handled through bolt-on protections after the fun part of product design is done. In fact, in adding the prior link, I came across something I wrote last year in which I quoted from the Google blog — which contained nary a mention of privacy concerns — about an aspect of AI that would allow identification of what shop a batch of ramen came from. Funny, huh? No — scary.

Another lesson I drew from my past was the phenomenon of guilt by association, which is incredibly rampant right now: people conflate issues as diverse as smart home privacy violations, Russian election tampering, some men’s inability to find dates (I kid you not, and the result may be lethal for some women), the so-called “deep state,” etc., etc. The engineers I know tend to dismiss these wacky ideas because they aren’t logical. But the fact that the fears aren’t logical doesn’t mean they aren’t very, very real to those who embrace them.

That means that even those companies whose smart home devices DO contain robust privacy protections risk people rejecting their devices as well. Trust me on this one: I work every day with rational people who reject the cloud and all the services that could enrich their lives due to their fear of privacy and security violations.

That’s why responsible IoT companies must get involved in collaborations such as the Internet of Things Association and the IMC, which are working on joint strategies to deal with these issues.

Let’s not forget that these gaffes come at the very moment when regulators and elected officials are showing far more interest in regulating and/or even breaking up the Silicon Valley behemoths. You’d kinda think they’d be on their best behavior, not doing stupid things that just draw more criticism.

I’m fed up, and I won’t shut up. Write me if you have feasible suggestions to deal with the problem.

IMPORTANT POSTSCRIPT!

I just discovered a Verge piece from last month to the effect that Google is belatedly getting religion about personal privacy, even — and this wins big points in my book — putting its privacy policies in plain English (yes!) rather than legalese. Here’s a long excerpt from the article. If they follow up, I’d be the first to praise them and withdraw my criticism, although not of the industry as a whole:

“So today, as Google announced that it’s going to sell a device that’s not all that different from the Facebook Portal, whose most every review wondered whether you should really invite a Facebook camera into your home, Google also decided to publicly take ownership for privacy going forward.
As we discovered in our interview with Google Nest leader Rishi Chandra, Google has created a set of plain-English privacy commitments. And while Google didn’t actually share them during today’s Google I/O keynote, they’re now available for you to read on the web.
Here’s the high-level overview:
We’ll explain our sensors and how they work. The technical specifications for our connected home devices will list all audio, video, and environmental and activity sensors—whether enabled or not. And you can find the types of data these sensors collect and how that data is used in various features in our dedicated help center page.
We’ll explain how your video footage, audio recordings, and home environment sensor readings are used to offer helpful features and services, and our commitment for how we’ll keep this data separate from advertising and ad personalization.
We’ll explain how you can control and manage your data, such as providing you with the ability to access, review, and delete audio and video stored with your Google Account at any time.
But the full document gets way more specific than that. And remarkably, a number of the promises aren’t the typical wishy-washy legalese you might expect. Some are totally unambiguous. Some of them go against the grain, like how Nest won’t let you turn off the recording light on your camera anymore because it wants to assure you!
‘Your home is a special place. It’s where you get to decide who you invite in. It‘s the place for sharing family recipes and watching babies take first steps. You want to trust the things you bring into your home. And we’re committed to earning that trust,’ Google says.”

Maybe somebody’s listening!

5G Raises the Stakes for IoT Security

Last week’s international political news was a dramatic reminder of how inextricably linked technology progress (in this case, 5G infrastructure) and high-stakes global intrigue and even warfare have become.

The speed-up in deployment of 5G networks in the US and worldwide can both dramatically increase the IoT’s benefits (with reduced latency we’ll get a significant increase in the volume of rich, near-real-time data, enabling autonomous vehicles and other hard-to-imagine advances) and the dangers (the possibility of China, Russia or someone else launching a cyber attack through a “back door” that could cripple our critical infrastructure). That puts the IoT right in the middle of a very tense global diplomatic and technical battle, with the outcome potentially having a big impact on the IoT’s near-term growth.

The US government’s indictment of Huawei (coming on the heels of an as-yet uncorroborated Bloomberg story claiming that tiny chips planted on Supermicro server motherboards used by Apple and Amazon would allow “back-door” attacks not just on the devices but on overall networks), plus a little-noticed story about yet another Chinese manufacturer of cheap IoT devices whose firmware could let a bad actor install malware, are just the latest reminders that IoT privacy and security must be designed in from the beginning, using what the EU calls “privacy by design.”

Don’t forget that we’ve already had a very real preview of exactly how dangerous this can be: the 2016 DDoS attack on Internet infrastructure company Dyn, which used IoT devices with inadequate protections as its Trojan horses. Much of the Internet was crippled for several hours.

It also means, as I wrote in The Future Is Smart and elsewhere, that it’s not enough to design privacy protections into your own products and services. I learned during my years doing corporate crisis management that if the public and companies lose confidence in the IoT because of an attack aimed at anyone, even the irresponsible companies that don’t worry about security, there’s an irrational but nonetheless compelling guilt-by-association phenomenon that can destroy confidence in the IoT as a whole. Is that fair? No, but that doesn’t make it any less of a reality. That’s why it’s critical that you take an active role in supporting enlightened federal policy on both 5G infrastructure and IoT regulation, especially privacy and security regulations that are performance-based rather than prescriptive (which might restrict innovation), as well as joining industry organizations working on these issues, such as the IMC and the Internet of Things Association.

In The Future Is Smart I wrote that, counterintuitively, privacy and security can’t be bolted on after you’ve done the sexy part of designing cool new features for your IoT device or service. This news makes that even more the case. What’s required is a mind-set in which you think of privacy and security from the very beginning and then visualize the process after its initial sale as cyclical and never-ending: you must constantly monitor emerging threats and then upgrade firmware and software protections.
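
That cyclical, never-ending process can be sketched in code. The snippet below is a minimal, hypothetical illustration (the key handling and update format are my own inventions, not any vendor’s actual mechanism): a device accepts a firmware update only if it is newer than what’s installed and its authentication tag verifies, rejecting both tampered images and replayed old ones.

```python
import hashlib
import hmac

# Illustrative only: a real device would keep per-device keys in secure storage.
VENDOR_KEY = b"example-signing-key"

def sign_firmware(image: bytes, key: bytes = VENDOR_KEY) -> str:
    """Vendor side: compute an authentication tag over the firmware image."""
    return hmac.new(key, image, hashlib.sha256).hexdigest()

def apply_update(current_version: int, update: dict, key: bytes = VENDOR_KEY) -> int:
    """Device side: install an update only if it is newer and its tag verifies."""
    if update["version"] <= current_version:
        return current_version  # ignore stale or replayed updates
    expected = hmac.new(key, update["image"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, update["tag"]):
        return current_version  # reject tampered firmware
    return update["version"]

good = {"version": 2, "image": b"fw-v2", "tag": sign_firmware(b"fw-v2")}
bad = {"version": 3, "image": b"evil-payload", "tag": "0" * 64}

version = apply_update(1, good)     # accepted: newer and correctly signed
version = apply_update(version, bad)  # rejected: signature doesn't verify
```

The point of the sketch is the loop’s shape, not the crypto details: checking, verifying, and applying updates has to keep running for the life of the product, because the threats keep evolving after the initial sale.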


Live Blogging #LiveWorx ’18, Day 2

Aiden Quilligan, Accenture Industry X.0, on AI:

  • Mindset and AI: must undo what Hollywood has done on this over the years, posing it as human vs. machine.
  • We think it should be human PLUS machine.
  • he’s never seen anything move as fast as AI, especially in robotics
  • now, co-bots that work alongside us
  • exoskeletons
  • what do we mean by AI? Machine learning. AI is a range of technologies that can learn and then act. AI is the “new work colleague” we need to learn to get along with.
  • predictions: will generate $2.9 trillion in biz value and recover 6.2 billion hours of worker productivity in 2021.
  • myths:
    • 1) robots evil, coming for us: nothing inherently anti-human in them.
    • 2) will take our jobs. Element of truth in terms of repetitive, boring work that will be replaced. They will fill in for retiring workers. Some new industries created by them.  Believe there will be net creation of jobs.
    • 3) current approaches will still work.

6 steps to the Monetization of IoT, Terry Hughes:

  • Digital native companies (Uber) vs. digitally transforming companies
  • also companies such as Kodak that didn’t transform at all (vs. Fujifilm, which has transformed).
  • Forbes: 84% of companies have failed with at least one transformation program. Each time you fail you lose half a billion dollars.
  • steps:
    • 1) devices with potential
    • 2) cloud network communication
    • 3) software distribution
    • 4) partner and provider ecosystem
    • 5) create a marketplace.
    • 6) monetization of assets.
  • crazy example of software company that still ships packages rather than just download because of initial cost in new delivery system
  • 3 big software challenges for digitally transforming company
    • fragmented silos of software by product, business unit & software
    • messy and complex distribution channels
    • often no link between software and the hardware that it relates to
  • importance of an ecosystem
    • Blackberry example of one that didn’t have the ecosystem
  • 3rd parties will innovate and add value around a manufacturer’s core products
  • in IoT it’s a land grab for mindshare of 3rd-party innovators.
  • need strong developer program
  • tools for app development and integration
  • ease of building and publishing apps
  • path to discovery and revenue for developer
  • IDC: developer ecosystems allow enterprises to massively scale distribution
  • digitally native companies have totally different models (will get details later…)
  • hybrids:
    • GE Healthcare:  working with Gallus BioPharma
    • Heidelberg & Eig have digital biz model for folding carton printing. Pay per use
  • Ford is heading for mobility as a transformation

 


Bernard Marr: Why IoT, Combined With AI and Big Data, Fuels 4th Industrial Revolution

 

  • connecting everything in house to Internet
  • Spotify: their vision is that they understand us better. Can correlate your activity on Apple Watch (such as spinning) & create a playlist based on that.
  • FitBit: the photo will estimate your calorie content.
  • John Deere
  • ShotSpotter: the company that monitors gun shots
  • understanding customers & markets better than before:
    • Facebook: better at face recognition than we are. They can predict your IQ, your relationship status.
  • Lot of frightening, IMHO, examples of AI analyzing individuals and responding without consideration of ethics and privacy
  • 3) improving operations and efficiency:
    • self-driving boats
    • drones
    • medicine through Watson

panel on IoT:

  • Don’t be afraid of the cloud
  • Ryan Cahalane, Colfax: prepare for big, start small and move fast. They had remarkable growth with switch to IoT.  Not a digital strategy, but digital in everything they do. Have “connected welders,” for example.
  • Justin Hester, Hirotec: most important strategic digital transformation decision your organization can make is the selection of a platform. The platform is the underlying digital thread that enables your team to meet the unique and changing needs of your organization and to scale those solutions rapidly. “Assisted reality” in ThingWorx
  • Shane O’Callahan, TSM (Ireland): Make industrial automation equipment for manufacturing. Understanding your key value driver is where to start. Then start small, scale fast and get a win!

Jeffrey Miller, PTC: Digital Transformation:

  • if you start with digital strategy you’re starting in the wrong place. Start with business strategy.
  • Couple with innovation vision merged with digital strategy. Add business use cases.
  • Jobs: it’s not how much you spend on R & D, but “about the people you have, how you’re led, and how much you get it”
  • create an environment for innovation
    • do we encourage experimentation?
    • is it ok to fail
  • identify digital technologies to provide the required operating capabilities:
    • have we conducted proofs of concept?
    • experimented, tested  and validated?
    • reviewed use cases & success studies?
    • delivered small, important, scalable successes?

Matt,  PTC: Bringing Business Value to AR:

  • augmented service guidance
  • remote expert guidance
  • manufacturing: machine setup and turnover, assembly and process
  • example of Bell & Howell pickup towers that store online orders in Walmart stores for customer pickup: very expensive to send one to a store for a salesperson to use in sales — now just use an AR app to give a realistic demo without the expense.
  • service: poor documentation organization; want accurate, relevant, onsite info for the technician. Want to eliminate return visits because the repair wasn’t done the 1st time, or there’s a new technician. Manuals in binders, etc. Instead, with AR, requirements are quick access to current info. Finally, a demo.

Suchitra Bose, Accenture: Manufacturing IIoT, Driving the Speed of Digital Manufacturing:

  • convergence of IT and OT
  • expanding digital footprint across your entire factory
  • PTC has wide range of case studies (“use cases” in biz speak…) on aspects of IoT & manufacturing.

Wahoo! Liveblogging #Liveworx ’18!

Always my fav event: I’ll be liveblogging #LiveWorx ’18. Stay tuned!

Keynote: Jim Heppelmann:

  • “from a place to a pace” — how fast are we moving?
  • no longer OK to think in terms of a future destination: that builds inertia (“your main competitor”). Disruption may have already happened. Hard to sustain advantage due to pace of change. Must “embrace a pace of change”
  • Um, this sounds like argument for my circular company paradigm shift!!!
  • Customer Experience Center will occupy top floor of new building.
  • combo of  physical, human and digital — transforming all at once speeds change:
    • physical: been constrained by subtractive manufacturing, while nature improves via cell division (i.e., additive). “Adopt Mother Nature’s mindset.” — new additive aspects of Creo. Example of Triumph cycle swing-arm using additive. Creo uses AI to optimize performance: non-symmetrical design. Still need to use simulation tests: new intermittent, continuous style: they are doing new partnership with ANSYS (product simulation software), unified modeling and simulation with no gaps. Historically, simulation only used at end of design cycle, now can use it throughout the process: “pervasive simulation.”
      • ANSYS “Discovery Live”: optimizes for real-time. Integrates with Creo — instant feedback on new designs. “simulation critical to innovation.”
    • digital: working with Microsoft Azure (Rodney Clark, Microsoft IoT VP). Microsoft investing $5b in IoT.  1st collaboration is an industrial welder: IoT data optimizes productivity.  BAE can train new employees 30-40% quicker.
    • finally, human: “Mother Nature designed us to interface with the physical. How do we integrate with the digital?” — Siri, Alexa, Cortana still too slow. Sight is our best bet. “Need direct pipeline to reality” — that’s AR. “Smart, connected humans.” Sysmex: for medical lab analysis. Hospitals need real-time access to blood cell analysis. They have real-time calibration of analysis equipment. Also improving knowledge of the support techs, using AR and digital twins when repairs are needed.
      • Will help 2.5 billion workers become more productive
      • AR can project how a process is being programmed (gotta see this one. will try to get video).
      • All of their human/digital interface initiatives united under Vuforia. Already have 10,000 enterprises using it.
    • Factories are a new focus of PTC. 200 companies now using it in 800 factories. Examples from Woodward & Colfax.  Big savings on new employee training.

Keynote: Prof. Linda Hill, HBS, “Collective Genius”:

  • Innovation= novel + useful
  • Example of Pixar: collective genius “filmmaking is a team sport.”
  • 3 characteristics of creative organizations they looked at:
    • “creative abrasion” — diversity and debate
    • “creative agility” — quickly test the idea & get feedback. Experiment rather than run pilots, which often include politics
    • “creative resolution” — ability to make integrative decisions. Don’t necessarily defer to the experts.
    • sense of community and shared purpose.
  • values: bold ambition, collaboration, responsibility, learning.
  • rules of engagement: respect, trust, influence, see the whole, question everything, be data-driven.

Ray Miciek, Aquitas Solutions. Getting Started on IoT-based Maintenance:

  • his company specializes in asset maintenance.
  • “produce products with assets that never fail”
  • 82% of all asset failures are random, because they are more IT-related now
  • find someplace in org. where you could gain info to avoid failure.
  • Can start small, then quickly expand.

 

“All of Us”: THE model for IoT privacy and security!

Pardon me in advance: this will be long, but I think the topic merits it!

One of my fav bits of strategic folk wisdom (in fact, a consistent theme in my Data Dynamite book on the open data paradigm shift) is, when you face a new problem, to think of another organization that might have one similar to yours, but which suffers from it to the nth degree (in some cases, even a matter of literal life-or-death!).

That’s on the likelihood that the severity of their situation will already have led these organizations to explore radical and innovative solutions that might guide you and shorten the process. In the case of the IoT, that would include jet turbine manufacturers and off-shore oil rigs, for example.

I raise that point because of the ever-present problem of IoT privacy and security. I’ve consistently criticized many companies’ lack of seriousness and ingenuity in attending to these issues, and warned that this could result not only in disaster for those companies, but for the industry in general due to guilt-by-association.

This is even more of an issue since the May roll-out of the EU’s General Data Protection Regulation (GDPR), based on the presumption of an individual right to privacy.

Now I have exciting confirmation — from the actions of an organization with just such a high-stakes privacy and security challenge — that it is possible to design an imaginative and effective process that alerts the public to the high stakes, reassures them, and enrolls them as partners.

Informed consent at its best!

It’s the NIH-funded All of Us, a bold effort to recruit 1 million or more people of every age, sex, race, home state, and state of health nationwide to speed medical research, especially toward the goal of “personalized medicine.” The researchers hope that, “By taking into account individual differences in lifestyle, environment, and biology, researchers will uncover paths toward delivering precision medicine.”

All of Us should be of great interest to IoT practitioners, starting with the fact that it might just save our own lives by leading to creation of new medicines (hope you’ll join me in signing up!). In addition, it parallels the IoT in allowing unprecedented degrees of precision in individuals’ care, just as the IoT does with manufacturing, operating data, etc.:

“Precision medicine is an approach to disease treatment and prevention that seeks to maximize effectiveness by taking into account individual variability in genes, environment, and lifestyle. Precision medicine seeks to redefine our understanding of disease onset and progression, treatment response, and health outcomes through the more precise measurement of molecular, environmental, and behavioral factors that contribute to health and disease. This understanding will lead to more accurate diagnoses, more rational disease prevention strategies, better treatment selection, and the development of novel therapies. Coincident with advancing the science of medicine is a changing culture of medical practice and medical research that engages individuals as active partners – not just as patients or research subjects. We believe the combination of a highly engaged population and rich biological, health, behavioral, and environmental data will usher in a new and more effective era of American healthcare.” (my emphasis added)


But what really struck me about All of Us’s relevance to the IoT is the absolutely critical need to do everything possible to assure the confidentiality of participants’ data, starting with HIPAA protections and extending to the recognition that it would absolutely destroy public confidence in the program if the data were to be stolen or otherwise compromised. As Katie Rush, who heads the project’s communications team, told me, “We felt it was important for people to have a solid understanding of what participation in the program entails—so that through the consent process, they were fully informed.”

What the All of Us staff designed was, in my estimation (and I’ve been in or around medical communication for forty years), the gold standard for such processes, and a great model for effective IoT informed consent:

  • you can’t ignore it and still participate in the program: you must sign the consent form.
  • you also can’t short-circuit the process: the site said at the beginning that the process would take 18-30 minutes (to which I said yeah, sure — I was just going to sign the form and get going), and it really did, because you had to do each step or you couldn’t join — the site was designed so no shortcuts were allowed:
    • first, there’s an easy-to-follow, attractive short animation about that section of the program
    • then you have to answer some basic questions to demonstrate that you understand the implications.
    • then you have to give your consent to that portion of the program
    • the same process is repeated for each component of the program.
  • all of the steps, and all of the key provisions, are explained in clear, simple English, not legalese. To wit:
    • “Personal information, like your name, address, and other things that easily identify participants will be removed from all data.
    • Samples—also without any names on them—are stored in a secure biobank”
    • “We require All of Us Research Program partner organizations to show that they can meet strict data security standards before they may collect, transfer, or store information from participants.
    • We encrypt all participant data. We also remove obvious identifiers from data used for research. This means names, addresses, and other identifying information is separate from the health information.
    • We require researchers seeking access to All of Us Research Program data to first register with the program, take our ethics training, and agree to a code of conduct for responsible data use.
    • We make data available on a secure platform—the All of Us research portal—and track the activity of all researchers who use it.
    • We enlist independent reviewers to check our plans and test our systems on an ongoing basis to make sure we have effective security controls in place, responsive to emerging threats.”
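
The “no shortcuts” consent flow described above can be sketched as a tiny state machine: each module must be completed in order, and each requires both a passed comprehension check and an explicit consent before the next one unlocks. This is my own illustrative sketch, not All of Us code, and the module names are hypothetical.

```python
# Illustrative module names; the real program's sections differ.
MODULES = ["data_collection", "biobank_samples", "data_sharing"]

class ConsentFlow:
    """Gated enrollment: watch, pass the quiz, consent -- in order, no skipping."""

    def __init__(self):
        self.completed = []

    def complete_module(self, name: str, quiz_passed: bool, consented: bool) -> bool:
        if len(self.completed) >= len(MODULES):
            return False  # already done
        if name != MODULES[len(self.completed)]:
            return False  # can't jump ahead to a later module
        if not (quiz_passed and consented):
            return False  # must demonstrate understanding AND agree
        self.completed.append(name)
        return True

    def can_enroll(self) -> bool:
        # Enrollment only unlocks after every module is completed in order.
        return self.completed == MODULES

flow = ConsentFlow()
flow.complete_module("data_sharing", True, True)  # rejected: out of order
for m in MODULES:
    flow.complete_module(m, True, True)           # accepted, in sequence
```

The design point is that understanding is enforced structurally, not requested politely: the site’s architecture, like this sketch, simply refuses to let you sign until you’ve been walked through every step.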

The site emphasizes that everything possible will be done to protect your privacy and anonymity, but it is also frank that there is no way of removing all risk, and your final consent requires acknowledging that you understand those limits:

“We are working with top privacy experts and using highly-advanced security tools to keep your data safe. We have several steps in place to protect your data. First, the data we collect from you will be stored on computers with extra security protection. A special team will have clearance to process and track your data. We will limit who is allowed to see information that could directly identify you, like your name or social security number. In the unlikely event of a data breach, we will notify you. You are our partner, and your privacy will always be our top priority.”

The process is thorough, easy to understand, and assures that those who actually sign up know exactly what’s expected from them, what will be done to protect them, and that they may still have some risk.

Why can’t we expect that all IoT product manufacturers will give us a streamlined version of the same process? 


I will be developing consulting services to advise companies that want to develop common-sense, effective, easy-to-implement IoT privacy and security measures. Write me if you’d like to know more.

Why IoT Engineers Need Compulsory Sensitivity Training on Privacy & Security

Posted on 4th April 2018 in AI, data, Essential Truths, Internet of Things, privacy, security

OK, you may say I’m over-sensitive, but a headline today from Google’s blog that others may chuckle about (“Noodle on this: Machine learning that can identify ramen by shop“) left me profoundly worried about some engineers’ tone-deaf insensitivity to growing public concern about privacy and security.

This is not going to be pleasant for many readers, but bear with me — IMHO, it’s important to the IoT’s survival.

As I’ve written before, I learned during my work on corporate crisis management in the ’80s and ’90s that there’s an all-too-frequent gulf between the public and engineers on fear. Engineers, as left-brained and logical as they come (or, in Myers-Briggs lingo, ISTJs: “logical, detached and detailed,” the polar opposite of ENFPs such as me, “caring, creative, quick and impulsive”), are ideally suited for the precision needs of their profession — but often (though not always, I’ll admit…) clueless about how the rest of us respond to things such as the Russian disruption of our sacred political institutions via Facebook, or any of the numerous violations of personal privacy and security that have taken place with IoT devices lacking basic protections.

The situation is bad, and getting worse. In one Pew poll, no more than 16% of Americans felt that any of a wide range of institutions, from companies to government, was protecting their information.

Engineers are quick to dismiss the resulting fear because it isn’t logical. But, as I’ve written before, the fact that fear isn’t logical doesn’t mean it isn’t really real for many people, and it can cloud their thought processes and decision-making.

Even worse, it’s cumulative and can ensnare good companies as well as bad.  After a while, all the privacy and security violations get conflated in their minds.

Exhibit A for this insensitivity? The despicable memo from Facebook VP Andrew Bosworth:

“Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good.”

Eventually he, begrudgingly, apologized, as did Mark Zuckerberg, but IMHO that was just face-saving. Why didn’t anyone at Facebook demand a retraction immediately, and why did some at Facebook get mad not at Bosworth but at anyone who’d leak such information? They and the corporate culture are as guilty as Bosworth in my mind.

So why do I bring up the story about identifying the source of your ramen using AI, which was surely written totally innocently by a Google engineer who thought it would be a cute example of how AI can be applied to a wide range of subjects? It’s because I read it — with my antennae admittedly sharpened by all the recent abuses — as something that might have been funny several years ago but should have gone unpublished now, in light of all the fears about privacy and security. Think of how a lot of the people I try to counsel on technology fears every day would read this fun little project: you mean they now can and will find out where I get my noodles? What the hell else do they know about me, and who will they give that information to???

Again, I’m quite willing to admit I may be over-reacting because of my own horror about the nonchalance on privacy and security, but I don’t think so.

That’s why I’ll conclude this screed with a call for all IoT engineers to undergo mandatory privacy and security training on a continuing basis. The risk of losing consumer confidence in their products and services is simply too great for them to be let off the hook because “that’s not their job.” If you do IoT, privacy and security are part of the job description.

End of sermon. Go about your business.



Great Podcast Discussion of #IoT Strategy With Old Friend Jason Daniels

Right after I submitted my final manuscript for The Future is Smart I had a chance to spend an hour with old friend Jason Daniels (we collaborated on a series of “21st Century Homeland Security Tips You Won’t Hear From Officials” videos back when I was a homeland security theorist) on his “Studio @ 50 Oliver” podcast.

We covered just about every topic I hit in the book, with a heavy emphasis on the attitude shifts (“IoT Essential Truths”) needed to really capitalize on the IoT, and the bleeding-edge concept I introduce at the end of the book, the “Circular Corporation”: departments and individuals (even including your supply chain, distribution network and customers, if you choose) in a continuous, circular management style revolving around a shared real-time IoT hub. Hope you’ll enjoy it!


IoT Design Manifesto 1.0: great starting point for your IoT strategy & products!

Late in the process of writing my forthcoming IoT strategy book, The Future Is Smart, I happened on the “IoT Design Manifesto 1.0” site. I wish I’d found it earlier so I could have featured it more prominently in the book.

The reason is that the manifesto is the product (bear in mind that the original team of participants designed it to be dynamic and iterative, so it will doubtless change over time) of a collaborative process involving both product designers and IoT thought leaders such as the great Rob van Kranenburg. As I’ve written ad nauseam, I think of the IoT as inherently collaborative: sharing data rather than hoarding it can lead to synergistic benefits, and collaborative approaches such as smart cities get their strength from an evolving mishmash of individual actions that becomes progressively more valuable.

From the names, I suspect most of the Manifesto’s authors are European. That’s important, since Europeans seem to be more concerned, on the whole, about IoT privacy and security than their American counterparts, witness the EU-driven “privacy by design” concept, which makes privacy a priority from the beginning of the design process.

At any rate, I was impressed that the manifesto combines both philosophical and economic priorities, and does so in a way that should maximize the benefits and minimize the problems.

I’m going to take the liberty of including the entire manifesto, with my side comments:

  1. WE DON’T BELIEVE THE HYPE. We pledge to be skeptical of the cult of the new — just slapping the Internet onto a product isn’t the answer. Monetizing only through connectivity rarely guarantees sustainable commercial success.
    (Comment: this is like my “just because you can do it doesn’t mean you should” warning: if making a product “smart” doesn’t add real value, why do it?)*
  2. WE DESIGN USEFUL THINGS. Value comes from products that are purposeful. Our commitment is to design products that have a meaningful impact on people’s lives; IoT technologies are merely tools to enable that.
    (Comment: see number 1!)
  3. WE AIM FOR THE WIN-WIN-WIN. A complex web of stakeholders is forming around IoT products: from users, to businesses, and everyone in between. We design so that there is a win for everybody in this elaborate exchange.
    (Comment: This is a big one in my mind, and relates to my IoT Essential Truth #2 — share data, don’t hoard it. When you share IoT data, even with competitors in some cases [think of IFTTT “recipes”], you can create services that benefit customers, companies, and even the greater good, such as reducing global warming.)
  4. WE KEEP EVERYONE AND EVERYTHING SECURE. With connectivity comes the potential for external security threats executed through the product itself, which comes with serious consequences. We are committed to protecting our users from these dangers, whatever they may be.
    (Comment: Amen! As I’ve written ad nauseam, protecting privacy and security must be THE highest IoT priority — see next post below!)
  5. WE BUILD AND PROMOTE A CULTURE OF PRIVACY. Equally severe threats can also come from within. Trust is violated when personal  information gathered by the product is handled carelessly. We build and promote a culture of integrity where the norm is to handle data with care.
    (Comment: See 4!)
  6. WE ARE DELIBERATE ABOUT WHAT DATA WE COLLECT. This is not the business of hoarding data; we only collect data that serves the utility of the product and service. Therefore, identifying what those data points are must be conscientious and deliberate.
    (Comment: this is a delicate issue, because you may find that data that wasn’t originally valuable becomes so as new correlations and links are established. However, just collecting data willy-nilly and depositing it in an unstructured “data lake” for possible later use is asking for trouble if your security is breached.)
  7. WE MAKE THE PARTIES ASSOCIATED WITH AN IOT PRODUCT EXPLICIT. IoT products are uniquely connected, making the flow of information among stakeholders open and fluid. This results in a complex, ambiguous, and invisible network. Our responsibility is to make the dynamics among those parties more visible and understandable to everyone.
    (Comment: see what I wrote in the last post, where I recommended companies spell out their privacy and usage policies in plain language and completely).
  8. WE EMPOWER USERS TO BE THE MASTERS OF THEIR OWN DOMAIN. Users often do not have control over their role within the network of stakeholders surrounding an IoT product. We believe that users should be empowered to set the boundaries of how their data is accessed and how they are engaged with via the product.
    (Comment: consistent with prior points, make sure that any permissions are explicit and opt-in rather than opt-out, to protect users — and yourself. Rather avoid lawsuits? Thought so…)
  9. WE DESIGN THINGS FOR THEIR LIFETIME. Currently physical products and digital services tend to be built to have different lifespans. In an IoT product features are codependent, so lifespans need to be aligned. We design products and their services to be bound as a single, durable entity.
    (Comment: consistent with the emerging circular economy concept, this can be a win-win-win for you, your customer, and the environment. Products that don’t become obsolete quickly but can be upgraded, whether through hardware or software, will delight customers and build their loyalty [remember that if you continue to meet their needs and desires, there’s less incentive for customers to check out competitors and possibly be wooed away!]. Products that you enhance over time, and particularly those you market as services instead of selling, will also stay out of landfills and reduce your production costs.)
  10. IN THE END, WE ARE HUMAN BEINGS. Design is an impactful act. With our work, we have the power to affect relationships between people and technology, as well as among people.  We don’t use this influence to only make profits or create robot overlords; instead, it is our responsibility to use design to help people, communities, and societies  thrive.
    (Comment: yay, designers!!)

I’ve personally signed onto the Manifesto, and I hope to contribute in the future (I’d like something explicit about the environment in it, but who knows), and I urge you to do the same. More important, why start from scratch to come up with your own product design guidelines when you can capitalize on the hard work that’s gone into the Manifesto as a starting point and modify it for your own unique needs?


*BTW: I was contemptuous of the first IoT electric toothbrush I wrote about, but I’ve since talked to a leader in the field who convinced me that it could actually revolutionize the practice of dentistry for the better by providing objective proof that a patient had brushed frequently and correctly. My bad!


“The House That Spied on Me”: Finally Objective Info on IoT Privacy (or Lack Thereof)

Posted on 25th February 2018 in data, Essential Truths, Internet of Things, privacy, security, smart home

Pardon a political analogy: just as the recent indictment of 13 Russians in the horrific bot campaign to undermine our democracy (you may surmise my position on this! The WIRED article about it is a must-read!) finally provided objective information on the plot, so too Kashmir Hill’s and Surya Mattu’s excruciatingly detailed “The House That Spied on Me” finally provides objective information on the critical question of how much personal data IoT device manufacturers are actually compiling from our smart home devices.

This is critical, because we’ve previously had to rely on anecdotal evidence such as the Houston baby-cam scandal, and that’s not adequate for sound government policy making and/or advice to other companies on how to handle the privacy/security issue.

Last year, Hill (who wrote one of the first articles on the danger when she was at Forbes) added just about every smart home device you can imagine to her apartment (I won’t repeat the list: I blush easily…). Then her colleague, Mattu, monitored the outflow from the devices using a special router he created, to which she connected them all:

“… I am basically Kashmir’s sentient home. Kashmir wanted to know what it would be like to live in a smart home and I wanted to find out what the digital emissions from that home would reveal about her. Cybersecurity wasn’t my focus. … Privacy was. What could I tell about the patterns of her and her family’s life by passively gathering the data trails from her belongings? How often were the devices talking? Could I tell what the people inside were doing on an hourly basis based on what I saw?”

The answer was: a lot (I couldn’t paste the chart recording the numbers here, so check the article for the full report)!

As Mattu pointed out, with the router he had access to precisely the data about Hill’s apartment that Comcast could collect and sell because of a 2017 law allowing ISPs to sell customers’ internet usage data without their consent — including the smart device data. The various devices sent data constantly, sometimes even when they weren’t being used! In fact, there hadn’t been a single hour since the router was installed in December when at least some devices didn’t send data — even when no one was home!
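To make concrete what a router in this position can see, here is a minimal sketch (all device names and timestamps are invented for illustration) of how bare connection metadata, with every payload encrypted, can still reveal a household’s routine:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical traffic log a router could record even when every payload
# is encrypted: just a timestamp and which device phoned home.
log = [
    ("2018-01-08 06:30", "sonicare_toothbrush"),
    ("2018-01-08 06:45", "coffee_maker"),
    ("2018-01-08 07:10", "smart_lock"),      # leaving for work?
    ("2018-01-08 22:05", "sonicare_toothbrush"),
    ("2018-01-08 22:20", "bedroom_lights"),  # lights off = bedtime
    ("2018-01-09 06:32", "sonicare_toothbrush"),
]

def activity_by_hour(log):
    """Bucket each device's check-ins by hour of day: a crude routine profile."""
    profile = defaultdict(list)
    for stamp, device in log:
        hour = datetime.strptime(stamp, "%Y-%m-%d %H:%M").hour
        profile[device].append(hour)
    return dict(profile)

profile = activity_by_hour(log)
print(profile["sonicare_toothbrush"])  # brushing times: [6, 22, 6]
```

A few dozen lines like this, fed weeks of metadata, would answer Mattu’s question — “Could I tell what the people inside were doing on an hourly basis?” — without decrypting a single packet.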

BTW: Hill, despite her expertise and manufacturers’ claims of ease of setup, found that configuring all the devices, and especially making them work together, was a nightmare. Among other tidbits about how difficult it was: she had to download 14 different apps! The system also directly violated her privacy, uploading a video of her walking around the apartment nude, recorded by the Withings Home Wi-Fi Security (ahem…) Camera with Air Quality Sensors. Fortunately the offending video was encrypted. Small comfort.

Hill came to realize how convoluted privacy and security can become with a smart home:

“The whole episode reinforced something that was already bothering me: Getting a smart home means that everyone who lives or comes inside it is part of your personal panopticon, something which may not be obvious to them because they don’t expect everyday objects to have spying abilities. One of the gadgets—the Eight Sleep Tracker—seemed aware of this, and as a privacy-protective gesture, required the email address of the person I sleep with to request his permission to show me sleep reports from his side of the bed. But it’s weird to tell a gadget who you are having sex with as a way to protect privacy, especially when that gadget is monitoring the noise levels in your bedroom.”

Mattu reminds us that, even though most of the data was encrypted, even the most basic digital exhaust can give trained experts valuable clues that may build digital profiles of us, whether to target us with ads or for more nefarious purposes:

“It turns out that how we interact with our computers and smartphones is very valuable information, both to intelligence agencies and the advertising industry. What websites do I visit? How long do I actually spend reading an article? How long do I spend on Instagram? What do I use maps for? The data packets that help answer these questions are the basic unit of the data economy, and many more of them will be sent by people living in a smart home.”

Given the concerns about whether Amazon, Google, and Apple are constantly monitoring you through your smart speaker (remember when an Echo was subpoenaed in a murder case?), Mattu reported that:

“… the Echo and Echo Dot … were in constant communication with Amazon’s servers, sending a request every couple of minutes to http://spectrum.s3.amazonaws.com/kindle-wifi/wifistub-echo.html. Even without the “Alexa” wake word, and even when the microphone is turned off, the Echo is frequently checking in with Amazon, confirming it is online and looking for updates. Amazon did not respond to an inquiry about why the Echo talks to Amazon’s servers so much more frequently than other connected devices.”

Even the seemingly most insignificant data can be important:

“I was able to pick up a bunch of insights into the Hill household—what time they wake up, when they turn their lights on and off, when their child wakes up and falls asleep—but the weirdest one for me personally was knowing when Kashmir brushes her teeth. Her Philips Sonicare Connected toothbrush notifies the app when it’s being used, sending a distinctive digital fingerprint to the router. While not necessarily the most sensitive information, it made me imagine the next iteration of insurance incentives: Use a smart toothbrush and get dental insurance at a discount!”

Lest you laugh at that, a dean at the BU Dental School told me much the same thing: that the digital evidence from a smart brush (a Colgate one, in this case) could actually revolutionize dentistry, not only letting your dentist know how well, or not, you brushed, but perhaps lowering your dental insurance premium or affecting the amount your dentist was reimbursed. Who woulda thunk it?

Summing up (there’s a lot of additional important info in the story, especially about the perfidious Vizio Smart TV, which had such a company-weighted privacy policy that the FTC actually forced it to turn off the “feature” and pay reparations, so do read the whole article), Hill concluded:

“I thought the house would take care of me but instead everything in it now had the power to ask me to do things. Ultimately, I’m not going to warn you against making everything in your home smart because of the privacy risks, although there are quite a few. I’m going to warn you against a smart home because living in it is annoying as hell.”

In addition to making privacy and security a priority, there is another simple and essential step smart home (and Quantified Self) device companies must take.

When you open the box for the first time, the first thing you should see must be a prominently displayed privacy and security policy, written in plain (and I mean really plain) English and printed in large, bold type. It should make clear that any data sharing is opt-in and that you have the right to refuse, and it should emphasize the need for detailed, unique passwords (no, “1-2-3-4” or the ever-popular “password” is not enough).

Just to make certain the point is made, it needs to appear at the very beginning of the set-up app as well. Yes, you should also include the detailed legalese in agate type, but the critical points must be made in the basic statement, which needs to be reviewed not just by the lawyers but also by a panel of laypeople, who must also carry out the steps to make sure they’re really easily understood and acted on. This is not just a suggestion: you absolutely must do it, or you risk major penalties and public fury.
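As a small illustration of the password point, here is a hypothetical sketch of the kind of weak-password check a set-up app could run at that first screen; the blocklist below is a tiny invented sample, and a real app should test against large leaked-password lists:

```python
# Tiny illustrative blocklist; a production app should check candidates
# against large corpora of leaked passwords, not this handful of examples.
COMMON_PASSWORDS = {"password", "1234", "12345678", "qwerty", "letmein"}

def password_ok(candidate: str, min_length: int = 12) -> bool:
    """Reject short or well-known passwords during device setup."""
    if len(candidate) < min_length:
        return False
    if candidate.lower() in COMMON_PASSWORDS:
        return False
    return True

print(password_ok("password"))                      # False: on the blocklist
print(password_ok("1234"))                          # False: far too short
print(password_ok("correct horse battery staple"))  # True
```

Enforcing this at setup, rather than burying it in the legalese, is exactly the kind of plain-language, opt-in posture argued for above.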


Clearly, this article gives us the first objective evidence that there’s a lot more to do to assure privacy and security for smart homes (and that there’s also a heck of a lot of room for improvement in how the devices play together!), reaffirming my judgment that the first IoT Essential Truth remains “make privacy and security your highest priority.” If this doesn’t get the focus it deserves, we may lose all the benefits of the IoT because of legitimate public and corporate concern that their secrets are at risk. N.B.!


Liveblogging from Internet of Things Global Summit

Critical Infrastructure and IoT

Robert Metzger, Shareholder, Rogers Joseph O’Donnell 

  • a variety of constraints to direct government involvement in IoT
  • regulators: don’t trust the private sector to do enough, but regulation tends to be prescriptive.
  • NIST can play critical role: standards and best practices, esp. on privacy and security.
  • Comparatively, any company knows more about the potential and liabilities of the IoT than any government body does. That can lead to a bewildering array of IoT regulations that compound the problem.
  • Business model problem: security is expensive, may require more power, and adds little visible functionality, all of which runs against the incentive to get the service out at the lowest price. Need selective regulation and minimum standards. Government should require minimum standards as part of its procurement, but is rarely willing to pay for this.
  • Pending US regulation shows constant tension between regulation and innovation.

             2017 IoT Summit

Gary Butler, CEO, Camgian 

  • Utah cities network embedding sensors.
  • Scalability and flexibility needed. Must be able to interface with constantly improving sensors.
  • Expensive to retrofit sensors on infrastructure.
  • From physical security perspective: cameras, etc. to provide real-time situational awareness. Beyond human surveillance. Add AI to augment human surveillance.
  • “Dealing with ‘data deluge.'”  Example of proliferation of drones. NIST might help with developing standards for this.
  • Battery systems: reducing power consumption & creating energy-dense batteries. Government could help. Government could also be a leader in adoption.

 

Cyber-Criminality, Security and Risk in an IoT World

John Carlin, Chair, Cybersecurity & Technology Program, Aspen Institute

  • Social media involved in most cyberwar attacks & most perps under 21.  They become linked solely by social media.
  • offensive threats far outstrip defenses when it comes to data
  • now we’re connecting billions of things, very vulnerable. Add in driverless cars & threat even greater. Examples: non-encrypted data from pacemakers, and the WIRED Jeep demo.

Belisario Contreras, Cyber Security Program Manager, Organization of American States

  • must think globally.
  • criminals have all the time to prepare, we must respond within minutes.
  • comprehensive approach: broad policy framework in 6 Latin American countries.

Samia Melhem, Global Lead, Digital Development, World Bank

  • projects: she works on telecommunications and transportation investing in government infrastructure in these areas. Most of these governments have been handicapped by lack of funding. Need expert data integrators. Integrating cybersecurity.

Stephen Pattison, VP Public Affairs, ARM

  • (yikes, never thought about this!) cyberterrorist hacks a self-driving car & drives it into a crowd.
  • many cyber-engineers who might go to dark side — why hasn’t this been studied?
  • could we get to the point where IoT devices are certified secure? (But threats continually evolve; upgradeability is critical.)
  • do we need a whistleblower protection?
  • “big data starts with little data”

Session 4: Key Policy Considerations for Building the Cars of Tomorrow – What do Industry Stakeholders Want from Policymakers?

Ken DiPrima, AVP New Product Development, IoT Solutions, AT&T

  • 4-level security approach: emphasis on end-point, locked-down connectivity through SIM, application level …
  • deep in 5-G: how do you leverage it, esp. for cars?
  • connecting 25+ auto OEMs. Lots of trials.

Rob Yates, Co-President, Lemay Yates Associates

  • massive increase in connectivity. What do you do with all the data? Will require massive infrastructure increase.

Michelle Avary, Executive Board, FASTR, VP Automotive, Aeris

  • about 1 Gig of data per car with present cars. Up to 30 with a lot of streaming.
  • don’t need connectivity for a self-driving car: but why not have connectivity? Also important for the vehicle to know and communicate its physical state. Machine learning needs data to progress.
  • people won’t buy vehicles when they are really autonomous — economics won’t support it, will move to mobility as a service.

Paul Scullion, Senior Manager, Vehicle Safety and Connected Automation, Global Automakers

  • emphasis on connected cars, how it might affect ownership patterns.
  • regulatory process slow, but a lot of action on state level. “fear and uncertainty” on state level. Balance of safety and innovation.

Steven Bayless, Regulatory Affairs & Public Policy, Intelligent Transportation Society of America

  • issues: for example, can you get traffic signals to change based on data from cars?
  • car industry doesn’t have lot of experience with collaborative issues.

How Are Smart Cities Being Developed and Leveraged for the Citizen?

Sokwoo Rhee, Associate Director of Cyber-Physical Systems Program, National Institute of Standards and Technology (NIST)

  • NIST GCTC Approach: Smart and Secure Cities. Partnered with Homeland Security to bring in cybersecurity & privacy at the basis of smart city efforts “Smart and Secure Cities and Communities Challenge”

Bob Bennett, Chief Innovation Officer, Kansas City, MO

  • fusing “silos of awesomeness.”
  • 85% of data you need for smart cities already available.
  • “don’t blow up silos, just put windows on them.”
  • downtown: the 53 smartest blocks in the US
  • can now do predictive maintenance on roads
  • Prospect Ave.: neighborhood with worst problems. Major priority.
  • great program involving multiple data sources, to predict and take care of potholes — not only predictive maintenance but also use a new pothole mix that can last 12 years 
  • 122 common factors all cities doing smart cities look at!
  • cities have money for all sorts of previously allocated issues — need to get the city manager, not mayor, to deal with it
  • privacy and security: their private-sector partner has great resources, complemented by the city’s own staff.

Mike Zeto, AVP General Manager, IoT Solutions, AT&T

  • THE AT&T Smart Cities guy. 
  • creating services to facilitate smart cities.
  • energy and utilities are major focus in scaling smart cities, including capital funding. AT&T Digital Infrastructure (done with GE) “iPhone for cities.”
  • work in Miami-Dade that improved public safety, especially in public housing. Similar project in Atlanta.
  • privacy and security: their resources in both have been one of their strengths from the beginning.

Greg Toth, Founder, Internet of Things DC

  • security issues as big as ever
  • smart city collaboration booming
  • smart home stagnating because the early-adopter boom is over and the value is unproven
  • Quantified-Self devices not really taking hold (yours truly was one of very few attendees who said they were still using their devices — you’d have to tear my Apple Watch off).
  • community involvement greater than ever
  • looming problem of maintaining network of sensors as they age
  • privacy & security: privacy and security aren’t top priorities for most startups.

DAY TWO:

IoT TECH TALKS

  • Dominik Schiener, Co-Founder, IOTA, speaking on blockchain
    • working on a version of blockchain for the IoT — big feature is that it’s scalable
    • why do we need it? Data sets shared among all parties. Each can verify the datasets of other participants. Datasets that have been tampered with are excluded.
    • Creates immutable single source of truth.
    • It also facilitates payments, esp. micropayments (even machine to machine)
    • Allows smart contracts. Fully transparent. Smart and trustless escrow.
    • Facilitates “machine economy”
    • Toward “smart decentralization”
    • Use cases:
      • secure car data — VW. Can’t be faked.
      • Pan-European charging stations for EVs. “Give machines wallets”
      • Supply chain tracking — probably 1st area to really adopt blockchain
      • Data marketplace — buy and sell data securely (consumers can become pro-sumers, selling their personal data).
      • audit trail. https://audit-trail.tangle.works
  • DJ Saul, CMO & Managing Director, iStrategyLabs, speaking on IoT, AI and Augmented Reality
    • focusing on marketing uses.
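The tamper-evidence Schiener describes rests on the same primitive as any blockchain: each record carries a hash that covers the previous record’s hash, so altering history invalidates everything after it. A minimal sketch (not IOTA’s actual Tangle protocol, just the hash-chaining idea it shares with conventional blockchains; the car-data example is invented):

```python
import hashlib
import json

def add_record(chain, data):
    """Append a record whose hash covers both its data and the previous hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"data": data, "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps({"data": data, "prev": prev_hash}, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)
    return chain

def verify(chain):
    """Recompute every hash; any tampered record breaks the chain."""
    prev_hash = "0" * 64
    for record in chain:
        expected = hashlib.sha256(
            json.dumps({"data": record["data"], "prev": prev_hash},
                       sort_keys=True).encode()
        ).hexdigest()
        if record["hash"] != expected or record["prev"] != prev_hash:
            return False
        prev_hash = record["hash"]
    return True

chain = []
add_record(chain, {"odometer_km": 42000})  # e.g. secure car data
add_record(chain, {"odometer_km": 42350})

print(verify(chain))                     # True: untouched chain verifies
chain[0]["data"]["odometer_km"] = 10000  # roll back the odometer...
print(verify(chain))                     # False: tampering is detected
```

This is why the VW car-data use case works: any participant can recompute the chain and immediately see that a record has been faked.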

Raising the bar for federal IoT Security – ‘The Internet of Things Cybersecurity Improvement Act’

  • Jim Langevin, Congressman, US House of Representatives
    • very real threat with IoT
    • technology outpacing the law
    • far too many manufacturers don’t make security a priority. Are customers aware?
    • consumers have right to know about protections (or lack thereof)
    • “failure is not an option”
    • need rigorous testing
  • Beau Woods, Deputy Director, Cyber Statecraft Initiative, Atlantic Council
    • intersection of cybersecurity & human condition
    • dependence on connected devices growing faster than our ability to regulate it
    • UL developing certification for medical devices
    • traceability for car parts
  • John Marinho, Vice President Cybersecurity and Technology, CTIA
    • industry constantly evolving global standards — US can’t be isolated.
    • cybersecurity with IoT must be 24/7. CTIA created an IoT working group, meets every two weeks online.
    • believe in public/private partnerships, rather than just regulatory.

Session 9: Meeting the Short and Long-Term Connectivity Requirements of IoT – Approaches and Technologies

  •  Andreas Geiss, Head of Unit ‘Spectrum Policy’, DG CONNECT, European Commission
    • freeing up a lot of spectrum, service neutral
    • unlicensed spectrum, esp. for short-range devices. New frequency bands. New medical device bands. 
    • trying to work with regulators globally to allow for globally-usable devices.
  • Geoff Mulligan, Chairman, LoRa Alliance; Former Presidential Innovation Fellow, The White House
    • wireless tradeoffs: choose two — low power/long distance/high speed.
    • not licensed vs. unlicensed spectrum. Mix of many options, based on open standards, all based on TCP/IP
    • LPWANs:
      • low power wide area networks
      • battery operated
      • long range
      • low cost
      • couple well with satellite networks
    • LoRaWAN
      • LPWAN based on LoRa Radio
      • unlicensed band
      • open standards base
      • openly available
      • open business model
      • low capex and opex: could have covered the entire country for $120M in South Korea
      • IoT is evolutionary, not revolutionary — don’t want to separate it from other aspects of Internet
  • Jeffrey Yan, Director, Technology Policy, Microsoft
    • at Microsoft they see it as critical for a wide range of global issues, including agriculture.
  • Charity Weeden, Senior Director of Policy, Satellite Industry Association
    • IoT critical during disasters
    • total architecture needs to be seamless, everywhere.
  • Andrew Hudson, Head of Technology Policy, GSMA
    • must have secure, scalable networks

Session 10: IoT Data-Ownership and Licencing – Who Owns the Data?

  • Stacey Gray, Policy Lead IoT, Future Privacy Forum 
    • consumer privacy right place to begin.
    • need “rights based” approach to IoT data
    • at this point, you have to show you have actually been harmed by the release of data before you can sue.
  • Patrick Parodi, Founder, The Wireless Registry
    • focus on identity
    • who owns SSID identities? How do you create an identity for things?
  • Mark Eichorn, Assistant Director, Division of Privacy and Identity Protection, Federal Trade Commission 
    • cases involving lead generators for payday loans. Reselling personal financial info.
  • Susan Allen, Attorney-Advisor, Office of Policy and International Affairs, United States Patent & Trademark Office 
    • focusing on copyright.
    • stakeholders have different rights based on roles
  • Vince Jesaitis, Director, US Public Affairs, ARM
    • who owns data depends on what it is. Health data very tough standards. Financial data much more loose.
    • data shouldn’t be treated differently if it comes from a phone or a browser.
    • industrial side: autonomous vehicle data pretty well regulated. Pending legislation dealing with smart cities emphasizes open data.