IBM picks for IoT trends to watch this year emphasize privacy & security

Last month Bill Chamberlin, the principal analyst for Emerging Tech Trends and Horizon Watch Community Leader for IBM Market Development (hmmm, must have an oversized biz card…), published a list of 20 IoT trends to watch this year that I think provides a pretty good checklist for evaluating what promises to be an important period in which the IoT becomes more mainstream.

It’s interesting to me, especially in light of my recent focus on these topics (I’ll blog on the recent FTC report on the issue in several days), that he put privacy and security at number one on the list, commenting that “Trust and authentication become critical across all elements of the IoT, including devices, the networks, the cloud and software apps.” Amen.

Most of the rest of the list was no surprise, with standards, hardware, software, and edge analytics rounding out the top five (even though it hasn’t gotten a lot of attention, I agree edge analytics are going to be crucial as the volume of sensor data increases dramatically: why pass along the vast majority of the data, most of which is probably redundant, to the cloud, rather than just the deviations from the norm, which are probably more important?).
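To make that edge-analytics point concrete, here’s a minimal sketch (my own illustration in Python, not anything from IBM’s list) of a gateway that keeps a running baseline locally and forwards only readings that deviate sharply from it; the thresholds and the forward_to_cloud() hook are hypothetical placeholders.

```python
from collections import deque

class EdgeFilter:
    """Forward only readings that deviate meaningfully from a running baseline.

    Hypothetical sketch: the window size, threshold, and forward_to_cloud()
    hook are placeholders, not part of any real product described here.
    """

    def __init__(self, window=100, threshold=3.0):
        self.readings = deque(maxlen=window)  # recent readings kept at the edge
        self.threshold = threshold            # how many std-devs counts as "interesting"

    def process(self, value):
        # Keep the local history regardless of whether we forward the reading.
        self.readings.append(value)
        if len(self.readings) < 10:
            return None  # not enough history yet to judge deviation

        mean = sum(self.readings) / len(self.readings)
        variance = sum((x - mean) ** 2 for x in self.readings) / len(self.readings)
        std = variance ** 0.5 or 1e-9

        if abs(value - mean) / std >= self.threshold:
            return {"value": value, "baseline": mean}  # worth sending upstream
        return None  # redundant reading stays at the edge


def forward_to_cloud(event):
    # Placeholder for whatever uplink the real device would use.
    print("forwarding anomaly:", event)


edge = EdgeFilter()
for reading in [20.1, 20.2, 20.0, 20.1, 20.3, 20.2, 20.1, 20.0, 20.2, 20.1, 35.7]:
    event = edge.process(reading)
    if event:
        forward_to_cloud(event)
```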

Two items dealing with sensors did catch my eye:

“9. Sensor fusion: Combining data from different sources can improve accuracy. Data from two sensors is better than data from one. Data from lots of sensors is even better.

10. Sensor hubs: Developers will increasingly experiment with sensor hubs for IoT devices, which will be used to offload tasks from the application processor, cutting down on power consumption and improving battery life in the devices.”

Both make a lot of sense.
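Item 9 is easy to illustrate. Here’s a minimal, hypothetical sketch of inverse-variance weighting, one common way to fuse readings so that the combined estimate is more accurate than any single sensor (the sensor values and noise figures are invented for the example):

```python
def fuse(readings):
    """Inverse-variance weighted fusion of (value, variance) pairs.

    A noisier sensor (larger variance) gets a smaller weight, so the fused
    estimate is better than any individual reading.
    """
    weights = [1.0 / var for _, var in readings]
    fused_value = sum(w * v for w, (v, _) in zip(weights, readings)) / sum(weights)
    fused_variance = 1.0 / sum(weights)  # always <= the smallest input variance
    return fused_value, fused_variance


# Hypothetical example: three temperature sensors with different noise levels.
sensors = [(21.3, 0.5), (20.9, 0.2), (22.0, 1.0)]
value, variance = fuse(sensors)
print(f"fused estimate: {value:.2f} (variance {variance:.3f})")
```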

One was particularly noteworthy in light of my last post, about the Gartner survey showing most companies were ill-prepared to plan and launch IoT strategies: “14.  Chief IoT Officer: Expect more senior level execs to be put in place to build the enterprise-wide IoT strategy.” Couldn’t agree more that this is vital!

Check out the whole list: I think you’ll find it helpful in tracking this year’s major IoT developments.


The #IoT Can Kill You! Got Your Attention? Car Security a Must

The Internet of Things can kill you.

Got your attention? OK, maybe this is the wake-up call the IoT world needs to make certain that privacy and security are baked in, not just afterthoughts.

I’ve blogged before about how privacy and security must be Job 1, but now it’s in the headlines because of a new report by our Mass. Senator, Ed Markey (political aside: thanks, Ed, for more than 30 years of leadership — frequently as a voice crying in the wilderness — on the policy implications of telecom!), “Tracking & Hacking: Security & Privacy Gaps Put American Drivers at Risk,” about the dangers of not taking these issues seriously when it comes to smart cars.

I first became concerned about this issue when reading “Look Out, He’s Got a Phone!” (my personal nominee for all-time most wry IoT headline…), a litany of all sorts of horrific things that are proven risks of unencrypted automotive data, such as spoofing the low tire-pressure warning on your car so you’ll pull over and the Bad Guys can get you, or making the car stop dead at 70 mph. All too typical was the reaction of Schrader Electronics, which makes the tire sensors:

“Schrader Electronics, the biggest T.P.M.S. manufacturer, publicly scoffed at the Rutgers–South Carolina report. Tracking cars by tire, it said, is ‘not only impractical but nearly impossible.’ T.P.M.S. systems, it maintained, are reliable and safe.

“This is the kind of statement that security analysts regard as an invitation. A year after Schrader’s sneering response, researchers from the University of Washington and the University of California–San Diego were able to ‘spoof’ (fake) the signals from a tire-pressure E.C.U. by hacking an adjacent but entirely different system—the OnStar-type network that monitors the T.P.M.S. for roadside assistance. In a scenario from a techno-thriller, the researchers called the cell phone built into the car network with a message supposedly sent from the tires. ‘It told the car that the tires had 10 p.s.i. when they in fact had 30 p.s.i.,’ team co-leader Tadayoshi Kohno told me—a message equivalent to ‘Stop the car immediately.’ He added, ‘In theory, you could reprogram the car while it is parked, then initiate the program with a transmitter by the freeway. The car drives by, you call the transmitter with your smartphone, it sends the initiation code—bang! The car locks up at 70 miles per hour. You’ve crashed their car without touching it.’”

Hubris: it’ll get you every time….

So now Senator Markey lays out the full scope of this issue, and it should scare the daylights out of you — and, hopefully, Detroit! The report is compiled from responses by 16 car companies (BMW, Chrysler, Ford, General Motors, Honda, Hyundai, Jaguar Land Rover, Mazda, Mercedes-Benz, Mitsubishi, Nissan, Porsche, Subaru, Toyota, Volkswagen (with Audi), and Volvo — hmm: one that didn’t respond was Tesla, which I suspect [just a hunch] really has paid attention to this issue because of its techno leadership) to letters Markey sent in late 2013. Here are the damning highlights from his report:

“1. Nearly 100% of cars on the market include wireless technologies that could pose vulnerabilities to hacking or privacy intrusions.

2. Most automobile manufacturers were unaware of or unable to report on past hacking incidents.

3. Security measures to prevent remote access to vehicle electronics are inconsistent and haphazard across all automobile manufacturers, and many manufacturers did not seem to understand the questions posed by Senator Markey.

4. Only two automobile manufacturers were able to describe any capabilities to diagnose or meaningfully respond to an infiltration in real-time, and most say they rely on technologies that cannot be used for this purpose at all. (my emphasis)

5. Automobile manufacturers collect large amounts of data on driving history and vehicle performance.

6. A majority of automakers offer technologies that collect and wirelessly transmit driving history data to data centers, including third-party data centers, and most do not describe effective means to secure the data.

7. Manufacturers use personal vehicle data in various ways, often vaguely to “improve the customer experience” and usually involving third parties, and retention policies – how long they store information about drivers – vary considerably among manufacturers.

8. Customers are often not explicitly made aware of data collection and, when they are, they often cannot opt out without disabling valuable features, such as navigation.”

In short, the auto industry collects a lot of information about us, and doesn’t have a clue how to manage or protect it.

I’ve warned repeatedly that one of the issues technologists don’t really understand, and/or scoff at, is public fear about privacy and security. Based on my prior work in crisis management, that can be costly — or fatal.

This report should serve as a bit of electroshock therapy to get them (and here I’m referring not just to auto makers but to all IoT technologists: it’s called guilt by association, and most people tend to conflate fears, not discriminate between them. Unless everyone in the IoT takes privacy and security seriously, everyone may suffer the result [see below]) to realize that it’s not OK, as one of the speakers at the Wearables + Things conference said, that “we’ll get to privacy and security later.” It’s got to be a priority from the get-go (more about this in a forthcoming post, where I’ll discuss the recent FTC report on the issue).

I’ve got enough to worry about behind the wheel, since the North American Deer Alliance is out to get me. Don’t make me worry about false tire pressure readings.


PS: There’s another important issue here that may be obscured: the very connectedness that is such an important aspect of the IoT. Remember that the researchers spoofed the T.P.M.S. not through a frontal assault, but by attacking the roadside assistance system? It’s like the way Target’s computers were hacked via a small company doing HVAC maintenance. Moral of the story? No IoT system is safe unless all the ones linking to it are safe. For want of a nail … the kingdom was lost!

Resolved: That 2015 Is When Privacy & Security Become #IoT Priority!

I’m a right-brained, intuitive type (ENFP, if you’re keeping Myers-Briggs score…), and sometimes that pays off on issues involving technology & the general public, especially when the decidedly non-technical, primal issue of FEAR comes into the equation.

I used to do a lot of crisis management work with Fortune 100 companies, and usually worked with engineers, 95% of whom are my direct opposite: ISTJ. Because they are so left-brained, rational, and analytical, it used to drive them crazy that the public would be so fearful of various situations, because people’s reactions were just so darned irrational!

I’m convinced that same split is a looming, and extremely dangerous, problem for the Internet of Things: the brilliant engineers who bring us all these great platforms, devices, and apps just can’t believe that people could be fraidy cats.

Let me be blunt about it, IoT colleagues: get used to dealing with people’s fears. Wise up, because that fear might just screw the IoT before it really gains traction. Just because a reaction is irrational doesn’t mean it isn’t very, very real to those who feel it, and they might just shun your technology and/or demand draconian regulations to enforce privacy and security standards.

That’s why I was so upset at a remark by some bright young things at the recent Wearables + Things conference. When asked about privacy and security precautions (a VERY big thing with people, since it’s their very personal bodily data that’s at risk) for their gee-whiz device, they blithely said that they were just a start-up, and they’d get to security issues after they had the device technology squared away.

WRONG, KIDS: security and privacy protections have to be a key priority from the get-go.

That’s why I was pleased to see that CES asked FTC Chair Edith Ramirez to give opening remarks at a panel on security last week, and she specifically focused on “privacy by design,” where privacy protections are baked into the product from the get-go. She emphasized that start-ups can’t get off the hook:

“‘Any device that is connected to the Internet is at risk of being hijacked,’ said Ms. Ramirez, who added that the large number of Internet-connected devices would ‘increase the number of access points’ for hackers.

Ms. Ramirez seemed to be directing her remarks at the start-ups that are making most of the products — like fitness trackers and glucose monitors — driving the so-called Internet of Things.

She said that some of these developers, in contrast to traditional hardware and software makers, ‘have not spent decades thinking about how to secure their products and services from hackers.'”

I yield to no one in my love of serendipitous discoveries of data’s value (such as the breakthrough in early diagnosis of infections in neonates by researchers from IBM and Toronto’s Hospital for Sick Children), but I think Ms. Ramirez was on target about IoT developers forcing themselves to emphasize minimization of data collection, especially when it comes to personal data:

“Beyond security, Ms. Ramirez said that technology companies needed to pay more attention to so-called data minimization, in which they collect only the personal data they need for a specific purpose and delete it permanently afterward. She directly challenged the widespread contention in the technology industry that it is necessary to collect large volumes of data because new uses might be uncovered.

‘I question the notion that we must put sensitive consumer data at risk on the off chance a company might someday discover a valuable use for the information,’ she said.

She also said that technology companies should be more transparent about the way they use personal data and should simplify their terms of use.”
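Data minimization is straightforward to express in code. Here’s a hypothetical sketch of the practice Ms. Ramirez describes: collect only the fields needed for a stated purpose and permanently delete records once that purpose has been served (the field names and retention window are invented for illustration):

```python
from datetime import datetime, timedelta

# Only the fields actually needed for the stated purpose are ever stored.
ALLOWED_FIELDS = {"step_count", "timestamp"}   # hypothetical purpose: daily activity totals
RETENTION = timedelta(days=7)                  # delete once the purpose has been served

_store = []

def collect(raw_reading):
    """Strip everything except the fields needed for the stated purpose."""
    minimized = {k: v for k, v in raw_reading.items() if k in ALLOWED_FIELDS}
    _store.append(minimized)

def purge(now=None):
    """Permanently drop records older than the retention window."""
    now = now or datetime.utcnow()
    _store[:] = [r for r in _store if now - r["timestamp"] < RETENTION]

# Usage: location and heart rate are seen by the device but never retained.
collect({"step_count": 4200, "timestamp": datetime.utcnow(),
         "gps": (42.36, -71.06), "heart_rate": 74})
purge()
```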

Watch for a major IoT privacy pronouncement soon from the FTC.

It’s gratifying that, in addition to the panel Ms. Ramirez introduced, CES also had an (albeit small…) area for privacy vendors. As the WaPo reported, part of the reason for this area is that the devices and apps are aimed at you and me, because “consumers are finding — thanks to the rise in identity theft, hacks and massive data breaches — that companies aren’t always good stewards for their information.” Dealing with privacy breaches is everyone’s business: companies, government, and you and me!

As the WaPo reporter concluded: “The whole point of the privacy area, and of many of the products being shown there, is that technology and privacy don’t have to fight. They can actually help each other. And these exhibitors — the few, the proud, the private — are happy to be here, preaching that message.”

So, let’s all resolve that 2015 is when privacy and security become as big an IoT priority as innovation!


Oh, before I forget, it’s time for my gratuitous reference, whenever I discuss IoT privacy and security, to Gen. David Petraeus (yes, the very General “Do As I Say, Not As I Do” Petraeus who faces possible federal felony charges for leaking classified documents to his lover/biographer), who was quite enamored of the IoT when he directed the CIA. That should give you pause, no matter whether you’re an IoT user, producer, or regulator!

IoT Security After “The Interview”

Posted on 22nd December 2014 in defense, Internet of Things, M2M, management, privacy, security, US government

Call me an alarmist, but in the wake of the “Interview” catastrophe (that’s how I see it in terms of both the First Amendment AND asymmetrical cyberwarfare), I see this as a clarion call to the #IoT industry to redouble efforts to make both security AND privacy Job #1.

Here’s the deal: if we want to enhance more and more parts of our governmental, commercial, and private lives with clever IoT devices and the apps that control them, then there’s an undeniable quid pro quo: we MUST make those devices and apps as secure as possible.

I remember some bright young entrepreneurs speaking at a recent wearables conference, where they apologized for not having paid attention to privacy and security yet, saying they’d get to it early next year.

Nope.

Unacceptable.

Security must be built in from the beginning, and constantly upgraded as new threats emerge.  I used to be a corporate crisis manager, and one of the things that was so hard to convince left-brained, extremely rational engineers about was that just because fears were irrational didn’t mean they weren’t real — even the perception of insecure IoT devices and apps has the potential to kill the whole industry, or, as Vanity Fair‘s apocalyptic “Look Out, He’s Got a Phone” article documented, it could literally kill us. As in deader than a doornail.

This incident should have convinced us all that there are some truly evil people out there fixated on bringing us to our collective knees, and they have the tech savvy to do it, using tools such as Shodan. ‘Nuff said?

PS: Here’s what Mr. Cybersecurity, Bruce Schneier, has to say on the subject. Read carefully.


My #IoT predictions for 2015

I was on a live edition of “Coffee Break With Game-Changers” a few hours ago with panelists Sherryanne Meyer of Air Products and Chemicals and Sven Denecken of SAP, talking about tech projections for 2015.

Here’s what I said about my prognostications:

“I predict that 2015 will be the year that the Internet of Things penetrates consumer consciousness — because of the Apple Watch. The watch will unite both health and smart home apps and devices, and that will mean you’ll be able to access all that usability just by looking at your watch, without having to fumble for your phone and open a specific app.

If Apple chooses to share the watch’s API on the IFTTT – If This Then That – site, the watch’s adoption – and usability – will go into warp speed. We won’t have to wait for Apple or developers to come up with novel ways of using the watch and the related devices — makers and just plain folks using IFTTT will contribute their own “recipes” linking them. This “democratization of data” is one of the most powerful – and under-appreciated – aspects of the IoT. In fact, Sherryanne, I think one of the most interesting IoT strategy questions for business is going to be that we now have the ability to share real-time data with everyone in the company who needs it – and even with supply chain and distribution networks – and we’ll start to see some discussion of how we’ll have to change management practices to capitalize on this instant ability to share.

(Sven will be interested in this one) In 2015, the IoT is also going to speed the development of fog computing, where the vast quantities of data generated by the IoT will mean a switch to processing data “at the edge,” and only passing on relevant data to the cloud, rather than overwhelming it with data – most of which is irrelevant.

In 2015 the IoT is also going to become more of a factor in the manufacturing world. The success of GE’s Durathon battery plant and German “Industry 4.0” manufacturers such as Siemens will mean that more companies will develop incremental IoT strategies, where they’ll begin to implement things such as sensors on the assembly line to allow real-time adjustments, then build on that familiarity with the IoT to eventually bring about revolutionary changes in every aspect of their operations.

2015 will also be the year when we really get serious about IoT security and privacy, driven by the increasing public concern about the erosion of privacy. I predict that if anything can hold back the IoT at this point, it will be failure to take privacy and security seriously. The public trust is extremely fragile: if even some fledgling startup is responsible for a privacy breach, the public will tend to tar the entire industry with the same brush, and that could be disastrous for all IoT firms. Look for the FTC to start scrutinizing IoT claims and levying more fines for insufficient security.”

What’s your take on the year ahead? Would love your comments!


Internet of Things interview I did with Jordan Rich

Didn’t realize this had run several weeks ago, but here’s an introduction to the IoT (based on my SAP “Managing the Internet of Things” i-guide) that I did with Jordan Rich of WBZ Radio, who’s also my voice-over mentor.  The examples include the GE Durathon battery plant, “smart aging,” Shodan, the SAP prototype smart vending machine and Ivee. Enjoy!


In case you missed it, great panel today on the IoT and government

Posted on 19th March 2014 in government, Internet of Things, US government

In case you missed it, old friend Christopher Dorobek put together a great (in all modesty, LOL …) panel today for his “DorobekINSIDER” series on GovLoop about how the Internet of Things will transform government.  I’ll try to summarize it in a later post, but you can listen in here!


Here’s where I draw the IoT privacy line: social sensing badges

Posted on 5th November 2013 in Internet of Things, management, privacy

Yikes!

I had the same reaction to this story by the Boston Globe‘s Scott Kirsner (“Is this a management breakthrough, or Big Brother in the workplace?” — sorry, no linkie: it only appears to be available through the subscribers’ archive) that a lot of people did to the story about the hacked, unencrypted baby monitor: this is the Internet of Things run amok.

Sociometric Badge

It seems that a local firm, Sociometric Solutions,  has come up with a “social sensing badge” that employees would wear around their necks. According to the firm’s CEO, Ben Waber, before long “every employee ID badge will have sensors in it.” Holy George Orwell!

As Kirsner said, “You might call it the NSA style of management.” My thoughts exactly.

Here’s how this demonic gizmo works:

“…the badges rely on infrared sensors to know when you are clustered with other people in a meeting or conversation. While they don’t record conversations, they capture data about how often you talk versus listen, how frequently you interrupt people, and your tone of voice.” (my emphasis)

This is supposed to lead to a more humane workplace, one that “… will enable companies to try different approaches to office design, corporate hierarchies, and perhaps even work schedules.”

Baloney!

I’m reminded of a story a friend tells. He had a very talented employee who was anti-social, and frequently would work in the middle of the night, even sleep at his desk. Unconventional, but absolutely essential to the department. How long do you think he’d last after wearing one of these badges? Turn in your sociometric badge as you pick up your last check, anyone with ADHD or Asperger’s — and probably a lot of others who wouldn’t fit some manager’s preconception of the ideal employee!

According to workplace consultant Alexandra LaMaster, of OrgSpeed:

“When there’s trust between an employer and employee, and they see that you’re moving people around because you want more communication across departments, or to achieve some kind of business result, that’s one thing. If there’s a lack of trust, people might feel they’re being policed.”

I’ve seen far too many dysfunctional workplaces — particularly in low-status companies such as retailers — to subscribe to the idealized view of how this device could be used. As far as I’m concerned, the sociometric badge is one example of technologists (IMHO, shame on MIT Prof. Sandy Pentland, who is a co-founder and chairman of the company’s board, and who I’d always counted among the IoT Good Guys) who get the idea that because you can do something, you should do it.

You shouldn’t.

What do you think?

Tweeting the IoT Summit!

Posted on 1st October 2013 in government, Internet of Things, M2M, privacy, security

I Tweeted throughout the IoT Summit today, cryptic as the comments may have been. You can check them out at @data4all.  Learned a great deal, and picked up several nice examples for the e-book I’m writing on implications for corporate management of the IoT!

Enjoy.  Will do the same tomorrow!


Could IoT Allow Do-over for Privacy, Security — & Trust?

Posted on 13th September 2013 in communication, management, privacy, security

Expect to be reading a lot here about privacy and security between now and my panel on those issues at the IoT Summit in DC, Oct. 1 & 2, as I prep to ask the panel questions!

Here’s another, from Stacey Higginbotham (BTW, she does a great podcast on IoT issues!), based on a conversation with ARM CTO Mike Muller. It’s reassuring to see that this IoT-leading firm is taking privacy and security seriously. Even more refreshingly, theirs is a nuanced and thoughtful view.

Muller told Higginbotham that IoT vendors should learn from some of the missteps on privacy on the Web so far, and make amends:

“’We should think about trust as who has access to your data and what they can do with it. For example, I’ll know where you bought something, when you bought it, how often and who did you tweet about it.

“When you put the long tail of lots of bits of information and big data analytics associated with today’s applications we can discern a lot. And people are not thinking it through. … I think it’s the responsibility of the industry that, as people connect, to make them socially aware of what’s happening with their data and the methods that are in place to make connections between disparate sets of data (my emphasis). In the web that didn’t happen, and the sense of lost privacy proliferated and it’s all out there. People are trying to claw that back and implement privacy after the fact.”

Higginbotham adds that “… what troubles Muller is that today, there’s nothing that supports trust and privacy in the infrastructure associated with the internet of things.”

What struck me, as someone who used to earn his living doing corporate crisis management, is that one of the critical issues in trust (or lack thereof) is guilt by association: it may not be logically valid, but it is emotionally powerful. If people’s preconception of IoT privacy and security standards is that they’re simply an extension of Internet ones, there’s likely to be trouble.

She goes on to differentiate between security, privacy — and trust.

“Trust is the easiest to define and the hardest to implement. It relies on both transparency and making an effort to behave consistently ….  When it comes to connected devices and apps, trust is probably most easily gained by explaining what you do with people’s data: what you share and with whom. It might also extend to promises about interoperability and supporting different platforms. Implicitly trust with connected devices also means you will respect people’s privacy and follow the best security practices….

“Privacy is more a construct of place as opposed to something associated with a specific device. So a connected camera on a public street is different from a connected camera inside your home. It’s easy to say that people shouldn’t be able to just grab a feed from inside your home — either from a malicious hack or the government (or a business) doing a random data scrape. But when it comes to newer connected devices like wearables it gets even more murky: Consider that something like a smart meter can share information about the user to someone who knows what to look for.

“So when thinking about the internet of things and privacy, it’s probably useful to start with thinking about the data the device generates….

(As for security:) “To protect privacy when everything is connected will require laws that punish violations of people’s privacy and draw lines that companies and governments can’t step over; but it will also require vigilance by users. To get this right, users should be reading the agreements they click through when they connect a device, but companies should also make those agreements, especially around data sharing, transparent, in a way that inspires trust.

Governments and companies need to think about updating laws for a connected age and set criteria about how different types of data are transported and shared. Health data might still need the HIPAA-levels of regulations, but maybe looser standards can prevail for connected thermostats.”

Sounds to me as if there’s a role in these complex issues for all of us: vendors, government, and users.

But the one take-away that I have from Muller’s remarks is that IoT vendors must realize they have to earn users’ trust, and that’s going to require a combination of technical measures and unambiguous, plain-English communication with users about who owns their data and how it will be used. To me, that means not hiding behind the lawyers and agate-type legal disclaimers, but clear, easy-to-understand declarations about users’ rights to their data and companies’ need to directly ask them for access, displayed prominently, with the default being that the user completely denies access and must opt in for it to be shared.
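That deny-by-default, opt-in model is easy to express in code. Here’s a minimal sketch (my own illustration, with hypothetical names like ConsentStore) of a permission check that refuses to share anything unless the user has explicitly granted it:

```python
class ConsentStore:
    """Deny-by-default consent: data is shared only after an explicit opt-in."""

    def __init__(self):
        self._grants = {}  # (user_id, purpose) -> True only after explicit opt-in

    def opt_in(self, user_id, purpose):
        self._grants[(user_id, purpose)] = True

    def opt_out(self, user_id, purpose):
        self._grants.pop((user_id, purpose), None)

    def may_share(self, user_id, purpose):
        # Anything not explicitly granted is denied, including unknown purposes.
        return self._grants.get((user_id, purpose), False)


consents = ConsentStore()
print(consents.may_share("alice", "third_party_analytics"))  # False: default is deny
consents.opt_in("alice", "third_party_analytics")
print(consents.may_share("alice", "third_party_analytics"))  # True: only after opting in
```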

What do you think?

Higginbotham concludes that “we need to stop freaking out about the dangers of connected devices and start having productive discussions about implementing trust and security before the internet of things goes the way of the web. Wonderful, free and a total wild west when it comes to privacy.” Hopefully, that’s what will happen during our October 1st panel.
