Could IoT Allow Do-over for Privacy, Security — & Trust?

Posted on 13th September 2013 in communication, management, privacy, security

Expect to be reading a lot here about privacy and security between now and my panel on those issues at the IoT Summit in DC, Oct. 1 & 2, as I prep to ask the panel questions!

Here’s another, from Stacey Higginbotham (BTW, she does a great podcast on IoT issues!), based on a conversation with ARM CTO Mike Muller. It’s reassuring to see that this IoT-leading firm is taking privacy and security seriously. Even more refreshingly, theirs is a nuanced and thoughtful view.

Muller told Higginbotham that IoT vendors should learn from some of the missteps on privacy on the Web so far, and make amends:

“We should think about trust as who has access to your data and what they can do with it. For example, I’ll know where you bought something, when you bought it, how often and who you tweeted about it.

“When you put the long tail of lots of bits of information and big data analytics associated with today’s applications we can discern a lot. And people are not thinking it through. … I think it’s the responsibility of the industry that, as people connect, to make them socially aware of what’s happening with their data and the methods that are in place to make connections between disparate sets of data (my emphasis). In the web that didn’t happen, and the sense of lost privacy proliferated and it’s all out there. People are trying to claw that back and implement privacy after the fact.”

Higginbotham adds that “… what troubles Muller is that today, there’s nothing that supports trust and privacy in the infrastructure associated with the internet of things.”

What struck me, as someone who used to earn his living doing corporate crisis management, is that guilt by association, one of the critical factors in trust (or the lack thereof), may not be logically valid, but it is emotionally powerful: if people’s preconception of IoT privacy and security standards is that they’re simply an extension of Internet ones, there’s likely to be trouble.

She goes on to differentiate between security, privacy — and trust.

“Trust is the easiest to define and the hardest to implement. It relies on both transparency and making an effort to behave consistently ….  When it comes to connected devices and apps, trust is probably most easily gained by explaining what you do with people’s data: what you share and with whom. It might also extend to promises about interoperability and supporting different platforms. Implicitly trust with connected devices also means you will respect people’s privacy and follow the best security practices….

“Privacy is more a construct of place as opposed to something associated with a specific device. So a connected camera on a public street is different from a connected camera inside your home. It’s easy to say that people shouldn’t be able to just grab a feed from inside your home — either from a malicious hack or the government (or a business) doing a random data scrape. But when it comes to newer connected devices like wearables it gets even more murky: Consider that something like a smart meter can share information about the user to someone who knows what to look for.

“So when thinking about the internet of things and privacy, it’s probably useful to start with thinking about the data the device generates….

(As for security:) “To protect privacy when everything is connected will require laws that punish violations of people’s privacy and draw lines that companies and governments can’t step over; but it will also require vigilance by users. To get this right, users should be reading the agreements they click through when they connect a device, but companies should also make those agreements, especially around data sharing, transparent, in a way that inspires trust.

Governments and companies need to think about updating laws for a connected age and set criteria about how different types of data are transported and shared. Health data might still need the HIPAA-levels of regulations, but maybe looser standards can prevail for connected thermostats.”

Sounds to me as if there’s a role in these complex issues for all of us: vendors, government, and users.

But the one take-away that I have from Muller’s remarks is that IoT vendors must realize they have to earn users’ trust, and that will require a combination of technical measures and unambiguous, plain-English communication with users about who owns their data and how it will be used. To me, that means not hiding behind lawyers and agate-type legal disclaimers, but making clear, easy-to-understand declarations about users’ rights to their data and companies’ obligation to ask them directly for access, displayed prominently, with the default being that the user denies access entirely and must opt in for any data to be shared.
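That “default deny, explicit opt-in” model can be sketched in a few lines of code. This is purely illustrative; the class and method names below are hypothetical, not any vendor’s actual API:

```python
# Minimal sketch of a default-deny consent model: no data category may be
# shared until the user explicitly opts in, and consent can be revoked.
# All names here (ConsentRegistry, grant, revoke, is_allowed) are invented
# for illustration.

class ConsentRegistry:
    """Tracks per-user, per-data-category sharing consent."""

    def __init__(self):
        # (user_id, category) pairs the user has explicitly opted into
        self._grants = set()

    def grant(self, user_id, category):
        """User explicitly opts in to sharing this data category."""
        self._grants.add((user_id, category))

    def revoke(self, user_id, category):
        """User withdraws consent; sharing stops immediately."""
        self._grants.discard((user_id, category))

    def is_allowed(self, user_id, category):
        # Default deny: anything not explicitly granted is refused.
        return (user_id, category) in self._grants


if __name__ == "__main__":
    registry = ConsentRegistry()
    print(registry.is_allowed("alice", "location"))  # False: default is deny
    registry.grant("alice", "location")
    print(registry.is_allowed("alice", "location"))  # True: user opted in
    registry.revoke("alice", "location")
    print(registry.is_allowed("alice", "location"))  # False: consent revoked
```

The point of the design is that the burden sits with the company: absent an affirmative grant from the user, every sharing check fails.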

What do you think?

Higginbotham concludes that “we need to stop freaking out about the dangers of connected devices and start having productive discussions about implementing trust and security before the internet of things goes the way of the web. Wonderful, free and a total wild west when it comes to privacy.” Hopefully, that’s what will happen during our October 1st panel.
