Could IoT Allow Do-over for Privacy, Security — & Trust?

Posted on 13th September 2013 in communication, management, privacy, security

Expect to be reading a lot here about privacy and security between now and my panel on those issues at the IoT Summit in DC, Oct. 1 & 2, as I prep to ask the panel questions!

Here’s another, from Stacey Higginbotham (BTW, she does a great podcast on IoT issues!), based on a conversation with ARM CTO Mike Muller. It’s reassuring to see that this IoT-leading firm is taking privacy and security seriously. Even more refreshingly, theirs is a nuanced and thoughtful view.

Muller told Higginbotham that IoT vendors should learn from some of the missteps on privacy on the Web so far, and make amends:

“We should think about trust as who has access to your data and what they can do with it. For example, I’ll know where you bought something, when you bought it, how often and who did you tweet about it.

“When you put the long tail of lots of bits of information and big data analytics associated with today’s applications we can discern a lot. And people are not thinking it through. … I think it’s the responsibility of the industry that, as people connect, to make them socially aware of what’s happening with their data and the methods that are in place to make connections between disparate sets of data (my emphasis). In the web that didn’t happen, and the sense of lost privacy proliferated and it’s all out there. People are trying to claw that back and implement privacy after the fact.”

Higginbotham adds that “… what troubles Muller is that today, there’s nothing that supports trust and privacy in the infrastructure associated with the internet of things.”

What struck me, as someone who used to earn his living doing corporate crisis management, is that guilt by association is one of the critical issues in trust (or the lack thereof): it may not be logically valid, but it is emotionally powerful. If people’s preconception of IoT privacy and security standards is that they’re simply an extension of Internet ones, there’s likely to be trouble.

She goes on to differentiate between security, privacy — and trust.

“Trust is the easiest to define and the hardest to implement. It relies on both transparency and making an effort to behave consistently ….  When it comes to connected devices and apps, trust is probably most easily gained by explaining what you do with people’s data: what you share and with whom. It might also extend to promises about interoperability and supporting different platforms. Implicitly trust with connected devices also means you will respect people’s privacy and follow the best security practices….

“Privacy is more a construct of place as opposed to something associated with a specific device. So a connected camera on a public street is different from a connected camera inside your home. It’s easy to say that people shouldn’t be able to just grab a feed from inside your home — either from a malicious hack or the government (or a business) doing a random data scrape. But when it comes to newer connected devices like wearables it gets even more murky: Consider that something like a smart meter can share information about the user to someone who knows what to look for.

“So when thinking about the internet of things and privacy, it’s probably useful to start with thinking about the data the device generates….

(As for security:) “To protect privacy when everything is connected will require laws that punish violations of people’s privacy and draw lines that companies and governments can’t step over; but it will also require vigilance by users. To get this right, users should be reading the agreements they click through when they connect a device, but companies should also make those agreements, especially around data sharing, transparent in a way that inspires trust.

Governments and companies need to think about updating laws for a connected age and set criteria about how different types of data are transported and shared. Health data might still need the HIPAA-levels of regulations, but maybe looser standards can prevail for connected thermostats.”

Sounds to me as if there’s a role in these complex issues for all of us: vendors, government, and users.

But my one take-away from Muller’s remarks is that IoT vendors must realize they have to earn users’ trust, and that will require a combination of technical measures and unambiguous, plain-English communication with users about who owns their data and how it will be used. To me, that means not hiding behind the lawyers and agate-type legal disclaimers, but making clear, easy-to-understand declarations about users’ rights to their data and about companies’ need to ask them directly for access, displayed prominently, with the default being that the user denies all access and must opt in for anything to be shared.
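To make the opt-in, deny-by-default idea concrete, here is a minimal sketch of what such a policy could look like in code. This is purely illustrative: the `ConsentRegistry` class and its method names are my own invention, not any vendor’s actual API, and a real system would need persistence, auditing, and authentication on top of it.

```python
class ConsentRegistry:
    """Tracks per-user, per-purpose data-sharing consent.

    The key property is the default: if a user has never opted in,
    access is denied. Sharing only happens after an explicit opt-in,
    and the user can revoke it at any time.
    """

    def __init__(self):
        # Set of (user_id, purpose) pairs the user has explicitly opted into.
        self._grants = set()

    def opt_in(self, user_id, purpose):
        """Record the user's explicit consent for one purpose."""
        self._grants.add((user_id, purpose))

    def opt_out(self, user_id, purpose):
        """Revoke consent; safe to call even if never granted."""
        self._grants.discard((user_id, purpose))

    def is_allowed(self, user_id, purpose):
        """Deny unless the user explicitly opted in (default deny)."""
        return (user_id, purpose) in self._grants


if __name__ == "__main__":
    registry = ConsentRegistry()
    # No opt-in yet, so sharing is denied by default.
    print(registry.is_allowed("alice", "marketing"))   # False
    registry.opt_in("alice", "marketing")
    print(registry.is_allowed("alice", "marketing"))   # True
```

The design choice worth noticing is that there is no “deny” record at all: absence of consent *is* denial, so a bug that loses a record fails closed rather than open.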

What do you think?

Higginbotham concludes that “we need to stop freaking out about the dangers of connected devices and start having productive discussions about implementing trust and security before the internet of things goes the way of the web. Wonderful, free and a total wild west when it comes to privacy.” Hopefully, that’s what will happen during our October 1st panel.

Good Paper by Mercatus on IoT Privacy and Security

Posted on 12th September 2013 in privacy, security

I’m politically on the liberal, not the libertarian side, but I’ve come to respect the libertarian Mercatus Center, in large part because of the great work Jerry Brito has done there on governmental transparency.

As part of my preparation to moderate a panel on security and privacy at the IoT Summit on October 1st in DC, I just read a great paper on the issue by Mercatus’ Adam Thierer.

In comments submitted to the FTC for its November workshop on these issues titled “Privacy and Security Implications of the Internet of Things,” Thierer says “whoa” to those who would have the FTC and others quickly impose regulations on the IoT in the name of protecting privacy and security.

Opposing pre-emptive, “precautionary” regulations, he instead argues for holding back:

“… an ‘Anti-Precautionary Principle’ is the better default here and would generally hold that:

“1. society is better off when technological innovation is not preemptively restricted;

“2. accusations of harm and calls for policy responses should not be premised on hypothetical worst-case scenarios; and

“3. remedies to actual harms should be narrowly tailored so that beneficial uses of technology are not derailed.”

He reminds us that, when introduced, such everyday technologies as the phone (you know, the old on-the-wall kind…) and photography were opposed by many as invasions of privacy, but social norms quickly adapted to embrace them. He quotes Larry Downes, who has written, “After the initial panic, we almost always embrace the service that once violated our visceral sense of privacy.”

Rather than imposing limits in advance, Thierer argues for a trial-and-error approach to avoid unnecessary limits to experimentation — including learning from mistakes.

He points out that social norms often emerge that can substitute for regulations to govern acceptable use of the new technology.

In conclusion, Thierer reminds us that a wide range of laws and regulations are already on the books that, by extension, could apply to some of the recent IoT outrages:

“…  many federal and state laws already exist that could address perceived harms in this context. Property law already governs trespass, and new court rulings may well expand the body of such law to encompass trespass by focusing on actual cases and controversies, not merely imaginary hypotheticals. State ‘peeping Tom’ laws already prohibit spying into individual homes. Privacy torts—including the tort of intrusion upon seclusion—may also evolve in response to technological change and provide more avenues of recourse to plaintiffs seeking to protect their privacy rights.”

Along the lines of my continuing screed that IoT manufacturers had better take action immediately to tighten their own privacy and security precautions, Thierer isn’t letting them off the hook:

“The public will also expect the developers of IoT technologies to offer helpful tools and educational methods for controlling improper usages. This may include ‘privacy-by-design’ mechanisms that allow the user to limit or intentionally cripple certain data collection features in their devices. ‘Only by developing solutions that are clearly respectful of people’s privacy, and devoting an adequate level of resources for disseminating and explaining the technology to the mass public’ can industry expect to achieve widespread adoption of IoT technologies.”

So get cracking, you lazy IoT developers (yes, you smirking over there in the corner…) who think that security and privacy are someone else’s business: if you don’t act, regulators may step in and stifle innovation in the name of consumer protection. You’ll have no one to blame but yourselves.

It’s a good read — hope you’ll check it out!

 

The Hill Publishes Op-Ed on IoT Security and Privacy

Posted on 11th September 2013 in privacy, security, US government

Earlier this week, The Hill, the highly respected Capitol Hill newspaper, published an op-ed co-authored by Chris Rezendes of INEX Advisors and me on the ever-important topic of IoT privacy and security (or lack thereof!).

In it, we warned that “on the heels of the NSA scandal, news of security problems’ threat to privacy may cripple the IoT before it achieves its promise.”

We went on to explain that:

“The record on security and privacy is not reassuring.

“The Obama administration has almost entirely ignored the Internet of Things (by contrast, it’s frequently mentioned by the Chinese leadership, which has invested massive amounts in the technology). The president has never mentioned it, and the FTC is the only federal agency that has begun to protect IoT privacy and security.”

We called for public-private collaboration to make IoT security and privacy a priority:

“Individual companies must make privacy and security a priority. Opaque user agreements such as Facebook’s, letting the service provider remarket or redeploy user data, won’t be acceptable. A recent INEX study of one multi-billion-dollar industrial market revealed that 96 percent of industrial equipment owner/operators believe they own the data from their machines, and that access to it is theirs to determine — not the machine’s builder or the service providers that connect it. Customers must legally own their online data and determine who has rights to what, and sharing must be “opt in”, with ZERO sharing as the default.

“As for security, companies should explore Resilient Networking, a concept developed for the Department of Homeland Security framing new approaches to network/cyber security in more connected, distributed, automated, and dynamic digital networks.

“But individual efforts aren’t as important as collaborative ones, again, because of the data-sharing that is central to the IoT’s transformative power. We’re encouraged by formation of the IPSO Alliance and the IoT Consortium, which make security and privacy a priority.

“The president must also become involved in this issue. One reason is that the IoT will benefit government: cities worldwide are already applying the IoT, and it can make government in general more effective and responsive. Working closely with the private sector is a priority because 85 percent of the nation’s critical infrastructure, including the electric grid, pipelines and chemical plants, is in private hands, and is the focus of IoT initiatives such as the “smart grid” to make them more interconnected and reliable – but also more vulnerable to a coordinated attack.”

That’s our opinion on this crucial issue. What’s yours?

P.S. A reminder that these issues will be front and center in the panel on security and privacy that I will moderate at the IoT Summit, to be held October 1st and 2nd at the National Press Club in DC. Don’t miss it!