FTC report provides a good checklist for designing in IoT security and privacy

FTC report on IoT


FTC Chair Edith Ramirez has been pretty clear that the FTC plans to look closely at the IoT and takes IoT security and privacy seriously: most famously in its action against IoT marketer TRENDnet over the non-existent security of its nanny cam.

Companies that want to avoid such actions — and avoid undermining fragile public trust in their products and in the IoT as a whole — would do well to clip and refer to this checklist, which I’ve prepared from the FTC’s recent report, Privacy and Security in a Connected World. The report draws on a workshop the commission held in 2013 and highlights best practices shared there.

  1. Most important, “companies should build security into their devices at the outset, rather than as an afterthought.” I’ve referred before to the bright young things at the Wearables + Things conference who used their startup status as an excuse for deferring security and privacy until a later date. WRONG: both must be a priority from Day One.

  2. Conduct a privacy or security risk assessment during the design phase.

  3. Minimize the data you collect and retain. This is a tough one, because there’s always a chance that some retained data could be mashed up with other data in the future, yielding a dazzling insight that helps company and customer alike. BUT the more data floating around in a “data lake,” the greater the chance it will be misused.

  4. Test your security measures before launching your products. … then test them again…

  5. “..train all employees about good security, and ensure that security issues are addressed at the appropriate level of responsibility within the organization.” This one is sooo important and so often overlooked: how many times have we found that someone far down the corporate ladder has been at fault in a data breach because s/he wasn’t adequately trained and/or empowered?  Privacy and security are everyone’s job.

  6. “… retain service providers that are capable of maintaining reasonable security and provide reasonable oversight for these service providers.”

  7. “… when companies identify significant risks within their systems, they should implement a defense-in-depth approach, in which they consider implementing security measures at several levels.”

  8. “… consider implementing reasonable access control measures to limit the ability of an unauthorized person to access a consumer’s device, data, or even the consumer’s network.” Don’t forget: with the Target data breach, the bad guys got access to the corporate data through a local HVAC contractor. Everything’s linked — for better or worse!

  9. “… companies should continue to monitor products throughout the life cycle and, to the extent feasible, patch known vulnerabilities.” Privacy and security are moving targets, and require constant vigilance.

  10. Avoid enabling unauthorized access and misuse of personal information.

  11. Don’t facilitate attacks on other systems. The very strength of the IoT in creating linkages and synergies between various data sources can also allow backdoor attacks if one source has poor security.

  12. Don’t create risks to personal safety. If you doubt that’s an issue, look at Ed Markey’s recent report on connected car safety.

  13. Avoid creating a situation where companies might use this data to make credit, insurance, and employment decisions. That’s the downside of cool tools like Progressive’s “Snapshot,” which can save us safe drivers money on premiums: sharing the same data on your actual driving behavior might someday become compulsory, and the data might be used to deny you coverage or increase your premium.

  14. Realize that FTC Fair Information Practice Principles will be extended to the IoT. These “FIPPs,” including “notice, choice, access, accuracy, data minimization, security, and accountability,” have been around for a long time, so it’s understandable that the FTC will apply them to the IoT. The most important ones? Security, data minimization, notice, and choice.
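To make items 7 and 8 concrete, here is a minimal sketch of defense in depth applied to device access control. The device names, tokens, and the specific layers are my own illustrative assumptions, not anything prescribed by the FTC report; a real product would add TLS, per-device credentials, rate limiting, and more.

```python
# Hypothetical example: every request must pass several independent checks,
# so one failed layer (a leaked token, a spoofed address) can't open the door.

VALID_TOKENS = {"tok-123"}                     # issued out of band
ALLOWED = {"thermostat-7": {"read_temp"}}      # per-device action scopes

def check_network(req: dict) -> bool:
    """Layer 1: only the local subnet may even reach the device."""
    return req["src_ip"].startswith("10.0.")

def check_auth(req: dict) -> bool:
    """Layer 2: the caller must present a known token."""
    return req["token"] in VALID_TOKENS

def check_scope(req: dict) -> bool:
    """Layer 3: each device may perform only its whitelisted actions."""
    return req["action"] in ALLOWED.get(req["device"], set())

def authorize(req: dict) -> bool:
    # Defense in depth: every layer must pass; any single failure blocks access.
    return all(layer(req) for layer in (check_network, check_auth, check_scope))

ok = {"src_ip": "10.0.0.5", "token": "tok-123",
      "device": "thermostat-7", "action": "read_temp"}
bad = dict(ok, action="unlock_door")   # valid token, but out-of-scope action
```

The point of the layering is exactly the one the report makes: an attacker who compromises a consumer’s device credentials still can’t reach the rest of the network or perform actions outside that device’s scope.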

Not all of these issues will apply to all companies, but it’s better to keep all of them in mind, because your situation may change. I hope you’ll share these guidelines with your entire workforce: they’re all part of the solution — or the problem.

Resolved: That 2015 Is When Privacy & Security Become #IoT Priority!

I’m a right-brained, intuitive type (ENFP, if you’re keeping Myers-Briggs score…), and sometimes that pays off on issues involving technology & the general public, especially when the decidedly non-technical, primal issue of FEAR comes into the equation.

I used to do a lot of crisis management work with Fortune 100 companies, and usually worked with engineers, 95% of whom are my direct opposite: ISTJ. Because they are so left-brained, rational, and analytical, it used to drive them crazy that the public would be so fearful of various situations, because people’s reactions were just so darned irrational!

I’m convinced that same split is a looming and extremely dangerous problem for the Internet of Things: the brilliant engineers who bring us all these great platforms, devices and apps just can’t believe that people could be fraidy cats.

Let me be blunt about it, IoT colleagues: get used to dealing with people’s fears. Wise up, because that fear might just screw the IoT before it really gains traction. Just because a reaction is irrational doesn’t mean it isn’t very, very real to those who feel it, and they might just shun your technology and/or demand draconian regulations to enforce privacy and security standards.

That’s why I was so upset at a remark by some bright young things at the recent Wearables + Things conference. When asked about privacy and security precautions (a VERY big thing with people, since it’s their very personal bodily data that’s at risk) for their gee-whiz device, they blithely said that they were just a start-up, and they’d get to security issues after they had the device technology squared away.

WRONG, KIDS: security and privacy protections have to be a key priority from the get-go.

That’s why I was pleased to see that CES asked FTC Chair Edith Ramirez to give opening remarks at a panel on security last week, and she specifically focused on “privacy by design,” where privacy protections are baked into the product from the get-go. She emphasized that start-ups can’t get off the hook:

“‘Any device that is connected to the Internet is at risk of being hijacked,’ said Ms. Ramirez, who added that the large number of Internet-connected devices would ‘increase the number of access points’ for hackers.

Ms. Ramirez seemed to be directing her remarks at the start-ups that are making most of the products — like fitness trackers and glucose monitors — driving the so-called Internet of Things.

She said that some of these developers, in contrast to traditional hardware and software makers, ‘have not spent decades thinking about how to secure their products and services from hackers.'”

I yield to no one in my love of serendipitous discoveries of data’s value (such as the breakthrough in early diagnosis of infections in neonates by researchers from IBM and Toronto’s Hospital for Sick Children), but I think Ms. Ramirez was on target in urging IoT developers to emphasize minimizing data collection, especially when it comes to personal data:

“Beyond security, Ms. Ramirez said that technology companies needed to pay more attention to so-called data minimization, in which they collect only the personal data they need for a specific purpose and delete it permanently afterward. She directly challenged the widespread contention in the technology industry that it is necessary to collect large volumes of data because new uses might be uncovered.

‘I question the notion that we must put sensitive consumer data at risk on the off chance a company might someday discover a valuable use for the information,’ she said.

She also said that technology companies should be more transparent about the way they use personal data and should simplify their terms of use.”
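What Ms. Ramirez is describing can be sketched in a few lines: collect only the fields a stated purpose actually needs, and permanently purge records once that purpose is served. The field names, the purpose, and the 30-day window below are my own illustrative assumptions, not anything from her remarks.

```python
# Hypothetical glucose-monitor backend practicing data minimization.
from datetime import datetime, timedelta, timezone

NEEDED_FIELDS = {"glucose_mg_dl", "timestamp"}  # stated purpose: a trend chart
RETENTION = timedelta(days=30)                   # assumed retention window

def minimize(reading: dict) -> dict:
    """Drop everything the purpose doesn't require (location, name, ...)."""
    return {k: v for k, v in reading.items() if k in NEEDED_FIELDS}

def purge_expired(records: list, now: datetime) -> list:
    """Delete records once the purpose is served (here: older than 30 days)."""
    return [r for r in records if now - r["timestamp"] <= RETENTION]

now = datetime.now(timezone.utc)
raw = {"glucose_mg_dl": 104, "timestamp": now,
       "gps": (42.3, -71.1), "name": "Ann"}   # sensor sends more than we need
stored = minimize(raw)                         # keep only the two needed fields
```

The discipline is in what never gets stored: the GPS fix and the name can’t leak in a breach, and can’t “someday” be repurposed, because they were discarded at the door.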

Watch for a major IoT privacy pronouncement soon from the FTC.

It’s gratifying that, in addition to the panel Ms. Ramirez introduced, CES also had an (albeit small…) area for privacy vendors. As the WaPo reported, part of the reason for this area is that the devices and apps are aimed at you and me, because “consumers are finding — thanks to the rise in identity theft, hacks and massive data breaches — that companies aren’t always good stewards for their information.” Dealing with privacy breaches is everyone’s business: companies, government, and you and me!

As the WaPo reporter concluded: “The whole point of the privacy area, and of many of the products being shown there, is that technology and privacy don’t have to fight. They can actually help each other. And these exhibitors — the few, the proud, the private — are happy to be here, preaching that message.”

So, let’s all resolve that 2015 is when privacy and security become as big an IoT priority as innovation!


Oh, before I forget: it’s time for my gratuitous reference, whenever I discuss IoT privacy and security, to Gen. David Petraeus (yes, the very “Do As I Say, Not As I Do” Petraeus who faces possible federal felony charges for leaking classified documents to his lover/biographer), who was quite enamored of the IoT when he directed the CIA. That should give you pause, no matter whether you’re an IoT user, producer, or regulator!

Good Paper by Mercatus on IoT Privacy and Security

Posted on 12th September 2013 in privacy, security

I’m politically on the liberal, not the libertarian side, but I’ve come to respect the libertarian Mercatus Center, in large part because of the great work Jerry Brito has done there on governmental transparency.

As part of my preparation to moderate a panel on security and privacy at the IoT Summit on October 1st in DC, I just read a great paper on the issue by Mercatus’ Adam Thierer.

In comments submitted to the FTC for its November workshop on these issues titled “Privacy and Security Implications of the Internet of Things,” Thierer says “whoa” to those who would have the FTC and others quickly impose regulations on the IoT in the name of protecting privacy and security.

Opposing pre-emptive, “precautionary” regulations, he instead argues for holding back:

“… an ‘Anti-Precautionary Principle’ is the better default here and would generally hold that:

“1. society is better off when technological innovation is not preemptively restricted;

“2. accusations of harm and calls for policy responses should not be premised on hypothetical worst-case scenarios; and

“3. remedies to actual harms should be narrowly tailored so that beneficial uses of technology are not derailed.”

He reminds us that, when introduced, such everyday technologies as the phone (you know, the old on-the-wall kind…) and photography were opposed by many as invasions of privacy, but social norms quickly adapted to embrace them. He quotes Larry Downes, who has written, “After the initial panic, we almost always embrace the service that once violated our visceral sense of privacy.”

Rather than imposing limits in advance, Thierer argues for a trial-and-error approach to avoid unnecessary limits to experimentation — including learning from mistakes.

He points out that social norms often emerge that can substitute for regulations to govern acceptable use of the new technology.

In conclusion, Thierer reminds us that a wide range of laws and regulations already on the books could, by extension, apply to some of the recent IoT outrages:

“…  many federal and state laws already exist that could address perceived harms in this context. Property law already governs trespass, and new court rulings may well expand the body of such law to encompass trespass by focusing on actual cases and controversies, not merely imaginary hypotheticals. State ‘peeping Tom’ laws already prohibit spying into individual homes. Privacy torts—including the tort of intrusion upon seclusion—may also evolve in response to technological change and provide more avenues of recourse to plaintiffs seeking to protect their privacy rights.”

Along the lines of my continuing screed that IoT manufacturers had better take action immediately to tighten their own privacy and security precautions, Thierer isn’t letting them off the hook:

“The public will also expect the developers of IoT technologies to offer helpful tools and educational methods for controlling improper usages. This may include ‘privacy-by-design’ mechanisms that allow the user to limit or intentionally cripple certain data collection features in their devices. ‘Only by developing solutions that are clearly respectful of people’s privacy, and devoting an adequate level of resources for disseminating and explaining the technology to the mass public’ can industry expect to achieve widespread adoption of IoT technologies.”

So get cracking, you lazy IoT developers (yes, you smirking over there in the corner…) who think that security and privacy are someone else’s business: if you don’t act, regulators may step in and stifle innovation in the name of consumer protection. You’ll have no one to blame but yourselves.

It’s a good read — hope you’ll check it out!


FTC to hold a workshop on #IoT privacy and security implications!

Posted on 17th April 2013 in Internet of Things

Bravo! I’ve been critical of the President’s silence on the IoT, especially in light of how frequently the Chinese premier mentions it — and spends money on it.

Now the FTC has broken that silence with the announcement of a Nov. 21st workshop in DC on the Internet of Things’ implications for privacy and security.

Specifically, they are looking for comment on the following questions:

  • What are the significant developments in services and products that make use of this connectivity (including prevalence and predictions)?
  • What are the various technologies that enable this connectivity (e.g., RFID, barcodes, wired and wireless connections)?
  • What types of companies make up the smart ecosystem?
  • What are the current and future uses of smart technology?
  • How can consumers benefit from the technology?
  • What are the unique privacy and security concerns associated with smart technology and its data?  For example, how can companies implement security patching for smart devices?  What steps can be taken to prevent smart devices from becoming targets of or vectors for malware or adware?
  • How should privacy risks be weighed against potential societal benefits, such as the ability to generate better data to improve health-care decisionmaking or to promote energy efficiency? Can and should de-identified data from smart devices be used for these purposes, and if so, under what circumstances?
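On the commission’s patching question, one possible shape of an answer is a device that refuses any firmware image it can’t verify. The sketch below is a toy stand-in using an HMAC tag (a production device would verify a public-key signature with the vendor’s key baked into read-only storage); the key, filenames, and payloads are all invented for illustration.

```python
# Hypothetical over-the-air update check for a smart device.
import hashlib
import hmac

VENDOR_KEY = b"shared-update-key"   # illustrative only; never ship a literal key

def tag_firmware(blob: bytes) -> str:
    """Vendor side: compute the integrity tag published with each update."""
    return hmac.new(VENDOR_KEY, blob, hashlib.sha256).hexdigest()

def apply_update(blob: bytes, tag: str) -> bool:
    """Device side: install the image only when the tag verifies."""
    if not hmac.compare_digest(tag_firmware(blob), tag):
        return False            # tampered or corrupted image: reject it
    # ... flash `blob` to the device here ...
    return True

firmware = b"v1.1: patches known vulnerability"
update_tag = tag_firmware(firmware)
```

A check like this addresses both halves of the FTC’s question at once: it makes patch delivery routine, and it keeps the update channel itself from becoming a vector for malware.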

The commission is requesting written comment on these and other issues by June 1st.

Bravo!
