FTC report provides good checklist to design in IoT security and privacy


FTC Chair Edith Ramirez has made clear that the FTC plans to look closely at the IoT and takes IoT security and privacy seriously: most famously in its action against IoT marketer TRENDnet over the non-existent security of its nanny cams.

Companies that want to avoid such actions — and avoid undermining fragile public trust in their products and the IoT as a whole — would do well to clip and refer to this checklist, which I've prepared from the recent FTC report, Privacy and Security in a Connected World. The report grew out of a workshop the FTC held in 2013, and highlights best practices shared there.

  1. Most important, “companies should build security into their devices at the outset, rather than as an afterthought.” I’ve referred before to the bright young things at the Wearables + Things conference who used their startup status as an excuse for deferring security and privacy until a later date. WRONG: both must be a priority from Day One.

  2. Conduct a privacy or security risk assessment during the design phase.

  3. Minimize the data you collect and retain. This is a tough one, because there's always a chance that some retained data could be mashed up with other data in the future, yielding a dazzling insight that could help company and customer alike. BUT the more data just floating out there in a "data lake," the more chance it will be misused.

  4. Test your security measures before launching your products. … then test them again…

  5. “..train all employees about good security, and ensure that security issues are addressed at the appropriate level of responsibility within the organization.” This one is sooo important and so often overlooked: how many times have we found that someone far down the corporate ladder has been at fault in a data breach because s/he wasn’t adequately trained and/or empowered?  Privacy and security are everyone’s job.

  6. “.. retain service providers that are capable of maintaining reasonable security and provide reasonable oversight for these service providers.”

  7. "… when companies identify significant risks within their systems, they should implement a defense-in-depth approach, in which they consider implementing security measures at several levels."

  8. “… consider implementing reasonable access control measures to limit the ability of an unauthorized person to access a consumer’s device, data, or even the consumer’s network.” Don’t forget: with the Target data breach, the bad guys got access to the corporate data through a local HVAC dealer. Everything’s linked — for better or worse!

  9. “.. companies should continue to monitor products throughout the life cycle and, to the extent feasible, patch known vulnerabilities.”  Privacy and security are moving targets, and require constant vigilance.

  10. Avoid enabling unauthorized access and misuse of personal information.

  11. Don’t facilitate attacks on other systems. The very strength of the IoT in creating linkages and synergies between various data sources can also allow backdoor attacks if one source has poor security.

  12. Don’t create risks to personal safety. If you doubt that’s an issue, look at Ed Markey’s recent report on connected car safety.

  13. Avoid creating a situation where companies might use this data to make credit, insurance, and employment decisions. That's the downside of cool tools like Progressive's "Snapshot," which can save us safe drivers on premiums: sharing the same data on your actual driving behavior might some day become compulsory, and it might be used to deny you coverage or increase your premium.

  14. Realize that the FTC's Fair Information Practice Principles will be extended to the IoT. These "FIPPs," including "notice, choice, access, accuracy, data minimization, security, and accountability," have been around for a long time, so it's understandable that the FTC will apply them to the IoT. The most important ones? Security, data minimization, notice, and choice.
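The data-minimization principle in item 3 lends itself to a short illustration. Here's a minimal sketch in Python; the field names and schema are hypothetical, not from the FTC report:

```python
# Hypothetical sketch: whitelist only the fields a feature actually needs,
# so everything else is discarded before retention.
ALLOWED_FIELDS = {"device_id", "timestamp", "temperature"}

def minimize(reading: dict) -> dict:
    """Drop every field not on the whitelist before storage."""
    return {k: v for k, v in reading.items() if k in ALLOWED_FIELDS}

raw = {
    "device_id": "thermostat-42",
    "timestamp": 1425000000,
    "temperature": 21.5,
    "wifi_ssid": "HomeNet",          # never needed, so never stored
    "owner_email": "a@example.com",  # never needed, so never stored
}
stored = minimize(raw)  # keeps only device_id, timestamp, temperature
```

The point is structural: data that is never retained cannot leak from a "data lake" later.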

Not all of these issues will apply to all companies, but it’s better to keep all of them in mind, because your situation may change. I hope you’ll share these guidelines with your entire workforce: they’re all part of the solution — or the problem.


IBM picks for IoT trends to watch this year emphasize privacy & security

Last month Bill Chamberlin, the principal analyst for Emerging Tech Trends and Horizon Watch Community Leader for IBM Market Development (hmmm, must have an oversized biz card..) published a list of 20 IoT trends to watch this year that I think provide a pretty good checklist for evaluating what promises to be an important period in which the IoT becomes more mainstream.

It’s interesting to me, especially in light of my recent focus on the topics (and I’ll blog on the recent FTC report on the issue in several days), that he put privacy and security number one on the list, commenting that “Trust and authentication become critical across all elements of the IoT, including devices, the networks, the cloud and software apps.” Amen.

Most of the rest of the list held few surprises, with standards, hardware, software, and edge analytics rounding out the top five. Even though it hasn't gotten a lot of attention, I agree edge analytics will be crucial as the volume of sensor data increases dramatically: why pass along the vast majority of data, most of it probably redundant, to the cloud, rather than just the deviations from the norm that are probably more important?
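That edge-analytics idea, forwarding only deviations from the norm, can be sketched in a few lines. This is a hypothetical illustration (the threshold and baseline logic are my assumptions, not any vendor's implementation):

```python
# Hypothetical sketch of edge analytics: only readings that deviate from a
# running baseline get forwarded; redundant "normal" samples stay at the edge.
class EdgeFilter:
    def __init__(self, threshold: float):
        self.baseline = None
        self.threshold = threshold

    def process(self, value: float):
        """Return the value if it's worth uploading to the cloud, else None."""
        if self.baseline is None:
            self.baseline = value   # first sample establishes the norm
            return value
        if abs(value - self.baseline) > self.threshold:
            self.baseline = value   # a real deviation: track the new norm
            return value
        return None                 # redundant: keep at the edge

f = EdgeFilter(threshold=2.0)
uploads = [v for v in (20.0, 20.5, 21.0, 25.0, 25.2) if f.process(v) is not None]
# uploads == [20.0, 25.0]: the initial baseline plus the one real deviation
```

Four of the five samples never leave the device, which is exactly the bandwidth win edge analytics promises.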

Two dealing with sensors did strike my eye:

9.  Sensor fusion: Combining data from different sources can improve accuracy. Data from two sensors is better than data from one. Data from lots of sensors is even better.

10.  Sensor hubs: Developers will increasingly experiment with sensor hubs for IoT devices, which will be used to offload tasks from the application processor, cutting down on power consumption and improving battery life in the devices.

Both make a lot of sense.
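For the sensor-fusion item, inverse-variance weighting is one common way to combine readings; here's a hypothetical sketch (the IBM list doesn't specify a method, so this is just an illustration of the principle):

```python
# Hypothetical sketch: fuse readings from several sensors with
# inverse-variance weighting, so noisier sensors count for less.
def fuse(readings):
    """readings: list of (value, variance) pairs; returns the fused estimate."""
    weights = [1.0 / variance for _, variance in readings]
    weighted_sum = sum(w * value for w, (value, _) in zip(weights, readings))
    return weighted_sum / sum(weights)

# A precise thermometer (variance 1.0) and a noisy one (variance 4.0):
fused = fuse([(20.0, 1.0), (24.0, 4.0)])
# (1.0 * 20.0 + 0.25 * 24.0) / 1.25 = 20.8, pulled toward the precise sensor
```

This is why "data from two sensors is better than data from one": each additional sensor shrinks the variance of the combined estimate.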

One was particularly noteworthy in light of my last post, about the Gartner survey showing most companies were ill-prepared to plan and launch IoT strategies: “14.  Chief IoT Officer: Expect more senior level execs to be put in place to build the enterprise-wide IoT strategy.” Couldn’t agree more that this is vital!

Check out the whole list: I think you’ll find it helpful in tracking this year’s major IoT developments.


The #IoT Can Kill You! Got Your Attention? Car Security a Must

The Internet of Things can kill you.

Got your attention? OK, maybe this is the wake-up call the IoT world needs to make certain that privacy and security are baked in, not just afterthoughts.

I've blogged before about how privacy and security must be Job 1, but now it's in the headlines because of a new report by our Mass. Senator, Ed Markey (political aside: thanks, Ed, for more than 30 years of leadership — frequently as a voice crying in the wilderness — on the policy implications of telecom!), "Tracking & Hacking: Security & Privacy Gaps Put American Drivers at Risk," about the dangers of not taking these issues seriously when it comes to smart cars.

I first became concerned about this issue when reading "Look Out, He's Got a Phone!" (my personal nominee for all-time most wry IoT headline…), a litany of horrific things that are proven risks of unencrypted automotive data, such as spoofing the low air-pressure light on your car so you'll pull over and the Bad Guys can get you, or making your car stop dead at 70 mph. All too typical was the reaction of Schrader Electronics, which makes the tire sensors:

“Schrader Electronics, the biggest T.P.M.S. manufacturer, publicly scoffed at the Rutgers–South Carolina report. Tracking cars by tire, it said, is ‘not only impractical but nearly impossible.’ T.P.M.S. systems, it maintained, are reliable and safe.

“This is the kind of statement that security analysts regard as an invitation. A year after Schrader’s sneering response, researchers from the University of Washington and the University of California–San Diego were able to ‘spoof’ (fake) the signals from a tire-pressure E.C.U. by hacking an adjacent but entirely different system—the OnStar-type network that monitors the T.P.M.S. for roadside assistance. In a scenario from a techno-thriller, the researchers called the cell phone built into the car network with a message supposedly sent from the tires. ‘It told the car that the tires had 10 p.s.i. when they in fact had 30 p.s.i.,’ team co-leader Tadayoshi Kohno told me—a message equivalent to ‘Stop the car immediately.’ He added, ‘In theory, you could reprogram the car while it is parked, then initiate the program with a transmitter by the freeway. The car drives by, you call the transmitter with your smartphone, it sends the initiation code—bang! The car locks up at 70 miles per hour. You’ve crashed their car without touching it.’”

Hubris: it’ll get you every time….

So now Senator Markey lays out the full scope of this issue, and it should scare the daylights out of you — and, hopefully, Detroit! The report is compiled from responses by 16 car companies (BMW, Chrysler, Ford, General Motors, Honda, Hyundai, Jaguar Land Rover, Mazda, Mercedes-Benz, Mitsubishi, Nissan, Porsche, Subaru, Toyota, Volkswagen (with Audi), and Volvo — hmm: one that didn't respond was Tesla, which I suspect [just a hunch] really has paid attention to this issue because of its techno leadership) to letters Markey sent in late 2013. Here are the damning highlights from his report:

“1. Nearly 100% of cars on the market include wireless technologies that could pose vulnerabilities to hacking or privacy intrusions.

2. Most automobile manufacturers were unaware of or unable to report on past hacking incidents.

3. Security measures to prevent remote access to vehicle electronics are inconsistent and haphazard across all automobile manufacturers, and many manufacturers did not seem to understand the questions posed by Senator Markey.

4. Only two automobile manufacturers were able to describe any capabilities to diagnose or meaningfully respond to an infiltration in real-time, and most say they rely on technologies that cannot be used for this purpose at all. (my emphasis)

5. Automobile manufacturers collect large amounts of data on driving history and vehicle performance.

6. A majority of automakers offer technologies that collect and wirelessly transmit driving history data to data centers, including third-party data centers, and most do not describe effective means to secure the data.

7. Manufacturers use personal vehicle data in various ways, often vaguely to “improve the customer experience” and usually involving third parties, and retention policies – how long they store information about drivers – vary considerably among manufacturers.

8. Customers are often not explicitly made aware of data collection and, when they are, they often cannot opt out without disabling valuable features, such as navigation.”

In short, the auto industry collects a lot of information about us, and doesn’t have a clue how to manage or protect it.

I've warned repeatedly that one of the things technologists don't really understand, and/or scoff at, is public fear about privacy and security. Based on my prior work in crisis management, that can be costly — or fatal.

This report should serve as a bit of electroshock therapy to get them (and here I'm referring not just to auto makers but to all IoT technologists: it's called guilt by association, and most people tend to conflate fears, not discriminate between them. Unless everyone in IoT takes privacy and security seriously, everyone may suffer the result [see below]) to realize that it's not OK, as one of the speakers at the Wearables + Things conference said, that "we'll get to privacy and security later." It's got to be a priority from the get-go (more about this in a forthcoming post, where I'll discuss the recent FTC report on the issue).

I’ve got enough to worry about behind the wheel, since the North American Deer Alliance is out to get me. Don’t make me worry about false tire pressure readings.


PS: there’s another important issue here that may be obscured: the very connectedness that is such an important aspect of the IoT. Remember that the researchers spoofed the T.P.M.S. system not through a frontal assault, but by attacking the roadside assistance system? It’s like the way Target’s computers were hacked via a small company doing HVAC maintenance. Moral of the story? No IoT system is safe unless all the ones linking to it are safe.  For want of a nail … the kingdom was lost!
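One basic defense against that kind of lateral attack is authenticating every request between linked systems, denying by default. Here's a minimal, hypothetical sketch using an HMAC signature; the shared-secret provisioning scheme and message format are my assumptions, not anything from the Markey report:

```python
import hashlib
import hmac

# Hypothetical sketch: a device denies every request by default and grants
# access only when the request carries a valid signature from a trusted peer.
SHARED_SECRET = b"per-device-secret"  # assumed to be provisioned securely

def sign(payload: bytes) -> str:
    """Compute an HMAC-SHA256 signature over the request payload."""
    return hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()

def authorized(payload: bytes, token: str) -> bool:
    """Default deny: only a matching signature grants access.
    compare_digest avoids leaking information through timing."""
    return hmac.compare_digest(sign(payload), token)

good = authorized(b"unlock", sign(b"unlock"))  # valid signature: allowed
bad = authorized(b"unlock", "forged-token")    # anything else: denied
```

Had the roadside-assistance link demanded this kind of proof before relaying messages to the T.P.M.S., the spoofed "10 p.s.i." report would have been dropped at the boundary.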

Resolved: That 2015 Is When Privacy & Security Become #IoT Priority!

I’m a right-brained, intuitive type (ENFP, if you’re keeping Myers-Briggs score…), and sometimes that pays off on issues involving technology & the general public, especially when the decidedly non-technical, primal issue of FEAR comes into the equation.

I used to do a lot of crisis management work with Fortune 100 companies, and usually worked with engineers, 95% of whom are my direct opposite: ISTJ.  Because they are so left-brained, rational and analytical, it used to drive them crazy that the public would be so fearful of various situations, because peoples’ reaction was just so darned irrational!

I'm convinced that same split is a looming and extremely dangerous problem for the Internet of Things: the brilliant engineers who bring us all these great platforms, devices and apps just can't believe that people could be fraidy cats.

Let me be blunt about it, IoT colleagues: get used to dealing with people's fears. Wise up, because that fear might just screw the IoT before it really gains traction. Just because a reaction is irrational doesn't mean it isn't very, very real to those who feel it, and they might just shun your technology and/or demand draconian regulations to enforce privacy and security standards.

That’s why I was so upset at a remark by some bright young things at the recent Wearables + Things conference. When asked about privacy and security precautions (a VERY big thing with people, since it’s their very personal bodily data that’s at risk) for their gee-whiz device, they blithely said that they were just a start-up, and they’d get to security issues after they had the device technology squared away.

WRONG, KIDS: security and privacy protections have to be a key priority from the get-go.

That’s why I was pleased to see that CES asked FTC Chair Edith Ramirez to give opening remarks at a panel on security last week, and she specifically focused on “privacy by design,” where privacy protections are baked into the product from the get-go. She emphasized that start-ups can’t get off the hook:

“‘Any device that is connected to the Internet is at risk of being hijacked,’ said Ms. Ramirez, who added that the large number of Internet-connected devices would ‘increase the number of access points’ for hackers.

Ms. Ramirez seemed to be directing her remarks at the start-ups that are making most of the products — like fitness trackers and glucose monitors — driving the so-called Internet of Things.

She said that some of these developers, in contrast to traditional hardware and software makers, ‘have not spent decades thinking about how to secure their products and services from hackers.'”

I yield to no one in my love of serendipitous discoveries of data's value (such as the breakthrough in early diagnosis of infections in neonates by researchers from IBM and Toronto's Hospital for Sick Children), but I think Ms. Ramirez was on target about IoT developers forcing themselves to emphasize minimization of data collection, especially when it comes to personal data:

“Beyond security, Ms. Ramirez said that technology companies needed to pay more attention to so-called data minimization, in which they collect only the personal data they need for a specific purpose and delete it permanently afterward. She directly challenged the widespread contention in the technology industry that it is necessary to collect large volumes of data because new uses might be uncovered.

‘I question the notion that we must put sensitive consumer data at risk on the off chance a company might someday discover a valuable use for the information,’ she said.

She also said that technology companies should be more transparent about the way they use personal data and should simplify their terms of use.”

Watch for a major IoT privacy pronouncement soon from the FTC.

It's gratifying that, in addition to the panel Ms. Ramirez introduced, CES also had an (albeit small…) area for privacy vendors. As the WaPo reported, part of the reason for this area is that the devices and apps are aimed at you and me, because "consumers are finding — thanks to the rise in identity theft, hacks and massive data breaches — that companies aren't always good stewards for their information." Dealing with privacy breaches is everyone's business: companies, government, and you and me!

As the WaPo reporter concluded: "The whole point of the privacy area, and of many of the products being shown there, is that technology and privacy don't have to fight. They can actually help each other. And these exhibitors — the few, the proud, the private — are happy to be here, preaching that message."

So, let's all resolve that 2015 is the year privacy and security become as big an IoT priority as innovation!


Oh, before I forget: it's time for my gratuitous reference, obligatory whenever I discuss IoT privacy and security, to Gen. David Petraeus (yes, the very General "Do As I Say, Not As I Do" Petraeus who faces possible federal felony charges for leaking classified documents to his lover/biographer), who was quite enamored of the IoT when he directed the CIA. That should give you pause, no matter whether you're an IoT user, producer, or regulator!

IoT Security After “The Interview”

Posted on 22nd December 2014 in defense, Internet of Things, M2M, management, privacy, security, US government

Call me an alarmist, but in the wake of the “Interview” catastrophe (that’s how I see it in terms of both the First Amendment AND asymmetrical cyberwarfare), I see this as a clarion call to the #IoT industry to redouble efforts to make both security AND privacy Job #1.

Here’s the deal: if we want to enhance more and more parts of governmental, commercial, and private lives by clever IoT devices and apps to control them, then there’s an undeniable quid pro quo: we MUST make these devices and apps as secure as possible.

I remember some bright young entrepreneurs speaking at a recent wearables conference, where they apologized for not having put attention on privacy and security yet, saying they’d get to it early next year.

Nope.

Unacceptable.

Security must be built in from the beginning, and constantly upgraded as new threats emerge.  I used to be a corporate crisis manager, and one of the things that was so hard to convince left-brained, extremely rational engineers about was that just because fears were irrational didn’t mean they weren’t real — even the perception of insecure IoT devices and apps has the potential to kill the whole industry, or, as Vanity Fair‘s apocalyptic “Look Out, He’s Got a Phone” article documented, it could literally kill us. As in deader than a doornail.

This incident should have convinced us all that there are some truly evil people out there fixated on bringing us to our collective knees, and they have the tech savvy to do it, using tools such as Shodan. ‘Nuff said?

PS: Here’s what Mr. Cybersecurity, Bruce Schneier, has to say on the subject. Read carefully.


My #IoT predictions for 2015

I was on a live edition of “Coffee Break With Game-Changers” a few hours ago with panelists Sherryanne Meyer of Air Products and Chemicals and Sven Denecken of SAP, talking about tech projections for 2015.

Here’s what I said about my prognostications:

“I predict that 2015 will be the year that the Internet of Things penetrates consumer consciousness — because of the Apple Watch. The watch will unite both health and smart home apps and devices, and that will mean you’ll be able to access all that usability just by looking at your watch, without having to fumble for your phone and open a specific app.

If Apple chooses to share the watch's API on the IFTTT – If This Then That — site, the Apple Watch's adoption – and usability — will go into warp speed. We won't have to wait for Apple or developers to come up with novel ways of using the watch and the related devices — makers and just plain folks using IFTTT will contribute their own "recipes" linking them. This "democratization of data" is one of the most powerful – and under-appreciated – aspects of the IoT. In fact, Sherryanne, I think one of the most interesting IoT strategy questions for business is going to be that we now have the ability to share real time data with everyone in the company who needs it – and even with supply chain and distribution networks – and we'll start to see some discussion of how we'll have to change management practices to capitalize on this instant ability to share.

(Sven will be interested in this one) In 2015, the IoT is also going to speed the development of fog computing, where the vast quantities of data generated by the IoT will mean a switch to processing data “at the edge,” and only passing on relevant data to the cloud, rather than overwhelming it with data – most of which is irrelevant.

In 2015 the IoT is also going to become more of a factor in the manufacturing world. The success of GE's Durathon battery plant and German "Industry 4.0" manufacturers such as Siemens will mean that more companies will develop incremental IoT strategies, where they'll begin to implement things such as sensors on the assembly line to allow real-time adjustments, then build on that familiarity with the IoT to eventually bring about revolutionary changes in every aspect of their operations.

2015 will also be the year when we really get serious about IoT security and privacy, driven by the increasing public concern about the erosion of privacy. I predict that if anything can hold back the IoT at this point, it will be failure to take privacy and security seriously. The public trust is extremely fragile: if even some fledgling startup is responsible for a privacy breach, the public will tend to tar the entire industry with the same brush, and that could be disastrous for all IoT firms. Look for the FTC to start scrutinizing IoT claims and levying more fines for insufficient security.”

What’s your take on the year ahead? Would love your comments!


In case you missed it, great panel today on the IoT and government

Posted on 19th March 2014 in government, Internet of Things, US government

In case you missed it, old friend Christopher Dorobek put together a great (in all modesty, LOL …) panel today for his “DorobekINSIDER” series on GovLoop about how the Internet of Things will transform government.  I’ll try to summarize it in a later post, but you can listen in here!


Tweeting the IoT Summit!

Posted on 1st October 2013 in government, Internet of Things, M2M, privacy, security

I Tweeted throughout the IoT Summit today, cryptic as the comments may have been. You can check them out at @data4all.  Learned a great deal, and picked up several nice examples for the e-book I’m writing on implications for corporate management of the IoT!

Enjoy.  Will do the same tomorrow!


Could IoT Allow Do-over for Privacy, Security — & Trust?

Posted on 13th September 2013 in communication, management, privacy, security

Expect to be reading a lot here about privacy and security between now and my panel on those issues at the IoT Summit in DC, Oct. 1 & 2, as I prep to ask the panel questions!

Here's another, from Stacey Higginbotham (BTW, she does a great podcast on IoT issues!), based on a conversation with ARM CTO Mike Muller. It's reassuring to see that this IoT-leading firm is taking privacy and security seriously. Even more refreshingly, theirs is a nuanced and thoughtful view.

Muller told Higginbotham that IoT vendors should learn from some of the missteps on privacy on the Web so far, and make amends:

“’We should think about trust as who has access to your data and what they can do with it. For example, I’ll know where you bought something, when you bought it, how often and who did you tweet about it.

“When you put the long tail of lots of bits of information and big data analytics associated with today’s applications we can discern a lot. And people are not thinking it through. … I think it’s the responsibility of the industry that, as people connect, to make them socially aware of what’s happening with their data and the methods that are in place to make connections between disparate sets of data (my emphasis). In the web that didn’t happen, and the sense of lost privacy proliferated and it’s all out there. People are trying to claw that back and implement privacy after the fact.”

Higginbotham adds that “… what troubles Muller is that today, there’s nothing that supports trust and privacy in the infrastructure associated with the internet of things.”

What struck me, as someone who used to earn his living doing corporate crisis management, is that guilt by association may not be logically valid, but it is emotionally powerful: if people's preconception of IoT privacy and security standards is that they're simply an extension of Internet ones, there's likely to be trouble.

She goes on to differentiate between security, privacy — and trust.

“Trust is the easiest to define and the hardest to implement. It relies on both transparency and making an effort to behave consistently ….  When it comes to connected devices and apps, trust is probably most easily gained by explaining what you do with people’s data: what you share and with whom. It might also extend to promises about interoperability and supporting different platforms. Implicitly trust with connected devices also means you will respect people’s privacy and follow the best security practices….

“Privacy is more a construct of place as opposed to something associated with a specific device. So a connected camera on a public street is different from a connected camera inside your home. It’s easy to say that people shouldn’t be able to just grab a feed from inside your home — either from a malicious hack or the government (or a business) doing a random data scrape. But when it comes to newer connected devices like wearables it gets even more murky: Consider that something like a smart meter can share information about the user to someone who knows what to look for.

“So when thinking about the internet of things and privacy, it’s probably useful to start with thinking about the data the device generates….

(As for security:) "To protect privacy when everything is connected will require laws that punish violations of people's privacy and draw lines that companies and governments can't step over; but it will also require vigilance by users. To get this right, users should be reading the agreements they click through when they connect a device, but companies should also make those agreements, especially around data sharing, transparent in a way that inspires trust.

Governments and companies need to think about updating laws for a connected age and set criteria about how different types of data are transported and shared. Health data might still need the HIPAA-levels of regulations, but maybe looser standards can prevail for connected thermostats.”

Sounds to me as if there’s a role in these complex issues for all of us: vendors, government, and users.

But the one take-away that I have from Muller's remarks is that IoT vendors must realize they have to earn users' trust, and that's going to require a combination of technical measures and unambiguous, plain-English communication with users about who owns their data and how it will be used. To me, that means not hiding behind the lawyers and agate-type legal disclaimers, but making clear, easy-to-understand declarations about users' rights to their data and companies' need to ask them directly for access, displayed prominently, with the default being that the user completely denies access and must opt in for it to be shared.
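That deny-by-default stance can be sketched in a few lines (a hypothetical illustration of the principle, not any vendor's actual consent API):

```python
# Hypothetical sketch of deny-by-default data sharing: no category of
# personal data is shared unless the user has explicitly opted in.
class ConsentRegistry:
    def __init__(self):
        self._granted = set()   # empty by default: nothing is shared

    def opt_in(self, category: str):
        self._granted.add(category)

    def opt_out(self, category: str):
        self._granted.discard(category)

    def may_share(self, category: str) -> bool:
        return category in self._granted

consent = ConsentRegistry()
consent.may_share("location")   # False: the default is complete denial
consent.opt_in("location")
consent.may_share("location")   # True only after an explicit opt-in
```

The design choice is that the empty set is the starting state: a vendor has to ask, and the user has to say yes, before any category of data moves.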

What do you think?

Higginbotham concludes that “we need to stop freaking out about the dangers of connected devices and start having productive discussions about implementing trust and security before the internet of things goes the way of the web. Wonderful, free and a total wild west when it comes to privacy.” Hopefully, that’s what will happen during our October 1st panel.


Good Paper by Mercatus on IoT Privacy and Security

Posted on 12th September 2013 in privacy, security

I’m politically on the liberal, not the libertarian side, but I’ve come to respect the libertarian Mercatus Center, in large part because of the great work Jerry Brito has done there on governmental transparency.

As part of my preparation to moderate a panel on security and privacy at the IoT Summit on October 1st in DC, I just read a great paper on the issue by Mercatus’ Adam Thierer.

In comments submitted to the FTC for its November workshop on these issues titled “Privacy and Security Implications of the Internet of Things,” Thierer says “whoa” to those who would have the FTC and others quickly impose regulations on the IoT in the name of protecting privacy and security.

Opposing pre-emptive, “precautionary” regulations, he instead argues for holding back:

"… an 'Anti-Precautionary Principle' is the better default here and would generally hold that:

“1. society is better off when technological innovation is not preemptively restricted;

"2. accusations of harm and calls for policy responses should not be premised on hypothetical worst-case scenarios; and

“3. remedies to actual harms should be narrowly tailored so that beneficial uses of technology are not derailed.”

He reminds us that, when introduced, such everyday technologies as the phone (you know, the old on-the-wall kind…) and photography were opposed by many as invasions of privacy, but social norms quickly adapted to embrace them. He quotes Larry Downes, who has written, "After the initial panic, we almost always embrace the service that once violated our visceral sense of privacy."

Rather than imposing limits in advance, Thierer argues for a trial-and-error approach to avoid unnecessary limits to experimentation — including learning from mistakes.

He points out that social norms often emerge that can substitute for regulations to govern acceptable use of the new technology.

In conclusion, Thierer reminds us that there is already a wide range of laws and regulations on the books that, by extension, could apply to some of the recent IoT outrages:

“…  many federal and state laws already exist that could address perceived harms in this context. Property law already governs trespass, and new court rulings may well expand the body of such law to encompass trespass by focusing on actual cases and controversies, not merely imaginary hypotheticals. State ‘peeping Tom’ laws already prohibit spying into individual homes. Privacy torts—including the tort of intrusion upon seclusion—may also evolve in response to technological change and provide more avenues of recourse to plaintiffs seeking to protect their privacy rights.”

Along the lines of my continuing screed that IoT manufacturers had better take action immediately to tighten their own privacy and security precautions, Thierer isn’t letting them off the hook:

“The public will also expect the developers of IoT technologies to offer helpful tools and educational methods for controlling improper usages. This may include ‘privacy-by-design’ mechanisms that allow the user to limit or intentionally cripple certain data collection features in their devices. ‘Only by developing solutions that are clearly respectful of people’s privacy, and devoting an adequate level of resources for disseminating and explaining the technology to the mass public’ can industry expect to achieve widespread adoption of IoT technologies.”

So get cracking, you lazy IoT developers (yes, you smirking over there in the corner…) who think that security and privacy are someone else's business: if you don't act, regulators may step in and stifle innovation in the name of consumer protection. You'll have no one to blame but yourselves.

It’s a good read — hope you’ll check it out!

 
