Catching up on my reading after the New Year, I came across an article by Christopher Wolf and Jules Polonetsky from the Future of Privacy Forum. The article, issued in November, is about an updated privacy paradigm for the “Internet of Things,” and it came out on the heels of the FTC's workshop about the same subject. After reading it, many of the points that both Jules and Chris raised resonated with me, and I'd like to offer a few additional thoughts.
First, Jules and Chris address the concern that the Fair Information Practice Principles, which date back to the early ‘70s, do not adequately address the privacy concerns of 2014. While I agree that the traditional framework of the FIPPS may not be keeping pace with today’s privacy challenges, I firmly believe that the core principles embodied in the FIPPS – notice, choice, data minimization, transparency, security, accountability – remain relevant today.
I would suggest that the FIPPS themselves are not the issue. It’s the implementation; as the article points out, a “simple application” of these principles will not “solve” the more complex issues presented by data collection and information flows that arise with the Internet of Things – or for other business models that continue to evolve in 2014. I think we are saying the same thing here, but I’d say don’t abandon the FIPPS. Simply change the emphasis on different principles in different contexts.
The article highlights the need for a “use-focused privacy paradigm” for the Internet of Things. Amen. I would go even further and suggest that such a use-based model of privacy is applicable today in a wide range of contexts, including online advertising. The NAI Self-Regulatory Code of Conduct recognizes this by imposing higher obligations on our members for different uses of data. For example, the Code requires that members offer an opt-out choice for the use of anonymous or pseudonymous information for Interest-Based Advertising (IBA) but prohibits certain uses of data altogether, such as uses of data for eligibility decisions around credit, health, insurance, and employment.
In the article, Jules and Chris highlight issues around eligibility and note that the potential for harm in these contexts is far greater than in other contexts where data may be collected and used for things like market research, product improvement, or, in the case of NAI members, advertising.
Another point they highlight in the article is the emphasis on anonymization and de-identification. Here again, we are in violent agreement. Jules and Chris push back on critics who note that few forms of anonymization or de-identification will 100% ensure that data can never be re-identified. However, we aren't seeking guarantees. We are seeking strong protections to address privacy concerns. The fact that we can't have a 100% guarantee that data won't be re-linked should not undermine the value of de-identification or anonymization processes. As Jules and Chris note, the appropriateness of a specific anonymization practice will depend on the circumstances. Once again, I couldn't agree more on this point, and NAI will be hosting a Privacy Enhancing Technology workshop in February to explore just these issues.
Moving on to the next part of their framework, they discuss the issue of context, particularly the context in which personally identifiable information (PII) is collected. This is a very important point – one that is the focus of the NAI self-regulatory Code of Conduct. Currently no NAI member collects PII to be used for IBA. It simply isn't necessary, and in fact NAI members take extraordinary efforts to avoid collecting PII and merging it with data used for IBA. The Code creates incentives to NOT collect PII.
Jules and Chris are clear that you should be “transparent about data use.” Transparency is a core principle underlying the NAI Code of Conduct. We believe transparency is essential to promoting privacy and good decision-making. Again, it may depend upon the uses and sensitivity of the data. At NAI, for example, we place a very strong emphasis on transparency around health-related information, whether it meets our definition of sensitive health data or is information about any segment that touches the human body.
The article goes on to discuss accountability. Accountability is music to my ears and is the keystone of the NAI's self-regulatory framework. You must have accountability in any framework or privacy paradigm. Accountability mechanisms must be strong, visible, and backed up with tough enforcement.
Jules and Chris close their article by discussing the development of codes of conduct. Here again, this is in complete alignment with our thinking. NAI dates back to 2000, and we have one of the original self-regulatory codes of conduct addressing privacy. Over the past several years, the NAI framework has been updated twice; just last year we added new principles governing the collection and use of data across mobile applications.
In any area where business models and technology are rapidly evolving, such as online advertising and the Internet of Things, codes of conduct allow for a flexible application of the Fair Information Practice Principles and other principles, including context, transparency, and use limitations.
This article is entirely consistent with my personal philosophy on how we should address privacy, not only for the future but for today's privacy challenges. I applaud the fact that Jules and Chris did not emphasize the status of an entity as a "first party" or "third party." I believe this reflects the reality that as we head into a world of the Internet of Things, the focus on first party versus third party, which was prevalent in 2005, is becoming far less valuable and far less important in this debate. As was discussed at the FTC's workshop, it is becoming increasingly difficult for a consumer to determine just who is a first party and who is a third party in any given context, particularly when we'll be talking about hundreds of interconnected devices in this new Internet of Things.
I completely agree with the premise of the article that we need to focus more on use, transparency, and accountability, as well as anonymization, as we try to adapt traditional notions of privacy to rapidly evolving business models, technologies, and policy challenges – whether those challenges involve the Internet of Things or what have been considered more traditional issues around online advertising.