Blog

Submitted by NAI on February 7, 2019

Privacy has been in the news a lot this past year, and for good reason. Two recent initiatives, GDPR and the California Consumer Privacy Act (CCPA), reflect increased awareness of data collection practices and users' desire for greater transparency and control. However, in today's digitally driven economy, "one size fits all" legislation will most likely not be the final answer. Like all great transformational movements, it will take an evolutionary process to find the right balance between innovative services and increased user controls. To help make sense of what's happening now, as well as some areas where additional attention is required, we caught up with Kevin Ching, SVP of Product and Data Strategy for NinthDecimal, who joined the NAI Board of Directors earlier this year, for a Q&A about several key issues and topics, including what's driving the data and privacy conversation and how it's shaping the advertising and marketing ecosystem.

 

Q: There is currently a lot of attention related to digital privacy. What's driving it, and why do you feel it's capturing the spotlight now?

Over the past year there have been several significant developments that have led to increased attention around consumer data and privacy. Most notable is the European Union (EU) General Data Protection Regulation (GDPR), which established new rules to enhance data protection and privacy for residents of the EU and the European Economic Area. As the GDPR took effect in May 2018, data and privacy became front-and-center for consumers, legislators, and the marketing and advertising industry.

In addition, recent news about potential data misuse by high-profile companies here in the U.S. shined a spotlight on the issue and its potential impact on U.S. citizens. That coverage was a major driving force behind enactment of the California Consumer Privacy Act (CCPA) in June of last year. The CCPA is currently scheduled to take effect in 2020. It requires enhanced disclosures from companies about what consumer data they collect and what they do with that data, and, more importantly, like the GDPR, it gives consumers more control over that data.

These new landmark regulations have raised important questions within the industry about how to adapt to two very expansive but differing sets of rules, and how to balance their rigid requirements against the benefits for consumers and for small businesses that rely on the collection of consumer data to drive their businesses.

 

Q: Are recent revelations of data misuse and new regulations directly linked, or is this just coincidental?

While recent events have definitely elevated the discussion of consumer privacy to a whole new level, privacy has been, and will continue to be, essential for citizens and a priority for most of the industry – no matter where you fall within the advertising and marketing ecosystem.

In fact, as data becomes a more integral part of personalized marketing, companies have already moved to provide greater safeguards to ensure proper use of consumer data. Leading organizations like the NAI are also emerging to help ensure data compliance and transparency.

 

Q: How consistent are the regulations imposed by the GDPR and CCPA, and how are they different?

While each initiative has different components and levels of control, both are focused on two primary goals. First, to give consumers more insight into what type of information is being collected and how it's being shared. Second, to create an easy process for consumers to make informed decisions about sharing data. So, it's fair to say that both of these new policies promote enhanced notice and control, but they go about it in very different ways—the GDPR inherently focuses more on the use of data, whereas the CCPA focuses on the sharing of data.

 

Q: Do these regulations reflect a disconnect between the advertising and marketing industry, consumers and legislators, and what does this mean?

Unfortunately, yes. In the grand scheme of things, the digital economy is only a few decades old, and it's moving so fast that technology evolution has outpaced public education, or "data transparency." But it's important to remember that we live in a market economy that is supported heavily by advertising. Therefore, doing away with advertising as we know it is likely not only to alter current business models but also to have unintended consequences for consumers, such as eliminating much of the ad-supported content and free services that consumers enjoy today, email and news sources for example. This is a side effect that many consumers, and even policymakers, are not aware of. As an industry, we need to do a better job of explaining this tradeoff and helping to craft and implement policies where privacy and a robust internet ecosystem are not mutually exclusive.

I also want to point out that an increasing majority of companies rely on data to improve the customer experience. We shouldn't lose sight of the fact that there are significant benefits to consumers from data-driven innovation. Take for example online shopping. Think about the ability to save things in a "wish list" or "cart," or when vendors use data to make customized recommendations based on historical purchases. Most consumers probably don't realize that these are among the most common uses of internet data collection. Nor do consumers likely consider the benefit of customized informational services, such as weather predictions that will provide an accurate, timely forecast based on where you are. These types of services are only made possible by sharing of relevant data. And, of course, there are also benefits to data-driven advertising. In addition to trying to pair an advertisement to something that is relevant to a consumer, there are instances when companies use data to limit the number of ads that someone sees. There are a lot of benefits to data collection, and a wide range of data types. It is more important now than ever that we as an industry find the right balance and make sure consumers know what is being collected, and that we make it easier for them to exercise control over how it's being used.

To create this balance, it's important that consumers, policymakers and industry representatives work together to ensure alignment. Transparency is paramount to that success. Without it, we won't be able to create a foundation of trust between consumers and the advertising/marketing industry.

 

Q: What does recent legislation such as the CCPA mean for companies like NinthDecimal that rely heavily on the use of location data?

Recent legislation such as the CCPA is not written specifically with location data in mind. It focuses much more broadly on consumer data and advertising in general, and it's applicable to virtually every company that collects user information.

That said, location data companies that are making data privacy compliance a priority and are proactively building a privacy-friendly infrastructure will be able to adapt to new industry standards or legislation more quickly and effectively than those that are not. At NinthDecimal, we've always been committed to the highest standards in privacy. In 2011, for example, we helped establish privacy guidelines that are widely adopted by the mobile ecosystem. We've also been implementing Privacy by Design as a standard for building products since we launched Location Graph seven years ago. And even though we are a U.S.-only company, we have undergone a thorough internal GDPR audit and meet all of our partners' compliance standards.

 

Q: Do you see more changes coming? If so, what and to what extent?

I do think that the CCPA will lead other states to adopt similar legislation, and it has opened the door for a national privacy standard. To continue building toward a well-balanced advertising ecosystem that also benefits consumers, it will be important for us to constantly monitor and reevaluate what is working and what needs to be adjusted.

As I mentioned earlier, continued collaboration between the advertising industry and policymakers, along with key learnings from recently implemented data management policies, will be key to building the kind of consumer trust that leads to a thriving ad-supported economy that also benefits consumers.

 

About Kevin Ching

Kevin has an extensive track record of successfully implementing and growing privacy-friendly mobile and data businesses globally. He has been recognized for co-developing AdChoices, a self-regulatory program for online interest-based advertising, for the mobile ecosystem. NinthDecimal was one of the first companies to offer a "mobile ad choices" icon and opt-out ad option in 2012. Kevin is also a recipient of the IAB 2017 Service Excellence Award for his work on the Mobile Location Data Guide for Publishers. Earlier in his career, Kevin's contributions helped NinthDecimal win the North American Frost & Sullivan New Product Innovation Award for the company's launch of its audience intelligence platform, Location Graph.

NinthDecimal has been a member of the NAI since 2015.

Submitted by NAI on December 14, 2018

Author: David LeDuc

In a recent article, “Your Apps Knew Where You Were Last Night, and They’re Not Keeping It Secret,” the New York Times raised serious concerns about the collection and use of precise location information by mobile apps. Those concerns deserve close attention from all parties in the mobile ecosystem, including the advertising technology industry.

The ad tech industry, including NAI member companies, is integral to the innovation and benefits provided by mobile apps. Partnering with app publishers, a number of NAI member companies offer services that unlock significant benefits from users’ location data, fueling the rich and expanding app ecosystem. The location-specific features provided by apps have transformed how we interact with our environments. Users rely on their mobile devices to navigate to their destinations, find nearby services, and receive local information, all in real time, wherever they go. The advertising technology industry even powers systems that enable the delivery of location-specific severe weather alerts and missing children alerts, which are especially important as fewer people receive their information from traditional sources such as television or radio.

Of course, location data is sensitive to users, and it has the potential to reveal specific personal details, so it should be collected and used responsibly by all parties in the diverse mobile ecosystem, including app publishers, operating systems, software developers, and ad tech companies. In many cases, however, limitations in the consent mechanisms for location data provided within mobile operating systems make it difficult or impossible for apps or software developers to modify the messaging provided to users, which makes it challenging to clearly explain an app’s data collection and use practices. All entities, therefore, have to work together to provide effective transparency and control for consumers.

Transparency and control for users is a fundamental pillar of the NAI Code of Conduct. To that end, the NAI has worked for years to provide an environment that enables the responsible use of location data, beginning with a set of requirements established in our original Mobile Application Code in 2013. Today, the 2018 NAI Code of Conduct (Code) requires member companies that obtain data, including the precise location information shared by applications, to adhere to a robust set of privacy protections. Member companies’ practices are also subject to annual compliance reviews by NAI staff. Specifically, the NAI Code establishes a number of privacy protections, including:

  • Notice: The NAI requires member companies to provide clear, prominent, and meaningful notice regarding their data collection and use practices, including location data. In addition, NAI members must work to ensure that mobile applications which collect and share this data provide similar notices to users, although ultimately mobile applications are responsible for the disclosures they provide on their own properties.

  • Opt-in Consent: The NAI requires member companies seeking to use precise location data for Personalized Advertising to obtain either (i) a user’s opt-in consent; or (ii) reasonable assurances that the app collecting the data has obtained opt-in consent before doing so.

  • Use Limitations for Location Information: NAI members may not use location information collected for Personalized Advertising, or allow it to be used, to determine an individual’s eligibility for employment, credit, health care, or insurance.

  • Limitations on Re-identification: NAI members generally do not associate location information with any information that identifies a particular individual for purposes of Personalized Advertising. If an NAI member sought to associate location data it had already received from an app with an identified individual for Personalized Advertising, it would first need to directly obtain that user’s separate opt-in consent. Similarly, an NAI member also may not transfer location information to a third party unless it first obtains a contractual guarantee that the third party will not attempt to re-identify the user for Personalized Advertising without the user’s separate opt-in consent.

  • Sensitive Data: The NAI does not permit member companies to make inferences about sensitive health conditions, pregnancy termination, or sexuality for Personalized Advertising purposes without a user’s opt-in consent. This restriction applies equally to app interaction data, web browsing data, and precise location data.

The New York Times article raises concerns about the possibility that anyone could obtain pseudonymous location data, linked only to advertising IDs, and connect that information with an identified individual. While this is a technical possibility in some circumstances, the NAI Code and compliance program help ensure that NAI members, a number of whom were mentioned in the article, do not engage in such activities without a user’s permission.

Despite the privacy protections currently required by the Code, the New York Times article raises serious concerns about the adequacy of the current notice and choice protections offered for the use of location data. Mobile applications that collect a user’s data and share it with advertising technology companies must inform users about the nature of their data collection, use, and sharing. As noted, while the specific implementation of this notice is frequently outside the control of ad tech companies, there are additional privacy-protective business practices that the NAI recommends to help alleviate the concerns raised in the article.

First, NAI members are encouraged to use general or “imprecise” location instead of precise location data where possible. NAI considers location data to be imprecise when it cannot be used to determine with reasonable specificity the actual physical location of a person or device. In general, NAI considers location data to be imprecise if it cannot be used to locate a person or device within a 500-meter radius, which is roughly the length of five football fields in any direction. The NAI has issued specific and detailed guidance on how members may render location information imprecise (one possible coarsening approach is sketched below). As a result, many NAI members retain only imprecise location data or interest segments derived from such data, such as “coffee shop visitor,” rather than the underlying coordinates themselves.
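To make the 500-meter threshold concrete, here is a minimal, hypothetical sketch of one common way to coarsen coordinates: snapping a precise latitude/longitude to a grid whose cells are large enough that the reported point cannot pinpoint a device within that radius. This is an illustration only, not the NAI’s published guidance; the function name and grid-snapping approach are assumptions.

```python
import math

# Hypothetical illustration only: coarsen a precise lat/lon so the reported
# point cannot locate a device within a 500-meter radius. This is NOT the
# NAI's published method; grid snapping is just one common approach.

EARTH_RADIUS_M = 6_371_000   # mean Earth radius, in meters
IMPRECISE_RADIUS_M = 500     # the threshold discussed in the guidance above


def coarsen_location(lat: float, lon: float, radius_m: float = IMPRECISE_RADIUS_M):
    """Snap (lat, lon) to the center of a grid cell at least 2 * radius_m wide."""
    # Degrees of latitude per meter (1 degree of latitude is ~111.32 km)
    deg_per_m_lat = 360.0 / (2 * math.pi * EARTH_RADIUS_M)
    # Degrees of longitude per meter grow as you move away from the equator
    deg_per_m_lon = deg_per_m_lat / max(math.cos(math.radians(lat)), 1e-6)

    cell_lat = 2 * radius_m * deg_per_m_lat   # grid spacing, in degrees
    cell_lon = 2 * radius_m * deg_per_m_lon

    # Snap to the center of the containing cell so the original point could
    # lie anywhere within +/- radius_m of the reported value.
    coarse_lat = (math.floor(lat / cell_lat) + 0.5) * cell_lat
    coarse_lon = (math.floor(lon / cell_lon) + 0.5) * cell_lon
    return round(coarse_lat, 6), round(coarse_lon, 6)


if __name__ == "__main__":
    # Example: a precise point in downtown San Francisco
    print(coarsen_location(37.774929, -122.419416))
```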

Second, the NAI also encourages members to integrate just-in-time or enhanced notice provisions in the applications from which they obtain precise location data, in order to clarify the use of such data for advertising, such as the interstitial notice provided in the GasBuddy app referenced in the article.

Finally, companies should implement responsible data retention limits, retaining data, including precise location data, for only as long as necessary to fulfill a legitimate business need (a simple retention sweep is sketched below).
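As a purely illustrative sketch of such a retention limit, the snippet below drops location records older than a configurable window. The 30-day figure, record layout, and function name are assumptions for illustration, not an NAI or member requirement.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical illustration of a retention sweep: the 30-day window, record
# layout, and function name are assumptions, not an NAI requirement.

RETENTION_WINDOW = timedelta(days=30)


def purge_expired(records, now=None, window=RETENTION_WINDOW):
    """Return only the records still within the retention window.

    Each record is assumed to be a dict with a timezone-aware
    'collected_at' datetime plus (possibly precise) location fields.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - window
    return [r for r in records if r["collected_at"] >= cutoff]


if __name__ == "__main__":
    sample = [
        {"collected_at": datetime.now(timezone.utc) - timedelta(days=45),
         "lat": 37.77, "lon": -122.42},
        {"collected_at": datetime.now(timezone.utc) - timedelta(days=5),
         "lat": 40.71, "lon": -74.01},
    ]
    print(purge_expired(sample))  # keeps only the 5-day-old record
```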

As mobile devices grow more capable, and become a bigger part of our daily lives, the NAI is actively exploring opportunities, through additional guidance or changes to our Code, to ensure that users have better notice and more control regarding the sharing and use of their data, including precise location data. For example, changes could include codification of some of the best practices highlighted above, expanding NAI notice and choice requirements to also apply to real-time contextual use of location data, and expanding requirements to other sensors on mobile devices.

The NAI also supports federal privacy legislation that would establish a national privacy framework to ensure that all companies, not just those that have already committed to strict industry self-regulation, are proactive about consumer privacy protection. Such a framework should leverage existing self-regulation, like the NAI Code and compliance program, and strengthen the FTC’s enforcement capabilities to ensure bad actors are subject to strict penalties.

Submitted by Charlotte Kress on September 17, 2018

On Thursday, September 13th, the FTC began its Hearings on Competition and Consumer Protection in the 21st Century. It was the first in a series of hearings on whether recent economic, business, technological, or international developments require changes to the FTC’s competition and consumer protection enforcement priorities. The series will provide the FTC with a range of viewpoints to evaluate current enforcement and policy in light of today’s digital landscape.

The focal points of the discussion were antitrust law, consumer protection, and consumer data regulation. Across all three categories, panelists highlighted the importance of clear policy goals to ensuring industry compliance, and supported the FTC’s endorsement of self-regulation.

The following is a summary of the points made by panelists in each of the three discussions:

A. Antitrust Law:

The first topic of discussion was the current landscape of competition and consumer protection law and policy, with a primary focus on the current state of antitrust law. Panelists cautioned against a return to the populist antitrust theories of the 1970s, stating that a reliance on simple market concentration would be harmful to innovation. Antitrust law should not be used as a tool to dismantle the giant technology companies simply because of amorphous concepts of bigness and fairness.

Absent a legitimate antitrust concern, antitrust law is poorly designed to combat social issues like privacy, which are better suited to other forms of enforcement. The discussion clearly emphasized that politically or socially motivated enforcement is a misuse of antitrust law. Instead, antitrust law should focus on providing an adaptable framework and economic concepts for changing market conditions, and it should be sufficiently flexible to apply across a broad range of industries. Such policy should be shaped by sophisticated data rather than broad, aggregated industry data.

Moreover, the FTC should continue its effort to harmonize international antitrust substance and procedure, such as by establishing a multilateral framework for antitrust cases. As globalization of antitrust continues, the FTC may need to go beyond soft international guidance to establish a basis for evaluating the actual implementation of such guidance.

B. Consumer Protection:

To help new entrants and multinational companies navigate the global patchwork of consumer protection and privacy laws in the U.S., regulatory guidance must clearly establish the rights of consumers and businesses. Harm to consumers should be interpreted to include the harms contemplated by common law privacy torts because of the abstract nature of the injuries that are made possible by emerging technology. The FTC should support self-regulatory efforts to enhance its mission of fighting fraud, deception, and unfair business practices. Self-regulation and industry standards incentivize businesses to do the right thing while remaining competitively viable. Further, self-regulation is cost-efficient for the FTC (the NAI was specifically highlighted as an example of strong self-regulation).

Consumer education is essential for navigating new marketplaces, and will ensure that the reasonable person standard is not diluted in light of unfamiliar, emerging markets. Similarly, the FTC should consider allocating more resources to its technological capacity, such as by adding a technology department to the agency.

C. Consumer Data Regulation:

The FTC must establish clear goals, values, and implementation measurements for privacy and data security policy. These policies must balance consumer sovereignty and privacy with marketplace efficiencies and the tremendous benefits that the digital economy has provided to consumers. Though the panelists disagreed as to what constitutes a privacy harm, each of them stressed the importance of articulating the particular harm that a policy is meant to prevent. Some thought that harm should be broadly interpreted to include risk and breach of consumer expectations, while others believed that the FTC should only involve itself when there is a substantial harm to consumers.

The panelists also compared U.S. privacy and consumer protection law to European law. The U.S. sectoral approach to privacy lacks consistency and business efficiencies compared to the EU’s comprehensive privacy law, meaning that it is difficult to clearly articulate how data is protected in the U.S. because protection often depends on who holds the data and what the data entails. The haphazard nature of this approach has led California and the EU to take the helm of privacy regulation. Today, businesses focus their compliance efforts primarily on the GDPR and various California laws, rather than looking to U.S. law more broadly for how to build privacy programs and practices. However, commitments to privacy in the EU are typically only met when there is strong enforcement of those commitments. It remains to be seen whether European data protection authorities will truly be able to enforce the GDPR against non-compliant businesses that are primarily located in the U.S.

The FTC has an opportunity to strike a balance between the less permissive European laws and the current scattered approach to data regulation in the U.S. The agency is well-positioned to do so by engaging the public in these hearings to thoughtfully and creatively keep pace with competition and consumer protection enforcement and policy in the 21st century.

Submitted by Grant Nelson on April 23, 2018

The move from traditional television to smart and connected televisions is accelerating. Few televisions sold today lack some smart or connectivity feature, and digital advertisers and regulators alike have taken notice. With the growth of data-enabled television come privacy concerns, so the NAI convened a working group to develop standards for NAI members that demonstrate responsible use of data.

Draft Guidance

Now, the NAI’s Advanced TV Working Group has published a draft for comment entitled “Guidance for Members: Viewed Content Advertising.” The draft is the result of many meetings, redlines, compromises, and discussions with privacy organizations, and we are proud of the hard work of each participant.

This guidance does not weaken any existing guidance. It clarifies that the NAI’s Cross-Device Linking guidance applies to data collected from televisions as devices. It embraces the trend of advanced TV OSes converging with existing mobile operating systems. The Viewed Content Advertising guidance requires opt-in consent for technologies that collect all or nearly all of the viewing activity on a television, implementing the core principle enumerated in Vizio.

What Now?

The 24 members of the Working Group have crafted every word carefully, but no one can see the whole picture. We invite NAI member companies to read the guidance and provide their constructive feedback via Grant Nelson (grant AT networkadvertising.org) before June 12th. The NAI has shared this and previous drafts with several privacy organizations and looks forward to reading the suggestions for improvement.

Read the finalized Guidance.