

Submitted by NAI on December 14, 2018

Author: David LeDuc

In a recent article, Your Apps Know Where You Were Last Night, and They’re Not Keeping It Secret, The New York Times raised serious concerns about the collection and use of precise location information by mobile apps, concerns that deserve close attention from all parties in the mobile ecosystem, including the advertising technology industry.

The ad tech industry, including NAI member companies, is integral to the innovation and benefits provided by mobile apps. Partnering with app publishers, a number of NAI member companies offer services built on users’ location data, fueling the rich and expanding app ecosystem. The location-specific features provided by apps have transformed how we interact with our environments. Users rely on their mobile devices to navigate to their destinations, find nearby services, and receive local information, all in real time, wherever they go. The advertising technology industry even powers systems that deliver location-specific severe weather alerts and missing-children alerts, which are especially important as fewer people receive their information from traditional sources such as television or radio.

Of course, location data is sensitive to users, and it has the potential to reveal specific personal details, so it should be collected and used responsibly by all parties in the diverse mobile ecosystem, including app publishers, operating systems, software developers, and ad tech companies. In many cases, however, limitations in the consent mechanisms for location data provided within mobile operating systems make it difficult or impossible for apps or software developers to modify the messaging provided to users, which makes it challenging to clearly explain an app’s data collection and use practices. All entities, therefore, have to work together to provide effective transparency and control for consumers.

Transparency and control for users is a fundamental pillar of the NAI Code of Conduct. To that end, the NAI has worked for years to provide an environment that enables the responsible use of location data, beginning with a set of requirements established in our original Mobile Application Code in 2013. Today, the 2018 NAI Code of Conduct (Code) requires member companies that obtain data, including the precise location information shared by applications, to adhere to a robust set of privacy protections. Member companies’ practices are also subject to annual compliance reviews by NAI staff. Specifically, the NAI Code establishes a number of privacy protections, including:

  • Notice: The NAI requires member companies to provide clear, prominent, and meaningful notice regarding their data collection and use practices, including for location data. In addition, NAI members must work to ensure that mobile applications that collect and share this data provide similar notices to users, although ultimately mobile applications are responsible for the disclosures they provide on their own properties.

  • Opt-in Consent: The NAI requires member companies seeking to use precise location data for Personalized Advertising to obtain either (i) a user’s opt-in consent; or (ii) reasonable assurances that the app collecting the data has obtained opt-in consent before doing so.

  • Use Limitations for Location Information: NAI members may not use location information collected for Personalized Advertising, or allow it to be used, to determine an individual’s eligibility for employment, credit, health care, or insurance.

  • Limitations on Re-identification: NAI members generally do not associate location information with any information that identifies a particular individual for purposes of Personalized Advertising. If an NAI member sought to associate location data it had already received from an app with an identified individual for Personalized Advertising, it would first need to directly obtain that user’s separate opt-in consent. Similarly, an NAI member also may not transfer location information to a third party unless it first obtains a contractual guarantee that the third party will not attempt to re-identify the user for Personalized Advertising without the user’s separate opt-in consent.

  • Sensitive Data: The NAI does not permit member companies to make inferences about sensitive health conditions, pregnancy termination, or sexuality for Personalized Advertising purposes without a user’s opt-in consent. This restriction applies equally to app interaction data, web browsing data, and precise location data.

The New York Times article raises concerns about the possibility that anyone could obtain pseudonymous location data, linked only to advertising IDs, and connect that information with an identified individual. While this is a technical possibility in some circumstances, the NAI Code and compliance program help ensure that NAI members, a number of whom were mentioned in the article, do not engage in such activities without a user’s permission.

Despite the privacy protections currently required by the Code, the New York Times article raises serious concerns about the adequacy of the current notice and choice protections offered for the use of location data. Mobile applications that collect a user’s data and share it with advertising technology companies must inform users about the nature of their data collection, use, and sharing. As noted, while the specific implementation of this notice is frequently outside the control of ad tech companies, there are additional privacy-protective business practices that the NAI recommends to help alleviate the concerns raised in the article.

First, NAI members are encouraged to use general or “imprecise” location data instead of precise location data where possible. The NAI considers location data to be imprecise when it cannot be used to determine with reasonable specificity the actual physical location of a person or device. In general, the NAI considers location data imprecise if it cannot be used to locate a person or device within a 500-meter radius, roughly the length of five football fields in any direction. The NAI has issued specific and detailed guidance on how members may render location information imprecise. As a result, many NAI members retain only imprecise location data or interest segments derived from such data, such as “coffee shop visitor,” rather than the underlying coordinates themselves.
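To make the idea concrete, here is a minimal, purely illustrative sketch of one common coarsening technique: truncating coordinate precision. The NAI guidance does not prescribe this (or any) specific algorithm, and the two-decimal threshold below is an assumption chosen only because 0.01 degrees of latitude corresponds to roughly 1.1 km, comfortably beyond a 500-meter radius.

```python
def make_imprecise(lat: float, lon: float, decimals: int = 2) -> tuple:
    """Truncate coordinates to `decimals` decimal places.

    Illustrative sketch only. Two decimal places keep roughly 1.1 km of
    granularity in latitude (and somewhat less in longitude away from the
    equator), so the truncated point cannot pinpoint a device within a
    500-meter radius.
    """
    factor = 10 ** decimals
    # int() truncates toward zero, snapping the point to a coarse grid cell.
    return (int(lat * factor) / factor, int(lon * factor) / factor)

# Precise coordinates near a specific address...
precise = (40.741895, -73.989308)
# ...collapse into a grid cell shared by many nearby locations.
coarse = make_imprecise(*precise)
print(coarse)  # (40.74, -73.98)
```

In practice, companies might instead use geohash prefixes or city/postal-code lookups; the common thread is discarding the low-order precision before storage, so the original coordinates are never retained.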

Second, the NAI encourages members to integrate just-in-time or enhanced notice in the applications from which they obtain precise location data, in order to clarify the use of such data for advertising, such as the interstitial notice provided in the GasBuddy app referenced in the article.

Finally, companies should implement responsible data retention limits, retaining data, including precise location data, only for as long as necessary to fulfill a legitimate business need.

As mobile devices grow more capable, and become a bigger part of our daily lives, the NAI is actively exploring opportunities, through additional guidance or changes to our Code, to ensure that users have better notice and more control regarding the sharing and use of their data, including precise location data. For example, changes could include codification of some of the best practices highlighted above, expanding NAI notice and choice requirements to also apply to real-time contextual use of location data, and expanding requirements to other sensors on mobile devices.

The NAI also supports federal privacy legislation that would establish a national privacy framework to ensure that all companies, not just those that have already committed to strict industry self-regulation, are proactive about consumer privacy protection. Such a framework should leverage existing self-regulation, like the NAI Code and compliance program, and strengthen the FTC’s enforcement capabilities to ensure bad actors are subject to strict penalties.

Submitted by Charlotte Kress on September 17, 2018

On Thursday, September 13, the FTC began its Hearings on Competition and Consumer Protection in the 21st Century. It was the first in a series of hearings on whether recent economic, business, technological, or international developments require changes to the FTC’s competition and consumer protection enforcement priorities. The series will provide the FTC with a range of viewpoints to evaluate current enforcement and policy in light of today’s digital landscape.

The focal points of the discussion were antitrust law, consumer protection, and consumer data regulation. Across all three categories, panelists highlighted the importance of clear policy goals to ensuring industry compliance, and supported the FTC’s endorsement of self-regulation.

The following is a summary of the points made by panelists in each of the three discussions:

A. Antitrust Law:

The first topic of discussion was the current landscape of competition and consumer protection law and policy, with a primary focus on the current state of antitrust law. Panelists cautioned against a return to the populist antitrust theories of the 1970s, stating that reliance on simple market concentration would be harmful to innovation. Antitrust law should not be used as a tool to dismantle giant technology companies simply because of amorphous concepts of bigness and fairness.

Absent a legitimate antitrust concern, antitrust law is poorly designed to combat social issues like privacy, which are better suited to other forms of enforcement. The discussion clearly emphasized that politically or socially motivated enforcement is a misuse of antitrust law. Instead, antitrust law should focus on providing an adaptable framework and economic concepts for changing market conditions, a framework sufficiently flexible to apply across a broad range of industries. Such policy should be shaped by sophisticated data rather than broad, aggregated industry data.

Moreover, the FTC should continue its effort to harmonize international antitrust substance and procedure, such as by establishing a multilateral framework for antitrust cases. As globalization of antitrust continues, the FTC may need to go beyond soft international guidance to establish a basis for evaluating the actual implementation of such guidance.

B. Consumer Protection:

To help new entrants and multinational companies navigate the patchwork of consumer protection and privacy laws in the U.S. and around the globe, regulatory guidance must clearly establish the rights of consumers and businesses. Harm to consumers should be interpreted to include the harms contemplated by common law privacy torts because of the abstract nature of the injuries made possible by emerging technology. The FTC should support self-regulatory efforts to enhance its mission of fighting fraud, deception, and unfair business practices. Self-regulation and industry standards incentivize businesses to do the right thing while remaining competitively viable. Further, self-regulation is cost-efficient for the FTC (the NAI was specifically highlighted as an example of strong self-regulation).

Consumer education is essential for navigating new marketplaces, and will ensure that the reasonable person standard is not diluted in light of unfamiliar, emerging markets. Similarly, the FTC should consider allocating more resources to its technological capacity, such as by adding a technology department to the agency.

C. Consumer Data Regulation:

The FTC must establish clear goals, values, and implementation measurements for privacy and data security policy. These policies must balance consumer sovereignty and privacy with marketplace efficiencies and the tremendous benefits that the digital economy has provided to consumers. Though the panelists disagreed as to what constitutes a privacy harm, each of them stressed the importance of articulating the particular harm that a policy is meant to prevent. Some thought that harm should be broadly interpreted to include risk and breach of consumer expectations, while others believed that the FTC should only involve itself when there is a substantial harm to consumers.

The panelists also compared U.S. privacy and consumer protection law to European law. The U.S. sectoral approach to privacy lacks the consistency and business efficiencies of the EU’s comprehensive privacy law, making it difficult to clearly articulate how data is protected in the U.S., because protection often depends on who holds the data and what the data entails. The haphazard nature of this approach has led California and the EU to take the helm of privacy regulation. Today, businesses focus their compliance efforts primarily on the GDPR and various California laws, rather than looking to U.S. law more broadly for how to build privacy programs and practices. However, commitments to privacy in the EU are typically only met when there is strong enforcement of those commitments. It remains to be seen whether European data protection authorities will truly be able to enforce the GDPR against non-compliant businesses that are primarily located in the U.S.

The FTC has an opportunity to strike a balance between the less permissive European laws and the current scattered approach to data regulation in the U.S. The agency is well-positioned to do so by engaging the public in these hearings to thoughtfully and creatively keep pace with competition and consumer protection enforcement and policy in the 21st century.

Submitted by Grant Nelson on April 23, 2018

The move from traditional television to smart and connected televisions is accelerating. Few televisions sold today lack some smart or connectivity feature, and digital advertisers and regulators alike have taken notice. With the growth of data-enabled television come privacy concerns, and the NAI convened a working group to develop standards by which NAI members can demonstrate responsible use of data.

Draft Guidance

Now, the NAI’s Advanced TV Working Group has published a draft for comment entitled “Guidance for Members: Viewed Content Advertising.” The draft is the result of many meetings, redlines, compromises, and discussions with privacy organizations and we are proud of the hard work of each participant.

This guidance does not weaken any existing guidance. It clarifies that the NAI’s Cross-Device Linking guidance applies to data collected from televisions as devices. It embraces the trend of advanced TV operating systems converging with existing mobile operating systems. The Viewed Content Advertising guidance requires opt-in consent for technologies that collect all or nearly all of the viewing activity on a television, implementing the core principle enumerated in the FTC’s Vizio settlement.

What Now?

The 24 members of the Working Group have crafted every word carefully, but no one can see the whole picture. We invite NAI member companies to read the guidance and provide constructive feedback via Grant Nelson (grant AT before June 12th. The NAI has shared this and previous drafts with several privacy organizations and looks forward to reading their suggestions for improvement.

Read the finalized Guidance.

Submitted by Matt Nichols on April 9, 2018

In late 2017, the NAI was given the opportunity to apply for a pilot survey program in order to run opinion polls and market research on internet users. With this chance to learn more about consumer opinions, we sent out a survey that obtained the responses of 10,000 U.S. consumers to find out more about what they think about online privacy, digital advertising, the ad-supported internet, and ad blocking. The survey was conducted January 29th to February 1st, 2018. 

NAI’s takeaways from the survey results

Our survey’s first question establishes the general level of concern respondents have about their privacy on the Internet. Whether the responses can be attributed to recent high-profile data breaches or to the growing national conversation surrounding privacy, 85% of respondents said they were at least “somewhat concerned” about their privacy. Further, 50% of responses indicate that consumers are either “very” or “extremely” concerned about their privacy, while 14% indicated that privacy was not a concern “at all.” These results show a continuing variety of attitudes about online privacy, but we must address the majority in the middle who are at least “somewhat concerned” about their privacy. While this first question establishes that privacy is a concern for most respondents, subsequent questions and responses further clarify that concern.

The survey’s second question asks respondents to share what they felt was the primary reason for their privacy concern on the internet: 56% indicated that hackers were their top concern, and a combined 15% said that data collection by either the U.S. or a foreign government was their top concern. As a whole, concerns about data collection by hackers or government entities account for 72% of responses to this question. Another 8% of users were most concerned about website and application publishers collecting data, and 7% stated that data collection by advertising companies was their primary concern.

The third question then shifts to help us better understand how consumers believe their access to online content should be financed. The results show overwhelmingly that respondents prefer their online content to be paid for by “Advertising” (67%), and interestingly this response was largely consistent across all age groups. Combined with the 17% of respondents indicating a preference for a “Donations” model, these two responses account for 84% of all responses, showing a clear aversion among respondents to paying directly for their online content. In fact, only 15% of responses indicated a preference for a subscription or microtransaction model, a figure that interestingly parallels the 15% of respondents who previously indicated that their biggest privacy concern was AdTech companies and online publishers.

Responses to the first three questions show individuals’ concern for their online privacy, and while websites and AdTech play a role in this concern, albeit a minor one compared to that of governments and hackers, question four adds further insight regarding choice. When asked who should decide whether a consumer is opted out of targeted advertising, respondents largely prefer to be in control of that decision themselves, with 79% indicating that “Individuals” should be in control. Interestingly, only 10% of respondents indicated that they prefer their browser to make such decisions on their behalf.

The survey results reveal that while some privacy concerns are associated with AdTech companies, these concerns are not nearly as significant as those associated with hackers and government surveillance. At the same time, the internet is largely ad-supported, and whether or not they are aware of this, U.S. consumers prefer their internet to remain ad-supported and show a clear disinterest in having their content made available only through subscriptions or microtransactions. And when consumers are confronted with potential privacy-enhancing measures, our survey shows that they want to make the choice themselves. This is a stern rebuttal to device and browser manufacturers, as well as governments, making privacy decisions on consumers’ behalf.

Finally, while ad blocking is sometimes seen as evidence that consumers are taking privacy into their own hands, the final question of the survey shows that ad blockers are not primarily used as a privacy tool, but rather because consumers find ads annoying, or because ads slow page loads and increase data usage.

We hope this survey, and its accompanying results, serve as a catalyst for discourse on not only our industry, but also the NAI’s role as a leading self-regulatory association. 

Full survey results can be found here.

"This research was made possible by Google Surveys, which donated use of its online survey platform. The questions and findings are solely those of the researchers and not influenced by any donation. For more information on the methodology, see the Google Surveys Whitepaper."