Digital Life Initiative

Privacy Policies as Contextual Integrity: Beyond Rules Compliance

Updated: Mar 28

By Yan Shvartshnaider (York University), Madelyn Rose Sanfilippo (University of Illinois at Urbana-Champaign), and Noah Apthorpe (Colgate University)

There is wide scholarly agreement that privacy policies do not adequately inform users about companies’ information practices. Privacy policies are typically too long for rapid comprehension yet remain incomplete descriptions of company behavior. They famously use ambiguous terms that can be confusing, vague, and misleading to the reader (Reidenberg et al., 2015). These documents are so disconnected from any common understanding of privacy that there have been calls to strip them of the “privacy” title and rename them data management policies, because their content does not meaningfully communicate privacy implications.

Recent privacy and data regulation, such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), forces companies to adapt their behavior and rewrite their privacy policies or face strict penalties. Clear language and transparency are required. However, due to a lack of concrete implementation guidelines, companies’ interpretations of the rules vary widely and are more often grounded in risk management concerns than in consumer protection interests.

Although many companies ostensibly strive to comply with the GDPR and other regulations, they do little in practice to improve users’ comprehension of their practices. Furthermore, companies provide additional information to consumers in ways that increase the cognitive load required for comprehension. For example, companies often overwhelm users with avalanches of simultaneous updates to their privacy policies. These updates are presented as a set of changes to existing text, forcing the reader to go over the privacy policy multiple times to decide whether and how the changes affect their privacy expectations. This exercise is typically too tedious, or requires too much legal knowledge, for most individuals. Prior work (Martin, 2015; Martin & Nissenbaum, 2016) has empirically demonstrated that users approach privacy policies with a mental model of a company’s practices, which frequently masks problems with the policies themselves. When confronting the actual, incomplete, and often incomprehensible normative statements in a privacy policy, readers fill in the blanks, substituting from their internal privacy models what they believe should be in a privacy policy rather than what actually exists.

To ensure that users’ mental models align with companies’ information handling practices, we need to examine the deep systematic problems with the way privacy policies are written and formatted.

We argue in favor of steps that would informate (Zuboff, 1988) privacy policies through the lens of contextual integrity, allowing us to assess and reason about the role of privacy policies in informing the reader of the possible privacy implications of company behavior. In our previous work (Shvartzshnaider, Apthorpe, et al., 2019), we developed a methodology based on the contextual integrity (CI) framework (Nissenbaum, 2009) for interpreting information flows described by privacy policies in terms of their conformance to contextual informational norms and governing rules and regulations (Sanfilippo et al., 2020).

Figure 1: The CI information flow/norm parameters.

Figure 1 shows that the CI assessment heuristic requires values to be specified for five parameters to generate a complete and unambiguous description of an information flow: sender, information type, subject, recipient, and transmission principle. If a policy statement omits any of these parameters in its description of an information flow, the described company practice can be viewed as incomplete or ambiguous per the CI framework. The CI assessment heuristic not only offers a finer-grained measure of a company’s compliance with its stated policy; it also provides insight into whether the company practices described as information flows in privacy policy statements align with or violate established contextual norms and consumers’ privacy expectations.
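The completeness check that the CI heuristic prescribes can be sketched in a few lines of code. This is an illustrative model of ours, not a tool from the cited work; the field names simply mirror the five parameters in Figure 1.

```python
from dataclasses import dataclass, fields
from typing import Optional

# Illustrative sketch: a CI information flow as a five-parameter record.
# A policy statement is "complete" only when every parameter has a value.
@dataclass
class InformationFlow:
    sender: Optional[str] = None
    information_type: Optional[str] = None
    subject: Optional[str] = None
    recipient: Optional[str] = None
    transmission_principle: Optional[str] = None

    def missing_parameters(self):
        """Return the CI parameters a policy statement leaves unspecified."""
        return [f.name for f in fields(self) if getattr(self, f.name) is None]

# "We share your personal information with third parties."
flow = InformationFlow(
    sender="company",
    information_type="personal information",
    subject="user",
    recipient="third parties",
)
print(flow.missing_parameters())  # → ['transmission_principle']
```

Under the CI heuristic, such a statement would be flagged as ambiguous: it never says when or under what conditions the sharing occurs.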

This approach is reminiscent of Dworkin’s conception of legal integrity (Dworkin, 1986), which calls for enforcement of law not by blindly following rules but by ensuring those rules are consistent with societal moral values. To preserve privacy, a company’s information practices should not merely be judged by compliance with existing laws, policies, and regulations but should also be considered in conjunction with the governing contextual norms and, consequently, conform to users’ privacy expectations.

Figure 2: Example of CI annotated privacy statements from one of the previous versions of Facebook’s privacy policies.

Current privacy policies are not written with the CI framework in mind and therefore are poorly structured for consumers to determine their privacy implications.

Prior analyses (Shvartzshnaider et al., 2019; Reidenberg et al., 2016) of existing privacy policies have also identified cases where the structure and placement of a statement increase the cognitive effort required to fully comprehend its privacy implications.

For example, consider this statement from the Facebook privacy policy (viewed in March 2021):

“Advertisers, app developers and publishers can send us information through Facebook Business tools that they use, including our social plug-ins (such as the Like button), Facebook Login, our APIs and SDKs, or the Facebook pixel. These partners provide information about your activities off Facebook – including information about your device, websites you visit, purchases you make, the ads you see and how you use their services – whether or not you have a Facebook account or are logged in to Facebook.”

The above statement is an example of “CI parameter bloating” (Shvartzshnaider et al., 2019). The statement prescribes 3 senders, 6 information types, 1 recipient, and 7 transmission principles. An average consumer needs to evaluate 126 possible information flow variations (3 × 6 × 1 × 7) to comprehend the privacy implications of this single statement. Other issues include (but are not limited to):

  • Statements that lack relevant contextual information (i.e., values for one or more of the CI parameters). For example, the statement “We share your personal information with third parties” omits the transmission principle that would let the reader know when or under what conditions the information is shared.

  • Statements that use vague terms (e.g., “some,” “few”) and hedge terms (e.g., “may,” “can”), as in “We may sometimes collect some of your information…”

  • Policies that spread the relevant information throughout the document, e.g., listing each CI parameter in separate sections such as “The information we collect” and “How we use the information we collect” (see the Venmo or Uber policies). Figure 2 illustrates this challenge, highlighting how parameters are communicated not as flows but often as discrete, decoupled disclosures. What information is collected and by which third parties may be enumerated, especially as now required by Apple’s app privacy details (Apple, 2021), but as separate lists that do little to inform users what information about them is specifically shared with whom. This further increases the difficulty for the consumer to “connect the dots” and fully grasp the context in which the information is collected.
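The parameter-bloating arithmetic behind the Facebook example can be made concrete with a short sketch. The counts follow the statement quoted above; the value labels are our paraphrases, not an annotation from the cited work.

```python
from itertools import product

# Illustrative sketch of "CI parameter bloating": one policy sentence that
# names several values per parameter implies the full cross-product of flows.
senders = ["advertisers", "app developers", "publishers"]                   # 3
information_types = ["device info", "websites visited", "purchases",
                     "ads seen", "service usage", "off-Facebook activity"]  # 6
recipients = ["Facebook"]                                                   # 1
transmission_principles = ["via social plug-ins", "via Facebook Login",
                           "via APIs", "via SDKs", "via the Facebook pixel",
                           "with a Facebook account", "while logged out"]   # 7

# Every combination is a distinct information flow the reader must evaluate.
flows = list(product(senders, information_types, recipients,
                     transmission_principles))
print(len(flows))  # → 126 (3 * 6 * 1 * 7)
```

Each tuple in `flows` is one concrete flow (e.g., an advertiser sending purchase data to Facebook via the pixel), which is why a single sentence can carry 126 distinct privacy implications.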

As long as the broader notice-and-consent regime of governance prevails, we should strive toward privacy policies that meaningfully inform users about companies’ information handling practices. The structural flaws that we and others have pointed out lead to ambiguous and cognitively burdensome statements. Insisting that policy statements satisfy completeness requirements per the contextual integrity parameters would go a long way toward overcoming some of the most obvious sources of ambiguity and incomprehension, and toward an improved framing and evaluation of a company’s data practices. (As a bonus, forcing completeness may give companies and regulators pause when confronted with the actual extent of data practices.)

To achieve these improvements, a concerted and coordinated research effort from privacy scholars of all disciplinary stripes is needed: tools and methods for empirically analyzing existing privacy policies at scale, for systematically learning context-relevant privacy expectations, and for expressing policies with complete values given for all parameters.

Using CI as a lens to capture information flows in privacy policies and regulations allows a) crowdsourcing and comparing the privacy expectations of relevant stakeholders (Apthorpe et al., 2019; Shvartzshnaider et al., 2016), b) identifying gaps between privacy expectations, information governance, and the information flows that systems generate (Sanfilippo et al., 2020), and c) performing legal analysis of the flows to show which violate regulations.

Combining expertise from the computer science, social science, information governance, and legal domains provides an opportunity to improve existing information-governing institutions by aligning them with contextual societal norms and expectations when developing new practices and technologies for contexts like education (Cohney et al., 2020) and smart home/IoT environments (Apthorpe et al., 2018).

For more on real-world applications of contextual integrity, follow @priva_ci on Twitter, visit the PrivaCI.info website, and check out CI-related publications.


Apple. (2021). App privacy details on the App Store. https://developer.apple.com/app-store/app-privacy-details/

Apthorpe, N., Shvartzshnaider, Y., Mathur, A., Reisman, D., & Feamster, N. (2018). Discovering smart home internet of things privacy norms using contextual integrity. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2(2), 1-23.

Apthorpe, N., Varghese, S., & Feamster, N. (2019). Evaluating the contextual integrity of privacy regulation: Parents' IoT toy privacy norms versus COPPA. In 28th USENIX Security Symposium (USENIX Security 19) (pp. 123–140).

Cohney, S., Teixeira, R., Kohlbrenner, A., Narayanan, A., Kshirsagar, M., Shvartzshnaider, Y., & Sanfilippo, M. (2020). Virtual Classrooms and Real Harms. arXiv preprint arXiv:2012.05867.

Dworkin, R. (1986). Law’s Empire. Harvard University Press.

Martin, K. (2015). Privacy notices as tabula rasa: An empirical investigation into how complying with a privacy notice is related to meeting privacy expectations online. Journal of Public Policy & Marketing, 34(2), 210–227.

Martin, K., & Nissenbaum, H. (2016). Measuring Privacy: An empirical test using context to expose confounding variables. Colum. Sci. & Tech. L. Rev., 18, 176.

Nissenbaum, H. (2009). Privacy in Context: Technology, Policy, and the Integrity of social life. Stanford University Press.

Reidenberg, J. R., Bhatia, J., Breaux, T. D., & Norton, T. B. (2016). Ambiguity in privacy policies and the impact of regulation. The Journal of Legal Studies, 45(S2), S163–S190.

Reidenberg, J. R., Breaux, T., Cranor, L. F., French, B., Grannis, A., Graves, J. T., … Ramanath, R. (2015). Disagreeable privacy policies: Mismatches between meaning and users’ understanding. Berkeley Tech. LJ, 30, 39.

Sanfilippo, M. R., Shvartzshnaider, Y., Reyes, I., Nissenbaum, H., & Egelman, S. (2020). Disaster privacy/privacy disaster. Journal of the Association for Information Science and Technology, 71(9), 1002–1014.

Sanfilippo, M. R., Frischmann, B. M., & Strandburg, K. J. (Eds.). (2021). Governing Privacy in Knowledge Commons. Cambridge University Press.

Shvartzshnaider, Y., Tong, S., Wies, T., Kift, P., Nissenbaum, H., Subramanian, L., & Mittal, P. (2016, September). Learning privacy expectations by crowdsourcing contextual informational norms. In Proceedings of the AAAI Conference on Human Computation and Crowdsourcing (Vol. 4, No. 1).

Shvartzshnaider, Y., Apthorpe, N., Feamster, N., & Nissenbaum, H. (2019). Going against the (appropriate) flow: A contextual integrity approach to privacy policy analysis. In Proceedings of the AAAI Conference on Human Computation and Crowdsourcing (Vol. 7, pp. 162–170).

Zuboff, S. (1988). In the Age of the Smart Machine. Basic Books.


Cornell Tech | 2021
