By Nico Ebert (ZHAW)
A common narrative in practice goes something like this: “people claim that data protection is important to them, but in reality they give away everything on the internet anyway”. There are also scientific studies that seem to confirm this again and again: that we are generally careless with our own and others’ personal data, and that we consider data protection important but neglect it in everyday life. For example, a “pizza experiment” with 3,000 students at a US university in 2017 concluded that a free pizza was incentive enough for participants to reveal the email addresses of three fellow students (Athey et al. 2017).
The catchy phrase “privacy paradox” was coined for this divergence between people’s intentions and their behaviour with regard to data protection (Norberg et al. 2007). The term is still used today to portray people as careless about their privacy (e.g. Krempl 2020). However, it is unclear why this deviation is declared “unexpected” and “contradictory” in the area of data protection, while we observe seemingly irrational behaviour in many other areas of our lives as well. For example, few of us would claim that we do not care about environmental protection. At the same time, we drive to work or fly to faraway countries. So despite our attitude, we accept negative consequences for the environment. Obviously, environmental protection is only one factor among many that influence our behaviour. From a data protection perspective, too, it is therefore more interesting to understand individual behaviour than to settle for labels like “paradox”.
First of all, data protection probably does not play the same role for every person. As always, there is no single “individual”: everyone has different privacy preferences, and privacy has a different value for each of us. Marketers have long thought in terms of distinct customer segments, so why lump everyone together when it comes to data protection? There are also various “typologies” in data protection. Alan Westin, for example, distinguishes between privacy fundamentalists, privacy pragmatists and the completely unconcerned. While the unconcerned might freely share their children’s photos on the internet, the fundamentalist would probably not use an internet connection without appropriate encryption and would avoid WhatsApp. In 2002, Sheehan selected a representative sample of 889 people in the USA and typologised them with a questionnaire. The conclusion: 16% were completely unconcerned, 81% were pragmatists and 3% were fundamentalists. In the aforementioned “pizza experiment”, perhaps only a few privacy fundamentalists were among the students in their early 20s, but a high proportion of unconcerned individuals with few negative prior experiences.
But even the approach of typologising individuals according to their privacy attitudes has its limits. People’s attitudes are not stable constructs, and we often do not know our own preferences precisely. This also applies to our privacy attitudes and preferences (Acquisti et al. 2015). Rather, in the area of data protection we act under great uncertainty – often spontaneously and in unclear situations. We do not know what consequences accepting a cookie banner on a website will have for our privacy. At least it did not have any negative ones yesterday. To make matters worse, we rarely have a free choice for or against more data protection. Instead, we often trade our data and privacy for some other good. Choosing more privacy often means giving up something else (e.g. functionality or convenience). In many cases, we cannot even use the online edition of our newspaper without consenting to advertising tracking. Google offers consent as the sole “decision option” before the search engine can be used for the first time – not consenting means not being able to use it at all. The decision against data protection may also reflect the great value of the good received in exchange: a free pizza may be very valuable to a 20-year-old student in an expensive Silicon Valley flat share, compared to three university email addresses of fellow students. Perhaps the decision against more data protection is even in the interest of the individual herself: sharing content on social media is simply necessary to maintain social relationships.
The context also plays a role: what personal data is involved, in what situation, and what happens with it? Very few people would share the password to their email account, but most of us are probably not very critical of the usage data collected when visiting a website. Video surveillance is socially accepted at the petrol station, but hard to imagine in the sanitary facilities of a campsite. Who wants our data also influences our decision: in the pizza experiment, it was the university that asked for the email addresses of fellow students. Perhaps the students would have been more critical towards a private company. The fact that we apply specific standards to the respect of our privacy in a given context, and do not expect them to be violated, is what Helen Nissenbaum calls “contextual integrity” (Nissenbaum 2011).
Sometimes our privacy decisions are also influenced to our disadvantage without our conscious awareness. “Nudges” – small but effective prompts – can steer our behaviour in ambiguous situations. In the form of “dark patterns”, nudges that work against our interests can be found everywhere online. A green button here and a pre-ticked checkbox there are often not designed to serve our interests (e.g. Holland 2020). Between 2005 and 2014, for example, Facebook’s default settings made more and more user data publicly visible, even to non-members, whenever users simply clicked “Accept” (Acquisti et al. 2015).
To come back to the beginning of this text: blanketly imputing contradictory data protection behaviour to individuals and labelling it a “privacy paradox” does not do justice to the complexity of reality. Instead, numerous factors play a role: differences between people, the respective context and the specific personal data involved are all highly relevant. Even the detailed design of the decision-making situation, down to the question of which button on the website is “green and big”, can matter greatly. In a recent study on the effectiveness of privacy notices, we were able to show that notices placed behind a link are (as expected) hardly ever clicked. However, if the privacy notice was highly visible, compact and written in simple language, it was definitely noticed (Ebert et al. 2021).
What does this mean for organisations that have to, or want to, consider data protection in their products and services? The design of data protection is clearly not an exclusively legal topic that can be handled via compliance checklists. Rather, it would be desirable to include the individuals themselves and their perspective in design decisions, and to involve other disciplines such as human-computer interaction or behavioural psychology. “Privacy by design” therefore also means engaging with individuals and their interests in an interdisciplinary way.
The author would like to thank Dr. Tobias Vogel for his helpful comments on this text.
Dr. Nico Ebert is director of studies for the MAS Business Engineering and lecturer in business informatics at the School of Management and Law of the Zurich University of Applied Sciences ZHAW as well as a fellow of the DIZH (Digitalization Initiative of Zurich Universities). He researches and teaches in the areas of data protection and information security.
Athey, S., Catalini, C., & Tucker, C. (2017). The digital privacy paradox: Small money, small costs, small talk (No. w23488). National Bureau of Economic Research.
Norberg, P. A., Horne, D. R., & Horne, D. A. (2007). The privacy paradox: Personal information disclosure intentions versus behaviors. Journal of Consumer Affairs, 41(1), 100–126.
Sheehan, K. B. (2002). Toward a typology of Internet users and online privacy concerns. The Information Society, 18(1), 21–32.
Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514.
Nissenbaum, H. (2011). A contextual approach to privacy online. Daedalus, 140(4), 32–48.
Ebert, N., Ackermann, K. A., & Scheppler, B. (2021, to appear). Bolder is better: Raising user awareness through salient and concise privacy notices. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems.