In the first part, I introduced the influential concept of informational privacy and its dimensions. This post discusses how personal information is processed and used by algorithmic personalization systems, and how such systems should operate to ensure personal privacy, that is, to permit and protect a person's autonomous life.
No raw data – no raw personal information
Information studies professor Philip E. Agre (1994) identified the grammars of action that are central to data collection, placing them at the core of what he called the "capture model" of privacy. Capture is not a new state of surveillance but a process of grammatizing human activities: identifying the fundamental units of an activity, restructuring these units in computational languages, and then imposing these algorithmic structures of human activity back onto users (pp. 744-747). This model, Agre wrote, "has manifested itself principally in the practices of information technologies" (p. 744). Because of the "messy" nature of information, decisions have to be made about how data are aggregated and categorized. As illustrated in Facebook's data model below, classification is the foundation on which systems make sense of data inputs. Accordingly, as the title of the volume edited by Lisa Gitelman (2013), "Raw Data" Is an Oxymoron, suggests, data are never "raw" but always already "cooked" (p. 2). In other words, the personal information collected and processed by algorithmic systems is not neutral or objective but is constructed by the systems' engineers in particular historical, social, cultural, and economic circumstances. The logics of these systems, however, are opaque, if not concealed from the public eye in the name of protecting intellectual property. We live in what legal scholar Frank Pasquale (2015, p. 3) has called a "black box society," in which algorithms determine who we are and the contours of our world without our knowing exactly how. What is fundamental here is that algorithmic systems of data collection and analysis do not purely model reality and human behavior as they are; they are embedded with particular ideologies and perceptions of the world and of behavior even before they collect any data about the world. These analyses support the argument that, given the logics of data collection and analysis, there is no raw personal information per se.
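To make the capture model concrete, consider a minimal, hypothetical sketch in Python. Everything in it, the action types, the interest categories, and the mapping rules, is invented for illustration rather than drawn from any real platform's data model; the point is only to show how a system's designers fix the units and categories of an activity before any data about it are collected.

```python
from dataclasses import dataclass
from enum import Enum


# Hypothetical "grammar of action": the engineers decide in advance which
# units of activity exist and which categories they may fall into. Anything
# a user does must be expressed in these terms or it is not captured at all.
class ActionType(Enum):
    VIEW = "view"
    LIKE = "like"
    SHARE = "share"
    COMMENT = "comment"


class InterestCategory(Enum):
    SPORTS = "sports"
    POLITICS = "politics"
    ENTERTAINMENT = "entertainment"
    OTHER = "other"  # the residual bucket: everything "messy" lands here


@dataclass
class CapturedEvent:
    """A single, pre-structured unit of human activity."""
    user_id: str
    action: ActionType
    category: InterestCategory
    timestamp: float


def capture(user_id: str, raw_behavior: str, timestamp: float) -> CapturedEvent:
    """Map open-ended behavior onto the imposed grammar.

    The classification rules below are editorial decisions, not neutral
    descriptions: whoever writes this mapping decides what a behavior 'is'.
    """
    action = ActionType.LIKE if "like" in raw_behavior else ActionType.VIEW
    category = (InterestCategory.POLITICS
                if "election" in raw_behavior else InterestCategory.OTHER)
    return CapturedEvent(user_id, action, category, timestamp)


# A nuanced act ("skimmed an election explainer out of worry") is flattened
# into two fixed labels chosen by the system's designers.
event = capture("u123", "liked an article about the election", 1_700_000_000.0)
print(event)
```

Note that nothing in this sketch measures the world as it is; it prescribes, in advance, the only forms in which the world can appear to the system.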
But why do the logics of data collection and analysis matter to the protection of individual privacy? Rössler (2005) clarified how self-determination is violated when one's expectations about being watched do not match the actual practices of surveillance:
The protection of informational privacy matters so much to people because it is an intrinsic part of their self-understanding as autonomous individuals to have control over their self-representation. By means of the information they give other people about themselves or that they know other people have always had about them, individuals simultaneously regulate the range of very diverse social relations within which they live. Without this form of self-determined control over their personal information, neither would self-determined, context-dependent, authentic behavior towards others be possible, nor would it be possible to find an answer authentically to the question of how one wants to live. (p. 116)
Consider an example in which a user grants a media platform permission to record all of her personal information (metadata and data), aware that the data will be used for behaviorally targeted advertising. Does this mean the user has complete control over her self-realization and self-determination? Is there any expectation that should be taken into account beyond the expectation that her data are collected for behaviorally targeted advertising, for example, how her data are processed and quantified to produce such advertisements? By knowing the socio-algorithmic process of data collection and analysis, data subjects could make more informed decisions about what they are confronted with. Without this knowledge, in the age of datafication and personalization systems, their control over the knowledge others have of them is incomplete. They have little idea how the socio-algorithmic process captures and calculates flecks of their identities, and therefore they do not understand how certain information is presented to them. This can have bitter consequences: in a credit-scoring system, for example, the subject has no full understanding of how his personal information is perceived and how his "worthiness" and "reliability" are calculated. What "worthiness" and "reliability" mean is up to the algorithms' authors. As the data subject has little knowledge with which to form well-founded assumptions and expectations about how the systems perceive him, the norms that constitute his informational privacy are not fulfilled. The very moment a subject becomes aware of the structure of the surveillance systems and of how his bits of data are decomposed and recomposed may bring about a shift in perspective, because only then is the person fully aware of the assumptions, conditions, and consequences of his actions. Therefore, protecting a person's control over her personal information is by itself insufficient to respect her as an autonomous subject, as Rössler (2005) argued:
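The credit-scoring worry can likewise be made concrete with a small, hypothetical Python sketch. The features, weights, and threshold below are all invented, and real scoring systems are far more elaborate; the sketch only illustrates how "worthiness" is constituted by choices the data subject never sees.

```python
# A deliberately simplified, hypothetical credit-scoring sketch. The feature
# names, weights, and threshold are invented: the point is that what
# "worthiness" means is fixed by whoever chooses these numbers, and the data
# subject typically never sees them.
WEIGHTS = {
    "payment_history": 0.35,
    "debt_ratio": -0.30,
    "account_age_years": 0.15,
    "recent_inquiries": -0.20,
}
APPROVAL_THRESHOLD = 0.5  # also an editorial choice, not a fact about the person


def worthiness(profile: dict) -> float:
    """Collapse a person's records into a single opaque number."""
    return sum(WEIGHTS[feature] * profile.get(feature, 0.0) for feature in WEIGHTS)


applicant = {
    "payment_history": 0.9,    # fraction of on-time payments
    "debt_ratio": 0.6,         # debt relative to income
    "account_age_years": 0.4,  # normalized account age
    "recent_inquiries": 0.5,   # normalized count of recent credit checks
}

score = worthiness(applicant)
print(f"score={score:.2f}, approved={score >= APPROVAL_THRESHOLD}")
# The applicant learns only the decision; the construction of "worthiness"
# (which features count, in which direction, how much) stays in the black box.
```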
the social and legal norms here must not just be a matter of form but also effective enough for me to be able to assume that my rights to informational privacy are in principle guaranteed, for autonomy can be threatened by the very fact that I no longer feel able to count on the self-evident assumption that these expectations are justified… People want to have control of their own self-presentation; they use the information others have about them to regulate their relationships and thus the role they play in various social spaces. (p. 118)
In summary, following the argument that privacy has the function of permitting and protecting an autonomous life (that is, a person is autonomous if she can ask herself what sort of person she wants to be and how she wants to live), informational privacy should take into account the categorizing, contextualizing, transforming, and meaning-making of personal information (or "the capture model," in Agre's term). To secure fair governance of personal information in the age of datafication, informational privacy should encompass both a person's control over access to her personal information and the socio-algorithmic processing of that information. Only when people are informed of how their self-representations and life choices are shaped by the logics of data collection and analysis can they lead an autonomous life.
References:
Agre, Philip E. "Surveillance and Capture: Two Models of Privacy." The Information Society 10.2 (1994): 101-127.
Amazon.Jobs. "Personalization." 2018, https://www.amazon.jobs/en/teams/personalization-and-recommendations. Accessed 1 Nov. 2018.
Cheney-Lippold, John. "A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control." Theory, Culture & Society 28.6 (Nov. 2011): 164-181.
Cheney-Lippold, John. We Are Data: Algorithms and the Making of Our Digital Selves. NYU Press, 2017. Print.
Cole, David. "'We Kill People Based on Metadata.'" NYR Daily (blog), New York Review of Books, 10 May 2014, www.nybooks.com.
Solove, Daniel J. "A Taxonomy of Privacy." University of Pennsylvania Law Review 154 (2006): 477-560.
Solove, Daniel J. Understanding Privacy. Harvard University Press, 2008. Print.
Fried, Charles. "Privacy [A Moral Analysis]." Yale Law Journal 77 (1968): 475-493.
Gitelman, Lisa, ed. "Raw Data" Is an Oxymoron. MIT Press, 2013. Print.
Mayer-Schönberger, Viktor, and Kenneth Cukier. Big Data: A Revolution That Will Transform How We Live, Work, and Think. New York: Houghton Mifflin Harcourt, 2013.
Nissenbaum, Helen. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press, 2009.
Pasquale, Frank. The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press, 2015. Print.
Rieder, Bernhard. “Week 2: Affordances.” New Media Theories. University of Amsterdam. Amsterdam, 14 Sep. 2017.
Rössler, Beate. The Value of Privacy. Cambridge: Polity Press, 2005. Print.
Schneier, Bruce. Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World. W. W. Norton & Company, 2015. Print.
Van Dijck, José. "Datafication, Dataism and Dataveillance: Big Data between Scientific Paradigm and Ideology." Surveillance & Society 12.2 (2014): 197-208.
Varian, Hal R. "Beyond Big Data." Business Economics 49.1 (2014): 27-31.
Westin, Alan F. Privacy and Freedom. New York: Atheneum, 1967. Print.
Westin, Alan F. "Science, Privacy, and Freedom: Issues and Proposals for the 1970's. Part I: The Current Impact of Surveillance on Privacy." Columbia Law Review 66.6 (1966): 1003-1050.