Social Media CyberWatch – Privacy Awareness

As more people bring more of their lives into cloud services, online service providers have an increasingly powerful role as caretakers of our data. Indeed, a presence on social media sites is now generally expected for full participation in contemporary society. We provide a constant stream of data to these services, which may be used in ways outlined in dense, lengthy terms of service and privacy policies. These legal documents often go unread, a symptom of a wider lack of awareness of how the Internet works beneath the user interface of the web.

Speaking at TEDx Toronto, Citizen Lab Director Ron Deibert argued that the average Internet user needs a change in attitude, “to develop an ethic of curiosity and experimentation”, or “to become a hacker”, which would aid in understanding how technology works beneath the surface and stimulate more critical thought and behaviour relating to the use and provision of our data on the web.

Below we look at several areas where privacy and security are evolving in the social media context, such as public awareness, policy representations, personal responsibility, and user attempts at privacy protection.

Privacy Awareness Initiatives

In the past month, a number of privacy awareness initiatives and policies were brought to the public’s attention. These endeavours aim to increase awareness of the control we have over our privacy, which legal entities can obtain access to our data, and what we agree to when we sign up for social media platforms.

Facebook tour of privacy settings

At the urging of the Irish Data Protection Commissioner’s office, Facebook now provides new users with a tour of its privacy settings. In a blog post introducing the changes, Facebook describes the tour as a short presentation covering the different kinds of privacy settings available to users, each accompanied by a link to the relevant settings. The Commissioner’s office had audited Facebook Ireland’s compliance with Irish and EU data protection requirements in late 2011, and in September 2012 conducted a review to ensure that the audit recommendations had been implemented to its satisfaction. New-user education was one area found not to have been acceptably implemented.

New video on privacy

Privacy International, a UK-based non-profit organization, released a video entitled “Why Privacy Matters”. The video features a variety of individuals, including American Civil Liberties Union (ACLU) lawyers, Electronic Frontier Foundation (EFF) staff, and security researchers, who outline their definitions of privacy, explain why data use transparency matters even to people with “nothing to hide”, and argue that privacy is at risk due to outdated laws.

Google’s Transparency Report

While not explicitly framed as a privacy awareness initiative, Google’s recently released Transparency Report received a great deal of media attention, much of it focused on an increase in government requests for access to Google user data. This coverage may help ensure that more people are aware of what can happen to their data once it is disclosed to a web service provider. Google’s report indicates that the United States made 7,969 requests between January and June 2012, a 33% increase over the same period in 2011. The EFF views this type of public disclosure as one of four ways companies can help protect their users in the face of government data requests.

Policy Representations

A 2008 study estimated that if every US Internet user skimmed the privacy policy of each unique website they visited, the task would consume 154 hours per person, per year. Across all American Internet users, that amounts to 33.9 billion hours, with the time valued at $492 billion (McDonald & Cranor, 2008). To combat this problem of scale, various organizations have recently developed methods of concisely representing key components of online service policies.
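
A quick back-of-the-envelope check (our own derivation, not a figure from the study) shows how the per-person estimate scales to the aggregate numbers above:

```python
# Sanity check of the McDonald & Cranor (2008) figures quoted above.
# The per-person and aggregate numbers come from the study; the implied
# user count and hourly valuation are derived here for illustration.

hours_per_person = 154   # skimming time per user, per year
total_hours = 33.9e9     # aggregate hours across all US Internet users
total_value = 492e9      # estimated value of that time, in USD

implied_users = total_hours / hours_per_person   # ~220 million users
implied_rate = total_value / total_hours         # ~$14.51 per hour

print(f"Implied US Internet users: {implied_users / 1e6:.0f} million")
print(f"Implied value of time:     ${implied_rate:.2f}/hour")
```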

Privacy Icons

A collaboration between privacy tool developer disconnect.me and a Mozilla-led working group resulted in “Privacy Icons”, a series of icons representing the various ways that your data may be used by a website and potential third parties, as outlined in the site’s privacy policy. A recent hackathon saw participants iconify 235 privacy policies, rendering iconographic summaries of each site’s data retention period, use by third parties, ad networks, and relationship with law enforcement.
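
To make the idea concrete, the sketch below shows how one such iconographic summary might be captured as structured data. The schema, field names, and categories are our own hypothetical illustration, not the actual Privacy Icons specification:

```python
from dataclasses import dataclass
from enum import Enum

class Retention(Enum):
    """How long a site keeps your data (hypothetical categories)."""
    UNDER_30_DAYS = "under 30 days"
    UNDER_ONE_YEAR = "under 1 year"
    INDEFINITE = "indefinite"
    UNDISCLOSED = "undisclosed"

@dataclass
class PolicySummary:
    """One site's privacy policy reduced to the attributes an icon set
    could display, mirroring the categories named above."""
    site: str
    retention: Retention
    third_party_use: bool   # data used by parties beyond the intended one?
    ad_networks: bool       # data handed to advertising networks?
    warrant_required: bool  # law enforcement access requires a warrant?

# A summary a hackathon participant might produce for a hypothetical site.
example = PolicySummary(
    site="example.com",
    retention=Retention.INDEFINITE,
    third_party_use=True,
    ad_networks=True,
    warrant_required=False,
)
```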

Grading of Terms of Service

The “Privacy Icons” initiative recalls an earlier attempt at summarizing and increasing awareness of web platform governance: Terms of Service; Didn’t Read (TOS-DR). TOS-DR describes its effort as a “transparent and peer-reviewed project” that assigns grades to a website’s terms of service, the agreement users (explicitly or implicitly) accept in order to use the site. These grades are intended to provide a quick summary of the different components of a website’s terms: for instance, the restrictiveness of the content licensing agreement, whether the terms may change at any time without notice, how transparent the service is about government data requests, and more.
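
TOS-DR’s actual ratings come from community peer review rather than a formula, but purely to illustrate the grading idea, a naive grader might tally the favourable and unfavourable clauses reviewers flag and map the balance to a letter class:

```python
def grade_terms(good_points: int, bad_points: int) -> str:
    """Map reviewer-flagged clauses to a class from A (best) to E (worst).
    The thresholds are invented for illustration only; they are not
    TOS-DR's methodology."""
    balance = good_points - bad_points
    if balance >= 4:
        return "Class A"
    if balance >= 2:
        return "Class B"
    if balance >= 0:
        return "Class C"
    if balance >= -2:
        return "Class D"
    return "Class E"

# A service whose terms have one user-friendly clause and four
# restrictive ones would land near the bottom of the scale.
print(grade_terms(good_points=1, bad_points=4))  # Class E
```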

Personal Responsibility and its Constraints

While it is important for service providers to take privacy seriously in the design of their platforms and to help educate users about how their data is used, users themselves bear some responsibility for how they use a platform. Nevertheless, the design of these platforms shapes the degree of responsibility we are able to take.

Having an “online talk” with children

Google Chairman Eric Schmidt recently told Reuters that people “need to fight for their privacy” or it will be gradually lost over time. Schmidt furthermore urged parents to have an “online talk” with their early adolescent children, since embarrassing content on the web will “follow you for the rest of your life”. Indeed, ‘human error’ will always be a factor in deciding with whom content should be shared (Egelman & Johnson, 2012).

Algorithmic Gatekeepers

While controlling the distribution of one’s data is perhaps the predominant conception of privacy, other facets of the idea include “moral autonomy” (Van den Hoven, 2001) and self-determination (Cohen, forthcoming 2013): the freedom to form one’s own opinions without pressure to conform to society’s accepted norms. This aspect of privacy is affected by the ways our digital tools frame our capacity for action.

A recent opinion piece in the New York Times by Evgeny Morozov argues that our online experiences are increasingly mediated by “algorithmic gatekeepers”: software that recommends certain search terms, purchases, and other courses of action to us. These recommendations are often presented as the objective results of computation, but that presentation masks the fact that the design decisions underpinning these mechanisms stem from a subjective interpretation of the world (boyd & Crawford, 2011).

Awareness of how our digital environments regulate our behaviour, that “code is law” (Lessig, 2006), is an important component of understanding how our digital society is structured and governed. As our capacity for action is shaped by the design of the platforms we use, so too is the personal responsibility we can exercise over what happens to our data.

Awareness on the Rise

Several studies and recent events indicate that people are increasingly aware of the privacy issues involved in using digital services and are taking steps to protect themselves, though these efforts may sometimes be misguided.

Mobile app choices driven by privacy concerns

A recent Pew study reports that 57 percent of mobile application users claim to have either uninstalled or declined to install an app because of the ways it uses their personal data. This research helped prompt the Office of the Privacy Commissioner of Canada and two provincial privacy organizations to release a report outlining best practices for mobile application developers to protect user privacy. The report recommends that developers be accountable for their actions, be transparent about data use practices, securely collect only what is needed to support application functionality, obtain meaningful consent despite device limitations, and deliver data use reminders at appropriate times.
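
As a rough illustration of the “meaningful consent” and “collect only what is needed” recommendations, the sketch below refuses to collect any data without a previously recorded consent for that specific purpose. The structure and names are our own; this is not code from the Commissioners’ report:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentLedger:
    """Tracks the data-use purposes a user has explicitly agreed to."""
    granted: set = field(default_factory=set)

    def grant(self, purpose: str) -> None:
        """Record the user's meaningful, purpose-specific consent."""
        self.granted.add(purpose)

    def collect(self, purpose: str, value: str, store: dict) -> None:
        """Collect data only when consent for this purpose exists,
        enforcing data minimization at the point of collection."""
        if purpose not in self.granted:
            raise PermissionError(f"No consent recorded for: {purpose}")
        store[purpose] = value

ledger = ConsentLedger()
ledger.grant("crash_reports")

collected: dict = {}
ledger.collect("crash_reports", "stack trace ...", collected)  # allowed
# ledger.collect("location", "43.66,-79.39", collected)  # raises PermissionError
```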

User Privacy Notices on Facebook

Recent research has found that Facebook users increasingly report having modified their privacy settings (see Ellison et al., 2011). Additionally, a recent trend on Facebook, which emerged in the wake of proposed changes to its data use policy, is posting a “privacy notice” on one’s timeline. These notices are intended to assert a user’s control over his or her data, declaring the profile and its contents private and confidential. However, they are also legally ineffective: Facebook users have already agreed to the platform’s terms of service and associated policies, and unilaterally posting a notice does not override that pre-existing agreement.

This trend points to two issues. First, many users lack an understanding of the contract they formed with Facebook when they registered on the platform. Second, these same users are aware, and concerned, that their data can be provided to third parties in ways over which they currently have no direct control.

Conclusion

Many initiatives have recently emerged to raise awareness of how one’s data is used when interacting with social media platforms. A major component of our online behaviour is personal responsibility, which is empowered by privacy awareness and constrained by the very design of the platforms that host our digital lives.

References

  • boyd, d., & Crawford, K. (2011). Six provocations for big data.
  • Cohen, J. (2013). What Privacy Is For. Harvard Law Review, 126.
  • Egelman, S., & Johnson, M. (2012). How Good Is Good Enough? The Sisyphean Struggle for Optimal Privacy Settings. CSCW 2012 Workshop on Reconciling Privacy with Social Media.
  • Ellison, N. B., Vitak, J., Steinfield, C., Gray, R., & Lampe, C. (2011). Negotiating privacy concerns and social capital needs in a social media environment. In Privacy Online: Perspectives on Privacy and Self-Disclosure in the Social Web, 19-32.
  • Lessig, L. (2006). Code: And Other Laws of Cyberspace, Version 2.0. Basic Books.
  • McDonald, A. M., & Cranor, L. F. (2008). The Cost of Reading Privacy Policies. I/S: A Journal of Law and Policy for the Information Society, 4, 543.
  • Van den Hoven, J. (2001). Privacy and the Varieties of Informational Wrongdoing. In R. A. Spinello & H. T. Tavani (Eds.), Readings in CyberEthics (pp. 488-500). Sudbury, MA: Jones and Bartlett Publishers.

Andrew Hilts is a Luum Fellow at the Citizen Lab, where he develops privacy, security, and identity policy innovations for the Luum platform (www.luum.com).

Read previous editions of Social Media CyberWatch.