
Why Privacy is an urgent Social Issue

The Internet is filled with advice, tricks and guidelines about privacy, about stopping the disclosure of personal information in the digital world, and about what to do to achieve “complete privacy” online. While such guides are important for education in “digital self-defense”, they generally have one thing in common: they put the burden of action on the individual, and they often assume a certain amount of technical knowledge, motivation and, all too often, the financial means on the part of the informed reader to act on the advice. This makes sense to a certain degree, since most people claim to care a lot about privacy. But what does “privacy” even mean? Who is affected by privacy infringements, and what is the potential fallout? [1, 2, 3]

What is Privacy?

The term “privacy” in public discussion is fiercely convoluted and vague, and privacy is oftentimes experienced as a burden on the individual without any obvious, clear-cut benefit. To complicate things further, the idea of what privacy entails, and whether or how to protect it, is in constant flux over time and place. The general presumption and constant fear of infringed privacy may in some instances turn a person into a “privacy hardliner” or, to the contrary, produce a feeling of helplessness and “blissful ignorance”. Paradoxically, the worry about a lack of privacy can even lead to the well-known attitude of “I have nothing to hide, so who cares?” or the somewhat paranoid stance of “Why would I care about privacy, if ‘they’ know all about ‘us’ anyway?”. It may also spark the fear of standing out too much, leading to a pressure to conform. [4, 5, 6, 7, 8]

The demand for privacy, while oftentimes framed as fear of governments, law enforcement or companies, is in reality deeply entwined with “freedom”. Quite ironically, it is freedom, along with (pseudo-)security, that is often used as an excuse to permit the restriction of individuals’ or society’s privacy. [9, 10, 11]

It is also not as simple as labelling privacy an egoistic individual choice by those who want to be left alone and not engage in society, or by those who want to strictly separate public and private space. Privacy as a concept is less about oneself and more about collective trust, consent and control over one’s own information streams. Rather than a dividing force, privacy is therefore an integrative and fundamental building block of a functional democratic society. [12]

Social Sorting and Surveillance

When talking about privacy, the issue of surveillance (from French “surveiller”, “to watch over”) presents a direct link to society, as it implies a social hierarchy between the observer and the observed. In other terms, surveillance may be seen as asymmetric privacy, with powerful watchers on one side and powerless subjects on the other. The lack of privacy appears to systematically affect disadvantaged populations rather than more advantaged ones – or at the very least disproportionately so. [13, 14, 15, 16]

As opposed to the pre-internet (or, more precisely, the pre-“interactive web”) era, surveillance has become an omnipresent phenomenon in day-to-day life for pretty much everyone today. Moreover, through technical progress such as digitisation, there are fewer private spaces and more to hide than ever, be it in private life, at school or at the workplace. Surveillance may come in the form of state surveillance, surveillance by companies (and private-public collaborations), or social surveillance and peer pressure from within society itself. In the context of corporate or capitalistic surveillance (and to some extent also state surveillance), the aims of such privacy intrusions are typically planning, predicting or preventing certain behavior of consumers and market actors by classifying their profiles. [7, 9, 13, 14, 15, 17, 18, 19]

In this frame, “Social Sorting”, a term coined in the early 2000s, connects privacy to social phenomena and to society as a whole, as opposed to treating privacy as a question for individuals. Ever-growing computational power, data sources and digitisation also lead to the automation of surveillance, for example through advertisement companies that sort people’s profiles into categories by learning about their thoughts and desires – and indeed also by influencing them, either directly or indirectly. [13, 17, 20]

Socio-economic as well as geo-demographic factors play a vital role in such categorisation, i.e. whether people live in predominantly rich and exclusive urban neighbourhoods or in much more disadvantaged urban and rural areas with low incomes and high unemployment. Such categories may be used not only to classify people, but also to classify places and areas and – to come full circle – the people who live within them. Combined with individual-level information such as interests, opinions and spending patterns, marketers, politicians, authorities and any other kind of campaign or advertisement actor can target and treat people differently based on their assigned category. [17]
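The mechanics of such categorisation can be caricatured in a few lines of code. The following is a purely illustrative toy sketch – all attribute names, thresholds and category labels are invented for illustration and do not describe any real system: profiles are bucketed by an area-level attribute and an individual-level attribute, and each bucket is then treated differently.

```python
# Toy sketch of geo-demographic "Social Sorting" (illustrative only):
# an area-level attribute (hypothetical median income of the neighbourhood)
# is combined with an individual-level attribute (declared interests)
# to assign each profile to a different treatment category.

def sort_profile(profile):
    """Assign a hypothetical marketing category to a user profile."""
    affluent_area = profile["area_median_income"] > 50_000
    price_sensitive = "discounts" in profile["interests"]
    if affluent_area and not price_sensitive:
        return "premium-offers"
    if affluent_area:
        return "mid-tier-offers"
    return "credit-and-discount-offers"

profiles = [
    {"area_median_income": 72_000, "interests": {"travel"}},
    {"area_median_income": 72_000, "interests": {"discounts"}},
    {"area_median_income": 24_000, "interests": {"sports"}},
]

categories = [sort_profile(p) for p in profiles]
print(categories)
```

Note how the third profile is sorted almost entirely by where the person lives rather than by anything they chose to share – which is precisely the discriminatory dynamic described above.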

This may not only lead to finely tuned ads for an individual: such discriminatory advertisement patterns, special offers or content (think: social media timeline sorting) may directly or indirectly transport different values, expectations and ambitions, which in turn can deepen, stabilise or break the status quo, depending on how and by whom it is used. [9, 15, 17, 20, 21, 22]

Historically, low-income workers, the unemployed, and minority or otherwise marginalised communities are those most likely to be affected by surveillance, as well as by other privacy intrusions and by particular targeting or exclusion through Social Sorting. The effects show up as further inequalities in the labour market, in educational institutions, in insurance rates, in treatment by law enforcement and policing, and in many other fields. [7, 13, 15, 16, 17, 21, 23]

In that regard, privacy intrusions, as well as any form of surveillance, can in many instances also serve as tools of oppression against those without power, exploiting those who are most vulnerable to them, be it intended or not. [8, 12, 16, 22, 24]

“If it is free, then you are the product”

During the early 20th century, the phrase “there is no such thing as a free lunch” was coined by economists, referring to the idea that nothing in life is ever free of costs; it has since found adoption in a wide variety of fields. While that notion is surely not universally and unconditionally applicable, it does hold some truth for the digital world we live in. [25]

The phrase might appear to be contradicted by the rise of IT companies like Google, Facebook and others, which have seemingly offered gratis services like web search, cloud storage, messengers, e-mail, office suites and even operating systems during the past two decades. However, as we quickly learned, the end users of such services are in most cases not these companies’ customers – those buying the information about these users are. Through interest-based and behavioural surveillance, such advertisement companies fight over our attention in a mechanism experts call the “Attention Economy”, in which our capacity to process the overload of information raining down on us every minute of every day is a scarce good. [26]

If we look at the bigger picture of such an economy, how it affects individuals and how market actors can profit from it, one may realize that offering gratis services, as Google, Facebook and other advertisement companies do, is likely a promising strategy to gain market dominance and thereby win the fight over individuals’ attention: if it is free, then your attention is the currency – and your personal information is the product. [26, 27]

As such practices become widely known, increasing the collective understanding of the importance of privacy, a new market segment emerges: selling the assurance of privacy. One of the most prominent examples of capitalizing on privacy promises is Apple, which markets its premium pricing as an exchange for exclusivity and “privacy”, segregating the market into those who care about their individual privacy – and can afford to pay for it – and those who do not. [28]

Although many such companies have since weakened or flat-out broken their assurances, the strategy of capitalizing on privacy itself exacerbates the aforementioned social issue of surveillance: it excludes those who are unable or unwilling to pay high premiums for a promise of freedom which may or may not hold down the road, pushing them further into the locked-in ecosystems of the gratis-tier Surveillance Capitalists. [26, 28]

A game in which rich elites (supposedly) buy themselves out of digital surveillance, and in which those who are both educated and lucky enough to run their own infrastructure experience true privacy within their own tiny bubble, while the majority is bound to feed the machine with their private information, is a self-reinforcing cycle that benefits no one but those who sell people’s privacy and those who capitalize on it directly. To summarize: yes, we do have “something to hide”, both personally and collectively, be it from a socially minded or a self-serving perspective. [12, 28]

What to do?

It is quite obvious that, while all parts of society are affected by privacy intrusions, the more disadvantaged communities carry the heaviest burden, as they are likely to be both the main targets of and the most vulnerable to overall surveillance, and particularly to Surveillance Capitalism. Despite that fact, empirical research on the lack of privacy, as well as on opinions and individual measures within such populations, appears to be scarce – and so is their inclusion and participation in software projects and development. [12, 28]

While there are already many great and growing privacy-aware services and software products out there, many consumers still fall back on the gratis-tier data harvesters, because their privacy-friendly counterparts seem too expensive for what they offer, appear too technical and exotic, or because of the lock-in effects of proprietary platforms.

From the perspective of such privacy-friendly, Open Source and Free Software projects, working on further engagement, inclusion and lowered entrance barriers for the participation of disadvantaged populations in development, planning and overall community relations could improve things substantially. Creating further awareness of Open Source and privacy-related topics is also an integral part of pushing for a better and fairer digital world. Such efforts have been underway for a while now among many of the smaller and bigger players of the Free Software scene, such as Nextcloud, Collabora, The Document Foundation, GNOME, KDE and many more.

Service and software providers committed to privacy, such as we at ViOffice, should also strengthen their ties to those projects, as is the case for us with regard to Nextcloud and Collabora, which are an integral part of our cloud product range. In addition, there should be other efforts, for example socially aware pricing, privacy-first data handling and education about these topics.

However, “privacy”, as mentioned before, is a bigger issue than just software-related topics, as it touches all parts of our lives and affects everything from individuals’ well-being to democracy and society. Furthermore, not everything can be regulated from within the market itself, especially since competing with Surveillance Capitalists on pricing alone is a hopeless fight. As such, data and privacy literacy need to be improved on a regional and global scale, and so do institutional privacy laws. Privacy-respecting Free Software projects require broader acceptance and recognition, while those who capitalize on privacy and self-determination need to be regulated much more closely and held accountable more rigorously than they have been until now.


  1. Consumer Reports (2017): Social Media & Privacy Survey. Online at: consumerreports.org.
  2. Levin, A. and Abril, P. (2009): Two Notions of Privacy Online. Vanderbilt Journal of Entertainment & Technology Law, Volume 11, Pages 1001-1051, Online at: ssrn.com.
  3. Amnesty International (2019): Ethical AI principles won’t solve a human rights crisis. Online at: amnesty.org.
  4. Benjamin, Garfield. (2017). Privacy as a Cultural Phenomenon. Journal of Media Critiques. 3. DOI: 10.17349/Jmc117204.
  5. Farkas, R. (2015): Who cares about privacy? Surprising facts from around the globe. Online at: argonautonline.com.
  6. Gallie, W. B. (1956): Essentially Contested Concepts, Proceedings of the Aristotelian Society, Volume 56, Issue 1, Pages 167–198, DOI: 10.1093/aristotelian/56.1.167.
  7. Selmi, M. (2006): Privacy for the Working Class – Public Work and Private Lives, 66 La. L. Rev. 1035. Online at: law.gwu.edu.
  8. Doctorow, C. (2016): Surveillance has reversed the net’s capacity for social change. Online at: boingboing.net.
  9. Peters, J. (2020): Data Privacy Guide – Definitions, Explanations and Legislation. Online at: varonis.com.
  10. Amnesty International Deutschland (2013): Privatsphäre und Datenschutz. Online at: amnesty.de.
  11. Amnesty International Deutschland (2016): Privatsphäre ist ein Menschenrecht. Online at: amnesty.de.
  12. Bennett, C. (2011): In Defence of Privacy – The concept and the regime. Surveillance & Society 8(4): 485-496.
  13. Brown, I. (2014): Social Media Surveillance. In The International Encyclopedia of Digital Communication and Society. DOI: 10.1002/9781118767771.wbiedcs122.
  14. Marwick, A. (2012): The Public Domain – Surveillance in Everyday Life. Surveillance & Society. Volume 9. DOI: 10.24908/ss.v9i4.4342.
  15. Mirza, H. (2013): Gendered surveillance and the social construction of young Muslim Women in schools. In (In)equalities: Race, class and gender. Online at: academia.edu.
  16. Chander, S. (2021): Artificial intelligence – a tool of austerity. Online at: edri.org.
  17. Lyon, D. (2003). Surveillance as Social Sorting: Privacy, Risk and Digital Discrimination. ISBN: 0-415-27873-2.
  18. Privacy Technical Assistance Center (2014): Protecting Student Privacy While Using Online Educational Services – Requirements and Best Practices. Online at: tech.ed.gov.
  19. Brown, M. and Dent, C. (2018): Privacy Concerns Over Employer Access to Employee Social Media. Monash University Law Review, Volume 43, Number 3, Pages 796-827, Online at: monash.edu.
  20. Amnesty International (2017): “Muslim registries”, Big Data and Human Rights. Online at: amnesty.org.
  21. Lyon, D. (2009): Identifying Citizens – ID Cards as Surveillance. ISBN: 0-745-641-563.
  22. Amnesty International (2016): Edward Snowden – ‘Privacy is for the powerless’. Online at: amnesty.org.
  23. O’Neill, M. and Loftus, B. (2013): Policing and the surveillance of the marginal – Everyday contexts of social control. Theoretical Criminology, 17(4), 437–454. DOI: 10.1177/1362480613495084.
  24. Privacy International (2019): Surveillance and social control – how technology reinforces structural inequality in Latin America. Online at: privacyinternational.org.
  25. Friedman, M. (1975): There’s No Such Thing as a Free Lunch, Open Court Publishing Company. ISBN: 087548297X.
  26. Mintzer, A. (2020): Paying Attention – The Attention Economy. Online at: berkeley.edu.
  27. Zuboff, S. (2016): The Secrets of Surveillance Capitalism. Online at: faz.net.
  28. Doctorow, C. (2020): How To Destroy Surveillance Capitalism. Online at: craphound.com.

Jan is co-founder of ViOffice. He is responsible for the technical implementation and maintenance of the software. His interests lie in particular in the areas of security, data protection and encryption.

In addition to his studies in economics, later in applied statistics and his subsequent doctorate, he has years of experience in software development, open source and server administration.