Digital Self-Determination


What is Digital Self-Determination, how can it be restricted, and how does ViOffice try to preserve the Digital Self-Determination of its users?

In this blog post, we would like not only to explain the concept of Digital Self-Determination (in a narrower sense, also Informational Self-Determination), but also to show how previously discussed topics such as Digital Monopolies, Corporate Social Responsibility and Free Software are interrelated.

What does Digital Self-Determination mean?

Before we go into detail about the meaning of “Digital Self-Determination,” we would first like to look at self-determination in general. The term has provided room for ongoing philosophical, legal and political debate since ancient times. Although it is not possible to cover all approaches here, it seems useful to approach the topic via a definition from the German Ethics Council:

“Self-determination, against the background of the autonomy to which human beings are fundamentally entitled, denotes the possibility of realizing one’s own plans and decisions for action.” [1]

In this context, autonomy is defined as “the fundamental ability of human beings to make reasonable considerations of their own accord.”

As a possible description of Digital Self-Determination, this implies the possibility of realizing one’s own plans and decisions for action in the digital space or when using digital media. An important component of Digital Self-Determination is Informational Self-Determination, which includes the possibility of making one’s own decisions about the use of personal data. Digital Self-Determination thus encompasses this aspect, but goes beyond it by including other facets of our digital lives apart from the disclosure and use of personal data. But how self-determined are consumers really when it comes to deciding which private data companies may collect and store online, or which messaging service to use for daily communication?

What limits Digital Self-Determination?

To address such questions, the German government appointed the so-called Data Ethics Commission, which states in its 2019 report:

“The more information third parties have collected about the individual, the more difficult it becomes to act impartially in social situations or even to completely reinvent oneself as an individual. […] An erosion of the competencies of consumers required for self-determination, for example through an excessive use of decision-making assistants and associated habituation effects, raises ethical questions about the heteronomy and freedom of choice of individuals, but also societal control by individual actors with market power.” [2]

The core of this complicated formulation can be illustrated by looking at the interactions between digital monopolies, non-free (i.e., proprietary) software, and data security. Free software is defined in terms of the four freedoms: the freedom to use the software, the freedom to understand it, the freedom to distribute it, and the freedom to improve it. Conversely, it follows that consumers of non-free software may not decide for themselves what they want to do with it; instead, the rights of use are dictated by the developers (or often by their clients). The distribution of copies to third parties is usually prohibited and can even be a punishable offense. In addition, the source code is not freely accessible, which means that users (or external experts) are often unable to verify exactly what the software does in the background and, for example, which personal data is sent to which companies. [3, 4]

However, these disadvantages can be circumvented as long as an equivalent alternative based on free software exists. The situation becomes particularly problematic if the providers of the non-free, proprietary software at the same time hold a dominant market position, i.e. if there is a (digital) monopoly or quasi-monopoly. Users then become dependent: they must accept the restrictive terms of use, and digital self-determination is lost to a certain extent. In addition, the so-called lock-in effect prevents, or at least complicates, switching to competing software.

The relevance for society as a whole becomes clear when looking at the example of WhatsApp. The planned change to the terms of use, which includes the transfer of personal data to the parent company Facebook, is currently the subject of intense debate. Other changes, such as the weakening of the encryption methods used in WhatsApp, are also causing great concern. The problem is mainly that WhatsApp has become an integral part of everyday communication for many people. Continued use of WhatsApp requires approval of the changed terms of use; without this approval, WhatsApp can no longer be used in the future. As a proprietary service, WhatsApp is a closed environment that, unlike many free software messengers such as Element or ViOffice Talk, does not allow people to communicate with users on other platforms. Thus, despite privacy concerns, it is very difficult for many people to completely abandon WhatsApp and other Facebook services without experiencing noticeable disadvantages in communicating with their peers. [5]

Another aspect addressed in the above quote is the direct consequences of the personal data collected. Various algorithms evaluate this information and forward it to analysis and advertising companies such as Facebook and Google, which then display advertising tailored to us. This often concerns products, but the underlying profiles also cover other personal interests, political opinions, social connections and behavioral information. It is important to understand that while personally tailored advertising as such may seem unsettling, it is not the central problem. Rather, it is merely a symptom of the behavioral and interest analysis that increasingly intervenes in our lives. As a result, the influence of powerful actors grows, while the chance for individuals to strike out on new paths in life shrinks. This effect is reinforced by assistance systems that increasingly take personal decisions away from us. [6, 7, 8]

In its recently published “Data Strategy,” the German government analyzes many of the prevailing problems. In the political discourse, the argument that there is no alternative often comes up, as does the claim that no adequate, privacy-protecting alternatives exist. This is countered by the fact that digitalization (and digital skills) is still only slowly getting underway in a large number of European countries and companies, especially in Germany. At the same time, these same companies in particular seem to be relying more heavily on behavioral and user analysis by external “Big Data” companies and on non-European infrastructure with correspondingly weaker data protection regulation. [9, 10, 11, 12]

How does ViOffice try to protect the Digital Self-Determination of its users?

ViOffice is a young business project oriented towards a sustainable, ethical and socially just future. We see it as our duty to act in a socially responsible manner. This explicitly includes protecting the Informational and Digital Self-Determination of our users.

Our services are based 100% on free software, which means it is always transparent what our applications do and what they do not do. This also makes it easier to switch to other software at a later date, as the lock-in effect is avoided. We do not want to create long-term dependencies on our services; instead, we focus on a compelling and sustainable offering. We do not collect any personal data and reject user tracking as a matter of principle.

Finally, we would like to explicitly point out that we ourselves are a digital business. It is therefore not our intention to denounce individual companies or the use of their services, or to pass final judgment on individual cases. Rather, we use the best-known examples to provide information on the topic and to point out existing problems in this context, as well as the business and economic models mentioned. In doing so, we take the opportunity to link the topic in general, independent of individual examples, to our own corporate philosophy.


[1] German Ethics Council (2013): Die Zukunft der genetischen Diagnostik – von der Forschung in die klinische Anwendung. Stellungnahme [German]. 30.04.2013.

[2] Datenethikkommission (2019): Opinion of the Data Ethics Commission. 23.10.2019.

[3] Free Software Foundation (2002): What is Free Software?

[4] Free Software Foundation (2013): Proprietary Software is often Malware.

[5] Eva-Maria Weiß (2021): WhatsApp ändert Nutzungsbedingungen: Daten werden mit Facebook geteilt [German]. Heise Online, 07.01.2021.

[6] Shoshana Zuboff (2019): Surveillance Capitalism and Democracy. Aus Politik und Zeitgeschichte (APUZ 24-26/2019). Bundeszentrale für politische Bildung.

[7] Shoshana Zuboff (2019): Im Zeitalter des Überwachungskapitalismus [German]. Netzpolitik.ORG, 12.06.2019.

[8] Shoshana Zuboff (2014): A Digital Declaration. FAZ.NET, 15.09.2014.

[9] German Federal Government (2021): Data Strategy of the German Federal Government.

[10] Eurostat (2018): Cloud Computing Services used by more than one out of four enterprises in the EU. Eurostat Press Release 193/2018, 13.12.2018.

[11] Eurostat (2020): Urban and rural living in the EU.

[12] Eurostat (2018): Internet advertising of businesses.