Prepared for the interdisciplinary symposium “Super-Scoring? Data-driven societal technologies in China and Western-style democracies as a new challenge for education.” Cologne, Germany; October 11, 2019.
by Isabel Zorn
Education for the subject
The goals of education and training rest on underlying concepts of humanity. In Western, democratic societies, for example, people are generally regarded as capable and worthy of education, regardless of their abilities, their constitution, their past or their predicted performance. The person as a subject is to be supported in his or her development. In Germany, good schooling is available free of charge and is even compulsory, including for children with severe mental disabilities and regardless of predictable future performance (with or without scoring). This rests, for example, on the principle of the equal worth of all people and the prohibition of discrimination (cf. Basic Law of the Federal Republic of Germany). Regardless of the particular concepts and goals of educational offerings, educational theory assumes the unavailability of the subject: how a subject develops on the basis of the educational offerings it perceives is neither determinable nor predictable, and should not be predetermined. Although, under the pressure of economisation, the focus repeatedly shifts to the transfer of vocational skills, the general principle remains that schools should enable people to develop their personality in a self-determined way and to experience full participation in society. The UN Convention on the Rights of Persons with Disabilities (UNCRPD), which is based on human rights, underlined this once again. This unavailability of the subject and the democracy-based claim to the free development of the personality is also what Evgeny Morozov (Morozov, 2015) focuses on: he develops reasons why individuals should be interested in data protection as a good worth protecting. It is not about whether people currently have something to hide; far more important is that their free and broad capacity for development should not be curtailed by restricted access to information. Precisely this, however, is the danger of automated, pre-filtered information provision based on earlier data (or scoring results).
However, educational science and education are burdened with the dilemma of oscillating between liberation and the ‘ideological state apparatus’ (cf. Althusser 1971) or ‘governmental leadership’ (Foucault 2000: 64). Dander points to this problem in particular where pedagogy operates within the formal education system and in the education and training of children and young people (Dander, 2014).
Education thus stands in a tension between promoting free development and forming people. It promotes behaviour that increases people’s opportunities rather than regulating people through punitive education. There is a growing debate about the extent to which education is also a form of governing people, exercised by democratic governments or by app-developing companies through nudging (the barely perceptible prompt towards behavioural change). When such prompts are triggered by algorithms and the analysis of previous behaviour, the sociologist Herder even speaks of the governmental power of algorithms (Herder, 2018). Algorithms embedded in media can calculate very precisely who is susceptible to which recommendation for behavioural change, on which topic and at what time. Taken further, such methods could possibly ‘educate’ more effectively (Herder, following Foucault, calls this ‘governing’) than methods of education or upbringing.
The media have traditionally been considered to have a significant influence on the formation and education of people. Media pedagogy has therefore always been confronted with questions about the manipulation potential of media (in earlier times: access to books, access to cinema films showing kissing, effects of depictions of violence; currently: effects of computer games, the erosion of relationship skills through media consumption, opinion manipulation by the press, the reception of fake news, etc.).
Media Education and Scoring Practices
Media pedagogy is both an educational science and a pedagogical practice concerned with media: it analyses pedagogically significant questions relating to media and develops concepts for the development of media competence.
“Media pedagogy encompasses all questions of the pedagogical significance of media in the areas of leisure, education and occupation. Wherever media as means of information, influence, entertainment, instruction and everyday organisation gain relevance for human socialisation, they become the object of media pedagogy” (Hüther & Schorb, 2005, p. 265).
In doing so, it makes use of the knowledge and methods of media studies as well as of education and educational sciences and considers, among other things, how educational processes in a mediatized society should be designed so as both to prevent educational inequalities and to use the potential of media to open up new educational processes. An important area is the analysis of which media competence(s) are necessary in a mediatized, technicized, datafied society, and for which goals.
To this end, it draws on findings from media studies, for example on theories of media use and the conditions of life in a mediatized society (Krotz, 2001), on motivational theories, on media socialization research (i.e. questions about how people are socialized by media and what role media play in development processes), and on media effects research (questions about how the reception of certain media and their contents, or the use of their forms of interaction, affects people).
Media education and media literacy: protection and empowerment
From a pedagogical perspective, media pedagogy develops reasons why, for example, people need to understand media and their possible uses, potentials and risks (media education; Spanhel, 2006), whether and how media competence can be imparted (Baacke, 1996), and how this teaching can be structured. It also discusses how specific media offerings open up possibilities for individual educational processes (Jörissen, 2011).
Media education deals with what people (the focus being primarily on children and young people) know about media and what they need to understand and be able to do. The protection of people from the risks of media use, such as excessive media consumption, is also emphasised. Where protection takes precedence over enabling independent media use, one speaks of protectionist pedagogical approaches. While media education approaches claim to impart the relevant knowledge and skills sufficiently, educational-theory approaches in media pedagogy (Jörissen & Marotzki, 2009; Spanhel, 2010) emphasise the unavailability of the subject. People are thus offered educational opportunities that open up educational possibilities to them, but what people take from these offers and how this educational experience changes their view of themselves and the world cannot be determined.
Typical questions in the discipline of media pedagogy:
The following questions can serve as examples:
- How must education be designed so that participation, equal opportunities, increased opportunities and possibilities for subjective articulation are guaranteed and promoted?
- Where must people be protected from negative influences of media or from dangerous media practices?
- What effects does (missing) media literacy have in a mediatized society? What are the risks for individuals?
Inequality in media use studies
Media usage studies (e.g. JIM, D21, DIVSI) provide information about which people use which media and how. They show that almost the entire population increasingly uses digital media, but that there are great differences in how media are used and how adeptly. This adeptness, variously termed media competence, digital index or information literacy, must be developed for all people in a mediatised democratic society. However, various studies repeatedly point out that disadvantages in education, income, inclusion and employment go hand in hand with disadvantages in media literacy. For example, the ICILS study draws attention to the parallelism of educational disadvantage and information literacy (Bos et al., 2014), and the D21 study to the connection between employment (especially in office occupations) and digital competence (Initiative D21 e. V., 2016, p. 26).
There is an increasing need for methods for the theoretical and empirical analysis of relevant topics and media-induced social problems, for example the intensification of social inequality indicated by the ICILS study: the gap associated with educational disadvantage widens, while access to further education is made more difficult precisely by a lack of media competence.
Digital Inequality and algorithmic scoring
The emergence of digital media has increasingly reinforced structures of inequality (Klein & Pulver, 2019; Iske & Kutscher, 2020; Hargittai, 2002; Zorn, 2017). If people’s behaviour in social media and on the Internet is recorded and evaluated by scoring, it can be assumed that existing disadvantages will be reflected more acutely and further aggravated (O’Neil, 2016). It can also be assumed that people will try to shape their recordable, scoring-relevant behaviour in such a way that as few disadvantages and as many advantages as possible arise for them. Whether this is possible remains to be seen. If it is, it requires not only technical skills but also competences that, according to the Baacke model (Baacke, 1996), refer to dimensions such as media criticism, media studies and media use. It can therefore be assumed that disadvantaged people with less media literacy might be less versed in behaving favourably and would thus be even more disadvantaged by scoring and super-scoring practices precisely because of these lower media literacy levels.
Competences for protecting one’s own data and privacy thus gain not only ideological but also material value. So far, these competences are not very pronounced in the population: current research on privacy competence (a longitudinal study representative for Germany, N = 2,100; Trepte & Masur, 2015) shows that younger people under 18 and older Germans over 64 have the lowest privacy competence, while 18- to 30-year-olds know the most (Trepte, 2016; Trepte & Masur, 2017).
It can be assumed that people want to avoid possible disadvantages. People who have knowledge of the data collection and disadvantage structures of scoring practices, and who have the necessary media skills, will therefore presumably, through self-management and in anticipatory obedience (cf. Foucault’s panopticon analyses), develop practices for which they expect a positive assessment or which conceal their actual actions as effectively as possible.
Transparency about the effectiveness of the scoring algorithms is a necessary basis for this. The report on consumer protection (Sachverständigenrat für Verbraucherfragen, 2018) cites the transparent presentation of scoring practices as an important requirement. In order to understand the practices and to develop competent ways of dealing with them in one’s own media use, media skills are required.
Media Literacy as an Objective of Media Education in the Context of Scoring Practices
A primary goal of media education is the teaching of media competence, which includes, for example, competences in media criticism, media studies, media use and media design (Baacke, 1996). These competence dimensions have already been described sufficiently elsewhere. Here, it will be outlined to what extent these competence dimensions could be developed for dealing with media scoring mechanisms and what should be conveyed in each case.
Dimension media studies:
Which media types record which data? What alternative media are there? Which (alternative) media can I use to achieve my goals? What is data encryption?
Dimension media use:
How are which media installed and used? How is encryption software used? What should one (rather not) communicate via certain media, and how does one leave few data traces?
Dimension media criticism:
What knowledge about German and European legislation must be conveyed? What knowledge of technical, institutional and economic contexts is necessary? How do media systems work? How does scoring work? Who benefits from scoring, and how? What is allowed? Where is privacy protected or attacked by media, and how?
Ganguin (2006) defined the competence of media criticism more precisely with a view to digital datafication. According to this, the capability for media criticism, named by Baacke as one dimension, comprises five sub-dimensions:
- Perceptive faculty (…)
- Decoding capability
- Analytical capability (…)
- Reflectivity (…)
- Judgment (…) (p. 71 et seq.)
These exemplary questions for the teaching of competences make it clear that, in the context of digital, algorithm-based media, the boundary between technical and pedagogical questions quickly becomes blurred.
Technical competences, namely technical knowledge and technical action, are also increasingly required. For the development of competences, education for sensitisation and the acquisition of knowledge is not sufficient: It is also necessary to impart the ability to act.
A media pedagogical perspective on scoring practices
Scoring is a form of collecting and analysing performance and behaviour that results in a score, i.e. an evaluation. Educational science has traditionally been concerned with examining and developing methods for evaluating performance and behaviour. Evaluation, especially scoring in educational contexts, has traditionally served two purposes: feedback and selection (Maier, 2010). Grades aim, on the one hand, to give feedback and to stimulate and motivate improved performance. On the other hand, grades can be used to select: who belongs to the better and who to the worse part of a population, e.g. a school class? Catá Backer already spoke in his lecture today about the tradition of selective lists. Social selection takes place, for example, in decisions about who is admitted to which secondary school after primary school, who is allowed to take the Abitur, and who is admitted to which course of study on the basis of the Abitur grade.
However, research on grading has long shown that grading in schools is highly subjective. In a famous experiment that can still be reproduced today, the educationalist Ingenkamp showed that the same essay, graded by 30 student teachers, repeatedly receives five different grades (Ingenkamp, 1972).
One could now argue that algorithmic evaluation methods would make neutral evaluations more feasible. However, this is only partly true: the evaluation of educational outcomes is influenced by which standards are considered important and applied, and the construction of “neutral” algorithms is also based on standards. Yet it is often no longer visible which standards these are, which variables are processed and how, and who programmed them in which way. The evaluation criteria become opaque, but appear neutral.
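A deliberately simplified sketch can make this concrete: the same person receives different, seemingly “neutral” scores depending on which variables are selected and how they are weighted. All variables, weights and values below are invented purely for illustration; this is not a description of any real scoring system.

```python
# Illustrative sketch only: two scoring functions applied to the same invented record.
person = {"income": 28000, "years_employed": 2, "postcode_risk": 0.7, "payment_defaults": 0}

def score_a(p):
    # Variant A counts only employment history and payment behaviour.
    return 0.5 * min(p["years_employed"] / 10, 1) + 0.5 * (1 - min(p["payment_defaults"], 3) / 3)

def score_b(p):
    # Variant B weights income and additionally penalises the place of residence.
    return 0.4 * min(p["income"] / 60000, 1) + 0.6 * (1 - p["postcode_risk"])

print(f"Score A: {score_a(person):.2f}")  # 0.60
print(f"Score B: {score_b(person):.2f}")  # 0.37
```

Both scores look equally “objective”, yet they embody different decisions about which aspects of a life count and how much; in a black-box system, these decisions are no longer visible to the person being scored.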
Another problem with seemingly neutral algorithmic evaluations is that they influence culture and behaviour. Felix Winter shows in his examples how grading culture and learning culture influence each other (Winter, 2004). Which culture do we want to shape, where and how? These are questions that would also have to be clarified in a broad discourse if scores were to be applied. The audience’s applause for those scientists who received a high score on the academic evaluation portal ResearchGate shows that numbers convey values, even when one has no knowledge of the underlying standards (i.e. the input variables and how they are processed).
If the same variables and algorithms always determine scores, only certain human behaviours are considered and uniformly evaluated as positive or negative. This, however, contradicts the idea of subject-oriented education, which values diversity and difference.
If media (in this case ResearchGate or Facebook) are used for one purpose, namely networking, communication and the exchange of publications, but are programmed for another purpose, namely the creation of scores, then people voluntarily and free of charge hand over their behaviour, and thus their data, to companies for an unintended purpose.
In order to have knowledge about data collection and data processing possibilities up to the creation of scores and to develop corresponding behaviours, media competence is required.
Quiz
In the lecture I developed a kind of quiz for the audience on media literacy regarding data collection. The questions aim to make the discrepancy between knowledge about data protection and practical data protection behaviour tangible and open to reflection.
- Do you know what the privacy problem with WhatsApp is?
- Do you not have WhatsApp on your phone?
- Do you use a privacy-protecting e-mail provider?
- And do you use e-mail encryption?
- Do you know a privacy-protecting search engine?
- Do you use it?
- Do you know a privacy-safe weather app?
- Do you know an excellent resource for explaining how to make your phone safer?
The quiz questions aim at prompting reflection on one’s own knowledge and on its possible dissonance with one’s own behaviour. It can be assumed that people interested in data protection know why it is important to them but, for certain reasons, do not use data protection practices. One of these reasons may be a lack of knowledge about possible courses of action.
Data Literacy Methods
Competence requires both knowledge and the ability to act.
For the successful teaching and promotion of media competence, it is therefore necessary both to impart enlightening knowledge and to offer practical, practicable options for action. Enlightenment alone is not enough. In the following, practical tools are presented as examples for both areas.
Tools for knowledge and sensitizing
Part of media pedagogical work lies in showing the effects of media, their social consequences and their significance for the formation of values and opinions. Traditionally, an important part is illustrating the effects of advertising and its power of manipulation. In relation to digital media and scoring practices, a practical teaching example is presented here:
The following is a tool for sensitizing pupils to the power of algorithmic prognostics and scoring.
Experiencing how easily algorithms can make predictions, and that one can carry out such analyses oneself, can be a powerful form of sensitization. With a data set and analysis software, this can be demonstrated in school lessons.
One needs a data set (https://archive.ics.uci.edu/ml/datasets/student+performance); with the software Orange (https://orange.biolab.si) one can then define variables from it and let the software work with different combinations of them. These combinations can be changed, which makes visible how the selection of certain data shapes the result. Such a method has been developed by Grillenberger and Romeike (2018):
“This data set includes various attributes of the students, their habits and their family situation as well as the points they scored in three examinations. Based on the process the students familiarized themselves with and carried out manually before, this task was performed with software assistance and automated, in order to show the high potential of such analyses and to allow students to adjust their analysis flexibly. For this purpose, we used the tool Orange, which enables data analysis without any programming knowledge by using a graphical interface to visualize and model the data flow. Using this tool, the students were able to conduct analyses and results that were fascinating for them: In particular, they were able to predict the third examination grade with a relatively high accuracy.”
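For readers who want to reproduce this classroom exercise without Orange’s graphical interface, the following minimal Python sketch performs the same kind of analysis. It assumes that the file student-mat.csv from the UCI Student Performance data set has been downloaded locally (it is semicolon-separated and contains the grades G1, G2, G3 alongside demographic and behavioural attributes) and uses scikit-learn instead of Orange:

```python
# Minimal sketch of the analysis described above, using pandas and scikit-learn.
# Assumption: student-mat.csv from the UCI Student Performance data set lies in the working directory.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

data = pd.read_csv("student-mat.csv", sep=";")

# Predict the final grade G3 from all other attributes (habits, family situation, earlier grades).
X = pd.get_dummies(data.drop(columns=["G3"]))  # one-hot encode categorical attributes
y = data["G3"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("Mean absolute error (grade points):",
      round(mean_absolute_error(y_test, model.predict(X_test)), 2))

# Which attributes drive the prediction? Making this visible turns the "standards"
# built into the score into something that can be discussed in class.
print(pd.Series(model.feature_importances_, index=X.columns)
        .sort_values(ascending=False).head(10))
```

As in the lesson described by Grillenberger and Romeike, pupils can then vary which attributes are included (for example, removing the earlier grades G1 and G2) and observe how the accuracy and the apparent importance of personal characteristics change.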
Those who develop technologies themselves come to recognize the effectiveness of algorithms. Technical competence therefore has two sides: on the one hand, the basics of programming and the functioning of algorithms and analysis tools need to be taught; on the other hand, practical applications (tools) and behaviours need to be demonstrated in a recipe-like manner.
Tools for Action
The question of the necessary education is linked to the acquisition of knowledge and to the range of possible courses of action. Anyone who has recognised that predictions about future behaviour can be made from personal data and condensed into a score may want to use his or her data on the Internet more sparingly and in a more controlled manner in future, but does not yet know how to achieve this and what needs to be done.
Therefore, not only an understanding of how algorithms work is needed, but also practical knowledge and its implementation.
If people have the choice between a data-collecting weather app and a privacy-protecting weather app, they will probably use the privacy-protecting one. However, which app that is is often unknown and difficult to research, and researching it often exceeds the possibilities of individual citizens. Here there is a need for a supply of options for action and for their broad communication and publication. Education providers and technology developers are equally called upon, and well-known bodies are needed to present, publish and review such offers. Suitable tools and instructions are presented below. It has already become apparent that a great deal of specialist knowledge is required even to know about these tools.
In her valuable master’s thesis, Ina Sander identified eight data literacy tools that provide both knowledge and concrete options for action (Sander, 2018). Two of them are:
- Do not track: https://donottrack-doc.com/de/intro/
- Me and My Shadow: https://myshadow.org/resources
German-language tools include:
- A helpful German information page is Digital Courage, Digitale Selbstverteidigung: https://digitalcourage.de/digitale-selbstverteidigung
- A better-known European initiative is klicksafe (https://www.klicksafe.de/), which imparts knowledge and practical alternatives for action.
Such tools could not only be used for media literacy training but could also, more generally, encourage educational professionals to adopt recommended software and practices in their everyday work.
However, in many areas, despite the educational offerings mentioned above, it is not yet known how to secure privacy and how this can be technically implemented. The need for research can be identified here.
Yes, these things are important … but I do not have anything to hide!
Whether we have something to hide is not the right question: we do have something to defend!
A democratic, pluralistic society is based on free and diverse individual development and freedom of speech, as well as on the power of the people, which is to be negotiated in public discourse.
A democratic society may therefore want to defend the principle of public discourse when preparing decisions.
In reply to the presentation by Nicolas Kayser-Bril of AlgorithmWatch: he gave the example that in the UK algorithms tend to be used to detect welfare fraud rather than tax avoidance, even though society loses more from tax avoidance. Why has this decision been taken? This is a political question. But, as he explains, “Governance through numbers contributes to the ‘politics of inevitability’, a concept developed by historian Timothy Snyder to describe the process through which political power can be taken away from the public debate.”
Understanding how programming is structured can help to debate technology and algorithms politically. Programming requires the definition of the problem, the data set and the form of evaluation. If a broad social discourse is desired in a democracy about data- and algorithm-controlled evaluations and consequently decisions, then the population must be trained and educated accordingly.
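The three definitional steps named above (problem, data set, form of evaluation) can be made tangible in a skeleton of only a few lines. Every value in the sketch below is an invented placeholder, not a description of any real system; it serves only to mark where political decisions hide inside technical ones:

```python
# Skeleton of the three decisions any scoring program embodies (all values are invented placeholders).

# 1. Problem definition: what is to be detected or predicted?
PROBLEM = "predict risk of welfare fraud"          # could equally be "detect tax avoidance"

# 2. Data set: whose data, which attributes, collected how?
DATA_SOURCE = "benefit_records.csv"                # who appears in this file, and who does not?
FEATURES = ["household_size", "payment_history"]   # which aspects of a life are counted?

# 3. Form of evaluation: what counts as a hit, and at what threshold?
def evaluate(predicted_risk: float) -> str:
    THRESHOLD = 0.8                                # who sets this number, and why?
    return "flag for investigation" if predicted_risk > THRESHOLD else "no action"

print(evaluate(0.85))
```

Each commented question marks a point at which public debate could intervene, provided the public has been educated to ask it.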
A democratic society may want to defend free individual development and freedom of speech. Evgeny Morozov (Morozov, 2015) points out how we put this at risk when we adapt our behaviour to what we think is expected, and how we lose a broad horizon of knowledge when machines assume from early on what we are interested in and present us with more and more of the same and fewer different topics and perspectives. We may then lose the chance to develop a fuller personality.
Greenwald (2014), the journalist who published Snowden’s reports on NSA practices, explained that we behave differently when we feel observed, regardless of whether we objectively have something to hide:
„When we’re in a state where we can be monitored, where we can be watched, our behavior changes dramatically. The range of behavioral options that we consider when we think we’re being watched severely reduce. This is just a fact of human nature that has been recognized in social science and in literature and in religion and in virtually every field of discipline. There are dozens of psychological studies that prove that when somebody knows that they might be watched, the behavior they engage in is vastly more conformist and compliant. Human shame is a very powerful motivator, as is the desire to avoid it, and that’s the reason why people, when they’re in a state of being watched, make decisions not that are the byproduct of their own agency but that are about the expectations that others have of them or the mandates of societal orthodoxy. (…) If you’re willing to render yourself sufficiently harmless, sufficiently unthreatening to those who wield political power, then and only then can you be free of the dangers of surveillance and scoring.“ (Greenwald, 2014).
This change of behaviour is not only highly relevant on an individual level; it also has implications for democratic society and freedom, as the following influential judgement has shown.
Data storage changes practices: census verdict
The famous 1983 Federal Constitutional Court ruling on the census (census ruling) already pointed out that people are likely to change their actions if they have to assume that their actions and data will be observed, processed and interpreted.
“Individual self-determination, however, presupposes – even under the conditions of modern information processing technologies – that the individual is given freedom of decision about actions to be taken or omitted, including the possibility of actually behaving in accordance with this decision. Anyone who is unable with sufficient certainty to overlook what information concerning him is known in certain areas of his social environment, and who is unable to assess the knowledge of possible communication partners to some extent, can be significantly inhibited in his freedom to plan or decide on his own merits. The right to informational self-determination would not be compatible with a social order and a legal order enabling it, in which citizens can no longer know who knows what, when and on what occasion about them. Anyone who is unsure whether deviant behaviour is noted down at any time and permanently stored, used or passed on as information will try not to attract attention through such behaviour. Anyone who expects, for example, that participation in a meeting or a citizens’ initiative will be registered by the authorities and that risks may arise for him as a result will possibly refrain from exercising his corresponding fundamental rights (Article 8, 9 of the Basic Law). This would impair not only the individual development opportunities of the individual, but also the common good, because self-determination is an elementary functional condition of a free democratic community based on the ability to act and participate of its citizens.” (Bundesverfassungsgericht quoted from openjur, 1983, para. 94)
The judgment thus links individual actions with their effects on democracy: refraining from actively exercising freedom of expression is not only a restriction of the free development of the individual’s personality; it also threatens the development of democracy and the rule of law.
“From this follows: Under the modern conditions of data processing, the free development of personality presupposes the protection of the individual against unlimited collection, storage, use and disclosure of his/her personal data. This protection is therefore covered by the fundamental right of Article 2 (1) in conjunction with Article 1 (1) of the Basic Law. The basic right guarantees in this respect the authority of the individual to determine the disclosure and use of his or her personal data.” (Bundesverfassungsgericht quoted from openjur, 1983, para. 95)
The census verdict made it clear why data storage can affect people’s behaviour and thus the social order.
Conclusion: Technical education is political education
Data storage and processing that take place technically, non-transparently and possibly unnoticed, and that can have consequences for the individual and for society, become a politically relevant topic. Technical education is thus political education. Political education about scoring and inequality-producing categorization requires basic technical knowledge, or at least the interest to look at the underlying technical processes.
A pluralistic democratic society needs education that demonstrates the connection between technical arrangements and their effects on society, on social coexistence and, where relevant, on social change, and that enables people to understand and to act independently. The teaching of basic technological knowledge must therefore be linked with insights into the interrelations between the technical and the social (Gapski, Tekster, & Elias, 2018).
It is also necessary to act outside of education, for example by means of legal regulation. Education alone cannot cope with the effects of technologies on individual and social development. Shifting responsibility onto individuals and their individual actions, which are often merely guided (nudged) reactions rather than truly informed decisions, ignores the social implications that the Federal Constitutional Court’s ruling made visible. An informed social and political discourse is therefore needed in many places, one that makes decisions transparent and allows the political to remain political, even and especially when politics increasingly resorts to numbers and scores.
References
Baacke, D. (1996). Medienkompetenz – Begrifflichkeit und sozialer Wandel. In A. v. Rein (Ed.), Medienkompetenz als Schlüsselbegriff (pp. 111–123). Bad Heilbrunn. Retrieved from http://www.die-frankfurt.de/esprid/dokumente/doc-1996/rein96_01.pdf#page=111
Bos, W., Eickelmann, B., Gerick, J., Goldhammer, F., Schaumburg, H., Schippert, K., … (Eds.). (2014). ICILS 2013: Computer- und informationsbezogene Kompetenzen von Schülerinnen und Schülern in der 8. Jahrgangsstufe im internationalen Vergleich. Münster: Waxmann.
Volkszählungsurteil, No. Az. 1 BvR 209/83, 1 BvR 420/83, 1 BvR 362/83, 1 BvR 440/83, 1 BvR 484/83, 1 BvR 269/83, openJur 2012, 616 (Bundesverfassungsgericht zitiert nach openjur December 15, 1983).
Dander, V. (2014). Von der ‚Macht der Daten‘ zur ‚Gemachtheit von Daten‘. Praktische Datenkritik als Gegenstand der Medienpädagogik. medialekontrolle.de. (3.1), 2014. Retrieved from http://www.medialekontrolle.de/wp-content/uploads/2014/09/Dander-Valentin-2014-03-01.pdf
Ganguin, S. (2006). Das ‘Kritische’ an der Medienkritik. In H. Niesyto, M. Rath, & H. Sowa (Eds.), Medienkritik Heute: Grundlagen, Beispiele und Praxisfelder (pp. 71–86). München.
Gapski, H., Tekster, T., & Elias, M. (2018). Bildung für und über Big Data – ABIDA Gutachten. Retrieved October 11, 2019, from http://www.abida.de/sites/default/files/Gutachten_Bildung.pdf
Greenwald, G. (2014). TED Talk: Why privacy matters. Retrieved October 09, 2019, from https://www.ted.com/talks/glenn_greenwald_why_privacy_matters/transcript#t-69178
Grillenberger, A., & Romeike, R. (2018). Developing a theoretically founded data literacy competency model. In Proceedings of the 13th Workshop in Primary and Secondary Computing Education, WiPSCE 2018, Potsdam, Germany, October 04-06, 2018 (9:1-9:10). doi:10.1145/3265757.3265766
Hargittai, E. (2002). Second-Level Digital Divide: Differences in People’s Online Skills. First Monday, 7(4), 942. doi:10.5210/fm.v7i4.942
Herder, J. (2018). Regieren Algorithmen? Über den sanften Einfluss algorithmischer Modelle. In R. Mohabbat-Kar, B. E.P. Thapa, & P. Parycek (Eds.), (Un)berechenbar? – Algorithmen und Automatisierung in Staat und Gesellschaft (1st ed., pp. 179–203). Berlin: Kompetenzzentrum Öffentliche IT. Retrieved from https://www.oeffentliche-it.de/documents/10181/76866/7+Herder+-+Regieren+Algorithmen.pdf
Hüther, J., & Schorb, B. (2005). Grundbegriffe Medienpädagogik (4., vollst. neu konzipierte Aufl). München: kopaed.
Ingenkamp, K. (Ed.). (1972). Beltz-Studienbuch: Vol. 23. Die Fragwürdigkeit der Zensurengebung: Texte u. Untersuchungsberichte (3. Aufl.). Weinheim (Bergstr.): Beltz.
Initiative D21 e. V. (2016). 2016 D21-DIGITAL-INDEX – Jährliches Lagebild zur Digitalen Gesellschaft. Retrieved from http://www.initiatived21.de/wp-content/uploads/2016/11/Studie-D21-Digital-Index-2016.pdf
Iske, S., & Kutscher, N. (2020). Digitale Ungleichheiten im Kontext Sozialer Arbeit. In N. Kutscher, T. Ley, U. Seelmeyer, F. Siller, A. Tillmann, & I. Zorn (Eds.), Handbuch Soziale Arbeit und Digitalisierung. Weinheim [u.a.]: Beltz Juventa.
Jörissen, B. (2011). “Medienbildung” – Begriffsverständnisse und -reichweiten. In H. Moser, P. Grell, & H. Niesyto (Eds.), Medienbildung und Medienkompetenz – Beiträge zu Schlüsselbegriffen der Medienpädagogik (pp. 211–235). München: kopaed.
Jörissen, B., & Marotzki, W. (2009). Medienbildung – eine Einführung: Theorie – Methoden – Analysen. Utb: Vol. 3189. Bad Heilbrunn: Klinkhardt.
Klein, A., & Pulver, C. (2019). Professionalisierung in der Sozialen Arbeit. In I. Bosse, J.-R. Schluchter, & I. Zorn (Eds.), Handbuch Inklusion und Medienbildung (pp. 319–325). Weinheim [u.a.]: Beltz Juventa.
Krotz, F. (2001). Die Mediatisierung kommunikativen Handelns: Der Wandel von Alltag und sozialen Beziehungen, Kultur und Gesellschaft durch die Medien. Wiesbaden: VS Verlag für Sozialwissenschaften. Retrieved from http://dx.doi.org/10.1007/978-3-322-90411-9
Maier, U. (2010). Formative Assessment – Ein erfolgversprechendes Konzept zur Reform von Unterricht und Leistungsmessung? Zeitschrift für Erziehungswissenschaft, 13(2), 293–308. doi:10.1007/s11618-010-0124-9
Morozov, E. (2015). “Ich habe doch nichts zu verbergen”: Essay. Aus Politik und Zeitgeschichte, 65(11-12), 3–7. Retrieved from http://www.bpb.de/apuz/202238/ich-habe-doch-nichts-zu-verbergen?p=all
O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy (First edition).
Sachverständigenrat für Verbraucherfragen. (2018). Verbrauchergerechtes Scoring: Gutachten. Retrieved November 12, 2018, from SVRV: http://www.svr-verbraucherfragen.de/wp-content/uploads/SVRV_Verbrauchergerechtes_Scoring.pdf
Sander, I. (2018). Critical big data literacy and attitudes towards privacy (Master Thesis). Cardiff University, Cardiff.
Spanhel, D. (2006). Medienerziehung: Erziehungs- und Bildungsaufgaben in der Mediengesellschaft (Handbuch Medienpädagogik, hrsg. von H. Dichanz, Bd. 3). Stuttgart: Klett-Cotta.
Spanhel, D. (2010). Medienbildung statt Medienkompetenz? Zum Beitrag von Bernd Schorb (merz 5/09). Medien + Erziehung, 49–54.
Trepte, S. (2016). Privatheitskompetenz: Das Wissen der Bürger über Privatheit und Datenschutz: Empfehlungen an Wirtschaft und Politik. In M. Friedewald, R. A. Quinn, M. Hansen, J. Heesen, T. Hess, J. Lamla, … M. Waidner (Eds.), Forum Privatheit und selbstbestimmtes Leben in der digitalen Welt. Karlsruhe. Retrieved from https://www.forum-privatheit.de/wp-content/uploads/Forum-Privatheit_Policy-Paper-Literacy.pdf
Trepte, S., & Masur, P. K. (2017). Privacy Attitudes, perceptions, and behaviors of the German population: Research report. In M. Friedewald, R. A. Quinn, M. Hansen, J. Heesen, T. Hess, J. Lamla,. . . M. Waidner (Eds.), Forum Privatheit und selbstbestimmtes Leben in der digitalen Welt. Karlsruhe. Retrieved from https://www.forum-privatheit.de/wp-content/uploads/Trepte_Masur_2017_Research_Report_Hohenheim.pdf
Winter, F. (2004). Leistungsbewertung: Eine neue Lernkultur braucht einen anderen Umgang mit den Schülerleistungen. Grundlagen der Schulpädagogik: Vol. 49. Baltmannsweiler: Schneider-Verl. Hohengehren. Retrieved from http://www.gbv.de/du/services/agi/7B9E3953CEA28942C1256FB30050B2E6/420000136084
Zorn, I. (2017). Wie viel „App-Lenkung“ verträgt die digitalisierte Gesellschaft? Herausforderungen digitaler Datenerhebungen für die Medienbildung. In S. Eder, C. Mikat, & A. Tillmann (Eds.), Schriften zur Medienpädagogik: Vol. 53. Software takes command. Herausforderungen der „Datafizierung“ für die Medienpädagogik in Theorie und Praxis (pp. 19–33). München: kopaed.