AVT – exit report

Legal basis

Processing of personal data

In short, personal data is any information that can be tied to a natural person, directly or indirectly. It does not matter what format the information is in: text, images, video and audio are all included. “Processing” includes everything that is done with the information: collection, structuring, changing, analysis, streaming, disclosure, transfer, storage, deletion, etc.

Introduction

Public authorities process a lot of personal data about us as citizens. In order for this processing to be lawful, they need a legal basis – a statutory authority. This statutory authority defines the framework of what the authority (such as a municipality) can use our personal data for, and serves to protect citizens from intervention in their lives.

Municipalities are responsible for Norway’s primary and lower secondary school (see Section 13-1 of the Education Act). This means they are responsible for and have the authority to decide how this education is provided – within certain limits, of course. As an example, it would be difficult to achieve good and effective education if each parent had the power to decide which textbooks they wanted their child to have. In addition, the municipalities are “school owners” of the public primary and lower secondary schools. In other words, municipalities have the power to decide what happens in “their” public primary and lower secondary schools. Legally, the municipalities are then also the “data controller” for the processing of the pupils’ personal data. The learning analytics system that is being developed in the AVT project is something municipalities will be able to offer schools, in line with other teaching aids and support systems teachers use to plan, teach and follow up on the pupils’ learning. If the school chooses to implement the system, it will not be optional for the pupils.

Legal basis requirement

Municipalities need a legal basis in order to require pupils to use the learning analytics system. And because the system uses artificial intelligence, they need a legal basis for both the application phase and the development phase (also called “post-learning”) of the artificial intelligence (the algorithm). The development of the algorithm is a process that often continues after the system has been implemented – as a perpetually repeated process – because the system learns from new data that is added and generated. The purpose of the two types of processing required by these phases must be defined in detail by the data controller, and we return to the AVT project’s definitions later in this chapter.

It is natural to split the question of legal basis in two, based on the two main phases in an AI project: the development phase and the application phase. The two phases utilise personal data in different ways.

Article 6 of the General Data Protection Regulation defines several alternatives for lawful processing. Two of these are relevant for learning analytics:

  • Article 6 (1) (c): the processing is necessary for compliance with a legal obligation to which the controller is subject (“legal obligation”)
  • Article 6 (1) (e): the processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (“public interest”)

Both of these legal bases require what we call a supplementary legal basis. This means that the processing must also have a legal basis in another law. The requirement for a supplementary legal basis follows from Article 6 (3) of the Regulation.

The supplementary legal basis does not necessarily need to regulate the processing in detail, even if the Regulation does allow it. However, we also need to consider the legal basis in light of other legal requirements. Among other things, the supplementary legal basis must be designed in such a way that the processing is predictable for the person whose data is being processed. This is part of the legal protection principle of predictability, which is a central principle in Norwegian law. In addition, the more invasive the processing is, the clearer the legal basis should be. The principle of data minimisation (Article 5 of the GDPR) is also relevant, because it limits the scope of personal data available for lawful processing. This principle states that processing must be limited to what is necessary in relation to the purposes of the processing. In order to make assessments and adapt education to the needs of individual pupils, it is, for example, necessary to process information about the pupil’s learning progress in different subjects, but it is not necessary to process information about the pupil’s leisure time activities or when the pupil did their homework, even if this information could be relevant for the assessment.

It is also worth noting that the conditions associated with “legal obligation” are more narrowly defined than those associated with “public interest”. If the processing is based on the former, the controller must be legally obligated to perform it, and the purpose must follow from the supplementary legal basis. This means that the legislator must, through a democratic process, define the purpose of the processing the controller is legally obligated to perform. Furthermore, the data subject has no right to object to the processing if the data controller is legally obligated to perform it. The limited scope of action that follows from “legal obligation” implies that there are no real alternatives to the processing of personal data for achieving the stated purpose. A logical consequence of this is that it would not serve any practical purpose to give the data subject a right to object.

However, if the processing is conducted for the purposes of performing a task carried out in the public interest or in the exercise of official authority vested in the controller, the purpose does not need to be specified in the legal basis (although it may be). It is sufficient that the purpose is necessary for performing the task or exercising the authority in question. As such, letter (e) gives the data controller room to define the purpose themselves. This means that the requirements for the supplementary legal basis are somewhat less extensive if the processing is based on “public interest” than if it is based on “legal obligation”.

For processing based on “public interest”, the data subject also has the right to object to the processing, see Article 21 of the Regulation. Such objections must be made on grounds relating to “that person’s particular situation”, but the provision does not specify further what this entails. The wording suggests it may concern individual circumstances of virtually any nature. The right to object, however, is not absolute. If the data controller can demonstrate “compelling legitimate grounds for the processing which override the interests”, the processing may continue (see Article 21 (1) of the GDPR).

The Regulation also provides other legal bases for processing, which are not relevant for our project. Basing the processing on consent, for example, is not an option. Consent may be withdrawn at any time, and is not an appropriate solution where there is an imbalance of power between the parties. Basing the processing on contracts between the municipality and the parents is also not an option, because nobody can be forced to enter into a contract.

Discussion of legal basis

The defined purpose of the AVT project's application phase

Use learning analytics to support teachers in their assessment work, to provide better adaptations and to provide pupils with insight into their own learning.

The application phase

Legal basis was the most frequently discussed topic in this sandbox project, and these discussions primarily focused on the application phase of the learning analytics system. 

There is no doubt that the municipalities have a legal basis for processing the personal data of pupils, both to provide educational adaptations and to assess their performance. In 2021, a new provision was added to the Education Act (Section 15-10).

This provision grants school owners the general right to process personal data, including special categories of personal data, whenever necessary to perform tasks pursuant to the Education Act.

It follows from Section 1-3 of the Education Act that education must be adapted to the abilities and aptitudes of the individual pupil. Furthermore, it follows from the Education Regulations that pupils have a right to continuous assessment in subjects. The purpose of assessment is to promote both learning and the desire to learn, as well as to provide information about competence acquired at various stages of instruction in the subject. The basis for assessment is the competence objectives in the subject curriculum. The requirements for continuous assessment are defined in Section 3-10 of the regulations. The regulations specify that continuous assessment is an integrated part of education, and that this assessment shall be used to promote learning, adapt education and increase competence in the subject. The municipality is responsible for ensuring that the requirements of both the act and the regulations are met, which includes making available any and all resources necessary for ensuring compliance with these requirements.

The discussions in the sandbox have primarily focused on whether the legal obligation municipalities have to adapt education and assess pupil performance also extends to the processing of the pupils’ personal data using learning analytics and artificial intelligence. As mentioned above, this must be seen in light of, among other things, the supplementary legal basis, the principle of data minimisation, and the legal protection principle of predictability. Before we move on to the discussion, we would like to provide a quick summary of how adaptation and assessment work today.

There is no universal method for how municipalities assess pupil competence and provide adapted education. Methods vary between schools and between teachers at the same school. Nevertheless, the AVT project has provided a general example: The teacher gathers assessment information manually by looking at completed activities in each teaching aid used by the pupils. Each teaching aid presents assessment data in different ways. The teacher must therefore try to connect the dots and build an overview of the pupil’s competence and development, based on the pupil’s answers and activities, in order to generate the best possible basis for assessment. This basis is then applied in the adaptation of education. This process is relatively unregulated, and the tools used range from the teacher’s own memory, via unstructured written notes, to more structured systems, such as digital learning platforms with various functionalities.

A relevant comparison for the learning analytics system is the teaching aids and support systems offered by major providers with broad product portfolios, which often include similar functionality for compiling pupil performance data across their own products. The learning analytics system of the AVT project will collect more of this essential functionality for teachers in a single system, which will also use artificial intelligence.

There are several benefits of implementing this type of centralised learning analytics system, which gathers activity data from different teaching aids and uses artificial intelligence to process it (a simplified sketch of the compilation follows the list):

  • The activity data for each individual pupil will be collected from different teaching aids and easily accessible from a single system.
  • The presentation of the data is more structured, because they have a comparable format and are tied to elements in the curriculum. This will improve the quality of the analysis, performed by both the system and the teacher.
  • Analyses of individual pupil learning will be more accurate, as the pupil’s data can be compared with aggregated data about other pupils.
  • New personal data is generated, and a pupil profile can indicate where the pupil is in their learning process and provide recommendations for what to do next. This would be difficult to extract without the compilation provided by the system’s AI.
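
To make the compilation more concrete, the following is a minimal sketch, based on our own assumptions rather than the AVT project’s actual implementation, of how normalised activity data from several teaching aids could be aggregated per competence objective for a single pupil. The class and function names (ActivityRecord, compile_pupil_overview), the identifiers and the 0.0–1.0 score scale are purely illustrative.

```python
# Minimal sketch (assumed design, not the AVT project's implementation) of
# compiling activity data from several teaching aids into one overview per
# pupil, keyed to competence objectives from the curriculum.
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

@dataclass
class ActivityRecord:
    pupil_id: str              # pseudonymous identifier for the pupil
    teaching_aid: str          # which teaching aid the activity came from
    competence_objective: str  # curriculum element the activity is mapped to
    score: float               # normalised result in the range 0.0-1.0

def compile_pupil_overview(records: list[ActivityRecord], pupil_id: str) -> dict[str, float]:
    """Aggregate one pupil's results per competence objective across all teaching aids."""
    per_objective: dict[str, list[float]] = defaultdict(list)
    for record in records:
        if record.pupil_id == pupil_id:
            per_objective[record.competence_objective].append(record.score)
    # Average the normalised scores so results from different aids are comparable.
    return {objective: mean(scores) for objective, scores in per_objective.items()}

# Example: results from two different teaching aids feed the same overview.
records = [
    ActivityRecord("pupil-42", "aid-A", "fractions", 0.8),
    ActivityRecord("pupil-42", "aid-B", "fractions", 0.6),
    ActivityRecord("pupil-42", "aid-B", "decimals", 0.9),
]
print(compile_pupil_overview(records, "pupil-42"))  # expected: {'fractions': 0.7, 'decimals': 0.9}
```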

The AVT project’s assessment is that continuous adaptation and assessment work, with or without the use of digital learning analytics, is in principle the same processing. The legal obligation to provide assessment and adapted education is consistently implemented in the Education Act and the associated regulations. This obligation can be fulfilled in various ways, using various tools – both printed and digital. The scope of data processed about each individual pupil, and the metadata (context) this is added to, vary and are not tool-dependent. The AVT project therefore maintains that there is no difference, in principle, between a teacher performing their own analyses of pupil competence and adaptation needs by, for example, studying the pupil’s performance across various teaching aids, notebooks and verbal participation in class, and a learning analytics system helping the teacher make these assessments by compiling and processing data from some of the same teaching aids in a visual interface.

Furthermore, the AVT project considers that the choice of tool is not relevant when assessing whether or not a legal basis is sufficient. In support of this view, they refer to the Privacy Appeals Board’s decision in the Spekter case, where the Board wrote:

From the Privacy Appeals Board’s Spekter decision:

“While the data may be qualitatively different as a result of digital collection compared to collection through observations and communication with the pupils, this is also not something that would lead to the legal basis in this case being deemed insufficient. The Board does not believe it is relevant in this context to assess the appropriateness of Spekter versus other means of collecting the same information.”

As a further argument in support of the use of a learning analytics tool falling within the scope of the legal obligation, the AVT project emphasised that it is the school’s responsibility to adapt education to ensure the best possible basis for assessment. This applies both to the breadth of the subject and to in-depth knowledge within certain areas of the subject. Furthermore, the AVT project points to the fact that specific competence objectives within the curriculum define the framework of what is necessary, and that the principle of necessity in the Personal Data Act must therefore be seen in light of this. Personal data that is not relevant for the pupils’ attainment of the learning objectives for the subject will therefore be unnecessary, whereas personal data that is relevant for the pupils’ attainment of the learning objectives will be necessary.

As a basis for structuring personal data in the learning analytics system, a domain structure (called fagkart – subject map – in the project) was developed. This subject map is a digital representation of the curriculum and of more detailed subject topics and concepts within it. In addition, the project has defined which types of metadata to register, through an xAPI profile and examples. The AVT project claims this framework ensures that all personal data used is relevant, and therefore necessary. They claim that the elements of the metadata are what must be verified in light of the necessity principle, and not the scope of the data sources themselves. The AVT project argues that they have subjected these elements to a rigorous assessment. Six of the ten elements in the subject map, for example, are taken straight from the curriculum, whereas the remaining four are based on it.
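
As an illustration of the metadata structure described above, the following is a hypothetical, simplified xAPI-style statement linking one completed activity to an element in a subject map. The URLs, the pupil identifier and the extension key are our own assumptions and are not taken from the AVT project’s actual xAPI profile or subject map.

```python
# Hypothetical xAPI-style activity statement (not the AVT project's actual profile).
# It ties one completed activity in a teaching aid to a subject-map (fagkart)
# element, so the result can be related to a curriculum competence objective.
import json

statement = {
    "actor": {
        "account": {
            # Pseudonymous pupil identifier issued by the school owner (assumed).
            "homePage": "https://example.org/pupils",
            "name": "pupil-42",
        }
    },
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed"},
    "object": {
        # The completed activity in a specific teaching aid (URL is illustrative).
        "id": "https://example.org/teaching-aid-A/activity/fractions-quiz-3",
        "definition": {
            "extensions": {
                # Link to the relevant subject-map element (key and value assumed).
                "https://example.org/fagkart/element": "mathematics/fractions",
            }
        },
    },
    "result": {"score": {"scaled": 0.8}, "completion": True},
}

print(json.dumps(statement, indent=2))
```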

The view of the Data Protection Authority is that the choice of tool may be relevant for the assessment of whether the processing is covered by the supplementary legal basis. Different tools process personal data differently, and processing using one tool may constitute a greater intervention than processing using another tool.

The choice of tool is less relevant to the assessment if it involves two similar tools processing the same amount of personal data in almost identical ways. The use of learning analytics and artificial intelligence will, among other things, generate new and different types of personal data about the pupils, which means that much more personal data about each individual pupil will be processed. The system will also be capable of learning, finding connections, conducting probability analyses and drawing conclusions far beyond what both humans and systems without artificial intelligence are capable of. In our view, therefore, the use of learning analytics with the help of artificial intelligence represents a fundamental deviation from current methods of adaptation and pupil assessment.

Even so, whether or not there is a fundamental difference between the current method and the future method involving learning analytics is not the determining factor for which purpose of processing is most appropriate. Even if there is a fundamental difference, this does not necessarily entail that the processing cannot be based on a “legal obligation”. To consider this, it would be relevant to explore whether, and if so, how, the processing of personal data is affected by the implementation of the learning analytics tool.

The Data Protection Authority supports that, in connection with assessment of the pupil’s performance and adapting the education to meet the pupil’s needs, it may be necessary to process personal data that are relevant for the pupil’s attainment of the learning objectives defined by the curriculum. We do not, however, share the view that the scope of the data sources is irrelevant in a consideration of which types of data it would be necessary to process, to “comply with a legal obligation” or to “perform a task carried out in the public interest”, respectively. The necessity requirement defines the limits of what constitutes lawful processing of personal data within the defined purposes. The principle of necessity is a legal standard in the EU/EEA and must be interpreted in accordance with the Regulation’s purpose. If the processing is based on a “legal obligation”, the framework provided by the Education Act sets a firm boundary, in that the legislator lays down the purposes, and considerations of which types of data are considered necessary must be made in light of these. If the processing is based on “public interest”, the processing must also be performed within the scope of the Education Act, but the controller has more freedom to define what the purposes are. In the Data Protection Authority’s view, the learning analytics system appears to be processing more types of personal data than what is strictly necessary to fulfil any legal obligation. The Data Protection Authority, then, has some doubts about whether the municipalities can be said to have a legal obligation to use learning analytics and artificial intelligence to adapt education and assess pupil performance. Based on the Data Protection Authority’s limited understanding, therefore, it would seem more appropriate to base the processing on “public interest”, where the municipalities have more freedom to define what the purposes of the processing are.

It is important to point out that the sandbox project did not include a comprehensive analysis of how the obligation to provide assessments and adapted education to each individual pupil is currently handled, or of the degree to which current methods are suitable for fulfilling this obligation. The AVT project has highlighted that the quality of continuous assessment and adapted education is likely to increase considerably if the learning analytics system were implemented. The project has furthermore argued that one of the central values of the learning analytics system is that each pupil would have improved access to and use of their personal data, as a result of the data being available on a joint platform managed by public authorities.

The development phase

In considering the legal basis, it is natural to distinguish between the application phase, as described above, and the development phase. Among other things, the development phase entails that the learning analytics system uses activity data from the pupils to train the algorithm, making it more accurate in its predictions.

The AVT project concluded that the purposes for the two phases are different. One must therefore consider the legal basis for the algorithm’s development phase on its own, separate from the main purpose. In the sandbox, the partners agreed that “public interest” was the most appropriate legal basis for the development phase. The municipalities do not currently have any legal obligation to contribute to the development of AI tools for use in education.

The processing of personal data about individual pupils for the purpose of developing the algorithm in a learning analytics tool is a task carried out in the public interest, and not just in the interest of the individual pupil whose personal data is processed. There is reason to assume that processing for this purpose may appear less predictable for the pupils and their parents than does the processing of personal data in connection with the use of the learning analytics system.

By basing the processing on “public interest”, the pupils are granted the right to object. Read more about this later in this report. The AVT project aims to design the learning analytics system in a way that protects this right, with an easily accessible function where pupils can request that their data not be used to train the system. In addition, they want to limit the impact on the pupils’ privacy by pseudonymising personal data used to develop the algorithm. They also want to look into the possibility of anonymising the pupils’ personal data for this purpose. The latter depends on the policy adopted by the European Data Protection Board in its guidelines on anonymisation, which are currently under preparation.
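
To illustrate the safeguards mentioned above, here is a minimal sketch, under our own assumptions rather than the AVT project’s chosen design, of how objections could be respected and pupil identifiers pseudonymised before activity data is used for training. The keyed-hash approach, the function names and the opt-out set are purely illustrative.

```python
# Minimal sketch (assumed design, not the AVT project's implementation) of two
# safeguards for the development phase: respecting objections before training,
# and pseudonymising pupil identifiers with a keyed hash (HMAC).
import hmac
import hashlib

# Secret key held by the data controller. Because the key allows re-linking,
# this is pseudonymisation, not anonymisation.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(pupil_id: str) -> str:
    """Replace the pupil identifier with a keyed hash before training."""
    return hmac.new(SECRET_KEY, pupil_id.encode("utf-8"), hashlib.sha256).hexdigest()

def prepare_training_data(records: list[dict], objections: set[str]) -> list[dict]:
    """Drop records from pupils who have objected, then pseudonymise the rest."""
    prepared = []
    for record in records:
        if record["pupil_id"] in objections:
            continue  # the pupil (or their parents) has exercised the right to object
        prepared.append({**record, "pupil_id": pseudonymise(record["pupil_id"])})
    return prepared

records = [
    {"pupil_id": "pupil-42", "objective": "fractions", "score": 0.8},
    {"pupil_id": "pupil-7", "objective": "fractions", "score": 0.5},
]
print(prepare_training_data(records, objections={"pupil-7"}))
```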

Like the processing of personal data necessary for compliance with a legal obligation, the processing of personal data necessary for the performance of a task carried out in the public interest also requires a supplementary legal basis.

The AVT project has primarily emphasised Section 13-3e, together with Section 15-10, of the Education Act as the supplementary legal basis for the processing of personal data in the development phase. Section 13-3e of the Education Act regulates the municipalities’ obligation to promote quality development in education. The AVT project argues that this obligation entails making sure learning analytics are as accurate and effective as possible, which makes it necessary to develop the chosen algorithm. This includes the processing of personal data about pupils for post-learning purposes.

The topic was not discussed in detail in the sandbox, because most of the attention was directed at the legal basis for processing in the application phase. The Data Protection Authority finds it challenging to identify a supplementary legal basis for the development phase in the Education Act or associated regulations, but does not rule out that the basis for processing provided by the AVT project may be used. This challenge is not unique to the education sector; it is something most public bodies who wish to contribute to the development of AI tools risk encountering. The issue is also briefly discussed in the final report from the NAV sandbox project, which ran parallel to this project.

The Ministry of Education is currently working on a new education act, which will not be ready until 2023, at the earliest. The education sector, therefore, currently has a unique opportunity to clarify how schools can contribute to the development of educational AI tools.

The right to object

The processing of personal data on the basis of “public interest” means that the data subjects, with certain limitations, have the right to object to the processing (see Article 21 of the GDPR). In the sandbox, we discussed this right in light of the processing performed by the learning analytics system. As previously mentioned, an objection must be based on the data subject’s “particular situation”. As we have limited experience with the use of artificial intelligence in primary and lower secondary education, it can be difficult to envision the kinds of particular situations this may involve.

The right to object to processing by the learning analytics system can have both positive and negative implications. An objection would lead to an assessment by the municipality, which, in turn, may lead to improvements in the system. Also, there is an imbalance of power between the municipality and the pupil, and the right to object could help to level out this imbalance. The resources required to process potential objections may be a drawback, and it is difficult to estimate how many objections there will be. The use of artificial intelligence is innovative and unfamiliar territory for many people. Uncertainty may give rise to concerns among both pupils and parents, which may, in turn, lead to objections against the processing. While objections on these grounds are not necessarily valid, and the controller does not necessarily need to grant them, the objections must of course be processed, and the burden of proof is on the controller.

The number of objections could likely be reduced through transparency (which is also a requirement) about which types of data are being processed, how the data is used, what the pupils’ rights are, and how they may exercise those rights. At the end of the day, the majority of people will likely appreciate the improved adaptation, and the improved learning for the children that follows, provided that the system is developed responsibly.

Summary – legal basis

The Data Protection Authority and the AVT project dedicated two workshops to discussing the legal basis for the processing of personal data by the learning analytics system, and our views differ in terms of the choice of legal basis and whether the supplementary basis is sufficient. The AVT project believes the processing is in response to a “legal obligation” and points to the supplementary basis in education legislation to support this conclusion. The Data Protection Authority’s argument is that “public interest” is the most appropriate basis for the processing, and that the supplementary basis ought to be stronger than it currently is.

In this context, it is important to point out that the partners in the AVT project know considerably more about the education sector, learning analytics and the Education Act than the Data Protection Authority does. The project has been ongoing since 2017, and the partners have worked actively on these assessments and the system itself. The AVT project therefore has a much broader basis for assessment than the Data Protection Authority, and the project has been able to dive much deeper into this subject matter. The Data Protection Authority’s contributions are to be considered guidance and do not constitute a review of the legality of the learning analytics system. At the same time, the AVT project is conscious of the fact that there are some central questions concerning the processing of data in the application of the system with which they do not yet have any experience, and that may entail a reconsideration of their view on the legal basis for processing at a later date.

We would also like to emphasise that the education sector has an opportunity to initiate the establishment of a stronger legal basis than it currently has. The new education act, for example, may include explicit provisions for the use of artificial intelligence and digital learning analytics, with a more clearly defined framework for their use. If so, this would ensure a stronger democratic foundation for the use of such tools and a higher degree of predictability, for pupils, parents, teachers, school owners and others who may be affected.