AI in education: What is allowed, what isn’t?
Artificial intelligence (AI) is becoming a more and more integral part of the everyday life of students. But where is the border between what’s learning support and what’s cheating? And does NMBU’s new set of rules concerning AI provide clear answers?
Journalist: Åsmund Godal Tunheim
Photographer: Håvard R. Magelssen
Translator: Eva Weston Szemes
New guidelines on the use of AI
AI tools like ChatGPT, Grammarly and Keenious can simplify many of the tasks in the everyday life of students. For many, these tools are welcome additions to the toolbox, used actively as sparring partners for understanding theories, cleaning up unclear language, or debugging and explaining R scripts. But many of you have probably had a slightly bad conscience when you’ve asked ChatGPT for help: ‘Am I cheating the system to get better grades? Can I really use AI to make my schoolwork less time-consuming?’
When you also hear stories about people claiming not to have used AI but were still accused of cheating and ended up facing a board, it’s natural to get worried. The arrival of artificial intelligence has also brought a fear of being caught cheating.
There is a lot of uncertainty about which tools are allowed when completing assignments, and until now we’ve only had NMBU’s general guidelines on plagiarism and cheating. These say nothing about where NMBU stands on the use of AI in teaching, learning and assessment. AI is many different things and can be used in many different ways. Without clear guidelines on which types of AI can be used in which situations, all use of AI becomes a shot in the dark: you don’t know whether you’ll hit the target or whether the shot will backfire.
Based on these concerns, NMBU has made new guidelines about using artificial intelligence in education in the last few months. But who made these guidelines, and how strict are they?
The making of the policy
Leading the development of the policy is Niklas Pettersen Mintorovitch, senior advisor at the learning centre at NMBU. He explains that AI became a hot topic at the university in the winter of 2023. Nobody knew exactly how to handle the arrival of these new tools, and a way to regulate them had to be found quickly. “No university had a policy in place, and nobody knew how to handle the situation,” he says.
Niklas says that the School of Economics and Business at NMBU drew up guidelines for the use of AI when writing a master’s thesis in autumn 2023. At the start of 2024, REALTEK adopted some of these, adjusting them slightly to its own subjects. What has been completely lacking until now is a set of rules for the university as a whole.
The work on these guidelines didn’t start until the spring of this year. A strategy group was formed, consisting of deans, vice deans, directors, subject specialists, one student representative, and Niklas. The group agreed that the most important thing was to develop guidelines on the assessment of students’ learning, and Niklas was tasked with writing the first draft. It was ready in June, and the strategy group met to discuss it and give feedback.
The student representative could not attend that meeting, however, so the students were in practice not involved in the process before the second draft was finished in August. According to AU member Marthe Sponberg, the new student representative in the strategy group, this level of student involvement was not good enough, even though several members of the group considered it sufficient.
But Marthe stood her ground. She also realised that she, with her relatively limited perspective, could not represent the students well enough on her own: “I felt I had to put my foot down in the meeting and say that this has to be discussed in the Student Parliament”. Ingelin Mortensen, another AU member, adds: “If you hadn’t, it wouldn’t have been discussed in student board meetings and the Student Parliament, with all the opinions there”.
The voice of the students is heard
The second draft was discussed in the Student Parliament in September, after having been a topic for the student boards at the faculties. There was a lot of input, which was summarised by AU and sent to Niklas. The Student Parliament voiced concern about the ambiguity of the term “writing support”, warning that imprecise wording could lead to false accusations of cheating. They also said that there should be room for course coordinators to make adjustments for their own courses. Finally, it was pointed out that it is very important to involve students heavily in a topic that is so incredibly relevant to our everyday student life.
The feedback from the students was taken seriously. “We need to make it clear that even though the process was bad in the beginning and in the first draft, the inclusion has been very good now,” Ingelin praises. She is very happy that the students have been taken seriously and included. Both Marthe and Ingelin are very happy with the final draft, which has taken the feedback from the Student Parliament into consideration.
AI is not prohibited!
Niklas confirms that the feedback from the students has greatly influenced the third and fourth drafts. He has had a clear intention during the whole process to make it clear that artificial intelligence should not be prohibited, but should rather be adopted as a technology with a huge potential for streamlining and better learning in higher education. He thinks it’s crucial for the students to learn to use AI in a critical and responsible way.
Niklas emphasises that the guidelines are not meant to stop the use of AI. Rather, the intention is to give the students the freedom to use the tools, as long as they are honest about how they use them. He points out that the students need to understand what’s good practice when it comes to use of AI, and how to avoid becoming dependent on the technology. “It’s not about outlawing AI, it's about making sure that the students use it to supplement their own learning, not to replace it,” he emphasises.
How the guidelines will be communicated
An important question about the new guidelines is how they will be communicated to the students. Niklas says that the plan is to convey them through lectures, especially in big, mandatory courses for new students, to make sure that every student receives information about the policy. “There needs to be good online resources in place, and lectures to attend if you want to,” he adds.
Marthe points out that it will be important to use digital platforms like Canvas to inform students of the new guidelines. “We need to make sure that all students, especially the new ones, receive this information early,” she says. It’s essential that the guidelines are not just available, but that students know where to find them.
The guidelines in practice
So, what do these guidelines mean for us, the students of NMBU? The course coordinator decides what is and isn’t allowed, but the main rule is this: you are allowed to use AI, but you need to be upfront about it. If you use AI to improve your language, you need to make the reader aware. If you use it as part of your research method, you need to explain how and why. But most importantly, AI should not replace critical thinking – we need to check the quality of the information we get from AI.
Hopefully the guidelines will reduce students’ stress somewhat. Maybe you’ll be spared the pang of bad conscience when you use AI to find literature? And maybe you won’t have to face a disciplinary board for the wrong reasons? One thing is for sure: AI will be increasingly integrated into our daily lives at home, in our studies and at work, and it will become more and more important to learn to use it effectively and critically.
Niklas, Marthe and Ingelin all agree that artificial intelligence is a tool NMBU has to teach us to use, so that we are ready for an increasingly technological future. Then we’ll just have to wait and see whether there are any lecturers left in five years.
The AI service ChatGPT was used as an aid in the writing of the Norwegian version of this case.
A link to the guidelines: