
UG basic rules for the use of AI in teaching

Artificial Intelligence (AI) tools offer students and lecturers great opportunities to work faster and in different ways. However, this also raises questions about when AI can and cannot be used. The UG has therefore introduced a set of ten basic rules to integrate the responsible and competent use of AI, and more specifically of Generative AI (GenAI), into teaching. The UG wants the use of AI to tie in with academic ways of working and academic core principles.

Additional rules may apply at the level of your faculty, degree programme, or course unit, because the impact of AI tools can vary strongly per discipline.

GenAI

GenAI refers to AI models that are able to generate new and unique content, such as text, images, sound or other forms of output. Examples of AI tools with GenAI functionalities include ChatGPT, Google Bard, and DALL-E. Extensive general information about types of AI tools, tips on how to use them, and their possible impact on teaching can be found on RUG EDU Support.

Ten UG basic rules for the use of AI in teaching

Please note: additional rules may apply at the level of individual faculties, degree programmes, and course units.

  1. AI tools may be used as aids to support general functionalities (study tool/assistant/input for own work). General functionalities include, for example, brainstorming, gaining inspiration, summarizing general information, fine-tuning one’s own work (language correction, language assistant), translation, and use as an independent study aid or sparring partner (generating mock exam questions and answers). Please note: AI tools are not reliable academic sources and their output must always be processed critically in accordance with academic ways of working. Students are always responsible for their own submitted work.
  2. When GenAI functionalities are used (to create new content or to replace own work), this must always be mentioned/referred to. The most important difference from the functionalities listed under rule 1 is that GenAI then serves to partly replace or outsource one’s own working and learning process. If a student uses GenAI in any way other than those described under rule 1, they must explicitly mention this. This also enables the lecturer to give more targeted feedback on acquiring academic ways of working and on learning to use tools responsibly. Any such use must be mentioned/referred to in a recognizable way, for example under methods/sources/references (an illustrative example is given below the list of rules). The following should at least be stated:
    1. name and version of the tool
    2. for what purpose and how it was used

    Please note: individual degree programmes and course units may set further requirements for the form and content of references, for example a more detailed explanation of the use, examples of the prompts entered, reflection on reliability and bias, and verification of information.

  3. Any rules regarding the use of GenAI functionalities that go beyond rules 1 and 2 above must be communicated before the start of the course unit in question. When in doubt, ask the lecturer. Using tools for functionalities other than those listed under rule 1 may be permitted, partly permitted, or not permitted. This may vary per degree programme and per course unit, because it depends on the learning outcomes. Students will be notified about this in good time (before the start of the course unit in question), in any case via the syllabus/Brightspace.
  4. Using AI tools is regarded as cheating if:
    1. the work submitted cannot sufficiently be considered to be the student’s own work, meaning that their knowledge, understanding, and skills as described in the learning outcomes cannot be assessed and evaluated. Outsourcing work to a tool (or to someone else) to this extent is not permitted, because it affects the heart of academic ways of working. Students must always be able to take responsibility for verifying and analysing information, and for their own academic substantiation. Lecturers teach students to understand this connection.

      Or
    2. the student has not correctly mentioned/referred to the use. Submitting a literal copy of GenAI output (or any other output) as own work is never permitted. The definitions of cheating/plagiarism as set out in the Teaching and Examination Regulations of the relevant degree programme apply. The Board of Examiners of the degree programme must determine for each individual case whether it constitutes cheating.
  5. Use the positive functionalities of AI tools, but do so consciously and critically. AI tools offer many opportunities. However, using them also involves risks with regard to the reliability of output (factual errors, biases, non-existent references) and the processing of data (infringement of copyright and privacy; the security and storage of personal, company, and research data). You should therefore not enter sensitive information or data, and you must follow the GDPR. Users are personally responsible for using AI tools consciously, critically, and responsibly.
  6. Processing agreements/UG licences are a precondition for requiring the use of tools in teaching. If there is no processing agreement between the UG and the owner of the tool, and/or no UG licence, students may not be required to create a personal account or to purchase a tool (or a version with more functionalities). In such cases, a similar alternative that is free of charge must be offered. This also applies to open-source tools. If use in teaching is required, proper processing and storage of personal information and data must be in place, and students must have equal access to tools.
  7. Scores of AI detection tools do not constitute evidence for cheating. Cheating scores that are generated by AI detection tools may not be used to prove that a student has cheated, because these scores are unreliable (high risk of incorrect scores and lack of transparency about how the tools work). Examiners are responsible for checking the authenticity of submitted work, assessing the work, and reporting any suspicions of cheating/plagiarism to the Board of Examiners of the degree programme in question. Boards of Examiners are responsible for determining whether cheating/plagiarism has indeed been committed and for giving students suspected of cheating/plagiarism the opportunity to put their case.
  8. Lecturers bear final responsibility for the assessment of students and the content of the teaching. Lecturers are encouraged to use aids in their teaching and assessment, such as automated marking of multiple-choice exams in accordance with predefined answers. However, automated decision-making/marking based on a GenAI model, without any human checks of the assessment process, is not permitted. The relevant examiner legally bears final responsibility for administering examinations and determining their results.
  9. Any adjustments to modes of assessment in order to ensure the validity of assessment must be communicated in good time. The lecturer may conduct an additional oral check when cheating is suspected. In order to ensure the validity of assessment, it may be necessary to adjust a mode of assessment, for example from a written exam to an oral exam. This is only permitted if the learning outcomes can still be determined. Students must be notified in good time (via Ocasys), and adjustments to modes of assessment during an academic year are in principle not permitted, except in cases of force majeure. In addition, lecturers may conduct additional checks in the form of oral tests when cheating is suspected. These tests do not constitute additional examinations. Students must, however, be notified of this possibility in advance.
  10. Interim checks are conducted for theses/final projects. The thesis or final project is an important part of the degree programme, in which many of the learning outcomes are assessed. Interim checks (for example in the form of a meeting or an intermediate product) are therefore conducted to assess the authenticity of the work and the creation process. These checks are often already conducted. Interim checks may count towards the final assessment, although this is not a requirement.
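
For illustration only: a minimal reference under rule 2 could read, for example, “In writing this essay, ChatGPT (GPT-4, March 2024 version) was used to brainstorm possible research questions and to correct the language of the final draft.” The wording, tool, and version given here are examples, not a prescribed format; always check whether your degree programme or course unit sets further requirements.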

UG policy on AI in teaching

The basic rules are part of a broader UG policy on AI in teaching. This policy is provisional and will be updated if and when necessary. The UG aims to train its students, within the context of their degree programmes, to become competent and responsible users of AI tools, in accordance with academic ways of working, attitudes, and core principles. To this end, the UG uses a set of basic rules and has introduced action lines to support students and lecturers.

Degree programmes will have to review their teaching, assessment, and learning outcomes and decide to what extent they are affected by AI tools. There is no one-size-fits-all approach for this, as the impact varies per degree programme and per discipline. Faculties can therefore supplement the UG policy with their own rules and activities to suit their own context. This will enable us to explore the promising potential of AI tools together within joint frameworks. Read the full UG policy on AI in teaching [PDF].
