
Generative AI – how will it affect you as a teacher?

When the chatbot ChatGPT was launched in November 2022, generative AI technology became available to the public. This started a paradigm shift that will affect us as a school, government authority, and research center for a long time. But what exactly is generative AI technology? What opportunities and challenges come with this new technology, and how can we at KTH benefit from this paradigm shift?

The language models GPT-3 and GPT-4

ChatGPT is based on the GPT-3 model, developed and trained by the company OpenAI. The GPT-3 model has 175 billion parameters and was trained on data up to 2020. The next-generation model, GPT-4, is available to paying users; it is reported to be even larger and can analyze both text and images.

What is generative AI technology?

Today, various generative AI tools are available on the internet that can be used to create material according to the user's wishes, such as text, images, video, or code. Examples include ChatGPT, Whisper, GitHub Copilot, and DALL·E 2. But what exactly is generative AI technology?

Generative AI technologies are based on so-called probability-based machine learning models. These systems thus have no actual knowledge; instead, they predict, for example, what the next piece of text should contain, based on the prompts they receive and statistical modeling of large data sets. The developers are open about the fact that their training data contains considerable bias, since a large share of the texts were written in the Western world and in English. This bias also carries over into the material that is created.
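The idea of predicting the next piece of text from statistics can be sketched with a toy bigram model. This is only an illustration of the principle; real language models use billions of parameters, neural networks, and vastly more data. The corpus and function names here are invented for the example.

```python
# Toy illustration of "probability-based" next-word prediction:
# a bigram model that counts which word follows which in a tiny
# corpus, then predicts the statistically most common follower.
from collections import Counter, defaultdict

corpus = (
    "the model predicts the next word based on "
    "the model statistics the model has no knowledge"
).split()

# Count, for each word, how often each other word follows it.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict(word):
    """Return the most frequent follower of `word`, or None."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict("the"))  # the word that most often follows "the"
```

The model "knows" nothing about language; it only reproduces the statistics of its training data, which is also why biases in that data show up in the output.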

Not the first time that new technology has stirred the pot

Technical achievements in information retrieval and computation are nothing new; in recent decades, we have developed systems that are used systematically in higher education. Introducing new technologies has always caused discussion and concern about how learning and the assessment of various skills will be affected. Earlier examples of such technology include calculators and Wikipedia. More recent discussions have revolved around tools such as Google Scholar for finding relevant research publications or Grammarly as support in scientific writing. Despite this concern, using Grammarly is now encouraged in KTH's language courses.

Generative AI technology during examinations

Students have always had the opportunity to cheat (possibly for payment) with help from third parties who are not taking the course. What is new is that this help does not have to come from a person but can be generated with AI technology, which makes it more accessible and cheaper. An examination based solely on assessing a final product (such as an essay or program code) is vulnerable to third-party assistance. The connection between process and product is broken if all or part of the product has been produced by an outside party, whether a person or an AI tool. The product is then not the result of the student's learning process, and what is assessed is unrelated to the student's learning. See below for examples of how to reduce an exam's vulnerability to third-party assistance.

Is the legislation keeping up?

Discussion is ongoing about how generative AI technology relates to the GDPR and to copyright law, and about whether ChatGPT and other generative AI tools may conflict with this type of legislation. At the time of writing (April 2023), the EU is working on AI legislation, including requirements for increased transparency about how data is used.


New opportunities

Generative AI technologies, and the app OpenUni in particular, can greatly help students. One example is a tool for note-taking during lessons or for summarizing a recorded lecture. Such a tool can help all students, but especially students with, for example, dyslexia or concentration difficulties. Tools of this type could replace some of the special adaptations currently required for students entitled to Funka support.
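As a rough sketch of what a summarizing aid could look like under the hood, the snippet below implements a minimal extractive summarizer that scores sentences by word frequency and keeps the top-scoring ones. A real tool would use a generative model for abstractive summaries; this standard-library version, with invented function names, only illustrates the idea.

```python
# Minimal extractive summarizer: keep the n sentences whose words
# occur most frequently in the text, in their original order.
import re
from collections import Counter

def summarize(text, n_sentences=2):
    """Return the n highest-scoring sentences in original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))
    # Score each sentence by the total corpus frequency of its words.
    scored = [
        (sum(freq[w] for w in re.findall(r"\w+", s.lower())), i, s)
        for i, s in enumerate(sentences)
    ]
    top = sorted(scored, reverse=True)[:n_sentences]
    return " ".join(s for _, _, s in sorted(top, key=lambda t: t[1]))
```

Applied to a lecture transcript, this would keep the sentences that repeat the lecture's most common terms, which is a crude but transparent proxy for "the main points".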

We asked ChatGPT about how universities should act regarding generative AI and received the following answer in April 2023:

"Technical universities are likely to encourage students to use AI in their studies, as it is a valuable tool for solving complex problems and developing new technologies. Students who use AI in their studies may be required to follow certain guidelines or best practices to ensure that their work is ethical, transparent, and reproducible. In some cases, universities may offer courses or workshops to help students learn how to use AI effectively and responsibly. Ultimately, the goal is to help students develop the skills and knowledge they need to succeed in their chosen field, while also ensuring that they are using AI in a responsible and ethical manner."

Approach to generative AI technology and examination

To begin with, we would like to refer you to the ethical guidelines in the course syllabuses: "On examination, each student must honestly report the help received and sources used." Remind your students of this and have an open discussion with them about these issues.

Proctored written classroom exams are not the only way to prevent cheating and deception with AI tools during an examination. KTH advocates a diversity of examination forms, and many of them are no more vulnerable to cheating with AI tools than written classroom exams are. Some examples are oral examinations, seminar-form examinations with problem-solving at the blackboard, and supervised digital examinations in a computer room using Safe Exam Browser (SEB). All of these examination forms can advantageously be used for continuous assessment during the course rather than as a single comprehensive final examination.

Beyond supervised examination, some forms of examination prevent or hinder cheating with AI tools. Oral examinations can be used in many ways; one is to check that the student can explain a previously submitted solution or the process behind it. Step-by-step assignments, where multiple stages of the solution or development process must be handed in, also make cheating with AI tools harder. When formulating home assignments, it can help to require references, or to pose questions whose answers must build on a specific local context that AI tools cannot easily reproduce.

Three areas have been identified where examination may be challenging or require extra thought with respect to generative AI: degree projects, programming assignments, and unsupervised distance exams.

Example 1: Degree projects

Continuous supervision now becomes even more important than before to ensure development toward the learning goals. Generative AI tools such as ChatGPT can be used in the writing process, but it is important that this is stated in the report and that it is clear how these tools contributed to the final result.

Example 2: Programming assignments

In all courses, students who use AI tools to generate code for a programming assignment must report this in the same way as someone who received help from a tutor or a source on the web (according to the ethical approach in the syllabus).

In an introductory programming course, it may be forbidden to use AI tools to generate code for assessed assignments, but allowed to use them to explain error messages or create test cases. In larger projects, several steps of the development process should be assessed (such as the specification, prototypes at different stages, and the final product).

Example 3: Unsupervised distance exams

This type of examination can be vulnerable to generative AI technology if it lacks a follow-up supervised knowledge check. You can design the examination form or the assignments to make unethical use of AI tools harder, for example by requiring students to use information only available in a course-specific context, or by having them explain their solutions afterward.

Closing words

The purpose of learning is to equip oneself for one's future. Worry about cheating and deception during examinations can sometimes come across as accusing our younger colleagues (the students) of not taking the process and their own development seriously. At KTH, teachers and students must work together to create a culture that puts learning first, where cheating and deception are not perceived as an option.

In-depth reading resources

Becker et al. (2023). Programming Is Hard - Or at Least It Used to Be: Educational Opportunities and Challenges of AI Code Generation. Proceedings of SIGCSE 2023, ACM.

KTH's information on disciplinary matters

The PriU group on assessment and examination methods. (2023). Promoting learning and preventing cheating. Report published 2023-03-31.

Rudolph, J., Tan, S., & Tan, S. (2023). ChatGPT: Bullshit spewer or the end of traditional assessments in higher education? Journal of Applied Learning and Teaching, 6(1).