
How are students already using these tools?

In this article, Dr Brenda Williams and Dr Jayne Pearson explore postgraduate students’ use and perceptions of GenAI tools.
In the previous step, you considered some of the perceived and actual limitations of generative AI tools. In this step, you will explore how a group of postgraduate students report using generative AI, the advantages and limitations they identify, and the implications for your own context and teaching practice.
Decorative image: created by Martina Stiftinger as part of the Visualising AI project.
Conversations around generative AI abound in higher education establishments, and King’s is no different. We are navigating this rapidly changing arena and, of course, one arm of that involves open conversations with our students to determine how they are currently using AI and what impact it has on their learning experience.
We held structured discussions with postgraduate taught (PGT) students from the Institute of Psychiatry, Psychology and Neuroscience, encouraging them to express their thoughts, good, bad or indifferent, on generative AI. From these conversations, four distinct themes emerged:
  1. the significance of generative AI for learning
  2. integrity and ethics
  3. support needed from staff
  4. research and real-world skills.
Every student we spoke to had used generative AI, and a range of tools was mentioned (ChatGPT, Google Bard, GPT-author, Microsoft Bing, Google Imagen AI and, of course, Grammarly). So, as one student said,
“the genie is truly out of the bottle”.
AI is here to stay, and the only thing to do is embrace it: encouraging the positives while also discussing and highlighting the negatives.

Theme 1: the significance of generative AI for learning

Students emphasised the positive impact generative AI has on their learning process: helping them understand the literature, break down complex topics and explain terms. They felt AI provided insights and generated ideas that they might not have considered on their own, and that it promoted deeper approaches to learning by reducing the time needed to search for and understand the basics.
But they recognised that AI did get things wrong. They did not feel it was effective at critical thinking, and so did not consider that it could generate a first-class essay at postgraduate level. In fact, they all said that they would not use it to write an assignment for them, but they would use it to generate a structure or to assist their academic discourse. When quizzed on whether this was ‘cheating’, they believed that these lines were already blurred, as tools like Grammarly, Spellcheck and Microsoft Editor have been in use for a considerable time. They therefore wondered why using generative AI in this way would be frowned upon. This led to a more detailed discussion around ethics.

Theme 2: integrity and ethics

From a societal point of view, students thought that developing a view of the world and of academia from generative AI was dangerous, as it is trained on big data from a largely western point of view and, as a large language model, has inherent biases. Interestingly, despite being from the younger generation themselves, some considered it ageist, designed without consideration for older generations who may not be so ‘tech savvy’.
When asked when and why students might resort to using generative AI inappropriately, they pointed to situations where a student was under pressure, running out of time to submit an assignment, or where English was not their first language and they struggled to express themselves. All of these scenarios are ones that staff can help guard against: by encouraging students to seek support where needed, providing effective academic skills training, making sure assessment deadlines are well spaced, not over-assessing, and providing opportunities for students to improve their language skills through formative writing and speaking activities.
They thought that the punishment for inappropriate use of AI had to be high enough to make it not worth the effort, although they did not suggest what that punishment might be. However, they wanted an ‘honour code’ to provide clear expectations for every assessment, enabling students to determine when and for what purposes AI could be used.

Theme 3: guidance from staff

There was a consensus that specific guidance was required on how to leverage generative AI for learning, with a feeling that its positive aspects should be highlighted. It was recognised that staff may also use AI in developing learning materials, creating assessments and providing feedback, but students wanted this to be discussed openly and their input sought.
Regarding inclusivity, it was considered that non-native speakers and students who were not so tech savvy may find it more difficult to use prompts effectively and that training on the use of AI should be provided to level the playing field. They were unanimous in expressing that no student should be expected/required to use paid versions of AI tools, a sentiment fully supported by King’s. Students are just as nervous as staff about the rapidity with which AI is evolving and question whether they can adapt quickly enough.

Theme 4: research and real-world skills

Students said:
“AI enables us to do things differently.”
“It will not make us dumber just because more information is now available at a click of a button.”
They emphasised that there is always change and highlighted how, with the advent of the internet, students effectively stopped poring over books in the library. They stated that:
“the way we conduct research is just different now.”
They felt that university should hone real-world skills, including how to engage with generative AI effectively. As one student said:
“it’s not about finding information anymore but about processing and integrating information.”

In their view, educational tasks should be authentic and favour interpretation over memorisation, requiring creativity and criticality.

Now that you have completed this step, you have heard how PGT students are currently using AI and what they consider are its advantages and limitations. In the next step, you will learn about what leaders in higher education think about the potential of AI.

Additional information

If you would like to explore this topic further, here are some additional resources.

Compton, M. (2023, May 24). Evolving understanding – how students are using it: advice and recommendations. Student perspectives on generative AI @ UCL [8-minute video]. YouTube.

Compton, M. (2023, May 24). Academic integrity and evolving AI literacies. Student perspectives on generative AI @ UCL [8-minute video]. YouTube.

Join the conversation

  • Do you think generative AI can have a positive impact on learning?
  • Are these views shared in the student communities you are familiar with?
  • Are there any particular issues arising in your context?
© King’s College London
This article is from the free online course Generative AI in Higher Education.
