Embracing ChatGPT and other generative AI tools in higher education: The importance of fostering trust and responsible use in teaching and learning

July 10, 2023

In his piece ‘Embracing ChatGPT and other generative AI tools in higher education: The importance of fostering trust and responsible use in teaching and learning’ (The Head Foundation Digest, June 2023), Mr Jonathan Sim (NUS Philosophy) acknowledges that generative artificial intelligence (AI) has brought significant disruption to education, particularly to teaching and examination. He argues, however, that AI also presents valuable opportunities to enhance the educational experience of teachers and students within institutions of learning.

Mr Sim first deals with the issue of distrust in AI. While much discourse has focused on AI as an enabler of cheating, he questions the basis on which we treat AI use as cheating. He compares consulting AI tools with similar consultations with colleagues, friends and professionals (such as editors), and asks whether the identity of the partner in such consultations, AI or human, should affect our judgement of whether they constitute cheating.

Mr Sim then argues that not all students use AI tools with the express intention of cheating. AI tools can serve as instruments for deeper learning, much as students learn with the aid of search engines and consultations with tutors. He also flags the dangers of depending excessively on AI detection tools, since they can produce false positives. He notes that his own original work, for instance, had been flagged by such tools as containing “parts written by an AI”. Because accusations of cheating based on these tools can be difficult to plausibly deny, Mr Sim argues that excessive use of AI detection tools can be harmful to trust relationships.

Beyond the question of trust, Mr Sim also advocates for the potential of generative AI to be a positive force in the classroom. Given the advance of AI (and the consequent difficulty of “AI-proofing” assignments), he argues that assessments should move away from evaluating student performance and instead become more formative. This would turn assignments into opportunities for students to engage deeply with particular topics. Self-directed research projects are one example, since students must delve deeply into their proposed topics in order to formulate good research questions.

He also encourages the use of AI tools in the classroom as an interactive learning partner. Students can ask generative AI questions to better understand particular topics, or work on crafting prompts that elicit the most effective responses for their needs. In both cases, students hone their questioning and evaluative skills as they interact with the AI.

Read the article here: https://digest.headfoundation.org/2023/06/15/embracing-chatgpt-and-other-generative-ai-tools-in-higher-education-the-importance-of-fostering-trust-and-responsible-use-in-teaching-and-learning