29/01/2024

Higher education should embrace the positives and navigate the pitfalls of generative AI.

 

Generative Artificial Intelligence (AI) is transforming the academic landscape. Learning, Development and Assessment Strategist Cheryl Foxcroft, Emeritus Professor of Psychology, unpacks how Nelson Mandela University is navigating uncharted waters. 

 

AI is rapidly replacing “COVID-19” as the planet’s biggest buzzword.

According to IBM, AI leverages computers and machines to mimic the problem-solving and decision-making capabilities of the human mind. To do so, it employs computational systems, data and data management, and advanced AI algorithms (code).

What is generative AI?

McKinsey & Company describes generative AI as algorithms (e.g., ChatGPT) that can be used to create new content, including audio, code, images, text, simulations, and videos.

Can AI benefit higher education?

Yes, it can – for both staff and students.

Students

  • AI personalises learning by tracking progress and customising learning pathways (for example, AI-powered learning management tools can nudge students to track their progress, work on weaknesses or work towards a distinction)
  • Tutor Bots provide quick, 24/7 answers to common student questions – a valuable addition to human tutor and lecturer input
  • AI technology offers engaging learning experiences that capture student interest and provide translations of texts, enhancing learning and success.

Lecturers

  • Greater ease of analysing student learning patterns in a module, from which adjustments can be made to pedagogy, content and curricula, learning and teaching activities, and learning resources
  • AI-powered learning technologies can answer routine, repetitive questions, allowing lecturers more time to focus on lecture planning and designing innovative assessment tasks, for example 
  • Task automation: AI takes care of mark capturing and analysis and admin such as monitoring class attendance, and quickly identifies struggling students, those needing more support, and those who are excelling and can be stretched further.

UNESCO states that AI in education must adhere to a “human-centred approach”.

Our University supports human-centred digital innovation. Human outcomes and liberating human potential, along with ethical considerations, are thus prioritised when designing and delivering digital innovations – and when making technology and AI-based choices. 

 

This includes monitoring the outcomes of using a specific AI tool to ensure that its use does not violate the University’s values and public good ethos.

When implementing human-centred AI-based innovation, we:

  • Emphasise the agency of staff and students to personalise their learning and work experiences, and career paths
  • Responsibly leverage intelligent technologies to nudge and enhance active engagement in learning, research and writing development, individual and team performance and societal impact 
  • Encourage ongoing personal development, especially in terms of sharpening existing and developing future work-related skills and responsible digital citizenship. 

ChatGPT – and PedAIgogy

Applications of AI in academic disciplines and professions are growing at a rapid rate. Universities thus need to include these applications in curricula to prepare students for the world of work.

ChatGPT, the best-known generative AI tool, poses both a threat and an opportunity for higher education. The trick is to work with innovation, rather than against it.  

The threat is that dishonest behaviour will grow as students use ChatGPT to write assignments and pass these off as their own work. However, the more academics engage with ChatGPT, the more they discover that, rather than trying to ban its use (which is close to impossible), they can find a range of opportunities to transform their pedagogy, as well as new topics to research.

In the process, a new pedagogy has emerged: PedAIgogy, which fosters a new era of knowledge and learning where AI changes everything (Downes, 2023).

Examples of PedAIgogical transformation sparked at the University through the arrival of ChatGPT include:

  • ChatGPT has prompted serious debates and reflections in faculties on the nature of questions posed in assessment tasks. The questions currently posed mostly focus on ‘understanding and applying’ existing knowledge – which is susceptible to student searches on the internet or ChatGPT. In higher education, however, we should be developing and assessing critical thinking and problem-solving in assessment tasks. Questions that require analysing, evaluating, comparing, designing or transforming existing knowledge, especially those originating from real-world problems, build the competencies needed from higher education graduates and in the world of work. Such answers cannot be Googled or generated via ChatGPT.
  • Some lecturers are thinking smartly, utilising ChatGPT to produce more innovative assessment tasks. For example, they give students a ChatGPT-generated assignment and ask them to spot correct aspects, distorted information, flaws, and missing facts and references, and then to rewrite it.
  • ChatGPT helps with language development. Students trying to master writing in English can study ChatGPT essays, fostering a sense of mastery and understanding of structure, key phrases, introductions and conclusions.  
  • Teachable moments. ChatGPT-generated essays provided by lecturers can spark conversations about the data sources that new technologies draw on and the ethical issues they raise. This creates platforms for academic integrity and digital ethics discussions within a module and fosters collaboration in verifying AI-generated information, preparing students for the ongoing challenges and dilemmas of human-machine interaction.

Plagiarism versus enabling generative AI use

Nelson Mandela University is balancing a reflective approach to exploring the appropriate application of generative AI tools in the academic project with an active review of its learning and teaching, assessment and plagiarism policies, so that these cover the responsible use of AI tools and the consequences of improper use.

How can the responsible use of generative AI tools be explored and enabled by the University?

  • Developing a position statement that can be included in all module guides. This provides students and lecturers with clear guidance about what is permissible. 
  • Developing resources and training opportunities for students and academics in the form of modules and short courses, webpages with online resources, webinars, communities of practice, and so on. This sends the message that everyone needs to learn about the use and misuse of AI tools.
  • Dialoguing with students about their views on using generative AI tools, which ones they use, what they can and cannot use them for in assessment tasks, and what the consequences of misuse are. This helps students to be responsible and honest when it comes to the work they produce. 
  • Detection software and training: numerous tools can detect whether students have used AI to produce academic papers, but most provide conflicting results, making it challenging to rule on whether a paper includes AI-generated information and other aspects of plagiarism. The expertise of our academics needs to be relied on, as they can pick up trends and patterns regarding sources, content and layout. However, more dialogue and training are needed on the criteria used to detect plagiarism and the percentage of similarity that is unacceptable.

This article was published in the latest edition of Thetha, our alumni and friends’ magazine.

Contact information
Mrs Debbie Derry
Deputy Director: Communication
Tel: 041 504 3057
debbie.derry@mandela.ac.za