The University of Cape Town (UCT) has made a decisive move in the global education debate on artificial intelligence (AI) by officially adopting a university-wide framework that sets out how AI technologies should be integrated into teaching, learning and assessment.
The move comes with a major shift: from October 1, UCT will no longer use detection tools such as Turnitin’s AI Score to identify machine-generated student work, a decision that has sparked both praise and questions from academic experts.
The UCT AI in Education Framework, endorsed by the Senate Teaching and Learning Committee in June, is grounded in promoting AI literacy among staff and students, ensuring integrity in assessment practices and investing in innovation for future-ready curricula. At the heart of the framework is the decision to discontinue the use of unreliable AI detection tools, which the university says risk undermining student trust and fairness.
While UCT’s framework signals a proactive shift, some experts argue it has come too late. Education specialist Prof Jonathan Jansen acknowledged the importance of the move but criticised the higher education sector for being slow to adapt.
“What UCT is doing now should have been done five years ago,” he said. “We’ve known for a long time that AI poses a threat to assessment integrity. South African universities are very slow to adapt.”
He added that most institutions still lack policy guidance on AI, leaving students and lecturers in the dark. “You can’t discipline a student unless there’s a policy that pre-exists the practice. We’re only waking up now, and I’m afraid the horse has already bolted.”
AI and tech analyst Arthur Goldstuck shared similar views, noting that other South African universities such as Pretoria and Wits have already begun developing AI-related guidelines. “UCT is certainly not ahead of the curve; they’re playing catch-up,” said Goldstuck, who is also the author of The Hitchhiker’s Guide to AI.
He described AI in education as “a blessing and a curse”, saying the same tools that make research easier also tempt students to misuse them. “Many students will get AI to write the entire paper. Detection tools should only be used as a basic screening mechanism; they’re too unreliable to be conclusive. The real danger is penalising students who’ve done nothing wrong.”
Sukaina Walji, director of UCT’s Centre for Innovation in Learning and Teaching (CILT), said the framework is the result of more than a year of consultation and development.
“The take-up of AI tools by staff and students has grown steadily since late 2022,” said Walji. “We felt it was time to pull together emerging practices and chart a trajectory for how UCT intends to respond to AI in teaching and learning.”
Walji said the intention was not simply to react to AI disruption, but to build a “compass and road map” that reflects the university’s values. “It gives direction, showing that UCT intends to shape its future ecosystem and provides practical steps like training and short courses to support staff and students.”
Key principles include a human-centred approach to education, fostering critical AI literacy, ensuring equity in access to AI tools, and balancing innovation with ethical responsibility. Walji noted that UCT students currently have free access to certain AI platforms through the university’s agreements with Microsoft and Google.
In addition, UCT has developed online guides, interactive workshops, and a six-week short course called Designing with AI, which was launched in 2024 for educators.
UCT’s shift away from AI detection software was driven by mounting evidence that such tools are unreliable and not fit for purpose in a rapidly evolving technological landscape.
“AI detectors are simply not reliable. There are no magic solutions,” said Walji. “We’re focusing instead on assessing the process of learning, not just the product, and developing assessment strategies that are AI-resilient.”
To that end, the university is promoting a variety of assessment methods, including oral exams, observed group work, and assignments in which students must disclose and critically reflect on their use of AI. “Our lecturers are reviewing how assessments are designed, moving towards process-orientated evaluation,” she said.
Despite the disruptive nature of AI in education, Walji stressed that UCT is focused on helping staff and students develop the skills and ethical awareness they need to thrive. AI literacy is already being integrated into orientation programmes and self-paced modules and may soon become mandatory for all students.
Walji said UCT’s aim is to educate students who can use AI tools meaningfully and ethically in society and the workplace.
“Universities need to both update curricula and incorporate AI into programmes of study to equip students for the future world, and engage deeply with how AI is influencing student learning and cognition in positive and negative ways.”
For students unsure or fearful about the role of AI, Walji had this advice: “Be both curious and critical about AI and use it to enhance your learning rather than take shortcuts. Be sure to follow guidelines for assessments from your lecturers and university about permitted use of AI in your academic work, and act with integrity.”