Recent articles on AI in higher education highlight its potential to deepen institutional inequality, reveal flaws in current teaching practices, and expose the fragility of the university’s business model. However, rather than being the primary agent of change, AI acts as a diagnostic tool, revealing pre-existing vulnerabilities in pedagogy, financial models, and public trust. The focus should shift from developing AI strategies to addressing these vulnerabilities and rebuilding a robust, resilient university.
A response to recent University World News articles on AI:
- AI won’t save higher education. It will further divide it – Richard Watermeyer, Donna Lanclos and Lawrie Phipps
- What our body’s response reveals about AI in the classroom – Victor Lim Fei, Yee Jia’en and Jerrold Quek
- The end of AI and the future of higher education – James Yoonil Auh
- Education stands to benefit most from AI in African languages – Scovian Lillian
One must begin by acknowledging the remarkable quality of the recent conversation on artificial intelligence carried in these articles. In a series of incisive and timely interventions, we have been offered a sophisticated map of the new territory. Richard Watermeyer and his colleagues have issued a stark warning against a “technological saviourism” that threatens to deepen institutional inequality. Victor Lim Fei’s team has provided a humane, and humanly measured, account of the affective gap between a teacher and a chatbot. James Yoonil Auh has, with philosophical force, declared the “end of AI” as a discrete tool and argued for its new status as an “atmosphere of cognition.” And Scovian Lillian’s interview with Dr. Almaz Yohannis Mbathi has highlighted the crucial, ground-level work of building more inclusive AI for a continent.
Each of these analyses is vital. Each is, in its own way, correct. Yet, when read together, they reveal a consistent, shared, and perhaps unconscious, orientation. They all proceed from the foundational assumption that AI is the primary agent of change, a new and powerful force that is now acting upon the university. My respectful submission is that this is a profound misreading of the situation. AI is not the agent of change; it is the agent of revelation.
It is a merciless, and mercilessly honest, diagnostic instrument. Its most significant, immediate impact is not what it will do to the university, but what it is revealing about what the university, in its quiet, incremental way, has already become. The current crisis is not one of a new technology to be managed, but of a set of pre-existing conditions that can no longer be gracefully ignored.
Consider the stark and necessary warning from Watermeyer, Lanclos, and Phipps. They argue that AI will be adopted as a “false saviour” by financially distressed institutions, thereby creating a two-tiered system. This is an entirely plausible and deeply worrying forecast. But it is not, as they suggest, a story primarily about the predatory nature of “techno-oligarchs.” It is a story about the spectacular, self-inflicted vulnerability of the university’s own business model. The institutions that will lunge for the salvation of AI are those that have, for years, pursued unsustainable growth, embraced what Professor Ramjugernath in these same pages called a “culture of mass production,” and made themselves exquisitely vulnerable to any shock or seductive offer of efficiency. AI did not create this fragility; it is merely the opportunistic infection that has arrived to feast on a body already weakened by its own long-term diet of poor strategic choices. The mirror of AI does not show us a coming dystopia; it reflects the precarious financial house of cards we have already built.
Turn, then, to the fascinating and humane study by Victor Lim Fei and his colleagues. Their finding that a chatbot can help produce a brainstorming outline comparable in quality to one guided by a human teacher, differing only in the warmth of the emotional connection, is presented as a nuanced argument for blended learning. But we must allow ourselves to sit with the quiet horror of this result. The finding is not that AI is a surprisingly good teacher; it is that much of what we call teaching has already become so procedural, so devoid of intellectual risk, so focused on the transactional delivery of scaffolded responses, that a machine—a probabilistic text generator with no consciousness, no experience, and no soul—can plausibly simulate it. The student who felt “less judged” by the chatbot was not complimenting the machine; they were offering a devastating critique of an educational culture that has made them feel so constantly evaluated that the absence of a human becomes a form of liberation. The mirror of AI does not show us a flawed new tool; it reflects a pedagogy that has, in many quarters, already lost its pulse.
This brings us to James Yoonil Auh’s powerful and correct assertion that AI is no longer a tool but an “atmosphere.” He argues that this new atmosphere creates a crisis of epistemic authority. I would argue that it does not create the crisis; it simply blows away the fog that had obscured the fragility of that authority all along. For centuries, the university’s legitimacy was inextricably bound to its role as a manager of an artificial scarcity of information. It was the gatekeeper of the library. Now that the library is everywhere and can speak, the weakness of this claim is exposed. Auh notes that the real scandal is “pedagogy already too shallow to survive” AI. This is precisely the point. AI’s ability to produce a plausible undergraduate essay does not threaten the future of thought; it reveals that what we have been accepting as thought has often been something far less. The mirror reflects that our claim to be the sole guarantors of credibility was already thinner than we were willing to admit.
Finally, consider the vital work of Dr. Mbathi and the Masakhane community in building AI for African languages. This is framed as a forward-looking project of inclusion. It is. But it is also a meticulously documented invoice for decades of past exclusion. The very concept of a “low-resource” language is not a natural or technical state; it is the quiet residue of a global academic and technological system built on a neo-colonial default setting. The heroic effort to build these new datasets does not reveal the promise of AI; it reveals the profound, systemic failure of universities, libraries, and funding agencies to have properly valued and curated these languages in the first place. AI did not create this marginalization; it simply built a mirror so large and clear that we could no longer pretend not to see the scale of it.
If this diagnostic reframing is accurate, then the entire focus of our current institutional conversation is misdirected. The most urgent question for any university leader today is not “What is our AI strategy?” It is:
“What are the profound, pre-existing vulnerabilities in our pedagogy, our business model, and our public trust that AI has just made it impossible for us to continue ignoring?”
The work is not to form another AI committee. The work is to begin the painful, unglamorous, and absolutely essential task of rebuilding a university that is robust enough to not need a saviour. It is to develop pedagogies of “authentic presence” that are so deeply human that they are, by their nature, un-automatable. It is to build financial models based on durable value rather than fragile volume. It is to rebuild public trust not by hoarding information, but by becoming the most skilled and honest navigators of its abundance.
Of course, one must acknowledge the counter-argument, what Auh might call a “discontinuity.” Perhaps AI is not merely a mirror for the past but a genuinely novel, world-altering force, and our pre-existing conditions are irrelevant. Perhaps. But a system that cannot survive a candid confrontation with its own reflection is unlikely to be fit for any future, predictable or otherwise. The most courageous act of leadership in this moment is to turn away from the hypnotic glow of the new machine, look squarely at the reflection it has cast, and have the grace to begin the difficult work of healing the patient.

