Higher education is in the middle of a slow-burning crisis. It is not just about misinformation and disinformation, but something that might prove just as corrosive: a crisis of synthetic knowledge, an epistemic pollution in which the very foundations of what we consider knowledge, what is worth knowing, and what is true are being eroded.
Of course, the problem is not that AI-generated content exists. It is that it increasingly looks like knowledge while bypassing the very mechanisms that give knowledge its legitimacy. Traditional safeguards such as peer review, scholarly debate, and methodological transparency are being diluted by an environment where credibility is shaped less by expertise than by linguistic persuasion. If a claim is well-worded, neatly structured, and delivered confidently, it is increasingly accepted as knowledge, or truth, regardless of whether it has any epistemic weight.
This is not just about students using ChatGPT to write their essays. It is about how entire institutions such as universities, publishers, and policymakers are beginning to rely on AI-generated syntheses that blur the line between human understanding and statistical plausibility. What happens when those responsible for generating knowledge can no longer be sure where it came from, or whether it was constructed through the intellectual efforts that once defined scholarship?
The Rise of Synthetic Persuasion
The danger of generative AI is not its occasional hallucinations, nor is the crisis simply that AI makes mistakes. Mistakes can be corrected. The crisis is that AI produces fluent coherence - text that follows the rules of argument and structure so well that it bypasses skepticism altogether. We are entering a world where knowledge is no longer built through struggle and debate, but assembled through statistical prediction. A world where what seems credible replaces what is true.
Students are already experiencing this shift. Ask a classroom to analyze an argument, and many will now reach first for AI-generated interpretations. These responses are crisp, digestible, and syntactically polished. But they are also fundamentally hollow: they bypass the act of interpretation itself, substituting an authoritative-seeming summary for genuine engagement. The result is not just a generation of students who struggle with deep reading - it is a shift in what reading actually means. The act of wrestling with a difficult text and formulating a response grounded in knowledge is being replaced, or at least supplanted, by the ability to rapidly navigate a sea of pre-generated interpretations.
This is how the synthetic knowledge crisis unfolds. Not through outright falsehoods, but through a gradual weakening of the criteria that distinguish knowing from appearing to know.
Higher Education’s Own Complicity
Universities are not innocent victims in the ongoing synthetic knowledge crisis; they are accelerating it. The relentless pressure to secure funding, generate research, rack up citations, and deliver 'impactful outputs' has created an environment where quantity trumps depth. Research has become a commodity, and generative AI is only amplifying a trend that was already underway: the capitalisation of knowledge production, where intellectual labor is increasingly evaluated by metrics that reward speed over substance.
Consider the academic paper mill phenomenon. AI has made it trivially easy to generate plausible-sounding research papers, many of which are now slipping into lower-tier journals. The response from universities? More detection software, more plagiarism policies, more reactive measures that fail to address the underlying issue: the incentive structures of academia reward the illusion of productivity over actual intellectual contribution. AI is not the disease; it is the symptom of a system that already prioritized measurable output over thoughtful inquiry.
Worse still, many institutions are embracing AI under the banner of efficiency. Automated grading, AI-generated feedback, and algorithmic curriculum design are being championed as solutions to administrative overload. But efficiency in the service of what? When evaluation is outsourced to a system that cannot distinguish between deep engagement and superficial coherence, the very premise of higher education begins to crumble.
We have created an environment where both students and faculty are subtly incentivized to engage with knowledge in ways that are performative rather than substantive. Of course, rigorous research is still being conducted and published. But even in the most serious work, speed remains a dark passenger. From the student's perspective, feeding an essay prompt into a generative AI and lightly editing the result is not necessarily "cheating" in the traditional sense - it is simply adapting to an ecosystem where knowledge production is increasingly simulated rather than earned. Is this wrong? Perhaps - perhaps not.
The Future of Knowledge in Higher Education
Higher education can either double down on epistemic rigor, reinforcing the processes that make knowledge meaningful by rethinking learning and research, or it can drift further into an era of frictionless, low-effort intellectual consumption in oceans of synthetic knowledge.
This is not a call for technophobia. AI will, and should, have a place in higher education. But that place cannot be uncritical adoption, nor can it be framed purely in terms of productivity gains. What is at stake is nothing less than the epistemic integrity of the university itself. If higher education becomes comfortable with AI-generated coherence, it will have relinquished its role as a steward of inquiry and curiosity.
Making this change will not be easy. Institutions need to redesign assessments so that they privilege process over product, reaffirm the value of disciplinary methodologies that resist easy summarization, and teach students to recognize the difference between knowledge and its synthetic cousin. It will be uncomfortable and slow, no doubt.
But the alternative is worse: a future where the academy is no longer a space for the pursuit of truth, but a clearinghouse for well-formed but meaningless arguments - endlessly generated, infinitely circulated, and utterly devoid of the intellectual struggle that once defined what it meant to know.