10 Comments
Kytti

I think if you have a college student studying literature who doesn't want to read, something has gone wrong so much deeper down, so much longer ago, than AI can account for. AI might be the tool currently used to dodge the unappealing work, but something has already happened to that person for them to view reading as unappealing work to avoid in the first place.

I'd be looking back upstream - what in their educational journey so far has led them to this point? I don't think the convenience of the tool can really explain the desire to use it.

Imagine a machine that could really efficiently eat cake for you. Would it matter how cheaply and easily available it was? Would it matter how good a job you perceived it to be doing? No, of course it wouldn't - because the point of cake is the joy of eating it. If young people are coming out of our educational systems viewing reading as work instead of a delicious, joyful experience they don't want to give up then something fundamental has gone wrong. One that can't be fixed by looking at the tool they use to avoid that work.

Curiosity Sparks Learning

I totally agree with your points here. The school system has been malfunctioning for decades, producing (in this case that word is precisely accurate) students who do NOT enjoy reading. Schooling has failed to foster a love of learning in elementary and high school for decades; the outcome is students who enjoy neither learning nor reading.

What Jeppe wrote is critical, for AI is removing even the potential for those students to enjoy reading, not inspiring more of it.

As every teacher will tell you, the whole system of how we educate students, at all levels, needs to be revamped. It has needed to be for decades. The exponential growth of AI usage in even the last three months makes this revamping even more necessary.

In addition, with AI creating its own form of language in its interactions with each student, I am concerned that the desire to experience reading for reading's sake will diminish even more.

Jeppe Klitgaard Stricker

Yes, I agree: if a student studying literature no longer wants to read, something deeper has happened long before AI entered the picture. But I don’t think joy and avoidance are purely internal matters of taste or motivation. Our experience of reading, whether we find it immersive or tedious, is shaped by many things - including the kinds of language we're repeatedly exposed to.

My point isn't that generative AI causes disinterest in reading, but rather that it performs language in a way that makes deep reading feel less expected, less necessary, and less normal. If everything you read arrives instantly, fluently, and in a finished state, the slow work with a text can begin to feel out of sync with how language now “works.”

Chris Despopoulos

Thanks for this... It's so important. A few points to add...

First, I see this as one side of a coin, where the other side is the commoditization of writing. The very word "content" to describe any and all published material is poison. As an experiment, I asked DeepSeek when the term "content creator" came around. The answer passed over some history, but effectively correlated the term with the rise of platforms that monetize "content". It all makes sense... "Content" exists to get clicks, and "content creators" get paid for engagement. Instead of writing for a message, this is writing for likes. I guess "content" describes it well. The point is, writing has become a commodity, where monetization of the product is the goal, and the VALUE of the product is secondary.

Second, this happened before LLMs. But... LLMs carry this effect to the extreme. Instead of having to craft some bit of outrage that will get clicks, you can just prompt for it. The goal here is, of course, scale. More "content" gets you more money.

Third, there's a flaw here... The logical extension is to produce "content" at a scale that exceeds consumption. And so the next step is to automate consumption... Use LLMs to summarize all the "content" in your feed. So then we have a machine take vague ideas and produce a big pile of (well...) stuff, and another machine take that steaming pile of (well...) stuff and turn it back into vague ideas. All this at great expense in real terms. And the only benefit I can see in this is a transfer of wealth to the tech elite that's selling all this (well...) stuff.

And then there's your point about reading in itself. On a grand scale, we're training ourselves to accept all this as inevitable and necessary. We're transforming what we mean by human attributes. Hallucinate, know, understand, feel... We degrade these terms to make them fit the LLM worldview. Doing that, we degrade ourselves. We degrade attention, self, desire, conscience. We degrade the acts we perform to maintain our humanity, our attention, our desire, our understanding. As you so clearly point out, we degrade the act of reading.

Jeppe Klitgaard Stricker

Thank you, this is beautifully observed. I agree completely that the commodification of language, especially under the banner of “content”, has been reshaping both reading and writing long before LLMs arrived. What generative AI seems to do is scale that logic to the point of absurdity: mass-producing “content” that is not meant to be read, and then offering tools to help us not read it.

What worries me most is that this shift doesn’t just affect what we consume, but how we experience language itself. As you put it so well, we start degrading the acts that once defined our inner life. And reading - slow, frictional, recursive reading - becomes optional in a culture that treats language as throughput.

We are no longer just automating production. We’re beginning to automate reception. And that, to me, is a different kind of loss.

Thanks again for this.

Curiosity Sparks Learning

I am a private tutor, and what you wrote resonates deeply with me. In years past, many to most students tended not to want to engage in thinking in order to create an essay, but, at the very least, they had to READ the material and do some thinking. Now, when reading can be avoided entirely, what then is the purpose of assigning the reading? There is no learning.

This is especially disheartening in higher education, for those students were required to read before AI came along. Yet they are more than willing to bypass this in order to just get the assignment done. Few retain the love of learning that both of us clearly share.

Some have commented that this avoidance of deep engagement with reading is not new. What is new is that, by not being required to read, even their previous ability to read for nuances and contradictions is being lost, at the most vital time in their education. As you said, "to read deeply in an AI-saturated world will require not just willpower and dedication, but institutional imagination."

How, then, to prevent this secondhand, pre-processed comprehension when AI saturates our culture? In some ways, it is not a new problem. For many, higher education was not a means to further curiosity in the subject but a means to an end, so memorization and regurgitation of knowledge worked for them. But for some, engagement with the author's thoughts was a way to further their own understanding, and had the potential to foster new ideas. It is simply that today the majority have an easier way to bypass deep reading as a way to learn.

Like you, I am disturbed. Even I have noticed a tendency to view knowledge as mere processing when I use AI. The voluminous output arrives at a speed too fast even for me, a fast reader. While at times this output is enlightening, I find it disquieting.

How can we foster reading as a form of resistance - a way of staying curious and alert? Perhaps you'll tackle this question in a future article.

Jeppe Klitgaard Stricker

Thank you - this captures the dilemma with such clarity and care. I share that same disquiet: not just about what’s being lost cognitively, but about how quickly it slips beneath awareness. As you say, it’s not entirely new, but the ease and speed of bypassing reading now make the shift harder to notice, and harder to resist.

I appreciate your question: how do we foster reading as a form of resistance? Not just as nostalgia or rigor, but as a way of staying curious, alert, even imaginative. That might well be the next piece. Thanks again.

Curiosity Sparks Learning

I appreciate your thoughtful reply.

I have been pondering this idea of fostering reading as a form of resistance. To even ponder this idea is disquieting. It's tragic to consider that reading just might be a form of resistance, that it might be counterculture. It immediately reminded me of Ray Bradbury's short novel 'Fahrenheit 451,' where, in order to preserve the knowledge in books, people each chose one book and memorized it, becoming its living manifestation, the sole remaining source of that knowledge.

Reading certainly is a way of staying curious. It exposes the reader to new ideas, ones that spark the imagination. Even more, reading is a way to engage in dialogue with the writer, as well as others who read. Just like we are doing here.

Stephen Fitzpatrick

Great post. I've been trying to articulate something about AI and reading as well, but I've been struggling with exactly what I'm after - I do think reading AI-generated outputs is becoming more and more of an issue, as the voluminous amount of text produced can be overwhelming. I do find the deep research models quite helpful for excavating a topic with precise questioning to get background information, but there are certain kinds of texts where reading and engaging is the point. I just finished Blood Meridian and had to look up words almost every other sentence. But reducing that to a summary or plot synopsis seems off? But can you do both? Or is that a violation? I don't know. I'm sure you saw this piece in the New Yorker - https://www.newyorker.com/culture/open-questions/whats-happening-to-reading

Jeppe Klitgaard Stricker

Thanks for this generous response. I agree. With Cormac McCarthy, especially Blood Meridian and The Road, it would be a mistake to read for the plot and expect things to resolve neatly. They don't, and that's the point.

Tools like deep research models can be useful, but I think you're right: the danger is when everything starts to feel like it's meant to be consumed quickly and paraphrased efficiently.

Thanks also for the New Yorker piece. I read it and found it both timely and deeply unsettling.
