4 Comments
Craig Van Slyke:

At a minimum, the rest of us can look forward to learning from their missteps. To some extent, though, they do seem to be addressing the access-equity problem. I fund one premium AI tool for each of my doctoral students, but that's just a tiny, tiny effort. They're happy, though!

Jeppe Klitgaard Stricker:

Absolutely, Craig - time will tell whether this was a brilliant idea or just a very expensive step in the wrong direction. Most institutions will have to start small, so keep inspiring those PhD students!

Rob Nelson:

I'm in the "this approach is unwise" camp. I liked the question in your headline, but I'm skeptical that CSU knows the answer. Saying they solved access to AI by buying everyone a license, and announcing that they're "the first AI-powered university system in the United States—and a global leader in AI and education," are just words for reporters to repeat.

If an institution wants to spend money on AI, there are better options, like spinning up campus teams to build LLM tools for teachers or departments to explore using these things for actual learning. Or it could work with Boodlebox or another company that charges based on actual use of AI services instead of an enterprise-wide contract.

OpenAI has great brand-name recognition as a homework machine, and Sam Altman has P.T. Barnum-like skills at getting attention, but both of those things should make a sensible university system run away.

Jeppe Klitgaard Stricker:

Thanks, Rob. I appreciate it. I, too, remain doubtful about the broader implications here. And as you rightly point out, this kind of money could buy access to all sorts of generative AI approaches, including more locally owned and controlled ones. But I do see this move by CSU as a wake-up call for all institutions, whether or not they aspire to go in the same direction. Thanks again.
