Part of this is very reasonable; AI is upending how students learn (or cheat), so adding a requirement to teach how to do it in a way that improves learning rather than just enhances cheating makes sense. The problem with the broad, top-down approach is it looks like what happens in Corporate America where there's a CEO edict that "we need a ____ strategy," and every department pivots projects to include that, whether or not it makes sense.
https://www.purdue.edu/newsroom/2025/Q4/purdue-unveils-compr...
Where the actual news is:
> To this end, the trustees have delegated authority to the provost, working with deans of all academic colleges, to develop and to review and update continuously, discipline-specific criteria and proficiency standards for a new campuswide “artificial intelligence working competency” graduation requirement for all Purdue main campus students, starting with new beginners in fall 2026.
So the Purdue trustees have "delegated authority" to people at the University to make a new graduation requirement for 2026.
Who knows what will be in the final requirement.
Perhaps the world is going the direction of relying on an AI to do half the things we use our own brains for today. But to me that sounds like a sad and worse future.
I’m just rambling here. But at the moment I fail to see how current LLMs help people truly learn things.
For the same reason that elementary schools don't allow calculators in math exams.
You first need to understand how to do the thing yourself.
When I heard that today, it sounded like self-serving partnership, and, frankly, incompetence.
But I like to think that actually learning the history was important and it certainly was a diversion from math/chemistry/physics. I liked Shakespeare, so reading the plays was also worthwhile and discussing them in class was fun. Yeah, I was bored to tears in medieval history, so AI could have helped there.
After more than a trillion dollars spent, LLMs can replace: (a) a new secretary with one week of experience (b) a junior programmer who just learned that they can install programs on a desktop computer, and (c) James Patterson.
That's the bright future that Purdue is preparing its students for.
Yes, AIs will be a huge thing...eventually...but LLMs are not AI, and they never will be.
Purdue, not necessarily uniquely but as part of its charter, does a really good job of workforce development in its engineering programs. It is highly focused on staffing and training, and less so on the science and research side, though that exists as well.
This tracks with what I would expect and is in line with what I think best practice should be.
This is not remotely the kind of thing that a school should be making a requirement at this time. The technology is changing far too fast to be sure that even basic, fundamental skills related to it will remain relevant for as long as 4-5 years.
"all as informed by evolving workforce and employer needs"
“At the same time, it’s absolutely imperative that a requirement like this is well informed by continual input from industry partners and employers more broadly."
Purdue is engaging in the oldest profession in the world. And the students pay for this BS.