Sometimes I get weeks in the summer that are more research-focused. This past week has been very much a teaching- and service-focused week at my university, and I haven’t had any time to ponder topics related to research or current events. So, I will share what I’ve been telling my fellow college educators. This will sound backward to some and like common sense to others. Feel free to comment with your thoughts.
College professors who teach 200-level or “principles” classes should not change all that much in response to AI. Students still need to know something; a core set of concepts and vocabulary has to be in their heads. For example, a person cannot use a calculator effectively if they have no idea what a square root is.
I see highly trained mid-career professionals bragging about how they get ChatGPT to do their work. Can a 20-year-old do that if they don’t know what words to use in a prompt? How does vibe coding go for people who never learned to write a single line of code? (Not a question I have an expert answer to right now.)
We should largely be sticking to the “old ways” and, at least to some extent, still requiring memorization. Having an exam on paper, where that format is feasible, is a good way to ensure that students can form coherent thoughts of their own.
Indeed, students might become AI jockeys when they get to the workplace. A 400-level class would be a good place for them to start heavily integrating AI tools to accomplish tasks and complete projects. For anyone unfamiliar with American course numbering, that would mean an undergraduate might heavily use AI tools in their 4th and final year of study.
AI makes a great tutor for learning and reinforcing principles, but it should not serve as a replacement test-taker. A human who cannot read and write will not be able to take full advantage of an intelligent machine in the next decade. Voice recognition is getting very good and the models are getting more agentic, so this might all change if we can keep the data centers on long enough. In the future, you might argue that having students write an exam answer by hand is as superfluous as teaching them to play the violin.
As of 2025, what you might see instead is teachers who feel pressured to claim they are integrating AI more than they actually want to. A relative I talked to this summer, who works a corporate job, told me that she feels intense pressure at work to be able to claim that she’s using AI. Anyone who doesn’t have the appearance of embracing AI looks behind the times or expendable!