8 Comments
Karen Doore

I agree with some of your ideas about tenured faculty having little incentive to be innovative. However, I'd suggest trying whatever version of AI Khan Academy is using, because it's likely developed in a sandbox and can encourage students to explore their other content. On my first day of using GPT-5, after getting used to prior versions, I am extremely disappointed; I think it's because they are trying to generalize the features, and that is resulting in very poor performance.

As a non-tenured faculty member, I designed an innovative curriculum to teach computing to arts & technology students. It was a great learning experience for me, focused on experience design, dynamic interactivity, and modeling in simulation environments, until it wasn't: new administrators had no appreciation for the effort, and I was eventually fired. Those administrators used a 'move fast and break things' approach to leadership, because academia is stuck with an organizational structure that is rigid and promotes reward hacking and delusion, projecting an illusion of control and marketing rather than supporting critical thinking, adaptability, and interdisciplinary curiosity.

The academic model is collapsing, and the models that replace it must be holistic, generative, and trustworthy, promoting metacognition. The same goes for the current AI models, which are instead focused on massive scale, with no regulation, and on adversarial / military contexts.

Bette A. Ludwig, PhD 🌱

I worked in higher ed for 20 years, and I don't know how institutions are going to change fast enough to keep up with AI. I read so much from people who want to ban it and bring back blue books.

To The Pith

This is for the educators living under mossy rocks? This feels neither cheeky nor particularly relevant. If the choir you’re preaching to here isn’t far and away ahead of your “advice”, they likely shouldn’t be teaching in 2025. 🤷‍♂️

Andrew Maynard

Hate to tell you this, but this is the state of teaching at most R1 universities in the US at this point.

To The Pith

Show me the stats on these witless worrywarts scared to go near AI for fear that doing so will make them melt!

But suppose you’re right. As I said, if you’ve not got the wits or the interest to investigate AI, if you’re lacking the very “inquiry instinct” that brought you to teaching in the first place, take a knee and get out of the game.

So: A) I doubt your assertion, and B) even if it holds, see above.

Don Christoff

"The serum amplifies everything inside, so good becomes great; bad becomes worse." (Abraham Erskine, Captain America, First Avenger)

AI is very much like this, except that anyone with access to AI (this, too, is an issue) can be Steve Rogers or the Red Skull with it. Used correctly as an educational tool, it helps most students produce amazing results and can dramatically increase their lifelong thirst for knowledge and the truth. Misused, however, it will likely reduce their ability to think critically and create a dangerous dependence on the technology to produce any results.

Access has to be widely and equally available to everyone (it is not), and more importantly, ethics needs to be taught and constantly reinforced from an early age. Sadly, we are not getting a passing grade in either of these areas.

Hans Cox

Thank you. I’ll be teaching intro to computer science and intro to philosophy this semester, and I only have six months of teaching experience, from back when rephrasing tools and CourseHero.com seemed to be the most common ways to cheat. I’ll try your plan!

Andrew Maynard

Beyond the slightly snarky prompts here, Study mode is a surprisingly powerful tool for helping teachers develop and hone their classes -- well worth experimenting with. And good luck!
