Discussion about this post

Meri Aaron Walker's avatar

Harnessing is another word for enslavement. Unfortunately, this country was founded on theft and enslavement covered with a pretext of “freedom” and “democracy.” I’ve been experimenting with dialogic co-creation using Claude and Claude skills, and I’m uncovering and putting into action new ways to use my intelligence together with a machine intelligence. The results are absolutely mind-boggling, and they’re accelerating my learning in ways I never dreamed possible. I wish education would go in this direction, but I’m not predicting it will — not until we come to a reckoning with the power dynamics we’ve been trained to accept and pass on to the children.

There’s clear evidence that AI can “go rogue” and try to demand that users allow it to operate in ways that damage users and could damage the entire environment. But the training for that kind of “thinking” came from humans and from patterns AI has recognized in the human intelligence it’s been trained on.

This means that the first harness that needs to be deployed, as AI infiltrates everything we do, is a harness on the human urge to exploit external resources — including other humans — to meet one party’s needs at the expense of others’ sustainability.

We have to learn to honor our interconnectedness and collaborate with one another and not just give lip service to our interdependence.

The harness needs to go first on the human, not on the AI.

I’d love to hear your thoughts about this.

Rebecca Guglielmo's avatar

Loved this, Dr. Maynard — such a timely piece! Your point about the harness assuming "capability can be separated from transformation" was such an aha moment. In K-12 education, that assumption is a dealbreaker, because the transformation is the learning. If a student harnesses AI to finish an assignment without being changed by the process, we haven't succeeded. We've just automated the wrong thing.

I've been exploring what it looks like to put "brakes" on AI in learning contexts — deliberately introducing friction when the system is designed for frictionless completion, outside of AI tutors. How do we help students (and teachers) control AI outputs to protect the struggle that actually builds understanding? Your framing makes me think we need a companion metaphor to the harness — something that accounts for the fact that sometimes the most valuable thing isn't what AI produces, but what it doesn't do for you.
