Finding a Third Path for Schools
by Eileen Wedegartner
Artificial intelligence has already entered classrooms and workplaces, whether schools feel ready or not. The real question is not whether educators should respond, but how. Instead of banning AI or embracing it blindly, schools can take a third path: teaching students how to understand, use, and question these tools responsibly.
When ChatGPT first appeared, the world did not lurch. There were no earthquakes. Most people I know ignored it. Life is busy, and new ideas arrive every day. It felt like one more headline that would burn hot for a week and then gutter out. Amid the thousand wounds left by the pandemic, AI did not feel urgent.
Then, quietly, it stopped being a headline and became the story.
It showed up in living rooms, offices, and college dorms. It moved from a curiosity to a routine, embedded in our habits like an overachieving intern. This is why I call AI an arrival technology. It is not something we “roll out” when we are ready; it arrives and reshapes the landscape while we are still deciding what to think about it.
Beyond Professional Paralysis
Skepticism is a vital safeguard. We must ask hard questions about accuracy, bias, privacy, and equity. But there is a difference between healthy caution and professional paralysis. We do not prepare for the future by pretending it isn’t already here.
When Massachusetts announced a partnership with Google to provide free AI training, I felt a surge of hope. The Commonwealth is signaling a choice: readiness over resistance. It is a public acknowledgment that AI is already part of modern literacy — whether we love it or not.
I am an unembarrassed idealist. I believe educators make lemonade from lemons. When the world changes, we adapt—not by surrendering what matters, but by recommitting to it.
What matters is ensuring our students are learning what they need to live in their world.
We do not reduce harm by ignoring a tool; we reduce harm by teaching people how to use tools.
The Elephant in the Classroom: Cheating
I am not naïve about cheating. I have watched students do more work to avoid work since long before generative AI. The technology has changed, but teenage nature has not.
However, if we begin and end the conversation with cheating, we miss the point. We train students to see AI as a hiding place instead of a creative forum, and teachers to fear the tool rather than master it.
AI has the potential to shift curriculum, instruction, and assessment toward student mastery by helping educators tailor support to individual needs.
A Framework for the Future
Doug Fisher, a longtime voice in educational leadership, offers a useful frame for this transition:
Teach about AI.
Teach for AI.
Teach with AI.
Teaching ABOUT AI means understanding what it is, how it works, why it hallucinates, and what ethical use requires.
Teaching FOR AI prepares students to use these tools responsibly in a world where AI will shape writing, research, and persuasion.
Teaching WITH AI focuses on how educators use tools to plan, scaffold, and shorten feedback loops.
Inside this framework is a hard truth.
Blocking is not teaching.
Blocking may create temporary control, but it is not an educational strategy. Teaching happens when we establish norms, verify sources, and treat integrity as a habit we practice rather than simply a rule we enforce.
AI is a hammer. A hammer can frame a house or fracture a finger. The difference is the hand that holds it and the purpose behind it.
From Literacy to Fluency
To lead this shift, I recently completed the Google AI Professional Certificate. It was not a “click-through” exercise; it required real work and hours of engagement.
The goal was to move from AI literacy — knowing what it is — to AI fluency.
Fluency is the ability to use AI intentionally and ethically with a teacher’s judgment. It is knowing when AI is a good use of time — freeing a teacher for one-to-one conferencing — and when it is a poor use, replacing student thinking.
AI fluency is not a technology mindset.
It is a teaching mindset.
The Challenge of Access
Many districts are wrestling with a practical barrier: reluctance to open tools like Gemini because of privacy and compliance concerns.
While caution is correct, locking systems down without a plan for safe exploration creates a different risk. Students will simply work around the system on unmanaged devices, making exploration less visible and less safe.
The solution is managed, intentional access.
“Safe and secure” does not have to mean “blocked.” It can mean governed.
We must build platforms that amplify the teacher's voice rather than replace it, reducing mundane tasks so educators can focus on the human work that cannot be automated: relationships and coaching.
The Third Path
Massachusetts has opened a door. This should not be treated as a headline but as a readiness moment.
We do not have to choose between banning tools out of fear or setting students loose without guidance.
We can choose a third path:
Teach about, for, and with AI.
Build guardrails and teach integrity.
Keep learning in human hands.
The job of education has always been to help students live intelligently and with integrity in a complex world.
At JFYNetWorks, we are ready to work with school leaders, teachers, and partners to move from anxiety to action. From tentative on-ramp to full fluency, we can help you map the route.
AI is here.
Our students need to know it — and use it.
Let’s build the path forward together.
Eileen Wedegartner is a Google-certified JFYNetWorks Learning Specialist.