I had the great (and greatly expensive) pleasure of attending and exhibiting at EDTECH Week in New York. The conference brought together thought leaders in education technology and startups and covered the grand themes of our time: AI in education, education and AI, AI and workforce development, how AI will change education, and finally, how AI will change education.
Basically, it was an AI conference masquerading as an education conference, with multiple sessions running in parallel for three days in 45-minute intervals. By the end, the panels all blurred together and fed into a rather dystopian outlook in which AI tools were not just involved in education but seemed to be running all interactions in and around it. These systems claim to know what students need better than teachers do, supply teachers with pre-approved lessons, and listen in on classrooms to analyze performance and recommend improvements, or, you know, teacher replacements.
I am not exaggerating. I listened to pitches from companies that do all of the above, not as a unified Orwellian system (yet) but as a fragmented mess of services that teachers must navigate, trust, and somehow comprehend.
One panel summed up the conference perfectly. An education leader raised thoughtful concerns about overloading students with digital tools. A panelist agreed that teamwork is a durable skill that must be taught in schools and then pulled out his phone to demonstrate an AI-driven app that simulates teamwork with simulated students on a simulated project. I had hoped someone would comment on the absurdity of this conversation, but everyone was focused on the demo, which quickly unraveled because of “technical reasons.” I almost suggested developing an app to replace panelists just to see if that might trigger a response.
So we are facing a brave new world where students are surrounded by elastic layers of content and responses that understand them, adapt to them, offer no resistance, are always awake, and always judging. In the middle of all that softness and surveillance, a child is still expected to become a person, to make choices, build character, and form bonds, while never unobserved, never allowed to stumble in private or to discover anything the slow, human way.
But this was New York, after all, the city of relentless optimism and unapologetic hype. It reminded me of the craze surrounding blockchain and NFTs. Unlike those, AI is here to stay, not only because of the flood of venture capital but because it genuinely has the potential to accelerate learning, help students, and reduce administrative overhead for teachers. We just need to get through the next couple of years to see how the first wave of technology performs in real classrooms and then decide what works and what does not.
So yes, I am optimistic about the role of AI in daily life, especially in education, if it is used with clear purpose and transparent boundaries.
At GaiaXus, we are developing an AI system that helps teachers connect environmental concepts and data to examples their students already know and care about. The first prototype, scheduled for release later this year, will keep the AI element fully visible and open to scrutiny. It is designed to prompt reflection and discussion between students and their peers, not to replace those interactions. We spend a great deal of time on interface design, teacher feedback, and classroom testing to ensure our tools truly serve learning rather than adding another layer of technology or chasing the hype train.
And yes, investors, if you are reading this: we have AI in our platform, and it will work. But no, we are not making it the central focus of our existence. That place is reserved for students and teachers.
It was that teamwork demo that stayed with me. Not because of its failure — that happens to all of us — but because of what it represented: our deep urge to automate even the things that make us human. The challenge ahead is to resist that temptation and build technology that strengthens connection rather than replaces it.
