Redefining Education in the Age of AI: From Passive Consumers to Participatory Co-Creators
We are witnessing an inflection point in the history of education. AI is no longer a futuristic add-on; it’s here—present in our students’ pockets, homes, and increasingly, in their learning experiences. The PPai framework developed by Margarida Romero and colleagues (2023) isn’t just a taxonomy of student-AI interaction. It’s a roadmap for how education can redefine itself—radically and intentionally—if we are to prepare the next generation for a world shaped by co-intelligence.
Romero’s work provides a continuum—from passive consumption to participatory co-creation and expansive learning—that we as teachers, leaders, and system builders must take seriously. And it forces us to ask: Where are we on this continuum? Where are our students? And what’s holding us back from moving forward?
The Continuum of AI Use: From Consumers to Co-Creators
Romero outlines six progressive stages of student engagement with AI:
1. Passive Consumer – Think of the student watching auto-played TikToks, using Grammarly without understanding why it suggests changes, or copy-pasting from a chatbot. They benefit from AI outputs but have no agency.
2. Interactive Consumer – Here, learners begin to respond to AI. They use Duolingo, for example, which adapts based on their responses. The AI “listens,” but students remain dependent on the system’s cues.
3. Individual Content Creator – A shift happens. A student uses ChatGPT to brainstorm an essay opening, or DALL·E to generate an image for a digital storytelling project. There’s intentionality now, and some understanding of how the AI operates.
4. Collaborative Content Creation – A group of students builds a multimedia history presentation using Midjourney for visuals, GPT for script refinement, and Synthesia for narration and video. AI is integrated for augmentation, not substitution.
5. Participatory Knowledge Co-Creation – Students engage in real-world problem solving—climate change in their community, food insecurity on campus—using AI to analyze local data and propose actionable solutions with stakeholders. It’s no longer about assignments. It’s about agency.
6. Expansive Learning – This is where learners are not only solving problems but reshaping the learning ecosystem itself. They use AI to model scenarios, negotiate values across disciplines, and propose systemic changes. A student-led AI ethics committee in one school in Montreal recently created policy recommendations that were adopted by the district. That’s expansive learning in action.
Why This Model Matters Right Now
This is not hypothetical. I work with teachers and systems across the world, and I can tell you: for the most part, education hovers around stages 2 and 3, with isolated pockets reaching stage 4. We’re not stuck because we lack talent or tools. We’re stuck because our structures weren’t built for this.
Too many digital policies focus on compliance rather than creativity. Too many assessments still reward recall over reasoning. Too many classrooms ban the very tools students will need to navigate the future. And too often, digital literacy is everyone’s job—which means it becomes no one’s responsibility.
Romero’s continuum gives us a language to redefine these contexts.
Real-Life Application: What This Looks Like in Schools
Early Elementary (Stages 1–2): A Grade 4 class uses Google’s Teachable Machine to create a sorting activity—teaching it to identify objects by shape and color. Students explore “what happens when it gets it wrong?” They start to understand training data and bias—even at this young age.
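The “what happens when it gets it wrong?” moment can be made concrete for teachers preparing this activity. A minimal sketch (this is an illustrative toy classifier, not Teachable Machine’s actual algorithm): if every red object in the training set happens to be a circle and every blue object a square, the model cannot tell whether color or shape matters—and a blue circle exposes the bias.

```python
# Toy 1-nearest-neighbor classifier. Features: (red, green, blue, num_corners).
# Training data is biased: "red" always co-occurs with "circle",
# "blue" always with "square".

def distance(a, b):
    # Squared Euclidean distance between feature vectors
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(train, sample):
    # Return the label of the closest training example
    return min(train, key=lambda ex: distance(ex[0], sample))[1]

train = [
    ((255, 0, 0, 0), "circle"),   # red circle
    ((0, 0, 255, 4), "square"),   # blue square
]

blue_circle = (0, 0, 255, 0)      # a shape never seen in training
print(predict(train, blue_circle))  # prints "square": color dominates shape
```

The misclassification is exactly the discussion prompt the lesson aims for: the model learned the accidental correlation in its training data, not the concept the students intended.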
Middle School (Stages 3–4): A Grade 9 class uses ChatGPT to explore multiple perspectives on a controversial topic, like facial recognition in schools. Each student refines a different argument strand, then collectively builds a debate model. The teacher scaffolds bias detection and teaches prompt engineering. AI isn’t replacing thinking—it’s expanding it.
High School (Stages 5–6): A team of students partners with a local Indigenous community to design AI-informed conservation strategies for land use. They use satellite imaging, AI-generated simulations, and stakeholder interviews. Their goal isn’t just to inform policy—it’s to co-create it. This is participatory, interdisciplinary, ethical AI use—and it begins in a high school classroom.
What Needs to Shift
1. Curriculum and Assessment – We need to embed AI and data literacy across all subjects, as well as offer dedicated Digital Literacy courses (wellness, citizenship, etc.). This includes data awareness, prompt design, ethical reasoning, and AI critique. Assessment must shift from output to process, from final product to demonstrated learning journey.
2. Teacher Capacity – Teachers are not just implementers. They are learning architects. But they need time, tools, and trust to experiment with AI meaningfully and design the learning they envision. PD must focus not just on tool training but on redesigning pedagogy around inquiry, agency, and creativity while using AI intentionally and responsibly.
3. Leadership and Infrastructure – AI should not be treated as a compliance or IT issue. It’s a generational opportunity to redesign education with intention, responsibility, and purpose. This means going beyond adoption to actively shaping how AI is integrated — aligned with our values and aspirations.
Leadership must create space for co-creation, innovation labs, and experimentation, allowing educators and students to help design what’s next. Infrastructure must support ethical governance, human-centered design, and equitable access, not just technology deployment.
Transparency around AI’s limits and potential should be modeled from the top. This is about building trust and readiness — and laying the foundation for systems that can evolve with the technology, not react to it.
4. Student Voice and Agency – Students should help shape the rules of AI use in schools. When they co-create AI ethics policies and academic integrity documents, or lead peer workshops, they move from object to subject in the AI learning journey. It must be clear to them what the guidelines are, how they can explore, and how they can apply AI to amplify their voices. We also want to avoid the cognitive atrophy that research suggests can occur when students lean on AI tools as a crutch. Make sure the use is intentional.
Anchoring This in Research and Reality
Romero’s PPai model doesn’t exist in isolation. It builds on:
· Engeström’s Expansive Learning theory (2001), which calls for learners to transform systems, not just adapt to them.
· The OECD’s 2023 report on “AI in Education,” which emphasizes the need to teach with—not just about—AI.
· ISTE’s Digital Citizenship framework, which now includes AI literacy as a core element.
· Rebecca Winthrop’s “The Disengaged Teen,” which makes clear: young people are not disinterested in school—they’re disinterested in irrelevance.
Final Word: The Moral Responsibility
AI is not just a tech trend—it’s a literacy. Like reading and numeracy, students who are left behind in AI fluency will be locked out of opportunity. And those who misuse it due to lack of guidance will face consequences far beyond bad grades.
Romero’s PPai framework gives us a call to action. Not to introduce AI to students—but to shift our entire educational paradigm to prepare them to lead, co-create, and build a better world with it.
Because in the Age of AI, the most powerful thing a school can teach isn’t how to use the tool.
It’s how to use it with wisdom, agency, and purpose.