Available April 15, 2025
Find New View EDU on Apple Podcasts, Spotify, and many other podcast apps.
Artificial intelligence (AI) continues to be one of the hottest topics in education. Should we be using it? Should we allow students to use it? When, where, and how does it fit into our schools and our vision for the future of education? Yet despite all the attention, AI is so new and fast-moving that we don’t have a lot of evidence or research upon which to base decisions. That’s a problem that Chris Agnew and the Generative AI Hub at Stanford University’s SCALE Initiative are trying to address.
Chris joins host Morva McDonald to discuss the Generative AI Hub and how he and his team are thinking about the challenges and opportunities generative AI brings to the education landscape. He boils down the purpose of the initiative to three areas: research, tools, and engagement. He shares how they’re applying research to understand how schools are using AI and whether those uses are effective.
Systemically, Chris says, there are three major ways in which AI currently is being applied. One is for efficiency, to save time and human labor in various ways. Another is for impact, to improve education outcomes across key skill areas. And the third is more aspirational: reimagining what learning and school can look like. He says that though it’s crucial we better understand how these three use cases are playing out in schools, the only data we have on that may be somewhat unreliable. The Stanford SCALE team is working to rectify that issue through systems mapping and large-scale data collection to better pinpoint what schools are doing and how effective those efforts are in achieving the desired outcomes. They’re also building a robust research repository that will allow educators to explore and learn from all of the existing research on generative AI in schools.
What is more certain, according to Chris, is that there are some conditions within schools and systems that are more optimal for the successful and responsible use of AI. Those include: building AI literacy and competency among all stakeholder groups, from students all the way up through trustees, parents, and other external stakeholders; understanding risk mitigation at a granular level; and identifying opportunities. The opportunity piece involves first asking, “What problem are we trying to solve through our use of AI? How do we intend to use AI to help us step forward on something specific?” Chris warns that as with any type of technology, we need to think of AI as a tool that can be used for specific purposes—but it should never become a tool we use only because it seems like the next big thing.
Critically, Chris points out, generative AI needs to be something we use not to put up more barriers between students and the real world, but to buy back time and opportunity to lean further into human-centered interactions and relevant real-world learning. He says we have a choice between using AI to optimize a flawed, tech-heavy system, or using AI to make education more efficient, human-centered, and relational. The work of his team at the Generative AI Hub aims to provide the tools and research needed to help schools do that.
Key Questions
Some of the key questions Morva and Chris explore in this episode include:
- What is the Generative AI Hub at the Stanford SCALE Initiative? Why was it created, and what is it working on currently?
- What do we know about the ways in which schools are using AI right now? What do we need to think about as we seek to apply AI to education systems?
- What are the characteristics that allow for responsible and effective AI adoption in schools? What should leaders think about in terms of risk management and helping get their teams and communities on board?
- What does the research repository contain, and how is it optimized to help educators learn and apply research to their growing use of AI in the classroom?
Episode Highlights
- “I would say AI in schools is week by week, month by month, as far as how fast the technology is changing, how fast takeup is changing. And so it is something that school leaders need to keep front of mind and be actively talking about, even though they can't be expected to know the answers. Because here's the great secret. Right now, nobody knows the answers. Even the heads of the largest LLMs are building tools that they don't know where that will lead. And we need to face it head on.” (12:21)
- “I hope we all know that technology is not a thing in its own right. Technology we should be adopting as a tool to do X or to do Y, but we're not just adopting technology for technology's sake. AI is the same. And so determining, OK, AI is this tool. What are we thinking about what we want to apply this to?” (15:23)
- “I feel like we're at the precipice of the opportunity to make this choice or go down a more, say, dystopian or technology-for-technology's-sake path. Right now we have this technology, when you apply it to education in schools, and specifically you take AI with this very new powerful technology. You can use it to optimize a currently flawed system or use it to completely reimagine what's possible.” (27:22)
Resource List
- Check out the work of the Generative AI Hub.
- Watch Chris talk more about the hub and its focus.
- Delve into the research repository.
- Learn about the research methodology surrounding AI and what we currently know.
- Read a Stanford article on human learning and the opportunities of AI.
- Keep up to date on the work of the Generative AI Hub team by subscribing to their newsletter.
Full Transcript
- Read the full transcript here.
Related Episodes
- Episode 68: Technology Innovation in Independent Schools
- Episode 49: The View From the Classroom
- Episode 45: Designing Schools for Blended Learning
- Episode 32: Restoring Humanity in Education
- Episode 31: AI and the Future of Education
- Episode 28: Supporting Healthy Habits for Students in a Digital World
About Our Guest
Chris Agnew is passionate about applied learning through hands-on, immersive experiences. Over 25 years in education, he has bridged academia and nontraditional classrooms—from glaciers in some of the most remote places on Earth to tech apprenticeships with Fortune 500 companies, leadership development for Google executives, and early literacy programs in national parks. He thrives on exploring how people build skills through real-world application and discovery.
Chris has taught in and led K-12 schools and post-secondary experiential organizations. During his six years leading Teton Science Schools, the organization educated nearly 100,000 students in person—spanning a K-12 lab school, visiting school groups from across the U.S. learning Next Generation Science Standards in national parks, and a graduate school of education in partnership with multiple universities.
In recent years, Chris transitioned to edtech to expand access and reduce costs for transformative learning models. Before joining Stanford, he worked at two venture-backed edtech companies, most recently leading U.S. higher education strategy and credentialing for Multiverse, a UK-based company pioneering professional apprenticeships as an alternative to college.
Today, Chris leads the Generative AI for Education Hub at the Stanford Accelerator for Learning. The hub serves as a trusted resource for K-12 education system leaders—superintendents, state, and federal officials—on what works in leveraging generative AI to benefit students, schools, and learning.