New View EDU Episode 71: Full Transcript

Read the full transcript of Episode 71 of the NAIS New View EDU podcast, which features Chris Agnew, director of the Generative AI Hub at Stanford University’s SCALE Initiative. He joins host Morva McDonald to discuss the challenges and opportunities generative AI brings to the education landscape.

Morva McDonald: Like many organizations, at NAIS, we've been thinking hard about AI and how to support schools in making good decisions about if, when, and how to adopt AI in their organizations. Given our values related to equity and inclusion, we've also been thinking hard about how to get smarter about the ways in which AI can be used to improve efforts around inclusive practices. While trying to get smarter, I came across Chris Agnew's work as part of the Stanford SCALE initiative, which focuses on Systems Change for Advancing Learning and Equity.

Chris Agnew is the Director of the Generative AI for Education Hub at the Stanford Accelerator for Learning. The Generative AI for Education Hub delivers trusted research, insights, and tools for K-12 education leaders to leverage generative AI to benefit students, schools, and learning. 

Chris is a lifelong educator, just like all of us. He has worked in K-12 classrooms, led education organizations, and is passionate about creating learning environments that are immersive, relevant, and embedded in the real world. We’re going to learn a lot today.

Hi, Chris. Thanks for being here today. I'm super excited to learn with you through this conversation. So currently, you're working at the Generative AI for Education Hub within the Stanford SCALE Initiative. Just talk to me about why it was created. What are you trying to do? What's its purpose?

Chris Agnew: This hub and what we're trying to build is really exciting, I think. But first off, Morva, thanks so much for having me on this. Excited to talk about AI and education and the risks and opportunities with it. 

So a little bit about the hub. We launched in July, when I began in this role as the director. The SCALE Initiative is about doing and applying research in education that can have impact at a system-wide scale. What I mean by that is a district, statewide, or country-wide scale. So one of the initiatives within SCALE is the National Student Support Accelerator, which is focused on high-impact tutoring. And over the last couple years, some of their research started to step into AI and its role in tutoring, because of the obvious kind of V1 applications of AI to tutoring.

And in many ways, to state the obvious, the recognition was, ooh, AI and its role in education is going to be a big thing and we need to dedicate a focused initiative around this. Enter the Generative AI for Education Hub. So what we want to be is the trusted source for education system leaders, specifically superintendents, state, and federal K-12 leaders, on what's working and what's not when it comes to students, schools, and learning.

And so in that we have three priorities: research, tools, and engagement. And I'll drill down a little bit into the first two, research and tools. We've been in this phase of generative AI for the last two and a half years that's been really focused on use case generation. So ChatGPT came out, additional tools have rolled out, and education has been in this phase of, it could do this, it could do that, identifying all the ways it could be applied.

We see our core role as evolving that conversation, from simply all the ways it could be applied, to starting to apply research to it to determine: where are we starting to see efficacy and impact? Where should we maybe double down? Where are we not seeing the ROI? And so in that research realm, we're both doing original research and also collecting and centering research in something called the research repository, which we can get into a little bit later in our call.

Morva McDonald: In your research, when you're thinking about impact, are you thinking about it at the level of, like, the organization, or at the level of individuals? How are you thinking about efficacy? Are you focused really on efficacy for students? Like, what's the level, right, at which you're making that judgment?

Chris Agnew: So what we want to have, we want to have impact at a system-wide scale. So our research is about what levers can be pulled to implement at, probably the smallest grain size would be a school. But what decisions can a superintendent make? What decisions can a statewide ed leader make? That being said, a lot of that relates to how students are using the tool, how teachers are using the tool.

And generally, on the why behind Gen AI, we have largely identified three big categories of why these system leaders might be thinking about Gen AI in their schools. So one is as an efficiency play. So can I save teachers five hours a week? Can I give teachers their Sunday evenings back, say? Or can it save my finance team a bunch of time? It's all about efficiency.

Two, pretty unique to education, and I think what many of us really care about, is efficacy and impact. So how can this move the needle on student learning in the academic realm? How could it move the needle on social-emotional learning? How could it improve outcomes in durable skills? But it's all in the category of outcomes.

And the third category that I think is really important to keep front of mind, but where sadly right now we're not seeing as many products out in the marketplace, is Gen AI to reimagine what learning looks like. Because we have this powerful new technology, how does it change what school looks like, how does it change what a school day looks like, how does it change the role of a teacher and students, and all the different ways it could reimagine? So those are the three big categories we're thinking about for the why behind Gen AI.

Morva McDonald: That's great. I think that's a really helpful framework, particularly as, you know, this audience is, you know, we're an association and most of the people who will listen to this podcast are educational leaders in schools. And the framing, I think even just the framing of, am I thinking about this in terms of efficiency? Am I thinking about it in terms of efficacy and impact for kids and, ostensibly, teachers? Or am I really thinking about it in terms of, how do we just reconceive and reimagine, to your point, kind of what learning looks like and how it takes shape? I think that framing itself is just really helpful.

You know, I imagine you have a pretty big vision, right, and view of like, what schools are doing and what it looks like. Just describe that to us from your kind of perspective. Like what are you seeing? How are you seeing schools use it? How are you thinking about that, you know, both at the systems level, but also in individual schools themselves? How do you see it taking shape?

Chris Agnew: Well, a colleague, or somebody that we collaborate with a fair bit, Keith Krueger at CoSN, which is a national network for school district technology leaders, I think has a great frame on what we're seeing right now. So I'll start with kind of the state as it is, the state of play.

They largely categorize what schools are doing into one of three categories. First, there's the ban-it category: just not allowed. Two, the let-every-flower-bloom category: do it all, try it out, and we'll see where we land. And three, the duck-and-cover category, as he puts it: this might blow over, maybe it's a fad, this seems hard, let's wait.

I'd say the direct answer on what is actually happening in schools is we don't know. Right now the only firm data on what's happening is all survey-based. And as we know, humans are not totally reliable in reporting what their own behavior is. I think we can all attest to this: on Monday on LinkedIn, I might see a survey saying 50% of teachers are using it weekly, and then on Friday, you see a similar survey with very different results.

So actually, some of our first research is around landscape mapping and determining, based on real user behavior, what people are using this for and how often they're using it. Right now we have some research in the works that involves working with one of the larger Gen AI K-12 platforms and looking at their user data to simply understand how far this is reaching across the country. How much time are teachers spending on these tools? What are they using them for? Are they using them for assessment, lesson planning, exit tickets? So we can start to drill down on how people are spending their time using these tools, to start to develop a map of the state of play of where we are right now.

This brings me to our broader, still-forming research agenda, which I put into three main categories as we go forward. One, with this fast-changing technology, we just want to understand what behavior is now and how that's changing. Our partners at the Center for Reinventing Public Education did some great research during the pandemic that used landscape mapping to track how schools' behavior was changing month by month, to manage risk, manage the pandemic, and help schools build strategies for the future. We're using a similar kind of rapid-response research to map the landscape as our first research agenda area.

Two, and we got into this a little bit before, the second category of research is around efficacy and impact. And this could be in a bunch of different ways, all around the different ways schools might be using these tools: are we seeing an effect in saving teachers time? Are we seeing impact in moving the needle on student learning in certain categories, or are we seeing impact in changing learning models? So efficacy research.

The third category is about enabling conditions. We haven't gotten there yet, but what we hope to do is look a bit more at what conditions need to be in place for successful AI adoption, management, policy building, et cetera, so that schools can factor that in as they build their AI policies, build AI strategies, et cetera.

Morva McDonald: In some ways there are lots of things happening right now that we can use our COVID experience, having been a head of school during COVID, as an analog to. This is probably one of them. And it's rapidly changing. And earlier we were saying there are maybe three responses that schools are having: one is duck and cover, you know, one is, this is a fad, is it going to blow over, those are all kind of related. And I'm wondering, if you were talking to a head of school, who has significant power to help their faculty think about how to engage, even at this early stage, do you have a thought about how you would advise them to engage? What would you say to them right now?

Given what you see, but also, it's really changing and it's rapid, we all know that, but there are ways of… there are dispositions one can have in a moment like this. I'm wondering if you have thoughts about that.

Chris Agnew: I have thoughts, I don't have answers, to be clear, and you are asking the spot-on right question. I'd say this definitely warrants the "lean in" mindset and the "go straight at the thing you don't understand" mindset, rather than trying to avoid it.

So your analogy to the early days of COVID, I think, is spot on. March 13th, 2020, when I was leading an organization that had a K-12 school, the decisions in those moments are still burned in my mind. And while in that period we might've been measuring hour by hour, day by day, I would say AI in schools is week by week, month by month, as far as how fast the technology is changing and how fast take-up is changing. And so it is something that school leaders need to keep front of mind and be actively talking about, even though they can't be expected to know the answers.

Because here's the great secret: right now, nobody knows the answers. Even the heads of the largest LLM companies are building tools without knowing where they will lead. And we need to face it head on.

So three big buckets, I'd say, school leaders can be thinking about. And I get this from a couple different things: my time in schools, but I'd say more importantly, I spend so much of my time right now talking to school leaders, district leaders, and state education leaders. So this is some pattern recognition on the schools I see that are navigating this well and looking at it as an opportunity, which is helping staff feel optimistic and curious rather than scared and leaning back.

They often share these three attributes. So the first category is AI literacy, and building AI literacy: what the technology is, where it comes from, what it can be used for, for all your stakeholder groups. So the obvious ones, students, teachers, school leaders, but also other categories, a big one being parents and the outside community, and building AI literacy in those groups. So that's one big category.

Morva McDonald: And in our context, that would include members of the board of trustees.

Chris Agnew: Excellent example, yes, yes. Governance is a great example there. 

So AI literacy is one; the second category is thinking about risk mitigation. We see lots of opportunity with AI, and of course, with all things that have opportunity, there are associated risks. And we want to be in a position of maximizing opportunity and minimizing risk. So risk mitigation, big categories related to data privacy, bias, etc. Data privacy is a big one in making buying decisions. School leaders might feel pressure; they're getting pitched by ed tech companies, they're seeing other schools adopt ed tech tools and products. And really understanding the data privacy implications for their student body, for their school, leaning on their school general counsel or school board and governance for guidance on that. But risk mitigation, big category.

And lastly is the opportunity category: what problem are you trying to solve, or what opportunity are you trying to step into with this? I hope we all know that technology is not a thing in its own right. Technology is something we adopt as a tool to do X or to do Y; we're not just adopting technology for technology's sake. AI is the same. So determining, OK, AI is this tool, what do we want to apply it to? And I go back to my previous frame: is our first entry an efficiency play, do we want to save a little time in, say, our back office? It feels like a safe place to experiment. Or maybe we want to focus it on a teaching and learning challenge that we've been facing over the last number of years, and we feel like we could step forward in that realm or some other area.

But determining, being deliberate in, OK, what's our opportunity here? And what I'm seeing, and this aligns with many other change management strategies that schools can use: schools or districts are building a working group of early adopters that includes some teachers who just love technology and love playing around, and a few school leaders or department heads who really like engaging in this, and giving them a little more freedom to push the boundaries, bringing that group together on a periodic basis, say monthly. And one thing I will note too: right here, this kind of early working group is a great opportunity to involve students, because with this technology, specifically at the high school level, students can often be miles ahead of faculty in their native fluency with the tool.

But this kind of early-adopter working group can help school leadership feel maybe one step ahead, or able to see just a little bit into the future based on what the group is doing, and that can help form, OK, where's our system-wide, school-wide, longer-term plan here.

Morva McDonald: One thing I was thinking about with this, and tell me if I'm wrong, if I don't have a good understanding of this: sometimes as a school leader, I think it's really helpful when you're thinking about change to think about whether it's kind of bottom-up reform or top-down reform.

And one of the things I'm hearing you say is that, from the leadership, there need to be the conditions, the enabling conditions. But in many ways, it's a bottom-up reform in the sense that it's organic: finding people who are the early adopters, who want to engage or are interested, and creating this space in your context to allow people to explore without it being high stakes. Does that seem like, is that a good perspective from your understanding?

Chris Agnew: I would say certainly the bottom up and top down is a good frame. I think one really empowering thing about artificial intelligence, for the teachers I talk to, the school leaders I talk to, and in my own personal experience, is it can give a greater sense of agency to educators: "I built a thing and I learned a lot from it and now I feel like I can do more."

So it's such a great opportunity to give some of that agency to teachers, to early adopters, to students, and say, hey, what can you do with this, and help us learn? So great bottom up. And it can't only be bottom up, because there are school- or system-wide decisions that need a clear point of view, because large-scale data is involved, which gets to school policies and security.

System-wide tech policy relates here. And certainly, in some of my conversations with district leaders and state leaders, there's a little bit of a push-pull. This is in the public context, but at the state level, they're saying, hey, we want to give schools the freedom to do their thing on the ground and let them lead. And so they're almost pushing it to districts.

And districts, what I hear from some districts is, hey, we'd love a little more guidance from the state on what our boundaries are, where we can experiment and where we can't. So I think that both bottom up and top down is really important.

Morva McDonald: I mean, you can extrapolate that example, I think, to a smaller school setting, right, of the desire for, what are the parameters? Most good learning happens inside some kind of parameters that have been designed and thought of, right? And then the autonomy and freedom to explore inside those parameters. And when I was thinking about a school, that's kind of how you want to think about what you're trying to do.

I know that in your work you're building out a kind of research repository for this.

Talk to us about what that is, who you think it's for, and why it's an important thing to be doing at this point in time.

Chris Agnew: Great, well, thank you for asking about that. This is something we're really excited about that we launched at the end of January. So back to what I mentioned earlier about wanting to evolve the conversation from use case generation to efficacy and impact. The research repository on our website, which I'm sure we'll link to in the show notes or wherever, is the single source for all the research that exists around Gen AI and K-12, searchable and filterable by user, by application, by study design, all of that.

When we launched, there were 181 papers, which goes to show that we're at a pretty nascent stage. To be clear, 181 papers is quite small. And almost all of those papers were written in the prior 24 months. And we're seeing a rapid increase in that research, which is good news. But the idea is that as a school leader, you might be scratching your head and saying, OK, we want to be deliberate in how we're thinking about AI and what problems it can solve or opportunities it can create for our school. And we're thinking about a middle school math product, or we've been struggling with middle school math. What does the research say about AI tools for educators oriented to math at the middle school level? And you can filter to that and then see what the research says.

Or say you're getting hit with a lot of different ed tech companies pitching their products to you and you want to be grounded in making buying decisions based on what the research is saying. So you can go to the repository, filter for what you're curious about, and you can do several different things. Right now it's a small enough pool that in many cases you might filter and only find four papers that exist on that category. And so you can look there and see what those say and if they match with what you're curious about. 

You can take another step, and you can start to push your own AI knowledge as a school leader, because school leaders need to be playing with the tools themselves just so they can start asking the right questions. One great use case that we've been hearing from district and school leaders is they go to our repository, they filter for what they're curious about, they identify several papers, they download those papers, and they create a NotebookLM notebook, which is a Google tool. You can take those four papers and upload them into NotebookLM. And guess what you can do with that? You can create a 15-minute podcast that synthesizes those four papers into something you can listen to on your run. You can chat with those four papers and ask questions about your specific context. You can do a bunch of different things to engage more deeply with that research.

Last use case that we've been hearing from people: ChatGPT has recently released a deep research tool. And we've heard great applications where users go to ChatGPT's deep research tool and steer it, just in the prompt, to look at the Generative AI for Education Hub's research repository: "I am a high school independent school leader, and we're struggling with X, Y, and Z problem. Look at all the research and come back to me with relevant research addressing this challenge we're facing right now." And what schools are getting back is really interesting.

And one final plug: please, we would love to hear feedback from users of the repository on what they're finding helpful and additional features that would make it more useful to them as practitioners, because over the next six months we're talking to several technology companies to make it a more powerful tool oriented to practitioners. Our core audience for this is school leaders, superintendents, and state education leaders: how can this tool be more useful for you in making decisions, building school policy, tackling challenges, etc.? So please share feedback with us.

Morva McDonald: That's really exciting. I think that, for me, it blends two of the things that you talked about earlier. One is, how do I get smarter? How do I build my AI literacy within my organization? How do I tap that? How do I understand what kinds of opportunities or challenges I have? So I can imagine going to the repository and saying, I don't really understand this, and I'm interested in what the research evidence says about X. Or, I have this opportunity or this challenge in front of me; I've done some problem definition, and now I'm looking for what AI can help me with in that realm.

And so the blending of those two things, I think, is really powerful, and it's exciting to be at the front end of the evolution of this in this kind of way, right, because often things happen and then we try to categorize them and capture them. It sounds like you're doing a lovely job of building that system while we're also in such a high-speed evolution of how this is working. So I really appreciate that. I think that's going to be a tremendous tool for educators.

Chris Agnew: Yeah, that's the goal. And I mentioned this earlier, but I want to keep hitting on it to encourage school leaders to lean into these questions, because, as you were saying in several ways right there, these are the early days of this technology and it is changing so fast.

So you can frame that in one of two ways as a school leader. One frame is, I don't know how to tackle this because I don't know the answers. The other is, I don't know how to tackle this because nobody knows the answers. And it's almost this liberating gift to just jump into it head on, because you don't have to know the answers, because nobody does. So grab a paper that looks interesting and bring it to your next professional learning community, and just get teachers curious, get people talking about it, get students talking about it. And see where that goes.

Morva McDonald: Yeah, I love that. I think that's such a creative process. And that's exciting. And to your point, it can really rejuvenate people's interest and excitement about the work in education right now. 

One of the things I've heard you say is that you think AI offers us, to some degree paradoxically, I think, the chance to make education more human-centered, which on the surface seems completely contrary, right? I think that seems like a contradiction. Tell us a little bit about that belief and how we can capture it in schools.

Chris Agnew: Morva, thanks for asking this question, because, I don't know if you remember, this is a throwback to NPR, which used to have this This I Believe series that I loved. It had these short audio essays. If they still had that today, this might be my, like, two-minute This I Believe. But it's something I do feel really strongly about. And I feel like we're at the precipice of the opportunity to make this choice, or go down a more, say, dystopian or technology-for-technology's-sake path.

Right now we have this technology, and when you apply it to education and schools, specifically when you take AI, this very new, powerful technology, you can use it to optimize a currently flawed system, or use it to completely reimagine what's possible. The challenge we're faced with is that Gen AI right now lives, and largely will live, almost entirely in the private sector. Living in the private sector means these companies have venture money they've raised and financial targets they need to hit. And so there are all these incentives to deliver results to schools right now, to solve their problem next week. And what that can create, and we're seeing it in some of the AI products coming out right now, is, we'll build you a faster worksheet. That is not what the world needs more of. This is in the category of optimizing what I would call a currently flawed system.

There's a different world, where we've crossed this threshold where, rather than technology only interfacing with us in front of a screen, putting a wall between us and other humans and plugging kids into their computers as, quote, "learning" and leaving it at that, technology can now shift to hover around us, but not between us. And here are a few different ways I could see that creating opportunity, big picture, and then I'll give one really tangible example.

So we know, working in education for decades, there are all these things that often, and even more so today, are the things we really care about: is this student a good communicator? Do they care about their classmates? Are they a good team member? Do they have a growth mindset? All these things that we know are the stuff that really matters, and we do a crap job of assessing and tracking them. Our tools are just really crude and definitely not scalable. When we can assess them, it usually relies on very manual, single-human-based systems.

So are there ways that AI can be used to actually measure these things that we really care about, so that because we are now able to measure them better, we can track them better and value them better? That's one area.

Two, in the efficiency realm: can we save teachers time? Teachers have all these tasks they have to do, filling out paperwork, going through assessments, that take real and meaningful time and steal from what we really care about, which is that teacher getting time with their kids face to face, one-on-one coaching, small group sessions. So can we make sure we, one, save teachers time, but also make sure that time gets reallocated to the stuff we care about, which is that human-to-human time.

And then lastly, one example, and I'll drill down on this one to make it super tangible. I've certainly seen a shift in the last 15 years, and even more recently, a shift in recognition that, to help our learners be ready for their lives post-school, building project-based learning into how they engage with school sets them up better to be better team members, to work on real-world projects, et cetera. And guess what? Project-based learning for a teacher is really hard to operationalize, really hard to assess, and just takes lots of time. So it can often be the check-the-box: OK, we did our one project for the year or for the semester.

But really incorporating it into all subjects across the school day, what a lift. It's hard to do. AI has shown promise in some ways where it can really unlock better project-based learning and make things more relevant to the real world. Here's a really tangible example.

So I worked in nonprofit education, in hands-on immersive learning settings, for 20 years. And just post-COVID, I left that. I left it for a few reasons, one being that I recognized the business model for creating those learning experiences was broken. Even with massive philanthropy, there was not a path to unlock these types of learning experiences at scale for more kids. And so I recognized technology had to play a role.

So I spent a couple of years working for a couple of venture-backed tech companies. One of those was a company called Multiverse that used apprenticeships as an alternative to university. Really popular in the UK; they grew and scaled their model and then entered the US market. I was one of their earlier hires on the US team and led credentials and their higher ed strategy. And one thing we were working on toward the end of my time there: all learners, all apprentices, were placed in real jobs doing real work, and all their assignments or deliverables were real work products that represented a demonstration of the learning outcomes they needed to hit for their apprenticeship.

Coaches, who filled the student-facing instructional role, would engage with apprentices one-to-one or in small groups, and also track apprentices' learning. A significant portion of coaches' time was spent assessing their apprentices' learning: taking a slide deck that the apprentice built, or taking a massive spreadsheet, and then matching it to the learning outcomes, massively labor intensive. And we were building an AI tool that would allow these coaches to use AI to assess student output against the learning outcomes, still human reviewed, but shortening their assessment time significantly so that those coaches would get more time with their apprentices, which is the most important thing.

But two, and now I'll bridge this: that was in a post-secondary realm, but it has real implications for the K-12 setting as well. We're starting to see some AI-powered assessment tools that don't give the final grade for students. In previous days, students might have submitted an essay, waited three weeks, and then gotten feedback from their teacher after hours and hours of grading time. With AI-powered assessment tools, in less time, students can get three cycles of feedback on that same essay. So they get three more reps on that essay in way less time.

So more frequent feedback, more student practice, and more time with teachers working directly with kids rather than sitting on their couch at home with a red pencil grading papers. These are some real ways I get super excited about the potential of AI to make education more human and to reimagine what's possible: hands-on, project-based, relational, in the real world, with AI being a support to all of that.

Morva McDonald: That description and the details of that description are super helpful. And the thing that I walk away with, I think, Chris, is the significant importance for a school or a school leader of having a good, strong understanding of: what is the purpose of schooling? What are we trying to do with kids? What do we know from educational research that's really good for kids, that isn't just about cranking out children in some kind of way, but really helping them think deeply about learning? And then how do we leverage AI as one of many tools in a tool set? How do we leverage AI to improve upon that work? Without a clear purpose of schooling, you can optimize AI, as a tool, to the wrong end.

So I think that, for me, that clarity is key. We sometimes think AI, or technology, is going to come in and kind of revamp everything. And what it comes back to is, do we have a good, solid understanding as educators of what we're trying to do with kids, and to what extent can this tool, AI, help with that? Which I think, since we're kind of at the end of our podcast, is just a lovely way to end: to leave our audience with the question, right?

It's a good inquiry approach, right? AI for what? To what end? To what purpose? And I'm excited about continuing to learn from you and what you're doing at Stanford with the SCALE Initiative and the Generative AI for Education Hub. I think that's very exciting. I appreciate that you guys are right at the forefront of helping educators get their hands around this and really doing the service that that requires.

So thank you for that. Thanks for the time today. And I'm looking forward to just watching it develop and grow and hoping that some of our folks in NAIS participate in this engagement with you. So thank you. Thanks, Chris.

Chris Agnew: Morva, thank you so much. And please, I encourage people to visit our website, visit the repository, subscribe to our monthly newsletter, follow us on LinkedIn. Those are all ways to stay plugged in, because one theme of many that we talked about is that this is changing fast. You don't have to have the answers, but following along and finding those few key trusted sources that you feel you can keep tabs on, to help your school stay curious and keep learning on this, that is essential. And so we want to be helpful and be that trusted source for school leaders.

Morva McDonald: Happy to be with you at the beginning. So thank you. Appreciate it.

Chris Agnew: Excellent.