Experts and stakeholders weighed in at a recent meeting of the governor’s Teacher Advisory Committee on the benefits and consequences of implementing artificial intelligence tools in North Carolina public schools.
Here are four takeaways from the conversation.
The use of AI tools in schools is not new
Krista Glazewski, executive director of the Friday Institute, said that artificial intelligence has been sprinkled throughout education systems for decades.
This includes tools like adaptive testing, intelligent tutoring systems (especially for math), and other programs that collect analytics and adjust based on a student's knowledge and skill level.
“Many people, and people that I work with, have been studying learning analytics for almost two decades now, and ways in which your data can be mined and then made actionable within a system,” Glazewski said, “(including) engaging with any kind of AI-driven system — collecting analytics, actions, chat engagement, and then putting that into practice or making decisions or inferences about what learners are doing with our systems.”
Many traditional AI strategies have focused on specific subjects such as math, and they are now branching out into other areas.
School districts should consider rules and statutes while setting new guidelines
Advocates of boosting AI use in schools want to facilitate conversations about its use in districts and prevent them from banning it altogether.
In those conversations, Allison Reid, the senior director of digital learning and libraries in Wake County Public School System, said they should start with the guidelines that are already in place.
“Every strategic plan for every district says something about preparing kids for the future or something about the four C’s of innovation,” Reid said.
Using the district’s goals and the terms and conditions of tools can help guide the rules set for teachers. For instance, the OpenAI website states ChatGPT is not meant for users younger than 13, and those between the ages of 13 and 18 should use it with the guidance of an adult.
Reid said it’s important to help staff understand how to use AI responsibly so they don’t approach it from a place of fear.
“You do have to have buy-in from leadership to understand it, and some education has to happen there as well,” Reid said.
The N.C. Department of Public Instruction recently published a guidebook for how to use AI in classrooms, making North Carolina one of just four states to do so.
AI will not cause job losses, but not using it might
“The only place you are really safe in the job market is probably plumbing, maybe preschool,” said Charlotte Dungan, director of content development at Teach AI & Code.org.
Overall, panelists said teachers will not be forced to integrate the new technology outside of a computer science requirement. However, using AI may help teachers save time and prepare lessons more efficiently.
Glazewski said new tools that use AI can streamline a teacher’s classroom training, going well beyond the methods used when she was becoming a teacher.
“I had to do this thing where they would videotape me on a VHS, and I had to tally how I interacted with every student — with the number of questions I asked and the number of genders and how I engaged with boys or girls,” Glazewski said.
Teach FX is a resource that records a teacher’s classroom, capturing how they engage with student questions and how those exchanges build on each other — “not for evaluation purposes or professional learning purposes, (but) for my own reflective practice in my professional learning community,” Glazewski said.
AI can be used as an inclusion tool but still has to be monitored
“To me, the pandemic was a major lesson about the shortcomings of technology in a fully technology-focused approach to learning,” said Jonathan Rowe, a senior research scientist at the NSF AI Institute for Engaged Learning.
Rowe said that AI misinformation and society’s ability to think critically about the tools are top concerns.
“They’re not trained to produce information that is true; they’re trained to produce information that seems plausible and sounds natural,” Rowe said.
Beyond fact checking, a proposed solution is for teachers and students to use tools such as ChatGPT together and talk through the causes of bias.
Beyond bias in the technology, another challenge to inclusion can be access. According to Glazewski, the next digital divide could be defined by which schools have enough AI literacy and which ones are cut off from it completely.
“We want to build on what you’re already doing. So your academic integrity policy, you’re not tossing it out the window, but you are adding some components to talk about how AI can be used in your classroom and still follow academic integrity policies,” Glazewski said.