It’s no longer news that artificial intelligence is advancing at a rapid pace, and it’s hard to ignore the way it is transforming our daily lives. One area AI has substantially influenced is education, bringing personalized learning experiences, automated grading, new ethical concerns, and more.
In November 2022, OpenAI released ChatGPT, a responsive AI chatbot that has gained significant attention in the education sector. Many worry that AI writing tools like ChatGPT may diminish students’ critical thinking abilities and hinder the traditional educational process. They fear that easy access to AI writing tools might lead students to rely too heavily on AI-generated content, potentially stifling their ability to think critically and independently. This could have consequences not only for students’ education but also for their careers down the road.
In dealing with AI technology, the Westminster community is working to weigh both the positives and the negatives.
“We want to make sure that when students produce an assignment, it’s a chance for us to really accurately assess how much they have learned on a concept,” said associate dean of students Brooks Batcheller. “On the other hand, what we’re also trying to do is prepare kids for the world that they’re going into, broadly speaking. And so we’re in a position where we want to find ways to show the power and the benefits of AI without just pretending like it doesn’t exist.”
Like many other teachers, Batcheller highlights the importance of teaching students how to collaborate effectively with artificial intelligence, emphasizing understanding how to leverage its capabilities and make informed decisions based on AI recommendations.
Honor Council advisor Jake Kazlow provided more information on the process the council uses, specifically emphasizing possible concerns with the technology.
“Our biggest concern is making sure that we’re not getting any false positives, meaning that we’re not saying, ‘okay, this paper was AI generated’ when it wasn’t. And so, whatever we do, we’re making sure that our method of detection is ensuring that anyone that we’re flagging for sure did use it to generate the entirety of their paper.”
With appropriate development and support, routine and time-intensive tasks in the classroom can be outsourced to intelligent agents so that human teachers can focus on the deeply relational work of higher-level instruction.
This doesn’t mean that the school is headed toward a future where AI replaces human teachers. In fact, it could well mean that highly trained and capable teachers are even more important facilitators of learning in an AI-enabled academic environment.
Some faculty members at Westminster have responded to AI integration by embracing it as a means to enhance their teaching.
“[ChatGPT can act] like that scaffolding to get that person who’s struggling back up to the level, and then I can be there to help tutor and capture those things along the way,” said Kazlow.
Upper School performing arts teacher Kate Morgens was one of the first teachers to incorporate AI into her acting class: in one exercise, she used AI to generate scripts on the spot for students to act out.
“We’ve got some teachers that are trying to use it for the first time in classes. So I think it’s a lot of trial and error right now, and it’s a lot of teachers just kind of being creative; seeing what works and what doesn’t,” said Batcheller.
Recently, Upper School chemistry teacher Juliet Allan used ChatGPT to quiz students on binary ionic compounds, having it generate ten random compounds and then asking students to figure out the name of each one.
“We’ve ultimately decided that every teacher can decide how they want this tool to be used in their classroom,” said Honor Council head prefect Marina Quinterno.
Students, too, will need to learn how to use and understand the technology, and their varied experiences shed light on how AI tools are being used within the student body.
“I use it to generate ideas for a project or help me figure out how to solve a math or chemistry problem. I’ll put it in, and it’ll describe the steps of how to solve it. I think it is effective if you use it in the right way. But if you use it in the wrong way, that’s against the honor code, and you can have some consequences,” said junior Anthony He.
Senior Akul Rana added a nuanced perspective, emphasizing the need for specificity when utilizing AI.
“I think it’s more helpful as a generator tool, and it’s more dangerous as using it as a tool for copy and pasting… but I think if you use it right, it can be very helpful,” said Rana.
Students offered viewpoints that provided a glimpse into the ongoing discourse about AI’s role in shaping the future, with considerations ranging from its potential as a transformative tool to concerns about misuse and ethical implications.
“I think there’s two sides to it. There’s a lot of good stuff–it has potential to help a lot of people since it’s widely accessible. But at the same time, people can misuse these tools. Also, the data that you train AI software with can definitely skew stuff with prejudices,” said sophomore Samanyu Ganesh.
“If anything is abused, it can be bad. Moderation is key. I honestly do think that it has helped society in numerous ways, in medical fields and scientific fields. I mean, it’s life-changing—it’s saved people’s lives, literally, and it has made our lives easier. But is the easiest way the best way? That’s the question we have to ask ourselves, right?” said Quinterno.
Given the potential that generative machine learning offers to educators, leaders, and students, we should not only think about how this technology can assist teachers and learners in what they’re doing now, but also how to ensure that new ways of teaching and learning flourish alongside the applications of AI.
Edited by Andrew Su