Over the last two years, AI has progressively appeared more in my life at school. Of course I had heard of AI before ChatGPT, but ChatGPT was the talk of the town when I started the master’s program. Rumors of students using it to write entire papers were new to me, and entirely shocking. The discussions about ChatGPT and other AI systems in the classroom are never ending. The readings and the discourse that followed them sparked many questions for me. The questions I always come back to are: if AI keeps progressing, can limitations be set in the classroom? And can we teach students to use AI as a tool in coordination with their current knowledge rather than relying only on the knowledge that AI has?
It is inevitable that the paths of AI and academia will continue to merge and force each other to grow. As teachers, it is important to remember that having opinions about AI is okay, but we are here to move into the future with our students. There may never again be a world without AI, at least not in the capacity it exists now. A quote from TextGenEd: An Introduction to Teaching with Text Generation Technologies that helps me keep an open mind about AI is, “While we consider these paths forward, writing instructors must confront our own investments and biases in this future of AI and writing”. In my mind, the main thing teachers can do to set limitations on AI in the classroom is to create an environment where students learn about AI assistance and how it can both benefit and derail their projects.
An example of an instructor creating a setting that promoted learning about AI and how to use it without misusing it is the article Professional Writing for Healthcare: Writing & Revising Research Summaries with Artificial Intelligence by Heidi McKee. McKee constructed a project for her students built around comparing AI and human ability. The students were to find a peer-reviewed research article on which to base their project. The project consisted of five steps: (1) writing a summary in their own words, (2) inputting the research article into AI systems to get an AI-generated summary, (3) writing a reflection on both their own writing and the AI writing, (4) drafting a revised summary that drew on both their writing and the AI writing, and finally (5) annotating the revised draft to highlight the parts that came from the AI summaries. Through this project, McKee allowed her students to explore the outputs of AI while also learning the limits of what AI can and should do.
So where does this leave us as students and teachers? The student-teacher relationship can be mirrored in the saying “strict parents create sneaky children.” A teacher who tells students that there are no exceptions and they cannot use AI systems creates sneaky students who learn ways around the rule.
Excellent use of the word “sneaky” here. It rings so true that outlawing this thing will do us no good, and it won’t stop students from misusing it. Showing the limitations of AI directly to students can reveal its weaknesses and hopefully create value in crafting the written word.
‘Strict parents creating sneaky children’ is exactly right. Some argue that using AI is cheating on homework. I argue that, if a student can easily do their homework using AI, it was probably a lazy homework assignment to begin with.
I love the strict parents quote; it really homes in on things I also talked about in my blog post. The idea of students trusting you as an instructor has come to the forefront of my mind. I think students will not see the limitations of AI if we do not show them, or if we only take a strong stance against it. Great explanation here.
I definitely agree with the sentiment that if we don’t teach students how to use AI, they’re just going to use it to cheat. If we want them to do any amount of human writing, then we need to teach them the limitations of AI.