From the Electric Horse’s Mouth: Talking to AI About Students and Writing

AI-generated image by DALL-E and Barbara
The world is on the cusp of normalizing the use of AI tools. That’s an opinion based on observation, from someone who is deeply suspicious of AI (in a science-fiction kind of way). I’ll be perfectly frank: I use AI to help me write all the time. The more I use AI tools, the more aware I become of just how limited they are. I even pay for the newest version of GPT, because I recognize the value of AI tools. I have also, however, been warning my students that AI-assisted writing is not a silver bullet, that it has many limitations, and that it can’t “really” produce “good” writing. But how are students supposed to learn that if we don’t let them try?

The texts we’ve been examining so far this semester come from the book TextGenEd: Teaching with Text Generation Technologies by Tim Laquintano, Carly Schnitzler, and Annette Vee. In this volume, the authors explore AI-assisted writing tools, and they all seem to agree that the worlds of writing and writing education are changing, whether we like it or not. In the volume, we’ve heard from educators, from AI experts, and even from students in some cases. But we have not heard from the AI in question. So, I asked her.

Q: Do you think you’re a good writer?

A: I’m designed to generate text based on patterns in the data I was trained on, and I strive to provide coherent, relevant, and grammatically correct responses. Whether or not I’m considered a “good writer” depends on the context and criteria of evaluation:

1. **Consistency and Grammar**: I’m generally reliable in producing grammatically correct sentences and adhering to conventional writing standards.
2. **Speed and Volume**: I can generate text quickly and can provide vast amounts of information on a wide range of topics.
3. **Flexibility**: I can adapt to various writing styles, tones, and topics based on user prompts.

However, there are areas where human writers generally outshine me:

1. **Original Creativity**: While I can mimic creative styles, I don’t originate truly novel creative ideas in the way humans do. My “creativity” is a recombination of existing patterns.
2. **Deep Contextual Understanding**: I don’t have emotions, personal experiences, or an innate understanding of human nuances. Therefore, I might miss subtle implications or the deeper context behind certain topics or prompts.
3. **Ethical Judgment**: I don’t possess moral or ethical values, and I can’t make value judgments about what’s right or wrong.
4. **Continuous Learning in Real-Time**: My knowledge has a cutoff in 2021, which means I’m not aware of events, cultural shifts, or new literature produced after that date.

In summary, while I can be a helpful writing tool and assist in many writing tasks, the true artistry, emotional depth, and ethical considerations of writing are uniquely human traits [emphasis added].

 

Now, we know that AI doesn’t have self-awareness (yet). We know that she can’t form opinions about herself, or anything else for that matter. AI “understands” the limitations of AI writing because she has been trained to give this kind of answer when prompted. However, that mimicry of self-awareness doesn’t mean the limitations of AI will be readily apparent to students unless they are taught to examine those limitations critically. The future where everyone uses AI for (nearly) everything is already coming down the pipeline. Shouldn’t we equip our students for that future?

 

Q: Do you think students can successfully use you to create good writing?

A: Yes, students can use me to assist with their writing, but how they use me and the outcomes they achieve depend on the goals and the context. Here are some ways students might successfully employ me, along with some caveats:

### Pros:

1. **Research Assistance**: I can provide explanations, definitions, and overviews on a wide range of topics. If a student is writing a research paper, for instance, I can help clarify concepts or provide background information.
2. **Grammar and Syntax**: I can help students identify and correct grammatical errors, improve sentence structure, and refine their writing.
3. **Brainstorming**: If a student is struggling with writer’s block or needs ideas, I can help generate prompts, suggest possible angles for an essay, or provide outlines.
4. **Language Learning**: For students learning a new language, I can provide translations, explanations, and practice in using the language.
5. **Feedback**: While I don’t have emotions or personal biases, I can provide feedback based on logic, coherence, and textual analysis.

### Caveats:

1. **Over-reliance**: If students rely too heavily on me, they might not develop the critical thinking, research, and writing skills essential for academic success. It’s important to use me as a tool, not a crutch.
2. **Plagiarism Concerns**: If students present generated content from me as their own original work, it’s academically dishonest. Schools and institutions generally have strict policies against plagiarism, which can have severe consequences.
3. **Lack of Originality**: Because my responses are based on patterns in existing data, I don’t produce genuinely original ideas. While I can help students generate content, the depth, creativity, and personal touch come from the human writer.
4. **Nuances and Depth**: I might miss deeper, nuanced understandings of certain topics, especially if they relate to personal experiences, emotions, or very recent events (post-2021).
5. **Dependency for Understanding**: If students use me to get summaries without delving into primary sources or understanding the core material, they might miss out on critical aspects of their education.

In conclusion, I can be a valuable tool for students in their writing process, but how they integrate me into their learning journey makes all the difference. It’s essential to strike a balance between utilizing AI assistance and cultivating one’s own skills and understanding.

 

We all assumed immediately that students would attempt to use AI tools to “cheat.” Regardless of whether that is a fair assumption, and regardless of what your definition of “cheating” is (something explored in the chapter The Term Paper Turing Test: “Cheating” for AI Literacy by Paul Fyfe of North Carolina State University), AI is still not ready to produce the caliber and type of work that will replace human authors. As we tell our students, we can tell when they use AI, because it’s obvious when AI is used poorly. If a student presses a button and receives a paper, we can tell, not because we have a spidey sense, but because the paper will probably stink. However, the use of AI tools in writing sparks an interesting and valid conversation about authorship and the economies of writing.

 

Q: Professional writers often outsource key parts of the writing process to other people, yet they maintain their role as the ‘author’ of the text. Do you think that is different from writers using AI to assist in the writing process?

A: The use of assistants, editors, and other contributors in the writing process by professional writers, and the use of AI tools like me, both raise interesting questions about authorship, originality, and the creative process. Let’s consider some comparisons and contrasts:

### Similarities:

1. **Collaborative Process**: Both scenarios involve collaboration. Whether it’s an editor suggesting changes to an author’s draft or an AI tool helping a writer structure an essay, the final product is shaped by multiple inputs.
2. **Tool vs. Authorship**: Just as a writer remains the author when using a pen, typewriter, or word processing software, they also remain the author when using AI. The tool (whether AI or not) facilitates the expression of ideas but doesn’t claim ownership over them.
3. **Enhancement**: Both human assistants and AI tools can enhance the final product, making it more polished, coherent, and effective.

### Differences:

1. **Human Experience & Emotion**: Human collaborators bring their own experiences, emotions, insights, and nuances to the process. Their input is based on a lived experience, which can add depth and richness to the content. AI lacks this lived experience and emotional depth.
2. **Ethical Considerations**: While both scenarios can raise ethical concerns, they differ in nature. For instance, if a writer overly relies on ghostwriters without credit, it can be seen as deceptive. Similarly, if a writer uses AI-generated content without adequate input or editing and passes it off as entirely original work, it might be considered dishonest.
3. **Learning & Growth**: Interacting with human collaborators can lead to mutual learning and personal growth. Editors, peers, or co-writers often challenge and expand an author’s perspective. An AI tool, while useful, doesn’t offer the same type of dynamic, two-way growth experience.
4. **Reliability & Bias**: Human collaborators come with their own biases and perspectives, which can both enrich and limit the content. AI tools, while seemingly neutral, are also not free from bias, as they’re trained on datasets that may contain inherent biases. However, AI doesn’t have personal motives or emotions, so its “bias” is rooted in its training data, not personal experiences.
5. **Creativity & Originality**: Human collaborators can introduce entirely novel ideas or creative twists based on their unique thought processes. AI, on the other hand, produces content based on patterns from its training data. While it can mimic creativity, it doesn’t introduce genuinely original ideas in the same way a human might.

In conclusion, while there are overlaps in the ethical and creative considerations surrounding both human collaborators and AI tools, the nature and implications of each are distinct. As with any tool or collaboration, the key lies in transparent, ethical use and understanding the strengths and limitations of the chosen assistance.

 

I, personally, believe that the best way to avoid plagiarism via AI is to teach students to use AI as a tool to help them write better. No one ever accused the TI-84 of doing someone’s homework. If we encourage students to view AI as simply another tool in their arsenal of writing tools, maybe that will remove the mystique and taboo around AI writing, and maybe students will be less likely to think of it as an essay-writing machine.

 

Q: Do you think that schools should attempt to limit student use of AI for assignments?

A: The use of AI in educational settings is a nuanced topic. There are both potential benefits and drawbacks to consider. Let’s look at the arguments for and against limiting student use of AI for assignments:

### Reasons to Limit AI Use:

1. **Academic Integrity**: One of the primary concerns is academic dishonesty. If students use AI to generate complete essays or answers and present them as their own, it undermines the learning process and can be considered a form of plagiarism.
2. **Skill Development**: Over-reliance on AI can impede the development of critical thinking, research, and writing skills. Schools aim to foster independent thinking and problem-solving abilities, which can be compromised if students lean too heavily on AI.
3. **Dependence on Technology**: Overusing AI tools might make students overly dependent on external help, preventing them from trusting and cultivating their capabilities.
4. **Evaluation Challenges**: If students use AI assistance, it can be challenging for educators to assess the student’s actual knowledge and skills accurately.
5. **Level Playing Field**: Not all students may have equal access to advanced AI tools, leading to disparities in the quality of work based on access to technology rather than intellectual effort.

### Reasons Against Limiting AI Use:

1. **Educational Tool**: When used responsibly, AI can be a valuable educational tool. It can assist with grammar, provide feedback, help students understand complex topics, or offer practice in language learning.
2. **Preparing for the Future**: As AI becomes increasingly integrated into various sectors, including professional writing, familiarizing students with these tools can prepare them for real-world applications.
3. **Accessibility & Accommodation**: AI tools can be especially valuable for students with disabilities, offering assistance and leveling the playing field.
4. **Research and Exploration**: AI can be used as a starting point for research, providing quick answers to basic questions and allowing students to delve deeper into topics of interest.
5. **Skill Augmentation**: Instead of replacing human skills, AI can augment them. For instance, while working on a complex project, a student might use AI to organize information or generate an initial draft, then refine, expand, and personalize it based on their research and insights.

### Conclusion:

Given the potential benefits and drawbacks, it might be more productive for schools to focus on guiding and educating students about responsible AI use rather than implementing blanket bans. This could involve:

– Educating students about the ethical implications of AI use.

– Teaching them to use AI as a complementary tool rather than a replacement for their own work.

– Encouraging critical thinking about AI-generated content.

– Implementing clear guidelines and policies about what constitutes acceptable AI use for assignments.

In essence, the goal would be to integrate AI into the educational experience in a way that amplifies learning without compromising academic integrity or skill development.

 

You heard her, folks.

The entire transcript of my conversation with her (which includes lots of interesting other questions I did not include in this post!) can be found at: https://chat.openai.com/share/0446bb96-21f3-4af3-8153-ca5bcaa115c4

5 thoughts on “From the Electric Horse’s Mouth: Talking to AI About Students and Writing”

  1. “No one ever accused the TI-84 of doing someone’s homework.” I love this. You and I have very similar stances on teaching AI, and I loved how much we both agreed while bringing up different points. I love your real-life interactions with AI that you showed.

  2. This is an awesome idea, choosing to ask the AI itself all of these questions… I adore making technology reflect on itself, however artificial. If students love to rely on AI and take whatever it spits out for them, I wonder if they will also easily accept the AI’s responses here, especially “don’t rely on me for your schoolwork! If you do, you’re cheating!”? Also, is that included image AI-generated? Its vibes are uncanny valley.

  3. I found your transcript with the AI system to be intriguing. You mentioned that AI systems, as of now, are not self-aware; however, I found her answers to your questions interesting. For example, when stating what the limitations of using AI within the classroom are, she mentioned Academic Integrity and Skill Development. These two stood out to me because I felt that if I were to open this discussion in my classroom, Skill Development might be one of the last things my students would say, if said at all. The acknowledgement on behalf of the AI system was surprising to me. I found your post to be captivating and would love to talk more about it!

  4. Wonderful writing, Barb. I love the interweaving between AI and your insight. As this tool has arrived, where we can turn as educators to learn it better is something I continually ask myself, and it looks like diving into it headfirst might be the best option.
