Confronting the Elephant in the Classroom

The integration of AI into daily educational practice, while offering many benefits, has ushered in complex ethical dilemmas, particularly concerning academic integrity. A recent survey of higher education instructors unveiled a stark reality: the number one challenge they face today is finding effective strategies to prevent student cheating, a concern that surged 28% from the previous year, overshadowing other pressing issues such as providing timely feedback and adjusting content to meet students' needs.

This blog post aims to shed light on this pressing issue, extending the conversation beyond higher education to include the K-12 education system. The survey data highlights a significant challenge for administrators and coaches when opening conversations with teachers about AI’s role in education.

Our recommendation is to “address the elephant in the room” from the beginning. Acknowledge the concern, talk about the steps the school or district is taking to support teachers, then share strategies that proactively address the issue of academic dishonesty in the AI era.

The Data Speaks: Academic Integrity at the Crossroads

Teachers' alarm over academic dishonesty is not unfounded. The rapid evolution of AI tools has made them readily accessible to students and has provided them new avenues for cheating that are harder to detect and deter. This seismic shift calls for a reevaluation of traditional academic integrity policies and the development of novel approaches that resonate with the digital generation.

However, early research by Denise Pope (Stanford Graduate School of Education, 2023) shows that the percentage of students who cheat, while still too high, has stayed about the same or even decreased slightly since ChatGPT and other AI tools became widely available to students.

While the research doesn’t discuss it, I believe the reason is that, for the first time in a long time, administrators, teachers, and students are having conversations about academic integrity and what cheating looks like in 2024 and beyond.

Fostering This Dialogue Through Active Engagement

One promising strategy for confronting this challenge is to start the dialogue with students through active engagement activities. By engaging students in discussions about what constitutes cheating in the era of AI, teachers can cultivate a culture of integrity and responsibility.

Building on this idea, our coaches developed scenarios to challenge students to debate and define the boundaries of cheating in the context of AI. These scenarios ranged from using AI for creative inspiration and research assistance to leveraging AI for problem-solving in homework assignments. Each scenario prompted students to consider the balance between AI's role and the student's own learning and intellectual contribution.

For example, the following three scenarios are presented to students to debate, discuss, and decide whether the example is “cheating,” rating each from (1) Human Powered to (5) Bot Powered.

Scenario #1: Creative Writing and AI Inspiration

Betty was assigned a creative writing assignment and is struggling to get started. She has a serious case of white page syndrome. She goes to ChatGPT and prompts the Bot to create a story that aligns with the rubric provided by her teacher. The Bot provides a full story. Betty then asks it to regenerate stories multiple times and the Bot provides multiple variations. After she reads them, Betty uses ideas from the various Bot versions and weaves them into her creative writing assignment. Is this plagiarism?

Betty's use of ChatGPT to overcome writer's block by generating story ideas raises fundamental questions about originality and authorship. This scenario prompts students to consider the extent to which integrating AI-generated content into one's work compromises academic integrity. The discussions reveal a spectrum of opinions, mirroring the diverse perspectives in the ongoing debate about AI's role in creative processes.

Scenario #2: Research Assistance from AI

James was assigned a research paper at the beginning of the semester. He used ChatGPT to brainstorm topics to formulate his thesis. He then spent the last few months researching and writing his paper. Now that he has a final draft, he returns to ChatGPT to have the Bot check his paper for spelling, grammar, and logic. The Bot makes suggested changes which James implements. He submits the paper to his teacher. Is this plagiarism?

James's approach to using ChatGPT for brainstorming and editing his research paper underscores the nuanced ways AI can support academic work. This example challenges students to think about the transparency and disclosure of AI use in academic settings, pushing them to consider where the line between legitimate assistance and academic dishonesty lies.

Scenario #3: Learning and Problem-Solving with AI

Fred has 20 problems from his math class. He tries the first one and is totally confused, so he uses Photomath to solve it. Once he sees the worked solution, he understands the concept and does a few more problems on his own. Fred feels prepared for tomorrow’s quiz, so he uses ChatGPT to solve the rest of the problems. Is this cheating?

Fred's reliance on Photomath and ChatGPT to understand and complete math problems highlights the potential of AI as a learning tool. Yet, it also brings to the forefront the critical issue of dependency on technology for problem-solving, urging students to reflect on the balance between using AI for learning enhancement and ensuring the authenticity of their knowledge and skills.

Through these exercises, it became evident that the ethical implications of AI in our schools are multifaceted and evolving. The discussions did not always yield clear-cut answers but did foster critical conversations on the ethical use of AI tools.

The Way Forward

As educational leaders and instructional coaches grapple with the challenges posed by AI, it is crucial to recognize the importance of addressing academic dishonesty head-on. The development of professional learning programs that start with the "elephant in the room" can empower teachers at all levels to navigate the complexities of AI in education.

By examining these examples, and by having teachers create their own to meet their specific needs, schools can begin to lay the groundwork for responsible and ethical AI use in students’ academic pursuits.

Call to Action

In facing the challenges and opportunities presented by AI, it is clear that a collective effort is needed. We invite teachers, instructional coaches, and school leaders to engage in this critical conversation and explore strategies to promote academic integrity in the AI age. For further information, support, or collaboration, do not hesitate to contact Edvative Learning. Together, we can navigate this new landscape and ensure that our student practices remain rooted in integrity, ethics, and lifelong learning.
