Is it OK to Use AI on Your College Homework?

21st January 2025

Students have incredibly packed schedules.

In my classes, I often assign a significant amount of writing, so I can understand why they might feel drawn to using AI tools like ChatGPT to complete their homework.

Using someone else's work without giving credit is plagiarism, but what about using AI? Would a professor recognize that as cheating?

To address this underlying concern, I want to speak directly to students who might consider submitting AI-generated work for their homework.

As a professional writer, I can spot differences in writing styles. When a piece of writing feels mechanical or lacks the personal touch, it stands out.

Writing style is deeply personal; it reflects who you are as an individual. Your work has a unique voice that makes it distinctly yours. AI-generated content, on the other hand, lacks this individuality. It’s clear when your personal voice is missing from your writing.

With its polished vocabulary and strong command of grammar, AI excels at producing content that sounds impressive but often lacks substance. I can quickly recognize when writing is technically correct yet feels hollow.

That being said, as AI tools improve and students become more adept at using them, identifying AI-generated content will likely become more challenging, both in my classroom and, for those of you training to become educators, in yours.

However, think about this: writing is an essential part of the thinking process. If you bypass that process, what are you truly learning?

In response to a Gemini ad encouraging parents to use generative AI to help their kids write fan letters, Alexandra Petri, a humor columnist for The Washington Post, set aside her usual tone to ask a serious question: “Do you know what writing is?”

Her answer is profound:

“Writing is thinking in a form you can share with others. It is a way to take thoughts, ideas, and stories from your mind and communicate them to someone else. As E.M. Forster once quoted, ‘How can I tell what I think until I see what I say?’ To remove the ability to write for yourself is to strip away the ability to think for yourself.”

She’s absolutely correct.

Writing isn’t just about completing homework; it’s a skill that improves with practice. If my class isn’t helping you become a better writer, then it’s falling short, not just for your academic growth, but also for your development as a future professional, educator, or whatever path you choose in life.

Currently, tools like ChatGPT and other AI models are free because they’re learning from you. Every time you use them, you’re essentially volunteering your time and knowledge to help train their systems.

These AI tools won’t stay free forever. What will you do when access comes with a cost?

More importantly, how will you handle real-life situations where there’s no AI to rely on? Writing is more than just putting words on paper; it’s a form of thinking that prepares you for real-time problem-solving. Every time you write, you’re practicing for moments when you need to adapt and think on your feet, whether in your career or daily life.

AI companies can afford to offer their tools for free because they rely on a foundation built through unauthorized use of others' work. Developers of large-language AI models scrape text from the internet and digital sources without permission. This includes copyrighted material.

In the fall of 2023, I discovered that my biography of children’s authors Crockett Johnson and Ruth Krauss, a book that took me over ten years to write, was part of the dataset used to train these AI systems. It’s possible that some of my other works were used as well, though I have only confirmed this particular book through a dataset exposed by The Atlantic. There are other datasets I have not had access to.

I never agreed to let my work be used in this way, nor have I received any form of compensation for the labor and effort that was taken without my consent.

So, here’s a moral question for anyone considering the use of large-language AI: Are you okay using a tool that is built on the unapproved use of other people’s hard work?

As Brian Calvert highlighted in an article for Vox earlier this year, “OpenAI’s GPT-3, for instance, consumes nearly 1,300 megawatt-hours (MWh) of electricity annually, equivalent to the energy used by about 130 homes in the United States.”

He also noted, “If ChatGPT were integrated into the 9 billion searches conducted daily, the IEA [International Energy Agency] estimates electricity demand would rise by 10 terawatt-hours per year—matching the consumption of around 1.5 million European Union residents.”

The environmental impact of AI extends beyond energy use. Cooling the servers that power these systems requires significant amounts of water. The production of hardware for AI systems contributes to water pollution through the mining of materials like silicon, germanium, gallium, boron, and phosphorus.

I am not opposed to technological progress. My mother worked as a computer programmer in the 1960s and later spent most of her career teaching computer science. I’ve been using computers since 1979, starting with the TRS-80, where I learned to program in BASIC. In 1997, I launched my first website.

As someone who embraced modern computing early on, I admire the advancements represented by generative AI. I’m sure my mother, while disapproving of her son’s work being used without permission, would find large-language AI fascinating. She would undoubtedly have deeper insights on this subject than I can offer.

But humans are more than information processors.

We ask questions. We imagine. We create.

We feel joy, endure pain, and strive for justice. We find meaning in a world that often feels confusing.

I recognize that AI is here to stay, and some of you may need to use it in your future jobs. It excels in certain areas: grammar assistance, simplifying complex legal texts, and helping individuals with learning disabilities understand challenging material. These are examples of its positive contributions.

However, AI cannot think. It mimics, borrowing ideas, often without credit, which amounts to plagiarism, but it has no genuine thought of its own.

In my classroom, I want you to think. My goal is to understand what you’re learning and the questions you’re grappling with. College is about expanding your mind, not outsourcing that growth.

This is why, unless specifically assigned, I ask you to rely on your own intellect rather than AI’s imitation of it.

Humans are not machines. We dream, create, and connect in ways that AI cannot. These abilities help us navigate challenges and embrace life’s joys.

So, write.

Develop your unique voice. Let it be a reflection of your thoughts and ideas, not just a shortcut to finish your homework.
