News Editor edits an article written by AI

The following article discusses AI regulations at UCCS, covering new rules for AI usage on campus as well as student and faculty perspectives on AI.

There’s only one catch: it was written by ChatGPT. 

Given rising concern about AI stealing our jobs, I was curious to see how well AI could do mine. I presented ChatGPT with the challenge to “write an article for UCCS student newspaper The Scribe about AI regulations on campus in the style of Tori Smith.” This is the article it wrote with my edits in red: 

The Growing Role of AI at UCCS: Balancing Innovation with Regulation 

Artificial Intelligence (AI) is no longer a futuristic concept; it’s embedded in our daily lives, from ChatGPT writing essays to facial recognition unlocking phones. At UCCS, AI tools have sparked conversations among students and faculty about their potential to enhance learning — and the challenges of ensuring fair and ethical use. 

The University of Colorado Colorado Springs has embraced AI’s possibilities in the classroom, but with its benefits come necessary regulations. “AI is a tool, not a substitute for critical thinking,” said Dr. Sarah Keller, an English professor. Dr. Keller has encouraged students to use AI tools like Grammarly or ChatGPT to brainstorm ideas but warns against relying on them for final submissions. “Learning happens in the struggle, not the shortcut,” she emphasized. 

  • Other than that, the second quote from Keller tightly sums up her argument and captures a strong voice. It’s a shame it’s not real. 

UCCS recently updated its academic integrity policy to address AI tools explicitly. The policy now clarifies that using generative AI without proper citation constitutes academic dishonesty. Students caught submitting AI-generated work as their own face penalties ranging from assignment failure to disciplinary action. 

For students like engineering sophomore Marcus Kim, these regulations feel necessary. “AI is amazing, but it’s like using a calculator in math class — you still need to show your work,” Kim said. However, not all students share his enthusiasm for these rules. Political science major Lila Ortiz expressed concerns about the policy’s vagueness. “It’s hard to know where the line is. Can I use AI to edit my grammar, or is that cheating?” 

To address these concerns, UCCS is hosting workshops to educate students and faculty on appropriate AI use. The sessions focus on identifying when and how to integrate AI ethically into coursework. A recent session, led by the UCCS Ethics Center, attracted over 100 participants, highlighting the campus community’s interest in navigating this evolving landscape. 

Beyond academics, AI’s presence is expanding across campus operations. The Office of Information Technology (OIT) has implemented AI-driven systems to streamline tasks like tech support and network monitoring. These initiatives aim to reduce response times and enhance efficiency, but they also raise questions about data privacy. “We prioritize student data protection and transparency in all AI integrations,” said an OIT spokesperson. 

The debate over AI at UCCS reflects broader societal conversations. As technology evolves, so too must our understanding of its implications. UCCS is taking a proactive approach, encouraging dialogue and education while implementing guardrails to ensure AI is a tool for growth, not a crutch. 

What remains clear is that the future of AI at UCCS — and beyond — is up to us. It’s not just about how we regulate technology but how we adapt and grow alongside it. 

Final Thoughts 

OK, we’re done letting the robot talk now. First of all, this article was … kind of good. It’s organized well, the quotes are strong and, aside from a few errors, the writing follows journalistic style. 

Here’s why I feel OK about my job and the future of journalism in general. After I read the article a couple more times, I realized it actually wasn’t as good as I thought. The problem with the article is that there’s no attribution for the information it gives, which makes sense, since almost none of it is true. 

For ChatGPT to write me a good news article, I would still have to go out and conduct interviews and do my own research, and then feed it the information I want it to report on. If UCCS were implementing a new academic integrity policy, and there were not yet any available reporting on that policy, I would have to tell ChatGPT about the policy before it could write anything. 

AI might be a useful tool in journalism and writing in general when it comes to getting help with wording or organization. But when it comes to content, in its current state, a human being is still necessary to do a lot of the heavy lifting. So, I shall sleep peacefully tonight knowing that ChatGPT will not be the next news editor for The Scribe. 

Photo courtesy of AI Studios.