
The AI Dilemma

A Guide for College Writing Students

J.T. Bushnell

I was still a pretty young teacher when I first ran into the idea of “BS jobs” and broke out in a cold sweat.

The idea came from an anthropologist named David Graeber, first in a viral article and then in a book, both of which use the full expletive in their titles. He argued that most modern jobs were completely worthless, and that we could distinguish them from necessary jobs by imagining the impact on society if they disappeared.

No garbage collectors, dock workers, doctors? “The results would be immediate and catastrophic,” Graeber wrote in STRIKE! magazine. But no private equity CEOs, PR researchers, legal consultants? “It’s not entirely clear how humanity would suffer. Many suspect it might markedly improve.”

At the time, I wasn’t sure which category “writing instructor” belonged to. From a certain viewpoint, you could say that I invented unnecessary work for my students, who then invented unnecessary work for me, like interlocking cogs in a big useless machine. If we stopped, would any immediate catastrophe really ensue?

I had answers, but they were abstractions that slipped through my fingers. I couldn’t hold them like a farmer held an ear of corn, ripping open the husk for anyone who doubted.

Then generative artificial intelligence like ChatGPT came along, and suddenly, the question was real. People were actually entertaining the idea that writing classes were worthless, a waste of time, a BS occupation on par with telemarketing. Many of them were students, including my own. You might be one such student.

And so I think it’s time to confront this question, to talk openly and honestly about it together, and to test out the alternative – relegating our writing to AI. Graeber’s question might help. If you and I let AI do all our writing, would nothing change? Would it have no impact on society? Or is there something important happening when you write, and when you learn to write? Would we lose something if we stopped?

What does writing give you?

A Harvard researcher named Emily Weinstein told a story on NPR that might help clarify these questions. She was talking to a group of teenagers about their opinions on AI, and one teenager said that his friend’s dad had used AI to write a birthday card for his wife. The wife called it the best card he’d ever given her and was even moved to tears.

But here’s the crazy part, at least to me. The teenager who told the story thought it was a great example of AI’s value. It showed why AI was a useful tool for writers. The other teenagers mostly agreed. Weinstein, the researcher, said that her reaction, like mine, was a little different, and she wanted to understand their viewpoint better. A different teenager explained, “Isn’t the intention what matters? Like, if he brings you flowers from a flower store, he didn’t grow the flowers or pick them. And you still think it’s really nice. So, isn’t that kind of like what’s happening here?”

Let me pause here to admit that I use AI, mostly for two things. One is cooking. When you have to put dinner on the table for your family, night after night, sometimes it helps to get new ideas or instructions. I don’t feel guilty about it. I care about putting dinner on the table. It’s the product that matters to me, not the process.

The other place I use AI is writing that I don’t find particularly rewarding or valuable – bureaucratic forms, simple contracts, multiple-choice questions for reading quizzes. I do a lot of reviewing and adjusting in all of these, but it’s the same as cooking – I care only about the outcome because there’s nothing enriching about the process. Even for someone who enjoys writing, like me, it’s just drudgery.

For some people, that’s what all writing feels like: drudgery. They only want to arrive at the outcome, the meal, the document, as quickly and easily as possible. For these people, I’d imagine it seems perfectly acceptable to use AI to write a wife’s birthday card, especially if the card is so effective.

But I wonder how the wife would feel if she knew that the message came from AI. Would she see it as a bouquet from the florist? Or like an exam her husband had cheated on? How would you feel?

Almost universally, my students have had the same reaction that Weinstein and I did – they were disturbed. They thought it was a great example of AI’s hollowness, not its value. To them, it showed why AI writing is upsetting, even when it’s effective, and how two identical messages can have totally different meanings based on whether they came from a human or a bot.

They also discovered that this distinction exists on the production side. Writing with AI, they said, felt boring and meaningless. It was hard to care about, or even remember – an obvious enough fact for anyone who has used AI this way, but one that’s now backed by research from scholars at MIT and Wellesley. Compared to participants who used only their brains, those who wrote essays with AI showed less satisfaction, ownership, memory, and even neural connectivity.

For some of us, writing can be meaningful, interesting, even enjoyable. It allows us to give expression to something inside ourselves, to discover what we want to say and then say it. It allows us to think things through, pursue a solution to problems we aren’t sure how to handle, exercise our cognitive skill. It’s a way to connect with other people, to be understood, to understand ourselves and the world around us. In these situations, we’re interested in more than just the destination. We’re interested in the journey, because that journey fulfills something deeply human in us. For those reasons, we think it’s worth it to do the meaningful work ourselves, even if AI can spit out a document as good as ours, or better.

The question we’re all struggling with is where to draw the line. Teachers usually see their coursework as important, enriching, even interesting. Students, on the other hand, sometimes see it as drudgery. When each side draws the line at a different location, they’re bound to disagree about the value of using AI to complete coursework.

But this considers writing only from the perspective of the writer. What about the reader? Do they only want the bouquet? Or do they care whether you grew the flowers yourself? Which one matters more to you when you’re reading, or deciding who to work with, or figuring out who to love and trust?

AI writing techniques

When we talk about using AI in writing, most people think of one technique: you tell AI what you want and let it write the whole thing. The risk is that people might know the writing came from AI, and have a negative reaction. That includes teachers, but also the bosses or colleagues or clients who don’t appreciate bots in their inboxes. In fact, using AI can damage your professional reputation by lowering people’s opinion of your competence and motivation, according to a study from Duke researchers.

And so a lot of people have moved on to a second technique: you ask AI for the ideas, then put them in your own words. Because it’s your writing, you’re less likely to get caught, but the AI is still doing all the “thinking.” The appeal of this method, like the first, is that you reach the finish line with less time and effort. You don’t have to exercise that organ in your head very hard. Some people call that efficiency, and they’re not wrong.

But that’s also the downside of both these methods. Exercise makes you stronger. For some of us, that’s the whole reason we go to college – to learn, to improve, to develop skills that will help us in our careers and personal lives. AI seems to have the opposite effect, according to a growing body of research, reducing critical thinking, originality, and confidence. In other words, using AI makes it harder to think for yourself. That’s scary. And so some of us are asking if we really want to sacrifice exercise for efficiency, especially in college.

So here’s a third technique. In almost every kind of exercise, it helps to have someone pushing and guiding you – but not helping with the work. If they start lifting your weights, or offer to run the last lap on your behalf, you’d probably tell them to back off. Can we use AI that way?

To answer that question, it might help to consider the phases of the writing process where you’d ask a human for help. When you’re looking for a topic or deciding what to say about it, you might talk to your teacher, or even run some initial ideas by them. You’re not asking them to choose your topic or provide your ideas, just to help guide your thinking in a productive direction. You might ask a librarian for good places to track down information, but it would probably never occur to you to ask the librarian to bring the information, let alone to read and digest it for you.

Maybe you give your rough draft to a classmate for peer review or take it to a tutor at the writing center, pointing out the parts you’re not sure about or find a little shaky. Their fresh eyes can help you see things more clearly, find improvements, even catch typos. But you’d never say, “Rewrite this whole thing for me, better.” Most of us would be embarrassed to ask another human to do our writing. That embarrassment comes from the implicit understanding that we’re crossing a line, and that’s probably a helpful guidepost about making the same request to AI.

In other words, the one phase where we can’t really use AI this way is the drafting. But I will tell you, personally, that sometimes I get stuck in this phase just because the blank page feels so silent and important that I freeze up. To me, the AI interface feels less silent, less important, and I’ve typed whole paragraphs there that I afterward paste into my document. The point isn’t to have the AI take it over, just to listen without judgment, which can help loosen the flow of ideas. I’m still the one trying to explain myself, and the writing is completely my own.

That’s the guiding principle in this third technique: you always start by doing your best to explain yourself, using the AI as a dialogue partner afterward to help push your thinking. If at any point it starts doing too much, you tell it to back off. You have to decide, I’m the writer, I’m the boss, I’m calling the shots here, and the AI is only an assistant, a collaborator, a subordinate.

Preliminary research suggests that this technique can counteract AI’s worst effects on learning. In a study I led, it restored the creativity of top fiction writing students, which had been diminished by more passive uses of AI, and it significantly improved the creativity of lower-level students. Likewise, research from Wharton has shown that when students use AI in a similarly proactive way to study, their test scores are as high as those who study without AI, whereas unregulated AI use produces a 17 percent drop in test scores.

That’s not to say this method is without risk. Once you free the genie from the bottle, will you really be able to resist the temptation of its magic? Will the risk be worth it if your brainpower alone can produce equivalent results? Will there be any side effects, like the loneliness and dependence suggested by other research? Would it be better just to ask the teacher, or the librarian, or the classmate for help?

The AI future

In college my roommate and I were both communication majors, and after college we both found ourselves unemployed, then underemployed – he delivered newspapers and I waited tables. But not for the same reason. For him, the point of college had been to earn a degree, which he thought of as a passport to a good job. Then he carried it to job openings where there were a hundred other applicants with the same degree, and he had nothing to set himself apart.

I was unemployed because I’d turned down a job offer to pursue my real passion. It was an excruciating decision, and I never would’ve had the judgment or courage for it if I hadn’t just spent four years exercising those exact qualities, or building confidence in my ability to learn. The main way I did that was by writing.

In fact, writing has proven to impart many of the skills that employers now value most – critical thinking, problem-solving, communication, adaptability. Those skills are probably the reason my roommate eventually found a career in the wine industry, moving through positions that had nothing to do with his degree.

But even if you have a career in the field you studied, the technical training you get in college is eventually going to become outdated. What will you do to keep up with professional, technological, and cultural shifts we can’t predict, like the ones we’re experiencing right now from AI? You’ll need to be able to think, to solve problems, to communicate, and to adapt. Writing helps impart those skills. That’s probably why liberal arts majors eventually end up with higher salaries than engineering majors, according to surveys from the Census Bureau.

Maybe the job market won’t be as bleak for today’s students as it was for my roommate, but I wouldn’t count on it. Multiple media outlets have been reporting a sharp decline in entry-level jobs recently. The reason? Employers are enlisting AI to accomplish the tasks that they once hired recent graduates for, and so it’s possible you might find yourself in the same situation my roommate did. Casually offloading your work to AI won’t give you a leg up on other applicants. They can do that just as easily as you can.

Besides, any job that can be accomplished that way is likely to disappear soon. Maybe Graeber would be pleased. If that’s how people approach writing, I suppose it is a BS job – and so is the rest of college. Imagine for a moment that the whole institution becomes a big AI factory, where students do nothing except paste assignment prompts into AI and then paste AI output into their assignments. And they take on crippling debt to spend four years of their lives on this work: pasting text. Does this have any shred of value to humanity? Or is it a fever dream of the worthless work Graeber describes?

For some students, it’s no dream at all – it’s their everyday waking reality. We teachers can’t force you to do your own thinking and writing anymore. AI has given you a choice.

What will you choose? What will you base your choice on? How much thought will you put into it? All we can do is ask you the questions. You have to answer for yourself.

Discussion Questions

  1. Because of the rapid advancement of AI technology, this reading might become outdated quickly. What portions seem outdated already, and what portions still seem relevant?
  2. Have you ever received or encountered AI-generated writing? What effect did it have on you and your opinion of the message?
  3. What effect do you think it would have on our society to let AI do all our writing for us? Do you think AI makes it irrelevant to learn writing, or more relevant?
  4. Do you think there’s a place for AI in the writing process? What technique(s) have you used in the past? How did it feel compared to the writing you’ve done on your own?
  5. Where do you think AI should be prohibited, allowed, or required for you and for your classmates? What advice would you give me for how to handle AI as I reshape future versions of this course?

Works Cited

Bastani, Hamsa, et al. “Generative AI Can Harm Learning.” The Wharton School Research Paper Series, The Wharton School, University of Pennsylvania, 18 July 2024, papers.ssrn.com/sol3/papers.cfm?abstract_id=4895486.

Bushnell, J.T., and Wayne Harrison. “A New Muse: How Guided AI Use Impacts Creativity in Online Creative Writing Courses.” Oregon State Ecampus Research Unit, Oregon State University, Apr. 2025, ecampus.oregonstate.edu/research/wp-content/uploads/Bushnell-Harrison-2025.White-paper.pdf.

Deming, David. “In the Salary Race, Engineers Sprint but English Majors Endure.” The New York Times, 20 Sept. 2019, https://www.nytimes.com/2019/09/20/business/liberal-arts-stem-salaries.html.

Fang, Cathy Mengying, et al. “How AI and Human Behaviors Shape Psychosocial Effects of Chatbot Use: A Longitudinal Controlled Study.” MIT Media Lab, Massachusetts Institute of Technology, 25 Mar. 2025, www.media.mit.edu/publications/how-ai-and-human-behaviors-shape-psychosocial-effects-of-chatbot-use-a-longitudinal-controlled-study/.

Graeber, David. “On the Phenomenon of Bullshit Jobs: A Work Rant.” STRIKE! Magazine, Aug. 2013, https://strikemag.org/bullshit-jobs.

Kosmyna, Nataliya, et al. “Your Brain on ChatGPT: Accumulation of Cognitive Debt When Using an AI Assistant for Essay Writing Task.” arXiv, Cornell University, 10 June 2025, https://arxiv.org/abs/2506.08872.

Lee, Hao-Ping (Hank), et al. “The Impact of Generative AI on Critical Thinking: Self-Reported Reductions in Cognitive Effort and Confidence Effects from a Survey of Knowledge Workers.” Microsoft Research, 25 Apr. 2025, www.microsoft.com/en-us/research/publication/the-impact-of-generative-ai-on-critical-thinking-self-reported-reductions-in-cognitive-effort-and-confidence-effects-from-a-survey-of-knowledge-workers/.

Phang, Jason, et al. “Investigating Affective Use and Emotional Well-being on ChatGPT.” OpenAI Research, OpenAI, 21 Mar. 2025, cdn.openai.com/papers/15987609-5f71-433c-9972-e91131f399a1/openai-affective-use-study.pdf.

Reif, Jessica A., et al. “Evidence of a Social Evaluation Penalty for Using AI.” PNAS, National Academy of Sciences, 8 May 2025, www.pnas.org/doi/epub/10.1073/pnas.2426766122.

Roose, Kevin. “For Some Recent Graduates, the A.I. Job Apocalypse May Already Be Here.” The New York Times, 30 May 2025, https://www.nytimes.com/2025/05/30/technology/ai-jobs-college-graduates.html.

Szuhaj, Ben. “AI & Authenticity—What Does It Mean to Be ‘Real’ in 2025?” Kungfu Blog, Kungfu.ai, 18 Mar. 2025, www.kungfu.ai/blog-post/ai-authenticity–what-does-it-mean-to-be-real-in-2025.

Terry, Owen Kichizo. “I’m a Student. You Have No Idea How Much We’re Using ChatGPT.” The Chronicle of Higher Education, 12 May 2023, www.chronicle.com/article/im-a-student-you-have-no-idea-how-much-were-using-chatgpt.

Wright, Ben. “Reports of English Literature’s Death Have Been Greatly Exaggerated.” The Telegraph, Telegraph Media Group, 10 Dec. 2024, www.telegraph.co.uk/business/2024/12/10/reports-english-literature-death-have-greatly-exaggerated/?ICID=continue_without_subscribing_reg_first.

The AI Dilemma Copyright © 2025 by J.T. Bushnell is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted. This chapter is also available in ScholarsArchive@OSU.

License
A Dam Good Argument Copyright © 2022 by Liz Delf, Rob Drummond, and Kristy Kelly is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.