This is the right idea and a good way to explain to kids why writing their own work matters.
I also think there is a need to go back to in-class writing.
“I cannot want your learning more than you do.” THIS!
It’s not about the grades, it’s about the learning. People who cheat are only cheating themselves out of the opportunity to learn, grow, and develop the skills they need for life.
Kudos to this prof!
And yes, to getting back to in-class assignments.
The professor should be fired for failing to do his job and for making unfounded assumptions of wrongdoing without adequate proof.
Don’t like the idea of the professor throwing his arms up and basically giving up. There are ways around this issue. For example, couldn’t the professor have students write an essay (even a shorter one) in class using old-fashioned bluebooks? Or have students submit drafts/outlines/research as part of the assignment?
Amen.
Uh…
Fired? I disagree with that strongly.
Is the university to blame also? Are they giving professors the tools they need to combat the use of AI? Are they asking students to adhere to some code of conduct and verify that the work is their own? I don’t know the answers to these questions, btw.
I agree when he says this: “I hate how AI has forced me to turn into a punitive detective, rather than, well, a teacher. I reject that completely.” How can he do his job properly against AI? This is obviously going to be more of a problem in the coming years. What is the point of trying to learn anything when AI can just do it all for us? It’s scary.
Surely his job is to grade student work, not AI work. He is calling attention to something people already seem to be forgetting: learning is the point.
Assuming the Reddit post references real events, students using AI for papers is a legitimate problem, and there are legitimate solutions. Giving everyone a 10/10 because the teacher can’t tell who is using AI and who isn’t is not such a solution. Other options include things like:
I seem to recall my daughter talking about an assignment where they were told to use AI, and then to critique AI’s response, drawing on class materials, their own reflection, and double checking sources.
That’s an interesting idea, because presumably anyone putting that prompt into ChatGPT, etc., would get a similar answer.
So maybe an interesting take could be that all students have to give the prompt to ChatGPT, etc., but then show how they have added their own research and unique ideas to the initial output. The goal could be to differentiate their work from that of AI.
Edit: which is pretty much what @worriedmomucb just said, haha.
I think these conversations must already be happening in academia. It’s certainly going to be interesting to see how this plays out in education at all levels.
Wouldn’t AI be able to generate all that too?
Grading has become impossible, demoralizing, and saddening. There was always an element of plagiarism-policing to grading, especially since the late ‘90s, when it started to become easier for students to find materials online, and it got much worse during and after Covid. But it’s an entirely different animal now.

I feel like half my job consists of begging students to think for themselves and have enough self-respect to do their own work and see its benefits for their intellectual development. I feel like I spend an inordinate amount of time looking for signs of AI use, which of course I can’t prove (except sometimes I can, and I tell them how I know, and they do it anyway).

Yes, I design assignments to be as AI-proof as possible (which means I’m limiting myself by assessing students on some skills and not others), and yes, I use in-class writing (I’ve always used it, even before it came back into vogue as a way to get around AI). Yes, I have them compare actual student responses to AI-generated responses to the same question to get at the flaws inherent in AI use (half of them typically think the AI response is better, and they can’t even explain why). Oh, for the days of simple cut-and-paste plagiarism, which was so much easier to identify and prove.
In short, our jobs as faculty are increasingly boiling down to AI policing (at least for those who haven’t surrendered to it). Frankly, I don’t blame anyone for how they handle it, because the situation is impossible and untenable.
A good friend of mine gave up teaching at the university level for just this reason. Sad, because I think she was a wonderful professor.
I’m going on the assumption that if every student used AI beyond that, their responses would all be too similar? But I have no idea.
Wow, that is just sad. I am sorry you are going through this.
We created AI, so we have to find ways to outsmart it. Meanwhile, are whole sections of our kids’ brains going to become smooth? Seriously!
Nobody is requiring you or others to incorporate AI into class assignments, but saying AI has no use in humanities or social sciences is not accurate. AI can be like having a team of 1000 researchers to assist with your work. That team could read through and evaluate volumes of texts in seconds, something that would take years for you to do yourself. They could look at your ideas from relevant lenses you wouldn’t possibly consider, or argue why your ideas are correct/incorrect. They could immediately find references relevant to your ideas, references you wouldn’t have found on your own.
It doesn’t have to be a choice between the bot writing the paper for you and not using AI at all. AI can assist and support your goals, allowing you to dramatically enhance your productivity and accomplish things that would not have been possible in previous decades. I expect people working in humanities and social science fields who can use AI effectively to enhance their productivity will become increasingly in demand in the future.
You don’t know my work, but no, AI cannot do this. It never will be able to do this.
I do archival research. I work with manuscripts from the 17th–19th centuries, some of which are digitized and some of which aren’t. AI bots will never know how to read and interpret them like I can. They will never know how to use them to develop an argument like I can. They will never know the questions to ask, the evidence to choose, or the secondary literature to consult (AI is well known for producing fake quotes and fake citations, and as far as I can tell, those problems are getting worse, not better; and as for library search engines – well, AI has made Google infinitely worse, so I’ll continue to rely on my own search terms in online catalogues, thank you). AI only works with text and other information that is already in its system, which it combines and recombines without ever producing an original thought, because it cannot think. I (and other scholars in my discipline and plenty of others) produce work that is original. AI cannot do that. The value of our work is in its originality, not its volume.
Inefficiency is not a terrible thing when you’re talking about work that requires meticulous effort and fine-grained analysis. I understand that AI might work well with massive data sets, but my discipline doesn’t. The definition of “productivity” varies across discipline, and in mine, it’s about quality over quantity.
(And if we’re talking about teaching – AI will never be able to grade papers or write syllabi, recommendation letters, or lectures better than I can, either. My training and expertise are actually worth something.)
Your earlier comment implied AI has no use in the humanities or social sciences as a whole, not just in your work. I was replying to that. AI has different degrees of benefit for different fields within the humanities and social sciences. This thread was originally about a (communications?) professor teaching a classroom of students who are not highly specialized, who presumably will have careers in a variety of different fields, many of which will likely be quite different from your own.
AI can’t mentor or coach students. It lacks empathy - it doesn’t really challenge you to think more deeply or to fall in love with learning. So there are roles for us as educators - but maybe we need to rethink how we teach and what we want students to learn.