| It's pretty easy to generate a paper via ChatGPT. How are professors handling it? |
| Many ways: tracking keystrokes, tracking edits in Word, and better and better detection systems. |
| It's more difficult to detect in CS assignments. |
|
We can go anywhere from "it's your education: waste the opportunity to learn if that's your choice" all the way to "I will make it effing impossible for you to pass by letting a machine do your work for you." In the latter case we require in-class work, track editing in Google Docs, build in elaborate multi-stage peer review, assign non-digitized texts, give oral exams, and basically exhaust ourselves trying to long-cut around their shortcuts. But none of that is as demoralizing as trying to teach when most students don't prepare the assigned work or reading. If they would just read the g**d**n book (the book, not an AI summary) so, so many problems could just be completely avoided. But they won't, and they don't. Timing-wise this is a post-covid experience for me as a prof, but I don't think it's pandemic fallout: I think it's just where we are right now in higher education, and indeed where we are as a culture.
Mostly I'm just sad. I've always tried to be very purposeful and conscientious in my teaching and in my treatment of students. But I'm running out of workarounds, and you don't get anywhere with begging. Or preaching. Or individuating. Or choosing an easier textbook. Or rewriting a course. Or innovating. Or shaking up the grading system. Or "making things relevant." If it doesn't look like a direct ticket to the next internship, they probably won't even show up regularly - which makes it impossible to help them succeed. |
|
Many classes are incorporating AI into the coursework (you can use ChatGPT or Claude or whatever): more work is assigned, with the expectation that you use AI to complete the "busy" work, so to speak, and you must indicate when you used it.
A great example is a class my kid is taking on new business development, where the teacher says the class progresses 5x faster than pre-AI because, for instance, the expectation is that you use AI to create your marketing plan…but of course you need to make sure it is correct and relates to your business idea. Just one example. |
| That works well in certain academic areas and not so well in others. When the academic area is about the mastery of a human skill, the AI shortcuts are devastating to a student's ability to develop said skill, even if they generate reams of product. |
| One professor has them do all writing in class. Others use AI detectors. |
| By having you do things in person like give a speech on a topic. |
| People love AI for things they don't really value. Like, yeah, AI is great for marketing or coding, but what about literature? AI is pretty good with that too. Sorry, you're not writing anything new. Papers maybe weren't the ultimate teaching tool. So back to tutorials and defenses. |
| I am doing in-class weekly testing to force them to show up prepared for class. I also ask them to describe and reflect on the work they do during laboratory exercises. AI cannot generate something meaningful if it doesn't know what they did in the lab and what results they got. They still try to use AI and it is relatively easy to tell that the discussion has little to do with their class experience (too general and pompous). But this is science and I assume that it is harder with the humanities. |
Thanks for this - it _is_ harder with the humanities, because there is so much out there about existing literature that a paper can be generated in a snap - and so can a summary of what you were supposed to read, or plausible answers to your homework questions, etc., etc. But what people often forget is that the deliverable in the humanities isn't actually the paper. Sure, it's the thing we grade, but the paper is really a _symptom_ of the thoughts and skills that are developed through struggling with big ideas. The transformation of the student's capacities is what we're working on: the world doesn't need another student essay on Macbeth, but the _student_ does. AI completely kills off that deliverable in the student, and replaces it with fake evidence of learning. And halfway-ing it ("I used AI for some ideas and then wrote the essay using that outline") compromises that student transformation, too. It's not like there weren't other shortcuts in the past - it's just that these are the shortest and potentially most devaluing ones we've encountered. |
| At this point, if colleges don’t teach kids to incorporate GPT into their learning, they are doing a disservice to the students. The CS students who don’t know how to use AI to code will lose out to kids who do. |
| They will eventually have to learn ChatGPT themselves and teach students how to use it as a tool. Won’t be good for dinosaur professors. |
| I don't teach in a writing-heavy discipline so I've switched my short papers to discussion board assignments with required interactions, and short-answers on in-class tests. |
|
I know it's not the same, but I assume professors kind of freaked out when handheld/scientific calculators were first introduced, and again when the Internet arrived in the mid-1990s...but everyone seems to have adapted to the new technology.
By say 2027/28 these threads on DCUM will seem quaint. |