My husband is a political scientist at a small LAC. He's doing in-class written pop quizzes, plus 10-15 minutes at the beginning of each class where students write summaries of the reading; since it's in class, they can't just ChatGPT their answers. For him, the use of AI is infuriating. Students are getting to the point where they can't write or think critically. A big chunk of his class period is spent watching them write, instead of lecturing, so that he knows they're actually reading the texts. Each semester, he fails more and more of his students for plagiarism, because if you're using AI, you're not using your own thoughts/analysis/work. The quality of his students has dropped dramatically since Covid, and AI is just making it worse. He teaches Middle East politics and international law. Don't we want our future leaders in these areas to be able to think critically, analyze data, and make informed policy decisions for our government? Saying "well, it's new, just adapt to it, they're going to use it anyway" is a very shortsighted viewpoint.
Right now, they’re pretty helpless. As much as they claim they can tell if someone’s using it, they really can’t. Students are using it, then reformatting to look like their own work.
It’s time to bring back pop quizzes and in-class essays.
This reminds me of when the Internet came to be, and teachers were adamant that we only use printed texts as sources. I think professors need to get with it. AI exists now, and students should be using it to the best of their abilities.
Professors who are psychotically trying to monitor AI? It’s stupid. AI is here, and the students who use it well will succeed. Who cares if they wrote their paper with AI? If they did a great job, they did a great job.
As a current college student, I think professors need to get with the times. There are professors still assigning 200+ pages of reading each week, plus discussions, plus assignments, plus quizzes. It's kind of ridiculous. Like, now that we have ChatGPT, the point is that we do not need to spend hours reading 200+ pages and analyzing it in our own minds. Maybe that used to be a relevant skill, but it's less and less so these days. I would prefer to learn something more relevant for the future.
I've read AI papers, and no, they don't do a great job. But if it's good enough for my students, then maybe it should be good enough for me. Does anyone here who is paying tuition have a problem if I just let AI grade and provide feedback for me? Because if it does a great job, then it should be good enough, right? I don't actually need to read anything that way, so it would make me more efficient. And it would certainly make me less "psychotic." Do we have a deal, then?
There are many college professors who have written editorials about how AI produces great work on even esoteric topics. It’s interesting how everyone on DCUM claims it sucks, but the people willing to put their name to something claim it produces great work. I don’t think they are getting paid by OpenAI?
This was clearly written by someone who is not a college student. But it does capture the general attitude towards college work right now, alas. The real problem is that college students do not actually know what is relevant because they do not yet have the life or career experience to pass that judgment. Their resulting attempts to triage and prioritize are therefore guesses, and they are usually not good guesses. |
No, but it does need to be noted that they come from academic _disciplines_ and subfields that can be served by the capacities of AI. My discipline currently cannot. Maybe someday, but we will be low-priority for the kind of specialized training that the models will require. Until we can _really_ train them ourselves, robustly, on a near-end-user level, these LLMs will be of little use in my field. |
Like what? A philosophy professor wrote an editorial about asking AI to write a paper on Multiple Realization, and he was blown away by what it produced.
Colleges are responding to ChatGPT and similar AI tools in a range of ways, from cautious acceptance to strict regulation. Some institutions see AI as a useful tool for learning, encouraging students to use it ethically for brainstorming, editing, or coding assistance. These schools are updating policies and coursework to reflect AI’s presence in academic life, integrating discussions about digital literacy, responsible AI use, and academic integrity.
On the other hand, many colleges remain concerned about plagiarism and the erosion of critical thinking. Some have reinforced honor codes, implemented AI-detection tools, and trained faculty to recognize AI-generated writing. Others have redesigned assignments to be more resistant to AI misuse, such as using oral exams, in-class writing, or personalized prompts that are harder for AI to complete convincingly.
Ultimately, colleges are realizing that banning AI outright isn't realistic. Instead, they're working to teach students when and how it’s appropriate to use tools like ChatGPT. The focus is shifting from prohibition to adaptation, guiding students to develop original thought while also understanding the role AI can play in research, writing, and problem-solving. This transition mirrors the broader challenge of aligning education with rapid technological change.
Academic dishonesty report, here I come. |
Our DS is a physics major, and the department has adopted a "no AI policy" policy: it runs things on the assumption that AI is a tool students can use. What this means is that the classes have become dramatically more difficult, and take-home exams are designed so that ChatGPT struggles to get you near the answer. Students get to use a tool they'll inevitably use in their research while also developing their critical thinking skills. There's no shame in using AI, but you are expected to cite its use.
DS loves it, and the upperclassmen have commented on how much more difficult the major has become, but agree that it is a necessary step in the right direction.
This sounds fabulous to me! We’re living in a time of transition. It’s a huge opportunity to learn how best to leverage AI - both in service of learning, and in order to free ourselves up for higher-level thinking and output. I wish all schools (and departments within schools) were open to the approach of the Physics department described above. |
I mean...why read War and Peace or Hamlet anymore? You can just get the summary on ChatGPT. |
The truth is that they aren't tackling it very well. Some schools, like Ohio State, have even started requiring AI usage.
Schools don't know what they're doing, and kids only get caught using AI when they're careless enough to leave the ChatGPT prompts in their responses. The professors themselves often can't tell the difference between AI writing and strong student writing.