How are professors dealing with ChatGPT and other LLMs?

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It means using LLMs is part of real life and they will be used.


+2

It is very odd that the message is, essentially, that AI is coming for your job and you'd better be ready to use it.

I use AI in my job all the time. It’s part of the landscape and students are using it to work smarter, not harder.

With nepotism giving objectively undeserving people all kinds of opportunities, can the hoi polloi have this one thing to give them a leg up?


Is it really getting a leg up, though?

Say you have two kids in an English class. One reads Hamlet. The other reads an AI-generated summary. The first writes a sophomoric but original essay. The second used AI to generate an essay in seconds. The first walks away from the class having interacted with one of the great works of western literature. What did the second gain?


I agree with you in theory: my granddad was an academic and definitely would have agreed with you. But the consensus now is that university is merely a vehicle by which to access opportunities and obtain prestige and that the educational element is kind of beside the point.

Look at these rabid posts all over this forum and the ongoing, endless debates over which schools are even “worth” attending.


For assessing whether a student has mastered a set of knowledge, nothing beats in-class tests. (That was true before AI, btw.) For research projects, we outline how AI can be used effectively, teach about when it will be ineffective, and show how to structure research using it as a tool. Using AI effectively is an important skill and I don't mind if they leverage it (for 20% of the grade). So far, simply putting my assignments into AI and turning in the resulting paper still doesn't produce good results, but perhaps in 6-12 months it will!
Anonymous
I am a prof and we got rid of pretty much all graded written work that could be done at home and handed in. Now it is oral or in class assignments and more tests and quizzes and exams. Takes up more class time and doesn't assess the same kind of thinking as older written assignments used to but it is the only way to actually assess learning.

In one of my classes, I had a few discussion posts but I could tell most were AI, so I included questions about the discussion posts on the exam and asked them to reflect on what they had shared in their posts. The majority of students got zero on this as they couldn't remember what AI drivel they had posted on the discussion board!
Anonymous
Anonymous wrote:I am a prof and we got rid of pretty much all graded written work that could be done at home and handed in. Now it is oral or in class assignments and more tests and quizzes and exams. Takes up more class time and doesn't assess the same kind of thinking as older written assignments used to but it is the only way to actually assess learning.

In one of my classes, I had a few discussion posts but I could tell most were AI, so I included questions about the discussion posts on the exam and asked them to reflect on what they had shared in their posts. The majority of students got zero on this as they couldn't remember what AI drivel they had posted on the discussion board!


Great idea. Will steal it for my class.
Anonymous
Anonymous wrote:I am starting to think I will do a lot more flash quizzes and tests where they show up and I hand out a passage and have them write three paragraphs analyzing it on the spot - or have them present it or share in groups for a grade.

But what really disturbs me is I had students write a research paper in a social science course and a large number chose the same topic because apparently they asked the AI what to write about. At this point I am more concerned that someone is taking Intro to Psych or Criminology or Sociology but they can’t even be bothered to articulate an interesting research question or they have no particular question that they care enough about to want to learn something about it. It’s the lack of intellectual curiosity that is most depressing to me.


This! And you develop and hone this through deep engagement with the material. I'm so disappointed that my kid's school allows use of AI in brainstorming but not the final writing. Shouldn't the brainstorming be the part where creativity and curiosity can flourish? The rest is more mechanical.
Anonymous
My DS is in STEM and most of his grades are in-class labs and tests. His advanced calculus was concept-heavy: in-class tests, no calculators.
Pretty much anything goes on take home assignments but they aren’t worth much and not putting in the effort will result in a failing test grade.
His social science classes are hand-written in-class tests. There is some ability to research concepts ahead of time - but Google, SparkNotes, etc. would have the same effect.
I don’t think his professors often explicitly forbid AI- they just don’t provide opportunities for it to play a large part in grading.
As a parent, I worry about AI (and screens in general) because I think it harms stamina, resilience, and love of learning.
I was pleased to hear my son express some disenchantment - the more he learns, the more he sees AI's limitations and flaws + its effect on grit.


Anonymous
Anonymous wrote:I'm a professor, and I began using oral exams and in-class tests and quizzes, as well as short in-class writing assignments, a couple of years ago. When I have an at-home assignment, I make it extremely specific - e.g., asking for a specific number of direct quotations.


I am a professor and this is not only what I do, it's what I've been doing for two decades.
Anonymous
My child's non-AI-written essay was graded by his tutor, whose feedback was likely about 80% AI-generated (I fed the response through a few AI filters). This is at a highly ranked UK school.
Does one bring this up to the department chair? Or maybe it is sanctioned by the school, so let it go?
Anonymous
Anonymous wrote:My child's non-AI-written essay was graded by his tutor, whose feedback was likely about 80% AI-generated (I fed the response through a few AI filters). This is at a highly ranked UK school.
Does one bring this up to the department chair? Or maybe it is sanctioned by the school, so let it go?

Bring it up with the chair.
Anonymous
Humanities professor here. I started doing a few things (not sure how effective they are):
1. I have extensive conversations with my students about why I don't allow them to use AI (outside of fixing mechanics of writing), i.e., the goal of the class is to develop critical thinking, analytic writing, etc. Ultimately, however, I tell them that if they are ok paying $95K+ a year while robbing themselves of the opportunity to learn, it's their problem, not mine.
2. I do a fair amount of writing in class -- either exams or low-stakes "reflection" type assignments so that I have a good sample of their authentic writing (by hand). So if I see a huge discrepancy I have a conversation with them.
3. The at-home writing I require needs to be very specific and has to make specific references to the text. (This is not foolproof, of course.)
All this is super time-consuming and takes time away from other (academic) things I would rather be doing. But I also note that many of my students are quite opposed to AI. So maybe there is hope?
Anonymous
Anonymous wrote:Genuinely curious. A lot of undergrad assignments can easily be run through AI whether it's an intro poli sci class asking students to compare Hobbes and Locke or a calculus problem set. Does this mean a shift to in class writing, oral exams etc.?


Most are just burying their heads in the sand, either relying on their institutions to "punish" or pretending they can accurately detect the use of AI. Very few are actively embracing it and teaching and testing around it with more oral group presentations, Socratic-seminar-style exams, and in-class writing tests.
Anonymous
My DS is in college now, and he has more in-person presentations and group projects than I remember. But he also has classes in which he is supposed to use AI, learning the differences between different AIs, and the professors know what is happening and are using it with the kids, not against them.
Anonymous
Anonymous wrote:My DS is in college now, and he has more in-person presentations and group projects than I remember. But he also has classes in which he is supposed to use AI, learning the differences between different AIs, and the professors know what is happening and are using it with the kids, not against them.


This is good. Unis can either accept AI and build their teaching around it, or fight an endless war against it. There will soon be a clear difference in education quality between the two approaches.