Anonymous wrote:Anonymous wrote:Anonymous wrote:It means using LLMs is part of real life and they will be used.
+2
It is very odd that the message is, essentially, that AI is coming for your job and you'd better be ready to use it.
I use AI in my job all the time. It’s part of the landscape and students are using it to work smarter, not harder.
With nepotism giving objectively undeserving people all kinds of opportunities, can the hoi polloi have this one thing to give them a leg up?
Is it really getting a leg up, though?
Say you have two kids in an English class. One reads Hamlet. The other reads an AI-generated summary. The first writes a sophomoric but original essay. The second uses AI to generate an essay in seconds. The first walks away from the class having interacted with one of the great works of Western literature. What did the second gain?
Anonymous wrote:In-class bluebook tests are common at my kid's college.
Anonymous wrote:Anonymous wrote:STEM faculty here, so I don't have to worry about AI-written essays turned in by students. One way to fight AI is to limit assignments to no more than 10% of the total grade (tell students their effort on low-weight assignments will pay off during exams, and make that true) and make in-class midterm and final exams the remaining 90%. If the class has individual or group projects/labs that are hands-on, then AI can't do much. If they are "take home" and are merely much more difficult assignments, it's harder to catch, but fortunately few classes I teach have those types of projects. So far, my TAs have told me some students did use AI, because sometimes the techniques and mathematical notation they used on assignments were not the ones covered in class. It's unmistakable. It's also sad and depressing to see, and I'm sure those in the humanities have it much worse. I remember a phrase from a professor to students that went viral recently: "I cannot want to learn this more than you," with which I completely agree.
You are STEM, ffs. You should be integrating AI into your lessons.
Anonymous wrote:Genuinely curious. A lot of undergrad assignments can easily be run through AI, whether it's an intro poli sci class asking students to compare Hobbes and Locke or a calculus problem set. Does this mean a shift to in-class writing, oral exams, etc.?
Anonymous wrote:One of my kid’s professors required them to hand write everything - essays, problem sets, short answers, in-class assignments, all of it. My kid is dyslexic, so it’s kind of a bummer (kid refuses to go to disability services to get accommodations, which would fix the issue). It seems reasonable, but students end up doing their homework on the computer and then copying it out by hand. Which, for sure, means they learn it well…so perhaps mission accomplished.
Anonymous wrote:I'm a professor, and I began using oral exams, in-class tests and quizzes, and short in-class writing assignments a couple of years ago. When I have an at-home assignment, I make it extremely specific - e.g., asking for a specific number of direct quotations.
Anonymous wrote:The student on the UVA admissions Instagram yesterday mentioned that one of her classes was "technology free," which was why she wasn't sharing a picture the way she did with her other classes.