|
I teach writing-based grad courses where student need to learn specific structures for writing. Auto-generating content will keep them from learning these skills.
And Grammarly is advertising its new generative-AI capabilities, not just checking sentence-level typos and grammar anymore. https://www.grammarly.com/ai AMC |
PP to whom you are replying here. Actually, I am a college professor, and a department chair. And I do consider it my job both to expect largely correct grammar coming directly from my students (not coming from AI) and to require students to attempt to filter out their own mistakes. Productive self-evaluation is part of being a professional, and even just part of being a functioning adult. My students running their writing through generative AI tools prevents me from understanding not only their objective skills and their general abilities, but also their thought processes. It is not in fact acceptable to me, but even if it were, it would be a lazy way for me to pretend that my students do not need to improve and that I do not, in fact, need to help them. |
I am genuinely curious how you will react when your boss (or maybe your boss's boss) issues the edict that each and every class must now incorporate generative AI. Don't say that edict won't happen...I will repeat, don't say that edict won't happen...the only question is whether it happens next year (probably not), in five years (50%+ chance), or in seven years (nearly 100% certainty). |
I mean, if you're going to send out something like this, maybe be sure it's free of grammatical errors and the like, including improper subject-verb agreement and improper usage of commas and apostrophes. Maybe also write with a little more word economy. |
Why don’t you engage with the substance of what the OP is saying, instead of jumping on with some cheap shots? |
The issue is not the existence of generative AI: the issue is what it can teach students vs. what it can actually prevent them from learning, and that will remain a moving target as the technology evolves. At this exact moment in higher education we are seeing far more homework-level abuse of AI than anything else, as students experiment with assignment shortcuts and faculty experiment with brand-new ways of cultivating pre-professional productivity that are not yet fully formed. The consumer-facing versions of today's AI tools are not robust enough (yet) to produce professional academic writing that people actually want and need to read, especially since the tools we currently have are not trained on the materials we actually research and study. Once they are, that is where the questions will come, likely starting in the technical and professional disciplines. And as far as AI requirements go, the first step will be that we faculty will be required to have AI policies in our syllabi and enforce them fairly and transparently. I see that coming in a year or two. But required to teach with it? That will be discipline- and subject-dependent, and everyone impacted by higher education (including but not limited to students, parents and guardians, and employers) should want colleges and universities to make choices in that regard that will produce the best return on investment. Some of that will depend on what the tools look like down the line. |
Agreed. I taught a writing course at a top 25 professional school for many years. I finally did a “grammar tip of the day” because while I expected a certain level of grammatical excellence, the reality was that it didn’t exist. I found it shocking and disturbing. This was before AI! |
I seriously doubt you are correct. I believe nearly 100% of all disciplines/subjects will be required to incorporate it in some fashion, and each course/discipline will have a unique way of incorporating it. As an example, do you have the discretion to tell your students they must turn in handwritten papers? This is an honest question. I liken generative AI to using a computer vs. writing by hand. If in fact you could force your students to turn in handwritten 10-page papers, then I would agree your university will give you the flexibility to prohibit it forever. However, if your university has a policy that you must allow your students to use a computer (or that you cannot prohibit your students from using a computer, so they are free to handwrite it if they so wish)...well, generative AI will be the same. Calculators in STEM classes are an equally apt analogy. |
| The real issue is that they don't teach grammar and spelling in schools anymore. |
Same PP again: yes, we can require work of all kinds to be submitted in whatever format we believe is best for the assignment and the students. We are also allowed to develop and enforce our own rules for the use of technology in the classroom. (But there is also the appropriate regulation, of course, that any students who have documented disability needs must be permitted to use any technology included in their accommodations.) I do not see much traction in a handwritten research paper prepared outside the classroom. But a handwritten exam inside of it? Absolutely. |
| Curious how you accessed this? Did your kid send it to you? |
AI is a flash in the pan. I'm in the business world, and it's already fizzling out. The problem with generative AI is that nothing it produces sounds genuine, and sometimes it hallucinates. Besides, I want AI to take care of tasks like dishes and laundry so I can write more; I don't want it to write so that I have more time to do dishes and laundry. That's the fundamental flaw in the value prop and why it's ultimately going to be abandoned in the corporate world. I'm at a big corporation -- $50 billion in global revenues -- and I already see this. |
Pointing out hypocrisy isn't a cheap shot. If OP wrote that, OP is a terrible, sloppy writer who has no business complaining about mistakes by others. If I were going to take a cheap shot, I'd observe that OP is also really whiny and insufferable. |
Dude, your post is completely made up. Here is the Point72 chairman from yesterday on AI: The Point72 founder told CNBC’s Andrew Ross Sorkin on “Squawk Box” that his financial firm has found ways for even the early AI models to save the company money. “I’ll give you one little anecdote. My CTO comes to me and says I can save the firm $25 million by using these LLMs to improve our efficiency,” Cohen said, referencing his chief technology officer and large language models like ChatGPT. You don't even understand what generative AI can do. Here is another tidbit regarding Tyler Perry and his studio expansion in Atlanta: Over the past four years, Tyler Perry had been planning an $800 million expansion of his studio in Atlanta, which would have added 12 soundstages to the 330-acre property. Now, however, those ambitions are on hold — thanks to the rapid developments he’s seeing in the realm of artificial intelligence, including OpenAI’s text-to-video model Sora, which debuted Feb. 15 and stunned observers with its cinematic video outputs. “Being told that it can do all of these things is one thing, but actually seeing the capabilities, it was mind-blowing,” he said in an interview with The Hollywood Reporter on Thursday, noting that his productions might not have to travel to locations or build sets with the assistance of the technology. Generative AI is writing code, putting graphic designers out of work, putting advertising copywriters out of work, etc. It seems like you will soon be put out of work yourself. |
My kid was “taught” to write from standardized “templates.” I expected that to end in middle or high school, but it did not. Perhaps public schools do it to ensure “fair” (i.e., objective) grading, but it did not teach her to use the English language with any beauty or personal style. I wonder if that is one way that private schools are superior? |