Are college students using AI to write papers?

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I’ve been out of college for a long time, but I saw something on the news about this.

Is it true that college students are using AI to write their papers for their classes? Even at elite colleges! What can we do to stop this?

I asked my nephew, and he said he doesn’t use it, but he could be an outlier.


My opinion is that kids are going to use it. The best way to address it is to help them understand the most efficient ways to use it, what its limitations are, and that you always need to cross-reference what AI gives you. They still need to do the work, understand the concepts, etc., because AI could produce an answer or essay that is totally off base. It's a good tool for generating outlines, for suggesting how to sharpen a paragraph you've written, and, if you feed it text you are having trouble understanding, for breaking it down into something easier to follow.

My college-aged daughter has used it, and we have had these discussions. She understands the limitations; she has caught AI giving inaccurate answers, but does feel it is a useful tool for helping her get started on essays by generating a sample draft.

I use AI for parts of my work, but I wouldn't advise anyone new in my position to use it until they have a good grasp of what their work product should include and strong knowledge of the required content. It is going to be available to this generation as they move into the workforce, so perhaps college is the time for them to learn how to use it responsibly and apply analytical skills to ensure AI produces accurate work products.



AI content generation constitutes academic dishonesty in most disciplines. Good luck to her.


Give it a rest. I bet when calculators were first introduced, using them was also considered academic dishonesty.


And if using them was against school rules at the time, it WAS academic dishonesty. It doesn’t matter that they are ubiquitous now.

DC’s math tests (Alg 2/Trig, but I believe other classes do the same at the school) all have two sections - the first in which they are not allowed to use a calculator. They have to hand that section in and receive the second, calculator-allowed section before they can take out their calculators. Using a calculator on the non-calculator section is cheating and would constitute academic dishonesty.

Using genAI in a way disallowed by the school or the professor is academic dishonesty, even if you don’t care because “everyone does it” or because AI will be everywhere in less time than it took calculators to become common household items.


Again…give it a rest.
Anonymous
Anonymous wrote:
Anonymous wrote:Smart kids need to learn how to use AI.

We are now teaching people at work how to use it, and it’s like, come on man, high school students figured this out.

The smart kids use it to edit their papers, brainstorm, check grammar, etc.

Really smart ones pop their syllabus into it and ask it to write a study guide.

Savvy ones can put their whole wardrobe in there and ask it to pop out a bunch of outfits for the week. Most of DC could use that.



I’m baffled by this idea that there’s a “smart” way and a dumb way to use it. As if learning how to craft the prompts is so difficult. It is what it is. You click a button and it does everything for you in an instant. Is that amazing? Yes. Is that useful? Sure. Does it indicate your level of intelligence or savvy? No, it doesn’t.


It’s really not as easy as that when you understand how it can be used. I agree that it doesn’t require a ton of intelligence, but it does require hours of work and some savvy.

For example, it’s easy to learn to use Excel for simple spreadsheets…but it’s useful to know all its advanced functionality for lookup tables and all kinds of other applications. It takes time to learn that.

GenAI is similar because it can now create entire 30-slide presentations from scratch, but it takes time and effort to do that, and someone who just knows how to use it to write a simple essay will be baffled by how to use it for more complicated applications. You will learn after many hours of figuring it out, but it takes many hours.

The folks using Gen AI for movies/video said it took them nearly 100 hours to really figure it out so it could be used for a real commercial purpose.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I’ve been out of college for a long time, but I saw something on the news about this.

Is it true that college students are using AI to write their papers for their classes? Even at elite colleges! What can we do to stop this?

I asked my nephew, and he said he doesn’t use it, but he could be an outlier.


My opinion is that kids are going to use it. The best way to address it is to help them understand the most efficient ways to use it, what its limitations are, and that you always need to cross-reference what AI gives you. They still need to do the work, understand the concepts, etc., because AI could produce an answer or essay that is totally off base. It's a good tool for generating outlines, for suggesting how to sharpen a paragraph you've written, and, if you feed it text you are having trouble understanding, for breaking it down into something easier to follow.

My college-aged daughter has used it, and we have had these discussions. She understands the limitations; she has caught AI giving inaccurate answers, but does feel it is a useful tool for helping her get started on essays by generating a sample draft.

I use AI for parts of my work, but I wouldn't advise anyone new in my position to use it until they have a good grasp of what their work product should include and strong knowledge of the required content. It is going to be available to this generation as they move into the workforce, so perhaps college is the time for them to learn how to use it responsibly and apply analytical skills to ensure AI produces accurate work products.



AI content generation constitutes academic dishonesty in most disciplines. Good luck to her.


Give it a rest. I bet when calculators were first introduced, using them was also considered academic dishonesty.


And if using them was against school rules at the time, it WAS academic dishonesty. It doesn’t matter that they are ubiquitous now.

DC’s math tests (Alg 2/Trig, but I believe other classes do the same at the school) all have two sections - the first in which they are not allowed to use a calculator. They have to hand that section in and receive the second, calculator-allowed section before they can take out their calculators. Using a calculator on the non-calculator section is cheating and would constitute academic dishonesty.

Using genAI in a way disallowed by the school or the professor is academic dishonesty, even if you don’t care because “everyone does it” or because AI will be everywhere in less time than it took calculators to become common household items.


Again…give it a rest.


NP. Compelling counterargument.
Anonymous
Anonymous wrote:
Anonymous wrote:I’ve been out of college for a long time, but I saw something on the news about this.

Is it true that college students are using AI to write their papers for their classes? Even at elite colleges! What can we do to stop this?

I asked my nephew, and he said he doesn’t use it, but he could be an outlier.


Take away the "out" part.


I do not understand people who hate kids so much that they assume they’re all liars.
Anonymous
Anonymous wrote:Yes, and they study for exams by having AI fill in their study guides (no need to take notes or pay attention in class! and it's quite amusing to read some of the AI 'hallucinations'!). I've already moved to in-class essay exams, which I try to make AI-proof. And this coming term I'm implementing a quasi-tutorial system wherein students write papers but then have to explain and defend them to me and one or two other students, in small group discussions.

If profs continued to assess as if it were 2022, many if not most students would be getting As for literally zero work, having learned absolutely nothing in the course.


Another prof here. PLEASE WARN YOUR KIDS ABOUT USING AI TO GENERATE A STUDY GUIDE. I see nothing wrong with it ethically, but in my experience, AI does a bad job of distilling what I emphasized on my slides, or the bigger picture, and at times it feeds students blatantly incorrect information. One of my students sent me the study guide he got by asking AI to generate one from my slides / lecture notes (I try to put a lot of info on the slides so students CAN use them as a study guide). About 50% of it was helpful, about 30% was wildly irrelevant (the student would have spent time studying stupid little sidebar notes), and about 20% was explicitly wrong. For example, I went over a specific theory, and the name of that theory was on the slides. The AI gave a similar-sounding name - one an outsider would think sounds pretty close, but one completely different in meaning - and then described the theory in a way that was really wrong, almost like a smart outsider guessing what the theory meant based on the name.

And it also missed really big, important concepts - ones that had several slides - so the student would have failed to study those.

The thing is, developing your own study guide is actually a big part of the learning. I told my student that he would have been so much better off going through the lecture notes and making his own study guide. Luckily the student emailed the guide ahead of time, and I emailed him back and told him to throw it away and that he'd do terribly on the exam if he used it!

So it'd be helpful to warn your student not to do this - they won't do as well on exams.
Anonymous
Anonymous wrote:
Anonymous wrote:Yes, and they study for exams by having AI fill in their study guides (no need to take notes or pay attention in class! and it's quite amusing to read some of the AI 'hallucinations'!). I've already moved to in-class essay exams, which I try to make AI-proof. And this coming term I'm implementing a quasi-tutorial system wherein students write papers but then have to explain and defend them to me and one or two other students, in small group discussions.

If profs continued to assess as if it were 2022, many if not most students would be getting As for literally zero work, having learned absolutely nothing in the course.


Another prof here. PLEASE WARN YOUR KIDS ABOUT USING AI TO GENERATE A STUDY GUIDE. I see nothing wrong with it ethically, but in my experience, AI does a bad job of distilling what I emphasized on my slides, or the bigger picture, and at times it feeds students blatantly incorrect information. One of my students sent me the study guide he got by asking AI to generate one from my slides / lecture notes (I try to put a lot of info on the slides so students CAN use them as a study guide). About 50% of it was helpful, about 30% was wildly irrelevant (the student would have spent time studying stupid little sidebar notes), and about 20% was explicitly wrong. For example, I went over a specific theory, and the name of that theory was on the slides. The AI gave a similar-sounding name - one an outsider would think sounds pretty close, but one completely different in meaning - and then described the theory in a way that was really wrong, almost like a smart outsider guessing what the theory meant based on the name.

And it also missed really big, important concepts - ones that had several slides - so the student would have failed to study those.

The thing is, developing your own study guide is actually a big part of the learning. I told my student that he would have been so much better off going through the lecture notes and making his own study guide. Luckily the student emailed the guide ahead of time, and I emailed him back and told him to throw it away and that he'd do terribly on the exam if he used it!

So it'd be helpful to warn your student not to do this - they won't do as well on exams.


Thank you for this whole post. These two points are so critical:

- AI often hallucinates and provides information that is just wrong.
- The act of reviewing notes and distilling them into a study guide is what helps you learn.
Anonymous
Anonymous wrote:
Anonymous wrote:Yes, and they study for exams by having AI fill in their study guides (no need to take notes or pay attention in class! and it's quite amusing to read some of the AI 'hallucinations'!). I've already moved to in-class essay exams, which I try to make AI-proof. And this coming term I'm implementing a quasi-tutorial system wherein students write papers but then have to explain and defend them to me and one or two other students, in small group discussions.

If profs continued to assess as if it were 2022, many if not most students would be getting As for literally zero work, having learned absolutely nothing in the course.


Another prof here. PLEASE WARN YOUR KIDS ABOUT USING AI TO GENERATE A STUDY GUIDE. I see nothing wrong with it ethically, but in my experience, AI does a bad job of distilling what I emphasized on my slides, or the bigger picture, and at times it feeds students blatantly incorrect information. One of my students sent me the study guide he got by asking AI to generate one from my slides / lecture notes (I try to put a lot of info on the slides so students CAN use them as a study guide). About 50% of it was helpful, about 30% was wildly irrelevant (the student would have spent time studying stupid little sidebar notes), and about 20% was explicitly wrong. For example, I went over a specific theory, and the name of that theory was on the slides. The AI gave a similar-sounding name - one an outsider would think sounds pretty close, but one completely different in meaning - and then described the theory in a way that was really wrong, almost like a smart outsider guessing what the theory meant based on the name.

And it also missed really big, important concepts - ones that had several slides - so the student would have failed to study those.

The thing is, developing your own study guide is actually a big part of the learning. I told my student that he would have been so much better off going through the lecture notes and making his own study guide. Luckily the student emailed the guide ahead of time, and I emailed him back and told him to throw it away and that he'd do terribly on the exam if he used it!

So it'd be helpful to warn your student not to do this - they won't do as well on exams.


Thanks to the prof above for saying this.

Over in the "Jobs" forum, the AI posts have so much fearmongering. And yet, the AI work I've seen is frankly pretty crappy or off-target. My corporation has been experimenting with AI and there are lots of failures and areas where it can't be used yet.

It seems to me that the people who are most impressed by AI work are unable to perceive quality differences, just like the student above. This makes me question how they can use AI to quickly produce excellent work. Of course, sometimes completing an assignment is just about handing something in. But we should expect more of ourselves and of education.

I think education is going through a tough adaptation period right now. But I believe it will all work out, and that students will benefit from AI as a tool and still learn and internalize concepts.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Yes, and they study for exams by having AI fill in their study guides (no need to take notes or pay attention in class! and it's quite amusing to read some of the AI 'hallucinations'!). I've already moved to in-class essay exams, which I try to make AI-proof. And this coming term I'm implementing a quasi-tutorial system wherein students write papers but then have to explain and defend them to me and one or two other students, in small group discussions.

If profs continued to assess as if it were 2022, many if not most students would be getting As for literally zero work, having learned absolutely nothing in the course.


Another prof here. PLEASE WARN YOUR KIDS ABOUT USING AI TO GENERATE A STUDY GUIDE. I see nothing wrong with it ethically, but in my experience, AI does a bad job of distilling what I emphasized on my slides, or the bigger picture, and at times it feeds students blatantly incorrect information. One of my students sent me the study guide he got by asking AI to generate one from my slides / lecture notes (I try to put a lot of info on the slides so students CAN use them as a study guide). About 50% of it was helpful, about 30% was wildly irrelevant (the student would have spent time studying stupid little sidebar notes), and about 20% was explicitly wrong. For example, I went over a specific theory, and the name of that theory was on the slides. The AI gave a similar-sounding name - one an outsider would think sounds pretty close, but one completely different in meaning - and then described the theory in a way that was really wrong, almost like a smart outsider guessing what the theory meant based on the name.

And it also missed really big, important concepts - ones that had several slides - so the student would have failed to study those.

The thing is, developing your own study guide is actually a big part of the learning. I told my student that he would have been so much better off going through the lecture notes and making his own study guide. Luckily the student emailed the guide ahead of time, and I emailed him back and told him to throw it away and that he'd do terribly on the exam if he used it!

So it'd be helpful to warn your student not to do this - they won't do as well on exams.


Thanks to the prof above for saying this.

Over in the "Jobs" forum, the AI posts have so much fearmongering. And yet, the AI work I've seen is frankly pretty crappy or off-target. My corporation has been experimenting with AI and there are lots of failures and areas where it can't be used yet.

It seems to me that the people who are most impressed by AI work are unable to perceive quality differences, just like the student above. This makes me question how they can use AI to quickly produce excellent work. Of course, sometimes completing an assignment is just about handing something in. But we should expect more of ourselves and of education.

I think education is going through a tough adaptation period right now. But I believe it will all work out, and that students will benefit from AI as a tool and still learn and internalize concepts.


In the real world, there's plenty of work where quality differences just don't matter...and that's a lot of stuff.

An IPO prospectus is going to be tons of legalese and boring as hell no matter what. Nobody ever comments "that was a wonderful IPO prospectus...so much better written than other IPO prospectuses". Same with all SEC filings. You do need to make sure it doesn't hallucinate, but beyond that there is no "quality" aspect.

Anonymous
Anonymous wrote:
Anonymous wrote:Smart kids need to learn how to use AI.

We are now teaching people at work how to use it, and it’s like, come on man, high school students figured this out.

The smart kids use it to edit their papers, brainstorm, check grammar, etc.

Really smart ones pop their syllabus into it and ask it to write a study guide.

Savvy ones can put their whole wardrobe in there and ask it to pop out a bunch of outfits for the week. Most of DC could use that.



I’m baffled by this idea that there’s a “smart” way and a dumb way to use it. As if learning how to craft the prompts is so difficult. It is what it is. You click a button and it does everything for you in an instant. Is that amazing? Yes. Is that useful? Sure. Does it indicate your level of intelligence or savvy? No, it doesn’t.


There absolutely is a smart way and a dumb way to use AI. Only someone who hasn’t used an AI program like ChatGPT could write that.

It actually is difficult to learn how to prompt AI, what information to give AI, how to then re-prompt and re-prompt and give it samples of what you want.

You don’t understand: garbage in, garbage out. My high school- and college-age kids use it extensively to create study guides and practice tests, and to come up with multiple formats/versions of writing that they then work on to make it their own.
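
For example (a made-up illustration, not an actual session), a first prompt might be “Here are my lecture notes for chapters 3-5 [pasted]. Make me a study guide organized around the bolded headings, plus five practice questions per chapter.” Then, after reading the output: “Question 3 tests the wrong concept. Here is a sample question from an old exam [pasted]. Redo the practice questions in that style.” The second pass is usually far better than the first.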
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Smart kids need to learn how to use AI.

We are now teaching people at work how to use it, and it’s like, come on man, high school students figured this out.

The smart kids use it to edit their papers, brainstorm, check grammar, etc.

Really smart ones pop their syllabus into it and ask it to write a study guide.

Savvy ones can put their whole wardrobe in there and ask it to pop out a bunch of outfits for the week. Most of DC could use that.



I’m baffled by this idea that there’s a “smart” way and a dumb way to use it. As if learning how to craft the prompts is so difficult. It is what it is. You click a button and it does everything for you in an instant. Is that amazing? Yes. Is that useful? Sure. Does it indicate your level of intelligence or savvy? No, it doesn’t.


There absolutely is a smart way and a dumb way to use AI. Only someone who hasn’t used an AI program like ChatGPT could write that.

It actually is difficult to learn how to prompt AI, what information to give AI, how to then re-prompt and re-prompt and give it samples of what you want.

You don’t understand: garbage in, garbage out. My high school- and college-age kids use it extensively to create study guides and practice tests, and to come up with multiple formats/versions of writing that they then work on to make it their own.


I have used it and this is all just, ugh, no. It’s all garbage out.
Anonymous
Yes, basically all of them.
Anonymous
Anonymous wrote:Meanwhile, out in the workplace, employers want you to incorporate it as appropriate.


This… It’s relevant to everything.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Yes, and they study for exams by having AI fill in their study guides (no need to take notes or pay attention in class! and it's quite amusing to read some of the AI 'hallucinations'!). I've already moved to in-class essay exams, which I try to make AI-proof. And this coming term I'm implementing a quasi-tutorial system wherein students write papers but then have to explain and defend them to me and one or two other students, in small group discussions.

If profs continued to assess as if it were 2022, many if not most students would be getting As for literally zero work, having learned absolutely nothing in the course.


Another prof here. PLEASE WARN YOUR KIDS ABOUT USING AI TO GENERATE A STUDY GUIDE. I see nothing wrong with it ethically, but in my experience, AI does a bad job of distilling what I emphasized on my slides, or the bigger picture, and at times it feeds students blatantly incorrect information. One of my students sent me the study guide he got by asking AI to generate one from my slides / lecture notes (I try to put a lot of info on the slides so students CAN use them as a study guide). About 50% of it was helpful, about 30% was wildly irrelevant (the student would have spent time studying stupid little sidebar notes), and about 20% was explicitly wrong. For example, I went over a specific theory, and the name of that theory was on the slides. The AI gave a similar-sounding name - one an outsider would think sounds pretty close, but one completely different in meaning - and then described the theory in a way that was really wrong, almost like a smart outsider guessing what the theory meant based on the name.

And it also missed really big, important concepts - ones that had several slides - so the student would have failed to study those.

The thing is, developing your own study guide is actually a big part of the learning. I told my student that he would have been so much better off going through the lecture notes and making his own study guide. Luckily the student emailed the guide ahead of time, and I emailed him back and told him to throw it away and that he'd do terribly on the exam if he used it!

So it'd be helpful to warn your student not to do this - they won't do as well on exams.


Thanks to the prof above for saying this.

Over in the "Jobs" forum, the AI posts have so much fearmongering. And yet, the AI work I've seen is frankly pretty crappy or off-target. My corporation has been experimenting with AI and there are lots of failures and areas where it can't be used yet.

It seems to me that the people who are most impressed by AI work are unable to perceive quality differences, just like the student above. This makes me question how they can use AI to quickly produce excellent work. Of course, sometimes completing an assignment is just about handing something in. But we should expect more of ourselves and of education.

I think education is going through a tough adaptation period right now. But I believe it will all work out, and that students will benefit from AI as a tool and still learn and internalize concepts.


Subscribe to Artificial Lawyer. Mind-blowing how much activity is in this space.
Anonymous
Your nephew is not going to tell you he uses AI to write papers. LOL

Yes, lots of students use it to do the work for them.
Anonymous
Lots of kids use it, and, according to my DC, lots of kids are getting in trouble/suspended for doing so. And this is at an elite college.