AI now writes 25% of code in the US: Should Computer Science students rethink their career plans?

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:A machine learning engineer focus seems like a good call for new grads.

yes, that's what my kid is focusing on. And some universities now have a Masters program in AI. UMD just started one.


I wish colleges/universities would go the opposite way. Instead of spinning up new niche majors that will eventually be pointless, focus on the liberal arts (broadly). When treated as a science, CS is a liberal art. Teach kids how to actually think and learn and grow. Then they can adapt to the changes in the job market as they come.

Picking up individual technical skills is a piece of cake once you understand the logic and algorithms and engineering principals that underlie them.


Yes, we need more liberal arts majors! Their work cannot be replaced by AI!


I mean, you could have actual scientists working in the federal government.

Or the report produced by MAHA that was full of AI hallucinations.

Take your pick.
Anonymous
My class of 23 DS is using AI do help with coding. Luckily he got into a position before
Anonymous
It is a concern for us. Our DS is in his 2nd year doing CS at Perdue.

Anonymous
I'm a computer scientist/software engineer. I use AI to write my code daily. It doesn't write itself though. You have to know how to craft a prompt and check its work. It types out what I know how to do and can tell it to do in a prompt, but you have to explain to it what to do.
Software Engineers have been using copy/paste off of Stack Exchange for years. This speeds up that process and just prints out the code.
Computer Science isn't going away. We still need new software engineers, maybe more. They need to learn how to craft prompts and use AI to help them program faster and more efficiently.
It seems like the same people worrying about this are the people who thought all cashiers at McDonalds were going to be fired because a kiosk showed up. You either learn how to use technology to enhance and assist you in your career or you become a dinosaur.
CS majors will be the ones who are learning in college what those of us in the field already are learning on the job.

We aren't all going to be obsolete.
Anonymous
Yes absolutely!!
Anonymous
We need plumbers and auto mechanics
Anonymous
Anonymous wrote:It is a concern for us. Our DS is in his 2nd year doing CS at Perdue.



By all means, have him switch to become an English major

I'm sure they are teaching him AI and how to use it to create data analytics and programs in less time than it has taken in the past. Data science is increasingly more important in every field.

Anonymous
Anonymous wrote:What does DCUM think?

https://timesofindia.indiatimes.com/education/news/ai-now-writes-25-of-code-in-the-us-should-computer-science-students-rethink-their-career-plans/amp_articleshow/122002732.cms


As a programmer who uses AI extensively, I'm skeptical of the truthfulness of this. AI is helpful, but without oversight, the code it generates is often inefficient and prone to bugs. It will do things that are completely unacceptible. In one case I remember it modified the code to work for my test case even though the main code was completely brokne in an effort to pass. Seriously if people are really using AI to write code for critical systems we're in serious trouble.
Anonymous
Anonymous wrote:I'm a computer scientist/software engineer. I use AI to write my code daily. It doesn't write itself though. You have to know how to craft a prompt and check its work. It types out what I know how to do and can tell it to do in a prompt, but you have to explain to it what to do.
Software Engineers have been using copy/paste off of Stack Exchange for years. This speeds up that process and just prints out the code.
Computer Science isn't going away. We still need new software engineers, maybe more. They need to learn how to craft prompts and use AI to help them program faster and more efficiently.
It seems like the same people worrying about this are the people who thought all cashiers at McDonalds were going to be fired because a kiosk showed up. You either learn how to use technology to enhance and assist you in your career or you become a dinosaur.
CS majors will be the ones who are learning in college what those of us in the field already are learning on the job.

We aren't all going to be obsolete.


Thank you. A lot of unnecessary fear
Anonymous
Anonymous wrote:
Anonymous wrote:Hmm. At my work we have a custom IT project and are employing developers and there's no out of the box AI that will work for our purposes.

They are turning on AI features all over the place and encouraging us to use it. But apparently it's not available for the purpose where we actually would like to try it.


What I find interesting is the reason that AI is able to code currently is because it’s been trained on tons of code and stack overflow questions. If nobody is creating new code or asking new questions on SO, where does new original content come from?

I do admit that ChatGPT does an excellent job of writing specific snippets of code; you’ll still need someone good to understand and integrate, and to design the system architecture.


Yes, you need someone good to prompt it (basically write a spec) for small pieces that its able to do. Review what it does. Often make several sets of changes to its output and test it. I do believe it's a powerful tool, but it's not magic and without human interaction to guide and review it with an understanding, it is not reliable. Further, for anything novel or intricate that it has never seen, somewhere like StackOverflow, you definitely need human creativity. I recently encountered a fairly novel system that made a valiant effort, but overall, its approach was inferior to a more geometric solution I devised and it implemented. My point is this is helpful but isn't going to replace the human part of the equation anytime soon. It will likely continue to improve so no clue how this will play out though in 10+ years. The problem today is its really not capable of innovating just mimicry.
Anonymous
Anonymous wrote:
Anonymous wrote:
As a software engineer I tried a snippet of code from Google Gemini once. It was wrong.


Oh, ok. You tried it *once* and it was wrong, therefore it's useless.

I'd highly recommend that you put in the time and check out the latest tools.

Try Copilot (with VSCode), or Cursor. Check out Claude Code - I think you'll be surprised.

All of these now work in an agentic mode - give them a problem, and they will break it down into tasks and present possible solutions. You can ask follow-up questions and give clarifying instructions.

Don't expect a one-shot solution, unless you're writing short scripts (which they are *great* at, BTW).

Finally, try out Gemini or OpenAI 'deep research' mode. I've had it produce me a 35 page paper on post-quantum algorithm adoption and 20 minutes later I had a great document with references, etc. As a bonus, it made me a great infographic.



And only half the references were complete fabrications!
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
As a software engineer I tried a snippet of code from Google Gemini once. It was wrong.


Oh, ok. You tried it *once* and it was wrong, therefore it's useless.

I'd highly recommend that you put in the time and check out the latest tools.

Try Copilot (with VSCode), or Cursor. Check out Claude Code - I think you'll be surprised.

All of these now work in an agentic mode - give them a problem, and they will break it down into tasks and present possible solutions. You can ask follow-up questions and give clarifying instructions.

Don't expect a one-shot solution, unless you're writing short scripts (which they are *great* at, BTW).

Finally, try out Gemini or OpenAI 'deep research' mode. I've had it produce me a 35 page paper on post-quantum algorithm adoption and 20 minutes later I had a great document with references, etc. As a bonus, it made me a great infographic.



And only half the references were complete fabrications!


First PP here - I bit my tongue to keep from saying that.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I think students who love CS should major in it. Those who were doing so only for a high paying career (or because their parents were forcing it) should not. There will always be CS jobs, just not as many if the lower level coding types of jobs.


Correct. It is more important than ever to go to a high-quality school that is known for rigorous CS with curriuculum in AI and emphasis on critical thinking and problem solving. The CS bubble started to break for graduates in 2024. The schools that have done well in placement despite the bubble are the same set of schools that are highly regarded for engineering and adjacent fields: MIT, calTech, Stanford, UCB, CMU, GaTech, UIUC, &the ivies that are at the top of stem(Princeton, Penn, Cornell, Harvard, Columbia), probably a couple more close. Higher level CS positions available to students with a bachelors have always preferentially hired from these schools. Good but not highly rigorous programs mainly send students to lower-level coding positions after a bachelors. Hiring managers look at the courses taken and the most rigorous schools tend to have the highest percent of students who have taken rigorous coursework, often grad level if available, and have extensive coding experience through on-campus and summer internships/research. It is much more cost effective to recruit at colleges where the vast majority of students are desirable, not just the top 5%.

I doubt Gatech and UIUC can hang in there much longer. Percent employed has decreased dramatically at Gatech.


Yeah, all UIUC has ever done was... invent the web browser (Mosaic). Surely AI can take the place of all future innovations.


Berners Lee might argue with that. What it did was write a viable implementation based on the work of others.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Hmm. At my work we have a custom IT project and are employing developers and there's no out of the box AI that will work for our purposes.

They are turning on AI features all over the place and encouraging us to use it. But apparently it's not available for the purpose where we actually would like to try it.


What I find interesting is the reason that AI is able to code currently is because it’s been trained on tons of code and stack overflow questions. If nobody is creating new code or asking new questions on SO, where does new original content come from?

I do admit that ChatGPT does an excellent job of writing specific snippets of code; you’ll still need someone good to understand and integrate, and to design the system architecture.


My husband is a principal software engineer and data architect - luckily (I guess) for him, he works in a highly sensitive area of government which likely won’t adopt AI very quickly, if they do at all. He says the same as the above. So far, AI can’t do it all.


He’s old. He’ll retire before his job becomes obsolete.

Kids just starting out need to be more practical.
Anonymous
Anonymous wrote:Most CS majors don't do coding.


Writing the code is the last and easiest step.
post reply Forum Index » College and University Discussion
Message Quick Reply
Go to: