Anonymous wrote:Most CS majors don't do coding.
Anonymous wrote:Anonymous wrote:Anonymous wrote:Hmm. At my work we have a custom IT project and are employing developers, and there's no out-of-the-box AI that will work for our purposes.
They are turning on AI features all over the place and encouraging us to use it. But apparently it's not available for the purpose where we actually would like to try it.
What I find interesting is that the reason AI can code at all is that it's been trained on tons of existing code and Stack Overflow questions. If nobody is creating new code or asking new questions on SO, where does new original content come from?
I do admit that ChatGPT does an excellent job of writing specific snippets of code; you’ll still need someone good to understand and integrate, and to design the system architecture.
My husband is a principal software engineer and data architect - luckily (I guess) for him, he works in a highly sensitive area of government which likely won’t adopt AI very quickly, if they do at all. He says the same as the above. So far, AI can’t do it all.
Anonymous wrote:Anonymous wrote:Anonymous wrote:Anonymous wrote:I think students who love CS should major in it. Those who were doing so only for a high-paying career (or because their parents were forcing it) should not. There will always be CS jobs, just not as many of the lower-level coding types of jobs.
Correct. It is more important than ever to go to a high-quality school known for rigorous CS, with a curriculum covering AI and an emphasis on critical thinking and problem solving. The CS bubble started to burst for graduates in 2024. The schools that have done well in placement despite the bubble are the same set of schools that are highly regarded for engineering and adjacent fields: MIT, Caltech, Stanford, UC Berkeley, CMU, Georgia Tech, UIUC, and the Ivies at the top of STEM (Princeton, Penn, Cornell, Harvard, Columbia), plus probably a couple more close behind.

Higher-level CS positions open to students with a bachelor's have always preferentially hired from these schools. Good but less rigorous programs mainly send students to lower-level coding positions after a bachelor's. Hiring managers look at the courses taken, and the most rigorous schools tend to have the highest percentage of students who have completed rigorous coursework, often grad-level where available, and who have extensive coding experience through on-campus and summer internships/research. It is much more cost-effective to recruit at colleges where the vast majority of students are desirable hires, not just the top 5%.
I doubt Georgia Tech and UIUC can hang in there much longer. The percentage employed has decreased dramatically at Georgia Tech.
Yeah, all UIUC has ever done was... invent the web browser (Mosaic). Surely AI can take the place of all future innovations.
Anonymous wrote:Anonymous wrote:Anonymous wrote:
As a software engineer I tried a snippet of code from Google Gemini once. It was wrong.
Oh, ok. You tried it *once* and it was wrong, therefore it's useless.
I'd highly recommend that you put in the time and check out the latest tools.
Try Copilot (with VSCode), or Cursor. Check out Claude Code - I think you'll be surprised.
All of these now work in an agentic mode - give them a problem, and they will break it down into tasks and present possible solutions. You can ask follow-up questions and give clarifying instructions.
Don't expect a one-shot solution, unless you're writing short scripts (which they are *great* at, BTW).
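To make the "short scripts" point concrete, here's a hypothetical example (the task and function name are made up for illustration) of the kind of small, self-contained utility these assistants reliably produce in one shot:

```python
# Hypothetical example of a short script an assistant will typically
# one-shot correctly: report the most common words in a piece of text.
from collections import Counter

def top_words(text, n=5):
    """Return the n most common lowercase words in text as (word, count) pairs."""
    return Counter(text.lower().split()).most_common(n)
```

Anything at this scale, a prompt like "write me a function that counts the most common words" tends to come back working on the first try; it's the multi-file, architecture-level work where you still have to drive.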
Finally, try out Gemini or OpenAI's 'deep research' mode. I asked it for a paper on post-quantum algorithm adoption, and 20 minutes later I had a solid 35-page document with references and all. As a bonus, it made me a great infographic.
And only half the references were complete fabrications!
Anonymous wrote:I'm a computer scientist/software engineer. I use AI to write my code daily. It doesn't write itself though. You have to know how to craft a prompt and check its work. It types out what I know how to do and can tell it to do in a prompt, but you have to explain to it what to do.
Software engineers have been copying and pasting from Stack Exchange for years. AI just speeds up that process and prints out the code directly.
Computer Science isn't going away. We still need new software engineers, maybe more. They need to learn how to craft prompts and use AI to help them program faster and more efficiently.
It seems like the same people worrying about this are the people who thought all cashiers at McDonald's were going to be fired because a kiosk showed up. You either learn how to use technology to enhance and assist you in your career, or you become a dinosaur.
CS majors will be the ones who are learning in college what those of us in the field already are learning on the job.
We aren't all going to be obsolete.
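A concrete (and hypothetical) sketch of what "check its work" looks like in practice: say you prompted an assistant for a URL-slug helper. Before trusting the generated code, you write a few quick assertions of your own and run them against it:

```python
# Hypothetical: a helper an assistant might generate from the prompt
# "write a function that turns a title into a URL slug".
import re

def slugify(title):
    """Lowercase, collapse runs of non-alphanumerics into '-', trim dashes."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

# "Check its work": quick assertions you write yourself before merging.
assert slugify("Hello, World!") == "hello-world"
assert slugify("AI & CS 101: Intro") == "ai-cs-101-intro"
```

The point isn't the slug function; it's that the human still supplies the spec (the assertions) and the judgment about whether the output actually meets it.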
Anonymous wrote:It is a concern for us. Our DS is in his 2nd year doing CS at Purdue.
Anonymous wrote:Anonymous wrote:Anonymous wrote:Anonymous wrote:A machine learning engineer focus seems like a good call for new grads.
yes, that's what my kid is focusing on. And some universities now have a Masters program in AI. UMD just started one.
I wish colleges/universities would go the opposite way. Instead of spinning up new niche majors that will eventually be pointless, focus on the liberal arts (broadly). When treated as a science, CS is a liberal art. Teach kids how to actually think and learn and grow. Then they can adapt to the changes in the job market as they come.
Picking up individual technical skills is a piece of cake once you understand the logic, algorithms, and engineering principles that underlie them.
Yes, we need more liberal arts majors! Their work cannot be replaced by AI!