Anonymous wrote:
Anonymous wrote:
As a software engineer I tried a snippet of code from Google Gemini once. It was wrong.
Oh, ok. You tried it *once* and it was wrong, therefore it's useless.
I'd highly recommend that you put in the time and check out the latest tools.
Try Copilot (with VSCode), or Cursor. Check out Claude Code - I think you'll be surprised.
All of these now work in an agentic mode - give them a problem, and they will break it down into tasks and present possible solutions. You can ask follow-up questions and give clarifying instructions.
Don't expect a one-shot solution, unless you're writing short scripts (which they are *great* at, BTW).
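Just to illustrate the short-script point, here's roughly the kind of one-shot task these tools handle cleanly. This is my own sketch, not output from any particular model: a small Python utility that finds duplicate files by hashing them.

# Sketch of a typical one-shot script task: find duplicate files under a
# directory by hashing their contents. (Hypothetical example, not model output.)
import hashlib
import sys
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Group files under root by the SHA-256 hash of their contents."""
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            # Reads each file fully into memory; fine for a quick utility.
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    # Keep only hashes that occur more than once.
    return {h: paths for h, paths in groups.items() if len(paths) > 1}

if __name__ == "__main__":
    for digest, paths in find_duplicates(sys.argv[1]).items():
        print(digest[:12], *paths, sep="\n  ")

Give that exact task to Copilot, Cursor, or Claude Code and you'll typically get something of this shape back in one go.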
Finally, try out Gemini or OpenAI's 'deep research' mode. I had it produce a 35-page paper on post-quantum algorithm adoption; 20 minutes later I had a great document with references, etc. As a bonus, it made me a great infographic.
Anonymous wrote:I think students who love CS should major in it. Those who were doing so only for a high-paying career (or because their parents were forcing it) should not. There will always be CS jobs, just not as many of the lower-level coding jobs.
Anonymous wrote:
Anonymous wrote:Hmm. At my work we have a custom IT project and are employing developers, and there's no out-of-the-box AI that will work for our purposes.
They are turning on AI features all over the place and encouraging us to use them. But apparently it's not available for the purpose where we'd actually like to try it.
What I find interesting is that the reason AI can code at all right now is that it's been trained on tons of code and Stack Overflow questions. If nobody is writing new code or asking new questions on SO, where does new original content come from?
I do admit that ChatGPT does an excellent job of writing specific snippets of code; you'll still need someone good to understand and integrate them, and to design the system architecture.
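For a concrete example of what I mean by a specific snippet (my own sketch, not actual ChatGPT output): a self-contained retry-with-backoff helper is exactly the kind of thing it nails. Deciding where it belongs in the system, what counts as retryable, and how it interacts with timeouts and logging is still design work for a person.

# Sketch of the kind of self-contained snippet an LLM writes well: retry a
# callable with exponential backoff and jitter. (Hypothetical example.)
import random
import time

def retry(fn, attempts=5, base_delay=0.5):
    """Call fn(), retrying on any exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # Out of attempts: surface the last error.
            # Wait base_delay * 2^attempt seconds, plus up to 100 ms of jitter.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))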
Anonymous wrote:I am stunned by your creativity in starting AI threads. Every day, something different.
Anyway, there will need to be someone to “manage” the AI, so comp sci majors should be focusing on that.