| None |
|
I'm a researcher, so my job is a mix of coding, writing, reviewing materials, etc.
I use AI for coding basically every day. I'm a pretty good programmer by researcher standards, but AI allows me to tackle a lot of things that would otherwise be outside of my comfort zone, or would take a long time to learn to do well. AI is also a very good programming tutor. I'm not using agentic tools yet, but I'm planning to give them a try soon. I review all the code that AI writes, and often modify it quite a bit. I think it's easy to get into a lot of "tech debt" if you continue to build on things without understanding how or why they work. Also, in the case of research I think there's an ethical issue if you're not really confident of what is going on.

I don't use AI to write, but I do occasionally use AI to provide feedback on my writing. I think that's pretty typical of academics these days. I wouldn't feel guilty having AI write, for example, a methodological summary of my computer code, as long as I reviewed it for accuracy. But, I don't actually do this in practice, as I prefer being hands on. I know others who use AI to produce a first draft of their presentations, though I have not done this.

If I only need a high level understanding of a paper, I'll have AI summarize it for me. If I need to provide detailed feedback, I write down my own scattered thoughts and have AI organize them into a well structured response. I think these uses are pretty typical of my colleagues as well.

Comparing the recent activities of my colleagues, it's pretty apparent who is spending time to incorporate AI into their workflow and who is not. The former are running circles around the latter. We're making progress on big tasks and process improvements that previously seemed unattainable, and the direction of those improvements is being set by our most technologically savvy folks because they're using AI to expand the scope of what we can do. |
| It’s obvious who in this thread is using AI to write for them, even the ones who claim they’re not. |
|
I saw this article pop up in my newsfeed and it's pretty eye-opening.
https://fortune.com/2026/02/11/something-big-is-happening-ai-february-2020-moment-matt-shumer/ Not sure what versions you all are using, but in the article, the author claims the paid-for AI models are vastly superior to the freebies. Also, it appears that the AI improvements are happening faster and faster. Fascinating and troubling at the same time. |
| Zero |
|
Not much. The problem for me is that my job is pretty dependent on data and research that typically aren't digitized, and that the AI my employer is willing to pay for doesn't have great results for accuracy. AI's strength is not in collection of new empirical data from the physical world. My field is trying to move toward making more extant publicly held data searchable and accessible, which would make AI more useful (and it's super frustrating that the funding and will weren't there to make that same data accessible for humans 10-20 years ago), but we're way behind the tech world.
There is a portion of my job I could imagine that AI could do well (revising reports in specific ways), if fed good data and trained in the conventions, but at this point it's typically easier for me to do it myself than wrestle with the tools I have access to. That may change in the next few years and reduce contract projects and pay, but it'll still require physical field work and human checkers. |
| I don't use it but give it a swing every few months. I have an engineering degree but am really just an educated technician. I work on energy systems but they are niche and customized to the point that AI systems haven't digested them to a useful degree. Not yet anyway. |
Lol. No you can't. Maybe a year ago. But still probably not. |
| Never. My previous boss used it for everything and it was just embarrassing for him. My new one never does and she comes off far more competent. |
It may not be doing you as many favors as you think. I could tell my supervisor did my annual plan using AI and it made me livid. Like another poster said, I'm not even worth my boss's time once a year anymore? Blech. Tying this back to the original question from OP, I'd be ok if my boss told me they used AI to help them write it to save time. Passing it off as your own is insulting. Your staff isn't telling that to your face because, well, you're the boss. Don't be naive to think that they aren't talking amongst themselves about it though. |
You don’t appear to understand how current AI tools work. The pp said she wrote her own notes and used the AI to format them into an evaluation. That’s totally appropriate and I would not be upset if my supervisor did it. There is a big difference between asking an LLM to write a structured document based on data you feed it and asking it to write something from scratch. LLMs excel at taking data and turning it into structured documents. They excel at “structured” language in general, like computer code, regulatory writing, etc. |
| Never |
| Keep in mind that AI is inherently sexier and racist among other things. It’s trained on our society. |
Lol, that should be *sexist* |
|
I use it daily. I have to do a ton of monotonous writing in a marcom role and it helps clean up stuff like social posts and email copy. Also helps develop editorial and project calendars. I’ve got a paid Claude account and have all my projects in it.
And we have an AI policy at work that permits this use. |