Anonymous
Post 01/24/2026 20:12     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


If govies are using a technology and you're not allowed to, that's really saying something about your org.


No. It says something about the US government.

We would have to develop our own closed model in order for it to be secure, or at least mostly secure. As you can see from the responses to this thread, there is no business case here. It doesn’t save time and will only cost money. Anything we build will be old tech five minutes later. It doesn’t make financial sense and it doesn’t make workflow sense.


+1
Another private sector here, with a lot of proprietary information.
It’s very disheartening that the "govies" on here think they are ahead. Think about what you and your coworkers are pasting into that box or pointing the AI at. You are sending your (OUR) data to AI. Memos, code: really think about it, numnuts.
Anonymous
Post 01/24/2026 16:59     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


If govies are using a technology and you're not allowed to, that's really saying something about your org.


No. It says something about the US government.

We would have to develop our own closed model in order for it to be secure, or at least mostly secure. As you can see from the responses to this thread, there is no business case here. It doesn’t save time and will only cost money. Anything we build will be old tech five minutes later. It doesn’t make financial sense and it doesn’t make workflow sense.


No one is "developing their own closed model". There are open-source models you can run locally on your own servers. If the NSA can figure out how to do this (including with proprietary models in FedRAMP'd environments), your security concerns are incorrect. And the fact that you think you would need to develop your own models means you are detached enough from how this works that "there's no business case here" is almost certainly incorrect. Do you write? Does your org produce code? Do you synthesize text data in any way? If so, you have a business case.


You’re weirdly invested in this tool getting into every nook and cranny. Why?


You’re weirdly invested in commenting on a thread where you don’t even understand the technology.

OP here. Surprised this thread seems to have turned into a pro vs. anti AI discussion!
Anonymous
Post 01/24/2026 00:10     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


If govies are using a technology and you're not allowed to, that's really saying something about your org.


No. It says something about the US government.

We would have to develop our own closed model in order for it to be secure, or at least mostly secure. As you can see from the responses to this thread, there is no business case here. It doesn’t save time and will only cost money. Anything we build will be old tech five minutes later. It doesn’t make financial sense and it doesn’t make workflow sense.


No one is "developing their own closed model". There are open-source models you can run locally on your own servers. If the NSA can figure out how to do this (including with proprietary models in FedRAMP'd environments), your security concerns are incorrect. And the fact that you think you would need to develop your own models means you are detached enough from how this works that "there's no business case here" is almost certainly incorrect. Do you write? Does your org produce code? Do you synthesize text data in any way? If so, you have a business case.


You’re weirdly invested in this tool getting into every nook and cranny. Why?


I want organizations doing important work to be doing it well. And the fact that there are apparently no economic pressures on your taxpayer-funded org making it keep up -- I'm sure this isn't the only symptom of that.


Taxpayer funded? You're responding to someone in the private sector after complaining that person doesn't use federal tools. You're not even coherent.

A thing I notice about needed and useful innovations is that people adopt them without needing to be constantly told how vital it is that they be adopted.


National security field.
And do you think the defense primes have innovated on anything in decades?
Anonymous
Post 01/23/2026 23:49     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


If govies are using a technology and you're not allowed to, that's really saying something about your org.


No. It says something about the US government.

We would have to develop our own closed model in order for it to be secure, or at least mostly secure. As you can see from the responses to this thread, there is no business case here. It doesn’t save time and will only cost money. Anything we build will be old tech five minutes later. It doesn’t make financial sense and it doesn’t make workflow sense.


No one is "developing their own closed model". There are open-source models you can run locally on your own servers. If the NSA can figure out how to do this (including with proprietary models in FedRAMP'd environments), your security concerns are incorrect. And the fact that you think you would need to develop your own models means you are detached enough from how this works that "there's no business case here" is almost certainly incorrect. Do you write? Does your org produce code? Do you synthesize text data in any way? If so, you have a business case.


You’re weirdly invested in this tool getting into every nook and cranny. Why?


I want organizations doing important work to be doing it well. And the fact that there are apparently no economic pressures on your taxpayer-funded org making it keep up -- I'm sure this isn't the only symptom of that.


Taxpayer funded? You're responding to someone in the private sector after complaining that person doesn't use federal tools. You're not even coherent.

A thing I notice about needed and useful innovations is that people adopt them without needing to be constantly told how vital it is that they be adopted.


There is no security-involved company not using AI, unless they are physical security guards. This poster is lying.
Anonymous
Post 01/23/2026 22:20     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


If govies are using a technology and you're not allowed to, that's really saying something about your org.


No. It says something about the US government.

We would have to develop our own closed model in order for it to be secure, or at least mostly secure. As you can see from the responses to this thread, there is no business case here. It doesn’t save time and will only cost money. Anything we build will be old tech five minutes later. It doesn’t make financial sense and it doesn’t make workflow sense.


No one is "developing their own closed model". There are open-source models you can run locally on your own servers. If the NSA can figure out how to do this (including with proprietary models in FedRAMP'd environments), your security concerns are incorrect. And the fact that you think you would need to develop your own models means you are detached enough from how this works that "there's no business case here" is almost certainly incorrect. Do you write? Does your org produce code? Do you synthesize text data in any way? If so, you have a business case.


You’re weirdly invested in this tool getting into every nook and cranny. Why?


I want organizations doing important work to be doing it well. And the fact that there are apparently no economic pressures on your taxpayer-funded org making it keep up -- I'm sure this isn't the only symptom of that.


Taxpayer funded? You're responding to someone in the private sector after complaining that person doesn't use federal tools. You're not even coherent.

A thing I notice about needed and useful innovations is that people adopt them without needing to be constantly told how vital it is that they be adopted.
Anonymous
Post 01/23/2026 14:28     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


If govies are using a technology and you're not allowed to, that's really saying something about your org.


No. It says something about the US government.

We would have to develop our own closed model in order for it to be secure, or at least mostly secure. As you can see from the responses to this thread, there is no business case here. It doesn’t save time and will only cost money. Anything we build will be old tech five minutes later. It doesn’t make financial sense and it doesn’t make workflow sense.


No one is "developing their own closed model". There are open-source models you can run locally on your own servers. If the NSA can figure out how to do this (including with proprietary models in FedRAMP'd environments), your security concerns are incorrect. And the fact that you think you would need to develop your own models means you are detached enough from how this works that "there's no business case here" is almost certainly incorrect. Do you write? Does your org produce code? Do you synthesize text data in any way? If so, you have a business case.


You’re weirdly invested in this tool getting into every nook and cranny. Why?


I want organizations doing important work to be doing it well. And the fact that there are apparently no economic pressures on your taxpayer-funded org making it keep up -- I'm sure this isn't the only symptom of that.
Anonymous
Post 01/23/2026 13:57     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


Well level up.


Level up why? To use AI in the workplace?


Yes, hon. It’s like saying you don’t wanna “go to the cloud”. Or learn call waiting. Or learn how to use the Internet. Or saying you don’t want to use computers, you’d rather use a typewriter.

You better educate yourself on AI or you’re gonna be left behind.


NP. It's not an issue of education, it's that AI is not fit for every possible purpose. It's a tool, and you don't use the same tool for every job. To use your example, people do in fact prefer typewriters in remote places with unreliable electricity: you could crank a generator to power your tablet, or you could just get on with your work on a typewriter.

My workplace could, in theory, completely reconfigure everything so that my job could be done by AI. It would be a massive undertaking and the only payout would be replacing my one little job. Not worth it.


You don’t understand AI enough to really discuss it and I can tell by just reading what you wrote.


Are you here to educate people based on your substantial and nuanced understanding of this tech, or to identify someone failing to inhale the hype as hard as you do? What are you, some 28-year-old 'consultant' at Deloitte or Accenture? Idiot.
Anonymous
Post 01/23/2026 13:44     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


If govies are using a technology and you're not allowed to, that's really saying something about your org.


No. It says something about the US government.

We would have to develop our own closed model in order for it to be secure, or at least mostly secure. As you can see from the responses to this thread, there is no business case here. It doesn’t save time and will only cost money. Anything we build will be old tech five minutes later. It doesn’t make financial sense and it doesn’t make workflow sense.


No one is "developing their own closed model". There are open-source models you can run locally on your own servers. If the NSA can figure out how to do this (including with proprietary models in FedRAMP'd environments), your security concerns are incorrect. And the fact that you think you would need to develop your own models means you are detached enough from how this works that "there's no business case here" is almost certainly incorrect. Do you write? Does your org produce code? Do you synthesize text data in any way? If so, you have a business case.


You’re weirdly invested in this tool getting into every nook and cranny. Why?


You’re weirdly invested in commenting on a thread where you don’t even understand the technology.
Anonymous
Post 01/23/2026 13:43     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


Well level up.


Level up why? To use AI in the workplace?


Yes, hon. It’s like saying you don’t wanna “go to the cloud”. Or learn call waiting. Or learn how to use the Internet. Or saying you don’t want to use computers, you’d rather use a typewriter.

You better educate yourself on AI or you’re gonna be left behind.


NP. It's not an issue of education, it's that AI is not fit for every possible purpose. It's a tool, and you don't use the same tool for every job. To use your example, people do in fact prefer typewriters in remote places with unreliable electricity: you could crank a generator to power your tablet, or you could just get on with your work on a typewriter.

My workplace could, in theory, completely reconfigure everything so that my job could be done by AI. It would be a massive undertaking and the only payout would be replacing my one little job. Not worth it.


You don’t understand AI enough to really discuss it and I can tell by just reading what you wrote.
Anonymous
Post 01/23/2026 13:26     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

I don't use it.
Anonymous
Post 01/23/2026 11:38     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


If govies are using a technology and you're not allowed to, that's really saying something about your org.


No. It says something about the US government.

We would have to develop our own closed model in order for it to be secure, or at least mostly secure. As you can see from the responses to this thread, there is no business case here. It doesn’t save time and will only cost money. Anything we build will be old tech five minutes later. It doesn’t make financial sense and it doesn’t make workflow sense.


No one is "developing their own closed model". There are open-source models you can run locally on your own servers. If the NSA can figure out how to do this (including with proprietary models in FedRAMP'd environments), your security concerns are incorrect. And the fact that you think you would need to develop your own models means you are detached enough from how this works that "there's no business case here" is almost certainly incorrect. Do you write? Does your org produce code? Do you synthesize text data in any way? If so, you have a business case.


You’re weirdly invested in this tool getting into every nook and cranny. Why?


You're just being trolled by someone in bad faith.
Anonymous
Post 01/23/2026 11:35     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:Claude does everything for me now. Probably shaved 5-10 hours a week for me of BS writing and editing that I have to do.


Wow!
Anonymous
Post 01/23/2026 11:33     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


If govies are using a technology and you're not allowed to, that's really saying something about your org.


No. It says something about the US government.

We would have to develop our own closed model in order for it to be secure, or at least mostly secure. As you can see from the responses to this thread, there is no business case here. It doesn’t save time and will only cost money. Anything we build will be old tech five minutes later. It doesn’t make financial sense and it doesn’t make workflow sense.


No one is "developing their own closed model". There are open-source models you can run locally on your own servers. If the NSA can figure out how to do this (including with proprietary models in FedRAMP'd environments), your security concerns are incorrect. And the fact that you think you would need to develop your own models means you are detached enough from how this works that "there's no business case here" is almost certainly incorrect. Do you write? Does your org produce code? Do you synthesize text data in any way? If so, you have a business case.


You’re weirdly invested in this tool getting into every nook and cranny. Why?
Anonymous
Post 01/23/2026 11:16     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:No. Who is allowed to use AI for their job?!

literally everyone?


We’re not allowed to use it at my massive employer. National security field.


Models are in FedRAMP high environments. They're using it at DoD, the FFRDCs, the NSA. Who has banned this?


We’re private sector. What they use at the agencies doesn’t apply to us.


If govies are using a technology and you're not allowed to, that's really saying something about your org.


No. It says something about the US government.

We would have to develop our own closed model in order for it to be secure, or at least mostly secure. As you can see from the responses to this thread, there is no business case here. It doesn’t save time and will only cost money. Anything we build will be old tech five minutes later. It doesn’t make financial sense and it doesn’t make workflow sense.


No one is "developing their own closed model". There are open-source models you can run locally on your own servers. If the NSA can figure out how to do this (including with proprietary models in FedRAMP'd environments), your security concerns are incorrect. And the fact that you think you would need to develop your own models means you are detached enough from how this works that "there's no business case here" is almost certainly incorrect. Do you write? Does your org produce code? Do you synthesize text data in any way? If so, you have a business case.
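
To be concrete about "run locally": most local runtimes (llama.cpp's server mode, Ollama, vLLM) accept the same OpenAI-style chat request, so "keeping the data in-house" mostly means pointing the client at your own box. A rough Python sketch; the endpoint URL and model name below are placeholders, not a specific recommendation:

```python
import json

def build_chat_request(prompt, model="local-model"):
    """Build an OpenAI-style chat payload: the de facto request shape
    that common local runtimes (llama.cpp server, Ollama, vLLM) accept."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

payload = build_chat_request("Summarize this memo in three bullets.")
print(json.dumps(payload, indent=2))

# Sent to your own hardware, nothing leaves the building, e.g.:
#   requests.post("http://localhost:8080/v1/chat/completions", json=payload)
```

Same request shape whether the model behind the endpoint is open-weight or proprietary-in-a-FedRAMP-enclave; the security question is about where the endpoint lives, not about developing a model from scratch.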
Anonymous
Post 01/23/2026 11:10     Subject: Has using AI (ChatGPT etc…) reduced the number of hours you work?

I used it to pull data from a number of pages on a website and then aggregate. Saved a lot of time, though I had to be familiar enough with the data to correct the bot when it reported the data from the same page twice.
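
That duplicate-page failure mode is worth catching mechanically rather than by eyeballing the output. A rough sketch of the sanity check I wish I'd run first (URLs and numbers made up):

```python
# Hypothetical rows the bot reported: (page_url, value)
rows = [
    ("https://example.com/a", 10),
    ("https://example.com/b", 7),
    ("https://example.com/a", 10),  # same page reported twice
]

def dedupe_by_page(rows):
    """Keep the first record per page URL so a double-reported page
    doesn't inflate the aggregate."""
    seen, out = set(), []
    for url, value in rows:
        if url not in seen:
            seen.add(url)
            out.append((url, value))
    return out

clean = dedupe_by_page(rows)
print(sum(v for _, v in clean))  # 17, not the inflated 27
```

You still need to know the data well enough to notice when two rows for one page are a bot error versus two genuinely different records.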