I am so tired of every tech bro telling us how AI will change the world without giving us any concrete examples

Anonymous
Here's 3 examples. Yes, I used chat to draft, but I put enough detail in my prompt that it came up with real-world info.

We used to have two support analysts manually categorize 5,000 IT support tickets per week. Now a model auto-tags 92% of them with 96% accuracy, and one analyst audits exceptions.

Our paralegals draft first-pass contract summaries. It used to take them hours in many cases. Have you ever read a commercial lease?? It is a monster. Now a model generates a structured summary in 45 seconds and the paralegal reviews and edits; prep time dropped from 1+ hours to under 15 minutes.

We replaced manual fraud screening of 100% of transactions with an AI risk score. Humans now review only the top 3% flagged.
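The ticket-tagging and fraud-screening examples above both describe the same human-in-the-loop pattern: let the model handle the items it scores confidently and route the rest to a person. A minimal sketch of that routing logic in Python (the IDs, scores, and the 0.9 cutoff are illustrative assumptions, not numbers from any real system):

```python
def triage(scored_items, threshold=0.9):
    """Split (item_id, score) pairs into an auto-handled queue
    and a human-review queue, based on the model's score."""
    auto, review = [], []
    for item_id, score in scored_items:
        if score >= threshold:
            auto.append(item_id)   # model is confident: handle automatically
        else:
            review.append(item_id) # low confidence: send to the analyst
    return auto, review

# Illustrative scores only -- in practice these come from a trained model.
scored = [("T1", 0.97), ("T2", 0.99), ("T3", 0.42), ("T4", 0.95)]
auto, review = triage(scored, threshold=0.9)
print(auto)    # ['T1', 'T2', 'T4']
print(review)  # ['T3']
```

For the fraud case the cutoff would be a percentile (the top 3% of risk scores) rather than a fixed threshold, but the split into an automated queue and a human queue works the same way.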
Anonymous
All the huge AI investment is just the tech companies investing in each other's AI to make it artificially seem like they are all raising crazy amounts of money.

But if AI results in mass unemployment, that won't make the rich richer. Who are they going to sell their products to if no one has a job? People need money to BUY the things the rich want to sell us. Thus they need jobs.

Anonymous
Anonymous wrote:Here's 3 examples. Yes, I used chat to draft, but I put enough detail in my prompt that it came up with real-world info.

We used to have two support analysts manually categorize 5,000 IT support tickets per week. Now a model auto-tags 92% of them with 96% accuracy, and one analyst audits exceptions.

Our paralegals draft first-pass contract summaries. It used to take them hours in many cases. Have you ever read a commercial lease?? It is a monster. Now a model generates a structured summary in 45 seconds and the paralegal reviews and edits; prep time dropped from 1+ hours to under 15 minutes.

We replaced manual fraud screening of 100% of transactions with an AI risk score. Humans now review only the top 3% flagged.


I hope someone is still reading these. Because AI is not that great at summarizing!
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:What you are saying here is you're not on GitHub, Substack, LinkedIn, or any of the places where people talk about the specific things they're building. You don't go to meetups where people demo their tools. Your version of curiosity is "posting here and demanding people tell you."


This comment exemplifies the problem. The prediction engine works very well in coding and other predictable disciplines. But tech bros are too far out ahead of their skis in so many areas they really don’t understand the lack of utility.

That’s ok. Let’s see where this all leads.


You clearly don't know anyone in tech... it's not replacing people, and it needs a lot of human help.


It does need human help, but less and less. And if you think AI isn't replacing people (we are mostly talking about GAI here on this forum, but regular/simple AI has been around for years), you clearly don't know tech or industries at all. I'm in law, not even tech, and a major firm just laid off hundreds of people they don't need anymore bc of AI, from client IT/tech support to back-end tech support to admins, etc.

And they are not hiring first year lawyers like they used to.
Anonymous
Anonymous wrote:
Anonymous wrote:Here's 3 examples. Yes, I used chat to draft, but I put enough detail in my prompt that it came up with real-world info.

We used to have two support analysts manually categorize 5,000 IT support tickets per week. Now a model auto-tags 92% of them with 96% accuracy, and one analyst audits exceptions.

Our paralegals draft first-pass contract summaries. It used to take them hours in many cases. Have you ever read a commercial lease?? It is a monster. Now a model generates a structured summary in 45 seconds and the paralegal reviews and edits; prep time dropped from 1+ hours to under 15 minutes.

We replaced manual fraud screening of 100% of transactions with an AI risk score. Humans now review only the top 3% flagged.


I hope someone is still reading these. Because AI is not that great at summarizing!


Pp, seriously? That’s all you’ve got?? I am typing on my tiny phone and added a sentence or two. This isn’t a letter to the president. It’s a chat forum.
Anonymous
Anonymous wrote:Here's 3 examples. Yes, I used chat to draft, but I put enough detail in my prompt that it came up with real-world info.

We used to have two support analysts manually categorize 5,000 IT support tickets per week. Now a model auto-tags 92% of them with 96% accuracy, and one analyst audits exceptions.

Our paralegals draft first-pass contract summaries. It used to take them hours in many cases. Have you ever read a commercial lease?? It is a monster. Now a model generates a structured summary in 45 seconds and the paralegal reviews and edits; prep time dropped from 1+ hours to under 15 minutes.

We replaced manual fraud screening of 100% of transactions with an AI risk score. Humans now review only the top 3% flagged.


Wait... are these 3 examples of things you personally did, and you just asked AI to write them up? Or did you ask AI to come up with 3 examples?
Anonymous
Anonymous wrote:
Anonymous wrote:What you are saying here is you're not on GitHub, Substack, LinkedIn, or any of the places where people talk about the specific things they're building. You don't go to meetups where people demo their tools. Your version of curiosity is "posting here and demanding people tell you."


This comment exemplifies the problem. The prediction engine works very well in coding and other predictable disciplines. But tech bros are too far out ahead of their skis in so many areas they really don’t understand the lack of utility.

That’s ok. Let’s see where this all leads.


It’s funny that you insult tech bros by using one of the corniest and most hackneyed bro expressions.

In any event, news flash: there are brilliant people working on this, and there is a reason for it. GAI is transformational and will be no matter how many of you complain, all while you don’t even realize you’re using AI constantly throughout your day.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:What you are saying here is you're not on GitHub, Substack, LinkedIn, or any of the places where people talk about the specific things they're building. You don't go to meetups where people demo their tools. Your version of curiosity is "posting here and demanding people tell you."


This comment exemplifies the problem. The prediction engine works very well in coding and other predictable disciplines. But tech bros are too far out ahead of their skis in so many areas they really don’t understand the lack of utility.

That’s ok. Let’s see where this all leads.


It’s funny that you insult tech bros by using one of the corniest and most hackneyed bro expressions.

In any event, news flash: there are brilliant people working on this, and there is a reason for it. GAI is transformational and will be no matter how many of you complain, all while you don’t even realize you’re using AI constantly throughout your day.


I think we found the tech bro
Anonymous
Anonymous wrote:
Anonymous wrote:Here's 3 examples. Yes, I used chat to draft, but I put enough detail in my prompt that it came up with real-world info.

We used to have two support analysts manually categorize 5,000 IT support tickets per week. Now a model auto-tags 92% of them with 96% accuracy, and one analyst audits exceptions.

Our paralegals draft first-pass contract summaries. It used to take them hours in many cases. Have you ever read a commercial lease?? It is a monster. Now a model generates a structured summary in 45 seconds and the paralegal reviews and edits; prep time dropped from 1+ hours to under 15 minutes.

We replaced manual fraud screening of 100% of transactions with an AI risk score. Humans now review only the top 3% flagged.


Wait... are these 3 examples of things you personally did, and you just asked AI to write them up? Or did you ask AI to come up with 3 examples?


I asked AI to come up with several examples of productivity saves, and these 3 were ones I see in my immediate world, so I cut and pasted them. Although I’ll admit the support-ticket human reduction at my co is higher than what I posted here: we had a number of support people and now we are down to one human.

You should look up ‘agentic AI’ bc that’s the next step.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:What you are saying here is you're not on GitHub, Substack, LinkedIn, or any of the places where people talk about the specific things they're building. You don't go to meetups where people demo their tools. Your version of curiosity is "posting here and demanding people tell you."


This comment exemplifies the problem. The prediction engine works very well in coding and other predictable disciplines. But tech bros are too far out ahead of their skis in so many areas they really don’t understand the lack of utility.

That’s ok. Let’s see where this all leads.


It’s funny that you insult tech bros by using one of the corniest and most hackneyed bro expressions.

In any event, news flash: there are brilliant people working on this, and there is a reason for it. GAI is transformational and will be no matter how many of you complain, all while you don’t even realize you’re using AI constantly throughout your day.


I think we found the tech bro


I’m a woman and I’m old.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Here's 3 examples. Yes, I used chat to draft, but I put enough detail in my prompt that it came up with real-world info.

We used to have two support analysts manually categorize 5,000 IT support tickets per week. Now a model auto-tags 92% of them with 96% accuracy, and one analyst audits exceptions.

Our paralegals draft first-pass contract summaries. It used to take them hours in many cases. Have you ever read a commercial lease?? It is a monster. Now a model generates a structured summary in 45 seconds and the paralegal reviews and edits; prep time dropped from 1+ hours to under 15 minutes.

We replaced manual fraud screening of 100% of transactions with an AI risk score. Humans now review only the top 3% flagged.


I hope someone is still reading these. Because AI is not that great at summarizing!


Pp, seriously? That’s all you’ve got?? I am typing on my tiny phone and added a sentence or two. This isn’t a letter to the president. It’s a chat forum.


Your contracts, doofus. If I am paying a lawyer and they are using AI, I would be firing that lawyer.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Here's 3 examples. Yes, I used chat to draft, but I put enough detail in my prompt that it came up with real-world info.

We used to have two support analysts manually categorize 5,000 IT support tickets per week. Now a model auto-tags 92% of them with 96% accuracy, and one analyst audits exceptions.

Our paralegals draft first-pass contract summaries. It used to take them hours in many cases. Have you ever read a commercial lease?? It is a monster. Now a model generates a structured summary in 45 seconds and the paralegal reviews and edits; prep time dropped from 1+ hours to under 15 minutes.

We replaced manual fraud screening of 100% of transactions with an AI risk score. Humans now review only the top 3% flagged.


Wait... are these 3 examples of things you personally did, and you just asked AI to write them up? Or did you ask AI to come up with 3 examples?


I asked AI to come up with several examples of productivity saves, and these 3 were ones I see in my immediate world, so I cut and pasted them. Although I’ll admit the support-ticket human reduction at my co is higher than what I posted here: we had a number of support people and now we are down to one human.

You should look up ‘agentic AI’ bc that’s the next step.


I can't speak to the others, but the legal one makes no sense to me. The firm probably has a template commercial lease it prefers to use, and the part AI can do (copying and pasting in the parties and numbers) could be done by a paralegal very quickly. An attorney would still need to review the final product carefully and negotiate changes with the other side whether it was generated by a paralegal or AI, and a paralegal is less likely to make mistakes.

Then it says "a model generates a summary," but that's not the initial task described, which is drafting the lease. There's not really much to "summarize" in a lease other than the initial parties and terms, which you would have fed into the prompt in the first place. And AI is not good at summaries; I imagine it would have a difficult time cross-referencing the various subsections of the lease (as in "notwithstanding paragraph 3, and subject to the exceptions in section 5b, below, the parties shall...").

Using AI seems like a huge waste of time and may create more work.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:What you are saying here is you're not on GitHub, Substack, LinkedIn, or any of the places where people talk about the specific things they're building. You don't go to meetups where people demo their tools. Your version of curiosity is "posting here and demanding people tell you."


This comment exemplifies the problem. The prediction engine works very well in coding and other predictable disciplines. But tech bros are too far out ahead of their skis in so many areas they really don’t understand the lack of utility.

That’s ok. Let’s see where this all leads.


You clearly don't know anyone in tech... it's not replacing people, and it needs a lot of human help.


It does need human help, but less and less. And if you think AI isn't replacing people (we are mostly talking about GAI here on this forum, but regular/simple AI has been around for years), you clearly don't know tech or industries at all. I'm in law, not even tech, and a major firm just laid off hundreds of people they don't need anymore bc of AI, from client IT/tech support to back-end tech support to admins, etc.

And they are not hiring first year lawyers like they used to.


Actually, I do. And it's not a replacement yet. It's an excuse.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Here's 3 examples. Yes, I used chat to draft, but I put enough detail in my prompt that it came up with real-world info.

We used to have two support analysts manually categorize 5,000 IT support tickets per week. Now a model auto-tags 92% of them with 96% accuracy, and one analyst audits exceptions.

Our paralegals draft first-pass contract summaries. It used to take them hours in many cases. Have you ever read a commercial lease?? It is a monster. Now a model generates a structured summary in 45 seconds and the paralegal reviews and edits; prep time dropped from 1+ hours to under 15 minutes.

We replaced manual fraud screening of 100% of transactions with an AI risk score. Humans now review only the top 3% flagged.


Wait... are these 3 examples of things you personally did, and you just asked AI to write them up? Or did you ask AI to come up with 3 examples?


I asked AI to come up with several examples of productivity saves, and these 3 were ones I see in my immediate world, so I cut and pasted them. Although I’ll admit the support-ticket human reduction at my co is higher than what I posted here: we had a number of support people and now we are down to one human.

You should look up ‘agentic AI’ bc that’s the next step.


I can't speak to the others, but the legal one makes no sense to me. The firm probably has a template commercial lease it prefers to use, and the part AI can do (copying and pasting in the parties and numbers) could be done by a paralegal very quickly. An attorney would still need to review the final product carefully and negotiate changes with the other side whether it was generated by a paralegal or AI, and a paralegal is less likely to make mistakes.

Then it says "a model generates a summary," but that's not the initial task described, which is drafting the lease. There's not really much to "summarize" in a lease other than the initial parties and terms, which you would have fed into the prompt in the first place. And AI is not good at summaries; I imagine it would have a difficult time cross-referencing the various subsections of the lease (as in "notwithstanding paragraph 3, and subject to the exceptions in section 5b, below, the parties shall...").

Using AI seems like a huge waste of time and may create more work.


If a lawyer is getting paid, they need to review the contract.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:What you are saying here is you're not on GitHub, Substack, LinkedIn, or any of the places where people talk about the specific things they're building. You don't go to meetups where people demo their tools. Your version of curiosity is "posting here and demanding people tell you."


This comment exemplifies the problem. The prediction engine works very well in coding and other predictable disciplines. But tech bros are too far out ahead of their skis in so many areas they really don’t understand the lack of utility.

That’s ok. Let’s see where this all leads.


You clearly don't know anyone in tech... it's not replacing people, and it needs a lot of human help.


It does need human help, but less and less. And if you think AI isn't replacing people (we are mostly talking about GAI here on this forum, but regular/simple AI has been around for years), you clearly don't know tech or industries at all. I'm in law, not even tech, and a major firm just laid off hundreds of people they don't need anymore bc of AI, from client IT/tech support to back-end tech support to admins, etc.

And they are not hiring first year lawyers like they used to.


GAI is not a reality yet.