Anonymous wrote:Here's why you are an idiot:
AI was invented by the Silent Generation and developed by Boomers. Not your generation! We gave it to YOU, not the other way around.
The Boomers were the first to implement internet technology generally, and to use it personally and for work. Not your generation; we handed this to you as well. What you are describing is cultural and educational, not generational.
Get over yourself.
Anonymous wrote:Anonymous wrote:Anonymous wrote:Anonymous wrote:But AI tools are highly sensitive to:
How specific the prompt is
Whether you give context
Whether you refine the output
Whether you iterate
The people who get the most value treat it like a collaborative tool, not a magic oracle.
The problem with this is that if you're reasonably intelligent, it's much easier to just do it yourself. I imagine AI is helpful for people who can't think for themselves or write at all, which is a sizable enough cohort.
I have to say I disagree with this. Uniformly, the very smartest people I know are the ones most impressed by, and most able to get value out of, interacting with AI. It’s not always right; the prose usually leaves something to be desired; you have to double check anything materially important. But: it’s fast, fast, fast; it basically has memorized the entire internet and vast quantities of other data; it has a different perspective on a lot of issues that can shake loose important insights in the back and forth (a simulated perspective, yes, but that doesn’t make it not useful); and just interacting with it often forces a person to frame their own thoughts more clearly than they otherwise would, in an interface many times faster than interaction with a human colleague and without the need to manage the many other issues that are in play when you talk to a real person.
There it is. AI solves very specific kinds of problems, one of which is not wanting to deal with humans. Another is not having learned to organize your thoughts and consider whether your questions to others are clear. People who don't have those problems don't see value in it. People who do have those problems can't imagine why the others don't love this tool.
That’s a little dismissive. Take one recent use case of mine: I was getting ready to give a big presentation, so I asked ChatGPT to grill me beforehand with the toughest questions it could come up with. We did this for about two hours. You can’t really impose on another human to spend that much time practicing with you, and some of the questions were genuinely tough. At least in my experience, the quality of one’s thinking improves under this sort of adversarial testing. The result was improved human performance: better preparation and a better presentation than I would have given without interacting with AI.
Anonymous wrote:Anonymous wrote:Anonymous wrote:They most likely already know how to write so why would they need to use it?
Because it turns 20 minutes of work into 5 minutes of work.
What are you going to do with that time saved?