Anonymous wrote:AI doesn't even understand physics. See this:
https://www.thealgorithmicbridge.com/p/harvard-and-mit-study-ai-models-are
The models get the physics completely wrong but get the orbits right -- which is exactly what people did in the distant past, before Newtonian mechanics, with epicycles: accurate predictions of planetary positions with no grasp of the underlying laws. AI (or at least the current generation of deep networks) optimizes for prediction and pattern matching, not concept abstraction. I keep up with this research for my work, and AI is still far from that kind of abstraction. The situation is far worse in biology.
And before someone jumps in to say it's only a matter of time: hardly anyone, in academic research or at companies, is optimizing for this. AI firms have bet on AGI, but most of their designs are just beefed-up transformers -- powerful, but with serious limitations.
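To make the epicycles point concrete, here's a toy sketch of my own (not the setup from the linked study): fit a curve to orbit data and it predicts beautifully on the data it has seen, while having learned nothing like the underlying law. The circular orbit, the polynomial degree, and the window sizes below are all just illustrative choices.

```python
# Toy illustration (my own sketch, not the Harvard/MIT study's method):
# a model that predicts an orbit accurately on the data it has seen,
# without learning the law that generates it. The fit is excellent inside
# the training window and falls apart outside it -- epicycles in miniature.
import numpy as np
from numpy.polynomial import Polynomial

# True dynamics: the x-coordinate of a circular orbit, x(t) = cos(t).
t_train = np.linspace(0, 4 * np.pi, 200)   # two full revolutions observed
x_train = np.cos(t_train)

# "Model": a degree-12 polynomial fit by least squares -- pure curve fitting,
# with no notion of forces, periodicity, or conservation laws.
model = Polynomial.fit(t_train, x_train, deg=12)

# In-distribution: the prediction error is small.
err_in = np.max(np.abs(model(t_train) - x_train))
print(f"max error inside training window:  {err_in:.3g}")

# Out-of-distribution: extrapolate one more revolution ahead.
t_test = np.linspace(4 * np.pi, 6 * np.pi, 100)
err_out = np.max(np.abs(model(t_test) - np.cos(t_test)))
print(f"max error outside training window: {err_out:.3g}")   # blows up

# Newton's inverse-square law would keep the orbit periodic forever; the
# polynomial has no such structure, so its "orbit" diverges as soon as it
# leaves the data it was fit on. Accurate prediction on seen data is not
# the same as having the underlying concept.
```

An epicycle model is the same trick with a better basis: it matches the observed trajectories, but ask it a different question (extrapolate further, perturb the system, explain the forces) and it has nothing to say.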
Anonymous wrote:There's a steady drumbeat of "AI will replace coders" and "AI will replace lawyers",
and yet everyone is still pushing their kids into engineering. I don't get it. Engineering seems just as vulnerable to me, if not more so.
Anonymous wrote:I’m an engineer. AI can’t do what I do, i.e., creative thought.