It’s like weightlifting: sure, you can use a forklift to do it, but if the goal is to build up your own strength, the forklift isn’t going to get you there.
This is the ultimate problem with AI in academia. We all inherently know that “no pain no gain” is true for physical tasks, but the same is true for learning. Struggling through the new concepts is essentially the point of it, not just the end result.
Of course this becomes a different thing outside of learning, where delivering results is more important in a workplace context. But even then you still need someone who does the high level thinking.
A use case I’ve been working through is learning a language (not a programming language). You can use LLMs to translate and write for you in another language, but no matter how much you use the LLM, you will never be able to say “I know that language.”
Now compare this to using the LLM alongside a grammar book and real-world study mechanisms. This creates friction, which is what actually causes your mind to learn. The LLM can serve as a tool to get specialized insight into the grammar book and to accelerate mechanical tasks (like generating all forms of a word for writing flashcards). At the end of the day, you need to make an intelligent separation between where the LLM ends and your learning begins.
I really like this contrast because it highlights the gap between using an LLM and actually learning. You may be able to use the LLM to pass college-level courses in the language, but unless you create friction, you won’t actually learn anything! There is definitely more nuance here, but it’s food for thought.
I like this analogy along with the idea that "it's not an autonomous robot, it's a mech suit."
Here's the thing -- I don't care about "getting stronger." I want to make things, and now I can make bigger things WAY faster because I have a mech suit.
edit: and to stretch the analogy, I don't believe much is lost "intellectually" by my use of a mech suit, as long as I observe carefully. Me doing things by hand is probably overrated.
Misusing a forklift might injure the driver and a few others; but it is unlikely to bring down an entire electric grid, expose millions to fraud and theft, put innocent people in prison, or jeopardize the institutions of government.
There is more than one kind of leverage at play here.
I do appreciate the visual of driving a forklift into the gym.
The activity would train something, but it sure wouldn't be your ability to lift.
I think a better analogy is a marathon. If you're training for a marathon, you have to run. It won't help to take the car. You'd reach the finish line with minimal effort, but you won't build the muscles you need.
I feel like the aviation pilot angst captured by "automation dependency" and the fears around skills loss is another great analogy. [0]
[0] https://eazypilot.com/blog/automation-dependency-blessing-or...
How seriously do you mean the analogy?
I think forklifts probably carry more weight over longer distances than people do (though I could be wrong; 8 billion humans carrying small weights might add up).
Certainly forklifts have more weight * distance when you restrict to objects that are over 100 pounds, and that seems like a good decision.
> This is the ultimate problem with AI in academia. We all inherently know that “no pain no gain” is true for physical tasks, but the same is true for learning. Struggling through the new concepts is essentially the point of it, not just the end result.
OK but then why even use Python, or C, or anything but Assembly? Isn't AI just another layer of value-add?
Wondering why the obvious solution isn’t applied here: instead of assigning well-known problems that have been solved a thousand times, give students open research opportunities — problems on the edge of what’s possible, with no way to cheat with AI. And if AI is able to solve those, give harder tasks.
The real challenge will be that people almost always pick the easier path.
We have a decent sized piece of land and raise some animals. People think we're crazy for not having a tractor, but at the end of the day I would rather do it the hard way and stay in shape while also keeping a bit of a cap on how much I can change or tear up around here.
Thanks for the analogy. But I think students may think to themselves: "Why do I need to be stronger if I can use a forklift?"
I've been showing my students this video of a robot lifting weights to illustrate why they shouldn't use AI to do their homework. It's obvious to them the robot lifting weights won't make them stronger.
I like the weightlifting parable!
Unlike weightlifting, the main goal of our jobs is not to lift heavy things, but to develop a product that adds value to its users.
Unfortunately, many devs don't understand this.
I think this is a pretty solid analogy, but I look at the metaphor this way: people used to get strong naturally because they had to do physical labor. Because we invented things like the forklift, we had to invent things like weightlifting to get strong instead. You can still get strong; you just need to be more deliberate about it. It doesn't mean you shouldn't also use a forklift, which is its own distinct skill you also need to learn.
It's not a perfect analogy, though, because in this case it's more like automated driving: you should still learn to drive because the autodriver isn't perfect and you need to be ready to take the wheel, but that requires deliberate, separate practice at learning to drive.