> AI to speed up the understanding process
What’s your hypothesis of how AI can accelerate how your brain understands something?
I have some success with this method: I try to write an explanation of something, then ask the LLM to find problems with the explanation. Sometimes its response leads me to shore up my understanding. Other times its answer doesn’t make sense to me and we dig into why. Whether or not the LLM is correct, it helps me clarify my own learning. It’s basically rubber duck debugging for my brain.
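The loop described above can be sketched in a few lines of Python. Note that `ask_llm` is a hypothetical stand-in, not a real API; wire it to whatever chat-model client you use:

```python
def ask_llm(prompt: str) -> str:
    # Hypothetical placeholder: replace with a call to your LLM provider.
    return "Your explanation conflates throughput with latency."

def critique_my_explanation(topic: str, explanation: str) -> str:
    # Ask the model to attack the explanation rather than produce one,
    # which is the "rubber duck debugging for my brain" step.
    prompt = (
        f"Here is my explanation of {topic}:\n\n{explanation}\n\n"
        "Find any errors, gaps, or misleading claims in it."
    )
    return ask_llm(prompt)

feedback = critique_my_explanation(
    "TCP slow start", "The congestion window doubles every RTT..."
)
```

Whatever comes back, correct or not, gives you something concrete to check your own understanding against.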
Quick, easy access to explanations and examples on complex topics.
In my case, learning enough trig and linear algebra to be useful in game engine programming and rendering has become a lot easier and more efficient.
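For context, a lot of that trig boils down to a handful of patterns, like the 2D rotation that shows up everywhere in rendering code. A minimal sketch (my example, not the commenter's):

```python
import math

def rotate2d(x: float, y: float, angle_rad: float) -> tuple[float, float]:
    # Standard 2D rotation about the origin:
    #   x' = x*cos(a) - y*sin(a)
    #   y' = x*sin(a) + y*cos(a)
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (x * c - y * s, x * s + y * c)

# Rotating the point (1, 0) by 90 degrees lands on (0, 1).
px, py = rotate2d(1.0, 0.0, math.pi / 2)
```

An LLM is good at walking you through why those two formulas are what they are, which is exactly the kind of gap Google results often skip over.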
The same way Google or Wikipedia enables learning.
> What’s your hypothesis of how AI can accelerate how your brain understands something?
What are your beliefs / hypotheses about how having a human teacher can help you understand something?
AI explanations are no longer terrible garbage. The LLM might not be doing original research, but it has definitely read the textbook, and 1000 related works. :/
You shouldn't believe the LLM when it tells you how to micro-optimize your code, but you can take suggestions as a starting point and verify them.