You don't need to pay an external crowd for that.
You can run Claude Code against a reasonably recent local Ollama instance just fine, and it'll handle the teaching job perfectly well with (say) Qwen 3.5.
It doesn't even need to be one of the large models; one of the mid-size ones that fits in ~16GB of RAM when given a 128k+ context window should be fine.
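Roughly, the setup looks like this (the model tag is just an example, and whether your Ollama version speaks the Anthropic API directly or needs a translation proxy in front of it depends on the versions involved, so treat this as the general shape rather than exact commands):

```shell
# Pull a mid-size model -- pick whatever tag actually fits your RAM
ollama pull qwen3:14b

# Start the local server (listens on localhost:11434 by default)
# if it isn't already running
ollama serve &

# Point Claude Code at the local endpoint instead of Anthropic's API.
# The auth token is a dummy value; a local server doesn't check it.
export ANTHROPIC_BASE_URL=http://localhost:11434
export ANTHROPIC_AUTH_TOKEN=ollama
claude
```

The main thing to verify on your setup is the endpoint path: some configurations need an API-compatibility proxy between Claude Code and Ollama rather than pointing at the bare port.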