Warp founder here. It's cool to see the community excitement here.
Note that we are going to add bring-your-own-model directly into Warp. Would love interested folks to weigh in on the discussion here: https://github.com/warpdotdev/warp/discussions/9619
Long overdue - I was all in on Warp a few years ago, but after a couple of years of this need going unaddressed, I moved on. I now DO NOT see the need to embed AI into the terminal when all sorts of TUIs do the same job.
Makes plenty of sense to upstream this (likely more sense than forking, though I suppose a fork is one way of gauging interest and implementation complexity).
Will this require a paid plan? That is, could a modified Warp work with a local model via Ollama at no charge?
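For context on what "local model on Ollama" would mean in practice: Ollama serves an HTTP API on localhost:11434, so a terminal with bring-your-own-model support would only need to POST to its `/api/generate` endpoint. A minimal sketch of building such a request (the model name `llama3` is just an example, substitute any model you have pulled locally):

```python
import json

# Ollama's local API listens on localhost:11434 by default;
# /api/generate takes a model name and a prompt.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> str:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    return json.dumps({
        "model": model,          # example model name, not a Warp-specific value
        "prompt": prompt,
        "stream": False,         # one complete response instead of chunks
    })

body = build_request("Explain `git rebase -i` in one sentence.")
print(body)
```

Whether Warp's bring-your-own-model support will expose this without a paid plan is exactly the open question; the sketch only shows that the local side of the integration is trivial.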