Ooh, this looks great!
The usage costs are rather high compared to S3: roughly 30x higher for PUT/POST requests. It looks like batching operations is going to be vital.
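Back-of-envelope sketch of why batching dominates the bill when pricing is per-request. The prices below are hypothetical placeholders (only the ~30x PUT/POST ratio comes from the thread), and the batch sizes are illustrative, not anything the service documents:

```python
# Per-request billing makes request count the cost driver, so packing
# many git objects into one write is what keeps a push affordable.
# PRICE_PER_MILLION_WRITES is a HYPOTHETICAL placeholder, not a real quote.

PRICE_PER_MILLION_WRITES = 4.50  # hypothetical per-million write price, USD

def write_cost(num_objects: int, batch_size: int, price_per_million: float) -> float:
    """Cost of writing num_objects when up to batch_size objects share one request."""
    requests = -(-num_objects // batch_size)  # ceiling division
    return requests / 1_000_000 * price_per_million

# A push of 5,000 objects: one request per object vs. 100 objects per request.
unbatched = write_cost(5_000, 1, PRICE_PER_MILLION_WRITES)    # 5,000 requests
batched = write_cost(5_000, 100, PRICE_PER_MILLION_WRITES)    # 50 requests
```

Whatever the absolute prices turn out to be, the 100x drop in request count carries straight through to cost, which is why a 30x per-request premium over S3 can still wash out with batching.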
Yeah, pricing seems okay with batching. The 128MB memory cap per Durable Object is what I'd watch. A repo with a few thousand files and some history could hit that faster than you'd expect, especially during delta resolution on push.
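Rough arithmetic on why the cap is tight. Every number here is an illustrative assumption (object count, average inflated size, how many bases stay resident), not a measurement of this service or of git's actual packing behavior:

```python
# Sketch of peak memory during delta resolution on push: applying deltas
# means holding fully-inflated base objects in memory, so the working set
# scales with how many objects are in flight times their inflated size.
# All sizes are ILLUSTRATIVE assumptions, not measured figures.

MEMORY_CAP = 128 * 1024 * 1024  # 128 MB cap per Durable Object

def peak_delta_memory(num_objects: int, avg_inflated_size: int, resident: int) -> int:
    """Naive peak estimate: `resident` inflated base objects held at once."""
    return min(num_objects, resident) * avg_inflated_size

# A few thousand files with some history: say 5,000 objects averaging
# 50 KB inflated, with ~1,000 bases resident during resolution.
peak = peak_delta_memory(5_000, 50 * 1024, 1_000)  # ~49 MB under these assumptions
```

Under these made-up numbers you stay under 128 MB, but the margin is thin: bump the average inflated size to a few hundred KB (binary assets, generated files) and the same push blows past the cap.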
I think by operation they mean `git clone` or `git push`, which can read or write hundreds or thousands of objects per operation.
Hmm, I'd expect to be able to actually access the contents of the git repo...
Docced features:
- clone repos
- init new repos
- import repos
Missing features:
- list branches and tags
- list objects
- list commit history
- create new commits
- read raw git objects
- merge branches or repos
- read git object by path