> but given that you can actually run the models yourself on AWS Bedrock
That's not exactly how it works. Anthropic hosts its models on AWS Bedrock as a managed service: customers call those models through the Bedrock API just like any other API, with no visibility into what kind of AWS infrastructure is actually serving the request.
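
For example, invoking a Bedrock-hosted Anthropic model is just an SDK call against the Bedrock endpoint; nothing about the underlying hosts is exposed. A minimal sketch using boto3 (the model id and region here are illustrative, adjust to whatever your account has access to):

```python
import json
import boto3

# Bedrock runtime client; credentials/region come from the usual AWS config.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Call a Claude model by id. You only ever see the API response,
# never the machines behind it.
response = client.invoke_model(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model id
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello"}],
    }),
)

print(json.loads(response["body"].read()))
```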