Exactly zero of those account for an individual's or company's ability to live by their own moral code.
And this AI software is not a mere static object like a hammer that can be handed off to a customer, after which what it is used for is their business, whether to build a house or to bash in a living skull.
This is a system that must be constantly maintained by its builders.
Moreover, even if we use your standard, the law, it has already been decided in Anthropic's favor.
What you require is that Anthropic actively participate in activities that they consider abhorrent and/or unwise. SCOTUS has already ruled that a business cannot even be required to sell a cake to someone if it does not like the intended purpose (in that case, use at a gay wedding celebration).
> even if we use your standard, the law, it has already been decided in Anthropic's favor.
I support Anthropic here. They had a deal with the Govt and the Govt bullied them. That should not be allowed, and Anthropic is suing which makes sense to me. Anthropic should be allowed to set any terms of use for the product that they want, and gain or lose business based on those terms. That's fine.
I'm saying that the failure is actually upstream. It should not be possible for Anthropic's AI to be used to mass-surveil or murder people, because those things should be illegal by law, and the govt should not be allowed to do them and should not be doing them. Somehow it isn't this way though.
So now that we find ourselves in this failed state, we have to rely on Anthropic to be "the law": to identify what's "evil" and disallow it. I'm saying that's out of scope for a tech company and they shouldn't be expected to do that. They should only be in the business of making good tech and then be free to let it be used by anyone for any purpose that the law allows.
This also means that if it's illegal to share information on how to build a bomb without AI, then it should be illegal for Claude to share that information with AI. So Anthropic does need to make sure they're not breaking the law themselves as well.