> If they are trusting AI to replace labor, they should trust AI to be accountable for bad filters.
Surely all of the AI hype is true and there are no hypocrites in Corporate America.
> What happens when a human misses a document or two?
If they were obligated to produce it and don't, they can get into serious trouble with the court. If they hand over something sensitive they weren't required to, they could lose billions of dollars by handing trade secrets to a competitor, or get sued by someone else for violating an NDA, etc.
>Surely all of the AI hype is true and there are no hypocrites in Corporate America.
Worst case, they're right and now we have more efficient processing. Best case, bungling a few high-profile cases accelerates us toward proper regulation once a judge tires of AI scapegoats.
I don't see a big downside here.
>If they were obligated to produce it and don't they can get into some pretty bad trouble with the court.
Okay, that seems easy enough to map to AI. It's just a matter of who we hold accountable: the prompter, the company at large, or the AI provider.