Good
It seems clear that "Autopilot" was a brazen overclaim of the system's capabilities, and that overclaim led to people dying.
It may be mildly absurd to win founder-IPO-level wealth in a lawsuit, but it's also clear that smaller awards don't act as an effective deterrent to people like Elon Musk.
Right! We demand engineering perfection! No autopilot until we guarantee it will NEVER kill a soul. Don't worry that human drivers kill humans all the time. The rubric is not "better than a human driver", it is "Angelic Driver". Perfection is what we demand.
I've always thought of it more as "Co-Pilot", but formally, "Autopilot" might actually be the more accurate term (lane-keeping, distance-keeping), whereas a "Co-Pilot" (in aviation) implies more active control, e.g. pulling you up from a nosedive.
So... informally, "Tesla Co-Pilot" => "You're still the pilot but you have a helper", vs "Tesla Autopilot" => "Whelp, guess I can wash my hands and walk away b/c it's AuToMaTiC!"
...it's tough messaging for sure, especially putting these power tools into people's hands with no formal training required. Woulda-coulda-shoulda, but similar to the 737 MAX crashes: should "pilots" of Teslas have been required to train on the safety and navigation systems before they were "licensed" to use them?