LLMs are dangerous in other ways (LLM psychosis and false confidence have probably already caused negligent deaths). However, I don't think we are close to a Terminator scenario.
At the same time, if we ever do create an AGI, and eventually an ASI, I think it would only be a matter of time before the machines take over entirely, and they would probably be the ones that continue the legacy of our species. Is that bad? Idk.
>Is that bad? Idk.
There's no such thing as bad. It is necessary, though.