There is a lot of information, in various forms, on the internet that is specifically designed to misinform people who haven't taken a course on that particular topic, while leaving the reader feeling they learned something. Right now LLMs are good at picking those apart for the reader if they decide to dig deeper; however, I fear this era might not last.
> Right now LLMs are good at picking those apart for the reader if they decide to dig deeper
They are not.
> LLMs are good at picking those apart for the reader if they decide to dig deeper. I fear this era might not last.
Yeah, I'm not sure that pinning one's hopes for a better-educated populace on LLMs is going to pan out well. Education requires trust and active defense against malign actors.