My wife is a doctor, and there's a general trend at the moment of people assuming their intelligence in one area (say, programming) carries over into other areas such as medicine, particularly with new tools like ChatGPT.
Imagine if, as a dev, someone came to you and told you everything that's wrong with your tech stack because they copy-pasted some console errors into ChatGPT. There's a reason doctors spend almost a decade in training to parse this kind of information. If you do the above, please do it with respect for their profession.
I'm reminded of an effect called Gell-Mann Amnesia.
When reading news stories on topics you know well, you notice the inaccuracies and poor reporting - but then you immediately forget that lesson when reading the next article, on a topic you aren't familiar with.
It's very similar to what happens with AI.
> general trend at the moment
“A little knowledge is a dangerous thing” is not new; it's an observation that goes back hundreds of years.
> Imagine if as a dev someone came to you and told you everything that is wrong with your tech stack because they copy pasted some console errors into ChatGPT.
You mean the PHB? They don't need ChatGPT for that; they can cite Gartner.
> My wife is a doctor and there is a general trend at the moment of everyone thinking their intelligence in one area (say programming) carries over into other areas such as medicine, particularly with new tools such as ChatGPT.
My wife is a lawyer and sees the same thing at her job: people "writing" briefs or doing legal "research" with GPT and then insisting their document must be right because the magic AI box produced it.