Hacker News

OpenAI’s o1 correctly diagnosed 67% of ER patients vs. 50-55% by triage doctors

441 points | by donsupreme | yesterday at 12:30 AM | 388 comments

Comments

lvl155 · yesterday at 8:55 PM

I have some family in medicine, and it scares me how much they now rely on AI. Some even quote it like the Bible.

SilverElfin · yesterday at 7:37 PM

I’ve had much better luck diagnosing my own family’s issues with AI than with doctors. Usually now I’m feeding the doctors more information to begin with, so their 30-minute office visits aren’t wasted and don’t require another expensive follow-up appointment.

While I’m sure there are ways in which such studies can be wrong, it’s very obvious that AI can accelerate work in many of these areas where we seek out professional help: doctors, lawyers, etc.

journal · yesterday at 7:51 PM

Would it ever diagnose incorrectly to save more lives? It’s kind of weird that an AI would decide who dies so others may survive, but I guess whatever.

Aboutplants · yesterday at 7:12 PM

Now show me the results for triage doctors aided by AI.

bluefirebrand · yesterday at 7:23 PM

Unfortunately, from my understanding, doctors don't necessarily diagnose for accuracy; they often diagnose to limit liability.

They aren't going to take a stab at an uncommon diagnosis, even if it occurs to them, if they might get sued for being wrong.

Edit: I'm not trying to say doctors deliberately diagnose wrong. Just that if there are two possible diagnoses, a common one that matches some of the symptoms and a rare one that matches all of them, doctors are still much more likely to diagnose the common one. Hoofbeats, horses, zebras, etc.

wg0 · yesterday at 7:33 PM

The Guardian needs to raise its bar on what to report and how to give readers full context on the ongoing NFT/AI/crypto "trust me bro" scam. That context would be that this is a mathematical model of human language, not a medical expert or a replacement for one.

taurath · yesterday at 7:23 PM

I’d love to see a follow-up to that radiologist evaluation, where it failed so miserably at the thing it was supposed to be best at that there’s now a shortage of radiologists.

Bender · yesterday at 1:42 PM

Humans could not diagnose and treat me correctly; they almost killed me. I'm curious where I could feed my symptoms, and the same data I gave to the ER, to an AI to test it.

tedggh · yesterday at 8:42 PM

Believable and not shocking. LLMs may literally have saved my son, and potentially his mother too, by allowing us to fact-check a lot of nonsense data and scare tactics from a group of at least five different doctors ambushing us to make a life-changing decision in minutes.

The problem is that doctors, at least in the US, prioritize liability exposure over patients' long-term outcomes. Say you need an intervention where two options, A and B, are available to you. Option A carries a 1% risk of complications but a great outcome. Option B carries a 0.1% risk of complications, but once you are discharged the short-term effects are challenging and the long-term effects are not well understood. Well, 10 out of 10 times doctors will suggest option B and will do anything they can to nudge you into making that choice, like not telling you the absolute numbers and constantly using the word "death". They also lie about the outcomes because, again, once you accept the procedure, sign, and are sent home, they have nothing to do with you.

Kuyawa · today at 12:39 AM

As a 60-year-old, I developed my own AI medical assistant [1] and I've used it extensively for many conditions; I couldn't be happier. After analyzing some lab tests, it even recommended a marker the doctor hadn't considered at first. So no, it won't replace doctors, but it is a very helpful tool for self-diagnosing simple conditions and for second opinions.

[1] https://mediconsulta.net (DeepSeek)