Hacker News

wofo · yesterday at 9:21 PM

Sounds like babysitting an LLM, with the alarming difference that this AI can kill you if you are not paying enough attention


Replies

tialaramex · yesterday at 11:49 PM

Oh, don't worry, the LLMs absolutely can kill us, just slightly more indirectly.

Triggering psychosis is not difficult, and the LLM is easily capable of doing that. A person would soon get freaked out and summon help: "Johnny started acting crazy and I'm not sure what to do, please come." But the LLM isn't a person. Johnny wants to know more about the CIA's programme to cross-breed Venusians with Hollywood stars? Here's an itinerary with the address of a real hotel in LA and an entirely hallucinated CIA officer's schedule.

Next thing you know, Johnny is shot dead by officers responding to a maniac with a fire axe who broke into an LA hotel and was screaming about space aliens.
